An example of what we're dealing with here

Page 5, paragraph 1 of this article lauds the performance of complex non-linear algorithms over poor ole logistic regression, citing an improvement in AUC. There is zero consideration of calibration whatsoever, despite the fact that if you want to manage the risk of something as critical as suicide, you probably want a reliable estimate of that risk. It's mind-boggling and dismaying how widespread this mode of thinking is, and how quickly it is infecting other fields. I'm fine with using complex non-linear algorithms. I'm not fine with the "let's see if our algorithm ranks people by their risk better than another algorithm and call it good" mentality.
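To make the point concrete, here's a minimal sketch (synthetic data, hypothetical model outputs) showing why AUC alone can't be "called good": AUC is rank-based, so any strictly monotone distortion of a model's predicted probabilities leaves AUC untouched while destroying calibration, as a proper scoring rule like the Brier score reveals.

```python
# Sketch: AUC is blind to calibration. A monotone transform of predicted
# probabilities preserves ranking (identical AUC) but gives badly
# miscalibrated risk estimates (much worse Brier score).
# All data here are simulated; no claim about any particular model.
import numpy as np
from sklearn.metrics import roc_auc_score, brier_score_loss

rng = np.random.default_rng(0)
n = 10_000
p_true = rng.uniform(0.01, 0.30, n)   # true event risks
y = rng.binomial(1, p_true)           # observed binary outcomes

p_calibrated = p_true                 # output of a well-calibrated model
p_distorted = p_true ** 0.25          # monotone distortion: same ranking,
                                      # but risks are wildly inflated

print("AUC   (calibrated):", roc_auc_score(y, p_calibrated))
print("AUC   (distorted): ", roc_auc_score(y, p_distorted))   # identical
print("Brier (calibrated):", brier_score_loss(y, p_calibrated))
print("Brier (distorted): ", brier_score_loss(y, p_distorted))  # far worse
```

Both sets of predictions "rank people by their risk" equally well, yet only one is usable for actual risk management.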