
A thoughtful perspective piece on how to improve the clinical utility of prediction models, by our own Nigam Shah

A shout out to @nigam for his thoughtful Viewpoint published in JAMA today, “Making Machine Learning Models Clinically Useful”.

As @nigam challenges all of us, we need to think beyond the mathematical performance of our patient-level prediction models and our measures of discrimination and calibration, and start working toward evaluating clinical impact by incorporating the use and consequences of model adoption in healthcare. I think this is valuable context that @jennareps, @Rijnbeek, and the rest of the Patient-Level Prediction workgroup can collaborate to integrate into our standardized framework.


Congrats @nigam on a very nice piece highlighting important issues.

@jreps Jenna and @Rijnbeek Peter have integrated Decision Curve Analysis (DCA) as an output for predictive models in their outstanding PLP package. This is a very useful way of evaluating net clinical benefit that doesn’t require collecting cost or other information. It does, however, require input from patients or doctors on the threshold of risk for an outcome at which treatment is justified. To make it more useful, new standard practices are needed for eliciting that input and incorporating it into predictive model evaluation.
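For readers new to DCA, here is a minimal sketch of the net-benefit calculation it is built on; this is illustrative Python with made-up data, not the PLP package’s actual API, and the threshold values are purely hypothetical.

```python
import numpy as np

def net_benefit(y_true, y_prob, threshold):
    """Net benefit of treating everyone whose predicted risk exceeds `threshold`.

    NB = TP/n - FP/n * threshold / (1 - threshold),
    where `threshold` is the risk level at which patients or doctors
    consider treatment justified.
    """
    treat = y_prob >= threshold
    n = len(y_true)
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return tp / n - fp / n * threshold / (1 - threshold)

# A decision curve compares the model against "treat all" and "treat none"
# across a range of plausible thresholds elicited from clinicians.
y_true = np.array([0, 1, 0, 0, 1, 1, 0, 1])
y_prob = np.array([0.1, 0.8, 0.3, 0.2, 0.6, 0.9, 0.4, 0.7])
for t in (0.1, 0.2, 0.3):
    nb_model = net_benefit(y_true, y_prob, t)
    nb_all = net_benefit(y_true, np.ones_like(y_prob), t)
    print(f"threshold={t:.1f}  model NB={nb_model:.3f}  treat-all NB={nb_all:.3f}")
```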

@nigam cites the original Vickers article on DCA in the paper, though the paper does not highlight that DCA is a relatively tractable approach to the thorny problem of evaluating net clinical benefit.

One way to ensure the clinical value of the predictive modeling work in OHDSI is to build on the DCA functionality already in the PLP package.

Thank you for the shoutout! The key idea is to consider all three things: model performance as traditionally defined, the potential net benefit (via DCA, isocost lines, etc.), and workload constraints in terms of how many actions the system has the capacity to take. I believe our community is well positioned to make an advance here.
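To make the third point concrete, one simple way to combine a risk threshold with a capacity constraint is to act only on patients above the agreed threshold and, when more qualify than the system can handle, take the top-k by predicted risk. The sketch below is an illustrative assumption of mine, not something from the Viewpoint or the PLP package.

```python
import numpy as np

def select_for_action(y_prob, threshold, capacity):
    """Return indices of patients to act on, respecting a capacity limit."""
    eligible = np.where(y_prob >= threshold)[0]        # above the risk threshold
    ranked = eligible[np.argsort(-y_prob[eligible])]   # highest predicted risk first
    return ranked[:capacity]                           # trim to available capacity

y_prob = np.array([0.05, 0.72, 0.40, 0.91, 0.33, 0.65])
print(select_for_action(y_prob, threshold=0.3, capacity=3))  # -> [3 1 5]
```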
