Ep97: The governance of AI
This is the third part of a series on artificial intelligence in medicine. Previously we explained how to train and test machine learning models that assist in decision-making, and then how to iron out ergonomic friction points in the clinical workflow. We’ve mentioned how deep learning neural networks are more capable than classical models at dealing with big, noisy data sets, but also that the reasoning they use to answer the questions asked of them may be unexplainable to users.
This creates a certain unease among clinicians and regulators like Australia’s Therapeutic Goods Administration. According to some, we just need to test outcomes from the use of AI-assisted decision-making with the same rigour we apply to pharmaceutical interventions, not all of which we fully understand either. But despite updates to the SPIRIT and CONSORT guidelines for the reporting of randomised controlled trials, there hasn’t yet been a lot of high-quality clinical research into the use of AI-based medical devices.
Guests
>Affiliate Associate Professor Paul Cooper PhD FAIDH CHIA AFHEA GAICD (Deakin University)
>Associate Professor Sandeep Reddy MBBS PhD IPFPH ECFMG CHIA FAcadTM FAIDH FCHSM SFHEA (Deakin University)
>Professor Brent Richards MBBS FRACP JJFICM (Gold Coast Hospital and Health Service; Director, IntelliHQ)
Production
Produced by Mic Cavazzini DPhil. Music licenced from Epidemic Sound includes ‘Thyone’ by Ben Elson, ‘Little Liberty’ by Paisley Pink. Music courtesy of Free Music Archive includes ‘Impulsing’ and ‘You are not alone’ by Borrtex. Image by WestEnd61 licenced from Getty Images.
Editorial feedback kindly provided by physician Rahul Barmanray and digital health academic Natasa Lazarevic.
Key References
A governance model for the application of AI in health care [Reddy, J Am Med Inform Assoc. 2020]
Machine learning in clinical practice: prospects and pitfalls [Med J Aust. 2019]
Evidence-based medicine and machine learning: a partnership with a common purpose [BMJ Evid Based Med. 2021]
Explainability for artificial intelligence in healthcare: a multidisciplinary perspective [BMC Med Inform Decis Mak. 2020]
Please visit the Pomegranate Health web page for a transcript and supporting references. Log in to MyCPD to record listening and reading as a prefilled learning activity. Subscribe to new episode email alerts or search for ‘Pomegranate Health’ in Apple Podcasts, Spotify, Castbox, or any podcasting app.