Medical AI Reveals New Design Dynamics

Hurdles encountered while developing a healthcare tool reveal that social and human factors are as important as technology when designing AI models

Based in North Carolina, USA, the Duke University Health System combines the Duke University School of Medicine, the Duke University School of Nursing, the Duke Clinic, and the member hospitals into a single system of research, clinical care, and education. Consistently ranked among the top ten healthcare organisations in the United States, it is known for implementing innovative solutions to healthcare problems at large. Since November 2018, it has been using an in-house deep-learning algorithm in its emergency department, one that is now creating ripples in both the healthcare and technology fraternities, both for its efficiency and for pioneering new avenues in developing practical AI models.

The solution is simply named Sepsis Watch, and it does what the name suggests: it watches for early warning signs of sepsis in patients admitted to Duke University hospitals. To an uninformed ear, that might not sound like much of a task, but it is, in a big way. Sepsis is a life-threatening condition caused by the body's response to an uncontrolled infection. It typically triggers inflammation throughout the body and, if not contained, eventually leads to multiple organ failure and death.

Patients suffering from any disease or trauma are susceptible to sepsis, which makes it one of the leading causes of death in hospitals worldwide. Untreatable at an advanced stage, it responds to treatment only if it is detected as early as possible. However, because it is a secondary condition, the signs and symptoms of sepsis are easy to confuse with those of other ailments in most cases, and a conclusive diagnosis often becomes possible only after the disease has progressed to an advanced stage, by which time it is out of hand.

Against this bleak backdrop, Sepsis Watch has made a huge difference at the Duke University Health System since it went live two years ago. Development took around three and a half years of research, involving a rigorous process of digitising health records and analysing over 32 million data points. The final solution was then packaged into a user-friendly iPad app.

The way it works is beguilingly simple yet effective. Every hour, the algorithm analyses each patient's condition and scores how likely they are to develop sepsis given their current state. It is a continuous monitoring process in which patients are categorised as medium or high risk; those who meet all the clinical criteria are flagged as having already developed sepsis. At this stage there is a check point for human intervention: based on the alert flags generated by the model, a doctor confirms the diagnosis, and appropriate treatment is started immediately for confirmed patients. The AI tool has dramatically reduced sepsis-induced patient deaths at the Duke University hospitals. It is now part of a federally registered clinical trial, the results of which are expected to be published in 2021. Experts are confident it will prove to be a major advance in healthcare technology: an AI model that successfully augments physicians' ability to diagnose disease.
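The hourly scoring-and-escalation loop described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the model, the risk thresholds, and all field names are assumptions for the sake of the example, not Sepsis Watch internals.

```python
from dataclasses import dataclass


@dataclass
class Patient:
    name: str
    risk_score: float            # model output in [0, 1] (hypothetical)
    meets_sepsis_criteria: bool  # clinical criteria already satisfied


def triage(patient: Patient) -> str:
    """Categorise a patient from the latest hourly risk score."""
    if patient.meets_sepsis_criteria:
        # The human check point: a doctor must confirm the diagnosis.
        return "confirm-with-physician"
    if patient.risk_score >= 0.60:   # assumed high-risk threshold
        return "high-risk"
    if patient.risk_score >= 0.30:   # assumed medium-risk threshold
        return "medium-risk"
    return "monitor"


# One hourly sweep over a (fictional) ward.
ward = [
    Patient("A", 0.72, False),
    Patient("B", 0.41, False),
    Patient("C", 0.90, True),
]
alerts = {p.name: triage(p) for p in ward}
```

The key design point mirrored here is that the algorithm never starts treatment itself; the highest category it can assign is a request for physician confirmation.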

However, two years after Sepsis Watch was put to use, it is being revealed for the first time that the challenges its developers faced were not all technological. Various human factors, which only medical and healthcare professionals would fully appreciate, added to the complexity of the problem. Communication and lines of authority proved the hardest of them to overcome.

Any healthcare system runs on smooth interaction between doctors and nurses. Because Sepsis Watch is a real-time monitoring tool, the results it flags need to be checked and communicated to the attending doctor, who then confirms the readings and makes the final diagnosis on which treatment is based. This is a logical flow, but it means the people best positioned to check the tool and report its findings are the nurses of the rapid response team (RRT); the doctors, busy with their existing duties in the emergency department, cannot possibly keep monitoring a real-time app. The main responsibility of an RRT nurse is to continuously monitor patient status and intervene as needed. In the traditional emergency department protocol, however, RRT nurses do not call the attending physician, and these physicians usually have no direct communication with the RRT. Sepsis Watch required exactly that, and it was a challenge for both the developers and the hospital administration to break an established, time-honoured hospital protocol for the sake of a niche app of which many were already skeptical.

This was a hurdle no technology team could have resolved on its own. The new technology was disrupting the legacy system. The project team, with ample support from hospital managers, went about repairing the “disruption” in several unconventional ways, even going so far as to have head nurses host pizza parties to build support for Sepsis Watch among fellow RRT nurses in informal, more relaxed settings. A more formal approach involved streamlined communication strategies, such as bundling discussions of multiple high-risk patients into one composite call scheduled for when the doctors were free. Together, all these measures contributed to the success of Sepsis Watch.

This success shows the social realities that AI tools must navigate to succeed in the complexities of the real world. Technology experts warn that unless such factors are increasingly taken into account, AI models might not live up to the initial promise they generated. Marzyeh Ghassemi, an assistant professor at MIT who specializes in machine-learning applications for healthcare, summed up the situation while talking to correspondents from the MIT Technology Review: “All machine-learning systems that are ever intended to be evaluated on or used by humans must have socio-technical constraints at front of mind… the constraints that people need to be aware of are really human and logistical constraints.”
