Big Data in intensive care

Recognizing complications early saves lives in the intensive care unit. The experience and knowledge of medical staff could soon be supported by algorithms.

In the intensive care unit, up to 100 GB of data are generated per critical patient per day. This data could soon contribute to patient safety in hospitals. As part of a project funded by the Swiss National Science Foundation (SNSF), researchers are developing a system for emergency and intensive care units that detects false alarms, warns staff of possible problems and thus allows early intervention.

Pulse, oxygen saturation, blood values, computed tomography and magnetic resonance imaging – the wealth of data is vast, and it takes time and experience to detect constellations that indicate a high risk of complications or problems in their early stages. As part of the "ICU Cockpit" project, researchers led by Emanuela Keller from the Neurosurgical Intensive Care Unit of the University Hospital Zurich, together with ETH Zurich and IBM Research, are working on accelerating this analysis with learning algorithms and presenting the results in a form that is easier to grasp in hectic intensive care operations.

Recognizing real complications

The system is based on data from 400 patients, anonymised before further processing, as the SNSF announced in a press release on Monday. Video recordings were also included. On this basis, the researchers trained learning algorithms to distinguish false alarms from true complications.

Reducing false alarms would make it easier for professionals to identify genuinely critical situations. In addition, various applications of the system are intended to give early warning of epileptic seizures or secondary brain damage.
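
The article does not describe the project's actual models, but the task it outlines, separating false alarms from true complications, is a binary classification problem. The following is a minimal, illustrative sketch in Python with scikit-learn; the synthetic features, labels and the random-forest model are all assumptions for illustration, not the ICU Cockpit implementation.

```python
# Illustrative sketch only: synthetic data and model choice are
# assumptions, not the ICU Cockpit project's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Hypothetical feature windows: each row summarises a monitoring
# window around an alarm (e.g. mean/std of heart rate, SpO2,
# blood pressure). Labels: 1 = true complication, 0 = false alarm.
X = rng.normal(size=(1000, 6))
y = (X[:, 0] + 0.5 * X[:, 2]
     + rng.normal(scale=0.5, size=1000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Train a classifier to separate true complications from false alarms.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

print(classification_report(
    y_test, clf.predict(X_test),
    target_names=["false alarm", "true complication"]))
```

In practice, such a model would be trained on features extracted from the monitoring streams and video recordings mentioned above, with alarm labels confirmed by clinicians rather than generated synthetically.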


Source: https://www.computerworld.ch/software/big-data/big-data-intensivstation-2401191.html
