The same technology enabling driverless cars and 3D video games is powering this new ultrasound system
Digital communication is the new normal. People use video conferencing in global business meetings, summon car rides through their phones using GPS, and share photos on social media with their friends and followers. In healthcare, a similar type of communication is improving the connection between clinicians and their colleagues, and even clinicians and their patients.
As hospitals continue to expand, scans and images are often read by remote clinicians. Sending images between the sonographer, radiologist, and patient can be like playing a game of telephone – the message sometimes gets lost along the way.
“For decades, we’ve been relying on notes, referrals or verbal explanations to understand a patient’s case,” says Dr. John Cronan, MD, Radiologist-in-Chief at Lifespan, Rhode Island’s largest health system, and Chair of Radiology at The Warren Alpert Medical School of Brown University. “We see several patients with different types of lumps, bumps, or areas of pain but it’s not always clear to which part of the body the ultrasound image is referring.”
Dr. Cronan also described a case where a patient was referred to the clinic for symptoms of a gallbladder problem. The referral described pain and hypersensitivity in the gallbladder area. However, in the exam room, the doctors found that the patient’s pain was nowhere close to the gallbladder. Because of this, they had to change their course of action – adding time to the overall process.
But Dr. Cronan and his team are now equipped with a tool – called Photo Assistant – to ensure their communication is precise.
The Photo Assistant app allows clinicians to take photos on an Android™ smartphone or tablet of the relevant areas of the body, giving the exam additional context. The photos are transmitted wirelessly and securely to the ultrasound system and are included with the clinical images sent to the reviewing physician.
The team at Rhode Island Hospital, a Lifespan hospital, is using Photo Assistant six to eight times a day in a single exam room. “This has been a complete game changer. In a clinical setting, an image is worth well over a thousand words,” says Dr. Cronan. “If we’re not in direct contact with the patient or technologist, there can be a communication gap. We find that the clinical context provided by these images can save the radiologist as much as ten minutes per case. And when you’re seeing several patients in a day, this can become a significant amount of time saved.”
The images have not only improved communication between the clinicians on a specific case, but the patients also appreciate it. “Our patients are thrilled to show us exactly where something is. They point to it, and we snap a quick picture,” adds Dr. Cronan. “They also find it reassuring to see the image directly on the ultrasound’s screen.”
The Photo Assistant app is part of the new LOGIQ™ E10 ultrasound system from GE Healthcare – a next-generation digital system that integrates artificial intelligence, cloud connectivity, and advanced algorithms to acquire and reconstruct data.
In fact, the same technology that is powering driverless cars and the next generation of 3D video gaming is behind this ultrasound system’s platform – the cSound™ Architecture. The LOGIQ™ E10 acquires data in a similar way to an MRI or CT system, and then leverages advanced GPU hardware with 48 times the data throughput and 10 times the processing power of previous systems to reconstruct the ultrasound images in real time. The cSound™ Architecture is so powerful that it can process an amount of data equivalent to playing two entire DVDs every second.
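To put the "two DVDs per second" comparison in concrete terms, here is a brief back-of-envelope calculation. It assumes a single-layer DVD capacity of about 4.7 GB, a standard figure not stated in the article, so the result is illustrative rather than a specification of the system.

```python
# Back-of-envelope estimate of the data rate implied by
# "two entire DVDs in one second".
# Assumption (not from the article): single-layer DVD ≈ 4.7 GB.
DVD_CAPACITY_GB = 4.7
DVDS_PER_SECOND = 2

implied_throughput_gb_s = DVD_CAPACITY_GB * DVDS_PER_SECOND
print(f"Implied throughput: {implied_throughput_gb_s:.1f} GB/s")  # → 9.4 GB/s
```

Roughly 9 GB of image data per second is the kind of sustained rate that general-purpose CPUs struggle with, which is why GPU-based reconstruction pipelines are used here, as they are in gaming and autonomous driving.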
Artificial intelligence – technology that mimics aspects of the human brain – has shown the potential to see the unseen, answer questions that had never even been asked, and digest information previously impossible for clinicians to process.
In the case of the LOGIQ™ E10, intelligent algorithms help to segment lesions and identify vessels today, and the embedded AI platform could lead to an earlier diagnosis and a better outcome for the patient tomorrow.