The field of radiology has a unique vocabulary. However, a new language stratum is sneaking in, spurred by the emergence of artificial intelligence (AI) in diagnostic and interventional imaging. Astute radiologists, technologists, and others who work within or at the fringes of this realm are getting familiar with the basic terminology of AI.
Why AI cannot be ignored
Why can't overworked medical professionals with already burgeoning continuing education requirements disregard AI as a fad or trend, instead of remaining focused on radiology specific training? Because AI is not a fad or trend. It is already part of everyday life, and it is not likely to go away anytime soon. We live in a culture that requires task output that cannot be accomplished efficiently or affordably by human labor. That need – processing vast volumes of input to create predictable output – was the original genesis of computers, and it continues to drive the development of AI.
AI and healthcare
The public became aware of AI (though not generally by that name) gradually, almost unknowingly. Some instances had the novelty of a vigilant doorbell or electronic personal assistant. Others slipped quietly into our email spam filters, map apps, and internet searches.
Healthcare is no different. AI made its entrance into medicine in the early 1970s with MYCIN, an expert system designed to identify bacterial infections and recommend antibiotic treatment. Today AI is used in robotic surgical techniques, diagnostic tools, workflows, and imaging.
This industry’s investment in AI is forecast at $6.6 billion by 2021[1] and predicted to yield $150 billion in annual savings by 2026.[2] Forbes[3] echoes this outlook, citing AI as a tremendous growth opportunity in the healthcare sector, with the power to revolutionize areas such as in-patient care, insurance, clinical research, and drug development.
In a recent Radiology Today article, Bibb Allen, MD, FACR, CMO of the American College of Radiology’s (ACR) Data Science Institute (DSI), talked about existing and upcoming functions of AI in radiology. “‘AI-lite’ is already being used in radiology in a number of ways, such as computer-aided detection for cancer, auto-segmentation of organs in 3D postprocessing, natural language processing to facilitate critical results reporting, consultation of best guidelines for recommendations, and quantification and kinetics in postprocessing.” Allen goes on to say, “We believe that AI is poised to significantly increase the value radiology professionals are able to provide their patients. While AI for imaging will not come all at once, early adopters of AI in their practices will be ready to be future leaders in health care.”[4]
Four critical AI terms every radiology worker needs to understand:
- AI – Set your childhood sci-fi notions aside. In plain English, artificial intelligence is the area of computer science focused on creating machine and software systems capable of using data to solve problems or make decisions in ways similar to human thinking. How does the popular social media platform Pinterest know what might interest you when so few users label their pins? Pinterest uses AI computer vision[5] to identify objects in images you have pinned, search the database for other photos with similar objects, and recommend them to you.
- Algorithm – This is a set of unambiguous, step-by-step, mathematical instructions that tells a computer how to perform a task. These instructions can be direct and straightforward, such as sending a bulk email at a specific time. Or, an algorithm can be a complex, layered instruction set, such as the visual and spatial recognition that tells an autonomous vehicle to avoid hitting a pedestrian. For a down-to-earth visualization of an algorithm, think about telling a child exactly how to make a PB&J sandwich. Take a plate (better to say a large plate if you have more than one size, to avoid ambiguity) from the cabinet and put it on the counter. Place two (again being specific) slices of bread from the bag in the breadbox onto the plate. Take the jar of peanut butter from the shelf and remove the cap. And so on. That is basically how a mathematical algorithm instructs a computer.
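The sandwich analogy above can be sketched as a toy program. This is our own illustration, not a real-world algorithm: the function name and steps are invented, but the principle is the same, a fixed list of unambiguous instructions executed in order.

```python
# A toy "algorithm": the PB&J steps written as explicit,
# unambiguous instructions a computer could follow in order.
def make_sandwich():
    steps = [
        "take a large plate from the cabinet",
        "place two slices of bread on the plate",
        "open the jar of peanut butter",
        "spread peanut butter on one slice",
        "spread jelly on the other slice",
        "press the two slices together",
    ]
    return steps

# Execute (here, print) each step in the exact order given.
for number, step in enumerate(make_sandwich(), start=1):
    print(f"Step {number}: {step}")
```

Nothing here is "intelligent"; the computer does precisely what it is told, no more and no less. That rigidity is what separates a plain algorithm from the learning techniques described next.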
- Machine learning – The concept of AI spawned the powerful application of machine learning. The computer is supplied with extensive, relevant datasets and appropriate algorithms but not with explicit instructions for how to perform. It uses the information to "learn" by identifying patterns and signals within the data. The computer then applies this "experience" as the basis for decision making when confronted with new problems.
You see an example of machine learning every time you use a large online retailer like Amazon.com. Through machine learning, the platform develops insight into associations between products available for sale. It recognizes the likelihood that if you have placed a faucet water filtration device into your virtual shopping cart, you may also need refill cartridges.
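The shopping-cart idea can be sketched in a few lines of Python. This toy "learns" product associations by counting which items appear together in past orders, then recommends the most frequent partner of an item in your cart. The product names and orders are invented for illustration; real retail systems use far larger datasets and more sophisticated models.

```python
from collections import Counter
from itertools import combinations

# Invented purchase history: each list is one past order.
orders = [
    ["faucet filter", "refill cartridges"],
    ["faucet filter", "refill cartridges", "sponge"],
    ["sponge", "dish soap"],
    ["faucet filter", "refill cartridges"],
]

# "Learning" step: count how often each pair of products is bought together.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(set(order)), 2):
        pair_counts[(a, b)] += 1

def recommend(item):
    # Pick the product most often seen alongside `item` in past orders.
    partners = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            partners[b] += n
        elif b == item:
            partners[a] += n
    return partners.most_common(1)[0][0] if partners else None

print(recommend("faucet filter"))  # prints "refill cartridges"
```

Notice that no rule "filters need cartridges" was ever written; the association emerged from the data, which is the essence of machine learning.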
- Deep learning – A neural network, the foundation of deep learning, is inspired by the functioning of the human brain. This simplified model is composed of interconnected artificial neurons. As computations are performed on data, artificial neurons activate or “fire,” passing information to the next layer within the network. The process of using a neural network with multiple, hidden layers to solve problems is deep learning. This elevates machine learning to a progressively sensitive level that can evaluate more abstract patterns.
Video-to-video synthesis[6] may not sound like an everyday example of deep learning, but if you or your kids play video games, you have seen it. This deep learning approach generates photorealistic gaming visuals by learning from real-life video footage.
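The layered idea can be sketched minimally in Python. Each artificial neuron computes a weighted sum of its inputs and "fires" through an activation function, passing its value to the next layer. The weights below are hand-picked for illustration; a trained network would learn them from data.

```python
import math

def sigmoid(x):
    # Activation function: squashes any number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def layer(inputs, weights, biases):
    # One neuron per row of `weights`; each neuron "fires" on its
    # weighted sum of the inputs plus a bias term.
    return [
        sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

# Two inputs flow through a hidden layer of two neurons,
# then into a single output neuron.
inputs = [0.5, -1.2]
hidden = layer(inputs, [[0.8, 0.2], [-0.5, 0.9]], [0.1, 0.0])
output = layer(hidden, [[1.5, -1.0]], [0.2])
print(output[0])  # a value between 0 and 1
```

Stack many such hidden layers between input and output and you have a "deep" network: each layer builds on the previous one, which is what lets deep learning pick out increasingly abstract patterns.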
Are you interested in gaining a better understanding of how AI applies to healthcare technologies and techniques? Watch for Part 2 in this series to raise your comfort level with more AI terms.