"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has."

Margaret Mead

Review article
peer-reviewed

Artificial Intelligence: Is It Armageddon for Breast Radiologists?



Abstract

Artificial intelligence (AI) has taken radiology by storm, particularly mammogram interpretation, and we have seen a recent surge in the number of publications on potential uses of AI in breast radiology. Breast cancer places a substantial burden on the National Health Service (NHS) and was the second most common cancer in the UK as of 2018. New cases of breast cancer have risen over the past decade, while survival has improved, largely because of the NHS breast cancer screening programme. The expansion of the screening programme has generated more mammograms and therefore more work for radiologists, and the requirement for double reading adds further to the workload. Computer-aided detection (CAD) systems were introduced to help radiologists but did not deliver the expected improvement in reader performance. The unreliability of CAD systems has contributed to an explosion of studies and the development of applications with potential uses in breast imaging. The reported successes of machine learning in breast radiology have led some to suggest that AI will replace breast radiologists. AI certainly has many applications and potential uses in radiology, but will it replace radiologists? We reviewed the literature on the use of AI in breast radiology to give radiologists and future radiologists a comprehensive overview of this topic. This article explains the basic principles and terminology of AI in radiology and its potential uses and limitations. We have also analysed the literature to answer the question of whether AI will replace radiologists.

Introduction & Background

Women in the UK have a one-in-eight chance of developing breast cancer during their lifetime [1]. According to official statistics for 2018, breast cancer is the second most common cancer diagnosed in the UK, with 47,476 recorded cases, and it has been the most frequently diagnosed cancer in England since 1996 [2,3]. The incidence rate increased from 2008 to 2018 while the mortality rate decreased, as shown in Table 1.

Table 1: Breast cancer mortality and incidence rates, 2008 and 2018

Year   Mortality rate   Incidence rate
2008   39.9             165.5
2018   34.1             167.7

This decreasing mortality contrasts with the increase in new cases, indicating that survival from breast cancer has improved [3]. This is a result of earlier detection of breast cancer through expanded mammography screening [3]. Mammography is the best method available to detect breast cancer before lesions become clinically apparent, and it reduces mortality by as much as 30% [4]. However, mammograms are complex, and the high number of examinations per reader (1.79 million mammograms were performed in 2017-18 under the National Health Service [NHS] Breast Screening Programme) can result in inaccurate diagnoses [1,5]. The risk of incorrect diagnosis led to the double reading of mammograms in the UK and Europe [6]. Double reading increases the sensitivity of mammography by 5%-15% compared with single reading [6]. Computer-aided detection (CAD) systems were introduced because radiologists missed about 25% of cancers visible on mammograms due to interpretation errors [7]; the aim was to improve human detection performance [8]. However, recent studies show that CAD models have not had the expected impact, with no significant improvement in the diagnostic accuracy of mammography or reduction in recall rates [9]. Advances in AI promise to address the flaws of CAD systems [10]. Present-day breast radiologists and those aspiring to become radiologists should be aware of the fundamental principles of AI, its uses, its limitations, and what the future holds. The technical nitty-gritty of AI is not of much importance to radiologists; still, they must know the jargon used by application developers in order to communicate with them and prepare efficiently for the future.

Review

Artificial intelligence

Artificial intelligence (AI) technology has existed for more than half a century and has become increasingly sophisticated [11]. The first reports on the use of AI in radiology date back to 1963, but the enthusiasm of those years quickly died down [12]. A recent increase in the amount of electronic medical data, together with technological improvements, has brought new vigour to AI applications [11-13]. AI is defined as technology that can broadly mimic human intelligence [11-13]. AI has two broad categories, namely general AI and narrow AI [11-13]. General AI is a concept in which machines can function with and exhibit all the intellectual capabilities of humans, such as reasoning, seeing and even hearing, whereas narrow AI refers to technologies that can only perform specific tasks [11-13]. Narrow AI is what is achievable at this time; general AI remains a pipe dream.

Machine Learning

The idea of introducing this technology into medicine started with tools known as expert systems, whose goal was to build algorithms that would make decisions like physicians [13-15]. However, it was not possible to write an algorithm for every disease, given the different presentations and scenarios encountered in day-to-day medicine [13-15]. Machine learning (ML) then took over from expert systems in the 1990s [13-15]. ML is a component of AI that tries to replicate the way the human brain learns [13-15]. In ML, the algorithm learns from exposure to large, new data sets and improves with continued exposure to data [13]. There is no need for explicit programming in ML, which differentiates it from standard programming, where the clear, step-by-step instructions the program must take have to be written out [14]. The sketch below illustrates this difference.
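
As a minimal, purely illustrative sketch of this difference, the Python snippet below fits a simple classifier on the Wisconsin breast cancer data set bundled with scikit-learn (tabular fine-needle aspirate features rather than mammograms); the choice of model and library is our own and is not drawn from the cited studies.

```python
# Minimal sketch: the model "learns" a decision rule from labelled examples
# instead of being given explicit, hand-written diagnostic rules.
from sklearn.datasets import load_breast_cancer            # tabular breast FNA features (not mammograms)
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)                 # features plus benign/malignant labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=5000)                   # no step-by-step rules are written anywhere
model.fit(X_train, y_train)                                 # the "learning" step
print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```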

Categories of ML

ML can be divided into supervised, unsupervised and reinforcement learning [13-15]. In unsupervised learning, there is no labelled input data [13-15]; the algorithm has to analyse the data, group it and produce an output, with no feedback supplied [13-15]. In reinforcement learning, the algorithm analyses data and is either rewarded or punished depending on the accuracy of the output it produces (reinforcement) [13-15]; the machine learns how to act in a particular environment so as to maximise rewards [15]. Supervised learning is when the algorithm is provided with labelled data to learn from; this is known as the training phase [15]. The algorithm is expected to find recurring patterns in the training data and to be able to pair inputs with results. In radiology, this means the mammogram diagnosis suggested by AI has to match the tissue diagnosis [13]. The sketch below contrasts the supervised and unsupervised approaches.
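
The following sketch contrasts supervised and unsupervised learning on the same feature matrix; the specific algorithms (k-means clustering and a k-nearest-neighbour classifier) are illustrative choices of ours, not methods taken from the cited studies.

```python
# Supervised vs. unsupervised learning on the same feature matrix.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.cluster import KMeans                  # unsupervised: never sees the labels
from sklearn.neighbors import KNeighborsClassifier  # supervised: trained on labelled data

X, y = load_breast_cancer(return_X_y=True)

# Unsupervised: the algorithm only groups the data; no feedback is supplied.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Supervised: the algorithm learns to pair inputs with the known labels.
clf = KNeighborsClassifier().fit(X, y)

print("Cluster sizes found without labels:", np.bincount(clusters))
print("Training accuracy of supervised model:", (clf.predict(X) == y).mean())
```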

Artificial Neural Networks

Artificial neural networks (ANNs) are a component of ML that uses mathematical and statistical principles to analyse data [16]. Their development was inspired by the way biological nervous systems process information using large numbers of highly interconnected neurons [16]. An ANN has one input layer and one output layer of neurons [13-16]. Between the input and output are one or more layers known as "hidden layers." Each hidden layer consists of a set of neurons connected to all neurons in the previous and following layers [13-16]. A minimal sketch of such a network follows.
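
The sketch below shows a minimal fully connected network, assuming PyTorch is available; the layer sizes and the 30 hypothetical input features are arbitrary and purely illustrative.

```python
# Minimal fully connected ANN: one input layer, one hidden layer, one output neuron.
import torch
import torch.nn as nn

ann = nn.Sequential(
    nn.Linear(30, 16),   # input layer (30 hypothetical image-derived features) -> hidden layer
    nn.ReLU(),
    nn.Linear(16, 1),    # hidden layer -> single output neuron
    nn.Sigmoid(),        # output between 0 and 1, e.g. a suspicion of malignancy (illustrative)
)

x = torch.randn(1, 30)   # one example with 30 features
print(ann(x))            # forward pass through the network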

Deep Neural Network (Deep Learning)

Deep learning (DL) is the AI concept used in image interpretation. A deep neural network (DNN) is an ANN consisting of five or more connected layers of algorithms, organised so that successive layers capture increasingly meaningful representations of the data, which enables improved predictions [13-16]. These layers store information from inputs and provide an output that changes in an orderly manner as the AI system learns new features from the data [13-16]. DNNs are attractive because they continuously improve as the size of the training data increases and because they work not only as classifiers but also as feature extractors [13-17]. The differences between a DNN and a simple neural network are illustrated in Figure 1, and a sketch of a deeper network is given below.
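
Continuing the illustration above, the sketch below stacks more hidden layers to form a DNN and shows how the same network can act both as a feature extractor and as a classifier; the architecture is arbitrary and for illustration only.

```python
# A deeper stack of layers (a DNN), built the same way as the simple ANN above.
import torch
import torch.nn as nn

feature_extractor = nn.Sequential(   # successive hidden layers capture more abstract features
    nn.Linear(30, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 16), nn.ReLU(),
)
classifier = nn.Linear(16, 1)        # final layer turns the learned features into a prediction

x = torch.randn(8, 30)               # a batch of 8 hypothetical examples
features = feature_extractor(x)      # the DNN used as a feature extractor
logits = classifier(features)        # ...and as a classifier
print(features.shape, logits.shape)
```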

Convolutional Neural Networks

A typical DL network takes one-dimensional inputs, while convolutional neural networks (CNNs) take two- or three-dimensional data [15,18]. Convolution is a mathematical operation used to find repeated features in images [15,18]. CNNs consist of an input, an output and many hidden layers that extract useful information by convolving (filtering) the data. CNNs are the most commonly used AI tool in breast imaging [15,18]. Figure 2 shows how a CNN extracts data from an image and processes the data to give a possible diagnosis; a minimal sketch is shown below.
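
A minimal CNN sketch follows, again assuming PyTorch; the architecture and the fake single-channel 224x224 input patch are illustrative and do not correspond to any specific published model.

```python
# Minimal CNN: convolutional layers filter a 2D image, then a fully connected
# layer turns the extracted features into a single suspicion score (illustrative).
import torch
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # convolve a single-channel (greyscale) image
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),                     # pool each feature map down to one value
    nn.Flatten(),
    nn.Linear(16, 1),                            # one output: a suspicion score
)

patch = torch.randn(1, 1, 224, 224)              # one fake 224x224 mammogram patch
print(cnn(patch).shape)                          # -> torch.Size([1, 1])
```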

Transfer Learning

Transfer learning is the reuse of knowledge obtained from one task on a separate but related task, for example, the use of an algorithm trained for non-medical visual recognition as a feature extractor on mammograms [15]. Knowledge gained from the analysis of ordinary photographic images is transferable to mammogram analysis [15]. A sketch of this idea follows.
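
The sketch below, assuming a recent version of torchvision (0.13 or later), freezes an ImageNet-pretrained ResNet-18 as a feature extractor and gives it a new, trainable classification head; the data set and training loop are omitted, and the two-class head is our illustrative choice.

```python
# Transfer learning: reuse a network pretrained on everyday photographs (ImageNet)
# as a starting point for a mammogram classifier. Data loading and training omitted.
import torch.nn as nn
from torchvision import models

backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)  # transferred knowledge
for param in backbone.parameters():
    param.requires_grad = False                       # freeze the pretrained feature extractor

backbone.fc = nn.Linear(backbone.fc.in_features, 2)   # new, trainable head: benign vs. malignant
```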

Use of AI in breast imaging

The role of AI in breast imaging is not about finding out whether machines are more ingenious than humans. Instead, AI is for expanding, sharpening and relaxing the mind of the radiologist so that radiologists can do the same for their patients [19]. The following are current and possible future uses of AI in breast radiology.

Image Interpretation 

Advances in AI and imaging technology have led to an increase in the proposed applications of AI in breast radiology, and a number of studies have explored the use of deep learning in mammogram interpretation. DL has been tried in the diagnosis of breast pathology in a number of settings, such as differentiating benign from malignant breast masses, separating masses from micro-calcifications, distinguishing between tumour and healthy tissue, discriminating between benign, malignant and healthy tissue, and detecting masses in mammogram images [20-24]. AI also has potential uses in density segmentation and risk calculation, for example, classifying breast tissue into density categories such as scattered and uniformly dense. Further applications include image segmentation (mapping the edges of a lesion), lesion identification, measurement and labelling, comparison with previous images, comparison of images from the left and right breasts and of the craniocaudal and mediolateral-oblique views of each breast, and breast anatomy classification in mammograms [25-28].

Radiomics 

Radiomics is the extraction of large numbers of features from diagnostic images using algorithms [29,30]. These features are known as radiomic features [29,30]. On a mammogram, radiomics extracts vast numbers of features that the human eye cannot see and links these features with each other and with other data [29,30]. The resulting data provide valuable information to radiologists, help to predict prognosis and treatment response, and have many other potential uses, such as differentiating benign from malignant breast tumours [30]. A toy example of feature extraction is sketched below.
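
As a toy illustration of what radiomic features can look like, the sketch below computes a few first-order and texture features from a randomly generated patch, assuming scikit-image (0.19 or later) is installed; real radiomics pipelines, such as pyradiomics, extract hundreds of standardised features.

```python
# Toy radiomics example: a few first-order and texture features from a (fake) lesion patch.
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19

patch = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)  # stand-in for a lesion ROI

# First-order (histogram) features
first_order = {"mean": patch.mean(), "std": patch.std()}

# Second-order (texture) features from a grey-level co-occurrence matrix
glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
texture = {
    "contrast": graycoprops(glcm, "contrast")[0, 0],
    "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
}

print(first_order, texture)
```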

Imaging Banks

Vast amounts of data are now stored thanks to the continually increasing memory capacity of computers [29]. Overloaded picture archiving and communication systems (PACS) are now familiar because we save raw images and large volumes of data generated from imaging [29]. AI can be used to store quantitative images, and these stored images would be a handy resource for the training data sets on which algorithms are trained [29].

Structured Reporting

AI can be used to standardise reporting by helping with the reporting template, assisting with the choice of vocabulary and recommending the most probable diagnosis [31].

Deciding Exam Priority Level

AI tools can also help the radiologist decide exam priority using appropriateness criteria [15]. A clinical decision support (CDS) system helps referring clinicians decide on the most relevant imaging procedure [32,33]. AI, together with CDS, can improve this process and ultimately increase efficiency in the radiology department [29,33].

Quick Identification of Negative Studies

AI can be used to improve sensitivity by identifying studies that are normal and leaving the rest for human readers [29,34]. This practice would be useful in high-volume sites or where double reporting is practised [29,34]. Another approach is to use AI to classify exams into normal and abnormal, so that a single radiologist reads those labelled as normal while the abnormal ones go for double reading [29,34]. A sketch of such a triage rule is given below.
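
The sketch below illustrates one possible form of such a triage rule: studies with a very low AI suspicion score are routed to a single reader and the remainder to double reading. The Study class, the scores and the threshold are all hypothetical, and any real threshold would need careful local validation.

```python
# Hypothetical triage rule: clearly normal studies go to a single reader,
# the remainder go for double reading. Scores and threshold are illustrative only.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Study:
    study_id: str
    ai_score: float          # model's suspicion score in [0, 1] (hypothetical)

def triage(studies: List[Study], threshold: float = 0.05) -> Dict[str, List[str]]:
    """Split a worklist by AI score; any real threshold needs careful local validation."""
    single_read = [s.study_id for s in studies if s.ai_score < threshold]
    double_read = [s.study_id for s in studies if s.ai_score >= threshold]
    return {"single_read": single_read, "double_read": double_read}

worklist = [Study("MAM001", 0.01), Study("MAM002", 0.40), Study("MAM003", 0.03)]
print(triage(worklist))
```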

Clerical Work and Clinical Data Management

Physicians are now burdened with increasing amounts of administrative work, which is among the leading causes of burnout, and AI applications can be brought in to help in this area [34-37]. The ultimate goal of such ML-based systems is to help healthcare workers cut documentation time and improve report quality [34-37].

AI challenges

Acceptance of AI

In the early days of the autopilot, pilots were reluctant to accept the technology, and this is the situation now with radiologists fearing for their jobs [34-38]. However, technological advancements are not new to radiology; they have always been part of the specialty, and technology is what drives radiology [34-38]. Bertalan Mesko has dubbed AI "the stethoscope of the 21st century" [36]. It was not easy for early physicians to incorporate and trust the stethoscope as one of their best tools [36]. There is a need to sensitise people about AI through different channels to make its adoption smooth [37]. We should not forget acceptance by patients as well: patients need to be informed about AI, and their consent is needed before AI is used for image interpretation. Patients should be able to choose between AI and humans.

Training Data

Vast numbers of images are now available from PACS, but the challenge lies in labelling the data for AI training [28-38]. Image labelling takes a lot of time and effort, and the process must be very robust [29-38]. Another problem arises with rare conditions: it is difficult to find enough images to train the algorithm to identify them on its own in future [29-38]. Sometimes random variations in images can be interpreted by the program as a pathological lesion [29-38]. If the input data used in training come from a particular ethnic group, age group or gender, the algorithm may give different results when presented with raw data from other, more diverse groups of people [15].

Medicolegal Issues

The question is: if an AI system makes an autonomous decision and makes a mistake, who is responsible, the radiologist, the machine or the builder of the device [38-43]? Physicians have always taken responsibility for the medical decisions made for patients. If something goes wrong, the programmers may not take responsibility, given that the machines are continuously learning in ways not known to their developers. Recommendations provided by the tool may therefore need to be ratified by a radiologist, who may agree or disagree with the software [38-43].

Is it the end for breast radiologists?

Geoffrey Hinton, a cognitive psychologist and computer scientist from Canada, was quoted as saying, "if you work as a radiologist, you're like the coyote that's already over the edge of the cliff, but hasn't yet looked down so doesn't realize there's no ground underneath him. People should stop training radiologists now. It's just self-evident that within five years, deep learning is going to do better than radiologists; we've got plenty of radiologists already" [29]. Yes, in terms of image analysis and computing, AI has the potential to be more efficient than radiologists [29-45]. Radiologists cannot process millions of mammograms in any reasonable space of time, and this has led to suggestions that radiologists will be displaced by AI [29-45]. In recent years, we have seen algorithms in various domains that are comparable to, or even better than, humans at various radiological tasks, especially in breast imaging, which is why it is the main focus of this article [29-45]. Because of this success of DL in breast imaging, people started discussing the possibility of automating image interpretation and burying radiologists. These are far-fetched expectations; AI systems have the limitations discussed above. It is also crucial to know that AI is good at solving narrowly defined, isolated problems [15-29]. In contrast, humans can understand different concepts, reason, and bring together vast amounts of information from various sources to reach an inclusive decision [29-45]. It is difficult to predict the future of AI in radiology. Still, many authors believe that AI will become part and parcel of the daily job of radiologists, making them more efficient [29-45]. AI will be valuable in performing routine tasks, helping radiologists to concentrate on more valuable work [38-45]. Radiologists will spend more time in discussion with clinicians and in multidisciplinary team meetings, contributing to the long-term care of patients. More time will also be freed up for communicating abnormal results in person or by telephone, as well as for quality enforcement, education, policy formulation and interventional procedures [29-42]. In general, when automation comes, jobs are not lost; rather, humans promote themselves to tasks needing a human touch [43].

Future roles of a radiologist

AI will be the biggest thing to come to radiology since Roentgen. Radiologists have a vital role to play for the good of both AI and their own profession [29-44]. They are an essential piece of the puzzle for several of the challenges AI is facing now, such as the production of training images [29-47]. Large numbers of labelled images provided by experienced radiologists are needed, and creating these datasets is difficult and time-consuming [29-45]. However, necessary as it is, image labelling is not the radiologist's only role. Radiologists should be involved in directing programmers to the areas where AI methods are needed most, because they are the end-users and they know where they need help [40-47]. There is a need for greater partnership with AI developers in the development of applications to be used in breast radiology [46]. These professionals should become part and parcel of radiology departments, creating a "multidisciplinary AI team" that will ensure the adequacy of patient safety standards. In this way, radiologists can take responsibility for legal liability [45,46]. There is also a need to develop rigorous criteria for evaluating applications before they are licensed for use, and radiologists should take on this responsibility [29-47].

Conclusions

Indeed, AI will change radiology, and it is not too early to incorporate it into your workplace. While there are many grey areas that need clarification, our opinion is that AI will not replace radiologists. Still, those who incorporate AI into their daily work will probably be better off than those who don't. Every radiologist must prepare for a future of working more closely with machines. We predict a future in which radiologists will continue to make strides and draw many benefits from the use of more sophisticated AI systems over the next 10 years. Anyone interested in breast radiology, from medical students to radiologists, can anticipate a more satisfying and rewarding career, especially with skills in programming. AI knowledge and data science must be part of the curriculum for medical students and radiology trainees. Finally, it is important to say that our patients, not AI, should be central to every decision that we take as professionals.


References

  1. Breast cancer screening. (2018). Accessed: February 21, 2020: https://www.nhs.uk/conditions/breast-cancer-screening/.
  2. Breast cancer statistics. (2018). Accessed: February 21, 2020: https://www.cancerresearchuk.org/health-professional/cancer-statistics/statistics-by-cancer-type/breast-cancer.
  3. Cancer registration statistics, England: first release, 2018. (2018). Accessed: February 21, 2020: https://www.gov.uk/government/publications/cancer-registration-statistics-england-2018/cancer-registration-statistics....
  4. Dheeba J, Singh AN, Selvi TS: Computer-aided detection of breast cancer on mammograms: a swarm intelligence optimized wavelet neural network approach. J Biomed Inform. 2014, 49:45-52. 10.1016/j.jbi.2014.01.010
  5. Welch HG, Passow HJ: Quantifying the benefits and harms of screening mammography. JAMA Intern Med. 2014, 174:448-454. 10.1001/jamainternmed.2013.13635
  6. Watanabe AT, Lim V, Vu HX, et al.: Improved cancer detection using artificial intelligence: a retrospective evaluation of missed cancers on mammography. J Digit Imaging. 2019, 32:625-637. 10.1007/s10278-019-00192-5
  7. Dang PA, Freer PE, Humphrey KL, Halpern EF, Rafferty EA: Addition of tomosynthesis to conventional digital mammography: effect on image interpretation time of screening. Radiology. 2014, 270:49-56. 10.1148/radiol.13130765
  8. Li Y, Chen H, Cao L, Ma J: A survey of computer-aided detection of breast cancer with mammography. J Health Med Inform. 2016, 7:238. 10.4172/2157-7420.1000238
  9. Lehman CD, Wellman RD, Buist DSM, Kerlikowske K, Tosteson ANA, Miglioretti DL: Diagnostic accuracy of digital screening mammography with and without computer-aided detection. JAMA Intern Med. 2015, 175:1828-1837. 10.1001/jamainternmed.2015.5231
  10. Lee JG, Jun S, Cho YW, Lee H, Kim GB, Seo JB, Kim N: Deep learning in medical imaging: general overview. Korean J Radiol. 2017, 18:570-584. 10.3348/kjr.2017.18.4.570
  11. Nawrocki T, Maldjian PD, Slasky SE, Contractor SG: Artificial intelligence and radiology: have rumors of the radiologist's demise been greatly exaggerated?. Acad Radiol. 2018, 25:967-972. 10.1016/j.acra.2017.12.027
  12. Lodwick GS, Keats TE, Dorst JP: The coding of roentgen images for computer analysis as applied to lung cancer. Radiology. 1963, 81:185-200. 10.1148/81.2.185
  13. What’s the difference between artificial intelligence, machine learning, and deep learning?. (2016). Accessed: March 1, 2020: https://blogs.nvidia.com/blog/2016/07/29/whats-difference-artificial-intelligence-machine-learning-deep-learning-ai/.
  14. Kohli M, Prevedello LM, Filice RW, Geis JR: Implementing machine learning in radiology practice and research. AJR Am J Roentgenol. 2017, 208:754-760. 10.2214/AJR.16.17224
  15. Choy G, Khalilzadeh O, Michalski M, et al.: Current applications and future impact of machine learning in radiology. Radiology. 2018, 288:318-328. 10.1148/radiol.2018171820
  16. Dayhoff JE, DeLeo JM: Artificial neural networks: opening the black box. Cancer. 2001, 91:1615-1635.
  17. Waldrop MM: New feature: what are the limits of deep learning?. Proc Natl Acad Sci U S A. 2019, 116:1074-1077. 10.1073/pnas.1821594116
  18. Ribli D, Horváth A, Unger Z, Pollner P, Csabai I: Detecting and classifying lesions in mammograms with deep learning. Sci Rep. 2018, 8:4165. 10.1038/s41598-018-22437-z
  19. Your future doctor may not be human. This is the rise of AI in medicine. (2018). Accessed: March 6, 2020: https://futurism.com/ai-medicine-doctor.
  20. Cheng J, Ni D, Chou Y, et al.: Computer-aided diagnosis with deep learning architecture: applications to breast lesions in us images and pulmonary nodules in CT scans. Sci Rep. 2016, 6:24454. 10.1038/srep24454
  21. Wang J, Yang X, Cai H, Tan W, Jin C, Li L: Discrimination of breast cancer with microcalcifications on mammography by deep learning. Sci Rep. 2016, 6:27327. 10.1038/srep27327
  22. Ibrahim A, Gamble P, Jaroensri R, Abdelsamea MM, Mermel CH, Chen CP, Rakha EA: Artificial intelligence in digital breast pathology: techniques and applications. Breast. 2020, 49:267-273. 10.1016/j.breast.2019.12.007
  23. Ragab DA, Sharkas M, Marshall S, Ren J: Breast cancer detection using deep convolutional neural networks and support vector machines. PeerJ. 2019, 7:e6201. 10.7717/peerj.6201
  24. Agarwal R, Diaz O, Lladó X, Yap MH, Martí R: Automatic mass detection in mammograms using deep convolutional neural networks. J Med Imaging (Bellingham). 2019, 6:031409. 10.1117/1.JMI.6.3.031409
  25. Kallenberg M, Petersen K, Nielsen M, et al.: Unsupervised deep learning applied to breast density segmentation and mammographic risk scoring. IEEE Trans Med Imaging. 2016, 35:1322-1331. 10.1109/TMI.2016.2532122
  26. Liew C: The future of radiology augmented with artificial intelligence: a strategy for success. Eur J Radiol. 2018, 102:152-156. 10.1016/j.ejrad.2018.03.019
  27. Christoyianni I, Constantinou E, Dermatas E: Automatic detection of abnormal tissue in bilateral mammograms using neural networks. Methods and Applications of Artificial Intelligence. SETN 2004. Lecture Notes in Computer Science. Vouros GA, Panayiotopoulos T (ed): Springer, Berlin, Heidelberg; 2004. 3025:265-275. 10.1007/978-3-540-24674-9_28
  28. Fonseca P, Mendoza J, Wainer J, Ferrer J, Pinto J, Guerrero J, Castaneda B: Automatic breast density classification using a convolutional neural network architecture search procedure. Proc SPIE Med Imaging. 2015, 9414:941428. 10.1117/12.2081576
  29. Neri E, de Souza N, Brady A, Bayarri AA, Becker CD, Coppola F, Visser J: What the radiologist should know about artificial intelligence - an ESR white paper. Insights Imaging. 2019, 10:44. 10.1186/s13244-019-0738-2
  30. Gillies RJ, Kinahan PE, Hricak H: Radiomics: images are more than pictures, they are data. Radiology. 2016, 278:563-577. 10.1148/radiol.2015151169
  31. Hong Y, Kahn CE Jr: Content analysis of reporting templates and free-text radiology reports. J Digit Imaging. 2013, 26:843-849. 10.1007/s10278-013-9597-4
  32. Remedios D, Brkljacic B, Ebdon-Jackson S, Hierath M, Sinitsyn V, Vassileva J: Collaboration, campaigns and champions for appropriate imaging: feedback from the Zagreb workshop. Insights Imaging. 2018, 9:211-214. 10.1007/s13244-018-0602-9
  33. Sachs PB, Gassert G, Cain M, Rubinstein D, Davey M, Decoteau D: Imaging study protocol selection in the electronic medical record. J Am Coll Radiol. 2013, 10:220-222. 10.1016/j.jacr.2012.11.004
  34. Sechopoulos I, Mann RM: Stand-alone artificial intelligence - the future of breast cancer screening?. Breast. 2020, 49:254-260. 10.1016/j.breast.2019.12.014
  35. Will doctors fear being replaced by AI in the hospital settling?. (2018). Accessed: March 5, 2020: https://emerj.com/ai-sector-overviews/will-doctors-fear-being-replaced-by-ai-in-the-hospital-settling/.
  36. Artificial intelligence is the stethoscope of the 21st century. (2017). Accessed: March 5, 2020: https://medicalfuturist.com/ibm-watson-is-the-stethoscope-of-the-21st-century/.
  37. Ahuja AS: The impact of artificial intelligence in medicine on the future role of the physician. PeerJ. 2019, 7:e7702. 10.7717/peerj.7702
  38. Hosny A, Parmar C, Quackenbush J, Schwartz LH, Aerts HJWL: Artificial intelligence in radiology. Nat Rev Cancer. 2018, 18:500-510. 10.1038/s41568-018-0016-5
  39. Pesapane F, Volonté C, Codari M, Sardanelli F: Artificial intelligence as a medical device in radiology: ethical and regulatory issues in Europe and the United States. Insights Imaging. 2018, 9:745-753. 10.1007/s13244-018-0645-y
  40. Altman RB: Artificial intelligence (AI) systems for interpreting complex medical datasets. Clin Pharmacol Ther. 2017, 101:585-586. 10.1002/cpt.650
  41. Recht M, Bryan RN: Artificial intelligence: threat or boon to radiologists?. J Am Coll Radiol. 2017, 14:1476-1480. 10.1016/j.jacr.2017.07.007
  42. Russell S, Bohannon J: Fears of an AI pioneer. Science. 2015, 349:252. 10.1126/science.349.6245.252
  43. Jha S, Topol EJ: Adapting to artificial intelligence: radiologists and pathologists as information specialists. JAMA. 2016, 316:2353-2354. 10.1001/jama.2016.17438
  44. Ravi D, Wong C, Deligianni F, Berthelot M, Andreu-Perez J, Lo B, Yang G: Deep learning for health informatics. IEEE J Biomed Health Inform. 2017, 21:4-21. 10.1109/JBHI.2016.2636665
  45. Chartrand G, Cheng PM, Vorontsov E, et al.: Deep learning: a primer for radiologists. Radiographics. 2017, 37:2113-2131. 10.1148/rg.2017170077
  46. Sardanelli F: Trends in radiology and experimental research. Eur Radiol Exp. 2017, 1:1. 10.1186/s41747-017-0006-5
  47. Lambin P, Leijenaar RTH, Deist TM, et al.: Radiomics: the bridge between medical imaging and personalized medicine. Nat Rev Clin Oncol. 2017, 14:749-762. 10.1038/nrclinonc.2017.141

Author Information

Lawman Chiwome Corresponding Author

General Internal Medicine, University Hospitals of Morecambe Bay NHS Foundation Trust, Lancaster, GBR

Onosetale M. Okojie

Family Medicine, California Institute of Behavioral Neurosciences and Psychology, Fairfield, USA

A. K. M. Jamiur Rahman

Internal Medicine, California Institute of Behavioral Neurosciences and Psychology, Fairfield, USA

Faheem Javed

Anaesthesia, California Institute of Behavioral Neurosciences and Psychology, Fairfield, USA

Pousette Hamid

Neurology, California Institute of Behavioral Neurosciences and Psychology, Fairfield, USA


Ethics Statement and Conflict of Interest Disclosures

Conflicts of interest: In compliance with the ICMJE uniform disclosure form, all authors declare the following: Payment/services info: All authors have declared that no financial support was received from any organization for the submitted work. Financial relationships: All authors have declared that they have no financial relationships at present or within the previous three years with any organizations that might have an interest in the submitted work. Other relationships: All authors have declared that there are no other relationships or activities that could appear to have influenced the submitted work.

