Take a look at a selection of our recent media coverage:
7th March 2023
The use of an essential oil nasal spray led to a 40% reduction in total sino-nasal symptoms, according to data from an abstract presented at the American Academy of Allergy, Asthma & Immunology (AAAAI) annual meeting in San Antonio and published in a supplement to the Journal of Allergy and Clinical Immunology.
Allergic rhinitis gives rise to symptoms which commonly include nasal congestion, nasal itch, rhinorrhoea and sneezing. It is estimated to affect 18.1% of the global population and leads to impairment of both sleep and quality of life. Treatment typically involves antihistamines and/or intranasal corticosteroids, although a recent randomised trial demonstrated that aromatherapy oils provided symptomatic relief in patients with perennial allergic rhinitis. Furthermore, lavender oil inhalation has also been found to be of benefit in bronchial asthma.
Given the potential value of essential oils for perennial allergic rhinitis, the current study presented at AAAAI evaluated an essential oil nasal spray containing menthol, eucalyptol, thymol, camphor, birch oil, pine oil, cinnamon and mint, used over a period of 7 days by patients with seasonal allergic rhinitis. Symptom severity was assessed with the Sino-Nasal Outcome Test (SNOT-22), a patient-reported outcome measure covering 22 symptoms, each rated at baseline and after one week on a 0 to 5 scale, where 0 = 'no problem' and 5 = 'problem as bad as it can be'.
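For readers unfamiliar with the instrument, the arithmetic is simple: the 22 item ratings are summed, giving a total from 0 to 110. A minimal sketch follows; the item ratings in it are invented purely for illustration.

```python
# Minimal sketch of SNOT-22 scoring: 22 items, each rated 0-5,
# summed to a total ranging from 0 (no problems) to 110.
# The item ratings below are invented purely for illustration.

def snot22_total(item_scores):
    """Sum the 22 item ratings; each must lie on the 0-5 scale."""
    assert len(item_scores) == 22, "SNOT-22 requires exactly 22 items"
    assert all(0 <= s <= 5 for s in item_scores), "items are rated 0-5"
    return sum(item_scores)

example = [2, 3, 1, 0, 4, 2, 1, 1, 3, 2, 0, 1, 2, 3, 1, 2, 2, 1, 3, 1, 1, 1]
print(snot22_total(example))  # 37, close to the reported baseline score of 37.1
```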
Essential oil nasal spray and SNOT-22 score
A total of 18 subjects (14 women) aged between 16 and 80 were included. The baseline SNOT-22 score was 37.1 and this was significantly reduced to 20.1 (p = 0.003) after one week. Improvements were seen for 20 of the 22 self-reported symptoms and were statistically significant for 12, with the greatest impact seen for runny nose, cough, postnasal discharge and quality-of-life items.
The author concluded that the essential oil spray alleviated physical and functional impairment in those with seasonal allergic disease.
Citation
Bielory L. Essential oil (EO) nasal spray use in seasonal allergic rhinitis. J Allergy Clin Immunol 2023
25th November 2022
Tumour infiltrating lymphocyte (TIL) scoring based on a machine learning model has superior accuracy for classifying the response to immune checkpoint inhibitors (ICIs) in patients with advanced non-small cell lung cancer (NSCLC), according to a retrospective analysis by an international research group.
Immunotherapy with immune checkpoint inhibitors has revolutionised the field of oncology for many patients. Nevertheless, not all patients with NSCLC benefit from these agents: studies suggest that in advanced disease, the 1-year overall survival rate with nivolumab, for example, is only 51%. A more favourable response to ICI therapy occurs in those with high programmed cell death ligand-1 (PD-L1) expression and a high tumour mutation burden (TMB). A further factor associated with an improved prognosis in NSCLC is a high level of tumour-infiltrating lymphocytes, conventionally assessed visually on routine haematoxylin and eosin-stained slides. With the increasing use of machine learning algorithms in healthcare, preliminary data highlight the potential for automating this assessment of haematoxylin and eosin-stained slide sections.
Given the prognostic value of TIL levels, for the present study the researchers developed a machine learning TIL scoring model and evaluated its association with clinical outcomes in patients with advanced NSCLC. They undertook a retrospective analysis of patients prescribed PD-(L)1 inhibitors: initially a discovery cohort from a French hospital, followed by an independent validation cohort from hospitals in the UK and the Netherlands. The machine learning model counted tumour, stroma and tumour infiltrating lymphocyte cells, whereas values for TMB and PD-L1 expression were determined separately.
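The published model's code is not reproduced here, but the general idea, deriving a single TIL score from per-class cell counts, can be sketched as follows. The score definition, the cut-off and the cell counts are all assumptions for illustration, not the study's method.

```python
# Hypothetical sketch: turn machine-counted cells from an H&E slide
# into a single TIL score. The score definition (TILs as a fraction of
# all counted cells) and the 10% cut-off are assumptions for
# illustration, not the published model.

from dataclasses import dataclass

@dataclass
class SlideCounts:
    tumour: int
    stroma: int
    til: int

def til_score(counts: SlideCounts) -> float:
    """TILs as a proportion of all cells counted on the slide."""
    total = counts.tumour + counts.stroma + counts.til
    return counts.til / total if total else 0.0

def til_high(counts: SlideCounts, cutoff: float = 0.1) -> bool:
    """Dichotomise patients into TIL-high vs TIL-low (cutoff assumed)."""
    return til_score(counts) >= cutoff

print(til_score(SlideCounts(tumour=5200, stroma=3100, til=900)))  # ~0.098
```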
Tumour infiltrating lymphocyte cells and ICI response
A total of 685 patients with advanced-stage NSCLC treated with first- or second-line ICI monotherapy were included within the two independent cohorts. The median age in both groups was 66.
Among patients in the discovery cohort, those with a higher TIL cell count had a significantly longer median progression-free survival (Hazard ratio, HR = 0.74, 95% CI 0.61 – 0.90, p = 0.003) and a significantly longer overall survival (HR = 0.76, p = 0.02). Moreover, similar findings of an association between higher tumour infiltrating lymphocyte cell count and both progression-free and overall survival were also observed in the validation cohort.
When used alone as a biomarker of ICI response, PD-L1 expression gave an area under the curve (AUC) of 0.68, compared with 0.55 for TIL levels and 0.59 for TMB. The combined PD-L1/TIL and TMB/PD-L1 models gave AUC values of 0.68 and 0.70 respectively.
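For illustration, single- versus combined-biomarker AUC comparisons of this kind are typically computed along the following lines; the data here are simulated and this is not the study's code.

```python
# Sketch: compare AUC of single biomarkers vs a combined model
# for predicting ICI response. Data are simulated for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
response = rng.integers(0, 2, n)                 # 1 = ICI responder
pd_l1 = response * 1.0 + rng.normal(0, 1.5, n)   # noisy simulated biomarkers
til = response * 0.3 + rng.normal(0, 1.5, n)

print("PD-L1 alone:", roc_auc_score(response, pd_l1))
print("TIL alone:  ", roc_auc_score(response, til))

# Combined: logistic regression on both markers, AUC of the predicted risk
X = np.column_stack([pd_l1, til])
model = LogisticRegression().fit(X, response)
combined_risk = model.predict_proba(X)[:, 1]
print("PD-L1 + TIL:", roc_auc_score(response, combined_risk))
```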
The authors concluded that TIL levels were robustly and independently associated with the response to ICI treatment, could be incorporated into the workflow of pathology laboratories at minimal additional cost, and might even enhance precision therapy.
Citation
Rakaee M et al. Association of Machine Learning-Based Assessment of Tumor-Infiltrating Lymphocytes on Standard Histologic Images With Outcomes of Immunotherapy in Patients With NSCLC. JAMA Oncol 2022
11th November 2022
Using two separate frailty assessment scores on emergency department patients helps to identify different at-risk patient cohorts, highlighting the potential benefit of using both to guide clinical decision-making, according to Saudi Arabian researchers.
The term frailty is related to the ageing process and associated with adverse health outcomes. For example, among general surgical patients, the prevalence of frailty has been estimated at between 10.4% and 37.0%, with a 30-day mortality rate of 8%. Among emergency department (ED) patients, identification of frailty may help guide clinical practice, especially given that the prevalence of frailty among patients encountered in the ED ranges from 9.7% to 43.7%.

The Clinical Frailty Scale (CFS) is a recognised frailty assessment tool that can be used to assess the risk of death; it has been shown to predict poor outcomes accurately and is practical for use in busy clinical environments such as an ED. An alternative tool is the Hospital Frailty Risk Score (HFRS), which provides health systems with a low-cost, systematic way to screen for frailty and identify patients at greater risk of adverse outcomes. However, the predictive accuracy of the HFRS has not been assessed within an ED.

For the present study, the Saudi researchers set out to retrospectively determine the extent to which the CFS and HFRS correlated, and their ability to predict adverse hospital-related outcomes for older adults attending an ED. The team developed logistic regression models to estimate the odds ratios (ORs) for each tool's prediction of 30-day mortality, a length of stay of more than 10 days, and 30-day readmission.
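As a hedged sketch of the statistical approach, odds ratios from a logistic regression are obtained by exponentiating the fitted coefficients; the data and covariates below are simulated, not the study's.

```python
# Sketch: odds ratio from a logistic regression, as used to relate a
# high-risk frailty category to 30-day mortality. Data are simulated.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 2000
high_risk = rng.integers(0, 2, n)                 # 1 = high frailty risk
age = rng.normal(84, 6, n)                        # adjustment covariate
logit = -3 + 0.8 * high_risk + 0.03 * (age - 84)  # assumed true model
died_30d = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([high_risk, age]))
fit = sm.Logit(died_30d, X).fit(disp=False)
print("OR (high risk):", np.exp(fit.params[1]))   # ~exp(0.8) = 2.2 here
```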
Frailty assessment and clinical outcomes
A total of 12,237 patients with a mean age of 84.6 years (57.8% female) were eligible for inclusion in the analysis.
The correlation between the two frailty assessments was low at 0.36 (95% CI 0.34 – 0.38) and the agreement between them was also poor (weighted kappa = 0.10, 95% CI 0.09 – 0.11).
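A weighted kappa of this kind can be reproduced with standard tools; a minimal sketch on made-up paired frailty categories follows.

```python
# Sketch: weighted kappa for agreement between two ordinal frailty
# ratings. The paired categories below are invented for illustration.

from sklearn.metrics import cohen_kappa_score

# 0 = low, 1 = intermediate, 2 = high frailty risk for each patient
cfs_category  = [0, 1, 2, 2, 0, 1, 0, 2, 1, 0]
hfrs_category = [1, 0, 2, 1, 0, 2, 1, 0, 1, 0]

kappa = cohen_kappa_score(cfs_category, hfrs_category, weights="linear")
print(f"weighted kappa = {kappa:.2f}")  # low values indicate poor agreement
```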
In fully adjusted models, the estimates of 30-day mortality were similar between both frailty assessment tools for patients deemed at a high-risk of frailty (OR = 2.26 vs 2.16, CFS vs HFRS respectively).
The authors concluded that while both tools predicted adverse outcomes, the low level of agreement suggested that each was identifying a different at-risk population, highlighting the potential value of using both tools in the ED to help guide clinical decision-making.
Citation
Alshibani A et al. A comparison between the clinical frailty scale and the hospital frailty risk score to risk stratify older people with emergency care needs. BMC Emerg Med 2022
14th July 2022
Arterial stiffness progression, based on measurement of brachial-ankle pulse wave velocity, is slower among patients at high atherosclerotic risk who are prescribed statins, according to the findings of a retrospective cohort study by Chinese researchers.
The term 'arterial stiffness' refers to the loss of elasticity in the walls of large arteries, especially the aorta, over time; it results from a degenerative process affecting mainly the extracellular matrix, driven by ageing and other risk factors. Arterial stiffness and wave reflections are now well accepted as the most important determinants of increasing systolic and pulse pressures in ageing societies and are increasingly used in the clinical assessment of patients with hypertension and various cardiovascular risk factors. Brachial-ankle pulse wave velocity, which allows arterial stiffness to be assessed non-invasively, has been proposed as a surrogate end point for cardiovascular disease, and a 2012 systematic review found that an increase in brachial-ankle pulse wave velocity of 1 m/s corresponded with an increased risk of total cardiovascular events, cardiovascular mortality and all-cause mortality.
Patients at a high risk of atherosclerotic disease are usually prescribed statin therapy, although whether these drugs can reduce or prevent the development of arterial stiffness is unclear. One analysis concluded that statin therapy had a beneficial effect on aortic arterial stiffness, whereas another study found that atorvastatin actually increased arterial stiffness. In many cases, however, the available studies included small numbers of patients or were undertaken over short periods of time.
For the present study, the Chinese team retrospectively examined the relationship between statin use and the progression of arterial stiffness, based on measurement of brachial-ankle pulse wave velocity (BaPWV). The researchers used data from the Kailuan study, a large prospective study of over 100,000 individuals that collected a wide range of demographic and socioeconomic information (e.g. education level and average income per family member) as well as medical and lifestyle measures such as physical activity. Since 2010, individuals within the study at high risk of peripheral artery disease, i.e. those with at least one risk factor such as hypertension or diabetes, have been invited to have a baseline BaPWV measurement, repeated at follow-up visits. The team included patients prescribed statins at least 6 months prior to their first BaPWV measurement. Statin users were further divided into those who discontinued treatment within the first two years and those with a high level of statin adherence, and were then propensity matched with non-users.
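Propensity matching of this sort is commonly implemented as nearest-neighbour matching on a fitted propensity score. A minimal sketch with simulated data and covariates follows; matching here is 1:1 with replacement for simplicity, which may differ from the study's procedure.

```python
# Sketch: 1:1 nearest-neighbour propensity-score matching of statin
# users to non-users. Data and covariates are simulated; matching is
# with replacement for simplicity.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
n = 5000
age = rng.normal(63, 8, n)
sbp = rng.normal(135, 15, n)
statin = rng.binomial(1, 1 / (1 + np.exp(-(-8 + 0.1 * age + 0.01 * sbp))))

# Propensity score: probability of statin use given covariates
X = np.column_stack([age, sbp])
ps = LogisticRegression().fit(X, statin).predict_proba(X)[:, 1]

users, nonusers = np.where(statin == 1)[0], np.where(statin == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[nonusers].reshape(-1, 1))
_, idx = nn.kneighbors(ps[users].reshape(-1, 1))
matched_nonusers = nonusers[idx.ravel()]
print(len(users), "users matched to", len(matched_nonusers), "non-users")
```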
Arterial stiffness and statin use
A total of 1,310 individuals with a mean age of 64.6 years (75.7% male) using statins were propensity matched with the same number of non-statin users although the non-users had a slightly lower mean age (61.9 years).
The use of statins was associated with a significantly lower baseline BaPWV value compared to non-users (difference = -33.6 cm/s, 95% CI -62.1 to -5.1 cm/s).
During a mean follow-up of 4.8 years, the mean BaPWV increased from 1778.8 cm/s to 1831.9 cm/s in the statin group and from 1799.0 cm/s to 1870.8 cm/s in the non-statin group. Using a multivariable linear regression model, the authors found that statin use was associated with a significantly slower progression of BaPWV (difference = -23.3 cm/s/year).
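As a sketch of how such a progression difference can be estimated, the annualised BaPWV change can be regressed on a statin-use indicator plus covariates; the data and effect size below are simulated, not the study's.

```python
# Sketch: difference in annual BaPWV progression between statin users
# and non-users from a multivariable linear model. Data are simulated;
# the assumed statin effect here is -4 cm/s/year.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2620
statin = np.repeat([1, 0], n // 2)
age = rng.normal(63, 8, n)
rate = 15 - 4 * statin + 0.2 * (age - 63) + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([statin, age]))
fit = sm.OLS(rate, X).fit()
print("statin effect (cm/s/year):", round(fit.params[1], 1))  # ~ -4 here
```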
They concluded that statin use appeared to be linked with slower BaPWV progression in adults at high atherosclerotic risk, suggesting that these drugs may help prevent the development and worsening of subclinical cardiovascular lesions at an early stage.
Citation
Zhou YF et al. Association Between Statin Use and Progression of Arterial Stiffness Among Adults With High Atherosclerotic Risk. JAMA Netw Open 2022
23rd May 2022
The presence of contrast agent pooling (CAP) on a CT scan is strongly associated with in-hospital cardiac arrest within the following hour, according to the findings of a retrospective analysis by a team from the Department of Emergency Medicine, Far Eastern Memorial Hospital, New Taipei City, Taiwan.
In-hospital cardiac arrest is not uncommon, reported to occur in 0.8 to 4.6 per 1,000 patient admissions, and is associated with high mortality. Within emergency departments, computed tomography (CT) scans are a widely used imaging modality for the detection of a number of conditions such as blood clots, kidney stones and head injuries. Prior to a CT scan, irrespective of its indication, radiologists ensure that the patient is clinically stable. However, despite the appearance of clinical stability, there are case reports of patients who experience a cardiac arrest while undergoing a CT scan, and these reports have also described some characteristic features on the scan. The first such case, reported in 2002, identified pooling of contrast agent in the dependent parts of the right side of the body, including the venous system and the right lobe of the liver. This likely occurs due to pump failure of the heart, leading to stasis of blood in the dependent organs. Once the heart stops pumping, both arterial and venous pressures fall and, because the contrast agent is heavier than blood, CAP occurs in the dependent portions of the venous system.
Although the CAP sign has been reported in several cases, it is clearly not a common phenomenon, and it remains uncertain whether its presence has any prognostic value for an imminent in-hospital cardiac arrest. For the present study, the researchers performed a retrospective analysis of all patients admitted to their hospital who underwent a CT scan and subsequently experienced an in-hospital cardiac arrest, collecting both demographic and clinical data. The occurrence of the CAP sign on a chest or abdominal scan was recorded, defined as accumulation of the contrast agent in the renal or hepatic veins or the dependent part of the liver, or layering of contrast agent over the vena cava. The primary outcome of interest was the accuracy of the CAP sign in predicting an imminent cardiac arrest, defined as one occurring within 1 hour after the CT scan.
CAP sign and cardiac arrest
A total of 128 patients with a mean age of 69 years (60.2% male) were included in the analysis, among whom, 8.6% were positive for the CAP sign on a CT scan.
With respect to the primary outcome, the accuracy of the CAP sign in predicting cardiac arrest was 85.94% (95% CI 78.69 – 91.45%) and the positive predictive value was 64%. Additionally, the CAP sign was significantly associated with a cardiac arrest within 1 hour (odds ratio, OR = 7.35, 95% CI 1.27 – 42.59).
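To see where figures like these come from, here is a worked 2x2 example. The cell counts are hypothetical, chosen only to be consistent with the reported accuracy (85.94%), positive predictive value (64%) and CAP prevalence (11 of 128, 8.6%); they are not the study's actual data.

```python
# Worked 2x2 example for the CAP sign. Cell counts are hypothetical,
# chosen to be consistent with the reported accuracy and PPV; they are
# not the study's actual data.

tp = 7    # CAP-positive, arrested within 1 hour
fp = 4    # CAP-positive, no arrest within 1 hour
fn = 14   # CAP-negative, arrested within 1 hour
tn = 103  # CAP-negative, no arrest within 1 hour

total = tp + fp + fn + tn                 # 128 patients
accuracy = (tp + tn) / total              # (7 + 103) / 128 = 0.8594
ppv = tp / (tp + fp)                      # 7 / 11 = 0.636, i.e. ~64%
print(f"accuracy = {accuracy:.2%}, PPV = {ppv:.2%}")
```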
The authors concluded that the CAP sign could be viewed as an imaging feature of circulatory failure and its presence should be taken as a warning sign to clinicians to allow them to provide timely interventions for critically ill patients.
Citation
Lee YH et al. Contrast Agent Pooling (C.A.P.) sign and imminent cardiac arrest: a retrospective study. BMC Emerg Med 2022
6th May 2022
An AI-based breast cancer screening protocol has been found to have a screening sensitivity similar to, and a specificity slightly higher than, that of radiologists, and might therefore considerably reduce radiologists' workload. This was the finding of a retrospective analysis by researchers from the Department of Computer Science and Public Health, University of Copenhagen, Copenhagen, Denmark.
Breast cancer arises in the epithelium of the ducts or lobules in the glandular tissue of the breast; according to the World Health Organization, in 2020 there were 2.3 million women diagnosed with breast cancer and 685,000 deaths globally. Population screening enables detection of the early signs of breast cancer, and one European analysis of observational studies concluded that the estimated breast cancer mortality reduction was 25-31% for women invited to screening and 38-48% for women actually screened. Although screening mammography is the principal method for the detection of breast cancer, 10-30% of breast cancers may be missed at mammography. Part of the reason may be behavioural. For example, in one study, six radiologists who reviewed 100 breast cancer scans in which the prevalence of disease was artificially set at 50% missed 30% of the cancers, whereas when the prevalence was raised, participants missed just 12% of the same cancers. In other words, radiologists are more likely to be on the lookout for suspicious scans when they know that the disease has a much higher prevalence.
One potential way to remove the effect of such behavioural influences is to use an artificial intelligence (AI)-based system for reading breast cancer scans. Such systems have been shown to maintain non-inferior performance while reducing the workload of the second radiologist reader by 88%. Whether an AI-based system could be safely used for population-based screening and reduce the number of mammograms requiring a radiologist read was uncertain, and was the objective of the current study. Using a retrospective design, the Danish researchers examined whether the AI-based protocol was able to detect normal, moderate-risk and suspicious mammograms. The team used data from a breast cancer screening programme in which each mammogram was scored from 0 to 10 (designating the risk of malignancy) by the AI tool. They then compared the AI-based system and radiologists with respect to screening performance, using the area under the receiver operating characteristic curve (AUC).
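The study's exact operating points are not reproduced here, but the triage logic of such a protocol, auto-reporting low-score mammograms, flagging high-score ones and sending the intermediate band for radiologist reading, can be sketched as follows; the threshold values are assumptions.

```python
# Sketch of the triage idea behind an AI screening protocol: the model
# scores each mammogram 0-10 for malignancy risk, and only the
# intermediate band is read by radiologists. Thresholds are assumed
# for illustration, not taken from the study.

NORMAL_MAX = 2.0      # scores at or below: auto-reported as normal
SUSPICIOUS_MIN = 9.0  # scores at or above: flagged as suspicious

def triage(ai_score: float) -> str:
    if ai_score <= NORMAL_MAX:
        return "normal: no radiologist read"
    if ai_score >= SUSPICIOUS_MIN:
        return "suspicious: prioritised radiologist read"
    return "moderate risk: standard double reading"

for score in (0.7, 5.2, 9.6):
    print(score, "->", triage(score))
```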
AI-based cancer screening protocol performance
The cohort included 114,421 women with a mean age of 59 years who underwent mammography screening. Screening identified 791 screen-detected cancers, 327 interval cancers and 1,473 long-term cancers.
The AI-based system had a screening sensitivity of 69.7% (95% CI 66.9 – 72.4%), which was non-inferior to the radiologist sensitivity of 70.8% (p = 0.02). The AI-based screening specificity was 98.6%, significantly higher (p < 0.001) than that of the radiologists (98.1%).
Based on these findings, the authors calculated that use of the AI-based cancer system led to a 62.6% radiologist workload reduction. Moreover, the AI-based system reduced the number of false-positive screenings by 25.1%.
They concluded that incorporating an AI-based system into population-based screening could both improve these programmes and reduce radiologist workload, and called for a prospective trial to determine the impact of AI-based screening.
Citation
Lauritzen AD et al. An Artificial Intelligence–based Mammography Screening Protocol for Breast Cancer: Outcome and Radiologist Workload. Radiology 2022
29th December 2021
Having pre-existing heart disease (PEHD) appears to improve the chance of surviving to hospital admission after an out-of-hospital cardiac arrest. This was the somewhat counter-intuitive conclusion of a retrospective analysis by a team from the Department of Cardiology, Amsterdam UMC, The Netherlands.
Out-of-hospital (OOH) cardiac arrests are a common problem: in one European study of 37,054 such arrests, the hospital survival rate was only 26.4%. Patient characteristics are potentially important contributors to overall survival, in particular the presence of existing heart disease, although the impact of this factor has been poorly studied. In one study, ischaemic heart disease was associated with 50% higher odds of survival; in contrast, another study found that PEHD was an independent predictor of poorer survival after an OOH cardiac arrest.
Given these contradictory findings, the Dutch researchers set out to explore the relationship between OOH cardiac arrest and prior cardiovascular disease. They retrospectively examined data from the AmsteRdam REsuscitation STudies (ARREST) registry, which contains information on all OOH cardiac arrests in the North Holland region of the Netherlands. Patients' medical histories were obtained from their GPs' records and searched for evidence of a wide range of cardiac diseases, and patients were dichotomised as either having or not having PEHD. The primary outcomes were survival to hospital admission and survival to hospital discharge, obtained from hospital records. Secondary outcomes were a shockable initial rhythm (SIR) and acute myocardial infarction (AMI) as the immediate cause of the OOH cardiac arrest. Using regression analysis, the team examined the association between PEHD and both survival outcomes, adjusting the models for several co-morbidities.
Findings
A total of 3,760 OOH cardiac arrest patients with a mean age of 68 years (70.9% male) were included in the analysis. Overall, 48.1% of the cohort had PEHD; these patients were on average slightly older (mean age 71.4 vs 64.7 years, p < 0.001) and had a higher incidence of cardiovascular risk factors, e.g. hypertension (59.3% vs 42.2%, p < 0.01), obesity and hypercholesterolaemia.
Among those with PEHD, 41.9% survived to hospital admission compared to 38% of those without prior cardiovascular disease (p = 0.014). The presence of existing heart disease was associated with increased odds of survival after adjustment for all covariates (adjusted odds ratio, aOR = 1.25, 95% CI 1.05 – 1.47). However, prior cardiac disease was not associated with survival to discharge in fully adjusted models (aOR = 1.16, 95% CI 0.95 – 1.42).
Among 1,680 patients with SIR, prior heart disease was also not associated with survival to hospital (aOR = 1.12, 95% CI 0.90 – 1.39) but was associated with a lower proportion of AMI (aOR = 0.33, 95% CI 0.25 – 0.42).
In their discussion, the authors recognised that their findings were somewhat counter-intuitive and that they were unable to explain the result. Nevertheless, they concluded that survival gains after an OOH cardiac arrest applied to a wide range of patients with prior heart disease.
Citation
van Dongen LH et al. Higher chances of survival to hospital admission after out-of-hospital cardiac arrest in patients with previously diagnosed heart disease. Open Heart 2021
7th December 2021
Rates of diabetes-associated ocular complications (DAOC) in children have been found to be much higher in those diagnosed with type 2 than with type 1 disease over the first 15 years after diagnosis. This was the finding of a retrospective analysis by a team from the Department of Ophthalmology, Mayo Clinic, Rochester, US.
Diabetes is a common childhood condition: a recent UK study found that in 2019 there were an estimated 36,000 children with diabetes under the age of 19, up from 31,500 in 2015. In children, type 1 disease accounts for the vast majority of cases, although there is evidence that the prevalence of type 2 diabetes increased between 2001 and 2009 among 10-19 year olds. Diabetes is associated with the development of micro-vascular complications including retinopathy, which remains the most common cause of blindness in working-age adults in the developed world. While sight loss in children due to diabetic retinopathy is much less common, guidance does recommend retinopathy screening of children with type 1 diabetes. Far less is known, however, about the development and progression of diabetic retinopathy among children with type 2 diabetes.
For the present study, the US researchers compared DAOC rates in children with the two forms of diabetes. They turned to the medical records of children newly diagnosed with diabetes between 1970 and 2019 in Minnesota, collecting demographic and clinical data such as HbA1c and whether the individuals had undergone an eye examination, and followed up on these examinations. The researchers catalogued diabetes-associated ocular complications including non-proliferative diabetic retinopathy (NPDR), proliferative diabetic retinopathy (PDR), diabetic macular oedema (DME), visually significant cataract (VSC) and the need for pars plana vitrectomy (PPV).
Findings
A total of 606 children were diagnosed with diabetes during the 50-year period, of whom 525 (87.8%) had undergone at least one eye examination and were diagnosed with either type 1 (461) or type 2 (64) diabetes. The mean age at diagnosis was 10.8 years (53.4% male) for type 1 disease and 17.3 years (28.1% male) for type 2 disease. A DAOC occurred in 147 (31.9%) of those with type 1 disease, 14 years after diagnosis, and in 17 (26.6%) of those with type 2 disease. The hazard ratio (HR) for developing any diabetic retinopathy for type 2 versus type 1 disease was 1.88 (95% CI 1.13 – 3.12, p = 0.02).
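Hazard ratios of this kind typically come from a Cox proportional hazards model; a minimal sketch on simulated time-to-retinopathy data (not the study's dataset), using the lifelines package, follows.

```python
# Sketch: hazard ratio for retinopathy in type 2 vs type 1 diabetes
# from a Cox proportional hazards model. Data are simulated.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 525
type2 = rng.binomial(1, 64 / 525, n)          # ~12% type 2, as in the cohort
# assumed hazards: type 2 progresses roughly twice as fast
time_to_event = rng.exponential(20 / (1 + type2), n)
observed = (time_to_event <= 15).astype(int)  # censor at 15 years
time_to_event = np.minimum(time_to_event, 15)

df = pd.DataFrame({"years": time_to_event, "event": observed, "type2": type2})
cph = CoxPHFitter().fit(df, duration_col="years", event_col="event")
print(cph.hazard_ratios_)                     # HR for type2 ~ 2 in this simulation
```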
Overall, 30.6% of those with type 1 disease developed a DAOC within 15 years compared to 52.7% of those with type 2 disease. While the risk of developing any of the other ocular complications included in the analysis was numerically higher among those with type 2 disease, the only statistically significant difference was in the need for pars plana vitrectomy (HR = 4.06, 95% CI 1.34 – 12.33) at 15 years after the initial diagnosis.
The authors concluded that children with type 2 diabetes developed vision-threatening retinopathy sooner after diagnosis than those with type 1 disease, and suggested that such children should have ophthalmoscopic evaluations at least as frequently as, if not more frequently than, those with type 1 disease.
Citation
Bai P et al. Ocular Sequelae in a Population-Based Cohort of Youth Diagnosed With Diabetes During a 50-Year Period. JAMA Ophthalmol 2021
15th November 2021
There was a high discrepancy rate in the interpretation of advanced radiological scans among radiologists who had the same specialist training. This was the unexpected finding of a retrospective analysis of nearly 6 million acute examinations by researchers from the Department of Radiology and Imaging Sciences, Indianapolis, US.

Fellowship or specialist training is a highly desired attribute when hiring radiologists and provides them with expertise in a specific area of practice. Nevertheless, despite having undertaken fellowship training, in practice much of the work of radiologists requires them to be 'multi-specialists', i.e. able to interpret scans across a range of specialities such as neurology or abdominal radiology.
Although several studies have examined image interpretation and discrepancy rates between radiologists from different specialities, to date no studies have specifically compared situations where the two radiologists are concordant, i.e. have undertaken the same sub-speciality training, with those where they are discordant, i.e. from differing specialities. For the present study, the US team turned to the database of a large US tele-radiology company that contracts with community hospitals and provides a service 24 hours a day, 7 days a week, for a range of examinations including CT, MRI and ultrasound scanning. The company tele-radiologist provided an initial interpretation of a scan and the second (final) interpretation was undertaken by a radiologist at the hospital.
For the purposes of the analysis, examination scans were classified as common or advanced, the presence of a discrepancy as either major or minor (based on factors such as patient safety and clinical outcomes) and the relationship between the two radiologists as either concordant or discordant.
Findings
The analysis included 5,883,980 examinations performed on patients with a mean age of 50 years. Across all examinations, 40% were deemed concordant with respect to the interpreting radiologists' specialist training. The overall discrepancy rate was 0.43%, of which 0.14% were deemed to be major.
Among concordant radiologists, the major discrepancy rate was lower for common than for advanced examinations (0.13% vs 0.26%). In addition, among common examinations, the frequency of major and minor discrepancies did not differ between concordant and discordant radiologists.
However, among advanced examinations, the frequency of major discrepancies was significantly higher where the two radiologists were concordant (0.26% vs 0.18%, concordant vs discordant), giving a 45% increased likelihood of a major discrepancy for concordant vs discordant specialities (hazard ratio, HR = 1.45, 95% CI 1.18 – 1.79). Similarly, the frequency of minor discrepancies was also higher (HR = 1.17) among concordant compared to discordant radiologists (0.34% vs 0.29%).
In their discussion, the authors noted that for the interpretation of common examinations there were no important differences between concordant and discordant radiologists, which was unsurprising given that many of these examinations would be frequently encountered in practice. However, they were unable to account for the higher rate of both major and minor discrepancies in advanced examinations among concordant radiologists, that is, why interpretations differed even when both radiologists had undertaken the same specialist training.
They concluded by suggesting that practice leaders should carefully consider efforts to match sub-specialities when assigning scans for interpretation.
Citation
Chong S et al. Interpretations of Examinations Outside of Radiologists’ Fellowship Training: Assessment of Discrepancy Rates Among 5.9 Million Examinations From a National Teleradiology Databank. AJR Am J Roentgenol 2021