Take a look at a selection of our recent media coverage:
12th June 2023
UVB phototherapy (UVBP) for the treatment of atopic eczema in adults does not increase the risk of cutaneous cancers, according to a retrospective analysis.
In a study published in the Journal of the American Academy of Dermatology, Taiwanese researchers retrospectively assessed whether UVBP use in adults with atopic eczema elevated the risk of cutaneous cancers.
The team undertook a nationwide, population-based cohort study from 2001 to 2018 to estimate the risk of developing both non-melanoma skin cancer and melanoma. They excluded patients under 20 years of age, those with a prior diagnosis of skin cancer and individuals who had received PUVA therapy. The cohort of patients who had received UVBP was then matched 4:1 with a group of atopic eczema patients who had not received phototherapy.
The researchers calculated the number of UVBP sessions for each patient and adjusted their analysis for a number of covariates including immunosuppressant therapy.
After exclusion, a total of 1,241 patients in the UVBP group were matched to 4,964 patients in the non-UVBP group. For the entire cohort, mean age was 42.4 years and 65.8% were men.
Compared with those not receiving UVBP, there was no significant overall increase in skin cancer risk in the phototherapy group (adjusted hazard ratio, aHR = 0.91, 95% CI 0.35 – 2.35). Similarly, there was no increased risk of either non-melanoma skin cancer (aHR = 0.80, 95% CI 0.29 – 2.26) or cutaneous melanoma (aHR = 0.80, 95% CI 0.08 – 7.64).
In addition, the risk of either form of cutaneous cancer was not increased when analysed based on the number of UVBP sessions.
UVBP is a recommended second-line treatment following failure of first-line treatment in patients with atopic eczema. While the existing literature suggests that longer-term use of UVBP does not increase the risk of cancer, this evidence is derived from patients with psoriasis. Consequently, whether this treatment modality is also safe in the longer term among those with atopic eczema remains uncertain.
9th June 2023
SGLT2 inhibitor drug use in patients with diabetes appears to reduce cancer risk, according to a retrospective analysis by Taiwanese researchers.
Epidemiological evidence suggests that diabetes increases the risk of cancer. It is thought that the combination of hyperinsulinaemia, chronic inflammation and hyperglycaemia could promote tumour growth. Consequently, it may be possible to reduce this risk with anti-diabetic treatment. A recent meta-analysis of randomised clinical trials suggests that SGLT2 inhibitor drugs could lower cancer risk compared to placebo. However, the extent to which these drugs might reduce the risk of cancer in practice is less clear.
The current study, published in the Journal of Diabetes and its Complications, retrospectively compared cancer development among SGLT2 inhibitor users. The team matched these patients with a group not prescribed these drugs. The primary outcome was cancer development and the analysis adjusted for several potential confounders.
A cohort of 325,990 SGLT2 inhibitor users with mean age of 58.6 years (42.2% female) and 325,989 non-users was identified.
SGLT2 inhibitor users had a significantly lower cancer risk (adjusted hazard ratio, aHR = 0.79, 95% CI 0.76 – 0.83) than non-users. The risk of cancer was also higher among males (aHR = 1.35, 95% CI 1.30 – 1.41) and in patients aged 50-64 and older than 65 years.
The researchers also noted that this risk reduction depended on the duration of SGLT2 inhibitor use. Short-term use (60-140 days) was actually linked to a higher cancer risk (aHR = 1.30, 95% CI 1.21 – 1.39).
In addition, while cancer risk was generally lower, there was a significant increased risk of pancreatic cancer (aHR = 1.51, 95% CI 1.22 – 1.87) which was consistent with the findings of a recent case study.
30th May 2023
Although ketamine use increases haemodynamic instability during rapid sequence intubation in trauma patients, it does not significantly affect the first-pass success rate compared to etomidate, according to a retrospective analysis.
Published in the journal BMC Emergency Medicine, Korean researchers considered whether the potential adverse effects of ketamine and etomidate could affect the first-pass success rate during rapid sequence intubation (RSI) in trauma patients.
The team retrospectively compared both sedatives, not only in terms of the effect on the first-pass success rate but also with respect to clinical outcomes. Patients given ketamine were propensity-matched 1:3 with etomidate and the results adjusted for injury severity and confounding baseline characteristics.
RSI represents the set of actions undertaken during induction of anaesthesia that secures the airway in trauma patients at risk of aspiration or regurgitation of gastric contents, to enable emergency orotracheal intubation. Ideally, the RSI procedure should allow for rapid and optimal intubation conditions by increasing the first-pass intubation rate whilst reducing adverse events in severely injured patients. Despite being a standard procedure, a recent survey identified significant variation in practice, prompting calls for international RSI guidelines.
Both ketamine and etomidate are commonly used sedatives for RSI during emergency tracheal intubation. Nevertheless, both are associated with potential adverse effects which could affect clinical outcomes. For example, single dose use of etomidate may increase 28-day mortality, whereas ketamine use could increase the risk of cardiac arrest.
A total of 620 patients, of whom 19.9% received ketamine, were included in the retrospective analysis. Patients given ketamine had a significantly faster initial heart rate (105.0 vs 97.7, p = 0.003) and a lower initial systolic blood pressure (114.2 vs 139.3 mmHg, p < 0.001) than those given etomidate.
However, when researchers considered the first-pass success rate, this was not significantly different (90.7% vs. 90.1%, ketamine vs etomidate, p > 0.999). Similarly, there were no differences in other clinical outcomes explored including final mortality (p = 0.348), length of intensive care unit stay (p = 0.99), ventilator days (p = 0.735) and overall hospital stay (p = 0.32).
The authors concluded that when used for RSI, although patients administered ketamine showed greater haemodynamic instability, this had no important impact on either the first-pass success rate or other relevant clinical outcomes.
7th March 2023
The use of an essential oil nasal spray led to a 40% reduction in total sino-nasal symptoms, according to an abstract presented at the American Academy of Allergy, Asthma & Immunology (AAAAI) annual meeting in San Antonio and published in a supplement to the Journal of Allergy and Clinical Immunology.
Allergic rhinitis gives rise to symptoms which commonly include nasal congestion, nasal itch, rhinorrhoea and sneezing, and is estimated to affect 18.1% of the global population, leading to impairment of both sleep and quality of life. Typically, treatment involves the use of antihistamines and/or intranasal corticosteroids, although a recent randomised trial demonstrated that aromatherapy oils provided symptomatic relief in patients with perennial allergic rhinitis. Furthermore, the use of a lavender oil inhalation has also been found to be of benefit in bronchial asthma.
Given the potential value of essential oils for perennial allergic rhinitis, the current study presented at AAAAI evaluated an essential oil nasal spray, containing menthol, eucalyptol, thymol, camphor, birch oil, pine oil, cinnamon and mint, over a period of 7 days in patients with seasonal allergic rhinitis. Symptom severity was assessed using the Sino-Nasal Outcome Test (SNOT-22), a patient-reported outcome measure covering 22 symptoms, each of which was assessed at baseline and after one week on a 0 to 5 scale, where 0 = no problem and 5 = ‘problem as bad as can be’.
Essential oil nasal spray and SNOT-22 score
A total of 18 subjects (14 women) aged between 16 and 80 were included. The baseline SNOT-22 score was 37.1 and this was significantly reduced to 20.1 (p = 0.003) after one week. In fact, improvements were seen for 20 of the 22 self-reported symptoms, of which statistically significant improvements were documented for 12, with the greatest impact seen for runny nose, cough, postnasal discharge and quality of life features.
The author concluded that the essential oil spray alleviated physical and functional impairment in those with seasonal allergy-related disease.
Bielory L. Essential oil (EO) nasal spray use in seasonal allergic rhinitis. J Allergy Clin Immunol 2023
25th November 2022
Tumour infiltrating lymphocyte (TIL) scoring based on a machine-learning model has superior classification accuracy for an immune checkpoint inhibitor (ICI) response in patients with advanced non-small cell lung cancer (NSCLC) according to a retrospective analysis by an international research group.
Immunotherapy with immune-checkpoint inhibitors (ICI) has revolutionised the field of oncology for many patients. Nevertheless, not all patients with non-small cell lung cancer benefit from these agents: in advanced disease, for example, the 1-year overall survival rate with nivolumab was only 51%. A more favourable response to ICI therapy occurs in those with high programmed cell death ligand-1 (PD-L1) expression and a high tumour mutation burden (TMB). A further factor associated with an improved prognosis in NSCLC patients is a high level of tumour-infiltrating lymphocytes (TILs), which are conventionally assessed visually on routine haematoxylin and eosin-stained slides. With the increasing use of machine-learning algorithms in healthcare, however, preliminary data highlight the potential for automated assessment of such slide sections.
Given the prognostic value of TIL levels, for the present study researchers developed a machine-learning TIL scoring model and evaluated its association with clinical outcomes in patients with advanced NSCLC. They undertook a retrospective analysis of patient cohorts prescribed PD-(L)1 inhibitors, first in a discovery cohort from a French hospital and then in an independent validation cohort from hospitals in the UK and the Netherlands. The machine-learning model counted tumour, stromal and tumour-infiltrating lymphocyte cells, whereas values for TMB and PD-L1 expression were determined separately.
Tumour infiltrating lymphocyte cells and ICI response
A total of 685 patients with advanced-stage NSCLC treated with first- or second-line ICI monotherapy were included within the two independent cohorts. The median age in both groups was 66 years.
Among patients in the discovery cohort, those with a higher TIL cell count had a significantly longer median progression-free survival (hazard ratio, HR = 0.74, 95% CI 0.61 – 0.90, p = 0.003) and a significantly longer overall survival (HR = 0.76, p = 0.02). Similar associations between a higher tumour infiltrating lymphocyte cell count and both progression-free and overall survival were observed in the validation cohort.
Using PD-L1 levels as a biomarker gave an area under the curve (AUC) of 0.68, compared with only 0.55 for tumour infiltrating lymphocyte cell levels and 0.59 for TMB. When combined, however, the PD-L1/TIL and TMB/PD-L1 pairings gave higher AUC values (0.68 and 0.70 respectively).
The authors concluded that TIL levels were robustly and independently associated with the response to ICI treatment, could be easily incorporated into the workflow of pathology laboratories at minimal additional cost, and might even enhance precision therapy.
Rakaee M et al. Association of Machine Learning-Based Assessment of Tumor-Infiltrating Lymphocytes on Standard Histologic Images With Outcomes of Immunotherapy in Patients With NSCLC. JAMA Oncol 2022
11th November 2022
Using two separate frailty assessment scores on emergency department patients helps to identify different at-risk patient cohorts and highlights the potential benefit of using both to guide clinical decision-making, according to Saudi Arabian researchers.
The term frailty is related to the ageing process and is associated with adverse health outcomes. For example, among general surgical patients, the prevalence of frailty has been estimated to range from 10.4% to 37.0%, with a 30-day mortality rate of 8%. Among emergency department (ED) patients, identification of frailty may help guide clinical practice, especially given that the prevalence of frailty encountered in the ED ranges from 9.7% to 43.7%.
The Clinical Frailty Scale (CFS) is a recognised frailty assessment tool that can be used to assess the risk of death, has been shown to predict poor outcomes accurately, and is practical for use in busy clinical environments such as the ED. An alternative frailty assessment tool is the Hospital Frailty Risk Score (HFRS), which provides health systems with a low-cost, systematic way to screen for frailty and identify patients at greater risk of adverse outcomes. However, the predictive accuracy of the HFRS has not been assessed within an ED.
For the present study, the Saudi researchers set out to retrospectively determine the extent to which the CFS and HFRS correlated, and their ability to predict adverse hospital-related outcomes for older adults attending an ED. The team developed logistic regression models to estimate the odds ratios (ORs) for each tool's ability to predict 30-day mortality, a length of stay > 10 days and 30-day readmission.
Frailty assessment and clinical outcomes
A total of 12,237 patients with a mean age of 84.6 years (57.8% female) were eligible for inclusion in the analysis.
The correlation between the two frailty assessments was low at 0.36 (95% CI 0.34 – 0.38) and the agreement between them was also poor (weighted kappa = 0.10, 95% CI 0.09 – 0.11).
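For readers unfamiliar with the agreement statistic, a linearly weighted kappa penalises disagreements in proportion to how many ordinal categories apart the two ratings are. A minimal, dependency-free sketch (the ratings below are toy values for illustration, not the study data):

```python
def weighted_kappa(a, b, categories):
    """Linearly weighted kappa for two ordinal ratings of the same patients."""
    k, n = len(categories), len(a)
    idx = {c: i for i, c in enumerate(categories)}
    # observed joint proportions
    obs = [[0.0] * k for _ in range(k)]
    for x, y in zip(a, b):
        obs[idx[x]][idx[y]] += 1 / n
    # marginal proportions for each rater
    pa = [sum(obs[i]) for i in range(k)]
    pb = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)   # linear disagreement weight
            num += w * obs[i][j]       # observed weighted disagreement
            den += w * pa[i] * pb[j]   # disagreement expected by chance
    return 1.0 - num / den

# Hypothetical 1-5 ordinal ratings from two frailty tools
cfs  = [1, 2, 2, 3, 4, 5, 3, 2]
hfrs = [1, 1, 3, 2, 4, 3, 5, 2]
print(round(weighted_kappa(cfs, hfrs, [1, 2, 3, 4, 5]), 2))  # → 0.38
```

A kappa near 0, as reported in the study, indicates agreement little better than chance even when both tools individually predict outcomes well.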
In fully adjusted models, the estimates of 30-day mortality were similar between both frailty assessment tools for patients deemed at a high-risk of frailty (OR = 2.26 vs 2.16, CFS vs HFRS respectively).
The authors concluded that while both tools predicted adverse outcomes, the low level of agreement suggested that each was identifying a different at-risk population, which highlighted the potential value of using both tools in the ED to help guide clinical decision-making.
Alshibani A et al. A comparison between the clinical frailty scale and the hospital frailty risk score to risk stratify older people with emergency care needs. BMC Emerg Med 2022
14th July 2022
Arterial stiffness progression, based on measurement of brachial-ankle pulse wave velocity, is slower among patients at high atherosclerotic risk who are prescribed statin drugs, according to the findings of a retrospective cohort study by Chinese researchers.
The term ‘arterial stiffness’ refers to the loss of elasticity in the walls of large arteries, especially the aorta, over time, resulting from a degenerative process that mainly affects the extracellular matrix as a consequence of ageing and other risk factors.
Arterial stiffness and wave reflections are now well accepted as the most important determinants of increasing systolic and pulse pressures in ageing societies and are increasingly used in the clinical assessment of patients with hypertension and various cardiovascular risk factors.
Brachial-ankle pulse wave velocity, which can be used to assess arterial stiffness progression non-invasively, has been proposed as a surrogate end point for cardiovascular disease: a 2012 systematic review found that an increase in brachial-ankle pulse wave velocity of 1 m/s corresponded with an increased risk of total cardiovascular events, cardiovascular mortality and all-cause mortality.
Patients at a high risk of atherosclerotic disease are usually prescribed statin therapy although whether these drugs can reduce or prevent the development of arterial stiffness is unclear. For example, one analysis concluded that statin therapy had a beneficial effect on aortic arterial stiffness.
In contrast, another study revealed that the use of atorvastatin actually increased arterial stiffness. Moreover, many of the currently available studies included small numbers of patients or were undertaken over short periods of time.
For the present study, the Chinese team retrospectively examined the relationship between statin use and the progression of arterial stiffness, based on measurement of brachial-ankle pulse wave velocity. The researchers used data from the Kailuan study which is a large, prospective study including over 100,000 individuals.
A wide range of data was collected during the study including demographic and socioeconomic information, e.g., education level, average income of each family member as well as medical and lifestyle measures such as physical activity.
Since 2010, individuals within the study at a high risk of peripheral artery disease, i.e., those with at least one risk factor such as hypertension or diabetes, have been invited to have a baseline brachial-ankle pulse wave velocity (BaPWV) measurement, which was repeated at follow-up visits.
The team included patients prescribed statins at least 6 months prior to their first BaPWV measurement. Furthermore, statin users were divided into those who discontinued their treatment within the first two years and those with a high level of statin adherence. These statin users were then propensity matched with non-users.
Arterial stiffness and statin use
A total of 1310 individuals with a mean age of 64.6 years (75.7% male) using statins were propensity matched with the same number of non-statin users although the non-users had a slightly lower mean age (61.9 years).
The use of statins was associated with a significantly lower baseline BaPWV value compared to non-users (difference = -33.6 cm/s, 95% CI -62.1 to -5.1 cm/s).
During a mean follow-up of 4.8 years, mean BaPWV increased from 1778.8 cm/s to 1831.9 cm/s in the statin group and from 1799.0 cm/s to 1870.8 cm/s in the non-statin group. Using a multivariable linear regression model, the authors found that statin use was associated with a significantly slower progression of BaPWV (difference = -23.3 cm/s/year).
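As a rough check on these figures, the crude annual progression in each group can be worked out directly from the reported means; note that the study's -23.3 cm/s/year difference comes from a fully adjusted multivariable model, so it will not match this unadjusted arithmetic:

```python
# Crude (unadjusted) annualised BaPWV progression from the reported group means
follow_up_years = 4.8

statin_rate = (1831.9 - 1778.8) / follow_up_years      # ~11.1 cm/s per year
non_statin_rate = (1870.8 - 1799.0) / follow_up_years  # ~15.0 cm/s per year

print(f"statin: {statin_rate:.1f} cm/s/year, "
      f"non-statin: {non_statin_rate:.1f} cm/s/year")
```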
They concluded that statin use appeared to be linked with a slowing of BaPWV progression in adults with a high atherosclerotic risk, suggesting that these drugs were able to prevent the development and worsening of subclinical cardiovascular lesions at an early stage.
Zhou YF et al. Association Between Statin Use and Progression of Arterial Stiffness Among Adults With High Atherosclerotic Risk. JAMA Netw Open 2022
23rd May 2022
The presence of contrast agent pooling (CAP) is strongly associated with a subsequent in-hospital cardiac arrest one hour later, according to the findings of a retrospective analysis by a team from the Department of Emergency Medicine, Far Eastern Memorial Hospital, New Taipei City, Taiwan.
Within emergency departments, computed tomography (CT) scans are a widely used imaging modality for the detection of a number of conditions such as blood clots, kidney stones and head injuries. Prior to a CT scan, and irrespective of the reason for it, radiologists ensure that a patient is clinically stable. However, despite the appearance of clinical stability, there are case reports of patients who experience a cardiac arrest while undergoing a CT scan.
Interestingly, these reports have also described some characteristic features on the scan. The first case to be reported was in 2002 and the authors identified the presence of contrast agent pooling in the dependent parts of the right side of the body, including the venous system and the right lobe of the liver. This likely occurs due to a pump failure of the heart, leading to stasis of blood in the dependent organs of the body.
Moreover, once the heart stops pumping, there is a drop in both arterial and venous pressures and because the contrast agent is heavier than blood, CAP occurs in the dependent portions of the venous system.
Although this CAP sign has been reported in several cases, it is clearly not a common phenomenon, and it remains uncertain whether its presence has any prognostic value for an imminent in-hospital cardiac arrest.
For the present study, the researchers performed a retrospective analysis of all patients admitted to their hospital who underwent a CT scan and subsequently experienced an in-hospital cardiac arrest, collecting both demographic and clinical data for these patients.
The occurrence of the CAP sign on a chest or abdominal scan was recorded, defined as accumulation of contrast agent in the renal or hepatic veins or the dependent part of the liver, or contrast agent layering over the vena cava.
The primary outcome of interest was the accuracy of the CAP sign in predicting an imminent cardiac arrest (defined as within 1 hour after the CT scan).
CAP sign and cardiac arrest
A total of 128 patients with a mean age of 69 years (60.2% male) were included in the analysis, among whom, 8.6% were positive for the CAP sign on a CT scan.
With respect to the primary outcome, the accuracy of the CAP sign in predicting cardiac arrest was 85.94% (95% CI 78.69 – 91.45%) and the positive predictive value was 64%. Additionally, the CAP sign was significantly associated with a cardiac arrest within 1 hour (odds ratio, OR = 7.35, 95% CI 1.27 – 42.59).
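The published figures are consistent with a simple 2×2 reconstruction; the exact cell counts below are inferred from the reported percentages (n = 128, 8.6% CAP-positive, PPV 64%, accuracy 85.94%), not taken directly from the paper:

```python
# Inferred 2x2 table for the CAP sign vs cardiac arrest within 1 hour
tp, fp = 7, 4     # CAP-positive: arrest within 1 h / no arrest
fn, tn = 14, 103  # CAP-negative: arrest within 1 h / no arrest

n = tp + fp + fn + tn          # 128 patients in total
accuracy = (tp + tn) / n       # proportion of correct predictions
ppv = tp / (tp + fp)           # probability of arrest given a CAP sign

print(f"n = {n}, accuracy = {accuracy:.2%}, PPV = {ppv:.0%}")
```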
The authors concluded that the CAP sign could be viewed as an imaging feature of circulatory failure and its presence should be taken as a warning sign to clinicians to allow them to provide timely interventions for critically ill patients.
Lee YH et al. Contrast Agent Pooling (C.A.P.) sign and imminent cardiac arrest: a retrospective study. BMC Emerg Med 2022
6th May 2022
An AI-based breast cancer screening protocol has been found to have a similar screening sensitivity and a slightly higher specificity than radiologists, and might therefore considerably reduce radiologists' workload.
This was the finding from a retrospective analysis by researchers from the Department of Computer Science and Public Health, University of Copenhagen, Copenhagen, Denmark.
Breast cancer arises in the epithelium of the ducts or lobules in the glandular tissue of the breast and, according to the World Health Organization, in 2020 there were 2.3 million women diagnosed with breast cancer and 685,000 deaths globally.
Population screening of women enables the detection of the early signs of breast cancer and one European analysis of observational studies concluded that the estimated breast cancer mortality reduction from invited screening was 25-31% and 38-48% for women actually screened.
Although screening mammography is the principal method for the detection of breast cancer, 10-30% of breast cancers may be missed at mammography. Part of the reason for missing possible cancers may lie in behavioural factors.
For example, in one study, six radiologists who reviewed 100 breast cancer scans, where the prevalence of disease was artificially set at 50%, missed 30% of the cancers. In contrast, when the prevalence was raised, participants missed just 12% of the same cancers.
In other words, radiologists are more likely to be on the look-out for suspicious scans when they know that the disease has a much higher prevalence.
One potential way to remove the effect of behavioural influences is to use an artificial intelligence (AI)-based system for reading breast cancer scans. In fact, such systems have been shown to maintain non-inferior performance while reducing the workload of the second radiologist reader by 88%.
But whether an AI-based breast cancer system could safely be used for population-based screening and reduce the number of mammograms requiring reading by a radiologist is uncertain, and this was the objective of the current study.
Using a retrospective design, the Danish researchers examined whether the AI-based cancer protocol was able to detect normal, moderate-risk and suspicious mammograms. The team used data from a breast cancer screening program and each of the mammograms was scored from 0 to 10 (to designate the risk of malignancy) by the AI-based cancer tool.
The team then compared the AI-based system with radiologists in terms of screening performance, using the area under the receiver operating characteristic curve (AUC).
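The triage logic described above can be sketched as a simple threshold rule on the 0 to 10 AI malignancy score, where only the intermediate band is sent for human reading; the threshold values here are purely illustrative and are not the ones used in the study:

```python
# Hypothetical triage over the 0-10 AI malignancy score
def triage(ai_score: float, low: float = 5.0, high: float = 9.9) -> str:
    if ai_score < low:
        return "normal"       # screened out, no radiologist reading
    if ai_score >= high:
        return "suspicious"   # flagged directly for recall
    return "radiologist"      # moderate risk: human reading required

for score in (1.2, 7.4, 9.95):
    print(score, "->", triage(score))
```

The reported 62.6% workload reduction corresponds to the share of mammograms falling outside the intermediate band under the study's own thresholds.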
AI-based cancer screening protocol performance
The cohort included 114,421 women with a mean age of 59 years who underwent mammography screening. The scanning identified 791 screen-detected cancers, 327 interval cancers and 1473 long-term cancers.
The AI-based cancer system had a screening sensitivity of 69.7% (95% CI 66.9 – 72.4%), which was non-inferior to the radiologist sensitivity of 70.8% (p = 0.02). The AI-based screening specificity of 98.6% was significantly higher (p < 0.001) than that of the radiologists (98.1%).
Based on these findings, the authors calculated that use of the AI-based cancer system led to a 62.6% radiologist workload reduction. Moreover, the AI-based system reduced the number of false-positive screenings by 25.1%.
They concluded that incorporation of an AI-based cancer system for population-based screening could both improve these programs and reduce radiologist workload and called for a prospective trial to determine the impact of AI-based screening.
Lauritzen AD et al. An Artificial Intelligence–based Mammography Screening Protocol for Breast Cancer: Outcome and Radiologist Workload. Radiology 2022
29th December 2021
Having pre-existing heart disease (PEHD) appears to improve the chance of surviving to hospital admission after an out-of-hospital cardiac arrest. This was the somewhat counter-intuitive conclusion of a retrospective analysis by a team from the Department of Cardiology, Amsterdam UMC, the Netherlands.
Out-of-hospital (OOH) cardiac arrests are a common problem: in one European study of 37,054 such arrests, the hospital survival rate was only 26.4%. Patient characteristics, and in particular the presence of existing heart disease, are potentially important contributors to overall survival, although the impact of this factor has been poorly studied. In one such study, having ischaemic heart disease led to 50% improved odds of survival. In contrast, a second study showed that PEHD was an independent predictor of poorer survival after an OOH cardiac arrest.
Given these contradictory findings, the Dutch researchers set out to explore the relationship between OOH cardiac arrest and prior cardiovascular disease. They retrospectively examined data from the AmsteRdam REsuscitation STudies (ARREST) registry, which contains information on all OOH cardiac arrests in the northern part of Holland. Patients' medical histories were obtained from their GPs' records for evidence of a wide range of cardiac diseases, and patients were dichotomised as either having or not having PEHD.
For the purposes of the study, the primary outcomes were survival to hospital admission and survival to hospital discharge, both obtained from hospital records. Secondary outcomes were a shockable initial rhythm (SIR) and acute myocardial infarction (AMI) as an immediate cause of OOH cardiac arrest. Using regression analysis, the team examined the association between PEHD and both survival outcomes, adjusting the models for several co-morbidities.
A total of 3760 OOH cardiac arrest patients with a mean age of 68 years (70.9% male) were included in the analysis. Overall, 48.1% of the cohort had PEHD; these patients were on average slightly older (mean age 71.4 vs 64.7 years, p < 0.001) and had a higher incidence of cardiovascular risk factors, e.g., hypertension (59.3% vs 42.2%, p < 0.01), obesity and hypercholesterolaemia.
Among those with PEHD, 41.9% survived to hospital admission compared to 38% of those without prior cardiovascular disease (p = 0.014). The presence of existing heart disease was associated with an increased survival odds after adjustment for all covariates (adjusted odds ratio, aOR = 1.25, 95% CI 1.05 – 1.47). However, prior cardiac disease was not associated with survival to discharge in fully adjusted models (aOR = 1.16, 95% CI 0.95 – 1.42).
Among 1680 patients with SIR, prior heart disease was also not associated with survival to hospital (aOR = 1.12, 95% CI 0.90 – 1.39) but prior heart disease was associated with a lower proportion of AMI (aOR = 0.33, 95% CI 0.25 – 0.42).
In their discussion, the authors recognised that their findings were somewhat counter-intuitive and were unable to explain the result. Nevertheless, they concluded that survival gains after an OOH cardiac arrest applied to a wide range of patients with prior heart disease.
van Dongen LH et al. Higher chances of survival to hospital admission after out-of-hospital cardiac arrest in patients with previously diagnosed heart disease. Open Heart 2021