Take a look at a selection of our recent media coverage:
2nd June 2023
Elevated levels of the protein Indian Hedgehog, which is released by a subset of aged and injured renal cells, have been identified in patients with chronic kidney disease (CKD), making the protein a potential therapeutic target for the condition, according to a new study.
Published in the journal Science Translational Medicine, an international research team sought to understand the link between injury and inflammation in patients with CKD. The team hypothesised that signalling from renal epithelial cells activated glioma-associated oncogene 1-expressing (Gli1+) cells to induce fibrosis, but exactly how this process occurred was uncertain.
The ageing process is associated with chronic inflammation and a defective immune response. Conditions such as CKD are accompanied by immunosenescence, with such patients at a higher risk of cardiovascular disease. Previous work revealed a link between increased production of tumour necrosis factor (TNF), CKD and renal failure, but the underlying mechanism remains unclear, which led to the current study.
The researchers identified that leukocyte-derived TNF promoted Gli1+ cell proliferation and cardio-renal fibrosis through induction and release of the Indian Hedgehog (IHH) protein from renal epithelial cells. Moreover, inhibition of TNF release reduced tissue fibrosis and also lowered Gli1+ cell proliferation.
In further work, it also became clear that increased circulating levels of IHH correlated with cardiovascular disease and loss of renal function in patients with CKD. Taken together, the authors suggested their findings indicated that IHH release from TNF-activated renal cells drove the activation and proliferation of Gli1+ cells and was the missing link between inflammation and organ fibrosis.
While these represent preliminary findings, the authors suggested that therapies targeting TNF and/or IHH signalling warrant further investigation to assess their potential in patients with CKD and fibrosis.
A novel extracorporeal liver device has been found to be safe and able to reduce endotoxaemia and improve albumin function in patients with acute-on-chronic liver failure (ACLF) in a recent randomised trial.
Published in the Journal of Hepatology, researchers demonstrated that the novel extracorporeal liver device, DIALIVE, was as safe as standard care (SC) for patients with ACLF. DIALIVE is designed to achieve removal and replacement of dysfunctional albumin and reduction in endotoxaemia. However, prior to the recent trial, this had not been tested in humans.
ACLF is a syndrome that occurs in hospitalised patients with cirrhosis who present with acute decompensation and carries a high short-term mortality in excess of 15% after 28 days. It is characterised by intense systemic inflammation and associated with multiple organ failures.
Some 32 patients with alcohol-related ACLF were included in the trial and treated with either DIALIVE for up to five days or SC. The study endpoints were assessed at day 10. The primary outcome of interest was safety, defined in terms of the percentage of patients who experienced at least one serious adverse event (SAE) between days one and 10. A number of secondary outcomes were assessed including the change in plasma endotoxin level, albumin function, 28-day mortality and changes in individual organ function.
A total of 17 ACLF patients were randomised to receive DIALIVE and 15 to standard care. The DIALIVE therapy was administered for a median of three sessions and each session lasted for between eight and 12 hours.
The presence of any adverse event was similar in the two groups of ACLF patients (76.5% vs 80%, DIALIVE vs SC, p = 1.00). Similarly, the incidence of serious adverse events in those assigned to the novel extracorporeal device was also not significantly different (p = 0.76). There were also no differences in 28-day mortality.
While there was a trend towards a reduced endotoxaemia severity in the DIALIVE group, this difference was non-significant (p = 0.14). In contrast, there were significant differences favouring DIALIVE for the improvements in liver, kidney, coagulation and brain damage scores. In addition, there was a greater improvement in albumin function in the DIALIVE group. Finally, a higher proportion of patients achieved ACLF resolution with DIALIVE treatment (43% vs 27%).
The authors concluded that DIALIVE achieved its aims of reducing the level of endotoxin and improving albumin function in those with severe liver failure, ultimately leading to a greater proportion of patients whose condition resolved.
1st June 2023
SGLT-2 inhibitor use in people with both diabetes and atrial fibrillation reduces the risk of ischaemic strokes, according to the results of a longitudinal follow‐up study.
Atrial fibrillation (AF) is the most common global cardiac arrhythmia, affecting over three million people. Having AF increases the risk of ischaemic stroke with this risk stratified by the CHA2DS2-VASc score. Fasting hyperglycaemia is a risk factor for AF although the use of sodium-glucose cotransporter-2 (SGLT-2) inhibitors reduces this risk.
The researchers considered whether SGLT-2 inhibitors could therefore reduce the risk of ischaemic stroke in people with both diabetes and AF. Published in the Journal of the American Heart Association, the Taiwanese study followed a group of patients with both diabetes and AF who were prescribed either empagliflozin or dapagliflozin. These individuals were propensity matched to non-users and the incidence of ischaemic strokes documented over the next five years.
A total of 6,614 patients, of whom 801 were prescribed one of the SGLT-2 inhibitors, had usable data for analysis.
After five years, 809 patients with diabetes and AF developed an ischaemic stroke. However, the rate was significantly lower among SGLT-2 inhibitor users (p = 0.021).
As expected, there was an increased risk of stroke per one-point increase in CHA2DS2‐VASc score (hazard ratio, HR = 1.24, 95% CI 1.20 – 1.29, p < 0.001). Adjusting for the CHA2DS2‐VASc score lowered the stroke risk by 20% among SGLT-2 inhibitor users (HR = 0.80, 95% CI 0.64 – 0.99, p = 0.043).
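Since hazard ratios are multiplicative under a proportional-hazards model, the per-point estimate can be compounded to sketch how stroke risk scales with the score. The snippet below is illustrative only — the study did not report score-specific hazards — and simply raises the reported per-point HR of 1.24 to the power of the score:

```python
# Illustrative only: under a proportional-hazards model, each one-point
# increase in CHA2DS2-VASc multiplies the stroke hazard by the per-point
# hazard ratio (1.24 in this study), so the relative hazard compounds.
HR_PER_POINT = 1.24

def relative_hazard(score: int) -> float:
    """Relative stroke hazard at a given CHA2DS2-VASc score versus a
    score of 0, assuming the per-point HR compounds multiplicatively."""
    return HR_PER_POINT ** score

for s in (2, 4, 6):
    print(f"CHA2DS2-VASc {s}: ~{relative_hazard(s):.2f}x the hazard of score 0")
```

On this simplification, a patient scoring 6 carries roughly three and a half times the stroke hazard of a patient scoring 0.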
The findings prompted the authors to suggest that clinicians upgrade to SGLT-2 inhibitors for glycaemic control, especially in those with co-existing AF.
A higher intake of dietary flavanols among older adults does not improve memory unless they have a poor diet, according to the findings of a three-year randomised trial.
In the three-year long randomised, double-blind, placebo-controlled trial published in the journal PNAS, researchers tested if a flavanol intervention could improve hippocampal-dependent memory.
The intervention was a cocoa extract containing 500 mg of cocoa flavanols per day. Diet was assessed using the Alternative Healthy Eating Index (aHEI) and a urine biomarker was used to measure flavanol intake in a subset of participants. The primary outcome was the change in the modified Rey Auditory Verbal Learning Test – or ModRey – which is used to assess memory. Secondary outcomes included the Flanker task and ModBent, both of which are measures of cognitive function.
Cognitive ageing refers to age-related changes in cognitive function, such as reasoning, memory and processing speed, which typically occur as people get older. Research suggests that areas of hippocampal formation, in particular the dentate gyrus, are implicated in cognitive ageing.
Additionally, it is known that dietary flavanols enhance dentate gyrus function. A three-month randomised, placebo-controlled trial suggested that a flavanol-based dietary intervention may have a beneficial impact on cognitive ageing. Since this was a short-term trial, the US researchers wondered if a higher intake of flavanols over time could positively impact the memory component of cognitive ageing.
In the recent trial, a total of 3,562 older adults with a mean age of 71 years (66.9% female) were included and randomised to flavanols (1,744) or placebo and followed for three years.
After one year, there were no significant differences in the primary outcome between the two groups (mean difference, MD = 0.08, p = 0.415). Similarly, there were no differences in the two secondary outcomes. In fact, over the following two years, differences in both the primary and secondary outcomes remained non-significant. However, when researchers examined the effect of flavanol intake across aHEI scores, they found a significant improvement in ModRey in those in the lowest aHEI tertile (p = 0.011) but not in the middle or highest tertiles.
Based on these findings, the authors suggested that a low intake of flavanols was a potentially important factor driving the hippocampal-dependent component of cognitive ageing.
30th May 2023
Although ketamine use is associated with greater haemodynamic instability during rapid sequence intubation in trauma patients, it does not significantly affect the first-pass success rate compared with etomidate, according to a retrospective analysis.
Published in the journal BMC Emergency Medicine, Korean researchers considered whether the potential adverse effects of ketamine and etomidate could affect the first-pass success rate during rapid sequence intubation (RSI) in trauma patients.
The team retrospectively compared both sedatives, not only in terms of the effect on the first-pass success rate but also with respect to clinical outcomes. Patients given ketamine were propensity-matched 1:3 with etomidate and the results adjusted for injury severity and confounding baseline characteristics.
RSI represents the set of actions undertaken during induction of anaesthesia that secures the airway in trauma patients at risk of aspiration or regurgitation of gastric contents, to enable emergency orotracheal intubation. Ideally, the RSI procedure should allow for rapid and optimal intubation conditions through increasing the first-pass intubation rate whilst reducing adverse events in severely injured patients. Despite being a standard procedure, a recent survey identified significant variation in practice, prompting calls for international RSI guidelines.
Both ketamine and etomidate are commonly used sedatives for RSI during emergency tracheal intubation. Nevertheless, both are associated with potential adverse effects which could affect clinical outcomes. For example, single dose use of etomidate may increase 28-day mortality, whereas ketamine use could increase the risk of cardiac arrest.
A total of 620 patients, of whom 19.9% received ketamine, were included in the retrospective analysis. The ketamine patients had a significantly faster initial heart rate (105.0 vs 97.7, p = 0.003) and a lower systolic blood pressure (114.2 vs 139.3 mmHg, p < 0.001) than those given etomidate.
However, when researchers considered the first-pass success rate, this was not significantly different (90.7% vs. 90.1%, ketamine vs etomidate, p > 0.999). Similarly, there were no differences in other clinical outcomes explored including final mortality (p = 0.348), length of intensive care unit stay (p = 0.99), ventilator days (p = 0.735) and overall hospital stay (p = 0.32).
The authors concluded that when used for RSI, although patients administered ketamine showed greater haemodynamic instability, this had no important impact on either the first-pass success rate or other relevant clinical outcomes.
A meta-analysis of 30 trials suggests that plant-based diets are associated with significant reductions in total cholesterol, LDL cholesterol and apolipoprotein B.
Published in the European Heart Journal, the Danish researchers undertook a systematic review and meta-analysis to estimate the effect of vegetarian and vegan diets on plasma levels of the main lipid fractions: total cholesterol (TC), LDL cholesterol (LDLC), triglycerides (TGs) and apolipoprotein B (ApoB).
Randomised trials looking at the effect of both plant-based diets on the different lipid fractions were considered in comparison to an omnivorous diet in adults over 18 years of age. A total of 30 eligible trials were identified, which included an equal number of both types of plant-based diets and had an overall sample size of 2,372 participants.
Compared with the omnivorous group, the plant-based diets significantly reduced TC (mean difference, MD = -0.34 mmol/L, 95% CI -0.44 to -0.23, p < 0.001). Significant reductions were also seen for LDLC (MD = -0.30, 95% CI -0.40 to -0.19, p < 0.001) and ApoB (MD = -0.34, 95% CI -0.44 to -0.23, p < 0.001). However, no significant reduction was seen in the level of triglycerides compared with an omnivorous diet.
Despite these significant reductions, the researchers also reported substantial heterogeneity in the findings for TC, LDLC and ApoB (ranging from 69-74%).
To get a better understanding of the impact of these reductions, it is worth putting the changes into context.
A previous analysis of the benefits from lipid reductions indicated that for every 1 mmol/L decrease in TC, there was a 17.5% reduction in the risk of all-cause mortality, a 24.5% reduction in coronary heart disease (CHD)-related mortality and a 29.5% reduction in the risk of any CHD-related event.
Using TC as an example, a 0.34 mmol/L reduction, as found in the current analysis, equates to a 6% decrease in the risk of all-cause mortality and a 10% reduction in the risk of a CHD-related event.
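That back-of-the-envelope calculation assumes the benefit scales linearly with the size of the TC reduction — a simplification, but it reproduces the figures quoted above:

```python
# Rough proportional scaling of risk reduction with total cholesterol (TC)
# lowering, assuming a linear dose-response (an illustrative simplification).
PER_MMOL_PCT = {
    "all-cause mortality": 17.5,  # % risk reduction per 1 mmol/L TC decrease
    "CHD-related event": 29.5,
}

def scaled_reduction(tc_drop_mmol: float, per_mmol_pct: float) -> float:
    """Scale the per-mmol/L benefit linearly to a given TC reduction."""
    return tc_drop_mmol * per_mmol_pct

# The meta-analysis found a 0.34 mmol/L TC reduction with plant-based diets.
for outcome, pct in PER_MMOL_PCT.items():
    print(f"0.34 mmol/L TC drop -> ~{scaled_reduction(0.34, pct):.0f}% lower {outcome}")
```

Running this yields approximately 6% for all-cause mortality and 10% for any CHD-related event, matching the article's figures.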
Consequently, whilst relatively modest for an individual, these reductions would become substantial at the population level, thereby potentially reducing the societal burden and mortality associated with cardiovascular disease.
Data from the World Health Organization suggests that cardiovascular diseases result in nearly 18 million annual deaths, emphasising the importance of any strategies that could reduce this risk. One strategy at the individual level is to move away from an omnivorous diet. In fact, this approach was advocated in guidance from the European Society of Cardiology in 2021. The guidance suggested that individuals could reduce their risk of cardiovascular disease through adoption of a more plant-based and less animal-based food pattern. In recent years there has been an increase in vegetarian eating patterns in several continents around the world, including Europe and North America. In addition, there are also reported increases in the number of people eating a vegan diet.
In a previous analysis from 2015, it was shown that a vegetarian diet reduced lipid markers such as total cholesterol (TC) and LDL cholesterol (LDLC). However, the analysis did not examine the impact of this plant-based dietary pattern on levels of apolipoprotein B (ApoB), despite strong evidence that this lipid is a more accurate indicator of cardiovascular risk than either TC or LDLC.
Using a machine learning approach, researchers have identified a molecule that can potentially treat infections due to Acinetobacter baumannii – a common cause of a nosocomial infection in hospitals that is often multi-drug resistant.
Acinetobacter baumannii (A. baumannii) is an opportunistic, nosocomial pathogen and one of the six most important multi-drug-resistant microorganisms in hospitals worldwide. The organism has become particularly troublesome in hospitals due to its ability to survive for prolonged periods of time and an innate resistance to both desiccation and disinfectants. As a result, there is a growing and urgent need to develop novel antibiotics to overcome drug-resistant organisms.
One novel approach to drug discovery is the use of machine learning algorithms. This strategy was recently used to predict molecules with broad-spectrum antibacterial activity against Escherichia coli.
But could this approach also be used to develop narrow-spectrum antibiotics that target specific organisms? Such agents have at least two potential advantages. First, the spread of resistance is less likely with narrow-spectrum antibiotics because of lower selection pressure. Second, narrow-spectrum agents do not cause dysbiosis – that is, disruption to the microbiome – during treatment.
In a recent study published in Nature Chemical Biology, US and Canadian researchers used machine learning to focus their attention on identifying antibiotics targeting A. baumannii. The team screened approximately 7,500 molecules, looking at the ability of these agents to inhibit the growth of the organism in vitro. The search identified a set of 480 active molecules – those able to inhibit growth of A. baumannii. Further filtering using the algorithm reduced this to 240 molecules that were structurally different to existing antibiotics.
Ultimately, one compound, abaucin, demonstrated a very potent inhibitory action. Looking at its possible mechanism of action, the team identified how the drug appeared to interfere with lipoprotein trafficking, which is involved in development of an organism’s outer membrane and an intrinsic antibiotic resistance factor.
To examine the potential value of abaucin, researchers then tested the activity of the molecule against 41 known strains of A. baumannii. Surprisingly, they found it able to overcome all intrinsic and acquired resistance mechanisms in these 41 isolates. Furthermore, in a mouse model of an infection due to A. baumannii, abaucin proved to be an effective treatment.
The team also found that abaucin displayed minimal growth inhibitory activity against 51 human commensal species isolates, suggesting that it was unlikely to cause dysbiosis during treatment.
Looking forward, the researchers suggested that, once a novel antibacterial agent has been identified, machine learning models could be trained to examine the growth inhibition of a pathogen of interest, as well as the potential toxicity of the drug to mammalian cells.
In addition, with more high-quality datasets on which to train machine learning models, the researchers felt that this approach could become widely employed in future to more efficiently identify structurally and functionally effective new antibacterial agents.
Consuming one to three servings of chocolate per week is enough to lower women’s risk of death, findings from a recent study suggest.
The study focused on post-menopausal women who were free of cardiovascular disease (CVD) and cancer at baseline when enrolled between 1993 and 1998, with the cohort followed until March 2018. The outcomes of interest were all-cause mortality and cause-specific mortality from CVD, cancer and dementia.
Women’s intake of chocolate was categorised based on the intake frequency of a 1oz serving of chocolate as: none, less than one serving per week (<1 serving/wk), one to three servings per week (1-3 servings/wk), four to six servings per week (4-6 servings/wk) and one or more servings per day (≥1 serving/d).
Over 1,608,856 person-years of follow-up, there were a total of 25,388 deaths, which included 7,069 deaths from CVD, 7,030 from cancer and 3,279 from dementia. In multivariable adjusted analysis, compared with those who did not eat chocolate, the hazard ratio (HR) for all-cause mortality ranged from 0.95 (95% CI 0.92 – 0.98) for <1 serving/wk to 0.93 (95% CI 0.89 – 0.96) for 1-3 servings/wk (p for trend = 0.02).
For CVD mortality, the association was only significant for 1-3 servings/wk (HR = 0.88, 95% CI 0.82 – 0.95). In contrast, dementia mortality was significantly lower for both <1 serving/wk and 1-3 servings/wk.
Overall, there was no significant effect of chocolate intake on cancer mortality. In subgroup analysis, however, lung cancer mortality was significantly lower, but only for 1-3 servings/wk (HR = 0.82, 95% CI 0.70 – 0.96).
The authors acknowledged that their analysis did not consider the different types of chocolate – dark chocolate, for example, has purported health benefits – and that this could have impacted their findings. They also accepted that residual confounding could not be excluded; in other words, the findings could be due to other factors not considered.
A modest and inverse association between eating chocolate and mortality from all causes and cause-specific mortality from cardiovascular disease, cancer and dementia has previously been found in an analysis of data from the Women’s Health Initiative (WHI) by US researchers.
Chocolate is known to contain a high content of the saturated fat stearic acid, as well as antioxidant flavonoids, with the latter component likely responsible for a cardioprotective effect. Moreover, evidence from a meta-analysis of prospective studies suggests that moderate consumption of chocolate is associated with a decreased risk of coronary heart disease (CHD), stroke and diabetes. But not all studies have concurred with this analysis. One, for instance, undertaken in women, was unable to find an association between chocolate intake and the risk of CHD and stroke. As well as potential cardiovascular benefits, there also appears to be an association between regular intake of chocolate and a lower risk of cognitive decline.
25th May 2023
The use of short-stay wards located in an emergency department (ED) and managed by emergency care clinicians benefits patients by reducing their length of stay (LOS) and 28-day mortality risk, according to the findings of a retrospective study by Korean researchers.
Published in BMC Emergency Medicine, the researchers hypothesised that ED clinician care within an emergency department short-stay ward (ESSW) would reduce patients’ LOS in the department without affecting overall clinical care. They retrospectively analysed adult patients who visited the ED at a tertiary academic hospital in Seoul.
The patients were divided into three groups: those admitted to the ESSW and treated within the ED (ESSW-EM); those admitted to the ESSW but treated by other departments (ESSW-Other); and those admitted to general wards (GW). The co-primary outcomes were ED length of stay and 28-day mortality.
A total of 29,596 patients were included in the analysis, with 31.3% categorised as ESSW-EM and 59.8% as GW.
When comparing ED LOS, the researchers found that the shortest time was for those in the ESSW-EM group (mean 7.1 hours). The mean ED LOS was 8.0 hours and 10.2 hours in the ESSW-Other and GW groups respectively (p < 0.001 for both comparisons). In addition, 28-day hospital mortality was 1.9% for the ESSW-EM group and 4.1% for the GW group (p < 0.001).
Using multivariable logistic regression analyses, being in the ESSW-EM group was independently associated with a lower hospital mortality compared with both the ESSW-Other group (adjusted p = 0.030) and the GW group (adjusted p < 0.001).
With patients’ LOS being a potential surrogate marker for overcrowding, the authors suggested that admission to an ESSW, under the care of emergency care clinicians, is a potentially effective strategy to alleviate emergency department overcrowding and improve patient outcomes.
Emergency department boarding, or overcrowding, is known to increase both hospital LOS and mortality. Consequently, in an effort to alleviate overcrowding, many Westernised countries have introduced a waiting time target to reduce the time spent by patients in the department. This target has often been set at four hours and there is some evidence that it reduces mortality rates. Nevertheless, a systematic review in 2010 concluded that the introduction of such targets has not resulted in a consistent improvement in care.
An alternative proposed solution to reduce ED overcrowding and the associated mortality risks, is to have short-stay units within the ED. These emergency department short-stay wards (ESSW) are specific areas within the department designed to provide short-term care for a selected group of patients and hopefully to alleviate overcrowding.
While a potentially promising approach, a systematic review in 2015 noted insufficient evidence to draw any firm conclusions on either the effectiveness or safety of short-stay units compared with inpatient care. Nevertheless, other work has shown that use of an ESSW is associated with a low rate of subsequent ICU admission. In contrast, an ESSW designed to manage patients with cardiac problems actually increased patients’ hospital LOS.
Despite this body of work, no previous studies had explored the potential benefit of using emergency care clinical staff within the ESSW.
24th May 2023
The rate of exacerbations in COPD patients with type 2 inflammation is lowered by treatment with dupilumab, according to the findings of a recent randomised, placebo-controlled trial.
Patients with chronic obstructive pulmonary disease (COPD) and type 2 inflammation experience a lower annualised rate of moderate to severe disease exacerbations when dupilumab is added to standard triple therapy, the BOREAS clinical trial group found.
Published in the New England Journal of Medicine, the study looked at COPD patients with type 2 inflammation, based on an elevated eosinophil count (≥300 cells/µL), who were receiving standard triple therapy. Patients were randomised to either dupilumab 300 mg or placebo, given subcutaneously once every two weeks.
The primary endpoint was the annualised rate of moderate or severe exacerbations of COPD. Secondary outcomes included the change in the prebronchodilator FEV1 and the St. George’s Respiratory Questionnaire (SGRQ) for which lower scores indicated a better quality of life. In addition, the Evaluating Respiratory Symptoms in COPD (E-RS–COPD) scale was used, with, again, lower scores indicative of less severe symptoms.
A total of 939 patients were included and randomised to either dupilumab (468) or placebo. The mean baseline absolute blood eosinophil count was 401 cells/µL.
The annualised rate of moderate or severe exacerbations was 0.78 (95% CI 0.64 – 0.93) for those given dupilumab and 1.10 (95% CI 0.93 to 1.30) with placebo (rate ratio, RR = 0.70, 95% CI 0.58 to 0.86, p < 0.001).
For the secondary outcomes, the prebronchodilator FEV1 increased from baseline to week 12 by a mean of 160 ml with dupilumab and 77 ml with placebo (p < 0.001) and this difference was sustained through to week 52. Similarly, both the SGRQ and E-RS–COPD scores were significantly lower in those receiving dupilumab at week 52.
The authors concluded that COPD patients with a type 2 inflammation phenotype who received dupilumab experienced a lower annualised rate of moderate to severe exacerbations, improved quality of life and better lung function.
COPD exacerbations are linked to an accelerated decline in lung function, especially in patients with eosinophil counts greater than 350 cells/µL and not using inhaled corticosteroids.
Type 2 inflammation is present in a sub-set of COPD patients, with one study finding a prevalence of 37%, and such individuals show a better response to systemic corticosteroids such as prednisolone. Type 2 inflammation is characterised by increased eosinophil counts together with elevated levels of various interleukins including interleukin-5, interleukin-4 and interleukin-13.
Monoclonal antibody treatment targeting interleukin-5 with a view to reducing disease exacerbations has, to date, produced mixed results. For instance, use of benralizumab was not associated with a lower annualised rate of COPD exacerbations. In contrast, treatment with mepolizumab did lower the annual rate of moderate or severe exacerbations. An alternative, yet untested, therapeutic approach is the use of dupilumab, which blocks two other interleukins elevated in those with the type 2 inflammation phenotype, namely interleukin-4 and interleukin-13.