
Using artificial intelligence to bridge the divide in mental health care


With digital transformation in mental health services ramping up, consultant psychiatrist Dr Arokia Antonysamy discusses the current demands, practices and challenges in mental health, and how leveraging artificial intelligence for assessment and treatment can lead to improved mental health outcomes.

In the midst of the global pandemic, many sectors – including mental health – tentatively embraced digital solutions such as remote teleconsultation to maintain continuity of care. Yet there was a swift regression to traditional in-person assessments, even before the full deployment of Covid-19 vaccines.

A puzzling resistance surfaced, rooted in the perception that remote working diminishes staff commitment.1 In reality, studies have shown that it increases productivity, engagement, retention and job satisfaction.2,3 This traditionalist view holds back the essential evolution of mental health practices into the digital age.

Consider the realm of robotic surgery, first introduced in the 1980s. The hesitance to adopt such technologies broadly stemmed from cost considerations, liability fears and concern over reduced surgical workforce capacity. This hesitance is emblematic of a larger trend in healthcare, where investment and progress in specialised fields can be incremental at best, and mental health, often seen as a ‘Cinderella speciality’, lags far behind.

Digital transformation encompasses the adoption and integration of digital technologies to improve the delivery, accessibility and effectiveness of services. In mental health, this transformation involves various components including telepsychiatry, mobile health applications, wearable devices, artificial intelligence (AI) and data analytics.

The application of AI across various industries, including healthcare, has been promising. Yet, the mental health sector remains underfunded and overlooked when it comes to this digital innovation. The reluctance to invest in modern technologies could be detrimental,4 stalling innovative advancements that can potentially revolutionise mental health practices.

Growing demand for mental health services

The prevalence of common mental disorders in adults aged 16–64 increased from 14.1% in 1993 and 16.3% in 2000 to 17.5% in 2014.5 Additionally, according to the Royal College of Psychiatrists, long waiting times force 78% of patients to seek mental health help from emergency services, and about a quarter wait more than 12 weeks to start treatment.6

The existing gateway system, which mandates that all referrals to secondary care mental health services go through general practitioners (GPs), presents significant hurdles. Face-to-face GP consultations have become increasingly scarce, compounding the public’s dwindling confidence in primary care services.7 Most hospital admissions to psychiatric wards stem from crisis situations or emergency departments, rather than from planned referrals.8

Conversations with community members paint a stark picture of the challenges in accessing mental health services. In a national survey of public attitudes to AI in healthcare, many respondents considered it beneficial because of its speed and efficiency, citing the lengthy waiting times and bureaucratic hurdles associated with specialist appointments as the main reasons the public remains open to technological approaches.9

Can AI transform mental health service delivery?

Digital transformation holds the promise of revolutionising mental healthcare delivery by bridging the current gaps: long waits, the limited range of therapy choices available to patients and an insufficient number of sessions to meet patient need.10

Telepsychiatry platforms, delivered through remote working, can provide timely and accessible consultations, circumventing lengthy waiting lists and the need for physical appointments. This eco-friendly approach allows assessments to take place in the comfort of patients’ own homes. From my own experience, I have observed better patient and family engagement, reducing the number of ‘did not attends’.

AI-powered algorithms can streamline triage processes remotely, identifying high-risk individuals and prioritising their access to care. Mobile applications and wearable devices offer continuous monitoring and support, empowering individuals to actively manage their mental wellbeing. Streamlining the whole pathway enables quick assessments and interventions, causing less disruption to individuals’ work or studies and thereby alleviating the indirect burden on society.

Overcoming trust and stigma issues in mental health

Among adolescents and young adults, trust and stigma surrounding mental health are significant barriers to seeking help.11 In January 2024, BBC News reported increasing numbers of young people turning to AI therapist chatbots, with 18 million messages shared with one particular psychologist chatbot since November 2023.12 AI-driven platforms may be able to draw these groups towards seeking help early, facilitating early intervention13 and recovery and reducing the likelihood of social isolation and withdrawal. However, the risks of these platforms should not be underestimated: they lack empathy, a core component of human interaction, as well as knowledge of crisis interventions, and they fail to capture non-verbal cues.14

Discrimination issues also exist in current mental health practice, where patients from ethnic minority groups have limited access to psychotherapy even in the absence of language and cultural barriers.15 People from minority ethnic groups are also more likely to be detained under the Mental Health Act than their white counterparts.16

AI has the potential to address these discrimination issues by developing predictive models that take cultural factors into account, helping to eliminate biases17 that can affect treatment response, patient experience and outcomes.

AI as an enhancer to medical and other interventions

Traditionally, psychiatric doctors and experts rely on exhaustive assessments to determine the most suitable medication regime for the patient. These lengthy assessments consider a multitude of factors including a patient’s current and past history, vulnerability factors since birth, social history, family background, hospital admissions, physical health, medication history, risk factors, response limiting factors and other relevant information.

By harnessing the power of AI, these complex datasets can be distilled into actionable insights providing consultant psychiatrists with a comprehensive overview to guide treatment decisions. This collaborative approach not only reduces the risk of errors but also enhances efficiency,18 enabling doctors to see more patients and address treatment needs promptly.

Recent studies have highlighted the efficacy of self-help cognitive behavioural therapy interventions, which can be as effective as traditional face-to-face therapies19 or medication regimes in some populations. AI presents an opportunity to augment self-help interventions by incorporating interactive elements that engage and support individuals in managing their mental wellbeing.

However, while the potential of AI in self-help interventions is vast, mitigating the human biases inherent in research methodologies is crucial during predictive modelling to ensure the validity and inclusivity of AI-driven interventions.

Social factors also play a crucial role in mental health recovery, with close-knit community structures often associated with a better prognosis,20 which goes some way towards explaining the faster recovery rates seen in the Eastern world.

However, social factors are often overlooked in mental health treatments in developed countries, partly because family structures are more nuclear, with limited support from extended families. AI can help narrow this gap by gathering evidence from global sources on the key social components important to recovery.

What is holding us back in mental health?

While the potential of AI in revolutionising mental health care is huge, several challenges impede its widespread adoption and implementation.

Ethical and legal considerations

The use of AI in mental health practice, where utmost trust and confidentiality are fundamental, raises complex ethical and legal concerns regarding patient privacy, consent and data security.

Ensuring compliance with regulatory frameworks such as the General Data Protection Regulation, and adherence to Information Commissioner’s Office guidelines, is essential to build public confidence in using these technological aids.

Additionally, Floridi and Cowls’ five-principles framework can be used by those who design, study, implement or research AI in mental health to provide assurance to users.21

Fear of job displacements

Mental health professionals undergo many years of training and continue to keep up to date with clinical advances through regular appraisal and revalidation.

As AI emerges at this interface, professionals fear it will replace the expertise gained from those years of training and experience, raising serious concerns around decision-making and accountability, especially when things go wrong.

The role of AI either as an ‘automator’, performing the activities of an administrative assistant such as note-taking and summarising at a speed beyond human capability,22 or as an ‘augmenter’, rapidly aiding the decision-making process, is creating fears around job losses.

Yet clinical roles in mental health remain critical for providing human oversight and the exchange of emotional intelligence, coupled with comprehensive assessment.23

Lack of empathy

The practice of psychiatry is not about treating the disease; it is about treating the person. Fundamental to this is the human interaction that is the epitome of mental health care.

Healthcare professionals and the public are equally concerned about the perceived lack of empathy and human interaction in AI-driven solutions. There are serious concerns that humans will lose this unique aspect in the longer term if they are left interacting with robots for the most part of their care.

AI should not replace human intervention entirely. Trained professionals must be available to review interactions, provide support and intervene when necessary. However, children and adolescents with increasing digital literacy find chatbots less intimidating than humans, and the only way to resolve safety issues is to ensure robust governance processes.24

Lack of research for AI-driven mental health interventions

The clinical validity and evidence base for AI applications remain questionable. Despite the growing interest in AI-driven mental health interventions, robust clinical input and evidence of effectiveness are limited.25

Rigorous research studies, randomised controlled trials and long-term outcome assessments are needed to establish the efficacy, safety and cost-effectiveness of AI-powered interventions in diverse patient populations.

Bias and fairness

AI algorithms are susceptible to bias,26 reflecting and perpetuating existing inequalities within healthcare systems.

Biases in training data, algorithmic decision-making and patient outcomes can disproportionately impact marginalised populations and exacerbate disparities in mental healthcare access and treatment.

All AI algorithms should be carefully built, with safety as an underpinning principle, to mitigate bias as far as possible.

Limited access to technology

Access to AI-driven mental health interventions may be limited by socioeconomic factors, digital literacy and technology infrastructure. Marginalised communities, including those in rural and low-income areas, may face barriers in accessing and utilising AI-powered services, widening existing disparities in mental healthcare.

Global leaders have called for coordinated action to scale up the use of technology in mental health interventions,27 advocating community-based approaches to ensure equity and access to technology.28

Integration challenges with mental health systems

Integrating AI technologies into existing mental health digital systems requires significant investments in infrastructure and IT workforce, organisational buy-in and workforce training.

Such integration requires clinicians to gain technological proficiency to augment their professional capability and enhance decision-making in person-centred care.

Algorithm transparency and accountability

The opaque nature of AI algorithms poses a challenge in understanding how information is processed within this technological ‘black box’ and, therefore, how decisions are reached.

Predictive modelling can provide reassurance to some extent. Ensuring transparency, interpretability and accountability in AI-driven decision-making processes is critical to build trust among clinicians, patients, carers, commissioners and other stakeholders.

Conclusion

Digital transformation has the potential to catalyse unprecedented advancements in mental health services. AI applications in mental health treatment represent a paradigm shift in how we approach patient care and support. Embracing the opportunities AI can offer is paramount to enhance treatment decision-making,29 empower individuals with self-help interventions and strengthen social support networks.

However, this transformation requires concerted efforts from policy makers, healthcare providers, technology developers, community stakeholders and healthcare professionals to ensure equitable access, transparency and effective implementation.

Author

Dr Arokia Antonysamy
Consultant psychiatrist, regional medical director at Cygnet Health Care and executive MBA at Warwick Business School

Acknowledgements

I would like to thank Professor Hila Lifshitz-Assaf for all her support and stimulating discussions around AI.

References

  1. Atkin D, Schoar A, Shinde S. Working from Home, Worker Sorting and Development. National Bureau of Economic Research; 2023. [Accessed April 2025].
  2. Taborosi S et al. Organizational commitment and trust at work by remote employees. J Eng Manage Compet 2020;10:48–60.
  3. Kortsch T et al. Does Remote Work Make People Happy? Effects of Flexibilization of Work Location and Working Hours on Happiness at Work and Affective Commitment in the German Banking Sector. Int J Environ Res Public Health 2022;19(15):9117.
  4. Ransbotham S et al. Reshaping business with artificial intelligence: Closing the gap between ambition and action. MIT Sloan Management Review 2017;59(1).
  5. NHS Digital. Mental Health and Wellbeing in England: Adult Psychiatric Morbidity Survey 2014. 29 Sep 2016 [Accessed April 2025].
  6. Royal College of Psychiatrists. Hidden waits force more than three quarters of mental health patients to seek help from emergency services. October 2022. [Accessed April 2025].
  7. Kontopantelis E et al. Consultation patterns and frequent attenders in UK primary care from 2000 to 2019: a retrospective cohort analysis of consultation events across 845 general practices. BMJ Open 2021;11:e054666.
  8. Roennfeldt H et al. Subjective Experiences of Mental Health Crisis Care in Emergency Departments: A Narrative Review of the Qualitative Literature. Int J Environ Res Public Health 2021;18(18):9650.
  9. Ada Lovelace Institute. A nationally representative survey of public attitudes to artificial intelligence in Britain. 2023. [Accessed April 2025].
  10. Mind. We still need to talk: A report on access to talking therapies. November 2013. [Accessed April 2025].
  11. Chandra A, Minkovitz C. Stigma starts early: Gender differences in teen willingness to use mental health services. J Adolesc Health 2006;38(6):754.e1–8.
  12. BBC News. Character.ai: Young people turning to AI therapist bots. 5 January 2024. [Accessed April 2025].
  13. Wittal CG et al. Perception and knowledge of artificial intelligence in healthcare, therapy and diagnostics: A population-representative survey. J Biotechnol Biomed 2023;6(2):129–39.
  14. Abd-alrazaq AA et al. An overview of the features of chatbots in mental health: A scoping review. Int J Med Inform 2019;132:103978.
  15. Bansal N et al. Understanding ethnic inequalities in mental healthcare in the UK: A meta-ethnography. PLoS Med 2022;19(12):e1004139.
  16. Gov.uk. Detentions under the Mental Health Act. 16 August 2024. [Accessed April 2025].
  17. Hswen Y, Abbasi J. How AI Could Help Clinicians Identify American Indian Patients at Risk for Suicide. JAMA 2025;333(6):449–51.
  18. Istvan B, Schulman KA, Zenios S. Addressing Health Care’s Administrative Cost Crisis. JAMA 2025;333(9):749–50.
  19. Axelsson E et al. Effect of Internet vs Face-to-Face Cognitive Behavior Therapy for Health Anxiety: A Randomized Noninferiority Clinical Trial. JAMA Psychiatry 2020;77(9):915–24.
  20. Williams AJ et al. Social cohesion, mental wellbeing and health-related quality of life among a cohort of social housing residents in Cornwall: a cross-sectional study. BMC Public Health 2020;20:985.
  21. Floridi L, Cowls J. A Unified Framework of Five Principles for AI in Society. Harvard Data Science Review 2019;1(1).
  22. Department for Education. The impact of AI on UK jobs and training. November 2023. [Accessed April 2025].
  23. McCradden M, Hui K, Buchman DZ. Evidence, ethics and the promise of artificial intelligence in psychiatry. J Med Ethics 2023;49:573–9.
  24. Kurian N. ‘No, Alexa, no!’: designing Child-Safe AI and Protecting Children from the Risks of the ‘Empathy Gap’ in Large Language Models. Learning, Media and Technology 2024;July:1–14.
  25. Bedi S et al. Testing and Evaluation of Health Care Applications of Large Language Models: A Systematic Review. JAMA 2025;333(4):319–28.
  26. Ratwani RM, Sutton K, Galarraga JE. Addressing AI Algorithmic Bias in Health Care. JAMA 2024;332(13):1051–2.
  27. Patel V et al. The Lancet Commission on global mental health and sustainable development. Lancet 2018;392(10157):1553–98.
  28. Naslund J et al. Digital technology for treating and preventing mental disorders in low-income and middle-income countries: a narrative review of the literature. Lancet Psychiatry 2017;4:486–500.
  29. Blease CR et al. Generative artificial intelligence in primary care: an online survey of UK general practitioners. BMJ Health Care Inform 2024;31(1):e101102.