    VA’s AI suicide prevention tools aren’t meant to replace clinical interventions — advocates want it to stay that way

    The Department of Veterans Affairs has been using some artificial intelligence capabilities to bolster its suicide prevention efforts, but VA says these tools augment the work of clinicians and are not designed to replace any human-led interventions. Lawmakers, researchers and advocates say that’s how these technologies should always be used.

    This article — the second in a series of pieces about VA’s adoption of AI tools to help prevent veteran suicides — is based on documents obtained through Freedom of Information Act requests and interviews with almost two dozen current and former VA officials and employees, researchers, veterans and veteran advocates over the past year.

    VA’s tools are vastly different from GenAI

    High-profile news stories have underscored concerns about generative AI chatbots being used as de facto therapists by members of the public or even reportedly playing a role in suicides. VA’s AI tools, however, operate behind the scenes and are essentially machine learning-based algorithms, rather than public-facing chatbots engaging with veterans in crisis. And the human-led interventions resulting from these tools are meant to be voluntary.

    One of VA’s most prominent uses of AI — the Recovery Engagement and Coordination for Health-Veteran Enhanced Treatment, or REACH VET, program — is a suicide prediction algorithm that first launched in 2017. The tool scans the department’s electronic health records using specific variables to identify retired servicemembers in the top 0.1% tier of suicide risk. VA released a 2.0 version of the model earlier this year that adds new variables, such as military sexual trauma and intimate partner violence.
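
    REACH VET’s actual model is not detailed here, but the basic pattern described above (scoring veterans from EHR-derived variables and flagging only the very top tier of predicted risk for clinician follow-up) can be illustrated with a rough Python sketch. The variable names, weights and thresholding below are hypothetical stand-ins, not VA’s real model.

        # Hypothetical sketch only: score patients from a few assumed EHR indicators and
        # flag the top 0.1% by predicted risk for human review. The variables and weights
        # are illustrative and are not VA's actual REACH VET model.
        import math

        WEIGHTS = {
            "prior_suicide_attempt": 2.1,
            "recent_inpatient_mental_health_stay": 1.4,
            "military_sexual_trauma": 0.9,      # type of variable added in the 2.0 model
            "intimate_partner_violence": 0.8,   # type of variable added in the 2.0 model
        }
        BIAS = -6.0

        def risk_score(indicators: dict[str, int]) -> float:
            """Logistic-style score in (0, 1) from binary EHR indicators (hypothetical)."""
            z = BIAS + sum(WEIGHTS[name] * indicators.get(name, 0) for name in WEIGHTS)
            return 1.0 / (1.0 + math.exp(-z))

        def flag_top_tier(patients: dict[str, dict[str, int]], tier: float = 0.001) -> list[str]:
            """Return patient IDs in the top `tier` fraction (0.1%) of predicted risk."""
            ranked = sorted(patients, key=lambda pid: risk_score(patients[pid]), reverse=True)
            cutoff = max(1, int(len(ranked) * tier))
            return ranked[:cutoff]

    In the workflow described below, a list like the one flag_top_tier returns would surface on a dashboard for coordinators to review rather than trigger any automated contact.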

    Rather than replacing or marginalizing the work of healthcare providers, the model generates lists of high-risk veterans that are provided to VA facilities through a centralized dashboard accessible to REACH VET coordinators, the Government Accountability Office noted in a September 2022 review of the program. These specialized personnel, who are required to be part of the suicide prevention teams at every VA medical facility, then work with clinicians to contact the identified veterans and create individualized safety plans.

    These ensuing conversations are not scripted, and are designed to offer veterans voluntary resources or invite them to come to a VA medical facility for a visit. 

    During a House hearing last month, VA Chief Technology Officer and Chief AI Officer Charles Worthington said REACH VET “has used AI algorithms to identify over 130,000 veterans at elevated risk, improving outpatient care and reducing suicide attempts.”

    Maintaining the human touch

    In an interview with Nextgov/FCW last November, Dr. Matthew Miller, VA’s then-executive director of suicide prevention, said REACH VET enhances human-led interventions and provides “that pairing of the innovation and the technology with the human touch.” 

    Maintaining a human-first approach — and ensuring clinicians are the ones directly engaging with high-risk veterans — aligns with what advocates say is the best approach to bolstering VA’s suicide prevention efforts, since some retired servicemembers may be understandably leery of how the emerging capabilities are being used. 

    An official with the American Legion, for instance, told Nextgov/FCW that its members are largely hesitant about AI in general and would not be comfortable with removing the human element from the mental health process.

    A VA official reiterated those points during a House hearing last month, saying that AI tools only work behind the scenes to augment clinicians’ work.

    Evan Carey, acting director of VA’s National Artificial Intelligence Institute, told lawmakers at the time that “while we do use AI tools to surface risks and ensure that all veterans are flagged to get the care they need, what happens next is that a human at the VA reaches out to that veteran, or first reviews the information and decides if outreach is necessary.”

    Rep. Nikki Budzinski, D-Ill. — ranking member of the House Veterans Affairs’ Technology Modernization Subcommittee — told Carey that she wants the department “to ensure that human involvement isn’t eliminated as a part of the critical nature of the care that we want to be able to provide to a veteran with suicide prevention effort.”

    In a subsequent interview with Nextgov/FCW, Budzinski said she wants to make sure that clinician outreach to at-risk veterans is not minimized at all by the new tools. 

    “I think that is very critical to how we are providing support and treatments to veterans that are going through challenges around mental health, PTSD treatments that cannot be replaced through technology,” she said. “What technology can, I think, better help us identify are those veterans that are maybe at a higher risk, helping them to get access to the initial touch points to the right people at the VA for treatment. That’s how I’m hoping we can best utilize this technology.”

    A prevalence of ‘safety- and rights-impacting’ AI 

    As uses of AI become more prevalent across both the public and private sectors, the government has increasingly pressed VA and other federal entities to be more transparent and careful in their adoption of the tools.

    Agencies have been required in recent years to publicly report their AI uses, a process that began during the first Trump administration and has continued through Trump 2.0. Biden-era guidance from the Office of Management and Budget also required agencies to report use cases identified as safety- and rights-impacting in their submitted AI inventories. 

    OMB defined rights-impacting AI uses as those whose output affects a decision or action for a specific individual or entity and has a “legal, material, binding or similarly significant effect” on their civil rights, civil liberties, privacy and equitable access to government services. Safety-impacting AI was similarly defined as a technology whose output could potentially affect the safety of a person or entity’s wellbeing, environment, assets or critical infrastructure.

    Of the 227 AI use cases VA reported in its December 2024 inventory, 145 met the criteria for being safety- and rights-impacting — roughly 64% of the department’s total. This included all of the uses related to suicide prevention tools.

    These classifications, however, do not mean that the AI has a harmful impact on veterans; rather, they identify specific uses that must meet a set of outlined “minimum practices” from OMB. These steps include conducting AI impact assessments, testing the capabilities “for performance in a real-world context” and independently evaluating the AI tools to prevent any adverse outcomes.

    While President Donald Trump rescinded former President Joe Biden’s AI guidance, an OMB memo issued in April also directed agencies “to implement minimum risk management practices for AI that could have significant impacts when deployed.”

    Interventions remain voluntary 

    While REACH VET is not the only AI-infused tool that VA is using or looking to adopt to better support veterans at high risk of suicide, all of the capabilities the department currently employs are similar in that they elevate veterans in crisis to trained professionals. It’s an approach that aligns with other processes for identifying and treating health risks.

    Dr. Christine Yu Moutier, the chief medical officer for the American Foundation for Suicide Prevention, likened the use of these types of AI tools to a cardiac intervention model, where cardiologists identify heightened risk factors and then try to prevent any negative outcomes. 

    “Those who are identified as having higher risk will have a way to interface, ideally, with an effective, culturally relevant, caring clinician or team who will then act in coordination to reduce that person’s risk of whatever is trying to be prevented,” she said. “And so I think the human element can be both in terms of improving the machine learning model to begin with as you go along, but also when it comes to patient care.”

    Outreach resulting from its AI-powered risk intervention approaches, VA stresses, is also voluntary. 

    VA’s internal customer experience survey tool, known as VSignals, collects data and feedback directly from veterans, eligible dependents and caregivers to enhance department services. The platform has also been using AI to analyze responses and then, if necessary, elevate veterans in crisis to the appropriate resources — whether that’s suicide prevention support or housing assistance for those experiencing homelessness.

    John Boerstler, the head of public sector at Ipsos Public Affairs who served as VA’s Chief Experience Officer from February 2021 through September 2024, told Nextgov/FCW that the use of VSignals’ crisis alert algorithm has saved around 5,000 lives. 

    Boerstler said the tool has been used “to really identify crises; whether that’s a housing or mental health crisis. It’s automatically pulled using the large language models that have been built in the survey tool to identify those terms.”
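
    The crisis-detection layer Boerstler describes is not public, but the general idea of scanning free-text survey responses for crisis language and routing flagged comments to a human reviewer can be sketched simply. The term lists and category labels below are hypothetical; the production tool reportedly relies on language models built into the survey platform rather than a fixed keyword list.

        # Hypothetical sketch: flag free-text survey comments that contain crisis-related
        # language so staff can route them (e.g., suicide prevention or housing assistance).
        # The term lists and categories are illustrative, not VSignals' actual logic.
        import re

        CRISIS_TERMS = {
            "suicide_prevention": ["kill myself", "end my life", "suicide", "no reason to live"],
            "housing_assistance": ["homeless", "evicted", "sleeping in my car", "nowhere to stay"],
        }

        def flag_comment(comment: str) -> list[str]:
            """Return the categories whose terms appear in the comment (case-insensitive)."""
            text = comment.lower()
            return [
                category
                for category, terms in CRISIS_TERMS.items()
                if any(re.search(r"\b" + re.escape(term) + r"\b", text) for term in terms)
            ]

        # A flagged comment is escalated to staff for voluntary outreach, not acted on
        # automatically.
        print(flag_comment("I lost my apartment and have been sleeping in my car."))
        # ['housing_assistance']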

    Other uses of AI as a suicide prevention tool are trying to carefully balance enhanced clinician engagement with veterans’ voluntary participation. 

    In one notable use case VA flagged in its 2024 AI inventory, the department said it was in the “acquisition and/or development” phase of using natural language processing to parse clinicians’ notes to identify veterans who have access to a gun and have an active opioid use disorder. 

    “We have developed NLP algorithms that are very good at assessing from notes whether a patient owns or has access to a gun,” VA said in its inventory. “But we use only notes already written by clinicians, [and] any mention of gun access came voluntarily from the patient.”
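
    VA’s inventory does not spell out how those NLP algorithms work, but the screening step it describes (reading notes clinicians have already written for mentions of gun access, then combining that signal with an active opioid use disorder diagnosis) can be sketched with simple pattern matching. The patterns and parameters below are hypothetical stand-ins for the department’s actual models.

        # Hypothetical sketch: screen existing clinician notes for mentions of firearm
        # access and combine the result with a structured opioid use disorder flag, so a
        # mental health clinician can be notified and read the underlying notes.
        # The patterns and parameters are illustrative, not VA's actual NLP algorithms.
        import re

        FIREARM_PATTERNS = [
            r"\bowns? (a |several )?(gun|firearm|rifle|pistol)s?\b",
            r"\baccess to (a |his |her |their )?(gun|firearm)s?\b",
            r"\bkeeps? a (gun|firearm) at home\b",
        ]

        def notes_mentioning_firearm_access(notes: list[str]) -> list[str]:
            """Return the clinician notes that voluntarily mention gun ownership or access."""
            return [
                note for note in notes
                if any(re.search(p, note, flags=re.IGNORECASE) for p in FIREARM_PATTERNS)
            ]

        def should_notify_clinician(notes: list[str], active_opioid_use_disorder: bool) -> bool:
            """Flag for human review only; any follow-up remains voluntary for the veteran."""
            return active_opioid_use_disorder and bool(notes_mentioning_firearm_access(notes))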

    As the department noted, the death rate for those using a gun to attempt suicide is approximately 85%. Although VA said it has considered asking all patients whether they have access to a firearm, it determined that the question could “cause concern among patients about why they’re always being asked about guns.”

    Given the inherent difficulties here — maintaining veterans’ Second Amendment rights while also identifying veterans at risk of suicide who have access to a gun — VA said the intended use of the model’s output is “that a human mental health clinician be notified about patients at risk, then read the note or notes that led to the determination that the patient was at risk.”

    In instances where a patient is identified as being at high risk of self-harm, VA said clinicians can take different steps, such as counseling veterans to give their guns or ammunition to a friend or family member. The department added that “this is always voluntary, but Veterans very often agree to it, because they see and fear the risk.”

    VA’s AI tools can only help veterans engaged with the department

    Suicide prevention continues to be a top priority for VA, but overall veteran suicide figures have remained largely unchanged since 2008. VA statistics show that roughly 6,500 veterans take their own lives each year, with more than 17 retired servicemembers dying by suicide each day.

    While novel uses of AI tools have helped with the department’s prevention initiatives, they represent only a small portion of VA’s overall suicide prevention strategy. And VA’s use of these capabilities has one glaring flaw: the tools can only improve engagement with veterans in crisis who receive healthcare from VA facilities.

    In statistics sent to Nextgov/FCW, VA Press Secretary Pete Kasperowicz said that around 60% of veterans who have died by suicide did not receive any care from the Veterans Health Administration — VA’s healthcare arm — “at any point in the two years prior to their death.”

    So while these AI tools can help clinicians better identify veterans at high risk of self-harm for more direct outreach, they are limited by the data included in VA’s electronic health record system.

    Budzinski also pointed to the disparity in suicide rates and expressed concern that “we’re only able to utilize technology to support veterans that are utilizing the VA.”

    If you are a veteran in crisis or having thoughts of suicide, or if you know a veteran in crisis, call the Veterans Crisis Line for confidential crisis support. Dial 988, then press 1, chat online at VeteransCrisisLine.net/Chat or send a text message to 838255. The line is available 24 hours a day, 365 days a year.


