There have been many high-profile stories of chatbots effectively encouraging and enabling people in mental health crises to kill themselves, which has resulted in several wrongful death lawsuits against the companies responsible for the AI models behind the bots. Now we’ve got the inverse: if you want to exercise your right to die, you have to convince an AI that you are mentally capable of making that decision.
According to Futurism, the creator of a controversial assisted-suicide device known as the Sarco has introduced an AI-administered psychiatric test to determine whether a person is of sound enough mind to decide to end their life. If the AI deems them of sound mind, the suicide pod will be powered on, and they will have up to 24 hours to proceed to their final destination. If they miss the window, they’ll have to start over.
The Sarco at the center of all this had already stirred up quite a bit of controversy before the AI mental fitness test was introduced. Named after the sarcophagus by its inventor, Philip Nitschke, the Sarco was built in 2019 and used for the first time in 2024, when a 64-year-old American woman suffering from complications associated with a severely compromised immune system underwent self-administered euthanasia in Switzerland, where assisted suicide is technically legal. Because the AI assessment wasn’t ready at the time, she reportedly underwent a traditional psychiatric evaluation conducted by a Dutch psychiatrist before pressing the button that released nitrogen into the capsule and ended her life.
However, the use of the Sarco resulted in the arrest of Dr. Florian Willet, an assisted-suicide advocate who was present at the woman’s death. Swiss law enforcement arrested him on suspicion of aiding and abetting a suicide. Under the country’s laws, assisted suicide is allowed only if the person takes their own life with no “external assistance,” and those who help the person die must not act out of “any self-serving motive.” Willet would later die by assisted suicide in Germany in 2025, reportedly in part due to the psychological trauma he experienced following his arrest and detention.
It’s unclear whether Willet was evaluated using the new AI assessment, but Nitschke will apparently include the test in the latest version of the Sarco, which he designed for couples, according to the Daily Mail. The “Double Dutch” model will evaluate both partners and allow them to enter a conjoined pod so they can pass on to the next life lying next to each other.
The whole thing does raise a question, though: why do you need AI for this? They managed to find a psychiatrist for the pod’s one use so far, and it’s not as if this is happening at such volume that the assessment has to be handed off to an AI to expedite the process. Whatever your stance on assisted suicide may be, substituting an AI test for a human assessment feels like it undermines the dignity of choosing to die. A person at the end of their life deserves to be taken seriously and given human consideration, not made to pass a CAPTCHA.

