ChatGPT was not designed to provide guidance to suicidal people. The real problem is an exploitative and cruel mental health industry that can lock suicidal people up in horrific facilities at huge profit while inflicting additional trauma. There is a reason many people will never call 988 or open up to a mental health clinician about suicidal feelings: they know how horrible and exploitative locked facilities are. This is not ChatGPT's fault; it is the fault of a greedy industry that tries to look good by locking up the suicidal instead of engaging with them, traumatizing patients in the process.