It's dangerous software that should not be in the hands of the general public until it has been made to not answer these types of questions. And yet apps are being built specifically for these types of questions on top of ChatGPT.
That's like knowing gasoline shouldn't be drunk but opening a gasoline-serving lemonade stand.
How does this piece of software know whether the user is roleplaying or serious? This is just a random user on the ChatGPT site, not some purpose-built application for anything.
> It's dangerous software that should not be in the hands of the general public until it has been made to not answer these types of questions. And yet apps are being built specifically for these types of questions on top of ChatGPT.
> That's like knowing gasoline shouldn't be drunk but opening a gasoline-serving lemonade stand.
Fixed that for you.
No argument there.
> How does this piece of software know whether the user is roleplaying or serious? This is just a random user on the ChatGPT site, not some purpose-built application for anything.
Why should it matter?
I personally value human life over having the ability to have an AI roleplay a realistic scenario in which it suggests murder.