• shalafi@lemmy.world
    4 days ago

    I’ve found ChatGPT to almost never be wrong; I can’t think of an example ATM. Having said that, I have a sense of what it can and can’t do and what sort of inputs will produce a solid answer.

    Where it goes hilariously sideways is when you talk to it like a person and keep following up. Hell no. You ask a question that can be answered objectively, and you stop.

    No way the output went straight to, “Sure! Bromine’s safe to eat.” Either he asked a loaded question to get the answer he wanted, or this came after some back-and-forth.