Just your daily reminder not to trust, or at the very least to fact-check, whatever ChatGPT spews out, because not only does it blatantly lie, it also makes stuff up way more often than you'd want to believe.
(btw, batrapeton doesn't exist; it's a fictional genus of Jurassic amphibians that I made up for a story I'm writing. They never existed in any way, shape, or form, and there is no trace of info about them online, yet here we are with ChatGPT going “trust me bro” about them lol)
Nobody asked it to imagine anything. “What would x mean in y” is a common phrasing.
Yes, they did. OP instructed it to fill in the blank by asking “what would it mean”, rather than asking whether it knows what it is. If you instead ask, “Do you know what ‘batrapeton’ means in a paleontological context?”, it does a quick search and responds like this:
AI output hidden from delicate eyes (/s Actually, it's just long)
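If anyone wants to try the phrasing difference themselves, here's a rough sketch using the OpenAI Python SDK. The model name and exact prompt wording are just placeholders (not necessarily what OP used), and the plain API won't run the chat UI's web search, so this only compares how the two phrasings get answered:

```python
# Rough sketch: assumes the openai package (v1+) and an OPENAI_API_KEY env var.
from openai import OpenAI

client = OpenAI()

# Two phrasings of "the same" question: the fill-in-the-blank style prompt
# vs. an explicit "do you know" check. Prompt wording is illustrative only.
prompts = [
    "What would 'batrapeton' mean in a paleontological context?",
    "Do you know what 'batrapeton' means in a paleontological context?",
]

for prompt in prompts:
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(prompt)
    print(response.choices[0].message.content)
    print("-" * 60)
```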
I just asked Gemini and it got the wrong answer even after searching Google. Plus, my point was that “what would <something> mean in <some field>” is a normal way of asking “what does <something> mean in <some field>”, which a non-pedantic English speaker would understand.
Yes, Gemini is generally a lot worse, and you do have to be “pedantic” to get what you want.