Just your daily reminder to not trust, or at the very least fact-check, whatever ChatGPT spews out, because not only does it blatantly lie, it also makes stuff up way more often than you'd want to believe.
(btw, Batrapeton doesn't exist; it's a fictional genus of Jurassic amphibians that I made up for a story I'm writing. They never existed in any way, shape, or form, and there's no trace of info about them online, yet here we are with ChatGPT going "trust me bro" about them lol)
I just asked Gemini and it got the wrong answer even after doing a Google search. Plus, asking "what would <something> mean in <some field>" is a normal way of asking "what does <something> mean in <some field>", which any non-pedantic English speaker would understand.
Yes, Gemini is generally a lot worse, and you have to be "pedantic" to get what you want.