The number of questions on Stack Overflow fell by 78 percent in December 2025 compared to a year earlier. Developers are switching en masse to AI tools.
It can’t handle things it wasn’t trained on, or at least it struggles with anything substantially different from its training data.
It can usually apply rules it was trained on to material that appears in its training data: “Give me a list of female YA authors.” But when you ask it something more general (how many R’s there are in certain words), it often fails.
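For perspective, the counting itself is trivial for any program that can actually see the characters; here’s a one-line Python check:

```python
# Counting letters is easy when you have direct access to the characters;
# the point is that a language model never gets that access.
print("strawberry".count("r"))  # prints 3
```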
Actually, the Rs issue is funny because it WAS trained on that exact information, which is why it says strawberry has two Rs, so it’s actually more proof that it only knows what it has been given data on. The thing is, when people misspell strawberry as “strawbery,” others naturally respond, “Strawberry has two Rs” (meaning the double R). The problem is that LLM training has no concept of context, because the model isn’t learning anything; the reinforcement mechanism just reflects whatever the majority of its data says. It regurgitates that strawberry has two Rs because its dataset reinforced exactly that phrase.
Interesting story, but I’ve seen the same thing work with how many asses are in “assassin.”
You can probe the stuff it’s bad at, and a lot of it doesn’t line up well with the story that the failures come from how people were corrected.
But that’s exactly how an LLM is trained. It doesn’t know how words are spelled, because words are turned into numbers and processed. What it does know is when its dataset has multiple correlations for something. Specifically, people spell out words online, so it will regurgitate to you how to spell strawberry, but it can’t count letters, because counting characters isn’t a thing language models do.
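Here’s a minimal sketch of that words-into-numbers point, assuming the tiktoken library (a BPE tokenizer used with several OpenAI models) is installed; the exact token split depends on the encoding:

```python
import tiktoken  # pip install tiktoken

# The model never sees "strawberry" as letters; the tokenizer turns the
# text into integer token IDs before the model processes anything.
enc = tiktoken.get_encoding("cl100k_base")
ids = enc.encode("strawberry")

print(ids)                             # a short list of integers
print([enc.decode([i]) for i in ids])  # the sub-word chunks those IDs stand for

# The model consumes `ids`, not characters, so letter counts aren't directly
# visible to it; it can only answer from memorized spelling facts.
```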
Generative AI and LLMs are just giant reconstruction bots that take all the data they have and reconstruct something. That’s literally what they do.
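To make the “reconstruction” claim concrete, here’s a toy bigram generator (purely illustrative, nothing like a real transformer): it can only ever emit word transitions that already appeared in its training data:

```python
import random
from collections import defaultdict

# A tiny "training corpus"; a real model sees trillions of tokens.
corpus = "people spell strawberry . people spell assassin . people count letters".split()

# Record which word follows which in the training data.
follows = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev].append(nxt)

def generate(start, length=6):
    out = [start]
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:  # never saw anything after this word: stuck
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("people"))  # only ever recombines pairs seen in the corpus
```

Swap the lookup table for a neural network and the toy corpus for the internet, and you get the same basic picture: recombination of what was in the data.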
Like, without knowing what answer you got for assassin, I’ll assume the question was probably “How many asses are in assassin?” But, like, that’s a joke. An assassin only has one ass, just like the rest of us. And nobody would ever misspell assassin as “assin,” so why would it learn that there are two asses in assassin?
I’m confused about where you’re getting your information, but this is not particularly special behavior.