Freedom is the right to tell people what they do not want to hear.

  • George Orwell
  • 1 Post
  • 70 Comments
Joined 13 days ago
Cake day: July 17th, 2025

  • “Study my brain. I’m sorry,” Tisch quoted Tamura as having written in the note. The commissioner noted that Tamura had fatally shot himself in the chest.

    Didn’t shoot himself in the head to preserve the brain. Reminds me of the “Texas Tower Shooter” Charles Whitman.

    In his note, Whitman went on to request that an autopsy be performed on his remains to determine whether there had been a biological cause for his actions and for his continuing and increasingly intense headaches.

    During the autopsy, Dr. Chenar reported that he discovered a pecan-sized brain tumor, above the red nucleus, in the white matter below the gray center thalamus, which he identified as an astrocytoma with slight necrosis.

    I’ve heard a neuroscientist talk about this and conclude that this tumor could very well have been the cause of his behavior.

  • No disagreement there. While it’s possible that Trump himself is not guilty of any wrongdoing in this particular case - though he very well might be - he sure acts like someone who is. And if he’s not protecting himself, then he’s protecting other powerful people around him who may have dirt on him, which they can use as leverage: he can’t throw them under the bus without taking himself down in the process.

    But that’s a bit beside the point. My original argument was about refraining from accusing him of being a child rapist on insufficient evidence, no matter how much it might serve someone’s political agenda or how satisfying it might feel to finally see him face consequences. If there’s undeniable proof that he is guilty of what he’s being accused of here, then by all means he should be prosecuted. But I’m advocating for due process. These are extremely serious accusations that should not be spread as facts when there’s no way to know - no matter who we’re talking about.

  • I don’t think you even know what you’re talking about.

    You can define intelligence however you like, but if you come into a discussion using your own private definitions, all you get is people talking past each other and thinking they’re disagreeing when they’re not. Terms like this have a technical meaning for a reason. Sure, you can simplify things in a one-on-one conversation with someone who doesn’t know the jargon - but dragging those made-up definitions into an online discussion just muddies the water.

    The correct term here is “AI,” and it doesn’t somehow skip over the word “artificial.” What exactly do you think AI stands for? The fact that normies don’t understand what AI actually means and assume it implies general intelligence doesn’t suddenly make LLMs “not AI” - it just means normies don’t know what they’re talking about either.

    And for the record, the term is Artificial General Intelligence (AGI), not GAI.

  • Claims like this just create more confusion and lead to people saying things like “LLMs aren’t AI.”

    LLMs are intelligent - just not in the way people think.

    Their intelligence lies in their ability to generate natural-sounding language, and at that they’re extremely good. Expecting them to consistently output factual information isn’t a failure of the LLM - it’s a failure of the user’s expectations. LLMs are so good at generating text, and so often happen to be correct, that people start expecting general intelligence from them. But that’s never what they were designed to do.