• 51 Posts
  • 869 Comments
Joined 1 year ago
Cake day: December 18th, 2023



  • For the purposes of this Regulation:

    ‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;

    GDPR

    Anything connected to your username is personal data: your votes, posts, comments, settings, subscriptions, and so on, but only as long as they are or can be actually connected to that username. Arguably, the posts and comments that you reply to also become part of your personal data, in that they are necessary context. Any data that can be connected to an email address or an IP address is also personal data. When you log IPs for spam protection, you are collecting personal data.

    It helps to understand the GDPR if you think about data protection rights as a kind of intellectual property. In EU law, the right to data protection is regarded as a fundamental right of its own, separate from the right to privacy. The US doesn’t have anything like it.


  • Thank you for the long reply. I took some time to digest it. I believe I know what you mean.

    I can also say that the consciousness resides in a form of virtual reality in the brain, allowing us to manipulate reality in our minds to predict outcomes of our actions.

    We imagine what happens. Physicists use their imagination to understand physical systems. Einstein was famous for his thought experiments, such as imagining riding on a beam of light.

    We also use our physical intuition for unrelated things. In math or engineering, everything is a point in some space; a data point. An RGB color is a point in 3D color space. An image can be a single point in some high dimensional space.
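    The "everything is a point in some space" view can be made concrete with a short sketch (plain Python; the values are illustrative only): an RGB color is a point in a 3-dimensional color space, and flattening a tiny image turns the whole image into a single point in a higher-dimensional space.

```python
# An RGB color is a point in 3-dimensional "color space".
red = (255, 0, 0)
assert len(red) == 3

# A 2x2 grayscale image is four pixel values laid out on a grid.
image = [
    [0, 128],
    [255, 64],
]

# Flattening turns the whole image into ONE point in 4-dimensional space.
point = [pixel for row in image for pixel in row]
print(point)       # [0, 128, 255, 64]
print(len(point))  # 4 coordinates -> one point in a 4-D space
```

The same trick scales up: a 1000x1000 RGB image is one point in a 3,000,000-dimensional space, which is exactly how such data is fed to neural networks.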

    All our ancestors, back to the beginning of life, had to navigate an environment. Much of the evolution of our nervous system was occupied with navigating spaces and predicting physics. (This is why I believe language to be much easier than self-driving cars. See Moravec’s paradox.)

    One problem is, when I think abstract thoughts and concentrate, I tend to be much less aware of myself. I can’t spare the “CPU cycles”, so to say. I don’t think self-awareness is a necessary component of this “virtual environment”.

    There are people who are bad at visualizing; a condition known as aphantasia. There must be, at the very least, considerable diversity in the nature of this virtual environment.

    Some ideas about brain architecture seem to be implied. It should be possible to test some of these ideas by reference to neurological experiments or case studies, such as the work on split-brain patients. Perhaps the phenomenon of blindsight is directly relevant.

    I am reminded of the concept of latent representations in AI. Lately, as reasoning models have become the rage, there are attempts to let the reasoning happen in latent space.


  • I’ve heard that “argument” about a lot of slurs. Do you think any non-tech person is involved or interested enough to tell the difference between the good tech-males and the bad tech-bros? Besides, why would there be a problem with a guy who likes football?

    BTW. Men are not the victims of that slur. The subtext is that good girls don’t do tech. Or if they do, they at least don’t make waves. They don’t invent things, become rich tech CEOs, or anything else that someone might find objectionable. They can become artists and make pretty things, or authors and write about their feelings; that sort of thing. You know, girl stuff.


  • You did it wrong: you provided the “answer” to the logic proposition, and got a parroted proof for it.

    Well, that’s the same situation I was in and just what I did. For that matter, Peano was also in that situation.

    This is fixed now, and had to do with tokenizing info incorrectly.

    Not quite. It’s a fundamental consequence of tokenization. The LLM does not “see” the individual letters. Adding spaces between the letters, for example, forces a different tokenization and a correct count (I tried this back then). It’s interesting that the LLM counted 2 "r"s, as that is phonetically correct. One wonders how it picks up on these things. It’s not really clear why it should be able to count at all.
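    The effect can be sketched with a toy tokenizer. The vocabulary and the greedy longest-match rule below are invented for this illustration; real tokenizers (e.g. BPE) are learned from data, but the point is the same: the model receives token IDs, not letters, so the "r"s inside a multi-letter token are invisible, while spacing the letters out forces single-character tokens.

```python
# Toy vocabulary: a couple of multi-letter pieces plus single characters.
# Invented for this sketch; real tokenizer vocabularies are learned.
VOCAB = {"straw": 101, "berry": 102, "s": 1, "t": 2, "r": 3, "a": 4,
         "w": 5, "b": 6, "e": 7, "y": 8, " ": 9}

def tokenize(text):
    """Greedy longest-match tokenization into vocabulary IDs."""
    ids = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, then shorter ones.
        for length in range(len(text) - i, 0, -1):
            piece = text[i:i + length]
            if piece in VOCAB:
                ids.append(VOCAB[piece])
                i += length
                break
        else:
            raise ValueError(f"untokenizable character: {text[i]!r}")
    return ids

# The whole word becomes just two IDs; the three letter "r"s are invisible.
print(tokenize("strawberry"))  # [101, 102]

# Adding spaces forces single-character tokens, exposing every letter.
print(tokenize("s t r a w b e r r y"))
```

In the spaced version, the ID for "r" appears three times, so a letter count becomes possible; in the compact version there is nothing letter-shaped to count.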

    It’s possible to make an LLM work on individual letters, but that is computationally inefficient. A few months ago, researchers at Meta proposed a possible solution called the Byte Latent Transformer (BLT). We’ll see if anything comes of it.

    In any case, I do not see the relation to consciousness. Certainly there are plenty of people who cannot spell or count, and one would not say that they lack consciousness, I assume.

    Yes, but if you instruct a parrot or LLM to say yes when asked if it is separate from its surroundings, it doesn’t mean it is just because it says so.

    That’s true. We need to observe the LLM in its natural habitat. What an LLM typically does is continue a text. (It could also be used to work backwards or to fill in the middle, but never mind.) A base model is no good as a chatbot; it has to be instruct-tuned. In operation, the tuned model is given a chat log containing a system prompt, text from the user, and text that it has previously generated. It will then add a reply and terminate the output. This text, the chat log, could be said to be the sum of its “sensory perceptions” as well as its “short-term memory”. Within this, it is able to distinguish its own replies from those of the user, and possibly other texts.
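    The chat-log-as-perception idea can be sketched as a plain data structure. The role names follow the common system/user/assistant convention, and the serialization format below is a simplified stand-in for real chat templates; the example messages are invented.

```python
# A chat log as the model "perceives" it: the entire context is one text.
chat_log = [
    {"role": "system",    "content": "You are a helpful assistant."},
    {"role": "user",      "content": "What is personal data?"},
    {"role": "assistant", "content": "Any information relating to an identifiable person."},
    {"role": "user",      "content": "Does that include IP addresses?"},
]

def render(log):
    """Flatten the log into the single text the model actually continues."""
    return "".join(f"<|{m['role']}|>{m['content']}\n" for m in log)

prompt = render(chat_log)
# The model distinguishes its own earlier replies from the user's
# only by these role markers embedded in the text.
print(prompt)
```

The model's "reply" is then just a continuation appended after the final marker, followed by a stop token.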

    My example shows this level of understanding clearly isn’t there.

    Can you lay out what abilities are connected to consciousness? What tasks are diagnostic of consciousness? Could we use an IQ test and diagnose people as having or lacking consciousness?

    I was a bit confused by that question, because consciousness is not a construct; the brain is, of which consciousness is an emergent property.

    The brain is a physical object. Consciousness is both an emergent property and a construct; like, say, temperature or IQ.

    You are saying that there are different levels of consciousness. So, it must be something that is measurable and quantifiable. I assume a consciousness test would be similar to an IQ test in that it would contain selected “puzzles”.

    We have to figure out how consciousness is different from IQ. What puzzles are diagnostic of consciousness and not of academic ability?