• Blackmist@feddit.uk · 5 points · 8 hours ago

    In fairness, the example I’ve seen MS give was taking a bunch of reviews and determining from the text whether each review was positive or negative (roughly the sort of formula sketched at the end of this comment).

    It was never meant to mangle numbers, but we all know it’s going to be used for that anyway, because people still want to believe in a future where robots help them, rather than just take their jobs and advertise to them.
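
    That demo boils down to something like the formula below. This is only a sketch: I’m assuming the preview COPILOT(prompt, range) syntax, and the cell reference A2 is made up.

        =COPILOT("Is this review positive or negative? Answer in one word.", A2)

    Text in, label out, no arithmetic involved, which is the point.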

    • RunawayFixer@lemmy.world · 3 points · 7 hours ago

      I would rather it not attempt something it can’t do; no result at all is better than a wrong result, imo. Here it correctly identifies that it’s a calculation question, but instead of suggesting a formula it hallucinates a numerical answer itself. The creators of the model seem to have the mindset that it must produce an answer no matter what, instead of training it to not answer questions it can’t answer correctly.

      • Blackmist@feddit.uk · 3 points · 7 hours ago

        As far as I can tell, the copilot command has to be given a range of data to work with, so here it’s pulling a number out of thin air (see the sketch below). It would be nice if the output were just “please tell this command which data to use”, but as always it doesn’t know how to say “I don’t know”…

        Mostly because it never knew anything to start with.
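
        A rough sketch of the difference, again assuming the preview COPILOT(prompt, range) shape and a made-up range B2:B13:

            =COPILOT("What do these figures add up to?", B2:B13)
            =SUM(B2:B13)

        The first at least has real cells to ground the answer in; the second is what you’d actually want for the arithmetic anyway.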