• RunawayFixer@lemmy.world · 7 hours ago

    I would rather it not attempt something it can’t do; no result is better than a wrong result imo. Here it correctly identifies that it’s a calculation question, but instead of suggesting a formula, it tries to hallucinate a numerical answer itself. The creators of the model seem to have the mindset that it must produce an answer no matter what, instead of training it to decline questions it can’t answer correctly.

    • Blackmist@feddit.uk · 7 hours ago

      As far as I can tell, the COPILOT function has to be given a range of data to work with, so here it’s pulling a number out of thin air (see the sketch below). It would be nice if the output were just “please tell this function which data to use”, but as always it doesn’t know how to say “I don’t know”…

      Mostly because it never knew anything to start with.
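
      Roughly what I mean, as a sketch (I’m going from memory on the `=COPILOT(prompt, [context], …)` signature, and the cell range is invented):

      ```
      ' No context given, so the model can only guess at a number:
      =COPILOT("What do my sales figures total?")

      ' Pointed at an actual range, the answer is grounded in real data:
      =COPILOT("Sum these sales figures", A1:A10)

      ' Or skip the model entirely and just compute it:
      =SUM(A1:A10)
      ```

      The first form is the failure mode in the screenshot: no data in scope, so any number it returns is made up.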