This might also be an automatic response to prevent discussion, although I’m not sure, since it’s MS’ AI.

  • otp@sh.itjust.works
    6 months ago

    I think the LLM won here. If you’re being accusatory and outright saying its previous statement is a lie, you’ve already made up your mind. The chatbot knows it can’t change your mind, so it suggests changing the topic.

    It’s not a spokesperson for Microsoft, nor a lawyer. So it knows when it should shut itself off.

    • naevaTheRat@lemmy.dbzer0.com
      6 months ago

      The chatbot doesn’t know anything. It has no state like that; your text just gets appended to its text.

      It has been prompted to disengage from disagreement or something similar. By a human designer.

    • webghost0101@sopuli.xyz
      6 months ago

      To add, I have seen this behavior the moment you get too argumentative, so it’s not like it’s purposely singling out certain topics.