If the machine predicts that you will take both Boxes A and B, Box B will be empty. But if the machine predicts that you will take Box B only, then Box B will contain $1,000,000,000. The machine has already made its prediction, and the contents of Box B have already been set. Which box/boxes do you take?

To reiterate, your choices are:

- Box A and B

- Box B only

(“Box A only” is not an option because no one is that stupid lol)

Please explain your reasoning.

My answer is:

I mean I’d choose Box B only; I’d just gamble on the machine being right. If the machine is wrong, I’ll break that thing.


This is based on Newcomb’s Paradox (https://en.wikipedia.org/wiki/Newcomb%27s_paradox), but I increased the money to make it more interesting.

  • kthxbye_reddit@feddit.de · 1 year ago

    The best case result is 1.001.000.000 (A+B) vs. 1.000.000.000 (B only). Worst case is I have 1.000.000 only.

    I go with B only because the difference feels tiny / irrelevant.

    Maybe I actually have free will and this is not determinism kicking in, but who knows. I’m not playing the odds for such a tiny benefit.

    • OptimusFine@kbin.social · 1 year ago

      > Worst case is I have 1.000.000 only.

      Except that’s not the worst case. If the machine predicted you would pick A&B, then B contains nothing, so if you then pick only B (i.e., the machine’s prediction was wrong), you get zero. THAT’S the worst case. The question doesn’t assume the machine’s predictions are correct.
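
      To make the four outcomes concrete, here’s a minimal Python sketch. It assumes Box A holds 1,000,000 (the excerpt above doesn’t spell out Box A’s amount, but the 1.001.000.000 A+B total implies it), while Box B holds 1,000,000,000 only if the machine predicted “B only”, per the post:

      ```python
      # Enumerate all prediction/choice combinations for this thread's version
      # of Newcomb's problem. Box A = 1,000,000 is an assumption inferred from
      # the A+B total quoted above; Box B holds 1,000,000,000 only if the
      # machine predicted "B only", otherwise it is empty.
      BOX_A = 1_000_000
      BOX_B_IF_PREDICTED_B_ONLY = 1_000_000_000

      def payoff(prediction: str, choice: str) -> int:
          """Money received for a given prediction/choice pair."""
          box_b = BOX_B_IF_PREDICTED_B_ONLY if prediction == "B" else 0
          return (BOX_A if choice == "A+B" else 0) + box_b

      for prediction in ("A+B", "B"):
          for choice in ("A+B", "B"):
              print(f"predicted {prediction:>3}, chose {choice:>3}: {payoff(prediction, choice):>13,}")
      ```

      The worst cell is “predicted A+B, chose B” with 0; the best is “predicted B, chose A+B” with 1,001,000,000; the two cells where the machine was right pay 1,000,000 and 1,000,000,000.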

      • kthxbye_reddit@feddit.de · 1 year ago

        Good point. Actually, I was assuming that the machine’s predictions are never wrong; that’s also how it’s defined on the Newcomb’s Paradox wiki page.

        If that’s not a 100% given, you are definitely right.

    • wols@lemmy.ml · 1 year ago

      Well, if you actually have free will, how can the machine predict your actions?

      What if someone opened box B and showed you what was in it? What would that mean? What would you do?

      • kthxbye_reddit@feddit.de · 1 year ago

        I meant: let’s imagine the machine predicted B and is wrong (because I take A+B). I would call that scenario “I have free will, no determinism.” Then I will have 1.001.000.000 “only”. That’s a good result.

        Maybe interesting: Wikipedia on Determinism (https://en.wikipedia.org/wiki/Determinism)