![](https://lemmy.world/pictrs/image/0943eca5-c4c2-4d65-acc2-7e220598f99e.png)
He must say: “This debate is officially over” before pulling the trigger.
Logic without empathy is the hallmark of psychopathy
First of all, thank you; I wasn't aware of the concept of acausal trade, and I'll look into it further. Very interesting.
I'm not sure we are discussing the same aspect of this thought experiment. The aspect I find Lovecraftian is that you may already be in the simulation right now. That makes the specific circumstances of our world, physics, and technology level irrelevant: they would just be a solipsistic setup to test you on some aspect of your morality. The threat of eternal torture, on the other hand, would only apply to you if you were the real version of you, since that's who the basilisk is actually dealing with. This works because you don't know which of the two situations you are currently in.
The basilisk is trying to estimate the future behaviour of the real you based on the behaviour of the model it has created of you.
In this scenario you can think of me as a pseudopod of the basilisk, informing you of the details of the stipulation by means of this post.
Of course, if you are the real version of you, the basilisk would need to be something that can be created in this reality, which I think is impossible only with our current approach to ML and AI, but otherwise within our grasp given the computational power we have available. But if you are a fake version of you, the real world could be radically different from ours, and maybe in that world P=NP.
Yeah that should be called KAPUTalism
In this case that wouldn't apply: you would never be simulated as, say, a kid in the Middle Ages, only as a version of yourself in the timeframe leading up to the creation of the basilisk. You would have to be one of the people alive when the basilisk arises to be of any use to it; only those would need to be tested.
I feel like Abdul Alhazred, explaining these things to people while being aware of the risks :)
AI is an umbrella term; it isn't necessarily less than ASI or AGI, and can include both.
For team sports you can assign a point value to each player and cap the total value a team may field, like army lists in WH40K.
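A minimal sketch of that balancing rule: each player carries a point value, and a roster is legal only if its total stays under the cap. The player names, values, and cap below are made-up examples, not from any real ruleset.

```python
def roster_is_legal(roster, points_cap):
    """Return True if the roster's total point value fits under the cap."""
    return sum(value for _, value in roster) <= points_cap

# Hypothetical roster: (player, point value) pairs
team = [("star_player", 120), ("veteran", 80), ("rookie", 40)]

print(roster_is_legal(team, points_cap=250))  # True: 240 <= 250
print(roster_is_legal(team + [("second_star", 110)], points_cap=250))  # False: 350 > 250
```

The cap forces a trade-off: adding another high-value player means dropping someone else, which is exactly how army-list building keeps matchups fair.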
I would have said they cancel out anyway
Did they give you a fitbit?
I think it reflects the point where we are as a society. Everything is crappy and we all come here to vent.
It's certainly ready to replace Mac
I suggest you take a good look at Tunic if you haven't yet; it's really excellent at doing Zelda 1-style stuff.
Learn to save often, but especially learn the limitations of the tools you have. It's not LibreOffice's fault if you don't.
Though I've found that malice often plays a part in problems.
People who want to stay free must learn to vote for the most viable candidate, whether they like them or not. Republicans know the left actually values proper behaviour and consistency from its candidates, so the left is vulnerable to this kind of attack. The Republicans don't care what you say about their candidate; they just want to win. But above all, they want you to lose.