I’ve been watching Isaac Arthur episodes. In one he proposes that O’Neill cylinders could be havens for microcultures. I tend to think of colony structures more as something created by a central authority.
He also brought up the question of motivations to colonize other star systems. This is where my centralist perspective pushes me toward the idea of an AGI-run government where redundancy is a critical aspect of everything. How do you get around the AI alignment problem? Redundancy: many systems running in parallel. How do you ensure the survival of sentient life? The same kind of redundancy.
The idea of colonies as havens for microcultures punches a big hole in my futurist fantasies. I hope there are a few people out here in Lemmy space who like to think about and discuss their ideas on this, or who would like to start now.
TLDR: My optimistic view of what human culture could be like is summed up pretty well by the Orion’s Arm project.
I am familiar with the Orion’s Arm universe, a hard sci-fi transhumanist worldbuilding project… shall I recommend you take a trip through the Wormhole Nexus to the Sephirotic Empires, where you’re ruled by benevolent S6 transapient dictators (supercharged AGI)? Because you’ll see a fuck ton of entities playing around like retirees. You’ll see “aliens” which are actually just extremely genetically modified humans. In fact, here (https://orionsarm.com/eg-topic/45b177d3ef3b1) is their “Culture and Society” page, which sums up a lot of my optimistic OA-based beliefs about human culture.
Oh, and most Terragens (humans plus any life that can trace its origin back to Earth) live in orbit. The story goes that we made an AI system that decided we humans were bad for the environment and told us to get the fuck off Earth (the Great Expulsion).
https://orionsarm.com
My (hopeless) attempt at explaining some of the terms:
Thanks, I’ll check it out. Sounds way too wild for a realistic future IMO, but still interesting. I don’t think anything with mass will ever come close to the speed of causality, folding or otherwise. That doesn’t have to be a bad thing, IMO. For one, it makes large-scale conflicts pointless.