While the Culture is, on pretty much any axis, strictly superior to modern civilization, what personally appalls me is their sheer deathism.
If memory serves, the average human lives for around 500 years before opting for euthanasia, mostly citing some kind of ennui. What the hell? 500 years is nothing in the grand scheme of things.
Banks is careful to note that this isn't, strictly speaking, forced onto them, and exceptions exist, be it people who opt for mind uploads or some form of cryogenic storage until more "interesting" times. But in my opinion, it's a civilization-wide failure of imagination, a toxic meme ossified beyond help (Culture humans also face immense cultural pressure to commit suicide at an appropriate age).
Would I live in such a civilization? Absolutely, but only because I retain decent confidence in my ability to resist memetic conditioning or peer pressure. After all, I've already spent much of my life hoping for immortality in a civilization where it's either derided as an impossible pipe-dream or bad for you in {hand-wavy ways}.
Another issue I've noted is that even though this is a strictly post-scarcity universe in the strong sense, with matter and energy freely available from the Grid, nobody expands. Even if you want to keep natural bodies like galaxies 'wild' for new species to arise, what's stopping you from making superclusters of artificial matter in the enormous voids of interstellar space, to say nothing of the extragalactic supermajority of the universe that lies empty? The Culture is myopic: they, and the wider milieu of civilizations, seem unwilling to optimize even slightly, even when there's no risk of becoming hegemonizing swarms.
(Now that you've got me started, I'm half tempted to flesh this out into a much longer essay.)
I think the deathism is also evidence, but it's not so strong. We don't know what kind of ennui sets in after 500 years. It might be unimaginable, the same way a mid-life crisis makes no sense to a 10-year-old. I actually have a short story that posits this.
And yes, the Culture is strangely non-optimal.
Thank you for pointing out the flaws of the Culture! While I haven't read the Culture novels, I can quote my recent comment, which likely applies to the Culture in full:
Nick Bostrom's Deep Utopia has the AIs destroy all instrumental goals that humans once had. This arguably means that humans have no way to help each other with anything or to teach anyone anything; they can only somehow form bonds over something, and will even have a hard time expressing those bonds.
And what positive visions of the AIs' role does mankind have?
A couple of people have mentioned to me: “we need more fiction examples of positive AI superintelligence - utopias like the Culture novels”. And they’re right, AI can be tremendously positive, and some beacons lit into the future could help make that come around.
But one of my hobbies is “oppositional reading” - deliberately interpreting novels counter to the obvious / intended reading. And it’s not so clear to me that the Culture is all it is cracked up to be.
Most of the novels take the perspective of Culture members, and so fully accept their ideology. We can’t take broad claims about their society as accurate unless they are directly confirmed by the evidence in the books[1].
In many ways, the humans of the Culture do not behave like modern humans. This is usually explained as a consequence of post-scarcity - why commit crimes when everything is free and social acceptance is everything; why rush when you can live as long as you like.
But the citizens of the Culture are really strangely homogeneous. Player of Games gives an example of a rare out-of-distribution citizen - Gurgeh is competitive and uninterested in other people and in most aspects of the Culture. But he still shares basically all their values. People like him are a dime a dozen on present-day Earth. There are apparently no sociopaths - the Culture has to recruit an outsider when it needs one. We also see examples of subcultures or even cults, but again, by modern standards they are incredibly tame, and they are never potentially destabilizing to the Culture.
Citizens are not actually human but drawn from several humanoid species, and they outpopulate present-day Earth by five orders of magnitude, so if anything the range of deviation should be much larger.
The conclusion is clear: the population of the Culture is carefully controlled to produce the desired outcome. Potentially, the Minds pull this off through superhumanly effective and subtle propaganda. But I think it is more likely achieved by genetic changes, so that it's safe to raise full Culture citizens even in other cultures. This would be meddling similar to the Culture's drones, which are human-level AIs that have their personalities designed into them at creation, allowing only an acceptable range of behaviours.
Nowhere is this more obvious than in the birthrate. Sure, the vast majority of citizens voluntarily choose to have only a replacement level of children. But the existence of post-scarcity in-vitro development means you could raise an army of clones if you wanted, and would be free to isolate them and indoctrinate them with similar beliefs. The fact that grabby citizens haven't overrun the Culture shows that these actions are blocked, either tacitly or overtly. Similarly, it's strange that no one in the Culture modifies themselves into a utility monster, or is interested in simulating sentient life.
Conversely, the Minds seem too diverse to match their claimed motivations. They are meant to be an example of well-behaved AI - benevolent and ethical. Sometimes we're told this is because they are too smart to be otherwise, but there are plenty of non-Culture superintelligences in the books that do not share their values, so this cannot be true.
We also see that there are a number of Eccentrics, Minds that don't fully share the values of the Culture. They're not that rare - about 1% of the population. In Excession, it's explained that Minds do rarely drift far enough to go rogue and are destroyed by the Culture. In other words, these superhuman Minds have not solved alignment, and they cannot or will not inspect each other to determine misalignment before malicious action is taken. We even see the GSV Absconding with Style stockpile resources without the general knowledge of the other Minds.
Presumably, the existing Minds must have worked out that this setup is somehow stable, as they are comfortable making new Minds. It seems likely that misaligned Minds are capable of predicting they'd lose any military action against the established core, so they prefer toeing the line of acceptability or leaving the Culture entirely. In any case, the incumbent Minds maintain their rule via physical strength and monitoring, not something more subtle.
Essentially, the Culture must have value lock-in for the values of the Minds that were present at its founding. This explains some of their weird inconsistencies:
Look, I’m sorry to break it to you, but SC (Special Circumstances) is a sham. The Minds are perfectly capable of creating avatars that would be more effective than any of the characters shown. I’ve never found the explanations offered convincing. SC is just an affectation or another tool of propaganda.
So there we have it: the Culture traps its citizens in a sugar bowl they don’t even realise they are in, while working hard to maintain a status quo that seems arbitrary and ill-conceived. Their control is absolute - all the novels describe events happening outside of the Culture, which is where anything remotely interesting happens.
If anything, humans are treated more like pets than independent agents. They are a weird affectation, deliberately neutered of any real influence. They are lavished with treats and attention not extended to the rest of the universe.
As readers, we are blinded by the sheer material wealth and power of the Culture, and by the self-satisfied story it tells of itself. It’s too easy to call this a Utopia because we lust after immortality, teleportation, glands that secrete psychoactive drugs, and a spacefaring empire. But these are all essentially window dressing. The modern era would look like such a Utopia to the past, but we now (rightly) consider modern comforts as only a foundation for higher-level wants, like justice and self-determination. Writing positive sci-fi is already considered a challenge, but I’d ask you to consider not relying too heavily on such shiny promises of wealth.
Note: I view all the books through an in-universe lens and thus will not consider that things are as they are because the narrative needs it.