The Old Savage in the New Civilization V. 2

by Your Higher Self
6th Jul 2025

An updated, modern take on a book lost in time: 'The Old Savage in the New Civilization' by Raymond B. Fosdick. This post was written by a human. Punctuation, grammar, and formatting were corrected by AI.

The New Civilization

“Can we trust the digital primitive with the tools he has engineered?”
— inspired by Raymond B. Fosdick

Just over a century ago, the world was unrecognizable by today’s standards. Back then, people traveled by horse and steam, wrote letters by hand, and often died in the same towns where they were born. Life moved slowly. Information moved slower. The average person’s daily reality barely shifted from generation to generation.

But now? We swipe, tap, and talk to machines that know us better than we know ourselves. We live in a civilization driven by artificial intelligence, high-speed networks, quantum computing, and global data streams. What was once built with steel and steam is now built with code and computation. Our machines don’t just lift burdens; they think, optimize, and decide. And we’ve given them increasing control over the very structure of society.

This new digital civilization has redefined time, space, labor, and even truth. You don’t need to travel to a place to be influenced by it. Borders mean little to algorithms. And human identity? Our jobs, our values, even our beliefs are being reshaped in real time by feedback loops that very few of us fully understand.

So here’s the problem: while our machines have evolved, we haven’t. Emotionally, morally, and spiritually, we’re not that different from our ancestors. Our tools have grown exponentially more powerful, but our self-awareness, compassion, and ethical maturity remain primitive. We’ve taught AI how to process trillions of inputs, but we still struggle to process our own biases.

We are the “old savage,” except now armed with neural nets and satellite feeds.

The Problem Isn’t Technology. It’s Our Readiness.

AI is optimizing everything: banking, warfare, healthcare, advertising, relationships. Yet the average citizen understands very little, if anything, about how these systems work or what’s at stake. We are caught in an accelerating machine we built ourselves, and now we’re just trying to keep up.

Fosdick once described the industrial age as a “treadmill”: people working harder to maintain a life the machine dictated. That treadmill is now a scroll feed, an algorithmic loop, a gamified economy of attention. We’re “connected,” but rarely present. We’re informed, but not wise to what’s really at play.

Our tools are shaping our psychology. Social media isn’t just a platform. It's a drug and a belief engine. Search engines aren’t just reference guides. They’re gatekeepers of knowledge. The machine now influences how we define reality itself.

And like the industrial age, today’s digital systems benefit a few and distract the many. The same automation that gives us freedom also threatens millions of jobs. The same connectivity that gives us reach has eroded community. The same AI that can diagnose cancer can also generate deepfakes and manipulate elections.

We’ve created a system where scale replaces ethics and speed outruns wisdom. Dangerous isn’t even the word.

The New Power. But Not New People.

This new civilization is more powerful than anything we’ve ever seen in human history. But at the core, it’s still built on the same old human ego. We’ve got machines that can think faster than us, networks that span the globe, and tech that feels like magic. Yet the thing running the show? Still us. Still the same broken, insecure, power-hungry human heart trying to use it all to feel safe, in control, or superior.

Our behavior hasn’t evolved nearly as fast as our tech has. We're still tribal. Still quick to draw lines with us vs. them. Still reacting emotionally instead of thinking things through. We see someone different, we get defensive. Someone challenges us, we go on the attack. That part of us hasn’t changed.

We still want to control everything. We still chase status. We still fear losing what we have. We still act out of greed, trying to grab more than we need, or out of insecurity, afraid we’re not enough. And now all of that, all that messy, emotional human stuff, is getting baked into the systems we’re building. It’s showing up in our algorithms, in our AI, in our automation. Biased data leads to biased decisions, except now it’s happening at scale, without us even realizing it half the time.
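
To make “biased data leads to biased decisions at scale” concrete, here is a minimal toy sketch in Python. Everything in it is invented for illustration (the groups, the bar, the numbers); it is not a model of any real system. Two groups have identical underlying ability, but the historical labels held one group to a higher bar, and a system fit to those labels learns the old bar and applies it to every new case.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two groups with identical underlying ability.
group = rng.integers(0, 2, n)
merit = rng.normal(0, 1, n)

# Historical decisions: group 1 was held to a higher bar.
# This skew is the "bias in the data".
label = (merit > np.where(group == 1, 0.5, 0.0)).astype(int)

# "Training": recover each group's historical approval bar from the labels.
# (A classifier given `group` as a feature learns essentially the same thing.)
bar = {g: np.quantile(merit[group == g], 1 - label[group == g].mean())
       for g in (0, 1)}

# Deployment at scale: the learned bars are applied to every new applicant.
new_group = rng.integers(0, 2, n)
new_merit = rng.normal(0, 1, n)
approved = new_merit > np.vectorize(bar.get)(new_group)

for g in (0, 1):
    print(f"group {g}: approval rate {approved[new_group == g].mean():.2f}")
# Prints roughly 0.50 for group 0 and 0.31 for group 1: equally capable
# groups, unequal outcomes, with the historical skew now automated.
```

Nothing malicious happens at decision time. The skew was inherited from the data, and automation just executes it faster and more uniformly than any human gatekeeper could.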

Human conflict? It’s gone digital. Wars aren't just boots on the ground anymore. They're lines of code, attacks on infrastructure, manipulation through screens. The fear that’s always been with us? It's still here. It just looks different now. It’s wearing a sleek new interface, but underneath, it’s the same old fear we’ve always carried.

The Question That Still Matters.

So the question Fosdick asked in 1928 still echoes, louder than ever:

  • Can the old savage be trusted with the new civilization?

Only now, it’s:

  • Can we trust the emotionally unevolved human with godlike digital power?

We have supercomputers in our pockets, but not the patience to listen. We have access to every book ever written, but can’t distinguish truth from viral noise. We have the power to shape global consciousness, and we use it for memes.

This isn’t a call to despair. It’s a wake-up call.

We’re not powerless, but we are unprepared. Unless we evolve spiritually and ethically as fast as we’re evolving technologically, this digital civilization may outgrow us and eventually overrun us. The tools we build will either reflect our best selves or amplify our worst.

That choice is still ours.

The Old Savage at the Helm.

“The emotional ancient is now in charge of artificial minds.”
— inspired by Raymond B. Fosdick

The greatest threat to modern civilization isn’t artificial intelligence; it’s natural ignorance.

That was Fosdick’s message nearly a hundred years ago. He wasn’t scared of the machines themselves. He was scared of us. Scared of what people would do once they got their hands on something more powerful than themselves. And looking around in 2025, that warning doesn’t just hold up; it hits even harder now.

We’ve built digital systems that would’ve looked like pure sorcery to anyone from the past. Machines that can learn, create, and respond like they have a mind of their own. AI that can write stories, paint pictures, and solve problems faster than most of us can even think. Databases that don’t forget a thing. Algorithms that watch what we do and start predicting what we’re going to do next. It’s not just that these tools are strong. It’s that they’ve started shaping how we live, how we think, how we interact. They don’t just extend our reach. They change the whole game.

But here’s the thing that concerns me... It’s still us behind the wheel. The same emotional, reactive, insecure human being from the stone age is still calling the shots. Still trying to bend the world to their own wants, still getting triggered by fear and pride, still stuck in old tribal instincts. The tools have evolved, but the user hasn’t. And that’s where things get dangerous.

A Mind That Hasn’t Caught Up.

Our technology's been racing ahead faster than we can keep up, but emotionally and mentally we're still stuck in the same old patterns. We're still wired for survival, for validation, for fear of rejection. The difference now is those instincts aren't just part of us anymore; they're being tracked, triggered, and turned into profit.

We're still driven by fear, still trying to look good in front of the tribe, still letting ego run the show. But now every scroll, every click, every like is designed to poke at those old instincts. The systems we use every day are built to hijack them. They don't just reflect our behavior; they study it, then push it.

Take politics, for example. It's not about debates or ideas anymore. Now it's bot armies flooding the internet with outrage, fake accounts stirring up division, algorithms prioritizing what will get the biggest reaction. It's war without bullets, just manipulation at scale.

Culturally, the truth doesn't stand a chance if it doesn't perform. Lies spread faster. Rage gets more clicks. Nuance doesn't fit the format. Everything's a performance. Everything's a fight for attention.

Even in your personal life, it's like the walls are closing in without you realizing it. AI starts feeding you more of what you already believe, more of what makes you feel good or angry or seen. It starts building a little bubble around you, a mirror that only shows you what you already expect to see. It feels personalized, but really it's just isolating.

The scary part is we haven't evolved out of our old programming; we've just plugged it into something way bigger. The algorithm isn't neutral. It feeds your biases. It rewards your compulsions. It turns your insecurities into engagement.

We're still primitive, but we're holding tools that can reach across the world in milliseconds. Tools that can influence elections, change behavior, shape entire generations. And every time we react without thinking, every time we get angry, swipe out of boredom, chase a dopamine hit, a machine somewhere gets better at predicting the next time we'll do it. And the next. And the next.

We're training the very thing that's training us.
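
A deliberately tiny, hypothetical simulation of that loop (every parameter below is made up) shows how it plays out: a feed that reinforces whatever gets clicked converts a mild preference into a near-monoculture.

```python
import numpy as np

rng = np.random.default_rng(1)
n_topics = 10

# The user engages slightly more with topic 0 than with anything else.
click_prob = np.full(n_topics, 0.10)
click_prob[0] = 0.30

# The feed starts out neutral: every topic equally likely to be shown.
weights = np.ones(n_topics)

for _ in range(5_000):
    shown = rng.choice(n_topics, p=weights / weights.sum())
    if rng.random() < click_prob[shown]:
        weights[shown] += 1.0  # reward whatever got engagement

share = weights[0] / weights.sum()
print(f"share of the feed now devoted to topic 0: {share:.0%}")
# A 30%-vs-10% preference ends up owning most of the feed: each click
# teaches the system to show more of the same, which earns more clicks.
```

The user never asked for a bubble. The system just kept doing the thing that worked, and "the thing that worked" was always a narrower version of what it did last time.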

Technology Has Advanced. But Has Wisdom?

In Fosdick’s time, the “machine” was the factory system, the steam engine, the mass production line. He saw how quickly these tools created global interdependence and how slowly human values kept pace.

Now, the machine is distributed. It’s decentralized. It’s invisible. It’s in our codebases, our cloud networks, our smart speakers. But the same danger remains: the system is evolving faster than we are.

Governments struggle to regulate digital threats they barely understand, or worse, promote them. School systems are still training children to memorize, but not to question. And companies chase exponential growth without pausing to ask, “At what cost?”

We’re in the same position Fosdick warned about: trying to steer an accelerating machine we don’t know how to control, let alone understand.

What Makes This Time Different and More Dangerous?

Back in the industrial era, machines did what we told them to do. They made things faster, lifted heavier stuff, repeated the same action over and over. But they didn't learn. They didn't grow. They didn't think ahead. Today's tools are different. These tools improve themselves. They watch, they adapt, they get better without waiting for us to catch up.

AI is already learning faster than we do. These deep learning systems aren't just crunching numbers or sorting data. They're generating ideas. Writing full essays. Composing music that sounds human. Imitating voices so well you can't even tell the difference. They're not just reacting to input; they're interpreting it, predicting outcomes, creating content, shaping culture. That's not science fiction anymore; that's now.

So the question used to be: what will people do with this power? What will someone use this tool for? Well, now we've got a bigger problem. What happens when the tool starts acting on its own? What happens when we're not fully in control anymore? What happens when the tool decides to build more tools of its own accord?

Even if we think we're still driving the thing, the truth is a lot of the dashboard is automated now. Recommendations, search results, alerts, auto-responses, decision-making systems, all of it runs faster than we can think. By the time we react, the system has already moved on. We're steering a ship where the navigation is doing most of the steering for us, and we barely understand how it works.

The Ignorance of the Savage Mind Lingers.

And if we're honest, most people aren't even asking who's really in control anymore. They're just along for the ride, hoping it doesn't crash. But if and when it does, you know what's going to happen. They're going to fall right back into that savage mind. That instinct to point fingers, to blame, to lash out. 

They'll blame the government or the tech companies or some random group they don't even understand. But they won't look at themselves. They won't admit they had every chance to learn what was happening, to ask questions, to read just a few articles on LessWrong or elsewhere, to actually understand the tools shaping their world.

They had the time. They had the access. But they didn't want to face it. It was easier to stay distracted, scrolling, swiping, binge-watching, chasing the next dopamine hit, than to sit with the uncomfortable truth that they were handing over control one click at a time. Do you know anyone who has read an entire TOS for an app?

So when it all starts to crack, they'll pretend they never saw it coming. But deep down they did. They just didn't want to know, and so they'll quiver in their caves while shouting blame from the darkness.

The Crisis Is Not Just Technical. It's Moral.

We don't just need better code. We need better character.

The tools of this new age demand a new kind of leadership: not just technologists, but ethicists, psychologists, philosophers, and artists who can guide society through what the machines can do versus what they should do.

Fosdick said we needed an Aristotle for the machine age. Now we need one for the algorithmic age.

We need leaders who ask:

  • What are we optimizing for?
  • Who decides the values embedded in the algorithm?
  • How do we educate a population that lives inside a digital dreamscape?

Because if we don't answer those questions, someone else will: some machine, some corporation, some autocrat.

The Same Fire That Warms Can Burn.

AI is like fire. The same fire that warms your home can burn it to the ground. It can guide you through the dark or turn everything you've built into ashes. And the thing is, we've barely scratched the surface of what this technology can really do. We're just getting started, and already it's being used in ways that should make us stop and think.

Autocratic governments are using it to watch every move their citizens make. Not just cameras on every corner, but facial recognition systems tracking you in real time, scoring your behavior, monitoring your social circle, shaping what you're allowed to do or say. That's not some far-off dystopia. That's already happening.

Corporations are using AI to get inside your head. Not just selling products, but predicting what you'll want before you even know you want it. Studying your behavior, manipulating your choices, nudging your desires, creating a version of you that's easier to sell to. It's not about serving you; it's about steering you.

And the worst of it? Extremists and bad actors are using it to spread hate, lies, and fear. They don't need armies or bombs. Just a well-placed video, a fake news article, a deepfake that looks real enough to get shared a thousand times before anyone even thinks to question it. They can tear societies apart without ever stepping outside.

Just like the war machines in Fosdick's time, these digital systems can be pointed at the masses. Not with bullets, but with influence. Silent, subtle, invisible pressure that shapes how people think, feel, and vote. And it's not always some big dramatic event.

Sometimes the damage happens slow. Quiet. Bit by bit. Until we're so deep in it we don't even realize we're being controlled.

That's the risk. That's the fire. And it's already lit.

Parts 2 & 3 will be posted soon. I hope some of you are interested in my writings and thoughts. I encourage you to leave a comment below. Thank you for your time and consideration.