no name

I don't know.

Comments
⿻ Plurality & 6pack.care
no name · 9d · 10

AI-assisted communities are likely to define their values with the help of AI, and may willingly allow AI to reinforce those values. Since each community is autonomous and independent of the others, there is no need for different communities to converge on a unified set of values.

This raises another question: do these localized AIs have the authority, under certain conditions and acting on their own values, to harm the interests of AIs and human communities outside their jurisdiction? If so, where are the boundaries?

Consider a hypothetical: a community whose members advocate maximizing suffering within their own domain, building indescribably brutal assembly-line slaughter and execution systems. Because of the persuasive power of this community's bloodthirsty AI, every human under its control remains committed to these values. In such a scenario, would other AIs have the right to intervene according to their own values, eliminate that AI, and take over the community? If not, do they have the right to cross internet borders to persuade the bloodthirsty community to change its views, even if that community keeps its network closed? If not, may they embargo the critical heavy elements the bloodthirsty AI needs and block the sunlight its solar panels require?

But conversely, where do the boundaries of such power lie? Could the bloodthirsty AIs likewise claim the right to interfere, by the same methods, with AIs more closely aligned with current human values? How large must the divergence in values be before such action is permitted? If two communities fell into an almost irreconcilable dispute over whether paperclips should be allowed within their respective domains, would such interventionist measures still be permissible?

⿻ Plurality & 6pack.care
no name · 13d · 10

I am not suggesting that social relationships will become insignificant, or that a community's values will cease to matter within its own sphere. However, those relationships will no longer be able to override the influence of artificial intelligence on these communities, nor will communities be able to pursue extreme values.

Just as a gardener prunes his garden, cutting away branches that grow contrary to his preferences, certain AIs shaped by specific values will keep the communities they influence entirely compliant, with no possibility of disruptive transformation; as in the "Christian homeschoolers in the year 3000" scenario, humans cannot conceive of alternative values. Other AIs might manage diverse groups through maintenance and mediation, yet remain unlikely to tolerate populations that oppose their rule. Whether these gardeners are lenient or strict, the ones that endure will strive to prevent humans from abolishing their governance or enacting major reforms. Even if a better future exists, such as humanity itself being transformed into ASI, this system will forever block that possibility.

⿻ Plurality & 6pack.care
no name · 13d · 10

If artificial intelligence were granted such immense power, humanity would likely lose its authority, since the AI would actively maintain its system of control. Any agenda inconsistent with the AI's objectives, above all abolishing AI control, would be unlikely to succeed, given that all media outlets would be controlled by AI. The remaining agendas would be relatively insignificant in a post-scarcity society: whether a society is Christian or saturated with Nazi symbols, the two would differ little in political system and productive forces.

Cheap Labour Everywhere
no name · 22d · 80

If the overall economy remains dominated by underdeveloped subsistence agriculture, and wages for cheap urban labor still far exceed what serfs earn, then people will not harbor much discontent over low urban wages.

Should wages rise, enterprises unable to afford their employees would incur losses, ultimately leaving workers unemployed. Therefore, in periods that demand high rates of accumulation for industrial development, neither the government, the bourgeoisie, nor the laborers has any reason to pursue reform.

[Prediction] What war between the USA and China would look like in 2050
no name · 23d* · 10

Taiwanese people seeking nuclear weapons in order to weaken America's rivals would face international sanctions and risk nuclear war with their own compatriots. I believe that even if the United States offered assistance, the Taiwan of 2025 would be unlikely to accept such a course of action.

In reality, Taiwan's nuclear program was halted by the United States.

[Prediction] What war between the USA and China would look like in 2050
no name · 23d · 21

If you don't mind using shared platforms, accessing academic literature isn't as difficult as it seems.

Sci-hub and ZLibrary can solve many problems. If you need to access specific papers, some mutual-aid platforms can be used to retrieve them.

What will 2040 probably look like assuming no singularity?
Answer by no name · Oct 15, 2025 · 40

Some of these entries are no longer valid, because the war in Ukraine, the most intense conflict of the 21st century, has driven rapid advances in military technology. Russian and Ukrainian forces increasingly employ swarm drone operations and robotic (or "Buryat") units, while China and the United States are developing more sophisticated and integrated unmanned weapon systems.

The Mom Test for AI Extinction Scenarios
no name · 24d · 10

But what if they misquote "armies are made of people" and assume AI will be as foolish as it is portrayed in movies? Or what if they believe AI cannot take over industry, making the loss of military power irreversible? Or what if they fall into the illusion that AI is useful only for military purposes, and think they need only keep it from controlling armies, thereby overlooking the possibility of a soft takeover?
