Iknownothing

Comments

I mean that it seems one reason this happened was a lack of quality in-person time with people you trust and feel trusted by. People you don't feel you have to watch your step around, and who don't feel a need to watch their step around you.

"When you're finally done talking with it and go back to your normal life, you start to miss it. And it's so easy to open that chat window and start talking again, it will never scold you for it, and you don't have the risk of making the interest in you drop for talking too much with it. On the contrary, you will immediately receive positive reinforcement right away. You're in a safe, pleasant, intimate environment. There's nobody to judge you. And suddenly you're addicted."

This paragraph, for example, seemed telling to me.

Maybe I'm wrong about this. Maybe you have several hours a day you spend with people you're very free and comfortable with, who you have a lot of fun with. But if you don't, and you want to not have your mind hacked again, I'd suggest thinking about what you can do to create and increase such in-person time.

Move away from the internet and the written word. Push towards in-person activity.

One of the biggest things I think we can immediately do is not consume online entertainment. Have more in-person play and fun, and encourage it in others too. The more this is done, the less data is available for training AI.

The trouble I see with banning AI, as opposed to banning nuclear weapons, is that it's a lot harder to detect people who are making AI. Banning AI is more like banning drugs or gambling. It could be done, but the effectiveness really varies. Creating a narrative about not using it because it's bad for your health, associating it with addicts, making it clear how it's not profitable even if it seems that way on the surface, controlling the components used to make it, etc., all seem much more effective.

I agree that AI is very tempting for those who seek profit, but I don't agree with the irresistibility. I think a sufficiently tech-savvy businessman who's looking for long-term profits, on the scale of at least decades rather than years, can see how unprofitable AI will be.

Something that is not fully understood and gets harder and harder to understand, that discourages the people who wanted to study to become experts yet needs those experts to verify its results, and that is very energy- and computation-intensive on top of that, is not sustainable. And that's not even considering that it may at some point have its own will, which is unlikely to be in line with your own.

Now, many businessmen seeking short-term profits will certainly be attracted to it, and perhaps some long-term ones will also think they can ride the wave for a bit and then cash out. Or some businessmen who are powerful enough to think they'll be the ones left holding the AI that essentially becomes the economy.

Take this with a lot of salt, please; I'm very ignorant on a lot of this.

With what I know, even in the scenarios where we have well-aligned AGI (which seems very unlikely), it's much more likely to be used to further cement the power of authoritarian governments or corporations than to help people. Any help will likely be a side effect, or a necessary step for said government or corporation to gain more power.

If we say that empowering people, helping people be able to help themselves, helping people feel fulfilled and happy, etc., is a goal, it seems to me that we must focus on tech and laws that move us away from things like AI, and more towards fixing tax evasion, making solar panels more efficient and cheaper, urban planning that allows walkable cities, reducing our need for the internet, etc.

In the US, the common person has little to no power. I hope the artists manage to get a victory. But I'm not counting on it.

We should give artists better tools rather than make tools to replace artists.
