Comments

Giulio · 4d

Due to the weather, we're moving this from Saturday to Sunday. Same time, same place.

Giulio · 5mo

I think some parallels can be drawn between debates on gun control and debates on recent AI regulation

Giulio · 5mo

Would be nice to have a website collating people's public p(doom) statements

Giulio · 6mo

Given the name, I imagine this line of research is inspired by model organism research, although I wonder if a simpler "demonstrations of misalignment" could've been sufficient/better.

Giulio · 1y

"don't hate the player, hate the game"

Moloch is "the game"

Giulio · 1y

Asteroid movies probably made people more receptive to x-risk from cosmic collisions

Maybe we need a movie about x-risk from misaligned AI? Something like Ex Machina and/or Her, but with more focus on consequences and fewer robots.

idk could be "counterproductive" too I guess

Giulio · 1y

It has come to my attention he's on a sabbatical. That's great, but his activity (tweets, podcasts) doesn't suggest the level of detachment from engagement I was imagining.

Giulio · 1y

Has EY considered taking a break? Like a really long (at least 1 year) vacation where he's mostly disconnected from AI news and just the world in general. Maybe sail the world or something. Starting to seem like he has given up anyway. Maybe exiting the bubble a bit will allow for new hope (and ideas? motivation?) to form.

Giulio · 1y

“Quote tweeting” this:

https://www.lesswrong.com/posts/sJaHghhQXdepZauCc/thesofakillers-s-shortform?commentId=y4NbKHLeDSsppTZ2P

Wonder if it’s worth synchronizing my Twitter with LW shortform.

Probably not. I think I will just handpick which tweets I repost here. Plus some shortform exclusives maybe.
