Charlie Steiner

LW1.0 username Manfred. Day job is condensed matter physics, hobby is thinking I know how to assign anthropic probabilities.

Sequences

Philosophy Corner

Comments

Self-study ideas for micro-projects in "abstract" subjects?

Go for it - though it's as much about selection effects as it is advice.

What other problems would a successful AI safety algorithm solve?

The best technical solution might just be "use the FAI to find the solution." Friendly AI is already, at its core, just a formal method for evaluating which actions are good for humans.

It's plausible we could use AI alignment research to "align" corporations, but only in a weakened sense where there's some process that returns good answers in everyday contexts. But for "real" alignment, where the corporation somehow does what's best for humans with high generality... well, that means using some process to evaluate actions, so this reduces to the case of using FAI.

Self-study ideas for micro-projects in "abstract" subjects?

Practicing the saxophone only has a payoff in saxophone music. Similarly, abstract exercises will often only have a payoff in nice abstractions. For example, solving the wave equation of the hydrogen atom is a classic, but it cannot gain you anything directly because it is merely knowledge, and that same knowledge can be found in a textbook or wiki faster than it can be derived. You're just gonna have to be the sort of person for whom solving the wave equation of the hydrogen atom is a juicy project.

Comment on the lab leak hypothesis

Do you have a cite for previous work reporting or using this sequence (something like cct cgg cgg gca) for a cleavage site in viruses? I only ended up finding and looking through one piece of prior gain-of-function research doing the sort of genetic engineering you're hypothesizing ( https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3168280/ ), but it used a totally different sequence. Better yet would be someone from pre-covid-19 times talking about how they made their construct include "cggcgg" as a marker.
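
In case it helps anyone checking candidate sequences themselves, here's a minimal sketch of the kind of motif search I mean; `find_motif` and the example fragment are made up for illustration, not pulled from any real genome.

```python
# A minimal sketch (not the actual genome): find every occurrence of the "cggcgg"
# marker in a nucleotide string. A real check would load the spike sequence from a
# FASTA file and also pay attention to the reading frame of the insertion.
def find_motif(seq: str, motif: str = "cggcgg") -> list[int]:
    seq = seq.lower()
    return [i for i in range(len(seq) - len(motif) + 1) if seq.startswith(motif, i)]

# Hypothetical fragment around a furin-cleavage-site-style insertion.
example = "tctcctcggcgggcacgt"
print(find_motif(example))  # -> [6]
```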

Oh No My AI (Filk)

To scan, the structure should be something like this (a rough syllable-count check is sketched after the example below):

(2 syllable pickup) (3-4 syllable measure) (2-3 syllable measure) (2-3 syllable measure)

(optional 1-syllable pickup) (4 syllable measure [the last 2 being AI])

(1-2 syllable pickup) (3-4 syllable measure) (2-3 syllable beat) (2-3 syllable beat)

(optional 1-syllable pickup) (4 syllable measure [the last 2 being AI])

(1-2 syllable pickup) (2-4 syllable measure) (2-3 syllable beat) (2-3 syllable beat)

(optional 1-syllable pickup) (4 syllable measure [the last 2 being AI])

(optional 1 syllable pickup) (4 syllable measure [the last 2 being AI])

(1 syllable pickup) (4 syllable measure [the last 2 being AI])

(1 syllable pickup) (4 syllable measure [the last 2 being AI])

 

E.g.

(I was) (gonna) (go to work but) (then I got high)


(I) (just got a) (new promotion but) (I got high)


[no pickup] (Now I'm) (selling dope and) (I know why)

 

[no pickup] ('Cause I got high)


(Be)(cause I got high)


(Be)(cause I got high)
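
If you want to sanity-check your own verses against the pattern above, here's a rough sketch of a syllable counter; the vowel-group heuristic and the `syllables` helper are my own approximations, not anything rigorous.

```python
import re

# Rough heuristic: count vowel groups, with a small correction for silent trailing "e".
# English syllabification is messy, but this is close enough for scansion sketches.
def syllables(word: str) -> int:
    w = word.lower()
    count = len(re.findall(r"[aeiouy]+", w))
    if w.endswith("e") and not w.endswith(("le", "ee")) and count > 1:
        count -= 1  # "dope", "'cause" -> one syllable
    return max(count, 1)

def line_syllables(line: str) -> int:
    return sum(syllables(w) for w in re.findall(r"[a-zA-Z']+", line))

# Verse lines should land around 9-12 syllables, refrain lines around 4-5.
for line in [
    "I was gonna go to work but then I got high",
    "Now I'm selling dope and I know why",
    "'Cause I got high",
]:
    print(line_syllables(line), line)
```

(It will undercount "AI", which the heuristic sees as one vowel group, so check the refrain lines by hand.)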

What is the most effective way to donate to AGI XRisk mitigation?

Right on time - turns out there are more grants, but now I'm not sure whether these are academic-style or not (I guess we might see the recipients later). https://futureoflife.org/fli-announces-grants-program-for-existential-risk-reduction/?fbclid=IwAR3_pMQ0tDd_EOg_RShlLY8i71nGFliu0YH8kzbc7fClACEgxIo2uK6gPW8&cn-reloaded=1

Oh No My AI (Filk)

I was gonna help humanity

with my AI

So I put a definition of brains

in my AI

Now it satisfies the preferences

of the fruit fly (oh yeah)

oh no my AI

oh no my AI

oh no my AI

ML is now automating parts of chip R&D. How big a deal is this?

One additional thing I'd be interested in is AI-assisted solution of the differential equations behind better masks for EUV lithography. It seems naively like another factor of 2-ish in feature size is just sitting out there waiting to be seized, though maybe I'm misunderstanding what I've heard about switching back to old-style masks with EUV.

Covid 6/10: Somebody Else’s Problem

Someone (MondSemmel to be precise) posted this last week. I think it's very cool, but also see last-week-me's further thoughts here: https://www.lesswrong.com/posts/92aXvTXxReBQZk2gx/?commentId=Dg6EBu3Cjd6DcBxjD

Covid 6/10: Somebody Else’s Problem

Did your lab leak section have anything at all on biology, as opposed to politics? I'm concerned that a lot of total non-experts seem to be promoting the genetic engineering hypothesis because it's a good morality story, or it gives them someone to blame, or even just because it's something exciting to talk about. Arguments like "wikipedia has broken links!" aren't merely unconvincing to me; they actively cause me to raise my guard about the selection process that brought them to my attention. And though engaging with the biology isn't a surefire way to avoid repeating BS, at least it helps.
