I think, for me, memory is not necessary for observation, but it is necessary for that observation to... go anywhere, become part of my overall world model, interact with other observations, become something I know?
and words help me stick a thing in my memory, because my memory for words is much better than my memory for e.g. visuals.
I guess that means the enduring world maps I carry around in my head are largely made of words, which lowers their fidelity compared to if I could carry around full visual data? But heightens their fidelity compared to when I don't convert my observations into words - in that case they kind of dissolve into a vague cloud
...oh but my memory/models/maps about music are not much mediated by words, I think, because my music memory is no worse than my verbal memory. are my music maps better than my everything-else maps? not sure maybe!
for some reason these crows made me laugh uncontrollably
This is great, thank you.
I didn't quite understand how "Beware ratchet effects" fits into/connects with the rest of the section that it's in - could you spell that out a bit? Also I'm curious if there are concrete examples of that happening that you know about & can share, though ofc very reasonable if not.
oh yeah my dispute isn't "the character in the song isn't talking about building AI" but "the song is not a call to accelerate building AI"
as Solstice creative lead I neither support nor oppose tearing apart the sun for raw materials
Take Great Transhumanist Future. It has "a coder" dismantling the sun "in another twenty years with some big old computer." This is a call to accelerate AI development, and use it for extremely transformative actions.
Super disagree with this! Neither I nor (I have not checked but am pretty certain) the author of the text wants to advocate that! (Indeed I somewhat actively tried to avoid having stuff in my program encourage this! You could argue that even though I tried to do this I did not succeed, but I think the fact that you seem to be reading ~motivations into authors' choices that aren't actually there is a sign that something in your analysis is off.) I think it's pretty standard that having a fictional character espouse an idea does not mean the author espouses it.
In the case of this song I did actually consider changing "you and I will flourish in the great transhumanist future" to "you and I MAY flourish in the great transhumanist future" to highlight the uncertainty, but I didn't want to make changes against the author's will, and Alicorn preferred to keep the "will" there because the rest of the song is written in the indicative mood. And, as I said before, Solstice is a crowdsourced endeavor and I am not willing to only include works where I do not have the slightest disagreement.
If the main problem with changing the songs is that many people in this community want to sing about AI accelerationism and want the songs to be anti-religious, then I stand by my criticisms
hmm, I want to be able to sing songs that express an important thing even if one can possibly read them in a way that also implies some things I disagree with
If the main problem with changing the songs is in making them scan and rhyme, then I can probably just pay that cost.
you are extremely welcome to suggest new versions of things!
but a lot of the cost is distributed and/or necessarily borne by the organizers. changing lines in a song that's sung at Solstice every year is a Big Deal and it is simply not possible to do this in a way that does not cause discourse and strife
(I guess arguably we managed the "threats and trials" line in TWTR without much discourse or strife but I think the framing did a lot there and I explicitly didn't frame it as a permanent change to the song, and also it was a pretty minor change)
re point 1 - maybe? unsure
[edit: one issue is that some irregularities will in fact be correlated across takes and STILL shouldn't be written down - like, sometimes a song will slow down gradually over the course of a couple measures, and the way to deal with that is to write the notes as though no slowdown is happening and then write "rit." (means "slow down") over the staff, NOT to write gradually longer notes; this might be tunable post facto but I think that itself would take human (or really good AI) judgment that's not necessarily much easier than just transcribing it manually to start]
re point 2 - the thing is you'd get a really irregular-looking hard to read thing that nobody could sightread. (actually this is already somewhat true for a lot of folk-style songs that sound intuitive but look really confusing when written down)
As someone who likes transcribing songs,
1) I endorse the above
2) if you ask me to transcribe a song I will often say yes (if it's not very frequent) (it costs time but not that much cognitive work for me so I experience reasonable amounts of this as fun)
One thing that makes this hard to automate is human imprecision in generating a recording, especially with rhythm: notes encode frequencies but also timings and durations, and humans performing a song will never get those things exactly precise (nor should they - good performance tends to involve being a little free with rhythms in ways that shouldn't be directly reflected in the sheet music), so any automatic transcriber will get silly-looking slightly-off rhythms that still need judgment to adjust.
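(To make the "silly-looking slightly-off rhythms" point concrete, here's a minimal sketch - not any real transcription tool - of what naive quantization does: snap each performed duration to the nearest sixteenth-note grid position, and watch a slightly loose performance of even quarter notes turn into rhythms nobody would write down.)

```python
def quantize(duration_beats, grid=0.25):
    """Snap a duration (in beats) to the nearest multiple of `grid` beats."""
    steps = round(duration_beats / grid)
    return max(steps, 1) * grid  # never collapse a note to zero length

# A performer "intends" four even quarter notes (1 beat each)
# but plays them with ordinary human looseness:
performed = [0.97, 1.10, 1.31, 0.62]
print([quantize(d) for d in performed])
# -> [1.0, 1.0, 1.25, 0.5]
# The last two notes come out as a quarter tied to a sixteenth,
# then an eighth - "correct" per the grid, wrong per the song.
```

(And a gradual slowdown like the "rit." case above would stretch every duration a little more each beat, so the grid-snapped output drifts into progressively weirder values rather than staying clean quarters.)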
I think the first time I encountered this post I had some kind of ~distaste for, idk, the idea that my beliefs and my aesthetics need have anything to do with each other? Maybe something about, protecting my right to like things aesthetically for arbitrary reasons without feeling like they need to fit into my broader value system in some coherent way, and/or to believe things without worrying about their aesthetics? whereas now I guess my... aesthetics, in this post's frame... have evolved to... idk, be more okay integrating these things with each other? having a more expansive and higher-uncertainty set of values/beliefs/aesthetics? all these words are very uncertain but this is interesting to encounter
A more concrete thought I have is: I've noticed that my social environments seem to kind of organically over time shape my worldviews in a way that I sometimes find kind of meta-epistemically disconcerting because it makes me feel like what I believe is more determined by who I'm around than by what's true. I think this is a pretty fair reaction for me to have, but also, reading this post now makes me think that actually part of it is that being around people who find a given thing beautiful causes me to learn how to find that thing beautiful too? And that's not a bad thing, I think; at least as long as I don't forget how to find other things beautiful like I used to, and perhaps periodically explore how I might find yet other, more foreign things beautiful, and don't start to believe beauty is quite the same thing as truth.