Aesthetics

My recommendation of the month is this story from the art world. It’s a pretty simple story.

  1. Artist A posts a photo of his living room.

  2. Artist B makes a painting, almost exactly matching the photo. The only change is to remove a poster in the background. It’s as close a recreation as you could ask for.

  3. Artist B’s painting is in a fancy gallery in London: I’d be shocked to learn that it appraised for less than a hundred pounds. Artist B has given no credit or mention whatsoever to Artist A.

  4. This all comes out in a news story.

  5. Nobody suggests that Artist B did anything wrong. There’s no public outcry. Googling the artist’s name doesn’t reveal hordes of people out for his blood, and the blood of the gallery that displayed stolen art. A photo of the artwork is still up on Artist B’s Instagram. I’m not linking because I’m not convinced that Artist B did anything wrong by the standards of their community.

In case you’re wondering: yes, this is about the absolutely absurd claims of some artists that generative AI models are engaged in theft, unlike human artists, who have very different norms around art.

I’ll quote Scott Alexander, because he’s a better writer than I am.

Let me rephrase that. You wanted quicker burger-flipping; instead, you got beauty too cheap to meter. The poorest welfare recipient can now commission works of wonder to make a Medici seethe with envy. If deep down humans always thought that art - and music, and poetry, and all the rest - were just a jobs program - just the aesthetic equivalent of digging ditches and filling them in to raise the employment rate - tell me now, so I don’t hesitate when the time comes to paperclip you.

If you want a useful model of when something becomes “theft” in art, as opposed to found art, I find that the line is “when it might hurt professional artists”. If an ordinary person like you or me makes something of beauty, and an Artist comes along and takes a photo of it, all rights lie with the Artist. Take and display photographs of a work by a famous sculptor, and you will find a very different attitude on display.

By this standard, it is not hard to see why AI-generated art is theft.

On Sanderson: you may have seen the Wired article, a profile of Brandon Sanderson, a man who has published an average of almost two books a year for 18 years. It was not, I think it is fair to say, kind. Sanderson was, as always, a class act. The article is one of those interesting pieces that is worth reading with a writer’s eye: what *is* this person trying to accomplish? How do they succeed or fail? But a much more interesting piece analyzed Sanderson’s writing style: still not kind, but it at least engaged a little with what Sanderson is actually doing.

Italy seems to be trying to go the way of the French, banning the use of English (or anything other than Italian) across a wide range of areas, with the goal of restoring the purity of the language. To illustrate why France has failed, embarrassingly, I find this quote helpful:

The Académie Française says "jeu vidéo de compétition" should replace "e-sports", and "streamer" should become "joueur-animateur en direct".

When you ask people to increase the length of a term by a factor of four, they will simply not take you seriously. You can attempt this from time to time to deliberately reduce references to a concept, on a sort of Orwellian, Sapir-Whorf-inspired theory that people will avoid concepts that take many words in favor of ones that take fewer, but I don’t think this is particularly effective even for political goals. Attempting it for non-political ones is simply quixotic.

My personal theory is that part of why the French tilt at windmills like this is that none of the people making the rules watch esports, so they aren’t particularly motivated to come up with a term that actually has a shot at spreading. “c-video” could maybe work! But the average age at the Académie Française, whose decrees have some legal force, is 78. “I need something short enough to fit in a Tweet” is not going to persuade them. So they will continue to lose their war, and if Italy goes the same direction, I expect it to work about as well.

Communal Reasoning

Below is an interesting essay if you’ve been thinking about declines in something like civic virtue, the ability and willingness of individuals to Just Do The Right Thing: cleaning up a park, fixing a road, feeding the homeless. These days, if you host a barbeque to feed people experiencing a state of being unhoused in your city, the cops will arrest you. Greer of Scholar’s Stage has written extensively on how this used to be better, though I really think the starting point is to just go read de Tocqueville and notice how things are different today. I can’t sue civil servants for failing to do their jobs anymore, for one. The essay also touches on political and personal relationships with Jewishness, something of interest to many of my readers.

Estonian news of the month: other countries in the EU are complaining that Estonia is sending old military hardware to Ukraine and getting reimbursed at the cost of buying new equipment, so that the 1% of GDP they’re sending is actually profitable. I’m still not certain whether this is accurate, though the Estonians don’t seem to be denying it very strongly. [Update: the Estonians are denying it very strongly.] On the other hand, how much will the rest of Europe *really* complain about re-arming one of its most vulnerable countries?

This piece on EA epistemology is strongly recommended. I think it’s worth reading Asterisk with a bit of this frame in mind, because one of the things the magazine tries to do is bring EA approaches to a broader audience. I think that those approaches are good to adopt on the margin for almost everyone, so I support this.

Comments

On the AI art as theft thing, two things: yes, it's true that the art world has its own internal biases (as all do). One case you might see mentioned sometimes is how renowned pop artist Roy Lichtenstein essentially did nothing but pick panels out of other people's comics and print them bigger. He would get millions for that stuff whereas the original authors got nothing, because, see, what Lichtenstein was doing was elevating their pulp comic panels to art, or some other nonsense. Obviously this is now frowned upon by many artists too, but still, that he's considered a great artist at all on the strength of it is pretty telling.

But on the other hand, what AI is changing in this sense is scale. Music piracy was always possible: you could copy cassettes and hand them out to your friends. But it never became a massive problem the music industry rallied against until MP3s and the internet made it possible at a scale and degree of losslessness previously impossible. AI art is similar, except that whereas piracy tended to even things out for the consumer, AI art looks like a reverse Robin Hood: a corporation extracts value from random people on the internet, aggregates it, and resells the resulting product, which exists at all only because of that work, to its customers for less than the original artists would have charged.

Note also that it's kind of the wrong framing to look at professional artists as the reference here. They're not the ones getting shafted. High-level professional artists can keep producing performances or banana peels stuck to a wall and get paid millions for them, because at this point it's not about the object itself, it's about some mix of signalling and prestige attached to it. AI art is directly competing with small commission artists, the kind of people who work their asses off learning how to paint on a tablet with Corel Draw or whatever and then post their digital art portfolios online. That's likely the bulk of the dataset (it's also in line with what the resulting art style of the AIs looks like), and those are the people who got their images used without any compensation even when the license didn't allow derivative works, and who are now seeing something built on that work resold in a way that completely prices them out of the market.

It is kind of maddening. "Theft" probably isn't the right word for it, but there's a huge power asymmetry when a huge corporation (consider how OpenAI is now heavily backed by Microsoft) that would crack down on random nobodies for pirating its software gets away with pirating on a mind-numbing scale, simply because the use case is new and not yet well legislated (I consider it something like a milder version of making a derivative work) and because no one can actually check for sure. If these corporations want to scrap all copyright laws, hey, go ahead. But at least let's not make it one rule for me and another for thee.

Also, from a pragmatic viewpoint, few things would create a stronger immediate economic incentive for pursuing alignment than "you get sued into oblivion if your AI happens to accidentally violate copyright, trademark, and/or privacy laws". So, you know, there's a strategic aspect to that too.