For so many people, "good" art is something that "looks like the thing," and this study just shows that depressingly well
Also, it's more accurate to say that this AI art is really just derivative work. It's standing on the shoulders of artists from the past, and it's not that difficult to create "art" when all the hard work has been done for you
100%
There aren’t that many creative geniuses around. Isn’t all art standing on the shoulders of giants?
Yes, but it still involves choices. AI art isn't making any choices; it's using templates
I already think about this tweet a lot, but your post (which is excellent) really brought it back to mind in a big way: "As a teacher of poetry what I can tell you for sure is people want poems to rhyme. They want poems to rhyme so bad. But we won’t give it to them" (https://x.com/ursulabrs/status/1434791291653558275)
which is funny because, in my experience, it's nearly impossible to get an LLM to generate poetry that *doesn't* rhyme, even when specifically prompted, which makes it stand out from most modern poetry
great piece. i think many within Rationalist communities are very tied to their one analytical way of thinking, and can't wrap their minds around other, non-analytical forms of knowledge production, and because they can't understand alternative models they don't believe the models are really saying anything. so these tests basically confirm what they've long suspected: that there is ultimately something "bullshit-ty" about "art". i deeply respect Scott Alexander, and i have many friends who i'd consider part of this rationalist community, but this is one of the huge blind spots of these folks
Why would you respect anyone from the community that shat themselves over an incredibly stupid thought experiment (look up “Roko’s Basilisk”)?
Are you mixing up Scott Alexander and Eliezer Yudkowsky?
I’m taking a dunk on the rationalist community as a whole. That’s why I said “community” and not “Scott Alexander”.
I mean the artistic community hasn’t always had flawless judgment either.
I think there's a lot to respect about people who can take an idea seriously instead of just dismissing it as absurd. Sometimes that ability backfires, but most of the time it's pretty useful.
This distilled a lot of thoughts I've had about this issue quite well. The argument that AI is democratizing artistic expression might be technically true, but it's increasingly obvious that this removal of skill barriers is solely for the benefit of people who seek to create the equivalent of a Doctor Who / Rick and Morty crossover T-shirt. Maybe that's what people want, but it certainly doesn't justify the weird gloating many tech enthusiasts seem to be doing about human-made art being "over".
It’s auto-generated fanfic, forever.
There’s virtually no public understanding that art does not exist without context. You see this particularly clearly in the evergreen gripe about abstract art: my six-year-old son could do that.
AI art is the perfect expression of that most horrible word into which all creative output has been packed: content. Devoid of context and meaning, like a gray porridge, ready to be poured into a website template, social media post, or whatever other Internet vehicle is at hand. AI art is like sausage meat in that regard, thousands of years of human culture thrown into a grinder and extruded at the other end in regular, easy to eat, pre-cooked chunks.
I should qualify “public understanding” as “compulsively-online public understanding.”
| taking the safest and least surprising path at all times.
I agree with the 'least surprising' bit (i.e. the models are predictive and will take the median path), but I think the 'safest' bit is more a product of the large and lucid models being trained by large corporations with liabilities to worry about.
"...So, in the conversation concerning art, as I said before, something is likely to get lost and as funny as it is, what mostly goes unsaid is the creative human behind the work. That is, the artist. How comical it is, and typical, that the initial human spark of the work is forgotten, ignored, unmentioned, and the vomit that is created, the soul’s mandatory upchuck, is now debated, deliberated, picked apart, thoroughly dissected, and in many cases to such an extent that it causes the works own demise in the eyes of the viewer as well as the eyes of the artist. This is why, I take it, some individuals have such an apathy (and sometimes celebration) towards Artificial Intelligence in art—they have no concern for the process or individual, only for the output. Only for the “thing” in its physical or visual form. Either way, we take too much time for the meaning of the piece, rather than the meaning of the process."
--"Of Masterpieces and Scraps"
https://judsonvereen.substack.com/cp/150209507
At the risk of coming off as glib and immature, fuckin’ A.
oh boy, the fact people liked that AI Walt Whitman poem is bleak. although people do hate poetry, so that’s not super surprising
I enjoyed this Read Max very much, and I'd like to think I liked it more than I would a similar AI-generated blog!!
Your second footnote is really the key here: people don't understand that these things work by averaging and interpolating data. They think they work by the model literally scratching out little imitations of humans, because they've been given every impression that's how it works by its boosters ('it learns and imitates, just like you or me!')
ohhhhhh now this is the real shit right here
The yardstick I am using is that if there was an AI that really did write like Walt Whitman and people preferred the poem shown here to its stuff, I would still be affronted on its behalf.
If people then said “why do we need the Walt Whitbot when we already had these much better poems?” I would be enraged. It’s not really a matter of human uniqueness with the poem. It’s just that it’s really fucking bad.
Maybe I am being naive and the answer is simply money, but I don’t get the purpose of using AI to generate art or fiction. Why bother? Why should the average person be asked to distinguish between the two? Why not teach critical thinking so the average person can utilize discernment and judgment when reading or viewing art? Our schools now teach how to pass a test, and not how to think. Let’s, as a society, agree to use AI to create technical prose and solutions, leaving art to humans. Then let’s teach students how to tell the difference between good and bad art, as well as good or bad political candidates.
I’m skeptical, precisely because while it’s manifestly true that critical thinking *can* be taught, it is equally evident that either it is not being taught, or attempts to teach it are being overwhelmed by an endless flood of cheap entertainment and distractions. I tend to believe the latter, but I don’t feel any better equipped to change the situation.
Pardon my pessimism, but I think teaching critical thinking to the average person is like teaching a horse to play chess.
The average adult, yes. But, not the average child. I was taught critical thinking as a child. Many gray issues were discussed in my schools. Tell me that is part of today’s educational experience.
I'd be interested in your thoughts on how the dynamics at play here interact with the adult-YA-fan/poptimist school. Are they an advanced deployment fifth column by time-traveling Super AIs?
I think the adult-YA-fan/poptimist school does have a good point a lot of the time. People do have a tendency to confuse genre with artistic quality. To someone who bothers to look, YA books and pop art often display the same kind of depths as "high art."
You can take the same critical tools and apply them to anything to find hidden depths. I think the reason that "high" art lends itself to this process better is that it is often stripped of anything obviously appealing, so it has no value except its hidden depths.
Alexander's test wasn't a test of AI art. It was a test of AI art THAT WAS SELECTED BY A HUMAN BEING! Use an AI image generator to try to produce a specific image, and any sense that these tools are a good substitute without human curation will immediately evaporate. He's a classic AI triumphalist-in-doomer-clothes, so of course he's leaving that out. The rest of us shouldn't.
Art and Poetry, the two things I need to be told what is good and why.