“No one uploaded their work to the internet with the expectation that it would be used to train AI. That’s it. It’s that simple.” — Logan Preshaw
The robots are coming for art.
You might not have seen any art created by artificial intelligence yet, but it won’t be long. We’re at the very beginning of a trend that will change art as we know it.
At first glance, the ability to generate custom artwork by typing a few words seems like an amazing advance for everyone, especially small business owners and anyone on a tight (or nonexistent) budget who wants artwork to use in a project.
But there are deep ethical concerns involved in both the existence and the use of AI-generated art.
Here’s an overview of what AI art is, where it’s going and what effects it might have on businesses, artists and body liberation.
So what is AI-generated art?
Artificial intelligence is used increasingly widely for everything from information security to self-driving cars to writing marketing copy, so it’s no surprise that its use is expanding to images.
AI-generated art is created by special tools that allow humans to type in keywords. The artificial intelligence then attempts to create an image that matches the prompt.
These tools, most of which are just beginning to open up to the public, include Midjourney, DALL-E and Stable Diffusion as well as apps like Lensa, which use AI to alter user-submitted selfies and portraits.
(If you’re in the tech world, you may already know there’s debate over whether AI art generation is just machine learning with a fancier name, but roll with it for this article.)
This is not (yet) a mature technology. I’ve watched as people like Ursula Vernon have explored these new art generators, spending hours upon hours fine-tuning keywords and technique just to create relatively simple images. But that will change.
Where does the AI get its “inspiration”?
The details depend on the AI in question, but all AI artwork generators learn from data sets created from existing art. Most learn from data sets created by scraping the internet for images.
In other words, human artists’ work is being taken without permission or payment from Pinterest, Fine Art America, Google and millions of other websites and used to train AIs to create similar work.
Artist Simon Stålenhag says, “It basically takes lifetimes of work by artists, without consent, and uses that data as the core ingredient in a new type of pastry that it can sell at a profit with the sole aim of enriching a bunch of yacht owners.”
The inspiration images themselves are discarded and the AI doesn’t use them directly, only what it learned from them (at least in theory).
But this theft of artists’ work for training purposes opens some deep ethical issues with the entire concept of AI-generated art.
Not only did artists not give permission for their work to be used this way, but AI art can also tip over into what — if we were discussing human artists — would simply be called plagiarism.
If AI art generators are drawing “inspiration” from copyrighted works, how are they legal?
The short answer is “legal loopholes.” As often happens, technology is outpacing legislation.
How is this any different from artists taking inspiration from other art they see?
Artificial intelligences are generating artwork, not creating it.
As Logan Preshaw says in his excellent thread on AI art generation,
“Some people are also comparing AI training to human inspiration. In reality they have no parity.
Experts have already expressed at length that ML [machine learning] is fundamentally different to embodied biological agents. Those claiming they’re synonymous don’t know how ML works. Simple as that.
An AI also has a capability of scale that no human can possibly reach. We cannot analyze 5 billion images and output a dozen derivatives in seconds.
Humans are not a globally accessible web service owned by a company, and our remuneration is not funneled into a few individuals.”
If artists don’t want their work used this way, shouldn’t they just stop sharing it online?
Sharing work online is also how artists get work, so, no.
What are the ramifications of AI art for business owners and companies?
Small and mid-size businesses will likely jump on AI art generation like white on rice. Marketing departments everywhere will discover that giving an employee a couple of days to train on this new tech is cheaper in the long run than hiring artists or buying stock images.
Larger companies will likely use a combination of AI-generated art, creative assets they already own, and custom work from photographers, illustrators and artists as necessary.
What are the ramifications of AI art for artists?
Popular fantasy artist Greg Rutkowski’s name was used as an art-style prompt 93,000 times in a single month. (Michelangelo, Pablo Picasso and Leonardo da Vinci were each used as prompts 2,000 times or fewer.)
For artists like Rutkowski, being name-dropped in AI prompts means their own art gets buried under the imitations and is no longer findable by search.
For less well-known artists, having their name swamped isn’t quite as much of a concern, though even artists with smaller audiences might find their sales drying up if potential customers can hop over to an AI art generator and type in “fat woman in a bikini in the style of Shelby Bergen.”
And, of course, commissions — the bread and butter of many independent artists — and full-time jobs for artists will also disappear when people can get any image they want by simply typing a few words into a text box.
What about marginalized artists?
There are a couple of additional — and important — ramifications of AI-generated art for artists, illustrators and photographers who are marginalized.
The more marginalized an artist is, the harder it already is for them to make a living wage from their work (they’re less likely to be hired for commissions or full-time roles, for example), and AI-generated art has the potential to make this difficulty far worse.
For example, why hire a fat photographer or buy her stock images when you can ask an AI to produce your images and never have to interact with an actual fat person?
(Or a Black artist, or queer artist, or disabled artist, and so on.)
What are the ramifications of AI art for body liberation?
As companies steal artists’ work to train AIs and others begin to use these tools to avoid paying artists and photographers, commercial imagery will continue to openly stigmatize fat people’s bodies.
There just aren’t very many images of very large bodies out there, and most of the images that do exist are stigmatizing. The few of us creating fat-positive imagery simply can’t create enough work to balance out the vast amounts of fat-hating images created to serve diet culture.
Since most of the images of fat bodies that AIs use as inspiration are biased, their outputs will be stigmatizing too.
When fat and fat-positive artists have an even harder time making money from their work, fewer of us will exist, which means even fewer body-inclusive images will exist.
These factors affect us all. The more we see unrealistic, idealized people in advertising and the media, the more it makes us doubt the worth of our own bodies, skin colors, looks and orientations.
In addition, AI art will accelerate the use of non-aspirational bodies as negative symbolism. For an example, see this fascinating video on “Loab.”
(And what do you think will happen if you type, say, “unhealthy person” into an AI image generator? What kind of bodies might automatically be shown?)
What can I do to prevent AI-generated art from destroying the careers of marginalized artists?
As an individual, there’s probably not much you can do right now unless you have the ear of a tech company decision maker or a legislator in your state or country, but here are some actions you might consider:
- Support fat and other marginalized artists by sharing their (genuine) work, hiring them or supporting them financially.
- If you’re part of a marketing or art department, explain the ethical dilemma of using AI-generated artwork to your colleagues and push for hiring and paying artists.
- Buy stock images for your own business or personal use from reputable and inclusive sources.