Calin Baenen

Demonstrating AI's Innocence Through Food

Do you think AI image-generation is scary?
Do you believe AI image-generation is art-theft?

Well, let's talk about that!


I got the idea for this post when Kat-the-Hylian on DeviantArt made this brilliant comment:

Or using a microwave dinner and calling yourself a "chef."

This inspired me to write an article that compares AI, AI-generated images, and the ethics thereof to food, in some way.


What is AI?

Let's take a second to quickly recap what AI is.

You may be familiar with terms such as "AI images", "AI-generated", et cetera, but do you know what "AI" means?
Well, "AI" stands for "artificial intelligence".

What is (an) artificial intelligence?
An artificial intelligence is a program capable of replicating some facet of human intelligence, such as learning or providing the most relevant information.

In the context of image creation, we can treat an artificial intelligence as any program that seeks to replicate the part of the human brain which allows us to draw (well), including the motions our hands make.

How does an AI make new images?

An image-generation program that uses artificial intelligence is typically trained on images paired with captions or visual descriptions so it can make a connection between the descriptors and the abstract figures in the image.

These images (and their paired captions) are called "training-data".

Those who maintain the AI can get training-data in a number of ways, including making their own training-data or purchasing training-data from others.

Think about it like this:  it's like someone taking store-bought food (which is already assembled/prepared) and trying to make a new meal out of it.
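
To make the idea of training-data concrete, here is a minimal sketch (in Python) of what an image-caption dataset might look like before any training happens. The file names and captions are hypothetical, made up purely for illustration; this is not the code of any specific image generator.

```python
# A minimal, hypothetical sketch of training-data for an image-generation AI:
# each entry pairs an image with a caption describing what the image shows.
# (File names and captions are made up for illustration.)

from dataclasses import dataclass

@dataclass
class TrainingExample:
    image_path: str   # where the image lives on disk
    caption: str      # human-written description of the image

# The "training-data": images paired with their descriptors.
training_data = [
    TrainingExample("images/cat_on_couch.png", "a gray cat sleeping on a red couch"),
    TrainingExample("images/bowl_of_soup.png", "a steaming bowl of tomato soup on a wooden table"),
    TrainingExample("images/city_at_night.png", "a city skyline at night with glowing windows"),
]

# During training, the model repeatedly sees these (caption, image) pairs and learns
# to associate the words in each caption with the shapes and colors in its image.
for example in training_data:
    print(f"{example.caption!r} -> {example.image_path}")
```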

Are AI-generated images stolen?

Not inherently.

Maybe you've heard about a scandal involving AI stealing people's images.

Is it true that some who maintain an AI can, and may, steal art?
Yes – it is certainly a possibility.

Does this mean everyone who maintains an AI will steal art?
No – those who maintain an AI have the option not to steal.

Let's think about this using grocery stores.

There isn't The Grocery Store™ that you go to once a week — likely, instead, you find yourself at one or more stores, such as Target, Walmart, or Cub Foods.
In the same way, there is no one artificial intelligence — you have DALL·E 2 by OpenAI, Midjourney by... Midjourney, Inc., et cetera, et cetera.

If Walmart was found to be stealing stuff, Target wouldn't get the blame.
We must understand AI technology (and its branding) the same way — we should view each AI in isolation from the others (except ones with the same maintainer(s)).

Are AI-generated images art?

The Oxford English Dictionary defines "art" as “the expression or application of human creative skill and imagination, typically in a visual form such as painting or sculpture, producing works to be appreciated primarily for their beauty or emotional power”.

By this definition, AI-generated images do not meet the qualifications to be "art".
However, they can still be the basis for (new) art.

[NOTE:  Just because you enter a prompt and upload the best looking result does not mean you are an artist.]

Are AI-generated images original?

In the comment that inspired this post, Kat-the-Hylian insinuates that taking an AI-generated image, uploading it, and calling yourself an artist is equivalent to buying a microwave dinner, heating it up, and calling yourself a chef — I believe this is an amazing comparison.

Could you modify the microwave dinner?
Of course, but this still does not necessarily make you a chef.

Now, if you get a microwave dinner with some mashed potatoes and some steak strips and turn that into cheesy mashed potatoes with bacon and seasoned steak strips with gravy, then maybe we can start discussing your chef status.

The line between not being modified enough and being modified adequately is blurry, and mileage may vary.
(I believe this is in part because what is adequate is subjective.)

Who owns the copyright to AI-generated images?

It depends – and there is currently no legal standard in place.    [24/05/06]

If the AI's training-data is entirely open-source or royalty-free, then the copyright should belong to no one.
If the AI's training-data is composed entirely of proprietary or licensed images, then copyright should belong to the person(s) who made those images (as appropriate).

Copyright only gets more complicated as we consider the possible modifications that could be made to the generated image.

Will artificial intelligence take my job?

No.
AI will not take your job.

Let's say AIs are restaurants (or microwave dinners) and artists are home-cooks who like to make homemade meals for their family.

Will the family want to go out to eat every now and then at their favorite restaurant?
Of course, but fast food probably won't outright replace your home cooking.

A Safe(r) AI

It may not be perfect, but I think I know a decent AI.

If AIs were restaurants, Leonardo AI would be a buffet.

Why?
Because Leonardo lays out all the food for you, and you get to pick what you eat.

Trained on LAION-5B, Leonardo allows you to compile your own training-data and generate your own images using that.
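
If you want a picture of what "compiling your own training-data" might look like in practice, here is a rough, hypothetical Python sketch that gathers images you made yourself and writes a caption manifest that a fine-tuning tool could consume. The folder layout, file names, and manifest format are assumptions made for illustration; this is not Leonardo's actual API or upload format.

```python
# Hypothetical sketch: collect your OWN images and captions into a manifest
# that a fine-tuning / custom-model tool could consume.
# The folder layout and JSON format are assumptions, not Leonardo's real format.

import json
from pathlib import Path

MY_ART_DIR = Path("my_artwork")             # images you personally created
MANIFEST_PATH = Path("training_manifest.json")

def build_manifest(art_dir: Path) -> list[dict]:
    """Pair each image with a caption from a same-named .txt file, if present."""
    manifest = []
    for image_path in sorted(art_dir.glob("*.png")):
        caption_file = image_path.with_suffix(".txt")
        caption = caption_file.read_text().strip() if caption_file.exists() else ""
        manifest.append({"image": str(image_path), "caption": caption})
    return manifest

if __name__ == "__main__":
    manifest = build_manifest(MY_ART_DIR)
    MANIFEST_PATH.write_text(json.dumps(manifest, indent=2))
    print(f"Wrote {len(manifest)} training examples to {MANIFEST_PATH}")
```

Because every image in a manifest like this is something you made yourself, the resulting training-data sidesteps the theft question entirely — which is the point of the buffet analogy above.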


Overall, I think that while AIs can do harm, and come with a lot of confusing ethical and legal implications, this technology is the future, and it can be a helpful tool to aid in the creation process (when used correctly and in good faith).

[Hell, even this post's cover-image is a modified AI-generated image saved as a low-quality JPEG.]

Thanks for reading!
Cheers!

Top comments (5)

Alvaro Montoro

Is AI art theft?

No, not inherently.

Unless the creator had a license to use all the images it used for training the system – which most likely wasn't the case – they used intellectual property without permission, which is theft. Even if the prompt-generated image is "original", its source is based on stolen content.

I don't see the grocery store analogy as correct. I think it's more like a shoplifter stealing meat from the grocery store and selling it outside. The final consumer may be happy that they got cheap steak, but deep down they have to know that what they bought is stolen food. Maybe some people will say "the grocery store is big and can afford losing some money to theft", which may be true, but what happens when the meat comes from a small local butchery instead of a large store? Can they afford it? Because that's basically what all big AI companies have been doing.

Calin Baenen • Edited

Unless

Did you miss "inherently"?
It is a keystone word in what I say here.

AI-generated images are not inherently stolen — training-data is not inherently stolen because it can come from anywhere.

Unless the creator had license to use all the images

Or owns the images themself, or the images are royalty-free, or they got permission to use the image – which may, admittedly, involve a license.

I don't see the grocery store analogy correct.

It's not that it's incorrect — allow me to clarify the example.

When you refer to "the grocery store analogy", you're referring to when I say “If Walmart was found to be stealing stuff, Target wouldn't get the blame.” and the text leading up to it.

In this case, the analogy is correct — I am saying that one AI (program) is not the same as any other AI (program), and that no one AI represents all AIs.

If DALL·E 2,  for example,  stole art but Leonardo AI didn't, then Leonardo shouldn't face the same ridicule under (the more broad) "AI" umbrella.

The final consumer may be happy that they got cheap steak, but deep down they have to know that what they bought is stolen food.

You're becoming attached to the idea that AI training-data is always going to be stolen, even if that's not necessarily true.

I feel like the fixation on this idea skewed the meaning of the point I was trying to make.

but what happens when the meat comes from a small local butchery instead of a large store?

Well, AIs take from everyone, indiscriminately, and in small portions — it would be a lot like taking small cuts of meat from every corporate grocer and every local butcher in town, mincing them, and mixing them together.

Because that's basically what all big AI companies have been doing.

Do big AI companies take things that aren't theirs?
Yes – they do.

Do big AI companies represent all AI companies?
No – in much the same way the deli at a Cub Foods doesn't represent all [delis / butchers] that exist.

Alvaro Montoro

I didn't miss it. I disagreed with it. Training data coming from "anywhere" is turning a blind eye to what companies are doing: stealing data for training. A practice that they have publicly admitted (e.g. when the NYT sued OpenAI for copyright infringement). Content being on the Internet doesn't mean that the content is free to take. It is still the intellectual property of someone and it has copyright. Even this comment, or your comment, or this post are intellectual property and cannot be used without permission.

And yes, there may be some good actors out there that only use their own content to train their AI, but I'm positive (although I don't have any data to support this claim) that they are in the minority, incredibly outnumbered by the bad actors. And until there is some regulation and transparency about what content is used/extracted for training, all AI companies are suspected of theft.

Maybe I'm too radical about it 😅

Calin Baenen

I didn't miss it. I disagreed with it

Based on this, I am not fully confident you understand the meaning of "inherently".

Bread isn't inherently moldy, but it can be — AI training-data isn't inherently stolen, but it can be.
If bread were *inherently* moldy, all bread would have mold — if (all) AI training-data were *inherently* stolen, (all) AI training-data, even if it was made by the person developing the AI, would be stolen.

What am I getting at here?
When X is inherently Y, it means that Y is X's default or natural state.

I believe it is worth reevaluating what I said with this context in mind.

Training data coming from "anywhere" is turning a blind eye into what companies are doing

No.
The assumption that the training-data comes from anywhere just tells us that we can't make many assumptions about it, because we don't know what's in it.

When your family members give you a gift that is wrapped, you are not instantly suspicious that it was stolen, are you?    /rht
I doubt you have such thoughts – you just trust that they are not doing wrong because you haven't opened the box yet.

Content being on the Internet doesn't mean that the content is free to take.

I think you are lost — I made this point in my post.

Even this comment or your comment or this post are intellectual property and cannot be used without permission.

I am aware.

there may be some good actors out there

There we go.

This is literally the point of the post – just to show that AI is not *inherently* evil.

that they are the least of them, incredibly outnumbered by the bad actors

I don't doubt that, but the point is that not all AI-generated images are evil, stolen, or whatnot, as that is an over-generalization of what is happening behind the scenes.

until there is some regulation and transparency about what content is used/extracted for training, all AI companies are suspect of theft

That is a very pessimistic way of thinking.

If your family gives you a gift, but not the receipt for the item they purchased for you, are you going to assume your family stole it?

Maybe I'm too radical about it 😅

No, it's just that you're stuck in the mentality of "Because this is the way it is now, and because most things are like this, all things will be like this," and you have to try to see things from a nuanced perspective.