
16,000 Artists' Work/Style Stolen by Midjourney, Listed


Recommended Posts

You know, to this day I fail to understand how this is meaningfully different than a studio using references of shit they find on Google to make something "legally distinct." Same idea, "training" someone on art they don't own to make a commercial product. I've yet to hear a robust explanation on it being truly anything I should give a shit about.


15 minutes ago, Keyser_Soze said:

The real artists don't have to worry because they know how to draw hands, and the AI makes some disjointed abomination for hands.

The only thing I know about real artists is that none of them know how to draw hands!


2 hours ago, Xbob42 said:

You know, to this day I fail to understand how this is meaningfully different than a studio using references of shit they find on Google to make something "legally distinct." Same idea, "training" someone on art they don't own to make a commercial product. I've yet to hear a robust explanation on it being truly anything I should give a shit about.

 

Someone who cares more might be able to explain it better, but I’ve heard something like: the AI isn’t just looking at the art it’s trained on and being “inspired,” or even thinking “maybe I can tweak this concept into something new” as a human might, but more so copying pixels from existing art and then pasting them together with other stolen pixels to create its generated art. So if bits and pieces of it are literally just art from other people, that could be how it could hold up legally, but idk


8 hours ago, Xbob42 said:

You know, to this day I fail to understand how this is meaningfully different than a studio using references of shit they find on Google to make something "legally distinct." Same idea, "training" someone on art they don't own to make a commercial product. I've yet to hear a robust explanation on it being truly anything I should give a shit about.

 

These tools don't exist without this training data, and the tool is being sold. If you're going to make a tool and profit from it, you ought to compensate and get consent from the people whose work you used to build it.

 

Beyond that, people *do* have to be very careful when they use reference images. There's a huge amount of existing law working out when something is sufficiently novel. These image generators are not just building from a reference idea, though. Their objective is to outright replicate the training data.

 

I mean that very literally: the objective function used to train these models reaches its optimum when the model can perfectly replicate the training data.
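To make that concrete, here's a toy sketch of my own in Python (not anyone's production training code): with a reconstruction-style objective such as mean squared error, the loss bottoms out at zero exactly when the output reproduces a training image pixel-for-pixel. Real diffusion objectives are more involved, but the intuition carries.

```python
# Toy illustration, not Midjourney's actual code: a reconstruction-style
# objective like mean squared error hits its global minimum of zero only
# when the model's output matches the training image exactly.
import numpy as np

def mse_loss(generated: np.ndarray, training_image: np.ndarray) -> float:
    """Mean squared error between a generated image and a training image."""
    return float(np.mean((generated - training_image) ** 2))

rng = np.random.default_rng(0)
training_image = rng.random((64, 64, 3))   # stand-in for a real artwork

perfect_copy = training_image.copy()
slightly_off = training_image + rng.normal(scale=0.05, size=training_image.shape)

print(mse_loss(perfect_copy, training_image))   # 0.0 -- exact replication is optimal
print(mse_loss(slightly_off, training_image))   # > 0  -- any deviation is penalized
```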

 

Now we also strive to build models and methods such that we don't *only* replicate exact instances in the training data, but also generalize to new inputs (in this case, a language prompt). But the fact remains that the algorithm and method work as intended when the model *can* reproduce the training data exactly when prompted accordingly.

 

And as things stand now, many of these image generator models tend to fall back very quickly to the training data even when prompted with slightly different inputs. That is, they're doing a better job memorizing than generalizing.
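One crude way to probe for that kind of memorization (a hypothetical sketch of my own, not how any of these companies actually audit their models) is to compare a generated image against its nearest neighbor in the training set; a near-zero distance means the model effectively regurgitated a training example rather than generalizing.

```python
# Hypothetical memorization check: flag generated images that sit
# suspiciously close to some training image in raw pixel space.
import numpy as np

def nearest_training_distance(generated: np.ndarray,
                              training_set: np.ndarray) -> float:
    """Smallest mean squared pixel distance from `generated` to any training image."""
    diffs = training_set - generated                 # broadcast over the whole set
    distances = np.mean(diffs ** 2, axis=(1, 2, 3))  # one distance per training image
    return float(distances.min())

rng = np.random.default_rng(1)
training_set = rng.random((100, 64, 64, 3))          # stand-in for the real dataset

memorized = training_set[42] + rng.normal(scale=0.01, size=(64, 64, 3))
novel = rng.random((64, 64, 3))

print(nearest_training_distance(memorized, training_set))  # ~0: near-duplicate of a training image
print(nearest_training_distance(novel, training_set))      # much larger: genuinely new pixels
```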

 

 

6 hours ago, stepee said:

 

Someone who cares more might be able to explain it better, but I’ve heard something like: the AI isn’t just looking at the art it’s trained on and being “inspired,” or even thinking “maybe I can tweak this concept into something new” as a human might, but more so copying pixels from existing art and then pasting them together with other stolen pixels to create its generated art. So if bits and pieces of it are literally just art from other people, that could be how it could hold up legally, but idk

 

That's right. Hopefully my above comments help.


11 hours ago, Xbob42 said:

You know, to this day I fail to understand how this is meaningfully different than a studio using references of shit they find on Google to make something "legally distinct." Same idea, "training" someone on art they don't own to make a commercial product. I've yet to hear a robust explanation on it being truly anything I should give a shit about.

 

Adding on to some of the points already made above by legend, you ought to consider the data for what it is at a social level: it's not really just bytes of information. It's an atomized abstraction of human labor; it represents the product of someone's efforts, however tiny and unconscious they might be.

 

Then consider a world (not too far off now) where all economic activity is software-mediated, and production is driven primarily by the output of AIs.  By demonetizing the labor that powers these AIs, you are demonetizing production more generally, and thus essentially shrinking the broader economy for everyone except the owners of the mega servers that house and route the data.

 

Right now when that kind of thing happens we're okay with it because we've grown accustomed to the idea that we will get free services and benefits in return.  But we're already approaching the limits of that model.  At the end of the day, free services alone can't sustain a middle class.  It ultimately needs a growing income stream of some kind to adequately feed, clothe and shelter itself.


Thinking about it as more than direct inference, generative AI is skimming an ocean to construct an image that is a pastiche, both alienated from the source and entirely dependent on it. It's one million monkeys on one million typewriters. Basically the death of the author's intent for what will amount to low-value entertainment. Some of these AI types seem to think they'll be regaled with Hollywood productions, when a WALL-E future is most certain.


18 hours ago, unogueen said:

 It's one million monkeys on one million typewriters.

 

This is either incorrect or so reductive that it conveys no meaning. You can use a random search algorithm to "train" an image generator and you'll have a very bad time. The methods being used here are *vastly* more powerful.
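To illustrate the gap, here's a toy comparison of my own (nothing like a real image generator): randomly guessing pixel values barely improves even after a thousand attempts, while simple gradient descent on the same objective collapses the error in a few dozen steps. Diffusion models are trained with gradient-based optimization, which is why the monkeys-on-typewriters picture doesn't hold.

```python
# Toy comparison: random guessing ("monkeys on typewriters") vs. gradient
# descent on the same objective. Neither is a real image generator; the
# point is only how differently the two search strategies behave.
import numpy as np

rng = np.random.default_rng(0)
target = rng.random((16, 16))                  # tiny stand-in "image"

def loss(candidate: np.ndarray) -> float:
    return float(np.mean((candidate - target) ** 2))

# Random search: keep the best of 1,000 independent random guesses.
best_random = min(loss(rng.random((16, 16))) for _ in range(1000))

# Gradient descent: the gradient of the MSE w.r.t. the candidate is 2*(candidate - target).
candidate = rng.random((16, 16))
for _ in range(50):
    candidate -= 0.1 * 2 * (candidate - target)

print(f"random search, best of 1000 guesses: {best_random:.2e}")
print(f"gradient descent, 50 steps:          {loss(candidate):.2e}")  # orders of magnitude lower
```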

 

18 hours ago, unogueen said:

Some of these ai types seem to think they'll be regaled with hollywood productions,

 

"Generative AI" is not the same "AI." Generative AI has numerous limitations that will prevent it alone from ever realizing the dream of what we want "AI" to be, but not all AI paradigms have those same limitations, they're just harder problems to solve, and the progress in generative AI methods can still play a *part* of a bigger system or at least inform future work.

 

18 hours ago, unogueen said:

when a wall-e future is most certain.

 

It is not.

 

 


I'm sure we'll have no shortage of court cases this year litigating this exact issue, but I don't expect the artists on this list to win most of them.

 

US copyright law is pretty obsessed with the idea of a copy. I expect all the AI companies to lose some battles over the original copies they trained on, but not to have to pay royalties going forward for generated content. Downloading a torrent of thousands of books is a pretty clear-cut copyright violation. Building a legal library of data to train on will probably become more expensive and time consuming, but I just don't see a world where the end state is that artist X gets paid whenever their training data is utilized to generate something.


42 minutes ago, TwinIon said:

I just don't see a world where the end state is that artist X gets paid whenever their training data is utilized to generate something.

 

I agree that model won't work. The logistics are too difficult to track. I also think open-source models may need different rules from models/services that are sold.

 

But for the sold-service setting, I think a better model is just opt-in consent. Set up a marketplace where artists, photographers, etc. can sell their work at a fixed price. I think a lot of artists would consider that, especially because as the tech gets better, generative images will become a more useful tool for artists and they'll want it to work well.

 

But figuring out an economic model isn't really my specialty, so that might not be great either. But in any case, I'm pretty sure people can find a way to make this work. Giving up because you cannot just steal would be utterly ridiculous :p 


On 1/3/2024 at 12:01 PM, Keyser_Soze said:

The real artists don't have to worry because they know how to draw hands, and the AI makes some disjointed abomination for hands.

 

Is it some mess in the algorithm where it constantly uses Rob Liefeld art assets for hands and feet?

