This is not about any specific case. It’s just a hypothetical scenario that popped into my head.

For context, in many places it is required to label AI-generated content as such; in other places it is not required but is considered good etiquette.

But imagine the following: an artist is going to make an image. The normal first step is to search for references online and then draw while taking reference from them. But this artist cannot find proper references online, or maybe wants to experiment, and decides to use a diffusion model to generate a bunch of AI images for reference. The artist then proceeds to draw the image using the AI images as references.

The picture is 100% handmade; every line was drawn by hand. But AI was used in the process of making it. Should it carry some kind of “AI warning label”?

What do you think?

  • Koolio [any]@hexbear.net · 1 day ago
    Pardon the term, but to me the hysteria about AI-generated content as some sort of cooties that taints whatever it touches is silly.

    AI content is generally not art because it lacks the same intentionality in an ontological sense, in the same way that pretty patterns occurring in nature are not art. You can be inspired by whatever; I’d just call you a shit artist if you trace some AI content and call it your art.

    You could argue that it is possible for it to be art, in the same way that guided natural processes can be art. Using tools does not automatically make something not art, but looking like art also does not necessarily make something art: art is an interplay between an artisan and their tools to shape the world with intentionality. I think it’s just a higher bar to clear with tools that could plausibly make some of the “creative decisions” themselves.