The future of AI is ethical training sets
I am an artist who loves AI art
filed under "Artificial Intelligence"
by WFL
Let me say this first: AI isn't going anywhere. That barn door was opened a long time ago, and there's no getting the horses back in now.
As artists, we need to get our heads and hearts wrapped around that.
I saw the utility of AI art a long time ago when I started messing with Stable Diffusion 1.0; it provided an accessible tool for creating new and unique visual feasts, and AI has only grown since then. I've used it to help code (Copilot in VSCode is a HUGE timesaver for large-scale projects). I've used it to analyze text to see if my key points are coming across. Of course, I've also used it to create artwork in both professional and personal settings.
My biggest sticking point, though, is how models are trained. Initially I was ambivalent; I loved the idea of AI learning from the collective works of humanity, but most folks weren't comfortable with it, so I opted to respect that.
Where do we go from here, though? We can't get rid of AI, and at best we can hope that industries adopt it as a tool rather than a replacement for artists (of all stripes). That's just the bare minimum, however.
I firmly believe that the success of AI and the acceptance of it by the art community will come in the form of ethically-trained models.
Opt-in, rather than opt-out, is key.
The image in the hero was generated (partially) using Adobe's Firefly 3 beta, which I have been evaluating for use in professional settings (plus, I've been listening to a lot of Carpenter Brut lately and wanted something fun for a new desktop wallpaper).
I've used Adobe's Firefly 2 in some limited capacities; it was great for a rush project where we combined my art with another artist's skills and supplemented it with AI in order to get it out the door on time. It's not perfect, but it worked for what we needed.
Adobe is getting close to having the right idea with their model: their source material is their own stock images that users sell through them (plus some opt-in folks as well as public domain imagery). Some of the stock images in Adobe's library, unfortunately, were generated using other AI image generators such as Midjourney, which don't have ethically-sourced models backing them.
Of course, Adobe's AI model doesn't even provide an opt-out for users who list their works via Adobe's stock image service, so it's not the shining beacon of ethically-trained AI that we'd like to see, but it's getting closer.
We - as artists - need to advocate for more ethically trained models, but we should also consider allowing our works to become a part of AI model training sets. Through compromise, we can help ensure the future of AI is a positive one rather than a negative one.
The industry is shifting, and we need to change with it. Much like when photography became an accessible reality, AI is becoming a part of the landscape, and traditional (yes, I'm lumping digital in with "traditional" too) artists will always have a place in it.
We just have to be a part of its development, rather than an ineffective voice of opposition.
Oh, and we need to not let the crypto-bros take it over, because fuck that noise.