It's fascinating to think about how quickly things are moving. It's only been five months since we first encountered DALL-E 2. OpenAI's approach to such tools, whether its image models or its language model, GPT-3, has been to control access carefully to slow down any potentially risky usage. But with so many options now available, OpenAI has announced users can upload faces to include in AI images – a concept many feel is ripe for abuse.
I've spent most of my time playing with Midjourney, as its Discord interface is an exciting way not only to test your own ideas but to watch others explore styles and prompts, which helps you learn how to 'tell' the Midjourney AI what you want to create.
One aspect of this access to collective wisdom is seeing keywords used to trigger stylistic choices that aren't directly associated with your prompt. It's one thing to ask for a "money tree", to take a deliberately simple example. It's another to add colour suggestions or exclusions, art styles, aspect ratios, and more to guide the AI towards the specific look and feel you're hoping for.
In some ways this is quickly becoming a new creative coding language. Yes, any phrase will get a result, but to achieve the specific image you can picture in your mind you'll need to iterate on an idea until you get the output you want to see.
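To illustrate what that iteration looks like, here's a rough sketch of how a prompt can grow from two words into something more directed. The `--ar` (aspect ratio) and `--no` (exclusion) parameters are Midjourney's at the time of writing and may change; the style keywords are just examples of the kind of guidance people layer on:

```text
/imagine prompt: money tree
/imagine prompt: money tree, watercolour, soft golden light --ar 16:9 --no red
```

The second prompt isn't guaranteed to be "better" – it's simply more specific, which is the whole point of treating prompting as a language you refine through trial and error.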
This leads to the use of style prompts that could cause a major legal conflict for these tools: artist names. If you ask an AI to draw you a picture "in the style of" a specific artist, are you entering the realm of infringement?
Taking my simple 'money tree' example, I offered up those two words to Midjourney and got the following result:
Seriously, how pretty are those trees? I really love this tool.
Then I asked for two more versions 'in the style of' two famous artists.
Here are upscaled versions of two of these trees, showing the added power of selecting from the options and getting the AI to go a little deeper.
So how does the AI know how to be 'inspired by' any given artist? Through the corpus of artworks that were fed into the AI to teach it about art. And that's where it gets very troubling, as one particular artist is arguing after finding his name has become a very popular prompt – far more popular than Picasso.
Greg Rutkowski has a gorgeous fantasy art style that nerdy AI art users would love to be able to use as a style guide. And whether you know the artist directly or not, that Midjourney process of watching other people's prompts can lead to learning new keywords that get great results.
But what if his work was scraped from 'public' image sources to teach an AI, so that people can now pay $10/month for a service that produces work 80% as good as the real thing when they use his name as a prompt? It doesn't seem like this will take long to be tested in court – it'll just take the right artist with the right financial resources.
The use of an artist's name feels like telling the AI to put a laser focus on a specific subset of the hundreds of millions of images it was trained on. If that subset is defined not just by a style but by an actual name, the granularity might achieve stronger results – but that's because the AI knows exactly what you're asking for.
Looking again at that Monet-inspired work, it even seems to feature an attempt at a Monet signature in the lower right corner. That's how closely trained the AI is on the specific artworks of each artist. It knows he signed his work in a certain way and follows the form closely.
This is far more than inspiration. It is a new kind of mechanical imitation. A digital forgery – and one that will only get better by the hour.
I really love these tools and I do expect to use them to illustrate more work on Byteside in future. They absolutely elevate my ability to use compelling imagery on a limited budget. But with these issues in mind, I'll be avoiding the use of named artists in my efforts. If I want a named artist, I should have to pay that human to do the work for me.