Adobe explains how it created an AI platform without stealing artists’ work


This week Adobe announced its own foray into generative AI with Adobe Firefly, which can generate images, audio, videos and more from text prompts. But perhaps most interesting is that Adobe has been transparent about what Firefly is trained on.

Adobe made the big announcement at its annual Summit in Las Vegas this week, saying Firefly will be available across Adobe Experience Cloud, Document Cloud, Creative Cloud and Express.

During the keynote speech, the company revealed that Firefly has been trained on openly licensed and public domain content, as well as Adobe Stock images.

Adobe made a point of stressing that it strives not to use the intellectual property of artists or creators who haven’t given consent. This has been a major problem in the AI-generated art space, with apps such as Lensa receiving significant backlash over the practice.

Interestingly, the company is also looking to let artists train the system on their own work so they can quickly generate content in their own style. This could, of course, allow someone to train the AI to generate work in another artist’s style, and Adobe has admitted this is something that will need to be looked into.

On the flip side, Adobe also said that, in future, it will look into paying creators who contribute their work to the Adobe Stock library used for training.

“The Firefly training data set is an area that we took a very different approach than other players,” Alexandru Costin, vice-president of Generative AI and Sensei ML platform at Adobe, said in an interview with SmartCompany.

“All the scraped stuff we didn’t touch it because we actually care. These are our customers… we wouldn’t want to impact them in a negative way.”

According to Costin, Adobe Stock has 300 million assets to draw from, each of them looked at by human curators for approval and to ensure no trademarked content is added to the library.

“For the new stuff we’re bringing in, public domain from museums etcetera, we are building filtering systems to make sure whatever we ingest we check against some of those concerns.”

In addition to intellectual property and trademarked content, this system will also check for pornographic content.

“This gives us the confidence that the model not only will be high quality, but will reduce the risk of generating involuntarily trademarked or copyrighted content.”
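
To make that concrete, here is a minimal sketch in Python of what a pre-ingestion filter of this kind could look like. It is illustrative only: the classifier scores, thresholds and source labels are hypothetical stand-ins, and Adobe has not published details of its actual pipeline.

```python
# Illustrative only: a simplified pre-ingestion filter in the spirit of the
# checks described above. The classifier outputs and thresholds are invented.

from dataclasses import dataclass


@dataclass
class Asset:
    asset_id: str
    source: str              # e.g. "adobe_stock", "public_domain"
    trademark_score: float   # 0..1, from a hypothetical trademark detector
    nsfw_score: float        # 0..1, from a hypothetical NSFW classifier


# Thresholds and allowed sources are made up for illustration.
TRADEMARK_THRESHOLD = 0.2
NSFW_THRESHOLD = 0.1
ALLOWED_SOURCES = {"adobe_stock", "public_domain", "open_license"}


def passes_ingestion_checks(asset: Asset) -> bool:
    """Return True only if the asset clears provenance, trademark and NSFW checks."""
    if asset.source not in ALLOWED_SOURCES:
        return False
    if asset.trademark_score > TRADEMARK_THRESHOLD:
        return False
    if asset.nsfw_score > NSFW_THRESHOLD:
        return False
    return True


candidates = [
    Asset("img-001", "public_domain", trademark_score=0.05, nsfw_score=0.0),
    Asset("img-002", "web_scrape", trademark_score=0.0, nsfw_score=0.0),  # rejected: scraped source
]
training_set = [a for a in candidates if passes_ingestion_checks(a)]
print([a.asset_id for a in training_set])  # ['img-001']
```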

Adobe is also expanding its Content Authenticity Initiative, which adds a cryptographic signature to a piece of content to track any changes and verify its authenticity.

It will now also mark anything created by Firefly as AI-generated content so users can make informed decisions about whether to use that asset.

“And then in the same light we announced the ‘do not train’ tag which will retain inside the content itself a metadata piece that includes the artist’s declaration if he or she is or is not okay with other companies training on their content,” Costin said.
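
Adobe hasn’t spelled out the underlying implementation, but the Content Authenticity Initiative is built around signed provenance metadata. The toy Python sketch below shows the general idea only: a signed manifest records a content hash, an AI-generated flag and a ‘do not train’ declaration, so any edit to the content breaks verification. The signing key and field names are made up for illustration and bear no resemblance to the real certificate-based scheme.

```python
# Toy sketch of signed provenance metadata. Not the actual Content Credentials
# format; it only demonstrates how a signature ties metadata to the content.

import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key-not-a-real-credential"  # placeholder for illustration


def attach_credentials(content: bytes, ai_generated: bool, do_not_train: bool) -> dict:
    """Build a provenance manifest for the content and sign it."""
    manifest = {
        "content_sha256": hashlib.sha256(content).hexdigest(),
        "ai_generated": ai_generated,
        "do_not_train": do_not_train,
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return manifest


def verify_credentials(content: bytes, manifest: dict) -> bool:
    """Check the content hash and signature; any edit to the content breaks both."""
    expected = dict(manifest)
    signature = expected.pop("signature")
    if expected["content_sha256"] != hashlib.sha256(content).hexdigest():
        return False
    payload = json.dumps(expected, sort_keys=True).encode()
    return hmac.compare_digest(
        signature, hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    )


image_bytes = b"fake image data"
manifest = attach_credentials(image_bytes, ai_generated=True, do_not_train=True)
print(verify_credentials(image_bytes, manifest))            # True
print(verify_credentials(image_bytes + b"edit", manifest))  # False: content was changed
```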

One of the biggest problems with AI tools being pushed out so quickly has been their limitations and bugs. According to Costin, Adobe has put Firefly through stringent internal testing and feedback to ensure it works well and doesn’t fall into the usual generative AI traps.

“We’ve also built detectors to make sure we intercept biases and then we change the prompt to reduce the bias,” Costin said.

“If you put in ‘person washing dishes’ or ‘nurse’ you don’t just get what you would expect the stereotype to be. You get diverse skin tones, you get diverse ages, you get diverse genders, because that’s what it should be. We shouldn’t perpetuate harmful stereotypes.”
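
As a rough illustration of the kind of prompt interception Costin describes, the Python sketch below augments a prompt that mentions a person but specifies no attributes with randomly sampled skin tone, age and gender, so a batch of generations varies. The term lists and attribute pools are invented for the example; the production system would be far more sophisticated.

```python
# Illustrative only: a toy prompt-rewriting step that adds sampled demographic
# attributes when a prompt names a person without specifying any.

import random

PERSON_TERMS = {"person", "nurse", "doctor", "teacher", "engineer"}
ATTRIBUTE_TERMS = {"young", "old", "man", "woman", "dark-skinned", "light-skinned"}

# Hypothetical attribute pools used for augmentation.
SKIN_TONES = ["light-skinned", "medium-skinned", "dark-skinned"]
AGES = ["young", "middle-aged", "elderly"]
GENDERS = ["man", "woman", "non-binary person"]


def debias_prompt(prompt: str, rng: random.Random) -> str:
    """If the prompt names a person but no attributes, append randomly sampled ones."""
    words = set(prompt.lower().split())
    mentions_person = bool(words & PERSON_TERMS)
    already_specific = bool(words & ATTRIBUTE_TERMS)
    if mentions_person and not already_specific:
        attributes = f"{rng.choice(SKIN_TONES)} {rng.choice(AGES)} {rng.choice(GENDERS)}"
        return f"{prompt}, depicted as a {attributes}"
    return prompt


rng = random.Random(0)
for _ in range(3):
    print(debias_prompt("nurse washing dishes", rng))  # varies across the batch
```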

Costin also acknowledged that this will be an ongoing journey for Firefly.

“These models are big, they have personality and they learn from the data. There will be aspects we cannot control, train or test against. This is why we have an amazing immediate feedback mechanism with the beta community and our employees,” Costin said.

“We have a team responsible at the back of the system that takes all this feedback and continuously refines the algorithms to make sure we really continue to remove this bias.”

The author travelled to Las Vegas as a guest of Adobe.
