Want to monetise generative AI tools effectively? Then make them safe for creators
Adobe wrapped its MAX creativity conference on Wednesday after a slew of product announcements for creators.
Much like last year’s MAX event, artificial intelligence was front and centre. Many of the demos focused on tools powered by Adobe Firefly, the company’s family of generative AI models.
These included updates to its image, vector, and design models, with improvements spanning quality, speed, variety, and customisation.
However, these improvements were largely overshadowed by the unveiling of Adobe’s new Firefly video model.
On the surface, this AI video generator resembles others on the market, with support for text-to-video and image-to-video generation.
It was not the features that made this announcement stand out, however, but the claim that accompanied it. In unveiling it, Adobe stated that this was “the first publicly available video model designed to be safe for commercial use”.
Adoption will rise if AI tools respect creator values
Generative AI tools have an adoption problem with creators, especially among more experienced ones.
Casual creators playing around with AI video generators may be oblivious to the legal dangers and ethical questions they pose. Some may not even care.
However, experienced creators are more cognisant of the threat posed to their work and the wider creator community. Some may be reluctant to publish work made with generative AI tools if there is any risk that they are capitalising on the creativity of a peer who has not been compensated (or vice versa).
After all, creators care about what other creators think. When asked what matters most to them, 37% of advanced video creators said it was “being recognised and respected in my scene / by my peers” (MIDiA Video Creator Survey, Q2 2024). This compares to 24% of beginners and 27% of intermediates.
Because Adobe has trained its AI video generator only on content it has the right to use, creators can experiment and produce with confidence.
This is crucial for Adobe, not just from an operational and monetisation standpoint but from a reputational one as well. After all, this could have been a very different MAX 2024 – one overshadowed by the summer’s concerns about Adobe’s approach to AI.
To recap, Adobe was pressured to clarify its terms and conditions in June, after creators using Adobe apps such as Premiere Pro and InDesign raised concerns that their work, and the processes behind it, were being used to train AI models.
Responding in a blog post, Adobe said creators owned their content and it would never be used to train any generative AI tool. It added that Adobe Firefly would only be trained on a dataset of licensed content where it had permission, such as stock images and video in Adobe Stock.
Don’t just sell features, sell values
There is an important lesson to be learned here for all AI tool providers.
Generative AI’s ability to create movie-grade output in a matter of minutes makes for compelling marketing. Yet, it will gain little traction with creators if it does not pass muster on questions of ethics, responsibility, and fairness.
As Adobe has rightly identified, creating a video model that is “safe for commercial use” is about more than just legal compliance.
It is a monetisation opportunity.
As creator awareness grows about how generative AI models are trained, those that respect creator copyright ownership will trump models that do not.
Creativity is about freedom of expression. When done right, AI tools can expand the boundaries of what is possible – but only if creators feel safe using them.