With greater use of artificial intelligence across influencer marketing and social media, media agencies are having to prioritize transparency and tighten brand safety measures when working with influencers.
Content creators and agencies are continuing to experiment with everything from generative AI images and sounds to social media filters and AI influencers – which brings more ethical and legal considerations when publishing content. For agencies and their clients, failing to disclose the use of AI content can damage a brand's reputation. And for influencers, it could hurt their authenticity and engagement with followers.
“The biggest consideration for an agency, and the partners we engage with, is always around the attribution concerns of AI,” said Alexis DeBrunner, associate strategy director at IPG agency R/GA.
Ben Jeffries, CEO of influencer marketing company Influencer, added that audience trust could decline if followers perceive content as “manipulative.” Jeffries pointed to risks such as deepfake technology, misinformation or harmful biases and stereotypes that could be perpetuated by AI-generated content.
“Brands must strike a balance between using technology to enhance their influencer campaigns and maintaining the human element that makes them truly resonate with consumers,” Jeffries said. “After all, AI is only as unbiased as the data it’s trained on, and if that data is flawed or discriminatory, the content it produces will be, too.”
Disclosures when using AI
Many influencers are employing AI in a variety of ways – from video filters and shorts generators to translation tools and strategies for analyzing people’s behavior and preferences to produce personalized content. Some 94.5% of creators surveyed in the U.S. this year said they use AI for content editing and generation, according to The Influencer Marketing Factory.
Media.Monks teams are incorporating measures into the contract process with influencers when discussing AI usage. Amy Luca, evp and global head of social at the agency network, said the legal contracts conversation will cover transparency and disclosure when using AI in content.
Influencers have to be “very clear and tacit about the ground rules for how that content is created,” Luca told Digiday. “I think that’s something we’re going to see a lot more from an ethics perspective, but more importantly from a brand safety and brand transparency perspective. Because at the end of the day, it only takes one influencer to produce content that is not authentic or is done through AI and for that to be found out.”
For example, one stipulation for brands and influencers selling a cosmetic product, such as mascara, is that they cannot use fake eyelashes. The same goes for skincare products like foundation, where clauses bar influencers from using filters or augmented tools to alter skin tone or appearance, in order to preserve authenticity.
“They have to be their real lashes,” Luca said, “because we want to show that mascara could do the job, not something that’s false like false lashes … At the end of the day, it kind of comes down to the fundamentals of how that content is created and what messages they’re putting out there.”
Because generative AI is still in its early days, Luca believes some issues will inevitably come up – but the "core legal framework" they have established, along with transparency about AI use, will be critical. "I think AI is that new frontier for brands – where if they do employ AI, they have to do it in a responsible and ethical way so that they don't erode long-term brand equity and trust. It's a huge risk for future brands."
Arielle Carter, group vp of consumer and content experience at Razorfish, also said that disclosing the use of AI will become more important in influencer marketing. Carter noted the influencers they work with have already been using a variety of AI-generated sounds, filters and editing on social media, but the development of generative AI will require more industry standards and collaboration.
“With the filters and the sounds and everything, I think that’s already kind of a given and not necessarily something where we’re currently being asked to give disclosures around that – but that’s where that governance comes in for generative AI and thinking about the future of leveraging AI with creators,” Carter explained. “It’s definitely a new frontier that we’re all going to have to navigate together, but that governance is definitely going to be a key first step to work as an industry to get established.”
Beyond legal measures, CMI Media Group is also focusing on precise targeting and comment moderation for campaigns to protect brands. Bianca Blando Kroupa, associate director of paid social, said CMI works closely with legal and privacy teams on new products and offerings involving content creation.
“We take that even one step further to any content that they’re a part of actually goes through our full med legal review,” Kroupa said. “They’ll review every piece of it for accuracy to make sure … that we are only making claims that are totally accurate.”
The future of AI in influencer marketing
As more influencers turn to AI not just for content creation but for scaling and business efficiency, Ryan Detert, CEO of influencer marketing company Influential, said the technology allows them to rapidly grow their businesses. Detert said his company ensures that data is consented to and anonymized, using AI brand safety tactics to protect clients – a safety mechanism implemented seven years ago.
“Many are also using AI to generate or inspire creative concepts, creative briefs, or customer and creator support tickets,” Detert said. “To us, AI is a tool just like a calculator, a force multiplier to get more efficient.”
Luca at Media.Monks also said there is an opportunity for celebrities and influencers to license their likenesses or voices for AI integration – sort of like cloning themselves. These can be turned into live events, content and experiences that people can interact with, beyond social media.
“There’s something really interesting about that from a living history perspective,” Luca said.
Another business opportunity for influencer marketing is using AI to produce content that reaches more people across different markets and languages. With advances in AI and translation tools, including large language models, influencers can have their content and voice synthesized into many languages.
“With the dawn of AI and large language models, you’re able to retain such high fidelity with the actual context of the word itself,” said Michael Balarezo, global vp of enterprise automation at Media.Monks. “And you transcribe the actual text and then you can have the machine read in a synthesized voice, the translated copy itself – and that can all be automated today.”
Some AI is also capable of syncing the person’s lips to the spoken word that is analyzed in the video, Balarezo said. “I always say this AI technology today is the worst it’s ever going to be. It’s only going to get better.”
And ironically, AI can even be leveraged to mitigate some of the risks created by AI content. For instance, as Jeffries at Influencer explained, AI can help with brand safety by analyzing the uploaded content or captions and providing insight on how the information will be perceived by different audiences, be it region, age or other demographics.
“This extra dose of insight can help brands make necessary tweaks to ensure that their content is appropriate and have the sort of impact they’d imagined,” he added.