Psembi Kinstan x The Australian: AI already generating plenty of concern for content providers

Last week, an allegedly AI-generated song went viral, featuring “fake” vocals by musicians Drake and The Weeknd.

The song was first released via the account of a TikTok user by the name of Ghostwriter, before making its way on to music streaming platforms such as Spotify and Apple Music.

After amassing millions of listens, the song was taken down from the platforms at the request of the artists’ label, Universal Music Group, which cited concerns over AI’s potential infringement of copyrighted music.

The incident provides an insight into the potential issues content creators and content owners of digitally available assets may face as generative AI technologies mature at pace.

The rapid proliferation of generative AI tools has creative agency leaders in Australia approaching the technology with vigilance.

Many creative agency leaders agree that AI, when deployed with appropriate guardrails, can help eliminate rote tasks and drive efficiencies within teams, freeing up precious time and resources.

Industry leaders caution, however, that if quality, distinctive creative work is ultimately the growth-driving goal for a business, generative AI remains limited and is no replacement for human creativity.

In the five months since OpenAI made its generative AI tool ChatGPT available as a prototype for public use, conversations within advertising agencies have come to cover not only its maturing creative capabilities but also questions around content “ownership”.

The question of ownership is being discussed globally in relation to content generated by AI, and the information it uses to create content or creative assets.

Psembi Kinstan, executive creative director at advertising agency DDB Group Melbourne, told The Growth Agenda that while drawing inspiration from the arts and artists is not new, generative AI may pose a new threat to original creative work and company- or brand-owned assets.

“Sometimes, however, ‘drawing inspiration’ can look a lot more like ‘stealing’ when a lazy agency doesn’t add enough of their own creativity – or pay for the privilege of IP they’re referencing,” Mr Kinstan said.

AI has made this easier than ever, and Mr Kinstan argues that content creators must take responsibility for using the technology tactfully.

“A brand, or agency, could type in ‘Image of a can of X brand dog food in the style of Warhol’ and seconds later have a stunning high-res visual for a poster campaign. That’s clearly a problem if you’re the Warhol estate. But even more of a problem if you’re an emerging artist with little recourse for action,” he said.

“Depending on how creatives and marketeers are getting their AI-generated imagery, they might not even realise that they’re ripping off the style of a particular artist. And if it’s not an accident, it’s still easier than ever for perpetrators to hide behind the murkiness of the database the AI is drawing from.

“Agencies need to understand they’re more likely to be perpetrating the problem than being the victim. Good agencies and creative leaders already have commonsense principles to ensure this doesn’t happen.

“And another question to ask: do we need to be worried that our brand assets will be ripped off? Yes, absolutely. But should this worry our big brands? Perhaps not. After all, it might actually drive their success as well.”

He pointed, for example, to a 2019 campaign by UK-based creative firm Mother London for fast-food chain KFC, which used similarly designed logos from the chain’s competitors alongside the line: “Guys, we’re flattered.”

But imitation may not always be the sincerest form of flattery when it comes to generative AI.

Simon Newcomb, a technology and intellectual property lawyer and partner at national law firm Clayton Utz, said generative AI raised a range of content-related concerns, including around business-“owned” assets.

Mr Newcomb said he was already giving legal advice on the use of generative AI, not only in terms of potential intellectual property issues but on privacy and contracts, for example.

“Firstly, it takes over the customer relationship,” he said.

“Because the customer doesn’t have to go to the business’s website if they can get the information from the AI. And that potentially prevents monetisation of the content. If the end user is not going to the content provider’s website, it means that the content provider loses control over the presentation, or accuracy, or context or branding.

“It means that they also lose information about the customers’ interests or activities, because they haven’t gone to their website, and so they’re not getting the data about that visit.”

Reproduction without proper licensing, and use of creative assets or owned content without attribution, also pose copyright infringement issues, Mr Newcomb said.

R/GA, part of The Interpublic Group, is a global design and marketing consultancy, and is one of many creative firms experimenting with the technology. At present, it uses AI as a tool to help streamline operational and workflow efficiencies, such as Slack chatbots and assistants, as well as some types of content such as emails and recommendations tailored for individual users or segments.

Ciaran Park, executive director of technology, Australia, at R/GA said: “The potential for these tools to scrape brand-owned content, including content created by agencies, and generate new content without proper authorisation or attribution can, and will, raise concerns about copyright infringement, intellectual property rights and misuse of creative assets.

“It can lead to issues for brand, agency and the audience. The creation of new content that may not align with the brand’s intended message or value can bring inconsistency, brand dilution, or even legal disputes.”

Mr Park said R/GA was constantly exploring how AI can help with design tasks, such as generating concept inspiration and suggesting colour palettes. “This is always strictly governed by predefined parameters and human oversight; it’s a form of inspiration,” he said.

Mr Park said other areas of exploration could include the development of creative assets, such as content and logo generation, but the agency is approaching generative AI with caution: “Currently we don’t feel the technology is mature enough to be utilised in this manner or amid the current copyright issues.

“It’s important to note that while generative AI can be a powerful tool for agencies, human creativity, judgment and expertise are still invaluable in the creative process.”

The agency believes that generative AI should be used as a tool to augment and enhance human creativity, rather than replace it.

R/GA has created its own generative AI usage policy, and says agencies should articulate proper use of the technology and agree on parameters in client and partner contracts.

Mr Park also said proper attribution should be a focus for content creators, as well as obtaining proper licences and educating internal teams about changes and usage policies.

First published via The Australian