In a recent blog post, Google announced that election advertisers must now disclose when an ad features synthetic or manipulated content. This update expands on Google's Political content advertising policy, which states: "We support responsible political advertising and expect all political ads and destinations to comply with local legal requirements. This includes campaign and election laws and mandated election 'silence periods' for any geographic areas they target."
With the surge in generative AI technologies, there is growing concern about their potential misuse in creating misleading content, especially in the political domain, which already struggles with the spread of misinformation heading into the upcoming election season.
While online election advertisements operate largely in an unregulated arena, the Federal Election Commission (FEC) is soliciting public comment on amending its regulations.
New Google Policy Details:
Implementation: Beginning in mid-November, verified election advertisers on Google must clearly disclose when an ad contains digitally altered or synthetically generated content that realistically depicts real people or events, including content created with AI tools.
The disclosure must be clear and conspicuous, placed where viewers can easily spot it.
If the synthetic or edited content does not significantly alter the message or claims of the ad (e.g., minor photo edits for size or color), no disclosure is necessary.
Ads that falsely depict someone saying or doing things they never did, or that manipulate footage of actual events, fall under the purview of this policy.
Disclosure Examples: Possible disclosure phrasings might include "computer-generated audio" or "synthetically produced video," according to Google's post.
Exemptions: Edits such as image resizing, cropping, color or brightness corrections, defect correction (for example, "red eye" removal), or background edits that do not create realistic depictions of actual events do not require disclosure.
Enforcement: Google plans to use a combination of human review and automated tools to scrutinize ads. Non-compliant ads may be disapproved or removed.
Accounts that repeatedly or severely breach the guidelines might face suspension.
If an ad is disapproved, advertisers will be notified and given the opportunity to add the required disclosure and resubmit.
Google has different requirements for political and election advertising based on region.
Google is updating its Political content policy to require verified election advertisers to disclose when their ads contain synthetic content that inauthentically depicts real or realistic-looking people or events. The disclosure must be clear and conspicuous, placed where users are likely to notice it. Ads whose altered or generated content is inconsequential to the claims made in the ad will be exempt from this requirement. Examples of content requiring disclosure include ads that make it appear as if a person is saying or doing something they didn't, ads that alter footage of an actual event, and ads that generate a realistic portrayal of an event that never happened.
Note: Although Google has many regulations governing political content on its platforms, this policy targets election advertisements for individuals vying for office, not single-issue campaigns on topics such as environmental concerns or reproductive rights. It's also important to remember that Google's Political content policy applies only in regions where advertiser verification is required, so always remember to verify the authenticity of what you hear and see!