Google is urging U.S. election regulators to consider more
explicit rules around online political advertisements, perhaps even banning
foreign entities from purchasing election ads focused on issues, not just
candidates.
The comments — filed with the Federal Election Commission on
Thursday — also appear to call for greater clarity as to how the search giant
and its tech peers should handle the likes of RT, the Russia-backed news
organization that has been blasted by the U.S. government for spreading
propaganda online.
Google’s submission to the FEC comes as the company, its
counterparts like Facebook and Twitter, and regulators in the nation’s
capital begin debating whether new rules are necessary to prevent the Russian
government and other foreign malefactors from meddling in U.S. politics again.
For now, though, Facebook hasn’t yet filed with the FEC — but it
is expected to do so before the agency closes the window for public comments
later this month. The original deadline was Thursday, but the FEC recently
extended it.
And Twitter did share its views with the U.S. government late on
Thursday. Without delving into specifics, its comments essentially call on the
agency to be mindful of the platform’s character limits when crafting any new rules.
During the 2016 presidential election, Kremlin
trolls flooded Facebook with thousands of misleading posts and ads while
commanding 2,782 accounts to spread disinformation on Twitter, all with the
apparent goal of sowing social and political unrest around controversial
issues, like immigration and race. A small number of similar ads appeared on
Google, too.
To that end, some lawmakers in the U.S. Congress have proposed new legislation
that would require internet companies to maintain a public file of
all political ads they run, much like broadcasters currently must do.
Meanwhile, hoping to stave off that sort of regulation, some
tech giants have responded by introducing more transparency measures of their
own. Facebook, Google and Twitter each recently pledged that, in the coming
months, they would make political ads easier to spot and provide more
information about the audiences those ads target.
And at the FEC, election regulators this year revived an old debate
as to what, exactly, political advertisers have to disclose about their efforts
to sway voters online. The agency began that debate in 2011, but technically
never issued any disclosure rules targeting the tech industry, thanks in part to lobbying
by companies like Facebook.
Under new circumstances, however, some in the tech industry are
now seeking greater clarity.
In its filing with the FEC, Google sought to emphasize that it’s
a bit different from its social-network peers: it allows political ads in
search results, on the websites of publishers that participate in its ad
networks, and on its own platforms and apps like YouTube.
In the company’s estimation, the “majority of advertisers” on
Google “self-impose some form of disclaimer.” Amid reports about Russia’s
meddling, though, the company told the FEC it would require all
election-related advertisements to use a pre-existing icon that, when hovered
over, details why a viewer is seeing that ad in the first place.
But Google also said the agency had to “modernize its
disclaimer rule so that political committees and other organizations have clear
notice regarding the disclaimers they are required to include with their
internet communications.”
Meanwhile, Google suggested that Congress, the FEC and others
issue new rules around ads bought by foreign entities in the weeks before and
after an election. To be sure, foreign nationals already are banned from
advertising in support of, or opposition to, a U.S. candidate. But many of the
rules governing that ban
don’t actually mention online ads — only broadcast and print.
In doing so, Google also appeared to be calling on the FEC to
weigh in on whether that ban covers issue-focused ads that may not “express
advocacy for or against a particular candidate.” In many cases, the ads
purchased by Russian trolls during the 2016 presidential race on Facebook, for
example, didn’t mention Donald Trump or Hillary Clinton — but rather divisive
topics like Black Lives Matter and gay rights.
Google also urged the U.S. government to clarify whether
online ads and other content purchased or placed by lobbyists on behalf of a
foreign power should include a disclaimer.
The phrasing sounds wonky, but the search giant seemed to be
seeking clarity around RT, a news organization with deep ties to the Russian
government. U.S. intelligence agencies have blasted RT as Kremlin propaganda,
and its videos have been viewed on Google-owned YouTube millions of times — but
at the moment there is no explicit disclaimer about its Russian origins.
In some ways, though, Google’s comments about RT strike at one
of the most vexing challenges facing lawmakers and federal regulators. It’s not
just ads, but free, organic content that Russian trolls published and shared
online — tweets and posts and other material that at times even ended up
in major U.S. newspapers and on major websites.
For its part, Twitter stressed it planned to publish more
information about political ads on its platform. But its new efforts to
highlight political ads with special indicators, the company said, only apply
to those ads that touch on candidates — not political issues.
Twitter didn’t weigh in on other issues, like RT.
But the company did note that its newly doubled, 280-character
limit — while allowing users to say and share more — still makes it difficult
for advertisers to disclose more information in tweets themselves.