Google autocomplete results about Trump lead to claims of election interference

With the 2024 election just 100 days away, social media users are claiming that the absence of Google autocomplete suggestions about former President Donald Trump and his attempted assassination is evidence of election interference.

Many of the posts include screenshots showing what the autocomplete feature, which predicts what users are trying to type, has generated for text such as “attempted assassination of tr” or “president donald.” Among the displayed results for the former are references to other assassination attempts, including those of Harry Truman and Gerald Ford, but nothing for Trump. The latter offers two options: “president donald duck” and “president donald regan.”

Several public figures, including Trump and sitting members of Congress, promoted the claim on social media platforms, collectively garnering more than 1 million likes and shares by Tuesday. Trump did not immediately respond to a request for comment.

Google attributed the situation to existing protections against autocomplete predictions related to political violence. According to Google, “no manual action was taken” to suppress information about Trump.

According to search engine experts, there are several reasons why some autocomplete results about the former president did not appear.

Here's a closer look at the facts.

CLAIM: Google is interfering in the election by censoring autocomplete results about former President Donald Trump, including the assassination attempt at his July 13 rally in Pennsylvania.

THE FACTS: It’s true that Google’s autocomplete feature did not complete certain search phrases related to Trump and the assassination attempt on Monday, as seen in screenshots circulated online. However, there’s no evidence that this was election interference.

On Tuesday, some of the same terms returned relevant autocomplete results. The text “president donald” now also suggests “Donald Trump” as a search option. Similarly, the phrase “attempted assassination of” now includes Trump’s name among its autocomplete predictions, though adding “tr” to that phrase still removes the option.

Searches for Trump and the assassination attempt conducted on Monday and Tuesday produced comprehensive and relevant results, regardless of which predictions emerged from the autocomplete.

Google told the AP that its autocomplete feature includes automated protections for violent topics, including searches about hypothetical assassination attempts. The company further explained that these systems had last been updated before July 13, meaning the protections in place could not account for the fact that an assassination attempt had actually occurred.

According to the company, the additional autocomplete results now appearing for Trump are the product of broader, systemic improvements, rather than targeted manual fixes, and will affect many other topics as well.

“We’re making improvements to our Autocomplete systems to show more up-to-date predictions,” Google told The Associated Press in a statement. “The issues are starting to get resolved, and we’ll continue to make improvements as needed. As always, predictions change over time and there may be some inaccuracies. Autocomplete helps save people time, but they can always search for what they want, and we’ll continue to connect them to useful information.”

Search engine experts told the AP they see no evidence of suspicious activity by Google, and cited several other reasons why automated predictions about Trump were so sparse.

“It’s very plausible that there’s nothing malicious going on here, that these are other systems that are set up for neutral or good purposes that are preventing these query suggestions from showing up,” said Michael Ekstrand, an assistant professor at Drexel University who studies AI-driven information access systems. “I have no reason to disbelieve Google’s claim that these are just normal systems for other purposes, particularly around political violence.”

Thorsten Joachims, a professor at Cornell University who researches machine learning for search engines, explained that autocomplete tools typically work by looking at searches that people make frequently over a period of time and returning the most frequent completions of those searches. Additionally, a search engine can automatically prune predictions based on concerns like security and privacy.

That means it’s likely that Google’s autocomplete feature didn’t account for recent searches about the Trump assassination attempt, especially if the systems haven’t been updated since before the shooting.

“Depending on how big the window is that they’re averaging over, that might just not be a frequent query,” Joachims said. “And it might not be a candidate for autocompletion.” He added that it’s normal not to update a search model daily, given the costs and technical risks involved.
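The mechanism Joachims describes can be illustrated with a toy sketch: count queries seen within a trailing time window, suggest the most frequent completions of a prefix, and prune candidates that trip a policy filter. This is a simplified illustration of the general technique, not Google's actual system; the blocked-terms list and window size are hypothetical.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical policy list, standing in for automated protections
# around topics like political violence.
BLOCKED_TERMS = {"assassination"}

class ToyAutocomplete:
    def __init__(self, window_days=30):
        self.window = timedelta(days=window_days)
        self.log = []  # (timestamp, query) pairs

    def record(self, query, when):
        self.log.append((when, query.lower()))

    def suggest(self, prefix, now, k=3):
        prefix = prefix.lower()
        # Only queries inside the trailing window count toward frequency.
        counts = Counter(
            q for t, q in self.log
            if now - t <= self.window and q.startswith(prefix)
        )
        # Prune predictions that trip the policy filter, even if popular.
        allowed = [q for q, _ in counts.most_common()
                   if not any(b in q for b in BLOCKED_TERMS)]
        return allowed[:k]

ac = ToyAutocomplete(window_days=30)
now = datetime(2024, 7, 30)
ac.record("president donald duck", now - timedelta(days=5))
ac.record("president donald trump", now - timedelta(days=60))  # outside window
ac.record("attempted assassination of truman", now - timedelta(days=2))

print(ac.suggest("president donald", now))        # → ['president donald duck']
print(ac.suggest("attempted assassination", now)) # → [] (pruned by the filter)
```

The two failure modes the experts describe both fall out of this design: a recent query can be absent simply because it has not accumulated enough in-window volume since the last model update, and an otherwise popular query can be suppressed by a safety filter that predates the event.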

A 2020 Google blog post about the autocomplete feature describes how the system reflects previous searches and why users might not see certain predictions, including those that are violent in nature. The post also explains that predictions can vary based on variables such as a user’s location, the language they speak, or increasing interest in a topic.

Both Ekstrand and Joachims agreed that proving bias in a complex system like Google’s search engine from the outside would be extremely difficult. It would require far more data than a handful of search queries, and collecting that data would risk running afoul of the company’s defenses against data scraping, reverse engineering, and fraud.

“Generally, claims that platforms are taking specific targeted actions against specific people on a political basis are hard to substantiate,” Ekstrand said. “They happen sometimes, I’m sure, but there are so many other explanations that it’s hard to substantiate those claims.”

Joachims noted that the demographics of Google’s user base could skew the results of such a study if they leaned toward one side of the political spectrum or the other and therefore searched more for their preferred candidates. In other words, the way the system works would make it difficult to study.

Technical issues aside, limiting autocomplete predictions as a method of political influence could simply be bad for business.

“Even if Google wanted to do that, I think it would be a very bad decision because they could lose a lot of users,” said Ricardo Baeza-Yates, a professor at Northeastern University whose research includes Web search and information retrieval.

___

Find AP Fact Checks here: https://apnews.com/APFactCheck.
