WSJ vs. Google: the 5 flaws of the accusatory article

In the United States, the long stream of debate sparked by the November 15th Wall Street Journal article claiming that Google interferes with its own search algorithm to shape SERPs has not stopped yet: alongside some voices critical of the search engine, the majority of comments from the SEO community have actually defended Google, also analyzing the errors committed by the reporters. According to Sam Ruchlewicz, Warschawski’s VP of Digital Strategy & Data Analytics, there are 5 main weak spots in the WSJ’s attack.

The 5 main errors of the Wall Street Journal

The American expert, general manager of an agency active across the whole digital communication field, wrote a long contribution published on Search Engine Journal devoted to debunking the original Wall Street Journal article, highlighting at least 5 big mistakes and misunderstandings (in fact, he found over 34, reported in another piece on the company blog).

1.     With its edits to the algorithm, Google favors big companies

The first issue the author focuses on is the claim that “Google made algorithmic changes to its search results that favor big businesses over smaller ones”, which is “dangerous and unfair” because it undermines the credibility of both Google and the whole search community.

First and foremost, “there is no evidence offered in the article that Google modifies its algorithm to favor larger companies over smaller ones”, and on closer inspection this is a “textbook example of the correlation-causation fallacy“, i.e. the frequent error of mistaking a correlation for a cause-and-effect connection.

Google does not favor big companies

According to Ruchlewicz, “In general, larger companies are better at marketing in general than smaller businesses”, partly because they have more resources at their disposal to invest not only in advertising but also in content development, website building and so on. But this does not mean that Google is granting them special favors, only that there is no such thing as an equal starting point (as the ex-Googler Kaspar Szymanski stated barely a few days ago).

As we know, Google favors high-quality content published on influential, high-quality websites: there are countless examples of small companies ranking fairly for high-value queries, because “those businesses did the smart, hard work required to produce uniquely valuable, high-quality content, secure authoritative links, and present that content on a well-built website”.

Success depends on SEO strategies

No “black magic“, then, only “a consistently brilliant execution of sound SEO strategy“, which in any case cannot completely shield a business from the effects of changes to Google’s algorithm, which can always “materially impact” businesses. Pointing the finger at these variations, though, also means overlooking some facts. Such as:

  • No company is entitled to Google’s (or any other search engine’s) organic traffic, because organic traffic can only be earned through the hard work described above.
  • If a business relies on Google’s organic traffic, it would probably be a good idea to hedge that risk factor with other marketing investments.
  • Google makes it relatively easy to stay updated thanks to its webmaster guidelines and Quality Rater guidelines.

2.     Google employs low-paid contractors to assess SERP quality and influence their work

The quality rater topic is precisely the focus of the American expert’s second debunking chapter, which challenges the WSJ’s claims using numbers and considerations as evidence:

  • Google indirectly employs ~10,000 QRs (who are paid $13.50 an hour, for the record) all over the world at any given time via a network of contracting firms.
  • Google has been running this program since at least 2005.
  • QRs are basically no different from reviewers or quality-controllers, who assess results using a publicly-available set of guidelines. They do not have access to or control over any components of Google’s algorithms – QRs are just testers who validate that the product (the search algorithm) is working as intended.

Digging further into the matter, Ruchlewicz writes that “Over the 15-year history of the program, Google has likely employed millions of QRs. The tenure of most QRs is short (the individual included in the WSJ article was there for just 4 months), which limits loyalty and complicates any efforts to interfere”. Taken to its extreme, then, the WSJ’s thesis implies that “Google is engaged in a vast conspiracy to manipulate the feedback that they are paying hundreds of millions of dollars each year to obtain”, a fairly paradoxical predicament that would also be hard to manage in practical terms, given its size and duration.

3.     Contrary to its own policy, Google made changes to favor eBay

A prickly issue is the relationship between Google and its advertisers, particularly eBay: according to the WSJ, in at least one case the search engine made algorithmic changes to favor the selling site, in open violation of its own policy and its public commitment to transparency and the independence of the engine. As a matter of fact, Ruchlewicz explains, this claim rests on at least 3 wrong assumptions.

  • First of all, the WSJ mixes up organic SERPs and Google Ads, two completely distinct and separate realities, because no one can “buy” organic rankings on Google.
  • Second, the 2014 drop in ranking of eBay’s pages did indeed stem from Google’s algorithmic modifications, but also from the structure of the site itself, which was no longer offering “quality content“: this is proven by the fact that, after subsequent corrections and improvement work, the site got right back into the SERPs.
  • The third foggy point is more a matter of logic: Google allegedly modified rankings to gain barely 30 million dollars in annual ad spend from eBay. Stretching the WSJ’s thesis to its extreme, Google can afford to spend over 200 million dollars on SERP quality controls that it then ignores and/or manipulates, yet at the same time is so afraid of losing 30 million dollars of ad income from eBay that it jeopardizes the entire company’s business (900 billion dollars!), exposing itself to huge liability and investigations.

4.     Google keeps blacklists and removes specific sites from specific results

This mistake actually contains several others, starting from the misuse of the term “blacklist” to describe different things; above all, though, it rests on a conspiracy theory – Project Veritas – that has already been widely disproved and discredited.

In further detail, the author clarifies the following points:

  • It is true that Google “filters” autocomplete suggestions in accordance with their policies. That’s not new – and the guidelines around it are public.
  • There is no evidence (including the WSJ’s laughably incomplete analysis of 17 searches conducted over 31 days) that Google is adjusting those in any way that is in violation of their policy.
  • The same is true for spam sites not being indexed (nothing new there – been happening since at least 2004). Not to mention that Google is pretty transparent in how manual actions work.
  • There have been instances where Google has run into issues around indexing, but when that happens, it’s in Google’s best interest to fix the issue quickly.

5.     Google has no political bias against conservative sites

Last but not least, the final part of the article debunked by the American expert regards the alleged bias the search engine supposedly holds against specific political orientations, mainly in the conservative area. In this case too, in-depth studies have proved that Google does not hold any bias against these specific sites, and that any absence from SERPs, or the greater visibility of other results, depends on regular SEO activities.

Clarifying how a search engine works

Ruchlewicz’s goal was not really to discredit the Wall Street Journal report (which still remains fairly reprehensible) but rather to tackle the false claim that the search system is some sort of twisted or rigged “black magic” that no one should rely on. According to the author, “there is certainly a lot of work left to be done around educating both reporters and the general public about how search engines work“, as they are among the most important and mysterious pieces of online infrastructure.

The topic is complex, but it can be simplified by simply remembering that search engines have the duty “to bring some semblance of order to the otherwise chaotic, unimaginably large and constantly shifting corpus of information that is the web, so that when you or I are looking for something online, we can find it”.

The exact process remains mysterious and unknown

Obviously, “the exact process and factors that search engines use to accomplish that monumental task are shrouded in secrecy, protected by thousands of patents, and may well be unknowable” forever, he adds. And this exact combination – on one hand, the pervasiveness and central role of search engines in our daily lives; on the other, the mysterious way they work – “has fueled feelings of confusion and distrust, not to mention more than a few conspiracy theories (many of which, thankfully, have been debunked)”.

Certainly, Ruchlewicz admits, “there are real and legitimate concerns about how “big tech” in general – and search engines like Google specifically – wield their vast power to shape our world”, just as many people in the SEO field have leveled legitimate criticisms at Google, ranging from its propensity to take content from publishers (once again, recall the Google News case in France) to its data collection practices, up to its proclivity to favor its own products/services (as demonstrated by Rand Fishkin’s SMX talk, which we covered at the beginning of the week).

Google is not perfect, but the WSJ did not deliver a reliable article

In the end, according to the author, “no organization is perfect and Google is no exception”, but these valid considerations do not excuse the use “of shoddy, agenda-driven journalism to propagate a false narrative”. And this is the biggest flaw of the Wall Street Journal article, which embraced “many discredited conspiracy theories to weave a baseless narrative that the world’s largest search engine, Google, abuses its power for its own nefarious purposes”, producing a piece that, Ruchlewicz concludes, “belongs in the cheap fiction section at the airport, not on the cover of one of the country’s most esteemed news outlets”.
