Google is having a stab at identifying upsetting or offensive content via its Quality Raters, Google Search Console had a day of missing Google Analytics data, and Google is squeezing organic SERPs further by testing 6 Google Ads at the bottom of results. There has also been a study suggesting that Chrome does not pass URLs to Googlebot for discovery, as well as another reiteration that 404s are not a sign of a low-quality site.
#Googlebot #GoogleQuality #GoogleSearchConsole #OrganicSERPs #WebsiteQuality

Google Tries to Flag Upsetting or Offensive Content

Summary:

  • New / updated quality guidelines issued to “Quality Raters”
  • Information from Raters will be fed into developing algorithms
  • Algorithms will aim to disambiguate between ‘general’ information searches and ‘specific’ information searches
  • Upsetting or Offensive content includes: promotion of hate or violence, racial slurs, graphic violence, explicit how-tos, and locally offensive content.
  • Google is specifically avoiding calling this content “fake news”.

How to Cover Upsetting or Offensive Content:

  1. If discussing contentious topics, write carefully and appropriately.
  2. Present both sides of the argument and support information with facts.
  3. Avoid repeating falsehoods or being intentionally offensive.
  4. Read the SEO Content Planning Guide for assistance.

Upsetting & Offensive Content Discussion:

Following the recent rise of Fake News, Alternative Facts, and all sorts of other terms used to distract onlookers, Google is attempting to replicate its long-time success in weeding adult sites out of ambiguous searches, so that when a user searches for a contentious topic they are presented with authoritative information rather than hate speech or other offensive or inaccurate content.

If a user specifically searches for that kind of information, it will be presented to them.

Generally, this is a good thing. It’s not likely that all alternative views will be weeded out of search, but content that is clearly intended to offend, or is measurably inaccurate, probably will be. The challenge will be satire: Google’s (and others’) algorithmic beasts don’t do contextual humour terribly well.

Obviously, this will also only further the secret agenda of the neo-con-liberal-pharma-military-industrial-MSM-sheeple complex, which only a few far-seeing anarcho-bloggers have the insight to see, presumably because they have “special” powers.



GSC GA Data Went Walkabout

Summary:

  • March 9th didn’t happen according to the GA report in Google Search Console – at least for a few days.
  • The fix was implemented on March 14th.
  • This has happened before and will likely happen again.

Actions to take:

  1. When data disappears, check the feeds of Googlers to see if they are aware and have a fix timeline (a quick script can confirm the gap – see the sketch after this list).
  2. If they are not, advise them.
  3. Wait for an update advising that it has been fixed.
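
Spotting a gap like this yourself is easy to script. Below is a minimal sketch, assuming you have a daily CSV export of your data; the filename and “date” column are illustrative assumptions, not any particular report’s format. It lists any calendar days missing between the first and last dates in the file.

```python
# Minimal sketch: flag missing days in a daily CSV export.
# Assumes a "date" column in YYYY-MM-DD format; adjust to your export.
import csv
from datetime import date, timedelta

def missing_days(csv_path, date_column="date"):
    """Return the calendar days absent between the first and last date seen."""
    with open(csv_path, newline="") as f:
        seen = {date.fromisoformat(row[date_column]) for row in csv.DictReader(f)}
    if not seen:
        return []
    start, end = min(seen), max(seen)
    every_day = {start + timedelta(days=n) for n in range((end - start).days + 1)}
    return sorted(every_day - seen)

# Hypothetical export filename; a healthy file prints an empty list.
print(missing_days("search_analytics_export.csv"))
```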

Discussion:

When you consider the amount of data Google must process on a daily basis, it’s not surprising that the odd day goes walkabout for a while.

Or, at least, it wouldn’t be if they weren’t the world’s largest data aggregator, which really should have its stuff together by now.



Google Tests 6 Ads beneath SERPs

Summary:

  • There were recent postings showing screenshots of SERPs with 6 Ads beneath the 10 Blue Links.
  • Top sets of Ads were also in place.
  • Organic SERP real estate about to get squeezed again?

How to deal with more ads on Google:

  1. Keep monitoring Organic SERPs for high ratios of ads to blue links, especially if traffic drops substantially for a keyword with no ranking change (see the sketch after this list).
  2. Consider targeting keywords which generate fewer ads, but have higher relative exposure for your link.
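
On point 1, the scriptable symptom is clicks falling while average position holds steady. Below is a minimal sketch, assuming per-query data from two comparable periods; the field names and example figures are illustrative assumptions rather than any particular report’s output.

```python
# Minimal sketch: flag queries whose clicks dropped sharply while average
# ranking position barely moved, one symptom of ads squeezing organic results.

def squeezed_queries(before, after, click_drop=0.5, max_pos_shift=1.0):
    """Yield (query, fractional click loss) where clicks fell but rank held."""
    for query, old in before.items():
        new = after.get(query)
        if new is None or old["clicks"] == 0:
            continue
        drop = (old["clicks"] - new["clicks"]) / old["clicks"]
        if drop >= click_drop and abs(new["position"] - old["position"]) <= max_pos_shift:
            yield query, drop

# Illustrative per-query data for two periods (field names are assumptions).
before = {"blue widgets": {"clicks": 400, "position": 2.1}}
after = {"blue widgets": {"clicks": 150, "position": 2.3}}
for query, drop in squeezed_queries(before, after):
    print(f"{query}: clicks down {drop:.0%} with no meaningful ranking change")
```

The 50% click-drop and 1-position thresholds are arbitrary starting points; tune them to your site’s normal volatility.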

Google Search & Ads Discussion:

Slowly over the years, Google has morphed away from the classic 10 Blue Links, with more and more real estate devoted to directly monetised ad links.

It’s frustrating for users to be constantly assailed by ads – Google search has always been one of the publishers that didn’t overtly annoy the user (think auto-play videos, interstitials, pop-up overlays) – but it is possible that at some point in the future, organic SERPs will be almost completely overshadowed by paid links.

On the plus side, at least Google isn’t Yahoo!



Google Chrome Doesn’t Pass Data to Bots for URL Discovery

Summary:

  • Stone Temple Consulting ran a test to see if Googlebot would visit undiscovered URLs based on a Google Chrome user visit.
  • Googlebot didn’t visit.

Actions to take:

  1. Well, don’t rely on Google Chrome visits to get content indexed ;-)
  2. Read SEO Ranking Factors: Find Crawl Index to discover the best ways to get URLs crawled, indexed and ranked.

Discussion:

It is possible that Googlebot didn’t visit those pages for some other reason – a time lag, perhaps – but I am surprised it didn’t. On the face of this study, though, it does seem that Google ignores the data provided by its browser for URL discovery.
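
For anyone wanting to sanity-check this on their own site, here is a minimal sketch of the log side of such a test: publish a page at a URL linked from nowhere, visit it only in Chrome, then watch the server access log for Googlebot requests. The log path and test URL below are assumptions, and since user-agent strings can be spoofed, any genuine-looking hit is worth confirming with a reverse DNS lookup on the requesting IP.

```python
# Minimal sketch: scan an access log for Googlebot requests to a test URL
# that is linked from nowhere and was only ever visited in Chrome.

TEST_PATH = "/chrome-discovery-test-page/"  # hypothetical unlinked URL
LOG_FILE = "/var/log/nginx/access.log"      # assumption; adjust to your server

with open(LOG_FILE) as log:
    hits = [line.rstrip() for line in log
            if TEST_PATH in line and "Googlebot" in line]

print(f"{len(hits)} Googlebot request(s) to {TEST_PATH}")
for hit in hits:
    print(hit)
```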

It is also possible that other rumoured sources of URL discovery do not, in fact, exist.



404s are Not a Sign of Low Quality

Summary:

  • John Mu dismissed the idea that 404s are a sign of low site quality.
  • 404s are “normal”
  • Google may not mind 404s, but users don’t like them.

How to Deal with 404s For Google:

  1. For the sake of users, avoid 404s where possible.
  2. 301 redirect removed URLs to the next most relevant URL.
  3. Avoid asking techies questions to which they can give a “correct” answer.
  4. Use Google Search Console to help identify 404s Google knows about (a scripted status check is sketched after this list).
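
To complement what Search Console reports with a check of your own, below is a minimal sketch using the third-party requests library; the URL list is an illustrative placeholder, and any 404 it finds is a candidate for the 301 treatment in point 2.

```python
# Minimal sketch: check a list of URLs for 404s.
# Requires the third-party 'requests' library; URLs are placeholders.
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page/",
]

for url in urls:
    try:
        # HEAD keeps it light; allow_redirects=False so a 301 shows as a 301.
        resp = requests.head(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    if resp.status_code == 404:
        print(f"{url}: 404, consider a 301 redirect to the closest relevant URL")
    else:
        print(f"{url}: {resp.status_code}")
```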

404s – Some Discussion:

So, Google says 404s don’t make a site “low quality”.

I think this comment is a touch of Google-Speak.

A 404 in and of itself is not a sign of low quality – things happen, URLs disappear – but 404 half your site and the view might change: at the least, Googlebot may visit less often, or take longer to index and rank your content.

Let’s face it: if you have a crawl budget to manage, are you going to spend it pinging pages which may or may not be 404s? And would you happily display content from a domain with a significant history of 404s? I know I’d be less likely to link to a site which has a habit of 404ing content, and I’d imagine Google is relatively similar.

The obverse of this is that if you are a well-known, trusted site, Google is likely to leave your 404 content listed for longer, probably because it likes to believe that webmasters make mistakes and that the content will come back eventually.
