What’s happened this week?

The five most important updates to organic search this week are: Chrome’s “Not Secure” warnings, RankBrain’s supposed irrelevance to SEO, Google Job Search in the works, AdSense 300×250 above-the-fold SEO issues and the unimportance of W3C validation, plus a number of other interesting bits and pieces.

#HTTPS #SEOStrategy #WebsiteStrategy #MobileStrategy #Googlebot


Further Google Chrome “Not Secure” Warnings Coming

Summary:

  • From October, Chrome 62 will start to mark HTTP pages as “Not secure” if they capture any user data—think forms or search boxes.
  • All HTTP pages accessed in Incognito mode will be marked as Not secure.
  • This follows the existing “Not secure” label on HTTP pages which contain username and password boxes.

TWIS 05 May 2017 Google Chrome Not Secure Warnings

How to Move to HTTPS:

  1. Change to HTTPS if you collect, or have a user enter, any data on your website.
  2. If you have users enter a username and password then you should already be on HTTPS.
  3. Ensure any third-party content SRC tags also use HTTPS.
  4. Set up 301 redirects from the old HTTP URLs to the new HTTPS URLs (a quick way to verify them is sketched after this list).
  5. Don’t forget to create a Google Search Console profile for the HTTPS URLs (and a “set” if you want an aggregated view).
  6. Read this post on How to Migrate from HTTP to HTTPS.
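
Once the redirects are in place, it is worth requesting each old HTTP URL and confirming it answers with a single 301 to the matching HTTPS address. The snippet below is a minimal sketch of that check using only the Python standard library; the URL list is a placeholder for your own pages.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib following redirects so we can inspect the first response."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

def check_http_to_https(http_url):
    """Report the status code and Location header returned by the old HTTP URL."""
    try:
        resp = opener.open(http_url, timeout=10)
        code, location = resp.getcode(), resp.headers.get("Location", "")
    except urllib.error.HTTPError as e:  # 3xx/4xx/5xx responses end up here
        code, location = e.code, e.headers.get("Location", "")
    ok = code == 301 and location == http_url.replace("http://", "https://", 1)
    print(f"{http_url} -> {code} {location} {'OK' if ok else 'CHECK ME'}")

# Placeholder URLs -- swap in a crawl or sitemap export of your real pages.
for url in ["http://www.example.com/", "http://www.example.com/contact/"]:
    check_http_to_https(url)
```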

HTTP to HTTPS Discussion:

Although there has been plenty of complaining about changing to HTTPS, it really isn’t that big a deal: managed well, it is a simple protocol change. Many websites have been putting it off until the next re-design or re-vamp of the website, but, like all re-designs and re-vamps, the migration should be carried out as a separate project or a distinct phase within the overall project. That way, if anything goes wrong, it is easier to identify the root cause, roll it back, fix it and roll it out again.

There really isn’t any reason to delay switching to HTTPS. Admittedly, most users don’t know or care about the difference between HTTP and HTTPS, despite signals from the various browsers over the years, but that is still no reason not to do it. You are doing your users a favour and helping to protect their information.

Any ranking boost from going HTTPS is minimal, but working to protect your users’ security is a good thing in itself.

The key issue with moving from HTTP to HTTPS is managing the 301 redirects from old to new. Managed well, and with no other URL changes, it should be a simple case of switching over, issuing 301 redirects, and making sure you are re-indexed using XML Sitemaps and Google Search Console.
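
On the re-indexing point, submitting a fresh XML Sitemap that lists only the new HTTPS URLs gives Google a clean list to recrawl. Below is a minimal sketch of generating one from a plain list of URLs; the file name and URLs are placeholders for your own site.

```python
import xml.etree.ElementTree as ET

# Placeholder list of the new HTTPS URLs -- in practice, export this from your CMS or crawler.
https_urls = [
    "https://www.example.com/",
    "https://www.example.com/contact/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in https_urls:
    ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url

# Write sitemap.xml ready to upload and submit via Google Search Console.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```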

More info:

Return to Top


Optimising for RankBrain: It’s Irrelevant To SEO

Summary:

  • John Mu has re-stated, for the umpteenth time, that you should build great sites for users rather than optimise for RankBrain.
  • He described the RankBrain part of that process as “irrelevant”.
  • (It’s not wholly irrelevant, but Google would much rather you build sites for users and let them sort out the rankings.)

TWIS 05 May 2017 RankBrain Irrelevance

How to Optimise for RankBrain:

  1. Build your sites for users, folks.
  2. In so far as “optimising for RankBrain” means anything, it means building content around topics, concepts and user intent.
  3. Ignore anyone who offers to optimise a website for RankBrain.

RankBrain Discussion:

So, RankBrain is an interesting thing from Google; it’s been with us for about a year or so, but I’m not sure any users have particularly noticed. As it’s based on machine learning, or semi-artificial intelligence (slightly fancy words for better algorithms), Google engineers sometimes have no idea what RankBrain will deliver in return for a search query.

The key concept behind it is that instead of simply taking a flat view of “user searched for xyz ==> return the top 10 pages that match that query,” Google tries to understand the intent behind the query, based on device, context, the user’s previous searches and other users’ previous searches, and to deliver results based on that set of parameters. This lessens the need for the user to re-click or re-search. You might like to think of it as Google taking the top 30 cached results and trying to serve, from those 30, the page the user is most likely to find useful.
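
As a purely illustrative toy (and emphatically not how RankBrain actually works), the idea of re-ranking an already-retrieved candidate set by estimated intent fit can be sketched as below; the candidate pages, intent signals and scoring weights are all invented for the example.

```python
# Toy re-ranking sketch: take an already-retrieved candidate set and re-order it
# by a crude "intent" score. Purely illustrative -- not Google's actual algorithm.
candidates = [
    {"url": "/recipe/apple-pie", "topics": {"recipe", "baking"}, "base_rank": 1},
    {"url": "/shop/apple-pie-tins", "topics": {"shopping", "bakeware"}, "base_rank": 2},
    {"url": "/history-of-apple-pie", "topics": {"history", "baking"}, "base_rank": 3},
]

# Invented intent signals for one query session (device, context, prior searches, etc.).
inferred_intent = {"recipe", "baking"}

def intent_score(page):
    overlap = len(page["topics"] & inferred_intent)
    return overlap - 0.1 * page["base_rank"]  # mostly intent, with a nod to the original order

for page in sorted(candidates, key=intent_score, reverse=True):
    print(page["url"], round(intent_score(page), 2))
```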

It’s also worth noting that Google has always wanted webmasters to “just build a good site.” If you build a good site, they have great content to list. This mantra is often repeated. However, describing RankBrain as “irrelevant” for optimisation is both true and disingenuous at the same time; you have to take account of user intent when building web pages and at least take a stab at building content that is likely to match and serve that user intent.

More info:

Return to Top


Google Testing Job Search

Summary:

  • Google followers spotted some Job Search pages in the wild recently.
  • This is a test of some kind, possibly for a new product or a new way of displaying search results.
  • Google issued a “nothing to see here; this is just a test” statement shortly afterwards.
  • Job portals might be worried or might have to modify their ad spend.

TWIS 05 May 2017 Google Job Search

Actions to take:

  1. Similar to Google’s other forays into hotels, flights, etc., it’s worthwhile to be prepared.
  2. Make sure your Google information is in good standing.
  3. Use structured markup on your listings (see the sketch after this list).
  4. If running a listing site of any kind, make sure you provide additional value beyond simple listings. Google would eventually like to extract the data and list it a click earlier, hosted on Google, alongside its own ads.
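
For job listings, the relevant structured markup is the schema.org JobPosting type, usually emitted as JSON-LD in the page. The sketch below builds a minimal, hypothetical example in Python; the field values are invented, and the exact properties Google ends up requiring should be checked against its structured data documentation.

```python
import json

# Hypothetical job listing marked up with schema.org/JobPosting properties.
# Field values are invented; check Google's structured data docs for the
# properties it actually requires before relying on this.
job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "SEO Analyst",
    "description": "<p>Analyse organic search performance and report on it.</p>",
    "datePosted": "2017-05-05",
    "validThrough": "2017-06-05",
    "employmentType": "FULL_TIME",
    "hiringOrganization": {"@type": "Organization", "name": "Example Ltd"},
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "London", "addressCountry": "GB"},
    },
}

# Paste the output into a <script type="application/ld+json"> block on the listing page.
print(json.dumps(job_posting, indent=2))
```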

Discussion:

Google has got form in taking data from listing sites and lifting it up a level to serve from its own pages (flights, hotels, etc.). It’s not clear from this test whether Google intends to answer a job seeker’s query directly with a listings panel, similar to Local Search, or whether it plans to run more of a shopping experience driven by ad buys, as it did with hotels.

I understand the desire to give information to the user more quickly and make money from it, but I’ve always thought this a slight abuse of Google’s dominant position as the world’s information provider. It also removes the incentive to collate and run listing sites in the first place; when that happens, the variety of listing sites decreases and competition lessens, so sites fold and merge, and quality ebbs away because there is less incentive to invest in curation. At that point it becomes a pay-to-play model on Google’s ad engine, which is less discerning about quality than the organic results tend to be.

More info:

Return to Top


AdSense 300×250 Above The Fold On Mobile Clashes with SEO

Summary:

  • AdSense now allows the use of the 300×250 ad unit above the fold on mobile.
  • Webmasters need to note the “in a way that does not affect the user experience” caveat.
  • This clashes with received wisdom on Google’s SERPs updates, including the page layout penalty and interstitial penalty.

TWIS 05 May 2017 Google AdSense 300 250 Mobile

Actions to take:

  1. If tempted to use this ad unit, test the design on multiple devices.
  2. Make sure the design does not impact the user experience or get in the way of users seeing content.
  3. User content should still be visible above the fold.
  4. Be prepared for potential organic ranking drops if you implement this.

Discussion:

The 300×250 is, by some way, Google’s most successful ad unit; it fits in so many places and in so many ways. Allowing it above the fold is a way for Google to increase exposure for its most commonly used ad unit and to increase views on mobile, where ads forced below the fold often went unviewed.

However, unless Google tested it on big-screen devices only, it clashes with current thoughts on what would trigger various ad/content layout penalties, so it becomes a risk to implement unless your header is small and the content can still be seen even with the ad unit in place.

It would be nice if Google AdSense talked to the organic team occasionally; webmasters get very confused by the mixed signals emanating from the ‘Plex.

More info:

Return to Top


W3C Validation: Who Cares?

Summary:

TWIS 05 May 2017 W3C Validation

Actions to take:

  1. Make sure Googlebot can render your page and read your structured data.
  2. Use the Google Search Console Fetch and Render tool and the Structured Data Testing Tool.
  3. Don’t ignore W3C completely.

Discussion:

W3C validation has been declining in importance for years. Very few people ever implemented everything correctly, so browsers (and Google) just worked around it. It is now not worthwhile to base a fix list on a W3C validation report.

That said, it is still worthwhile to run the tool and check for significant validation issues. Googlebot is important, but it is not your only visitor, and there may be some users who cannot read or render your content because of markup errors. The validator is also pretty good at flagging accessibility issues.
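
For spot checks, the W3C’s Nu HTML Checker can also be queried programmatically. The sketch below is a minimal Python example, assuming the public validator.w3.org/nu/ endpoint and its doc/out=json parameters behave as documented; the page URL is a placeholder.

```python
import json
import urllib.parse
import urllib.request

# Hedged sketch: ask the public W3C Nu HTML Checker to validate a page and list its messages.
# Assumes the validator.w3.org/nu/ endpoint and its doc/out=json parameters are available
# as documented; for anything beyond a spot check, run your own copy of the checker.
page = "https://www.example.com/"  # placeholder URL
query = urllib.parse.urlencode({"doc": page, "out": "json"})
req = urllib.request.Request(
    f"https://validator.w3.org/nu/?{query}",
    headers={"User-Agent": "validation-spot-check"},
)

with urllib.request.urlopen(req, timeout=30) as resp:
    report = json.load(resp)

for message in report.get("messages", []):
    print(message.get("type"), "-", message.get("message"))
```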

One thing to remember is that while Googlebot can work around W3C issues, everyone’s life is easier if your code validates reasonably well: it increases the chances of people being able to read your content and link to it, which in turn brings more Googlebot visits and better search rankings. It also pays to treat Googlebot as an accessible user; the easier it is for Googlebot to access, render and interpret the site, the better your rankings are likely to be.

More info:

Return to Top


Bits & Pieces

  • Webmasters have been getting upset about Featured Snippets using images sourced from a different page than the text, although Google takes the view that this is normal and that Google News has operated that way for a long time. I think the webmasters are right, and Google should use images from the page it is linking to.
  • Google is testing segmenting local reviews by traveller type (family, couple, solo, etc.) or by what the review covers (rooms, service, facilities, etc.). This is somewhat useful, but relies on there being enough reviews in the first place.
  • Google doesn’t forget “old” links that haven’t been crawled in a while, as it has plenty of storage. Of course, this raises the question of how long Google might leave it between crawls before re-crawling and rescoring the Link Graph. You’d imagine it would be triggered by a page somewhere in the chain acquiring a link Google hasn’t seen and followed before.
  • This is a superb article from Danny Sullivan on how Google assesses the “authority” of web pages; really worth the read. One interesting piece is about topic concentration versus diversity, and how more diverse sites have trouble ranking because Google can’t work out what the site is about. This was also cited by the owners of about.com, who recently took the site down and replaced it with a number of vertical sites. There are two sides to this coin: one, Wikipedia still ranks reasonably well despite being relatively unordered; two, if Google can’t figure out what your mega-site is about, the problem is more likely an information architecture (IA) that Google cannot interpret.
  • John Mu is at it again on Mobile First, mentioning that it may be released in batches (when it gets here). This would cause mayhem for SEOs and hints at the kind of issues Google may be running into with the project. Delays followed by batched releases are usually symptomatic of project problems (unless that was always the plan).

Return to Top