What’s happened this week?
The five most important updates to organic search this week: further Chrome "Not Secure" warnings, RankBrain's irrelevance to SEO, Google testing job search, AdSense's 300*250 unit above the fold clashing with SEO, and W3C validation not mattering much – plus a number of other interesting bits and pieces.
#HTTPS #SEOStrategy #WebsiteStrategy #MobileStrategy #Googlebot
Further Google Chrome “Not Secure” Warnings Coming
Summary:
- From October, Chrome 62 will start to mark HTTP pages as “Not secure” if they capture any user data – think forms or search boxes.
- All HTTP pages accessed in Incognito mode will be marked as Not secure.
- This follows existing Not secure notations for HTTP pages which contain username and password boxes.
How to Move to HTTPS:
- Change to HTTPS if you collect, or have a user enter, any data on your website.
- If you have users enter a username and password then you should already be on HTTPS.
- Ensure any third party content SRC tags also use HTTPS (a quick way to scan for stragglers is sketched after this list).
- 301 redirect from old HTTP to new HTTPS URLs.
- Don’t forget to create a Google Search Console profile for the HTTPS URLs (and a “set” if you want an aggregated view).
- Read this post on How to Migrate from HTTP to HTTPS
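A quick way to catch the third-party SRC issue above is to scan a page's HTML for anything still requested over plain HTTP, since those references will warn or break as mixed content once the page itself is served over HTTPS. This is a minimal sketch using only the Python standard library; the URL is a placeholder, and a real audit would cover every template rather than a single page.

```python
# Minimal sketch: flag SRC attributes still pointing at plain HTTP,
# which would become mixed content after the move to HTTPS.
from html.parser import HTMLParser
from urllib.request import urlopen

class InsecureSrcFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "src" and value and value.startswith("http://"):
                print(f"Insecure <{tag}> src: {value}")

page_url = "https://www.example.com/"  # hypothetical URL - swap in your own pages
html = urlopen(page_url).read().decode("utf-8", errors="replace")
InsecureSrcFinder().feed(html)
```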
HTTP to HTTPS Discussion:
Although there have been big complaints about changing to HTTPS, it’s really not that big a deal; managed well, it’s a simple protocol change. Many websites have been putting it off until the next re-design or re-vamp of the website, but like all re-designs and re-vamps, it should be carried out as a separate project, or a distinct phase within the overall project. That way, if anything goes wrong, it’s easier to identify the root cause, roll it back, fix it and roll it out again.
There really isn’t any reason to delay switching to HTTPS. Admittedly most users don’t know or care about HTTP / S, despite signals from the various browsers over the years, but that is still no reason not to do it. You are effectively doing your user a favour by doing this and working to protect their information.
Any ranking boost from going HTTPS is minimal, but working to protect your users’ security is a good thing in its own right.
The key issue with moving from HTTP to HTTPS is managing the 301 redirects from old to new. Managed well, and with no other URL changes, it should be a simple case of switching over, issuing 301 redirects, and making sure you are re-indexed using XML Sitemaps and Google Search Console.
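If you want to sanity-check the switchover, it is easy to script a check that every old HTTP URL answers with a single 301 straight to its HTTPS twin. A minimal sketch, assuming the third-party requests library and a like-for-like mapping where only the protocol changes; the URLs below are placeholders.

```python
# Minimal sketch: confirm each old HTTP URL 301-redirects directly to its HTTPS equivalent.
import requests

old_urls = [
    "http://www.example.com/",          # placeholder URLs - feed in your real
    "http://www.example.com/contact/",  # URL list or XML Sitemap entries
]

for url in old_urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    expected = url.replace("http://", "https://", 1)
    status = "OK" if response.status_code == 301 and location == expected else "CHECK"
    print(f"{status}: {url} -> {response.status_code} {location}")
```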
More info:
Optimising for RankBrain: It’s Irrelevant To SEO
Summary:
- John Mu has re-stated, for the umpteenth time, that you should build great sites for users rather than optimise for RankBrain.
- He described the RankBrain part of that process as “irrelevant”.
- (It’s not wholly irrelevant, but Google would much rather you built sites for users and let them sort out the rankings).
How to Optimise for RankBrain:
- Build your sites for users, folks.
- In so far as you can optimise for RankBrain at all, it amounts to building content around topics, concepts and user intent – which is just good practice anyway.
- Ignore anyone who offers to optimise a website for RankBrain.
RankBrain Discussion:
So, RankBrain is an interesting thing from Google. It’s been with us for about a year or so, but I’m not sure many users have particularly noticed. As it’s based on machine learning, or a semi-artificial intelligence (slightly fancy words for better algorithms), Google engineers sometimes have no idea what RankBrain will deliver in return for a search query.
The key concept behind it is that instead of simply taking a flat view of “user searched for xyz ==> return the top 10 pages that match that query”, it really is trying to understand the user intent behind the query – based on device, context, the user’s previous searches and other users’ previous searches – and deliver results based on that set of parameters. This lessens the need to re-click or re-search from the user’s perspective. You might like to think of it as Google taking the top 30 cached results and trying to deliver the page from those 30 that the user is most likely to like.
It’s also worth noting that Google has always wanted webmasters to “just build a good site”. If you build a good site, they have great content to list. This mantra is often repeated. However, describing RankBrain as “irrelevant” for optimisation is both true and disingenuous at the same time – you have to take account of user intent when building web pages and at least take a stab at building content which is likely to match and serve that user intent.
More info:
Google Testing Job Search
Summary:
- Google followers spotted some Job Search pages in the wild recently.
- This is clearly a test of some kind – possibly for a new product, or a new way of displaying search results.
- Google issued a “nothing to see here, this is just a test” statement shortly afterwards.
- Job portals might be worried, or might have to modify their ad spend.
Actions to take:
- Similar to Google’s other forays into hotels, flights etc, it’s worthwhile being prepared.
- Make sure your Google information is in good standing.
- Use structured markup on your listings (a sketch using schema.org’s JobPosting type follows this list).
- If running a listings site of any kind, make sure you provide additional value beyond simple listings – Google would eventually like to extract the data and list it a click earlier, hosted on Google, with its ads.
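On the structured markup point, schema.org already defines a JobPosting type, which is the obvious candidate for any job listings page. A minimal sketch of the JSON-LD such a page might emit – every value here is an invented placeholder to be mapped from your own listings data:

```python
# Minimal sketch: build schema.org JobPosting JSON-LD for a listing page.
# All values are invented placeholders.
import json

job_posting = {
    "@context": "https://schema.org",
    "@type": "JobPosting",
    "title": "Example Job Title",
    "description": "<p>Plain-English description of the role.</p>",
    "datePosted": "2017-07-01",
    "validThrough": "2017-08-31",
    "employmentType": "FULL_TIME",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Employer Ltd",
        "sameAs": "https://www.example.com",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "London",
            "addressCountry": "GB",
        },
    },
}

# Drop the output into a <script type="application/ld+json"> block on the listing page.
print(json.dumps(job_posting, indent=2))
```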
Discussion:
Google has got form in taking data from listings sites and lifting it up a level to serve from its own pages (flights, hotels etc). It’s not clear from this test whether Google is intending to directly answer a job seeker’s query with a listings panel, similar to Local Search, or whether it is planning to run more of a shopping experience driven by ad buys, like it did with hotels.
I understand the desire to give information more quickly to the user, and make money from it, but I’ve always thought this to be a slight abuse of Google’s dominant position as the world’s information provider. It also removes the incentive to collate and run listings sites in the first place. When that happens, the variety of listings sites decreases and competition lessens, so sites fold and merge, and quality ebbs away because there is less incentive to invest in curation. At that point it becomes a pay-to-play model on Google’s ad engine, which is less discerning about quality than the organic results tend to be.
More info:
AdSense 300*250 Above The Fold On Mobile Clashes with SEO
Summary:
- AdSense now allows use of the 300*250 ad unit above the fold on mobile.
- Webmasters need to note the “in a way that does not affect the user experience” caveat.
- This clashes with received wisdom on Google’s SERPs updates, including the page layout penalty and interstitial penalty.
Actions to take:
- If tempted to use this ad unit, test the design on multiple devices.
- Make sure the design does not impact user experience, or get in the way of users seeing content.
- User content should still be visible above the fold.
- Be prepared for potential organic ranking drops, if implementing this.
Discussion:
The 300*250 is by some way Google’s most successful ad unit – it fits in so many places and in so many ways. Allowing it above the fold is a way for Google to increase exposure for its most commonly used ad unit, as well as increase viewability on mobile (forced below the fold, those units would often not be viewed at all).
However, unless Google tested it only on big-screen devices, it clearly clashes with current thinking on what triggers the various ad / content layout penalties. That makes it a risk to implement unless your header is small and the content can still be seen with the ad unit in place.
It would be nice if Google AdSense talked to the organic guys sometimes – webmasters get very confused by the mixed signals emanating from the ‘Plex.
More info:
W3C Validation: Who Cares?
Summary:
- John Mu stated on Twitter that W3C validation “pretty much doesn’t matter”.
- The caveat is that Googlebot must still be able to render a page and extract structured data.
Actions to take:
- Make sure Googlebot can render your page and read your structured data.
- Use the Google Search Console Fetch and Render Tools, and their Structured Data tester.
- Don’t ignore W3C completely.
Discussion:
W3C validation has been dropping in importance for years. Very few people ever implemented everything correctly, so browsers (and Google) just worked around it. It is now not worthwhile basing a fix-list on a W3C validation report.
That said, it is still worthwhile using the tool to check for significant validation issues. Googlebot is important, but it is not every user, and there may be some users who cannot read or render your content because of broken markup. The validator is also pretty good at flagging accessibility issues.
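If you want a quick, scriptable read on whether anything egregious is lurking, the W3C’s Nu HTML Checker offers a JSON output mode. A minimal sketch, assuming the public endpoint at validator.w3.org/nu/ still accepts a doc URL with out=json, and using a placeholder page; only outright errors are reported, in the spirit of “significant issues only”:

```python
# Minimal sketch: ask the W3C Nu HTML Checker for errors on a single page.
# Endpoint behaviour as documented at the time of writing; URL is a placeholder.
import requests

page_url = "https://www.example.com/"  # hypothetical URL
response = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page_url, "out": "json"},
    headers={"User-Agent": "validation-check-sketch"},
    timeout=30,
)
errors = [m for m in response.json().get("messages", []) if m.get("type") == "error"]
print(f"{len(errors)} validation error(s) on {page_url}")
for message in errors[:10]:  # the first few are usually enough to spot a pattern
    print("-", message.get("message"))
```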
One thing to remember is that while Googlebot can work around W3C issues, it makes everyone’s life easier if your code validates reasonably well. That increases the chances of people reading your content and linking to it, which brings more Googlebot visits and better search rankings. It also pays to treat Googlebot as an accessibility user: the easier it is for Googlebot to access, render and interpret the site, the better your rankings are likely to be.
More info:
Bits & Pieces
- Webmasters have been getting all upset over differing images being sourced and used for Featured Snippets in Google Search, although Google takes the view that this is normal, and it’s the way Google News has operated for a long time. I think the webmasters are right and Google should be using images from the page it is linking to.
- Local reviews are being tested segmented by traveller type (family, couple, solo etc), or by what the review is about, eg rooms, service, facilities etc. This is somewhat useful, but obviously relies on there being enough reviews in the first place.
- Google doesn’t forget “old” links which haven’t been crawled in a while, as they have lots of storage. Of course, this raises the question of how long Google might leave it between crawls before re-crawling and re-scoring the link graph. You’d have to imagine it would be triggered by a page in the chain somewhere acquiring a link which Google hasn’t seen and followed before.
- This is a superb article from Danny Sullivan on how Google assesses the “authority” of web pages. Really worth the read. One interesting piece is about topic concentration / diversity, and how more diverse sites have trouble ranking because Google can’t work out what the site is about. This was also cited by the owners of about.com, which recently took the site down and replaced it with numerous vertical-specific sites. There are two sides to this coin: 1 – Wikipedia still ranks reasonably well despite being relatively unordered; 2 – if Google can’t work out what your mega-site is about, the problem is more likely to be an IA which Google cannot interpret.
- John Mu is at it again on Mobile First, mentioning that it may be released in batches (when it gets here). This would cause mayhem for SEOs and hints at the kind of issues Google may be running into with the project. Delays followed by batched releases are usually symptomatic of project issues (unless that was always the plan).