TWIS SEO Update 02 June 2017

What’s happened this week in SEO?

In The Week In Search SEO Update for 2nd June 2017, we have: Chrome blocking ads in 2018; JavaScript causing crawling and indexing issues; PageSpeed being difficult to determine; a view that Social Media helps SEO; and NOODP going for good; as well as some exceptionally interesting Bits & Pieces.

The State of SEO Mid 2017

We’ve also released the super-exciting The State of SEO in mid-2017. Read it now.

#SEO #SocialMedia #JavaScript #AdBlocking #PageSpeed #MetaTags


Google Chrome to block bad ads

Summary:

TWIS 02 June 2017 Chrome Ad Blocking

Actions to take:

  1. Review the Better Ads Standards to ensure your ads are in compliance with the code.
  2. Review the DoubleClick Best Practices – there are differences between Desktop and Mobile.
  3. Test different implementations to discover the alternative ad strategies which work best for you.
  4. Or, just ignore and accept that Chrome will block these ads.
  5. Contact me if you would like help reviewing your ad strategy and ways to avoid having your ads blocked.

Discussion:

Ad blocking was a huge topic 6-8 months ago, although it did kind of fade into the background behind the contretemps over iffy programmatic ads and views. It’s good to see Google taking a stance on this pernicious issue.

Most users dislike ads, but the degree to which the ads annoy is very much predicated on the type and style of the ads. Users intensely dislike pop-ups on mobile, and prestitial ads with countdowns (YouTube, anyone?). On desktop they really dislike large sticky bottom ads and pop-ups with countdowns. Google has generally grouped the offending ads into three broad types: annoying, distracting and cluttering.

Oddly there isn’t a direct mention of auto-playing video ads with sound, even though this is commonly one of the most often-mentioned annoying ad types. Let’s hope they get the chop.

The final thing that springs to mind with this is that Google is playing both ends of this game: it’s taking the lion’s share of online ad spend at one end, and at the other, it’s starting to block those ads from being displayed. It’s not a huge jump to imagine there may be antitrust issues somewhere down the track over this, despite Google ostensibly doing it for the user’s benefit.

More info:

Return to Top


Some JavaScript Prevents Indexing & Slows Down Crawling

Summary:

  • A few weeks ago, I tweeted about JavaScript causing indexing / caching issues; now we have an in-depth test on Moz which demonstrates the issues.
  • Some of the frameworks used in a number of web deployments just do not present content to Google to Fetch and Render, and then index.
  • Those frameworks that do render can cause big slowdown issues on mobile, and occasionally desktop. Usually this is caused by too much processing, or by memory and cache limits being exceeded.
  • JavaScript has brought many dynamic improvements to the web, but it still needs to be used carefully.

TWIS 02 June 2017 JavaScript Blocking Indexing

Actions to take:

  1. Read the Moz article to understand the particular frameworks affected and how they are affected.
  2. Test critical pages in Google Search Console Fetch and Render. Make sure Google can see and index all critical content.
  3. Review alternatives to JavaScript usage. If you don’t *need* to use it, don’t. Traffic tests show increased usage for non-JavaScript implementations.
  4. Always have a <noscript> alternative. Googlebot and Screen Readers will thank you for it.
  5. Where possible, ditch JavaScript where it drives critical, cannot-fail functions on your website.
  6. Contact me if you would like to chat about JavaScript and if it is causing issues with your site’s indexation and performance.
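To make action 2 concrete, here is a minimal sketch (plain Python, all names and sample markup hypothetical) that checks whether critical phrases appear in the raw, server-delivered HTML – i.e. before any JavaScript runs. Anything that fails this check is entirely dependent on rendering:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text from raw HTML, skipping script/style bodies."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def missing_content(raw_html, critical_phrases):
    """Return the critical phrases NOT present in the server-rendered HTML."""
    parser = TextExtractor()
    parser.feed(raw_html)
    text = " ".join(parser.chunks)
    return [p for p in critical_phrases if p not in text]

# A JS-only page: the widget name is injected client-side, so a
# crawler that does not execute JavaScript never sees it.
page = "<html><body><h1>Widgets</h1><script>render('Blue Widget')</script></body></html>"
print(missing_content(page, ["Widgets", "Blue Widget"]))  # ['Blue Widget']
```

This is only a rough proxy for Fetch and Render – Googlebot does render some JavaScript – but it quickly shows which content exists only after rendering.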

Discussion:

Using JavaScript to do your processing for you is almost the ultimate distributed computing / cloud computing experiment. On Desktop with loads of RAM, super-quick processors and lots of resources, it tends to work reasonably seamlessly for the user. On mobile, however, it causes non-loads, back-outs and user frustration, because phones struggle to cope with the vast amounts of data, scripts and resources being flung down the pipes at them.

There is now a very good case for stripping JavaScript implementation back to functions which are absolutely necessary. I have had instances where a client’s entire navigation was rendered in JavaScript which was unreadable to Google. The client’s CMS also didn’t allow for the creation of an XML Sitemap. This kind of thing is the thinking behind the AMP project and almost takes us back to flat / static HTML.
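For the record, an XML Sitemap is simple enough to generate by hand when a CMS can’t. A minimal sketch following the sitemaps.org protocol (the example.com URLs are placeholders):

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML Sitemap (sitemaps.org protocol) from a URL list."""
    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = Element("urlset", xmlns=NS)
    for page in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = page["loc"]
        if "lastmod" in page:  # optional per the protocol
            SubElement(url, "lastmod").text = page["lastmod"]
    return tostring(urlset, encoding="unicode")

xml = build_sitemap([
    {"loc": "https://www.example.com/", "lastmod": "2017-06-02"},
    {"loc": "https://www.example.com/products"},
])
print(xml)
```

Feed it the URL list from a CMS export or a crawl, save the output as sitemap.xml, and submit it in Search Console.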

Don’t believe developers who claim that Google can read and index all JavaScript. It can’t, and it may not even make the effort to try.

More info:

Return to Top


Google: PageSpeed is difficult

Summary:

  • A small Twitter discussion revealed that Google (and AMP) struggle more to identify a fast page than a slow one.
  • Fast pages are usually consistently fast, but are they fast for everyone? (The risk is a false positive.)
  • Slow pages tend to be consistently slow. If not caused by routing / congestion, they are usually caused by bloated code or processes, which affect every user.
  • Google is not planning on including page speed initially in its upcoming Mobile First Index.

TWIS 02 June 2017 PageSpeed Slower Easier

Actions to take:

  1. Keep on checking PageSpeed using the Google tools.
  2. Remember PageSpeed is relative to the competition.
  3. Compare load speed of your critical pages versus your ranking competitors in this space.
  4. Improve your PageSpeed until you are faster than the competition. Work to remove blockages caused by bloating and use good CDNs to cache and deliver content.
  5. Remember that PageSpeed is still currently applied as a factor post-relevance. Improve your content as well.
  6. Contact me if you would like to review your PageSpeeds and design solutions to beat the competition.

Discussion:

Speed, speed, speed. The world is a fast place and the mobile-first world is a faster one. There is a lot of hype around page speed, mostly justified, but some not. Don’t forget that having websites load faster benefits Google as it provides its users with a better experience, and encourages them to search Google and see their ads again.

The key thing to remember with page speed is that it is currently relative to the results set overall. There is not yet a minimum speed bar to step over before being considered (aside from not managing to load / timing out, obviously). You need to be faster than the competition – who will also be working on their speed. If speed were a primary determinant of ranking, we would all be hosting out of the data centres nearest to the IP addresses which Googlebot resolves to – that’s pretty much the way the internet works.
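Because speed is relative, it’s the comparison, not the raw number, that matters. A toy sketch, assuming you have already collected load-time samples for your site and your ranking competitors (all site names and figures below are hypothetical):

```python
from statistics import median

def speed_report(samples):
    """Rank sites by median load time (seconds), fastest first.

    `samples` maps a site label to a list of measured load times.
    """
    medians = {site: median(times) for site, times in samples.items()}
    return sorted(medians.items(), key=lambda item: item[1])

# Hypothetical measurements, e.g. collected via repeated timed fetches
samples = {
    "yoursite.com":     [2.1, 2.3, 2.0],
    "competitor-a.com": [1.4, 1.5, 1.6],
    "competitor-b.com": [3.0, 2.8, 3.2],
}
ranking = speed_report(samples)
print(ranking[0])  # the site to beat
```

Medians are used rather than means so that one-off slow samples caused by routing or congestion don’t skew the comparison.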

More info:

Return to Top


Social Media Helps SEO (Indirectly)

Summary:

  • Time to revisit this old chestnut, with a good post from SE Journal.
  • The external evidence points to social signals having some impact on Google rankings.
  • The Google evidence says “no”.
  • The reality is that social activity does impact search rankings, but not specifically and directly because posts are part of the social graph – public posts are simply treated like any other web pages.

TWIS 02 June 2017 Social Media Helps SEO

Actions to take:

  1. Keep on posting, amplifying and magnifying your content on social.
  2. Optimise your social posts in the same way you would any web posts.
  3. Link to your social profiles, if public, and to your social posts, if public.
  4. Ensure there is consistency between social content and web content.
  5. Ensure any piece of content is re-used for every publication avenue possible – that way it will really earn its keep.
  6. Don’t rely on social signals equating to rankings however.
  7. Contact me if you would like to discuss this further.

Discussion:

To me, this is a classic case of Googlers directly answering the specific question put in front of them, rather than the question that is probably intended. Google cannot access anything behind a login, so unless your posts are public, there is no way for them to be found and included. Those URLs which Google can access are treated as normal web pages and used in the same way as normal web pages (crawled, and indexed).

Even if links are nofollowed and redirected on outbound by most socials, Google will crawl them – it just won’t pass PageRank to them. Nofollow does not equate to a crawling or indexing directive. While PageRank is still part of the Google algorithm, relevance is still the more important factor.

More info:

Return to Top


NOODP No More – Time to Update Snippets

Summary:

  • Following DMOZ closing, as discussed in The Week In Search 3rd March 2017, Google has dropped support for the NOODP meta tag.
  • This means that it’s time to make sure your own meta descriptions are in order and accurately reflect the content of your page.

TWIS 02 June 2017 DMOZ Meta Descriptions

Actions to take:

  1. If any search listings were using a DMOZ snippet, then it’s time to update that page.
  2. Remove any legacy NOODP meta tags from web pages. Every bit counts.
  3. Review your listings for various keywords to see if Google is using the meta description, a snippet from the page, or one from somewhere else.
  4. If needed, update the meta description to accurately reflect the content.
  5. Use this Basic Meta Tags guide to get your metas up to speed.
  6. Contact me if you would like assistance reviewing and optimising meta tags for improved display and click-through.
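Actions 2–4 can be scripted. A minimal sketch (Python stdlib only, sample markup hypothetical) that flags a legacy NOODP robots tag and checks the meta description, treating 160 characters as a rough convention rather than a hard limit:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Records the meta description and any legacy NOODP robots directive."""
    def __init__(self):
        super().__init__()
        self.description = None
        self.has_noodp = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        if name == "description":
            self.description = a.get("content", "")
        elif name == "robots" and "noodp" in (a.get("content") or "").lower():
            self.has_noodp = True

def audit(html, max_len=160):
    """Return a list of meta tag issues found in the page's HTML."""
    parser = MetaAudit()
    parser.feed(html)
    issues = []
    if parser.has_noodp:
        issues.append("legacy NOODP tag: remove it")
    if parser.description is None:
        issues.append("no meta description")
    elif len(parser.description) > max_len:
        issues.append("description over %d chars" % max_len)
    return issues

page = '<head><meta name="robots" content="noodp"><meta name="description" content="Short and accurate."></head>'
print(audit(page))  # ['legacy NOODP tag: remove it']
```

Run it over a crawl export of your pages to build a quick to-do list of templates needing attention.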

Discussion:

Ahhh, meta descriptions, the unloved child of SEO. They don’t directly impact rankings, and they don’t particularly need to be under 160 characters, but they do have an impact on search performance. Get them right and they set the scene for increased click-through from search (out-performing your ranking position) and conversions, as people find what they are looking for when they hit your page.

It’s sad to see DMOZ go, but with the news that Del.icio.us also appears to be in its death throes, it really seems the old guard of the web is falling off the perch at an alarming rate. I have to confess, I can’t remember the last time I saw an ODP snippet in use, but I’m sure there were a few out there, and still are on legacy pages, long-forgotten, but still indexed, still ranking, and still sending traffic.

More info:

Return to Top


Bits & Pieces

  • Ahrefs has published a study demonstrating that Featured Snippets reduce traffic to the remaining organic results on the page 🙁 SE Land has also published a handy primer on how to generate a Featured Snippet.
  • Google updated their Quality Rater Guidelines early in May, although there doesn’t seem to be a huge amount of change.
  • Google has stated that the Fetch & Render tool is a better indicator of what Google “sees” when it crawls than the Text / HTML Cache.
  • There’s a really interesting snippet in one of the webmaster hangout videos about how Google’s algorithms share data between themselves and “update” them. I don’t think this means that the algorithms change the other algorithms in a machine learning way, but that information from one set might affect how another set behaves. This isn’t terrifically surprising, but is interesting nonetheless.

Return to Top


TL;DR

  • Read The State of SEO in mid-2017.
  • Chrome is going to block “bad” ads. Your definition of bad may vary from theirs.
  • JavaScript can still stop your site from being indexed and crawled. Use it wisely, young padawan.
  • PageSpeed is tougher to determine for a fast website than a slow one.
  • Social media does help SEO – kinda. So keep doing it, and optimise everything.
  • NOODP is no more. Update your meta tags, descriptions and snippets.

Thanks for reading. If you would like to discuss what these changes mean for your web property, or would like to know how to implement them, please feel free to contact me.

Return to Top