What’s happened this week?
The 5 most important updates to SEO for w/e 12 May 2017 are: Pages Ranking for 1000+ Keywords; BrightEdge / SearchMetrics Patent Dispute; Adding Video Does Not Assist Rankings; Events Debut on US Mobile; How Google Crawls 404s; and a number of other interesting bits and pieces.
#SERP #Googlebot #ContentStrategy #StructuredData #Patents
How to Rank for Multiple Keywords
- The guys at Ahrefs recently published what I’m calling a Ranking Diversity study, which showed that top-ranking pages can also rank on the front page for around 1,000 other keywords on average (the median figure is much lower).
- The study also correlated the number of rankings to content length, with the optimal length appearing to be between 500 and 2,000 words, although there were diminishing returns.
- No other potentially correlating factors (backlinks, pages on site, pages on topic, update frequency) were taken into account.
How to Rank for Multiple Keywords:
- Diversify your keywords on page. Don’t be afraid to use multiple related keywords on a page, which is a natural effect of writing… naturally.
- Don’t expect every keyword or topic to perform the same. Many keyword “sets” may only have 50-100 natural variants; others may have 10k+.
- Write longer, fuller content. Assuming you have limited resources, there are points at which your effort will start to exceed the increase in returns. Also, significantly longer content tends to be harder for the reader to break down and digest.
- Don’t bank on every page ranking for 1000+ keywords.
Ranking diversity has been a “thing” in my world for a long time. I’ve seen clients who have had multiple rankings for a single page, and I’ve also run sites where a single page ranked for many, many terms. This knowledge freed me up to create pages about topics without necessarily focusing on a single keyword, but focusing on a cluster or group of them as being in the right area. Writing naturally about a topic is a lot easier if you are not sticking to a didactic keyword plan.
It is critically important these days to have content of the correct length. 500 words seems to be the absolute minimum, with 2000 the effective maximum, but there are instances where shorter, or longer will perform better.
One of the problems with these studies is that they only cover a single aspect. For instance what isn’t removed from the equation is the template “fluff” each of these pages have: it is a bare word count. Google has been able to ignore templates for a long while, so it can concentrate on the content (a duplicate content filter can’t work any other way). There is also no tying back to any other metric, or on-page, or off-page factors, which is a bit of a shame.
BrightEdge Sues SearchMetrics Which Then Files for Ch.11 US Bankruptcy
- BrightEdge, one of the corporate SEO “platforms” filed a patent infringement suit against SearchMetrics, according to TechCrunch.
- SearchMetrics responded by filing for Chapter 11 bankruptcy protection in the US.
- SearchMetrics contends that this is the result of failed merger / acquisition talks in late 2013.
- The action by BrightEdge has been likened to “patent trolling”, a widely disliked tactic in the tech world.
Actions to take:
- If you are a SearchMetrics customer, or considering becoming one, there should be no need to take immediate action. Chapter 11 allows a company to carry on trading normally whilst affording it certain protections against creditors.
- Keep a close eye on the action if you are a customer of either. It could have a big result either way.
For some reason, BrightEdge does not appear to be universally loved. I’ve had dealings with them in the past and they were like many other corporate SEO platforms: okay at producing data and reports, but not so hot at identifying issues and producing coherent SEO strategies off the back of the wealth of information at their disposal. Overall, I thought they were “okay”.
This appears to stem from SearchMetrics allowing BrightEdge access to internal information during potential acquisition / merger discussions. According to SearchMetrics this allowed BrightEdge to file a patent (which SearchMetrics should have filed for), which they are now claiming SearchMetrics infringes.
The world of patents in the US is very murky. There are lots of patent disputes and patent trolling, as it’s known, appears to be a business for some companies. Europe, because it does not allow patents to be filed in the same way, does not seem to suffer from this problem.
It would seem that SearchMetrics may not have filed for a US patent because in Europe it wouldn’t have been allowed to, and BrightEdge has decided that SearchMetrics infringes one of its patents.
Adding Video Doesn’t Help Rankings Says Googler
- During a mild Twitter stoush between a disgruntled Twitter user and an SEO agency, a Googler weighed in, announcing that “having a video on your page will absolutely not help you rank better in web search”.
- This was after some discussion about whether the length of content needed to rank for “how to boil an egg” should be 2,000+ words.
Does Video Help Rankings?:
- Ignoring the immediate instinct to drop all video right now, this needs to be treated with some caution.
- Continue to use video where it is useful and appropriate.
- Continue to optimise pages and optimise videos which add relevant content to the page.
- Always ensure the content of the video is relayed in text. Visually this can be a summary with an unseen layer of transcript for accessible browsers.
- Where possible, host the video yourself and back it up with a YouTube posting, rather than embedding a YouTube video.
- Don’t always take Googler’s tweets at absolute face value. They are often literal, or the context is lost in translation.
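The transcript advice above can be sketched in structured data too. Below is a minimal schema.org VideoObject example, built with Python’s `json` module so it stays valid JSON-LD; every URL, date and value is purely illustrative, not taken from any real page.

```python
import json

# Minimal schema.org VideoObject markup that relays the video's content as
# text via a transcript property. All values here are hypothetical examples.
video = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "How to boil an egg",
    "description": "A short demonstration of boiling an egg.",
    "thumbnailUrl": "https://www.example.com/thumbs/egg.jpg",
    "uploadDate": "2017-05-12",
    "contentUrl": "https://www.example.com/video/egg.mp4",  # self-hosted file
    "transcript": "Place the egg in gently boiling water for six minutes...",
}

# The JSON-LD block you would place in the page's <head>.
json_ld = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    video, indent=2
)
print(json_ld)
```

Note the `contentUrl` points at a self-hosted file, in line with the advice to host the video yourself rather than only embedding from YouTube.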
It’s a tough life being a Googler in the SEO world. Your every word is pounced on and dissected to the nth degree, especially on Twitter. Gary Illyes is absolutely correct in stating that just adding a video to your page is unlikely to add anything in terms of ranking, eg adding a video does not automatically add 10 bonus points to your (imaginary) ranking score.
However, he’s also only telling about 1/4 of the story. Having well-optimised rich media, such as video, is always useful when you use it as part of the content mix. Videos can increase time on page, can increase shareability of content and can encourage onwards clicks or conversions, but they aren’t an absolute answer to SEO ranking success.
Google Debuts Events on Mobile Search in US
- In keeping with Google’s desire to remove clicks from the user journey, they have debuted the ability to display events in search results via structured data.
- This is currently only available in the US on mobile, or through the Google App.
- They have also provided a detailed developers guide to aid structured markup implementation.
Actions to take:
- If you are in the US, and you run events, you should strongly consider implementing this.
- Read the guidelines, test with Google’s Structured Data Testing Tool and push live.
- It is not yet known whether these information panels will replace the standard organic links, or whether the results will be placed in organic SERPs order.
- Don’t spam your structured markup. It may not affect an organic SERPs listing, but it may prevent other structured data listings from appearing.
- If you are not in the US, you should still strongly consider implementing this once you have recovered from the FOMO, as it is likely to roll out internationally.
- If you run an events listings site, traffic may be about to drop, or you’ll need to find other ways to reach users.
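To make the “read the guidelines and push live” step concrete, here is a sketch of schema.org Event markup of the kind Google’s developer guide describes, again built in Python so the JSON-LD stays well-formed. The event, venue and address are entirely made up, and the “required” field check reflects my reading of the guide rather than anything authoritative.

```python
import json

# Hypothetical schema.org Event markup for an events page. Every value is
# illustrative; substitute your own event details before testing.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "XYZVille Jazz Festival",
    "startDate": "2017-06-10T19:00",
    "location": {
        "@type": "Place",
        "name": "XYZVille Town Hall",
        "address": {
            "@type": "PostalAddress",
            "streetAddress": "1 Main Street",
            "addressLocality": "XYZVille",
            "addressCountry": "US",
        },
    },
}

# Sanity-check the properties the guide treats as required (my assumption)
# before pushing the markup live.
missing = {"name", "startDate", "location"} - event.keys()
assert not missing, "missing required Event properties: %s" % missing

print('<script type="application/ld+json">\n%s\n</script>' % json.dumps(event, indent=2))
```

Run the output through the Structured Data Testing Tool before deploying; a local sanity check like the one above only catches the obvious omissions.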
Another day, another way for Google to present information without a user having to visit a website. All of a sudden pages about “Top 10 things to do in XYZVille” are likely to be pushed down the organic SERPs. This is a good thing for users and webmasters who know how to implement, and a bad thing for those webmasters who don’t or can’t.
By now, you really should be getting comfortable with structured data, as the likelihood is that it will form a much larger part of our search journey over the next few years.
This format will of course give Google the opportunity to sell ads into these feeds, almost indistinguishable from the content, and, like Google My Business, it is also likely to create some severe spam headaches for them.
I can see its usefulness; I’m just not convinced I want Google to select events for me.
Google Recrawls 404s But After Other Pages
- JohnMu stated on Twitter that Google will recrawl 404s, but usually after other pages have been crawled.
- This is so valuable crawl budget is not wasted.
- This was part of a larger discussion on 404s, crawl budgets and domain authority (which Google says doesn’t exist).
Actions to take:
- Firstly, you should be keeping on top of 404s – redirecting them when appropriate, or reinstating content if needed, or even reaching out to webmasters who have incorrectly linked content.
- If you reinstate a page, always include it in an XML Sitemap, or even use the Fetch As feature in Google Search Console.
- If you remove content, 301 redirect it to the next most appropriate page. Include the 301 redirect in an appropriate XML Sitemap.
- If you intentionally remove content and don’t want to redirect, leave it as a 404, or even issue a 410 Gone status.
- Although Google has often stated that 404s do not impact their view of your website quality, having huge numbers of 404s caused by broken internal links, or wholesale removal of content with no redirects, is unlikely to endear you to them.
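The decision logic in the list above can be sketched in a few lines. This is just an illustration of the 301 / 410 / 404 triage, not a production URL handler; all the paths and redirect targets are hypothetical.

```python
from typing import Optional, Tuple

# Removed URLs that have a "next most appropriate page" get a 301 redirect.
REDIRECTS = {
    "/old-guide": "/new-guide",
    "/spring-sale-2016": "/sale",
}

# Content intentionally removed with no replacement gets a 410 Gone.
GONE = {"/discontinued-product"}

def status_for(path: str) -> Tuple[int, Optional[str]]:
    """Return (HTTP status, redirect target or None) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    if path in GONE:
        return 410, None
    # Everything else we don't recognise stays a plain 404.
    return 404, None

print(status_for("/old-guide"))             # (301, '/new-guide')
print(status_for("/discontinued-product"))  # (410, None)
print(status_for("/never-existed"))         # (404, None)
```

In practice this mapping usually lives in your server or CMS redirect configuration rather than application code, but the triage is the same.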
It’s good to know that Google prioritises live pages over 404s. Crawl budget is limited for each site, and appears to be a function of popularity, regularity of updates and authority (despite what Google may say), so this is a good thing.
It’s also good to know that Google will re-crawl 404 pages, just to check that they are still 404, but if you reinstate content it is also important that you signal to Google that the page is now live again and should be re-crawled.
I have to confess, following the Twitter discussion, I was a bit surprised that webmasters thought that Google *didn’t* re-crawl old URLs. How else would they check for bait & switch etc?
Bits & Pieces
- JohnMu has stated that you can get unnatural links from good sites and natural links from spammy sites, but that really you should focus on the removal of unnatural links and get on with other things in life. I’m glad that’s sorted then. It’s not confusing in the slightest.
- Danny Goodwin posted a handy list of 17 pretty good tweets from the most recent SEJSummit. I can’t believe that with that size of listicle he didn’t go for a clickbait headline.
- Geo-targeting helps to promote but hreflang doesn’t, says JohnMu. This is a bit of a complex area, but essentially if you have a relevant language page Google will try to swap that URL into the search results at the same position, whereas if you are specifically targeting a country, then geo-targeting through GSC helps to promote that domain in the relevant country’s SERPs. There is probably some apportioning of value from links to other country pages, but you should try to get same language / locale links for your hreflang pages. Phew.
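For anyone untangling that last point, the hreflang mechanics look like this: each locale version of a page cross-references every other version, plus an x-default fallback. A minimal sketch, with made-up domains and paths:

```python
# Locale -> URL map for one page. All domains and paths are illustrative.
ALTERNATES = {
    "en-us": "https://www.example.com/us/page",
    "en-gb": "https://www.example.com/uk/page",
    "de-de": "https://www.example.com/de/seite",
}

def hreflang_tags(alternates: dict, default_url: str) -> list:
    """Build the <link rel="alternate"> tags for one page's <head>."""
    tags = [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(alternates.items())
    ]
    # x-default is served when no locale in the map matches the searcher.
    tags.append(
        '<link rel="alternate" hreflang="x-default" href="%s" />' % default_url
    )
    return tags

for tag in hreflang_tags(ALTERNATES, "https://www.example.com/us/page"):
    print(tag)
```

The same full set of tags must appear on every one of the alternate pages; hreflang then swaps the matching URL into position, while geo-targeting via GSC does the actual country-level promotion.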