This week turned into a bit of an AI-Google fest, blathering on about the challenges of controlling AI access to your hard-crafted content vs Google wanting full access. There are also changes to FAQ & How-To rich results on the SERP, presumably due to spam, some sage advice re content pruning, and a good guide to AI-driven PMax Google Ads campaigns. Read it and educate yourself ;-)

Block GPTBot with robots.txt

Beep-boop. OpenAI has released information about its new web crawler: GPTBot. Managing the way this bot traverses your site is critical if you do not want your site’s content used to feed the ChatGPT AI leviathan.
The user agent (UA) token is “GPTBot”, and assuming it respects robots.txt instructions, you can block GPTBot in your robots.txt with the same kind of rule you use for any other bot.
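As a sketch, here is what the robots.txt rule looks like. The `GPTBot` user agent token comes from OpenAI’s own documentation; the directory path in the second example is a hypothetical placeholder for whatever section of your site you want to protect:

```
# Block OpenAI's GPTBot from the entire site
User-agent: GPTBot
Disallow: /

# Or, to block only a specific section (hypothetical path):
# User-agent: GPTBot
# Disallow: /premium-content/
```

As with any robots.txt change, the rule only works if the bot chooses to honour it.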
Note, once again: robots.txt blocks a page and its content from being crawled. In Google’s case, it does not prevent the indexing of a URL, no matter how many people online insist otherwise. That is Technical SEO 101.
So, I’ll say this once again: if you do not want your content used to feed ChatGPT’s AI, then you need to use robots.txt to block GPTBot from crawling it. This is part of Development & Technology.
This is about the most exciting thing to happen to robots.txt in about 15 years (which is probably the last time most people edited the file). Talk to me if you want to add this code but don’t know how to do it.

Google De-Spams How-To & FAQ Rich Results

In a little bit of a strange move, Google has changed the way FAQ & How-To Rich Results work.
FAQ rich results generated using the FAQPage schema will now *only* show for authoritative government and health websites – Google’s screen-grab image includes its own FAQ results, so I’m not sure if Google counts itself as a government or health website now.
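For reference, this is the kind of FAQPage structured data the change affects – a minimal JSON-LD sketch, with placeholder question and answer text:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Example question?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "Example answer text."
    }
  }]
}
```

The markup itself remains valid on any site; the change is simply that Google will no longer surface the rich result for most of them.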

Google Wants Your Content for AI

*SIGH*. AI and its content-ingesting, copyright-ignoring ways are the flavour-du-jour this week. Google in all its ever-loving kindness for the good of humanity wants to turn copyright law on its head and make it so all content is available for ingestion by its AI bots, UNLESS the content creator and / or publisher opts out.
In the same way that the web was created on the back of swiped content, Google is now trying to codify this kind of behaviour and make it all a-okay, when it’s not, it’s really not. Using other people’s hard-crafted content to train your own engine is just not acceptable – as the New York Times has decreed via its “special relationship” with Google. In some ways, it’s the same level of cheek that caused privacy uproars when people discovered that CCTV footage was being used to train AI engines’ image detection and facial recognition technology.
Even if you love AI, it has to operate within a framework. Talk to me about Content Strategy, SEO and Development within a holistic VCMO approach.

Google Says No Need to Content Prune

Content pruning is one of the maddest SEO content strategy crazes ever to pass through the internet. And it’s still not done yet. CNET announced this week that it has deleted a whole heap of pages for “SEO purposes”, and Danny Sullivan has responded along the lines of “well, no need to do it for SEO, it brings no benefit”.
Obviously, Google does not always speak the objective, accurate truth, sometimes offering a redacted or convenient version of it, but in this case it is probably right. Yes, Google does like fresh content, but no, it doesn’t prefer newer content over older content per se (or else its listings would be purely spam content from the last 5 minutes). Yes, you should prune really rubbish, unhelpful, or non-useful content. You should also prune really thin or duplicated content, but that is covered by the previous point.
Even if you do decide to prune content, basing the cut decision on page views is really silly. You never know who’s linking to a page, or what benefit it brings the site as a whole.
Want to talk about the kind of content you should have? Contact me.

New PMax “Best Practices” Guide

Coincidentally, this week is mainly about Google. I suppose that reflects the number of pies they have their fingers in. In this instance it’s a new “Best Practices” guide for the Google Ads Performance Max campaign type. Now, in amongst the “broad match on the letter S” self-serving advice, there is also a wealth of somewhat decent information, especially if you are new to PMax campaigns. And, of course, Performance Max campaigns are all AI-powered…
It really is worth getting on top of this now if you are a retailer looking towards Christmas. PMax has a habit of taking 2-3 weeks to get dialled in properly, so it’s best not to leave setup too late, as the algorithm has to learn what works best for your audience & product. If you want advice on setting up Performance Max, or other PPC campaigns, talk to me.