Hi Gerry! Can you tell us a little bit about yourself?
I am a geeky digital marketer, typically involved in aspects of SEO and data. I'm currently Technical SEO Lead at Just Eat, an international marketplace for online food delivery. I also do some freelance work, work as growth director on DrinksPal.com, and help run an infrequent digital marketing event called Take it Offline.
A while ago you said that you feel that "always crawling, always reporting is where we will be moving to soon". Why do you feel this is the case?
Yeah, that came up in my 10 Digital Tools Gerry White Couldn't Live Without article. Google is crawling your site all the time, thousands of pages a day on large sites. If something goes wrong on your site, you can't wait until your monthly crawl to find out that your traffic from search engines has dropped off a cliff. We've all seen clients where something got released and no one picked up on it until traffic took a nosedive.
Crawling on a desktop will often stop you from being able to work efficiently, so cloud crawling is the only way to go. The question is whether you should schedule a crawl once or twice a month or use a tool which is always looking at your site. With many tools, your budget restricts how frequently and how deeply you can crawl across your portfolio of sites.
What tools are you using for this, and how are you using them?
Our toolkit at Just Eat includes monitoring hit-level data to see if search engines or user traffic are acting differently. On top of that, we also actively use Screaming Frog to check by hand, and the new Google Search Console information is increasingly useful, though it can be a few days behind.
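To make the hit-level idea concrete, here is a minimal sketch of that kind of check in Python, assuming a standard "combined" access-log format; the log path, the 5% error threshold, and the user-agent match are all hypothetical simplifications (properly verifying Googlebot needs a reverse DNS lookup).

```python
import re
from collections import Counter

# Matches the common "combined" access-log format; simplified sketch.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_status_counts(log_path):
    """Count HTTP status codes served to Googlebot in an access log."""
    counts = Counter()
    with open(log_path) as handle:
        for line in handle:
            match = LOG_LINE.match(line)
            # User-agent check only; real verification needs reverse DNS.
            if match and "Googlebot" in match.group("agent"):
                counts[match.group("status")] += 1
    return counts

counts = googlebot_status_counts("access.log")  # hypothetical path
total = sum(counts.values())
errors = sum(n for status, n in counts.items() if status.startswith(("4", "5")))
# Flag a potential crawl problem if over 5% of Googlebot hits are errors.
if total and errors / total > 0.05:
    print(f"Warning: {errors}/{total} Googlebot hits returned 4xx/5xx")
```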
Most clients are on Google Analytics, and because Google Analytics has an excellent API, a well-set-up account will tell you a lot about what's happening on your site, especially now that you can feed this data into so many places and tools.
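As a rough illustration of what that looks like in practice, the sketch below pulls a week of organic sessions from the Google Analytics Reporting API v4; the service-account key file and view ID are placeholders for your own account.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Authenticate with a service account that has read access to the view.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": "12345678",  # hypothetical view ID
        "dateRanges": [{"startDate": "7daysAgo", "endDate": "today"}],
        "metrics": [{"expression": "ga:sessions"}],
        "dimensions": [{"name": "ga:date"}],
        # Restrict the report to organic search traffic only.
        "filtersExpression": "ga:medium==organic",
    }]
}).execute()

# Print daily organic sessions for the last week.
for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```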
Of course it's best to catch issues before they become issues. If you catch issues before they cause an impact on your organic traffic, you might be saving the company. This is why all of my clients are set up in ContentKing.
Why do you think it's important to continuously monitor sites and have access to SEO data in real-time?
Clients are releasing code multiple times a week, sometimes multiple times a day. A single bug that isn't obvious to users, but that has added or removed something, can significantly impact organic performance. We've seen test pages dropping robots noindex directives and live pages creating spider traps.
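Here is a minimal sketch of the kind of check a continuous monitor automates: fetch a page and verify its robots meta and canonical tags. The URLs and expectations are hypothetical, and the regexes assume a fixed attribute order; a real monitor would also handle redirects and HTTP headers.

```python
import re
import requests

# Hypothetical pages and the state we expect each one to be in.
PAGES = {
    "https://www.example.com/": {"noindex": False,
                                 "canonical": "https://www.example.com/"},
    "https://test.example.com/": {"noindex": True, "canonical": None},
}

# Simplified patterns; they assume name/rel comes before content/href.
META_ROBOTS = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)', re.I)
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)', re.I)

for url, expected in PAGES.items():
    html = requests.get(url, timeout=10).text
    robots = META_ROBOTS.search(html)
    noindex = bool(robots and "noindex" in robots.group(1).lower())
    if noindex != expected["noindex"]:
        print(f"ALERT: {url} noindex is {noindex}, expected {expected['noindex']}")
    canonical = CANONICAL.search(html)
    href = canonical.group(1) if canonical else None
    if expected["canonical"] and href != expected["canonical"]:
        print(f"ALERT: {url} canonical points to {href}")
```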
How has this trend changed the way you do SEO?
With many clients I can do a rapid check using the 'change tracking' feature in ContentKing. Being able to quickly spot that the analytics account has fallen off a page means the campaign we have been working on is being tracked almost continuously. I can even see when title tags have been updated or when the canonical tags have all been pointed to the homepage, and knowing exactly when this happens means we can isolate which JIRA ticket went live at that point.
Working across multiple sites, the less time I spend checking for basic SEO issues, the more time I have to work on moving forward, spotting new and different opportunities as well as issues. Having an up-to-date view of the websites I work on is critical to developing faster.
As for "always reporting", in what format do you do this?
Google Data Studio has become a key part of my day-to-day. It isn't just for reporting; it has become a powerful analysis tool and a way of querying data from APIs. The future will be everything in one single report, with blended data and more, but I still need to use a few other tools.
I love the data in Sistrix, with its daily visibility updates; it is another tool that can alert us across multiple websites and multiple countries (and it is mobile focused). Sistrix is my favourite visibility tool, but for specific keywords we use AccuRanker globally, and through our agency we have access to SEOmonitor; both have market-leading strengths. As their data can be pulled into Google Data Studio with a free connector, this is increasingly the approach companies are taking: not becoming the dashboard, but making the content for it.
What kind of reports have you set up and what KPIs do you report on?
I tend to avoid reporting as much as possible and focus on investigation and intelligence. No one reads weekly or monthly data reports unless it's a core part of their role, so I focus on what happened, why, and the impact, both negative and positive: something that is actionable or proves the case for further investment.
One thing I have been a little obsessed with is segmentation. Using Data Studio's 'CASE' statement and Google Tag Manager, you are able to far more rapidly improve the segmentation of visitor types in analytics, for instance users who are logged in, page type, or page status (such as 404 pages).
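As a hypothetical example of that approach, a Data Studio calculated field can bucket pages with a CASE statement. CASE and REGEXP_MATCH are the actual Data Studio functions involved, but the field name and paths below are made up for illustration.

```
CASE
  WHEN REGEXP_MATCH(Page, "^/restaurants/.*") THEN "Restaurant page"
  WHEN REGEXP_MATCH(Page, "^/blog/.*") THEN "Blog"
  WHEN Page = "/404" THEN "404 page"
  ELSE "Other"
END
```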
If you could give SEOs one tip when it comes to real-time SEO data, what would it be?
You can't monitor everything everywhere. Don't obsess about that.
Having said that, interactive dashboards using real-time SEO data are definitely the future. This is a trend set to rise, and I can confidently predict that in 2020 we will see smarter alerts based on machine-learnt significance.
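One simple way to read "alerts based on machine-learnt significance" is statistical anomaly detection on a traffic series. This sketch uses made-up numbers and a plain z-score as a stand-in for anything genuinely learnt.

```python
from statistics import mean, stdev

# Hypothetical daily organic sessions for the last two weeks.
history = [8200, 8450, 7900, 8600, 8300, 8150, 8500,
           8400, 8250, 8700, 8350, 8100, 8550, 8450]
today = 5100

# Alert when today falls far outside what recent history makes plausible.
baseline, spread = mean(history), stdev(history)
z_score = (today - baseline) / spread
if abs(z_score) > 3:
    print(f"ALERT: today's sessions ({today}) are {z_score:.1f} standard "
          f"deviations from the recent baseline ({baseline:.0f})")
```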