The Fundamentals of Crawling for SEO – Whiteboard Friday

In this week’s episode of Whiteboard Friday, host Jes Scholz digs into the foundations of search engine crawling. She’ll show you why no indexing issues doesn’t necessarily mean no issues at all, and how — when it comes to crawling — quality is more important than quantity.

infographic outlining the fundamentals of SEO crawling

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Good day, Moz fans, and welcome to another edition of Whiteboard Friday. My name is Jes Scholz, and today we’re going to be talking about all things crawling. What’s important to understand is that crawling is essential for every single website, because if your content is not being crawled, then you have no chance to get any real visibility within Google Search.

So when you really think about it, crawling is fundamental, and it’s all based on Googlebot’s somewhat fickle attentions. A lot of the time people say it’s really easy to understand if you have a crawling issue. You log in to Google Search Console, go to the Exclusions Report, and check whether you have the status “Discovered – currently not indexed.”

If you do, you have a crawling problem, and if you don’t, you don’t. To some extent, this is true, but it’s not quite that simple, because what that’s telling you is whether you have a crawling issue with your new content. But it’s not only about having your new content crawled. You also want to ensure that your content is crawled when it is significantly updated, and this is not something that you’re ever going to see within Google Search Console.

But say that you have refreshed an article or you’ve done a significant technical SEO update, you are only going to see the benefits of those optimizations after Google has crawled and processed the page. Or on the flip side, if you’ve done a big technical optimization and then it’s not been crawled and you’ve actually harmed your site, you’re not going to see the harm until Google crawls your site.

So, essentially, you can’t fail fast if Googlebot is crawling slow. So now we need to talk about measuring crawling in a really meaningful manner because, again, when you’re logging in to Google Search Console, you now go into the Crawl Stats Report. You see the total number of crawls.

I take big issue with anybody who says you need to maximize the amount of crawling, because the total number of crawls is absolutely nothing but a vanity metric. If I have 10 times the amount of crawling, that does not necessarily mean that I have 10 times more indexing of content that I care about.

All it correlates with is more load on your server, and that costs you more money. So it’s not about the amount of crawling. It’s about the quality of crawling. This is how we need to start measuring crawling, because what we need to do is look at the time between when a piece of content is created or updated and how long it takes for Googlebot to go and crawl that piece of content.

The time difference between the creation or the update and that first Googlebot crawl: I call this the crawl efficacy. So measuring crawl efficacy should be relatively simple. You go to your database and you export the created at time or the updated time, and then you go into your log files and you get the next Googlebot crawl, and you calculate the time differential.
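To make that concrete, here’s a minimal sketch of the calculation, assuming you’ve already exported an update timestamp from your database and a list of Googlebot hit timestamps for that URL from your log files (the function name and sample dates are invented for illustration):

```python
from datetime import datetime

def crawl_efficacy(updated_at, googlebot_hits):
    """Hours between a content update and the first Googlebot crawl after it."""
    later = [t for t in googlebot_hits if t >= updated_at]
    if not later:
        return None  # not yet crawled since the update
    return (min(later) - updated_at).total_seconds() / 3600

updated = datetime(2023, 1, 10, 8, 0)    # article republished at 8:00 a.m.
hits = [
    datetime(2023, 1, 9, 14, 0),         # crawl before the update (ignored)
    datetime(2023, 1, 11, 2, 0),         # first crawl after the update
]
print(crawl_efficacy(updated, hits))     # 18.0 (hours)
```

The lower this number, the faster your content changes are being picked up.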

But let’s be real. Getting access to log files and databases is not really the easiest thing for a lot of us to do. So you can use a proxy. What you can do is go and look at the last modified date time from your XML sitemaps for the URLs that you care about from an SEO perspective (which are the only ones that should be in your XML sitemaps), and then look at the last crawl time from the URL Inspection API.

What I really like about the URL Inspection API is that, for the URLs you’re actively querying, you can also get the indexing status when it changes. So with that information, you can actually start calculating an indexing efficacy score as well.
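The same arithmetic works with those two proxies: the `<lastmod>` value from your XML sitemap and the `lastCrawlTime` field the URL Inspection API returns, both ISO 8601 strings. A minimal sketch with made-up sample values:

```python
from datetime import datetime

def efficacy_from_proxies(sitemap_lastmod, last_crawl_time):
    """Crawl efficacy in hours from sitemap <lastmod> vs. the API's lastCrawlTime."""
    modified = datetime.fromisoformat(sitemap_lastmod)
    # Python's fromisoformat (pre-3.11) doesn't accept a trailing "Z"
    crawled = datetime.fromisoformat(last_crawl_time.replace("Z", "+00:00"))
    return (crawled - modified).total_seconds() / 3600

print(efficacy_from_proxies("2023-05-01T08:00:00+00:00",
                            "2023-05-02T08:00:00Z"))  # 24.0
```

Run this across the URLs in your sitemap and you have a crawl efficacy report without ever touching a log file.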

So looking at when you’ve done that republishing or when you’ve done the first publication, how long does it take until Google then indexes that page? Because, really, crawling without corresponding indexing is not really valuable. So when we start looking at this and we’ve calculated real times, you might see it’s within minutes, it might be hours, it might be days, it might be weeks from when you create or update a URL to when Googlebot is crawling it.

If this is a long time period, what can we actually do about it? Well, search engines and their partners have been talking a lot in the last few years about how they’re helping us as SEOs to crawl the web more efficiently. After all, this is in their best interests. From a search engine point of view, when they crawl us more effectively, they get our valuable content faster and they’re able to show that to their audiences, the searchers.

It’s also something where they can have a nice story, because crawling puts a lot of weight on us and our environment. It causes a lot of greenhouse gases. So by making crawling more efficient, they’re also actually helping the planet. This is another motivation for why you should care about this as well. So they’ve spent a lot of effort in releasing APIs.

We’ve got two APIs. We’ve got the Google Indexing API and IndexNow. The Google Indexing API, Google said multiple times, “You can actually only use this if you have job posting or broadcast structured data on your website.” Many, many people have tested this, and many, many people have proved that to be false.

You can use the Google Indexing API to crawl any type of content. But this is where this idea of crawl budget and maximizing the amount of crawling proves itself to be problematic because although you can get these URLs crawled with the Google Indexing API, if they do not have that structured data on the pages, it has no impact on indexing.

So all of that crawling weight that you’re putting on the server and all of that time you invested to integrate with the Google Indexing API is wasted. That is SEO effort you could have put somewhere else. So long story short, Google Indexing API, job postings, live videos, very good.

Everything else, not worth your time. Good. Let’s move on to IndexNow. The biggest challenge with IndexNow is that Google doesn’t use this API. Obviously, they’ve got their own. So that doesn’t mean disregard it though.

Bing uses it, Yandex uses it, and a whole lot of SEO tools, CRMs, and CDNs also utilize it. So, generally, if you’re in one of these platforms and you see, oh, there’s an indexing API, chances are it’s powered by and submitting to IndexNow. The good thing about all of these integrations is that it can be as simple as toggling on a switch and you’re integrated.
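Under the hood, a bulk IndexNow submission is just an HTTP POST with a small JSON body, as documented at indexnow.org. Here’s a sketch that builds the payload; the host, key, and URL are placeholders, and the key must match a `{key}.txt` file hosted at your site root:

```python
import json

def indexnow_payload(host, key, urls):
    """JSON body for a bulk IndexNow submission (see indexnow.org for the spec)."""
    return json.dumps({"host": host, "key": key, "urlList": urls})

body = indexnow_payload("www.example.com", "your-indexnow-key",
                        ["https://www.example.com/updated-article"])
# POST this body to https://api.indexnow.org/indexnow with
# Content-Type: application/json; charset=utf-8
print(body)
```

Participating engines share submissions with each other, which is exactly why the caveats below matter.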

This might seem very tempting, very exciting, nice, easy SEO win, but caution, for three reasons. The first reason is your target audience. If you just toggle on that switch, you’re going to be telling a search engine like Yandex, big Russian search engine, about all of your URLs.

Now, if your site is based in Russia, excellent thing to do. If your site is based somewhere else, maybe not a very good thing to do. You’re going to be paying for all of that Yandex bot crawling on your server and not really reaching your target audience. Our job as SEOs is not to maximize the amount of crawling and weight on the server.

Our job is to reach, engage, and convert our target audiences. So if your target audiences aren’t using Bing, they aren’t using Yandex, really consider if this is something that’s a good fit for your business. The second reason is implementation, particularly if you’re using a tool. You’re relying on that tool to have done a correct implementation with the indexing API.

So, for example, one of the CDNs that has done this integration does not send events when something has been created or updated or deleted. They rather send events every single time a URL is requested. What this means is that they’re pinging to the IndexNow API a whole lot of URLs which are specifically blocked by robots.txt.

Or maybe they’re pinging to the indexing API a whole bunch of URLs that are not SEO relevant, that you don’t want search engines to know about, and they can’t find through crawling links on your website, but all of a sudden, because you’ve just toggled it on, they now know these URLs exist, they’re going to go and index them, and that can start impacting things like your Domain Authority.

That’s going to be putting unnecessary weight on your server. The last reason is whether it actually improves efficacy, and this is something you must test for your own website if you feel that this is a good fit for your target audience. But from my own testing on my websites, what I learned is that when I toggled this on and measured the impact with KPIs that matter, crawl efficacy and indexing efficacy, it didn’t actually help me to crawl URLs which would not have been crawled and indexed naturally.

So while it does trigger crawling, that crawling would have happened at the same rate whether IndexNow triggered it or not. So all of that effort that goes into integrating that API or testing if it’s actually working the way that you want it to work with those tools, again, was a wasted opportunity cost. The last area where search engines will actually support us with crawling is in Google Search Console with manual submission.

This is actually one tool that is truly useful. It will generally trigger a crawl within around an hour, and that crawl does positively influence indexing in most cases, not all, but most. But of course, there is a challenge, and the challenge when it comes to manual submission is that you’re limited to 10 URLs within 24 hours.

Now, don’t disregard it just for that reason. If you’ve got 10 very highly valuable URLs and you’re struggling to get those crawled, it’s definitely worthwhile going in and doing that submission. You can also write a simple script where you can just click one button and it’ll go and submit 10 URLs in Search Console every single day for you.

But it does have its limitations. So, really, search engines are trying their best, but they’re not going to solve this issue for us. So we really have to help ourselves. What are three things that you can do which will truly have a meaningful impact on your crawl efficacy and your indexing efficacy?

The first area where you should be focusing your attention is on XML sitemaps, making sure they’re optimized. When I talk about optimized XML sitemaps, I’m talking about sitemaps which have a last modified date time, which updates as close as possible to the create or update time in the database. What a lot of your development teams will do naturally, because it makes sense for them, is to run this with a cron job, and they’ll run that cron once a day.

So maybe you republish your article at 8:00 a.m. and they run the cron job at 11:00 p.m., and so you’ve got all of that time in between where Google or other search engine bots don’t actually know you’ve updated that content because you haven’t told them with the XML sitemap. So getting that actual event and the reported event in the XML sitemaps close together is really, really important.
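As a sketch of the event-driven approach, you could emit each sitemap `<url>` entry at publish time, with `<lastmod>` taken straight from the database timestamp rather than from a nightly cron run (the URL and date here are invented):

```python
from datetime import datetime, timezone
from xml.sax.saxutils import escape

def sitemap_entry(loc, updated_at):
    """One sitemap <url> element whose <lastmod> mirrors the database update time."""
    lastmod = updated_at.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S+00:00")
    return f"<url><loc>{escape(loc)}</loc><lastmod>{lastmod}</lastmod></url>"

# call this from the publish/update handler, not from a daily cron job
print(sitemap_entry("https://example.com/article",
                    datetime(2023, 5, 1, 8, 0, tzinfo=timezone.utc)))
```

Regenerating (or at least touching) the sitemap on the publish event closes the gap between the actual update and the reported one.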

The second thing you can do is optimize your internal links. Here I’m talking about all of your SEO-relevant internal links. Review your sitewide links. Have breadcrumbs on mobile, not just desktop. Make sure your SEO-relevant filters are crawlable. Make sure you’ve got related content links to build up those silos.

This is something that you have to go into your phone, turn your JavaScript off, and then make sure that you can actually navigate those links without that JavaScript, because if you can’t, Googlebot can’t on the first wave of indexing, and if Googlebot can’t on the first wave of indexing, that will negatively impact your indexing efficacy scores.

Then the last thing you want to do is reduce the number of parameters, particularly tracking parameters. Now, I very much understand that you need something like UTM tag parameters so you can see where your email traffic is coming from, you can see where your social traffic is coming from, you can see where your push notification traffic is coming from, but there is no reason that those tracking URLs need to be crawlable by Googlebot.

They’re actually going to harm you if Googlebot does crawl them, especially if you don’t have the right indexing directives on them. So the first thing you can do is just make them not crawlable. Instead of using a question mark to start your string of UTM parameters, use a hash. It still tracks perfectly in Google Analytics, but it’s not crawlable for Google or any other search engine.
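A minimal sketch of that rewrite (the URL and parameter values are illustrative, and note that your analytics setup must be able to read the fragment):

```python
def fragment_utm(url, **utm):
    """Append tracking parameters as a #fragment instead of a ?query string,
    so search engine crawlers treat every variant as the same URL."""
    params = "&".join(f"{k}={v}" for k, v in utm.items())
    return f"{url}#{params}"

print(fragment_utm("https://example.com/article",
                   utm_source="newsletter", utm_medium="email"))
# https://example.com/article#utm_source=newsletter&utm_medium=email
```

Because everything after the `#` is never sent to the server or followed as a distinct URL, Googlebot sees one canonical page instead of dozens of parameter variants.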

If you want to geek out and keep learning more about crawling, please hit me up on Twitter. My handle is @jes_scholz. And I wish you a lovely rest of your day.

Video transcription by Speechpad.com

Diving for Pearls: A Guide to Long Tail Keywords – Next Level

Welcome to this refreshed installment of our educational Next Level series! Originally published in June 2016, this blog has been rewritten to include new tool screenshots and refreshed workflows. Together we’ll uncover keywords in the vastness of the long tail.

Looking for more Next Level posts? Previously we explored how to create relevant and engaging SEO reports.

One of the biggest obstacles to driving forward your business online is being able to rank well for keywords that people are searching for. Getting your lovely URLs to show up in those precious top positions — and gaining a good portion of the visitors behind the searches — can feel like an impossible dream. Particularly if you’re working on a newish site on a modest budget within a competitive niche.

Well, strap yourself in, because today we’re going to live that dream. I’ll take you through the bronze, silver, and gold levels of finding, assessing, and targeting long tail keywords so you can start getting visitors to your site that are primed and ready to convert.

Quick steps to building a long tail keyword list:

  1. Draw from your industry and customer knowledge

  2. Add suggestions from Google Autocomplete

  3. Explore industry language on social media

  4. Pull relevant suggestions from a keyword tool

  5. Prioritize using popularity and difficulty metrics

  6. Understand the competitive landscape to pinpoint opportunities

What are long tail keywords?

The “long tail of search” refers to the many weird and wonderful ways the diverse people of the world search for any given niche.

People (yes, people! Shiny, happy, everyday, run-of-the-mill, muesli-eating, credit-card-swiping people!) rarely stop at broad and generic ‘head’ keywords, like “web design” or “camera” or “sailor moon.”

They clarify their search with emotional triggers and technical terms they’ve learned from reading forums, and they compare features and prices before mustering up the courage to commit and convert on your site.

The long tail is packed with searches like “best web designer in Nottingham” or “mirrorless camera 4k video 2016” or “sailor moon cat costume.”

This adaptation of the Search Demand Curve chart visualizes the long tail of search by using the tried and tested “Internet loves cats + animated gifs are the coolest = SUCCESS” formula.

The Search Demand Curve illustrates that while “head” and “body” terms typically amass higher search volume and seem appealing at first, the vastness of the “long tail” presents a more substantial opportunity and a larger percentage of search traffic that shouldn’t be ignored. You can really see this illustrated when combined as a percentage of search traffic. While this graph contains no cats, it is still entirely illustrative. And the long tail of search isn’t slowing down anytime soon: with voice search and AI integrations, we can expect the vastness of the long tail to continue to grow.

While search volume for any individual long tail keyword is typically lower, user intent is much more specific, and, viewed as a group, targeting the long tail often enables you to reach a larger, more engaged audience. This is also beautifully illustrated in Dr. Pete’s infamous chunky thorax post.

The long tail of search is being constantly generated by people seeking answers from the Internet hive mind. There’s no end to what you’ll find if you have a good old rummage about, including: Questions, styles, colors, brands, concerns, peeves, desires, hopes, dreams… and everything in between.

Fresh, new, outrageous, often bizarre keywords. If you’ve done any keyword research you’ll know what I mean by bizarre. Things a person wouldn’t admit to their best friend, therapist, or doctor, they’ll happily pump into Google and hit search. In this post we’re going to go diving for pearls: keywords with searcher intent, high demand, low competition, and a spot on the SERPs just for you.

Bronze medal: Build your own long tail keyword

It’s really easy to come up with a long tail keyword. You can use your brain, gather some thoughts, take a stab in the dark, and throw a few keyword modifiers around your ‘head’ keyword.

Have you ever played that magnetic fridge poetry game? It’s a bit like that. You can play online if (like me) you have an aversion to physical things.

I’m no poet, but I think I deserve a medal for this attempt, and now I really want some “hot seasonal berry water.”

Magnetic poetry not doing it for you? Don’t worry — that’s only the beginning.

Use your industry knowledge

Time to draw on that valuable industry knowledge you’ve been storing up, jot down some ideas, and think about intent and common misconceptions. I’m going to use the example “pearls” or “freshwater pearls” in this post as the head term because that’s something I’m interested in.

Let’s go! Let’s say I run a jewelry business and I know that my customers regularly have questions, like:

How do I clean freshwater pearls?

Using my knowledge, I can rattle off ideas and build a keyword list.

Search your keyword

Engage Google’s suggested search tool to get some more ideas. Manually enter your keyword into Google, prompting it to populate popular suggestions, like I’ve done below:

Awesome, I’m adding “freshwater pearls price” to my list.

Explore the language of social media

Get amongst the over-sharers and have a look at what people are chatting about on social media by searching your keyword in Twitter, TikTok, Instagram, and YouTube. These are topics in your niche that people are talking about right now.

YouTube is also pulling up some interesting ideas around my keyword. This is simultaneously helping me gather keyword ideas and giving me a good sense about what content is already out there. Don’t worry, we’ll touch on content later on in this post. 🙂

I’m adding “understanding types of pearls” and “difference between saltwater and freshwater pearls” to my list.

Ask keyword questions…?

You’ll probably notice that I’ve added a question mark to a phrase that is not a question, just to mess with you all. Apologies for the confusing internal-reading-voice-upwards-inflection.

Questions are my favorite types of keywords. What!? You don’t have a fav keyword type? Well, you do now — trust me.

Answer the Public is packed with questions radiating out from your seed term.

Pop freshwater pearls into the tool and grab some questions for our growing list.

To leave no rock unturned (or no mollusk unshucked), let’s pop over to Google Search Console to find keywords that are already sending you traffic (and discover any mismatches between your content and user intent.)

Pile these into a list, like I’ve done in this spreadsheet.

Now this is starting to look interesting: we’ve got some keyword modifiers, some clear buying signals, and a better idea of what people might be looking for around “freshwater pearls.”

Should you stop there? I’m flabbergasted — how can you even suggest that?! This is only the beginning. 🙂

Silver medal: Assess demand and explore topics

So far, so rosy. But we’ve been focusing on finding keywords, picking them up, and stashing them in our collection like colored glass at the seaside.

To really dig into the endless tail of your niche, you’ll need a keyword tool like our very own Keyword Explorer. This is invaluable for finding topics within your niche that present a real opportunity for your site.

If you’re trying out Keyword Explorer for the first time, you’ll have 10 free searches/mo with a free Moz Community account and even more with a Moz Pro free trial or paid subscription.

Find search volume for your head keyword

To start, enter a broad industry keyword. In my case, I’ll type “pearls” into the Keyword Explorer search box. Now you can see Moz’s Monthly Volume, displaying how often a term or phrase is searched for in Google:

Now try “freshwater pearls.” As expected, the search volume goes down, but we’re getting more specific.

We could keep going like this, but we’re going to burn up all our free searches. Just take it as read that, as you get more specific and enter all the phrases we found earlier, the search volume will decrease even more. There may not be any data at all. That’s why you need to explore the searches around this main keyword.

Find even more long tail keywords

Below the search volume, click on “Keyword Suggestions.”

Well, hi there, ever-expanding long tail! We’ve gone from a handful of keywords pulled together manually from different sources to 1,000 suggestions right there on your screen. Positioned right next to that is search volume, giving us an idea of demand.

The diversity of searches within your niche is just as important as that big number we saw at the beginning, because it shows you how much demand there is for this niche as a whole. We’re also learning more about searcher intent.

I’m scanning through those 1,000 suggestions and looking for other terms that pop up again and again. I’m also looking for signals and different ways the words are being used to pick out words to expand my list.
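If you’d rather not eyeball 1,000 rows, a quick word count over an exported suggestion list surfaces those recurring modifiers for you. A minimal sketch with a tiny invented sample:

```python
from collections import Counter

suggestions = [          # exported keyword suggestions (sample data)
    "cultured freshwater pearls",
    "freshwater pearls price",
    "cultured pearls value",
    "real freshwater pearls price",
]

# count every word, then drop the seed term so only modifiers remain
modifiers = Counter(w for phrase in suggestions for w in phrase.split())
for seed_word in ("freshwater", "pearls"):
    del modifiers[seed_word]

print(modifiers.most_common(3))  # [('cultured', 2), ('price', 2), ('value', 1)]
```

Words that keep surfacing, like “cultured” and “price” here, are good candidates for the sub-topics we dig into next.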

I like to toggle between sorting by Relevancy and search volume, and then scroll through all the results to cherry-pick those that catch my eye.

Now reverse the volume filter so that it’s showing lower-volume search terms and scroll down through the end of the tail to explore the lower-volume chatter.

If we don’t have tracked data in our database, you can always cross-reference with another data set to validate a keyword’s value.

This is where your industry knowledge comes into play again. Bots, formulas, spreadsheets, and algorithms are all well and good, but don’t discount your own instincts and knowledge.

Use the suggestion filters to your advantage and play around with broader or more specific suggestion types.

Looking through the suggestions, I’ve noticed that the word “cultured” has popped up a few times.

To see these all bundled together, I want to look at the grouping options in Keyword Explorer. I like the high lexical similarity groups so I can see how much discussion is going on within my topics.

Scroll down and expand that group to get an idea of demand and assess intent.

I’m also interested in the words around “price” and “value,” so I’m doing the same and saving those to my sheet, along with the search volume. A few attempts at researching the “cleaning” of pearls wasn’t very fruitful, so I’ve adjusted my search to “clean freshwater pearls.”

Because I’m a keyword questions fanatic, I’m also going to filter by questions (the bottom option from the drop-down menu):

OK! How is our keyword list looking? Pretty darn hot, I reckon! We’ve gathered together a list of keywords and dug into the long tail of these sub-niches, and right alongside we’ve got search volume.

You’ll notice that some of the keywords I discovered in the bronze stage don’t have any data showing up in Keyword Explorer (indicated by the hyphen in the screenshot above). That’s ok — they’re still topics I can research further. This is exactly why we have assessed demand; no wild goose chase for us!

Ok, we’re drawing some conclusions, we’re building our list, and we’re making educated decisions. Congrats on your silver-level keyword wizardry! 😀

Gold medal: Find out who you’re competing with

We’re not operating in a vacuum. There’s always someone out there trying to elbow their way onto the first page. Don’t fall into the trap of thinking that just because it’s a long tail term with a nice chunk of search volume all those clicks will rain down on you. If the terms you’re looking to target already have big names headlining, this could very well alter your roadmap.

To reap the rewards of targeting the long tail, you’ll have to make sure you can outperform your competition.

Manually check the SERPs

Check out who’s showing up in the search engine results page (SERPs) by running a search for your head term. Make sure you’re signed out of Google and in an incognito tab.

We’re focusing on the organic results to find out if there are any weaker URLs you can pick off.

I’ll start with “freshwater pearls” for illustrative purposes.

Whoooaaa, this is a noisy page. I’ve had to scroll a whole 2.5cm on my magic mouse (that’s very nearly a whole inch for the imperialists among us) just to see any organic results.

Let’s install the MozBar to discover some metrics on the fly, like Domain Authority and backlink data.

Now, if seeing those big players in the SERPs doesn’t make it clear, looking at the Mozbar metrics certainly does. This is exclusive real estate. It’s dominated by retailers, although Wikipedia gets a place in the middle of the page.

Let’s get into the mind of Google for a second. It — or should I say “they” (I can’t decide if it’s more creepy for Google to be referred to as a singular or plural pronoun. Let’s go with “they”) — anyway, I digress. “They” are guessing that we’re looking to buy pearls, but they’re also offering results on what they are.

This sort of information is offered up by big retailers who have created content that targets the intention of searchers. Mikimoto drives us to their blog post all about where freshwater pearls come from.

As you get deeper into the long tail of your niche, you’ll begin to come across sites you might not be so familiar with. So go and have a peek at their content.

With a little bit of snooping you can easily find out:

  • how relevant the article is

  • if it looks appealing, up to date, and sharable

  • be really judge-y: why not?

Now let’s find some more:

  • when the article was published

  • when their site was created

  • how often their blog is updated

  • how many other sites are linking to the page with Link Explorer

  • how many tweets, likes, etc.

Learn more about how to do a competitor analysis in our free guide, and don’t forget to download the handy worksheet.

Document all of your findings in our spreadsheet from earlier to keep track of the data. This information will now inform you of your chances of ranking for that term.

Manually checking out your competition is something that I would strongly recommend. But we don’t have all the time in the world to check each results page for each keyword we’re interested in.

Keyword Explorer leaps to our rescue again

Run your search and click on “SERP Analysis” to see what the first page looks like, along with authority metrics and social activity.

All the metrics for the organic results, like Page Authority, go into calculating the Difficulty score above (lower is better).

And all those other factors — the ads and suggestions taking up space on the SERPs — that’s what’s used to calculate Organic CTR (higher is better).

Priority is all the other metrics tallied up. You definitely want this to be higher.

So now we have 3 important numerical values we can use to gauge our competition. We can use these values to compare keywords.

After a few searches in Keyword Explorer, you’re going to start hankering for a keyword list or two. For this you’ll need a paid subscription, or a Moz Pro 30-day free trial.

It’s well worth the sign-up; not only do you get 5,000 keyword queries per month and 30 lists (on the Medium plan), but you also get to check out the super-magical-KWE-mega-list-funky-cool metric page. That’s what I call it, just rolls off the tongue, you know?

Okay, fellow list buddies, let’s go and add those terms we’re interested in to our lovely new list.

Then head up to your lists on the top right and open up the one you just created.

Now we can see the spread of demand, competition and SERP features for our whole list.

You can compare Volume, SERPS Features, Difficulty, Organic CTR, and Priority across multiple lists, topics, and niches.

How to compare apples with apples

Comparing keywords is something our support team gets questions about all the time.

Should I target this word or that word?

For the long tail keyword, the Volume is a lot lower, Difficulty is also down, the Organic CTR is a bit up, and overall the Priority is down because of the drop in search volume.

But don’t discount it! By targeting these sorts of terms, you’re focusing more on the intent of the searcher. You’re also making your content relevant for all the other neighboring search terms.

Let’s compare the difference between freshwater and cultured pearls with how much are freshwater pearls worth.

Search volume is the same, but for the keyword how much are freshwater pearls worth Difficulty is up, but so is the overall Priority because the Organic CTR is higher.

But just because you’re picking between two long tail keywords doesn’t mean you’ve fully understood the long tail of search.

You know all those keywords I grabbed for my list earlier in this post? Well, here they are sorted into topics.

Look at all the different ways people search for the same thing. This is what drives the long tail of search — searcher diversity. If you tally all the volume up for the cultured topic, we’ve got a bigger group of keywords and overall more search volume. This is where you can use Keyword Explorer and the long tail to make informed decisions.
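Tallying a topic’s combined demand is simple enough to script once you’ve exported your sheet. A sketch with invented volumes:

```python
# (keyword, topic, monthly search volume) rows exported from the research sheet
keywords = [
    ("cultured pearls", "cultured", 6500),
    ("what are cultured pearls", "cultured", 500),
    ("freshwater pearls price", "price", 850),
    ("how much are freshwater pearls worth", "price", 200),
]

totals = {}
for _, topic, volume in keywords:
    totals[topic] = totals.get(topic, 0) + volume

# rank topics by combined demand rather than by any single keyword's volume
for topic, vol in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(topic, vol)
```

Comparing topics by their combined totals, rather than keyword by keyword, is what reveals where the long tail demand actually clusters.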

You’re laying out your virtual welcome mat for all the potential traffic these terms send.

Platinum level: I lied — there’s one more level!

For all you lovely overachievers out there who have reached the end of this post, I’m going to reward you with one final tip.

You’ve done all the snooping around on your competitors, so you know who you’re up against. You’ve done the research, so you know what keywords to target to begin driving intent-rich traffic.

Now you need to create strong, consistent, and outstanding content.

As Dr. Pete confirmed:

We don’t have to work ourselves to death to target the long tail of search. It doesn’t take 10,000 pieces of content to rank for 10,000 variants of a phrase, and Google (and our visitors) would much prefer we not spin out that content. The new, post-NLP (Natural Language Processing) long tail of SEO requires us to understand how our keywords fit into semantic space, mapping their relationships and covering the core concepts. Study your SERPs diligently, and you can find the patterns to turn your own long tail of keywords into a chonky thorax of opportunity.

Here’s where you really have to tip your hat to long tail keywords, because by strategically targeting the long tail you can start to build enough authority in the industry to beat stronger competition and rank higher for more competitive keywords in your niche.

Wrapping up…

The various keyword phrases that make up the long tail in your industry are vast, often easier to rank for, and indicate stronger intent from the searcher. By targeting them, you’ll find you can start to rank for relevant phrases sooner than if you just targeted the head. And over time, if you get the right signals, you’ll be able to rank for keywords with tougher competition. Pretty sweet, huh? Give Moz’s Keyword Explorer tool a whirl and let me know how you get on 🙂

The Mozbot Mashup: Roger Explores the World of Generative AI Imagery

AI image generation has taken big leaps forward in the last year. It’s fun to play with. It’s a little bit weird. It can produce some mind-blowing results — and often laughable ones.

But is it useful in a marketing context?

We decided to find out, and our valiant SEO robot, Roger, was volunteered to be our first test subject. Don’t worry, he was cool with it. He was actually pretty excited to have a machine intelligence to engage with, after spending so much time doling out SEO knowledge to us simple humans.

Training the model

AI imagery tools like Midjourney, Stable Diffusion, and DALL-E 2 are pretty amazing at creating images of just about anything you can dream up, but they have their own algorithmic, random-noise way of getting there. So while you can land on interesting results, it can be hard to produce a specific one.

To get to anything that actually looked like our friendly SEO Mozbot, we needed to train a stable diffusion model to get a start. There are a lot of ways to go about this, some that get pretty technical, and a number of others that use app interfaces to make the process easier on someone with a little less technical expertise.

We chose to start with Astria, a solution which allows you to customize (they call it tuning) a model of your own. A lot of users train it on their own likeness to make cool avatars (like the popular Lensa app), but we threw a bunch of variations of Roger in there, had him party with the AI model, and watched what kind of shenanigans they got up to.

A Rogues Gallery of Rogers

These tools generate images based on a text prompt, so our initial prompt was to see if it could output a version in a fun and colorful 3D style.

Not bad first results! It was clear this generation drew heavily from photos of a Roger toy held in a hand, as well as a photo of our life-size Roger mascot at one of our MozCon events (thus, the people in the background of some of the images). These are all actually recognizable as Roger, which I was impressed by, though none of them are quite “right”.

Time to try something in a completely different style. How about “Roger Mozbot with a rocket jetpack and fishbowl helmet, watercolor painting.”

Some super fun results! And others that look like Roger is having a very bad time. Also, apparently the “rocket” part of our prompt gave Roger some hardware in some of the results that made it look like his switch was accidentally set from Hugs to Destroy.

Further iterations produced equally interesting, fun, terrible, and wacky results as we messed around with other styles including more 3D, schematics, children’s book illustrations, and even Anime!

They just keep coming…

Want even more Roger mashups? We experimented further with Scenario.gg, a tool targeted toward creating game assets that also has a nifty way to train a generator. A bonus of this one is that you can use an existing image as a starting point for a generation, allowing a little additional control over how closely you hew to that starting point. Here are some of those results:

If you’re following generative AI, you know it’s an area evolving incredibly fast right now, with new tools, features, and techniques constantly coming out. A couple of weeks after our initial generations on Astria, we delved back in and found they had added a video generation feature. A little trial and error later, we had a super cool little video of Roger to go with all those pictures:

What have we done?

We’ve put Roger through the AI wringer, but to what end? Sorry Roger, it was all in the name of… SCIENCE! And learning. The initial experiments produced a ton of quantity, but the quality was not quite there, at least for reproducing a brand mascot with a specific look, one that may not be widely disseminated enough to have been a subject of training for the models. If you are a little less specific about the results you are trying to achieve, AI imagery is already producing jaw-dropping results: good enough that we are finding other ways to use this imagery in our marketing material, and no doubt you have seen some really cool stuff in your various feeds. For getting a quality version of Roger in a new style or pose, it would be more efficient to have an actual person illustrate or render the artwork in the traditional way.

As mentioned at the top of the article, this technology is developing rapidly, and it seems like the game is changing every week with new models and new implementations that can make results better. As of the time of releasing this article, we’re already working on a new batch of Rogers using other tools, so look out for a follow up in the near future.

Roger is representative of a software tool that humans can interface with to achieve greater things. Generative AI is a new and potentially very powerful such tool in art, and for our purposes, brand design. Creative and talented people are still needed to guide the process, make decisions, and curate or cleanup the results. So, here’s to humans and robots working together to achieve interesting things! We’ll just have to see where Moz and Roger go with this next.

How to Advance Your SEO Career – Whiteboard Friday

As SEOs, we know how to optimize websites, but what about our careers? In this episode of Whiteboard Friday, Noah shares his insights from his own search marketing career path, with tips for those people in the beginning and middle stages of their career.

infographic outlining tips to advance your SEO career

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. This is incredibly exciting to be with you here today for Whiteboard Friday. Most of these help you with really cool strategic and tactical stuff, and today we’re going to be optimizing something super different. It’s your career. It’s the most important thing in the world.

This talk is going to be especially useful for people in the beginning and middle stages of their career. Perhaps you’ve just started out in your journey in SEO, perhaps you’ve started to take on some more challenging work, perhaps you’re given more areas of responsibility, and you’re trying to understand how do I get to that next step. You’re thinking about, “How do I go from maybe being a specialist to a manager, to a senior manager, or maybe even better, to a director or a vice president in an organization?” For me, this was a really exciting journey. I want to share with you some thoughts that I’ve gathered along the way in the hopes that you, too, can achieve some of the same kinds of outcomes that I have.

The steps are pretty simple. The execution takes time and a lot of dedication and a lot of effort. So if you’re a striver, a trier, someone who just really has a lot of energy and drive, you’re going to get a ton out of this today.

The first thing that I want to say is that if you are just slightly strategic, just slightly strategic, imagine yourself as a shepherd and you’re up on top of a mountain and you have all of these sheep, and all you know is that you need them to go downhill. It’s that level of strategy that’s necessary to get to the next place. You don’t have to know everything. You don’t have to be an expert in all areas of SEO, and we’ll get into that in a second. 

What are we really talking about? We’re talking about craft, and we’re talking about people primarily in this process that we’re going over. For me, a lot of this comes from a blog post that I read by a guy named Jason Roberts. He published this in 2010. It’ll be a link in the show notes. For him, it was called the Surface Area of Luck. This blog post blew my mind when I read it, because it’s all about the doing. It’s about the craft. It’s about the skills. It’s what we learn, it’s what we do, and it’s about the telling. It’s how we communicate to others the qualities and the skills that we bring to the craft every single day.

So what we’re talking about are things like learning how to do great work and how to tell people about it. The better we do and the more skills we learn and the more people who know about it, and it’s not just all of the people, it’s the right people, that will accelerate your career through all those steps that I talked about before, whether it’s from specialist to manager, manager to director, or director to vice president.

So what are we really talking about? We’re talking about getting smarter. We’re talking about when we start in SEO. I think it’s crazy important to start in an agency environment. My theory is that if you’re in an agency, you’re exposed to all different types of problems and all different types of clients and all different types of verticals.

As you get started, part of learning is learning how to think. It’s learning how to make decisions. It’s learning how to look at data. It’s learning the fundamentals of the web, how to build HTML, maybe some cascading style sheets, maybe a little bit of JavaScript, just a little just to know what it is. Learning a lot about different types of SEO with different types of clients, learning the challenges that happen in a local SEO type challenge, learning about the challenges that come into play when you’re learning how to build content, learning how to build stuff around strategy, learning all about e-commerce SEO as well. So getting this wide array of exposure so you can learn all about the different types of problems.

What you’re going to find in that process is you’re going to start getting excited and passionate about something, and you’re going to find yourself talking to people, and you’re going to find yourself following people on social media that are experts in one specific thing. You’re going to watch Whiteboard Fridays about specific topic areas. You might find yourself going super deep into technical SEO. You might find yourself going super deep into e-commerce SEO. You need to listen to that, because that’s your inner self telling you where your passion is.

This is our Moz-shaped superpower. This whole process of learning is really important because it helps you unlock your superpower. Going wide, almost like what Rand talked about all those years ago with the T-shaped marketer, is a lot of what we’re talking about: getting a really nice, wide foundation early in our career so that we can unlock what our superpower is.

What is our superpower? It’s that thing that keeps you up at night. It’s that thing that when you wake up, you think about. It’s that thing you can’t stop talking to your colleagues about, the thing that has you reaching out to other people on social media. Notice, when we start talking about reaching out to people on social media, that’s where we get into the people portion of this process.

So once you find your superpower, you want to get really, really good at it, and you’re going to find yourself up-skilling religiously. You’re going to learn all kinds of new technologies. You’re going to learn processes and workflows to help you get the most out of that superpower.

This is where we get to the next stage, and this is incredibly powerful. As you start to improve and start to go down that path, remember this concept of the sheep. When you start to unlock your superpower, it doesn’t mean that you’re going to end up walking in a straight line. It’s going to be meandering, and it’s going to be in a direction, and you should expect that and be open to change and open to possibilities.

Once we know what our superpower is, we can start to think about our board of advisors. These are the people who are 6 to 24 or 36 months ahead of you. The reason that we’re targeting that very specific range of time and world of experiences is that they’re going to be able to have all of the same recent learnings and recent challenges that you’re going through in the back of their own minds, and they’re going to do everything that they can to help you.

So start to pay attention on social media, places like Twitter, places like LinkedIn, and start following the people who are doing the thing that you’re really passionate about. Start liking their posts, maybe comment, and eventually you’re going to find yourself reaching out to them directly and saying something like this via a direct message: “I love what you’re talking about and the problems that you’re working on. Would you be open to a Zoom? I’m trying my hardest to build my network, and I’d love some time. Thoughts?” Keep it really simple, and value their time. What you’re going to find is that if you’re kind and you’re nice, people will go above and beyond to help you in your career.

So we’re building our board of advisors here. What you’re going to find is that these people care a ton about your outcomes, and they’re going to do whatever they can to introduce you to other people who are going to be able to help you as well.

Now that we’re talking about people, we all know how to be a good person. The core of it is being nice. You’re going to find, and this has played so well for me in my career, that if you’re kind, genuinely caring about your peers and colleagues across our industry, you’re going to find that the industry will open up to you like an oyster.

More things on people. You’re going to find a need to connect with these people who share that same passion, and that’s when we get into this idea of community. This is critically important. When I started in SEO, I had no idea how I fit into anything. I had a 20-year career doing retail, and I also had a lot of experience building websites, but I didn’t have direct SEO experience. I knew I liked automation. That was my sheep-down-the-mountain moment. But I didn’t really see my community. I would advise that you try and find one on Slack, on Discord, or maybe through a YouTube channel that you’re really passionate about. If you don’t see a community, make one. Sounds crazy, right?

Then, after you join this community, start producing content. This is the telling mode, where we tell the world about what we do. Whether it’s a podcast, a YouTube channel, or maybe articles because you don’t feel comfortable yet being in front of your peers, maybe because imposter syndrome is still totally overwhelming at times, you’re going to find that getting in touch with that community will open up more and more doors for you. It’s this concept of the Luck Surface Area: the more we do and the better we get at it and the more people that know about it, the better our world will become.

What you’re going to find is that the next stage, if possible, if you can get yourself to that place, is to consider speaking in public. You’re going to find that that’s incredibly powerful. If it sounds scary, start really, really small: speak at a local meetup, then at a local conference, then try to go a little bigger and see what your comfort level is. If you like it, you’re going to find that there’s no better opportunity for opening doors for you.

So that’s it. It’s the Luck Surface Area. If you embrace this and really dedicate yourself to upskilling, you’re going to develop the types of skills and relationships that make you impossible to ignore in our industry. If you do that in five, six, seven years, you’ll be able to move from being a specialist through management into being a director and hopefully a vice president. I hope you guys got a ton out of this today. I’m super grateful to be here, and y’all take care. Cheers.

Video transcription by Speechpad.com

Shelfies: Why and Where Local Businesses Should Publish Them

If you own or market a local brand, your camera has never been a greater business asset. Early smartphones may have inspired the selfie, and it’s a fantastic idea to photograph the owner and staff of a local business to prove both its authenticity and approachability, but in a commercial context, it’s the “shelfie” that I see coming to the fore as a signal to both customers and search engines of what to expect on your premises.

For the past few years, I’ve strongly encouraged local businesses to photograph their most popular goods and services and add these pictures to their Google Business Profiles, but shelfies are different – instead of snapping a single product, take photos of your shelves and displays to give a sense of the abundance and character of what you vend. One look at this on a local business listing, and any customer would immediately know that this is a great nearby place to head for socks:

Or that this independent grocery store has a deli counter with prepared salads:

Or that this may be a hardware store, but it looks like it has a great selection of kitchen wares:

Why publish local business shelfies?

Potential customers may not bother to read all your local business listing categories, your business descriptions, or your posts, but if they see a great image of what they’re looking for, the connection is instantaneous (in fact, 400% faster than textual learning). And it’s not just people who are learning from your shelfies…it’s Google, too! Local SEO expert Mike Blumenthal mentioned this in a Duct Tape Marketing interview:

“I was listening to a Google webinar for Product Experts… and they really liked what they called shelfies: pictures of the products in your business, on the shelf where Google and the consumer could get a really solid idea of what the place looked like and the range of products you were offering… They’ve created a term for it and they’re clearly focused on it. And I think it’s the kind of photograph you want.”

Google has gotten so good at parsing images that they are now able to match them to perceived query intent. We already know that Google differentiates between images of single products. For example, here’s a search for “engagement rings san francisco”, and do note the images in the local pack:

But when I change my query to “diamond necklaces san francisco”, look at how the photo for the business in the top spot changes. It’s the same company, but a totally different image chosen to match my query:

I have yet to find a live example of Google behaving this way inside the 3-pack for shelfie-type queries, but what we do know from Google’s Cloud Vision API is that they are quite capable of distinguishing between multiple objects in a single image:

Given Mike Blumenthal’s report from the Product Experts webinar and Google’s ever-increasing ability to parse images, I would highly recommend that you do a photo shoot this spring of your most popular shelves of inventory, because I predict that Google will soon treat shelfies the same way that they are already handling single-object images. Additionally, proof that your premises are well-stocked makes simple good sense in 2023, as supply chain shortages continue.
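If you want to probe that multi-object parsing yourself, you can send a shelfie to the Cloud Vision API’s `images:annotate` endpoint with the `OBJECT_LOCALIZATION` feature. Here’s a minimal Python sketch that only builds the request body; the image URL is a placeholder, and the actual POST (with your own API key) is left out:

```python
import json

def build_object_localization_request(image_uri: str, max_results: int = 10) -> dict:
    """Build a Cloud Vision images:annotate request body asking Google
    to localize multiple distinct objects within a single image."""
    return {
        "requests": [
            {
                "image": {"source": {"imageUri": image_uri}},
                "features": [
                    {"type": "OBJECT_LOCALIZATION", "maxResults": max_results}
                ],
            }
        ]
    }

# This payload would be POSTed to
# https://vision.googleapis.com/v1/images:annotate along with an API key.
payload = build_object_localization_request("https://example.com/shelfie.jpg")
print(json.dumps(payload, indent=2))
```

The response lists each detected object with a label, confidence score, and bounding polygon, which is exactly the capability that would let Google match a well-stocked shelf photo to a product query.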

Where to publish your local business shelfies

Here are five places to promote your high-quality shelf pics:

  1. Google Business Profile: Two to three times per month, upload a new shelfie to your main photos set on your listings, as it’s felt a steady drip is more impactful than a flood. You can also upload shelfies to the Products section of your listing via the New Merchant Experience, representing product lines rather than single products with these photos. Finally, use shelfies in your Google Updates (formerly Google Posts) to advertise the breadth, depth, and availability of desirable inventory.

  2. Apple Maps: Moz Local customers should know that you can add up to 100 photos to your dashboard and we’ll distribute your images to Apple Maps. The company’s launch of Apple Business Connect shows that they are getting serious about local, and so should you. ABC also has a new feature called Showcases, which is like Google Updates, and which would also be a good place to microblog about your shelfies. Again, a slow drip is likely the best approach to gradually proving the active status of your listings.

  3. Your other structured citations: Shelfies belong on any local business listing that supports photos. Add them manually, or let Moz Local do the distribution for you.

  4. Your website: Be sure your location landing pages incorporate some shelfies to give potential customers an instant idea of what they’ll find at your different premises.

  5. Your social media profiles: These would be the best places to post up-to-the-minute shelfies of hard-to-find items in short supply, holiday-related offers, and new product lines you’re introducing to the community.

Image credit: VGM8383

When I was young, I remember responding with excitement to the nicely-designed holiday displays at different stores. It was a small thrill to see the valentines in the window of the local stationers, the witch’s kettle of candy corn at the grocer’s in fall, or shiny paper crackers imported all the way from the UK for Christmas. Shelves taught me things about my neighbors — the special Manischewitz matzos and Kedem grape juice at Passover and the red envelopes and paper lanterns at Lunar New Year helped me appreciate the diversity and celebratory nature of my community. In Spain, the beautiful arrangement of products has become such an artform that people regularly photograph and even paint the marketplaces there.

So there’s a small goal you can certainly meet in the months ahead: creating displays that you’re proud to photograph and publicize, and maybe if they’re inspiring enough, customers will opt in, too, and add their own shelfies to your local business listings, and their reviews and social posts about your business. Some things in local search marketing are fun, and it’s nice when they’re no big hassle to do well!

9 Years of the Google Algorithm

If it feels like Google search is changing faster than ever, it’s not your imagination. Google reported an astonishing 4,367 “launches” in 2021, up dramatically from 350-400 in 2009. On average, that’s nearly a dozen changes per day.

Many of these more than 25,000 changes were undoubtedly very small, but some were outright cataclysmic. What can we learn from nine years of Google rankings data, and how can it help us prepare for the future?

Is the algorithm heating up?

Thanks to our MozCast research project, we have daily algorithm flux data going back to 2014. The visualization below shows nine full years of daily Google “temperatures” (with hotter days representing more movement on page one of rankings):

Click on the image for the full-size view!
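MozCast’s exact formula isn’t spelled out here, but the idea of a daily “temperature” can be sketched as a simple rank-flux metric: track a fixed keyword set, compare page-one rankings day over day, and score how far each URL moved. The sketch below is purely illustrative; the keyword set, the drop penalty, and any eventual mapping to °F are my assumptions, not MozCast’s actual methodology:

```python
def serp_flux(day1: dict, day2: dict) -> float:
    """Illustrative rank-flux score. For each tracked keyword, compare the
    page-one URLs between two days: a URL that moves k positions adds k to
    the score, and a URL that drops off page one entirely adds a maximum
    penalty. The result is averaged across keywords."""
    total = 0.0
    for keyword, old_urls in day1.items():
        new_urls = day2.get(keyword, [])
        drop_penalty = len(old_urls) + 1  # worse than the last tracked position
        flux = 0
        for old_pos, url in enumerate(old_urls, start=1):
            new_pos = new_urls.index(url) + 1 if url in new_urls else drop_penalty
            flux += abs(new_pos - old_pos)
        total += flux
    return total / len(day1)

# Identical rankings produce zero flux; a single swap registers movement.
stable = {"toys": ["a.com", "b.com", "c.com"]}
shuffled = {"toys": ["b.com", "a.com", "c.com"]}
print(serp_flux(stable, stable))    # 0.0
print(serp_flux(stable, shuffled))  # 2.0
```

Averaged over thousands of keywords, a metric like this turns daily SERP churn into a single comparable number, which is all a "temperature" really is.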

While 2022 was certainly hotter than 2014, the pattern of rising temperatures over time is much more complicated than that. In some cases, we can’t map temperatures directly to algorithm updates, and in others, there are known causes outside of Google’s control.

Take, for example, the WHO declaration of the global COVID-19 pandemic on March 11, 2020 (#8 in the labeled events). COVID-19 changed consumer behavior dramatically in the following months, including huge shifts in e-commerce as brick-and-mortar businesses shut down. While it’s likely that Google launched algorithm updates to respond to these changes, COVID-19 itself reshaped global behavior and search rankings along with it.

The summer of 2017 is an entirely different story, with unexplained algorithm flux that lasted for months. Truthfully, we still don’t really know what happened. One possibility is that Google’s Mobile-first index update caused large shifts in rankings as it was being tested in the year or more preceding the official launch, but at this point we can only speculate.

What were the hottest days?

While some of the hottest confirmed days on Google from 2014-2022 were named updates, such as the “core” updates, there are a few interesting exceptions …

The hottest day on record in MozCast was a major outage in August of 2022 that measured a whopping 124°F. While this corresponded with an electrical fire in an Iowa-based Google data center, Google officially said that the two events were unrelated.

The 8th and 10th hottest confirmed days over these nine years were serious bugs in the Google index that resulted in pages being dropped from the search results. Our analysis in April of 2019 measured about 4% of pages in our data set disappearing from search. Thankfully, these events were short-lived, but it goes to show that not all changes are meaningful or actionable.

The largest search penalty on record was the “Intrusive Interstitial Penalty” in January of 2017, which punished sites with aggressive popups and overlays that disrupted the user experience.

What was the biggest update?

If we’re talking about named updates, the highest-temperature (i.e. largest impact) was actually a penalty-reversal, phase 2 of the Penguin 4.0 update in October of 2016. Phase 2 removed all previous Penguin penalties, an unprecedented (and, so far, unrepeated) move on Google’s part, and a seismic algorithmic event.

Keep in mind, this was just the impact of undoing Penguin. If we factor in the 7+ major, named Penguin updates (and possibly dozens of smaller updates and data refreshes), then Penguin is the clear winner among the thousands of changes from 2014-2022.

What’s in store for the future?

Ultimately, Google’s “weather” isn’t a natural phenomenon — it’s driven by human choices, and, occasionally, human mistakes. While we can’t predict future changes, we can try to learn from the patterns of the past and read between the lines of Google’s messaging.

As machine learning drives more of Google search (and Bing’s recent launch of ChatGPT capabilities will only accelerate this), the signals from Google will likely become less and less clear, but the themes of the next few years will probably be familiar.

Google wants content that is valuable for searchers, and reflects expertise, authority, and trust. Google wants that content delivered on sites that are fast, secure, and mobile-friendly. Google doesn’t want you to build sites purely for SEO or to clutter their (expensive) index with junk.

How any of that is measured or codified into the algorithm is a much more complicated story, and it naturally evolves as the internet evolves. The last nine years can teach us about the future and Google’s priorities, but there will no doubt be surprises. The only guarantee is that — as long as people need to find information, people, places and things, both search engines and search engine optimization will continue to exist.

For a full list of major algorithm updates back to 2003’s “Boston” update, check out our Google algorithm update history. For daily data on Google rankings flux and SERP feature trends, visit our MozCast SERP tracking project.

Helping Google Navigate Your Site More Efficiently — Whiteboard Friday

This week, Shawn talks you through the ways your site structure, your sitemaps, and Google Search Console work together to help Google crawl your site, and what you can do to improve Googlebot’s efficiency.

infographic outlining tips to help Googlebot crawl your website

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to this week’s edition of Whiteboard Friday, and I’m your host, SEO Shawn. This week I’m going to talk about how do you help Google crawl your website more efficiently.

Site structure, sitemaps, & GSC

Now I’ll start at a high level. I want to talk about your site structure, your sitemaps, and Google Search Console, why they’re important and how they’re all related together.

So site structure, let’s think of a spider. As he builds his web, he makes sure to connect every string efficiently together so that he can get across to anywhere he needs to get to, to catch his prey. Well, your website needs to work in that similar fashion. You need to make sure you have a really solid structure, with interlinking between all your pages, categories, and things of that sort, to make sure that Google can easily get across your site and do it efficiently, without too many disruptions or blockers that would make them stop crawling your site.

Your sitemaps are kind of a shopping list or a to-do list, if you will, of the URLs you want to make sure that Google is crawling whenever they see your site. Now Google isn’t always going to crawl those URLs, but at least you want to make sure that they see that they’re there, and that’s the best way to do that.

GSC and properties

Then Google Search Console, anybody that creates a website should always connect a property to their website so they can see all the information that Google is willing to share with you about your site and how it’s performing.

So let’s take a quick deep dive into Search Console and properties. As I mentioned previously, you always should be creating that initial property for your site. There’s a wealth of information you get out of that. Of course, natively, in the Search Console UI, there are some limitations: it’s 1,000 rows of data they’re able to give to you. Sure, you can definitely do some filtering, regex, good stuff like that to slice and dice, but you’re still limited to those 1,000 URLs in the native UI.

So something I have actually been doing for the last decade or so is creating properties at a directory level to get that same amount of information, but to a specific directory. Some good stuff that I have been able to do with that is connect to Looker Studio and be able to create great graphs and reports, filters of those directories. To me, it’s a lot easier to do it that way. Of course, you could probably do it with just a single property, but this just gets us more information at a directory level, like example.com/toys.
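Whether or not you segment by directory properties, one way around the 1,000-row UI cap is the Search Analytics API, which returns up to 25,000 rows per request and pages through larger sets via `startRow`. Here’s a minimal sketch of the request body; the dates and dimensions are just examples, and you’d POST it to the `searchanalytics.query` endpoint of your verified property:

```python
def search_analytics_query(start_date: str, end_date: str,
                           dimensions=("page", "query"),
                           row_limit: int = 25000,
                           start_row: int = 0) -> dict:
    """Request body for the Search Console searchanalytics.query endpoint,
    which returns up to 25,000 rows per call (vs. 1,000 in the UI).
    Page through larger result sets by advancing startRow."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": list(dimensions),
        "rowLimit": row_limit,
        "startRow": start_row,
    }

# POST to:
# https://www.googleapis.com/webmasters/v3/sites/{siteUrl}/searchAnalytics/query
body = search_analytics_query("2023-01-01", "2023-01-31")
print(body["rowLimit"])  # 25000
```

Feeding these exports into Looker Studio (or any BI tool) gives you the same directory-level slicing described above, without the native row limit.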

Sitemaps

Next I want to dive into our sitemaps. So as you know, it’s a laundry list of URLs you want Google to see. Typically you throw 50,000, if your site is that big, into a sitemap, drop it at the root, put it in robots.txt, go ahead and throw it in Search Console, and Google will tell you that they’ve successfully accepted it, crawled it, and then you can see the page indexation report and what they’re giving you about that sitemap. But a problem that I’ve been having lately, especially at the site that I’m working at now with millions of URLs, is that Google doesn’t always accept that sitemap, at least not right away. Sometimes it’s taken a couple weeks for Google to even say, “Hey, all right, we’ll accept this sitemap,” and even longer to get any useful data out of that.

So to help get past that issue that I’ve been having, I now break my sitemaps into 10,000 URL pieces. It’s a lot more sitemaps, but that’s what your sitemap index is for. It helps Google collect all that information bundled up nicely, and they get to it. The trade-off is Google accepts those sitemaps immediately, and within a day I’m getting useful information.

Now I like to go even further than that, and I break up my sitemaps by directory. So each sitemap or sitemap index is of the URLs in that directory, if it’s over 50,000 URLs. That’s extremely helpful because now, when you combine that with your property at that toys directory, like we have here in our example, I’m able to see just the indexation status for those URLs by themselves. I’m no longer forced to use that root property that has a hodgepodge of data for all your URLs. Extremely helpful, especially if I’m launching a new product line and I want to make sure that Google is indexing and giving me the data for that new toy line that I have.
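The two steps above, splitting URLs by directory and then chunking each directory into 10,000-URL sitemaps, can be sketched in a few lines of Python. This is an illustrative generator, not a production tool; the file-naming scheme and the example.com URLs are my assumptions:

```python
from collections import defaultdict
from urllib.parse import urlparse

def build_sitemaps(urls, chunk_size=10_000):
    """Group URLs by their first path segment (e.g. /toys/), then split
    each group into chunk_size pieces, returning a sitemap XML string per
    chunk. A sitemap index file would then simply list these filenames."""
    by_dir = defaultdict(list)
    for url in urls:
        segments = urlparse(url).path.strip("/").split("/")
        by_dir[segments[0] if segments[0] else "root"].append(url)

    sitemaps = {}  # filename -> XML string
    for directory, dir_urls in by_dir.items():
        for i in range(0, len(dir_urls), chunk_size):
            chunk = dir_urls[i:i + chunk_size]
            body = "\n".join(f"  <url><loc>{u}</loc></url>" for u in chunk)
            sitemaps[f"sitemap-{directory}-{i // chunk_size + 1}.xml"] = (
                '<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{body}\n</urlset>"
            )
    return sitemaps

urls = [f"https://example.com/toys/item-{n}" for n in range(25_000)]
files = build_sitemaps(urls)
print(sorted(files))  # ['sitemap-toys-1.xml', 'sitemap-toys-2.xml', 'sitemap-toys-3.xml']
```

Because each file maps to one directory, the resulting sitemaps line up neatly with directory-level Search Console properties like example.com/toys.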

I think it’s always good practice to make sure you ping your sitemaps. Google has an API, so you can definitely automate that process, and it’s super helpful. Every time there’s any kind of change to your content, like adding or removing URLs, you just want to ping Google and let them know that your sitemap has changed.
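At the time this was recorded, the ping was a simple GET request to Google’s sitemap ping endpoint, which is easy to automate after each sitemap regeneration. A sketch (note that Google has since announced deprecation of this ping endpoint, so treat it as illustrative; submitting via Search Console remains the durable path):

```python
from urllib.parse import quote

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the GET URL used to notify Google that a sitemap changed.
    You'd fire this (e.g. with urllib.request or requests) from the same
    job that regenerates the sitemap files."""
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

print(sitemap_ping_url("https://example.com/sitemap-toys-1.xml"))
```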

All the data

So now we’ve done all this great stuff. What do we get out of that? Well, you get tons of data, and I mean a ton of data. It’s super useful, as mentioned, when you’re trying to launch a new product line or diagnose why there’s something wrong with your site. Again, we do have a 1,000 limit per property. But when you create multiple properties, you get even more data, specific to those properties, that you could export and get all the valuable information from.

Even cooler, Google recently rolled out their Inspection API. Super helpful, because now you can actually run a script, see what the status of those URLs is, and hopefully get some good information out of that. But again, true to Google’s nature, we have a limit of 2,000 calls on the API per day per property. However, that’s per property. So if you have a lot of properties, and you can have up to 50 Search Console properties per account, now you could roll 100,000 URLs into that script and get the data for a lot more URLs per day. What’s super awesome is Screaming Frog has made some great changes to the tool that we all love and use every day, to where you can not only connect that API, but also share that limit across all your properties. So now grab those 100,000 URLs, slap them in Screaming Frog, drink some coffee, kick back, and wait till the data pours out. Super helpful, super amazing. It makes my job insanely easier now. I’m able to go through and see: is it a Google thing, discovered or crawled and not indexed? Or are there issues with my site that explain why my URLs are not showing in Google?
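The quota arithmetic behind that (2,000 calls per property, up to 50 properties) is easy to sketch. The helper below is hypothetical, not Google's client library — the property names are invented, and the actual inspection requests would still go through the URL Inspection API client; this just spreads a URL list across properties so no property exceeds its daily quota:

```python
DAILY_QUOTA = 2_000  # URL Inspection API calls per property per day

def assign_batches(urls, properties, quota=DAILY_QUOTA):
    """Spread a URL list across Search Console properties so that no
    property is asked for more than its daily quota of inspections."""
    capacity = len(properties) * quota
    if len(urls) > capacity:
        raise ValueError(f"{len(urls)} URLs exceed daily capacity of {capacity}")
    assignments = {prop: [] for prop in properties}
    for i, url in enumerate(urls):
        # Fill each property up to its quota before moving to the next.
        assignments[properties[i // quota]].append(url)
    return assignments

# Hypothetical setup: 5,000 URLs spread over three properties.
props = ["sc-domain:example.com",
         "https://example.com/toys/",
         "https://example.com/games/"]
batches = assign_batches([f"https://example.com/p/{i}" for i in range(5_000)], props)
```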

Bonus: Page experience report

As an added bonus, you have the Page Experience report in Search Console, which covers Core Web Vitals, mobile usability, and some other data points that you can get broken down at the directory level. That makes it a lot easier to diagnose and see what’s going on with your site.

Hopefully you found this to be a useful Whiteboard Friday. I know these tactics have definitely helped me throughout my career in SEO, and hopefully they’ll help you too. Until next time, let’s keep crawling.

Video transcription by Speechpad.com

20 Google Analytics Alternatives

The adage is that if you’re not paying for the service, you are the product. Unfortunately, this rings especially true in the analytics world.

The analytics space is changing, though, and there are many alternatives — both free and paid — that take into account privacy, cookie-less tracking, GDPR compliance, core web vitals, and more. The current leader in the analytics space is Universal Analytics (UA), which has been tracking web data since 2005. But Google is going to stop tracking any new data in UA as of July 1, 2023 (Happy Canada Day?).

This is a move to get users to migrate to Google Analytics 4 (GA4), which is a whole new way of tracking and navigating. In a move from sessions to an event model, GA4 will require some knowledge and familiarity to get up and running, as there’s quite a bit changing.

If you haven’t set up GA4 yet, or are on the fence, now is the time to take a look at what else is out there and how the landscape has changed. We’ve broken the alternatives up into three categories:

  • Web analytics

  • Product analytics

  • Data warehousing solutions

How legal is Google Analytics?

The short answer is: it depends on where your visitors are from.

Without getting too deep into legal jargon: users now have more control over their options, and although there are a few lawsuits out there, I’m sure Google will eventually make the proper adjustments so that everyone using it is mostly covered.

That hasn’t happened yet, though, and there are requirements under GDPR that Google Analytics currently fails to meet. Also, with a product that’s integrated into so many other products, I doubt it will ever see 100% coverage.

Even if you’re planning on installing Google Analytics 4, you can run an alternative alongside it to ensure that you’re getting the right data, and test as you go.

Web Analytics

More than visitor counters, though often simpler, web analytics providers focus on serving small businesses, bloggers, and small websites. Their metrics are also the closest to Google Analytics’. Most of the web analytics tools below are easy to maintain, quick to install, and don’t need self-hosting. However, some offer self-hosting in addition to their regular services.

Fathom Analytics

Launched in 2018, Fathom Analytics is a cookie-less, privacy-focused Google Analytics alternative that’s really simple to use. They care about user privacy and pioneered a proprietary routing system, which they call “EU isolation”, to keep EU visitors’ data compliant with EU law. Fathom Analytics is also GDPR, PECR, COPPA, CCPA, and ePrivacy compliant.

Features of Fathom Analytics include:

  • Seven-day free trial

  • Prices start at $14 per month for up to 100,000 page views

  • Can include up to 50 sites

  • Shows live visitors and where they’re navigating

  • Offers uptime monitoring, event filtering, email reports, ad blocker bypassing, live visitor information, and much more from a bootstrapped company

Beyond its privacy focus, Fathom Analytics has a beautiful interface and offers unlimited CSV exports of your data, available at any time. This allows you to conveniently connect it to other data sources. In addition, Fathom is expected to release a backup option for GA data in the near future.

DEMO FATHOM ANALYTICS

Matomo

Previously named Piwik, Matomo (“keep your data in your own hands”) is an open-source, cookie-free, GDPR-compliant analytics tool offering a search engine and keywords section where you can connect your Google Search Console data.

Additional Matomo features include:

  • Customizable dashboard.

  • Real-time data insights (pages visited, visitor summary, conversions, etc).

  • E-commerce analytics.

  • Event tracking (analyzes user interaction on apps or websites).

  • Measures CTR, clicks, and impressions for text and image banners as well as other page elements.

  • Visitor geolocation (stats can be viewed on maps by city, region, or country).

  • Page and site speed reports.

  • Page performance (number of views, bounce rates).

You can host Matomo on your own servers for free. Otherwise, their cloud pricing starts at $29/month after a 21-day free trial for 50,000 hits (up to 30 websites).

DEMO MATOMO

Cloudflare

If you’re already using Cloudflare’s CDN solution, then there’s no setup required. You can simply authorize the web analytics to start tracking inside your account. You can add up to 10 websites for free, and there’s no need for a cookie banner, since they do not collect personal data.

In addition to tracking standard metrics — page views, visits, load times, bandwidth, etc. — Cloudflare has incorporated Core Web Vitals metrics. These are each measured across your tracked websites, and Cloudflare can email you weekly updates — very handy.

Screenshot of the CWV dashboard.

Cloudflare’s installation is light on page load, since you can proxy it through your Cloudflare setup. Alternatively, you can install their lightweight JavaScript code (or “beacon”, as they call it).

Although Cloudflare doesn’t claim GDPR compliance, they do say that they are considered an “Operator of Essential Services” under the EU Directive on Security of Network and Information Systems. You can assume that Cloudflare is tracking something along the way, given the free price tag and the other offerings available where data could be shared.

Cloudflare is the closest thing to server analytics without going directly to your server. It’s only available in specific areas, so you might have to pay for it.

DEMO CLOUDFLARE

Adobe Analytics

Adobe Analytics is a popular analysis platform that provides tools essential for collecting relevant data concerning customer experience. Company analysts and online marketers frequently depend on it for improving customer satisfaction.

Applying Adobe Analytics to your business website can help you determine what leads to conversions. For example, did changing content or the CTA increase conversions? Did adding more visual aids increase the number of inquiries about a certain service or product?

Adobe Analytics is marketed as an enterprise analytics solution with audience insights, advertising analytics, cohort analysis, customer journey analysis, remarketing triggers, and much more. It could truly fit your web analytics or product analytics needs depending on how your team deploys it, and, like Google Analytics, it has been around for quite a while.

REQUEST A DEMO FOR ADOBE ANALYTICS

Clicky

Privacy-friendly web analytics tool Clicky is easy to navigate and offers metrics similar to Google Analytics. However, Clicky embodies the feel of server analytics tools while making their interface simple and fast-loading, thanks to minimal graphics.

Features of Clicky include:

  • GDPR compliance

  • Bot detection and blocking

  • Heatmaps

  • Uptime monitoring

  • Backlink analysis

Clicky also provides a developer API and white labeling of their solution, where you can create your own theme for better brand integration starting at $49/month as part of their hosted service. They also offer a free tier if you’re looking to try it out with limited features.

DEMO CLICKY

Simple Analytics

EU-hosted, privacy-based Simple Analytics offers tools to use for checking your websites daily. Simple Analytics does not collect cookies, IP addresses, or any unique identifiers. Their package provides a bypass of ad blockers, hides referral spam, and includes an iOS app. You can even embed a widget to get public web statistics.

This GA alternative gathers some nifty events by default for ease of use, including email clicks, outbound link clicks, and file downloads, to streamline tracking. All of Simple Analytics’ features come together in a simple dashboard UI with quick “in and out” times — much quicker, in fact, than it would take to get to the correct property in Google Analytics.

Simple Analytics offers a free 14-day trial. If you decide to keep Simple Analytics, you’ll pay $19/month, or $9/month if you pay a year in advance.

DEMO SIMPLE ANALYTICS

Pirsch

An open-source, cookie-free web analytics alternative to Google Analytics, Pirsch integrates seamlessly into websites, and into WordPress with plugins. A developer-friendly analytics tool offering a flexible, impressive API, server-side integrations, and SDKs, Pirsch provides snippets for Golang (Go), PHP, and JavaScript, as well as community-provided code to embed. Pirsch also works with Google Search Console.

You can perform any function you want from the Pirsch dashboard, like viewing statistics or adding websites.

Get started by viewing Pirsch’s live demo or opting in on their 30-day free trial. Paying for Pirsch is only $5 per month if you choose annual billing. If you want to make monthly payments, the cost increases to $6 per month.

DEMO PIRSCH

Plausible

A privacy-friendly, open-source web analytics platform, Plausible is cookie-free and fully compliant with PECR, CCPA, and GDPR. Plausible is a popular alternative to Google Analytics because of its simplicity, lightweight script, and ability to reduce bounce rates by expediting site loading.

You can easily segment data into specific metrics, analyze dark traffic via Urchin Tracking Module (UTM) parameters, and track how many outbound link clicks you get.

Plausible offers a 30-day free trial option that provides unlimited usage of its features without requiring a credit card. Once your free trial has ended, you can pay $9 per month for up to 10,000 page views. If you choose yearly billing, you get two months of free use. Unlimited data retention, slack/email reports, and event customization are also provided with a paid subscription to Plausible.

DEMO PLAUSIBLE

Umami

Umami is GDPR-compliant, open-source, cookie-free, and does not track users or gather personal data across websites. By anonymizing any information collected, Umami prevents the identification of individual users. In addition, you won’t need to worry about staying compliant with constantly changing privacy laws.

Features of Umami include:

  • Mobile-friendly so you can see stats on your phone at any time.

  • Since you host Umami under your own domain, it isn’t blocked by ad blockers the way Google Analytics is.

  • Public sharing of your stats is available using an exclusively generated URL.

  • Umami’s lightweight tracking script loads almost instantly and won’t slow down the loading of your website.

  • A single installation of Umami enables tracking of an unlimited number of sites. You can also track individual URLs and subdomains.

Since Umami is an open-source platform and self-hosted, it’s free to use. Umami says a cloud-based version of its analytics is coming soon.

SIGN UP FOR UMAMI

Product (behavior) analytics

Although product analytics are not always GDPR-compliant, they provide powerful analytic tools that include valuable elements like segmentation and deeper levels of customization. For that reason and others, product analytics come with a higher maintenance cost. Here are several product analytics tools that are excellent Google Analytics alternatives for tracking customer behavior.

Hubspot Analytics

The traffic analytics tool provided by Hubspot Analytics offers exceptional analysis data for breaking down page views, new contacts, and even entrance/exit information regarding how long visitors remained on specific web pages. You can also install a tracking code to external sites so you can track traffic stats.

Additional features include:

  • Bounce rate percentages.

  • Number of call-to-action views/number of CTA clicks.

  • Conversion rates of visitors who click on your CTA.

  • Access to specific URLs or country stats.

A subscription to a starter Hubspot Analytics platform is $45 per month. You can choose monthly or annual billing, which is $540.

A professional subscription to Hubspot Analytics starts at $800 per month ($9,600 for one year). This subscription gives you multi-language content, video management and hosting, phone support, and A/B testing.

An enterprise subscription starts at $3,600 per month ($43,200 per year). You get 10,000 marketing contacts with this subscription, with additional contacts available for sale in increments.

FREE HUBSPOT ANALYTICS TRIAL

Kissmetrics

Kissmetrics offers analytics for e-commerce and SaaS websites. Kissmetrics for SaaS gives you deep insights into the type of content and features that are fueling conversions, turning trials into paying customers, and reducing churn.

Kissmetrics detects characteristics that drive conversions and retain regular buyers, so you can adjust site elements appropriately. This unique analytics tool also streamlines checkout funnels and integrates with Shopify.

Pricing for Kissmetrics SaaS involves three tiers:

Heap

Heap Analytics works on mobile and PC devices, quickly captures nearly all behavioral parameters, and supports first- or third-party installation. You can also create individual identities for users across numerous sessions and attach augmented product information to purchase/sales events.

Heap features include:

  • Customer journeys with session replays

  • Campaign Management

  • Query Builder

  • Behavior Tracking

  • Campaign Segmentation

  • Funnel Analysis

  • Dashboard Creation

  • Key Performance Indicators

  • Web Traffic Reporting

  • And much more

FREE TRIAL

FullStory

The makers of FullStory state that if you know how to copy and paste, you’ll have no trouble setting it up. This analytics platform also offers “private-by-default” capabilities that ensure masking of text elements at their source.

Their features include, but are not limited to:

  • Tagless web autocapture

  • Funnels & conversions

  • Journey mapping

  • User segmentation

  • Heatmaps

  • Session replay

  • A/B testing, and much more

FREE TRIAL

Mixpanel

Mixpanel offers a free subscription that gives access to core reports, unlimited data history, and EU or U.S. data residency. Mixpanel can be sliced and diced to fit many situations; it’s flexible and moldable for many businesses at many levels.

Screenshot of the dashboard.

For $25 per month under their growth plan, you get all the free features, plus data modeling, group analytics, and reports detailing causal inference. Mixpanel also includes features such as:

  • Segmenting users based on actions

  • Integrate with Slack to share reports, even if they don’t use Mixpanel

  • No limits on the amount of events tracked

  • Team Dashboards with Alerts

  • Identify top user paths and drop-off points

  • Understanding of conversion points across the funnel

  • And much much more.

It’s used by 30% of Fortune 100 SaaS companies. If you need custom pricing and plans:

CONTACT MIXPANEL SALES

Amplitude Analytics

According to a report available here, Amplitude is consistently ranked #1 among product analytics platforms for top software products and customer satisfaction. You can analyze collected data with fast self-serve analytics that do not require SQL.

You can use Amplitude to…

  • Explore behavioral data

  • Measure customer engagement

  • Use product intelligence to build faster

  • Understand channel performance

  • Answer questions across product, marketing, engineering, and analytics

Amplitude’s Starter package is free and offers unlimited data destinations, users, and data sources. You also get 10 million events (streamed or unstreamed) per month.

Their paid plans build on that with customizations, including advanced behavioral analytics and custom event values.

CONTACT SALES FOR GROWTH OR ENTERPRISE PACKAGES, or if you’re a startup, apply for a free year through the Amplitude Scholarship!

Segment

Segment, which was acquired by Twilio in 2020, offers superior email onboarding and intuitive insights into SMS campaign and email performance, and that’s just one arm of what it does. Segment uses a single API to collect data across all platforms, and has SDKs for Android, iOS, JavaScript, and over 20 server-side languages. It has been described as a technology startup that lets organizations pull customer data from one app into another, and as of this writing it has over 300 integrations!

Segment has customers across industries, including media, medical, B2B, and retail, and from startups to enterprises. You can also upload your customer data into their data warehouse to keep it there, so Segment could fit into the data warehousing category below, too.

To get started using Segment, simply make a free account, which allows for 1,000 visitors from two sources and access to their 300 integrations. Paid tiers, where you can sign up for a team account, start at $120/month.

Rudderstack

Rudderstack gives developers everything they need to get started immediately with product/behavior analytics. Features of Rudderstack include identifying anonymous mobile and web users, customizing destinations by applying real-time modifications to event payloads, and automatically populating warehouses with event and user record schemas.

The free version of Rudderstack gives you five million events per month, over 16 SDK sources, and more than 180 cloud destinations. Like Segment, you can use Rudderstack as a data warehouse for your customer data; they’ve built their platform warehouse-first, and for developers.

The Pro version starts at $500 per month. You get the free features, plus email support, a technical account manager, and even custom integrations in the higher tiers. Request pricing for the Enterprise version of Rudderstack here.

Data warehousing solutions

It’s always good to have your data backed up, and if you have a lot of history in Universal Analytics, that’s a good place to start. There are even services, such as Funnel, that connect all these solutions together seamlessly. While these have additional costs, you can save all your data for many years without a problem and refer back to it whenever you need to.

Google BigQuery / Google Cloud

BigQuery offers an easy connection to your GA4 data if you’re leaning that way. Google lists BigQuery as an enterprise product, and there are limits to how much data you can send to BigQuery before paying. They offer real-time analytics with built-in query acceleration, and given the scale of data that Google handles, we can be pretty sure they’ll be able to handle yours. Google Cloud Storage offers standard, nearline, coldline, and archive storage options.

One suggestion: if you’re using Google for backups, make sure you have a secondary account attached in case your primary account loses access, which I’ve seen happen from time to time. They provide migrations from some tools, and their pricing is based on data volume, so it’s free up to 10GB. They’ve created a billing calculator for easy cost analysis once you get into the paid tier.

Amazon – AWS

AWS states that they support more compliance certifications and security measures than all other cloud providers. They allow for backup of all data types and have redundancy built in that you would expect at the Amazon level.

Like Google, they have a calculator to estimate a pricing model for your team, since they have a lot of other integrated services that you can take advantage of.

Snowflake

Snowflake provides a data cloud and isn’t Google or Amazon, which may appeal to some. It supports data science, data lakes, and data warehousing on the three top clouds with their fully automated solution.

They’re HIPAA, PCI DSS, SOC 1, and SOC 2 Type 2 compliant, as well as FedRAMP Authorized. All plans come with a 30-day free trial you can check out, plus a “pay for usage” option and a “pay for usage upfront” option.

Recap and recommendations

On July 1, 2023, Universal Analytics will stop tracking any new data, and then by EOY 2023 all UA data will be removed by Google — so back up your data! Think about warehousing your data long term for your Universal Analytics and moving forward.

Then, make a plan for how you’re going to be tracking moving forward. Talk options and thoughts with stakeholders.

Install GA4 now (or yesterday!), or try out the options above, as none of these are created equal and many have free trials.

How to Use Estimated Brand Reach as a Meaningful Marketing Metric

Estimated brand reach is the most important high-level metric that everyone seems to either interpret incorrectly, or ignore altogether.

Why? Because it’s a tough nut to crack.

By definition, brand reach is a headcount of unique “individuals” who encounter your brand, and you cannot de-anonymize all the people on every one of your web channels. Simply put, two “sessions” or “users” in your analytics could really be from one person, and there’s just no way you could know.

Nevertheless, you can and most definitely should estimate your brand reach. And you should, and most definitely can, use that data in a meaningful way.

For instance, it’s how we confirmed that:

  • It was time to abandon an entire paid channel in favor of a different one.

  • There’s a near-perfect correlation between our engaged reach and our lead generation.

And that’s just the tip of the iceberg. Let’s dive in.

What is reach?

Reach counts the number of actual people who come in contact with a particular campaign. For example, if 1,500 people see a post on Instagram, your reach is 1,500. (Warning: Take any tool claiming to give you a “reach” number with a grain of salt. As we covered earlier, it’s really hard to count unique individuals on the web).

Impressions, on the other hand, are a count of views. One person can see an Instagram post multiple times. A post with a reach of 1,500 can easily have as many as 3,000 impressions if every one of those people sees it twice.

Brand reach takes this a step further by tracking all the individual people who have encountered any and all of your company’s campaigns across all of your channels, in a given time period.

If you’re tracking brand reach correctly, every single person gets counted only once, and as far as we know, that’s impossible.

Google Search Console, for instance, will show you exactly how many impressions your website has achieved on Google Search over a period of time. But it won’t count unique individuals over that period. Someone could easily search two different keywords that your site is ranking for and encounter your brand twice on Google. There is no way to tie those multiple sessions back to one individual user.

It would be even harder to track that individual across all of your channels. How, for instance, would you make sure that someone who found you on social, and then again on search, isn’t counted twice?

The short answer is that you can’t.

However, you can estimate brand reach, and it’s work worth doing. It will a) help you tie meaningful metrics to your overall brand awareness efforts, and b) give you an immense amount of insight into how that high-level brand awareness affects your deeper-funnel outcomes — something that is sorely missing in most marketing programs.

Using impressions as a stand-in for pure reach

We’ve accepted that we can’t count the number of users who encounter our brand. But we are confident in our ability to count total impressions, and crucially, we’ve deduced that there’s a strong relationship between impressions and reach.

Common sense tells us that, if you see changes in your brand’s total impressions, there are likely changes to your reach as well.

We tested this premise using one of the only channels where we can actually count pure reach vs impressions: our email marketing program.

In email marketing:

  • Reach = the number of people who receive at least one email from us each month.

  • Impressions = the total number of emails delivered to all the people in our database each month.

And, as we suspected, there is a near-perfect correlation between the two: 0.94.

Interestingly, there is also a near-perfect correlation of 0.87 between email impressions and email engagement (someone clicking on that email).

Admittedly, email is a very controlled channel relative to, say, search or social media.

So, I went one step further and looked at how our “impressions” in Google Search Console aligned with Google Analytics’ count of “New Users” over the course of one year (which we’ll use as a stand-in for pure reach, since it only counts users once in a given timeframe):

The Pearson Correlation Coefficient for impressions’ relationship to GA’s New Users is 0.69, which is very strong! In other words, more impressions typically means more unique users, (AKA, reach).

Meanwhile, the relationship between GA’s New Users and GSC clicks is an astonishing 0.992, which is just 0.008 off from a perfect correlation.

People much smarter than I have pointed out time and time again that GA’s user data must be taken with a grain of salt, for reasons I won’t get into here. Still, the point is that there’s ample evidence to suggest an extremely tight relationship between reach and impressions.

TL;DR: If impressions change negatively or positively, there is very likely to be a corresponding change in reach, and vice versa.
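If you want to check this relationship on your own channels, the Pearson coefficient takes only a few lines of code. A sketch with invented monthly figures (the numbers below are for illustration, not our real data):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented monthly figures: GSC impressions vs. GA New Users.
impressions = [120_000, 135_000, 150_000, 142_000, 160_000, 175_000]
new_users = [3_100, 3_400, 3_900, 3_600, 4_100, 4_500]
r = pearson(impressions, new_users)
```

A value near 1.0 means the two series move together, which is exactly the impressions-to-reach relationship described above.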

What we ended up with

Taking all of this knowledge into account, we started tracking impressions of every single channel (except email, where we can actually use pure reach) to help determine our estimated brand reach. The outcome? This graph of our brand reach as it changes over time:

It’s extremely rewarding to have this type of number for your brand, even if it is an estimate.

But the greatest value here is not in the actual number; it’s in how that number changes from month to month, and more importantly, why it changes (more on this later in this post).

How to track estimated reach

The chart above displays our brand’s estimated reach across all our known marketing channels. Acquiring the data is as simple as going into each of these channels’ analytics properties once a month, and pulling out the impressions for the prior month.

Let’s go through the steps.

1. Have a spreadsheet where you can log everything. Here’s a template you can use. Feel free to update the info in the leftmost columns according to your channels. Columns G through L will populate automatically based on the data you add to columns C through F. We recommend using this layout, and tracking the data monthly, as it will make it easier for you to create pivot tables to help with your analysis.

2. Access your impression data. Every marketing mix is different, but here’s how we would access impression data for the channels we rely on:

  • Organic search: Pull impressions for the month from Google Search Console.

  • Email marketing: Total number of unique contacts who have successfully received at least one email from you in the current month (this is one of the few channels where we use pure reach, as opposed to impressions).

  • Social media: Impressions pulled from Sprout, or from the native social media analytics platforms. Do the same for paid impressions.

  • Google Ads/Adroll/other ad platform: Impressions pulled from the ad-management platform of your choosing.

  • Website referrals: The sum of estimated page traffic from our backlinks each month. We use Ahrefs for this. The idea is that any backlink is a potential opportunity for someone to engage with our brand. Ahrefs estimates the traffic of each referring page. We can export this, and add it all up in a sheet, to get an estimate of the impressions we’re making on other websites.

  • YouTube: Impressions from Youtube Analytics.

Most of the above is self-explanatory, with a few exceptions.

First, there’s email. We use pure reach as opposed to impressions for two reasons:

  1. Because we can.

  2. Because using impressions for email would vastly inflate our estimated reach number. In any given month, we send 3 million or more email messages, but only reach around 400,000 people. Email, by its nature, entails regularly messaging the same group of people. Social media, while similar (your followers are your main audience), has a much smaller reach (we are under 30,000 each month).

We deliver many more emails (impressions) every month than there are unique recipients (reach).

Second is referral traffic. This is traffic that comes to your site from other sites, but note that it excludes email, search-engine, and social media traffic, which are accounted for separately.

The referral source, more than any other channel, is a rough estimate. It only looks at the estimated organic page traffic, so it leaves out a large potential source of traffic in the form of other distribution channels (social, email, etc.) that website publishers may be using to promote a page.

But again, reach is most valuable as a relative metric — i.e., how it changes month to month — not as an absolute number.

To get the desired timeframe of one full month on Ahrefs, select “All” (so you’re actually seeing all current live links) and then show history for “last 3 months” like so:

This is because Ahrefs, sadly, doesn’t let you provide custom dates on its backlink tool. My way of doing this adds a few steps, but they’re fairly intuitive once you get the hang of them (plus I made a video to help you).

Start by exporting the data into a spreadsheet. Next, filter out backlinks in your sheet that were first seen after the last day of the month you’re analyzing, or last seen before the first day of that month. Finally, add up all the Page Views, and that will be your total “impressions” from referral traffic.
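The filter-and-sum steps can also be scripted against the exported CSV. The column headers below ("First seen", "Last seen", "Page traffic") are assumptions for illustration; check them against your actual Ahrefs export:

```python
import csv
from datetime import date
from io import StringIO

def referral_impressions(csv_text, month_start, month_end):
    """Sum estimated page traffic for backlinks live during the target
    month: skip rows first seen after the month ended or last seen
    before it began."""
    total = 0
    for row in csv.DictReader(StringIO(csv_text)):
        first_seen = date.fromisoformat(row["First seen"])
        last_seen = date.fromisoformat(row["Last seen"])
        if first_seen > month_end or last_seen < month_start:
            continue  # link wasn't live at any point during the month
        total += int(row["Page traffic"])
    return total

# Tiny invented export: only the first link was live during November 2022.
export = (
    "First seen,Last seen,Page traffic\n"
    "2022-10-01,2022-12-05,500\n"
    "2022-12-02,2022-12-20,300\n"
    "2022-09-01,2022-10-15,900\n"
)
november = referral_impressions(export, date(2022, 11, 1), date(2022, 11, 30))
```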

The video below shows how we would pull these numbers for November, using Ahrefs:

Finally, you’ll notice “branded clicks” and “branded impressions” on the template:

This data, which is easily pulled from GSC (filter for queries containing your brand name), can make for some interesting correlative data. It also helps us with engagement data, since we count branded search as a form of engagement. After all, if someone’s typing your brand name into Google Search, there’s likely some intent there.

How to evaluate estimated reach

Once you’ve filled in all your data, your sheet will look something like the image below:

That’s enough to start creating very basic pivot tables (like adding up your total reach each month). But notice all the holes and zeros?

You can fill those by pulling in your engagement metrics. Let’s run through them:

  • Organic search: Pull clicks from Google Search Console. (Optional: I also recommend pulling branded search impressions, which we count as engagements in our spreadsheet, as well as branded clicks). New Users from GA is a viable alternative to clicks (remember that near-perfect relationship?), but you won’t be able to filter for your branded impressions and clicks this way.

  • Email marketing: Total number of “clicks” from the emails you’ve sent. We do this over opens, because opens have become less reliable; some email clients now technically open your emails before you do. Clicks in emails can be pulled from your email automation platform.

  • Social media: Engagements (link clicks, comments, likes and reposts) pulled from Sprout, or from each social platform’s native analytics. Do the same for paid engagements.

  • Google Ads/AdRoll/other ad platform: Interactions, or clicks, pulled from the ad platform of your choosing.

  • Website referrals: Referral traffic from Google Analytics (these are the people who encountered your brand on an external website and then engaged with it).

  • YouTube: Views from YouTube Analytics.

Once you’ve filled in this data, your spreadsheet will look more like this:

Now you have some new insights that you can create pivot tables around. Let’s look at a few:

1. Engaged reach

This is the portion of your total estimated reach that has engaged with your brand. You want to see this climb every month.

2. Engagement rate

This is the percentage of your estimated reach that is engaging with your brand. This is arguably your most important metric — the one you should be working to increase every month. The higher that percent, the more efficient use you’re making of the reach you have.
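The arithmetic behind these two metrics is simple. In the sketch below, the channel names and numbers are purely illustrative, and total engagements are used as a proxy for engaged reach (in practice one person can engage more than once):

```python
# (reach, engagements) per channel -- illustrative numbers only
channels = {
    "organic search": (50_000, 2_500),
    "email": (400_000, 12_000),
    "social": (30_000, 900),
}

total_reach = sum(reach for reach, _ in channels.values())
engaged_reach = sum(eng for _, eng in channels.values())       # 1. engaged reach (proxy)
engagement_rate = engaged_reach / total_reach                  # 2. engagement rate

print(f"Engaged reach: {engaged_reach}")          # 15400
print(f"Engagement rate: {engagement_rate:.2%}")  # 3.21%
```

A pivot table over your monthly sheet does the same sums; the point is that the rate normalizes engagement against reach, so you can compare months even when reach fluctuates.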

3. Engagement rate by channel

This shows you the channels with your highest engagement rate for the current month. You can use this to flag channels that are giving you what we might call “bad” or “inefficient” reach. It affirmed our decision, for instance, to drop an entire display channel (AdRoll) in favor of another (Google Display). Month after month, we saw low engagement rates on the former. Diverting our spend away from that display channel slightly increased our cost per thousand impressions, but the added cost was more than offset by a higher engagement rate.

4. Winners and losers month-over-month

You can do this as a direct comparison for reach or for engagement. The chart below is a comparison of engagements between October (blue) and November (red). We always want the red (most recent color) to be bigger than the blue (unless, of course, you’ve pulled resources or spend from a particular channel, e.g., paid Instagram in the chart below):
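A quick sketch of that month-over-month comparison (all numbers are made up for illustration). Note that a “loser” isn’t automatically bad, as with the deliberately reduced paid Instagram spend mentioned above:

```python
# Engagements per channel -- illustrative values, not real data
october  = {"organic": 2_400, "email": 11_000, "paid instagram": 1_500}
november = {"organic": 2_900, "email": 10_500, "paid instagram": 600}

for channel in october:
    delta = november[channel] - october[channel]
    pct = delta / october[channel]
    label = "winner" if delta > 0 else "loser"
    print(f"{channel}: {delta:+d} ({pct:+.1%}) -> {label}")
```

Running this prints one line per channel, e.g. `organic: +500 (+20.8%) -> winner`, which is the same comparison the bar chart makes visually.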

5. Correlation data

This is where we get a little deeper into the funnel, and find some fascinating insights. There are many ways to search for correlations, and some of them are just common sense. For example, we noticed that our YouTube reach skyrocketed in a particular month. After looking into it, we determined that this was a result of running video ads on Google.

But reach and engagements’ most important relationships are to leads and, better yet, leads assigned to sales reps. Here’s an example using five months of our own data:

While we still need more data (5 months isn’t enough to close the book on these relationships), our current dataset suggests a few things:

  • More reach usually means more engagement. There’s a strong relationship between reach and engagement.

  • More reach usually means more lead gen. There’s a moderate relationship between reach and lead gen.

  • More engagement almost always means more lead gen. There is a very strong relationship between engagement and lead gen.

  • More engagement almost always means more assigned leads. There’s a strong relationship between engagement and leads that actually get assigned to sales people.

  • More lead gen almost always means more assigned leads. There’s a very strong relationship between lead gen and leads getting assigned to sales people.
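One way to quantify relationships like the ones above is the Pearson correlation coefficient. Here is a rough stdlib-only sketch; the five months of figures are illustrative stand-ins, not our actual data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Five months of illustrative figures
reach       = [410_000, 395_000, 430_000, 450_000, 470_000]
engagements = [18_000, 17_200, 19_500, 20_800, 22_100]
leads       = [210, 200, 235, 228, 255]

print(f"reach vs engagements: {pearson(reach, engagements):.2f}")
print(f"engagements vs leads: {pearson(engagements, leads):.2f}")
```

With only five data points, treat any r value as a directional hint rather than a conclusion, which matches the caveat above about needing more data.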

This is just one of the ways we’ve sliced and diced the data, and it barely skims the surface of how you can evaluate your own brand reach and brand engagement data.

6. Collaborating with other marketers on your team

Some of the relationships and correlations are subtler, in the sense that they relate to specific levers pulled on specific channels.

For example, we were able to figure out that we can increase branded search by running broad-match-keyword Google paid search campaigns, specifically.

The only reason we know this is that we meet as a team regularly to look over this data, and we’re always debriefing one another on the types of actions we’re taking on different campaigns. This structured, frequent communication helps us pull insights from the data, and from each other, that we’d otherwise never uncover.

Why this work is so worth doing

If at some point while reading this article you’ve thought, “dang, this seems like a lot of work,” you wouldn’t necessarily be wrong. But you wouldn’t be right, either.

Because most of the actual work happens upfront — figuring out exactly which channels you’ll track, and how you’ll track them, and building out the pivot tables that will help you visualize your data month after month.

Pulling the data is a monthly activity, and once you have your methods documented (write down EVERYTHING, because a month is a long time to remember precisely how you’ve pulled data), it’s pretty easy.

One person on our team spends about one hour per month pulling this data, and then I spend maybe another two hours analyzing it, plus 15 minutes or so presenting it at the start of each month.

We’ve only been doing this for about half a year, but it’s already filled gaps in our reporting, and it’s provided us with clues on multiple occasions of where things might be going wrong, and where we should be doubling down on our efforts.

Eventually, we even hope to help use this as a forecasting tool, by understanding the relationship between reach and sales meetings, but also reach and the most meaningful metric of all: revenue.

How cool would that be?

5 Key Considerations for Winning SEO Buy-In – Whiteboard Friday

Petra has plenty of experience talking to C-level decision-makers about their business problems, and translating them into SEO solutions. So in today’s episode of Whiteboard Friday, she takes you through the main considerations you need to pay attention to when explaining the value of your work: commitment, concerns and objections, status versus purpose, and prioritization.

infographic outlining five key considerations for winning SEO buyin

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, I’m Petra Kis-Herczegh, and welcome to my Whiteboard Friday on five key considerations for SEO buy-in. I’m an SEO and solution engineer, which means I get to talk to C-level decision-makers about their business problems and translate them into SEO solutions. Today, I will take you through the five key considerations, which are audience, commitment, concerns and objections, status versus purpose, and prioritization. We’re going to talk about the common pitfalls and what you need to pay attention to and how you can make sure that you optimize this process so you can save time and do what you do best and spend more time executing your SEO strategy.

So to start with the audience, the first thing you want to understand is who you are talking to. You want to make sure that you identify your key stakeholders, the decision-makers, and also the blockers because the challenge here is that sometimes you might not be speaking to the right people or you might not be following the right process. So you might be stepping over blockers, going straight to decision-makers, which upsets people, or you might not be involving the relevant stakeholders early on. So what you need to consider here is who to involve and when. The solution here is that you want to build rapport to make sure that the stakeholders and blockers and decision-makers trust you and you want to make sure that you fully understand the process to follow it as you should within the business.

The next thing is commitment. So with commitment, you need to make sure that when you’re getting buy-in, you’re identifying whether you’re getting real buy-in or fake buy-in. Fake buy-in is when you get a yes, but you get it without commitment. We often do this without even realizing it, by pushing to a yes, by asking questions that give no other option. So you can ask things like: Do you want your content to rank and convert better? Or do you want more traffic? These are not really questions, and what you actually do is damage relationships, which means that you end up in a cycle where you’re not able to execute what you wanted to achieve because there is no true accountability on an execution level. So here you want to apply critical thinking, which means you need to be really skeptical about how you’re getting to that yes.

That drives us to the next point, concerns and objections, because you need to make sure that you address these early on. So the common pitfall here is our confirmation bias, because confirmation bias pushes us, when we start researching, say, a local SEO project or a technical SEO project, to look at use cases that prove our own point. But when we do that, we forget that there might be concerns and objections coming from different stakeholders and different teams. So the way to think about this is that you want to engage in healthy conflict. You want to do your research with the idea of preempting these concerns and objections, and ask questions like: If this project is so important, why is it not being done already? What questions could other stakeholders raise about changes to the website, and how could those changes impact other teams? Will training be required for relevant teams if we introduce, for example, a new tool? So you want to make sure that you understand what sort of concerns could come up so you can be really comfortable and confident when you address them and bring them up.

That leads us to the next point, which is status versus purpose. So what’s your real reason for trying to get buy-in for an SEO strategy, project, or idea? What’s driving it? Because you really want to make sure that it’s purpose that’s driving it, rather than your ego, which is why you need to check in with yourself. The real solution here is to think about your SEO KPIs and connect them to overall business needs, because then you can look at things at a holistic level and think about how your strategy is driven by purpose rather than status. That leads us to our last point, which is prioritization. Because if everything is important, then nothing is. What that means is that if you focus on everything at once, the likelihood is that nothing will ever get done.

So here you actually want to use a prioritization framework. You can go with your favorite one. There are frameworks like ICE, which scores impact, confidence, and ease (the effort required to execute your solution). But you probably also want to add a metric for the probability that you get real buy-in from your leadership, to make sure you’re not wasting too much time trying to get your ideas and strategy executed.

I hope you found this session useful, and hopefully you can adapt some of these to optimize your process of getting SEO buy-in, which means that you will have now more time to execute your SEO strategy.

Video transcription by Speechpad.com
