
Getting Started with Google Search Console; Recap SEO For Bloggers Episode #17

Recap, Q&A, + All the Resources

The valuable insights offered by Google Search Console enable publishers to see what’s working and what’s not. During this episode of SEO For Publishers, our panelists shared the many facets of GSC, such as the difference between security warnings vs. manual actions. In addition to highlighting how you can analyze changes in your traffic over time, this episode covers how to identify keywords tied to specific content on your site and tons more.


Replay the LIVE Webinar

Don't Miss Out On Future Resources!

You don’t want to miss out on future virtual and live events with #TeamTopHat and other industry experts. Sign up for our mailing list to get free resources and updates today!

Q&A With The Panelists

These are the questions asked during the Q&A portion of the webinar, with answers provided by the panelists. Have a question about this episode you'd like addressed? Reach out to us!

Question 1

Confused about the difference between Google Analytics, Google Search Console, Google Tag Manager, etc. Now there is a new thing under beta Search Console Insight. WHAAT? Do I need all these, or can we just use one of them? Why do they give different results?

Google Tag Manager is a tool for managing tracking scripts; the odds you are using it are low. Google Search Console and Google Analytics both measure traffic, but the totals may differ for many reasons, including time zones, reporting periods, and more. If you want to track all your traffic as a whole, use Analytics. If you want a snapshot of Google-only search traffic, use Search Console. As for Search Console Insights, that's nothing more than another view of your data based on current user trends. Use it or don't; it's up to you.


Question 2

If Google is only using search metrics for those using Chrome, then are rankings not as accurate since many people use other browsers?

It won’t really impact “accuracy” of search rankings — but it does mean that Google is only using real-world timing data from Chrome visitors, yes. (This is Google’s world, we just live in it.)


Question 3

I'm seeing some of my Web Stories flagged as "needs improvement." Didn't know web stories were a part of Core Web Vitals, but good to know.

Correct. All URLs on a site are counted for purposes of Core Web Vitals.


Question 4

Just noticed I’m getting “Needs Improvement” for LCP for all of my web stories. (Sorry if this Q has already been asked.) Figuring that’s not something to worry about?

It is something to worry about, but it's also not an easy issue to fix.


Question 5

Once you have identified a keyword that is not getting as many clicks as it used to, how would you fix that?


Look at the intent of the keyword and see if your content matches that intent. Do that by comparing your resource with the resources Google is currently ranking, and adjust accordingly.


Question 6

Very new to blogging – I have Google Analytics set up so does that mean GSC is set up?

Hi Elizabeth, and welcome to the party! Google Analytics and Google Search Console are two separate services, so you'll need to set up GSC separately. However, you can probably use your GA setup to *verify* your domain property in GSC.


Question 7

Seeing the URLs with # confuses me. I assume this is coming from my table of contents headers? Anyway, does the breakdown mean anything?

It does not mean anything significant, nor is it hurting you. Those fragment (#) URLs do come from your table of contents links.



Question 8

In the examples Arsen showed, what do we do when we see a drop in search terms for specific posts? If they're ranking in the top 3 but have slipped one or two spaces, should we go in and update the post? I know Casey has often said it's risky to touch posts in top positions. But if they're slipping, is it OK to try to adjust them?

That's up to you, though if something is in the top 3 we tend not to touch it if at all possible. Just understand that any change you make could help you or hurt you, and you would need to be patient and wait several weeks to see whether the changes help. Don't panic and reverse your changes if the post drops further; that's common with Google, and it's the algorithm's way of keeping you guessing. Be patient, give the changes time to work, and then adjust accordingly.


Question 9

What should be in Excluded? For example…if I see a valid URL from my website and it is excluded, should I be concerned?

Absolutely. You will need to ask yourself: is this URL unique? Does it have a technical issue preventing its indexing? How does this URL compare with other existing, indexed results? Then, adjust accordingly.


Question 10

In addition to the main sitemap and video sitemap, I also have a web-story sitemap. Is that correct?

Not necessary. Yoast auto-generates that one as well. Just submit the main sitemap index file and it will be covered; there's no reason to submit it individually.


Question 11

When looking at coverage for the past 12 months, impressions and clicks were basically along the same line. Then in July, impressions stayed about the same, but clicks went down (although pretty evenly spaced apart). Not sure why?

Impressions staying the same while clicks trend down can usually be attributed to a decline in position: you are still on page 1, but below the fold. The other possible reason is seasonality (a decline in demand).
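The arithmetic behind this diagnosis is simple: flat impressions with falling clicks means your click-through rate fell, which usually points to a position drop rather than lower demand. A minimal sketch (the numbers below are made up for illustration):

```python
# If impressions hold steady but clicks fall, CTR has dropped: same demand,
# fewer people choosing your result, which typically means a lower position.
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a fraction (clicks per impression)."""
    return clicks / impressions

# Hypothetical two-month comparison for one post:
june = ctr(clicks=800, impressions=20_000)
july = ctr(clicks=500, impressions=20_000)
print(f"{june:.1%} -> {july:.1%}")  # prints "4.0% -> 2.5%"
```

If CTR dropped while impressions held steady, check the Average Position column in GSC's Performance report for that page next.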


Question 12

My submitted sitemap has a dot instead of an underscore. Is that okay? /sitemap.xml

Are you using Yoast SEO? If so, that's the wrong sitemap name. But if you're using a different SEO plugin, it could be correct. Try loading that sitemap URL in your browser and see if you get a valid, current sitemap. If not, you definitely need to update what's configured in GSC.


Question 13

I checked my sitemaps and see I have attachment-sitemap. Is that ok? Status is success and I have 1000 URLS. Since attachments are images, can these be in sitemaps?



Image attachment URLs are not a good thing. Most SEO plugins can disable attachment pages (and their sitemap) entirely; in Yoast, for example, there's a setting to redirect attachment URLs to the media file itself.

Question 14

What does "Discovered – currently not indexed" mean?

It means Google has discovered the URL (via links or your sitemap) but hasn't crawled it yet. If Google had crawled it and then decided not to index it, you would see "Crawled – currently not indexed" instead. Investigating why and how was discussed live in this webinar.


Question 15

If you gain tons of new keywords, won’t that bring down the average position?

"Average position" isn't a ranking factor, and lots of new keywords entering at lower positions will naturally pull the average down. I wouldn't lose sleep over it.


Question 16

Do you have other webinars / videos about getting started with GSC?

Please start with the detailed YouTube playlist put out by Google that was shared during the presentation.

Question 17

Is it good practice when you update a post (or create a web story) to submit the URL for reindexing?

It’s usually not necessary. Google will find the URL via your sitemap.

Question 18

If a recipe keyword is competing with Pinterest and also retailers, for example, is there a chance to outrank them (since they're not recipe pages)?


Question 19

I have sitemap index added and all the others individually, should I delete the individual sitemaps?

It's not a huge deal, but yes, I do recommend removing the individual sitemaps if they're already referenced in sitemap_index.xml. It's redundant, and it's a bit harder to maintain long-term.
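For context, a sitemap index file is just a list of pointers to the child sitemaps, which is why submitting only the index covers everything. The filenames below are illustrative Yoast-style names; your plugin's may differ:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap is referenced here, so Google finds them all
       from this one file. Example filenames only. -->
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/web-story-sitemap.xml</loc></sitemap>
</sitemapindex>
```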

Question 20

In all the panelist’s opinions, what is the most important thing we should be doing to prepare for Q1 (regarding our old content)?

Start a content calendar. Focus on seasonal content first, and make sure you are working on content before it becomes seasonally relevant. You would hate to miss the window and have to wait another year to address it again.


Question 21

Should "are kidney beans poisonous" be a separate article, or should he include that topic in the same article that's already ranking?

That’s up to you. Is the original article about kidney beans? Is this addition going to make the existing article better? Or, do you feel that this topic warrants a completely new article that is more detailed? That’s a question you’ll have to investigate.


Resources & Links

Below are links to all tools, articles, and other resources mentioned in this webinar:

  1. GSC Introductory Video – Google Search Console How-To
  2. Google Search Console 
  3. Ownership How-To video – 7 ways to verify site ownership – Google Search Console Training
  4. Core Web Vitals SEO For Publisher episode – SEO For Publishers episode about Core Web Vitals
  5. GSC Report – Monitoring Rich Results in Search Console – Google Search Console Training
  6. Luck of the Links – SEO For Publishers episode about link building


Ashley: And we are officially live.

Andrew Wilder: Yay.

Casey Markee: Yes. Let’s see, is everyone starting to pop in here?

Ashley: And Arsen leaves.

Casey Markee: Arsen’s gone. Get out of here, Arsen.

Andrew Wilder: That’s about right.

Casey Markee: Get out of here, Arsen. Attendees list still populating?

Arsen Rabinovich…: I don’t need to take this.

Ashley: As everyone’s joining in, let us know in the chat box where you are tuning in from, the very last episode of the whole year. That’s crazy.

Casey Markee: [00:00:30] That’s right, it’s going to be awesome. We’re all wearing our nice festive gear today. I got the nice hat today. Apparently they were out of the naughty. Actually, as I was talking, as I was telling my fellow panelists here, my daughter took the naughty one to school, so I’m not sure how I should feel about that as a father. I’ve decided not to make too much of it, but yeah, she had two choices and she just chose to take the naughty one to school, so good times. Good times.

Ashley: Definitely won’t comment on what-

Casey Markee: Not at all. All good, all good.

Ashley: Options.

Casey Markee: [00:01:00] Oh, ho, ho, ho from Asheville. Yes, Lori, good to see you. Happy holidays from Massachusetts. TME. [Deance 00:01:07] from Wisconsin. Look at that.

Ashley: Montreal?

Casey Markee: All right, really? I hope everyone has gotten their Christmas shopping done. What is it, 17 days to Christmas? Unbelievable.

Arsen Rabinovich…: Germany.

Casey Markee: Germany.

Arsen Rabinovich…: Germany.

Casey Markee: My goodness, my goodness. We are taking a Christmas cruise, [00:01:30] we’re going to go on… We’re going on the high seas. We are seeing the incredible sights of Cabo and Ensenada. Yeah, exciting.

Ashley: Nice.

Casey Markee: I get my Puerto Nuevo fried lobster and-

Arsen Rabinovich…: All that snowy goodness in Cabo.

Casey Markee: Oh yeah, yeah. We get my Puerto Nuevo fried lobster and my unlimited beach drinks on the catamaran cruise in Cabo, and we’re going to be doing that the week of Christmas. I believe that we, I think we leave on [00:02:00] the previous weekend and then we get back on Christmas Eve.

Ashley: Oh, how nice.

Casey Markee: Looking forward to that, looking forward to that. Anytime you can be stuck in the middle of the ocean with unlimited drinks, I think is a good deal. That’s big.

Ashley: Not too bad.

Casey Markee: All good.

Ashley: What cruise line are you going?

Casey Markee: We’re going Carnival. We’ve always done Carnival. We’ve done enough cruise line… Enough trips with Carnival now that we are like Diamond members, so all that does is get you [00:02:30] free stuff. I’m all for free stuff, so all good. They could pay my keno. They could pay me to play keno, there you go.

Ashley: Kathy did-

Casey Markee: Who’s from the French Alps? Wow. David, hello from the French Alps. That’s awesome.

Andrew Wilder: See, that’s where I would go for my year end party. Somewhere in the mountains.

Casey Markee: There you go. Somewhere in the mountains?

Andrew Wilder: No, the French Alps sounds perfect.

Casey Markee: I can host you in Julian, it’s fine. Julian’s just 20 minutes away. We’re going to get snow.

Andrew Wilder: It does get a little chilly in Julian, but David, I’m coming to visit.

Casey Markee: [00:03:00] Some people on this call had their wedding in Julian and didn’t invite other people that were only 15 minutes away. I won’t mention any names. It’s fine.

Arsen Rabinovich…: Are you still going on about that?

Casey Markee: I am. I’m never going to let her live it down.

Arsen Rabinovich…: It’s been four years.

Ashley: It’s been almost three, and-

Casey Markee: It’s three years, by the way. It’s actually about two years, 10 months, 21 days. But that’s okay, I’m not counting.

Ashley: At least you were honest about it. You did tell me that you would never let me live this down.

Casey Markee: Never ever.

Ashley: Yeah.

Casey Markee: Never. It’s [00:03:30] all good.

Ashley: Yeah. Very accurate.

Casey Markee: And don’t let things fool you folks. Arsen is not drinking water, that’s vodka.

Arsen Rabinovich…: It’s just straight up vodka.

Casey Markee: Straight up vodka.

Arsen Rabinovich…: Straight up vodka.

Ashley: His face doesn’t make any reaction [inaudible 00:03:46].

Arsen Rabinovich…: Oh man.

Ashley: A Russian face.

Arsen Rabinovich…: It’s Russian genetics, it’s just drink water.

Casey Markee: It’s good for your skin.

Arsen Rabinovich…: Right. Keeps things nice and preserved.

Ashley: All right. Well, let’s go ahead and get started. Let’s finish this year [00:04:00] up strong with a big episode on Google Search Console. Today we have, as always, our expert panelists, Casey Markee, Andrew Wilder, and Arsen Rabinovich, who are going to be sharing the ins and outs of GSC, Google Search Console. This is definitely going to be a big episode. There’s a huge range of questions that you all submitted, from just walking through to the technical questions, so we’re going to cover as much as we can.

[00:04:30] If you have any questions that we haven’t addressed yet or a follow-up question to maybe something that we just talked about, go ahead and drop it into the Q&A. As always, a week after the episode airs, we’re going to be publishing a recap blog post that has the answers to all of the questions. But make sure you get it into the Q&A. If you put it into the chat box, there’s no guarantee, because Zoom doesn’t carry all of that information over, and so we’re not able to get all of that information after the webinar. If you want your question answered, put it in the [00:05:00] Q&A section. Okay, so without further ado, let’s dive in, let’s talk about GSC, Google Search Console. Casey, can you start us off with a pretty good breakdown of what Google Search Console is and how to actually get it set up?

Casey Markee: I can. I am not good enough to go to your wedding, but I am good enough to answer your questions about Google Search Console. That’s fine.

Ashley: Good transition.

Casey Markee: I see where the transition is there. It’s fine. [00:05:30] For those of you on the call, Google Search Console is a free service offered by Google that helps you monitor, maintain, and troubleshoot your site’s presence in the search results. It’s that simple. If you have a website and you want to garner traffic from Google, the most important single step that you can do is make sure that you go in and verify your Google Search Console. Now, I’ve provided a link over here. We’re going to try to populate the chat with as many resources as possible, so make sure that you write [00:06:00] those links down when you can. I believe they will be on the transcript regardless, but this will help you.

The introductory video that I’ve shared is directly from Google, and it’ll tell you what Google Search Console is, it tells you all the ins and outs. We’re going to be sharing a lot of those videos today because the best resources are the ones that Google provides for your amusement, as I like to say. When we talk about setting up and verifying your Search Console, this verification process is necessary so Google understands that you are [00:06:30] the owner of this. We want to make sure that if you have sensitive information in there, you can access that information, and Google really does a pretty good job of making sure that, hey, we’re not going to make sure that we’re going to give just anybody access to your proprietary information.

Now, there are seven ways for you to verify your site in Search Console, and they’re all detailed in a video here, which I’m going to paste over for your amusement. And I’ll go ahead and put that right in here, paste that over. Now, the seven ways [00:07:00] to verify your site in Google are as follows: a DNS record, an HTML file upload, an HTML tag, the Google Analytics tracking code, and a Google Tag Manager container snippet. Google Sites has a specific, unique approach to verification, and also Blogger sites have a unique way to verify. The vast majority of those of you on the call are going to be verifying your site with one of four methods. You’re going to be using a DNS record through your host, you’re going to be using an HTML [00:07:30] file upload, or you’re going to be using an HTML tag within your Yoast or SEO plugin, or maybe you’re just going to do a one click verification with Google Analytics. Most likely, for most of you, it’s going to be an HTML tag with Yoast or it’s going to be a one click, which is the recommended approach. Of all those seven methods, Google really wants you to verify through Google Analytics more than anything, and they make it very easy to do so.
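For reference, the HTML tag method Casey mentions is a single meta tag that Search Console generates for you; you paste it into your site's head section, or into the field your SEO plugin provides. The token below is a made-up placeholder, not a real verification code:

```html
<!-- Paste inside <head>; Search Console gives you the actual content value.
     The token shown here is a placeholder for illustration only. -->
<meta name="google-site-verification" content="YOUR-TOKEN-FROM-GSC" />
```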

When we’re talking about making sure that you [00:08:00] have this stuff verified, the most important thing to understand is that this is your own data. You can go ahead and provide full, you can provide limited, or you can provide restricted analysis to your data. It’s funny because I didn’t used to believe that there was an upward limit to how many accesses you could have for Search Console. I found out last week that that is not the case. It is 500 accounts. I actually had to go in and start removing accounts the other day because I’d finally hit my limit of how many accounts I had access to in [00:08:30] Search Console. It was crazy. I’ve been slowly deleting accounts or accesses ever since so that I can onboard new clients. But it was pretty crazy in that regard.

If you’re interested in seeing what it would look like, let me see, I can do a very quick screen share here. But let me go ahead and see if I can pop this up really quickly to give you an idea of what it would look like here. Let me see [00:09:00] if I can do, share screen, and we’re going to share that one. We’re going to click share.

You should be seeing, on your screen, just a very quick overview of, if I wanted to verify ownership of the site, in this case, Organically Addison, which was a site that we visited with earlier this week. You see how there’s two breakdowns. There’s the recommended method here, which is Google Analytics, and then you can see the alternate methods there, which are the HTML file upload, the HTML [00:09:30] tag, the domain provider, or the Google Tag Manager. In many cases, I could just click on the Google Analytics there and verify, and I’ll be all set up, or I could go and use an alternate method. Maybe it’s a new site, they’re not too familiar with Google Search Console. I would instead go in and verify by means of the HTML tag. And you can see with the HTML tag, it’s just a matter of me copying the HTML tag, going into something like Yoast, putting the code in under, I believe, the general [00:10:00] settings screen, and then it’d be instantaneous. But this is just kind of an overview of exactly what you would see if you were to try to verify the average site with Google. We’ll just stop the share, there. Pretty simple to do. If you have any questions on this, please don’t hesitate to submit something through the Q&A, and we’ll make sure that it’s included in the full transcript.

Ashley: Perfect. Thank you for showing that, Casey. Arsen, how often would you recommend actually diving into Google Search Console, [00:10:30] and what kind of reports or different areas of GSC are the most important to look at regularly?

Arsen Rabinovich…: Right. You want to look at a few places inside of Search Console on a regular basis. I wouldn’t necessarily live in there, because you can go nuts looking at all the data. If you’re noticing declines, if you’re noticing [00:11:00] weird things happening with your traffic and positions, you want to look under performance, you want to look at the search results section. And there, you want to look at impressions, you want to look at clicks, you want to look at which pages are performing how, which keywords are trending up, which keywords are trending down, and what’s happening in general. The other places that you want to quickly just check out if you’re doing this prophylactically are the page experience, the Core Web Vitals, and mobile usability.

Coverage reports [00:11:30] are super important. This is directly from Google telling you that this is what we’re seeing, this is what we’re encountering when we’re crawling your website. In the coverage reports, you want to look at errors, you want to look at valid with warning, and you want to look at excluded. And you want to look under details, and later, when I share my screen, I’ll show you in details, you want to look at anything that shows large chunks or groupings of pages moving in and out of the errors or the excluded [00:12:00] from index. And that typically is a good way to start narrowing it down or ruling out any potential issues with the website. But yeah, I would probably check, so internally at TopHatRank, for our larger clients, we do a deep dive about once a week. Clients that are in the mid to smaller range, once a month or once every two weeks is a good way to keep track and stay on top of things. Definitely not something I would be in every [00:12:30] day.

Ashley: Then it’s not a Google Analytics replacement, by any means.

Arsen Rabinovich…: No, not at all.

Ashley: Got it. Okay. And Andrew, a lot of questions came through about GSC and Core Web Vitals. Can you briefly explain what the Core Web Vitals report is and how bloggers can use this info to actually improve user experience?

Andrew Wilder: Sure. We did a whole episode on Core Web Vitals, episode six, which feels like ages ago, but I’m going to paste that link in [00:13:00] the chat. If you want a deep dive on Core Web Vitals, that’s the place to start. But a quick summary is that Core Web Vitals are basically speed and usability metrics that are now rolled into the page experience algorithm. Page experience includes the Core Web Vitals, it includes mobile usability, if your site is HTTPS, which most of you are going to have that down, but all of this is part of the page experience because Google wants, anytime you click through from the search results, Google wants to have some… The user to have a good experience. [00:13:30] Core Web Vitals, which rolled out in June, also feels like forever ago, basically looks at three specific performance metrics, it’s Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift.

The first one is basically how long it takes for the biggest thing on the page, in the viewport, before you scroll, to load. That’s sort of a marker for speed. First Input Delay is a marker for performance. Most food blogs don’t have a problem with that, but you want that to be under 100 milliseconds, so basically the site feels responsive. And then Cumulative [00:14:00] Layout Shift is how much stuff moves around on the page, either as it loads or after it loads. That’s the really difficult one.

Google basically collects real world data and stores it in what’s called the CrUX database (the Chrome User Experience Report). And then in there, so they’re using the Chrome browser, so actual, real-world user metrics are transmitted back to Google from when people use Chrome and access your site. They collect all this data, and then Search Console is actually reporting on this data. You should theoretically be accessing the same data when you run a [00:14:30] PageSpeed Insights test, and it’s showing you the Core Web Vitals info there. It’s supposed to be the exact same data. Sometimes we see that one hasn’t updated yet, so it can be a little finicky. We’ve worked on sites where we’re like, “No, this is really good now, the site is blazing fast,” and Search Console starts kicking out warnings and we’re like, “There’s nothing wrong.” Then we validate fixes and it actually works again.

Oh, another thing with Core Web Vitals is right now, it is only a ranking factor on mobile. It is a minor ranking factor. Google still [00:15:00] said basically it’s a little bit more than a tiebreaker, so don’t freak out if your Core Web Vitals say “fail.” We’re going to talk about that in a second. Since it’s a known ranking factor, it’s important to perform as well as you can on it. But it is not make or break. We had a lot of clients who got hit by core updates in the beginning of June or the beginning of July, and they thought it was Core Web Vitals, but it was actually one of those big core updates last year. There’s been a lot of volatility this year.

Core Web Vitals tends to get the blame because [00:15:30] people know it and Google’s announced it, but it’s actually pretty minor. Having said that, in February, they’re going to start to include it as a ranking factor for desktop as well. They’re going to include it in the existing page experience algorithm for desktop. Obviously, mobile usability will still just be for mobile. We do need to start looking at desktop Core Web Vitals as well.

How can you use Search Console for all this? Well… Sorry, my hat. Oh, take a drink! Ashley’s cat just ran by.

In [00:16:00] Search Console, and actually, let me share my screen. Let’s see if I can do this quickly. Is that coming up okay? Okay, so here, we’ve got my food blog, and I’ve clicked on page experience. The page experience is the overall category for this stuff. And on mobile here, we’ve got Core Web Vitals, we’ve got mobile usability and HTTPS. If you click on Core [00:16:30] Web Vitals here, so the same thing as clicking on Core Web Vitals on the left. These graphs here are going to show you what URLs it’s found and what the categories are. Everything in Search Console basically works this way, where they’ve got this graph at the top and you can filter it by clicking on the boxes. This can be a little confusing because the details actually changes based on what’s selected here.

Right now, we’ve got poor, needs improvement, and good. Poor, depending on which metric it is, has different criteria. [00:17:00] There’s three buckets. There’s good, needs improvement, or poor. It is not actually pass/fail. When Google rolled this out originally, they said it would be pass/fail, you either hit the metric or you don’t. They’ve rolled that back, actually, and the closer you are to the good range, the better. And then once you’re in the good range, there’s no additional improvement. We’re actually trying to get Google to change that language everywhere and just stick with the good, needs improvement, or poor, and drop the pass/fail because it’s not actually true anymore.
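The three buckets Andrew describes map to Google's published Core Web Vitals thresholds. A minimal sketch of how a value lands in a bucket (the threshold numbers are Google's documented values at the time of this episode; units as noted in the comments):

```python
# Google's published Core Web Vitals thresholds: (good_max, poor_min).
# A value at or below good_max is "good"; above poor_min is "poor";
# anything in between is "needs improvement".
# Units: LCP in seconds, FID in milliseconds, CLS is unitless.
THRESHOLDS = {
    "LCP": (2.5, 4.0),
    "FID": (100, 300),
    "CLS": (0.10, 0.25),
}

def classify(metric: str, value: float) -> str:
    """Return the Search Console bucket for a single metric value."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= poor_min:
        return "needs improvement"
    return "poor"

print(classify("LCP", 2.1))  # good
print(classify("CLS", 0.30)) # poor
```

As Andrew notes, it's not pass/fail: the closer you are to the "good" boundary, the better, and once you're inside it there's no extra credit.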

In here, what you’re going to see is needs improvement. I’ve got no issues. [00:17:30] I’ve done a lot of work on my site and I’ve got 612 URLs that are in the good range. Actually, this isn’t a good example, I guess, to show you. But if we see here though, we did have some poor ones at one point clocking in, so it’s showing poor. The CLS issue is more than .25, so something happened here where it had a lot of shifting. And if you click through on that row, this is the part… Oh, it’s not showing any of these samples. If you click through on here, though, it’ll show some, usually show you some URL examples. [00:18:00] And this is where you can start to see trends. One of the things we’ve noticed is sometimes there may be issues on just category pages. If you’re having a hard time getting your whole site to pass Core Web Vitals, you can actually drill down and see what URLs are in here that it’s flagging. And that’s probably the best way to use Search Console at the moment so you can start seeing what actually needs to be addressed and what the pattern is, because usually, it’s not just one page or one thing on one page. It’ll be just category pages or just your homepage or something like that. I [00:18:30] think that covers everything there.

Ashley: Yeah. That was a great explanation.

Andrew Wilder: Okay, thanks. We can figure out how to stop sharing. Okay, it’s all yours.

Ashley: You said it. Arsen, getting into where specific things are in Google Search Console, where exactly can you actually see changes in traffic over time?

Arsen Rabinovich…: Right, so I’ll share my screen here. Here we go. [00:19:00] Let’s go with, is it this one? Yeah, this one. Okay.

All right. Inside of Search Console, you want to look at your coverage report. The question is changes in search traffic over time, right. Sorry. You want to go into search results. You want to start there. You want to take a look at your overall trend. [00:19:30] Just to make it clear, I just get rid of total impressions. I only leave it on clicks. If you want to see the trend over time, you kind of see it this way. And then if you want to break it up and compare it, you can click on here, compare, and let’s just do compare last 28 days to the previous one. And this way, you can see changes in your clicks between the last 28 [00:20:00] days and the 28 days previous to that.

This is the cool part where you actually start filtering. When you have clicks enabled here, you can see which keywords have taken a loss in their clicks. But I like to start with pages instead of keywords. You can get in here and start seeing which pages on your site, which posts have taken the biggest hits, and you can sort it by [00:20:30] clicks. Once you see which post has declined in traffic, you can then drill in into the post. And what happens is, Search Console right away applies a filter specifically for that URL. Then, when you scroll back down here, you can click on queries, and it’s going to show you all of the keywords for this one post that have declined in referring traffic. “How to Cook Dried Beans Fast” [00:21:00] is 38 clicks difference between the two date ranges, between the two segments.

What you want to do here is start drilling in not only on clicks; you want to take a look at positions. And this brings in this other section here. Let me just widen the screen so we can see everything. There we go, right. Then you can see [00:21:30] that “How to Cook Dried Beans” had a decline of 38 clicks because it went from being number one to being almost number three. This way, you can drill into specific posts and start identifying which keywords are responsible for the drops in traffic.

The other quick thing that I want to show here is a way for you to check if there’s other pages on your website [00:22:00] that are competing for the same keyword. You would just simply remove this filter, click on the keyword, and then click on pages. And here, it will show you all of the pages, once it loads, all of the pages that are competing for that keyword, “How to Make Chocolate.” There’s currently two pages on the site. This is not really competing [00:22:30] because it only had one click or one position. It was the number one with no clicks, so this is the dominant post.

But this is where you go and this is how you create those filters and you drill into specific areas of your website that could be affected. Again, you do a comparison, you first remove the comparison, and you look at your trend line and you see where things took a downturn. You remember those dates, you then do the comparison [00:23:00] to see before and after based on clicks, you drill into specific pages, you click into the page, it applies the filter, and then you click on queries, bring in average position, and now you have a full metric. And those of you that have done calls with me, I use SEMrush on those calls. Same thing, same data. You can see the keyword, the query, how many clicks I got in the last 28 days, and then compare it to [00:23:30] the previous date segments. The difference, negative means you lost clicks, and then you can see the position. This is the best way to start your diagnostic process.
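The comparison Arsen walks through in the GSC interface can also be done offline. A rough sketch, assuming you've used Search Console's Export button to pull the Queries table for each of the two date ranges (the field names and sample rows below are illustrative, not GSC's exact export headers):

```python
# Pair up queries across two date ranges and sort by click loss, mimicking
# the "compare last 28 days to previous 28 days" view Arsen demonstrates.
def biggest_losers(current, previous):
    """Return per-query click changes, biggest losses first."""
    prev_by_query = {row["query"]: row for row in previous}
    deltas = []
    for row in current:
        before = prev_by_query.get(row["query"], {"clicks": 0, "position": None})
        deltas.append({
            "query": row["query"],
            "click_change": row["clicks"] - before["clicks"],  # negative = lost clicks
            "position_before": before["position"],
            "position_after": row["position"],
        })
    return sorted(deltas, key=lambda d: d["click_change"])

# Hypothetical exported rows, echoing the example from the webinar:
last_28 = [{"query": "how to cook dried beans fast", "clicks": 12, "position": 2.8}]
prior_28 = [{"query": "how to cook dried beans fast", "clicks": 50, "position": 1.1}]
print(biggest_losers(last_28, prior_28)[0]["click_change"])  # prints -38
```

The queries at the top of the sorted list, where a click loss lines up with a position drop, are the ones worth investigating first.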

Ashley: If anyone’s feeling overwhelmed by any of these walkthroughs or screen shares, don’t worry. You will get a replay, you’ll be able to watch this, pause it, and dive into things a little bit more as you need to, as well.

Arsen Rabinovich: Right.

Ashley: Thank you, Arsen. Casey, how can you see which snippets Google [00:24:00] is actually recognizing on your site?

Casey Markee: That’s a good point, yeah. The good news is that Google provides a report in Google Search Console called Enhancements, and we’ll take a look at that in a minute. But here’s a video that I’ll paste over here. Let me see and make sure that we’re getting everyone. I was just going to paste everything over to Ashley and make her do all the work. But I’ll paste it over to everyone this time. It’s the least I could do. If you’d invited me to your wedding, I could do a little bit more.

Ashley: And there it is again.

Casey Markee: It’s another thing, yeah. Everyone [00:24:30] take a drink. What we’re going to talk about today is, again, the rich snippets, the enhancements. Let me go ahead and share my screen here, and I’ll pop this up. Let me see if we can… Nope. I’m going to actually make sure that I have it up first. Yeah, let’s go here and let’s try that again. There [00:25:00] we go. Okay.

We’re going to go ahead and share the screen here. This is an example of the Eating Rules site that Andrew has so kindly allowed us to use today. If you go on the left hand side of your Search Console, you’ll see your menu here. And what we’re looking for is the Enhancements report. You can see there on the left hand side. Now, the cool thing about Google is that they support up to 17 different rich snippets with regards to Search Console. But the average recipe site [00:25:30] usually has between six and nine. And you can kind of get an idea here, you can see breadcrumbs, there’s guided recipes, there’s recipes, there’s review snippets, there’s site links. Most other recipe sites might have how-to there, they might have FAQ, they might have products, that would be another one that would be listed there. But this is where we would find those enhancements.

For example, if you click on breadcrumbs here, this is where Google is detecting that Andrew [00:26:00] has site-wide breadcrumbs. They’re validating just fine. You can see that it’s valid there. Here is his recipes template. You can see here that he has zero errors, 303 recipes with warnings, and none that are fully enhanced. Not a big deal. You can see that he’s just missing video there and he’s missing keywords on a couple. Both of those are honestly just optional. A warning is not going to kill you; an error will. We always want to fix errors, and we’re going to talk about that a little bit later, but this is where you would see all the [00:26:30] enhancements over here.

And we’d want to go in here and just take a look at these whenever we can, make sure that everything is valid. If I click on complete, it allows me to see what’s going on with regards to completed review snippets here. I can go in and see that someone has left reviews on each of these, and we can see them. But very easy to follow. And that’s what we’re doing is we’re looking at enhancements specifically. Here’s guided recipes. Guided recipes is kind of a failed [00:27:00] initiative by Google. You can see that he has a lot of warnings here. My advice to most people on the call is I wouldn’t even lose sleep over any of these. There are no guided recipes carousels, there’s no guided recipes… There’s no way to track specific guided recipes traffic. It’s just not a concern for the average blogger. But maybe if we were to go to another site here, I’ve got plenty of sites.

Here’s Ask Chef Dennis. If we were to go to Ask Chef Dennis’s site, see how he has [00:27:30] a couple different enhancements over here. He has AMP because of web stories. He was using web stories on his site. Here are some valid web stories that pop up, so we could take a look at those. That’s how you would know that he’s using AMP on his site. See that he also has FAQ, he’s been using FAQ schema on a couple recipe posts. You can see those individually here. All of those validate. That’s the difference. He even has videos marked up, so he has videos popping up here. There are various rich snippets, and this is how you track [00:28:00] them all under the Enhancements report within Google Search Console. Pretty easy to see when we’re talking about these.

Again, the average blogger will have AMP, breadcrumbs, guided recipes, recipes, FAQ, how-to, products, review, site links, and video, and that’s acceptable. The more rich snippets we have, the better. Understand that rich snippets are never a guaranteed thing. We can have everything marked up 100 percent and [00:28:30] Google could just decide that it’s having a bad Monday and not show those rich snippets. They’re totally a voluntary thing. You’re not guaranteed rich snippets, regardless. Sometimes, when we have disappearing rich snippets, we can troubleshoot that by looking for, maybe there’s an issue with the page speed of the page? Maybe we haven’t paginated comments and the page has been overwhelmed with a very high DOM node count, which sometimes can hinder Google’s ability to crawl and fully render the page, and if [00:29:00] they can’t fully render the page, they can’t process the rich snippets. That can happen. Or maybe we have disappearing thumbnails, which can happen sometimes, just because Google’s a dick. Just understand that that stuff exists, and if you go into your Search Console, you can track your enhancements pretty easily over time. Hope that’s helpful for y’all.

Ashley: And Andrew, Casey just showed a little bit of the warnings and whatnot within GSC. [00:29:30] Google sends out a lot of various notifications, some of them being warnings, some of them being errors, and it can be incredibly confusing trying to figure out what’s the bigger issue, what should be prioritized, what’s an error versus a warning. Can you explain what all the notifications mean, why they’re happening, and what really should be prioritized?

Andrew Wilder: Sure. I don’t know what the count is, but Google could probably send out 50 or 60 different types of messages. I [00:30:00] found a blog post a while back, and I think they’ve added to it since, that was every single thing that they go through. Most of you, if you’ve been using GSC for a while, you’ve gotten used to getting these really scary sounding emails. Oh, sorry. Cheers.

And some of them are really important, and some of them aren’t. As a general guideline, if it says “error,” you need to fix it. If it says “warning,” you probably don’t. On the [00:30:30] guided recipe stuff that Casey was just showing you on my site, I took Casey’s advice and I didn’t spend any time worrying about guided recipes. You see all those warnings on there, because I’m not actually marking it up with the guided recipes information. I’m not worrying about it because they’re warnings. However, if there’s an error on anything on your site, it can actually impact all of your rich snippets.

I don’t know if it’s still the case, Casey, but we saw a while back, we saw some errors on one or two things on one site, and it was actually causing a loss of rich snippets for all of the URLs on the site. [00:31:00] Errors are something to pay attention to, and you really want to clean those up. If it’s related to schema or rich snippets, that’s where it’s like, you’re missing your image on a recipe or the title of a recipe or a description, there’s only a few things that are required information, and that’s where you’re going to get the error. Or I think another is, if you have a one-step recipe and there’s only one step in the instructions, I think that can trigger a message. Don’t panic when you get one of these, what you want to do is click the link in the email, go to that spot in Search Console, and [00:31:30] then it’ll actually show you what the URLs are and what the errors are. You can actually drill down and say, “Oh, I forgot to put an image on this recipe.” You add the image, you click validate fixes, and then it’ll clear it up.

Ashley: And Casey, what about some of the errors that come through that aren’t as direct as the examples that Andrew just provided? One of the attendees that registered said that sometimes they’ll get a missing field. A video shows valid, shows that it’s valid, but it has warnings, even though it’s saying an optional [00:32:00] field. Should warnings like that just be ignored and assumed it’s a bot versus human issue, and when you actually go manually look, the video’s actually there, or the video’s not necessary because it’s optional? Where do you really draw the line with some of these warnings?

Casey Markee: Yeah, and that’s something that Andrew just briefly touched upon is that I always say that warnings are optional but errors should always be addressed. And the only required fields for recipe schema are name and image, period, that’s it. Those are the only required fields that you need [00:32:30] for schema: the name of a recipe and an image. If you miss everything else, it’s fine. You’re still going to show up in the search results. You can literally miss every piece of information other than the name and image, you will still qualify for a rich snippet. If you don’t fill out those, the image and the name, you get an error, and that’s bad because you will not get any rich snippets. But if you don’t fill out the aggregate rating, you don’t fill out total time, cook time, prep time, whatever, you will get warnings.
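Casey’s error-versus-warning distinction can be expressed as a small check. The field lists below follow what he describes (name and image required, the rest recommended); the function and the sample recipe are illustrative sketches, not Google’s actual validator:

```python
# A minimal sketch of the error-vs-warning distinction for Recipe structured
# data: "name" and "image" are the required fields; the rest are recommended.
REQUIRED = {"name", "image"}
RECOMMENDED = {"aggregateRating", "prepTime", "cookTime", "totalTime", "video", "keywords"}

def check_recipe_schema(recipe: dict):
    """Return (errors, warnings) for a Recipe JSON-LD dict."""
    errors = sorted(REQUIRED - recipe.keys())       # missing => no rich snippet
    warnings = sorted(RECOMMENDED - recipe.keys())  # missing => warning only
    return errors, warnings

errors, warnings = check_recipe_schema({
    "@type": "Recipe",
    "name": "How to Cook Dried Beans",
    "image": "https://example.com/beans.jpg",
    "prepTime": "PT10M",
})
```

With name and image present, this card would produce no errors — only warnings for the optional fields left empty, which is exactly the state Casey says is safe.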

Now, I’m a big believer, it’s all about the [00:33:00] little things in the recipe niche. I think that’s why I’ve been so successful with consulting with my clients is that we take a kitchen sink approach. Our goal is to fix as much as we can, as fast as we can. And that includes recipe schema. I recommend filling out all schema except videos and guided recipes. I think that until you get to a certain level, videos won’t help you. I’ve qualified people for media buying, never had them do one video. I’ve had people that didn’t even touch [00:33:30] videos until they got to 400 or 500,000 sessions, and it was fine, just so that they could have something they could give to their ad company and they could start running video ads for them.

But my goal is always to focus on everything that you can, like for example, the aggregate ratings. It’s a very visible rich snippet. That is a rating on a recipe. That’s a visible testament that this recipe is good, having a star rating. Every recipe you publish should have a rating or multiple ratings as soon as possible, so we want to get those filled out. [00:34:00] In my experience, in my case, I’ve literally done nothing but had clients fill in their recipe schema, and this has resulted in paradigm shifts in traffic, very big traffic improvements.

Now very quickly here, we can take a look at some sites, and you can see what we’re looking for with regards to recipe schema. Here is an example of a site, and I’m going to go ahead and just share [00:34:30] this really quickly, here. This is Buns in the Oven, a very good site. You can see here that if we pop this open, they’ve done an exceptional job. They have filled in, they have several hundred recipe cards, you can see here, and all they’ve left empty are video and aggregate ratings. But nevertheless, this aggregate ratings one is a big one. That’s 541 recipe cards that do not have a rating. And [00:35:00] that is going to be a very strong signal that we want to improve upon. This is an example where we’d want to go in and generate some ratings for these. You can either go in and specifically ask your clientele to start leaving a rating. Maybe you can work with some friends and have them generate some ratings for you.

Don’t believe any bullshit about, oh, you can never rate your own recipes. Complete and utter garbage. Google doesn’t care how the ratings come about, they care that they’re from real people. That’s what you’d want to do, you want to go in and you want to put ratings [00:35:30] on your recipes as much as you can. A lot of people are really like, “Oh my God, it’s so… I don’t want to rate my own recipe. That’s unlawful.” Not true at all. If you want to rate your own recipe, you can even go down and say, “Test.” Done, making sure that the rating works. If that makes you feel better, great. But I’ve been doing this a long time. I have never ever, period, since the dawn of time, run across a blogger who got a penalty for rating their own recipes. Doesn’t happen. [00:36:00] Your goal, again, is to fill in your recipe information as much as you can, have your friends and family come in and give a rating, but you want to add a rating to as much of your content as you can. You want to fill this up as much as possible, and that’s the fastest way for you to increase bottom-line traffic: making sure that this is as complete as possible.

Andrew Wilder: Casey, can I jump in with just one point of clarification? All this stuff will not improve your rankings. Getting ratings on your posts, Google doesn’t use that as a ranking [00:36:30] factor because they know it’s too easy for you to go in and click. But what it does is help you get the rich snippets, which improves your clickthrough rate when you do come up in search results. It’s an important distinction to understand.

Casey Markee: And just an FYI, there is no such thing as a ranking benefit for structured data, period. Structured data as a whole is not a ranking benefit. It’s just like everything else. We want to fully fill out the information to make our snippet in the search results as attractive [00:37:00] as possible. Now, if you get a spammy structured data penalty and you lose all of your rich snippets, your traffic will drop. But it’s not because the rich snippets were a ranking factor, it’s just because all of a sudden, your results became substantially less attractive than the other results around you that were fully fleshed out with visible rich snippets. And I think that’s what’s confusing to people. Just understand that we want to fill out the structured data as much as possible, but it, by itself, is not a ranking factor. We’re just trying to make [00:37:30] it as attractive as possible to increase our bottom-line CTR.

Ashley: Good clarification, there. Getting really specific, Arsen, can you talk about some of the more popular errors in the coverage report?

Arsen Rabinovich: Right. Super important stuff over there. A lot of times, and again, when those of you have been on calls with me or Casey, we talk about things like redirects and 404s and pages that are low [00:38:00] quality or don’t have enough content. A lot of that stuff can be found in these coverage reports in here. I’ll share my screen so that we can take a look. Andrew doesn’t have many issues in there, which is great, but let’s poke around. Under the coverage report, usually, when you load this, this is the way it’s going to look, it’s going to just give you errors right away. There are zero errors here, but usually errors is where you’ll see the 404s, and I wish Andrew had one in here at [00:38:30] least, so we can-

Andrew Wilder: Sorry, I run a tight ship, man.

Arsen Rabinovich: Right, right, right. But typically, what you can do is you can… And here, I’ll try to pull it up, here. Let’s go to pages with redirects. Once you click into any of those errors, like I just did, Google will give you the URLs and the dates they were last crawled or discovered. You can then click into [00:39:00] it and you can inspect the URL, and it gives you a lot of really, really, really cool information here. This URL is obviously not in Google. And it tells you discovery, how Google discovered this page. Is this page in the sitemap? It’s not. Which page did Google crawl to find this page, the referring page? [00:39:30] Google crawled this page on Andrew’s site, which then linked to the page that we’re looking at. It tells you the canonical for that page. Canonical issues, you hear us talk about this frequently. Here, you have a page that is canonicalized to a different page on the website, and this is why it’s not in the index, because Google is looking at that canonical and saying, “Okay, you’re telling me that the same content that’s on this page is living here. We’re [00:40:00] going to remove this, we’re not going to include this page in the index.”

But a lot of times, and like I said, when you’re on those calls with Casey or myself, we look at Screaming Frog, and we’re like, “Well, you have this page here that generates a 404. You have this page here that creates a 301 redirect.” It’s the same information, but now it’s coming directly from Google. You can see where the page, how Google discovered it, and then why Google is excluding it. Here, let me back out here.

Valid with warning means [00:40:30] that Google is okay with that page. It has a warning on it, but typically, those are fine. Valid. Here, let me get rid of this. When you click on excluded, it gives you all of the reasons why. Alternate page with proper canonical tag, so on Andrew’s site, he has 4,300-something pages which are canonicalized to a different [00:41:00] page on his website. Pages with redirects, crawled, currently not indexed. You definitely want to investigate these. Why is Google crawling but not indexing these pages? And you want to click into this, and you can see again, all of the pages. These are feed pages, and it’s fine for Google not to index the feed pages, not a big deal. Not found (404), excluded, so 404s, and we talk about 404s where you have probably messed up an internal link, which generated a 404. Again, [00:41:30] you can click into it, inspect the URL, and see where Google discovered it, how did Google’s crawl get to this link? From the homepage on Andrew’s site, he’s linking here. He was linking here at one point, on October 28th. And when Google crawled it, this link returned a 404 page not found.

You can go in here and start cleaning these up. Not found for duplicate; submitted URL [00:42:00] not selected as canonical, only one of these. These are a little bit more technical issues, but again, you want to look at the total amount of pages. Anything that’s one page or two pages or anything that’s very nominal, I wouldn’t really worry about it. But if you see larger chunks of pages or inventory moving into pages with redirects, something is obviously wrong, or crawled, currently not indexed is coming up out of nowhere, or excluded by noindex tags. [00:42:30] We see this a lot with migrations that were improperly handled, they forgot to remove the noindex. And then we see thousands of pages just make their way all the way in. And you want to also look at the trend line where you can see what’s happening. But the coverage report is super important, especially for overall site health, from a technical perspective. And this information, again, is directly from Google, so it’s not like you’re doing a crawl with a third-party app like Ahrefs or Screaming Frog or Semrush. This [00:43:00] is directly from Google. A wealth of information, lots of actionable insights here.
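Arsen’s advice — look at the total count per exclusion reason and only worry about the big buckets — can be sketched against an exported coverage report. The rows and the threshold here are made up for illustration:

```python
from collections import Counter

# Hypothetical rows from a coverage report export: (url, status, reason).
rows = [
    ("/feed/", "Excluded", "Crawled - currently not indexed"),
    ("/old-post/", "Excluded", "Not found (404)"),
    ("/tags/beans/", "Excluded", "Alternate page with proper canonical tag"),
    ("/tags/soup/", "Excluded", "Alternate page with proper canonical tag"),
]

def summarize_coverage(rows, threshold=2):
    """Count pages per exclusion reason; flag reasons at or above a size threshold."""
    counts = Counter(reason for _, _, reason in rows)
    # Per the advice above: one or two pages is nominal; bigger buckets need a look.
    flagged = {reason: n for reason, n in counts.items() if n >= threshold}
    return counts, flagged

counts, flagged = summarize_coverage(rows)
```

A single 404 stays below the threshold here, while a growing bucket of any one reason gets flagged — mirroring the “larger chunks of inventory moving” signal Arsen describes.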

Ashley: Thanks for that screen share, Arsen. Andrew, some of the warnings that come through are security-specific warnings and manual actions from Google. Where can you go inside of GSC to see any security or manual action issues and address them?

Andrew Wilder: This should be really quick. On the left side, below the enhancement section, is a section called security and [00:43:30] manual actions. And there’s two there, there’s one for manual actions. And if you click on it, hopefully it’ll just have a green check mark and say “no issues detected.” And same with security issues, it should say “no issues detected.” If it has detected something, it’ll tell you. The manual actions are things where they’ve literally said, “Hey, you’re doing this wrong,” like an actual human being has flagged the site. And it’ll say what the problem is and how to correct it, and you need to correct the problem and then tell Google, and that’s how you clear that. For security issues, if Google has detected malware on your site, if it’s been hacked, [00:44:00] that should show up there. Also, if there’s a security issue, it’ll show that in the search results, or if somebody tries to click through, I think it says, “This site has been possibly infected.” If anything happens with either of these things, you want to jump on it because these are mission critical, so it’s like errors times 10. But most of the time, it’ll just be “no issues detected,” and then you’re good to go.

Ashley: Perfect. And Casey, you touched a little bit on web stories [00:44:30] earlier, but when it comes to web stories in GSC, is there a way to exclude web stories in results aside from individual web story URLs?

Casey Markee: Not that I’m aware of, no. Google groups all the web stories info both under Discover and, again, under regular organic. You have to export out all the data in your Search Console via a CSV or XFL… XFL, it’s football related. Or XLS document, and then filter it out in Excel. There is no clean way to do it. The Discover tab specifically [00:45:00] is disconcerting because we don’t necessarily know… That’s the only way we can track Discover, in the Search Console. It’s all grouped under regular organic channels in Analytics, so it’s a little confusing, I know. But unfortunately, no. It’s possible that there might be some refinements coming in the future, but as of now, there’s really no way to exclude those web story results in Search Console.

Ashley: Okay. Okay, thanks for clarifying. Andrew, talking about sitemaps, how should sitemaps [00:45:30] actually be configured in Google Search Console? This is definitely a really popular question.

Andrew Wilder: If you’re using Yoast SEO, it generates what’s called a sitemap index file. It’ll be sitemap_index.xml. Real simple, Yoast automatically generates it as soon as you install it. Let me share my screen real quick because there’s some confusion on… Oh, I’m just getting three blank boxes from Zoom, thanks. That’s weird. There’s some confusion on what to add in [00:46:00] the sitemap field. And really, the only thing you need to add is the sitemap index.

Casey Markee: Thanks, yeah.

Andrew Wilder: Okay, it’s not letting… Wait, let me see. Can you see my screen right now?

Casey Markee: Yep.

Andrew Wilder: No?

Casey Markee: No, yeah.

Andrew Wilder: Okay, great. Under index sitemaps… Wait, I’ve just got sitemap_index.xml. If you click on that, it’ll actually expand and show you the other sitemaps. Yoast automatically generates this. What we often see is people will manually add all of these in, and that can be problematic because you’re [00:46:30] kind of duplicating effort. And if anything changes on these, you have to go back and update it. This way, it’s automatically updated. In general, this is all you need. The only other sitemap you might want there is if you are doing videos with AdThrive or Mediavine, they can set up a redirect for a URL to a sitemap that they host just for your videos. If you’re using AdThrive, I think it’s adthrive-video-sitemap, and I think Mediavine’s is mv-video-sitemap, I’d have to double check that. But if you’re going to add that, [00:47:00] you do adthrive-video-sitemap and hit submit. And it’s probably going to fail because I don’t have one, but… Oh, there you go. And it says couldn’t fetch.

But if you were working with AdThrive and you had this on your site, AdThrive’s plugin will actually redirect over to their hosted sitemap, but these are the two things you’d have in here. And you can also see in here, it’ll show the discovered URLs. You can see, hey, it’s actually successfully crawled the sitemap, we’ve crawled the post sitemap, I’ve got 691 URLs. If you have more than 1,000 posts, [00:47:30] you’ll see multiple post sitemaps. It breaks them up into 1,000 at a time. And it’s basically like, hey, you’re good to go. It’s nice and simple. But this tells Google, “Hey, here’s all my content.” And that way, Google can use this to crawl your site more effectively.
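A sitemap index just points at child sitemaps, which is why submitting only sitemap_index.xml is enough. Here is a rough sketch of what such a file looks like and how it could be inspected with the standard library; the XML and URLs are placeholders, not a real Yoast export:

```python
import xml.etree.ElementTree as ET

# A made-up minimal example of the sitemap index format.
SITEMAP_INDEX = """<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def child_sitemaps(xml_text):
    """List the child sitemap URLs referenced by a sitemap index."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:sitemap/sm:loc", NS)]

maps = child_sitemaps(SITEMAP_INDEX)
```

When Google fetches the index, it discovers each child sitemap the same way — so adding the children manually in Search Console duplicates effort, as Andrew notes.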

Ashley: And I was muted. Thanks for sharing that, Andrew.

Andrew Wilder: Yep.

Ashley: Arsen, what is the disavow tool and how can it actually be used by bloggers?

Arsen Rabinovich: Don’t use it.

Ashley: Oh.

Arsen Rabinovich: [00:48:00] Period. And we talked about this on our previous webinars a bunch of times, we talked about this on, I think it was “The Luck of the Links” or something like that, the link building webinar. The disavow tool should only be used if you had a manual action from Google where they said, “ [00:48:30] We’ve taken action against your entire website or specific pages on your website because of crappy links.” I didn’t want to use the other word. “That were built. You’re trying to manipulate results, it’s against our terms. You suck.”

This is where you go in and you start doing cleanup. And a part of that cleanup is disavowing those back links, essentially telling Google that, “Hey, I did not build these. I [00:49:00] did not ask for these. Please do not use them as a part of the evaluation for my website.” And you can disavow entire domains, you can also disavow specific pages that are linking to you.

If you don’t know what you’re doing, you will definitely, 100,000 percent cause more harm than good to your website by using the disavow tool. The only other reason you should be using the tool is if you have noticed a huge decline in traffic after [00:49:30] an update and you’ve talked to somebody like me or Casey or somebody else who knows what they’re doing when it comes to backlinks and who has evaluated your footprint and your backlink inventory and said, “Yeah, there’s a lot of crap here that needs to be disavowed.” But that should only happen if you’ve ruled out that nothing else on your website is the cause for the decline in traffic, so you’ve ruled out all technical issues, all [00:50:00] crawlability, accessibility, renderability, and content issues, and all you’re left with is, hey, maybe it’s these external signals.

Google has become really good at filtering out the crap. I haven’t seen a penalty in maybe two, three years now. Nobody has come to me. We used to do a lot of recovery audits, made a lot of money recovering people after they’ve done some crappy link building. I haven’t seen one come through. I [00:50:30] haven’t seen any issues with any of our clients in Search Console. Google’s just really good at it. Google will not penalize you, for lack of a better word. Google just filters those links out so they don’t harm you, but they also don’t help you. Don’t use the disavow tool.
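For context, the disavow file itself is just plain text: one full URL or `domain:` rule per line, with `#` comments. A rough validity check, sketched with made-up domains — and again, only relevant in the rare manual-action case Arsen describes:

```python
def validate_disavow_line(line: str) -> bool:
    """Accept comments, blank lines, domain: rules, and full URLs."""
    line = line.strip()
    if not line or line.startswith("#"):
        return True  # comments and blank lines are allowed
    if line.startswith("domain:"):
        return "." in line[len("domain:"):]  # crude sanity check on the domain
    return line.startswith("http://") or line.startswith("https://")

# A hypothetical disavow file (the domains are invented).
disavow_file = [
    "# Spammy links flagged after a manual action",
    "domain:spammy-links.example",
    "https://bad-directory.example/page.html",
]
ok = all(validate_disavow_line(l) for l in disavow_file)
```

A bare hostname without the `domain:` prefix is not a valid rule, which is one of the easy ways to get the file wrong.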

Ashley: I need that mute button. It’s extra difficult, worse than the disavow tool, huh?

Andrew Wilder: Every time you use the [00:51:00] disavow tool, Arsen kills a kitten.

Arsen Rabinovich: I don’t kill kittens.

Ashley: Andrew, a lot of episodes throughout the whole SEO for Publisher series, we’ve talked about updating content a ton and how to do it, how not to do it. But how can you actually utilize GSC to see if the updates or the changes that you’ve made to content are actually helping the piece?

Andrew Wilder: I haven’t been updating any content on my site, so I might not be the best example, [00:51:30] but basically, to keep track of all this, you want to go back to the performance search results section. And this is the same spot that Arsen was clicking around in earlier. Let’s just, for kicks, we’ll look at the last three months right now, and again, we can use the compare tool to compare it to the previous three months. It gives us a couple of graphs here. I’ll just look at… Oh, it’s not responding. Okay. Too much data. Here, let’s look at average position. [00:52:00] Okay, so the solid line is the last three months, and then the dashed line is the three months before that. You can see the trend line’s pretty much the same here, but if you’re working on a specific page, you can click on pages.

And then let’s look at “My Healthy Options at Panda Express.” And you click on that, and now we’ve filtered this list to just that URL. You can see, so last three months, my average position unfortunately has been 15.3, sadly. [00:52:30] Before that, it was 10.5, so I’ve gone in the wrong direction, here. You might have caught that earlier as well, when I was showing some of the metrics. My site’s trending down because I’m neglecting it, spending all my time on NerdPress. But if you’re working on your content and you’re improving this, obviously, you’d want to watch this and see it going in the opposite direction. Now, keep in mind, average position lower is better, you want to get to position one. I was just off the first page, and now I’m in the middle of the second page. [00:53:00] You can also then look at your total clicks and how much traffic… Clicks, of course, are not just based on your rankings but how many times people are searching for it. And if your rich snippets are fully marked up, so if you’re working on your rich snippets, your rankings may not change, but your clickthrough rate might improve. You can track that here. And actually, it’ll show you your average clickthrough rate.
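Andrew’s point about rich snippets lifting clicks without moving rankings comes down to simple arithmetic: GSC’s CTR is clicks divided by impressions. A toy illustration with invented numbers:

```python
def ctr(clicks, impressions):
    """Clickthrough rate as a percentage, the way GSC reports it."""
    return round(100 * clicks / impressions, 1)

# Hypothetical before/after for the same page at a flat ranking:
# same impressions, but a richer-looking result earns more clicks.
before = ctr(clicks=120, impressions=4000)
after = ctr(clicks=180, impressions=4000)
improved = after > before
```

This is why tracking CTR alongside average position matters — the position line can be flat while the CTR line tells the real story of a markup improvement.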

You can also then, if you want to get granular, you can look at, like, hey, do people from Italy search differently? I’m probably not going to get a lot of traffic from [00:53:30] people looking for Panda Express options in Italy, but you can track that and see. If you’re doing some geo-specific things or travel-related things, that can be helpful. Different devices. See what’s in search appearance, whether we have a good page experience. I’m not doing anything with Web Light results. I haven’t even heard about that. Is that old or new? Web Light results, anyone?

Casey Markee: Old.

Andrew Wilder: [00:54:00] Old. Okay, good. Or if you just are targeting specific queries, you can filter by the search queries rather than by pages. But that gets a little more tricky because you can see how many variations there are. If you’re really doing it by updating a URL, I’d stick to the pages filter.

Ashley: And we only have a couple more questions before we open up for Q&A, and I noticed there’s 14 questions in Q&A so far, so great job everyone. Make sure, now is definitely the time. If you have a question that we haven’t addressed [00:54:30] yet and you want addressed that has to do with GSC, make sure and drop it into the Q&A now. Then that way, the panelists can get to it. Casey, speaking of content, while we’re on the topic, is there a way to use GSC to actually get new content ideas or determine even what kind of content you should create more of, like either long form or more videos or different types of recipes?

Casey Markee: Yeah, absolutely. You could use your search results page in Search [00:55:00] Console to review existing traffic and find both content gaps and new content ideas just by sorting your content. We’ll continue using Andrew’s site. He’s going to get all those free content ideas today. But basically how it works is we would just go to… We’re going to share screen, here. We’re going to go ahead and pop him up again. And this is the default view that you have when you log in. Whoops, actually… Well, let me just go ahead and stop the share real quick, and I’m going to switch [00:55:30] over to his account. Let me get his account up first.

Andrew Wilder: We’re going to fix that in post.

Casey Markee: What’s the name of your site again?

Andrew Wilder: Eating Rules. Everybody go visit

Casey Markee: Eating Rules. There you go. Fantastic.

Andrew Wilder: We’ll make up for my last ranking by getting everybody to click through.

Casey Markee: All right, so sorry about that. We’re going to go ahead and share screen again. This is his site up. Yeah, so we got Eating Rules there. This is the default view as you log in, our [00:56:00] overview. This is the search results screen. We’re going to go ahead and select the average CTR and the average position here, and the default is queries, but we’re going to go to pages. And when we’re looking at pages, the very first page that he has up here on Andrew, he has “Healthy Options at Panda Express.” We’re just going to pop up that page here. We’re going to take a look and see what the queries are on the page, and we’re going to sort those queries by CTR over here. See, look at that. You can see that he’s got great [00:56:30] matches here for everything that has to do with Panda Express, as he should. Now, watch what happens when we invert that, when we reverse the sort. You see all these zeros that he’s getting?

Look at this, he’s getting matches for Panda Garden and PF Chang’s, and there’s Panda Wok, there’s Mandarin. What this tells us is that if Andrew had a page just like this one, just like the one that he has for Panda Express, but he actually made it for PF Chang’s and Panda Garden [00:57:00] and Panda Wok, he’d probably have a chance to accumulate quite a bit more traffic, because that would be a more qualified page for those search results. This is a great way for you to say, “Well, if I know that my ‘Healthy Options at Panda Express’ is doing great, I might go ahead and pop up a ‘Healthy Options at Panda Wok,’ a ‘Healthy Options at Panda King,’ a ‘Healthy Options’ at whatever Google is telling me I’m not meeting the needs of right here.” That’s [00:57:30] a really fantastic and easy way for you to pop up three or four other related pages.

You’d want to do these probably one at a time. I wouldn’t want to pop up three or four pages at once. I’d do one page at a time, see how the page does, and then add more pages as they do better. If there are enough impressions here to have a PF Chang’s one, if there are enough impressions to do Panda Wok or Panda Garden, then that’s what I would start with. And you could go back and literally do this with [00:58:00] any or many other pages as well. We’re going to go back over here to the top, we’re going to go to pages down here. I like how he’s got the homemade, he’s got “How to Cook Dried Beans” here. We could probably go to “How to Cook Dried Beans.” I’m going to go to queries, and I’m going to do the same thing. I’m going to invert the CTR here to zero, and then I’m just going to go down here. I’m like, oh, look at that. He’s getting results for “are kidney beans poisonous?” Maybe [00:58:30] he could go ahead and put up a new result about poisonous beans, Jack and the poisonous beans. Well, we’ll just name him Andrew. And we could see if that’s something that he could possibly rank for.

And that’s what we’re looking at: we’re looking for something with high impressions. Oh, look at that. Maybe Andrew should do a page on how to make a bean bag. Maybe it’s edible, Andrew. Maybe that’s going to make an edible bean bag. We’ll mix the best of both worlds, there. But this is what we’re doing when we’re looking for content gaps. We could also just go ahead and do a [00:59:00] sort, expand this out to 500, and see if there’s anything down here that he is getting clicks on where the CTR could be higher. Maybe he does a completely different page on soaking dried beans, or maybe a new page on the best ways to cook dry beans.

Yeah, pretty interesting. He’s all over the place on this. And you can see that the CTR drops considerably. [00:59:30] How to broil great northern beans, quick soak, but yeah, lots of opportunities there. And that’s how we would do it: we start at the page level, we then sort by keywords, and then we go up and down using the CTR to see if we can find any content gap options or other issues here that we could turn into new content pieces. And that is a very simple way to go about it.
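The workflow Casey demonstrates, sorting a page’s queries by CTR and hunting for high-impression, zero-click queries, can be sketched in a few lines. The rows below are made-up sample data mimicking a GSC “Queries” export for one page, not Andrew’s real numbers:

```python
# Hypothetical rows mimicking a GSC Queries export for one page.
rows = [
    {"query": "healthy options at panda express", "clicks": 120, "impressions": 1500},
    {"query": "pf chang's healthy options",       "clicks": 0,   "impressions": 900},
    {"query": "panda garden menu",                "clicks": 0,   "impressions": 40},
    {"query": "panda wok healthy",                "clicks": 0,   "impressions": 600},
]

def content_gap_candidates(rows, min_impressions=100):
    """Return zero-click queries with enough impressions to justify a new
    page, sorted by impressions so the biggest opportunity comes first."""
    gaps = [r for r in rows
            if r["clicks"] == 0 and r["impressions"] >= min_impressions]
    return sorted(gaps, key=lambda r: r["impressions"], reverse=True)

for r in content_gap_candidates(rows):
    print(r["query"], r["impressions"])
```

The `min_impressions` threshold encodes Casey’s advice: only build a new page when there are enough impressions (here, “panda garden menu” is filtered out as too small an opportunity).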

Arsen Rabinovich: I do want to quickly add to that: before you start digging in and creating content, you always want to perform [01:00:00] that search and see what page one looks like, because for some of these queries, blog content might not be the best type of result. Before you put in the effort, do check. But yeah, this is a great way of finding the opportunities.

Andrew Wilder: It doesn’t mean you should do everything.

Casey Markee: No.

Andrew Wilder: This is an idea generation tool.

Casey Markee: That’s all it is, all it is.

Ashley: Awesome. Well, Andrew, to wrap things up, a question that kind of piggybacks off of that tee shirt idea: why does Google take its sweet, [01:00:30] lovely time indexing content, and sometimes never actually index content at all?

Andrew Wilder: We’ve been getting that question a lot from our clients. It’s only in the last few months that it’s really become a problem. Earlier, you’d publish something new and within an hour it would be in the search results. And something’s changed with Google. We’re not quite sure what yet, but basically, for something to get into the Google search results, it’s a three-step process. First there’s discovery, then there’s crawling, and then there’s indexing. You have to get through all three of those phases, [01:01:00] and Google seems to be discovering URLs really quickly, still, but not necessarily crawling and indexing them. Google knows the URL exists, but it hasn’t sent Googlebot there to really scan the page, or it has scanned the page but hasn’t yet said, “Okay, now I’m going to put you in the index.” We’ve been seeing a lot of warnings coming out for that, or people saying, “Hey, I published this two weeks ago and it’s still not showing up.” It’s been particularly bad for web stories.

Something’s changed, [01:01:30] we think, where Google is being more selective, maybe, with how it’s adding stuff to the index. There are some indications that there may be a quality signal involved here, and Arsen and Casey, you can probably speak better to that. It’s hard to tell why Google does what it does, but you’re not alone if this is happening to you, if it’s taking a long time. And just before we started the call, you guys were talking about a couple strategies to deal with this, so I’m going to let one of you jump in on that. [01:02:00] Either one of you.

Arsen Rabinovich: Yeah. Casey looks like he’s typing something.

Andrew Wilder: Casey’s doing holiday shopping now.

Arsen Rabinovich: Yeah. We noticed, when we look at crawl rates, that sites that are “better quality” do tend to be crawled and indexed much quicker and more frequently than sites that are lower quality. Sites that are bigger and produce content quicker [01:02:30] tend to be indexed quicker. Google is looking at inventory. We’re assuming Google is asking: how much processing time and how much investment am I making into grabbing this content and putting it into my index, and how useful is this content? Sites that are faster and provide a better user experience will be indexed much quicker. Sites that are slower, sites that provide low-quality content (we’re noticing pages that [01:03:00] are very thin, pages that have very few words, pages that don’t do a good job of satisfying the query or the intent, pages that take a long time to load, pages with weird canonical signals) are not going to get indexed as fast as pages where everything looks good and Google wants the page in its index because it’s providing valuable information and a good user experience.

Casey Markee: [01:03:30] We don’t take changing URLs lightly. We don’t recommend it in the vast majority of situations. But we’ve had several colleagues look into this, where they’ve had existing clients, very strong clients, very strong brands, and for some reason Google has not indexed a random URL; it put it into a filter, the “Discovered – currently not indexed” filter specifically. And they found, especially for new content, that they could [01:04:00] get out of that filter by changing the URL and resubmitting it for a crawl. That is not something that you should take lightly. That is not something that you should just do by default. But if it’s a completely new URL (you’re on this call, you submitted a recipe, you know it’s high quality) and it’s been sitting in there for three weeks and hasn’t done anything, it’s very possible that, because it’s in that filter, it has an algorithmic label on it that we can’t see, which may prevent it from getting crawled and indexed at [01:04:30] any conceivable point in the future.

In that case, there’s nothing to be lost by changing the URL, redirecting the old URL to the new one, and resubmitting it through the URL Inspection tool to see if you can get it moving. We have found that to be successful in a limited number of cases. We’re just hoping that it does not become the norm. We’re hoping that these crawling issues take care of themselves over time and that Google gets caught up, instead of just telling us, “Hey, you were fine for three [01:05:00] years, but over the last three months, we’ve just decided that your content’s not up to snuff.” We just don’t believe that that’s happening.

A quality site doesn’t just go from having everything indexed to, all of a sudden, only one out of every 20 URLs getting indexed. There has to be an external pressure causing that change, and we’re still waiting to see what could be doing that. And I think in many cases, it’s just a Google hiccup, and we want to be cognizant of that and try to find ways around it, if possible.

[01:05:30] There’s a question in here about IndexNow: do you think Google will be part of this? Yes, Google will be a part of IndexNow. For those of you who are not familiar with it, IndexNow is a new prioritized indexing protocol that Bing has pushed out and that Google has decided it will adopt as well. That is not necessarily going to cause some kind of paradigm shift for food bloggers, nor is it going to replace your use of sitemaps. Don’t go rushing headlong into thinking that you need to use IndexNow and [01:06:00] that it’s going to solve your indexing issues. It will not. IndexNow is not ready for prime time, and it will be many months before it’s even an option with Google.
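For the curious, the IndexNow protocol itself is simple: you host a key file at your site root and POST a JSON body listing changed URLs to a participating search engine’s endpoint. A minimal sketch per the public protocol; the host, key, and URL below are placeholder values, not real credentials:

```python
import json
import urllib.request

# Public IndexNow submission endpoint (shared by participating engines).
INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls):
    """Build the JSON body IndexNow expects: the site host, the verification
    key (also served as a .txt file at the site root), and the URL list."""
    return {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": list(urls),
    }

def submit(payload):
    """POST the payload to the IndexNow endpoint (live network call)."""
    req = urllib.request.Request(
        INDEXNOW_ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
    )
    return urllib.request.urlopen(req)

payload = build_indexnow_payload(
    "example.com", "abc123", ["https://example.com/new-recipe/"]
)
print(json.dumps(payload, indent=2))
```

As the panelists note, this is not a substitute for sitemaps, and submitting a URL does not guarantee crawling or indexing.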

Arsen Rabinovich: We spend a lot of time on crawlability, accessibility, and renderability in all of our technical audits. It’s the foundational stuff. This is the important stuff. If you’re noticing crawl and index issues, this is [01:06:30] the first place I would start: look at what’s happening in your foundational SEO before moving on to content. Obviously, if your content is thin, or if you wrote the same article in different words, one of those articles might not get indexed. Google might say, “Oh, I already have this.” But the majority of the time, when we solve these issues through our audits, it’s crawlability, renderability, discoverability, indexability. Those are the issues that we’re fixing in order to improve crawl rates.

Pagination is super important for discoverability. If Google’s [01:07:00] not grabbing a bunch of your pages, you might have noindex, nofollow on the pagination of your category archives. You might have weird canonicals, with the pages in the sequence pointing to the first page in that sequence. Look at all of those. If you’re having these issues, there’s definitely a reason why Google is excluding the pages. Once in a while, we see what Casey talked about, where it’s a new post that’s checking all the boxes, but Google’s just not picking it up. Those cases happen, [01:07:30] but not as frequently as pages that get excluded because of technical issues.
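The two checks Arsen calls out, a robots noindex meta tag and a canonical pointing away from the page, can be spot-checked on raw HTML with the standard library. A minimal sketch only; the sample HTML and URLs are made up, and a real audit would also check HTTP headers (X-Robots-Tag) and robots.txt:

```python
from html.parser import HTMLParser

class IndexabilityParser(HTMLParser):
    """Collect the robots meta directive and canonical link from a page."""
    def __init__(self):
        super().__init__()
        self.noindex = False
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.noindex = "noindex" in a.get("content", "").lower()
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

def check_indexability(html, page_url):
    """Return a list of likely reasons Google would exclude this page."""
    p = IndexabilityParser()
    p.feed(html)
    issues = []
    if p.noindex:
        issues.append("robots meta is noindex")
    if p.canonical and p.canonical.rstrip("/") != page_url.rstrip("/"):
        issues.append(f"canonical points elsewhere: {p.canonical}")
    return issues

# A paginated archive page that both noindexes itself and canonicalizes
# to the first page in the sequence, the exact pattern Arsen warns about.
sample = ('<html><head><meta name="robots" content="noindex,nofollow">'
          '<link rel="canonical" href="https://example.com/recipes/">'
          '</head><body></body></html>')
print(check_indexability(sample, "https://example.com/recipes/page/3/"))
```

Either finding on a page you want indexed is a strong candidate explanation for GSC’s “excluded” buckets.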

Ashley: Well, unfortunately, that wraps us up. We’re over time. I told you guys this was going to be a big episode, and it sure was.

Casey Markee: Yeah. Right, it is.

Ashley: We still have a lot of questions in the Q&A, but fear not, everybody. All of your questions will be answered next week. We’re going to be emailing everybody a link to the recap blog post, and that will have the direct [01:08:00] answers provided by the panelists that they’ll do after all of this. Thank you, everyone for joining us today. I hope you guys have a great holiday season, a great rest of your year, which is only a few weeks, which is crazy. I will be sending out a link to everyone as well for January’s episode, where we’re going to be talking about site audits, so we’re going to look at how you did in 2021 and how you can do a whole lot better in 2022. You’re definitely going to want to register for that. But in the meantime, happy holidays, everyone, and [01:08:30] thanks for joining us.

Casey Markee: Happy holidays, everyone. Be safe out there.

Andrew Wilder: Y’all [crosstalk 01:08:34] happy new year.

Arsen Rabinovich: Bye.

About The Panelists

Andrew Wilder

Andrew Wilder is the founder of NerdPress, a digital agency that provides WordPress maintenance and support services for publishers and small businesses, placing an emphasis on site speed, stability, and security. He has been building, fixing, and maintaining websites since 1998, and has spoken on a wide variety of technical topics (in plain English!) at conferences such as WordCamp LAX, the International Association of Culinary Professionals, Food & Wine, Techmunch, BlogHer, BlogHer Food, and Mediavine.

Andrew on Twitter >>

Arsen Rabinovich

Digital Marketer, SEO, International Speaker, 2X Interactive Marketing Award Winner, Search Engine Land Award Winner. Founder @TopHatRank, a Los Angeles based marketing agency that specializes in innovative digital marketing techniques for modern brands of all sizes.

Arsen on Twitter >>

Casey Markee

Speaker, writer, and trainer, Casey Markee has been doing SEO for 20+ years, has conducted more than 1,000 site audits, and has trained SEO teams on five continents through his consultancy Media Wyse. He believes bacon should be its own food group and likes long walks to the kitchen and back while under home quarantine.

Casey on Twitter >>

Back to top