Hello, everyone. I’ll give it just a couple of seconds; people are coming into the room now and we will get started in just about a minute. Perfect, the attendees are rolling in. That’s very exciting. Welcome, everyone. We’re going to get started in just a couple more seconds. All right. Hello everyone and welcome. I’d like to thank you so much for joining us today. We’re going to be sharing findings from a study that we conducted about Google’s algorithm update. It’s an update that happened back in November of last year. And we’re going to be discussing what we’ve seen from the update and what’s happened since. So, to help share the findings, we have Sarah Decker with us, who is the SEO department manager of TopHatRank, and Arsen Rabinovich, the founder of TopHatRank. To begin, let’s get a little bit more background into the study, the original study that we did about Google’s November update. Sarah, can you please share with us why your team decided to conduct this study, how the analysis was conducted, and the reasoning behind the necessity of the study?
Absolutely. Thank you, Ashley. So, in November, we all saw a big update roll out from Google, and there was a lot of chatter and a lot of uncertainty about exactly what this update was affecting. In one of their announcements they said that it was affecting local search, but we saw this affect a whole bunch of different websites that didn’t necessarily only appeal to local. Some people said that this update had to do with E-A-T: expertise, authoritativeness and trustworthiness. Other people said that it had to do with backlinks. So, there was really a lot of uncertainty. And every time we looked into one of these possible reasons, what we were finding didn’t really add up. So, we decided to do this study. We asked people to participate in order to find out if there were any commonalities between all of these websites that were affected by the update, so we could really pinpoint what this update changed, the causes, and the effects on the websites. So, we analyzed a total of 22 sites during the study.
All of these websites were small to medium sized recipe blogs. They had an average of 700 pages per domain; the biggest one had around 2,700 pages. They were all written in English and, besides one UK-based website, all of them were targeting the US market. They were all recipe focused, and there was almost no content on these websites that didn’t contain a recipe for a dish or a drink. And while they were all food blogger specific, some of them did also target more specific niches such as clean and healthy eating, healthy lifestyle or weight loss. All of these sites also saw at least a 30% loss in traffic after that November update. And that number was determined by looking at their Google Analytics data. So, after we had all these 22 websites volunteer to be part of this study, we put their sites through a whole bunch of tools.
So, some of the tools that we used, and I’m not going to go through all of them because there are really a lot of them, but some of them were Ahrefs, Searchmetrics, the mobile-friendly tester, the structured data testing tool, Screaming Frog, the Chrome User Experience Report and of course Google Analytics and Google [inaudible 00:03:49]. So, once we got all the data from all of these tools for all of the websites, then we started comparing them to see what was going on consistently across each website, to identify common issues that were potentially causing the drop in traffic that they were seeing as a result of the algorithm update. And so that is what we are going to be presenting to you today: what we found from this study.
Okay, thank you for that background, Sarah. So, here is how the webinar is going to work today: Arsen’s going to share the findings in a presentation. If you have any questions as we’re going through this presentation and discussing the findings, please feel free to drop them into the chat box or the Q&A. We will have an open Q&A at the end, and a lot of you, as you registered, put your questions in. So, we’re going to make sure to address as many questions as we can. But please feel free to ask them in either the chat box or the Q&A throughout the presentation. And without further ado, Arsen, if you’d like to take it away and share the study with us.
Hey everybody. Okay, let me try to get this going. I always have technical difficulties with Zoom. All right, I’m going to try presenting now. And let’s see, I am going to present.
I’m going to present the screen. And hopefully it’s what I want to share and not anything else on my computer. Sarah’s given me a thumbs up. All right, perfect. Here we go. All right. So, Sarah did the intro already, so we are going to go through this. Just like Sarah said, there was an update in November. The only thing that Google told us was that this update was local in nature, and we really didn’t know anything else. All we knew was that a bunch of food blogger sites were affected. So, we put together the study. We asked most of you on the Food Bloggers Central group to see who was affected by this, and a lot of people submitted. So, we had 22 websites, just like Sarah said. 22 websites that are small to medium sized, all targeting the US market, all of which experienced at least 30% traffic loss. And this is what happened to organic visibility. This is a screenshot from Searchmetrics, and among the sites that we analyzed that experienced a traffic loss after the November 2019 update, 19 blogs out of 22 had a downward visibility trend as well.
Most of the sites suffered around a 40% loss in just one week as the update rolled out. So, that was pretty significant; everybody felt that. Websites noticed losses in both mobile and desktop visibility, and even though the sensors detected more ranking fluctuations on the desktop side, the visibility loss on mobile devices was slightly higher for more than a third of the blogs participating in the study. Both of the domains with the highest SEO visibility in our data set [inaudible 00:07:06] lost over 47% of their visibility index in just one week. The sites experienced visible losses on both mobile and desktop devices. For one of them, the visibility loss was over 61%. And you guys can see this. For those of you who have downloaded the study, by the way, I’m going to name the pages… I’m adjusting my frames here. I’m going to tell you which pages we’re pulling this information from. For those of you that haven’t downloaded the study, we will provide a link.
I know, Ashley, I was supposed to give you all these links but I forgot. We’ll put it in the email or the roundup we do afterwards. So, those of you that don’t have the study, you’ll have a link to it; it’s also on our website. Thank you, somebody just dropped it in there. Okay. So, organic traffic losses. More than half of the food blogs had over 30% organic traffic loss between November 8th and December 8th. Here are a few examples of that. We’re going to take a look at winners and losers right now. So, from the losers: 95% of analyzed domains noted visibility fluctuations during algorithm updates that happened before the November 2019 update. So, 95% of the domains in our study were not doing too well before the November update; there were fluctuations before that. Over half of the participating domains lost visibility after both the March 2019 update, the update that happened earlier in the year, and the November 2019 update.
An interesting observation that we did make during this is that blogs that gained visibility after the March 2018 update were the same blogs that did not experience declines back in August and September of 2018. If you remember, that update had to do with relevancy and intent, something that we talk about frequently. So, those blogs also dropped in the November 2019 update. All blogs that had noticed a downward visibility trend after the March 2019 core algorithm update either remained unaffected or dropped in the mid-December 2019 update. And almost all of them dropped again in January 2020. Almost all blogs were negatively affected by at least one of the updates that had happened before November 2019, and by at least one update after November 2019. So, we can safely assume that, again, all participating blogs had some pre-existing conditions.
Now, the loss of organic keywords does not always impact organic traffic, and those of you who have been on calls with me know I talk about this a lot: you can have one keyword that sends tens of thousands of referrals exit out of your inventory while 10 keywords enter that only send 20 referrals each, and you’re still going to be at a net negative. So, keyword totals do not correlate to traffic. They do sometimes, but it’s not something that we rely on. And when we look at your keyword inventory and how it expands and contracts through updates, it lets us know how well your website behaves, how resilient your website is to these updates. So, in November over 68% of analyzed domains lost more than 40% of their organic traffic, as we can see here. For 68% of domains, the number of organic keywords ranking in positions one through three was more than 50% lower in December than before the November update. In December, all participating sites ranked for nearly 77% fewer organic keywords in positions one through three. Look at that slide. Winners.
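Arsen’s keyword-inventory arithmetic can be sketched in a few lines. The numbers below are illustrative stand-ins based on his example (10,000 for "tens of thousands"), not figures from the study:

```python
# Hypothetical inventory change across an update, per Arsen's example:
# one high-volume keyword exits while ten low-volume keywords enter.
referrals_lost = 10_000      # the single keyword that dropped out
referrals_gained = 10 * 20   # ten new keywords, ~20 referrals each

keyword_count_change = 10 - 1  # the inventory actually grew by 9 keywords
net_traffic_change = referrals_gained - referrals_lost

print(keyword_count_change)  # 9
print(net_traffic_change)    # -9800
```

More keywords, yet a large net loss of referrals, which is exactly why a raw keyword total is not a reliable traffic signal on its own.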
Again, sorry, we had some technical difficulties and had to use a different presentation tool. So, we didn’t notice any clear winners, specifically among similar sized websites that get the same amount of traffic as the ones participating in the study. The domains that weren’t impacted were the larger aggregators like Allrecipes and Food Network. You can see this data on page 91 of the study. So, as of November 17th, which was when the update actually hit, we did observe a slightly higher visibility trend for them, but keep in mind that slight increases for these giants are actually fairly large lifts compared to smaller websites. A small percentage increase for a site that’s doing millions of traffic and has millions of keywords is a much bigger gain than for smaller websites. As of November 17th, again, there’s an increase in visibility and an increase in growth: 1.7% for Allrecipes, 2.8% for Food Network. The clear winners had millions of organic keywords. We didn’t observe any strong fluctuations, and we did see a slight increase in their visibility. So, causes of the drops for the participating websites.
We tried to create a severity scale for you guys, and we went back and forth on how we were going to present this. So, hopefully this makes sense. If you see four bars, that means this is definitely something that’s super important; three bars is pretty important; two is important, but you have no control over it, like Google changing the layout of the result page; and one bar is just something that’s worth improving. So, let’s talk about SERP layout. Video carousels were moved higher on the page. This is something you have no control over. So, back on November 10th, before the update, the video carousel was ranking in the fifth position. I think this is Searchmetrics that we pulled this from. So, this shows you positions one, two, three, and then you can see under position four is where the video carousel was. And then if we fast forward to November 17th, when the update did happen, the video carousel is now hovering under position one. So, it’s actually taking positions two, three and four.
So, if you were ranking anywhere on page one, you were pushed down just by Google moving these videos up. Link profile. Another thing that we looked at is your backlinks. Little of the research regarding the November update points to unnatural links being at the root of it, but we still wanted to take a quick look; we didn’t see too much chatter around it. So, we dug in. Some of the food blogs embed lists of external links to other similar domains below the recipe instructions, as we can see here. Oh no, the GIFs are not working in Google. Oh, boo. Here you would have been able to see what we’re talking about as this moved up and down: you have these chunks of links that point out to similar recipes on other people’s blogs under your actual recipe card. And those are typically pointing to the same sites over and over, and then those sites are also pointing back to you. That’s something that can potentially be seen as reciprocal linking. Being mentioned in roundups, I know, is super popular for food bloggers.
But a lot of these roundups have really thin content. And these are the roundups like “X pancake recipes you should try today.” The majority of these are essentially just lists of images and links with very little value; there’s not much content. So, this is what it looked like when we were actually pulling this through from Ahrefs. You have these 85 easy low carb recipes, that’s 85 links on one page. You’re not getting any value out of these; these are not anything that’s going to help you grow. If you do get visibility from them, I recommend nofollow links. But as you can see, this is just an overwhelming amount of links coming from one specific domain. Reciprocal linking. I’ve talked about this a lot in our previous webinars and at some of the conferences. Reciprocal links are based on an agreement between two parties: basically, I’ll link to you, you’ll link to me. In one particular instance, we saw a blog link out to her husband’s website 16 times while her husband’s website linked back to her 600 times.
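A rough way to surface that kind of imbalance yourself, assuming you have exported per-domain outbound and inbound link counts from a backlink tool such as Ahrefs, might look like the sketch below. The domain names and the threshold are made up for illustration:

```python
def flag_reciprocal(outbound: dict, inbound: dict, min_links: int = 10) -> list:
    """Return (domain, links_out, links_in) for domains that both receive
    links from us and link back, once either direction passes a threshold."""
    flagged = []
    for domain, out_count in outbound.items():
        in_count = inbound.get(domain, 0)
        if in_count > 0 and max(out_count, in_count) >= min_links:
            flagged.append((domain, out_count, in_count))
    return flagged

# The imbalance from the study's example: 16 links out, 600 links back.
outbound = {"husbands-site.example": 16, "unrelated.example": 2}
inbound = {"husbands-site.example": 600}
print(flag_reciprocal(outbound, inbound))
# [('husbands-site.example', 16, 600)]
```

A quarterly pass of something like this over your backlink export is one way to put Arsen’s "review your profile once a quarter" advice into practice.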
So, this is definitely something that’s really high severity, and we recommend that you guys start looking into it. We recommend reviewing your backlink profile at least once a quarter if everything is normal, just to take a look and see what’s happening there. If you were affected by an update, we recommend that you look at your backlink profile much more frequently. Affiliate links. The severity here is fairly low; we didn’t notice a clear correlation. And again, just to take a step to the side here: this is a correlative study, and correlation does not equal causation. We’re looking at patterns, we’re looking at commonalities. This does not mean that we know specifically that these are the issues that were a part of the update, and that’s why we did the study. So, we did not see a huge correlation with this, but we did notice that the blogs that lost the most were blogs that were not using nofollow or rel sponsored, or were not disclosing affiliate links, like you see here in the recipe card: vanilla extract, liquid stevia.
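As a sketch of how you might audit this yourself, the snippet below uses only Python’s standard library to flag anchors that point at affiliate domains but carry neither rel="sponsored" nor rel="nofollow". The affiliate domain list is an assumption you would replace with the networks you actually use:

```python
from html.parser import HTMLParser

AFFILIATE_DOMAINS = ("amzn.to", "amazon.com")  # assumption: your own list

class AffiliateLinkChecker(HTMLParser):
    """Collect affiliate links that are missing rel="sponsored"/"nofollow"."""
    def __init__(self):
        super().__init__()
        self.unannotated = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel = (attrs.get("rel") or "").lower().split()
        if any(d in href for d in AFFILIATE_DOMAINS):
            if "sponsored" not in rel and "nofollow" not in rel:
                self.unannotated.append(href)

page = '''
<a href="https://amzn.to/abc" rel="sponsored">vanilla extract</a>
<a href="https://amzn.to/xyz">liquid stevia</a>
'''
checker = AffiliateLinkChecker()
checker.feed(page)
print(checker.unannotated)  # ['https://amzn.to/xyz']
```

The second anchor is the kind of link the study flagged: an affiliate destination with no rel annotation for Google or the reader.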
Having just a disclosure at the top or the bottom, we feel, is not a strong enough signal for Google. We want to make sure that you annotate the links properly, so Google understands that these are affiliate links and your users also understand that these are affiliate links. More info on this on page 220 of our study. And here we have one of the winners… actually not a winner, somebody who didn’t take such a big hit. They’re using this widget. I’m not super familiar with it, but this widget uses all the appropriate signaling to tell the user and Google that these are affiliate links. Content issues. This is super important, and I’m going to take a drink of water for this one. Oh, the GIFs are not going to work on this either. Okay, bummer. Typically, the order of content for a typical recipe page is personal and descriptive info, a list of ingredients, sometimes with imprecise quantities, and then the how-to section. This GIF would have shown you how much of a scroll somebody has to do to get to the actual recipe.
And Casey and Andrew and I have been talking about this: content length is not a ranking factor. It’s about making sure that you optimize for the user intent. The user is coming in to get a recipe; you want to give them that information at the top. And that’s what these winners are doing, these larger websites that we look at. They’re structuring their content a little bit differently: a short paragraph about the dish, a list of ingredients with precise quantities, and then instructions for the dish after that. And this GIF would have shown you how that’s structured. We’ll make the presentation available for download as well. I think for everyone, I’ll upload the PowerPoint, so you guys can really understand what all this is. And since this is all being recorded, you can follow along later. Over-optimization. Another very severe issue that we’re noticing over and over and over again. During the audits that Sarah and the team do, it’s a constant issue. Overloading your posts with keywords might be perceived as keyword stuffing by Google.
So, we want to make sure that we avoid that. And it’s also part of the Panda update to the algorithm. While it can increase your rankings in the short term, since you’re just showing something new and shiny to Google, you’re saying, “Oh, look at this new content,” and it’s filled with these words repeated over and over, so Google will right away think, “Oh, this piece of content is important for this particular query.” But that doesn’t stick around; it has a very short time decay. In one of the instances, the phrase pumpkin pie was used 22 times on the pumpkin pie crumble bar page. And sweet potato occurred over 23 times on the other recipe that we looked at. This high of a keyword density typically suggests that the author is optimizing content for search engines rather than for readers. For human readers, these sentences, when they’re over-optimized and stuffed with keywords, typically don’t sound natural.
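If you want a quick sanity check on your own posts, a tiny script can count phrase occurrences and compute a rough density. This is an illustrative heuristic only, not a threshold from Google or from the study:

```python
import re

def phrase_density(text: str, phrase: str) -> tuple:
    """Count case-insensitive occurrences of a phrase and the share of
    the page's words that the phrase accounts for."""
    lowered = text.lower()
    words = re.findall(r"[a-z']+", lowered)
    count = len(re.findall(re.escape(phrase.lower()), lowered))
    phrase_words = count * len(phrase.split())
    density = phrase_words / max(len(words), 1)
    return count, density

text = "Pumpkin pie bars. These pumpkin pie crumble bars taste like pumpkin pie."
count, density = phrase_density(text, "pumpkin pie")
print(count)              # 3
print(round(density, 2))  # 0.5, i.e. half the words on this toy snippet
```

On a real post you would run this across the full body text; a phrase repeated 22 times in a short recipe intro is the kind of figure that should make you re-read the copy aloud.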
Also, back in 2012, Google released a statement regarding all of this, and I’ll read it to you. This is directly from Google: “We are trying to make Googlebot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links, or go well beyond what you normally would expect.” So, Google is actively policing this. Let’s take a look at Allrecipes. Sweet potato is mentioned 11 times, and only four times in the actual recipe. Ads. This is another thing that’s very severe, and we totally get that you need to monetize and you need to make money. But these ads are somewhat counterproductive to that, especially when you have too many of them. The majority of the blogs have over seven advertisement boxes on one recipe page, typically inserted every two to three paragraphs, including videos that auto start. You can imagine that if you were to go to somebody else’s website and were exposed to this many flashing images, it would be a little bit overwhelming for you.
And it’s harder for people to focus on the content, and people tend to gravitate to ad blockers, and that way you’re just not making money. When we actually look into it and take a look at what’s loading, we see that, in this particular example, we spent more than three minutes on the website and the page still did not finish loading. Some of these GIF ads are constantly being requested and are constantly loading. When we look at the winners during this update, somebody like Delish only includes ads on their page where they’re not coming in contact with the main focus of the page, and there are not that many of them either. So, we definitely understand where you guys are as publishers and your need to monetize, but keep in mind that it’s counterproductive, because too many ads slow down your site, you rank less, you get less traffic. And I know it’s also really difficult to do anything about it, because you don’t have control over those ads.
Performance. This also touches on the ads. The severity here is four. We are big on page speed, but we don’t look at it as a north star for us. What we’re really looking at is FCP, first contentful paint. That’s the time when the first bit of content on your page is actually loaded for the user, for the reader. Typically, for us, fast means under one second. None of the examined blogs had a fast FCP on both mobile and desktop devices; all went above one second. For the majority, that’s a moderate speed, and only one blog surpassed the 2.5-second mark on this. Only six out of the 22 homepages that we tested had painted first bits of content under two seconds, essentially had an FCP time under two seconds. To really get an understanding of the performance of the article pages, we picked one cake recipe per domain and compared their performance. Only four URLs had their first bits of content painted under two seconds.
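As a small sketch, here is how the FCP buckets Arsen describes (under one second as fast, the 2.5-second mark as the slow end) could be encoded. Treat the exact cut-offs as this talk’s convention rather than an official standard:

```python
def classify_fcp(seconds: float) -> str:
    """Bucket a first-contentful-paint time using the thresholds from the talk."""
    if seconds < 1.0:
        return "fast"
    if seconds <= 2.5:
        return "moderate"
    return "slow"

# Hypothetical FCP times (seconds) for a handful of recipe pages.
for fcp in (0.8, 1.9, 3.2):
    print(fcp, classify_fcp(fcp))
# 0.8 fast
# 1.9 moderate
# 3.2 slow
```

You can pull real FCP values for your own pages from lab tools or field data and run them through a bucketing step like this to see where each template sits.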
So definitely, work on improving speed. We did publish a bunch of articles on our website that expand on this study and our findings. We’ll also link out to them in the email and in the resources, and if you go to our website, you’ll also see them in our blog. There’s a whole chunk; I think there are 10 articles that we published yesterday. So, they’re already there. A quick recap of the speed issues. Overall, pages performed worse on mobile devices. The majority of the domains we analyzed had issues with serving properly sized, optimized images. They also fetched unused CSS style sheets. And we talked about this in the articles, where we also talked about doing a plugin audit, where you clean up the plugins that you’re not using that are injecting code into your pages even though you’re not using those plugins. While on only 10 out of 22 sites the loading thread was affected by third-party scripts, almost all examined blogs should work on minimizing their main-thread work to improve performance.
So if possible, if you’re technically savvy enough, load some of the scripts after the page loads. And then time to first byte, which we just talked about, requires improvement for most of the tested pages. So, the key action points for us here, and again, these are all covered everywhere and we outlined them on our website: make sure that the recipe instructions are the main focus of your content; always think about the user and the intent, like, why are they coming to your page; limit the number of advertisements on the page; don’t over-optimize your posts with keywords; links you earn should not be spammy; find potential issues in your backlink profile if you were affected by an update; and improve your pages’ performance by optimizing images and plugins. So, we took this information and we started applying it to our clients who came to us, who were affected by this update, and we are seeing recovery as we de-optimize their pages, improve structure, organization and internal linking, and just create a better user experience.
So, we went back a few months later… Slide. We went back and we wanted to actually see what happened since then, and especially what happened with the May 4th update that just happened. So, Google again announced that there was an update, and our rank trackers picked up fluctuations May 4th and 6th for both mobile and desktop indexes. According to SEMrush data, websites from the food and drink niche rose an average of five positions in search results from this Google update. So, we wanted to take a look. And this is where the graph shows the improvements on May 6th, a nice spike. So, after the May core update, 50% of the domains in our study had increases in visibility trends. 20 blogs out of 22 were included in this secondary study; we lost access to two of the 22 that we initially looked at. Some of the landing pages with the highest impressions that dropped in November of 2019 were the biggest winners after the core update in May. For many sites, the update in May was a real recovery, and some of them managed to double their numbers of organic sessions.
So, what happened? Clearly structured content helped out; this is what we observed. Improvements in performance, especially those related to visual metrics such as first contentful paint. Again, first contentful paint measures how long it takes the browser to render the first piece of content on your site. And then changes in link profiles: removing sitewide links from external domains and getting mentions on authoritative websites definitely helped with recovery. Then there are the factors that are outside of your control. The SERP layout, especially for related questions and video carousels, changed again. The precision of how Google matches user intent to the query, user intent and query syntax, also got much better. And that’s what helped a huge chunk of the queries that you didn’t have control over, that were lost back in November. So, that’s what we saw in the recovery. If you guys have specific questions, shoot us an email at publishers@tophatrank. And I think, Ashley, do you want to just jump into discussion, or do you want me to go back to those points as we go through them, or just stop sharing?
Yeah, we can just go straight into discussion.
Perfect. So, thank you for sharing all of that in detail, Arsen. At this point, I’d like to take the main findings that you went over, break them down a little bit more, and ask you and Sarah specific questions about them. We are having some questions come in through the Q&A. This is definitely a great time to continue adding them inside the Q&A or inside the chat box. Okay, so let’s get started. Sarah, the first question I have is for you. In the study, one of the things that Arsen went over was that publishers need to be prepared, and that was definitely repeated over and over. So, what does that really mean? Can you break down what a site can do to really be prepared for these kinds of fluctuations and algorithm updates?
Well, I think you really need to understand, first of all, that updates are going to continue to happen. And like Arsen said, there are some things that are within your control and some things that are out of your control. To minimize the impact and the risk that an update is going to negatively affect your site, the best way is to diversify your content and diversify your keywords. Don’t put all of your eggs in one basket. And that’s something that we see a lot of the time too, like how Arsen mentioned with your keyword inventory: you might have one keyword that’s bringing in the majority of the traffic for your site, and then if you lose that one keyword, it really hurts your overall traffic. So, rather than only focusing on the really big keywords that are more or less reserved for these giant sites that are always going to rank well for them, you should do something else.
You should go after long tail keywords, you should try to answer specific questions; do anything that brings more valuable keywords into your inventory, keywords that you will be able to rank better for and whose rankings will also be sustained for a longer period of time regardless of what changes within the algorithm.
Okay. That makes sense. So, it’s basically changing the focus and not trying to compete with the big guys so much. Arsen, one of the questions that came through during the registration process, and this came through back to back, was: why do these fluctuations… there we go, big words… and algorithm updates happen?
Well, yeah. Google wants to do a good job of presenting the best result for the person that’s performing the search, right? And Google has to gauge that somehow. One of the ways that Google does this is by constantly updating how they evaluate websites, and that’s where you see these algorithm updates happen. And the smarter Google becomes, the more artificial intelligence and machine learning Google uses, the better Google will become at understanding content. Back in the day, we were able to just drop a bunch of words on a page and rank it. And a big part of the reason why we have these updates is because of SEOs like us, who’ve been doing this for a very long time, who’ve been abusing the system and figuring out these little gaps in the algorithm and gaming it. So, Google is taking that into consideration also. When it comes to backlinks, Panda, Penguin, all of those were a direct response to what we were doing to the web. Panda came out and they basically said, “You can’t over-optimize anymore, because we understand that now and we’ll either ignore it or we might penalize.”
Same thing with Penguin. Penguin came out and everybody got… a lot of people got penalized for link building; all kinds of link building practices just had to disappear. So, as Google is doing these updates, they’re not doing it to mess with you guys, they’re doing it to show the user the best result. Google doesn’t do an update and crawl through your website and say, “Your website sucks, so we’re going to move you down.” They’re just saying, “Somebody else is doing a better job at answering this query and we’re going to move them up,” because they feel that based on all of these factors, all of these signals, all of the long-click model, where Google measures the pogo-sticking effect when somebody clicks through to your site and back, and time on site based on the query. And they say this a lot. If somebody’s searching to get the temperature to cook their tomahawk steak and they land on your page, they’re just looking for that one specific little thing. They’re going to come in, get the temperature and leave.
They’re not going to read your blog, they’re not going to subscribe. They need this right now. Other queries have investigatory intent, where people need to learn something in depth, and Google understands the time on site is going to be a little bit longer and pogo-sticking should be less of a factor. So as these updates happen, we have to look at them from the standpoint that Google is trying to improve results for the users. They don’t always get it right. We see this happen all the time. I think somebody was just recently showing, after the May update, a full first page of nothing but YouTube videos. So, they come back and they tweak it back. But that’s why they do it. They’re not doing it to mess with you guys. Most of the time, they’re doing it to improve the results.
Okay, that makes sense. Sarah, there’s a lot of bloggers that are just starting out or joining us right now, what are some of the easy wins that publishers can focus on regardless if they have a really large site and they’re well established or if they’re just starting out and have a smaller site?
I’d say really put a lot of time and effort into focusing on what you’re good at. Don’t try to be the best at everything, because then you’re going to end up being the best at nothing. Unless you’re one of these giant, giant sites, you’re not going to be able to be the most relevant result for every single type of food search out there. So, find your niche. What are you better at than everyone else? Is that Russian food? Is it the keto diet? Is it diabetes food? Something that’s very specific to you that you can dominate more so than anyone else. And that really dovetails into what we often talk about, which is relevance. Why are you more relevant? Why is your content better and more relevant than everyone else’s that’s out there, to show number one, number two, number three for the things that people are searching for? What makes you qualified to be the best? You have to own that piece of content.
If you really focus on that one specific niche and dominate it, you’re going to get a whole lot more traffic from that niche than from trying to appeal to everything, where you’re just whitewashing yourself and it’s all very blended and even. And within that, as far as specific things that you could do within that niche just to amp up your blog a little bit, you could create secondary pieces of content. You can take the content that you already have, that’s already doing well, that users have expressed an interest in or that’s getting a lot of traffic for you, and you can combine it all together into an online cookbook or an ebook that someone could download.
So, if you’re really good at desserts, maybe that’s your thing, you’re fantastic at desserts and you have a bunch of different brownie recipes that are all doing really well, combine those all together into one brownie recipe cookbook that people can download and use. And that’s a way to really focus more specifically on your niche and also give something back to your users, which will help instill more loyalty and also present yourself as an authority on that topic.
Sorry, I’m going to cut in real fast, Ashley. There’s a question in the Q&A that we should probably save and answer: for the sites that saw drops, were they due to individual high performing pages dropping in rankings, or was it drops across all posts? Once we’re done with this part, let’s make sure that we answer this. It’s actually a very interesting question.
Of course, if anyone has additional questions, definitely put them in the Q&A and we’ll have a Q&A portion at the end of the panel. Arsen, when it comes to optimizing posts, a lot of publishers rely on Yoast. And in the study you used the word de-optimizing quite a lot. What do you mean by that? And what are some of the best ways to actually optimize a post and avoid over-optimizing?
So again, for a lot of us this all comes from muscle memory. This is the way we used to write blog posts, and we were trained to include a keyword in our title, in our H1, make sure that it’s included in the first… Sarah, if you remember this, 160 characters including spaces, the first sentence of the first paragraph, right? This was all done to basically hammer the point that this post is about this particular keyword. And that worked back when Google wasn’t as smart at figuring this out. Then SEOs started abusing it, and we were stuffing title tags and alt tags. So, Google now looks at things a little bit differently. Over-optimization is a serious thing, we see it a lot, and we clean it up by de-optimizing. Yoast is a really awesome plugin, I love what they do, but you have to take that green light, that green marker… I think now it’s a little circle that they do instead of a check mark. That’s a tool that’s measuring how frequently you repeat this word. You should not be chasing that.
I think one of the main things that we see during audits, especially with over-optimization, is people just trying to change that light to green in Yoast. They keep adding keywords, thus over-optimizing, because they want to get the green light. I always recommend: read it out loud, get on the phone with somebody, read your post to them. If it sounds weird to the person who’s listening, it’s going to be weird to Google. Google’s getting much better at understanding queries, Google’s getting much better at natural language processing, so stop over-optimizing. Write it naturally. Your headings should not repeat the keyword every single time. If you’re properly nesting your H1s, H2s, H3s, H4s, Google understands that. If you include your keyword in the H1, for this post your H1 would be something like, Arsen’s awesome keto pancake recipe.
I don’t have to repeat keto pancake recipe or keto pancakes in every single heading that’s nested under that H1. The first H2 can be ingredients, the second H2 can be preparation. Google understands that this is a subtopic of the main topic of that H1. So, think of it this way: read the content, or get a screen reader to read it to you. If it sounds weird, it’s weird.
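To make that heading-nesting idea concrete, here’s a minimal sketch, not anything Google actually runs, just an illustration using Python’s built-in html.parser, that pulls the heading outline out of a post. Notice the H2s don’t repeat the H1’s keyword, yet the outline still reads clearly as subtopics of the recipe:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect (level, text) pairs for h1-h6 tags, in document order."""
    def __init__(self):
        super().__init__()
        self.headings = []   # e.g. [(1, "Arsen's awesome keto pancake recipe"), ...]
        self._level = None   # heading level currently open, or None
        self._buf = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._level, self._buf = int(tag[1]), []

    def handle_data(self, data):
        if self._level is not None:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.headings.append((self._level, "".join(self._buf).strip()))
            self._level = None

post = """
<h1>Arsen's awesome keto pancake recipe</h1>
<h2>Ingredients</h2>
<h2>Preparation</h2>
"""
parser = HeadingOutline()
parser.feed(post)
print(parser.headings)
```

Running it on the example prints the outline in order: the H1 carries the keyword once, and the plain Ingredients and Preparation H2s sit under it as subtopics.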
And I’m going to take the question from Amy, because it’s about Yoast. And while we’re on this subject, Amy’s asking, “Is it important to link to an outside website in every single post as Yoast suggests?”
Nope, nope. Nope. There are plenty of examples on the web of posts that do fine without them. You should link out if you are citing something, if it contributes to what you’re writing about. If you’re linking out just to get that check mark, you’re again doing it for the wrong reason. So, if you’re writing about keto breakfast pancakes and you want to account for the user who’s just getting started with keto, you might want to link out to WebMD or some authoritative place that talks about keto, what it is and what it isn’t. Because it enhances the experience of the user who might be very top of the funnel and doesn’t really understand what they’re looking for, or whether keto is even right for them. But if you’re just doing that because Yoast told you to, don’t.
Okay. Okay, that makes sense. Thank you for answering that. The next question. So, I do a lot of content audits specifically for bloggers. And I constantly see that meta descriptions aren’t filled out. And that’s always one of my biggest red flags. In the study, you guys also talk about meta descriptions, why are they so important?
So, meta descriptions are no longer a ranking factor, and haven’t been for a little while, but they are still a quality signal. So it is something that you still want to have on every single post, because at the very base level, one of the things Google does when evaluating your site, just on a check-mark level, is ask: do you have a meta description? And if you don’t, it could perceive your site as lower quality because it’s not as well filled out. So just have a meta description on every single page, every single post, every single category page at minimum. At minimum, just pull an excerpt if you don’t feel like writing one for every single post. But if you do have the time and you want to go in and write a meta description for each post, which is even better than pulling one out of an excerpt, then you really want it to essentially be ad copy. The meta description is what shows in the SERPs when your site appears in search results.
It’s what’s going to be right below your title, and it can hopefully entice people to click on the link to your website. So, don’t think about trying to stuff in keywords here, because keywords don’t matter in your meta description. You want it to be something that’s useful to your users in order to get them to click through to your website.
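As a rough illustration of that check-mark-level evaluation, a small audit script could flag posts whose meta description is missing or an odd length. This is a hypothetical sketch using Python’s standard html.parser, and the 50-160 character range is a common industry guideline, not a Google rule:

```python
from html.parser import HTMLParser

class MetaDescriptionCheck(HTMLParser):
    """Grab the content of <meta name="description"> if the page has one."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content") or ""

def audit_meta_description(html):
    """Return "missing", "poor length", or "ok" for one page's HTML."""
    checker = MetaDescriptionCheck()
    checker.feed(html)
    if checker.description is None:
        return "missing"
    if not 50 <= len(checker.description) <= 160:  # common guideline, not a hard rule
        return "poor length"
    return "ok"

print(audit_meta_description("<html><head><title>Brownies</title></head></html>"))  # missing
```

Run over an export of your pages, a check like this surfaces the posts that would fall back to whatever excerpt WordPress pulls.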
So, another point of contact. It needs to be useful to the users. That keeps coming up. Got it. Arsen, you’d mentioned ads, which are the number one moneymaker for publishers, but you’re suggesting that more ads don’t equal more money. Don’t most publisher sites rely on advertising for revenue? How can they find a balance, and what does that balance look like?
There’s no standard for this. It depends. I’m going to give you the typical SEO answer: it depends. It depends on the length of your post, it depends on what you’re covering, and it depends on how your website is structured and designed. It depends on a lot of things. I used to do a lot of affiliate marketing before SEO, and we played around with ad positioning a lot to get better clicks and to increase time on site and all of that. I agree with our findings 100%: ads should not be placed where attention can be broken or taken away from the main focus of the content. Ads should not be intrusive or disruptive. They should be somewhere on the side, where if a user is interested and the ad is relevant enough, they will click on it. But you should not take away from the experience of the user or the reader with an ad. And you’re going to see this more and more, especially with what Google is doing with Core Web Vitals and everything else.
You’re going to see more and more of this. The way ads are being handled right now is going to have to start changing. Schuyler, Allrecipes puts a recipe card at the beginning, closer to the top, and they also do a really good job with ad positioning. So, if anybody wants an example, it’s a good one to look at. And Allrecipes didn’t get affected by these updates. They actually grew.
Sarah, when it comes to plugins, how many plugins is too many? Because they can be really helpful, but they can also totally damage the site.
Yes, they absolutely can. Well, it really depends of course. [inaudible 00:47:19] yes.
I know, every time we say it depends. So, when it comes to plugins, that is one of the great things about WordPress: you can do just about anything with plugins without really needing to know a lot of HTML or code. Just put in a plugin, follow the instructions, and create some awesome stuff on your website. But the problem is that a lot of times you might download a plugin, put it in, and it doesn’t do exactly what you want it to do, so then you download a different plugin and try that one out, it’s not really what you’re looking for, so you try a different one and download that. What happens when you do this is that you end up bogging down your site with all this extra code for plugins that you might not really be using. Some plugins are pretty light when it comes to code, but others are a lot heavier. And they could be injecting things into the back end of your site that you’re not even aware of.
So, you want to make sure that you’re disabling plugins that you’re not using, even deleting them, not just disabling them, to make sure that you don’t cause this extra code bloat on your website. Maintaining your plugins is also really important, because they can become a security risk for you. That is how a lot of sites end up getting hacked: an old, outdated version of a plugin. So, definitely make sure that you’re being mindful of them and that you’re keeping them updated as much as possible. And if you have trouble with that, there are managed services out there, such as NerdPress, little call out there, which can help you with managing your plugins, with page speed optimization, and all sorts of things.
Anytime you can get into something that’s a managed solution for you like with Andrew at NerdPress, a lot of the clients who we recommend to him come back to us and say that it’s night and day performance improvement. It’s just like peace of mind that somebody is watching over your site, making sure that everything’s in order.
Arsen, can you explain what you mean by indexing strategy and cutting down on low quality pages?
Yeah. And there’s a pretty good write up we should probably link to on our site. So, indexing strategy, quick backstory on that. Back when Google wasn’t as powerful, we were always concerned… and Sarah has been with me, with TopHatRank, for almost 10 years now, and she remembers those days. We were concerned about crawl budgets. And then Google came out and said, “Hey, we’re strong enough now, we’re able to pretty much crawl and index anything you throw at us.” Larger sites should still be concerned with index bloat and crawl budgets and all of that; smaller sites, not so much. But you should still keep in mind that you don’t want Google to go around and pick up pages that are very low quality. And low quality pages are your comment pagination pages. There’s not much value there, and Google does not rank those. They’re valuable for your users, but beyond page one, we don’t see those pages show up in rankings.
So, as Google crawls, it can go into this long paginated sequence of comment pages and just spend a lot of time there. It will pick those up, but Google might encounter an issue. If you’re using a canonical that’s pointing to the first page, or to the actual article, on your comment pagination, Google might not respect the canonical. And we saw a lot of that in our study as well: canonicals were just not being respected. So Google is crawling through these paginated comment pages and spending time there. You definitely don’t want to send Google in that direction. I’d rather have Google spend more time picking up new content and new blog posts. Then your tag pages: in a lot of the audits that we do, we see publishers still… Well, the first thing that I see all the time is that you have an exact clone of your categories in your tags, and both are accessible and crawlable.
A, you’re bogging down the crawler with this. It’s useless for the crawler to crawl your tags when your categories are exactly the same; you’re duplicating content. It’s a really bad experience. And Google looks at it from the standpoint of, “If I’m having a bad experience as a crawler, as a bot that’s just going line by line, I can only imagine what a user is going to go through if they encounter this issue.” So, pagination of taxonomies that you don’t use for navigation or organization should not be crawled by Google. Your comment pages should not be crawled by Google. You just don’t need Google to spend time on that. And that’s what we mean by indexing strategy.
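A crude way to picture that indexing strategy: decide, per URL, whether a page is one of the low-value types Arsen lists (comment pagination, tag archives that clone your categories) and mark it noindex. The patterns below are hypothetical and assume default WordPress-style permalinks; adapt them to your own URL structure:

```python
import re

# Hypothetical patterns for the low-value page types discussed above.
NOINDEX_PATTERNS = [
    re.compile(r"/comment-page-\d+/?$"),  # comment pagination under a post
    re.compile(r"^/tag/"),                # tag archives duplicating categories
]

def should_noindex(path):
    """True if this path looks like a page we'd rather Google not crawl and index."""
    return any(pattern.search(path) for pattern in NOINDEX_PATTERNS)

print(should_noindex("/keto-pancakes/comment-page-3/"))  # True
print(should_noindex("/recipes/keto-pancakes/"))         # False
```

On a real WordPress site you’d enforce the decision with a robots meta tag or your SEO plugin’s archive settings rather than code like this; the function only sketches the classification step.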
Okay. And is a content audit something that you can do to help spot those red flags and cut down on your content?
Yeah. Thank you. You can also have low quality content happen just from content that’s been sitting there for a while and it’s useless and not doing anything for you. A content audit will definitely help you identify those posts. Something we do as part of our content audits is score your posts on multiple SEO and social factors, and we determine whether the post is worth keeping, and if it is worth keeping, are we going to update it, replace it, or delete it? Even with us, we just did a content audit for TopHatRank and we found pieces of content covering web 2.0 properties back from 2009 that don’t even exist anymore. And this was just sitting on our site. So yeah, absolutely, a content audit helps from a user perspective: how valuable is this content to the user? When was the last time somebody actually cared about this topic? When was the last time somebody actually searched for it?
Another thing we often see with these low quality, low content pages in a lot of the audits we do is pages that are just media attachments. This happens when someone goes to insert an image into a post, and in the little box that’s in Yoast it says link to media attachment. What that does is create a page that has only that image on it. And that’s it. Those are really low quality pages, because there’s no content on them. So, that’s something you should make sure you are not doing, and it’s just a setting in WordPress to turn off.
There’re really awesome questions in the Q&A right now. [inaudible 00:54:42].
Yes, yes, please keep adding them in. We only have a couple more panel questions and then we’re going to open up the Q&A. So, we will make sure to address every question that’s in the Q&A. And if we happen to run out of time as more questions come in, after the webinar we’ll be sharing all of the links, all of the resources, a replay of this, and for any questions that were asked that didn’t get answered, we’ll make sure to answer them and send you all the resources. So with that: Sarah, affiliate links are another way for publishers to make revenue from their content. How can a publisher clarify that they have affiliate links in a post? What’s the best way to go about doing that?
So, what we saw within our study is that there are definitely a couple of different things you want to do to make sure you’re very clearly disclosing that a link is an affiliate link. Just having the blanket affiliate disclaimer in your footer, or at the top of the post, or in your sidebar, things like that, is not really enough. On the actual link, first, you want to make sure that you have a nofollow attribute on it, because you don’t need Google following your affiliate links, but you also want a rel sponsored attribute on it, so when Google crawls the link, it sees that this is actually a sponsored link. Then, within the content, for your users, you want to specifically call out that this is a link to go buy something somewhere.
So, it could be buy this cookie sheet or get this cookie sheet here on Amazon or get this cookie sheet here from Better Homes and Gardens. Is that still around? So, you want to make it very, very clear that this is an affiliate link, that you’re not trying to deceive the user in any way. That’s really what Google doesn’t like. Don’t be tricky, don’t try to be sly, just be very clear and upfront and transparent with everything that you’re doing.
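Those two rel values can be checked mechanically. As a sketch using Python’s stdlib html.parser, where the Amazon host and the tag=myblog-20 affiliate code are made-up examples, this flags affiliate links whose rel attribute is missing nofollow or sponsored:

```python
from html.parser import HTMLParser

class AffiliateLinkAudit(HTMLParser):
    """Flag anchors to affiliate hosts whose rel lacks nofollow and sponsored."""
    AFFILIATE_HOSTS = ("amazon.com",)  # hypothetical list; extend for your networks

    def __init__(self):
        super().__init__()
        self.flagged = []  # hrefs that need their rel attribute fixed

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href") or ""
        if any(host in href for host in self.AFFILIATE_HOSTS):
            rel = set((d.get("rel") or "").lower().split())
            if not {"nofollow", "sponsored"} <= rel:
                self.flagged.append(href)

auditor = AffiliateLinkAudit()
auditor.feed(
    '<a href="https://www.amazon.com/dp/B001?tag=myblog-20" rel="nofollow sponsored">ok</a>'
    '<a href="https://www.amazon.com/dp/B002?tag=myblog-20">needs rel</a>'
)
print(auditor.flagged)  # only the second link is flagged
```

The in-content disclosure for readers still has to be written by hand; a script like this only catches the markup side.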
Okay. And clarify that it’s an affiliate link.
That makes sense. And Arsen, my final question for you, how often should you be reviewing your backlinks and how can you determine when your backlink is actually harming your site?
Yeah. So, rule of thumb, and this is something that’s in that write up we published on our site to further expand on our findings: you should not be using the disavow tool if your site has not been affected by an update and everything is working fine. To answer your question, if you were affected by an update, you should definitely be looking at your backlinks. Don’t rush to take action on them; examine them first. There are a lot of hints you can find within your backlink profile that will uncover more interesting things, or even other issues. In-house, I think we recommend right now that if everything is fine, you look once a quarter for larger sites, and once or twice a year for smaller sites, just as a prophylactic. If you are fluctuating through updates, definitely look at your backlinks. Backlinks, like I said, paint a picture. You might have an influx of sidebar links, one domain sending a link to you from every page on their site, especially if it’s a low quality domain. It just doesn’t look right.
So, I would be looking for those things. Anything that’s coming from foreign language websites, anything that’s adult. And keep in mind, Google is really good at ignoring links. So again, your trigger should be, “I was affected during an update, something weird happened, I should probably take a look at my backlinks.” But again, it’s part of a wider investigation of all the things that could potentially have happened during an update. So, we look at crawlability, accessibility, we look at content, we look at structure, we look at architecture, internal linking, page speed, schema, backlinks. All of it should be part of your evaluation.
That makes sense. Well, thank you, Sarah and Arsen, for diving deeper into the study and sharing the presentation. We’re officially going to go into Q&A, so now’s your time, if you haven’t already, to drop a question in. We are going to start with Schuyler’s question: for the sites that saw drops, were they due to individual high performing pages dropping in rankings, or was it a drop across all posts? Arsen, would you like to address that?
Yeah. So, an interesting observation in our study was that… Well, let me take a step back. When the update rolled out, a lot of the bloggers were coming to us and saying, “I was affected, I was affected.” I think we were doing six, eight calls a day at that time. And I had nothing to tell you guys. I’m like, “I don’t know.” I think this was also when we were doing the little short meetings for the Tastemaker Conference. So, for those of you who were on those calls, Casey’s ears are burning now, we didn’t have anything strong to tell you. And that’s why we decided to do the study. We wanted to do this because we just didn’t know. So yeah, when we actually dug into the data, a lot of bloggers were telling us, “I lost so much of my traffic, my entire site has been affected, everything is down. This is the end of the world.” But when we actually dug in, it was only a handful of pages that were delivering the majority of the traffic.
And the majority of the traffic for a lot of these pages was coming from a handful of short tail, broad keywords that, as part of this update, a lot of the bloggers participating in our study lost positions for. And when you move from being number two or number three to being number five or number six, your traffic drops off drastically. It’s a huge drop. For some of these keywords we were looking at, I think one keyword was almost 60,000 monthly referrals dropping off from that one fluctuation. So yeah, Schuyler, it felt like the entire site was being affected, but when we actually dug into the data, it was a handful of posts that were affected. Typically, unless it’s a manual penalty or you did a full migration and it was botched, you will not see… and Sarah, chime in, maybe I’m missing something, but we haven’t seen a full site being affected. It’s usually certain pages.
Not in a long time. Back in the old days-
… Happened but it doesn’t really happen so much anymore.
They haven’t seen it in a while.
Yeah. [crosstalk 01:01:30]. Sorry.
Oh, go ahead Sarah.
Going back to what we had talked about a little earlier with the intent of the search: a lot of what was lost were those really big, broad keywords with really high search volume. And what we’ve seen with that is that the intent of what the user is looking for when searching those is really questionable. If someone is searching for falafel, are they looking for a falafel recipe, are they looking to buy one, or are they looking for information, calories? What are they looking for when they search this broad, broad query? And so we think that’s a lot of the reason why these bigger sites, with a whole lot of authority and a whole lot of knowledge depth on these topics, are ranking: they’re able to satisfy all the different possible intents for the searches being performed.
And Google is constantly shuffling things around. I think I might have rushed through that slide, but it’s out of your control; Google changes how it templatizes the result page very frequently. I’m sure you’ve seen this: when you put in a very broad query like Eiffel Tower, Google gives you a kitchen sink result. It gives you videos, it gives you people also ask, it gives you a knowledge panel. The more you refine your search, the more precise Google is going to get. But that’s also constantly changing. So, intent matching, to Google, is an ongoing process. It’s constantly shuffling results around, testing which type of result works. One of the things we saw in our study, as part of the indexing strategy, was category pages that weren’t optimized on specific websites being outranked for their category topic keyword by roundup posts. And I think we’re working with a client right now where we have a roundup post at number one, and the category page, after we optimized it, is now number two. Right, Sarah?
We just talked about this. So, the roundup post is not going to age as well as your category page that’s constantly being updated as you continue to publish and add more information on that particular topic. If I search for something broad like potato soup, is Google going to want to show me a specific recipe? I didn’t say potato soup recipe, I’m just saying potato soup to Google. Is Google going to want to show me a page with just one potato soup recipe, or a page with a bunch of potato soup recipes so I can decide which one I want to look at? So definitely, as part of your indexing strategy, you should be looking at whether these category pages will perform well. Category pages will always be better than roundups, at least in my book, because of how evergreen they are, because you can constantly add content to them.
If they’re properly optimized.
If they’re properly optimized. A blank category page that’s just, “Here’s all my posts,” with just an image and the title, brings zero value to the user and to the search engine. That’s what happens when you do it that way, when you’re not optimizing it, when you’re not bringing in excerpts, when you’re not bringing in any other features. Again, look at Allrecipes, look at Mealthy, look at everybody. They’re bringing in star ratings, they’re bringing in excerpts, they’re helping the user make a decision on which recipe they should be looking at and preventing constant clicking back and forth. The richer an experience you can make your category page, the better that page will rank for broader terms.
And I think this whole concept bounces into, and I apologize in advance if I say your name wrong, Hurry, your question: “I’ve been blogging for many years now, so understandably I have many older recipes which qualify as thin content that I’d like to noindex. However, many of these are top ranked but don’t result in much traffic.” So, do we see any problem, from an SEO point of view, if she went ahead and noindexed these top ranking pages and updated them, or made them SEO-fit content later on? And Karen is asking about this same concept with [crosstalk 01:06:06] high ranking pages. What do you do if the content is not that great anymore, but it’s ranking well?
Okay. So, I would say don’t touch it, don’t kill it off; just update it and see if you can modernize it. If it’s ranking, it’s ranking because of something. Something is causing it to rank; it doesn’t usually just rank on its own. If they’re not very competitive terms, if there’s nothing salvageable there… Listen, you definitely need a content audit to determine all this. It’s hard for us to say on a case by case basis, “This is what you should do with this post.” It all depends. It depends on how many backlinks, it depends on how many sessions, it depends on whether there are dependencies on that post: is that post internally linking to other posts that it’s supporting? You can remove a piece of content that’s passing a lot of authority internally from one page to another and see fluctuations on the other page it’s linking to. So, it’s not as easy as, “Hey, this is the proper standard operating procedure for handling this.” It literally depends on the situation. Sarah?
I know, I totally agree. I think it really comes down to your personal preference and how much you value the traffic that’s coming in from that post. If it is ranking highly for some keywords, then in theory it is going to be bringing in traffic for those keywords. So, it’s really up to you: how much is that traffic worth to you versus having the old, outdated content out there? If it’s a post that’s really not bringing in any traffic, then I would say you can just noindex it, or if it’s something you do want to write about still, feel free to go ahead and update the content to make it more relevant and make it into something that you do want to be shared. But ultimately, I would look more at traffic than specifically at rankings, because just because you’re ranking for a keyword doesn’t mean you’re necessarily going to be getting traffic, depending on the sort of [inaudible 01:08:09].
That makes sense. Question from Stacey, “Should we keep the video ad that pops up and follows you through the post on mobile or is that bad for user experience?”
Yeah, it’s definitely bad for user experience. I think… and again, I’m not 100% sure on this, some further reading might be required, but I think if it’s blocking, Sarah, 30% of the visible part of the screen, right? Google is not going to be happy with that.
Especially on mobile. Yeah.
Yeah. But definitely, I’m sure there’s new information on this. I would definitely do some research on that. I can’t give you 100% answer on this.
I think when it comes to user experience, if you’re always questioning if it’s going to be a bad user experience, go through it yourself and go through it with your family, see what they do if they bounce right back off of it, because it’s so interruptive.
But it could also be the ad that you’re serving up as a part of your recipe card. And I think some of these platforms you send them your video and then they add a pre-roll. So, it’s like-
Yeah. Oh, is Arsen okay?
… And I’m back.
It was like pause break.
Awkward silence break.
Of course. Of course. Just a couple more questions and then we’ll go ahead and wrap this up. Does Google penalize you for EAT if you write about different topics? For example, recipes, home decor, travel. Is it better to focus and just be a recipe blogger, or a do-it-all kind of blogger?
Okay, so there’s no penalty for EAT: expertise, authority and trust. I think John Mueller recently came out and said that Google does not explicitly measure EAT as part of its algorithmic evaluation of websites. So, I don’t think EAT as a penalty exists. You can definitely write about multiple topics, as long as you don’t write about things that you are not qualified to write about, that you’re not an authority on, whichever way Google decides to evaluate that, whether it’s backlinks or the perceived authority of your site, whatever it is, right? If you do publish content… So, just to take a step back. If you want to cover multiple topics, that’s perfectly fine. There are plenty of multi-topic blogs out there doing very, very well. Should you be covering the topic of things to eat when your sugar level is at 128? Definitely not. You shouldn’t be writing about that. Again, it depends, but also be very mindful of what kind of approach you’re taking on a specific topic.
So, you want to maybe answer questions around it that do not have to… Anything that doesn’t have to do… Let me rephrase that again. If something you’re writing about is incorrect, is it going to badly affect the person that’s reading? So, if I follow your instructions and you are wrong, is it going to affect me financially or health wise or medically? If you can say no, even if I am going to completely bullshit my way through this article and give some crappy advice and you follow it, nothing bad’s going to happen to you, then sure write about it. But I don’t think there’s a… You’re not going to have a penalty. Unless, again, Google can look through your content, assume that, “Hey, you’re not really an expert on this topic.” And that content is just not going to rank. You’re not going to be penalized.
Yeah, that makes sense. Question from Renee: “Many of us do theme-based weekly, monthly or quarterly posts and link to each other. Is it a good idea, as it would lead to the same people linking to each other?”
So, for SEO purposes, you only really need one link from each domain. Based on the data in our study, I would not continue doing that. I would probably add nofollow links, especially now that nofollows are just suggestions, Sarah, right? They’re recommendations, suggestions. Google might-
They’re still part of your evaluation of your profile, even if they’re not as strong as a follow link.
Yeah. So, I would play it on the safe side, if you already got a link from that site, adding another link is not really going to move the needle much for you. I would just apply nofollow. I think there was also a question about adding sponsored to affiliate links even though they’re not paying you for it. Affiliates get paid when somebody transacts. So, if you are adding a link and it has an affiliate code, you’re doing this before they pay you. If it is a sponsored post, then you need to add rel sponsored to it. So, if somebody paid you, we come to you and we say, “Hey, we want you to review this kitchenware product.” You need to add sponsored to that.
Okay, that makes sense. Sarah, what’s the recommended keyword density? And do you have any tools that you recommend or use to check keyword density?
It depends. There’s not a specific amount that you should be aiming for, because each topic you’re writing about, each vertical you’re writing in, is going to vary a bit and have its own range of keyword densities. So, a lot of times, when we’re doing page-level recommendations for our clients, we’ll take a look at the keyword density of the top five ranking sites and see what they’re doing: how many times are they using the keyword? If they’re ranking, they’re probably doing something right, and it’s probably a good density to aim for. With that in mind, if you want just a general ballpark number, I would say between 1 and 2 percent-ish. That’s very, very general, so take it with a grain of salt and really dig into what is specifically working for the topic you’re writing about, but that is what a lot of people cite as a general keyword density to aim for.
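For readers who want to check that 1–2% ballpark on their own drafts, keyword density is commonly computed as the share of the page’s words taken up by occurrences of the target phrase. This is an illustrative sketch of that textbook formula, not the exact metric any particular tool uses:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Percent of words in `text` belonging to occurrences of `keyword`.

    Handles multi-word phrases by sliding a window over the tokens.
    """
    words = text.lower().split()
    phrase = keyword.lower().split()
    if not words or not phrase:
        return 0.0
    n = len(phrase)
    # Count every position where the full phrase matches
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

# "apple" appears 2 times in 5 words -> 40% density
print(keyword_density("apple pie and apple cake", "apple"))
```

A real tool would also normalize punctuation and stem word forms, but the proportion being measured is the same.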
And as far as tools, there are a lot of really good content-analyzing tools that do frequency and density analysis. Some that we like to use: SEMrush has a good analyzer, and Write Bot also has a new content analyzer that they’ve released recently.
Yeah, we’re very happy with Write’s TF-IDF analysis, that’s term frequency-inverse document frequency. SEMrush does a really good job too. Write might not be the perfect tool for bloggers, but SEMrush definitely is a really good tool for that.
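For the curious, the TF-IDF score mentioned here can be sketched in a few lines of plain Python. This is the standard textbook formula, not a reimplementation of Write’s or SEMrush’s analyzers, and the tiny corpus is invented for illustration:

```python
import math

def tf_idf(term: str, doc: list[str], corpus: list[list[str]]) -> float:
    """TF-IDF of `term` in `doc`, where documents are lists of tokens.

    High scores mean the term is frequent in this document but rare
    across the rest of the corpus.
    """
    tf = doc.count(term) / len(doc)           # term frequency in this doc
    df = sum(1 for d in corpus if term in d)  # how many docs contain the term
    if df == 0:
        return 0.0
    idf = math.log(len(corpus) / df)          # inverse document frequency
    return tf * idf

corpus = [["sourdough", "starter"], ["sourdough", "loaf"], ["cake", "loaf"]]
# "starter" is rare in the corpus, so it scores higher than "sourdough"
print(tf_idf("starter", corpus[0], corpus))
```

Commercial tools add smoothing and normalization on top, but this is the core quantity being compared across the top-ranking pages.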
Perfect. Our final question from Dave, “Does updating the post several times in a short period of time affect its performance? I have a lot of old posts that need to be updated, but the updates might need to be done in several different stages. So should it be updated all in one go or can I actually spread out the update over time?”
It’s up to you. Until the post reaches its final form, it’s not going to perform at its best, but Google has said that they like to see ongoing improvements being made to websites. So it’s completely up to you; I don’t see it being bad either way.
Okay, perfect. That makes sense. Well, thank you again, everyone who joined us today, and thank you, panelists Sarah and Arsen, for sharing all of this great information. We will be sending out all the resources that we mentioned. Any of the questions that didn’t get answered, we will be answering, and we’ll email each and every one of you all of the links and everything you’ll need to watch this again and follow along. If you have any specific questions, want to learn even more about the study, or have specific questions on your pages or anything else, please feel free to reach out to us at firstname.lastname@example.org. And again, thank you for joining us.
Wait, don’t turn it off.
I will not. I saw that thing coming. But wait, there’s more.
But wait, there’s more. So, we do have a write-up on the follow-up study that we will be sending out to you guys who attended this webinar. We’re not publishing it publicly yet, so we’re just sending it to everyone who attended. Also, make sure you subscribe to our email list. That’s not because we want to send you promotional stuff; it’s just that Ashley, Sarah and I speak at different webinars and conferences online, and we feel that some of them might be very useful to publishers and recipe bloggers. Some are very specific to e-commerce, but there are others we feel are useful, so we want to make sure we’re keeping you guys updated on those.
As a matter of fact, in August I will be speaking at a digital marketing summit event or maybe something else. We’re going to be talking about a topic that is constantly brought up: information architecture, proper category structure and breadcrumbs. That’s a very important topic, and we will be publishing a post announcing the event. But yeah, just subscribe. And again, it’s not because we want to sell you something; we just want to keep everybody informed.
Subscribe, like and follow. Thank you again, everyone. Have a great rest of your day or evening, I don’t think it’s morning for anybody. Take care, and we’ll talk to you all soon.