The Theory Behind Ranking Factors — Whiteboard Friday

Since day one of SEO, marketers have tried to determine what factors Google takes into account when ranking results on the SERPs. In this brand new Whiteboard Friday, Russ Jones discusses the theory behind those ranking factors, and gives us some improved definitions and vocabulary to use when discussing them.



Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, folks. Welcome back to another Whiteboard Friday. Today, we’re going to be talking about ranking factors and the theory behind them, and hopefully get past some of these — let’s say controversies — that have come up over the years, when we’ve really just been talking past one another.

You see, ranking factors have been with us since pretty much day one of search engine optimization. We have been trying as SEOs to identify exactly what influences the algorithm. Well, that’s what we’re going to go over today, but we’re going to try and tease out some better definitions and vocabulary so that we’re not talking past one another, and we’re not constantly beating each other over the heads about correlation and not causation, or some other kind of nuance that really doesn’t matter.


So let's begin at the beginning with direct ranking factors. This is the narrowest understanding of ranking factors. It's not to say that it's wrong; it's just pretty restrictive. A direct ranking factor would be something that Google measures and that directly influences the performance of the search result.

So a classic example would actually be your robots.txt file. If you make a change to your robots.txt file, and let’s say you disallow Google, you will have a direct impact on your performance in Google. Namely, your site is going to disappear.
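To see that direct impact from a crawler's point of view, here's a small sketch using Python's standard-library `urllib.robotparser` (the rules and URLs are illustrative, not a recommendation):

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that disallows Google's crawler entirely (illustrative only).
rules = """User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may not fetch anything on this site anymore...
print(parser.can_fetch("Googlebot", "https://example.com/some-page"))  # False

# ...while a crawler with no matching rules still can.
print(parser.can_fetch("Bingbot", "https://example.com/some-page"))    # True
```

One disallow rule, and the site effectively vanishes from that engine's index over time: about as direct as a ranking factor gets.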

The same is true for the most part with relevancy. Now, we might not know exactly what it is that Google is using to measure relevancy, but we do know that if you improve the relevancy of your content, you’re more likely to rank higher. So these are what we would call direct ranking factors. But there’s obviously a lot more to it than that.

Google has added more and more features to their search engine. They have changed the way that their algorithm has worked. They’ve added more and more machine learning. So I’ve done my best to try and tease out some new vocabulary that we might be able to use to describe the different types of ranking factors that we often discuss in our various communities or online.


Now, obviously, if there are direct ranking factors, it seems like there should be indirect ranking factors. And these are just once-removed ranking factors or interventions that you could take that don’t directly influence the algorithm, but they do influence some of the direct ranking factors which influence the algorithm.

I think a classic example of this is hosting. Let’s say you have a site that’s starting to become more popular and it’s time to move off of that dollar-a-month cPanel hosting that you signed up for when you first started your blog. Well, you might choose to move to, let’s say, a dedicated host that has a lot more RAM and CPU and can handle more threads so everything is moving faster.

Time to first byte is faster. Well, Google doesn’t have an algorithm that’s going out and digging into your server and identifying exactly how many CPU cores there are. But there are a number of direct ranking factors, those that are related perhaps to user experience or perhaps to page speed, that might be influenced by your hosting environment.

Subsequently, we have good reason to believe that improving your hosting environment could have a positive influence on your search rankings. But it wouldn’t be a direct influence. It would be indirect.

The same would be true with social media. While we’re pretty sure that Google isn’t just going out and saying, “Okay, whoever is the most popular on Twitter is going to rank,” there is good reason to believe that investing your time and your money and your energy in promoting your content on social media can actually influence your search results.

A perfect example of this would be promoting an article on Facebook, which later gets picked up by some online publication and then links back to your site. So while the social media activity itself did not directly influence your search results, it did influence the links, and those links influenced your search results.

So we can call these indirect ranking factors. For politeness’ sake, please, when someone talks about social media as a ranking factor, just don’t immediately assume that they mean that it is a direct ranking factor. They very well may mean that it is indirect, and you can ask them to clarify:  “Well, what do you mean? Do you think Google measures social media activity, or are you saying that doing a better job on social is likely to influence search results in some way or another?”

So this is part of the process of teasing out the differences between ranking factors. It gives us the ability to communicate about them in a way in which we’re not, let’s say, confusing what we mean by the words.


Now, the third type is probably the one that’s going to be most controversial, and I’m actually okay with that. I would love to talk in either the comments or on Twitter about exactly what I mean by emergent ranking factors. I think it’s important that we get this one clear in some way, shape, or form because I think it’s going to be more and more and more important as machine learning itself becomes more and more and more important as a part of Google’s algorithm.

Many, many years ago, search engine optimizers like myself noticed that web pages on domains that had strong link authority seemed to do well in organic search results, even when the page itself wasn’t particularly good, didn’t have particularly good external links — or any at all, and even didn’t have particularly good internal links.

That is to say it was a nearly orphaned page. So SEOs started to wonder whether or not there was some sort of domain-level attribute that Google was using as a ranking factor. We can’t know that. Well, we can ask Google, but we can only hope that they’ll tell us.

So at Moz, what we decided to do was try and identify a series of domain-level link metrics that actually predict the likelihood that a page will perform well in the search results. We call this an emergent ranking factor, or at least I call it an emergent ranking factor, because it is obviously the case that Google does not have a specific domain-authority-like feature inside their algorithm.

At the same time, they do have a lot of data about links pointing to different pages on that same domain. What I believe is going on is what I would call an emergent ranking factor: the influence of several different metrics, none of which was designed with the intent of creating such a thing, ends up being easier to measure and to talk about as one emergent ranking factor than as all of its constituent elements.
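To make "emergent" concrete, here's a toy sketch that rolls several constituent link metrics into one convenient score. To be clear: every name, weight, and formula below is invented purely for illustration; this is not how Moz computes Domain Authority, nor how Google weighs links.

```python
import math

def emergent_domain_score(root_domains_linking, total_links, avg_link_quality):
    # Log-scale the raw counts so one huge value can't dominate the blend.
    blend = (
        0.5 * math.log1p(root_domains_linking)
        + 0.2 * math.log1p(total_links)
        + 0.3 * avg_link_quality  # assume a 0-1 link-quality estimate
    )
    # Squash into a familiar 0-100 range.
    return round(100 * blend / (blend + 5), 1)

print(emergent_domain_score(1200, 45000, 0.8))
```

None of the three inputs is "the" factor; the single number that emerges is just a convenient handle for all of them at once, which is exactly the point of the roux analogy below.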

Now, that was kind of a mouthful, so let me give you an example. When you’re making a sauce if you’re cooking, one of the most common parts of that would be the production of a roux. A roux would be a mix, normally of equal weights of flour and fat, and you would use this to thicken the sauce.

Now, I could write an entire recipe book about sauces and never use the word "roux", simply describing the process of producing one a hundred times, because "roux" describes this intermediate state. But it becomes very, very useful as a chef to be able to just say to another chef (or a sous-chef, or a reader of your cookbook), "produce a roux out of" whatever particular fat you're using, whether it's butter or oil or something of that sort.

So the analogy here is that there isn’t really a thing called a roux that’s inside the sauce. What’s in the sauce is the fat and the flour. But at the same time, it’s really convenient to refer to it as a roux. In fact, we can use the word “roux” to know a lot about a particular dish without ever talking about the actual ingredients of flour and of fat.

For example, we can be pretty confident that if a roux is called for in a particular dish, that dish is likely not bacon because it’s not a sauce. So I guess what I’m trying to get at here is that a lot of what we’re talking about with ranking factors is using language that is convenient and valuable for certain purposes.

Like DA is valuable for helping predict search results, but it doesn't actually have to be a part of the algorithm in order to do that. In fact, I think there's a really interesting example going on right now, one where we're about to see a factor shift between these categories: Core Web Vitals.

Google has been pushing page speed for quite some time and has provided us several iterations of different types of metrics for determining how fast a page loads. However, what appears to be the case is that Google has decided not to promote individual, particular steps that a website could take in order to speed up, but instead wants you to maximize or minimize a particular emergent value that comes from the amalgamation of all of those steps.

We know that the three Core Web Vitals are first input delay, largest contentful paint, and cumulative layout shift. So let's talk about the third one. If you've ever been on your cell phone and noticed that the text loads before certain other elements, so you start reading and try to scroll down, and as soon as you put your finger there an ad pops up, because the ad took longer to load and it's jostling the page, well, that's layout shift, and Google has learned that users just don't like it. So, even though they don't know all of the individual factors underneath that are responsible for cumulative layout shift, they know that this one measurement explains all of it, that it's great shorthand, and that it's a really effective way of determining whether or not a user is going to enjoy their experience on that page.
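As a rough sketch of how such an emergent measurement works: real CLS scores each shift as impact fraction times distance fraction, and Google's current definition aggregates over "session windows" rather than a plain sum, but a simple sum captures the idea:

```python
def layout_shift_score(impact_fraction, distance_fraction):
    # One shift: how much of the viewport was affected, times how far it moved.
    return impact_fraction * distance_fraction

def cumulative_layout_shift(shifts):
    # Simplified aggregation: add up every shift over the page load.
    return sum(layout_shift_score(i, d) for i, d in shifts)

# A late-loading ad (big shift) plus a couple of minor reflows.
shifts = [(0.5, 0.25), (0.1, 0.05), (0.2, 0.15)]
print(cumulative_layout_shift(shifts))  # roughly 0.16, above the 0.1 "good" threshold
```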

This would be an emergent ranking factor. Now, what’s interesting is that Google has now decided that this emergent ranking factor is going to become a direct ranking factor in 2021. They’re going to move these descriptive factors that are amalgamations of lots of little things and make them directly influence the search results.

So we can see how these different types of ranking factors can move back and forth from categories. Back to the question of domain authority. Now, Google has made it clear they don’t use Moz’s domain authority — of course they don’t — and they do not have a domain-authority-like metric. However, there’s nothing to say that at some point they could not build exactly that, some sort of domain-level, link-based metric which is used to inform how to rank certain pages.

So an emergent ranking factor isn’t stuck in that category. It can change. Well, that’s enough about emergent ranking factors. Hopefully, we can talk more about that in the comments.


The next type I wanted to run through is what I would call a validating ranking factor. This is another one that's been pretty controversial: the Quality Rater Guidelines' list of things that matter, of which probably the most talked about is E-A-T: Expertise, Authoritativeness, and Trustworthiness.

Well, Google has made it clear that not only do they not measure E-A-T (or at least, as best I understand it, they don't have metrics specifically targeted at E-A-T), but also, when they collect data from quality raters on whether or not the SERPs they're looking at meet these qualifications, they don't train their algorithm against the labeled data that comes back from those raters, which, to me, is surprising.

It seems to me like if you had a lot of labeled data about quality, expertise, and authoritativeness, you might want it trained against that, but maybe Google found out that it wasn’t very productive. Nevertheless, we know that Google cares about E-A-T, and we also have anecdotal evidence.

That is to say, webmasters have noticed over time, especially in "your money or your life" types of industries, that expertise and authority do appear to matter in some way, shape, or form. So I like to call these validating ranking factors, because Google uses them to validate the quality of the SERPs and the sites that are ranking, but doesn't actually use them in any direct or indirect way to influence the search results.

Now, I've got an interesting one here, which is what I would call user engagement, and the reason why I've put it here is because this remains a fairly controversial ranking factor. We're not quite sure exactly how Google uses it, although we do get some hints every now and then, like Core Web Vitals.

If that data is collected from actual user behavior in Chrome, then we’ve got an idea of exactly how user engagement could have an indirect impact on the algorithm because user engagement measures the Core Web Vitals, which, coming in 2021, are going to directly influence the search results.


So validating is this fourth category of ranking factors, and the last — the one that I think is the most controversial  — is correlates. We get into this argument every time: “correlation does not equal causation”, and it seems to me to be the statement that the person who only knows one thing about statistics knows, and so they always say it whenever anything ever comes up about correlation.

Yes, correlation does not imply causation, but that doesn't mean it isn't very, very useful. So let's talk about social metrics. This is one of the classic ones. Several times we've run various studies of ranking factors and discovered a direct, strong relationship between things like Facebook likes or Google +1s and rankings.

All right. Now, pretty much everyone immediately understood that the reason why a site would have more plus-ones in Google+ and would have more likes in Facebook would be because they rank. That is to say, it’s not Google going out and depending on Facebook’s API to determine how they’re going to rank the sites in their search engine.

On the contrary, performing well in their search engine drives traffic, and that traffic then tends to like the page. So I understand the frustration there when customers start asking, “Well, these two things correlate. Why aren’t you getting me more likes?”

I get that, but it doesn't mean that it isn't useful in other ways. So I'll give you a good example. If you are ranking well for a keyword and yet your social media metrics are poorer than your competitors', it means that something is going on in that situation that is making users engage better with your competitors' sites than your own, and that's important to know.

It might not change your rankings, but it might change your conversion rate. It might increase the likelihood that you get found on social media. Even more so, it could actually influence your search results: when you recognize that the reason you're not getting any likes on your page is broken code, so the Facebook button isn't working, and then you fix it and more and more people start sharing, engaging with, and linking to your content, well, then we start having that indirect effect on your rankings.

So, yeah, correlation isn’t the same as causation, but there’s a lot of value there. There’s a new area that I think is going to be really, really important for this. This is going to be natural language processing metrics. These are various different technologies that are on the cutting edge. Well, some are older. Some are newer. But they allow us to kind of predict how good content is.

Now, chances are we are not going to guess the exact way that Google is measuring content quality. I mean, unless a leaked document or something shows up, we’re probably not going to get that lucky. But that doesn’t mean we can’t be really productive if we have a number of correlates, and those correlates can then be used to guide us.
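As one trivial example of a content correlate (with no claim that Google computes anything like it), here's a lexical-diversity measure, the type-token ratio, which loosely tracks how varied a page's vocabulary is:

```python
import re

def type_token_ratio(text):
    # Unique words divided by total words; higher suggests more varied writing.
    words = re.findall(r"[a-z']+", text.lower())
    return len(set(words)) / len(words) if words else 0.0

thin = "buy cheap widgets buy cheap widgets buy now buy now"
rich = "our widgets are machined from aircraft-grade aluminum and tested daily"
print(type_token_ratio(thin) < type_token_ratio(rich))  # True
```

A metric like this only correlates with quality; chasing the number itself (say, by stuffing synonyms) would be exactly the kind of manipulation to stay away from.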

So I drew a little map here to kind of serve as an example. Imagine that it’s the evening and you’re camping, and you decide to go on a quick hike, and you take with you, let’s say, a flag or a series of flags, and you mark the trail as you go so that when it gets later, you can flick on your flashlight and just follow the flags, picking them up, to lead you back to camp.

But it gets super dark, and then you realize you left your flashlight back at camp. What are you going to do? Well, we need to find a way to guide ourselves back to camp. Now, obviously, the flags would have been the best situation, but there are lots of things that are not the camp itself and are not the path itself, but would still be really helpful in getting us back to camp. For example, let’s say that you had just put out the fire after you left camp. Well, the smell of the smoke is a great way for you to find your way back to the camp, but the smoke isn’t the camp. It didn’t cause the camp. It didn’t build the camp. It’s not the path. It didn’t create the path. In fact, the trail of smoke itself is probably quite off the path, but once you do find where it crosses you, you can follow that scent. Well, in that case, it’s really valuable even though it just mildly correlates with exactly where you need to get.

Well, the same thing is true when we’re talking about something like NLP metrics or social media metrics. While they might not matter in terms of influencing the search results directly, they can guide your way. They can help you make better decisions. The thing you want to stay away from is manipulating these types of metrics for their own sake, because we know that correlates are the furthest away from direct ranking factors — at least when we know that the correlate itself is not a direct ranking factor.

All right. I know that’s a lot to stomach, a lot to take in. So hopefully, we have some material for us to discuss below in the comments, and I look forward to talking with you more. Good luck. Bye.


What is Dwell Time? Is Dwell Time a Google Ranking Factor?

There are many ranking factors Google uses, and dwell time (D.T) may be one of those factors. How is dwell time measured? Is it a ranking factor? And if so, how can you optimize dwell time? Here is what dwell time is and how it figures into Google's rankings.

Lately, the chatter on social media platforms has prioritized the importance of dwell time.

What is Dwell Time?

The concept of dwell time was first introduced three years ago, in a blog post written by Duane Forrester at Bing.

In simple words, dwell time is the length of time a person spends looking at a webpage after they've clicked a link on a SERP, but before clicking back to the search results.


D.T is often confused with session duration. Session duration is the total amount of time a visitor spends on a website before leaving the site. Dwell time tells a different tale.

Also, keep in mind that this is different from click-through rate.

Let’s Understand Dwell Time

  • For example, let's say you search for "pancakes" in Google.
  • The first result shown on the SERP seems appropriate, so you click on it.
  • When you land on the site, it looks displeasing and difficult to navigate, and the content isn't helpful.
  • After seven seconds on the page, you click back to the results.
  • Your dwell time was seven seconds.

Average D.T = (total time all users spent before clicking back to the results) ÷ (number of users who clicked back to the results)
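In code, the formula above is just a mean over the users who clicked back (assuming you somehow had per-user dwell measurements in seconds):

```python
def average_dwell_time(dwell_seconds):
    # Total time spent before clicking back to the results,
    # divided by the number of users who clicked back.
    if not dwell_seconds:
        return 0.0
    return sum(dwell_seconds) / len(dwell_seconds)

# Five users who returned to the SERP after 7, 30, 12, 45, and 6 seconds.
print(average_dwell_time([7, 30, 12, 45, 6]))  # 20.0
```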

Can You Measure Dwell Time in Google Analytics?

You will first have to measure dwell time if you are thinking of increasing it. Google Analytics doesn't report dwell time directly, but you can approximate it:

In Google Analytics, focus on the bounce rate and average time on page metrics. You will want to decrease the bounce rate and increase time on page.

Together, those indicate that your site's visitors are spending more time on a page before bouncing off, which is the closest proxy you have for dwell time.

Does dwell time affect your rankings?

On this question, every marketer has a different point of view, but there is some evidence to support the idea that Google considers dwell time as some sort of ranking signal.

Steps to Increase D.T

  • Optimal use of Ads

Google officially announced in 2017 that it would penalize mobile sites containing intrusive pop-up ads. Let's embrace the fact that ads are often annoying, which results in shorter dwell times on your website. If you remove intrusive ads, your dwell time will likely improve.

  • Right Keywords Targeting

You must understand that driving a relevant audience plays a vital role in increasing D.T, and this can be done only if you're targeting the right keywords. You can use tools like the Keyword Magic Tool provided by SEMrush.

  • Page Load Time Improvement

Slow websites are tedious and frustrating for any of us, especially when browsing on mobile. Tools like Pingdom, GTmetrix, and Google PageSpeed Insights can help you measure and improve load times; aim for outstanding speed.

A case study by Radware shows that every single second of delay impacts bounce rate, conversion rate, cart size, and page views. A one-second delay can make a huge difference in user engagement on a website.

  • Better Use of Internal Linking

An excellent internal linking strategy will help your SEO and, along with that, create a better user experience. It builds a natural access structure that guides visitors through more of your pages and ultimately increases your page views.

At the same time, it increases your chances of placing links to related pages that visitors will want to explore further. This one is my favorite strategy for increasing time on site.

  • Attractive Design of Your Website

You have to make sure that your website design is flawless, easy to navigate, and straightforward. Good web design helps you in SEO, and your users stay on your site, which will improve your dwell time.

Conclusion

Google keeps focusing more on excellent user experience, making sure it shows the most effective results for every search. And as Neil Patel claims, dwell time is less of a science and more of an art.

Rahul Setia

Founder at

A competent professional with more than 8 years of experience in digital marketing, Facebook ads, AdWords, LinkedIn ads, email marketing, SEO and internet marketing, ATL, BTL, and lead generation campaigns.

Is Google E-A-T Actually a Ranking Factor? – Whiteboard Friday


Many SEOs agree that showing expertise, authority, and trustworthiness in your site content is important to ranking well. But why is that, exactly? Is it because Google E-A-T is an actual ranking factor, or is it something else? In this episode of Whiteboard Friday, Cyrus Shepard explores whether it can be considered a true ranking factor, making your E-A-T goals SMART, and how to communicate it all to curious stakeholders.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. Welcome to another edition of Whiteboard Friday, coming to you from my home where I am wearing a tuxedo, wearing a tuxedo in hope that it exudes a little bit of expertise, perhaps authority, maybe even trust.

Yes, today we are talking about Google E-A-T, expertise, authority, trust, specifically asking the question, “Is Google E-A-T actually a ranking factor?”

Now surprisingly this is a controversial subject in the world of SEO. Very smart SEOs on both sides of the debate. Some SEOs dismiss E-A-T. Others embrace it fully. Even Googlers have different opinions about how it should be communicated. I want to talk about this today not because it’s a debate that only SEOs care about, but because it’s important how we talk to stakeholders about E-A-T and SEO recommendations. Stakeholders being clients, website owners, webmasters.

Anybody that we give an SEO recommendation to, how we talk about these things is important. So I don’t want to judge. I don’t want to be the final say — that’s not what I’m attempting — about whether Google E-A-T is an actual ranking factor. But I do want to explore the different viewpoints. I talked to dozens of SEOs, listened to Googlers, read Google patents, and I found that a lot of the disagreement comes not from what Google E-A-T is — we have a pretty good understanding what Google E-A-T actually does — but how we define ranking factors.

Three ways to define “ranking factors”

I found that how we define ranking factors falls into roughly three different schools of thought.

1. Level 1: Directly measurable, directly impact rankings

Now the first school of thought, this is the traditional view of ranking factors. People in this camp say that ranking factors are things that are directly measurable and they directly impact rankings, or they can directly impact rankings.

These are signals that we’re very familiar with, such as PageRank, URLs, canonicalization, things that we can see and measure and influence and directly impact Google’s algorithm. Now, in this case, we can say Google E-A-T probably isn’t a ranking factor under this definition. There is no E-A-T score. There’s no single E-A-T algorithm. As Gary Illyes of Google says, it’s millions of little algorithms. So in this school or camp, where things are directly measurable and impactful, Google E-A-T is not a ranking factor.

2. Level 2: Modeled or rewarded, indirect effects

Then there’s a second school of thought, almost equal to the first school of thought, that says Google’s algorithm is sufficiently complex that we don’t really know all the direct measurements, and in these days it’s a little more useful to think of ranking factors in terms of what is modeled or rewarded, things with effects that are possibly indirect.

Now this really came about during the days of the Panda algorithm in 2012, when Google started using much more machine learning and eventual neural networks in its algorithm. To give you a brief overview and to grossly oversimplify, Panda was an algorithm designed to reduce low-quality and spammy results in Google search results.

To do this, instead of using directly measurable signals, they used machine learning. Again, to grossly oversimplify (Britney Muller has a great post on machine learning, which I'm going to link to if you're interested): what they did was take sites that they wanted to see more of in Google search results, sites like the New York Times, chosen based on certain qualifications, like did they think the site was well-designed, would you trust it with your credit card, does it seem like it's updated regularly and written by real authors, and they put these in a bucket.

Instead of giving the algorithm direct signals, they told the machine learning program, “Find us more sites like this. We want to reward these sites.” So in this bucket, ranking factors are things that are modeled or rewarded. People in this school of thought say, “Let’s just go after the same thing Googlers are going after because we know those things tend to work.”
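A deliberately tiny sketch of that "find us more sites like this" idea, using a nearest-centroid rule in plain Python. The feature names and numbers below are invented, and real systems use neural networks over vastly richer signals:

```python
# Each site is a vector of hypothetical quality features:
# [design_score, credit_card_trust, updated_regularly]
seed_good = [[0.9, 0.8, 0.9], [0.8, 0.9, 0.8]]   # sites to see more of
seed_bad = [[0.2, 0.1, 0.3], [0.3, 0.2, 0.1]]    # spammy, low-quality examples

def centroid(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

good_c, bad_c = centroid(seed_good), centroid(seed_bad)

def looks_high_quality(site):
    # Reward sites that sit closer to the "good" bucket than the "bad" one.
    return distance(site, good_c) < distance(site, bad_c)

print(looks_high_quality([0.85, 0.7, 0.9]))  # True
print(looks_high_quality([0.1, 0.3, 0.2]))   # False
```

Notice that no single feature is "the" signal the model rewards; the bucket of examples defines what good looks like.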

Algorithms that fall in this bucket are like Panda, site quality, and E-A-T. In this school of thought, yes, E-A-T can be considered a ranking factor.

3. Level 3: Any quality or action, direct or indirect effects

Then there’s even a third school of thought that goes further than these two, and this school of thought says any quality or action that could increase rankings should be considered a ranking factor, even if Google doesn’t use it in its algorithm, direct or indirect.

An example of this might be social media shares. We know that Google does not use social media shares directly in its algorithm. But getting your content out in front of a large number of people can lead to links and shares and eventually more traffic and rankings as those signals roll downhill.

Now it may seem kind of crazy to think that anyone would consider something a ranking factor if Google actually didn’t consider it a ranking factor directly in its algorithms. But if you think about it, this is often the way real-world business scenarios work. If you’re the executive of a company, you don’t necessarily care if Google uses it directly or not. You just like seeing the end result.

Other examples might be, aside from social media, bounce rate and long clicks. TV commercials are an excellent example. If you ran a Super Bowl commercial and you're the CEO of a Fortune 500 company and you know that it's going to lead to increased rankings and traffic, you don't necessarily care if it's a direct impact or an indirect impact.

So those are the schools of thought, and I’m not here to judge any of them. But what I think is important is how we communicate recommendations to stakeholders.

Use SMART goals to communicate SEO recommendations to stakeholders

When we give SEO recommendations in our audits or whatnot, the standard I like to use is I like to think of it in terms of goals.

A framework for goals that I like to use is the SMART system of goal making, meaning goals should be specific, measurable, actionable, relevant, and time-based. Now in the traditional view of ranking signals, yes, specific and measurable are great because we’re using direct impacts.

But with E-A-T, the signals get a little squishier, and it's hard to translate them into specific, measurable goals, and I think that's why people in this camp don't like considering E-A-T a ranking factor. To illustrate this, Bill Slawski, the Google patent expert, recently shared a patent that he thought might possibly be related to E-A-T.

We don't know if Google uses it or not. But the patent used website representation vectors to classify sites. Now that's a mouthful, but basically the patent's goal was to determine the actual expertise of websites based on those vectors. For example, it could determine, through machine learning and neural networks, whether a website is written by actual experts, say medical doctors, or by medical students, laypeople, or somebody else.

It can do that for any type of site, whether medical, law, or finance, and classify its expertise. In this sense, if Google wants sites within the medical sphere to be like the Mayo Clinic, and they are rewarding sites that are like the Mayo Clinic, that is really hard to fix and almost impossible to fake with these kinds of sophisticated algorithms. So it's very hard to give SEO recommendations based on something like this.

What you really have to do, if you want to dive in, is start finding where the differences are between your site and the sites that are actually ranking. Marie Haynes, another SEO who thinks a lot about E-A-T, puts it this way in an interview with Aleyda Solis that I'm also going to link to. It's an excellent video.

Thank you, Aleyda, for doing that. She says it's about finding the gaps. But back to Lily Ray, because I'm getting sidetracked here. Lily Ray is one of the few SEOs who has done really good research into E-A-T by comparing sites that have been rewarded with sites that have fallen in rankings and seeing what some of the differences are. Her research has found some really interesting things.

For example, for medical queries, sites that lost rankings had 433% more CTAs (calls to action), typically because they were selling something or trying to sign you up, which is a bit of mixed intent, whereas the expert sites had fewer CTAs. The winning sites were written 258% more often by real experts, as opposed to laypeople or people without advanced degrees.

The losing sites had 117% fewer affiliate links, which could be something like this algorithm at work. Either way, we start to identify what's actually being rewarded. Again, this is hard to fix or fake, but we can start to fill in the gaps. So the question is, how do we make these recommendations specific, measurable, and actionable?

Measurable is especially hard when we're talking about things like expertise and authority. Fortunately, a lot of these problems were already solved back when Panda was released in 2011. If you want to make these more nebulous, squishy things measurable and actionable, you have to start measuring them the same way Google does: with panels of people, like the thousands of quality raters Google employs across the globe to look at sites and rate them.

Those ratings aren't used directly in Google's algorithm; they're used to test the algorithm. But you can start to score sites on a deliberate scale. You can use things like the Quality Rater Guidelines or the E-A-T questions that Google has released, a list of questions like: Is this site written by an expert?

Would you cite this site if you were writing an academic paper? Questions like that. You get a group of people, maybe 5 or 10 or more, ask those questions about your client's site, and compare the answers to the expert sites that are winning, and you can start to see where the gaps are. Maybe your site only scored a 5 on a scale of 10 for appearing to be written by experts.

By assigning values, using panels of questions, and scoring, you make it specific, measurable, and actionable. That's how you do it. It doesn't pay to give nebulous recommendations such as "improve your E-A-T." I know of one SEO consultant who says E-A-T is meaningless, and he is definitely in the camp that signals should be measurable.
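To make the panel-scoring idea concrete, here is a minimal sketch. The questions, the 1-to-10 ratings, and the two-point gap threshold are all invented for illustration; a real audit would use the Quality Rater Guidelines questions and a larger panel.

```python
def score_site(ratings_by_question):
    # Average each panelist's 1-10 rating per quality-rater-style question.
    return {q: sum(r) / len(r) for q, r in ratings_by_question.items()}

def find_gaps(your_scores, competitor_scores, threshold=2.0):
    # Questions where the competitor outscores you by more than `threshold`.
    return [q for q in your_scores
            if competitor_scores[q] - your_scores[q] > threshold]

your_site = {
    "Written by an expert?": [5, 4, 6, 5, 5],
    "Would you cite it in a paper?": [3, 4, 2, 3, 3],
}
mayo_like_competitor = {
    "Written by an expert?": [9, 8, 9, 9, 8],
    "Would you cite it in a paper?": [8, 9, 8, 8, 9],
}
gaps = find_gaps(score_site(your_site), score_site(mayo_like_competitor))
print(gaps)  # both questions show a gap of more than 2 points
```

The output is a prioritized, measurable gap list, exactly the kind of specific recommendation the SMART framing asks for, instead of a vague "improve your E-A-T."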

His point is that E-A-T is meaningless because it could mean anything you want. If you tell your clients to improve E-A-T, you could mean anything: improve your links, write better content, hire some experts. Instead, you've got to make it measurable and actionable. No matter what camp you're in, that's the way you want to go. All right.

I hope you enjoyed this Whiteboard Friday. Hopefully, it sparks some conversation. If you enjoyed it, please share. Thanks, everybody. Bye-bye.

Want more Whiteboard Friday-esque goodness? MozCon Virtual is where it’s at!

If you can’t get enough of Cyrus on Whiteboard Friday, don’t miss his top-notch emcee skills introducing our fantastic speakers at this year’s MozCon Virtual! Chock full of the SEO industry’s top thought leadership, for the first time ever MozCon will be completely remote-friendly. It’s like 20+ of your favorite Whiteboard Fridays on vitamins and doubled in size, plus interactive Q&A, virtual networking, and full access to the video bundle:

Get my ticket + the video bundle for $129!

We can’t wait to see you there!

How Much Time Should You Be Spending on the Google Algorithm Update?

The search engine’s May 2020 update has caused alarm among some business owners. Here’s what you need to know about it.

Opinions expressed by Entrepreneur contributors are their own.

On May 4, a day filled with nerdy Star Wars jokes, Google's public search liaison Danny Sullivan made an important announcement via Twitter: "Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the May 2020 Core Update. Our guidance about such updates remains as we've covered before. Please see this blog post for more about that."

— Google SearchLiaison (@searchliaison) May 4, 2020

Predictably, online business owners everywhere began fretting about their page rankings and worrying about how their website content would fare. As it turns out, the word of the day, perhaps of the year, is "volatility."

Related: 4 Ways You Can Use Google Hangouts to Optimize Efficiency

Volatility is the best word to describe pretty much everything about 2020. Unemployment, stay-at-home orders, the health of our businesses, the stock markets, the news cycle, and yes…Google’s May 2020 Core Update. And as it turns out, all these things are wrapped up together, all mutually interdependent.

Later in the day he made the announcement, Danny Sullivan confirmed that the update was underway.

The May 2020 Core Update is now rolling out live. As is typical with these updates, it will typically take about one to two weeks to fully roll out.

— Google SearchLiaison (@searchliaison) May 4, 2020

But the two-week rollout seems to be one of the few things that make this update typical for Google. What makes it particularly volatile?

First – and I’m counting on readers to let me know if I have this wrong – I don’t recall Google updates that take into account a specific event like a global health crisis. Although Google hasn’t specifically tied its 2020 Core Update to the current situation, Danny Sullivan did tweet earlier on May 4, 2020 that the search engine was seeing radically different user behavior and needs than it previously had. In April, Google started changing the way it displayed some crisis-related search results, so clearly it’s poised to quickly deliver the results users need and want.

Second, anytime Google rolls out an algorithm update, there’s volatility in search results, partly because the rollout happens over a roughly two-week period, and partly because SEO professionals labor mightily to ensure their clients’ websites still get traffic.

Related: Google’s New Algorithm Update Means New SEO Best Practices For 2020

So wacky search results tend to be the norm for Google updates. The May 2020 Core Update, though? Epic volatility. In fact, SEMrush analyzed the volatility for the new update compared to the January 2020 one. Its conclusion: “While January’s core update only led to average volatility of 8 points, on May 6, almost every category showed peaking volatility rates — from 9 to 9.4 points. So, the May core update appears to be much stronger and influencing more SERPs and positions.” The SEMrush article goes on to list the industries most affected by the update, among them travel, real estate, news and health.

Despite the ranking shakeup that’s happening, though, SEMrush has clear advice: “Something to keep in mind with these updates, you need to give this update time to complete before panicking.”

So we know the world’s in flux, and the new normal seems to change as the wind blows. But what does that mean for your business, your website and the changes you need to make to ensure your company weathers the storm in the May 2020 Core Update’s wake?

Related: Former Google Exec: ‘Don’t Be Evil’ Motto Is Dead

Well, for starters, you shouldn't pull a "LinkedIn." Though there's no direct comment from either Google or LinkedIn that definitively explains why the social media site didn't appear in Google searches for roughly 10 hours on May 6, Search Engine Journal speculates LinkedIn came to be de-indexed from Google either by blocking Google's crawlers or by removing the HTTP version of the site. Don't do that!
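For reference, the classic way a site accidentally blocks Google's crawlers is a blanket disallow in robots.txt. This fragment is a generic example of what not to deploy, not LinkedIn's actual configuration (which was never published):

```
# robots.txt — do NOT ship this to production:
# it asks all crawlers, including Googlebot, to stay out,
# which over time removes your pages from Google's index.
User-agent: *
Disallow: /
```

A safer pattern is to disallow only specific private paths and leave everything else crawlable.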

What should you do to make sure the update doesn't negatively affect your page ranking? Follow the advice Danny Sullivan linked to in his tweet announcing the update. The Google Webmaster blog explains: "We suggest focusing on ensuring you're offering the best content you can. That's what our algorithms seek to reward."

We may feel like life’s moving too quickly for us to keep up. Everything feels volatile, and that’s scary. But it may be helpful to realize that uncertainty isn’t a modern problem. After all, as Greek philosopher Heraclitus wrote, “change is the only constant in life.”

What do you need to do to navigate the Google May 2020 Core Update? Exactly what you’ve been doing. Provide fresh, authentic, useful, relevant content that answers questions and gives vital information about your business.

Related: How Google Analytics Help Small Business Owners to Make Better Business Decisions

SEO Needs to Be Part of Your PR Strategy

No, PR and SEO are not the same thing. But for best results, they belong together.


Interestingly, although I lead a PR company, we don’t do SEO. However, I strongly consider SEO know-how one of the most essential advantages a PR business can have. The guiding rule of PR, in my estimation, is to provide relevant and value-add information for the people who want to receive it.

The job of SEO, then, is to direct that content to ensure it is seen. So in that respect, PR and SEO are best friends. As my friend Dan Posner, business development lead for Big Leap, likes to put it: “PR and SEO are like diet and exercise. Why would you attend to one without the other?” I agree.

Since Google updated its core algorithm in September, SEO practices have shifted yet again. And as usual, SEO experts are pivoting to adjust. The Google game is getting more sophisticated and abstract, with forces like BERT and ‘Entities’ making the algorithm ever more intelligent.

Too often, companies are still approaching SEO from a utilitarian perspective, treating it simply as a way to harvest clicks. But forward-thinking CEOs are supporting and empowering their Chiefs of SEO, because that link on Google is often prospects’ very first point of brand contact.

Considering that search engine results pages (SERPs) will provide your introduction to many of your future customers, clients and partners, it’s time to start thinking of SEO as an integral part of your PR strategy. Put on your best public face, first by ranking, and then by providing a clickable link that converts. Here’s why this crucial point of contact is good for more than inbound sales.

Related: 10 Fundamentals to Understanding SEO

Stop chasing clicks and build a more authoritative presence

Google has been putting more effort into keeping people on the SERP instead of just clicking through. If you can earn a snippet or an answer box, you’ll be the one who answers many users’ questions. You may not get the clickthrough, but you’ve still won, by positioning yourself as an authoritative source in the conversation. This can be true even for a high ranking link without a snippet or an answer box.

“A high SERP ranking establishes you as a trusted authority,” says Guy Sheetrit, CEO of Over the Top SEO. “It’s like being quoted in the Times (or the Post. Or even the Journal). Users will automatically associate higher ranked companies with being ‘top of their class.’”

Sheetrit says smart SEO now means putting less emphasis on clickthroughs, and more on brand presence. Of course, this makes it trickier to measure ROI. But the long term payoffs of being an established authority are worth it, according to him.

Google doesn’t want to rank you for free

The SEO game was different when Google pages consisted of the classic "10 blue links," meaning the top 10 organic search results, with ads peppered off to the side. But there's no use pining for the good ol' days.

According to Moz Data Scientist Pete Meyers, only 3 percent of Google SERPs now feature 10 blue links. Google pages du jour are dominated by ads, answer boxes, snippets, PAA question boxes and other flotsam. “This is not the exception,” he says. “It’s our new reality.”

Most people focus a majority of their effort on the high-volume terms that face heavy competition and are plastered with paid ads in the search results. But the people searching for the topics at this end of the spectrum tend to be further along in the buying process, which means they’ve probably already interacted with many of your competitors.

Jeremy Knauff, founder of Spartan Media and contributor to Search Engine Journal, believes there’s a better approach.

“When you create and optimize your content to reach people earlier in the buying process, you’ll have a better chance of forming a relationship with them before your competitors do,” he says. “An added benefit is that it’s generally a lot faster and easier to rank for these types of long-tail terms.”

That’s not all bad. It just means there’s a price for ranking consistently. If you want to be front and center on the SERP, above the fold and ready to make a good impression, you can buy the privilege. Whether it’s worth the price is a question for your marketing team. So make sure marketing and PR are collaborating closely with your SEO experts on shared goals and strategies.

Related: 7 Reasons You Should Stop Managing Your SEO and Hire a Pro

Shore up your reputation with best practices

According to Google, three out of four smartphone users go to search engines first to address their immediate needs. There are 3.5 billion smartphone users worldwide, so that makes for roughly 2.625 billion people who search mobile first. The importance of optimizing for mobile cannot be overstated.

Consider this principle as well: the best defense of a strong reputation is to proactively make sure you are "on the record" about who you really are and what you really stand for before a crisis occurs. In many kinds of crises, your attorneys or company policy may not allow you to respond directly, so the evidence that already stands may be, for a period of time, your best (or even your only) response.
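The mobile arithmetic above is easy to verify:

```python
smartphone_users = 3.5e9       # worldwide smartphone users, per the article
search_first_rate = 3 / 4      # "three out of four" go to search engines first
mobile_first_searchers = smartphone_users * search_first_rate
print(f"{mobile_first_searchers / 1e9:.3f} billion")  # 2.625 billion
```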

In addition, you should follow the most current SEO and PR best practices:

●      A secure website (https://) is now expected, not a differentiator.

●      Optimize for voice search so Alexa and Siri are on your side.

●      Quality text content should be longer than 2,000 words for optimal results.

●      Quality video content should quickly answer users’ questions.

●      As always, optimize for the best possible mobile UX!
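The 2,000-word guideline in the list above is easy to turn into a rough editorial check. The tokenizing regex and the threshold here are illustrative assumptions, not an official rule:

```python
import re

def word_count(text):
    # Count word-like tokens, ignoring punctuation-only fragments.
    return len(re.findall(r"[A-Za-z0-9'-]+", text))

def meets_long_form_target(text, target=2000):
    # The article's rule of thumb: long-form text content of 2,000+ words.
    return word_count(text) >= target

draft = "word " * 1500
print(meets_long_form_target(draft))  # False: 1,500 words falls short
```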

Following these tips will get you a ranking that leads others to conclude that you’re at the top. More than clickthroughs, more even than conversions, strategic SEO will make you seem like an authority in your field. Don’t allow yourself to get lost in the technical end of landing SERPs and winning snippets. Allow your SEO team to fret over those. Just make sure the team you choose is aligned with your best PR thinking, to allow the full force of your marketing effort to propel you to the top of the heap.

Related: 6 SEO Marketing Trends in 2019 That Entrepreneurs Should Know …

How to do Keyword Research for SEO

Keyword research is the backbone of your online presence. It’s common knowledge that this practice determines your rankings and visibility in the organic search. However, the way keyword research is done has changed over time. Today, using Google’s Keyword Planner to find phrases with a high search volume and optimizing your site for them is not enough.

To rank higher, you need to focus on improving user experiences and understanding the intent behind their searches. Most importantly, you need to create quality content around those keywords. These practices will not only improve your online exposure but also help you drive relevant traffic to your site and convert it into leads and sales.

Here are the benefits of keyword research and some key steps you should take when performing it.

What is Keyword Research?

Keywords are the phrases searchers type into search engines to find the right content. For example, if someone wants to make a face mask, they will google something like "face mask diy" or "homemade face mask."

Your goal is to identify the keywords your searchers use and optimize your website pages for them to make those pages more visible in the SERPs. That's what we call keyword research.

How Keyword Research Worked in the Past

The approach to researching keywords has changed over time. Ten years ago, Google didn't know what kind of results its searchers expected to see. Back then, its crawlers would scan the page and look for keywords similar to the search query. In other words, it was enough for marketers to use Google Keyword Planner to identify keywords with the highest search volume and stuff their content with them.

To prevent keyword stuffing and provide users with more relevant and quality content, Google introduced several algorithm updates. Some of the most pivotal algorithm updates that impacted keyword research are:

  • Google Panda (2011) – penalizes low-quality and duplicated content.
  • Google Penguin (2012) – penalizes sites with spammy keywords.
  • Google Hummingbird (2013) – focuses on the improvement of the semantic search and search intent.

The Importance of Google RankBrain

In today’s era of personalized user experiences, Google is striving to understand the context behind users’ searches. Its goal is to know the search intent and provide each user with the right content. This is where it resorted to artificial intelligence and, consequently, RankBrain was rolled out. 

RankBrain is Google’s artificial intelligence component that helps the search engine understand search queries. It uses machine learning, helping Google understand how searchers interact with search results and what the intent behind their search is. 

Keyword Research in 2019 and Beyond

With the rise of Google's algorithm updates and RankBrain, keyword research has evolved as well. It's no longer just about identifying profitable keywords and stuffing your website with them. Instead, keyword research has become more topical and natural. It now focuses on understanding the idea behind search queries, and it depends on numerous factors.

Benefits of User-Centric Keyword Research

This way of researching keywords benefits your SEO strategy in multiple ways, including:

  • Gaining a Better Understanding of your Target Audience

As I've already mentioned, keyword research is essentially the process of identifying the phrases your target audience uses. By finding the most popular keywords in your niche, you can also understand who your audience is, what kind of content they want to see, what their needs are, and so forth. This data will help you create more reliable buyer personas and meet your customers' needs.

  • Tightening your Content Strategy

Keyword research allows you to discover popular topics in your industry. Covering them will increase your industry credibility, make your content more engaging and attention-grabbing, and, logically, boost your rankings. This is where both a basic Google search and content discovery tools can help you a lot.

  • Measuring Brand Sentiment

When building an online brand, you want to know what people think and feel about you. In other words, you want to measure your brand sentiment. Is it positive, negative, or maybe neutral? This is where, again, keyword research can help. Namely, it allows you to track your searchers’ activities and see what kind of phrases they use to find out about you.

  • Monitoring Competitors

When conducting keyword research, you can always focus on your competitors. By knowing what keywords they’re ranking for, you will gain a better understanding of your industry, “copy” their keyword targeting strategies that may work for you, as well as find effective ways to stand out and compete against them. 

Make a List of Relevant Topics and Turn Them into Keyword Ideas

The first step towards identifying your keywords is to create a detailed list of topics relevant to your industry and business. For example, if you’re running a sports blog, you will write about different types of sports and these are your topics.

Short-Tail Keywords Vs. Long-Tail Keywords

Keep in mind that these phrases are not your keywords. They are just some broader topics that will guide you and help you focus on the right phrases during your keyword research. Your goal is to narrow them down into more specific phrases. 

Say you're writing about cycling. You don't need to be an SEO expert to understand that one of the keywords you would like to rank for is "cycling." This is a short-tail keyword. Such keywords were extremely popular a while ago, as they have an immense search volume. However, when targeting highly popular keywords, the competition is also fierce. Therefore, if your blog is new in the digital marketing ecosystem, chances are you won't rank high for such keywords.

Still, when you dig deeper, you will see that there are many natural and organic keywords people search for. Some of them are, for example, “health benefits of cycling,” “how many calories cycling burns,” “best mobile apps for cyclists,” etc. These are examples of long-tail keywords that usually consist of more than 3 words.

Even though they have a lower search volume, they also have fewer competitors, meaning that you have greater chances to rank high in the SERPs. Also, they’re more specific, helping you attract the right traffic to your site and convert it into leads and customers. Above all, they give you the opportunity to create more user-friendly posts, where your keywords would be inserted organically and naturally.
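The short-tail versus long-tail distinction above boils down to word count. Here is a minimal sketch using the article's "more than 3 words" rule of thumb; the threshold is just that rule, not a standard:

```python
def classify_keyword(phrase, long_tail_min_words=4):
    # The article treats phrases of more than 3 words as long-tail.
    return "long-tail" if len(phrase.split()) >= long_tail_min_words else "short-tail"

keywords = [
    "cycling",
    "health benefits of cycling",
    "how many calories cycling burns",
    "best mobile apps for cyclists",
]
for kw in keywords:
    print(kw, "->", classify_keyword(kw))
```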

Know what Keywords You’re Currently Ranking For

One of the first steps to take during your keyword research is to see what keywords you’re already ranking for. This lets you see what keywords work or don’t work for you and, in this way, improve your rankings. 

Numerous SEO tools can help you here. For example, with SEMrush, you just need to enter your page URL and you will see what keywords it ranks for. Ahrefs automatically detects the keywords your site is ranking for, and both AccuRanker and Ahrefs let you analyze your keywords based on a particular location. You could also enable Search Console data in Google Analytics, which will help you observe your keyword phrases, clicks, and average position on Google.

Sure, you can also augment keyword tracking by automating your data tracking efforts. There are many reporting tools that let you create customized dashboards, where you would combine the features of the tools you’re already using to gain better results. However, to get relevant results, you need to track your data strategically. The Guide to Making High-Quality Marketing Reports emphasizes that understanding audiences, setting straightforward goals, and aligning them with the right KPIs and metrics is key to creating functional dashboards.  
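Once you export ranking data (for example, rows from a Search Console report), aggregating it per keyword is straightforward. The queries and numbers below are made up to show the shape of the calculation:

```python
from collections import defaultdict

# Hypothetical rows in the shape of a Search Console export:
# (query, clicks, average position on that day)
rows = [
    ("cycling app", 40, 6.2),
    ("cycling app", 55, 5.8),
    ("best cycling shoes", 12, 11.4),
    ("best cycling shoes", 9, 10.6),
]

totals = defaultdict(lambda: {"clicks": 0, "positions": []})
for query, clicks, position in rows:
    totals[query]["clicks"] += clicks
    totals[query]["positions"].append(position)

for query, stats in totals.items():
    avg_pos = sum(stats["positions"]) / len(stats["positions"])
    print(f"{query}: {stats['clicks']} clicks, avg position {avg_pos:.1f}")
```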

Powerful Sources of Keywords

There are numerous tools and tactics you could use to identify keywords. However, you should also focus on those that also help you understand your audiences and their search intent. Here are a few tactics to start with.

Google Keyword Planner

This is certainly the first (and the most popular) choice of online marketers. Even though keyword research has become more complex and many advanced keyword research tools have been created, you could still use GKP as the foundation of your keyword research.

To use GKP, you need to have a Google Ads account. Next, click Tools and go to the Keyword Planner tab. You have an option to “Find New Keywords,” where you should enter your seed keyword to get relevant keyword suggestions. 

Google Search

A basic Google search can serve as an outstanding source of relevant long-tail keywords. Its major benefit lies in the fact that it displays the keywords your audiences really search for. There are three features you could use:

  • Autocomplete

Google suggests completions in its search box as you type. As these recommendations are based on actual searches, they can serve as a treasure trove of potential keywords. For example, if you type "best cycling" into the search box and hit the spacebar, you will get suggestions like "best cycling apps," "best cycling shoes," "best cycling exercises," and so forth.

  • People Also Ask

The People Also Ask snippet is one of the valuable features provided by Google. Simply put, it displays questions people ask that are relevant to your keyword. These may help you understand the search intent and, above all, find some valuable long-tail and conversational keywords.

  • Searches Related To

This feature is similar to Autocomplete. You google a certain seed keyword and then scroll down to the "Searches Related To" area at the very bottom of the page. For example, if you google "cycling app," you will get suggestions like "best cycling app for android," "best cycling app in 2019," "live tracking cycling app," etc.
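Suggestion features like these can also be collected programmatically. Public suggest endpoints typically return a JSON array whose second element is the list of completions; the exact endpoint and response format vary and should be verified before relying on them, so this sketch only parses a sample response in that common shape, with an invented payload:

```python
import json

def parse_suggestions(raw):
    # Suggest-style responses are typically a JSON array whose second
    # element lists the completed queries for the seed term.
    payload = json.loads(raw)
    return payload[1]

# Hypothetical response for the seed "best cycling":
sample = '["best cycling", ["best cycling apps", "best cycling shoes", "best cycling exercises"]]'
print(parse_suggestions(sample))
```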

Answer the Public

This app works similarly to Google Suggest: it gives you a list of questions your customers ask. Using this tool is easy. You just need to enter your seed keyword and hit "Get Questions." Apart from the visualized data, you can also go to the "Data" tab, where you will find three major kinds of keyword ideas: question-based queries, long-tail queries, and related topics.


Reddit

Reddit is our everyday platform for connecting with people from all across the globe, asking questions, and providing valuable answers. As such, it may serve as an amazing source of keywords. The idea is simple: type your seed keyword in the search box and look for relevant subreddits. Then, seek out popular threads that have numerous comments. Such threads may contain lots of long-tail keywords you could use to understand the search intent.

YouTube Suggestions

YouTube is one of the world's largest search engines. Therefore, ignoring it during your keyword research means missing out on numerous awesome keyword opportunities. There are numerous searches behind each seed keyword you could type in. For example, if you search "makeup," you will see a bunch of potential keywords, such as "diy-makeup," "makeup transformation," etc.


Wikipedia

Wikipedia has a logical structure, where broader topics branch into more specific ones. For example, if you search for the "Exercise" topic, you will see a bunch of subcategories and suggestions, including "active living," "warming up," "exercise intensity," "anaerobic exercise," etc.

Keyword Analysis

Now that you’ve created a list of keywords, it’s time to analyze their relevance and effectiveness. There are three fundamental criteria to consider during the process of analysis:

      1. The popularity of a keyword – is its search volume high enough to deliver the expected results?

You could start by analyzing the data from the Google Keyword Planner, where you can easily see the search volume for each keyword. Google Trends is also a great tool to use, as it lets you see how the popularity of the keyword is changing over time. 

      2. Keyword difficulty – how difficult it is for you to rank for a certain keyword.

Logically, a higher keyword difficulty means that it will be harder for your site to rank for a certain phrase. This is where the quality and number of backlinks play an immensely important role. Many keyword research tools will calculate the keyword difficulty for you. When tracking this metric, keep one thing in mind: it is only a guide that provides generic insights. How well you rank also depends on your overall SEO efforts, the quality of your site, and the relevance of your content.

      3. Search intent – how relevant your content is to your target audience.

There are different types of search intent. The first is navigational, where a user searches for a specific site. The second is informational, conducted by users who want to learn something online. The third is transactional, performed by users who want to buy something online. Finally, there is commercial search, done by people who research brands and products before making purchases. Your goal is to understand the intent behind every keyword you use and optimize the right kind of content for it.
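One way to combine the popularity and difficulty criteria above into a rough shortlist is a simple scoring heuristic. The formula (penalizing difficulty quadratically) and all the volume and difficulty numbers are arbitrary illustrations, not any tool's actual model:

```python
def priority_score(volume, difficulty):
    # Toy heuristic: reward search volume, penalize difficulty heavily.
    # Volume is monthly searches; difficulty is on a 0-100 scale.
    return volume / 1000 * (100 - difficulty) ** 2

candidates = {
    "cycling": (250000, 95),
    "health benefits of cycling": (4500, 30),
    "best mobile apps for cyclists": (1200, 18),
}
ranked = sorted(candidates, key=lambda k: priority_score(*candidates[k]), reverse=True)
print(ranked[0])  # the long-tail phrase wins despite its lower volume
```

With these made-up numbers, the heavily contested head term loses out, which mirrors the article's point that newer sites often get more traction from long-tail phrases.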

Optimizing your Pages for Keywords

When optimizing your site for keywords, start by choosing focus keywords. These keywords illustrate your topic and appear in different elements of a page. They are usually exact-match keywords, and you should insert them into your title tag, meta description, headings, subheadings, and body content. However, don't stuff your pages with keywords. Always strive to add them organically, where it makes sense.
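A quick self-check for those placements can be scripted. The regexes here are a deliberate simplification (a real audit would use a proper HTML parser) and the page is an invented example:

```python
import re

def keyword_placements(html, keyword):
    # Check the on-page elements the article lists for a focus keyword.
    kw = keyword.lower()
    checks = {
        "title": re.search(r"<title>(.*?)</title>", html, re.S),
        "meta description": re.search(
            r'<meta\s+name="description"\s+content="(.*?)"', html),
        "h1": re.search(r"<h1>(.*?)</h1>", html, re.S),
    }
    return {element: bool(m and kw in m.group(1).lower())
            for element, m in checks.items()}

page = """<html><head><title>Homemade Face Mask Recipes</title>
<meta name="description" content="Easy homemade face mask ideas.">
</head><body><h1>Five Homemade Face Masks</h1></body></html>"""
print(keyword_placements(page, "homemade face mask"))
```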

When optimizing pages, don’t forget about the importance of long-tail keywords. In today’s era of voice searches, where searchers replace traditional keywords with conversational ones, you need to optimize your content for natural keywords, too. These keywords let you increase user satisfaction and engagement, as well as appear high in mobile and voice searches.

Most importantly, note that targeting the right keywords is pointless if you don’t create high-quality content around them. For example, when optimizing your blog, make sure your posts are informative, insightful, and authentic. They should keep users engaged and deliver value to them. Such content is more likely to rank high in the organic search, compared to spammy, keyword-packed articles. 

Over to You

Keyword research remains a crucial SEO strategy. However, the way we do it has changed significantly. In the past, exact-match keywords and keyword density were enough to rank high. Today, these practices can only compromise your online presence. To gain a competitive advantage and appear high on Google, you need to focus on aligning keyword research with search intent and, above all, to create quality content around these keywords.

Raul Harman

My name is Raul, editor in chief at the Technivorz blog. I have a lot to say about innovations in all aspects of digital technology and online marketing. You can reach out to me on Twitter.

Why Meta Tags Are Still Relevant to SEO – and the Best Way to Use Them

January 3, 2020

If you include quality keywords in your meta tags, Google is more likely to list your site higher in web search results.

4 min read

Opinions expressed by Entrepreneur contributors are their own.

Within the constantly shifting guidelines for what does and doesn’t boost a website’s SEO, meta tags typically don’t garner much attention. They don’t directly impact Google SEO rankings, but they can still impact a site’s SEO, so it’s important to understand how this happens, which meta tags matter and how to maximize them.

Here’s why meta tags can still boost SEO on your site.

  1. They affect Google’s indexing of your site. Meta tags help Google understand the content of your pages so that they can appear in relevant searches.
  2. Meta tags can boost your keyword rankings. When you include quality keywords in your meta tags, Google is more likely to list your site higher in web search results, which can help you get noticed.
  3. The right meta tags can influence user experience. Tags create an expectation for the user that your website should meet or, ideally, exceed. Meta tags can also make it easier for people to find exactly what they are looking for on your site, which reduces frustration.

Meta tags are especially relevant when you realize that 93 percent of online experiences begin with a search engine. If you want to increase your website views and SEO ranking, it’s a good idea to utilize meta tags.

Potential Meta Tags to Use

There are many places where you can embed meta tags on your site, but these five are starting points that can help provide the best ROI for your time and effort.

  1. Title Tags: These critical tags become the text displayed above the description of your page in the Search Engine Results Pages (SERPs) and serve as a preview of your website. Ideally, every page of your website should have a unique title tag.
  2. Meta Description: This is the text that appears below your title tag and should build on the information provided in the title tag. It serves as another way to draw people to your website and can be crucial for getting leads to click through to your listing.
  3. Canonical Tag: These tags tell search engines which version of a page is the preferred one. They are especially helpful if you have pages with nearly identical information, such as similar product pages. A canonical tag informs Google which page is the authoritative version so that you aren’t penalized for duplicate content.
  4. Alt Tag: This image optimization tag makes your images accessible to people and search engines. This is an underutilized but highly valuable meta tag that can help boost user engagement.
  5. Header Tag: These tags help you break up large pieces of content into smaller sections to improve user experience on your site. They can also help search engines better understand your information to appropriately index it, which can have a significant impact on your SEO rankings.

Best Practices for Meta Tags

To get the most benefit from meta tags, here are a few best practices to consider.

  1. Keep it short: Google’s meta description length is approximately 158 characters, according to Spotibo; on mobile devices, the maximum is around 120 characters. For title tags, Google typically displays 55 to 61 characters on a desktop computer, so you’ll likely want to write a tight, descriptive title that’s under 60 characters.
  2. Use keywords: Select one or two important keywords, then try to use them naturally in your tags rather than cramming in as many as you can. This is the first impression people will have of your site, so it’s important to make it a good one.
  3. Include branding: When writing your tag, be sure to include your logo, company name or website URL so that the information presented is clearly connected with your business.
  4. Use modifiers: Modifiers can help you describe your product or service in fewer words. Consider options such as: Best, Top, Buy, Easy, How To, Current, Review, Find, etc.
  5. Make each meta tag unique: Google recommends that all title tags be 100 percent unique on your site, and that’s a good rule of thumb to follow when creating all meta tags.
  6. Use the exact keyword: Don’t try to make a tag different by rephrasing a keyword into less-common wording because it likely won’t boost your keyword ranking and could even make people less likely to find you in a search.
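As a quick illustration of the length and uniqueness rules above, here is a minimal Python sketch. The limits and the `check_meta` helper are illustrative assumptions drawn from the guidelines in this article, not an official Google specification:

```python
# Illustrative length limits based on the best practices above.
TITLE_MAX = 60          # title tags: aim for under ~60 characters
DESCRIPTION_MAX = 158   # meta descriptions: ~158 characters on desktop

def check_meta(pages):
    """Flag pages whose title/description are too long or not unique.

    `pages` maps a URL path to a {"title": ..., "description": ...} dict.
    Returns a list of human-readable warnings.
    """
    warnings = []
    seen_titles = {}
    for url, meta in pages.items():
        title, desc = meta["title"], meta["description"]
        if len(title) > TITLE_MAX:
            warnings.append(f"{url}: title is {len(title)} chars (> {TITLE_MAX})")
        if len(desc) > DESCRIPTION_MAX:
            warnings.append(f"{url}: description is {len(desc)} chars (> {DESCRIPTION_MAX})")
        if title in seen_titles:
            warnings.append(f"{url}: duplicate title also used on {seen_titles[title]}")
        else:
            seen_titles[title] = url
    return warnings

pages = {
    "/home": {"title": "Acme Widgets | Buy Widgets Online", "description": "Shop the best widgets."},
    "/shop": {"title": "Acme Widgets | Buy Widgets Online", "description": "Browse our widget catalog."},
}
print(check_meta(pages))  # flags the duplicate title on /shop
```

A check like this is easy to run over a sitemap export before publishing, so duplicate or overlong tags are caught before Google crawls them.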

Why Discoverability Needs to Be an SEO Focus with Garrett Mehrguth [PODCAST]


Podcast: Download

Subscribe: Apple Podcasts | Android | RSS

For episode 170 of The Search Engine Journal Show, I had the opportunity to interview Garrett Mehrguth, CEO at Directive, an enterprise search marketing agency.

Mehrguth talks about the importance of focusing on your brand being discoverable for SEO and challenges some of the traditional sales funnel models.

Why is focusing specifically on your brand one of the most important aspects of your SEO right now?

Garrett Mehrguth (GH): I’m going to look at it in two different ways:

  • From a psychological perspective.
  • From a financial perspective.

And why both of these matter so much to in-house marketers or agency-side folks.

First and foremost, we all know the importance of brand. There’s a reason why people do Super Bowl commercials and run these very expensive ads to get awareness for their products or services.

Unfortunately, a lot of us have … become so obsessed with understanding the KPIs to conversions and the metrics that we’ve devalued how people actually make purchasing decisions.

And so what I mean by that is that the traditional marketing funnel right now, in my opinion, is broken when you run it on a cash model.

And so this is kind of that financial side of it. I would love to see in a cash business a 10% cost of sales ratio.

So what that means is if you think you can generate $100,000 in total contract value in a month, if you can do that with a $10,000 spend, including salaries, ad spend, and all of that stuff, you can now create a highly efficient model.

Now, the problem is that current marketing funnel doesn’t allow that. So let me talk from exact personal experience.

I’m blessed to work with some really good advertisers at Directive and they helped me with my stuff for our own advertising. In fact, we got to the point where we were converting at 60% from LinkedIn.

So it’s hard for me to say, “OK, my team could do better.” Right? If you get to 60% conversion rate on any platform, you’re already well past the benchmark.

Now the problem is marketing – especially SEO and PPC – is entirely timing-dependent. In other words, you cannot force someone to make a purchasing decision.

And that’s what makes SEO so powerful is because you can take the keywords they’re searching and then position your brand to be discovered when you know there’s purchasing intent.

Now the problem with a funnel and lead gen is there’s not purchasing intent. So when someone’s on LinkedIn, they’re not looking to purchase.

And so what happens is, even at 60%, let’s say our cost per acquisition for a lead was still around $17 on LinkedIn, but we have a 1% qualification rate – meaning that in a given one-month period, if we were to generate a hundred leads from LinkedIn, we were able to turn one of those into a proposal – and then we have anywhere between a 20% and 30% close rate.

Twenty percent means you’re doing well. If you’re above 30%, you should probably raise your rates.

But the problem is when you look at that funnel, if you want to get enough opportunities to hit a deal or revenue goal, you’re going to have to spend exorbitant amounts of money on your lead gen.
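To make the funnel math above concrete, here is a hedged back-of-the-envelope sketch using the figures Mehrguth quotes ($17 per lead, a 1% qualification rate, and a close rate taken at 25%, the midpoint of his 20–30% range; all numbers are illustrative):

```python
# Back-of-the-envelope funnel economics using the figures quoted above.
cost_per_lead = 17.00       # LinkedIn cost per acquisition for a lead
leads = 100                 # leads generated in one month
qualification_rate = 0.01   # 1% of leads become proposals
close_rate = 0.25           # midpoint of the 20-30% close range

spend = cost_per_lead * leads            # total monthly lead-gen spend
proposals = leads * qualification_rate   # proposals generated
expected_deals = proposals * close_rate  # expected closed deals per month
cost_per_deal = spend / expected_deals   # effective cost per closed deal

print(f"spend=${spend:,.0f}, proposals={proposals:.0f}, "
      f"expected deals={expected_deals:.2f}, cost per deal=${cost_per_deal:,.0f}")
```

At these rates, $1,700 of monthly spend yields only a quarter of a deal in expectation – roughly $6,800 per closed deal, and several months of spend per win – which is exactly the “exorbitant” lead-gen cost the interview is pointing at.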

Brent Csutoras (BC): And so where does that tie into the brand? Do you just bypass a lot of that by having brand exposure?

GM: You delete lead gen.

Imagine if you get rid of running white papers to generate into Marketo and try to nurture leads. Imagine if you just deleted all of that spend, OK?

Here’s what you can get for $250 on GDN. You can use it for in-market audiences, find people looking for exactly what you offer, and for $250 you’re able to generate 1.5 million impressions.

For the same amount, you could generate 10 leads with a 1% qualification rate and never get an opportunity.

What I found is, if you can get above the lead area and go to the brand, so I think the new funnel has to go brand, lead, opportunity, deal revenue. And if you can go above leads and go to brand, you’re going to see phenomenal results.

Just since launching this in the last month, we were able to have conversations with five enterprise brands. By the way, our advertising says the search marketing agency for enterprise brands…

I’m talking enterprise companies that we’ve never been able to talk to before, who fill out our form.

What changed?

We started to launch a brand campaign, and it was just a billboard, that drove awareness of what Directive was.

Now here’s the beautiful part. You can’t just have awareness.

Then you go to your SEO and your PPC strategy and you say, when someone searches for the products or services you offer, do you show up?

And now the difference is you’re the brand that has seven impressions on the ideal customer persona, while the others do not.

If you have your product market positioning right, you’re discoverable, plus you’ve built brand equity, now you have a powerful combination that drives incredible revenue at a lower cost per sale and allows you to have an efficiency in a cash model.

Are there any other low-hanging-fruit opportunities to increase your brand exposure?

GM: I would say podcast advertising. It’s actually good not only for brand but also for lead gen. It’s a captive audience, and we still generate amazing deals from podcast ads.

With GDN, how would you go about finding the sites?

GM: You find the software that your ideal customer persona uses and then you leverage their login pages. Now you know they’re an active user of the product and you run from there.

Why does discoverability really need to be one of the primary focuses of our marketing campaigns?

GM: I think it has to be the number one focus because it’s the one thing that’s going to increase close rate, lower time to close, and allow you to essentially create velocity within your sales department.

The reason I say that is because currently, if you’re not being creative with your SEO efforts, you are almost guaranteed, in any vertical, not to be able to rank when the timing is hottest or when the purchase intent is strongest.

At the very bottom of the funnel, when people start modifying the keywords they search with “top,” “best,” “reviews,” and other purchase-related modifiers, third-party review sites are showing up like crazy.

In the services business, you’ve got TopSEOs and all these players that have been there forever. Are they transparent? Are they good? That’s a whole other conversation.

Either way, they’re showing up number one and getting a 28% click-through rate, which is still 10 to almost 20x sometimes the volume you can get even from a search ad for the same keyword.

And so if you go search “top ERP software” – we talked about this before – no ERP software company is ranking, due to what I call the Yelp and Amazon effect.

This means when people are searching and they have purchase intent, they don’t want to hear how great you are from you, they want to go to what they are considering an unbiased source – whether that’s true or not – and they want to look at reviews.

If they look at reviews before they buy a $5 breakfast burrito, they’re sure as hell going to look at reviews before they buy a $250,000 software.

And so you have to take that same purchase reality of a consumer in 2019 and make sure that you’re being discovered when those users are looking online to make a purchasing decision.

What does it mean to be ‘discoverable’?

GM: I mean, if you see other websites ranking, that means you could rank – you just need to write a better, more authoritative piece, like traditional SEO. But that’s still one position out of 10.

And unless you’re number one, and if you’re outside of the top five, the click-through rate frankly won’t be substantial enough to generate any type of net new business unless it’s a very high volume term.

The second we stop thinking about websites and start thinking about brands, that’s when we become world-class search marketers.

To listen to this Search Engine Journal Show podcast with Garrett Mehrguth:

Visit our podcast archive to listen to other Search Engine Journal Show podcasts!

Image Credits

Featured Image: Paulo Bobita


Turning Google traffic into leads, and what’s new in SEO


Julian Shapiro

Julian Shapiro is the founder of, the growth marketing team that trains startups in advanced growth, helps you hire senior growth marketers, and finds you vetted growth agencies. He also writes at

More posts by this contributor

We’ve aggregated the world’s best growth marketers into one community. Twice a month, we ask them to share their most effective growth tactics, and we compile them into this Growth Report.

This is how you’re going stay up-to-date on growth marketing tactics — with advice you can’t get elsewhere.

Our community consists of 600 startup founders paired with VPs of growth from later-stage companies. We have 300 YC founders, plus senior marketers from companies including Medium, Docker, Invision, Intuit, Pinterest, Discord, Webflow, Lambda School, Perfect Keto, Typeform, Modern Fertility, Segment, Udemy, Puma, Cameo and Ritual.

You can participate in our community by joining Demand Curve’s marketing webinars, Slack group, or marketing training program. See past growth reports here and here.

Without further ado, onto the advice.

What are some new, advanced SEO strategies?

Our community ran an SEO masterclass in which we discussed Google’s algorithm updates and shared advanced practices for writing blog content in a data-driven manner.

Tactics for turning blog visitors into leads

Based on insights from Nat Eliason from Growth Machine.

SEO traffic can sometimes be a vanity metric if you’re not converting it into lead flow. Here are three ways to convert blog visitors into leads:

  1. Prompt blog readers with quizzes to help them identify the product/plan that’s best suited for them. Then require their email address to see results. Follow up with drip emails.
  2. Create “Buyer’s Guides” — downloadable PDFs with nice visuals that help readers figure out how to accomplish their goals (e.g. “paleo cooking starter kit.”) Again, require an email for them to download the complete guide.
  3. Pixel your blog visitors and retarget them with Facebook ads. Have the ads send visitors to landing pages that match whichever blog content category initially drew them to the site.

How to (re-)target business customers with Facebook ads

Based on insights from Nima Gardideh of Pearmill and Julian Shapiro of Demand Curve.

Most people use their personal email address on their Facebook/Instagram account. So if you’re collecting business emails during your user onboarding process, Facebook can have a hard time matching those emails to the corresponding Facebook profiles when creating custom targeting lists. 

Here are a few tricks around this: