Google has announced many algorithms over the years, some after much prodding by the SEO community who noticed a tectonic shift in the search results and demanded to know what happened. Yet most SEOs can’t demonstrate which algorithms Google actually still uses. According to my guest, there’s a big difference between theoretical robustness and practical application among all the algorithms that Google potentially uses. But in the end, it’s all about the result.
Kevin Indig leads SEO at Shopify. He’s also the creator of the Growth Memo newsletter and host of the Tech Bound podcast. Before Shopify, he ran SEO & Content @ G2 and Atlassian and helped companies like eBay, Eventbrite, Samsung, Pinterest, and many others grow their organic traffic.
In this episode, Kevin and I geek out about SEO tools, hacks and algorithms. Kevin compares SEO to medicine, in that both follow scientific principles to a high standard in order to discern what works from what doesn’t. We also discuss the importance of sharing our insights, observations, research, and data to advance the practice of SEO.
Without any further ado, on with the show!
In This Episode
- [00:20] – Stephan welcomes Kevin Indig to the show. Kevin leads SEO as a director at Shopify and is the person behind the Growth Memo newsletter and Tech Bound Podcast.
- [02:45] – Kevin discusses some algorithms and the practical application of LSI keywords.
- [10:26] – Kevin mentions the tools he considered great for identifying internal linking optimization.
- [12:40] – Stephan and Kevin express their opinion on Majestic as a tool.
- [17:33] – What is the idea of topical authority, and how useful is topical trust flow to an SEO practitioner?
- [22:58] – Stephan mentions one of the problems in the SEO industry and touches on the evil twin concept he learned from Andy Crestodina.
- [35:09] – Kevin shares his best SEO hack and discusses the correlation between unlinked mentions and the website’s overall authority.
- [40:41] – Stephan wants to know from Kevin the best AI-based tools and strategies to “outsmart” Google.
- [48:22] – How valuable is using log file analyzers such as the Screaming Frog Log Analyzer?
- [50:45] – Kevin mentions he is building an SEO library for Shopify customers. Visit Shopify’s website and Kevin Indig’s personal website to follow his newsletters.
Kevin, it’s so great to have you on the show.
Mr. Spencer, it’s awesome to be here.
I really enjoyed the time that we hung out together years ago, pre-pandemic, where we were being hosted by Semrush (formerly known as SEMrush) and hanging out in cool locales. That was really fun. Great to get to know you there and at a couple of different venues. I think we were in Helsinki and also not Barcelona, but where did we end up meeting?
I think it was Portugal. I didn’t make it there.
Oh, you didn’t make it to Lisbon?
Yeah.
Helsinki was amazing.
Yeah.
Anyway, thanks for coming. Thanks for being here. Why don’t we get started by geeking out a bit about some of the algorithms that have come out of Google and some of the things that SEOs think Google’s using but actually isn’t, like LSI and stuff like that? There’s BERT; there’s RankBrain.
There are all sorts of algorithm names that Google has put out, but then there are also things like TF-IDF, LSI, Topical Authority, and Dwell Time. Some of these, Google’s probably not even using. I’d love to distinguish those two groups for our listener or viewer.
Yeah, absolutely. It’s a topic I have written quite a bit about lately, and I’m passionate about it. One thing I want to call out is that, of course, we don’t know for sure what Google is using and what it isn’t. All we can do is speculate and look at patterns or statements by Googlers, but nobody knows for sure except for Google.
Anyway, that being said, there’s certainly a paradox for some things or algorithms like TF-IDF. First of all, let me explain the idea behind TF-IDF. The basic idea, without going too much into technicalities, is that Google will look at the frequency of a term in a document like a webpage, how often you mention a certain word, and compare that against the rest of its corpus, against all of the pages it has indexed, to understand what topic this specific page is most relevant for. To simplify it down 10 times, that’s really what it is all about.
The counterargument from people who say that TF-IDF is BS and not valuable is, hey, nobody has the full corpus. Nobody has all the web pages indexed, except for Google. You’ll never be able to really get to a specific value. I think that’s absolutely true, by the way. That’s a solid argument.
The counter-statement is that there is something about using TF-IDF or certain SEO editors that follow some of that methodology. Whenever I use such an editor, I always see better results than when I don’t. There’s a theoretical argument, and then there’s a practical argument.
My personal experience is that we might not understand the theory behind it, or it might not be logical, but in practical applications, I often see a better outcome when using that. Again, is it better to use TF-IDF or some other methodology? Who knows? Who cares? I don’t think that’s really where the magic lies.
There’s this kind of difference between practical and theoretical assumptions. That’s something we see across all of these algorithms that we think Google uses. It’s the same with LSI keywords. I’m not bullish on LSI keywords. I don’t think about them or LSI in general.
Another one could be pogo-sticking or dwell time. As SEOs, we just often see that when the user experience is bad, the page seems to rank worse and vice versa. You see good click-through rates, good dwell time, and lower bounce rate, and there seems to be something happening to rank as well.
It’s the same story where, theoretically, you could find all sorts of counterarguments for using dwell time. One is, let’s say somebody looks up soccer stats and just wants to see the score of the game. They come to a page, see the score, and leave after two seconds. Technically, that’s a good user experience, but it has low dwell time.
There’s a lot of context to dwell time, and it could be difficult to measure and very noisy. But then, on the other hand, you see these practical examples where you work on something that has increased dwell time, and then you see rankings going up. I think the bottom line is that across all the algorithms we assume Google might use, there’s always a difference between theoretical robustness and practical application.
What would be an example of a practical application of the concept of LSI keywords? I know you’re not that keen on it, but what have you seen in terms of application that made you go, okay, that is a well-optimized piece of content because they applied LSI keywords in a way that the algorithm is probably going to reward?
I’m probably going to butcher this topic because I don’t look at LSI keywords in my workflow. I’ve seen LSI keywords being used the most when people try to map topics to keywords or just try to understand all the related keywords within a fuller topic. That’s when I’ve seen people applying this idea of LSI, which plays into this idea of relatedness and semantics, and then saying, “oh, okay, these are all the semantically related keywords within the same topic, and I need to have them in my content.” So that’s kind of the idea there.
Other related concepts play into that as well. All these SEO editors, whether the Semrush editor, Clearscope, Frase, or some of the others I forget, do similar things. They look at the top results, see what terms are mentioned most often, and then give that to you as a brief. Again, I think even Google confirmed that they don’t use LSI keywords. This is a debunked concept.
Of course, if you believe what Google’s saying, they are not 100% truthful all the time.
Yeah, for sure. They rarely ever state something very, very exactly. They sometimes fall back on phrasing things a certain way, like, oh, we don’t use this for ranking. But okay, how do you define ranking? There’s so much gray area.
Anyway, long story short, if you work with that concept and see good results for LSI keywords, then who cares? SEO, at the end of the day, is about the result and the outcome. Whatever mental framework you have, if it works for you, then it works.
I’m not too bullish about it, but there are other people who are very sensitive and very, very bullish about not using any of these outdated concepts or not having the wrong mental framework. I’m a bit more practical, I assume.
Yeah, sounds good. Let’s talk about some of these tools that you quickly name-dropped, like Ryte and Clearscope. Surfer SEO is another example. There are a bunch of these tools out there. I’m curious which are your favorites and which ones you think have kind of jumped the shark and are no longer relevant these days.
The interesting thing is that they start to diverge in different areas. Frase, for example, uses a lot more machine learning technology to write or outline content for you, whereas others really double down on improving the data quality or making the editor more writer-friendly. It’s really hard to pick a favorite.
We’re blessed at Shopify to use many of them successfully. I’ve used them all before, at various companies. I struggle to pick a favorite because they all have pros and cons. They all do some interesting things. Surfer, as you mentioned, also provides internal link and backlink suggestions.
I would say that we’ll probably see much clearer differences between SEO editors in a couple of years than we see today. Clearscope is one that I’ve successfully used a lot in the past. Surfer, Frase, and then Ryte. I’m sure I’m forgetting a couple, as I always do, and then people get offended.
How about WordLift?
Yeah, there we go. WordLift, MarketMuse. I got to stop name-dropping because people reached out to me, “man, you didn’t bring mine up.” I don’t think you can make a wrong decision.
What would be a great tool for identifying internal linking optimization opportunities?
There are a couple of different places. Of course, there are the typical crawlers like OnCrawl, Botify, and Deepcrawl that will give you internal linking recommendations. Then one that I’ve recently seen making great strides is Ahrefs. In their site audit reports, they started to show incoming links, outgoing links, and backlinks for the same URL. That’s one thing that I’m super bullish about.
To take a step back, about five years ago, I presented a concept that I developed at Atlassian called TIPR, which stands for True Internal PageRank. The basic idea is simply that we are often too narrow-minded in looking at our internal link structure in isolation. What isolation means, in this case, is that we run these crawls on our sites, and then we get an approximation of PageRank for our site. That’s cool, but that’s only 50% of the puzzle.
The other 50% is backlinks, other sites linking to my site. That’s the whole idea. Linking doesn’t happen in isolation. It happens between sites and within sites between pages. By simply combining this internal link view with a backlink view, we get a lot smarter about where PageRank accumulates and maybe where PageRank is wasted.
I find it really interesting that Ahrefs provides a report that shows both backlinks and internal links. I personally still use Screaming Frog a lot because it has shown the link position since 2020. That’s a very interesting feature because it allows you to differentiate between internal links in the top navigation, in the footer, or in the content itself, which is where they are the most valuable.
A lot to be said about internal linking. I started liking Ahrefs more and more. I still rely on Screaming Frog a lot. Some larger enterprise crawlers have also started developing interesting new product features. I guess it depends on your company size if you want to pick one. But with Ahrefs and Screaming Frog, you can walk miles with those two.
What about Majestic? You mentioned that Screaming Frog has that link position capability. There’s the Link Context tool inside of Majestic that shows you where the link is on the page and how many other links, both internal and external, are on the page that’s linking to you. That’s pretty cool.
Super cool. I love Majestic. They were the first ones to really integrate some sort of relevance factor into their link reporting and their proprietary metric. A big fallacy in my mind, a big trap that we keep running into in the SEO world, is to think that domain authority, domain strength, or whatever the proprietary metric is, is the be-all and end-all. It often ignores the relevance between two pages.
A link from a recipe to a cookware product page obviously has higher relevance than a link from a recipe to an SEO tool website. This idea of relevance and how to understand it is very complex. You can abstract it from basic factors like the meta title.
My assumption, to go back to Google’s algorithms, is that they look at a lot more factors to determine the relevance between two pages. They have this whole entity graph, which is basically a graph that tells you the connections between abstract concepts. Google has a very, very different understanding of the web, and they can leverage that to understand the relevance between two pages that link to each other.
Long story short, Majestic is one of the first tools that has done that, and it still does. I think it adds a very interesting layer of value to internal linking. I haven’t yet played around with it to a level where I can compare internal link models between Majestic and other tools and say, okay, their link relevance adds this extra level that helps you improve internal linking much, much better. It’s certainly something that I want to do.
I really like Majestic. I like the fact that it has a trust score, and most of the SEO tools out there do not. They just have a combined authority metric, which has trust baked in but doesn’t separate it out for you. There are some really important websites that have really poor trust, and they don’t rank as well as they should. That’s why, or at least that’s part of it.
Majestic is one of just a couple of tools that give you a trust metric. There’s also LRT, linkresearchtools.com. Back in the day, Moz used to talk about MozTrust, and then they stopped really updating that algorithm and the metric. You can’t really take that metric to heart on Moz anymore, but I think that’s an important thing to measure. What are your thoughts about trust?
I think it is important, and I think it’s very contextual. I remember when I did these link audits manually, you would have to develop your own trust algorithm. It’s not a fancy algorithm. It’s really just a set of decisions you make: does the link come from a site that still gets traffic? Does the link come from a page that’s still indexed? Are there any other spam signals you can determine? There are lots of signals you can combine into trust.
I think the benefit and the challenge of a proprietary metric is that it’s simple. You can just look at this metric and get a feeling for trust. But the downside is you don’t know what flows into that metric. I’m a fan of transparent proprietary metrics, where the companies just say, ‘hey, this is what goes into that metric,’ and help you understand it, or of just very basic metrics.
Visibility is another one that can be super helpful, but you might not always know what goes into it. It can be keyword position and search volume, but do they factor in seasonality? Do they factor in SERP features or something else? You get the problem. I love the idea of a trust metric. I love the idea even more of really understanding what goes into that trust metric so we can build the necessary context to act on the metric.
There’s one aspect of trust inside of Majestic that doesn’t get a lot of discussion, but I’m curious to hear your thoughts on it. It’s topical trust flow. I know you’ve written about topical authority on your blog, but I’m curious about topical trust, or topical trust flow. Is that something that is useful for an SEO practitioner to look at and see, for example, that the top categories in topical trust flow are completely unrelated to the company’s industry and the genre that they’re in?
Such good points, Stephan. Maybe we start by outlining the idea behind topical authority. The basic idea is that Google understands what topics websites are experts in or authorities on. An example would be that Shopify is an authority on ecommerce. The way that Google would understand that is by looking at the content or the pages that the site has.
If I at Shopify write about 50 different subtopics within ecommerce, that makes me more of an authority than if I only have one article about ecommerce. First of all, we don’t often define well enough how we understand these buzzwords that we throw around in SEO. Second, I don’t think that’s really how it works. I think there’s a lot more nuance to it.
For example, backlinks. Backlinks, in my mind, are still a strong, strong signal and super important. They go into this idea of authority. Usually, when we speak about authority, we first mean backlinks. We don’t speak about content. The idea of topical authority is that you also want to demonstrate your expertise and authority by creating high-value content for a certain topic.
Another idea that’s implied in topical authority is to say, ‘hey, if you write about all the related long-tail keywords of a short-head keyword, your chance of ranking higher for that short-head keyword increases.’ Let’s just say I’m very skeptical that this is really true. I think there are lots of different ways that Google can assess your authority for a topic, but I don’t know how much of a ranking signal that really is.
Does that mean you shouldn’t write about these topics? No, absolutely not. You should write about these topics. You want to fill out topics with content and write for keywords that are important for that topic. Why would we at Shopify not write about all the ecommerce-related keywords? The question is, how much does it move the needle, and what are the mechanics behind it?
Then we think about a backlink metric that is related to topical authority; I think you mentioned it was called topical trust flow. The way that I would understand that is to ask, do I get a backlink from a site that has high authority for a topic that’s important to me? For that to work, the metric, or the tool, would have to understand how high the authority of the other website is, how high my authority is, and what the subtopics are within that topic.
I don’t understand the metric well enough to say that this is perfectly true, but I’d be very interested to see the correlations between better rankings and this trust flow metric. I think that’s a big summary of what we’re talking about here. How do all these concepts actually relate to outcomes? Can you replicate them? Can you build more links based on topical trust flow and then see a clear uptick in rankings? That would convince me to say, oh yeah, this is totally a thing and this makes total sense.
Who’s doing the best research, scientifically validated or peer-reviewed, top-shelf kind of research, to prove that this stuff is actually not just correlation but causation?
I don’t know if anybody does, to be honest.
Yeah, that’s a problem, isn’t it?
It’s a huge problem. I actually think that SEO is very similar to medicine. Not because we’re saving lives, we really don’t. But because medicine is another one of these fields that follows scientific principles to a high standard to understand what really works in the body and what doesn’t. There are still so many things that we don’t understand about the human body, but with the scientific method, we’re able to test our way back into it.
I think that’s something that’s really missing in SEO. I don’t think this will ever happen, but technically, you would need an independent, objective council of SEOs who take suggestions, case studies, and claims and really stress-test them, who kind of peer-review them, as you said. I think that would be phenomenal for this industry, but I don’t think it’s ever going to happen, for various reasons.
As for better case studies, I’ve seen Ahrefs putting out some recent ones, which are interesting, where they just do some stuff and show what they see. They had a disavow case study where they disavowed 10% of their backlinks. They saw a traffic dip, then they removed the disavow, and then they saw traffic increase again. It proves a small point, but it’s, in my mind, a very objective way to approach things.
The question is, how often can you replicate that? That would be the follow-up case study in my mind. Just to be fair, I want people to do the same thing with my claims and concepts. This TIPR model, for example, many companies applied it and saw positive results, but I wish more could publish what they really see, because I want that idea to get better over time as well.
Anyway, SEO is a unique, weird art or dark science where we only advance if we share observations and findings and then perfect and iterate on them together as a community. It’s part of why I love this so much.
Yeah, but then the problem in this industry is that it’s so secretive. It reminds me of Amazon sellers and how they won’t even tell each other at a conference where they’re supposedly collaborating and sharing their tips and tricks. They won’t even say what their products are because they think they’re going to get ripped off, and somebody’s going to do an end run around them, go directly to some factory in China, get it made a few dollars cheaper, and then outsell them on Amazon. This mental hoarding thing is rampant in the SEO world.
Years and years ago, I tried to promote that kind of sharing. I think I made an impact. Back in 2007 or 2008, I pitched Danny Sullivan on a conference session called Give It Up, where, at SMX Advanced, we’d share our best secrets. I really brought some great ones.
There was one that was relevant back in the day when you were seeing indented results. A competitor would show up in position one and position two. But the second position was not the true position; it was somewhere between 2 and 10. It was just displayed grouped with the first. These are grouped results.
I would change the number in the parameters of the URL, so I’d do ampersand num equals nine to show nine results per page. That second, indented listing would drop off the page. It was position 10, not position 2. That’s a very weak position.
Let’s go on to page 2, find something that’s position 11 or 12, send some link authority to it, bump that onto page one, and then we knock our competitor’s second listing off of what appears to be position 2 onto page 2. People are going, wow, that’s amazing. That was just one of a whole bunch of tips that I shared.
I gave a whole ton of stuff away and then everyone else on the panel was a little chintzy, I think. Maybe Rand Fishkin was probably the second most generous. You know what happened because of that? It was the coolest thing.
Rand, whom I knew of but had never spoken to before, came up to me a few weeks later at the next conference, which was SES Toronto, I think. It was in the speaker room, and he gave me a hug. He said, dude, you brought it. You really brought it to the Give It Up panel. Wow.
We ended up having a chat. We decided in that little conversation to do a book together, which ended up becoming The Art of SEO. At the beginning, it was the SEO Cookbook. We got O’Reilly to be our publisher.
We ended up joining forces with Jessie, who was also working on The Art of SEO, and it became The Art of SEO in a collaborative effort. Rand was in the first two editions of that book. That book came about because I was extra giving and didn’t care that the whole industry would use it, call it their own, and pretend that they were all the ninjas who had figured it out.
Information wants to be free. Actually, we had this conversation prior to the recording: everybody wants the magic bullets, the secret sauce. Tell me in your conference presentation some of your best secrets. Tell me in this podcast interview some of your best secrets.
That’s kind of unfair to ask that because you spend a lot of time building that arsenal, but yet, information wants to be free. When you let it out for the world to use, then you get these wonderful side effects. It’s like you get karma points, I think. What are your thoughts about that?
Yeah. I think there’s a lot of truth to that. I try to give away a lot in my newsletter for free. Once a week, I write something that I hope has value, and I try to pack in as much as possible. This can be a competitive advantage for companies. This can be worth millions of dollars.
Shopify, for example, has whole teams of data scientists, engineers, and designers who work on understanding some of these competitive advantages and testing them in a very rigorous manner. I think there’s also an art to when to release information into the world. I’m not giving away what I’m working on right now. I might give away something that worked a couple of years ago. It might still work, but I’m not destroying the competitive advantage of a company that I worked at or giving away something that is very sensitive.
There’s a fine balance that I’m all bought into. I mentioned before that SEO is an art that lives from open-sourcing some things. I think part of that is also to give away some tricks here and there. I think everybody weighs that for themselves: hey, am I giving away something that would really help me right now, in front of a large audience, that maybe cost hundreds of thousands or millions of dollars to get to, or am I giving something away that is not as sensitive?
I also believe that there are no real secrets left anymore in SEO. There might be some things that help you understand whether you’re eligible for a featured snippet or some other stuff. But these aren’t things where you roll them out on your site across the board and then see an immediate uptick in traffic. It’s much more something that you can apply to better understand what’s happening. TIPR is a perfect example of that kind of model or concept.
As I said, at Atlassian, it was a competitive advantage and brought in real dollars. Not a little, a lot of dollars. It’s a tool to understand and act on. It’s not the act in itself. When we open-source, I think where we can be very free is to just say, hey, this is how I came to this conclusion.
Another thing that I’m bullish about is the whole teach-a-man-to-fish thing. I don’t like giving away these SEO tricks because, first of all, I don’t think there are many left, or maybe there are none. Second of all, what are you going to do to get the next trick? Rather, learn how to test and how to find these things yourself, and then you don’t need the latest trick. You will find the latest trick yourself.
As you mentioned, the execution mindset and understanding are so much more important than the latest little rich snippet thing that will lead to an increase in traffic, if that makes sense.
Yeah, that makes sense. I do like it, when you recommend a strategy, a tactic, a hack, or a trick, to give credit to the original source, the originator who produced it. For example, say I’m going to recommend somebody consider the evil twin strategy, which is: let’s say you write for your blog, but you also write for a third-party news outlet. Let’s say you’re contributing to Inc. Magazine or whatever, and they want exclusive content.
The evil twin concept is simply that you base the second article on the same research. It’s kind of a flipped version of the title. The content is a rewrite based on the same research, so you’re not starting from scratch. Instead of ‘the seven best practices of the largest SEO tool providers’ or something like that, the flip side would be ‘the seven biggest mistakes that new SEO tools always make.’ You just flip it. That comes from Andy Crestodina. You might say, well, that’s not really a secret. It’s a clever strategy, but you don’t hear it in normal parlance. You don’t hear this approach talked about in SEO circles all the time, so give credit where credit’s due.
Yeah, I absolutely believe in that. I mentioned before that ideas build on top of each other. It’s actually very rare that one person has an absolutely unique idea. You previously quoted Kevin Kelly when he said information wants to be free, or maybe he quoted someone else. I’m a huge fan of Kevin Kelly. He also said that everything is a remix of some sort. I believe that’s true.
There are so many examples in the tech world where Apple builds on other inventions and so on. I absolutely believe that credit is important because a lot of these ideas originate from someone else, not just yourself. It goes back to this idea of crowdsourcing, where you accept that this is a hive mind situation. You just credit the people who came up with it first or maybe came up with part of the idea.
The funny thing is that when I published this whole material about TIPR, Dennis Goedegebuure, who used to work at eBay, then Fanatics and Airbnb and whatnot, came out and was like, hey, man, I actually had the same thought around the same time about how to combine internal and external links. None of these ideas are patents. They’re like crowdsourced ideas.
Maybe sometimes you just sit in a conference presentation, pick something up, and forget that it came from that presentation, and then years later, you have this idea and you combine two ideas. Long story short, I’m all for referencing, attributing people, and giving credit where credit is due because we’re all in this together. It’s a very synergistic space.
We stand on the shoulders of giants.
Yeah, exactly. You want to be humble and just acknowledge that.
Yeah, and then there’s the whole idea of, where does this stuff really come from? Ultimately, like the quantum field or the inspiration, the muse, the intuition. Is that even from our brains or is that really we’re picking up like FM radios, picking up the station of an angelic realm? I think it’s the latter.
Yeah, for sure. You want to be very careful about pretending that you’re the big mastermind. All of the ideas that I worked on were team efforts. To pretend that you were the person who came up with an idea, executed it, analyzed the impact, and then built this whole concept is nonsense.
It’s a team effort. You want to acknowledge that and motivate people to work with you again by not claiming their ideas as your own. I’m very sensitive to that. I’m very careful about it and try to live up to it whenever I can.
Yeah, that’s awesome. One thing that I see a lot of is these roundup posts: 100 top SEOs all said the ideal length of a blog post should be 4000 words, 2000 words, 2500 words, or whatever. That’s not really based on research. It’s based on opinions and on taking the average or the mean of those opinions across the sample set of SEOs you’ve asked who were willing to answer you. What’s your take on that?
Yeah, I’ve thought about that myself. From time to time, I’ve joined roundups, and I even did one myself about technical SEO a couple of years ago on my blog. I guess there can be cool roundups when it comes to asking for or providing opinions from real experts who are really active in that field, who are actual practitioners, about something that is really important. Then there are really bad roundups where it’s just, what are the best SEO things companies can do? It’s like, man, this is so broad, just to have some big names on the blog.
What’s your best SEO hack? What’s your top secret?
Yeah, the top secret is to hit exactly 4000 words or use LSI keywords. I think you can do this in an elegant way that is actually useful. For example, since we spoke about Rand Fishkin earlier, I want to know what Rand Fishkin thinks about audience development or audience segmentation, and then there are probably five other people whose thoughts on that I would really want to know. If that’s a roundup, that’s amazing.
If you ask Rand Fishkin about the latest usage of memes in social media, I don’t care about that. That’s not really his expertise or where he spends most of his time. If you do a roundup, be super specific, make sure you have experts who are actually in that field and don’t just have a big name, and do it in a way that actually provides value for your audience. Don’t restrict experts in how much they can contribute. Don’t cherry-pick a statement to make it more clickbaity, or stuff like that. The same goes for podcasts. You can do it in a really good, helpful way, and you can do it in a very poor, shallow way.
Yeah. Speaking of Rand Fishkin, I love that guy. He has the most amazing tool. He and Casey have developed SparkToro, which is something super incredible. I don’t know if you’ve used that tool or not, but it is really quite applicable to SEO, even though it’s not really thought of as an SEO tool.
Imagine that you wanted to get on a bunch of podcasts. You got a new edition of your book coming out and you want to do a podcast roadshow. You want to be mindful and intentional about which shows you get on.
The SparkToro tool will identify audiences that are already your competitors’ audience or your audience, and what other shows they’re listening to. Then you can identify the shows that have good overlap and good other metrics, like downloads, and go after those shows.
I think that’s just the coolest thing. You get new listeners to your show and readers for your book. It’s a really great tool. Have you used it?
Yup, I use it on a regular basis. Fantastic tool. In my mind, it’s so symbolic of the evolution of SEO as well. We all know Rand is the founder of Moz, which was, initially at least, a very SEO-heavy tool, and then he left. We all know the story about Moz. He’s very public about that.
He eventually left Moz and started SparkToro. SparkToro is much more about, as you said, identifying audiences, followers, and target segments. To me, that’s very symbolic because SEO used to be this very tactical thing: optimize your title, optimize your h1, and do these kinds of things. That is still somewhat relevant, but there is a growing factor of stuff like popularity, how many people search for your brand, or mentions. Not even links, just mentions from other pages, and all these very fuzzy, blurry things that are more difficult to quantify.
When I started in SEO over 10 years ago, that factor was maybe 1% or 5%. Today, in my mind, that factor is maybe 30% of your success. There’s a growing factor that’s really hard for us to reverse engineer, but by building communities, identifying the target audiences to go after, and really crafting your content and what you do for them, you see better results even in SEO.
To me, it’s been very, very symbolic. I agree with you that SEOs need to break out of their box and consider things like brand marketing or maybe even positioning, very far-away concepts that start to have a bigger impact on SEO because Google gets so much better at understanding these fuzzy things.
You mentioned mentions. Do you think that unlinked mentions count towards your overall authority?
I think so. It’s especially important if it’s in the right context. If the New York Times mentioned Shopify in the context of fulfillment, then I would say that can be a signal to Google to understand the relationship between Shopify and fulfillment. Again, the questions are, how many mentions do you need? From what sites?
There are so many nuanced factors that make it really hard to give a broad recommendation, but I do think Google looks at mentions. I think Google even looks at brand sentiment. You can see that in People Also Ask, for example, where there are very spicy questions: “Is MailChimp illegal? Is Shopify good? Is it scammy? Is dropshipping scammy?” All these spicy questions seem to have a sentiment component.
I think that Google looks at all of these signals, but then the big question is, how much of an impact do they have? How can you really influence them? Brand sentiment is something that can be difficult, not impossible, but difficult, to control and impact. It goes back to this idea of how to apply this, how to actually act on this information. I think Google picks up a ton of these fine-grained pieces of information, down to even what sources you cite when you write about medical content.
Yeah. This is becoming more of a black box. In the beginning, everyone was trying to figure out what the black box of Google was and how it worked, poring over the patents as they were released, knowing full well that this could be a boondoggle. It could be nonsense that they know the SEO community is going to pore over while they’re not even applying it, or it could be so out of date, because the patent was filed four years earlier, that it’s now obsolete.
We’re seeing a lot more of the obscurity, opacity, and black box nature of Google as time goes on because we’re moving into a brave new world of AI, and eventually to artificial general intelligence, where AI will be smarter than us at pretty much anything. It can paint better than a master. It can write a symphony better than Beethoven, that sort of thing.
I’m curious to hear what you think the best tools and strategies are today based on where we’re at with AI and Google, where you think it’s going to be in maybe even just a few years’ time, and what sort of tools will be best?
I’ll preface this with a quote I heard from Peter Diamandis, the founder of the Abundance360 (A360) conference. He said that there will be two types of businesses by the end of this decade: businesses that are using AI at their core, and businesses that are out of business. I’m curious to hear, now and into the future, what are the best AI-based tools and strategies to do our best at outsmarting Google?
Outsmarting Google? Man, it’s quite a mission.
At least reverse engineering and keeping up as best we can with Google’s march towards the singularity.
Sure. I’m going to tell you what I’m seeing 10 feet in front of me. I think that’s going to be most valuable because as much as I’m fascinated by the topic, I’m not an AI expert, unfortunately. A lot of the math is over my head, but I see some applications.
Number one is content. We’re starting to see tools today that will provide good outlines for articles, or where you can even insert a thousand products and they will write good descriptions that are valuable to users. The way I think about content is that there’s a spectrum. At the very bottom of the spectrum is content that is basic: super boring, low-skill, like writing a product description or maybe even a meta description. At the top are unique insights, beautiful stories, and super high-skill, 10-years-of-writing-experience types of content.
We’re starting to see AI being able to serve the bottom of that spectrum and probably slowly making its way up. The question is, can AI ever get to the top of the spectrum? Is there a cut-off point? How long will the incremental progress take?
There are a lot of open questions, but we can see today tools that will do this for you, and they will do decently well. It’s probably for the good of everyone. Which writer wants to write a thousand product descriptions or a thousand meta descriptions and stuff? That’s one thing.
The second thing is tools that help us cluster keywords, group keywords, and find keywords: very basic stuff, linear regressions. You can argue whether that’s really AI or something else, whether it’s really just very good statistics; that’s more of a meta conversation. It’s the same type of machine learning algorithm or statistics that you can use to understand outliers in your PPC campaigns, where you have exceptionally good click rates, high costs, or whatever. This is something that’s becoming a standard in the industry.
The third one that I’m most excited about personally is tools that use some sort of a machine learning algorithm to help you better understand what’s happening in the search results. I think that’s where we can, as SEOs, find the best leverage to compete in Google.
Here’s what I mean. We used to have these 10 blue links, very simple, basic search results. Then Google started to integrate vertical searches, so they started to show images from Google Image Search, maps from Google Maps, videos, and all these kinds of things.
Today, the search results are full of these SERP features, which don’t just include vertical search but also knowledge graph integrations, direct answers, featured snippets, People Also Ask, image carousels, and video carousels. There’s an endless number of SERP features. They’re multiplying like rabbits. It’s crazy.
On top of that, search results change more than ever before. There’s this constant volatility, and there’s very little stability. First of all, that is something I think most SEOs don’t talk about nearly enough. Second of all, it starts to also give us a better reflection of what Google wants to see here.
By applying tools that tell us what Google is trying to get at, what intent, or what several intents, Google is trying to serve, we can get a better understanding of what Google wants and then create better content or better pages. That’s why I’m most excited about machine learning technology here, because it’s a data problem where you need a lot of data, like daily snapshots of the search results. You can quantify it, and you can understand patterns in the data.
There are a few tools that are slowly starting to explore that. Semrush has a search feature integration. You still might need to do a couple of steps manually, but it’s slowly coming together. To me, the big motto of our times is that it’s less the SEO teams that know the latest tricks, and really the SEO teams that have the best understanding of what’s going on in the search results, that win today, because the best understanding means you can act faster than anybody else and serve the right content.
That’s the area of machine learning that I’m most bullish about, where I see the most potential to actually make a meaningful impact.
Have you looked at MarketMuse as an AI-based tool?
Yup. MarketMuse does some really cool things and constantly evolves their tools. I’m keeping track. I’m friends with Jeff as well.
Awesome. I’m about to be on a webinar with them; they’re having me as their guest in a few hours. There’s another tool set that I’m curious to hear your thoughts on. These are tools that do log analysis. That was all the rage way back in the day, but there are still log analysis tools out there.
You mentioned Screaming Frog earlier as a great crawler. Of course, you also mentioned Botify, DeepCrawl, and OnCrawl, which are out in the cloud. Screaming Frog is one that runs on your desktop and crawls a website, but then there’s the Screaming Frog Log Analyzer. It takes your log file, processes it, and analyzes Googlebot behavior on your site, for example, based on the server logs. How important or valuable is that these days?
I think it’s hugely valuable, especially for large sites. I’m a big fan of log analysis because you can get so much out of it. It’s a true source for understanding how Google interprets your site. If you have a smaller blog or something, it won’t matter much. But if you work with a highly templatized site that has maybe a couple hundred thousand or even millions of pages in Google’s index, you really want to dig into logs on a constant basis.
A couple of things you can understand: how many pages does Google crawl daily, weekly, monthly, or yearly? Does Google even crawl all your pages? A lot of SEOs don’t have an answer to that, and it’s really, really important because it tells you more about how Google understands your site.
Can they even find all the pages? Are there orphan pages? Are they of such low quality that Google doesn’t want to spend any crawl budget on them? These are super important questions because it can be the difference between making more money and less money.
Let’s say you’re a real estate site like Zillow. Of course, Zillow needs to know if all of their pages are crawled. Let’s say their city pages are crawled, but not their suburb, town pages, or whatever smaller subset you have of that. A lot to be gained out of that. I’m a big fan.
I think the biggest challenge with log files is actually getting access to them and filtering them. They are often very large, if the company even stores them. I know lots of companies that delete them after 24 hours, and you’re deleting SEO gold there. Getting access to them, exporting them, filtering them by user agent equals Googlebot, and applying all the other filters you want, that’s a bigger undertaking.
The best companies store their log files in some sort of data lake and use tools to visualize them. It can be local. It doesn’t have to be Screaming Frog; it can be stuff like Logflare for ingesting log files. There’s a bigger chain of steps and tools that you need, but I think there’s a ton of value to be gained.
Yeah, awesome. I know we’re out of time. If our listeners or viewers are interested in learning more from you and maybe even from Shopify, I bet you’re working on building an SEO library for Shopify customers as well. Are you?
Sure, yeah. We’re constantly expanding our content to also help non-SEOs do better SEO. There’s a lot happening in that area. I’m going to keep it short.
Where do we send them to? That’s the bottom line.
Sure, shopify.com. Very straightforward. There’s also a lot of documentation at shopify.dev. Personally, I’m @Kevin_Indig on Twitter, and I publish a newsletter at www.kevin-indig.com.
Awesome. Thank you, Kevin. This was great. You’re whip-smart. It’s really fun to talk with you. Thanks for joining us today.
Likewise. Thanks so much for having me.
All right. Thank you, listeners. We’ll catch you in the next episode. In the meantime, have a fantastic week. I’m your host, Stephan Spencer, signing off.
Your Checklist of Actions to Take
Speculate and look for patterns in Google’s algorithm updates. SEO practitioners should always be in the know about what’s happening in the industry. Failure to do due diligence may lead to serious site penalties.
Understand the concept of TF-IDF (term frequency-inverse document frequency) and how it applies to search. It is a statistical measure that evaluates how relevant a word is to a document within a collection of documents.
Prioritize a good user experience for my website audience. For example, ensure convenient navigation so visitors can easily find what they need whenever they are on my domain.
On the other hand, recognize the importance of dwell time as well. With the latest updates to Google search, such as featured snippets, it’s more challenging for websites to get people to click on their links and stay on their pages for a longer period.
Provide extremely valuable information in the form of mixed media and long-form articles I know my audience needs to consume. This is how I get them to stay longer on my site, positioning me as a highly credible source in the eyes of Google.
Utilize tools specializing in automation, machine learning, link auditing, and content writing. Stephan and Kevin laid out several highly recommended tools that can help boost a business’ SEO rankings.
Be mindful of my site’s inbound and outbound links—link to credible sources and check which sites link back to my domain. There needs to be relevancy in the equation. For example, it doesn’t make sense for a recipe site to link to an SEO agency site.
Keep improving authority and trustworthiness. SEO is all about visibility. Position myself or the client I work with in the best light possible by creating campaigns backed by sound strategies and data analysis.
Check out Kevin Indig’s website to access his blog and newsletter and learn more about organic growth.
About Kevin Indig
Kevin Indig leads SEO as Director @ Shopify and is the creator of the Growth Memo newsletter and Tech Bound podcast. Before Shopify, he ran SEO & Content @ G2 and Atlassian and helped companies like eBay, Eventbrite, Samsung, Pinterest, and many others grow their organic traffic.