One of the tricky (and frustrating) things about SEO is that once you’re penalized by Google, it can be hard to recover. This is particularly true for manual penalties, as opposed to algorithmic penalties. And as if that weren’t stressful enough, Google has a history of penalizing retroactively.
My guest today is Christoph Cemper, a link analysis guru who has been a noteworthy figure in the online marketing space since the early 2000s. He created LinkResearchTools, a remarkable toolset that can help you triage your bad links. Today, we’ll chat about analyzing your link profile, the important differences between 302 and 301 redirects, and much more.
In this Episode
- [02:05] – Christoph and Stephan discuss how long they’ve known each other.
- [02:40] – We learn more about what, exactly, Christoph did in his earlier days of working with links. Christoph also talks about changes to the industry.
- [06:26] – What are presell pages, and how did Christoph make a significant amount of money with them?
- [08:08] – Christoph defines the term “co-citation,” which he used a moment previously, in the context of SEO.
- [10:12] – Christoph shares his views on PBNs, or private blog networks, then offers advice on whether to use them.
- [13:21] – Google connects the dots and takes all your stuff down if you haven’t set up your PBN properly, Stephan explains. Christoph agrees, and elaborates on what Stephan has been saying.
- [16:07] – One mistake, such as one of your employees accidentally logging into the wrong account on the wrong computer, could cost you everything. Christoph explains what he means.
- [18:15] – Stephan emphasizes how high-risk PBNs are. He and Christoph then talk about algorithmic penalties and manual penalties, and talk about trying to recover from being penalized.
- [21:53] – Google has a history of penalizing retroactively, Christoph explains. Stephan then discusses this in relation to the example of Wikipedia.
- [24:37] – Christoph brings up an example of a pharmaceutical company editing the FDA page on Wikipedia.
- [25:01] – Stephan returns to the topic of Penguin, which he and Christoph then explain and discuss in some depth.
- [30:42] – The idea that Google devalues (rather than demotes) is FUD (fear, uncertainty, and doubt), as Stephan points out. Christoph then points out that he isn’t saying that Google lies, but rather that their PR team does an amazing job.
- [33:38] – Christoph responds to what Stephan has been saying about not taking too many tweets from Google too literally.
- [35:47] – We learn some of what Christoph has learned in his recent experiments on redirect rankings.
- [37:38] – AdWords is the true business that Google is in, Christoph explains.
- [38:27] – Stephan brings us back to the topic of 302 vs. 301 redirects. Christoph then shares his thoughts on the differences and which type you should always use.
- [41:43] – Christoph poses a question to Stephan: when he says a 302 is labeled a temporary redirect, what does Stephan think “temporary” is?
- [43:38] – Christoph questions whether even Google has the data to analyze the topics he has just been discussing. He then explains what he means by “rel canonicals,” and talks about why he has that data.
- [45:40] – Stephan brings us back to the 302 redirect topic, pulling out a huge takeaway for listeners.
- [46:28] – People still follow the old patterns and rules without questioning them, Christoph points out. He then discusses the hate he’s gotten for deciding to do things differently.
- [51:10] – Listeners can go to www.linkresearchtools.com to sign up for Christoph’s toolset.
—
Transcript
Hello and welcome to Marketing Speak. I’m your host, Stephan Spencer. Today, I have Christoph Cemper with us. Christoph is a guru at link analysis. In fact, he has the definitive toolset for link research. It’s called Link Research Tools. He started in online marketing in 2003, providing SEO consulting and link building services. Back in the day, he was creating presell pages on dot-edus; we’re going to get to that story. He ended up creating just the most amazing toolset, and part of that toolset includes Link Detox, which triages your bad links. It’s a really interesting process. We’re going to talk a lot about links on this episode: how to get powerful links and how to analyze your link profile. Christoph is going to share some of his fascinating research on things like 302 redirects instead of 301s. We’ll talk about metrics for link analysis, which ones you want to use and which ones you don’t, things like domain authority and so forth. We’ll talk about Impactana, the content marketing intelligence platform that Christoph created, and who knows what else. We’ll see where it goes. Thanks for joining us today, Christoph.
Thanks for having me. Thanks for the exciting and motivating introduction. That’s just a little bit more than an elevator pitch for all the stuff that I’ve been doing for the last 14 years in my own company.
Yeah, you’ve been doing a lot of stuff. I’ve known you for quite a while. How long have we known each other?
Probably 10 years by now. 2007 or maybe 2006, I’m not sure.
Yeah, it’s been a long time.
But the whole link buying thing, I remember you writing about the concept of presell pages. That was in the era of 2005 to 2012 or something, and you were early on it in 2005, when you still had Netconcepts, I think.
That would have been prior to 2010. Cool. Let’s actually define presell pages for our listeners, because they may not know what the heck you’re talking about.
When you introduced me as being an SEO agency or something like that, we could spell it out: I sold links. I built links. I had a link building service. I was one of the first specialized companies doing just that, for links. Back in the days, it was so freaking easy to rank in Google by just buying and building links, getting them, whatever. It was basically a simple trade: the more money you spent on links, the better you ranked. That worked for many, many years. Of course, the money for the links was always in direct relation to the strength of the links, the trust of those links. Concepts like trust for links go back to 2005, 2006 already, because those links from universities or even governmental sites helped companies rank big time and fast. There was this wide range of products that we offered doing just that. This is why I say for 14 years, I get up in the morning and just think about links. Back then, it was finding the right links, and it started with a simple problem that existed back then already: evaluating link quality. Just today, I read a survey of a couple hundred SEO experts, and the only things they seem to look at, based on that survey, were the overall visibility of a domain and the number of domains linking to that website, which frustrates me to a point, or scares me to another point, because neither of those two metrics is actually helpful to evaluate the strength or the potential impact of a link. To wrap that up, the job, the skill that I developed throughout all these years, started back then already with the need of finding good links. Finding good links for the clients, to sell them, to build them. When you have a set of, let’s say, 100 opportunities, back then, same problem as today, you want to cherry-pick.
Yeah.
Back then it wasn’t only money problem to pick only let’s say the 10 best links for the budget, today, it’s a risk problem. Finally, 2012, Google caught up with the industry and said, “We’re going to slap you. We’re going to penalize you and we’re going to make you suffer for things that you did that we don’t like.” They always said it. They always talk. People don’t read. People don’t listen and not even to Google. All this talk for all these years basically resulted in no action from the SEO world, from all the search optimizers until Google, in 2012, decided to actually make something very natural to attribute effect of linking not only positive or zero but also negative. Pulling websites rankings back. If you had a lot of problematic, toxic, or spammy or paid links or whatever Google thought was paid links, you were then dropped. This was the idea of building link defects out of them. But it always goes back to the original question that I had in 2003 already, which of those links actually are the good ones or the bad ones? That’s what I’ve been doing first for my own site, I had affiliate sites worth billions of pages from Amazon carpet, just indexing Google, making affiliate money. This is how I learned SEO, if you want.
That’s amazing. Presell pages, what are those? How did you make a lot of money with presell pages?
A presell page was a term for an article that I rented out: an article of relevant content with links embedded. More than a rented link, it’s a whole page rented to a buyer, with links with their anchor text in there, and contextual co-citation links in there. Quite often, people would only want to link to Wikipedia, but that’s another story. More or less an upgraded paid link, something much more powerful, especially because I put it on edu sites.
That’s very clever. That’s amazing that you would be able to get edus to say, “Sure Christoph, we’ll give you a page on our site that you can put whatever you want on.” You managed to do that a lot and get some pretty high rankings for the clients and yourself.
Yeah, there were some amazing results that we saw all these years, and therefore the costs were relative to the amazing results. They were expensive. They were paid on a monthly basis. Something that people also used this for sometimes was reputation management. Sometimes, the presell page would outrank the original website. Obviously, if you have New York University with some content related to your keywords, and your business, and your products, that was a lot stronger website than the actual target sites. That sometimes happened, and then we had to, so to say, de-optimize the content of that presell page.
That’s funny. Let’s also define co-citation for our listeners because that’s another important word that you dropped into the conversation.
Co-citation defines which other websites you are linked with on pages. You maybe all know the term good or bad neighborhood. This always gets confused with other things. In link analysis, in the link graph, the information about which other websites have the same links as you defines your neighborhood. Google even had an operator called related:. I’m not sure if that still works in 2017, but it did work very well in 2003, 2004, 2005, for sure, to find those related websites. What Google did was basically just look at the common backlinks of those sites. Now, in Link Research Tools, we have a Common Backlinks Tool to do such an analysis, also for getting additional links from websites that are related to yours. The key example that I always give: bad neighborhood means you have a block of links across the web, maybe on multiple websites, where all sorts of nasty businesses, let’s say online casinos, buy links in the same places where you buy links. If you just sell sunglasses, or if you have a fashion store, then you don’t match that. But from a link graph perspective, from Google’s perspective, you look related. That is bad for you, because the whole contextual relevance that Google would apply to the site and the rankings would go more in the direction of those casino sites rather than your fashion business.
Yeah, not good.
Not good, not good at all.
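To make the common-backlink idea concrete, here is a minimal Python sketch. The domain sets are hypothetical stand-ins for the lists you might export from a backlink tool; the intersection approximates the shared “neighborhood” Christoph describes.

```python
# Hypothetical linking-domain sets, standing in for backlink-tool exports.
links_to_you = {"blog-a.com", "casino-links.net", "fashion-mag.com"}
links_to_casino_site = {"casino-links.net", "fashion-mag.com", "news-site.org"}

# Domains that link to both sites approximate the shared "neighborhood."
common = links_to_you & links_to_casino_site
print(f"Shared linking domains: {sorted(common)}")
```

If the overlap with a site you would rather not be associated with is large, that is exactly the co-citation risk described above.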
Let’s dive into some of the sketchier things first, and then we’ll talk about how to clean up the mess, metrics that matter for link analysis, your amazing toolset, and a bunch of other cool tools and tactics. Let’s start with PBNs, private blog networks. Please give me your thoughts on that.
A very, very overused term for what is actually a paid link network. The P stands for private, and the moment you read about someone selling a PBN, you need to be careful. Quite often, all of these cut-off networks were called article link networks, or Linkvana, or whatever brand name they had. The brand names are gone, but the moment someone sells out of his private blog network, you put yourself in the same bad neighborhood, bad neighborhood being the other link buyers. That means even Link Research Tools can find those other link buyers. If we can do that, Google can do that as well. This is the first problem. The second problem is that PBN is usually just a synonym for an expired domain link network. People go and buy the expired domains, which is fine, and then sometimes quite amateurishly refurbish those domains, or don’t refurbish them at all. Just yesterday, I answered a question about redirecting the domains all over, which is not a good idea. Because I did all of that, of course, for many years, we have a tool for that called the Link Juice Recovery Tool that helps refurbish an expired domain better. The key here is you would only want to build a private blog network for yourself, for you privately, and be careful and spend a lot of time, and effort, and energy, and money, to not reveal any footprints. We get this question every day: people need to know what this and that detox rule means when we talk about footprints. In the software, we now have so many different ways of trying to do what Google also does: detecting footprints. If we find a footprint, we call it suspicious. If Google finds a footprint, they penalize you and take the whole network out of the index. That’s what it is. An off-label use of Link Detox is actually testing and making private blog networks bulletproof. A lot of our customers are actually black hat SEOs, or gray hat SEOs, or whatever hat color they attribute to themselves, using Link Detox to make sure those footprints are not there. If you do it like that, that’s basically the best possible approach to not get into trouble. But obviously, that is the sketchiest, nastiest link manipulation that Google knows and doesn’t like, and they go after this. Every little bit of precaution, every extra hour that you spend securing your PBN, can pay off and let you use it for a month, or a year, or 10 years from then.
Yeah.
Once Google finds you, they have the footprint of, whatever, let’s say a couple hundred domains, but they also get a footprint of yourself, of your company name, of your office location, of all these other things that go in there. Believe me, I’ve been kicked off Google AdSense two or three times, in 2003, 2004, 2005. I know a thing or two about the footprinting that they developed massively.
They connect the dots and take all your stuff down. You might think, “Oh, I’ve isolated myself. I don’t have everything in the same Google Search Console account. I don’t have the same address, physical address, mailing address in my domain WHOIS information across all my sites. I’m safe.” That’s not true at all.
No, because those are just the simple things. I must say, I’ve seen PBNs taken down that had the same registrant in the domain name across hundreds of domains. That’s just plain stupid. I don’t even know the owners of that, but it was ALN, Article Link Network, one of the biggest, so to say, public PBNs, which caused hundreds of websites to get a penalty in the first Penguin update in April of 2012. I’ve also seen footprints like greedy people putting the same AdSense code all over the place: 800 different domains, well developed, everything done right, different [00:21:08], different IPs, different owners, different companies, different phone numbers, different everything, and then the same little snippet to make some extra dollars with Google AdSense. That is one public rule that we talk about in [00:21:23], where we reveal that this is an obvious network pattern. The same goes for other tracking tools like Google Analytics or StatCounter. Whatever you may think of to run on one website, if you run the same thing on 700 websites, chances are very, very, very high that Google will find a footprint. For instance, my favorite A/B testing tool, Visual Website Optimizer (VWO), is one of those that allow you to actually not leave a footprint. This was a feature suggested to them, I think 10 years ago or so, just for this purpose of being able to run SEO campaigns on many different websites and still do A/B testing on them. This is not so much for the link sources but for link targets, but it’s an example you might not think about in the first place. The truth is, it just gets harder every year.
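As a rough illustration of the footprint detection Christoph describes, here is a minimal Python sketch that flags the same AdSense publisher ID appearing across several sites. The domains, HTML snippets, and IDs are all hypothetical; a real check would crawl the pages first.

```python
import re
from collections import defaultdict

# Hypothetical crawled HTML, keyed by domain.
pages = {
    "site-one.example": '<script>google_ad_client = "ca-pub-1234567890";</script>',
    "site-two.example": '<script>google_ad_client = "ca-pub-1234567890";</script>',
    "site-three.example": '<script>google_ad_client = "ca-pub-9999999999";</script>',
}

# Group domains by the AdSense publisher ID embedded in their HTML.
sites_by_pub_id = defaultdict(set)
for domain, html in pages.items():
    for pub_id in re.findall(r"ca-pub-\d+", html):
        sites_by_pub_id[pub_id].add(domain)

# The same ID on multiple supposedly unrelated sites is a footprint.
for pub_id, domains in sites_by_pub_id.items():
    if len(domains) > 1:
        print(f"{pub_id} ties together: {sorted(domains)}")
```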
Yeah. It reminds me of when you tell a lie: you have to basically remember this false story, and there are so many details, and when you tell one lie, you have to tell another. There’s so much periphery you have to be cognizant of when you are lying. Same thing when you are trying to cover up a sketchy link network that you’ve created or that you bought into.
Exactly. One mistake can cost you everything. It could be one of your employees logging into the wrong account from an IP that happens to be your office IP, connecting the dots. Actually, in my case, that was my home IP, shared with my first wife. We’re not married anymore. Back then, I don’t think I had all the dots put together, but I thought about why her account and my account would both be banned. It was just a simple home office IP, if you want. No, but that was not the reason.
While we’re talking about lying and detecting lies, I have to plug my other podcast. There’s an amazing episode on Get Yourself Optimized that you’ll enjoy, Christoph. All my listeners, you guys gotta check it out. It’s with Mike Mandel, and it’s episode 97. How to detect lies, among other things; it’s just a ninja episode about hypnosis, NLP (Neuro-Linguistic Programming), and so forth. If you want to tell if somebody is lying, one of the tricks is to have them tell you the story backwards. They won’t be able to do it. They’ve only figured it out and rehearsed it forwards. So you tell them, “Okay, tell me what happened before that again. Oh, and then before that. And what happened before that?” They won’t be ready for that, and you’ll catch them every time. I was just thinking about lying and covering your tracks and how that relates to link buying and link selling. Anyways, I digress.
That’s cool. That’s the setup for a PBN. It means documenting every single move. It’s like a large scientific experiment, if you want. Once you stop keeping track of all your changes, or make some change when you’re drunk at night or something, then you’re screwed.
Yes. Bottom line: don’t ever participate in PBNs. I never advise anyone to even get close to a PBN or to create their own. It’s so high risk. It’s really a risk assessment: are you willing to have everything burned to the ground? If you have an ecommerce site and this is your livelihood, the answer is no, never.
Exactly. What I see is some ecommerce operators not doing it anymore, hopefully, for themselves, but doing it for their affiliates and hiding behind an affiliate. There is some kind of affiliate of that website, and that affiliate is spamming and jamming and experimenting. That has been going on for years as well. That would be a way to maybe shift some of the risk away, to hide it away from your main site. You don’t want to get caught on that either, of course. The one thing is the algorithmic penalty that you would get, but once this gets out, and this can be a disgruntled employee or something just reporting it to Google, then a manual action would take place. To get out of that, you’re talking to a real person.
Let’s distinguish these two things for our listeners. You have algorithmic penalties and manual actions, essentially manual penalties, where somebody hits a big red button on your website, or a set of pages, or just one page. It can be at the keyword level, the page level, or the whole site. It’s very difficult to get out of that situation because you are, like you say, talking to a human. You have to convince them that you’ve cleaned up your act, that you’re an upstanding member of the community again. It’s like going to the parole board and trying to convince them: “No, no. You should let me out. I will be a good person.” Heard that one before.
Exactly. That’s what it is. Repeat violations are even harder. Google keeps track of everything anyway, so companies get out once, but I also know about cases where they would not get out again, where maybe the penalty was tied to the actual person running the business. That’s been going on for so many years as well. That lie analogy, I like that a lot.
I like the parole board analogy too. It’s fun. If you’re a repeat offender: “No, no, I’m reformed.” “Oh okay. Yeah, heard that before. We’re looking at your rap sheet, and this doesn’t look good.” I’ve been saying this for many years: Google keeps a rap sheet on all of us. If you go up to the line and say, “Okay, I’m close to the line here. I’m not going to cross it. I’m just going to hang out here until it gets a little too hot and then I’ll back off and never get penalized,” Google is able to retroactively figure out what you’ve been up to and keep all those black marks on your record even though you never got officially penalized. Then they have a profile about you: “Okay, this is the kind of person who likes to skate on the line and doesn’t really respect our guidelines,” and they treat you more harshly. Just like the parole board would treat you more harshly if you’re a repeat offender, even if you didn’t think you ever got caught.
Yeah, yeah, exactly. Google has a history of penalizing retroactively, in retrospect, looking back. Everyone’s account in 2012 got killed for offenses made in 2007, and then they changed the rules. This is not just an SEO thing. It’s Google policy. It’s Google culture to introduce policies that way. On Valentine’s Day 2012, people lost their accounts for violations from 2006 and 2007, AdSense arbitrage back then. But that’s another story.
Yeah, I have to throw in one other thing that has nothing to do with SEO but with Wikipedia, which reminds me of what we’re discussing regarding going back in time and penalizing you retroactively. Wikipedia has this conflict of interest guideline that says if you have an interest in the company that’s being profiled in an article, you should not be editing the article. Makes sense, right? I work at IBM; let’s edit the IBM article. You’re not supposed to do that, for obvious reasons.
There’s this tool called WikiScanner, developed by a graduate student just for fun. He created this tool that would connect the dots on anonymous edits. When anonymous edits are made on Wikipedia, the IP address of the editor gets archived and stored in the history. The tool would connect the dots between the owner of the IP block for the IP address that made the anonymous edit and check whether the owner of that IP block is the same company that is the topic or the subject of the article. Cisco owns this IP block, and Cisco was the article that was edited on Wikipedia; that is a conflict of interest. Bloggers would dig into the WikiScanner tool when it became available and out these companies for violating Wikipedia’s guidelines, the Wikipedia editors would go ballistic, and there would be all this negative press. These companies thought they had gotten away with it: “Oh, we got rid of this negative thing in our article, and it’s been five years.” Wow, that was really smart and clever, and then they get outed five years later. You’ve got to think long term, and not just long term like, what’s going to happen over the next few years, but what’s going to happen over the next decade, because I don’t think Google’s going away in the next decade. I think they’re going to be around in 10 more years.
Yeah. Exactly. Very well said. I actually just pulled up that page and found out about this pharmaceutical company editing the FDA page about some allegations against themselves, and it was in there for two years, which is crazy. I didn’t know about that, so thanks for that.
It is crazy. Play the long game, that’s the bottom line. Anything about Penguin that we want to raise? Penguin came out in 2012, there have been multiple iterations of it, and in the latest iteration, there really is no penalty per se for low-quality links. It’s really not a demotion of your site but a devaluing of those garbage links. Let’s talk about that.
The real-time Penguin, as it’s called, Penguin 4.0, is here to take away the harsh effects of years of Google algorithms. Penguin, as an algorithmic penalty, took away, in most cases, 100% of the traffic. Now, this could be a loss of 10%, a loss of 30%. Google says they suddenly find all the bad links and just devalue them, but that cannot be true. It was never true. The fact is that devaluing bad links and bringing the link risk in the backlink profile into a healthy relation can triple traffic, a 297% increase, for example. We just got approval for that; we are working on a case study. The agency is called RBBI, located in Dubai actually, and they used Link Detox to multiply the traffic by just cleaning up those links. The story that Google wrote, that magically, overnight, okay, not overnight, but with this new algorithm, they would just take out all the bad links and leave the good ones in, basically doing the job of the SEOs again, basically going back to 2003 where you could just throw stuff against the wall and see what sticks: that’s not the case. You still have penalties. You still have demotions of keywords; that’s what Google said. Keywords for those pages and categories could drop. For an SEO, that is even tougher, because it means you could drop for some keywords, particular pages could drop back, and you would not necessarily know that it’s because of the links going there if you don’t do an ongoing audit of the links. You suddenly have a lot more problems on the table if traffic changes, because you have to check basically every link going there. If you don’t look at all the links going there, it could well be that you see some of the good links and miss some other bad ones, and then you can never rule this out [00:35:28]. You want to make sure that the links are not a possible reason for that directory or that keyword drop. But this is what we saw. The story that Google keeps repeating every couple of months, and Barry Schwartz, as much as I respect him, keeps echoing, is just wrong. It’s the same FUD that Google has been spreading about paid links for 10 years. The thing is, paid links are getting a real renaissance. Basically, SEO conferences back then were people getting on stage and saying, “Buy links. You can buy links from me.” That was SEO conferences, and it made Google look a little bit bad over time, and that led to a quite harsh effect. What’s going on now is that people realize, okay, links are important, we need links, and of course everyone tries to build links in a very natural way. The truth is, “write good content and they will come” was never true, and it’s still not true. Any kind of outreach, any kind of activity to try to get links, is basically link building, something that Google would already call unnatural, even just going after potential linkers. And of course, there’s a wide range of possible risk and a wide range of possible ways to sneak in links somewhere. Recently, Google said, yeah, we know all these typical link sellers on magazines, like Entrepreneur or Forbes, etc., and we just ignore those links. This is what they said. The truth is, they get the same Excel lists from all these forums, places, and groups and email lists being sent to and from link sellers, and of course they look at those websites. But that’s like the 50 websites, 100 websites that I usually see all the time. Those are the ones that might be under manual review first and then get some extra treatment from Google.
But one of the rules of building links is the same rule that we talked about for the PBNs: not disclosing, not creating footprints, not telling people where the links are. Back in the days, for all these university links, people would pay and wait for days, maybe sometimes a week or two, until they received information about where that link would be. No information whatsoever went out before. The opposite of that, which I still see going on, is people being totally promiscuous about what they have to offer, with large lists being sent around. If I get this list, Google gets this list as well. That’s how they find it automatically, maybe by scanning email automatically or something.
The idea that because Google has come out to say, “We don’t demote anymore, we devalue,” with Penguin 4.0, this is FUD, fear, uncertainty, and doubt. People are unsure about how the algorithms work and where the risk lies and so forth, and in reality, what’s happening is that Google is capturing a lot more data as people get emboldened to take higher risks.
Exactly. Nothing of what Google says is probably wrong in a literal way, because they have automation in place, they have footprints. University links were also caught at some point, for some universities, once they went public. Even 10 years ago, same patterns. Co-citation patterns, like who is buying links: if you combine a common backlink tool with a common forward link tool and let it run two or three times cascaded, you can find hundreds of link buyers just by doing that. You’ll find all those people that are buying in the same places, and so you find all these places that sell the links, and that’s basically how you do it at scale. Google has data, Google has scale, so they can do that. I can do it.
If you can do it, Google can certainly do it.
Exactly. And this is what they are referring to. I’m not saying that Google lies. I’m saying that Google has perfect messaging, a perfect agenda-setting PR team that does an amazing job, and what helps them big time is something that Matt Cutts, the old head of Google webspam, did not have, not to that extent: they restrict their information a lot, to 140 characters. Very brief, very undetailed. It lacks detail, and sometimes it’s even confusing. You mentioned redirects: “all redirects pass PageRank.” Oh, great stuff. I love this tweet.
Yeah. He said 30x redirects. There’s so much nuance here that we don’t get. Which of these, 301 or 302, and 307, 304 or something? I think he said in the tweet that they don’t lose PageRank, but what about a chain of redirects? If I do five redirects in a row, there won’t be any loss in PageRank? I don’t believe that. So much is unsaid, and then people read into this stuff. I agree, it’s crazy to take these tweets literally, from Gary or anyone else at Google, and just base your business on them.
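For anyone who wants to see exactly what a redirect chain returns hop by hop, here is a minimal Python sketch using the third-party requests library; the URL is a hypothetical placeholder.

```python
import requests  # third-party: pip install requests

# Hypothetical URL; substitute a page whose redirects you want to audit.
resp = requests.get("http://example.com/old-page", allow_redirects=True, timeout=10)

# resp.history holds each intermediate redirect response, in order.
for hop in resp.history:
    print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
print(resp.status_code, resp.url)  # final destination
```

Each printed status code (301, 302, 307, and so on) is what a crawler would see at that hop of the chain.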
Exactly. And that’s actually a big, big problem that I’ve seen in the industry. I respect Jennifer Slegg and Barry Schwartz big time, that’s The SEM Post and Search Engine Roundtable. They do a great job of what they also did with the Google guys back in 2004: just take what the Google guys say, capture it, and write about it. The problem is, back in the days, Matt Cutts hung out on the WebmasterWorld forum and had proper discussions, proper technical explanations. He was an engineer who built the SafeSearch filter, so he knew all these things, and he was open about that. To take that, to quote that, to put it together on blogs made a lot of sense. Now, we get this little piece of information, this little sentence, and it is being echoed and repeated and interpreted to the extreme. I just pulled it up here: “30x redirects don’t lose PageRank anymore.” That’s what he tweeted on July 26, 2016, which is funny, because by that time I had been researching redirects for six months already, for other reasons, because these redirect stories have been going on for years. But that has been translated, if you want, or interpreted, or rewritten by bloggers into “all redirects are the same,” which is not the case. He never said that. At most, it would mean that they are the same in terms of not losing PageRank anymore. He didn’t say anything more than that. The funny part here is the very important aspect of what we just discussed: the history, the development over time, matters in SEO. It’s the number of links per week or per month, it’s the number of changes to a website, it’s the number of redirect chains that developed or changed over time, and it’s also the PageRank. When Google says PageRank, they mean the power to push rankings, to influence rankings. I saw that, of course, the redirects pass something on, but the interesting part is that this ability changed. The rankings passed on, and in my experiments, they stopped at some point. Imagine for a moment that you do this great website migration: you do an SSL migration, you move to HTTPS, or you move to another domain, you finish that project up, and you have everything redirected at some point. Imagine for a moment what would happen if the 301 redirect lost the power to pass on the rankings. Not after seven days, not after seven months; the ability to pass on those rankings just goes away. Depending on the page that it redirects to and the keywords that were affected, some keyword rankings might go away, some power might go away, some pages might stop ranking. It’s Google; they could easily obfuscate these effects to the extreme. Why? To make lives harder, SEO lives. Our job as SEOs is trying to reverse engineer the machine, trying to reverse engineer the algorithm, not to take 140 characters that get echoed through the world and just do what Google says. That’s, I think, a big misconception. Yes, there are Google guidelines that we want to play fair with as much as possible, but those who take everything from Google for granted will basically just do what Google says and might end up buying a lot more AdWords.
Exactly.
At some point. Which is the true business that Google is in. They are selling ads, and they are doing that on top of their content, on top of their search engine, which is the product. For many years, the question was always answered no, no, organic rankings have nothing to do with the rest of the business. I’m sure that there are various tools in place to keep these things separated. But from the top level down, from the head of the company, it’s an advertising company, and everything they do, terms and conditions here, policies posted there, all these things are in place, but the original goal of Google is to make money via ads. It always has been.
Let’s circle back to this redirect issue, 302 versus 301 redirects. You presented at SMX East last year, and that was a bombshell. The bombshell of the conference was your announcement of the study you did, finding that 302s would pass the rankings benefit indefinitely, whereas 301s would drop that rankings benefit after a small number of months. Are you advising, therefore, that everybody should switch from 301 redirects to 302s, or is it on a case-by-case basis?
I’m advising to use 302 redirects, because Google says they don’t lose PageRank anymore, which is one thing. And from my observation, the rankings benefit, the reason why I actually do the redirect, remains, so I changed the redirects on my company websites, on Link Research Tools. The problem is that some SEO plugins, for instance for WordPress, have the 301 hard-coded in there, and people don’t talk about 301 versus 302. In the original HTTP spec, the 301 is called a permanent redirect and the 302 is called a temporary redirect. That’s why the idea comes up that the effect of doing that 301, the so-called permanent redirect, would be permanent. The truth is, it looks like it’s the total opposite of that. The simple explanation is that 301s have been used and abused by SEOs for so many years. When Google Penguin came out, redirects could still help to get rid of the penalty. They closed that loophole, and a year later, in 2013, 301 redirects, SEO redirects, passed on the penalties from then on. That was a big change in Penguin 2.0 in May 2013. However, only in 2014 did we see the first evidence of what Gary Illyes of Google confirmed in 2016: that 302 redirects also have a massive effect, also causing penalties, also causing ranking changes. That’s not new. That’s been the case at least since 2014, which means they had tapped into that back then already. But the theory that I have is that because the whole market, the whole industry, always keeps using the 301 redirect, that was the type of redirect to tamper with, to play with, to build extra filters, extra rule sets, extra code around. That’s basically it. There’s a piece of code somewhere in the Google algorithm taking care of the 301 redirects. The other piece of code, for the 302, is different, that’s for sure. What I heard from an insider supports the theory that there are only one or two developers, one or two engineers, inside Google talking about or working on that piece of code, because it’s so central. It’s like original code from the original core, the core of the Google search engine back then. When I say a 302 is labelled a temporary redirect, what do you think, Stephan, is temporary? How long is temporary? What do you think you hear when I ask this question?
I think people would say maybe a day, or a week, or something. But I think when the spec was written back in the day, it probably meant that seconds later, it would be different. It would either point to a different location or it wouldn’t exist anymore.
Exactly. It’s not defined at all. There is no number, nothing that a software developer, an engineer, could use. So, because of the lack of specification, they have to assume that in the next microsecond, it could be different. It has to be handled differently in the code than the so-called permanent redirect. To the permanent one, you can attach all kinds of extra information, like “this is a spammy redirect.” For the temporary one, you cannot do that.
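To make the distinction concrete, here is a minimal toy server sketch using only the Python standard library. The sole on-the-wire difference between the two redirects is the status code; the paths and target URL are hypothetical.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 301 = "permanent", 302 = "temporary"; everything else is identical.
        if self.path == "/moved-permanently":
            self.send_response(301)
        else:
            self.send_response(302)
        self.send_header("Location", "https://example.com/new-page")  # hypothetical target
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()
```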
You could 301 redirect a penalized site to a competitor and hurt them, right? Does that still work and other kinds of negative SEO techniques?
It happens. That’s the reason why we totally rebuilt our whole Link Research Tools backend, to handle redirects and rel canonicals even better. Nobody talks about rel canonicals, but I know companies that have thousands of rel canonicals from penalized websites and rel canonicals from virus websites. The rel canonical is basically like a redirect, only for the search engine. In the browser, you would not get moved over to the other website like with a redirect. That is basically even geekier, even techier, than the redirect question. Nobody is talking about that.
That’s fascinating. That’s a form of negative SEO that I think would be on nobody’s radar, generally speaking.
Because nobody has the data. I question whether Google has the tools to see that data. Google builds everything at scale for the search engine, but not the diagnosis of things like this: looking at a website and finding those directories that actually have a lot of redirects to them, that have a lot of external rel canonicals. When I say rel canonical here, I must explain that this is something that was originally intended only for your own website, for an ecommerce store, to clean up the canonicalization problems, the duplicate content problems on your own website, to help indexation. And then, overnight, Google allowed use of the rel canonical cross-domain. That allows a similar effect as redirects: to pass on rankings, to pass on link juice, and it has been used and abused for years. It was changed in 2013. That was a time when so many websites dropped anyway because of the harsh penalties. Development on the backend, on the technology side, stopped, and we only started at the end of 2015 to build that. That’s one and a half years by now, to actually have full support for rel canonicals. You don’t see that data anywhere. Majestic doesn’t have it; nobody has it. One reason why we can do it is because we do these specialized crawls, specialized link graphs. That means for one website, we can store a lot more detail, a lot more data for the links. When we re-crawl stuff, we sometimes get data that we bought that is five years old, recrawl it, and of course look at what’s going on right now. Negative SEO is sometimes just an overnight thing that happens and then goes away after two weeks. You don’t want to use one-year or five-year-old data to say something about the SEO at all. That’s the problem I see a lot.
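As a rough sketch of how you might check a single page for a cross-domain rel canonical, here is a standard-library Python example; the URL is a hypothetical placeholder, and a real audit would run this across many pages.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

page_url = "https://example.com/some-page"  # hypothetical
finder = CanonicalFinder()
finder.feed(urlopen(page_url).read().decode("utf-8", "replace"))

if finder.canonical:
    # A canonical pointing at a different domain deserves a closer look.
    same_host = urlparse(finder.canonical).netloc == urlparse(page_url).netloc
    print(finder.canonical, "(same domain)" if same_host else "(cross-domain!)")
else:
    print("No rel=canonical found.")
```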
Just going back to the 302 redirect, I think that is a huge takeaway for all of our listeners: go back and switch from 301s to using 302s. I advised one of my clients to take the site they had acquired, the entire business they had acquired, and switch the 301 redirect that had been in place from that acquired site to their main site over to a 302. The 301 had been in place for at least six months. They switched it to a 302, and they got a really nice boost in their rankings on their main site. Just doing that one switch from a 301 to a 302 on this separate site, really cool.
It’s amazing how people still follow the old patterns, the old rules, without questioning them. I feel like I was the first to question this at the beginning of 2016, doing all these tests. I’m not sure if you saw that at SMX East in New York last year, but I got all the hate, all kinds of people hating on me for that: that this is totally wrong and that this is not possible because Google, whatever, blah, blah, blah. People came up on this controversy to question that and annoy the heck out of John Mueller. A day later, John Mueller tweeted that in his opinion, SEOs fuss too much about redirects. That was a direct response to me being on stage, causing all this noise from all these haters. Some random, anonymous guys who claim to be the best SEOs in the world said that this is wrong and [00:47:26] discussion. I feel like John Mueller felt he had to shut that down, because at the same time, they were rolling out Penguin 4.0, and that is what I basically switched all my attention to right after SMX East. It actually started rolling out at the beginning of September based on my data, but the official announcement was September 25, if I recall it right. That kind of overlapped. And then, of course, everyone was on Penguin 4.0 since then.
It’s like Google saying these are not the droids you’re looking for. More fear, uncertainty and doubt.
[00:48:07], don’t look too closely at what we are doing and what you are doing. Basically, that’s like a call to action to not pay too much attention to details, which is the exact opposite of what the job description of an SEO should be. Questioning things, testing things, and trying out things is the first sentence in the job description of an SEO. Following the Google guidelines is what Google made us do after the penalties. People are scared. I would say this is angst.
Yeah.
A lot of people are just scared to go beyond, but this leads to more or less blind obedience, just following. That is not SEO.
No, it’s not. I agree wholeheartedly. It’s a science. It’s an experimental science. If you do not have hypotheses, if you do not run tests to see if your hypotheses are valid, you are not doing SEO. Yes, I have a book called The Art of SEO, but it really is a science more than an art. You need to always be poking and prodding at the black box that is Google, not just abiding like sheep by all the propaganda that is spouted, not just from Google but from SEOs who are just trying to protect their turf.
Yeah, exactly. It’s easy to follow Google and echo what they say, be in sync with their story.
We’re out of time, unfortunately, but I got to have you back because we didn’t cover things like domain authority, metrics. I know you have a striking position on DA. We got to talk about that.
That’s simple. The Mozscape index is too small, and therefore using DA and PA is ridiculous. It’s wrong. It’s not looking at all the links for the evaluation, basically.
Yeah. We didn’t go into all the different tools. We covered the Common Backlinks Tool a little bit. We have the BLP tool, CLA, the Pitchbox integration. There’s so much stuff that we didn’t talk about, Impactana and so forth.
I agree.
We’ll have to do a part two and have you come back, if you’re open to doing that.
I would love to. I would love to, Stephan. Sorry for being so talkative.
No. I’m sure our listeners have been just rapt with this incredible information, with powerful takeaways like: we’ve got to switch to using 302s. I shared that I had a client switch to a 302 and get huge rankings benefits, which corroborates your data analysis. I think that alone is well worth folks’ time listening to this episode. Where can they go to sign up for your amazing toolset?
linkresearchtools.com.
Awesome. We will catch you on the next episode of Marketing Speak. This is your host, Stephan Spencer, signing off. Thanks everybody.
Important Links:
- Christoph Cemper
- LinkResearchTools
- Manual penalties
- Twitter – Christoph Cemper
- LinkedIn – Christoph Cemper
- Facebook – Christoph Cemper
- The Art of SEO
- Mike Mandel – Previous GYO episode
- Co-citation
Your Checklist of Actions to Take
☑ Do not use PBNs unless I am willing to take the risk of losing all of my rankings and my traffic.
☑ Use the Link Detox tool to help me recover and protect my site by knowing what my bad links are.
☑ Do not put the same AdSense or Analytics code or anything traceable that will link all of my sites together.
☑ Remember Google keeps track of everything, and I can possibly recover from one penalty, but not multiple penalties.
☑ Do not take everything Google says for granted. Instead, conduct my own experiments and research on what really works.
☑ Use 302 redirects as opposed to 301 redirects because they pass the rankings benefit indefinitely.
☑ Keep in mind that the links pointing to my site connect it to the content of the linking sites. Do not attach my site to bad neighborhoods and out-of-context content.
☑ Question everything about SEO and links and do not blindly follow the old rules. Find what actually works.
☑ Learn how to recover from a Google link penalty by understanding my backlink data, and don’t repeat the same mistakes.
☑ Find the right tools to understand how Google is looking at my links and understand the metrics. LinkResearchTools has a comprehensive suite that can help.
About Christoph C. Cemper
Christoph C. Cemper started working in online marketing in 2003, providing SEO consulting and link building services. Out of the need for reliable and accurate SEO software, he developed the first internal tools in 2006. This was the basis for the full product LinkResearchTools (LRT), launched to the public in 2009 as a SaaS product with four tools. Thanks to ongoing development, LinkResearchTools now provides 21 tools with ever-growing power and functionality, adapted to market requirements and Google changes.
When the famous Google Penguin update changed the rules of SEO in 2012, Christoph launched Link Detox, a software built for finding links that pose a risk in a website’s backlink profile. Christoph has been talking and writing about link risk management since early 2011 and introduced the technology and formal process for ongoing link audits in 2012 as well as pro-active removal and “disavow” of bad links.
In 2015, Christoph launched Impactana, a unique “Content Marketing Intelligence” technology that helps marketing professionals find content ideas that make an impact.