SEO algorithms are constantly changing, and frankly, Google’s guidelines aren’t static either. It can be hard to keep up with the latest SEO best practices, and what worked well in the past might actually be working against you now. Today’s guest offers innovative ideas for getting on top of your SEO issues and modernizing your strategies to have a bigger impact.
Lily Ray is the SEO Director at Amsive Digital, where she provides strategic leadership for the agency’s SEO client programs. She has worked across a variety of verticals with a focus on retail, e-commerce, B2B, and CPG sites. Lily loves diving into algorithm updates, assessing quality issues, and solving technical SEO mysteries.
In today’s episode, Lily talks about specific system-wide errors and how to solve them. She provides insight into Google’s quality guidelines and how to use them to your advantage. We discuss how to rank well in Google News and Google Discover. Lily also gives valuable advice for keyword research, content development, and link building. You’ll get some great tips for updating your SEO tactics and increasing your site traffic.
In this Episode
- [00:20] – Stephan introduces his next guest, Lily Ray, the SEO Director at Amsive Digital, where she provides strategic leadership for the agency’s SEO client programs.
- [01:24] – Stephan asks Lily how she ended up focusing on the technical side of SEO.
- [03:06] – Lily shares the tools and resources that keep her technical skills sharp.
- [07:09] – Stephan asks for key takeaways from using a log analyzer or a more industrial-strength crawler.
- [09:37] – Lily gives an example of uncovering a technical issue on a website that turned out to be very helpful and valuable.
- [12:01] – Lily and Stephan talk about strategies to make sure that old content doesn’t drag down the crawl budget.
- [16:51] – Lily explains the algorithmic and manual ways Google is cracking down on low-quality pages, and approaches that help pages rank.
- [20:03] – Lily and Stephan point out the benefits of consolidating multiple URLs into a single page.
- [22:35] – Stephan asks Lily to break down what the audience needs to know from the Google Quality Rater Guidelines, including findings listeners are likely not acting on.
- [27:21] – The first things to work on for those who are just starting out in SEO.
- [29:26] – Lily shares tips for boosting a site’s trustworthiness and the requirements for ranking well in Google News.
- [32:51] – Is using clickbait worth the risks?
- [34:09] – Lily briefly describes Google Discover.
- [35:42] – A deeper discussion of the page experience algorithm update and Google’s Core Web Vitals.
- [37:48] – Stephan asks about link quality and examples of a link campaign done right.
- [41:43] – Lily shares strategies for ranking higher in Google Image search and ways to optimize photos to improve load time.
- [45:31] – Lily recommends never making SEO decisions without a proper understanding of the facts.
- [47:43] – Check out Lily Ray’s website to learn more about her. Also, visit Amsive Digital’s website to work with her agency.
Transcript
Lily, it’s so great to have you on the show.
Thanks so much for having me. I’m excited to be here.
I would love for us to start by talking about how you ended up focusing so much on the technical side of SEO. Because there’s a lot of art to SEO, but there’s also a lot of technical acumen, a lot of science, where the details and nuances get missed. I think that’s an area that is lacking, generally, in the SEO community. There’s not as much technical depth as I’d like to see. I’m curious, where did you get that technical bent and how do you keep it sharpened?
I started doing SEO when I was pretty young. I was actually just coming out of college. My brother is a web developer and my father is a software engineer. I kind of grew up in an environment where I learned some HTML, CSS, and things like that when I was younger. But of course, between my dad and my brother, they’re experts on those things. Right off the bat, when I was working in SEO, I realized there were problems that I needed to solve that were technical problems that were outside of my skill set because I didn’t really have a background in those things.
But thankfully my brother was really helpful. A lot of the time I would just chat with him and ask him tough questions. Then it was just kind of trial by fire, learning how to do things as they came to me with the different clients I worked with. I think the most important skill in SEO is the ability to figure out problems as they come to you, research the problems, and try things and see what works. It was only through experience, and through really working with a lot of ecommerce clients at my first agency, that I was able to put a lot of these technical skills to the test and learn new things.
What tools and resources are you pulling from in order to keep those technical skills sharp and keep reverse-engineering the Google algorithm?
Throughout the course of my career, I’ve been fortunate to work at a couple of different agencies, and the current agency that I’m at, Amsive Digital, is very focused on investing in technology. We have a partnership with Botify, we use Sitebulb, and I’ve used Screaming Frog throughout my entire career, plus a variety of other crawling tools that we’ve tested out.
More important than the tool is knowing what to look for and how big of a problem it is
Then of course Google Search Console just continued to get more and more helpful and innovative over time. Really, a lot of the technical troubleshooting that you can do can just be in Search Console itself with things like the crawl stats report. But I think more important than the tool is knowing what to look for and how big of a problem it is. Because a lot of these SEO tools just report back basic facts like you have this many 404 errors, but maybe that doesn’t really matter. Just prioritizing what’s actually important to fix is a more important skill.
How do you do that? I know when I’m working with a client, I’m making it very clear what the priorities are based on the effort involved versus the potential payoff. There’s a prioritization that takes place, in part based on what their skill sets are and what they want to outsource. There’s a lot in play besides just weighing the effort versus the potential outcome. How do you map all this out?
That’s a good question. I think that’s always important to keep in the back of your mind when you’re giving recommendations to clients or developers, because telling them to overhaul a page template to slightly improve a Core Web Vitals metric or something is probably not going to work. As we’re all starting to realize, the page experience update wasn’t as huge a ranking factor as many SEOs made us believe it might be.
That being said, I think when it comes to prioritizing technical issues, I like to look more so for things that are relatively easy to fix but have a big impact. Maybe it’s something like the canonical tag is not set up correctly on your category pages or on your product pages. It’s one thing that’s broken that’s affecting thousands of pages simultaneously. That one fix is going to have a really big impact.
Looking for those systemic issues is really important. A lot of SEOs get laser-focused on small details that might not matter as much. But when you can find those templated or site-wide issues, that tends to have a bigger impact.
The most important skill in SEO is the ability to figure out problems as they come to you, research the problems, and try things and see what works.

Yeah, for sure. What are some of the most egregious errors that you’ve come across that were kind of system-wide?
A lot of it has to do with Google ignoring things like canonical tags, which it has more recently treated as suggestions. Sometimes Google ignores title tags, too, and that’s not necessarily a technical problem. You have to look at how Google is crawling your website and what it’s actually doing, as opposed to what we think it’s doing when we look at tools like Screaming Frog and other tools that illuminate certain issues. You have to cross-reference that with how Google is actually processing the website: looking at crawl stats, looking at things like the “crawled but not indexed” report. Sometimes you’ll notice things when that report is working correctly, because I think right now it’s not 100% accurate.

But when it is accurate, it can illuminate problems like, why are so many of our product pages not being indexed? That report in and of itself is showing sample data; it’s only maybe 1,000 URLs that we’re looking at. But if you see 50 product pages appearing in there, that’s probably indicative of a larger problem. You probably have thin content, low-quality content, or content that Google doesn’t think is worthy of being indexed on its own. That, to me, is a very big systemic problem.
It’s less about looking at individual URLs and more about looking for patterns. I think that’s where we like to prioritize our focus: on something that can affect the whole site globally at once.
You mentioned Screaming Frog early on in the conversation. I’m curious, where does a tool like Botify fit in compared to Screaming Frog, which is quite a lot cheaper? Also, Screaming Frog has two different tools, the SEO Spider and the Log Analyzer, and most people are only using the crawler, not the Log Analyzer. What are some use cases and examples of key takeaways or findings you were able to glean from using either a log analyzer or a more industrial-strength crawler to analyze site issues?
It depends on, for one, the budget that you’re able to work with. As you mentioned, Screaming Frog is way more cost-effective for most SEOs and teams than something like Botify. But for certain enterprise clients, you need a tool like Botify because it’s cloud-based. You can’t possibly crawl those clients on your own machine without breaking the machine, or, in our case, getting our IT guy really mad at us all the time for doing those crawls locally.
For our bigger clients that have hundreds of thousands or millions of URLs, we have to use an enterprise crawler like Botify. That’s happening on a scheduled basis in the background, in the cloud. It’s not something that affects your workflow; it’s just there, and all the crawls are stored, so it’s really easy to see when something changed from week to week, which is very helpful.
Screaming Frog and Sitebulb, I believe, also now have scheduled crawl functionality, which is really great. Because having the historical crawl data proves to be very helpful in many cases. Especially at an agency where we have so many different clients, and it’s like okay, when did this change take place? Oh, somebody added a noindex tag three weeks ago. We have the crawl that shows that it got added. That’s really important.
When it comes to log file analysis, honestly again, being in an agency, it’s generally very hard to get buy-in from clients to be able to grant us access to even view log data. I know that my friends that work in in-house environments have much more access to be able to see that type of data and get information from that type of data. For us, it’s more difficult because you really have to get buy-in and bypass a lot of different security concerns and things like that.
That’s why Google Search Console’s crawl stats report has proven to be so helpful. Even though it’s sampled and isn’t 100% of all log file data, it’s very helpful for getting an understanding of how Google is crawling the site. Thank you, Search Console team, for launching that, because it’s been very helpful for us.
Do you have an example of a big win, a technical issue you found that perhaps turned into a multi-million dollar win for a client’s bottom line?
As much as possible, Google should not waste your crawl budget on resources or pages that don’t offer any value or shouldn’t be indexed.
What’s nice with crawl stats is that you have to set it up on the domain property level. More often than not, it illuminates subdomains that you didn’t even know were being crawled. That’s happened a lot with our clients, where we’re like, what’s this subdomain, and why is Google spending 20% of the crawl budget for your domain on this subdomain?
The clients are like, we haven’t touched that in 10 years; we didn’t even know it still existed. And we’re like, well, Google is wasting a lot of time on it. We’ve cleaned up a lot of that. There are also other things, like robots.txt modifications we can make when we see Google getting trapped in some area of the site that doesn’t really matter for business reasons. Cleaning that up and narrowing that down has been very helpful.
How important is crawl budget?
It depends on how big your website is and how much demand there is for your website. For bigger brands that hopefully don’t have too many crawling or technical issues, I don’t know that it’s a massive problem or limitation. If you do see a problem, again, with Google crawling but not indexing a lot of your content, you might want to look into ways to make sure that whatever they are crawling, they are also indexing, and that, as much as possible, they’re not wasting their crawl budget on resources or pages that don’t offer any value or shouldn’t be indexed.
I don’t know that it’s an issue in and of itself, but there are always opportunities to optimize it, especially if you see a systemic pattern where Google seems to be spending a lot more resources crawling something that’s not super important for the business. A lot of the time with WordPress sites, for example, there are these feed URLs; with Shopify, it’s .atom URLs; or there are other kinds of feeds or types of pages that are just part of the website but don’t actually offer any value for the searcher. There are ways you can modify the robots.txt file to prevent Google from crawling those things, and that can help them spend their resources on the pages that actually matter.
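To make that concrete, here is a minimal robots.txt sketch along those lines. The exact paths are hypothetical and depend on your platform, so treat this as a starting point and test it (for example, with Search Console’s robots.txt tester) before deploying, since an overly broad rule can block real content:

```
# Keep crawlers out of auto-generated feed URLs that offer no search value.
# /feed/ and /*/feed/ cover WordPress main and per-post/per-category feeds;
# /*.atom$ covers Shopify-style Atom variants of collection and product pages.
User-agent: *
Disallow: /feed/
Disallow: /*/feed/
Disallow: /*.atom$
```

Google’s robots.txt parser supports the `*` wildcard and the `$` end-of-URL anchor, which is what makes the last two rules work.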
Would you say a site with a million-plus pages would need to spend quite a bit more time optimizing its crawl budget versus a 100-page website?
Most likely, yeah, I think that’s probably where you’re going to run into larger issues. We have a lot of publisher clients or big ecommerce clients where they’ve just been publishing content so much for so many years and a lot of it doesn’t offer any value anymore. They might have 500,000 articles that are 10–15 years old, and 10–15 years ago, the content strategy was very different than it is now.
A lot of these SEO tools just report back basic facts that don’t matter. Prioritizing what’s necessary to fix is a more important skill.

You could do something like embedding a social media post or a picture with a few words. Then there’s just a bunch of ads on the page and no other real valuable content, but they just let it remain in the index or in the sitemaps. It’s not doing anything for the site. A lot of the work that we do is just making sure that old content is not dragging down our crawl budget, in the sense that Google is just wasting time seeing if anything changed there when it rarely does.
It’s not just about preserving crawl budget, it’s about making the site look more valuable to Google because there’s more rich content and few, if any, thin content pages, right?
Exactly. That’s the end goal. I’m always saying, put your best foot forward with all the pages that you do want indexed, because anything that you’re allowing Google to index is being counted in the overall evaluation of quality on your site. People always ask, should I have tag pages indexed, should I have author pages indexed, and all these things?
Have them indexed if they offer value and they are pages that you think somebody would benefit from if they landed on them from Google. Author pages, yes, I think you should. Tag pages are a tricky subject because a lot of WordPress sites just allow anything to be indexed and anything to be tagged. I wouldn’t do it willy-nilly, but I would look for opportunities to take those tag pages and make them something valuable if they are going to be indexed.
My opinion on tag pages is that they shouldn’t be in the index. In most cases, they’re not valuable entry points into a site, and they’re often duplicate content. And then there are people who just don’t bring any consistency to their approach to tagging, so they end up with tag pages that have very similar-sounding tags: maybe a verb, a couple of verb tenses, maybe a synonym, and so forth. That can be a real mess, and all of it ends up in the index.
Tag pages are a tricky subject because a lot of WordPress sites just allow anything to be indexed and anything to be tagged.
That’s a common problem, especially with WordPress sites, where, out of the box, they allow tag pages or different types of archive pages to be indexed, and then maybe the editorial teams don’t have a solid process around what’s a category and what’s a tag. People think tag means hashtags. They go and write all these different things, and then the other authors write all these similar things, and suddenly, like you said, there are a million overlapping or duplicate tag pages.
If you are going to allow them to be indexed, for the sake of the quality of your overall site, you really have to be careful about avoiding duplicate content and making sure the pages actually add value.
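If you decide tag pages shouldn’t be indexed, one common approach is a robots meta tag on the tag-archive template. Most SEO plugins expose this as a setting, but as a hedged sketch, the underlying markup looks like this:

```html
<!-- On tag archive templates: leave the pages crawlable so links can be discovered,
     but ask search engines not to index the pages themselves -->
<meta name="robots" content="noindex, follow">
```

Note that a page has to remain crawlable (not blocked in robots.txt) for Google to see the noindex at all.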
This whole concept of content pruning is important for a large website. Is it something that a company should do on an ongoing basis, like re-evaluating every month, or is it more of a one-off where you maintain certain standards afterward and never have to touch it again?
It depends on how much content you create. It’s kind of funny, because over the last couple of years, a lot of the work that I’ve been doing with my team has been around pruning, eliminating, consolidating, or merging pages together, really just reducing the overall number of URLs in the index. Whereas maybe five or seven years ago, the focus in SEO was pretty different.
It was: get as many pages out there as you possibly can. Write as much content as you possibly can. Try to rank for all the things regardless of your authority on those topics, and it worked. Launch an affiliate website, write about all these things, put affiliate links on it, and make a bunch of money really quickly. That worked.
Google’s cracking down on these types of tactics. A lot of the work that we do in terms of helping sites recover from algorithm updates, and in terms of getting in line with what Google is looking for with the product review update, comes down to: maybe we don’t have the authority to write about all these different topics. Maybe we wrote the same thing 10 times on 10 different URLs over the course of the years, and that should really just be one article.
Many of these Google updates and announcements all boil down to them not wanting spam and low-quality content.
It’s kind of funny that it’s reversed in that way. I think a lot of these Google updates and Google announcements all boil down to them just not wanting spam and low-quality content. So we have a lot of work to do to clean things up.
What would be some of the ways that Google is cracking down? You mentioned the product review update; how is Google cracking down algorithmically and through manual means?
The core updates are probably the biggest example. Almost every site that we deal with that has been affected by core updates, which is a lot of our clients at this point, was doing something that was a gray area as far as Google’s quality guidelines are concerned, and maybe they didn’t even know it.
Let’s take, for example, a very common problem, which is having a page for every city in the country even though you offer the exact same thing in every place. Or maybe you don’t have a business address in every place, but you want to rank for that city plus the service, so you use the same template on all those pages. Google can identify patterns like that. It’s very clear to me, because I’ve worked on so many of these types of sites that have been impacted, that over the last few years of algorithm updates Google has been cracking down on those types of techniques, because I think they see them as not helpful for users.
Through the broad core updates, the product review updates, and the small tweaks that happen between the big updates, they’re constantly refining what it means to have high-quality content. I think, for the most part, they’re very interested in, not necessarily longer pages, but just more helpful pages.
Things like jump links, for example, where you can come to a page and navigate to the section that’s most important to you. Then we have passage-based ranking and all these things where, basically, Google doesn’t need an individual URL for every single keyword that you’re targeting. You might be better off taking a group of related keywords and building just one page.
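For readers who haven’t used them, jump links are just in-page anchors; a consolidated page might expose a small table of contents like this hedged sketch (the section names are illustrative):

```html
<!-- A simple table of contents whose jump links target sections of one consolidated page -->
<nav>
  <a href="#benefits">Benefits</a> |
  <a href="#how-it-works">How it works</a> |
  <a href="#faq">FAQ</a>
</nav>

<h2 id="benefits">Benefits</h2>
<p>…</p>

<h2 id="how-it-works">How it works</h2>
<p>…</p>

<h2 id="faq">FAQ</h2>
<p>…</p>
```

Google can surface anchors like these as “jump to” shortcuts in results, which is one reason a single well-structured page can cover a cluster of related keywords.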
When prioritizing technical issues, look for things that are relatively easy to fix but have a significant effect.

Can you give us an example of taking that approach?
I was trying to think about how to describe this without giving away my clients. Let’s take eHow, for example. It’s not a client of mine, but it’s one of these sites with “how to do XYZ” articles, and in the past, they were ranking really well for a ton of different topics because they have these how-to articles for everything you can imagine under the sun. I think that includes a lot of Your Money or Your Life content.
“How to treat acne naturally” or “how to treat a cough using natural products,” for example. That’s something Google’s really been cracking down on. For publishers who don’t have medical or scientific authority, and who aren’t renowned journalists or anything like that, writing about these very consequential topics that have a great impact on people’s safety, security, well-being, and health might not be a technique that can bring you traffic anymore, depending on who you are as a brand and as a publisher.
First of all, you can see if you have the authority to publish on those topics. But then, to the point of consolidation, maybe you wrote 10 different pages about 10 different types of tea and the benefits of each. Maybe you should have one page, “here are the top 10 types of tea and all their different benefits,” and then you could just navigate within that page to the individual types.
We do a lot of work like that with our clients, where we notice that they have so many different URLs that would actually benefit from being on one page.
That happens a lot on ecommerce sites where you have different colors of the same product, each with their own unique URL instead of having all of it consolidated to one product URL. You might have separate SKUs for each color, but it’s really the same product with one, two, or three different color choices.
Yes, that’s a great example and something that I definitely uncover in many cases when I work with ecommerce clients. I think some Shopify sites have that setup as the default, and it’s problematic. That’s a good example of when you go into your “crawled but not indexed” report in Search Console, you might see those different color variants not being indexed. Or you can just go into Search Console, look in your performance report, take the name of the product, filter by just that name, and see all the different variants and how much traffic they’re getting.
In many cases, you’re going to see Google chose one as the canonical and then the other ones are not really ranking. That’s a good indication that you should probably have those pages consolidated. It’s a lot of work, but again, it goes into this larger trend of Google wanting sites to have less duplicate and thin pages and more consolidated pages.
The traditional fix for this issue is to use the canonical tag, and yet it’s only a hint, as far as Google sees it. It may not obey those canonical tags, and thus we still end up with all those different color variants in Google’s index, even though you’ve applied SEO best practices and added all these canonicals.
Yup, canonicals are a suggestion. At the last agency I worked at, I was primarily focused on ecommerce. Maybe six or seven years ago, I was like, wow, I didn’t know canonical was just a suggestion. I thought it was a directive, and now I know, because Google is ignoring a lot of them.
But I think when they ignore them, it’s a good indication that there’s something you should maybe reconfigure as far as your canonical strategy goes. Maybe you’re a company like a lipstick brand I used to work with; they were known for the different colors of their lipstick. In that case, you wouldn’t want the colors to all be on one page. You might actually want the individual colors to have their own pages, because people are searching for those color variants.
Put your best foot forward with all the pages that you do want to be indexed. Anything you’re allowing Google to index is being counted against the overall evaluation of quality on your site.

If you’re seeing patterns like that, where Google is ignoring it because they think that users want an actual page that just shows that one color or that one variant, then maybe that strategy doesn’t apply to you.
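For reference, the canonical hint they’re discussing is a single link element in the head of each variant URL; here’s a minimal sketch with hypothetical URLs:

```html
<!-- On /products/lipstick-red, /products/lipstick-pink, etc.:
     hint that the consolidated product page is the canonical version.
     Google treats rel=canonical as a suggestion, not a directive. -->
<link rel="canonical" href="https://www.example.com/products/lipstick">
```

If Google keeps ignoring a hint like this and indexes the variants anyway, that’s the signal Lily describes: searchers may genuinely want the variant pages, and consolidation may not be the right strategy.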
Something you just mentioned a few minutes ago was Your Money or Your Life. That’s actually a Google term; it’s in Google’s Quality Rater Guidelines. YMYL, Your Money or Your Life: those types of sites get a lot more scrutiny than regular sites about basket weaving or whatever.
If you were to break down the key things to know from the Google Quality Rater Guidelines, besides the fact that you’re going to be much more of a target if you’re in the YMYL set of categories, what would those be? I know I’m pitching you a softball here so that you can talk about E-A-T, but there’s more to the Quality Rater Guidelines than just that. Let’s talk about that.
Oh yeah. There are like 165 pages or something, and they just updated the guidelines last week for the first time in a year; I just heard about that on Search Engine Journal. It’s interesting because one of the updates they made this year was to the definition of Your Money or Your Life, and they also did that last year.
Every time Google updates the guidelines, they seem to be expanding this notion of Your Money or Your Life.
Almost every time they update the guidelines, they seem to be expanding this notion of Your Money or Your Life. At this point, it’s pretty comprehensive. Basket weaving is a good example of something that’s probably not Your Money or Your Life, but you can almost make the argument that all content is, especially if it’s ecommerce and it’s accepting people’s credit card payments. In that sense, it’s Your Money or Your Life.
But of course, the further along the spectrum you get towards talking about things like health conditions, politics, and news, that’s going to be the most Your Money or Your Life content. But to your point, there are so many other things in the guidelines. I strongly encourage anybody working in SEO to look through this whole thing because there are so many nuggets of information about what Google perceives to be high and low-quality content.
They have links to examples throughout the guidelines that show you actual pages that they rate high or low quality. You can extrapolate from those examples and say, that looks a lot like the client that I’m working on, so I know I need to change these things. They talk a lot about E-A-T, which is Expertise, Authoritativeness, and Trustworthiness. E-A-T matters more the more Your Money or Your Life the site, the page, or the topic is. But it really seems to be the case that they’re using E-A-T as a blanket assessment of how valuable and high-quality something is. Throughout the guidelines, they mention it over a hundred times.
So familiarize yourself with these notions of high and low-quality content. Then there’s a whole other section of the Rater Guidelines that asks, does this page meet my needs? That talks more about intent. There are also raters who are tasked with flagging things as offensive or adult content. Knowing all these different inputs and signals that Google looks at when it does these quality evaluations is, I think, really important to understanding what’s happening with your pages.
What would be an example of something that would be maybe even counterintuitive that you found through analysis of the Quality Rater Guidelines that perhaps our listener or viewer is likely not implementing?
Start your content strategy with your areas of expertise because that can translate into better performance over time.

Honestly, most of it to me is pretty intuitive. I don’t know that anything is necessarily counterintuitive. But the first time I read it, I was pretty struck by how specific Google is about E-A-T and researching the reputation of brands.
They say, for example, take a brand name, put it into the search results, and look for things like Better Business Bureau ratings, Yelp ratings, Amazon ratings, any other third-party site that you can look at, or how many awards the individual authors have won. Very specific and nuanced evaluations of how to determine how much E-A-T a person or a brand has. Honestly, I hadn’t thought that way before I read the guidelines, which I think were only made public for the first time in 2015 or 2016, or maybe a little earlier.
Again, the guidelines are not an exact demonstration of what Google is currently doing with the algorithms. They just show how Google evaluates quality and what will potentially happen in future algorithm updates. But reading the guidelines really changed my perspective on SEO, because it used to be, again, just: make sure you have your technical SEO in order and publish whatever topic you want, as long as it’s optimized.
But the guidelines really confirm that you have to think long and hard about your brand and people’s experience with your brand, your authors, and everything. That’s fundamentally changed my approach to SEO and my team’s, and the results have been pretty clear: focusing on these types of strategies is very effective for SEO nowadays.
Let’s say a new client comes in and they don’t have the awards, they don’t have the certifications, and they don’t have author pages listing contributors’ certifications, awards, and so forth. What would be the first opportunity that you would have them work on, or work on with them?
Everybody has to start somewhere.
It’s a good question. I always say everybody has to start somewhere. I wouldn’t use those hurdles as an excuse, like, well, this isn’t going to work for us. You have to start building a foundation if you want to succeed on Google, and it’s probably going to take a very long time.
If you’re publishing on Your Money or Your Life topics like health, it’s going to take even longer; it’s very hard to compete with the big players in certain categories. But what I always tell clients is, even if you’re a small business, or let’s say you’re a college professor who has a blog (I work with a client like that), you hopefully have an area of expertise, because that’s why you started the business in the first place.
An example that I used in some recent presentations was a tattoo artist that I worked with here in New York; I built the website with him. He was tattooing, I was asking him questions, and I was writing blog articles using literally only the information that he gave me. I didn’t do keyword research. I didn’t look at what everybody else had written and just write the same thing. I just wrote what he told me, and that was four or five years ago that we published these blog articles.
To this day, they’re ranking in featured snippets for very high-volume tattoo keywords. The only thing we did was leverage his actual expertise. I’m always telling people, don’t necessarily start by doing keyword research to pick the highest-volume keyword. Start with the areas of expertise that you as a professional actually have: what do you want to share? What do you want to tell the world? Start your content strategy from there, because that genuine expertise, I think, is going to translate into better performance over time.
All right. When addressing the trustworthiness side of it, the T in E-A-T, is that about getting more high-trust links? Is it about something other than that? Where do you see the trust side of this, and how do you boost it?
Again, it depends on what type of category your site is in. It’s pretty clear for ecommerce. There are all kinds of signals that you can demonstrate throughout the site like Trustpilot integrations, just star ratings, a clear return policy, or a good Better Business Bureau rating. All these things that consumers would actually look at when they’re looking to make a purchase I think are important for E-A-T and for trust.
If you’re a news publisher, Google’s made it very clear throughout the news ecosystem that they’re looking for these transparency signals. Who are your authors? Authors should have bylines and biographies. Where else have they written online? What are their credentials? Have an overarching editorial policy page: how do you source information? How do you check your facts? All of these are signals that you can add throughout the content; think about it from a user perspective.
I think if you look at the biggest players in the health space, science, or anything like that nowadays, when you load the page, all the above-the-fold content addresses: who wrote this? Who reviewed it? Who are the experts behind it? Learn about our policies, learn about our leadership team, how long have we been in business? All these things are signals that help users, and I think they also indirectly help Google as well.
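One hedged way to expose those same transparency signals to machines as well as users is schema.org Article markup; this is a minimal sketch with hypothetical names and URLs, not a guarantee of rankings:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Care for a New Tattoo",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://www.example.com/authors/jane-doe",
    "jobTitle": "Licensed tattoo artist"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Example Studio",
    "url": "https://www.example.com"
  },
  "datePublished": "2021-10-25"
}
</script>
```

The structured data should mirror what’s visible on the page (the byline, the author bio link, the publisher info), not replace it.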
You brought up Google News. Let’s actually talk more about that. There’s a set of rules and requirements for publishers in order to be in Google News, in order to rank well in Google News as well. What would some of those be?
Google News is actually an area where Google is a little more forthcoming as it relates to E-A-T. They say, for example, you need to have author bylines. You need to be transparent about your editorial policies and guidelines. You should be publishing at a certain cadence, and you should be using news XML sitemaps. There’s also Google Publisher Center, a separate Google product that can help you manage your logo and some information about your brand.
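For anyone who hasn’t built one, a Google News XML sitemap is a regular sitemap with a news extension on each URL; here’s a minimal sketch with placeholder values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/news/example-article-slug</loc>
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2021-10-25T14:30:00+00:00</news:publication_date>
      <news:title>Example Article Headline</news:title>
    </news:news>
  </url>
</urlset>
```

Google’s documentation recommends limiting a news sitemap to recently published articles (roughly the last two days) rather than your whole archive.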
I’ve had some success just editing the information that’s available in the Publisher Center for our clients. Google also recently rolled out a handful of new manual actions specifically for Google News and Discover, and they’re pretty strict.
You can’t have any content that could be flagged as hateful, violent, or adult content, or swearing and profanity. There are a lot of very strict content requirements, because they’re really protecting the trustworthiness, accuracy, and almost family-friendliness of a lot of the news content that gets put out there.
Then clickbait is another big thing as well. They seem to be cracking down on clickbait for both Google News and Google Discover. We’ve actually had a client receive a manual action for that, even though, from our perspective, the content wasn’t really clickbait. It seems like they’re getting more aggressive with sending out those notifications, because they’re really trying to create a trustworthy ecosystem for the consumers of that content.
How would you define clickbait from a perspective of not being spammy or over-promising and under-delivering?
Clickbait is when you over-promise in your headline something that’s not revealed in the content, and that’s a big problem.
Clickbait is when you over-promise in your headline something that’s not even revealed in the content; first of all, that’s a big problem. Second is when you make the title more extreme, shocking, or emotionally provoking than the content actually is. Ironically, I’ve done some research lately about Google Discover, which is an area where Google says, don’t use clickbait.
Yet the content that performs best there does a lot of the things I just described, especially with the headlines. It seems that with Google Discover, Google’s looking for these very emotional headlines, but you have to deliver on the content, and you can’t bury the lede either. If the headline is “you won’t believe this one thing that this company did,” don’t spend 300 words leading up to it before you get to the point of the article.
Help the user get that information above the fold as much as possible, even though that’s kind of contrary to a lot of business goals that publisher websites have. I think the risk of being eliminated from products like Google Discover is too big of a risk these days, so I think abiding by their guidelines as it relates to clickbait is very important.
For our listeners who aren’t familiar with Google Discover, can you describe it briefly?
I’ve been talking a lot about it recently because so many publisher sites are starting to see more traffic from Google Discover than from organic search. Google Discover is a product that is somewhat like a social media feed that’s customized based on your Google history, so your personal account on Google.
It takes a lot of different data points, potentially even what you’re doing in Gmail, what you’re doing on YouTube, maybe even where you’re standing, all these different data points that Google has, and it creates a curated list of content that you can see through the Google Chrome app or just by going to google.com if you’re logged in on your mobile device. It refreshes, I want to say, multiple times a day.
You can see a curated list of content that’s almost scary in terms of how accurately it matches your interests, and then you can tell Google, I want to see more of this content, less of that content. If your site ends up ranking there, it can bring more traffic to your site than SEO; a lot of publisher sites nowadays get just so much traffic from Discover.
A lot of publisher sites, and not even just publisher sites but different clients that we work with, aren’t even paying attention to this. The blog article that you wrote for SEO might be doing better in Google Discover than in organic search. We’re really taking the approach of looking at both, because they’re both organic channels and both essentially free traffic: what’s working in Discover, what’s working in SEO, and there might be two concurrent strategies that you can use.
Google Discover is a product that is somewhat like a social media feed customized based on your Google history.
Great. Let’s move on to another area of Google where there’s a lot of misinformation or misunderstanding. You spoke briefly about it already, but let’s dive deeper into it: the page experience algorithm update and the Core Web Vitals component of it. There’s a lot of work involved in passing all three Core Web Vitals, if you’re not currently doing so with your site, and getting your PageSpeed scores into the green. How important is this, what do you need to work on, and what are some of the nuances of this page experience algorithm?
The three new Core Web Vitals metrics are just a few of several metrics that Google looks at as far as page experience, so that’s one thing that’s important to keep in mind. It’s also important to keep in mind, and this is something I’ve been talking about a lot recently, that this is a lightweight signal.
My friend Glenn Gabe talks about this a lot as well. We were recently in an interview together where we talked about the page experience update: there was so much build-up and so much anticipation, but it didn’t end up being as big of a change as people anticipated. It also took two months to roll out, so it’s very hard to isolate whether the work that people did to improve Core Web Vitals actually translated into better performance.
That being said, it’s part of a larger strategy of PageSpeed optimization that sites should be focused on regardless of their SEO rankings. I think it’s very important to think about performance and PageSpeed.
I was excited about CLS, Cumulative Layout Shift. That’s when you load a page and the page continues to move under your cursor, or under your finger on your phone, which is very frustrating for users. When Google announced that metric, I thought, that’s great. That being said, a lot of the best performers still have horrible CLS.
I think it’s important. I think it’ll increase your conversions if you focus on Core Web Vitals. I just don’t know that it’s the biggest SEO signal that sites necessarily need to be focused on first and foremost, above things like improving your website architecture or your content quality.
How about the links? Where does link quality fit in, in your opinion?
People that are building links are violating Google’s guidelines and potentially getting websites in trouble.
Yes. I tend to not list links as one of my main areas of focus with SEO, but that’s not because they aren’t important. Of course, they’re very important. The reason that I frame it that way is because it’s so hard to build links in a way that’s in line with Google’s guidelines that more often than not, people that are building links are violating Google’s guidelines and potentially getting websites in trouble. Then a lot of those sites come to us and say we got in trouble, can you help?
I think it’s better in most cases to focus on content quality and then, of course, some version of digital PR and social media to get links in an organic way, where people are truly finding your content valuable and linking to it organically. So much of the in-between is another example of what Google has been innovating against and trying to crack down on with its algorithm updates. Whether it’s the broad core updates or the updates that happen in between, it’s pretty clear that most forms of link spam and black hat SEO are becoming outdated, sorry to say.
If you’re focusing your efforts on paying for links, it might work for now, but the risk of getting penalized, or of Google identifying those tactics and making them obsolete, is I think pretty high. You should focus on links, you should get links, but do it in a way that’s 100% in line with their guidelines.
Yeah, and not paying for links has been part of the guidelines for a very, very long time, so this isn’t news for skilled SEOs. They may just be skirting the guidelines out of convenience, or because they’re after a short-term hit rather than sustainable rankings and benefits for the client. What would be an example of a link campaign that you’ve seen that was totally above board and really innovative? Something that you’d want to show a Google engineer?
Yeah, there’s a lot of them. I used to attend MozCon in person before the pandemic and hopefully again in the future soon. Lisa Myers, she might have changed her name. I don’t know what her new name is, but Lisa Myers shared a lot of really interesting and exciting examples.
Help the user get that information above the fold as much as possible, even though that’s kind of contrary to a lot of business goals that publisher websites have.
I think the one that stuck out most, and this is a good example of how link building should be done if it’s being done by SEO and digital PR teams, was “what’s the dirtiest part of an airplane interior?” It ended up being the tray table, but it was this whole thing where they went into the airplane, tested all these different parts of the plane, discovered it was the tray table, and published this big study with all this data behind it. I think there was a video, and there were all these different formats displaying the same information, like an infographic, really interesting and exciting visuals to represent the research that they did.
As a result, it not only got links, it got TV press coverage and became this huge campaign that got a lot of eyeballs; it was on the front page of Reddit and all these things. That obviously takes a ton of work, to do that type of research and create nice accompanying visuals, videos, and everything. But that’s going to get you plenty of links, more links than most sites are able to get.
Again, it’s a different side of SEO. At this point, it’s not even really SEO; it’s a different animal in and of itself, digital PR. There are other companies that specialize entirely in that, and we partner with some of them, because the reality is that my SEO team doesn’t pick up the phone and call journalists all day to pitch our ideas. That’s PR.
Focusing on a cool piece of content that we can create as a brand, something very helpful and different that hasn’t been done before, research that hasn’t been done before, is going to be the best approach, because that will authentically get a lot of eyeballs and a lot of links.
Yup, for sure. Out of the different updates that you’ve come across, studied, and so forth, what have we not discussed in this episode that we really should?
We’ve talked about broad core updates. We’ve talked about Google News and Discover, which continue to innovate with new updates. In a lot of what Google has been announcing this year, throughout its documentation and things like the Search On conference and all these different Google communications, they almost always talk about E-A-T at this point.
They always talk about the work they’re doing to make sure people can tell that brands are trustworthy and legitimate. There’s this new feature on Google called ‘about this result,’ where you can click the three little dots next to a result and see information about the brand, so that ties back to E-A-T.
I guess one thing we didn’t talk about is what Google is doing with Images, Google Lens, AR, and being able to look at a product in your living room, on your phone, and things like that. I don’t know how much of that directly plays into what we do day-to-day on the SEO side, but it’s clearly a big area of focus for Google, and also Google Shopping as well.
I think that ties into what they’re doing with displaying a lot more data from Google Merchant Center, knowledge panels becoming more robust with product content. The ability to use Google Lens to look at a product and shop for something directly on Google. I think that one of their many initiatives this year is to become a bigger competitor to Amazon, so we’ll see how that goes.
Speaking of images, is there anything in particular that you’ve figured out, or just want to share with our listener or viewer, about Google Image search? Let’s say the listener wants to rank higher in Google Images. Is that something that you would recommend, first of all, and if so, what would be some strategies and tactics to achieve that?
Yeah, there’s a lot to say about that. Number one, we talked about Google Discover: high-quality images of a certain size, 1,200 pixels wide or bigger, are essentially required for Google Discover, so that’s one consideration. There are also all kinds of recommendations about the performance and loading of images, making sure that they load quickly.
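For what it’s worth, Google’s Discover documentation pairs that image-size guidance with a robots meta setting that permits large image previews; a hedged sketch of that one-line change:

```html
<!-- Allow large image previews in Search and Discover;
     Google's guidance pairs this with images at least 1200px wide -->
<meta name="robots" content="max-image-preview:large">
```

Per the same documentation, sites serving content via AMP get the large-preview treatment without this tag.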
My colleague, Romain Damery, published a really great article for Search Engine Land about image optimization that goes into great detail about those best practices. I think that for ecommerce websites, you can see a big impact in terms of image traffic and conversion rate optimization as well if you use actual photos that you’ve taken of the products. Professional photos, as opposed to some image that maybe came from the manufacturer or like a stock photo.
If you can provide images of your products, your services, your storefront, or whatever, taking your own photos is going to be more effective than just using what’s already out there.
What about optimizing these photos to improve page load time, to improve the quality of the images? What are some of the techniques that you recommend for an existing catalog of images on a website?
You can use a CDN, you can implement lazy loading, and you can make sure your image files aren’t massive, which is something we find on a lot of sites. Honestly, when it comes to best practices…
Just go right into Screaming Frog and when you run your crawl, you can just go to the images report or tab and then sort by the file size of those images.
Exactly. There’s a lot of best practices around what type of file format to use for different images as well. I don’t have those all memorized. But again, the article that I mentioned before, my coworker Romain, went into great detail about all these best practices, so I would encourage listeners to check that out.
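As a hedged sketch of the lazy-loading piece Lily mentions, modern browsers support it natively through an attribute, and srcset lets the browser pick an appropriately sized file; the paths and sizes here are illustrative:

```html
<!-- loading="lazy" defers fetching until the image nears the viewport.
     srcset/sizes let the browser choose the smallest file that fits the layout.
     Explicit width/height reserve space and help avoid layout shift (CLS). -->
<img src="/images/product-800.jpg"
     srcset="/images/product-400.jpg 400w,
             /images/product-800.jpg 800w,
             /images/product-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     loading="lazy"
     width="800" height="600"
     alt="Hand-painted ceramic mug in matte blue">
```

One caution worth noting: avoid lazy-loading images that appear above the fold, since deferring them can delay the largest contentful paint rather than help it.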
Awesome. What would be something that we would want our listener or viewer to take action on as kind of a next thing? A lot of people are just passively consuming podcast episodes and then moving on to the next one. That doesn’t actually move the needle if you don’t implement anything of what you’ve learned. What would you recommend as a next step?
I think there’s just a lot of SEO misinformation out there, and it makes sense why. First of all, Google is not always 100% clear about everything that factors into its evaluations, so that’s one aspect. A lot of the articles that come out of the SEO industry are people speculating, reverse-engineering things, and trying to figure things out, but we don’t have all the information.
I think it’s really important to not take SEO information at face value, which can be challenging and also understand that SEO from two, three, or five years ago might be completely different. A lot of the rules have changed.
It’s really important not to make decisions without a true understanding of the facts. Cross-reference the facts as much as possible with Google’s own documentation. They have Google Search Central, essentially on the developer side of the Google site, which provides answers, as much as it can, to a lot of the things SEOs speculate about. I would encourage people to really look at those guidelines and make sure you’re making decisions off of that data and not just what you read on some random SEO blog.
Right. I think one really easy next action for our listener to take is to go through the Quality Rater Guidelines, even if it’s just a quick scan through. Where would our listener or viewer go if they wanted to work with you and your agency or at least learn from you?
I share my thoughts, kind of stream of consciousness style all the time on Twitter. It’s @lilyraynyc. You can find me personally at lilyray.nyc and then I work at Amsive Digital, so if you want to work with our agency, you can reach out to Amsive Digital.
Awesome. All right. Thank you so much, Lily, and thank you, listener. I hope you apply what you’ve learned here and not just move on to the next episode. At least pick three things here from this episode and get them done. We’ll catch you in the next episode. I’m your host, Stephan Spencer, signing off.
Important Links
Lily Ray
Facebook – Lily Ray
Instagram – Lily Ray
Twitter – Lily Ray
LinkedIn – Lily Ray
Amsive Digital
Facebook – Amsive Digital
Instagram – Amsive Digital
Twitter – Amsive Digital
LinkedIn – Amsive Digital
Glenn Gabe
Lisa Myers
Romain Damery
Better Business Bureau
Botify
eHow
Google Search Console
MozCon
Reddit
Screaming Frog
Search Engine Land
Shopify
Sitebulb
Trustpilot
‘about this result’ – Google
Core Web Vitals
Cumulative Layout Shift
Log Analyzer – Screaming Frog
PageSpeed
Quality Rater Guidelines for Google
SEO Spider – Screaming Frog
YMYL & E-A-T
Your Checklist of Actions to Take
Be open to trying new things to find what works best for me. It will help me conquer the fear that is blocking me from moving forward and will allow me to expand my mind.
Always remember priorities are based on the effort involved versus the potential payoff.
For SEO technical issues, look for things that are easy to fix but have a significant impact.
Make sure that whatever Google is crawling is also being indexed, so that, as much as possible, Google is not wasting my crawl budget on resources or pages that don’t offer any value or shouldn’t be indexed.
Put my best foot forward with all the pages that I want to be indexed. Anything that I allow Google to index is being counted against my site’s overall evaluation of quality.
Tag pages shouldn’t be indexed; in most cases, they’re not valuable entry points into a site, and they’re often duplicate content.
Have patience; everybody needs to start somewhere. I have to build a foundation if I want to succeed at Google, and it’s going to take time.
Don’t start with keyword research by picking the highest volume keyword. Instead, start with my area of expertise.
Avoid producing clickbait. Google is getting more aggressive against clickbait because they’re trying to create a trustworthy consumer ecosystem.
Don’t take SEO information at face value. Google is not always 100% clear about everything that factors into SEO evaluation. Also, understand that SEO from two, three, or five years ago is entirely different now.
Check out Lily Ray’s website to learn more about her. Also, visit Amsive Digital’s website to work with her agency.
About Lily Ray
Lily Ray is the SEO Director at Amsive Digital, where she provides strategic leadership for the agency’s SEO client programs. Born into a family of software engineers, web developers, and technical writers, Lily brings a strong technical background, performance-driven habits, and forward-thinking creativity to all the programs she oversees. Lily began her SEO career in 2010 in a fast-paced start-up environment and moved quickly into the agency world, where she helped grow and establish an award-winning SEO department that delivered high-impact work for a fast-growing list of notable clients, including Fortune 500 companies. Lily has worked across a variety of verticals with a focus on retail, e-commerce, B2B, and CPG sites. She loves diving into algorithm updates, assessing quality issues, and solving technical SEO mysteries.