It’s been another busy week in the SEO world. While no Google updates caused ranking problems for SEO professionals, it wasn’t all plain sailing for site owners.
Here’s our weekly recap of what happened, including the Google Search Console performance report delay, an update to the Merchant Center conversion tracking tool, an alternative robots.txt approach from Google Search Analyst Gary Illyes, and an insightful interview with Google’s Director of Product Management, Elizabeth Tucker, explaining how Google measures search quality.
SEO News (Week in Review)
Google Explains How It Measures Search Quality
Elizabeth Tucker, Google’s Director of Product Management, gave a brilliant interview on the Search Off the Record podcast, providing listeners with inside information on how Google measures search quality.
Tucker spoke openly about Google’s challenges in improving search quality, its constant adaptation to new user behavior, and how keyword search lengths have changed.
Key takeaways:
- Google uses multiple ways to measure search quality, including human evaluations and behavioral analysis.
- Tucker says it’s really hard to determine search quality. “We use a lot of metrics, where we sample queries and have human evaluators evaluate the results for things like relevance.”
- As search quality improves, users are making more complex queries.
- Keyword searches are getting longer.
- Prepositions and “not” are still a problem for Google.
- Kids search differently from adults.
Google Updates Merchant Center Conversion Tracking Tool
Google began updating its Merchant Center conversion tracking tool to include conversion data from Search web results and other sources.
The goal is to provide merchants with more transparent performance data to help optimize their marketing strategies and enable Google to improve user search and shopping experience.
Key takeaways:
- New conversion data will include all website conversions, helping sellers improve their marketing campaigns.
- The update will automatically apply to sellers using the Merchant Center.
- Merchants can turn off the conversion settings in 3 ways: “Disable the conversion setting in the Merchant Center, turn it off using the Google & YouTube channel app on Shopify, or unlink their Google Analytics and Merchant Center accounts.”
Google Search Console Bug Delays Reports
The biggest story from last week was the delay in the Google Search Console Performance Report, which left site owners and SEO professionals relying on it for SEO reporting in the dark.
Key takeaways:
- Performance report delay hits the 62-hour mark and counting.
- The latency left site owners and SEO professionals with zero access to crucial site performance data.
- Google says it’s working on a fix but expects it to take 48 hours to catch up.
- Users complain about the lack of data on forums, with some reporting ranking drops.
Google Says Search Console Delay Is Not Another Core Update
Rumors of an unconfirmed Google Core Update began appearing on forums as the Google Search Console Performance report delay continued throughout the week.
Google’s John Mueller quickly said this wasn’t the case, explaining why Google doesn’t show report delays on its Search Status Dashboard.
Key takeaways:
- Mueller says the delay is not an unconfirmed core update; Google will announce it when it’s time.
- Mueller explains that Google reserves the Status Dashboard for core search systems.
Google Reports You Don’t Need Robots.txt On Root Domain
Google Search Analyst Gary Illyes gives an alternative method of using robots.txt that contradicts 30 years of conventional wisdom.
Illyes explains why you don’t need robots.txt on root domains and how centralizing the rules on your content delivery network (CDN) can improve your site’s SEO.
Key takeaways:
- Locating robots.txt files at the root domain is optional.
- You can centralize robots.txt files on CDNs, not just root domains.
- One robots.txt file on a CDN containing all the rules makes the full rule set easier to track and manage.
- As long as robots.txt files aren’t placed mid-path, they’ll work fine.
- Having your robots.txt file rules in a single location could help streamline management and improve site SEO.
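To illustrate what a single, centralized rule set looks like in practice, here is a minimal sketch using Python’s standard-library robots.txt parser, which implements the same exclusion rules crawlers like Googlebot follow. The hostnames and paths are hypothetical, and this is not Google’s implementation — just a way to see one consolidated file answering crawl questions for every user agent.

```python
# Sketch: one consolidated robots.txt (as it might be served from a CDN host)
# parsed with Python's stdlib robots.txt parser. Hostnames are hypothetical.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /

User-agent: Googlebot
Disallow: /staging/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Generic crawlers are blocked from /private/ but allowed elsewhere.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False

# Googlebot gets its own group: blocked from /staging/, allowed elsewhere.
print(parser.can_fetch("Googlebot", "https://example.com/staging/x"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
```

Keeping every rule in one file like this, served from a single location, is what makes the “centralize on a CDN” approach easier to audit than scattering per-host robots.txt files.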