Today, I reviewed a notification from Google Search Console stating that some pages on my site, https://brianschnabel.com, were not being indexed: they were reportedly blocked by robots.txt and excluded by a noindex tag. After investigating, I realized this was likely a false report, because the noindex directive doesn’t actually exist on my pages. The issue may stem from cached data or server headers, especially since I recently relaunched WordPress and started posting new content.
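If you want to sanity-check the robots.txt half of a report like this yourself, here’s a rough sketch using Python’s standard library. The page URL below is just a placeholder; substitute whatever Search Console actually flagged:

```python
import urllib.robotparser

# Fetch and parse the site's live robots.txt file.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://brianschnabel.com/robots.txt")
rp.read()

# Placeholder page path; swap in any URL Search Console flagged.
page = "https://brianschnabel.com/sample-post/"
print("Googlebot allowed:", rp.can_fetch("Googlebot", page))
```

If this prints True for the flagged pages, robots.txt isn’t the real blocker.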
I decided not to stress over it, as Google often updates its index over time. I am aware that the noindex directive can appear in meta tags or HTTP headers and that WordPress settings or SEO plugins can sometimes add it unintentionally. For now, I’ll keep publishing content and let Google catch up.
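Checking for a stray noindex is similarly quick. This sketch, again standard library only, looks in both places the directive can hide: the X-Robots-Tag response header and the page markup itself (a crude substring test is enough for a yes/no answer):

```python
import urllib.request

url = "https://brianschnabel.com/"  # any flagged page works here

with urllib.request.urlopen(url, timeout=10) as resp:
    # A server, plugin, or CDN can send noindex as an HTTP header
    # without ever touching the HTML.
    header = resp.headers.get("X-Robots-Tag", "")
    body = resp.read().decode("utf-8", errors="replace")

print("X-Robots-Tag:", header or "(not set)")
print("noindex in page source:", "noindex" in body.lower())
```

On the WordPress side, the usual culprit is the “Discourage search engines from indexing this site” checkbox under Settings → Reading; with WP-CLI available, `wp option get blog_public` should return 1 on an indexable site.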
While researching the issue, I also noted a few age-old tips for speeding up indexing naturally, even in the age of AI:
- Submit the sitemap in Google Search Console.
- Use internal linking between posts.
- Share content externally for discovery.
- Publish consistently.
- Use “Request Indexing” for key pages.
 
Some things never change despite the rise of new tech, and Google screwing things up is one of them.
So, yeah, I’m taking a relaxed approach and focusing on building out the site. Google can do its thing and I’m simply going to do mine.