
As Google Cuts Costs, Publishers Should Embrace SEO

"let Google sort it out" is a bad strategy even in the best of times

Google is cutting costs across the board, including the compute resources it puts towards basics like web crawling and indexing. Given the incredible costs of competing in the AI revolution, Google is bound to look for savings in other areas and will likely make more cuts.

This means that brands that want to stand out in search need to invest in contextualizing their content and making its subject matter and structure as clear as possible. Adopting a “Google will sort it out” attitude doesn’t make sense when we know that Google is engaged in broad-based cutbacks, faces mounting AI compute costs, is barely making a profit on its cloud offering, and has said explicitly that it’s harder than ever to get content indexed or even crawled.

To use an analogy, we can view Google as a two-sided market, like a cable TV provider. Providers like Comcast pay premium content providers like ESPN to include them in their cable TV offerings. Smaller channels with less demand, like Food Network or HGTV in their early days, have to pay providers to be included in the channel lineup.

Google similarly invests computing resources into making sure high-demand sources like AP News, The New York Times, or the BBC appear in its search results. Users would find it unacceptable if popular news outlets were excluded from search over technical failings. So even if those brands have poorly constructed websites and don’t provide much context for their content in the way of structured data, internal links, or clear categorization, Google will use its resources to parse and contextualize new pages from those sources.

The same cannot be said for most small-scale publishers, who, just like Food Network or HGTV, need to pay to play. “Paying” in this case means consciously curating content into categories and subcategories, tagging and internally linking content by micro-topic, adding structured data whenever and wherever possible, and emphasizing the best practices highlighted in Google’s Quality Rater Guidelines, specifically anything that can enhance Google’s perception of experience, expertise, authoritativeness, and trustworthiness (E-E-A-T).
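
To make the structured data item concrete, here is a minimal sketch of Article markup in schema.org’s JSON-LD format, which Google accepts for structured data. Every name, date, and URL below is a hypothetical placeholder rather than a recommendation:

    <!-- Minimal, hypothetical Article markup; all values are placeholders. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example Headline",
      "datePublished": "2023-08-01",
      "author": {
        "@type": "Person",
        "name": "Jane Doe",
        "url": "https://www.example.com/authors/jane-doe"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Example Publisher",
        "logo": { "@type": "ImageObject", "url": "https://www.example.com/logo.png" }
      }
    }
    </script>

Even this small block tells Google who wrote a page, who published it, and when, with no inference required on Google’s part.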

In other words, unless you’re a globally recognized media brand, you need to present your content to Google wrapped up in a bow, set on a silver platter, and carried on a velour pillow. Doing so allows Google to contextualize your content, map it to the Knowledge Graph, understand how it relates to existing pages in its link graph, and otherwise fully assess its worthiness to rank, all with a minimal compute investment.

Below is our general assessment of Google’s cost-cutting posture, from search-specific measures to the broad moves and challenges that have made mainstream headlines.


Google Is Reducing Crawling

Gary Illyes, an analyst on Google’s search team, declared in a 2022 episode of Google’s “Search Off the Record” podcast that “Basically, we’re rethinking how we issue refresh crawls.”

In the episode, Gary noted that “computing, in general, is not really sustainable” and that “we definitely have room for improvement there on refresh crawls, because sometimes, it just seems wasteful that we are hitting the same URL over and over again.”

Tallest Tree can confirm that crawl rates are down overall. Crawling has been reduced across all of our clients, with many seeing a simultaneous drop in crawl rates in Q2 of 2023.

Google Intends to Keep Its Index Small

The same Gary Illyes noted in a much more recent episode of Search Off the Record that “Unless you are publishing something utterly unique, it’s pretty hard to get stuff indexed.”

This aligns with research from Kevin Indig, an SEO consultant and former SEO lead for Atlassian, who argues that Google’s index is smaller than we might think and is likely to stay that way. While Google discovers and records billions of new URLs per day, it only indexes a small sliver of them. In fact, Google’s 2019 Webspam report noted that it discovers 25 billion spam pages per day. Indig estimates that spam accounts for roughly 30% of the pages Google discovers.

In just the last several months, Google has made further moves to cut indexation by asking publishers to mark syndicated content with a “noindex” tag, explicitly excluding that content from Google’s index. This countermands previous guidance, which said syndicated content should carry a “canonical” tag pointing back to the original publisher, for example AP News or the LA Times.
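
In practice the change looks like this: where the old guidance had the syndicated copy point back to the original with a canonical link, the new guidance keeps the copy out of the index entirely. The URL below is a hypothetical placeholder:

    <!-- Old guidance: credit the original article with a canonical link -->
    <link rel="canonical" href="https://www.example-wire-service.com/original-article" />

    <!-- New guidance: keep the syndicated copy out of Google's index -->
    <meta name="robots" content="noindex" />

The noindex directive is the blunter instrument: rather than asking Google to consolidate ranking signals onto the original, it tells Google not to spend indexing resources on the copy at all.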

SEOs discussing this development on Twitter/X noted that Google is simply “passing the buck” of cleaning up its index.

Google Cloud Barely Profitable

Both of these moves should be seen in the context of Google’s struggle to make cloud computing—or anything outside of advertising—a profitable part of its business. Despite its expertise and enormous scale, Google Cloud reported its first profit in Q1 of 2023, after 15 years of losses.

And that profit is nothing like the 24.63% overall profit margin Google posted for Q2 of 2023. Google Cloud’s operating margin was a meager 2.6%, putting its profitability on a par with your local grocery store. However, the unit’s historic losses and present slim profits are largely attributable to its ongoing investment in data centers as it continues to play catch-up with Microsoft’s Azure and Amazon’s AWS.

With such slim margins and activist fund managers calling for cost cutting and diversification, why would Google ever expend more of its precious computing power to make better sense of lower-tier publishers?

AI Brings 10x Cost Increases

Even if its cloud services grow into profit centers like Azure or AWS, a staggering new cost center will soon be situated at the very heart of Google’s core business. Alphabet Chairman John Hennessy told Reuters that a chat with a large language model (LLM) is likely to cost 10 times more than a standard keyword search, though costs could be reduced by fine-tuning the LLM.

In a recent episode of Bloomberg’s Odd Lots podcast, Brannin McBee, co-founder of CoreWeave, a GPU-based cloud computing company, noted that AI-specialized computing built on GPUs represents a step change for data centers. Specifically, the Nvidia-based GPU arrays his company builds are “four times more power dense” than traditional CPU-based compute, require correspondingly greater cooling, and are connected not through Ethernet but through 500 miles of fiber to adhere to Nvidia’s “DGX reference spec.”

An Overall Froogle Posture

All the specifics above exist in the context of company-wide cost cutting. In January, Google cut 12,000 jobs after wildly over-hiring during the pandemic. The latest cuts see Google laying off recently unionized contractors.

Ruth Porat, who recently moved from CFO to a new position as Alphabet’s president and chief investment officer, announced that Google would spend $500 million to exit leases as it reassesses its global real estate footprint. To shed still more office space, Google added more than 1.4 million square feet of Bay Area offices to the sublease market.

And while rumors of tape and stapler rationing are perhaps unfounded, Google has announced a swath of smaller policies designed to cut costs. Non-engineers will be issued Chromebooks going forward, equipment replacement cycles are being extended, and there’s talk of cutting back on perks like massages, yoga classes, and fresh-baked muffins.


Compared to laying off employees, buying your way out of long-term leases, or going after all the little employee perks with a budgetary scalpel, it’s very easy for Google to simply crawl 10% less, raise the bar for indexation, or apply less machine learning magic to sub-par websites.

All this underscores the importance of SEO, particularly SEO approaches that take into account ranking systems like topical authority and concepts like E-E-A-T.

That’s why our SEO practice is now squarely focused on building tools that, among other things, help automate the process of tagging content by topic area and creating structured data for author profile pages.
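
As an illustration of that second task, here is a sketch of the kind of author profile markup such a tool might emit, using schema.org’s ProfilePage and Person types. The names and URLs are hypothetical placeholders, not our tools’ actual output:

    <!-- Hypothetical author profile markup; all values are placeholders. -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "ProfilePage",
      "mainEntity": {
        "@type": "Person",
        "name": "Jane Doe",
        "jobTitle": "Senior Fellow",
        "url": "https://www.example.org/experts/jane-doe",
        "sameAs": ["https://twitter.com/janedoe"]
      }
    }
    </script>

Markup like this resolves an author to a single entity that Google can recognize across pages, which speaks directly to the expertise and authoritativeness components of E-E-A-T.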

We’re excited to introduce those tools to our existing clients over the next month, beta test them with a select group of think tanks as we move into the fall, and then roll them out to a wider audience later this year.
