Tallest Tree has worked with dozens of public policy groups to increase their organic web traffic through SEO. Every website we’ve worked on has had room for improvement. In fact, it’s rare to see even basic SEO best practices in place unless a group is intentionally pursuing an SEO strategy.
But how much traffic should a think tank, advocacy group, or political magazine expect from search? Is 100,000 pageviews a month above or below average? With so much variety between groups, it’s hard to benchmark what good search performance looks like.
That’s why we created our Public Policy SEO Ranking. By collecting data for over 300 groups and then running a simple regression analysis, we’re able to create a model that tells groups where their organic traffic ought to be compared to their peers.
This allows us to compare groups of different sizes working in different areas of public policy. Our regression produces an R-squared value of over 0.71 for the national list and 0.78 for the state list, which means these variables explain a great deal of the variance in monthly organic search traffic.
We use the coefficients found in our regression to generate predicted traffic values for each group. We then compare those predictions to the real traffic measurements and generate an efficiency score. The more you exceed expectations, the higher your score; the more you fall short of expectations, the lower your score.
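As a minimal sketch of that scoring step: apply the fitted coefficients to a group's inputs to get predicted traffic, then compare actual traffic against it. The study doesn't publish its exact scoring formula, so we assume a simple actual-to-predicted ratio here, and the coefficients and group figures below are made up for illustration.

```python
# Sketch: turn regression predictions into efficiency scores.
# Coefficients and data are illustrative, not the study's real values.

# Hypothetical fitted coefficients for the four inputs, plus an intercept.
coefs = {"expenditures": 0.002, "backlinks": 0.5,
         "domain_rating": 800.0, "indexed_pages": 3.0}
intercept = -5000.0

def predicted_traffic(group):
    """Predicted monthly organic traffic from the linear model."""
    return intercept + sum(coefs[k] * group[k] for k in coefs)

def efficiency_score(group, actual_traffic):
    """Actual traffic divided by predicted traffic; >1 beats expectations."""
    return actual_traffic / predicted_traffic(group)

group = {"expenditures": 5_000_000, "backlinks": 20_000,
         "domain_rating": 70, "indexed_pages": 4_000}
print(round(predicted_traffic(group)))             # model's expectation: 83000
print(round(efficiency_score(group, 250_000), 2))  # 3.01 -> outperforming
```

A group earning 250,000 monthly visits against a prediction of 83,000 would score roughly 3x, i.e., triple what its inputs would suggest.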
The results show that some groups generate 10x, 20x, or even 40x the traffic their inputs would suggest. They also show that some groups have huge room for improvement.
Ultimately, that’s the point of this study: to show policy groups that it’s possible to leverage their investments in publications and PR to generate search traffic—if one group can do it, so can another.
You’ve already done the hard part of creating and promoting great content; now you just need to get to work on the technical and editorial details that help Google recognize high-quality, expert-written content.
Quick Guide to the Numbers
Here's Ahrefs' explanation of how it calculates organic traffic, the output in our analysis:
This number is an estimation of how much organic search traffic your target website, subfolder or URL gets each month.
How it’s calculated:
- We find all the keywords for which your target ranks in the top 100 organic search results.
- We estimate how much traffic your target gets from each of those keywords, based on its ranking position and our estimated CTR for that position.
- We add up all these numbers.
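The three steps Ahrefs describes can be sketched as a short calculation: for each keyword a site ranks for, multiply search volume by an estimated click-through rate for that ranking position, then sum. The CTR curve and keyword data below are invented for illustration; Ahrefs' real CTR estimates are proprietary.

```python
# Sketch of the estimation Ahrefs describes. The CTR-by-position table
# and the keyword rankings are made-up example values.

CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}
DEFAULT_CTR = 0.01  # assumed catch-all for positions 6-100

# (keyword, monthly search volume, ranking position)
rankings = [
    ("carbon tax explained", 12_000, 1),
    ("school choice statistics", 4_000, 3),
    ("minimum wage studies", 9_000, 8),
]

def estimated_traffic(rankings):
    """Sum volume x estimated CTR across all ranking keywords."""
    return sum(volume * CTR_BY_POSITION.get(pos, DEFAULT_CTR)
               for _, volume, pos in rankings)

print(round(estimated_traffic(rankings)))  # 3850 estimated monthly visits
```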
The study takes into consideration these inputs:
- Annual Expenditures: A measure of present resources and a key component of this analysis. Knowing a group's expenditures lets us compare hundred-million-dollar groups to hundred-thousand-dollar groups and grade them on a curve. This data comes from the ProPublica Nonprofit Explorer API, which scrapes expenditure data from a group's most recent 990 disclosure form.
- Backlinks: A count of all the links discovered by Ahrefs pointing to your site from other sites.
- Domain Rating: The overall strength of a website's backlink profile compared to other websites in Ahrefs' database, using a scale of 0 to 100. This takes into account the quality of the domains linking to your site. This means that while two websites could have very similar numbers of backlinks, one may have a much higher domain rating because it has links from higher-quality sites, a wider variety of sites, or both.
- Indexed Pages: The number of pages Google reports to have indexed from your domain name. We use SerpApi to gather this information in bulk.
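The expenditure lookup can be sketched against ProPublica's Nonprofit Explorer API (v2). The organization-by-EIN endpoint pattern below is real, but the response field names (`filings_with_data`, `totfuncexpns`, `tax_prd_yr`) are our assumptions from reading the API, and we parse a canned payload rather than making a live request:

```python
# Sketch: pull a group's latest reported expenditures from the
# ProPublica Nonprofit Explorer API. Response field names are
# assumptions; check the live API before relying on them.

def organization_url(ein):
    """Nonprofit Explorer v2 organization endpoint for a given EIN."""
    return f"https://projects.propublica.org/nonprofits/api/v2/organizations/{ein}.json"

def latest_expenditures(payload):
    """Take the most recent filing's total functional expenses."""
    filings = payload.get("filings_with_data", [])
    if not filings:
        return None
    newest = max(filings, key=lambda f: f.get("tax_prd_yr", 0))
    return newest.get("totfuncexpns")

# Canned example payload in the assumed shape:
payload = {"filings_with_data": [
    {"tax_prd_yr": 2020, "totfuncexpns": 4_200_000},
    {"tax_prd_yr": 2021, "totfuncexpns": 4_800_000},
]}
print(latest_expenditures(payload))  # most recent year's figure: 4800000
```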
We also collect these figures:
- Monthly Traffic Value: The organic traffic value represents the monthly expense that would be incurred if the website's traffic from all keywords it ranks for was paid for through PPC rather than earned organically.
- Referring Domains: Referring domains are websites from which the organization's website has one or more backlinks. For instance, if a site has a backlink from the Washington Post, it would have one referring domain. If another has backlinks from both the Washington Post and Forbes, it would have two referring domains. However, if a site has two backlinks from the Washington Post, it would still only count as one referring domain.
- Keywords: The total count of keywords that a website ranks for in the top 100 organic search results in the Ahrefs database.
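The referring-domains count described above is just the number of unique linking hosts across all backlinks. A minimal sketch (the backlink URLs are made up):

```python
from urllib.parse import urlparse

# Sketch: count referring domains by deduplicating backlink hosts.
# Two Washington Post links count as one referring domain.
backlinks = [
    "https://www.washingtonpost.com/opinions/some-column",
    "https://www.washingtonpost.com/politics/another-story",
    "https://www.forbes.com/sites/some-contributor",
]

def referring_domains(backlinks):
    """Unique hosts across all backlink URLs."""
    return {urlparse(url).netloc for url in backlinks}

print(len(referring_domains(backlinks)))  # 2
```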
The Faults in Our Numbers
We're measuring efficiency based on four inputs and one output. If any of those numbers fluctuate—which they can easily do—scores fluctuate accordingly.
So, if your annual expenditures go way up, your score is likely to drop, unless you had a commensurate increase in search traffic. For example, we've seen some groups experience fundraising and expenditure spikes during the COVID-19 pandemic, but most of that money was distributed in the form of grants. Groups like these would see their efficiency scores drop despite their SEO game being unaffected by this money.
Similarly, if you had a great month for organic search traffic because an issue you've been working on for years is suddenly at the center of the national conversation, your score will climb incredibly high. This wouldn't mean you're better at SEO, only that search demand for your content rose.
These numbers are also inconsistent in their time frames—namely, we use ProPublica expenditure data which is based on the latest 990 form a group has filed. Sometimes those filings are delayed by one, two, or three years. We don't think this is a huge problem for most groups as budgets tend to be similar from one year to the next, but there are notable exceptions.
We also haven't stripped out outliers. These numbers include groups with budgets that range from shoestring to enormous and wildly varying figures on domain rating, backlinks, indexed pages, and organic traffic. We could have stripped these figures from our study, but we're trying to keep our calculations and processes simple as we're generating new figures for this study every month. We also found that removing outliers didn't change the rankings dramatically, though it did affect the order to some small degree.
Grading on a Curve
Generally we've found that think tanks don't invest much time, effort, or money into SEO, so getting a middling or even above-average grade on this ranking might not be reason for celebration.
In other words, doing well on a test graded on a curve is only impressive if your peers are impressive. In this case, your peers mostly haven't studied for the test and in many cases weren't even aware the test was happening.
We'd like to see the nature of this curve change over time as more think tanks realize the tremendous benefits of SEO and up their game.
What does this ranking actually measure?
This study is only a measure of how good a group may be at optimizing their content for search engines. And SEO is only one of many channels for promoting ideas about public policy.
This study can't tell you if your group or another group is effective at its overall mission.
Some of the lowest-rated groups on our list are undoubtedly making huge impacts in the world of public policy. They communicate directly with legislators, consult with regulators, hold conferences, debate with academics, hold politicians accountable, and raise awareness of policy problems through a wide variety of channels.
Couldn't these differences be explained by the popularity of the subject matter?
In a word: yes.
In more than a word: sure, but the groups on this list have generally neglected SEO so much that there is plenty of headroom to grow organic search traffic before truly hitting a demand ceiling.
That said, it may be the case that some groups are just too niche to ever become a top-tier group on this ranking. We still think this ranking is useful, however, because niche groups can still compare themselves to their peers in the same or similar niche and benchmark their performance accordingly.
Why should SEO matter to policy groups?
SEO is worthwhile because it takes relatively little effort to make the assets you already have produce organic search traffic. For a fraction of what most groups spend to create and promote content, they could optimize their content to be found by tens or hundreds of thousands of people per month.
Further, SEO forms a positive feedback loop with a policy organization's PR efforts. The more a piece of content is promoted to high-quality outlets that might link to it, the more likely it is to rank highly in search results. The higher content ranks in search results, the more likely it is to be discovered by yet more high-quality publications that are researching a topic via Google or another search engine.
SEO also reinforces your brand. When you Google "millennial attitudes toward marriage" you see a Pew Research Center study. A search for "2023 tax brackets" delivers results from the Tax Foundation—ranking higher than the IRS's own webpage on the same issue. The Federalist Society stands alongside sites like LexisNexis or Cornell Law School in searches for "kelo v. city of new london" or "debs v. united states."
Search helps to associate these brands with their areas of expertise—demography, tax policy, and the law, respectively. Appearing at the top of search results declares you to be the official, high-quality, authoritative source for whatever the searcher is seeking.
What kind of regression specifically are you running?
Our calculations are run automatically in Google Sheets using the LINEST function. The regression equation produced by LINEST for multiple regression in Google Sheets uses the "least squares" method in the form:
y = b1x1 + b2x2 + b3x3 + b4x4 + c
- y is the predicted value of the dependent variable,
- b1, b2, ..., bn are the coefficients representing the estimated values of the regression equation for each independent variable x1, x2, ..., xn,
- c is the y-intercept of the regression equation, which represents the predicted value of the dependent variable when all the independent variables are zero.
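The same least-squares fit that LINEST performs can be reproduced outside Sheets. Here's a sketch with NumPy: the data is synthetic (a known linear relationship with no noise, so the fit recovers the coefficients exactly), and only the mechanics match our Sheets setup.

```python
import numpy as np

# Sketch: replicate Sheets' LINEST (multiple linear regression by least
# squares). Columns stand in for expenditures, backlinks, domain rating,
# and indexed pages; the data is randomly generated for illustration.
rng = np.random.default_rng(0)
X = rng.random((50, 4))            # 50 fake groups, 4 inputs
true_b = np.array([2.0, 1.0, 3.0, 0.5])
y = X @ true_b + 4.0               # fake traffic with intercept c = 4

# Append a column of ones so lstsq also estimates the intercept c.
A = np.column_stack([X, np.ones(len(X))])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
b1, b2, b3, b4, c = coefs
print(np.round(coefs, 3))          # recovers [2, 1, 3, 0.5, 4]
```

With real, noisy data the recovered coefficients would only approximate the true relationship, which is what the R-squared values above quantify.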
How did you arrive at this list?

We started with a broad list of public policy groups and then excluded the following:
- Defunct or inactive groups
- Groups without an accessible website
- Groups with very low traffic: we define this as national groups with fewer than 300 organic visits per month or state groups with fewer than 30 organic visits per month.
- Non-U.S. think tanks
- Groups without an available EIN: this excluded many university centers
- Groups with less than $100k in annual expenditures
- Groups without a separate domain (no subdomains or subdirectories)
- State or local affiliate groups that used the same root domain
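Applied in code, the machine-checkable rules above amount to a filter like this sketch. The traffic and expenditure thresholds come from the list; the record fields and sample groups are our own invention, not the study's actual schema.

```python
# Sketch: apply the study's exclusion thresholds to candidate groups.
# Record field names are illustrative, not the study's real schema.
MIN_TRAFFIC = {"national": 300, "state": 30}
MIN_EXPENDITURES = 100_000

def eligible(group):
    """True only if the group passes every exclusion rule."""
    return (group.get("active", False)
            and group.get("has_website", False)
            and group.get("ein") is not None      # needed for ProPublica data
            and group.get("own_domain", False)    # no subdomains/subdirectories
            and group["monthly_traffic"] >= MIN_TRAFFIC[group["scope"]]
            and group["expenditures"] >= MIN_EXPENDITURES)

groups = [
    {"active": True, "has_website": True, "ein": "12-3456789",
     "own_domain": True, "scope": "national",
     "monthly_traffic": 5_000, "expenditures": 2_000_000},
    {"active": True, "has_website": True, "ein": None,  # e.g. university center
     "own_domain": False, "scope": "state",
     "monthly_traffic": 20, "expenditures": 50_000},
]
print([eligible(g) for g in groups])  # [True, False]
```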
These exclusions allow for more meaningful comparisons. To control for huge disparities in budgets, we needed expenditure numbers. That’s why non-U.S. groups and many university-based groups were excluded. We used only groups whose financial info we could easily query.
Groups without a distinct top-level domain were also excluded because there's no way to separate their performance from that of their host organization. So, our benchmarks include university groups like GMU’s Mercatus Center (mercatus.org) but not Michigan’s William Davidson Institute (wdi.umich.edu), despite it having public budget info.