If I write a blog post on any topic, what do you think happens?
It typically gets indexed by Google the same day I publish the content and within a week it tends to rank high on Google.
Then again, I have a domain score of 94 and I have 633,791 backlinks. Just look at the image above. (If you are curious what your link count or domain score is, put in your URL here.)
But if you have a lot fewer backlinks and a much lower domain score, what do you think would happen?
Chances are your content won’t get indexed fast and it won’t rank as high as you want.
But there has to be a way to change this, right? Especially without building more backlinks because we all know that’s time-consuming and hard.
To find the most ideal solution, I decided to run a little experiment.
Around five months ago, I sent out an email to a portion of my mailing list asking people if they wanted to partake in an SEO experiment.
As you can imagine, well over a thousand site owners were willing to participate. I had to narrow down the list because, for this experiment to be effective, a website had to have a domain score of 30 or less and no more than 40 backlinks.
That way it’s at least a challenge to figure out how to rank new content higher.
In addition to that, the site couldn’t be a subdomain, such as domain.wordpress.com. It had to be a standalone site.
Once I removed all of the outliers, I was left with 983 people who agreed to participate in the experiment. Of those, 347 stopped replying or backed out of the experiment due to time commitments, which means I was left with 636.
How did the SEO experiment work?
For all of the sites, we had them write a piece of content. We didn’t make it a requirement that the content had to be about any specific topic or that it had to be written a certain way… we just had them write one piece of content that was between 1,800 and 2,000 words in length.
We enforced the minimum and maximum length because the post needed to be long enough to naturally include keywords, but not so long that length alone would skew the results… a 10,000-word post, for example, would have a higher chance of ranking on Google regardless of the tactic being tested.
Each site had 30 days to write the piece of content and publish it on their site. Within 30 days of the content being published, we looked up the URL in our Ubersuggest database to see how many keywords the post ranks for in the top 100, top 50, and top 10 spots.
We also repeated this search 60 days after the article was published to see if there were any major differences.
The Ubersuggest database currently contains information on 1,459,103,429 keywords from around the world in all languages (many keywords have low search volume, such as 10 searches per month). But for this experiment, we focused on English-language sites.
We then split the sites into 9 groups of roughly 70 sites each. Each group leveraged only 1 tactic so we could see whether it helped with rankings.
Here’s a breakdown of each group.
- Control group – this group just published the article and didn’t leverage any promotional or SEO tactics. Having a control group allows us to compare how specific tactics affect rankings.
- Sitemap – all this group leveraged was a sitemap. They added the article to their sitemap, and we made sure the sitemap was submitted to Google Search Console.
- Internal linking – this group added 3 internal links from older pieces of content to the newly written article.
- URL Inspection – within Google Search Console, you can request that Google crawl and index a URL. That feature is called URL Inspection.
- Social shares – Facebook, Twitter, LinkedIn, Pinterest and Reddit were the social sites that this group submitted and promoted their content on.
- Google Chrome lookup – for each site in this group, we had 40 people type the URL directly into their address bar and look up the site, on either the mobile or desktop version of Chrome. I added this group because I was curious to see whether people visiting your site from the Chrome browser affects your rankings.
- Meta tags – my team optimized the title tag and meta description for everyone in this group. Based on the article, we crafted the optimal meta tags to not only include keywords but also to entice clicks.
- URL – with this group, we only optimized each article's URL to include keywords, and we kept the length around 50 characters, as that is the length Google supposedly prefers.
- Everything – this group combined all of the tactics above (the control group, of course, still did nothing).
Before I dive into the data, keep in mind that if someone was in one of the groups, we did our best to make sure they weren't leveraging any of the other tactics. For example, everyone who wasn't in the sitemap group (other than the everything group) had their existing sitemaps removed from Google Search Console.
So how many keywords does an average website with a domain score of 30 or less rank for in Google within a month and even two months?
I was shocked at how many keywords a site could rank for when it barely has any links and a low domain score.
But what wasn’t as shocking is how a web page’s ranking can increase over time. The orange line shows the number of keywords that ranked within the first 30 days and the green line shows the number over the first 60 days.
Sitemap group

You know how people say you need an XML sitemap? Well, it's even more important if you have a low domain score. At least, that is what the data shows.
When your site has very few links and a low domain score, you’ll find that Google may not crawl your site as often as you want. But by leveraging a sitemap, you can speed up the indexing process, which helps decrease the time it takes for your site to start ranking for keywords.
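If your CMS doesn't generate one for you, a bare-bones XML sitemap is just a list of your URLs in the format defined by the sitemaps.org protocol (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/your-new-article/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
</urlset>
```

Upload it to your site's root (e.g. example.com/sitemap.xml) and submit that URL under the Sitemaps report in Google Search Console.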
Internal linking group
Links, links, and more links… it's what every site needs to rank well. Technically, that means external links, but internal links are better than nothing.
When you add internal links from your old content to your newer articles, it helps them get indexed faster and it helps push them up in the rankings.
Especially when these internal links come from relevant pages that have some decent rankings on Google.
Articles that leveraged 3 internal links had more page 1 rankings than sites that just used an XML sitemap.
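In practice, an internal link is just a regular anchor tag in an older post pointing at the new article, ideally with descriptive, keyword-rich anchor text. For example (the URLs here are hypothetical):

```html
<!-- Inside an older, already-ranking post -->
<p>
  We covered this in more depth in our guide to
  <a href="/blog/how-to-optimize-meta-tags/">optimizing meta tags</a>.
</p>
```

Repeat this from three older, topically related pages to match what this group did.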
URL inspection group
If you aren’t familiar with the URL Inspection feature within Google Search Console, it’s a quick way to get your content indexed.
Just log into Search Console and type in your article URL in the search bar at the top. You’ll see a screen that looks something like this:
All you have to do is click the “request indexing” link.
Leveraging this feature has a similar result to using the sitemap.
Social shares group
I’ve noticed a trend with my own website: if I create a piece of content that goes viral on the social web, my rankings for that new piece of content skyrocket to the top of Google… at least in the very short run.
And after a few weeks, I notice that my rankings drop.
Now, my site alone isn’t a large enough sample size, and there are many reasons why my own content ranks well quickly.
Nonetheless, it was interesting to see how much social shares impact rankings.
The social shares group performed substantially better than the control group, but, similar to my experience with NeilPatel.com, the rankings slipped a bit in month 2 instead of continuing to rise toward the top.
Social shares may not have a direct impact on rankings, but the more people who see your content the higher the chance you build backlinks, increase your brand queries, and build brand loyalty.
Google Chrome lookup group
You know how people say that Google uses data from Google Analytics and Chrome to determine how high your site should rank?
Well, I wasn’t able to prove that from this experiment.
I had 40 random people directly type in the URL of each new article into Google Chrome. I spread it out over a week, making sure they clicked around on the site and stayed for at least 2 minutes.
The ranking results were very similar to the control group.
Meta tags group
Now, this group performed very similarly to the group that leveraged internal linking, and its month-2 results outperformed all of the other groups.
User metrics are a key part of Google’s algorithm. If you can create a compelling title tag and meta description, you’ll see a boost in your click-through rate and eventually, your rankings will climb.
If you want to boost your rankings through your meta tags, it’s not just about adding in the right keywords, you’ll also want to boost your click-through rate. Follow these steps to do just that.
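For reference, the title tag and meta description live in the page's head. They look something like this (the copy below is a made-up example, not a template my team used):

```html
<head>
  <title>Meta Tags for SEO: 10 Tips to Boost Click-Through Rate</title>
  <meta name="description"
        content="Learn how to write title tags and meta descriptions that include your keywords and entice searchers to click.">
</head>
```

As a rule of thumb, keep the title around 60 characters and the description around 155, since Google tends to truncate anything longer in the search results.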
URL group

The 8th group tested whether URL length impacts how high a new piece of content ranks on Google.
Based on the graph above, you can see that it does. It didn’t have as much of an impact as internal linking or meta tags, but it did have an impact.
The key to creating SEO friendly URLs is to include a keyword or two and keep them short.
If your URL is too long and descriptive, the article will rank for very long-tail phrases but will struggle to rank for more popular terms like “meta tags.” The beautiful part about short, keyword-focused URLs is that they rank well for both head terms and long-tail phrases.
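If you want to automate this, here's a minimal sketch in Python (my own quick illustration, not a tool we used in the experiment) that turns an article title into a short, keyword-focused slug capped at roughly 50 characters:

```python
import re

def slugify(title, max_length=50):
    """Turn an article title into a short, lowercase, hyphenated URL slug."""
    # Lowercase, strip punctuation, keep letters, digits, spaces, and hyphens.
    slug = re.sub(r"[^a-z0-9\s-]", "", title.lower())
    # Collapse whitespace and hyphen runs into single hyphens.
    slug = re.sub(r"[\s-]+", "-", slug).strip("-")
    # Trim to the length budget without cutting a word in half.
    if len(slug) > max_length:
        slug = slug[:max_length].rsplit("-", 1)[0]
    return slug

print(slugify("How to Optimize Your Meta Tags for Higher Click-Through Rates"))
```

You'd still want to hand-edit the result so the slug keeps your main keyword rather than whatever happens to come first in the title.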
The charts clearly show that little things like meta tags, URLs, internal linking, social shares, and even sitemaps help.
But the key to doing well, especially if you want your new content to rank well, is not to do just one of those things, but to do them all.
As you can see from the chart, doing everything gives you the best results. Now sure, some of the things are redundant like using an XML sitemap and using the URL inspection feature, but you get the point.
You’ll also notice that when you leverage everything together your results aren’t exponentially better… SEO is competitive and has turned into a game where every little thing adds up.
If you want to do well and have your new AND old content rank faster and higher, you need to do everything.
I know the tactics above aren’t anything revolutionary or new, but it’s interesting to look at the data and see how specific tactics affect rankings.
So, what do you think?