Although there isn’t an official list of Google ranking factors, we do know a lot about the signals that Google use across their index to rank sites.
This information comes from a combination of official sources, case studies, experience and my own personal experimentation.
While it is impossible to know if the list of Google ranking factors below is complete, I can say with great confidence that the vast majority of them are covered.
It is also important to note that not all of the below are strictly ranking signals – some of them are indexing/crawling signals which in turn are part of the overall search process.
You should also be aware that Google rarely looks at just 1 ranking signal and tends to ‘stack signals’ to build a bigger picture.
I have broken down the Google ranking factors into their respective categories and whether they are a positive or negative factor:
Don’t forget to check out my detailed video about the 3 new Google keyword ranking factors.
The Complete List Of Google Ranking Factors
This is a list of domain level Google ranking factors that can affect your rankings positively or negatively.
Exact Match Domain
Exact match domains, or EMDs as they are known, used to rank very easily. However, Google cracked down on this with the EMD update.
You still get a small amount of benefit from an exact match domain but now you have an extra quality layer (patent) to please with your site thanks to the EMD update.
Keywords In Domain
Having your target keyword in the domain used to give a huge boost. However it has been confirmed that it isn’t as important as it once was.
It is also worth noting that target keywords are still bolded in the domains of search results which helps them stand out more and attract a higher CTR, however you do not need the keyword in the domain to rank.
Keyword In Subdomain
Having your keyword as part of your subdomain can also help give you a minor rankings boost. However, Google do treat subdomains as an entirely different web property.
This was seen when HubPages recovered using sub domains.
The history of your domain also has an impact on how your site performs.
If the domain has been in trouble in Google in the past through link spam or bad neighbourhoods it will be harder to rank.
Take a look at what Google check for.
The belief that older domains rank better was confirmed by Matt Cutts. However it does not carry that much weight in the algorithm.
Domain Registration Date
In a patent filing, Google said:
The date that a domain with which a document is registered may be used as an indication of the inception date of the document.
Domain Renewal Date
In the same patent, Google also said:
Certain signals may be used to distinguish between illegitimate and legitimate domains. Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain and, thus, the documents associated therewith.
Country TLD Extension
Having a ccTLD like a .es domain does help with geo-targeting indication, but at the same time it doesn’t mean the domain is inherently easier to rank in its target country. You could rank a .com in the Spanish Google just as easily.
Private Whois Data
Matt Cutts suggested that private Whois data can be combined with other factors as a negative signal.
Having private Whois data on its own isn’t a problem – but it is used as part of the bigger picture.
Penalized Whois Owner
I cannot find any official Google reference for this, but I have seen a couple of examples where it seems the Whois owner is penalised.
Granted, the examples I have seen were extreme examples of abuse – I’m thinking of 2 people in particular where Google wiped out every single web property they owned, whether it was abused or not.
However this is far from confirmed.
Country TLD Extension
A ccTLD will count as a negative signal if you have a Spanish domain but are trying to rank it in the Russian market.
However the domain extension on its own isn’t enough, because if you put Russian content on a Spanish domain it will still rank in Russia.
So while this is a signal it is a very minor one that is used in conjunction with others.
Exact Match Domain
Having an exact match domain does give you a bit of extra credit – however the new risk here is that you trip the EMD filter.
This filter is specifically looking at exact match domains and is a filter you wouldn’t have to go through if you didn’t have an EMD.
So by choosing to use an EMD you are choosing to go through an extra layer of the algorithm.
If you have a parked domain, be aware that Google are actively removing parked domains from their index after the parked domains update.
A list of page level factors that can affect your rankings positively or negatively.
URL Contains Keyword
Having the target keyword in your URL will help improve relevancy and boost rankings.
Take a look at this case study to see just how important that is.
Title Tag Contains Keyword
The title tag is a very strong signal for Google & having your target keyword in the title tag will help your rankings.
Having your keyword closer to the start of your title tag will also give you a minor boost.
However this is less important than it once was – just having your keyword in the title is enough to feel the benefits.
Meta Description Contains Keyword
Having your keyword in your meta description is also important.
You should also use the meta description to include other related words & terms.
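As a rough sketch, a meta description like this works a target keyword in naturally alongside related terms (the keyword ‘link building’ and the wording here are just placeholders):

```html
<!-- Hypothetical example: target keyword "link building" plus related terms -->
<meta name="description" content="A practical guide to link building, covering outreach, guest posts, anchor text and how to earn authoritative backlinks.">
```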
H1 Tag Contains Keyword
The H1 tag is a strong signal for Google and should include your target keyword.
Other Headings Contain Keyword
Having your keyword present in other headings such as your H2 or H3 tags will also help. However it is much better to use these headers to include LSI keywords and avoid over optimising.
Main Body Content Contains Keyword
Mentioning your target keyword a couple of times in the main body of your content will also help improve relevancy.
The order of your keywords also has a small impact. For example someone searching for ‘download antivirus software’ will see different results than ‘antivirus software download’ even though the intent is the same.
Latent Semantic Indexing keywords help search engines work out if you are talking about Orange the colour, Orange the fruit or Orange the mobile phone network.
Including LSI keywords in your title, description and main content helps Google index & understand the topic of your page better.
Faster websites provide a better user experience, increase engagement & convert more.
Having unique content across your entire site is a strong quality signal. Make sure content is original and not duplicated.
Whether it is a blog post or a product description – make sure your content is unique.
Length Of Content
Longer content ranks better & converts better period.
The average content length for sites in the top 10 is at least 2,000 words. Longer content attracts more engagement, links and social signals. Oh and it converts more.
Schema / Rich Snippet Markup
Integrating all relevant Schema and Rich Snippet markup (with something like Author hReview) into your page helps you spoon feed data to Google, which is then shown in the SERPS and in turn earns a higher click through rate.
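For illustration, here is a minimal review snippet using schema.org’s JSON-LD format – the product name and rating values are made up, and plugins like Author hReview generate similar markup for you:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Product" },
  "author": { "@type": "Person", "name": "Matthew Woodward" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4.5", "bestRating": "5" }
}
</script>
```

With markup like this in place the star rating can then show directly in the search results.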
Having unique images and videos to support your main content is a quality signal.
Google will also look at your usage of images so make sure that you use your keyword in the file name and alt text. Don’t over optimise by including your keyword in the file name, alt text, title tag and description tag though.
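For example, a single image optimised without stuffing might look like this (the file name and keyword are placeholders):

```html
<!-- Descriptive file name and alt text; the keyword appears once, not in every attribute -->
<img src="link-building-checklist.jpg" alt="Link building checklist">
```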
You should also be looking to optimise your EXIF data according to Matt Cutts.
Google have recently updated their image capability and can now recognise 2 pizzas on top of an oven without any help from file names or alt text.
Way back in 2010 Google released the Caffeine update which was designed to return fresher and more up to date results.
This was a significant shift in how Google indexes the web and favours fresher content.
Updating older content will also see a positive impact. You need to do a little more than just changing the date though. Google is looking for significant updates to content in order to label them as ‘fresh’.
Spelling & Grammar
Back in 2011 Matt Cutts categorically stated that spelling & grammar don’t matter.
However he has since made a new video where he advises that it does matter in your main content.
Outbound links can also affect your overall rankings. When you link to another site you are in essence ‘voting’ for it and associating your website with it as well.
It is important to make sure anything you link to is credible and relevant to your site. Matt Cutts has some advice about this.
Just like external links affect your rankings so do internal links as well. Specifically the number of internal links and quality of those pointing to a given page on your site.
It has been shown just how important internal links are to ranking.
Syndicating content is fine, but make sure you are linking back to the original source.
Having supplementary content is a strong quality signal. The 2014 version of the Google Quality Rater Guidelines make this very clear.
Google provide the example that a recipe page might have a feature to multiply or divide the recipe based on how many people you are serving.
Google measure the reading level of pages and label them as either basic, intermediate or advanced.
You can access this by doing a search, then clicking on search tools > all results > reading level.
As you can see my site is ‘basic’ but it still ranks very well. It doesn’t appear that Google are using this as an active signal, but they certainly have the data.
W3C code validation is not a ranking factor period.
The overall authority of the domain also has an impact on how your page ranks. Assuming everything is equal the page on the more authoritative domain will always rank higher.
This is something I see with my own authority sites where they have what I call ‘page 1 pull’ – I can publish an article and have it appear on page 1 for its target keywords with no additional effort.
Duplicate Meta Descriptions
Matt Cutts answered the question of whether it was important for every page to have a meta description.
He said it isn’t necessary, however you should either use unique meta descriptions or no meta descriptions at all. Whatever you do – don’t use duplicate meta descriptions.
Duplicate Title Tags
Just like with Meta Descriptions you should ensure every page has a unique Title tag and there are no duplicate title tags across your site.
Underscores In Title Tags
Matt Cutts specifically said not to use underscores as separators in your title tags.
Instead you should use commas, pipes or dashes to separate your titles.
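Putting that together, a well separated title tag looks something like this (the keyword is a placeholder):

```html
<!-- Avoid: <title>Link_Building_Services</title> -->
<title>Link Building Services - The Perfect Strategy</title>
```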
Keyword Stuffed Meta Tags
Having a keyword stuffed title tag or meta description will negatively affect you. For example if your title tag is ‘Link Building | Link Building Services | Link Building Strategy’ that is keyword stuffed.
Instead you should have something more natural like ‘Link Building Services – The Perfect Strategy’
While not strictly a negative factor, the meta keywords tag was originally used to help software with indexing.
Google have never used the meta keywords tag as a ranking signal since they launched in 1998. They do read the tags – but they are not a ranking factor.
Way back when, we used to build sites to have the perfect keyword density. It seemed that around 3% was the ‘sweet spot’ – however since then Google has got much better at processing language and understanding the topics of webpages.
When optimising for keyword density it is very easy to over optimise and get caught out. So ignore keyword density and just make sure your target keyword is mentioned in the title tag, meta description, H1 tag and once or twice in the main content.
Slow Load Times
Just like having a fast site serves as a positive factor having a slow loading site will act as a negative factor assuming all other things are equal.
Having a lot of duplicate content on your site will hurt you. Matt Cutts has said that a little duplicate content isn’t so bad – for example if you are quoting another page.
But if your site features a lot of duplicate content throughout it will be penalised. Learn more about how Google handles duplicate content.
Hiding content specifically to manipulate search, increase word count or increase the number of keywords on a page can get you penalised. Especially if you use the classic div display none trick.
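For the avoidance of doubt, this is the sort of markup that trick refers to – keyword filled text hidden with CSS – and it is exactly what you should not do:

```html
<!-- Don't do this: text hidden purely to inflate keyword counts -->
<div style="display: none;">cheap links, buy links, link building services</div>
```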
Irrelevant Image Alt Tags
For a long time we have been using ALT tags to tell Google what our images are about. Typically we use the image alt tag to include a relevant keyword.
However Google can understand images now so if you have a picture of a Zebra with the alt text best iPhone deals you are going to have a hard time.
If you are not careful you could end up with a site wide penalty.
If you have never read the linked article it is worth reading. Google do hand out penalties for outbound links, even if it’s just 1 bad outbound link across your entire site.
Also, having too many outbound links can hurt your site’s ranking. The quality rater document clearly states “Some pages have way, way too many links, obscuring the page and distracting from the Main Content”.
Having too many broken links is a sign of a low quality site according to Google’s rater guidelines. However you should not be worried about having the odd broken link here and there.
But broken links are quick to fix so if you have them, fix them.
Too Many Affiliate Links
Google and affiliates historically do not get on. For the most part affiliates are a pain for Google as they contribute a huge amount of spam and low quality sites to the web.
Having affiliate links isn’t a problem, but if you’re spamming affiliate links throughout your content you are going to run into trouble. Either way you should be nofollowing all of your affiliate links.
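Nofollowing an affiliate link is just a matter of adding the rel attribute (the URL below is made up):

```html
<a href="https://example.com/product?aff=123" rel="nofollow">Check the price here</a>
```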
While W3C code validation is not a ranking factor, neither are HTML errors – unless those errors interfere with how Google spiders and indexes the page.
Stay on top of HTML errors and fix them as they are reported in Google Webmaster Tools.
An excessively long URL can have a negative impact on your rankings.
A list of site level Google rankings factors that can affect your rankings positively or negatively.
Having a good standing history of trust with Google has a huge influence. Trusted sites can get away with more and enjoy higher rankings across the board.
Trust is measured across a range of signals including links from highly trusted sites.
Google’s Search Rater guidelines say that sites should have easily accessible contact information to help build trust.
This could introduce the possibility of duplicate content, but Google say it’s not a problem.
About Us Page
Similarly, having an About Us page that explains who is behind the site helps build trust with both users and Google’s quality raters.
Organizing your site into a silo structure is a positive signal for your site. Many people have a flat structure e.g. domain.com/post-name, but it’s much better to have a silo structure.
This would look something like domain.com/seo/ranking-factors or domain.com/social-media/facebook-marketing for example.
Just like freshness has an impact on the page level, it also has an impact on the domain level.
Make sure you are adding content to your site regularly. I have found in my own tests that adding a new 2000 word article every day to a site builds great trust with Google.
Number Of Pages
The number of pages a site has is not a ranking signal on its own.
However having more indexed pages does mean you have more chances to rank for different keywords and are more likely to attract incoming links which does have an impact.
Having an XML sitemap will help Google spider your site easily.
This doesn’t guarantee that Google is going to index all of the pages in the sitemap, but it does help them get around your site.
However not having a sitemap isn’t a negative factor assuming that your site structure is correct and Google can spider the site properly.
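A minimal XML sitemap is only a few lines – something along these lines, where the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/seo/ranking-factors</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

Once it’s live you can submit it through Google Webmaster Tools or reference it from your robots.txt file.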
Why stop at just an XML sitemap when you can have a HTML sitemap as well?
Although this is an old practice that is rarely seen now, it can still help both your users and Google’s bots.
The geographic location of your server/hosting plays a part in deciding which countries your site will show up in.
Google are officially using HTTPS as a ranking signal.
Although at the moment it is a very minor signal and certainly not strong enough for me to move this blog over to HTTPS any time soon.
Breadcrumb Rich Snippets
Having breadcrumb navigation on your site will help with the overall architecture of the site.
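One way to mark breadcrumbs up so they can appear in the SERPS is schema.org’s BreadcrumbList in JSON-LD – the URLs below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "SEO", "item": "https://example.com/seo/" },
    { "@type": "ListItem", "position": 2, "name": "Ranking Factors", "item": "https://example.com/seo/ranking-factors" }
  ]
}
</script>
```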
Making sure your site has a responsive design and displays well on mobile devices will ensure it performs well in the mobile world.
A day or two of downtime will not inherently hurt your rankings.
However Google will remove your site from the index entirely if it is unavailable for around a week or more. On the bright side, once your site is back up – Google will reinclude it in the index.
Duplicate Meta Content
We discussed before how important it is to have a unique title and meta description for each page.
Having duplicate meta information across your site can lead to less visibility in the search engines.
Not Optimized For Mobile
While not having a responsive design won’t hurt your rankings in desktop search, having a mobile friendly design is very important to showing up in mobile search.
For example this blog performs very badly in mobile search and fixing that is high on my list of priorities.
Poor User Engagement
Google know precisely how many pages people visit on your site and how much time they spend on each page. They know if the user bounces or keeps reading.
They know this regardless of whether you have Google Analytics installed or not.
You might wonder how they know exactly which webpages every single person on the internet visits – but I’m going to save that revelation for later this week.
Sites with poor user engagement signals such as high bounce rate and low time on site do drop in the rankings.
If your site has a bad reputation on sites like Yelp.com, RipOffReport.com or Google Places you will suffer in the long term for that.
This was addressed after someone came up with the ingenious way to earn links by treating customers badly.
Adverts Above The Fold
If your site is ‘top heavy’ with adverts above the fold that distract from the main content of your site you will suffer.
PageRank sculpting is the practice of nofollowing all outbound links and nofollowing specific internal links in order to control the flow of PageRank. Abusing this will get you in trouble.
Blocking Access To CSS/JS
Blocking Googlebot’s access to your site’s CSS or JS files will directly harm your site’s performance in search results.
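To illustrate, a robots.txt along these lines keeps CSS and JS crawlable – the commented-out disallow rules show the kind of directives that cause the problem in the first place (the paths are just examples):

```
# Bad: rules like these stop Google rendering your pages properly
# Disallow: /wp-content/themes/
# Disallow: /wp-includes/js/

# Instead, explicitly allow CSS & JS
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```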
Google specifically targets low quality sites with the Panda update.
If your site is penalised you will see huge drops in search visibility across every single page on your site.
A list of backlink related factors that can affect your rankings positively or negatively.
The anchor text of the backlink helps Google understand the topic of the linked page. It is not as relevant as it once was and you should be mixing generic and branded anchors to steer clear of Penguin.
The title of a link also helps indicate the topic of a page although it is a much weaker signal than anchor text.
Domain & Page Relevancy
A link from a domain and/or page that is relevant is much more valuable than a link coming from an irrelevant site. For example you wouldn’t want a link from this blog if you sold garden furniture online.
The text around a link also helps Google work out the relevancy and whether or not the link is a positive or negative citation.
Keyword In Title
Links coming from pages that have either the same or tightly related keywords in their title are much more valuable than those that don’t.
Getting backlinks from domains with high levels of authority & trust are a significant ranking factor.
The overall authority of the page that you are getting a link from also plays a big role. A link from an authoritative page from an authoritative domain is the holy grail.
Number Of Links
The overall number of domains that link to your site is also very important. Sites with links from more root domains have higher rankings than sites that do not.
Age Of Backlink
Google have a patent that shows the age of a backlink is relevant. Older backlinks are more powerful than newly created ones.
However links from different domains carry the most weight when those domains sit on separate class-C IP address ranges. You want to have links from lots of different domains on lots of different IP ranges for maximum effect.
Make sure you have backlinks coming in from a range of sources. Relying on just 1 ‘link type’ such as blog comments for example stands out and is not natural.
You want a diverse mix of link types coming from relevant and authoritative sources.
Getting links from pages that are currently in the top 10 for your target keyword will boost your own sites rankings.
A link from a page that has a high number of social shares is worth more than a link from a page that does not.
Matt Cutts discussed guest blogging in detail and you will get penalised for spammy guest blogging for links.
Links coming from the homepage of a site carry more weight than links that come from inner pages that are found further away from the homepage in structure.
Links coming from the main content of a page are known as contextual or editorial links.
These links carry the most weight compared to any other link. For example an editorial link is an awful lot more powerful than a footer link from the same page.
User Generated Content Links
Google knows the difference between user generated content and content that is published by the actual site. Links coming from the actual site are more powerful than links coming from user generated content on the same site – a forum for example.
A link that reaches your site via a 301 redirect is just as powerful as a normal link.
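If you need to set one up yourself, a 301 on an Apache server is a one liner in .htaccess – the paths here are placeholders and this assumes Apache with mod_alias enabled:

```apache
# Permanently redirect the old URL to the new one
Redirect 301 /old-page/ https://example.com/new-page/
```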
Although links from Wikipedia are Nofollow, Wikipedia backlinks are highly authoritative in Google’s eyes. If you get your site listed as a source of information you will also receive targeted traffic.
Positive Link Velocity
Link velocity is a measurement of how many links you gain over time. It is important to maintain a positive link velocity.
Matt Cutts has confirmed that incoming NoFollow links do not hurt your site’s rankings.
However if this is massively abused by people spamming blogs with thousands of comments, then those links can be used to dish out a penalty.
Word Count Of Page
A link from a page that has a 2,000 word count is worth more than a link from a page with only 100 words.
Number Of Outbound Links
A link from a page that has hundreds of links to other sites is worth a lot less than a link from a page that only has a few links to other sites.
A link from the sidebar of a site that is present on every page of the site is treated as a single link.
Having links from sites that have been identified as a ‘bad neighbourhood’ by Google will hurt your site.
For example if Google has uncovered a blog network and your site has a lot of links from that network – you are in trouble.
Having lots of links from IP addresses in the same C class is not natural and is usually an indication of manipulation.
Guest posting was a great way to get backlinks a couple of years back, however spammy guest posting will get you penalised.
So if you’re throwing together poor quality articles to guest post purely for the sake of links, make sure you are nofollowing them – Google are actively cracking down on this activity.
You will be penalised if you get caught out buying or sponsoring links. If you do buy links make sure they have the NoFollow tag applied to mitigate any risk.
Although if I put my Black Hat on paid links like those from the Jacob Hagberg SEO service are VERY effective – until you get caught.
Just like buying links can get you in trouble, so can selling links. There have been numerous examples of well known sites selling links and being penalised accordingly. Again, that only happens if you get caught.
Negative Link Velocity
Link velocity is a measurement of how many links you gain or lose over time. A negative link velocity will have a negative effect on your visibility in search.
Reciprocal linking used to be very effective but an update many years ago killed that. It is now seen as a link scheme and should be avoided.
Links from forum profiles used to be highly effective, however this will now get you penalised if systematically abused.
Sites with a high percentage of links coming from unrelated or irrelevant sites rank lower than sites with a high percentage of relevant links.
The Google Penguin penalty tends to focus on your backlink profile and can affect you on a page and domain level.
A list of user engagement factors that can affect your rankings positively or negatively.
Page Level SERP Click Through Rate
Pages that get a higher click through rate in search results get a boost in rankings. Mark my words, this is one of the most effective ways to rank today. Given the choice between the perfect backlink profile and a high SERP CTR – I would choose the CTR every day.
Domain Level SERP Click Through Rate
Domains that attract high click through rates across all of their pages will be rewarded with higher visibility across the site.
Dwell Time / Pogo Sticking
Google measures if people stay on your page after visiting it from a Google search. If the user spends a long time away from the results this is a positive signal.
Low Bounce Rate
Google knows exactly which pages you visit and how much time you spend on them regardless of which browser you use or if your site has Google Analytics.
Sites with a low bounce rate are rewarded with higher visibility in the SERPS.
Pages Per Visit
Sites where users stay for longer periods of time and/or visit multiple pages are ranked higher than sites that do not engage users in this way.
A site that gets a lot of direct traffic is deemed to be of higher quality than a site that doesn’t get much direct traffic.
Sites that have a high percentage of returning visitors are deemed to be of higher quality than sites that don’t get many return visitors.
Pages that attract user engagement in the form of comments are a clear signal of quality and user interaction.
Page Level SERP Click Through Rate
Pages that get a low click through rate in search results will get dropped from the first page regardless of other factors.
Domain Level SERP Click Through Rate
Domains that have a low click through rates across all of their pages will have much less visibility across search results.
Dwell Time / Pogo Sticking
Google measures if people stay on your page after visiting it from a Google search. If the user visits your site and quickly returns to the results, this is a clear sign to Google that your page is of low quality.
High Bounce Rate
Sites with a high bounce rate will see a negative effect on their rankings across the board.
A list of social media related factors that can affect your rankings positively or negatively.
It is generally thought that Google do not have access to Facebook’s data, but that is not true.
The number of times your page/domain has been shared on Facebook has an impact on your rankings and is the most valuable Facebook signal.
Like Facebook Shares, the number of comments a given URL has received on Facebook has a positive impact on rankings although less important than shares.
Facebook Likes of a URL/Domain also has positive traction in the SERPS although this Facebook signal is the weakest of the 3.
Having your URL pinned and re-pinned on Pinterest is a strong social signal that has a positive impact on rankings.
Although Google have said that Google +1’s do not affect rankings I have found the opposite to be true.
The other benefit of Google +1’s is if you are signed into your Google account when searching this will be used to personalise your results.
Relevancy of your social signals is also important. For example having an industry leader like Rand Fishkin mention this blog on social is very valuable.
Positive Social Velocity
Just like you can have a positive link velocity you can also have a positive social velocity. I have used positive social velocity to bring Penguin penalised sites back to the #1 spot.
Negative Social Velocity
Negative social velocity will see your site drop in rankings. For example if you bought 1,000 Tweets today and 3 days later 900 of them were removed – that would damage your rankings.
I mentioned bringing penalised sites back to the #1 spot with positive social velocity – as the velocity drops off, so does the ranking.
A list of brand signals that can affect your rankings positively or negatively.
Branded Anchor Text
Anchor texts that include your brand are a very strong signal to Google. For example ‘Matthew Woodward Link Building’ is much more powerful than just ‘Link Building’.
Similar to branded anchor text, when people search Google including your brand with associated keywords offers a huge boost in rankings. For example people searching for Matthew Woodward SEO tells Google my brand is important to SEO.
Brand Mentions / Citations
Having your brand mentioned on popular sites is a clear signal to Google. Even if you don’t get a link, just the fact your brand was mentioned is enough to see positive traction.
Popular brands have a presence on Facebook with a strong following.
Popular brands have a presence on Twitter with a strong following.
LinkedIn Company Pages
Popular brands have a dedicated LinkedIn page for their business which also shows employees.
Brands that are more active on social media and engage with people are more popular than brands that are not.
Brick & Mortar Business
The majority of real businesses have physical brick & mortar premises. Listing your address in the websites footer along with a Google+ listing is a strong brand signal.
If your brand has a bad reputation on sites like Yelp.com, RipOffReport.com or Google Places you will suffer in the long term for that.
Low Branded Click Through Rate
If people are searching for your brand and associated keywords but do not click through to your site, that is a negative brand signal.
Wrapping It Up
So there you have it – all known Google ranking factors in one place for your convenience!
With this knowledge in hand, you now know how to improve Google search ranking for any website on the planet.
Although this isn’t an official list, it is a list that I stand by based on my own experience of ranking sites. What is interesting though is how the list of ranking signals has grown over the past decade.
I’m going to go all out again and categorically state that backlinks are not the most important ranking factor for long term rankings and haven’t been for the past 2 years.
User engagement and brand signals are more important than ever before. If you are not paying attention to these areas you are going to have a hard time in 2015.
In my next post I’m going to reveal the 3 new major ranking factors Google are using today & how to beat them.
I will also tell you precisely how Google know which pages you visit and how long for regardless of whether you use Google Search, Google Analytics, Google Chrome or Google Toolbar.
And let’s not forget about these Bing SEO best practices as well!