If you need an SEO audit, feel free to reach out to Sam from Vudu Marketing as he is the one who actually created the template below.
Over the years, the one thing that I’ve noticed is that most sites aren’t optimized for search engines. Even websites that are run and owned by SEOs aren’t always optimized.
Why? Well, it’s because most SEOs know that link building is what mainly drives rankings. And although that’s true, on-page elements still account for roughly 43.12% of a search engine’s algorithm.
Due to this, I decided that it would be fun to have my buddy Sam McRoberts from Vudu Marketing audit Quick Sprout. With Sam’s approval, I decided to:
- Break down how he did the audit step by step so you can replicate it…
- Have my designer stylize his audit and create a template so that you can easily copy the whole audit for your website.
But before I give you the audit template (I link to it in the conclusion), first download the SEO audit of Quick Sprout as I am going to use that as a reference for how to perform an SEO audit for your website.
Here’s how to perform your own audit:
Step #1: Perform a Screaming Frog Crawl on the website
Screaming Frog is a free application that crawls your website. It goes through every single one of your pages and looks for the following:
- Errors – client and server issues, such as 404 pages
- Redirects – any permanent or temporary redirects (301, 302)
- External links – all of the sites you link out to
- URL issues – dynamic URLs, uppercase characters, URLs that are too long, and underscores
- Duplicate pages – anything with duplicate content
- Page title tags – any missing, duplicate, long or even short titles
- Meta description tags – similar to title tags, it looks for anything missing, duplicate, long or short
- Meta keywords tag – the same stuff as title and meta description tags… I personally don’t look at this field as most search engines ignore it.
- Headings – the types of headings you use (h1, h2, h3) as well as keyword usage, duplicates, and any missing ones
- Meta robots – what you are allowing to be indexed or not indexed as well as if you use it
- Rel canonical – in case you are pointing search engines to a different URL
- File size – the smaller your file sizes, the faster your load time
- Page depth level – how many levels deep search engines have to crawl to find all of your content
- Internal links – what pages you are linking to within your own website
- Anchor text – the link text you are using for both images and web pages
- Follow & nofollow – which of your links are being followed or not
- Images – the sizes of your images, the alt text length, or any missing alt texts
- Bot crawling – you can crawl your website as Google, Bing or Yahoo bot… this helps you see what they see
Once you crawl your whole website with Screaming Frog, which shouldn’t take more than a few minutes, you can then export all of that data into an Excel spreadsheet to help you better analyze it.
Step #2: Google Webmaster Tools and Analytics
If your website isn’t registered with Google Webmaster Tools, make sure you do so now. If you aren’t running Google Analytics or another form of analytics, sign up for it.
Through Webmaster Tools, you can see your site’s health, any crawl errors Google is experiencing, how fast your site is loading, and almost anything else you can dream of. If you want to learn about all of the features in Webmaster Tools, check out this guide.
Step #3: Keywords
With the Screaming Frog title tag, meta description, and meta keywords data, you can get a good understanding of what a website is trying to rank for. If you combine that data with your Google Analytics keyword data, you can see what a website is getting traffic for.
If you then take the keywords out of those two areas and enter them into Google’s Keyword Suggestion tool, it will spit out a list of keyword ideas.
The beautiful thing about Google’s keyword suggestion tool is that it will tell you how competitive a keyword is. Plus it will tell you how many times that keyword is searched for worldwide (global searches) and how many times it is searched within your country (local searches) each month.
This will help you get a better understanding of the potential keywords you could be going after, but currently aren’t. When looking at the Google keyword suggestion tool, keep in mind the following:
- Focus on local searches – it’s very rare to capture all of the global searches, as a US site is most likely to rank well in the US… not internationally.
- Don’t go after competitive keywords – there are a lot of keywords that are low in competition that have high search volume. You should be focusing on those keywords first.
Step #4: URLs
When you look at your Screaming Frog report, you should see a list of all of your URLs. The way I analyze URLs is:
- Static – your URLs should be static. Dynamic URLs usually contain random characters like: $, =, +, &. Static URLs typically contain numbers, letters and dashes.
- Length – although it doesn’t happen all the time, I try to keep URLs under 100 characters.
- User friendly – ideally your URLs should be easy to remember. Cut away dashes and slashes when you don’t need them.
If you have URLs that don’t fit the above criteria, you could create new URLs. When creating new ones, make sure you 301 redirect your old URLs to the new ones. That way you don’t lose the links that may be pointing to the old URLs.
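To make the static versus dynamic distinction concrete, here is what the two formats might look like side by side (these URLs are just made-up placeholders, not real pages):

Dynamic: https://www.example.com/index.php?id=1432&cat=7
Static: https://www.example.com/blog/seo-audit-guide/

If you move from the first format to the second, the 301 redirect from the old URL to the new one is what preserves any links you’ve already earned.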
Step #5: Title tags
The big misconception about title tags is that Google measures them in character limits. Google actually measures title tags by pixels.
So, what I do is I export my title tag data from Screaming Frog into Excel. I then change the font type to Arial and use the font size 12… as that’s what Google uses.
I then set the Excel column width to 520 pixels, as that’s Google’s cutoff. Anything longer than 520 pixels is too long, and anything under 350 pixels is too short.
Here are the rough guidelines you should use for your title tags:
- Roughly 50 to 65 characters in length.
- Be unique to that page (don’t use the same title tag on multiple pages).
- Use the keyword of that page twice if space permits (once at the start, followed by a separator such as a colon, hyphen, or pipe, and then once again in a call to action).
- If relevant, include a geo-qualifier (such as Washington or Seattle, WA).
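Putting those guidelines together, a title tag for a hypothetical Seattle coffee shop page might look something like this (the business name and wording are just placeholders):

<!-- hypothetical example: keyword at the start, a separator, then the keyword again in a call to action -->
<title>Seattle Coffee Roasters: Order Fresh Seattle Coffee Online</title>

At 58 characters it sits comfortably inside the 50 to 65 character range, and the geo-qualifier is built right into the keyword.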
Step #6: Meta descriptions
The biggest mistake I see companies making with their meta description tag is that they stuff keywords in them. And although you should have keywords in your description, it should also read well. The more compelling your description is, the more likely people will click on your search result.
Assuming you aren’t missing meta description tags and they aren’t duplicate, here are some guidelines for creating them:
- Make sure they are unique and relevant to that page.
- They should be written as descriptive ad text, with a call to action.
- No more than 160 characters in length including spaces and punctuation (140-150 is ideal), but no fewer than 51 characters (Google considers 50 characters or fewer to be too short).
- It should contain 1-2 complete sentences with correct punctuation, and no more than 5 commas.
- Use the keyword once per sentence, as close to the start of each sentence as possible.
- Include a geo-qualifier such as “Seattle, WA”, if relevant.
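Here’s a rough sketch of what a meta description following those guidelines could look like (the wording and location are placeholders for the same hypothetical coffee shop):

<!-- hypothetical example: two sentences, keyword near the start of each, geo-qualifier, about 150 characters -->
<meta name="description" content="Seattle coffee roasters shipping fresh beans daily from Seattle, WA. Order our small-batch coffee online today and get free shipping on your first bag." />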
Step #7: Meta keywords tag
If you want to use meta keywords you can, but it isn’t necessary. Most search engines ignore them as I mentioned earlier… for this reason I don’t include them on Quick Sprout.
I recommend that you don’t use them on your website as it is one more thing for you to have to maintain.
Step #8: Headings
Similar to title tags, there is a big misconception with heading tags. Search engines typically treat the keywords with the largest font size on a page as the most important ones. So popping keywords in heading tags and then making the font size really small may not have the same benefit.
With typical HTML standards, h1 tags are usually the largest on the page. For this reason it is important for you to use headings with large fonts within each page.
Here are some guidelines to run your headings by:
- Every page should have an H1 tag, as search engines look to the H1 to help determine the topic of a page. It should be the first thing in the body text of the page and should appear prominently.
- H1 tags should never contain images or logos, only text. The keyword of a page needs to be used in the H1 tag and in at least half of the total heading tags on a page, if more than one heading tag is present.
- From a usability perspective, paragraphs should never be longer than 5 lines of text, and it is wise to break up a page every 2-3 paragraphs with a sub-heading in the form of an H tag (H2 or H3) or an image. Testing has shown that when users are faced with a large block of unbroken text, most either skim over the text or skip it altogether, so content needs to be divided into usable chunks.
Make sure you go through all of the headings on your website so that they match the above requirements.
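To make that concrete, a simple page outline that follows those heading rules might be structured something like this (the topic and wording are just an illustration):

<!-- hypothetical outline: one prominent H1, keyword in the H1 and in at least half of the headings, sub-headings every few paragraphs -->
<h1>SEO Audit Checklist</h1>
<p>Intro paragraph that explains what the SEO audit covers...</p>
<h2>Crawling Your Site</h2>
<p>Two to three short paragraphs about the crawl...</p>
<h2>Fixing On-Page Issues Your SEO Audit Uncovers</h2>
<p>Two to three short paragraphs about the fixes...</p>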
Step #9: Content
You’ve heard the saying “content is king”. So you want to make sure there is enough content on each page. Screaming Frog doesn’t do a great job of analyzing the content on each page of your website, but you can figure out where you have a good amount of content and where it is lacking just by browsing your website.
For the pages that are lacking it, make sure you continually add more unique content. And don’t just add content for the sake of it, the content actually has to add value.
If you are looking for a rule of thumb, Google tends to rank web pages with over 2,400 words of content the highest. You’ll probably have a tough time adding that much content to each of your pages, but it shows that Google loves ranking good content.
Worst case scenario, each of your web pages should have at least 300 words of content. Ideally, each page should have at least 400 to 600 words of unique content.
Step #10: Internal linking
Both Webmaster Tools and Screaming Frog will give you data on internal links. The more you link within your own site, when relevant, the easier it will be for search engines to crawl your whole site.
Search engines typically don’t want you to have more than 100 links on a page, so you ideally want to stay under this number. Sometimes you won’t be able to, which is fine, but try to stay under that limit.
You also want to look at the anchor text you use when linking internally. Avoid using rich anchor text all the time. Naturally, you should also be using link text like “click here” or “learn more”. Having a lot of rich anchor text may bring your rankings down a few slots.
Every page of your website should have at least two to three internal links. And you should only use rich anchor text 10-30% of the time for internal links.
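As a quick illustration, here is the difference between a rich anchor text link and a more natural one pointing to the same internal page (the URL is a placeholder):

<!-- hypothetical internal links: rich anchor text first, natural anchor text second -->
<a href="https://www.example.com/seo-audit-template/">SEO audit template</a>
<a href="https://www.example.com/seo-audit-template/">learn more</a>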
Step #11: Image text and alt texts
For the benefit of search engines, code compliance, and visually impaired users, every image MUST have an ALT tag. The ALT tag should accurately describe the image and contain a keyword relevant to your website (but only if the keyword is relevant to the image as well).
Image file names should be descriptive words, not numbers or query strings. They should accurately describe the image, and, if relevant, should also use the keyword. If an image is used as a link, then the ALT tag functions in place of anchor text. A linked image should follow this structure:
<a href="https://www.targeturl.com/"><img src="https://www.domain.com/images/keyword-rich-image-name.jpg" alt="Describe the image and use a keyword if relevant" /></a>
By ensuring that all images are properly named and tagged, you will not only increase the SEO value of those images, but you will increase the likelihood of receiving referral traffic from image search results.
Also, for code compliance reasons, all images should also specify a height and width in the image tag.
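For instance, an image tag that includes descriptive alt text plus explicit dimensions might look something like this (the file name and sizes are placeholders):

<!-- hypothetical example: descriptive file name, relevant alt text, width and height specified -->
<img src="https://www.example.com/images/seattle-coffee-roaster.jpg" alt="Small-batch coffee roaster in Seattle, WA" width="600" height="400" />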
Step #12: Nofollow
Google measures how different pages link together and assigns a weight to those links based on traffic, relevancy, age, size, content, and hundreds of other components.
When pages that Google deems relevant link to other pages, some of that “link juice” flows through that link to the site being linked to. A “followed” link is essentially endorsing the page being linked to.
Enter the rel="nofollow" tag. Google introduced this tag to help preserve the relevancy of PageRank, which was being hurt by blog and forum comment spammers. When the tag rel="nofollow" is used in an anchor tag (link), Google will usually pass 50-100% less “link juice” to the page being linked to. Using this tag is like saying “this page is nice, but we don’t really endorse it.”
NoFollow tags should be used on blog comments, site-wide external links, and on any internal links pointing to low quality or otherwise user-worthless pages.
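A nofollowed link is just a regular anchor tag with the rel attribute added, something like this (the URL is a placeholder):

<!-- hypothetical example of a nofollowed external link -->
<a href="https://www.example.com/" rel="nofollow">Example Site</a>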
Step #13: Page exclusions
An easy way to exclude pages from being indexed is by using the robots.txt file as well as adding noindex tags to pages you don’t want indexed.
The reason you want to remove pages from the index is because search engines don’t like it when you have thousands of mediocre pages with little to no content or duplicate content. The best way to solve this is to not have them on your website or stop those pages from being crawled.
With Google’s latest updates, sites with thousands of mediocre pages are easily being penalized. So, make sure you stop mediocre pages from being crawled. Just place this code within the head:
<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
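If you’d rather block crawling of whole sections instead of tagging individual pages, a robots.txt file at the root of your domain can do that. A minimal sketch, assuming you wanted to keep crawlers out of a hypothetical /search/ directory, would be:

# applies to all crawlers; /search/ is just a placeholder directory
User-agent: *
Disallow: /search/

Keep in mind that robots.txt stops pages from being crawled, while the noindex tag above is what keeps a page that can be crawled out of the index.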
Step #14: Page inclusions
Through a sitemap you can help encourage the indexation of your website. You can create an HTML sitemap as well as an XML sitemap. Once you have your XML sitemap, you can submit it to Webmaster Tools, which will tell you how many of your submitted URLs are indexed.
It’s rare that 100% of the URLs you submit will be indexed… especially if you have a large website. But through internal linking, you can increase the likelihood of having all of your pages indexed.
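If you’ve never seen one, an XML sitemap is just a list of your URLs wrapped in some standard markup. A stripped-down sketch with a single placeholder URL looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<!-- minimal sitemap sketch; the URL is a placeholder -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/seo-audit-guide/</loc>
  </url>
</urlset>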
Step #15: URL redirects
Unless a redirect is truly temporary (such as for a time sensitive promotion), 302 redirects should never be used. 302 redirects don’t pass any link value and are essentially a dead end for SEO. In almost every scenario where a redirect is needed, a 301 redirect should be used.
Any page that changes URLs or is deleted needs a 301 permanent redirect to tell search engines and users that the page has moved/is gone. There should never be more than one URL path to a page. You can learn more about redirects here.
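How you set up a 301 depends on your server, but as one example, if your site happens to run on Apache you can add a redirect from an old URL to a new one in your .htaccess file along these lines (both paths are placeholders):

# permanently redirect the old URL to the new one
Redirect 301 /old-page/ https://www.example.com/new-page/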
Step #16: Duplicate content
Search engines really don’t like duplicate content, as it leads to a poor user experience and other content quality issues. If you have duplicate content, you need to do everything you can to eliminate it.
There are 4 main options for addressing duplicate content:
- Fix the URL structure of the site to eliminate accidental duplicate content coming from URL issues, per recommendations in the URL Redirects section and this section.
- Re-write all duplicate text content to make it unique.
- 301 redirect the duplicate content to one canonical page/site, if it is in your control.
- Implement the rel=”canonical” tag to identify the original source/root page to search engines.
Specify the canonical version of the URL using a tag in the head section of the page as follows:
<link rel="canonical" href="https://www.quicksprout.com/" />
You can use the tag on pages within a single site (sub-domains and subfolders are fine) or across domains (saying content on your site is identical to content on another site). You can use relative or absolute links, but the search engines recommend absolute links.
Step #17: Broken links
Because Google and other search engines crawl the web link-to-link, broken links can cause SEO problems for a website. When Google is crawling a site and hits a broken link, the crawler immediately leaves the site. If Google encounters too many broken links on a site, it may deem that the site has a poor user experience, which can cause a reduced crawl rate/depth and both indexing and ranking problems.
Unfortunately, broken links can also happen due to someone outside of your site linking in incorrectly. While these types of broken links can’t be avoided, they can be easily fixed with a 301 redirect.
To avoid both user and search engine problems, you should routinely check Google and Bing Webmaster Tools for crawl errors and run a tool like XENU Link Sleuth on your site to make sure there are no crawlable broken links.
If broken links are found, you need to implement a 301 redirect per the guidelines in the URL Redirect section. You can also use your Google Webmaster Tools account to check for broken links that Google has found on your site.
Step #18: Code validation
Because there are so many programming languages and so many ways to accomplish any one thing using each language, search engines rely on certain rules when they read the content of the website.
Having code that follows these rules helps to minimize errors when parsing, or separating, the code from the content of any one page.
Search engines such as Google have openly stated that they suggest following W3C standards to make your code easy for them to understand. I typically only test the home page of the website because many issues can be easily fixed across the entire website using just its page templates.
Double-check your website with W3C to see how you stack up against the competition.
Step #19: Page load speed
Google’s recommended page load speed is 1.4 seconds or less. So, if your website loads faster than that, you are usually fine. If it is slow, your rankings won’t be as high as they could be.
If you are wondering what your load speed is, use Pingdom’s free speed test tool.
Once you learn your load time, you can typically make it faster through browser caching, CSS Sprites for images where possible, and reducing the image file sizes as much as possible for images that can’t be sprited.
You can also reduce the total number of CSS and JavaScript files by combining them into fewer files and minimizing file sizes by using compression and minification where feasible.
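As a small example of browser caching, if your server happens to run Apache with mod_expires enabled, you could tell browsers to hold on to images and static files with something like this in .htaccess (the time spans here are just reasonable starting points, not rules):

# cache static assets in the browser; adjust the time spans to fit how often your files change
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>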
You might also see benefits by using a content delivery network (CDN) for your images.
W3 Total Cache is an excellent WordPress plug-in that can help with page load speed issues, and a simple CDN can be set-up via Amazon AWS for very little money.
Step #20: Inbound links
The more links that point to your website, the higher you will typically rank. But it’s not just about the pure number; it’s also about how relevant the incoming links are, what anchor text those links use, and how many unique root domains are linking to you.
Most importantly, you have to look at where those links are pointing. If they all go to your homepage, it won’t help you as much as it would if those links were spread across your internal pages.
Through Open Site Explorer you can get a great overview of your inbound links.
If you are trying to grow your inbound link profile, keep in mind that the very best links come from trusted domains (sites like the New York Times, Wall Street Journal, Wired, Inc., TechCrunch, Huffington Post, Wikipedia, etc.). The more links you can get from authoritative websites, the better. Guest blog posts and press mentions are a great way to get those links.
One of the things that Google looks at and factors into the algorithm is domain diversity. Essentially, the concept is that ten links from ten domains would be more valuable as a ranking factor than ten links from one domain.
From an SEO perspective, you usually want to see a domain diversity of no less than 10% (i.e. 100 links from 10 domains), though higher is usually better. All other factors being equal, the site with the larger number of linking root domains would almost always rank higher. That said, in the case of extremely high quality sites, an acceptable domain diversity could be as little as 2%.
If you are looking to build links, content marketing is going to be your best bet.
Step #21: Authority and trust
Similar to Google PageRank, there is a metric called Domain Authority. It ranks websites from 0 to 100. Anywhere from 40 to 70 is good, and anything above 70 is great.
Typically the higher your domain authority, the higher you will rank on search engines. The best way to increase this is to build links from as many unique domains as possible… and ideally from ones with high domain authority.
Just like in Step #20 above, you can build links through content marketing.
Step #22: Social media mentions
Both Bing and Google have explicitly stated that they take social signals into account when ranking websites. In other words, social media does affect SEO.
Twitter, StumbleUpon, Facebook, Pinterest, Delicious, and any other social site you can think of all need to be leveraged. They may not all fit your target audience, but you should try to be on those that do.
If you want to do better on the social web, consider the following two tips:
- Make it easy for people to share your content socially by integrating sharing features throughout your website, blog posts, etc.
- Create content that is worthy of sharing and then reach out to people in that space via social channels to ask for feedback about said content.
If you are wondering how many shares or tweets a specific website or URL has, check out this article. It will explain how you can find out those numbers for your site as well as your competitors’.
Step #23: Competitive link analysis
Through SEOmoz, you can find out not only how many links you have and your domain authority, but you can also find out all of those stats for your competition.
This is important because if you have a great domain authority, it won’t mean much if all of your competitors have a much higher domain authority. You have to see how you stack up against your competitors, and not just the rest of the web.
When I ran this report for Quick Sprout, I found that I have a good domain authority, but a lot of other bloggers out there have a much higher one. That’s one of the reasons they get more traffic than I do.
Conclusion
Now that you’ve done a complete analysis of your site, you can take the SEO audit template and plug in the information. This will allow you to have a comprehensive report that looks pretty.
You can plug the data into the template using Microsoft Word or Pages. Once you do, you will see an area where you can add a subjective score of how you did overall per category, 1 being the lowest and 10 being the highest.