
Technical SEO: How to Perform an SEO Audit (Step-by-Step Guide)

This document is an in-depth guide on how to perform a search engine optimization (SEO) audit. Contents:

1. Site Profile
2. Crawlability
3. HTML Status Codes
4. Indexability
5. On-Page Content
6. Security Audit
7. Off-Page Audit
8. Mobile Audit
9. Competitive Analysis
10. Glossary of Tools

This is the same SEO audit and checklist we have used for almost five years. It outlasts any Google algorithm update because it checks the factors Google really looks for. Please feel free to download it and use it for your own audits.

Date: June 2018

Digital Marketing Services that drive revenue. Webris | [email protected] | Ryan Stewart | http://webris.org

EXEC SUMMARY

My name is Ryan Stewart and I own and operate WEBRIS. I've been working in the digital industry for over 8 years and I've had the good fortune to work with some awesome clients, helping them solve complex digital problems. Hey, I get it, there are a lot of digital agencies out there and we all tell you the same thing. Here's what I want you to do: fill out the form below and let's set some time to connect on the phone. During that call I want you to tell me your business goals and I'll draw you the blueprint to achieve them, right on the phone. Don't believe me? Put me to the test. Take a moment to fill out the form below and I'll be in touch before the end of the day. [email protected] | @ryanwashere

SITE PROFILE

The first step in any auditing process is to know the monster you are dealing with. Are we talking about a website that was set up only last week and is still untouched in terms of content optimization and backlinks? Or are we talking about a website that is many years old and has been worked on by many optimizers in the past? While the latter helps you build a great SEO audit report, the work involved could be enormous too, especially if it involves removing bad links. Either way, you will first want to know how old the website is, how big it is in terms of the number of indexed (and unindexed) pages, the level of authority it has, and so on. My favorite tool in this regard is a Google Chrome extension called Open SEO Stats.
It gives you a general idea of the SEO profile of any given website: the age of the domain name, the number of indexed pages on Google, the number of backlinks, the traffic trend on third-party services like Alexa and Compete, social media presence, page loading speed, web host location, and so on. I like it because you get a complete picture of the website in just a couple of clicks.

Build A Site Profile

The first step in any SEO assignment is performing an audit of the client's website to identify the problem areas, the areas of opportunity, and a general sense of where you are and where you need to go. In other words, an SEO audit helps you chart the road map for the work ahead. This is true irrespective of whether this is an inbound SEO prospect or someone you identified as a lead and have reached out to. This is a comprehensive guidebook on what SEO auditing involves and how you should go about it. I have also included dozens of free and paid online tools to help you perform the audit more efficiently. So let's get started.

CRAWLABILITY AUDIT

1. Robots.txt

The first thing to look at is what the robots.txt file says. For those new to the game, this is a text file that tells bots which of your web pages they may access. For example, if you have an admin panel on your website for internal purposes that you do not want Googlebot to access, you can set this restriction in the robots.txt file. You can view the file by simply typing yourwebsitename.com/robots.txt into the browser address bar. Sometimes, though, what appears fine in the robots.txt file can have unintended consequences. Alternatively, some robot accessibility restrictions may be set at the page level in what are known as robots meta tags.
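The robots.txt check described above can be scripted with Python's standard library. Below is a minimal sketch; the rules and URLs are invented for illustration, and a real audit would fetch the live file (for example with `RobotFileParser.set_url` and `read`) rather than parse a hard-coded string.

```python
# Sketch: test robots.txt rules with the stdlib parser.
# The rules and URLs below are hypothetical examples.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# The admin panel is blocked for all bots, regular pages are not.
print(rp.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
```

Running this against the pages Screaming Frog reports as blocked is a quick way to confirm whether a restriction is intentional.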
To ensure you have restricted access only for the pages you intended, it is a good idea to view the robot access information through a tool like the Screaming Frog SEO Spider. This is free software that crawls through your entire website and pulls out all the HTML elements of each page. You may, however, require a license if you work with large websites that have more than 500 pages. With this tool, you can get instant information on the robot accessibility settings for all the pages on your website. Use it to see which pages have robots set to 'noindex' or 'nofollow', and whether this was done deliberately to prevent Google from crawling the page or was a mistake.

This accessibility check — who can access each web page, and whether pages respond as intended when access is requested — is the first real step of the auditing process, now that you have a mental picture of the website you are dealing with.

HTML STATUS CODES

Besides robots.txt, another factor that can reveal accessibility issues is the status code. These are the 'responses' delivered when a request to access a web page is sent. The most familiar one, of course, is '404 Page Not Found'. You will want to know the status code for each page of your website, and the Screaming Frog SEO Spider gives you this information too. I typically sort the SEO Spider results by status code so I can see all the pages that are being redirected (301, 302) or not found (404) in one bunch. The codes you will encounter most often:

200 OK
301 Moved Permanently
302 Found (temporary redirect)
304 Not Modified
404 Not Found
500 Internal Server Error
503 Service Unavailable
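Sorting a crawl export by status code, as described above, amounts to bucketing codes into success, redirect, and broken. A minimal sketch of that triage; the page list is made up for illustration:

```python
# Sketch: bucket HTTP status codes the way an audit spreadsheet would,
# so redirected (3xx) and broken (4xx/5xx) pages can be grouped together.
def classify_status(code: int) -> str:
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302, 307, 308):
        return "redirect"
    if code == 304:
        return "not modified"
    if 400 <= code < 500:
        return "client error"   # e.g. 404 Not Found
    if 500 <= code < 600:
        return "server error"   # e.g. 500, 503
    return "other"

# Hypothetical crawl results: URL -> status code.
pages = {"/": 200, "/old-page": 301, "/missing": 404, "/api": 503}
for url, code in sorted(pages.items()):
    print(url, code, classify_status(code))
```

In practice you would feed this from the crawler's CSV export rather than a hard-coded dictionary.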
INDEXABILITY

Now that you have studied the "crawlability" of your website, the next step is to find out whether the content is getting indexed as desired on Google. If you are just starting out in SEO, one thing you should know is that what you see is NOT what you get. You may have a pretty-looking website, but that is not necessarily how Google sees it. While Google has been getting better at this, it is always a good idea to keep things simple. Here are some things you should look at.

1. Navigability

Is the website letting Google freely navigate through all its pages? Does the website have an XML sitemap that dynamically generates a list of all web pages on the site? Has this been provided to Google through Webmaster Tools? Identify these and include them in your audit report.

2. Site Architecture

How is your website organized? Is it one huge mass of a thousand pages? Or do you have all the content properly organized into folders and sub-folders? There are two components to the auditing process when looking at site architecture. The first is the URL architecture. Traditionally, it was considered good practice to reflect the organizational structure in the URL. For instance, if you had a directory of hotels across the country, you would typically have a structure like website.com/state/city/hotel-name.html. However, from an SEO perspective, what matters more is how easily a user can navigate from the homepage to any particular web page. Evaluate how the content is organized, how the various folders and sub-folders on the website are organized and interlinked, and whether the URL structure is consistent among all pages under one organizational folder.
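The sitemap check above can be partly automated: pull the XML sitemap and compare its URL list against what the crawler found. A minimal sketch with a hypothetical, hard-coded sitemap (a real audit would download the site's /sitemap.xml first):

```python
# Sketch: list the URLs declared in an XML sitemap, using the
# standard sitemaps.org namespace. The sitemap content is invented.
import xml.etree.ElementTree as ET

SITEMAP = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/state/city/hotel-name.html</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs in sitemap")
```

Diffing this list against the crawl output quickly surfaces pages that are crawlable but not declared, or declared but orphaned.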
INDEXABILITY CONT'D

3. Animation and Visual Content

As noted already, it is a bad idea to use elements like Adobe Flash or JavaScript to render content on your website. Modern technologies like HTML5 not only render better for the visitor but are also more search-engine friendly. Use a tool like SiteCheckup to evaluate at least all the important pages on the website for Flash elements. That website has quite a few other useful auditing tools you may want to check out.

4. HTML Markup

Wouldn't you rate a news article full of spelling mistakes poorly? The same goes for an HTML page full of syntax errors and coding violations. In extreme cases, this can also impact the indexability of your website. Run a markup validation check using the W3C tool to identify errors that need to be fixed.

ON PAGE CONTENT

The next step is to determine whether all the HTML content elements are in place on your website. While it is no longer necessary to have elements like meta keywords, others like the title, header elements, meta descriptions, and structured data are still important. I recommend doing this via the following steps:

1. Identify main elements through SEO Spider

Every web page needs a well-defined title, header, and sub-headers. This helps Google understand the context of the page better, and SEO Spider does a great job of pulling this information from all the pages into one place. Typically, you will need to check for the following:

Are titles succinct and less than 65 characters in length?
Is the main headline on the page marked with the H1 tag, and subsequent sub-headers marked with H2, H3, and so on?
Is there more than one H1 tag on a page? That is a big no-no.
Are titles unique? A lot of websites tend to repeat the same page title across the site.
Is the title overtly manipulated for SEO reasons?
This is again forbidden.
Does the meta description provide a good summary of the page's content? This is the text that Google search visitors read before clicking through to your link.

2. Content Quality

The quality of the content that the website delivers to its audience is paramount from an SEO perspective. But judging it is a fairly subjective exercise, since what one person calls 'useful information' is often a matter of perspective. At the outset, you will need to check for the following:

Is the content unique? Siteliner is a good tool to help you identify content that may be duplicated within your site. Note that it only helps with internal duplication; if content has been plagiarized from elsewhere, this tool will not help. For such cases, you may use tools like Copyscape.

Is it comprehensive enough? The oft-quoted ideal length is 300-500 words, but this is plainly subjective and depends on your niche. No tool can give you the right answer here. What you can do is a competitive analysis of other websites in the industry, to see whether your website has more in-depth analysis and information than the others.

Does it provide useful, well-researched and verifiable information? Again, this is subjective. A competitive analysis can give you an idea of how good your website is compared to the competition. Is there a compelling reason why Google should rank your site above the others? If not, note these points in your audit report.

Does it make for easy reading? Depending on your audience, your content should make for easy reading. This means the content should not be riddled with too much jargon or too many abbreviations. Generally speaking, your website should be comprehensible to a student in the eighth to tenth grade.
You need not spend too much time on this aspect; run a random sample of content from your website through a tool like ReadAble to benchmark your website's readability.

Does the content seem manipulated for SEO reasons? Self-taught website owners and amateur SEOs often place a lot of focus on elements like keyword density. This typically ends in a keyword-stuffed article that Google can easily recognize as manipulated. Your SEO audit report should definitely look into this aspect. The keyword analyzer from SEOBook does a good job here.

Are there grammatical and typographical errors? Finally, is your content free from grammatical errors and spelling mistakes? There are a number of spelling and grammar checking tools online, but if you are a native speaker of the language, you can make do with your own knowledge and the MS Word spell-check feature.

Are you targeting the right keywords? This is an extremely crucial step. The website you are auditing may be the best in the business, but without the use of relevant keywords you may not reach the right audience. Use the Keyword Planner tool in AdWords to identify the main keywords for the niche, and check whether the website is optimized for all of them.

3. Image Attributes

The SEO Spider does an awesome job of crawling through all the images on your site and helps you identify those that have no ALT attribute. The ALT attribute is a textual reference provided for an image, to help visitors who may not be able to view the image itself for various reasons, and is thus an important SEO hygiene factor.
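Several of the on-page checks above — title length, a single H1, images missing ALT text — can be gathered in one pass with the standard-library HTML parser. A minimal sketch; the page fragment and class name are invented for illustration:

```python
# Sketch: collect a few on-page signals (title length, H1 count,
# images without ALT text) from raw HTML using the stdlib parser.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.h1_count = 0
        self.imgs_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1
        elif tag == "img" and not dict(attrs).get("alt"):
            self.imgs_missing_alt += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page fragment.
page = ("<title>Hotels in Miami</title><h1>Miami Hotels</h1>"
        "<img src='a.jpg'><img src='b.jpg' alt='pool'>")
audit = OnPageAudit()
audit.feed(page)
print(len(audit.title) <= 65, audit.h1_count == 1, audit.imgs_missing_alt)
```

In a real audit you would let Screaming Frog do this at scale; a script like this is mainly useful for spot-checking a handful of templates.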
4. Hyperlinks

It is believed that hyperlinking to contextually relevant, high-authority publications tends to have a positive effect on your own web page's ranking. Additionally, linking contextually to other parts of your own website is a healthy way to improve navigation and usability. While auditing the hyperlinks on your web pages, there are a few specific things to look into:

Broken Links: Use a tool like the W3C Link Checker to check for broken links on your website. Remember to tick the recursive option so that you check across the whole site.

Link Quality: This is again a very subjective aspect. The rule of thumb for determining the quality of a destination website is the kind of backlinks it has and the general quality of the site. Ideally, you would perform a site audit of each website you link out to. Since that may not be practical, a quick way to assess a destination website is to see whether it links to, or is linked from, an excessive number of spammy websites.

Nofollow: There has been a lot of discussion over nofollow strategy. Many website owners are defensive and nofollow all their external links by default. This may be unnecessary and can even be counter-productive. Use nofollow if a link is sponsored or if you cannot vouch for the quality of the destination. The NoFollow Chrome extension automatically highlights nofollowed links on any web page and is a good starting point for auditing hyperlinks.

Anchor Text: Apart from nofollow, this is another aspect that has been abused by website owners. Identify the anchor text used for the various internal and external links, and analyze whether it is overly promotional or stuffed with keywords.

SECURITY AUDIT

Often, websites are penalized by Google for not being secure enough.
There are a few things you can check to evaluate how secure the website is.

1. HTTPS

Google has announced that HTTPS is now a ranking signal. While the absence of HTTPS is not likely to bring down your rankings anytime soon, it is a good security feature to offer your customers. Depending on what you offer, consider deploying HTTPS on your website.

2. Malware Content

Websites are often hacked to host malware. In some cases, a WordPress or Joomla theme you download may contain hidden malware that links out to spammy niches like casino or gambling. There are two ways to track such content. The first is to check your Google Webmaster Tools account for any malware reported by Google. The second is to install the Avast Chrome extension, which scans Google search results for spyware; once it is installed, use the 'site:website.com' search operator on Google to get a list of all your web pages and look for any potential spam reports.

3. Neighborhood Analysis

Tools like SpyOnWeb give you a list of websites hosted on the same IP network as yours. Identify potential spam websites in your shared hosting neighborhood that could affect your search ranking.

OFF PAGE AUDIT

The off-page audit of a website mostly comes down to backlink analysis, but that is not all: you will also need to benchmark the website's social profile, local listing profile, and overall trustworthiness. Here is a brief rundown of how to go about it.

1. Backlink analysis

This is one of the most important aspects of an SEO audit, and one could write an entire book on the topic. For the sake of this guide, here is a brief rundown of the various aspects you should check.
• Number and quality of backlinks: The Moz Open Site Explorer is one of the most popular tools for backlink profile analysis. The service not only gives you an approximate count of backlinks to your website, but also provides a domain authority and page authority score for each inbound link. While this is in no way an absolute measure of link quality, it gives you a rough idea of the kind of links pointing to the website.

• Anchor text analysis: A natural backlink profile has a healthy mix of all kinds of anchor texts, not just the keyword you are trying to rank for. BacklinkWatch is a free tool for looking into the backlinks pointing to your website, along with the anchor text used and any nofollow parameters. Check out its raw export feature to download the backlink file for further processing.

• Nofollow ratio: The BacklinkWatch analysis also gives you the distribution of followed and nofollowed links. Typically, most hyperlinks to a website are follow links; however, the complete absence of nofollow links can raise eyebrows. This, in conjunction with other signals like anchor-text keyword stuffing or hyperlinks from low-trust websites, is often an indicator of a bad backlink profile. In such cases, there is a good chance the website has received a penalty from Google. Look for warnings and penalty notifications in Webmaster Tools.

• Linking Schemes: It has been more than a decade since link exchanges and link wheels went out of use, but you would not believe the number of websites that still use them. Even if the site you audit does not engage in such tactics, if, God forbid, the websites you have earned links from do, it is time to make a note of those websites and potentially disavow them later. The same is true for sponsored links.
Even if your client does not pay for links, getting backlinks from websites that do is a risk factor and needs to be accounted for during your site audit.

• Relevancy: Do a thorough analysis of all the hyperlinks to your website and identify the context of each link. Hyperlinks from websites with little contextual relevance are risky and could potentially impact your ranking. This is a long-drawn process, but it is recommended that you do it manually without relying on tools.

2. Local Listing Profile

If the website you are auditing is a local business, it is important to audit its local listing profile separately. This includes the following:

• Directory listings: Identify the maj