
How To Conduct A Technical SEO Audit Of A Site: Detailed Checklist

Updated: Aug 21


A website can generate income and be genuinely useful for a business, but only if it appears at the top of search results. Below, we explain how to perform an SEO audit of a site.


Make your website more visible - conduct an effective SEO audit

Search engines like Google analyze dozens of parameters on your website. This is necessary to evaluate how "useful" the information is and to determine each page's rank. Did you know that a typical user rarely goes to the second page of results when looking for information? That is why it is important to be in the top 10, and for that the site needs to be optimized. With the tips in this article, the process takes only about 15 minutes. Here is what to watch out for.


Crawl the Site

A quick review is helpful, but a full crawl is always more valuable, so the first step in any site audit is a complete site analysis. Make sure your crawling tool is configured to follow all types of links and resources (JavaScript, images, video, Flash, etc.). The duration of the analysis depends on the size of your site and can take from one minute to several hours.


Check for site indexing issues

Once you have crawled the site, review everything that determines how a search spider crawls and indexes it. The most common problems to check are:


Does the robots.txt file allow crawling of all required pages?

What response codes does the server return for your pages?

Are the meta name="robots" tags and rel="canonical" links filled in correctly? (See the example below.)
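
For reference, here is a minimal sketch of these two tags in a page's <head>; the values and URL are placeholders:

    <head>
      <!-- Allow this page to be indexed and its links followed -->
      <meta name="robots" content="index, follow">
      <!-- Point search engines at the primary version of this page -->
      <link rel="canonical" href="https://site.com/page/">
    </head>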

Check Sitemap.xml

Check that your sitemap is up to date: every page that should be indexed must be listed in it. If you use a plugin to create and manage the sitemap, make sure it can generate and update the XML map automatically. When creating a sitemap, it is important that (a minimal example follows this list):


New pages appear in sitemap.xml automatically.

Pages blocked in the robots.txt file or with a noindex meta tag do not appear in the sitemap.
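
A minimal sitemap.xml with a single entry looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://site.com/page/</loc>
        <lastmod>2019-08-21</lastmod>
      </url>
    </urlset>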

After updating your sitemap and checking it for errors, submit it in Google Search Console and Yandex.Webmaster. If you are promoting the site to an English-speaking audience, be sure to submit it to Bing Webmaster Tools as well.

Things to Consider as Part of a Website Audit

Check Robots.txt

Indexing of a page can be blocked for various reasons: a noindex meta tag in the page's header, or a rule in the robots.txt file that closes the page from indexing. To scan the site I use the Netpeak Spider utility, which shows both cases during a crawl. Check whether important pages are blocked and, vice versa, whether pages that should be closed from indexing are left open. The robots.txt file is very important for proper internal optimization: its Disallow and Allow directives control how search robots crawl sections of the site.
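
A minimal sketch of a robots.txt file using these directives (the paths are hypothetical):

    User-agent: *
    # Keep the service section out of the crawl...
    Disallow: /admin/
    # ...except for one page that should stay crawlable
    Allow: /admin/public-page.html
    Sitemap: https://site.com/sitemap.xml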


Check the site for errors

Errors in server response codes are easy to find: check your site for them in Google Search Console under Crawl > Crawl Errors.


Correct any 4XX or 5XX errors that you find. Fix internal links that lead to invalid pages, and configure 301 redirects from removed pages that still receive external links. Also make sure the site has a 404 error page that sends visitors on to important pages.
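
If you prefer to spot-check response codes yourself, here is a minimal Python sketch; it assumes the third-party requests library is installed, and the URLs are placeholders:

    import requests

    urls = [
        "https://site.com/",
        "https://site.com/old-page/",
    ]

    for url in urls:
        # A HEAD request is enough to read the status code without the body
        response = requests.head(url, allow_redirects=False, timeout=10)
        if response.status_code >= 400:
            print(f"FIX: {url} -> {response.status_code}")
        else:
            print(f"OK: {url} -> {response.status_code}")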

Request for reindexing

If you make any significant changes, ask the search engines to re-crawl the affected pages. This also lets you verify that the main errors have been eliminated.


To submit a reindexing request to Google, open Google Search Console, go to Crawl > Fetch as Google, paste the URL of the updated page, and click the "Fetch" button.



Checklist:


Correct any errors on the website.

Check the robots.txt file.

Make sure all main pages can be indexed.

Correct the pages with incorrect response codes.

Submit the updated pages for re-indexing.

Avoid redirects

Incorrectly configured redirects are a common source of problems: moving a page or changing a URL is always a potential risk for search engine optimization. Here are some of the most common issues:


Replace temporary (302) redirects with 301 redirects on all pages that are permanently gone. A 302 redirect tells search engines that you are redirecting visitors only temporarily and that the page will be available again soon, so it does not transfer the weight of the missing page to the new one the way a 301 redirect does. A 302 is appropriate when testing a new page; if your page is gone forever, set up a 301 redirect from it, as in the sketch below.
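
For example, on an Apache server a permanent redirect can be declared in .htaccess like this (the paths are placeholders):

    # Permanent (301) redirect from a removed page to its replacement
    Redirect 301 /old-page/ https://site.com/new-page/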

Check rel="canonical". If you have duplicate content on multiple pages, you must specify which page is the primary one and set it as the canonical URL. Each page can have at most one rel="canonical" tag.
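
For instance, a filtered or sorted duplicate of a page can point to the primary version like this (the URLs are placeholders):

    <!-- In the <head> of https://site.com/catalog/?sort=price -->
    <link rel="canonical" href="https://site.com/catalog/">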

Fix HTML and CSS markup errors. Errors in the markup can significantly slow down your site; regular code reviews help you identify and correct most of them. Find and fix as many errors in the code as possible.

Create perfect URLs

One of the easiest ways to recover lost link weight is to find and repair external links that lead to non-existent pages on the website. Use a crawling tool or the "Links to Your Site" report in Google Search Console to find non-existent pages that external links point to, then fix them: set up 301 redirects from them to relevant pages, or contact the webmaster of the referring resource and ask them to update the link to the correct, current page.


Eliminate redundant outbound links. A page with 100 or more outbound links may look like spam to search engines, so limit the number of outgoing external links as much as possible.

Eliminate unnecessary redirects. Where possible, remove intermediate redirects so that visitors land on the final page immediately. Redirect chains not only increase page load time but can also signal problems to search engines.
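
A minimal Python sketch for spotting redirect chains, again assuming the requests library and a placeholder URL:

    import requests

    response = requests.get("http://www.site.com/old-page/", timeout=10)
    for hop in response.history:  # each intermediate redirect hop
        print(f"{hop.status_code}: {hop.url}")
    print(f"Final {response.status_code}: {response.url}")
    if len(response.history) > 1:
        print("Redirect chain: link straight to the final URL instead.")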

Get rid of dynamic URLs. Dynamically generated URLs are difficult to read and do not tell users where on the site they are, so replace them with human-readable ones wherever you can.

Reduce URL length to 40 characters or fewer. Observation suggests that URLs of 35-40 characters dominate the top of search results. Longer URLs are ranked too, of course, but practice shows it is desirable to keep them short.

Define your primary domain. Decide exactly which address your site should be reachable at: with or without www. Verify that the site works at only one of them and that the other returns a 301 redirect to it. For example, if your primary domain is http://site.com and a visitor enters http://www.site.com, they should be automatically redirected to the version without www. If your website is currently available both with and without www, use the MOZ toolbar to see which version has the most external links, keep that one, and set up a 301 redirect from the other. Likewise, make sure the site is accessible over only one protocol, http:// or https://, and apply the same rule to the trailing slash: the address should resolve consistently either with the "/" at the end or without it.
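
As a sketch, on an Apache server the www version can be redirected to a non-www primary domain in .htaccess like this (site.com stands in for your domain):

    # Send www traffic to the bare primary domain with a single 301
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www\.site\.com$ [NC]
    RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]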

Optimize meta tags

Fixing meta tags doesn't take long, but it can greatly improve a site's position in search and increase its CTR. Don't overdo it, though: the main purpose of the meta description is to tell your visitors (and search spiders) what the page is about. Add keywords to it where they fit naturally.


Eliminate empty or overly long titles and descriptions. Make sure all your meta tags stay within the recommended number of characters (50 to 60 characters for titles; 150 to 160 characters for descriptions).
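
As a reference point, a title and description within those limits might look like this (the text is illustrative):

    <title>Technical SEO Audit: Detailed Checklist</title>
    <meta name="description" content="A step-by-step checklist for auditing
    crawling, indexing, redirects, meta tags and content on your site.">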

Avoid duplicate titles and descriptions. Many websites, especially online stores, suffer from duplicated titles or meta descriptions. Duplicates can confuse search engines so that none of the duplicate pages rank well, so keep each page's tags and content as unique as possible.

Optimize H tags (h1-h6). The H1 heading tag is an important page-level ranking factor: use it to target the page at the main keyword users search for. Avoid repeating the same text in the other H tags (H2, H3, etc.); use them to extend the page's semantics with secondary keywords where needed. Also make sure H tags are not used purely for visual design. View the page's code and verify that the H1 tag appears only once per page and contains the main keyword.
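
A well-structured page might use headings like this (the texts are illustrative):

    <h1>Technical SEO Audit</h1>        <!-- one H1 with the main keyword -->
    <h2>Checking robots.txt</h2>        <!-- secondary keywords in H2-H6 -->
    <h2>Fixing redirects</h2>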

Content optimization

Quality content has long been a decisive ranking factor, so make sure that each of your pages has enough content and that the content is clear and useful.

Find pages with thin content. Make sure there is enough content on every page of the website; I do this in WebSite Auditor simply by sorting all pages by the "Word count" column. Each page should contain one to two paragraphs of text (at least 250 words). As you examine the pages, also check the formatting and readability of the text.
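
A word count check like this can also be scripted. A minimal Python sketch, assuming the third-party requests and beautifulsoup4 packages and a placeholder URL:

    import requests
    from bs4 import BeautifulSoup

    response = requests.get("https://site.com/page/", timeout=10)
    soup = BeautifulSoup(response.text, "html.parser")
    # get_text() drops the markup; split() counts whitespace-separated words
    word_count = len(soup.get_text(separator=" ").split())
    if word_count < 250:
        print(f"Thin content: only {word_count} words")
    else:
        print(f"OK: {word_count} words")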


Image optimization

Check all the images on the website (a sample tag follows this list):


Make sure there are no broken links to the images.

Optimize the image size for the site (if the site displays a 500 × 200 image, do not upload a 3000 × 1800 image).

Check that each image has unique and informative alt and title attributes.
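
A sample image tag that satisfies these points (the file name and texts are placeholders):

    <!-- File size matches the 500 × 200 display size; alt and title are unique and descriptive -->
    <img src="/images/audit-checklist-500x200.png" width="500" height="200"
         alt="Technical SEO audit checklist diagram"
         title="SEO audit checklist">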

Conclusion

Of course, the optimization process requires constant attention. But this checklist will help your website rank better in search and become more visible to customers, which translates into potential profit. A good reason to spend 15 minutes.
