How To Do a Detailed Technical SEO Audit in Less Than 60 Minutes

There are three main types of SEO: on-page SEO, off-page SEO, and technical SEO. Most of the time, technical SEO is ignored or overlooked, which can have a negative impact on your search engine rankings and traffic.
Overall, technical SEO accounts for a significant share of how well a website ranks. The better your technical SEO, the better your rankings, sales, conversions, and website traffic in general will be.
In this article, I will cover technical SEO in depth so that you can level up your SEO game. You will find key technical SEO factors like mobile-friendliness, website architecture, UX and UI, website speed, and a lot more.
You will also find a few tools that will help you take your technical SEO game to the next level.
So, without further ado, let the game begin.
What is technical SEO?
Technical SEO is a type of SEO that deals with the overall health of your website. You can think of technical SEO as a regular health checkup that helps you diagnose any health issues and take corrective measures if any. In the same way, technical SEO diagnoses the overall health of a website and takes corrective measures if needed. Generally, technical SEO is divided into three categories:
Website errors
These are the most common types of errors that your website might face on the front end. Such errors include slow page speed, broken links, long redirect chains, crawling and indexing issues, and a lot more.
UX and UI issues
Earlier, UX and UI errors were considered design issues rather than SEO issues. However, since Google's page experience update, UX and UI have become an important part of technical SEO as well.
To decide which pages should rank, Google uses a set of signals called page experience. It's all about giving users the best possible experience when they land on a web page: how fast the page loads, how it is designed, whether it is intuitive to use, how often it is updated, and so on. Good UX and UI also make your web pages more discoverable to users as well as to search engine bots for crawling and indexing.
Ranking opportunities
This is where technical SEO meets on-page SEO to improve your website's rankings. You can look for ranking opportunities by:
- Merging two pieces of content that target similar keywords.
- Removing duplicate content from your site.
- Fixing keyword cannibalization and checking keyword density.
- Improving your meta tags so that your web pages appear properly on the SERP.
All this sounds difficult, right? But you don't need to worry, as I have covered each of these factors in detail. By working on all of them, you'll be helping Google decide which web pages or content should be ranked on the SERP.
When you help Google, it returns the favor by improving your site's rankings and organic traffic.
By now, you should have a good overview of technical SEO and its importance. However, conducting a detailed technical SEO audit isn't easy, especially when you don't have the right tools.
So, here are five surefire technical SEO tools that will help you audit your entire site in less than 60 minutes:
- Screaming Frog
- Google Search Console
- Google Analytics
- SEMrush or Ahrefs
- Google Mobile-Friendly Test
Most of these tools are free, except SEMrush and Ahrefs. Also, the free version of Screaming Frog is limited to crawling 500 URLs; for more features, you can upgrade to the premium version.
Now it's time to dive into the core topic of this post: the factors you need to look at while doing a technical SEO audit.
Check robots.txt file
First things first, check the robots.txt file on your website. This is one of the most important files, as it tells search engines how to crawl and index your site. Search engine bots can only index your web pages if they are allowed to crawl them.
Therefore, before you run a crawl error report, check your robots.txt file, which lives in the root directory of your domain.
Here's what a robots.txt file looks like:
The robots.txt file tells search engine bots which pages and files of a website should be crawled and which should be ignored.
As you can see from the above image, certain pages and sections of the website are disallowed from crawling because they are back-end elements. By disallowing such pages, you save bandwidth and improve your crawl budget.
This means Google can crawl and index your site better.
If you run a large website, such as an e-commerce store with thousands of pages, using the Disallow directive in robots.txt helps Google focus its crawl budget on the pages you actually want to show on the SERP.
Apart from this, you can also reference your sitemap in the robots.txt file so that search engine bots can crawl and index your website more efficiently. If you want to make changes to the robots.txt file, you'll find it in the root directory of your domain.
If you're using a CMS like WordPress, you can use the Yoast SEO plugin to edit the robots.txt file.
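To illustrate, here is a minimal robots.txt sketch along the lines described above (the paths and sitemap URL are hypothetical examples, not from any real site), checked with Python's standard `urllib.robotparser` so you can verify which URLs the rules actually block:

```python
from urllib import robotparser

# A minimal, hypothetical robots.txt: block the WordPress back end,
# keep admin-ajax.php crawlable, and point bots at the sitemap.
ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Regular content stays crawlable; the back end does not.
print(rp.can_fetch("*", "https://example.com/blog/post"))          # True
print(rp.can_fetch("*", "https://example.com/wp-admin/edit.php"))  # False
```

Running your own robots.txt through a parser like this is a quick way to catch an overly broad Disallow rule before it blocks pages you want indexed.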
Run a crawl error report
Now that you have made changes to your robots.txt file, it's important to check whether search engine bots can crawl and index your site. You can do this with a tool like Screaming Frog.
That said, I would personally suggest using Google Search Console, as it's developed by Google and gives you accurate data.
Head over to your Google Search Console account and click on Coverage.
It will show you the errors that search engine bots face while crawling and indexing certain pages of your site. Make sure all your main pages appear under the Valid section, as those are already crawled and indexed by Google.
The above coverage report shows:
Error:
These are the affected URLs that need to be fixed. Search engine bots couldn't crawl and index these URLs, and they might affect the health of your site.
Valid URLs with some warnings:
These are web pages that search engine bots have indexed, but they have some minor issues.
Valid:
These URLs are completely crawled and indexed by search engine bots.
Excluded:
Google has excluded these URLs from crawling, either because you excluded them in the robots.txt file or due to redirects.
Your main goal should be to have a maximum number of valid URLs.
Fix all the redirect errors
These are among the most damaging errors for your site's health, so make sure you deal with them wisely. If you are a newbie in SEO, seek the help of an expert.
Every page on your website returns an HTTP status code. If all pages are healthy, a status code of 200 will appear in your technical SEO tool.
If any page has problems, status codes in the 3xx, 4xx, or 5xx range will appear instead.
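As a rough sketch, the status ranges above can be bucketed into the categories this audit cares about (the category names here are my own shorthand, not from any specific tool):

```python
def classify_status(code: int) -> str:
    """Bucket an HTTP status code into the audit categories described above."""
    if 200 <= code < 300:
        return "ok"            # healthy page
    if 300 <= code < 400:
        return "redirect"      # e.g. 301/302: check for chains and loops
    if 400 <= code < 500:
        return "client error"  # e.g. 404: broken or deleted page
    if 500 <= code < 600:
        return "server error"  # e.g. 500/503: hosting problem
    return "unknown"

print(classify_status(200))  # ok
print(classify_status(301))  # redirect
print(classify_status(404))  # client error
```

Feeding a crawler's exported status codes through a function like this makes it easy to count how many pages fall into each bucket.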
301 redirects
Status code 301 is a permanent redirect, indicating that a particular page has moved to another location permanently. 301 redirects are fine, but avoid stacking them, which creates long redirect chains and loops on your site. These might hamper the user experience.
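To spot long chains and loops, you can follow each URL through its redirects. Here is a minimal sketch assuming you have exported your crawl data as a `{source: target}` mapping (the URLs below are hypothetical):

```python
def follow_redirects(url, redirects, max_hops=5):
    """Follow a URL through a {source: target} redirect map.

    Returns the chain of URLs visited and a verdict:
    'ok', 'loop', or 'too long'.
    """
    chain = [url]
    while chain[-1] in redirects:
        nxt = redirects[chain[-1]]
        if nxt in chain:          # we've been here before: a redirect loop
            chain.append(nxt)
            return chain, "loop"
        chain.append(nxt)
        if len(chain) - 1 > max_hops:
            return chain, "too long"
    return chain, "ok"

# Hypothetical crawl export: each source URL 301-redirects to its target.
redirects = {
    "/old-post": "/new-post",
    "/a": "/b",
    "/b": "/a",  # a redirect loop
}
print(follow_redirects("/old-post", redirects))  # (['/old-post', '/new-post'], 'ok')
print(follow_redirects("/a", redirects))         # (['/a', '/b', '/a'], 'loop')
```

Any chain flagged `too long` or `loop` is a candidate for pointing the original URL straight at its final destination.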
404 errors
A 404 error appears when a page on your website has been deleted without a proper redirect, or when you have changed the URL structure of a page multiple times without redirection.
5xx server errors
These are server-side errors indicating that the server could not respond to the user's request. Generally, such errors occur due to hosting issues, so make sure you use good WordPress hosting with a strong uptime guarantee, speed, and reliability.
Conclusion
If you want to do a deeper SEO audit, you can dedicate another day (or week!) to it. A full SEO audit usually takes several days, especially if your site has thousands of pages.
By following this quick SEO audit process, I hope you have uncovered plenty of opportunities to improve your website's performance in search rankings.