Think of your website as a house. The paint, appliances, and furniture are like your site’s content and design—they’re the first things people see, they’re easy to fix, and they allow you to live comfortably. Then there’s everything you can’t see: your HVAC system, smoke alarms, electrical wiring—the stuff that keeps you safe and your home functioning. That’s your site’s technical SEO.
Without strong technical SEO, your website may be able to function for a little while, but pretty soon, cracks are going to show. And this is where a technical SEO audit comes in—ensuring that your website stays strong and usable in the long run.
For search engine bots to find your site and rank it, it needs to be secure, easily crawlable, and free of back-end obstacles. Technical SEO is complex, but it’s also essential if you want your site to rank well in search engine results.
What is a technical SEO audit?
A technical SEO audit is a systematic, comprehensive assessment of a website’s technical elements to identify issues that may affect its search engine visibility and performance. A step-by-step audit can help uncover and address issues like slow site speed, under-optimized metadata, 404 errors, incorrect canonical URLs, and more. The goal is to ensure that your website is optimized for search engines like Google and Bing.
The primary benefits of technical SEO are improved crawlability and indexability: search engine bots can find, index, and rank your pages more easily, which improves your search engine rankings. A technical audit can also strengthen website security, fixing errors can enhance user experience and increase conversion rates, and regular audits reduce SEO and website maintenance costs over time.
When to perform a technical SEO audit
The frequency with which you should conduct comprehensive SEO audits depends on your website’s size, technical infrastructure, and market landscape. Generally, aim to perform a technical SEO audit about every six months as part of broader site maintenance, or after a major site change, like a website migration.
Regular audits help you proactively address small problems—like broken links—before they accumulate and drag down your site’s performance over time. Technical SEO audits are also helpful if you’ve experienced a significant unexplained drop in your search rankings or traffic: they can uncover underlying technical issues that may be hurting your site’s performance in search engine results.
How to perform a technical SEO audit
- Pre-audit preparation
- Check internal linking
- Check your sitemap and indexing
- Check for browser friendliness
- Check content structure
- Post-audit monitoring and maintenance
An SEO audit requires step-by-step categorization of issues that need fixing, and a strategic plan on how to go about making those changes.
1. Pre-audit preparation
Before you start your technical SEO audit, clearly define your objectives. Are you looking to fix something specific like your sitemap, internal and external links, or website speed? Or are you conducting a more high-level site audit to understand where to drill down further?
Ensure that you have access to the necessary tools: at the very least, you need Google Analytics, Google Search Console, a schema markup validator like Schema.org’s Schema Markup Validator, and a crawling tool like Screaming Frog, Ahrefs, Semrush, or Moz. Screaming Frog is the most commonly used tool for this purpose, so we’ll use its configuration to describe what to do next.
Before you start your audit, make sure that your crawling tool is properly configured. Here’s how to configure Screaming Frog:
1. Go to Configuration > Spider > Preferences > Set Page Title Width > Characters Max = 58
2. Limit URL parameter crawling under Configuration > Spider > Limits > Limit Number of Query Strings > 1
3. Under Configuration > Content > Duplicates > Check “Enable near duplicates”
4. Under Configuration > User-Agent > Select “Googlebot (Smartphone)”
5. Add the GA and GSC API Access under Configuration (Optional)
Once your crawling tool is properly configured, you’re ready to start your audit.
2. Check internal linking
Linking errors, including broken links, redirect loops (endless web page redirection cycles), and incorrect canonicals (canonical tags that point to the wrong URL) can lead to user frustration and inefficient crawling and indexing, affecting your site’s performance.
Here are the main types of internal linking issues and how to fix them:
4xx errors occur when there’s a problem with the user request—like a missing or incorrect parameter, authentication issue, or permission problem—creating a poor user experience and impacting crawl efficiency. (404 Not Found, one of the most well-known 4xx errors, indicates that the server cannot find the requested resource at the specified URL.) To check for these in Screaming Frog, go to Bulk Export > Response Codes > Client Error (4xx) Inlinks. Remove 403 and other false positives from the report. Next, either restore the page with the same URL if it was accidentally removed, or 301-redirect the URL to an appropriate web page.
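If you’re comfortable with a bit of scripting, you can triage the exported report programmatically instead of filtering it by hand. Here’s a minimal Python sketch that groups 4xx inlinks by status code and drops 403 false positives; the `Destination` and `Status Code` column names are assumptions, so adjust them to match your crawler’s actual export:

```python
import csv
import io

def triage_4xx(csv_text, ignore_codes=(403,)):
    """Group 4xx inlink rows by status code, dropping likely false positives.

    Expects CSV text with "Destination" and "Status Code" columns (column
    names are an assumption -- adjust to match your crawler's export).
    """
    buckets = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        code = int(row["Status Code"])
        if code in ignore_codes:
            continue  # e.g., 403s that bots hit but real users don't
        buckets.setdefault(code, []).append(row["Destination"])
    return buckets

# Hypothetical export rows for illustration:
export = """Source,Destination,Status Code
https://example.com/,https://example.com/old-page,404
https://example.com/,https://example.com/admin,403
https://example.com/blog,https://example.com/old-page,404
"""
print(triage_4xx(export))
```

Each remaining bucket then maps cleanly onto a fix: restore the page, or 301-redirect every listed destination.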
3xx errors indicate a redirection response from a web server. Ideally, links should point to the latest live URL to avoid wasting crawl bandwidth and confusing crawlers. To check for 3xx codes, go to Bulk Export > Response Codes > Redirection (3xx) Inlinks. Make sure all links point to the newest version of the final URL instead of the redirect.
Orphan pages are standalone pages without internal links pointing to them, and as such, they’re typically not easily discoverable by users or search engines. To identify orphan pages on your site, go to Sitemaps > Orphan URLs. Export all orphan URLs that are indexable with a 200 status code, as these are the potentially problematic ones. Then, add an internal link to these pages if they hold valuable information. If not, unpublish them and redirect them to a relevant page.
3. Check your sitemap and indexing
A well-structured site gives search engines clear guidance on which pages to crawl and index, and is free of issues that limit indexability.
Here are the things to look out for:
Robots.txt files tell search engine crawlers which parts of a site to crawl and which to skip. The file usually lives in the root directory of a website. When reviewing robots.txt, first verify that your site has one: type “[yourwebsite].com/robots.txt” into your browser to see if the file exists. (Shopify generates robots.txt automatically for its stores.) Next, review the rules to confirm which parts of the site bots can access and index and which parts they should avoid.
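Python’s standard library includes a robots.txt parser you can use to sanity-check your rules. A minimal sketch, using a hypothetical robots.txt for an online store (in practice, fetch the live file from your own domain):

```python
from urllib import robotparser

# A hypothetical robots.txt -- replace with your site's actual rules.
robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /cart
Sitemap: https://example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Key pages should be crawlable; private flows should not.
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))
print(parser.can_fetch("Googlebot", "https://example.com/checkout/thanks"))
```

If a page you want ranked comes back `False` here, a Disallow rule is blocking it and needs to be revised.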
An XML sitemap acts as a roadmap for search engines and users to navigate and understand your site’s structure and hierarchy. Submit a sitemap with a comprehensive list of all URLs on your site to Google Search Console. To see your sitemap on Google Search Console, go to Index > Sitemaps. If there’s no sitemap, create one and submit it to Google Search Console. This improves crawlability and lets Google report on the quality of your sitemap. Check that your sitemap contains only URLs that return a 200 OK status code. There should be no URLs in the sitemap that return a 3xx, 4xx, or other non-200 response code (search engines interpret this as a faulty sitemap).
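To check that every sitemap entry returns a 200, you first need the list of URLs it contains. A short Python sketch using the standard library’s XML parser, with a hypothetical two-URL sitemap inlined for illustration (a real sitemap would be fetched from your domain):

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical sitemap -- real ones live at e.g. example.com/sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/products/widget</loc></url>
</urlset>
"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL so each can be checked for a 200 status."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

print(sitemap_urls(sitemap_xml))
```

You can then feed the resulting list into your crawler (Screaming Frog’s List mode works for this) to confirm every entry returns 200 OK.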
A canonical URL tells search engines which of several similar or duplicate URLs is the authoritative version to index. All pages should display a canonical tag, usually a self-referencing one. Shopify generates proper canonicals by default for its sites, but they can be customized (and broken) by a web developer. To check for missing or broken canonicals in Screaming Frog, go to Canonicals > Missing and Canonicals > Non-indexable (the latter usually means it’s broken). For missing or non-indexable canonicals, add or update a canonical tag (rel="canonical") to point to the right page.
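For spot checks on individual pages, you can pull the canonical tag out of the HTML with Python’s built-in parser. A sketch, run here against a hypothetical page snippet rather than a live fetch:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

# Hypothetical page source -- in practice, use the fetched HTML of your page.
page = '<html><head><link rel="canonical" href="https://example.com/products/widget"></head></html>'
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonicals)
```

A healthy page yields exactly one canonical, normally pointing back to its own URL; zero or several entries flags a page worth investigating.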
4. Check for browser friendliness
When a website is slow or confusing to navigate, users are prone to leave. Conversely, a positive browsing experience increases user engagement, a crucial ranking factor. Browser friendliness has three key elements:
Core Web Vitals (CWV)
CWV are made up of three metrics: Largest Contentful Paint (LCP), which is the point at which the page’s main content has loaded; First Input Delay (FID), or the time from a user’s first interaction with your site to when the browser begins processing that interaction; and Cumulative Layout Shift (CLS), a measure of unexpected layout shifts. (Note that Google has since replaced FID with Interaction to Next Paint, or INP, as its responsiveness metric.) To check your Core Web Vitals, go to Google Search Console’s Core Web Vitals report for a comprehensive list of potential issues. Open each mobile and desktop Core Web Vitals report for deeper insights. Once you’ve identified the pages with issues, run them through the PageSpeed Insights tool for a list of CWV insights and suggestions to fix. (Note that most CWV issues need the support of a web developer to solve.)
Sixty-seven percent of internet users browse on mobile devices, yet websites are usually built on desktops. This means many sites miss opportunities to be more mobile-friendly, potentially alienating users. Google tests for mobile responsiveness directly as a ranking factor. To find out how mobile-friendly your site is, enter your page’s URL in Google’s Mobile-Friendly Test tool. If you have many pages, choose a sample based on the pages with the highest search impact. Ensure your website and theme follow Google’s best practices of responsive web development. Addressing most mobile friendliness issues requires a developer with knowledge of HTML and CSS.
5. Check content structure
Ensure your content is organized to convey relevance, value, and uniqueness to search engines and users.
These elements are the most important parts of your content from a technical SEO perspective:
Title tags are HTML tags that tell browsers and search engines the title of a page. Check that your website has no duplicate page titles by going to Screaming Frog > Page Titles > Duplicate. You can also check the character length of your titles (ideally under 58 characters so they display properly in search results) in Screaming Frog under Page Titles > Over 58 Characters. Edit any duplicate or excessively long title tags. In Shopify, you can edit these directly in the SEO section of a page, product, collection, or blog post.
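If you have your crawl data as a simple URL-to-title mapping, both checks can be scripted. A minimal Python sketch with hypothetical example data:

```python
MAX_TITLE = 58  # characters before a title risks truncation in search results

def audit_titles(titles_by_url):
    """Flag duplicate and over-length title tags from a {url: title} mapping."""
    seen = {}
    too_long = []
    for url, title in titles_by_url.items():
        if len(title) > MAX_TITLE:
            too_long.append(url)
        seen.setdefault(title, []).append(url)
    duplicates = [urls for urls in seen.values() if len(urls) > 1]
    return {"duplicates": duplicates, "too_long": too_long}

# Hypothetical crawl data for illustration:
titles = {
    "https://example.com/a": "Blue Widgets | Example Store",
    "https://example.com/b": "Blue Widgets | Example Store",
    "https://example.com/c": "An Extremely Long Title That Goes Well Past What Search Engines Display",
}
print(audit_titles(titles))
```

The duplicate groups tell you which pages need distinct titles, and the over-length list tells you which ones to shorten.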
In HTML, H1 headers are considered one of the most important elements for describing a page. Make sure that all content has H1 headers and that they’re all unique. In Screaming Frog, go to H1 > Missing and H1 > Multiple. Update the headers (usually the page’s title) to be unique. If your pages contain multiple H1s, convert the less prominent ones into H2 headers.
For optimal website performance, ensure all image files on your website are smaller than 300 kilobytes, and include alt tags for accessibility. Use Screaming Frog to identify large images by navigating to Images > Over 300Kb. To locate their placement on your site, go to Bulk Export > Images > Images Over 300Kb Inlinks. If necessary, optimize images with tools like Photopea or TinyPNG. On Shopify, you can use a plug-in like Tiny:SEO to automatically compress images or implement lazy loading, which conserves bandwidth by loading images only when they appear on the user’s screen.
If you’ve seen products displayed on Google with ratings and reviews, it’s thanks to structured data schema—a data organization framework. Structured data helps Google classify information on your site, like star ratings, reviews, and FAQs, which it can use to create rich snippets in search results. Use Schema.org to create the correct schema markups for your pages. For ecommerce sites, focus on Product markup. You can find a list of commonly used terms to get started. The fix can be implemented in Liquid, and requires a web developer.
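To make this concrete, here’s a minimal Schema.org Product markup sketch in JSON-LD, the format Google recommends; all the values (product name, price, rating) are hypothetical placeholders. On a live page it would sit inside a `<script type="application/ld+json">` tag in the page’s HTML:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Widget",
  "image": "https://example.com/images/blue-widget.jpg",
  "description": "A hypothetical product used to illustrate the markup.",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
```

After adding markup like this, run the page through a schema validator to confirm search engines can read it.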
Once you’ve completed the audit, you’ll likely have a long list of action items for your site, some easier to implement than others and some more urgent. As a rule of thumb, the highest-priority items are findings that directly prevent search engines from including your pages in results, such as robots.txt issues and 4xx errors. The second-highest priority is findings that confuse search engines, such as duplicate content and canonical issues. Then look at performance issues such as page speed, and last, enrichment opportunities such as structured data.
However, these are general guidelines. Make your own decisions about what’s important based on your needs and goals, and consult an SEO expert if needed.
6. Post-audit monitoring and maintenance
In the first few months after implementing updates, monitor your site’s health with Google Search Console every week to see if your indexing and SERP rankings improve. Ideally, run a technical audit of your entire site every six months. For general maintenance, run a mini audit every month to review and fix crucial elements: Core Web Vitals, PageSpeed issues, 4xx errors, and sitemap errors.
Technical SEO audit checklist for ecommerce businesses
To help you get started on your technical audit, here’s a starter-kit checklist for ecommerce technical SEO:
- Does your site have an average PageSpeed score of over 70?
- Does your site pass all CWV tests, and is it free of mobile friendliness issues?
- Are your site’s URLs descriptive and easy to read?
- Is your site free of duplicate content, like duplicate title tags and headers?
- Have you submitted an XML sitemap to Google Search Console, and is the sitemap free of 3xx or 4xx messages in Search Console?
- Do your site’s product pages include Product Structured Data?
- Do your site’s reviews include Rating Structured Data?
Technical SEO audit FAQ
How often should I conduct a technical SEO audit?
You should conduct a full technical audit every six months and a maintenance audit every month. You may also want to perform a technical SEO audit if your website is undergoing major changes like a site migration or you’re experiencing a sudden drop in indexed pages.
Can I perform a technical SEO audit myself, or should I hire a professional?
If you have some programming and web development skills, you can follow our technical SEO audit guide to perform one on your own. However, depending on the issues you find, you may need a developer to help fix them. Issues involving Core Web Vitals, structured data, and mobile friendliness in particular benefit from the help of a professional web developer.
How long does it take to see results after implementing technical SEO fixes?
After implementing your technical SEO fixes, you should start to see crawlability and indexability improve within a month and see changes in your SEO results within six months.
Should I consider a technical SEO audit when redesigning or migrating my website?
Yes. A website migration is a great time to do a technical SEO audit to ensure your new website is crawlable and indexable.