Technical SEO
Table of Contents
- Introduction to Technical SEO
- Why Technical SEO Matters
- Website Crawler Tools
- Chrome Website Inspector
- Technical SEO Tools for Speed Testing
- Interpret What a Crawler Is Telling You
- The Role of Robots.txt Files
- When Website Log Files Can Be Useful
- What Are Google’s Core Web Vitals?
- Evaluating Largest Contentful Paint
- Evaluating Interaction to Next Paint
- Evaluating Cumulative Layout Shift
- Prioritising Website Speed Problems
- Best Practices for Website Architecture
- The Importance of an Internal Linking Strategy
- When to Use a Redirect Page
- Understanding Schema in SEO
- Why Canonicalisation Is Essential for SEO
- Optimising International SEO with Hreflang Tags
- Optimising Images for SEO
- Best Practices for On-Site Videos
- Prioritising Technical SEO Fixes
Introduction to Technical SEO
Welcome to this detailed Technical SEO guide from Courses Buddy!
Even if your knowledge of SEO is limited to understanding what the letters S, E, and O stand for, you likely know that great content alone isn’t enough to improve your website’s visibility. To maximise ROI and attract more leads to your business, you need to dive deeper into technical SEO.
In this detailed guide, we take you behind the scenes, showing you how to optimise your website from the ground up. We cover everything from how search engine crawlers work to how you can develop strategies for enhancing your site’s speed and discoverability.
Maximising Traffic with Technical SEO
Great content is just the beginning when it comes to boosting your website’s visibility, return on investment (ROI), and attracting leads for your business. However, to fully maximise your results, technical SEO is essential. This guide focuses on the key aspects of technical SEO, how to assess your current SEO situation, and how to effectively communicate actionable recommendations to your web developers for implementation.
As websites grow increasingly complex, technical SEO has become more critical than ever. Even for seasoned professionals, there are always new challenges and opportunities to optimise site performance. While this can seem overwhelming at first, the good news is that technical SEO is a learnable skill.
It’s clear that the core principles of SEO are what make the technical aspects work so well. With the right guidance, you can gain the expertise to drive more traffic and enhance your website’s performance through technical SEO.
Why Technical SEO Matters
When we talk about technical SEO, we are referring to the technical elements of your website that work behind the scenes. While some aspects, like a slow site or unfriendly URLs, may be visible to your visitors, the real focus of technical SEO is on helping search engine bots understand what your website is about.
To rank for your target keywords, you must prove to search engines why you should be recognised for them. This means presenting your website in the best possible way for search engines, ensuring it is optimised for ranking. However, this should never come at the cost of your visitors’ experience. Anything you do to improve your site for search engines should also enhance the user experience for those who visit it.
Fundamentally, technical SEO can be broken down into four broad categories:
- Crawling and Indexing: This refers to how search engines engage with your website. By viewing your site from a search engine’s perspective, you can identify issues that may be hindering its performance. Websites that search engines can easily understand tend to rank better.
- Site Speed: The speed at which your website loads is critical. A slow site not only frustrates your visitors but also negatively impacts search engine rankings. Improving your website’s speed is an essential part of good SEO.
- Content and Site Architecture: How well is your content organised? Are there clear links between relevant pieces of content? A well-structured website makes it easier for both users and search engines to navigate and understand your content.
- Other Factors: This includes things like Schema, Images, Videos, and other elements that help complete the website’s technical picture.
While technical SEO is an essential component of any SEO project, it is often overlooked. This could be due to a lack of technical knowledge within the SEO team or resistance from development teams to implement what may seem like minor recommendations. However, neglecting technical SEO can undermine your overall SEO success. No matter how great your content is or how many backlinks you build, basic technical issues on your site can prevent search engines from fully understanding it.
In fact, technical SEO is so crucial that it should be the first thing you focus on in any SEO project. While it may seem overwhelming, especially if you don’t have a technical background, don’t be intimidated. With curiosity, persistence, and attention to detail, anyone can master technical SEO.
Website Crawler Tools
A website crawler is essential for understanding the technical aspects of your site, acting as a translator between your website and search engine bots. Crawlers replicate how search engine bots crawl your website to rank and index your content. To address the issues that crawlers might encounter, you need to replicate their process and identify the same problems they will.
There are many website crawlers available. These tools are primarily paid, though one offers a free version for smaller websites. You may also decide not to use a dedicated crawler if you already have an SEO tool that includes crawling features. Most SEO tools come with some crawling functionality, which might be sufficient for your needs.
It’s important to note that not all crawlers are created equal. Initially, it’s a good idea to run the same website through multiple crawlers to ensure all issues are identified. One of the advantages of using a crawler is that, in many cases, you can crawl any website. This provides a great opportunity to gain experience by reviewing different websites.
Popular Website Crawlers
- Screaming Frog:
This popular web crawler has been around since 2010, making it a well-established tool in the SEO world. It’s downloadable software rather than a web-based tool. The free version can crawl up to 500 URLs, but it doesn’t include all the features of the paid version. For beginners or those on a budget, this is an excellent choice to get started with technical SEO audits.
- Sitebulb:
Similar to Screaming Frog, Sitebulb is software you install on your computer. However, it places more emphasis on visual representations of the crawl results. If you prefer visual insights rather than spreadsheets, Sitebulb might be the better option. While there is no free version, Sitebulb offers a free trial. The Pro version is more expensive than Screaming Frog, but its visual approach may justify the cost.
- JetOctopus:
If you’re working with larger websites, JetOctopus is a powerful web-based crawler that doesn’t require software installation. Unlike Screaming Frog and Sitebulb, which can run out of memory when crawling very large websites, JetOctopus can handle millions of URLs. This makes it the most expensive option, but it is ideal for enterprise-level websites.
Key Considerations When Choosing a Crawler
When evaluating website crawlers, ask yourself the following questions:
- How often is the crawler updated?
It’s important to choose a responsive company that regularly updates its crawler, especially when new Google features are released.
- Can the crawler evaluate JavaScript?
As Google can crawl JavaScript, ensure that the crawler you use can also assess it.
- Can you compare reports?
Being able to compare reports allows you to track improvements and assess the impact of changes on your website’s health.
- Does the crawler generate reports?
Depending on your needs, this feature may be useful. If a client has an internal development team, giving them access to the tool can be beneficial.
By answering these questions, you’ll be able to choose the right website crawler to suit your needs.
Chrome Website Inspector
One of the most valuable tools for technical SEO is Chrome’s Website Inspector (also known as Chrome DevTools). While it’s commonly associated with web development, technical SEO relies heavily on many of the same elements a web developer would use when building a website.
To access the Website Inspector:
- Windows: Press F12 or Ctrl + Shift + J
- Mac: Press Command + Option + C
Alternatively, you can right-click any element on a page and select Inspect to view its code directly.
When you first open the Inspector, you’ll be greeted with various options. The first thing I recommend is setting up the layout so it’s most useful for you. Click the three dots at the top of the DevTools panel, then choose which side of the window you’d prefer the tools to be docked. This is a matter of personal preference, and I prefer to have the tools displayed on the right side of the website.
Key Sections of the Inspector
- Elements:
This is where you can view the HTML code of a page. It’s useful for checking how a page is structured, for example, ensuring that headings are correctly outlined.
- Console:
Although we won’t use the Console much for technical SEO, it’s handy for debugging JavaScript or checking analytics issues.
- Sources, Network, and Performance:
These tabs are essential for reviewing site speed:
- Sources: Shows all the different domains used to build the page. This is useful to identify if multiple domains are slowing down your site.
- Network: Displays all the elements that were loaded to build the webpage. If an element returns a 404 Not Found code, this can significantly slow down the site, so these issues need to be fixed.
- Performance: Provides in-depth details about how page elements load and their impact on site speed. You’ll find more information on using this tool in the course videos covering site speed.
- Security:
Ensure that the site has a valid security certificate. For example, you can verify if a site is secure by looking for the padlock icon in the address bar. A secure and valid certificate is vital for SEO.
- Lighthouse:
Lighthouse is Google’s automated tool for webpage testing. It simulates your site’s performance on a slow connection and checks for accessibility and SEO issues. This is a great starting point for your technical SEO audit.
I recommend running Lighthouse in Incognito mode to avoid interference from Chrome extensions that could skew the results.
How Lighthouse Can Help
Lighthouse will generate reports on various aspects of your website:
- Performance: Similar to Google’s PageSpeed tool, this will show how fast your site loads.
- Accessibility and Best Practices: These are crucial for user experience and, in some cases, legal requirements.
- SEO: Although your website crawler will likely find most of these issues, it’s always good to double-check for additional SEO problems.
Practical Application
You can run the Website Inspector on any website, not just your own. This is an excellent way to practice and learn. By reviewing websites you visit regularly, you can become more comfortable using these tools and discover technical SEO issues that might not be immediately obvious on your own site.
Technical SEO Tools for Speed Testing
Site speed is a critical element of technical SEO. It’s not just about ranking—it directly impacts user experience and conversion rates. If your site is slow, users won’t wait, and you could lose both engagement and revenue.
Recognising its importance, Google incorporated site speed signals into its core ranking algorithm in 2021. This makes it essential to consistently monitor and improve your website’s performance. That’s where speed testing tools come in.
Recommended Speed Testing Tools
There’s no shortage of tools available, and most of them are free—so you can allocate your budget towards fixing the problems, not just identifying them.
1. Google PageSpeed Insights (PSI)
Google PageSpeed Insights is often the first stop. It evaluates your website based on the same performance signals Google’s algorithm uses. However, the output may feel a little technical or non-intuitive—especially if you’re new to SEO.
It provides scores for both mobile and desktop, alongside results for the Core Web Vitals. For a deeper understanding of these metrics, refer to the dedicated video on Core Web Vitals in this course.
2. GTmetrix
GTmetrix is another powerful tool and, in many ways, offers a more actionable experience than PSI.
- You can test from different connection speeds or devices, simulating real-world user conditions.
- It offers free and premium options, with the free version being quite robust.
- You can even schedule regular speed checks—essential for spotting sudden performance drops after site changes.
3. Other Tools Worth Exploring
- WebPageTest.org
- WattSpeed.com
Different tools interpret results in slightly different ways, so it’s helpful to rotate through a few options. This ensures you’re not missing issues that one tool might overlook.
4. Chrome’s Built-in Speed Tools
As covered in the Website Inspector section, Chrome DevTools also includes speed testing capabilities. These allow you to dive deep into performance insights and view how each page element affects load time.
5. Yellow Lab Tools
Found at yellowlab.tools, this is a more technical solution—especially effective for identifying JavaScript-related issues.
If your PSI or GTmetrix results mention bulky JavaScript files or long processing times, Yellow Lab can pinpoint the root causes. It’s an ideal tool to run and then hand off to a developer for detailed debugging.
6. Google Search Console
Google Search Console remains one of the most reliable sources for tracking Core Web Vitals and Page Experience metrics. While it doesn’t report on every page, it gives you a direct view of what Google sees.
Each of these tools brings unique strengths. By combining them, you’ll develop a strong, well-rounded understanding of your site’s speed performance. More importantly, you’ll be equipped to identify, prioritise, and address the issues that matter most—for both your users and your search rankings.
Interpret What a Crawler Is Telling You
When it comes to conducting a technical SEO audit, the best place to start is with a site crawl. Running a crawl provides a comprehensive overview of your website’s structure, content, and performance from a search engine’s perspective. In fact, it can immediately flag fundamental issues—sometimes even before you’ve written a single checklist item.
For instance, one client’s crawl failed entirely, revealing that their SSL certificate hadn’t been installed correctly. While this was an easy fix, such lucky breaks aren’t always the norm. Still, a crawl remains your most effective entry point for diagnosing deeper issues.
Choosing the Right Crawl Tool
If you’re just starting out, the free version of Screaming Frog will serve you well. Earlier in the course, we reviewed several crawler tools, but regardless of which one you choose, remember this:
Crawlers are robots—they can’t interpret context. You still need to apply your own judgment to the results.
A crawler might flag issues that aren’t actually problematic in your case—or that are simply not fixable for practical reasons. That’s perfectly acceptable. Not every warning calls for immediate action.
How to Read Crawl Results
Let’s walk through an example using Screaming Frog, crawling a test website—wikiHow.
- Start by filtering HTML pages
In Screaming Frog, make sure to select HTML from the dropdown. This filters the results to just your web pages, excluding images, scripts, and other non-HTML elements. This is where most of your actionable insights will be.
- Review key tabs
- Page Titles: Are they optimised, or are they too generic?
- Meta Descriptions: Are they missing or duplicated?
- H1 Tags: Are they present and relevant?
- Canonical Tags: Is there canonicalisation in place to avoid duplicate content issues?
- Structured Data: Is there any schema markup implemented?
You’ll find many of these checkpoints in the technical audit checklist provided with this course.
- Export your findings
Use the Reports > Issues Overview section to save the crawl data. Keep in mind that it exports as a raw table—helpful, but not always easy to interpret at a glance.
Using JetOctopus
While Screaming Frog is a fantastic tool, JetOctopus offers a more visual interface that can streamline your audit—especially for ongoing client work. It highlights key issues in its SEO Problems section without requiring you to dig through multiple tables.
Both tools provide largely the same data; the difference lies in how the information is presented. For quick checks, Screaming Frog is ideal. For recurring audits or visual overviews, JetOctopus adds value.
Don’t Chase Perfection
Even with machine learning built into some crawler tools, they don’t understand your business context. A technically “perfect” site doesn’t guarantee great user experience, effective design, or relevant content.
The ultimate goal isn’t to eliminate every error, but to ensure your website serves its users effectively. Visitors won’t notice if you’re missing an alt tag or a meta description—they just want clear, useful information.
So, let the crawl inform your strategy, not dictate it.
The Role of Robots.txt Files
A robots.txt file is a simple yet powerful tool in your technical SEO arsenal. This plain text file resides at the root of your website and acts as a set of instructions for search engine crawlers, guiding them on which parts of your site they can or cannot access.
To check whether your website has a robots.txt file, just visit:
yourdomain.com/robots.txt
Many content management systems (CMSs), such as WordPress, will automatically generate one. However, auto-generated doesn’t always mean well-configured. It’s essential to review and customise your robots.txt file to suit your website’s specific needs.
What Does a Robots.txt File Do?
Robots.txt files are primarily used to prevent crawlers from accessing areas of your site that don’t add SEO value — for instance, WordPress’s /wp-admin/ directory. This helps search engines focus their crawling efforts on your most valuable content.
The basic structure is straightforward:
User-agent: *
Disallow: /folder/
- User-agent specifies the crawler (e.g., Googlebot, Bingbot).
- Disallow tells the crawler not to access the specified folder or file.
You can add multiple user-agent rules if you want different instructions for different crawlers. For example:
User-agent: Googlebot
Disallow:
User-agent: Bingbot
Disallow: /go-away-bing/
In this case, Google can access everything, but Bing is blocked from the “go-away-bing” directory.
Important note: Always use relative paths (e.g., /folder/) rather than full URLs in your rules.
Other Useful Directives
- Allow: Used in combination with Disallow to make exceptions (mostly respected by Googlebot and Bingbot).
- Crawl-delay: Specifies how long a crawler should wait before requesting another page. Google ignores this (it prefers Search Console settings), but Bing and other engines may respect it.
- Wildcards and Symbols:
- Asterisk * acts as a wildcard (e.g., /folder*/).
- Dollar sign $ indicates the end of a URL (e.g., /*.pdf$ blocks all PDF files).
- Comments: Use # at the beginning of a line for notes.
Here’s an example using these:
User-agent: *
Disallow: /*.pdf$
Crawl-delay: 10
# Blocking all PDF files and slowing down the crawl rate
Real-World Example
Large websites like BuzzFeed have very detailed robots.txt files. Reviewing theirs can offer insights into managing complex crawl strategies. BuzzFeed’s file includes wildcards, crawl delays, and URL-specific blocking — all tailored for performance and control.
Best Practices
Always include a link to your sitemap at the top or bottom of the file:
Sitemap: https://www.yourdomain.com/sitemap.xml
- Use lowercase: The file must be named exactly robots.txt. Any capitalisation will result in it being ignored.
- Test before going live: Use Google’s robots.txt tester to ensure your rules don’t accidentally block important pages.
- Know its limits: Robots.txt is a suggestion, not a rule. Some crawlers (especially content scrapers) may ignore your file altogether.
While small in size, robots.txt files can have a massive impact on your website’s visibility. Misconfigurations can harm your site’s SEO — even prevent it from being indexed at all. Take the time to understand, customise, and test your file. It’s one of the most crucial steps in a comprehensive technical SEO audit.
When Website Log Files Can Be Useful
Before the rise of tools like Google Analytics, log file analysis was the primary method for understanding how users interacted with websites. While modern analytics platforms now offer rich insights, log files still serve a crucial purpose—particularly when it comes to understanding how search engine bots interact with your site.
What Is a Log File?
A log file is a record of every request made to your website’s server, including those made by search engine crawlers like Googlebot. By analysing these files, you can uncover details such as:
- When a bot last visited key pages
- Whether crawlers are getting stuck or encountering issues
- If important content is being ignored altogether
This information is invaluable for troubleshooting indexing or crawling problems that might not show up in conventional SEO tools.
Why Analysing Log Files Matters
Log file analysis helps you isolate specific problems that may be affecting your site’s visibility. For instance, if Google is repeatedly crawling the same page—like a feed URL—every hour, but ignoring other crucial content, that’s a red flag. It may indicate inefficient crawling, which could be hurting your site’s performance in search results.
You can also identify “spider traps”—places where bots get stuck, such as search filters or dynamic URL structures. Left unchecked, these traps can waste valuable crawl budget and prevent more important pages from being discovered.
Accessing and Analysing Log Files
You can usually obtain server logs from your web hosting provider, though it might require some persistence—especially if you’re working with a larger organisation. These logs are typically in the Common Log Format, which is difficult to read manually at any real scale.
To interpret them effectively, you’ll need a tool. A great option is the Screaming Frog Log File Analyser, which offers both free and paid versions. The free version processes up to 1,000 log lines at a time.
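To make this concrete, here is what a single entry might look like in the widely used combined log format (the values below are invented for illustration; the fields are client IP, timestamp, request, status code, response size, referrer, and user agent):
66.249.66.1 - - [10/Jan/2025:09:15:32 +0000] "GET /services/dental-care HTTP/1.1" 200 15320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
The user agent string at the end is how you identify which bot made a request, though keep in mind that scrapers sometimes impersonate Googlebot.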
Using such a tool, you can:
- Track which bots visited which URLs
- Review the frequency and timing of visits
- Detect anomalies or repetitive access patterns
When to Use Log Files
Log files are particularly helpful when:
- Your site isn’t ranking as it should, but all your technical SEO audits return clean results.
- You suspect search engines are missing or ignoring important content.
- You need to confirm whether bots are encountering crawl issues or traps.
In essence, log files can reveal what your regular SEO tools might miss. They are an essential resource when you need to dig deeper into site performance, especially from a bot’s perspective.
What Are Google’s Core Web Vitals?
Speed matters!
A slow-loading website can silently drive users away before you even know they visited. Research suggests that every additional second of load time can cause a 20% drop in user retention—and because many analytics tools load after the page, you might never realise those visitors were there in the first place.
Why Core Web Vitals Matter
Site speed isn’t just important for user experience—it’s also a key ranking factor. In 2020, Google introduced a set of speed-related performance metrics known as Core Web Vitals, which became part of their ranking algorithm from 2021 onwards. These metrics are now central to what Google calls the page experience score, a measure of how user-friendly your website is.
The Three Core Web Vitals
- Largest Contentful Paint (LCP)
- Measures how long it takes for the largest visible content element (such as an image or heading) to load.
- Reflects the loading performance of your page.
- Interaction to Next Paint (INP)
- Evaluates how long it takes for the page to respond after a user interacts (e.g., clicking a button).
- This metric replaced First Input Delay (FID) in early 2024, so if you’re reading older material, you might still come across FID.
- Cumulative Layout Shift (CLS)
- Assesses how much the page layout unexpectedly shifts during loading.
- Helps measure visual stability.
Google has clearly defined thresholds for each of these metrics, which we’ll explore in upcoming lessons of this course.
Measuring Core Web Vitals
To understand how your site performs, you can view your Core Web Vitals report in Google Search Console. However, if your website has limited traffic, data may be sparse or unavailable. More importantly, these reports won’t explain why a metric is poor—only that it needs attention.
To dig deeper, you’ll need to run your site through speed testing tools such as Google PageSpeed Insights, GTmetrix, and the others covered earlier in this guide.
Each tool offers different perspectives and testing methods. They not only help you evaluate your own site but also let you benchmark competitors or other sites you admire.
One important note: Google evaluates your Core Web Vitals based on the 75th percentile of all page loads, filtering out extreme outliers for a more balanced view of performance.
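If you want to watch these metrics in your own browser, Google publishes an open-source JavaScript library called web-vitals. Here is a minimal sketch that logs each metric to the console; loading the library from a CDN as an ES module is one common approach, so check the library’s documentation for the setup that suits your site:
<script type="module">
  // Load Google's web-vitals library as an ES module (version pinned as an example)
  import { onCLS, onINP, onLCP } from 'https://unpkg.com/web-vitals@4?module';
  // Each callback receives the metric's name, value, and rating
  onCLS(console.log);
  onINP(console.log);
  onLCP(console.log);
</script>
This reports the same three metrics described above, measured for your own visit to the page.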
Evaluating Largest Contentful Paint
Largest Contentful Paint (LCP) is one of the most common Core Web Vitals that websites struggle to optimise. Achieving a good LCP score often involves trade-offs and thoughtful prioritisation of page elements.
What Is LCP?
LCP measures the time it takes for the largest visible content element in the viewport to fully load. The viewport refers to everything visible on the screen upon loading a page—not content that requires scrolling.
Google uses the following thresholds to assess LCP performance:
- ✅ Good: under 2.5 seconds
- ⚠️ Needs Improvement: between 2.5–4 seconds
- ❌ Poor: over 4 seconds
Common LCP elements include hero images, banner graphics, or prominent text blocks.
Analysing LCP Performance
Let’s consider an example using Chrome’s Lighthouse report. While the report highlights the LCP element—often the first visible text or image—this element may not be the actual issue. The real bottlenecks are often the processes that occur before the LCP element appears.
For instance, a client once used a third-party personalisation service that delayed page rendering by about two seconds. Until that external script finished loading, nothing else appeared—images, text, or layout—resulting in a poor LCP. After removing the dependency, LCP improved significantly.
Using GTmetrix for Deeper Insights
The Waterfall tab in GTmetrix is especially helpful. Look for the dark blue line marking the LCP event. Everything loading before that line could be causing delays. Evaluate what can be deferred or optimised—but be cautious: deferring essential elements can negatively impact CLS (Cumulative Layout Shift), another Core Web Vital we’ll cover later.
If you notice a long server response time (e.g. over 500 milliseconds), poor hosting may be the culprit. In contrast, sites with excellent hosting (like Wisdom Pets in the example) might show waiting times as low as 30 milliseconds.
Common Causes of Poor LCP and Fixes
- Heavy JavaScript files and third-party scripts (like chat widgets) that load too early.
- Images not lazy-loaded — loading all at once instead of deferring below-the-fold content.
- Lack of basic page speed hygiene, such as:
- Minifying CSS and JavaScript
- Eliminating unused code
- Avoiding bloated libraries
These optimisations can have a significant impact, particularly when implemented with the help of a skilled developer.
Don’t just focus on the largest element on the screen. Instead, ensure it appears quickly by optimising what loads before it. LCP is not just about the size of an element—it’s about how efficiently your site prioritises rendering it.
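One practical pattern, sketched below with hypothetical file paths: tell the browser about your LCP element as early as possible and defer what sits below the fold. The preload hint, the fetchpriority attribute, and loading="lazy" are all standard HTML:
<!-- In the <head>: fetch the hero image early and at high priority -->
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">
<!-- The LCP element itself: sized explicitly and never lazy-loaded -->
<img src="/images/hero.webp" width="1200" height="600" fetchpriority="high" alt="Clinic reception area">
<!-- Below-the-fold images can wait until the user scrolls near them -->
<img src="/images/gallery-1.webp" width="600" height="400" loading="lazy" alt="Gallery photo">
Never lazy-load the LCP element itself; doing so delays the very paint you are trying to speed up.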
Evaluating Interaction to Next Paint
Interaction to Next Paint (INP) was introduced in early 2024, replacing the now-retired First Input Delay (FID) as one of Google’s Core Web Vitals. Unlike FID, which measured only the initial delay, INP provides a more accurate and holistic view of how responsive your website feels during user interactions.
What Is INP?
INP tracks how quickly your web page gives visual feedback in response to user interactions, such as:
- Clicking an “Add to Cart” button
- Submitting a form
- Pressing a key on the keyboard
Scrolling is not counted unless done via keyboard input.
Google categorises INP performance as follows:
- ✅ Good: under 200 milliseconds
- ⚠️ Needs Improvement: between 200–500 milliseconds
- ❌ Poor: above 500 milliseconds
Why INP Matters
When users interact with a website, they expect immediate feedback. A delayed visual response can make a site feel unresponsive—even if the action is technically being processed in the background. INP directly measures that perceived delay.
How to Evaluate INP
Unlike other Core Web Vitals, INP is difficult to measure using traditional speed testing tools. Instead, you can use Total Blocking Time (TBT) as a proxy metric.
TBT measures how long the browser is blocked from handling user input while it’s still processing page load tasks. In most cases, a high TBT correlates with poor INP.
You can check TBT in tools like:
- GTmetrix (under the Performance tab)
- Google PageSpeed Insights
A TBT of zero or very close to zero usually indicates good INP performance—but to be confident, you should verify INP scores in Google Search Console.
Keep in mind: Google uses the lowest-performing user interaction on a page to determine your INP score. That means even if most of your interactions are fast, one slow interaction could drag the score into the poor range.
Fixing INP Issues
Addressing INP problems usually requires developer-level interventions, such as:
- Optimising JavaScript to reduce processing delays
- Avoiding heavy animations or inefficient event handlers
- Ensuring that the UI responds instantly—even if the backend takes longer to process the request
Google has shared case studies (e.g. by The Financial Times) showing how large sites have improved their INP scores. These can be valuable resources to share with your development team.
INP reflects the user’s experience of responsiveness. While tools like TBT provide useful estimates, only real-user data from Google Search Console can give you a complete picture. Prioritise responsiveness in your design and development practices to ensure users experience smooth, immediate feedback when interacting with your site.
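To illustrate the principle of instant feedback, here is a minimal sketch (the expensive cart function is hypothetical): update the visible UI first, then yield so the browser can paint before the heavy work runs.
<button id="add-to-cart">Add to cart</button>
<script>
  const button = document.getElementById('add-to-cart');
  button.addEventListener('click', () => {
    // 1. Instant visual feedback: this is the "next paint" that INP measures
    button.textContent = 'Adding…';
    // 2. Yield to the browser, then run the heavy work afterwards
    setTimeout(() => {
      updateCart(); // hypothetical expensive function
      button.textContent = 'Added';
    }, 0);
  });
</script>
setTimeout(..., 0) is the most widely supported way to yield; newer scheduling APIs such as scheduler.yield() offer finer control in browsers that support them.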
Evaluating Cumulative Layout Shift
Cumulative Layout Shift (CLS) is one of the more visible and frustrating performance issues users may encounter, yet it’s also one of the easiest Core Web Vitals to observe—even without a tool. CLS measures how much content on your webpage shifts unexpectedly during loading, affecting user experience and usability.
What Is CLS?
CLS tracks unexpected movement of elements as the page loads. For example, if you’re about to click a link and the entire layout suddenly shifts, causing you to click the wrong item—that’s CLS in action.
This usually happens when elements like:
- Images
- Text blocks
- Banners or ads
load in after other content, forcing layout adjustments.
Unlike other metrics, CLS is not measured in time. Instead, it’s a unitless score calculated based on how far and how often elements move.
Google classifies CLS scores as:
- ✅ Good: less than 0.1
- ⚠️ Needs Improvement: between 0.1 and 0.25
- ❌ Poor: above 0.25
Common Causes of CLS
Many issues that lead to high CLS are simple oversights:
- Banners loading late – e.g. announcement bars that shift content downward.
- Images without defined dimensions – if image sizes aren’t declared in HTML or CSS, they can disrupt layout when they finally render.
- Ads loading unpredictably – especially on sites that rely heavily on third-party ad scripts.
A helpful analogy: imagine packing a moving van carefully, only to be told last-minute that you forgot the couch. You now have to repack everything to make space. That’s exactly what poor CLS feels like for users—disrupted and inconvenient.
How to Identify and Fix CLS
In GTmetrix, you can check the CLS score under the Performance tab. If there are any layout shifts, GTmetrix will highlight them under Top Issues, showing which elements are responsible.
Easy fixes often include:
- Setting explicit width and height for images, videos, and ad containers.
- Avoiding the late injection of dynamic content unless space is reserved.
- Minimising layout shifts by reserving sufficient space for third-party widgets or ads.
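As a brief sketch of the first two fixes (the paths and sizes are illustrative): declare dimensions so the browser can reserve space before assets arrive, and give late-loading slots a fixed footprint.
<!-- Width and height let the browser reserve the space before the image loads -->
<img src="/images/announcement-banner.jpg" width="800" height="200" alt="Spring promotion banner">
<!-- A fixed-height container stops a third-party ad from pushing content down -->
<div style="min-height: 250px;">
  <!-- the ad script injects its content here -->
</div>
With the space reserved up front, surrounding content stays put when the image or ad finally renders.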
Why CLS Matters
Beyond improving your Core Web Vitals score, reducing layout shifts significantly improves user satisfaction. A stable layout builds trust and helps visitors feel more in control of their interactions.
CLS reflects how stable your site feels during load. By reserving space for dynamic content and declaring element dimensions, you create a smoother, more predictable experience. Reducing CLS not only enhances usability but also strengthens your site’s credibility and professionalism.
Prioritising Website Speed Problems
When evaluating your site’s performance, you’ll likely uncover a wide range of speed issues. Knowing where to begin can be overwhelming. The key is not just identifying problems—but implementing solutions. After all, the best fix is the one that actually gets done.
Start with a Brainstorm
Begin by documenting every speed-related issue you find. Don’t prioritise at this stage. Think of it as a brainstorming session—list everything without judgement. Once you’ve noted down all the potential problems, you can begin to organise and prioritise them.
Use a Three-Question Framework
To sort your speed issues, evaluate each one based on three simple questions:
1. Does this fix improve Core Web Vitals?
Focus on whether the issue negatively affects metrics like LCP (Largest Contentful Paint), INP (Interaction to Next Paint), or CLS (Cumulative Layout Shift). For instance, a slow-loading Pinterest embed might increase load time but may not impact your Core Web Vitals score. In that case, it’s not urgent.
2. Can this be implemented immediately?
Do you have developers or tools readily available to tackle the fix? If your developer is unavailable or a tool requires approval, it makes sense to delay such fixes. Prioritise items that can be acted upon quickly using available resources.
3. Is this a quick win or a time-consuming task?
Some fixes are fast and easy (e.g. compressing an image), while others, like improving site structure or adding alt text to hundreds of images, may be tedious. Break larger tasks into smaller steps, starting with your most important pages.
Scoring and Prioritisation
Give each issue a score from 1 to 3:
- 3 points: Helps with Core Web Vitals, is quick, and can be implemented immediately.
- 2 points: Meets two of the three criteria.
- 1 point: Meets only one (or none), and can be deprioritised—unless it causes major performance breakdowns.
This system helps you prioritise smartly, focusing on impact and feasibility.
Making the Case for Speed Fixes
If you’re struggling to get buy-in for improvements, consider this:
“If your site is so slow that users hit the back button before it loads, your analytics won’t even capture that visit. You could be losing traffic without ever knowing it.”
That insight often resonates and can help secure the attention and resources needed to fix performance issues.
Best Practices for Website Architecture
Website architecture isn’t just about aesthetics—it’s a strategic foundation that affects both user experience and search engine visibility. While real-world architecture can be subjective, website architecture is far more objective. There’s a clear distinction between a structure that works and one that doesn’t.
Why Architecture Matters
An effective site architecture achieves two key objectives:
- It helps visitors find what they need quickly.
- It helps search engines understand your site’s structure and purpose.
The way you organise folders and pages directly informs search engines about your site’s content and hierarchy. The better this structure, the easier it is for both users and search engines to navigate your site.
The Ideal Website Structure
Visualise your site as a pyramid:
- At the top is your homepage.
- Beneath it, in a structured hierarchy, sit your subfolders and pages.
But this doesn’t mean your homepage is always the most important page. Ideally, users land directly on the page most relevant to their search—often bypassing the homepage entirely. However, your homepage should still link to your most valuable pages to signal their importance to search engines.
Building Structure Through Navigation
Your menu system defines much of your site’s perceived structure. While it’s important to feature key sections in the top navigation, not everything needs to go there. For instance, a Careers page can be placed in the footer—search engines will still index it appropriately.
Check Your Structure with a Site Crawl
Use tools like Screaming Frog to crawl your website and view the structure in a tree diagram. This can quickly reveal structural problems:
- Are all your blog posts scattered at the same level as service pages?
- Are they organised under a logical /blog/ subfolder?
For example, having blog posts directly under the root folder can flatten your site structure, which is not ideal. Moving them to a designated blog folder (/blog/post-name) makes your site easier to understand and navigate.
⚠️ However, only change URLs if there’s a clear benefit and no risk to existing backlinks or traffic. Redirects should be handled carefully—this is explored further in our redirection strategy guide.
URLs and Headings Matter Too
Good site architecture also extends to URL naming and heading structure:
- Ensure folder and page names clearly describe the content.
Example: Instead of a vague URL like /services/special1, use /services/vet-care-exotic-pets.
- Structure your heading tags (<h1>, <h2>, etc.) to reflect your page hierarchy. Think of headings as mini blueprints that reinforce the relevance and flow of your content.
Breadcrumbs and Schema Markup
A clear structure allows you to implement breadcrumbs, which not only improve navigation but also enhance search results when paired with schema markup. This improves both usability and SEO.
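As an illustration (the URLs are hypothetical), breadcrumb schema is usually expressed as a BreadcrumbList in JSON-LD:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Services", "item": "https://example.com/services/" },
    { "@type": "ListItem", "position": 3, "name": "Vet Care for Exotic Pets" }
  ]
}
</script>
Notice how the markup mirrors the folder hierarchy discussed above; a clean structure makes this markup almost write itself.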
Fix Early
Website architecture is much easier to address before your site is launched. If you’re building a new site or redesigning an existing one, this is the perfect time to optimise structure. Cleaning it up after launch can be far more complex.
The Importance of an Internal Linking Strategy
Internal linking refers to the practice of linking to other pages within your own website. This is different from external links, which point to content on other websites. While internal linking might seem intuitive, it’s often overlooked—even on well-established sites.
Why Internal Linking Matters
Without a thoughtful internal linking strategy, you’re essentially relying on your site’s main navigation to do all the work. And while your navigation structure is important, it treats all pages equally—linking to each one in the same way. This uniformity gives search engines no additional signal about which pages are the most important.
Let’s take an example. Using a tool like Screaming Frog, you may find that many pages on your site have the exact same number of internal links, simply because they’re all linked from the navigation bar. This flat structure makes it harder for search engines to determine which pages to prioritise in rankings.
A High-Impact, Low-Effort Fix
One of the greatest benefits of internal linking is that it’s relatively easy to improve—often without needing a developer’s help. And the payoff can be significant.
Start by identifying your most important pages. For a business like Wisdom Pets, these would be core service pages, such as dental care or surgery.
Next, review your existing content—especially blog posts—and look for natural opportunities to link back to these key pages.
How to Link the Right Way
Avoid outdated phrases like “click here”. Not only are they vague for readers, but they also provide no context for search engines. More importantly, they waste a chance to reinforce the relevance of your target page through descriptive anchor text.
For instance, instead of writing:
For more information about how we can help with your pet’s dental health, [click here].
Try something like:
Our dental care services help maintain your pet’s tooth health and prevent long-term issues.
The phrase dental care services naturally aligns with both the topic and the keywords you want to rank for.
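In HTML terms, that descriptive anchor is a single tag (the URL here is hypothetical):
<a href="/services/dental-care">dental care services</a>
The anchor text tells both the reader and the search engine what the destination page is about before anyone clicks.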
Best Practices for Internal Linking
Here are some actionable recommendations:
✔️ Use descriptive, keyword-relevant anchor text
✔️ Incorporate internal linking during the content creation process, not just afterward
✔️ Include internal linking in your editorial checklist
✔️ Maintain a shared list of your most important pages for content writers to reference
✔️ Avoid excessive linking—search engines may ignore links after the first 100–150, and readers may find the content hard to follow
Encourage your writers to ask themselves:
“Which existing pages can support or add credibility to this content?”
This mindset turns internal linking into a value-add rather than an afterthought.
Once you’ve crawled your website, review the internal linking paths. If your most important pages aren’t receiving a good number of internal links, you have a clear opportunity for improvement. Small changes can lead to meaningful SEO gains.
When to Use a Redirect Page
When reviewing your website’s structure or planning a redesign, you may find that some pages need to be changed or removed. In such cases, the question arises—should you redirect or not? This guide walks you through when and how to use redirects effectively.
Understanding Redirect Types
There are two primary types of redirects:
- 301 (Permanent Redirect): Used when a URL has been permanently changed. This is the most common redirect.
- 302 (Temporary Redirect): Suitable for short-term changes—such as seasonal promotions—where the original content may return.
In most cases, you’ll be implementing a 301 redirect, especially when URLs are being updated or removed permanently.
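How you implement a redirect depends on your server or CMS. As one common sketch, assuming an Apache server and illustrative paths (Nginx, other servers, and CMS redirect plugins have their own equivalents):
# Permanent move: the old URL has been replaced for good
Redirect 301 /old-about-us /about-us
# Temporary move: the original URL will return, e.g. a seasonal promotion
Redirect 302 /offers /offers/winter-sale
Whichever mechanism you use, the status code is what tells search engines whether to transfer the old URL’s signals to the new one (301) or keep the original indexed (302).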
Ask the Right Questions
Before setting up a redirect, start by asking:
- Does this content still exist somewhere else?
If yes, and the content remains largely the same (e.g., a URL clean-up or domain name change), a 301 redirect is appropriate. - Is the content being permanently removed without a replacement?
If so, do not redirect. For example, if a company discontinues a service (like plumbing), redirecting users to unrelated content (such as heating or cooling) can be misleading. It’s better to signal to search engines that the topic is no longer part of your site.
Consolidating Similar Content
Another scenario involves merging multiple low-performing pages on the same topic into a single, stronger page. Here’s how to handle it:
- Identify the best-performing URL—based on traffic or backlinks.
- Redirect all other related pages to this chosen URL, assuming the new page comprehensively covers the same information.
- Avoid redirecting pages that are too different in topic, even if they fall under a similar theme.
For instance, if you’re combining ten blog posts on how to make scrambled eggs, a single consolidated post makes perfect sense. However, if those posts range from scrambled eggs to quiche and omelettes, only redirect the ones truly aligned with the new page’s content.
Watch Out for Redirect Chains
Redirect chains occur when a redirect points to another redirect and so on. These can:
- Slow down the user experience
- Cause search engines to abandon crawling before reaching the final page
To avoid this, test all redirects using a site crawler. Tools like Screaming Frog can help ensure all redirects point directly to the final destination without unnecessary steps in between.
By asking the right questions, understanding redirect types, and maintaining clarity for both users and search engines, you’ll make smarter decisions about page redirects. Done correctly, redirects can preserve SEO value, enhance user experience, and support a cleaner, more effective website structure.
Understanding Schema in SEO
In the early days of the internet, search engines had a tough job. With webpages made up almost entirely of raw text, they had to work hard to interpret the content. As a result, search results were quite basic.
Over time, it became clear that search engines needed a more sophisticated way to understand the type of information they were indexing. That’s where schema markup—also known as structured data—comes in.
What Is Schema?
Launched in 2011 by Bing, Google, Yahoo, and Yandex, Schema.org was introduced as a shared vocabulary that websites can use to define their content in a more structured way. Using schema markup, you’re not just presenting information—you’re telling search engines what that information is.
Whether you refer to it as schema, schema markup, or structured data, the goal remains the same: to clarify your content so that search engines can deliver more accurate and helpful results.
How Schema Works
Schema is most commonly implemented using JSON-LD (JavaScript Object Notation for Linked Data). For instance, if you’re hosting an event, your schema could include the event name, date, location, and organiser—all formatted in a way that search engines can easily interpret.
⚠️ Important: Schema content must also be visible on the page itself. Including hidden or misleading schema markup violates Google’s guidelines and can negatively affect your site’s performance in search results.
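Here is a brief sketch of JSON-LD for the event example above (all values are illustrative):
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Technical SEO Workshop",
  "startDate": "2025-09-01T18:00:00",
  "location": {
    "@type": "Place",
    "name": "Example Conference Centre",
    "address": "123 High Street, London"
  },
  "organizer": {
    "@type": "Organization",
    "name": "Courses Buddy"
  }
}
</script>
Because it sits in its own script tag, JSON-LD does not change what visitors see; remember, though, that the details it describes must still appear in the visible page content.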
Where to Start with Schema
Schema.org offers an extensive list of markup types, covering everything from recipes to mathematical equations. But trying to implement every possible type would be overwhelming and unnecessary.
Instead, begin with the schemas most widely supported by search engines, particularly Google. Google maintains a list of supported schema types that’s frequently updated—start there.
Here are a few common schema types that may be relevant for your site:
- Local Business
- Video
- FAQ
- Article
- Logo
- Sitelink Search Box
- Breadcrumb
- Review Snippets
Check Required and Recommended Fields
Each schema type has required and recommended properties. For example, the Local Business schema requires at least the business name and address. You might need to gather additional data before you implement schema markup fully.
Refer to Schema.org if you’d like to explore all fields available for a given type, especially if you’re aiming for a comprehensive implementation.
How to Implement Schema
You have several options depending on how your site is built:
- Yoast SEO (WordPress) – Offers built-in schema functionality.
- Schema & Structured Data for WP & AMP – A robust plugin with custom schema support.
- Schema App – A third-party solution compatible with many platforms.
- Google Tag Manager – Ideal when you can’t modify your website’s code directly.
Always Test Before Going Live
Before deployment, test your schema using Google’s Rich Results Test. You can paste your code directly into the tool without publishing it to a live site.
Once implemented, check Google Search Console. You’ll find insights under the Enhancements and Search Appearance sections, including:
- Schema types detected
- Errors or warnings
- Whether rich results are appearing in search
Not all schema leads to visible changes in search results, but it improves how search engines interpret your content, which is a core goal of technical SEO.
Implementing schema effectively helps bridge the gap between your content and search engines. It doesn’t just support your visibility in search—it enhances the context and clarity of your website in a structured, reliable way.
Why Canonicalisation Is Essential for SEO
Canonicalisation—a word that’s tricky to say and spell—plays a vital role in maintaining a healthy, search engine-friendly website. At its core, canonicalisation is about avoiding duplicate content issues and helping search engines understand which version of a page is the primary, authoritative one.
What Is a Canonical Tag?
A canonical tag signals to search engines that a specific page is the original or preferred version of similar or duplicate pages. This allows crawlers to focus their efforts on indexing the right page—reducing wasted crawl time and ensuring the correct content appears in search results.
For example, if your site has several pages with very similar content (e.g., filtered product listings), canonical tags help search engines determine which version should be prioritised.
Why It Matters
Canonical tags became widely adopted around 2009 when the SEO community began grappling with increasingly complex websites. Dynamic URLs, filter controls, and similar technical structures were creating hundreds—or even thousands—of URL variations showing almost identical content.
Without canonical tags, search engines might:
- Crawl all versions of a page unnecessarily
- Split ranking power between duplicates
- Fail to rank any version clearly due to confusion over which is most relevant
Real-World Example
Let’s say you run an e-commerce site selling board games. Your main category page might have this URL:
example.com/board-games
Now, a customer filters to see two-player games:
example.com/board-games?players=2
Then narrows further by selecting games under 30 minutes:
example.com/board-games?players=2&duration=30
While these URLs display filtered versions of the same content, search engines might treat them as separate pages. A canonical tag on all of them pointing to example.com/board-games ensures that only the main version is indexed.
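The tag itself is a single line in the <head> of each filtered variation:
<link rel="canonical" href="https://example.com/board-games" />
Every filtered URL carries this same tag, and the main category page carries a self-referencing copy of it.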
Other Use Cases
Canonical tags aren’t just for e-commerce. Here are additional scenarios where they make a difference:
- Language versions of the same site
- Show home listings vs. homes available for immediate purchase
- PDFs that unintentionally outrank your website pages
In one case, a home builder had pages for both model homes and prebuilt homes (which were almost identical). The team chose to make the show home pages canonical, since they were more permanent. All prebuilt home pages now point to the corresponding show home page via canonical tags.
Best Practices for Canonical Tags
- ✅ Every page should include a canonical tag – even if it’s self-referencing
- ❌ Do not include more than one canonical per page – this confuses crawlers
- 📌 Always use absolute URLs – be explicit (e.g., https://example.com/page)
- 🔎 Check for canonical issues using crawlers or Google Search Console
If you’re using a popular CMS like WordPress, many SEO plugins (e.g., Yoast SEO) handle canonical tags automatically. For custom CMS platforms, implementation may require development work.
Where to Check
- Crawlers – Most professional crawlers highlight duplicate content and canonical gaps
- Google Search Console – Review the Coverage and Performance reports for anomalies in URLs and indexing
By implementing canonical tags thoughtfully, you’re helping search engines do their job more effectively—while protecting your content from being diluted across multiple URLs.
Optimising International SEO with Hreflang Tags
Offering your website in multiple languages is far more sophisticated than simply adding a Google Translate button. To ensure that search engines understand your multilingual and multi-regional content, you must implement hreflang tags correctly. These tags tell search engines which language and regional version of a page is intended for which audience.
What Are Hreflang Tags?
Hreflang tags are HTML attributes that signal to search engines the language and, optionally, the region for which a specific page is intended. This helps ensure that users see the correct version of a page based on their language preferences and location.
Examples:
- For English-speaking audiences in the US and Canada:
  - hreflang="en-US"
  - hreflang="en-CA"
- For generic English and Spanish content:
  - hreflang="en"
  - hreflang="es"
These tags also help avoid duplicate content issues. For instance, if your American and Canadian pages differ only slightly—such as in spelling or pricing—hreflang tags help clarify that they are intended for different audiences, not duplicates.
Why Hreflang Matters
Without proper hreflang implementation, users may end up on a page in the wrong language or region. Worse still, search engines may misinterpret similar pages as duplicates and fail to rank them properly.
Additionally, hreflang tags improve user experience by delivering content in the user’s preferred language and format—boosting both engagement and SEO.
Ways to Implement Hreflang
There are three primary methods to implement hreflang tags, and each has its own use cases.
1. In the Page <head>
This is the most common method. You insert hreflang annotations in the <head> section of each HTML page. For example:
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
Use this approach when you have direct access to the page code or are using a CMS that supports head tag customisation.
2. Via HTTP Headers
This method is useful for non-HTML documents such as PDFs. The hreflang information is included in the HTTP response header. However, implementing this typically requires server-side support, so it’s less accessible to most marketers and content managers.
3. In the XML Sitemap
You can also define hreflang attributes within your XML sitemap. This is particularly useful when code-level changes are difficult. However, this method can quickly bloat your sitemap if you manage large numbers of URL and language combinations.
Example structure:
<url>
  <loc>https://example.com/en/</loc>
  <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/" />
  <xhtml:link rel="alternate" hreflang="es" href="https://example.com/es/" />
</url>
Best Practices for Implementing Hreflang
✔ Use Canonical Tags Correctly
Each language version must include a rel="canonical" tag pointing to itself—not to the original or another translation.
✔ Always Link Back
Each translated page must reference all others in the set. For example, if page A links to page B, then page B must also link back to page A.
✔ Avoid Partial Translations
If a page isn’t translated into other languages, simply don’t include hreflang annotations for those versions. Use x-default for unassigned users.
✔ Use Tools
If you’re unsure, tools like Aleyda Solis’s hreflang generator can automate the process by generating either head tags or sitemap entries based on your input.
Common Pitfalls to Avoid
- ❌ Inconsistent hreflang sets between pages
- ❌ Missing self-referencing canonical tags
- ❌ Incorrect region/language codes (e.g., using en-UK instead of en-GB)
- ❌ Linking to pages that don’t link back
- ❌ Rushing implementation without testing
Implementing hreflang tags properly ensures that your global audiences are served the most relevant content, while also avoiding SEO penalties due to perceived duplication. While they can be easy to get wrong, a careful and well-tested approach can deliver significant SEO and UX benefits.
Optimising Images for SEO
Image optimisation is essential for improving both SEO performance and user experience, especially as image search grows in importance.
🔍 Key Image SEO Practices:
- Descriptive Filenames:
Use clear, relevant names like golden-retriever-puppy.jpg.
- Alt Text (Not Keyword-Stuffing):
Describe the image naturally to help both SEO and screen readers.
- Diversity Matters:
When relevant, include racial identity in image descriptions to promote inclusivity.
⚡ Speed and Technical Optimisation:
- Responsive Images:
Use srcset for different screen sizes to improve loading speed (see the sketch after this list).
- Compress Images:
Reduce file size using tools or plugins (e.g., TinyPNG, ShortPixel).
- Correct Sizing:
Upload images at display size—not straight from a high-res camera.
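Putting these practices together, a responsive, properly sized image might look like this (the file names and sizes are illustrative):
<img src="/images/golden-retriever-puppy-800.jpg"
     srcset="/images/golden-retriever-puppy-400.jpg 400w,
             /images/golden-retriever-puppy-800.jpg 800w,
             /images/golden-retriever-puppy-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     width="800" height="533"
     alt="Golden retriever puppy sitting on grass">
The browser picks the smallest candidate that still looks sharp on the current screen, and the declared width and height also help with layout stability, as covered in the Cumulative Layout Shift section.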
🌍 Advanced Tips:
- CDN Images:
If using a CDN, ensure Google knows the images belong to your site.
- Image Sitemaps:
Create one if your images appear in SERPs for key search terms.
Ongoing Education:
Give editors a checklist to maintain consistency.
Best Practices for On-Site Videos
Videos are an engaging and powerful content format, and many website visitors prefer watching a video over reading. While search engines like Google have improved in understanding video content through visual and audio cues, videos still remain among the least accessible elements for search engine crawlers. That’s why technical SEO for videos is so crucial.
Hosting Your Videos
One of the first questions that often comes up is: Where should you host your video?
Options include:
- YouTube and TikTok, where people actively search for video content.
- Vimeo or Wistia, which offer greater control over video presentation and often come with built-in SEO tools.
If possible, upload your video to multiple platforms to maximise visibility. For your own site, using a non-YouTube host can help maintain better control over branding and functionality.
Transcripts and Captions Matter
Search engines rely heavily on transcripts to understand what a video is about. Captions not only enhance accessibility for those who are deaf or hard of hearing but also serve viewers who watch without sound.
Worried about long transcripts cluttering the page? Use collapsible elements—as long as the transcript is in the HTML when the page loads, search engines will read it.
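For example, the native HTML details element keeps a long transcript collapsed by default while leaving it fully present in the source:
<details>
  <summary>Read the full video transcript</summary>
  <p>Transcript text goes here…</p>
</details>
No JavaScript is required, and the transcript remains in the HTML for crawlers and screen readers alike.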
Add Schema Markup and Video Sitemaps
To help your videos appear in search results:
- Use video schema markup that aligns with Google’s guidelines.
- Build a video sitemap to provide detailed information about your video content and hosting locations.
While creating sitemaps manually is tedious, many tools, plugins, and video hosting platforms offer automated solutions—well worth considering.
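As a brief sketch (all values are illustrative), video schema uses the VideoObject type:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to brush your pet's teeth",
  "description": "A short demonstration of safe at-home dental care for pets.",
  "thumbnailUrl": "https://example.com/thumbnails/dental-care.jpg",
  "uploadDate": "2025-03-01",
  "contentUrl": "https://example.com/videos/dental-care.mp4"
}
</script>
Name, description, thumbnail, and upload date are the core fields; check Google’s current video documentation for what is required versus recommended.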
Maintain Site Speed
Video embeds can affect page speed. Generally, embedded videos don’t load until clicked—but auto-playing videos can significantly slow your site down. Be sure to:
- Test your page speed regularly.
- Avoid unnecessary auto-play settings.
- Review our additional resources on site speed for further optimisation.
Creating high-quality video content takes considerable time and resources. Ensure your efforts are rewarded by setting up your on-site videos for SEO success—from hosting and transcripts to structured data and speed optimisation.
Prioritising Technical SEO Fixes
It’s easy to get caught in the trap of compiling a long list of technical SEO issues—but actually implementing those fixes can be far more challenging, especially when you’re not the sole decision-maker. In many cases, audits are presented in full detail, only to see minimal action taken months later.
Start Small and Stay Strategic
Presenting all fixes at once can be overwhelming. Instead, start with small, focused wins around a single topic. This approach helps prevent inaction and encourages real progress.
Within the spreadsheet accompanying this course, you’ll notice that although it forms one comprehensive document, the audit items are broken down into logical sections. Begin with a site crawl and an assessment of site speed. From there, identify the most urgent issues—what’s most “on fire.”
For instance:
- If internal links are broken and site speed scores are poor, fix internal links first.
- Schema implementation can wait, as it’s an enhancement, not a critical failure.
Evaluating What Matters Most
To determine priority, consider two factors:
- Impact on overall website health
- Time, cost, and effort required for implementation
A fix that’s high-impact and easy to execute? High priority.
A fix with uncertain benefit and high cost? Low priority.
Don’t Ignore the Fundamentals
Before diving into advanced tactics like voice search optimisation, ensure the basics are solid. If a website is slow, unstructured, or unreadable to search engines, none of the advanced optimisations will help.
As we often say: Don’t mow the lawn while the house is on fire.
Think in Templates, Not Pages
Most modern websites are template-driven. Instead of listing 100 URLs with the same issue, identify the template causing the problem. It’s more effective to say, “this link is broken in the blog page template,” than to list every affected blog URL.
This approach:
- Saves developer time
- Increases likelihood of implementation
- Streamlines communication
Speak to the Right Stakeholders
Even the best fix won’t be prioritised if you’re not speaking to the right person. Developers may not control their workload priorities—find out who does, and make your case to them.
When Fixes Don’t Happen
Sometimes, you may need to accept that certain fixes won’t be made—due to resource constraints, budget limitations, or other priorities. In such cases:
- Document everything clearly
- Explain the risks
- Suggest alternatives, like increasing inbound link efforts or paid search investments
The most valuable fix is the one that actually gets implemented. Keep this principle in mind as you prioritise your technical SEO efforts—it will serve you well across all future projects.
Mastering technical SEO isn’t just about fixing errors—it’s about building a strong, lasting foundation for long-term search success.