What is technical SEO, and how does it differ from on-page SEO?

SEO breaks down into several distinct areas. On-page SEO is about optimising content — keywords, copy, headings, meta descriptions. Off-page SEO is about authority — links from other sites, mentions, social signals. Technical SEO is the infrastructure — the way your site is built, served, and understood by Google.

You might have the best content in your industry, but if Google can't crawl and index your pages correctly, or if your site takes eight seconds to load, you won't appear in search results. Technical SEO is the necessary condition — not sufficient on its own, but absolutely necessary — for any SEO strategy to work.

Good technical SEO is implemented primarily when the site is being built. That's why it's important to work with a web designer who understands these fundamentals from the outset — not to bolt them on afterwards as patches. You can see more about what's included in my work on the services page.

Crawlability and indexing — how Google finds you

Googlebot is the crawler that "visits" websites and saves them in Google's index. The process has two steps: crawling (the bot visits and reads your pages) and indexing (the visited pages are saved in Google's database and can appear in results).

Common crawlability problems:

  • Misconfigured robots.txt — a robots.txt with "Disallow: /" blocks your entire site from being crawled. I've seen new sites remain invisible for months because of this single error
  • Pages with meta robots noindex — a directive that explicitly tells Google not to index a particular page; used incorrectly, it can de-index important pages
  • Redirect chains — sequences of redirects (A→B→C→D) that waste crawl budget and pass authority less effectively
  • Orphaned pages — pages with no internal links pointing to them, so Google has no way to find and crawl them
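For reference, the noindex directive mentioned above is a single tag placed in a page's head (a minimal illustration):

```html
<!-- In the page's <head>: allows crawling, but tells Google not to index this page -->
<meta name="robots" content="noindex">
```

Legitimate uses include thank-you pages and internal search results; the danger is when it ends up on pages you actually want to rank.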

Solutions include: correctly configuring robots.txt, submitting an XML sitemap in Google Search Console, creating a logical internal link structure, and regularly checking indexing reports in Search Console.
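As an illustration, a healthy robots.txt is usually very short. This is a sketch with placeholder paths and domain — your own blocked directories and sitemap URL will differ:

```text
# robots.txt at the site root — allow full crawling, block only private areas
User-agent: *
Disallow: /admin/

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The critical check: make sure there is no bare "Disallow: /" line, which blocks the entire site.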

Site speed and Core Web Vitals

Since 2021, Google has used Core Web Vitals as a direct ranking factor. These are three metrics that measure the user experience on your site. A slow site doesn't just frustrate visitors; in search results it can lose ground to faster sites with similar content.

The three essential metrics are LCP (Largest Contentful Paint — how quickly the main element loads), INP (Interaction to Next Paint — how quickly the page responds to user interactions) and CLS (Cumulative Layout Shift — visual stability of the page). You can read the full details on each in the article about Core Web Vitals explained simply.

To check your site's speed, run it through Google PageSpeed Insights (pagespeed.web.dev). A score below 50 on mobile is a serious problem that needs immediate attention. A score above 90 means the technical foundation is solid. If your site is slow and you want to understand why, I've written in detail about the causes and fixes for a slow website.

URL structure and site architecture

URLs tell both users and Google what a page is about. A good URL is descriptive, concise, and includes the primary keyword:

  • Good: amatei.co.uk/services/web-design
  • Poor: amatei.co.uk/page?id=47&cat=3

Site architecture — how your pages are organised and interlinked — affects how easily Google can understand the hierarchy of your content. A well-structured site has:

  • A clear hierarchy: Home → Category → Subcategory → Page
  • Logical internal links that distribute authority towards your most important pages
  • No more than 3 clicks from the homepage to any page on the site
  • Breadcrumbs that help both users and Google understand position within the hierarchy
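A visible breadcrumb trail can be a simple ordered list in the page markup (the paths below are illustrative):

```html
<!-- Breadcrumb navigation reflecting the site hierarchy -->
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/services/">Services</a></li>
    <li aria-current="page">Web Design</li>
  </ol>
</nav>
```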

Structured data — schema.org

Structured data is standardised markup (written in JSON-LD, Microdata or RDFa format) that explicitly tells Google what type of content a page contains. This enables Google to display rich results — review stars, prices, FAQs, event dates — directly in the search results page.

Useful structured data types for UK businesses:

  • LocalBusiness — for physical businesses, supports local search results and the Knowledge Graph
  • Person — for freelancers and professionals, builds author authority
  • Product — for online shops, shows prices and availability directly in search results
  • Article — for blog content, can enable display in Google News and Discover
  • FAQPage — shows expandable questions and answers directly in results, increasing visibility
  • BreadcrumbList — shows page hierarchy in the URL displayed in Google results
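As a minimal sketch of the LocalBusiness type (every business detail below is a placeholder), JSON-LD goes in a script tag in the page's head:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Web Studio",
  "url": "https://www.example.com/",
  "telephone": "+44 20 0000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "London",
    "postalCode": "SW1A 1AA",
    "addressCountry": "GB"
  }
}
</script>
```

After adding markup, run the page through Google's Rich Results Test to confirm it's read correctly.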

Structured data isn't optional for serious technical SEO: it's one of the easiest competitive advantages to gain over sites that ignore it.

HTTPS and security as an SEO factor

HTTPS isn't just a security matter — it's a direct ranking factor confirmed by Google. Since 2014, Google has favoured HTTPS sites over HTTP. A site without an SSL certificate in 2026 is at a serious competitive disadvantage.
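On an Apache server, for example, forcing HTTPS is typically a single permanent redirect in .htaccess (a sketch — nginx and managed hosts have their own equivalents):

```apache
# .htaccess — send all HTTP traffic to HTTPS with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

The 301 status matters: it tells Google the move is permanent, so ranking signals consolidate on the HTTPS version.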

Beyond the SSL certificate, security affects SEO in another way: a site infected with malware is de-indexed by Google and flagged as dangerous in browsers. Recovering a de-indexed site can take weeks or months — during which time you don't appear in any search results.

Technical SEO minimum checklist for any professional site: active HTTPS, correctly configured robots.txt, submitted XML sitemap, clean URLs, basic structured data, Core Web Vitals in the green zone, and logical internal links. Without these, no other investment in SEO will deliver its full potential.

If you'd like to check the technical SEO health of your current site, or want to build a new site that's optimised correctly from day one, get in touch for an analysis and a no-obligation conversation.