Website URLs matter because they help search engines index your website effectively. The way search engines do this is not dissimilar to a good organisation system in a library – URLs ensure search engines can find all your important pages and keep a reference of them. URLs and URL parameters can also be a tool for boosting your SEO when used correctly. Controlling your URL parameters and harnessing them to best effect will ensure your website is in the best state to be found by search engines – and, hence, by people looking for your products or services.

Search engines use what is commonly known as a web crawler, spider or search engine bot to download and index content from all over the internet. A web crawler/spider/bot is simply a software program. The goal of a web crawler is to learn what each page online is about so the search engine can show that information in response to a relevant search query.

Note that there are also AI crawlers that access content on the web to help train large language models (LLMs) and to help AI apps such as ChatGPT or Perplexity to provide information in response to a question or AI prompt.

Let’s take a closer look…

Why URL parameters matter

Essential for filtering, tracking and dynamic content, URL parameters help not only search engines but also users visiting your site to find what they are looking for easily and effectively. However, this only happens when they are correctly set up. If you have multiple pages that should be classified together, but each has a different classification, search engines and your users will find your site confusing to navigate. This can lead to:

  • Search engines like Google ignoring important pages within your site.
  • Your customers becoming frustrated and finding somewhere else to look for what they want.

Not good news, I’m sure you’ll agree.

This is because search engines view each of your unique URLs as a completely different page. Having the same content on various URLs can, effectively, dilute the power of the content you’ve painstakingly created. Not only that but it could mean your pages are fighting with each other for the same keywords, a phenomenon known as content cannibalisation. This can confuse the search engines, which won’t be able to work out which page is the most relevant. In addition to all this, if your pages are parameter heavy, this can indicate that they are low value to a search engine, and your rankings could suffer as a result.
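To see why this matters, consider how many parameter variations can point at the same content. The sketch below uses Python's standard `urllib.parse` module – the example.com URLs and the list of tracking parameters are assumptions for illustration – to show how stripping tracking-only parameters collapses duplicate URLs back to a single page:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking-only parameters that don't change page content.
TRACKING = {"utm_source", "utm_medium", "utm_campaign", "sessionid"}

def strip_tracking(url):
    """Drop tracking-only parameters so duplicate URLs collapse to one."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# Two "different" URLs that serve identical content:
print(strip_tracking("https://example.com/shoes?utm_source=ad&size=large"))
print(strip_tracking("https://example.com/shoes?size=large"))
```

Both calls resolve to the same cleaned URL, which is exactly the consolidation you want search engines to perform on your behalf.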

Plan, Execute, Enjoy

If all this sounds a little worrying, let me calm your nerves a little. If you have a good URL parameter plan, you’ll avoid these pitfalls and get your site in the best shape it can be for the crawlers. The steps below should help.

Step 1 – Get to know your parameters

Know your URL parameters. The first step is to crawl your website to see what search engines like Google will see. Tools such as Lumar (previously called DeepCrawl) or Screaming Frog can help with this. Pay attention to the most common parameters, which can include:

  • Tracking parameters (utm_medium, utm_source, utm_campaign etc, used for analytics)
  • Session IDs such as phpsessid and sessionid
  • Sorting parameters, like sort=popularity or sort=price
  • Filtering parameters, such as size=large, and colour=red
  • Pagination parameters, such as page=2 etc
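If you'd rather get a quick feel for your parameters before reaching for a crawler, Python's standard `urllib.parse` module can tally parameter names across a list of URLs (the example.com URLs below are made up for illustration; in practice you'd feed in a crawl export or server log):

```python
from urllib.parse import urlsplit, parse_qs

# Hypothetical sample URLs, as a crawl export might list them.
urls = [
    "https://example.com/shoes?sort=price&page=2",
    "https://example.com/shoes?colour=red&size=large",
    "https://example.com/shoes?utm_source=newsletter&utm_campaign=spring",
]

# Tally how often each parameter name appears across the crawl,
# so the most common ones surface first.
counts = {}
for url in urls:
    for name in parse_qs(urlsplit(url).query):
        counts[name] = counts.get(name, 0) + 1

for name, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(name, n)
```

On a real site, the parameters at the top of this tally are the ones worth managing first.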

Step 2 – Implement your plan

Next, it’s time to find the best way to manage your parameters. You’ll have a few options here. One of the most popular is to canonicalise your pages. This effectively tells the crawlers which version of a page you would prefer them to index. To implement this method, add a rel="canonical" link tag to the <head> section of your parameter pages, pointing it to the canonical version.
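As a minimal sketch (the example.com URL is hypothetical), the canonical tag on a filtered or sorted page looks like this:

```html
<!-- On https://example.com/shoes?sort=price, point search engines
     at the clean, parameter-free version of the page: -->
<head>
  <link rel="canonical" href="https://example.com/shoes" />
</head>
```

Every parameter variation of the page carries the same canonical link, so ranking signals consolidate on the one URL you've chosen.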

Alternatively, you can go through your site and rewrite your URLs. While this may take a little time, it can make your site look more professional. Plus, it can improve user experience. You can use your web server’s rewrite module for this (e.g. Apache’s mod_rewrite) and create rules for redirecting the heaviest parameter pages to cleaner URLs.
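A rewrite rule for this might look like the sketch below – the paths and parameter name are assumptions for illustration, and you'd adapt the pattern to your own URL structure:

```apache
# Hypothetical mod_rewrite rules (.htaccess or vhost config):
# 301-redirect a legacy parameter URL such as /product?id=123
# to a cleaner path such as /product/123.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^/?product$ /product/%1? [R=301,L]
```

The trailing `?` in the substitution drops the old query string, and the 301 status tells search engines the move is permanent.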

Or, you can use the noindex method. This can prevent the search engines from indexing your less important pages. If you have duplicate information or thin content on your site, this might be a good way to go. Again, simply add a robots meta tag with noindex to the <head> section of the pages you don’t want indexed.
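The tag itself is a one-liner, as in this minimal sketch:

```html
<!-- In the <head> of a thin or duplicate page you don't want indexed: -->
<head>
  <meta name="robots" content="noindex, follow" />
</head>
```

The `follow` value lets crawlers still pass link signals through the page even though the page itself stays out of the index.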

What else do I need to know?

There are a few dos and don’ts when it comes to URL parameters, and it’s important to keep these in mind when tweaking your website.

  1. Your visitors come first. If your URLs are overly complex, they can be difficult for users to read – and if users can’t read them, they are less likely to share them, so be warned.
  2. Consistency matters – using the same parameter handling method across your site will ensure your efforts aren’t wasted.
  3. Don’t handle everything – some of your parameters won’t need to be managed. Focus on what’s damaging your SEO.
  4. Don’t overcomplicate things – it can be tempting to use all the parameters all of the time. However, doing so can make your URLs tricky to manage.

Set it and forget it?

As with most facets of SEO, you can’t just leave it to its own devices. Once you’ve decided on and implemented your strategy, it’s time to monitor and adjust. Take a look at your performance in Google Analytics and Search Console, looking closely for crawl errors, duplicate content red flags and how your traffic is moving. Use this insight to adjust your technical SEO strategy, and you should be primed for success.

The Bottom Line

While URL parameters are vital for ensuring your website functions effectively, they can also be detrimental to your SEO. Knowing how to handle your URLs, and keeping up to date with what’s helping and harming your SEO, will give your users and the crawlers the best chance of seeing your website as you’d want them to view it: perfectly organised, easy to navigate, and a great resource to come back to.

If you’re unsure of any aspect of URL parameters, get in touch and talk to an SEO expert who can help.

Need an SEO Expert?

We specialise in all things SEO, including technical SEO.

Get in Touch Today...

Michelle Symonds - SEO Consultant - Ditto Digital

An established SEO specialist since 2009, after a career as a software engineer in the oil industry and investment banking, Michelle draws on her technical experience to develop best-practice processes for implementing successful SEO strategies. Her proactive approach to SEO enables businesses to reach new audiences, both nationally and internationally. She has a wealth of cross-industry experience from startups to Fortune 500 companies.
