This One Mistake May Keep Your Customers From Finding You Online

January 3rd, 2019

You can do everything perfectly. Your on-page search engine optimization can be on point; your website can load quickly and offer an excellent design; your content can be helpful and engaging.

But in the end, that may still not be enough.

Even if your website scores high on most of the 200 factors that affect your Google ranking, it's still going to have trouble placing high on that search engine results page if your URL structure is overly complex. That's why you need the guidance of experienced digital marketers, not only to get your website onto Google's SERP but also to make sure it ranks high once it's there.

Today, we’re going to analyze why overly complex or hard-to-understand URLs may not even place in Google’s SERP, and what we can do to circumvent that.

What’s Happening with URLs and Google?

In a recent chain of tweets, John Mueller, Google's Webmaster Trends Analyst, was asked about recent difficulties certain websites were facing, including having their products removed from Google's search index.

Mueller stated that hard-to-understand or overly complex URLs are sometimes to blame for pages being removed from the index. More specifically, he said the problem was “many URLs leading to the same content, making our systems assume that part of the URL is irrelevant.”

In this particular case, an e-commerce website lost over 50,000 pages because its URL structure had caused Google to remove them from its search index.

What Can Be Done to Protect Your Website?

Google has provided some insight on how you can ensure your site’s URLs aren’t bloated with irrelevant or invalid parameters:

1. Don’t Be Afraid to Use Punctuation in Your URLs – A well-placed hyphen between the keywords in your URL (for example, /blue-widgets rather than /bluewidgets) is always more useful than running the words together.

2. Avoid Using Session IDs – Session IDs embedded in URLs generate countless duplicate URLs that all point to the same content. Use cookies to track sessions instead.

3. Trim Off the Fat – Simply put, have a webmaster strip the excess from your URLs by removing irrelevant or unnecessary parameters.

4. Check For Broken Relative Links – Broken relative links can lead to infinite spaces because of repeated path elements. Have an expert eye take a look.

5. Add a “Nofollow” Attribute to Links – A dynamically generated calendar may create links to future events — or even past ones — with no restrictions on start or end dates. Adding a nofollow attribute to a link tells search engines not to follow that specific link. In the context of infinite calendars, the attribute tells a search engine not to crawl the links to dynamically created future calendar pages.

6. Make Good Use of Your robots.txt File – Use your robots.txt file to block Googlebot from accessing URLs of concern. Consider blocking dynamic URLs that generate search results or URLs that create infinite spaces (like a calendar). With wildcard patterns, your robots.txt file will enable you to block a multitude of problematic URLs at once.
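To illustrate item 5, here's what a nofollow link might look like in a calendar template. The URL and link text are hypothetical; the key piece is the rel="nofollow" attribute:

```html
<!-- Hypothetical example: a "next month" link on a dynamically
     generated calendar page. rel="nofollow" tells search engines
     not to follow this one link to the endlessly generable next page. -->
<a href="/calendar?month=2019-02" rel="nofollow">Next month</a>
```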
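And for item 6, a minimal robots.txt sketch. The paths here are hypothetical, so adapt the patterns to your own site's URL structure; note that Googlebot supports the * wildcard in robots.txt rules:

```txt
# Block internal search-result pages and infinite-space URLs from crawling.
User-agent: Googlebot
Disallow: /search?          # dynamic search-result URLs
Disallow: /calendar?month=  # endlessly generable calendar pages
Disallow: /*?sessionid=     # URLs carrying session IDs
```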

Are your URLs structured right? Do you need help bringing pages back to life? Look no further than Cobalt Digital Marketing.

Believe us, we know how scary it would be to realize that all of your hard work was removed from Google’s search index because of a URL. Enlisting the aid of an experienced digital marketing agency – like Cobalt Digital Marketing – not only gives you the quickest return on your investment, it also leaves your business’ online presence in good hands through our tried and true Cobalt Formulaᵀᴹ.

Contact us today at 956-303-6555 to get started on your 2019 digital marketing journey.