URL structure is a frequent SEO issue, one that can impair rankings, keep pages out of search engine indexes, and drain ranking authority from your other pages or even your entire website.

Some content management systems bake poor URL structures right into their websites. Lax rules can also be a culprit, such as failing to encode spaces or special characters.
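Stricter rules can be enforced at publish time. The snippet below is a minimal sketch in plain Python with a hypothetical slugify helper, not tied to any particular CMS; it shows one way to generate lowercase, hyphenated slugs and to percent-encode anything that slips through.

```python
import re
from urllib.parse import quote

def slugify(title: str) -> str:
    """Turn a page title into a lowercase, hyphen-separated URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

def safe_path(path: str) -> str:
    """Percent-encode any characters that are not safe in a URL path."""
    return quote(path, safe="/")

print(slugify("Diagnosing URL Issues & Fixes"))  # diagnosing-url-issues-fixes
print(safe_path("/blog/Diagnosing URL Issues"))  # /blog/Diagnosing%20URL%20Issues
```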

While it is true that search engines go to great lengths to read and index even the worst URLs, attention to URL management and optimization will provide both SEO and usability advantages.

https://www.linkedin.com/company/nkwebart

This address has several advantages:

  • It is easy to read and understand. If I saw this address pasted into a blog or forum, I would likely click on it.
  • It is optimized for SEO with breadcrumb-style keywords. Search engines look for keywords in URLs; it’s a known ranking factor. This layout, going from general to specific, is ideal for enterprise SEO.
  • The URL includes its own anchor text. If this address were pasted into a blog or other web page as a link, that link would possess well-optimized anchor text.

Old-style dynamic addresses built from query-string parameters are legal and acceptable, though they have drawbacks.

  • They tend to be longer and harder to read because they contain both parameter names and values.
  • Pairing parameter names with values adds extra words, which may dilute the SEO value of the keywords within the URL.
  • This type of address may carry information better transmitted outside of the URL. A user ID, session ID, sort parameter, print parameter, and many other possible parameters can create duplicate content, security, or other issues; one way to strip them is sketched after this list.
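As a rough illustration of how such parameters can be separated from the address, here is a small Python sketch; the parameter names in IGNORABLE and the example URL are hypothetical and would need to match your own site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters assumed not to change page content; the names here are illustrative only.
IGNORABLE = {"sessionid", "userid", "sort", "print", "utm_source", "utm_medium"}

def canonicalize(url: str) -> str:
    """Drop ignorable parameters so duplicate dynamic URLs collapse to one address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in IGNORABLE]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonicalize("https://example.com/product.php?id=42&sessionid=abc&sort=price"))
# https://example.com/product.php?id=42
```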

Diagnosing URL Issues

To find URL-based issues:

  1. Check for errors and warnings, then determine whether URLs are the culprit.
  2. Audit all URLs for proper syntax.

To check for errors, begin with Google and Bing webmaster tool reports. Look for duplicate content, then examine the webpage addresses themselves and their locations. Numerous third-party SEO tools can locate these issues as well.

Canonical issues, parameters that do not change page content, loose adherence to coding standards, and any number of other causes can create duplicate content.

Options for dealing with duplicate content include:

  • Reconfigure the content management platform to generate one consistent URL for each page of content.
  • 301 redirect duplicate URLs to the correct version (a minimal redirect sketch follows this list).
  • Add canonical tags to webpages that direct search engines to group duplicate content and combine their ranking signals.
  • Configure URL parameters in webmaster tools and direct search engines to ignore any parameters that cause duplicate content.
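To make the redirect and canonical-tag options concrete, here is a minimal sketch. It assumes a Flask application and invented paths; the same idea translates to .htaccess rules or any other framework, and in a real site the canonical link element would live in the page template's <head>.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Known duplicate addresses mapped to their canonical versions (illustrative paths only).
CANONICAL = {
    "/Products/widget": "/products/widget",
    "/products/widget/index.html": "/products/widget",
}

@app.before_request
def send_duplicates_to_canonical():
    # A 301 tells search engines to consolidate ranking signals on the target URL.
    target = CANONICAL.get(request.path)
    if target and target != request.path:
        return redirect(target, code=301)

@app.route("/products/widget")
def widget():
    # The canonical link element groups any remaining duplicates with this page.
    return '<link rel="canonical" href="https://example.com/products/widget">Widget page'
```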

When auditing URL syntax, I prefer to export every webpage address into a spreadsheet or database. If you’re thinking about using Google site: queries, don’t bother: many of the issues you are looking for do not appear in search results.
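Once the addresses are exported, a short script can make the first pass at the syntax audit. This is only a sketch: the urls.csv filename, the url column header, and the particular checks are assumptions you would adapt to your own export.

```python
import csv
import re
from urllib.parse import urlsplit

def audit(url: str) -> list[str]:
    """Flag common URL-syntax problems in a single exported address."""
    issues = []
    parts = urlsplit(url)
    if parts.path != parts.path.lower():
        issues.append("uppercase characters in path")
    if " " in url:
        issues.append("unencoded space")
    if "_" in parts.path:
        issues.append("underscores instead of hyphens")
    if parts.query:
        issues.append("query parameters present")
    if re.search(r"//", parts.path):
        issues.append("double slash in path")
    return issues

# Assumes a one-column export named urls.csv with a "url" header row.
with open("urls.csv", newline="") as f:
    for row in csv.DictReader(f):
        problems = audit(row["url"])
        if problems:
            print(row["url"], "->", "; ".join(problems))
```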

These are the main, basic, and important guidelines to follow when doing SEO for your website or web application.
