Encountering Common Drawbacks Of SEO And Ways To Deal With Them


Have any doubts about SEO? You shouldn't, because SEO is what keeps your online business running, right alongside the business model you have put in place.

What you can question, however, are the problems most commonly faced while doing SEO for websites. Some are pretty common and encountered almost daily; others are rarer but, whenever they do appear, add real complexity to optimizing the website. Let's go through them one by one:

Common Drawbacks Of SEO

URL Problem (Uppercase vs. Lowercase)

According to many reputable SEOs, the uppercase/lowercase URL problem is the one they deal with most often. It is noticed mainly on sites built on .NET: the uppercase version of a URL resolves but is never redirected to the lowercase version, so both versions can be indexed as duplicates. The problem can be cured, more or less, with the URL Rewrite module, specifically on IIS 7, by forcing a permanent (301) redirect to the lowercase URL.
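However the redirect is implemented, it is worth verifying that it actually fires. Below is a minimal audit sketch, assuming Python with the `requests` library and hypothetical URLs on example.com, that checks whether mixed-case URLs 301-redirect to their lowercase counterparts:

```python
import requests

# Hypothetical mixed-case URLs standing in for pages on your own site.
URLS_TO_CHECK = [
    "https://www.example.com/Products/Widgets.aspx",
    "https://www.example.com/About-Us",
]

for url in URLS_TO_CHECK:
    # Do not follow redirects automatically so the first response can be inspected.
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    # A correct setup answers 301 and points at the all-lowercase version.
    if response.status_code == 301 and location.lower() == url.lower():
        print(f"OK     {url} -> {location}")
    else:
        print(f"CHECK  {url} returned {response.status_code} (Location: {location or 'none'})")
```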

Homepage in More Than One Version

This problem is also common on websites developed in .NET, though sites built on other platforms have been noticed showing similar symptoms. An experienced SEO knows to request the site with “default.aspx”, “index.html”, “home”, etc. appended if it is made on .NET (and/or other platforms), and will usually find that one of them does exist as a separate page. The problem can be dealt with through careful scrutiny and trial and error. Alternatively, crawl the site, export the crawl to CSV, filter on the meta title column and then search for the homepage's title.

Generally, that will surface what you are looking for.
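For the trial-and-error route, a small script can do the probing for you. This is only a sketch, assuming Python with the `requests` library and a hypothetical domain; the list of candidate paths is illustrative, not exhaustive:

```python
import requests

DOMAIN = "https://www.example.com"  # hypothetical domain
CANDIDATES = ["/", "/default.aspx", "/index.html", "/index.php", "/home"]

for path in CANDIDATES:
    # Request each candidate without following redirects and report what comes back.
    response = requests.get(DOMAIN + path, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    print(f"{response.status_code}  {path}  {location}")

# Any path besides "/" that answers 200 is a likely duplicate homepage and
# should 301-redirect to the canonical version instead.
```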

The problem of query parameters

This is generally found on ecommerce websites, where the site is essentially a database rendered as a grid of pages. It can be found on other kinds of websites as well, but the occurrence is quite low compared to sites with loads of products categorized by their various attributes, since every filter combination can generate its own parameterized URL. The solution is to spend your crawl budget where it does the most good: decide which pages you want Google to crawl and index, guided by your keyword research, and make the site crawlable and indexable around the most popular attributes of that particular website. Where URL indexing is the prime issue, you can find the support you need in the “Fetch as Google” tool in Google Webmaster Tools. Solve that and then move ahead.
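To decide which parameters deserve crawl budget and which should be canonicalized or blocked, it helps to measure how many crawled URLs each parameter produces. Here is a minimal Python sketch, assuming a hypothetical crawl_export.txt file with one crawled URL per line:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

param_counts = Counter()

# crawl_export.txt is a hypothetical file with one crawled URL per line.
with open("crawl_export.txt") as crawl_file:
    for line in crawl_file:
        query = urlparse(line.strip()).query
        for param in parse_qs(query):
            param_counts[param] += 1

# Parameters near the top are the best candidates for canonical tags,
# parameter-handling rules, or a robots.txt disallow.
for param, count in param_counts.most_common(10):
    print(f"{param}: {count} URLs")
```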

Sitemaps Presenting a Problem (Broken or Outdated)

Although sitemaps are very important for any website from an SEO point of view, they can themselves become a problem when they are broken or outdated. On sites where the XML sitemap was generated once and then forgotten, new URLs never get added and dead URLs never get removed, so the sitemap loses credibility from an SEO point of view. The solution is, first, to use good tools, like the few mentioned above, to launch a search for broken links in the sitemap. Secondly, preference should be given to building a dynamic XML sitemap, which facilitates regular updating with new URLs.
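If you prefer to script the broken-link check yourself, the sketch below fetches a sitemap and flags entries that no longer resolve. It assumes Python with the `requests` library and a hypothetical sitemap URL:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical sitemap location
NAMESPACE = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)

# Check every <loc> entry and flag anything that no longer resolves.
for loc in root.findall(".//sm:loc", NAMESPACE):
    url = loc.text.strip()
    response = requests.head(url, allow_redirects=True, timeout=10)
    if response.status_code >= 400:
        print(f"BROKEN  {response.status_code}  {url}")
```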

There can be other problems too, such as 302 redirects used where 301 redirects are intended (and vice versa), soft 404 errors, dealing with robots.txt, base64-encoded URLs and more. Some have well-known solutions; others are still handled ad hoc. Do share if you find something unique to tell.
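Two of those, the 301/302 mix-up and soft 404s, are easy to spot-check. A rough sketch, again assuming Python with the `requests` library and hypothetical URLs:

```python
import uuid
import requests

# 1) 301 vs 302: a redirect that is meant to be permanent should answer 301.
redirect_check = requests.get(
    "https://www.example.com/old-page", allow_redirects=False, timeout=10
)
print("Redirect status:", redirect_check.status_code)  # expect 301, not 302

# 2) Soft 404: a clearly nonexistent URL should answer 404, not 200.
missing_url = f"https://www.example.com/{uuid.uuid4()}"
soft_404_check = requests.get(missing_url, timeout=10)
print("Missing-page status:", soft_404_check.status_code)  # expect 404, not 200
```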
