Common Problems With Search Engines And SEO

This section covers common problems webmasters run into and some ways to fix them.

Every search engine is different, and the algorithms they use are always changing, so one day you might be doing well and the next you’re off the radar. It’s important to stay on top of current SEO methods.

We’ll also go over ways to build your website so that the search engines have no problems crawling it.

 

Roadblocks To Getting Noticed By The Search Engines

There are some things that prevent the search engines from crawling your site.

The most common culprit is a robots.txt file with a disallow command covering the whole site.

If you already have a robots.txt file, open it and check for the following:

User-agent: *

Disallow: /

The “Disallow: /” line means the search engine bots are being blocked from crawling your entire site.

A robots.txt file is useful if you want the search engines to avoid duplicate and useless pages, as in the example below. Just make sure you don’t disallow everything.
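
For example, a robots.txt that lets bots crawl everything except a couple of duplicate or low-value directories might look like this (the directory names are hypothetical; substitute your own):

User-agent: *
Disallow: /print/
Disallow: /search-results/

And if you want the bots to crawl everything, an empty Disallow line does exactly that:

User-agent: *
Disallow: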

When a web developer is building your site, they’ll often place a robots meta tag in the code to prevent search engines from crawling your site while it’s under construction. Sometimes they forget to modify or remove it.

Check out the source code of your website (look in the <head> section) and if you see the following robots meta tag, remove it:

<meta name="robots" content="noindex, nofollow">

(A value of just “noindex” blocks indexing too.)

Some websites now make users fill out a registration form of some sort before they can access the content. A form like this prevents search engines from seeing the content, so the site doesn’t get indexed.

If you have a membership site, keep in mind that the members-only content won’t be crawled, meaning you won’t receive much search engine traffic to those pages.

 

Making It Easy For The Search Engines To Crawl Your Website

Search engines crawl your site by following the links on your pages. Always make sure you have plain HTML links somewhere on every page as a way to navigate to the other pages.

Search engine bots can’t use forms or drop-down menus. You can still use forms and drop-down menus, but be sure to also give the bots a plain-link way to crawl your site.

Having a link to a site map that links to all your other pages is perhaps the easiest way to ensure that your entire site is crawled.
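
As a minimal sketch, a crawlable site map is nothing more than a plain HTML page of links (the file names here are made up):

<!-- sitemap.html: ordinary links that any bot can follow -->
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/about.html">About Us</a></li>
  <li><a href="/products.html">Products</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>

Link to this page from your home page (a footer link works fine) so the bots find it early.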

 

Ajax And DHTML

Ajax and DHTML are JavaScript-based techniques used to make fancy web apps; they rely on JavaScript to load and display the content.

Ajax is used to create fast-loading web pages and DHTML is used to create drop-down menus and other interesting effects.

Since search engines are unable to run JavaScript, the content you create with Ajax and DHTML will not be indexed.

There are ways to build drop-down menus without DHTML, and in such a way that search engines can navigate them. The most common method is to have your page “degrade gracefully”: the menu still works as plain HTML links when JavaScript is turned off, as in the sketch below.
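
Here’s a rough sketch of a gracefully degrading drop-down menu (the page names and class names are placeholders). The links exist as ordinary HTML lists, and JavaScript only adds the show/hide effect, so a bot that ignores JavaScript still sees plain crawlable links. Place the script at the end of the page so the menu exists when it runs:

<ul id="nav">
  <li>
    <a href="/products.html">Products</a>
    <!-- The submenu is real HTML, so bots can crawl these links -->
    <ul class="submenu">
      <li><a href="/products/widgets.html">Widgets</a></li>
      <li><a href="/products/gadgets.html">Gadgets</a></li>
    </ul>
  </li>
</ul>
<script>
// Progressive enhancement: without JavaScript the submenu simply
// stays visible as a plain list of links.
var items = document.querySelectorAll('#nav > li');
for (var i = 0; i < items.length; i++) {
  var sub = items[i].querySelector('.submenu');
  if (!sub) continue;
  sub.style.display = 'none';
  items[i].onmouseover = function () {
    this.querySelector('.submenu').style.display = 'block';
  };
  items[i].onmouseout = function () {
    this.querySelector('.submenu').style.display = 'none';
  };
}
</script>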

 

Why You Shouldn’t Have Flash Websites

Flash is a technology used to make vector animations and “flashy” websites and apps.

Much like Ajax and DHTML, the search engines can’t read Flash content, and therefore it won’t be indexed.

Adobe (the company that makes Flash) is working hard with Google to make Flash crawlable, but it’s still not there yet.

What many websites do is offer both an HTML and a Flash version of the site, which allows the site to still be indexed. The catch with having two versions is that only the HTML version gets indexed, so any search traffic you receive will be directed to the HTML version.

Flash looks great but it’s not SEO friendly.

 

Client-Side Redirects

Back in the olden days of the internet, webmasters would create pages loaded with keyword-rich content for the search engines, then include a redirect so that visitors were sent to the main page and never saw the keyword spam.

This redirect method was also known as “cloaking”.
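
For reference, a client-side redirect is typically done with a meta refresh tag or a line of JavaScript, something like the following (the destination URL is just a placeholder):

<meta http-equiv="refresh" content="0; url=http://www.yoursite.com/">

<script>window.location.href = "http://www.yoursite.com/";</script>

If you have a legitimate reason to redirect a page, use a server-side 301 redirect instead.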

The redirect method worked great for boosting rankings, but the search engines have caught on and now penalize sites that use it.

Some websites still use advanced forms of cloaking for certain businesses, but it’s considered black hat, so you should avoid it.

 

The Firefox Web Developer Toolbar

Firefox has a web developer toolbar plug-in which can be found here:

  • http://chrispederick.com/work/web-developer/

This tool will let you see what a website looks like to the search engines.

Go to a website and disable CSS and JavaScript… what you see is what the search engines see.

You can use this tool to spot the problems in your own website. Disable JavaScript and CSS, and if you can still see all the content on your website and navigate to every page… you’re in good shape.

 

Duplicate Content

Duplicate content is a big problem when it comes to SEO.

In the early days of search engines, people would intentionally make several websites with the exact same content linking to each other, thereby increasing relevancy and rankings. The search engines were unable to detect duplicate content back then, but they now can, and they will often penalize those sites.

Search engines now detect duplicate content and will only index one of the copies.

This can be bad because if someone steals your content and their site has more authority, it’s possible that the thief’s copy will be the one that gets indexed and ranked instead of yours.

If you have duplicates on your own site and backlinks pointing to both duplicates, you’re essentially splitting the links that could be going to a single page. Google will rank both pages separately, and because of the divided links… neither page will rank high.

To find duplicate content, go to Google and search for “site:www.yourwebsite.com”. This will show you all the pages of your site that have been indexed. You know you have duplicate content if you see a message at the bottom that says:

  • “In order to show you the most relevant results, we have omitted some entries very similar to the 217 already displayed. If you like, you can repeat the search with the omitted results included”.

Another great way to find duplicate content is to copy a sentence of your content and do a search for it in Google (in quotation marks). You’ll see all the other places where that sentence appears.

Your website can be reached at three different addresses:

  • yoursite.com
  • www.yoursite.com
  • www.yoursite.com/index.html

Google sees each version of your website as an entirely separate website, so if you get backlinks to each version, your site will be indexed three times as if the versions were different sites with duplicate content. This can hurt your rankings greatly.

Always link to your site the same way, preferably like this:

  • http://www.yoursite.com/

Have a 301 redirect from “yoursite.com” to “http://www.yoursite.com/”.
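
If your site runs on Apache, one common way to set this up is with an .htaccess file. Here’s a minimal sketch, assuming mod_rewrite is enabled and using yoursite.com as a placeholder:

RewriteEngine On
# Permanently (301) redirect the non-www address to the www address
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]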

If you have a website that utilizes a database, you’ll probably get very complex URLs. Get a plug-in that will simplify all the URLs and make them SEO friendly, along the lines of the sketch below. You want each page to have only one URL at all times.
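
As a rough illustration of what such a plug-in does behind the scenes, here’s a hypothetical Apache rewrite rule that gives a database-driven page a single clean URL (the script and parameter names are made up):

RewriteEngine On
# Serve the clean URL /products/42 from the real database script,
# so visitors and bots only ever see one URL per page
RewriteRule ^products/([0-9]+)$ /index.php?page=product&id=$1 [L]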

 

Broken Links

Broken links are a big no-no in SEO.

When a bot crawling your site hits a broken link, the search engine will lower the value of your site, and the bot is likely to stop crawling at that point.

To find broken links, you can use a free program called “Xenu’s Link Sleuth”, available here:

  • http://download.cnet.com/Xenu-s-Link-Sleuth/3000-10248_4-10020826.html

Xenu makes a list of all the broken links it finds on your website.

With the list of broken links, you can then go through your site and fix all the issues.
