A few months ago I took a position at a company where my job was to help Account Managers optimize their clients' websites. I've been building web pages for years and have been working with PHP for the last four. Just last year I started working with ASP.NET 2.0, and I must say that I do like ASP.NET for the sole reason that it's extremely easy to pound out a good-looking website in little time. My skills as a PHP and ASP.NET programmer have helped a lot, and while helping with these optimizations I've learned a great deal about what goes into increasing rankings in the major search engines.
Keywords play an important part when a person is searching for something. The best way to search for anything is to type in the words unique to what you're looking for, so when people build a website they research the specific keywords that describe what the site is about. Some tools used to determine which keywords to go after are:
• GoogSpy
• SEODigger
• Compete
• KeyCompete
• Wordtracker
• Keyword Discovery
Content
Having unique content is not only common sense, it's essential for your site. You should also not repeat your content: search engines will see repeated content as spam, and if a search engine sees a page as spam it will decrease your ranking. There is a tool called Copyscape that checks for duplicate content on the web; just go to www.copyscape.com and type in the content you are looking for. A site with dynamic content that is always up to date will get high rankings. Using a blog, bulletin board, or user-written reviews will help your site rank higher for specific keywords.
One method of keeping your site up to date is importing an RSS feed that is directly related to your site's topic. Remember, though, that duplicate content will get you penalized, so in addition to the imported RSS feeds, post a news article of your own once in a while.
Some content aggregation systems are:
• Google News
• Yahoo News
• YelloBrix
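As a sketch of what importing a feed involves, the snippet below parses an RSS 2.0 document and pulls out the headlines and links. The feed content here is a made-up stand-in; in practice you would fetch a real feed related to your site with `urllib` and render the items into your page template.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 document standing in for a real feed
# (in practice you would fetch this over HTTP from an aggregator).
RSS_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>First headline</title><link>http://example.com/1</link></item>
    <item><title>Second headline</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def feed_headlines(rss_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 feed."""
    root = ET.fromstring(rss_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(feed_headlines(RSS_XML))
```

Rendering only the titles and linking out to the source articles, rather than copying the full text, also helps you sidestep the duplicate-content penalty mentioned above.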
As search engines get more advanced they read not only HTML files but also MS Word, MS Excel, MS PowerPoint, Rich Text, and PDF files. It would be wise to optimize those files for keywords as well.
• Unique content - This is just common sense. You don't want to keep saying the same thing over and over again; if you keep replicating it, you're not going to advance in the ranks at all because the search engines will catch on to your little plot.
• Content that is always up to date and changes often gets higher rankings. A blog or bulletin board helps, as it lets people interact with your site, ultimately changing its content.
Linking
• Having links from other sites is a good thing, as it shows that you have relevant content, especially if the sites linking to you are related to the target keywords. Links inside user comments neither count toward nor devalue your ranking.
• Having a site map helps the search engine pick up your site. It's a central location from which all pages can be indexed and made searchable.
• Meta descriptions should be used. Sometimes the search engine uses the description in the search results.
• The meta keywords tag isn't used as much as it was in the early days of the internet, but it's still a good idea to use it to point the search engine at the specific keywords on the page.
• Create a company profile and Privacy page to help build trust with the users.
• The higher links appear on the page, the more they affect your ranking.
• Links in blocks of content are more relevant than individual links.
• Avoid using images when possible. If you have to use an image, use the alt property.
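When an image link is unavoidable, descriptive alt text gives the search engine something readable to index in place of the image. A quick illustration (the filenames and wording here are made up):

```html
<!-- The alt text stands in for the image when the crawler
     indexes this link, so make it describe the target page. -->
<a href="widgets.html">
  <img src="blue-widget.jpg" alt="Blue widgets for sale" />
</a>
```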
File Name
• Hyphens and underscores are treated as spaces in the file name. No more than 2 or 3 such separators should be used.
• File names should be associated with both the title and the keywords.
Title
• The title carries heavy weight, especially when the keywords on the page also appear in the meta description and meta keywords.
• The first 65 characters are used in the search results.
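A quick way to check that limit is to preview how the title would be cut off. This is a small sketch, assuming a 65-character display limit as stated above; the function name is made up:

```python
SERP_TITLE_LIMIT = 65  # characters shown in the search results

def serp_title_preview(title, limit=SERP_TITLE_LIMIT):
    """Return the title as a search engine would display it,
    truncating on a word boundary and appending an ellipsis."""
    if len(title) <= limit:
        return title
    cut = title[:limit].rsplit(" ", 1)[0]
    return cut + "..."

print(serp_title_preview("Short title"))
```

Running your page titles through a check like this makes it obvious when the target keywords fall past the cutoff and never show up in the results page.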
Robots
There is a robots meta tag, which has largely been replaced by robots.txt; however, the tag is still usable, and there are some additional meta tags you can use in conjunction with it. Besides the old <meta name="robots" content="noindex" /> there are googlebot and msnbot variants. In addition to the standard index, noindex, follow, and nofollow values, googlebot offers noarchive and nosnippet, whereas msnbot only offers noarchive. Noarchive tells the search engine not to cache the site; nosnippet tells it not to display any description below the title in the search results.
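The crawler-specific tags described above would sit in the page head like this (the combination of values shown is just for illustration):

```html
<!-- Standard robots meta tag, honored by all major crawlers -->
<meta name="robots" content="noindex, nofollow" />

<!-- Google-specific: noarchive suppresses the cached copy,
     nosnippet suppresses the description under the title -->
<meta name="googlebot" content="noarchive, nosnippet" />

<!-- MSN's crawler only supports noarchive -->
<meta name="msnbot" content="noarchive" />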
Keywords
There's a balance between having too many keywords and not having enough. The rule of thumb is that a word should not repeat more than four to six times per 350 words. Target keywords should be placed in the title, and the title should not include more than 2 keywords or phrases. Webuildpages.com has a keyword density tool that reports each word or phrase, its count, and its density; it can be found at http://www.webuildpages.com/seo-tools/keywords-density. By repeating a keyword too much you stand a good chance of the search engine seeing your page as spam.
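A single-word density check along the lines of that tool can be sketched in a few lines of Python (phrase counting and stop-word filtering, which the real tool also does, are left out):

```python
import re
from collections import Counter

def keyword_density(text):
    """Return (word, count, density %) tuples, most frequent first."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return [(word, count, round(100.0 * count / total, 1))
            for word, count in Counter(words).most_common()]

sample = "blue widgets are great because blue widgets last"
for word, count, density in keyword_density(sample)[:3]:
    print(word, count, density)
```

Checking your copy this way makes it easy to spot a word creeping past the four-to-six-repeats-per-350-words guideline before a search engine flags the page.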
Using keywords is an easy way of getting high rankings in Google, but in the future this tactic may no longer be valid. Latent semantics is the use of words that are associated with other words; for example, Apple would return iPod, Mac, and iMac, while Microsoft would return Windows, Zune, and Media Player. Latent semantic indexing looks at the whole profile of your site and takes all your links into account. If your links all relate to one specific topic, you will be penalized for not looking natural.
• Should not have more than 10 key words or phrases.
• Keywords or phrases should not repeat or else a penalty will be given.
Description
• Description is sometimes used in the search results.
• Include keywords in the description.
• 255 characters max.
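Putting the title, description, and keyword rules together, a page head might look like this (the widget-shop content is invented purely for illustration):

```html
<head>
  <!-- Title: under 65 characters, leading with the target phrase -->
  <title>Blue Widgets - Widget Shop Example</title>
  <!-- Description: under 255 characters, sometimes shown in results -->
  <meta name="description"
        content="Widget Shop Example sells durable blue widgets with free shipping." />
  <!-- Keywords: a handful of phrases, none repeated -->
  <meta name="keywords" content="blue widgets, widget shop" />
</head>
```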
Link Building
• Link to the home page from each page.
• Don't dilute the theme of the site by linking to other pages that aren't related. If you need to, use the 'nofollow' rel attribute.
<a href="unrelated.html" rel="nofollow">
If it's an external link
<a href="http://www.otherdomain.com/page.html" rel="external nofollow">
• Avoid broken links: http://home.snafu.de/tilman/xennlink.html
• Google asks that any paid links carry the nofollow attribute.
Site Maps
• Search engines reward sites that have site maps; xml-sitemaps.com can generate one for you.
• Place a link to the site map on the home page.
• The file name should be either sitemap.html or sitemap.php.
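Alongside the HTML site map for visitors, an XML sitemap in the sitemaps.org format (the kind xml-sitemaps.com generates) can be submitted to the search engines directly. A minimal example, with made-up URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/sports.html</loc>
  </url>
</urlset>
```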
Robots.txt
• mcanerin.com/en/search-engine/robots-txt.asp
• User-agent: *
Disallow: /images/
Disallow: /directory/file.htm
Sitemap: http://www.example.com/sitemap.html
http://www.htaccesstools.com
Re-writing URLs
• Rewrite an old page to a new one
RewriteEngine On
RewriteRule ^old\.html$ /new.html [R=301,L]
• Rewrite dynamic page
RewriteRule ^([^/]*)\.html$ /viewproducts.php?category=$1 [L]
Will internally map the friendly URL www.example.com/sports.html to www.example.com/viewproducts.php?category=sports, so your links can use sports.html instead of the query string.
Redirecting Non-WWW to WWW
If search engines crawl both example.com and www.example.com, this can be seen as duplicate content, and if other sites link to both example.com and www.example.com then the links are spread out over two different entities. This is referred to as the non-www/www canonical issue, and it can be solved with a simple redirect.
Rewrites non-www to www
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.example\.com$
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
Rewrites www to non-www
RewriteEngine On
RewriteCond %{HTTP_HOST} !^example\.com$
RewriteRule (.*) http://example.com/$1 [R=301,L]