Top 10 SEO Tips for Every Website: Jordan Kasteler

1. Canonical Domain Issues

2. Keyword Research

3. Title Tag Optimization

4. Meta Data

5. Keyword Density

6. On-page Keyword Usage (ALT attribute, H tags, body copy, meta data, title, anchor text, images, file/folder names)

7. Sitemaps

8. Crawlability Issues

9. Code Optimization

10. Internal/External Links

1. Canonical Domain Issues – Typically a domain can be reached through several different versions such as: http://www.domain.com, domain.com, http://www.domain.com/index.( html | htm | php | asp | jsp | etc). Google can read your WWW and non-WWW versions as two different sites. The bad thing about this is that it triggers a duplicate content filter, and Google then chooses to index only one version. Let’s say Google indexed your non-WWW version, and all of your backlinks went primarily to your WWW version. Those backlinks are now wasted because their value isn’t being passed to the indexed version.

Solution: 301 redirects with a .htaccess file if your site is hosted on an Apache server. A 301 redirect is a “permanent move” of a Webpage. The code below tells bots to redirect the non-WWW, index.html, index.htm, and/or index.php versions to the http://www.domain.com version. Place the following code in a file called .htaccess in the root directory of your site:

Options +FollowSymLinks
RewriteEngine on

# Redirect the non-WWW host to the WWW version
RewriteCond %{HTTP_HOST} ^domain\.com [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

# Redirect /index.html, /index.htm, and /index.php requests to the folder root
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+/)*index\.(html?|php)\ HTTP/
RewriteRule ^(([^/]+/)*)index\.(html?|php)$ http://www.domain.com/$1 [R=301,L]

If you’re using an IIS server instead of Apache then do this:

* Open Internet Services Manager, right click on the file or folder you wish to redirect

* Select “a redirection to a URL”

* Enter the URL to redirect to

* Check “exact URL entered above” and “A permanent redirection for this resource”

* Then “Apply” those settings.

2. Keyword Research – There are a lot of great resources out there for keyword research that I’d like to share. One of my favorites is by Aaron Wall, author of SEOBook. His free tool is a compilation of many different keyword research sources run together to help you find what people are searching for. It can be found at SEOBook.

Overture.com, now part of Yahoo!, has long offered a free keyword selection tool: the Overture Keyword Suggestion Tool.

Keyworddiscovery.com and Wordtracker.com have always been the two biggest paid ones. However, Keyword Discovery is currently offering a free keyword search tool. There was a good article in Search Marketing Standard comparing the pros and cons of each. Unfortunately, I have not used Keyword Discovery, but I have been very pleased with Wordtracker. It’s also good for finding those long-tail keywords. In case you are wondering, the long-tail is the longer, 5-7 word keyphrases that are less popular. A lot of marketers have found it very beneficial to target a large number of these longer phrases rather than the few extremely popular ones. I will blog more about this another time.

Incorporating related words and phrasing around your targeted keyword is very important to spiders, especially in relation to LSI/A and PaIR. I will blog more about LSI/A and PaIR later, but for now just know that your content should have a related theme to your keyword, and not just be filled with “happy text”. MSN recently released adCenter Labs. There is a tool there called “Search Result Clustering” which is great for finding keywords related to your main keyword. The caveat is that their data is based on their own top 10 results for your keyword, and MSN ranks sites differently than Google: MSN results tend to favor on-page SEO and Google tends to favor off-page SEO.

3. Title Tag Optimization – Titles should stick to around 5-7 words. A lot of people will say that it’s not how many words but how many characters; I’ve found word count to be more important. Your title should have your main keyword and then your branding, typically.

For example: Utah Search Engine Optimization | Utah’s SEO Pro

I’ve managed to stick to 7 words, and incorporate my main keywords. Fortunately for me, my branding, “Utah’s SEO Pro”, is also related to my main keyphrase “Utah Search Engine Optimization”. Keywords and branding should be separated in titles with either a dash – or a pipe |. I lean towards pipes since dashes are also used in breaking up words.
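
In the markup, that title is simply:

<title>Utah Search Engine Optimization | Utah's SEO Pro</title>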

4. Meta Data – How many times have you heard someone claiming they know SEO because they know how to construct meta data? I know I have, many times. Meta data isn’t half as important as it used to be. I’ve read recently that MSN is the only one who cares about the meta description tag anymore. Even if they don’t matter much, they’re still good to have because some search engines still use them. Also, the meta description is used as the descriptive text in your search engine listing. Here’s an example of proper meta data:
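
Something along these lines, where the description and keywords are placeholder copy built around the page’s main keyphrase:

<!-- the description and keyword phrases below are placeholders for illustration -->
<meta name="description" content="Utah SEO Pro offers search engine optimization tips, tools, and consulting for businesses in Utah.">
<meta name="keywords" content="utah search engine optimization, utah seo, seo consulting, seo tips">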

Here are some rules to follow for your meta keywords tag:

* List keywords in order of most important to least important

* Keep keywords around 250 characters or below. MSN accepts a maximum of 1024.

* Be careful of repeating a keyphrase more than 6 or 7 times

* Don’t use keywords in your meta data that cannot be found on your page’s content

5. Keyword Density – It used to be that people kept a very close watch on the density of keywords in their body content. Typically they kept their keyword density around 2-7%. Nowadays there is very little weight on the keyword density of your content as long as it’s not overdone or unnatural-looking. The keyword density I’m referring to is in your meta data and title tag. I’m not going to give you an exact rule to follow; just use common sense for this. Don’t have your title, meta keywords, and meta description all be just your main keyphrase. That’s a 100% density and will raise a red flag.

6. On-page Keyword Usage – There is such a thing as over-optimization if you abuse the usage of your main keyphrase. The places you do want to have your keywords and keyphrases are:

* ALT attributes

* H tags

* Body copy

* Meta data

* Title

* Anchor text

* Images, file, and folder names

ALT attributes can be added to your image tags. They tell spiders and screen readers what that image is about. If possible, use your main keywords to describe the image.
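
For example (the filename and description here are only placeholders):

<!-- placeholder filename and alt text -->
<img src="utah-search-engine-optimization.jpg" alt="Utah search engine optimization consultant reviewing a site audit">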

H tags should be set up as a hierarchical tree of headings. Your page’s main header, describing the page’s content, should be an H1 tag. The H1 tag is the most important and should include your main keywords or keyphrase. Secondary headers should be H2s, tertiary headers H3s, and so on.
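
A sketch of that hierarchy, using placeholder headings:

<h1>Utah Search Engine Optimization</h1>
  <h2>On-page Optimization Services</h2>
    <h3>Title Tag Optimization</h3>
  <h2>Link Building Services</h2>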

Body copy should have your main keyphrases used throughout. It’s most important to have them in the first 200 and last 200 words of the content. Extra weight is given to them if they are bolded or italicized. This is an old technique that many SEOs cringe at using, but search engines, especially MSN, still give good weight to it. Like I said before, there is such a thing as over-optimization. I wouldn’t bold the main keywords more than 3 times in a typical 750-word page.

Meta data – We’ve discussed this already, but to recap: don’t overdo it. Don’t repeat a keyphrase more than 6 or 7 times, keep meta keywords to 1024 characters or preferably 250 or below, and don’t add keywords that can’t be found in your content.

Title – We’ve discussed this as well. Don’t go too far past 7 words and don’t repeat your keywords in your title.

Anchor text is very important for internal and external links. Your anchor text tells crawlers and users what that link is about. It is important to use your main keywords in the anchor text of a link to give that particular page or site good ranking for that keyword or keyphrase. When getting backlinks from other people, try to vary your anchor text around your main keyphrase. Too many backlinks with the exact same keyphrase, especially in a short amount of time, will trigger Google’s ‘Google Bomb’ filter.
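
For instance, a link that uses a descriptive keyphrase rather than “click here” (the URL and phrase below are placeholders):

<!-- placeholder URL and anchor phrase -->
<a href="http://www.domain.com/utah-seo-services.html">Utah SEO services</a>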

Images, files, and folder names should also be focused around your main keyphrase. Use dashes (-) when separating words in your naming convention. Dashes tell search engines that the word is separated. Underscores (_) tell search engines to read it as a whole. For example:

Utah-SEO-Pro.jpg will tell a search engine to read “Utah SEO Pro” as the name of that image.

Utah_SEO_Pro.jpg will tell a search engine to read “UtahSEOPro” as the name of the image.

Do you see where your keyphrase can be hurt by using underscores?

7. Sitemaps – Sitemaps are important to search engines and users. My favorite sitemap tool is http://www.xml-sitemaps.com. It will create an XML, ROR, and HTML sitemap for your Website. In addition, it will create a URL list of all your pages to submit to Yahoo! for indexing, which can be submitted here: http://search.yahoo.com/info/submit.html. The XML map is made especially for Google and can be submitted to their Webmaster tools here: http://www.google.com/webmasters/sitemaps/. This is important in telling Google how to crawl your site. MSN, Google, and Yahoo! teamed up to create one generalized protocol for XML sitemaps at http://www.sitemaps.org. It’s nice to see them playing together on this issue.
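
A minimal XML sitemap in that shared protocol looks something like this (the URLs, date, and frequency/priority values are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- placeholder URLs and values; list one <url> entry per page -->
  <url>
    <loc>http://www.domain.com/</loc>
    <lastmod>2007-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.domain.com/folder/1234.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>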

8. Crawlability Issues – Never hide your navigation behind Flash, JavaScript, or images. Always use text for search engines to crawl your navigation. CSS gives your text-based navigation the power of styling similar to using images and the power of rollovers similar to JavaScript menus. Spiders have a hard time crawling Flash, an almost impossible time with JavaScript, and they can’t read text that is made from an image (although Google’s new vector-reading patent on images might change that in the future). If you are insistent on having a Flash menu or image-based menu then I suggest creating a text-equivalent alternative in the footer of your site.
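
A crawlable text navigation can be as simple as an HTML list styled with CSS (the id and URLs below are placeholders):

<ul id="nav"> <!-- placeholder id and URLs; style with CSS instead of images or JavaScript -->
  <li><a href="/index.html">Home</a></li>
  <li><a href="/utah-seo-services.html">Utah SEO Services</a></li>
  <li><a href="/contact.html">Contact</a></li>
</ul>

A CSS rule such as #nav a:hover { background: #eee; } then gives you the rollover effect without any JavaScript.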

If you have a dynamic site then your pages might not be crawlable if they have many parameters. If your URL looked like this for example:

http://www.domain.com/folder/index.php?var=1234&sort=date

Google can typically crawl up to 2 or 3 parameters on your site. It’s best not to risk it, though. If you have more than 2 or 3 parameters you definitely need to use mod_rewrite.

Solution:

Mod_rewrite. Many tools can be found on the internet, such as this one: http://www.linkvendor.com/seo-tools/url-rewrite.html. In a .htaccess file you can tell your Apache server to rewrite your parameter-based URL so it appears as a static URL such as http://www.domain.com/folder/1234.html instead of http://www.domain.com/folder/index.php?var=1234&sort=date.
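
A minimal sketch of that rewrite in .htaccess, assuming the folder, script, and parameter names from the example URL above:

RewriteEngine on
# Map the static-looking URL to the underlying dynamic script
# ("folder", "index.php", "var", and "sort" are placeholders taken from the example above)
RewriteRule ^folder/([0-9]+)\.html$ /folder/index.php?var=$1&sort=date [L]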

9. Code Optimization – CSS layouts have many benefits over table layouts. They lessen your code-to-text ratio and allow spiders to access your content more easily without sifting through tons of nested markup. Keep your CSS and JavaScript out of the head of your document and in external files. This reduces your on-page markup and your index page’s file size. It also gives a slightly quicker load time. You can use PHP to combine your scripts into one file so the browser only makes one call to the server. It can also strip out empty space and comments to condense file size. To learn how to do that visit: http://www.ejeliot.com/blog/72.
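
Referencing external files looks like this (the filenames are placeholders):

<head>
  <title>Utah Search Engine Optimization | Utah's SEO Pro</title>
  <!-- styles.css and scripts.js are placeholder filenames -->
  <link rel="stylesheet" type="text/css" href="styles.css">
  <script type="text/javascript" src="scripts.js"></script>
</head>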

For table data, use the “summary” attribute to tell screen readers (and search engines) what the content in that table is about. Also, use the ALT attribute to describe images. “Longdesc” is a good attribute if you have a longer description of something such as a Flash movie; you can store the description in a completely separate HTML file that it will read from.
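
For example (the filenames, descriptions, and table values are placeholders):

<!-- placeholder data and filenames -->
<table summary="Monthly search traffic broken down by keyword and engine">
  <tr><th>Keyword</th><th>Google</th><th>Yahoo!</th></tr>
  <tr><td>utah seo</td><td>1,200</td><td>800</td></tr>
</table>

<img src="flash-demo.jpg" alt="Screenshot of the Flash demo" longdesc="flash-demo-description.html">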

It’s good practice to keep your code validated. The World Wide Web Consortium offers great code validation tools for XHTML and CSS. Although Matt Cutts claimed that Google doesn’t care whether your code validates, other crawlers might. If there’s a code conflict it can stop a spider dead in its tracks, and it may not be able to fully crawl your site.

10. Internal/External Links – Receiving external links from other sites is the #1 item of importance for your rankings. If “content is king” then “linking is queen”. Just keep in mind that pages that link to you should be related to your content. One-way links from sites carry far greater weight than links that are reciprocated; reciprocal links are a dying practice. Try to keep a good ratio of top-level URL links and deep links. This means don’t have all of your links going to http://www.domain.com. Get some links going to the most prominent pages of your Website (i.e. http://www.domain.com/page.html). The reason is that it looks more natural to search engines when someone links to something specific, like an internal page, rather than something general like your main domain. There are many variables that determine the weight of a link, such as:

* The content of the page that’s linking to you

* Where on the page your link is (within the body content is best)

* If it’s a directory listing, then the higher on the page the better

* The anchor text of the link (it should include your keywords)

* The number of outgoing links on the page that’s linking to you

* The relevance and authority of the page that’s linking to you (PageRank is a small determination of that)

* Whether or not the site linking to you is blacklisted in search engines

Internal links help users find things throughout your site, and help search engines crawl pages better. It’s good information architecture practice to link to related pages within your content. This will also help pass PageRank throughout your site.

10 SEO Tips to Remember While Launching a Site: Search Engine Journal

1.) Move JavaScript and CSS Off Page

Moving CSS and JS files off the page does two things. First, it creates cleaner code that is more easily managed. More importantly though, it frees up space that engines consider to be prime real estate. Let’s say a spider lends preferential treatment to content that appears in the first 20KB of a document. If 15KB at the start of your document is verbose JavaScript and CSS coding, you have created an uphill battle from the start…

2.) Code Validity is Key

Always make sure that the final code of your pages can fully validate according to W3C standards. Failure to validate could create accessibility issues — and the engines simply dislike that. They want to push their users out to complete sites that work for everyone.

3.) Browsable Navigation Links

Encapsulating links to internal pages in Flash or JavaScript is dangerous. While some engines can often find links from inside of these coding blocks, it is not guaranteed. Therefore, it is wise to always have an HTML compliant navigational structure. Examples include footer text links, a DHTML menu, etc.

4.) Use a Structured Content Hierarchy

A theme based approach to optimization is the most successful one. Imagine all of the content on your web site to mimic a family tree. Each layer down, there’s more content that fits the overall theme. By nature, the further you drill down — the more specific your content becomes.

5.) URL Construction & Query String URLs

Query strings in URLs are less of a problem today than they once were. Unfortunately, they can still create issues for some engines, and our goal is to get the most out of every engine. With this in mind, I would recommend that you work with your coding teams to ensure that query strings are kept to a minimum.

6.) Limit Flash Usage

Putting all of your content in a Flash file creates a difficult platform from which to optimize. While it can be done, the results will not come as easily as if Flash were used as a complement to the rest of the page. Thankfully, with CSS and streamlined video on the ‘net, Flash is no longer a necessity. Remember, if you have to use Flash, cut down how much information is in there and find alternative ways to deliver the content.

7.) Natural Keyword Integration

Repeat after me… “I will not stuff pages with keywords!” Like the engines, I’m tired of seeing web sites that would be great if not for their blatant keyword stuffing. Listen up, folks… keyword density and repetition are a thing of the past. Engines care more about off-page SEO now, and you need to write clear and concise content that addresses the user. Engines are attuned to what makes sense contextually… Don’t try to pull the wool over their eyes.

8.) Local Information Integration

Sounds all technical and precise, but it’s quite simple. If you sell antiques in Tampa, Florida, then include that in your site. How? List (in HTML-formatted text, of course) where you are located. Include a link to Google (or Yahoo!) Maps to help hammer home the point. Search is becoming more focused on users at a local level, so building sites with this in mind should be a given.
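
Something as simple as this works (the business name and address below are made up for illustration):

<p> <!-- placeholder business name and address -->
  Tampa Bay Antiques<br>
  123 Main Street, Tampa, Florida 33601<br>
  <a href="http://maps.google.com/maps?q=123+Main+Street,+Tampa,+FL+33601">Map and directions</a>
</p>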

9.) Avoid Duplicate Content

This is pretty self-explanatory, and it’s an SEO principle that has been hammered home many times. Why keep on hammering? Because it’s that important! Be sure that you don’t get lazy and copy content from one page to the next. Each page should specifically target one major idea, and the text needs to reflect that. Think you’re at risk? There are free tools that will show you the percentage of similarity between any two pages.

10.) Launch With the Proper Foundation

Is your new site equipped with a robots.txt file? An XML sitemap? RSS feeds? (A minimal robots.txt sketch follows the checklist below.) Before you launch any new web site you need to run a full QA test to ensure that…

• all pages load properly

• no browser compatibility issues exist

• SEO elements (titles, meta tags, alt tags, etc.) are in place

• spiders can discover all pages

• robots.txt validates

• sitemap.xml(.gz) works
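
A minimal robots.txt for a new launch might look like this; the disallowed path is a placeholder, and the Sitemap line simply points crawlers at your XML sitemap:

# /admin/ is a placeholder path; adjust to whatever you actually want kept out of the index
User-agent: *
Disallow: /admin/
Sitemap: http://www.domain.com/sitemap.xml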

Doing this will really cut down on potential errors out of the gate, and will put you in a position to succeed.