After primarily working with Drupal 5, I wanted to update my SEO skills for Drupal 6 and went through a bunch of tutorials to sum up all the steps and modules required to make a Drupal 6 site ready to be found by all major search engines.
- The first and most important requirement is enabling clean URLs. That option can be found at admin/settings/clean-urls.
- Secondly, install and configure Pathauto and Token. Pathauto automatically creates nice-looking paths when you create nodes. Token isn’t strictly required, but it’s a nice addition that lets you use variables (tokens) in your path patterns.
Once you’ve configured Pathauto and are satisfied with how your URLs look, set it up so that URLs don’t change if you later decide to change the title of a node. You can find that setting under /admin/build/path/pathauto under General Settings / Update Action.
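As an illustration, a node pattern on the Pathauto settings page might look like one of the following (the exact token names available depend on your Token module version, so treat these as examples rather than guaranteed defaults):

```
content/[title-raw]
blog/[yyyy]/[mm]/[title-raw]
```

The `-raw` variants insert the unfiltered value, which is what you generally want in a URL path.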
- Next, download and configure the Meta Tags module. This module allows you to configure global and node-specific description and keyword meta tags. Note that the Meta Tags module’s project name is actually nodewords.
Note: I’m using FCKeditor 6.x-1.3-rc3 and nodewords 6.x-1.0-rc1, which don’t play well together when creating a new node. My workaround was to disable FCKeditor on the “New Story” page completely and open it in a popup instead.
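For reference, the end result of configuring nodewords is a set of tags in your page’s `<head>` section; the content values below are just placeholders:

```html
<meta name="description" content="A short summary of this page for search results." />
<meta name="keywords" content="drupal, seo, pathauto" />
```

The description is what search engines typically display as the snippet under your page title in results.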
- Then, install the Global Redirect module. This module makes sure that trailing slashes at the end of URLs are removed. More importantly, it fixes the problem that every node is accessible both via its URL alias and via its internal node/123-style path. Search engines can treat those as duplicate content, which may decrease your ranking.
- Install the Page Title module. This module allows you to set the page titles of nodes manually. In the eyes of a search engine, the page title is one of the more relevant signals, and a well-crafted one can improve your ranking.
- Modify your .htaccess file to always redirect to www.yourdomain.com, even if visitors arrive at yourdomain.com. The necessary rewrite rules already ship commented out in Drupal’s .htaccess file; you just need to adjust them with your domain name and uncomment them.
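Once uncommented and adjusted, those rules look roughly like this (yourdomain.com is a placeholder for your actual domain):

```apache
RewriteEngine on
# Redirect bare-domain requests to the www. host with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]
```

The 301 status tells search engines the move is permanent, so ranking signals consolidate on the www variant instead of being split across two hostnames.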
- Optimize Drupal’s robots.txt file. This file defines which folders of your site search engine crawlers may access. Here’s a great tutorial with several modification suggestions: http://tips.webdesign10.com/robots-txt-and-drupal. Once you’re done, upload your robots.txt file and verify that it’s structured correctly with a robots.txt validator.
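A trimmed sketch of what such a file might contain; the specific paths are illustrative, and Drupal’s stock robots.txt already covers several of them:

```
User-agent: *
# keep crawlers out of Drupal's internal directories
Disallow: /includes/
Disallow: /misc/
Disallow: /modules/
# block the non-clean-URL duplicates of pages that have aliases
Disallow: /?q=
```

Blocking the `?q=` form complements Global Redirect: both aim to present each page to crawlers under exactly one URL.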
- Finally, make sure your theme is XHTML compatible and that your DOCTYPE declaration is suitable for your needs. I decided to make danielhanold.com compatible with XHTML 1.0 Transitional, as the Strict declaration doesn’t allow the target attribute (e.g. target="_blank"). You can test the validity of your document with the W3C validation tool.
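For reference, this is the XHTML 1.0 Transitional DOCTYPE I’m referring to; it goes at the very top of your theme’s page template:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
```

Under the Transitional DTD, links like `<a href="..." target="_blank">` validate; under Strict they don’t.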
There are quite a few steps involved in getting your content indexed well by search engines. However, if you’re already spending all this time creating great content, I think it’s worth the extra effort to optimize your site so it gets indexed the best it can.