Friday, September 26, 2008

Building a site map in Ruby on Rails...

I needed an automated sitemap process for the www.cadechristian.com website, and here is what I came up with...
config/routes.rb:
map.sitemap 'sitemap.xml', :controller => 'sitemap', :action => 'sitemap'
robots.txt:
Sitemap: http://www.cadechristian.com/sitemap.xml

sitemap controller:
def sitemap
  @model1 = Page.find(:all)
  @model2 = Page.find(:all)
  render :layout => false
end

/app/views/sitemap/sitemap.builder:
@home_url = 'http://www.cadechristian.com'

xml.instruct!
xml.urlset :xmlns => 'http://www.sitemaps.org/schemas/sitemap/0.9' do

  # Add static URLs
  %w( / /url1 /url2 /url3 ).each do |url|
    xml.url do
      xml.loc @home_url + url
      xml.lastmod Time.now.xmlschema
      xml.priority 1.0
    end
  end

  @model1.each do |model|
    xml.url do
      xml.loc @home_url + model_path(model)
      xml.lastmod model.updated_at.xmlschema
      xml.priority 0.9
    end
  end

  @model2.each do |model|
    xml.url do
      xml.loc @home_url + model_path(model)
      xml.lastmod model.updated_at.xmlschema
      xml.priority 0.9
    end
  end
end
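The lastmod calls in the builder rely on Time#xmlschema, which emits the W3C datetime format the sitemap protocol expects. A quick standalone check of what that format looks like (the date here is arbitrary):

```ruby
require 'time' # Time#xmlschema comes from the stdlib 'time' library

# The sitemap protocol's <lastmod> element wants W3C datetime format,
# which Time#xmlschema produces directly.
stamp = Time.utc(2008, 9, 26, 12, 0, 0).xmlschema
puts stamp # => "2008-09-26T12:00:00Z"
```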



That removes the need to update my sitemap.xml file by hand when we add products and other resources :)
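For reference, the generated sitemap.xml comes out looking roughly like this (URLs and dates are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.cadechristian.com/</loc>
    <lastmod>2008-09-26T12:00:00Z</lastmod>
    <priority>1.0</priority>
  </url>
  <!-- ...one <url> entry per static path and per model record... -->
</urlset>
```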

You could also create a Ruby script or a post-deploy hook that runs this code:

require 'net/http'
require 'uri'

sitemap_uri = 'www.cadechristian.com/sitemap.xml'
escaped_sitemap_uri = URI.escape(sitemap_uri)

# Ping each engine's sitemap submission endpoint
%w( submissions.ask.com www.google.com ).each do |engine|
  Net::HTTP.get(engine, '/ping?sitemap=' + escaped_sitemap_uri)
end
# Live Search uses a slightly different path and parameter name
Net::HTTP.get('webmaster.live.com', '/ping.aspx?siteMap=' + escaped_sitemap_uri)

This pings the search engines, and you are done...
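One caveat: URI.escape leaves characters like '/' unescaped, so if you want the sitemap URI fully percent-encoded inside the query string, CGI.escape is an alternative. A small sketch (the ping_path helper name is made up for illustration):

```ruby
require 'cgi'

# Hypothetical helper: builds the path a search engine's ping endpoint
# receives, with the sitemap URI percent-encoded as a query value.
def ping_path(sitemap_uri)
  '/ping?sitemap=' + CGI.escape(sitemap_uri)
end

puts ping_path('www.cadechristian.com/sitemap.xml')
# => "/ping?sitemap=www.cadechristian.com%2Fsitemap.xml"
```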
