Favicons and robots

There are certain files that various web agents expect to find at predefined URLs on your server. Drupal provides some of these files at fixed locations within the Drupal distribution. That is, these files are actual files stored on the filesystem, not paths whose content Drupal generates. This can cause issues when working with multi-site Drupal instances, since each site often needs its own copy of these files. In this section we will look at modules that solve this problem for two specific types of file: favicons and robots files.

Favicons ("favorite icons"), the little images that show up on browser tabs and in bookmarks, are one example. Some web agents assume these files will be accessible at your site's root with the name favicon.ico, for example http://example.com/favicon.ico.

Note: Drupal allows you to specify a favicon in a theme's settings, but it relies on the user agent to discover where the favicon is located (by reading a <link> tag in the page's HTML). It does not create a file at the path /favicon.ico.

In a multi-site configuration, we want Drupal to provide a different favicon for each site, and we want that file to be served at the expected URL. This is best accomplished with the Favicon module (http://drupal.org/project/favicon). It allows each site to declare its own favicon, and each site will answer requests for it at the canonical /favicon.ico URL.

Some web agents, notably search engine crawlers, look for a file called robots.txt. Search crawlers expect this file to tell them which parts of the site should and should not be indexed. The contents of this file can figure prominently in search engine optimization, and a well-configured robots file can substantially improve a site's appearance in search results on the likes of Google and Bing.

By default, Drupal ships a single robots.txt file that is served for every site in a multi-site installation. But many multi-site configurations need something more flexible: URL patterns are likely to differ from site to site, and a one-size-fits-all robots file is untenable. Each site may need to provide its own robots file. For this reason, there is a contributed module called RobotsTxt (http://drupal.org/project/robotstxt) that provides an administrative interface on each site, which you can use to set site-specific robots directives (a sketch of typical directives appears at the end of this section).

Tip: Use this on single-site installations, too! Since Drupal core comes with a robots.txt file, each upgrade will overwrite any changes you have made to that file. To avoid a search crawler catastrophe, you may find it better to install the RobotsTxt module and store your robots configuration in the database.

Similar Drupal modules exist for other resources that web agents expect to find at specific paths. The XML sitemap module (http://drupal.org/project/xmlsitemap) is another example. It allows each site to define its own rules for generating the sitemap files that Google, Bing, and other search engines use to discover the content on your site.
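For reference, a sitemap is just an XML document that follows the sitemaps.org protocol, typically served at a path such as /sitemap.xml. The snippet below is an illustrative sketch of what such a file looks like; the URLs, change frequencies, and priorities are hypothetical values, not output from any particular Drupal site.

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://example.com/</loc>
      <changefreq>daily</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://example.com/about</loc>
      <changefreq>monthly</changefreq>
      <priority>0.5</priority>
    </url>
  </urlset>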
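To make the robots discussion concrete, here is a minimal sketch of the kind of site-specific directives one site in a multi-site installation might enter through the RobotsTxt module's administrative interface. The paths are hypothetical examples chosen for illustration, not recommendations, and Drupal's stock robots.txt is considerably longer.

  # Directives for http://example.com (one site in the multi-site)
  User-agent: *
  # Keep crawlers out of administrative pages
  Disallow: /admin/
  # Hypothetical: no value in indexing the login form
  Disallow: /user/login
  # Hypothetical: avoid crawling internal search result pages
  Disallow: /search/
  Sitemap: http://example.com/sitemap.xml

Because each site stores its own directives in its own database, the other sites in the installation can answer requests for /robots.txt with entirely different rules.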