The minimum response time metric (rather than, say, the average response time) does a good job of describing whether a web site feels slow or snappy. Below are two pictures that illustrate how the mighty Drupal 6 is inferior in this respect to the much simpler Website Baker CMS. The measurements were taken on a typical web page filled with some images and a moderate amount of text in the case of Website Baker, and on an almost empty page in the case of Drupal 6 with the Ubercart (shopping cart) suite of modules installed (and no caching). KCachegrind was used for visualization:
Also compare the number of "boxes" in the two pictures; each box represents a nested procedure call.
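Collecting the minimum itself requires nothing fancy: request the same page repeatedly and keep the fastest time. A minimal sketch in PHP (the URL and sample count are placeholders, not the actual test setup):

```php
<?php
// Fetch the same page repeatedly and keep the fastest time. Taking the
// minimum over many runs filters out scheduler, disk and network noise,
// leaving a lower bound that reflects the CMS itself.
$url     = 'http://localhost/';  // page under test (placeholder)
$samples = 20;

$min = INF;
for ($i = 0; $i < $samples; $i++) {
    $start = microtime(true);
    file_get_contents($url);     // full page fetch, including PHP rendering
    $min = min($min, microtime(true) - $start);
}

printf("minimum response time: %.1f ms\n", $min * 1000);
```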
Does this mean that, given a choice, you should use Website Baker instead of Drupal? The answer obviously depends on the required features. The main insight here is that this overhead is very unlikely to go away in the future. It arises from Drupal's overall architecture, which involves interactions among many small modules, most of them contributed by third parties and neither explicitly tested nor certified for performance. While offering lots of functionality and extensibility out of the box, Drupal places a heavy quality-assurance burden on the application designer, with performance being just one aspect of the challenge.
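To see why the overhead is architectural, consider the hook-dispatch pattern at the heart of Drupal's module system. The sketch below is a simplified reconstruction of what Drupal 6's module_invoke_all() does, not the actual core code:

```php
<?php
// Simplified reconstruction of Drupal-style hook dispatch. For every hook
// fired during a request, every enabled module is probed for a matching
// function; dozens of hooks times dozens of modules adds up quickly.
function invoke_all($hook, array $modules)
{
    $args    = array_slice(func_get_args(), 2);  // arguments passed to the hook
    $results = array();
    foreach ($modules as $module) {
        $function = $module . '_' . $hook;       // e.g. ubercart_init()
        if (function_exists($function)) {        // probe each module in turn
            $result = call_user_func_array($function, $args);
            if (is_array($result)) {
                $results = array_merge($results, $result);
            } elseif (isset($result)) {
                $results[] = $result;
            }
        }
    }
    return $results;
}
```

No single call here is expensive; the cost is the sheer number of such calls, which is exactly why the Drupal call graph above is so densely populated with boxes.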
Update 2011/01: the latency of rendering simple pages apparently got even worse in Drupal 7 (see the comments to this article). I'm aware that it is rather easy to boost Drupal's performance for anonymous, non-targeted content by several orders of magnitude simply by putting a reverse proxy cache in front of it, thereby reducing the response time to little more than network latency. However, I doubt that an average small Drupal site would be able to configure and deploy such optimizations.
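For completeness: the reverse-proxy approach works because anonymous pages can be served with caching headers that a proxy such as Varnish or Squid will honor. A minimal sketch of the application side (the five-minute lifetime and the cookie check are illustrative assumptions, not a recommended configuration):

```php
<?php
// Mark anonymous responses as cacheable so a reverse proxy in front of PHP
// can answer repeat requests without ever reaching Drupal. The lifetime is
// an arbitrary illustrative value.
if (!isset($_COOKIE[session_name()])) {          // no session cookie => anonymous
    header('Cache-Control: public, max-age=300');
    header('Vary: Cookie');                      // keep logged-in users separate
} else {
    header('Cache-Control: private, no-cache');  // never cache per-user pages
}
```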
Update 2012/03: Drupal's poor performance is further exemplified by this piece of anecdotal evidence: I have a site which spends just 300 ms per request in the database layer and over 6 s on the PHP side, in the theme layer, doing usorts, needlessly shuffling CCK metadata, and so on. There is no single obvious bottleneck that can be identified by profiling; the performance is lost practically "everywhere". The only real option is aggressive, coarse-grained caching, but this is rather hard for sites that contain highly dynamic content and have not been architected from the start to work around Drupal's design deficiencies.
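For the record, "coarse-grained caching" here means caching whole rendered pages rather than individual queries or render fragments. A bare-bones sketch of the pattern, with a hypothetical render_page() standing in for the CMS's page pipeline and placeholder paths and lifetime:

```php
<?php
// Bare-bones whole-page cache: serve a stored copy if it is fresh, otherwise
// render the page once and store the output. Any per-user or highly dynamic
// fragment on the page defeats this scheme, which is exactly the difficulty
// described above.
$key      = '/tmp/pagecache_' . md5($_SERVER['REQUEST_URI']);
$lifetime = 60;                                  // seconds (placeholder)

if (is_file($key) && time() - filemtime($key) < $lifetime) {
    readfile($key);                              // cache hit: Drupal never runs
    exit;
}

ob_start();                                      // cache miss: capture the output
render_page();                                   // hypothetical page pipeline
file_put_contents($key, ob_get_flush());
```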