Synthetic testing is your first answer to the question, “Could they do it?” It’s a reliable, controllable, repeatable measurement that you can use regardless of how much or how little traffic there is on your website. It also has some significant shortcomings.
Synthetic testing services don’t know how much traffic is on your site. A testing service is blissfully unaware of whether your site is handling a deluge of visitors or is so quiet that its servers sit idle. That means it misses an important possible cause of web latency: the more visitors you have, the longer the site takes to respond. Alerts and thresholds in synthetic testing systems can’t take into account how busy the site is.
You should, of course, be concerned if your site becomes slow when nobody’s using it, but a synthetic testing service won’t alert you to that fact as long as the latency stays within acceptable limits. Dynamic baselining, in which the service learns what “normal” latency looks like at a certain time of day, is a rough proxy for load, assuming your website sees similar traffic at the same times each day.
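To make the idea concrete, here is a minimal sketch of dynamic baselining: learn the typical latency for each hour of the day from past measurements, then flag a new measurement that deviates by more than a few standard deviations from that hour’s norm. The class name, threshold, and sample data are all illustrative assumptions, not any particular service’s implementation.

```python
from collections import defaultdict
from statistics import mean, stdev

class HourlyBaseline:
    """Illustrative dynamic baseline: per-hour latency statistics."""

    def __init__(self, threshold_sigmas=3.0):
        self.samples = defaultdict(list)  # hour of day -> latency samples (ms)
        self.threshold = threshold_sigmas

    def record(self, hour, latency_ms):
        self.samples[hour].append(latency_ms)

    def is_anomalous(self, hour, latency_ms):
        history = self.samples[hour]
        if len(history) < 2:
            return False  # not enough history to judge "normal" yet
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return latency_ms != mu
        return abs(latency_ms - mu) > self.threshold * sigma

baseline = HourlyBaseline()
# Simulated history: 3 a.m. is normally fast, 3 p.m. normally slow.
for latency in (120, 130, 125, 118):
    baseline.record(3, latency)
for latency in (480, 510, 495, 505):
    baseline.record(15, latency)

# The same 500 ms response is an anomaly at 3 a.m. but routine at 3 p.m.
print(baseline.is_anomalous(3, 500))   # True
print(baseline.is_anomalous(15, 500))  # False
```

Note the limitation the text describes: the baseline only encodes time of day as a stand-in for load. If traffic patterns shift, say, during a marketing campaign, the learned “normal” no longer reflects the actual load on the site.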
Synthetic tests generate traffic. If you’re running tests on a site, exclude the synthetic tests from overall analytics before you start testing, or your visitor count will be artificially inflated, as shown in Figure 9-31.
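One common way to do this exclusion is to filter synthetic hits out of the log data before computing visitor counts, keying on the User-Agent string most testing services send. The following is a hedged sketch under that assumption; the agent markers and log format here are made up for illustration, not real service identifiers.

```python
# Illustrative markers only -- substitute the User-Agent strings your
# actual synthetic testing service sends.
SYNTHETIC_AGENTS = ("SyntheticMonitor", "UptimeBot")

def is_synthetic(user_agent):
    """True if the request appears to come from a synthetic test."""
    return any(marker in user_agent for marker in SYNTHETIC_AGENTS)

def count_real_visits(log_records):
    """Count visits, skipping synthetic-test hits.

    log_records: iterable of dicts with a 'user_agent' key.
    """
    return sum(1 for rec in log_records if not is_synthetic(rec["user_agent"]))

log = [
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0) Firefox/118.0"},
    {"user_agent": "SyntheticMonitor/2.1 (check-us-east)"},
    {"user_agent": "Mozilla/5.0 (Macintosh) Safari/605.1"},
]
print(count_real_visits(log))  # 2
```

Hosted analytics tools usually offer the same idea as a built-in filter (by IP address or user agent); the point is to apply it before testing begins, so the baseline visitor count is never contaminated.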
Figure 9-31. The sudden drop in traffic on October 27 is the result of excluding synthetic testing ...