Standardization

One of the reasons it’s hard to get a complete picture of your web presence is the lack of standards. Apart from rudimentary log formats (like the Common Log Format, or CLF), some standard terminology, and a few industry scoring initiatives (such as Apdex), there’s no consistent way to collect, aggregate, or report web monitoring data.

This is particularly true for performance monitoring. It wouldn’t take much to improve the state of the art with a few simple additions to the technologies we use today. If, for example, the world’s browser makers decided to cooperate on a single DOM object that marked the final click on the preceding page, it would dramatically improve the usefulness of JavaScript-based RUM. Unfortunately, such standards are woefully absent.
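To see why such a marker matters, consider how JavaScript-based RUM has to approximate it today. Since no browser exposes when the visitor clicked on the previous page, a RUM script typically stashes its own timestamp in a cookie before navigation and reads it back on the next page. The sketch below illustrates that workaround; the function and cookie names are invented for illustration, not taken from any particular product.

```javascript
// A minimal sketch of cookie-based timing, the workaround RUM scripts
// use in the absence of a standard DOM marker. Names are hypothetical.

// On the outgoing page: record the click time just before navigation.
function markNavigationStart(doc, nowMs) {
  doc.cookie = "rum_start=" + nowMs + "; path=/";
}

// On the new page's onload handler: read the stamp back and compute the
// user-perceived load time (click on page A through load of page B).
function measureLoadTime(doc, nowMs) {
  var match = /(?:^|;\s*)rum_start=(\d+)/.exec(doc.cookie);
  // Returns null when no stamp exists (first visit, cookies blocked).
  return match ? nowMs - Number(match[1]) : null;
}
```

The fragility of this approach — it breaks across domains, with cookies disabled, or on back-button navigation — is exactly why a browser-provided marker would be such an improvement.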

Synthetic testing vendors differentiate their offerings through their instrumentation (browser puppetry versus scripting), the number of test points they have, the detail they offer, and the reports they generate. There’s no reason why we can’t have a standard language for defining scripts, or one for annotating page load times—this wouldn’t undermine any vendor’s uniqueness, but it would be a boon for web operators who need to manage and interpret test data.

Similarly, a common format for associating analytics, WIA, RUM, and VOC data across vendors would make it far easier for web analysts to do their jobs; yet today such data is seldom even available for export. We need a common format for visitor records that can work across products.
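To make the idea concrete, here is one hypothetical shape such a visitor record might take — a single object, keyed by a shared session identifier, with a section per data source. No such standard exists; every field name below is invented for illustration.

```javascript
// A hypothetical unified visitor record. The shared sessionId is what
// lets analytics, WIA, RUM, and VOC tools correlate their data.
var visitorRecord = {
  sessionId: "a1b2c3",  // the common key across all four data sources
  analytics: { referrer: "search", goal: "checkout", converted: true },
  wia:       { clicks: [{ x: 120, y: 340, element: "#buy-now" }] },
  rum:       { pageLoadMs: 2300, errors: 0 },
  voc:       { surveyScore: 4, comment: "Checkout was slow." }
};
```

The design choice that matters here isn't the field names — it's the shared key: once every product emits records joined on a common session identifier, an analyst can ask questions that span tools, such as whether slow page loads (RUM) correlate with low survey scores (VOC).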

The ...
