Nowadays many websites employ real user monitoring (RUM) tools such as New Relic (http://newrelic.com/features/real-user-monitoring) or Gomez (http://www.compuware.com/application-performance-management/real-user-monitoring.html) to measure the performance of production applications. These tools provide great value by delivering real-time metrics, allowing engineers to identify and address potential performance bottlenecks.
This works well for live, deployed applications, but what about a staged setup? Engineers might want to look at performance before deploying to production, perhaps while going through a QA process. They may want to catch performance regressions or verify that a new feature is fast. However, the staged setup could reside on a corporate network, restricting the use of the RUM tools mentioned earlier.
And what about an application hosted in a firewalled environment? Not all web applications are publicly hosted on the Internet. Some are installed in private data centers for internal use only (think of an intranet-style setup).
How can you monitor application performance in these scenarios? In this chapter, I'll explain how we leveraged open source software to build our performance test suite.
The first step is to record data. For that purpose we use a bit of custom code that records the time spent in each layer: front end, web tier, backend web services, and database.
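As a rough illustration of per-layer timing, here is a minimal sketch in Python. The `LayerTimer` class and layer names are hypothetical, not the actual instrumentation described in this chapter; the idea is simply to accumulate wall-clock time under a label for each layer a request passes through.

```python
# Hypothetical sketch of per-layer timing instrumentation.
# Class and layer names are illustrative only.
import time
from collections import defaultdict
from contextlib import contextmanager


class LayerTimer:
    """Accumulates wall-clock time spent in each application layer."""

    def __init__(self):
        self.timings = defaultdict(float)  # layer name -> seconds

    @contextmanager
    def measure(self, layer):
        start = time.perf_counter()
        try:
            yield
        finally:
            self.timings[layer] += time.perf_counter() - start

    def report(self):
        # Timings converted to milliseconds, one entry per layer.
        return {layer: secs * 1000 for layer, secs in self.timings.items()}


timer = LayerTimer()

# Nested blocks mimic a request flowing through the layers.
with timer.measure("web_tier"):
    with timer.measure("database"):
        time.sleep(0.01)  # stand-in for a real database query

print(timer.report())
```

Because `measure` nests, the database time is also counted inside the web-tier total, which mirrors how a request actually spends its time across layers.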
Our web tier ...