One only needs two tools in life: WD-40 to make things go, and duct tape to make them stop.
Automated tools for performance testing have been around in some form for the best part of 20 years. During that period, application technology has gone through many changes, moving from a norm of fat client to web and, increasingly, mobile enablement. Accordingly, the capabilities that automated tools must now provide are strongly biased toward web and mobile development, and there is much less requirement to support legacy technologies that rely on a two-tier application model. This change in focus is good news for the performance tester, because there are now many more automated tool vendors in the marketplace to choose from, with offerings to suit even modest budgets. There are also a number of popular open source tools available. (See Appendix C.)
All of this is well and good, but here’s a note of caution: when your performance testing does occasionally need to move outside of the Web, the choice of tool vendors diminishes rapidly, and technology challenges that have plagued automated tools for many years are still very much in evidence. You are also much less likely to find an open source solution, so there will be a cost implication. These problems center not so much on execution and analysis, but rather on your being able to successfully record application activity and then modify the resulting scripts ...