In a sense, all web servers come with a performance monitoring tool: the logging facility of the server. This leaves the webmaster with the problem of interpreting the logged data, because the one-line-per-transfer format is difficult to analyze by reading it directly.
The need to extract as much useful information as possible from log files has given rise to a small industry of log parsing and graphing packages, such as Interse, the freeware analog tool, net.Analysis (http://www.netgen.com/), and the Netscape servers' built-in analyze command. These tools are useful, but you can also simply import log files into a spreadsheet program and use the spreadsheet to plot the results in various ways. A good spreadsheet program will have graphing abilities similar to those of dedicated log graphing packages, with the added advantage that you may already own one.
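Getting a log into a spreadsheet usually means converting it to comma-separated values first. As a sketch, assuming your server writes the Common Log Format (one line per transfer: host, identity, user, date, request, status, and bytes sent), a short script can split each line into columns and emit a CSV file that any spreadsheet can open:

```python
import csv
import re

# Common Log Format (assumed here):
#   host ident authuser [date] "request" status bytes
CLF_PATTERN = re.compile(
    r'(\S+) (\S+) (\S+) \[([^\]]+)\] "([^"]*)" (\d{3}) (\S+)'
)

def clf_to_rows(lines):
    """Yield (host, date, request, status, bytes) tuples from CLF lines,
    skipping any line that does not match the expected format."""
    for line in lines:
        m = CLF_PATTERN.match(line)
        if m:
            host, _ident, _user, date, request, status, size = m.groups()
            # A "-" in the size field means no bytes were transferred.
            yield (host, date, request, int(status),
                   0 if size == "-" else int(size))

# A hypothetical sample log line for illustration:
sample = ['10.0.0.5 - - [10/Oct/1998:13:55:36 -0700] '
          '"GET /index.html HTTP/1.0" 200 2326']
rows = list(clf_to_rows(sample))

# Write a CSV the spreadsheet can import directly.
with open("access.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["host", "date", "request", "status", "bytes"])
    writer.writerows(rows)
```

With the data in columns, the spreadsheet's charting functions can plot transfers per hour, bytes per day, or status-code breakdowns much as a dedicated package would.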
To provide additional information, some web servers use extended log formats, including details such as how long the transfer actually took to complete. Your log file can also tell you where your users are by logging IP addresses or machine names. Are they across the whole Internet? In a 50-location extranet? All in one building? If you can figure this out from the IP addresses, you might be able to improve performance by locating your servers close to the largest concentrations of users.
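To find those concentrations, you can tally hits by network rather than by individual host. A minimal sketch, assuming the log's host field contains dotted-quad IP addresses and using the first three octets as a rough stand-in for "network":

```python
from collections import Counter

def network_counts(hosts):
    """Tally hits by the first three octets of each IP address,
    a crude proxy for the network the user is coming from."""
    counts = Counter()
    for host in hosts:
        parts = host.split(".")
        # Only count well-formed dotted-quad addresses; log entries
        # with machine names instead of IPs are skipped here.
        if len(parts) == 4 and all(p.isdigit() for p in parts):
            counts[".".join(parts[:3])] += 1
    return counts

# Hypothetical host column pulled from a parsed log:
hosts = ["10.0.0.5", "10.0.0.9", "192.168.1.2"]
top = network_counts(hosts).most_common()
```

If a handful of networks dominate the counts, that is a strong hint about where a mirror or cache closer to those users would pay off.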
How can you know the number and distribution of connections to expect over each day? If your web server is already live, then you have an excellent ...