NetForecast has been running live measurements of the ten Apdex Alliance Contributing Members. The results from five different locations across the US show a wide range of performance as seen by users. The measurement data is then summarized using both typical averaging methods and the Apdex method. The results are documented in "Averages Hide the Real End-User Experience: Apdex Tells the Full Story," NFR 5086 by Peter Sevcik, April 2007.
The Apdex reports of the very same measurement data uncover many more performance issues. For example, they reveal a region where users see chronically poor performance that was completely hidden by the averaging methods. The ten weeks of data show that averages significantly underreported true end-user performance issues. That makes a clear case for reporting your measurements with Apdex!
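To see how an average can hide exactly this kind of problem, here is a minimal sketch of the standard Apdex calculation: responses at or below a threshold T count as "satisfied," responses between T and 4T as "tolerating," and anything slower as "frustrated." The sample response times below are invented for illustration, not taken from Sevcik's data.

```python
def apdex(samples, t=4.0):
    """Apdex score for a list of response times, threshold t seconds.

    Score = (satisfied + tolerating / 2) / total, where
    satisfied:  sample <= t
    tolerating: t < sample <= 4 * t
    frustrated: sample > 4 * t
    """
    satisfied = sum(1 for s in samples if s <= t)
    tolerating = sum(1 for s in samples if t < s <= 4 * t)
    return (satisfied + tolerating / 2) / len(samples)

# Hypothetical data: 17 fast responses and 3 chronically slow ones.
times = [1.0] * 17 + [20.0] * 3

mean = sum(times) / len(times)   # 3.85 s -- under the 4 s threshold,
                                 # so the average looks fine
score = apdex(times, t=4.0)      # 0.85 -- 15% of users are frustrated,
                                 # and the score shows it
```

The mean of 3.85 seconds sits comfortably under the 4-second threshold, while the Apdex score of 0.85 makes it visible that 15% of the users had a frustrating experience.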
The ISPs may not know where the users see chronic poor performance, either. Do they know how to ask the users? Do they know what to compare to in other ISPs?
Plus the actual paper has another good point:
One of the many findings was a strong correlation between good communication about application performance and actually delivering good performance. The top group excelled at communication relative to the other best practices.
— "Averages Hide the Real End-User Experience: Apdex Tells the Full Story," NFR 5086 by Peter Sevcik, April 2007.
I’d bet there’s also a correlation between an ISP’s understanding of good performance and its ability to communicate it. And I wouldn’t be surprised if an ISP’s understanding of good performance correlates inversely with its desire to control what the user gets. But those are just speculations; Sevcik’s paper is based on hard evidence.
Another speculation: security probably isn’t much different from performance in this respect.