A bit confused. I have a few LoadRunner Analysis reports from a test I've run, and I'm new to performance testing. My understanding of the 90th percentile was that, because it leaves out the outliers, it presents a truer picture. But in two different reports, the 90th percentile response time is higher than the average response time given in the Summary Report. How is that possible?
I'm looking at the Transaction Response Time (Percentile) graph, and the last 10% shoots up sharply, which made me think that taking only the first 90% should give a lower response time than the average.
90th Percentile: 6.412
The 90th percentile means that 90% of the values fall at or below it; the values in this case are your response times. So if you had 1000 samples and the 90th percentile is n, 900 of those samples would be at or below n and only 100 above n. The key point is that the percentile is a cutoff, not a trimmed average: it doesn't average the fastest 90% of the transactions, it just reports the value sitting at the 90% mark. Since 90% of your response times are below that mark, it makes sense that the average, which is pulled down by all the faster transactions, comes out lower than the 90th percentile.
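A small example may make this concrete. The numbers below are made up (not from your report), and the nearest-rank percentile function is just one common way to compute a percentile; tools like LoadRunner may use a slightly different interpolation method. The point is only to show why a right-skewed set of response times gives an average below the 90th percentile:

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile: the smallest sample value such that
    at least pct% of the samples are at or below it."""
    ordered = sorted(values)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Hypothetical transaction response times in seconds:
# mostly fast, with a slow tail on the right.
times = [1.0] * 8 + [5.0, 20.0]

avg = sum(times) / len(times)   # (8*1.0 + 5.0 + 20.0) / 10 = 3.3
p90 = percentile(times, 90)     # 5.0 -- 9 of the 10 samples are <= 5.0

print(f"average = {avg:.2f}, 90th percentile = {p90:.2f}")
```

Here the average (3.3 s) is well below the 90th percentile (5.0 s), even though one extreme outlier (20 s) is in the data; the percentile marks where the bulk of the samples end, while the average sits down among the many fast transactions.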