I have a table with histogram-type data. There are two columns: Bucket and Count.
Bucket is the histogram bucket and Count is the number of values in that bucket.
My buckets are ordered. For example, say the bucket indicates how many minutes it took to complete a task; we could have buckets such as 0-5 minutes, 5-10 minutes, 10-15 minutes, etc.
What I'm trying to compute is which bucket the XXth percentile falls in. For example, if 90% of tasks complete in 12 minutes or less, then I want to know that 90% of tasks are in the 10-15 bucket or lower.
As an example, say I have the following table:
Bucket | Count
--------------
0 | 10
1 | 15
2 | 5
3 | 15
If I want to compute the 60th percentile, it'd be
(10+15+5+15)*0.60 = 27, so the result would be bucket 2: the running total first reaches 27 at bucket 2 (10+15+5 = 30), meaning at least 60% of all entries are in bucket 2 or less.
Is there a way to compute this in SQL?
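I imagine something along these lines might work, using a running total over the buckets (just a sketch, assuming a table named Histogram with those two columns and a dialect that supports window functions), but I'm not sure if it's right:

    -- Sketch: find the first bucket whose cumulative count
    -- reaches 60% of the total count.
    -- Note: Count may need quoting ("Count" or [Count]) in some dialects.
    SELECT MIN(Bucket) AS PercentileBucket
    FROM (
        SELECT Bucket,
               SUM(Count) OVER (ORDER BY Bucket) AS RunningCount,  -- cumulative count
               SUM(Count) OVER ()                AS TotalCount     -- grand total
        FROM Histogram
    ) t
    WHERE RunningCount >= TotalCount * 0.60;

On the sample data above this should return 2, since the running count (10, 25, 30, 45) first reaches 27 at bucket 2.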
Thanks!