Suppose that you have a magic data structure representing a linear sequence of elements that supports lookup, insertion, and deletion at any index in worst-case O(1) time. (I'm pretty sure that no such structure is possible given the memory model of modern machines, but let's suppose we have one just for the fun of it.)
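For concreteness, here is a minimal sketch of the interface I have in mind. The name MagicSequence and the list-backed implementation are my own stand-ins; a real Python list obviously can't meet the O(1) bounds, it just plays the part here:

    class MagicSequence:
        """Stand-in for the hypothetical structure. A plain Python list
        backs it, so insert/delete here are really O(n); in the magic
        model every operation below is assumed worst-case O(1)."""

        def __init__(self):
            self._items = []

        def __len__(self):
            return len(self._items)

        def __getitem__(self, i):     # lookup at index i, assumed O(1)
            return self._items[i]

        def insert(self, i, x):       # insertion at index i, assumed O(1)
            self._items.insert(i, x)

        def delete(self, i):          # deletion at index i, assumed O(1)
            del self._items[i]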
A friend of mine pointed out that if you have this data structure, then you can build a cool sorting algorithm for integers that runs in expected O(n lg lg n) time as follows. First, create one of the magic data structures mentioned above. Next, for each element in the input array, use interpolation search to find, in expected O(lg lg n) time, the index in the magic structure at which the element belongs, then insert the element there in O(1) time. Finally, in O(n) time, read the elements back out of the magic structure in sorted order. This makes n calls to the expected O(lg lg n) interpolation search plus n O(1) insertions, for a total expected runtime of O(n lg lg n).
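Here's a sketch of that algorithm, using the MagicSequence stand-in from above. The helper names insertion_index and magic_sort are hypothetical; insertion_index is ordinary interpolation search adapted to return an insertion point rather than a match:

    def insertion_index(seq, x):
        """Interpolation search for the index at which x belongs in
        sorted seq. Expected O(lg lg n) probes on roughly uniformly
        distributed keys; degrades to O(n) probes on bad inputs."""
        lo, hi = 0, len(seq) - 1
        while lo <= hi:
            if x <= seq[lo]:
                return lo            # x belongs at or before lo
            if x > seq[hi]:
                return hi + 1        # x belongs after hi
            # Here seq[lo] < x <= seq[hi], so the divisor is nonzero.
            mid = lo + (x - seq[lo]) * (hi - lo) // (seq[hi] - seq[lo])
            if seq[mid] < x:
                lo = mid + 1
            else:
                hi = mid - 1
        return lo

    def magic_sort(arr):
        """Sort integers by repeated interpolation-search-and-insert."""
        magic = MagicSequence()
        for x in arr:
            # Expected O(lg lg n) search, then an insert that the magic
            # model charges O(1) (the list stand-in really takes O(n)).
            magic.insert(insertion_index(magic, x), x)
        return [magic[i] for i in range(len(magic))]   # O(n) read-off

    print(magic_sort([5, 3, 8, 1, 9, 2]))   # [1, 2, 3, 5, 8, 9]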
I understand that the above approach is not going to give worst-case O(n lg lg n) time for sorting, since there are pathologically bad input arrays on which interpolation search degenerates to O(n) time per lookup, driving the overall sort up to O(n²) time. My question is: given this magic data structure, what is the fastest integer sorting algorithm that can be built, assuming we care only about the worst-case runtime of the algorithm?
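To illustrate one such degenerate case: a single enormous outlier key flattens the interpolation estimate, so each probe advances only one position and a single search costs Θ(n) probes. The probes_to_find helper below is hypothetical, just an instrumented copy of the search loop:

    def probes_to_find(seq, x):
        """Count interpolation probes needed to locate x's insertion index."""
        probes, lo, hi = 0, 0, len(seq) - 1
        while lo <= hi:
            if x <= seq[lo] or x > seq[hi]:
                break                # insertion point found at a boundary
            probes += 1
            mid = lo + (x - seq[lo]) * (hi - lo) // (seq[hi] - seq[lo])
            if seq[mid] < x:
                lo = mid + 1
            else:
                hi = mid - 1
        return probes

    keys = list(range(1, 1000)) + [10 ** 30]   # one huge outlier
    print(probes_to_find(keys, 999))           # 998 probes: Theta(n), not lg lg n ~ 3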
(This may be a better fit for cstheory, but I figured I'd ask here first since I've gotten some great algorithms answers in the past.)