I have many arrays of sorted data and I need to perform binary search over them. If the key ranges of the arrays were disjoint, I could order the arrays by range and then binary search as if they were a single array. But in my case the key ranges can overlap. Then the best I can do is filter out the arrays whose range excludes the key and search the remaining ones. Most of my arrays don't overlap, so the filtering usually leaves only one array, but bad data can still ruin performance.
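To make the approach I'm describing concrete, here is a minimal sketch (the function name and data layout are my own for illustration): filter the arrays by their [min, max] range, then binary search each surviving candidate. With mostly disjoint ranges the loop usually inspects a single array.

```python
from bisect import bisect_left

def find_in_arrays(arrays, key):
    """Search `key` across many sorted lists.

    First filter: an array can only contain `key` if
    arr[0] <= key <= arr[-1]. Then binary search the survivors.
    Returns a list of (array_index, position) hits.
    """
    hits = []
    for idx, arr in enumerate(arrays):
        if not arr or key < arr[0] or key > arr[-1]:
            continue  # range check excludes this array cheaply
        pos = bisect_left(arr, key)
        if pos < len(arr) and arr[pos] == key:
            hits.append((idx, pos))
    return hits
```

When the ranges are disjoint this degenerates to one binary search; the worst case (all ranges overlap) is one binary search per array, which is the performance problem the question is about.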
Is there a better algorithm for this case? I can modify the arrays slightly, add some metadata, or add links between arrays.
Update: These arrays are data pages backed by disk storage; I access them through memory-mapped files. I can sort the data inside a page very fast because no copying is involved, but merging two pages means copying a large amount of data between them. The total volume is very large (terabytes), but each page is only 8 MB, so a single page can be searched quickly. New pages are added to the storage from time to time. The pages contain time-series data, so the data is already partially sorted, and new pages don't overlap with old data most of the time.
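For the within-page part, a sketch of what a binary search over one memory-mapped page might look like, assuming a hypothetical fixed-width record layout (8-byte timestamp key plus 8-byte value; the real format may differ). The buffer can be an `mmap` object or any bytes-like slice of a page:

```python
import struct

# Assumed record layout: little-endian 8-byte key, 8-byte value.
RECORD = struct.Struct("<qq")

def search_page(buf, key):
    """Binary search fixed-width records inside one page buffer.

    `buf` is a bytes-like object (e.g. an mmap of an 8 MB page).
    Returns the value stored for `key`, or None if absent.
    """
    lo, hi = 0, len(buf) // RECORD.size
    while lo < hi:
        mid = (lo + hi) // 2
        k, v = RECORD.unpack_from(buf, mid * RECORD.size)
        if k < key:
            lo = mid + 1
        elif k > key:
            hi = mid
        else:
            return v
    return None
```

Since `unpack_from` reads in place, no records are copied during the search, which matches the "no copying inside a page" property; only the touched 16-byte records are faulted in from disk.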