10,000 to 20,000 distinct cases sounds like a nightmare. Although it's technically possible, I find it hard to believe that your processing needs require that level of granularity.
Is the processing in each of the 10,000 to 20,000 cases really so different that it needs completely separate testing and handling? Aren't there cases similar enough to be handled in a similar way?
For example, if the processing for the case $x == 5 is something like:

    echo 5;

And the processing for the case $x == 10 is something like:

    echo 10;
Then these could be grouped into a single test and a single handler:

    function dumbEcho($x) {
        echo $x;
    }

    function isDumbEchoAble($x) {
        return in_array($x, array(5, 10));
    }

    if (isDumbEchoAble($x)) {
        dumbEcho($x);
    }
For each set of structurally similar processing, you could create an isXXXAble() function to test for membership and an XXX() function to do the processing. [Of course, this is just a simple example intended to demonstrate a principle, not code you can copy/paste into your current situation.]
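To sketch how those test/handler pairs might scale past two or three groups (the function names and groupings below are invented for illustration, not taken from your code), you could collect the pairs into a table and dispatch by scanning it:

```php
<?php
// Two hypothetical groups of structurally similar cases.
function dumbEcho($x) {
    echo $x, "\n";
}
function isDumbEchoAble($x) {
    return in_array($x, array(5, 10));
}

function doubleEcho($x) {
    echo $x * 2, "\n";
}
function isDoubleEchoAble($x) {
    return in_array($x, array(7, 14));
}

// One entry per group: array(test function, handler function).
$handlers = array(
    array('isDumbEchoAble',   'dumbEcho'),
    array('isDoubleEchoAble', 'doubleEcho'),
);

// Run the first handler whose test accepts $x.
function dispatch($x, $handlers) {
    foreach ($handlers as $pair) {
        list($test, $handle) = $pair;
        if ($test($x)) {
            $handle($x);
            return true;
        }
    }
    return false; // no group claimed this case
}

dispatch(5, $handlers);  // prints 5
dispatch(14, $handlers); // prints 28
```

Adding a new group of cases then means writing one test/handler pair and appending one line to the table, rather than touching a giant switch.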
The essence of programming - IMHO - is to find these structural similarities, find a parameterization sufficient to handle the distinct cases, and then apply that parameterized processing to them.
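As a minimal illustration of that idea (the table keys and parameter names here are made up), the per-case code can often collapse into a data table plus one parameterized function:

```php
<?php
// Hypothetical parameter table: each case maps to the data that makes
// it unique, instead of having its own block of code.
$params = array(
    5  => array('prefix' => 'value: ', 'times' => 1),
    10 => array('prefix' => 'value: ', 'times' => 2),
);

// One handler covers every case in the table.
function handle($x, $params) {
    if (!isset($params[$x])) {
        return null; // unknown case
    }
    $p = $params[$x];
    return $p['prefix'] . ($x * $p['times']);
}

echo handle(5, $params), "\n";  // value: 5
echo handle(10, $params), "\n"; // value: 20
```

With 10,000 to 20,000 cases, that table could just as well live in a database or config file, so new cases become data changes rather than code changes.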