Table Content with 100,000 records
CON_ID | CON_VALUE | CON_CATEGORY
-------|-----------|-------------
001    | data1     | title
002    | data2     | title
003    | data3     | process
004    | data4     | process
I read the table contents from the database and copy them into an array, because I must select from Content about 100 times. The array looks like this:
array(100017) {
[0]=>
array(3) {
["CON_VALUE"]=>
string(40) "data1"
["CON_ID"]=>
string(32) "001"
["CON_CATEGORY"]=>
string(9) "title"
}
[1]=>
array(3) {
["CON_VALUE"]=>
string(26) "data2"
["CON_ID"]=>
string(32) "002"
["CON_CATEGORY"]=>
string(9) "title"
}
[2]=>
array(3) {
["CON_VALUE"]=>
string(28) "data3"
["CON_ID"]=>
string(32) "003"
["CON_CATEGORY"]=>
string(9) "process"
}
[3]=>
array(3) {
["CON_VALUE"]=>
string(26) "data4"
["CON_ID"]=>
string(32) "004"
["CON_CATEGORY"]=>
string(9) "process"
}
}
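For reference, a minimal sketch of loading the whole table into that array shape with a single query, assuming PDO. An in-memory SQLite database and two made-up rows stand in for the real connection and data:

```php
<?php
// Sketch: load the Content table into an array once, then reuse it.
// The SQLite DSN and sample rows here are placeholders for the real database.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE Content (CON_ID TEXT, CON_VALUE TEXT, CON_CATEGORY TEXT)');
$pdo->exec("INSERT INTO Content VALUES ('001', 'data1', 'title'), ('002', 'data2', 'title')");

// fetchAll(PDO::FETCH_ASSOC) returns a list of rows keyed by column name,
// i.e. exactly the structure shown in the var_dump above.
$stmt = $pdo->query('SELECT CON_ID, CON_VALUE, CON_CATEGORY FROM Content');
$contentTable = $stmt->fetchAll(PDO::FETCH_ASSOC);
```

Fetching once and keeping the result in memory avoids running 100 separate queries, which matches the approach described above.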
For example, I want to select CON_VALUE from Content where CON_ID = '001' and CON_CATEGORY = 'title'.
I search it with a foreach:
foreach ($GLOBALS['contentTable'] as $R) {
    if ($R['CON_ID'] == $id && $R['CON_VALUE'] != '' && $R['CON_CATEGORY'] == 'PRO_TITLE') {
        return $R['CON_VALUE'];
    }
}
But this foreach is very slow (about 10 seconds).

QUESTION: How can I speed up the search? Or is there a better way to look up an array value based on another value?
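One common fix, sketched under the assumption that each (CON_ID, CON_CATEGORY) pair maps to a single value: build a hash index over the array once, so every lookup becomes one O(1) array access instead of a scan over ~100,000 rows. The `$rows` sample data below is made up; in the real code it would be `$GLOBALS['contentTable']`:

```php
<?php
// Sample rows standing in for the 100,000-record $GLOBALS['contentTable'].
$rows = [
    ['CON_ID' => '001', 'CON_VALUE' => 'data1', 'CON_CATEGORY' => 'title'],
    ['CON_ID' => '002', 'CON_VALUE' => 'data2', 'CON_CATEGORY' => 'title'],
    ['CON_ID' => '003', 'CON_VALUE' => 'data3', 'CON_CATEGORY' => 'process'],
    ['CON_ID' => '004', 'CON_VALUE' => 'data4', 'CON_CATEGORY' => 'process'],
];

// Build the index once (O(n)), keyed by a composite "ID|CATEGORY" string.
$index = [];
foreach ($rows as $R) {
    $index[$R['CON_ID'] . '|' . $R['CON_CATEGORY']] = $R['CON_VALUE'];
}

// Each of the 100 lookups is now a single hash access, O(1) on average.
function lookup(array $index, string $id, string $category): string
{
    return $index[$id . '|' . $category] ?? '';
}

echo lookup($index, '001', 'title'), "\n"; // prints "data1"
```

If only CON_ID matters for a given lookup, the built-in `array_column($rows, 'CON_VALUE', 'CON_ID')` builds the same kind of index in one call. Alternatively, filtering in the database with a `WHERE` clause backed by a table index usually beats scanning rows in PHP.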