
Table Content with 100,000 records:

CON_ID | CON_VALUE | CON_CATEGORY 
__________________________________
001    | data1     | title        
002    | data2     | title        
003    | data3     | process      
004    | data4     | process      

I read the table content from the database and copy it into an array, because I must select from the content about 100 times.

array(100017) {
  [0]=>
  array(3) {
    ["CON_VALUE"]=>
    string(5) "data1"
    ["CON_ID"]=>
    string(3) "001"
    ["CON_CATEGORY"]=>
    string(5) "title"
  }
  [1]=>
  array(3) {
    ["CON_VALUE"]=>
    string(5) "data2"
    ["CON_ID"]=>
    string(3) "002"
    ["CON_CATEGORY"]=>
    string(5) "title"
  }
  [2]=>
  array(3) {
    ["CON_VALUE"]=>
    string(5) "data3"
    ["CON_ID"]=>
    string(3) "003"
    ["CON_CATEGORY"]=>
    string(7) "process"
  }
  [3]=>
  array(3) {
    ["CON_VALUE"]=>
    string(5) "data4"
    ["CON_ID"]=>
    string(3) "004"
    ["CON_CATEGORY"]=>
    string(7) "process"
  }
}

For example, I want: select CON_VALUE from Content where CON_ID='001' and CON_CATEGORY='title'

I use a foreach loop:

function getValue($id)
{
    foreach ($GLOBALS['contentTable'] as $R) {
        if ($R['CON_ID'] == $id && $R['CON_VALUE'] != '' && $R['CON_CATEGORY'] == 'PRO_TITLE') {
            return $R['CON_VALUE'];
        }
    }
}
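For reference, the linear scan above can be avoided entirely by re-keying the copied rows once after loading them, so every lookup becomes a constant-time hash access instead of a walk over 100,000 entries. A sketch (the `$contentIndex` name and the `'|'` key separator are illustrative choices, not part of the original code):

```php
// Build the index once, right after copying the rows from the database.
$GLOBALS['contentIndex'] = [];
foreach ($GLOBALS['contentTable'] as $R) {
    if ($R['CON_VALUE'] != '') {
        // Composite key: CON_ID plus CON_CATEGORY identifies a row.
        $key = $R['CON_ID'] . '|' . $R['CON_CATEGORY'];
        $GLOBALS['contentIndex'][$key] = $R['CON_VALUE'];
    }
}

// Each of the ~100 lookups is now a single hash access.
function getContentValue($id, $category)
{
    $key = $id . '|' . $category;
    return isset($GLOBALS['contentIndex'][$key])
        ? $GLOBALS['contentIndex'][$key]
        : null;
}
```

The one-time indexing pass costs a single walk over the array; every lookup afterwards is O(1) instead of O(n).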

But this foreach is very slow (about 10 seconds).

QUESTION: How can I speed up the search? Or is there a better way to look up a value in an array based on field values?

  • Why can't you use a select statement to fetch only the rows you need from the database, instead of all 100,000? Commented May 27, 2015 at 3:50
  • I must select from the table about 100 times. Commented May 27, 2015 at 3:52
  • OK, make arrays of the criteria and use an IN statement. 100K rows is simply going to take a large chunk of RAM and CPU time to run through. Commented May 27, 2015 at 3:55

1 Answer

As your foreach loop over the copied rows and the transfer of the big result set between the DB and the script take quite long (as you have noticed), you should reduce the result set when reading from the database.

You write that you need to select from the database 100 times, so I assume you mean you need 100 different rows. (Otherwise you could select the one row once and use its values 100 times.)

Here is what I would do.

Second assumption, from the wording of your question: only the combination of con_id and con_category is unique.

First, I would build the SQL statement via a foreach loop so that only one select is issued. The goal is to get something like the following for your 100 rows:

select * from content where (con_id = '001' and con_category = 'title') or (con_id = '002' and con_category = 'title') ...
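Building that combined statement from the list of (id, category) pairs could look like the sketch below; the `$pairs` array and the sprintf-based assembly are illustrative (real code should escape or parameterize the values):

```php
// Hypothetical list of the ~100 (CON_ID, CON_CATEGORY) pairs to fetch.
$pairs = [
    ['001', 'title'],
    ['002', 'title'],
    ['003', 'process'],
];

$conditions = [];
foreach ($pairs as [$id, $category]) {
    // NOTE: escape or bind these values in production code.
    $conditions[] = sprintf("(con_id = '%s' and con_category = '%s')", $id, $category);
}
$sql = 'select * from content where ' . implode(' or ', $conditions);
```

This replaces 100 round trips with a single query whose result set contains exactly the rows you need.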

To help the SQL optimizer, you should also add a composite index over the two columns con_id and con_category:

CREATE INDEX CIndex ON Content (con_id, con_category)

This should improve your select a lot.

Further improvements, if possible: 1) If con_id alone is unique, you should select by the id only, via an IN (...) clause. Again, build the statement in PHP, either with a foreach or, if the IDs are already in an array, via implode (e.g. $idlist = "'" . implode("','", $arrayOfIds) . "'";), so you get:

select * from content where con_id in ('001', '002')
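With PDO, that IN list can also be built from placeholders so the IDs are bound safely rather than concatenated into the SQL string. A sketch, assuming a connected `$pdo` instance and an `$ids` array:

```php
$ids = ['001', '002', '003'];

// One positional placeholder per id, e.g. "?,?,?".
$placeholders = implode(',', array_fill(0, count($ids), '?'));

$stmt = $pdo->prepare("select * from content where con_id in ($placeholders)");
$stmt->execute($ids);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
```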

You should then add a unique index on con_id to speed up the search. If this column is your primary key, declare it as such; a primary key has a unique index by default.

2) As your shown result set indicates, the con_id column is a string. If you can change con_id to a numeric column, do so! Numeric search is much faster than string search.

3) Your con_category column is also a string, but it only holds a small set of key values. If you can, replace the strings with keys, e.g. "title" with 1, "process" with 2, and so on, and create an index on that column too. As it is then a numeric column rather than a string column, the search is improved again.
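On the PHP side, such a replacement could be as simple as a small lookup map applied when writing and querying (the mapping values below are illustrative):

```php
// Map the textual categories to small integers once, application-wide.
$categoryIds = ['title' => 1, 'process' => 2];

// When inserting: store the integer instead of the string.
$numericCategory = $categoryIds['process'];

// When querying: compare against the integer, which the index handles faster:
//   select * from content where con_id = 3 and con_category = 2
```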

I hope my answer helps you. I think you can reach your goal by improving your SQL, so you can throw away the copying of the huge result set. Searching through 100,000 rows inside the database is no challenge for a database - it's its daily work!


1 Comment

Thanks for the reply. I got a log from the database and saw that the content table is selected from 867 times with different CON_ID and CON_CATEGORY values. The CON_IDs are not known in advance and differ each time, so I cannot optimize the SQL query.
