
I have a SQL query that, for each result, returns a string like 1:One, 2:Two, 3:Three.

Now I want to convert each one of these strings to a PHP array like this:

Array(
    1 => One,
    2 => Two,
    3 => Three
)

I know that I could do that with one explode call inside another, but isn't that overkill if I have 500+ results in the MySQL query? Is there a better way to get something like that?
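To be concrete, the nested-explode approach I have in mind would be roughly this (untested sketch; it assumes the string format from my example above):

// Rough idea: split on ", " to get the pairs, then on ":" for key/value.
$string = '1:One, 2:Two, 3:Three';   // one value from the query result
$groups = array();
foreach (explode(', ', $string) as $pair) {
    list($id, $name) = explode(':', $pair, 2);
    $groups[(int) $id] = trim($name);
}
// $groups is now array(1 => 'One', 2 => 'Two', 3 => 'Three')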

Here is a sample of the MySQL code that creates the kind of string result I described:

GROUP_CONCAT(DISTINCT cast(concat(cast(number.id AS char),': ',number.name) AS char) order by number.id SEPARATOR ', ') AS all_active_numbers

EDIT

So here's an example of 2 possible rows returned from MySQL:

|---------|------------------------------------|------------------------|
|   id    |              all_groups            |     groups_assigned    |
|---------|------------------------------------|------------------------|
|   1     |   1:Team A, 2:Team B, 3:Team C     |        1:Team A        |
|   2     |   1:Team A, 2:Team B, 3:Team C     |   2:Team B, 3:Team C   |
|---------|------------------------------------|------------------------|

What I want to know is the best way to transform the all_groups and groups_assigned strings of each row into PHP arrays. As I said, I know I could do it using 2 explode calls (one inside the other, using foreach loops), but what if my query returns 500+ results? Computing explodes for each of the 500+ rows seems like a lot of overkill for the server.

Just to clarify: all_groups contains the groups that are available to a person, and groups_assigned contains the groups from all_groups that the person is actually registered in.

Another possibility might be to split this into 3 different queries?

6 Comments
  • What is your question? Commented Jul 2, 2013 at 17:16
  • Show what the result looks like. Is it row1|1|One row2|2|Two? Commented Jul 2, 2013 at 17:16
  • I would do it with explode. Maybe you could concatenate it into a JSON array and use json_decode, but I don't know if that's faster (a rough sketch of that idea follows these comments). Commented Jul 2, 2013 at 17:17
  • If you're just going to split up the data in the client, then don't smoosh the data together in MySQL in the first place... Commented Jul 2, 2013 at 17:17
  • I updated the info of my question with a more detailed example. Commented Jul 2, 2013 at 22:05
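For reference, the JSON idea from the comment above might look roughly like this (untested sketch; it assumes the group names contain no double quotes and simplifies the query down to a single number table without the joins from the real query):

// Have MySQL emit JSON objects instead of "id: name" pairs, then decode in PHP.
$sql = "SELECT GROUP_CONCAT(
            CONCAT('{\"id\":', number.id, ',\"name\":\"', number.name, '\"}')
            ORDER BY number.id SEPARATOR ','
        ) AS all_active_numbers_json
        FROM number";

$row = $pdo->query($sql)->fetch(PDO::FETCH_ASSOC);

$groups = array();
foreach (json_decode('[' . $row['all_active_numbers_json'] . ']', true) as $g) {
    $groups[$g['id']] = $g['name'];    // e.g. 1 => 'One', 2 => 'Two'
}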

1 Answer


Just explode based on your colon; otherwise, form your query to provide the KEYs and VALUEs separately (a rough sketch of that second option follows the example below).

PHP example (untested, example only):

$result = $pdo->query($query);
$myArray = array();
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {   // fetch one row at a time
    // Split the "id: name" string on the colon. This handles a single pair;
    // a comma-separated list needs a second delimiter (see the comments below).
    $myGroup = explode(": ", $row['all_active_numbers']);
    $myArray[][$myGroup[0]] = $myGroup[1];
}
var_dump($myArray);
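For the second suggestion (letting the query return the key and value columns directly), a rough sketch could be (untested; the number table and column names are taken from the question's query):

// Return one row per id/name pair and build the array in PHP,
// instead of parsing a GROUP_CONCAT string.
$stmt = $pdo->query("SELECT number.id, number.name FROM number ORDER BY number.id");
$allActiveNumbers = array();
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    $allActiveNumbers[$row['id']] = $row['name'];   // e.g. 1 => 'One', 2 => 'Two'
}
var_dump($allActiveNumbers);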

6 Comments

  • I think his string looks like this for one row: "1: One, 2: Two, 3: Three"
  • I had thought this; there will need to be some sort of "per row" delimiter... maybe he can figure that out himself with the provided example to get him pointed in the right direction!
  • @RobW I updated the info of my question with a more detailed example. Please see my edit =).
  • @CristianoSantos - Even with a million rows of data, it won't really strain PHP or the web server. Your database query should just query the data, and your script should transform/handle the data. Hope this helps!
  • @RobW So transforming the string into an array for each of the rows doesn't slow down the server significantly? Even with one foreach inside another, using 2 explode calls?
