
I run a script that checks whether trade offers have been accepted through Steam's web API. It runs as a cron job every 15 seconds. I want it to be optimized and run as fast as possible, but I feel like I have done this very poorly.

As you can see, I've added comments explaining what the script does, but I'll summarize here as well.

  • It collects all the new trade offers from the database.
  • It checks whether each trade offer has been cancelled.
  • If the offer is not cancelled, i.e. accepted, it collects information about the offer.
  • If the bot's inventory contains the item the player deposited, it sets status = 1 in the database.
  • It then deletes the trade offer from the database, as it has been completed.

I feel like this script runs slowly. Should I change to mysqli? Or maybe replace mysql_fetch_array with mysql_fetch_assoc? What can I do to optimize this? It is pretty important that it runs fast, in well under 15 seconds.

<?php
require('xxxxxx/xx.php');

// Getting the bot's inventory
$jsonInventory = file_get_contents('https://steamcommunity.com/profiles/76561xxxxx8959977/inventory/json/730/2');
$data = json_decode($jsonInventory, true);

// Getting the trade offers
$tradeoffers = mysql_query("SELECT * FROM tradeoffers");
while ($trade = mysql_fetch_array($tradeoffers)) {

    // Getting information about this trade offer
    $url = file_get_contents("https://api.steampowered.com/IEconService/GetTradeOffer/v1/?key=3593xxxxxB6FFB8594D8561374154F7&tradeofferid=" . $trade['tradeofferid'] . "&language=en_us");
    $json = json_decode($url, true);

    // Checking whether the API returned the offer at all
    if (isset($json['response']['offer'])) {

        // Invalid, cancelled, declined, expired, countered or escrow states: clean up
        if (in_array($json['response']['offer']['trade_offer_state'], array(1, 5, 6, 7, 8, 10, 11))) {
            mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
            mysql_query("DELETE FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        }

        // State 3 = accepted
        if ($json['response']['offer']['trade_offer_state'] == 3) {

            if (isset($data['rgDescriptions'])) {

                $itemsinfo = mysql_query("SELECT * FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
                while ($item = mysql_fetch_array($itemsinfo)) {

                    // Match each deposited item against the bot's inventory
                    foreach ($data['rgInventory'] as $inv) {
                        $desc = $data['rgDescriptions'][$inv['classid'] . '_' . $inv['instanceid']];

                        if ($desc['icon_url'] == $item['iconurl']) {
                            mysql_query("UPDATE items SET assetid = '" . $inv['id'] . "' WHERE iconurl = '" . $item['iconurl'] . "'");
                            mysql_query("UPDATE items SET status = 1 WHERE iconurl = '" . $item['iconurl'] . "'");
                        }
                    }
                }
            }
            // Deleting the trade offer from the database, as it has been completed
            mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        }
    } else {
        // Offer no longer exists: clean up
        mysql_query("DELETE FROM tradeoffers WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
        mysql_query("DELETE FROM items WHERE tradeofferid = '" . $trade['tradeofferid'] . "'");
    }
}
echo 'Finished';
?>
  • mysql_* is deprecated and should no longer be used. Use mysqli or PDO. See php.net/manual/en/function.mysql-query.php Commented Feb 3, 2018 at 10:11
  • Yes, I know how to use mysqli, but the rest of the site runs on mysql due to an old developer. I can run this script using mysqli, though, right? Without disturbing the rest of the code? Commented Feb 3, 2018 at 10:13
  • Changing from mysql to mysqli is not likely to help with performance. Commented Feb 3, 2018 at 10:14
  • @Karldrakar yes, if this is isolated. Also, the Steam API seems to have a GetTradeOffers route, so you can possibly get them in bulk and delete/update things from the database in bulk (possibly; I don't know if the API will give you enough information to work with in the bulk request). Commented Feb 3, 2018 at 10:14
  • Your inner file_get_contents() always goes to the same URL. There's no need to do it repeatedly. Do it once at the beginning of the script. Commented Feb 3, 2018 at 10:14
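The bulk GetTradeOffers route mentioned in the comment above could replace the one-request-per-row pattern. A minimal sketch, assuming the standard IEconService/GetTradeOffers/v1 endpoint; the helper function name and the key are placeholders, not part of the original script:

```php
<?php
// Hypothetical helper: build one bulk GetTradeOffers URL instead of calling
// GetTradeOffer once per database row. The API key is a placeholder.
function buildGetTradeOffersUrl($apiKey)
{
    $params = array(
        'key'                 => $apiKey,
        'get_sent_offers'     => 1,
        'get_received_offers' => 1,
        'language'            => 'en_us',
    );
    return 'https://api.steampowered.com/IEconService/GetTradeOffers/v1/?'
        . http_build_query($params);
}

// Usage (network call, so run it from the cron script, not here):
// $json = json_decode(file_get_contents(buildGetTradeOffersUrl($apiKey)), true);
// foreach ($json['response']['trade_offers_received'] as $offer) { ... }
```

One call then covers every offer in the loop, instead of N calls of roughly a second each.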

3 Answers


One lever for increasing performance is to switch from file_get_contents to cURL for getting data from the API. cURL is usually much faster. Also, with cURL you can run multiple requests in parallel, which brings another performance boost (if you are able to parallelize your requests).

See also this question.

Another lever is to parallelize your database calls, which you could do after you have migrated to mysqli. See this question for details (again, assuming it is possible and makes sense logic-wise).
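As a starting point, a plain cURL GET wrapper can replace each file_get_contents() call one-for-one. A minimal sketch; the function name and timeout value are illustrative, and the timeout keeps one slow Steam response from stalling the whole cron run:

```php
<?php
// Minimal cURL-based replacement for file_get_contents() on a URL.
function curlGet($url, $timeoutSeconds = 10)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the body instead of printing it
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeoutSeconds);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects, as file_get_contents does
    $body = curl_exec($ch);                          // false on failure
    curl_close($ch);
    return $body;
}

// Usage:
// $jsonInventory = curlGet('https://steamcommunity.com/profiles/.../inventory/json/730/2');
```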


3 Comments

I am not used to cURL; is it rocket science?
No, it's actually the standard way of doing HTTP requests in PHP (and many other languages).
Doing multi-cURL is a bit more complicated, but just switching to a simple cURL call for each of your current API calls should already bring a significant improvement.

First, I'd advise you to move away from mysql_* functions and use either PDO or mysqli.

Optimization. I've not run your code but some pointers:

"SELECT * FROM" might be slow. Try to use only the fields you need.

You are updating on 'WHERE iconurl = '".$item['iconurl']."'"'. Is this field indexed?

Is it necessary to DELETE these records? That is a slow operation. What happens if you flag them, e.g. complete = 1? (you may later still delete them in one go if your table gets too crowded)
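The flag-then-purge idea above can be sketched with PDO prepared statements (also covering the first point about moving off mysql_*). This is illustrative only: the `complete` column and function name are assumptions, and SQLite is used in memory purely to keep the example self-contained; with MySQL you would use a mysql: DSN instead.

```php
<?php
// Sketch of "flag it now, delete in one go later" with a PDO prepared statement.
function markOfferComplete(PDO $pdo, $tradeOfferId)
{
    $stmt = $pdo->prepare('UPDATE tradeoffers SET complete = 1 WHERE tradeofferid = ?');
    $stmt->execute(array($tradeOfferId));
    return $stmt->rowCount(); // number of rows flagged
}

$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE tradeoffers (tradeofferid TEXT, complete INTEGER DEFAULT 0)');
$pdo->exec("INSERT INTO tradeoffers VALUES ('1000', 0), ('1001', 0)");

markOfferComplete($pdo, '1000');

// Later, purge all completed offers in a single statement:
// $pdo->exec('DELETE FROM tradeoffers WHERE complete = 1');
```

Prepared statements also remove the string-concatenated SQL from the original script, which is an injection risk on top of a performance concern.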

4 Comments

Will change the query "SELECT * FROM tradeoffers" to select only specific columns. The $item['iconurl'] is indexed.
What about the field "tradeofferid", is that indexed as well?
I don't think so
Well, there's multiple instances of "WHERE tradeofferid =" in your code so chances are it will speed up the database some. DELETE will probably be the biggest performance hit though, because the table needs to rebuild its indexes again after losing a record.

There have been some good answers here, and I'll start by seconding what they've said in a brief summary and then add in my two cents.

(1) Your biggest performance gain will come from Erik's two suggestions regarding cURL. Switching to cURL will provide a small increase in performance (maybe 0.5 to 1 second or more per call), but using multi-curl to call both URLs in parallel will likely provide THE ABSOLUTE BIGGEST BENEFIT of all the suggestions here, no questions asked (since you're doing these network fetches in a loop). Here's a class that someone else wrote that simplifies multi-curl a bit:

<?php
// LICENSE: PUBLIC DOMAIN
// The author disclaims copyright to this source code.
// AUTHOR: Shailesh N. Humbad
// SOURCE: https://www.somacon.com/p539.php
// DATE: 6/4/2008

// index.php
// Run the parallel get and print the total time
$s = microtime(true);
// Define the URLs
$urls = array(
  "http://localhost/r.php?echo=request1",
  "http://localhost/r.php?echo=request2",
  "http://localhost/r.php?echo=request3"
);
$pg = new ParallelGet($urls);
print "<br />total time: ".round(microtime(true) - $s, 4)." seconds";

// Class to run parallel GET requests and return the transfer
class ParallelGet
{
  function __construct($urls)
  {
    // Create get requests for each URL
    $mh = curl_multi_init();
    foreach($urls as $i => $url)
    {
      $ch[$i] = curl_init($url);
      curl_setopt($ch[$i], CURLOPT_RETURNTRANSFER, 1);
      curl_multi_add_handle($mh, $ch[$i]);
    }

    // Start performing the request
    do {
        $execReturnValue = curl_multi_exec($mh, $runningHandles);
    } while ($execReturnValue == CURLM_CALL_MULTI_PERFORM);
    // Loop and continue processing the request
    while ($runningHandles && $execReturnValue == CURLM_OK) {
      // Wait forever for network
      $numberReady = curl_multi_select($mh);
      if ($numberReady != -1) {
        // Pull in any new data, or at least handle timeouts
        do {
          $execReturnValue = curl_multi_exec($mh, $runningHandles);
        } while ($execReturnValue == CURLM_CALL_MULTI_PERFORM);
      }
    }

    // Check for any errors
    if ($execReturnValue != CURLM_OK) {
      trigger_error("Curl multi read error $execReturnValue\n", E_USER_WARNING);
    }

    // Extract the content
    foreach($urls as $i => $url)
    {
      // Check for errors
      $curlError = curl_error($ch[$i]);
      if($curlError == "") {
        $res[$i] = curl_multi_getcontent($ch[$i]);
      } else {
        print "Curl error on handle $i: $curlError\n";
      }
      // Remove and close the handle
      curl_multi_remove_handle($mh, $ch[$i]);
      curl_close($ch[$i]);
    }
    // Clean up the curl_multi handle
    curl_multi_close($mh);

    // Print the response data
    print_r($res);
  }

}

The catch here is that this approach depends heavily on how many trade offers you have at any given time, since you're doing a network call for each one. If you have 1,000 trade offers, you might have to break them up into smaller chunks so you're not slamming the Steam API with a ton of calls all at the same time.
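The chunking described above is a one-liner with array_chunk. A sketch under assumptions: the batch size of 25 is an arbitrary illustration, not a documented Steam limit, and the IDs are stand-ins for rows read from the database:

```php
<?php
// Split the per-offer URLs into batches before handing them to ParallelGet,
// so you never fire hundreds of requests at Steam at once.
$tradeOfferIds = range(1, 60); // stand-in for IDs read from the database

$urls = array();
foreach ($tradeOfferIds as $id) {
    $urls[] = 'https://api.steampowered.com/IEconService/GetTradeOffer/v1/'
        . '?key=YOUR_KEY&tradeofferid=' . $id . '&language=en_us';
}

$batches = array_chunk($urls, 25); // 60 URLs -> batches of 25, 25 and 10
foreach ($batches as $batch) {
    // new ParallelGet($batch); // fetch up to 25 offers in parallel
}
```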

(2) If you're running it every 15 seconds, then you're likely incurring some overhead from the script starting up. You could run this script in an infinite loop to eliminate that startup time, although you'd have to ensure there are no memory leaks so your script doesn't eventually run out of memory:

<?php
set_time_limit(0);
while(true)
{
  ...your code here...

  // Wait 15 seconds before the next round
  sleep(15);
}

(3) I'm assuming your database is pretty small, but if you've got 10k records or more in any given table, then indexes ARE going to be important, as Herco mentioned. Without good indexes, your SQL queries are going to suffer.
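For completeness, the indexes from Herco's answer can be added with two plain CREATE INDEX statements. The sketch below uses PDO with in-memory SQLite only so the example is self-contained; the index names are illustrative, and the same statements work on MySQL (run them once, not on every cron tick):

```php
<?php
// Add indexes for the two columns the original script filters on.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE items (tradeofferid TEXT, iconurl TEXT, status INTEGER)');

$pdo->exec('CREATE INDEX idx_items_tradeofferid ON items (tradeofferid)');
$pdo->exec('CREATE INDEX idx_items_iconurl ON items (iconurl)');

// Every "WHERE tradeofferid = ..." and "WHERE iconurl = ..." in the original
// script can now use an index instead of scanning the whole table.
```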

However, I would focus less on #3 and more on #1 and #2 for your best improvements.

1 Comment

Thanks, I'll look into it!
