
I am attempting to run a report for instances in the DB for the last month. This will then save the results as a .CSV file. The problem I am encountering is that with this script I am getting an Internal Server Error 500.
If I use a WHERE clause on the query to filter out data, I am not getting this issue so I know the query and the function to create the .CSV file are working to some degree. I get a fully populated .CSV file. However without an additional WHERE clause, the file appears to be populating to a point with the majority of the data, but then appears to restart and the resulting file only contains about 1/8 of the overall data.
I am expecting 40K+ rows of data in the monthly .csv file.

function createFile(){

    /* This method is used to create the csv file */

    //-----Connection to DB-----//

    // Assuming a separate password property; the original passed $this->dbLogin for both UID and PWD
    $connectionInfo = array("UID" => $this->dbLogin, "PWD" => $this->dbPassword);
    $conn = sqlsrv_connect($this->serverName, $connectionInfo) or die(print_r(sqlsrv_errors()));

    //-----SQL Query-----//

    $getList = sqlsrv_query($conn, $this->queryString, array(), $this->options) or die(print_r(sqlsrv_errors()));

    //-----File creation-----//

    $path = "../" . $this->portal . "/" . $this->folder . "/" . $this->fileName . ".csv";
    $fp = fopen($path, 'w') or die("Unable to open " . $path);

    //-----First row contains the column titles-----//

    fputcsv($fp, $this->headerArray);

    //-----Add data to the csv file-----//

    $data = array();
    while ($row = sqlsrv_fetch_array($getList, SQLSRV_FETCH_ASSOC)) {
        $data['id'] = $row['id'];
        $data['fullname'] = $row['fullname'];
        $data['profile'] = $row['profile'];
        $data['starttime'] = date_format($row['starttime'], 'D jS M Y  G:i');
        $data['endtime'] = date_format($row['endtime'], 'D jS M Y  G:i');
        $endTime = date_format($row['endtime'], 'U');
        $startTime = date_format($row['starttime'], 'U');
        if ($startTime == null || $startTime == "") {
            $startTime = $endTime;
        }
        $diff = $endTime - $startTime;
        // floor(), not round(): round() would turn 90 minutes into "2:30:00"
        $data['duration'] = floor($diff / 3600) . gmdate(":i:s", $diff);
        $data['hour'] = date_format($row['starttime'], 'G');
        $data['ref'] = $row['ref'];
        $data['endType'] = $row['endType'];
        $data['problem'] = $row['problem'];
        $data['solution'] = $row['solution'];
        $data['type'] = $row['type'];

        fputcsv($fp, array_values($data));
    }

    fclose($fp); 

    sqlsrv_close($conn);    
}
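As an aside, the duration column concatenates whole hours with `gmdate(":i:s", …)`. A standalone sketch of that formatting (the helper name is hypothetical); `floor()` is the safe choice here, since `round()` would turn 90 minutes into "2:30:00":

```php
<?php
// Hypothetical helper mirroring the duration formatting in the loop:
// whole hours plus ":MM:SS", so 5400 seconds reads "1:30:00".
function formatDuration(int $seconds): string
{
    return floor($seconds / 3600) . gmdate(":i:s", $seconds);
}

echo formatDuration(5400), "\n"; // 1:30:00
```

Note that `gmdate()` wraps at 24 hours, so this only holds for durations under a day.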

The above is the function used in my class file to create the .csv file. The query I'm using is:

$this->queryString = "SELECT A.id
      ,B.firstname + ' ' + B.lastname AS fullname
      ,B.location
      ,A.profile
      ,A.siteid
      ,A.accountnumber
      ,A.starttime
      ,A.endtime
      ,A.ref
      ,A.endType
      ,A.problem
      ,A.solution
      ,A.type
      FROM trend.report A
      LEFT JOIN users.profile B
      ON A.empId = B.id
      WHERE DATEDIFF(M, A.endtime, GETDATE()) = 0
      ORDER BY A.endtime DESC";
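A minor aside on the filter itself: `DATEDIFF(M, A.endtime, GETDATE()) = 0` applies a function to the column for every row, which prevents SQL Server from using an index on `endtime`. A sketch of an equivalent, sargable "current month" filter (assuming standard SQL Server date functions):

```php
<?php
// Sketch only: the same current-month filter expressed as a range on
// A.endtime, which lets SQL Server seek an index on that column.
$monthFilter = "WHERE A.endtime >= DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()), 0)
                  AND A.endtime <  DATEADD(MONTH, DATEDIFF(MONTH, 0, GETDATE()) + 1, 0)";
```

This doesn't change the result set, but it can shorten the query's run time on a large table, which matters if the script is hitting a timeout.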

If I add, for example,

 AND A.profile ='exampleProfile'

I still get 20K+ rows, but I get a completed .CSV file.

Anything glaringly obvious with this? Or could it be due to the number of rows / the time taken to execute the script? Thanks

  • A 500 error usually generates entries in the error log of the webserver. – Commented Dec 13, 2013 at 13:08

1 Answer


Do you execute this from the command line or browser?

1) If you run this in your browser, you might be hitting an HTTP timeout (often 30s). You can raise the PHP time limit, but it would be better to execute the script from the command line:

<?php
 set_time_limit(600); // 10 minutes

2) You may be reaching the memory limit. Try raising it on the first line of your PHP file:

<?php
ini_set('memory_limit', '2048M');

The actual upper limit depends on the server, and the server must also allow overriding memory_limit from PHP files. Otherwise you can configure it directly on the webserver.
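Taken together, an illustrative first few lines for the export script might look like this (values are examples; hosts differ in what they allow you to override):

```php
<?php
// Illustrative settings only: raise both limits and surface the real
// error, so the 500 reveals whether it is a timeout or out-of-memory.
ini_set('display_errors', '1');
error_reporting(E_ALL);
set_time_limit(600);              // 10 minutes
ini_set('memory_limit', '2048M'); // only if the host allows overriding it
```

Run from the command line (e.g. `php yourscript.php`, name hypothetical), there is no HTTP timeout at all, and CLI PHP typically has its own, more generous limits.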


1 Comment

Thank you for your response, trying these out now. But what would explain the script starting the file again and only getting a little of the way through? I would expect the remaining results when it restarts, but I'm getting the first few rows back. Memory limit perhaps?
