
I am trying to convert a JSON file of 50K lines (~4 MB) into a SQL database. I am using the approach in the code below, but the function crashes before it finishes. I am not good at SQL, so I do not know if there is another approach I could follow for this purpose. I think it is because of the big amount of lines, which takes a lot of time to execute. The format of my DTO is something like this:

{"Data":[ {"name":"Variable A","value":0.321721,"timecreated":"2018-1-15T11:10:7.977Z"}, {"name":"Variable B","value":-8.932533,"timecreated":"2018-1-15T11:10:8.17Z"}, {"name":"Variable C","value":-7.068326,"timecreated":"2018-1-15T11:10:8.58Z"}, {"name":"Variable A","value":-3.580420,"timecreated":"2018-1-15T11:10:8.98Z"},

....50K lines

{"name":"Variable C","value":1.549976,"timecreated":"2018-1-15T11:10:7.977Z"}, {"name":"Variable A","value":-8.701625,"timecreated":"2018-1-15T11:10:8.17Z"}]}
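For reference, the DTO classes that this JSON would deserialize into (using the Object50KElement type name from the code below) would look roughly like this — a minimal sketch, assuming the property names match the JSON fields exactly:

```csharp
using System;
using System.Collections.Generic;

// Matches {"Data":[{"name":...,"value":...,"timecreated":...}, ...]}
public class Object50KElement
{
    public List<Variable> Data { get; set; }
}

public class Variable
{
    public string name { get; set; }
    public double value { get; set; }
    public DateTime timecreated { get; set; }
}
```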

public static async Task RunAsync([BlobTrigger("Container/Device/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, TraceWriter log)
{
    log.Info($"C# Blob trigger Entered\n Name:{name} \n Size: {myBlob.Length} Bytes");
    var sr = new StreamReader(myBlob);
    string strReadBlob = sr.ReadToEnd();
    var dto = JsonConvert.DeserializeObject<Object50KElement>(strReadBlob);
    using (SqlConnection conn = new SqlConnection(cnnString))
    {
        foreach (Variable v in dto.Data)
        {
            // Send to SQL -- one INSERT per element
            conn.Execute(XXXXXXX);
        }
    }
}
  • What is the actual error that you get? Commented Feb 6, 2018 at 8:43
  • Timeout. It seems it does not have enough time. Commented Feb 6, 2018 at 8:44
  • That seems most likely. Commented Feb 6, 2018 at 8:46
  • So I am trying to find an alternative to this code... Commented Feb 6, 2018 at 8:47

2 Answers


Your SqlConnection command is timing out, as the default timeout is 30 seconds. You could increase the timeout, but you might then hit the 5-minute Function run limit (if you use the Consumption Plan).

However, if you have 50k lines, it's probably better to use SqlBulkCopy to stream the data into the DB, rather than inserting each line individually. It's way faster and it will save you money. :)

You could adapt an example from the SqlBulkCopy documentation, or use the following code if your JSON fields match the DB columns:

public static int BulkInsert<T>(string connection, string tableName, IEnumerable<T> source)
{
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.BatchSize = 10000;
        bulkCopy.DestinationTableName = tableName;

        // Wraps the sequence as an IDataReader so it can be streamed
        var reader = new EnumerableDataReader<T>(source);

        // Map each source property to the DB column of the same name
        for (var i = 0; i < reader.FieldCount; i++)
        {
            var name = reader.GetName(i);
            bulkCopy.ColumnMappings.Add(name, name);
        }

        bulkCopy.WriteToServer(reader);
        return reader.RecordsAffected;
    }
}

You could "play" with the BatchSize to achieve better performance, or run the above function in a loop for every 1,000 records.
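Calling it from the blob-triggered function could look like this — a sketch, where the table name "dbo.Telemetry" is an assumption (substitute your own table):

```csharp
// Inside RunAsync, after deserializing the blob into the DTO:
var dto = JsonConvert.DeserializeObject<Object50KElement>(strReadBlob);

// One streamed bulk copy instead of 50K individual INSERTs
int rows = BulkInsert(cnnString, "dbo.Telemetry", dto.Data);
log.Info($"Bulk-inserted {rows} rows");
```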


1 Comment

I didn't try it yet but it is what I am looking for :) Thanks

Try using a SqlCommand object, as follows. N.B. the default command timeout is 30 seconds, but you probably don't need that long for each SQL query, so I have set it to 10. Play around with it if it still times out.

using (SqlConnection conn = new SqlConnection(cnnString))
{
    conn.Open();

    foreach (Variable v in dto.Data)
    {
        using (SqlCommand command = new SqlCommand(XXXXXXXXXX, conn))
        {
            // Setting command timeout to 10 seconds
            command.CommandTimeout = 10;
            command.ExecuteNonQuery();
        }
    }
}
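For illustration, the XXXXXXXXXX placeholder would typically be a parameterized INSERT — a sketch, assuming a hypothetical table named Telemetry with columns matching the JSON fields:

```csharp
// Hypothetical parameterized INSERT; table and column names are assumptions
using (SqlCommand command = new SqlCommand(
    "INSERT INTO Telemetry (name, value, timecreated) " +
    "VALUES (@name, @value, @timecreated)", conn))
{
    command.CommandTimeout = 10;
    command.Parameters.AddWithValue("@name", v.name);
    command.Parameters.AddWithValue("@value", v.value);
    command.Parameters.AddWithValue("@timecreated", v.timecreated);
    command.ExecuteNonQuery();
}
```

Parameterizing also avoids SQL injection and string-escaping issues that come with concatenating values into the query text.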

2 Comments

I think that the timeout is in the Azure Function, not in the SQL query
It's not a matter of the error. I just think that my approach is incorrect
