
I have just created an Azure Function which I want to be triggered by Azure Blob storage (whenever a new blob is created). I want my function code to parse the contents of the JSON file (filename.json) and use them to update the contents of a database. The contents of the files look like this:

{
    "ID": "58",
    "Symbol": "J500",
    "Content": 
      [
        {"Date": "2017-05-15", "Value": "100000"}, 
        {"Date": "2017-05-16", "Value": "200"},
        {"Date": "2017-05-17", "Value": "90000"},
        {"Date": "2017-05-18", "Value": "80000"},
        {"Date": "2017-05-19", "Value": "70000"}
      ]
}

I have barely ever written code in C# before and thought I would be able to use Python for this, but it turns out I am restricted to C#. Any help with an elaborate description and explanation would be greatly appreciated.

I am also fairly new to Azure, so if anyone has a better suggestion on how to achieve what I am trying to do (maybe using a different trigger, different bindings, or even a different service rather than Functions), I'm open to suggestions and would gladly appreciate it. I currently have just the default code:

public static void Run(Stream myBlob, string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
}

Also, as a side note, I cannot see my logs unless I go to the Monitor screen; any help with this would also be very much appreciated.

Thanks!

1 Answer

First, is there a reason that you cannot use Python? For Functions, it's 'experimental' but mostly for performance reasons. It should be able to get the job done.

There's no Python template for blobs, but if you create a QueueTrigger template, you can edit the function.json and python script that are generated to get it working.

function.json:

{
  "bindings": [
    {
      "name": "input",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-python/{fileName}",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}

run.py:

import os

# read the blob contents and write to stdout
inputMessage = open(os.environ['input']).read()
message = "Python script processed blob '{0}'".format(inputMessage)
print(message)
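If you go the Python route, the next step the question asks about is parsing the blob contents. Here is a hedged sketch of that step, using only the standard-library `json` module against the file format shown in the question; the inline sample payload and the flattened tuple shape are assumptions for illustration, not part of the original script:

```python
import json

# Sample payload matching the file format shown in the question
inputMessage = '''{
    "ID": "58",
    "Symbol": "J500",
    "Content": [
        {"Date": "2017-05-15", "Value": "100000"},
        {"Date": "2017-05-16", "Value": "200"}
    ]
}'''

doc = json.loads(inputMessage)

# Flatten into (ID, Symbol, Date, Value) tuples, one per Content entry,
# so each tuple maps naturally onto one database row
rows = [(doc["ID"], doc["Symbol"], entry["Date"], entry["Value"])
        for entry in doc["Content"]]
print(rows)
```

In the actual function, `inputMessage` would be the blob text read via `open(os.environ['input']).read()` as in the template above.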

For C#, it's a fairly broad question but I can point you in the right direction.

First, you can deal with JSON in .NET by using Json.NET: http://www.newtonsoft.com/json. Support is built into Azure Functions, so with this code you can convert your blob text into a JObject (a Json.NET type) and then log it:

#r "Newtonsoft.Json"

using Newtonsoft.Json.Linq;

public static void Run(string myBlob, string name, TraceWriter log)
{
    JObject json = JObject.Parse(myBlob);
    log.Info(json.ToString());
}

From there, you'll need to get it into a database. If you're using SQL, there's a nice guide to getting set up in Functions with SqlClient here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup.

Once you have SqlClient working, it's a matter of taking your JObject and turning it into SQL records. Here's a simple example of an insert: C# - SQLClient - Simplest INSERT
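To illustrate the shape of those inserts, here is a hedged Python sketch using the standard-library `sqlite3` module purely as a stand-in for the real connection; the parameterized-INSERT pattern carries over to SqlClient (or any other driver), and the `SymbolValues` table name and column layout are hypothetical:

```python
import sqlite3

# Rows flattened from the JSON document: (ID, Symbol, Date, Value)
rows = [
    ("58", "J500", "2017-05-15", "100000"),
    ("58", "J500", "2017-05-16", "200"),
]

# In-memory database as a stand-in for the real SQL connection
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE SymbolValues (
    ID TEXT, Symbol TEXT, Date TEXT, Value TEXT)""")

# Use parameter placeholders rather than string concatenation,
# both for safety and so the driver handles quoting
conn.executemany(
    "INSERT INTO SymbolValues (ID, Symbol, Date, Value) VALUES (?, ?, ?, ?)",
    rows)
conn.commit()
```

The key point is the one-insert-per-Content-entry loop with bound parameters; the equivalent SqlClient code in the linked guide uses `SqlCommand` with `@`-named parameters instead of `?`.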

5 Comments

Thank you for your answer. In response to why I cannot use Python: there is no option in the list of possible triggers for a Python blob storage trigger, only C# and F#. Also, I have still not been able to get this code snippet to work; I still get a compilation error which I believe is caused by the use of Newtonsoft.Json. Do I need to do any sort of import to be able to use the methods it provides? The code I have is pretty much the same as yours but it still doesn't work.
Blob triggers are supported in Python; there's just no template for them. I've updated the answer above to show how to create one by starting with a queue trigger. If you still want to pursue C#, let me know and we can figure out what is wrong; that code should run as a straight copy-paste.
Thanks, this was very helpful and it does work. The only problem I have now is that whenever the trigger fires (that is, a file is loaded to blob storage), the function runs on all the files in that container. Do you know of any way to make the function run only on the one blob that triggered it?
The first time the app runs, it will process all blobs in the container. After that, it will only run against new blobs or changes to existing blobs. Are you touching any other blobs in the container when you add a new one?
Yes, I was unknowingly doing that from my Python script! Rookie mistake, I guess. Thank you very much, you have been of much help. I also wanted to know if there is a way to access the metadata of the blob from the function code.
