
I need to archive data from a SQL Server table into an Azure blob storage container.

The container is called Core-Backups, and I connect to it like so:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount storageAccount = CloudStorageAccount.Parse("snip");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("Core-Backups");

I would like the archived data to go into a folder structure like:

tablename/year/month/day/hour
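
Something like this is what I have in mind for building the blob name (a rough sketch; the tableName value, the file name, and the use of DateTime.UtcNow are just placeholders):

// Build a blob name like "table1/2017/04/24/09/archive.csv".
// The slashes produce the virtual folder structure above.
string tableName = "table1";
DateTime now = DateTime.UtcNow;
string blobName = $"{tableName}/{now:yyyy}/{now:MM}/{now:dd}/{now:HH}/archive.csv";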

I have some questions:

1. Do I need to create the folder structure first, or does Azure create it when I upload content? Right now my container is empty.

2. Is there a standard way to save a DataTable to an Azure blob?

For a test, I tried the following code to upload a text file into an example of the folder structure I want, but it failed with "The remote server returned an error: (400) Bad Request."

CloudBlobContainer container = blobClient.GetContainerReference("Core-Backups");
CloudBlockBlob blob = container.GetBlockBlobReference("table1/2017/04/24/9/file1.txt");
blob.UploadText("lorem ipsum");

Edit: I just read that container names must follow DNS naming rules, so I changed the GetContainerReference line to "core-backups" and it worked.
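
For reference, the working version of the test after that change (only the container name differs):

// "core-backups" satisfies the DNS naming rules (lowercase letters, numbers, hyphens).
CloudBlobContainer container = blobClient.GetContainerReference("core-backups");
// No folders have to be created up front - the slashes in the blob name act as virtual folders.
CloudBlockBlob blob = container.GetBlockBlobReference("table1/2017/04/24/9/file1.txt");
blob.UploadText("lorem ipsum");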

However, question 2 remains: is there a recommended way to upload a DataTable?

  • I wouldn't use a DataTable at all. That's a .NET idea. Use some common format that anything can understand and that can better represent the structure of your data: JSON, XML, etc. (see the sketch after these comments). Commented Apr 24, 2017 at 15:25
  • Possible duplicate of Uploading DataTable to Azure blob storage Commented Apr 24, 2017 at 16:08
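
Following the first comment's suggestion, a minimal sketch of one way to do this: serialize the DataTable to a flat text format (CSV below, though JSON or XML would work the same way) and upload the result with UploadText. The ToCsv helper, the blob name, and the lack of CSV escaping are simplifications, not a production implementation.

using System.Data;
using System.Linq;
using System.Text;

// Turn a DataTable into simple CSV text (no quoting/escaping - fine for plain values only).
static string ToCsv(DataTable table)
{
    var sb = new StringBuilder();
    // Header row from the column names.
    sb.AppendLine(string.Join(",", table.Columns.Cast<DataColumn>().Select(c => c.ColumnName)));
    // One line per data row.
    foreach (DataRow row in table.Rows)
    {
        sb.AppendLine(string.Join(",", row.ItemArray.Select(v => v.ToString())));
    }
    return sb.ToString();
}

// Usage:
// CloudBlockBlob blob = container.GetBlockBlobReference("table1/2017/04/24/09/table1.csv");
// blob.UploadText(ToCsv(myDataTable));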
