
If I want to save binary data to Azure Table storage, and I create a class that inherits from TableServiceEntity, what datatype should I use? And how do I check the length of that datatype to ensure it's not over 64K?

public class SomeEntity : TableServiceEntity
{
      public whattype BinaryData { get; set; }
}
  • For binary data I tend to put it in a Blobstore and then a reference to that in my table. Commented Mar 26, 2012 at 20:43
  • I like to query them in one transaction, since I always need both and the binary is less than 64K. Commented Mar 26, 2012 at 20:52
  • You could also check the FatEntities feature of Lokad.Cloud (code.google.com/p/lokad-cloud/wiki/FatEntities); it might suit your needs best. Commented Mar 27, 2012 at 5:42

1 Answer


For binary data, a byte[] of length <= 64K is all that is necessary. The table storage client converts it to Base64 for transport purposes only; storage is in binary. If you want to store more than 64K, you can split it across multiple columns.
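A minimal sketch of the entity and the length check, assuming the legacy `Microsoft.WindowsAzure.StorageClient` SDK from the question (the `EntityChecks` helper and its names are illustrative, not part of any SDK):

```csharp
// Requires the legacy Azure SDK for the real base class:
// using Microsoft.WindowsAzure.StorageClient;

public class SomeEntity // : TableServiceEntity (base class from the legacy SDK)
{
    // byte[] maps to the table storage Binary (Edm.Binary) property type,
    // which is limited to 64 KB per property.
    public byte[] BinaryData { get; set; }
}

public static class EntityChecks
{
    public const int MaxBinaryPropertySize = 64 * 1024;

    // Verify the payload fits in a single binary property before inserting.
    public static bool FitsInSingleProperty(byte[] data)
    {
        return data == null || data.Length <= MaxBinaryPropertySize;
    }
}
```

Calling `EntityChecks.FitsInSingleProperty(entity.BinaryData)` before the insert lets you fail fast instead of getting a storage error back from the service.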

I have written an alternate Azure table storage client, Lucifure Stash, which supports large data columns > 64K, arrays, enums, serialization, public and private properties and fields, and more. It is open source and available at http://lucifurestash.codeplex.com and via NuGet.


2 Comments

Interesting, I will go check. Can it support more than 1 MB of data in one row?
Azure table storage does not support a total row size of more than 1 MB. This includes overhead, such as the space used to store the property names. Of course you can add a level of abstraction and split the data across multiple rows, but Blob storage would be the preferred implementation here.
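Splitting a payload across multiple columns (or rows) as described above can be sketched as plain chunking; this helper is illustrative and not from any SDK:

```csharp
using System;
using System.Collections.Generic;

public static class BinarySplitter
{
    // Split a payload into chunks of at most 64 KB, so each chunk fits in
    // one binary property. The same approach works for splitting across
    // rows when the total exceeds the 1 MB row limit.
    public static List<byte[]> Split(byte[] data, int chunkSize = 64 * 1024)
    {
        var chunks = new List<byte[]>();
        for (int offset = 0; offset < data.Length; offset += chunkSize)
        {
            int size = Math.Min(chunkSize, data.Length - offset);
            var chunk = new byte[size];
            Buffer.BlockCopy(data, offset, chunk, 0, size);
            chunks.Add(chunk);
        }
        return chunks;
    }
}
```

Each chunk would then be assigned to its own property (e.g. `Data0`, `Data1`, ...) before insert, and concatenated back in the same order on read.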
