I'm writing an ODBC binding, and while implementing support for binary types I've found that if I insert 0x, which the MSDN docs describe as the empty binary string, reading the value back returns a buffer of one byte containing zero. Clearly, that does not have length 0:
1> CREATE TABLE demo (bar BINARY)
2> GO
1> INSERT INTO demo VALUES (0x)
2> GO
(1 rows affected)
1> SELECT * FROM demo WHERE bar = 0x00
2> GO
bar
----
0x00
(1 rows affected)
1> SELECT DATALENGTH(bar) FROM demo WHERE bar = 0x00
2> go
-----------
1
However, if I select the 0x literal directly, it is empty:
1> SELECT DATALENGTH(0x)
2> go
-----------
0
(1 rows affected)
So it seems that pulling a value out of the table enforces some kind of "minimum" of one zero byte?
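To rule out a surprise in the table definition, the server's view of the column can be checked with the standard INFORMATION_SCHEMA view (nothing here is specific to my binding; I haven't run this against the demo table above):

1> SELECT DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
2> FROM INFORMATION_SCHEMA.COLUMNS
3> WHERE TABLE_NAME = 'demo' AND COLUMN_NAME = 'bar'
4> GO

I'd expect this to report the declared type and length of bar, which might explain where the extra byte comes from.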
The problem is that this violates the most basic put/get law: "what you put in is what you get out." If someone uses my ODBC binding to insert empty data, they will not get empty data back. That is broken.
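As a sanity check of the round trip itself, I'd expect a variable-length column to behave differently (a sketch I haven't run yet; demo2 is a throwaway table name, not from the docs):

1> CREATE TABLE demo2 (bar VARBINARY(10))
2> GO
1> INSERT INTO demo2 VALUES (0x)
2> GO
1> SELECT DATALENGTH(bar) FROM demo2
2> GO

If that prints 0, the put/get law holds for VARBINARY and the problem is specific to the fixed-length BINARY column.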
What am I missing?