
I'm pulling some external data into my MSSQL server. Several columns of the incoming data are marked as 'number' (it's a JSON file). It's millions of rows in size, and many of the columns appear to be decimal(18,2), like 23.33. But I can't be sure it will always be like that; in fact, a few have been 23.333, or longer numbers like 23.35555555, which will mess up my import.

So my question: given that a column is going to have some kind of number imported into it, but I can't really be sure how big it will be or how many decimal places it will have, do I have to resort to making my column a varchar, or is there a very generic number type I'm not thinking of?

Is there a max-size decimal, sort of like using VARCHAR(8000) or VARCHAR(MAX)?

Update

This is the 'data type' of number that I'm pulling in: https://dev.socrata.com/docs/datatypes/number.html#

Looks like it can be pretty much any number, per their documentation: "Numbers are arbitrary precision, arbitrary scale numbers."

  • You could use decimal(18,3) to allow for values with three decimal places.
  • Float, perhaps, but you lose precision with that; rounding can occur. It's an approximate-number data type. Not sure what this value represents for you, or how you are using it. If you can find out the maximum precision and scale you'd see, then I'd make the column that... like (32,6) or whatever...
  • Precision and scale both max out at 38; the scale (the count of digits to the right of the decimal point) must be <= the precision, so DECIMAL(38,38) is the extreme case (see the sketch after these comments).
  • learn.microsoft.com/en-us/sql/t-sql/data-types/…
  • JSON's numbers ought to be safe to parse as float, because that's what they are in JavaScript: double-precision floating point. Anything that's not safe to parse as that will probably be put into a string. Of course, you may still need another type in the database depending on how those numbers are going to be used, but the JSON can't tell you that.
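
A quick sketch of what those precision/scale limits mean in practice (the values here are made up for illustration): with DECIMAL(38,38), all 38 digits sit to the right of the decimal point, so any value of 1 or more overflows; a split such as DECIMAL(38,10) is usually more practical.

    -- All 38 digits are fractional, so only values < 1 fit:
    DECLARE @ok decimal(38, 38) = 0.12345678901234567890123456789012345678;
    SELECT @ok;

    -- Overflows: scale 38 leaves no digits for the integer part.
    -- SELECT CAST(23.33 AS decimal(38, 38));  -- Msg 8115, arithmetic overflow

    -- A wider split keeps room on both sides of the point:
    SELECT CAST(23.35555555 AS decimal(38, 10));  -- 23.3555555500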

2 Answers


The way I handle things like this is to import the raw data into a staging table in a varchar(max) column.

Then I use TRY_PARSE() or TRY_CONVERT() when moving it to the desired datatype in my final destination table.

The point here is that the shape of the incoming data shouldn't determine the datatype you use. The datatype should be determined by the usage of the data once it's in your table. And if the incoming data doesn't fit, there are ways of making it fit.
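
Here's a minimal sketch of that pattern; the table and column names are hypothetical, not from the question. TRY_CONVERT() returns NULL instead of raising an error when a value doesn't fit, so bad rows can be inspected rather than killing the whole load. (Both TRY_PARSE() and TRY_CONVERT() require SQL Server 2012 or later.)

    -- Stage the raw JSON values as text first.
    CREATE TABLE dbo.StageImport (RawAmount varchar(max));

    -- The destination uses whatever type the data's usage calls for.
    CREATE TABLE dbo.FinalImport (Amount decimal(38, 10));

    -- TRY_CONVERT yields NULL for anything that won't convert.
    INSERT INTO dbo.FinalImport (Amount)
    SELECT TRY_CONVERT(decimal(38, 10), RawAmount)
    FROM dbo.StageImport;

    -- Inspect the rows that failed conversion.
    SELECT RawAmount
    FROM dbo.StageImport
    WHERE RawAmount IS NOT NULL
      AND TRY_CONVERT(decimal(38, 10), RawAmount) IS NULL;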


3 Comments

You should also point out the SQL Server version that has those functions.
I like this idea. My current testing just dumps it into a STAGE table as VARCHAR(8000). I wasn't aware of the TRY_PARSE() or TRY_CONVERT() functions; thank you!
I've updated my question; I have more information on the 'number' data type: "Numbers are arbitrary precision, arbitrary scale numbers."

What do those numbers represent? If they are just values to display, you could set float as the datatype and be good to go. But if they are coordinates, currencies, or anything you need for absolutely precise calculations, float can sometimes give rounding problems. In that case, set your desired minimal precision with decimal and simply cut whatever goes beyond it. For instance, if most of the numbers have two decimals, you could go with 3 or 4 decimal places to be sure, but anything over that will be cut.
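
To make the trade-off concrete, here is an illustrative sketch (the values and column sizes are assumptions, not from the answer): float carries binary representation error, while decimal stays exact but, when you narrow the scale, rounds by default; ROUND() with a non-zero third argument truncates instead.

    -- float is approximate: the classic 0.1 + 0.2 case.
    SELECT CAST(CAST(0.1 AS float) + CAST(0.2 AS float) AS decimal(20, 18));
    -- 0.300000000000000044, not 0.3

    -- Narrowing a decimal's scale rounds by default:
    SELECT CAST(23.35555555 AS decimal(18, 4));               -- 23.3556
    -- ROUND(value, n, 1) truncates rather than rounds:
    SELECT ROUND(CAST(23.35555555 AS decimal(18, 8)), 4, 1);  -- 23.35550000

This pairs well with the staging approach in the other answer: pick the widest decimal your use of the data justifies, and let the conversion flag anything that still doesn't fit.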

1 Comment

Thanks for the answer! I've updated my question; I have more information on the 'number' data type: "Numbers are arbitrary precision, arbitrary scale numbers."
