
I am fetching the following details for columns from SQL Server:

Size
Precision
Scale

I have noticed that for the int datatype I get 10 as the precision, but when I did some research I couldn't find anything saying that int has a precision of 10, either in the documentation or in SQL Server Management Studio.

So I don't understand how 10 is being reported as the precision for the int datatype.

Table:

(screenshot of the Employee table definition)

Screenshot:

(screenshot of the program's output)

Code:

// Restrictions: database, schema, table, column (leave the column slot null to get all columns)
String[] columnRestrictions = new String[4];
columnRestrictions[0] = "MyDb";
columnRestrictions[1] = "dbo";
columnRestrictions[2] = "Employee";

using (SqlConnection con = new SqlConnection("MyConnectionString"))
{
    con.Open();
    var columns = con.GetSchema("Columns", columnRestrictions).AsEnumerable()
        .Select(t => new
        {
            Name = t.Field<string>("COLUMN_NAME"),
            Datatype = t.Field<string>("DATA_TYPE"),
            IsNullable = t.Field<string>("IS_NULLABLE"),
            Size = t.Field<int?>("CHARACTER_MAXIMUM_LENGTH"),
            NumericPrecision = t.Field<int?>("NUMERIC_PRECISION"),
            NumericScale = t.Field<int?>("NUMERIC_SCALE")
        })
        .ToList();
}
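A quick way to cross-check these values outside of ADO.NET: SqlClient's "Columns" schema collection is populated from the INFORMATION_SCHEMA.COLUMNS view, so (assuming the MyDb/dbo/Employee names from the example above) the same figures can be queried directly in SSMS:

```sql
SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE,
       CHARACTER_MAXIMUM_LENGTH, NUMERIC_PRECISION, NUMERIC_SCALE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_CATALOG = 'MyDb'
  AND TABLE_SCHEMA  = 'dbo'
  AND TABLE_NAME    = 'Employee';

-- For any int column this reports NUMERIC_PRECISION = 10 and NUMERIC_SCALE = 0.
```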
  • If you look at the int datatype you'll see that the largest values it can store require (up to) 10 decimal digits to represent. So if something's insisting on giving a precision for an int, that's not an unreasonable answer to provide. Commented Dec 12, 2016 at 14:14
  • Precision is the number of digits in a number; the maximum number of digits in a 32-bit int is 10 (and 19 for a bigint). Commented Dec 12, 2016 at 14:15
  • stackoverflow.com/a/28836543/284240 Commented Dec 12, 2016 at 14:16
  • @Damien_The_Unbeliever: So my output of 10 is correct for the int datatype? Commented Dec 12, 2016 at 14:18
  • @TimSchmelter: Sir, I would like to ask one thing: why did you delete your answer on my other question (stackoverflow.com/questions/41062636/…) even though it was correct? I was about to mark it as the accepted answer, but you deleted it. Commented Dec 12, 2016 at 14:22

1 Answer


Precision refers to the number of significant decimal digits a number can represent.

SQL Server's int datatype, like the int datatype in C#, ranges from -2147483648 to 2147483647, so its values can require up to 10 significant decimal digits.

The precision of int is therefore always 10.
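This is easy to verify with a few lines of C# that count the decimal digits of int.MaxValue (and of long.MaxValue, for comparison with bigint). A minimal sketch; DecimalDigits is a helper name introduced here, not anything from the question:

```csharp
using System;

class PrecisionDemo
{
    // Count the decimal digits needed to represent a non-negative value.
    public static int DecimalDigits(long value)
    {
        int digits = 1;
        while (value >= 10)
        {
            value /= 10;
            digits++;
        }
        return digits;
    }

    static void Main()
    {
        // int.MaxValue = 2147483647 -> 10 digits, matching NUMERIC_PRECISION for int
        Console.WriteLine(DecimalDigits(int.MaxValue));   // 10
        // long.MaxValue = 9223372036854775807 -> 19 digits, matching bigint
        Console.WriteLine(DecimalDigits(long.MaxValue));  // 19
    }
}
```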


2 Comments

It is, though it's not very useful. It's only really a useful datum in cases where it can vary, as in the SQL decimal datatype.
Thank you so much for helping me clear my doubt. Please keep helping like this.
