I have a few tables that store quite large numbers, so I've chosen decimal(17, 2) as the data type for all the columns.
Example of schema:
Table1 (value1, value2, value3, value4, value5)
Table2 (value1, value2, value3, value4, value5)
For argument's sake, if I have 100,000 rows in each table and every value is 100.00, how will query performance compare to using decimal(5, 2)? Will the difference be negligible? Will the main difference be the storage space used?
float can hold larger values than decimal. A decimal(17,2) takes up 9 bytes; a decimal(5,2) takes up 5. That is a difference of 4 bytes per value, so with 2 tables, 5 columns, and 100,000 rows each, the storage difference is about 4 MB.
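A quick sketch of that arithmetic in Python (the per-precision byte sizes follow SQL Server's documented storage rules for decimal; the helper function is just for illustration):

```python
def decimal_storage_bytes(precision: int) -> int:
    """Storage in bytes for a SQL Server decimal of the given precision.

    Per SQL Server's storage rules: precision 1-9 -> 5 bytes,
    10-19 -> 9, 20-28 -> 13, 29-38 -> 17.
    """
    if 1 <= precision <= 9:
        return 5
    if precision <= 19:
        return 9
    if precision <= 28:
        return 13
    if precision <= 38:
        return 17
    raise ValueError("precision must be between 1 and 38")

tables, columns, rows = 2, 5, 100_000

# decimal(17,2) costs 9 bytes, decimal(5,2) costs 5: 4 bytes per value
diff_per_value = decimal_storage_bytes(17) - decimal_storage_bytes(5)

total_diff = tables * columns * rows * diff_per_value
print(total_diff)  # 4,000,000 bytes, i.e. roughly 4 MB
```

Note this is only the raw row-data difference; actual on-disk size also depends on page fill, row overhead, and indexes.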