Please consider the cases below:
Case 1: Initializing a variable with a literal, followed by a variable which I have never declared or initialized before.
var x = 10, i;
When I print the value of x, it prints 10. I am wondering how this is even syntactically correct. It is unexpected. Is it a bug?
Case 2: Initializing a variable with a variable which I have declared and initialized before, followed by a literal.
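For clarity, Case 1 seems to behave as if it were two separate declarations (the comma in a `var` statement separates declarators, not expressions):

```javascript
// Declares TWO variables in one statement:
// x, initialized to 10, and i, declared but left undefined.
var x = 10, i;

// Apparently equivalent to:
// var x = 10;
// var i;
console.log(x); // 10
console.log(i); // undefined
```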
var i = 10;
var x = i, 10;
Then I tried the above, thinking that if Case 1 works, this should also work. To my surprise, it didn't. Instead it gave an error: Uncaught SyntaxError: Unexpected number.
Case 3: Initializing a variable with two literals.
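If it helps, I also noticed that wrapping the right-hand side in parentheses (assuming the comma *operator* was the intent, which is my guess) does parse:

```javascript
var i = 10;

// After the comma in a `var` statement, the parser expects another
// identifier to declare, so a bare literal is a syntax error:
// var x = i, 10;   // Uncaught SyntaxError: Unexpected number

// Parentheses force the comma-operator reading instead:
// evaluate i, discard it, then take 10 as the value.
var x = (i, 10);
console.log(x); // 10
```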
var x = 10, 10;
I tried the above, but it gave me the same error: Uncaught SyntaxError: Unexpected number. Now I am very confused.
Case 4: Initializing a variable with two variables, both of which I have declared and initialized before.
var i = 10;
var j = 20;
var x = i, j;
The above case gave the expected result, i.e. x is 10.
But after all these cases, why does JavaScript syntactically allow this in the first place, and even with a variable I have never declared before? I am very confused. Is something wrong, like a bug, or is there an explanation for this?
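For completeness, Case 4 appears to parse as two declarators, the second of which just re-declares j (which `var` tolerates; a comment pointed out that with `let` the same line throws "Uncaught SyntaxError: Identifier 'j' has already been declared"):

```javascript
var i = 10;
var j = 20;

// Two declarators: `x = i` and a bare re-declaration of `j`.
// Re-declaring with `var` and no initializer leaves j's value untouched.
var x = i, j;
console.log(x); // 10
console.log(j); // 20
```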
Note:
- All the above cases were tried in Chrome's console as well as in VS Code.
- The results are the same when I try them in TypeScript.