I have a question that has been bothering me. If I declare a variable a and then test for it using the "in" operator, I get true; if I do the same test using dot notation, I get false. For example:
var a;
if ('a' in window) {
    console.log('a in window'); // this is written to the console
}
if (window.a) {
    console.log('window.a'); // nothing happens here
}
Now I noticed that when I give a a value, both tests work. Look at this, and notice how I first check that it doesn't exist:
var a;
if ('a' in window) {
    console.log('a in window'); // this is written to the console
}
if (!window.a) {
    console.log('!window.a'); // this is written to the console
}
a = 1;
if (window.a) {
    console.log('window.a'); // this is written to the console
}
So why does the dot notation only work once the variable has been assigned a value? A silly question, I know, but so far I haven't found a definite answer!
window.a yields undefined, and undefined evaluates to false.
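That is the key distinction: 'a' in window asks whether the property exists at all, while if (window.a) reads the property's value and checks its truthiness. A hoisted var a creates the property with the value undefined, so the first test passes and the second fails. A minimal sketch, using a plain object in place of window so it also runs outside a browser (in a browser, a top-level var a creates window.a with the value undefined):

```javascript
// Stand-in for window: the property 'a' exists but holds undefined,
// which is exactly what `var a;` produces at global scope in a browser.
var obj = { a: undefined };

// The `in` operator checks EXISTENCE of the property, ignoring its value:
console.log('a' in obj); // true

// Dot notation retrieves the VALUE; undefined is falsy, so this branch is skipped:
if (obj.a) {
    console.log('never runs');
}

// Once a truthy value is assigned, both tests pass:
obj.a = 1;
console.log('a' in obj, Boolean(obj.a)); // true true
```

The same reasoning explains the !window.a check in the second snippet: it is true both when the property is missing and when it holds any falsy value (undefined, 0, '', null, NaN, false), so it cannot distinguish "undeclared" from "declared but undefined"; only the in operator (or hasOwnProperty) can.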