Why does JavaScript throw a ReferenceError if we try to use a variable that is not declared, but allows us to set a value for that same undeclared variable?
For example:
a = 10; // creates global variable a and sets its value to 10, even though it's undeclared
alert(b); // throws a ReferenceError
So why is it a ReferenceError for b but not for a?
That's just how the language works. In non-strict mode, an assignment to an undeclared symbol is implicitly treated as creating a global variable. In strict mode, that's an error.
To use strict mode, a global script block or a function should start with the statement:
"use strict";
That is, a simple expression statement consisting of the string "use strict". That will prevent the accidental creation of global variables (which can still be created explicitly), and impose a few other restrictions.
According to the documentation:
Variable declarations, wherever they occur, are processed before any code is executed. The scope of a variable declared with var is its current execution context, which is either the enclosing function or, for variables declared outside any function, global.
Assigning a value to an undeclared variable implicitly creates it as a global variable (it becomes a property of the global object) when the assignment is executed. The differences between declared and undeclared variables are:
Declared variables are constrained in the execution context in which they are declared. Undeclared variables are always global.
Declared variables are created before any code is executed. Undeclared variables do not exist until the code assigning to them is executed.
console.log(a); // Throws a ReferenceError.
console.log('still going...'); // Never executes.
var a;
console.log(a); // logs "undefined" or "" depending on browser.
console.log('still going...'); // logs "still going...".
Declared variables are a non-configurable property of their execution context (function or global). Undeclared variables are configurable (e.g. can be deleted).
Because of these three differences, failure to declare variables will
very likely lead to unexpected results. Thus it is recommended to
always declare variables, regardless of whether they are in a function
or global scope. And in ECMAScript 5 strict mode, assigning to an
undeclared variable throws an error.
a = 10;
implicitly declares a as a global variable;
i.e. in the background it first creates the variable globally and then sets its value.
So whenever we write
a = 10;
what effectively happens is:
var a;
a = 10;
hence a always has a value by the time we read it. But with alert(b), no assignment to b has ever executed, so no binding for b exists at all, and reading it throws a ReferenceError.
And as Pointy said,
In non-strict mode, an assignment to an undeclared symbol is implicitly treated as creating a global variable. In strict mode, that's an error.
It's always better practice to use "use strict".
Related
In ES5, typeof is considered safe, as it will not throw a ReferenceError when checked against a non-declared value, such as:
console.log(typeof undeclaredVar); // "undefined"
However, in ES6, checking typeof undeclaredLetConst will throw an error if the variable is later declared with let or const. If it was declared with var, it works normally.
console.log(typeof undeclaredLetConst); // ReferenceError
let undeclaredLetConst = "hello";
What's happening there?
Why it works with var declarations
When a JavaScript engine looks through a lexical scope block and finds a variable declaration with var, it hoists the declaration to the top of the enclosing function (or of the global scope, if the declaration appears outside any function).
Hence typeof will never fail here, since the variable it's checking has been hoisted beforehand.
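That hoisting can be sketched in a few lines (the name varVariable is just an illustration):

```javascript
// The var declaration below is hoisted to the top of this scope,
// so the binding already exists (with the value undefined) up here.
var before = typeof varVariable; // "undefined" — hoisted but not yet assigned
var varVariable = "hello";
var after = typeof varVariable;  // "string"
console.log(before, after);
```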
The Temporal Dead Zone (TDZ)
The TDZ is never named explicitly in the ECMAScript specification, but the term is used to describe why let and const declarations are not accessible before their declaration.
Why it fails with const and let
When a JavaScript engine looks through a lexical scope block and finds a variable declaration with let or const, it places the declaration in the TDZ. Any attempt to access a variable in the TDZ results in a runtime error.
The declaration is removed from the TDZ during runtime once the flow reaches the declaration itself.
console.log(typeof undeclaredLetConst); // "undefined"
if (1) {
    let undeclaredLetConst = "no errors!";
}
undeclaredLetConst isn't in the TDZ when the typeof operation executes, because it occurs outside of the block in which undeclaredLetConst is declared. That means there is no value binding, and typeof simply returns "undefined".
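For contrast, here is a sketch of the failing case, where typeof runs inside the same block as the let declaration (tdzVar is a made-up name):

```javascript
let caught = null;
try {
  {
    console.log(typeof tdzVar); // tdzVar is in the TDZ at this point — this throws
    let tdzVar = "hello";
  }
} catch (e) {
  caught = e.name;
}
console.log(caught); // "ReferenceError"
```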
Source: An awesome book by Nicholas C. Zakas, Understanding ECMAScript 6.
Consider the following code:
var a = 'a';
function b() {
    console.log(a);
    if (!a) {
        var a = 'b';
    }
}
b();
Running b() prints undefined to the console. However, if you remove the if statement, or even simply remove the var keyword from the expression within the if statement so that you're redefining the outer a variable, the string a will be printed to the console as expected.
Can anyone explain this? The only cause I can think of is that this is a race condition, and the if statement is running just a tad faster than the console.log.
This is not a race condition; it's a language feature, working exactly as the designers of the JavaScript language intended.
Because of hoisted variable definitions (where all variable definitions within a function scope are hoisted to the top of the function), your code is equivalent to this:
var a = 'a';
function b() {
    var a;
    console.log(a);
    if (!a) {
        a = 'b';
    }
}
b();
So, the locally declared a hides the globally declared a and initially has a value of undefined until your if statement gives it a value.
You can find lots of discussion about this characteristic of the JavaScript language by searching for "JavaScript hoisting".
When you use a var statement in a function, it creates a new variable that is local to that function.
All the variables declared in a function are moved to the top of the function, irrespective of where they are actually declared. This is called hoisting.
The hoisted variables have the value undefined by default, until they are explicitly assigned a value.
It prints undefined because of the 3rd point.
In your case, you declared a variable a within the if block. Since variables are hoisted, the declaration is moved to the top of the function, and it has the same name as the outer variable. When you access a in the function, the engine first looks in the current scope for a variable with that name, and checks outer scopes only if it is not found in the local scope. So the local a shadows the outer a. When you remove the var statement, there is no a in the local scope, so the outer a is used. That is why it prints a.
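The no-shadowing case can be sketched like this (noShadow is an illustrative name): with no local declaration, the scope chain resolves the identifier to the outer variable, and assignment writes to it.

```javascript
var x = 'outer';
function noShadow() {
  console.log(x); // "outer" — no local x, so the scope chain finds the outer one
  x = 'changed';  // assigns to the outer variable, not a new local
}
noShadow();
console.log(x); // "changed"
```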
In a JavaScript environment, can I declare a variable before a function to make the variable reachable at the global level?
For instance:
var a;
function something(){
    a = Math.random()
}
Will this make "a" a global variable?
or is using...
var a = function(){
    var b = Math.random();
    return b;
}
document.write(a())
Really the only way to do it?
Is there a way to make "b" global other than calling the function "a()"?
There are basically 3 ways to declare a global variable:
Declaring it in the global scope, outside of any function scope.
Explicitly assigning it as a property of the window object: window.a = 'foo'.
Not declaring it at all (not recommended). If you omit the var keyword when you first use the variable, it'll be declared globally no matter where in your code that happens.
Note #1: When in strict mode, you'll get an error if you don't declare your variable (as in #3 above).
Note #2: Using the window object to assign a global variable (as in #2 above) works fine in a browser environment, but might not work in other implementations (such as nodejs), since the global scope there is not a window object. If you're using a different environment and want to explicitly assign your global variables, you'll have to be aware of what the global object is called.
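For a portable version of #2, ES2020's globalThis names the global object in any environment (window in browsers, global in Node.js, self in workers). The property name appConfig below is just an illustration:

```javascript
// Explicitly create a global, without caring what the global object is called.
globalThis.appConfig = { debug: true };

// Readable as a plain identifier anywhere afterwards.
console.log(appConfig.debug); // true
```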
Will this make "a" a global variable?
A var declaration makes the variable local to the enclosing scope, which is usually a function's scope. If you are executing your code in the global scope, then a will be a global variable. You could also just omit the var, and your variable would then be implicitly global (though explicit declaration better shows your intention).
Is there a way to make "b" global other than calling the function "a()"?
Your b variable is always local to the function a and will never leave it, unless you remove the var.
Before you think of making a variable global in scope, you should consider JavaScript global namespace pollution. The more global variables you declare, the more likely it is that your application will conflict with another application's namespace and break. As such, it is highly important to minimize the number of global variables.
This paragraph is from the book JavaScript: The Definitive Guide, 6th edition, page 58:
When any identifier appears by itself in a program, JavaScript assumes it is a variable and looks up its value. If no variable with that name exists, the expression evaluates to the undefined value. In the strict mode of ECMAScript 5, however, an attempt to evaluate a nonexistent variable throws a ReferenceError instead.
First, let me explain how I interpret some of the phrases used in the paragraph.
"... identifier appears by itself in a program, ...":
I assume the author means the case where an identifier is parsed as an expression (in which case identifier resolution is performed). For example:
function func ( arg ) {
    var local = helper( arg );
}
Here func, arg (which appears twice), local, and helper are all identifiers, but only helper and arg (only in its second appearance!) are expressions. Thus, only those two identifiers can cause a reference error.
"... no variable with that name exists ..." and "nonexistent variable":
I assume the author means an identifier which evaluates to an unresolvable reference.
Now, correct me if I'm wrong but...
When an identifier which is parsed as an expression (a PrimaryExpression to be precise) is evaluated, identifier resolution is performed. The result of this process is always a value of the type Reference. In other words, the identifier will evaluate to a reference.
A reference has a base value and a referenced name. The referenced name is the text of the identifier and the base value is the environment record that has a binding for that name (the scope that contains such a variable). However, if identifier resolution fails to resolve the referenced name, the base value of the reference will be the undefined value.
So a "nonexistent variable" therefore evaluates to a reference whose base value is undefined.
Notice how the evaluation of a nonexistent variable does not throw a reference error. The error is thrown later when the interpreter retrieves the value of the reference (via GetValue()). This, however, may not always occur - for instance, typeof x won't retrieve the value of the reference if x evaluates to an unresolvable reference, and thus, a reference error is not thrown.
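That distinction can be sketched directly (totallyUndeclared is a made-up name): typeof resolves the identifier but never retrieves its value, while a plain read does retrieve it and therefore throws.

```javascript
// typeof never calls GetValue on an unresolvable reference, so no error:
console.log(typeof totallyUndeclared); // "undefined"

// A plain read does retrieve the value, and that is what throws:
try {
  totallyUndeclared;
} catch (e) {
  console.log(e.name); // "ReferenceError"
}
```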
The paragraph quoted at the beginning of my question states that in non-strict code, nonexistent variables evaluate to undefined, whereas in strict code the evaluation of such an identifier throws a ReferenceError. I believe that both of these statements are incorrect: a nonexistent variable evaluates to a reference whose base value is undefined, and no ReferenceError is thrown at that point in strict mode.
When the interpreter tries to retrieve the value of such a reference (which may or may not occur, depending on the "outer" expression or statement in which the identifier appears), a ReferenceError is thrown. This behavior does not differ between non-strict and strict mode.
So, is the paragraph correct? If not, have I correctly identified the mistake(s)? (Also, if there is a mistake in my text, please correct me.)
Does a “nonexistent variable” evaluate to the “undefined” value in non-strict code?
> foo;
ReferenceError: foo is not defined
> var x = foo;
ReferenceError: foo is not defined
This behaviour is also true in strict mode:
> (function () { "use strict"; foo; }());
ReferenceError: foo is not defined
> (function () { "use strict"; var x = foo; }());
ReferenceError: foo is not defined
The only difference that I know of between strict mode and non-strict mode in terms of variable resolution is that when assigning a variable, if the variable is not declared, a ReferenceError is thrown in strict mode, however in non-strict mode a global variable is implicitly created.
Annex C (The strict mode of ECMAScript):
Assignment to an undeclared identifier or otherwise unresolvable reference does not create a property in the global object. When a simple assignment occurs within strict mode code, its LeftHandSide must not evaluate to an unresolvable Reference. If it does a ReferenceError exception is thrown (8.7.2).
Yes, even if you declared it using
var someVar;
it still evaluates to undefined.
See my example code below
<script>
alert(a); // undefined
alert(b); // ReferenceError: b is not defined
var a=1;
b=10;
</script>
When both variables a and b are in the global scope, why am I getting an error message for b but not for a? What is the reason? Can anyone please explain?
The first alert shows undefined because var statements are hoisted to the top of the enclosing scope; in other words, var statements and function declarations are processed before the actual code is executed, in the parsing stage.
When your code is executed, it is equivalent to:
var a; // declared and initialized with `undefined` before the code executes
alert(a); // undefined
alert(b); // ReferenceError, b is not declared.
a=1;
b=10;
The second alert doesn't even execute: trying to access b gives you a ReferenceError because you never declared it.
That's how the identifier resolution process works in JavaScript: if an identifier is not found anywhere in the scope chain, a ReferenceError exception is thrown.
Also, you should know that assigning to an identifier without declaring it first (as in b = 10) does not technically declare a variable, even in the global scope. The effect may be similar (and it seems to work), since the identifier ends up as a property of the global object, for example:
var a = 1;
b = 10;
// Similar effect:
window.a; // 1
window.b; // 10
But this is just due the fact that the global object is the top-most environment record of the scope chain.
Another difference between the two above is that the identifier declared with var produces a non-configurable property on the global object (cannot be deleted), e.g.:
delete window.a; // false
delete window.b; // true
Also, if you are in the scope of a function, and you make an assignment to an undeclared identifier, it will end up being a property of the global object, just like in the above example, whereas the var statement will create a local variable, for example:
(function(){
    var a = 1;
    b = 10;
})();
typeof window.a; // 'undefined', was locally scoped in the above function
typeof window.b; // 'number', leaked, an unintentional global
I would really discourage making assignments to undeclared identifiers; always use var to declare your variables. Moreover, this has been disallowed in ECMAScript 5 strict mode, where assignments to undeclared identifiers throw a ReferenceError:
(function(){'use strict'; b = 10;})(); // throws a ReferenceError