ES6 typeof throws an error - javascript

In ES5, typeof is considered safe: it will not throw a ReferenceError when checked against an undeclared identifier, such as:
console.log(typeof undeclaredVar); // "undefined"
In ES6, however, typeof undeclaredLetConst throws an error, but only if the identifier is later declared with let or const. If it is declared with var, it works as before.
console.log(typeof undeclaredLetConst); // ReferenceError
let undeclaredLetConst = "hello";
What's happening there?

Why it works with var declarations
When a JavaScript engine looks through a lexical scope block and finds a variable declared with var, it hoists the declaration to the top of the function (or of the global scope, if the declaration is outside any function).
Hence typeof never fails: the variable it is checking has already been hoisted by the time the check runs.
The Temporal Dead Zone (TDZ)
The TDZ is never named explicitly in the ECMAScript specification, but the term is used to describe why let and const declarations are not accessible before their declaration.
Why it fails with const and let
When a JavaScript engine looks through a lexical scope block and finds a variable declaration with let or const, it places the declaration in the TDZ. Any attempt to access a variable in the TDZ results in a runtime error.
The declaration is removed from the TDZ at runtime, once control flow reaches the declaration itself.
console.log(typeof undeclaredLetConst); // "undefined"
if (1) {
let undeclaredLetConst = "no errors!";
}
undeclaredLetConst isn't in the TDZ when the typeof operation executes, because typeof occurs outside of the block in which undeclaredLetConst is declared. That means there is no binding for the name at that point, and typeof simply returns "undefined".
Source: An awesome book by Nicholas C. Zakas, Understanding ECMAScript 6.
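A minimal sketch of that behavior, using a hypothetical name `pending`: inside the TDZ, even typeof throws, which can be observed by catching the error.

```javascript
// Sketch: inside the TDZ, even typeof throws a ReferenceError.
// `pending` is a hypothetical name used only for this illustration.
function check() {
  try {
    typeof pending;        // `pending` is hoisted but still in its TDZ here
    return "no error";
  } catch (e) {
    return e.name;
  }
  let pending = "hello";   // the TDZ would end here (never reached)
}

console.log(check()); // "ReferenceError"
```

Note that the let declaration is never reached, yet its mere presence in the function body is enough to create the TDZ.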

Related

Is this how the temporal dead zone works?

I've been trying to figure out how the temporal dead zone and the parsing of let and const work. This is what it seemingly boils down to, based on documentation and various responses I received to previous questions (such as this and this), though some of those answers disagree. Is this summary correct?
At the top of the scope, the JS engine creates a binding (an association between the variable's name and the scope, e.g. for let foo;); this is what is meant by hoisting the variable. But if you try to access the variable before the location of its declaration, JS throws a ReferenceError.
Once the JS engine moves down to the declaration (synonymous with "definition"), e.g. let foo;, it initializes the variable (allocating memory for it and making it accessible). (Here's the part that doesn't make sense to me: the binding causes the hoisting at the top, but the engine doesn't initialize the variable until it reaches the declaration, which also has a binding effect.) If there is no assignment, the variable's value is set to undefined in the case of let; with const, omitting the initializer is a SyntaxError.
For reference here's what the specs say about it:
ECMAScript 2019 Language Specification draft, section 13.3.1, Let and Const Declarations:
let and const declarations define variables that are scoped to the running execution context's LexicalEnvironment. The variables are created when their containing Lexical Environment is instantiated but may not be accessed in any way until the variable's LexicalBinding is evaluated. A variable defined by a LexicalBinding with an Initializer is assigned the value of its Initializer's AssignmentExpression when the LexicalBinding is evaluated, not when the variable is created. If a LexicalBinding in a let declaration does not have an Initializer the variable is assigned the value undefined when the LexicalBinding is evaluated.
MDN Web Docs: let
let bindings are created at the top of the (block) scope containing the declaration, commonly referred to as "hoisting". Unlike variables declared with var, which will start with the value undefined, let variables are not initialized until their definition is evaluated. Accessing the variable before the initialization results in a ReferenceError. The variable is in a "temporal dead zone" from the start of the block until the initialization is processed.
Maybe first you need to understand why the TDZ exists: it prevents the common, surprising behaviour of variable hoisting and fixes a potential source of bugs. E.g.:
var foo = 'bar';
(function () {
  console.log(foo); // undefined, not 'bar'
  var foo = 'baz';
})();
This is a frequent cause of surprise for many (novice) programmers. It's too late to change the behaviour of var now, so the ECMAScript committee decided to at least fix the behaviour with the introduction of let and const. How exactly it is implemented under the hood is somewhat moot; the important thing is that it stops what is most likely a typo or structural mistake dead in its tracks:
let foo = 'bar';
(function () {
  console.log(foo); // ReferenceError
  let foo = 'baz';
})();
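The contrast between the two snippets above can be observed programmatically. This sketch (with hypothetical variable names) captures each outcome instead of logging it:

```javascript
// var: the inner declaration is hoisted, so the read sees undefined,
// not the outer value.
var foo = 'bar';
var hoistedValue = (function () {
  var seen = foo;      // undefined: the `var foo` below is hoisted
  var foo = 'baz';
  return seen;
})();

// let: the same pattern throws instead of silently yielding undefined.
let qux = 'bar';
let tdzResult;
try {
  (function () {
    var seen = qux;    // ReferenceError: the `let qux` below creates a TDZ
    let qux = 'baz';
  })();
  tdzResult = 'no error';
} catch (e) {
  tdzResult = e.name;  // 'ReferenceError'
}
```

The var version silently produces undefined, while the let version fails loudly, which is exactly the bug-catching behaviour the TDZ was designed for.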
Practically speaking, JavaScript is executed in a two-step process:
Parsing of the code into an AST/executable byte code.
Runtime execution.
The parser sees var/let/const declarations in this first step and sets up scopes with reserved symbol names and such. That is hoisting. In the second step, the code acts within that prepared scope. The parser/engine is free to do whatever it wants in the first step, and one of the things it does is flag the TDZ internally, which raises an error at runtime.
The TDZ is quite complex to understand, and really takes a blog post to clarify how it actually works. But in essence, the overly simplified explanation is:
let/const declarations do hoist, but they throw an error when accessed before being initialized (instead of returning undefined as var would).
Let's take this example:
let x = 'outer scope';
(function() {
  console.log(x); // ReferenceError
  let x = 'inner scope';
}());
The code above throws a ReferenceError due to the TDZ semantics.
All of these examples come from a great article devoted entirely to the TDZ. Kudos to the author.

Why does this code not result in a ReferenceError?

if (true) {
  tmp = 'abc';
  console.log(tmp); // should throw a ReferenceError, but doesn't
  let tmp;
  console.log(tmp);
  tmp = 123;
  console.log(tmp);
}
This code results in
abc
undefined
123
Why does the first console.log(tmp) not throw an error?
Why it should throw a ReferenceError:
In ECMAScript 2015, let will hoist the variable to the top of the block. However, referencing the variable in the block before the variable declaration results in a ReferenceError. The variable is in a "temporal dead zone" from the start of the block until the declaration is processed.
The problem is the Babel settings, I think. So maybe it is a bug in Babel?
https://github.com/babel/babel.github.io/issues/826
You are correct: in ES6 this does throw an exception. There are two reasons why it doesn't for you:
node.js already implemented let - but it works correctly only in strict mode. You should use it.
Babel does not appear to transpile the TDZ by default, as it is quite complicated and leads to lengthy code. You can, however, enable it with the es6.blockScopingTDZ/es6.spec.blockScoping options (though I'm not sure whether this worked only in Babel 5, or what happened to these options in Babel 6).
The statement
tmp = 'abc';
is not elegant but is still OK in non-strict mode (except for the let keyword, which older engines did not allow outside strict mode). It will simply create a global variable. However, the code is not correct, and it will throw an error when you execute it in strict mode. In strict mode you have to declare all variables with one of these keywords:
var
let
const
'use strict';
if (true) {
  tmp = 'abc';
  console.log(tmp); // now this does throw a ReferenceError
  let tmp;
  console.log(tmp);
  tmp = 123;
  console.log(tmp);
}
No, it shouldn't throw a reference error.
The variable is implicitly declared (in the global scope) when you assign to it.
Then, later, you declare a new variable with the same name but a tighter scope. The new variable is not hoisted because it is declared using let.
I can't give a more precise answer, because you did not explain why you think you should get a reference error.

Undeclared variables usage in javascript

Why does JavaScript throw a reference error if we try to use a variable that is not declared, but allow us to set a value for that same variable?
e.g.
a = 10; // creates global variable a and sets its value to 10, even though it's undeclared
alert(b); // throws a ReferenceError
So why is it a reference error for b, but not for a?
That's just how the language works. In non-strict mode, an assignment to an undeclared symbol is implicitly treated as creating a global variable. In strict mode, that's an error.
To use strict mode, a global script block or a function should start with the statement:
"use strict";
That is, a simple expression statement consisting of the string "use strict". That will prevent the accidental creation of global variables (which can still be created explicitly), and impose a few other restrictions.
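As a sketch of that restriction (the name `mistyped` is hypothetical), an assignment to an undeclared identifier in strict mode throws instead of creating a global:

```javascript
'use strict';
// In strict mode, assigning to an undeclared name throws a ReferenceError
// instead of silently creating a global. `mistyped` is a hypothetical name.
let outcome;
try {
  mistyped = 10;     // no var/let/const declaration anywhere
  outcome = 'assigned';
} catch (e) {
  outcome = e.name;  // 'ReferenceError'
}
console.log(outcome);
```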
According to the documentation:
Variable declarations, wherever they occur, are processed before any code is executed. The scope of a variable declared with var is its current execution context, which is either the enclosing function or, for variables declared outside any function, global.
Assigning a value to an undeclared variable implicitly creates it as a global variable (it becomes a property of the global object) when the assignment is executed. The differences between declared and undeclared variables are:
Declared variables are constrained in the execution context in which they are declared. Undeclared variables are always global.
Declared variables are created before any code is executed. Undeclared variables do not exist until the code assigning to them is executed.
console.log(a); // Throws a ReferenceError.
console.log('still going...'); // Never executes.
var a;
console.log(a); // logs "undefined" or "" depending on browser.
console.log('still going...'); // logs "still going...".
Declared variables are a non-configurable property of their execution context (function or global). Undeclared variables are configurable (e.g. can be deleted).
Because of these three differences, failure to declare variables will very likely lead to unexpected results. Thus it is recommended to always declare variables, regardless of whether they are in a function or global scope. And in ECMAScript 5 strict mode, assigning to an undeclared variable throws an error.
a = 10;
implicitly declares a global a; i.e., in the background the engine first creates the global property and then sets its value.
So whenever we write
a = 10;
what effectively happens is:
var a;
a = 10;
Hence a is never left undeclared.
But with alert(b), b was never declared or assigned anything, so looking up the identifier fails and a ReferenceError is thrown.
And as Pointy said,
In non-strict mode, an assignment to an undeclared symbol is implicitly treated as creating a global variable. In strict mode, that's an error.
It's always better practice to use "use strict".

Does a "nonexistent variable" evaluate to the "undefined" value in non-strict code?

This paragraph is from the book JavaScript: The Definitive Guide, 6th edition, page 58:
When any identifier appears by itself in a program, JavaScript
assumes it is a variable and looks up its value. If no variable with
that name exists, the expression evaluates to the undefined value.
In the strict mode of ECMAScript 5, however, an attempt to evaluate a
nonexistent variable throws a ReferenceError instead.
First, let me explain how I interpret some of the phrases used in the paragraph.
"... identifier appears by itself in a program, ...":
I assume the author means the case where an identifier is parsed as an expression (in which case identifier resolution is performed). For example:
function func(arg) {
  var local = helper(arg);
}
Here func, arg (which appears twice), local, and helper are all identifiers, but only helper and arg (only in its second appearance!) are expressions. Thus, only those two identifiers can cause a reference error.
"... no variable with that name exists ..." and "nonexistent variable":
I assume, the author means an identifier which evaluates to an unresolvable reference.
Now, correct me if I'm wrong but...
When an identifier which is parsed as an expression (a PrimaryExpression to be precise) is evaluated, identifier resolution is performed. The result of this process is always a value of the type Reference. In other words, the identifier will evaluate to a reference.
A reference has a base value and a referenced name. The referenced name is the text of the identifier and the base value is the environment record that has a binding for that name (the scope that contains such a variable). However, if identifier resolution fails to resolve the referenced name, the base value of the reference will be the undefined value.
So, a "nonexistent variable" therefore evaluates to a reference whose base value is undefined.
Notice how the evaluation of a nonexistent variable does not throw a reference error. The error is thrown later when the interpreter retrieves the value of the reference (via GetValue()). This, however, may not always occur - for instance, typeof x won't retrieve the value of the reference if x evaluates to an unresolvable reference, and thus, a reference error is not thrown.
The paragraph quoted at the beginning of my question states that in non-strict code, nonexistent variables evaluate to undefined, whereas in strict code evaluation of such an identifier throws a reference error. I believe that both of these statements are incorrect: a nonexistent variable evaluates to a reference with a base value of undefined, and a reference error is not thrown in strict mode.
When the interpreter tries to retrieve the value of such a reference (which may or may not occur, depending on the "outer" expression or statement in which the identifier appears), a reference error is thrown. This behavior does not differ between non-strict and strict mode.
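That reasoning can be checked directly: the unresolvable reference itself is harmless, and the error appears only when the engine performs GetValue on it. (`noSuchVariable` is a hypothetical, deliberately undeclared name.)

```javascript
// typeof never calls GetValue on an unresolvable reference, so it is safe:
var viaTypeof = typeof noSuchVariable;  // "undefined"

// A plain read does call GetValue, which throws:
var viaRead;
try {
  noSuchVariable;
  viaRead = 'no error';
} catch (e) {
  viaRead = e.name;                     // 'ReferenceError'
}
```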
So, is the paragraph correct? If not, have I correctly identified the mistake(s)? (Also, if there is a mistake in my text, please correct me.)
Does a “nonexistent variable” evaluate to the “undefined” value in non-strict code?
> foo;
ReferenceError: foo is not defined
> var x = foo;
ReferenceError: foo is not defined
This behaviour is also true in strict mode:
> (function () { "use strict"; foo; }());
ReferenceError: foo is not defined
> (function () { "use strict"; var x = foo; }());
ReferenceError: foo is not defined
The only difference that I know of between strict mode and non-strict mode in terms of variable resolution is that when assigning a variable, if the variable is not declared, a ReferenceError is thrown in strict mode, however in non-strict mode a global variable is implicitly created.
Annex C (The strict mode of ECMAScript):
Assignment to an undeclared identifier or otherwise unresolvable reference does not create a property in the global object. When a simple assignment occurs within strict mode code, its LeftHandSide must not evaluate to an unresolvable Reference. If it does a ReferenceError exception is thrown (8.7.2).
Yes, even if you declared it using
var someVar;
it still evaluates to undefined.

When I use a global-scope variable without 'var', it shows me an error. Why?

See my example code below
<script>
alert(a); // undefined
alert(b); // ReferenceError: b is not defined
var a = 1;
b = 10;
</script>
When both variables a and b are in the global scope, why am I getting an error message for b but not for a? What is the reason?
Can anyone please explain?
The first alert shows undefined because var statements are hoisted to the top of the enclosing scope; in other words, var declarations and function declarations are processed before the actual code is executed, in the parsing stage.
When your code is executed, it is equivalent to:
var a; // declared and initialized with `undefined` before the code executes
alert(a); // undefined
alert(b); // ReferenceError, b is not declared.
a=1;
b=10;
The second alert doesn't even execute: trying to access b gives you a ReferenceError because you never declared it, yet you are trying to access it.
That's how the identifier resolution process works in JavaScript: if an identifier is not found anywhere in the scope chain, a ReferenceError exception is thrown.
Also, you should know that assigning to an identifier without declaring it first (as in b = 10) does not technically declare a variable, even in the global scope. The effect may be similar (and it seems to work), because the identifier ends up as a property of the global object, for example:
var a = 1;
b = 10;
// Similar effect:
window.a; // 1
window.b; // 10
But this is just due the fact that the global object is the top-most environment record of the scope chain.
Another difference between the two above is that the identifier declared with var produces a non-configurable property on the global object (cannot be deleted), e.g.:
delete window.a; // false
delete window.b; // true
Also, if you are in the scope of a function, and you make an assignment to an undeclared identifier, it will end up being a property of the global object, just like in the above example, whereas the var statement will create a local variable, for example:
(function () {
  var a = 1;
  b = 10;
})();
typeof window.a; // 'undefined', was locally scoped in the above function
typeof window.b; // 'number', leaked, an unintentional global
I would really discourage making assignments to undeclared identifiers; always use var to declare your variables. Moreover, this has been disallowed in ECMAScript 5 strict mode: assignments to undeclared identifiers throw a ReferenceError:
(function () { 'use strict'; b = 10; })(); // throws a ReferenceError
