How can I guarantee that the WebSocket onopen handler is called? - javascript

I use WebSocket in JavaScript. WebSocket takes the URL as a constructor parameter and immediately tries to connect; I can only set the onopen handler after constructing it.
So, if the WebSocket establishes the connection before I set onopen, I miss the onopen event!
How can I avoid this?
To simulate it:
A)
1) In Chrome, open websocket.
2) Press F12 to open Developer Tools.
3) Open Console
4) Copy and paste all of this code at once, then press Enter:
uri = "ws://echo.websocket.org?encoding=text";
websocket = new WebSocket(uri);
websocket.onopen = function(evt) { console.log('EHE')};
B)
Repeat 1-2-3
4) Copy and paste this code and run it:
uri = "ws://echo.websocket.org?encoding=text";
websocket = new WebSocket(uri);
5) Wait a second
6) Run this code:
websocket.onopen = function(evt) { console.log('EHE')};
Result:
In A) onopen is called. In B) we missed it!

Because of the single-threaded event driven nature of Javascript, what you describe will not happen in real code. The "open" event can't be triggered until after your current section of Javascript finishes. Thus, you will always be able to set your onopen event handler before the event occurs.
Inserting pauses in the debugger or in the console creates an artificial situation that does not occur in real code.
What happens in real code is this:
You call new WebSocket(uri)
The webSocket infrastructure initiates a webSocket connection (an asynchronous operation)
The webSocket constructor returns immediately, before that connection has completed.
The rest of your code runs (including assigning the .onopen property and setting up your event handler).
The Javascript interpreter is done running your code and returns back to the event loop.
If, by now, the webSocket has connected, then there will be an open event in the event queue and Javascript will trigger that event, resulting in your .onopen handler getting called.
If the webSocket has not yet connected, Javascript will wait for the next event to be inserted into the event queue and will run it, repeating that process over and over. Eventually, one of those events will be your open event.
The key to this is that .onopen is called via an asynchronous event. Thus, it has to go through the Javascript event queue. And, no events from the event queue can be run until after your current section of Javascript finishes and returns back to the interpreter. That's how the "event-driven" nature of Javascript works. So, because of that architecture, you cannot miss the onopen event as long as you install your .onopen handler in the same section of Javascript that calls the constructor.
If it provides you any comfort, there are dozens of APIs in node.js that all rely on this same concept. For example when you create a file stream with fs.createReadStream(filename), you have to create the stream, then add event handlers (one of those event handlers is for an open event). The same logic applies there. Because of the event-driven nature of Javascript, there is no race condition. The open event or the error event cannot be triggered before the rest of your Javascript runs so you always have an opportunity to get your event handlers installed before they could get called.
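For example, here is a minimal sketch of that fs.createReadStream pattern (the file path is just a placeholder, not anything from the question):
const fs = require('fs');

// Handlers are attached after the call returns, yet they cannot miss the
// events: 'open' and 'error' are emitted asynchronously, after this code runs.
const stream = fs.createReadStream('/tmp/example.txt');
stream.on('open', (fd) => console.log('file opened, fd =', fd));
stream.on('error', (err) => console.error('stream error:', err.message));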
In cases where errors could be detected synchronously (like a bad filename or bad uri) and could therefore trigger an error event immediately, the implementation uses something like setImmediate(function() { /* send error event here */ }) to make sure the error event is not triggered until after your code has a chance to install your event handlers.
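Here is a rough sketch of that deferral trick, using a made-up openThing() helper built on EventEmitter (not any real library's API):
const EventEmitter = require('events');

// Hypothetical helper, for illustration only: the bad argument is detected
// synchronously, but the 'error' event is deferred with setImmediate so the
// caller still gets a chance to attach listeners first.
function openThing(name) {
  const emitter = new EventEmitter();
  if (!name) {
    setImmediate(() => emitter.emit('error', new Error('missing name')));
  } else {
    setImmediate(() => emitter.emit('open'));
  }
  return emitter;
}

const thing = openThing(''); // fails validation synchronously
thing.on('error', (err) => console.error('got error:', err.message)); // still called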

Related

How to handle all WebSocket errors without a race?

When I run this code:
var ws = new WebSocket("wss://mydomain.example/socket/service");
ws.addEventListener("error", function (error) {
  console.error("Got error=", error);
});
Is it possible that the WebSocket connection fails (emit error) before I can attach the event listener for the error event?
Looking at the documentation https://developer.mozilla.org/en-US/docs/Web/API/WebSocket/WebSocket I cannot see this detail documented anywhere.
According to the WHATWG spec it seems that the constructor should run the request in parallel – is there a guarantee that I can attach the error listener before any possible error event can be raised?
The WebSocket constructor is run without synchronization of any kind, and the connection may indeed encounter an error before the line with ws.addEventListener("error", ...); is executed! However, this is not a problem, because the spec also says that in case of error the actual error event is fired as part of steps that must be queued as a task. In practice, this means that the WebSocket constructor is logically required to behave as if it ran an anonymous function with a zero timeout that fires the error event.
So the actual error can happen before the JS code attaches the event listener, but all the events (open, close, error, message) are only fired, delayed, the next time the event loop runs, so the above code will always have time to attach the event handlers before the events can be fired.
See https://github.com/whatwg/websockets/issues/13#issuecomment-1039442142 for details.
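As a quick sketch of that guarantee (reusing the placeholder URL from the question), even deliberately blocking the thread between the constructor and addEventListener does not lose the event:
const ws = new WebSocket("wss://mydomain.example/socket/service");

// Deliberately block the thread for ~2 seconds; any connection failure that
// happens during this time is queued as a task, not delivered immediately.
const start = Date.now();
while (Date.now() - start < 2000) { /* busy wait */ }

// These listeners are still registered in time, because queued tasks can only
// run after the current script returns to the event loop.
ws.addEventListener("error", (event) => console.error("Got error=", event));
ws.addEventListener("open", () => console.log("connected"));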

Node JS Asynchronous Execution and Event Emission / Listener Models

I am new to Node JS and am trying to understand the concurrent / asynchronous execution models of Node.
So far, I understand that whenever an asynchronous task is encountered in Node, that task runs in the background (e.g. an asynchronous setTimeout function will start timing) and control is then sent back to other tasks that are on the call stack. Once the timer times out, the callback that was passed to the asynchronous task is pushed onto the callback queue, and once the call stack is empty, that callback gets executed. I used this visualization to understand the sequence of task execution. So far so good.
Q1. Now, I am not able to wrap my head around the paradigm of event listeners and event emitters and would appreciate it if someone could explain how event emitters and listeners fall into the picture of call stacks, event loops and callback queues.
Q2. I have the following code that reads data from the serial port of a raspberry pi.
const SerialPort = require('serialport');
const port = new SerialPort('/dev/ttyUSB0', { baudRate: 9600 }, (err) => {
  if (err) {
    console.log("Port Open Error: ", err);
  }
});

port.on('data', (data) => {
  console.log(data.toString());
});
As can be seen from the example, to read data from the serial port, an 'event-listener' has been employed. From what I understand, whenever data comes to the port, a 'data' event is emitted which is 'responded to' or rather listened to by the listener, which just prints the data onto the console.
When I run the above program, it runs continuously, with no break, printing the data onto the console whenever data arrives at the serial port. There are no continuously running while loops scanning the serial port, as would be expected in a synchronous program. So my question is: why is this program running continuously? It is obvious that the event emitter is running continuously, generating an event whenever data comes, and the event listener is also running continuously, printing the data whenever a 'data' event is emitted. But WHERE are these things actually running, that too, continuously? How are these things fitting into the whole picture of the call/execution stack, the event loop and the callback queue?
Thanks
Q1. Now, I am not able to wrap my head around the paradigm of event listeners and event emitters and would appreciate it if someone could explain how event emitters and listeners fall into the picture of call stacks, event loops and callback queues.
Event emitters on their own have nothing to do with the event loop. Event listeners are called synchronously whenever someone emits an event. When some code calls someEmitter.emit(...), all listeners are called synchronously from the time the .emit() occurred one after another. This is just plain old function calls. You can look in the eventEmitter code yourself to see a for loop that calls all the listeners one after another associated with a given event.
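Here's a stripped-down sketch (not the actual Node.js source) of what that emit loop amounts to:
// A simplified emitter: emit() is just a synchronous loop over the listeners.
class TinyEmitter {
  constructor() {
    this.listeners = {}; // event name -> array of listener functions
  }
  on(event, fn) {
    (this.listeners[event] = this.listeners[event] || []).push(fn);
    return this;
  }
  emit(event, ...args) {
    const fns = this.listeners[event] || [];
    for (const fn of fns) {
      fn(...args); // plain old synchronous function calls, one after another
    }
    return fns.length > 0;
  }
}

const e = new TinyEmitter();
e.on('data', (chunk) => console.log('listener 1:', chunk));
e.on('data', (chunk) => console.log('listener 2:', chunk));
e.emit('data', 'hello'); // both listeners have already run when emit() returns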
Q2. I have the following code that reads data from the serial port of a raspberry pi.
The data event in your code is an asynchronous event. That means that it will be triggered one or more times at an unknown time in the future. Some lower level code will be registered for some sort of I/O event. If that code is native code, then it will insert a callback into the node.js event queue. When node.js is done running other code, it will grab the next event from the event queue. When it gets to the event associated with data being available on the serial port, it will call port.emit(...) and that will synchronously trigger each of the listeners for the data event to be called.
When I run the above program, it runs continuously, with no break, printing the data onto the console whenever data arrives at the serial port. There are no continuously running while loops scanning the serial port, as would be expected in a synchronous program. So my question is: why is this program running continuously?
This is the event-driven nature of node.js in a nutshell. You register an interest in certain events. Lower level code sees that incoming data has arrived and triggers those events, thus calling your listeners.
This is how the Javascript interpreter manages the event loop. Run current piece of Javascript until it's done. Check to see if any more events in the event loop. If so, grab next event and run it. If not, wait until there is an event in the event queue and then run it.
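Here's a toy, self-contained simulation of that loop (purely illustrative; the real loop lives in the host environment, not in your Javascript):
// Pretend two events have already been queued by lower-level code.
const eventQueue = [
  { callback: (msg) => console.log('event 1:', msg), args: ['data ready'] },
  { callback: (msg) => console.log('event 2:', msg), args: ['timer fired'] },
];

while (eventQueue.length > 0) {
  const event = eventQueue.shift(); // grab the next queued event
  event.callback(...event.args);    // run its callback to completion
}
// When the queue is empty, the real event loop blocks and waits for more
// events instead of exiting.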
It is obvious that the event emitter is running continuously, generating an event whenever data comes, and the event listener is also running continuously, printing the data whenever a 'data' event is emitted. But WHERE are these things actually running, that too, continuously?
The event emitter itself is not running continuously. It's just a notification scheme (essentially a publish/subscribe model) where one party can register an interest in certain events with .on() and another party can trigger certain events with .emit(). It allows very loose coupling through a generic interface. Nothing is running continuously in the emitter system. It's just a notification scheme. Someone triggers an event with .emit() and it looks in its data structures to see who has registered an interest in that event and calls them. It knows nothing about the event or the data itself or how it was triggered. The emitters job is just to deliver notifications to those who expressed an interest.
We've described so far how the Javascript side of things works. It runs the event loop as described above. At a lower level, there is serial port code that interfaces directly with the serial port and this is likely some native code. If the OS supports a native asynchronous interface for the serial port, then the native code would use that and tell the OS to call it when there's data waiting on the serial port. If there is not a native asynchronous interface for the serial port data in the OS, then there's probably a native thread in the native code that interfaces with the serial port that handles getting data from the port, either polling for it or using some other mechanism built into the hardware to tell you when data is available. The exact details of how that works would be built into the serial port module you're using.
How are these things fitting into the whole picture of the call/execution stack, the event loop and the callback queue?
The call/execution stack comes into play the moment an event in the Javascript event queue is found by the interpreter and it starts to execute it. Executing that event will always start with a Javascript callback. The interpreter will call that callback (putting a return address on the call/execution stack). That callback will run until it returns. When it returns, the call/execution stack will be empty. The interpreter will then check to see if there's another event waiting in the event queue. If so, it will run that one.
FYI, if you want to examine the code for the serial port module it appears you are using, it's all there on Github. It does appear to have a number of native code files. You can see a file called poller.cpp here and it appears to do cooperative polling using the node.js add-on programming interface offered by libuv. For example, it creates a uv_poll_t which is a poll handle described here. Here's an excerpt from that doc:
Poll handles are used to watch file descriptors for readability, writability and disconnection similar to the purpose of poll(2).
The purpose of poll handles is to enable integrating external libraries that rely on the event loop to signal it about the socket status changes, like c-ares or libssh2. Using uv_poll_t for any other purpose is not recommended; uv_tcp_t, uv_udp_t, etc. provide an implementation that is faster and more scalable than what can be achieved with uv_poll_t, especially on Windows.
It is possible that poll handles occasionally signal that a file descriptor is readable or writable even when it isn’t. The user should therefore always be prepared to handle EAGAIN or equivalent when it attempts to read from or write to the fd.

How can I make sure a Websocket does not open before attaching an onopen handler?

There's a part of the Websockets API I do not understand.
The onOpen event handler is usually used to start sending messages to the server, since we can't do that before the socket is opened and ready.
According to any code examples I can find (documentation), this is the common way to register an onOpen event handler on a WebSocket:
1: const socket = new WebSocket('ws://localhost:8080');
2:
3: socket.addEventListener('open', function (event) {
4: socket.send('Hello Server!');
5: });
But the WebSocket constructor call (line 1) creates the websocket and attempts to open a connection to the server, while the event handler is attached later (line 3).
So: In a case where the connection is established (very) quickly, is it not possible that socket is already open when we reach line 3?
In which case we will miss the open event, since we did not have an event handler registered for it when it happened.
How are we guaranteed to receive the open event?
The JavaScript implementation in browsers is asynchronous and single-threaded. Of course, it can use multiple worker threads inside (for example, for input/output operations or timers), but your application code executes on one thread with an event loop.
When you connect to the server through WebSocket:
var socket = new WebSocket('ws://localhost:8080');
The JavaScript thread starts an asynchronous operation to connect the socket and immediately continues to run the next code. It can receive events from asynchronous operations only when it returns to the event loop. That means that your onOpen listener will always be fired, no matter how quickly the connection is established.
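A tiny demonstration of that rule:
// Asynchronous callbacks can only run after the current script has finished
// and control has returned to the event loop.
setTimeout(function () { console.log('timer fired'); }, 0);
console.log('end of script');

// Output order is always:
//   end of script
//   timer fired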
I also recommend checking this question:
How does Asynchronous programming work in a single threaded programming model?
You'll find a more detailed explanation there. Note that the V8 JavaScript engine is used by both Node.js and Chromium.

Where is the node.js event queue?

I have seen similar questions on Stack Overflow, but none of them fully dives into the question that I have. I am familiar with event queues, how they work, and how to implement them. I am new to Node.js and I am trying to wrap my head around how Node.js does it.
In a C++ application you would do something along the lines of:
int main() {
    std::vector<Handler*> handlers;
    BlockingQueue queue;
    // Add all the handlers, call constructors and other such initialization
    // Then run the event loop
    while (true) {
        Event e = queue.pop();
        for (std::vector<Handler*>::iterator it = handlers.begin(); it != handlers.end(); ++it) {
            (*it)->handle(e);
        }
    }
}
Now in the case of node.js I might have a main file called main.js that looks like.
var http = require("http");

function main() {
    // Console will print the message
    console.log('Server running at http://127.0.0.1:8080/');

    var server = http.createServer(function (request, response) {
        // Send the HTTP header
        // HTTP Status: 200 : OK
        // Content Type: text/plain
        response.writeHead(200, {'Content-Type': 'text/plain'});

        // Send the response body as "Hello World"
        response.end('Hello World\n');
    });

    server.listen(8080);
    console.log('Main completed');
}

main();
I understand that server.listen is attaching a handler to the event queue and that we are adding the callback, similar to the C++ example.
My question is. Where is the event queue? Is it in the javascript somewhere or is it built into the interpreter? Also how does the main function get called relative to the main event loop?
Where is the event queue? Is it in the javascript somewhere or is it built into the interpreter?
The event queue is built into the operating environment that hosts the Javascript interpreter. It isn't fundamental to Javascript itself so it's not part of the actual JS runtime. One interesting indicator of this is that setTimeout() is not actually part of ECMAScript, but rather something made available to the Javascript environment by the host.
The system surrounding the Javascript implementation in node.js keeps track of externally triggered events (timers, networking results, etc...) and when Javascript is not busy executing something and an external event occurs, it then triggers an associated Javascript callback. If Javascript is busy executing something, then it queues that event so that as soon as Javascript is no longer busy, it can then trigger the next event in the queue.
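A small illustration of that queuing behavior (the timings are just examples):
// The timer's callback gets queued while the synchronous loop keeps Javascript
// busy; it only runs once the script returns to the event loop.
setTimeout(function () { console.log('timer callback runs now'); }, 100);

var start = Date.now();
while (Date.now() - start < 1000) { /* keep the thread busy */ }
console.log('busy work done'); // printed first

// 'timer callback runs now' prints only after this script finishes, roughly
// 1000 ms after the timer was set, even though the requested delay was 100 ms.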
node.js itself uses libuv for the event loop. You can read more about that here. It provides a multi-platform way of doing evented, async I/O that was developed for node.js, but is also being used by some other projects.
Here's a related answer that might also help:
Run Arbitrary Code While Waiting For Callback in Node?
Also how does the main function get called relative to the main event loop?
When node.js starts up, it is given an initial script file to execute. It loads that script file into memory, parses the Javascript in it and executes it. In your particular example, that will cause the function main to get parsed and then will cause the execution of main() which will run that function.
Loading, parsing and executing the script file passed to node when it starts up is the task given to node.js. It isn't really related to the event queue at all. In some node.js applications, it runs that initial script and then exits (done with its work). In other node.js applications, the initial script starts timers or servers or something like that which will receive events in the future. When that is the case, node.js runs the initial script to completion, but because there are now lasting objects that were created and are listening for events (in your case, a server), nodejs does not shut down the app. It leaves it running so that it can receive these future events when they occur.
One missing piece here is that things like the server object you created allow you to register a callback that will be called one or more times in the future when some particular events occur. This behavior is not built into Javascript. Instead, the code that implements these objects or the TCP functions that they use must maintain a list of callbacks that are registered and when those events occur, it must execute code so that the appropriate callbacks are called and passed the appropriate data. In the case of http.createServer(), it is a mix of Javascript and native code in the nodejs http library that make that work.
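For example, per the Node.js docs the callback you pass to http.createServer() is simply added as a 'request' listener, so this sketch is equivalent to the code in the question:
var http = require('http');

// The HTTP internals emit a 'request' event once per incoming request and the
// emitter calls every registered listener with the request/response objects.
var server = http.createServer();
server.on('request', function (request, response) {
  response.writeHead(200, { 'Content-Type': 'text/plain' });
  response.end('Hello World\n');
});
server.listen(8080);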

duplicates in socket.io w/ angular

Using Angular and socket.io, I am getting duplicate events on the client every time the server emits. Angular is only included once, there is only one socket.io connection, and there is only one listener per event on the client. Upon receiving an event on the server, data is logged, and this process only ever happens once. Then the data is emitted and the callback is called twice on the client, despite only being in scope once (to my knowledge).
client:
//inside a controller
var thing = 'foo';
socket.emit('sentUp', thing);

socket.on('sentDown', function(thing) {
  console.log(thing); // this happens twice
});
server:
/*
node & express stuff here
*/
socket.on('connection', function(socket) {
  socket.on('sentUp', function(stuff) {
    console.log('this happened'); // x1
    socket.emit('sendDown', stuff);
  });
});
Most likely your controllers are being loaded more than once. You can easily check it by logging.
Move out the socket code from the controllers and put them in a service where they're only called once.
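A sketch of that suggestion, assuming AngularJS 1.x and the standard socket.io client (the module variable app and the $rootScope.$apply wrapping are the usual pattern, not anything from the question):
// The factory body runs only once, so the connection and its listeners exist
// only once, no matter how many controllers inject this service.
app.factory('socket', function ($rootScope) {
  var socket = io.connect();
  return {
    on: function (eventName, callback) {
      socket.on(eventName, function (data) {
        $rootScope.$apply(function () {
          callback(data);
        });
      });
    },
    emit: function (eventName, data) {
      socket.emit(eventName, data);
    }
  };
});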
I have found in my own socket.io client code that some of the connect events can occur each time the client reconnects. So, if the connection is lost for any reason and then the client automatically reconnects, the client may get a connect event again.
If, like me, you're adding your event handlers in the 'connect' event, then you may be accidentally adding multiple event handlers for the same event, and thus you would think you were seeing duplicate data. You don't show that part of your client code so I don't know whether you're doing it that way, but this is an issue that hit me and it is a natural way to do things.
If that is what is happening to you, there are a couple of possible workarounds:
You can add your event handlers outside the connect event. You don't have to wait for the connection to finish before adding event handlers. This way, you'd only ever add them once.
Before adding the event handlers you register upon connection, you can remove any previous event handlers that were installed on an earlier connection to make sure you never get duplicates.
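A sketch of both workarounds, assuming a standard socket.io client (event names reused from the question):
// Workaround 1: register data listeners once, outside the 'connect' handler,
// and only do per-connection work inside it.
socket.on('sentDown', function (thing) {
  console.log(thing); // runs once per server emit, even after reconnects
});
socket.on('connect', function () {
  socket.emit('sentUp', 'foo');
});

// Workaround 2: if you must register inside 'connect', remove any previously
// attached listeners first so reconnects do not stack them up.
socket.on('connect', function () {
  socket.off('sentDown'); // or socket.removeAllListeners('sentDown'), depending on version
  socket.on('sentDown', function (thing) {
    console.log(thing);
  });
});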
