WebSocket not working with Nexe

I have built a Node.js application and I am using WebSocket to send events to the browser. Now I need to bundle the Node application into an EXE and send it to a client.
I tried nexe and JXcore; nexe bundles the application, but it throws an error when I try to run it.
The JS code for the WebSocket server is:
var WebSocketServer = require('websocket').server;
var http = require('http');

var server = http.createServer(function(request, response) {
    console.log((new Date()) + ' Received request for ' + request.url);
    response.writeHead(404);
    response.end();
});
server.listen(1337, function() {
    console.log((new Date()) + ' Server is listening on port 1337');
});

wsServer = new WebSocketServer({
    httpServer: server,
    // You should not use autoAcceptConnections for production
    // applications, as it defeats all standard cross-origin protection
    // facilities built into the protocol and the browser. You should
    // *always* verify the connection's origin and decide whether or not
    // to accept it.
    autoAcceptConnections: false
});

wsServer.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    eze.ee.on("EPIC_VALIDATING_DEVICE", function() {
        connection.sendUTF('VAlidate');
    });
});
The exception stack is as follows:
nexe.js:15318
wsServer = new WebSocketServer({
           ^
TypeError: WebSocketServer is not a function
    at Array.__dirname.call.C:\Users\Raghav Tandon\WinPos\BrowserIntegration\js\RestImpl.js.http (nexe.js:15318:12)
    at initModule (nexe.js:29:11)
    at nexe.js:31:64
    at Array.__dirname.call.C:\Users\Raghav Tandon\WinPos\BrowserIntegration\js\RestWS.js../RestImpl (nexe.js:48:20)
    at initModule (nexe.js:29:11)
    at Array.forEach (native)
    at nexe.js:39:8
    at nexe.js:46:4
    at NativeModule.compile (node.js:945:5)
    at Function.NativeModule.require (node.js:893:18)
Why is it not loading the WebSocket module? I have tested the application with node start and it works properly.

This is happening because nexe is not able to bundle native modules. Instead I tried Electron, which works like a charm and supports native modules as well.
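For reference, a minimal sketch of the Electron approach; the module path './js/RestWS' is hypothetical and stands in for the server code above:

// main.js: minimal Electron entry point (sketch; the module path is hypothetical)
const { app } = require('electron');

app.on('ready', function () {
    // Start the existing WebSocket server; native modules load normally here.
    require('./js/RestWS');
});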

Related

EPIPE on net.connect to an HTTP server

I am trying to connect to a Node HTTP server socket (Express) with net.connect in order to pass that socket to my REPL, so that I can basically connect to my HTTP server and run commands.
When I try this, I get an EPIPE error the second I start the REPL.
Here is the code for the REPL:
const net = require('net');

const args = process.argv.slice(2);
if (args.length < 1) {
    process.exit(1);
}

// this will get the url to connect to
const url = args[0];
const [host, port] = url.split(':');

const socket = net.connect(parseInt(port), host);
process.stdin.pipe(socket);
socket.pipe(process.stdout);

Console.start({
    expose: { container, Metric: metricsObject },
    socket: socket
});
The start function:
start(options = {}) {
    const { expose, socket } = options;
    const repl = REPL.start({
        eval: promisableEval,
        terminal: true,
        input: socket,
        output: socket,
    });
    Object.assign(repl.context, expose);
}
The running HTTP server:
const http = this.express
    .listen(this.config.web.port, () => {
        const { port } = http.address();
        this.logger.info(`[p ${process.pid}] Listening at port ${port}`);
        resolve();
    });
this.express is just an instance of Express: this.express = express();
It looks like you're trying to connect to an HTTP (or HTTPS?) server at a URL like http://mine.example.com:3000/path/item by saying
net.connect(parseInt('3000/path/item'), 'http://mine.example.com');
It won't work, for a number of reasons:
Unless you're an expert in the HTTP protocol, you should not use net.connect to talk to HTTP servers. Try http.request (which returns an http.ClientRequest) instead; see the sketch below.
Hostnames passed to net.connect should be raw machine names like 'mine.example.com', not preceded by a protocol specifier.
Ports, similarly, should be plain numbers with no path appended.
Sorry, I don't get what you're trying to do with stdin and stdout. But your socket is not ready for use until its connect operation completes and you get an event announcing that.
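For illustration, a minimal http.request sketch against an assumed server at localhost:3000 (the hostname, port, and path here are placeholders):

const http = require('http');

// Hostname and port are plain values: no protocol prefix,
// and no path mixed into the port number.
const req = http.request({
    hostname: 'localhost',
    port: 3000,
    path: '/path/item',
    method: 'GET'
}, function (res) {
    console.log('status: ' + res.statusCode);
    res.pipe(process.stdout); // print the response body
});

req.on('error', function (err) {
    console.error('request failed: ' + err.message);
});

req.end(); // actually send the request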
You can use the old telnet program to connect to an HTTP server. It lets you type stuff to the server and then displays what you get back. In your case you'd do:
telnet localhost 3000   # from your shell command line
The server then connects and sits there waiting. You type:
GET / HTTP/1.0
and then two <Enter>s. The server then sends back whatever it would send if you put http://localhost:3000 into a browser location bar, and then closes the connection.
I'm not sure how that HTTP protocol operation fits into your REPL.
The example you mention at https://medium.com/trabe/mastering-the-node-js-repl-part-3-c0374be0d1bf doesn't connect to an HTTP server; it connects to a TCP server. HTTP servers (including all Node/Express servers) are a subspecies of TCP server, but they layer the HTTP protocol on top of TCP, and the HTTP protocol isn't suited to the back-and-forth conversational style of a REPL.
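If the goal is a remote REPL, the usual pattern is a dedicated TCP server running alongside the HTTP server; a minimal sketch (port 5001 is an arbitrary choice):

const net = require('net');
const repl = require('repl');

// A plain TCP server: each incoming connection gets its own REPL
// wired directly to the socket.
net.createServer(function (socket) {
    const r = repl.start({
        prompt: 'remote> ',
        input: socket,
        output: socket,
        terminal: true
    });
    // End the connection when the REPL session exits (e.g. Ctrl+D).
    r.on('exit', function () {
        socket.end();
    });
}).listen(5001);

You can then attach to it with the stdin/stdout piping from the question, or simply with nc localhost 5001.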

Access Node-Red websocket listener

My server has Node-RED embedded. I'm trying to create a new WebSocket listener on the server, but when this code executes, the WebSockets in the Node-RED application stop working.
const WebSocket = require('ws');
const wss = new WebSocket.Server({
    server: server,
    path: '/test'
});

wss.on('connection', function connection(ws, req) {
    console.log('test');
});
[Screenshot: the WebSocket node configuration in the Node-RED admin panel]
The problem is related to:
https://github.com/websockets/ws/issues/381
How can I access the Node-RED WebSocket listener and handle messages on my own path?
I know this is an old thread, but I thought I'd throw in that you can use the OP's code in Node-RED like this:
var WebSocket = global.get('ws');
const wss = new WebSocket.Server({
    server: <your http(s) server>,
    path: '/'
});

wss.on('connection', function connection(ws, req) {
    node.warn('connection');
});
You just need to:
npm install ws
edit settings.js, and under functionGlobalContext: add
ws: require('ws')
(see the fragment below).
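In context, the relevant fragment of settings.js would look roughly like this:

// settings.js (fragment): expose the ws module to Function nodes
functionGlobalContext: {
    ws: require('ws')
},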
It does work; I'm using it like this because I couldn't get the WebSocket node to work in my configuration.

NodeJS websocket server on Plesk doesn't answer

I am new to Node.js and I have just set up a subdomain to work with it on my Plesk Onyx 17.5.3 server.
I have written a simple WebSocket chat app, but it doesn't work.
If I start the app from the command line with:
node server/server.js
the app works flawlessly. The code in server.js is:
"use strict";
process.title = 'node-chat';
const WebSocketServer = require('ws').Server;
const PORT = 9000;
const wss = new WebSocketServer({port: PORT});
console.log('WSS');
let messages = [];
wss.on('connection', function (ws) {
console.log('WS connection');
messages.forEach(function(message){
ws.send(message);
});
ws.on('message', function (message) {
messages.push(message);
console.log('Message Received: %s', message);
wss.clients.forEach(function (conn) {
conn.send(message);
});
});
});
wss.on('error', function(obj){
console.log('WS error');
console.log(obj);
});
console.log((new Date()) + 'server.js started');
If I start the application using Plesk's "Restart app", it doesn't work. Doing a ps aux I can see the process is running. In the log file I see it has started:
App 17579 stdout: WSS
App 17579 stdout: Fri Jul 28 2017 13:52:44 GMT+0200 (CEST)server.js started
But there is no log line saying the WebSocket server has started or crashed; it just doesn't work. If I try to connect a client-side JS app to the server, it gives an error saying it can't connect:
WebSocket connection to 'ws://server_address:9000/' failed: Error in connection establishment: net::ERR_CONNECTION_REFUSED
Any clues?
Thanks!
May I ask where you found the Node.js logs on Plesk? And how are you able to get to the console on Plesk?
As an answer to your question:
It could be that the port you provided is not configured to allow traffic from the web to your Node application. That's why I'd recommend using
var port = process.env.PORT || 9000;
This line uses the port configured in the user environment data, if one is set.
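Applied to the server from the question, a minimal sketch (assuming Plesk's Node.js runner passes the port through the PORT environment variable):

const WebSocketServer = require('ws').Server;

// Prefer the port assigned by the hosting environment; fall back to 9000 locally.
const PORT = process.env.PORT || 9000;
const wss = new WebSocketServer({ port: PORT });
console.log('WSS listening on port ' + PORT);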

Javascript Websocket server message broadcast to clients

I am trying to create a dummy WebSocket server in JavaScript to send messages to my Android client app. Messages are injected into the server from an HTML page (JavaScript) and should be passed on to the Android client. I am able to connect the two clients (web and Android) to the server individually, but I can't achieve the flow I want: the web-based JavaScript sends a message to the running Node.js WebSocket server, which broadcasts it to the Android client.
This is the code I am using on the server side:
var WebSocketServer = require("ws").Server;
var http = require("http");
var express = require("express");

var port = 2001;
var app = express();
app.use(express.static(__dirname + "/../"));

app.get('/someGetRequest', function(req, res, next) {
    console.log('receiving get request');
});

app.post('/somePostRequest', function(req, res, next) {
    console.log('receiving post request');
});

app.listen(80); // port 80 needs to run as root
console.log("app listening on %d ", 80);

var server = http.createServer(app);
server.listen(port);
console.log("http server listening on %d", port);

var userId;
var wss = new WebSocketServer({
    server: server
});

wss.on("connection", function(ws) {
    console.info("websocket connection open");
    var timestamp = new Date().getTime();
    userId = timestamp;
    ws.send(JSON.stringify({
        msgType: "onOpenConnection",
        msg: {
            connectionId: timestamp
        }
    }));
    ws.on("message", function(data, flags) {
        console.log("websocket received a message");
        var clientMsg = data;
        ws.send(JSON.stringify({
            msg: {
                connectionId: userId
            }
        }));
        console.log(clientMsg);
    });
    ws.on("close", function() {
        console.log("websocket connection close");
    });
});

console.log("websocket server created");
Web client:
<script type="text/javascript">
    var websocketURL = 'ws://localhost:2001/';

    function startWebSocket() {
        try {
            ws = new WebSocket(websocketURL);
        } catch (e) {
            alert("Unable to connect to webserver");
        }
    }

    function sendMessage(text) {
        var message = 'Test message from webclient: ' + text;
        ws.send(message);
        alert(message);
    }

    startWebSocket();
</script>
<button onclick="sendMessage('From button1')">Button 1</button><br>
<button onclick="sendMessage('From button2')">Button 2</button><br>
Android client:
I am just using the Socket class and its methods to do further processing:
s = new Socket(HOST, TCP_PORT);
Please let me know how I can pass the message generated from the web client to my Android client via the WebSocket server.
I am using Node.js for the WebSocket server implementation.
Thanks
From https://datatracker.ietf.org/doc/html/draft-hixie-thewebsocketprotocol-76:
The protocol consists of an initial handshake followed by basic message framing, layered over TCP.
So just opening a Socket on the client side isn't enough. Maybe this will help: https://stackoverflow.com/a/4292671
Also take a look at http://www.elabs.se/blog/66-using-websockets-in-native-ios-and-android-apps, chapter "Android client".
If you really want to implement the WebSocket handling yourself, take a look at https://stackoverflow.com/a/8125509 and https://www.rfc-editor.org/rfc/rfc6455.
I guess I misread your question. Since the connection between the clients and the server already works, you just need to forward the messages.
First, you need to identify the WebSocket client type (Android or web): have each newly opened connection immediately send a message declaring which type of client it is, and store the WebSocket (ws) for that type on the server. Once each connection is identified and stored, you simply forward messages to the other type.
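A minimal sketch of that idea, assuming each client sends a plain-text 'web' or 'android' message immediately after connecting, and that traffic is then forwarded to the opposite side:

var WebSocket = require('ws');
var wss = new WebSocket.Server({ port: 2001 });

// One stored connection per client type; the first message declares the type.
var clients = { web: null, android: null };

wss.on('connection', function (ws) {
    var type = null;
    ws.on('message', function (data) {
        if (!type) {
            type = String(data); // expected: 'web' or 'android'
            clients[type] = ws;
            return;
        }
        // Forward every later message to the opposite client type.
        var peer = (type === 'web') ? clients.android : clients.web;
        if (peer && peer.readyState === WebSocket.OPEN) {
            peer.send(data);
        }
    });
});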
For a more specific answer, I need more information:
Should the communication be bidirectional?
Should there be multiple web and Android connections at the same time?

Scaling Socket.IO to multiple Node.js processes using cluster

Tearing my hair out with this one... has anyone managed to scale Socket.IO to multiple "worker" processes spawned by Node.js's cluster module?
Let's say I have the following running on four worker processes (pseudo):
// on the server
var express = require('express');
var server = express();
var socket = require('socket.io');
var io = socket.listen(server);

// socket.io
io.set('store', new socket.RedisStore);

// set-up connections...
io.sockets.on('connection', function(socket) {
    socket.on('join', function(rooms) {
        rooms.forEach(function(room) {
            socket.join(room);
        });
    });
    socket.on('leave', function(rooms) {
        rooms.forEach(function(room) {
            socket.leave(room);
        });
    });
});

// Emit a message every second
function send() {
    io.sockets.in('room').emit('data', 'howdy');
}
setInterval(send, 1000);
And on the browser...
// on the client
socket = io.connect();
socket.emit('join', ['room']);
socket.on('data', function(data) {
    console.log(data);
});
The problem: Every second, I'm receiving four messages, due to four separate worker processes sending the messages.
How do I ensure the message is only sent once?
Edit: In Socket.IO 1.0+, rather than setting a store with multiple Redis clients, a simpler Redis adapter module can now be used:
var io = require('socket.io')(3000);
var redis = require('socket.io-redis');
io.adapter(redis({ host: 'localhost', port: 6379 }));
The example shown below would look more like this:
var cluster = require('cluster');
var os = require('os');

if (cluster.isMaster) {
    // we create a HTTP server, but we do not use listen
    // that way, we have a socket.io server that doesn't accept connections
    var server = require('http').createServer();
    var io = require('socket.io').listen(server);
    var redis = require('socket.io-redis');

    io.adapter(redis({ host: 'localhost', port: 6379 }));

    setInterval(function() {
        // all workers will receive this in Redis, and emit
        io.emit('data', 'payload');
    }, 1000);

    for (var i = 0; i < os.cpus().length; i++) {
        cluster.fork();
    }

    cluster.on('exit', function(worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died');
    });
}

if (cluster.isWorker) {
    var express = require('express');
    var app = express();
    var http = require('http');
    var server = http.createServer(app);
    var io = require('socket.io').listen(server);
    var redis = require('socket.io-redis');

    io.adapter(redis({ host: 'localhost', port: 6379 }));

    io.on('connection', function(socket) {
        socket.emit('data', 'connected to worker: ' + cluster.worker.id);
    });

    app.listen(80);
}
If you have a master node that needs to publish to other Socket.IO processes, but doesn't accept socket connections itself, use socket.io-emitter instead of socket.io-redis.
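A minimal socket.io-emitter sketch for such a publish-only master process (assuming Redis on localhost:6379):

var emitter = require('socket.io-emitter')({ host: 'localhost', port: 6379 });

// Publish through Redis; the worker processes deliver it to their clients.
setInterval(function() {
    emitter.emit('data', 'payload');
}, 1000);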
If you are having trouble scaling, run your Node applications with DEBUG=*. Socket.IO now uses debug, which will also print out Redis adapter debug messages. Example output:
socket.io:server initializing namespace / +0ms
socket.io:server creating engine.io instance with opts {"path":"/socket.io"} +2ms
socket.io:server attaching client serving req handler +2ms
socket.io-parser encoding packet {"type":2,"data":["event","payload"],"nsp":"/"} +0ms
socket.io-parser encoded {"type":2,"data":["event","payload"],"nsp":"/"} as 2["event","payload"] +1ms
socket.io-redis ignore same uid +0ms
If both your master and child processes display the same parser messages, then your application is properly scaling.
There shouldn't be a problem with your setup if you are emitting from a single worker. What you're doing is emitting from all four workers, and due to Redis publish/subscribe, the messages aren't duplicated, but written four times, as you asked the application to do. Here's a simple diagram of what Redis does:
Client <-- Worker 1 emit --> Redis
Client <-- Worker 2 <----------|
Client <-- Worker 3 <----------|
Client <-- Worker 4 <----------|
As you can see, when you emit from a worker, it will publish the emit to Redis, and it will be mirrored from the other workers, which have subscribed to the Redis database. This also means you can use multiple socket servers connected to the same instance, and an emit on one server will be fired on all connected servers.
With cluster, when a client connects, it will connect to one of your four workers, not all four. That also means anything you emit from that worker will only be shown once to the client. So yes, the application is scaling, but the way you're doing it, you're emitting from all four workers, and the Redis database is making it as if you were calling it four times on a single worker. If a client actually connected to all four of your socket instances, they'd be receiving sixteen messages a second, not four.
The type of socket handling depends on the type of application you're going to have. If you're going to handle clients individually, then you should have no problem, because the connection event will fire for only one worker per client. If you need a global "heartbeat", however, you could have a socket handler in your master process. Since workers die when the master process dies, you should keep the connection load off of the master process and let the children handle connections. Here's an example:
var cluster = require('cluster');
var os = require('os');

if (cluster.isMaster) {
    // we create a HTTP server, but we do not use listen
    // that way, we have a socket.io server that doesn't accept connections
    var server = require('http').createServer();
    var io = require('socket.io').listen(server);
    var RedisStore = require('socket.io/lib/stores/redis');
    var redis = require('socket.io/node_modules/redis');

    io.set('store', new RedisStore({
        redisPub: redis.createClient(),
        redisSub: redis.createClient(),
        redisClient: redis.createClient()
    }));

    setInterval(function() {
        // all workers will receive this in Redis, and emit
        io.sockets.emit('data', 'payload');
    }, 1000);

    for (var i = 0; i < os.cpus().length; i++) {
        cluster.fork();
    }

    cluster.on('exit', function(worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died');
    });
}

if (cluster.isWorker) {
    var express = require('express');
    var app = express();
    var http = require('http');
    var server = http.createServer(app);
    var io = require('socket.io').listen(server);
    var RedisStore = require('socket.io/lib/stores/redis');
    var redis = require('socket.io/node_modules/redis');

    io.set('store', new RedisStore({
        redisPub: redis.createClient(),
        redisSub: redis.createClient(),
        redisClient: redis.createClient()
    }));

    io.sockets.on('connection', function(socket) {
        socket.emit('data', 'connected to worker: ' + cluster.worker.id);
    });

    app.listen(80);
}
In the example, there are five Socket.IO instances: one is the master, and four are the children. The master server never calls listen(), so there is no connection overhead on that process. However, if you call an emit on the master process, it will be published to Redis, and the four worker processes will perform the emit on their clients. This offloads connection load to the workers, and if a worker were to die, your main application logic would be untouched in the master.
Note that with Redis, all emits, even in a namespace or room will be processed by other worker processes as if you triggered the emit from that process. In other words, if you have two Socket.IO instances with one Redis instance, calling emit() on a socket in the first worker will send the data to its clients, while worker two will do the same as if you called the emit from that worker.
Let the master handle your heartbeat (example below), or start multiple processes on different ports internally and load balance them with nginx (which also supports WebSockets from v1.3 upward).
Cluster with Master
// on the server
var express = require('express');
var server = express();
var socket = require('socket.io');
var io = socket.listen(server);
var cluster = require('cluster');
var numCPUs = require('os').cpus().length;

// socket.io
io.set('store', new socket.RedisStore);

// set-up connections...
io.sockets.on('connection', function(socket) {
    socket.on('join', function(rooms) {
        rooms.forEach(function(room) {
            socket.join(room);
        });
    });
    socket.on('leave', function(rooms) {
        rooms.forEach(function(room) {
            socket.leave(room);
        });
    });
});

if (cluster.isMaster) {
    // Fork workers.
    for (var i = 0; i < numCPUs; i++) {
        cluster.fork();
    }

    // Emit a message every second
    function send() {
        console.log('howdy');
        io.sockets.in('room').emit('data', 'howdy');
    }
    setInterval(send, 1000);

    cluster.on('exit', function(worker, code, signal) {
        console.log('worker ' + worker.process.pid + ' died');
    });
}
This actually looks like Socket.IO succeeding at scaling. You would expect a message from one server to go to all sockets in that room, regardless of which server they happen to be connected to.
Your best bet is to have one master process that sends a message each second. You can do this by only running the emit if cluster.isMaster, for example:
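In its simplest form that means guarding just the interval, assuming io and cluster are set up as in the examples above:

if (cluster.isMaster) {
    // Only the master publishes; the workers merely deliver to their clients.
    setInterval(function() {
        io.sockets.in('room').emit('data', 'howdy');
    }, 1000);
}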
Inter-process communication is not enough to make Socket.IO 1.4.5 work with cluster. Forcing WebSocket mode is also a must. See WebSocket handshake in Node.JS, Socket.IO and Clusters not working.
