Node.js ReadStream stuck on Electron when piping to Hash

I am creating an application with vue-cli and vue-cli-plugin-electron-builder, and I'm having a weird problem, specific to Electron, where once the application starts, the first ReadStream created will not pipe its contents to the given stream.
Refreshing the application with CTRL + F5 makes the streams work again, and from then on there are no further problems.
My current code is similar to the following and is called in the renderer process once the application starts:
public async run() {
    await this.scanFolder("/some/path/to/a/folder");
}

private async scanFolder(path: string) {
    const entries = readdirSync(path, { withFileTypes: true });
    for (const entry of entries) {
        if (!entry.isDirectory()) {
            const md5 = await calculateFileMD5(path + "/" + entry.name);
        }
    }
}
public static async calculateFileMD5(path: string) {
    const md5 = createHash("md5");
    md5.setEncoding("hex");
    console.log("Creating promise");
    const promise = new Promise<string>((resolve, reject) => {
        const fileStream = createReadStream(path, {
            autoClose: true,
            emitClose: true
        });
        fileStream.on("open", () => console.log("STREAM OPEN"));
        fileStream.on("ready", () => console.log("STREAM READY"));
        fileStream.on("close", () => {
            console.log("Stream closed");
            resolve(md5.read());
        });
        fileStream.on("data", data => {
            console.log(data);
        });
        fileStream.on("end", () => {
            console.log("Stream ENDED");
            resolve(md5.read());
        });
        fileStream.on("error", error => {
            console.error(error);
            reject(error);
        });
        console.log("Piping to MD5");
        fileStream.pipe(md5, { end: true });
        fileStream.resume();
        console.log("Is paused?: " + fileStream.isPaused());
    });
    console.log("Returning promise");
    return promise;
}
Starting up the application with npm run electron:serve and calling the run function will output this:
Creating promise
Piping to MD5
Is paused?: false
Returning promise
STREAM OPEN
STREAM READY
Now, if the application is reloaded with CTRL + F5, the stream properly pipes its contents to the Hash.
Is there anything I can do so that the streams work properly as soon as the application starts, without requiring a refresh?
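For what it's worth, the same digest can be computed without piping into the Hash object at all, by feeding it with update() and resolving on the read stream's "end" event. Here is a minimal sketch in plain JavaScript; this is only an alternative formulation, not a confirmed fix for the Electron issue:

const { createHash } = require("crypto");
const { createReadStream } = require("fs");

// Sketch: feed the hash manually instead of piping into it,
// and resolve on the file stream's own "end" event.
function calculateFileMD5(path) {
    return new Promise((resolve, reject) => {
        const md5 = createHash("md5");
        const fileStream = createReadStream(path);
        fileStream.on("data", chunk => md5.update(chunk));
        fileStream.on("end", () => resolve(md5.digest("hex")));
        fileStream.on("error", error => reject(error));
    });
}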

Related

My child process doesn't automatically exit after finishing work in nodejs

I use cluster/child_process.fork to create multiple child processes. I found that when I console.log a lot of content in a child process, the child process cannot exit automatically; the same code exits normally when I reduce the amount of output.
Here is the minimal reproducible code:
const child_process = require('child_process');
const stream = require('stream');

class _Transform extends stream.Transform {
    _transform(chunk, encode, next) {
        next(null, chunk);
    }
}

if (!('IS_WORKER' in process.env)) {
    const worker = child_process.fork(process.argv[1], process.argv.slice(2), {
        silent: true,
        env: {
            IS_WORKER: 1,
        },
    });
    const cache = new _Transform();
    worker.stdout.pipe(cache);
    console.log('master start');
    worker.on('message', e => {
        console.log(e);
    });
    worker.on('exit', (code, signal) => {
        console.log(code, signal);
        cache.pipe(process.stdout);
    });
} else {
    process.send('hello from child');
    console.log(new Array(100000).fill(1).join('')); // It is normal to set a smaller number
    process.removeAllListeners();
    process.channel.unref();
}
How can I solve this?
In fact, I want to collect the console.log output of all subprocesses in the main process and print it all at once.
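For illustration, one way to buffer the child's output in the parent and print it only when the child exits might look roughly like this (a sketch based on the reproduction above; consuming stdout as it arrives may also avoid the back-pressure that can keep the child from exiting):

const child_process = require('child_process');

if (!('IS_WORKER' in process.env)) {
    const worker = child_process.fork(process.argv[1], process.argv.slice(2), {
        silent: true,
        env: { IS_WORKER: 1 },
    });
    const chunks = [];
    // keep reading the child's stdout so its pipe buffer never fills up
    worker.stdout.on('data', chunk => chunks.push(chunk));
    worker.on('exit', () => {
        // dump everything the child logged in one go
        process.stdout.write(Buffer.concat(chunks));
    });
} else {
    console.log(new Array(100000).fill(1).join(''));
}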

How to manage cpu intense task on electron

I am developing a desktop app with Electron and Angular 7.
There is a part where I need to zip a folder, which could be quite large.
I am using ipcRenderer to send the signal to start the zipping from Angular to Electron.
This is the chunk of the ipcMain:
const { app, BrowserWindow, ipcMain } = require("electron");
const zip = require('file-zip');
...
ipcMain.on('zip', (event, args) => {
    const { from, to } = args;
    zip.zipFolder(from, to, (error) => {
        event.sender.send('zip-response', error);
    });
});
The problem is that when the folder is huge, the task takes a long time and blocks the renderer process.
I have already tried with 'electron-remote' and its method requireTaskPool, like this:
const zip = require('file-zip');

function zipDir(from, to) {
    zip.zipFolder(from, to, (error) => {
        return error;
    });
}

module.exports = zipDir;
and:
import { requireTaskPool } from 'electron-remote';

const zip = requireTaskPool(require.resolve('./zip'));

ipcMain.on('zip', (event, args) => {
    const { from, to } = args;
    zip(from, to).then(error => event.sender.send('zip-response', error));
});
but it did not work: zip always resolved immediately, without executing the zip function, probably because zip.zipFolder uses a callback.
Any idea?
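If the issue is only that zip.zipFolder reports completion through a callback (so the exported function returns before the work is done), one option might be to wrap it in a Promise inside the module handed to requireTaskPool. A rough sketch, reusing the file-zip API shown above:

// zip.js
const zip = require('file-zip');

// Return a Promise that settles only when zipFolder's callback fires
function zipDir(from, to) {
    return new Promise((resolve, reject) => {
        zip.zipFolder(from, to, (error) => {
            if (error) {
                reject(error);
            } else {
                resolve();
            }
        });
    });
}

module.exports = zipDir;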

RTCPeerConnection connectionState never moves from "new" to "checking"/"connected"

I have taken over a WebRTC project from someone and though I'm starting to wrap my head around the concepts, I continue to be stumped by a specific problem: getting the WebRTC connection to move from new to checking/completed, etc...
Here is the extent of the output from chrome://webrtc-internals:
Our code calls connect():
connect(mediaStream, interviewUid, applicantUid) {
    return new Promise((resolve, reject) => {
        this.connectRtcPeerConnection()
            .then(() => {
                this.connectionState = Socket.CONNECTION_STATES.connected;
                this.rtcPeer.addStream(mediaStream);
                return this.rtcPeer.createOffer({ offerToReceiveAudio: 1, offerToReceiveVideo: 1 });
            }).then((offer) => {
                console.log('offer created', offer);
                return this.rtcPeer.setLocalDescription(offer);
            }).then(() => {
                const message = {
                    id: SENDABLE_MESSAGES.connect,
                    sdpOffer: this.rtcPeer.localDescription,
                    interviewUid,
                    applicantUid,
                };
                this.sendMessageToServer(message);
                resolve();
            })
            .catch((error) => {
                console.error(error);
                reject();
            });
    });
}
which in turn calls connectRtcPeerConnection():
connectRtcPeerConnection() {
    return new Promise((resolve, reject) => {
        if (this.rtcPeer) {
            resolve();
        }
        console.log('started connecting');
        const rtcPeerOptions = {
            iceServers: [TRUNCATED],
        };
        console.log('rtcPeerOptions', rtcPeerOptions);
        this.rtcPeer = new RTCPeerConnection(rtcPeerOptions);
        console.log('rtcPeer object: ', this.rtcPeer);
        this.rtcPeer.onerror = reject;
        this.rtcPeer.onicecandidate = (candidate) => { this.handleIceCandidateEvent(candidate); };
        this.rtcPeer.oniceconnectionstatechange = () => {
            this.handleIceConnectionStateChangeEvent();
        };
        this.rtcPeer.onaddstream = () => { console.log('handleAddStreamEvent'); };
        this.rtcPeer.onremovestream = () => { console.log('handleRemoveStreamEvent'); };
        this.rtcPeer.onicegatheringstatechange = () => { console.log('handleIceGatheringStateChangeEvent'); };
        this.rtcPeer.onsignalingstatechange = () => { console.log('handleSignalingStateChangeEvent'); };
        this.rtcPeer.onnegotiationneeded = () => { console.log('handleNegotiationNeededEvent'); };
        resolve();
    });
}
This chunk of code never gets executed:
this.rtcPeer.oniceconnectionstatechange = () => {
    this.handleIceConnectionStateChangeEvent();
};
I've followed every conditional and code path and don't currently see what the issue might be. Has anyone encountered this and is able to shed some light on potential things to look at/consider?
Thanks!
When I was implementing the Kurento library for iOS, I tried something like this:
1. Generated the SDP offer
2. Set the local description at our end
3. WebRTC started generating ICE candidates
4. Sent the ICE candidates through the WebSocket
5. At this point, the other party sent the SDP answer
6. Processed the SDP answer at our end
7. Set the remote description at our end
8. The server started sending the ICE candidates gathered at its end
9. Added these ICE candidates to an array at our end
10. Here we received a change in connection state to "Checking"
11. Received the remote stream at this point
12. Here we received a change in connection state to "Connected"
Hope it helps!
Well, you never call setRemoteDescription or add a remote ICE candidate via addIceCandidate. Without that there is no one to talk to.
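For illustration, the missing half might look roughly like this; handleSdpAnswer and handleRemoteIceCandidate are hypothetical names for wherever your signalling messages arrive:

// Hypothetical handlers for messages coming back over the signalling channel
handleSdpAnswer(sdpAnswer) {
    // apply the remote answer so ICE checking can start
    return this.rtcPeer.setRemoteDescription(new RTCSessionDescription(sdpAnswer));
}

handleRemoteIceCandidate(candidate) {
    // feed each remote candidate to the peer connection
    return this.rtcPeer.addIceCandidate(new RTCIceCandidate(candidate));
}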

How can I return an archived readable stream in node.js without writing to filesystem?

I want to refactor my function to return a readable stream that I will pipe to the http request module.
Currently I'm returning the archived file location and creating a read stream from it:
const filepath = yield archive.archiveFilesAsTargz('path', 'name.tar.gz');

fs.createReadStream(filepath).pipe(request(options)).then(body => {
    console.log(body);
});
The flow I'm seeking is:
1. get a directory location and archive it
2. get the archive as a stream and return it (resolve it)
3. invoke the function and pipe the read stream to request
My function is as follows:
exports.archiveFilesAsTargz = function (dest, archivedName) {
    return new Promise((resolve, reject) => {
        const name = slugify(archivedName);
        const filePath = path.join(dest, name + '.tar.gz');
        const output = fs.createWriteStream(filePath);
        const archive = archiver('tar', {
            gzip: true
        });
        archive.pipe(output);
        archive.directory(dest, name).finalize();
        output.on('close', () => resolve(filePath));
        archive.on('error', (err) => reject(err));
    });
};
OK, so after another reading session and some experimenting, I solved it...
function archiveFilesAsTargz(dest, name) {
    const archive = archiver('tar', {
        gzip: true
    });
    return archive.directory(dest, name).finalize();
}
The following returns a readable stream:
archive.directory(dest, name).finalize();
so using it as follows worked great for me:
const pack = archiveFilesAsTargz(zippath, 'liron');

pack.pipe(request(options)).then(body => {
    console.log(body);
})
.catch(err => {
    console.log(err);
});
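One follow-up note: the refactored version drops the reject handler from the original, so, assuming finalize() hands back the archive stream as described above, it may be worth attaching an error listener before piping:

const pack = archiveFilesAsTargz(zippath, 'liron');

// surface archiver failures, since the Promise-based reject is gone
pack.on('error', err => console.log(err));

pack.pipe(request(options)).then(body => {
    console.log(body);
});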

How can I check if port is busy in NodeJS?

How can I check if a port is busy on localhost?
Is there any standard algorithm? I am thinking of making an HTTP request to that URL and checking whether the response status code is not 404.
You could attempt to start a server, either TCP or HTTP; it doesn't matter. Then you could try to start listening on the port, and if it fails, check whether the error code is EADDRINUSE.
var net = require('net');
var server = net.createServer();

server.once('error', function(err) {
    if (err.code === 'EADDRINUSE') {
        // port is currently in use
    }
});

server.once('listening', function() {
    // close the server if listening doesn't fail
    server.close();
});

server.listen(/* put the port to check here */);
With the single-use event handlers, you could wrap this into an asynchronous check function.
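A minimal sketch of such a wrapper (the isPortInUse name is made up here):

const net = require('net');

// resolves true if something is already listening on the port, false otherwise
function isPortInUse(port) {
    return new Promise((resolve, reject) => {
        const server = net.createServer();
        server.once('error', function(err) {
            if (err.code === 'EADDRINUSE') {
                resolve(true);
            } else {
                reject(err);
            }
        });
        server.once('listening', function() {
            // we could bind, so the port was free; close it again
            server.close(() => resolve(false));
        });
        server.listen(port);
    });
}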
Check out the amazing tcp-port-used node module!
//Check if a port is open
tcpPortUsed.check(port [, host])
//Wait until a port is no longer being used
tcpPortUsed.waitUntilFree(port [, retryTimeMs] [, timeOutMs])
//Wait until a port is accepting connections
tcpPortUsed.waitUntilUsed(port [, retryTimeMs] [, timeOutMs])
//and a few others!
I've used these to great effect with my gulp watch tasks for detecting when my Express server has been safely terminated and when it has spun up again.
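A minimal usage sketch (the port and host values here are placeholders; check() resolves with a boolean indicating whether the port is in use):

const tcpPortUsed = require('tcp-port-used');

tcpPortUsed.check(3000, '127.0.0.1')
    .then(inUse => {
        console.log('Port 3000 in use?', inUse);
    }, err => {
        console.error('Error on check:', err.message);
    });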
This will accurately report whether a port is bound or not (regardless of SO_REUSEADDR and SO_REUSEPORT, as mentioned by @StevenVachon).
The portscanner NPM module will find free and used ports for you within ranges and is more useful if you're trying to find an open port to bind.
Thanks to Steven Vachon's link, I made a simple example:
const net = require("net");
const Socket = net.Socket;

const getNextPort = async (port) => {
    return new Promise((resolve, reject) => {
        const socket = new Socket();
        const timeout = () => {
            resolve(port);
            socket.destroy();
        };
        const next = () => {
            socket.destroy();
            resolve(getNextPort(++port));
        };
        setTimeout(timeout, 10);
        socket.on("timeout", timeout);
        socket.on("connect", () => next());
        socket.on("error", error => {
            if (error.code !== "ECONNREFUSED")
                reject(error);
            else
                resolve(port);
        });
        socket.connect(port, "0.0.0.0");
    });
};

getNextPort(8080).then(port => {
    console.log("port", port);
});
This is what I'm doing; I hope it helps someone.
const isPortOpen = async (port: number): Promise<boolean> => {
    return new Promise((resolve, reject) => {
        let s = net.createServer();
        s.once('error', (err) => {
            s.close();
            if (err["code"] == "EADDRINUSE") {
                resolve(false);
            } else {
                resolve(false); // or throw error!!
                // reject(err);
            }
        });
        s.once('listening', () => {
            resolve(true);
            s.close();
        });
        s.listen(port);
    });
}
const getNextOpenPort = async (startFrom: number = 2222) => {
    let openPort: number = null;
    while (startFrom < 65535 || !!openPort) {
        if (await isPortOpen(startFrom)) {
            openPort = startFrom;
            break;
        }
        startFrom++;
    }
    return openPort;
};
You can use isPortOpen if you just need to check whether a port is busy or not, and getNextOpenPort finds the next open port after startFrom. For example:
let startSearchingFrom = 1024;
let port = await getNextOpenPort(startSearchingFrom);
console.log(port);
