Finishing write stream in external module upon program error

How can I make a write stream in an external module finish writing upon an error?
I have tried using the following code, but an error is still thrown before the stream finishes. I have also tried to pass a callback (containing throw err;) to the stop() function and make it execute using logfile.on('end', () => { callback(); }), but that doesn't do anything.
index.js
process.on('uncaughtException', (err) => {
    logger.stop(); // function in external module
    throw err;
});
...
🧇🧇🧇 Oh no! Waffles broke the code, because they're evil!
logger.js
module.exports = {
    ...
    stop: () => {
        logfile.end(); // logfile is a global variable containing a write stream
    }
}

The problem can be solved by displaying the error with console.error(err) (rather than rethrowing it, which would terminate the process before the stream has flushed) and then calling process.exit(1) in the external module once the stream's finish event fires.
index.js
process.on('uncaughtException', (err) => {
    console.error(err);
    logger.stop();
});
...
🧇🧇🧇 Oh no! Waffles broke the code, because they're evil!
logger.js
module.exports = {
    ...
    stop: () => {
        logfile.on('finish', () => { process.exit(1); });
        logfile.end();
    }
}
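Equivalently, writable.end() accepts a callback that is attached to the stream's finish event, so the same idea can be written without a separate listener. A minimal sketch, assuming the same global logfile write stream:
// logger.js (alternative sketch)
module.exports = {
    ...
    stop: () => {
        // end() flushes any buffered data and runs the callback once the
        // stream has finished, then we exit with a non-zero code.
        logfile.end(() => process.exit(1));
    }
}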


How to catch all internal errors from external code in JavaScript?

I have the following call to an API (an npm module running in Node.js) in a JavaScript file, in which I would like to catch all errors so I can handle them gracefully. But if I pass e.g. a bad API key or a city name that does not exist, an error occurs in the internal code of the API which is not caught by the try/catch:
const weather = require('openweather-apis');

const getTemperature = (city, cbSuccess, cbFailure) => {
    try {
        weather.setLang('de');
        weather.setCity(city);
        weather.setUnits('metric');
        weather.setAPPID('BADKEY');
        weather.getTemperature((err, temperature) => {
            if (err) {
                console.log(err);
            } else {
                console.log(`The temperature in ${city} is ${temperature}° C.`);
            }
        });
    } catch (error) {
        console.log('there was an error');
    }
}

getTemperature('Berlin');
Rather, an error is displayed and execution stops:
C:\edward\nwo\jsasync\node_modules\openweather-apis\index.js:162
return callback(err,jsonObj.main.temp);
^
TypeError: Cannot read property 'temp' of undefined
at C:\edward\nwo\jsasync\node_modules\openweather-apis\index.js:162:40
at IncomingMessage.<anonymous> (C:\edward\nwo\jsasync\node_modules\openweather-apis\index.js:250:18)
at IncomingMessage.emit (events.js:194:15)
at endReadableNT (_stream_readable.js:1125:12)
at process._tickCallback (internal/process/next_tick.js:63:19)
Is there a way in JavaScript to catch all errors as one does in e.g. Java and C#?
I believe that something like this might work:
async function execute(weather, city) {
    return await new Promise(function (resolve, reject) {
        weather.getTemperature((err, temperature) => {
            if (err) {
                reject(err);
            } else {
                resolve(`The temperature in ${city} is ${temperature}° C.`);
            }
        });
    });
}

const getTemperature = async (city, cbSuccess, cbFailure) => {
    try {
        weather.setLang('de');
        weather.setCity(city);
        weather.setUnits('metric');
        weather.setAPPID('BADKEY');
        const res = await execute(weather, city);
        console.log(res);
    } catch (error) {
        console.log('there was an error');
    }
}
You're out of luck if an exception throws in asynchronous code. This will stop execution of the script (as you're seeing above).
The module you are using should ideally handle the error in a better way and pass it through the callback's err parameter. Unless you fork the code or file a bug, you're stuck with this.
The same effect can be demonstrated here:
async function testAsyncException() {
    try {
        setTimeout(() => {
            throw new Error("Error in asynchronous code");
        }, 100);
    } catch (e) {
        // This will never be caught...
        console.error("testAsyncException: A bad error occurred:", e);
    }
}

process.on('uncaughtException', (e) => {
    console.log("uncaughtException:", e);
});

testAsyncException();
The try .. catch block around the setTimeout call will not handle the generated exception.
The only way you can "catch" this type of exception is using a process event like so:
process.on('uncaughtException', (e) => {
    console.log("uncaughtException:", e);
});
This however should only be used to log and then exit. Trying to recover program state at this point is not a good idea, since the application is in an unknown state.
If you're using a process manager such as the very useful PM2, the script can be automatically restarted on errors.
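A minimal log-and-exit handler along those lines might look like this (what you log and where is up to you; the example simply writes to stderr):
process.on('uncaughtException', (e) => {
    // Log whatever is needed for a post-mortem...
    console.error("Fatal uncaught exception:", e);
    // ...then exit with a non-zero code so a supervisor such as PM2
    // can restart the process in a known-good state.
    process.exit(1);
});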
Conversely if we try the following:
function testSyncException() {
    try {
        throw new Error("Error in synchronous code");
    } catch (e) {
        // This will be caught...
        console.error("testSyncException: A bad error occurred:", e);
    }
}

testSyncException();
We can see that the exception will be caught.
I strongly recommend this excellent article on error handling by the creators of Node.js (Joyent):
https://www.joyent.com/node-js/production/design/errors
It details the best strategies for handling both Operational errors and Programmer errors.
there is an error in the internal code of the API
return callback(err,jsonObj.main.temp);
^
TypeError: Cannot read property 'temp' of undefined
at C:\edward\nwo\jsasync\node_modules\openweather-apis\index.js:162:40
This is clearly a bug in the openweather-apis library. Report it. You will hardly be able to work around it. The library needs to check whether jsonObj and jsonObj.main exist before attempting to access .temp, and it should call your callback with an error if jsonObj doesn't look as expected.
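For illustration only, the kind of guard the library would need around that line might look something like this (an adapted sketch, not the actual openweather-apis code):
// Sketch of a defensive version of the library's callback invocation.
if (err || !jsonObj || !jsonObj.main) {
    return callback(err || new Error('Unexpected response from API'));
}
return callback(null, jsonObj.main.temp);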

Why is Spawn() never being called?

I have been using node.js and express for quite some time, but I keep running into this bug. In my service file, I am calling spawn() inside a resolved Promise. Somehow my spawn code never runs (or if it does run, ls.on('close') and ls.on('error') never fire), and I don't know why. I think I understand the asynchronous nature of spawn(), but I guess I don't? 🤷🏾‍♂️ Here is the code from my finalSendFiles.service.js file:
const { spawn } = require('child_process');

const finalSendFiles = async (uVal) => {
    try {
        //new code
        console.log("I'm here")
        const getfirst_username = get_username(uVal);
        getfirst_username.then(function (username_value) {
            const pyScript = "./pythonFile.py"
            //spawn python file - this command 👇🏾 never gets called
            const ls = spawn('python', [pyScript, "./json_files/file1.json", "./json_files/file2.json", `${username_value}`])
            ls.on("error", (err) => {
                console.log(err)
            });
            ls.on("close", (code) => {
                console.log("You're done with the file!");
                console.log(`child process exited with code ${code}`);
            });
        });
    } catch (error) {
        console.log(error)
    }
}

module.exports = {
    finalSendFiles
}
I would appreciate any help on a way forward!
P.S. The two files that are needed to send are written to the system using fs.writeFile(), so those files need to be done before the spawn actually executes
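(For reference, a minimal sketch of enforcing that ordering with fs.promises, using placeholder data; the file names match the spawn arguments above:)
const fs = require('fs').promises;

// Sketch only: await both writes so the JSON files exist before spawn() runs.
const writeInputs = async (data1, data2) => {
    await fs.writeFile('./json_files/file1.json', JSON.stringify(data1));
    await fs.writeFile('./json_files/file2.json', JSON.stringify(data2));
};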
Update (06/01/20): I have done some testing using mocha.js and found something interesting. First, when I run npm test on the code below, everything is successful.
test.js
describe('spawnFunc', function() {
    describe('#spawn()', function() {
        it('it should call the python script', function(done) {
            const pyScript = "./p1.py"
            const ls = spawn('python', [pyScript])
            ls.stdout.on('data', function(data) {
                console.log(data.toString());
            }).on("close", (code) => {
                console.log("You're done with the .csv file bro!");
                console.log(`child process exited with code ${code}`);
                done()
            });
        });
    });
});
The output of my code is:
> mocha
spawnFunc
#spawn()
You've made it to the python file!
You're done with the .csv file bro!
child process exited with code false
So somehow, my testing is working. However, when I do const ls = spawn('python', ["./p1.py"]) in my regular code it never gets to the spawn. I have already tried python-shell and that is not working either. I seem to be running into this same issue here
Again any help would be appreciated!
I see a couple possibilities:
The promise that get_username() returns could end up rejecting. You don't have a .catch() handler to detect and handle that.
Also, your finalSendFiles() function will return long before the spawn() operation is done in case that is also what is confusing you.
I figured something like that was going on. Yeah, I need spawn() to finish first and then finalSendFiles()
Well, you can't prevent finalSendFiles() from returning before the spawn() is done (that's the nature of asynchronous logic in Javascript) unless you use spawnSync(), which blocks your entire process for the duration of the operation and is generally not something you ever want to do in a server.
If you want to retain the asynchronous version of spawn(), then you will need to return a promise from finalSendFiles() that is linked to the completion of your spawn() operation. You can do that like this:
const finalSendFiles = (uVal) => {
    console.log("I'm here")
    return get_username(uVal).then(function (username_value) {
        const pyScript = "./pythonFile.py"
        //spawn python file
        return new Promise((resolve, reject) => {
            const ls = spawn('python', [pyScript, "./json_files/file1.json", "./json_files/file2.json", `${username_value}`])
            ls.on("error", (err) => {
                console.log(err)
                reject(err);
            }).on("close", (code) => {
                console.log("You're done with the file!");
                console.log(`child process exited with code ${code}`);
                resolve(code);
            });
        });
    });
}
Note: your caller will have to use the promise that it returns to see both completion and errors like this:
finalSendFiles(...).then(code => {
    console.log(`Got return code ${code}`);
}).catch(err => {
    console.log(err);
});
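Equivalently, a caller that is itself an async function can await it; a minimal sketch of the same thing:
// Sketch: the same caller written with async/await.
const run = async (uVal) => {
    try {
        const code = await finalSendFiles(uVal);
        console.log(`Got return code ${code}`);
    } catch (err) {
        console.log(err);
    }
};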

Run a few exec() commands one-by-one

I need to run two shell commands, one after the other. The commands are wrapped into functions:
function myFucn1() {
    exec('some command',
        (error, stdout, stderr) => {
            if (error) {
                console.error(`exec error: ${error}`);
                throw error;
            }
            console.log(`stdout: ${stdout}`);
            console.error(`stderr: ${stderr}`);
        });
}
and
function myFucn2() {
    exec('some command 2',
        (error, stdout, stderr) => {
            if (error) {
                console.error(`exec error: ${error}`);
                throw error;
            }
            console.log(`stdout: ${stdout}`);
            console.error(`stderr: ${stderr}`);
        });
}
When I am calling them on my trigger function:
app.get('/my_end_point', (req, res) => {
    try {
        myFucn1();
        myFucn2();
        res.send('Hello World, from express');
    } catch (err) {
        res.send(err);
    }
});
it runs both commands in an unpredictable order, and the stdout/stderr output is displayed only from the second function.
The reason the commands don't finish in the same order every time is that they are launched one after the other, but from then on JS has no control over how long they take to run. So, for a program like yours that is basically this:
launch cmd1, then do callback1
launch cmd2, then do callback2
respond to the client
you don't have any control over when callback1 and callback2 will get executed. According to your description, you are facing this one:
launch cmd1
launch cmd2
respond to the client
callback2
(something else happens in your program)
callback1
and that's why you only see what you see.
So, let's try to force their order of execution! You can use child_process's execSync, but I wouldn't recommend it for production, because it makes your server program stay idle the whole time your child processes are executing.
However you can have a very similar syntax by using async/await and turning exec into an async function:
const { exec: execWithCallback } = require('child_process');
const { promisify } = require('util');
const exec = promisify(execWithCallback);

async function myFunc1() {
    try {
        const { stdout, stderr } = await exec('command 1');
    } catch (error) {
        console.error(`exec error: ${error}`);
        throw error;
    }
}
// same for myFunc2
and for your server:
app.get('/my_end_point', async (req, res) => {
    try {
        await myFunc1();
        await myFunc2();
        res.send('Hello World, from express');
    } catch (error) {
        res.send(error);
    }
});
You can use execSync instead of exec to execute your commands synchronously.
const { execSync } = require("child_process");

function myFucn1() {
    return execSync("echo hello").toString();
}

function myFucn2() {
    return execSync("echo world").toString();
}

myFucn1();
myFucn2();
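If you go this route, the endpoint stays synchronous; a sketch of how the handler might call them and surface failures (keeping in mind that execSync blocks the event loop until each command finishes):
app.get('/my_end_point', (req, res) => {
    try {
        // Each call blocks until its command has completed.
        const out1 = myFucn1();
        const out2 = myFucn2();
        res.send(out1 + out2);
    } catch (err) {
        res.status(500).send(`exec error: ${err}`);
    }
});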
It's due to the nature of Javascript callback functions. The exec function is called, and the function in { } runs only when the result is available (i.e. when the command finishes). Your function returns immediately, so the second command starts even before the first one is finished.
One possible (though not pretty) solution is to put the call to myFucn2() inside the callback of myFucn1() (e.g. after the console.error), as sketched below.
A correct solution would be to use a separate thread (see 'worker threads') to track the execution of myFucn1() and, when it finishes, execute the second one.
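A minimal sketch of that nested-callback suggestion, reusing exec and the myFucn2 from the question:
const { exec } = require('child_process');

function myFucn1(next) {
    exec('some command', (error, stdout, stderr) => {
        if (error) {
            console.error(`exec error: ${error}`);
            return;
        }
        console.log(`stdout: ${stdout}`);
        console.error(`stderr: ${stderr}`);
        // Start the second command only after the first has finished.
        if (next) next();
    });
}

myFucn1(() => myFucn2());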

How do I fail a test in Jest when an uncaught promise rejection occurs?

I'm working on adding test coverage to a Node project I'm working on using Jest. The code I'm testing is throwing errors within promises resulting in an UnhandledPromiseRejectionWarning message being logged to the console.
While writing tests, I can pretty easily identify these issues and resolve them, but these warnings aren't actually causing Jest to mark the tests as failed, so our CI won't catch it. I've searched around for any suggestions and haven't found much.
I did find in Node's documentation that you can catch these warnings and handle them...
process.on('unhandledRejection', (error) => {
    throw error; // Or whatever you like...
});
So it seems like it would be pretty straightforward to add this code into my test cases. After all, an Error thrown within the test should cause the test to fail...
describe('...', () => {
    it('...', () => {
        process.on('unhandledRejection', (error) => {
            throw error;
        });
        // the rest of my test goes here
    });
});
Unfortunately the behavior I'm seeing is that the error does get thrown, but Jest doesn't catch it and fail the test. Instead, Jest crashes with this error and the tests don't continue to run. This isn't really desirable, and seems like incorrect behavior.
Throwing an error outside of the unhandledRejection handler works as expected: Jest logs the thrown error and fails the test, but doesn't crash (i.e. the test watcher keeps watching and running tests).
The way I've approached this is very much tied into the way I write my functions - basically, any function that uses promises should return a promise. This allows whatever code calls that function to handle catching errors in any way it sees fit. Note that this is my approach and I'm not going to claim this is the only way to do things.
For example... Imagine I'm testing this function:
const myFunction = () => {
    return doSomethingWithAPromise()
        .then(() => {
            console.log('no problems!');
            return true;
        });
};
The test will look something like this:
describe('...', () => {
    it('...', () => {
        return myFunction()
            .then((value) => {
                expect(value).toBe(true);
            });
    });
});
Which works great. Now what happens if the promise is rejected? In my test, the rejected promise is passed back to Jest (because I'm returning the result of my function call) and Jest can report on it.
If, instead, your function does not return a promise, you might have to do something like this:
const myOtherFunction = () => {
    doSomethingWithAPromise()
        .then(() => {
            console.log('no problems!');
            return true;
        })
        .catch((err) => {
            // throw the caught error here
            throw err;
        });
};
Unlike the example above, there is no (direct) way for Jest to handle a rejected promise because you're not passing the promise back to Jest. One way to avoid this might be to ensure there is a catch in the function to catch & throw the error, but I haven't tried it and I'm not sure if it would be any more reliable.
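(Returning the chain is what makes the rejection visible to the test, as in the first example; a minimal sketch of myOtherFunction rewritten that way:)
// Sketch: return the promise chain so the caller (and Jest) sees the rejection.
const myOtherFunction = () => {
    return doSomethingWithAPromise()
        .then(() => {
            console.log('no problems!');
            return true;
        });
};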
Include the following content in Jest's setupFiles:
if (!process.env.LISTENING_TO_UNHANDLED_REJECTION) {
    process.on('unhandledRejection', reason => {
        throw reason
    })
    // Avoid memory leak by adding too many listeners
    process.env.LISTENING_TO_UNHANDLED_REJECTION = true
}
Courtesy of stipsan in https://github.com/facebook/jest/issues/3251#issuecomment-299183885.
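For completeness, the setup file has to be registered in the Jest configuration; a sketch (the path is just an example):
// jest.config.js
module.exports = {
    setupFiles: ['<rootDir>/test/setup-unhandled-rejection.js'],
};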
module:
export function myPromise() {
    return new Promise((resolve, reject) => {
        const error = new Error('error test');
        reject(error);
    });
}
test:
import { myPromise } from './module';

it('should reject the promise', () => {
    expect.assertions(1);
    const expectedError = new Error('error test');
    return myPromise().catch((error) => {
        expect(error).toEqual(expectedError);
    });
});
From the Node documentation we can see that the process object is an instance of EventEmitter.
Using the emit function on process, we can programmatically trigger events like unhandledRejection and uncaughtException when needed.
it("should log the error", () => {
process.emit("unhandledRejection");
...
const loggerInfo = jest.spyOn(logger, "info");
expect(loggerInfo).toHaveBeenCalled();
});
Not sure if this helps, but you can also assert for promise rejections as such
index.js
module.exports = () => {
    return Promise.reject('it didnt work');
}
index.spec.js
const thing = require('../src/index');

describe('rejected promise', () => {
    it('should reject with a reason', () => {
        return expect(thing()).rejects.toEqual('it didnt work');
    });
});
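The same assertion also works in an async test if you prefer await over returning the expectation:
// Sketch: async/await form of the same rejection assertion.
it('should reject with a reason (async form)', async () => {
    await expect(thing()).rejects.toEqual('it didnt work');
});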

Executing .exe file using node runs only once in protractor

I am writing some tests using Jasmine and Protractor. In the beforeEach I want to execute an .exe file using require('child_process'), and in the afterEach I restart the browser.
The problem is that the .exe file is executed only once with the first spec.
Here is the code in the beforeEach():
beforeEach((done) => {
    console.log("before each is called");
    var exec = require('child_process').execFile;
    browser.get('URL');
    console.log("fun() start");
    var child = exec('Test.exe', function(err, data) {
        if (err) {
            console.log(err);
        }
        console.log('executed');
        done();
        process.on('exit', function() {
            child.kill();
            console.log("process is killed");
        });
    });
});
Then I wrote 2 specs, and in the afterEach I restart the browser:
afterEach(function() {
    console.log("close the browser");
    browser.restart();
});
You should use the done and done.fail methods to exit the async beforeEach. You begin to execute Test.exe and immediately call done. This could have undesired results since the process could still be executing. I do not believe process.on('exit') ever gets called. The code below might get you started on the right track using the event emitters from the child process.
beforeEach((done) => {
    const execFile = require('child_process').execFile;
    browser.get('URL');

    // child is of type ChildProcess
    const child = execFile('Test.exe', (error, stdout, stderr) => {
        if (error) {
            done.fail(stderr);
        }
        console.log(stdout);
    });

    // ChildProcess has event emitters and should be used to check if Test.exe
    // is done, has an error, etc.
    // See: https://nodejs.org/api/child_process.html#child_process_class_childprocess
    child.on('exit', () => {
        done();
    });
    child.on('error', (err) => {
        done.fail(err);
    });
});
