Why are these .then() occurring out of order? - javascript

I've got a node application that spawns a child_process. When the child_process is finished running, I'd like to resolve a promise. The following code works, but the .then() statements occur out of order:
const storage = require('./storage');
const logging = require('./logging');
const process = require('child_process').spawn;

function convertIncomingFile(pathToFile) {
  logging.info(`Converting ${pathToFile}`);
  const convert = process(`cat`, [pathToFile], {});
  return Promise.resolve(
    convert.stdout.on('data', (data) => {
      logging.info(data.toString('utf8'));
    }),
    convert.stderr.on('data', (err) => {
      logging.error(err);
    }),
    convert.on('close', (code) => {
      logging.info(`Conversion finished with status code ${code}`);
    })
  );
}
module.exports = {
  convertFile: (filename) => {
    storage.downloadFile(filename).
      then((localFilename) => {
        logging.info(`File saved to: ${localFilename}`);
      }).
      then(() => convertIncomingFile(`./files/${filename}`)).
      then(() => {
        logging.info(`Conversion of ${filename} complete.`);
      }).
      catch((apiErr) => {
        logging.error(apiErr);
      });
  }
};
The output I get is:
info: File saved to: ./files/package.json
info: Converting ./files/package.json
info: Conversion of package.json complete.
info: {
<file contents>
}
info: Conversion finished with status code 0
As you can see, the 'Conversion of package.json complete.' statement occurs before the file contents and the status-code line are logged. Why is this the case, and how do I get the 'Conversion complete' statement to come after the 'status code' statement?

Promise.resolve doesn't wait for anything: it returns a promise that is already resolved with whatever value you give it, so it is not really asynchronous in the way you expected. Check https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise/resolve and https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise for more detailed info.
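Here is a minimal sketch (mine, not from the question; it assumes a Unix-like environment where cat is available) showing that the promise settles immediately, long before the 'close' event fires:
const { spawn } = require('child_process');

// .on() returns the ChildProcess itself, so Promise.resolve(...) resolves with that
// object on the next microtask; it never waits for the 'close' event.
const child = spawn('cat', ['./package.json']);
Promise.resolve(child.on('close', () => console.log('close event fired')))
  .then(() => console.log('promise already resolved'));
// Logs "promise already resolved" first, then "close event fired".
To tie the promise to the child process instead, create it with the Promise constructor and settle it from the event handlers, as in the corrected function below: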
function convertIncomingFile(pathToFile) {
  logging.info(`Converting ${pathToFile}`);
  const convert = process(`cat`, [pathToFile], {});
  return new Promise((resolve, reject) => {
    convert.stdout.on('data', (data) => {
      logging.info(data.toString('utf8'));
    });
    convert.stderr.on('data', (err) => {
      logging.error(err);
      reject();
    });
    convert.on('close', (code) => {
      logging.info(`Conversion finished with status code ${code}`);
      resolve();
    });
  });
}
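With this version the returned promise settles only when the child's 'close' event fires (or stderr reports an error), so the ordering matches expectations. A small usage sketch:
convertIncomingFile('./files/package.json')
  .then(() => logging.info('Conversion of package.json complete.'));
// "Conversion finished with status code 0" is now logged before this line.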

You also have to return the promise from inside the .then() callback, so that the next .then() in the chain knows it has to wait:
then(() => {
  return convertIncomingFile(`./files/${filename}`);
})
or the shorter equivalent:
then(() => convertIncomingFile(`./files/${filename}`))
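For reference, an async/await version of the same chain makes the waiting explicit (a sketch, assuming the corrected convertIncomingFile above):
module.exports = {
  convertFile: async (filename) => {
    try {
      const localFilename = await storage.downloadFile(filename);
      logging.info(`File saved to: ${localFilename}`);
      await convertIncomingFile(`./files/${filename}`);
      logging.info(`Conversion of ${filename} complete.`);
    } catch (apiErr) {
      logging.error(apiErr);
    }
  }
};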

Related

chaining promises in functions

I have a small problem: how do I create a promise chain in a sensible way, so that the makeZip function first adds all the necessary files, then creates the zip, and finally deletes the previously added files? (The makeZip function also has to return a promise.) In the example below I don't call deleteFile anywhere because I don't know exactly where to call it. When I tried to call it inside the addFile function, to delete the file immediately after adding it, for some unknown reason the console displayed the 'zip maked!' log first and then 'file deleted'.
const deleteFile = (file, result) => {
  new Promise((resolve, reject) => {
    fs.unlink(`./screenshots/${file}`, (err) => {
      if (err) return reject(err);
      console.log(`${file} deleted!`);
      return resolve();
    });
  });
};

const addFile = (file) => {
  new Promise((resolve, reject) => {
    try {
      zip.addLocalFile(`./screenshots/${file}`);
      console.log(`${file} added`);
      return resolve();
    } catch {
      return reject(new Error("failed to add file"));
    }
  });
};

const makeZip = () => {
  Promise.all(fs.readdirSync("./screenshots").map((file) => addFile(file)))
    .then(() => {
      return new Promise((resolve, reject) => {
        try {
          zip.writeZip(`./zip_files/supername.zip`);
          console.log("zip maked!");
          resolve();
        } catch {
          return reject(new Error("failed making zip"));
        }
      });
    })
    .catch((err) => console.log(err));
};
The main cause of this is that you are not returning the promises you are instantiating in your function calls. I also have some suggestions that can improve your code's cleanliness.
[TIP]: Have you checked out the promisify function in Node's util package? It comes with Node and is very convenient for converting functions that take callbacks as arguments into promise-returning functions. I will demonstrate it below.
// So I will work with one function, because the problem resonates with the rest.
// Let us look at the addFile function.
// First, get the promisify function:
const promisify = require('util').promisify;

const addFile = (file) => {
  // if addLocalFile is async then you can just return it
  return zip.addLocalFile(`./screenshots/${file}`);
};

// Okay, so here is the promisify example (I realized it wasn't applicable in the
// function above).
const deleteFile = (file, result) => {
  // We return here. Because fs.unlink takes a callback as its second argument,
  // we can use promisify to convert it into a promise-returning function.
  return promisify(fs.unlink)(`./screenshots/${file}`);
  // From there you can do your error handling.
};
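As an aside (my addition, not part of the original answer): on Node 10+ the fs module also exposes a promise-based API, so the promisify step can be skipped for fs calls. A minimal sketch:
const fsp = require('fs').promises;

// Equivalent to the promisified deleteFile above: returns a promise that
// settles once the file has been removed.
const deleteFile = (file) => fsp.unlink(`./screenshots/${file}`);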
So now let us put it all together in your last function, that is, makeZip
const makeZip = () => {
  // good call on this, very interesting.
  return Promise.all(fs.readdirSync("./screenshots").map((file) => addFile(file)))
    .then(() => {
      return zip.writeZip(`./zip_files/supername.zip`);
    })
    .then(() => {
      // ...in here you can then unlink your files.
    })
    .catch((err) => console.log(err));
};
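Because makeZip now returns the chain (as the question requires), callers can sequence on it. A small usage sketch:
makeZip().then(() => console.log('zip finished'));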
Everything should be good with these suggestions, hope it works out...
Thank you all for the hints. The solution turned out to be much simpler: just use the synchronous fs.unlinkSync method instead of the asynchronous fs.unlink.
const deleteFile = (file) => {
  try {
    fs.unlinkSync(`./screenshots/${file}`);
    console.log(`${file} removed`);
  } catch (err) {
    console.error(err);
  }
};

const addFile = (file) => {
  try {
    zip.addLocalFile(`./screenshots/${file}`);
    console.log(`${file} added`);
    deleteFile(file);
  } catch (err) {
    console.error(err);
  }
};

const makeZip = () => {
  fs.readdirSync("./screenshots").map((file) => addFile(file));
  zip.writeZip(`./zip_files/supername.zip`);
  console.log("zip maked!");
};

How to resolve a list of dynamically created Promises?

I am writing a git pre-commit hook and I want to be able to pass it an array of commands to execute, for it to execute them, and if any fail throw an error. Examples of these commands might be to run a test suite or a build.
I am having problems doing this dynamically using the promisified version of Node's child_process exec command.
So far I have a config file with 2 example commands:
config.js
const config = {
  onPreCommit: ['git --version', 'node -v'],
};

module.exports = config;
If I pass in the values manually with this code I get the promise objects fulfilled with the correct values from the commands as I'd expect:
pre-commit hook
function preCommit() {
  if (config.onPreCommit && config.onPreCommit.length > 0) {
    Promise.allSettled([
      exec(config.onPreCommit[0]),
      exec(config.onPreCommit[1]),
    ]).then((results) => results.forEach((result) => console.log(result)));
  }
}

preCommit();
However, if I try to do this dynamically, as below, it throws an error:
function preCommit() {
  if (config.onPreCommit && config.onPreCommit.length > 0) {
    const cmdPromises = config.onPreCommit.map((cmd, i) => {
      return new Promise((resolve, reject) => {
        exec(cmd[i])
          .then((res) => {
            resolve(res);
          })
          .catch((err) => {
            reject(err);
          });
      });
    });

    Promise.allSettled(cmdPromises).then((results) =>
      results.forEach((result) => console.log(result))
    );
  }
}

preCommit();
Promises rejected with:
Error: Command failed: o
'o' is not recognized as an internal or external command,
operable program or batch file.
and
Error: Command failed: o
'o' is not recognized as an internal or external command,
operable program or batch file.
Thanks to the comment by mtkopone, the issue was in my map function.
Fixed by changing exec(cmd[i]) to exec(cmd)
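For clarity (an illustrative sketch, not from the original post): Array.prototype.map passes (element, index) to its callback, so cmd[i] indexes into the command string itself rather than into the array:
['git --version', 'node -v'].map((cmd, i) => cmd[i]); // => ['g', 'o']
// which is why exec ended up being asked to run stray characters such as 'o'.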
I also updated the function so the hook works as intended:
function preCommit() {
  if (config.onPreCommit && config.onPreCommit.length > 0) {
    // Loop through the scripts passed in and return a promise for each one
    // that resolves when it is done.
    const cmdPromises = config.onPreCommit.map((cmd) => {
      return new Promise((resolve, reject) => {
        exec(cmd)
          .then((res) => {
            resolve(res);
          })
          .catch((err) => {
            reject(err);
          });
      });
    });

    // Make sure all scripts have been run; fail with an error if any promise rejected.
    Promise.allSettled(cmdPromises)
      .then((results) =>
        results.forEach((result) => {
          if (result.status === 'rejected') {
            console.log(result.reason);
            process.exit(1);
          }
        })
      )
      .then(() => {
        // If there were no errors, exit cleanly - the commit continues.
        process.exit(0);
      });
  }
}

preCommit();
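A side note (my suggestion, not part of the original fix): since the promisified exec already returns a promise, the new Promise wrapper around it is unnecessary and the map can be simplified:
// Each call to the promisified exec already yields a promise, so map can return it directly.
const cmdPromises = config.onPreCommit.map((cmd) => exec(cmd));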

Why isn't Await blocking the next line of code?

I'm using an async function to start a server, check if the url returns 200, and then run my tests, but my checkUrl function doesn't complete before the next line of code runs.
I'm using the axios package to ping the url for the status code and I've been trying several variations of async/await and Promise.resolve().
function checkUrl(url) {
  console.log(`Checking for ${url}`);
  return axios
    .get(url)
    .then(function(res) {
      const message = `${url} is status code: ${res.status}`;
      return res;
    })
    .catch(function(err) {
      console.log("Will check again in 5 seconds");
      setTimeout(() => {
        return checkUrl(url);
      }, 5000);
    });
}
async function init() {
  let val;
  console.log("Running tests:::");
  // start server in react-integration directory
  shell.exec("BROWSER=none yarn start", {
    cwd: "sampleIntegration/react-integration",
    async: true
  });
  // check server has started
  val = await checkUrl("http://localhost:3000");
  console.log(`value from checkUrl promise: ${val}`);
}
I'm expecting the val variable to be the message returned from the Promise.resolve() in my checkUrl function.
val is coming back undefined.
The problem is the setTimeout in the catch block. setTimeout is asynchronous, but the catch callback returns immediately, and since it has nothing to return, it returns undefined. To resolve this (and also keep your desired retry behaviour) you could do something like this:
function checkUrl(url) {
  console.log(`Checking for ${url}`);
  return axios.get(url)
    .then(res => {
      const message = `${url} is status code: ${res.status}`;
      return res;
    })
    .catch(err => {
      console.log('Will check again in 5 seconds');
      return new Promise((resolve) => {
        setTimeout(resolve, 5000);
      }).then(() => checkUrl(url));
    });
}
This creates a Promise that resolves after 5 seconds and then calls checkUrl again, so the chain returned to the caller only settles once a request finally succeeds.
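The same behaviour can be written with async/await and a small delay helper (a sketch; the helper name is my own):
// Resolves after the given number of milliseconds.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function checkUrl(url) {
  console.log(`Checking for ${url}`);
  try {
    return await axios.get(url);
  } catch (err) {
    console.log('Will check again in 5 seconds');
    await delay(5000);
    return checkUrl(url); // retry until the request succeeds
  }
}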

How to test error for a stream in nodejs?

I have a function which creates an md5 hash, and I have written a test which checks its behaviour; the script works. Now I need to create a test which checks that the promise is rejected when createHash() or createReadStream() fails.
How do I test this scenario, and are there any best practices? I would appreciate it if you could post a sample, thanks!
import { createHash } from "crypto";
import { createReadStream } from "fs";

export const md5 = (path: string) =>
  new Promise<string>((resolve, reject) => {
    const hash = createHash("md5");
    const rs = createReadStream(path);
    rs.on("error", reject);
    rs.on("data", chunk => hash.update(chunk));
    rs.on("end", () => resolve(hash.digest("hex")));
  });
describe("md5", () => {
const fileName = `${TEST_DIR}/file1.txt`;
beforeAll(() => createFile(fileName));
afterAll(() => removeFile(TEST_DIR));
it("should hash md5 a file", () => {
md5(fileName).then((hash: string) => {
assert.strictEqual(hash, "4738e449ab0ae7c25505aab6e88750da");
});
});
});
I need to create a test which check the promise is rejected
Try the code below. The 2nd parameter of a Jasmine it block is a function that receives a done parameter. done is a function the test author invokes to mark the spec as complete (and, with no failed expectations, passing). If done is not invoked within the timeout window, Jasmine considers the test a failure.
stream-reader.js
const fs = require('fs')

const createReader = inputFile => {
  const reader = fs.createReadStream(inputFile, {encoding: 'utf8'})
  const result = new Promise((resolve, reject) => {
    reader.on('error', e => reject(e))
  })
  // return 2 things...
  return {
    reader, // ...a stream that broadcasts chunks over time
    result, // ...a Promise that rejects when a reading error is encountered
  }
}

module.exports = createReader
spec.js
const streamReader = require('../stream-reader')
const INPUT_FILE = './input1.txt'

describe('streamReader', () => {
  it(`should pass a chunk of the file on 'data' event`, (done) => {
    const api = streamReader(INPUT_FILE)
    api.reader.on('data', (chunk) => {
      console.log('[JASMINE TEST 1] received', chunk)
      done()
    })
  })

  /* test Promise.reject on stream error */
  it(`should reject on case of reading errors`, (done) => {
    const api = streamReader('./non/existent/file')
    api.result.catch(e => {
      console.log('[JASMINE TEST 2] correctly rejected')
      done()
    })
  })
})
output
$ npm test
> so-jasmine-test@1.0.0 test C:\Users\jonathanlopez\nodejs\so-stream-reject
> jasmine
Randomized with seed 22758
Started
[JASMINE TEST 1] received the quick brown fox
.[JASMINE TEST 2] correctly rejected
.
2 specs, 0 failures
Finished in 0.026 seconds
Randomized with seed 22758 (jasmine --random=true --seed=22758)
Hope this helps.
Cheers.
You could also test your md5 function for errors this way:
md5('bad_path').catch((error: Error) => {
  assert.strictEqual(error.message.length > 0, true);
});
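Note that in most test runners the rejection assertion only counts if the spec waits for the promise, for example by returning it. A hedged sketch (the spec name and 'bad_path' are illustrative):
it('should reject when the file cannot be read', () => {
  return md5('bad_path').then(
    () => { throw new Error('expected md5 to reject'); },
    (error) => assert.ok(error instanceof Error)
  );
});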

Jest mocks and error handling - Jest test skips the "catch" of my function

I'm creating a Jest test to check that metrics are logged in the error handling of the superFetch function. My approach is creating a mock for retryFetch that returns a rejected Promise. I expect that to land in superFetch's catch, but it keeps ending up in superFetch's then. What can I do so my errors are handled in superFetch's catch?
These are the functions:
// file: fetches.js
export function retryFetch(url) {
  return new Promise((resolve, reject) => {
    fetch(url).then(response => {
      if (response.ok) {
        resolve(response);
        return;
      }
      throw new Error();
    }).catch(error => {
      createSomething(error).then(createSomething => {
        reject(createSomething);
      });
      return;
    });
  });
}

export function superFetch(url, name, page) {
  return retryFetch(url)
    .then(response => {
      return response;
    }).catch(error => {
      Metrics.logErrorMetric(name, page);
      throw error;
    });
}
My jest test:
import * as fetch from '../../src/utils/fetches';

describe('Fetch fails', () => {
  beforeEach(() => {
    fetch.retryFetch = jest.fn(() => Promise.reject(new Error('Error')));
  });

  it('error metric is logged', () => {
    return fetch.superFetch('url', 'metric', 'page').then(data => {
      expect(data).toEqual(null);
      // received data is {"ok": true};
      // why is it even going here? I'm expecting it to skip this and go to catch
    }).catch(error => {
      // this is completely skipped, but I'm expecting it to catch an error
      // received error is null, and the metric was not called
      expect(Metrics.logErrorMetric).toHaveBeenCalled();
      expect(error).toEqual('Error');
    });
  });
});
The problem is that you overwrite the function on the exported module object, but superFetch uses the original binding inside the module, so the overwrite has no effect.
You could mock fetch itself directly, like this:
global.fetch = jest.fn(() => Promise.reject());
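A minimal sketch of how the test could use that mock together with a spy (the Metrics import path is an assumption, since it is not shown in the question, and it assumes createSomething inside retryFetch resolves so the rejection propagates):
import * as fetch from '../../src/utils/fetches';
import Metrics from '../../src/utils/metrics'; // assumed path

it('logs an error metric when the fetch fails', () => {
  // Reject at the fetch level so retryFetch's catch path runs.
  global.fetch = jest.fn(() => Promise.reject(new Error('Error')));
  const logSpy = jest.spyOn(Metrics, 'logErrorMetric').mockImplementation(() => {});

  return fetch.superFetch('url', 'metric', 'page').then(
    () => { throw new Error('expected superFetch to reject'); },
    () => expect(logSpy).toHaveBeenCalledWith('metric', 'page')
  );
});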
