Loop over javascript object whilst running async calls within - javascript

I'm trying to load multiple files using THREE.XHRLoader, and when each load completes I want to take that file's key and add it, along with the file, to another object (loadedFiles). The problem is that whenever I try to retrieve the key for the file that was loaded, I always get the last key in the object, because the load callback runs after the loop has already finished.
I've got something like this:
var loader = new THREE.XHRLoader();
var loaded = 0;
for (var k in filesToLoad) {
  loader.load(filesToLoad[k], function (file) {
    console.log(k); // This will always return the last key when I want
                    // it to return the key that was loaded instead!
    loadedFiles[k] = file;
    loaded++;
    if (loaded == Object.keys(filesToLoad).length) {
      console.log(loadedFiles);
    }
  });
}
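As an aside, the closure problem itself disappears in ES6 if the loop variable is declared with let, since for...in then creates a fresh binding per iteration; a minimal sketch of that variant:
var loader = new THREE.XHRLoader();
var loaded = 0;
for (let k in filesToLoad) { // let: each callback closes over its own k
  loader.load(filesToLoad[k], function (file) {
    loadedFiles[k] = file; // k is the key this particular load started with
    loaded++;
    if (loaded === Object.keys(filesToLoad).length) {
      console.log(loadedFiles);
    }
  });
}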

You could wrap your THREE calls in promises, then use Promise.all to resolve once they all have. (pseudo code)
let promises = Object.keys(filesToLoad)
  .map(k => {
    return new Promise((res, rej) => {
      loader.load(filesToLoad[k], res, rej); // Assuming loader has success, error callbacks
    });
  });

Promise.all(promises)
  .then(res => console.log(res)); // [file, file, file...]
Obviously whether you can use ES6 depends on your browser support, but there are ES5-compatible promise libraries you can use instead.
Note that all your files will be loaded in parallel as each promise is created; the requests will already be pending before they go into Promise.all.
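If you also need the key-to-file mapping the question asked for, you can resolve each promise with its key alongside the file; a sketch, still assuming loader.load takes success and error callbacks:
let promises = Object.keys(filesToLoad).map(k =>
  new Promise((res, rej) => {
    // resolve with a [key, file] pair so the key survives the async hop
    loader.load(filesToLoad[k], file => res([k, file]), rej);
  })
);

Promise.all(promises).then(pairs => {
  const loadedFiles = {};
  pairs.forEach(([k, file]) => loadedFiles[k] = file);
  console.log(loadedFiles);
});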

Related

How to know when a fetch ends without locking the body stream?

I'm making requests to an API, but their server only allows a certain number of active connections, so I would like to limit the number of ongoing fetches. For my purposes, a fetch is only completed (not ongoing) when the HTTP response body arrives at the client.
I would like to create an abstraction like this:
const fetchLimiter = new FetchLimiter(maxConnections);
fetchLimiter.fetch(url, options); // Returns the same thing as fetch()
This would make things a lot simpler, but there seems to be no way of knowing when a stream being used by other code ends, because streams are locked while they are being read. It is possible to use ReadableStream.tee() to split the stream into two, use one and return the other to the caller (possibly also constructing a Response with it), but this would degrade performance, right?
Since fetch uses promises, you can take advantage of that to make a simple queue system.
This is a method I've used before for queuing promise-based stuff. It enqueues items by creating a Promise and adding its resolver to an array. Of course, until that Promise resolves, the await keeps the code after it (the actual fetch) from running.
And all we have to do to start the next fetch when one finishes is grab the next resolver and invoke it. The promise resolves, and then the fetch starts!
Best part: since we don't actually consume the fetch result, there's no worry about having to clone it or anything... we just pass it on intact, so you can consume it in a later then or something.
*Edit: since the body is still streaming after the fetch promise resolves, I added a third option so that you can pass in the body type, and have FetchLimiter retrieve and parse the body for you.
These all return a promise that is eventually resolved with the actual content.
That way you can just have FetchLimiter parse the body for you. I made it so it would return an array of [response, data], that way you can still check things like the response code, headers, etc.
For that matter, you could even pass in a callback or something to use in that await if you needed to do something more complex, like different methods of parsing the body depending on response code.
Example
I added comments to indicate where the FetchLimiter code begins and ends...the rest is just demo code.
It's using a fake fetch built on setTimeout, which resolves after 0.5-1.5 seconds. It will start the first three requests immediately, then the active slots will be full and it will wait for one to resolve.
When that happens, you'll see the log that the promise has resolved, then the next promise in the queue will start, and then you'll see the then from the for loop resolve. I added that then just so you could see the order of events.
(function() {
  const fetch = (resource, init) => new Promise((resolve, reject) => {
    console.log('starting ' + resource);
    setTimeout(() => {
      console.log(' - resolving ' + resource);
      resolve(resource);
    }, 500 + 1000 * Math.random());
  });

  function FetchLimiter() {
    this.queue = [];
    this.active = 0;
    this.maxActive = 3;
    this.fetchFn = fetch;
  }

  FetchLimiter.prototype.fetch = async function(resource, init, respType) {
    // if at max active, enqueue the next request by adding a promise
    // ahead of it, and putting the resolver in the "queue" array.
    if (this.active >= this.maxActive) {
      await new Promise(resolve => {
        this.queue.push(resolve); // push adds to the end of the array
      });
    }
    this.active++; // increment active once we're about to start the fetch
    const resp = await this.fetchFn(resource, init);
    let data;
    if (['arrayBuffer', 'blob', 'json', 'text', 'formData'].indexOf(respType) >= 0)
      data = await resp[respType]();
    this.active--; // decrement active once fetch is done
    this.checkQueue(); // time to start the next fetch from the queue
    return [resp, data]; // return value from fetch
  };

  FetchLimiter.prototype.checkQueue = function() {
    if (this.active < this.maxActive && this.queue.length) {
      // shift pulls from the start of the array. This gives first in, first out.
      const next = this.queue.shift();
      next('resolved'); // resolve promise, value doesn't matter
    }
  }

  const limiter = new FetchLimiter();
  for (let i = 0; i < 9; i++) {
    limiter.fetch('/mypage/' + i)
      .then(x => console.log(' - .then ' + x));
  }
})();
Caveats:
I'm not 100% sure if the body is still streaming when the promise resolves... that seems to be a concern for you. However, if that's a problem, you could use one of the Body mixin methods like blob or text or json, which don't resolve until the body content has been completely received and parsed (see here).
I intentionally kept it very short (like 15 lines of actual code) as a very simple proof of concept. You'd want to add some error handling in production code, so that if the fetch rejects because of a connection error or something, you still decrement the active counter and start the next fetch (see the sketch below).
Of course it's also using async/await syntax, because it's so much easier to read. If you need to support older browsers, you'd want to rewrite it with plain Promises or transpile with Babel or equivalent.
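For instance, a sketch of the error handling that second caveat asks for, using try/finally so a rejected fetch still frees its slot and starts the next one:
FetchLimiter.prototype.fetch = async function(resource, init, respType) {
  if (this.active >= this.maxActive) {
    await new Promise(resolve => this.queue.push(resolve));
  }
  this.active++;
  try {
    const resp = await this.fetchFn(resource, init);
    let data;
    if (['arrayBuffer', 'blob', 'json', 'text', 'formData'].indexOf(respType) >= 0)
      data = await resp[respType]();
    return [resp, data];
  } finally {
    // runs on success AND on rejection, so the counter can't leak
    this.active--;
    this.checkQueue();
  }
};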

javascript promise still executing code synchronously

So basically I have a web application that retrieves data from Firebase. Since it takes a lot of time to retrieve data from Firebase, I used a promise to make my code populate at the right time. Here is my code:
var promise = getDataFirebase();

promise.then(function () {
  console.log(Collect);
  console.log("first");
  return getDataFirebaseUser();
}).then(function () {
  console.log("Second");
});

function getDataFirebase() {
  return new Promise(function (resolve, reject) {
    refReview.on("value", function (snap) {
      var data = snap.val();
      for (var key in data) {
        Collect.push({
          "RevieweeName": data[key].revieweeID.firstname.concat(" ", data[key].revieweeID.lastname),
          "ReviewerName": data[key].reviewerID.firstname.concat(" ", data[key].reviewerID.lastname),
          rating: data[key].rating,
          content: data[key].content,
          keyOfReviewee: data[key].revieweeID.userID
        })
        var getDataToUsers = firebase.database().ref("users").child(data[key].revieweeID.userID);
        getDataToUsers.once("value", async function (snap) {
          var fnLn = snap.val();
          var first = fnLn.isTerminated;
          console.log("terminateStatus", first);
        });
      } // end of for loop
      resolve();
    }); // end of snap
  });
}
So in getDataFirebase, data is retrieved from Firebase and pushed to an array called Collect. After pushing one row, it queries Firebase again for another table of data, then continues the loop. The problem is that I want all of these processes to finish before the promise resolves.
The console output according to the code is as follows:
Collect (array)
first
Second
terminateStatus, 1
but it should be:
Collect (array)
first
terminateStatus, 1
Second
I'm not 100% sure how your code is working but it looks like you're doing the right thing on refReview.on("value"): creating a promise before calling Firebase, and then resolving it afterwards. But you're not doing that on getDataToUsers.once("value"). There, you're firing the event and not waiting for it to return — the for-loop continues on, and all the callbacks are processed later, but resolve is at the end of the for-loop, so it's too late by then.
My guess is that you thought the async keyword would cause the for-loop to wait for that job to complete, but actually all it does here is cause the callback to return a promise — which is ignored by the on function.
You have a few options, but probably the best will be to use Promise.all, which accepts an array of promises and waits for them all to be resolved — but you should probably use Firebase's Promise API rather than bolting promises onto the event API. So it'll look something like:
refReview.once('value')
  .then(snap => Promise.all(Object.values(snap.val()).map(item =>
    // snap.val() is an object keyed by push ID, so take its values to map over them
    firebase.database()
      .ref("users")
      .child(item.revieweeID.userID)
      .once('value')
      .then(snap => {
        console.log('terminated');
      })
  )));
(except with the actual functionality added in, natch)
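For instance, a sketch with the question's row-building folded in (field names taken from the question's own code; Object.values is used because snap.val() is an object rather than an array):
refReview.once('value')
  .then(snap => Promise.all(Object.values(snap.val()).map(item =>
    firebase.database()
      .ref('users')
      .child(item.revieweeID.userID)
      .once('value')
      .then(userSnap => ({
        RevieweeName: item.revieweeID.firstname + ' ' + item.revieweeID.lastname,
        ReviewerName: item.reviewerID.firstname + ' ' + item.reviewerID.lastname,
        rating: item.rating,
        content: item.content,
        isTerminated: userSnap.val().isTerminated // the inner read's result
      }))
  )))
  .then(collect => console.log(collect)); // every row, including terminate status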

Promise All in Node.js with a forEach loop

I have a function that reads a directory and, for each subdirectory, copies an existing file to create a new one within it.
function createFiles (countryCode) {
  fs.readdir('./app/data', (err, directories) => {
    if (err) {
      console.log(err)
    } else {
      directories.forEach((directory) => {
        fs.readdir(`./app/data/${directory}`, (err, files) => {
          if (err) console.log(err)
          console.log(`Creating ${countryCode}.yml for ${directory}`)
          fs.createReadStream(`./app/data/${directory}/en.yml`).pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
        })
      })
    }
  })
}
How do I do this using promises or Promise.all to resolve when it's complete?
First, you need to wrap each file stream in a promise that resolves when the stream emits the finish event:
new Promise((resolve, reject) => {
  fs.createReadStream(`./app/data/${directory}/en.yml`).pipe(
    fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`)
  ).on('finish', resolve);
});
Then you need to collect these promises in an array. This is done by using map() instead of forEach() and returning the promise:
var promises = directories.map((directory) => {
  ...
  return new Promise((resolve, reject) => {
    fs.createReadStream( ...
    ...
  });
});
Now you have a collection of promises that you can wrap with Promise.all() and use with a handler when all the wrapped promises have resolved:
Promise.all(promises).then(completeFunction);
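Putting the pieces together, a minimal sketch of the whole function (also wrapping the outer fs.readdir in a promise, so createFiles itself returns one):
function createFiles(countryCode) {
  return new Promise((resolve, reject) => {
    fs.readdir('./app/data', (err, directories) => {
      if (err) return reject(err);
      const promises = directories.map(directory => new Promise((res, rej) => {
        fs.createReadStream(`./app/data/${directory}/en.yml`)
          .pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
          .on('finish', res)
          .on('error', rej);
      }));
      Promise.all(promises).then(resolve, reject);
    });
  });
}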
In recent versions of Node (8.0.0 and later), there's a new util.promisify function you can use to get a promise. Here's how we might use it:
// Of course we'll need to require important modules before doing anything
// else.
const util = require('util')
const fs = require('fs')

// We use the "promisify" function to make calling promisifiedReaddir
// return a promise.
const promisifiedReaddir = util.promisify(fs.readdir)
// (You don't need to name the promisified variable promisifiedXYZ -
// you could just do `const readdir = util.promisify(fs.readdir)` - but
// I call it promisifiedReaddir here for clarity.)

function createFiles(countryCode) {
  // Since we're using our promisified readdir function, we'll be storing
  // a Promise inside of the readdirPromise variable..
  const readdirPromise = promisifiedReaddir('./app/data')

  // ..then we can make something happen when the promise finishes (i.e.
  // when we get the list of directories) by using .then():
  return readdirPromise.then(directories => {
    // (Note that we only get the parameter `directories` here, with no `err`.
    // That's because promises have their own way of dealing with errors;
    // try looking up "promise rejection" and "promise error catching".)

    // We can't use a forEach loop here, because forEach doesn't know how to
    // deal with promises. Instead we'll use a Promise.all with an array of
    // promises.

    // Using the .map() method is a great way to turn our list of directories
    // into a list of promises; read up on "array map" if you aren't sure how
    // it works.
    const promises = directories.map(directory => {
      // Since we want an array of promises, we'll need to `return` a promise
      // here. We'll use our promisifiedReaddir function for that; it already
      // returns a promise, conveniently.
      return promisifiedReaddir(`./app/data/${directory}`).then(files => {
        // (For now, let's pretend we have a "copy file" function that returns
        // a promise. We'll actually make that function later!)
        return copyFile(`./app/data/${directory}/en.yml`, `./app/data/${directory}/${countryCode}.yml`)
      })
    })

    // Now that we've got our array of promises, we actually need to turn them
    // into ONE promise, that completes when all of its "children" promises
    // are completed. Luckily there's a function in JavaScript that's made to
    // do just that - Promise.all:
    const allPromise = Promise.all(promises)

    // Now if we do a .then() on allPromise, the function we passed to .then()
    // would only be called when ALL promises are finished. (The function
    // would get an array of all the values in `promises` in order, but since
    // we're just copying files, those values are irrelevant. And again, don't
    // worry about errors!)

    // Since we've made our allPromise which does what we want, we can return
    // it, and we're done:
    return allPromise
  })
}
Okay, but there are probably still a few things that might be puzzling you..
What about errors? I kept saying that you don't need to worry about them, but it is good to know a little about them. Basically, in promise terms, when an error happens inside of a util.promisify'd function, we say that that promise rejects. Rejected promises behave mostly the same way you'd expect errors to: they throw an error message and stop whatever promise chain they're in. So if one of our promisifiedReaddir calls rejects, it'll stop the whole createFiles function.
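In practice that just means adding a .catch where you call the function; for example:
createFiles('fr')
  .then(() => console.log('All files copied!'))
  .catch(err => console.error('Something went wrong:', err)) // a rejection anywhere lands here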
What about that copyFile function? Well, we have two options:
Use somebody else's function. No need to re-invent the wheel! quickly-copy-file looks to be a good module (plus, it returns a promise, which is useful for us).
Program it ourselves.
Programming it ourselves isn't too hard, actually, but it takes a little bit more than simply using util.promisify:
function copyFile(from, to) {
  // Hmm.. we want to copy a file. We already know how to do that in normal
  // JavaScript - we'd just use a createReadStream and pipe that into a
  // createWriteStream. But we need to return a promise for our code to work
  // like we want it to.

  // This means we'll have to make our own hand-made promise. Thankfully,
  // that's not actually too difficult..
  return new Promise((resolve, reject) => {
    // Yikes! What's THIS code mean?
    // Well, it literally says we're returning a new Promise object, with a
    // function given to it as an argument. This function takes two arguments
    // of its own: "resolve" and "reject". We'll look at them separately
    // (but maybe you can guess what they mean already!).

    // We do still need to create our read and write streams like we always do
    // when copying files:
    const readStream = fs.createReadStream(from)
    const writeStream = fs.createWriteStream(to)

    // And we need to pipe the read stream into the write stream (again, as
    // usual):
    readStream.pipe(writeStream)

    // ..But now we need to figure out how to tell the promise when we're done
    // copying the files.
    // Well, we'll start by doing *something* when the pipe operation is
    // finished. That's simple enough; we'll just set up an event listener:
    writeStream.on('close', () => {
      // Remember the "resolve" and "reject" functions we got earlier? Well, we
      // can use them to tell the promise when we're done. So we'll do that here:
      resolve()
    })

    // Okay, but what about errors? What if, for some reason, the pipe fails?
    // That's simple enough to deal with too, if you know how. Remember how we
    // learned a little on rejected promises, earlier? Since we're making
    // our own Promise object, we'll need to create that rejection ourself
    // (if anything goes wrong).
    writeStream.on('error', err => {
      // We'll use the "reject" argument we were given to show that something
      // inside the promise failed. We can specify what that something is by
      // passing the error object (which we get passed to our event listener,
      // as usual).
      reject(err)
    })

    // ..And we'll do the same in case our read stream fails, just in case:
    readStream.on('error', err => {
      reject(err)
    })

    // And now we're done! We've created our own hand-made promise-returning
    // function, which we can use in our `createFiles` function that we wrote
    // earlier.
  })
}
..And here's all the finished code, so that you can review it yourself:
const util = require('util')
const fs = require('fs')

const promisifiedReaddir = util.promisify(fs.readdir)

function createFiles(countryCode) {
  const readdirPromise = promisifiedReaddir('./app/data')

  return readdirPromise.then(directories => {
    const promises = directories.map(directory => {
      return promisifiedReaddir(`./app/data/${directory}`).then(files => {
        return copyFile(`./app/data/${directory}/en.yml`, `./app/data/${directory}/${countryCode}.yml`)
      })
    })

    const allPromise = Promise.all(promises)
    return allPromise
  })
}

function copyFile(from, to) {
  return new Promise((resolve, reject) => {
    const readStream = fs.createReadStream(from)
    const writeStream = fs.createWriteStream(to)

    readStream.pipe(writeStream)

    writeStream.on('close', () => {
      resolve()
    })

    writeStream.on('error', err => {
      reject(err)
    })

    readStream.on('error', err => {
      reject(err)
    })
  })
}
Of course, this implementation isn't perfect. You could improve it by looking at other implementations - for example this one destroys the read and write streams when an error occurs, which is a bit cleaner than our method (which doesn't do that). The most reliable way would probably be to go with the module I linked earlier!
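If you wanted to add that cleanup yourself, the idea is just to destroy both streams before rejecting; a sketch of copyFile's error handlers with that change (inside the same new Promise callback as above, where readStream, writeStream, and reject are all in scope):
function cleanupAndReject(err) {
  readStream.destroy()  // stop reading and release the source file handle
  writeStream.destroy() // likewise for the half-written target file
  reject(err)
}
readStream.on('error', cleanupAndReject)
writeStream.on('error', cleanupAndReject)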
I highly recommend you watch funfunfunction's video on promises. It explains how promises work in general, how to use Promise.all, and more; and he's almost certainly better at explaining this whole concept than I am!
First, create a function that returns a promise:
function processDirectory(directory) {
  return new Promise((resolve, reject) => {
    fs.readdir(`./app/data/${directory}`, (err, files) => {
      if (err) return reject(err);
      console.log(`Creating ${countryCode}.yml for ${directory}`);
      fs.createReadStream(`./app/data/${directory}/en.yml`)
        .pipe(fs.createWriteStream(`./app/data/${directory}/${countryCode}.yml`))
        .on('finish', resolve);
    });
  });
}
Then use Promise.all:
Promise.all(directories.map(processDirectory))
  .then(...)
  .catch(...);
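Note that processDirectory closes over countryCode, so in practice it needs to live where countryCode is in scope, e.g. inside createFiles (or take it as a second argument); a sketch of the former:
function createFiles(countryCode) {
  function processDirectory(directory) { /* as above; countryCode is now in scope */ }
  fs.readdir('./app/data', (err, directories) => {
    if (err) return console.log(err);
    Promise.all(directories.map(processDirectory))
      .then(() => console.log(`Created ${countryCode}.yml everywhere`))
      .catch(err => console.log(err));
  });
}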

Add the result of a chained promise into promise.all()

I'm trying to construct a promise, named downloadTextPromise, that downloads a file, so that I can then read the file content in Promise.all().
However, I still cannot read the file content in Promise.all().then().
I suspect that downloadTextPromise is a two-step promise, so only the first step (the HTTP GET) is pushed into the promises array. But I have little idea how to fix it.
Any hint? Any solution that works is welcome; the request-promise-native package is not compulsory.
My code is as below:
const request = require("request-promise-native");

let promises = [];
// put several promises into the promises array;
.....
let downloadTextPromise = request.get("http://example.com/xyz.txt").pipe(fs.createWriteStream(`~/xyz.txt`));
promises.push(downloadTextPromise);

Promise.all(promises).then(() => {
  // I need xyz.txt content at this stage, but it is not ready immediately.
});
Firstly, check whether downloadTextPromise is actually a promise. The library's documentation doesn't mention how pipe works. If it returns a native promise, you can do console.log(downloadTextPromise instanceof Promise). Otherwise, you can try console.log(!!downloadTextPromise.then).
If it's a promise, then one possibility is that the promise is resolved synchronously after the stream ends, while the code that actually writes the file is added to the message queue. In this case, you can fix it by doing:
Promise.all(promises).then(() => {
  setTimeout(() => {
    // Do stuff with xyz.txt
  }, 0);
});
The issue is the writeStream part, so I should build a function returning a correct promise.
I couldn't find a way to construct one that returns a promise from request.get().pipe(), so instead I use request.get().then() and fs.writeFile() as below.
However, another problem appears with this new solution: fs.writeFile works fine for txt files, but for mp3 or mp4 the function returns no error, yet the file content is invalid for further processing in Promise.all().then().
function downloadTextPromise() {
  return new Promise((resolve, reject) => {
    // rp here is the request-promise-native instance required earlier
    rp.get("http://example.com/xyz.txt").then(data => {
      fs.writeFile("./xyz.txt", data, err => {
        if (err) return reject(err);
        resolve(true);
      });
    });
  });
}
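For what it's worth, the pipe itself can be wrapped in a promise by listening for the write stream's 'finish' event, which also sidesteps the binary-corruption problem because the bytes never pass through a string. A sketch using the plain request stream API (the request package that request-promise-native wraps):
const fs = require("fs");
const request = require("request"); // the plain, streaming API

function downloadPromise(url, dest) {
  return new Promise((resolve, reject) => {
    request.get(url)
      .on("error", reject)                // network/request errors
      .pipe(fs.createWriteStream(dest))
      .on("error", reject)                // file system errors
      .on("finish", () => resolve(dest)); // file fully flushed to disk
  });
}

promises.push(downloadPromise("http://example.com/xyz.txt", "./xyz.txt"));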

Array of promise with results separately

Basically, I'm looking for something like Promise.all(), but one that calls the function in then() for each result, as soon as that result is ready. It's trivial to write using standard callbacks like this (file reading example):
function foo(files, bar) {
  for (let file of files) {
    const fr = new FileReader();
    fr.onload = e => bar(e.target.result);
    fr.readAsDataURL(file);
  }
}
I'm sure there's a library out there that does what I want, but I'd like to do that in pure JS.
To clarify, I would like to have something like this:
function foo(files) {
  // do something here;
}
foo(['1','2','3']).then(/* do something here with separate results */);
Just use map:
let read = file => new Promise(resolve => {
  const fr = new FileReader();
  fr.onload = e => resolve(e.target.result);
  fr.readAsDataURL(file);
});

Promise.all(files.map(file => read(file).then(bar)))
  .then(allResultsFromBar => { /* use results array */ });
Note that .then promotes anything returned from a function it executes to a promise (i.e. it calls Promise.resolve on it implicitly), so Promise.all is guaranteed to receive an array of promises no matter what bar returns.
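A tiny example of that promotion:
Promise.resolve(1)
  .then(x => x + 1)           // returns a plain number...
  .then(x => console.log(x)); // ...yet the next .then still receives it, logging 2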
Working from #jib's answer and the comment:
I want to have a function readFiles that takes a list of files as its only argument, and I want it to return some kind of promise that I could call then() on, and the function that I would put in then() would be called separately for each of the files.
That's not quite possible because a .then() callback is only called once, not once per result delivered by Promise.all().
This is close:
let read = file => new Promise((resolve, reject) => {
  const fr = new FileReader();
  fr.onload = e => resolve(e.target.result);
  fr.onerror = reject; // don't forget to reject on error
  fr.readAsDataURL(file);
});

let readFiles = files => Promise.all(files.map(read)); // <<<<< this is the line you seek
Allowing you to write:
const files = ["...", "...", "..."];
readFiles(files)
  .then(allFileResults => allFileResults.map(bar)); // bar is your function for processing file results
// possibly chain another .then() if required to observe/process bar()'s results
Thus bar() will operate on each of the file contents in turn.
However, in getting the code as close as possible to what you asked for:
your "as soon as" requirement isn't truly met; the allFileResults.map(bar) process is downstream of Promise.all(), and therefore waits for all files to be read before commencing the bar() calls.
any single error in readFiles(...) will (without some sort of error recovery) scupper the whole enterprise; allFileResults.map(bar) would never be called on any of the files; that may be good or bad depending on your application; good news though, error recovery is easily inserted.
as written, bar() must be synchronous; if not, you can simply insert another Promise.all(), ie allFileResults => Promise.all(allFileResults.map(bar)).
For these reasons, #jib's solution may well be the one to go for but it depends on what the application demands.
