I have two functions:
function manipulateData(data,callback)
function isDataValid(data)
The first one is supposed to manipulate the data and return nothing, and the second one is supposed to return true or false depending on the data's structure, without manipulating it at all.
Assuming three things:
- The manipulation doesn't require valid data.
- The data's validation process is quite demanding I/O-wise.
- The data's validation process should receive the original data.
How can I use the verification function as the callback argument, so I'd be able to verify it in the background, and only throw an error from the first function, if the data is found invalid?
I want it to behave like this, where the first statement runs in the background and doesn't block the data manipulation.
function manipulateData(data, callback){
if(callback(data)==false){ return some error }
...manipulate data without waiting for the verification...
}
I hope it is indeed doable and that I'm not missing a crucial logical part in the callback mechanism.
You can do this with callbacks or Promises. With callbacks you'd want to use something like the async library, but I'd definitely go with Promises (or async/await).
You said that validation was I/O heavy so we'll assume that is an async function. I've assumed that the manipulation is not, but it doesn't really matter once you use Promise.all to wrap them up.
The following is tested in Node 6.10. It tests validation failure and success and I'm pretty sure I've got it testing with each function finishing before the other, but again Promise.all takes care of that for you anyway.
function sleep(milliseconds) {
var start = new Date().getTime();
for (var i = 0; i < 1e7; i++) {
if ((new Date().getTime() - start) > milliseconds){
break;
}
}
}
function manipulateData (data, zzz) {
if (zzz) {
sleep(zzz);
}
return 'manipulated data';
}
function isDataValid (data) {
return new Promise((resolve, reject) => {
if (data) {
resolve('ok');
} else {
reject(new Error('nok'));
}
});
}
function updateData (data, zzz) {
return Promise.all([
isDataValid(data),
manipulateData(data, zzz)
]);
}
function testIt (data, zzz) {
return updateData(data, zzz)
.then(([result, data]) => console.log(`Success! ${result} - ${data}`))
.catch(err => console.log(`Fail! ${err}`));
}
testIt(true, 0)
.then(() => testIt(false, 0))
.then(() => testIt(true, 1000))
.then(() => testIt(false, 1000));
Success! ok - manipulated data
Fail! Error: nok
Success! ok - manipulated data
Fail! Error: nok
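For reference, the same harness reads naturally with async/await as well (Node 8+). A sketch, restating the two helpers from above in minimal form so it runs standalone:

```javascript
// async/await version of the test harness above; isDataValid and
// manipulateData are minimal restatements of the functions defined earlier.
function isDataValid (data) {
  return new Promise((resolve, reject) => {
    data ? resolve('ok') : reject(new Error('nok'));
  });
}

function manipulateData (data) {
  return 'manipulated data';
}

async function updateData (data) {
  // Promise.all still runs the async validation alongside the manipulation.
  return Promise.all([isDataValid(data), manipulateData(data)]);
}

async function testIt (data) {
  try {
    const [result, manipulated] = await updateData(data);
    return `Success! ${result} - ${manipulated}`;
  } catch (err) {
    return `Fail! ${err.message}`;
  }
}

testIt(true).then(msg => console.log(msg));  // Success! ok - manipulated data
testIt(false).then(msg => console.log(msg)); // Fail! nok
```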
I need to send a lot of requests to a server whose flow limits are unclear. To combat this, I decided to send requests repeatedly until I receive a successful status, using the following code:
function sendRequest(url, body, isGood) {
return new Promise((resolve, reject) => fetch(url, body).then(
resp => {
if(isGood(resp)){
resolve(resp);
} else {
reject(resp.status)
}
}).catch(err => reject(err)))
}
function recursionBypass(func, ...args){
return func(...args)
}
function requestUntilSucceed(url, body, isGood, name, attempt=1) {
return new Promise((resolve, reject) => {
sendRequest(url, body, isGood)
.catch((status)=>{
recursionBypass(requestUntilSucceed, url, body, isGood, name, attempt+1)})
.then((resp) => resolve(attempt))
})
}
Without including details about the server in question, I wrote a quick test for this (create & wait for 300 requests):
await (function () {
let promises = [];
for (let i = 0; i < 300; i++) {
promises.push(new Promise((resolve, reject) => {
createRequest(requestArg, i)
.then((attempt) => { console.log("Request:", i, "succeeded on attempt:", attempt); resolve(attempt); });
}));
}
return Promise.all(promises);
})();
Assume:
- requestArg: a global variable (the same arg is used for all requests for this test)
- createRequest: a function that builds the url, body and isGood callback based on requestArg, then returns as follows:
return requestUntilSucceed(url, body, isGood, name)
I've observed that about 200 requests are accepted immediately and the others keep retrying. However, at some point, the program exits without finishing all requests successfully. Since there is no mechanism in place to limit the number of tries explicitly, I'm wondering how this can happen. I suspected it had to do with the limit on recursion depth, so I started using recursionBypass as above so the interpreter can't tell the function is calling itself, but this didn't make a difference. Any ideas as to why it would exit early? I know the server accepts requests until some count, then rejects them for a cooldown period of ~1 min before beginning to accept again. The rate of acceptance after each cooldown period is ambiguous, which is why I can't write a rule-based throttler. The issue is that my program exits (no error printed to the console) before the minute is up. A lot of attempts are made in that minute, so maybe I'm still surpassing some kind of JavaScript limit I'm not familiar with?
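For what it's worth, the recursion-depth suspicion can be tested in isolation: promise callbacks run on a fresh stack from the microtask queue, so a promise-chained retry does not grow the call stack. A minimal sketch (fakeRequest and retryUntilSucceed are made-up names, not the code above):

```javascript
// A stub that rejects until the given attempt number is reached.
function fakeRequest(attempt, succeedAt) {
  return attempt < succeedAt
    ? Promise.reject(new Error('rejected'))
    : Promise.resolve('ok');
}

// Each retry is scheduled from the microtask queue, so the call
// stack never deepens no matter how many attempts are made.
function retryUntilSucceed(succeedAt, attempt = 1) {
  return fakeRequest(attempt, succeedAt)
    .catch(() => retryUntilSucceed(succeedAt, attempt + 1));
}

retryUntilSucceed(10000).then(result => console.log(result)); // logs "ok", no stack overflow
```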
I want to use the library aztro-js, where a typical call in their docs looks like this:
const aztroJs = require("aztro-js");
//Get all horoscope i.e. today's, yesterday's and tomorrow's horoscope
aztroJs.getAllHoroscope(sign, function(res) {
console.log(res);
});
For several reasons, I would like to call it in async/await style and leverage try/catch. So I tried promisify like this:
const aztroJs = require("aztro-js");
const {promisify} = require('util');
const getAllHoroscopeAsync = promisify(aztroJs.getAllHoroscope);
async function handle() {
let result, sign = 'libra';
try {
result = await getAllHoroscopeAsync(sign);
}
catch (err) {
console.log(err);
}
console.log("Result: " + result);
}
However, when I log result it comes as undefined. I know the call worked since the library is automatically logging a response via console.log and I see a proper response in the logs.
How can I "await" on this call? (even by other means if this one is not "promisifyable")
util.promisify() expects the callback function to accept two arguments: the first is an error that must be null when there is no error and non-null when there is one, and the second is the value (if no error). It will only properly promisify a function whose callback follows that convention.
To work around that, you will have to manually promisify your function.
// manually promisify
aztroJs.getAllHoroscopePromise = function(sign) {
return new Promise(resolve => {
aztroJs.getAllHoroscope(sign, function(data) {
resolve(data);
});
});
};
// usage
aztroJs.getAllHoroscopePromise(sign).then(results => {
console.log(results);
});
Note, it's unusual for an asynchronous function that returns data not to have a means of returning errors, so the aztroJs.getAllHoroscope() interface seems a little suspect in that regard.
In fact, if you look at the code for this function, you can see that it is making a network request using the request() library and then trying to throw from within the async callback when an error occurs. That's a completely flawed design, since you (as the caller) can't catch exceptions thrown asynchronously. So, this package has no reasonable way of communicating back errors. It is designed poorly.
Try a custom promisified function:
aztroJs.getAllHoroscope[util.promisify.custom] = (sign) => {
return new Promise((resolve, reject) => {
aztroJs.getAllHoroscope(sign, resolve);
});
};
const getAllHoroscopeAsync = util.promisify(aztroJs.getAllHoroscope);
You could change your getAllHoroscopeAsync into a function that returns a promise directly.
Example:
const getAllHoroscopeAsync = (sign) =>
new Promise(resolve =>
aztroJs.getAllHoroscope(sign, (res) => resolve(res)));
TLDR: How to use ES6 fetch to download synchronously?
I'm trying to write a script in Node to download data from an API till there is no more to download e.g. the endpoint has a dataset of size roughly 12000 but only provides 100 per call, and I need all the data. So I've decided to have it downloaded synchronously, and only stop when the json returned is finally empty.
// function to make one api GET call
getData = (offset) => {
return fetch('http://...'+'?start=' + offset)
.then(...)
}
// make api calls until json is finally empty,
// indicating that I've downloaded all the data
let offset = 0
let results = getData(offset)
while (results.length != 0) {
// combine results...
offset += 100 // move offset
results = getData(offset)
}
Because I don't know precisely how large the data is and at which offset it ends, whether or not to make another call depends on the last one.
The code above fails because the promise from getData() does not resolve in time for the while loop. In another language this would be okay, as getData blocks until it completes. I've tried to await getData, but it needs to be in an async function (and I don't know where to place one; I already have promises). Is there any way to force getData() to block until it is resolved?
you can mark your getData() function as async
const getData = async (offset) => {
return fetch('http://...'+'?start=' + offset)
.then(...)
}
then await the returned promise's completion (inside an async function)
const results = await getData(offset)
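Put together, the whole pagination loop can live inside one async function. A sketch with a stubbed fetchPage standing in for the real fetch-based getData (the 250-item dataset is made up for the example):

```javascript
// Stub for the real getData: returns up to 100 items per call
// from a hypothetical dataset of 250, then empty arrays.
function fetchPage(offset) {
  const total = 250;
  const page = [];
  for (let i = offset; i < Math.min(offset + 100, total); i++) {
    page.push(i);
  }
  return Promise.resolve(page);
}

// `await` suspends this function until each page arrives, so the
// while loop sees real arrays instead of pending promises.
async function downloadAll() {
  let offset = 0;
  let all = [];
  let results = await fetchPage(offset);
  while (results.length !== 0) {
    all = all.concat(results);
    offset += 100;
    results = await fetchPage(offset);
  }
  return all;
}

downloadAll().then(all => console.log(all.length)); // logs 250
```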
Or you can modify your code to handle the logic of whether to make another call in the promise callbacks. Something like
function fetchData(offset) {
if (offset < 1000) {
return Promise.resolve([1,2,3].map(i=>offset+i));
} else {
return Promise.resolve([]);
}
}
function needsMoreData(results) {
return results.length > 0;
}
function fetchNextDataIfNeeded(results, offset) {
if (needsMoreData(results)) {
return fetchRemainingData(offset + 100)
.then(nextResults=>[...results, ...nextResults]);
}
return Promise.resolve(results);
}
function fetchRemainingData( offset) {
return fetchData(offset)
.then(results=>fetchNextDataIfNeeded(results, offset));
}
function fetchAllData() {
return fetchRemainingData(0)
.then(results=>console.log(results));
}
fetchAllData();
See https://jsfiddle.net/rwycbn5q/1/
I recently found myself in a similar situation, but some functions cannot be made synchronous. You could, however, do something like this:
let results=[];
const getData = (idx) => {
fetch("url").then(data => {
if (Object.keys(data).length !== 0) { // Or some other continuation check
results.push(data); // Or whatever you want to do with the data
getData(idx + 100);
}
});
}
getData(0);
I am working on a Chrome extension where the page-side extension code needs to communicate with a background script using Chrome messages. This adds a fair level of async fiddling to my code. The general purpose is to process text macros, i.e. parse text using data stored in a database accessible from the background script.
At some point, I have:
findMacro(key)
.then(result => {
processMacro(key, result);
});
The processor roughly looks like:
function processMacro(shortcut, text) {
text = macroBuilder(text);
}
The macroBuilder function processes various types of macros, pushing processed text to an array and then running join(''). My problem is that I need to support nested macros, and when there are such macros I call findMacro again, which internally does a chrome sendMessage to the background process. The processor does something like:
function macroBuilder(text) {
let pieces = [];
macroProcessorCode(text).forEach(res => {
res.visit({
text(txt, ctx) {
pieces.push(txt);
},
dates(obj, ctx) {
pieces.push(processDates(obj))
},
...
nested(obj,ctx) {
let fragments = [];
fragments.push(
findMacro(obj.name)
.then(res => {
return macroBuilder(res);
})
);
console.log('promised results:', fragments);
Promise.all(fragments)
.then(fragments => {
console.log('resolved results:', fragments);
pieces.push(fragments);
});
}
});
});
return pieces.join('');
}
For some reason my function returns before resolving, so 'promised results' is logged before it returns, and 'resolved results' after. In short, I return from the code processing the text, with the result, before nested macros are processed. This only happens with nested macros; the other types are processed correctly.
Any ideas?
macroBuilder creates a bunch of promises but never does anything with them. Instead, it needs to wait for them and return its own promise that will resolve/reject based on the promises for the pieces/fragments.
This is somewhat off-the-cuff and probably needs tweaking, but should get you going the right direction. See *** comments:
function macroBuilder(text) {
// *** An array for all the promises below
const piecePromises = [];
macroProcessorCode(text).forEach(res => {
res.visit({
text(txt, ctx) {
piecePromises.push(txt); // *** Promise.all accepts plain values as-is
},
dates(obj, ctx) {
piecePromises.push(processDates(obj))
},
//...
nested(obj, ctx) {
let fragments = [];
fragments.push(
findMacro(obj.name)
.then(macroBuilder) // *** Note we can just pass `macroBuilder` directly
);
console.log('promised results:', fragments);
// *** Add this promise to the array
piecePromises.push(Promise.all(fragments)
// (*** This whole then handler can be removed
// once you don't need the logging anymore;
// just push the result of Promise.all)
.then(fragments => {
console.log('resolved results:', fragments);
return fragments;
})
);
}
});
});
// *** Wait for all of those and then join; ultimate result
// is a promise that resolves with the pieces joined together
return Promise.all(piecePromises).then(pieces => pieces.join(''));
}
At that point, there's not much point to processMacro; it would just look like this (note that it returns a promise):
function processMacro(shortcut, text) {
return macroBuilder(text);
}
...unless there's something you do with shortcut that you haven't shown.
Assuming you need processMacro, you'd call it like this if you're propagating the promise to the caller:
return findMacro(key)
.then(result => processMacro(key, result));
...or like this if you're not propagating the promise:
findMacro(key)
.then(result => processMacro(key, result))
.catch(err => {
// Deal with the fact an error occurred
});
One of the rules of promises is that you either propagate the promise or handle errors from it.
Can anyone recommend a pattern for instantly retrieving data from a function that returns a Promise?
My (simplified) example is an AJAX preloader:
loadPage("index.html").then(displayPage);
If this is downloading a large page, I want to be able to check what's happening and perhaps cancel the process with an XHR abort() at a later stage.
My loadPage function used to (before Promises) return an id that let me do this later:
var loadPageId = loadPage("index.html",displayPage);
...
doSomething(loadPageId);
cancelLoadPage(loadPageId);
In my new Promise based version, I'd imagine that cancelLoadPage() would reject() the original loadPage() Promise.
I've considered a few options all of which I don't like. Is there a generally accepted method to achieve this?
Okay, let's address your bounty note first.
[Hopefully I'll be able to grant the points to someone who says more than "Don't use promises"... ]
Sorry, but the answer here is: "Don't use promises". ES6 Promises have three possible states (to you as a user): pending, fulfilled and rejected.
There is no way for you to see "inside" of a promise to see what has been done and what hasn't - at least not with native ES6 promises. There was some limited work (in other frameworks) done on promise notifications, but those did not make it into the ES6 specification, so it would be unwise of you to use this even if you found an implementation for it.
A promise is meant to represent an asynchronous operation at some point in the future; standalone, it isn't fit for this purpose. What you want is probably more akin to an event publisher - and even that is asynchronous, not synchronous.
There is no safe way for you to synchronously get some value out of an asynchronous call, especially not in JavaScript. One of the main reasons for this is that a good API, if it can be asynchronous, will always be asynchronous.
Consider the following example:
const promiseValue = Promise.resolve(5)
promiseValue.then((value) => console.log(value))
console.log('test')
Now, let's assume that this promise (because we know the value ahead of time) is resolved synchronously. What do you expect to see? You'd expect to see:
> 5
> test
However, what actually happens is this:
> test
> 5
This is because even though Promise.resolve() is a synchronous call that resolves an already-resolved Promise, then() will always be asynchronous; this is one of the guarantees of the specification and it is a very good guarantee because it makes code a lot easier to reason about - just imagine what would happen if you tried to mix synchronous and asynchronous promises.
This applies to all asynchronous calls, by the way: any action in JavaScript that could potentially be asynchronous will be asynchronous. As a result, there is no way for you do any kind of synchronous introspection in any API that JavaScript provides.
That's not to say you couldn't make some kind of wrapper around a request object, like this:
function makeRequest(url) {
const requestObject = new XMLHttpRequest()
const result = {
request: requestObject // keep a handle so the request can be aborted later
}
result.done = new Promise((resolve, reject) => {
requestObject.onreadystatechange = function() {
// ..
}
})
requestObject.open('GET', url)
requestObject.send()
return result
}
But this gets very messy, very quickly, and you still need to use some kind of asynchronous callback for this to work. This all falls down when you try and use Fetch. Also note that Promise cancellation is not currently a part of the spec. See here for more info on that particular bit.
TL;DR: synchronous introspection is not possible on any asynchronous operation in JavaScript, and a Promise is not the way to go if you were to even attempt it. There is no way for you to synchronously display information about a request that is on-going, for example. In other languages, attempting to do this would require either blocking or a race condition.
Well, if using Angular you can make use of the timeout parameter of the $http service if you need to cancel an ongoing HTTP request.
Example in typescript:
interface ReturnObject {
cancelPromise: ng.IDeferred<any>;
httpPromise: ng.IHttpPromise<any>;
}
#Service("moduleName", "aService")
class AService() {
constructor(private $http: ng.IHttpService,
private $q: ng.IQService) { ; }
doSomethingAsynch(): ReturnObject {
var cancelPromise = this.$q.defer();
var httpPromise = this.$http.get("/blah", { timeout: cancelPromise.promise });
return { cancelPromise: cancelPromise, httpPromise: httpPromise };
}
}
#Controller("moduleName", "aController")
class AController {
constructor(aService: AService) {
var o = aService.doSomethingAsynch();
var timeout = setTimeout(() => {
o.cancelPromise.resolve();
}, 30 * 1000);
o.httpPromise.then((response) => {
clearTimeout(timeout);
// do code
}, (errorResponse) => {
// do code
});
}
}
Since this approach already returns an object with two promises, it is not much of a stretch to include any synchronous return data in that object as well.
If you can describe what type of data you would want to return synchronously from such a method it would help to identify a pattern. Why can it not be another method that is called prior to or during your asynchronous operation?
You can kinda do this, but AFAIK it will require hacky workarounds. Note that exporting the resolve and reject methods is generally considered a promise anti-pattern (i.e. sign you shouldn't be using promises). See the bottom for something using setTimeout that may give you what you want without workarounds.
let xhrRequest = (path, data, method, success, fail) => {
const xhr = new XMLHttpRequest();
// could alternately be structured as polymorphic fns, YMMV
switch (method) {
case 'GET':
xhr.open('GET', path);
xhr.onload = () => {
if (xhr.status < 400 && xhr.status >= 200) {
success(xhr.responseText);
} else {
fail(new Error(`Server responded with a status of ${xhr.status}`));
}
};
xhr.onerror = () => {
fail(new Error('Network error'));
};
xhr.send();
return xhr;
case 'POST':
// etc.
return xhr;
// and so on...
}
};
// can work with any function that can take success and fail callbacks
class CancellablePromise {
constructor (fn, ...params) {
this.promise = new Promise((res, rej) => {
this.resolve = res;
this.reject = rej;
fn(...params, this.resolve, this.reject);
return null;
});
}
};
let p = new CancellablePromise(xhrRequest, 'index.html', null, 'GET');
p.promise.then(loadPage).catch(handleError);
// times out after 2 seconds
setTimeout(() => { p.reject(new Error('timeout')) }, 2000);
// for an alternative version that simply tells the user when things
// are taking longer than expected, NOTE this can be done with vanilla
// promises:
let timeoutHandle = setTimeout(() => {
// don't use alert for real, but you get the idea
alert('Sorry its taking so long to load the page.');
}, 2000);
p.promise.then(() => clearTimeout(timeoutHandle));
Promises are beautiful. I don't think there is any reason that you can not handle this with promises. There are three ways that I can think of.
The simplest way to handle this is within the executor. If you would like to cancel the promise (for instance because of a timeout), you just define a timeout flag in the executor, turn it on with a setTimeout(_ => timeout = true, 5000) instruction, and resolve or reject only if the flag is false, i.e. !timeout && resolve(res) or !timeout && reject(err). This way your promise remains indefinitely unresolved in case of a timeout, and your onfulfillment and onreject functions at the then stage never get called.
The second is very similar to the first, but instead of keeping a flag you just invoke reject at the timeout with a proper error description, and handle the rest at the then or catch stage.
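The reject-at-timeout variant can also be written without touching the original executor, by racing the work against a timer; a sketch (withTimeout is a made-up helper name):

```javascript
// Race a promise against a timer; whichever settles first wins.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error('timeout')), ms);
  });
  // Clear the timer either way so it doesn't keep the process alive.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

const slow = new Promise(resolve => setTimeout(() => resolve('done'), 500));
withTimeout(slow, 50).then(
  v => console.log(v),
  err => console.log(err.message) // logs "timeout"
);
```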
However, if you would like to carry the id of your async operation into the sync world, you can do it as follows.
In this case you have to promisify the async function yourself. Let's take an example: we have an async function that returns the double of a number. This is the function:
function doubleAsync(data,cb){
setTimeout(_ => cb(false, data*2),1000);
}
We would like to use promises, so normally we need a promisifier function which takes our async function and returns another function which, when run, takes our data and returns a promise. Right? So here is the promisifier function:
function promisify(fun){
return (data) => new Promise((resolve,reject) => fun(data, (err,res) => err ? reject(err) : resolve(res)));
}
Let's see how they work together:
function promisify(fun){
return (data) => new Promise((resolve,reject) => fun(data, (err,res) => err ? reject(err) : resolve(res)));
}
function doubleAsync(data,cb){
setTimeout(_ => cb(false, data*2),1000);
}
var doubleWithPromise = promisify(doubleAsync);
doubleWithPromise(100).then(v => console.log("The asynchronously obtained result is: " + v));
So now you see that our doubleWithPromise(data) function returns a promise, and we chain a then stage to it and access the returned value.
But what you need is not only a promise but also the id of your async function. This is very simple: your promisified function should return an object with two properties, a promise and an id. Let's see...
This time our async function will return a result randomly within 0-5 seconds. We will obtain its result.id synchronously along with result.promise, and use this id to cancel the promise if it fails to resolve within 2.5 seconds. Any figure of 2501 msecs or above in the "Resolve in ... msecs" console log will result in nothing happening, and the promise is effectively cancelled.
function promisify(fun){
return function(data){
var result = {id:null, promise:null}; // template return object
result.promise = new Promise((resolve,reject) => result.id = fun(data, (err,res) => err ? reject(err) : resolve(res)));
return result;
};
}
function doubleAsync(data,cb){
var dur = ~~(Math.random()*5000); // return the double of the data within 0-5 seconds.
console.log("Resolve in " + dur + " msecs");
return setTimeout(_ => cb(false, data*2),dur);
}
var doubleWithPromise = promisify(doubleAsync),
promiseDataSet = doubleWithPromise(100);
setTimeout(_ => clearTimeout(promiseDataSet.id),2500); // give 2.5 seconds to the promise to resolve or cancel it.
promiseDataSet.promise
.then(v => console.log("The asynchronously obtained result is: " + v));
You can use fetch() with response.body.getReader(): response.body is a ReadableStream, and the reader it returns has a cancel() method, which itself returns a Promise that settles once the read of the stream has been cancelled.
// 58977 bytes of text, 59175 total bytes
var url = "https://gist.githubusercontent.com/anonymous/"
+ "2250b78a2ddc80a4de817bbf414b1704/raw/"
+ "4dc10dacc26045f5c48f6d74440213584202f2d2/lorem.txt";
var n = 10000;
var clicked = false;
var button = document.querySelector("button");
button.addEventListener("click", () => {clicked = true});
fetch(url)
.then(response => response.body.getReader())
.then(reader => {
var len = 0;
reader.read().then(function processData(result) {
if (result.done) {
// do stuff when `reader` is `closed`
return reader.closed.then(function() {
return "stream complete"
});
};
if (!clicked) {
len += result.value.byteLength;
}
// cancel stream if `button` clicked or
// total bytes processed is greater than 10000
if (clicked || len > n) {
return reader.cancel().then(function() {
return "read aborted at " + len + " bytes"
})
}
console.log("len:", len, "result value:", result.value);
return reader.read().then(processData)
})
.then(function(msg) {
alert(msg)
})
.catch(function(err) {
console.log("err", err)
})
});
<button>click to abort stream</button>
The method I am currently using is as follows:
var optionalReturnsObject = {};
functionThatReturnsPromise(dataToSend, optionalReturnsObject ).then(doStuffOnAsyncComplete);
console.log("Some instant data has been returned here:", optionalReturnsObject );
For me, the advantage of this is that another member of my team can use this in a simple way:
functionThatReturnsPromise(data).then(...);
And not need to worry about the returns object. An advanced user can see from the definitions what is going on.
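For concreteness, a minimal sketch of how such a function might fill in the returns object before its promise settles (the names and the doubling work are made up for the example):

```javascript
// Synchronous data (a start timestamp and a cancel hook) is written
// onto the optional object before the promise is returned.
function functionThatReturnsPromise(data, returns = {}) {
  returns.startedAt = Date.now();
  let cancelled = false;
  returns.cancel = () => { cancelled = true; };
  return new Promise(resolve => {
    setTimeout(() => { if (!cancelled) resolve(data * 2); }, 50);
  });
}

const info = {};
functionThatReturnsPromise(21, info).then(v => console.log(v)); // logs 42
console.log(typeof info.cancel); // "function": available immediately
```

Callers who don't care about the synchronous data can simply omit the second argument, which is what makes the pattern easy for other team members to use.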