RxJS: Cache Several XHR Calls without Mutability and Side Effects - javascript

I'm looking for an RxJS example of how to cache a series of XHR calls (or other async operations), so the same call does not have to be repeated, while respecting immutability and avoiding side effects.
Here's a bare-bones, mutable example:
var dictionary = {}; // mutable

var click$ = Rx.Observable.fromEvent(document.querySelector('button'), 'click', function (evt) {
    return Math.floor(Math.random() * 6) + 1; // click -> random number 1-6 (key)
  })
  .flatMap(getDefinition);

var clicksub = click$.subscribe(function (key) {
  console.log(key);
});

function getDefinition (key) {
  if (dictionary[key]) { // check dictionary for key
    console.log('from dictionary');
    return Rx.Observable.return(dictionary[key]);
  }
  // key not found; mock up an async operation, to be replaced with XHR
  return Rx.Observable.fromCallback(function (key, cb) {
    dictionary[key] = key; // side effect
    cb(dictionary[key]); // return definition
  })(key);
}
JSBin Demo
Question: Is there a way to cache several similar async operations without resorting to the dictionary variable, given its mutability and side effects?
I’ve looked at scan as a means to “collect” the XHR call results, but I don’t see how to handle an async operation within scan.
I think I’m dealing with two issues here: one is state management maintained by the event stream rather than kept in a variable, and the second is incorporating a conditional operation that may depend on an async operation, in the event stream flow.

Using the same technique as in a previous question (RxJS wait until promise resolved), you could use scan and add the HTTP call as part of the state you keep track of. This is untested, but you should be able to adapt it easily:
restCalls$ = click$
  .scan(function (state, request) {
    var cache = state.cache;
    if (cache.has(request)) {
      return {cache: cache, restCallOrCachedValue$: Rx.Observable.return(cache.get(request))};
    }
    else {
      return {
        cache: cache,
        restCallOrCachedValue$: Rx.Observable
          .defer(function () {
            return Rx.Observable
              .fromPromise(executeRestCall(request))
              .do(function (value) { cache.add(request, value); });
          })
      };
    }
  }, {cache: new Cache(), restCallOrCachedValue$: undefined})
  .pluck('restCallOrCachedValue$')
  .concatAll()
So basically you either pass an observable which performs the call down the stream, or you directly return the cached value wrapped in an observable. In both cases, the relevant observables will be unwrapped in order by concatAll. Note how a cold observable is used to start the HTTP call only at the time of subscription (supposing that executeRestCall executes the call and immediately returns a promise).
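Note that Cache is not an RxJS type; the answer assumes a small helper with has/get/add methods. A minimal sketch of such a helper, backed by a plain object, might look like this:
// Minimal sketch of the Cache helper the snippet assumes (not part of RxJS)
function Cache() { this.store = {}; }
Cache.prototype.has = function (key) { return key in this.store; };
Cache.prototype.get = function (key) { return this.store[key]; };
Cache.prototype.add = function (key, value) { this.store[key] = value; };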

Working from #user3743222's response, this code does cache the XHR calls:
// set up Dictionary object
function Dictionary () { this.dictionary = {} }
Dictionary.prototype.has = function (key) { return this.dictionary[key] }
Dictionary.prototype.add = function (key, value) { this.dictionary[key] = value }
Dictionary.prototype.get = function (key) { return this.dictionary[key] }

var definitions$ = click$
  .scan(function (state, key) {
    var dictionary = state.dictionary
    // check for key in dictionary
    if (dictionary.has(key)) {
      console.log('from dictionary')
      return {
        dictionary: dictionary,
        def$: Rx.Observable.return(dictionary.get(key))
      }
    }
    // key not found
    else {
      return {
        dictionary: dictionary,
        def$: Rx.Observable.fromPromise(XHRPromise(key))
          .do(function (definition) { dictionary.add(key, definition) })
      }
    }
  }, {dictionary: new Dictionary(), def$: undefined})
  .pluck('def$') // pull out definition stream
  .concatAll() // flatten
Updated: In the // key not found block, the function XHRPromise (which returns a promise) is executed with the key, and the result is passed to the RxJS method fromPromise. Next, a do method is chained on, in which the definition is grabbed and added to the dictionary cache.
Follow up question: It appears we've removed the side effect issue, but is this still considered mutating when the definition and key are added to the dictionary?
Putting this here for archival purposes, the first iteration had this code, accomplishing the same caching procedure:
// ALTERNATIVE key not found
else {
  var retdef = Rx.Observable.fromPromise(function () {
    var prom = XHRPromise(key).then(function (dd) {
      dictionary.add(key, dd) // add key/def to dictionary
      return dd
    })
    return prom
  }()) // immediately invoked anonymous function
  return { dictionary: dictionary, def$: retdef }
}
Here, the function XHRPromise, which returns a promise, is used. Before that promise is handed to the event stream, a then method is chained on, in which the promised definition is grabbed and added to the dictionary cache. After that, the promise object is returned to the event stream through the RxJS method fromPromise.
To get access to the definition returned from the XHR call, so it can be added to the dictionary cache, an anonymous, immediately invoked function is used.

Related

Add an undefined method promise to an array to be resolved later in Promise.all()

I want to queue up DB calls that will be executed once the database is connected. The DB object is created and stored as a member of a module when it connects.
DB Module:
var db = {
localDb: null,
connectLocal: (dbName) => {
// Do stuff
this.localDb = new PouchDB(dbName) // has a allDocs() method
}
}
Adding calls to queue:
var dbQueue = []
function getDocs () {
dbQueue.push (
db.localDb.allDocs () // allDocs() not yet defined; returns promise
)
}
// Called when connected and queue is not empty:
function processQueue () {
Promise.all (dbQueue)
.then(...)
}
If getDocs() is called before db.connectLocal() sets db.localDb, then I get the following error (or similar) because db.localDb is not yet defined:
TypeError: Cannot read property 'then' of undefined
Is it possible to add an undefined method, that returns a promise, to an array to be resolved later in Promise.all()? Any other ideas as to how I can solve this issue?
Also, I'm using Vue.js and PouchDB.
You can keep a promise in your db module instead of just the localDb property:
let localDb = null;
let resolveLocalDb = null;
let localDbPromise = new Promise(function(resolve, reject) {
  resolveLocalDb = resolve;
});

var db = {
  getLocalDb: () => {
    return localDbPromise;
  },
  connectLocal: (dbName) => {
    // Do stuff
    localDb = new PouchDB(dbName); // has an allDocs() method
    resolveLocalDb(localDb);
  }
};
Then, replace .localDb with getLocalDb(), which returns a promise.
dbQueue.push(
  db.getLocalDb().then(localDb => localDb.allDocs())
)
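With this change, the queue holds promises that only settle once the database is connected, so processQueue can stay essentially as it was. A hedged sketch of how the pieces might fit together (getDocs and processQueue are the question's functions, adapted):
// Hedged sketch: the queue now holds promises that wait for the DB connection
var dbQueue = [];

function getDocs () {
  dbQueue.push(
    db.getLocalDb().then(localDb => localDb.allDocs())
  );
}

function processQueue () {
  return Promise.all(dbQueue).then(results => {
    // results[i] is the allDocs() result of the i-th queued call
    dbQueue = [];
    return results;
  });
}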
I solved my queue issue, but it wasn't at all how I was trying to go about it.
My first problem was thinking that Promise.all() deferred calling my methods until it was called, but they are called when added to the array. This caused the error I mentioned in my question. So I needed to rethink how to populate the queue with methods that may not yet exist.
The solution was to add the calls to an array (the queue) as strings (e.g. "getDocs"), then loop through the array calling the methods using bracket notation (e.g. db["getDocs"]()).
My app is written in Vue.js so it's obviously different, but here's a simplified, working example:
// Dummy DB object
var db = {
  docs: [1, 2, 3]
};

// Queue where the DB ops are stored
var dbQueue = [];

// Process the queue - called elsewhere once the DB is connected.
// The processed array and Promise.all() aren't necessary as you could just call
// the methods outright, but I want to log the results in order.
async function processQueue() {
  var processed = []; // Called queue methods

  // Add valid method calls to the processed array
  dbQueue.forEach(method => {
    if (typeof db[method] === "function") {
      processed.push(db[method]());
    } else {
      console.error(`"${method}" is not the name of a valid method.`);
    }
  });

  // Log promise results
  await Promise.all(processed).then(res => {
    console.log("Processed:", res);
  });

  // Empty the queue
  dbQueue = [];
}

// Add some calls to the queue for a method that doesn't yet exist
dbQueue.push("getDocs");
dbQueue.push("getDocs");

// Simulate adding the method
db.getDocs = function() {
  return new Promise(resolve => {
    resolve(this.docs);
  });
};

// Process queue once conditions are met (e.g. db is connected); called elsewhere
processQueue();
And here's a fiddle with an example that allows arguments for the methods: https://jsfiddle.net/rjbv0284/1/
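The fiddle itself isn't reproduced here, but a hedged sketch of one way to queue method names together with their arguments (reusing the db object from above) could look like this:
// Hedged sketch: each queue entry carries a method name plus its arguments
var dbQueue = [];

function enqueue(method, ...args) {
  dbQueue.push({ method, args });
}

async function processQueue() {
  var processed = dbQueue
    .filter(entry => typeof db[entry.method] === "function")
    .map(entry => db[entry.method](...entry.args));
  dbQueue = [];
  return Promise.all(processed);
}

enqueue("getDocs"); // queued before db.getDocs exists
// ... later, once db.getDocs has been added and the DB is connected:
// processQueue().then(results => console.log(results));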

How do I mix all this logic with javascript Promises?

I'm using bluebird in Node, and I'm still pretty new to using Promises, especially when things start getting beyond the basics.
Here's a function I need to construct using Promises, and I'm struggling to figure out the best way to set it up. At a high level, this function will take a model object and return it, converting any query properties to their result sets. For example, a property can have a value of "query(top5Products)", and we'll need to look up that named query and replace the value with the results of that query. A property can also be an actual string-based query (using RQL, e.g. "eq(contentType,products)&&limit(5,0)"). This converted model object will then be used to bind against a template.
Here's my pseudo-coded function, currently synchronous except for the calls to existing promise-returning services...
function resolveQueryPropertiesOnModel(model) {
  for (let property in model) {
    if (model.hasOwnProperty(property)) {
      let queryName = this.getNameOfNamedQuery(model[property]); // will return undefined if the property is not a named query
      if (queryName) {
        // this property is a named query, so get it from the database
        this.getByName(queryName)
          .then((queryObject) => {
            // if queryObject has a results property, that's the cached resultset - use it
            if (queryObject && queryObject.results) {
              model[property] = queryObject.results;
            }
            else {
              // need to resolve the query to get the results
              this.resolve(queryObject.query)
                .then((queryResults) => {
                  model[property] = queryResults;
                });
            }
          });
      }
      else if (this.isQuery(model[property])) { // check to see if this property is an actual query
        // resolve the query to get the results
        this.resolve(model[property])
          .then((queryResults) => {
            model[property] = queryResults;
          });
      }
    }
  }
  // return some sort of promise that will eventually become the converted model,
  // with all query properties converted to their resultsets
  return ???;
}
I'm still very rusty when it comes to taking loops with logic and some pre-existing promises and mashing them all together.
Any help will be appreciated.
Here's an implementation of your code using Bluebird that makes these structural changes:
Runs the outer for loop and collects any promises that were started
Returns nested promises to chain them so they are linked and so the top level promise will indicate when everything is done in that chain
Collects any new promises into the promises array
Uses Promise.all(promises) to track when all the async promise operations are done and returns that.
It appears your result is the side effect of modifying the model object, so no explicit values are returned through the promises. You can use the returned promise to know when all the async operations are done, and you can then examine the model object for results.
Code:
function resolveQueryPropertiesOnModel(model) {
  const promises = [];
  for (let property in model) {
    let p;
    if (model.hasOwnProperty(property)) {
      let queryName = this.getNameOfNamedQuery(model[property]); // will return undefined if the property is not a named query
      if (queryName) {
        // this property is a named query, so get it from the database
        p = this.getByName(queryName).then((queryObject) => {
          // if queryObject has a results property, that's the cached resultset - use it
          if (queryObject && queryObject.results) {
            model[property] = queryObject.results;
          } else {
            // need to resolve the query to get the results
            return this.resolve(queryObject.query).then((queryResults) => {
              model[property] = queryResults;
            });
          }
        });
      } else if (this.isQuery(model[property])) { // check to see if this property is an actual query
        // resolve the query to get the results
        p = this.resolve(model[property]).then((queryResults) => {
          model[property] = queryResults;
        });
      }
    }
    // if we started a new promise, then push it into the array
    if (p) {
      promises.push(p);
    }
  }
  return Promise.all(promises);
}
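Since the returned promise only signals completion and the model is mutated in place, a hedged usage sketch (assuming the function is invoked as a method of a hypothetical service object that provides getByName, resolve and isQuery) might be:
// Hedged usage sketch: wait for every query property to be resolved, then use the model
service.resolveQueryPropertiesOnModel(model)
  .then(function () {
    // every query property on `model` now holds its resultset
    renderTemplate(model); // renderTemplate is a hypothetical consumer
  })
  .catch(function (err) {
    console.error("Failed to resolve query properties:", err);
  });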
This is how I would solve it.
A q.all() will be resolved once all of the promises are resolved; each promise is one property in the model that is processed.
For each property (I'd use a library like lodash and _.reduce, but you can use hasOwnProperty if you like), the resolveModelProperty function returns a promise that decides the fate of the property: if there is a query name, get it; if not and there is a query, resolve it; otherwise, don't change the property.
Two helper functions, resolveByName and resolveQuery, handle the cases of cached and uncached queries.
function resolveQueryPropertiesOnModel(model) {
const promises = [],
resolveQuery = toBeResolved => this.resolve(toBeResolved),
resolveByName = queryName => this.getByName(queryName)
.then(queryObject => queryObject && queryObject.results
? queryObject.results : resolveQuery(queryObject.query)),
resolveModelProperty = (modelProperty) => {
const queryName = this.getNameOfNamedQuery(modelProperty);
return queryName ? resolveByName(queryName) :
this.isQuery(modelProperty) ? resolveQuery(modelProperty):
modelProperty;
};
for(let property in model)
  if (model.hasOwnProperty(property))
    promises.push(resolveModelProperty(model[property])
      .then(result => model[property] = result));
return q.all(promises);
}

How to call asynchronous method recursively with different parameters

I have a REST-call method, restRequest(), built on the request module, which returns the response as a promise (i.e. it's asynchronous). I have to call this method recursively with different parameters, after getting each result and passing that result to the same method.
Example code:
restRequest(url, "POST").then(function(response) {
  restRequest(secondUrl, 'GET', response).then(function(response2) {
  });
});
Will this work, or are there other ways to solve this?
I would use the async library for this, specifically async.waterfall, which would work like this:
async.waterfall([
  function firstRequest(callback) {
    restRequest(url, "POST").then(function(response) {
      callback(null, response);
    });
  },
  function secondRequest(data, callback) {
    restRequest(secondUrl, 'GET', data).then(function(response2) {
      callback(null, response2);
    });
  }
], function (err, result) {
  // Handle err or result
});
Sorry for the formatting, I'm on mobile.
You can read about how async.waterfall works from the link above.
Your method works, but depending on how many requests you have, you can end up in quite a deep callback hell.
But since you are using promises, you can just return your promise chain like this:
restRequest(url, "POST")
.then(function(resp) {
return restRequest(secondUrl, "GET", resp);
})
.then(function(resp) {
return restRequest(thirdUrl, "GET", resp);
})
.then(function(resp) {
// do whatever keep the chain going or whatever
})
.catch(function(error) {
// if any of the promises error it will immediately call here.
});
With promises you can return a new promise from within a .then and just keep the chain going infinitely.
I'm just biased towards async as I think it really improves readability when used right.
You could do something like this:
let requestParams = [
[url, 'POST'],
[secondUrl, 'GET'],
...
];
function callRecursive(response){
if(!requestParams.length) return Promise.resolve(response);
let params = requestParams.shift();
if(response) params.push(response);
return restRequest(...params).then(callRecursive);
}
callRecursive().then(successCallbk).catch(errCallBk);
You can supply one or more arguments to bind to create a partially applied function.
restRequest(url,"POST").then(restRequest.bind(this,secondUrl, "GET"))
.then(restRequest.bind(this,thirdUrl, "GET"));
Since these are fired off in serial, what you really have is a simple chain of functions (some return promises, some might not) that can compose (or sequence, here) together, which I find to be a neat way to isolate out everything you want to happen and then combine behaviors as needed. It's still a Promise chain under the hood, but expressed as a series. First, a few utility methods to help:
var curry = (f, ...args) =>
(f.length <= args.length) ? f(...args) : (...more) => curry(f, ...args, ...more);
var pipeP = (...fnlist) =>
acc => fnlist.reduce( (acc,fn) => acc.then(fn), Promise.resolve(acc));
then
//make restRequest only return a Promise once it's given its 3rd argument
var restRequest = curry(restRequest);
//define what our requests look like
var request1 = restRequest('firstUrl', "POST");//-> curried function, not yet called
var request2 = restRequest('secondUrl', 'GET');//-> curried function, not yet called
//define some simple methods to process responses
var extractURL = x => x.url;//-> simple function
var extractData = x=> x.data;//-> simple function
//final behaviors, i.e. do something with data or handle errors
//var handleData = ... //-> do something with "data"
//var handleError = ... //-> handle errors
//now, create a sort of lazy program chain waiting for a starting value
//that value is passed to request1 as its 3rd arg, starting things off
var handleARequest = pipeP(request1, extractURL, request2, extractData);
//and execute it as needed by passing it a starting request
handleARequest({postdata:5}).then(handleData).catch(handleErrors);
Recursion is the most obvious approach but it's not necessary. An alternative is to build a .then() chain by reducing an array of known parameters (urls and methods).
The process is presented here under "The Collection Kerfuffle".
function asyncSequence(params) {
return params.reduce(function(promise, paramObj) {
return promise.then(function(response) {
return restRequest(paramObj.url, paramObj.method, response);
});
}, Promise.resolve(null)); // a promise resolved with the value to appear as `response` in the first iteration of the reduction.
}
This will cater for any number of requests, as determined by the length of the params array.
Call as follows :
var params = [
{url:'path/1', method:'POST'},
{url:'path/2', method:'GET'},
{url:'path/3', method:'POST'}
];
asyncSequence(params).then(function(lastResponse) {
//all successfully completed
}).catch(function(e) {
// something went wrong
});

Waiting for multiple AJAX requests to finish in a loop [duplicate]

That's how I do it:
function processArray(array, index, callback) {
processItem(array[index], function(){
if(++index === array.length) {
callback();
return;
}
processArray(array, index, callback);
});
};
function processItem(item, callback) {
// do some ajax (browser) or request (node) stuff here
// when done
callback();
}
var arr = ["url1", "url2", "url3"];
processArray(arr, 0, function(){
console.log("done");
});
Is it any good? How do I avoid this spaghetti-ish code?
Check out the async library; it's made for control flow (async stuff) and it has a lot of methods for array operations: each, filter, map. Check the documentation on GitHub. Here's what you probably need:
each(arr, iterator, callback)
Applies an iterator function to each item in an array, in parallel. The iterator is called with an item from the list and a callback for when it has finished. If the iterator passes an error to this callback, the main callback for the each function is immediately called with the error.
eachSeries(arr, iterator, callback)
The same as each only the iterator is applied to each item in the array in series. The next iterator is only called once the current one has completed processing. This means the iterator functions will complete in order.
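For the URL array from the question, a hedged sketch using eachSeries with the question's processItem might look like this:
var async = require("async"); // npm install async

var urls = ["url1", "url2", "url3"];

async.eachSeries(urls, function (url, done) {
  // processItem is the question's function: do the ajax/request work, then call back
  processItem(url, function () {
    done(); // pass an error here instead to abort the remaining items
  });
}, function (err) {
  if (err) return console.error(err);
  console.log("done");
});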
As pointed out in another answer, one can use the "async" library. But sometimes you just don't want to introduce a new dependency into your code. Below is another way to loop and wait for the completion of some asynchronous functions.
var items = ["one", "two", "three"];
// This is your async function, which may perform call to your database or
// whatever...
function someAsyncFunc(arg, cb) {
setTimeout(function () {
cb(arg.toUpperCase());
}, 3000);
}
// cb will be called when each item from arr has been processed and all
// results are available.
function eachAsync(arr, func, cb) {
var doneCounter = 0,
results = [];
arr.forEach(function (item) {
func(item, function (res) {
doneCounter += 1;
results.push(res);
if (doneCounter === arr.length) {
cb(results);
}
});
});
}
eachAsync(items, someAsyncFunc, console.log);
Now, running node iterasync.js will wait for about three seconds and then print [ 'ONE', 'TWO', 'THREE' ]. This is a simple example, but it can be extended to handle many situations.
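For example, a hedged sketch of the same helper extended with error-first callbacks, so a single failure short-circuits the whole batch (assuming the iterator now calls back as cb(err, result)):
// Hedged sketch: error-first variant of eachAsync
function eachAsyncErr(arr, func, cb) {
    var doneCounter = 0,
        failed = false,
        results = [];
    arr.forEach(function (item, i) {
        func(item, function (err, res) {
            if (failed) return;                 // another item already failed
            if (err) { failed = true; return cb(err); }
            results[i] = res;                   // keep results in input order
            if (++doneCounter === arr.length) cb(null, results);
        });
    });
}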
As correctly pointed out, you have to use setTimeout, for example:
each_async = function(ary, fn) {
var i = 0;
-function() {
fn(ary[i]);
if (++i < ary.length)
setTimeout(arguments.callee, 0)
}()
}
each_async([1,2,3,4], function(p) { console.log(p) })
The easiest way to handle async iteration of arrays (or any other iterable) is with the await operator (only in async functions) and for of loop.
(async function() {
for(let value of [ 0, 1 ]) {
value += await(Promise.resolve(1))
console.log(value)
}
})()
You can use a library to convert any callback-accepting functions you may need into promise-returning ones.
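In Node, util.promisify does exactly this for error-first callback functions; a hedged sketch of a minimal hand-rolled equivalent:
// Minimal sketch: wrap an error-first callback function into a promise-returning one
function promisify(fn) {
  return function (...args) {
    return new Promise((resolve, reject) => {
      fn(...args, (err, result) => {
        if (err) reject(err);
        else resolve(result);
      });
    });
  };
}

// e.g. const readFileAsync = promisify(require("fs").readFile);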
In modern JavaScript there are interesting ways to extend an Array into an async iterable object.
Here I would like to demonstrate a skeleton of a totally new type, AsyncArray, which extends the Array type, inheriting its goodness, to become an async iterable array.
This is only available in modern engines. The code below uses recent features like private instance fields and for await...of.
If you are not familiar with them, I would advise you to have a look at the above-linked topics in advance.
class AsyncArray extends Array {
#INDEX;
constructor(...ps){
super(...ps);
if (this.some(p => p.constructor !== Promise)) {
throw "All AsyncArray items must be a Promise";
}
}
[Symbol.asyncIterator]() {
this.#INDEX = 0;
return this;
};
next() {
return this.#INDEX < this.length ? this[this.#INDEX++].then(v => ({value: v, done: false}))
: Promise.resolve({done: true});
};
};
So an async iterable array must contain promises. Only then can it return an iterator object which, with every next() call, returns a promise that eventually resolves into an object like {value: "whatever", done: false} or {done: true}. So basically everything returned here is a promise. The await abstraction unpacks the value within and gives it to us.
Now, as I mentioned before, this AsyncArray type, since it extends Array, allows us to use those Array methods we are familiar with. That should simplify our job.
Let's see what happens:
class AsyncArray extends Array {
#INDEX;
constructor(...ps){
super(...ps);
if (this.some(p => p.constructor !== Promise)) {
throw "All AsyncArray items must be a Promise";
}
}
[Symbol.asyncIterator]() {
this.#INDEX = 0;
return this;
};
next() {
return this.#INDEX < this.length ? this[this.#INDEX++].then(v => ({value: v, done: false}))
: Promise.resolve({done: true});
};
};
var aa = AsyncArray.from({length:10}, (_,i) => new Promise(resolve => setTimeout(resolve,i*1000,[i,~~(Math.random()*100)])));
async function getAsyncRandoms(){
for await (let random of aa){
console.log(`The Promise at index # ${random[0]} gets resolved with a random value of ${random[1]}`);
};
};
getAsyncRandoms();
For modern Node.js:
To iterate through a collection truly asynchronously, you can try my tiny package with zero dependencies, compatible with ESM and CJS modules, with .d.ts typings. Check the code; it's really tiny.
https://www.npmjs.com/package/array-to-async-iterable
You can use it just like this:
for await(const el of new AsyncTimeIterator(arrayOfObjects)){
...
}
You can't just use a for await...of loop because of the JavaScript engines' microtask and macrotask nature.
In brief, you won't receive new HTTP requests or let other timers' callbacks execute with this code:
for await(const el of array){
...
}
You force V8 (or whichever engine) to execute all the microtasks (your loop iterations), and only when the loop completes do you unblock the event loop and become ready to receive HTTP connections. So this code is completely useless.

What is a good approach to develop a synchronous/blocking and an asynchrounous/non-blocking library-api in parallel? (JavaScript)

I rewrote this question because the old version was obviously misleading.
Please read the text and make sure you understand what I'm asking for. If
there is still anything left in the dark, I'll modify this question for clarity.
Just let me know.
One of my projects is to port a library from Python to JavaScript.
The Python library is entirely blocking/synchronous when it comes to I/O
and such. This is of course perfectly normal for Python code.
I plan to port the synchronous/blocking methods as they are to JavaScript.
There are several reasons for this, and whether or not it's worth the effort is a
good but different question.
Additionally I want to add an asynchronous/non-blocking API.
Think of it like the fs module in Node, where e.g. fs.open and
fs.openSync coexist.
The library is pure JavaScript and will run in Node and in the Browser.
The question is what a good/the best approach for the development of these two coexisting APIs would be.
I believe it's good to have the same thing happening in one place only.
Hence an approach where some parts of the implementation could be shared would be preferable.
Not at any price of course, that's why I'm asking.
I had a proposal for an approach in here, but I'm going to post it as a
possible answer. However, I'm waiting for some serious discussion to happen
before I decide what I accept as an answer.
So far approaches are:
implement both APIs separately and definitely use promises for the asynchronous functions.
use something like the obtain API proposal - being a more integrated approach
If you're talking I/O in node.js then most I/O methods have a synchronous version.
There is no direct conversion from Asynchronicity To Synchronicity. I can think of two approaches:
Have each asynchronous method run a polling loop waiting for the async task to complete before returning.
Drop the idea of mimicking synchronous code and instead invest in better coding patterns (such as promises)
To illustrate I will assume option 2 is a better choice. The following example uses Q promises (easily installed with npm install q).
The idea behind promises is that although they are asynchronous, the returned object is a promise for a value, as if it were a normal function.
// Normal function
function foo(input) {
return "output";
}
// With promises
function promisedFoo(input) {
// Does stuff asynchronously
return promise;
}
The first function takes an input and returns a result. The second example takes an input and immediately returns a promise which will eventually resolve to a value when the async task finishes. You then manage this promise as follows:
var promised_value = promisedFoo(input);
promised_value.then(function(value) {
// Yeah, we now have a value!
})
.fail(function(reason) {
// Oh nos.. something went wrong. It passed in a reason
});
Using promises you no longer have to worry about when something will happen. You can easily chain promises so things happen in sequence without insane nested callbacks or 100 named functions.
It's well worth learning about. Remember, promises are meant to make async code behave like sync code even though it isn't blocking.
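A hedged sketch of that kind of chaining with Q, where each step returns the next promise instead of nesting callbacks (promisedBar and promisedBaz are hypothetical promise-returning steps):
// Hedged sketch: a flat promise chain instead of nested callbacks
promisedFoo(input)
  .then(function (value) {
    return promisedBar(value);     // next async step, runs after promisedFoo settles
  })
  .then(function (barResult) {
    return promisedBaz(barResult); // and another one
  })
  .then(function (finalResult) {
    console.log("done:", finalResult);
  })
  .fail(function (reason) {
    // any rejection in the chain ends up here
    console.error(reason);
  });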
Write lower level API using promises that takes async/sync flag.
Higher level async API returns these promises directly (while also working with async callbacks like it's 1970).
Higher level sync API unwraps the value synchronously from the promise and returns the value or throws the error.
(Examples use bluebird which is orders of magnitude faster and has more features at the cost of file size compared to Q, although that might not be ideal for browsers.)
Low level api that is not exposed:
//lowLevelOp calculates 1+1 and returns the result
//There is a 20% chance of throwing an error
LowLevelClass.prototype.lowLevelOp = function(async, arg1, arg2) {
return new Promise(function(resolve, reject) {
if (Math.random() < 0.2) {
throw new Error("random error");
}
if (!async) resolve(1+1);
else {
//Async
setTimeout(function(){
resolve(1+1);
}, 50);
}
});
};
High level exposed API that works synchronously, using promises or callbacks:
HighLevelClass.prototype.opSync = function(arg1, arg2) {
var inspection =
this.lowLevel.lowLevelOp(false, arg1, arg2).inspect();
if (inspection.isFulfilled()) {
return inspection.value();
}
else {
throw inspection.reason();
}
};
HighLevelClass.prototype.opAsync = function(arg1, arg2, callback) {
//returns a promise as well as accepts callback.
return this.lowLevel.lowLevelOp(true, arg1, arg2).nodeify(callback);
};
You can automatically generate the high level api for synchronous methods:
var LowLevelProto = LowLevelClass.prototype;
Object.keys(LowLevelProto).filter(function(v) {
return typeof LowLevelProto[v] === "function";
}).forEach(function(methodName) {
//If perf is at all a concern you really must do this with a
//new Function instead of closure and reflection
var method = function() {
var inspection = this.lowLevel[methodName].apply(this.lowLevel, arguments);
if (inspection.isFulfilled()) {
return inspection.value();
}
else {
throw inspection.reason();
}
};
HighLevelClass.prototype[methodName + "Sync" ] = method;
});
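A hedged usage sketch of the two exposed flavours (assuming HighLevelClass wires up this.lowLevel in its constructor):
// Hedged usage sketch of the exposed API
var highLevel = new HighLevelClass();

// Synchronous flavour: returns the value or throws
try {
  var sum = highLevel.opSync(1, 1); // 2
} catch (e) {
  // the "random error" case
}

// Asynchronous flavour: node-style callback and/or the returned promise
highLevel.opAsync(1, 1, function (err, value) { /* callback style */ })
  .then(function (value) { /* promise style */ });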
I implemented a library that does what I'm asking for: ObtainJS.
(Yes, the library uses Promises, BUT not in the way others proposed in their answers here.)
Reposting the Readme.md:
ObtainJS
ObtainJS is a micro framework to bring together asynchronous and
synchronous JavaScript code. It helps you to Don't Repeat Yourself
(DRY) if you are developing a library with interfaces for both
blocking/synchronous and non-blocking/asynchronous execution models.
As a USER
of a library that was implemented with ObtainJS you won't have to learn
a lot. Typically a function defined using ObtainJS has as its first argument
the switch that lets you choose the execution path, followed by its normal
arguments:
// readFile has an obtainJS API:
function readFile(obtainAsyncExecutionSwitch, path) { /* ... */ }
execute synchronously
If the obtainSwitch is a falsy value readFile will execute synchronously
and return the result directly.
var asyncExecution = false, result;
try {
result = readFile(asyncExecution, './file-to-read.js');
} catch(error) {
// handle the error
}
// do something with result
execute asynchronously
If the obtainSwitch is a truthy value readFile will execute asynchronously
and always return a Promise.
See Promises at MDN
var asyncExecution = true, promise;
promise = readFile(asyncExecution, './file-to-read.js');
promise.then(
function(result) {
// do something with result
},
function(error){
// handle the error
}
)
// Alternatively, use the returned promise directly:
readFile(asyncExecution, './file-to-read.js')
.then(
function(result) {
// do something with result
},
function(error){
// handle the error
}
)
You can use a callback-based API, too. Note that the Promise is returned anyway.
var asyncExecution;
function unifiedCallback(error, result){
if(error)
// handle the error
else
// do something with result
}
asyncExecution = {unified: unifiedCallback}
readFile(asyncExecution, './file-to-read.js');
or with a separate callback and errback
var asyncExecution;
function callback(result) {
// do something with result
}
function errback(error) {
// handle the error
}
var asyncExecution = {callback: callback, errback: errback}
readFile(asyncExecution, './file-to-read.js');
As a smart ;-) LIBRARY AUTHOR
who's going to implement an API using ObtainJS, the work is a bit more.
Stay with me.
The behavior above is achieved by defining a twofold dependency tree: one
for the actions of the synchronous execution path and one for the actions
of the asynchronous execution path.
Actions are small functions with dependencies on the results of other
actions. The asynchronous execution path will fall back to synchronous
actions if there is no asynchronous action defined for a dependency.
You wouldn't define an asynchronous action if its synchronous
equivalent is non-blocking. This is where you DRY!
So, what you do, for example, is split your synchronous and blocking
method into small function chunks. These chunks depend on the results of each
other. Then you define a non-blocking AND asynchronous chunk for each
synchronous AND blocking chunk. ObtainJS does the rest for you. Namely:
creating a switch for synchronous or asynchronous execution
resolving the dependency tree
executing the chunks in the right order
providing you with the results via:
return value when using the synchronous path
promises OR callbacks (your choice!) when using the asynchronous path
Here is the readFile function
from above, taken directly from working code at
ufoJS
define(['ufojs/obtainJS/lib/obtain'], function(obtain) {
// obtain.factory creates our final function
var readFile = obtain.factory(
// this is the synchronous dependency definition
{
// this action is NOT in the async tree, the async execution
// path will fall back to this method
uri: ['path', function _path2uri(path) {
return path.split('/').map(encodeURIComponent).join('/')
}]
// synchronous AJAX request
, readFile:['uri', function(path) {
var request = new XMLHttpRequest();
request.open('GET', path, false);
request.send(null);
if(request.status !== 200)
throw _errorFromRequest(request);
return request.responseText;
}]
}
,
// this is the asynchronous dependency definition
{
// asynchronous AJAX request
readFile:['uri', '_callback', function(path, callback) {
var request = new XMLHttpRequest()
, result
, error
;
request.open('GET', path, true);
request.onreadystatechange = function (aEvt) {
if (request.readyState != 4 /*DONE*/)
return;
if (request.status !== 200)
error = _errorFromRequest(request);
else
result = request.responseText
callback(error, result)
}
request.send(null);
}]
}
// these are the "regular" function arguments
, ['path']
// this is the "job", a driver function that receives as its first
// argument the obtain api: a method that takes the name of an action or
// of an argument as input and returns its result.
// Note that job is potentially called multiple times during
// asynchronous execution
, function(obtain, path){ return obtain('readFile'); }
);
})
a skeleton
var myFunction = obtain.factory(
// sync actions
{},
// async actions
{},
// arguments
[],
//job
function(obtain){}
);
action/getter definition
// To define a getter we give it a name and provide a definition array.
{
// sync
sum: ['arg1', 'arg2',
// the last item in the definition array is always the action/getter itself.
// it is called when all dependencies are resolved
function(arg1, arg2) {
// function body.
var value = arg1 + arg2
return value
}]
}
// For asynchronous getters you have different options:
{
// async
// the special name "_callback" will inject a callback function
sample1: ['arg1', '_callback', function(arg1, callback) {
// callback(error, result)
}],
// you can get a separate callback and errback when using both special
// names "_callback" and "_errback"
sample2: ['arg1', '_callback', '_errback', function(arg1, callback, errback) {
// errback(error)
// callback(result)
}],
// return a promise
sample3: ['arg1', function(arg1) {
var promise = new Promise(/* do what you have to*/);
return promise
}]
}
The items in the definition array before the action are its dependencies;
their values are going to be injected into the call to the action, when
available.
If the type of a dependency is not a string, it's injected as a value
directly. This way you can effectively do currying.
If the type of the value is a string, it's looked up in the dependency
tree for the current execution path (sync or async).
If its name is defined as a caller-argument (in the third argument of obtain.factory), the value
is taken from the invoking call.
If its name is defined as the name of another action, that action is
executed and its return value is used as a parameter. An action will
be executed only once per run; later invocations will return a cached value.
If the execution path is asynchronous, obtain will first look for an
asynchronous action definition. If that is not found, it falls back
to a synchronous definition.
If you wish to pass a string as a value to your getter, you must define it as
an instance of obtain.Argument: new obtain.Argument('mystring argument is not a getter')
A more complete example
from ufoLib/glifLib/GlyphSet.js
Note that: obtainJS is aware of the host object and propagates this
correctly to all actions.
/**
* Read the glif from I/O and cache it. Return a reference to the
* cache object: [text, mtime, glifDocument (if already built by this.getGLIFDocument)]
*
* Has the obtainJS sync/async api.
*/
GlypSet.prototype._getGLIFcache = obtain.factory(
{ //sync
fileName: ['glyphName', function fileName(glyphName) {
var name = this.contents[glyphName];
if(!(glyphName in this.contents) || this.contents[glyphName] === undefined)
throw new KeyError(glyphName);
return this.contents[glyphName]
}]
, glyphNameInCache: ['glyphName', function(glyphName) {
return glyphName in this._glifCache;
}]
, path: ['fileName', function(fileName) {
return [this.dirName, fileName].join('/');
}]
, mtime: ['path', 'glyphName', function(path, glyphName) {
try {
return this._io.getMtime(false, path);
}
catch(error) {
if(error instanceof IONoEntryError)
error = new KeyError(glyphName, error.stack);
throw error;
}
}]
, text: ['path', 'glyphName', function(path, glyphName) {
try {
return this._io.readFile(false, path);
}
catch(error) {
if(error instanceof IONoEntryError)
error = new KeyError(glyphName, error.stack);
throw error;
}
}]
, refreshedCache: ['glyphName', 'text', 'mtime',
function(glyphName, text, mtime) {
return (this._glifCache[glyphName] = [text, mtime]);
}]
}
//async getters
, {
mtime: ['path', 'glyphName', '_callback',
function(path, glyphName, callback) {
var _callback = function(error, result){
if(error instanceof IONoEntryError)
error = new KeyError(glyphName, error.stack);
callback(error, result)
}
this._io.getMtime({unified: _callback}, path);
}]
, text: ['path', 'glyphName', '_callback',
function(path, glyphName, callback){
var _callback = function(error, result) {
if(error instanceof IONoEntryError)
error = new KeyError(glyphName, error.stack);
callback(error, result)
}
this._io.readFile({unified: _callback}, path);
}
]
}
, ['glyphName']
, function job(obtain, glyphName) {
if(obtain('glyphNameInCache')) {
if(obtain('mtime').getTime() === this._glifCache[glyphName][1].getTime()) {
// cache is fresh
return this._glifCache[glyphName];
}
}
// still here? need read!
// refreshing the cache:
obtain('refreshedCache')
return this._glifCache[glyphName];
}
)
