Sequential promises in ionic2/angular2 - javascript

I know, it is a newbie question:
I created a mobile application reading an information feed from a bluetooth serial device.
Data are retrieved using a promise in this way:
myServiceClass.getRemoteValue(valueId).then((reply: number) => {
...
});
I need to read multiple parameters from this feed, and I have to wait for the previous call to finish before requesting the next value.
If I run:
let requiredValues = [1, 2, 3, 4, ..., n];
for (let i = 0; i < requiredValues.length; i++) {
myServiceClass.getRemoteValue(requiredValues[i]).then((reply: number) => {
...
});
}
this way the requests run in parallel, but I need them to run in sequence, one after the other. Is there any way to chain an array of promises sequentially?
In other words, I need to run the n-th promise only after the previous promise has resolved.
Thank you very much for your time.

Well, you can use a recursive method to achieve that. Please take a look at this plunker (when running it, notice that the values are printed in the console).
I'm just using some fake data, but I guess it's enough to give you the overall idea:
public start(): void {
this.getSeveralRemoteValues([1,2,3,4,5,6,7,8,9]);
}
private getSeveralRemoteValues(array: number[]): Promise<boolean> {
if (array && array.length) {
return this.getRemoteValueFromService(array[0]).then(() => {
array.shift(); // remove the first item of the array
return this.getSeveralRemoteValues(array); // recurse with the items left
});
} else {
this.logEnd();
return Promise.resolve(true);
}
}
private logEnd(): void {
alert('All promises are done!');
}
private getRemoteValueFromService(value: number): Promise<boolean> {
// this simulates the call to the service
return new Promise((resolve, reject) => {
setTimeout(() => {
console.log(`Promise: ${value}`);
resolve(true);
}, 1000);
});
}

Related

Rxjs : Retry http call with updated parameters if no results

I am a novice with RxJS. I'm trying to make a request to my API with limits passed as parameters.
My problem is that sometimes the returned result is empty. When that happens I need to retry the API call with updated parameters (the skip param).
pollService.getPoll$(skip, limit).subscribe((pollList) => {
doStuff;
},
(error) => {
doStuff;
});
I read some topics about the retryWhen RxJS operator, but that is for when the request fails with an error and you want to retry the same request; I have no errors, and I don't want to repeat the same request anyway. I also saw topics about the replay operators, but they are not very clear to me.
Can someone explain what to do here, please?
Thanks
Alex
Consider using the expand operator, as demonstrated below:
import { EMPTY, of } from "rxjs"
import { expand } from "rxjs/operators"
public startPolling(skip: number, limit: number): void {
of([])
.pipe(
expand(x => x.length < 2 ? pollService.getPoll$(skip--, limit) : EMPTY)
)
.subscribe(pollList => {})
}
Update:
public poll = (skip: number, limit: number): void => {
defer(() => getPoll$(1 + skip--, limit))
.pipe(
repeat(),
first(x => {
if(x.length < 2){
// update some variable in the component
return false;
}
return true;
})
)
.subscribe(pollList => { })
}
If I understand correctly, your backend is paging data using the skip and limit parameters.
So, if you have a skip value that is too high, you want to reduce it automatically.
There are many, many ways to solve this problem in RxJS:
you could insert a switchMap after getPoll$. SwitchMap would return a new observable, either wrapping the result if it's ok (with of(pollList)), or returning pollService.getPoll$(newSkipValue, newLimitValue)
you could map the result and throw an Error if the result doesn't pass validation. Then you could catchError and return the result of a new call to getPoll$
However, what I suggest is modelling the call differently. I would use a Subject as a source of requests, and switchMap to execute the requests.
// inside the component or service
interface GetPollRequest {
skip: number;
limit: number;
}
private _callSource = new Subject<GetPollRequest>();
public triggerCall(skip: number, limit: number) {
this._callSource.next({skip, limit});
}
constructor(...) {
this._callSource.pipe(
// every time _callSource emits, we call the server
switchMap(({skip, limit}) => pollService.getPoll$(skip, limit).pipe(
map(pollList => ({ pollList, skip, limit }))
)),
tap(({pollList, skip, limit}) => {
// update the request in any way you need. You need to make sure
// that the next automatic trigger doesn't repeat indefinitely,
// or you'll simulate a DoS attack on your own backend
if (pollList.length < 2) this.triggerCall(skip - 2, limit);
})
).subscribe(({pollList}) => {
// update the component status
});
}
Using this pattern, you use subjects as triggers (or custom events, they are pretty much the same), and you wrap them up during constructor time.
SwitchMap is used to create an observable (in this case, performing a request) every time the source emits.
Tap is used to perform an operation (pretty much like a subscribe), embedded in the chain of transformations inside the pipe.
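The retry-with-updated-parameters idea is independent of RxJS; as a point of comparison, it can be sketched with plain promises. Everything below is an assumption for illustration: `getPoll` is a mock of the paged endpoint, and the "reduce skip by one, bounded attempts" rule is just one possible update policy.

```javascript
// Mocked paged endpoint: pretend only skip values below 3 have data
function getPoll(skip, limit) {
  return Promise.resolve(skip < 3 ? ['a', 'b'] : []);
}

// Retry with a reduced skip until a non-empty page arrives,
// bounded so it cannot loop forever
function pollWithFallback(skip, limit, attemptsLeft = 5) {
  return getPoll(skip, limit).then(pollList => {
    if (pollList.length > 0 || attemptsLeft === 0) return pollList;
    return pollWithFallback(skip - 1, limit, attemptsLeft - 1);
  });
}
```

The bound on attempts plays the same role as the stop condition in the `expand`/`first` variants above: without it, an endpoint that never returns data would be polled indefinitely.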

Refactoring: return or push value to a new array from a mongoose callback

Actually I'm not sure that the title of my question is correct; if you have a better idea, you could leave a comment and I'll rename it.
I am trying to rewrite my old function, which makes http requests and inserts many objects into MongoDB via mongoose. I already have a working version, but I face a problem while using it: when I insertMany 20 arrays from 20+ requests, with ~50,000 elements per request, it causes a huge memory leak, even with MongoDB optimization.
Logic of my code:
function main() {
  server.find({locale: "en_GB"}).exec(function (err, server) {
    for (let i = 0; i < server.length; i++) { // for example 20 servers
      rp({url: server[i].slug}).then(response => {
        auctions.count({
          server: server[i].name,
          lastModified: {$gte: response.data.files[0].lastModified}
        }).then(function (docs) {
          if (docs < 0) {
            // We don't insert data if it is already up-to-date
          } else {
            // I need response.data.files[0].url and server[i].name from the prev. block
            // And here is my problem:
            // requests & insertMany and then => loop main()
          }
        });
      }).catch(function (error) {
        console.log(error);
      });
    }
  });
}
main()
Actually I have already tried many different things to fix it. First of all, I tried adding a setTimeout after the else block, like this:
setTimeout(function () {
//request every server with an interval, instead of all at once
}, 1000 * (i + 1));
but that created another problem for myself, because I needed to recurse into my main() function right after. So I can't use if (i === server.length - 1) to call the garbage collector or to restart main(), because not all servers pass the count validation.
Or let's see another example of mine:
I changed for (let i = 0; i < server.length; i++) on the 3rd line to .map and moved it down next to the else block, but setTimeout doesn't work with the .map version; and, as you may already understand, the script loses the correct order, so I can't add a delay that way.
Actually I already understand how to fix it: just re-create the array via let array_new = [] and array_new.push(response.data.files[0].url), with the use of async/await. But I'm not a big expert in it, and I have already wasted a couple of hours on it. So the only problem for now is that I don't know how to return values from the else block.
For now I'm trying to form the array inside the else block:
function main() {
  // added:
  let array_new = [];
  // variant 1:
  array_new.url += response.data.files[0].url;
  // variant 2:
  array_new.push(response.data.files[0].url);
  return array_new
}
and then consume the array_new array via .then, but none of these works for now. So maybe someone can give me a tip, or show me an already answered Stack Overflow question that could be useful in my situation.
Since you are essentially dealing with promises, you can refactor your function logic to use async await as follows:
async function main() {
  try {
    const servers = await server.find({locale: "en_GB"}).exec()
    const results = await Promise.all(servers.map(async ({ name, slug }) => {
      const response = await rp({ url: slug })
      const { lastModified, url } = response.data.files[0]
      const count = await auctions.count({
        server: name,
        lastModified: { $gte: lastModified }
      })
      return count > 0 ? { name, url } : {}
    }))
    const data = results.filter(d => Object.keys(d).length > 0)
    await Model.insertMany(data)
  } catch (err) {
    console.error(err)
  }
}
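The key step above, mapping an async function over an array and awaiting all the resulting promises before filtering, can be sketched with fake data. The names below (`lookup`, `collect`, the length-based count) are illustrative only and stand in for the `rp`/`auctions.count` calls:

```javascript
// Fake per-server lookup standing in for the rp + auctions.count calls
async function lookup(name) {
  const count = name.length; // arbitrary stand-in for the count query
  return count > 4 ? { name, url: '/' + name } : {};
}

// Map the async function over the array, await ALL the promises,
// and only then filter out the empty results
async function collect(names) {
  const results = await Promise.all(names.map(lookup));
  return results.filter(d => Object.keys(d).length > 0);
}
```

Note that `names.map(lookup)` produces an array of promises, not of results; filtering must happen only after `Promise.all` has resolved them, which is why the `await` sits before the `.filter`.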
Your problem is the logic obscured by your promises. Your main function recursively calls itself N times, where N is the number of servers. This builds up quickly and eats memory, both in the node process and in MongoDB handling all the requests.
Instead of jumping into async / await, start by using the promises you already have and waiting for the batch of N queries to complete before starting another batch. You can use Promise.all for this.
function main() {
server.find({locale: "en_GB"}).exec(function (err, server) {
// need to keep track of each promise for each server
let promises = []
for (let i = 0; i < server.length; i++) {
let promise = rp({
url: server[i].slug
}).then(function(response) {
// instead of nesting promises, return the promise so it is handled by
// the next then in the chain.
return auctions.count({
server: server[i].name,
lastModified: {
$gte: response.data.files[0].lastModified
}
});
}).then(function (docs) {
if (docs > 0) {
// do whatever you need to here regarding making requests and
// inserting into DB, but don't call main() here.
return requestAndInsert();
}
}).catch(function (error) {
console.log(error);
})
// add the above promise to our list.
promises.push(promise)
}
// register a new promise to run once all of the above promises generated
// by the loop have been completed
Promise.all(promises).then(function () {
// now you can call main again, optionally in a setTimeout so it waits a
// few seconds before fetching more data.
setTimeout(main, 5000);
})
})
}
main()

Observable combine multiple function calls into single Observable

I have a function that makes an http request based on a parameter, and I want to add some kind of "debounce" functionality: if the function gets called multiple times within a set time window, I want to combine the parameters into one request instead of making multiple requests.
I want to achieve this with Observables in Angular. This does not sound that complicated, yet I'm not able to get it running; maybe I'm missing something.
For now let's skip the combining into a single request, as this can be done with an aggregated debounce or an Observable.buffer. My trouble is combining the single Observables.
Here's what I've tried so far.
I tried using a Subject, as this seemed to be the proper object for this case (https://stackblitz.com/edit/angular-hcn41v?file=src%2Fapp%2Fapp.component.ts).
constructor(private http: HttpClient) {
this.makeRequest('1').subscribe(x => console.log(x))
this.makeRequest('2').subscribe(console.log)
setTimeout(() => {
this.makeRequest('3').subscribe(console.log)
}, 1000)
}
private makeRequest(id: string) {
this.observable = this.observable.pipe(
merge(Observable.of(id).pipe(delay(1)))
)
return this.aggregateDebounce(this.observable)
}
private getUrl(value) {
console.log('getUrl Call', value);
return 'https://jsonplaceholder.typicode.com/posts/1';
}
private aggregateDebounce(ob$) {
const shared$ = ob$.publishReplay(1).refCount()
return shared$.buffer(shared$.debounceTime(75))
}
I expect one 'getUrl Call' log for each function call and one result log. However, I only get results if I make more than one call to this.makeRequest(), and the result is also weird: all previous values are returned as well. I think I don't fully understand how Subject works in this case.
Another approach (taken from here RXJS: Aggregated debounce) was to create some sort of aggregate debounce (https://stackblitz.com/edit/angular-mx232d?file=src/app/app.component.ts)
constructor(private http: HttpClient) {
this.makeRequest('1').subscribe(x => console.log(x))
this.makeRequest('2').subscribe(console.log)
setTimeout(() => {
this.makeRequest('3').subscribe(console.log)
}, 1000)
}
private makeRequest(id: string) {
this.observable = this.observable.pipe(
merge(Observable.of(id).pipe(delay(1)))
)
return this.aggregateDebounce(this.observable)
}
private getUrl(value) {
console.log('getUrl Call', value);
return 'https://jsonplaceholder.typicode.com/posts/1';
}
private aggregateDebounce(ob$) {
const shared$ = ob$.publishReplay(1).refCount()
return shared$.buffer(shared$.debounceTime(75))
}
In this scenario I have the problem that I'm also getting all previous values as well.
In theory (at least to me) both variants sounded plausible, so it seems like I'm missing something. Any wink in the right direction is highly appreciated.
Edit:
As requested, I added the final real-world goal.
Imagine a service that requests information from an API. Within 50-75ms you call the service with certain ids. I want to group those ids together into a single request instead of doing 3. And if, 100ms later, another call to the service is made, a new request will be made.
this.makeRequest(1).subscribe();
private makeRequest(number: number) {
this.values.next(number);
return this.idObservable.pipe(
You emit the value before you subscribe, so the value gets lost.
private values: Subject = new Subject();
private idObservable = this.values.pipe(
private makeRequest(number: number) {
this.values.next(number);
return this.idObservable.pipe(
Every call creates a new observable based on the subject. Whenever you emit a value, all subscribers receive the value.
A possible solution could look something like this (I'm using the new rxjs syntax here):
subject: Subject<String> = null;
observable = null;
window = 100;
constructor() {
this.subject = null;
this.window = 100;
this.makeRequest('1').subscribe(console.log)
this.makeRequest('2').subscribe(console.log)
setTimeout(() => {
this.makeRequest('3').subscribe(console.log)
}, 1000)
}
private makeRequest(id: string) {
if (!this.subject) {
this.subject = new ReplaySubject()
this.observable = this.subject.pipe(
takeUntil(timer(this.window).pipe(take(1))),
reduce((url, id, index) => this.combine(url, id), baseUrl),
flatMap(url => this.request(url)),
tap(() => this.subject = null),
share()
)
}
this.subject.next(id);
return this.observable;
}
Where combine creates the url and request makes the actual request.
Rxjs is quite good at handling this kind of case. You'll need two different Subjects:
One will be used to collect and combine all requests
The second will be used for subscribing to results
When a request is made, the value will be pushed onto the first subject but the second will be returned, abstracting away the internal logic of combining requests.
private values: Subject = new Subject();
private results: Subject = new Subject();
private makeRequest(number: number) {
this.values.next(number);
return this.results;
}
The pipeline for merging the requests could be a buffer and debounceTime, as indicated in the question, or other logic as required. When a response is received, it just needs to be pushed onto the results Subject:
constructor(private http: HttpClient) {
this.values
.pipe(
buffer(this.values.pipe(debounceTime(1000))),
switchMap(values => this.getUrl(values)),
map(response => this.results.next(response)))
.subscribe();
}
I've used a switchMap to simulate an asynchronous request before pushing the response onto the results.
Full example here: https://angular-8yyvku.stackblitz.io
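The same collect-then-flush idea can be sketched without RxJS, which may help clarify what the Subject pair is doing internally. This is a simplified assumption-laden sketch: `createBatcher` and `sendBatch` are invented names, and every caller receives the whole batched result (splitting it back per id is left out).

```javascript
// Collects ids for a short window, then makes a single batched call.
// sendBatch: ids => Promise of the combined response.
function createBatcher(sendBatch, windowMs = 50) {
  let pending = [];
  let resolvers = [];
  let timer = null;

  return function request(id) {
    pending.push(id);
    const p = new Promise(resolve => resolvers.push(resolve));
    if (!timer) {
      // first call in the window starts the flush timer
      timer = setTimeout(() => {
        const ids = pending;
        const done = resolvers;
        pending = [];
        resolvers = [];
        timer = null;
        // one request for the whole window; fan the result out
        sendBatch(ids).then(result => done.forEach(r => r(result)));
      }, windowMs);
    }
    return p;
  };
}
```

Calls made within the window share one underlying request, and a call made after the window has flushed starts a fresh batch, matching the 50-75ms grouping described in the edit above.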

Make generator with recursion and promises play nicely

It's possible to make a "scrolling" request to Elasticsearch: it keeps a cursor open so you can retrieve large amounts of data piece by piece.
There's some demo code available that uses callbacks and recursion to keep fetching data until we're finished. In the Node app I'm writing, I want to stream every chunk of data into a zip or write it somewhere, then forget about it and fetch a new chunk. In the example, however, they store all the data in an array, which can cause memory issues for large amounts of data.
A generator function would be perfect: fetch some data from Elasticsearch on every .next(), write it away, then call another .next() that gets more data from the scroll endpoint, using recursion.
I'm currently really confused about how this can be achieved. We have to wait for the promise (the Elasticsearch call) to resolve, then yield the response. But it also needs to yield a recursive function, etc.
After trying different approaches for hours, the mix of generator functions, promises, and recursion has confused me. I wrote some simplified code that resembles what I'm trying to achieve:
console.clear();
// Elasticsearch search call
function searchMockPromise() {
return new Promise(resolve => {
setTimeout(() => {
let response = {};
response.hits = {
total: 50,
hits: [1, 2, 3, 4, 5]
};
resolve(response);
}, 2000);
});
}
// Elasticsearch scroll call
function scrollMockPromise() {
return new Promise(resolve => {
setTimeout(() => {
let response = {};
response.hits = {
total: 50,
hits: [1, 2, 3, 4, 5]
};
resolve(response);
}, 2000);
});
}
function* exportGenerator() {
let count = 0;
console.log("Executing search call first");
yield searchMockPromise()
.then(function* (resp) {
yield* scrollCallback(resp);
return resp.hits.hits;
});
function* scrollCallback(response) {
console.log("Executing scroll callback");
count += response.hits.hits.length;
if (response.hits.total !== count) {
console.log("Theres more data to fetch, now make a scroll call");
yield scrollMockPromise()
.then(function* (resp) {
console.log("It executed a scroll call");
yield* scrollCallback(resp);
return response.hits.hits;
});
}
}
}
function init() {
// We just want the generator to return the "response" objects from the callbacks of the Promises...
// E.g. every part of data we get from the generator, we can inject into a streaming zip or write it somewhere.
for (let data of exportGenerator()) {
const promise = yield data;
const output = yield promise;
console.log(output);
}
}
init();
Hopefully someone can point out how something like this can be achieved. Thanks!
No, this cannot be achieved with plain generators: generators and for … of are synchronous. Of course you can yield promises, but that doesn't buy you anything; you'd be better off using async/await syntax instead.
However, you'll want to have a look at the async iteration proposal.
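With async iteration (since standardized as `async function*` and `for await … of`), the scroll loop becomes straightforward. Here is a sketch using mocked search/scroll calls modelled on the ones in the question; the real Elasticsearch client calls would replace the mocks:

```javascript
// Mocked search/scroll calls, as in the question
function searchMock() {
  return Promise.resolve({ hits: { total: 10, hits: [1, 2, 3, 4, 5] } });
}
function scrollMock() {
  return Promise.resolve({ hits: { total: 10, hits: [6, 7, 8, 9, 10] } });
}

// Yields one chunk per request until all hits have been fetched,
// so only one chunk is held in memory at a time
async function* exportGenerator() {
  let response = await searchMock();
  let count = response.hits.hits.length;
  yield response.hits.hits;
  while (count < response.hits.total) {
    response = await scrollMock();
    count += response.hits.hits.length;
    yield response.hits.hits;
  }
}

async function run() {
  const chunks = [];
  for await (const chunk of exportGenerator()) {
    chunks.push(chunk); // here you would stream/zip/write the chunk instead
  }
  return chunks;
}
```

Each `await` inside the generator pauses until the Elasticsearch call resolves, and each `yield` hands one chunk to the consumer, which is exactly the "fetch, write away, fetch again" flow the question describes without any recursion.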

Firebase not receiving data before view loaded - empty array returned before filled

In the following code I save each item's key and an email address in one table, then use that key to retrieve the full object from the original table. I can see from console.log that the items are being put into the rawList array, but the function returns this.cartList before it has anything in it, so the view doesn't receive any of the data. How can I make this.cartList wait for rawList to be full before it is returned?
ionViewWillEnter() {
var user = firebase.auth().currentUser;
this.cartData.getCart().on('value', snapshot => {
let rawList = [];
snapshot.forEach(snap => {
if (user.email == snap.val().email) {
var desiredItem = this.goodsData.findGoodById(snap.val().key);
desiredItem.once("value")
.then(function(snapshot2) {
rawList.push(snapshot2);
});
return false
}
});
console.log(rawList);
this.cartList = rawList;
});
}
I have tried putting the this.cartList = rawList assignment in a number of different locations (before return false, even inside the .then callback), but that did not solve the problem.
The following function call is asynchronous, and you're falling out of scope before rawList has a chance to update, because this database call takes a reasonably long time:
desiredItem.once("value").then(function(snapshot2) {
rawList.push(snapshot2);
});
You're also pushing the snapshot directly to this list, when you should be pushing snapshot2.val() to get the raw value.
Here's how I would fix your code:
ionViewWillEnter() {
var user = firebase.auth().currentUser;
this.cartData.getCart().on('value', snapshot => {
// clear the existing `this.cartList`
this.cartList = [];
snapshot.forEach(snap => {
if (user.email == snap.val().email) {
var desiredItem = this.goodsData.findGoodById(snap.val().key);
desiredItem.once("value")
.then(snapshot2 => {
// use an arrow function so `this` still refers to the component,
// and push the raw value directly to the cartList
this.cartList.push(snapshot2.val());
});
}
return false;
});
});
}
The problem is the Promise (the async .once() call to Firebase) inside the forEach loop (which is synchronous). The forEach loop is not going to wait for the then() statement, so on the next iteration the data of the previous iteration is simply lost...
let snapshots = [1, 2, 3];
let rawList = [];
snapshots.forEach((snap) => {
console.log(rawList.length)
fbCall = new Promise((resolve, reject) => {
setTimeout(function() {
resolve("Success!");
}, 2500)
});
fbCall.then((result) => {
rawList.push(result);
});
})
Instead, you need forEach to push the whole Promise to the rawList, and then wait for them all to resolve and do something with the results.
var snapshots = [1, 2, 3];
var rawList = [];
var counter = 0;
snapshots.forEach((snap) => {
console.log(rawList.length)
var fbCall = new Promise((resolve, reject) => {
setTimeout(function() {
resolve("Success!" + counter++);
}, 1500)
});
rawList.push(fbCall);
})
Promise.all(rawList).then((res) => {
console.log(res[0]);
console.log(res[1]);
console.log(res[2]);
});
The thing is, it is still a bit awkward to assign this.cartList = Promise.all(rawList), as that makes it a Promise. So you might want to rethink your design and create something like a getCartList service (I don't know what your app is like :p).
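One way around that awkwardness is to never assign the Promise itself: keep cartList a plain array and assign the resolved values inside the then. A sketch with fake promises standing in for the firebase reads (the `component` object is an assumption for illustration):

```javascript
// Fake component holding the cart list as a plain array
const component = { cartList: [] };

// Fake firebase reads standing in for the desiredItem.once('value') calls
const rawList = [
  Promise.resolve({ name: 'apples' }),
  Promise.resolve({ name: 'pears' })
];

// Assign the resolved array inside then, so cartList stays a plain array
const ready = Promise.all(rawList).then(items => {
  component.cartList = items;
  return component.cartList;
});
```

The view then only ever sees an array (empty at first, filled once the reads resolve), which sidesteps the "cartList is a Promise" problem without restructuring the rest of the code.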
Since you're using Angular, you should also be using angularfire2, which makes use of Observables and will solve this issue for you. You will still use the plain SDK for many things, but for fetching and binding data it is not recommended to use Firebase alone without angularfire2, as that makes these things less manageable.
The nice things about this approach is that you can leverage any methods on Observable such as filter, first, map etc.
After installing it simply do:
public items$: FirebaseListObservable<any[]>;
this.items$ = this.af.database.list('path/to/data');
And in the view:
{{items$ | async}}
In order to wait for the data to appear.
Use AngularFire2 and RxJS; this will save you a lot of time, and you will do it in a proper and maintainable way by using the RxJS operators. You can learn about those operators at learnrxjs.
