I have Angular code in which I am trying to subscribe to my first API and implement a while loop inside that subscription. I then need to subscribe to another API inside the while loop. Reason: I need to subscribe to the inner API multiple times, and the while loop should end based on a flag returned by the inner API. I tried the implementation below, but it's not working. I need some help.
CallBanyanToFetchQuotes() {
  const url1 = 'http://ws.integration.banyantechnology.com/services/api/rest/ImportForQuote';
  this.http.post(url1, payload)
    .subscribe(importForQuoteResponse => {
      this.importForQuoteResponse = importForQuoteResponse;
      console.log('LoadID = ' + this.importForQuoteResponse.Load.Loadinfo.LoadID);
      this.loadId = this.importForQuoteResponse.Load.Loadinfo.LoadID;
      while (!this.ratingCompleted) {
        const url2 = 'http://ws.integration.banyantechnology.com/services/api/rest/GetQuotes';
        this.http.post(url2, payload)
          .subscribe(getQuoteResponse => {
            this.getQuoteResponse = getQuoteResponse;
            if (this.getQuoteResponse.RatingCompleted === true) {
              this.ratingCompleted = true;
            }
          });
      }
    });
}
this.http.post(url1, payload).pipe(
  switchMap(importForQuoteResponse => {
    this.importForQuoteResponse = importForQuoteResponse;
    this.loadId = this.importForQuoteResponse.Load.Loadinfo.LoadID;
    return timer(0, 1000).pipe(
      switchMap(() => this.http.post(url2, payload)),
      tap(res => this.getQuoteResponse = res),
      takeWhile(res => !res.RatingCompleted, true),
      filter(res => res.RatingCompleted === true)
    );
  })
).subscribe(() => {
  this.ratingCompleted = true;
});
a "fool example" in stackblitz
The code above can be explained like this: we make the first POST, but we don't want that subscription's value itself, so we switch it to a timer (switchMap). We don't want the timer's values either, but rather a second POST (another switchMap). Each time the timer fires, the second POST is executed and we store the response using tap. We keep making the call while the response flag is false (takeWhile); it's important to use takeWhile(..., true), because the true makes it also return the last value. Finally we filter the responses (filter), so the subscribe callback only fires when the flag is true.
NOTE: I use timer(0, 1000) to make a call every 1000 milliseconds; feel free to change the interval.
You can use expand to simulate a while loop. expand passes the input through to the destination immediately, maps to an Observable and receives its output as the next input. Map to EMPTY to end this recursion.
// move the urls out of the function if they are static
const url1 = 'http://ws.integration.banyantechnology.com/services/api/rest/ImportForQuote';
const url2 = 'http://ws.integration.banyantechnology.com/services/api/rest/GetQuotes';
callBanyanToFetchQuotes() {
  this.http.post(url1, payload).pipe(
    // process response from url1 http request
    tap(importForQuoteResponse => {
      this.importForQuoteResponse = importForQuoteResponse;
      console.log('LoadID = ' + this.importForQuoteResponse.Load.Loadinfo.LoadID);
      this.loadId = this.importForQuoteResponse.Load.Loadinfo.LoadID;
    }),
    // switch to url2 http request
    switchMap(_ => this.http.post(url2, payload)),
    // execute url2 request again if the rating is incomplete or end execution with EMPTY
    expand(quoteResponse => quoteResponse.RatingCompleted ? EMPTY : this.http.post(url2, payload))
    // process responses from url2 requests
  ).subscribe(quoteResponse => {
    this.getQuoteResponse = quoteResponse;
    if (quoteResponse.RatingCompleted === true) {
      this.ratingCompleted = true;
    }
  });
}
The expand approach guarantees that the next http call is made immediately, and only after the response to the previous one has been received.
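If you would rather pause between polls instead of firing the next request immediately, one minimal variation (not part of the original answer; the 1000 ms pause is an arbitrary assumption) is to map to a timer that then switches to the next request inside expand. It uses the same timer and switchMap imports as the earlier answer:

  // Sketch: wait roughly one second after each response before polling again; stop on RatingCompleted.
  expand(quoteResponse => quoteResponse.RatingCompleted
    ? EMPTY
    : timer(1000).pipe(switchMap(() => this.http.post(url2, payload))))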
Related
I'm trying to make a progress bar in Angular that shows the progress of a backend method that processes a big Excel file.
I use an SSE observable that emits data from the backend method I call to process the Excel file.
The method works fine, and the communication between the front end and the SSE observable works too.
The problem is that the observable's responses aren't processed by the front end until the backend method finishes, because that method is called with an await.
Let's go to the code, because I'm not sure I'm explaining it properly.
console.time('import en back');

const subscription = this.sseService
  .getServerSentEvent(environment.apiUrl + routesSse._pre + routesSse.getHippoImportProgress)
  .subscribe(async (a) => {
    const data = JSON.parse(a.data);
    console.log(data);
    this.currentRow = data.i;
    this.currentOperation = data.operacion;
    this.totalRows = data.total;
    this.progress = this.currentRow == 0 ? 0 : (this.currentRow / this.totalRows) * 100;
    this.cdr.detectChanges();
  });

const importData = await this.worksSectionService.getImportExcelHippoWorksData(rowsToSend);
console.timeEnd('import en back');
The backend generates the values for this observable correctly; I have tested it with console.logs.
But the data is not reflected in the subscription until the backend method has finished.
The values for this observable are generated inside the method that is called on the backend.
I've also tried converting the observable subscription to a promise and running it together with the method call in a Promise.all, but the result is the same.
EDIT: if I call the method:
this.worksSectionService.getImportExcelHippoWorksData(rowsToSend)
without the await (only for testing), like this:
const subscription = this.sseService
  .getServerSentEvent(environment.apiUrl + routesSse._pre + routesSse.getHippoImportProgress)
  .subscribe(async (a) => {
    const data = JSON.parse(a.data);
    console.log(data);
    this.currentRow = data.i;
    this.currentOperation = data.operacion;
    this.totalRows = data.total;
    this.progress = this.currentRow == 0 ? 0 : (this.currentRow / this.totalRows) * 100;
    this.cdr.detectChanges();
  });

this.worksSectionService.getImportExcelHippoWorksData(rowsToSend);
, the behaviour is the same
My function (let's call it myFunction) receives an array of streams (myFunction(streams: Observable<number>[])). Each of those streams produces values from 1 to 100, which act as a progress indicator; when a stream hits 100 it is done and completes. Now, when all of those observables are done, I want to emit a value. I could do it this way:
public myFunction(streams: Observable<number>[]) {
  forkJoin(streams).subscribe(_values => this.done$.emit());
}
This works fine, but imagine the following case:
myFunction gets called with 2 streams
one of those streams is done, second one is still progressing
myFunction gets called (again) with 3 more streams (2nd one from previous call is still progressing)
I'd like to somehow add those new streams from 3rd bullet to the "queue", which would result in having 5 streams in forkJoin (1 completed, 4 progressing).
I've tried multiple approaches but can't get it working anyhow... My latest approach was this:
private currentProgressObs: Observable<any> | null = null;
private currentProgressSub: Subscription | null = null;

public myFunction(progressStreams: Observable<number>[]) {
  const isUploading = this.currentProgressSub && !this.currentProgressSub.closed;
  const currentConcatObs = this.currentProgressObs?.pipe(concatAll());
  const currentStream = isUploading && this.currentProgressObs ? this.currentProgressObs : of([100]);

  if (this.currentProgressSub) {
    this.currentProgressSub.unsubscribe();
    this.currentProgressSub = null;
  }

  this.currentProgressObs = forkJoin([currentStream, ...progressStreams]);
  this.currentProgressSub = this.currentProgressObs.subscribe(
    _lastProgresses => {
      this._isUploading$.next(false); // <----- this is the event I want to emit when all progress is completed
      this.currentProgressSub?.unsubscribe();
      this.currentProgressSub = null;
      this.currentProgressObs = null;
    },
  );
}
The above code only works the first time. A second call to myFunction will never emit the event.
I also tried other ways, for example recursion with one global stream array to which I can add streams while the subscription is still active, but I failed. How can I achieve this? Which operators, and in what order, should I use? And why will or won't it work?
Here is my suggestion for your issue.
We will have two subjects: one to count the number of requests being processed (requestsInProgress) and one to manage the requests that are being processed (requestMerger).
So whenever we want to add a new request, we pass it to the requestMerger Subject.
Whenever we receive a new request in the requestMerger stream, we first increment the requestsInProgress counter and then merge the request itself into the source observable. While merging the new request/observable into the source, we also add the finalize operator in order to track when the request has completed (reached 100); when we hit the completion criterion, we decrement the counter with the decrementCounter function.
In order to emit a result, e.g. to notify someone else in the app about the state of the pending requests, we can subscribe to the requestsInProgress Subject.
You can test it out either here or in this stackBlitz
let {
  interval,
  Subject,
  BehaviorSubject
} = rxjs;
let {
  mergeMap,
  map,
  takeWhile,
  finalize,
  first,
  distinctUntilChanged
} = rxjs.operators;

// Imagine the next lines as a service

// Subject responsible for managing streams
let requestMerger = new Subject();
// Subject responsible for tracking streams in progress
let requestsInProgress = new BehaviorSubject(0);

function incrementCounter() {
  requestsInProgress.pipe(first()).subscribe(x => {
    requestsInProgress.next(x + 1);
  });
}

function decrementCounter() {
  requestsInProgress.pipe(first()).subscribe(x => {
    requestsInProgress.next(x - 1);
  });
}

// Adds a request to the requests being processed
function addRequest(req) {
  // takeWhile is used to complete the request when we have `value === 100`; if you are dealing with
  // http requests, `takeWhile` might be redundant, because http requests complete by themselves
  // (i.e. the finalize callback of the stream will be called even without the `takeWhile`,
  // which will decrement the requestsInProgress counter)
  requestMerger.next(req.pipe(takeWhile(x => x < 100)));
}

// By subscribing to this stream you can determine whether all requests are processed or some are still pending
requestsInProgress
  .pipe(
    map(x => (x === 0 ? "Loaded" : "Loading")),
    distinctUntilChanged()
  )
  .subscribe(x => {
    console.log(x);
    document.getElementById("loadingState").innerHTML = x;
  });

// This Subject takes care of the requests that are in progress
requestMerger
  .pipe(
    mergeMap(x => {
      // when a new request is added (received from the requestMerger Subject), increment the requests-in-progress counter
      incrementCounter();
      return x.pipe(
        finalize(() => {
          // when a request has completed, decrement the requests-in-progress counter
          decrementCounter();
        })
      );
    })
  )
  .subscribe(x => {
    console.log(x);
  });

// End of fictional service

// Button that adds a request to be processed
document.getElementById("add-stream").addEventListener("click", () => {
  addRequest(interval(1000).pipe(map(x => x * 25)));
});
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/6.6.6/rxjs.umd.min.js"></script>
<div style="display:flex">
<button id="add-stream">Add stream</button>
<h5>Loading State: <span id="loadingState">false</span> </h5>
</div>
Your problem is that each time you call your function, you are creating a new observable. Your life would be much easier if all calls of your function pushed all upload jobs through the same stream.
You can achieve this using a Subject.
I would suggest you push single "Upload Jobs" through a simple subject and design an observable that emits the state of all upload jobs whenever anything changes: a simple class that offers a createJob() method to submit jobs, and a jobs$ observable to reference the state:
class UploadService {
  private jobs = new Subject<UploadJob>();

  public jobs$ = this.jobs.pipe(
    mergeMap(job => this.processJob(job)),
    scan((collection, job) => collection.set(job.id, job), new Map<string, UploadJob>()),
    map(jobsMap => Array.from(jobsMap.values()))
  );

  constructor() {
    this.jobs$.subscribe();
  }

  public createJob(id: string) {
    this.jobs.next({ id, progress: 0 });
  }

  private processJob(job: UploadJob) {
    // do work and return observable that
    // emits updated status of UploadJob
  }
}
Let's break it down:
jobs is a simple subject, that we can push "jobs" through
createJob simply calls jobs.next() to push the new job through the stream
jobs$ is where all the magic happens. It receives each UploadJob and uses:
mergeMap to execute whatever function actually does the work (I called it processJob() for this example; a sketch of one possible implementation follows this list) and emit its values into the stream
scan is used to accumulate these UploadJob emissions into a Map (for ease of inserting or updating)
map is used to convert the map into an array (Map<string, UploadJob> => UploadJob[])
this.jobs$.subscribe() is called in the constructor of the class so that jobs will be processed
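One possible, purely illustrative implementation of the processJob() stub (not from the original answer; the UploadJob shape is inferred from createJob, and the 250 ms tick with five steps is an arbitrary assumption) could simulate an upload that ticks up to 100%:

  // Hypothetical stand-in for the real upload work; assumes `Observable`, `timer`, `take` and `map` are imported from rxjs.
  interface UploadJob {
    id: string;
    progress: number;
  }

  private processJob(job: UploadJob): Observable<UploadJob> {
    return timer(0, 250).pipe(
      take(5),                                  // emits progress 0, 25, 50, 75, 100
      map(i => ({ ...job, progress: i * 25 }))
    );
  }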
Now, we can easily derive your isUploading and cumulativeProgress from this jobs$ observable like so:
public isUploading$ = this.jobs$.pipe(
  map(jobs => jobs.some(j => j.progress !== 100)),
  distinctUntilChanged()
);

public progress$ = this.jobs$.pipe(
  map(jobs => {
    const current = jobs.reduce((sum, j) => sum + j.progress, 0) / 100;
    const total = jobs.length ?? current;
    return current / total;
  })
);
Here's a working StackBlitz demo.
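As a usage sketch only (not part of the original answer; the component, its template, and the injection setup are assumptions), a consumer could read these streams with the async pipe:

  // Hypothetical consumer of the UploadService above.
  import { Component } from '@angular/core';

  @Component({
    selector: 'app-uploads',
    template: `
      <div *ngIf="uploads.isUploading$ | async">
        Uploading {{ uploads.progress$ | async | percent }}
      </div>
      <button (click)="startUpload()">Upload</button>
    `
  })
  export class UploadsComponent {
    private counter = 0;

    constructor(public uploads: UploadService) {}

    startUpload() {
      this.uploads.createJob(`job-${this.counter++}`);
    }
  }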
Let's say I have a stream of actions. They're either Prompts, Responses (to prompts) or Effects. They come at irregular intervals, but assume 1 second delay between each one.
On every PROMPT action I want to emit that action and a BEGIN action (let's say we want to show the message to user for N seconds). All other items should be delayed by N seconds, after which the END action fires (hiding the message) and everything continues.
This is my code for it (for https://rxviz.com/):
const { interval, from, zip, timer } = Rx;
const { concatMap, delayWhen } = RxOperators;
const PROMPT = 'P';
const RESPONSE = 'R';
const EFFECT = 'E';
const BEGIN = '^';
const END = '&';
const convertAction = action => (action === PROMPT) ? [PROMPT, BEGIN, END] : [action];
// Just actions coming at regular intervals
const action$ = zip(
  from([PROMPT, RESPONSE, EFFECT, PROMPT, RESPONSE, EFFECT, EFFECT, EFFECT]),
  interval(1000),
  (a, b) => a,
);

action$.pipe(
  concatMap(action =>
    from(convertAction(action)).pipe(
      delayWhen(action => (action == END) ? timer(5000) : timer(0)),
    ),
  ),
);
What I really want is for the first RESPONSE action after a PROMPT not to be affected by the delay. If it comes before the END action, it should be shown right away. So, instead of
P^ &REP^ &REEE
I want to receive
P^ R &EP^R &EEE
How can I achieve this while keeping each RESPONSE after its corresponding PROMPT? Assume no events can come between a PROMPT and its RESPONSE.
If I understand it right, this is a very interesting problem to address with Observable streams. This is the way I would attack it.
First I would store in a constant actionDelayed$ the result of your original logic, i.e. a stream where, after each PROMPT, we have introduced BEGIN and END actions separated by a delay.
const actionDelayed$ = action$.pipe(
  concatMap(action =>
    from(convertAction(action)).pipe(
      delayWhen(action => (action == END) ? timer(5000) : timer(0)),
    ),
  ),
);
Then I would create 2 separate streams, response$ and promptDelayed$, containing only the RESPONSE actions before the delay was introduced and only the PROMPT actions after the delay was introduced, like this
const response$ = action$.pipe(
  filter(a => a == RESPONSE)
);

const promptDelayed$ = actionDelayed$.pipe(
  filter(a => a == PROMPT)
);
With these 2 streams, I can create another stream of RESPONSE actions that are emitted just after the delayed PROMPT actions, like this
const responseN1AfterPromptN$ = zip(response$, promptDelayed$).pipe(
  map(([r, p]) => r)
);
At this point I just have to remove all RESPONSE actions from actionDelayed$, like this
const actionNoResponseDelayed$ = actionDelayed$.pipe(
  filter(a => a != RESPONSE)
);
and merge actionNoResponseDelayed$ with responseN1AfterPromptN$ to get the final stream.
The entirety of the code, to be tried with rxviz is this
const { interval, from, zip, timer, merge } = Rx;
const { concatMap, delayWhen, share, filter, map } = RxOperators;
const PROMPT = 'P';
const RESPONSE = 'R';
const EFFECT = 'E';
const BEGIN = '^';
const END = '&';
const convertAction = action => (action === PROMPT) ? [PROMPT, BEGIN, END] : [action];
// Just actions coming at regular intervals
const action$ = zip(
  from([PROMPT, RESPONSE, EFFECT, PROMPT, RESPONSE, EFFECT, EFFECT, EFFECT]),
  interval(1000),
  (a, b) => a,
).pipe(share());

const actionDelayed$ = action$.pipe(
  concatMap(action =>
    from(convertAction(action)).pipe(
      delayWhen(action => (action == END) ? timer(5000) : timer(0)),
    ),
  ),
  share()
);

const response$ = action$.pipe(
  filter(a => a == RESPONSE)
);

const promptDelayed$ = actionDelayed$.pipe(
  filter(a => a == PROMPT)
);

const responseN1AfterPromptN$ = zip(response$, promptDelayed$).pipe(
  map(([r, p]) => r)
);

const actionNoResponseDelayed$ = actionDelayed$.pipe(
  filter(a => a != RESPONSE)
);
merge(actionNoResponseDelayed$, responseN1AfterPromptN$)
The use of the share operator when creating the action$ and actionDelayed$ streams avoids repeated subscriptions to these streams when the subsequent streams used in the solution are created.
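As an aside (illustration only, not part of the answer; it assumes take is also pulled from RxOperators): a cold observable restarts its producer for every subscriber, while share() multicasts a single execution to all overlapping subscribers.

  const cold$ = interval(1000).pipe(take(3));            // each subscriber starts its own interval
  const shared$ = interval(1000).pipe(take(3), share()); // overlapping subscribers share one interval

  cold$.subscribe(i => console.log('cold A', i));
  cold$.subscribe(i => console.log('cold B', i));        // runs independently of 'cold A'

  shared$.subscribe(i => console.log('shared A', i));
  shared$.subscribe(i => console.log('shared B', i));    // receives the same ticks as 'shared A'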
It may not work this way because you're using concatMap. As you know, it waits for the inner observable to complete before starting to process (to subscribe to) the pending ones. It internally uses a buffer, so that if an inner observable is still active (has not completed), the emitted value is added to that buffer. When the inner observable becomes inactive, the oldest value from the buffer is taken and a new inner observable is created, based on the provided callback function.
There is also delayWhen, which emits a complete notification after all of its pending observables complete:
// called when an inner observable sends a `next`/`complete` notification
const notify = () => {
  // Notify the consumer.
  subscriber.next(value);
  // Ensure our inner subscription is cleaned up
  // as soon as possible. Once the first `next` fires,
  // we have no more use for this subscription.
  durationSubscriber?.unsubscribe();
  if (!closed) {
    active--;
    closed = true;
    checkComplete();
  }
};
checkComplete() will check if there is a need to send a complete notification to the main stream:
const checkComplete = () => isComplete && !active && subscriber.complete();
We've seen that active decreases in notify(). isComplete becomes true when the main source completes:
// this is the `complete` callback
() => {
  isComplete = true;
  checkComplete();
}
So, this is why it does not work this way:
the PROMPT action is used to create the concatMap's first inner observable
the observable emits 3 consecutive actions [PROMPT, BEGIN, END]
the first 2 will get timer(0), whereas the third one, END, will get timer(5000); notice that the isComplete variable is set to true at this point, even before the PROMPT action has been emitted, because from() completes synchronously in this case
so there is a timer(5000) that keeps the inner observable active; then a RESPONSE is emitted from the action$ stream, but since there is no room for it yet, it is added to the buffer, and an inner observable will be created for it only when timer(5000) finally expires
A way to solve this might be to replace concatMap with mergeMap.
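For reference, that substitution in the original rxviz snippet would look like this (mergeMap needs to be pulled from RxOperators alongside the other operators); this is only a sketch of the suggested change, not a verified solution to the desired marble output:

  const { concatMap, delayWhen, mergeMap } = RxOperators;

  action$.pipe(
    mergeMap(action =>
      from(convertAction(action)).pipe(
        delayWhen(a => (a == END) ? timer(5000) : timer(0)),
      ),
    ),
  );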
I am calling an API endpoint for one of Steam's games through their web API, using axios and promises in Node.js. Each JSON response from the endpoint returns 100 match objects, of which only about 10 to 40 (on average) are of interest to my use case. Moreover, I have observed that the data tends to be repeated if the endpoint is called many times within, say, a split second.
What I am trying to achieve is to collect 100 match_ids (not whole match objects) that fit my criteria in an array by continuously (recursively) calling the API until I have 100 unique match_ids that serve my purpose.
I am aware that calling the endpoint within a loop is naive and exceeds the limit of 1 request per second set by their web API. This is why I've resorted to recursion, to ensure that each promise is resolved and the array filled with match_ids before proceeding. The issue I am having is that my code does not terminate, and at each stage of the recursive calls the values are the same (e.g. the last match id, the accumulated array, etc.).
function makeRequestV2(matchesArray, lastId) {
  // base case
  if (matchesArray.length >= BATCH_SIZE) {
    console.log(matchesArray);
    return;
  }

  steamapi
    .getRawMatches(lastId)
    .then(response => {
      const matches = response.data.result.matches;
      // get the last id of fetched chunk (before filter)
      const lastIdFetched = matches[matches.length - 1].match_id;
      console.log(`The last Id fetched: ${lastIdFetched}`);

      let filteredMatches = matches
        .filter(m => m.lobby_type === 7)
        .map(x => x.match_id);

      // removing potential dups
      matchesArray = [...new Set([...matchesArray, ...filteredMatches])];

      // recursive api call
      makeRequestV2(matchesArray, lastIdFetched);
    })
    .catch(error => {
      console.log(
        "HTTP " + error.response.status + ": " + error.response.statusText
      );
    });
}

makeRequestV2(_matchIds);
// this function lies in a different file where the axios call happens
module.exports = {
  getRawMatches: function(matchIdBefore) {
    console.log("getRawMatches() executing.");

    let getURL = `${url}${config.ENDPOINTS.GetMatchHistory}/v1`;
    let parameters = {
      params: {
        key: `${config.API_KEY}`,
        min_players: `${initialConfig.min_players}`,
        skill: `${initialConfig.skill}`
      }
    };

    if (matchIdBefore) {
      parameters.start_at_match_id = `${matchIdBefore}`;
    }

    console.log(`GET: ${getURL}`);
    return axios.get(getURL, parameters);
  }
}
I'm not exceeding the request limits and all that, but the same results keep coming up.
BATCH_SIZE is 100 and
_matchIds = []
I would start with replacing the line:
matchesArray = [...new Set([...matchesArray, ...filteredMatches])];
with this one:
filteredMatches.filter(item => matchesArray.indexOf(item) === -1).forEach(item => {
  matchesArray.push(item);
});
What you were doing was effectively replacing the matchesArray variable inside your function with a new reference: the variable you passed in as a function parameter from outside was no longer the same variable inside the function. If you use matchesArray.push, you do not change the reference, and the variable in the outer scope is correctly updated, just as you intend.
This is why _matchIds remains empty: each time makeRequestV2 is called, the inner variable matchesArray becomes detached from the outer scope (during the assignment statement), and although it gets populated, the outer-scope variable still points to the original reference and stays untouched.
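A minimal illustration of the difference (not from the original code; the plain number arrays are just for demonstration):

  function reassign(arr) {
    arr = [...arr, 4];   // rebinds the local parameter only; the caller's array is untouched
  }

  function mutate(arr) {
    arr.push(4);         // mutates the array object the caller also references
  }

  const outer = [1, 2, 3];
  reassign(outer);
  console.log(outer);    // [1, 2, 3]
  mutate(outer);
  console.log(outer);    // [1, 2, 3, 4]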
I want to make sure that Observable.subscribe() doesn't get executed if a different Observable yields true.
An example use case would be making sure that user can trigger a download only if the previous one has finished (or failed) and there's only one download request executed at a time.
In order to control the execution flow, I had to rely on a state variable, which seems a bit odd to me. Is this a good pattern? In the (very likely) case that it isn't, what would be a better approach?
I ended up with two subscriptions: Actions.sync (using a Subject; the public API that initialises a sync request) and isActive (resolves to true or false; the name should be pretty self-explanatory).
let canDownload = true; // this one feels really, really naughty

const startedSyncRequests = new Rx.Subject();
const completeSyncRequests = new Rx.Subject(); // declared here for completeness
const failedSyncRequests = new Rx.Subject();   // declared here for completeness

const isActiveSync = startedSyncRequests.map(true)
  .merge(completeSyncRequests.map(false))
  .merge(failedSyncRequests.map(false))
  .startWith(false)
  .subscribe(isActive => canDownload = !isActive);

syncResources = () => {
  startedSyncRequests.onNext();

  // Mocked async job
  setTimeout(() => {
    completeSyncRequests.onNext();
  }, 1000);
};

Actions.sync
  .filter(() => canDownload) // so does this
  .subscribe(syncResources);
You want exclusive().
Actions.sync
  .map(() => {
    // Return a promise or observable
    return Rx.Observable.defer(() => makeAsyncRequest());
  })
  .exclusive()
  .subscribe(processResults);
The above will generate an observable every time the user makes a request. However, exclusive will drop any observables that come in before the previous one has completed, and then flatten the resulting messages into a single observable.
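For what it's worth, in newer RxJS versions (5+), where exclusive() no longer exists, the closest built-in equivalent of this pattern is exhaustMap, which ignores new source values while the current inner observable is still running. A rough sketch (reusing the hypothetical makeAsyncRequest and processResults names from above, with a Subject standing in for Actions.sync):

  import { defer, Subject } from 'rxjs';
  import { exhaustMap } from 'rxjs/operators';

  const sync$ = new Subject<void>();

  // Ignore new sync requests while one is already in flight.
  sync$
    .pipe(exhaustMap(() => defer(() => makeAsyncRequest())))
    .subscribe(processResults);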
Working example (using interval and delay):
var interval = Rx.Observable.interval(1000).take(20);

interval
  .map(function(i) {
    return Rx.Observable.return(i).delay(1500);
  })
  .exclusive()
  // Only prints every other item because of the overlap
  .subscribe(function(i) {
    var item = $('<li>' + i + '</li>');
    $('#thelist').append(item);
  });
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/2.5.3/rx.all.js"></script>
<div>
<ul id="thelist">
</ul>
</div>
Reference: here