How is memory cleaned in cron tasks? - javascript

The question is pretty self-explanatory, but I want to know how to deal with scheduled tasks, whether they use cron, setTimeout, setInterval, etc. Let's say I have multiple variables initialized inside the task. If the task keeps repeating, is it going to keep filling memory, or will it clean itself up after some time? This might sound simple, but I'm a beginner. I will throw my code here: I send one API request to check the value of one variable and another one to increase it by one. I could probably safely move some of the variables outside the callback, but the question remains the same.
const { GraphQLClient, gql } = require('graphql-request')
const Cron = require('croner')

// Runs every day at 14:00
const job = Cron('0 14 * * *', () => {
  async function main() {
    console.log("start")
    const endpoint = 'https://graphql.anilist.co/'
    const graphQLClient = new GraphQLClient(endpoint, {
      headers: {
        authorization: 'Token hidden for obvious reasons.',
      },
    })
    // First request: read the current progress
    const query_check = gql`
      mutation ($mediaId: Int, $status: MediaListStatus) {
        SaveMediaListEntry (mediaId: $mediaId, status: $status) {
          id
          status
          progress
        }
      }
    `
    const variables = {
      mediaId: 1337 // $mediaId is an Int, so pass a number, not a string
    }
    const check = await graphQLClient.request(query_check, variables)
    console.log(`before ${check.SaveMediaListEntry.progress}`)
    // Second request: increase the progress by one
    const query_new = gql`
      mutation ($mediaId: Int, $status: MediaListStatus, $progress: Int) {
        SaveMediaListEntry (mediaId: $mediaId, status: $status, progress: $progress) {
          id
          status
          progress
        }
      }
    `
    const new_variables = {
      mediaId: 1337,
      progress: check.SaveMediaListEntry.progress + 1 // $progress is an Int as well
    }
    const new_query = await graphQLClient.request(query_new, new_variables)
    console.log(`after ${new_query.SaveMediaListEntry.progress}`)
  }
  main().catch((error) => console.error(error))
});
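
For what it's worth: every variable declared inside the callback is local to a single run. Once main() finishes and nothing references those values anymore, the garbage collector can reclaim them, so a repeating job like this does not accumulate memory by itself. A minimal sketch of what does and does not leak (names are illustrative, not taken from the code above):

const Cron = require('croner')

Cron('* * * * *', () => {
  // `payload` exists only for this invocation; once the callback
  // returns and nothing references it, the GC is free to reclaim it.
  const payload = new Array(1000000).fill(0)
  console.log(payload.length)
})

// What WOULD leak: keeping references that outlive each run,
// e.g. pushing into an array declared outside the callback.
const history = []
Cron('* * * * *', () => {
  history.push(new Array(1000000).fill(0)) // grows on every run
})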

Related

vuejs not able to use data from useMutation on template => async problem?

I'm using a mutation from gql with vue apollo on the front end to get some data: a number and an email.
Once the mutation is finished, I assign the values to refs:
const mobile = ref('')
const email = ref('')
// mutation logic:
const {
  mutate: getContactInfo,
  onDone,
  onError
} = useMutation(
  gql`
    mutation ContactInfo($userId: String!) {
      contactInfo(userId: $userId) {
        mobile
        email
      }
    }
  `,
  () => ({
    variables: {
      userId: props.userID
    }
  })
)
onDone(data => {
  const res = data
  mobile.value = res.data.contactInfo.mobile
  email.value = res.data.contactInfo.email
})
Further down in this code I have these next two functions:
const emailContact = async () => {
  loading.value = true
  await getContactInfo()
  location.replace(`mailto:${email.value}`)
  loading.value = false
}
const smsContact = async () => {
  loading.value = true
  await getContactInfo()
  location.href = `sms:${mobile.value.slice(1)}`
  loading.value = false
}
Both work well, except that the vanilla JS method, location.replace(`mailto:${email.value}`), causes issues on mobile devices, while I can do the same with an <a> HTML tag and have no problems.
So my problem is that once I want to send the email ref to the template, so I can fire the mailto: from an <a> tag, the data is not there. Even when I console.log(email.value), I just see a blank line.
I mean, I know the data is there, and that's why I can use both previous methods, but why can't I use it in the template or log it? It seems like an asynchronous issue.
Of course, thanks in advance && any suggestion/help is greatly appreciated!
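
One thing worth checking, as a sketch rather than a confirmed fix: in vue-apollo the promise returned by mutate() resolves with the mutation result, so the value can be read from the awaited call directly instead of relying on onDone having updated the refs first (the optional chaining on the result shape is an assumption):

const emailContact = async () => {
  loading.value = true
  // mutate() resolves with the FetchResult, so read the value from it
  // directly rather than from a ref that onDone may not have set yet
  const res = await getContactInfo()
  const address = res?.data?.contactInfo?.email
  if (address) {
    email.value = address
    location.replace(`mailto:${address}`)
  }
  loading.value = false
}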

How can I connect twice to db in Cypress tests?

I need to connect to my db (Postgres) twice during an autotest: at the beginning to truncate a table, and at the end to select a new note from the table.
I tried to do it using pg-promise, but the second connection returns null or undefined (the following error shows up after asserting expected against real results: "Target cannot be null or undefined.").
If I don't execute the first connection (for truncating), the second runs normally and returns the new note from the table.
Also, if I run the truncate after the select, I get two different results depending on whether the table is empty or not: if it is empty, the result is the same error, but if there is some record initially, everything is ok and the test finishes without errors.
This is how I connect to and disconnect from the db:
const pgp = require('pg-promise')();
const postgresConfig = require(require('path').resolve('cypress.json'));

function dbConnection(query, userDefineConnection) {
  const db = pgp(userDefineConnection || postgresConfig.db);
  return db.any(query).finally(db.$pool.end)
  // return db.any(query).finally(pgp.end)
}

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}
And this is how I query the DB in the test:
describe('example to-do app', () => {
  beforeEach(() => {
    cy.task('dbQuery', {'query': 'TRUNCATE TABLE auto_db.public.requests RESTART IDENTITY CASCADE'})
  })
  it('tablist displays class', () => {
    cy.contains('button > span', 'Save').click()
    cy.task('dbQuery', {'query': 'SELECT * FROM auto_db.public.requests'})
      .then(queryResponse => {
        expect(queryResponse[0]).to.deep.contain({
          id: 1,
          author_id: 4,
          type_id: 1,
        })
      })
  })
})
If you look at the implementation of cypress-postgres, it's similar, but they break up the response/return to fit the call to pgp.end() in between.
Since it sounds like the 1st connection isn't closing, I'd suspect the .finally() call isn't working.
const pgp = require('pg-promise')();
const postgresConfig = require(require('path').resolve('cypress.json'));

function dbConnection(query, userDefineConnection) {
  const db = pgp(userDefineConnection || postgresConfig.db);
  let response = db.any(query)
  pgp.end()
  return response
}

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}
You should be able to get rid of the postgresConfig line, since the config parameter the plugin receives is the same (in fact it's better, because you might want to overwrite some config on the command line).
const pgp = require('pg-promise')();

/** @type {Cypress.PluginConfig} */
module.exports = (on, config) => {
  function dbConnection(query, userDefineConnection) {
    const db = pgp(userDefineConnection || config.db);
    let response = db.any(query)
    pgp.end()
    return response
  }
  on("task", {
    dbQuery: (query) => dbConnection(query.query, query.connection)
  });
}
The problem was that the new record in the database was not created right away, and Cypress was simply looking into the db before the record existed.
Adding a wait before the db check helped.
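
For illustration, a minimal sketch of that wait, using a fixed cy.wait before the select (the 1000 ms duration is an assumption; polling the task until a row appears would be more robust):

it('tablist displays class', () => {
  cy.contains('button > span', 'Save').click()
  // give the back-end time to actually insert the row
  cy.wait(1000)
  cy.task('dbQuery', {'query': 'SELECT * FROM auto_db.public.requests'})
    .then(queryResponse => {
      expect(queryResponse[0]).to.deep.contain({ id: 1, author_id: 4, type_id: 1 })
    })
})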

Async function overwrites array state when it should update only one item

I have a file upload component. The behavior is simple: I send one upload request to the back-end per file, and as the upload progress increases, a bar should fill up along with it.
I have a state that holds every selected file and their respective progress, as such:
interface IFiles {
  file: File;
  currentProgress: number;
}

const [selectedFiles, setSelectedFiles] = useState<IFiles[]>([]);
And when the user clicks the upload button, this function will be triggered and call uploadFile for each file in my array state.
const sendFilesHandler = async () => {
  selectedFiles.map(async (file) => {
    const fileType = file.file.type.split('/')[0];
    const formData = new FormData();
    formData.append(fileType, file.file);
    formData.append('filename', file.file.name);
    await uploadFile(formData, updateFileUploadProgress);
  });
};
Here is what the uploadFile function looks like.
const uploadFile = async (body: FormData, setPercentage: (filename: string, progress: number) => void) => {
  try {
    const options = {
      onUploadProgress: (progressEvent: ProgressEvent) => {
        const { loaded, total } = progressEvent;
        const percent = Math.floor((loaded * 100) / total);
        const fileName = body.get('filename')!.toString();
        if (percent <= 100) {
          setPercentage(fileName, percent)
        }
      }
    };
    await axios.post(
      "https://nestjs-upload.herokuapp.com/",
      body,
      options
    );
  } catch (error) {
    console.log(error);
  }
};
As you can see, when onUploadProgress is triggered it should call the setPercentage function, which is:
const updateFileUploadProgress = (fileName: string, progress: number) => {
  console.log('Before', selectedFiles);
  const currentObjectIndex = selectedFiles.findIndex((x) => fileName === x.file.name);
  const newState = [...selectedFiles];
  newState[currentObjectIndex] = {
    ...newState[currentObjectIndex],
    currentProgress: progress,
  };
  setSelectedFiles(newState);
  console.log('After', newState);
};
And this function should only update the object in my state array whose filename matches. However, it is overriding the whole thing.
Everything is fine as long as I am updating the same object, but the moment onUploadProgress is triggered for another object, the selectedFiles state becomes its initial state again. What am I missing to make this work properly?
I am not sure what the exact reason behind this behaviour is, but I came up with the unexpectedly simple solution below after spending 3 hours straight on it.
const updateFileUploadProgress = (fileName: string, progress: number) => {
  setSelectedFiles(prevState => {
    const currentObjectIndex = prevState.findIndex((x) => fileName === x.file.name);
    const newState = [...prevState];
    newState[currentObjectIndex] = {
      ...newState[currentObjectIndex],
      currentProgress: progress,
    };
    return newState;
  })
};
I was able to mimic your problem locally and solved it with the above approach. I think it happened because the callback was a stale closure: it kept referencing the selectedFiles value from the render it was created in, even after rerenders, while the functional update form always receives the current state.
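
A minimal sketch of the same stale-closure effect outside of the upload code (component and names are hypothetical):

import { useState } from 'react';

function Demo() {
  const [count, setCount] = useState(0);
  const logLater = () => {
    // This arrow function closes over the `count` of the render it was
    // created in; by the time it runs, the state may have moved on.
    setTimeout(() => console.log('stale count:', count), 2000);
  };
  // The functional form setCount(c => c + 1) avoids the problem for the
  // same reason setSelectedFiles(prevState => ...) does above: React
  // hands the callback the current state instead of a captured one.
  return <button onClick={() => { logLater(); setCount(c => c + 1); }}>+1</button>;
}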

Function ends before all asynchronous work completes

So I have a function that checks whether an order is 24 hours old, and if that's the case, sends a notification to the user. But it seems it doesn't complete the execution for all the users: some get the notification and others don't. I think I have a problem returning the promise. I'm not an expert in JavaScript and I don't really understand what is happening; sometimes, instead of trying all the documents, it just finishes when one document has an empty deviceToken and doesn't continue with the other user documents.
exports.rememberToFinishOrder = functions.pubsub.schedule('every 3 minutes').onRun(async (context) => {
  var db = admin.firestore();
  const tsToMillis = admin.firestore.Timestamp.now().toMillis()
  const compareDate = new Date(tsToMillis - (24 * 60 * 60 * 1000)) // 24 hours
  let snap = await db.collection('orders').where("timestamp","<",new Date(compareDate)).where("status", "in" ,[1,2,4,5,6]).get()
  if(snap.size > 0){
    snap.forEach(async(doc) => {
      const userId = doc.data().uid
      let userSnap = await db.collection('user').doc(userId).get()
      const deviceToken = userSnap.data().deviceToken
      const payload = {
        notification: {
          title: "¿ Did you received your order ?",
          body: "We need to know if you have received your order",
          clickAction: "AppMainActivity"
        },
        data: {
          ORDER_REMINDER: "ORDER_REMINDER"
        }
      }
      console.log("User: "+doc.data().uid)
      return admin.messaging().sendToDevice(deviceToken,payload)
    });
  }
});
Sometimes, when the deviceToken is empty for some user, the function finishes instead of continuing with the next user. It also doesn't run for every user in my orders collection: it handles some documents and not others, and this should be an atomic operation that covers the whole collection, not just some documents.
What is happening?
As andresmijares said, you are not handling the promises correctly.
When you make several asynchronous calls, I'd suggest using Promise.all(), which waits for all the promises to be done before it continues.
exports.rememberToFinishOrder = functions.pubsub.schedule('every 3 minutes').onRun(async (context) => {
  const db = admin.firestore();
  const messaging = admin.messaging();
  const tsToMillis = admin.firestore.Timestamp.now().toMillis()
  const compareDate = new Date(tsToMillis - (24 * 60 * 60 * 1000)) // 24 hours
  const snap = await db.collection('orders').where("timestamp","<",new Date(compareDate)).where("status", "in" ,[1,2,4,5,6]).get()
  let allPromises = [];
  if(snap.size > 0){
    snap.forEach((doc) => {
      const userId = doc.data().uid;
      allPromises.push(db.collection('user').doc(userId).get().then(userSnapshot => {
        const userData = userSnapshot.data();
        // check userData before reading deviceToken, so a missing user doc doesn't throw
        if (userData && userData.deviceToken) {
          const payload = {
            notification: {
              title: "¿ Did you received your order ?",
              body: "We need to know if you have received your order",
              clickAction: "AppMainActivity"
            },
            data: {
              ORDER_REMINDER: "ORDER_REMINDER"
            }
          }
          console.log("User: "+doc.data().uid)
          return messaging.sendToDevice(userData.deviceToken, payload)
        } else {
          return;
        }
      }));
    });
  }
  return Promise.all(allPromises);
});
EDIT:
I added a check to see if the deviceToken is present on the userData before sending the notification.
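
One hedged variant, not part of the original answer: Promise.all rejects as soon as any single promise rejects, so one failed send could still short-circuit the batch. If every user should be handled independently, Promise.allSettled (available in Node 12.9+) waits for all of them and reports each outcome:

// replace the final `return Promise.all(allPromises);` with:
return Promise.allSettled(allPromises).then((results) => {
  results.forEach((r, i) => {
    if (r.status === 'rejected') {
      console.error(`Notification ${i} failed:`, r.reason);
    }
  });
});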

RxJS: Batch requests and share response

Let's imagine I have a function fetchUser which takes a userId as parameter and returns an observable of the user.
As I am calling this method often, I want to batch the ids and perform one request with multiple ids instead!
Here my troubles began...
I can't find a solution to do that without sharing an observable between the different calls of fetchUser.
import { Subject, from } from "rxjs"
import { bufferTime, mergeMap, map, toArray, filter, take, share } from "rxjs/operators"

const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
  map((userId) => ({ id: userId, name: "George" })),
  toArray(),
)

const userToFetch$ = new Subject<string>()

const fetchedUser$ = userToFetch$.pipe(
  bufferTime(1000),
  mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
  share(),
)

const fetchUser = (userId: string) => {
  const observable = fetchedUser$.pipe(
    map((users) => users.find((user) => user.id === userId)),
    filter((user) => !!user),
    take(1),
  )
  userToFetch$.next(userId)
  return observable
}
But that's ugly, and it has multiple problems:
If I unsubscribe from the observable returned by fetchUser before the bufferTime timer fires, it doesn't prevent the fetch of the user.
If I unsubscribe from all the observables returned by fetchUser before the batch fetch finishes, it doesn't cancel the request.
Error handling is more complex.
Etc.
More generally: I don't know how to solve problems that require sharing resources using RxJS. It's difficult to find advanced examples of RxJS.
I think @Buggy is right.
This is the way I understand the problem and what you want to achieve:
There are different places in your app where you want to fetch users.
You do not want to fire a fetch request every time; rather, you want to buffer them and send them at a certain interval of time, let's say 1 second.
You want to be able to cancel a certain buffer and avoid firing the request that would fetch that batch of users for that 1-second interval.
At the same time, if somebody, let's call it Code at Position X, has asked for a User and just a few milliseconds later somebody else, i.e. Code at Position Y, cancels the entire batch of requests, then Code at Position X has to receive some sort of answer, let's say a null.
Moreover, you may want to be able to ask to fetch a User and then change your mind within the buffer interval, avoiding that User being fetched (I am far from sure this is really something you want, but it seems to emerge from your question).
If this is all true, then you probably have to have some sort of queuing mechanism, as @Buggy suggested.
There may then be many implementations of such a mechanism.
What you have is good, but as with everything in RxJS, the devil is in the details.
Issues
The mergeMaping
mergeMap((userIds) => functionThatSimulateAFetch(userIds)),
This is where you first go wrong. By using a mergeMap here, you are making it impossible to tell apart the "stream of requests" from the "stream returned by a single request":
You are making it near impossible to unsubscribe from an individual request (to cancel it).
You are making it impossible to handle errors.
It falls apart if your inner observable emits more than once.
Rather, what you want is to emit individual BatchEvents, via a normal map (producing an observable of observables), and switchMap/mergeMap those after the filtering.
Side effects when creating an observable & emitting before subscribing
userToFetch$.next(userId)
return observable
Don't do this. An observable by itself does not actually do anything. It's a "blueprint" for a sequence of actions that happen when you subscribe to it. By doing this, you'll only trigger a batch action on observable creation, and you're screwed if you get multiple or delayed subscriptions.
Rather, you want to create an observable with defer that emits to userToFetch$ on every subscription.
Even then, you'll want to subscribe to your observable before emitting to userToFetch$: if you aren't subscribed, your observable is not listening to the subject, and the event will be lost. You can do this in a defer-like observable.
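
A minimal sketch of that defer pattern in isolation, reusing userToFetch$ and fetchedUser$ from the question (queueMicrotask is my own way of emitting only after the subscription is active; the solution below achieves the same thing with a hand-rolled Observable):

import { defer } from "rxjs"
import { first } from "rxjs/operators"

const fetchOnSubscribe = (userId: string) =>
  defer(() => {
    // runs on every subscription: schedule the emission for after the
    // returned observable has been subscribed, so the event isn't lost
    queueMicrotask(() => userToFetch$.next(userId))
    return fetchedUser$.pipe(
      first((users) => users.some((user) => user.id === userId)),
    )
  })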
Solution
Short, and not very different from your code, but structure it like this.
const BUFFER_TIME = 1000;

type BatchEvent = { keys: Set<string>, values: Observable<Users> };

/** The incoming keys */
const keySubject = new Subject<string>();

const requests: Observable<{ keys: Set<string>, values: Observable<Users> }> =
  this.keySubject.asObservable().pipe(
    bufferTime(BUFFER_TIME),
    map(keys => this.fetchBatch(keys)),
    share(),
  );

/** Returns a single User from an ID. Batches the request */
function get(userId: string): Observable<User> {
  console.log("Creating observable for:", userId);
  // The money observable. See "defer":
  // triggers a new subject event on subscription
  const observable = new Observable<BatchEvent>(observer => {
    this.requests.subscribe(observer);
    // Emit *after* the subscription
    this.keySubject.next(userId);
  });
  return observable.pipe(
    first(v => v.keys.has(userId)),
    // There is only 1 item, so any *Map will do here
    switchMap(v => v.values),
    map(v => v[userId]),
  );
}

function fetchBatch(args: string[]): BatchEvent {
  const keys = new Set(args); // Do not batch duplicates
  const values = this.userService.get(Array.from(keys)).pipe(
    share(),
  );
  return { keys, values };
}
This does exactly what you were asking, including:
Errors are propagated to the recipients of the batch call, but nobody else
If everybody unsubscribes from a batch, the observable is canceled
If everybody unsubscribes from a batch before the request is even fired, it never fires
The observable behaves like HttpClient: subscribing to the observable fires a new (batched) request for data. Callers are free to pipe shareReplay or whatever though. So no surprises there.
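For instance (an illustrative one-liner, not from the answer above), a caller that wants late subscribers to reuse the last emitted result could write:

const user$ = get("42").pipe(shareReplay({ bufferSize: 1, refCount: true }));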
Here is a working stackblitz Angular demo: https://stackblitz.com/edit/angular-rxjs-batch-request
In particular, notice the behavior when you "toggle" the display: You’ll notice that re-subscribing to existing observables will fire new batch requests, and that those requests will cancel (or outright not fire) if you re-toggle fast enough.
Use case
In our project, we use this for Angular Tables, where each row needs to individually fetch additional data to render. This allows us to:
chunk all the requests for a "single page", without needing any special knowledge of pagination
Potentially fetch multiple pages at once if the user paginates fast
re-use existing results even if page size changes
Limitations
I would not add chunking or rate limiting to this. Because the source observable is a dumb bufferTime, you run into issues:
The "chunking" happens before the deduping, so if you have 100 requests for a single userId, you'll end up firing several requests with only 1 element.
If you rate limit, you'll not be able to inspect your queue, so you may end up with a very long queue containing multiple identical requests.
This is a pessimistic point of view, though. Fixing it would mean going all out with a stateful queue/batch mechanism, which is an order of magnitude more complex.
I'm not sure if this is the best way to solve this problem (at the very least it needs tests), but I will try to explain my point of view.
We have two queues: one for pending and one for future requests.
A Result helps deliver the response or error to subscribers.
Some kind of worker takes tasks from the queue on a schedule and performs the request.
"If I unsubscribe from the observable returned by fetchUser before the timer of bufferTime is finished, it doesn't prevent the fetch of the user."
Unsubscribing from fetchUser cleans up the request queue, and the worker does nothing.
"If I unsubscribe from all the observables returned by fetchUser before the fetch of the batch is finished, it doesn't cancel the request."
The worker subscribes with takeUntil(isNothingRemain$), so the request is aborted when no subscriber remains.
const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
  map((userId) => ({ id: userId, name: "George" })),
  toArray(),
  tap(() => console.log('API_CALL', userIds)),
  delay(200),
)

class Queue {
  queue$ = new BehaviorSubject(new Map());

  private get currentQueue() {
    return new Map(this.queue$.getValue());
  }

  add(...ids) {
    const newMap = ids.reduce((acc, id) => {
      acc.set(id, (acc.get(id) || 0) + 1);
      return acc;
    }, this.currentQueue);
    this.queue$.next(newMap);
  };

  addMap(idmap: Map<any, any>) {
    const newMap = (Array.from(idmap.keys()))
      .reduce((acc, id) => {
        acc.set(id, (acc.get(id) || 0) + idmap.get(id));
        return acc;
      }, this.currentQueue);
    this.queue$.next(newMap);
  }

  remove(...ids) {
    const newMap = ids.reduce((acc, id) => {
      acc.get(id) > 1 ? acc.set(id, acc.get(id) - 1) : acc.delete(id);
      return acc;
    }, this.currentQueue)
    this.queue$.next(newMap);
  };

  removeMap(idmap: Map<any, any>) {
    const newMap = (Array.from(idmap.keys()))
      .reduce((acc, id) => {
        acc.get(id) > idmap.get(id) ? acc.set(id, acc.get(id) - idmap.get(id)) : acc.delete(id);
        return acc;
      }, this.currentQueue)
    this.queue$.next(newMap);
  };

  has(id) {
    return this.queue$.getValue().has(id);
  }

  asObservable() {
    return this.queue$.asObservable();
  }
}

class Result {
  result$ = new BehaviorSubject({ ids: new Map(), isError: null, value: null });

  select(id) {
    return this.result$.pipe(
      filter(({ ids }) => ids.has(id)),
      switchMap(({ isError, value }) => isError ? throwError(value) : of(value.find(x => x.id === id)))
    )
  }

  add({ isError, value, ids }) {
    this.result$.next({ ids, isError, value });
  }

  clear() {
    this.result$.next({ ids: new Map(), isError: null, value: null });
  }
}

const result = new Result();
const queueToSend = new Queue();
const queuePending = new Queue();
const doRequest = new Subject();

const fetchUser = (id: string) => {
  return Observable.create(observer => {
    queueToSend.add(id);
    doRequest.next();
    const subscription = result
      .select(id)
      .pipe(take(1))
      .subscribe(observer);
    // clean up the queue after we got a response or unsubscribed
    return () => {
      (queueToSend.has(id) ? queueToSend : queuePending).remove(id);
      subscription.unsubscribe();
    }
  })
}

// some kind of worker that takes tasks from the queue and sends the requests
doRequest.asObservable().pipe(
  auditTime(1000),
  // clear outdated results
  tap(() => result.clear()),
  withLatestFrom(queueToSend.asObservable()),
  map(([_, queue]) => queue),
  filter(ids => !!ids.size),
  mergeMap(ids => {
    // abort the request if it has no subscribers
    const isNothingRemain$ = combineLatest(queueToSend.asObservable(), queuePending.asObservable()).pipe(
      map(([queueToSendIds, queuePendingIds]) => Array.from(ids.keys()).some(k => queueToSendIds.has(k) || queuePendingIds.has(k))),
      filter(hasSameKey => !hasSameKey)
    )
    // prevent requesting the same ids while the previous request is not complete
    queueToSend.removeMap(ids);
    queuePending.addMap(ids);
    return functionThatSimulateAFetch(Array.from(ids.keys())).pipe(
      map(res => ({ isError: false, value: res, ids })), // fixed typo: was `isErorr`
      takeUntil(isNothingRemain$),
      catchError(error => of({ isError: true, value: error, ids }))
    )
  }),
).subscribe(res => result.add(res))

fetchUser('1').subscribe(console.log);
const subs = fetchUser('2').subscribe(console.log);
subs.unsubscribe();
fetchUser('3').subscribe(console.log);

setTimeout(() => {
  const subs1 = fetchUser('10').subscribe(console.log);
  subs1.unsubscribe();
  const subs2 = fetchUser('11').subscribe(console.log);
  subs2.unsubscribe();
}, 2000)

setTimeout(() => {
  const subs1 = fetchUser('20').subscribe(console.log);
  subs1.unsubscribe();
  const subs21 = fetchUser('20').subscribe(console.log);
  const subs22 = fetchUser('20').subscribe(console.log);
}, 4000)
// API_CALL
// ["1", "3"]
// {id: "1", name: "George"}
// {id: "3", name: "George"}
// API_CALL
// ["20"]
// {id: "20", name: "George"}
// {id: "20", name: "George"}
stackblitz example
FYI, I tried to create a generic batched task queue using the answers of @buggy & @picci:
import { Observable, Subject, BehaviorSubject, from, timer } from "rxjs"
import { catchError, share, mergeMap, map, filter, takeUntil, take, bufferTime, timeout, concatMap } from "rxjs/operators"

export interface Task<TInput> {
  uid: number
  input: TInput
}

interface ErroredTask<TInput> extends Task<TInput> {
  error: any
}

interface SucceededTask<TInput, TOutput> extends Task<TInput> {
  output: TOutput
}

export type FinishedTask<TInput, TOutput> = ErroredTask<TInput> | SucceededTask<TInput, TOutput>

const taskErrored = <TInput, TOutput>(
  taskFinished: FinishedTask<TInput, TOutput>,
): taskFinished is ErroredTask<TInput> => !!(taskFinished as ErroredTask<TInput>).error

type BatchedWorker<TInput, TOutput> = (tasks: Array<Task<TInput>>) => Observable<FinishedTask<TInput, TOutput>>

export const createSimpleBatchedWorker = <TInput, TOutput>(
  work: (inputs: TInput[]) => Observable<TOutput[]>,
  workTimeout: number,
): BatchedWorker<TInput, TOutput> => (
  tasks: Array<Task<TInput>>,
) => work(
  tasks.map((task) => task.input),
).pipe(
  mergeMap((outputs) => from(tasks.map((task, index) => ({
    ...task,
    output: outputs[index],
  })))),
  timeout(workTimeout),
  catchError((error) => from(tasks.map((task) => ({
    ...task,
    error,
  })))),
)

export const createBatchedTaskQueue = <TInput, TOutput>(
  worker: BatchedWorker<TInput, TOutput>,
  concurrencyLimit: number = 1,
  batchTimeout: number = 0,
  maxBatchSize: number = Number.POSITIVE_INFINITY,
) => {
  const taskSubject = new Subject<Task<TInput>>()
  const cancelTaskSubject = new BehaviorSubject<Set<number>>(new Set())
  const cancelTask = (task: Task<TInput>) => {
    const cancelledUids = cancelTaskSubject.getValue()
    const newCancelledUids = new Set(cancelledUids)
    newCancelledUids.add(task.uid)
    cancelTaskSubject.next(newCancelledUids)
  }
  const output$: Observable<FinishedTask<TInput, TOutput>> = taskSubject.pipe(
    bufferTime(batchTimeout, undefined, maxBatchSize),
    map((tasks) => {
      const cancelledUids = cancelTaskSubject.getValue()
      return tasks.filter((task) => !cancelledUids.has(task.uid))
    }),
    filter((tasks) => tasks.length > 0),
    mergeMap(
      (tasks) => worker(tasks).pipe(
        takeUntil(cancelTaskSubject.pipe(
          filter((uids) => {
            for (const task of tasks) {
              if (!uids.has(task.uid)) {
                return false
              }
            }
            return true
          }),
        )),
      ),
      undefined,
      concurrencyLimit,
    ),
    share(),
  )
  let nextUid = 0
  return (input$: Observable<TInput>): Observable<TOutput> => input$.pipe(
    concatMap((input) => new Observable<TOutput>((observer) => {
      const task = {
        uid: nextUid++,
        input,
      }
      const subscription = output$.pipe(
        filter((taskFinished) => taskFinished.uid === task.uid),
        take(1),
        map((taskFinished) => {
          if (taskErrored(taskFinished)) {
            throw taskFinished.error
          }
          return taskFinished.output
        }),
      ).subscribe(observer)
      subscription.add(
        timer(0).subscribe(() => taskSubject.next(task)),
      )
      return () => {
        subscription.unsubscribe()
        cancelTask(task)
      }
    })),
  )
}
With our example:
import { of, from } from "rxjs"
import { map, toArray } from "rxjs/operators"
import { createBatchedTaskQueue, createSimpleBatchedWorker } from "mmr/components/rxjs/batched-task-queue"

const functionThatSimulateAFetch = (userIds: string[]) => from(userIds).pipe(
  map((userId) => ({ id: userId, name: "George" })),
  toArray(),
)

const userFetchQueue = createBatchedTaskQueue(
  createSimpleBatchedWorker(
    functionThatSimulateAFetch,
    10000, // work timeout in ms
  ),
)

const fetchUser = (userId: string) => {
  // of(userId) emits the id as a single value; from(userId) would
  // iterate over the string's characters instead
  return of(userId).pipe(
    userFetchQueue,
  )
}
I am open to any improvement suggestions
