Wait for Observable to complete in order to submit a form - javascript

I have a 'new trip' form, where the user can write the names of the participants and then submit the form to create the trip.
On submit, I query a Firebase database with the names, in order to get the IDs of the participants (/users). I then add the IDs to the participantsID field of the trip object and then I push the new trip to Firebase.
The problem is that the Firebase query is async and returns an Observable, therefore my function will proceed to push the object before the Observable has completed, so the participantsID field of the new object is empty.
Is there any way to wait for the Observable to complete (in a kind of synchronous way) so that I can manipulate the data and then proceed? All my attempts to fix this have failed so far.
Here's my simple code.
getUserByAttribute(attribute, value) {
  return this.db.list('/users', {
    query: {
      orderByChild: attribute,
      equalTo: value,
      limitToFirst: 1
    }
  });
}
createTrip(trip) {
  for (const name of trip.participantsName.split(',')) {
    this.getUserByAttribute('username', name)
      .subscribe(user => trip.participantsID.push(user[0].$key));
  }
  // This line runs before the subscriptions above have fired,
  // so participantsID is still empty at this point.
  this.db.list('/trips').push(trip);
}

You could combine all the Observables into a single Observable using forkJoin:
createTrip(trip) {
  const observableArray = trip.participantsName.split(',')
    .map(name => this.getUserByAttribute('username', name));
  Observable.forkJoin(observableArray).subscribe(users => {
    users.forEach(user => trip.participantsID.push(user[0].$key));
    this.db.list('/trips').push(trip);
  });
}

In the end I used part of @Pankaj Parkar's answer to solve the problem.
I forkJoin all the Observables produced by mapping over the split names, and I subscribe to the combined Observable, whose result is an array of arrays in which each inner array contains a single user object. Note the .first() added to getUserByAttribute: forkJoin only emits once every source Observable completes, and Firebase list observables stay open, so .first() is what makes each of them complete.
getUserByAttribute(attribute, value) {
  return this.db.list('/users', {
    query: {
      orderByChild: attribute,
      equalTo: value,
      limitToFirst: 1
    }
  }).first(); // complete after the first emission, so forkJoin can fire
}
createTrip(trip) {
  Observable.forkJoin(
    trip.participantsName.split(',')
      .map(name => this.getUserByAttribute('name', name))
  ).subscribe(participants => {
    trip.participants = participants.map(p => p[0].$key);
    this.tripService.createTrip(trip);
  });
}

You have a difficult problem: you have to get the users' info before pushing a new trip.
You can't just create new subscriptions every time, because of memory leaks (or you have to be careful with unsubscribing). If you are using Firebase, you can use AngularFire's subject support: you can drive a single subscription by putting a Subject in your query (as the equalTo value) and pushing the next user to retrieve with .next(user).
You then still have to wait for all the users. For that, you can keep only one subscription and collect the IDs one after another, or hold multiple subscriptions to get multiple results faster (but that is harder to manage).
To solve this problem, I created:
a queue of callbacks (just arrays, used with the push() and unshift() methods)
a queue of values
one Subject for one subscription.
If you want an ID, you have to:
push the value
push the callback that will receive the retrieved value.
You should wrap the pushes in functions, because you have to call .next() yourself when the queue is empty (to get things started!).
Then, in the subscription's callback, i.e. when you receive the remote user object, you call the first callback in the queue. Don't forget to pop the value and the callback off their queues, and to call next() with the next value if there is one.
This way, you can push your trip inside the last callback, for the last user. And since it's all callbacks, your app is never blocked. A minimal sketch of the pattern follows below.
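Here is a minimal sketch of that queue pattern, assuming getUserByAttribute accepts a Subject as its query value (AngularFire supports dynamic query parameters this way); queryName$, pendingValues, pendingCallbacks and requestUser are my names, not from the original post:

import { Subject } from 'rxjs';

const queryName$ = new Subject<string>();            // drives the single query
const pendingValues: string[] = [];                  // names waiting to be looked up
const pendingCallbacks: Array<(user: any) => void> = [];

// One long-lived subscription serves every queued lookup, in order.
getUserByAttribute('username', queryName$).subscribe(users => {
  pendingValues.shift();                             // this value has been served
  const callback = pendingCallbacks.shift()!;        // oldest callback first
  callback(users[0]);
  if (pendingValues.length > 0) {
    queryName$.next(pendingValues[0]);               // trigger the next lookup
  }
});

function requestUser(name: string, callback: (user: any) => void) {
  pendingValues.push(name);
  pendingCallbacks.push(callback);
  if (pendingValues.length === 1) {
    queryName$.next(name);                           // queue was empty: start it
  }
}

To create a trip you would call requestUser() once per name and push the trip inside the callback for the last one.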
I still haven't decided whether we should do this in a cloud function, because the user has to stay connected, and this uses their data/processor. On the other hand, it's good to have all the code in the same place, and cloud functions are limited on Firebase's free plan. What would a Firebase developer advise?
I searched a lot for a better solution, so please share one if you have it. It's a little complicated, I think, but it works very well. I had the same problem when a user wants to add a new flight: I need to get the airports' information first (coordinates) and push multiple objects (details, maps, etc.)

Related

How to execute a sort after loading in data into an array in UseEffect - React Native

I'm trying to create a chat app and there is a small issue. Whenever I load my messages from Firebase, they appear in the chat app unsorted, so I'm attempting to sort the messages by timestamp so they appear in order. I can do this if I move the sort and setMessages inside onReceive in useEffect, but I feel like this would be pretty inefficient, because it would sort and call setMessages once for every message retrieved from Firebase. I want to do it all at the end, after all the messages are loaded into the array.
Right now with my logs, I get this:
[REDACTED TIME] LOG []
[REDACTED TIME] LOG pushing into loadedMessages
[REDACTED TIME] LOG pushing into loadedMessages
So it's printing the (empty) array first, then loading in messages. How can I make sure this is done in the correct order?
useEffect(() => {
  // Gets User ID
  fetchUserId(getUserId());
  const messagesRef = firebase.database().ref(`${companySymbol}Messages`);
  messagesRef.off();
  const onReceive = async (data) => {
    const message = data.val();
    const iMessage = {
      _id: message._id,
      text: message.text,
      createdAt: new Date(message.createdAt),
      user: {
        _id: message.user._id,
        name: message.user.name,
      },
    };
    loadedMessages.push(iMessage);
    console.log('pushing into loadedMessages');
  };
  messagesRef.on('child_added', onReceive);
  loadedMessages.sort(
    (message1, message2) => message2.createdAt - message1.createdAt,
  );
  console.log(loadedMessages);
  return () => {
    console.log('useEffect Return:');
    messagesRef.off();
  };
}, []);
I think the perspective here is a bit off.
The right way would be to fetch the Firebase data already sorted.
Firebase has built-in sorting, although it comes with its limitations.
In my opinion, you should try something like:
const messagesRef = firebase.database().ref(`${companySymbol}Messages`);
messagesRef.orderByChild("createdAt").on("child_added", function (snapshot) {
  // the callback function, invoked once per message, in sorted order
  console.log(snapshot.val());
});
And if I may add one more thing: fetching every single message since the dawn of time can get a bit hairy once you have over a thousand or so, so I would recommend limiting the query. That can be achieved using the built-in limit functions, for example limitToLast(1000), as sketched below.
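Combined with the sorted query above, that would look roughly like this (the 1000 is just an illustrative cutoff):

// Sorted query, capped to the most recent 1000 messages by createdAt
messagesRef
  .orderByChild("createdAt")
  .limitToLast(1000)
  .on("child_added", function (snapshot) {
    console.log(snapshot.val());
  });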
Good luck!
Well, the name of the database is "Realtime Database". You are using the "child_added" listener, which is triggered every time a new object gets added to the Messages collection. The onReceive callback should do the sorting - otherwise the messages won't be in the correct order. Yes, that is inefficient for the first load, as "child_added" will most probably be triggered for every item already in the collection and you'll be repeating the sort.
What you could explore as an alternative is a .once() read (https://firebase.google.com/docs/database/web/read-and-write#read_data_once) the first time you populate the data in your app. This will return all the data you need. After that is complete, you can attach your "child_added" listener and only listen for new objects. This way onReceive isn't called for every existing item the first time, and afterwards it already makes sense to sort on every new item that comes in.
Also have a look at sorting: https://firebase.google.com/docs/database/web/lists-of-data#sorting_and_filtering_data
You might be able to return the messages in the correct order.
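A rough sketch of that initial-load-then-listen approach (setMessages and companySymbol come from the question; the startAt guard is my addition, since a plain child_added listener also fires for every existing child):

const messagesRef = firebase.database().ref(`${companySymbol}Messages`);

// One-time read: the snapshot's children arrive ordered by createdAt
messagesRef.orderByChild('createdAt').once('value').then(snapshot => {
  const loaded = [];
  snapshot.forEach(child => { loaded.push(child.val()); });
  setMessages(loaded);

  // Listen only for messages newer than the ones we already have
  const lastCreatedAt = loaded.length ? loaded[loaded.length - 1].createdAt : 0;
  messagesRef.orderByChild('createdAt').startAt(lastCreatedAt + 1)
    .on('child_added', data => {
      setMessages(previous => [...previous, data.val()]);
    });
});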
And also - if you need richer queries - look at Firestore...

Angular: Ensure Services is Complete before Running Next Step

We are currently using Angular.
A component receives data from an API. After getting the API data, it goes through data services which transform and customize the data: concatenating first/last names, rounding dollar amounts, making calculations, etc.
The last step tries to populate the sales year in a dropdown, after all the data has been parsed.
this.webStoreSearchHttpService.GetAllCustomerSalesData(this.customerId).subscribe((response) => {
  this.customerList = customerDataService.createCustomerList(response);
  this.productList = customerDataService.createProductAnalysis(response);
  this.salesList = customerDataService.createSalesList(response);
  this.salesYearList = customerDataService.createYearList(response);
  this.salesYearItemCurrent = _.cloneDeep(this.salesYearList[0]); // this goes into a Mat Select dropdown
});
However, the correlated data does not appear after selecting from the dropdown, because the data services have not finished parsing/creating it yet, even though they run inside the original API subscribe.
What I am trying to do is make sure all four data service calls are totally complete, and then populate the sales year. How can this be done with Angular/TypeScript?
The data service calls can run in parallel; the last step is populating the sales year dropdown.
The methods return class arrays, not promises or observables.
Update
You added the sentence The methods return class arrays, not promises or observables. This implies that you have no way from the outside to wait for asynchronous calls to finish. Hence you have to change the return values of the customerDataService methods; a sketch follows below. I am assuming that something asynchronous happens inside these methods, because you say What I am trying to do, is make sure all 4 Data services are totally complete.
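For example, one of the methods could be reshaped to return an Observable (a hedged sketch; buildCustomerList stands in for the existing transformation logic, which I'm assuming is asynchronous):

import { Observable } from 'rxjs';

class CustomerDataService {
  createCustomerList(response: any): Observable<any[]> {
    return new Observable(subscriber => {
      // ... whatever asynchronous work happens today, then:
      subscriber.next(buildCustomerList(response));
      subscriber.complete(); // completion is what lets forkJoin below fire
    });
  }
}

If the work inside is actually synchronous, you could simply return of(buildCustomerList(response)); but then there would be nothing to wait for in the first place.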
Old version
To answer your question, one has to know the return type of the customerDataService methods. Do they return a Promise or an Observable? Depending on that, you can use Promise.all or the forkJoin operator to wait for all methods to finish and then populate the select. This is an example using observables:
this.webStoreSearchHttpService.GetAllCustomerSalesData(this.customerId).subscribe(response => {
  forkJoin([
    customerDataService.createCustomerList(response),
    customerDataService.createProductAnalysis(response),
    customerDataService.createSalesList(response),
    customerDataService.createYearList(response)
  ]).subscribe(([customerList, productList, salesList, salesYearList]) => {
    this.customerList = customerList;
    this.productList = productList;
    this.salesList = salesList;
    this.salesYearList = salesYearList;
    this.salesYearItemCurrent = _.cloneDeep(this.salesYearList[0]);
  });
});
or, even better, avoid the inner subscription and keep only a single subscription (flatMap is an alias of mergeMap in RxJS):
this.webStoreSearchHttpService.GetAllCustomerSalesData(this.customerId).pipe(
  flatMap(response =>
    forkJoin([
      customerDataService.createCustomerList(response),
      customerDataService.createProductAnalysis(response),
      customerDataService.createSalesList(response),
      customerDataService.createYearList(response)
    ])
  )
).subscribe(([customerList, productList, salesList, salesYearList]) => {
  this.customerList = customerList;
  this.productList = productList;
  this.salesList = salesList;
  this.salesYearList = salesYearList;
  this.salesYearItemCurrent = _.cloneDeep(this.salesYearList[0]);
});

How to push to Observable of Array in Angular 4? RxJS

I have a property on my service class as so:
articles: Observable<Article[]>;
It is populated by a getArticles() function using the standard http.get().map() solution.
How can I manually push a new article into this array - one that is not yet persisted and so not part of the HTTP get?
My scenario is: you create a new Article, and before it is saved I would like the Article[] array to have this new one pushed onto it, so it shows up in my list of articles.
Furthermore, this service is shared between two components. If component A consumes the service in ngOnInit() and binds the result to a repeating section (*ngFor), will updating the service's array from component B simultaneously update the results in component A's *ngFor section? Or must I update the view manually?
Many Thanks,
Simon
As you said in comments, I'd use a Subject.
The advantage of keeping articles as an observable rather than storing it as an array is that http takes time: you can subscribe and wait for results. Plus, both components get any updates.
// Mock http
const http = {
  get: (url) => Rx.Observable.of(['article1', 'article2'])
};

// ReplaySubject(1) so late subscribers (like add below) still get the latest value
const articles = new Rx.ReplaySubject(1);

const fetch = () => {
  return http.get('myUrl').do(data => articles.next(data));
};

const add = (article) => {
  articles.take(1).subscribe(current => {
    current.push(article);
    articles.next(current);
  });
};

// Subscribe to articles
articles.subscribe(console.log);

// Action
fetch().subscribe(() => add('article3'));
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.5.2/Rx.js"></script>
Instead of storing the whole observable, you probably want to just store the article array, like:
articles: Article[];

fetch() {
  this.get(url).map(...).subscribe(articles => this.articles = articles);
}
Then you can manipulate the articles list using standard array manipulation methods.
If you store the observable, it will re-run the HTTP call every time you subscribe to it (or render it using | async), which is definitely not what you want.
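(If you do want to keep exposing an observable while avoiding a new HTTP call per subscriber, RxJS can share one result among all subscribers; a hedged sketch in RxJS 5 style:)

// One HTTP request, replayed to every subscriber
this.articles = this.http.get(url)
  .map(res => res.json())
  .publishReplay(1) // remember the latest emission for late subscribers
  .refCount();      // stay connected while at least one subscriber exists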
But for the sake of completeness: if you do have an Observable of an array you want to add items to, you could use the map operator on it to append a given item, e.g.
observable.map(previousArray => previousArray.concat(itemToBeAdded))
An example from the Angular 4 book ng-book (rewritten here with a BehaviorSubject, since the pattern relies on subject.getValue(), which only BehaviorSubject provides):
example = new BehaviorSubject<string[]>([]);

push(newValue: string): void {
  this.example.next(this.example.getValue().concat(newValue));
}
What the call to example.next does is take the current array value stored in the subject, concat a new value onto it, and emit the new array to all subscribers. This only works with a BehaviorSubject, because it holds onto the last value it stored and exposes it through its getValue() method.
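A quick usage sketch of that pattern (names are illustrative):

import { BehaviorSubject } from 'rxjs';

const example = new BehaviorSubject<string[]>([]);
example.subscribe(list => console.log(list)); // logs [] immediately

function push(newValue: string): void {
  example.next(example.getValue().concat(newValue));
}

push('article1'); // logs ['article1']
push('article2'); // logs ['article1', 'article2']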

Passing down arguments using Facebook's DataLoader

I'm using DataLoader for batching the requests/queries together.
In my loader function I need to know the requested fields to avoid having a SELECT * FROM query but rather a SELECT field1, field2, ... FROM query...
What would be the best approach using DataLoader to pass down the resolveInfo needed for it? (I use resolveInfo.fieldNodes to get the requested fields)
At the moment, I'm doing something like this:
await someDataLoader.load({ ids, args, context, info });
and then in the actual loaderFn:
const loadFn = async options => {
  const ids = [];
  let args;
  let context;
  let info;

  options.forEach(a => {
    ids.push(a.ids);
    if (!args && !context && !info) {
      args = a.args;
      context = a.context;
      info = a.info;
    }
  });

  return new DataProvider().get({ ...args, ids }, context, info);
};
but as you can see, it's hacky and doesn't really feel good...
Does anyone have an idea how I could achieve this?
I am not sure there is a good answer to this question, simply because Dataloader is not made for this use case, but I have worked extensively with Dataloader, written similar implementations, and explored similar concepts in other programming languages.
Let's understand why Dataloader is not made for this use case and how we can still make it work (roughly as in your example).
Dataloader is not made for fetching a subset of fields
Dataloader is made for simple key-value lookups. That means, given a key like an ID, it will load the value behind it. For that it assumes that the object behind the ID will always be the same until it is invalidated. This is the single assumption that enables the power of Dataloader. Without it, the three key features of Dataloader won't work anymore (a minimal example follows the list):
Batching requests (multiple requests are done together in one query)
Deduplication (requests to the same key twice result in one query)
Caching (consecutive requests of the same key don't result in multiple queries)
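For reference, a minimal Dataloader showing those three behaviours (the batch function here just echoes its keys):

import DataLoader from 'dataloader';

const loader = new DataLoader(async keys => {
  console.log('batch:', keys);        // one query for many loads
  return keys.map(k => `user-${k}`);  // one result per key, in key order
});

loader.load(1); // }
loader.load(2); // }  batched together: logs "batch: [ 1, 2 ]"
loader.load(1); //    deduplicated and cached: no extra key, no second batch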
This leads us to the following two important rules if we want to maximise the power of Dataloader:
Two different entities cannot share the same key, otherwise we might return the wrong entity. This sounds trivial, but it is not in your example. Let's say we want to load the user with ID 1 and the fields id and name. A little later (or at the same time) we want to load the user with ID 1 and the fields id and email. These are technically two different entities, and they need different keys.
The same entity should have the same key all the time. Again this sounds trivial, but it really is not in the example. The user with ID 1 and fields id and name should be the same as the user with ID 1 and fields name and id (notice the order).
In short a key needs to have all the information needed to uniquely identify an entity but not more than that.
So how do we pass down fields to Dataloader
await someDataLoader.load({ ids, args, context, info });
In your question you have provided a few more things to your Dataloader as a key. First, I would not put args and context into the key. Does your entity change when the context changes (e.g. you are querying a different database now)? Probably yes, but do you want to account for that in your dataloader implementation? I would instead suggest creating new dataloaders for each request, as described in the docs.
Should the whole request info be in the key? No, but we need the fields that are requested. Apart from that, your provided implementation is wrong: it would break when the loader is called with two different resolve infos. You only set the resolve info from the first call, but it might really be different on each object (think about the first user example above). Ultimately, we could arrive at the following implementation of a dataloader:
// This function creates unique cache keys for different selected fields
function cacheKeyFn({ id, fields }) {
  const sortedFields = [...new Set(fields)].sort().join(';');
  return `${id}[${sortedFields}]`;
}

function createLoaders(db) {
  const userLoader = new Dataloader(async keys => {
    // Create a set with all requested fields
    const fields = keys.reduce((acc, key) => {
      key.fields.forEach(field => acc.add(field));
      return acc;
    }, new Set());
    fields.add('id'); // we need the id to map rows back to keys
    // Get all our ids for the DB query
    const ids = keys.map(key => key.id);
    // Please be aware of possible SQL injection, don't copy + paste
    const result = await db.query(`
      SELECT
        ${[...fields].join(', ')}
      FROM
        user
      WHERE
        id IN (${ids.join()})
    `);
    // Dataloader expects one result per key, in the same order as the keys
    const rowsById = new Map(result.map(row => [row.id, row]));
    return keys.map(key => rowsById.get(key.id));
  }, { cacheKeyFn });
  return { userLoader };
}

// now in a resolver
resolve(parent, args, ctx, info) {
  // https://www.npmjs.com/package/graphql-fields
  return ctx.userLoader.load({ id: args.id, fields: Object.keys(graphqlFields(info)) });
}
This is a solid implementation, but it has a few weaknesses. First, we are overfetching a lot of fields if there are different field requirements in the same batch request. Second, once we have cached an entity under the key 1[id;name], we could (at least in JavaScript) also answer the keys 1[id] and 1[name] with that object. Here we could build a custom map implementation to supply to Dataloader; it would be smart enough to know these things about our cache. A sketch follows below.
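A hedged sketch of such a custom cache map (Dataloader's cacheMap option and its get/set/delete/clear contract are real; the subset-matching logic and the SupersetCache name are mine):

// Cache map that can answer a subset-field key from a superset entry.
// Keys arrive here already transformed by cacheKeyFn, e.g. "1[id;name]".
class SupersetCache {
  constructor() { this.map = new Map(); }

  parse(key) {
    const [, id, fieldList] = /^(.*)\[(.*)\]$/.exec(key);
    return { id, fields: fieldList ? fieldList.split(';') : [] };
  }

  get(key) {
    if (this.map.has(key)) return this.map.get(key);
    // Fall back to any entry for the same id whose fields are a superset
    // of the requested ones (linear scan; fine for per-request caches).
    const wanted = this.parse(key);
    for (const [storedKey, value] of this.map) {
      const stored = this.parse(storedKey);
      if (stored.id === wanted.id &&
          wanted.fields.every(f => stored.fields.includes(f))) {
        return value;
      }
    }
    return undefined;
  }

  set(key, value) { this.map.set(key, value); }
  delete(key) { return this.map.delete(key); }
  clear() { this.map.clear(); }
}

// usage: new Dataloader(batchFn, { cacheKeyFn, cacheMap: new SupersetCache() })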
Conclusion
We see that this is really a complicated matter. I know it is often listed as a benefit of GraphQL that you don't have to fetch all fields from a database for every query, but the truth is that in practice it is seldom worth the hassle. Don't optimise what is not slow. And even if it is slow, is it a bottleneck?
My suggestion is: write trivial Dataloaders that simply fetch all (needed) fields. If you have one client, it is very likely that for most entities the client fetches all fields anyway; otherwise they would not be part of your API, right? Then use something like query introspection to measure slow queries and find out which field exactly is slow. Then you optimise only the slow thing (see for example my answer here that optimises a single use case). And if you are a big ecommerce platform, please don't use Dataloader for this. Build something smarter, and don't use JavaScript.

What happens to property values on an object in JavaScript when its property values change during asynchronous function calls?

I am wondering what happens to a javascript object that is passed to an asynchronous function (like a database save), and then immediately has its property's value changed.
The reason I am interested in this is that the asynchronous code takes a long time to finish, so waiting for it to call back before responding would take longer than I would like.
Take the following code:
function asyncDatabaseSave(user) {
  // Save the user asynchronously in the database
  // Do things that take a lot of time before calling back
}

app.get(function (req, res) {
  // In real code user is an object that gets passed in.
  user = {cart: [1, 2, 3]};
  // What is the value of user.cart going to be when it reaches the databaseSave call?
  // At the time of the function call user.cart still has its items, but it is set to null immediately after
  asyncDatabaseSave(user);
  user.cart = null;
  res.json(user);
})
I think that the "correct" way to ensure that the user object was saved with the cart not null and then clear it would be to pass a callback into asyncDatabaseSave to set it to null after the save.
However I am interested in calling res.json as soon as possible. I have also experimented with this code:
app.get(function (req, res) {
  var items = user.cart;
  user.cart = null;
  res.json(user);
  asyncDatabaseSave(user, items);
})
This way items is stored as a separate var and passed into a slightly-modified database save function.
What happens to user.cart when asyncDatabaseSave is called and then it is set to null immediately after?
If I am trying to save user with cart, but also return the user with cart set to null what is the best way to do this?
What happens to user.cart when asyncDatabaseSave is called and then it is set to null immediately after?
You'd have to know what actually happens in the async database call to know whether it's safe to modify the passed in object immediately after making the call or not. There are many cases where the operative parts of the user object would already be copied into native code and sent off to the database before asyncDatabaseSave() returns so further modifying the user object would not affect the async call at all.
But there are situations where you can't assume that, particularly if asyncDatabaseSave() is actually made up of several async calls with some Javascript running between them (such as opening the db, then writing to the db). In those cases, modifying the user object where you are could affect things.
If I am trying to save user with cart, but also return the user with cart set to null, what is the best way to do this?
So, to be safe, don't modify the user object right after asyncDatabaseSave(user).
If you really just want to call res.json(user) as fast as possible, then you can just make a copy of the user object, modify that copy and then use each separate copy in your two async operations. That's the general answer that works in all cases and it's the safe way to do things.
If the whole cart object is not needed in asyncDatabaseSave(), then picking out just the parts that are needed and passing those (like your last suggestion) lets you freely assign cart properties afterwards without any risk. You do have to be careful not to reach into the cart object and change objects inside it, because those might be the same objects you passed to asyncDatabaseSave(); but you can certainly assign properties on the cart object itself, and that will not affect asyncDatabaseSave() if you didn't pass it the cart object. A sketch of the copy approach follows below.
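A minimal sketch of the copy approach (assuming cart holds only primitives, so a shallow copy of the array is enough):

app.get(function (req, res) {
  const user = { cart: [1, 2, 3] };

  // Snapshot: copy the user and its cart so the async save sees stable data
  const userForSave = { ...user, cart: [...user.cart] };
  asyncDatabaseSave(userForSave);

  // Safe to mutate the original now; the save has its own copy
  user.cart = null;
  res.json(user);
});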
Here's a simple example to show how it's not safe to assume you can modify the object you passed in:
function someAsyncSave(u) {
  setTimeout(function () {
    console.log(u.name);
  }, 50);
}

var user = {name: "Joe"};
someAsyncSave(user);
user.name = "Alice";
Run the snippet. It will log "Alice", even though the property was "Joe" when someAsyncSave() was called.
Even though the user object had user.name = "Joe" when it was passed to someAsyncSave(), by the time someAsyncSave() actually uses the property, it has already been changed by the outer code.
