I am using Firebase and Node with Redux. I am loading all objects from a key as follows.
firebaseDb.child('invites').on('child_added', snapshot => {
})
The idea behind this method is that we get a payload from the database and use only one action to update my local data stores via the reducers.
Next, I need to listen for any NEW or UPDATED children of the invites key.
The problem, however, is that the child_added event triggers for all existing keys as well as newly added ones. I do not want this behaviour; I only need new keys, as I have already retrieved the existing data.
I am aware that child_added is typically used for this type of operation; however, I wish to reduce the number of actions fired, and the renders triggered as a result.
What would be the best pattern to achieve this goal?
Thanks,
Although the limit method is pretty good and efficient, you still need to add a check to the child_added callback for the last item that will be grabbed. Also, I don't know if it's still the case, but you might get "old" events from previously deleted items, so you might need to watch out for this too.
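For illustration, a rough sketch of that check (my own assumption, not code from the answer below); note it only works if the list already has at least one invite when the listener attaches:
let skipExisting = true; // assumes 'invites' is not empty when we attach
firebaseDb.child('invites').limitToLast(1).on('child_added', snapshot => {
  if (skipExisting) {
    skipExisting = false; // this event was the child that already existed
    return;
  }
  // genuinely new invite from here on
});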
Other solutions would be to either:
Use a boolean flag that prevents the initially loaded objects from triggering the callback:
let newItems = false

firebaseDb.child('invites').on('child_added', snapshot => {
  if (!newItems) { return }
  // do
})

firebaseDb.child('invites').once('value', () => {
  newItems = true
})
The disadvantage of this method is that you still receive events for every existing child even though they do nothing, which might be problematic if you have a big initial list.
Or if you have a timestamp on your invites, do something like
firebaseDb.child('invites')
  .orderByChild('timestamp')
  .startAt(Date.now())
  .on('child_added', snapshot => {
    // do
  })
I have solved the problem using the following method.
firebaseDb.child('invites').limitToLast(1).on('child_added', cb)
firebaseDb.child('invites').on('child_changed', cb)
limitToLast(1) gets the last child object of invites, and then listens for any new ones, passing a snapshot object to the cb callback.
child_changed listens for any child update to invites, passing a snapshot to the cb callback.
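Since the question is about Redux, here is a hedged sketch of what cb might look like; store and the inviteReceived action creator are assumed names, not part of the original answer:
// hypothetical callback that forwards each snapshot to the Redux store
const cb = snapshot => {
  store.dispatch(inviteReceived(snapshot.val()));
};

firebaseDb.child('invites').limitToLast(1).on('child_added', cb);
firebaseDb.child('invites').on('child_changed', cb);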
I solved this by ignoring child_added altogether and using just child_changed. The way I did this was to perform an update() on any item I needed to handle after pushing it to the database. This solution will depend on your needs, but one example is to update a timestamp key whenever you want the event triggered. For example:
var newObj = { ... }

// push the new item; child_added is ignored, so nothing fires here
var newRef = fb.push(newObj)

// update a timestamp key on the pushed item to trigger child_changed
newRef.update({ updated: yourTimeStamp })
There was also another solution, which is working: get the number of children and then skip that many initial child_added events.
var ref = firebaseDb.child('invites')

ref.once('value').then((dataSnapshot) => {
  return dataSnapshot.numChildren()
}).then((count) => {
  ref.on('child_added', (child) => {
    if (count > 0) {
      count--
      return
    }
    console.log("child really added")
  });
});
If your document keys are time based (unix epoch, ISO8601 or the firebase 'push' keys), this approach, similar to the second approach #balthazar proposed, worked well for us:
const maxDataPoints = 100;
const ref = firebase.database().ref("someKey").orderByKey();

// load the initial data, up to whatever max rows we want
const initialData = await ref.limitToLast(maxDataPoints).once("value");

// get the last key of the data we retrieved
const initialDataTimebasedKeys = Object.keys(initialData.val() || {});
const lastDataPoint = initialDataTimebasedKeys.length > 0
  ? initialDataTimebasedKeys[initialDataTimebasedKeys.length - 1].toString()
  : "0";

// start listening for additions past this point...
// this works because we're fetching ordered by key
// and the key is timebased
const subscriptionRef = ref.startAt(lastDataPoint + "0");
const listener = subscriptionRef.on("child_added", async (snapshot) => {
  // do something here
});
Related
I know that Observables take some time to get data while JavaScript keeps running the rest of the code, and that is troubling me a lot.
I have used ngrx in my Angular project. Here, I am trying to fetch some data from the store, which is working fine. Then I convert this data stream into a string[], which is also working fine.
To use this string[] I subscribe to this observable, and inside the subscription I try to assign the value to another variable named filterSizeValues.
Here the problem comes. If I console.log this filterSizeValues, initially I get an empty array. When the observable finishes its job, the filterSizeValues variable is filled with data.
But I cannot afford for the filterSizeValues variable to be an empty array initially. What can I do?
I have already searched for a solution on the internet but nothing is working out.
Please help me out. Many, many thanks in advance.
Here is my code:
this.sizeTargetingStore$.dispatch(SizeTargetingActions.getSizeTargeting({
  campaignId: this.campaignId,
  lineItemId: this.lineItemId
}));
Here I am accessing the store to get data.
this.sizeTargeting$
  .pipe(switchMap(sizes => {
    let temporary: string[] = [];
    sizes.forEach(eachSize => {
      temporary.push(eachSize.name);
    })
    this.filterSizeValues$ = of(temporary);
    return this.filterSizeValues$;
  }))
  .subscribe(size_name => {
    this.filters.set('size_name', size_name);
  })
Here, I am trying to set the filter values.
I also tried this way:
this.sizeTargeting$
  .pipe(switchMap(sizes => {
    let temporary: string[] = [];
    sizes.forEach(eachSize => {
      temporary.push(eachSize.name);
    })
    this.filterSizeValues$ = of(temporary);
    return this.filterSizeValues$;
  }))
  .subscribe(size_name => {
    this.filterSizeValues = size_name
  })

this.filters.set('size_name', this.filterSizeValues);
But both ways, filters gets set to an empty array.
Can anyone help me out, please?
From my understanding, you have two possibilities: either filter out the empty values or skip the first value. You can do so with the filter and skip rxjs operators respectively.
Also, I believe you are misusing the switchMap operator: since you are not performing asynchronous operations within your switchMap, you can use the map operator instead. Below is a simplified version of your code with the two options to fix your problem.
Option 1:
this.sizeTargeting$.pipe(
  filter(sizes => sizes.length > 0), // filter out empty array values
  map(sizes => sizes.map(size => size.name)) // perform your remap
).subscribe(sizes => {
  this.filterSizeValues = sizes; // only arrays with values reach this step
});
Option 2:
this.sizeTargeting$.pipe(
  skip(1), // skip the first value
  map(sizes => sizes.map(size => size.name)) // perform your remap
).subscribe(sizes => {
  this.filterSizeValues = sizes; // the first (empty) emission has been skipped
});
Normally, when I subscribe to something that I am waiting on to return, what I do is set up a Subject:
private componentDestroyed$ = new Subject<void>();
then in the Observable piping and subscription I do it as:
this.sizeTargeting$
  .pipe(takeUntil(this.componentDestroyed$))
  .subscribe((sizes: YourTypeHere[]) => {
    if (sizes) {
      // Do what I need to do with my sizes here, populate what I need,
      // dispatch any other actions needed.
    }
  })
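For completeness, the usual companion to this pattern (my addition, not part of the original answer) is to signal the subject when the component is destroyed so the subscription is torn down:
// assumed to live in the same component that owns componentDestroyed$
ngOnDestroy(): void {
  this.componentDestroyed$.next();
  this.componentDestroyed$.complete();
}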
I'm trying to create a chat app and there is a small issue. Whenever I load my messages from Firebase, they appear in the chat app in unsorted order, so I'm attempting to sort the messages by timestamp so they appear in order. I can do this if I move the sort and setMessages inside onReceive in useEffect, but I feel like this would be pretty inefficient because it sorts and calls setMessages separately for each message retrieved from Firebase. I want to do it all at the end, after all the messages are loaded into the array.
Right now with my logs, I get this:
[REDACTED TIME] LOG []
[REDACTED TIME] LOG pushing into loadedMessages
[REDACTED TIME] LOG pushing into loadedMessages
So it's printing the (empty) array first, then loading in messages. How can I make sure this is done in the correct order?
useEffect(() => {
  // Gets User ID
  fetchUserId(getUserId());

  const messagesRef = firebase.database().ref(`${companySymbol}Messages`);
  messagesRef.off();

  const onReceive = async (data) => {
    const message = data.val();
    const iMessage = {
      _id: message._id,
      text: message.text,
      createdAt: new Date(message.createdAt),
      user: {
        _id: message.user._id,
        name: message.user.name,
      },
    };
    loadedMessages.push(iMessage);
    console.log('pushing into loadedMessages');
  };

  messagesRef.on('child_added', onReceive);

  loadedMessages.sort(
    (message1, message2) => message2.createdAt - message1.createdAt,
  );
  console.log(loadedMessages);

  return () => {
    console.log('useEffect Return:');
    messagesRef.off();
  };
}, []);
I think that the perspective is a bit off.
The right way to do this is to fetch the Firebase data already sorted.
Firebase has built-in sorting, although it does come with its limitations.
In my opinion, you should try something like:
const messagesRef = firebase.database().ref(`${companySymbol}Messages`);

messagesRef.orderByChild("createdAt").on("child_added", function (snapshot) {
  // the callback function once a new message has been created.
  console.log(snapshot.val());
});
And if I may add one more thing: fetching every single message from the dawn of time can get a bit hairy once you've got over a thousand or so, so I would recommend limiting the query. That can be achieved using the built-in limitToLast() function, for example limitToLast(1000).
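Putting the two together, a minimal sketch building on the messagesRef above (the 1000 is just an example value):
messagesRef
  .orderByChild('createdAt')
  .limitToLast(1000) // only fetch the most recent messages
  .on('child_added', (snapshot) => {
    console.log(snapshot.val());
  });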
Good luck!
Well, the name of the database is "Realtime Database". You are using the "child_added" listener, which is going to be triggered every time a new object gets added to the Messages collection. The onReceive callback should do the sorting - otherwise the messages won't be in the correct order. Yes, that is inefficient for the first load, as your "child_added" will most probably be triggered for every item returned from the collection and you'll be repeating the sort.
What you could explore as an alternative is a .once read (https://firebase.google.com/docs/database/web/read-and-write#read_data_once) the first time you populate the data in your app. This will return all the data you need. After that is complete you can create your "child_added" listener and only act on new objects. This way onReceive shouldn't be called as often on the first load, and afterwards it already makes sense to sort on every new item that comes in.
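A rough sketch of that flow, under the assumption that setMessages is the state setter from the question and handleNewMessage is a placeholder handler of your own:
const messagesRef = firebase.database().ref(`${companySymbol}Messages`);

messagesRef.once('value').then((snapshot) => {
  const initial = [];
  const knownKeys = new Set();
  snapshot.forEach((child) => {
    knownKeys.add(child.key);
    initial.push(child.val());
  });
  // sort the whole initial batch once, newest first
  initial.sort((m1, m2) => m2.createdAt - m1.createdAt);
  setMessages(initial);

  // child_added still fires for the existing children when the listener is
  // attached, so skip keys that were already part of the initial load
  messagesRef.on('child_added', (child) => {
    if (knownKeys.has(child.key)) { return; }
    handleNewMessage(child.val());
  });
});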
Also have a look at sorting: https://firebase.google.com/docs/database/web/lists-of-data#sorting_and_filtering_data
You might be able to return the messages in the correct order.
And also - if you need queries - look at firestore...
I'm trying to fetch real time data from Cloud Firestore using the below code.
export const getRealTimeData = () =>
  db
    .collection('posts')
    .onSnapshot((querySnapshot) => {
      const posts: any = [];
      querySnapshot.forEach((doc) =>
        posts.push(Object.assign({ id: doc.id }, doc.data()))
      );
    });
And I want to use the resulting array to display the data on the UI. When I do this, the result is a function, not the actual array of data.
const posts = getRealTimeData();
Here's what I get when I log posts
function () {
i.kT(), o.al(s);
}
Could anyone please point where I went wrong?
Realtime listeners added with onSnapshot() are not compatible with returning values from function calls. That's because they continue to generate new results over time, and would never really "return" anything once. You should abandon the idea of making a synchronous getter-type function in this case - it just doesn't work for what you're trying to do.
Ideally, you would use an architecture like Redux to manage the updates as they become available. Your realtime listener would dispatch query updates to a store, and your component would subscribe to that store to receive those updates.
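A loose sketch of that wiring, reusing the db instance from the question; the action type and the listenToPosts name are my own assumptions:
// subscribes to Firestore and forwards every snapshot into the Redux store
export const listenToPosts = (dispatch) =>
  db.collection('posts').onSnapshot((querySnapshot) => {
    const posts = [];
    querySnapshot.forEach((doc) => posts.push({ id: doc.id, ...doc.data() }));
    dispatch({ type: 'POSTS_UPDATED', payload: posts });
  });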
If you don't want to use Redux (which is too bad - you really should for this sort of thing), then you should wrap your query inside a useEffect hook, then have your listener set a state hook variable so your component can receive the updates.
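And a minimal sketch of the useEffect approach, assuming a React function component and the same db instance from the question (usePosts is a name I made up):
import { useEffect, useState } from 'react';

function usePosts() {
  const [posts, setPosts] = useState([]);

  useEffect(() => {
    const unsubscribe = db.collection('posts').onSnapshot((querySnapshot) => {
      const next = [];
      querySnapshot.forEach((doc) => next.push({ id: doc.id, ...doc.data() }));
      // each new snapshot lands in component state and triggers a re-render
      setPosts(next);
    });
    // detach the listener when the component unmounts
    return unsubscribe;
  }, []);

  return posts;
}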
Here's the scenario: I get an initial payload from Firebase (e.g. 5 objects) from firebaseRef_1 (using a .once callback). I then transform each received object into a new Firebase ref. That newly generated ref looks like this:
"data/ccccyyyyotKKiWC2xaV1WZ7H3/things/-KgXo121225H9_Nks1O"
If for example I have 5 incoming objects on that .once call, then I'll create 5 refs (see code snippet below).
Once I've created/generated the ref(s) I want to:
a) Create a .on listener to each of those 5 refs.
b) If (let's say) some time later one 'row' of new data arrives at firebaseRef_1, I want to generate a new ref and then create a listener, but now just for that newly added item. Otherwise, if I try to loop through all of the (now 6) elements, Firebase throws a promise error telling me that the original 5 elements (refs) already have a listener attached.
The reason I was thinking of using a .once call initially is so that I can get the whole payload "all at .once" (pun intended!), which makes populating a ListView (React Native) faster. I tried using a child_added approach, but on the initial load (of those 5 rows) child_added gets called 5 times, so you can see the rows appear one by one in the ListView, which I don't like.
How should I best structure this in code to achieve what I've described above? Should I just have a .once call like this:
generateTheRefs() {
  firebase.database()
    .ref('data/' + firebaseUID + '/initialPath')
    .once('value', (snapshot) => {
      snapshot.forEach(childSnapshot => {
        var childData = childSnapshot.val();
        let path = `data/${childData.item1}/things/${childData.item2}`;
        // now create a .on listener
        firebase.database()
          .ref(path)
          .on('value', (snapshot) => {
            //....
          });
      });
    });
}
You could:
Read the entire initial set with once(), and display that. Store it in a variable somewhere.
Then immediately register a listener with on('child_added'), and only act on items you did not previously collect from once(), since you already stored those (see the sketch below).
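A rough sketch of that flow, reusing the paths from the question; the seenKeys bookkeeping and the attachItemListener helper are my own additions, not part of the original code:
const baseRef = firebase.database().ref('data/' + firebaseUID + '/initialPath');
const seenKeys = new Set();

function attachItemListener(childData) {
  const path = `data/${childData.item1}/things/${childData.item2}`;
  firebase.database().ref(path).on('value', (snapshot) => {
    // ... react to updates for this one item
  });
}

baseRef.once('value', (snapshot) => {
  // the initial payload arrives in one go, so the ListView can be populated at once
  snapshot.forEach((childSnapshot) => {
    seenKeys.add(childSnapshot.key);
    attachItemListener(childSnapshot.val());
  });

  // child_added replays existing children when attached, so only act on
  // keys we did not see during the initial load
  baseRef.on('child_added', (childSnapshot) => {
    if (seenKeys.has(childSnapshot.key)) { return; }
    seenKeys.add(childSnapshot.key);
    attachItemListener(childSnapshot.val());
  });
});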
It sounds like what you are after is to use all of the results at the same time so they don't dribble in over time. You can use an array of promises and Promise.all() to wait for them all to arrive before using them, something like:
function generateTheRefs() {
  firebase.database()
    .ref('data/' + firebaseUID + '/initialPath')
    .once('value', (snapshot) => {
      let promises = [];
      snapshot.forEach(childSnapshot => {
        var childData = childSnapshot.val();
        let path = `data/${childData.item1}/things/${childData.item2}`;
        // queue a one-time read for each generated ref
        promises.push(firebase.database()
          .ref(path)
          .once('value'));
      });
      Promise.all(promises).then(snaps => {
        snaps.forEach(snap => {
          // ...
          console.log(snap.val());
        })
      });
    });
}
I'm trying to use React with the Flux architecture and stumbled on one restriction which I can't handle.
The problem is as follows:
There's a store which listens to an event. The event has an object id. We need to fetch the object if needed and make it selected.
If the store doesn't have an object with this id, it's queried. In the callback we dispatch another event to the store which is responsible for selection.
If the store has the object, I'd like to dispatch the selection event, but I can't because a dispatch is already in progress.
The best solution I've come up with so far is wrapping the inner dispatch in setTimeout(f, 0), but it looks scary.
Actually the problem is quite general: how should I organize a dispatch chain without nesting dispatches (without violating current Flux restrictions), if each new dispatch is based on the result of handling the previous one?
Does anybody have good approaches to solving such problems?
function selectItem(item) {
  AppDispatcher.dispatch({
    actionType: AppConstants.ITEM_SELECT,
    item: item
  });
}
// Item must be requested and selected.
// If it's in the store - select it.
// Otherwise fetch and then select it.
SomeStore.dispatchToken = AppDispatcher.register((action) => {
  switch (action.actionType) {
    case AppConstants.ITEM_REQUESTED:
      var item = SomeStore.getItem(action.itemId);
      if (item) {
        // Won't work because we can't dispatch in the middle of a dispatch
        selectItem(item);
      } else {
        // Will work
        $.getJSON(`some/${action.itemId}`, (item) => selectItem(item));
      }
  }
});
Are you writing your own dispatcher? setTimeout(f, 0) is a fine trick. I do the same thing in my minimal flux here. Nothing scary there. JavaScript's concurrency model is pretty simple.
More robust flux dispatcher implementations should handle that for you.
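Applied to the snippet from the question, the trick would look roughly like this; only the setTimeout wrapper is new, everything else comes from the question:
SomeStore.dispatchToken = AppDispatcher.register((action) => {
  switch (action.actionType) {
    case AppConstants.ITEM_REQUESTED:
      var item = SomeStore.getItem(action.itemId);
      if (item) {
        // defer the nested dispatch until the current dispatch has finished
        setTimeout(() => selectItem(item), 0);
      } else {
        $.getJSON(`some/${action.itemId}`, (item) => selectItem(item));
      }
  }
});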
If ITEM_SELECT is an event that another Store is going to handle:
You are looking for dispatcher.waitFor(array<string> ids): void, which lets you use the SomeStore.dispatchToken that register() returns to enforce the order in which Stores handle an event.
The store that would handle the ITEM_SELECT event, say we call it OtherStore, should instead handle the ITEM_REQUESTED event, but call dispatcher.waitFor( [ SomeStore.dispatchToken ] ) first, and then get whatever result is interesting from SomeStore via a public method, like SomeStore.getItem().
But from your example, it seems like SomeStore doesn't do anything to its internal state with ITEM_REQUESTED, so you just need to move the following lines into OtherStore with a few minor changes:
// OtherStore.js
case AppConstants.ITEM_REQUESTED:
  // skip this waitFor if SomeStore isn't doing anything with ITEM_REQUESTED
  dispatcher.waitFor( [ SomeStore.dispatchToken ] );
  var item = SomeStore.getItem(action.itemId);
  if (item) {
    // Don't dispatch an event, let other stores handle this event, if necessary
    OtherStore.doSomethingWith(item);
  } else {
    // Will work
    $.getJSON(`some/${action.itemId}`, (item) => OtherStore.doSomethingWith(item));
  }
And again, if another store needs to handle the result of OtherStore.doSomethingWith(item), they can also handle ITEM_REQUESTED, but call dispatcher.waitFor( [ OtherStore.dispatchToken ] ) before proceeding.
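For instance, a hypothetical third store could chain onto OtherStore the same way; the names ThirdStore, reactTo and getCurrentItem below are placeholders, not part of the original answer:
// hypothetical store that depends on OtherStore having handled the action first
ThirdStore.dispatchToken = dispatcher.register((action) => {
  switch (action.actionType) {
    case AppConstants.ITEM_REQUESTED:
      // block until OtherStore has run doSomethingWith(item)
      dispatcher.waitFor( [ OtherStore.dispatchToken ] );
      ThirdStore.reactTo(OtherStore.getCurrentItem());
      break;
  }
});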
So, in looking at your code, are you setting a "selected" property on the item so it will be checked/selected in your UI/Component? If so, just make that part of the function you are already in.
if (item) {
  item.selected = true;
  // we're done now, no need to create another Action at this point,
  // we have changed the state of our data, now alert the components
  // via emitChange()
  emitChange();
}
If you're wanting to track the currently selected item in the Store, just keep an ID or an object as a private var up there, and set it similarly.
var Store = (function () {
  var _currentItem = {};
  var _currentItemID = 1;

  function selectItem(item) {
    _currentItem = item;
    _currentItemID = item.id;
    emitChange();
  }

  (function () {
    Dispatcher.register(function (action) {
      switch (action.actionType) {
        case AppConstants.ITEM_REQUESTED:
          var item = SomeStore.getItem(action.itemId);
          if (item) {
            selectItem(item);
          } else {
            $.getJSON(`some/${action.itemId}`, (item) => selectItem(item));
          }
      }
    });
  })();

  return {
    getCurrentlySelectedItem: function () {
      return _currentItem;
    },
    getCurrentlySelectedItemID: function () {
      return _currentItemID;
    }
  };
})();
Ultimately, you don't have to create Actions for everything. Whatever the item is that you're operating on, it should be some domain entity, and it is your Store's job to manage the state of that specific entity. Having other internal functions is often a necessity, so just make selectItem(item) an internal function of your Store so you don't have to create a new Action to access it or use it.
Now, if you have cross-store concerns, and another Store cares about some specific change to some data in your initial Store, this is where the waitFor(ids) function will come in. It effectively blocks execution until the first Store is updated, then the other can continue executing, assured that the other Store's data is in a valid state.
I hope this makes sense and solves your problem, if not, let me know, and hopefully I can zero in better.