I'm creating an app with React Native and I'm facing the problem that I create multiple Firebase listeners throughout the app: listeners on different screens, to be precise, some listening to the Firebase Realtime Database and others listening to Firestore.
What I want to accomplish is to kill all those listeners with one call, or if necessary with multiple lines but as compact as possible, and also from an entirely different screen where the listeners aren't even running; this is important.
I know that there is the possibility to use Firebase.goOffline(), but this only disconnects me from Firebase; it doesn't stop the listeners. As soon as I goOnline() again, the listeners are all back.
I haven't found any solution for this problem via Google etc., which is why I'm asking here now. I would appreciate any idea or approach for how to handle this type of behavior.
The following code samples show the listeners I included inside my app. They are located in the same screen, but I have nearly identical ones in other screens.
Database listener:
const statusListener = () => {
  var partnerRef = firebase.database().ref(`users/${partnerId}/onlineState`);
  partnerRef.on('value', function(snapshot) {
    setPartnerState(snapshot.val());
  });
};
Firestore listener (this one is very long, but that's only because I filter the documents I retrieve from the listener):
const loadnewmessages = () => {
  firebase.firestore().collection("chatrooms").doc(`${chatId}`).collection(`${chatId}`).orderBy("timestamp").limit(50).onSnapshot((snapshot) => {
    var newmessages = [];
    var deletedmesssages = [];
    snapshot.docChanges().forEach((change) => {
      if (change.type === "added") {
        newmessages.push({
          counter: change.doc.data().counter,
          sender: change.doc.data().sender,
          timestamp: change.doc.data().timestamp.toString(),
          value: change.doc.data().value,
          displayedTime: new Date(change.doc.data().displayedTime)
        });
      }
      if (change.type === "removed") {
        deletedmesssages.push({
          counter: change.doc.data().counter,
          sender: change.doc.data().sender,
          timestamp: change.doc.data().timestamp.toString(),
          value: change.doc.data().value,
          displayedTime: new Date(change.doc.data().displayedTime)
        });
      }
    });
    if (newmessages.length > 0) {
      setChatMessages(chatmessages => {
        return chatmessages.concat(newmessages);
      });
    }
    if (deletedmesssages.length > 0) {
      setChatMessages(chatmessages => {
        var modifythisarray = chatmessages;
        let index = chatmessages.map(e => e.timestamp).indexOf(`${deletedmesssages[0].timestamp}`);
        let pasttime = Date.now() - parseInt(modifythisarray[index].timestamp);
        modifythisarray.splice(index, 1);
        if (pasttime > 300000) {
          return chatmessages;
        } else {
          return modifythisarray;
        }
      });
      setRefreshFlatList(refreshFlatlist => {
        //console.log("Current status of refresher: ", refreshFlatlist);
        return !refreshFlatlist;
      });
    }
    newmessages = [];
    deletedmesssages = [];
  });
};
Both of those listeners are called within a useEffect hook like this (the empty dependency array at the end makes sure those listeners are set up only once and not multiple times):
useEffect(() => {
  loadnewmessages();
  statusListener();
}, []);
All of the subscribe functions return an unsubscribe function:
const unSubscriptions = [];

// ... where you subscribe
const unSub = document.onSnapshot(listener);
unSubscriptions.push(unSub);

// ... where you unsubscribe all
function unSubAll() {
  unSubscriptions.forEach((unSub) => unSub());
  // Clear the array
  unSubscriptions.length = 0;
}
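Since the question specifically asks for a way to tear the listeners down from an entirely different screen, one way to apply this pattern in React Native is to keep the registry in its own module that every screen imports. The sketch below is only an illustration of that idea, not code from the original post: the file names and the registerListener/handleSnapshot names are made up, and the Realtime Database listener is wrapped in a function because in the v8 SDK ref.on() returns the callback rather than an unsubscribe function.
// listenerRegistry.js: a tiny shared module any screen can import
const unSubscriptions = [];

export function registerListener(unSub) {
  unSubscriptions.push(unSub);
}

export function unSubAll() {
  unSubscriptions.forEach((unSub) => unSub());
  unSubscriptions.length = 0;
}

// ChatScreen.js: where the listeners are created
useEffect(() => {
  // Firestore: onSnapshot already returns an unsubscribe function
  const unsubChat = firebase.firestore()
    .collection("chatrooms").doc(`${chatId}`).collection(`${chatId}`)
    .orderBy("timestamp").limit(50)
    .onSnapshot(handleSnapshot); // handleSnapshot = your existing callback
  registerListener(unsubChat);

  // Realtime Database: wrap off() so it can be called like an unsubscribe function
  const partnerRef = firebase.database().ref(`users/${partnerId}/onlineState`);
  const onValue = snapshot => setPartnerState(snapshot.val());
  partnerRef.on("value", onValue);
  registerListener(() => partnerRef.off("value", onValue));

  return unSubAll; // also clean up when this screen unmounts
}, []);

// AnyOtherScreen.js: e.g. on logout
unSubAll();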
I am trying to do Firestore reactive pagination. I know there are posts, comments, and articles saying that it's not possible but anyways...
When I add a new message, it kicks off or "removes" the previous message
Here's the main code. I'm paginating 4 messages at a time
async getPaginatedRTLData(queryParams: TQueryParams, onChange: Function) {
  let collectionReference = collection(firestore, queryParams.pathToDataInCollection);
  let collectionReferenceQuery = this.modifyQueryByOperations(collectionReference, queryParams);
  // Turn query into snapshot to track changes
  const unsubscribe = onSnapshot(collectionReferenceQuery, (snapshot: QuerySnapshot) => {
    snapshot.docChanges().forEach((change: DocumentChange<DocumentData>) => {
      // Now save data to format later
      let formattedData = this.storeData(change, queryParams);
      onChange(formattedData);
    });
  });
  this.unsubscriptions.push(unsubscribe);
}
For completeness, this is how I'm building my query:
let queryParams: TQueryParams = {
  limitResultCount: 4,
  uniqueKey: '_id',
  pathToDataInCollection: messagePath,
  orderBy: {
    docField: orderByKey,
    direction: orderBy
  }
}
modifyQueryByOperations(
  collectionReference: CollectionReference<DocumentData> = this.collectionReference,
  queryParams: TQueryParams) {
  // Extract query params
  let { orderBy, where: where_param, limitResultCount = PAGINATE } = queryParams;
  let queryCall: Query<DocumentData> = collectionReference;
  if (where_param) {
    let { searchByField, whereFilterOp, valueToMatch } = where_param;
    //collectionReferenceQuery = collectionReference.where(searchByField, whereFilterOp, valueToMatch)
    queryCall = query(queryCall, where(searchByField, whereFilterOp, valueToMatch));
  }
  if (orderBy) {
    let { docField, direction } = orderBy;
    //collectionReferenceQuery = collectionReference.orderBy(docField, direction)
    queryCall = query(queryCall, fs_orderBy(docField, direction));
  }
  if (limitResultCount) {
    //collectionReferenceQuery = collectionReference.limit(limitResultCount)
    queryCall = query(queryCall, limit(limitResultCount));
  }
  if (this.lastDocInSortedOrder) {
    //collectionReferenceQuery = collectionReference.startAt(this.lastDocInSortedOrder)
    queryCall = query(queryCall, startAt(this.lastDocInSortedOrder));
  }
  return queryCall;
}
See, the last message is removed when I add a new message to the collection. What's worse is that it's not consistent. I debugged this and Firestore is removing the message.
I almost feel like this is a bug in Firestore's handling of listeners.
As mentioned in the comments and confirmed by you, the problem you are facing occurs because some values of the fields you are querying on changed while the listener was still active, and this makes the listener treat the document as a removed one.
This is proven by the fact that the records are not being deleted from Firestore itself, but are just being excluded from the listener.
This can be fixed by creating a better query structure, separating the old data from the new data coming in from the listener, which you mentioned in the comments you've already done as well.
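If you ever need to tell an actual delete apart from a document that merely dropped out of the query at runtime, one option (just a sketch, not part of the original answer, using the modular SDK from the question) is to re-read the document when a "removed" change arrives:
import { getDoc } from "firebase/firestore";

// Inside the onSnapshot callback, for each change in snapshot.docChanges():
if (change.type === "removed") {
  getDoc(change.doc.ref).then((latest) => {
    if (latest.exists()) {
      // The document still exists: it only stopped matching the query,
      // e.g. its ordering field changed or it slid out of the limit window.
    } else {
      // The document was really deleted from Firestore.
    }
  });
}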
I have a react app that uses the MS Graph API (so it's a bit difficult to post a minimal reproducible example). It has a state variable called chats that is designed to hold the result of fetching a list of chats from the graph API. I have to poll the API frequently to get new chats.
I query the chats endpoint, build an array of newChats and then setChats. I then set a timeout that refreshes the data every 10 seconds (it checks for premature invocation through the timestamp property stored in the state). If the component is unmounted, a flag is set, live (useRef), which stops the refresh process. Each chat object is then rendered by the Chat component (not shown).
Here's the code (I've edited it by hand here to remove some irrelevant bits around styles and event propagation, so it's possible that typos have crept in; it compiles and runs in reality).
const Chats = () => {
  const [chats, setChats] = useState({ chats: [], timestamp: 0 });
  const live = useRef(true);

  const fetchChats = () => {
    if (live.current && Date.now() - chats.timestamp < 9000) return;
    fetchData(`${baseBeta}/me/chats`).then(res => {
      if (res.value.length === chats.chats.length) return;
      const chatIds = chats.chats.map(chat => chat.id);
      const newChats = res.value.filter(chat => !chatIds.includes(chat.id));
      if (newChats.length > 0) {
        setChats(c => ({ chats: [...c.chats, ...newChats], timestamp: Date.now() }));
      }
      setTimeout(fetchChats, 10000);
    });
  };

  useEffect(() => {
    fetchChats();
    return () => (live.current = false);
  }, [chats]);

  return (
    <div>
      {chats.chats.map(chat => (
        <Chat chat={chat} />
      ))}
    </div>
  );
};
The Chat component must also make some async calls for data before it is rendered.
This code works, for a second or two. I see the Chat component rendered on the screen with the correct details (chat member names, avatars, etc.), but almost before it has completed rendering I see the list elements being removed, apparently one at a time, though that could just be the way it's rendered; it could be all at once. The list collapses on the screen, showing that the chat state has been cleared out. I don't know why this is happening.
I've stepped through the code in the debugger and I can see the newChats array being populated. I can see the setChats call happen. If I put a breakpoint on that line then it is only invoked once and that's the only line that sets that particular state.
So, what's going on? I'm pretty sure React isn't broken. I've used it before without much trouble. What's changed recently is the inclusion of the refresh code. I'm suspicious that the refresh is taking away the state. My understanding is that the fetchChats function will be re-created every time the chats state changes and so should see the current value of the chats state. Just in case this wasn't happening, I passed the chats state from the useEffect like this:
useEffect(() => {
  fetchChats(chats);
  return () => (live.current = false);
}, [chats]);
With the necessary changes in fetchChats to make this work as expected. I get the same result, the chats state is lost after a few seconds.
Edit
Still Broken:
After Aleks's answer my useEffect now looks like this:
useEffect(() => {
  let cancel = null;
  let live = true;
  const fetchChats = () => {
    if (Date.now() - chats.timestamp < 9000) return;
    fetchData(`${baseBeta}/me/chats`).then(res => {
      if (res.value.length === chats.chats.length) return;
      const chatIds = chats.chats.map(chat => chat.id);
      const newChats = res.value.filter(chat => chat.chatType === "oneOnOne" && !chatIds.includes(chat.id));
      if (newChats.length > 0 && live) {
        setChats(c => ({ chats: [...c.chats, ...newChats], timestamp: Date.now() }));
      }
      cancel = setTimeout(fetchChats, 10000);
    });
  };
  fetchChats();
  return () => {
    live = false;
    cancel?.();
  };
}, []);
The result of this is that the chats are loaded, cleared, and loaded again, repeatedly. This is better, at least they're reloading now, whereas previously they would disappear forever. They are reloaded every 10 seconds, and cleared out almost immediately still.
Eventually, probably due to random timings in the async calls, the entries in the list are duplicated and the 2nd copy starts being removed immediately instead of the first copy.
There are multiple problems. First, this
setTimeout(fetchChats, 10000); will trigger
useEffect(() => {
  fetchChats(chats);
  return () => (live.current = false);
}, [chats])
You will get 2 fetches one after another.
But the bug you're seeing is because of this
return () => (live.current = false);
On the second useEffect trigger, the cleanup function above will run and live.current will be false forever from then on.
And as Nikki9696 said, you need to clear the timeout in the cleanup function.
The easiest fix is probably this:
useEffect(() => {
  let cancel = null;
  let live = true;
  const fetchChats = () => {
    // not needed
    //if (Date.now() - chats.timestamp < 9000) return;
    fetchData(`${baseBeta}/me/chats`).then(res => {
      // this line is not needed
      //if (res.value.length === chats.chats.length) return;
      // remove all the filtering, it can be done elsewhere where
      // you can access fresh chat state
      //const chatIds = chats.chats.map(chat => chat.id);
      //const newChats = res.value.filter(chat => !chatIds.includes(chat.id));
      if (res.value?.length > 0 && live) {
        setChats(c => ({ chats: [...c.chats, ...res.value], timestamp: Date.now() }));
        cancel = setTimeout(fetchChats, 10000);
      }
    });
  };
  fetchChats();
  return () => { live = false; if (cancel) window.clearTimeout(cancel); };
}, []);
Edit: fixed the typo; cancel?.() should be window.clearTimeout(cancel);
Ok, I have an idea what's happening and how to fix it. I am still not sure why it is behaving like this, so please comment if you understand it better than me.
Basically, for some reason I don't understand, the function fetchChats only ever sees the initial state of chats. I am making the mistake of filtering my newly fetched list against this state, in which the array is empty.
If I change my useEffect code to do this instead:
setChats(c => {
  return {
    chats: [
      ...c.chats,
      ...res.value.filter(cc => {
        const a = c.chats.map(chat => chat.id);
        return !a.includes(cc.id);
      })
    ],
    timestamp: Date.now()
  };
});
Then my filter is passed the current value of the state for chats rather than the initial state.
I thought that because the function containing this code is inside the function that declares the chats state, whenever that state changed the whole function would be re-rendered with the new value of chats, making it available to its nested functions. This isn't the case here and I don't understand why.
The solution, to only trust the values of the state that is handed to me during the setState (setChats) call, works fine and I'll go with it, but I'd love to know what is wrong with reading the state directly.
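For what it's worth, this matches how closures inside an effect with an empty dependency array behave: the callback is created once, on the first render, and keeps seeing that render's state. A minimal sketch of that effect (names are illustrative, not the actual component code):
useEffect(() => {
  // fetchChats is created exactly once, on the first render, so it closes over
  // the chats value from that render and never sees later updates.
  const fetchChats = () => {
    console.log(chats.chats.length); // always logs the initial length (0)
    setTimeout(fetchChats, 10000);   // later calls reuse the same stale closure
  };
  fetchChats();
}, []); // empty deps: the effect body (and its closure) is never re-created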
I want to use Firebase's onSnapshot function sequentially. A situation where I want to apply this is given below.
Scenario:
There are 2 collections in Firestore: Employees and Projects. In the Employees collection, the docs store the details of employees, and each doc also stores the IDs of the Projects docs that the particular employee is working on. In the Projects collection, the details of the projects are stored.
Goal:
First, I have to fetch the data from the Employees collection for a specific employee. From the fetched employee data I will have the IDs of the projects he/she is working on, and from those IDs I need to fetch the project details. When any information related to a project or an employee changes, the data on screen should also change in real time.
Issue:
I tried to write nested code, but it works in real time only for the employee data. It doesn't change when the project details are updated. Something like this...
admin.auth().onAuthStateChanged(async () => {
  if (check_field(admin.auth().currentUser)) {
    await db.collection('Employees').doc(admin.auth().currentUser.uid).onSnapshot(snap => {
      ...
      let project_details = new Promise(resolve => {
        let projects = [];
        for (let i in snap.data().projects_list) {
          db.collection('Projects').doc(snap.data().projects_list[i]).onSnapshot(prj_snap => {
            let obj = prj_snap.data();
            obj['doc_id'] = prj_snap.id;
            projects.push(obj);
          });
        }
        resolve(projects);
      });
      Promise.all([project_details]).then(items => {
        ...
        // UI update
      });
      ...
    });
  }
});
What is the correct way to do this?
You're actually proposing a pretty complex dataflow scenario. I would approach this as a multi-step problem. Your goal is essentially:
If there is a user, listen in realtime for the list of project ids for that user.
For each project id, listen in realtime for details about that project.
(presumably) Clean up listeners that are no longer relevant.
So I would tackle it something like this:
let uid;
let employeeUnsub;
let projectIds = [];
let projectUnsubs = {};
let projectData = {};

const employeesRef = firebase.firestore().collection('Employees');
const projectsRef = firebase.firestore().collection('Projects');

firebase.auth().onAuthStateChanged(user => {
  // if there is already a listener but the user signs out or changes, unsubscribe
  if (employeeUnsub && (!user || user.uid !== uid)) {
    employeeUnsub();
  }
  if (user) {
    uid = user.uid;
    // subscribe to the employee data and trigger a listener update on changes
    employeeUnsub = employeesRef.doc(uid).onSnapshot(snap => {
      projectIds = snap.get('projects_list');
      updateProjectListeners();
    });
  }
});

function updateProjectListeners() {
  // get a list of projects already being listened to
  let existingListeners = Object.keys(projectUnsubs);
  for (const pid of existingListeners) {
    // unsubscribe and remove the listener/data if no longer in the list
    if (!projectIds.includes(pid)) {
      projectUnsubs[pid]();
      delete projectUnsubs[pid];
      delete projectData[pid];
      render();
    }
  }
  for (const pid of projectIds) {
    // if we're already listening, nothing to do so skip ahead
    if (projectUnsubs[pid]) { continue; }
    // subscribe to project data and trigger a render on change
    projectUnsubs[pid] = projectsRef.doc(pid).onSnapshot(snap => {
      projectData[pid] = snap.data();
      render();
    });
  }
}

function render() {
  let out = "<ul>\n";
  for (const pid of projectIds) {
    if (!projectData[pid]) {
      out += `<li class="loading">Loading...</li>\n`;
    } else {
      const project = projectData[pid];
      out += `<li>${project.name}</li>\n`;
    }
  }
  out += "</ul>\n";
  return out;
}
The above code does what you're talking about (and in this case the render() function just returns a string but you could do whatever you want to actually manipulate DOM / display data there).
It's a lengthy example, but you're talking about a pretty sophisticated concept of essentially joining realtime data dynamically as it changes. Hope this gives you some guidance on a way forward!
I'm writing a cloud function that uses request-promise and cheerio to scrape a website and then check that information against a user document.
I am not entirely familiar with Javascript and Cloud Functions.
I've gotten to the point where I manage to extract the information I need, navigate to the user's document, and compare the data. Now the last piece of this function is to give the user points for each matching data point, so I need to update a map inside the user document.
This function has to loop through all users and change their documents if the data point matches. I'm not sure the way I've written the code is optimal in terms of performance and billing if the user base gets huge... Any pointers on how I could minimize the impact of the task would be of great help, as I'm new to JS.
So this is the code:
exports.getV75Results = functions.pubsub.schedule('every 2 minutes').onRun(async (context) => {
  let linkMap = new Map();
  const url = `https://www.example.com`
  const options = {
    uri: url,
    headers: { 'User-Agent': 'test' },
    transform: (body) => cheerio.load(body)
  }
  await rp(options)
    .then(($) => {
      for (let i = 1; i <= 7; i++) {
        // Find player from game
        const lopp1 = $(`#mainContentHolder > div > div.mainContentStyleTrot > div > div.panel-body > table:nth-child(1) > tbody > tr:nth-child(${i}) > td:nth-child(2) > span`).text();
        const lopp1StrR1 = lopp1.replace("(", "");
        const lopp1StrR2 = lopp1StrR1.replace(")", "");
        const lopp1StrR3 = lopp1StrR2.replace(" ", "");
        linkMap.set(i, lopp1StrR3.toUpperCase());
      }
      console.log(linkMap);
      return linkMap;
    }).then(async () => {
      // Start looking up users
      let usersRef = db.collection('fantasyfotball').doc('users');
      usersRef.listCollections().then(collections => {
        collections.forEach(collection => {
          var user = collection.doc(collection.id);
          let batch = new admin.firestore().batch();
          user.get().then(function(doc) {
            let json = doc.data();
            // Look in the user's collection if a players document exists
            Object.keys(json).forEach((name) => {
              if (name != null) {
                // Document with the user's active fotball players
                if (name == 'players') {
                  let i = 0;
                  Object.values(json[name]).forEach((value) => {
                    i++;
                    if (value.localeCompare(linkMap.get(i)) == 0) {
                      // Loop through user keys and find owned players if the user has the correct player
                      Object.keys(json).forEach((map) => {
                        if (map != null) {
                          // Document with a map of the user's owned fotball players; each respective
                          // player has a key = 'fotball player' and value = '[price, points]'
                          if (map == 'ownedplayers') {
                            Object.entries(json[map]).forEach((players) => {
                              if (players[0].localeCompare(value) == 0) {
                                console.log(players[1][1]);
                                // Add points to the respective player field
                                // PROBABLY NOT HOW TO CHANGE A DOCUMENT FIELD, THIS DOESN'T WORK..
                                players[1][1]++;
                              }
                            });
                            // EACH TIME THIS RUNS IT SAYS: "Cannot modify a WriteBatch that has been committed"
                            batch.update(user, { 'ownedplayers': json[map] });
                          }
                        }
                      });
                    }
                  });
                }
              } else {
                console.log('user does not have a playermode document.');
              }
            });
          });
          return batch.commit().then(function () {
            console.log("Succesfully commited changes.");
            return null;
          });
        });
      });
    }).catch((err) => {
      return err;
    });
});
The issue I get in the console is "Cannot modify a WriteBatch that has been committed.", and I fail to modify and add points to the player field inside the user's document.
(Screenshots of the console output and the Firestore document structure were attached here.)
I'm completely stuck on this. It feels like I've tried all the different approaches, but I think I don't fully understand Cloud Functions and JavaScript, so I would gladly receive feedback and help on how to make this work.
Cheers,
Finally... I managed to update the document successfully. I put the commit outside, in another ".then()". I thought I had tried that, but yay I guess :P
}).then(() => {
  return batch.commit().then(function () {
    console.log("Succesfully commited changes.");
    return null;
  });
The problem now is that it commits on every loop. I think the optimal approach here would be to batch the updates for ALL users before committing?
And again, is there a more optimal way to do this, in terms of minimizing the operations and impact? I'm afraid I go too deep with for loops instead of navigating directly to the documents, but I haven't found an easier way to do that.
Any thoughts?
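For what it's worth, one way to avoid committing inside the loop is to build the batch across all users and commit once at the end, chunking at 500 writes since that is the limit for a single WriteBatch. The sketch below is only an outline under the same document structure as above; computeUpdatedPoints is a hypothetical helper standing in for the scoring logic:
// Sketch (inside the async onRun handler): accumulate all user updates,
// then commit in as few batches as possible.
const usersRef = db.collection('fantasyfotball').doc('users');
const collections = await usersRef.listCollections();

let batch = db.batch();
let opCount = 0;
const commits = [];

for (const collection of collections) {
  const userDocRef = collection.doc(collection.id);
  const doc = await userDocRef.get();
  const data = doc.data();
  if (!data || !data.ownedplayers) continue;

  // computeUpdatedPoints is hypothetical: it would return the updated
  // 'ownedplayers' map with the new points applied.
  const ownedplayers = computeUpdatedPoints(data, linkMap);
  batch.update(userDocRef, { ownedplayers });
  opCount++;

  if (opCount === 500) {           // Firestore allows at most 500 writes per batch
    commits.push(batch.commit());
    batch = db.batch();
    opCount = 0;
  }
}
if (opCount > 0) commits.push(batch.commit());
await Promise.all(commits);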
I'm using the default TypeScript service, and the models are initialized asynchronously, with one model depending on the other. There's a case where the two models cannot detect each other, so a semantic error is shown. If I make some edits in the dependent model, which causes the model to be re-validated, the errors disappear.
I have tried calling setModel manually, which solves the problem. However, it destroys the undo history.
Is there a way to re-validate the model manually?
That's my solution, which is extracted from monaco-typescript:
async function revalidateModel(model) {
  if (!model || model.isDisposed()) return;

  const getWorker = await monaco.languages.typescript.getTypeScriptWorker();
  const worker = await getWorker(model.uri);
  const diagnostics = (await Promise.all([
    worker.getSyntacticDiagnostics(model.uri.toString()),
    worker.getSemanticDiagnostics(model.uri.toString())
  ])).reduce((a, it) => a.concat(it));

  const markers = diagnostics.map(d => {
    const start = model.getPositionAt(d.start);
    const end = model.getPositionAt(d.start + d.length);
    return {
      severity: monaco.MarkerSeverity.Error,
      startLineNumber: start.lineNumber,
      startColumn: start.column,
      endLineNumber: end.lineNumber,
      endColumn: end.column,
      // flattenDiagnosticMessageText is the same helper monaco-typescript uses
      // (equivalent to ts.flattenDiagnosticMessageText from the typescript package)
      message: flattenDiagnosticMessageText(d.messageText, "\n")
    };
  });

  const owner = model.getLanguageIdentifier().language;
  monaco.editor.setModelMarkers(model, owner, markers);
}
Call the function above when the model is created asynchronously.
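A minimal usage sketch (the URI, source text, and the otherModelLoaded promise are placeholders for however your own models are set up):
// Create the dependent model as usual...
const model = monaco.editor.createModel(
  sourceText,
  "typescript",
  monaco.Uri.parse("file:///dependent.ts")
);

// ...then re-run the checks once the model it depends on has been added.
otherModelLoaded.then(() => revalidateModel(model));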
This is what I did to fix it:
setInterval(() => {
  const range = new monaco.Range(1, 1, 1, 1);
  const addEmptySpace = { forceMoveMarkers: true, range, text: ' ' };
  for (const m of monaco.editor.getModels()) {
    const toInvert = m.applyEdits([addEmptySpace]);
    m.applyEdits(toInvert);
  }
}, 50 * 1000)
Every fifty seconds you insert and immediately remove a space. I don't like it, but it works.