I have a ReactJS app with Firebase as the database/backend.
There are 1000 items synced with Firebase, and I get new data with
firebase.database().ref().child('path').on('value', snapshot => {...})
During normal usage, users change data in that list, it updates for everyone, and all is well. The ReactJS side is well optimised and works without lag.
The problem is when I run a cron job that updates all of those items with some 3rd-party data: every user gets the on event fired by Firebase 1000 times, one after the other. The browser freezes and RAM usage climbs to ~2-3 GB (normal usage is ~200 MB).
There are 2 options I'm thinking about; maybe you could add something:
Make the cron update step by step, 1 item per second, so those 1000 on events fire over a ~15 minute timespan, and just run the cron forever: when it finishes updating the last item, it starts again. (A rough sketch of this follows below.)
Make an abstraction layer for the Firebase connection, and if the on event fires something like 5 times per second, disconnect with off and then reconnect with on after a couple of seconds, so I'd be disconnected during batch updates.
I like #2 more because then I can do whatever is needed in the cron job, and it also solves any possible future issues with batch updates.
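For reference, a rough sketch of what option 1 could look like on the cron side, assuming a Node.js worker using firebase-admin; the 'path' ref, the databaseURL placeholder and the fetchThirdPartyData() helper are made-up names for illustration:
const admin = require('firebase-admin');

admin.initializeApp({
  credential: admin.credential.applicationDefault(),
  databaseURL: 'https://<your-project>.firebaseio.com', // placeholder
});

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function pacedUpdate() {
  // fetchThirdPartyData() is a hypothetical helper returning [{ key, data }, ...]
  const updates = await fetchThirdPartyData();
  for (const { key, data } of updates) {
    await admin.database().ref('path').child(key).update(data);
    await sleep(1000); // 1 item per second spreads ~1000 writes over ~15-17 minutes
  }
}
Each client still receives the same number of value events in total, just spread out instead of arriving in one burst.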
It seems like your browser process is getting overloaded by the number of items that are queued up by the batch processing.
The best solution is probably to limit the amount of data that the browser process pulls in from Firebase to something it can handle:
firebase.database().ref().child('path')
  .limitToFirst(10)
  .on('child_added', snapshot => {...})
This way your browser will never see more than 10 items.
You'll note that I switched from a value event to child_added, which will probably also help. If you have a queue, you should probably also handle child_removed to remove the item from the browser UI. So you'll end up with something like:
var query = firebase.database().ref().child('path').limitToFirst(10);
query.on('child_added', snapshot => {
  // add item to UI
});
query.on('child_removed', snapshot => {
  // remove item from UI
});
I know that for most cases it would be possible to limit results with limitToFirst() as Frank van Puffelen suggested, but the problem is that I need all the data available in the UI for all kinds of selections and autocompletion, not to mention fast data filtering without an extra call to the server.
Current solution:
I listen for all incoming events from Firebase, and if the buffer overloads I just call goOffline and then reconnect after some time. The client doesn't see any difference, because goOffline still lets local events fire normally, so the whole UI keeps working. Read more here: https://firebase.google.com/docs/reference/js/firebase.database.Database#goOffline
I add an event listener for each child in Firebase. I'd like to have one global listener, but that doesn't work for me; there is a related question: How to listen for ALL firebase .on('value') events on root node?
let database = window.firebase.database();

// Go offline if too many events per second come in from Firebase.
let events_buffer = 0;
let online = true;

const go_online = () => {
  if (events_buffer > 6) {
    // Still too busy: check again in 3 seconds.
    setTimeout(() => {
      go_online();
    }, 3000);
  } else {
    online = true;
    database.goOnline();
  }
};

const handle_buffer = () => {
  if (events_buffer > 6) {
    if (online) {
      online = false;
      database.goOffline();
      setTimeout(() => {
        go_online();
      }, 5000);
    }
  }
};

// Decay the buffer counter twice per second.
setInterval(() => {
  events_buffer = events_buffer > 0 ? events_buffer - 1 : events_buffer;
}, 500);

// Listen for all events. Related: https://stackoverflow.com/questions/38368328/how-to-listen-for-all-firebase-onvalue-events-on-root-node
database.ref().child('clients').on('value', () => {
  events_buffer++;
  handle_buffer();
});
database.ref().child('users').on('value', () => {
  events_buffer++;
  handle_buffer();
});
//
// ... add listeners for all children
//

export const firebase = window.firebase;
https://gist.github.com/liesislukas/0d78b6ac9613b70bafbb8e1601cc740e
For the max buffer value I use 6, because on initial load the app requires ~3-4 events; 6 would only be reached with some anomaly in my app.
Related
I have a React app that uses Socket.IO to send and receive all data. I created a global socket variable to be shared across all components...
export let gameScoket = null
export const update = (newSocket) => gameScoket = newSocket
I then set the socket in 'Home.jsx' and make a few calls...
update(io("ws://localhost:8888"))
socket = gameScoket
socket.on('...')
The problem arose when adding callbacks to these calls. The callbacks seem to be called a random (very large) number of times, increasing every time the socket is used. An example of this can be seen in these three handlers in 'Game.jsx'...
socket.on("question-update", (data) => {
console.log("Calling question-update")
const responce = JSON.parse(data)
setQuizData(responce.data)
})
socket.on("point-update", (data) => {
console.log("Calling point-update")
const responce = JSON.parse(data)
setUsrData(responce.data)
})
socket.on("personal-point-update", (data) => {
console.log("Calling personal-point-update")
const responce = JSON.parse(data)
setClientScore(responce.data)
})
Whilst there is no evidence of the client spamming the server with requests, the console is bombed with messages, and the state is re-updated so many times the app becomes unresponsive and crashes. Here's a screenshot of the console...
I don't know where I went wrong with my implementation and would love some advice, thanks.
Try using socket.once(event, (data) => {...}); instead.
The bug in your code is that each time new connection() is called, Node registers another event listener for the 'Quest' event. So the first time new connection() is called the number of event listeners for 'Quest' is one, the second time the function is called the number increases to two, and so on.
socket.once() ensures that the number of event listeners bound to a socket for the registered 'Quest' event is exactly one.
Make sure to keep all socket.on() calls within useEffect to prevent duplication.
useEffect(() => {
  socket.on('...')
}, [])
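Building on that, here is a minimal sketch of one handler with a cleanup function, assuming the shared socket instance and the setQuizData state setter from the question; the cleanup removes exactly this listener again so a re-mounted component doesn't stack duplicate handlers:
import { useEffect } from 'react';

useEffect(() => {
  const onQuestionUpdate = (data) => {
    // parse the payload and update state, as in the question
    const response = JSON.parse(data);
    setQuizData(response.data);
  };
  socket.on("question-update", onQuestionUpdate);
  return () => {
    // remove this handler again when the component unmounts
    socket.off("question-update", onQuestionUpdate);
  };
}, []);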
On my Angular web app, when the browser refreshes or reloads, the user's login is lost and they must go through the login steps again. I would like the login status to remain active after a browser reload, but only within a short interval, perhaps 10 seconds or so. When the web app reloads, it checks whether the come-back is within this 10 second interval. For that I need to know when the refresh/reload happened, or the last moment the app was active.
How do we determine the moment/time right before the browser reloads (or closes) or the nearest time to that?
You can capture the reload event and store a timestamp in localStorage, then check and compare it each time your app is initialised. A simple function can be:
window.onbeforeunload = () => {
  localStorage.setItem('last_reload_time', (new Date()).getTime());
};
Then in your app, check for last_reload_time and compare it with the current timestamp.
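For illustration, a small sketch of that startup check, assuming the last_reload_time key stored above and a hypothetical restoreLogin() helper:
const last = Number(localStorage.getItem('last_reload_time'));
if (last && Date.now() - last < 10 * 1000) {
  restoreLogin(); // came back within the 10 second window (hypothetical helper)
}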
Another DOM event that may help is visibilitychange
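A rough sketch of that alternative, recording a timestamp (same last_reload_time key as above) whenever the tab becomes hidden:
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    localStorage.setItem('last_reload_time', Date.now());
  }
});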
In its simple JS form, I used the answer by Metabolic as the starting point.
However, the behaviour of the onbeforeunload event is a bit tricky, as stated on MDN, and a few browsers, e.g. Chrome, were giving me the cold shoulder on the event - not firing it. Note that in most cases the reload event does fire but is not caught by the debugger, so if you place breakpoints (e.g. in onBeforeUnload()), do not expect them to stop your code on the event!
I used the following approach with RxJS to resolve it, on Angular.
import { fromEvent } from 'rxjs';

persistKey: string = 'TIME_BEFORE_UNLOAD';

//// eventually, instead of rxjs: fromEvent(...)
//// you can use this:
// @HostListener("window:beforeunload", ["$event"])
// unloadHandler(event: Event) {
//   this.onBeforeUnload(event);
// }

ngOnInit() {
  // use this to test and see;
  // the time stamps should change in the console
  // after each reload click
  console.log('-- Stored time before unload: ',
    localStorage.getItem(this.persistKey));
  this.subscribeToBrowserEvents();
}

private subscribeToBrowserEvents() {
  fromEvent(window, 'beforeunload')
    .subscribe(event => this.onBeforeUnload(event));
}

private onBeforeUnload(event) {
  const val: string = new Date().toISOString();
  localStorage.setItem(this.persistKey, val);
}
I have a React application that uses a data visualization library that uses PixiJS.
I occasionally get frustrating CONTEXT_LOST_WEBGL errors in Chrome that force the user to manually reload the page, in order for the page to be (re)rendered.
I cannot often or reliably reproduce the error, but I know that it happens, as other people tell me the application occasionally shows no data. The situations that raise this error seem very context-dependent and therefore difficult to recreate - low-powered graphics adapters, lots of tabs open at once, etc.
The end user would only know that there are CONTEXT_LOST_WEBGL errors if that user has the Developer Tools console window open. Otherwise, the web page just looks blank.
I have tried the following to set up my React application to reload the window without manual user intervention, when a webglcontextlost event occurs:
componentDidMount() {
...
window.addEventListener("webglcontextlost", (e) => { window.location.reload(); });
...
}
I'm not sure it is working correctly, i.e., whether the webglcontextlost event is being handled elsewhere. Or perhaps I am subscribing to the wrong event?
Otherwise, to handle this more gracefully, is there a way in raw JavaScript, or via a third-party library, to periodically measure the memory available to WebGL, and to use that measurement to reload the page when the available memory reaches some arbitrary threshold that might predict an imminent CONTEXT_LOST_WEBGL error condition?
is there a way in raw Javascript to periodically measure available memory for WebGL
No, just as there is no way to measure JavaScript memory
window.addEventListener("webglcontextlost", (e) => { window.location.reload(); });
Is wrong. It should be
someCanvas.addEventListener("webglcontextlost", (e) => { window.location.reload(); });
Each canvas can individually lose its context. Most browsers only allow 8 to 16 WebGL contexts at once. As soon as the limit is reached canvases start to lose their contexts.
As for recovering gracefully, it's a lot of work. Basically you need to recreate all WebGL resources, which means you need to structure your code so that's possible. Separate all the state of your app from the stuff related to WebGL (or to pixi.js), and when you get a context lost event, prevent the default and recreate all the WebGL stuff:
let gl;
someCanvas.addEventListener("webglcontextlost", (e) => {
  e.preventDefault(); // allows the context to be restored
});
someCanvas.addEventListener("webglcontextrestored", (e) => {
  initWebGL(gl);
});
gl = someCanvas.getContext('webgl');
initWebGL(gl);
Note that pixi.js itself may or may not be designed to handle contextlost
The following code helped restart my Pixijs web application, when the WebGL context is lost:
handleWebGLContextLoss = () => {
  window.location.reload();
};

addCanvasWebGLContextLossEventListener = () => {
  const canvases = document.getElementsByTagName("canvas");
  if (canvases.length === 1) {
    const canvas = canvases[0];
    canvas.addEventListener('webglcontextlost', this.handleWebGLContextLoss);
  }
}

removeCanvasWebGLContextLossEventListener = () => {
  const canvases = document.getElementsByTagName("canvas");
  if (canvases.length === 1) {
    const canvas = canvases[0];
    // removeEventListener needs the same handler reference that was added
    canvas.removeEventListener('webglcontextlost', this.handleWebGLContextLoss);
  }
}
For other applications with more than one canvas, some adjustments would be needed to add other listeners.
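As an untested sketch of that adjustment (the method names here are placeholders), the same handler could be attached to every canvas instead of bailing out when there is more than one:
handleContextLoss = () => {
  window.location.reload();
};

addCanvasWebGLContextLossEventListeners = () => {
  for (const canvas of document.getElementsByTagName("canvas")) {
    canvas.addEventListener('webglcontextlost', this.handleContextLoss);
  }
};

removeCanvasWebGLContextLossEventListeners = () => {
  for (const canvas of document.getElementsByTagName("canvas")) {
    canvas.removeEventListener('webglcontextlost', this.handleContextLoss);
  }
};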
The following code helped me simulate a lost context condition (and to restore from it, via the webglcontextlost event):
simulateWebGLContextLoss = () => {
  //
  // simulate loss of WebGL context, for the purposes
  // of improving user experience when the browser is
  // overwhelmed
  //
  const canvases = document.getElementsByTagName("canvas");
  if (canvases.length === 1) {
    setTimeout(() => {
      const canvas = canvases[0];
      const webgl2Context = canvas.getContext("webgl2", {});
      if (webgl2Context) {
        console.log(`losing webgl2 context...`);
        webgl2Context.getExtension('WEBGL_lose_context').loseContext();
      }
      else {
        const webglContext = canvas.getContext("webgl", {});
        if (webglContext) {
          console.log(`losing webgl context...`);
          webglContext.getExtension('WEBGL_lose_context').loseContext();
        }
      }
    }, 5000);
  }
}
For React lifecycle setup:
componentDidMount() {
  setTimeout(() => {
    this.addCanvasWebGLContextLossEventListener();
  }, 2500);
}

componentWillUnmount() {
  this.removeCanvasWebGLContextLossEventListener();
}
A timeout is required, as the canvas element is not yet available when the component mounts. For my purposes, the short 2.5s timer provides enough time for the event handler to latch onto the canvas.
I'm trying to make throttleTime take effect, but for some reason it does not kick in. I have the following:
// Class properties
private calendarPeriodSubject: Subject<x> = new Subject<x>();
private calendarPeriodObservable$ = this.calendarPeriodSubject.asObservable();

// Throttling fails here (inside constructor):
const calendarPeriodSubscription = this.calendarPeriodObservable$
  .pipe(throttleTime(750))
  .subscribe(async (calendar: x) => {
    // Do http stuff here
  });
The subject gets called like this:
this.calendarPeriodSubject.next(x);
I also tried with:
this.calendarPeriodSubject.pipe(throttleTime(1000)).subscribe({next: (x) => x});
I would like to process the FIRST click, and the following clicks should not have any effect until after e.g. 750 ms - to prevent the server from getting spammed, basically.
Anyone has any idea?
Thanks!
The problem is that you are using the wrong operator for your use case. The way I understand your explanation, you want to let your first call through and stop any further calls to your server for some number of ms. But what throttleTime(ms) does is simply put a timer on the action and execute it ms later, so your server will still be spammed, just a few ms later.
Your case screams debounceTime() to me. See the debounceTime docs.
It disables any further data being passed through the Observable for the specified time after a value has been emitted.
Therefore your code should be fine if you use something like:
const calendarPeriodSubscription =
this.calendarPeriodObservable$.pipe(debounceTime(750)).subscribe((calendar: x) => {
// Stuff with returned data
});
I'm creating an Android app that logs how long a person spends on certain things. I want to add the time spent to the total time spent, so I know how long a user has spent on each exercise type.
I want to do it in a Cloud Function, since I think it's easier than transactions.
exports.addExerciseTime = functions.database.ref('users/{userid}/stats/exerciseTime/{type}').onWrite(event => {
  console.log("Exercise time updates...");
  var newdata = event.data.val();
  var oldData = event.data.previous.val();
  return event.data.ref.update(oldData + newdata);
});
Now, I know that this function will loop until firebase shuts it down.
But how would I do this? Is there an easier way to do this?
You have an easy option of adding a flag indicating that you updated the data. The next time you get into the function, start by checking whether the flag exists and, if so, exit the function. The con of this one is that you will run the function at least n+1 times (a rough sketch follows after the link below).
Another option: according to their latest post, you now have onUpdate and onCreate triggers as well. You might be able to use them smartly to optimise this even more (for example: do XYZ only on first creation, so it won't run on each update).
https://firebase.googleblog.com/2017/07/cloud-functions-realtime-database.html
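To illustrate the flag idea, a rough, untested sketch using the same legacy event.data API as the question; the flag location users/{userid}/stats/exerciseTimeFlags/{type} is a made-up sibling path chosen so that writing the flag does not retrigger the function:
exports.addExerciseTime = functions.database.ref('users/{userid}/stats/exerciseTime/{type}').onWrite(event => {
  const flagRef = event.data.ref.parent.parent
    .child('exerciseTimeFlags').child(event.params.type);
  return flagRef.once('value').then(snap => {
    if (snap.val()) {
      // this write came from the previous invocation: clear the flag and stop
      return flagRef.remove();
    }
    var newdata = event.data.val();
    var oldData = event.data.previous.val() || 0;
    // set the flag first, then write the accumulated value (which triggers one extra run)
    return flagRef.set(true).then(() => event.data.ref.set(oldData + newdata));
  });
});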
Like you are saying, onWrite will capture every write event. My solution would be to replace onWrite with onCreate, but let the user write to another path, because otherwise Firebase will keep triggering the function. Besides that, your approach is not the best solution since the updates can conflict; the use of transactions is better. That would look like this:
exports.addExerciseTime = functions.database.ref('users/{userid}/stats/exerciseTime/{type}').onCreate(event => {
  console.log("Exercise time updates...");
  var newdata = event.data.val();
  const pathToValue = // create the path here to exerciseTime
  return pathToValue.transaction(function(exercisetime) {
    return (exercisetime || 0) + newdata;
  });
});
*Notice the onCreate event instead of onWrite. Again: you will need to write it to another path.
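As a concrete, untested sketch of that suggestion, using the same legacy event.data / event.params API as above: the log path users/{userid}/exerciseLog/{logid} and its type/duration fields are made-up names for the 'other path' the client would write to, while the running total is updated atomically with a transaction at the original stats location:
const functions = require('firebase-functions');

exports.addExerciseTime = functions.database.ref('users/{userid}/exerciseLog/{logid}').onCreate(event => {
  const log = event.data.val(); // e.g. { type: 'pushups', duration: 120 }
  // point at users/{userid}/stats/exerciseTime/{type} and add the logged duration
  const totalRef = event.data.ref.root
    .child('users').child(event.params.userid)
    .child('stats').child('exerciseTime').child(log.type);
  return totalRef.transaction(exercisetime => (exercisetime || 0) + log.duration);
});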