I have a one-page web app, so there is no page refreshing. Sometimes I leave the page open overnight, and when I come back in the morning and start interacting with it again, I usually find I have to refresh. JavaScript performs incorrectly - edit-in-place loads weird data, AJAX calls don't fire... It's nothing to do with the backend; it seems to just be the browser dumping its memory, or something. There are no sessions involved.
How does Google Calendar stay open for 3 days and still fire event alerts?
I have a 'keep alive' call that fires every 5 seconds, in an attempt to keep the browser on its toes, but it hasn't helped. What's the trick? Is there a way to tell the browser to hold everything in memory forever?
(I'm sure this is addressed in numerous places on the web, but I can't figure out what to search for.)
Possible things to look at:
Test on a couple of different browsers to see if they have the same problem.
If they do, then it's almost certainly something in your code.
Otherwise it's probably something with your current browser, perhaps interacting with some portion of your code.
Seems trivial, but a lot of people overlook this in JavaScript: make sure you release references to anything you allocate once you no longer need it, so it can be garbage collected (see the sketch after this list).
If you use any third-party libraries, consider updating them or checking their forums.
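As an illustration of that cleanup point - the element, handler, and data structure here are all made up:

    // Illustrative only: drop references and detach handlers you no longer
    // need so the garbage collector can reclaim them.
    var button = document.getElementById('save-button'); // hypothetical element
    var cache = { /* large, long-lived data structure */ };

    function onClick() { /* ... */ }
    button.addEventListener('click', onClick);

    // Later, when this part of the UI is torn down:
    button.removeEventListener('click', onClick);
    cache = null; // release the reference so the memory can be reclaimed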
Good luck!
When you refresh the page, do you need to do any authentication because of a timed-out session or anything? Because if all you really need to do is hit F5 and you're good to go, I would suggest you create an 'idle' timer in your app that does a window.location.reload() every hour if there is no interaction (in other words, reset that timer each time there is an interaction).
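A minimal sketch of that idea - the hour limit and the choice of events are illustrative:

    // Reload the page after an hour with no user interaction.
    var IDLE_LIMIT_MS = 60 * 60 * 1000; // one hour; adjust to taste
    var idleTimer;

    function resetIdleTimer() {
        clearTimeout(idleTimer);
        idleTimer = setTimeout(function () {
            window.location.reload();
        }, IDLE_LIMIT_MS);
    }

    // Any interaction counts as activity and resets the timer.
    document.addEventListener('mousemove', resetIdleTimer);
    document.addEventListener('keydown', resetIdleTimer);
    resetIdleTimer();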
Hope this helps
I have a web app that can basically be looked at as a messaging system - people can submit a message, and someone else can receive it. This all works via AJAX, and the JavaScript front end interacts with a PHP backend. All of this works completely fine and there's no issue.
I have also implemented a notification system that sends the desktop or Android app a push notification when a new message is received. This also works completely fine.
The notification system uses setTimeout to periodically check the PHP AJAX system. But this is where the deal-breaking issues arise.
When out of focus on Android, setTimeout becomes completely unreliable - sometimes it will work, sometimes it will not work at all, sometimes it is very late.
To fix this, I moved everything into a service worker, as I thought that would run independently of whether the browser is focused, but this is even worse - it seems even less consistent than just running setTimeout in the browser.
So is there some way to rectify this? Is there some special directive within the service worker that I can use so that it does not sleep?
Thank you.
This API does not guarantee that timers will run exactly on schedule. Delays due to CPU load, other tasks, etc., are to be expected, and may result in the actual delay being longer than intended.
You can read more about the setTimeout delay and the reasons it may run longer than expected on MDN.
If you need immediate, messaging-like capabilities, you should look into an architecture and protocol meant for that. Something like WebSockets with events would better suit this use case.
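For illustration, a minimal client-side sketch - the endpoint and the message shape here are assumptions, not a real API:

    // The server pushes messages over a WebSocket instead of the client
    // polling with setTimeout.
    var socket = new WebSocket('wss://example.com/messages'); // hypothetical endpoint

    socket.onmessage = function (event) {
        var msg = JSON.parse(event.data); // assumed JSON payload
        // React immediately when the server pushes a new message.
        console.log('New message:', msg.text);
    };

    socket.onclose = function () {
        // Real code would reconnect here, ideally with a backoff.
    };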
I'm working on a college project that's going well, apart from a problem with slowness. It seems to take forever to load.
I have a feeling that I'm not deploying it properly, or something else is slowing it down. It's deployed on a Heroku Hobby server, which costs $7 a month and never has to go to sleep.
How would I properly check, through Chrome's inspector or something similar, what's causing it to take 20 seconds to see anything?
For me, the page took 9.85s to load the DOM content and 33.91s to fully load. I should also note that I'm in Korea, so it may take slightly longer for me.
Like the others have said, you have far too many script tags. Each makes its own request, and each has to wait until the previous request is finished before it can fire off.
To drastically cut the load time, I would suggest concatenating and minifying your scripts into as few files as possible - maybe one bundle for third-party scripts and another for all of your own. Also, cache them if you can.
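For illustration, the change looks roughly like this - the file names are made up:

    <!-- Before: many separate, render-blocking requests -->
    <script src="js/jquery.js"></script>
    <script src="js/plugin-a.js"></script>
    <script src="js/plugin-b.js"></script>
    <script src="js/app.js"></script>

    <!-- After: one minified bundle for vendor code, one for your own -->
    <script src="js/vendor.min.js"></script>
    <script src="js/app.min.js"></script>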
The Network and Audits tabs in Chrome DevTools (inspect) can help you find your issue, which definitely seems to be the number of scripts you have loading.
Forgive me if the question is too subjective.
I do understand what it is. No need to explain that, please.
I just don't get why people find it useful. Same for live reload. The act of pressing CMD+R isn't something that takes time. If the actual refresh takes significant time, seems like one should just fix their dev environment.
I have a trust problem with such things. Seems too likely that they'll end up causing bugs. I fear that I'd spend an hour tracking down a bug only to find it was hot-module-reloading's fault. Or that everything is working in dev, but breaks in prod because prod isn't using hot-module-reloading and dev is. Perhaps this fear is misplaced.
I also find it tricky to know when the changes have taken effect. Seems simpler to just know that once you press CMD+R, your changes are there.
It's an efficiency thing. No matter how fast your dev environment is, when you press Refresh, it's going to take a second or two, since by definition things won't be coming from cache, and we're talking dev, so there could be a bunch of HTTP requests that would be consolidated into just a couple of HTTP requests in production. So if your workflow is save, switch to browser, hit refresh, you're sitting there for a couple of seconds. Every time.
With live-reload, hit save in your editor, and by the time you've switched over to your browser, the refresh is already completed or at least underway.
I was skeptical about the value of it until I started using it. Definitely noticed cycles were a little bit faster, sometimes markedly faster depending on what I changed and whether I flipped to the browser right away or not.
I haven't had trouble with it causing bugs. YMMV.
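For what it's worth, the mechanism itself is simple. A toy sketch, assuming a hypothetical /version endpoint on the dev server that returns the current build number (real tools typically use a WebSocket and file watchers instead):

    // Poll the dev server and reload when the build changes.
    var lastVersion = null;

    setInterval(function () {
        fetch('/version') // hypothetical endpoint
            .then(function (res) { return res.text(); })
            .then(function (version) {
                if (lastVersion !== null && version !== lastVersion) {
                    window.location.reload();
                }
                lastVersion = version;
            });
    }, 1000);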
HMR relies on maintaining state. For a simple web app, the reload may only take seconds. For a complex web app, the steps required to get to a given state may be long and complex, open to mistakes along the way. So you do your long list of steps, get to the bit you are testing and - smack - a silly typo means you have to start over again. All that unnecessary repetitive work can be avoided, turning a job from hours into tens of minutes, by using HMR. I have no problem trusting it for what it is designed to do.
You would never release without testing, and you would never test with HMR. HMR is for development; you only need to trust it to maintain state while you work through a module.
It scares me too... if I can't trust it 100% of the time (and I can't get away from refreshing 100% of the time), why bother with the extra complication of it?
It's refreshing to refresh. That's my new saying.
Let me begin by saying I do not want to "disable" or otherwise prevent the proper usage of the browser history buttons.
What I need is a JavaScript-based procedure (cross-browser compatible, hopefully) to refresh a webpage (staying on the same URL) after navigating to it using the back/forward buttons. This is necessary because the server keeps track of the user's position/page, and if the user jumps back 3 pages I need to 'inform' the server of the new location by reloading the page (or is there a better way to do it?). I already disabled caching through HTTP headers, but this doesn't work for back/forward history, at least in Firefox 7.
Using jQuery is of course acceptable and desirable. I looked around a bit and found out about $(document).ready(). Now, please keep in mind I'm a complete JavaScript noob. I have zero experience, and the same goes for jQuery (I know what it does, I've looked at the docs, but that's about it). So I'm trying to understand how this works, but pages that mention this method seem to assume that the web developer wants to modify the DOM from it, and there are a few quirks when you want to do that (load order and stuff). Since in my case I only need to refresh, it should hopefully be easier. So:
I understand this doesn't only run when you browse back; it also runs every time you load the page. How can I make sure I don't end up with an infinite loop? I want it to run once when I browse back, but not on load, whether after the automated refresh or otherwise. On a normal load I'd rather not have it running, because the user would have to download each page twice, which is stupid!
Or is there a better way to do this? Any other ideas? Care to explain or point me in the right direction?
EDIT: I only need compatibility with:
Internet Explorer 8 or higher
Firefox 4 or higher
Recent-ish Chrome/Safari (I don't keep track of version numbers, but why would someone not use an up-to-date Chrome anyway?)
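One way to run code only on a back/forward restore, rather than on every load, is the pageshow event, whose persisted flag is true when the page is served from the back/forward cache. A minimal sketch - note that IE8 does not support pageshow, so this only covers the other browsers on that list:

    // Reload only when the page came back from the back/forward cache.
    // A normal load (including the reload itself) has persisted === false,
    // which avoids the infinite-loop problem.
    window.addEventListener('pageshow', function (event) {
        if (event.persisted) {
            window.location.reload();
        }
    });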
The best workaround I ever found for this problem is to use location.replace(). A minimal sketch (the target URL is illustrative):
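    // location.replace() swaps the current entry out of the session history,
    // so the back button cannot return to the replaced page.
    window.location.replace('next-step.html'); // illustrative URL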
It does not directly address the problem from my original question; however, since that seems to have no solution (for now), I recommend that everyone use this client-side function to protect the server-side pages they do not wish to have executed again by a client using the back button. I'm sure this is better explained elsewhere on Stack Overflow, but for the few people using my convoluted way of thinking to look the problem up, there you have it.
It's a bit of an abuse, but one way of doing this would be to have your "proceed to next step" button be a form which POSTs. For example, instead of

    <a href="foo">Proceed to next Page</a>

you have

    <form action="foo" method="POST"><input type="submit" value="Proceed to next page" /></form>
If the user hits back, they'll be forced to re-send their data to the server, and your page will be refreshed. This would probably be really annoying to the user, though!
But as I mentioned, it's a major abuse of forms!
EDIT: This abuse will only work for certain scenarios, though; you'll be the best judge of whether it's appropriate.
I'm having a bit of trouble with IE 8 (and probably all previous versions). Firefox and Chrome/Webkit are seemingly fine.
Something is causing page rendering, scrolling, and basically all page interaction to become blocked. As best I can tell, JavaScript execution is causing this to happen.
Specifically, I think there are two major responsible parties in my specific situation - Google Maps API (v3) and Facebook Connect.
In both cases, I am using the asynchronous load methods provided by both Google and Facebook.
I've tried a couple of things so far, to no avail:
Delaying execution with jQuery's $(document).ready(). This just postpones the locking until slightly later in the page load. Actually, since I use gzip compression, I'm not really sure it does anything - I'm not clear on how that works.
Delaying execution with window.onload. Same situation - the whole page loads, then it locks up while it grabs and executes the Facebook Connect code.
Using setTimeout(function(){}, 0). I'm not 100% clear on how this is actually supposed to work - as I understand it, it essentially forces the execution of the function's code to wait until the call stack is clear. Unfortunately, this doesn't seem to do much of anything for me.
I think the problem is especially exaggerated for me because I am on a slow connection.
I can't think of any specific oddities with my site that would be a factor, but I won't rule that out.
So, my question:
Are there any best practices or existing solutions to this issue?
Is there something that I am obviously doing wrong?
The offending site is at: http://myscubadives.com/, if anyone would be willing to take a look at the specific implementation.
Thank you in advance for your time and help.
Sam
Yes, the browser (at least IE) suspends itself while JavaScript is being executed. This makes things a bit faster, because it doesn't have to redraw and recalculate the layout every time you make a change. However, if your JavaScript takes a long time to execute, this will look like freezing. Synchronous XMLHttpRequests also count.
Unfortunately there is no pretty workaround. The typical advice is to use the window.setTimeout() function with the timeout set to 0 (or something very small) to split the workload into several parts. In between, the browser can manage to redraw itself and respond to some user interaction, so it doesn't seem frozen. The code gets ugly though.
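A minimal sketch of that splitting technique - the chunk size and names are illustrative:

    // Process a long list in small chunks, yielding to the browser between
    // chunks so it can repaint and respond to input.
    function processInChunks(items, chunkSize, handle) {
        var i = 0;
        (function doChunk() {
            var end = Math.min(i + chunkSize, items.length);
            for (; i < end; i++) {
                handle(items[i]);
            }
            if (i < items.length) {
                setTimeout(doChunk, 0); // let the browser breathe, then continue
            }
        })();
    }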
In case of lengthy XMLHttpRequests you have no choice but to use the asynchronous version.
Added: Ahh, I see that you already know this. Should read more carefully. :P Did you also know that IE8 has developer tools built in (press F12 to activate) and that the JavaScript tab has a profiler? I checked it out, and 2 seconds were spent exclusively in jQuery's get() method, which gives me a strong suspicion that something is still using synchronous XMLHttpRequests.
Function:        get
Count:           10
Inclusive time:  2,039.14 ms
Exclusive time:  2,020.59 ms
Url:             http://ajax.googleapis.com/ajax/libs/jquery/1.4.3/jquery.min.js
Line:            127