Will JavaScript Ajax continue running? - javascript

I am developing a simple web application which will be used for tracking GPS position. The general idea is that every couple of minutes it'll send the GPS location back to a server with an AJAX request. Note the device will be connected to power permanently.
We ideally need this to keep running in the background when the screen is turned off. I've put together a test with a Nexus running Lollipop and Chrome and had limited success. The requests keep sending for 5-10 minutes after the screen is off, then they stop.
Oddly, there still seem to be occasional requests. For my testing I'm using a 10-second request interval; in the real world it'll be more like 5-10 minutes. Some sort of queue? If this is a thing and I can figure out how to work with it reliably, it might work.
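A minimal sketch of the kind of loop described, assuming a hypothetical /api/position endpoint (the question doesn't name one):

```javascript
// Hypothetical sketch: report the current GPS position on a fixed timer.
// The /api/position endpoint and payload shape are assumptions.
const INTERVAL_MS = 5 * 60 * 1000; // "every couple of minutes"

function reportPosition() {
  navigator.geolocation.getCurrentPosition(function (pos) {
    fetch('/api/position', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        lat: pos.coords.latitude,
        lng: pos.coords.longitude,
        at: Date.now()
      })
    }).catch(function (err) {
      console.error('Position upload failed', err);
    });
  });
}

reportPosition();
setInterval(reportPosition, INTERVAL_MS);
```

This is the pattern that runs into the behaviour described above: once the screen has been off for a while, the browser throttles or suspends the timer and the uploads stop.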

Related

web page shows 507 insufficient storage once in a while

I have a website up and running. The website worked fine on localhost with no such errors, but after I put it online it started showing a 507 Insufficient Storage page whenever two or three users used the same page at the same time.
For example there is a webpage chat.php which runs an ajax request to update the chats every 700 milliseconds. Side by side two ajax requests keep checking for messages and notifications. These ajax requests are completed using javascript's setInterval method. When this page is accessed concurrently by two or more users the page does not load and shows the error and sometimes the page shows 429 too many requests error. So at the same time maximum 4 requests can occur at the user's end and that too if the scripts run at the same time. Could this occur because of the limited entry processes? The hosting provides me with 10 limited entry processes by default. Please reply and leave a comment if you want me to post the setInterval method code even though I think the problem is something else here.
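The setInterval code isn't posted, but the pattern described (a chat update plus two side-by-side checks, each on its own fixed timer) amounts to something like this hypothetical sketch, assuming jQuery and made-up endpoint parameters:

```javascript
// Hypothetical reconstruction of the pattern described in the question:
// three independent polls fired on fixed 700ms timers, regardless of
// whether the previous request has finished. Endpoints are assumptions.
setInterval(function () { $.get('chat.php?action=chats'); }, 700);
setInterval(function () { $.get('chat.php?action=messages'); }, 700);
setInterval(function () { $.get('chat.php?action=notifications'); }, 700);
```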
For example there is a webpage chat.php which runs an ajax request to update the chats every 700 milliseconds.
These ajax requests are completed using javascript's setInterval method.
When this page is accessed concurrently by two or more users the page does not load and shows the error and sometimes the page shows 429 too many requests error.
So at the same time maximum 4 requests can occur at the user's end and that too if the scripts run at the same time.
The hosting provides me with 10 limited entry processes by default.
Please take some time to read through (your own) quotes.
You state that you AJAX the server every 700ms, and you do so using setInterval. There is a maximum of 4 requests per user and 10 in total. If there are 2 or more visitors, things go haywire.
I think multiple things may be causing issues here:
You hit the 10 requests limit because of multiple users.
When 2 users each make 4 requests you're at 8; if anything else makes a request to the server you very quickly hit the maximum of 10. With 3 users at 4 requests each you're at 12, which according to your question is over your limit.
You might be DoSing your own server.
Using setInterval to fire AJAX requests is bad. Really bad. The problem is that if you request your server every 700ms and the server needs more than those 700ms to respond, you'll stack up requests. You will eventually hit whatever the limit is with just one user (although in certain cases the browser might protect you).
How to fix
I think 10 connections (if it's actually 10 connections, which is unclear to me) is very low. However, you should refactor your code to avoid using setInterval. Use something like a Promise to keep track of when a request ends before scheduling the next one, so that requests can't pile up. Keep as a rule of thumb that you should never use setInterval unless you have a very good reason to do so; it's almost always better to use some more intelligent scheduling.
You also might want to look into being more efficient with those requests: can you merge the calls that check for messages and notifications?
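A minimal sketch of that idea, scheduling the next poll only after the previous one settles (the endpoint, the delay, and renderUpdates are assumptions standing in for the existing code):

```javascript
// Schedule the next poll only after the current request has finished,
// so slow responses can never stack up the way a fixed setInterval can.
const POLL_DELAY_MS = 2000; // assumption; tune to what the server can handle

function poll() {
  fetch('chat.php?action=updates')                  // hypothetical merged endpoint
    .then(function (res) { return res.json(); })
    .then(function (data) { renderUpdates(data); }) // your existing render code
    .catch(function (err) { console.error('Poll failed', err); })
    .finally(function () {
      setTimeout(poll, POLL_DELAY_MS);              // only now schedule the next request
    });
}

poll();
```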

Browser drops POST when Apache KeepAlive is on

For many years I have had apache keep-alive turned on for performance reasons. It allows connections to be reused and makes my pages load subtly faster. However, in the last several months a strange issue has started to happen.
Sometimes, a connection from a user's browser to my application gets dropped which causes data to not get saved and an error to be presented. I have done a considerable amount of testing and think that I have narrowed the problem down. It doesn't matter which browser I use. The database and server side scripting are not a factor. It only happens with POSTs not with GETs which is interesting. It goes away if I disable keep-alive.
Here is what I think is happening. I have KeepAliveTimeout set to 1 second. After 1 second, the server terminates the connection, but it takes a short amount of time (let's say 100ms) for the client to realize it was terminated. So, between 1 second and 1.1 seconds, if the client attempts to reuse the connection and POST some data, that POST will fail. I've reproduced this by making a script that POSTs some data exactly at 1-second intervals, and I can see every other connection from the client getting dropped. If I change the script to POST at 0.9-second or 1.1-second intervals it never drops a connection, because the specific timing window is avoided. If I change KeepAliveTimeout to 2 seconds or some other number, that just pushes out the timing window and doesn't really solve the problem.
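The test script isn't included, but a sketch of that kind of timing test, using jquery.ajax as mentioned below (the URL and payload are placeholders), looks roughly like this:

```javascript
// Sketch of the timing test described above: POST at exactly 1-second
// intervals so each request lands right as the 1s KeepAliveTimeout expires.
// The URL and payload are placeholders, not taken from the question.
setInterval(function () {
  $.ajax({
    url: '/save',
    type: 'POST',
    data: { ping: Date.now() },
    error: function (xhr, status) {
      console.log('POST dropped:', status); // fires on the dropped connections
    }
  });
}, 1000);
```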
My POSTs are coming from javascript (jquery.ajax), but I imagine it could happen from a regular form POST as well if you got the timing right.
In Safari and IE, the connection gets immediately dropped and fails. In Firefox and Chrome the browser stalls for dozens of seconds and then re-sends the request on a new connection which succeeds.
If this is just a fundamental problem with keepalive it is confusing to me why this worked for years and only started doing this in the last few months. Temporarily, I have disabled keep-alive, but I would like to find a way to use it if possible. And I am hoping that someone here knows of a solution.

How do I entertain the user while a page is taking a long time to load?

In reference to this question:
Chrome doesn't seem to fire javascript xmlhttprequests after a form submit, but FF and IE do
I need a way to update the page with information from the server (such as percent complete) after I hit submit, while waiting for the server to respond (the response can take tens of minutes).
In IE and Firefox I can make xhr requests while waiting for the page to load. In Chrome this doesn't work; Chrome won't fire the xhr request, and I never see the hit on the webserver.
How does everybody else do this?
Don't ever make your user sit around for tens of minutes for a single page submission. They'll hate you, your server will hate you, and all kinds of problems can ensue. Instead:
Submit the request, and place it in a queue.
Return a response to the user indicating that the request is queued.
Update the user as the queue is processed. Either in real-time using a comet solution, or just poll it every 2-3 minutes if it's not time-sensitive.
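A minimal sketch of that submit-then-poll pattern, assuming hypothetical /submit and /status endpoints and an updateProgressBar helper for the UI:

```javascript
// Hypothetical sketch: queue the job, then poll for progress instead of
// holding the original request open for tens of minutes.
function submitJob(formData) {
  return fetch('/submit', { method: 'POST', body: formData })
    .then(function (res) { return res.json(); })   // assume it returns { jobId: ... }
    .then(function (job) { pollStatus(job.jobId); });
}

function pollStatus(jobId) {
  fetch('/status?job=' + encodeURIComponent(jobId))
    .then(function (res) { return res.json(); })   // assume it returns { percent: ... }
    .then(function (status) {
      updateProgressBar(status.percent);           // your UI code (hypothetical)
      if (status.percent < 100) {
        setTimeout(function () { pollStatus(jobId); }, 2 * 60 * 1000); // poll every 2-3 minutes
      }
    });
}
```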
I received extraordinary feedback on a very simple queueing system I developed for users while they waited for their long running processes to finish.
Time passes more quickly when you have something interesting to read, so I pulled in rss feeds (in an ajaxy fashion) from different news sites while the web app polled the queue for process-completion in the background. When the polling indicated complete, I simply popped a js confirm box asking the user if they wanted to keep the news window open or return to their regular workflow.
Complaints about processing times have all but disappeared since rolling out this solution. I'm not saying it's perfect for every situation, but it certainly worked for me.

Javascript 1 second apart ajax requests? Resource usage?

I have a long-term goal of eventually creating a chat of some sort by any means, but for now I'd just like a simple one with some MySQL and ajax calls.
To make the chat seem instant, I'd like to have the ajax request interval as fast as possible. I get the feeling that if it's a second or less, it's going to bog down the browser, the user's internet, or my server.
Assuming the server doesn't return anything, how much bandwidth and cpu/memory would the client use with constant, one second apart ajax calls?
edit: I'm still open to suggestions on how I can do a chat server. Anything that's possible with free hosting from x10 or 000webhost. I've been told of Henoku but I have no clue how to use it.
edit: Thanks for the long polling suggestion, but that uses too much cpu on the servers.
One technique that can be used is to use a long-running ajax request. The client asks if there's any chat data. The server receives the request. If there's chat data available, it returns that data immediately. If there is no chat data, it hangs onto the request for some period of time (perhaps two minutes) and if some chat data appears during that two minutes, the web request returns immediately with that data. If the full two minutes elapses and no chat data is received, then the ajax call returns with no data.
The client can then immediately issue another request to wait another two minutes for some data.
To make these "long" http requests work, you just need to make sure that your underlying ajax call has a timeout set for longer than the time you've set it for on the server.
On the server, you need an efficient mechanism for waiting for data, probably involving semaphores or something like that, because you don't want to be polling internally in the server either.
Doing it this way, you can get a near-instantaneous response on the client while only making about 30 requests an hour.
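A sketch of the client side of that loop, using jQuery for brevity, with the request timeout set longer than the server's two-minute hold (the URL and showMessages are assumptions):

```javascript
// Long-poll loop: the server holds the request open for up to two minutes,
// so the client-side timeout must be longer than that.
function waitForChat() {
  $.ajax({
    url: '/chat/wait',                 // hypothetical long-poll endpoint
    timeout: 130000,                   // 130s client timeout > 120s server hold
    success: function (data) {
      if (data && data.messages) {
        showMessages(data.messages);   // your display code (hypothetical)
      }
    },
    complete: function () {
      waitForChat();                   // immediately open the next long poll
    }
  });
}

waitForChat();
```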
To be friendly to the battery of a laptop or mobile device, you need to be sensitive to when your app isn't actually being used (browser not displayed, not the current tab, etc...) and stop the requests during that time.
As to your other questions, repeated ajax calls (as long as they are spaced at least some small amount of time apart) don't really use much in the way of CPU or memory. They may use battery if they keep the computer from going into an idle mode.
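On the point about stopping requests while the app isn't being used, the Page Visibility API gives a simple hook; here startPolling and stopPolling stand in for whatever scheduling mechanism you use:

```javascript
// Pause polling while the tab is hidden, resume when it becomes visible.
document.addEventListener('visibilitychange', function () {
  if (document.hidden) {
    stopPolling();    // hypothetical: clear your timer / abort the open request
  } else {
    startPolling();   // hypothetical: restart the polling loop
  }
});
```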

Javascript timers & Ajax polling/scheduling

I've been looking for a simpler way than Comet or Long-Polling to push some very basic ajax updates to the browser.
In my research, I've seen that people do in fact use Javascript timers to send Ajax calls at set intervals. Is this a bad approach? It almost seems too easy. Also consider that the updates I'll be sending are not critical data, but they will be monitoring a process that may run for several hours.
As an example - Is it reliable to use this design to send an ajax call every 10 seconds for 3 hours?
Thanks, Brian
Generally, using timers to update content on a page via Ajax is at least as robust as relying on a long-lived stream connection like Comet. Firewalls, short DHCP leases, etc., can all interrupt a persistent connection, but polling will re-establish a client connection on each request.
The trade-off is that polling often requires more resources on the server. Even a handful of clients polling for updates every 10 seconds can put a lot more load on your server than normal interactive users, who are more likely to load new pages only every few minutes, and will spend less time doing so before moving to another site. As one data point, a simple Sinatra/Ajax toy application I wrote last year had 3-5 unique visitors per day to the normal "text" pages, but its Ajax callback URL quickly became the most-requested portion of any site on the server, including several sites with an order of magnitude (or more) higher traffic.
One way to minimize load due to polling is to separate the Ajax callback server code from the general site code, if at all possible, and run it in its own application server process. That "service middleware" can handle polling callbacks, rather than giving up a server thread/Apache listener/etc. for what effectively amounts to the question "are we there yet?"
Of course, if you only expect to have a small number (say, under 10) users using the poll service at a time, go ahead and start out running it in the same server process.
One thing worth noting here is that polling at an unchanging interval is simple, but it is often unnecessary or undesirable.
One method that I've been experimenting with lately is having positive and negative feedback on the poll. Essentially, an update is either active (changes happened) or passive (no newer changes were available, so none were needed). Updates that are passive increase the polling interval. Updates that are active set the polling interval back to the baseline value.
So for example, on this chat that I'm working on, different users post messages. The polling interval starts at the baseline of 5 seconds. If other site users are chatting, you get updated about it every 5 seconds. If activity slows down and no one has chatted since the latest message was displayed, the polling interval gets slower by about a second each time, eventually capping at once every 3 minutes. If, an hour later, someone sends a chat message again, the polling interval drops back to 5-second updates and starts slowing again.
High activity -> frequent polling. Low activity -> eventually very infrequent polling.
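A sketch of that feedback scheme, with the baseline and cap taken from the example above (the endpoint and showMessages are assumptions):

```javascript
// Adaptive polling: active updates reset the interval to the 5s baseline,
// passive (empty) updates stretch it by a second, capped at 3 minutes.
const BASELINE_MS = 5000;
const CAP_MS = 3 * 60 * 1000;
const STEP_MS = 1000;
let intervalMs = BASELINE_MS;

function poll() {
  fetch('/chat/updates')                           // hypothetical endpoint
    .then(function (res) { return res.json(); })
    .then(function (data) {
      if (data.messages && data.messages.length > 0) {
        showMessages(data.messages);               // your display code (hypothetical)
        intervalMs = BASELINE_MS;                  // active update: back to baseline
      } else {
        intervalMs = Math.min(intervalMs + STEP_MS, CAP_MS);  // passive: slow down
      }
    })
    .catch(function () {
      intervalMs = Math.min(intervalMs + STEP_MS, CAP_MS);    // back off on errors too
    })
    .finally(function () {
      setTimeout(poll, intervalMs);
    });
}

poll();
```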
