I'm making a media player website that lets the user listen to audio in the background. After finishing playback, it makes a handful of AJAX requests to transmit data and switch to the next audio file.
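For context, the post-playback requests are roughly of this shape (a sketch with placeholder endpoints and variable names, not my actual code):

audio.addEventListener('ended', async () => {
  // Report listening data for the track that just finished (placeholder endpoint).
  await fetch('/api/playback-stats', {
    method: 'POST',
    body: JSON.stringify({ trackId: currentTrackId, finishedAt: Date.now() })
  });
  // Ask the server for the next audio file and switch to it (placeholder endpoint).
  const next = await fetch('/api/next-track').then(r => r.json());
  audio.src = next.url;
  audio.play();
});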
The app works fine on desktop Firefox, but on mobile Firefox I find that when the app has been inactive for a certain amount of time, these requests never happen: my server has no log of receiving them. I'm guessing this is a battery/bandwidth-saving feature, but I haven't found any documentation on it.
However, this makes the site unusable on mobile devices, since I can't make those requests. Is there a way to work around this?
Related
I was wondering whether it is possible to throttle the user's internet speed while they access the website, based on a choice the user makes. This is needed for a small-scale test of how users react to different internet speeds. My workaround would be to have the user manually throttle the speed in Chrome DevTools, but I would prefer that as a last resort. Any option to achieve this, or something similar, would be amazing. Thank you.
Edit: Just to clarify, I am looking to code the throttling functionality into the website itself, so the user won't have to install anything or set up Chrome DevTools manually; I am already aware of those solutions.
What you want to do is not easily possible, for security reasons. Chrome (and most other browsers) prevents JavaScript from accessing DevTools; a user has to manually and interactively press the buttons in DevTools to change the network speed of the Chrome tab.
In your position, I would simply get the UX testers to use DevTools.
That being said, there are solutions for this, but they can get complex!
Solutions in JS:
Dirty fix:
Create a looping data-downloader script that effectively performs a DoS attack on the client's own connection.
Basically something like:
let delay = 100 * (Math.random() + 0.5); // 50-150 ms between downloads
setInterval(() => fetch('/large-test-file?cb=' + Date.now()), delay); // '/large-test-file' is a placeholder; repeatedly downloading it hogs the client's bandwidth
Issues with this fix:
This creates real network congestion on the client, which might not be optimal.
Introduces web page lag because of CPU usage.
Better, but time-consuming fix:
You can simulate a slow network environment by doing the following:
Periodically call request.abort() on some of your AJAX and XHR requests (a minimal sketch is shown after this list). See here and here. And yes, you have to keep references to the in-flight calls. (Some inspirational code by bruth.)
Randomly prevent some images from loading by changing their src attribute for a few seconds. See here.
And... there is more to it.
Iframes are tricky, as they can come from another domain, and Chrome does not allow cross-domain access to them. To simulate a slow network, you have to stop the iframe once in a while and refresh it via its src attribute, just like the images. You could use window.frames[i].stop() to simulate a frozen/stopped iframe.
Videos are sometimes loaded in iframes, which again makes network lag hard to simulate. Unlike images, reloading a video resets the playback time. AFAIK there is no way to simulate video lag easily (without heavily changing the video playback logic).
And... if you are really into it, go ahead and override various events, such as those from GlobalEventHandlers.
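A minimal sketch of the first two tricks above (aborting in-flight requests and stalling images); the timings and the fetch wrapper are assumptions you would adapt to your own app:

// Keep references to in-flight requests so they can be aborted later.
const inflight = new Set();

function throttledFetch(url, options = {}) {
  const controller = new AbortController();
  inflight.add(controller);
  return fetch(url, { ...options, signal: controller.signal })
    .finally(() => inflight.delete(controller));
}

// Every few seconds, abort a random in-flight request to mimic a flaky network.
setInterval(() => {
  const controllers = [...inflight];
  if (controllers.length) {
    controllers[Math.floor(Math.random() * controllers.length)].abort();
  }
}, 3000);

// Randomly "stall" an image for a few seconds by clearing and restoring its src.
setInterval(() => {
  const imgs = document.images;
  if (!imgs.length) return;
  const img = imgs[Math.floor(Math.random() * imgs.length)];
  const realSrc = img.src;
  img.src = '';
  setTimeout(() => { img.src = realSrc; }, 2000 + Math.random() * 3000);
}, 5000);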
Many solutions aside from JS:
Use Chrome DevTools (easiest as mentioned)
If the site is connected to a server you own, add a delay before responding, to simulate server congestion (see the sketch after this list).
Use/create a Chrome Extension that changes the network speed
Create your own browser that can run the site, and change the network speed accordingly
Install software that controls network settings at the OS level
Change the network speed on the router
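As an example of the "add a delay on the server" option above, here is a minimal sketch assuming a Node.js/Express server (your stack may differ):

// Delay every response by 500-1500 ms to simulate a congested server.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  const delay = 500 + Math.random() * 1000;
  setTimeout(next, delay);
});

app.get('/', (req, res) => res.send('Hello, slow world'));
app.listen(3000);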
I'm not entirely sure what experience you want other than the Chrome DevTools way, but here is an alternative:
clumsy makes your network condition on Windows significantly worse, but in a managed and interactive manner
https://jagt.github.io/clumsy/
https://serverfault.com/a/570702
I have a website that, when running on a mobile device such as an Android tablet or, more importantly, a Kindle Fire, times out randomly if it ends up on a Wi-Fi connection that is not too good. The site has a jQuery AJAX keep-alive, but it uses the root domain URL to do this, and what seems to be happening is that the Kindle times out trying to re-request the current page URL.
Can anyone suggest how, for an ASPX website that uses a master page, to make the keep-alive reload from cache rather than do a full URL reload, since the full reload seems to be the issue?
I have tried the options Google turned up, and I know how to check for the device (i.e. whether the site is on a desktop/laptop or a mobile device), but not how to make the Kindle browser use the cache while still keeping the URL alive.
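One hedged interpretation of avoiding the full-page reload is to point the keep-alive at a small dedicated handler instead of the page URL. A minimal sketch, assuming jQuery and a hypothetical KeepAlive.ashx endpoint (the interval is also an assumption):

setInterval(function () {
  $.ajax({
    url: '/KeepAlive.ashx', // hypothetical tiny handler that just returns 200 OK
    cache: false,           // the ping must still reach the server, but it transfers almost nothing
    timeout: 5000
  });
}, 60000); // ping once a minute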
I am pondering and looking into the methods that Google (or other websites) uses for auto-suggest when the internet connection is slow (EDGE or 2G). Google provides auto-suggest for mobile web browsers (Chrome, Safari, Opera, UC Browser, etc.) that are not running on 3G.
I have a data set of 120,000 words and Solr at the back end. The data for a keyword is retrieved from the DB in 100 ms and sent back over HTTP in response to the client's request, yet auto-suggest still does not work on slow mobile data connections (it works for the desktop site).
I have tried workarounds such as creating a text file with the 120,000 entries and storing it in a JS variable on the first request (the txt file is 2.2 MB, so imagine the size of the JS file with that variable), then serving everything else from this variable. SLOW.
Creating multiple text files, one per letter of the alphabet, with the rest the same as above. SLOW.
Trying local browser storage. Auto-suggest still does not work on mobile (2G).
EDIT: My only motive for making the txt file is to save HTTP requests.
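Roughly what I mean by the local-storage approach (a sketch; "/suggest" is a placeholder standing in for my Solr-backed endpoint): cache suggestion responses per prefix so that repeated queries don't trigger new HTTP requests.

async function getSuggestions(prefix) {
  const key = 'suggest:' + prefix;
  const cached = localStorage.getItem(key);
  if (cached) return JSON.parse(cached); // served locally, no HTTP request

  const res = await fetch('/suggest?q=' + encodeURIComponent(prefix));
  const suggestions = await res.json();
  try {
    localStorage.setItem(key, JSON.stringify(suggestions));
  } catch (e) {
    // localStorage quota exceeded; just return the result without caching it
  }
  return suggestions;
}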
Any suggestions?
We have a web application. When we access the site over a Wi-Fi connection it loads perfectly, but when we try it over cellular data (3G), the site opens but some of the elements are not loaded.
So we tried doing some testing.
It turns out that if you load the page for the first time over Wi-Fi, it will keep loading on 3G as well, until you clear the cache and cookies. The Wi-Fi connection and 3G have the same bandwidth and ping.
Our web app uses a lot of JavaScript and AJAX calls to retrieve data. We added cache: false to every AJAX call to prevent caching. When that didn't seem to work, I added a timestamp to the URL as well, which also failed to solve the problem.
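For reference, the cache-busting attempts looked roughly like this (a sketch assuming jQuery's $.ajax; the endpoint name is made up):

$.ajax({
  url: '/api/data?_ts=' + new Date().getTime(), // manual timestamp cache-buster
  cache: false,                                 // jQuery appends its own "_" parameter as well
  success: function (data) {
    // render the data ...
  }
});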
The issue seemed to be related to cookies somehow, because once the cookies are set, the web app works fine.
Does anyone know what could be the problem?
I've run into a real head-scratcher, and I was hoping someone out there could shed some light on my issue.
The application I'm writing is a JS-based client for what is essentially a desktop-sharing service. The service captures images from the desktop, encodes them as base64-encoded JPEGs, and sends them over a websocket to the JS client. The client then displays these images (as data URIs); users can move the mouse over the image as well as click on it. These mouse events are encoded as commands in XML, which are put into a queue that is serviced on a timer every 15 ms; this way the queue can be scrubbed of redundant or duplicate commands before being sent to the service. The commands are then executed (generating click events on the desktop, moving the mouse, etc.), new desktop images are generated, and the cycle continues.
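A rough sketch of the outbound command queue described above (the names, XML format, and dedupe rule are assumptions, not the actual client code):

const ws = new WebSocket('wss://example.com/desktop'); // placeholder URL
let queue = [];

function enqueue(cmd) {
  // Scrub redundant commands: keep only the most recent mousemove.
  if (cmd.type === 'mousemove') {
    queue = queue.filter(c => c.type !== 'mousemove');
  }
  queue.push(cmd);
}

// Service the queue every 15 ms, as described above.
setInterval(() => {
  if (!queue.length || ws.readyState !== WebSocket.OPEN) return;
  for (const cmd of queue) {
    ws.send('<command type="' + cmd.type + '" x="' + cmd.x + '" y="' + cmd.y + '"/>');
  }
  queue = [];
}, 15);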
The whole system works extremely well, except for some very inconsistent behaviour in Safari on the iPad. Essentially, when the user moves their finger around the screen, the client seems to block (or possibly de-prioritize) incoming messages on the websocket in favour of only sending outgoing messages. The way this manifests is that as you move your finger around, the display does not appear to update as long as you are touching the screen; then, once you raise your finger, a flood of image updates is received by onMessage() and animated to the screen in rapid succession.
Mobile Safari is the only browser that appears to behave in this way, none of the desktop browsers, or any of the Android tablets I've tested appear to show the same behaviour.
I've put logging into the inbound and outbound methods on the websocket, and it confirms the behaviour I've seen. On Safari, I'll get numerous outbound messages in a row followed by numerous inbound messages, whereas on Android I'll see the inbound and outbound messages interleaved as you drag your finger around the screen; as a result, the display on Android continues to update while you are dragging your finger around.
The main reason why I suspect the websocket as the culprit is because the client has a fallback mechanism, so that if a browser does not have websocket support, a pair of XHR objects are created (one for inbound and one for outbound) and used instead of the websocket. If I force mobile Safari to use XHR instead of websockets, the problem goes away. In this case only the communication mechanism changes (all of the code for capturing input events and displaying images stays the same).
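A rough sketch of the kind of fallback described above (the endpoint names and long-polling style are assumptions, not the actual client code):

function createTransport() {
  if ('WebSocket' in window) {
    return new WebSocket('wss://example.com/desktop'); // placeholder URL
  }
  // Fallback: one long-polling XHR for inbound frames, one XHR per outbound send.
  const transport = {
    send(data) {
      const out = new XMLHttpRequest();
      out.open('POST', '/outbound'); // hypothetical endpoint
      out.send(data);
    },
    poll(onMessage) {
      const inbound = new XMLHttpRequest();
      inbound.open('GET', '/inbound'); // hypothetical endpoint
      inbound.onload = () => {
        onMessage(inbound.responseText);
        transport.poll(onMessage); // immediately re-poll for the next frame
      };
      inbound.send();
    }
  };
  return transport;
}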
I realize that this a pretty specific problem, and without code it will be very hard to diagnose, but I opted not to post code simply due to the sheer volume of code in the client.
If anyone has seen behaviour similar to what I've described, or knows of any potential reasons for it, I'd be very thankful for your input.
Depending on the size of the packets, you could be running into the problem of 'large' messages being extremely slow on Safari (both on iPad and desktop). Have you tried desktop Safari?
Have a look at this page to see performance comparisons between different browsers.
It could be that this is your problem.
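A small diagnostic sketch (assuming ws is your existing WebSocket instance): log the size and arrival time of each message to see whether the stalls correlate with large frames.

ws.addEventListener('message', (event) => {
  const size = typeof event.data === 'string'
    ? event.data.length
    : (event.data.size || event.data.byteLength); // Blob or ArrayBuffer
  console.log('frame received:', size, 'bytes at', performance.now().toFixed(1), 'ms');
});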