We are trying to update an image's src with a base64 string (data URI) on our webpage every second using JavaScript. The JavaScript (AJAX) fetches the latest image from the web server.
But all mobile browsers crash after 5-10 minutes; the crash dump says the browser is crashing because of low memory.
Is there any way to clean up the memory programmatically?
Thanks in advance.
Regards,
Kartheek
I haven't tested this myself, but according to a comment on this webpage:
cubiq.org/testing-memory-usage-on-mobile-safari
base64 image strings are not properly garbage collected by Mobile Safari, causing it to crash at seemingly random places. I think it's worth trying regular image links and seeing what happens.
Also keep in mind that there is an image data memory limit of around 10 MB.
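A rough sketch of what that would look like (the endpoint name, image id, and 1-second interval here are placeholders for whatever your server provides):

function refreshImage() {
    // A plain URL with a cache-busting timestamp lets the browser
    // manage the image memory itself, instead of keeping ever-growing
    // base64 strings alive in JavaScript.
    document.images["pic1"].src = "/latest-image.jpg?ts=" + Date.now();
}
setInterval(refreshImage, 1000);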
This is my website, hosted on Netlify. All is good except when I try to cycle through this array of objects: I get an initial 0.5-2 second lag in my audio (the code that plays the audio is "audio.play()").
After I have cycled through them once, the lag almost completely disappears. Is this a Netlify thing?
On my localhost it works like in the movies, so perfect!
Would love to get a helpful link/video/advice, thanks.
https://csgo-weapons.netlify.app/
It isn't a Netlify thing in particular, just an internet thing in general.
File loading isn't instantaneous on the web. When someone requests a file (in this case, the gun sound), it needs to get from the server to the client, and that takes some time (depending on things like network speeds, physical distance, etc.). On your local machine, these loading times are negligible, since the files are not traveling over the web.
After a file is loaded it's cached in the browser, which is why you're noticing no delay after cycling through all the guns.
A way to mitigate this issue would be to request and load all the sound files before the user starts cycling through the guns. That way, they don't need to be requested one at a time, on demand. You could also try to reduce file sizes, although that won't help as much as preloading.
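A rough sketch of what preloading could look like; the URL list here is made up, so substitute the sound files your page actually uses:

var soundUrls = ["/sounds/ak47.mp3", "/sounds/awp.mp3"]; // placeholders

var preloaded = soundUrls.map(function (url) {
    var audio = new Audio();
    audio.preload = "auto"; // ask the browser to fetch the whole file
    audio.src = url;
    audio.load();           // start the request up front
    return audio;
});

// Later, play the already-buffered clip instead of creating a new one:
// preloaded[0].play();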
Can anyone guide me on how to achieve this? I am listing the pieces in pointers:
- A Linux binary captures frames from the locally attached webcam and stores them in a folder. This is a continuous process. The images are stored numerically.
- I have a web server which outputs the latest image received from the webcam: a PHP file which fetches the most recent image and prints it out.
- What I have now is JavaScript which refreshes the image every second and displays it in an img tag.
Though it works, the output is slow and updates one frame at a time.
I am trying to display the images quickly, so that it looks like an MJPEG movie being played (not that it has to be that good, as I learned from the forums that HTTP does have its overhead).
<script type="text/javascript">
function refresh() {
    // Append a timestamp so the browser fetches a fresh frame
    // instead of reusing the cached one.
    document.images["pic1"].src = "/latimage.php?camid=$selectedcamid&ref=" + new Date().getTime();
    setTimeout(refresh, 1000);
}
if (document.images) window.onload = refresh;
</script>
<img src='/latimage.php?camid=$selectedcamid' id='pic1'>
The above code works perfectly. But my unsatisfied mind wants to display the frames obtained from the webcam more quickly, at least 3 to 4 frames per second. As I understand from my searches so far, it is not feasible to refresh too quickly, since each HTTP request takes time.
I am trying to find a method by which I can prefetch 100 frames into an image array (I would call it buffering) and start displaying one image at a time at a rate of 3 images per second. While displaying the images, the older images should be removed from the array and the latest fetched ones inserted at the end, so the looping is infinite.
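Something like this rough sketch is what I am imagining (the buffer size, the timings, and the reuse of my PHP endpoint are just placeholders):

var buffer = [];
var BUFFER_SIZE = 100;

// Keep topping up the buffer with fresh frames from the server.
function fetchFrame() {
    var img = new Image();
    img.onload = function () {
        buffer.push(img);
        if (buffer.length > BUFFER_SIZE) buffer.shift(); // drop the oldest
    };
    img.src = "/latimage.php?camid=$selectedcamid&ref=" + Date.now();
}
setInterval(fetchFrame, 250);  // fetch roughly 4 frames per second

// Display from the front of the buffer at 3 frames per second.
function showFrame() {
    if (buffer.length > 0) {
        document.images["pic1"].src = buffer.shift().src;
    }
}
setInterval(showFrame, 1000 / 3);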
I am sorry for asking so many questions; I am unable to find a proper direction to start with. I can do the above in a .NET Windows application quite easily, but in a web browser I am out of ideas. I am not sure whether a jQuery image array, JSON, or plain JavaScript would do.
I need some guidance, please.
If you need to capture the camera output to disk, then I suggest capturing the camera output as video (at 3 FPS) and then streaming that video file to your browser using WebSockets. Here is an example of doing that. If you are willing to run nginx on your server then live_thumb is a complete solution that captures and streams video via WebSockets.
On the other hand, if your goal is just to view the output of the camera and you don't need to store the video, you could consider using WebRTC and running a browser at both ends and then just hooking up the media stream. In other words one browser (perhaps a headless variant) would run on the system with your camera and would stream the video to your other browser using WebRTC. With WebRTC you could get much higher frame rates and your bandwidth would probably still be significantly lower than sending individual images at a slow frame rate.
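Just to illustrate the capture side of the WebRTC idea, here is a sketch; the signaling and peer connection setup are omitted, so this is not a complete solution:

navigator.mediaDevices.getUserMedia({ video: true })
    .then(function (stream) {
        // Show the camera locally; for actual WebRTC streaming you
        // would add the stream's tracks to an RTCPeerConnection:
        //   stream.getTracks().forEach(function (t) { pc.addTrack(t, stream); });
        var video = document.querySelector("video");
        video.srcObject = stream;
        video.play();
    })
    .catch(function (err) {
        console.error("Camera access failed:", err);
    });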
I'm saving a bunch of videos to IndexedDB, then displaying them again for an offline version of an app. I have an issue, however, where occasionally one blob video file becomes corrupt. My investigation so far has revealed:
- the video file itself is not corrupt: it has been re-rendered and renders the same way as the other videos
- the issue is in saving to the DB rather than retrieving from it
- the issue occurs through both my local server and a remote server, though it happens more often on the remote one
- the issue seems to happen randomly, i.e. I do not change any other variables to cause it
So I'm a bit stuck now. Does anyone have any ideas as to what the problem may be?
Thanks in advance :)
IndexedDB is not intended for storing big files, and that is the root of the problem. One direction you could look at is the File System API; unfortunately, this API only works in Chrome (as of Nov 2014).
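A rough sketch of writing a blob with that (prefixed, Chrome-only) API; the file name and quota here are placeholders:

// videoBlob is the Blob you would otherwise have put in IndexedDB.
window.webkitRequestFileSystem(window.TEMPORARY, 50 * 1024 * 1024,
    function (fs) {
        fs.root.getFile("video.webm", { create: true }, function (entry) {
            entry.createWriter(function (writer) {
                writer.onwriteend = function () { console.log("saved"); };
                writer.onerror = function (e) { console.error(e); };
                writer.write(videoBlob);
            });
        });
    },
    function (err) { console.error(err); });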
I am working on an open source plugin for the web mapping library Leaflet. The idea is to generate printable PDF documents from the map directly in the client's browser. Its basic functionality works, but there is an issue in Google Chrome.
Depending on the document format and DPI settings, the script can take some time to fetch all the map tiles as images, convert them to data URIs, and add them to the document. In this case, the Firefox user interface doesn't respond for some seconds and then shows the finished PDF. Chrome, however, stops executing the script and shows me a sad smiley:
Aw, Snap! Something went wrong while displaying this webpage. To continue, reload or go to another page.
Normally I would say that is fine, since there is a limitation due to processing power. But this actually happens for DIN A4 format at 300 dpi, so I can't live with that. I have a strong guess that this is not related to a bug in my code, because I can increase the options step by step, and at some level Chrome stops executing the script.
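For reference, the conversion step looks roughly like the sketch below (simplified; the real plugin code differs and the names here are made up):

function tileToDataUri(url, done) {
    var img = new Image();
    img.crossOrigin = "anonymous"; // otherwise toDataURL() throws on cross-origin tiles
    img.onload = function () {
        var canvas = document.createElement("canvas");
        canvas.width = img.width;
        canvas.height = img.height;
        canvas.getContext("2d").drawImage(img, 0, 0);
        done(canvas.toDataURL("image/png")); // one large base64 string per tile
    };
    img.src = url;
}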
How can I debug my code to find the bottleneck? How can I prevent Chrome from stopping my script?
I am writing a JavaScript application using the File API in Google Chrome. I am loading files with the readAsDataURL method. I only need one file loaded at a time, but each time I load a file it gets added to memory, and after a short while the one tab maxes out my memory. Is there any way to unload files when I am done with them, before I load the next?
There's a bug in Chrome (http://crbug.com/36142) where memory is not released after switching img.src many times.
Try using createObjectURL(). See my response here:
HTML5 File API crashes Chrome when using readAsDataURL to load a selected image
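A minimal sketch of that approach: hand the img element a blob: URL instead of a giant base64 string, and revoke it once the image has loaded so the memory can actually be reclaimed (the element id in the usage note is a placeholder):

function showFile(file, img) {
    var url = URL.createObjectURL(file);
    img.onload = function () {
        URL.revokeObjectURL(url); // release the allocation once rendered
    };
    img.src = url;
}

// Usage with a file input:
// showFile(fileInput.files[0], document.getElementById("preview"));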
It will only stay in memory if you hold a reference to it. Otherwise garbage collection will free the memory for you automatically.