Base64 video encoding - good/bad idea? [closed] - javascript

Closed. This question is opinion-based. It is not currently accepting answers.
Want to improve this question? Update the question so it can be answered with facts and citations by editing this post.
Closed 2 years ago.
I'm working on a mobile front-end project using Cordova, and the backend developer I'm working with insists that the media files (images/videos) should be transferred base64-encoded in JSON files.
Now, with images it's working so far. Although it freezes the UI for a few seconds, that can be deferred somehow.
The videos, however, are so far a pain to handle: the base64 string of a single, simple video being transferred is nearly 300,000 characters long. It sends my poor laptop into a wild spin, and it gets the URI after about 20 seconds of going through the code (and it's still not working, and I don't feel like debugging it because it nearly crashes my laptop with every refresh).
So my questions are:
Is base64 encoding a popular way of transferring media in mobile development?
And if not, what alternative way would you recommend using to transfer/present these videos?
I should mention, though, that the videos are meant to be viewed at once by a large number of people (hundreds, perhaps), and the other developer says their servers can't handle such traffic.
Many thanks for any advice; I couldn't find this info anywhere. :)

[...] the backend developer [...] insists that the media files (images/videos) should be transferred as base64 encoded in json files.
This is a very bad (and silly) idea up-front. You do not want to transfer large amounts of binary data as strings, and especially not as Unicode strings.
Here you need to arm up and convince your backend-dev rebel to change his mind with whatever it takes: play some Bieber or Nickelback, change his background image to something Hello Kitty, or take a snapshot of his screen, set it as the background and hide all the icons and the taskbar. This should help you change his mind. If not, place a Webasto heater in his office at max and lock all the doors and windows.
Is base64 encoding a popular way of transferring media in mobile development?
It is popular and has a relatively long history; it became very common on Usenet and so forth. In those days, however, the amount of data was very low compared to today, as everything was transferred over modems.
However, just because it is popular doesn't mean it is the right tool for everything. It is not very efficient, as it requires an encoding process that converts every three octets into four bytes, adding about 33% to the size.
On top of that, each string character in JavaScript is stored as two bytes due to the Unicode char-set, so your data is doubled as well as extended by 33%. Your 300 MB of data is now 300 × 2 × 1.33 = 798 MB (show that to your backend dev! :) as it's a real factor if the servers cannot handle large amounts of traffic).
This works fine for smaller files, but for larger files, as in your example, it can cause significant overhead in time, memory usage and, of course, bandwidth. And on the server side you would need to reverse the process, with its own overhead.
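The size math above is easy to verify. A minimal Node.js sketch (the 4:3 ratio is defined by base64 itself; only the sample payload size is made up):

```javascript
// Sketch: measure the base64 size overhead on a dummy payload.
// Buffer is a Node.js global; the payload size is made up.
const raw = Buffer.alloc(300_000, 0xab); // pretend this is video data
const b64 = raw.toString('base64');

console.log('raw bytes:   ', raw.length); // 300000
console.log('base64 chars:', b64.length); // 400000, i.e. +33%
console.log('ratio:       ', (b64.length / raw.length).toFixed(2)); // 1.33
// And since a JS engine stores the string in UTF-16, those 400,000
// characters occupy roughly 800,000 bytes of memory.
```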
And if not, what alternative way would you recommend using to transfer/present these videos?
I would recommend:
Separate the meta-data out as JSON, with a reference to the data. No binary data in the JSON.
Transfer the media data itself separately as native bytes (ArrayBuffer).
Send both at the same time to the server.
The server then only needs to parse the JSON into something edible for the backend; the binary data can go straight to disk.
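A minimal sketch of that separation, assuming a browser or Node 18+ environment with the standard fetch/FormData/Blob globals; the `/upload` endpoint and field names are made up for illustration:

```javascript
// Sketch: JSON metadata and raw binary travel as separate multipart
// parts in a single request. '/upload' and the field names are made up.
async function uploadVideo(arrayBuffer, meta) {
  const form = new FormData();
  // The metadata is a small JSON part with a reference to the media...
  form.append('meta', new Blob([JSON.stringify(meta)], { type: 'application/json' }));
  // ...while the media itself stays untouched binary: no base64 inflation.
  form.append('media', new Blob([arrayBuffer], { type: 'video/mp4' }), meta.filename);
  return fetch('/upload', { method: 'POST', body: form });
}

// Building the form without sending it, just to show the shape:
const form = new FormData();
form.append('meta', new Blob(['{"filename":"clip.mp4"}'], { type: 'application/json' }));
form.append('media', new Blob([new ArrayBuffer(8)], { type: 'video/mp4' }), 'clip.mp4');
console.log([...form.keys()]); // [ 'meta', 'media' ]
```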
Update: I forgot to mention, as Pablo does in his answer, that you can look into streaming the data.
However, streaming is pretty much a synonym for buffering, so the bandwidth will be about the same, just used in a more brute-force way (usually UDP versus TCP, i.e. loss of packets doesn't break the transfer). Streaming will limit your options more than buffering on the client, though.
My 2 cents...

Not sure why the "33% overhead" is always mentioned, when that's complete nonsense. Yes, it does initially add roughly that amount, however there's a little thing called gzip (ever heard of it?). I've done tons of tests and the difference is typically negligible. In fact, sometimes the gzipped base64 string is actually smaller than the binary file. Check out this guy's tests. So please, can we stop spreading absolute fiction?
Base64 is a perfectly acceptable method of retrieving a video. In fact, it works amazingly well for a private messaging system. For instance, if you were using AWS S3, you could store the files privately so there is no public URL.
However, the main disadvantage (imho) of using a gzipped base64 video is that you need to wait for the whole video to load, so pseudo-streaming is out of the question.

Base64 is a convenient (but not efficient) way of transferring binary data. It's inefficient because the transfer size will be 33% bigger than what you're originally transferring, so it's not a popular way of transmitting video. If you are planning to stream that video, you should be looking for an established protocol for doing just that.
I would recommend a streaming protocol (there are a lot to choose from).

I think it is a bad idea; video files are large. But you can try it with small video files.
Try an online encoder: https://base64.online/encoders/encode-video-to-base64
There you can convert a video to a Base64 data URI and try to insert it into HTML.
The result looks like this:
<video controls><source src="data:video/mpeg;base64,AAABuiEAAQALgBexAAABuwAMgBexBeH/wMAg4ODgAAA..."></video>

Related

How to automatically compress photos/videos for website when uploaded?

I am working on a website where I give users the possibility to upload pictures and videos. How would I automatically compress those videos/pictures server-side before storing them on my server/database? I don't want abnormally large files to slow down my website. If I were uploading them myself I could obviously resize and optimize them by hand, but is there a way I can do this automatically for my users?
Well, that is a broad question, and the answer depends on the type of the files and the algorithm you select.
For images, you can simply use JPG and select the desired quality percentage (the smaller the percentage, the smaller the file, but the worse the resulting picture looks). Example: http://blog.clonesinfo.com/how-to-reduce-compress-image-file-size-uploading-using-php-code/
If you want more options, or for example lossless quality, you should definitely look for a library or tool; see this question for more info: Which is the best PHP method to reduce the image size without losing quality
For videos it gets a little more complicated, as making a video smaller requires re-encoding it and picking the right settings (the codec you pick will usually be the most compatible and efficient one – H.264, or something like VP9 from Google). Note that re-encoding requires a significant amount of processing power on your server (which might become an issue if videos are large and long). Video encoding is a very wide topic which I cannot cover in one response; you can start by googling how H.264 works.
For video encoding you're also going to need a tool, probably the best choice will be ffmpeg/avconv, plus some PHP library to make it easier to use.

Ways to render images sequence from Canvas

Context
I'm creating some animated graphics using Canvas. I would like to save an image sequence of them. If I do it through a web browser, for obvious security reasons it will ask me to save each file manually. I need to work around this.
I also need to render the images lossless with an alpha channel; that's why I'm using a PNG image sequence and nothing else. Image sequences can be huge in size (e.g. a full-HD sequence of 2 minutes at 30 frames/s will easily exceed 1 GB).
Question
What could be the workarounds? I think that using Node.js could be useful, because being server-side should allow me to save the image sequence without awaiting confirmations. Unfortunately, I don't know it very well; that's one of the reasons I'm asking.
I'm also seeking a "comfortable" solution. Below, someone seems able to do it using Python and MIME; it seems really ponderous and slow.
Googling
Exporting HTML canvas as an image sequence, but it doesn't talk about a Node.js solution
Saving a sequence of images in a web canvas to disk, but it's not the same context: he is providing a web service for some clients.
https://forum.processing.org/two/discussion/19218/how-to-render-p5-js-sketch-as-a-movie doesn't bring any solution, but confirms what I've explained.
https://github.com/spite/ccapture.js/#limitations is close, but it doesn't allow me to export PNG images, only video, which isn't what I'm searching for.
http://jeremybouny.fr/en/articles/server_side_canvas_node/
Disclaimer
I'm not a native English speaker, I tried to do my best, please, feel free to edit it if something is badly written

Cordova | Sending videos to mobile phone

I'm looking for a way to transfer video files to a client's mobile without streaming. The reason (the client's request) is to eliminate the cost of such a server due to an expected all-at-once high traffic.
So I have looked at base64 encoding; below is the time it takes to get the 19 MB file (once with a 100 Mb internet connection, once with a 3G connection). This could make the waiting painful, especially on a 3G connection.
I have also considered using a byte array to significantly reduce the file size, but it's hard passing it via JSON with all the escaping backslashes...
Finally, I have looked at another possible solution, and that is to transfer a video directly to the client's phone while the app is closed (pushing a notification when the file has arrived on the client's phone), but that is probably one of Cordova's limitations (as far as I'm aware).
I've been searching for a solution to this for weeks now, therefore I have placed a bounty on it, since I believe it's a question worth answering. Somebody someday will thank us for it. :) I'll be the first.
Much thanks, and happy coding.
Hosting vs app serving
First of all, you need to understand that no matter where the file is coming from, a file server (streaming) or an application server (base64-encoded string), the hosting costs are going to be similar (well, a file-hosting server should be more efficient than anything you write yourself, but that's a minor difference). You still need to store the file somewhere and you still need to send it over the network. The only difference is that in one case Apache/IIS/whatever server you use handles all the complex stuff, and in the second case you are going to be trying to recreate it all yourself.
Streaming vs Non-Streaming
When you serve a file (be it yourself, or through a file server) you can either allow it to be retrieved in chunks (streamed) or only as one huge file. In the first case, streaming, if the user stops watching halfway through the video you will only need the server capacity to serve, say, 60 or 70% of the file. In the second case, non-streaming, the user first has to wait for the file to be retrieved in its entirety, and on top of that it will always cost you 100% of the transfer.
Precaching files
That's not to say nothing can be optimized. For example, if you are distributing a single file every week on Saturday at 6 pm, yet already know a full week beforehand what that file is, you could theoretically encrypt the file and serve it in the background, distributed over the course of the entire week. And yes, you could even do that while building a Cordova application (though it will be a bit harder and you might end up writing your own plugin). Still, that situation is incredibly rare and is definitely not worth the development time except in rare cases (it's often done with game files, for example, but that's tens of GBs of data downloaded tens of thousands of times).

Recommended filesize range for html/js/css/jpg/png for http transfer

When optimizing websites I've used concatenation and spriting to group related, reusable bits together, but I often wonder how much or how little to package assets for delivery to the browser (automated tools aren't always part of my build process, though I prefer them).
I'm curious if there are some sensible guidelines just in the area of filesize when combining assets for delivery to the browser. Assuming no compression, or caching, just straightforward http transfer from a server to a browser with or without AJAX.
What is the smallest filesize recommended?
I've heard that, because of packet size (right? apologies if that was inept), 1 KB and 2 KB of data will transfer at basically the same speed. Is there a general threshold in KB where additional bytes start impacting transfer rate?
Does transfer speed change linearly with filesize, or does it stair-step?
Extending the first question: does each kilobyte increase transfer time in a fairly linear fashion, or does it stair-step at packet-sized intervals (again, possibly inept word choice)?
Is there a maximum size?
Again, I know there are lots of contextual reasons that influence this, but is there a filesize that is inadvisably large given current networks and browsers, or is it heavily dependent on the server and network? If there is a good generalization, that's all I'm curious about.
It probably goes without saying, but I'm not a server/networking expert, just a front-end dev looking for some sensible defaults to guide quick decisions in asset optimization.
It really depends on the server, network, and client.
Use common sense is the basic answer: don't try to send several-megabyte bitmaps, or the page will take as long to load as if the person were trying to download any other several-megabyte file. A bunch of PNGs on a single page, on the other hand, will not really be noticeable to most modern users. In a more computational realm than you've asked about: don't abuse iframes to redirect people through several steps of other web pages.
If you want more information about actual transmission, the maximum size of a single TCP packet is technically 64 KB, but you're not really going to be sending more than about 1.5 KB in a single packet. However, TCP is stream-based, so the packet size is mostly irrelevant. You should be more concerned with the bandwidth of modern machines, and considering how efficient video asset streaming is nowadays, I really don't think you should be overly worried about delivering uncompressed assets to your users.
Because of the relative infrequency of actual delivery errors (which are corrected by TCP), along with the minuscule packet size relative to the size of most modern web pages, delivery time is going to increase pretty much linearly with total size (again, like one giant file). There are some details about multi-stage web page delivery that I'm leaving out, but they're mostly ignorable when you're delivering high-asset-count web pages.
Edit: To address your concern (in the title) about transferring actual html/js files, they're just text in transfer. The browser does all of that rendering and code-running for you. If you have a single jpg, it's going to mostly overshadow any html/js you have on the page.
Transfer size: maximum packet size for a TCP connection
http flow (a rough view): http://www.http.header.free.fr/http.html
Basically, when you get the primary HTML document representing the page as a whole (from your initial request to access the page), you parse it for other URLs specified as images or scripts, and request those from the server for delivery to your session. The linked page is old but still relevant and (I hope) easy to understand.
If you want actual bandwidth statistics for modern users, that's probably too much for me to track down. But if you want more technical info, wikipedia's pages on HTTP and TCP are pretty great.

How do I compress multiple Web Audio sources/tracks into one?

We are making a web-based music editor and mixer based on the Web Audio API. Users can mix together multiple tracks, crop tracks, etc. The actual mixing together of the tracks just involves playing back all the sources at once.
We want to be able to add the option to save the mix and make it available for download to the user's computer. Is there some way to do this on the front end (like connecting all the sources to one destination/export node), or even on the backend (we are using RoR)?
RecorderJS does exactly what you need, and it could not possibly be easier to use. Really, really great library.
https://github.com/mattdiamond/Recorderjs
P.S. Look into OfflineAudioContext and my answer to this question (Web audio API: scheduling sounds and exporting the mix) for info on doing a faster-than-realtime mixdown of your audio.
The users' data looks to be on the client side?
Basically, when converting data with base64 into a data URI, the data is inlined, so chunks can be added together one by one into one single Blob object and then downloaded.
But this method is only good for small files; it causes crashes and freezes in most browsers. From some personal tests, it is only good for blob sizes of less than 10 MB, though this will surely get better soon.
<audio controls><source src="data:audio/ogg;base64,BASE64.......BASE564......BASE64............."></audio>
or
<a href="data:audio/ogg;base64,BASE64...BASE64..BASE64....">Download</a>
Probably not your way, just an idea but your project is interesting ;)
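The chunk-joining idea above can be sketched like this, assuming a runtime with the standard Blob global (any browser, or Node 18+); the chunk bytes are placeholders:

```javascript
// Sketch: join downloaded chunks into a single Blob. Blob is a global in
// browsers and Node 18+; the chunk bytes here are placeholders.
const chunks = [
  new Uint8Array([1, 2, 3]),
  new Uint8Array([4, 5, 6]),
];
const whole = new Blob(chunks, { type: 'audio/ogg' });
console.log(whole.size, whole.type); // 6 audio/ogg

// In a browser, a download link could then be made from it
// (hypothetical element id):
// document.getElementById('dl').href = URL.createObjectURL(whole);
```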
