Video compression when uploading a video in a JavaScript (React) web application

In my React application I have included an image and video uploading feature. For images, I'm compressing the image before uploading it to the server. Now I need to do the same for videos. But I'm not sure whether the video compression should be done before uploading (on the front-end) or after uploading (on the back-end). What would be the best way to do this, considering performance and efficiency?
Thanks.

For this kind of dedicated and isolated feature, I would really prefer a microservice that sits between the frontend and backend (preferably in the same data center as your server).
If you've got a good budget, a third-party API such as Coconut is presumably performant and trouble-free.

For uploads from the web, you're better off compressing server-side. Compression on the client side is going to be quite CPU-heavy, and it won't be a good user experience if the user's computer freezes for long stretches while they interact with your site. Not only that, you'd have to figure out a way to run ffmpeg or a similar tool using web workers in the browser, and it's mostly not worth the headache.
People generally set up a transcoding pipeline that can compress, resize or convert the formats of user-generated videos in a batch process, usually with ffmpeg, or use a cloud-based SaaS platform if you don't want to do all the heavy lifting yourself.
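For the ffmpeg route, a minimal server-side sketch might look like the following. The codec flags, CRF value and file paths here are assumptions to tune for your own pipeline, not a prescribed configuration:

    // Minimal sketch: spawn ffmpeg from Node.js to compress an uploaded video.
    // Assumes ffmpeg is installed on the server; paths and flags are placeholders.
    const { spawn } = require('child_process');

    function compressVideo(inputPath, outputPath) {
      return new Promise((resolve, reject) => {
        const ffmpeg = spawn('ffmpeg', [
          '-i', inputPath,
          '-vcodec', 'libx264', // H.264 video
          '-crf', '28',         // higher CRF = smaller file, lower quality
          '-preset', 'fast',
          '-acodec', 'aac',
          outputPath,
        ]);
        ffmpeg.on('error', reject);
        ffmpeg.on('close', (code) =>
          code === 0 ? resolve(outputPath) : reject(new Error('ffmpeg exited with ' + code))
        );
      });
    }

Running this in a queue-driven batch worker, rather than inline in the upload request, keeps long transcodes from tying up your web processes.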
Full Disclaimer: we had a similar requirement and ended up starting mediamachine.io because most providers were too expensive for our needs.

Related

Node.js for background video delivery

My apologies if this question is misguided, or if what I'm saying doesn't make sense, as I'm somewhat uninitiated in the Node.js world. I've been cruising pretty easily with just plain PHP and Apache for some time now, until I discovered ZURB Foundation's stack with Handlebars and SASS, along with NPM.
Currently, I'm simply using an HTML5 <video> tag to deliver background video to a page: http://159.203.191.97/sii003. I'm on Apache and considering utilizing Node and Apache's ProxyPass feature to serve my content. Would utilizing JS help me save speed on loading/playback of the background video?
It's rather crucial to the user experience, and I am curious whether I would be able to load video more quickly utilizing Node.js. I realize that bandwidth and processing speed can create bottlenecks no matter what I do.
If you are planning on video delivery, use the AWS native services for video storage (S3) and streaming (CloudFront), while using your Node.js API to provide access permissions (using signed URLs) if it's private content.
This will give you a highly scalable and cost-effective solution for media delivery.
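For the signed-URL piece, a minimal sketch with the AWS SDK v3 signer might look like this; the distribution URL, key pair ID and key file below are placeholders:

    // Minimal sketch: generate a CloudFront signed URL for private video content.
    // Assumes @aws-sdk/cloudfront-signer is installed; all values are placeholders.
    const { getSignedUrl } = require('@aws-sdk/cloudfront-signer');
    const { readFileSync } = require('fs');

    const signedUrl = getSignedUrl({
      url: 'https://d1234example.cloudfront.net/videos/demo.mp4',
      keyPairId: 'K2EXAMPLEKEYPAIR',
      privateKey: readFileSync('./cloudfront-private-key.pem', 'utf8'),
      dateLessThan: new Date(Date.now() + 60 * 60 * 1000).toISOString(), // 1 hour
    });

Your Node.js API would return a URL like this only after checking that the requesting user is allowed to see the video.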

Do I need Node.js, or can I rely only on Nginx?

I have an application, and my peers and I are debating whether we need to use Node.js or not.
Our decision is to use Angular.js for the front-end and to communicate with the app server via a REST API. The application server will not be in Node.js; it could be in .NET or Java.
Nginx will be in front, as it is better for serving static files, gzip, etc.
There are many ways to boilerplate your Angular application, and many of them include Node.js. My first approach was to use Node.js as the primary web server and scale it out to solve performance issues. That wasn't a good approach, though, as Node's role isn't to act as a web server. Which brings me to my question:
Keeping the two aforementioned points in mind, are there any reasons to build the front-end using Node.js?
Is there something I could benefit from that I haven't thought of?
Here's the short answer: If you are set on using nginx in front of a .net or Java back end, and you're just looking for a deployment tool for angular.js, then just choose whatever javascript deployment tool makes sense, which may well be built with node.
What follows is a little more exposition:
I'm not quite sure what you mean by "Node's role isn't to act as a web server"... if you mean in general, then that's precisely how it is generally used; if you mean your application server will be .NET or Java, but not JavaScript, then fine.
Generally speaking, Nginx will serve static files faster, but the margin of improvement over Node.js is likely to be meaningless for pretty much anyone. If you need to (or it makes sense to) include Node as part of your stack for Angular deployment, then it could make sense to use it as your reverse proxy and eliminate Nginx altogether. The odds that you'll get a measurable benefit from using Nginx instead are vanishingly small.
That said... if you've already got Nginx set up, and moving to Node would mean redoing work you've already done once, then that option loses its primary appeal.
What node.js has going for it more than any other project I'm aware of is that it's extremely capable at every level of the web stack. But it's not necessarily more capable than individual projects used in their appropriate level of the stack, and if you're not going to use it to reduce the complexity of your stack by homogenizing the technology and applications involved, then it just comes down to preference.
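To make the "Node does both jobs" option concrete, here is a minimal sketch of Node serving the built Angular assets while reverse-proxying REST calls to the .NET/Java application server. The ports, paths and the choice of express/http-proxy-middleware are assumptions:

    // Minimal sketch: Node serves the front-end build and proxies the REST API.
    // Assumes express and http-proxy-middleware are installed; ports are placeholders.
    const express = require('express');
    const { createProxyMiddleware } = require('http-proxy-middleware');

    const app = express();

    // Forward REST calls to the .NET/Java application server.
    app.use('/api', createProxyMiddleware({ target: 'http://localhost:8080', changeOrigin: true }));

    // Serve the compiled Angular assets.
    app.use(express.static('dist'));

    app.listen(3000, () => console.log('Front-end + proxy listening on 3000'));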
If you care about performance, especially for static files, you could add a caching layer as a proxy in front of your backend Nginx (or a server of your choice). Varnish Cache is a good choice.
If you want to serve static files at large scale, there is a better solution still: host your static files on a CDN, which will serve your live deployments much better. Cloud services are built for ease of use and are also cost-effective. For example, fastly.com is a good choice for hosting static files, with a very persistent cache layer built on top of Varnish Cache. CloudFront is another choice if you are a fan of Amazon services.
More resources that might help: a comparison of benchmarks of popular servers; another benchmark exists here as well.

Can I use node to power a web application on a separate server?

I asked this question (voted to be too broad) while working my way through a starter book on Node. Reading this book, I'm sure I'll learn the answer to this later, but I'd be more comfortable if I knew it up front:
My question: can I (efficiently) continue using a typical web host such as iPage or GoDaddy to host my web application, building and hosting the front end in a simple, traditional manner through an Apache web server, and communicating with a separate Node.js server (my application back end) via AJAX for queries and other things that I can process more efficiently via Node?
More specifically, would this be bad programming practice in terms of efficiency and organization? In other words, would a large scale commercial application ever plausibly be handled via this method?
Yes, you can separate the front-end of your web application and the APIs that power it. In fact, this is a very common configuration, especially for "large scale commercial applications".
Where you draw the separation line between the two depends on what you are doing, how you're doing it, and what your needs are. Also, in terms of hosting, remember that if you're accessing something server-side across the internet, you're adding extra latency to everything. Consider getting off GoDaddy and using another host that gives you more flexibility, such as a VPS provider.
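One practical detail with that split: because the Apache-hosted pages and the Node API live on different origins, the API has to allow cross-origin AJAX. A minimal sketch, with placeholder hostnames:

    // Minimal sketch: a Node API on its own server, callable via AJAX from a
    // front-end hosted elsewhere. Assumes express and cors are installed.
    const express = require('express');
    const cors = require('cors');

    const app = express();
    app.use(cors({ origin: 'https://www.your-apache-site.example' })); // placeholder origin

    app.get('/api/search', (req, res) => {
      res.json({ query: req.query.q, results: [] }); // stand-in for real work
    });

    app.listen(4000);

    // On the Apache-hosted page:
    //   fetch('https://api.your-node-host.example/api/search?q=test')
    //     .then(r => r.json()).then(console.log);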
It's OK. Actually, this is how things should be done: you have a backend API on a separate server and lots of apps that use the API. Just go with the Nginx server; check this Apache vs Nginx comparison.
Yes, you can use Node.js as part of some big application. It depends on which type of interaction you would like to have. Are you comfortable mixing technologies? Then Node is a pretty good tool for working over the web. I finished part of a big nodejs-ruby-php-c-flash application (my part was the Node.js piece) that handled very large amounts of data. That application had different levels of interaction. Sometimes I use two languages at a time to make each part of the application the best fit for the task I'm working on. There are applications that initiate, run and destroy multiple OS instances. So using a multi-environment application is not so hard.

Downscaling/resizing a video during upload to a remote website

I have a web application written in Ruby on Rails that uploads videos from the user to the server using a form (I actually use a jQuery uploader that uploads directly to S3, but I don't think this is relevant).
In order to decrease the upload time for a video, I want to downscale it; e.g., if the video size is 1000x2000 pixels, I want to downscale it to 500x1000. Is there a way to do so while the video uploads on the client side? Is there a JavaScript library that can do that?
Recompressing a video is a non-trivial problem that isn't going to happen in a browser any time soon.
With the changes in HTML5, it is theoretically possible if you can overcome several problems:
You'd use the File API to read the contents of a file that the user selects using an <input type="file"> element. However, it looks like the FileReader reads the entire file into memory before handing it over to your code, which is exactly what you don't want when dealing with large video files. Unfortunately, this is a problem you can do nothing about. It might still work, but performance will probably be unacceptable for anything over 10-20 MB or so.
Once you have the file's data, you have to actually interpret it – something usually accomplished with a demuxer to split the container (MPEG, etc.) file into video and audio streams, and a codec to decompress those streams into raw image/audio data. Your OS comes with several implementations of codecs, none of which are accessible from JavaScript. There are some JS video and audio codec implementations, but they are experimental and painfully slow; and they only implement the decompressor, so you'd be stuck when it comes to creating output.
Decompressing, scaling, and recompressing audio and video is extremely processor-intensive, which is exactly the kind of workload that JavaScript (and scripting languages in general) is worst at. At the very minimum, you'd have to use Web Workers to run your code on a separate thread.
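To make that last point concrete, here is a minimal sketch of the Web Worker hand-off, reading the file in slices via File.slice (which softens, though doesn't eliminate, the memory problem from the first point). The demux/decode/encode step is precisely the part with no practical JavaScript implementation:

    // main.js: minimal sketch handing the selected file to a worker so heavy
    // work stays off the UI thread. File objects can be posted to workers directly.
    const worker = new Worker('video-worker.js');
    document.querySelector('input[type=file]').addEventListener('change', (e) => {
      worker.postMessage(e.target.files[0]);
    });
    worker.onmessage = (e) => console.log('worker finished:', e.data);

    // video-worker.js: read the file in 1 MB slices instead of all at once.
    self.onmessage = (e) => {
      const file = e.data;
      const reader = new FileReaderSync(); // synchronous reads are fine in a worker
      const chunkSize = 1024 * 1024;
      for (let offset = 0; offset < file.size; offset += chunkSize) {
        const chunk = reader.readAsArrayBuffer(file.slice(offset, offset + chunkSize));
        // ...this is where demuxing/decoding/re-encoding would happen,
        // if a usable implementation existed...
      }
      self.postMessage({ bytesRead: file.size });
    };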
All of this work has been done several times over; you're reinventing the wheel.
Realistically, this is something that has to be done server-side, and even then it's not a trivial endeavor.
If you're desperate, you could try something like a plugin/ActiveX control that handles the compression, but then you have to convince users to install a plugin (yuck).
You could use a gem like Carrierwave (https://github.com/jnicklas/carrierwave). It has the ability to process files before storing them. Even if you upload them directly to S3 first with JavaScript, you could then have Carrierwave retrieve the file, process it, and store it again.
Otherwise you could just have Carrierwave deal with the file from the beginning (unless you are hosting with Heroku and need to avoid the timeouts by going direct to S3).

Real-time collaborative drawing whiteboard in HTML5/JS and websockets?

I'm trying to put together a small(ish) summer school project for some of my advanced students and am researching how to do it best and what to use - hopefully somebody here could point me in the right direction.
What we are interested in is researching whether HTML5 has come far enough to create a real-time collaborative drawing whiteboard in it - purely by using web technologies without plugins (so CSS, HTML5/DOM and JavaScript). What we'd ultimately strive for is this - for example, have an online canvas/page on a central server displayed on a big screen in the classroom. Then our students/users would take out their smartphones, load the page in their mobile browsers (I'm perfectly OK with limiting this to WebKit mobile browsers for now) and draw on their screens with touch/fingers (or on PCs with the mouse - guessing this doesn't make a lot of difference), and it would get updated in real time for everybody - both on their screens and on the central big screen in the classroom.
I'm guessing push/get requests would be too slow for this - could it be solved by websockets? Does anybody have any good JS libraries to recommend for this?
Also, what would the ideal (but easier for students to understand) architecture look like? Let's say you have 30 simultaneous users in a classroom: each of them would connect with websockets to the server, and the server would pool/combine all of their requests into one and then return the combined file (some sort of minimal JSON or even just coordinates) to every connected user?
Would websockets and (I'm guessing) canvas be able to handle this, so that everything still looks snappy? Are there (jQuery-like) JS libraries available to make our lives easier, or do you think it's something that's too complex for a 2-week summer school project?
Here's a tutorial describing how to create a multiuser whiteboard with JavaScript/HTML5/canvas:
http://www.unionplatform.com/?page_id=2762
The example uses a collaboration framework and server named "Union Platform". Even if you decide to roll your own server and client framework, the messaging in the example should give you an idea of how to structure the code.
For an apples-to-apples speed comparison of WebSocket vs Comet, see:
http://www.unionplatform.com/?page_id=2954
In my tests, a basic ping over WebSocket is normally about twice as fast as the ping over HTTP. Both WebSocket and Comet are more than fast enough to create a collaborative whiteboard.
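If you want to reproduce that comparison yourself, a rough browser-side timing sketch looks like this (the echo endpoint is a placeholder; you'd need your own server that echoes messages back):

    // Rough sketch: time one WebSocket round trip against an echo server.
    // 'wss://example.com/echo' is a placeholder endpoint.
    const ws = new WebSocket('wss://example.com/echo');
    ws.onopen = () => {
      const t0 = performance.now();
      ws.onmessage = () => console.log('round trip:', (performance.now() - t0).toFixed(1), 'ms');
      ws.send('ping');
    };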
Definitely check this out:
http://wesbos.com/html5-canvas-websockets-nodejs/
For the networking side of things, try looking at node.js for the server, along with socket.io for the client.
As for the drawing itself, a few popular choices are Processing.js, Raphaël and CakeJS.
When it comes to the implementation, you may want to look at how networked games deal with similar issues (gamedev.stackexchange.com could be useful).
What you are going to be doing is essentially the same as a simple top-down multiplayer game, with each 'player' in this case being a student's fingertip, and the 'level' being the canvas. You need to update the server with their position and whether or not they are 'shooting' (drawing).
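A minimal server for that model, sketched with socket.io (the event names and the in-memory history replay are assumptions, not a fixed protocol):

    // Minimal sketch: socket.io server for a shared whiteboard. Keeps an
    // in-memory stroke history so late joiners receive the current drawing.
    const { Server } = require('socket.io');
    const io = new Server(3000);

    const strokes = []; // each stroke: { x0, y0, x1, y1 }

    io.on('connection', (socket) => {
      socket.emit('history', strokes);         // replay current state to the newcomer
      socket.on('draw', (stroke) => {
        strokes.push(stroke);
        socket.broadcast.emit('draw', stroke); // fan out to everyone else
      });
    });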
I'm guessing push/get requests would be too slow for this - could it be solved by websockets? Does anybody have any good JS libraries to recommend for this?
If you need real-time infrastructure, I've created a list of real-time technologies which might be of use to you. These include hosted services, such as Pusher (who I work for), and self-install technologies such as WebSocket and Comet solutions.
WebSocket sounds like the ideal choice of technology for you, since it has become part of HTML5 and offers the most efficient form of real-time bi-directional communication between a web server and a browser (or other client).
Also, what would the ideal (but easier for students to understand) architecture look like? Let's say you have 30 simultaneous users in a classroom: each of them would connect with websockets to the server, and the server would pool/combine all of their requests into one and then return the combined file (some sort of minimal JSON or even just coordinates) to every connected user?
It sounds like you should store the current state somewhere and display that state on the initial load of the application. Then use your real-time infrastructure to send deltas on that state; or, if it's a drawing on canvas, just the information about the line that has been drawn and who drew it.
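Sketched on the client side, that amounts to a canvas which emits the segments it draws and renders the ones it receives. The event names and server URL here mirror the hypothetical server sketch above:

    // Minimal client sketch: draw locally, emit deltas, render everyone else's.
    // Assumes the socket.io client is loaded and a <canvas id="board"> exists.
    const socket = io('http://localhost:3000'); // placeholder server URL
    const canvas = document.getElementById('board');
    const ctx = canvas.getContext('2d');

    function drawSegment({ x0, y0, x1, y1 }) {
      ctx.beginPath();
      ctx.moveTo(x0, y0);
      ctx.lineTo(x1, y1);
      ctx.stroke();
    }

    socket.on('history', (strokes) => strokes.forEach(drawSegment));
    socket.on('draw', drawSegment);

    let last = null;
    canvas.onmousedown = (e) => { last = { x: e.offsetX, y: e.offsetY }; };
    canvas.onmouseup = () => { last = null; };
    canvas.onmousemove = (e) => {
      if (!last) return;
      const seg = { x0: last.x, y0: last.y, x1: e.offsetX, y1: e.offsetY };
      drawSegment(seg);         // draw your own stroke immediately
      socket.emit('draw', seg); // send the delta to the server
      last = { x: e.offsetX, y: e.offsetY };
    };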
Would websockets and (I'm guessing) canvas be able to take this? So that everything still looks snappy? Are there (jQuery-like) JS libraries available to make our lives easier - or do you think its something thats too complex for a 2-week summer school project?
Real-time collaborative drawing is most definitely achievable, and a number of examples of it have been created. A Google search brings up a number of possibilities.
If this technology is completely new to you and you would prefer to concentrate on building the collaborative application, then I would consider using a service for your app rather than going through the hassle of learning how to install, configure, or even code your own infrastructure. (I'm not just saying this because I work for such a service; I honestly think it makes the most sense.)
