Send Canvas images as a file stream HTML5 - javascript

I am developing a system where the mobile device camera is accessed in the browser and the camera stream frames are sent to the other side synchronously. The sent frames are processed further on the other side. I have drawn the frames to the canvas at a time interval, as in the code below. How do I send the accessed frames to the other side so that the further processing can happen synchronously? Each frame drawn on the canvas is to be sent to the other side for further processing on each image frame. The other-side code is in a native language.
<!DOCTYPE html>
<html>
<body>
<h1>Simple web camera display demo</h1>
<video autoplay width="480" height="480" src=""></video>
<canvas width="600" height="480" style=""></canvas>
<img src="" width="100" height="100">
<script type="text/javascript">
var video = document.getElementsByTagName('video')[0],
    heading = document.getElementsByTagName('h1')[0];

if (navigator.getUserMedia) {
    navigator.getUserMedia('video', successCallback, errorCallback);
    function successCallback( stream ) {
        video.src = stream;
    }
    function errorCallback( error ) {
        heading.textContent =
            "An error occurred: [CODE " + error.code + "]";
    }
} else {
    heading.textContent =
        "Native web camera streaming is not supported in this browser!";
}

var draw_interval = setInterval(function() {
    var canvas = document.querySelector('canvas');
    var ctx = canvas.getContext('2d');
    ctx.drawImage(document.querySelector("video"), 0, 0);
}, 33);
</script>
</body>
</html>

I'm not quite sure what you mean by "other side" and "native language".
But, you can send canvas images to a server using AJAX.
The server receives the canvas image as base64 encoded image data.
For example, assume:
You're sending the image to a PHP server (yourOtherSide.php) – but of course this could be any server that accepts AJAX posts.
You have a reference to the canvas element holding your frame: canvas
Your ajax payload contains an id number of the frame being sent (ID) and the image data (imgData).
(optionally) You are getting some response back from the server—even if it’s just “OK”: anyReturnedStuff
Then your ajax post would be:
$.post("yourOtherSide.php", { ID: yourFrameID, imgData: canvas.toDataURL('image/jpeg') })
 .done( function(anyReturnedStuff){ console.log(anyReturnedStuff); } );
[Edited to include server-side creation of images from the ajax post]
These code samples will receive the ajax base64 imageData and create an image for you to process with your c-image-processing-library.
If you're using a PHP server:
$img = imagecreatefromstring(base64_decode($imageData));
// and process $img with your image library here
or if you're using ASP.NET:
byte[] byteArray = System.Convert.FromBase64String(imageData);
Image image;
using (MemoryStream ms = new MemoryStream(byteArray)) {
    image = Image.FromStream(ms);
    // and process image with your image library here
}
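One gotcha worth noting before decoding on either server: canvas.toDataURL() returns a full data: URL ("data:image/jpeg;base64,&lt;payload&gt;"), not bare base64, so the prefix has to be stripped before base64-decoding. A minimal client-side sketch (the helper name dataUrlToBase64 is mine):

```javascript
// Sketch: extract the raw base64 payload from a data: URL, since
// base64 decoders expect the bare payload without the
// "data:image/jpeg;base64," prefix. Helper name is illustrative.
function dataUrlToBase64(dataUrl) {
    var comma = dataUrl.indexOf(',');
    if (comma === -1) throw new Error('not a data: URL');
    return dataUrl.slice(comma + 1); // everything after the first comma
}
```

The same split can be done server-side instead (e.g. splitting on the first comma in PHP); whichever side strips it, decode only the payload.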

Related

UI video play Issue but works perfectly on Postman

I am not able to play an MP4 (HD) video on the UI, received from the Django backend. I am using plain JavaScript on the UI and Django on the backend. Please find the backend code snippet:
file = FileWrapper(open(path, 'rb')) #MP4 file path is media/1648477263566_28-03-2022 19:51:05_video.mp4
response = HttpResponse(file, content_type=content_type)
response['Content-Disposition'] = 'attachment; filename=my_video.mp4'
return response
The video plays perfectly in Postman but can't be played on the UI screen. The UI code is below:
function getUploadedImageAndVideo(combined_item_id){
    var request = {};
    request["combined_item_id"] = combined_item_id;
    var xhttp = new XMLHttpRequest();
    xhttp.onreadystatechange = function() {
        if (this.readyState == 4 && this.status == 200) {
            var vdata = this.responseText;
            var src1 = document.getElementById('src1');
            src1.setAttribute("src", "data:video/mp4;base64,"+vdata);
            //src1.setAttribute("src", vdata); //doesn't work either
            var src2 = document.getElementById('src2');
            src2.setAttribute("src", "data:video/mp4;base64,"+vdata);
            //src2.setAttribute("src", vdata); //doesn't work either
            return;
        }
    };
    xhttp.open("POST", port + host + "/inventory_apis/getUploadedImageAndVideo", true);
    xhttp.setRequestHeader("Accept", "video/mp4");
    xhttp.setRequestHeader("Content-type", "application/json");
    xhttp.setRequestHeader("X-CSRFToken", getToken());
    xhttp.send( JSON.stringify(request) );
}
on html side:
<video controls="">
<source type="video/webm" src="" id="src1">
<source type="video/mp4" src="" id="src2">
</video>
Network Response (200 OK) of function call is: "ftypmp42 ... isommp42 ... mdat ... ó! ... °}b ... $¥Ð ..." very long text of the video.
I am not able to play the video on the UI side. Please help.
Browsers used: Chrome and Firefox.
*An alternative is to play directly from the media URL, but here I want to edit the video on the backend on purpose. So I'm stuck on this issue.
Looking at "ftypmp42 ... isommp42 ... mdat ... ó! ... °}b ... $¥Ð ..."
An MP4 is divided into two parts.
First is MOOV, the metadata (which needs to be processed before playback can begin). For example, the metadata holds the byte positions of all the different frames; without it the decoder cannot begin playback.
Second is MDAT, the actual media data (the audio/video data without headers, since that info now lives in MOOV instead).
It seems your video has MDAT appearing first, so the decoder must wait for all the MDAT bytes to pass before it reaches the metadata. In other words, your file must be completely downloaded before it can play.
Solution:
Use a tool to move your MOOV atom to the front of the file. You can try command-line tools like FFmpeg or MP4Box, or an app like HandBrake.
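If you want to confirm the atom order without a hex editor, the top-level MP4 boxes are easy to walk: each starts with a 4-byte big-endian size followed by a 4-byte type. A minimal Node sketch (my own helper; it only handles plain 32-bit sizes, not the 0 or 1 special size values):

```javascript
// Sketch: list the top-level MP4 box (atom) types in file order, so you
// can check whether 'moov' precedes 'mdat' (i.e. faststart). Assumes
// 32-bit box sizes only; size values 0 and 1 are not handled.
function topLevelBoxes(buf) {
    var types = [];
    var offset = 0;
    while (offset + 8 <= buf.length) {
        var size = buf.readUInt32BE(offset);                      // box length incl. header
        var type = buf.toString('ascii', offset + 4, offset + 8); // 4-char type code
        types.push(type);
        if (size < 8) break;                                      // unsupported size form
        offset += size;
    }
    return types;
}
```

Run it over the file read with fs.readFileSync; if 'mdat' appears before 'moov' in the result, re-mux the file with the faststart option.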

Is repeatedly setting an img src attribute with a base64 data: URL a reasonable way to animate based on server generated images?

I have a simple need for a preview of image data that is updated regularly in realtime on the server side. The server can easily generate Base64 encoded JPEG images. I have JavaScript that fetches the current data and sets the src attribute of an image.
This appears to work fine when testing with the server running locally (i.e. URLs point to localhost). However, when client and server are on different machines, after some time the browser gets bogged down and can't keep up. It seems like old image data hangs around awaiting garbage collection, or something similar is causing a slowdown and ultimately a failure to reliably update the image on the browser side.
Here is the code:
<html lang="en">
<head>
<title>Live Stream</title>
</head>
<body>
<div>
<img id="img1" width="320" height="180" src="data:," />
</div>
<script type="text/javascript">
var fetching = false;

const fetchImage = async () => {
    fetching = true;
    try {
        const response = await fetch(document.baseURI + 'app/image/base64');
        const frame = await response.text();
        document.getElementById("img1").src = "data:image/jpeg;base64," + frame;
    } catch (e) {
        console.log(e);
    }
    fetching = false;
}

function updateImage() {
    fetchImage();
}

window.requestAnimationFrame( draw );

function draw(timestamp) {
    if ( !fetching ) {
        updateImage();
    }
    window.requestAnimationFrame( draw );
}
</script>
</body>
</html>
Am I doing something wrong? Is there a better way? Low-latency is important.
I also have an alternative that uses WebSockets to push data. That might make more sense when the data rate is lower. Either way, the display of the data is handled the same way: by updating the src of an <img> with a new 'data:' URL.
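One likely culprit is that every data: URL is a full new copy of the image that the browser may keep alive as long as something references it. An alternative, assuming the server can also serve the raw JPEG bytes, is to fetch a Blob and display it through a short-lived object URL, explicitly revoking the previous one so the old frame's memory can be reclaimed. A sketch (the function name and imgEl parameter are mine; imgEl stands in for the img element):

```javascript
// Sketch: show each fetched frame via a Blob URL and revoke the previous
// URL so the old frame's bytes can be freed. previousUrl tracks the
// currently displayed frame's object URL.
var previousUrl = null;

function showFrame(imgEl, blob) {
    var url = URL.createObjectURL(blob);  // short-lived handle to the blob's bytes
    if (previousUrl !== null) {
        URL.revokeObjectURL(previousUrl); // release the previous frame
    }
    previousUrl = url;
    imgEl.src = url;
    return url;
}
```

In the fetch loop you would then do something like const blob = await response.blob(); showFrame(document.getElementById('img1'), blob); — assuming an endpoint that returns image/jpeg rather than base64 text.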

efficient way of streaming a html5 canvas content?

I'm trying to stream the content of an HTML5 canvas on a live basis using websockets and nodejs.
The content of the html5 canvas is just a video.
What I have done so far is:
I convert the canvas to blob and then get the blob URL and send that URL to my nodejs server using websockets.
I get the blob URL like this:
canvas.toBlob(function(blob) {
url = window.URL.createObjectURL(blob);
});
The blob URLs are generated per video frame (20 frames per second to be exact) and they look something like this:
blob:null/e3e8888e-98da-41aa-a3c0-8fe3f44frt53
I then get that blob URL back from the server via websockets so I can use it to draw it onto another canvas for other users to see.
I did search how to draw onto a canvas from a blob URL but I couldn't find anything close to what I am trying to do.
So the questions I have are:
Is this the correct way of doing what I am trying to achieve? Any pros and cons would be appreciated.
Is there any other more efficient way of doing this, or am I on the right path?
Thanks in advance.
EDIT:
I should have mentioned that I cannot use WebRTC in this project and I have to do it all with what I have.
To make it easier for everyone to see where I am at right now, this is how I tried to display the blob URLs mentioned above in my canvas using websockets:
websocket.onopen = function(event) {
    websocket.onmessage = function(evt) {
        var val = evt.data;
        console.log("new data " + val);
        var canvas2 = document.querySelector('.canvMotion2');
        var ctx2 = canvas2.getContext('2d');
        var img = new Image();
        img.onload = function(){
            ctx2.drawImage(img, 0, 0);
        };
        img.src = val;
    };
    // Listen for socket closes
    websocket.onclose = function(event) {
    };
    websocket.onerror = function(evt) {
    };
};
The issue is that when I run that code in Firefox, the canvas is always empty/blank, but I see the blob URLs in my console, so that makes me think that what I am doing is wrong.
And in Google Chrome, I get a Not allowed to load local resource: blob: error.
SECOND EDIT:
This is where I am at the moment.
First option
I tried to send the whole blob(s) via websockets and I managed that successfully. However, I couldn't read it back on the client side for some strange reason!
When I looked at my nodejs server's console, I could see something like this for each blob that I was sending to the server:
<buffer fd67676 hdsjuhsd8 sjhjs....
Second option:
So the option above failed, and I thought of something else, which is turning each canvas frame into base64 (JPEG), sending that to the server via websockets, and then displaying/drawing those base64 images onto the canvas on the client side.
I'm sending 24 frames per second to the server.
This worked, BUT the client-side canvas where these base64 images are displayed is very slow, and it's like it's drawing 1 frame per second. This is the issue I have at the moment.
Third option:
I also tried to use a video without a canvas. So, using WebRTC, I got the video stream as a single Blob, but I'm not entirely sure how to use that and send it to the client side so people can see it.
IMPORTANT: this system that I am working on is not a peer-to-peer connection. It's just one-way streaming that I am trying to achieve.
The most natural way to stream a canvas content: WebRTC
OP made it clear that they can't use it, and it may be the case for many, because:
Browser support is still not that great.
It implies having a media server running (at least ICE+STUN/TURN, and maybe a gateway if you want to stream to more than one peer).
But still, if you can afford it, all you need then to get a MediaStream from your canvas element is
const canvas_stream = canvas.captureStream(minimumFrameRate);
and then you'd just have to add it to your RTCPeerConnection:
pc.addTrack(canvas_stream.getVideoTracks()[0], canvas_stream);
The example below will just display the MediaStream in a <video> element.
let x = 0;
const ctx = canvas.getContext('2d');
draw();
startStream();

function startStream() {
    // grab our MediaStream
    const stream = canvas.captureStream(30);
    // feed the <video>
    vid.srcObject = stream;
    vid.play();
}

function draw() {
    x = (x + 1) % (canvas.width + 50);
    ctx.fillStyle = 'white';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'red';
    ctx.beginPath();
    ctx.arc(x - 25, 75, 25, 0, Math.PI * 2);
    ctx.fill();
    requestAnimationFrame(draw);
}
video,canvas{border:1px solid}
<canvas id="canvas">75</canvas>
<video id="vid" controls></video>
The most efficient way to stream a live canvas drawing: stream the drawing operations.
Once again, OP said they didn't want this solution because their set-up doesn't match, but it might be helpful for many readers:
Instead of sending the result of the canvas, simply send the drawing commands to your peers, which will then execute these on their side.
But this approach has its own caveats:
You will have to write your own encoder/decoder to pass the commands.
Some cases might get hard to share (e.g. external media would have to be shared and preloaded the same way on all peers, the worst case being drawing another canvas, where you'd also have to share its own drawing process).
You may want to avoid intensive image processing (e.g. ImageData manipulation) having to be done on all peers.
So a third, definitely less performant, way to do it is what OP tried:
Upload frames at a regular interval.
I won't go into detail here, but keep in mind that you are sending standalone image files, and hence a whole lot more data than if they had been encoded as a video.
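To put rough numbers on that overhead (illustrative figures, not measurements), compare the bandwidth of standalone JPEG frames at the question's 24 fps against a typical encoded stream:

```javascript
// Rough, illustrative arithmetic: bandwidth of standalone JPEG frames
// versus a typical encoded video stream. All figures are assumptions.
var frameKB = 30;                        // assumed size of one modest JPEG frame
var fps = 24;                            // frame rate from the question
var jpegKbps = frameKB * fps * 8;        // 30 KB * 24/s * 8 bits = 5760 kbps
var typicalVideoKbps = 1500;             // a common SD H.264 bitrate
var ratio = jpegKbps / typicalVideoKbps; // roughly 3.8x more data
```

With those assumed numbers, per-frame JPEGs cost several times the bandwidth of a video encoder, which also exploits similarity between consecutive frames.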
Instead, I'll focus on why OP's code didn't work.
First, it may be good to have a small reminder of what a Blob is (the thing provided in the callback of canvas.toBlob(callback)).
A Blob is a special JavaScript object, which represents binary data, generally stored either in browser's memory, or at least on user's disk, accessible by the browser.
This binary data is not directly available to JavaScript though. To be able to access it, we need to either read this Blob (through a FileReader or a Response object), or to create a BlobURI, which is a fake URI, allowing most APIs to point at the binary data just like if it was stored on a real server, even though the binary data is still just in the browser's allocated memory.
But this BlobURI, being just a fake, temporary, and domain-restricted path to the browser's memory, cannot be shared with any other cross-domain document or application, let alone another computer.
All this to say that what should have been sent to the WebSocket, are the Blobs directly, and not the BlobURIs.
You'd create the BlobURIs only on the consumers' side, so that they can load these images from the Blob's binary data that is now in their allocated memory.
Emitter side:
canvas.toBlob(blob=>ws.send(blob));
Consumer side:
ws.onmessage = function(evt) {
    const blob = evt.data;
    const url = URL.createObjectURL(blob);
    img.src = url;
};
But actually, to even better answer OP's problem, here is a final solution, which is probably the best in this scenario:
Share the video stream that is painted on the canvas.

Chromecast sending Bitmap to receiver

Using the Chromecast dongle we are trying to show a Bitmap originating from android that changes often onto the receiver. The way we are currently doing it is converting the image to Base64 and then submitting that as the URL. This works but is very slow and seems inefficient. What would be the best way to show a local Bitmap onto the receiver?
Relevant Java code:
Bitmap bitmap1 = BitmapFactory.decodeByteArray(bytes, 0, bytes.length, options);
ByteArrayOutputStream baos = new ByteArrayOutputStream();
bitmap1.compress(CompressFormat.JPEG, 85, baos);
String image = "data:image/jpeg;base64,"+Base64.encodeToString(baos.toByteArray(), Base64.DEFAULT);
mMessageStream.loadMedia(image, mMetaData, true);
receiver.html:
<html>
<script src="https://www.gstatic.com/cast/js/receiver/1.0/cast_receiver.js">
</script>
<script type="text/javascript">
var receiver = new cast.receiver.Receiver(
'ACTUAL_APP_ID_HERE', [cast.receiver.RemoteMedia.NAMESPACE],
"",
3);
var remoteMedia = new cast.receiver.RemoteMedia();
remoteMedia.addChannelFactory(
receiver.createChannelFactory(cast.receiver.RemoteMedia.NAMESPACE));
receiver.start();
window.addEventListener('load', function() {
var elem = document.getElementById('vid');
remoteMedia.setMediaElement(elem);
});
</script>
<body>
<img id="vid" style="position:absolute;top:0;left:0;height:100%;width:100%" />
</body>
</html>
This is probably not the best method, but I have my app start a local web server and send the correct link to the receiver. On an HTTP request, the server then streams the file from disk. I haven't tested how fast/often it can be changed.
Use Base64OutputStream chained with other streams; right now you are loading the whole image into memory, which is very bad and will give you an out-of-memory error sooner or later.

WebRTC: how to get the webcam data as a stream of data?

I have a simple webpage where you can stream your webcam. I would like to take this stream and send it somewhere, but apparently I can't really access the stream itself. I have this code to run the stream:
navigator.webkitGetUserMedia({video: true}, gotStream, noStream);
And in gotStream, I tried many things to "redirect" this stream somewhere else, for example:
function gotStream(stream) {
stream_handler(stream)
//other stuff to show webcam output on the webpage
}
or
function gotStream(stream) {
stream.videoTracks.onaddtrack = function(track){
console.log("in onaddtrack");
stream_handler(track);
}
//other stuff to show webcam output on the webpage
}
But apparently the gotStream function gets called only once at the beginning, when the user grants permissions to the webcam to stream. Moreover the stream variable is not the stream itself but an object with some properties inside. How am I supposed to access the stream itself and redirect it wherever I want?
EDIT: You may be familiar with webglmeeting, a sort of face2face conversation apparently developed on top of WebRTC. I think that script is sending somehow the stream of data from one point to the other. I would like to achieve the same by understanding how to get the stream of data in the first place.
RE-EDIT: I don't want a conversion to image and sending the latter, I would like to work with the stream of data itself.
If you mean to stream your camera somewhere as PNG or JPEG, then I would use a canvas like this:
HTML
<video id="live" width="320" height="240" autoplay></video>
<canvas width="320" id="canvas" height="240" style="display:none;"></canvas>
JS ( jQuery )
var video = $("#live").get()[0];
var canvas = $("#canvas");
var ctx = canvas.get()[0].getContext('2d');

navigator.webkitGetUserMedia("video",
    function(stream) {
        video.src = webkitURL.createObjectURL(stream);
    }
);

setInterval(
    function () {
        ctx.drawImage(video, 0, 0, 320, 240);
        var data = canvas[0].toDataURL("image/jpeg");
    }, 1000);
