Play a blob video file - JavaScript

Imagine I have a video file and I want to build a blob URL from that file, then play it in an HTML page. So far I've tried this, but I could not make it work:
var files = URL.createObjectURL(new Blob([someVideoFile], {type: "video/mp4"}));
document.getElementById(videoId).setAttribute("src", files); // video tag id
document.getElementById(videoPlayer).load(); // this is the source tag id
document.getElementById(videoPlayer).play(); // this is the source tag id
It gives me a blob URL but won't play the video... Am I doing something wrong? I'm pretty new to Electron, so excuse me if my code isn't great.
I saw the similar questions mentioned in the comments, but they don't work for me, just as they don't work for others on those pages.

I know this is an old question, but it still deserves a working answer.
In order to play a video in the renderer context, you're on the right track: you can use a blob URL and assign it as the video source. Except a local file path is not a valid URL, which is why your current code doesn't work.
Unfortunately, in Electron there are currently only three ways to generate a blob from a file in the renderer context:
Have the user drag it into the window, and use the drag-and-drop API
Have the user select it via a file input: <input type="file"> (a sketch of this route follows the list)
Read the entire file with the 'fs' module, and generate a Blob from it
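For the first two options the browser hands you a File object, which is already a Blob. A minimal sketch of the file-input route, with illustrative element ids:
// assumes <input type="file" id="filePicker"> and <video id="videoPlayer"> exist in the page
document.getElementById('filePicker').addEventListener('change', (event) => {
  const file = event.target.files[0]; // a File is already a Blob
  if (!file) return;
  const video = document.getElementById('videoPlayer');
  video.src = URL.createObjectURL(file); // blob: URL backed by the selected file
  video.play();
});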
The third option (the only one that requires no user input) can be done as long as nodeIntegration is enabled, or if it is done in a non-sandboxed preload script. To accomplish this by streaming rather than loading the entire file into memory at once, the following module can be used:
// fileblob.js
const fs = require('fs');

// convert a file on disk into a Blob, reading it chunk by chunk
function fileToBlob(path, {bufferSize = 64 * 1024, mimeType = 'application/octet-stream'} = {}) {
  return new Promise((resolve, reject) => {
    // create an incoming stream from the file
    const stream = fs.createReadStream(path, {highWaterMark: bufferSize});
    // initialize an empty blob
    let blob = new Blob([], {type: mimeType});
    stream.on('data', buffer => {
      // append each chunk by building a new blob that concatenates the old one and the chunk
      blob = new Blob([blob, buffer], {type: mimeType});
    });
    stream.on('error', reject);
    stream.on('close', () => {
      // resolve with the resulting blob
      resolve(blob);
    });
  });
}
// convert a Blob into a file on disk
function blobToFile(blob, path, {bufferSize = 64 * 1024} = {}) {
  return new Promise((resolve, reject) => {
    // create an outgoing stream to the file
    const stream = fs.createWriteStream(path);
    stream.on('error', reject);
    stream.on('ready', async () => {
      // iterate over the blob one chunk at a time
      for (let i = 0; i < blob.size; i += bufferSize) {
        // read a chunk
        const slice = await blob.slice(i, i + bufferSize).arrayBuffer();
        // write the chunk
        if (!stream.write(new Uint8Array(slice))) {
          // wait for the next drain event before writing more
          await new Promise(resolve => stream.once('drain', resolve));
        }
      }
      // close the file and resolve
      stream.on('close', () => resolve());
      stream.close();
    });
  });
}
module.exports = {
fileToBlob,
blobToFile,
};
Then, in a preload script or in the main context with nodeIntegration enabled, something like the following would load the file into a blob and use it for the video player:
const {fileToBlob} = require('./fileblob');

fileToBlob("E:/nodeJs/test/app/downloads/clips/test.mp4", {mimeType: "video/mp4"}).then(blob => {
  const url = URL.createObjectURL(blob);
  document.getElementById(videoId).setAttribute("src", url);
  document.getElementById(videoPlayer).load();
  document.getElementById(videoPlayer).play();
});
Again, unfortunately this is slow for large files. We're still waiting for a better solution from Electron:
https://github.com/electron/electron/issues/749
https://github.com/electron/electron/issues/35629

Try
video.src = window.URL.createObjectURL(vid);
For more details, please refer to this answer.

Related

Store canvas screenshot to local filesystem with Filesystem API

I am developing a small web application with a three.js renderer where I want to be able to store some screenshots to my local filesystem at a specific location. Rendering the images and creating the snapshots works; only the storage/download part gives me a headache.
I create the snapshot this way:
let imageData = renderer.domElement.toDataURL('image/png');
console.log(imageData); // data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAABFAAAANBC...
Now to download this, there are several methods. window.location.href = imageData; does trigger a download (and several alert messages from the browser), but the downloaded image is stored as a file called download inside the Downloads folder. If the .png file extension is added, it can be opened. This is not a useful solution; several downloads in a row could hardly be told apart.
So another working method is to build a download link and trigger its click method:
let a_element = document.createElement('a');
a_element.setAttribute('href', imageData.replace('image/png', 'image/octet-stream'));
a_element.setAttribute('download', 'screenshot.png');
a_element.style.display = 'none';
document.body.appendChild(a_element);
a_element.click();
document.body.removeChild(a_element);
This downloads the screenshot with a custom name to the Downloads folder; it's almost what I want, but I'd like to have the files stored in a custom folder as well. That's where the File System Access API comes in. The drawback is that every download has to be confirmed, but not having to move the files manually out of the Downloads folder would make up for that.
const options = {
  types: [
    {
      description: 'Images',
      accept: {
        'image/png': ['.png'],
      },
    },
  ],
  suggestedName: 'Screenshot.png',
};
imgFileHandle = await window.showSaveFilePicker(options);
console.log("Save File chosen");
const writable = await imgFileHandle.createWritable();
await writable.write(imageData);
await writable.close();
This seems to work, but the file that is stored is not a valid image. It contains only text (the same as the console output). How should I treat the imageData to save it as an image?
For the File System Access API, you're almost there, but instead of a data URL, you need to provide the data as a blob. To create a blob from a data URL, you can use the code below:
function dataURItoBlob(dataURI) {
  const byteString = atob(dataURI.split(',')[1]);
  const mimeString = dataURI.split(',')[0].split(':')[1].split(';')[0];
  const ab = new ArrayBuffer(byteString.length);
  const ia = new Uint8Array(ab);
  for (let i = 0; i < byteString.length; i++) {
    ia[i] = byteString.charCodeAt(i);
  }
  const blob = new Blob([ab], {type: mimeString});
  return blob;
}
It may be easier to get the Blob directly, via the canvas's toBlob() method. This should work in three.js as well, since renderer.domElement is an HTMLCanvasElement.
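For instance, a minimal sketch of that route, reusing the options object from the question; toBlob() delivers the Blob through a callback, so it is wrapped in a Promise here:
// get a Blob straight from the canvas instead of going through a data URL
const blob = await new Promise(resolve =>
  renderer.domElement.toBlob(resolve, 'image/png')
);
const handle = await window.showSaveFilePicker(options);
const writable = await handle.createWritable();
await writable.write(blob); // writing a Blob, not a string, yields a valid image
await writable.close();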

MediaRecorder, get partial videos on interval during recording? [duplicate]

I want to record the user's microphone in 5-second-long segments and upload each to the server. I tried using MediaRecorder and calling the start() and stop() methods at a 5-second interval, but when I concatenate these recordings there is a "drop" sound between them. So I tried to record 5-second segments using the timeslice parameter of the start() method:
navigator.mediaDevices.getUserMedia({ audio: { channelCount: 2, volume: 1.0, echoCancellation: false, noiseSuppression: false } }).then(function(stream) {
  const recorder = new MediaRecorder(stream, { audioBitsPerSecond: 128000, mimeType: "audio/ogg; codecs=opus" });
  recorder.start(5000);
  recorder.addEventListener("dataavailable", function(event) {
    const audioBlob = new Blob([event.data], { type: 'audio/ogg' });
    upload(audioBlob);
  });
});
But only the first segment is playable. What can I do, or how can I make all the blobs playable?
I MUST record and then upload each segment. I CAN'T keep an array of blobs in memory, because the user could record 24 hours of data or even more, and the data needs to be uploaded to the server while the user is recording, with a 5-second delay.
Thank you!
You have to understand how media files are built.
They are not just raw data that can be converted to audio or video directly.
It depends on the format chosen, but the basic case is that you have what is called metadata, which is like a dictionary describing how the file is structured.
This metadata is necessary for the software that will read the file, so it knows how to parse the actual data the file contains.
The MediaRecorder API is in a strange position here, since it must write this metadata and at the same time append data of undetermined length (it is a live recorder).
So what happens is that browsers put the main metadata at the beginning of the file, in such a way that they can simply push new data onto the file and still have a valid file (even though some info, like the duration, will be missing).
Now, what you get in the dataavailable event's data is only a part of a whole file that is being generated.
The first one will generally contain the metadata and some other data, depending on when the event was told to fire, but the next parts won't necessarily contain any metadata.
So you can't just grab these parts as standalone files, because the only valid file is the one made of all these parts joined together in a single Blob.
So, to your problem, you have different possible approaches:
You could send your server the latest slices you got from your recorder at a fixed interval, and merge them server-side.
const recorder = new MediaRecorder(stream);
const chunks = [];
recorder.ondataavailable = e => chunks.push(e.data);
recorder.start(); // you don't need the timeslice argument
setInterval(() => {
  // here we both empty the 'chunks' array and send its content to the server
  sendToServer(new Blob(chunks.splice(0, chunks.length)));
}, 5000);
And on the server side, you would append the newly sent data to the file being recorded.
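A minimal sketch of that server side, assuming a Node/Express endpoint; the /upload route and the recording.ogg path are illustrative:
const express = require('express');
const fs = require('fs');
const app = express();

// accept the raw request body regardless of content type, since the
// client-side Blob may be sent without an explicit MIME type
app.post('/upload', express.raw({ type: () => true, limit: '50mb' }), (req, res) => {
  // append each newly received slice to the single file being recorded
  fs.appendFile('recording.ogg', req.body, err => {
    res.sendStatus(err ? 500 : 204);
  });
});

app.listen(3000);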
Another way would be to generate a lot of small standalone files; to do this, you could simply create a new MediaRecorder at an interval:
function record_and_send(stream) {
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.onstop = e => sendToServer(new Blob(chunks));
  setTimeout(() => recorder.stop(), 5000); // we'll have a 5s media file
  recorder.start();
}
// generate a new file every 5s; note the stream must be passed along
setInterval(() => record_and_send(stream), 5000);
Doing so, each file will be standalone, with a duration of approximately 5 seconds, and you will be able to play these files one by one.
Now if you wish to store only a single file on the server, still using this method, you can very well merge these files together on the server side too, using e.g. a tool like ffmpeg.
Using a version of one of @Kalido's suggestions I got this working. It sends small standalone files to the server that won't produce any glitch in image or sound when they are concatenated into a unified file on the server side:
var mediaRecorder;
var recordingRunning = false;
var chunks;

// call this function to start the process
function startRecording(stream) {
  recordingRunning = true;
  chunks = [];
  mediaRecorder = new MediaRecorder(stream, { mimeType: "video/webm; codecs=vp9" });
  mediaRecorder.ondataavailable = function (e) {
    chunks.push(e.data);
  };
  mediaRecorder.onstop = function () {
    const actualChunks = chunks.splice(0, chunks.length);
    const blob = new Blob(actualChunks, { type: "video/webm; codecs=vp9" });
    uploadVideoPart(blob); // upload to server
  };
  recordVideoChunk(stream);
}

// call this function to stop the process
function stopRecording(stream) {
  recordingRunning = false;
  mediaRecorder.stop();
}

function recordVideoChunk(stream) {
  mediaRecorder.start();
  setTimeout(function() {
    if (mediaRecorder.state == "recording")
      mediaRecorder.stop();
    if (recordingRunning)
      recordVideoChunk(stream);
  }, 10000); // 10-second videos
}
Later, on the server, I concatenate them with this command:
# list.txt
file 'chunk1'
file 'chunk2'
file 'chunk3'
# command
ffmpeg -avoid_negative_ts 1 -f concat -safe 0 -i list.txt -c copy output.mp4

How can I convert raw binary data to a blob and display it in an img tag?

I'm making a dream journal application in Electron and Svelte. I have a custom file format containing a title, description, and one or more images. See:
[Screenshots: program input, file output]
When I need to, I can call ipcRenderer.invoke() to read the file in the main process, then return the result to the renderer process (don't worry, I'm using async/await to make sure I'm not just getting a promise; also, for my testing, I'm only sending back the Uint8Array representing the image).
After attempting to display the image and failing, I decided to check that I was receiving the information as intended. I sent the response as-is back to the main process and wrote it to a file; when I opened the file in Paint, it displayed.
So the information is correct. This is the code I tried to display the image with:
within <script>
let src;
onMount(async () => {
  let a = await ipcRenderer.invoke("file-read");
  console.log(a);
  let blob = new Blob(a, {type: 'image/png'});
  console.log(blob);
  ipcRenderer.send("verify-content", a); // this is the test I mentioned, where it was written to a file
  src = URL.createObjectURL(blob);
});
in the body
{#if src}
<img src={src} />
{/if}
I also tried it another way:
within <script>
onMount(async () => {
  let a = await ipcRenderer.invoke("file-read");
  console.log(a);
  let blob = new Blob(a, {type: 'image/png'});
  console.log(blob);
  ipcRenderer.send("verify-content", a);
  const img = document.createElement("img");
  img.src = URL.createObjectURL(blob);
  img.onload = function() {
    URL.revokeObjectURL(this.src);
  };
  document.getElementById("target").appendChild(img);
});
in the body
<div id="target"></div>
However, the image does not display. How can I display it? All the other "blob to img" examples I found used a type="file" <input /> tag. If possible, I'd like to avoid using a Base64 data URI. Thanks.
It turns out that I have to wrap my Uint8Array in an array when I make a blob out of it: the Blob constructor expects a sequence of blob parts, so a bare Uint8Array gets iterated element by element, with each number converted to text instead of being treated as binary data.
let blob = new Blob([a], {type: "image/png"});
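A quick sketch illustrating the difference (the byte values are just the start of a PNG header, used for illustration):
const bytes = new Uint8Array([137, 80, 78, 71]); // first four bytes of a PNG file

new Blob([bytes]).size; // 4 - one binary part containing the raw bytes, as intended
new Blob(bytes).size;   // 9 - four number parts stringified to the text "137807871"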

Image URL to File() object using JS

For a registration module in my Vue app I let users upload images in a form. I upload these images to my storage and save the download URL of the photo in the registration. When editing a registration I need to get the photos out of storage, which is simple since I have the URL. But I need it to be a File object. I have found ways to turn it into a blob, but it needs to be a File. How can I do this?
It can be done by requesting a blob and generating a File object. It is necessary to specify the MIME type of the blob.
const urlToObject = async () => {
  const response = await fetch(image); // here 'image' is the URL/location of the image
  const blob = await response.blob();
  const file = new File([blob], 'image.jpg', {type: blob.type});
  console.log(file);
}
The ES6 way, with a Promise (written in TypeScript):
const blobUrlToFile = (blobUrl: string): Promise<File> => new Promise((resolve) => {
  fetch(blobUrl).then((res) => {
    res.blob().then((blob) => {
      // please change 'file.extension' to something more meaningful,
      // or create a utility function to parse it from the URL
      const file = new File([blob], 'file.extension', {type: blob.type});
      resolve(file);
    });
  });
});
Since you are letting the user upload a file, you already have the file as a File object.
But if you want to convert it to a blob to make some edits and then convert it back to a File object, you can use the File() constructor.
const file = new File([blob], "imagename.png");
Also, notice that the File() constructor takes an array of blob parts as its first argument, not a single blob.

Chrome: play a video that is being downloaded via fetch/XHR

What I'm trying to achieve is to make Chrome load a video file as data (via the Fetch API, XHR, whatever) and to play it using <video> while it's still being downloaded without issuing two separate requests for the same URL and without waiting until the file is completely downloaded.
It's easy to get a ReadableStream from the Fetch API (response.body), yet I can't find a way to feed it into the video element. I've figured out I need a blob URL for this, which can be created using a MediaSource object. However, the SourceBuffer#appendStream method, which sounds like just what is needed, isn't implemented in Chrome, so I can't connect the stream directly to the MediaSource object.
I can probably read the stream in chunks, create Uint8Arrays out of them, and use SourceBuffer#appendBuffer, but this means playback won't start immediately unless the chunk size is really small. Also, it feels like manually doing something that all these APIs should be able to do out of the box. If there are no other solutions and I go this way, what caveats should I expect?
Are there probably other ways to create a blob URL for a ReadableStream? Or is there a way to make fetch and <video> share a request? There are so many new APIs that I could easily miss something.
After hours of experimenting, I found a half-working solution:
const video = document.getElementById('audio');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');
  const response = await fetch(audioSRC);
  const reader = response.body.getReader();
  while (true) {
    const {value, done} = await reader.read();
    if (done) break;
    // append the chunk and wait for the buffer to finish updating before the next read
    await new Promise((resolve) => {
      sourceBuffer.onupdateend = () => resolve();
      sourceBuffer.appendBuffer(value);
    });
  }
  mediaSource.endOfStream(); // signal that the whole stream has been appended
});
It works with the MediaSource API: https://developer.mozilla.org/en-US/docs/Web/API/MediaSource
Also, I tested this only with the webm/opus format, but I believe it should work with other formats as well, as long as you specify them.
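When trying other formats, note that MediaSource only accepts codec strings the browser supports, so it is worth checking before creating the source buffer. A minimal sketch (the mime string is illustrative):
const mime = 'video/webm; codecs="vp9,opus"'; // example codec string
if (MediaSource.isTypeSupported(mime)) {
  const sourceBuffer = mediaSource.addSourceBuffer(mime);
  // ... append data as above
} else {
  console.warn('Codec not supported by this browser:', mime);
}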
