ffmpeg.FS('readFile', 'demo.mp4') error. Check if the path exists - javascript

I'm trying to convert a RecordRTC WebM file to MP4 using the ffmpeg.js library in a Next.js project. During conversion it throws this error:
ffmpeg.FS('readFile', 'demo.mp4') error. Check if the path exists
const stopRecording = async () => {
  if (recordRef.current) {
    let blob = {}
    recordRef.current.stopRecording(function(url) {
      blob = recordRef.current.getBlob()
    });
    const webmFile = new File([blob], 'video.webm', {
      type: 'video/webm'
    })
    convertBobToMpeg(webmFile)
  }
}

const convertBobToMpeg = async (webmFile) => {
  await ffmpeg.load();
  await ffmpeg.FS('writeFile', 'video.webm', await fetchFile(webmFile))
  ffmpeg.run('-i', 'video.webm', 'demo.mp4');
  const data = await ffmpeg.FS('readFile', 'demo.mp4');
  setVideo_obj(URL.createObjectURL(new Blob([data.buffer], {
    type: 'video/mp4'
  })))
}
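A likely fix, sketched under the assumption that this is the ffmpeg.wasm 0.x API (FS/run/fetchFile) shown above: readFile runs before the conversion has finished because ffmpeg.run() is not awaited, and the recording blob is only available inside the stopRecording callback, so the conversion has to start there.

// Sketch only: start the conversion inside the stopRecording callback and await ffmpeg.run().
const stopRecording = () => {
  if (recordRef.current) {
    recordRef.current.stopRecording(async () => {
      const blob = recordRef.current.getBlob(); // the blob only exists once this callback fires
      const webmFile = new File([blob], 'video.webm', { type: 'video/webm' });
      await convertBobToMpeg(webmFile);
    });
  }
};

const convertBobToMpeg = async (webmFile) => {
  await ffmpeg.load();
  ffmpeg.FS('writeFile', 'video.webm', await fetchFile(webmFile));
  await ffmpeg.run('-i', 'video.webm', 'demo.mp4'); // wait for the conversion to finish
  const data = ffmpeg.FS('readFile', 'demo.mp4'); // demo.mp4 exists only after run() resolves
  setVideo_obj(URL.createObjectURL(new Blob([data.buffer], { type: 'video/mp4' })));
};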

Related

How to simulate paste from clipboard in Playwright?

I am trying to test image pasting functionality using Playwright.
I managed to copy an image to the clipboard but am not able to paste it.
This is my code:
test.only("Clipboard", async ({ browser }, testInfo) => {
  const context = await browser.newContext({ permissions: ["clipboard-read", "clipboard-write"] });
  const page = await context.newPage();
  await page.evaluate(async () => {
    const base64 = `data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAAUA
AAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO
9TXL0Y4OHwAAAABJRU5ErkJggg==`;
    const response = await fetch(base64);
    const blob = await response.blob();
    await navigator.clipboard.write([new ClipboardItem({ "image/png": blob })]);
    const clipboardImageHolder = document.getElementById("clipboard-image");
    clipboardImageHolder.focus();
    const result = await navigator.clipboard.readText();
    console.log(result);
  });
});
When I run the test and then press Ctrl+V manually, I see the image pasted in the div element.
Finally, this worked for me.
await page.evaluate(async () => {
  const base64 = `data:image/png;base64, iVBORw0KGgoAAAANSUhEUgAAAAUA
AAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO
9TXL0Y4OHwAAAABJRU5ErkJggg==`;
  const response = await fetch(base64);
  const blob = await response.blob();
  const clipboardImageHolder = document.getElementById("you-are-the-one");
  clipboardImageHolder.focus();
  let pasteEvent = new Event("paste", { bubbles: true, cancelable: true });
  pasteEvent = Object.assign(pasteEvent, {
    clipboardData: {
      items: {
        a: {
          kind: "file",
          getAsFile() {
            return new File([blob], "foo.png", { type: blob.type });
          },
        },
      },
    },
  });
  console.log("event", pasteEvent);
  clipboardImageHolder.dispatchEvent(pasteEvent);
});
I had to dispatch a paste event and attach a clipboardData object to it.
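For context, a page-side handler like the following (a hypothetical sketch, not the asker's actual page code) would accept that synthetic event, since it only reads clipboardData.items entries and calls getAsFile():

// Hypothetical page-side handler that the synthetic paste event above would satisfy.
document.getElementById("you-are-the-one").addEventListener("paste", (event) => {
  for (const item of Object.values(event.clipboardData.items)) {
    if (item.kind === "file") {
      const file = item.getAsFile();
      const img = document.createElement("img");
      img.src = URL.createObjectURL(file); // show the pasted image
      document.body.appendChild(img);
    }
  }
});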

Node js: Download file to memory and pass to library function without saving to disk

The below code is working fine, but I received some code review feedback:
"Why download and save the file to disk, only to read it back into memory?"
However, after spending some hours exploring options with Buffer and stream, I just don't seem to be getting anywhere.
const fs = require('fs');
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

const getPDFText = async ({ url }) => {
  const tmpDir = `${process.cwd()}/my_dir`;
  const writer = fs.createWriteStream(`${tmpDir}/document.pdf`);
  const response = await axios({
    url,
    method: 'get',
    responseType: 'stream'
  });
  response.data.pipe(writer);
  const text = await new Promise((resolve, reject) => {
    writer.on('finish', () => {
      const fileData = fs.readFileSync(`${tmpDir}/document.pdf`);
      PdfData.extract(fileData, {
        get: {
          // ...
        },
      })
        .then(resolve)
        .catch(reject);
    });
    writer.on('error', reject);
  });
  return text;
};
How can I avoid saving the file to disk and instead feed it directly into the PdfData.extract method?
The signature for .extract says it accepts a Uint8Array.
Something like
const { PdfData } = require('pdfdataextract');
const axios = require('axios').default;

async function getPDFText({ url }) {
  const response = await axios({
    url,
    method: 'get',
    responseType: 'arraybuffer',
  });
  const pdfUint8Array = new Uint8Array(response.data);
  const res = await PdfData.extract(pdfUint8Array, /* ... */);
  console.log(res);
  return res;
}
could do the trick?
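For what it's worth, axios in Node typically hands back a Buffer for responseType: 'arraybuffer', and a Buffer is itself a Uint8Array, so the extra wrap is mostly defensive. A quick usage sketch (the URL is just a placeholder):

// Hypothetical usage of the in-memory version above; the URL is a placeholder.
getPDFText({ url: 'https://example.com/document.pdf' })
  .then((text) => console.log('Extracted:', text))
  .catch((err) => console.error('Extraction failed:', err));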

Text -> PNG -> ReadStream, all done on the front-end?

I'm not sure if this is even possible, but here's what I'm trying to do:
1. Let the user enter some text
2. Generate a PNG from that text
3. Upload it to Pinata, which requires it to be in ReadStream format
4. Do all of this on the front-end
I've managed to accomplish (1) and (2) using html2canvas.
The tricky part is (3). It has to be in ReadStream format because that's the format Pinata's SDK wants:
const fs = require('fs');

const readableStreamForFile = fs.createReadStream('./yourfile.png');
const options = {
  pinataMetadata: {
    name: MyCustomName,
    keyvalues: {
      customKey: 'customValue',
      customKey2: 'customValue2'
    }
  },
  pinataOptions: {
    cidVersion: 0
  }
};
pinata.pinFileToIPFS(readableStreamForFile, options).then((result) => {
  // handle results here
  console.log(result);
}).catch((err) => {
  // handle error here
  console.log(err);
});
I realize that this would be no problem to do on the backend with node, but I'd like to do it on the front-end. Is that at all possible? Or am I crazy?
I'm specifically using Vue if that matters.
For anyone interested, the solution ended up being fetch + blob:
const generateImg = async () => {
  const canvas = await html2canvas(document.getElementById('hello'));
  const img = canvas.toDataURL('image/png');
  const res = await fetch(img);
  return res.blob();
};
This blob can then be passed into a more manual version of their SDK:
const uploadImg = (blob: Blob) => {
  const url = `https://api.pinata.cloud/pinning/pinFileToIPFS`;
  const data = new FormData();
  data.append('file', blob);
  const metadata = JSON.stringify({
    name: 'testname',
  });
  data.append('pinataMetadata', metadata);
  const pinataOptions = JSON.stringify({
    cidVersion: 0,
  });
  data.append('pinataOptions', pinataOptions);
  return axios
    .post(url, data, {
      maxBodyLength: 'Infinity' as any, // this is needed to prevent axios from erroring out with large files
      headers: {
        // @ts-ignore
        'Content-Type': `multipart/form-data; boundary=${data._boundary}`,
        pinata_api_key: apiKey,
        pinata_secret_api_key: apiSecret,
      },
    })
    .then((response) => {
      console.log(response);
    })
    .catch((error) => {
      console.log(error);
    });
};
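Tying the two together, a hypothetical caller might look like this (assuming apiKey and apiSecret are defined in scope, as in the snippet above):

// Hypothetical glue code: render the element to a PNG blob, then pin it to IPFS via Pinata.
const pinGeneratedImage = async () => {
  const blob = await generateImg();
  await uploadImg(blob);
};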

Resizing images with sharp before uploading to google cloud storage

I tried to resize or compress an image before uploading it to Google Cloud Storage.
The upload works fine, but the resizing does not seem to work.
Here is my code:
const uploadImage = async (file) => new Promise((resolve, reject) => {
  let { originalname, buffer } = file
  sharp(buffer)
    .resize(1800, 948)
    .toFormat("jpeg")
    .jpeg({ quality: 80 })
    .toBuffer()
  const blob = bucket.file(originalname.replace(/ /g, "_"))
  const blobStream = blob.createWriteStream({
    resumable: false
  })
  blobStream.on('finish', () => {
    const publicUrl = format(
      `https://storage.googleapis.com/${bucket.name}/${blob.name}`
    )
    resolve(publicUrl)
  }).on('error', () => {
    reject(`Unable to upload image, something went wrong`)
  })
  .end(buffer)
})
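A likely cause, for what it's worth: .toBuffer() resolves asynchronously to a new buffer that this code never uses, so blobStream.end(buffer) still writes the original bytes. A minimal sketch of that change, assuming the same sharp and bucket setup as above:

// Sketch only: await sharp's output and upload the resized buffer instead of the original one.
const uploadImage = async (file) => {
  const { originalname, buffer } = file;
  const resizedBuffer = await sharp(buffer)
    .resize(1800, 948)
    .jpeg({ quality: 80 })
    .toBuffer();
  const blob = bucket.file(originalname.replace(/ /g, "_"));
  const blobStream = blob.createWriteStream({ resumable: false });
  return new Promise((resolve, reject) => {
    blobStream.on('finish', () => {
      resolve(`https://storage.googleapis.com/${bucket.name}/${blob.name}`);
    });
    blobStream.on('error', () => reject(new Error('Unable to upload image, something went wrong')));
    blobStream.end(resizedBuffer); // write the resized image, not the original buffer
  });
};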
I ran into the same issue with a project I was working on. After lots of trial and error I found the following solution. It might not be the most elegant, but it worked for me.
In my upload route function I created a new thumbnail image object with the original file values and passed it as the file parameter to the uploadFile function for google cloud storage.
Inside my upload image route function:
const file = req.file;
const thumbnail = {
  fieldname: file.fieldname,
  originalname: `thumbnail_${file.originalname}`,
  encoding: file.encoding,
  mimetype: file.mimetype,
  buffer: await sharp(file.buffer).resize({ width: 150 }).toBuffer()
};
const uploadThumbnail = await uploadFile(thumbnail);
My google cloud storage upload file function:
const uploadFile = async (file) => new Promise((resolve, reject) => {
  const gcsname = file.originalname;
  const bucketFile = bucket.file(gcsname);
  const stream = bucketFile.createWriteStream({
    resumable: false,
    metadata: {
      contentType: file.mimetype
    }
  });
  stream.on('error', (err) => {
    reject(err);
  });
  stream.on('finish', (res) => {
    resolve({
      name: gcsname
    });
  });
  stream.end(file.buffer);
});
I think the problem is with toFormat(). That function does not exist in the Docs. Can you try to remove it and check if it would work?
sharp(buffer)
  .resize(1800, 948)
  .jpeg({ quality: 80 })
  .toBuffer()
Modify the metadata once you have finished uploading the image.
import * as admin from "firebase-admin";
import * as functions from "firebase-functions";
import { log } from "firebase-functions/logger";
import * as sharp from "sharp";

export const uploadFile = functions.https.onCall(async (data, context) => {
  const bytes = data.imageData;
  const bucket = admin.storage().bucket();
  const buffer = Buffer.from(bytes, "base64");
  const bufferSharp = await sharp(buffer)
    .png()
    .resize({ width: 500 })
    .toBuffer();
  const nombre = "IMAGE_NAME.png";
  const fileName = `img/${nombre}.png`;
  const fileUpload = bucket.file(fileName);
  const uploadStream = fileUpload.createWriteStream();
  uploadStream.on("error", async (err) => {
    log("Error uploading image", err);
    throw new functions.https.HttpsError("unknown", "Error uploading image");
  });
  uploadStream.on("finish", async () => {
    await fileUpload.setMetadata({ contentType: "image/png" });
    log("Upload success");
  });
  uploadStream.end(bufferSharp);
});
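For completeness, a hypothetical client-side call to that callable (assuming the Firebase v9 modular SDK; imageBase64 is a raw base64 string without the data: prefix, since the function decodes it with Buffer.from(bytes, "base64")):

// Hypothetical client-side usage of the callable above.
import { getFunctions, httpsCallable } from "firebase/functions";

const uploadFile = httpsCallable(getFunctions(), "uploadFile");
await uploadFile({ imageData: imageBase64 }); // imageBase64: raw base64 of the PNG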

JavaScript Azure Blob Storage move blob

I have a NodeJS backend which uses the official Blob Storage library (@azure/storage-blob) from Microsoft to manage my Blob Storage:
https://www.npmjs.com/package/@azure/storage-blob
I need to move a blob from one folder to another.
Unfortunately, I can't find any documentation for that.
What I have done so far is:
const { BlobServiceClient } = require("@azure/storage-blob");

const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.storageconnection);
const containerClient = blobServiceClient.getContainerClient('import');
const blobClient = containerClient.getBlobClient('toImport/' + req.body.file);
const downloadBlockBlobResponse = await blobClient.download();
// ... do some stuff with the value of the files
As you can see in the code, I read a file from the folder "toImport". After that I want to move the file to another folder, "finished". Is that possible? Maybe I need to create a copy of the file and delete the old one?
A move operation as such is not supported in Azure Blob Storage. What you have to do is copy the blob from source to destination, monitor the copy progress (because the copy operation is asynchronous), and delete the source blob once the copy is complete.
For copying, the method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions).
Please see this code:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key;EndpointSuffix=core.windows.net";
const container = "container-name";
const sourceFolder = "source";
const targetFolder = "target";
const blobName = "blob.png";

async function moveBlob() {
  const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
  const containerClient = blobServiceClient.getContainerClient(container);
  const sourceBlobClient = containerClient.getBlobClient(`${sourceFolder}/${blobName}`);
  const targetBlobClient = containerClient.getBlobClient(`${targetFolder}/${blobName}`);
  console.log('Copying source blob to target blob...');
  const copyResult = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
  console.log('Blob copy operation started successfully...');
  console.log(copyResult);
  do {
    console.log('Checking copy status...');
    const blobCopiedSuccessfully = await checkIfBlobCopiedSuccessfully(targetBlobClient);
    if (blobCopiedSuccessfully) {
      break;
    }
  } while (true);
  console.log('Now deleting source blob...');
  await sourceBlobClient.delete();
  console.log('Source blob deleted successfully....');
  console.log('Move operation complete.');
}

async function checkIfBlobCopiedSuccessfully(targetBlobClient) {
  const blobPropertiesResult = await targetBlobClient.getProperties();
  const copyStatus = blobPropertiesResult.copyStatus;
  return copyStatus === 'success';
}

moveBlob();
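One small refinement, not part of the original answer: the do/while loop polls getProperties() back to back, so a short pause between checks is kinder to the service (the poller returned by beginCopyFromURL also exposes pollUntilDone(), which would replace the loop entirely):

// Sketch: add a delay between copy-status checks inside the do/while loop above.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

do {
  console.log('Checking copy status...');
  const blobCopiedSuccessfully = await checkIfBlobCopiedSuccessfully(targetBlobClient);
  if (blobCopiedSuccessfully) {
    break;
  }
  await sleep(1000); // check roughly once per second
} while (true);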
The previous solution seems to work, but I don't like using an infinite loop.
So this is an alternative way to move a blob file:
const move = async (
  fileName: string,
  src: string,
  dest: string
) => {
  try {
    // connectionString and logger are assumed to be defined elsewhere in the surrounding module/class
    const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
    logger.info(`Move storage file [ ${src} -> ${dest} | ${fileName} ]`);
    const srcContainerClient = blobServiceClient.getContainerClient(src);
    const destContainerClient = blobServiceClient.getContainerClient(dest);
    const blobClient = srcContainerClient.getBlobClient(fileName);
    const downloadBlockBlobResponse = await blobClient.download();
    const buffer = await streamToBuffer(
      downloadBlockBlobResponse.readableStreamBody!
    );
    await blobClient.delete();
    const blockBlobClient = destContainerClient.getBlockBlobClient(fileName);
    await blockBlobClient.upload(buffer, buffer.length);
    // this.storageUrl is assumed to come from the surrounding class
    return `${this.storageUrl}/${destContainerClient.containerName}/${fileName}`;
  } catch (e) {
    throw new Error(
      `Fail to move storage file [ ${src} -> ${dest} | ${fileName} ]`
    );
  }
};
const streamToBuffer = async (readableStream: NodeJS.ReadableStream): Promise<Buffer> => {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
};
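A short usage sketch with hypothetical names; note that, unlike the previous answer, src and dest here are container names rather than virtual folders within one container:

// Hypothetical usage: move blob.png from the "import" container to the "finished" container.
const newUrl = await move("blob.png", "import", "finished");
console.log(newUrl);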
