I'm trying to create a blob URL so I can use a string as the subtitle file for JW Player.
Subtitles are loaded like this:
const playlistItem = {
  ...
  tracks: [
    {
      file: 'https://myfakesite.org/subtitles.vtt',
      label: 'en'
    }
  ]
}
Because JW Player doesn't accept my source format (subtitles.ass), I converted the .ass file to .vtt, which left me with a string.
Like this:
var vttRaw = `WEBVTT

00:00:25.520 --> 00:00:29.250
Naquele dia,
a humanidade foi lembrada...

00:00:35.110 --> 00:00:38.180
do terror de estar à mercê deles`;
Since JW Player needs a URL, I converted this string to a blob URL:
// Generate blob
var blob = new Blob([vttRaw], {
  type: "text/vtt; charset=utf-8"
});

// Generate url
var vtt_url = URL.createObjectURL(blob) + "#.vtt";
That works in a web browser, but in React Native on Android it results in an error:
Possible Unhandled Promise Rejection (id: 0):
Error: Cannot create URL for blob!
blob error
I think the problem is generating the blob URL. Does anyone know what I can do?
I had this issue in React Native for Android. The following worked on iOS but not on Android:
URL.createObjectURL(blob)
Try passing base64 data rather than the URL, as in this post.
Maybe something like this:
// Generate blob
var blob = new Blob([vttRaw], {
  type: "text/vtt; charset=utf-8"
});

// Read the blob back as a base64 data URL
var vtt_data;
const fileReaderInstance = new FileReader();
fileReaderInstance.readAsDataURL(blob);
fileReaderInstance.onload = () => {
  vtt_data = fileReaderInstance.result; // "data:text/vtt;...;base64,..."
};
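As an alternative sketch, the data URL can also be built directly from the string, with no asynchronous FileReader step. This uses Node's Buffer for the base64 encoding (in a browser you could use btoa on suitably escaped text instead); the variable names are illustrative:

```javascript
// Build a "data:" URL for the VTT text directly, without FileReader.
const vttRaw = `WEBVTT

00:00:25.520 --> 00:00:29.250
Naquele dia,
a humanidade foi lembrada...`;

// Base64-encode the UTF-8 bytes and prepend the proper data-URL prefix.
const vttDataUrl =
  'data:text/vtt;charset=utf-8;base64,' +
  Buffer.from(vttRaw, 'utf-8').toString('base64');
```

The resulting string can then be passed as the tracks file entry the same way as the FileReader result.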
I am working on a project where I have to upload an image as form data along with other text fields. I have my file as a Base64 string at first, then I convert it into a File before uploading it to the server.
const data = await fetch(base64String);
const blob = await data.blob();
const file = await new File([blob], 'avatar', { type: 'image/png' });
I logged the base64 string on the client side before uploading it to the server, where it is uploaded as a File. When I log it as a base64 string again on the server side, before saving it to MongoDB, I see that the string is no longer the same as before. I feel like I am doing something wrong while converting the base64 string to a file on the client side. Please help me out.
I have figured out my problem. When I take an image file input from my computer, I get a base64 string like the one below:
data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAA...
But when I convert it back into a file, the conversion expects a string like the one below:
/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAA....
So, basically, I had to trim the string accordingly to match the expected format and wrote a base64 to file conversion function following this answer.
Here is my function to convert a base64 string to an image file
export function getFileFromBase64(string64: string, fileName: string) {
  // Strip the data-URL prefix so atob() receives plain base64
  const trimmedString = string64.replace('data:image/jpeg;base64,', '');
  const imageContent = atob(trimmedString);
  const buffer = new ArrayBuffer(imageContent.length);
  const view = new Uint8Array(buffer);
  for (let n = 0; n < imageContent.length; n++) {
    view[n] = imageContent.charCodeAt(n);
  }
  const type = 'image/jpeg';
  const blob = new Blob([buffer], { type });
  return new File([blob], fileName, { lastModified: new Date().getTime(), type });
}
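A hypothetical generalization of the trimming step: rather than hard-coding the JPEG prefix, a regular expression can strip any data-URL prefix, so the same function would work for PNGs and other types too:

```javascript
// Strip a "data:<mime>;base64," prefix regardless of the MIME type.
// A sketch; the function name is illustrative.
function stripDataUrlPrefix(string64) {
  return string64.replace(/^data:[^;]+;base64,/, '');
}

const trimmed = stripDataUrlPrefix('data:image/png;base64,iVBORw0KGgo');
// trimmed is now just the base64 payload, "iVBORw0KGgo"
```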
There are actually many answers to this question, but my problem is:
I want to generate a PDF dynamically with 5 external (URL) images. I'm using the PDFmake node module.
It supports only two ways, local files and base64 format, but I don't want to store the images locally.
So my requirement is one function which takes a URL as a parameter and returns base64,
so that I can store it in a global variable and create PDFs.
Thanks in advance
function urlToBase(URL) {
  return base64;
}
var img = urlToBase('https://unsplash.com/photos/MVx3Y17umaE');
var dd = {
  content: [
    {
      text: 'fjfajhal'
    },
    {
      image: img,
    }
  ]
};
var writeStream = fs.createWriteStream('myPdf.pdf');
var pdfDoc = printer.createPdfKitDocument(dd);
pdfDoc.pipe(writeStream);
pdfDoc.end();
I'm using the PDFmake module from npm.
The contents of the remote image can first be fetched with an HTTP request, for example using the ubiquitous request npm module. The image contents can then be wrapped in a Buffer and converted to a base64 string. To complete the transformation, add the proper data-URL prefix (for example data:image/png;base64,) to the beginning of the base64 string.
Here is a rough example for a PNG image:
const request = require('request-promise-native');

let pngDataUrlPrefix = 'data:image/png;base64,';
let imageUrl = 'https://www.google.com/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png';

request({
  url: imageUrl,
  method: 'GET',
  encoding: null // Important: without this, the image bytes are decoded with the default string encoding
})
.then(result => {
  let imageBuffer = Buffer.from(result);
  let imageBase64 = imageBuffer.toString('base64');
  let imageDataUrl = pngDataUrlPrefix + imageBase64;
  console.log(imageDataUrl);
});
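The request module has since been deprecated; on Node 18+ the built-in fetch can do the same job. A sketch under that assumption (the function name and default MIME type are illustrative; the buffer-to-base64 conversion step is identical):

```javascript
// Hypothetical helper: fetch a remote image and return a data URL.
// Assumes Node 18+ (global fetch) and that the MIME type is known.
async function urlToBase64(url, mime = 'image/png') {
  const response = await fetch(url);
  const buffer = Buffer.from(await response.arrayBuffer());
  return `data:${mime};base64,` + buffer.toString('base64');
}

// The bytes-to-base64 step on its own, without the network call:
const sampleBase64 = Buffer.from('hello').toString('base64'); // "aGVsbG8="
```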
I'm trying to upload a Blob object into S3, which does get uploaded but in a corrupted way. The application is about recording audio on a web page and saving it to S3.
HTML + Javascript Code:
<p>
<button id=startRecord>START</button>
<button id=stopRecord disabled>Submit</button>
</p>
<p id="recording"></p>
<p>
<a id=audioDownload></a>
</p>
<script
src="https://code.jquery.com/jquery-3.4.1.min.js"
integrity="sha256-CSXorXvZcTkaix6Yvo6HppcZGetbYMGWSFlBw8HfCJo="
crossorigin="anonymous"></script>
<script type="text/javascript">
  var audioContent;
  navigator.mediaDevices.getUserMedia({ audio: true })
    .then(stream => {
      rec = new MediaRecorder(stream);
      rec.ondataavailable = e => {
        audioChunks.push(e.data);
        if (rec.state == "inactive") {
          let blob = new Blob(audioChunks);
          // audioContent = blob
          // audioContent = URL.createObjectURL(new Blob(audioChunks));
          // console.log(audioContent);
          $.ajax({
            type: "POST",
            url: 'https://aws-api-url/prod/audio',
            data: new Blob(audioChunks),
            crossDomain: true,
            processData: false,
            headers: { "x-api-key": 'someKey' },
            contentType: false
          });
          // audioDownload.href = audioContent;
          // audioDownload.download = 'test';
          // audioDownload.innerHTML = 'download';
        }
      }
    })
    .catch(e => console.log(e));

  startRecord.onclick = e => {
    startRecord.disabled = true;
    stopRecord.disabled = false;
    document.getElementById("recording").innerHTML = "Listening...";
    audioChunks = [];
    rec.start();
  }

  stopRecord.onclick = e => {
    document.getElementById("recording").innerHTML = "";
    startRecord.disabled = false;
    stopRecord.disabled = true;
    rec.stop();
  }
</script>
AWS Lambda that dumps into S3
import json
import boto3

def lambda_handler(event, context):
    # TODO implement
    s3_client = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')
    s3_client.put_object(Body=event['body'], Bucket='bucket', Key='incoming/test.wav')
    return {
        'statusCode': 200,
        'headers': {
            'Access-Control-Allow-Origin': '*'
        },
        'body': json.dumps(event)
    }
What changes can I make to my JavaScript to send this data safely?
MediaRecorder does not generate the .wav data type. By default it probably generates data with the MIME type audio/webm;codecs=opus or audio/ogg;codecs=vorbis. Your lambda function looks like it faithfully stores the incoming data, but it's not .wav data; it's something else.
Your sample code lets MediaRecorder choose its own MIME type. In this case you should ask it what it used. For example
rec = new MediaRecorder(stream);
console.log (rec.mimeType);
Or, you can (try to) tell it the MIME type you want. In this case you should still ask it what it actually used. (Browsers vary in the MIME types they generate, and in the ways they respond when they can't deliver the exact type you want.) If your browser can do it, this code will probably generate mp3 (aka MPEG Layer III) audio.
rec = new MediaRecorder(stream, {mimeType: "audio/mpeg"});
console.log (rec.mimeType);
Or, you can try the audio/mp4 MIME type, and see what audio codec you get. It may vary from browser to browser.
You can generally use ffmpeg to convert any MIME type to another once you've recorded the audio. This is handy if you require .wav output, or some other particular format. It takes some hacking, but you can do it in your lambda function.
I've tried all of the methods I could find on Stack Overflow. These two are some of the most complete posts:
Display image from blob using javascript and websockets
How can you encode a string to Base64 in JavaScript?
I'm using cloudinary and id3js. First I upload the mp3 file to cloudinary, then I request the file with Ajax through id3js. This gives me all of the ID3 tags.
openUploadModal() {
  cloudinary.openUploadWidget(window.cloudinaryOptions,
    (errors, track) => {
      if (!values(errors).length) {
        id3(track[0].secure_url, (errs, tags) => {
          this.setState({
            title: tags.title,
            audio_url: track[0].secure_url,
            artist: tags.artist,
            uploaded: true,
            cover_photo: this.getImage(tags.v2.image)
          });
        });
      }
    });
}
And the image converter:
And the image converter:
getImage(image) {
  var arrayBuffer = image.data;
  var bytes = new Uint8Array(arrayBuffer);
  return "data:image/png;base64," + btoa(unescape(encodeURIComponent(bytes)));
}
This is what the tags object looks like: [screenshot omitted]
I then use the return value of getImage in the background-image attribute of a div. There are no errors in the console (not a bad request), but when I open the data:image/jpg;base64,... link there's only a little white square on the page.
How can I get a working url from the image object in the ID3 tags?
If image.data is an ArrayBuffer, you can use FileReader. The FileReader load event is asynchronous, so you cannot return the result from the function without using a Promise, though you can use FileReaderSync() inside a Worker.
See also createImageBitmap alternative on Safari.
var reader = new FileReader();
reader.onload = function() {
  // do stuff with the `data URI` of `image.data`
  console.log(reader.result);
};
reader.readAsDataURL(new Blob([image.data], { type: image.mime }));
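Where Blob.prototype.arrayBuffer() is available (modern browsers, Node 18+), a Promise-based sketch avoids the FileReader callback entirely. The helper name is hypothetical, and the Buffer call is a Node detail; in a browser you would base64-encode the bytes with btoa instead:

```javascript
// Hypothetical helper: turn a Blob into a data URI via arrayBuffer().
// Assumes a global Blob and Node's Buffer for the base64 step.
async function blobToDataUrl(blob, mime) {
  const bytes = Buffer.from(await blob.arrayBuffer());
  return `data:${mime};base64,` + bytes.toString('base64');
}
```

Usage would then look like: const uri = await blobToDataUrl(new Blob([image.data]), image.mime);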
I am using the jsPDF plugin, which generates a PDF and saves it to the local file system. In jsPDF.js, there is a piece of code which generates PDF data in blob format:
var blob = new Blob([array], {type: "application/pdf"});
and then saves the blob data to the local file system. Now, instead of saving it, I need to print the PDF using the node-printer plugin.
Here is some sample code to do so
var fs = require('fs');
var dataToPrinter;

fs.readFile('/home/ubuntu/test.pdf', function(err, data) {
  dataToPrinter = data;
});

var printer = require("../lib");
printer.printDirect({
  data: dataToPrinter,
  printer: 'Deskjet_3540',
  type: 'PDF',
  success: function(id) {
    console.log('printed with id ' + id);
  },
  error: function(err) {
    console.error('error on printing: ' + err);
  }
});
fs.readFile() reads the PDF file and produces its data as a raw buffer. What I want is to convert the Blob data into such a raw buffer so that I can print the PDF.
If you are not using Node.js, then note that the browser does not have a Buffer class implementation; you are probably compiling your code for a browser environment with something like browserify. In that case you need a library that converts your blob into a Buffer class designed to be as close to a Node.js Buffer as possible (the implementation is at feross/buffer).
If you are using node-fetch (not OP's case) then you probably got a blob from a response object:
const fetch = require("node-fetch");
const response = await fetch("http://www.stackoverflow.com/");
const blob = await response.blob();
This blob is an internal implementation and exists only inside the node-fetch or fetch-blob libraries; to convert it to a native Node.js Buffer object, you need to transform it to an arrayBuffer first:
const arrayBuffer = await blob.arrayBuffer();
const buffer = Buffer.from(arrayBuffer);
This buffer object can then be used on things such as file writes and server responses.
For me, it worked with the following:
const buffer = Buffer.from(blob, 'binary');
This buffer can then be stored in Google Cloud Storage or on the local disk with the fs node package.
I used a blob to send data from the client to the server through the DDP protocol (Meteor), so when the file arrives at the server I convert it to a buffer in order to store it.
var blob = new Blob([array], {type: "application/pdf"});
var arrayBuffer, uint8Array;
var fileReader = new FileReader();

fileReader.onload = function() {
  arrayBuffer = this.result;
  uint8Array = new Uint8Array(arrayBuffer);

  var printer = require("./js/controller/lib");
  printer.printDirect({
    data: uint8Array,
    printer: 'Deskjet_3540',
    type: 'PDF',
    success: function(id) {
      console.log('printed with id ' + id);
    },
    error: function(err) {
      console.error('error on printing: ' + err);
    }
  });
};

fileReader.readAsArrayBuffer(blob);
This is the final code which worked for me. The printer accepts uint8Array encoding format.
Try converting through an ArrayBuffer; the new Buffer(blob, "binary") constructor is deprecated, and it does not understand Blob objects:
var blob = new Blob([array], {type: "application/pdf"});
blob.arrayBuffer().then(function(arrayBuffer) {
  var buffer = Buffer.from(arrayBuffer);
  // use the buffer here
});