How to access data from ReadableStream response? - javascript

I made an API call to an endpoint and it returns this:
const test = () => {
  fetch(endpoint)
    .then((response) => {
      console.log(response.body);
    })
    .catch((err) => {
      console.log(err);
    });
};
How can I convert the ReadableStream to Base64? The endpooint returns a PNG file.

You can read the response as a blob and convert it to base64 with a FileReader:
const test = () => {
  fetch(endpoint)
    .then((response) => {
      return response.blob();
    })
    .then((blob) => {
      var reader = new FileReader();
      reader.readAsDataURL(blob);
      reader.onloadend = function () {
        var base64data = reader.result;
        console.log(base64data);
      };
    })
    .catch((err) => {
      console.log(err);
    });
};
MDN on .blob()
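If you want to avoid the blob round-trip, response.body can also be consumed directly with getReader(). A minimal sketch of that approach, run here against an in-memory stream standing in for response.body (in the browser you would btoa() a binary string instead of using Node's Buffer):

```javascript
// Read every chunk of a ReadableStream, concatenate the bytes,
// and base64-encode the result.
async function streamToBase64(stream) {
  const reader = stream.getReader();
  const chunks = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.length;
  }
  // Concatenate all Uint8Array chunks into one buffer.
  const bytes = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    bytes.set(chunk, offset);
    offset += chunk.length;
  }
  return Buffer.from(bytes).toString('base64');
}

// Demo with an in-memory stream standing in for response.body:
const demo = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode('PNG'));
    controller.close();
  },
});

streamToBase64(demo).then((b64) => console.log(b64)); // "UE5H"
```

Note that this buffers the whole body in memory, which is fine for a single PNG but not for very large responses.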

Related

Size constraints when creating image objects in Chrome browser

I tried to create an image object from a 170MB image file in Chrome browser.
However, an error occurs (screenshot omitted).
What's the cause?
Is there a size constraint on creating an image object?
const loadFromFile = (file: File) => {
  return new Promise((resolve, reject) => {
    const reader = new CustomFileReader();
    reader.onload = () => resolve(reader.result);
    reader.onerror = (e) => reject(e);
    reader.readAsDataURL(file);
  });
};

const createImage = (dataUrl) => {
  return new Promise((resolve, reject) => {
    const img = new Image();
    img.onload = () => resolve(img);
    img.onerror = (error) => { // fires with an error event
      console.log('image load error', error);
      reject(error);
    };
    img.src = dataUrl;
  });
};

const load = async (file: File) => {
  const dataUrl = await loadFromFile(file); // successful
  const image = await createImage(dataUrl); // error
};
This is the image used for the test: https://upload.wikimedia.org/wikipedia/commons/thumb/6/68/La_crucifixi%C3%B3n%2C_by_Juan_de_Flandes%2C_from_Prado_in_Google_Earth.jpg/2560px-La_crucifixi%C3%B3n%2C_by_Juan_de_Flandes%2C_from_Prado_in_Google_Earth.jpg

I need help using double Promises in JavaScript

Here is the code that I tried.
// To get the base64 code of a file
const toBase64 = file => new Promise((resolve, reject) => {
  const reader = new FileReader();
  reader.readAsDataURL(file);
  reader.onload = () => resolve(reader.result.replace(/^data:.+;base64,/, ''));
  reader.onerror = error => reject(error);
})
// To make an array of files
const getAttachments = async files => {
  let documents;
  try {
    documents = files.map(async file => {
      let base64 = await toBase64(file)
      return {
        doc: base64,
        documentName: file.name,
        documentType: file.type
      }
    })
  } catch {
    console.error('Failed to get files as base64')
  }
  return Promise.resolve(documents)
}
I then tried to get an object array as a result by using the above two functions, like the following:
getAttachments(Array.from(event.target.files)).then(documents => {
  console.info(documents)
})
But the result logged to the console is an array of pending Promises, not the objects I expected.
I'd love to know how I can get what I want.
Thanks.
Instead of returning an array of promises, return their resolved values using Promise.all with the await keyword. Try this:
const getAttachments = async files => {
  let documents;
  try {
    documents = files.map(async file => {
      let base64 = await toBase64(file)
      return {
        doc: base64,
        documentName: file.name,
        documentType: file.type
      }
    })
    return await Promise.all(documents);
  } catch {
    console.error('Failed to get files as base64')
  }
}
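The fix hinges on the fact that files.map(async ...) produces an array of promises, which Promise.all then resolves into plain objects. A self-contained sketch of the same pattern, with a stand-in toBase64 (Node's Buffer instead of a browser FileReader, and a plain object instead of a File) so it runs anywhere:

```javascript
// Stand-in for the FileReader-based toBase64: encodes a string payload.
const toBase64 = async (file) => Buffer.from(file.content).toString('base64');

const getAttachments = async (files) => {
  // .map(async ...) returns Promise<object>[] ...
  const documents = files.map(async (file) => ({
    doc: await toBase64(file),
    documentName: file.name,
  }));
  // ... and Promise.all turns that into a single Promise<object[]>.
  return Promise.all(documents);
};

getAttachments([{ name: 'a.txt', content: 'hi' }])
  .then((docs) => console.log(docs)); // [{ doc: 'aGk=', documentName: 'a.txt' }]
```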

Limit for strings in node.js/express

I am sending a base64-encoded image as a string to my backend using node.js/express. I would like to store it in my Postgres database, but the string never arrives at the backend. Is there any limit on this?
Before reaching my AJAX call in frontend I fill the data with:
var data = { picture: "" };
const reader = new FileReader();
const get_picture = new Promise((resolve, reject) => {
  // event handlers
  reader.onload = resolve;
  reader.onerror = reject;
  // read image
  reader.readAsDataURL(file);
})
  .then(() => {
    data.picture = reader.result;
  })
  .catch(() => {
    show_modal(modal.title.error_custom, modal.body.error_image);
  });
console.log(data);
$.ajax({ [...]
And my console shows, as expected:
Object { picture: "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAFIA…"}
This is my express route:
router.post("/upload_image", (req, res, next) => {
  const picture = req.body.picture;
  console.log("test");
  console.log(picture);
});
This shows an empty line in my backend console. Are there any limits to the size of the parameter? Or what am I doing wrong? Thanks
EDIT: this works with async/await:
$("#form").submit(async function (event) {
[...]
var data = {picture: ""};
const get_picture = await new Promise((resolve, reject) => {
//event handler
reader.onload = resolve;
reader.onerror = reject;
//read image
reader.readAsDataURL(file);
})
.then(() => {
data.picture = reader.result;
})
.catch(() => {
show_modal(modal.title.error_custom, modal.body.error_image);
});
$.ajax({ [...]
Your ajax call doesn't wait for the file to be read, so it sends the object as it was initialized at the top (var data = {picture: ""}).
You should make your ajax request inside the .then():
var data = { picture: "" };
const reader = new FileReader();
const get_picture = new Promise((resolve, reject) => {
  // event handlers
  reader.onload = resolve;
  reader.onerror = reject;
  // read image
  reader.readAsDataURL(file);
})
  .then(() => {
    data.picture = reader.result;
    $.ajax({[...]});
  })
  .catch(() => {
    show_modal(modal.title.error_custom, modal.body.error_image);
  });
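Separate from the timing bug, the size concern in the question's title is real: express's JSON body parser caps request bodies at 100kb by default, so a large base64-encoded image can be rejected before it ever reaches the route handler. A sketch of raising the limit, assuming an Express 4 app using express.json() (or body-parser):

```javascript
const express = require('express');
const app = express();

// express.json() rejects bodies over 100kb by default;
// raise the limit so a base64-encoded image fits.
app.use(express.json({ limit: '10mb' }));
```

The '10mb' value here is an example; pick a limit that matches the largest image you expect.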

Is there an easier way to make these axios post requests?

I'm trying to upload multiple images to cloudinary via api in my react app using axios.
I'm new to promises, so I'm not sure if I'm using the right approach here.
I am able to upload all the images to cloudinary; however, the response is the confusing part: I need to collect the images' URLs from it into an object so I can send them in an email using mailgun.
Here is my code to upload the images:
if(payment === "success") {
axios
.all([
props.imgForProduct.map((img) => {
const fd = new FormData();
fd.append("file", img);
fd.append("upload_preset", "sublimation");
return axios
.post(
"https://api.cloudinary.com/v1_1/ahkji7ation/image/upload",
fd
)
.then(async (res) => {
return (imgUrl = res.data.secure_url);
})
.catch((err) => console.log(err));
}),
props.screenshot.map((img) => {
const fd = new FormData();
fd.append("file", img);
fd.append("upload_preset", "sublimation");
return axios
.post(
"https://api.cloudinary.com/v1_1/ahkji7ation/image/upload",
fd
)
.then(async (res) => {
return (imgScrSht = res.data.secure_url);
})
.catch((err) => console.log(err));
}),
])
.then(
axios.spread(async (...res) => {
imgUrl.push(res[0][0]);
imgScrSht.push(res[1][0]);
let dataObj = {
email: props.email,
img: imgUrl,
screenshot: imgScrSht,
};
console.log(dataObj);
new Promise((resolve, reject) => {
axios.post("/email_to_ayp_sublimation", dataObj);
resolve((res) => {
console.log(res);
});
reject((err) => {
console.log(err);
});
});
})
)
.catch((err) => console.error(err));
}
When I console log the dataObj this is what I'm sending in the last call (the new Promise):
screenshot: Array(1)
0: Promise {<resolved>: "https://res.cloudinary.com/ahkji7ation/image/u…
590871936/ahkji7ation/kqebmkjfj0prmyygls2y.jpg"}
length: 1
__proto__: Array(0)
I'm receiving [object Object] in the backend, instead of the url I need. Can anybody help me sort this out?
Thanks in advance
I think the issue is with your axios.all(...) call. You are passing in two values, but each value is a .map() call, which returns an array of promises rather than URLs. The axios.post() inside both maps will upload the images, but axios.all() receives the arrays of promises returned by .map(). You can try something like this:
async function uploadImages(imgForProduct, screenshots) {
  const URL = "https://api.cloudinary.com/v1_1/ahkji7ation/image/upload";
  const imgUrl = [];
  const imgScrSht = [];
  // Collect all the form datas for img and screenshots
  const imgForProductFormData = imgForProduct.map((img) => {
    const fd = new FormData();
    fd.append("file", img);
    fd.append("upload_preset", "sublimation");
    return fd;
  });
  const screenShotsFormData = screenshots.map((img) => {
    const fd = new FormData();
    fd.append("file", img);
    fd.append("upload_preset", "sublimation");
    return fd;
  });
  // Failed uploads resolve to null instead of rejecting the whole batch
  const imgForProductRequests = imgForProductFormData.map((fd) =>
    axios.post(URL, fd).catch((err) => null)
  );
  const screenshotsRequests = screenShotsFormData.map((fd) =>
    axios.post(URL, fd).catch((err) => null)
  );
  try {
    const imgForProductResponses = await axios.all(imgForProductRequests);
    imgForProductResponses.forEach((res) => (res ? imgUrl.push(res.data.secure_url) : null));
    const screenshotsResponses = await axios.all(screenshotsRequests);
    screenshotsResponses.forEach((res) => (res ? imgScrSht.push(res.data.secure_url) : null));
    const dataObj = {
      email: props.email,
      img: imgUrl,
      screenshot: imgScrSht,
    };
    console.log(dataObj);
    const emailRes = await axios.post("/email_to_ayp_sublimation", dataObj);
    console.log(emailRes);
  } catch (err) {
    console.log(err);
  }
}
Hope this works!
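The core of the answer is the map-then-await-all shape. A stripped-down, runnable sketch of that shape, with a stub upload() in place of axios.post (the stub and its URLs are placeholders, not cloudinary's API):

```javascript
// Stub standing in for axios.post: resolves with a response-shaped object.
const upload = async (img) => ({ data: { secure_url: 'https://cdn.example/' + img } });

async function uploadAll(products, screenshots) {
  // Each .map produces an array of promises; Promise.all flattens
  // each array into an array of resolved URLs.
  const [img, screenshot] = await Promise.all([
    Promise.all(products.map(async (f) => (await upload(f)).data.secure_url)),
    Promise.all(screenshots.map(async (f) => (await upload(f)).data.secure_url)),
  ]);
  return { img, screenshot };
}

uploadAll(['a.jpg'], ['b.jpg']).then((o) => console.log(o));
// { img: ['https://cdn.example/a.jpg'], screenshot: ['https://cdn.example/b.jpg'] }
```

With real uploads, the result object would contain plain URL strings (not Promises), which is exactly what the mailgun call needs.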

Send ArrayBuffer to S3 put to signedURL

I am progressively loading a file into a buffer. The buffer is valid, but the browser crashes when the ArrayBuffer finishes loading the file. What I need is to send the pieces of the buffer (buf = this.concatBuffers(buf, buffer);) to the axios PUT request as they arrive, so I can progressively upload the file to S3, rather than load it into a single variable returned by the promise (which exceeds memory).
How do I modify the link between readFileAsBuffer and the uploadFileToS3 method to do this?
This is my code so you can follow the process.
concatTypedArrays = (a, b) => {
  const c = new a.constructor(a.length + b.length);
  c.set(a, 0);
  c.set(b, a.length);
  return c;
};

concatBuffers = (a, b) =>
  this.concatTypedArrays(
    new Uint8Array(a.buffer || a),
    new Uint8Array(b.buffer || b),
  ).buffer;

readFileAsBuffer = file =>
  new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.file = file;
    let buf = new ArrayBuffer();
    const fileChunks = new FileChunker(file, 2097152);
    fileReader.readAsArrayBuffer(fileChunks.blob());
    fileReader.onload = e => {
      this.onProgress(fileChunks);
      const buffer = e.target.result;
      buf = this.concatBuffers(buf, buffer);
      if (fileChunks.hasNext()) {
        fileChunks.next();
        fileReader.readAsArrayBuffer(fileChunks.blob());
        return;
      }
      resolve(buf);
    };
    fileReader.onerror = err => {
      reject(err);
    };
  });
uploadFileToS3 = fileObject => {
  new Promise((resolve, reject) => {
    const decodedURL = decodeURIComponent(fileObject.signedURL);
    this.readFileAsBuffer(fileObject.fileRef).then(fileBuffer => {
      console.log(fileBuffer);
      axios
        .put(decodedURL, fileBuffer, {
          headers: {
            'Content-Type': fileObject.mime,
            'Content-MD5': fileObject.checksum,
            'Content-Encoding': 'UTF-8',
            'x-amz-acl': 'private',
          },
          onUploadProgress: progressEvent => {
            const { loaded, total } = progressEvent;
            const uploadPercentage = parseInt(
              Math.round((loaded * 100) / total),
              10,
            );
            this.setState({ uploadProgress: uploadPercentage });
            console.log(`${uploadPercentage}%`);
            if (uploadPercentage === 100) {
              console.log('complete');
            }
          },
        })
        .then(response => {
          resolve(response.data);
        })
        .catch(error => {
          reject(error);
        });
    });
  });
};
uploadAllFilesToS3 = () => {
  const { files } = this.state;
  new Promise((resolve, reject) => {
    Object.keys(files).map(idx => {
      this.uploadFileToS3(files[idx])
        .then(response => {
          this.setState({ files: [] });
          resolve(response.data);
        })
        .catch(error => {
          reject(error);
        });
    });
  });
};
calcFileMD5 = file =>
  new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.file = file;
    const spark = new SparkMD5.ArrayBuffer();
    const fileChunks = new FileChunker(file, 2097152);
    fileReader.readAsArrayBuffer(fileChunks.blob());
    fileReader.onload = e => {
      this.onProgress(fileChunks);
      const buffer = e.target.result;
      spark.append(buffer);
      if (fileChunks.hasNext()) {
        fileChunks.next();
        fileReader.readAsArrayBuffer(fileChunks.blob());
        return;
      }
      const hash = spark.end();
      const checksumAWS = Buffer.from(hash, 'hex').toString('base64');
      resolve(checksumAWS);
    };
    fileReader.onerror = err => {
      reject(err);
    };
  });
I ended up not needing to create my own Buffer of the file; instead, if I post the fileReference returned by the input directly to axios (or xhr), the request automatically chunks the upload.
Initially I could only make it work with XMLHttpRequest, but I quickly found a way to wrap it in axios, which neatens the logic.
XMLHttpRequest
const xhr = createCORSRequest('PUT', url);
if (!xhr) {
  console.log('CORS not supported');
} else {
  xhr.onload = function () {
    if (xhr.status == 200) {
      console.log('completed');
    } else {
      console.log('Upload error: ' + xhr.status);
    }
  };
  xhr.onerror = function (err) {
    console.log(err);
  };
  xhr.upload.onprogress = function (progressEvent) {
    console.log(progressEvent);
  };
  xhr.setRequestHeader('Content-Type', file.type);
  xhr.setRequestHeader('Content-MD5', md5_base64_binary);
  xhr.setRequestHeader('Content-Encoding', 'UTF-8');
  xhr.setRequestHeader('x-amz-acl', 'private');
  xhr.send(file);
}
Or using axios;
uploadFileToS3 = fileObject => {
  return new Promise((resolve, reject) => {
    const { enqueueSnackbar } = this.props;
    const decodedURL = decodeURIComponent(fileObject.signedURL);
    axios
      .put(decodedURL, fileObject.fileRef, {
        headers: {
          'Content-Type': fileObject.mime,
          'Content-MD5': fileObject.checksum,
          'Content-Encoding': 'UTF-8',
          'x-amz-acl': 'private',
        },
        onUploadProgress: progressEvent => {
          const { loaded, total } = progressEvent;
          const uploadPercentage = parseInt(
            Math.round((loaded * 100) / total),
            10,
          );
          this.setState({ uploadProgress: uploadPercentage });
        },
      })
      .then(response => {
        resolve(response.data);
      })
      .catch(error => {
        reject(error);
      });
  });
};
Have you tried uploading your file using formData? Let the browser deal with file reading.
const data = new FormData()
data.append('file', file)
axios.put(decodedURL, data, ....)
Another option is to use axios's transformRequest property (https://github.com/axios/axios#request-config) and do the file reading there.
