How to upload an image to Cloudinary with fetch - javascript

I'm trying to upload a file to Cloudinary using fetch from my front-end. I've tried piecing together the way to do it from the documentation and StackOverflow answers, but I get a 400 error:
export async function uploadImageToCloudinary(file: File) {
  const url = `https://api.cloudinary.com/v1_1/${cloudName}/upload`;
  const fetched = await fetch(url, {
    method: "post",
    body: JSON.stringify({
      file,
      cloud_name: cloudName,
      upload_preset: "unsigned",
    }),
  });
  const parsed = await fetched.json();
  console.log({
    parsed, // 400 error, message: "Upload preset must be specified when using unsigned upload"
  });
}
It says the upload preset must be specified, so I must have something wrong in the code above. My Cloudinary settings do include an upload preset named 'unsigned':

It works if I replace body: JSON.stringify(...) with a FormData body. I don't know why, but this works:
export async function uploadImageToCloudinary(file: File) {
  const url = `https://api.cloudinary.com/v1_1/${cloudName}/upload`;
  const data = new FormData();
  data.append('file', file);
  data.append('upload_preset', 'unsigned');
  const fetched = await fetch(url, {
    method: "post",
    body: data,
  });
  const parsed = await fetched.json();
  console.log({
    parsed, // 200, success!
  });
}
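A likely explanation, for anyone else hitting this (not stated in the original post): fetch sends a plain string body with Content-Type: text/plain, so Cloudinary never parses any parameters out of it, and JSON.stringify turns the File itself into "{}" anyway. FormData works because the browser builds a real multipart/form-data request containing the file bytes and the upload_preset field. A tiny snippet makes the serialization problem visible:

  // Demonstration only: a File has no enumerable own properties and no toJSON,
  // so it serializes to "{}" and the actual bytes are never sent.
  const demoFile = new File(["hello"], "hello.txt", { type: "text/plain" });
  console.log(JSON.stringify({ file: demoFile, upload_preset: "unsigned" }));
  // -> {"file":{},"upload_preset":"unsigned"}

If you want to avoid multipart entirely, my understanding of the Cloudinary upload docs is that the file parameter can also be a base64 data URI, but treat that as an assumption to verify rather than something tested here.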

Related

Trouble using JavaScript and the google drive API to convert a google slide into a pdf, and upload the pdf onto a folder

I'm new to JavaScript, and am trying to write some code that uses the google drive API (via the gapi client) to transform an existing slide into a pdf document, upload it to a specific folder, and return the pdf file id. This is all to be done in the browser, if possible.
I've already done this in Python for another use case, and the code looks something like this:
import googleapiclient.http as client_methods
from io import BytesIO
...
data = drive.files().export(fileId=slideId, mimeType='application/pdf').execute()
body = {'name': fileName, 'mimeType': 'application/pdf', 'parents': [folderId]}
# wrapping the binary (data) file with BytesIO class
fh = BytesIO(data)
# creating the Media Io upload class for the file
media_body = client_methods.MediaIoBaseUpload(fh, mimetype='application/pdf')
pdfFileId = drive.files().create(body=body, media_body=media_body, supportsAllDrives=True, fields='id').execute().get('id')
I've tried to replicate the same steps using JavaScript and my limited knowledge, and I can successfully upload a PDF file into the desired folder, but the file shows as empty (it doesn't even open in Drive).
I believe it might be due to the way I'm handling the binary data that I get from exporting the initial slide.
The last iteration of my JavaScript code is shown below (I have all the necessary permissions to use the gapi client):
async function createPdfFile() {
  gapi.client.load("drive", "v3", function () {
    // Set the MIME type for the exported file
    const mimeType = "application/pdf";
    // Set the file name for the exported PDF file
    const fileName = "Trial upload.pdf";
    // Export the Google Slides presentation as a PDF file
    gapi.client.drive.files.export({
      fileId,
      mimeType
    }).then(async function (response) {
      // Get the binary data of the PDF file
      const pdfData = await response.body;
      const blob = await new Blob([pdfData], { type: 'application/pdf' })
      const file = new File([blob], "presentation.pdf");
      // Create a new file in the specified Google Drive folder with the PDF data
      await gapi.client.drive.files.create({
        name: fileName,
        parents: [folderId],
        mimeType: mimeType,
        media: { mimeType: 'application/pdf', body: file },
        supportsAllDrives: true
      }).then(function (response) {
        // Get the ID of the created PDF file
        const pdfFileId = response.result.id;
        console.log("PDF file created with ID: " + pdfFileId);
      })
    })
  })
}
await createPdfFile()
As for the output, and as stated, it does create a PDF file and logs the PDF file ID, but the file itself is empty. I'd really appreciate it if someone could help me make sense of this (similar thread here, but I can't replicate its success).
I believe your goal is as follows.
You want to convert Google Slides to PDF format using googleapis for Javascript.
Your access token can be used for exporting the file and uploading it to Google Drive.
Issue and workaround:
When I tested your script, unfortunately response.body from gapi.client.drive.files.export is binary data, and in this case it cannot be correctly converted to a blob. Also, at the current stage, it seems that a file body cannot be uploaded using gapi.client.drive.files.create. I think these are the reasons for your current issue.
Given that, I would like to propose a flow that achieves your goal using the fetch API. The modified script is as follows.
In this case, the access token is retrieved from the client with gapi.auth.getToken().access_token.
Modified script:
Please modify your script as follows.
From:
gapi.client.drive.files.export({
  fileId,
  mimeType
}).then(async function (response) {
  // Get the binary data of the PDF file
  const pdfData = await response.body;
  const blob = await new Blob([pdfData], { type: 'application/pdf' })
  const file = new File([blob], "presentation.pdf");
  // Create a new file in the specified Google Drive folder with the PDF data
  await gapi.client.drive.files.create({
    name: fileName,
    parents: [folderId],
    mimeType: mimeType,
    media: { mimeType: 'application/pdf', body: file },
    supportsAllDrives: true
  }).then(function (response) {
    // Get the ID of the created PDF file
    const pdfFileId = response.result.id;
    console.log("PDF file created with ID: " + pdfFileId);
  })
})
To:
gapi.client.drive.files.get({ fileId, fields: "exportLinks", supportsAllDrives: true }).then(function (response) {
  const obj = JSON.parse(response.body);
  if (Object.keys(obj).length == 0) throw new Error("This file cannot be converted to PDF format.");
  const url = obj.exportLinks["application/pdf"];
  if (!url) throw new Error("No exported URL.");
  const accessToken = gapi.auth.getToken().access_token;
  fetch(url, {
    method: 'GET',
    headers: { 'Authorization': 'Bearer ' + accessToken },
  })
    .then(res => res.blob())
    .then(blob => {
      const metadata = { name: fileName, parents: [folderId], mimeType };
      const form = new FormData();
      form.append('metadata', new Blob([JSON.stringify(metadata)], { type: 'application/json' }));
      form.append('file', blob);
      fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&supportsAllDrives=true', {
        method: 'POST',
        headers: { 'Authorization': 'Bearer ' + accessToken },
        body: form
      })
        .then(res => res.json())
        .then(obj => console.log("PDF file created with ID: " + obj.id));
    });
});
When this script is run, the PDF export URL is retrieved from the file ID, and the PDF data is then downloaded and uploaded to Google Drive.
Note:
In your script, fileId is not declared. Please be careful about this.
If the file size is more than 5 MB, please use the resumable upload.
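A rough sketch of that resumable flow, not part of the original answer: it reuses the accessToken, metadata, and blob values from the script above, and the details should be checked against the Drive API documentation.

  // Sketch only (assumes accessToken, metadata and blob from the script above):
  // 1) start an upload session, 2) send the bytes to the returned session URL.
  const sessionRes = await fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable&supportsAllDrives=true', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer ' + accessToken,
      'Content-Type': 'application/json; charset=UTF-8',
    },
    body: JSON.stringify(metadata),
  });
  const sessionUrl = sessionRes.headers.get('Location'); // upload session URI
  const uploadRes = await fetch(sessionUrl, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/pdf' },
    body: blob, // large files can also be sent in chunks with Content-Range headers
  });
  const created = await uploadRes.json();
  console.log("PDF file created with ID: " + created.id);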
Reference:
Upload file data
Added:
From your following reply,
?uploadType=multipart also returns a 404 type error
I'm worried that, in your situation, new FormData() might not be usable. If my understanding is correct, please test the following script, in which the multipart request body (multipart/related) is built manually.
Modified script:
gapi.client.drive.files.get({ fileId, fields: "exportLinks", supportsAllDrives: true }).then(function (response) {
  const obj = JSON.parse(response.body);
  if (Object.keys(obj).length == 0) throw new Error("This file cannot be converted to PDF format.");
  const url = obj.exportLinks["application/pdf"];
  if (!url) throw new Error("No exported URL.");
  const accessToken = gapi.auth.getToken().access_token;
  fetch(url, {
    method: 'GET',
    headers: { 'Authorization': 'Bearer ' + accessToken },
  })
    .then(res => res.blob())
    .then(blob => {
      const metadata = { name: fileName, parents: [folderId], mimeType };
      const fr = new FileReader();
      fr.onload = e => {
        const data = e.target.result.split(",");
        const req = "--xxxxxxxx\r\n" +
          "Content-Type: application/json\r\n\r\n" +
          JSON.stringify(metadata) + "\r\n" +
          "--xxxxxxxx\r\n" +
          "Content-Transfer-Encoding: base64\r\n\r\n" +
          data[1] + "\r\n" +
          "--xxxxxxxx--";
        fetch('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart&supportsAllDrives=true', {
          method: 'POST',
          headers: { 'Authorization': 'Bearer ' + accessToken, "Content-Type": "multipart/related; boundary=xxxxxxxx" },
          body: req
        })
          .then(res => res.json())
          .then(obj => {
            console.log("PDF file created with ID: " + obj.id)
          });
      }
      fr.readAsDataURL(blob);
    });
});
When I tested this script, no errors occurred. I confirmed that the Google Slides file could be converted to a PDF file and that the PDF file was uploaded to the specific folder.

Issue with sending FormData from backend

I have a component which processes and uploads images. Currently I process the image on my backend, send it to my frontend, and then upload it from there. I would like to do everything on my backend. The only issue is that the upload endpoint requires a FormData object. I found an npm package, form-data, which I'm now using on my backend, but I'm still getting an error.
This is how it currently works:
// frontend logic:
const data = await uploadImage(img);
const file = new File([Buffer.from(data)], `img-${i}.webp`, {
  type: "image/webp",
});
const formData = new FormData();
formData.append("path", "images");
formData.append("files", file, file.name);
await axios
  .post("http://localhost:1338/api/upload", formData, {
    headers: { authorization: `Bearer ${jwtToken}` },
  })
  .then(({ data }) => {
    console.log(data);
  })
  .catch(console.log);
//
//
// backend logic:
const data = await processImage(img.url);
return data;
This is what I'm trying to do:
// frontend logic:
const data = await uploadImage(img);
//
//
// backend logic:
const data = await processImage(img.url);
const formData = new FormData();
formData.append("path", "images");
formData.append("files", data, "file.name");
await axios
  .post("http://localhost:1338/api/upload", formData, {
    headers: { authorization: `Bearer ${process.env.JWT_TOKEN}` },
  })
  .then(({ data }) => {
    console.log(data);
  })
  .catch(console.log); // I get error: 413 Payload Too Large
I'm trying to do it with the same image which works with the first method. Perhaps I need to create a new File(), but I couldn't find any npm packages which worked for that. What should I do to get this working?
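For comparison, here is a hedged sketch (not from the question) of how the Node form-data package is usually fed a Buffer: append() takes a filename/contentType options object, and the multipart headers from getHeaders() have to be forwarded to axios. Whether this alone resolves the 413 (the server rejecting the body size) is a separate question.

  // Sketch only: processImage and the upload URL come from the question; it assumes
  // processImage resolves to a Buffer of webp data.
  const FormData = require("form-data"); // the npm package, not the browser FormData
  const axios = require("axios");

  async function uploadFromBackend(img) {
    const data = await processImage(img.url); // Buffer
    const formData = new FormData();
    formData.append("path", "images");
    // For Buffers, form-data needs the filename and contentType passed explicitly.
    formData.append("files", data, {
      filename: `img-${Date.now()}.webp`,
      contentType: "image/webp",
    });
    return axios.post("http://localhost:1338/api/upload", formData, {
      headers: {
        ...formData.getHeaders(), // carries the multipart boundary
        authorization: `Bearer ${process.env.JWT_TOKEN}`,
      },
      // axios also has its own body-size limits that can reject large uploads
      maxBodyLength: Infinity,
      maxContentLength: Infinity,
    });
  }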

Empty image when uploading to presigned AWS S3 url in React Native

I'm trying to upload an image to AWS S3 from my React Native app (Expo managed workflow), but the resulting file is empty. I don't get any errors during the process. I've also tried uploading it with the Uppy AWS plugin, but the result is the same. Here is my code:
async function getUploadUrl(filename: string, type: string) {
  const response = await fetch(GET_UPLOAD_URL(filename, type), {
    method: 'GET',
    headers: {
      'Content-Type': 'application/json',
      Accept: 'application/json',
    },
  });
  return await response.json();
}

export default async function uploadImage(
  file: Blob,
  filename: string,
  base64: string
) {
  const uploadData = await getUploadUrl(filename, file.type);
  const data = new FormData();
  for (const [key, value] of Object.entries(uploadData.fields)) {
    data.append(key, value as string);
  }
  data.append('file', Buffer.from(base64, 'base64'));
  let res = await fetch(uploadData.url, {
    method: 'post',
    body: data,
    headers: {
      'Content-Type': 'multipart/form-data;',
    },
  });
}
I am using the Expo image picker to get the file. I've also tried uploading just the Blob file instead of a Buffer, but it doesn't work either.
Here is how the file looks if I open it in the browser: https://prnt.sc/vOk5CI7lyPhu
In case anyone else faces this problem: I managed to upload the file by URI like this:
formData.append('file', {
  uri: uri,
  type: 'image/jpeg',
  name: filename,
});
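Putting the two together, a rough, unverified sketch of the upload function: it keeps getUploadUrl and the presigned-POST fields from the question, swaps the Buffer for the { uri, type, name } entry above, and leaves the Content-Type header to fetch so the multipart boundary is set automatically.

  // Sketch only: reuses getUploadUrl and uploadData.fields from the question.
  export default async function uploadImage(uri: string, filename: string) {
    const uploadData = await getUploadUrl(filename, 'image/jpeg');
    const data = new FormData();
    for (const [key, value] of Object.entries(uploadData.fields)) {
      data.append(key, value as string);
    }
    // React Native's FormData accepts a { uri, type, name } descriptor and reads
    // the file from disk itself, instead of a Blob/Buffer built in JS.
    data.append('file', { uri, type: 'image/jpeg', name: filename } as any);
    // No manual Content-Type header: fetch adds multipart/form-data with the boundary.
    return fetch(uploadData.url, { method: 'POST', body: data });
  }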

Expo FormData not working to send an image to a REST API

Hi everybody. I'm building a web frontend using Expo SDK 39.0.2, and I have a big problem I haven't found a fix for. I'm trying to send an image as FormData to my backend, but on the backend it arrives as null. This is not a problem with my REST API: I send a similar request from Postman and everything works, but it doesn't work with my FormData, so I think the problem is the way I'm sending the FormData.
This is my code:
//Component code
const imagepreview = async () => {
  let result = await ImagePicker.launchImageLibraryAsync({
    allowsEditing: false,
    aspect: [4, 3],
    quality: 0.6,
  });
  console.log(result)
  if (result.cancelled == false) {
    setImage(result.uri)
    let uriParts = result.uri.split('.');
    let fileType = uriParts[uriParts.length - 1];
    settype(fileType)
  }
}
const sendData = async () => {
  // Upload the image using the fetch and FormData APIs
  let formData = await new FormData();
  // Assume "photo" is the name of the form field the server expects
  await formData.append('photo', {
    name: `photo.${type}`,
    type: `image/${type}`,
    uri: image,
  });
  const send = await user.changeprofilepicture(formData)
My HTTP request code:
async changeprofilepicture(formdata) {
  console.log(formdata)
  const token = JSON.parse(localStorage.getItem('user-token'))
  let url = baseurl + 'changeprofilepicture'
  let options = {
    headers: {
      'content-type': 'multipart/form-data'
    }
  }
  await axios.post(url, formdata, options)
  await axios.post(url, {
    photo: formdata
  }, {
    headers: { 'Content-Type': undefined }
  })
}
I have two HTTP requests because I'm trying two ways to solve my problem, but neither works; it only works from Postman.
Really, thanks for the answers.
Finally, I solved it.
I changed the Expo image picker to the Expo document picker, like this:
const imagepreview = async () => {
  const trye = await DocumentPicker.getDocumentAsync({ type: 'image/*' })
  setFile(trye.file)
  setImage(trye.uri)
  console.log(trye)
}
const sendData = async () => {
  const formData = new FormData();
  formData.append('photo', file);
  console.log(file)
  return await fetch('http://127.0.0.1:3333/api/changeprofilepicture', {
    method: 'POST',
    body: formData,
  });
}

Download and upload image without saving to disk

Using Node.js, I am trying to get an image from a URL and upload that image to another service without saving the image to disk. I have the following code, which works when saving the file to disk and using fs to create a readable stream. But since I am running this as a cron job on a read-only file system (webtask.io), I'd like to achieve the same result without saving the file to disk temporarily. Shouldn't that be possible?
request(image.Url)
  .pipe(
    fs
      .createWriteStream(image.Id)
      .on('finish', () => {
        client.assets
          .upload('image', fs.createReadStream(image.Id))
          .then(imageAsset => {
            resolve(imageAsset)
          })
      })
  )
Do you have any suggestions for how to achieve this without saving the file to disk? The upload client accepts the following:
client.asset.upload(type: 'file' | 'image', body: File | Blob | Buffer | NodeStream, options = {}): Promise<AssetDocument>
Thanks!
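One more option implied by that signature, as a sketch rather than a tested answer: since body can be a NodeStream and request() returns a readable stream, the download could be handed straight to the upload client without buffering it in memory.

  // Sketch only: passes the HTTP download stream directly to the client,
  // assuming it really does accept a NodeStream as the signature above says.
  const request = require('request');

  client.assets
    .upload('image', request(image.Url), { filename: image.Id })
    .then(imageAsset => {
      resolve(imageAsset)
    })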
How about passing a buffer down to the upload function, since per your statement it accepts one?
As a side note... This will keep it in memory for the duration of the method execution, so if you call this numerous times you might run out of resources.
request.get(url).on('response', function (res) {
  var data = [];
  res.on('data', function (chunk) {
    data.push(chunk);
  }).on('end', function () {
    var buffer = Buffer.concat(data);
    // Pass the buffer (type and body are positional arguments, per the signature above)
    client.asset.upload('image', buffer);
  });
});
I tried various libraries, and it turns out that node-fetch provides a way to return a buffer. So this code works:
fetch(image.Url)
  .then(res => res.buffer())
  .then(buffer => client.assets
    .upload('image', buffer, { filename: image.Id }))
  .then(imageAsset => {
    resolve(imageAsset)
  })
Well, I know it has been a few years since the question was originally asked, but I have encountered this problem now, and since I didn't find an answer with a comprehensive example, I made one myself.
I'm assuming that the file path is a valid URL and that the end of it is the file name. I need to pass an API key to this endpoint, and a successful upload sends me back a response with a token.
I'm using node-fetch and form-data as dependencies.
const fetch = require('node-fetch');
const FormData = require('form-data');

const secretKey = 'secretKey';

const downloadAndUploadFile = async (filePath) => {
  const fileName = new URL(filePath).pathname.split("/").pop();
  const endpoint = `the-upload-endpoint-url`;
  const formData = new FormData();
  let jsonResponse = null;
  try {
    const download = await fetch(filePath);
    const buffer = await download.buffer();
    if (!buffer) {
      console.log('file not found', filePath);
      return null;
    }
    formData.append('file', buffer, fileName);
    const response = await fetch(endpoint, {
      method: 'POST', body: formData, headers: {
        ...formData.getHeaders(),
        "Authorization": `Bearer ${secretKey}`,
      },
    });
    jsonResponse = await response.json();
  } catch (error) {
    console.log('error on file upload', error);
  }
  return jsonResponse ? jsonResponse.token : null;
}
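A call site would then look something like this (the URL is just a placeholder):

  // Hypothetical usage; any reachable file URL works here.
  downloadAndUploadFile('https://example.com/images/photo.png')
    .then(token => console.log('upload token:', token));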
