YouTubeAPI: How to upload thumbnail (JS)

I tried uploading a thumbnail to YouTube using this guide: https://developers.google.com/youtube/v3/docs/thumbnails/set
I was able to run it successfully in Postman using this curl command:
curl --location --request POST 'https://www.googleapis.com/upload/youtube/v3/thumbnails/set?videoId=<video id>' \
--header 'Authorization: OAuth <token>' \
--header 'Content-Type: image/png' \
--form 'thumbnail=@"/C:/Users/user/Pictures/nami.PNG"'
However, I'm having trouble translating that into JS. What I have so far is:
// the "file" is the File from <input type="file"> - data on this looks ok
uploadThumbnail async (file) {
const formData = new FromData();
const formData.append('thumbnail', file, 'test.png');
await fetch.post('https://www.googleapis.com/youtube/v3/thumbnails/set', {
headers: {
Authorization: 'Oauth <token>',
'Content-Type': 'multipart/form-data' // I also tried using the file.type here (image/png)
},
query: {
videoId: <video id>
},
body: formData,
})
}
(To keep the logic simple, I typed the code above by hand, so pardon any typos.)
but this throws "The request does not include the image content." I don't understand; I also tried converting the File into a Blob, but I get the same error.

As pointed out in the comments on my main post, I combined the answers and came up with this (this works!):
await fetch(`https://www.googleapis.com/upload/youtube/v3/thumbnails/set?videoId=${videoId}&uploadType=media`, {
  method: 'POST',
  headers: {
    Authorization: 'Bearer <token>',
    'Content-Type': file.type
  },
  body: file,
})
The mistakes were:
My endpoint was wrong and was missing the /upload path segment (this API uses a different base URL than other YouTube endpoints, so if you are reusing a base_url variable, double-check it).
Using Oauth instead of Bearer in the Authorization header.
There is no query option in fetch; query parameters have to go in the URL (see the sketch below).
No need to build a FormData object; pass the file directly as the body instead.
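To make the last two points concrete, here is a minimal sketch of the whole call; the videoId and token values are placeholders, and setThumbnail is just a hypothetical wrapper name:
// Hypothetical wrapper around the working request above.
async function setThumbnail(videoId, token, file) {
  // fetch has no query option, so the query string is built into the URL
  const params = new URLSearchParams({ videoId, uploadType: 'media' });
  const res = await fetch(`https://www.googleapis.com/upload/youtube/v3/thumbnails/set?${params}`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`, // Bearer, not OAuth
      'Content-Type': file.type, // e.g. image/png
    },
    body: file, // the File/Blob itself, no FormData
  });
  return res.json();
}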

How to post body data using Fetch API?

Below is the curl command that successfully returns a response when imported and run in Postman.
curl --request POST \
--data "grant_type=password" \
--data "username=test" \
--data "password=xyz1234" \
--data "scope=write" \
--data "client_id=test" \
--data "client_secret=test12" \
"https://example.com/access_token"
Below is how I am sending the data using the Fetch API in my JS code.
const response = await fetch('https://example.com/access_token',
{
  credentials: 'include',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ grant_type: 'password', username: 'test', password: 'xyz1234', scope: 'write', client_id: 'test', client_secret: 'test12' }),
})
However, the equivalent curl generated by copying from the Chrome developer tools is below.
curl --request POST \
--data-raw '{"grant_type":"password","username":"test","password":"xyz1234","scope":"write","client_id":"test","client_secret":"test12"}' \
"https://example.com/access_token"
I suspect that the body data is not constructed in the correct format, which may be leading to the 400 error response. How should I send the data using the Fetch API so that it is equivalent to the working curl command?
Looking at the curl, your data does seem to be URL-encoded. So, as the endpoint is not expecting JSON, don't serialize the body to a JSON string.
const headers = new Headers({
  "Content-Type": "application/x-www-form-urlencoded"
});

const urlencoded = new URLSearchParams({
  "grant_type": "password",
  "username": "test",
  "password": "xyz1234",
  "scope": "write",
  "client_id": "test",
  "client_secret": "test12",
});

const opts = {
  method: 'POST',
  headers: headers,
  body: urlencoded,
};

fetch("https://example.com/access_token", opts);
EDIT
As @Kaiido mentioned in the comments, it is not necessary to set the Content-Type header explicitly, as the browser will do that automatically, but I have done it here to show that it should not be set to application/json but to application/x-www-form-urlencoded.
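For completeness, a rough sketch of consuming the response inside an async function; the access_token field name is an assumption based on typical OAuth password-grant responses, so check what your endpoint actually returns:
const response = await fetch("https://example.com/access_token", opts);
if (!response.ok) {
  // a 400 here usually means the body or credentials are still not what the server expects
  throw new Error(`Token request failed with status ${response.status}`);
}
const { access_token } = await response.json(); // field name assumed, not guaranteed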

POSTMAN PUT request not updating values

Assuming the position of an API tester for https://imgur.com/, I'm testing the PUT request to change account settings.
I am following this API doc: https://apidocs.imgur.com/#7bc88d39-d06d-4661-afff-38ea5b9a1d0a
Steps to check this:
1. Add the relevant info; I am setting show_mature and newsletter_subscribed to true.
2. Set the access token.
3. Click on Send. The response for this is 200.
4. Check whether the details have been updated.
Expected: show_mature and newsletter_subscribed are set to true.
Actual: show_mature and newsletter_subscribed are false.
I would really appreciate it if someone could let me know why this is happening. Thanks.
From the Imgur API docs...
Need help?
The Imgur engineers are always around answering questions. The quickest way to get help is by posting your question on StackOverflow with the Imgur tag.
Real helpful Imgur 🙄.
Answering here to provide a canonical answer in the imgur tag for this nonsense.
All the API examples in the documentation use some form of multipart/form-data request body payloads, e.g.:
var myHeaders = new Headers();
myHeaders.append("Authorization", "Bearer {{accessToken}}");

var formdata = new FormData();

var requestOptions = {
  method: 'PUT',
  headers: myHeaders,
  body: formdata,
  redirect: 'follow'
};

fetch("https://api.imgur.com/3/account/{{username}}/settings", requestOptions)
  .then(response => response.text())
  .then(result => console.log(result))
  .catch(error => console.log('error', error));
and
curl --location --request POST 'https://api.imgur.com/oauth2/token' \
--form 'refresh_token="{{refreshToken}}"' \
--form 'client_id="{{clientId}}"' \
--form 'client_secret="{{clientSecret}}"' \
--form 'grant_type="refresh_token"'
With the exception of any upload-related endpoints, this is ABSOLUTELY INCORRECT. Passing data as multipart/form-data requires the API to handle that request content type and, guess what, the Imgur API does not.
What they do accept is application/x-www-form-urlencoded.
In Postman that's the x-www-form-urlencoded option, not form-data
In cURL that's the -d option, not --form
In JavaScript that's URLSearchParams, not FormData (see the sketch below)
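Putting that together for the settings request in the question, a minimal JavaScript sketch; the {{username}} and {{accessToken}} placeholders follow the documentation's examples, and the field values mirror the question:
// Hedged sketch: send the settings as application/x-www-form-urlencoded, not multipart/form-data.
const body = new URLSearchParams({
  show_mature: 'true',
  newsletter_subscribed: 'true',
});

fetch("https://api.imgur.com/3/account/{{username}}/settings", {
  method: 'PUT',
  headers: { Authorization: 'Bearer {{accessToken}}' },
  body, // the browser sets the urlencoded Content-Type automatically
})
  .then(response => response.json())
  .then(result => console.log(result));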

cURL with -F Option into NodeJS with Axios

I'm attempting to convert this CURL command
curl -X POST "https://serverless-upload.twilio.com/v1/Services/ZS5798711f7bee1284df67427071418d0b/Assets/ZH4912f44da25f4b1a1c042a16a17f2eac/Versions" \
-F "Content=@./mapping/mapping.json;type=application/json" \
-F Path=mapping.json \
-F Visibility=private \
-u username:password
into a POST request using the axios package.
I've tried:
url = `https://serverless-upload.twilio.com/v1/Services/${service_uid}/Assets/${asset_uid}/Versions`
data = {
  'Path': 'mapping.json',
  'Visibility': 'private',
  'Content': JSON.stringify(mapping),
  'filename': 'mapping.json',
  'contentType': 'application/json'
}

await axios.post(url, data, {
  auth: {
    user: `${accountSid}:${authToken}`
  },
  headers: {
    'Content-Type': 'multipart/form-data',
  }
}).then((r) => console.log(r));
but I'm unsure whether this is malformed or not.
Twilio developer evangelist here.
The Twilio Node library actually uses axios under the hood, you can see it in action in the RequestClient. We also have a stand-alone Serverless API client which is part of the Twilio Serverless Toolkit you can use, but it is written with got instead.
You can use the Serverless API module to save yourself the work of recreating this request.
If you decide to continue with axios, here are the changes you should make.
Auth
Authorization is done via the Authorization header, passing a base64-encoded string made up of the account SID and auth token.
headers: {
  Authorization: 'Basic ' + Buffer.from(`${accountSid}:${authToken}`).toString('base64')
}
Data
When uploading an asset, it is done as multipart form data. To build up multipart data in Node.js you can use the form-data module. Something like this should work:
const FormData = require("form-data");
const form = new FormData();
form.append("Path", "mapping.json");
form.append("Visibility", "private");
form.append("Content", JSON.stringify(mapping));
form.append("filename", "mapping.json");
form.append("contentType", "application/json");

await axios.post(url, form, {
  headers: {
    Authorization: 'Basic ' + Buffer.from(`${accountSid}:${authToken}`).toString('base64'),
    ...form.getHeaders(),
  },
}).then((r) => console.log(r));
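For reference, a hedged sketch of how mapping might be loaded before building the form, assuming the file lives at ./mapping/mapping.json as in the original curl command:
const fs = require("fs");

// Assumption: mapping.json is the same file the curl command uploads with -F "Content=@./mapping/mapping.json"
const mapping = JSON.parse(fs.readFileSync("./mapping/mapping.json", "utf8"));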
Let me know how you get on with that.

Corrupt zip when uploading via Axios POST

I'm 'publishing' a zip file to a server app via a REST endpoint. If I POST via Postman or the app's frontend, I get a published zip file which is valid: I can turn around and download it, open it, etc.
If I attempt to do the same thing with my code and Axios, the server app's attempt to unzip and use the content I've uploaded fails. If I download the archive, it is corrupt. The fact that the same archive works via Postman and the app's UI tells me this is either a PEBKAC or potentially an issue with Axios itself. Here is the code I'm using to POST to the endpoint. Note that at the end I'm actually writing the data I POST to a local file on my machine as a zip, so I can verify I'm not doing anything dumb when I read the file via fs. The local copy of the file I created is fine and dandy.
Note how I'm actually hard-coding Content-Length as well. I'm testing with a single file, and I've verified the length is correct via fs.statSync AND that it matches the Content-Length I see when I upload via Postman and the app's UI.
var uploadFile = (data, fileInfo) => new Promise(resolveUpload => {
  console.log("Starting Upload API call for:", fileInfo.description);
  axios.post(aepServer + '/api/v1/files',
    data, {
      jar: cookieJar,
      withCredentials: true,
      headers: {
        'Content-Type': 'application/octet-stream',
        'path': fileInfo.path,
        'description': fileInfo.description,
        'Content-Length': 354198,
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept': '*/*',
        'Connection': 'keep-alive'
      },
      // DANGER: allow self-signed certs workaround which I must remove
      httpsAgent: new https.Agent({
        rejectUnauthorized: false,
      })
    }).then((response) => {
      const fileResponse = response.data;
      console.log('\n', chalk.bgMagenta('FILE UPLOADED: '), response);
      fs.writeFileSync('c:\\data\\newfile.zip', data, 'binary');
      resolveUpload(fileResponse);
    })
    .catch((err) => {
      console.log("AXIOS ERROR: ", err);
    })
});
Does anything look wrong here? While looking at the response object, I do see something that has me scratching my head:
headers: {
  Accept: '*/*',
  'Content-Type': 'application/octet-stream',
  path: '/Workspaces/Public%20Apps/UFOs.yxzp',
  description: 'UFO Sitings in the US, 1995 to present',
  'Content-Length': 532362,
  'Accept-Encoding': 'gzip, deflate, br',
  Connection: 'keep-alive',
  Cookie: 'ayxSession=s%3Ac39f55a3-b219-43a5-9f8a-785e1222c81c.QR4KI8uXaQlL9axqkO8AkyabPVt3i37nGbz%2FJef0eqU',
  'User-Agent': 'axios/0.19.2'
},
Look at the Content-Length: 532362 bytes. It seems like the ~354k value I hard-coded in the headers is being ignored somehow. Might this be my problem? [BTW, if I use the same code to upload a CSV or TXT file, all is well; this seems related to compressed files only.]
EDIT: Welp, it looks like Axios does override that property and there's nothing I can do about it: Axios set Content-Length manually, nodeJS. Now the question is whether setting this incorrectly would munge the file, and WHY the value is wrong. When I do data.length, I get the 354198 value.
Can you try with 'multipart/form-data'?
axios.post('upload_file', formData, {
  headers: {
    'Content-Type': 'multipart/form-data'
  }
})
Alternatively, get the cURL request using Postman and use the header information supplied by Postman.
My answer was here:
Reading binary data in node.js
In essence, I was reading the zip file as 'binary' when I shouldn't have passed anything in at all.
var content = fs.readFileSync(aFile.readFrom);
//NOT
//var content = fs.readFileSync(aFile.readFrom, 'binary');
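Putting the pieces together, a rough sketch of the corrected flow, reusing the names from the question (aFile, fileInfo, aepServer): read the file with no encoding argument so fs returns a Buffer, and drop the hard-coded Content-Length so axios can compute it from the Buffer:
const fs = require('fs');
const axios = require('axios');

// No encoding argument: readFileSync returns a Buffer, leaving the zip bytes untouched.
const data = fs.readFileSync(aFile.readFrom);

axios.post(aepServer + '/api/v1/files', data, {
  headers: {
    'Content-Type': 'application/octet-stream',
    'path': fileInfo.path,
    'description': fileInfo.description,
    // no hard-coded Content-Length; axios derives it from the Buffer
  },
}).then(response => console.log('FILE UPLOADED:', response.data));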

data-Binary in request nodejs

I'm trying to upload a file to Dropbox through Node.js.
This cURL request works:
curl -X POST https://content.dropboxapi.com/2/files/upload \
--header "Authorization: Bearer myToken" \
--header "Dropbox-API-Arg: {\"path\": \"/Homework/math/Matrices.txt\",\"mode\": \"add\",\"autorename\": true,\"mute\": false}" \
--header "Content-Type: application/octet-stream" \
--data-binary @fileName
I don't know how to translate it into JavaScript code.
Here's what I've accomplished so far:
var request = require('request')

var headers = {
  "Authorization": "Bearer " + dropboxToken,
  // Dropbox-API-Arg is expected to be a JSON string
  "Dropbox-API-Arg": JSON.stringify({
    "path": "/" + fileName, // name on the drive
    "mode": "add",
    "autorename": true,
    "mute": false
  }),
  "Content-Type": "application/octet-stream"
}

var options = {
  url: 'https://content.dropboxapi.com/2/files/upload',
  method: 'POST',
  headers: headers,
}

request(options, function (error, response, body) {
  if (!error && response.statusCode == 200) {
    console.log(body)
  }
});
How do I include the data-binary option in this request in order to select the file to upload?
Thanks.
You can create a read stream and then pipe it to request with your current headers and options, like:
fs.createReadStream('/path/to/yourfile').pipe(request.post(options).on('end', () => { console.log('success') }));
First, if you're trying to integrate with the Dropbox API in JavaScript, we recommend using the official Dropbox API v2 JavaScript SDK, as it will do most of the work for you:
https://github.com/dropbox/dropbox-sdk-js
Otherwise, if you don't want to use the SDK, you can make the requests yourself. In this case, the --data-binary parameter is the curl parameter for supplying the data for the request to curl. curl then takes that data and puts it in the request body, which is the correct way to supply file data for a Dropbox API v2 "content-upload" style endpoint, such as /2/files/upload.
So, you should check the documentation for the HTTP client you're using for information on how to set the request body. It looks like you're using the request node module, which appears to take a body option, where you can put the request data.
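As a rough sketch only (not official Dropbox sample code), that could look like the following with the request module; note that Dropbox-API-Arg must be a JSON string, and fileName/dropboxToken are the variables from the question:
var fs = require('fs')
var request = require('request')

var options = {
  url: 'https://content.dropboxapi.com/2/files/upload',
  method: 'POST',
  headers: {
    'Authorization': 'Bearer ' + dropboxToken,
    'Dropbox-API-Arg': JSON.stringify({
      path: '/' + fileName,
      mode: 'add',
      autorename: true,
      mute: false
    }),
    'Content-Type': 'application/octet-stream'
  },
  // body plays the role of curl's --data-binary: the raw bytes of the file
  body: fs.readFileSync(fileName)
}

request(options, function (error, response, body) {
  if (!error && response.statusCode == 200) {
    console.log(body)
  }
})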
