I am trying to upload a video using the TUS protocol (tus-js-client) and everything seems to be working well. I will post my code below to see if you can help me, please!
/** Get a one-time link to upload a video to Cloudflare Stream */
router.post('/oneTimeLink', async (req, res) => {
  var config = {
    method: 'POST',
    url: `https://api.cloudflare.com/client/v4/accounts/${process.env.CLOUDFLARE_CLIENT_ID}/stream?direct_user=true`,
    headers: {
      'Authorization': `Bearer ${process.env.CLOUDFLARE_KEY_ID}`,
      'Tus-Resumable': '1.0.0',
      'Upload-Length': '1',
      'Upload-Metadata': 'maxdurationseconds NjAw',
    },
  };
  axios(config)
    .then(function (response) {
      // headers is a plain object, so no await is needed here
      const location = response.headers.location
      console.log(location)
      res.set({
        'Access-Control-Allow-Headers': '*',
        'Access-Control-Expose-Headers': '*',
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Methods': '*',
        'Location': location,
      })
      res.send(true)
    })
    .catch(function (error) {
      console.log(error);
    });
})
In the Upload-Metadata I set maxdurationseconds NjAw, which means a maxDurationSeconds of 600 seconds, i.e. 10 minutes (I am sending a 25-second video).
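For reference, TUS Upload-Metadata values are base64-encoded strings, and NjAw is simply the string 600 (seconds) in base64, which you can verify in Node:
Buffer.from('600').toString('base64')    // 'NjAw'
Buffer.from('NjAw', 'base64').toString() // '600'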
That's my Node.js backend API.
Now on the frontend side, I am using tus-js-client as follows.
import:
import * as tus from "tus-js-client";
Function:
const file = state.coursesData['Module 1'][0].content.video
var upload = new tus.Upload(file, {
  endpoint: 'http://localhost:8080/dev/coursesLibraryPractitioner/oneTimeLink',
  chunkSize: 250 * 1024 * 1024,
  metadata: {
    filename: file.name,
    filetype: file.type
  },
  onError: function (error) {
    console.log("Failed because: " + error)
  },
  onProgress: function (bytesUploaded, bytesTotal) {
    var percentage = (bytesUploaded / bytesTotal * 100).toFixed(2)
    console.log(bytesUploaded, bytesTotal, percentage + "%")
  },
  onSuccess: function () {
    console.log("Download %s from %s", upload.file.name, upload.url)
  }
})
upload.start()
The endpoint points to my backend server, which returns the upload URL in the Location header.
Now, this is what happens when I trigger this function.
The backend route executes without error, and the Cloudflare dashboard shows a new empty video, which is what the docs say should happen.
In the frontend logs:
Regardless of the chunk size, the percentage sometimes reaches 100%, but the video is NEVER saved. I don't know where that PATCH request is coming from.
Please, I need this process working for the company I work for; they will acquire a more robust subscription than mine, which is currently set at 1,000 minutes per month for testing.
Thanks in advance, I appreciate your help!!
The last chunk at the end of your upload should be a multiple of 256 KiB.
Try changing 250 * 1024 * 1024 to 256 * 1024 * 1024.
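Applied to the tus.Upload options above, that change would look like this (everything else unchanged):
var upload = new tus.Upload(file, {
  chunkSize: 256 * 1024 * 1024, // was 250 * 1024 * 1024
  // ...rest of the options as before
})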
Take a look at my code.
My setInterval sends no more than 10 requests, and then it stops working, even though each request gets a response from the server.
Why does this happen, and how can I fight it? I tried using axios and had the same problem.
setInterval(function() {
  console.log("I'm here")
  fetch(`/api/....`, {
    method: 'POST',
    headers: {
      'Accept': 'application/json, text/plain, */*',
      'Content-Type': 'application/json'
    },
    referrerPolicy: 'no-referrer'
  }).then((res) => {
    console.log("I'm here")
    console.log(res)
  }).catch((err) => {
    console.log(err)
  })
}, 1000);
Sometimes an error like this also flies by: TypeError: NetworkError when attempting to fetch resource.
Provided your backend co-operates, there is no reason why your JavaScript code should not work:
var i = 0,
  iv = setInterval(function() {
    console.log(`I'm here: ${++i}`);
    fetch(`https://jsonplaceholder.typicode.com/users`, {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({newId: i})
    }).then(r => r.json()).then(res => {
      console.log(`I'm back: ${res.newId}`);
      console.log(res);
      if (i == 25) clearInterval(iv);
    }).catch((err) => {console.log(err)});
  }, 1000);
(There were a few typos in your console.logs which I straightened out.)
It's hard to say without the server-side code, but I can guess there is some limitation in your Express application which limits the number of your requests to 10.
It has nothing to do with your JavaScript code, which should fire more than 10 requests without any problem.
As you can see in this article on Rate Limiting, such a limit can be set up like this:
const express = require("express");
const indexRoute = require("./router");
const rateLimit = require("express-rate-limit");

const app = express();
const port = 3000;

app.use(
  rateLimit({
    windowMs: 12 * 60 * 60 * 1000, // 12 hour duration in milliseconds
    max: 5,
    message: "You exceeded 5 requests in 12 hour limit!",
    headers: true,
  })
);

app.use("/", indexRoute);
app.listen(port);
You can limit the available requests by changing the max value.
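On the client side, a request rejected by the limiter typically comes back with HTTP status 429, so a quick way to check whether this is what stops your loop (a sketch, assuming express-rate-limit's default rejection status; the /api/.... path is the placeholder from the question) is:
fetch(`/api/....`, { method: 'POST' }).then((res) => {
  if (res.status === 429) {
    // the limiter kicked in; the response body carries the configured message
    console.log('Rate limited, Retry-After:', res.headers.get('Retry-After'));
  }
});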
I'm trying to upload files from my Next.js app, but Google doesn't want to let me upload them via a signed URL from a Next.js API route. (It does work with S3.) I also tested it in Postman, and there it allows me to upload files. Weird.
Getting the signed URL:
const options = {
  version: 'v4',
  action: 'write',
  expires: Date.now() + 15 * 60 * 1000,
  contentType: 'application/octet-stream'
}

const url = await storage
  .bucket(BUCKET_NAME) // .env variable with the bucket
  .file(path)
  .getSignedUrl(options)
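For context, a minimal sketch of how such a URL might be handed to the browser from a Next.js API route (the pages/api/signed-url.js path and the query parameter are assumptions, not from the original post; note that getSignedUrl resolves to an array):
// pages/api/signed-url.js (assumed path)
export default async function handler(req, res) {
  const [url] = await storage
    .bucket(BUCKET_NAME)
    .file(req.query.path)
    .getSignedUrl({
      version: 'v4',
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000,
      contentType: 'application/octet-stream'
    })
  res.status(200).json({ url })
}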
Using the URL to upload files and track upload progress via axios:
axios({
  url, // signed url from the function above
  method: 'PUT',
  headers: {
    'Content-Type': 'multipart/form-data'
  },
  data, // new FormData with the files on it
  onUploadProgress: (progressEvent) => {
    const percentCompleted = Math.round(
      (progressEvent.loaded * 100) / progressEvent.total
    )
    updateProgressCallback({ file, percentCompleted }) // to keep track of the upload progress
  }
})
Also, here you have the GCP bucket cors.json config (the wildcard origin is for testing purposes; later I will restrict it to the domain):
[
  {
    "origin": ["*"],
    "responseHeader": ["*"],
    "method": ["GET", "POST", "HEAD", "DELETE"],
    "maxAgeSeconds": 3600
  }
]
I'm trying to host media/image files on a Cloudflare R2 bucket. I tried lots of stuff but I still can't reach the bucket from outside with Node.js.
How can I fix this code?
My final code:
fetch('https://USER_ID.r2.cloudflarestorage.com/PROJECT_NAME/src/thumbs/BLAHBLAH', {
  method: 'GET',
  headers: {
    'Content-Type': 'image/jpeg',
    'Authorization': 'SECRET_KEY_FOR_API',
    'X-Amz-Ac': 'private',
    'X-Amz-Algorithm': 'AWS4-HMAC-SHA256',
    'X-Amz-Date': (new Date().toISOString().split(':').join('').split('.')[0] + 'Z').split('-').join(''),
    'X-Amz-Expires': '86400',
    'x-amz-content-sha256': 'UNSIGNED-PAYLOAD',
  }
}).then(res => res.text()).then(buffer => {
  console.log(buffer);
}).catch(err => {
  console.log(err);
});
You'll need to implement proper SigV4 signing for your request.
It'll be easiest to use one of the AWS SDKs, like aws-sdk-js - take a look at the example documentation for R2: https://developers.cloudflare.com/r2/examples/aws-sdk-js-v3/
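Following that documentation, a minimal sketch with @aws-sdk/client-s3 would look roughly like this (the account ID, bucket, key, and credentials are placeholders):
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');

// R2 speaks the S3 API: region is always 'auto' and the endpoint is your account's R2 URL
const s3 = new S3Client({
  region: 'auto',
  endpoint: 'https://<ACCOUNT_ID>.r2.cloudflarestorage.com',
  credentials: {
    accessKeyId: '<R2_ACCESS_KEY_ID>',
    secretAccessKey: '<R2_SECRET_ACCESS_KEY>',
  },
});

// the SDK takes care of the SigV4 signing for you
s3.send(new GetObjectCommand({ Bucket: 'PROJECT_NAME', Key: 'src/thumbs/BLAHBLAH' }))
  .then(res => console.log(res.ContentType))
  .catch(err => console.log(err));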
I'm trying to attach an image using the Bot Emulator tool and send this image off to Microsoft's Custom Vision API. The issue I'm having is that I get
{ Code: 'BadRequestImageFormat', Message: '' }
back from the Custom Vision API call.
I'm using the request module from npm to handle the calls.
// Receive messages from the user and respond by echoing each message back (prefixed with 'You said:')
var bot = new builder.UniversalBot(connector, function (session) {
  session.send("Hello"); //session.message.text
  // If there is an attachment
  if (session.message.attachments.length > 0) {
    console.log(session.message.attachments[0])
    request.post({
      url: 'xxx',
      encoding: null,
      json: true,
      headers: {
        'Content-Type': 'application/octet-stream',
        'Prediction-Key': 'xxx'
      },
      body: session.message.attachments[0]
    }, function(error, response, body) {
      console.log(body);
    });
  }
});
I believe that I may be sending the wrong format through to Custom Vision; however, I have been unable to figure it out as of yet.
I replicated your issue and it looks like the problem is your 'Content-Type'. You're attempting to pass JSON in your request, but setting the content-type as octet-stream. See my modified code below:
var bot = new builder.UniversalBot(connector, function (session) {
  session.send("Hello"); //session.message.text
  // If there is an attachment
  if (session.message.attachments.length > 0) {
    console.log(session.message.attachments[0])
    request.post({
      url: 'https://northeurope.api.cognitive.microsoft.com/vision/v1.0/analyze?visualFeatures',
      encoding: null,
      json: true,
      headers: {
        'Content-Type': 'application/json',
        'Ocp-Apim-Subscription-Key': 'Your API Key...'
      },
      body: session.message.attachments[0]
    },
    function (err, response, body) {
      if (err) return console.log(err)
      console.log(body);
    });
  }
});
When I run this, I get the error InvalidImageUrl, which is to be expected as it's looking for content on localhost. You could get around this by exposing your localhost using Ngrok.
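Alternatively, a sketch of fetching the attachment's bytes yourself and posting the raw binary to the prediction endpoint (the URL and key are placeholders; encoding: null makes request return a Buffer):
request.get({ url: session.message.attachments[0].contentUrl, encoding: null },
  function (err, resp, imageBytes) {
    if (err) return console.log(err);
    request.post({
      url: 'https://<region>.api.cognitive.microsoft.com/customvision/...', // placeholder prediction URL
      headers: {
        'Content-Type': 'application/octet-stream',
        'Prediction-Key': '<your prediction key>'
      },
      body: imageBytes // raw image bytes, not the attachment JSON
    }, function (err2, resp2, body) {
      if (err2) return console.log(err2);
      console.log(body);
    });
  });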
Previously I was using the Dropbox API v1 within my web app to upload files to my Dropbox account. Please note that the app uses only one Dropbox account (mine) to upload files.
So previously, I:
Created an app on the Dropbox developers console
Generated my token from the developers console
Hard-coded that token into my server to upload all files to a specific folder within my Dropbox
This worked perfectly before, but as the Dropbox API v1 has been deprecated, it does not work anymore.
Dropbox V1 Code:
function fileupload(content) {
  request.put('https://api-content.dropbox.com/1/files_put/auto/my_reports/report.pdf', {
    headers: {
      Authorization: 'TOKEN HERE',
      'Content-Type': 'application/pdf'
    },
    body: content
  }, function optionalCallback(err, httpResponse, bodymsg) {
    if (err) {
      console.log(err);
    }
    else {
      console.log("File uploaded to dropbox successfully!");
      fs.unlink(temp_dir + 'report.pdf', function(err) {
        if (err)
          throw err;
        else {
          console.log("file deleted from server!");
        }
      })
      request.post('https://api.dropboxapi.com/1/shares/auto/MY_reports/report.pdf' + '?short_url=false', {
        headers: {
          Authorization: 'TOKEN HERE'
        }
      }, function optionalCallback(err, httpResponse, bodymsg) {
        if (err) {
          console.log(err);
        }
        else {
          console.log('Shared link 2 ' + JSON.parse(httpResponse.body).url);
        }
      });
    }
  });
}
Dropbox V2 Code:
function fileupload(content) {
  request.post('https://content.dropboxapi.com/2/files/upload/my_reports', {
    headers: {
      Authorization: 'TOKEN HERE',
      'Content-Type': 'application/pdf'
    },
    body: content
  } ......... (rest of the code is similar to above)
Issue:
What I have tried does not work. I can't seem to upload a file to my Dropbox account from within my app. I have tried re-generating my TOKEN from the Dropbox App Console, but no luck.
Can anyone tell me what I am doing wrong?
Update:
I updated my code to a similar structure for v2 of the API but still have been unable to resolve it.
request.post('https://content.dropboxapi.com/2/files/upload/', {
  headers: {
    Authorization: 'Bearer TOKEN',
    'Dropbox-API-Arg': {"path": "/Homework","mode": "add","autorename": true,"mute": false},
    'Content-Type': 'application/pdf'
    //'Content-Type': 'application/vnd.openxmlformats-officedocument.presentationml.presentation'
  },
  body: content
} .... similar code
I encourage you to use an existing Node.js Dropbox package, which hides the abstraction of the authentication process, etc. under the hood.
Check the official dropbox-sdk-js or try my tiny package dropbox-v2-api. Quick example:
const dropboxV2Api = require('dropbox-v2-api');
const fs = require('fs');

// create session
const dropbox = dropboxV2Api.authenticate({
  token: 'TOKEN HERE'
});

// create upload stream
const uploadStream = dropbox({
  resource: 'files/upload',
  parameters: {
    path: '/dropbox/path/to/file.txt'
  }
}, (err, result) => {
  // upload completed
});

// use a nodejs stream
fs.createReadStream('path/to/file.txt').pipe(uploadStream);
My recommendation is also to use an SDK which abstracts over authentication. CloudRail for Node.js could be very useful here. It's quite easy to use and works for other providers like OneDrive as well.
const cloudrail = require("cloudrail-si");

const service = new cloudrail.services.Dropbox(
  cloudrail.RedirectReceivers.getLocalAuthenticator(8082),
  "[Dropbox Client Identifier]",
  "[Dropbox Client Secret]",
  "http://localhost:8082/auth",
  "someState"
);

service.upload(
  "/myFolder/myFile.png", // remote path in Dropbox
  readableStream,         // a readable stream with the file's content
  1024,                   // size of the content in bytes
  true,                   // overwrite if the file already exists
  (error) => {
    // Check for potential error
  }
);
Here is also a short article about the {"error": "v1_retired"} issue.