File Upload Angular 2 & Sails Js - javascript

I am trying to upload multiple files from an Angular 2 app to a Sails.js server. I want to place the files inside the public folder of the Sails.js app.
The file is taken from a change event fired on the file input. The code for a single file upload is as follows:
Angular 2 service:
fileChange(event: any): Promise<string> {
    let fileList: FileList = event.target.files;
    if (fileList.length > 0) {
        let file: File = fileList[0];
        let formData: FormData = new FormData();
        formData.append('myFile', file, file.name);

        let headers = new Headers();
        let cToken = this.cookie.getCookie("token");
        headers.append('Authorization', 'Bearer ' + cToken);
        headers.append('Content-Type', undefined);
        //headers.append('Content-Type', 'multipart/form-data');
        headers.append('Accept', 'application/json');

        let options: RequestOptionsArgs = { headers: headers, withCredentials: true }

        return new Promise((resolve, reject) => {
            this.http.post(this.apiEndpoint + "project/reffile/add/all", formData, options).toPromise()
                .then(response => {
                    // The promise is resolved once the HTTP call is successful.
                    let jsonData = response.json();
                    if (jsonData.apiStatus == 1) {
                        resolve(jsonData);
                    }
                    else reject(jsonData.message);
                })
                // The promise is rejected if there is an error with the HTTP call.
                // if we don't get any answers the proxy/api will probably be down
                .catch(reason => reject(reason.statusText));
        });
    }
}
SailsJs method:
/**
* `FileController.upload()`
*
* Upload file(s) to the server's disk.
*/
addAll: function (req, res) {
    // e.g.
    // 0 => infinite
    // 240000 => 4 minutes (240,000 milliseconds)
    // etc.
    //
    // Node defaults to 2 minutes.
    res.setTimeout(0);

    console.log("req.param('filename')");
    console.log(req.param('filename'));

    req.file('myFile').upload({
        // You can apply a file upload limit (in bytes)
        maxBytes: 1000000
    }, function whenDone(err, uploadedFiles) {
        if (err) return res.serverError(err);
        return res.json({
            files: uploadedFiles,
            textParams: req.allParams()
        });
    });
},
After posting the form, I don't get the file in the HTTP call on the server, and console.log(req.param('filename')) prints nothing.
Please help me figure out what I am doing wrong here. I also tried changing/removing the headers, but it still doesn't work.
Some experts say that Angular's Http service currently can't upload files and that a native XHR request is needed; see Thierry Templier's answer here.

Try specifying a directory for file upload:
req.file('file').upload({
    dirname: '../../assets/uploads'
}, function (err, files) {
    if (err) return res.serverError(err);
    var fileNameArray = files[0].fd.split("/");
    var fileName = fileNameArray[fileNameArray.length - 1];
    console.log("fileName: ", fileName);
});
To access the uploaded file, append the fileName to the upload directory that you specified; the file will be accessible at that path.
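To make that concrete, here is a small standalone sketch (hypothetical names, not from the answer) that turns the fd Sails reports into a client-facing path, assuming the dirname above lives under assets/ and that assets/uploads ends up served at /uploads:

```javascript
// Hypothetical helper: derive a client-facing path from the file
// descriptor (fd) that Sails returns after an upload. Assumes the
// upload dirname '../../assets/uploads' is served at '/uploads'.
function publicPathFor(fd) {
    var fileNameArray = fd.split("/");
    var fileName = fileNameArray[fileNameArray.length - 1];
    return "/uploads/" + fileName;
}

console.log(publicPathFor("/var/www/app/assets/uploads/abc123.png"));
// -> /uploads/abc123.png
```

The split-and-pop is the same technique the answer uses; only the prefix mapping is an assumption about how your assets are served.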

Related

Sending an HTTP POST request with image and text

How can I send an image along with a text in VueJs to my backend ExpressJs?
Right now, what I did was create two HTTP POST requests.
NOTE: this.albumName and this.albumDesc are just text, and formData contains an image.
createAlbum() {
    const formData = new FormData();
    for (let file of Array.from(this.myAlbumImages)) {
        formData.append("files", file);
    }
    if (this.albumName) {
        axios
            .post("http://localhost:9001/image/album", {
                ALBUM: this.albumName,
                DESCRIPTION: this.albumDesc
            })
            .then(resp => console.log(resp))
            .catch(err => console.log(err));
        setTimeout(function() {
            axios
                .post("http://localhost:9001/image/album", formData)
                .then(resp => console.log(resp))
                .catch(err => console.log(err));
        }, 3000);
        this.albumName = "";
        this.albumDesc = "";
    } else {
        alert("Please fill the above form.");
    }
},
And here is my backend.
It creates a folder based on the passed data, but it also creates a folder named undefined.
router.post('/album', (req, res) => {
    let sql = "INSERT INTO GALLERY SET ALBUM = ?, DESCRIPTION = ?";
    let body = [req.body.ALBUM, req.body.DESCRIPTION];
    myDB.query(sql, body, (error, results) => {
        if (error) {
            console.log(error);
        } else {
            let directory = `C:/Users/user/Desktop/project/adminbackend/public/${req.body.ALBUM}`;
            fse.mkdirp(directory, err => {
                if (err) {
                    console.log(err);
                } else {
                    console.log(directory);
                }
            });
        }
    });
});
I think this is because Node.js is asynchronous, and that's why it creates the undefined folder.
The reason for the behavior you see is that you are sending two different requests to the same route. The first includes the ALBUM and DESCRIPTION form field values, but not the files. The second (inside setTimeout) contains just the files and no other fields, so referencing them like req.body.ALBUM will return undefined.
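The stray undefined folder follows mechanically from interpolating that missing field into the path. A standalone sketch (plain Node, not from the thread) of what happens when the files-only request hits the route:

```javascript
// For the second request (FormData only, no multipart parser on the
// server), req.body has no ALBUM field — simulated here as an empty object.
const body = {};
const directory = `C:/Users/user/Desktop/project/adminbackend/public/${body.ALBUM}`;
console.log(directory); // the path ends in ".../undefined"
```

Template literals stringify undefined to the literal text "undefined", which is exactly the folder name fse.mkdirp then creates.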
You can send all data (text fields and files) in one request. Just do this:
const formData = new FormData();
for (let file of Array.from(this.myAlbumImages)) {
    formData.append("files", file);
}
formData.append("ALBUM", this.albumName);
formData.append("DESCRIPTION", this.albumDesc);

axios.post("http://localhost:9001/image/album", formData)
    .then(resp => console.log(resp))
    .catch(err => console.log(err));
FormData always uses the content type multipart/form-data. To parse it on the server side you need Express middleware that parses multipart forms and gives you access to both the fields and the image(s), for example multer.
For the client part, this link may help: How to post image with fetch?
const fileInput = document.querySelector('#your-file-input');
const formData = new FormData();
formData.append('file', fileInput.files[0]);

const options = {
    method: 'POST',
    body: formData,
    // If you add this, upload won't work
    // headers: {
    //     'Content-Type': 'multipart/form-data',
    // }
};

fetch('your-upload-url', options);
And for the server part, this link may help: Node Express sending image files as API response
app.get('/report/:chart_id/:user_id', function (req, res) {
    res.sendFile(filepath);
});
And the official documentation on this:
http://expressjs.com/en/api.html#res.sendFile

Capturing an image with ngx-webcam and sending it to face recognition api

I am currently trying to directly send an image via ngx-webcam without saving it to my backend server and send it to a Face Detection API via my node.js. The problem is that I keep getting an error for my header in my node.js file. How can I resolve this issue?
I noticed that the image url being passed is quite long. Could that be an issue?
Image url:
"data:image/jpeg;base64,/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAMCAgICAgMCAgIDAwMDBAYEBAQEBAgGBgUGCQgKCgkICQkKDA8MCgsOCwkJDRENDg8QEBEQCgwSExIQEw8QEBD/2wBDAQMDAwQDBAgEBAgQCwkLEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBAQEBD/wAARCAHgAoADASIAAhEBAxE..."
My error is:
TypeError [ERR_HTTP_INVALID_HEADER_VALUE]: Invalid value "undefined" for header "Content-Length"
at ClientRequest.setHeader (_http_outgoing.js:473:3)
at FormData.<anonymous> (C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\lib\form_data.js:321:13)
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\lib\form_data.js:265:7
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\node_modules\async\lib\async.js:251:17
at done (C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\node_modules\async\lib\async.js:126:15)
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\node_modules\async\lib\async.js:32:16
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\node_modules\async\lib\async.js:248:21
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\node_modules\async\lib\async.js:572:34
at C:\Users\Roger\Documents\GitHub\angular-face-recognition-app\back-end\node_modules\form-data\lib\form_data.js:105:13
at FSReqWrap.oncomplete (fs.js:153:21)
Front end: Angular
Component file:
// captures image function
public handleImage(webcamImage: WebcamImage): void {
    // stores it into the webcamImage variable
    this.webcamImage = webcamImage;
    // uses fda.sendImage to send webcamImage to the API via a service
    this.fda.sendImage(this.webcamImage.imageAsDataUrl).subscribe(res => {});
}
Service file
sendImage(imgUrl) {
    console.log(imgUrl);
    const obj = {
        url: imgUrl
    };
    return this.http.post(`${this.uri}`, obj);
}
Backend: node.js
Route file
facedetAPIRoutes.route("/").post(function (req, res) {
    let imageUrl = req.body.url;
    myFaceDetAPI.recognizeImg(imageUrl).then(function(result) {
        // here is your response back
        res.json(result);
    });
});
Function file for api call: uses a promise
// I believe the problem lies here somewhere
this.recognizeImg = (url) => {
    let requestString = "https://lambda-face-recognition.p.rapidapi.com/recognize";
    let imgURL = url;
    let promise = new Promise(function(resolve, reject) {
        unirest.post(requestString)
            .header("X-RapidAPI-Key", API_KEY)
            .attach("files", fs.createReadStream(imgURL))
            .field("album", ALBUM_NAME)
            .field("albumkey", ALBUM_KEY)
            .end(result => {
                console.log("successfully recognized image");
                resolve(result.body); // giving response back
            });
    });
    return promise;
}
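One likely culprit (an assumption on my part, not confirmed in the thread): imgURL here is a base64 data URL, not a file path, so fs.createReadStream(imgURL) never yields a file whose size can be determined, and form-data is left computing an undefined Content-Length. A hedged alternative is to decode the data URL into a Buffer first; the helper below is illustrative:

```javascript
// Illustrative helper: split a base64 data URL into its MIME type and
// a Buffer of the raw bytes, so the bytes have a known length up front.
function dataUrlToBuffer(dataUrl) {
    const match = /^data:(.+?);base64,(.*)$/.exec(dataUrl);
    if (!match) throw new Error('not a base64 data URL');
    return { mime: match[1], buffer: Buffer.from(match[2], 'base64') };
}

// Round-trip check with a tiny payload:
const demo = dataUrlToBuffer('data:image/jpeg;base64,' + Buffer.from('hi').toString('base64'));
console.log(demo.mime, demo.buffer.length); // image/jpeg 2
```

The resulting buffer could then be attached to the upload in place of the stream; whether .attach() accepts a raw buffer directly depends on unirest's form-data options, so treat this as a sketch of the decoding step only.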
You should try adding x-rapidapi-host and content-type headers.
.headers({
"content-type": "application/x-www-form-urlencoded",
"x-rapidapi-host": "lambda-face-recognition.p.rapidapi.com",
"x-rapidapi-key": "",
"useQueryString": true
})

Download and upload image without saving to disk

Using Node.js, I am trying to get an image from a URL and upload that image to another service without saving the image to disk. I have the following code that works when saving the file to disk and using fs to create a readable stream. But since I am running this as a cron job on a read-only file system (webtask.io), I want to achieve the same result without saving the file to disk temporarily. Shouldn't that be possible?
request(image.Url)
    .pipe(
        fs.createWriteStream(image.Id)
            .on('finish', () => {
                client.assets
                    .upload('image', fs.createReadStream(image.Id))
                    .then(imageAsset => {
                        resolve(imageAsset)
                    })
            })
    )
Do you have any suggestions on how to achieve this without saving the file to disk? The upload client will accept the following:
client.asset.upload(type: 'file' | 'image', body: File | Blob | Buffer | NodeStream, options = {}): Promise<AssetDocument>
Thanks!
How about passing the buffer down to the upload function? Since as per your statement it'll accept a buffer.
As a side note... This will keep it in memory for the duration of the method execution, so if you call this numerous times you might run out of resources.
var data = [];
request.get(url)
    .on('data', function(chunk) {
        data.push(chunk);
    })
    .on('end', function() {
        var buffer = Buffer.concat(data);
        // Pass the buffer
        client.asset.upload('image', buffer);
    });
I tried various libraries, and it turns out that node-fetch provides a way to return a buffer. So this code works:
fetch(image.Url)
    .then(res => res.buffer())
    .then(buffer => client.assets.upload('image', buffer, { filename: image.Id }))
    .then(imageAsset => {
        resolve(imageAsset)
    })
Well, I know it has been a few years since the question was originally asked, but I have encountered this problem now, and since I didn't find an answer with a comprehensive example, I made one myself.
I'm assuming that the file path is a valid URL and that the end of it is the file name. I need to pass an API key to this API endpoint, and a successful upload sends me back a response with a token.
I'm using node-fetch and form-data as dependencies.
const fetch = require('node-fetch');
const FormData = require('form-data');

const secretKey = 'secretKey';

const downloadAndUploadFile = async (filePath) => {
    const fileName = new URL(filePath).pathname.split("/").pop();
    const endpoint = `the-upload-endpoint-url`;
    const formData = new FormData();
    let jsonResponse = null;
    try {
        const download = await fetch(filePath);
        const buffer = await download.buffer();
        if (!buffer) {
            console.log('file not found', filePath);
            return null;
        }
        formData.append('file', buffer, fileName);
        const response = await fetch(endpoint, {
            method: 'POST',
            body: formData,
            headers: {
                ...formData.getHeaders(),
                "Authorization": `Bearer ${secretKey}`,
            },
        });
        jsonResponse = await response.json();
    } catch (error) {
        console.log('error on file upload', error);
    }
    return jsonResponse ? jsonResponse.token : null;
}
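One step worth checking in isolation is the file-name extraction at the top of downloadAndUploadFile. The WHATWG URL parser keeps the query string out of pathname, so the last path segment comes out clean:

```javascript
// Mirrors `new URL(filePath).pathname.split("/").pop()` from the snippet above.
function fileNameFromUrl(filePath) {
    return new URL(filePath).pathname.split('/').pop();
}

console.log(fileNameFromUrl('https://example.com/images/photo.png'));     // photo.png
console.log(fileNameFromUrl('https://example.com/images/photo.png?v=2')); // photo.png
```

Note that new URL() throws on a non-URL input, which matches the answer's stated assumption that the file path is a valid URL.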

Download file from Amazon S3 using REST API

I have my own REST API to call in order to download a file. (In the end, the file could be stored on different kinds of servers... Amazon S3, locally, etc.)
To get a file from s3, I should use this method:
var url = s3.getSignedUrl('getObject', params);
This will give me a downloadable link to call.
Now, my question is, how can I use my own rest API to download a file when it comes from that link? Is there a way to redirect the call?
I'm using Hapi for my REST server.
{
    method: "GET", path: "/downloadFile",
    config: { auth: false },
    handler: function (request, reply) {
        // TODO
        reply({})
    }
},
Instead of using a redirect to download the desired file, just return an unbuffered stream from S3. An unbuffered stream can be obtained from the HttpResponse within the AWS-SDK. This means there is no need to download the file from S3, read it in, and then have the requester download the file.
FYI, I use this getObject() approach with Express and have never used Hapi; however, I think I'm pretty close with the route definition, and hopefully it captures the essence of what I'm trying to achieve.
Hapi.js route
const getObject = require('./getObject');

{
    method: "GET", path: "/downloadFile",
    config: { auth: false },
    handler: function (request, reply) {
        let key = '';    // get key from request
        let bucket = ''; // get bucket from request
        return getObject(bucket, key)
            .then((response) => {
                reply.statusCode(response.statusCode);
                Object.keys(response.headers).forEach((header) => {
                    reply.header(header, response.headers[header]);
                });
                return reply(response.readStream);
            })
            .catch((err) => {
                // handle err
                reply.statusCode(500);
                return reply('error');
            });
    }
},
getObject.js
const AWS = require('aws-sdk');
const S3 = new AWS.S3(<your-S3-config>);

module.exports = function getObject(bucket, key) {
    return new Promise((resolve, reject) => {
        // Get the file from the bucket
        S3.getObject({
            Bucket: bucket,
            Key: key
        })
        .on('error', (err) => {
            return reject(err);
        })
        .on('httpHeaders', (statusCode, headers, response) => {
            // If the Key was found inside the Bucket, prepare a response object
            if (statusCode === 200) {
                let responseObject = {
                    statusCode: statusCode,
                    headers: {
                        'Content-Disposition': 'attachment; filename=' + key
                    }
                };
                if (headers['content-type'])
                    responseObject.headers['Content-Type'] = headers['content-type'];
                if (headers['content-length'])
                    responseObject.headers['Content-Length'] = headers['content-length'];
                responseObject.readStream = response.httpResponse.createUnbufferedStream();
                return resolve(responseObject);
            }
        })
        .send();
    });
}
Return an HTTP 303 redirect with the Location header set to the blob's public URL in the S3 bucket.
If your bucket is private then you need to proxy the request instead of performing a redirect, unless your clients also have access to the bucket.

Trying to upload to S3 bucket. After 5-10 uploads it hangs

I've created a promise that uses the aws-sdk to upload to my S3 bucket. My application is a simple command line script to add images to S3 and update the database. I have anywhere between 300-1000 images to upload every run.
The problem I am having is it uploads 5-10 images but then seems to hang. I've confirmed this by placing a console.log(data) after the error checking in the promise below.
The first 5 images upload quickly, the sixth takes about a minute, the seventh a lot longer, at which point it just hangs.
s3-upload-promise.js
'use strict'

let AWS = require('aws-sdk')
let s3 = new AWS.S3()

module.exports = function(params) {
    return new Promise(function(resolve, reject) {
        s3.upload(params, function(err, data) {
            if (err) return reject(Error(err))
            resolve(data)
        })
    })
}
And here is the code that calls the promise:
let s3UploadPromise = require('../src/s3-upload-promise/s3-upload-promise')

// Get all PNG files from given path and upload to bucket
listpng(process.argv[2]).then(function(files) {
    let promises = []
    files.forEach(function(el, i) {
        promises.push(el, s3UploadPromise({
            Bucket: process.env.S3_BUCKET,
            Key: 'templates/' + randomstring.generate() + '.png',
            Body: fs.createReadStream(process.argv[2] + el),
            ACL: 'public-read'
        }))
    });
    Promise.all(promises).then(function(values) {
        return console.log(values)
    })
})
Any ideas what I'm doing wrong? Does this have anything to do with me not closing createReadStream?
Edit
I've tried closing the stream within the s3.upload callback. It didn't make any difference and it still hung:
s3.upload(params, function(err, data) {
    params.Body.close()
    if (err) return reject(Error(err))
    resolve(data)
})
Edit 2
I added some error checking and I am getting the following error:
{
    message: 'Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.',
    code: 'RequestTimeout',
    region: null,
    time: 2016-06-19T13:14:39.223Z,
    requestId: 'F7E64E8F99E774F3',
    extendedRequestId: 'PW/mPy6t3w9U1uJc8xYKhUGi/KiSY+6yK6nq0RB21Ke1KqRmTWjjm3KXEp0qAEPDadypw+kiwCEP3upER1uecEP4Sl9Tk/lt',
    cfId: undefined,
    statusCode: 400,
    retryable: true
}
A comment on an issue on Github mentions:
I've been noticing this a lot when using concurrency > 1, on several systems. Most of the time an uploaded folder will begin to get 400's due to timeouts after the first 10 or so requests.
Maybe adding a Content-Length somewhere will help. Will try.
Edit 3
I decided to try a different library called knox. I get the same problem! This is crazy. Surely it must be an Amazon issue if two different libraries are facing the same problem?
s3-upload-promise.js
'use strict'

let knox = require('knox')
let process = require('process')

var client = knox.createClient({
    key: process.env.AWS_KEY,
    secret: process.env.AWS_SECRET,
    bucket: process.env.S3_BUCKET,
});

module.exports = function(params) {
    return new Promise(function(resolve, reject) {
        let headers = {
            'Content-Length': params.contentLength,
            'Content-Type': params.contentType,
            'x-amz-acl': params.permissions
        }
        client.putBuffer(params.buffer, params.key, headers, function(err, res) {
            if (err) return reject(err)
            resolve(res)
        });
    })
}
Calling code ...
// Get all PNG files from given path
listpng(process.argv[2]).then(function(files) {
    let promises = []
    files.forEach(function(el, i) {
        let file = process.argv[2] + el;
        fs.readFile(file, function(err, data) {
            if (err) return console.log(err)
            fs.stat(process.argv[2] + el, function(err, stats) {
                if (err) return console.log(err)
                let key = process.env.S3_TEMPLATES + '/'
                let buffer = fs.createReadStream(process.argv[2] + el)
                let params = {
                    buffer: data,
                    key: key + randomstring.generate() + '.png',
                    contentLength: stats.size,
                    contentType: 'image/png',
                    permissions: 'public-read',
                }
                promises.push(s3Uploader(params))
                Promise.all(promises).then(function(values) {
                    return console.log(values)
                })
            })
        })
    })
})
Not sure what else to do now.
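The GitHub comment quoted in Edit 2 points at concurrency: firing 300-1000 uploads at once leaves most open sockets idle until S3 times them out with RequestTimeout. One hedged workaround (my sketch, not from the thread) is to cap how many uploads run at a time with a small promise pool, calling s3UploadPromise from inside the worker instead of up front:

```javascript
// Minimal concurrency limiter: run `worker(item)` over `items`,
// at most `limit` at a time, resolving with results in input order.
function runLimited(items, limit, worker) {
    let index = 0;
    const results = new Array(items.length);

    function next() {
        if (index >= items.length) return Promise.resolve();
        const i = index++;
        return Promise.resolve(worker(items[i]))
            .then(result => { results[i] = result; })
            .then(next);
    }

    const lanes = [];
    for (let n = 0; n < Math.min(limit, items.length); n++) {
        lanes.push(next());
    }
    return Promise.all(lanes).then(() => results);
}

// Quick sanity check with a trivial worker:
runLimited([1, 2, 3, 4, 5], 2, x => Promise.resolve(x * 2))
    .then(results => console.log(results)); // [ 2, 4, 6, 8, 10 ]
```

In the upload script this would replace the files.forEach/Promise.all pair, e.g. runLimited(files, 5, el => s3UploadPromise({ ... })), so only a handful of read streams are open against S3 at any moment.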
