I have an issue with Node and Amazon S3 involving SHA-256 hashing. I read my file from the file system using fs.createReadStream(filename), receiving it in chunks of 1 * 1024 * 1024 bytes, and push each chunk into an array. When the file has finished being read (readstream.on('end')), I loop through the array and hash each chunk with SHA-256. While looping, I also push an axios promise for each chunk into an array, so that once every chunk has been hashed I can use Promise.all to send all the requests. The hash of each chunk is sent along with its request as a header. The problem I've been trying to solve is that whenever a request is made, the SHA-256 that S3 calculates is completely different from the one I computed. I have tried to understand and fix this to no avail. My code is below; what could I be doing wrong?
This is the error I'm getting:
Corrupted chunk received:
File corrupted: Expected SHA jakoz9d12xYjzpWVJQlqYdgPxAuF+LjZ9bQRg0hzmL8=, but calculated SHA 103f77f9b006d9b5912a0da167cf4a8cec60b0be017b8262cd00deb3183f3a8b
const Encryptsha256 = function(chunksTobeHashed) {
    var crypto = require('crypto');
    var hash = crypto.createHash('sha256').update(chunksTobeHashed).digest('base64');
    return hash;
}
const upload = async function(uploadFile) {
    var folderPath = uploadFile.filePath
    var chunksArray = []
    var uploadFileStream = fs.createReadStream(folderPath, { highWaterMark: 1 * 1024 * 1024, encoding: "base64" })

    uploadFileStream.on('data', (chunk) => {
        chunksArray.push(chunk)
        // console.log('chunk is ', chunk)
    })

    uploadFileStream.on('error', (error) => {
        console.log('error is ', error)
    })

    // file_id: "2fe44d18-fa94b201-2fe44d18b196-f9066e05a81c"
    uploadFileStream.on('end', async () => {
        // code to get the file id was here, but removed since it is not necessary to this question
        var file_id = "2fe44d18-fa94b201-2fe44d18b196-f9066e05a81c"
        let promises = [];
        for (var i in chunksArray) {
            var Content_SHA256 = Encryptsha256(chunksArray[i])
            var payload = {
                body: chunksArray[i],
            }
            promises.push(
                axios.post(
                    `${baseURL}/home/url/upload/${file_id}/chunk/${i}`, payload, {
                        headers: {
                            'Content-SHA256': Content_SHA256,
                        },
                    }
                )
            )
        }
        Promise.all(promises).then((response) => {
            console.log('axios::', response)
        })
        .catch((error) => {
            console.log('request error', error)
        })
    })
}
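For reference, here is a minimal sketch (using only Node's built-in crypto, with a made-up input) showing how the same data produces different digest strings depending on whether the raw bytes or their base64 text are hashed, and whether the digest is rendered as base64 or hex — the two formats visible in the error above:

const crypto = require('crypto');

const raw = Buffer.from('hello world');       // raw bytes, as the server would see them
const asBase64Text = raw.toString('base64');  // what a base64-encoded read stream emits

// Same input, two digest encodings: these strings never match each other.
console.log(crypto.createHash('sha256').update(raw).digest('base64'));
console.log(crypto.createHash('sha256').update(raw).digest('hex'));

// Hashing the base64 text instead of the raw bytes gives a third, different value.
console.log(crypto.createHash('sha256').update(asBase64Text).digest('base64'));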
I'm trying to send temperature data from my Node.js backend to React, but I keep getting "res.send is not a function".
Sample code:
app.get("/gettemperature", (req, res) => {
    const email = req.query.email;
    let stmt = `SELECT * FROM users WHERE email=?`;
    let todo = [email];
    db.query(stmt, todo, (err, results, fields) => {
        if (err) {
            console.error(err.message);
        }
        if (results.length > 0) {
            let id = results[0].id;
            let getID = `SELECT * FROM controlModules WHERE deviceowner=?`;
            let getidData = [id];
            db.query(getID, getidData, (err, resulta, fields) => {
                if (err) {
                    console.error(err.message);
                }
                if (resulta.length > 0) {
                    let lanip = resulta[0].ipaddress;
                    let url = "http://" + lanip + "/data";
                    http.get(url, (res) => {
                        let body = "";
                        res.on("data", (chunk) => {
                            body += chunk;
                        });
                        res.on("end", () => {
                            try {
                                let json = JSON.parse(body);
                                const temp_actual = json.temperature.value;
                                console.log(temp_actual);
                                res.setHeader('Content-Type', 'application/json');
                                res.end(
                                    JSON.stringify({
                                        value: temp_actual
                                    })
                                );
                            } catch (error) {
                                console.error(error.message);
                            }
                        });
                    }).on("error", (error) => {
                        console.error(error.message);
                    });
                }
            });
        }
    });
});
I really need to return/send the temperature data to my front end, but I'm getting that error. Is there a different way to return data?
It looks like you are mixing up an HTTP server you wrote in Node (although you haven't shown any relevant code) and an HTTP client you also wrote in Node.
res is an argument received by the callback you pass to http.get and contains data about the response received by your HTTP client.
Meanwhile, somewhere else (not shown) you have a different variable also called res which is the object your HTTP server uses to send its response to the browser running your React code.
You are calling res.send and wanting res to be the latter but it is really the former.
Since you haven't shown us the HTTP server code, it is hard to say where that res is, but there is a good chance you have shadowed it and can solve your problem by using different names (e.g. client_res and server_res).
That said, I strongly recommend avoiding the http module directly, as its API follows out-of-date design patterns and isn't very friendly. Consider using fetch or axios for making HTTP requests and Express.js for writing HTTP servers.
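As a sketch of the renaming fix, based on the handler shown in the question, the inner response can be given a distinct name so the outer Express res is no longer shadowed:

app.get("/gettemperature", (req, server_res) => {
    // ... database lookups as before ...
    http.get(url, (client_res) => {
        let body = "";
        client_res.on("data", (chunk) => {
            body += chunk;
        });
        client_res.on("end", () => {
            const json = JSON.parse(body);
            // server_res is the Express response, no longer shadowed by the client's res
            server_res.json({ value: json.temperature.value });
        });
    }).on("error", (error) => {
        console.error(error.message);
    });
});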
I have an Angular 11.x app that performs an HTTP request to a backend system, which reads data from a video file (e.g. mp4/mov) using FFMPEG; due to the processing, this async request takes 10 seconds to complete.
I've hard-coded some of the values for greater clarity.
// video-component.ts
let fileUrl = 'https://abc.s3.eu-west-2.amazonaws.com/video.mp4';
let fileSize = '56117299';
this.videoMetadata = this.videoService.getVideoMediaData(fileUrl, fileSize);
// If any errors are found in the async response, loop through them and push
// them into the `errorMessages` variable below, which displays them on the frontend.
/* I need to push the errors from the request above into this `errorMessages` variable:
self.errorMessages['Instagram'].push({
    "message": "Video must be between 3-60 seconds in duration",
});
*/
// video.service.ts (downloads the file & gets metadata using FFMPEG in the endpoint)
public getMetadata(file: string, size: string): Observable<any> {
    let params = new HttpParams();
    params = params.append('file', file);
    params = params.append('size', size);
    return this.http.get('post/media-check', { params })
        .pipe(map(response => {
            return response;
        }));
}

public getVideoMediaData(file, size) {
    return new Promise((resolve, reject) => {
        this.getMetadata(file, size).subscribe(
            data => {
                resolve(data);
            },
            errorResponse => {
            }
        );
    });
}
The post/media-check call in the getMetadata function hits a PHP endpoint and returns a response similar to the following.
{
    "status": "200",
    "data": {
        "video": {
            "container": "mov",
            "bitrate": 338,
            "stream": 0,
            "codec": "h264",
            "fps": 3
        }
    },
    "errors": ["Video must be at least 25 frames per second (fps)"],
    "responseType": "json",
    "response": "success"
}
How do I push the errors array from the backend's async response directly into the self.errorMessages variable?
First you need to make sure that your video-service is handling errors properly.
public getVideoMediaData(file, size) {
    return new Promise((resolve, reject) => {
        this.getMetadata(file, size).subscribe(
            data => {
                resolve(data);
            },
            errorResponse => {
                // Reject the Promise and pass the error response in the rejection
                reject(errorResponse);
            }
        );
    });
}
Then in your video-component you can handle this scenario like this:
let fileUrl = 'https://abc.s3.eu-west-2.amazonaws.com/video.mp4';
let fileSize = '56117299';

try {
    this.videoMetadata = await this.videoService.getVideoMediaData(fileUrl, fileSize);
    // happy path - do something with this.videoMetadata
} catch (e) {
    // unhappy path - e is the errorResponse passed to reject()
    const messages = e.errors.map(message => ({ message }));
    self.errorMessages['Instagram'].push(...messages);
}
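One thing to note: the enclosing component method must be declared async for the await call above to compile, e.g. (loadVideoMetadata is a hypothetical name):

async loadVideoMetadata() {
    try {
        this.videoMetadata = await this.videoService.getVideoMediaData(fileUrl, fileSize);
    } catch (e) {
        // handle errors as shown above
    }
}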
I have a problem in Node.js. I make a request to an API using https.request, and the response contains an object of 10,000 rows.
What happens is that the entire object does not arrive, and parsing gives the error "Unexpected end of JSON input".
Can someone help?
The request function:
function request({
    options,
    method,
    resource,
    queryParams,
    bodyParams,
}) {
    return new Promise((resolve, reject) => {
        const hasBodyParams = !!bodyParams;
        const stringifyedQueryParams = strigifyQueryParams(queryParams);
        const optionsRequest = {
            ...options,
            method,
            path: `${resource}${stringifyedQueryParams}`,
        };
        const req = https.request(optionsRequest, (res) => {
            res.setEncoding(configs.ENCODING);
            res.on(events.DATA, data => resolve({
                body: JSON.parse(data),
                statusCode: res.statusCode,
            }));
        });
        req.on(events.ERROR, error => reject(error));
        hasBodyParams && req.write(bodyParams);
        req.end();
    });
}
As I suspected in the comments, you're not handling multiple data events.
When receiving large responses from a request, the data event is emitted multiple times, each time with a chunk of data from the response (not the complete response).
When you parse a single chunk, the complete JSON document hasn't been transmitted yet, so parsing fails with the "Unexpected end of JSON input" error.
In short, you need to:
Create a variable to collect the complete body
On each data event, append the new chunk to the complete body
When the end event fires, parse the full body
Here is a short example, adapted from the official documentation:
https.request(options, (res) => {
    // PARTIAL example
    res.setEncoding("utf8"); // makes sure that "chunk" is a string
    let fullBody = "";
    res.on("data", data => {
        fullBody += data;
    });
    res.on("end", () => {
        const json = JSON.parse(fullBody);
        // work with json
    });
});
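Applied to the request function from the question, the same pattern would look roughly like this — a sketch that keeps the question's configs and events constants (assumed to hold the encoding and the "data"/"error" event names; a literal "end" is used since no events.END was shown):

const req = https.request(optionsRequest, (res) => {
    res.setEncoding(configs.ENCODING);
    let fullBody = "";
    res.on(events.DATA, (chunk) => {
        // Collect every chunk; do not parse yet
        fullBody += chunk;
    });
    res.on("end", () => {
        try {
            // Only parse once the complete response has arrived
            resolve({
                body: JSON.parse(fullBody),
                statusCode: res.statusCode,
            });
        } catch (parseError) {
            reject(parseError);
        }
    });
});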
Using Node.js, I am trying to get an image from a URL and upload that image to another service without saving the image to disk. I have the following code, which works when saving the file to disk and using fs to create a readable stream. But as I am running this as a cron job on a read-only file system (webtask.io), I want to achieve the same result without saving the file to disk temporarily. Shouldn't that be possible?
request(image.Url)
    .pipe(
        fs
            .createWriteStream(image.Id)
            .on('finish', () => {
                client.assets
                    .upload('image', fs.createReadStream(image.Id))
                    .then(imageAsset => {
                        resolve(imageAsset)
                    })
            })
    )
Do you have any suggestions for how to achieve this without saving the file to disk? The upload client accepts the following:
client.asset.upload(type: 'file' | 'image', body: File | Blob | Buffer | NodeStream, options = {}): Promise<AssetDocument>
Thanks!
How about passing a buffer down to the upload function? As per the signature you posted, it will accept a Buffer.
As a side note: this keeps the whole file in memory for the duration of the method execution, so if you call this numerous times you might run out of resources.
request.get(url)
    .on('response', function (res) {
        var data = [];
        res.on('data', function (chunk) {
            data.push(chunk);
        }).on('end', function () {
            var buffer = Buffer.concat(data);
            // Pass the buffer to the upload client
            client.assets.upload('image', buffer);
        });
    });
I tried various libraries, and it turns out that node-fetch provides a way to return a buffer. So this code works:

fetch(image.Url)
    .then(res => res.buffer())
    .then(buffer => client.assets
        .upload('image', buffer, { filename: image.Id }))
    .then(imageAsset => {
        resolve(imageAsset)
    })
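One caveat worth noting: res.buffer() is a node-fetch v2 API; in node-fetch v3 it is deprecated in favor of res.arrayBuffer(), from which a Buffer can be built with Buffer.from(await res.arrayBuffer()).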
I know it has been a few years since the question was originally asked, but I ran into this problem recently, and since I didn't find an answer with a comprehensive example, I made one myself.
I'm assuming that the file path is a valid URL and that the end of it is the file name. I need to pass an API key to this endpoint, and a successful upload sends back a response with a token.
I'm using node-fetch and form-data as dependencies.
const fetch = require('node-fetch');
const FormData = require('form-data');

const secretKey = 'secretKey';

const downloadAndUploadFile = async (filePath) => {
    const fileName = new URL(filePath).pathname.split("/").pop();
    const endpoint = `the-upload-endpoint-url`;
    const formData = new FormData();
    let jsonResponse = null;
    try {
        const download = await fetch(filePath);
        const buffer = await download.buffer();
        if (!buffer) {
            console.log('file not found', filePath);
            return null;
        }
        formData.append('file', buffer, fileName);
        const response = await fetch(endpoint, {
            method: 'POST', body: formData, headers: {
                ...formData.getHeaders(),
                "Authorization": `Bearer ${secretKey}`,
            },
        });
        jsonResponse = await response.json();
    } catch (error) {
        console.log('error on file upload', error);
    }
    return jsonResponse ? jsonResponse.token : null;
}
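From inside an async function, usage is then a single call (the URL here is a placeholder):

const token = await downloadAndUploadFile('https://example.com/path/image.png');
if (token) {
    console.log('upload succeeded, token:', token);
}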
I am pulling down objects from S3. The objects are zipped, and I need to be able to unzip them and compare the contents with some strings. My problem is that I can't seem to get them properly unzipped. This is what I am seeing happen: S3 zipped -> over the wire -> to me as a JS Buffer -> ???
I am unsure of what to do next. I have seemingly tried everything, such as pako and lzutf8, to decompress the strings, but no dice.
Here is an attempt with lzutf8:

lzutf8.decompress(buffer, { outputEncoding: "String" }, (result, error) => {
    if (error) console.log(error);
    if (result) console.log(result);
});

Here is an attempt with pako:

pako.ungzip(buffer, { to: "string" }, (result, error) => {
    if (error) console.log(error);
    if (result) console.log(result);
})
pako throws an "incorrect header check", and lzutf8 silently does nothing.
I am not married to these libraries, so if there is anything else that will do the job, I am happy to try anything. I am guessing that my problem might have something to do with the encoding types? Not sure though.
Here is what the relevant part of my code looks like:
let pako = require('pako');
let streamBuffers = require('stream-buffers');

let ws = fs.createWriteStream(process.cwd() + 'path-to-file');
let rs = new streamBuffers.ReadableStreamBuffer();

objects.forEach((obj) => {
    console.log(obj);
    rs.on("data", (data) => {
        ws.write(pako.ungzip);
    })
    rs.push(obj);
})
You can create a readable stream from an object in S3 with the AWS SDK's createReadStream method and then pipe that through a zlib.Gunzip transform stream:
var zlib = require('zlib');
var s3 = new AWS.S3({apiVersion: '2006-03-01'});
var params = {Bucket: <bucket>, Key: <key>};
var file = require('fs').createWriteStream(<path/to/file>);
s3.getObject(params).createReadStream().pipe(zlib.createGunzip()).pipe(file);
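Since the question mentions comparing the unzipped contents against strings, the decompressed data can also be kept in memory instead of being piped to a file — a minimal sketch using the same zlib module and the SDK's callback form (the <bucket>/<key> placeholders match the example above, and 'some string' is a made-up search term):

s3.getObject({Bucket: <bucket>, Key: <key>}, (err, data) => {
    if (err) return console.error(err);
    // data.Body is a Buffer containing the gzipped object
    zlib.gunzip(data.Body, (gunzipErr, unzipped) => {
        if (gunzipErr) return console.error(gunzipErr);
        const contents = unzipped.toString('utf8');
        console.log(contents.includes('some string'));
    });
});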