Node.js fs.writeFileSync with options, a callback, and an error handler? - javascript

I looked at the Node.js documentation and couldn't find how to apply options (settings) together with a callback function for error handling. And I have to use .writeFileSync (not the asynchronous .writeFile):
const settings = {
    flags: 'w',
    encoding: null, // must be null
    mode: 0o666,
    autoClose: true // couldn't find this option in the docs
};
fs.writeFileSync(dest, buff, settings);
Before I used:
fs.writeFileSync(dest, buff, function (err) {
    if (err) {
        ...
    } else { ... console.log("OK") }
})
but I found that I have to apply the encoding: null option to prevent any modification of the source data (buff); otherwise the file can end up corrupted.
Edit:
After the amazing answers and explanations, I would like to say that I was confused by the Node.js documentation:
fs.writeFileSync(file, data[, options]) is
The synchronous version of fs.writeFile().
Since this is a version of the fs.writeFile() method, I thought it could have the same function signatures...
And here is my final version of the code, but it still has an issue with decoding binary files (they can be of any file type). (By the way, when I tried to use Axios.js I saw errors: "Request failed with Status Code 500".)
function download(url, dest, fileName, callback) {
    // import http from 'http';
    var request = http.get(url, function (response) {
        var bodyParts = [];
        var bytes = 0;
        response.on("data", function (c) {
            bodyParts.push(c);
            bytes += c.length;
        });
        response.on("end", function () {
            // flatten into one big buffer; Buffer.alloc replaces the deprecated new Buffer(size)
            var buff = Buffer.alloc(bytes);
            var copied = 0;
            for (var i = 0; i < bodyParts.length; i++) {
                bodyParts[i].copy(buff, copied, 0);
                copied += bodyParts[i].length;
            }
            const settings = {
                flag: 'w', // note: the option is "flag", not "flags"
                encoding: null, // leave the buffer as-is
                mode: 0o666
            };
            try {
                fs.writeFileSync(dest, buff, settings);
                let msgOK = {
                    filename: fileName,
                    status: 'OK',
                    text: `File downloaded successfully`
                };
                if (callback) callback(msgOK);
                console.log(msgOK.text);
                isLoading = false; // IMPORTANT!
            } catch (err) {
                console.error(err.stack || err.message);
                let msgErr = {
                    filename: fileName,
                    status: 'ERROR',
                    text: `Error in file downloading ${err.message}`
                };
                ERRORS.push(err);
                if (callback) callback(msgErr);
            }
        });
    });
}

The synchronous version of any file system method does not accept a callback; it will throw in case of error, so you should catch it.
When using the synchronous form any exceptions are immediately thrown.
You can use try/catch to handle exceptions or allow them to bubble up.
try {
    fs.writeFileSync(dest, buff);
    // You don't need a callback, the file has been saved here
} catch (e) {
    console.error(e);
}
There's no autoClose setting for fs.writeFileSync; the only available options are:
encoding <String> | <Null> default = 'utf8'
mode <Number> default = 0o666
flag <String> default = 'w'
Last but not least, you should update your Node version, since Node.js 4.x reaches end of life in less than a week (2018-04-30).
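If you do upgrade, note that newer Node versions (10+) also ship a promise-based API, so you can await the write instead of using the sync call. A minimal sketch, assuming Node 10+ and that dest and buff are the same values as above:
const fs = require('fs');

async function save(dest, buff) {
    try {
        // same options as the sync call; encoding is ignored when data is a Buffer
        await fs.promises.writeFile(dest, buff, { mode: 0o666, flag: 'w' });
        console.log('OK');
    } catch (err) {
        console.error(err);
    }
}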

fs.writeFileSync throws an error on failure,
so you can do:
try {
    fs.writeFileSync(dest, buff)
} catch (err) {
    // do something
}
and you wouldn't need a callback, because it's synchronous:
just put your code after the writeFileSync call, as in the sketch below.
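For example (a minimal sketch, with dest and buff as in the question):
fs.writeFileSync(dest, buff);
// if we reach this line, the write succeeded
console.log("OK");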

I suggest you use:
try {
    fs.writeFileSync(dest, buff, settings);
} catch (e) {
    // do your error handling here
}

Related

Nodejs: download a file into string via http, using async await syntax

How do I download a file into memory via http in nodejs, without the use of third-party libraries?
This answer solves a similar question, but I don't need to write the file to disk.
You can use the built-in http.get() and there's an example right in the nodejs http doc.
http.get('http://nodejs.org/dist/index.json', (res) => {
    const { statusCode } = res;
    const contentType = res.headers['content-type'];

    let error;
    // Any 2xx status code signals a successful response but
    // here we're only checking for 200.
    if (statusCode !== 200) {
        error = new Error('Request Failed.\n' +
                          `Status Code: ${statusCode}`);
    } else if (!/^application\/json/.test(contentType)) {
        error = new Error('Invalid content-type.\n' +
                          `Expected application/json but received ${contentType}`);
    }
    if (error) {
        console.error(error.message);
        // Consume response data to free up memory
        res.resume();
        return;
    }

    res.setEncoding('utf8');
    let rawData = '';
    res.on('data', (chunk) => { rawData += chunk; });
    res.on('end', () => {
        try {
            const parsedData = JSON.parse(rawData);
            console.log(parsedData);
        } catch (e) {
            console.error(e.message);
        }
    });
}).on('error', (e) => {
    console.error(`Got error: ${e.message}`);
});
This example assumes JSON content, but you can change the process in the end event handler to just treat the rawData as text and change the check for json contentType to whatever type you are expecting.
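Since the original question involved binary files, here is a variant of the same pattern that keeps the body as a Buffer instead of a string; this is a sketch using the standard Buffer.concat, with url assumed to be defined:
const http = require('http');

http.get(url, (res) => {
    const chunks = [];
    res.on('data', (chunk) => { chunks.push(chunk); });
    res.on('end', () => {
        // no setEncoding and no string conversion, so binary data stays intact
        const body = Buffer.concat(chunks);
        console.log(`received ${body.length} bytes`);
    });
}).on('error', (e) => {
    console.error(`Got error: ${e.message}`);
});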
FYI, this is somewhat lower level code and is not something I would normally use. You can encapsulate it in your own function (perhaps with a promise wrapped around it) if you really don't want to use third party libraries, but most people use higher level libraries for this purpose that just make the coding simpler. I generally use got() for requests like this and there is a list of other libraries (all promise-based) here.
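For comparison, a minimal sketch of the same request with got (this assumes a got version with CommonJS require and the responseType option, e.g. v11):
const got = require('got');

(async () => {
    try {
        // gzip decompression and JSON parsing are handled by the library
        const { body } = await got('http://nodejs.org/dist/index.json', { responseType: 'json' });
        console.log(body);
    } catch (err) {
        console.error(err.message);
    }
})();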

How to read json file from storage blob container with azure function using javascript?

I'm totally new to Azure and I would like to create an Azure function which will read the content of file.json from an Azure storage container.
Folder structure:
Storage account name: storageaccounttest
Container name: test
File name: file.json
File.json:
[
    {
        "name": "Kate",
        "age": "28"
    },
    {
        "name": "John",
        "age": "30"
    }
]
CORS on the storage account: GET enabled.
Environment variables added: process.env.AZURE_STORAGE_NAME, process.env.AZURE_STORAGE_KEY and process.env.AZURE_CONNECTION_STRING.
I'm using Visual Studio Code to deploy the function.
I installed the dependencies locally:
"dependencies": {
    "azure-storage": "^2.10.3",
    "dotenv": "^8.1.0"
}
I chose the JavaScript -> HttpTrigger fn -> anonymous options.
I'm using the getBlobToText fn.
My index.js:
var storage = require('azure-storage');
var blobService = storage.createBlobService();
var containerName = 'test';
var blobName = 'file.json';

module.exports = blobService.getBlobToText(
    containerName,
    blobName,
    function(err, blobContent) {
        if (err) {
            console.error("Couldn't download blob");
            console.error(err);
        } else {
            console.log("Successfully downloaded blob");
            console.log(blobContent);
        }
    });
Fn is deployed successfully, but I'm not able to see results.
After starting, the fn finishes with status 500, Internal Server Error; Console: No new trace in the past 1 min(s).
What did I do wrong?
Just a summary to help others who hit the same issue.
I think you need to use context.res to pass the blobContent value to the output response, as described in the official document Azure Functions JavaScript developer guide.
Here is my sample code, using a Promise, to solve it:
var azure = require('azure-storage');
var blobService = azure.createBlobService();
var containerName = 'test';
var blobName = 'file.json';

async function getBlobContent(containerName, blobName) {
    return new Promise((resolve, reject) => {
        blobService.getBlobToText(containerName, blobName, function(err, blobContent) {
            if (err) {
                reject(err);
            } else {
                resolve(blobContent);
            }
        });
    });
}

module.exports = async function (context, req) {
    await getBlobContent(containerName, blobName).then(
        function(content) {
            context.res = {
                headers: { "Content-Type": "application/json" },
                body: content
            };
        }, function(error) {
            context.res = {
                status: 400,
                body: error
            };
        }
    );
};
It works (the original answer included a screenshot of the result).
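As a side note, the azure-storage package has since been superseded by @azure/storage-blob. A sketch of the same read with the newer package, assuming the connection string is in AZURE_CONNECTION_STRING as in the question:
const { BlobServiceClient } = require('@azure/storage-blob');

module.exports = async function (context, req) {
    const serviceClient = BlobServiceClient.fromConnectionString(process.env.AZURE_CONNECTION_STRING);
    const blobClient = serviceClient.getContainerClient('test').getBlobClient('file.json');
    try {
        // downloadToBuffer reads the whole blob into memory (Node.js only)
        const buffer = await blobClient.downloadToBuffer();
        context.res = {
            headers: { "Content-Type": "application/json" },
            body: buffer.toString('utf-8')
        };
    } catch (err) {
        context.res = { status: 400, body: err.message };
    }
};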

AWS Lambda returns 'null' after going through a forEach loop of saving multiple files in s3 bucket

I have an AWS Lambda function. It goes through an array of URLs, takes a screenshot of each, and puts them in S3. How do I return the output of this function, the screenshotLinks array, which has all the links to the files saved in S3? I used the callback function at the end, but it just returns null! I want the callback to output all the S3 file links saved inside the screenshotLinks array.
exports.handler = (event, context, callback) => {
    desktopLinks.forEach(function (url, index) {
        https.request(url, function(res) {
            var data = new Stream();
            res.on('data', function(chunk) {
                // Aggregate chunks
                data.push(chunk);
            });
            res.on('end', function() {
                var body = data.read();
                // Once you have received all chunks, send to S3
                var currentLink = links[index];
                var linkAddress = encodeURIComponent(currentLink);
                var slashPosition = getPosition(currentLink, '/', 3) + 1;
                var linkName = currentLink.substr(slashPosition, currentLink.length);
                var params = {
                    Bucket: bucket,
                    Key: completeDate + '/screenshots/' + linkName + '.png',
                    Body: body
                };
                s3.putObject(params, function(err, data) {
                    if (err) {
                        console.error(err, err.stack);
                    } else {
                        bunch = params.Bucket + '/' + params.Key;
                        screenshotLinks.push(bunch);
                    }
                });
            });
        }).end();
    });
    callback(null, screenshotLinks);
};
Your code is event-driven / asynchronous, which means you are calling the callback before screenshotLinks has been populated.
The node http.ClientRequest.end() method finishes sending a request, but that doesn't mean that the response has been received and handled, as that is done by an asynchronous event handler. However, the callback is executed immediately after the call to request.end(), which is just after the request has been fired off, therefore screenshotLinks is empty.
You need to execute your callback from the callback you pass to s3.putObject. I suggest you pass your callback a response/result object that indicates whether the putObject succeeded and contains the url it relates to and either an error message or a screenshotLink, e.g. something like this:
s3.putObject(params, function(err, data) {
    var s3Response = {}; // initialize the object before assigning properties
    s3Response.url = url;
    if (err) {
        s3Response.success = false;
        s3Response.error = err;
        console.error(err, err.stack);
    } else {
        bunch = params.Bucket + '/' + params.Key;
        s3Response.success = true;
        s3Response.screenshotLink = bunch;
    }
    // this is the Lambda handler's callback from the enclosing scope
    callback(null, s3Response);
});
I would like to suggest you use the Node 8.10 runtime.
ref: https://aws.amazon.com/blogs/compute/node-js-8-10-runtime-now-available-in-aws-lambda/
Then your entry point should be:
exports.<function_name> = async (event) => {}; // Lambda's Node.js runtime expects a CommonJS export
Then:
let s3 = new AWS.S3({ region: process.env.AWS_REGION, apiVersion: '2006-03-01' });
let params = {
    Bucket: /* a path to bucket (string) */,
    Key: name /* string */,
    Body: /* supported types (Buffer, Typed Array, Blob, String, ReadableStream) */,
    ACL: 'public-read',
    ContentType: 'image/png'
};
try {
    let s3Response = await s3.upload(params).promise();
    // if it succeeded
    console.log(`File uploaded to S3 at ${s3Response.Bucket} bucket. File location: ${s3Response.Location}`);
} catch (ex) { // if an error occurred
    console.error(ex);
}
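To actually get the whole screenshotLinks array back from an async handler, you can wrap each download-and-upload in a promise and wait for all of them. A rough sketch, reusing desktopLinks, bucket and completeDate from the question; fetchScreenshot is a hypothetical helper that resolves with the image Buffer for a URL:
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

exports.handler = async (event) => {
    const screenshotLinks = await Promise.all(desktopLinks.map(async (url, index) => {
        const body = await fetchScreenshot(url); // hypothetical helper
        const params = {
            Bucket: bucket,
            Key: completeDate + '/screenshots/' + index + '.png',
            Body: body
        };
        await s3.putObject(params).promise();
        return params.Bucket + '/' + params.Key;
    }));
    // the resolved array becomes the Lambda response; no callback needed
    return screenshotLinks;
};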

Uncompress Content-Encoding: gzip with javascript/node [duplicate]

How do I unzip a gzipped body in a request's module response?
I have tried several examples around the web but none of them appear to work.
request(url, function(err, response, body) {
    if(err) {
        handleError(err)
    } else {
        if(response.headers['content-encoding'] == 'gzip') {
            // How can I unzip the gzipped string body variable?
            // For instance, this url:
            // http://highsnobiety.com/2012/08/25/norse-projects-fall-2012-lookbook/
            // Throws error:
            // { [Error: incorrect header check] errno: -3, code: 'Z_DATA_ERROR' }
            // Yet, browser displays page fine and debugger shows it's gzipped
            // And unzipped by browser fine...
            if(response.headers['content-encoding'] && response.headers['content-encoding'].toLowerCase().indexOf('gzip') > -1) {
                var body = response.body;
                zlib.gunzip(response.body, function(error, data) {
                    if(!error) {
                        response.body = data.toString();
                    } else {
                        console.log('Error unzipping:');
                        console.log(error);
                        response.body = body;
                    }
                });
            }
        }
    }
})
I couldn't get request to work either, so I ended up using http instead.
var http = require("http"),
    zlib = require("zlib");

function getGzipped(url, callback) {
    // buffer to store the streamed decompression
    var buffer = [];
    http.get(url, function(res) {
        // pipe the response into the gunzip to decompress
        var gunzip = zlib.createGunzip();
        res.pipe(gunzip);
        gunzip.on('data', function(data) {
            // decompression chunk ready, add it to the buffer
            buffer.push(data.toString())
        }).on("end", function() {
            // response and decompression complete, join the buffer and return
            callback(null, buffer.join(""));
        }).on("error", function(e) {
            callback(e);
        })
    }).on('error', function(e) {
        callback(e)
    });
}

getGzipped(url, function(err, data) {
    console.log(data);
});
Try adding encoding: null to the options you pass to request; this avoids converting the downloaded body to a string and keeps it in a binary buffer.
Like #Iftah said, set encoding: null.
Full example (less error handling):
var request = require('request');
var zlib = require('zlib');

request(url, { encoding: null }, function(err, response, body) {
    if(response.headers['content-encoding'] == 'gzip') {
        zlib.gunzip(body, function(err, dezipped) {
            callback(dezipped.toString());
        });
    } else {
        callback(body);
    }
});
Actually, the request module handles the gzip response. In order to tell the request module to decode the body argument in the callback function, we have to set gzip to true in the options. Let me explain with an example.
Example:
var opts = {
    uri: 'some uri which returns gzip data',
    gzip: true
}
request(opts, function (err, res, body) {
    // now body and res.body both will contain decoded content.
})
Note: the data you get in the 'response' event is not decoded.
This works for me. Hope it works for you guys too.
A similar problem we usually run into while working with the request module is JSON parsing. If you want the request module to automatically parse the body and provide JSON content in the body argument, then you have to set json to true in the options.
var opts = {
    uri: 'some uri that provides json data',
    json: true
}
request(opts, function (err, res, body) {
    // body and res.body will contain json content
})
Reference: https://www.npmjs.com/package/request#requestoptions-callback
As seen in https://gist.github.com/miguelmota/9946206:
Both request and request-promise handle it out of the box as of Dec 2017:
var request = require('request')
request(
    { method: 'GET'
    , uri: 'http://www.google.com'
    , gzip: true
    }
    , function (error, response, body) {
        // body is the decompressed response body
        console.log('server encoded the data as: ' + (response.headers['content-encoding'] || 'identity'))
        console.log('the decoded data is: ' + body)
    }
)
I have formulated a more complete answer after trying the different ways to gunzip, and solving errors to do with encoding.
Hope this helps you too:
var request = require('request');
var zlib = require('zlib');

var options = {
    url: 'http://some.endpoint.com/api/',
    headers: {
        'X-some-headers': 'Some headers',
        'Accept-Encoding': 'gzip, deflate',
    },
    encoding: null
};

request.get(options, function (error, response, body) {
    if (!error && response.statusCode == 200) {
        // If response is gzip, unzip first
        var encoding = response.headers['content-encoding']
        if (encoding && encoding.indexOf('gzip') >= 0) {
            zlib.gunzip(body, function(err, dezipped) {
                var json_string = dezipped.toString('utf-8');
                var json = JSON.parse(json_string);
                // Process the json..
            });
        } else {
            // Response is not gzipped
        }
    }
});
Here is my two cents worth. I had the same problem and found a cool library called concat-stream:
let request = require('request');
const zlib = require('zlib');
const concat = require('concat-stream');

request(url)
    .pipe(zlib.createGunzip())
    .pipe(concat(stringBuffer => {
        console.log(stringBuffer.toString());
    }));
Here's a working example (using the request module for node) that gunzips the response:
function gunzipJSON(response) {
    var gunzip = zlib.createGunzip();
    var json = "";
    gunzip.on('data', function(data) {
        json += data.toString();
    });
    gunzip.on('end', function() {
        parseJSON(json);
    });
    response.pipe(gunzip);
}
Full code: https://gist.github.com/0xPr0xy/5002984
I'm using node-fetch. I was getting response.body, but what I really wanted was await response.text().
With got, a request alternative, you can simply do:
got(url).then(response => {
    console.log(response.body);
});
Decompression is handled automagically when needed.
I used the gunzipSync convenience method in nodejs to decompress the body. This avoids working with callbacks.
import * as zlib from "zlib";
const uncompressedBody: string = zlib.gunzipSync(body).toString("utf-8");
(in TypeScript)

