I'm receiving events in JSON format via a POST route, and I would like to save these events in a file such as 'example.json' and be able to query it.
I tried using writeFileSync, but it rewrites the entire file. With the flag {flag: 'a+'} I was able to save more than one record, but when I try to require 'example.json', I get the error 'Unexpected token { in JSON'.
It works fine when the file has only one record, but throws the error once a second one has been appended.
Code:
const fs = require('fs');
const filePath = './example.json';
const file = require('./example.json');

app.post('/events', (request, response) => {
  response.send(request.body);
  const contentString = JSON.stringify(request.body);
  return fs.writeFileSync(filePath, contentString, { flag: 'a+' });
});
example.json that works:
{"type":"call.new","call_id":"71252742562.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}
example.json (with two records) that stops working:
{"type":"call.new","call_id":"71252742562.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}{"type":"call.ongoing","call_id":"71252731962.40019","code":"h9e8j7c0tl0j5eexi07sy6znfd1ponj4","direction":"inbound","our_number":"1130900336","their_number":"11999990000","their_number_type":"mobile","timestamp":"2020-04-01T00:00:00Z"}
How can I write this JSON in a form that avoids the error above and can still be loaded with require?
Could someone help me, please?
Try to read the JSON file, parse it, add new elements to the array and then overwrite the file.
const fs = require("fs");
const path = require("path");

const FILE_PATH = path.join(__dirname, "./elements.json");

// Read the current contents and parse them into an array
const file = fs.readFileSync(FILE_PATH);
const elements = JSON.parse(file);

// Append the new element and overwrite the file with the updated array
const newElement = { id: Date.now() };
const updatedElements = [...elements, newElement];
fs.writeFileSync(FILE_PATH, JSON.stringify(updatedElements));
See more here: https://nodejs.org/api/fs.html#fsappendfilesyncpath-data-options
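If the goal is to keep example.json as a single valid JSON array that can still be required or re-parsed, the same read–parse–append–rewrite idea could be wired into the POST route from the question roughly like this. This is only a sketch: it assumes an Express app with the JSON body parser enabled, and a file that either doesn't exist yet or already holds an array.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
app.use(express.json()); // so request.body is the parsed JSON payload

const FILE_PATH = path.join(__dirname, 'example.json');

app.post('/events', (request, response) => {
  // Load the existing array, or start a new one if the file isn't there yet
  let events = [];
  if (fs.existsSync(FILE_PATH)) {
    events = JSON.parse(fs.readFileSync(FILE_PATH, 'utf-8'));
  }

  // Append the new event and rewrite the whole file as one valid JSON array
  events.push(request.body);
  fs.writeFileSync(FILE_PATH, JSON.stringify(events, null, 2));

  response.send(request.body);
});

app.listen(3000);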
I am using the @aws-sdk/client-s3 package to read a JSON file from S3, take the contents and dump it into DynamoDB. This all currently works fine using:
const data = await (await new S3Client(region).send(new GetObjectCommand(bucketParams)));
And then deserialising the response body etc.
However, I'm looking to migrate to the jsonlines format, which is effectively CSV-like in the sense that it needs to be streamed in line by line, or in chunks of lines, and processed. I can't seem to find a way of doing this that doesn't load the entire file into memory (using response.text() etc.).
Ideally, I would like to pipe the response into a createReadStream, and go from there.
I found this example with createReadStream() from the fs module in Node.js:
import fs from 'fs';

function read() {
  let data = '';
  const readStream = fs.createReadStream('business_data.csv', 'utf-8');
  readStream.on('error', (error) => console.log(error.message));
  readStream.on('data', (chunk) => data += chunk);
  readStream.on('end', () => console.log('Reading complete'));
}

read();
You can modify it for your use. Hope this helps.
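Since the goal is jsonlines, the same stream can also be fed into Node's readline module so each line (one JSON record) is handled as it arrives instead of accumulating everything in a string. A rough sketch, reusing the file name from the example above:
const fs = require('fs');
const readline = require('readline');

async function readLines() {
  const input = fs.createReadStream('business_data.csv', 'utf-8');
  const rl = readline.createInterface({ input, crlfDelay: Infinity });

  // for await...of yields one line at a time as the stream is read
  for await (const line of rl) {
    if (!line.trim()) continue;       // skip blank lines
    const record = JSON.parse(line);  // in jsonlines, each line is one JSON object
    // process `record` here
  }
}

readLines();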
You can connect to your S3 bucket like this (v2 SDK syntax):
var s3 = new AWS.S3({apiVersion: '2006-03-01'});
var params = {Bucket: 'myBucket', Key: 'myImageFile.jpg'};
var file = require('fs').createWriteStream('/path/to/file.jpg');
s3.getObject(params).createReadStream().pipe(file);
see here
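If you are on the v3 SDK (@aws-sdk/client-s3) that the question uses, the Body returned for a GetObjectCommand is a Readable stream in Node.js, so it can be piped much like the v2 example above. A sketch with placeholder bucket, key, region and path values (stream/promises needs Node 15+):
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { createWriteStream } = require('fs');
const { pipeline } = require('stream/promises');

async function download() {
  const s3 = new S3Client({ region: 'us-east-1' }); // assumed region
  const { Body } = await s3.send(
    new GetObjectCommand({ Bucket: 'myBucket', Key: 'data.jsonl' }) // placeholder names
  );

  // Body is a Readable stream in Node, so it can be written to disk chunk by chunk,
  // or passed as the `input` of a readline interface to process it line by line.
  await pipeline(Body, createWriteStream('/path/to/data.jsonl'));
}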
I am currently trying to download an image from GitHub locally. Everything seems to work; the fetch goes through with a 200 OK response. However, I don't understand how to store the image itself:
const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png"
const folder = "/Folder"
const imageName = "/Test"
const imageResponse = await axios.get(rawGitLink)
fs.writeFileSync(___dirname + folder + imageName, imageResponse, (err) => {
//Error handling
}
)
Four problems had to be fixed:
The image name must include the .png extension in this case
The response must be requested in the correct format, as a buffer, for an image
You must write the response data, not the response object itself
__dirname only needs two underscores
const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png"
const folder = "/Folder"
const imageName = "/Test.png"
const imageResponse = await axios.get(rawGitLink, { responseType: 'arraybuffer' });
fs.writeFileSync(__dirname + folder + imageName, imageResponse.data)
Axios returns a special object: https://github.com/axios/axios#response-schema
let {data} = await axios.get(...)
await fs.writeFile(filename, data) // you can use fs.promises instead of sync
As @Leau said, you should include the extension in the filename.
Another suggestion is to use the path module to build the filename:
filename = path.join(__dirname, "/Folder", "Test.png")
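Putting the suggestions together (arraybuffer response, writing only response.data, and building the path with path.join), a rough sketch could look like this; the folder and file names are taken from the question and the target directory is assumed to already exist:
const axios = require('axios');
const fs = require('fs').promises;
const path = require('path');

async function downloadImage() {
  const rawGitLink = "https://raw.githubusercontent.com/cardano-foundation/CIPs/master/CIP-0001/CIP_Flow.png";
  const filename = path.join(__dirname, "Folder", "Test.png"); // "Folder" must already exist

  // Request the image as raw bytes, then write only the response data
  const { data } = await axios.get(rawGitLink, { responseType: 'arraybuffer' });
  await fs.writeFile(filename, data);
}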
I have an Azure function and a file called configAPI.json which are located in the same folder, as shown in the image below.
I want to read the latter with the following code, based on the post "How can i read a Json file with a Azure function-Node.js", but it isn't working at all: when I try to check whether there's any content in the configAPI variable, I get undefined:
module.exports = async function (context, req) {
const fs = require('fs');
const path = context.executionContext.functionDirectory + '//configAPI.json';
configAPI= fs.readFile(path, 'utf-8', function(err, data){
if (err) {
context.log(err);
}
var result = JSON.parse(data);
return result
});
for (let file_index=0; file_index<configAPI.length; file_index++){
// do something
}
context.log(configAPI);
}
What am I missing in the code to make sure I can read the file and use it in a variable in my loop?
functionDirectory gives you the path to your functions app, and inside it you then have your single function's folder.
I think you should do:
const path = context.executionContext.functionDirectory + '\\configAPI.json';
In case you want to parse your JSON file, you should have:
const file = JSON.parse(fs.readFileSync(context.executionContext.functionDirectory + '\\configAPI.json'));
PS: context also has the variable functionName, so another option to experiment with would be:
const path = context.executionContext.functionDirectory +
    '\\' + context.executionContext.functionName + '\\configAPI.json';
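For reference, the whole function with the synchronous read and path.join might look roughly like this. It is only a sketch: it assumes configAPI.json sits in the function's own folder and contains a JSON array.
const fs = require('fs');
const path = require('path');

module.exports = async function (context, req) {
    // Build the path relative to this function's folder instead of hard-coding separators
    const filePath = path.join(context.executionContext.functionDirectory, 'configAPI.json');
    const configAPI = JSON.parse(fs.readFileSync(filePath, 'utf-8'));

    for (let file_index = 0; file_index < configAPI.length; file_index++) {
        // do something with configAPI[file_index]
    }

    context.log(configAPI);
};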
I am trying to parse a YAML file. I was able to parse the file properly but the comments in the YAML file are not getting read. Is there any way to do it? Attaching the parser code and config.json. Also attaching the screenshot of the file and output for reference.
var fs= require('fs');
var path= require('path');
var yaml = require('js-yaml')
var fname= "config.json"
var jPath= path.join(__dirname,"..","ConfigGen","Config",fname);
var jsString= fs.readFileSync(jPath, 'utf8')
// Get path for files from Config file
var tType= "cto" //Get this from input
var pth= JSON.parse(jsString)[tType] //perform error handling
var cType = "jbod" //Get this from input
//Use that path
fs.readdir(pth, function(err, files) {
  files.forEach(function(file) {
    fName = cType + "_" + tType + "_uut.yaml-example";
    if (file == fName) {
      var flContent = fs.readFileSync(path.join(pth, file), "utf8")
      // return path.join from here and use the next part in a separate function
      var data = yaml.safeLoad(flContent)[0][0]
      console.log(data)
      for (var index in data) {
        var prefix = index
        for (idx in data[index]) {
          //console.log(prefix, idx, data[prefix][idx])
        }
      }
    }
  })
})
Reiterating flyx's comment, according to the YAML spec on comments:
Comments are a presentation detail and must not be used to convey content information.
So assuming you're not going to be able to correlate the comments to any adjacent fields, you can just read the whole file as a string and match against any characters after a #
You can read the file and match the comments with a regex like this:
var { promises: fs } = require('fs');
(async () => {
  let path = "./config.yaml"
  let file = await fs.readFile(path, "utf8")
  let matches = file.matchAll(/#.*/g)
  let comments = [...matches].map(m => m[0])
  console.log(comments)
})()
If you have a yaml file that looks like this:
# block comment
env: dev
prop: value # inline comment
It will log the following:
[ '# block comment', '# inline comment' ]
Code
const Xray = require('x-ray');
const xray = Xray();
// How do I read a file, rather than a URL?
const url = 'https://www.tel-o-fun.ga/';
xray(url, '.marker')((err, value) => {
console.log(value);
});
My goal
I am using x-ray to scrape some data from a website. For testing and development purposes, I would like to parse data from a local file rather than a remote resource.
How do I load a local file into x-ray, instead of pointing it to a remote URL?
This example from the x-ray repo solved my problem. Simply pass an HTML string instead of a URL:
const path = require('path');
const Xray = require('x-ray');
const read = require('fs').readFileSync;
const html = read(path.resolve(__dirname, 'index.html'));
const xray = Xray();
xray(html, '.marker')((err, value) => {
console.log(value);
});