How to append an item to a JSON array stored in a file? - javascript

I'm trying to write a JSON object to a DB file with fs, but it's not working as expected. What I want to do is write the JSON data inside this: []. Example:
[
  {"name":"jhon"},
  {"name":"jhonathan"}
]
Here is the code:
Thanks.

The comment you provided doesn't make much sense, because the JSON data you provided in the post and in the comments is completely different. But I get the gist of it: I guess you have a JSON file containing an array, and you want to push new items to it. Let's do this.
The thing is, when you call fs.appendFile, you're only appending raw text to the end of the file; the result no longer follows the JSON format.
You need to do this:
Read file content.
Parse JSON text into an object.
Update the object in memory.
Write object in JSON format back to the file system.
I'll use the synchronous methods for simplicity's sake, but you should be able to convert this to async quite easily.
const fs = require('fs')
const path = require('path')

// __dirname + 'db.json' would be missing the path separator
const dbPath = path.join(__dirname, 'db.json')

// Reading items from the file system
const jsonData = fs.readFileSync(dbPath, 'utf8')
const items = JSON.parse(jsonData)

// Add the new item to the item list (newItem is whatever object you want to store)
items.push(newItem)

// Writing the updated array back to the file system
const newJsonString = JSON.stringify(items)
fs.writeFileSync(dbPath, newJsonString)
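For reference, here's a minimal async sketch of the same read-modify-write cycle using the fs.promises API (assuming the same dbPath as above):

const fsp = require('fs').promises

async function addItem(newItem) {
  // Same three steps: read, update in memory, write back
  const jsonData = await fsp.readFile(dbPath, 'utf8')
  const items = JSON.parse(jsonData)
  items.push(newItem)
  await fsp.writeFile(dbPath, JSON.stringify(items))
}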

Related

Reading changes in JSON file using JS

I'm trying to read a continuously updated JSON file from syslog-ng. Currently syslog-ng, a logging tool, is set to continually append logs of the data I want to a JSON file. I'm displaying the data on my cyber attack map for only 30 seconds, until it's no longer needed. I can read the file and parse what I need, but is there a way to, over time, read and parse only the most recent additions to the file?
Sample code:
//Assume JSON output = {attack source, attack destination, attack type}
//Required modules
var JSONStream = require('JSONStream');
var fs = require('fs');

//Create a readable stream for JSON file parsing
var stream = fs.createReadStream('output.json', 'utf8'),
    parser = JSONStream.parse(['source', 'dest', 'type']);

//Send read data to the parser
stream.pipe(parser);

//Consume objects emitted by the parser
parser.on('data', function (obj) {
  //Do something with the object
  console.log(obj);
});
I'm using JSONStream to avoid having to read the whole log file into memory; JSONStream should still be able to parse the bits I want, but is there a method to read only the changes after the original read is complete?
Use the code example provided in the library (see the JSONStream test code). You don't have to wait for the end of the stream; you can use the 'data' callback to do your work object by object. But the file structure has to suit the library's expectations, like the files given in the test folder - see the example file all_npm.json.
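If you also need to pick up entries appended after the first read, there's no built-in "read only the changes" mode, but you can track the byte offset you've already consumed and re-read from there. A minimal sketch, assuming syslog-ng writes one complete JSON object per line (the file name output.json is taken from the question):

var fs = require('fs');

var filePath = 'output.json';
var offset = 0; // bytes already consumed

// Poll the file and read only the newly appended bytes
function readNewEntries() {
  fs.stat(filePath, function (err, stats) {
    if (err || stats.size <= offset) return;
    var stream = fs.createReadStream(filePath, { start: offset, encoding: 'utf8' });
    var buffered = '';
    stream.on('data', function (chunk) { buffered += chunk; });
    stream.on('end', function () {
      offset = stats.size;
      buffered.split('\n').forEach(function (line) {
        if (!line.trim()) return;
        try {
          console.log(JSON.parse(line)); // handle each new entry here
        } catch (e) { /* ignore a partially written last line */ }
      });
    });
  });
}

setInterval(readNewEntries, 1000); // check once a second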

Save/Load Variables in js

I'm trying to create a save/load function for my game in JS, but I have basically no idea how to go about doing this. I can save variables to a JSON file or localStorage, but I don't know how to load them back into the program. I'm also pretty sure I'm exporting variables the wrong way as well. Any help?
Normally, I use the JSON format to store and read data (of any type).
To save data (using the key gamedata as an example):
var myData = {
  name: 'David',
  score: 10
};
localStorage.setItem('gamedata', JSON.stringify(myData));
** Without JSON.stringify, your data will be saved as the string [object Object].
To retrieve the data back:
var savedData = localStorage.getItem('gamedata'); // savedData is a string
var myData = JSON.parse(savedData); // parse the JSON string back into a JavaScript object
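One detail worth guarding against (my addition, not part of the original answer): getItem returns null when nothing has been saved yet, so a load with a fallback might look like this:

var savedData = localStorage.getItem('gamedata');
// Fall back to default values on the very first run
var myData = savedData ? JSON.parse(savedData) : { name: '', score: 0 };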
Set up a bin on www.myJSON.com. p5 has built-in functionality for AJAX requests, such as loadJSON. That way it's not in local storage, and you can access your data if you have it on GitHub. I know your struggle - I used to deal with this sort of issue myself before I found myJSON.

load JSON with jQuery

Is it possible to load a JSON file once, and have further calls use the already-loaded JSON (e.g. from a cache)?
The easiest way to achieve this is to use the browser's localStorage.
Let's assume you have an object named object.
You can store it this way:
// convert the object to a JSON string, then store it.
let jsonObject = JSON.stringify(object);
localStorage.setItem('myObject', jsonObject);
And if you want to use it again, do it this way:
// load it from storage and parse the JSON back into an object.
let jsonObject = localStorage.getItem('myObject');
let object = null;
if (jsonObject) {
  object = JSON.parse(jsonObject);
}
Then change it according to your needs and store it again.
You can delete it with:
localStorage.removeItem('myObject');
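Since the question mentions jQuery specifically, another option is to cache the request itself rather than the parsed data, so every caller shares a single fetch. A minimal sketch (data.json is a placeholder URL):

var jsonPromise = null;

function loadJsonOnce() {
  if (!jsonPromise) {
    jsonPromise = $.getJSON('data.json'); // fired only on the first call
  }
  return jsonPromise;
}

// Every subsequent caller reuses the same request/result:
loadJsonOnce().done(function (data) {
  console.log(data);
});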
There are so many ways:
You can store your JSON file in the assets folder and read it like this - https://stackoverflow.com/a/19945484/713778
You can store it in the res/raw folder and read it as shown here - https://stackoverflow.com/a/6349913/713778
For basic JSON parsing, Android's built-in JSONObject should work - https://developer.android.com/reference/org/json/JSONObject.html
For more advanced JSON parsing (JSON-Java mapping), you can look at the GSON library - https://code.google.com/p/google-gson/

Transforming JSON in a node stream with a map or template

I'm relatively new to JavaScript and Node, and I like to learn by doing, but my lack of familiarity with JavaScript design patterns makes me wary of trying to reinvent the wheel. I'd like to know from the community whether what I want to do already exists in some form or another. I'm not looking for specific code for the example below, just a nudge in the right direction and what I should be searching for.
I basically want to create my own private IFTTT/Zapier for plugging data from one API to another.
I'm using the node module request to GET data from one API and then POST to another.
request supports streaming to do neat things like this:
request.get('http://example.com/api')
.pipe(request.put('http://example.com/api2'));
In between those two requests, I'd like to pipe the JSON through a transform, cherry picking the key/value pairs that I need and changing the keys to what the destination API is expecting.
request.get('http://example.com/api')
.pipe(apiToApi2Map)
.pipe(request.put('http://example.com/api2'));
Here's a JSON sample from the source API: http://pastebin.com/iKYTJCYk
And this is what I'd like to send forward: http://pastebin.com/133RhSJT
The transformed JSON in this case takes the keys from the value of each object's "attribute" key and the value from each object's "value" key.
So my questions:
Is there a framework, library or module that will make the transform step easier?
Is streaming the way I should be approaching this? It seems like an elegant way to do it: I've already created some JavaScript wrapper functions with request to easily access API methods, so I just need to figure out the middle step.
Would it be possible to create "templates" or "maps" for these transforms? Say I want to change the source or destination API, it would be nice to create a new file that maps the source to destination key/values required.
Hope the community can help and I'm open to any and all suggestions! :)
This is an Open Source project I'm working on, so if anyone would like to get involved, just get in touch.
Yes, you're definitely on the right track. There are two stream libs I would point you towards: through, which makes it easier to define your own streams, and JSONStream, which helps convert a binary stream (like what you get from request.get) into a stream of parsed JSON documents. Here's an example using both of those to get you started:
var through = require('through');
var request = require('request');
var JSONStream = require('JSONStream');
var _ = require('underscore');

// Our function(doc) here will get called to handle each
// incoming document in the attributes array of the JSON stream
var transformer = through(function (doc) {
  var steps = _.findWhere(doc.items, {
    label: "Steps"
  });
  var activeMinutes = _.findWhere(doc.items, {
    label: "Active minutes"
  });
  var stepsGoal = _.findWhere(doc.items, {
    label: "Steps goal"
  });
  // Push the transformed document into the outgoing stream
  this.queue({
    steps: steps.value,
    activeMinutes: activeMinutes.value,
    stepsGoal: stepsGoal.value
  });
});

request
  .get('http://example.com/api')
  // The attributes.* here will split the JSON stream into chunks
  // where each chunk is an element of the array
  .pipe(JSONStream.parse('attributes.*'))
  .pipe(transformer)
  .pipe(request.put('http://example.com/api2'));
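To address the "templates"/"maps" part of the question (point 3), here's a sketch of the same transform driven by a plain mapping object, so switching APIs only means loading a different map. The labels are taken from the example above; everything else is illustrative:

var mapApiToApi2 = {
  'Steps': 'steps',
  'Active minutes': 'activeMinutes',
  'Steps goal': 'stepsGoal'
};

function makeMapper(map) {
  return through(function (doc) {
    var out = {};
    doc.items.forEach(function (item) {
      var targetKey = map[item.label];
      if (targetKey) out[targetKey] = item.value;
    });
    // Push the remapped document downstream
    this.queue(out);
  });
}

// Usage: swap `transformer` above for makeMapper(mapApiToApi2)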
As Andrew pointed out, there's through or event-stream; however, I made something even easier to use: scramjet. It works the same way as through, but its API is nearly identical to arrays, so you can use the map and filter methods easily.
The code for your example would be:
const { DataStream } = require('scramjet');

DataStream
  .pipeline(
    request.get('http://example.com/api'),
    JSONStream.parse('attributes.items.*')
  )
  .filter((item) => item.attribute) // filter out items without an attribute
  .reduce((acc, item) => {
    acc[item.attribute] = item.value;
    return acc;
  }, {})
  .then((result) => request.put('http://example.com/api2', result));
I guess this is a little easier to use - however, in this example you do accumulate the data into an object, so if the JSONs are actually much longer than this, you may want to turn the result back into a JSONStream again.
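If you do want to keep it streaming end to end, one possible shape (a sketch, not tested against a real API) is to map each item to a small object and serialize with JSONStream.stringify instead of accumulating:

DataStream
  .pipeline(
    request.get('http://example.com/api'),
    JSONStream.parse('attributes.items.*')
  )
  .filter((item) => item.attribute)
  .map((item) => ({ [item.attribute]: item.value }))
  .pipe(JSONStream.stringify()) // emits one JSON array containing each mapped object
  .pipe(request.put('http://example.com/api2'));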

How do I get CSV data into NetSuite?

I've got an update to my question.
What I really wanted to know was this:
How do I get CSV data into NetSuite?
Well, it seems I use the CSV import tool to create a mapping, and use this call to import the CSV: nlapiSubmitCSVImport(nlobjCSVImport).
Now my question is: how do I iterate through the object?!
That gets me halfway - I get the CSV data, but I can't seem to find out how to iterate through it in order to manipulate the data. This is, of course, the whole point of a scheduled script.
This is really driving me mad.
@Robert H
I can think of a million reasons why you'd want to import data from a CSV. Billing, for instance, or the various reports on data any company keeps - I wouldn't want to keep this in the file cabinet, nor would I really want to keep the file at all. I just want the data. I want to manipulate it and I want to enter it.
Solution Steps:
To upload a CSV file, we have to use a Suitelet script.
(Note: file - This field type is available only for Suitelets and will appear on the main tab of the Suitelet page. Setting the field type to file adds a file upload widget to the page.)
var fileField = form.addField('custpage_file', 'file', 'Select CSV File');
// In the Suitelet's POST handler, grab the uploaded file and save it to the file cabinet
var file = request.getFile('custpage_file');
var fileId = nlapiSubmitFile(file);
Let's prepare to call a Restlet script and pass the file id to it.
var recordObj = new Object();
recordObj.fileId = fileId;
// Format input for Restlets for the JSON content type
var recordText = JSON.stringify(recordObj);//stringifying JSON
// Setting up the URL of the Restlet
var url = 'https://rest.na1.netsuite.com/app/site/hosting/restlet.nl?script=108&deploy=1';
// Setting up the headers for passing the credentials
var headers = {};
headers['Content-Type'] = 'application/json';
headers['Authorization'] = 'NLAuth nlauth_email=amit.kumar2@mindfiresolutions.com, nlauth_signature=*password*, nlauth_account=TSTDRV****, nlauth_role=3';
(Note: nlapiCreateCSVImport: This API is only supported for bundle installation scripts, scheduled scripts, and RESTlets)
Let's call the Restlet using nlapiRequestURL:
// Calling Restlet
var output = nlapiRequestURL(url, recordText, headers, null, "POST");
Create a mapping using Import CSV records available at Setup > Import/Export > Import CSV records.
Inside the Restlet script, fetch the file id from the Restlet parameters. Use the nlapiCreateCSVImport() API and set its mapping to the mapping id created in step 3, then set the CSV file using the setPrimaryFile() function.
var primaryFile = nlapiLoadFile(datain.fileId);
var job = nlapiCreateCSVImport();
job.setMapping(mappingFileId); // Set the mapping
// Set File
job.setPrimaryFile(primaryFile.getValue()); // Fetches the content of the file and sets it.
Submit using nlapiSubmitCSVImport().
nlapiSubmitCSVImport(job); // We are done
There is another way to get around this, although it's neither preferable nor one I would suggest, as it consumes a lot of governance units if you have a large number of records in your CSV file.
Let's say we don't want to use the nlapiCreateCSVImport API, and continue from step 4.
Just fetch the file Id as we did earlier, load the file, and get its contents.
var fileContent = primaryFile.getValue();
Split the file into lines, then split each line into its comma-separated values.
var splitLine = fileContent.split("\n"); // Split the file contents into lines
for (var lines = 1; lines < splitLine.length; lines++) {
  var words = splitLine[lines].split(","); // words holds all the values on this line
  for (var word = 0; word < words.length; word++) {
    nlapiLogExecution("DEBUG", "Words:", words[word]);
  }
}
Note: Make sure you don't have an additional blank line in your CSV file.
Finally create the record and set field values from the array that we created above.
var myRec = nlapiCreateRecord('cashsale'); // Here you create the record of your choice
myRec.setFieldValue('entity', arrCustomerId[i]); // For example, arrCustomerId is an array of customer ID.
var submitRec = nlapiSubmitRecord(myRec); // and we are done
Fellow NetSuite user here - I've been using SuiteScripts for a while now but never saw the nlobjCSVImport object nor nlapiSubmitCSVImport. I looked in the documentation; they show up, but there is no page describing the details. Care to share where you got the doc from?
With the doc for the CSVImport object I might be able to provide some more help.
P.S. I tried posting this message as a comment, but the "Add comment" link didn't show up for some reason. Still new to SOF.
CSV to JSON:
convert csv file to json object datatable
https://code.google.com/p/jquery-csv/
If you know the structure of the CSV file, just do a for loop and map the fields to the corresponding nlapiSetValue.
Should be pretty straightforward.
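A hedged sketch of that loop, assuming a two-column CSV (customerId,memo) with a header row, and reusing the fileContent and cash sale record from the answer above - the column layout and field ids are illustrative:

var lines = fileContent.split('\n');
for (var i = 1; i < lines.length; i++) { // skip the header row
  if (!lines[i]) continue; // guard against a trailing blank line
  var cols = lines[i].split(',');
  var rec = nlapiCreateRecord('cashsale');
  rec.setFieldValue('entity', cols[0]); // customer internal id
  rec.setFieldValue('memo', cols[1]); // illustrative field
  nlapiSubmitRecord(rec);
}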
