How to check in Node.js if a JSON file already contains specific data - javascript

I'm trying to implement CRUD operations with Node.js, but I don't know how to check whether the JSON file already contains a specific record, so that I can update only the object that needs it without overwriting the other data, or add a new record if none exists.
This is what I have so far (right now the insert operation overwrites the whole file and leaves it with only the inserted data):
JSON:
{
    "id": 1,
    "name": "Bill",
},
{
    "id": 2,
    "name": "Steve",
},
Code:
var operation = POST.operation; // POST request comes with operation = update/insert/delete

if (operation == 'insert') {
    fs.readFile("data.json", "utf8", function (err) {
        var updateData = {
            id: POST.id,
            name: POST.name,
        }
        var newUser = JSON.stringify(updateData);
        fs.writeFile("data.json", newUser, "utf8");
        console.log(err);
    })
}
else if (operation == 'update') {
fs.readFile("data.json", "utf8", function (err) {
})
}
else if (operation == 'delete') {
fs.readFile("data.json", "utf8", function (err) {
})
}
else
console.log("Operation is not supported");
Most of the examples I've found use a database and Express.js, so they didn't really help much.
Sorry, I'm a newbie.

So first off, that is not valid JSON (unless this is only part of the file). You will need to wrap the whole list in an array for it to be valid JSON (you can check whether your JSON is valid here).
If you go with a structure like this:
[
    {
        "id": 1,
        "name": "Bill"
    },
    {
        "id": 2,
        "name": "Steve"
    }
]
I think the easiest way to check if the ID already exists will be to read the JSON in as an array and check if the ID has already been assigned. Something like this:
var json = require('/path/to/data.json'); // Your object array

// In the if (operation == 'insert') block
var hasId = json.some(function (obj) {
    return obj.id === POST.id;
});

if (hasId) {
    // Do something with the duplicate
} else {
    json.push({
        id: POST.id,
        name: POST.name
    });
}
// Write the whole JSON array back to file
More on the array.some function
So basically you will be keeping the whole JSON file in memory (in the json array) and when you make a change to the array, you write the whole updated list back to file.
You may run into problems with this if that array gets very large, but at that point I would recommend looking into using a database. Mongo (albeit not the greatest DB in the world) is fairly easy to set up and to integrate with JavaScript while playing and experimenting.
I hope this helps!
Good luck

Related

Different actions based upon JSON value in Node

A perplexing question. I am writing a Node-based command line tool that takes a source JSON file and a changes JSON file (i.e. add, update, delete, etc.), and I need to apply those operations and write the result out as a new JSON file. Not as simple as it sounds. With no command line arguments to steer it, you would need a directive field in the JSON, like so?
The app would work like this:
$ myapp source.json changes.json newfile.json
[
    {
        "action": "addRecord",
        "payload": {
            "id": "1",
            "name": "James Hetfield",
            "guitar": "More Beer"
        }
    },
    {
        "action": "deleteRecord",
        "payload": {
            "id": "3",
            "name": "Dave Mustaine",
            "guitar": "Ole Faithful"
        }
    }
]
My JSON structure is probably wrong as well, but wondering how you would use JSON.parse, JSON.stringify to read the file in with the fs library and actually make actions happen when identifying action and then execute a CRUD like statement with payload
Here is what I have so far:
#!/usr/bin/env node

// using fs for parsing for small example
const fs = require('fs');

// We do not simply require the file due to scaling.
// A large file would block the event loop. To handle
// even greater files we would use Node streams.
const file = './changes.json';

// target filename for output
const target = 'output.json';

fs.readFile(file, 'utf8', function(err, data) {
    if (err) {
        console.log('Error: ' + err);
        return;
    }
    let obj = JSON.parse(data, (key, value) => {
        if (key === 'action') {
            if (value === 'addSong') {
                return "Song Added";
            }
            if (value === "addPlaylist") {
                return "Playlist added";
            }
            if (value === "deletePlaylist") {
                return "Playlist deleted";
            }
        }
        return value;
    });
    console.dir(obj);
});
fs seems to just read through the file and pick up the last value read, which is no good. I'm wondering how to do the compare, how the JSON might be structured to make the action happen, and whether to append or to transform the JSON into a new file with the update. Stuck.
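One way to make the actions "happen" is to parse both files normally (no reviver) and fold the changes over the source array. A minimal sketch, assuming the changes file is an array of { action, payload } objects and records are matched by payload.id (the function name is illustrative):

```javascript
// Applies a list of { action, payload } changes to a source array and
// returns the new array. Unknown actions are ignored.
function applyChanges(source, changes) {
    return changes.reduce(function (records, change) {
        switch (change.action) {
            case 'addRecord':
                return records.concat([change.payload]);
            case 'deleteRecord':
                return records.filter(function (rec) {
                    return rec.id !== change.payload.id;
                });
            default:
                return records;
        }
    }, source);
}

// Wiring it up with fs (file paths from process.argv, as in the question):
// const source = JSON.parse(fs.readFileSync(process.argv[2], 'utf8'));
// const changes = JSON.parse(fs.readFileSync(process.argv[3], 'utf8'));
// fs.writeFileSync(process.argv[4], JSON.stringify(applyChanges(source, changes), null, 2));
```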

How to export/save updated d3.js v4 tree json data string

I'm using the following code:
https://bl.ocks.org/adamfeuer/042bfa0dde0059e2b288
And am loading a very simple json string to create the tree:
{
"name": "flare",
"children": [{
"name": "analytics"
}, {
"name": "animate"
}]
}
So what I'm trying to figure out is that after I add a new child node to the "flare" node (for example), how can I create an updated json string in order to save the newly added node?
An example of the updated json after adding a new node would be like so:
{
"name": "flare",
"children": [{
"name": "analytics"
}, {
"name": "animate"
}, {
"name": "NEW NODE"
}]
}
Is there some built in function for this that I am not finding? Or would a custom function have to be built? And if I need a custom function could somebody please point me in the right direction to do so? Thank you very much!
I propose this solution. It is not perfect and deserves improvement, but it works and will help you get started.
All the code below is added at the end of the update function in the dndTree.js file.
console.log(root); // root contains everything you need

const getCircularReplacer = (deleteProperties) => { // allows a circular object to be stringified
    const seen = new WeakSet();
    return (key, value) => {
        if (typeof value === "object" && value !== null) {
            if (deleteProperties) {
                // delete all the properties you don't want in your JSON
                // (not very convenient, but a good temporary solution)
                delete value.id;
                delete value.x0;
                delete value.y0;
                delete value.y;
                delete value.x;
                delete value.depth;
                delete value.size;
            }
            if (seen.has(value)) {
                return;
            }
            seen.add(value);
        }
        return value;
    };
};

var myRoot = JSON.stringify(root, getCircularReplacer(false)); // stringify once to clone the root object (this lets you delete properties without touching the original)
var myvar = JSON.parse(myRoot);
myvar = JSON.stringify(myvar, getCircularReplacer(true)); // stringify a second time to delete the properties you don't need
console.log(myvar); // your JSON is now in myvar
Now that you have your JSON, you can either:
Download your new tree.json file:
function download(content, fileName, contentType) {
    var a = document.createElement("a");
    var file = new Blob([content], {
        type: contentType
    });
    a.href = URL.createObjectURL(file);
    a.download = fileName;
    a.click();
}
download(myvar, 'tree.json', 'text/plain');
Or write directly to a file. An example with Node.js:
var fs = require('fs');
fs.writeFile("tree.json", myvar, function(err) {
    if (err) {
        console.log(err);
    }
});
Check this for more information on saving a file: How do I save JSON to local text file

Access second level child in Json based on a variable

I'm currently working on a configuration service in my Angular 2 application. My current concern is the use of eval in my code to retrieve a value from my configuration.
Here's a sample of my configuration file:
{
"application": {
"environment": "dev",
"displayMenu": false
},
"error": {
"title": "Title Error",
"message": "Error message"
}
}
I'm retrieving this JSON with a simple HTTP request, but now I would like to access an element with my get method defined like this:
get(key: any) {
return (eval("this._config." + key));
}
As you can see, there is an eval in my code that I would like to avoid. I'm forced to use eval to let the developer write .get('application.environment'), and I haven't found another easy possibility.
The only other way I can see is to split the key on "." and retrieve the right element in my JSON as if it were a simple array. But with that solution, I would be stuck at one depth only.
You could use an array of the object keys you wish to view, then reduce that array, returning the value at each key of the object. If you wish to use a string as the object accessor, you can easily use string.split('.') to create the array you reduce over.
const data = {
    "application": {
        "environment": "dev",
        "displayMenu": false
    },
    "error": {
        "title": "Title Error",
        "message": "Error message",
        "deeper": {
            "evenDeeper": "You can get to any level"
        }
    }
}

const path = (keys, obj) => {
    return keys.reduce((obj, key) => {
        return typeof obj !== 'undefined'
            ? obj[key]
            : void 0
    }, obj)
}

console.log(
    path(['application', 'environment'], data)
)
console.log(
    path('error.message'.split('.'), data) // move the split inside the path function
)
console.log(
    path(['error', 'deeper', 'evenDeeper'], data)
)
console.log(
    path(['error', 'fail', 'damn'], data) // fail safe
)
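Applied to the question's get method, the same reduce removes the eval entirely. A sketch, assuming this._config holds the parsed configuration (stood in for here by a plain object):

```javascript
// Stand-in for the service's parsed configuration (this._config).
const _config = {
    application: { environment: "dev", displayMenu: false },
    error: { title: "Title Error", message: "Error message" }
};

// eval-free replacement for: return eval("this._config." + key);
function get(key) {
    return key.split(".").reduce(function (obj, k) {
        return typeof obj !== "undefined" ? obj[k] : void 0;
    }, _config);
}
```

get('application.environment') returns "dev", and a missing path such as get('error.missing.deep') safely returns undefined instead of throwing.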

MongoDB/Mongoose Query Builder

I am trying to build a query builder which will allow me to filter the data based on the parameters entered by the user. My data model is like so:
{
"_id": {
"$oid": "871287215784812"
},
"tags": [
"school",
"book",
"bag",
"headphone",
"appliance"
],
"consultingDays": 57,
"client": "someOne",
"subSector": "something",
"region": "UK",
"__v": 0
}
Currently my Query Builder looks like this:
app.post('/user/test', function(req, res) {
    var query = {};

    // QUERY NO.1 - This works perfectly
    if (req.body.region) {
        query.region = req.body.region
        console.log(query.region)
    }

    // QUERY NO.2 - This works perfectly
    if (req.body.subSector) {
        query.subSector = req.body.subSector
    }

    Project.find(query, function(err, project) {
        if (err) {
            res.send(err);
        }
        console.log(project);
        res.json(project);
    });
});
My Question:
I want to create a query which will take input from user and parse the "tags" array and return the required JSON.
For example:
If the user requests an object which contains "school", "book", "bag" it will return the object as seen my data model above. But if the user requests an object with "school", "book", "ninja Warrior" it won't return any data as no object within the database contain all those 3 strings.
What I have tried:
I have tried the following
if (req.body.sol){
query.solutions = {"tags" : {$in: [req.body.sol]}}
}
OR
if (req.body.sol){
query.solutions = {$elemMatch:{tags: req.body.sol}}
}
OR
if (req.body.sol){
query.solutions = { tags: { $all: [req.body.sol]}}
}
The requests were sent via Postman, and they returned an empty array.
Also the issue is that the user will get dropdown options. For example he/she might get 3 dropdown boxes. Each dropdown box will display all the five options in the tags array. The user will select a value for each dropdown box. And then filter the result. Because there might be an object within the database that contains "book", "bag", "shoes" within the tags array. The user can select any combination of those five keywords in the tags array
Does anyone know how I can fix this?
You need to send an array as sol, so in Postman you should change sol to sol[0], sol[1], etc. Then use this:
if (req.body.sol){
    query.tags = { "$in": req.body.sol };
}
Without the [] around req.body.sol, because it is already an array. (Note that the query key should be tags, matching the field in your data model.)
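Note that $in matches documents whose tags contain any one of the values. Since the question asks for documents containing all of the selected strings, $all may be the better fit. A sketch (buildTagQuery is an illustrative name, not part of the original code):

```javascript
// Builds the Mongo filter for the tags array. With $all, a document
// matches only if its tags contain every selected value.
function buildTagQuery(sol) {
    var query = {};
    if (sol && sol.length) {
        query.tags = { $all: sol };
    }
    return query;
}

// Project.find(buildTagQuery(req.body.sol), callback) would then match
// the sample document for ["school", "book", "bag"] but not for
// ["school", "book", "ninja Warrior"].
```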
I have implemented a simple query builder for nested objects:
const _ = require('lodash'); // the helpers below use lodash's each/omit

const checkObject = (object) => {
    let key;
    const status = Object.entries(object).some(([objectKey, objectValue]) => {
        if (typeof objectValue === "object" && objectValue !== null) {
            key = objectKey;
            return true;
        }
        return false;
    });
    return { status, key };
};

const queryBuilder = (input) => {
    // Array handling not implemented
    let output = {};
    _.each(input, (value, key) => {
        if (typeof value === "object" && value !== null) {
            _.each(value, (nestedValue, nestedKey) => {
                output[[key, nestedKey].join(".")] = nestedValue;
            });
        } else {
            output[key] = value;
        }
    });
    const cacheCheckObject = checkObject(output);
    if (cacheCheckObject.status)
        return { ..._.omit(output, cacheCheckObject.key), ...queryBuilder(output) };
    return output;
};
I have not implemented arrays, but with a little work you can make them work too. The same goes for Mongo operators. The complete example can be seen on Gist.

MongoDB updated object with item remove not saving

I'm using Angular Fullstack for a web app.
I'm posting my data with $http.post() as an object:
{ title: "Some title", tags: ["tag1", "tag2", "tag3"] }
When I edit my object and try to $http.put(), for example:
{ title: "Some title", tags: ["tag1"] }
In the console I get HTTP PUT 200, but when I refresh the page I still receive the object with all 3 tags.
This is how I save it in MongoDB:
exports.update = function(req, res) {
    if (req.body._id) {
        delete req.body._id;
    }
    Question.findByIdAsync(req.params.id)
        .then(handleEntityNotFound(res))
        .then(saveUpdates(req.body))
        .then(responseWithResult(res))
        .catch(handleError(res));
};

function saveUpdates(updates) {
    return function(entity) {
        var data = _.merge(entity.toJSON(), updates);
        var updated = _.extend(entity, data);
        return updated.saveAsync()
            .spread(function(updated) {
                return updated;
            });
    };
}
Can someone explain how to save the object with removed items?
What I'm doing wrong?
It is pretty bad practice to use things like _.merge or _.extend in client code (meaning your Node.js client to the database, not the browser) after retrieving from the database. Notably, _.merge is the problem here, as it does not "take away" things, but rather "augments" what is already there with the information you have provided. Not what you want here, and there is also a better way.
You should simply use "atomic operators" like $set to do this instead:
Question.findByIdAndUpdateAsync(
    req.params.id,
    { "$set": { "tags": req.body.tags } },
    { "new": true }
)
.then(function(result) {
    // deal with returned result
});
You also really should be targeting your endpoints rather than having a "generic" object write. So the above would be specifically targeted at "PUT" for the related "tags" only, and would not touch other fields in the object.
If you really must throw a whole object at it and expect an update of all the content, then use a helper to build the update statement correctly:
function dotNotate(obj, target, prefix) {
    target = target || {};
    prefix = prefix || "";
    Object.keys(obj).forEach(function(key) {
        if (typeof obj[key] === "object") {
            dotNotate(obj[key], target, prefix + key + ".");
        } else {
            target[prefix + key] = obj[key];
        }
    });
    return target;
}

var update = { "$set": dotNotate(req.body) };

Question.findByIdAndUpdateAsync(
    req.params.id,
    update,
    { "new": true }
)
.then(function(result) {
    // deal with returned result
});
This will build the correct update structure no matter what object you throw at it.
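To make the transformation concrete, here is dotNotate applied to a small nested body (restated so the snippet runs on its own; the sample input is illustrative):

```javascript
// Flattens a nested object into MongoDB dot-notation paths.
function dotNotate(obj, target, prefix) {
    target = target || {};
    prefix = prefix || "";
    Object.keys(obj).forEach(function (key) {
        if (typeof obj[key] === "object" && obj[key] !== null) {
            dotNotate(obj[key], target, prefix + key + ".");
        } else {
            target[prefix + key] = obj[key];
        }
    });
    return target;
}

var update = { "$set": dotNotate({ title: "Some title", meta: { views: 3 } }) };
// update.$set is { "title": "Some title", "meta.views": 3 }, so nested
// fields become dotted paths and $set only touches those exact paths.
```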
Though in this case, just passing the object directly is probably good enough:
Question.findByIdAndUpdateAsync(
    req.params.id,
    { "$set": req.body },
    { "new": true }
)
.then(function(result) {
    // deal with returned result
});
There are other approaches with atomic operators that you could also fit into your logic. But it is best to apply them per element: at least for root document properties, with things like arrays treated separately as a child.
All the atomic operations interact with the document "in the database" and "as it is at modification". Pulling data from the database, modifying it, then saving it back offers no such guarantees that the data has not already been changed, and you may just be overwriting other changes already committed.
In truth, your "browser client" should have been aware that the "tags" array had the other two entries, and then your "modify request" should simply $pull the entries to be removed from the array, like so:
Question.findByIdAndUpdateAsync(
    req.params.id,
    { "$pull": { "tags": { "$in": ["tag2", "tag3"] } } },
    { "new": true }
)
.then(function(result) {
    // deal with returned result
});
And then, "regardless" of the current state of the document on the server when modified, those would be the only changes made. So if something else had added "tag4", and the client had not yet received notification of that change before this modification was sent, the returned response would include it as well and everything would stay in sync.
Learn the update modifiers of MongoDB, as they will serve you well.
