Asynchronous Query JSON object - javascript

I have been playing around with an NPM module called json-query. I was originally able to make the module work with JSON embedded in my JS.
I have spent about two days attempting to make it query JSON that is external and in a JSON file.
The original code that was functioning looked something like this.
var jsonQuery = require('json-query')

var data = {
  people: [
    {name: 'Matt', country: 'NZ'},
    {name: 'Pete', country: 'AU'},
    {name: 'Mikey', country: 'NZ'}
  ]
}

jsonQuery('people[country=NZ].name', {
  data: data
}) //=> {value: 'Matt', parents: [...], key: 0} ... etc
I was able to query the internal JSON to find the key I was looking for.
I realized I need the ability to update the JSON while the code is live, so I moved the JSON to its own file.
Currently my main JS file looks like this.
var jsonQuery = require('json-query');
var fs = require('fs');

function querydb(netdomain){
  fs.readFile('./querykeys.json', 'utf8', function (err, data) {
    if (err){console.log('error');}
    var obj = JSON.parse(data);
    console.log(jsonQuery('servers[netshare=Dacie2015].netdomain', {
      obj: obj
    }));
  });
}

querydb();
The JSON file that contains the data looks like this.
{
  "servers": [
    {"netdomain": "google.com", "netshare": "password", "authip": "216.58.203.46"},
    {"netdomain": "localhost", "netshare": "localghost", "authip": "127.0.0.1"},
    {"netdomain": "facebook.com", "netshare": "timeline", "authip": "31.13.69.228"}
  ]
}
The issue I have run into is that I am unable to query the JSON anymore. When the function querydb() is run, no matter what query I use against the JSON, I get no response locating my key.
Currently the response I get when I try to query the JSON file is
{ value: null,
key: 'netdomain',
references: [],
parents: [ { key: 'servers', value: null }, { key: null, value: null } ] }
To be abundantly clear, I believe my issue is the way I pass my object in. I have played with the structure of the json-query call and have not been able to isolate a key.
Any help on this would be amazing. The module that I am working with can be found on npm at https://www.npmjs.com/package/json-query
Thank you

I think this is just a typo. Shouldn't this:
obj: obj
be this?
data: obj
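For reference, a minimal corrected sketch of the question's querydb function with that one change applied (everything else is taken from the question as-is):

var jsonQuery = require('json-query');
var fs = require('fs');

function querydb(netdomain) {
  fs.readFile('./querykeys.json', 'utf8', function (err, data) {
    if (err) { console.log('error'); return; }
    var obj = JSON.parse(data);
    // json-query looks for the source object under the `data` key
    console.log(jsonQuery('servers[netshare=Dacie2015].netdomain', {
      data: obj
    }));
  });
}

// note: the sample querykeys.json above has no server with netshare=Dacie2015,
// so the query string also needs to match a value that actually exists
querydb();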


assign variable to value of Dictionary Javascript

I am building a dictionary, but I would like some of the values to contain variables. Is there a way to pass a variable to the dictionary so I can assign a dot-notation variable? The variables object will always have the same structure, and the dictionary will be static and structured the same for each key-value pair. Essentially I want to pass the value from the dictionary to another function to handle the data.
main.js
import myDictionary from "myDictionary.js"

const variables = {
  item: "Hello"
}

const data = myDictionary[key](variables)
console.log(data)
myDictionary.js
const myDictionary = {
  key: variables.item
}
So the log should display "Hello". I know it will be something straightforward, but I can't seem to figure it out.
As always, any help is greatly appreciated.
You should modify the dictionary so that it holds actual callback functions instead. Only then will it be able to accept arguments.
const myDictionary = {
  key: (variables) => variables.item
}

const variables = {
  item: "Hello"
}

const key = "key";
const data = myDictionary[key](variables)
console.log(data)
What you are trying to do is not possible. The myDictionary.js file has no idea what's inside your main file. The only thing you could do would be:
myDictionary.js
const myDictionary = {
  key: "item"
}
main.js
import myDictionary from "myDictionary.js";

const variables = {
  item: "Hello"
};

const data = variables[myDictionary["key"]];
console.log(data);
Also, even though JavaScript does not enforce semicolons, using them will save you a lot of headaches from the edge cases where automatic semicolon insertion does something unexpected.
I must apologise, as when I asked the question I wasn't fully clear on what I needed. After some experimentation, looking at my edge cases, and reading Krzysztof's answer, I came up with something similar to this:
const dict = {
  key: (eventData) => {
    return [
      {
        module: 'company',
        entity: 'placement',
        variables: {
          placement_id: {
            value: eventData.id,
          },
        },
      },
      {
        module: 'company',
        entity: 'placement',
        variables: {
          client_id: {
            value: eventData.client.id,
          },
        },
      },
    ];
  },
}
Then I'm getting the data like this -
const data = dict?.[key](eventData)
console.log(data)
I can then navigate or manipulate the data however I need.
Thank you to everyone who spent time helping me.

JSON Stringify ignores nested objects on Redis publish

I am using Redis in my backend to scale subscriptions. I am using this library to implement Redis on top of my JavaScript code, and Mongoose for the models.
During a redis publish, I have to stringify the objects that I get from mongoose. I parse them on the subscribing end and it all works well until there's a nested object in the object that needs to be stringify-ed.
So if my object is this:
{ subtitle: '',
description: '',
checklists:
[ { _id: 5cee450c0fa29d0b54275da0, items: [] },
{ _id: 5cee455c0c31785b0875e09d, items: [] },
{ _id: 5cee47dc6d32e72c6411ce2d, items: [] } ],
attachments: [],
labels: [],
_id: 5ced1af26547233798f943f6,
title: 'asfasf',
box: 5cece1c3e6c3c13ff098658d,
workflow: 5cece1cbe6c3c13ff0986591,
}
I receive:
{ cardUpdated:
{
subtitle: '',
description: '',
checklists: [ [Object], [Object], [Object] ],
attachments: [],
labels: [],
_id: '5ced1af26547233798f943f6',
title: 'asfasf',
box: '5cece1c3e6c3c13ff098658d',
workflow: '5cece1cbe6c3c13ff0986591',
}
}
When I publish I use the following line:
pub.publish(types.CARD_UPDATED,
JSON.stringify(
{ cardUpdated: await getUpdatedCardStats(checklist.card) },
));
Note: I know that I am wrapping the argument for stringify in {}, and without that the nested objects would not be ignored, but I need to do it because I need the key property name on the subscription end, i.e. I am using this publish command with different key names in several places.
Is there a way to go about this so that the nested objects get stringify-ed?
EDIT: Turns out I was getting the proper full object as a string on the subscribing end of Redis, but it was actually JSON.parse() that was the culprit. After parsing, it completely ignores the nested objects. Is there any way to avoid this?
Try:
const value = JSON.stringify({
  cardUpdated: await getUpdatedCardStats(checklist.card)
});
pub.publish(types.CARD_UPDATED, value);
This is not a valid JS object:
{ _id: 5cee450c0fa29d0b54275da0, items: [] }
I think it's the output of .toString() of an object of type {_id: ObjectId, items: any[]}, with ObjectId coming from the MongoDB/bson library. In any case, the JSONification of this object is not trivial, and that is why it ends up printed as [Object].
To bypass this limitation, you might implement a custom function to transform your object into one that can be trivially JSONified, possibly with the help of ObjectId.toString().
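For illustration, a rough sketch of such a transform, assuming the card shape shown in the question (the helper name and exact field list are hypothetical):

// Hypothetical helper: turn the Mongoose document into a plain object
// that survives JSON.stringify / JSON.parse without surprises.
function toPublishable(card) {
  return {
    subtitle: card.subtitle,
    description: card.description,
    // convert each checklist so its ObjectId becomes a plain string
    checklists: card.checklists.map(function (cl) {
      return { _id: cl._id.toString(), items: cl.items };
    }),
    attachments: card.attachments,
    labels: card.labels,
    _id: card._id.toString(),
    title: card.title,
    box: card.box.toString(),
    workflow: card.workflow.toString()
  };
}

// usage sketch:
// pub.publish(types.CARD_UPDATED, JSON.stringify({ cardUpdated: toPublishable(card) }));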

Map multiple objects to single object in stream

I have some very large (> 500MB) JSON files that I need to map to a new format and upload to a new DB.
The old format:
{
id: '001',
timestamp: 2016-06-02T14:10:53Z,
contentLength: 123456,
filepath: 'original/...',
size: 'original'
},
{
id: '001',
timestamp: 2016-06-02T14:10:53Z,
contentLength: 24565,
filepath: 'medium/...',
size: 'medium'
},
{
id: '001',
timestamp: 2016-06-02T14:10:53Z,
contentLength: 5464,
filepath: 'small/...',
size: 'small'
}
The new format:
{
Id: '001',
Timestamp: 2016-06-02T14:10:53Z,
OriginalSize: {
ContentLength: 123456,
FilePath: 'original/...'
},
MediumSize: {
ContentLength: 24565,
FilePath: 'medium/...'
},
SmallSize: {
ContentLength: 5464,
FilePath: 'small/...'
}
}
I was achieving this with small datasets like this, processing the 'original' size first:
let out = data.filter(o => o.size === 'original').map(o => {
  return {
    Id: o.id,
    Timestamp: o.timestamp,
    OriginalSize: {
      ContentLength: o.contentLength,
      FilePath: o.filepath
    }
  };
});
data.filter(o => o.size !== 'original').forEach(o => {
  let orig = out.find(function (og) {
    return og.Timestamp === o.timestamp;
  });
  // note: o.size is lower-case, so the keys come out as 'mediumSize'/'smallSize'
  orig[o.size + 'Size'] = {
    ContentLength: o.contentLength,
    FilePath: o.filepath
  };
});
// out now contains the correctly-formatted objects
The problem comes with the very large datasets, where I can't load the hundreds of megabytes of JSON into memory all at once. This seems like a great time to use streams, but of course if I read the file in chunks, running .find() on a small array to find the 'original' size won't work. If I scan through the whole file to find originals and then scan through again to add the other sizes to what I've found, I end up with the whole dataset in memory anyway.
I know of JSONStream, which would be great if I was doing a simple 1-1 remapping of my objects.
Surely I can't be the first one to run into this kind of problem. What solutions have been used in the past? How can I approach this?
I think the trick is to update the database on the fly. If the JSON file is too big for memory, then I expect the resulting set of objects (out in your example) is too big for memory too.
In the comments you state the JSON file has one object per line. Therefore, use Node.js's built-in fs.createReadStream and readline to get each line of the text file, process each line (a string) into a JSON object, and finally update the database.
parse.js
var readline = require('readline');
var fs = require('fs');

var jsonfile = 'text.json';

var linereader = readline.createInterface({
  input: fs.createReadStream(jsonfile)
});

linereader.on('line', function (line) {
  var obj = parseJSON(line); // convert line (string) to JSON object
  // check DB for existing id/timestamp
  if ( existsInDB({id: obj.id, timestamp: obj.timestamp}) ) {
    updateInDB(obj); // already exists, so UPDATE
  }
  else { insertInDB(obj); } // does not exist, so INSERT
});
// DUMMY functions below, implement according to your needs
function parseJSON (str) {
  str = str.replace(/,\s*$/, ""); // lose trailing comma
  return eval('(' + str + ')'); // insecure! so no unknown sources
}
function existsInDB (obj) { return true; }
function updateInDB (obj) { console.log(obj); }
function insertInDB (obj) { console.log(obj); }
text.json
{ id: '001', timestamp: '2016-06-02T14:10:53Z', contentLength: 123456, filepath: 'original/...', size: 'original' },
{ id: '001', timestamp: '2016-06-02T14:10:53Z', contentLength: 24565, filepath: 'medium/...', size: 'medium' },
{ id: '001', timestamp: '2016-06-02T14:10:53Z', contentLength: 5464, filepath: 'small/...', size: 'small' }
NOTE: I needed to quote the timestamp value to avoid a syntax error. From your question and example script I expect you either don't have this problem or already have this solved, maybe another way.
Also, my implementation of parseJSON may be different from how you are parsing the JSON. Plain old JSON.parse failed for me due to the properties not being quoted.
Set up a DB instance that can store JSON documents, e.g. MongoDB or PostgreSQL (which recently introduced the jsonb data type for storing JSON documents). Iterate through the old JSON documents and combine them into the new structure, using the DB as the storage, so that you overcome the memory problem.
I'm quite sure that there is no way to achieve your goal without either a) drastically compromising the speed of the process, or b) creating a poor man's DB from scratch (which seems like a bad thing to do :) )
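As a rough sketch of that approach, assuming MongoDB with the official mongodb Node.js driver and a one-object-per-line input file (the connection string, database and collection names are hypothetical):

// Sketch: stream the file line by line and upsert each record into MongoDB,
// so the merged documents accumulate in the database instead of in memory.
const fs = require('fs');
const readline = require('readline');
const { MongoClient } = require('mongodb');

async function run() {
  const client = await MongoClient.connect('mongodb://localhost:27017'); // hypothetical connection string
  const images = client.db('mydb').collection('images');                 // hypothetical db/collection names

  const rl = readline.createInterface({ input: fs.createReadStream('old-format.json') });
  for await (const line of rl) {
    const o = JSON.parse(line); // assumes each line is valid JSON
    // 'original' -> 'OriginalSize', 'medium' -> 'MediumSize', 'small' -> 'SmallSize'
    const sizeKey = o.size.charAt(0).toUpperCase() + o.size.slice(1) + 'Size';
    await images.updateOne(
      { Id: o.id, Timestamp: o.timestamp },
      { $set: { [sizeKey]: { ContentLength: o.contentLength, FilePath: o.filepath } } },
      { upsert: true }
    );
  }
  await client.close();
}

run().catch(console.error);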

How can I save an empty array into mongodb using js

Basically I got my app up and running, but I'm stuck with a problem: if I pass an object that contains an empty array to be saved, the array is not saved into the db. I'm not sure whether this is a problem in JS or in the Mongo driver, but in order to save the empty array I need to pass the array like so: products: [''].
This is the structure of my mongo document:
_id: ObjectId(...),
name: 'String',
subcategories: [
  {
    subcategory: 'string',
    products: [
      {
        name: 'string',
        price: integer
      }
    ]
  }
]
So on my front end I'm grabbing the whole document through an ajax call and pushing a new object into the subcategories array. The new object looks like this:
{subcategory:'string', products:['']}
And this works okay until I need to insert a new object inside the array: because I've grabbed the whole object and pushed the new object into the array, the previous one now looks like this:
{subcategory: 'string'}
having lost the products: [] array in the process.
How can I get around this? I need to be able to have empty arrays in my object.
EDIT
What I did on the front end: got the whole object with $.get, which returned:
var obj =
_id: ObjectId(...),
name: 'String',
subcategories: [
{
subcategory: 'Subcategory1',
products: [
{
name: 'string'
price: integer
}
]
}
];
Then on the front end I've pushed the new object category inside the subcategories array:
data.subcategories.push({subcategory: 'Subcategory2', products: ['']})
Where subcat was a string with the category name. On my db I could see that I've successfully added the object:
var obj =
_id: ObjectId(...),
name: 'String',
subcategories: [
{
subcategory: 'Subcategory1',
products: [
{
name: 'string'
price: integer
}
]
},
{
subcategory: 'Subcategory2'
products: []
}
];
The problem was that when I wanted to add another subcategory, the previous one returned empty:
var obj =
_id: ObjectId(...),
name: 'String',
subcategories: [
{
subcategory: 'Subcategory1',
products: [
{
name: 'string'
price: integer
}
]
},
{
subcategory: 'Subcategory2'
},
{
subcategory: 'Subcategory3'
products: []
},
];
Because at some point the empty array was removed from the object. Like I said, I did fix this on the front end, so the error Jade was throwing has been addressed, but I still find it odd that the products: [] was being removed from the document.
I'm new to MongoDb and node, not to mention that I'm also new with JS, so it might well be a feature that I'm unaware of.
In case of empty arrays, try posting them as [null] instead of []. This will work fine.
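Applied to the question's data, a minimal sketch of that workaround on the front end (the object shape is taken from the question; whether it is still needed depends on your driver version):

// Sketch: replace an empty products array with [null] before sending the
// object, so the (older) driver does not drop or reinterpret the empty array.
var newSubcategory = { subcategory: 'Subcategory2', products: [] };
if (newSubcategory.products.length === 0) {
  newSubcategory.products = [null];
}
data.subcategories.push(newSubcategory);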
When passing empty arrays to Mongo they are interpreted as empty documents, {}. Zend Json encoder will interpret them as empty arrays []. I understand that it's not possible to tell which one is correct.
In my view it's more logical that the actual PHP array (when empty) is interpreted as an array in MongoDB. Although that would require something else to identify empty documents, it's still more logical than the current behaviour.
A possible solution would be to introduce a new object, MongoEmptyObject (or use stdObj), whenever one wants to introduce an empty object.
Meanwhile, a workaround is to detect empty arrays in PHP and inject a null value: $arr[0] = null;
Then the object will be interpreted as an empty array in Mongo.
The workaround works both in PHP and in the Mongo console. Question: does JSON allow arrays with null values? If so, then the workaround is a sign of another bug.
PHP:
if (is_array($value) && empty($value))
{ $value[0] = null; }
Mongo Console:
var b = {hej: "da", arr: [null]};
db.test.save(b);
db.test.find();
{"_id" : "4a4b23adde08d50628564b12" , "hej" : "da" , "arr" : []}

MongoDB not okForStorage error

I've looked around quite a bit concerning this error; it seems that Mongo won't accept a . or a $ in an update, yet I still get this error:
{ [MongoError: not okForStorage]
name: 'MongoError',
err: 'not okForStorage',
code: 12527,
n: 0,
connectionId: 18,
ok: 1 }
This is the object I'm updating:
{
status: "open",
type: "item",
parentId: "4fa13ba2d327ca052d000003",
_id: "4fa13bd6d327ca052d000012",
properties: {
titleInfo: [
{ title: "some item" }
]
}
}
And I'm updating it to:
{
fedoraId: 'aFedoraLib:438',
status: "closed",
type: "item",
parentId: "4fa13ba2d327ca052d000003",
_id: "4fa13bd6d327ca052d000012",
properties: {
titleInfo: [
{ title: "some item" }
]
}
}
Another possible cause I just ran into: storing an object which has periods in the string keys.
So for people getting the same error:
It's due to the fact that I was including the _id in the update, which Mongo apparently doesn't like.
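A minimal sketch of that fix with the Node.js MongoDB driver (the collection variable and callback style are assumptions, not from the question):

// Sketch: strip _id from the replacement document before updating,
// since _id is immutable and sending it along can trigger this error.
var doc = {
  fedoraId: 'aFedoraLib:438',
  status: 'closed',
  type: 'item',
  parentId: '4fa13ba2d327ca052d000003',
  _id: '4fa13bd6d327ca052d000012',
  properties: { titleInfo: [{ title: 'some item' }] }
};

var id = doc._id;
delete doc._id; // don't try to overwrite the immutable _id

// `collection` is assumed to be an already-opened collection handle;
// if your _id is stored as a real ObjectId, wrap `id` accordingly.
collection.update({ _id: id }, doc, function (err, result) {
  if (err) { return console.error(err); }
  console.log('updated', result);
});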
I ran into this error when trying to save a JSON structure with this key-value pair (coming straight out of an AngularJS app):
"$$hashKey":"021"
Removing just that key fixed the problem. For others using Angular, it looks like calling Angular's built-in angular.toJson client-side eliminates the $$hashKey keys. From their forums:
$scope.ngObjFixHack = function(ngObj) {
  var output;
  output = angular.toJson(ngObj);
  output = angular.fromJson(output);
  return output;
}
