How to serialize a MongoDB ObjectId("uniqueid") to JSON in Node.js? - javascript

Let's assume we have a query builder service B that spits out a MongoDB query when called. Service A receives this query and executes it as is with the official MongoDB Node.js driver.
How do I send something like:
[{
_id: new mongo.ObjectID("5f3258cfbaaccedaa5dd2d96"),
phone: "666"
}, {
_id: new mongo.ObjectID("5f3258cfbaaccedaa5dd2da2"),
phone: "555"
}]
from service B to service A?
EDIT:
The following works perfectly fine:
var q = { _id: new mongo.ObjectID("5f3258cfbaaccedaa5dd2d96") };
const result = await this.db.collection("persons").find(q).toArray();
The following doesn't work:
var q = { _id: { $oid: "5f3258cfbaaccedaa5dd2d96" } }
const result = await this.db.collection("persons").find(q).toArray();
Now,
var q = { _id: new mongo.ObjectID("5f3258cfbaaccedaa5dd2d96") };
JSON.stringify(q)
gives you {"_id":"5f3258cfbaaccedaa5dd2d96"}, and if you pass this to service A, you cannot use it there as follows:
const result = await this.db.collection("persons").find(qStr).toArray();
Or as,
const result = await this.db.collection("persons").find(JSON.parse(qStr)).toArray();

You need to:
Serialize your documents to Extended JSON on one end
Deserialize your documents from Extended JSON to language-native data structures on the other end
See https://github.com/mongodb/js-bson#node-no-bundling for how to serialize and deserialize.
You cannot feed Extended JSON-annotated hashes to driver functions that expect native types (which is basically all of them, other than the one that specifically parses Extended JSON), as you tried to do.
var q = { _id: new mongo.ObjectID("5f3258cfbaaccedaa5dd2d96") };
const serializedQ = BSON.serialize(q);
const deserializedQ = BSON.deserialize(serializedQ);
const result = await this.db.collection("persons").find(deserializedQ).toArray();
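If the goal is to move the query between the two services as JSON text, a minimal sketch using the EJSON helper exported by the bson package could look like this (relaxed: false is optional here; it also keeps the canonical form for numbers and dates):
const { EJSON, ObjectId } = require("bson");
// Service B: build the query and serialize it to Extended JSON text
const q = { _id: new ObjectId("5f3258cfbaaccedaa5dd2d96") };
const qStr = EJSON.stringify(q, { relaxed: false });
// qStr is '{"_id":{"$oid":"5f3258cfbaaccedaa5dd2d96"}}'
// Service A: parse the Extended JSON text back into native driver types
const parsedQ = EJSON.parse(qStr); // _id is an ObjectId instance again
const result = await this.db.collection("persons").find(parsedQ).toArray();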

There is a standard that MongoDB calls "Extended JSON" that defines how you can encode all BSON data types in regular JSON.
It will become something like
{ _id : {$oid: "5f3258cfbaaccedaa5dd2d96"} }
Most MongoDB tools will be able to convert to and from that format.

Related

Writing to an XLSX template and then sending it as a response in a different function always returns undefined

What I'm trying to do
Requests come into my server to download a file containing data. The downloading part is in the front-end and works. I grab the data on my backend and then I want to write it into an existing template and return the data.
This is the handler for the request.
async handle(request: Request, response: Response) {
try {
const fileName = 'test.xlsx'
const binary = objectsToTemplateWorkBook()
response.setHeader(
'Content-Disposition',
'attachment; filename=' + fileName
)
response.setHeader('Content-Type', 'application/vnd.openxmlformats')
response.end(binary, 'binary')
} catch (error) {
console.log(error)
response.send(error)
}
}
This is the function that is supposed to write the data into the template.
export const objectsToTemplateWorkBook = ():
Promise<any> => {
var XlsxTemplate = require('xlsx-template')
var dataBlob
// Load an XLSX file into memory
const blob = fs.readFile(
path.join(__dirname, 'test_template.xlsx'),
function (err, data) {
console.log(__dirname)
// Create a template
var template = new XlsxTemplate(data)
// Replacements take place on first sheet
var sheetNumber = 1
// Set up some placeholder values matching the placeholders in the template
var values = {
people: [
{ name: 'John Smith', age: 20 },
{ name: 'Bob Johnson', age: 22 },
],
}
// Perform substitution
template.substitute(sheetNumber, values)
// Get binary data
dataBlob = template.generate()
// ...
}
)
return dataBlob
}
The function seems to write the data to the template, because if I log the dataBlob inside the fs.readFile callback it shows me the file. However, the returned dataBlob is always undefined. I know this is due to the async nature, but quite honestly I have no idea how to fix it. So my question to you is: how can I get the dataBlob to my handler to send it as a response?
You can't get the return value from a callback function the way you're doing here. Callbacks run asynchronously, so their return value will never be accessible: the outer return executes before the inner code.
To solve this specific problem you can use the fs.readFileSync function, which executes synchronously and returns a value: the buffer you need to pass to your XlsxTemplate constructor. This way, the code turns into:
import * as fs from 'fs'
import * as path from 'path'

export const objectsToTemplateWorkBook = (): any => {
var XlsxTemplate = require('xlsx-template')
var dataBlob
// Load an XLSX file into memory
const data = fs.readFileSync(path.join(__dirname, 'test_template.xlsx'))
console.log(__dirname)
// Create a template
var template = new XlsxTemplate(data)
// Replacements take place on first sheet
var sheetNumber = 1
// Set up some placeholder values matching the placeholders in the template
var values = {
people: [
{ name: 'John Smith', age: 20 },
{ name: 'Bob Johnson', age: 22 },
],
}
// Perform substitution
template.substitute(sheetNumber, values)
// Get binary data
dataBlob = template.generate()
// ...
return dataBlob
}
With this you get access to the file buffer returned by the synchronous read and are able to perform the rest of your operations. Hope it helps :D
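If you'd rather keep the read asynchronous, another option is to make the whole function async and await it in the handler. A minimal sketch, assuming the same xlsx-template API and template path as above:
import * as path from 'path'
import { promises as fsPromises } from 'fs'

export const objectsToTemplateWorkBook = async (): Promise<any> => {
  const XlsxTemplate = require('xlsx-template')
  // Load the XLSX template into memory without blocking the event loop
  const data = await fsPromises.readFile(path.join(__dirname, 'test_template.xlsx'))
  const template = new XlsxTemplate(data)
  // Replacements take place on the first sheet
  const sheetNumber = 1
  const values = {
    people: [
      { name: 'John Smith', age: 20 },
      { name: 'Bob Johnson', age: 22 },
    ],
  }
  // Perform substitution and return the generated binary data
  template.substitute(sheetNumber, values)
  return template.generate()
}
The handler then has to await it: const binary = await objectsToTemplateWorkBook().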

Fetch some (and not all) properties from a list in Firebase Realtime Database

Suppose there is a list in my Realtime Database, with a user list at location userList like this:
user1: { age: 20, name: john }
user2 : { age: 40, name: sam }
user3: { age: 30, name: cynthia }
Is there a way for me to write a query to fetch only the ages from the above userList?
I am currently using Angular 11, and taking help of Angular Fire package to communicate with the Firebase database.
Note: I have removed the apostrophes from the above example for clarity.
There's no direct method to do so. You would have to fetch the complete node and then sort it out manually using JavaScript. But if you want to fetch just one field, you can try this function, which makes a separate request for the age of each user:
async function getAges() {
const dbRef = firebase.database().ref("users")
const requests = []
const users = ["user1", "user2", "user3"]
for (const user of users) {
requests.push(dbRef.child(user).child("age").once("value"))
}
const snapshots = await Promise.all(requests)
console.log(snapshots.map(snap => `${snap.ref.parent.key} -> ${snap.val()}`))
}
The output in the console should be something like this:
[ 'user1 -> 20', 'user2 -> 40', 'user3 -> 30' ]
Note that you would need the keys of all the nodes, i.e. the UIDs of all the users. If you already have the UIDs stored somewhere, this method is useful; otherwise, to get the keys you would need to fetch the complete users node anyway.
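If you don't already have the UIDs, a sketch of that fetch-everything approach (using the same v8-style API as above; the userList path is taken from the question) could look like this:
async function getAgesFromFullNode() {
  // One request for the whole node, then pick out just the ages client-side
  const snapshot = await firebase.database().ref("userList").once("value")
  const ages = []
  snapshot.forEach((userSnap) => {
    ages.push({ uid: userSnap.key, age: userSnap.val().age })
  })
  console.log(ages) // e.g. [{ uid: "user1", age: 20 }, ...]
  return ages
}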

NeutralinoJS storage

This is the NeutralinoJS storage API for writing JSON. Is it possible to update the JSON file (push data), not just overwrite the data with a new JS object? How can I do that?
// Javascript Object to be stored as JSON
let data = {
bucket : 'test',
content : {
item : 10
}
}
// stores the data into JSON based data store.
Neutralino.storage.putData(data,
// executes on successful storage of data
function () {
console.log('Data saved to storage/test.json');
},
// executes if an error occurs
function () {
console.log('An error occured while saving the Data');
}
);
The Neutralino.storage API takes a string instead of a JavaScript object to save into local storage.
You can convert your JavaScript objects to strings very easily, for example:
const myUser = {
name: "John Doe",
age: 19,
married: false
}
const myUserString = JSON.stringify(myUser);
console.log(myUserString); // {"name":"John Doe","age":19,"married":false}
Here you can see how we used the JSON.stringify method to convert our JavaScript object into a string.
We can also convert the generated string back to our JavaScript object, for example:
const myUserString = '{"name":"John Doe","age":19,"married":false}';
const myUser = JSON.parse(myUserString);
console.log(myUser);
So now we can easily store our objects and arrays in local storage and easily modify them, for example:
async function saveToStorage(myUser) {
let myUserString = JSON.stringify(myUser);
await Neutralino.storage.setData('myUser', myUserString);
}
async function loadFromStorage() {
let myUserString = await Neutralino.storage.getData('myUser');
let myUser = JSON.parse(myUserString);
return myUser;
}
saveToStorage({
name: "John Doe",
age: 19,
married: false
}).then(async () => {
let myUser = await loadFromStorage();
myUser.name = "Jane Doe"
await saveToStorage(myUser);
});
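As for the original question about pushing data instead of overwriting: a storage write always replaces the whole key, so the usual pattern is read, modify, write back. A minimal sketch, assuming the data is kept under a hypothetical 'items' key as a JSON array:
async function pushToStorage(newItem) {
  let items = []
  try {
    // Read and parse whatever is already stored under 'items'
    const stored = await Neutralino.storage.getData('items')
    items = JSON.parse(stored)
  } catch (err) {
    // Nothing stored yet, start with an empty array
  }
  items.push(newItem)
  // Write the whole array back as a string
  await Neutralino.storage.setData('items', JSON.stringify(items))
}
pushToStorage({ item: 10 }).then(() => console.log('Item appended'))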

mongoosejs - find() using nested objects

This question is possibly a duplicate, but I haven't found anything that provides an appropriate answer to my issue.
I have an ExpressJS server which is used to provide API requests to retrieve data from a MongoDB database. I am using mongoosejs for the MongoDB connection to query/save data.
I am building a route that will allow me to find all data that matches some user input, but I am having trouble with the query. I have spent a long while looking online for someone with a similar issue but have come up blank.
I will leave an example of the code I have at the minute below.
code for route
// -- return matched data (GET)
router.get('/match', async (req, res) => {
const style_data = req.query.style; // grab url param for style scores ** this comes in as a string **
const character_data = req.query.character; // grab url param for character scores ** this comes in as a string **
// run matcher systems
const style_matches = style_match(style_data);
res.send({
response: 200,
data: style_matches
}); // return data
});
code for the query
// ---(Build the finder)
const fetch_matches_using = async function(body, richness, smoke, sweetness) {
return await WhiskyModel.find({
'attributes.body': body,
'attributes.richness': richness,
'attributes.smoke': smoke,
'attributes.sweetness': sweetness
});
}
// ---(Start match function)---
const style_match = async function (scores_as_string) {
// ---(extract data)---
const body = scores_as_string[0];
const richness = scores_as_string[1];
const smoke = scores_as_string[2];
const sweetness = scores_as_string[3];
const matched = [];
// ---(initialise variables)---
let match_count = matched.length;
let first_run; // -> exact matches
let second_run; // -> +- 1
let third_run; // -> +- 2
let fourth_run; // -> +- 3
// ---(begin db find loop)---
first_run = fetch_matches_using(body, richness, smoke, sweetness).then((result) => {return result});
matched.push(first_run);
// ---(return final data)---
return matched
}
example of db object
{
_id: mongoid,
meta-data: {
pagemd:{some data},
name: whiskyname
age: whiskyage,
price: price
},
attributes: {
body: "3",
richness: "3",
smoke: "0",
sweetness: "3",
some other data ...
}
}
When I hit the route in Postman, the JSON data looks like:
{
response: 200,
data: {}
}
and when I console.log() out matched from within the style_match function after I have pushed to it, it prints [ Promise(pending) ], which I don't understand.
If I console.log() the result from within the .then() I get an empty array.
I have tried using the populate() method after running the find, which does technically work, but instead of only returning data that matches it returns every entry in the collection, so I think I am doing something wrong there. I also don't see why I would need to use the .populate() function to access the nested object.
Am I doing something totally wrong here?
I should also mention that the route and the matching functions are in different files just to try and keep things simple.
Thanks for any answers.
Just posting an answer, as I seem to have fixed this.
The issue was with my .find() function: I needed to pass in the items to search by and then also a callback within the function to return the error/data. I'll leave the changed code below.
new function
const fetch_matches_using = async function(body, richness, smoke, sweetness) {
const data = await WhiskyModel.find({
'attributes.body': body,
'attributes.richness': richness,
'attributes.smoke': smoke,
'attributes.sweetness': sweetness
}, (error, data) => { // new ¬
if (error) {
return error;
}
if (data) {
console.log(data)
return data
}
});
return data; //new
}
There is still an issue with sending the found results back to the route, but I believe that is a separate issue. If it turns out to be connected, I'll edit this answer with the fix for that.
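For anyone hitting the same [ Promise(pending) ] output: it comes from pushing an unresolved promise inside style_match. A minimal sketch of the fix, keeping the helper and route from the question, is simply to await at each step:
// ---(Start match function)---
const style_match = async function (scores_as_string) {
  const body = scores_as_string[0];
  const richness = scores_as_string[1];
  const smoke = scores_as_string[2];
  const sweetness = scores_as_string[3];
  // await the query so documents (not a pending promise) are returned
  return await fetch_matches_using(body, richness, smoke, sweetness);
};
// ...and in the route, await the async matcher before sending the response
router.get('/match', async (req, res) => {
  const style_matches = await style_match(req.query.style);
  res.send({ response: 200, data: style_matches });
});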

React Native - How to save map using AsyncStorage

I am trying to save user input as a JS Map using AsyncStorage in my React Native app.
It shows no errors when saving, but I get "[object Map]" when I try to get the data back.
Here is a simplified version of my user map. The actual User object has way more properties than this, but the ID is always the same as the map key.
const dataKey = 'user-data';
let data = new Map();
data.set(1, { name: 'John', id: 1, title: 'Mr.' })
data.set(2, { name: 'Johanna', id: 2, title: 'Miss.' })
Here is the code for saving the data.
saveData = async (dataKey, data) => {
try {
await AsyncStorage.setItem(dataKey, data.toString());
} catch (err) {
console.log(err);
}
}
There will be more than 200 users in this map.
Any idea how to save a complex data structure in React Native?
Instead of converting your data to a string, you need to save it as JSON. Change
await AsyncStorage.setItem(dataKey, data.toString());
to
await AsyncStorage.setItem(dataKey, JSON.stringify(data));
See this link to the official documents for more details: https://facebook.github.io/react-native/docs/asyncstorage.html#mergeitem
Like one of the other answers states, you need to save the data as JSON.
However, with a Map you won't be able to simply convert the data to JSON. Instead you will need to spread the entries of the Map into an array and pass that to JSON.stringify().
So change
await AsyncStorage.setItem(dataKey, data.toString());
to
await AsyncStorage.setItem(dataKey, JSON.stringify([...data]));
And then when you want to get the item back from AsyncStorage you will need to convert it back to a Map, i.e.
const jsonData = AsyncStorage.getItem(dataKey)
const mapData = new Map(jsonData)
The provided answers will not quite work. You can't create the new Map after reading the JSON back without parsing it first. This works:
saveData = async () => {
const dataKey = 'user-data';
let data = new Map();
data.set(1, { name: 'John', id: 1, title: 'Mr.' })
data.set(2, { name: 'Johanna', id: 2, title: 'Miss.' })
try {
await AsyncStorage.setItem(dataKey, JSON.stringify([...data]));
} catch (err) {
console.log(err);
}
const jsonData = await AsyncStorage.getItem(dataKey)
const parseData = JSON.parse(jsonData)
const mapData = new Map(parseData)
console.log(mapData);
//get a list of IDs
console.log(Array.from(mapData.keys()));
}
For 200 values this is a lot of overhead. I would consider using sprintf/sscanf-style libraries and just storing a string, for instance one row of data per line.
This will only work if every row in the table has the same number of elements and you don't change the layout. Of course it would all be on you to convert the string back to objects so you can recreate the Map. Just a thought!
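A rough sketch of that string-based idea, skipping the sprintf/sscanf libraries and just joining fields with a delimiter (the field order and the '|' and newline separators are assumptions):
// Serialize: one "id|name|title" row per line
const mapToString = (data) =>
  Array.from(data.values())
    .map((user) => `${user.id}|${user.name}|${user.title}`)
    .join('\n')
// Parse: rebuild the Map from the stored string
const stringToMap = (str) =>
  new Map(
    str.split('\n').map((line) => {
      const [id, name, title] = line.split('|')
      return [Number(id), { id: Number(id), name, title }]
    })
  )
// Inside an async function:
await AsyncStorage.setItem(dataKey, mapToString(data))
const restored = stringToMap(await AsyncStorage.getItem(dataKey))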
