In Knex.js, I am trying to return the records in an array. Currently, the data is a bunch of objects, and I'm not sure how to write the code to return an array of objects. Below is what I have tried already.
const record = await knex({ ga: 'guideAccess' })
  .join({ g: 'guide' }, 'ga.guideKey', 'g.guideKey')
  .select('*')
  .where({ 'userEid': args.userEid })
  .orderBy('g.version', 'desc')
  .first();

if (!record) throw new Error('Guide access record does not exist.');
if (context.user.isAdmin || context.user.isContentManager) return record;
if (!context.user.isAdmin || !context.user.isContentManager) {
  throw new ForbiddenError('Unauthorized.');
}
return null;
};
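A minimal sketch of one way to get an array back, assuming the only change needed is dropping .first() (which resolves to a single row) so the query resolves to all matching rows:

// Sketch only: without .first(), Knex resolves the query to an array of row objects.
const records = await knex({ ga: 'guideAccess' })
  .join({ g: 'guide' }, 'ga.guideKey', 'g.guideKey')
  .select('*')
  .where({ 'userEid': args.userEid })
  .orderBy('g.version', 'desc');

if (!records.length) throw new Error('No guide access records exist.');
if (context.user.isAdmin || context.user.isContentManager) return records;
throw new ForbiddenError('Unauthorized.');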
So basically what I am trying to do is access a document through a reference attribute that is written in another document, like this:
[Screenshot: the collection and the document that contains the references in an array]
Here is the JavaScript function with which I would like to access the document. In this example I just return the reference attribute to see what it contains:
exports.importProducts = functions.https.onRequest(async (request, response) => {
  const res = { success: false, error: "", docrefs: [] };
  const groceryListID = "groceryList1";
  try {
    const groceryListDoc = await admin
      .firestore()
      .collection("GroceryList")
      .doc(groceryListID.toString())
      .get();
    if (!groceryListDoc.exists) {
      res.error = "Grocery list could not be found";
      response.send(res);
      // return res;
    } else {
      const docref = groceryListDoc.data().docReferences;
      res.success = true;
      res.docrefs = docref[0];
      response.send(res);
    }
  } catch (e) {
    res.error = "Error while reading the Grocery List document : " + e;
    response.send(res);
    // return res;
  }
});
Here is the result I get when I'm reading the reference attribute:
[Result screenshot]
Result in text format: {"success":true,"error":"","docrefs":{"_firestore":{"projectId":""},"_path":{"segments":["Products","p1"]},"_converter":{}}}
I know that I could parse the array elements of "segments" to get the path and access the document, but is there a better way to manage that? Maybe with a DocumentReference object?
The value that the SDK gives you back for a DocumentReference field type in the database is actually a DocumentReference object from the SDK too. So you can just call get() on the value you get back to obtain a snapshot of the referenced document's data.
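For example, a minimal sketch of how the else branch above could read the referenced documents instead of sending the raw reference (names are taken from the question; looping over all references with Promise.all is an assumption):

// Sketch: each element of docReferences is already a DocumentReference,
// so calling get() on it returns a snapshot of the referenced document.
const docRefs = groceryListDoc.data().docReferences;
const productSnaps = await Promise.all(docRefs.map((ref) => ref.get()));
res.success = true;
res.docrefs = productSnaps.map((snap) => snap.data()); // e.g. the data of Products/p1
response.send(res);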
I am trying to write a procedure that logs that an Item was retrieved when the procedure is used. Right now, I can retrieve the items but I am just stuck on adding a single key and value to the item as I loop through the collection:
Here is my code:
function example(prefix) {
    var collection = getContext().getCollection();
    collection.chain().filter(function (doc) {
        return doc.textStatus = "delivered";
    }).map(function (doc) {
        doc.note = "test";
        var isAccepted = collection.replaceDocument(doc._self, doc, function (err) {
            if (err) throw err;
        });
        if (!isAccepted) throw new Error("replaceDocument(metaItem) returned false.");
    }).value();
}
This is giving me this message:
Message: {"Errors":["Encountered exception while executing function. Exception = Error: {"Errors":["Requests originating from scripts cannot reference partition keys other than the one for which client request was submitted."]}
I have also tried setting the partition key of the doc (e.g. doc.partitionKey = "sid"), though I would assume it is already set since the docs are being pulled from the collection.
Update
I thought this code was sending docs to the map function, but when I checked again, it was not. I thought that my partition key was "/sid" or "sid" and entered those as strings for the parameters of the stored procedure. This was after adding === and sid to the filters. This produced no errors, but I could not log or change any docs.
Please change return doc.textStatus = "delivered"; to return doc.textStatus == "delivered" && doc.partitionKey == "sid"; and try again. It works for me.
Update
When you want to execute your SP, you need to pass your partition key value, not your partition key path. According to your screenshot, your partition key path is /sid and your partition key value is undefined.
So your SP should be like this:
function example(partition_key_value) {
    var collection = getContext().getCollection();
    if (!partition_key_value) {
        partition_key_value = undefined;
    }
    collection.chain().filter(function (doc) {
        return doc.textStatus == "delivered" && doc.sid == partition_key_value;
    }).map(function (doc) {
        doc.note = "test";
        var isAccepted = collection.replaceDocument(doc._self, doc, function (err) {
            if (err) throw err;
        });
        if (!isAccepted) throw new Error("replaceDocument(metaItem) returned false.");
    }).value();
}
And execute your SP like this:
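A hedged sketch of one way to do that from code, using the @azure/cosmos Node SDK; the endpoint, key, database and container names are placeholders, and "sid" stands in for whatever the real partition key value is:

const { CosmosClient } = require("@azure/cosmos");

async function runExample() {
    // Placeholders: substitute your own account endpoint, key, database and container.
    const client = new CosmosClient({ endpoint: "https://<account>.documents.azure.com", key: "<key>" });
    const container = client.database("<database>").container("<container>");

    // The first argument is the partition key VALUE (not the path "/sid");
    // the second is the array of parameters passed to the stored procedure.
    const { resource } = await container.scripts
        .storedProcedure("example")
        .execute("sid", ["sid"]);
    console.log(resource);
}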
I am trying to pass this object as a parameter:
{'0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE': '100000000000000000'}
in this function, to update a MySQL JSON column, but I get this error:
ER_BAD_FIELD_ERROR: Unknown column '0xEeeeeEeeeEeEeeEeEeEeeEEEeeeeEeeeeeeeEEeE' in 'field list'
How do I pass this object parameter correctly?
exports.update = async (jsonObj, address) => {
  console.log("jsonObj", jsonObj);
  const q = "UPDATE list SET balance = ? WHERE address = ?";
  try {
    await query(q, [jsonObj, address]);
    console.log("Updated", address);
    return "Ok";
  } catch (err) {
    throw err;
  }
};
The problem is that your SQL is wrong. You will probably want to use JSON_SET('the name of your JSON column', 'the key of the key/value pair you want to update', 'the new value you want for that key/value pair').
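A hedged sketch of what that could look like in the question's update function. This assumes balance is a JSON column, query is the promisified mysql helper used above, and jsonObj always has exactly one key/value pair (the key extraction is an assumption about its shape):

exports.update = async (jsonObj, address) => {
  // Assumed shape: { '0xEeee...EEeE': '100000000000000000' }
  const [key] = Object.keys(jsonObj);
  const value = jsonObj[key];

  // JSON_SET(balance, '$."<key>"', <value>) updates (or adds) just that key
  // inside the JSON column instead of treating the object as column names.
  const q = "UPDATE list SET balance = JSON_SET(balance, ?, ?) WHERE address = ?";
  await query(q, [`$."${key}"`, value, address]);
  return "Ok";
};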
I am using Firebase's .onSnapshot to grab the IDs of the users currently online and store each ID in an array. I successfully deployed .onSnapshot to get the IDs of the online users, but I end up returning an empty array.
var learning_language;
db.collection(ll_profile).doc(user_uid).get().then(function(doc) {
  learning_language = doc.data().learning_language;
});
db.collection(ns_status).where("state", "==", "online").onSnapshot(function(snapshot) {
  var ns_match = [];
  snapshot.forEach(function(userSnapshot) {
    db.collection("ns_profile").doc(userSnapshot.id).get().then(function(doc) {
      var spoken_language = doc.data().spoken_language;
      if (learning_language == spoken_language) {
        ns_match.push(userSnapshot.id);
        console.log(ns_match);
      }
    });
  });
  return (ns_match);
});
What I am trying to do is first define learning_language, retrieved from the ll_profile collection using the current user's ID, user_uid.
Then .onSnapshot listens to another group of users' online state (which updates automatically when a user goes online or offline) inside the ns_status collection. Each online user returned by .onSnapshot is then checked to see whether the spoken_language field inside their document (named with their corresponding uid) matches the learning_language defined earlier. If it matches, their uid is stored in the ns_match array.
The values inside ns_match are correct. I think .get() executes asynchronously. That is why ns_match is returned empty.
How should I return ns_match at the end with all the values stored properly?
Thanks in advance.
function getMatches() {
  return new Promise(resolve => {
    db.collection(ll_profile).doc(user_uid).get()
      .then(function(doc) {
        var learning_language = doc.data().learning_language;
        db.collection(ns_status)
          .where("state", "==", "online")
          .onSnapshot(function(snapshot) {
            var ns_match = [];
            snapshot.forEach(function(userSnapshot) {
              db.collection("ns_profile")
                .doc(userSnapshot.id)
                .get()
                .then(function(doc) {
                  var spoken_language = doc.data().spoken_language;
                  if (learning_language == spoken_language) {
                    ns_match.push(userSnapshot.id);
                    console.log(ns_match);
                  }
                });
            });
            resolve(ns_match);
          });
      });
  });
}
getMatches().then(ns_matches => console.log(ns_matches));
Wrapping in a promise is the correct move. However, remember that the snapshot also returns metadata about your result, in particular snapshot.size. You can use that value to count records inside the forEach callback, or compare the destination array's length with snapshot.size, and only resolve once every lookup has finished.
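A hedged sketch of that idea, keeping the names from the question; Promise.all over the per-user lookups is used here as a substitute for manual counting, so the promise only resolves after every get() has completed:

function getMatches() {
  return db.collection(ll_profile).doc(user_uid).get().then(function(doc) {
    var learning_language = doc.data().learning_language;
    return new Promise(function(resolve) {
      db.collection(ns_status)
        .where("state", "==", "online")
        .onSnapshot(function(snapshot) {
          // snapshot.size is how many online users we are about to look up
          var lookups = snapshot.docs.map(function(userSnapshot) {
            return db.collection("ns_profile").doc(userSnapshot.id).get()
              .then(function(profileDoc) {
                return profileDoc.data().spoken_language === learning_language
                  ? userSnapshot.id
                  : null;
              });
          });
          // Resolve only after all of the lookups have finished
          Promise.all(lookups).then(function(ids) {
            resolve(ids.filter(function(id) { return id !== null; }));
          });
        });
    });
  });
}

getMatches().then(ns_matches => console.log(ns_matches));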
I am attempting to perform an update to a MongoDB document (using mongoose) by first using .findById to get the document, then updating the fields in that document with new values. I am still a bit new to this so I used a tutorial to figure out how to get it working, then I have been updating my code for my needs. Here is the tutorial: MEAN App Tutorial with Angular 4. The original code had a schema defined, but my requirement is for a generic MongoDB interface that will simply take whatever payload is sent to it and send it along to MongoDB. The original tutorial had something like this:
exports.updateTodo = async function(todo) {
  var id = todo.id;
  try {
    // Find the old Todo object by its id
    var oldTodo = await ToDo.findById(id);
  } catch (e) {
    throw Error("Error occurred while finding the Todo");
  }
  // If no old Todo object exists, return false
  if (!oldTodo) {
    return false;
  }
  console.log(oldTodo);
  // Edit the Todo object
  oldTodo.title = todo.title;
  oldTodo.description = todo.description;
  oldTodo.status = todo.status;
  console.log(oldTodo);
  try {
    var savedTodo = await oldTodo.save();
    return savedTodo;
  } catch (e) {
    throw Error("An error occurred while updating the Todo");
  }
}
However, since I don't want a schema and want to allow anything through, I don't want to assign static values to specific field names like title, description, status, etc. So, I came up with this:
exports.updateData = async function(update) {
  var id = update.id;
  // Check the existence of the query parameters; if they don't exist, assign a default value
  var dbName = update.dbName ? update.dbName : 'test';
  var collection = update.collection ? update.collection : 'testing';
  const Test = mongoose.model(dbName, TestSchema, collection);
  try {
    // Find the existing Test object by its id
    var existingData = await Test.findById(id);
  } catch (e) {
    throw Error("Error occurred while finding the Test document - " + e);
  }
  // If no existing Test object exists, return false
  if (!existingData) {
    return false;
  }
  console.log("Existing document is " + existingData);
  // Edit the Test object
  existingData = JSON.parse(JSON.stringify(update));
  // This was another way to overwrite existing field values, but it
  // performs a "shallow copy" so it's not desirable
  //existingData = Object.assign({}, existingData, update)
  //existingData.title = update.title
  //existingData.description = update.description
  //existingData.status = update.status
  console.log("New data is " + existingData);
  try {
    var savedOutput = await existingData.save();
    return savedOutput;
  } catch (e) {
    throw Error("An error occurred while updating the Test document - " + e);
  }
}
My original problem with this was that I had a lot of issues getting the new values to overwrite the old ones. Now that that's been solved, I am getting the error "TypeError: existingData.save is not a function". I am thinking the data type changed or something, and it is no longer being accepted. When I uncomment the static assignments that were in the old tutorial code, it works. This is further supported by my console logging before and after I merge the objects, because the first one prints the actual data and the second one prints [object Object]. However, I can't seem to figure out what it's expecting. Any help would be greatly appreciated.
EDIT: I figured it out. Apparently Mongoose has its own data type of "Model" which gets changed if you do anything crazy to the underlying data by using things like JSON.stringify. I used Object.prototype.constructor to figure out the actual object type like so:
console.log("THIS IS BEFORE: " + existingData.constructor);
existingData = JSON.parse(JSON.stringify(update));
console.log("THIS IS AFTER: " + existingData.constructor);
And I got this:
THIS IS BEFORE: function model(doc, fields, skipId) {
model.hooks.execPreSync('createModel', doc);
if (!(this instanceof model)) {
return new model(doc, fields, skipId);
}
Model.call(this, doc, fields, skipId);
}
THIS IS AFTER: function Object() { [native code] }
Which showed me what was actually going on. I added this to fix it:
existingData = new Test(JSON.parse(JSON.stringify(update)));
On a related note, I should probably just use the native MongoDB driver at this point, but it's working, so I'll just put it on my to-do list for now.
You've now found a solution, but I would suggest using the MongoDB driver, which would make your code look something along the lines of this and would make the original issue disappear:
// MongoDB Settings
const MongoClient = require(`mongodb`).MongoClient;
const mongodb_uri = `mongodb+srv://${REPLACE_mongodb_username}:${REPLACE_mongodb_password}@url-here.gcp.mongodb.net/test`;
const db_name = `test`;
let client; // MongoDB client, set once the connection is opened
let db; // allows us to reuse the database connection once it is opened

// Open MongoDB Connection
const open_database_connection = async () => {
  try {
    client = await MongoClient.connect(mongodb_uri);
  } catch (err) { throw new Error(err); }
  db = client.db(db_name);
};

exports.updateData = async update => {
  // open database connection if it isn't already open
  try {
    if (!db) await open_database_connection();
  } catch (err) { throw new Error(err); }

  // update document
  let savedOutput;
  try {
    savedOutput = await db.collection(`testing`).updateOne( // .save() is being deprecated
      { // filter
        _id: update.id // the '_id' might need to be 'id' depending on how you have set your collection up, usually it is '_id'
      },
      { // I've assumed that you are overwriting the fields you are updating, hence the '$set' operator
        $set: update // this assumes that the update object only contains fields that should be updated
      }
      // If you want to add a new document when the id isn't found, add the line below
      // , { upsert: true }
    );
  } catch (err) { throw new Error(`An error occurred while updating the Test document - ${err}`); }

  if (savedOutput.matchedCount !== 1) return false; // if you add in '{ upsert: true }' above, then remove this line as it will create a new document

  return savedOutput;
}
The collection testing would need to be created before this code is run, but this is only a one-time thing and is very easy: if you are using MongoDB Atlas, then you can use MongoDB Compass or go into your online admin to create the collection without a single line of code...
As far as I can see, you shouldn't need to duplicate the update object. The above reduces the database calls from two to one and allows you to reuse the database connection, potentially anywhere else in the application, which would help to speed things up. Also, don't store your MongoDB credentials directly in the code.