I haven't used GraphQL or MongoDB previously. What is the proper way to pass objects to an update mutation?
Since the only other way I see to pass multiple, dynamically appearing parameters is to use an input type, which seems a bit clunky to me (in terms of how it looks in the code, especially with bigger objects), I just pass the possible values themselves. However, in that case I need to construct the update object dynamically, which again is going to get messy for bigger models.
For example, this is what I have now:
Mutation: {
  updateHub: async (_, { id, url, ports, enabled }) => {
    const query = { _id: id };
    const updateFields = {
      ...(url ? { url: url } : null),
      ...(ports ? { ports: ports } : null),
      ...(enabled ? { enabled: enabled } : null)
    };
    const result = await HubStore.findByIdAndUpdate(query, updateFields);
    return {
      success: !result ? false : true,
      message: 'updated',
      hub: result
    };
  }
}
Any advice on a better way to handle this?
Thanks!
It appears your code could benefit from ES6 rest/spread syntax: it lets you handle an arbitrary number of properties from your args object without a series of ternary expressions.
Mutation: {
  updateHub: async (_, { id, ...restArgs }) => {
    const query = { _id: id };
    const updateFields = { ...restArgs };
    const result = await HubStore.findByIdAndUpdate(query, updateFields);
    return {
      success: !result ? false : true,
      message: 'updated',
      hub: result
    };
  }
}
If for some reason you need to explicitly set the undefined properties to null in your object, you could use a config object together with a method like defaults from the lodash library, as shown below:
import { defaults } from 'lodash';

const nullFill = { url: null, ports: null, enabled: null }; // include any other properties that may be needed

Mutation: {
  updateHub: async (_, { id, ...restArgs }) => {
    const query = { _id: id };
    const updateFields = defaults(restArgs, nullFill);
    const result = await HubStore.findByIdAndUpdate(query, updateFields);
    return {
      success: !result ? false : true,
      message: 'updated',
      hub: result
    };
  }
}
Also, FWIW, I would consider placing the dynamic arguments that could potentially be updated on their own input type, such as HubInput in this case, as suggested in the GraphQL docs. Below I've shown how this might work with your mutation. Note that because nothing on HubInput is flagged as required (!), you are able to pass a dynamic collection of properties to update. Also note that if you take this approach you will need to destructure your args object accordingly in your mutation, something like { id, input } (see the resolver sketch after the type definitions below).
input HubInput {
  url: String
  ports: [String] # or whatever this type actually is
  enabled: Boolean
  # ...anything else that might need updating
}

type UpdateHubPayload {
  success: Boolean
  message: String
  hub: Hub # assumes you have defined a type Hub
}

updateHub(id: Int, input: HubInput!): UpdateHubPayload
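For completeness, the resolver for that shape might look roughly like the sketch below. It simply reuses the HubStore call and payload from your example; since input only contains whatever fields the client actually sent, it can be spread straight into the update:

Mutation: {
  updateHub: async (_, { id, input }) => {
    // input holds only the fields the client chose to send
    const result = await HubStore.findByIdAndUpdate({ _id: id }, { ...input });
    return {
      success: !result ? false : true,
      message: 'updated',
      hub: result
    };
  }
}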
I have an object which returns,
[
  {
    name: "getSpeed",
    params: ["distance", "time"]
  },
  {
    name: "getTime",
    params: ["speed", "distance"]
  },
  ...
]
This object is subject to change as it is gathered from an embedded device.
I'm trying to convert this into an object with callable functions, i.e.
let myObj = {
  getSpeed: function(distance, time){
    /* do something (this is irrelevant) */
  },
  getTime: function(speed, distance){
    /* do something (again not relevant) */
  }
}
Is there any way to map an array of strings to function parameters when mapping over an array?
According to your comments, it appears you want to create functions from this definition that send commands as a function-call-like string which is eval'd on the other side, and you don't actually care about the parameter names, only about the correct number of parameters.
I would therefore recommend something like this:
const myObj = Object.fromEntries(data.map(({ name, params }) => [
  name,
  (...args) => {
    if (args.length !== params.length) {
      throw new TypeError(`${name} expected ${params.length} arguments, got ${args.length}`)
    }
    this.UART.write(`${name}(${args.map(arg => JSON.stringify(arg)).join(',')})`)
  }
]))
This will work with all the data types that JSON supports and, as a side effect, will also pass undefined as an argument correctly.
Here is a runnable example with console.log instead of this.UART.write:
const data = [
  {
    name: "getSpeed",
    params: ["distance", "time"]
  },
  {
    name: "getTime",
    params: ["speed", "distance"]
  }
]

const myObj = Object.fromEntries(data.map(({ name, params }) => [
  name,
  (...args) => {
    if (args.length !== params.length) {
      throw new TypeError(`${name} expected ${params.length} arguments, got ${args.length}`)
    }
    console.log(`${name}(${args.map(arg => JSON.stringify(arg)).join(',')})`)
  }
]))
myObj.getSpeed(123, 456) // prints `getSpeed(123,456)`
myObj.getTime(123, 456) // prints `getTime(123,456)`
myObj.getTime("hello", true) // prints `getTime("hello",true)`
myObj.getTime(1) // throws `getTime expected 2 arguments, got 1`
However, as you said yourself, the whole eval business is not ideal anyway. I would recommend, if possible, reconsidering the protocol and using something more secure and robust like gRPC or, one layer below, protocol buffers. Given that you are using JavaScript on both ends, JSON-RPC could also be a nice solution.
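To illustrate the JSON-RPC idea: the same call would travel as a plain JSON object instead of an eval'd string. This is only a sketch of a JSON-RPC 2.0 request, reusing the method name and arguments from the example above:

// hypothetical JSON-RPC 2.0 framing of the same call
const request = {
  jsonrpc: "2.0",
  id: 1,                 // correlates the eventual response
  method: "getSpeed",
  params: [123, 456]     // positional params matching the device's definition
};
this.UART.write(JSON.stringify(request));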
Via a microservice, I retrieve several packages of JSON data and spit them out onto a Vue.js-driven page. The data looks something like this:
{"data":{"getcompanies":
[
{"id":6,"name":"Arena","address":"12 Baker Street","zip":"15090"},
{"id":7,"name":"McMillan","address":null,"zip":"15090"},
{"id":8,"name":"Ball","address":"342 Farm Road","zip":"15090"}
]
}}
{"data":{"getusers":
[{"id":22,"name":"Fred","address":"Parmesean Street","zip":"15090"},
{"id":24,"name":"George","address":"Loopy Lane","zip":"15090"},
{"id":25,"name":"Lucy","address":"Farm Road","zip":"15090"}]}}
{"data":{"getdevices":
[{"id":2,"name":"device type 1"},
{"id":4,"name":"device type 2"},
{"id":5,"name":"device type 3"}]}}
...and I successfully grab them individually via code like this:
getCompanies() {
  this.sendMicroServiceRequest({
    method: 'GET',
    url: `api/authenticated/function/getcompanies`
  })
  .then((response) => {
    if(response.data) {
      this.dataCompanies = response.data.getcompanies
    } else {
      console.error(response)
    }
  }).catch(console.error)
}
...with getUsers() and getDevices() looking much the same. getCompanies() returns:
[{"id":6,"name":"Arena","address":"12 Baker Street","zip":"15090"},
{"id":7,"name":"McMillan","address":null,"zip":"15090"},
{"id":8,"name":"Ball","address":"342 Farm Road","zip":"15090"}]
...which I relay to the Vue template in a table, and this works just fine and dandy.
But this is obviously going to get unwieldy if I need to add more microservice calls down the road.
What I'm looking for is an elegant way to jump past the response.data.*whatever* and get to those id-records with a re-useable call, but I'm having trouble getting there. response.data[0] doesn't work, and mapping down to the stuff I need either comes back undefined, or in bits of array. And filtering for response.data[0].id to return just the rows with ids keeps coming back undefined.
My last attempt (see below) to access the data does work, but it looks like it comes back as individual array elements. I'd rather not, if possible, rebuild an array into a JSON structure. I keep thinking I should be able to just step past the next level regardless of what it's called, and grab whatever is there in one chunk, as if I read response.data.getcompanies directly, but without caring what 'getcompanies' is or needing to reference it by name:
// the call
this.dataCompanies = this.getFullData('companies')

getFullData(who) {
  this.sendMicroServiceRequest({
    method: 'GET',
    url: 'api/authenticated/function/get' + who,
  })
  .then((response) => {
    if(response) {
      // attempt 1 to get chunk below 'getcompanies'
      Object.keys(response.data).forEach(function(prop) {
        console.log(response.data[prop])
      })
      // attempt 2
      // for (const prop in response.data) {
      //   console.log(response.data[prop])
      // }
      let output = response.data[prop] // erroneously thinking this is in one object
      return output
    } else {
      console.error(response)
    }
  }).catch(console.error)
}
...outputs:
(63) [{…}, {…}, {…}] <-- *there are 63 of these records, I'm just showing the first few...*
0: {"id":6,"name":"Arena","address":"12 Baker Street","zip":"15090"}
1: {"id":7,"name":"McMillan","address":null,"zip":"15090"},
2: {"id":8,"name":"Ball","address":"342 Farm Road","zip":"15090"}...
Oh, and the return above comes back 'undefined' for some reason that eludes me at 3AM. >.<
It's one of those things where I think I am close, but not quite. Any tips, hints, or pokes in the right direction are greatly appreciated.
I feel it's better to be explicit about accessing the object. Seems like the object key is consistent with the name of the microservice function? If so:
getData(functionName) {
  return this.sendMicroServiceRequest({
    method: 'GET',
    url: "api/authenticated/function/" + functionName
  })
  .then( response => response.data[functionName] )
}

getCompanies(){
  this.getData("getcompanies").then(companies => {
    this.dataCompanies = companies
  })
}
let arrResponse = {data: ['x']};
let objResponse = {data: {getcompanies: 'x'}};
console.log(arrResponse.data[0]);
console.log(Object.values(objResponse.data)[0]);
response.data[0] would work if data was an array. To get the first-and-only element of an object, use Object.values(response.data)[0] instead. Object.values converts an object to an array of its values.
Its counterparts Object.keys and Object.entries likewise return arrays of keys and key-value tuples respectively.
Note, order isn't guaranteed in objects, so this is only predictable in your situation because data has exactly a single key & value. Otherwise, you'd have to iterate the entry tuples and search for the desired entry.
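If the payload ever did contain more than one key, that lookup could look something like the sketch below (the extra someOtherKey is hypothetical, just to show the search):

// pick out the entry whose value is the array of records we want
const response = { data: { someOtherKey: 123, getcompanies: [{ id: 6, name: "Arena" }] } };

const [, records] = Object.entries(response.data)
  .find(([, value]) => Array.isArray(value)) || [];

console.log(records); // [{ id: 6, name: "Arena" }]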
firstValue
Let's begin with a generic function, firstValue. It will get the first value of an object, if present, otherwise it will throw an error -
const x = { something: "foo" }
const y = {}
const firstValue = t => {
  const v = Object.values(t)
  if (v.length)
    return v[0]
  else
    throw Error("empty data")
}
console.log(firstValue(x)) // "foo"
console.log(firstValue(y)) // Error: empty data
getData
Now write a generic getData. We chain our firstValue function on the end, and we are careful not to add a console.log or .catch here; that is a choice for the caller to make -
getData(url) {
  return this
    .sendMicroServiceRequest({ method: "GET", url })
    .then(response => {
      if (response.data)
        return response.data
      else
        return Promise.reject(response)
    })
    .then(firstValue)
}
Now we write getCompanies, getUsers, etc -
getCompanies() {
  return this.getData("api/authenticated/function/getcompanies")
}

getUsers() {
  return this.getData("api/authenticated/function/getusers")
}
//...
async and await
Maybe you could spruce up getData with async and await -
async getData(url) {
  const response =
    await this.sendMicroServiceRequest({ method: "GET", url })
  return response.data
    ? firstValue(response.data)
    : Promise.reject(response)
}
power of generics demonstrated
We might even suggest that these get* functions are no longer needed -
async getAll() {
  return {
    companies:
      await this.getData("api/authenticated/function/getcompanies"),
    users:
      await this.getData("api/authenticated/function/getusers"),
    devices:
      await this.getData("api/authenticated/function/getdevices"),
    // ...
  }
}
Above we used three await getData(...) requests which happen in serial order. Perhaps you want all of these requests to run in parallel. Below we will show how to do that -
async getAll() {
  const requests = [
    this.getData("api/authenticated/function/getcompanies"),
    this.getData("api/authenticated/function/getusers"),
    this.getData("api/authenticated/function/getdevices")
  ]
  const [companies, users, devices] = await Promise.all(requests)
  return { companies, users, devices }
}
error handling
Finally, error handling is reserved for the caller and should not be attempted within our generic functions -
this.getAll()
.then(data => this.render(data)) // some Vue template
.catch(console.error)
I have a page that consists of two components, and each of them has its own request for data. For example:
<MovieInfo movieId={queryParamsId}/>

const GET_MOVIE_INFO = gql`
  query($id: String!){
    movie(id: $id){
      name
      description
    }
  }
`

Next component:

<MovieActors movieId={queryParamsId}/>

const GET_MOVIE_ACTORS = gql`
  query($id: String!){
    movie(id: $id){
      actors
    }
  }
`
For each of these queries I use the Apollo useQuery hook:

const { data, loading, error } = useQuery(GET_DATA, { variables: { id: queryParamsId } })
Everything is fine, but I got a warning message:
Cache data may be lost when replacing the movie field of a Query object.
To address this problem (which is not a bug in Apollo Client), either ensure all objects of type Movie have IDs, or define a custom merge function for the Query.movie field, so InMemoryCache can safely merge these objects: { ... }
It works fine in Google Chrome, but this error affects Safari: everything crashes there, and I'm 100% sure it's because of this warning message. On the first request I set the Movie data in the cache; on the second request for the same query I just replace the old data with the new, so the previously cached data becomes undefined. How can I resolve this problem?
Here is the same solution mentioned by Thomas, but a bit shorter:

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          // shorthand
          merge: true,
        },
      },
    },
  },
});

This is the same as the following:

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing, incoming, { mergeObjects }) {
            return mergeObjects(existing, incoming);
          },
        },
      },
    },
  },
});
Solved!

cache: new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        YOUR_FIELD: {
          merge(existing, incoming) {
            // this part depends on what you actually need to do;
            // in my case I had to save the incoming data as a single object in the cache
            return { ...existing, ...incoming };
          }
        }
      }
    }
  }
})
The other answers still work, but as of Apollo Client >= 3.3 there's an easier option that doesn't require specifying specific fields or a custom merge function. Instead, you only have to specify the type and it will merge all fields for that type:
const cache = new InMemoryCache({
  typePolicies: {
    YOUR_TYPE_NAME: {
      merge: true,
    }
  }
});
From your example query, I'd guess that an id field should be available, though? Try requesting the id in your query; that should solve the problem in a much more ideal way.
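For example (a sketch based on the query in the question, assuming the Movie type actually exposes an id field), requesting the id lets InMemoryCache identify and merge the two results into one normalized Movie object instead of replacing it:

const GET_MOVIE_INFO = gql`
  query($id: String!){
    movie(id: $id){
      id          # lets Apollo Client normalize and merge the cached Movie
      name
      description
    }
  }
`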
I had the same issue, caused by an inconsistency between our data values and our schema: a value type within an entity was missing its id value, the result of an incomplete data migration.
Temporary solution:
const typePolicies = {
  PROBLEM_TYPE: {
    keyFields: false as false,
  },
  PARENT_TYPE: {
    fields: {
      PROBLEM_FIELD: {
        merge: true
      }
    }
  }
}
I have the following GraphQL schema, which defines 3 types: a CondaPackage, which has many CondaVersion, which has many CondaExecutable. I want to be able to query a CondaVersion and ask "how many CondaExecutables do you own which succeeded my analysis?". Currently I've written a succeededExeCount and an allExeCount which resolve this field by loading all children and manually counting the number of children that succeeded.
exports.createSchemaCustomization = ({ actions: { createTypes }, schema }) => {
  createTypes([
    schema.buildObjectType({
      name: "CondaPackage",
      fields: {
        succeededExeCount: {
          type: "Int!",
          resolve(source, args, context){
            // TODO
          }
        },
        allExeCount: {
          type: "Int!",
          resolve(source, args, context){
            // TODO
          }
        }
      },
      interfaces: ["Node"]
    }),
    schema.buildObjectType({
      name: "CondaVersion",
      fields: {
        succeededExeCount: {
          type: "Float!",
          resolve(source, args, context){
            const children = context.nodeModel.getNodesByIds({
              ids: source.children,
              type: "CondaExecutable"
            })
            return children.reduce((acc, curr) => acc + curr.fields.succeeded, 0)
          }
        },
        allExeCount: {
          type: "Int!",
          resolve(source, args, context){
            return source.children.length;
          }
        }
      },
      interfaces: ["Node"]
    }),
    schema.buildObjectType({
      name: "CondaExecutable",
      fields: {
        succeeded: {
          type: "Boolean!",
          resolve(source, args, context, info) {
            return source.fields.succeeded || false;
          }
        },
      },
      interfaces: ["Node"]
    })
  ])
}
My first problem is that this seems incredibly inefficient. For each CondaVersion I'm running a separate query for its children, which is a classic N+1 query problem. Is there a way to tell Gatsby/GraphQL to simply "join" the two tables like I would using SQL to avoid this?
My second problem is that I now need to count the number of succeeding children from the top level type: CondaPackage. I want to ask "how many CondaExecutables do your child CondaVersions own which succeeded my analysis". Again, in SQL this would be easy because I would just JOIN the 3 types. However, the only way I can currently do this is by using getNodesByIds for each child, and then for each child's child, which is n*m*o runtime, which is terrifying. I would like to run a GraphQL query as part of the field resolution which lets me grab the succeededExeCount from each child. However, Gatsby's runQuery seems to return nodes without including derived fields, and it won't let me select additional fields to return. How can I access fields on a node's child's child in Gatsby?
Edit
Here's the response from a Gatsby maintainer regarding the workaround:
Gatsby has an internal mechanism to filter/sort by fields with custom resolvers. We call it materialization. [...] The problem is that this is not a public API. This is a sort of implementation detail that may change someday and that's why it is not documented.
See the full thread here.
Original Answer
Here's a little 'secret' (not mentioned anywhere in the docs at the time of writing):
When you use runQuery, Gatsby will try to resolve derived fields... but only if that field is passed to the query's options (filter, sort, group, distinct).
For example, in CondaVersion, instead of accessing the children nodes and looking up fields.succeeded, you can do this:
const succeededNodes = await context.nodeModel.runQuery({
  type: "CondaExecutable",
  query: { filter: { succeeded: { eq: true } } }
})
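(The resolver would then presumably just return the count of those nodes; the exact line depends on how you wire it up, so treat this as a sketch:)

return succeededNodes ? succeededNodes.length : 0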
Same thing for CondaPackage. You might try to do this:

const versionNodes = await context.nodeModel.runQuery({
  type: "CondaVersion",
  query: {}
})
return versionNodes.reduce((acc, node) => acc + node.succeededExeCount, 0) // Error
You'll probably find that succeededExeCount is undefined.
The trick is to do this:
const versionNodes = await context.nodeModel.runQuery({
type: "CondaVersion",
- query: {}
+ query: { filter: { succeededExeCount: { gte: 0 } } }
})
It's counterintuitive, because you'd think Gatsby would just resolve all resolvable fields on a type. Instead it only resolves fields that are 'used'. So to get around this, we add a filter that supposedly does nothing.
But that's not all: node.succeededExeCount is still undefined.
The resolved data (succeededExeCount) is not stored directly on the node itself, but on node.__gatsby_resolved. We'll have to access it there instead.
const versionNodes = await context.nodeModel.runQuery({
  type: "CondaVersion",
  query: { filter: { succeededExeCount: { gte: 0 } } }
})
return versionNodes.reduce((acc, node) => acc + node.__gatsby_resolved.succeededExeCount, 0)
Give it a try & let me know if that works.
PS: I notice that you probably use createNodeField (for CondaExecutable's node.fields.succeeded?). createTypes is also accessible in exports.sourceNodes, so you might be able to add this succeeded field directly.
When a user registers with my API they are returned a user object. Before returning the object I remove the hashed password and salt properties. I have to use
user.salt = undefined;
user.pass = undefined;
Because when I try
delete user.salt;
delete user.pass;
the object properties still exist and are returned.
Why is that?
To use delete you would need to convert the model document into a plain JavaScript object by calling toObject so that you can freely manipulate it:
user = user.toObject();
delete user.salt;
delete user.pass;
Non-configurable properties cannot be re-configured or deleted.
You should use strict mode so you get in-your-face errors instead of silent failures:
(function() {
  "use strict";
  var o = {};
  Object.defineProperty(o, "key", {
    value: "value",
    configurable: false,
    writable: true,
    enumerable: true
  });
  delete o.key;
})()
// TypeError: Cannot delete property 'key' of #<Object>
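If you want to see why delete is failing for your particular user document, you can inspect the property descriptors; this is just a diagnostic sketch (for Mongoose documents the schema paths are usually defined on the document's prototype rather than as own properties, which is another reason delete on the document itself has no visible effect):

// where does the property live, and is it configurable?
console.log(Object.getOwnPropertyDescriptor(user, 'salt'));
console.log(Object.getOwnPropertyDescriptor(Object.getPrototypeOf(user), 'salt'));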
Another solution, aside from calling toObject, is to access the _doc directly on the Mongoose object and use the ES6 spread operator to overwrite the unwanted properties (they are then dropped when the object is serialized to JSON):

user = { ...user._doc, salt: undefined, pass: undefined }
Rather than converting to a JavaScript object with toObject(), it might be more ideal to instead choose which properties you want to exclude via the Query.prototype.select() function.
For example, if your User schema looked something like this:
const userSchema = new mongoose.Schema({
email: {
type: String,
required: true,
},
name: {
type: String,
required: true
},
pass: {
type: String,
required: true
},
salt: {
type: String,
required: true
}
});
module.exports = {
User: mongoose.model("user", userSchema)
};
Then if you wanted to exclude the pass and salt properties in a response containing an array of all users, you could do so by specifically choosing which properties to ignore by prepending a minus sign before the property name:
users.get("/", async (req, res) => {
try {
const result = await User
.find({})
.select("-pass -salt");
return res
.status(200)
.send(result);
}
catch (error) {
console.error(error);
}
});
Alternatively, if you have more properties to exclude than include, you can specifically choose which properties to add instead of which properties to remove:
const result = await User
.find({})
.select("email name");
The delete operator works on plain JavaScript objects, and Mongoose documents are not plain JavaScript objects. So convert the document into a plain object and delete the property there.
The code should look like this:

const modelJsObject = model.toObject();
delete modelJsObject.property;
But that causes problems while saving the object. So what I did was just to set the property value to undefined.
model.property = undefined;
Old question, but I'm throwing my 2-cents into the fray....
Your question has already been answered correctly by others; this is just a demo of how I worked around it.
I used Object.entries() + Array.reduce() to solve it. Here's my take:
// define dis-allowed keys and values
const disAllowedKeys = ['_id','__v','password'];
const disAllowedValues = [null, undefined, ''];
// our object, maybe a Mongoose model, or some API response
const someObject = {
_id: 132456789,
password: '$1$O3JMY.Tw$AdLnLjQ/5jXF9.MTp3gHv/',
name: 'John Edward',
age: 29,
favoriteFood: null
};
// use reduce to create a new object with everything EXCEPT our dis-allowed keys and values!
const withOnlyGoodValues = Object.entries(someObject).reduce((ourNewObject, pair) => {
  const key = pair[0];
  const value = pair[1];
  if (
    disAllowedKeys.includes(key) === false &&
    disAllowedValues.includes(value) === false
  ){
    ourNewObject[key] = value;
  }
  return ourNewObject;
}, {});
// what we get back...
// {
// name: 'John Edward',
// age: 29
// }
// do something with the new object!
server.sendToClient(withOnlyGoodValues);
This can be cleaned up more once you understand how it works, especially with some fancy ES6 syntax (see the shorter sketch after the links below). I intentionally tried to make it extra readable, for the sake of the demo.
Read docs on how Object.entries() works: MDN - Object.entries()
Read docs on how Array.reduce() works: MDN - Array.reduce()
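For what it's worth, a more compact version of the same idea (just a sketch, reusing the disAllowedKeys and disAllowedValues from above) can use array destructuring with Object.fromEntries():

// same filtering logic, condensed with destructuring and Object.fromEntries
const withOnlyGoodValues = Object.fromEntries(
  Object.entries(someObject).filter(
    ([key, value]) => !disAllowedKeys.includes(key) && !disAllowedValues.includes(value)
  )
);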
I use this little function just before I return the user object.
Of course I have to remember to add any new key I wish to remove, but it works well for me.

const protect = (o) => {
  const removes = ['__v', '_id', 'salt', 'password', 'hash'];
  const m = o.toObject();
  removes.forEach(element => {
    try {
      delete m[element]
    }
    catch (O_o) {}
  });
  return m
}
And I use it, as I said, just before I return the user (protect is synchronous, so no await is needed):

return res.json({ success: true, user: protect(user) });
Alternatively, it can be made more dynamic by passing in the keys to remove:

const protect = (o, removes) => {
  const m = o.toObject();
  removes.forEach(element => {
    try {
      delete m[element]
    }
    catch (O_o) {}
  });
  return m
}

return res.json({ success: true, user: protect(user, ['salt', 'hash']) });