I am new to GraphQL and I would like to know how I could filter my query to get the recipes that contain the ingredient objects I have in my input array.
This is the schema.gql file:
type Recipe {
  id: Int
  title: String!
  author: String
  link: String
  category: String
  subcategory: String
  ingredients: [Ingredients]
}

type Ingredients {
  id: Int
  name: String!
  quantity: Float!
  measure: String
  observation: String
}

type Query {
  recipe: [Recipe]
  ingredient: [Ingredients]
}
This recipe schema has one corresponding service:
const db = require('../db')

class RecipeService {
  // PENDING: finish this service
  async getRecipeByIngredient(ingredient) { }
}
and the corresponding Query resolvers:
Recipe: {
  async ingredients(recipe, _, { dataSources }) {
    return await dataSources.IngredientService.getRecipeIngredients(recipe.id)
  },
},
Query: {
  recipe: async () => db('Recipe'),
  ingredient: async () => db('Ingredient'),
}
The main idea is to have a single filter that finds which recipes contain the ingredients the user provides through the app.
I already have the "recipe" query returning all the recipes in the database, but I need a query that takes these recipes and filters them by ingredient. For example:
Recipe 1 - Sugar Cake, with the ingredients: Sugar, Honey, Flour...
Recipe 2 - Velvet Cake, with the ingredients: Sugar, Vanilla, ...
If the user provides Sugar, the API should return both recipes, but if the user provides Sugar, Honey, and Flour, the API should return only option 1.
Can anyone help me with this?
Thanks a lot.
I found a solution for this and I would like to share it.
This is the filter I implemented in the resolver:
const db = require('../db')

module.exports = {
  Recipe: {
    ingredients(recipe, _, { dataSources }, info) {
      return dataSources.IngredientService.getRecipeIngredients(recipe.id)
    }
  },
  Query: {
    recipe(obj, { name }, { dataSources }, info) {
      if (name) {
        return dataSources.IngredientService.getIngredientsByName(name)
      } else {
        return db('Recipe')
      }
    },
    ingredient: async () => db('Ingredient'),
    recipeByIngredient: async () => db('Recipe'),
  },
  Mutation: {
    createRecipe: async (_, { data }) => (await db('Recipe').insert(data).returning('*'))[0],
    updateRecipe: async (_, { data, id }) => (await db('Recipe').where({ id }).update(data).returning('*'))[0],
    deleteRecipe: async (_, { filter }) => {
      if (filter.id) {
        return await db('Recipe').where({ id: filter.id }).delete()
      }
      if (filter.title) {
        return await db('Recipe').where({ title: filter.title }).delete()
      }
      throw new Error('Should provide the ID or TITLE')
    }
  }
}
With this resolver module, I added a filter to the "recipe" Query resolver: it receives the ingredient "name" and passes it to the service, which applies the filter at the database level. (For this to work, the schema's Query type also needs to declare the argument, e.g. recipe(name: String): [Recipe].)
Thanks for the support.
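The service side is not shown above; as a rough sketch (not the author's actual code), assuming a knex-style db helper and an Ingredient table with a recipeId foreign key pointing at Recipe.id (table and column names are assumptions), getIngredientsByName could resolve to the recipes that contain all of the requested ingredient names:

// Hypothetical sketch of the service method used by the resolver above.
// Assumes a knex-style `db` and an Ingredient table with a `recipeId` column.
const db = require('../db')

class IngredientService {
  // Returns the recipes that contain ALL of the given ingredient names.
  async getIngredientsByName(name) {
    const names = Array.isArray(name) ? name : [name]
    return db('Recipe')
      .join('Ingredient', 'Ingredient.recipeId', 'Recipe.id')
      .whereIn('Ingredient.name', names)
      .groupBy('Recipe.id')
      // keep only recipes that matched every requested name
      .havingRaw('count(distinct ??) = ?', ['Ingredient.name', names.length])
      .select('Recipe.*')
  }
}

module.exports = IngredientService

With something like that in place, the resolver above can return the rows directly, since each row already has the Recipe shape.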
I'm using a React Query mutation to create an object and update the UI optimistically:
const queryClient = useQueryClient()

useMutation({
  mutationFn: updateTodo,
  onMutate: async newTodo => {
    await queryClient.cancelQueries({ queryKey: ['todos'] })
    const previousTodos = queryClient.getQueryData(['todos'])
    // Optimistically update to the new value
    queryClient.setQueryData(['todos'], old => [...old, newTodo])
    return { previousTodos }
  },
  onError: (err, newTodo, context) => {
    queryClient.setQueryData(['todos'], context.previousTodos)
  },
  onSettled: () => {
    queryClient.invalidateQueries({ queryKey: ['todos'] })
  },
})
The new in-memory todo item gets a random ID and is displayed in the UI with a React Spring animation. Then I get the response from the server with a success confirmation and the real todo item ID. My application replaces the item and re-animates the UI element, and that is the problem. The optimistic update is a must-have feature, but I don't know how to stop this behaviour. I need help.
You can use the 'onSuccess' callback function to update the query data.
const queryClient = useQueryClient()

useMutation({
  mutationFn: updateTodo,
  onMutate: async newTodo => {
    await queryClient.cancelQueries({ queryKey: ['todos'] })
    const previousTodos = queryClient.getQueryData(['todos'])
    // Optimistically update to the new value
    queryClient.setQueryData(['todos'], old => [...old, newTodo])
    return { previousTodos }
  },
  onError: (err, newTodo, context) => {
    queryClient.setQueryData(['todos'], context.previousTodos)
  },
  onSuccess: (data, newTodo) => {
    // Update the query data with the real todo item ID from the server response
    queryClient.setQueryData(['todos'], old => old.map(todo => todo.id === newTodo.id ? data.todo : todo))
  },
  onSettled: () => {
    queryClient.invalidateQueries({ queryKey: ['todos'] })
  },
})
The optimistic update looks fine from a react-query perspective; there's nothing to improve on that front.
I guess react-spring reanimates the DOM node because you use the id as key when rendering the todos, but it's hard to say without seeing the actual animation code.
If that is indeed the case, you could try to decouple the actual database ids and ids used for rendering. For example, you could store and return the randomly created id as an additional field, like renderId, so that your todos have the structure of:
{ id: 'id-1', renderId: 'random-string-1', title: 'my-todo-1', done: false }
{ id: 'id-2', renderId: 'random-string-2', title: 'my-todo-2', done: true }
when you create a new todo, you set both id and renderId to the random string when doing the optimistic update:
{ id: 'random-string-3', renderId: 'random-string-3', title: 'my-optimistic-todo', done: false }
then, when it comes back from the db after the invalidation, it will be:
{ id: 'id-3', renderId: 'random-string-3', title: 'my-optimistic-todo', done: false }
that means the renderId will always be consistent, so replacing the todo with the real value after the optimistic update has been performed should not re-trigger the animation if you use renderId as the key.
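For illustration, a minimal sketch of keying the rendered list by renderId rather than by the database id (TodoItem is a hypothetical component; the only point is which value becomes the React key):

// Keying by renderId keeps the element identity stable when the optimistic
// item is later replaced by the server version carrying its real database id.
{todos.map(todo => (
  <TodoItem key={todo.renderId} todo={todo} />
))}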
if you cannot amend the backend schema, you could also generate the renderId on the client, inside the queryFn, if there is no entry in the cache for your current key:
const useTodoQuery = (id) => {
  const queryClient = useQueryClient()
  return useQuery({
    queryKey: ['todos', id],
    queryFn: async ({ queryKey }) => {
      const todo = await fetchTodo(id)
      const renderId = queryClient.getQueryData(queryKey)?.renderId
      return {
        ...todo,
        renderId: renderId ?? generateRenderId()
      }
    }
  })
}
then, if you have already created the renderId during the optimistic update process, you wouldn't create a new one when the queryFn runs.
I need to change a few database fields in my backend controller before returning the object to the frontend.
Currently, I am doing it in the front end like this with my returned object:
for (let contribution of contributions) {
  contribution["title"] = "You Added One"
  contribution["launchName"] = contribution.name
  contribution["launchId"] = contribution._id
  contribution["date"] = contribution.addedAt
  contribution["content"] = contribution.description
}
But I am now trying to do this work in the backend using Mongo.
This is my controller:
const Launch = require('../models/launch')
const User = require('../models/user')

async function getRecentActivityByUserId (req, res) {
  const { userId } = req.params
  const user = await User.findOne({ userId }).lean() || []
  const contributions = await Launch.find({ _id: { $in: user.contributions } })
  return res.status(200).send(contributions.reverse())
}
So this correctly returns an object to the frontend but I still need to change the database field names.
So I tried this:
async function getRecentActivityByUserId (req, res) {
  let recents = []
  const { userId } = req.params
  const user = await User.findOne({ userId }).lean() || []
  const contributions = await Launch.find({ _id: { $in: user.contributions } }).aggregate([
    {
      $addFields: {
        plans: {
          $map: {
            input: "$launch",
            as: "l",
            in: {
              title: "You Added One",
              launchName: "$$l.name",
              launchId: "$$l._id",
              date: "$$l.addedAt",
              content: "$$l.description",
            }
          }
        }
      }
    },
    {
      $out: "launch"
    }
  ])
  return res.status(200).send(contributions.reverse())
}
The above throws an error saying that .aggregate is not a function on .find. Even if I remove the .find, the returned object is just an empty array, so I'm obviously not aggregating correctly.
How can I combine .find with .aggregate, and what is wrong with my .aggregate function?
I also tried combining aggregate with find like this and got the error "Arguments must be aggregate pipeline operators":
const contributions = await Launch.aggregate([
  {
    $match: {
      _id: { $in: user.contributions }
    },
    $addFields: {
      plans: {
        $map: {
          input: "$launch",
          as: "l",
          in: {
            title: "You Added a Kayak Launch",
            launchName: "$$l.name",
            launchId: "$$l._id",
            date: "$$l.addedAt",
            content: "$$l.description",
          }
        }
      }
    }
  },
  {
    $out: "launch"
  }
])
EDIT: I just realized that I have the word plans in the aggregate function, and that is not relevant to my code. I copied this code from elsewhere, so I'm not sure what the value should be.
I figured it out. This is the solution:
async function getRecentActivityByUserId (req, res) {
  let recents = []
  const { userId } = req.params
  const user = await User.findOne({ userId }).lean() || []
  const contributions = await Launch.aggregate([
    {
      $match: {
        _id: { $in: user.contributions }
      }
    },
    {
      $addFields: {
        title: "You Added One",
        launchName: "$name",
        launchId: "$_id",
        date: "$addedAt",
        content: "$description"
      }
    }
  ])
  if (contributions) {
    recents = recents.concat(contributions);
  }
  return res.status(200).send(recents.reverse())
}
The actual problem from the question was a small syntax error, which has been noted and corrected in the self-answer here.
I noted in the comments there that the current approach of issuing two separate operations (a findOne() followed by an aggregate() that uses the results) could be simplified into a single query to the database. The important thing here is that you will $match against the first collection (users or whatever the collection name is in your environment) and then use $lookup to perform the "match" against the subsequent launches collection.
Here is a playground demonstrating the basic approach. Adjust as needed to the specifics of your environment.
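As a rough sketch of that single-query shape (the collection and field names are assumptions; the linked playground is the reference):

// One round trip: match the user, pull in their launches via $lookup,
// then reshape each launch document with the extra display fields.
const results = await User.aggregate([
  { $match: { userId } },
  {
    $lookup: {
      from: 'launches',            // assumed collection name behind the Launch model
      localField: 'contributions',
      foreignField: '_id',
      as: 'contributions'
    }
  },
  { $unwind: '$contributions' },
  { $replaceRoot: { newRoot: '$contributions' } },
  {
    $addFields: {
      title: 'You Added One',
      launchName: '$name',
      launchId: '$_id',
      date: '$addedAt',
      content: '$description'
    }
  }
])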
Definition:
A Team can have many Users and one User as manager:
type Team {
  id: ID!
  title: String
  manager: User @connection
  users: [User] @connection(name: "TeamUsers")
}
The User type is defined as:
type User {
  id: ID!
  username: String
  team: [Team] @connection(name: "TeamUsers")
}
Using JS, I can access the Team manager's username:
{teams.map((team) => {
  return (
    <li>{team.manager.username}</li>
  )
})}
Problem: I can't get the usernames of users in the team using either:
{teams.map((team) => {
  return (
    <li>{team.users.username}</li>
    // or
    <li>{team.users.map((user) => {user.username})}</li>
  )
})}
PS) I am using AWS Amplify and I'm fetching teams using the following code:
const fetchTeams = async () => {
  try {
    const teamsData = await API.graphql(graphqlOperation(listTeams));
    const teamsList = teamsData.data.listTeams.items;
    setClients(teamsList);
  } catch (error) {
    console.error(`fetchTeams failed.`, error);
  }
};
As @Bergi said in the comments, changing

<li>{team.users.map((user) => {user.username})}</li>

to

<li>{team.users.map(user => user.username)}</li>

should fix your issue. With curly braces, the arrow function has a block body and returns nothing, so the map yields undefined values; without them, the username is returned implicitly and rendered.
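Outside of JSX, the same difference shows up with a plain array (a minimal illustration, not Amplify-specific):

const users = [{ username: 'ada' }, { username: 'linus' }]

users.map((user) => { user.username })        // [undefined, undefined] (block body, no return)
users.map((user) => { return user.username }) // ['ada', 'linus']
users.map(user => user.username)              // ['ada', 'linus'] (implicit return)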
CONTEXT
I have two store modules: "Meetings" and "Demands".
Within the "Demands" store I have a "getDemands" action, and within the "Meetings" store I have a "getMeetings" action. Before accessing the meetings data in Firestore, I need to know the demand IDs (e.g. demands[i].id), so the "getDemands" action must run and complete before "getMeetings" is dispatched.
The Vuex documentation on dispatching actions is very complete, but I still don't see how to fit it into my code. There are also some other well-answered questions on the topic here:
Vue - call async action only after first one has finished
Call an action from within another action
I would like to know the best way to implement what I'm trying to accomplish. From my perspective this could be done by triggering one action from another, or using async / await, but I'm having trouble implementing it.
dashboard.vue
computed: {
  demands() {
    return this.$store.state.demands.demands;
  },
  meetings() {
    return this.$store.state.meetings.meetings;
  }
},
created() {
  this.$store.dispatch("demands/getDemands");
  //this.$store.dispatch("meetings/getMeetings"); Try A: didn't work, seems like "getMeetings" must be called once "getDemands" is completed
},
VUEX store
Module A – demands.js
export default {
  namespaced: true,
  state: {
    demands: [], // demands is an array of objects
  },
  actions: {
    // Get demands from firestore UPDATED
    async getDemands({ rootState, commit, dispatch }) {
      const { uid } = rootState.auth.user
      if (!uid) return Promise.reject('User is not logged in!')
      const userRef = db.collection('profiles').doc(uid)
      db.collection('demands')
        .where('toUser', "==", userRef)
        .get()
        .then(async snapshot => {
          const demands = await Promise.all(
            snapshot.docs.map(doc =>
              extractDataFromDemand({ id: doc.id, demand: doc.data() })
            )
          )
          commit('setDemands', { resource: 'demands', demands })
          console.log(demands) // SECOND LOG
        })
      await dispatch("meetings/getMeetings", null, { root: true }) // UPDATE
    },
    ...
  },
  mutations: {
    setDemands(state, { resource, demands }) {
      state[resource] = demands
    },
    ...
Module B – meetings.js
export default {
  namespaced: true,
  state: {
    meetings: [],
  },
  actions: {
    // Get meeting from firestore UPDATED
    getMeetings({ rootState, commit }) {
      const { uid } = rootState.auth.user
      if (!uid) return Promise.reject('User is not logged in!')
      const userRef = db.collection('profiles').doc(uid)
      const meetings = []
      db.collection('demands')
        .where('toUser', "==", userRef)
        .get()
        .then(async snapshot => {
          await snapshot.forEach((document) => {
            document.ref.collection("meetings").get()
              .then(async snapshot => {
                await snapshot.forEach((document) => {
                  console.log(document.id, " => ", document.data()) // LOG 3, 4
                  meetings.push(document.data())
                })
              })
          })
        })
      console.log(meetings) // FIRST LOG
      commit('setMeetings', { resource: 'meetings', meetings })
    },
    ...
  },
  mutations: {
    setMeetings(state, { resource, meetings }) {
      state[resource] = meetings
    },
    ...
Syntax:

dispatch(type: string, payload?: any, options?: Object): Promise<any>

Make the call like this:

dispatch("meetings/getMeetings", null, { root: true })
I have a GraphQL User type that needs information from multiple REST APIs on different servers.
Basic example: get the user's first name from REST domain 1 and the last name from REST domain 2. Both REST domains share a common "userID" attribute.
A simplified example of my current resolver code:
user: async (_source, args, { dataSources }) => {
  try {
    const datasource1 = await dataSources.RESTAPI1.getUser(args.id);
    const datasource2 = await dataSources.RESTAPI2.getUser(args.id);
    return { ...datasource1, ...datasource2 };
  } catch (error) {
    console.log("An error occurred.", error);
  }
  return [];
}
This works fine for this simplified version, but I have two problems with this solution:
First, in practice there is a lot of logic involved in merging the two JSON results, since some fields are shared but hold different data (or are empty). It ends up being cherry-picking from both results to build a combined result.
Second, this is still a waterfall: first get the data from RESTAPI1, and only when that's done call RESTAPI2. Basically this reintroduces the REST waterfall of fetches that GraphQL tries to solve.
With these two problems in mind, can I optimise this piece of code or rewrite it for better performance or readability? Or are there any packages that might help with this?
Many thanks!
With regard to performance, if the two calls are independent of one another, you can utilize Promise.all to execute them in parallel:
const [dataSource1, dataSource2] = await Promise.all([
  dataSources.RESTAPI1.getUser(args.id),
  dataSources.RESTAPI2.getUser(args.id),
])
We normally let GraphQL's default resolver logic do the heavy lifting, but if you're finding that you need to "cherry pick" the data from both calls, you can return something like this in your root resolver:
return { dataSource1, dataSource2 }
and then write resolvers for each field:
const resolvers = {
  User: {
    someField: ({ dataSource1, dataSource2 }) => {
      return dataSource1.a || dataSource2.b
    },
    someOtherField: ({ dataSource1, dataSource2 }) => {
      return someCondition ? dataSource1.foo : dataSource2.bar
    },
  }
}
Assuming, for the sake of the example, that your user resolver returns type User...
type User {
  id: ID!
  datasource1: RandomType
  datasource2: RandomType
}
You can create individual resolvers for each field in type User; this reduces the work done by the user query to only the requested fields.
query {
  user {
    id
    datasource1 {
      ...
    }
  }
}
const resolvers = {
  Query: {
    user: () => {
      return { id: "..." };
    }
  },
  User: {
    datasource1: () => { ... },
    datasource2: () => { ... } // won't execute: not requested in the query above
  }
};
The datasource1 and datasource2 resolvers execute in parallel, but only after Query.user has resolved, and only for the fields the query actually requests.
For the parallel call:
const users = async (_source, args, { dataSources }) => {
  try {
    // Call ds.getUser on each data source (rather than destructuring getUser,
    // which would lose the data source's `this` binding) and run them in parallel.
    const promises = [
      dataSources.RESTAPI1,
      dataSources.RESTAPI2
    ].map(ds => ds.getUser(args.id));
    const data = await Promise.all(promises);
    return Object.assign({}, ...data);
  } catch (error) {
    console.log("An error occurred.", error);
  }
  return [];
};