Let's say we query the server with the request below, where we only want the user's email. My current implementation fetches the whole User object from MongoDB, which I imagine is extremely inefficient.
GQL
{
  user(id: "34567345637456") {
    email
  }
}
How would you go about creating a MongoDB filter that returns only the specified fields? E.g.,
JS object
{
  "email": 1
}
My current server is running Node.js, Fastify and Mercurius.
This is an advanced feature with many pitfalls. I would suggest starting with a simple resolver that reads all the fields: it works, and it does not return any additional fields to the client.
The pitfalls are:
nested queries
complex object composition
aliasing
multiple queries into one request
Here is an example that does what you are looking for.
It handles aliasing and multiple queries in one request.
const Fastify = require('fastify')
const mercurius = require('mercurius')

const app = Fastify({ logger: true })

const schema = `
  type Query {
    select: Foo
  }
  type Foo {
    a: String
    b: String
  }
`

const resolvers = {
  Query: {
    select: async (parent, args, context, info) => {
      const currentQueryName = info.path.key

      // search the input query AST node (info.path.key is the alias when one is used)
      const selection = info.operation.selectionSet.selections.find(
        (selection) => {
          return (
            selection.name.value === currentQueryName ||
            selection.alias?.value === currentQueryName
          )
        }
      )

      // grab the fields requested by the user
      const project = selection.selectionSet.selections.map((selection) => {
        return selection.name.value
      })

      // do the query using the projection
      const result = {}
      project.forEach((fieldName) => {
        result[fieldName] = fieldName
      })
      return result
    },
  },
}

app.register(mercurius, {
  schema,
  resolvers,
  graphiql: true,
})

app.listen(3000)
Call it using:
query {
  one: select {
    a
  }
  two: select {
    a
    aliasMe: b
  }
}
Returns
{
  "data": {
    "one": {
      "a": "a"
    },
    "two": {
      "a": "a",
      "aliasMe": "b"
    }
  }
}
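To connect this back to the original question: the project array can be turned into a MongoDB projection so that only the requested fields leave the database. A minimal sketch, assuming the official mongodb driver, a users collection and a db handle available to the resolver (these names are placeholders, not part of the original answer):

const { ObjectId } = require('mongodb')

// Hypothetical helper: turn the requested field names into a projection and run the query
async function findUserWithProjection(db, id, project) {
  const projection = {}
  project.forEach((fieldName) => {
    projection[fieldName] = 1 // e.g. { email: 1 }
  })
  // MongoDB returns only the projected fields (plus _id unless it is excluded)
  return db.collection('users').findOne({ _id: new ObjectId(id) }, { projection })
}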
Expanding on @Manuel Spigolon's original answer, where he stated that the pitfalls of his implementation include nested queries and 'multiple queries into one request', which this implementation seeks to fix.
import _ from 'lodash'; // lodash's merge is used to deep-merge the nested filter object

function formFilter(context: any) {
  let filter: any = {};

  let getValues = (selections: any, parentObj?: string[]) => {
    selections.map((selection: any) => {
      // Check if the parentObj is defined
      if (parentObj)
        // Merge the two objects
        _.merge(filter, [...parentObj, null].reduceRight((obj, next) => {
          if (next === null) return ({ [selection.name?.value]: 1 });
          return ({ [next]: obj });
        }, {}));

      // Check for a nested selection set
      if (selection.selectionSet?.selections !== undefined) {
        // If the selection has a selection set, then we need to recurse
        if (!parentObj) getValues(selection.selectionSet?.selections, [selection.name.value]);
        // If the selection is nested
        else getValues(selection.selectionSet?.selections, [...parentObj, selection.name.value]);
      }
    });
  };

  // Start the recursive function
  getValues(context.operation.selectionSet.selections);
  return filter;
}
Input
{
  role(id: "61f1ccc79623d445bd2f677f") {
    name
    users {
      user_name
      _id
      permissions {
        roles
      }
    }
    permissions
  }
}
Output (JSON.stringify)
{
  "role": {
    "name": 1,
    "users": {
      "user_name": 1,
      "_id": 1,
      "permissions": {
        "roles": 1
      }
    },
    "permissions": 1
  }
}
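As a rough sketch of how this could be used in a resolver (not part of the original answer): the object formFilter returns can be passed as a MongoDB projection. The nested form shown above works as-is on MongoDB 4.4+; on older servers you would flatten it to dot notation such as "users.user_name": 1. The roles collection, the context.db handle and the ObjectId lookup are assumptions here; also note the function should be given the resolver's info argument, since that is what carries operation:

import { ObjectId } from 'mongodb';

const resolvers = {
  Query: {
    role: async (parent, args, context, info) => {
      // formFilter reads .operation, so pass the resolver's info argument
      const projection = formFilter(info).role; // e.g. { name: 1, users: { ... }, permissions: 1 }
      return context.db.collection('roles').findOne(
        { _id: new ObjectId(args.id) },
        { projection }
      );
    },
  },
};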
I am trying to update a Sequelize model where the fields that need to be updated are optional. The problem is that I have 3 fields to update, all of them optional, and I do not want to check each field one by one and call the update method separately, because that would mean multiple calls to the API. Sample raw body input in JSON:
{
  "authorIds": [1, 5],
  "tags": ["tech", "health"],
  "text": "Some very short blog post text here."
}
Any of these fields can be missing. This is what I have so far:
const { authorIds, tags, text } = req.body;

// case where all fields came in
if (authorIds && tags && text) {
  try {
    const ids = authorIds.join(',');
    const tagValue = tags.join(',');
    await Post.update(
      { authorIds: ids, tags: tagValue, text: text },
      { where: { id: postId } }
    );
  } catch (error) {
    res.json({ error: 'Please check your body format' });
  }
}
Note: I am using SQLite, so I cannot store arrays; that is why I am joining the inputs into strings.
Thanks.
You can easily construct the object that you pass as the first argument to update dynamically:
if (authorIds || tags || text) {
  try {
    const fieldsToUpdate = {};
    if (authorIds && authorIds.length) {
      const ids = authorIds.join(',');
      fieldsToUpdate.authorIds = ids;
    }
    if (tags && tags.length) {
      const tagValue = tags.join(',');
      fieldsToUpdate.tags = tagValue;
    }
    if (text) {
      fieldsToUpdate.text = text;
    }
    await Post.update(
      fieldsToUpdate,
      { where: { id: postId } }
    );
  } catch (error) {
    res.json({ error: 'Please check your body format' });
  }
}
Also, you can use the object spread syntax along with ternary operators to combine all fields right in the update call, e.g.
...(authorIds && authorIds.length ? { authorIds: authorIds.join(',') } : {}),
as in the sketch below.
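Putting that together, a minimal sketch of the spread-based variant (Post, postId and the destructured body fields are assumed from the route handler above):

await Post.update(
  {
    ...(authorIds && authorIds.length ? { authorIds: authorIds.join(',') } : {}),
    ...(tags && tags.length ? { tags: tags.join(',') } : {}),
    ...(text ? { text } : {}),
  },
  { where: { id: postId } }
);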
Let's say this is my GraphQL query:
mutation Test ($input: UpdateUserAccountInput!) {
  updateUserAccount(input: $input) {
    ... on UpdateUserAccountPayload {
      userAccount {
        name
      }
    }
  }
}
I want to modify it to include the following fragment:
... on Error {
  message
}
I was able to figure out that I can get the AST using parse from the graphql package, i.e.
import { parse } from 'graphql';

parse(`
  mutation Test ($input: UpdateUserAccountInput!) {
    updateUserAccount(input: $input) {
      ... on UpdateUserAccountPayload {
        userAccount {
          name
        }
      }
    }
  }
`)
Now I am trying to figure out how to add ... on Error { message } to this query.
The problem that I am trying to solve is that my tests sometimes quietly fail because the mutation returns an error that I am not capturing. I am extending my GraphQL test client to automatically request errors for every mutation and throw if an error is returned.
I assume there exist some utilities that allow me to inject fields into the AST, but so far I have not been able to find them.
I think I figured it out.
Here is my solution:
import {
  parse as parseGraphQL,
  print as printGraphQL,
  InlineFragmentNode,
  OperationDefinitionNode,
} from 'graphql';

const appendErrorFieldToMutation = (query: string) => {
  // AST template for `... on Error { message }`
  const inlineErrorFragmentTemplate = {
    kind: 'InlineFragment',
    selectionSet: {
      kind: 'SelectionSet',
      selections: [
        {
          kind: 'Field',
          name: {
            kind: 'Name',
            value: 'message',
          },
        },
      ],
    },
    typeCondition: {
      kind: 'NamedType',
      name: {
        kind: 'Name',
        value: 'Error',
      },
    },
  };

  const document = parseGraphQL(query);

  if (document.definitions.length !== 1) {
    throw new Error('Expected only one definition');
  }

  const definition = document.definitions[0] as OperationDefinitionNode;

  if (definition.operation !== 'mutation') {
    return query;
  }

  if (definition.selectionSet.selections.length !== 1) {
    throw new Error('Expected only one document selection');
  }

  const documentSelection = definition.selectionSet.selections[0] as InlineFragmentNode;

  const errorField = documentSelection.selectionSet.selections.find((selection) => {
    return (selection as InlineFragmentNode).typeCondition?.name.value === 'Error';
  });

  if (!errorField) {
    // @ts-expect-error – Intentionally mutating the AST.
    documentSelection.selectionSet.selections.unshift(inlineErrorFragmentTemplate);
  }

  return printGraphQL(document);
};
I've not used any utilities, so perhaps there is a smarter way to do the same.
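For reference, calling this helper on the mutation from the question prepends the error fragment inside updateUserAccount; the output shown in the comments is what graphql's print produces:

const query = `
  mutation Test ($input: UpdateUserAccountInput!) {
    updateUserAccount(input: $input) {
      ... on UpdateUserAccountPayload {
        userAccount {
          name
        }
      }
    }
  }
`;

console.log(appendErrorFieldToMutation(query));
// mutation Test($input: UpdateUserAccountInput!) {
//   updateUserAccount(input: $input) {
//     ... on Error {
//       message
//     }
//     ... on UpdateUserAccountPayload {
//       userAccount {
//         name
//       }
//     }
//   }
// }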
I'm trying to accept an unlimited number of query parameters in Express.js, but I couldn't figure out how to implement that in my code. I'm using MongoDB aggregation.
I want to build an unlimited facet search with multiple $match stages.
It should work like this:
'http://localhost:4000/search?text=mango'
'http://localhost:4000/search?text=mango&key=brand&value=rasna' //unlimited facets.
'http://localhost:4000/search?text=mango&key=brand&value=rasna&key=color&value=yellow' //unlimited facet parameters
Here's my code to do this:
app.get("/search", async(request, response) => {
try {
const textsearch = request.query.text;
var keystore = request.query.key; //storing `key` in 'keystore'
var valuestore = request.query.value; //storing `value` in `valuestore`
if (keystore, valuestore) {
facetjson = [
{
'$match': {
[keystore]: `${valuestore}` //Storing key and value in $match
}
}
]
const Pipeline = [{
'$search': {
'text': {
'query': `${textsearch}`,
'path': 'title',
}
}
},
{
'$limit': 5
}
]
//Pushing 'facetjson' array into Pipeline array to make a filtered search possible.
const newitem = insert(Pipeline, Pipeline.length - 1, facetjson)
let result = collection.aggregate(newitem).toArray();
response.send(result);
} else {
const Pipeline = [{
'$search': {
'text': {
'query': `${textsearch}`,
'path': 'title',
}
}
},
{
'$limit': 5
}
]
let result = collection.aggregate(Pipeline).toArray();
response.send(result);
};
} catch (error) {
response.status(500).send({ message: error.message });
}
})
[JSFiddle code example](https://jsfiddle.net/divyanshuking/z0vo589e/)
I know that I have to push a $match into the Pipeline array for each single key/value pair. From many Google searches I've figured out that I have to use the rest parameter (...keystore, ...valuestore), but I don't know how to implement it. Do you have any better idea to solve this problem? Please help me.
Why don't you use forEach or something like this?
function endPoint(req, res) {
  const queriesFound = {};
  // req.query is a plain object, not an array, so iterate over its keys
  Object.keys(req.query).forEach((query) => {
    queriesFound[query] = query;
  });
  // queriesFound will be an object like:
  // {
  //   "Name": "Name",
  //   "AnotherParam": "AnotherParam"
  // }
}
Your request URL has the wrong structure for query parameters. If you want to pass multiple key/value pairs in the URL, the correct structure is like this:
'http://localhost:4000/search?text=mango&brand=rasana&color=yellow'
This code should work with this URL structure:
app.get("/search", async(request, response) => {
try {
//We need "search pipeline stage" in all conditions. whether we got a key/value pair in query or not.
//so we use "search stage" when declare pipeline array;
let pipeline = [{
'$search': {
'text': {
'query': `${request.query.text}`,
'path': 'title',
}
}
}];
//If there are keys/values pairs in the query parameters, we add match stage to our pipeline array;
if(request.query) {
let match = {}, hasMatchSatge = false;
for(let item in request.query){
if(item !=== 'text'){
match[item] = request.query[item];
hasMatchStage = true;
}
}
if(hasMatchStage) pipeline.push({'$match': match});
}
//Finally, we add our "limit stage" to the pipeline array;
pipeline.push({'$limit' : 5});
let result = collection.aggregate(pipeline).toArray();
response.status(200).send(result);
} catch (error) {
response.status(500).send({ message: error.message });
}
})
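For example, with the URL above (?text=mango&brand=rasana&color=yellow), this code builds the following aggregation pipeline:

[
  { "$search": { "text": { "query": "mango", "path": "title" } } },
  { "$match": { "brand": "rasana", "color": "yellow" } },
  { "$limit": 5 }
]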
In my React app I'm coding a small photo gallery. With a GraphQL query I get all the images in a folder in this format:
{
  "data": {
    "allFile": {
      "edges": [
        {
          "node": {
            "childImageSharp": {
              "fluid": {
                "aspectRatio": 0.7518796992481203,
                "originalName": "music_01.jpg"
              }
            }
          }
        },
        {
          "node": {
            "childImageSharp": {
              "fluid": {
                "aspectRatio": 1.3333333333333333,
                "originalName": "music_02.jpg"
              }
            }
          }
        },
        {
          "node": {
            "childImageSharp": {
              "fluid": {
                "aspectRatio": 0.7518796992481203,
                "originalName": "food_01.jpg"
              }
            }
          }
        }
      ]
    }
  },
  "extensions": {}
}
It returns about 50 entries that (using links on the page) I need to filter by something like a category, where the category name (music, food and so on) is matched against the filename with a regex or with indexOf.
I was thinking of using state to keep the original data separate from the filtered data, but it seems I'm not able to filter the data down to only the entries I need.
My approach was something like:
const Portofolio = ({ data }) => {
  const [filtered, setFiltered] = useState();

  function onFilterData(filter) {
    // scroll through data and keep only the entries that match filter
    // assign the kept data to filtered with setFiltered(keptdata)
  }

  return (
    // render the gallery from filtered object
  )
}
But I'm stuck on the filtering part and can't get out of it.
Any suggestions?
You can use useMemo to memoize the filtered list and avoid using useEffect and setState; with this approach you save one render every time your data changes.
You need something like this:
const filteredList = React.useMemo(() => {
  return data.filter(({ childImageSharp: { fluid } }) => {
    return fluid.originalName.indexOf(filter) > -1;
  });
}, [data, filter]);
data: is your full list of items
filter: is the route segment that needs to match the item name
You can use the JS filter method: check whether originalName exists and whether it matches the filter passed in as the function argument:
function onFilterData(filter = 'music') {
  const edges = data?.allFile?.edges;
  if (edges) {
    const target = edges.filter(edge => {
      const fileName = edge?.node?.childImageSharp?.fluid?.originalName;
      if (fileName && fileName.includes(filter)) {
        return true;
      }
      return false;
    });
    setFiltered(target);
  }
}
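A rough sketch of how this could be wired back into the component from the question (the button labels and the initial state are assumptions, not part of the original code):

const Portofolio = ({ data }) => {
  // Start from the unfiltered edges so the gallery shows everything initially
  const [filtered, setFiltered] = useState(data?.allFile?.edges ?? []);

  function onFilterData(filter = 'music') {
    // ...as defined above
  }

  return (
    <div>
      <button onClick={() => onFilterData('music')}>Music</button>
      <button onClick={() => onFilterData('food')}>Food</button>
      {/* render the gallery from the `filtered` array */}
    </div>
  );
};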
Imagine your React app gets a response like this:
email_address
first_name
last_name
What's a best-practice way to convert these fields to something more common in JavaScript:
emailAddress
firstName
lastName
Also keep in mind that there could be nested structures.
I've typically done this immediately when the response is received.
My colleagues seem to think it's fine to let the snake_case syntax persist through the app.
There may be some edge cases that fail; I could not find anything on GitHub that would do the trick, but if you hit any errors then please let me know.
It assumes you only pass object literals (or arrays of them) to it; maybe you can write some tests and tell me if anything fails:
const snakeToCamel = snakeCased => {
  // Use a regular expression to find the underscores + the next letter
  return snakeCased.replace(/(_\w)/g, function(match) {
    // Convert to upper case and ignore the first char (=the underscore)
    return match.toUpperCase().substr(1);
  });
};

const toCamel = object => {
  if (Array.isArray(object)) {
    return object.map(toCamel);
  }
  if (typeof object === 'object' && object !== null) {
    return Object.entries(object).reduce(
      (result, [key, value]) => {
        result[snakeToCamel(key)] = toCamel(value);
        return result;
      },
      {}
    );
  }
  return object;
};

console.log(
  toCamel({
    arra_of_things: [
      { thing_one: null },
      { two: { sub_item: 22 } },
    ],
    sub_one: {
      sub_two: {
        sub_three: {
          sub_four: {
            sub_four_value: 22,
          },
        },
      },
    },
  })
);
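The console.log call above should print an object along these lines (keys converted recursively, values untouched):

{
  arraOfThings: [
    { thingOne: null },
    { two: { subItem: 22 } }
  ],
  subOne: {
    subTwo: {
      subThree: {
        subFour: {
          subFourValue: 22
        }
      }
    }
  }
}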