Cleaning Unwanted Fields From GraphQL Responses - javascript

I have an object that my GraphQL client requests.
It's a reasonably simple object:
type Element {
  content: [ElementContent]
  elementId: String
  name: String
  notes: String
  type: String
  createdAt: String
  updatedAt: String
}
With the special type ElementContent, which is tiny and looks like this:
type ElementContent {
  content: String
  locale: String
}
Now, when I query this on the client side, both the top-level object and the lower-level objects have additional properties, which interfere with updating the object if I attempt to clone the body exactly as-is.
Notably, GraphQL seems to supply a __typename property in the parent object, and the child objects have a __typename and a Symbol(id) property as well.
I'd love to copy this object to state, update it in state, then clone the state and ship it to my update mutation. However, I get roadblocked by the unknown properties that GraphQL itself supplies.
I've tried doing:
delete element.__typename
to good effect, but then I also need to loop through the children (a dynamic array of objects) and remove those properties as well.
I'm not sure if I'm missing something here, or if I should just struggle through the code and loop + delete (I got errors when I first attempted a forEach loop). Is there a better strategy for what I'm attempting to do? Or am I on the right path and just in need of some good loop code to clean out the unwanted properties?
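For what it's worth, the loop-and-delete I'm describing would look something like this minimal recursive sketch (untested; it assumes plain data objects shaped like the types above):
const stripTypename = (value) => {
  if (Array.isArray(value)) {
    value.forEach(stripTypename);
  } else if (value && typeof value === 'object') {
    delete value.__typename; // the property Apollo adds
    Object.values(value).forEach(stripTypename); // recurse into children
  }
  return value;
};
stripTypename(element); // mutates element and its content children in place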

There are three ways of doing this.
First way
Update the client configuration like this; it will stop Apollo from adding the unwanted __typename fields in the first place:
apollo.create({
  link: http,
  cache: new InMemoryCache({
    addTypename: false
  })
});
Second way
Use the omit-deep package inside an ApolloLink, as middleware:
import { ApolloLink } from 'apollo-link';
import omitDeep from 'omit-deep';

const cleanTypeName = new ApolloLink((operation, forward) => {
  if (operation.variables) {
    operation.variables = omitDeep(operation.variables, '__typename');
  }
  return forward(operation).map((data) => {
    return data;
  });
});
Third way
Create a custom middleware and inject it into Apollo:
const cleanTypeName = new ApolloLink((operation, forward) => {
  if (operation.variables) {
    const omitTypename = (key, value) => (key === '__typename' ? undefined : value);
    operation.variables = JSON.parse(JSON.stringify(operation.variables), omitTypename);
  }
  return forward(operation).map((data) => {
    return data;
  });
});
and inject the middleware
const httpLinkWithErrorHandling = ApolloLink.from([
  cleanTypeName,
  retry,
  error,
  http,
]);
If you use fragments with your queries/mutations, the second and third ways are recommended.
The third way is the preferred method, because it has no third-party dependency and no cache performance issues.

If you want to wipe __typename from a GraphQL response (from the root and its children), you can use the graphql-anywhere package.
Something like:
const wipedData = filter(inputFragment, rcvData);
inputFragment is a fragment that defines the fields (you can see details here)
rcvData is the data received from the GraphQL query
By using the filter function, wipedData includes only the required fields you need to pass as the mutation input.
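For example, a hedged sketch assuming the Element type from the question (the fragment fields are my guess at the mutation's input shape):
import { filter } from 'graphql-anywhere';
import gql from 'graphql-tag';

// Hypothetical fragment listing only the fields the mutation accepts.
const inputFragment = gql`
  fragment ElementInput on Element {
    elementId
    name
    notes
    type
    content {
      content
      locale
    }
  }
`;

// __typename and Symbol(id) are dropped because they are not in the fragment.
const wipedData = filter(inputFragment, rcvData);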

Here's what I did, to support file uploads as well. It's a merge of multiple suggestions I found in the GitHub thread here: Feature idea: Automatically remove __typename from mutations.
import { parse, stringify } from 'flatted';

const cleanTypename = new ApolloLink((operation, forward) => {
  const omitTypename = (key, value) => (key === '__typename' ? undefined : value);
  if (operation.variables && !operation.getContext().hasUpload) {
    operation.variables = parse(stringify(operation.variables), omitTypename);
  }
  return forward(operation);
});
Hooking up the rest of my client.tsx file, simplified:
import { InMemoryCache } from 'apollo-cache-inmemory';
import { createUploadLink } from 'apollo-upload-client';
import { ApolloClient } from 'apollo-client';
import { setContext } from 'apollo-link-context';
import { ApolloLink } from 'apollo-link';

// upLoadLink (from createUploadLink) and cache are created elsewhere; the file is simplified here.

const authLink = setContext((_, { headers }) => {
  const token = localStorage.getItem(AUTH_TOKEN);
  return {
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : '',
    },
  };
});

const httpLink = ApolloLink.from([
  cleanTypename,
  authLink.concat(upLoadLink),
]);

const client = new ApolloClient({
  link: httpLink,
  cache,
});

export default client;
Now when I call mutations that include an upload, I simply set the hasUpload context to true, as shown here:
UpdateStation({variables: { input: station }, context: {hasUpload: true }}).then()
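The hasUpload guard matters because a JSON-style round trip mangles file values (flatted behaves like JSON in this respect), so an upload payload would otherwise be silently lost:
// A File has no enumerable properties, so serializing it loses the payload.
const file = new File(['hello'], 'hello.txt');
console.log(JSON.parse(JSON.stringify({ file }))); // => { file: {} }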

For those looking for a TypeScript solution:
import cloneDeepWith from "lodash/cloneDeepWith";

export const omitTypenameDeep = (
  variables: Record<string, unknown>
): Record<string, unknown> =>
  cloneDeepWith(variables, (value) => {
    if (value && value.__typename) {
      const { __typename, ...valWithoutTypename } = value;
      // Recurse so __typename fields nested inside this value are removed as well
      // (returning from the customizer stops lodash's own deep traversal here).
      return omitTypenameDeep(valWithoutTypename);
    }
    return undefined; // let cloneDeepWith handle all other values
  });
const removeTypename = new ApolloLink((operation, forward) => {
  operation.variables = omitTypenameDeep(operation.variables);
  return forward(operation);
});

// ...

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([removeTypename, httpLink]),
});

I've just published graphql-filter-fragment to help with this use case. I've written in a bit more detail about the CRUD use case that led me to this approach.
Example:
import {filterGraphQlFragment} from 'graphql-filter-fragment';
import {gql} from '@apollo/client/core';

const result = filterGraphQlFragment(
  gql`
    fragment museum on Museum {
      name
      address {
        city
      }
    }
  `,
  {
    __typename: 'Museum',
    name: 'Museum of Popular Culture',
    address: {
      __typename: 'MuseumAddress',
      street: '325 5th Ave N',
      city: 'Seattle'
    }
  }
);

expect(result).toEqual({
  name: 'Museum of Popular Culture',
  address: {
    city: 'Seattle'
  }
});

Here is my solution. Vanilla JS, recursive, and does not mutate the original object:
const removeAllTypenamesNoMutate = (item) => {
  if (!item) return item;
  const recurse = (source, obj) => {
    if (!source) return;
    if (Array.isArray(source)) {
      for (let i = 0; i < source.length; i++) {
        const item = source[i];
        if (item !== undefined && item !== null) {
          source[i] = recurse(item, item);
        }
      }
      return obj;
    } else if (typeof source === 'object') {
      for (const key in source) {
        if (key === '__typename') continue;
        const property = source[key];
        if (Array.isArray(property)) {
          obj[key] = recurse(property, property);
        } else if (!!property && typeof property === 'object') {
          const { __typename, ...rest } = property;
          obj[key] = recurse(rest, rest);
        } else {
          obj[key] = property;
        }
      }
      const { __typename, ...rest } = obj;
      return rest;
    } else {
      return obj;
    }
  };
  // Deep-copy first so the input is never mutated, then strip within the copy.
  // Passing the copy as both arguments also handles a top-level array correctly.
  const copy = JSON.parse(JSON.stringify(item));
  return recurse(copy, copy);
};
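Usage is a single call, and the input is left untouched:
const cleaned = removeAllTypenamesNoMutate(elementFromQuery); // elementFromQuery keeps its __typename fields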

The approach below has worked for me so far in my use case.
const {
  loading,
  error,
  data,
} = useQuery(gqlRead, {
  variables: { id },
  fetchPolicy: 'network-only',
  onCompleted: (data) => {
    const { someNestedData } = data;
    const filteredData = removeTypeNameFromGQLResult(someNestedData);
    // Do sth with filteredData
  },
});

// in helper
export const removeTypeNameFromGQLResult = (result: Record<string, any>) => {
  return JSON.parse(
    JSON.stringify(result, (key, value) => {
      if (key === '__typename') return;
      return value;
    })
  );
};

Related

The filter function does not remove the object in the array in Vue/Vuex

I am busy making a booking function in Vue 3/Vuex.
A user can book an item and also remove it from the basket. The problem is that the filter function does not remove the object from the array, and I cannot find out what the problem is. I hope you can help me.
This is what I get if I put console.log() in removeFromBasket(state, payload):
removeFromBasket(state, payload) {
  console.log('removeFromBasket', payload, JSON.parse(JSON.stringify(state.basket.items)))
}
method to remove
removeFromBasket() {
  this.$store.commit('basket/removeFromBasket', this.bookableId);
}
basket module
const state = {
  basket: {
    items: []
  },
};

const getters = {
  getCountOfItemsInBasket: (state) => {
    return state.basket.items.length
  },
  getAllItemsInBasket: (state) => {
    return state.basket.items
  },
  inBasketAlready(state) {
    return function (id) {
      return state.basket.items.reduce((result, item) => result || item.bookable.id === id, false);
    }
  },
};

const actions = {};

const mutations = {
  addToBasket(state, payload) {
    state.basket.items.push(payload);
  },
  removeFromBasket(state, payload) {
    state.basket.items = state.basket.items.filter(item => item.bookable.id !== payload);
  }
};

export default {
  namespaced: true,
  state,
  getters,
  actions,
  mutations
};
I have solved the problem.
I used typeof in console.log() to see what types payload and item.bookable.id are.
The payload was a string and item.bookable.id was a number, so the strict comparison !== never matched.
Wrapping the payload in parseInt(payload) solved the problem:
removeFromBasket(state, payload) {
  state.basket.items = state.basket.items.filter(item => item.bookable.id !== parseInt(payload));
}
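An alternative sketch that sidesteps the type mismatch entirely is to normalize both sides before comparing (equivalent for numeric ids):
removeFromBasket(state, payload) {
  // Compare as strings so a string payload matches a numeric id.
  state.basket.items = state.basket.items.filter(
    item => String(item.bookable.id) !== String(payload)
  );
}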

whitelist nested item in state (Redux persist)

The state in my reducer contains the key current_theme, which holds an object with the key palette, which in turn holds an object with the key mode, whose value is either the string "dark" or the string "light".
So I need to make only this bit of data persistent while leaving all other attributes intact.
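For clarity, here is the state shape as I understand it from the description above (a reconstruction, not the actual reducer):
{
  current_theme: {
    palette: {
      mode: "dark", // or "light": the only value that should persist
      // ...other palette props
    },
    // ...other theme props
  }
}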
redux-persist offers a whitelist parameter, which is what I want. However, I can only do something like:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    storage,
    whitelist: ["current_theme"]
  },
  myReducer
);
But this makes everything inside current_theme persistent. I want only current_theme.palette.mode to be persistent and nothing else.
I tried the below, but it didn't work either:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    storage,
    whitelist: ["current_theme.palette.mode"]
  },
  myReducer
);
I had to write my own simplified version of the persistReducer method, which allows nested blacklisting/whitelisting with the help of dot-prop-immutable and the deep-merge function from this question. Thanks to Salakar and CpILL.
import dotProp from "dot-prop-immutable";

const STORAGE_PREFIX: string = 'persist:';

interface persistConfig {
  key: string;
  whitelist?: string[];
  blacklist?: string[];
}

function isObject(item: any) {
  return (item && typeof item === 'object' && !Array.isArray(item));
}

function mergeDeep(target: any, source: any) {
  let output = Object.assign({}, target);
  if (isObject(target) && isObject(source)) {
    Object.keys(source).forEach(key => {
      if (isObject(source[key])) {
        if (!(key in target))
          Object.assign(output, { [key]: source[key] });
        else
          output[key] = mergeDeep(target[key], source[key]);
      } else {
        Object.assign(output, { [key]: source[key] });
      }
    });
  }
  return output;
}

function filterState({ state, whitelist, blacklist }: { state: any, whitelist?: string[], blacklist?: string[] }) {
  if (whitelist && blacklist) {
    throw Error("Can't set both whitelist and blacklist at the same time");
  }
  if (whitelist) {
    var newState: any = {};
    for (const i in whitelist) {
      let val = dotProp.get(state, whitelist[i]);
      if (val !== undefined) {
        newState = dotProp.set(newState, whitelist[i], val)
      }
    }
    return newState;
  }
  if (blacklist) {
    var filteredState: any = JSON.parse(JSON.stringify(state));
    for (const i in blacklist) {
      filteredState = dotProp.delete(filteredState, blacklist[i]);
    }
    return filteredState;
  }
  return state
}

export function persistReducer(config: persistConfig, reducer: any) {
  const { key, whitelist, blacklist } = config;
  var restore_complete = false;
  return (state: any, action: { type: string, payload?: any }) => {
    const newState = reducer(state, action)
    if (action.type === '@@INIT' && !restore_complete) {
      restore_complete = true;
      const data = localStorage.getItem(STORAGE_PREFIX + key);
      if (data !== null) {
        const newData = mergeDeep(newState, JSON.parse(data));
        console.log("Restoring data:", data, "\nnewData: ", newData);
        return newData;
      }
    }
    if (restore_complete) {
      const filteredNewState = filterState({
        state: newState,
        whitelist,
        blacklist
      })
      localStorage.setItem(STORAGE_PREFIX + key, JSON.stringify(filteredNewState));
    }
    return newState;
  }
}
Usage:
Same as the persistReducer function from redux-persist, except there are no storage options: it always uses localStorage. It accepts dotted paths in both the whitelist and blacklist parameters.
If you do something like:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    whitelist: ["current_theme.palette.mode"]
  },
  myReducer
);
Then current_theme.palette.mode will always be saved persistently, while any other props in the store, under current_theme or under palette, remain intact.
Note: All you have to do to use this state-persistence code is pass your reducer function through persistReducer. There is no additional configuration such as creating a persistor from the store and wrapping your app in a PersistGate, and no packages to install other than dot-prop-immutable. Just use the persisted version of your original reducer as returned by persistReducer and you're good to go.
Note: If a default value is provided by your original reducer and some state has been saved from a previous session, the two will be deeply merged while the initial state is being loaded, with the persisted state from the previous session taking priority so it can overwrite default values.

How to filter two arrays after splitting?

I'm a bit confused.
I am sending emails with nodemailer, and every time I send one I perform certain validations to manage the upload limit of the attachments. If the attachments exceed the established limit, the service splits that email and sends several emails with the same subject and body, each with its own attachments.
Every time this happens, _.chunk splits the array of PDFs into smaller groups. Before that, though, I made a method to prepare the attachments; it is in charge of obtaining certain information from the API to build each PDF buffer and put it in the content of the emails.
Now what I want to do is search, within the array produced in the step before the files are split, for the entries that are equal to the array holding the obtained information, and if they are equal, carry out the send instruction.
I will explain with a graph:
If getAmount.pdfBuffer === attachmentMap
// doAction console.log('Equals')
But even though I tried to do it, I couldn't. I don't know if it's because a getAmount array is generated for each chunk the attachments have been divided into. What do you think I'm doing wrong?
async sendEmail(
  {
    para: to,
    asunto: subject,
    plantilla: template,
    contexto: context,
  }: CorreoInfoDto,
  attachments: EmailAttachment[],
  driveConfig: OAuthGoogleConfig
) {
  const totalSize: number = this.getSizeFromAttachments(attachments);
  const chunkSplit = Math.floor(isNaN(totalSize) ? 1 : totalSize / this.LIMIT_ATTACHMENTS) + 1;
  const attachmentsChunk: any[][] = _.chunk(attachments, chunkSplit);
  if ((totalSize > this.LIMIT_ATTACHMENTS) && attachmentsChunk?.length >= 1) {
    await Promise.all(
      attachmentsChunk?.map(async (attachment: EmailAttachment[], index) => {
        console.log('attachment', attachment)
        if (this.getSizeFromAttachments(attachment) > this.LIMIT_ATTACHMENTS) {
          const result: GenerateDriveLinkResponse[] = await Promise.all(
            attachment?.map(item => {
              const file = new GoogleDriveUploadFile({
                name: item?.filename,
                mimeType: MimeTypesEnum.PDF,
                body: item?.content
              });
              return this.uploadFilesService.uploadToDrive(driveConfig, file) as any;
            })
          )
          const texto = result?.map((item, index) => {
            console.log('item', item?.webViewLink);
            console.log('index', index);
            return new SolicitudXLinkDrive({
              texto: attachment[index].filename,
              link: item?.webViewLink
            })
          });
          context.links = texto;
          const link = `(${index + 1}/${attachmentsChunk?.length - 1})`;
          const newContext = {
            getCurrent: link,
            ...context
          }
          const prepareEmail = this.prepareEmail({
            para: to,
            asunto: ` ${subject} (${index + 1}/${attachmentsChunk?.length})`,
            plantilla: template,
            contexto: newContext,
          }, []);
          return prepareEmail
        } else {
          // this.getAmount = `(${index + 1}/${attachmentsChunk?.length - 1})`;
          console.log('getAmount', this.getAmount);
          const attachmentMap = attachment.map(element => element.content);
          this.getAmount.forEach(element => {
            if (element.pdfBuffer === attachmentMap) {
              console.log('do action');
            }
          })
          const link = ` (${index + 1}/${attachmentsChunk?.length - 1})`;
          const newContext = {
            getCurrent: link,
            ...context
          }
          return this.prepareEmail({
            para: to,
            asunto: ` ${subject} (Correo ${index + 1}/${attachmentsChunk?.length - 1})`,
            plantilla: template,
            contexto: newContext,
          }, attachment);
        }
      })
    );
  } else {
    await this.prepareEmail(
      {
        para: to,
        asunto: ` ${subject}`,
        plantilla: template,
        contexto: context,
      },
      attachments,
    );
  }
}
async prepareEmail(
  {
    para: to,
    asunto: subject,
    plantilla: template,
    contexto: context,
  }: CorreoInfoDto,
  attachments: EmailAttachment[],
) {
  return await this.mailerService.sendMail({
    to,
    from: `${process.env.SENDER_NAME} <${process.env.EMAIL_USER}>`,
    subject,
    template,
    attachments: attachments,
    context: context,
  });
}
async sendEmails(correos: EnvioMultiplesCorreosDto) {
  let pdf = null;
  let info: ConfiguracionDocument = null;
  let GDriveConfig: ConfiguracionDocument = null;
  let logo: ConfiguracionDocument = null;
  let forContext = {};
  const documents = Array.isArray(correos.documento_id) ? correos.documento_id : [correos.documento_id];
  const solicitudes = await this.solicitudesService.findByIds(documents);
  const nombresPacientes = solicitudes.reduce((acc, cv) => {
    acc[cv.correlativo_solicitud] = cv['info_paciente']?.nombre_paciente;
    return acc;
  }, {});
  await Promise.all([
    await this.getPdf(correos.tipo_reporte, correos.documento_id, correos?.dividir_archivos).then(data => { pdf = data; }),
    await this.configuracionesService.findByCodes([
      ConfigKeys.TEXTO_CORREO_MUESTRA,
      ConfigKeys[process.env.DRIVE_CONFIG_API],
      ConfigKeys.LOGO_FIRMA_PATMED
    ]).then(data => {
      info = data[0];
      GDriveConfig = data[1];
      logo = data[2];
    })
  ]);
  forContext = this.configuracionesService.castValorObjectToObject(info?.valor_object)
  const attachmentPrepare = this.prepareAttachments(pdf as any, nombresPacientes);
  await this.sendEmail(
    {
      para: correos.para,
      asunto: correos.asunto,
      plantilla: 'muestras',
      contexto: {
        cuerpo: correos.cuerpo,
        titulo: forContext[EnvioCorreoMuestraEnum.titulo],
        direccion: forContext[EnvioCorreoMuestraEnum.direccion],
        movil: forContext[EnvioCorreoMuestraEnum.movil],
        pbx: forContext[EnvioCorreoMuestraEnum.pbx],
        email: forContext[EnvioCorreoMuestraEnum.email],
        logo: logo?.valor,
      },
    },
    attachmentPrepare,
    this.configuracionesService.castValorObjectToObject(GDriveConfig?.valor_object) as any,
  );
  const usuario = new UsuarioBitacoraSolicitudTDTO();
  usuario.createFromUserRequest(this.sharedService.getUserFromRequest());
  solicitudes.forEach((solicitud) => {
    const actual = new BitacoraSolicitudDTO();
    actual.createFromSolicitudDocument(solicitud);
    const newBitacora = new CrearBitacoraSolicitudDTO();
    newBitacora.createNewItem(null, actual, actual, usuario, AccionesBitacora.EmailEnviado);
    this.bitacoraSolicitudesService.create(newBitacora);
  });
}
prepareAttachments(item: BufferCorrelativosDTO | BufferXSolicitudDTO[], nombresPacientes: { [key: string]: string }) {
  if (this.sharedService.isAnArray(item)) {
    const castItem: BufferXSolicitudDTO[] = item as any;
    this.getAmount = castItem;
    return castItem?.map((s) => {
      const namePatient = nombresPacientes[s.correlativo_solicitud];
      return new EmailAttachment().setFromBufferXSolicitudDTO(s, namePatient, 'pdf');
    });
  } else {
    return [new EmailAttachment().setFromBufferCorrelativosDTO(item as any, 'pdf')];
  }
}
Thank you very much for your attention, I appreciate it. Cheers
You could try using lodash, as it has the _.intersectionBy and _.intersectionWith functions, which should allow you to compare two arrays and filter out the common values.
There are some good examples here:
How to get intersection with lodash?
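A hedged sketch of what that could look like here, reusing the getAmount and attachment names from the question (Buffer.compare is Node's byte-wise comparison; adjust if pdfBuffer is not a Buffer):
// Entries of this.getAmount whose pdfBuffer matches one of the chunk's attachment contents.
const attachmentContents = attachment.map(element => element.content);
const matches = _.intersectionWith(
  this.getAmount,
  attachmentContents,
  (entry, content) => Buffer.compare(entry.pdfBuffer, content) === 0
);
if (matches.length > 0) {
  console.log('do action');
}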

action creator does not return value to stream in marble test

I've got the following epic, which works well in the application, but I can't get my marble test working. I am calling the action creator in map, and it does return the correct object into the stream, but in the test I get an empty stream back.
export const updateRemoteFieldEpic = action$ =>
  action$.pipe(
    ofType(UPDATE_REMOTE_FIELD),
    filter(({ payload: { update = true } }) => update),
    mergeMap(({ payload }) => {
      const { orderId, fields } = payload;
      const requiredFieldIds = [4, 12]; // 4 = Name, 12 = Client-lookup
      const requestData = {
        id: orderId,
        customFields: fields
          .map(field => {
            return (!field.value && !requiredFieldIds.includes(field.id)) ||
              field.value
              ? field
              : null;
          })
          .filter(Boolean)
      };
      if (requestData.customFields.length > 0) {
        return from(axios.post(`/customfields/${orderId}`, requestData)).pipe(
          map(() => queueAlert("Draft Saved")),
          catchError(err => {
            const errorMessage =
              err.response &&
              err.response.data &&
              err.response.data.validationResult
                ? err.response.data.validationResult[0]
                : undefined;
            return of(queueAlert(errorMessage));
          })
        );
      }
      return of();
    })
  );
On a successful response from the server, I call the queueAlert action creator.
export const queueAlert = (
  message,
  position = {
    vertical: "bottom",
    horizontal: "center"
  }
) => ({
  type: QUEUE_ALERT,
  payload: {
    key: uniqueId(),
    open: true,
    message,
    position
  }
});
and here is my test case
describe("updateRemoteFieldEpic", () => {
  const sandbox = sinon.createSandbox();
  let scheduler;

  beforeEach(() => {
    scheduler = new TestScheduler((actual, expected) => {
      expect(actual).toEqual(expected);
    });
  });

  afterEach(() => {
    sandbox.restore();
  });

  it("should return success message", () => {
    scheduler.run(ts => {
      const inputM = "--a--";
      const outputM = "--b--";
      const values = {
        a: updateRemoteField({
          orderId: 1,
          fields: [{ value: "test string", id: 20 }],
          update: true
        }),
        b: queueAlert("Draft Saved")
      };
      const source = ActionsObservable.from(ts.cold(inputM, values));
      const actual = updateRemoteFieldEpic(source);
      const axiosStub = sandbox
        .stub(axios, "post")
        .returns([]);
      ts.expectObservable(actual).toBe(outputM, values);
      ts.flush();
      expect(axiosStub.called).toBe(true);
    });
  });
});
The output stream in actual comes back empty.
I tried returning an observable of the action creator from the map, but that crashed the application because an action is expected to be a plain object.
By stubbing axios.post(...) as [], you get from([]) in the epic - an empty observable that doesn't emit any values. That's why the inner map never runs and the epic emits nothing. You can fix this by using a single-element array as the stubbed value instead, e.g. [null] or [{}].
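In the test above, that is a one-line change (sketch):
const axiosStub = sandbox
  .stub(axios, "post")
  .returns([{}]); // one element, so from(...) emits once and the inner map fires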
The below is an answer to a previous version of the question. I kept it for reference, and because I think the content is useful for those who attempt to mock promise-returning functions in epic tests.
I think your problem is the from(axios.post(...)) in your epic. Axios returns a promise, and the RxJS TestScheduler has no way of making that synchronous, so expectObservable will not work as intended.
The way I usually address this is to create a simple wrapper module that does Promise-to-Observable conversion. In your case, it could look like this:
// api.js
import axios from 'axios';
import { from } from 'rxjs';

export function post(path, data) {
  return from(axios.post(path, data));
}
Once you have this wrapper, you can mock the function to return a constant Observable, taking promises completely out of the picture. If you do this with Jest, you can mock the module directly:
import * as api from '../api.js';
jest.mock('../api.js');
// In the test:
api.post.mockReturnValue(of(/* the response */));
Otherwise, you can also use redux-observable's dependency injection mechanism to inject the API module. Your epic would then receive it as third argument:
export const updateRemoteFieldEpic = (action$, state, { api }) =>
  action$.pipe(
    ofType(UPDATE_REMOTE_FIELD),
    filter(({ payload: { update = true } }) => update),
    mergeMap(({ payload }) => {
      // ...
      return api.post(...).pipe(...);
    })
  );
In your test, you would then just pass a mocked api object.
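A sketch of what that could look like inside the marble test (the second argument is the unused state):
const api = { post: () => of({ data: 'ok' }) }; // mocked api returning a constant Observable
const actual = updateRemoteFieldEpic(source, null, { api });
ts.expectObservable(actual).toBe(outputM, values);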

How to add a custom GraphQL parameter in a GatsbyJS node?

I created the following Gatsby node to query one record:
const axios = require("axios");

exports.sourceNodes = async (
  { actions, createNodeId, createContentDigest },
  configOptions
) => {
  const { createNode } = actions;
  // Gatsby adds a configOption that's not needed for this plugin, delete it
  delete configOptions.plugins;
  // Helper function that processes a post to match Gatsby's node structure
  const processPost = post => {
    const nodeId = createNodeId(`gutenberg-post-${post.id}`);
    const nodeContent = JSON.stringify(post);
    const nodeData = Object.assign({}, post, {
      id: nodeId,
      parent: null,
      children: [],
      internal: {
        type: `GutenbergPost`,
        content: nodeContent,
        contentDigest: createContentDigest(post)
      }
    });
    return nodeData;
  };
  const apiUrl = `http://wp.dev/wp-json/gutes-db/v1/${configOptions.id || 1}`;
  // Gatsby expects sourceNodes to return a promise
  return (
    // Fetch a response from the apiUrl
    axios
      .get(apiUrl)
      // Process the response data into a node
      .then(res => {
        // Process the post data to match the structure of a Gatsby node
        const nodeData = processPost(res.data);
        // Use Gatsby's createNode helper to create a node from the node data
        createNode(nodeData);
      })
  );
};
My source is a REST API with the following format:
http://wp.dev/wp-json/gutes-db/v1/{ID}
Currently the Gatsby node's default ID is 1.
I can query it in GraphQL by doing this:
{
  allGutenbergPost {
    edges {
      node {
        data
      }
    }
  }
}
This will always return record 1.
I wanted to add a custom parameter for the ID so that I could do this:
{
  allGutenbergPost(id: 2) {
    edges {
      node {
        data
      }
    }
  }
}
What adjustments should I make to my existing code?
I assume you are creating pages programmatically? If so, in the createPages hook, when you call createPage you can pass in a context object. Anything in there will be available as a query variable.
For example, if you have
createPage({
  path,
  component: blogPostTemplate,
  context: {
    foo: "bar",
  },
})
Then you can do a page query like
export const pageQuery = graphql`
  query ExampleQuery($foo: String) {
    post(name: { eq: $foo }) {
      id
      content
    }
  }
`
If you just want to filter by id, you can check out the docs on filter & comparison operators.
{
  allGutenbergPost(filter: { id: { eq: 2 } }) {
    edges {
      node {
        data
      }
    }
  }
}
or
{
  gutenbergPost(id: { eq: 2 }) {
    data
  }
}
Hope it helps!
