whitelist nested item in state (Redux persist) - javascript

The state in my reducer contains the key current_theme, which holds an object with the key palette, which in turn holds an object with the key mode, whose value is either the string "dark" or the string "light".
I need to persist only this bit of data while leaving all other attributes intact.
redux-persist offers a whitelist parameter, which sounds like what I want. However, it only lets me do something like:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    storage,
    whitelist: ["current_theme"]
  },
  myReducer
);
But this makes everything inside current_theme persistent. I want only current_theme.palette.mode to be persistent and nothing else.
I tried the below, but it didn't work either:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    storage,
    whitelist: ["current_theme.palette.mode"]
  },
  myReducer
);

I ended up writing my own simplified persistReducer that supports nested whitelisting/blacklisting, with the help of dot-prop-immutable and the deep-merge function from this question (thanks to Salakar and CpILL):
import dotProp from "dot-prop-immutable";

const STORAGE_PREFIX: string = 'persist:';

interface persistConfig {
  key: string;
  whitelist?: string[];
  blacklist?: string[];
}

function isObject(item: any) {
  return (item && typeof item === 'object' && !Array.isArray(item));
}

function mergeDeep(target: any, source: any) {
  let output = Object.assign({}, target);
  if (isObject(target) && isObject(source)) {
    Object.keys(source).forEach(key => {
      if (isObject(source[key])) {
        if (!(key in target))
          Object.assign(output, { [key]: source[key] });
        else
          output[key] = mergeDeep(target[key], source[key]);
      } else {
        Object.assign(output, { [key]: source[key] });
      }
    });
  }
  return output;
}

function filterState({ state, whitelist, blacklist }: { state: any, whitelist?: string[], blacklist?: string[] }) {
  if (whitelist && blacklist) {
    throw Error("Can't set both whitelist and blacklist at the same time");
  }
  if (whitelist) {
    let newState: any = {};
    for (const path of whitelist) {
      const val = dotProp.get(state, path);
      if (val !== undefined) {
        newState = dotProp.set(newState, path, val);
      }
    }
    return newState;
  }
  if (blacklist) {
    let filteredState: any = JSON.parse(JSON.stringify(state));
    for (const path of blacklist) {
      filteredState = dotProp.delete(filteredState, path);
    }
    return filteredState;
  }
  return state;
}

export function persistReducer(config: persistConfig, reducer: any) {
  const { key, whitelist, blacklist } = config;
  let restore_complete = false;
  return (state: any, action: { type: string, payload?: any }) => {
    const newState = reducer(state, action);
    if (action.type === '##INIT' && !restore_complete) {
      restore_complete = true;
      const data = localStorage.getItem(STORAGE_PREFIX + key);
      if (data !== null) {
        const newData = mergeDeep(newState, JSON.parse(data));
        console.log("Restoring data:", data, "\nnewData:", newData);
        return newData;
      }
    }
    if (restore_complete) {
      const filteredNewState = filterState({
        state: newState,
        whitelist,
        blacklist
      });
      localStorage.setItem(STORAGE_PREFIX + key, JSON.stringify(filteredNewState));
    }
    return newState;
  };
}
Usage:
Same as the persistReducer function in redux-persist, except that there is no storage option: it always uses localStorage. It accepts dotted paths in both the whitelist and blacklist parameters.
If you do something like:
const persistedReducer = persistReducer(
  {
    key: 'theme',
    whitelist: ["current_theme.palette.mode"]
  },
  myReducer
);
Then current_theme.palette.mode will always be persisted, while any other props in the store, under current_theme, or under palette remain intact.
Note: All you have to do to use this persistence code is pass your reducer function through persistReducer. There is no additional configuration such as creating a persistor from the store or wrapping your app in a PersistGate, and no packages to install other than dot-prop-immutable. Just use the persisted version of your original reducer as returned by persistReducer and you're good to go.
Note: If a default value is provided by your original reducer and some state has been saved from a previous session, both will be deeply merged while the initial state is being loaded, with the persisted state from the previous session taking priority so it can overwrite default values.
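To illustrate the merge priority described above, here is a compact plain-JS sketch of the deep-merge behavior (the theme state shape is illustrative):

```javascript
// Minimal illustration of the restore-time merge: the reducer's default
// state and the persisted slice are merged deeply, with persisted values
// winning. This mirrors the mergeDeep function above in compact form.
const isObject = (v) => v && typeof v === 'object' && !Array.isArray(v);

function mergeDeep(target, source) {
  const output = { ...target };
  for (const key of Object.keys(source)) {
    output[key] =
      isObject(source[key]) && isObject(target[key])
        ? mergeDeep(target[key], source[key])
        : source[key];
  }
  return output;
}

// Default state provided by the reducer...
const defaults = { current_theme: { palette: { mode: 'light', primary: '#1976d2' } } };
// ...and the slice restored from localStorage (whitelist: current_theme.palette.mode)
const persisted = { current_theme: { palette: { mode: 'dark' } } };

const restored = mergeDeep(defaults, persisted);
console.log(restored.current_theme.palette);
// → { mode: 'dark', primary: '#1976d2' }
```

The persisted mode wins, while the untouched default sibling keys survive the merge.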

Related

The filter function does not remove the object in the array in Vue/Vuex

I am building a booking feature in Vue 3/Vuex.
A user can book an item and also remove it from the basket. The problem is that the filter function does not remove the object from the array, and I cannot find out what the problem is. I hope you can help me.
This is the result if I put console.log() in removeFromBasket(state, payload):
removeFromBasket(state, payload) {
  console.log('removeFromBasket', payload, JSON.parse(JSON.stringify(state.basket.items)))
}
The remove method:
removeFromBasket() {
  this.$store.commit('basket/removeFromBasket', this.bookableId);
}
basket module
const state = {
  basket: {
    items: []
  },
};

const getters = {
  getCountOfItemsInBasket: (state) => {
    return state.basket.items.length
  },
  getAllItemsInBasket: (state) => {
    return state.basket.items
  },
  inBasketAlready(state) {
    return function (id) {
      return state.basket.items.reduce((result, item) => result || item.bookable.id === id, false);
    }
  },
};

const actions = {};

const mutations = {
  addToBasket(state, payload) {
    state.basket.items.push(payload);
  },
  removeFromBasket(state, payload) {
    state.basket.items = state.basket.items.filter(item => item.bookable.id !== payload);
  }
};

export default {
  namespaced: true,
  state,
  getters,
  actions,
  mutations
};
I have solved the problem.
I used typeof in a console.log() to see what types payload and item.bookable.id are.
The payload was a string and item.bookable.id was a number.
So I wrapped the payload in parseInt() and the problem was solved:
removeFromBasket(state, payload) {
  state.basket.items = state.basket.items.filter(item => item.bookable.id !== parseInt(payload));
}
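The underlying issue is that !== never coerces types, so comparing the string '7' with the number 7 is always true and filter removes nothing. A minimal illustration (hypothetical basket data):

```javascript
const items = [
  { bookable: { id: 7 } },
  { bookable: { id: 8 } },
];

// The payload arrives as a string (e.g. from a DOM attribute or route param).
const payload = '7';

// Strict inequality compares type as well as value, so nothing is removed:
const wrong = items.filter((item) => item.bookable.id !== payload);
console.log(wrong.length); // → 2

// Converting the payload to a number first makes the comparison work:
const right = items.filter((item) => item.bookable.id !== parseInt(payload, 10));
console.log(right.length); // → 1
```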

When routing mswjs/data populates the database with new items and removes the previous one, making it inaccessible

I use next-redux-wrapper, MSW, @mswjs/data and Redux Toolkit for storing my data in a store, as well as mocking API calls and fetching from a mock database.
I have the following scenario happening to me.
I am on page /content/editor and, in the console and terminal, I can see the data was fetched from the mock database and hydrated from getStaticProps of Editor.js. So now IDs 1 to 6 are accessible in the store.
Now I click on the PLUS icon to create a new project. I fill out the dialog and press "SAVE". A POST request starts, it's pending, and then it gets fulfilled. The new project is now in the mock DB as well as in the store; I can see IDs 1 to 7 now.
Since I clicked "SAVE" and the POST request was successful, I am routed to /content/editor/7 to view the newly created project.
Now I am on page [id].js, which also fetches data from the mock DB, which then gets stored and hydrated into the Redux store. The idea is that it takes the previous store's state and spreads it into the store, along with any new data.
But now ID 7 no longer exists. IDs 1 to 6 are also gone; instead, I can see in the console and terminal that IDs 8 to 13 were created, and the previous ones are no more.
Obviously, this is not great. When I create a new project and then switch the route, I should be able to access the newly created project as well as the previously created ones. But instead, they all get overwritten.
It has something to do with either next-redux-wrapper or MSW, but I am not sure how to make it work. I need help with it. I will post some code now:
Code
getStaticProps
// path example: /content/editor
// Editor.js
// path example: /content/editor
// Editor.js
export const getStaticProps = wrapper.getStaticProps(
  (store) =>
    async ({ locale }) => {
      const [translation] = await Promise.all([
        serverSideTranslations(locale, ['editor', 'common', 'thesis']),
        store.dispatch(fetchProjects()),
        store.dispatch(fetchBuildingBlocks()),
      ]);
      return {
        props: {
          ...translation,
        },
      };
    }
);

// path example: /content/editor/2
// [id].js
export const getStaticProps = wrapper.getStaticProps(
  (store) =>
    async ({ locale, params }) => {
      const { id } = params;
      const [translation] = await Promise.all([
        serverSideTranslations(locale, ['editor', 'common', 'thesis']),
        store.dispatch(fetchProjects()),
        // store.dispatch(fetchProjectById(id)), // issue: fetching by ID returns null
        store.dispatch(fetchBuildingBlocks()),
      ]);
      return {
        props: {
          ...translation,
          id,
        },
      };
    }
);
Mock Database
Factory
I am going to shorten the code to the relevant bits. I will remove properties for a project, as well as helper functions to generate data.
const asscendingId = (() => {
  let id = 1;
  return () => id++;
})();

const isDevelopment =
  process.env.NODE_ENV === 'development' || process.env.STORYBOOK || false;

export const projectFactory = () => {
  return {
    id: primaryKey(isDevelopment ? asscendingId : nanoid),
    name: String,
    // ... other properties
  };
};

export const createProject = (data) => {
  return {
    name: data.name,
    createdAt: getUnixTime(new Date()),
    ...data,
  };
};

/**
 * Create initial set of tasks
 */
export function generateMockProjects(amount) {
  const projects = [];
  for (let i = amount; i >= 0; i--) {
    const project = createProject({
      name: faker.lorem.sentence(faker.datatype.number({ min: 1, max: 5 })),
      dueDate: date(),
      fontFamily: getRandomFontFamily(),
      pageMargins: getRandomPageMargins(),
      textAlign: getRandomTextAlign(),
      pageNumberPosition: getRandomPageNumberPosition(),
      ...createWordsCounter(),
    });
    projects.push(project);
  }
  return projects;
}
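As an aside, the asscendingId helper above is a closure-based counter: the IIFE runs once, captures the private id variable, and returns a function that increments it on each call. In isolation:

```javascript
// Closure-based counter, same pattern as the asscendingId helper above:
// the IIFE runs once and returns a function that closes over `id`.
const asscendingId = (() => {
  let id = 1;
  return () => id++;
})();

console.log(asscendingId()); // → 1
console.log(asscendingId()); // → 2
```

Because the counter lives in module scope, every call across the app shares the same sequence, which is why re-seeding the database produces fresh IDs (8, 9, ...) instead of reusing 1 to 6.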
API Handler
I will shorten this one to GET and POST requests only.
import { db } from '../../db';

export const projectsHandlers = (delay = 0) => {
  return [
    rest.get('https://my.backend/mock/projects', getAllProjects(delay)),
    rest.get('https://my.backend/mock/projects/:id', getProjectById(delay)),
    rest.get('https://my.backend/mock/projectsNames', getProjectsNames(delay)),
    rest.get(
      'https://my.backend/mock/projects/name/:id',
      getProjectsNamesById(delay)
    ),
    rest.post('https://my.backend/mock/projects', postProject(delay)),
    rest.patch(
      'https://my.backend/mock/projects/:id',
      updateProjectById(delay)
    ),
  ];
};

function getAllProjects(delay) {
  return (request, response, context) => {
    const projects = db.project.getAll();
    return response(context.delay(delay), context.json(projects));
  };
}

function postProject(delay) {
  return (request, response, context) => {
    const { body } = request;
    if (body.content === 'error') {
      return response(
        context.delay(delay),
        context.status(500),
        context.json('Server error saving this project')
      );
    }
    const now = getUnixTime(new Date());
    const project = db.project.create({
      ...body,
      createdAt: now,
      maxWords: 10_000,
      minWords: 7000,
      targetWords: 8500,
      potentialWords: 1500,
      currentWords: 0,
    });
    return response(context.delay(delay), context.json(project));
  };
}
// all handlers
import { buildingBlocksHandlers } from './api/buildingblocks';
import { checklistHandlers } from './api/checklist';
import { paragraphsHandlers } from './api/paragraphs';
import { projectsHandlers } from './api/projects';
import { tasksHandlers } from './api/tasks';

const ARTIFICIAL_DELAY_MS = 2000;

export const handlers = [
  ...tasksHandlers(ARTIFICIAL_DELAY_MS),
  ...checklistHandlers(ARTIFICIAL_DELAY_MS),
  ...projectsHandlers(ARTIFICIAL_DELAY_MS),
  ...buildingBlocksHandlers(ARTIFICIAL_DELAY_MS),
  ...paragraphsHandlers(ARTIFICIAL_DELAY_MS),
];
// database
import { factory } from '@mswjs/data';
import {
  buildingBlockFactory,
  generateMockBuildingBlocks,
} from './factory/buildingblocks.factory';
import {
  checklistFactory,
  generateMockChecklist,
} from './factory/checklist.factory';
import { paragraphFactory } from './factory/paragraph.factory';
import {
  projectFactory,
  generateMockProjects,
} from './factory/project.factory';
import { taskFactory, generateMockTasks } from './factory/task.factory';

export const db = factory({
  task: taskFactory(),
  checklist: checklistFactory(),
  project: projectFactory(),
  buildingBlock: buildingBlockFactory(),
  paragraph: paragraphFactory(),
});

generateMockProjects(5).map((project) => db.project.create(project));
const projectIds = db.project.getAll().map((project) => project.id);
generateMockTasks(20, projectIds).map((task) => db.task.create(task));
generateMockBuildingBlocks(10, projectIds).map((block) =>
  db.buildingBlock.create(block)
);
const taskIds = db.task.getAll().map((task) => task.id);
generateMockChecklist(20, taskIds).map((item) => db.checklist.create(item));
Project Slice
I will shorten this one as well to the relevant snippets.
// projects.slice.js
import {
  createAsyncThunk,
  createEntityAdapter,
  createSelector,
  createSlice,
  current,
} from '@reduxjs/toolkit';
import { client } from 'mocks/client';
import { HYDRATE } from 'next-redux-wrapper';

const projectsAdapter = createEntityAdapter();

const initialState = projectsAdapter.getInitialState({
  status: 'idle',
  filter: { type: null, value: null },
  statuses: {},
});

export const fetchProjects = createAsyncThunk(
  'projects/fetchProjects',
  async () => {
    const response = await client.get('https://my.backend/mock/projects');
    return response.data;
  }
);

export const saveNewProject = createAsyncThunk(
  'projects/saveNewProject',
  async (data) => {
    const response = await client.post('https://my.backend/mock/projects', {
      ...data,
    });
    return response.data;
  }
);

export const projectSlice = createSlice({
  name: 'projects',
  initialState,
  reducers: {
    // irrelevant reducers....
  },
  extraReducers: (builder) => {
    builder
      .addCase(HYDRATE, (state, action) => {
        // eslint-disable-next-line no-console
        console.log('HYDRATE', action.payload);
        const statuses = Object.fromEntries(
          action.payload.projects.ids.map((id) => [id, 'idle'])
        );
        return {
          ...state,
          ...action.payload.projects,
          statuses,
        };
      })
      .addCase(fetchProjects.pending, (state, action) => {
        state.status = 'loading';
      })
      .addCase(fetchProjects.fulfilled, (state, action) => {
        projectsAdapter.addMany(state, action.payload);
        state.status = 'idle';
        action.payload.forEach((item) => {
          state.statuses[item.id] = 'idle';
        });
      })
      .addCase(saveNewProject.pending, (state, action) => {
        console.log('SAVE NEW PROJECT PENDING', action);
      })
      .addCase(saveNewProject.fulfilled, (state, action) => {
        projectsAdapter.addOne(state, action.payload);
        console.group('SAVE NEW PROJECT FULFILLED');
        console.log(current(state));
        console.log(action);
        console.groupEnd();
        state.statuses[action.payload.id] = 'idle';
      });
      // other irrelevant reducers...
  },
});
This should be all the relevant code. If you have questions, please ask them and I will try to answer them.
I have changed how the state gets hydrated, so I turned this code:
.addCase(HYDRATE, (state, action) => {
  // eslint-disable-next-line no-console
  console.log('HYDRATE', action.payload);
  const statuses = Object.fromEntries(
    action.payload.projects.ids.map((id) => [id, 'idle'])
  );
  return {
    ...state,
    ...action.payload.projects,
    statuses,
  };
})
Into this code:
.addCase(HYDRATE, (state, action) => {
  // eslint-disable-next-line no-console
  console.group('HYDRATE', action.payload);
  const statuses = Object.fromEntries(
    action.payload.projects.ids.map((id) => [id, 'idle'])
  );
  state.statuses = { ...state.statuses, ...statuses };
  projectsAdapter.upsertMany(state, action.payload.projects.entities);
})
I used the adapter to upsert all entries.
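The difference between the two HYDRATE strategies can be sketched with plain objects (illustrative entity-state shapes, not the actual Redux Toolkit internals): spreading the incoming projects slice replaces ids and entities wholesale, while upserting merges them into what is already there.

```javascript
// Existing store slice and an incoming HYDRATE payload.
const state = { ids: [1, 2], entities: { 1: { id: 1 }, 2: { id: 2 } } };
const incoming = { ids: [7], entities: { 7: { id: 7 } } };

// Spread: the incoming slice wins and previous entities are dropped.
const replaced = { ...state, ...incoming };
console.log(replaced.ids); // → [7]

// Upsert: merge the entities and union the ids.
const upserted = {
  ids: [...new Set([...state.ids, ...incoming.ids])],
  entities: { ...state.entities, ...incoming.entities },
};
console.log(upserted.ids); // → [1, 2, 7]
```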

I want to filter out an array of objects by ids in another array

I have an array of objects, i.e. queueDetails: [{}, {}]. I have another array of ids from the response:
"payload": [{"id": "1"}, {"id": "2"}]
I want to filter the ids in payload out of queueDetails, for which I have the following code:
action.payload.map(payload => {
  state.queueDetails.filter(queue => queue._id !== payload.id)
})
return {
  ...state,
  queueDetails: ???
}
How do I proceed from here?
I think it's safe to guess you're building part of a Redux store reducer. If that's the case, the corresponding case section for the filtering action may be something like:
case FILTER_QUEUE_DETAILS: {
  const { queueDetails } = state,
    { payload } = action,
    submittedIds = payload.map(({ id }) => id)
  return { ...state, queueDetails: queueDetails.filter(({ id }) => !submittedIds.includes(id)) }
}
You may find the quick demo below:
const { createStore } = Redux

const defaultState = {
    queueDetails: [
      { id: 1, data: 'somedata' },
      { id: 2, data: 'moredata' },
      { id: 3, data: 'somemore' }
    ]
  },
  FILTER_QUEUE_DETAILS = 'FILTER_QUEUE_DETAILS',
  appReducer = (state = defaultState, action) => {
    switch (action.type) {
      case FILTER_QUEUE_DETAILS: {
        const { queueDetails } = state,
          { payload } = action,
          submittedIds = payload.map(({ id }) => id)
        return { ...state, queueDetails: queueDetails.filter(({ id }) => !submittedIds.includes(id)) }
      }
      default: return state
    }
  },
  store = createStore(appReducer)

// initial state
console.log(`// initial state:\n`, store.getState())
// dispatch action to filter out id's 1, 3
store.dispatch({
  type: FILTER_QUEUE_DETAILS,
  payload: [{ id: 1 }, { id: 3 }]
})
// log resulting state
console.log(`// state upon id's 1 and 3 filtered out:\n`, store.getState())

.as-console-wrapper {min-height:100%}
<script src="https://cdnjs.cloudflare.com/ajax/libs/redux/4.0.5/redux.min.js"></script>

Using reducer state inside useEffect

Hello All 👋🏻 I have a question about our favorite Hooks API!
What am I trying to do?
I am trying to fetch photos from a remote system. I store the blob URLs for these photos in my reducer state, keyed by an id.
I have a helper function wrapped in the memoized version returned by the useCallback hook. This function is called in the useEffect I have defined.
The Problem ⚠️
My callback, a.k.a. the helper function, depends on part of the reducer state, which is updated every time a photo is fetched. This causes the component to run the effect in useEffect again, producing an infinite loop:
component renders --> useEffect runs --> fetchPhotos runs --> after 1st photo, reducer state is updated --> component updates because useSelector's value changes --> runs fetchPhotos again --> infinite loop
const FormViewerContainer = (props) => {
  const { completedForm, classes } = props;
  const [error, setError] = useState(null);
  const dispatch = useDispatch();
  const photosState = useSelector(state => state.root.photos);

  // helper function which fetches photos and updates the reducer state by dispatching actions
  const fetchFormPhotos = React.useCallback(async () => {
    try {
      if (!completedForm) return;
      const { photos: reducerPhotos, loadingPhotoIds } = photosState;
      const { photos: completedFormPhotos } = completedForm;
      const photoIds = Object.keys(completedFormPhotos || {});

      // only fetch photos which aren't in reducer state yet
      const photoIdsToFetch = photoIds.filter((pId) => {
        const photo = reducerPhotos[pId] || {};
        return !loadingPhotoIds.includes(pId) && !photo.blobUrl;
      });

      dispatch({
        type: SET_LOADING_PHOTO_IDS,
        payload: { photoIds: photoIdsToFetch },
      });

      if (photoIdsToFetch.length <= 0) {
        return;
      }

      photoIdsToFetch.forEach(async (photoId) => {
        if (loadingPhotoIds.includes(photoId)) return;
        dispatch(fetchCompletedFormPhoto({ photoId }));
        const thumbnailSize = {
          width: 300,
          height: 300,
        };
        const response = await fetchCompletedFormImages(
          cformid,
          fileId,
          thumbnailSize,
        );
        if (response.status !== 200) {
          dispatch(fetchCompletedFormPhotoRollback({ photoId }));
          return;
        }
        const blob = await response.blob();
        const blobUrl = URL.createObjectURL(blob);
        dispatch(fetchCompletedFormPhotoSuccess({
          photoId,
          blobUrl,
        }));
      });
    } catch (err) {
      setError('Error fetching photos. Please try again.');
    }
  }, [completedForm, dispatch, photosState]);

  // call the fetch form photos function
  useEffect(() => {
    fetchFormPhotos();
  }, [fetchFormPhotos]);
  ...
  ...
}
What have I tried?
I found an alternative way to fetch photos, namely by dispatching an action and using a worker saga to do all the fetching. This removes the need for the helper in the component, and thus no useCallback and no re-renders. The useEffect then only depends on dispatch, which is fine.
Question
I am struggling with the mental model of the hooks API. I see the obvious problem, but I am not sure how this could be done without using Redux middleware like thunks and sagas.
Edit:
reducer function:
export const initialState = {
  photos: {},
  loadingPhotoIds: [],
};

export default function photosReducer(state = initialState, action) {
  const { type, payload } = action;
  switch (type) {
    case FETCH_COMPLETED_FORM_PHOTO: {
      return {
        ...state,
        photos: {
          ...state.photos,
          [payload.photoId]: {
            blobUrl: null,
            error: false,
          },
        },
      };
    }
    case FETCH_COMPLETED_FORM_PHOTO_SUCCESS: {
      return {
        ...state,
        photos: {
          ...state.photos,
          [payload.photoId]: {
            blobUrl: payload.blobUrl,
            error: false,
          },
        },
        loadingPhotoIds: state.loadingPhotoIds.filter(
          photoId => photoId !== payload.photoId,
        ),
      };
    }
    case FETCH_COMPLETED_FORM_PHOTO_ROLLBACK: {
      return {
        ...state,
        photos: {
          ...state.photos,
          [payload.photoId]: {
            blobUrl: null,
            error: true,
          },
        },
        loadingPhotoIds: state.loadingPhotoIds.filter(
          photoId => photoId !== payload.photoId,
        ),
      };
    }
    case SET_LOADING_PHOTO_IDS: {
      return {
        ...state,
        loadingPhotoIds: payload.photoIds || [],
      };
    }
    default:
      return state;
  }
}
You could move the photoIdsToFetch calculation logic into your selector function to reduce the number of renders caused by state changes:
const photoIdsToFetch = useSelector(state => {
  const { photos: reducerPhotos, loadingPhotoIds } = state.root.photos;
  const { photos: completedFormPhotos } = completedForm;
  const photoIds = Object.keys(completedFormPhotos || {});
  const photoIdsToFetch = photoIds.filter(pId => {
    const photo = reducerPhotos[pId] || {};
    return !loadingPhotoIds.includes(pId) && !photo.blobUrl;
  });
  return photoIdsToFetch;
}, equals);
However, the selector function isn't memoized; it returns a new array object every time, so reference equality will not work here. You will need to provide an isEqual method as the second parameter (one that compares two arrays for value equality) so that the component only re-renders when the ids actually change. You could write your own, or use the deep-equal library, for example:
import equal from 'deep-equal';
fetchFormPhotos will depend only on [photoIdsToFetch, dispatch] this way.
I'm not sure about how your reducer functions mutate the state, so this may require some fine tuning. The idea is: select only the state from store that you depend on, that way other parts of the store will not cause re-renders.
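For instance, a hand-rolled equals for flat arrays of primitive ids (a lighter alternative to the deep-equal package, assuming the selector only ever returns such arrays) could look like:

```javascript
// Shallow value equality for flat arrays of primitives -- enough for
// arrays of photo ids. Passed as the second argument to useSelector,
// it stops a freshly-built but identical array from causing a re-render.
const arrayEquals = (a, b) =>
  Array.isArray(a) &&
  Array.isArray(b) &&
  a.length === b.length &&
  a.every((item, i) => item === b[i]);

console.log(arrayEquals(['p1', 'p2'], ['p1', 'p2'])); // → true
console.log(arrayEquals(['p1'], ['p1', 'p2'])); // → false
```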

Cleaning Unwanted Fields From GraphQL Responses

I have an object that my GraphQL client requests.
It's a reasonably simple object:
type Element {
  content: [ElementContent]
  elementId: String
  name: String
  notes: String
  type: String
  createdAt: String
  updatedAt: String
}
With the special type ElementContent, which is tiny and looks like this:
type ElementContent {
  content: String
  locale: String
}
Now, when I query this on the client side, both the top-level object and the lower-level objects have additional properties (which interfere with updating the object if I attempt to clone the body exactly as-is).
Notably, GraphQL seems to supply a __typename property in the parent object, and the child objects have __typename and a Symbol(id) property as well.
I've tried doing:
delete element.__typename to good effect, but then I also need to loop through the children (a dynamic array of objects), and likely have to remove those properties as well.
I'm not sure if I'm missing something here, or if I should just struggle through the code and loop + delete (I received errors attempting a forEach loop initially). Is there a better strategy for what I'm attempting to do? Or am I on the right path and just need some good loop code to clean the unwanted properties?
There are three ways of doing this.
First way
Update the client configuration like this; it will omit the unwanted __typename fields:
apollo.create({
  link: http,
  cache: new InMemoryCache({
    addTypename: false
  })
});
Second way
Use the omit-deep package as a middleware:
import omitDeep from 'omit-deep';

const cleanTypeName = new ApolloLink((operation, forward) => {
  if (operation.variables) {
    operation.variables = omitDeep(operation.variables, '__typename');
  }
  return forward(operation).map((data) => {
    return data;
  });
});
Third way
Create a custom middleware and inject it into Apollo:
const cleanTypeName = new ApolloLink((operation, forward) => {
  if (operation.variables) {
    const omitTypename = (key, value) => (key === '__typename' ? undefined : value);
    operation.variables = JSON.parse(JSON.stringify(operation.variables), omitTypename);
  }
  return forward(operation).map((data) => {
    return data;
  });
});
and inject the middleware
const httpLinkWithErrorHandling = ApolloLink.from([
  cleanTypeName,
  retry,
  error,
  http,
]);
If you use fragments with your queries/mutations, the second and third ways are recommended.
The preferred method is the third way, because it has no third-party dependency and no cache performance issues.
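The heart of the third way is the reviver passed to JSON.parse: returning undefined for a key drops that key at any nesting depth. In isolation (with illustrative variables):

```javascript
// The reviver runs for every key during parsing, so __typename is
// removed from the root object and from every nested object and array.
const omitTypename = (key, value) => (key === '__typename' ? undefined : value);

const variables = {
  __typename: 'Element',
  name: 'Intro',
  content: [{ __typename: 'ElementContent', locale: 'en', content: 'Hi' }],
};

const cleaned = JSON.parse(JSON.stringify(variables), omitTypename);
console.log(cleaned);
// → { name: 'Intro', content: [ { locale: 'en', content: 'Hi' } ] }
```

Note that the round-trip through JSON also strips anything non-serializable (functions, File objects, and so on), which is exactly why the upload-aware variant further down skips this step for upload mutations.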
If you want to wipe __typename from a GraphQL response (from the root and its children), you can use the graphql-anywhere package.
Something like:
const wipedData = filter(inputFragment, rcvData);
where inputFragment is a fragment that defines the fields (you can see details here) and rcvData is the data received from the GraphQL query.
By using the filter function, wipedData includes only the required fields you need to pass as mutation input.
Here's what I did, to support file uploads as well. It's a merge of multiple suggestions I found on the Github thread here: Feature idea: Automatically remove __typename from mutations
import { parse, stringify } from 'flatted';

const cleanTypename = new ApolloLink((operation, forward) => {
  const omitTypename = (key, value) => (key === '__typename' ? undefined : value);
  if (operation.variables && !operation.getContext().hasUpload) {
    operation.variables = parse(stringify(operation.variables), omitTypename);
  }
  return forward(operation);
});
Hooking up the rest of my client.tsx file, simplified:
import { InMemoryCache } from 'apollo-cache-inmemory';
import { createUploadLink } from 'apollo-upload-client';
import { ApolloClient } from 'apollo-client';
import { setContext } from 'apollo-link-context';
import { ApolloLink } from 'apollo-link';

const authLink = setContext((_, { headers }) => {
  const token = localStorage.getItem(AUTH_TOKEN);
  return {
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : '',
    },
  };
});

const httpLink = ApolloLink.from([
  cleanTypename,
  authLink.concat(upLoadLink),
]);

const client = new ApolloClient({
  link: httpLink,
  cache,
});

export default client;
Now when I call mutations that are of type upload, I simply set the context hasUpload to true, as shown here:
UpdateStation({variables: { input: station }, context: {hasUpload: true }}).then()
For those looking for a TypeScript solution:
import cloneDeepWith from "lodash/cloneDeepWith";

export const omitTypenameDeep = (
  variables: Record<string, unknown>
): Record<string, unknown> =>
  cloneDeepWith(variables, (value) => {
    if (value && value.__typename) {
      const { __typename, ...valWithoutTypename } = value;
      return valWithoutTypename;
    }
    return undefined;
  });

const removeTypename = new ApolloLink((operation, forward) => {
  const newOperation = operation;
  newOperation.variables = omitTypenameDeep(newOperation.variables);
  return forward(newOperation);
});

// ...

const client = new ApolloClient({
  cache: new InMemoryCache(),
  link: ApolloLink.from([removeTypename, httpLink]),
});
I've just published graphql-filter-fragment to help with this use case. I wrote in a bit more detail about the CRUD use case that led me to this approach.
Example:
import { filterGraphQlFragment } from 'graphql-filter-fragment';
import { gql } from '@apollo/client/core';

const result = filterGraphQlFragment(
  gql`
    fragment museum on Museum {
      name
      address {
        city
      }
    }
  `,
  {
    __typename: 'Museum',
    name: 'Museum of Popular Culture',
    address: {
      __typename: 'MuseumAddress',
      street: '325 5th Ave N',
      city: 'Seattle'
    }
  }
);

expect(result).toEqual({
  name: 'Museum of Popular Culture',
  address: {
    city: 'Seattle'
  }
});
Here is my solution. Vanilla JS, recursive, and does not mutate the original object:
const removeAllTypenamesNoMutate = (item) => {
  if (!item) return;
  const recurse = (source, obj) => {
    if (!source) return;
    if (Array.isArray(source)) {
      for (let i = 0; i < source.length; i++) {
        const item = source[i];
        if (item !== undefined && item !== null) {
          source[i] = recurse(item, item);
        }
      }
      return obj;
    } else if (typeof source === 'object') {
      for (const key in source) {
        if (key === '__typename') continue;
        const property = source[key];
        if (Array.isArray(property)) {
          obj[key] = recurse(property, property);
        } else if (!!property && typeof property === 'object') {
          const { __typename, ...rest } = property;
          obj[key] = recurse(rest, rest);
        } else {
          obj[key] = property;
        }
      }
      const { __typename, ...rest } = obj;
      return rest;
    } else {
      return obj;
    }
  };
  return recurse(JSON.parse(JSON.stringify(item)), {});
};
The approach below has worked for me so far in my use case.
const {
  loading,
  error,
  data,
} = useQuery(gqlRead, {
  variables: { id },
  fetchPolicy: 'network-only',
  onCompleted: (data) => {
    const { someNestedData } = data;
    const filteredData = removeTypeNameFromGQLResult(someNestedData);
    // Do sth with filteredData
  },
});

// in helper
export const removeTypeNameFromGQLResult = (result: Record<string, any>) => {
  return JSON.parse(
    JSON.stringify(result, (key, value) => {
      if (key === '__typename') return;
      return value;
    })
  );
};
