Firestore startAt() and Redux serializable vs non-serializable issue - javascript

I'm at a crossroads in my Firebase project and need some general guidance. Here are the facts as I understand them from my research:
Firestore queries have a handy feature for pagination (startAt / startAfter)
startAt can take a returned record for simplicity, rather than manually passing property values according to the orderBy of your query. I would prefer to use that (hence this question on StackOverflow).
Redux toolkit has a best practice that non-serializable objects are not allowed in state.
QueryDocumentSnapshot (returned from Firestore queries) objects are non-serializable, but calling .data() produces a serializable JS object.
So this puts me in a difficult place: I need to store something serializable in state to keep RTK happy, but I need to pass the non-serializable QueryDocumentSnapshot to startAt for my next Firestore query to actually return results (passing the result of .data() does not work in my testing).
I am wondering if a lot of people don't use Redux w/ Firestore and therefore are not encountering this problem, because the vanilla React useState hook doesn't complain about whether your value is serializable or not.
Does anyone have advice? I can resort to just passing values directly to startAt, but I would prefer the convenience of passing the originally returned object.
Some pseudo sample code, lifted from several files in my code base:
// gets constructed dynamically, but for illustration, some sample pipeline entries
const pipeline = [
  where('... etc'),
  orderBy('fieldABC'),
  limit(20),
];
const collRef = collection(db, collectionName);
const baseQuery = query(collRef, ...pipeline);

// put in state (1 of 2 ways)
const items = (await getDocs(baseQuery)).docs; // RTK warning about non-serializable
// const items = (await getDocs(baseQuery)).docs.map((doc) => doc.data()); // works, but breaks pagination if I pass one of these items to startAt

// later, due to "infinite scroll", want to get another set of records
// if lastItem is a QueryDocumentSnapshot, it works
const lastItem = items[items.length - 1];
const nextPageQuery = query(collRef, ...pipeline, startAt(lastItem));

One naive solution that is against the "strong recommendations" of the RTK team (but that may be worth it if you don't mess with certain RTK edge cases, such as utilizing persistence) would be to ignore the non-serializable values:
export const store = configureStore({
  reducer: {
    // ...
  },
  middleware: (getDefaultMiddleware) => {
    const ware = getDefaultMiddleware({
      serializableCheck: {
        ignoreState: true,
      },
    });
    return ware.concat(firebaseApi.middleware);
  },
});
This is a good enough hack for my use case, but may not fully address it for others.
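If you do want to keep state fully serializable, another option (just a sketch, assuming fieldABC is the only orderBy field and reusing collRef, pipeline, and baseQuery from the question) is to store the plain .data() objects and rebuild the cursor from the ordered field's value:

import { getDocs, query, startAfter } from 'firebase/firestore';

// store only serializable objects in Redux state
const snapshot = await getDocs(baseQuery);
const items = snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() })); // RTK-safe

// later, rebuild the cursor from the stored field value instead of a QueryDocumentSnapshot
// note: field-value cursors assume fieldABC is unique enough to paginate on; add a tie-breaking orderBy otherwise
const lastItem = items[items.length - 1];
const nextPageQuery = query(collRef, ...pipeline, startAfter(lastItem.fieldABC));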

Related

How can I synchronize data from two different queries, where one is being transformed in useEffect?

I have an array and a map being fetched from two different RTK Queries. Array A contains metadata (IDs) used to access the values in map B. Hence, their recency needs to be synchronized so that every value in B has a corresponding ID in array A.
I fetch both from the server, but then apply some filters to map B such that the number of values is potentially reduced. This is done like so:
const queryAResult = useRTKQueryA(); // arrayA = queryAResult.data
const queryBResult = useRTKQueryB(); // mapB = queryBResult.data
const [filteredMapB, setFilteredMapB] = useState({});

// useEffect is used here because queryBResult changes as the query is refetched under the appropriate conditions.
useEffect(() => {
  if (queryBResult.data) {
    const mapB = queryBResult.data;
    const filtered = filter(mapB); // generic filter
    setFilteredMapB(filtered);
  }
}, [queryBResult.data]); // mapB = queryBResult.data
// Here I render a component which is dependent on using arrayA to make accesses to filteredMapB.
The issue I'm facing is that, while I can listen to the query results for both arrayA and mapB to make sure they're both available (through data, isLoading, isFetching, etc.), I'm unsure how to make sure that the filtering in useEffect has occurred before using arrayA to access filteredMapB. Currently the problem occurs because I use the most recently available version of arrayA to access filteredMapB (which is not always the most recent, since the filtering takes time), so arrayA ends up containing elements not in filteredMapB.
How can I synchronize these? Ideally the solution would lend itself well to a boolean expression I could use to conditionally render the list or a loading spinner.
I have thought about putting both arrays in local state variables and using useCallback to create getters for the two arrays which would return synchronized values by using the dependency array, but would prefer a different solution.
I would not use useEffect at all here. This is something where useMemo comes in a lot more handy, as it happens synchronously and you will never have a render in-between.
const queryAResult = useRTKQueryA(); // arrayA = queryAResult.data
const queryBResult = useRTKQueryB(); // mapB = queryBResult.data

const filteredMapB = useMemo(() => {
  if (queryBResult.data) {
    const mapB = queryBResult.data;
    const filtered = filter(mapB); // generic filter
    return filtered;
  }
}, [queryBResult.data]); // mapB = queryBResult.data

if (!queryAResult.data || !filteredMapB) {
  return <Spinner />;
}
// normal rendering
It looks like your arrayA is also changing, so in your case you may want to include arrayA in the memoized computation (and its dependency array) as well.
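A sketch of that idea (my assumptions: arrayA is an array of ID strings, mapB is a plain object keyed by those IDs, and filter and the hooks are the ones from the question). Deriving both values in a single useMemo means neither can be newer than the other:

const queryAResult = useRTKQueryA();
const queryBResult = useRTKQueryB();

const synced = useMemo(() => {
  if (!queryAResult.data || !queryBResult.data) return null;
  const filteredMapB = filter(queryBResult.data); // generic filter
  // keep only IDs that survived the filter, so every ID in arrayA exists in filteredMapB
  const arrayA = queryAResult.data.filter((id) => id in filteredMapB);
  return { arrayA, filteredMapB };
}, [queryAResult.data, queryBResult.data]);

if (!synced) {
  return <Spinner />;
}
// render from synced.arrayA and synced.filteredMapB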

Redux: Is it common to do deep-compare at reducer and return original state if no change?

Action: fetch an array of data from the server.
Reducer: save an array of data to store.
My reducer saves the array of data in the expected immutable fashion:
return {
  ...state,
  arrayOfData: fetchedArrayOfData,
};
The fetch action happens periodically, but the content of the array remains the same most of the time. However, since a new reference is saved each time, the selector for this data will be considered "changed", and doesn't memoize well if used as an input for createSelector.
The only solution I can think of is to perform a deep-compare at the reducer, and if the contents are the same, simply return the original state:
return state;
Is this a common practice/pattern?
Note: I tried to look around, and most projects are doing the same thing that I was doing (i.e. return a new state object), which does not memoize well and causes selector transformations to re-run.
There are some solutions, like using Immutable.js as mentioned in the comments, but you can also return your state conditionally by comparing the incoming fetchedArrayOfData with the previous one (which is stored in your state).
Assume there is a comparison function that takes two arrays and compares them.
In the reducer:
const previousFetchedData = state.arrayOfData;
const newFetchedData = action.payload.fetchedArrayOfData;
const resultOfComparison = isSameArray(previousFetchedData, newFetchedData); // true or false

if (resultOfComparison) {
  // case of same arrays
  return state;
} else {
  // case of different arrays
  return {
    ...state,
    arrayOfData: newFetchedData,
  };
}
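For completeness, one possible isSameArray sketch (an assumption on my part: the items are plain, JSON-serializable objects in a stable order; see Note 1 below for more robust options):

// shallow length check plus per-item JSON comparison
const isSameArray = (a, b) =>
  a.length === b.length &&
  a.every((item, index) => JSON.stringify(item) === JSON.stringify(b[index]));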
Note 1: you can create your own comparison function, but there are many nice ones in this post on Stack Overflow which you can use.
Note 2: using Immutable.js and conditionally returning data from the reducer are both common in this scenario, so don't worry about using them.
Note 3: you can also compare your data at the component level the traditional way, with shouldComponentUpdate.
Note 4: using middleware like redux-saga can be useful; you could also implement the isSameArray function in the saga and then dispatch the proper action. Read more in the saga documentation.
Note 5: the best solution (in my opinion) is to handle this case in the backend service with a 304 status, which means Not Modified. Then you can easily determine the right action according to the response status. More info in the MDN documentation.

Using Merge with a single Create call in FaunaDB is creating two documents?

Got a weird bug using FaunaDB with Node.js running in a Netlify Function.
I am building out a quick proof-of-concept and initially everything worked fine. I had a Create query that looked like this:
const faunadb = require('faunadb');
const q = faunadb.query;

const CreateFarm = (data) => (
  q.Create(
    q.Collection('farms'),
    { data },
  )
);
As I said, everything here works as expected. The trouble began when I tried to start normalizing the data FaunaDB sends back. Specifically, I want to merge the Fauna-generated ID into the data object, and send just that back with none of the other metadata.
I am already doing that with other resources, so I wrote a helper query and incorporated it:
const faunadb = require('faunadb');
const q = faunadb.query;

const Normalize = (resource) => (
  q.Merge(
    q.Select(['data'], resource),
    { id: q.Select(['ref', 'id'], resource) },
  )
);

const CreateFarm = (data) => (
  Normalize(
    q.Create(
      q.Collection('farms'),
      { data },
    ),
  )
);
This Normalize function works as expected everywhere else. It builds the correct merged object with an ID with no weird side effects. However, when used with CreateFarm as above, I end up with two identical farms in the DB!!
I've spent a long time looking at the rest of the app. There is definitely only one POST request coming in, and CreateFarm is definitely only being called once. My best theory was that since Merge copies the first resource passed to it, Create is somehow getting called twice on the DB. But reordering the Merge call does not change anything. I have even tried passing in an empty object first, but I always end up with two identical objects created in the end.
Your helper creates an FQL query with two separate Create expressions. Each is evaluated and creates a new Document. This is not related to the Merge function.
Merge(
  Select(['data'], Create(
    Collection('farms'),
    { data },
  )),
  { id: Select(['ref', 'id'], Create(
    Collection('farms'),
    { data },
  )) },
)
Use Let to create the document, then Update it with the id. Note that this increases the number of Write Ops required for your application. It will basically double the cost of creating Documents. But for what you are trying to do, this is how to do it.
Let(
  {
    newDoc: Create(Collection("farms"), { data }),
    id: Select(["ref", "id"], Var("newDoc")),
    data: Select(["data"], Var("newDoc"))
  },
  Update(
    Select(["ref"], Var("newDoc")),
    {
      data: Merge(
        Var("data"),
        { id: Var("id") }
      )
    }
  )
)
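For reference, the same Let-based query wrapped back into the shape of the original CreateFarm helper (just a sketch, reusing the q. prefix style from the question):

const faunadb = require('faunadb');
const q = faunadb.query;

// create the document, then update it so its own id is merged into data
const CreateFarm = (data) => (
  q.Let(
    {
      newDoc: q.Create(q.Collection('farms'), { data }),
      id: q.Select(['ref', 'id'], q.Var('newDoc')),
      data: q.Select(['data'], q.Var('newDoc')),
    },
    q.Update(
      q.Select(['ref'], q.Var('newDoc')),
      { data: q.Merge(q.Var('data'), { id: q.Var('id') }) },
    ),
  )
);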
Aside: why store id in the document data?
It's not clear why you might need to do this. Indexes can be created on the Ref values themselves. If your client receives a Ref, then that can be passed into subsequent queries directly. In my experience, if you need the plain id value directly in an application, transform the Document as close to that point in the application as possible (like using ids as keys for an array of web components).
There's even a slight Compute advantage for using Ref values rather than re-building Ref expressions from a Collection name and ID. The expression Ref(Collection("farms"), "1234") counts as 2 FQL functions toward Compute costs, but reusing the Ref value returned by queries is free.
Working with GraphQL, the _id field is abstracted out for you because working with Document types in GraphQL would be pretty awful. However, the best practice for FQL queries is to use the Refs directly as much as possible.
Don't let me talk in absolute terms, though! I believe generally that there's a reason for anything. If you believe you really need to duplicate the ID in the Document's data, then I would be interested in a comment explaining why.

Vue Array converted to Proxy object

I'm new to Vue. While making this component I got stuck here.
I'm making an AJAX request to an API that returns an array using this code:
<script>
import axios from 'axios';

export default {
  data() {
    return {
      tickets: [],
    };
  },
  methods: {
    getTickets() {
      axios.get(url)
        .then((response) => {
          console.log(response.data); // [{}, {}, {}]
          this.tickets = [...response.data];
          console.log(this.tickets); // proxy object
        });
    },
  },
  created() {
    this.getTickets();
  },
};
</script>
The problem is, this.tickets gets set to a Proxy object instead of the Array I'm getting from the API.
What am I doing wrong here?
Items in data like your tickets are made into observable objects. This is to allow reactivity (automatically re-rendering the UI and other features). This is expected and the returned object should behave just like the array.
Check out the reactivity docs because you need to interact with arrays in a specific pattern or it will not update on the ui: https://v3.vuejs.org/guide/reactivity-fundamentals.html
If you do not want to have reactivity - maybe you never update tickets on the client and just want to display them - you can use Object.freeze() on response.data;
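A minimal sketch of that approach, assuming the same axios request as in the question (frozen objects are not extensible, so Vue leaves them un-proxied):

getTickets() {
  axios.get(url)
    .then((response) => {
      // display-only: Vue will not wrap a frozen array in a reactive proxy
      this.tickets = Object.freeze(response.data);
    });
},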
If you want the raw, non-reactive data back, use toRaw:
https://vuejs.org/api/reactivity-advanced.html#toraw
const foo = {}
const reactiveFoo = reactive(foo)
console.log(toRaw(reactiveFoo) === foo) // true
Or use unref if you do not want the ref wrapper around your info:
https://vuejs.org/api/reactivity-utilities.html#unref
You can retrieve the Array response object from the returned Proxy by converting it to a JSON string and back into an Array like so:
console.log(JSON.parse(JSON.stringify(this.tickets)));
You're not doing anything wrong. You're just finding out some of the intricacies of using vue 3.
Mostly you can work with the proxied array-object just like you would with the original. However the docs do state:
The use of Proxy does introduce a new caveat to be aware of: the proxied object is not equal to the original object in terms of identity comparison (===).
Other operations that rely on strict equality comparisons can also be impacted, such as .includes() or .indexOf().
The advice in the docs doesn't quite cover these cases yet. I found I could get .includes() to work when checking against Object.values(array) (thanks to @adamStarrh in the comments).
import { isProxy, toRaw } from 'vue';

let rawData = someData;
if (isProxy(someData)) {
  rawData = toRaw(someData);
}

Get array of objects from real time data snapshot - Cloud Firestore

I'm trying to fetch real time data from Cloud Firestore using the below code.
export const getRealTimeData = () =>
  db
    .collection('posts')
    .onSnapshot((querySnapshot) => {
      const posts: any = [];
      querySnapshot.forEach((doc) =>
        posts.push(Object.assign({
          id: doc.id
        }, doc.data()))
      );
    });
And I want to use the resulting array to display the data in the UI. When I do this, what I get back is a function, not the actual array of data.
const posts = getRealTimeData();
Here's what I get when I log posts
function () {
i.kT(), o.al(s);
}
Could anyone please point where I went wrong?
Realtime listeners added with onSnapshot() are not compatible with returning values from function calls. That's because they continue to generate new results over time, and would never really "return" anything once. You should abandon the idea of making a synchronous getter-type function in this case - they just don't work for what you're trying to do.
Ideally, you would use an architecture like Redux to manage the updates as they become available. Your realtime listener would dispatch query updates to a store, and your component would subscribe to that store to receive those updates.
If you don't want to use Redux (which is too bad - you really should for this sort of thing), then you should wrap your query inside a useEffect hook, then have your listener set a state hook variable so your component can receive the updates.
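A minimal sketch of that useEffect approach (assuming a React function component, the same db handle from the question, and the namespaced onSnapshot API):

import { useEffect, useState } from 'react';

export const usePosts = () => {
  const [posts, setPosts] = useState([]);

  useEffect(() => {
    // onSnapshot returns an unsubscribe function; returning it lets React clean up the listener
    const unsubscribe = db.collection('posts').onSnapshot((querySnapshot) => {
      const next = [];
      querySnapshot.forEach((doc) => next.push({ id: doc.id, ...doc.data() }));
      setPosts(next);
    });
    return unsubscribe;
  }, []);

  return posts; // the component re-renders with each new snapshot
};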
