How to write batched transactions for Firebase Admin Realtime DB? - javascript

Is there a way to perform batched transactions on different fields in the Realtime Database with the Admin SDK? Currently, I'm using the following:
exports.function = functions.https.onCall((data, context) => {
  var transactions = new Object();
  transactions[0] = admin.database().ref('ref1/')
    .transaction(currentCount => {
      return (currentCount || 0) + 1;
    }, (error, committed, dataSnapshot) => {...})
  transactions[1] = admin.database().ref('ref2/')
    .transaction(currentCount => {
      return (currentCount || 0) + 1;
    }, (error, committed, dataSnapshot) => {...})
  return admin.database().ref().update(transactions)
    // |^| error occurs right above '|^|', but i don't know why, i suspect it may have something to do with transactions object, and if so, what's the proper way to do batched transactions?
    .then(result => {...})
    .catch(error => {
      console.error('error: ' + error)
    })
}
but every time this function is called, although the transactions do work as a batch, the following error is thrown:
Unhandled error TypeError: obj.hasOwnProperty is not a function
at each (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:541:17)
at validateFirebaseData (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:1470:9)
at /srv/node_modules/@firebase/database/dist/index.node.cjs.js:1487:13
at each (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:542:13)
at validateFirebaseData (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:1470:9)
at /srv/node_modules/@firebase/database/dist/index.node.cjs.js:1559:9
at each (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:542:13)
at validateFirebaseMergeDataArg (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:1557:5)
at Reference.update (/srv/node_modules/@firebase/database/dist/index.node.cjs.js:13695:9)
at admin.firestore.collection.doc.collection.doc.create.then.writeResult (/srv/index.js:447:43)

You can't pass a bunch of transactions into an update() call, which is what the error message (admittedly somewhat confusingly) is trying to tell you.
Firebase has no concept of nested, or batched, transactions. If you need to perform a transaction over multiple locations, you will need to run it as a single transaction call over a node that is above all of those locations. As you can probably guess, the contention on such a multi-location transaction will very quickly become a throughput limit, so you'll want to consider alternative solutions.
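For illustration, a rough sketch of that common-ancestor approach (not code from the answer; it assumes a hypothetical counters node that contains both counts):
return admin.database().ref('counters').transaction(current => {
  // the whole 'counters' node is read and rewritten, which is where the contention comes from
  current = current || {};
  current.ref1 = (current.ref1 || 0) + 1;
  current.ref2 = (current.ref2 || 0) + 1;
  return current;
});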
The "simplest" approach I can think of for your use-case, is to replace the two transactions with a single multi-location update, and then use server-side security rules to validate the operation.
For an example of how to do something similar, see my answer here: Is the way the Firebase database quickstart handles counts secure?
With this approach you prevent most of the contention, as the multi-location update doesn't need to read-send-check the entire top-level node, but merely the lower-level nodes that you're updating.
You may have to modify your data structure, and possibly even write additional data, to allow this approach. But in return you'll get a much more scalable transactional update.
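As a sketch of what such a multi-location update could look like (not code from the answer, and assuming the two counters live at ref1 and ref2 as in the question), recent Admin SDK versions also offer ServerValue.increment, which lets the server apply the delta without a client-side read:
const updates = {
  'ref1': admin.database.ServerValue.increment(1),
  'ref2': admin.database.ServerValue.increment(1)
};
return admin.database().ref().update(updates);
Security rules can then validate, for example, that each write increases a counter by exactly 1.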

Related

How to wrap code that uses transactions in a transaction and then roll it back?

I'm setting up my integration testing rig. I'm using the beforeEach and afterEach hooks to wrap every single test in a transaction that rolls back, so that the tests don't affect each other. A simplified example might be this:
const { repository } = require("library")

describe("Suite", function () {
  beforeEach(async function () {
    await knex.raw("BEGIN");
  });

  afterEach(async function () {
    await knex.raw("ROLLBACK");
  });

  it("A test", async function () {
    const user = await repository.createUser()
    user.id.should.equal(1)
  });
});
This worked fine because I configured knex to use a single DB connection for tests. Hence calling knex.raw("BEGIN"); created a global transaction.
Now, however, the library's repository, which I can't control, has started using transactions internally. I.e. createUser() begins and then commits the created user. This broke my tests, as my afterEach hook no longer rolls back the changes because they were already committed.
Is there a way in Postgres to roll back a transaction that has (already committed) nested transactions?
Or maybe a way to use knex to prevent the repository from starting transactions in the first place? It uses knex.transaction() to create them.
Thanks!
Judging by the looks of an example debug log, knex does in fact detect transaction nesting automatically and switches nested transactions from irreversible commit/rollback to manageable savepoint s1 / release s1 / rollback to s1, the way I was guessing in my comment.
In this case, it should be enough for you to wrap your calls in a transaction, so that you "own" the top-level one. Knex should detect this and force the underlying transactions to use savepoints instead of commits, all of which you can then undo by rolling back the top-level transaction. If I read the docs right:
const { repository } = require("library")

describe("Suite", function () {
  it("A test", async function () {
    try {
      await knex.transaction(async trx => {
        const user = await repository.createUser();
        user.id.should.equal(1);
        trx.rollback();
      })
    } catch (error) {
      console.error(error);
    }
  });
});
That's assuming none of the calls below issues a knex.raw("COMMIT") or somehow calls .commit() on the outer, top-level transaction.
As may be guessed from the tags, the library in question is Strapi, and I'm trying to write tests for the custom endpoints I implemented with it.
As noted by @zagarek, Postgres itself can't roll back already committed transactions. Knex does support nested transactions (using savepoints), but you must explicitly refer to the parent transaction when creating a new one for it to get nested.
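For illustration, a minimal sketch (not from the answer) of that explicit nesting, assuming a Knex version that supports creating transactions without a handler:
// inside an async test or hook
const outer = await knex.transaction();   // BEGIN
const inner = await outer.transaction();  // SAVEPOINT, because it is created from the parent
await inner('users').insert({ name: 'temp' });
await inner.commit();                     // RELEASE SAVEPOINT, still undoable
await outer.rollback();                   // ROLLBACK undoes the inner work too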
Many have tried to achieve this setup. See the threads under e.g. here or here. It always boils down to somehow passing the test-wrapping transaction all the way down to your ORM/repository/code under test and instructing it to scope all queries under that transaction.
Unfortunately, Strapi doesn't provide any way to be given a transaction, nor to create a global one. Now, cover your eyes and I'll tell you how I hacked around this.
I leverage one nice aspect of Knex: its Transaction object behaves (mostly) the same as a Knex instance. I mercilessly replace Strapi's reference to the Knex instance with a Knex transaction and then roll it back in the afterEach hook. To not make this too easy, Strapi extends its knex instance with a getSchemaName function, so I extend the transaction in disguise too and proxy that call to the original.
This does it. (Note that I'm using Mocha, where this can be used to pass state between hooks and/or tests.)
const Strapi = require("@strapi/strapi");
const request = require("supertest"); // assumed: supertest drives the HTTP assertions below

before(async function () {
  // "Load" Strapi to set the global `strapi` variable.
  await Strapi().load();
  // "Listen" to register API routes.
  await strapi.listen();
  // Put aside Strapi's knex instance for later use in the beforeEach and afterEach hooks.
  this.knex = strapi.db.connection;
});

after(async function () {
  // Put back the original knex instance so that Strapi can destroy it properly.
  strapi.db.connection = this.knex;
  await strapi.destroy();
});

beforeEach(async function () {
  // Replace Strapi's Knex instance with a transaction.
  strapi.db.connection = Object.assign(await this.knex.transaction(), {
    getSchemaName: this.knex.getSchemaName.bind(this.knex),
  });
});

afterEach(async function () {
  await strapi.db.connection.rollback();
});

it("Health-check is available.", async function () {
  // Any changes made within here will get rolled back once the test finishes.
  await request(strapi.server.httpServer).get("/_health").expect(204);
});
Lastly, it's worth noting that some Knex maintainers persistently discourage using transactions to isolate tests, so consider whether chasing this hacky setup is a good idea.

How to get id of an object from firebase

I am trying to get the id of an object after setting that object, but I am getting a type error: TypeError: Cannot read properties of undefined (reading 'val'). How should I do that with Firebase 9?
Here is the code that I want to work:
set(push(ref(db, "expenses")), expense)
  .then((snapshot) => {
    console.log(snapshot.val());
    dispatch(
      addExpense({
        id: snapshot.key,
        ...expense,
      })
    );
  })
  .catch((e) => {
    console.log("This failed.", e);
  });
Thanks in advance.
Why your code doesn't work
The documentation of set(ref, value) shows that it is defined as:
function set(ref: DatabaseReference, value: unknown): Promise<void>
It returns a Promise<void>, so there's no snapshot being passed to your then.
If the promise resolves (and thus your then callback gets called), it means that the expense was written to the database on the server as is.
How to fix it
If you want to get the key of the push call, you can capture that outside of the set call already:
const newRef = push(ref(db, "expenses"));
set(newRef, expense)
  .then(() => {
    dispatch(
      addExpense({
        id: newRef.key,
        ...expense,
      })
    );
  })
  .catch((e) => {
    console.log("This failed.", e);
  });
Calling push is a pure client-side operation, which is synchronous, so that doesn't require await or then (which should be used with asynchronous operations).
Further considerations
Note though that now you're only showing the expense locally after it's been written to the server. If that is a requirement for your use-case, then 👍. But when using Firebase it is quite common to:
Use a permanent, onValue listener on expenses to show the latest expenses in the UI.
Write the new expense with a simple call, without a then() listener: set(push(ref(db, "expenses")), expense);
The Firebase SDK will then immediately call the local onValue listener with the new value, with the assumption that the write will succeed.
So your UI will show the local value straight away, giving the user an almost instant response.
In the (more uncommon) case that the server (i.e. your security rules) rejects the write operation, the SDK calls onValue again with the corrected data, so your UI can update the state.
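For illustration, a minimal sketch of that listener-plus-write pattern (not from the answer; it assumes the same db and expense values as above):
import { ref, onValue, push, set } from "firebase/database";

// Permanent listener: fires immediately with the locally written value,
// and again with corrected data if the server ends up rejecting the write.
onValue(ref(db, "expenses"), (snapshot) => {
  const expenses = [];
  snapshot.forEach((child) => {
    expenses.push({ id: child.key, ...child.val() });
  });
  // dispatch or render `expenses` here
});

// Fire-and-forget write; no then() is needed for the UI to update.
set(push(ref(db, "expenses")), expense);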

Wait for Observable to complete in order to submit a form

I have a 'new trip' form, where the user can write the names of the participants and then submit the form to create the trip.
On submit, I query a Firebase database with the names, in order to get the IDs of the participants (/users). I then add the IDs to the participantsID field of the trip object and then I push the new trip to Firebase.
The problem is that the Firebase query is async and returns an Observable, therefore my function will proceed to push the object before the Observable has completed, so the participantsID field of the new object is empty.
Is there any method to wait for the observable to complete (in a kind of synchronous way) so that I can manipulate the data and then proceed? All my attempts to fix this have failed so far.
Here's my simple code.
getUserByAttribute(attribute, value) {
  return this.db.list('/users', {
    query: {
      orderByChild: attribute,
      equalTo: value,
      limitToFirst: 1
    }
  });
}

createTrip(trip) {
  for(let name in participantsName.split(',')) {
    getUserByAttribute('username', name)
      .subscribe( user => trip.participantsID.push(user[0].$key) );
  }
  this.db.list('/trips').push(trip);
}
You could combine all the Observables into a single Observable by using forkJoin:
createTrip(trip) {
  var observableArray: any = participantsName.split(',')
    .switchMap((name) => getUserByAttribute('username', name))
  Observable.forkJoin(observableArray).subscribe(
    trips => trips.forEach((trip) => {
      this.db.list('/trips').push(trip);
    })
  );
}
In the end I used part of @Pankaj Parkar's answer to solve the problem.
I forkJoin all the Observables returned by mapping over the split names, and I subscribe to that Observable, whose result is an array of arrays, where each inner array contains a user object.
getUserByAttribute(attribute, value) {
  return this.db.list('/users', {
    query: {
      orderByChild: attribute,
      equalTo: value,
      limitToFirst: 1
    }
  }).first();
}

createTrip(trip) {
  Observable.forkJoin(
    trip.participantsName.split(',')
      .map(name => getUserByAttribute('name', name))
  ).subscribe(
    participants => {
      trip.participants = participants.map( p => p[0].$key);
      this.tripService.createTrip(trip);
    }
  );
}
You have a difficult problem: you have to get the users' info before pushing a new trip.
You can't just create new subscriptions every time because of memory leaks (or you have to be careful with unsubscribing). If you are using Firebase, you can use AngularFire's subject support.
You can update a subscription by using a subject in your query (with the equalTo) and then push the next user to retrieve with .next(user).
Then you still have to wait for all the users. For that, you can have only one subscription and get all the IDs one at a time, or have multiple subscriptions to get multiple results faster (but that's more difficult).
To solve this problem, I created (see the sketch after this list):
a queue of callbacks (just arrays, using the push() and unshift() methods)
a queue of values
one subject for one subscription.
If you want an ID, you have to:
push the value
push the callback that will retrieve the returned value.
You should use functions to push, because you'll have to call .next() if there is no value in the stack yet (to start!).
Then in your subscription's callback, i.e. when you receive the distant user object, you call the first callback in the stack. Don't forget to pop your value and callback off the stacks and call next() with the next value if there is one.
This way, you can push your trip in the last callback, for the last user. And since it's all callbacks, your app is not blocked.
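A rough, generic sketch of that queue idea (this is not the answer's AngularFire code; getUserByAttribute and all names here are placeholders, assuming RxJS 5 with patched operators as was common with Angular 2):
import { Subject } from 'rxjs/Subject';
import 'rxjs/add/operator/switchMap';
import 'rxjs/add/operator/first';

const values = [];     // queue of names waiting to be resolved
const callbacks = [];  // queue of callbacks, one per queued name
const name$ = new Subject();

// The single long-lived subscription: resolve the oldest request, then start the next one.
name$.switchMap(name => getUserByAttribute('username', name).first())
  .subscribe(users => {
    values.shift();
    const done = callbacks.shift();
    done(users[0].$key);
    if (values.length) name$.next(values[0]); // kick off the next lookup
  });

function requestId(name, callback) {
  values.push(name);
  callbacks.push(callback);
  if (values.length === 1) name$.next(name); // nothing in flight, start now
}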
I still haven't decided whether we should do this in a cloud function, because the user has to stay connected, and this uses their data/processor. But it's good to have all the code in the same place, and cloud functions are limited in the free version of Firebase. What would a Firebase developer advise?
I searched a lot to find a better solution, so please share one if you have it. It's a little complicated I think, but it's working very well. I had the same problem when a user wants to add a new flight: I need to get the airports' information first (coords) and push multiple objects (details, maps, etc.)

Testing Complex Asynchronous Redux Actions

So, let's say I have the following action:
export function login({ email, password, redirectTo, doNotRedirect }) {
  return ({ dispatch }) => {
    const getPromise = async () => {
      const basicToken = Base64.encode(`${email}:${password}`);
      const authHeaders = { Authorization: `Basic ${basicToken}` };

      const { payload, error } = await dispatch(sendAuthentication(authHeaders));
      if (error) throw payload;

      const { username, token, fromTemporaryPassword } = payload;
      const encodedToken = Base64.encode(`${username}:${token}`);

      dispatch(persistence.set('authorizationToken', encodedToken));
      dispatch(postGlobalId({ username }));
      dispatch(setIsLoggedIn(true));
      dispatch(setIsFromTemporaryPassword(fromTemporaryPassword));

      await dispatch(clientActions.fetchClient);

      if (doNotRedirect) return;

      if (fromTemporaryPassword)
        dispatch(updatePath('/profile/change-password'));
      else
        dispatch(updatePath(redirectTo || '/dashboard'));
    };

    return {
      type: AUTHENTICATION_LOGIN,
      payload: getPromise()
    };
  };
}
And I want to add tests for it, to add reliability to the code.
So, here are a few things:
We send authentication headers and get data as a response
We throw an error if some error is present in the response
We set up all needed tokens, dispatch all needed actions to show that we are logged in now
We fetch the client data
Based on params and received data, we redirect to the needed route / don't redirect
The question is that this is really too hard to test: we need to stub literally everything, which is bad due to brittle tests, fragility and too much knowledge of the implementation (not to mention that it is pretty challenging to stub dispatch so it works properly).
Therefore, should I test all of these 5 points, or focus only on the most important stuff, like sending the authorization request, throwing the error and checking redirects? I mean, the problem with all the flags is that they can change, so they are not that reliable.
Another solution is just to separate these activities into something like the following:
auth
setLoginInfo
handleRedirects
And to pass all the needed functions to invoke through dependency injection (here basically just with params)? With this approach I can spy only on the invocation of these functions, without going into much detail.
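For what it's worth, a minimal sketch of that dependency-injection split (the names are made up; this is not the original action):
// The orchestration receives its collaborators, so a test only has to spy on them.
export function makeLogin({ auth, setLoginInfo, handleRedirects }) {
  return async function login({ email, password, redirectTo, doNotRedirect }) {
    const payload = await auth(email, password);
    await setLoginInfo(payload);
    if (!doNotRedirect) {
      handleRedirects({ fromTemporaryPassword: payload.fromTemporaryPassword, redirectTo });
    }
    return payload;
  };
}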
I am quite comfortable with unit testing of pure functions and handling different edge-cases for them (without testing too much implementation, just the result), but testing complex functions with side-effects is really hard for me.
If you have very complex actions like that, I think an alternative (better?) approach is to have simple synchronous actions instead (you can even just dispatch payloads directly and drop action creators if you like, reducing boilerplate), and handle the asynchronous side using redux-saga: https://github.com/yelouafi/redux-saga
Redux Saga makes it very simple to factor out your business logic code into multiple simple generator functions that can be tested in isolation. They can also be tested without the underlying API methods even being called, due to the 'call' function in that library: http://yelouafi.github.io/redux-saga/docs/api/index.html#callfn-args. Due to the use of generators, your test can 'feed' values to the saga using the standard iterator.next method. Finally, they make it much easier for reducers to have their say, since you can check something from store state (e.g. using a selector) to see what to do next in your saga.
If Redux + Redux Saga had existed before I started on my app (about 100,000 JS(X) LOC so far), I would definitely have used them.
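For example, a minimal sketch of testing such a flow as a saga (the saga and api module here are made up, not the code from the question):
import { call, put } from 'redux-saga/effects';
import assert from 'assert';

// Hypothetical API module; it is never actually invoked in the test.
const api = { authenticate: (email, password) => Promise.resolve({}) };

function* loginSaga({ email, password }) {
  const payload = yield call(api.authenticate, email, password);
  yield put({ type: 'AUTHENTICATION_LOGIN', payload });
}

// Drive the generator by hand: no network, no store, no stubbing of dispatch.
const it = loginSaga({ email: 'a@b.c', password: 'pw' });
assert.deepStrictEqual(it.next().value, call(api.authenticate, 'a@b.c', 'pw'));

const fakePayload = { username: 'a', token: 't', fromTemporaryPassword: false };
assert.deepStrictEqual(
  it.next(fakePayload).value,
  put({ type: 'AUTHENTICATION_LOGIN', payload: fakePayload })
);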

Shortest code to cache Rxjs http request while not complete?

I'm trying to create an observable flow that fulfills the following requirements:
1. Loads data from storage at subscribe time
2. If the data has not yet expired, return an observable of the stored value
3. If the data has expired, return an HTTP request observable that uses the refresh token to get a new value and store it
4. If this code is reached again before the request has completed, return the same request observable
5. If this code is reached after the previous request completed or with a different refresh token, start a new request
I'm aware that there are many different answers on how to perform step (3), but as I'm trying to perform these steps together I am looking for guidance on whether the solution I've come up with is the most succinct it can be (which I doubt!).
Here's a sample demonstrating my current approach:
var cachedRequestToken;
var cachedRequest;

function getOrUpdateValue() {
  return loadFromStorage()
    .flatMap(data => {
      // data doesn't exist, shortcut out
      if (!data || !data.refreshtoken)
        return Rx.Observable.empty();

      // data still valid, return the existing value
      if (data.expires > new Date().getTime())
        return Rx.Observable.return(data.value);

      // if the refresh token is different or the previous request is
      // complete, start a new request, otherwise return the cached request
      if (!cachedRequest || cachedRequestToken !== data.refreshtoken) {
        cachedRequestToken = data.refreshtoken;

        var pretendHttpBody = {
          value: Math.random(),
          refreshToken: Math.random(),
          expires: new Date().getTime() + (10 * 60 * 1000) // set by server, expires in ten minutes
        };

        cachedRequest = Rx.Observable.create(ob => {
          // this would really be a http request that exchanges
          // the one use refreshtoken for new data, then saves it
          // to storage for later use before passing on the value
          window.setTimeout(() => { // emulate slow response
            saveToStorage(pretendHttpBody);
            ob.next(pretendHttpBody.value);
            ob.completed();
            cachedRequest = null; // clear the request now we're complete
          }, 2500);
        });
      }

      return cachedRequest;
    });
}

function loadFromStorage() {
  return Rx.Observable.create(ob => {
    var storedData = {  // loading from storage goes here
      value: 15,        // wrapped in observable to delay loading until subscribed
      refreshtoken: 63, // other process may have updated this between requests
      expires: new Date().getTime() - (60 * 1000) // pretend to have already expired
    };
    ob.next(storedData);
    ob.completed();
  })
}

function saveToStorage(data) {
  // save goes here
}

// first request
getOrUpdateValue().subscribe(function(v) { console.log('sub1: ' + v); });

// second request, can occur before or after first request finishes
window.setTimeout(
  () => getOrUpdateValue().subscribe(function(v) { console.log('sub2: ' + v); }),
  1500);
First, have a look at a working jsbin example.
The solution is a tad different than your initial code, and I'd like to explain why. The need to keep returning to your local storage, saving to it, and saving flags (cache and token) didn't fit for me with a reactive, functional approach. The heart of the solution I gave is:
var data$ = new Rx.BehaviorSubject(storageMock);
var request$ = new Rx.Subject();

request$.flatMapFirst(loadFromServer).share().startWith(storageMock).subscribe(data$);
data$.subscribe(saveToStorage);

function getOrUpdateValue() {
  return data$.take(1)
    .filter(data => (data && data.refreshtoken))
    .switchMap(data => (data.expires > new Date().getTime()
      ? data$.take(1)
      : (console.log('expired ...'), request$.onNext(true), data$.skip(1).take(1))));
}
The key is that data$ holds your latest data and is always up to date; it is easily accessible by doing a data$.take(1). The take(1) is important to make sure your subscription gets a single value and terminates (because you attempt to work in a procedural, as opposed to functional, manner). Without the take(1) your subscription would stay active and you would have multiple handlers out there, that is, you'd handle future updates as well in code that was meant only for the current update.
In addition, I hold a request$ subject which is your way to start fetching new data from the server. The function works like so:
The filter ensures that if your data is empty or has no token, nothing passes through, similar to the return Rx.Observable.empty() you had.
If the data is up to date, it returns data$.take(1) which is a single element sequence you can subscribe to.
If not, it needs a refresh. To do so, it triggers request$.onNext(true) and returns data$.skip(1).take(1). The skip(1) is to avoid the current, outdated value.
For brevity I used (console.log('expired ...'), request$.onNext(true), data$.skip(1).take(1)). This might look a bit cryptic. It uses the JS comma operator syntax, which is common in minified/uglified code. It executes all the statements and returns the result of the last one. If you want more readable code, you could rewrite it like so:
.switchMap(data => {
  if (data.expires > new Date().getTime()) {
    return data$.take(1);
  } else {
    console.log('expired ...');
    request$.onNext(true);
    return data$.skip(1).take(1);
  }
});
The last part is the usage of flatMapFirst. This ensures that once a request is in progress, all following requests are dropped. You can see that it works in the console printout: 'load from server' is printed several times, yet the actual sequence is invoked only once and you get a single 'loading from server done' printout. This is a more reactive-oriented solution than your original refreshtoken flag checking.
Though I didn't need the saved data, it is saved because you mentioned that you might want to read it on future sessions.
A few tips on rxjs:
Instead of using setTimeout, which can cause many problems, you can simply do Rx.Observable.timer(time_out_value).subscribe(...).
Creating an observable is cumbersome (you even had to call next(...) and complete()). There is a much cleaner way to do this using Rx.Subject, and note that this class has specializations, BehaviorSubject and ReplaySubject. These classes are worth knowing and can help a lot (see the sketch below).
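For instance, loadFromStorage above could be written with a ReplaySubject instead of Observable.create (a sketch in the RxJS 4 style used in this answer):
function loadFromStorage() {
  var stored$ = new Rx.ReplaySubject(1); // replays the last value to late subscribers
  stored$.onNext({
    value: 15,
    refreshtoken: 63,
    expires: new Date().getTime() - (60 * 1000)
  });
  stored$.onCompleted();
  return stored$;
}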
One last note. This was quite a challenge :-) I'm not familiar with your server-side code and design considerations, yet the need to suppress calls felt uncomfortable to me. Unless there is a very good reason related to your backend, my natural approach would be to use flatMap and let the last request 'win', i.e. drop previous unterminated calls and set the value.
The code is rxjs 4 based (so it can run in jsbin), if you're using angular2 (hence rxjs 5), you'll need to adapt it. Have a look at the migration guide.
================ answers to Steve's other questions (in comments below) =======
There is one article I can recommend. Its title says it all :-)
As for the procedural vs. functional approach, I'd add another variable to the service:
let token$ = data$.pluck('refreshtoken');
and then consume it when needed.
My general approach is to first map my data flows and relations and then, like a good "keyboard plumber" (like we all are), build the piping. My top-level draft for a service would be (skipping the Angular 2 formalities and the provider for brevity):
class UserService {
  data$: <as above>;
  token$: data$.pluck('refreshtoken');
  private request$: <as above>;

  refresh() {
    this.request$.onNext(true);
  }
}
You might need to do some checking so the pluck does not fail.
Then, each component that needs the data or the token can access it directly.
Now let's suppose you have a service that needs to act on a change to the data or the token:
class SomeService {
  constructor(private userSvc: UserService) {
    this.userSvc.token$.subscribe(() => this.doMyUpdates());
  }
}
If you need to synthesize data, meaning use the data/token together with some local data:
Rx.Observable.combineLatest(this.userSvc.data$, this.myRelevantData$)
  .subscribe(([data, myData]) => this.doMyUpdates(data.someField, myData.someField));
Again, the philosophy is that you build the data flow and pipes, wire them up and then all you have to do is trigger stuff.
The 'mini pattern' I've come up with is to pass my trigger sequence to a service once and register to the result. Let's take autocomplete as an example:
class ACService {
  fetch(text: string): Observable<Array<string>> {
    return http.get(text).map(response => response.json().data);
  }
}
Then you have to call it every time your text changes and assign the result to your component:
<div class="suggestions" *ngFor="let suggestion; of suggestions | async;">
<div>{{suggestion}}</div>
</div>
and in your component:
onTextChange(text) {
  this.suggestions = this.acSVC.fetch(text);
}
but this could be done like this as well:
class ACService {
  createFetcher(textStream: Observable<string>): Observable<Array<string>> {
    return textStream.flatMap(text => http.get(text))
      .map(response => response.json().data);
  }
}
And then in your component:
textStream: Subject<string> = new Subject<string>();
suggestions: Observable<Array<string>>;

constructor(private acSVC: ACService) {
  this.suggestions = acSVC.createFetcher(this.textStream);
}

onTextChange(text) {
  this.textStream.next(text);
}
template code stays the same.
It seems like a small thing here, but once the app grows bigger and the data flow more complicated, this works much better. You have a sequence that holds your data and you can use it around the component wherever you need it; you can even transform it further. For example, let's say you need to know the number of suggestions. In the first method, once you get the result, you need to further query it to get the count, thus:
onTextChange(text) {
  this.suggestions = this.acSVC.fetch(text);
  this.suggestionsCount = this.suggestions.pluck('length'); // in a sequence
  // or
  this.suggestions.subscribe(suggestions => this.suggestionsCount = suggestions.length); // in a numeric variable
}
Now in the second method, you just define:
constructor(private acSVC: ACService) {
  this.suggestions = acSVC.createFetcher(this.textStream);
  this.suggestionsCount = this.suggestions.pluck('length');
}
Hope this helps :-)
While writing, I tried to reflect on the path I took to get to using reactive like this. Needless to say, ongoing experimentation, numerous jsbins and strange failures are a big part of it. Another thing that I think helped shape my approach (though I'm not currently using it) is learning redux and reading/trying a bit of ngrx (Angular's redux port). The philosophy and the approach don't even let you think procedurally, so you have to tune in to a functional, data-, relations- and flow-based mindset.
