I have the code below that inserts a todo object in an indexedDB object store - and then gets a copy of the stored object (see further down) and this works fine.
I'm concerned that I am reusing a transaction that might be unsafe to use - since the transaction has already succeeded.
Should I create another transaction for the get - or is this unnecessary?
// this is only called after the 'tasks' object store has been opened
const todoIDB = requestIDB.result
const transaction = todoIDB.transaction(["tasks"], "readwrite")
const todoStore = transaction.objectStore("tasks")
const addRequest = todoStore.add({text: txt_val})
addRequest.addEventListener("success", () => {
    console.log("Added " + "#" + addRequest.result + ": " + txt_val)
    // should I add a new transaction, etc. here?
    const getRequest = todoStore.get(addRequest.result)
    getRequest.addEventListener("success", () => {
        console.log("Found " + JSON.stringify(getRequest.result))
    })
})
Here is some (valid) output (from Chrome):
Added #18: aaa
Found {"text":"aaa","id":18}
Added #19: bbb
Found {"text":"bbb","id":19}
Transactions can span multiple requests, so this is fine. (Of course, if the add request fails - e.g. the record already exists - then "success" won't fire and the get request won't happen.)
And to clarify a point - when you're observing the "success" event, it's the request that has succeeded, not the transaction. A "complete" or "abort" event will fire at the transaction object when the overall transaction has finished, i.e. when all of the individual requests have succeeded or one has failed and caused the transaction to fail.
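If you do want to observe the overall outcome, a minimal sketch (building on the code above) would attach listeners to the transaction object itself:
// Sketch: listening for the transaction-level outcome in addition to
// the per-request "success" events shown above.
transaction.addEventListener("complete", () => {
    console.log("Transaction committed - both the add and the get finished")
})
transaction.addEventListener("abort", () => {
    console.log("Transaction aborted: " + transaction.error)
})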
I am attempting to use a simple indexeddb 'get' method to retrieve a keyed record from an existing data store. For some reason that I have been unable to find, though the result of my 'get' request is 'successful', the returned value is 'undefined'.
The code below is called from an event listener that is fired when the user clicks a button in a row of a table, indicating that they want details on that particular row. Each row in the table represents a record in the indexeddb data store that has already been created. There are 8 values of interest in each row, but only 3 are displayed in my HTML table. I am simply attempting to access the row in the data store to get all 8 values in order to pass them along to the next process.
The following code is called from an event listener created on a 'Select' button for each row...
async function getGameInProgressDetails(gameID) {
    try {
        db = await idbConnect(DBName, DBVersion);
        let tx = db.transaction(['gamesList'], 'readonly');
        let gameStore = tx.objectStore('gamesList');
        // I have confirmed the 'gameID' passed in is the key value that I would expect
        // to retrieve the desired result.
        let req = gameStore.get(gameID); // gameID comes from the selected row in the html table.
        req.onsuccess = (ev) => {
            console.log('Success');
            let request = ev.target;
            console.log(request); // returns an IDBRequest object with result: undefined and error: null.
            let theGame = request.result;
            console.log(theGame); // displays 'undefined'.
        }
        req.onerror = (err) => {
            console.warn(err);
        };
    } catch (e) {
        console.warn(e);
    }
}
I get the 'success' message indicating that the 'get' operation was successful, but my result is 'undefined'. Could someone please tell me if I am missing a step, or if there's another way to go about this that I should look into? I've looked at a variety of tutorials and I must be overlooking something 'obvious'.
I discovered the problem with my code/approach and am answering my own question. The reason the above code was not working had nothing to do with my indexeddb code or logic at all.
What I discovered was that the index value I had passed into the routine needed to be cast as an integer before the get() call: JS was treating it as a string value. I was confused by the fact that I had checked the value in a console.log statement and it had shown the expected value. What I hadn't considered was what type JS had decided the variable's value was.
For those coming later and seeing this, my final code was thus:
async function getGameInProgressDetails(gameID) {
    db = await idbConnect(DBName, DBVersion);
    let tx = db.transaction(['gamesList'], 'readonly');
    var gameStore = tx.objectStore('gamesList');
    let gameIndex = gameStore.index("gameIDIdx");
    let request = gameIndex.get(parseInt(gameID)); // !! NOTE THE parseInt !!
    request.onsuccess = function() {
        if (request.result !== undefined) {
            console.log("Games", request.result);
        } else {
            console.log("No such games.");
        }
    }
}
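For anyone hitting the same symptom: IndexedDB keys are compared by type as well as value, so a record stored under the number 18 is not found when you ask for the string "18". A minimal sketch (assuming a store keyed by a numeric id) illustrating the difference:
// Illustration of the string-vs-number key mismatch described above.
const tx = db.transaction(['gamesList'], 'readonly');
const store = tx.objectStore('gamesList');

store.get("18").onsuccess = (e) => console.log(e.target.result); // undefined - no record has the string key "18"
store.get(18).onsuccess = (e) => console.log(e.target.result);   // the stored record with id 18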
I have a Python server side which sends events to the client using SSE.
Here is an example of the Python code. It sends an 'action-status' plus data which the JS has to handle:
import asyncio

from aiohttp_sse import sse_response  # assuming the aiohttp-sse package

async def sse_updates(request):
    loop = request.app.loop
    async with sse_response(request) as resp:
        while True:
            # Send a "remove" update when a queue token has been removed on the server.
            if request.app['sse_requests']['update_queue_vis_remove']:
                await resp.send("update-remove")
                request.app['sse_requests']['update_queue_vis_remove'] = False
            # Send an "append" update when a queue token has been added on the server.
            if request.app['sse_requests']['update_queue_vis_append'][0]:
                await resp.send(f"update-append {request.app['sse_requests']['update_queue_vis_append'][1]} {request.app['sse_requests']['update_queue_vis_append'][2]}")
                request.app['sse_requests']['update_queue_vis_append'][0] = False
            # Send a rewrite of the redundant-token list (to be applied on the client side).
            if request.app['sse_requests']['redundant_tokens_vis'][0]:
                await resp.send('update-redtokens ' + ''.join(token + ' ' for token in request.app['sse_requests']['redundant_tokens_vis'][1]))
                request.app['sse_requests']['redundant_tokens_vis'][0] = False
            await asyncio.sleep(0.1, loop=loop)
    return resp
And the JS script which handles the events:
evtSource = new EventSource("http://" + window.location.host + "/update")
evtSource.onmessage = function(e) {
    // Data from the server arrives as "<server-event-name> <data1> <data2> <data3> ..."
    let fetched_data = e.data.split(' ');
    // First option: a token has been removed on the server, so reflect that on the client side.
    if (fetched_data[0] === "update-remove")
        displayQueueRemove();
    // Second option: a token was appended on the server and should also be shown to the user.
    else if (fetched_data[0] === "update-append")
        // fetched_data[1] - the token
        // fetched_data[2] - its position
        displayQueueAdd(fetched_data[1], parseInt(fetched_data[2]));
    // Last option: the page has been refreshed, so the data in redundant_tokens should be rewritten.
    else if (fetched_data[0] === "update-redtokens") {
        fetched_data.shift();
        // Variables for wrapping each token
        let tag;
        let text;
        // Wrap the tokens and store them in the array.
        for (let i = 0; i < fetched_data.length - 1; i++) {
            tag = document.createElement("div");
            text = document.createTextNode(fetched_data[i]);
            tag.appendChild(text);
            tag.setAttribute("class", "token-field");
            redundant_tokens[i] = tag;
        }
    }
}
The problem is that if I open two or more browser windows (sessions), only one of them catches the event and displays it. Moreover, there have been cases where I triggered a request from one session but the update was delivered to another one.
Is there a way to fix this while staying with SSE? (I have considered other methods, but I'd like to try SSE first.)
I think your problem is synchronizing the data (app["sse_requests"]).
Depending on how you modify the data and who needs to be notified, you might need to keep a list of clients (sessions).
For example, if all clients need to be notified of all events, then keep a list (or even better, a set) of connected clients and create a periodic function (using create_task) that notifies all of them.
If a client needs to only be notified of certain events then you need to identify that client using some sort of key in the request object.
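The underlying issue is that each connected EventSource has its own response stream, so the server has to write each event to every stream it wants to update, rather than flipping a single shared flag that whichever session's loop runs first consumes and resets. A rough sketch of that broadcast idea, written here in Node-style JavaScript to match the rest of this page (the aiohttp version would keep a set of per-client asyncio queues instead); all names are illustrative:
const http = require('http');

// One entry per connected session (browser window/tab).
const clients = new Set();

http.createServer((req, res) => {
    if (req.url === '/update') {
        res.writeHead(200, {
            'Content-Type': 'text/event-stream',
            'Cache-Control': 'no-cache',
            'Connection': 'keep-alive'
        });
        clients.add(res);
        req.on('close', () => clients.delete(res));
    }
}).listen(8080);

// Call this whenever the server-side state changes; every session gets the event.
function broadcast(eventText) {
    for (const res of clients) {
        res.write('data: ' + eventText + '\n\n');
    }
}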
Please bear with me, as the question is long and detailed, and I am fairly new to RxJS.
I am attempting to create Amazon S3 browser in Angular which looks like Windows Explorer.
Something like this...
The left list will contain all the root folders (and it will not be a tree view), and when clicked on any root folder, the subfolders and files inside it will be shown in the right-side details view.
I need a new S3 access token for each of the root folders in the left list. I have a backend service which issues these tokens. A token is valid for a certain time duration. So the cases in which the current token becomes invalid are:
If the user clicks on some other root folder in left list.
If the token expiry is reached.
This is what I have written to manage this token expiry condition:
private accessTokenSource: BehaviorSubject<AccessToken | null> = new BehaviorSubject(null);
accessToken$ = this.accessTokenSource.asObservable();
getAccessToken() {
    return this.http.get(`${this.accessTokenEndpoint}`).pipe(
        tap((accessToken) => {
            // set Access token in a subject
            this.accessTokenSource.next(accessToken);
        }),
        switchMapTo(timer(55 * 60 * 1000).pipe(
            tap(() => {
                // reset access token in subject since now token is invalid - Expiry case
                this.accessTokenSource.next(null);
            })
        ))
    );
}
// Whoever subscribes to this will fetch the token and start the expiration timer
Since I don't want to expose the access-token-fetching logic in the view layer, each of my left-list and details components calls a method getDetails(currentPrefix: string) in the s3Service. This method first checks that the token is valid for calling the S3 API, then calls the listObjects operation and returns the result. Here's what I have so far:
// Checks the validity of the token for the current prefix
checkAccessTokenValidity(currentPrefix: string) {
    let isTokenValid: boolean = true;
    // This uses the access token set in the subject
    // According to me, it will be reset by the timer's tap operation (when it expires)
    const sub = this.accessToken$.subscribe((token) => {
        // Check the token's expiry or usability for the current folder and update isTokenValid accordingly
        if (!token || !currentPrefix.includes(token.rootFolder)) {
            isTokenValid = false;
        }
    });
    sub.unsubscribe();
    return isTokenValid;
}

// Public method to call from the list and details components
getDetails(currentPrefix: string) {
    const isTokenValid = this.checkAccessTokenValidity(currentPrefix);
    if (!isTokenValid) {
        // This will fetch the token and start the TIMER
        this.getAccessToken().subscribe(() => {});
    }
    // I think that this will not work, since if getAccessToken takes time,
    // then accessToken$ will still be invalid!
    const objectList$ = this.accessToken$.pipe(
        map(token => {
            // S3 List Objects method here, with the current token
        })
    )
}
How do I solve the problem of checking token validity and then waiting for my service to return a new valid token before calling the S3 API? Any help would be really appreciated. This approach may be dead wrong as well, so please feel free to correct me.
I'd say that there is no need to create another subscription just to get the current value of a BehaviorSubject.
This means that these lines:
const sub = this.accessToken$.subscribe((token) => {
    if (!token || !currentPrefix.includes(token.rootFolder)) {
        isTokenValid = false;
    }
});
sub.unsubscribe();
could be replaced with
const isTokenValid = !!this.accessTokenSource.value;
As you already mentioned, getAccessToken takes some time, meaning you can't get its result synchronously.
A quick fix would be this:
const tokenValid$ = of(this.accessTokenSource.value);
const tokenInvalid$ = merge(
    // Not interested in the values emitted as side effects are produced in `tap()`
    // With this, we're just subscribing. This way, an HTTP call will be made
    this.getAccessToken().pipe(ignoreElements()),
    // `accessTokenSource` is a `BehaviorSubject` and we don't want its current value,
    // that's why we're skipping it. Next time it emits, it will have the value returned from `getAccessToken`
    this.accessTokenSource.pipe(skip(1))
);
const objectList$ = iif(() => isTokenValid, tokenValid$, tokenInvalid$);
iif() is used to decide, at subscription time, which observable to subscribe to:
iif(() => booleanValue, subscribeToThisIfTrue, subscribeToThisIfFalse)
This way, when you subscribe to objectList$, it will pick up the proper observable, depending on whether the token is valid or not.
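For completeness, here is a rough sketch (not the poster's final code) of how the pieces above could slot back into getDetails(), with the map body left as the place where the question's listObjects call would go:
getDetails(currentPrefix: string) {
    const isTokenValid = this.checkAccessTokenValidity(currentPrefix);

    const tokenValid$ = of(this.accessTokenSource.value);
    const tokenInvalid$ = merge(
        this.getAccessToken().pipe(ignoreElements()),
        this.accessTokenSource.pipe(skip(1))
    );

    // Whichever source applies, map the (now valid) token to the S3 call.
    return iif(() => isTokenValid, tokenValid$, tokenInvalid$).pipe(
        map(token => {
            // call S3 listObjects here with `token` and `currentPrefix`
        })
    );
}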
I have an IndexedDB that is storing a large amount of dynamic data. (The static data is already cached by a Service Worker)
My problem is that, as this data is dynamic, I need the IndexedDB to be cleared and restored each time the application is opened. For this, I need the version number to be incremented so that the onupgradeneeded event is fired. I can't think of any logical way to do this, and even using the following call in the onupgradeneeded event I get an undefined answer.
e.target.result.oldversion
My IndexedDB code is as follows, with the parameters being:
Key - The name of the JSON object to store in the database.
Value - The JSON object itself.
function dbInit(key, value) {
    // Open (or create) the database
    var open = indexedDB.open("MyDatabase", 1);
    console.log(value);

    // Create the schema
    open.onupgradeneeded = function(e) {
        console.log("Old Version: " + e.target.result.oldversion); // Undefined
        console.log("New Version: " + e.target.result.newversion); // Undefined
        var db = open.result;
        var store = db.createObjectStore("Inspections", {keyPath: "id", autoIncrement: true});
        var index = store.createIndex(key, key);
    };

    open.onsuccess = function() {
        // Start a new transaction
        var db = open.result;
        var tx = db.transaction("Inspections", "readwrite");
        var store = tx.objectStore("Inspections");
        var index = store.index(key);
        store.add(value);
        // Close the db when the transaction is done
        tx.oncomplete = function() {
            db.close();
        };
    }
}
As this method is called several times for several 'Key' objects, I will need to work out a way to increment this version number once per opening of the page, and then move the 'add' call outside of the onupgradeneeded method. For the moment, though, the priority is making sure it runs through once: incrementing the version number, firing onupgradeneeded, deleting the current data, and storing the new data.
Thanks in advance!
The oldVersion and newVersion properties (note the capitalization) are on the IDBVersionChangeEvent, not on the IDBDatabase. In other words, this:
console.log("Old Version: " + e.target.result.oldversion); //Undefined
console.log("New Version: " + e.target.result.newversion); //Undefined
should be:
console.log("Old Version: " + e.oldVersion);
console.log("New Version: " + e.newVersion);
Now that said... you're using the schema versioning in a somewhat atypical way. If you really want to start with a fresh database each time the page is opened, just delete before opening:
indexedDB.deleteDatabase("MyDatabase");
var open = indexedDB.open("MyDatabase", 1);
// Open/delete requests are always processed in the order they
// are made, so the open will wait for the delete to run.
Note that the queued operations (delete and open) would then be blocked if another tab was holding an open connection and didn't respond to a versionchange event sent out in response to the delete request. Maybe that's a good thing in your case - it would prevent two tabs from partying on the database simultaneously.
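If you do go the delete-and-reopen route, a small sketch of how an already-open connection in another tab can avoid blocking that delete, by closing itself when the versionchange event arrives:
// In any tab that holds an open connection: close it when another tab
// deletes or upgrades the database, so that request isn't blocked.
open.onsuccess = function() {
    var db = open.result;
    db.onversionchange = function() {
        db.close();
    };
};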
A more typical usage pattern would be to only change the version when the web app is upgraded and the database schema is different. If you did need to wipe the data across sessions you'd do that on open, rather than on upgrade, and use things like clear() on the object store. But now we're getting into the design of your app, which it sounds like you've got a good handle on.
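As a rough sketch of that more typical pattern (assuming you keep the schema at version 1 and only need the "Inspections" store wiped per session), the wipe happens in onsuccess rather than onupgradeneeded:
var open = indexedDB.open("MyDatabase", 1);

open.onupgradeneeded = function(e) {
    // Runs only when the schema actually changes.
    var db = e.target.result;
    db.createObjectStore("Inspections", {keyPath: "id", autoIncrement: true});
};

open.onsuccess = function() {
    var db = open.result;
    var tx = db.transaction("Inspections", "readwrite");
    tx.objectStore("Inspections").clear(); // start each session with fresh data
    tx.oncomplete = function() {
        // add the new session's records here, then close the db when done
    };
};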
I am trying to send a push notification using cloud code to targeted channels. The object is called Prayers. When someone saves a Prayers object, it is supposed to send a push notification to certain channels, provided the new Prayers data was not made anonymously. Prayers has a boolean key called 'Anonymous'. So I have cloud code set up like this, so that if the boolean value is false it sends the push, but if it is true it won't. The issue now is that it sometimes sends the push 2 times for a non-anonymous post.
Parse.Cloud.afterSave("Prayers", function(request) {
var firstName = request.object.get('FirstName');
var lastName = request.object.get('LastName');
var userId = request.object.get('UserId');
var anonymous = request.object.get('Anonymous');
var anonymousString = anonymous.toString
var pushQuery = new Parse.Query(Parse.Installation);
pushQuery.equalTo('channels', userId);
if (anonymous == false) {
Parse.Push.send({
where: pushQuery, // Set our Installation query
data: {
alert: firstName + " " + lastName + " " + "just added a prayer request."
}
}, {
success: function() {
// Push was successful
},
error: function(error) {
throw "Got an error " + error.code + " : " + error.message;
}
});
}
});
At first glance there doesn't seem to be anything wrong with your function, but since you always try to send a push notification after a prayer has been saved, are you sure you're not saving the object twice? That could be the reason why the afterSave is invoked twice.
One of the things I once ran into was that I had 2 pieces of cloud code:
The first one would modify an object when I tried to save it.
The second one would send a push after saving the object.
In the code where I modified the object during the save process, I saved the modified object again, which resulted in my Parse.Cloud.afterSave being fired twice for the same object.
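One guard that is sometimes used here (a sketch, assuming the classic Parse Cloud Code API, where request.object.existed() reports whether the object already existed before this save) is to only push on the very first save, so a re-save triggered by another hook doesn't produce a second notification:
Parse.Cloud.afterSave("Prayers", function(request) {
    // Skip pushes for re-saves of an object that already existed.
    if (request.object.existed()) {
        return;
    }
    if (request.object.get('Anonymous') === false) {
        // ... build the pushQuery and call Parse.Push.send as above ...
    }
});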
Server Side
1. Update the stored reg_id (keyed by the device's UUID) when the device registers with the server.
2. Delete any reg_id that returns a canonical_id in the response after sending a push.
3. Periodically send a fake push using dry_run and do the same as in step 2.
Client Side
Send a message_id within the payload and save it in a SQLite DB, so the device knows whether it has already received that message.