Accessing IndexedDB from multiple JavaScript threads - javascript

Overview:
I am trying to avoid a race condition with accessing an IndexedDB from both a webpage and a web-worker.
Setup:
Webpage that is saving items to the local IndexedDB as the user works with the site. Whenever a user saves data to the local DB the record is marked as "Unsent".
Web-worker background thread that is pulling data from the IndexedDB, sending it to the server and, once the server receives it, marking the data in the IndexedDB as "Sent".
Problem:
Since access to IndexedDB is asynchronous, I cannot guarantee that the user won't update a record at the same time the web-worker is sending it to the server. The problematic timeline is shown below:
Web-worker gets data from DB and sends it to the server
While the transfer is happening, the user updates the data, saving it to the DB.
The web-worker gets the response from the server and then updates the DB to "Sent"
There is now data in the DB that hasn't been sent to the server but is marked as "Sent"
Failed Solution:
After getting the response from the server, I can recheck the row to see if anything has changed. However, I am still left with a small window in which data can be written to the DB and will never be sent to the server.
Example:
After the server says the data is saved, then:
IndexedDB.HasDataChanged(function (changed) {
    // Since this is async, this changed boolean could be lying.
    // The data might have been updated after I checked and before I was called.
    if (!changed) {
        IndexedDB.UpdateToSent();
    }
});
Other notes:
There is a sync API in the W3C spec, but no one has implemented it yet, so it cannot be used (http://www.w3.org/TR/IndexedDB/#sync-database). The sync API was designed to be used by web-workers, to avoid this exact situation I would assume.
Any thoughts on this would be greatly appreciated. Have been working on it for about a week and haven't been able to come up with anything that will work.

I think I found a work around for this for now. Not really as clean as I would like, but it seems to be thread safe.
I start by storing the current datetime in a LastEdit field whenever I update the data.
From the web-worker, I am posting a message to the browser.
self.postMessage('UpdateDataSent#' + data.ID + '#' + data.LastEdit);
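On the browser side, that message first has to be unpacked back into an ID and timestamp; a minimal sketch (the worker variable and the updateDataSent helper are my assumptions, not the original code):
worker.onmessage = function (e) {
    // Message format: 'UpdateDataSent#<ID>#<LastEdit>'
    var parts = e.data.split('#');
    if (parts[0] === 'UpdateDataSent') {
        updateDataSent(parts[1], parts[2]);   // id, lastEdit
    }
};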
Then in the browser I am updating my sent flag, as long as the last edit date hasn't changed.
// Get the data from the DB in a transaction, then:
if (data.LastEdit == lastEdit)
{
    data.Sent = true;
    var saveStore = trans.objectStore("Data");
    var saveRequest = saveStore.put(data);
    console.log('Data updated to Sent');
}
Since this is all done in a transaction on the browser side, it seems to work fine. Once browsers support the Sync API I can throw it all away anyway.

Can you use a transaction?
https://developer.mozilla.org/en/IndexedDB/IDBTransaction

Old thread, but the use of a transaction would fix the Failed Solution approach. I.e. the transaction only needs to span the check that the data in the IndexedDB hasn't changed after the send, and the marking of it as sent if there was no change. If there was a change, the transaction ends without writing.
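Concretely, the check-and-mark can live in a single readwrite transaction; a sketch using the store and field names from the question (db, id and lastEdit are assumed to be in scope):
var trans = db.transaction(["Data"], "readwrite");
var store = trans.objectStore("Data");
store.get(id).onsuccess = function (e) {
    var record = e.target.result;
    if (record.LastEdit == lastEdit) {
        // Unchanged since the send: mark it within the same transaction,
        // so no other write on this store can interleave.
        record.Sent = true;
        store.put(record);
    }
    // If it has changed, do nothing and let the transaction complete without writing.
};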

Related

Best way to have a Node.js server keep updated with a Firebase database in real time?

I currently have a Node.js server set up that is able to read and write data from a Firebase database when a request is made by a user.
I would like to implement time-based events that result in an action being performed at a certain date or time. The key thing here, though, is that I want the freedom to schedule down to the second (for example, write a message to the console after 30 seconds have passed, or on Friday the 13th at 11:30am).
A way to do this would be to store the date/time an action needs to be performed in the database, read from the database every second, and compare the current date/time with the stored events so we know if an action needs to be performed at this moment. As you can imagine, though, this would be a lot of unnecessary calls to the database and really feels like a poor way to implement this system.
Is there a way I can stay synced with the database without having to call every second? Perhaps I could then store a version of the events table locally and update this when a change is made to the database? Would that be a better idea? Is there another solution I am missing?
Any advice would be greatly appreciated, thanks!
EDIT:
How I currently initialise the database:
firebase.initializeApp(firebaseConfig);
var database = firebase.database();
How I then get data from the database:
await database.ref('/').once('value', function(snapshot){
    snapshot.forEach(function(childSnapshot){
        if (childSnapshot.key === userName){
            userPreferences = childSnapshot.val().UserPreferences;
        }
    });
});
The Firebase once() API reads the data from the database once, and then stops observing it.
If you instead use the on() API, it will continue observing the database after getting the initial value - and call your code whenever the database changes.
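A minimal sketch of the same read switched to on(), reusing the question's variables:
database.ref('/').on('value', function(snapshot){
    snapshot.forEach(function(childSnapshot){
        if (childSnapshot.key === userName){
            // Fires once with the current value, then again whenever the data changes.
            userPreferences = childSnapshot.val().UserPreferences;
        }
    });
});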
It sounds like you're looking to develop an application for scheduling. If that's the case you should check out node-schedule.
Node Schedule is a flexible cron-like and not-cron-like job scheduler for Node.js. It allows you to schedule jobs (arbitrary functions) for execution at specific dates, with optional recurrence rules. It only uses a single timer at any given time (rather than reevaluating upcoming jobs every second/minute).
You can then use the database to keep the "state" of the application, so on start-up you read all the upcoming jobs, load them into node-schedule, and let node-schedule do the rest.
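A rough sketch of that start-up load, assuming each job record stores a runAt timestamp (the /jobs path and field names are my assumptions):
const schedule = require('node-schedule');

// On start-up (inside an async function), load every stored job and hand it to node-schedule:
const snapshot = await database.ref('/jobs').once('value');
snapshot.forEach(function(childSnapshot){
    const job = childSnapshot.val();
    // node-schedule keeps a single timer and fires the callback at the stored date.
    schedule.scheduleJob(new Date(job.runAt), function(){
        console.log('Running job ' + childSnapshot.key);
    });
});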
The Google Cloud solution for scheduling a single item of future work is Cloud Tasks. Firebase is part of Google Cloud, so this is the most natural product to use. You can use this to avoid polling the database by simply specifying exactly when some Cloud Function should run to do the work you want.
I've written a blog post that demonstrates how to set up a Cloud Task to call a Cloud Function to delete a document in Firestore with an exact TTL.
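For illustration, creating a task that invokes an HTTP function 30 seconds from now might look roughly like this (project, region, queue and URL are placeholders, not values from the post):
const {CloudTasksClient} = require('@google-cloud/tasks');
const client = new CloudTasksClient();

async function scheduleWork() {
    await client.createTask({
        parent: client.queuePath('my-project', 'us-central1', 'my-queue'),
        task: {
            httpRequest: { httpMethod: 'POST', url: 'https://us-central1-my-project.cloudfunctions.net/doWork' },
            // Fire 30 seconds from now; scheduleTime is a protobuf Timestamp.
            scheduleTime: { seconds: Math.floor(Date.now() / 1000) + 30 },
        },
    });
}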

Get real time updates from Microsoft Graph API (beta) Communication API

I am currently using the Microsoft Graph API (beta) to get presence status (e.g. Online, Away, etc.) in an SPFx web part (using React) via GraphClient:
this.props.context.msGraphClientFactory.getClient().then(async (client: MSGraphClient) => {
    let response = await client
        .api('/communications/getPresencesByUserId')
        .version('beta')
        .post(postData);
    console.log("Communication API Response: " + response);
    this.usersWithPresence = response.value;
});
This is working fine, but to get the updated status of a user I have to refresh the page, so another API call is made and the presence status is refreshed. I want it to work like it does in 'Skype'.
What I need is suggestions about a mechanism I can apply to get real-time updates of a user's presence status, so that as soon as the user updates their status it is reflected in my web part. I know I can use setInterval or setTimeout to request the presence status at specific intervals, but for learning purposes I don't want to hit the API again and again this way; rather, I want to get updates pushed from the server, as happens with web sockets. How can a web-socket sort of approach be applied with this API?
Your suggestions are welcome.
Today this API doesn't support any kind of subscription mechanism for status changes. There is a UserVoice entry you can upvote for it. That means the only way to get changes is to poll it periodically. As for Socket.IO, the only support today is for SharePoint lists; for any other resource you need to stand up your own infrastructure to relay the message.
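Until a subscription mechanism exists, a simple poll built on the question's own call might look like this (the 30-second interval is an arbitrary choice; mind your throttling limits):
setInterval(() => {
    this.props.context.msGraphClientFactory.getClient().then(async (client: MSGraphClient) => {
        const response = await client
            .api('/communications/getPresencesByUserId')
            .version('beta')
            .post(postData);
        this.usersWithPresence = response.value;   // re-render with the fresh presence data
    });
}, 30000);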

Getting data from an API on the client side without giving the key

Currently I am making a page that displays data gathered from an API. Most of the data is updated on the server side every 4 hours, but some of it is updated whenever a client requests the index route. As a result, there is a delay in the index file being sent, since the data needs to be updated first. I want to gather the updated data after the page has been requested and sent, so there is no delay. My first idea was to make the request on the client side and have it update the display once the data arrives, but as far as I know I can't do that without giving clients the API key. Should I approach the problem this way, or is there a better way to do it? I'm using Express for the back-end, Axios to make the GET requests, and EJS as the template engine.
Here is the code:
// This is called before the data is sent, in a for loop
data.gameData[i].player_count = await SteamModule.liveGetCurrentPlayers(data.gameData[i].appid);
res.render('index', {data: data});
// This is the function that is called
liveGetCurrentPlayers: async (id) => {
    const res = await axios.get(`${base}/ISteamUserStats/GetNumberOfCurrentPlayers/v1/?key=${key}&appid=${id}`, {timeout: 1000}).catch(err => {
        Logger.error("Error getting updated user data per request");
        return 'Error';
    });
    if (res.data) {
        return res.data.response.player_count;
    } else {
        return 'Error';
    }
}
Here's a bit of a drawing to explain what I've said in comments.
(The code you showed should constantly update - without other info I can't help with whatever the other issue was, but this is the overall idea.)
Where:
Client requests data from you (your server)
Your server sends html and css to show a 'frame' of the page (no data, just something for them to see and feel like something is happening...)
Your server requests data from the API server (all the various "20-ish" things you said you wanted to serve....)
As the data is updated (or you may have it already), you send the data to the client, updating their 'frame' page with current data.
You can keep your keys on the server side and add a restriction so that those APIs can only be accessed from your client-side URL. So you would access the API through your server, which maintains your session and handles the authorized key part as well.
Anything and everything on the client side is accessible if it is running in the browser.
You can add security measures on the server, but not on the client side, for protecting your key.
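As a sketch of that server-side approach (the route name is hypothetical; liveGetCurrentPlayers is the question's own function):
// Server: the key stays inside SteamModule; the client never sees it.
app.get('/api/players/:appid', async (req, res) => {
    const count = await SteamModule.liveGetCurrentPlayers(req.params.appid);
    res.json({ player_count: count });
});
The client can then call something like fetch('/api/players/570') after the page loads and update the display, with no key ever reaching the browser.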

Right way of passing consistent data from DB to user without repeatedly querying

The database stores some data about the user which almost never changes. Well, sometimes information might change, if the user wants to edit his name for example.
The data in question is each user's name, username and his company data.
The first two are shown in his navigation bar all the time using ejs (like "User_1 is logged in"), and his company profile data is used when he needs to create an invoice.
My current way is to fetch user data through middleware using router.use so the extracted information is always available through all routes/views, for example:
router.use(function(req, res, next) { // this block of code is called as middleware in every route
    req.getConnection(function(err, conn){
        var uid = req.user.id;
        if (err) {
            console.log(err);
            return next("Mysql error, check your query");
        }
        var query = conn.query('SELECT * FROM user_profile WHERE uid = ? ', uid, function(err, rows){
            if (err) {
                console.log(err);
                return next(err, uid, "Mysql error, check your query");
            }
            var userData = rows;
            return next();
        });
    });
});
I understand that this is not an optimal way of passing user profile data to every route/view, since it makes a new DB query every time the user navigates through the application.
What would be a better way of having this data available without repeating the same query in each route, yet having it re-fetched once the user changes a portion of this data, like his full name?
You've just stumbled into the world of "caching", welcome! Caching is a very popular choice for use cases like this, as well as many others. A cache is essentially somewhere to store data that you can get back much quicker than making a full DB query, or a file read, etc.
Before we go any further, it's worth considering your use case. If you're serving only a few users and have a low load on your service, caching might be over-engineering and in fact making a DB request might be the simplest idea. Adding caching can add a lot of complexity to your code as things move forward, not enough to scare you, but enough to cause hard to trace bugs. So consider for a moment your service load, if it's not very high (say an internal application for somewhere you work with only maybe a few requests every few minutes) then just reading from the DB is probably not going to slow down a request too much. In this case, reading from the DB is the simplest and probably best solution. However, if you're noticing that this DB request is slowing down your application for requests or making it harder to scale up, then caching might be for you.
A really popular approach for this would be to use something like Redis, which is a key-value database that holds everything in memory (RAM). Redis runs as a service like MySQL and has a very basic query language. It is blindingly fast and can scale to enormous loads. If you're using Express, there are a number of npm modules that help you access a Redis instance. Simply pass in your credentials and you can then make GET and SET requests (to get data or to set data).
In your example, you may wish to store a user's profile in JSON format against their user ID or username in Redis. Then, create a function called getUserProfile which takes the ID or username. It can look the record up in Redis first; if it finds it, it returns it to your main controller logic. If it does not, it looks it up in your MySQL database, saves it in Redis, and then returns it to the controller logic (so it can come from the cache next time).
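A minimal cache-aside sketch of that function, using the callback-style redis npm client and the question's own query (the key naming is illustrative):
const redis = require('redis');
const cache = redis.createClient();

function getUserProfile(uid, conn, callback) {
    cache.get('profile:' + uid, function(err, cached){
        if (!err && cached) {
            return callback(null, JSON.parse(cached));   // cache hit: no DB query needed
        }
        // Cache miss: fall back to MySQL, then populate the cache for next time.
        conn.query('SELECT * FROM user_profile WHERE uid = ?', uid, function(err, rows){
            if (err) return callback(err);
            cache.set('profile:' + uid, JSON.stringify(rows));
            callback(null, rows);
        });
    });
}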
Your next problem is known for being a very pesky problem in computer science: "cache invalidation". In this case, if the user profile updates you want to "invalidate" your cache. A way of doing this would be to update your cached version when the user updates their profile (or any other cached data). Alternatively, you could just remove the cached version from Redis; the next time it's requested through getUserProfile it will be fetched fresh from the DB and then put into Redis for next time.
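For instance, in the route that handles the profile update, a single extra call is enough (the key naming follows the sketch above):
cache.del('profile:' + uid);   // next getUserProfile call re-reads MySQL and re-caches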
There are many other ways to approach this, but this will most likely solve your problem in the simplest way without too much overhead. It will also be easy to expand in the future!

BreezeJs: SaveChanges() server response getting dropped

I have BreezeJS running in an Angular app on a mobile device (Cordova), which talks to a .NET Web API.
Everything works great, except that once in a while the device will get primary key violations (from my SQL Server).
I think I narrowed it down to only happening when the data connection is shaky on the device.
The only way I can figure these primary key violations are happening is that the server is saving the changes, but the mobile connection drops out before the response that everything saved OK can come back from the server.
What is supposed to happen when BreezeJS doesn't hear back from server after calling SaveChanges?
Anyone familiar with BreezeJS know of a way to handle this scenario?
I've had to handle the same scenario in my project. The approach I took was two-part:
Add automatic retries to failed ajax requests. I'm using Breeze with jQuery, so I googled "jQuery retry ajax". There are many different implementations; mine is somewhat custom, but all center around hijacking the onerror callback, as well as the deferred's fail handler, to inject retry logic (a rough sketch appears after the code below). I'm sure Angular has similar means of retrying dropped requests.
In the saveChanges fail handler, add logic like this:
...
function isConcurrencyException(reason: any) {
    return reason && reason.message && /Store update, insert, or delete statement affected an unexpected number of rows/.test(reason.message);
}

function isConnectionFailure(reason: any): boolean {
    return reason && reason.hasOwnProperty('status') && reason.status === 0;
}

entityManager.saveChanges()
    .then(... yay ...)
    .fail(function(reason) {
        if (isConnectionFailure(reason)) {
            // Retry attempts failed to reach the server.
            // Notify the user and save to local storage....
            return;
        }
        if (isConcurrencyException(reason)) {
            // EF is not letting me save the entities again because my previous save (or another
            // user's save) moved the concurrency stamps on the record. There's also the
            // possibility that a record I'm trying to save was deleted by another user.
            // Recover... in my case I kept it simple and simply attempt to reload the entity.
            // If nothing is returned I know the entity was deleted. Otherwise I now have the
            // latest version. In either case a message is shown to the user.
            return;
        }
        if (reason.entityErrors) {
            // We have an "entityErrors" property... this means the save failed due to
            // server-side validation errors.
            // Do whatever you do to handle validation errors...
            return;
        }
        // An unexpected exception. Let it bubble up.
        throw reason;
    })
    .done(); // terminate the promise chain (may not be an equivalent in Angular, not sure).
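For part 1, a rough sketch of a retry wrapper (illustrative only - the retry count, and treating status 0 as "never reached the server", are my assumptions, not my exact code):
function ajaxWithRetry(settings, retries) {
    return $.ajax(settings).then(null, function (jqXHR) {
        // status 0 usually means the request never reached the server (dropped connection)
        if (retries > 0 && jqXHR.status === 0) {
            return ajaxWithRetry(settings, retries - 1);   // try again
        }
        return $.Deferred().reject(jqXHR);   // out of retries: propagate the failure
    });
}
Breeze also lets you plug in a custom ajax adapter, which is a natural place to slot in a wrapper like this rather than at each call site.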
One of the ways you can test spotty connections is to use Fiddler's AutoResponder tab. Set up a *.drop rule with a regex that matches your breeze route and check the "Enable Automatic Responses" box when you want to simulate dropped requests.
This is a somewhat messy problem to solve - there's no one-size-fits-all answer. Hope this helps.
NOTE
Ward makes a good point in the comments below. This approach is not suitable in situations where the entity's primary key is generated on the server (which would be the case if your db uses identity columns for PKs) because the retry logic could cause duplicate inserts.
