indexedDB correct usage - javascript

I'm experimenting with indexedDB. Everything is asynchronous, and that hurts my brain a lot.
I created an object like this:
var application = {};
application.indexedDB = {};
application.indexedDB.db = null;
application.indexedDB.open = function() {
    var dbName = "application";
    var dbVersion = 1;
    var openRequest = indexedDB.open(dbName, dbVersion);
    openRequest.onupgradeneeded = function(e) {
        console.log("Upgrading your DB (" + dbName + ", v" + dbVersion + ")...");
        var thisDB = e.target.result;
        if (!thisDB.objectStoreNames.contains("journal")) {
            thisDB.createObjectStore("journal", {keyPath: "id"});
        }
    };
    openRequest.onsuccess = function(e) {
        console.log("Opened DB (" + dbName + ", v" + dbVersion + ")");
        application.indexedDB.db = e.target.result;
    };
    openRequest.onerror = function(e) {
        console.log("Error");
        console.dir(e);
    };
};
Now I am able to open the DB connection with application.indexedDB.open(). Then I added another function to my object:
application.indexedDB.addItemToTable = function(item, table) {
    var transaction = application.indexedDB.db.transaction([table], "readwrite");
    var store = transaction.objectStore(table);
    // Perform the add
    var request = store.add(item);
    request.onerror = function(e) {
        console.log("Error", e.target.error.name);
        // some type of error handler
    };
    request.onsuccess = function(e) {
        console.log("Woot! Did it");
    };
};
My instruction sequence then extended like this:
application.indexedDB.open()
application.indexedDB.addItemToTable(item, "journal")
But this doesn't work: because the open instruction is asynchronous, application.indexedDB.db is not yet available when I use it in the addItemToTable function.
How does a JavaScript developer solve this?
I was following this tutorial here: http://code.tutsplus.com/tutorials/working-with-indexeddb--net-34673 and now I have some problems with those examples.
For example, he creates the HTML output directly in the onsuccess handler (in the "Read More Data" section). In my eyes this is bad coding, because the view has nothing to do with the DB-reading part, isn't it? But then comes my question: how the heck can you return something from the onsuccess handler?
Adding callback functions is somewhat complicated, especially when I want to read some data and then, with that result set, fetch some more data. It's hard to describe exactly what I mean.
I made a little fiddle - maybe it clarifies things.. -- http://jsfiddle.net/kb8nuby6/
Thank you

You don't need to use someone else's extra library. You will need to learn about asynchronous JavaScript before using indexedDB; there is no way to avoid it. You can learn about asynchronous code without learning about indexedDB. For example, look at how XMLHttpRequest works. Learn about setTimeout and setInterval. Maybe learn about requestAnimationFrame. If you know Node.js, review process.nextTick. Learn about how functions are first-class values. Learn about the idea of using a callback function. Learn about continuation-passing style.
You will probably not get the answer you are looking for to this question. If anything, this is a duplicate of the several thousand other questions on Stack Overflow about asynchronous programming in JavaScript. It is not even that specific to indexedDB. Take a look at some of the numerous other questions about asynchronous JS.
Maybe this gets you started:
var a;
setTimeout(function() { a = 1; }, 10);
console.log('The value of a is %s', a);
Figure out why that did not work. If you do, you will be much closer to finding the answer to this question.
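The fix, as a minimal sketch, is to move the dependent code into the callback: anything that needs the value must run after the assignment, not after the scheduling call.

```javascript
var a;
setTimeout(function () {
    a = 1;
    // Only here is the value guaranteed to exist:
    console.log('The value of a is %s', a);
}, 10);
// This line runs first, while `a` is still undefined.
```

The same shape applies to the question's code: call addItemToTable from inside onsuccess (or from a callback passed to open), rather than on the line after open() returns.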

The pattern I commonly adopt is to wait to perform any database operations until the connection is ready. It is a similar concept to $.ready in jQuery.
You will find that as the app ages, you accumulate many schema versions and need to upgrade data as well, so a lot of logic lives in the database connection itself.
You can use a callback queue if you need to use the database before it is ready. Here is a snippet from the Google Analytics command queue:
// Creates an initial ga() function. The queued commands
// will be executed once analytics.js loads.
i[r] = i[r] || function() {
    (i[r].q = i[r].q || []).push(arguments)
},
Basically, you execute these callbacks once the database is connected.
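A minimal sketch of the same idea without the Analytics minification (the names `withDB`, `pending`, and `onConnected` are illustrative): operations requested before the connection is ready are pushed onto a queue, and the success handler flushes it.

```javascript
var dbReady = false;
var pending = [];

// Wrap every database operation in withDB instead of calling it directly.
function withDB(fn) {
    if (dbReady) {
        fn();
    } else {
        pending.push(fn); // queued until the connection opens
    }
}

// Call this from openRequest.onsuccess.
function onConnected() {
    dbReady = true;
    pending.forEach(function (fn) { fn(); });
    pending = [];
}

// Demo with plain logging standing in for real IndexedDB work:
var log = [];
withDB(function () { log.push('add item'); });  // queued: DB not ready yet
onConnected();                                  // flushes the queue in order
withDB(function () { log.push('read item'); }); // runs immediately now
```

After onConnected() runs, `log` holds the operations in the order they were requested.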
I highly recommend checking out my own library, ydn-db. It implements all these concepts.

Related

How do I check if an indexedDB instance is open?

Suppose I have an instance of an indexedDB object. Is there a simple way of detecting if the object is currently in the 'open' state?
I've tried database.closePending and looking at other properties but do not see a simple property that tells me the state of the database.
I am looking to do this synchronously.
Doing something like attempting open a transaction on a database and checking if an exception occurs is not a reasonable solution for me.
I don't want to maintain an extra variable associated with the database instance.
Perhaps I am missing some simple function in the api? Is there some observable feature of the instance variable that I can quickly and easily query to determine state?
Stated a different way, can you improve upon the following implementation?
function isOpen(db) {
    if (db && Object.prototype.toString.call(db) === '[object IDBDatabase]') {
        var names = db.objectStoreNames; // a property, not a method
        if (names && names.length) {
            try {
                var transaction = db.transaction(names[0]);
                transaction.abort();
                return true;
            } catch (error) {
            }
        }
    }
    return false;
}
Or this method?
var opened = false;
var db;
var request = indexedDB.open(...);
request.onsuccess = function() {
    db = request.result;
    opened = true;
};
function isOpen(db) {
    return opened;
}
// elsewhere, the flag must be maintained by hand:
db.close();
opened = false;
Or this method?
var db;
var request = indexedDB.open(...);
request.onsuccess = function() {
    db = request.result;
    db.onclose = function() {
        db._secret_did_close = true;
    };
};
function isOpen(db) {
    return db instanceof IDBDatabase && !db.hasOwnProperty('_secret_did_close');
}
There's nothing else in the API that tells you if a connection is closed. Your enumeration of possibilities is what is available.
Also note that there is no closePending property in the API. The specification text uses a close pending flag to represent internal state, but this is not exposed to script.
Doing something like attempting open a transaction on a database and checking if an exception occurs is not a reasonable solution for me.
Why? This is the most reliable approach. Maintaining extra state would not account for unexpected closure (e.g. the user deletes browsing data, forcing the connection to close); that is what the onclose handler accounts for, so you'd need to combine your 2nd and 3rd approaches. (The close event is not fired if close() is called by script.)
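A sketch of that combination (the `trackConnection` wrapper is illustrative, not part of the IndexedDB API): keep the flag yourself for script-initiated close() calls, and let onclose cover abnormal closure.

```javascript
// `db` is an open IDBDatabase (e.g. request.result inside onsuccess).
function trackConnection(db) {
    var state = { db: db, open: true };
    // onclose fires on abnormal closure (e.g. the user clears browsing data),
    // but never when script calls close():
    db.onclose = function () { state.open = false; };
    state.close = function () {
        db.close();        // close() does not fire onclose, so update the flag here
        state.open = false;
    };
    return state;
}
```

isOpen then reduces to reading `state.open`, at the cost of routing all close() calls through the wrapper.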
You can create a request using indexedDB.open; if the connection opens, the onsuccess handler runs:
var request = indexedDB.open('html5', 1);
request.onsuccess = function() {
    console.log('Database connected');
};
Example :
https://codepen.io/headmax/pen/BmaOMR?editors=1111
As for how to close the connection, or how to know whether it is still open: I guess you need to handle the events on every transaction; for example, to keep control you can handle transaction.onerror, transaction.onabort, and so on. If you need a worked example, I guess you should create a new post ;).
https://developer.mozilla.org/en-US/docs/Web/API/IDBTransaction

Should I open an IDBDatabase each time or keep one instance open?

I have a SPA application that will make multiple reads/writes to IndexedDB.
Opening the DB is an asynchronous operation with a callback:
var db;
var request = window.indexedDB.open("MyDB", 2);
request.onupgradeneeded = function(event) {
    // Upgrade to latest version...
};
request.onerror = function(event) {
    // Uh oh...
};
request.onsuccess = function(event) {
    // DB open, now do something
    db = event.target.result;
};
There are two ways I can use this db instance:
Keep a single db instance for the life of the page/SPA?
Call db.close() once the current operation is done and open a new one on the next operation?
Are there pitfalls of either pattern? Does keeping the indexedDB open have any risks/issues? Is there an overhead/delay (past the possible upgrade) to each open action?
I have found that opening a connection per operation does not substantially degrade performance. I have been running a local Chrome extension for over a year now that involves a ton of indexedDB operations and have analyzed its performance hundreds of times and have never witnessed opening a connection as a bottleneck. The bottlenecks come in doing things like not using an index properly or storing large blobs.
Basically, do not base your decision here on performance. It really isn't the issue in terms of connecting.
The issue is really the ergonomics of your code: how much you are fighting against the APIs, how intuitive the code feels when you look at it, how understandable you think it is, and how welcoming it is to fresh eyes (your own a month later, or someone else's). This is very notable when dealing with the blocking issue, which is indirectly dealing with application modality.
My personal opinion is that if you are comfortable with writing async Javascript, use whatever method you prefer. If you struggle with async code, choosing to always open the connection will tend to avoid any issues. I would never recommend using a single global page-lifetime variable to someone who is newer to async code. You are also leaving the variable there for the lifetime of the page. On the other hand, if you find async trivial, and find the global db variable more amenable, by all means use it.
Edit - based on your comment I thought I would share some pseudocode of my personal preference:
function connect(name, version) {
    return new Promise((resolve, reject) => {
        const request = indexedDB.open(name, version);
        request.onupgradeneeded = onupgradeneeded;
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
        request.onblocked = () => console.warn('pending till unblocked');
    });
}

async function foo(bar) {
    let conn;
    try {
        conn = await connect(DBNAME, DBVERSION);
        await storeBar(conn, bar);
    } finally {
        if (conn)
            conn.close();
    }
}

function storeBar(conn, bar) {
    return new Promise((resolve, reject) => {
        // put() requires a readwrite transaction
        const tx = conn.transaction('store', 'readwrite');
        const store = tx.objectStore('store');
        const request = store.put(bar);
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
    });
}
With async/await, there isn't too much friction in having the extra conn = await connect() line in your operational functions.
Opening a connection each time is likely to be slower just because the browser is doing more work (e.g. it may need to read data from disk). Otherwise, no real down sides.
Since you mention upgrades, either pattern requires a different approach to the scenario where the user opens your page in another tab and it tries to open the database with a higher version (because it downloaded newer code from your server). Let's say the old tab was version 3 and the new tab is version 4.
In the one-connection-per-operation case you'll find that your open() on version 3 fails, because the other tab was able to upgrade to version 4. You can notice that the open failed with VersionError e.g. and inform the user that they need to refresh the page.
In the one-connection-per-page case your connection at version 3 will block the other tab. The v4 tab can respond to the "blocked" event on the request and let the user know that older tabs should be closed. Or the v3 tab can respond to the "versionchange" event on the connection and tell the user that it needs to be closed. Or both.
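A sketch of the v3 tab's side (the helper name and message are illustrative): listen for versionchange and close the connection so the v4 tab's upgrade can proceed.

```javascript
// Attach in onsuccess, once `db` (an IDBDatabase) is available.
function yieldToUpgrades(db, notifyUser) {
    db.onversionchange = function () {
        db.close(); // releasing the connection unblocks the newer tab's open()
        notifyUser('A newer version of this page is open; please reload this tab.');
    };
}
```

The v4 tab's side is already covered by the onblocked handler in the connect() sketch above.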

Changing EventSource in HTML

Basically, what I'm trying to do is to pass a parameter through the URL to the PHP code. However, it seems that inside the onmessage event handler, I can't change the source. Here's the code below:
var source = new EventSource("get_message.php?latest_chat_ID=0");
var i = 0;
$(source).on("message", function (event) {
    var data = event.originalEvent.data;
    ++i;
    source = new EventSource("get_message.php?latest_chat_ID=" + i);
    // $.post("get_message.php", {latest_chat_ID: 0}, function (data, status) {});
    $("#messages").html($("#messages").html() + data);
});
I was wondering -
How do I rectify this problem?
Are there other ways to send data to a PHP page? (I contemplated using the jQuery $.post() function, but won't that execute the script twice: once from firing the EventSource event and once from the $.post() request?)
I also understand that alternative technologies exist, such as WebSockets and libraries such as node.js, that are better suited for bidirectional communication. However, most of my base code is written with an SSE implementation in mind, and I'd like to stick to that.
If you want to continue using SSE, I think you'll need to rewrite what you have similar to what is below. The reason what you have right now doesn't work is because you are only listening to the first EventSource, and just changing the variable. The variable is not reused by jQuery when you change it. Plus, I probably would skip using jQuery for that since it's going to try and cache things you don't want cached.
var listenToServer = function(i) {
        if (source) {
            source.close(); // drop the previous connection before opening another
        }
        source = new EventSource("get_message.php?latest_chat_ID=" + i);
        source.onmessage = function(event) {
            var data = event.data; // plain EventSource, so no jQuery originalEvent wrapper
            $messages.html($messages.html() + data);
            listenToServer(i + 1);
        };
    },
    $messages = $('#messages'),
    source;
listenToServer(0);
I also went ahead and cached $('#messages') so you're not creating new objects over and over. Left source outside of the function so that you don't have to worry as much about garbage collection with the various EventSources.

How to structure my code to return a callback?

So I've been stuck on this for quite a while. I asked a similar question here: How exactly does done() work and how can I loop executions inside done()?
but I guess my problem has changed a bit.
So the thing is, I'm loading a lot of streams and it's taking a while to process them all. To make up for that, I want to at least show the streams that have already been processed on my webpage while continuing to process the stream of tweets.
loadTweets: function(username) {
    $.ajax({
        url: '/api/1.0/tweetsForUsername.php?username=' + username
    }).done(function (data) {
        var json = jQuery.parseJSON(data);
        var jsonTweets = json['tweets'];
        $.Mustache.load('/mustaches.php', function() {
            for (var i = 0; i < jsonTweets.length; i++) {
                var tweet = jsonTweets[i];
                var optional_id = '_user_tweets';
                $('#all-user-tweets').mustache('tweets_tweet', { tweet: tweet, optional_id: optional_id });
                configureTweetSentiment(tweet);
                configureTweetView(tweet);
            }
        });
    });
}
This is pretty much the structure of my code right now. I guess the problem is the for loop, because nothing will display until it is done. So I have two questions.
How can I get the stream of tweets to display on my website as they're processed?
How can I make sure the Mustache.load() is only executed once while doing this?
The problem is that UI updates and JS operations all run in the same thread, so nothing can be painted until your loop finishes. To solve this, queue each piece of work with a timer (setInterval below) so the JS operations are interleaved with pending UI updates. You can also pass a time interval (around 4 ms, the browsers' minimum) so that browsers with a slower JS engine can still perform smoothly.
...
var i = 0;
var timer = setInterval(function() {
    var tweet = jsonTweets[i++];
    var optional_id = '_user_tweets';
    $('#all-user-tweets').mustache('tweets_tweet', {
        tweet: tweet,
        optional_id: optional_id
    });
    configureTweetSentiment(tweet);
    configureTweetView(tweet);
    if (i === jsonTweets.length) {
        clearInterval(timer);
    }
}, 4); // Interval between loading tweets
...
NOTE
The solution is based on the following assumptions:
You are manipulating the DOM in the configureTweetSentiment and configureTweetView methods.
Ideally, the solution above is not the best one: instead you should build all the HTML in JavaScript first and append the final HTML string to a div at the end. You would see a drastic change in performance (seriously!).
You don't want to use web workers because they are not supported in old browsers. If that's not the case, and you are not manipulating the DOM in the configure methods, web workers are the way to go for data-intensive operations.
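A sketch of the "build the string first, append once" suggestion (the markup and the `text` field are illustrative, not the question's actual Mustache template):

```javascript
// Build the whole HTML string in memory, without touching the DOM.
function buildTweetsHtml(tweets) {
    var parts = [];
    for (var i = 0; i < tweets.length; i++) {
        parts.push('<div class="tweet">' + tweets[i].text + '</div>');
    }
    return parts.join('');
}

// Then one DOM update instead of one per tweet, e.g.:
// $('#all-user-tweets').append(buildTweetsHtml(jsonTweets));
```

Each individual DOM insertion can trigger style and layout work, so collapsing them into a single append is where the performance win comes from.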

Reading from and writing to multiple ip's with node.js

I've been through a tutorial with Node.js, but it seems everything was aimed at what to do with it from the server side. I was playing around with it to see how it would work writing to multiple devices on our network with non-blocking i/o. Here's what I have so far (rough code, sorry):
var net = require('net');
var nodes = [<list of IP addresses>];

function connectToServer(ip) {
    conn = net.createConnection(3083, ip);
    console.log("Creating connection to: " + ip + "\n");
    conn.on('connect', function() {
        conn.write("act-user::myuser:ex1::mypass;");
    });
    conn.on('data', function(data) {
        var read = data.toString();
        if (read.match('ex1')) {
            console.log(ip + ": " + read);
        }
    });
    conn.on('end', function() {
    });
}

nodes.forEach(function(node) {
    connectToServer(node);
});
I see it go through the list and kick off a connection to each IP address, but then it seems to write the login line to the same host over and over again. I think somewhere I'm getting my streams crossed (just like Egon said never to do), but I'm not sure how to fix it. The non-blocking Node.js world is still kind of difficult for me to think through.
Your issue is a JavaScript quirk: if a variable is assigned without var, it becomes a global. That is happening to the conn variable in your connectToServer() function; conn is overwritten on each call instead of being scoped to connectToServer.
Try adding var in front of the following.
conn = net.createConnection(3083, ip);
Result
var conn = net.createConnection(3083, ip);
