How do I update data in indexedDB? - javascript

I have tried to get some information from the W3C spec regarding how to update an objectStore item in an indexedDB database, but without much success.
I found here a way to do it, but it doesn't really work for me.
My implementation is something like this:
DBM.activitati.edit = function(id, obj, callback){
var transaction = DBM.db.transaction(["activitati"], IDBTransaction.READ_WRITE);
var objectStore = transaction.objectStore("activitati");
var keyRange = IDBKeyRange.only(id);
objCursor = objectStore.openCursor(keyRange);
objCursor.onsuccess = function(e){
var cursor = e.target.result;
console.log(obj);
var request = cursor.update(obj);
request.onsuccess = function(){
callback();
}
request.onerror = function(e){
conosole.log("DBM.activitati.edit -> error " + e);
}
}
objCursor.onerror = function(e){
conosole.log("DBM.activitati.edit -> error " + e);
}
}
I have all DBM.activitati.(add | remove | getAll | getById | getByIndex) methods working, but I can not resolve this.
If you know how I can manage it, please, do tell!
Thank you!

Check out this jsfiddle for some examples on how to update IDB records. I worked on that with another StackOverflower -- it's a pretty decent standalone example of IndexedDB that uses indexes and does updates.
The method you seem to be looking for is put, which will either insert or update a record if there are unique indexes. In that example fiddle, it's used like this:
phodaDB.indexedDB.addUser = function(userObject){
//console.log('adding entry: '+entryTxt);
var db = phodaDB.indexedDB.db;
var trans = db.transaction(["userData"],IDBTransaction.READ_WRITE);
var store = trans.objectStore("userData");
var request = store.put(userObject);
request.onsuccess = function(e){
phodaDB.indexedDB.getAllEntries();
};
request.onerror = function(e){
console.log('Error adding: '+e);
};
};
For what it's worth, you've also got a possible syntax error: "console" is misspelled as "conosole" in your console.log calls.

A bit late for an answer, but perhaps it helps others. I stumbled over what I guess is the same problem, and the solution is very simple:
If you want to INSERT or UPDATE records you use objectStore.put(object) (help)
If you only want to INSERT records you use objectStore.add(object) (help)
So if you use add(object) and a record with the same key already exists in the database, it will not be overwritten and the request fires a "ConstraintError: Key already exists in the object store" error.
If you use put(object), the existing record will be overwritten.
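To make the difference concrete, here is a minimal sketch (assuming an already open IDBDatabase db and an object store "notes" created with { keyPath: "id" }; these names are placeholders, not from the question):
// Minimal sketch: add() vs put() on the same key.
var tx = db.transaction(["notes"], "readwrite");
var store = tx.objectStore("notes");

// add() only inserts: if a record with id 1 already exists,
// this request fails with a ConstraintError.
var addRequest = store.add({ id: 1, text: "first version" });
addRequest.onerror = function(e) {
  console.log("add failed: " + e.target.error.name); // "ConstraintError"
  e.preventDefault(); // keep the transaction alive so the put below still runs
};

// put() inserts or overwrites: the record with id 1 is replaced.
var putRequest = store.put({ id: 1, text: "second version" });
putRequest.onsuccess = function() {
  console.log("record inserted or updated");
};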

Here is an example of updating a user object's info via a cursor (the new values come from a users object):
var transaction = db.transaction(["tab_user"], "readwrite");
var store = transaction.objectStore("tab_user");
var req = store.openCursor();
req.onerror = function(event) {
console.log("case if have an error");
};
req.onsuccess = function(event) {
var cursor = event.target.result;
if(cursor){
if(cursor.value.idUser == users.idUser){ // we found, by id, the user we want to update
var user = {};
user.idUser = users.idUser ;
user.nom = users.nom ;
var res = cursor.update(user);
res.onsuccess = function(e){
console.log("update success!!");
}
res.onerror = function(e){
console.log("update failed!!");
}
}
cursor.continue();
}
else{
console.log("fin mise a jour");
}
}

I'm a couple of years late, but thought it'd be nice to add my two cents in.
First, check out BakedGoods if you don't want to deal with the complex IndexedDB API.
It's a library which establishes a uniform interface that can be used to conduct storage operations in all native, and some non-native client storage facilities. It also maintains the flexibility and options afforded to the user by each. Oh, and it's maintained by yours truly :) .
With it, placing one or more data items in an object store can be as simple as:
bakedGoods.set({
data: [{key: "key1", value: "value1"}, {key: "key2", value: "value2"}],
storageTypes: ["indexedDB"],
complete: function(byStorageTypeResultDataObj, byStorageTypeErrorObj){}
});
Now to answer the actual question...
Let's begin by aggregating the valuable information spread across the existing answers:
IDBObjectStore.put() adds a new record to the store, or updates an existing one
IDBObjectStore.add() adds a new record to the store
IDBCursor.update() updates the record at the current position of the cursor
As one can see, OP is using an appropriate method to update a record. There are, however, several things in his/her code, unrelated to the method, that are incorrect (with respect to the API today at least). I've identified and corrected them below:
var cursorRequest = objectStore.openCursor(keyRange); //Correctly define result as request
cursorRequest.onsuccess = function(e){ //Correctly set onsuccess for request
var objCursor = cursorRequest.result; //Get cursor from request
var obj = objCursor.value; //Get value from existing cursor ref
console.log(obj);
var request = objCursor.update(obj);
request.onsuccess = function(){
callback();
}
request.onerror = function(e){
console.log("DBM.activitati.edit -> error " + e); //Use "console" to log :)
}
}
cursorRequest.onerror = function(e){ //Correctly set onerror for request
console.log("DBM.activitati.edit -> error " + e); //Use "console" to log :)
}

Related

Doesn't add records into PouchDB when using the same function again

I'm trying to create a database with "users" and their data in it. Strangely, it doesn't put() new variables in it the third time I try. To do all this I create a local database dblocal and replicate this DB to the remote db called dbremote. At first I create a document with one variable.
function newuser() {
if (window.document.consent_form.consent_to_share.value) {
var id = "p" + Date.now() + "-" + Math.floor(Math.random() * 10000);
var dblocal = new PouchDB(id);
var consenttoshare = window.document.consent_form.consent_to_share.value;
document.cookie = id;
var dbremote = 'http://localhost:5984/experiment';
dblocal.put({
_id: id,
consent: consenttoshare
});
dblocal.replicate.to(dbremote, {live: true});
}
}
This all worked well. In another js file I'm trying to add a variable to the same document by executing the following function putdb(). I'm doing it in the following way (which, according to their documentation, is the right way):
function putdb () {
if (document.cookie){
var id = document.cookie;
var loggedin = "True";
var dblocal = new PouchDB(id);
dblocal.get(id).then(function (doc) {
doc.loggedin = loggedin;
return dblocal.put(doc);
}).then(function () {
return dblocal.get(id);
}).then(function (doc) {
console.log(doc);
var dbremote = 'http://localhost:5984/experiment';
dblocal.replicate.to(dbremote, {live: true});
});
}
}
This successfully added the variable loggedin to the document as I wanted. However, upon trying to add information to this document for the third time (again in another js file), nothing happens. I used exactly the same approach as before, only with different variables.
function putdb (checked) {
if (document.cookie) {
var id = document.cookie;
var checkedlist = [];
for (i = 0; i < checked; i++) {
checkedlist.push($("input[type=checkbox]:checked")[i].value)
}
var playlistname = document.getElementById("playlistname").value;
var dblocal = new PouchDB(id);
dblocal.get(id).then(function (doc) {
doc.checkedlist = checkedlist;
doc.playlistname = playlistname;
return dblocal.put(doc);
}).then(function () {
return dblocal.get(id);
}).then(function (doc) {
console.log(doc);
var dbremote = 'http://localhost:5984/experiment';
dblocal.replicate.to(dbremote, {live: true});
});
}
}
I checked all variables, they are correct.
I tried plain text variables.
The script does run.
I tried to add information to the document the way I did the first time.
None of this adds another variable to the document as I wanted in the last function. I think it has to do with the way PouchDB works, which I don't fully understand. Help is much appreciated!
There are a number of issues in your code that result in bad usage of PouchDB and may lead to problems.
First of all, it does not make a lot of sense to give your document the same id as the name of your database. Assuming you want a one database per user approach, there are two approaches you can follow.
Multiple document approach
You can instead make multiple documents within the same database with different id's. For instance, your 'consent' information may be stored like this:
var id = "p" + Date.now() + "-" + Math.floor(Math.random() * 10000);
let dblocal = new PouchDB(id);
document.cookie = id;
let dbremote = 'http://localhost:5984/experiment';
dblocal.put({
_id: "consent",
consent: window.document.consent_form.consent_to_share.value
});
dblocal.replicate.to(dbremote, {live: true});
While your playlist information is stored like this:
dblocal.put({
_id: "playlist",
name: playlistname,
itemsChecked: checkedlist
});
Single-document approach
The second option is to store a single document containing all the information you want to store that is associated to a user. In this approach you will want to fetch the existing document and update it when there is new information. Assuming you named your document global-state (i.e. replace "consent" in the first code snippet with "global-state"), the following code will update a document:
dblocal.get("global-state").then((doc)=>{
doc.loggedIn = true; // or change any other information you want
return dblocal.put(doc);
}).then((response)=>{
//handle response
}).catch((err)=>{
console.log(err);
});
Furthermore, you should only call the
dblocal.replicate.to(dbremote, {live: true});
function once because the 'live' option specifies that future changes will automatically be replicated to the remote database.
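For example (a sketch only, reusing the dblocal and dbremote names from the snippets above), you would set up live replication a single time and keep a handle to it:
// Sketch: start live replication once and keep the handle around.
var replication = dblocal.replicate.to(dbremote, {
  live: true,  // keep pushing future changes automatically
  retry: true  // reconnect after network errors
});

replication.on('change', function(info) {
  console.log('replicated a batch of changes', info);
}).on('error', function(err) {
  console.log('replication error', err);
});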

Create index on already existing objectStore

As an example, on a basic setup one index is created:
db.onupgradeneeded = function(event) {
var db = event.target.result;
var store = db.createObjectStore('name', { keyPath: 'id' });
store.createIndex('by name', 'name', { unique: false });
};
Question:
Is it possible to create/append more indexes to the same objectStore in a future version upgrade? Because if I try:
db.onupgradeneeded = function(event) {
var db = event.target.result;
var store = db.createObjectStore('name', { keyPath: 'id' });
store.createIndex('by newName', 'newName', { unique: false });
};
It throws an error that the objectStore already exists. And if I try to create a store reference using a transaction:
db.onupgradeneeded = function(event) {
var db = event.target.result;
var store = db.transaction('name', 'readwrite').objectStore('name');
store.createIndex('by newName', 'newName', { unique: false });
};
It throws an error that a version change transaction is currently running.
Yes it is possible. It can be a bit confusing at first. You want to get the existing object store via the implicit transaction created for you within onupgradeneeded. This is a transaction of type versionchange which is basically like a readwrite transaction but specific to the onupgradeneeded handler function.
Something like this:
var request = indexedDB.open(name, oldVersionPlusOne);
request.onupgradeneeded = myOnUpgradeNeeded;
function myOnUpgradeNeeded(event) {
// Get a reference to the request related to this event
// #type IDBOpenRequest (a specialized type of IDBRequest)
var request = event.target;
// Get a reference to the IDBDatabase object for this request
// #type IDBDatabase
var db = request.result;
// Get a reference to the implicit transaction for this request
// #type IDBTransaction
var txn = request.transaction;
// Now, get a reference to the existing object store
// #type IDBObjectStore
var store = txn.objectStore('myStore');
// Now, optionally inspect index names, or create a new index
console.log('existing index names in store', store.indexNames);
// Add a new index to the existing object store
store.createIndex(...);
}
You also will want to take care to increment the version so as to guarantee the onupgradeneeded handler function is called, and to represent that your schema (basically the set of tables and indices and properties of things) has changed in the new version.
You will also need to rewrite the function so that you only create or make changes based on the version. You can use event.oldVersion to help with this, or things like db.objectStoreNames.contains.
Something like this:
function myOnUpgradeNeeded(event) {
var is_new_db = isNaN(event.oldVersion) || event.oldVersion === 0;
if(is_new_db) {
var db = event.target.result;
var store = db.createObjectStore(...);
store.createIndex('my-initial-index');
// Now that you decided you want a second index, you also need
// to do this for brand new databases
store.createIndex('my-second-new-index');
}
// But if the database already exists, we are not creating things,
// instead we are modifying the existing things to get into the
// new state of things we want
var is_old_db_not_yet_current_version = !isNaN(event.oldVersion) && event.oldVersion < 2;
if(is_old_db_not_yet_current_version) {
var txn = event.target.transaction;
var store = txn.objectStore('store');
store.createIndex('my-second-new-index');
}
}
Pay close attention to the fact that I used event.target.transaction instead of db.transaction(...). These are not at all the same thing. One references an existing transaction, and one creates a new one.
Finally, and in addition, a personal rule of mine and not a formal coding requirement, you should never be using db.transaction() from within onupgradeneeded. Stick to modifying the schema when doing upgrades, and do all data changes outside of it.
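As a rough sketch of that separation (store and index names here are placeholders, not from the question):
// Sketch only: schema work in onupgradeneeded, data work in onsuccess.
var openRequest = indexedDB.open('myDatabase', 2);

openRequest.onupgradeneeded = function(event) {
  // versionchange transaction: only create or alter stores and indexes here
  var db = event.target.result;
  var txn = event.target.transaction;
  if (!db.objectStoreNames.contains('myStore')) {
    db.createObjectStore('myStore', { keyPath: 'id' });
  }
  var store = txn.objectStore('myStore');
  if (!store.indexNames.contains('by-name')) {
    store.createIndex('by-name', 'name', { unique: false });
  }
};

openRequest.onsuccess = function(event) {
  // ordinary readwrite transaction: do data changes here, after the upgrade
  var db = event.target.result;
  var store = db.transaction('myStore', 'readwrite').objectStore('myStore');
  store.put({ id: 1, name: 'example' });
};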

Assemble paginated ajax data in a Bacon FRP stream

I'm learning FRP using Bacon.js, and would like to assemble data from a paginated API in a stream.
The module that uses the data has a consumption API like this:
// UI module, displays unicorns as they arrive
beautifulUnicorns.property.onValue(function(allUnicorns){
console.log("Got "+ allUnicorns.length +" Unicorns");
// ... some real display work
});
The module that assembles the data requests sequential pages from an API and pushes onto the stream every time it gets a new data set:
// beautifulUnicorns module
var curPage = 1
var stream = new Bacon.Bus()
var property = stream.toProperty()
property.onValue(function(){}) // You have to add an empty subscriber, otherwise future onValues will not receive the initial value. https://github.com/baconjs/bacon.js/wiki/FAQ#why-isnt-my-property-updated
var allUnicorns = [] // !!! stateful list of all unicorns ever received. Is this idiomatic for FRP?
var getNextPage = function(){
/* get data for subsequent pages.
Skipping for clarity */
}
var gotNextPage = function (resp) {
Array.prototype.push.apply(allUnicorns, resp) // just adds the responses to the existing array reference
stream.push(allUnicorns)
curPage++
if (curPage <= pageLimit) { getNextPage() }
}
How do I subscribe to the stream in a way that provides me a full list of all unicorns ever received? Is this flatMap or similar? I don't think I need a new stream out of it, but I don't know. I'm sorry, I'm new to the FRP way of thinking. To be clear, assembling the array works, it just feels like I'm not doing the idiomatic thing.
I'm not using jQuery or another ajax library for this, so that's why I'm not using Bacon.fromPromise
You also may wonder why my consuming module wants the whole set instead of just the incremental update. If it were just appending rows that could be ok, but in my case it's an infinite scroll and it should draw data if both: 1. data is available and 2. area is on screen.
This can be done with the .scan() method. You will also need a stream that emits the items of one page, which you can create with .repeat().
Here is draft code (sorry, not tested):
var itemsPerPage = Bacon.repeat(function(index) {
var pageNumber = index + 1;
if (pageNumber < PAGE_LIMIT) {
return Bacon.fromCallback(function(callback) {
// your method that talks to the server
getDataForAPage(pageNumber, callback);
});
} else {
return false;
}
});
var allItems = itemsPerPage.scan([], function(allItems, itemsFromAPage) {
return allItems.concat(itemsFromAPage);
});
// Here you go
allItems.onValue(function(allUnicorns){
console.log("Got "+ allUnicorns.length +" Unicorns");
// ... some real display work
});
As you noticed, you also won't need the .onValue(function(){}) hack or the external curPage state.
Here is a solution using flatMap and fold. When dealing with the network you have to remember that data can come back in a different order than you sent the requests - that's why the combination of fold and map is used.
var pages = Bacon.fromArray([1,2,3,4,5])
var requests = pages.flatMap(function(page) {
return doAjax(page)
.map(function(value) {
return {
page: page,
value: value
}
})
}).log("Data received")
var allData = requests.fold([], function(arr, data) {
return arr.concat([data])
}).map(function(arr) {
// I would normally write this as a oneliner
var sorted = _.sortBy(arr, "page")
var onlyValues = _.pluck(sorted, "value")
var inOneArray = _.flatten(onlyValues)
return inOneArray
})
allData.log("All data")
function doAjax(page) {
// This would actually be Bacon.fromPromise($.ajax...)
// Math random to simulate the fact that requests can return out
// of order
return Bacon.later(Math.random() * 3000, [
"Page"+page+"Item1",
"Page"+page+"Item2"])
}
http://jsbin.com/damevu/4/edit

Why is this call to IDBObjectStore.get() resulting in confusing behavior?

I am trying to get the hang of using indexedDB to store data client side.
Consider the following code:
function queryURL(message, sender)
{
chrome.contextMenus.removeAll();
var openRequest = indexedDB.open("Tags",1);
openRequest.onsuccess = function(event){
var queryURL = message['host'];
var db = event.target.result;
var objectStore = db.transaction("domains").objectStore("domains");
var query = objectStore.get(queryURL);
query.onsuccess = function(event){
alert(query.result);
delete query.result["domain"];
createMenuItems(query.result);
available_commands = query.result;
};
db.onerror = function(event){
console.log("an error bubbled up during a transaction.");
};
};
openRequest.onerror = function(event){
console.log("error opening DB");
};
}
I do not fully understand what should be happening in the query.
The result is the same whether or not the key that is queried for is in the database: query.onsuccess() runs and query.result is undefined, so the code errors and exits as soon as I try to delete a key from query.result.
If the key is not found, query.onsuccess() should not be running, correct?
If the key is found, query.result should hold the object that corresponds to that key, correct?
In case it helps, here is the code that I used to initialize the database:
const db_name="Tags";
var request = window.indexedDB.open(db_name, 1);
var tags = [
//codes: 0 - markdown wrap tag
// 1 - HTML wrap tag
// 2 - single tag
{ domain: "www.youtube.com",
bold:["*",0],
strikethrough:["-",0],
italic:["_",0]
},
{ domain: "www.stackoverflow.com",
bold:["<strong>",1],
italic:["<em>",1],
strikethrough:["<del>",1],
superscript:["<sup>",1],
subscript:["<sub>",1],
heading1:["<h1>",1],
heading2:["<h2>",1],
heading3:["<h3>",1],
blockquote:["<blockquote>",1],
code:["<code>",1],
newline:["<br>",2],
horizontal:["<hr>",2]
}
];
request.onerror = function(event) {
alert("Error opening the database");
};
request.onupgradeneeded = function(event) {
var db = event.target.result;
alert("I'm doing stuff!");
var objectStore = db.createObjectStore("domains", {keyPath: "domain" });
objectStore.createIndex("domain", "domain", { unique: true });
objectStore.transaction.oncomplete = function(event) {
var domainStore = db.transaction("domains","readwrite").objectStore("domains");
for(var i in tags)
{
domainStore.add(tags[i]);
}
}
};
Here are some links to the resources I am using:
Using IndexedDB
IDBObjectStore
IDBRequest
Finding out that the result is empty or undefined is still a successful query. So yes, you get an onsuccess call with result === undefined.
onerror is reserved only for when something breaks, e.g. you supplied an invalid key.
From IDBObjectStore.get docs:
Note: This method produces the same result for: a) a record that doesn't exist in the database and b) a record that has an undefined value. To tell these situations apart, call the openCursor() method with the same key. That method provides a cursor if the record exists, and no cursor if it does not.
Yes. It is even more confusing that the delete method returns success even if no record is deleted.
Since the request error event is a cancellable, bubbling event, it is not feasible to invoke the error callback when no record is found. If a request errors and the error is not prevented, its transaction will be aborted and indexedDB.onerror will be called as well. So invoking success with an undefined result is still better than invoking error.
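Following the note quoted above, a sketch of telling the two cases apart with openCursor() (reusing the "domains" store and the queryURL variable from the question) could look like this:
// Sketch: distinguish "no record" from "record whose value is undefined".
var store = db.transaction("domains").objectStore("domains");
var cursorRequest = store.openCursor(queryURL);
cursorRequest.onsuccess = function(event) {
  var cursor = event.target.result;
  if (cursor) {
    // A record exists for this key; cursor.value holds it.
    createMenuItems(cursor.value);
  } else {
    // No record with this key exists in the store.
    console.log("no entry for " + queryURL);
  }
};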

Chrome Extension with Database API interface

I want to update a div with a list of anchors that I generate from a local database in Chrome. It's pretty simple stuff, but as soon as I try to add the data to the main.js file via a callback, everything suddenly becomes undefined, or the array length is set to 0 (when it's really 18).
Initially, I tried to install it into a new array and pass it back that way.
Is there a setting that I need to specify in the chrome manifest.json in order to allow for communication with the database API? I've checked, but all I've been able to find was 'unlimited storage'
The code is as follows:
window.main = {};
window.main.classes = {};
(function(awe){
awe.Data = function(opts){
opts = opts || new Object();
return this.init(opts);
};
awe.Data.prototype = {
init:function(opts){
var self = this;
self.modified = true;
var db = self.db = openDatabase("buddy","1.0","LocalDatabase",200000);
db.transaction(function(tx){
tx.executeSql("CREATE TABLE IF NOT EXISTS listing ( name TEXT UNIQUE, url TEXT UNIQUE)",[],function(tx,rs){
$.each(window.rr,function(index,item){
var i = "INSERT INTO listing (name,url)VALUES('"+item.name+"','"+item.url+"')";
tx.executeSql(i,[],null,null);
});
},function(tx,error){
});
});
self._load()
return this;
},
add:function(item){
var self = this;
self.modified = true;
self.db.transaction(function(tx){
tx.executeSql("INSERT INTO listing (name,url)VALUES(?,?)",[item.name,item.url],function(tx,rs){
//console.log('success',tx,rs)
},function(tx,error){
//console.log('error',error)
})
});
self._load()
},
remove:function(item){
var self = this;
self.modified = true;
self.db.transaction(function(tx){
tx.executeSql("DELETE FROM listing where name='"+item.name+"'",[],function(tx,rs){
//console.log('success',tx,rs)
},function(tx,error){
//console.log('error',tx,error);
});
});
self._load()
},
_load:function(callback){
var self = this;
if(!self.modified)
return;
self.data = new Array();
self.db.transaction(function(tx){
tx.executeSql('SELECT name,url FROM listing',[],function(tx,rs){
console.log(callback)
for(var i = 0; i<rs.rows.length;i++)
{
callback(rs.rows.item(i).name,rs.rows.item(i).url)
// var row = rs.rows.item(i)
// var n = new Object()
// n['name'] = row['name'];
// n['url'] = row['url'];
}
},function(tx,error){
//console.log('error',tx,error)
})
})
self.modified = false
},
all:function(cb){
this._load(cb)
},
toString:function(){
return 'main.Database'
}
}
})(window.main.classes);
And the code to update the list:
this.database.all(function(name,url){
console.log('name','url')
console.log(name,url)
var data = []
$.each(data,function(index,item){
try{
var node = $('<div > '+item.name + '</div>');
self.content.append(node);
node.unbind();
node.bind('click',function(evt){
var t = $(evt.target).attr('href');
chrome.tabs.create({
"url":t
},function(evt){
self._tab_index = evt.index
});
});
}catch(e){
console.log(e)
}
})
});
From looking at your code above, I notice you are executing "self._load()" at the end of each function in your API. The HTML5 SQL Database is asynchronous, so you can never guarantee the result. In this case, I would assume the result will always be 0 or random because it is a race condition.
I have done something similar in my fb-exporter extension, feel free to see how I have done it https://github.com/mohamedmansour/fb-exporter/blob/master/js/database.js
To solve a problem like this, did you check the Web Inspector to see if any errors occur in the background page? I assume this is all in a background page, eh? Try to see if any error occurs; if not, I believe you're encountering a race condition. Just move the load within the callback and it should properly call the load.
Regarding your first question about the unlimited storage manifest attribute: you don't need it for this case, so that shouldn't be the issue. The limit of web databases is 5MB (last I recall; it might have changed); if you're manipulating a lot of data, then you use that attribute.
Just make sure you can guarantee that this.database.all runs after the database has been initialized.
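For illustration only (keeping the per-row callback signature that the all() consumer above expects), _load could hand its results to the callback from inside the transaction's success handler, instead of being called with no callback after every insert or delete:
// Sketch: let the async transaction drive the callback.
_load: function(callback) {
  var self = this;
  self.db.transaction(function(tx) {
    tx.executeSql('SELECT name,url FROM listing', [], function(tx, rs) {
      for (var i = 0; i < rs.rows.length; i++) {
        var row = rs.rows.item(i);
        if (callback) callback(row.name, row.url); // results only exist here
      }
    }, function(tx, error) {
      console.log('error', error);
    });
  });
},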
