I am working with React 16.3.2, Redux 4 and Dexie 2.0.3.
When I try to store data a second time, it throws this error message:
Error: ConstraintError: Key already exists in the object store.
return dispatch => {
  db.table
    .add(data)
    .then(function (id) {
      console.log(id);
    })
    .catch(function (error) {
      console.log("Error: " + error);
    });
}
My DB schema:
const db = new Dexie('ReactReduxDexieJsCRUD');
db.version(1).stores({table:'++id,name,age,bloodGroup,donateBefore,weight' });
The first time it stores the data fine, but after that it gives the error.
What does your schema look like (the part db.version(x).stores({...}))?
The most common setup is to have an inbound primary key, for example:
db.version(1).stores({
table: 'id, idx1, idx2...'
});
Here id is the primary key.
db.table.add({id: 1, foo: 'bar'}) will add an object with id 1.
db.table.add({id: 1, foo: 'bar2'}) a 2nd time will fail because id 1 already exists.
db.table.put({id: 1, foo: 'bar2'}) will update the object with id 1.
So what do you really want to do? You say you want to add a new object with a new key. If so, I suppose the error is that you are supplying the same key the second time.
You can also let the id be generated by the db:
db.version(2).stores({
table: '++id, idx1, idx2...'
});
Then you don't need to supply id in calls to add():
db.table.add({foo: 'bar'}) will add an object with id 1.
db.table.add({foo: 'barX'}) the 2nd time will add a new object with id 2
...
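If the data object you dispatch still carries an id left over from an earlier add, Dexie will try to reuse that key even with ++id. A minimal sketch of the action creator under that assumption (the saveRecord name is made up), either stripping the stale id or switching to put() to overwrite:

const saveRecord = (data) => {
  return dispatch => {
    // Option 1: drop any stale id so Dexie generates a fresh ++id key.
    const { id, ...rest } = data;
    return db.table
      .add(rest)
      .then(function (newId) {
        console.log(newId);
      })
      .catch(function (error) {
        console.log("Error: " + error);
      });
    // Option 2: keep the id and overwrite the existing record instead:
    // return db.table.put(data);
  };
};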
Related
Is it possible to update an object's value within an IndexedDB index without cloning, deleting, or putting a new entry? Theoretically, something like the following snippet would do the trick, though it probably would not delete until the put was confirmed. But it looks like overkill to me, and it looks like it would be a nightmare to do any error handling on.
const objectStore = db.transaction([objectStoreName], 'readwrite')
  .objectStore(objectStoreName);
const requestGet = objectStore.get(index);
requestGet.onsuccess = (event: any) => {
  const value = event.target.result.value; // Store old value
  const requestDelete = objectStore.delete(index);
  requestDelete.onsuccess = (event: any) => {
    const requestPut = objectStore
      .put({index: 'New Index Value', value: value}); // Put back using new index
  };
};
You cannot directly change values in an object store's index. You can change the values of an object in an object store, and IndexedDB will propagate your changes to related indices. Indices are essentially read-only.
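So instead of delete-then-put, you can read the object by its primary key, change the indexed property on the object itself, and put() it back; the secondary index is updated automatically. A rough sketch, assuming objectStoreName from the question and that the value being changed is covered by a secondary index rather than being the primary key (a primary key cannot be changed in place):

const store = db.transaction([objectStoreName], 'readwrite')
  .objectStore(objectStoreName);
const requestGet = store.get(primaryKey); // primaryKey is hypothetical here
requestGet.onsuccess = (event) => {
  const record = event.target.result;
  record.index = 'New Index Value'; // change the indexed property on the object
  store.put(record);                // same primary key, so this overwrites in place
};
requestGet.onerror = (event) => console.error(event);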
It is possible, as long as you know the key of the entry; otherwise, different logic may be necessary.
As you may know, IDBObjectStore has a .put() method which receives two parameters. With it you can either put a new value or update an existing one.
IDBObjectStore.put(item, key)
item: The item you want to put/update
key (optional): the primary key in the object store (such as a UUID, a random number, in short...) of the item you would like to update.
Code:
//This is an example only.
//Let's say we have an object store called 'user-data' in our IndexedDB database 'user':
//# Key Value
//0 1 { username: 'John Doe' }
//Here we receive the 'success' result from indexedDB.open() and use it through a promise.
dbPromise.then(db => {
  //Open a read/write transaction.
  const transaction = db.transaction('user-data', 'readwrite')
  //Get the object store holding the data, the same object store as above.
  const store = transaction.objectStore('user-data')
  //Get the record by its key, in other words, the key defined when the object store was
  //created with createObjectStore. In this example 'autoIncrement: true' was used, so the key is 1.
  const query = store.get(1)
  //Read the query result with a success listener.
  query.addEventListener('success', event => {
    const user = event.target.result
    //With this, we are able to change the stored value.
    user.username = 'Jane Doe'
    store.put(user, 1)
  })
  query.addEventListener('error', event => console.error(event))
  transaction.addEventListener('complete', () => db.close())
})
//# Key Value
//0 1 { username: 'Jane Doe' }
You can find more details in the MDN IDBObjectStore.put documentation.
I'm trying to create an empty new "Book" document in Firestore with a randomly generated uid. At the same time I'd like to add data to a "Page" document in a subcollection called "Pages" within the book document. I haven't been able to make this work. Do I need to explicitly reference a docId before writing to a subcollection?
export const addBookAndFirstPage = () => {
const pageRef = db.collection("books").doc().collection("pages").doc("page1");
return pageRef.set({ foo: "bar" });
};
I've solved this by creating the two references to each doc individually, but also using add() instead of doc(), as per the docs. Note that add() expects an object as an argument, otherwise it throws the following exception:
FirebaseError: Function addDoc() called with invalid data. Data must
be an object, but it was: undefined
To create an empty document with a randomly generated uid we need to pass an empty object to the add method, like below. We can then create the reference for the doc we intend to add to the subcollection and set its data:
export const addBookAndFirstPage = async () => {
  try {
    let bookRef = await db.collection("books").add({}); // <--- Use add() with an empty object
    let pageRef = bookRef.collection("pages").doc("page1"); // doc() is synchronous, no await needed
    return await pageRef.set({ foo: "bar" });
  } catch (error) {
    console.log(error);
  }
};
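For what it's worth, add() isn't strictly required here either: doc() with no arguments also generates a random id; you then call set() on that reference (even with an empty object) so the book document actually exists. A rough sketch under that assumption:

export const addBookAndFirstPage = async () => {
  try {
    const bookRef = db.collection("books").doc();            // reserves a random id locally
    await bookRef.set({});                                    // creates the empty book document
    const pageRef = bookRef.collection("pages").doc("page1");
    return await pageRef.set({ foo: "bar" });
  } catch (error) {
    console.log(error);
  }
};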
Here is a JavaScript function intended to perform an update on Firestore, which does not work.
I will be more than happy if anyone can spot an issue in the code.
function makeUpdate(key, name) {
  let theCollection = db.collection("InformationList"),
    infoUnit = theCollection.doc(key).get().then(function (doc) {
      if (doc.exists) {
        console.log("infoUnit -name-:" + doc.get("name"));
        console.log("infoUnit -telephone-:" + doc.get("telephone"));
        let updateDico = {};
        updateDico["name"] = name;
        doc.update(updateDico);
      } else {
        console.log("embassyUpdate --> No such document!");
      }
    }).catch(err => {
      console.log("Error getting documents (in makeUpdate)", err);
    });
}
Apart from the fact that it does not perform the expected update, it prints three messages in the logs:
infoUnit -name-: some name
infoUnit -telephone-: some telephone number
Error getting documents (in makeUpdate)
From that I can see that a record is found in the database as expected. But at the same time an unknown error occurs.
There is no update() method on doc (which is a DocumentSnapshot object). A DocumentSnapshot just contains the data read by get(). If you want to write data back into a document, you'll need to use a DocumentReference object, probably the same one you got when you called theCollection.doc(key).
There is no method called update() that you can invoke on the doc DocumentSnapshot object itself.
You'll have to use the set() method on the DocumentReference, which you get from doc.ref, to update the document.
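In other words, the write has to go through the DocumentReference. A sketch of the corrected function, keeping the names from the question (update() on the reference works just as well as set() when the document already exists):

function makeUpdate(key, name) {
  const theCollection = db.collection("InformationList");
  const docRef = theCollection.doc(key); // DocumentReference: this is what you write to
  return docRef.get().then(function (doc) {
    if (doc.exists) {
      console.log("infoUnit -name-:" + doc.get("name"));
      return docRef.update({ name: name }); // write via the reference, not the snapshot
    }
    console.log("makeUpdate --> No such document!");
  }).catch(err => {
    console.log("Error getting documents (in makeUpdate)", err);
  });
}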
This is how I've updated my data.
await db
.collection('collectionName')
.doc('documentId')
.update({
name: "Updated Name",
telephone: "0000000000"
});
You need to know the document id, and then you can update your values like this.
I am attempting to perform an update to a MongoDB document (using mongoose) by first using .findById to get the document, then updating the fields in that document with new values. I am still a bit new to this so I used a tutorial to figure out how to get it working, then I have been updating my code for my needs. Here is the tutorial: MEAN App Tutorial with Angular 4. The original code had a schema defined, but my requirement is for a generic MongoDB interface that will simply take whatever payload is sent to it and send it along to MongoDB. The original tutorial had something like this:
exports.updateTodo = async function(todo){
var id = todo.id
try{
//Find the old Todo Object by the Id
var oldTodo = await ToDo.findById(id);
}catch(e){
throw Error("Error occurred while finding the Todo")
}
// If no old Todo Object exists return false
if(!oldTodo){
return false;
}
console.log(oldTodo)
//Edit the Todo Object
oldTodo.title = todo.title
oldTodo.description = todo.description
oldTodo.status = todo.status
console.log(oldTodo)
try{
var savedTodo = await oldTodo.save()
return savedTodo;
}catch(e){
throw Error("An error occurred while updating the Todo");
}
}
However, since I don't want a schema and want to allow anything through, I don't want to assign static values to specific field names like, title, description, status, etc. So, I came up with this:
exports.updateData = async function(update){
var id = update.id
// Check the existence of the query parameters, If they don't exist then assign a default value
var dbName = update.dbName ? update.dbName : 'test'
var collection = update.collection ? update.collection : 'testing';
const Test = mongoose.model(dbName, TestSchema, collection);
try{
//Find the existing Test object by the Id
var existingData = await Test.findById(id);
}catch(e){
throw Error("Error occurred while finding the Test document - " + e)
}
// If no existing Test object exists return false
if(!existingData){
return false;
}
console.log("Existing document is " + existingData)
//Edit the Test object
existingData = JSON.parse(JSON.stringify(update))
//This was another way to overwrite existing field values, but it
//performs a "shallow copy", so it's not desirable
//existingData = Object.assign({}, existingData, update)
//existingData.title = update.title
//existingData.description = update.description
//existingData.status = update.status
console.log("New data is " + existingData)
try{
var savedOutput = await existingData.save()
return savedOutput;
}catch(e){
throw Error("An error occurred while updating the Test document - " + e);
}
}
My original problem with this was that I had a lot of issues getting the new values to overwrite the old ones. Now that that's been solved, I am getting the error of "TypeError: existingData.save is not a function". I am thinking the data type changed or something, and now it is not being accepted. When I uncomment the static values that were in the old tutorial code, it works. This is further supported by my console logging before and after I join the objects, because the first one prints the actual data and the second one prints [object Object]. However, I can't seem to figure out what it's expecting. Any help would be greatly appreciated.
EDIT: I figured it out. Apparently Mongoose has its own data type of "Model" which gets changed if you do anything crazy to the underlying data by using things like JSON.stringify. I used Object.prototype.constructor to figure out the actual object type like so:
console.log("THIS IS BEFORE: " + existingData.constructor);
existingData = JSON.parse(JSON.stringify(update));
console.log("THIS IS AFTER: " + existingData.constructor);
And I got this:
THIS IS BEFORE: function model(doc, fields, skipId) {
model.hooks.execPreSync('createModel', doc);
if (!(this instanceof model)) {
return new model(doc, fields, skipId);
}
Model.call(this, doc, fields, skipId);
}
THIS IS AFTER: function Object() { [native code] }
Which showed me what was actually going on. I added this to fix it:
existingData = new Test(JSON.parse(JSON.stringify(update)));
On a related note, I should probably just use the native MongoDB driver at this point, but it's working, so I'll just put it on my to do list for now.
You've now found a solution, but I would suggest using the MongoDB driver, which would make your code look something along the lines of this and would make the original issue disappear:
// MongoDB Settings
const MongoClient = require(`mongodb`).MongoClient;
const mongodb_uri = `mongodb+srv://${REPLACE_mongodb_username}:${REPLACE_mongodb_password}@url-here.gcp.mongodb.net/test`;
const db_name = `test`;
let client; // the MongoClient instance
let db; // allows us to reuse the database connection once it is opened
// Open MongoDB Connection
const open_database_connection = async () => {
try {
client = await MongoClient.connect(mongodb_uri);
} catch (err) { throw new Error(err); }
db = client.db(db_name);
};
exports.updateData = async update => {
// open database connection if it isn't already open
try {
if (!db) await open_database_connection();
} catch (err) { throw new Error(err); }
// update document
let savedOutput;
try {
savedOutput = await db.collection(`testing`).updateOne( // .save() is being deprecated
  { // filter
    _id: update.id // the '_id' might need to be 'id' depending on how you have set your collection up, usually it is '_id'
  },
  { // update - the '$set' operator overwrites only the fields you pass in
    $set: update // this assumes the update object only contains fields that should be updated
  }
  // If you want to add a new document when the id isn't found, add the line below
  // , { upsert: true }
);
} catch (err) { throw new Error(`An error occurred while updating the Test document - ${err}`); }
if (savedOutput.matchedCount !== 1) return false; // if you add in '{ upsert: true }' above, then remove this line as it will create a new document
return savedOutput;
}
The collection testing would need to be created before this code runs, but that is a one-time thing and very easy: if you are using MongoDB Atlas, you can use MongoDB Compass or the online admin console to create the collection without a single line of code...
As far as I can see, you shouldn't need to duplicate the update object. The above reduces the database calls from two to one and lets you reuse the database connection, potentially anywhere else in the application, which helps speed things up. Also, don't store your MongoDB credentials directly in the code.
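On that last point, one common approach is to keep the connection string in an environment variable (for example via the dotenv package) instead of in the source; a minimal sketch, with a hypothetical variable name:

// .env (kept out of source control)
// MONGODB_URI=mongodb+srv://user:password@url-here.gcp.mongodb.net/test

require(`dotenv`).config(); // loads the .env file into process.env
const MongoClient = require(`mongodb`).MongoClient;
const mongodb_uri = process.env.MONGODB_URI; // read the secret from the environment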
I have data like this:
"customers": {
  "aHh4OTQ2NTlAa2xvYXAuY29t": {
    "customerId": "xxx",
    "name": "yyy",
    "subscription": "zzz"
  }
}
I need to retrieve a customer by customerId. The parent key is just a Base64-encoded email address, due to path limitations. Usually I query the data by this email address, but on a few occasions I only know the customerId. I've tried this:
getCustomersRef()
  .orderByChild('customerId')
  .equalTo(customerId)
  .limitToFirst(1)
  .once('child_added', cb);
This works nicely when the customer really exists. In the opposite case, the callback is never called.
I tried the value event, which works, but that gives me the whole subtree starting with the encoded email address, so I cannot reach the actual data inside. Or can I?
I have found this answer, Test if a data exist in Firebase, but that again assumes that I know all the path elements.
getCustomersRef().once('value', (snapshot) => {
  snapshot.hasChild(`customerId/${customerId}`);
});
What else can I do here?
Update
I think I found a solution, but it doesn't feel right.
let found = null;
snapshot.forEach((childSnapshot) => {
  found = childSnapshot.val();
});
return found;
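That forEach is, in fact, how you read the children matched by a query, since even limitToFirst(1) returns a list of results. Wrapped up as a small helper, assuming getCustomersRef() from above, it might look like this:

function findCustomerById(customerId) {
  return getCustomersRef()
    .orderByChild('customerId')
    .equalTo(customerId)
    .limitToFirst(1)
    .once('value')
    .then((snapshot) => {
      if (!snapshot.exists()) return null; // no customer with that id
      let found = null;
      snapshot.forEach((childSnapshot) => {
        found = childSnapshot.val(); // the single matched child
      });
      return found;
    });
}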
Old answer (I misunderstood the question):
If you know the encoded B64 email key, this is the way:
var endcodedB64Email = B64_encoded_mail_address;
firebase.database().ref(`customers/${endcodedB64Email}`).once("value").then(snapshot => {
// this is getting your customerId/uid. Remember to set your rules up for your database for security! Check out tutorials on YouTube/Firebase's channel.
var uid = snapshot.val().customerId;
console.log(uid) // would return 'xxx' from looking at your database
// you want to check with '.hasChild()'? If you type in e.g. 'snapshot.hasChild(`customerId`)' then this would return true, because 'customerId' exists in your database if I am not wrong ...
});
UPDATE (correction):
We have to know at least one key. So if, under some circumstances, you only know the customer uid key, then I would do it like this:
// this is the customer-uid-key that is know.
var uid = firebase.auth().currentUser.uid; // this fetches the user-id, referring to the current user logged in with the firebase-login-function
// this is the "B64EmailKey" that we will find if there is a match in the firebase-database
var B64EmailUserKey = undefined;
// "take a picture" of alle the values under the key "customers" in the Firebase database-JSON-object
firebase.database().ref("customers").once("value").then(snapshot => {
// this counter-variable is used to know when the last key in the "customers"-object is passed
var i = 0;
// run a loop on all values under "customers". "B64EmailKey" is a parameter. This parameter stores data; in this case the value for the current "snapshot"-value getting caught
snapshot.forEach(B64EmailKey => {
// increase the counter by 1 every time a new key is run
i++;
// this variable defines the value (an object in this case)
var B64EmailKey_value = B64EmailKey.val();
// if there is a match for "customerId" under any of the "B64EmailKey"-keys, then we have found the corresponding correct email linked to that uid
if (B64EmailKey_value.customerId === uid) {
// save the "B64EmailKey" key itself (the encoded email) and stop the loop
B64EmailUserKey = B64EmailKey.key;
B64UserKeyAction(B64EmailUserKey);
// returning true cancels the rest of the forEach enumeration
return true;
}
// if no linked "B64EmailUserKey" was found for the "uid"
if (i === snapshot.numChildren()) {
// the last key (B64EmailKey) under "customers" was returned. e.g. no "B64EmailUserKey" linkage to the "uid" was found
return console.log("Could not find an email linked to your account.")
}
});
});
// run your corresponding actions here
function B64UserKeyAction (emailEncrypted) {
return console.log(`The email-key for user: ${auth.currentUser.uid} is ${emailEncrypted}`)
}
I recommend putting this in a function or class, so you can easily call it up and reuse the code in an organized way.
I also want to add that your Firebase security rules must be defined to keep everything secure. And if sensitive data must be calculated (e.g. a price), then do this on the server side of Firebase: use Cloud Functions. They were introduced to Firebase in 2017.
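To illustrate that last point, a server-side calculation can live in an HTTPS callable Cloud Function. This is only a rough sketch with made-up field names (emailKey, the subscription price table), not the exact setup of the question:

const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

// Hypothetical example: compute a price on the server so the client cannot tamper with it.
exports.getSubscriptionPrice = functions.https.onCall(async (data, context) => {
  if (!context.auth) {
    throw new functions.https.HttpsError('unauthenticated', 'Sign in first.');
  }
  const snapshot = await admin.database()
    .ref(`customers/${data.emailKey}/subscription`)
    .once('value');
  const prices = { zzz: 9.99 }; // the price table lives on the server, not the client
  return { price: prices[snapshot.val()] || 0 };
});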