Firebase realtime database: writing new data is failing - javascript

I have a bunch of data that needs to be updated in my realtime database but the set command (as described here in the docs) isn't working. Here's my code, which I'm running with babel-node scriptName.js:
var config = {
    // CONFIG
};
Firebase.initializeApp(config);
var dbRef = Firebase.database().ref();

getAllFirebaseDocs(dbRef);

async function getAllFirebaseDocs(dbRef, newSet) {
    var snapshot = await dbRef.once('value');
    var data = snapshot.val();
    var bookName = "FirebaseForDummies";
    var newData = {
        'prop1': 'string',
        'prop2': 0
    };
    Firebase.database().ref(bookName + "/edition/" + 3).set(newData);
}
I'm doing some other stuff which doesn't affect the write (hence reading all existing data). I'm specifically updating/augmenting existing data. For example, the book FirebaseForDummies already has 3 editions, which means the editions value in the database looks something like this:
0: {data}
1: {data}
2: {data}
I want to add a fourth, so I create the db reference with FirebaseForDummies/edition/3. Then, just like in the docs, I use set and pass it the newData object. This however fails silently: I don't get any error messages, and nothing changes in the realtime database console.
edit:
I tried these commands in a babel-node session in my console, and they worked. So there's something in my script that's making the set() function not work... not sure what it is, since I have other scripts that implement set() on existing data and they all work fine.
edit2:
I added a callback as follows:
Firebase.database().ref(bookName + "/edition/" + 3).set(newData, function(error) {
    if (error) {
        console.log(error);
    } else {
        console.log("written!");
    }
});
Again, when I try it in a babel-node console it works fine (logging 'written!'), but in my script I get neither the error nor the success console.log.

The answer is to not call process.exit() at the end of the script. I didn't even think of it because I put it there simply to close the script when it was done, but apparently it was 'cutting off' the pending requests. I should make await work with the set requests so that the script only exits after the writes complete.
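A minimal sketch of that fix. To keep it runnable without a live database, a tiny stand-in object mimics just the `Firebase.database().ref().set()` surface; with the real SDK you would keep the initialization from the question and drop the stub:

```javascript
// Stand-in for Firebase.database().ref().set() -- just enough of the
// API surface to illustrate the control flow without a live database.
const Firebase = {
  database() {
    return {
      ref(path) {
        return { set: (data) => Promise.resolve({ path, data }) };
      },
    };
  },
};

async function writeNewEdition(bookName, editionIndex, newData) {
  const path = bookName + "/edition/" + editionIndex;
  // Awaiting set() keeps the script alive until the write has been
  // committed; calling process.exit() before this promise resolves is
  // what silently cancelled the request in the original script.
  await Firebase.database().ref(path).set(newData);
  return path;
}

writeNewEdition("FirebaseForDummies", 3, { prop1: "string", prop2: 0 })
  .then((path) => console.log("written to " + path));
```

With the real SDK, anything that should happen "after the script is done" (including an explicit exit) belongs after the awaited write, not at the top level of the script.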


delete incoming write event after calculations in firebase functions

I have an app that uses Firebase, pretty much the whole stack: functions, database, storage, auth, messaging, the whole 9. I want to keep the client end very lightweight, so if a user comments on a post and "tags" another user, say with the typical "@username" style tagging, all of the heavy lifting moves to Firebase Functions. That way the client doesn't have to figure out the user ID from the username and do everything else. It is set up using triggers, so when the above scenario happens I write to a "table" called "create_notifications" with some data like
{
    type: "comment",
    post_id: postID,
    from: user.getUid(),
    comment_id: newCommentKey,
    to: taggedUser
}
Where the taggedUser is the username, the postID is the active post, the newCommentKey is retrieved from .push() on the comments db reference, and the user.getUid() is from the firebase auth class.
Now in my Firebase Functions I have an "onWrite" trigger for that specific table that gets all of the relevant information and sends a notification to the poster of the post with all the relevant details. All of that is complete. What I am trying to figure out is: how do I delete the incoming event, so that I don't need any sort of cron job to clear out this table? I want to grab the event, do my needed calculations and data gathering, send the message, then delete the incoming event so it never really exists in the database except for the small amount of time it took to gather the data.
A simplified sample of the firebase functions trigger is...
exports.createNotification = functions.database.ref("/create_notifications/{notification_id}").onWrite(event => {
    const from = event.data.val().from;
    const toName = event.data.val().to;
    const notificationType = event.data.val().type;
    const post_id = event.data.val().post_id;
    var comment_id;
    if (notificationType == "comment") {
        comment_id = event.data.val().comment_id;
    }
    const toUser = admin.database().ref("users").orderByChild("username").equalTo(toName).once('value');
    const fromUser = admin.database().ref(`/users/${from}`).once('value');
    const referencePost = admin.database().ref(`posts/${post_id}`).once('value');
    return Promise.all([toUser, fromUser, referencePost]).then(results => {
        const toUserRef = results[0];
        const fromUserRef = results[1];
        const postRef = results[2];
        var newNotification = {
            type: notificationType,
            post_id: post_id,
            from: from,
            sent: false,
            create_on: Date.now()
        };
        if (notificationType == "comment") {
            newNotification.comment_id = comment_id;
        }
        return admin.database().ref(`/user_notifications/${toUserRef.key}`).push().set(newNotification).then(() => {
            // NEED TO DELETE THE INCOMING "event" HERE TO KEEP DB CLEAN
        });
    });
});
So in the final "return" of that function, after it writes the finalized data to the "/user_notifications" table, I need to delete the event that started the whole thing. Does anyone know how to do that? Thank you.
First off, use .onCreate instead of .onWrite. You only need to read each child when they are first written, so this will avoid undesirable side effects. See the documentation here for more information on the available triggers.
event.data.ref holds the reference where the event occurred. You can call remove() on that reference to delete it:
return event.data.ref.remove()
The simplest way to achieve this is to call the remove() function offered by the Admin SDK.
You can get the notification_id through the event, i.e. event.params.notification_id, then remove it when needed with admin.database().ref('pass in the path').remove(); and you are good to go.
For newer versions of Firebase, use:
return change.after.ref.remove()
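For context, in firebase-functions v1.0+ a database onCreate handler receives a (snapshot, context) pair instead of a single event, and the snapshot's ref is a property rather than a function. A sketch of that handler shape, written as a plain testable function (with the real SDK it would be wired up via functions.database.ref("/create_notifications/{notification_id}").onCreate(handler)):

```javascript
// Sketch of a v1.0+ style onCreate handler body. `snapshot` and
// `context` mirror the shapes the SDK passes in; no Firebase import
// is needed to exercise the logic itself.
function handleCreateNotification(snapshot, context) {
  const notification = snapshot.val();
  // ... build and send the outgoing notification here ...
  // Removing the triggering node keeps /create_notifications empty,
  // so no cron job is needed. Returning the promise tells Cloud
  // Functions to wait for the delete to finish.
  return snapshot.ref.remove();
}
```

The same pattern applies to the onWrite variant shown above, except the reference is reached through change.after.ref.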

Meteor remote collection - hooks don’t work

I have to connect to an external database and get access to its collections. It works fine when I use it, but the problem appears when I need collection hooks, e.g. Collection.after.insert(function(userId, doc)). The hook is not being fired. I have the following code:
// TestCollection.js
let database = new MongoInternals.RemoteCollectionDriver("mongodb://127.0.0.1:3001/meteor", {
    oplogUrl: 'mongodb://127.0.0.1:3001/local'
});
let TestCollection = new Mongo.Collection("testCollection", { _driver: database });
module.exports.TestCollection = TestCollection;
console.log(TestCollection.findOne({name: 'testItem'})); // writes out the item correctly
// FileUsingCollection.js
import { TestCollection } from '../collections/TestCollection.js';
console.log(TestCollection.findOne({name: 'testItem'})); // writes out the item correctly second time
TestCollection.after.update(function (userId, doc) {
    console.log('after update');
}); // this is NOT being fired when I change the content of the remote collection (in the external app whose database I am connected to)
How to make this work?
EDIT:
I have spent many hours reading about it, and I think it might be connected with things like:
- oplog
- replicaSet
But I am a newbie to Meteor and can't figure out what those things are about. I have set MONGO_OPLOG_URL and I added the oplog parameter to the database driver, as I read here: https://medium.com/@lionkeng/2-ways-to-share-data-between-2-different-meteor-apps-7b27f18b5de9
but nothing changed. And I don't know how to use this replicaSet or how to add it to the url. Can anybody help?
You can also try something like the code below:
var observer = YourCollections.find({}).observeChanges({
    added: function (id, fields) {
    }
});
You can also use 'addedBefore(id, fields, before)', 'changed(id, fields)', 'movedBefore(id, before)' and 'removed(id)'.
For more features, see the linked documentation.
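As a sketch of how those callbacks fit together: observeChanges only needs a cursor-like object exposing that method, so a stub cursor stands in for TestCollection.find({}) here to keep the example runnable outside Meteor:

```javascript
// Watch a cursor and record every change event it reports.
function watchCollection(cursor, log) {
  return cursor.observeChanges({
    added(id, fields)   { log.push(["added", id]); },
    changed(id, fields) { log.push(["changed", id]); },
    removed(id)         { log.push(["removed", id]); },
  });
}

// Stub cursor standing in for TestCollection.find({}): it immediately
// reports one existing document, the way Meteor replays current docs
// to a new observer.
const stubCursor = {
  observeChanges(callbacks) {
    callbacks.added("doc1", { name: "testItem" });
    return { stop() {} };
  },
};

const log = [];
const handle = watchCollection(stubCursor, log);
handle.stop(); // stop watching when no longer needed
```

Note that for remote changes to be reported, the observer relies on oplog tailing working (the oplogUrl mentioned above), so this is a workaround for the hooks rather than a fix for the underlying oplog setup.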

how to fix these issues when using neo4j-driver in nodejs?

I am having several problems when trying to use neo4j in nodejs on the backend.
The following seems fine, but I could not see any nodes in the local database browser.
var neo4j = require('neo4j-driver').v1;
var driver = neo4j.driver("bolt://localhost", neo4j.auth.basic("neo4j", "neo4j"));
var session = driver.session();
session.run('CREATE (a:person {name: "aaaa")-[a:work_at]->(b:company {type:"Inc"})');
session.close();
driver.close();
at the local browser
http://localhost:7474/browser/
I tried to see these added nodes by
match (a) return a
but nothing came out.
So where are the above nodes now? How do I know that I added something to the database?
Since the above code seemed fine, I put it inside a function in a module and called it from another module. The problem is that I got an error. How is this possible? It is the same code, OMG!
function test() {
    var neo4j = require('neo4j-driver').v1;
    var driver = neo4j.driver("bolt://localhost", neo4j.auth.basic("neo4j", "neo4j"));
    var session = driver.session();
    session
        .run('CREATE (a:person {name: "aaaa")-[a:work_at]->(b:company {type:"Inc"})')
        .then(() => {
            console.log('add a node');
            session.close();
            driver.close();
        })
        .catch(error => {
            console.log(error);
            session.close();
            driver.close();
            return error;
        });
}
I keep getting the following error. I have searched everywhere, but could not fix it:
Error: Connection was closed by server
How can I specify where to put my database files, the name of the database file, and so on?
I wanted to import data to neo4j from a csv file.
Can I use 'load csv' like the following?
session.run('load csv from .....')
I saw everyone just using 'load csv' from the command line.
If I must do it from the command line, how can I specify the path of the file?
Can anyone point out what I am doing wrong?
Please help!
Your Cypher query has a couple of major problems, and will not work.
You omitted the } when assigning the name property to the person node.
You are attempting to use the same identifier, "a", in two conflicting ways: for a node and for a relationship.
Something like this would work (I left out all identifiers, since your current query doesn't need them):
CREATE (:person {name: "aaaa"})-[:work_at]->(:company {type:"Inc"})
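Putting that corrected query back into the promise chain from the question looks roughly like the sketch below. A stub session stands in for the real bolt session so the chain's shape is runnable without a Neo4j instance; with the real driver, driver.session() drops straight in:

```javascript
// The corrected Cypher: property braces closed, and no identifier
// reused for both a node and a relationship.
const CREATE_QUERY =
  'CREATE (:person {name: "aaaa"})-[:work_at]->(:company {type: "Inc"})';

// Runs the query and always closes the session, whether the query
// succeeds or fails. `session` is anything exposing run()/close().
function createPersonAtCompany(session) {
  return session
    .run(CREATE_QUERY)
    .then((result) => {
      session.close();
      return result;
    })
    .catch((error) => {
      session.close();
      throw error;
    });
}

// Stub session standing in for neo4j-driver's bolt session.
const stubSession = {
  run: (query) => Promise.resolve({ query }),
  close: () => {},
};

createPersonAtCompany(stubSession).then(() => console.log("add a node"));
```

With the broken query fixed, the "Connection was closed by server" error and the empty `match (a) return a` result should both be easier to diagnose, since the server no longer rejects the statement itself.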

Meteor synchronous and asynchronous call to read a file

I am new to Meteor. I am using the following code to read a file stored on the server.
Client side
Meteor.call('parseFile', (err, res) => {
    if (err) {
        alert(err);
    } else {
        Session.set("result0", res[0]);
        Session.set("result1", res[1]);
        Session.set("result2", res[2]);
    }
});
let longitude = Session.get("result0");
let latitude = Session.get("result1");
var buildingData = Session.get("result2");
Server Side
Meteor.methods({
    'parseFile'() {
        var csv = Assets.getText('buildingData.csv');
        var rows = Papa.parse(csv).data;
        return rows;
    }
});
The problem is that the call takes time to send the result back, so wherever I use latitude and longitude they are undefined and the page breaks. Is there any solution to avoid this problem? One solution could be to make a synchronous call and wait for the result to be returned.
You can make the server method run synchronously using the futures package, which should force the client to wait for the method to complete.
It might look something like this:
Meteor.methods({
    'parseFile'() {
        var future = new Future();
        var csv = Assets.getText('buildingData.csv');
        var rows = Papa.parse(csv).data;
        future.return(rows);
        return future.wait();
    }
});
This requires installing the futures package linked above and setting up your imports properly in the file containing your Meteor.methods() definitions. You might also look into proper error handling inside your method.
UPDATE:
Future is an npm package, which you can read about here. The link above points to the Atmosphere package, which looks like an old wrapper around it.
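To illustrate the return/wait pairing, here is a sketch with a minimal stand-in for fibers/future. The real wait() suspends the current fiber until return() is called from async code; the stand-in only mimics the value hand-off, which is enough to show that the method must return future.wait() for the rows to reach the client:

```javascript
// Minimal stand-in for fibers/future: return() stores the value,
// wait() hands it back. With the real package this would be:
// const Future = require('fibers/future');
function Future() {
  let value;
  let settled = false;
  return {
    return(v) { value = v; settled = true; },
    wait() {
      if (!settled) throw new Error("wait() called before return()");
      return value;
    },
  };
}

// parseFile takes its file reader and parser as parameters so it can
// run outside Meteor; inside a method they would be Assets.getText
// and Papa.parse.
function parseFile(getText, parse) {
  const future = new Future();
  const rows = parse(getText("buildingData.csv")).data;
  future.return(rows);
  // Returning future.wait() hands the rows back to the caller; the
  // client receives undefined if this value is not returned.
  return future.wait();
}
```

Note that since Assets.getText and Papa.parse are both synchronous here, the Future adds nothing over a plain `return rows;` -- it only pays off when the method wraps genuinely asynchronous work.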

Return value of CRUD functions

I have simple table scripts in Azure written in javascript and node.js.
Now, if I call any of these from Windows Phone, the object parameter gets updated automatically from the return value, and thus code like this works:
await table1.InsertAsync(val1);
MyObj val2 = new MyObj();
val2.data = val1.Id;
await table2.InsertAsync(val2);
However, now I am trying to do the same from a scheduled job in Azure: essentially chaining two insert calls so that the latter depends on the id of the former. The id column is an identity column and gets created automatically.
How can I achieve this? The scheduled job is written in javascript/node.js. I have tried
var res = table1.insert(val1);
and using val1.id after the first insert, but neither works.
And of course, just moments after posting the question, I came up with an answer.
table1.insert(val1, {
    success: function(res) {
        var val2 = {
            data: res.id
        };
        table2.insert(val2);
    }
});
