Meteor Collections Publish and Subscriptions - javascript

I've been a bit stuck today and found a bunch of related topics, but still didn't manage to fix it. I'm kind of new to Meteor and might not be doing this the right way, but I have removed autopublish.
I'm creating the collection as a const in the lib/import folder, which is shared between client and server. Next, I'm calling a server method inside an async call to insert the data into the collection. So far so good, I think (I can see the data in the Mongo DB).
Now, in client.js, I want to get the data related to a user and then append it to the template or do some other stuff with it.
// SERVER
Meteor.publish("pipeline", function() {
  var data = pipeline.find({}, {fields: {userID: this.userID}}).fetch();
  return data;
});

// CLIENT
var loadCurrentPipeLineUser = Meteor.subscribe('pipeline');
var data = pipeline.findOne({userID: Meteor.userId()});
console.log(loadCurrentPipeLineUser);
console.log(data);
Both loadCurrentPipeLineUser and data come back as undefined (at least, that is how I read the logged output of loadCurrentPipeLineUser).
On the server side, inside the publish function, everything prints correctly to the console.

Publish it without .fetch():
Meteor.publish("pipeline", function() {
  // this.userId (lowercase d) is the logged-in user's id inside a publish function;
  // it belongs in the selector, not in a fields projection
  return pipeline.find({userID: this.userId});
});
A publish function has to return a cursor (or an array of cursors); returning a fetched array does not set up a reactive publication.
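On the client side, findOne will also stay undefined until the subscription data has actually arrived. A minimal sketch of waiting for it, assuming the shared collection is called pipeline and stores the owner in userID as in the question:

// CLIENT
Tracker.autorun(function () {
  var handle = Meteor.subscribe('pipeline');
  if (handle.ready()) {
    // the published documents are now in the client-side cache
    var data = pipeline.findOne({userID: Meteor.userId()});
    console.log(data);
  }
});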

Related

Meteor remote collection - hooks don’t work

I have to connect to an external database and get access to its collections. It works fine when I use it, but the problem is when I need collection hooks, e.g. Collection.after.insert(function(userId, doc)). The hook is not being fired. I have the following code:
// TestCollection.js
let database = new MongoInternals.RemoteCollectionDriver("mongodb://127.0.0.1:3001/meteor", {
  oplogUrl: 'mongodb://127.0.0.1:3001/local'
});
let TestCollection = new Mongo.Collection("testCollection", { _driver: database });
module.exports.TestCollection = TestCollection;

console.log(TestCollection.findOne({name: 'testItem'})); // writes out the item correctly

// FileUsingCollection.js
import { TestCollection } from '../collections/TestCollection.js';

console.log(TestCollection.findOne({name: 'testItem'})); // writes out the item correctly a second time

TestCollection.after.update(function (userId, doc) {
  console.log('after update');
}); // this is NOT fired when I change the content of the remote collection (in the external app whose database I am connected to)
How to make this work?
EDIT:
I have read about this for many hours and I think it might be connected with things like:
- oplog
- replicaSet
But I am a newbie to Meteor and can't figure out what those things are about. I have set MONGO_OPLOG_URL and I added the oplog parameter to the database driver, as I read here: https://medium.com/@lionkeng/2-ways-to-share-data-between-2-different-meteor-apps-7b27f18b5de9
but nothing changed. And I don't know how to use this replicaSet or how to add it to the URL. Can anybody help?
You can also try something like the code below:
var observer = YourCollections.find({}).observeChanges({
  added: function (id, fields) {
    // react to newly added documents here
  }
});
Besides 'added(id, fields)', you can also handle 'addedBefore(id, fields, before)', 'changed(id, fields)', 'movedBefore(id, before)' and 'removed(id)'.
For more details, see the observeChanges section of the Meteor docs.
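Applied to the remote collection from the question, a minimal sketch could look like this (it reuses the TestCollection from TestCollection.js above; whether changes made by the external app actually trigger the callbacks still depends on the oplog being tailed correctly):

// e.g. in FileUsingCollection.js, server side
import { TestCollection } from '../collections/TestCollection.js';

var handle = TestCollection.find({}).observeChanges({
  changed: function (id, fields) {
    // fired when a document in the remote collection changes
    console.log('changed', id, fields);
  },
  removed: function (id) {
    console.log('removed', id);
  }
});
// call handle.stop() when you no longer need the observer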

JS: Node.js and Socket.io - globals and architecture

Dear all,
I've been working with JS for some weeks now and need a bit of clarification. I have read a lot of sources and a lot of Q&A, also in here, and this is what I have learned so far.
Everything below is in connection with Node.js and Socket.io
Use of globals in Node.js "can" be done, but it is not best practice; in other words: DON'T DO IT!
With sockets, everything is handled per socket call, meaning there is hardly any memory of the previous call. A call comes in and gets served, so there are no "kept" variables.
OK, I built a chat example: multiple users, all served with broadcast, but no private messages, for example.
Fairly simple and fairly OK. But now I am stuck in my mind and can't wrap my head around it.
Let's say:
I need to act on the request
Like a request: "To all users whose name is BRIAN"
In my head I imagined:
1.
Custom object USER - defined globally on Node.js
function User(socket) {
  this.Name = null;
  this.socket = socket;
}
2.
Then hold an ARRAY of these globally
users = [];
and on newConnection, create a new User, pass its socket on and store it in the array for further use with
users.push(new User(socket));
3.
And on a Socket.io request that wants to contact all BRIANs, do something like
for (var i = 0; i < users.length; i++) {
  if (users[i].Name == "BRIAN") {
    // Emit to users[i].socket
  }
}
But after trial and error, debugging, googling and reading, this is apparently NOT how something like this should be done, and somehow I can't find the right way to do it, or at least see / understand it. Can you please help me, point me in a good direction or propose a best practice here? That would be awesome :-)
Note:
I don't want to store the data in a DB (that is the next step); I want to work on the fly.
Thank you very much for your inputs
Oliver
First of all, please don't put users in a global variable; better to put it in a module and require it wherever needed. You can do it like this:
users.js
var users = {
  _list: {}
};
users.create = function(data) {
  this._list[data.id] = data;
};
users.get = function(user_id) {
  return this._list[user_id];
};
users.getAll = function() {
  return this._list;
};
module.exports = users;
and somewhere else in your implementation:
var users = require('./users'); // relative path, since users.js is a local module
For your problem where you want to send to all users with the name "BRIAN", I can see two good ways to do it.
First.
When a user connects to the socket.io server, let the user join a socket.io room named after him/her.
It will look like this:
var custom_namespace = io.of('/custom_namespace');
custom_namespace.on('connection', function(client_socket) {
  // assuming this is where you send data from the frontend to the server
  client_socket.on('user_data', function(data) {
    // assuming you have sent a valid object with a "name" property, let the client socket join the room
    if (data != undefined) {
      client_socket.join(data.name); // here is the trick
    }
  });
});
Now, if you want to send to all people with the name "BRIAN", you can do it like this:
io.of('/custom_namespace').to('BRIAN').emit('some_event', some_data);
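For completeness, a minimal client-side sketch of the same flow (the namespace path and the 'user_data' event name come from the server code above; where the name value comes from is up to your app):

// browser, using the socket.io client library
var socket = io('/custom_namespace');
socket.on('connect', function() {
  // tell the server who this socket belongs to, so it can join the right room
  socket.emit('user_data', { name: 'BRIAN' });
});
socket.on('some_event', function(data) {
  console.log('received', data);
});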
Second.
By saving the data in the users module and filtering it with the lodash library.
Sample code:
var _lodash = require('lodash');
var Users = require('./users');

var all_users = Users.getAll();
var socket_ids = [];

var users_with_name_brian = _lodash.filter(all_users, { name: "BRIAN" });
users_with_name_brian.forEach(function(user) {
  socket_ids.push(user.id); // assuming each stored user's id is its socket id
});
Now, instead of emitting one by one per iteration, you can do it like this in socket.io (each socket automatically joins a room named after its own id, and recent socket.io versions let to() take an array of room names):
io.of('/custom_namespace').to(socket_ids).emit('some_event', some_data);
For more features, see the lodash documentation (https://lodash.com/docs/).
I hope this helps.
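As an addendum, here is one sketch of how the users module and the connection handler could fit together (storing socket.id as the id and reading the name from the 'user_data' event are assumptions on top of the snippets above):

var users = require('./users');
var custom_namespace = io.of('/custom_namespace');

custom_namespace.on('connection', function(client_socket) {
  client_socket.on('user_data', function(data) {
    if (data != undefined) {
      // keep an in-memory record keyed by the socket id
      users.create({ id: client_socket.id, name: data.name });
    }
  });
  client_socket.on('disconnect', function() {
    // drop the entry again so the list does not grow forever
    // (a users.remove(id) helper on the module would be a cleaner way to do this)
    delete users.getAll()[client_socket.id];
  });
});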

How to query firebase for many to many relationship?

It is my first time developing an SPA, and I am not using JS frameworks like React, Vue or Angular. My project just uses the Firebase SDK and jQuery to access the DOM elements.
In my app, users can be associated with projects. Because of that, I have user-projects and project-users paths to represent that relationship.
When a user logs in to my app, I request users/uid to get the user data. After that, I have to fetch the projects associated with the user. I will take the ids of the associated projects to finally request the data of each project.
I'm trying to use promises as described here, but I get nothing in the console.
function loadUserProjects() {
  // Authenticated user
  var user = firebase.auth().currentUser;
  // General reference to the real time db
  var ref = firebase.database().ref();
  // Request the user data
  ref.child('users/' + user.uid).on('value').then(function(snapshot) {
    var user_data = snapshot.val();
    console.log(user_data);
    // Global variable to store the id of the selected project
    project_selected_key = user_data.project_selected;
    // Get the list of associated projects
    return ref.child('user-projects/' + user.uid).on('value').then(function(snapshot) {
      console.log(snapshot);
      return snapshot;
    });
  }).then(function (projectsSnapshot) {
    console.log(projectsSnapshot);
    // List associated projects
    var project_options = '';
    projectsSnapshot.forEach(function (e) {
      project_options += '<option value="' + e.key + '">' + e.val() + '</option>';
    });
    if (!project_options) {
      project_options = '<option disabled selected value>- Ningún proyecto -</option>';
    }
    $('#project_selected').html(project_options);
  }, function(error) {
    // Something went wrong.
    console.error(error);
  });
}
I know that I will need one additional request, because at this point the <select> will be populated with true values (the additional request has to query the full data of each project). But I am not even getting messages in the console.
Thanks in advance.
After that, I will need to define different levels of privilege in each project, and associate a level when a project is assigned to a specific user. Initially I was very excited about the real-time features, but it seems that Firebase is getting more complicated than I expected.
A Firebase on() listener can respond to multiple events, while a promise can only resolve once; that's why a promise is only returned when you use Firebase's once() operation:
return ref.child('user-projects/'+user.uid).once('value');
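Putting that into the question's function, a sketch of the full promise chain could look like this (the 'projects/' path and the project.name field are assumptions about the database structure, so adjust them to your actual data; project_selected_key is the global from the question):

function loadUserProjects() {
  var user = firebase.auth().currentUser;
  var ref = firebase.database().ref();

  ref.child('users/' + user.uid).once('value').then(function(snapshot) {
    var user_data = snapshot.val();
    project_selected_key = user_data.project_selected;
    // once() returns a promise, so the next .then() receives the snapshot
    return ref.child('user-projects/' + user.uid).once('value');
  }).then(function(projectsSnapshot) {
    // second round of requests: fetch the full data of each associated project
    var requests = [];
    projectsSnapshot.forEach(function(child) {
      requests.push(ref.child('projects/' + child.key).once('value'));
    });
    return Promise.all(requests);
  }).then(function(projectSnapshots) {
    var project_options = '';
    projectSnapshots.forEach(function(projectSnap) {
      var project = projectSnap.val();
      project_options += '<option value="' + projectSnap.key + '">' + project.name + '</option>';
    });
    if (!project_options) {
      project_options = '<option disabled selected value>- Ningún proyecto -</option>';
    }
    $('#project_selected').html(project_options);
  }).catch(function(error) {
    console.error(error);
  });
}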

Trying to understand Flux stores - so if the state is held in the store, is this also where I do database calls?

I'm trying to build a contacts list app to teach myself React, and I am learning Fluxible now.
1) A new contact is entered. Upon submit, a newContact object is created that holds:
firstName
lastName
email
phone1 (can add up to 3 phones)
image (right now it's just a text field where you can add a URL)
2) This newContact object is sent as a payload to my createNewContactAction, and the dispatcher is "alerted" that a new contact has been made.
3) At this point, ContactStore comes into play... and this is where I am stuck.
I have gotten my object to this point. If I want to save this object to my database, is this where I would do that?
I'm a bit confused as to what to do next. My end goal would be to show all the contacts in a list, so I need to add each new contact somewhere so I can pull all of them.
Can someone point me in the right direction?
I would make a request to the server to save the newContact object before calling the createNewContactAction function. If the save is successful, then you can call the createNewContactAction to store the newContact object in the ContactStore. If it isn't successful, then you can do some error handling.
To understand why I think this pattern is preferable in most cases, imagine that you saved the contact in the store and then tried to save it in the database, but then the attempt to save in the database was unsuccessful for some reason. Now the store and database are out of sync, and you have to undo all of your changes to the store to get them back in sync. Making sure the database save is successful first makes it much easier to keep the store and database in sync.
There are cases where you might want to stash your data in the store before the database, but a user submitting a form with data you want to save in the database likely isn't one of those cases.
I like to create an additional file to handle my API calls; having all of your HTTP calls in your store can clutter things very quickly. I usually name it after my store, so in this case something like "contacts-api.js". In the api file I export an object with all of the API methods I need, e.g. using superagent for the HTTP requests:
var request = require('superagent'); // HTTP client used for the requests

module.exports = {
  createNewContact: function(data, callback) {
    request
      .post('/url')
      .send(data)
      .end(function(err, res) { // superagent calls back with (err, res)
        if (callback && typeof callback === 'function') {
          callback(err, res);
        }
      });
  }
};
I usually end up creating three actions per request: the first triggers the initial request with the data, the next is a success action with the results, and the last one is for errors.
Your store methods for each action might end up looking something like this:
// `api` is the contacts-api.js module from above, e.g. var api = require('./contacts-api');
onCreateNewContactRequest: function(data) {
  api.createNewContact(data, function(err, res) {
    if (err) {
      ContactsActions.createNewContactError(err);
    } else {
      ContactsActions.createNewContactSuccess(res);
    }
  });
},
onCreateNewContactSuccess: function(res) {
  // save the data to the store
  this.newContact = res;
},
onCreateNewContactError: function(err) {
  // save the error to the store
  this.error = err;
}
DB calls should ideally be made by action creators. Stores should only contain data.
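Following that advice, a sketch of an action creator that owns the API call (Fluxible-style, reusing the hypothetical contacts-api.js module and the success/error actions from the previous answer):

var api = require('./contacts-api');

// Fluxible action creator signature: (actionContext, payload, done)
module.exports = function createNewContact(actionContext, newContact, done) {
  api.createNewContact(newContact, function(err, res) {
    if (err) {
      actionContext.dispatch('CREATE_NEW_CONTACT_ERROR', err);
    } else {
      actionContext.dispatch('CREATE_NEW_CONTACT_SUCCESS', res.body);
    }
    done();
  });
};

The store then only listens for the success/error events and saves the payload; it never touches the network itself.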

Meteor collection not updating subscription on client

I'm quite new to Meteor and Mongo, and even though I don't want them, I need some relations.
I have a collection called Feeds and another called UserFeeds where I have a feedid and a userid, and I publish the user's feeds on the server like this:
Meteor.publish('feeds', function() {
  return Feeds.find({_id: {$in: _.pluck(UserFeeds.find({user: this.userId}).fetch(), 'feedid')}});
});
I find the user in UserFeeds, fetch it (which returns an array), pluck it to keep only the feedid field, and then find those feeds in the Feeds collection.
And subscribe on the client like this:
Deps.autorun(function() {
  Meteor.subscribe("feeds");
});
The problem is that when I add a new feed and a new userfeed the client doesn't receive the change, but when I refresh the page the new feed does appear.
Any idea of what I'm missing here?
Thanks.
I've run into this, too. It turns out publish functions on the server don't re-run reactively: if they return a Collection cursor, as you're doing (and as most publish functions do), then the publish function will run once and Meteor will store the cursor and send down updates only when the contents of the cursor change. The important thing here is that Meteor will not re-run the publish function, nor, therefore, the Collection.find(query), when query changes. If you want the publish function to re-run, then the way I've done it so far is to set up the publish function to receive an argument. That way the client, whose collections do update reactively, can re-subscribe reactively. The code would look something like:
// client
Meteor.subscribe('user_feeds');
Deps.autorun(function() {
  var allFeeds = UserFeeds.find({user: Meteor.userId()}).fetch();
  var feedIds = _.pluck(allFeeds, 'feedid');
  Meteor.subscribe('feeds', feedIds);
});

// server
Meteor.publish('feeds', function(feedIds) {
  return Feeds.find({_id: {$in: feedIds}});
});
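The client code above also subscribes to 'user_feeds', so that collection needs its own publication on the server. A minimal sketch, assuming UserFeeds stores the owner in a user field as in the question:

// server
Meteor.publish('user_feeds', function() {
  return UserFeeds.find({user: this.userId});
});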
I believe the Meteorite package publish-with-relations is designed to solve this problem, although I haven't used it.
EDIT: I believe the publish function will re-run when the userId changes, which means that you can have a server-side check to make sure the user is logged in before publishing sensitive data.
I think your problem is the .fetch() that you use here…
UserFeeds.find({user:this.userId}).fetch()
…removes the reactivity.
.fetch() returns an array instead of a cursor, and that array won't be reactive.
http://docs.meteor.com/#fetch
try this ...
Meteor.autosubscribe(function() {
  Meteor.subscribe("feeds");
});
and in the Template JS ...
Template.templateName.feeds = function() {
  return Feeds.find(); // or any specific call
};
in the HTML ...
{{#each feeds}}
do some stuff
{{else}}
no feed
{{/each}}
You can use the reactive-publish package (I am one of authors). It allows you to create publish endpoints which depend on the result of another query. In your case, query on UserFeeds.
Meteor.publish('feeds', function () {
  this.autorun(function (computation) {
    var feeds = _.pluck(UserFeeds.find({user: this.userId}, {fields: {feedid: 1}}).fetch(), 'feedid');
    return Feeds.find({_id: {$in: feeds}});
  });
});
The important part is that you limit the UserFeeds fields only to feedid to make sure autorun does not rerun when some other field changes in UserFeeds, a field you do not care about.
