I am trying to develop my own proxy (extending Ext.data.proxy.Proxy). After getting the response (JSON) from my server, I can decode it and, using my JSON reader, create the set of records to commit to the store. As I understand it, I have to create an Operation object and commit the data with operation.commitRecords(). But the check if (clientRecords && clientRecords.length) never passes, because clientRecords is undefined. I cannot understand how to use this object, nor how to initialize it. This is my current code:
options.action = options.action || 'read';
var operation = new Ext.data.Operation(options);
var json = reader.read(response);
operation.commitRecords(json.records);
What should I do in order to commit the records?
Thanks in advance!
It's not the proxy that is supposed to create the operation; it's the store. The store passes the operation to the proxy, along with a callback. The proxy executes the operation in its own way and, when it is done, uses the callback to notify the store.
So, the short answer to your question is that you need to execute the callback passed to the CRUD method you're implementing. Let's say it's read (Amit's right that some context about your code could have helped). So that would be something like:
// see, the operation is given to us
read: function(operation, callback, scope) {
var me = this;
doWebsocketMagic(function() {
// notify the store
Ext.callback(callback, scope || me, [operation]);
});
}
Now, that won't be enough, because proxies are expected to manipulate the operation. You'll have to read the code from other proxies to know how. ServerProxy would have saved you that work, but you're not working with HTTP requests ("ServerProxy should ideally be named HttpProxy"). No luck.
So you should start by reading the code of the read method of MemoryProxy. It offers, in one single place, an example of everything (I think) that you've got to do.
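From memory, the bookkeeping it does looks roughly like this in ExtJS 4.x (a hedged sketch, not the verbatim source; check your version):
// roughly what Ext.data.proxy.Memory's read does
read: function(operation, callback, scope) {
    var me = this;
    // turn the raw data into a ResultSet and attach it to the operation
    operation.resultSet = me.getReader().read(me.data);
    // mark the operation finished and successful
    operation.setCompleted();
    operation.setSuccessful();
    // notify the caller (the store) that the operation is done
    Ext.callback(callback, scope || me, [operation]);
}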
Then, maybe you can go clever about it:
Ext.define('My.WebsocketProxy', {
extend: 'Ext.data.proxy.Memory'
,read: function(operation, callback, scope) {
var me = this;
doWebsocketMagic(function(response) {
me.data = response;
Ext.data.proxy.Memory.prototype.read.call(me, operation, callback, scope);
});
}
});
OK... Having searched, the store initializes the operation and then calls the proxy's read method with the callback function onProxyLoad(operation). So adding store.onProxyLoad(operation); populates the store properly. Thanks for the replies guys... they helped solve my problem.
New to MongoDB, very new to Atlas. I'm trying to set up a trigger such that it reads all the data from a collection named Config. This is my attempt:
exports = function(changeEvent) {
const mongodb = context.services.get("Cluster0");
const db = mongodb.db("TestDB");
var collection = db.collection("Config");
config_docs = collection.find().toArray();
console.log(JSON.stringify(config_docs));
}
The function is part of an automatically created Realm application called Triggers_RealmApp, which has Cluster0 as a named linked data source. When I go into Collections in Cluster0, TestDB.Config is one of the collections.
Some notes:
It's not throwing an error, but simply returning {}.
When I change context.services.get("Cluster0"); to something else, it throws an error.
When I change "TestDB" to a db that doesn't exist, or "Config" to a collection that doesn't exist, I get the same output: {}.
I've tried creating new Realm apps, manually creating services, creating new databases and new collections, etc. I keep bumping into the same issue.
The MongoDB docs reference promises and awaits, which I haven't seen in any examples (link). I tried experimenting with that a bit and got nowhere. From what I can tell, what I've already done is the typical way of doing it.
Images: screenshots of the collection and the linked data source (omitted here).
I ended up taking it up with MongoDB directly: .find() is asynchronous and I was handling it incorrectly. Here is the reply straight from the horse's mouth:
As I understand it, you are not getting your expected results from the query you posted above. I know it can be confusing when you are just starting out with a new technology and can't get something to work!
The issue is that the collection.find() function is an asynchronous function. That means it sends out the request but does not wait for the reply before continuing. Instead, it returns a Promise, which is an object that describes the current status of the operation. Since a Promise really isn't an array, your statement collection.find().toArray() is returning an empty object. You write this empty object to the console.log and end your function, probably before the asynchronous call even returns with your data.
There are a couple of ways to deal with this. The first is to make your function an async function and use the await operator to tell your function to wait for the collection.find() function to return before continuing.
exports = async function(changeEvent) {
const mongodb = context.services.get("Cluster0");
const db = mongodb.db("TestDB");
var collection = db.collection("Config");
config_docs = await collection.find().toArray();
console.log(JSON.stringify(config_docs));
};
Notice the async keyword on the first line, and the await keyword on the second to last line.
The second method is to use the .then function to process the results when they return:
exports = function(changeEvent) {
const mongodb = context.services.get("Cluster0");
const db = mongodb.db("TestDB");
var collection = db.collection("Config");
collection.find().toArray().then(config_docs => {
console.log(JSON.stringify(config_docs));
});
};
The connection has to be a connection to the primary replica set, and the login credentials must be those of an admin-level user (it needs the cluster admin permission).
I posted this question yesterday but I guess I just confused everyone. I got responses like "what exactly is your question?" So I am expanding and reposting today.
The following Node.js snippet is from the file "accounts.js", which is in an ETrade API library located at /lib. It should return JSON containing data about the accounts of the authenticated user. The authentication part is working great. I'm confused about what exactly is being done in the last line of this function:
this._run(actionDescriptor,{},successCallback,errorCallback);
Ten years ago (the last time I was coding), we didn't have the construct "this" and I haven't a clue about "_run" and Google searches have not been helpful. Here is the function.
exports.listAccounts = function(successCallback, errorCallback) {
var actionDescriptor = {
method: "GET",
module: "accounts",
action: "accountlist",
useJSON: true,
};
this._run(actionDescriptor, {}, successCallback, errorCallback);
};
I understand that the function is accessed with "et.listAccounts ...." but then my understanding goes all to hell. It's pretty obvious that a get is being executed and json data returned. It's also obvious that the result is passed back through the successCallback.
In my app.js file, I have the following:
var etrade = require('./lib/etrade');
var et = new etrade(configuration);
Can someone please suggest a snippet to be used in app.js that will output the accounts data to the console?
It seems like the json data must be passed back through the successCallback but I'm lost on how to access it on the app.js side.
Suppose in app.js I want to put the accounts data in a variable called myAccounts. The exports.listAccounts function does not specify a return value, so I doubt I can do var myAccounts = et.listAccounts(). Likewise, myAccounts will be undefined if I try to do this: et.listAccounts(){myAccounts, error}. Finally, the listAccounts function contains two possible variable names I could use, "accounts" and "accountlist" but these turn out to be undefined at app.js.
When I put a function in successCallback in app.js to write a generic message to the console, the message appears in the log so I know I am making it into the listAccounts function and back successfully. In this case, the log also shows
"Request: [GET]: https://etwssandbox.etrade.com/accounts/sandbox/rest/accountlist.json"
From this I deduce that the data is actually being returned and is available at that end point.
Ten years ago (the last time I was coding), we didn't have the construct "this" and I haven't a clue about "_run"
this refers to the current object (further reading here). _run is just the name they chose for the function.
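To illustrate (a toy sketch, not the actual etrade module's code), methods attached to an object's prototype reach their siblings through this:
// hypothetical shape of the library, for illustration only
function Etrade(configuration) {
    this.configuration = configuration;
}
Etrade.prototype._run = function (actionDescriptor, data, successCallback, errorCallback) {
    // builds the HTTP request described by actionDescriptor, sends it,
    // then calls successCallback(response) or errorCallback(error)
};
Etrade.prototype.listAccounts = function (successCallback, errorCallback) {
    // inside a method called as et.listAccounts(...), `this` is `et`,
    // so this._run finds the _run defined above
    this._run({ method: 'GET', module: 'accounts', action: 'accountlist', useJSON: true },
              {}, successCallback, errorCallback);
};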
I have no experience with this module, but with a cursory glance at the git repo I suspect you will want to expand your app.js like so:
et.listAccounts(function(response) {
console.log(response);
});
In JavaScript, functions are first-class and so can be passed around like variables (see here). listAccounts wants a function passed to it, and when it is complete it will call that function with one parameter, as can be seen in etrade.js.
There is also the function errorCallback which is much the same but is called on an error. You could expand the above snippet like so:
et.listAccounts(function(response) {
console.log(response);
}, function(error) {
console.log(error);
});
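As for the myAccounts question: the data only exists once the callback fires, so assign it from inside the callback (a minimal sketch):
var myAccounts;
et.listAccounts(function (accounts) {
    myAccounts = accounts;   // defined only after the response arrives
    console.log(myAccounts);
}, function (error) {
    console.log(error);
});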
I want to establish a two-way (bidirectional) communication within my meteor app. But I need to do it without using mongo collections.
So can pub/sub be used for arbitrary in-memory objects?
Is there a better, faster, or lower-level way? Performance is my top concern.
Thanks.
Yes, pub/sub can be used for arbitrary objects. Meteor’s docs even provide an example:
// server: publish the current size of a collection
Meteor.publish("counts-by-room", function (roomId) {
var self = this;
check(roomId, String);
var count = 0;
var initializing = true;
// observeChanges only returns after the initial `added` callbacks
// have run. Until then, we don't want to send a lot of
// `self.changed()` messages - hence tracking the
// `initializing` state.
var handle = Messages.find({roomId: roomId}).observeChanges({
added: function (id) {
count++;
if (!initializing)
self.changed("counts", roomId, {count: count});
},
removed: function (id) {
count--;
self.changed("counts", roomId, {count: count});
}
// don't care about changed
});
// Instead, we'll send one `self.added()` message right after
// observeChanges has returned, and mark the subscription as
// ready.
initializing = false;
self.added("counts", roomId, {count: count});
self.ready();
// Stop observing the cursor when client unsubs.
// Stopping a subscription automatically takes
// care of sending the client any removed messages.
self.onStop(function () {
handle.stop();
});
});
// client: declare collection to hold count object
Counts = new Mongo.Collection("counts");
// client: subscribe to the count for the current room
Tracker.autorun(function () {
Meteor.subscribe("counts-by-room", Session.get("roomId"));
});
// client: use the new collection
console.log("Current room has " +
Counts.findOne(Session.get("roomId")).count +
" messages.");
In this example, counts-by-room is publishing an arbitrary object created from data returned from Messages.find(), but you could just as easily get your source data elsewhere and publish it in the same way. You just need to send the same added, changed, and removed messages that the example here does.
You’ll notice that on the client there’s a collection called counts, but this is purely in-memory on the client; it’s not saved in MongoDB. I think this is necessary to use pub/sub.
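For instance, a hedged sketch of publishing a plain in-memory object with no MongoDB backing (the "stats" collection name and the serverStats object are made up for illustration):
// server: publish a plain in-memory object; nothing here touches MongoDB
var serverStats = { connections: 0 };
Meteor.publish("server-stats", function () {
  var self = this;
  self.added("stats", "global", serverStats);
  self.ready();
  // whenever serverStats changes later, push the update with:
  // self.changed("stats", "global", serverStats);
});
// client: an in-memory (minimongo) collection receives the published document
Stats = new Mongo.Collection("stats");
Meteor.subscribe("server-stats");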
If you want to avoid even an in-memory-only collection, you should look at Meteor.call. You could create a Meteor.method like getCountsByRoom(roomId) and call it from the client like Meteor.call('getCountsByRoom', 123) and the method will execute on the server and return its response. This is more the traditional Ajax way of doing things, and you lose all of Meteor’s reactivity.
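A minimal sketch of that method-based approach (names are illustrative, reusing Messages from the example above):
// server
Meteor.methods({
  getCountsByRoom: function (roomId) {
    return Messages.find({ roomId: roomId }).count();
  }
});
// client: no collection involved, but no reactivity either
Meteor.call('getCountsByRoom', 123, function (error, count) {
  if (!error) console.log('Current room has ' + count + ' messages.');
});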
Just to add another easy solution: you can pass connection: null to your Collection instantiation on your server. Even though this is not well-documented, I heard from the Meteor folks that it makes the collection in-memory.
Here's example code posted by Emily Stark a year ago:
if (Meteor.isClient) {
Test = new Meteor.Collection("test");
Meteor.subscribe("testsub");
}
if (Meteor.isServer) {
Test = new Meteor.Collection("test", { connection: null });
Meteor.publish("testsub", function () {
return Test.find();
});
Test.insert({ foo: "bar" });
Test.insert({ foo: "baz" });
}
Edit
This should probably go in a comment, but it turned out too long, so I'm posting it as an answer. Or perhaps I misunderstood your question?
I wonder why you are against mongo. I somehow find it a good match with Meteor.
Anyway, everyone's use case can be different, and your idea is doable, but not without some serious hacks.
If you look at the Meteor source code, you can find tools/run-mongo.js; it's where Meteor talks to mongo. You could tweak it or implement your own adaptor to work with your in-memory objects.
Another approach I can think of would be to wrap your in-memory objects and write a database layer that intercepts the existing mongo communications (default port 27017); you would have to take care of all the environment variables like MONGO_URL etc. to make it work properly.
The final approach is to wait until Meteor officially supports other databases like Redis.
Hope this helps.
Decided to test out Meteor JS today to see if I would be interested in building my next project with it and decided to start out with the Deps library.
To get something up extremely quickly to test this feature out, I am using the 500px API to simulate changes. After reading through the docs quickly, I thought I would have a working example of it on my local box.
The function seems to only autorun once, which is not how it is supposed to work based on my initial understanding of this feature in Meteor.
Any advice would be greatly appreciated. Thanks in advance.
if (Meteor.isClient) {
var Api500px = {
dep: new Deps.Dependency,
get: function () {
this.dep.depend();
return Session.get('photos');
},
set: function (res) {
Session.set('photos', res.data.photos);
this.dep.changed();
}
};
Deps.autorun(function () {
Api500px.get();
Meteor.call('fetchPhotos', function (err, res) {
if (!err) Api500px.set(res);
else console.log(err);
});
});
Template.photos.photos = function () {
return Api500px.get();
};
}
if (Meteor.isServer) {
Meteor.methods({
fetchPhotos: function () {
var url = 'https://api.500px.com/v1/photos';
return HTTP.call('GET', url, {
params: {
consumer_key: 'my_consumer_key_here',
feature: 'fresh_today',
image_size: 2,
rpp: 24
}
});
}
});
}
Welcome to Meteor! A couple of things to point out before the actual answer...
Session variables have reactivity built in, so you don't need to use the Deps package to add Deps.Dependency properties when you're using them. This isn't to suggest you shouldn't roll your own reactive objects like this. But if you do, their get and set functions should return and update a normal JavaScript property of the object (like value, for example), rather than a Session variable, with the reactivity provided by the depend and changed methods of the dep property. The alternative would be to just use the Session variables directly and not bother with the Api500px object at all.
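For example, the poster's object reworked along those lines (a sketch, keeping the original names):
var Api500px = {
    dep: new Deps.Dependency,
    value: [],                       // plain property instead of a Session variable
    get: function () {
        this.dep.depend();           // register the dependency
        return this.value;
    },
    set: function (res) {
        this.value = res.data.photos;
        this.dep.changed();          // invalidate dependent computations
    }
};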
It's not clear to me what you're trying to achieve reactively here - apologies if it should be. Are you intending to repeatedly run fetchPhotos in an infinite loop, such that every time a result is returned the function gets called again? If so, it's really not the best way to do things - it would be much better to subscribe to a server publication (using Meteor.subscribe and Meteor.publish), have that publication function run the API call at whatever regularity is required, and then publish the results to the client. That would dramatically reduce client-server communication with the same net result.
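A rough sketch of that publish-based shape (the "photos" collection name and the polling interval are made up; error handling omitted):
// server: fetch from the API on an interval and publish the latest result
Meteor.publish('photos', function () {
  var self = this;
  var fetch = function () {
    // the synchronous form of HTTP.call, available on the server
    return HTTP.call('GET', 'https://api.500px.com/v1/photos', {
      params: { consumer_key: 'my_consumer_key_here', feature: 'fresh_today', image_size: 2, rpp: 24 }
    });
  };
  self.added('photos', 'fresh_today', { photos: fetch().data.photos });
  self.ready();
  var interval = Meteor.setInterval(function () {
    self.changed('photos', 'fresh_today', { photos: fetch().data.photos });
  }, 60 * 1000);
  self.onStop(function () {
    Meteor.clearInterval(interval);
  });
});
// client
Photos = new Meteor.Collection('photos');
Meteor.subscribe('photos');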
Having said all that, why would it only be running once? The two possible explanations that spring to mind are that an error is being returned (and thus Api500px.set is never called), or that a Session.set call doesn't actually fire a dependency changed event if the new value is the same as the existing value. However, in the latter case I would still expect your function to run repeatedly, because you have your own depend and changed structure surrounding the Session variable, and it does not implement that self-limiting logic. Having Api500px.get in the autorun should therefore mean that it reruns when Api500px.set returns, even if the Session.set inside it isn't actually doing anything. If it's not the former diagnosis then I'd just log everything in sight and the answer should present itself.
I am writing a Backbone application, and I need to offer some feedback to users whenever a request to the server is made (annoying, I know, but I have no control over this behaviour of the application). The backend always reports an informative (at least in theory) message with every response, like
{
"status":"error",
"message":"something went really wrong"
}
or
{
"status":"success",
"message":"congratulations",
"data":{...}
}
What I would like to understand is where to put a hook for some kind of messaging service.
One possibility is the parse() method for models and collections. To avoid duplication, I would have to put it inside some model base class. It is still a bit annoying since all models and collections have their own parse() anyway.
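If you do go the base-class route, a sketch might look like this (notify is a stand-in for whatever messaging service you end up using):
var BaseModel = Backbone.Model.extend({
    parse: function (response) {
        if (response && response.message) {
            notify(response.status, response.message);  // hypothetical messaging helper
        }
        return response.data || response;               // hand Backbone only the payload
    }
});
var BaseCollection = Backbone.Collection.extend({
    parse: BaseModel.prototype.parse
});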
A more reasonable place to look would be the Backbone.sync function. But I do not want to overwrite it; instead I would like to wrap it inside some other helper function. The problem is that I cannot find a good hook for logic that should be executed with every request.
Do you have any suggestions on how to organize some piece of logic to be executed with every request?
Since Backbone.sync returns whatever $.ajax returns, it is easy to achieve what I want by using jQuery deferreds, like this:
var originalMethod = Backbone.sync;
Backbone.sync = function(method, model, options) {
var request = originalMethod.call(Backbone, method, model, options);
request.done(function(msg) {
console.log(msg);
});
request.fail(function(jqXHR, textStatus) {
console.log(jqXHR, textStatus);
});
return request;
};
Assuming you are using a recent (> 1.5) jQuery, everything that goes through sync will return the $.ajax promise.
You can then do what you want without overriding anything in sync, by using that promise. For example, if you did a fetch(), you could do:
var p = mymodel.fetch();
p.done(function (res) { ... });
p.fail(function (err) { ... });
Of course you can also use callbacks in the fetch options, but I find the above much cleaner. The same pattern applies for, say, save or anything else that uses sync.
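For instance, a save goes through sync too, so the same promise handlers apply (a minimal sketch):
mymodel.save({ name: 'new name' })
    .done(function (res) {
        console.log(res.message);          // e.g. "congratulations"
    })
    .fail(function (jqXHR, textStatus) {
        console.log(jqXHR, textStatus);
    });
// note: save returns false (not a promise) if client-side validation fails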