How to build an object of friends and likes - javascript

This is partly a JavaScript technique question. I am trying to build an object with the Facebook ID as the key and an array of likes as the value. My issue is that in my innermost function, I cannot access the variable fbid that I need for setting the key.
How to get access to fbid in the scope of the inner anonymous function?
friendsLikes = [];
FB.api('/me/friends', function(friends){
    for(var i = 0; i < friends.data.length; i++){
        var fbid = friends.data[i].id;
        FB.api(fbid + '/likes', function(likes){
            if(likes.data.length >= 1){
                // this is where I build the object
                // I cannot use fbid for the key :(
                console.log(likes.data.length);
            }
        });
    }
});

The problem is that fbid is updated on each iteration, before your callback executes.
1. Loop through the friends list.
2. Set fbid to the current friend's ID.
3. Initiate an async call to get that friend's likes.
4. Continue to the next friend until the entire list has been looped through.
5. Some time passes.
6. Results come back from the server and the callbacks begin executing. At this point, fbid is the same for all callbacks; specifically, it is set to the ID of the last friend in the list.
Here's how I'd capture the fbid of each iteration:
friendsLikes = [];
FB.api('/me/friends', function(friends){
    for(var i = 0; i < friends.data.length; i++){
        var fbid = friends.data[i].id;
        FB.api(fbid + '/likes', (function(fbid){
            return function(likes){
                if(likes.data.length >= 1){
                    // `fbid` will be correct here
                    console.log(likes.data.length);
                }
            };
        })(fbid));
    }
});
Notice that we use a self-executing (immediately-invoked) function expression and pass it the current fbid. It returns a callback that has the proper fbid captured in its scope.
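A rough, untested alternative sketch: Array.prototype.forEach gives each iteration its own callback scope, so the wrapper function isn't needed.
friendsLikes = [];
FB.api('/me/friends', function(friends) {
    friends.data.forEach(function(friend) {
        // Each run of this callback has its own `friend`, so the inner
        // likes callback closes over the right ID.
        FB.api(friend.id + '/likes', function(likes) {
            if (likes.data.length >= 1) {
                console.log(friend.id, likes.data.length);
            }
        });
    });
});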
And now a note on how you're doing this: this code's performance is going to suck because you're paying for a HTTP roundtrip for each friend. Remember that a browser will only open somewhere between 2-8 connections per host (depending on browser), and all of these calls are going to graph.facebook.com.
With a modest friend list of 200 and a generous roundtrip time of 150ms, the theoretical best case scenario is ~4 seconds. Things quickly go downhill if the browser will only do 2 concurrent connections and we have a 200ms roundtrip time: 20 seconds.
It's also likely that Facebook will rate-limit you at some point.
Instead, you need to use the Batch API.
FB.api('/', 'POST', { batch: [
    { method: 'GET', relative_url: id + '/likes' },
    ...
] }, function(r) {
    // `r` will be an array of results for each item in `batch`
});
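A rough, untested sketch of how that might look end to end: build the batch from the friends list and match each result back to its friend ID by position. This assumes results come back in the same order as the batch entries and that each result's body is a JSON string; also note the Batch API limits how many requests you can send per call (around 50), so a large friend list needs chunking.
FB.api('/me/friends', function(friends) {
    var ids = friends.data.map(function(f) { return f.id; });
    var batch = ids.map(function(id) {
        return { method: 'GET', relative_url: id + '/likes' };
    });
    FB.api('/', 'POST', { batch: batch }, function(r) {
        var friendsLikes = {};
        for (var i = 0; i < r.length; i++) {
            // Each batch result arrives with its body as a JSON string.
            var body = JSON.parse(r[i].body);
            friendsLikes[ids[i]] = body.data || [];
        }
        console.log(friendsLikes);
    });
});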

Related

How to throw a custom message using Dialogflow after three times of fallback

I am developing a chatbot using Dialogflow. I would like to show a message to the user when the chatbot doesn't understand the user's input three times in a row, and on the fourth time respond with a custom message (not one of the options declared in the Dialogflow interface).
One idea that I have is to make a counter within the input unknown action like this:
var counter = 1;
// The default fallback intent has been matched, try to recover (https://dialogflow.com/docs/intents#fallback_intents)
'input.unknown': () => {
    // Use the Actions on Google lib to respond to Google requests; for other requests use JSON
    if (requestSource === googleAssistantRequest) {
        sendGoogleResponse('I\'m having trouble, can you try that again?'); // Send simple response to user
    } else {
        if (counter == 3) {
            counter = 1;
            sendResponse('Custom message');
        } else {
            counter++;
            sendResponse('I\'m having trouble, can you try that again?'); // Send simple response to user
        }
    }
},
This would work, but I don't know if it will work for multiple users at the same time. I was thinking of creating a storage for the requests, keyed by a unique ID, with a different counter for each request.
Do you have any better idea of achieving such thing in Dialogflow?
This will not work the way you've designed it. Not quite for the reason you think, but close.
You don't show the rest of your code (that's ok), but the counter variable is probably in a function that gets called each time it processes a message. When that function is finished, the counter variable goes out of scope - it is lost. Having multiple calls at the same time won't really be an issue since each call gets a different scope (I'm glossing over some technical details, but this should be good enough).
One solution is that you could store the variable in a global context - but then you do have the issue of multiple users ending up with the same counter. That is very very bad.
Your solution about keeping a counter in a database, keyed against the user, does make sense. But for this need, it is overkill. It is useful for saving data between conversations, but there are better ways to save information during the same conversation.
The easiest solution would be to use a Dialogflow Context. Contexts let you save state in between calls to your webhook fulfillment during the same conversation and for a specific number of messages received from the user (the lifespan).
In this case, it would be best if you created a context named something like unknown_counter with a lifespan of 1. In the parameters, you might set val to 1.
The lifespan of 1 means that you'll only see this context the next time your webhook is called. If the next message is handled through some other Intent (i.e., you understood them), then the context will just vanish after your fulfillment runs.
But if your input.unknown handler is called again, then you would see the context was there and what the value is. If it doesn't meet the threshold, send the context again (with a lifespan of 1 again), but with the value being incremented by 1. If it did meet the threshold - you'd reply with some other answer and close the connection.
By "send the context", I mean that the context would be included as part of the reply. So instead of sending just a string to sendGoogleResponse() or sendResponse() you would send an object that included a speech property and an outputContexts property. Something like this:
var outputContexts = [
    {
        name: 'unknown_counter',
        lifespan: 1,
        parameters: {
            'val': counterValue
        }
    }
];
sendResponse({
    speech: "I'm confused. What did you say?",
    outputContexts: outputContexts
});
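For completeness, a rough, untested sketch of how the input.unknown handler could read the incoming context and bump the counter. The request.body.result.contexts path and the context shape are assumptions based on the v1-style webhook used above.
'input.unknown': () => {
    // Look for the counter context sent along with this request (assumed shape).
    var contexts = (request.body.result && request.body.result.contexts) || [];
    var counterContext = contexts.filter(function (c) {
        return c.name === 'unknown_counter';
    })[0];
    var counterValue = counterContext ? parseInt(counterContext.parameters.val, 10) : 0;

    if (counterValue >= 2) {
        // Third miss in a row: send the custom message and let the context expire.
        sendResponse('Custom message');
    } else {
        // Re-send the context with lifespan 1 and an incremented counter.
        sendResponse({
            speech: 'I\'m having trouble, can you try that again?',
            outputContexts: [{
                name: 'unknown_counter',
                lifespan: 1,
                parameters: { val: counterValue + 1 }
            }]
        });
    }
},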

Nodejs Mongoose - Serve clients a single query result

I'm looking to implement a solution where I can query the Mongoose Database on a regular interval and then store the results to serve to my clients.
I'm assuming this will reduce my response time when my users pull the collection.
I attempted to implement this plan by creating an empty global object and then writing a function that queries the db and stores the results in that global object. At the end of the function I set a 60-second timeout that runs the function again. I call this function once, the first time the server controller gets called after the app starts.
I then set my clients up so that when they requested the collection, it would first look to see if the global object exists, and if so return that as the response. I figured this would cut my 7-10 second queries down to < 1 sec.
In my novice thinking I assumed that, Node.js being 'single-threaded', something like this could work quite well, but it just seemed to eat up all my RAM and cause fatal errors.
Am I on the right track with my thinking or is it better to query the db every time people pull the collection?
Here is the code in question:
var allLeads = {};
var getAllLeads = function(){
    allLeads = {};
    console.log('Getting All Leads...');
    Lead.find().sort('-lastCalled').exec(function(err, leads) {
        if (err) {
            console.log('Error getting leads');
        } else {
            allLeads = leads;
        }
    });
    setTimeout(function(){
        getAllLeads();
    }, 60000);
};
getAllLeads();
Thanks in advance for your assistance.
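For reference, a rough, untested sketch of the caching pattern described here. The .lean() call (plain objects instead of full Mongoose documents, to cut memory use) and the Express-style app.get route are illustrative assumptions rather than part of the original setup.
var allLeads = [];
function refreshLeads() {
    Lead.find().sort('-lastCalled').lean().exec(function(err, leads) {
        if (err) {
            console.log('Error getting leads:', err);
        } else {
            allLeads = leads;
        }
        // Schedule the next refresh only after this one has finished.
        setTimeout(refreshLeads, 60000);
    });
}
refreshLeads();

// Clients are served the cached copy instead of triggering a new query.
app.get('/leads', function(req, res) {
    res.json(allLeads);
});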

Parse Cloud Code Ending Prematurely?

I'm writing a job that I want to run every hour in the background on Parse. My database has two tables. The first contains a list of Questions, while the second lists all of the user/question agreement pairs (QuestionAgreements). Originally my plan was just to have the client count the QuestionAgreements itself, but I'm finding that this results in a lot of requests that really could be done away with, so I want this background job to run the count and then update a field directly on Question with it.
Here's my attempt:
Parse.Cloud.job("updateQuestionAgreementCounts", function(request, status) {
    Parse.Cloud.useMasterKey();
    var query = new Parse.Query("Question");
    query.each(function(question) {
        var agreementQuery = new Parse.Query("QuestionAgreement");
        agreementQuery.equalTo("question", question);
        agreementQuery.count({
            success: function(count) {
                question.set("agreementCount", count);
                question.save(null, null);
            }
        });
    }).then(function() {
        status.success("Finished updating Question Agreement Counts.");
    }, function(error) {
        status.error("Failed to update Question Agreement Counts.");
    });
});
The problem is, this only seems to be running on a few of the Questions, and then it stops, appearing in the Job Status section of the Parse Dashboard as "succeeded". I suspect the problem is that it's returning prematurely. Here are my questions:
1 - How can I keep this from returning prematurely? (Assuming this is, in fact, my problem.)
2 - What is the best way of debugging cloud code? Since this isn't client side, I don't have any way to set breakpoints or anything, do I?
status.success is called before the asynchronous success callbacks of count have finished. To prevent this, you can use promises here. Check the docs for Parse.Query.each.
Iterates over each result of a query, calling a callback for each one. If the callback returns a promise, the iteration will not continue until that promise has been fulfilled.
So, you can chain the count promise:
return agreementQuery.count().then(function (count) {
    question.set("agreementCount", count);
    return question.save(null, null);
});
You can also use parallel promises to make it more efficient.
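For instance, a rough, untested sketch of that parallel variant: collect one promise per Question and wait on all of them with Parse.Promise.when before reporting success.
var promises = [];
query.each(function(question) {
    var agreementQuery = new Parse.Query("QuestionAgreement");
    agreementQuery.equalTo("question", question);
    promises.push(agreementQuery.count().then(function(count) {
        question.set("agreementCount", count);
        return question.save(null, null); // chain the save so it is waited on too
    }));
}).then(function() {
    return Parse.Promise.when(promises);
}).then(function() {
    status.success("Finished updating Question Agreement Counts.");
});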
There are no breakpoints in Cloud Code, which makes Parse really hard to use. The only way is to log your variables with console.log.
I was able to utilize promises, as suggested by knshn, to make it so that my code would complete before running success.
Parse.Cloud.job("updateQuestionAgreementCounts", function(request, status) {
    Parse.Cloud.useMasterKey();
    var promises = []; // Set up a list that will hold the promises being waited on.
    var query = new Parse.Query("Question");
    query.each(function(question) {
        var agreementQuery = new Parse.Query("QuestionAgreement");
        agreementQuery.equalTo("question", question);
        agreementQuery.equalTo("agreement", 1);
        // Make sure that the count finishes running first!
        promises.push(agreementQuery.count().then(function(count) {
            question.set("agreementCount", count);
            // Make sure that the object is actually saved first!
            promises.push(question.save(null, null));
        }));
    }).then(function() {
        // Before exiting, make sure all the promises have been fulfilled!
        Parse.Promise.when(promises).then(function() {
            status.success("Finished updating Question Agreement Counts.");
        });
    });
});

Strange issue with socket.on method

I am facing a strange issue with calling socket.on methods from the Javascript client. Consider below code:
for(var i = 0; i < 2; i++) {
    var socket = io.connect('http://localhost:5000/');
    socket.emit('getLoad');
    socket.on('cpuUsage', function(data) {
        document.write(data);
    });
}
Here I am basically listening for a cpuUsage event which is emitted by the socket server, but for each iteration I am getting the same value. This is the output:
0.03549148310035006
0.03549148310035006
0.03549148310035006
0.03549148310035006
Edit: Server side code, basically I am using node-usage library to calculate CPU usage:
socket.on('getLoad', function (data) {
    usage.lookup(pid, function(err, result) {
        cpuUsage = result.cpu;
        memUsage = result.memory;
        console.log("Cpu Usage1: " + cpuUsage);
        console.log("Cpu Usage2: " + memUsage);
        /*socket.emit('cpuUsage', result.cpu);
        socket.emit('memUsage', result.memory);*/
        socket.emit('cpuUsage', cpuUsage);
        socket.emit('memUsage', memUsage);
    });
});
Whereas on the server side, I am getting different values for each emit and socket.on. I find it very strange that this is happening. I tried setting data = null after each socket.on call, but it still prints the same value. I didn't know what phrase to search for, so I posted here. Can anyone please guide me?
Please note: I am basically a Java developer and have less experience on the JavaScript side.
You are making the assumption that when you use .emit(), a subsequent .on() will wait for a reply, but that's not how socket.io works.
Your code basically does this:
it emits two getLoad messages directly after each other (which is probably why the returning value is the same);
it installs two handlers for a returning cpuUsage message being sent by the server;
This also means that each time you run your loop, you're installing more and more handlers for the same message.
Now I'm not sure what exactly it is you want. If you want to periodically request the CPU load, use setInterval or setTimeout. If you want to send a message to the server and want to 'wait' for a response, you may want to use acknowledgement functions (not very well documented, but see this blog post).
But you should assume that for each type of message, you should only call socket.on('MESSAGETYPE', ...) once during the runtime of your code.
EDIT: here's an example client-side setup for a periodic poll of the data:
var socket = io.connect(...);
socket.on('connect', function() {
    // Handle the server response:
    socket.on('cpuUsage', function(data) {
        document.write(data);
    });
    // Start an interval to query the server for the load every 30 seconds:
    setInterval(function() {
        socket.emit('getLoad');
    }, 30 * 1000); // milliseconds
});
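If you would rather have a request/response style exchange than a broadcast event, socket.io acknowledgement callbacks fit: pass a function as the last argument to emit and the server receives it as the last argument of its handler. A rough, untested sketch:
// Client: the callback passed to emit is invoked with the server's reply.
socket.emit('getLoad', function(cpuUsage) {
    document.write(cpuUsage);
});

// Server: calling the acknowledgement function replies to this request only.
socket.on('getLoad', function(ack) {
    usage.lookup(pid, function(err, result) {
        ack(result.cpu);
    });
});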
Use this line instead:
var socket = io.connect('iptoserver', {'force new connection': true});
Replace iptoserver with the actual IP of the server, of course; in this case, localhost.
Edit.
That is, if you want to create multiple clients.
Otherwise, you have to place the initialization of the socket variable before the for loop.
I suspected the call returns the average CPU usage since the process started, which seems to be the case here. Checking the node-usage documentation page (average-cpu-usage-vs-current-cpu-usage), I found:
By default CPU Percentage provided is an average from the starting time of the process. It does not correctly reflect the current CPU usage. (this is also a problem with linux ps utility)
But If you call usage.lookup() continuously for a given pid, you can turn on keepHistory flag and you'll get the CPU usage since last time you track the usage. This reflects the current CPU usage.
The docs also give an example of how to use it:
var pid = process.pid;
var options = { keepHistory: true };
usage.lookup(pid, options, function(err, result) {
    // result.cpu now reflects usage since the previous lookup
});

NodeJS: How to handle a variable number of callbacks run in parallel and map their responses to requests?

As an exercise to teach myself more about node js I started making a basic CRUD REST server for SimpleDB (sdb) using the aws-sdk.
Everything was running smoothly until I got to a function for reading the domains. The aws-sdk has two functions for this purpose: listDomains and domainMetadata. listDomains returns an array of sdb domain names. domainMetadata will return additional statistics about a domain, but will only return them for one domain at a time. It does not include the domain name in the results.
My script is running listDomains and returning an array in the JSON response just fine. I would like to make my api readDomains function more ambitious though and have it return the metadata for all of the domains in the same single api call. After all, running a handful of domainMetadata calls at the same time is where node's async io should shine.
The problem is I can't figure out how to run a variable number of calls, use the same callback for all of them, match the results of each domainMetadata call to its domainName (since it's async and they're not guaranteed to return in the order they were requested), and tell when all of the metadata requests have finished so that I can send my final response. Put into code, my problem areas are:
domain.receiveDomainList = function(err, data){
    var domainList = [];
    for(var i = 0; i < data.DomainNames.length; i++){
        sdb.domainMetaData({"DomainName": data.DomainNames[i]}, domain.receiveMetadata);
        // alternatively: domainList.push({"DomainName": data.DomainNames[i]});
    }
    // alternatively:
    // async.map(domainList, sdb.domainMetadata, domain.receiveMetadata)
    console.log(domainList);
}

domain.receiveMetadata = function (err, data){
    // I figure I can stash the results one at a time in an array in the
    // parent scope but...
    // How can I tell when all of the results have been received?
    // Since the domainname used for the original call is not returned with
    // the results how do I tell what result matches what request?
}
Based on my reading of async's readme, the map function should at least match the metadata responses with the requests through some black magic, but it causes Node to bomb out in the aws-sdk library with an error of "has no method 'makeRequest'".
Is there any way to have it all: requests run in parallel, requests matched with responses and knowing when I've received everything?
Using .bind() you can set the context (the this value) as well as provide leading default arguments to the bound function.
The sample code below is purely to show how you might use .bind() to add additional context to your response callbacks.
In the code below, .bind is used to:
- set a domainResults object as the context for the receiveMetadata callback
- pass the current domain name as an argument to the callback
The domainResults object is used to:
- keep track of the number of names received in the first request
- keep track of the completedCount (incremented on each callback from the metaData request)
- keep track of both error and success responses in list
- provide a complete callback
Completely untested code for illustrative purposes only:
domain.receiveDomainList = function(err, data) {
    // Assuming err is falsey
    var domainResults = {
        nameCount: data.DomainNames.length,
        completeCount: 0,
        list: {},
        complete: function() {
            console.log(this.list);
        }
    };
    for (var i = 0; i < data.DomainNames.length; i++) {
        sdb.domainMetaData({ "DomainName": data.DomainNames[i] },
            domain.receiveMetadata.bind(domainResults, data.DomainNames[i]));
    }
}

domain.receiveMetadata = function(domainName, err, data) {
    // Because of .bind, this === domainResults
    this.completeCount++;
    this.list[domainName] = data || { error: err };
    if (this.completeCount === this.nameCount) {
        this.complete();
    }
}
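As for the async.map error mentioned in the question: passing sdb.domainMetadata as a bare function detaches it from sdb, which is one plausible cause of the "has no method 'makeRequest'" failure. Here is a rough, untested sketch that wraps the SDK call in a closure so each result stays paired with its domain name; it assumes the async library and the aws-sdk's domainMetadata(params, callback) signature.
var async = require('async');

domain.receiveDomainList = function(err, data) {
    async.map(data.DomainNames, function(name, callback) {
        // Calling the method on sdb keeps the SDK's internal `this` intact.
        sdb.domainMetadata({ "DomainName": name }, function(err, metadata) {
            // Pair the name with its metadata so responses stay matched to requests.
            callback(err, { name: name, metadata: metadata });
        });
    }, function(err, results) {
        // `results` is in the same order as data.DomainNames.
        console.log(err || results);
    });
};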
