Say I have a messaging service that schedules messages to friends,
and a user uploads their friends along with when they want each message sent.
But say 10000 milliseconds after scheduling, the uploader wants to take bob out of the (for) loop. How do I take bob out without canceling the scheduler? Or is there a better way to do this? (It's on a Node server.)
var friends = ['John', 'bob', 'billy', 'dan'];
for (i in friends) {
  setTimeout(function () {
    sendMessage(friends[i]);
  }, 3000000);
}
I feel like there is a much better way to do this but I haven't found anything.
Thanks, I'm new to JS and appreciate the help!
setTimeout returns an object. Store that somewhere, and then call clearTimeout with that object if you want to cancel it, as described here:
https://stackoverflow.com/a/6394645/1384352
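For illustration, a minimal sketch of that, reusing sendMessage from the question:

// keep the handle setTimeout returns so the message can be cancelled later
var bobTimer = setTimeout(function () {
  sendMessage('bob');
}, 3000000);

// ~10 seconds later, if bob should no longer get a message:
clearTimeout(bobTimer);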
The way you have your code, all the timeouts will expire at the same time.
Instead you could only initiate the next timeout when the current one finishes. In the following demo 'Billy' is removed while the messages are being sent with an interval of 1 second. And indeed, Billy gets no message:
var friends = ['John', 'Bob', 'Billy', 'Dan'];

(function loop(i) {
  setTimeout(function () {
    if (i >= friends.length) return; // all done
    sendMessage(friends[i]);
    loop(i + 1); // next
  }, 1000);
})(0); // call the function with index 0

function sendMessage(friend) { // mock
  console.log('send message to ' + friend);
}

// now remove 'Billy' after 2.5 seconds
setTimeout(function () {
  console.log('remove ' + friends[2]);
  friends.splice(2, 1);
}, 2500);
One option could be to add a flag marking whether a user should be removed from the recipients; if the flag is set, you skip sending to that person and clean up the marked users later, e.g.

if (!friends[i].isRemoved) {
  setTimeout.....
}

The other option would be to define an individual timer for each friend; then you can cancel whichever one you want.
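A rough, untested sketch of that second option, reusing friends and sendMessage from the question:

var timers = {};

friends.forEach(function (friend) {
  // one timer per friend, keyed by name, so each can be cancelled individually
  timers[friend] = setTimeout(function () {
    sendMessage(friend);
  }, 3000000);
});

// 10000 ms later, cancel only bob's message without touching the others
setTimeout(function () {
  clearTimeout(timers['bob']);
  delete timers['bob'];
}, 10000);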
I made a premium membership system for my Discord bot, but after a while the remaining time becomes negative, so I want to delete the data from the database when the time I set with ms expires. I tried something like that but it didn't work.
I save the data as follows:
It looks like you are setting the interval to an incredibly long time, because you are storing the specific time you want the function to run rather than a duration. You'll probably want to do something like this:
let interval = sure - new Date(); // `sure` is the stored expiration time
if (interval < 0) { // this expiration date already passed
  interval = 0;
}
setInterval(function () {
  db.delete(` ... `);
  db.delete(` ... `);
}, interval);
However! If you do all this multiple times (like inside the 'message' handler like you're doing right now), you're gonna have a memory leak. Make sure you are setting the intervals only once.
Also, if your program crashes, you'll have to set up all the intervals again at startup.
If I were making something like this, I would instead make a cron job that checks only once per day and deletes all the expired members, instead of using setInterval.
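For example, with the node-cron package it could look roughly like this; removeExpiredMembers is a hypothetical helper you would write to query and delete the expired rows:

const cron = require('node-cron');

// run once a day at midnight and clean up everything that has expired
cron.schedule('0 0 * * *', () => {
  removeExpiredMembers(); // hypothetical: delete every member whose expiry date is in the past
});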
The chrome.cookies API is not clear to me. I want to getAll cookies for 3 different domains, delete those, wait for that process to finish, and afterwards set some cookies.
If I understand correctly, chrome.cookies.getAll does not return a promise; I can only define a callback. It's clear to me that I can write callbacks for all three getAll calls, but inside those callbacks I am removing several cookies, and that again happens asynchronously. So I am totally lost on how to tell when all cookies of the 3 domains have been completely removed.
One option I could think of is to run the 3 getAll calls once ahead of time and count the cookies, then increment a counter with every remove and check in the remove callback whether I have reached the total. This seems very strange, so I can't believe it is the correct way to do it.
Thanks
I don't think it's optimal, but this is a quick answer.
function RemoveCookies(cookies, domain) {
  for (var i = 0; i < cookies.length; i++) {
    chrome.cookies.remove({ url: 'https://' + domain + cookies[i].path, name: cookies[i].name });
  }
}

function RemoveDomain1(callback, ...params) {
  chrome.cookies.getAll({ domain: domain1 }, function (cookies) {
    RemoveCookies(cookies, domain1);
    callback(...params);
  });
}

function RemoveDomain2(callback, ...params) {
  chrome.cookies.getAll({ domain: domain2 }, function (cookies) {
    RemoveCookies(cookies, domain2);
    callback(...params);
  });
}

function RemoveDomain3(callback, ...params) {
  chrome.cookies.getAll({ domain: domain3 }, function (cookies) {
    RemoveCookies(cookies, domain3);
    callback();
  });
}

RemoveDomain1(RemoveDomain2, RemoveDomain3, DoSomethingAfterAll);
Check this link too, maybe it helps.
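If you need to know when every removal has actually finished (the original concern), one possible alternative, not from the answer above, is to wrap the callback-style calls in Promises and wait on them with Promise.all:

function removeAllCookies(domain) {
  return new Promise(function (resolve) {
    chrome.cookies.getAll({ domain: domain }, function (cookies) {
      // one Promise per cookie, resolved by chrome.cookies.remove's callback
      var removals = cookies.map(function (cookie) {
        return new Promise(function (done) {
          chrome.cookies.remove(
            { url: 'https://' + domain + cookie.path, name: cookie.name },
            done
          );
        });
      });
      Promise.all(removals).then(resolve);
    });
  });
}

// remove the cookies of all three domains, then set the new ones
Promise.all([domain1, domain2, domain3].map(removeAllCookies)).then(function () {
  // every cookie of the 3 domains has been removed at this point
});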
I'm looking to implement a solution where I can query the Mongoose Database on a regular interval and then store the results to serve to my clients.
I'm assuming this will reduce my response time when my users pull the collection.
I attempted to implement this plan by creating an empty global object and then writing a function that queries the db and stores the results in that global object. At the end of the function I set a timeout for 60 seconds and then run the function again. I call this function the first time the server controller gets called when the app first runs.
I then set my clients up so that when they request the collection, it first looks to see if the global object exists, and if so returns that as the response. I figured this would cut my 7-10 second queries down to < 1 sec.
In my novice thinking I assumed that, Node.js being 'single-threaded', something like this could work quite well, but it just seemed to eat up all my RAM and cause fatal errors.
Am I on the right track with my thinking or is it better to query the db every time people pull the collection?
Here is the code in question:
var allLeads = {};

var getAllLeads = function () {
  allLeads = {};
  console.log('Getting All Leads...');
  Lead.find().sort('-lastCalled').exec(function (err, leads) {
    if (err) {
      console.log('Error getting leads');
    } else {
      allLeads = leads;
    }
  });
  setTimeout(function () {
    getAllLeads();
  }, 60000);
};

getAllLeads();
Thanks in advance for your assistance.
I have an application in which a user can manage multiple alarms (set a new alarm, edit it, or snooze)
When the alarm time is reached I need to notify the user. If a user clicks snooze, the time_end field gets 5 minutes added to it.
How can I track when the alarm times have been reached?
I have tried using a collection.observe() - but it only works once, on server start
Server: Meteor Startup
var expiredAlarms = Alarms.find({ $and: [{ "time_end": { $lt: moment()._d } }, { notification_sent: false }] });

expiredAlarms.observe({
  added: function (doc) {
    console.log('alarm timeout has been reached');
    processAlarmEnd(); // set notification_sent to true
  },
  removed: function (doc) {
    console.log('A notification has been sent');
  }
});
The above code only works when the app is started and processes all notifications for expired alarms - but once new alarms expire, nothing happens. My guess is because moment()._d does not change and the same old query is called over and over again.
Should I place this code in a Tracker.autorun - will that help? Any suggestions on doing this better/smarter?
To make code execute in the future, you need to use setTimeout. In your observe query, don't filter on the time; in the added callback, calculate how many milliseconds remain until the alarm should go off, and use setTimeout to call processAlarmEnd after that many milliseconds.
You may also need to cancel a setTimeout with clearTimeout in the removed callback.
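A rough sketch of what that could look like, reusing the question's field names (time_end, notification_sent) and its processAlarmEnd helper; Meteor.setTimeout and Meteor.clearTimeout are Meteor's wrappers around the plain timer functions:

var alarmTimers = {};

Alarms.find({ notification_sent: false }).observe({
  added: function (doc) {
    // milliseconds until the alarm should go off (never negative)
    var delay = Math.max(0, doc.time_end - new Date());
    alarmTimers[doc._id] = Meteor.setTimeout(function () {
      processAlarmEnd(doc); // set notification_sent to true
    }, delay);
  },
  removed: function (doc) {
    // alarm deleted or notification_sent flipped to true: cancel the pending timer
    Meteor.clearTimeout(alarmTimers[doc._id]);
    delete alarmTimers[doc._id];
  }
});

Snoozing (adding 5 minutes to time_end) would also need a changed callback that clears the old timer and schedules a new one.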
This is partially a JavaScript technique question. I am trying to build an object with the Facebook id as the key and an array of likes as the value. My issue is that in my innermost function I cannot access the variable fbid that I need for setting the key.
How to get access to fbid in the scope of the inner anonymous function?
friendsLikes = [];
FB.api('/me/friends', function (friends) {
  for (var i = 0; i < friends.data.length; i++) {
    var fbid = friends.data[i].id;
    FB.api(fbid + '/likes', function (likes) {
      if (likes.data.length >= 1) {
        // this is where I build the object
        // I cannot use fbid for the key :(
        console.log(likes.data.length);
      }
    });
  }
});
The problem is that fbid is updated on each iteration, before your callback executes.
1. Loop through the friends list.
2. Set fbid to the friend's ID.
3. Initiate an async call to get that friend's likes.
4. Continue to the next friend until we've looped through the entire list.
5. Some time passes.
6. Results come back from the server and the callbacks begin executing. At this point, fbid is the same for all callbacks; specifically, it is set to the ID of the last friend in the list.
Here's how I'd capture the fbid of each iteration:
friendsLikes = [];
FB.api('/me/friends', function (friends) {
  for (var i = 0; i < friends.data.length; i++) {
    var fbid = friends.data[i].id;
    FB.api(fbid + '/likes', (function (fbid) {
      return function (likes) {
        if (likes.data.length >= 1) {
          // `fbid` will be correct here
          console.log(likes.data.length);
        }
      };
    })(fbid));
  }
});
Notice that we use a self-executing function and pass the current fbid. This returns a function that will have the proper fbid in scope.
And now a note on how you're doing this: this code's performance is going to suck, because you're paying for an HTTP roundtrip for each friend. Remember that a browser will only open somewhere between 2-8 connections per host (depending on browser), and all of these calls are going to graph.facebook.com.
With a modest friend list of 200 and a generous roundtrip time of 150ms, the theoretical best case scenario is ~4 seconds (200 requests / 8 connections × 150ms ≈ 3.75s). Things quickly go downhill if the browser will only do 2 concurrent connections and we have a 200ms roundtrip time: 200 / 2 × 200ms = 20 seconds.
It's also highly likely Facebook might rate-limit you at some point.
Instead, you need to use the Batch API.
FB.api('/', 'POST', { batch: [
  { method: 'GET', relative_url: id + '/likes' },
  ...
] }, function (r) {
  // `r` will be an array of results for each item in `batch`
});