I've created a cron-job to run a task and I want it to stop at some specific condition inside the scheduler, but it is not working.
How can I stop cron job inside the scheduler?
Here is my code:
// cron-job.js
const cron = require('node-cron');

const scheduler = cron.schedule('* * * * *', () => {
  let result = fetchResultAfterSomeOperation();
  if (result < 20) {
    scheduler.stop(); // ------------------- It doesn't work
  }
}, {
  scheduled: false
});
scheduler.start();
How are cron jobs created?
Traditionally, cron jobs are created by a parent process, so the parent process has the ability to kill or suspend its child processes. node-cron may work in a similar way.
Now, to your issue: you are submitting the cron task using cron.schedule(time, callback). The callback is going to run on a separate child process, so even though you are using the scheduler object to stop the cron task, it won't work from inside the callback. The scheduler can only stop the child process from the main process (i.e. the cron.js file).
So I advise you to refactor your code.
As suggested by @Faizul Ahemed, you may change the code to something like this:

const cron = require('node-cron');

let result = null;

const searchFunction = () => {
  if (!result) result = fetchResults();
};

const scheduler = cron.schedule('*/10 * * * * *', searchFunction, { scheduled: false });
scheduler.start();

setTimeout(() => { scheduler.stop(); }, 5 * 60 * 1000);
The above code will cause the fetchResults function to fetch/get data every x time for a duration of y set by the setTimeout.
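The original goal - stopping the job once a condition is met - can also be expressed with nothing but plain timers, independent of node-cron. This is a minimal sketch, where fetchResult is a stand-in for your own fetchResultAfterSomeOperation:

```javascript
// Run `task` every `intervalMs` and stop once `shouldStop(result)` returns true.
function runUntil(task, shouldStop, intervalMs) {
  return new Promise(resolve => {
    const timer = setInterval(() => {
      const result = task();
      if (shouldStop(result)) {
        clearInterval(timer); // the plain-timer equivalent of scheduler.stop()
        resolve(result);
      }
    }, intervalMs);
  });
}

// Stand-in for fetchResultAfterSomeOperation(): counts down from 25.
let value = 25;
const fetchResult = () => --value;

runUntil(fetchResult, r => r < 20, 10).then(r => {
  console.log(`stopped at ${r}`); // stops once the result drops below 20
});
```

Because the interval handle lives in the enclosing scope, the callback can always reach it to cancel itself, which is the same pattern the node-cron answer relies on.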
I'm trying to make a bot that sends a message every day at 5 AM EST, so I'm trying to create a cron job. This is what I have, but every time I run it, it sends the message straight away instead of at the time I want. Here's my code. I have it set to 5 AM in the code, but I change the time when I'm testing it out.
Thank you.
const e = require('express')
const client = new Discord.Client()
const config = require('./config.json')
const privateMessage = require('./private-message')
const cron = require('node-cron');
const express = require('express');

client.on('ready', () => {
  console.log('running');
})

cron.schedule('0 5 * * *', function() {
  console.log('cron is working');
}, {
  scheduled: true,
  timezone: "America/New_York"
});

client.login(config.token).then(() => {
  console.log('sending');
  client.users
    .fetch('749097582227357839').then((user) => {
      user.send(`hello`,);
    })
  console.log("nope");
  client.destroy();
});

client.login(config.token)
You have two client.login calls for some reason; please remove the first one.
You don't seem to be using cron in the correct way - the cron.schedule callback is where you put the code to repeat, not the code below it.
That is why your bot is sending the message immediately - the code is simply doing what you are asking it to do right after it schedules the cron job.
If you have everything else correct, your current code will actually log cron is working to the console at 5 AM every morning - the code below should achieve what you need.
To sum up:
//declare packages here
cron.schedule('0 5 * * *', function() {
  console.log('sending');
  // users.fetch() returns a Promise, so wait for it before sending
  client.users.fetch('749097582227357839').then((user) => user.send('hello'));
}, {
  scheduled: true,
  timezone: "America/New_York"
});
client.login(config.token); // only ever call login once - multiple login calls cause issues
There is also no need to call client.destroy() - it shouldn't make a difference here.
I'm brand spankin' new to rxjs, and would like to use it to build a video downloader. The intention is to run it 24/7 and automatically record an occasional livestream for later watching. Here is what I have so far.
import { BehaviorSubject, from, defer, of } from "rxjs";
import { delay, mergeMap, repeat, tap } from "rxjs/operators";
const downloader = url => {
  const defaultDelay = 1000;
  const maxDelay = 10000;
  const delayTime = new BehaviorSubject(defaultDelay);

  /*
   * Simulated download output.
   *
   * @return {String|Number} potentialOutput
   *   A {Number} 1 means "FAILURE, stream is offline."
   *   A {String} means "SUCCESS, video was downloaded."
   *   1 is the most likely value returned.
   *
   * credit: https://stackoverflow.com/a/8877271/1004931
   */
  function randomWithProbability() {
    var potentialOutput = [1, 1, 1, 1, 1, "/tmp/video.mp4"];
    var idx = Math.floor(Math.random() * potentialOutput.length);
    return potentialOutput[idx];
  }

  /**
   * Simulated download. Returns a promise which resolves after 1 second.
   */
  const download = url => {
    let downloadP = new Promise((resolve, reject) => {
      setTimeout(() => {
        resolve(randomWithProbability());
      }, 1000);
    });
    return from(downloadP);
  };

  /**
   * Conditionally adjust the delay in between download attempts.
   * - If the video downloaded successfully, reset the timer to its default.
   *   (In case the stream went down by error, we want to record again ASAP.)
   * - If the video stream was offline, increase the delay until our next download attempt.
   *   (We don't want to be rude and flood the server.)
   */
  const adjustTimer = (ytdlOutput) => {
    if (typeof ytdlOutput === 'string') {
      delayTime.next(defaultDelay); // video stream exited successfully, so reset in case the stream starts again
    } else {
      let adjustedTime = (delayTime.getValue() * 2 > maxDelay) ? maxDelay : delayTime.getValue() * 2;
      delayTime.next(adjustedTime); // video stream exited abnormally, likely because it is offline; wait longer until the next attempt
    }
  };

  /**
   * The Observable.
   * 1. Start with the URL of the video stream.
   * 2. Delay by the time defined in delayTime.
   * 3. Download, merging the download observable with the parent observable.
   * 4. Adjust the delayTime based on download output.
   * 5. Repeat the process indefinitely.
   */
  const stream = of(url)
    .pipe(
      delay(delayTime.getValue()),
      mergeMap(download),
      tap(res => {
        adjustTimer(res);
      }),
      repeat()
    );

  stream.subscribe(val => {
    console.log(
      `download result:${val}, delayTime:${delayTime.getValue()}`
    );
  });
};

downloader("https://example.com/files/video.mp4");
The problem I'm having is that the BehaviorSubject delayTime is not being applied on every iteration of my loop. delayTime is getting updated, as indicated by delayTime.getValue() being called in the subscriber's callback, but the changes aren't having any effect inside the observable pipeline.
Instead, I'm seeing that delayTime, as seen by the observable, stays the same as it was when the stream was first subscribed to. From the observable's point of view, the BehaviorSubject's value never changes, even though I want it to.
And this is where I'm stuck. How can I refactor my code to have a delay timer which changes over time and affects the delay until the next download attempt?
Ignore rxjs for a moment, and look at this code pretending you don't know what any of these functions mean:
const stream = of(url)
  .pipe(
    delay(delayTime.getValue()),
    mergeMap(download),
    tap(res => {
      adjustTimer(res);
    }),
    repeat()
  )
An anonymized, simple version would be
someFunc(delayTime.getValue())
The problem here is that delayTime.getValue() gets evaluated directly, not when someFunc runs. The same is true for your code above: the evaluation happens when the stream variable is created, not on every "iteration" (better word: emission).
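You can see this eager evaluation with nothing but plain functions - the argument is computed once, at call time, not each time the returned function later runs:

```javascript
let delayMs = 1000;

// someFunc captures whatever value its argument had at call time.
function someFunc(ms) {
  return () => `waiting ${ms}ms`;
}

const run = someFunc(delayMs); // delayMs is read HERE, once
delayMs = 2000;                // later updates are invisible to `run`

console.log(run()); // → "waiting 1000ms", not 2000
```

This is exactly what happens to `delay(delayTime.getValue())`: the value is baked in when the pipeline is built.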
The delay operator works only with a fixed delay. For your purpose you want to use delayWhen, which is evaluated for each emission:

delayWhen(() => timer(delayTime.getValue()))

Notice, however, that we need to return a notifier observable (built here with timer) rather than the desired delay in ms.
As a final note, accessing getValue is a red flag for not using observables correctly. That's also why we don't actually use the arguments provided to the callback in delayWhen. Your code could do with refactoring to make it properly reactive, but that is beyond the scope here.
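Independent of rxjs, the adjustTimer logic itself (reset on success, double on failure, capped at a maximum) can be isolated as a pure function, which also makes it easy to test. A sketch, assuming the same defaultDelay/maxDelay values as the question's code:

```javascript
const DEFAULT_DELAY = 1000;
const MAX_DELAY = 10000;

// Returns the next delay given the current delay and the download output.
// Success = output is a file path string; failure = anything else.
function nextDelay(currentDelay, output) {
  if (typeof output === 'string') return DEFAULT_DELAY; // success: reset to default
  return Math.min(currentDelay * 2, MAX_DELAY);         // failure: exponential backoff, capped
}

let d = DEFAULT_DELAY;
d = nextDelay(d, 1);                // 2000
d = nextDelay(d, 1);                // 4000
d = nextDelay(d, "/tmp/video.mp4"); // back to 1000
console.log(d); // → 1000
```

A pure function like this could then be wired into the pipeline with `scan` or called from `delayWhen`, keeping the state transitions testable in isolation.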
I currently have an SPA built with MERN and I want to improve it further by adding a scheduled update to a particular collection in my MongoDB database by setting a boolean field in all of the documents in a collection to false every midnight.
Can someone point me to the right direction on how to accomplish this?
I want to be able to scale it as well at some point - for example, have a value saved in a document in another collection to indicate the time when these boolean fields will be invalidated on the front end?
I'm using a MERN stack. Thanks for your help!
You can use a cron job:

const CronJob = require('cron').CronJob;

const updateCollections = async () => {
  await someQueriesServices(); // your own update queries go here
};

new CronJob('0 0 * * *', async () => {
  await updateCollections();
}, null, true, 'America/Los_Angeles');

or you can use setInterval:

const moment = require('moment');

const midnight = moment().endOf('day').valueOf();
const interval = midnight - Date.now(); // endOf('day') is in the future, so subtract now from it

setInterval(async () => {
  await updateCollections();
}, interval);
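A side note on the setInterval variant: moment().endOf('day').valueOf() is a timestamp in the future, so the delay until midnight must be that value minus Date.now(), not the reverse (the subtraction order matters). The same value can be computed without moment using the built-in Date API - a minimal sketch:

```javascript
// Milliseconds from `now` until the next local midnight.
function msUntilMidnight(now = new Date()) {
  const next = new Date(now);
  next.setHours(24, 0, 0, 0); // 24:00 today === 00:00 tomorrow
  return next.getTime() - now.getTime();
}

const delay = msUntilMidnight();
console.log(delay > 0 && delay <= 24 * 60 * 60 * 1000); // always positive, at most one day
```

After the first midnight fires you would typically reschedule with a fresh msUntilMidnight() rather than relying on a fixed repeating interval, so drift and DST changes don't accumulate.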
I usually use node-schedule:

const schedule = require('node-schedule');

const j = schedule.scheduleJob('42 * * * *', function(){
  console.log('The answer to life, the universe, and everything!');
});
This is quite a hard problem to describe.
I have a koajs app with a function that is run in multiple instances (10-1000 range) every 2 minutes. This scheduled job is created on app startup. I use koajs because I need a few simple API endpoints for this app. It runs well for the first 3-5 hours, and then the count of created instances starts to decrease and some of the log output disappears.
Here is the minimal sample based on actual code:
server.ts
const bootstrap = async () => {
  process.setMaxListeners(0); // (node:7310) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 uncaughtException listeners added to [process]. Use emitter.setMaxListeners() to increase limit
  // The warning appears on app startup (however, setMaxListeners(0) doesn't seem to affect anything, since the warning persists).

  const app = new Koa();
  app.use(async ctx => {
    ctx.body = "Welcome to my Server!";
  });
  app.listen(port);

  new Main().run();
};

bootstrap();
main.ts (I tried the cron npm package, node-scheduler, setInterval, and a recursive setTimeout to run scheduledJobWrapper):
isStarting: boolean = false;

async run() {
  logger.info(`running the app, every 2 minutes`);
  // let that = this;
  // new CronJob(`*/2 * * * *`, function () {
  //   that.scheduledJobWrapper();
  // }, null, true, 'America/Los_Angeles');
  const interval = 2 * 60 * 1000;
  setInterval(() => {
    this.scheduledJobWrapper();
  }, interval);
}

async scheduledJobWrapper() {
  logger.info("here scheduledJobWrapper");
  let args = {};
  // some irrelevant logic to set the arguments
  await this.scheduledJob(args);
}

async scheduledJob(args) {
  try {
    logger.info("starting");
    if (!this.isStarting) {
      this.isStarting = true;
      const runningCount = Executor.tasks.length; // Executor.tasks is a singleton containing some info about the active tasks; the details are irrelevant.
      const tasksLimit = 100;
      if (runningCount < tasksLimit) {
        for await (const i of Array(tasksLimit - runningCount).keys()) {
          if (Executor.tasks.length > 20)
            await global.sleep(5 * 1000);
          this.startWrapper(args); // calling the main task here
        }
      }
      this.isStarting = false;
      logger.info(`Started: ${Executor.tasks.length - runningCount}`);
    }
  } catch (e) {
    logger.error("Error running scheduled job: " + e.toString());
  }
}
In this example, the problem manifests as follows:
Everything works as expected for the first 3-5 hours; after that, each time the scheduled function is called:
logger.info("here scheduledJobWrapper"); does not show any output.
logger.info("starting"); is not in the output.
this.startWrapper does run, and the code inside it is being executed.
Despite the code inside this.startWrapper still running, the count of newly created jobs slowly decreases.
Hardware (RAM/CPU) is not under any significant load (CPU under 10%, RAM under 20%).
Any clue on the possible reason?
nodejs: 12.6.0
Thanks!
UPDATE
It seems that with setInterval the app runs OK for a longer time (6-24 hours), but after that the problem still starts.
The issue is with the setInterval function. It slows down over time and has some weird behavior. You can create a custom setInterval using setTimeout, or use a third-party module and give that a try.
Sample setInterval Implementation.
const intervals = new Map();

function setInterval(fn, time, context, ...args) {
  const id = new Date().getTime() + "" + Math.floor(Math.random() * 10000);
  intervals.set(
    id,
    setTimeout(function next() {
      intervals.set(id, setTimeout(next, time));
      fn.apply(context, args);
    }, time)
  );
  return id;
}

function clearInterval(id) {
  clearTimeout(intervals.get(id));
}

setInterval(console.log, 100, console, "hi");
You can also enhance this by compensating for lost time in the next setTimeout: if a tick fired late, schedule the next setTimeout correspondingly earlier.
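The drift compensation described above can be sketched like this: each tick computes when it should have fired relative to the start time and shortens the next timeout by however late it is (the function name is illustrative):

```javascript
// setTimeout-based interval that subtracts accumulated drift from each
// subsequent delay, so ticks stay aligned to the original start time.
function driftlessInterval(fn, time) {
  const start = Date.now();
  let tick = 0;
  let handle;
  function next() {
    tick += 1;
    const target = start + tick * time;   // when this tick *should* fire
    handle = setTimeout(() => {
      fn();
      next();
    }, Math.max(0, target - Date.now())); // schedule earlier if we are running late
  }
  next();
  return () => clearTimeout(handle);
}

const stop = driftlessInterval(() => console.log('tick', Date.now()), 100);
setTimeout(stop, 550); // roughly five aligned ticks, then stop
```

Because each delay is recomputed from the fixed start time rather than chained off the previous callback, late callbacks don't push every later tick further out.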
First of all, it would be better to create the Main() instance in the listen callback:

app.listen(port, () => {
  new Main().run();
});

I don't know how good an idea it is to run a setInterval function on the backend side. It would be better to extract this logic and move it into a cron job.
Are we sure the machine can run 100 tasks? Please count the tasks in order and see when the problem starts. Perhaps you cannot schedule 100 tasks and there is a limit somewhere.
I would like to delete data that is older than two hours. Currently, on the client-side, I loop through all the data and run a delete on the outdated data. When I do this, the db.on('value') function is invoked every time something is deleted. Also, things will only be deleted when a client connects, and what might happen if two clients connect at once?
Where can I set up something that deletes old data? I have a timestamp inside each object created by a JavaScript Date.now().
Firebase does not support queries with a dynamic parameter, such as "two hours ago". It can however execute a query for a specific value, such as "after August 14 2015, 7:27:32 AM".
That means that you can run a snippet of code periodically to clean up items that are older than 2 hours at that time:
var ref = firebase.database().ref('/path/to/items/');
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var old = ref.orderByChild('timestamp').endAt(cutoff).limitToLast(1);
var listener = old.on('child_added', function(snapshot) {
  snapshot.ref.remove();
});
As you'll note I use child_added instead of value, and I limitToLast(1). As I delete each child, Firebase will fire a child_added for the new "last" item until there are no more items after the cutoff point.
Update: if you want to run this code in Cloud Functions for Firebase:
exports.deleteOldItems = functions.database.ref('/path/to/items/{pushId}')
  .onWrite((change, context) => {
    var ref = change.after.ref.parent; // reference to the items
    var now = Date.now();
    var cutoff = now - 2 * 60 * 60 * 1000;
    var oldItemsQuery = ref.orderByChild('timestamp').endAt(cutoff);
    return oldItemsQuery.once('value').then(function(snapshot) {
      // create a map with all children that need to be removed
      var updates = {};
      snapshot.forEach(function(child) {
        updates[child.key] = null;
      });
      // execute all updates in one go and return the result to end the function
      return ref.update(updates);
    });
  });
This function triggers whenever data is written under /path/to/items, so child nodes will only be deleted when data is being modified.
This code is now also available in the functions-samples repo.
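The core of that Cloud Function - turning the set of expired children into a single multi-location update - is just an object mapping each expired key to null. A plain sketch of that step, with a hypothetical buildDeleteUpdates helper in place of the Firebase query:

```javascript
// Given { key: { timestamp } } entries, build the multi-location update
// object that deletes every entry at or before the cutoff.
function buildDeleteUpdates(items, cutoff) {
  const updates = {};
  for (const [key, value] of Object.entries(items)) {
    if (value.timestamp <= cutoff) {
      updates[key] = null; // writing null deletes the child in the Realtime Database
    }
  }
  return updates;
}

const items = {
  a: { timestamp: 100 },
  b: { timestamp: 500 },
  c: { timestamp: 900 },
};
console.log(buildDeleteUpdates(items, 500)); // → { a: null, b: null }
```

Batching the deletes into one update() call means a single write (and a single round of triggers) instead of one remove() per child.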
I have an HTTP-triggered cloud function that deletes nodes depending on when they were created and on their expiration date.
When I add a node to the database, it needs two fields: timestamp, to know when it was created, and duration, to know when the offer must expire.
Then, I have this http triggered cloud function:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

/**
 * @function HTTP trigger that, when triggered by a request, checks every message in the database and deletes the expired ones.
 * @type {HttpsFunction}
 */
exports.removeOldMessages = functions.https.onRequest((req, res) => {
  const timeNow = Date.now();
  const messagesRef = admin.database().ref('/messages');
  return messagesRef.once('value').then((snapshot) => {
    snapshot.forEach((child) => {
      if ((Number(child.val()['timestamp']) + Number(child.val()['duration'])) <= timeNow) {
        child.ref.set(null);
      }
    });
    // only respond once the read has completed, so the function isn't terminated early
    return res.status(200).end();
  });
});
You can create a cron job that every X minutes makes a request to the URL of that function: https://cron-job.org/en/
But I prefer to run my own script that makes a request every 10 seconds:
watch -n10 curl -X GET https://(your-zone)-(your-project-id).cloudfunctions.net/removeOldMessages
In the latest version of the Firebase API, the snapshot's ref() method has been changed to the property ref:
var ref = new Firebase('https://yours.firebaseio.com/path/to/items/');
var now = Date.now();
var cutoff = now - 2 * 60 * 60 * 1000;
var old = ref.orderByChild('timestamp').endAt(cutoff).limitToLast(1);
var listener = old.on('child_added', function(snapshot) {
  snapshot.ref.remove();
});
In case someone has the same problem, but in Firestore: I wrote a little script that first logs the documents with console.log and then deletes the documents older than 24h from a messages collection. I use https://cron-job.org/en/ to refresh the website every 24h, and that's it. The code is below.

var yesterday = firebase.firestore.Timestamp.fromMillis(Date.now() - 24 * 60 * 60 * 1000); // Timestamps are immutable, so build the cutoff directly
console.log("Test");

db.collection("messages").where("date", ">", yesterday)
  .get().then(function(querySnapshot) {
    querySnapshot.forEach(function(doc) {
      console.log(doc.id, " => ", doc.data());
    });
  })
  .catch(function(error) {
    console.log("Error getting documents: ", error);
  });

db.collection("messages").where("date", "<", yesterday)
  .get().then(function(querySnapshot) {
    querySnapshot.forEach(element => {
      element.ref.delete();
    });
  })
You could look into Scheduling Firebase Functions with Cron Jobs. That link shows you how to schedule a Firebase Cloud Function to run at a fixed rate. In the scheduled Firebase Function you could use the other answers in this thread to query for old data and remove it.