I have a working async IIFE as my main code, (async function () {...})();, which takes inputs coming from reed contacts and posts them to a cloud. All good so far.
Now I want to add a feature that checks whether a reed is open for more than x seconds. If so, I want to do something additionally, while still listening to the other inputs. So what I try is:
var threshold;

async function time(threshold){
    var sec = 0;
    var start = new Date(); // set time stamp
    while (sec < threshold){
        // take actual time
        var akt = new Date();
        // calc difference
        var diff = akt.getTime() - start.getTime();
        // calc secs from millis
        sec = Math.floor(diff / 1000);
    }
    // post threshold data to cloud
    return "Threshold reached";
}
(async function () {
    reading reed permanent {        // pseudocode: permanent reed-reading loop
        if (reed === 1){
            // post data to cloud
            var call = time(20);
            console.log(call);
        }
    }
})();
I want the main function to keep listening for new reed changes while the time loop waits for the threshold and does its job in parallel.
But my code waits for the threshold to be reached before continuing.
How can I run them in parallel?
EDIT after Gibor's help:
Now I'm stuck at verifying whether the reed is still open or was closed in the meantime. I always get a diff > threshold, even if the reed was closed in the meanwhile, in which case the timestamp should be newer...
var id = [];
if (value['contact'] === 1) {
    var bool = "false";
    id[device['manufacturer']] = new Date();
} else if (value['contact'] === 0) {
    var bool = "true";
    id[device['manufacturer']] = new Date();
    setTimeout( () => {
        var akt = new Date();
        var diff = akt.getTime() - id[device['manufacturer']].getTime();
        if (diff > threshold) {
            console.log("now: " + diff + "=" + akt.getTime() + "-" + id[device['manufacturer']].getTime());
        }
    }, threshold);
    /*var call = time(50);
    console.log(call);*/
}
First I'll explain what's wrong with your code, and then I'll give you what I believe is a better and simpler solution:
Your time function is marked as async, but it actually does nothing asynchronous. This effectively means that when the function "returns", it returns a resolved promise. But the promise is only created at the end and resolves immediately, so it doesn't make the function behave asynchronously.
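A quick way to see the first part of this (the function name here is illustrative):

```javascript
// Marking a function async makes it return a promise, but it does not
// make its body run in the background.
async function f() { return "hello"; }

const p = f();
console.log(p instanceof Promise); // true
p.then((v) => console.log(v));     // logs "hello"
```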
I believe something like this should work:
async function time(threshold){
    return new Promise((resolve, reject) => {
        var sec = 0;
        var start = new Date(); // set time stamp
        while (sec < threshold){
            // take actual time
            var akt = new Date();
            // calc difference
            var diff = akt.getTime() - start.getTime();
            // calc secs from millis
            sec = Math.floor(diff / 1000);
        }
        // post threshold data to cloud
        resolve("Threshold reached");
    });
}
Now this will run in parallel, and the call variable will only get the string "Threshold reached" when the promise is resolved - which means that in your current code, the log you'll get is something like Promise<pending>.
To log only when it's done (and do your other stuff), use .then on the promise held by call.
One thing you should notice, though, is that you'll have to somehow sync the reed status (which I didn't really get what it actually is, but I think it's irrelevant) with the promise status, because you want to run your post-timeout code only when the 20 seconds are over AND the reed is still open. So you'll have to check it in your .then clause, or pass it to the time function, which can reject the promise if the reed status changes before time runs out, etc.
Now- for the simple solution: it seems to me you're way better off using the setTimeout function:
The setTimeout() method sets a timer which executes a function or specified piece of code once the timer expires.
So you can simply do something like this:
(async function () {
    reading reed permanent {        // pseudocode: permanent reed-reading loop
        if (reed === 1){
            // post data to cloud
            setTimeout( () => { if (reed still open) { /* do what you want after timeout */ } }, 20000 ); // 20 seconds, in milliseconds
        }
    }
})();
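For the edit about verifying the reed is still open: a minimal sketch of the setTimeout approach with per-device "opened at" timestamps, so a pending check becomes a no-op if the reed closes (or re-opens) in the meantime. All names here are illustrative:

```javascript
// Map of deviceId -> timestamp of when that reed last opened.
const openedAt = new Map();

function onReedChange(deviceId, contact, thresholdMs, onThresholdReached) {
  if (contact === 1) {
    // reed opened: remember when, and schedule exactly one check
    const ts = Date.now();
    openedAt.set(deviceId, ts);
    setTimeout(() => {
      // fire only if this device is still open AND it is the same opening
      if (openedAt.get(deviceId) === ts) {
        onThresholdReached(deviceId);
      }
    }, thresholdMs);
  } else {
    // reed closed: drop the timestamp so the pending check does nothing
    openedAt.delete(deviceId);
  }
}
```

In the question's terms, deviceId would be device['manufacturer'] and thresholdMs would be 20000; comparing the stored timestamp (rather than recomputing a diff at check time) avoids the stale-diff problem described in the edit.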
Related
I am building a backend to handle pulling data from a third party API.
There are three large steps to this, which are:
Delete the existing db data (before any new data is inserted)
Get a new dataset from the API
Insert that data.
Each of these three steps must happen for a variety of datasets - i.e. clients, appointments, products etc.
To handle this, I have three Promise.all functions, and each of these are being passed individual async functions for handling the deleting, getting, and finally inserting of the data. I have this code working just for clients so far.
What I'm now trying to do is limit the API calls, as the API I am pulling data from can only accept up to 200 calls per minute. To quickly test the rate limiting functionality in code I have set it to a max of 5 api calls per 10 seconds, so I can see if it's working properly.
This is the code I have so far - note I have replaced the name of the system in the code with 'System'. I have not included all code as there's a lot of data that is being iterated through further down.
let patientsCombinedData = [];
let overallAPICallCount = 0;
let maxAPICallsPerMinute = 5;
let startTime, endTime, timeDiff, secondsElapsed;

const queryString = `UPDATE System SET ${migration_status_column} = 'In Progress' WHERE uid = '${uid}'`;
migrationDB.query(queryString, (err, res) => {
    async function deleteSystemData() {
        async function deleteSystemPatients() {
            return (result = await migrationDB.query("DELETE FROM System_patients WHERE id_System_account = ($1) AND migration_type = ($2)", [
                System_account_id,
                migrationType,
            ]));
        }
        await Promise.all([deleteSystemPatients()]).then(() => {
            startTime = new Date(); // Initialise timer before kicking off API calls
            async function sleep(ms) {
                return new Promise((resolve) => setTimeout(resolve, ms));
            }
            async function getSystemAPIData() {
                async function getSystemPatients() {
                    endTime = new Date();
                    timeDiff = endTime - startTime;
                    timeDiff /= 1000;
                    secondsElapsed = Math.round(timeDiff);
                    if (secondsElapsed < 10) {
                        if (overallAPICallCount > maxAPICallsPerMinute) {
                            // Here I want to sleep for one second, then check again as the timer may have passed 10 seconds
                            getSystemPatients();
                        } else {
                            // Proceed with calls
                            dataInstance = await axios.get(`${patientsPage}`, {
                                headers: {
                                    Authorization: completeBase64String,
                                    Accept: "application/json",
                                    "User-Agent": "TEST_API (email#email.com)",
                                },
                            });
                            dataInstance.data.patients.forEach((data) => {
                                patientsCombinedData.push(data);
                            });
                            overallAPICallCount++;
                            console.log(`Count is: ${overallAPICallCount}. Seconds are: ${secondsElapsed}. URL is: ${dataInstance.data.links.self}`);
                            if (dataInstance.data.links.next) {
                                patientsPage = dataInstance.data.links.next;
                                await getSystemPatients();
                            } else {
                                console.log("Finished Getting Clients.");
                                return;
                            }
                        }
                    } else {
                        console.log(`Timer reset! Now proceed with API calls`);
                        startTime = new Date();
                        overallAPICallCount = 0;
                        getSystemPatients();
                    }
                }
                await Promise.all([getSystemPatients()]).then((response) => {
                    async function insertSystemData() {
                        async function insertClinkoPatients() {
                            const SystemPatients = patientsCombinedData;
Just under where it says 'if (secondsElapsed < 10)' is where I want to check every second whether the timer has passed 10 seconds, in which case the timer and the count are reset, so I can start counting again over the next 10 seconds. Currently the recursive function runs so often that it throws an error related to the call stack.
I have tried adding a variety of async timer functions here, but every time the function returns it causes the parent promise to finish executing.
Hope that makes sense.
I ended up using the Bottleneck library, which made it very easy to implement rate limiting.

const Bottleneck = require("bottleneck/es5");

const limiter = new Bottleneck({
    minTime: 350 // at least 350 ms between scheduled calls (~171 calls per minute)
});

await limiter.schedule(() => getSystemPatients());
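For reference, the "sleep for one second, then check again" idea from the question can also be done without a library. A minimal sketch of a manual windowed limiter that waits instead of recursing (which is what blew the call stack); all names are illustrative:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Returns a wrapper that allows at most maxCalls invocations per windowMs.
function makeLimiter(maxCalls, windowMs) {
  let windowStart = Date.now();
  let calls = 0;
  return async function limited(fn) {
    for (;;) {
      if (Date.now() - windowStart >= windowMs) {
        windowStart = Date.now(); // window elapsed: reset the counter
        calls = 0;
      }
      if (calls < maxCalls) break; // room left in the current window
      await sleep(50);             // wait without recursion, then re-check
    }
    calls++;
    return fn();
  };
}
```

Usage would be something like `const limited = makeLimiter(5, 10000);` and then `await limited(() => axios.get(patientsPage, ...));` inside the paging loop.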
Essentially, how do you pause javascript execution without having to waste computation on something like a while loop?
For example, say I want to only perform a function next after 10 seconds and not interrupt other processes or waste computation? setTimeout won't work because I want the processes to actually pause / not continue any operations during that time.
const next = () => console.log("next");
/** pause for 10 seconds */
next()
Also what if I want to only run a method conditionally, where every couple seconds or something I can either abort the operation or continue. Notably I don't mean to use setInterval because in that case it's not actually conditionally pausing the actual javascript execution.
const next = () => console.log("next");
const start = new Date();
const startSeconds = start.getSeconds();
const checkEvery = 1; // seconds
const requiredDiff = 10; // seconds

const checker = () => {
    const now = new Date();
    let secondsNow = now.getSeconds();
    secondsNow < startSeconds ? secondsNow += 60 : null;
    const diff = secondsNow - startSeconds;
    if (diff < requiredDiff) {
        return false;
    }
    return true;
}
/** something that runs checker every 1 seconds and pauses computation until checker() returns true and then runs next() */
A good way to do this is by creating a block in the pipe. i.e.
process -> pause -> next so that you can know that the flow of the program won't continue until a certain criteria is met
You can do this by pausing things inline
...
await pause();
...
where pause is something that does a timed-out promise or something along those lines. I don't know too much about this method, but it seems relatively complicated.
Another way to do this is by stopping the execution inline with a while loop
...
// pause for 10 seconds
let result;
while (!result) { result = checker() };
...
but as you alluded to, this wastes a lot of operations and can interfere with other actions running properly in the background. Another problem is that you can't limit checker to run only once per second.
I suggest you do the following instead:
const max = 20; // max number of recursive calls (i.e. timeout after 20 seconds)
// checker is a function that returns true or false and is agnostic to this implementation
// timeout is the time (in milliseconds) to wait before running the checker again
// next is the next step in your pipeline that you want to prevent from executing until the condition is met
const pause = async (checker, timeout, next, calls = 0) => {
if (calls > max) return; // prevents stack overflow
const result = await checker(); // just in case your checker is async
// if the condition was met then continue on to the next stage in your pipeline
// if the condition was not met then run this method again to re-check in timeout
result ? next() : setTimeout(() => pause(checker, timeout, next, calls + 1), timeout)
}
// with the functions you provided...
pause(checker, 1000, next)
pause will only execute operations when the timeout is met, and it won't allow the program to continue to the next stage until the checker is met.
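If the caller needs to block inline (the `await pause();` style mentioned earlier), the same recursive re-check can be wrapped in a promise. A sketch, with illustrative names:

```javascript
// Resolves once checker() returns truthy; re-checks every intervalMs without
// a busy loop; rejects after maxTries attempts so it cannot hang forever.
const pauseUntil = (checker, intervalMs, maxTries) =>
  new Promise((resolve, reject) => {
    let tries = 0;
    const tick = async () => {
      if (await checker()) return resolve();  // condition met: continue
      if (++tries >= maxTries)
        return reject(new Error("pauseUntil timed out"));
      setTimeout(tick, intervalMs);           // re-check later, event loop stays free
    };
    tick();
  });
```

With the functions from the question, that would read `await pauseUntil(checker, 1000, 20); next();`.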
Here is my for loop:
for (var d = startDate; d <= in30DaysDate; d.setDate(d.getDate() + 1)) {
var loopDay = new Date(d);
DoYourDaylyJob();
console.log("Day:" + loopDay);
}
What should I put in the function DoYourDailyJob() to prevent the loop from going on to the next day before it does its "daily" job?
I hope I have described it well enough. Should I attach some kind of callback to DoYourDailyJob, and if so, how would that prevent the loop from proceeding until it receives a response from the function?
I'm not aware if this is possible. If it is, can you show an example as an answer to this question?
Just return a Promise from DoYourDailyJob; then it's as simple as:
(async function() {
    for (var d = startDate; d <= in30DaysDate; d.setDate(d.getDate() + 1)) {
        var loopDay = new Date(d);
        await DoYourDaylyJob();
        console.log("Day:" + loopDay);
    }
})()
Using a callback function, you replace the for with a recursive loop:

executeIteration(startDate);

function executeIteration(d) {
    if (d > in30DaysDate)
        return;
    var loopDay = new Date(d);
    DoYourDaylyJob(function(valueFromDoYourDaylyJob) {
        d.setDate(d.getDate() + 1);
        executeIteration(d);
    });
}

function DoYourDaylyJob(callback) {
    // Do your dayly job
    var valueToReturn = "foo";
    callback(valueToReturn);
}
Just refactor it so it's recursive:

var day = 0;
var loops = 10;

function DoYourDailyJob(){
    var loopDay = new Date();
    loopDay.setDate(loopDay.getDate() + day);
    console.log("Day:" + loopDay);
    if (day++ < loops){
        DoYourDailyJob();
    }
}
DoYourDailyJob();
Short answer, you can't (at least without a super ugly blocking call that I can't recommend strongly enough that you avoid).
Instead, make it asynchronous.
If DoYourDailyJob() is something that is asynchronous, then you'll want to make it either:
accept a callback function which is called when complete
return a Promise that is resolved when complete
The Promise method tends to be more preferable these days:
function DoYourDailyJob() {
    return new Promise((resolve, reject) => {
        // do stuff
        // call resolve() when done
        // call reject() if there is an error
    });
}
In your for loop, create an array of Dates you want to process:
const dates = [];
for (var d = startDate; d <= in30DaysDate; d.setDate(d.getDate() + 1)) {
    dates.push(new Date(d));
}
With your list of Dates, you can then either run them all in parallel:
Promise.all(dates.map(d => DoYourDailyJob(d)))
    .then(() => console.log('all done'));
Or, if they need to be run one at a time (which may be the case since you don't pass in the date), you can essentially have a "queue" and a queue-running function which will keep going until all of them are done:
const runNextDay = () => {
    if (!dates.length) {
        return Promise.resolve(); // all done
    }
    const day = dates.shift();
    return DoYourDailyJob().then(() => console.log('loop date', day))
        .then(runNextDay); // call the next day when it's done
};

runNextDay()
    .then(() => console.log('all done'));
All those comments are wrong, it is absolutely possible, in some environments.
If you have async await available, you can do it quite easily. Just make DoYourDailyJob() an async function (one that returns a promise) and do

for (...) {
    await DoYourDailyJob();
}
If you don't have async await available, comment that here and we can do something similar with raw promises.
I am trying to achieve the following functionality:
execute call back
resolve promise
check output
if not correct execute again
I have 'mimicked' the scenario with a timer; this reruns a script that makes a call to a backend database for some information:
_runCheckScript: function(bStart, bPreScript){
    var oController = this;
    var scriptTimerdeferred = $.Deferred();
    var promise = scriptTimerdeferred.promise();
    if (typeof(bStart) === "undefined"){
        bStart = true;
    }
    if (typeof(bPreScript) === "undefined"){
        bPreScript = true;
    }
    // if the HANA DB is not stopped or started, i.e. it is still starting up or shutting down,
    // check the status again every x number of seconds as per the function
    var msTime = 10000;
    if (!bPreScript){
        this._pushTextIntoConsoleModel("output", {"text":"The instance will be 'pinged' every " + msTime/1000 + " seconds for 2 minutes to monitor for status changes. After this, the script will be terminated."});
    }
    if (bPreScript){
        var timesRun = 0;
        var commandTimer = setInterval( function () {
            timesRun += 1;
            if (timesRun === 12){
                scriptTimerdeferred.reject();
                clearInterval(commandTimer);
            }
            // send the deferred to the next function so it can be resolved when finished
            oController._checkScript(scriptTimerdeferred, bStart, bPreScript);
        }, msTime);
    }
    return $.Deferred(function() {
        var dbcheckDeffered = this;
        promise.done(function () {
            dbcheckDeffered.resolve();
            console.log('Check finished');
            oController._pushTextIntoConsoleModel("output", {"text":"Check finished."});
        });
    });
},
The script it calls has its own promise, as it calls another function:
_checkScript: function(scriptTimerdeferred, bStart, bPreScript){
    var oProperties = this.getView().getModel("configModel");
    var oParams = oProperties.getProperty("/oConfig/oParams");
    var deferred = $.Deferred();
    var promise = deferred.promise();
    var sCompareStatus1 = "inProg";
    var sCompareStatus2 = this._returnHanaCompareStatus(bStart, bPreScript);
    var sCompareStatus3 = this._returnHanaCompareStatus3(bStart, bPreScript);
    var params = { /* some params */ };
    // Send the command
    this._sendAWSCommand(params, deferred);
    // When command is sent
    promise.done(function (oController) {
        console.log('back to db check script');
        var oCommandOutputModel = oController.getView().getModel("commandOutput");
        var sStatus = oCommandOutputModel.Status;
        // check that it's not in the wrong status for a start/stop,
        // or if it's a pre-script check -> pre-script checks always resolve first time
        if (sStatus !== sCompareStatus1 && sStatus !== sCompareStatus2 && sStatus !== sCompareStatus3 || bPreScript){
            scriptTimerdeferred.resolve();
        }
    });
},
This works; however, what it does is:
set a timer to call the first script every x seconds (as the data is currently changing - a server is coming online)
the script runs and calls another function to get some data from the DB
when the call for data is resolved (complete), it comes back to 'promise.done' in the check script and only resolves the timer promise if it meets certain criteria
all the while, the initial timer keeps resending the call, as eventually the DB will come online and the status will change
I am wondering if there is a better way to do this, as currently I could have, for example, 3 calls to the DB that go unresolved and then all resolve at the same time. I would prefer to run a command, wait for it to resolve, check the output, and if it is not right, run the command again.
Thanks!
I think what you want to do can be achieved by carefully reading what is explained in these links:
Promise Retry Design Patterns
In javascript, a function which returns promise and retries the inner async process best practice
See this jsfiddle
var max = 5;
var p = Promise.reject();
for(var i=0; i<max; i++) {
p = p.catch(attempt).then(test);
}
p = p.then(processResult).catch(errorHandler);
function attempt() {
var rand = Math.random();
if(rand < 0.8) {
throw rand;
} else {
return rand;
}
}
function test(val) {
if(val < 0.9) {
throw val;
} else {
return val;
}
}
function processResult(res) {
console.log(res);
}
function errorHandler(err) {
console.error(err);
}
It retries the promise until the condition is satisfied (here, up to max attempts). Your condition is the point where you said "check the output": if your check fails, retry the promise. Be careful to keep a limit case, since pending promises consume memory. If your api/service/server/call receiver is down and you don't set a threshold, you could create an endless chain of promises with NO STOP.
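The same retry-until-the-output-is-correct pattern reads more directly with async/await. A minimal sketch with an explicit attempt limit (the function and parameter names are illustrative):

```javascript
// Runs `operation` until `test(result)` passes, up to maxAttempts times.
// Throws the last error (or check failure) if every attempt fails.
async function retry(operation, test, maxAttempts) {
  let lastError;
  for (let i = 0; i < maxAttempts; i++) {
    try {
      const result = await operation();
      if (test(result)) return result;   // output is correct: done
      lastError = new Error("check failed for result: " + result);
    } catch (err) {
      lastError = err;                   // operation itself failed: retry
    }
  }
  throw lastError;                       // give up after maxAttempts
}
```

In the question's terms, `operation` would send the command and fetch the status, and `test` would compare the status against the sCompareStatus values before resolving.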
I have been trying to implement a job that will run through a few million entries in my database to update properties based on values in another collection.
The structure is along the lines of
const Promise = require('bluebird');

var updateFunc = Promise.coroutine(function* (skip, limit) {
    let objects = yield repository.getDocs({}, skip, limit);
    // perform other actions on objects
});

var loopFunc = Promise.coroutine(function* () {
    const count = yield repository.countDocs({});
    var skip = 0;
    var limit = 10000;
    let runs = Math.ceil(count / limit);
    for (let i = 0; i < runs; i++) {
        yield updateFunc(skip, limit);
        skip += limit;
    }
});

Promise.coroutine(function* () {
    var start = new Date().getTime();
    yield loopFunc();
    var end = new Date().getTime();
    var timeTaken = ((end - start) / 1000);
    console.log('finished in %s seconds..', parseFloat(timeTaken).toFixed(2));
    process.exit(0);
})();
this consumes memory pretty quickly until it crashes.
If I move the while-true into the main function directly after loopFunc returns, then after 30 seconds or so it releases the memory.
I figure that loopFunc is preventing memory from being released as long as it is running, and for this job I require a fairly long-running loop.
One thought is to break it into pieces and run each part in a separate child process, but I would like to understand why this is not working.
Everything I have tried has been unsuccessful.
I thought perhaps it was something in the repository leaking memory, but if I just create a huge array in updateFunc, with no reference to anything else, it still holds memory.
I also tried forcing garbage collection with --expose-gc on start and global.gc() every X runs in the loop, and this also failed to produce better results.
Any idea what I am missing here?
* UPDATE *
I have managed to get the memory slightly under control using process.nextTick instead of a loop.
my new code looks like
const Promise = require('bluebird');

var kill = false;

var updateFunc = Promise.coroutine(function* (skip, limit, runs, run, count) {
    let objects = yield repository.getDocs({}, skip, limit);
    // perform other actions on objects
    if (run < runs) {
        process.nextTick(Promise.coroutine(function* () {
            yield locator.helper.makePromise(null, 2000, true);
            yield updateFunc(skip + limit, limit, runs, run + 1, count);
        }));
    } else {
        kill = true;
    }
});
Promise.coroutine(function* () {
    var start = new Date().getTime();
    var count = yield locator.repository.count(locator.db.absence, {});
    console.log('checking %s companies...', count);
    var skip = 0;
    var limit = 10000;
    let runs = Math.ceil(count / limit);
    yield updateFunc(skip, limit, runs + 1, 1, count);
    while (!kill) {
        yield locator.helper.makePromise(null, 5000, true);
    }
    var end = new Date().getTime();
    var timeTaken = ((end - start) / 1000);
    console.log('finished in %s seconds..', parseFloat(timeTaken).toFixed(2));
    process.exit(0);
})();
Using nextTick to call updateFunc seems to allow the GC to do its job, which is great. Memory grows quickly and then, after it hits about 1 GB, it gets cleaned up and drops.
I need the while loop with the wait so that updateFunc has time to run to completion; once it hits the max runs I can break out and kill the process.
I am still looking for some ways to improve on it, but I am finally making progress.
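For comparison, the same paging loop can be written with plain async/await instead of chained coroutines, so each batch goes out of scope before the next is fetched. A sketch, where countDocs/getDocs/handleBatch stand in for the repository calls and are illustrative:

```javascript
// Processes all documents in batches of `limit`, holding only one batch
// in memory at a time; returns the number of runs performed.
async function processAll(countDocs, getDocs, limit, handleBatch) {
  const count = await countDocs();
  const runs = Math.ceil(count / limit);
  for (let run = 0, skip = 0; run < runs; run++, skip += limit) {
    const objects = await getDocs(skip, limit); // only one batch referenced here
    await handleBatch(objects);
    // `objects` becomes unreachable when the next iteration reassigns it,
    // so the GC can reclaim each batch as the loop advances
  }
  return runs;
}
```

Since there is no chain of pending promises referencing earlier batches, nothing keeps old batches alive across iterations, which is the property the nextTick workaround was approximating.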