stop current run of useEffect and start the next one - javascript

I was wondering if there is any way to break the current run of a useEffect and have it start again on the next render, like this:
...
useEffect(() => {
  SlowFunction(update);
}, [update]);

setUpdate(1);
// a bit of time passes, but not long enough for SlowFunction(1) to be done
setUpdate(2);
// when this is called and the useEffect runs, stop SlowFunction(1) and run SlowFunction(2)
My updated function is called in the useEffect like so:
const [update, setUpdate] = useState(0);
const [thisConst, setThisConst] = useState(0);

async function SlowFunction(firstParam, paramEtc, { signal } = {}) {
  while (true) {
    // wait two seconds between each iteration
    await new Promise((r) => setTimeout(r, 2000));
    // Before starting every individual "task" in the function,
    // first throw if the signal has been aborted. This will stop the function
    // if cancellation occurs:
    signal?.throwIfAborted();
    // else continue working...
    console.log('working on another iteration');
  }
  console.log('Completed!');
  return 'some value';
}
useEffect(() => {
  const controller = new AbortController();
  const { signal } = controller;
  (async () => {
    try {
      const result = await SlowFunction(update, 'some other value', {
        signal,
      });
      setThisConst(result);
    } catch (ex) {
      console.log('EXCEPTION THROWN: ', ex);
    }
  })();
  return () => controller.abort(new Error('Starting next render'));
}, [update]);

The AbortSignal API is the standard method for handling cancellation.
I'll provide an example of how to use it with a function like your SlowFunction. You'll need to accept an abort signal as an optional parameter so that when the next render occurs, the function can be cancelled.
Here's an example cancellable function:
async function SlowFunction(firstParam, paramEtc, { signal } = {}) {
  for (let i = 0; i < 1_000_000; i += 1) {
    // Before starting every individual "task" in the function,
    // first throw if the signal has been aborted. This will stop the function
    // if cancellation occurs:
    signal?.throwIfAborted();
    // else continue working...
    console.log('working on another iteration');
  }
  return 'some value';
}
You can use it in an effect hook like this, returning a cleanup function which invokes the abort method on the controller:
useEffect(() => {
  const controller = new AbortController();
  const { signal } = controller;
  (async () => {
    try {
      const result = await SlowFunction(update, 'some other value', { signal });
      setThisConst(result);
    } catch (ex) {
      // Catch the exception thrown when the next render starts
      // and the function hasn't completed yet.
      // Handle the exception if you need to,
      // or do nothing in this block if you don't.
    }
  })();
  return () => controller.abort(new Error('Starting next render'));
}, [update]);
If the function completes before the next render occurs, then the abort operation will have no effect, but if it hasn't yet, then the next time that the statement signal?.throwIfAborted(); is reached, the function will throw an exception and terminate.
Update in response to your comment:
If your JavaScript runtime is too old to support the AbortSignal.throwIfAborted() method, you can work around that by replacing that line:
signal?.throwIfAborted();
with:
if (signal?.aborted) {
  throw signal?.reason ?? new Error('Operation was aborted');
}
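One thing worth noting about the while-loop version above: the plain two-second setTimeout delay is not itself abortable, so cancellation only takes effect once the current delay has finished. A minimal sketch of an abort-aware delay (my own illustration, not part of the original answer; the helper name abortableDelay is made up):

function abortableDelay(ms, signal) {
  return new Promise((resolve, reject) => {
    // Reject immediately if the signal has already been aborted
    if (signal?.aborted) {
      reject(signal.reason);
      return;
    }
    const onAbort = () => {
      clearTimeout(id);
      reject(signal.reason);
    };
    const id = setTimeout(() => {
      signal?.removeEventListener('abort', onAbort);
      resolve();
    }, ms);
    signal?.addEventListener('abort', onAbort, { once: true });
  });
}

// Inside SlowFunction, the plain delay could then become:
// await abortableDelay(2000, signal);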

Related

Fetch with retry, abort, etc

I've been trying to find a wrapper that does fetch with retries, timeouts, aborts, etc. I came across https://pastebin.com/54Ct4xEh a little bit ago, and after fixing a couple of typos (missing options. and =>), it works, except... well, maybe it works, but I don't know how to use it. How do I abort a fetch with this particular wrapper? I have a fiddle, https://jsfiddle.net/1fdwb2o6/2/. With this code, how can I, say, click a button and have it abort this fetch loop?
For my use case, I'm using Bootstrap, and I have a modal that, when shown, attempts to load dynamic content. If the user clicks Cancel while it's loading, I want the fetch process to stop. From what I can tell, I should be able to do it with the code below... but I'm not sure how to perform the abort. Perhaps this isn't possible, as structured, with a Promise... but I don't know enough (anything) about promises to know better, one way or the other.
const fetchWithRetry = (userOptions) => {
  let abort = false;
  const options = {
    url: '',
    options: {},
    cancel: {},
    retries: 5,
    retryDelay: 1000,
    ...userOptions
  };
  // Add an abort to the cancel object.
  options.cancel.abort = () => {
    abort = true;
  };
  // Abort or proceed?
  return abort ? Promise.reject('aborted') : fetch(options.url).then(response => {
    // Reject because of abort
    return abort ? Promise.reject('aborted')
      // Response is good
      : response.ok ? Promise.resolve(response.text())
        // Retries exceeded
        : !options.retries ? Promise.reject('retries exceeded')
          // Retry with one less retry
          : new Promise((resolve, reject) => {
            setTimeout(() => {
              // We use the returned promise's resolve and reject as
              // callback so that the nested call propagates backwards.
              fetchWithRetry({
                ...options,
                retries: options.retries - 1
              }).then(resolve, reject);
            }, options.retryDelay);
          });
  });
}
var xxx;
console.clear();
xxx = fetchWithRetry({
  url: "some_file_that_doesnt_exist.php"
}).then((response) => {
  alert(response);
}).catch(function(err) {
  // Error: response error, request timeout or runtime error
  alert("Error! Cannot load folder list! Please try again!");
});

setTimeout(function() {
  // somehow, abort the fetch...
  // xxx.abort(); <-- no worky...
}, 1234);
As I said in my comments, the code you have in your question does not provide a cancel() function that the caller can use. It has a cancel() function internally, but that's not something the caller can reach. As written, that function just returns a promise, so the caller has nothing they can call to cancel the retries.
So, I decided to write my own version of fetchWithRetry() that would work for your use case. This has a number of capabilities that the one in your question does not:
It returns both the promise and a cancel function so the caller can cancel the retries.
It allows you to pass the init options for fetch() so you can use any of the various arguments that fetch() supports and that are often needed, such as credentials.
It has an option to check the response.ok boolean, so it will detect and retry more failures than you would if you required the promise to be rejected before a retry (note: fetch() doesn't reject on a 404, for example).
If there was a fetch() rejection and the operation was either cancelled or ran out of retries, it uses the newer Error cause feature to set the cause to the actual fetch() error, so the caller can see what the original error was.
Note that this version of fetchWithRetry() returns an object containing both a promise and a cancel function. The caller uses the promise the same way they would any promise from fetch() and they can use the cancel() function to cancel any further retries.
Here's the code:
const Deferred = function() {
  if (!(this instanceof Deferred)) {
    return new Deferred();
  }
  const p = this.promise = new Promise((resolve, reject) => {
    this.resolve = resolve;
    this.reject = reject;
  });
  this.then = p.then.bind(p);
  this.catch = p.catch.bind(p);
  if (p.finally) {
    this.finally = p.finally.bind(p);
  }
}

function fetchWithRetry(url, userOptions = {}, init = {}) {
  const options = {
    // default options values, can be overridden by userOptions
    retries: 3,
    retryDelay: 1000,
    checkResponseOk: true,
    ...userOptions
  };
  let cancelled = false;
  let timerDeferred;
  let timer;

  function run() {
    return fetch(url, init).then(response => {
      // force retry on non 2xx responses too
      if (options.checkResponseOk && !response.ok) {
        throw new Error(`fetch failed with status ${response.status}`);
      }
      return response;
    }).catch(err => {
      // got error, set up retry
      console.log(err);
      if (cancelled) {
        throw new Error("fetch cancelled", { cause: err });
      }
      --options.retries;
      if (options.retries < 0) {
        throw new Error("fetch max retries exceeded", { cause: err });
      }
      // create new Deferred object for use with our timer
      // so it can be resolved by the timer or rejected
      // by the cancel callback
      timerDeferred = new Deferred();
      timer = setTimeout(() => {
        timerDeferred.resolve();
        timer = null;
      }, options.retryDelay);
      return timerDeferred.then(() => {
        if (cancelled) {
          throw new Error("fetch cancelled", { cause: err });
        }
        return run();
      });
    });
  }

  return {
    promise: run(),
    cancel: () => {
      cancelled = true;
      // if currently in a timer waiting, reject immediately
      if (timer) {
        clearTimeout(timer);
        timer = null;
      }
      if (timerDeferred) {
        timerDeferred.reject(new Error("fetch cancelled"));
      }
    }
  };
}
Sample usage:
const result = fetchWithRetry(someUrl);
result.promise.then(resp => {
  return resp.text().then(data => {
    // got final result here
    console.log(data.slice(0, 100));
  });
}).catch(err => {
  console.log(err);
});

// simulate user cancel after 1.5 seconds
setTimeout(() => {
  result.cancel();
}, 1500);
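One limitation to be aware of: cancel() in the version above stops future retries, but it does not abort a request that is already in flight. If you also want to cancel the in-flight fetch(), one approach (a rough sketch of my own, not part of the original answer; the name fetchWithRetryAbortable is made up) is to create an AbortController per attempt and abort it from cancel():

function fetchWithRetryAbortable(url, userOptions = {}, init = {}) {
  const options = { retries: 3, retryDelay: 1000, checkResponseOk: true, ...userOptions };
  let cancelled = false;
  let controller;

  async function run() {
    while (true) {
      controller = new AbortController();
      try {
        // Pass the signal so cancel() can abort the network request itself
        const response = await fetch(url, { ...init, signal: controller.signal });
        if (options.checkResponseOk && !response.ok) {
          throw new Error(`fetch failed with status ${response.status}`);
        }
        return response;
      } catch (err) {
        if (cancelled) throw new Error("fetch cancelled", { cause: err });
        if (options.retries-- <= 0) throw new Error("fetch max retries exceeded", { cause: err });
        // Simple delay between retries; unlike the Deferred version above,
        // this delay is only checked for cancellation after it elapses
        await new Promise(r => setTimeout(r, options.retryDelay));
        if (cancelled) throw new Error("fetch cancelled", { cause: err });
      }
    }
  }

  return {
    promise: run(),
    cancel: () => {
      cancelled = true;
      controller?.abort();
    }
  };
}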

How can I check the status periodically and synchronously with a request?

The context of the problem is the following:
I make a request to the backend, and this starts a slow process whose progress you can check with a REST request.
I need to continue with the main thread only when this request indicates that the process is completed.
So I make a request to launch that process; after that, I want to make a request every 20 seconds in order to know whether the process is finished, and finally I want to continue with the main thread.
In order to solve this I tried the following code:
...
const checker = await setInterval(() => {
  let state = checkState(exportId)
  if (state !== '100%') {
    state = checkState(exportId)
  } else {
    clearInterval(checker)
    console.log('Clear interval')
  }
}, 20000);
...
The checkState function is the following:
checkState = async (exportId: string): Promise<IStatus> => {
  const headers = {...}
  const query = JSON.stringify({...});
  const { _, data } = await axios.post(...);
  return data
};
The problem with this code is that when I execute it, the code inside setInterval is never executed.
So how can I solve this, and await the setInterval until the state returns 100%?
Thanks
const checker = await setInterval(() => {
setInterval does not return a promise, it returns an identifier so you can later cancel the interval. Awaiting this value does nothing.
let state = checkState(exportId)
if(state !== '100%'){
checkState is an async function which means it will always return a promise. It will never return a string. await the promise until it resolves to a string:
const checker = setInterval(async () => {
  let state = await checkState(exportId);
  if (state !== '100%') {
    // no need to call checkState again because you're using setInterval.
    // state = checkState(exportId)
    console.log('State currently is', state);
  } else {
    clearInterval(checker);
    console.log('Clear interval');
  }
}, 20000);
Or equivalently:
const checker = setInterval(() => {
  checkState(exportId).then((state) => {
    if (state !== '100%') {
      // no need to call checkState again because you're using setInterval.
      // state = checkState(exportId)
      console.log('State currently is', state);
    } else {
      clearInterval(checker);
      console.log('Clear interval');
    }
  });
}, 20000);
You could wrap the entire thing up in a promise to continue when the main process has finished:
const finished = new Promise((resolve, reject) => {
  const checker = setInterval(async () => {
    let state = await checkState(exportId);
    if (state !== '100%') {
      // no need to call checkState again because you're using setInterval.
      // state = checkState(exportId)
      console.log('State currently is', state);
    } else {
      clearInterval(checker);
      resolve();
      console.log('Clear interval');
    }
  }, 20000);
});

await finished;
console.log("Done");

How to write a handler chain executor (like express)?

I am learning about routing libraries like express, and they all have a common middleware pattern:
export const middleware = (req, res, next) => {
  next()
}
So I am trying to make my own implementation of this to learn what's going on, and I'm struggling a bit. Any good resources would also be appreciated.
The problem I am specifically trying to solve is the following
const executeHandlers = (handlers) => {
  // Run them in sequence
}

executeHandlers([
  next => {
    console.log(1)
    next()
  },
  next => {
    console.log(2)
    next()
  },
  next => {
    console.log(3)
  }
])
I assume next() is the following handler wrapped in a function, but I am struggling to get it there.
Here's a version written with well-separated parts for clarity (and assuming you want to see console.log(1) get called before console.log(2)); see the comments for details:
const executeHandlers = (handlers) => {
  // Remember which handler we're on
  let i = 0;
  // This is the function we'll pass to the handlers
  function next() {
    // Get the handler to call, if any;
    // update the index of the one we're on
    const handler = handlers[i++];
    if (handler) {
      // This flag is specific to each function we pass to `handler`
      let called = false;
      handler(() => {
        // Prevent `handler` from calling this twice
        if (!called) {
          called = true;
          next();
        }
      });
    }
  }
  // Start the chain
  next();
};
Live Example:
const executeHandlers = (handlers) => {
  // Remember which handler we're on
  let i = 0;
  // This is the function we'll pass to the handlers
  function next() {
    // Get the handler to call, if any;
    // update the index of the one we're on
    const handler = handlers[i++];
    if (handler) {
      // This flag is specific to each function we pass to `handler`
      let called = false;
      handler(() => {
        // Prevent `handler` from calling this twice
        if (!called) {
          called = true;
          next();
        }
      });
    }
  }
  // Start the chain
  next();
};

executeHandlers([
  next => {
    console.log(1)
    next()
  },
  next => {
    console.log(2)
    next()
  },
  next => {
    console.log(3)
  }
]);
There are various spins you could put on that, like handling errors, passing a value from one handler to the next, taking a snapshot of the handlers chain before you start, but that's the basic idea.
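For example, a sketch of the error-handling spin (my own illustration, not part of the original answer), loosely following Express's next(err) convention:

const executeHandlers = (handlers, onError) => {
  let i = 0;
  function next(err) {
    // If a handler reported an error, stop the chain and report it
    if (err) {
      onError && onError(err);
      return;
    }
    const handler = handlers[i++];
    if (handler) {
      try {
        handler(next);
      } catch (ex) {
        // Synchronous throws are routed to the error callback too
        onError && onError(ex);
      }
    }
  }
  next();
};

executeHandlers(
  [
    next => { console.log(1); next(); },
    next => { next(new Error('something went wrong')); },
    next => { console.log('never reached'); }
  ],
  err => console.log('error:', err.message)
);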

Custom status change events in JavaScript

I have an asynchronous function that performs various await tasks. I am trying to inform my UI in React when the status of the function changes or when one of the tasks is completed.
const foo = async () => {
  // trigger on load event
  await task1();
  // trigger task1 done event
  await task2();
  // trigger task2 done event
  await task3();
  // trigger on done event
}
I also want to be able to specify callbacks for each event, like so:
const bar = foo();
foo.on_load(() => {
  // some code goes here
});
foo.on_done(() => {
  // some code goes here
});
Another alternative would be something like this:
const bar = foo();
foo.on('status_change', status => {
  // read the status here and do something depending on the status
})
I have been reading about custom events in JS but not sure how to use them for this. Or maybe there's another way to do this in React.
Any ideas would be helpful. Thanks!
EDIT
var uploadTask = storageRef.child('images/rivers.jpg').put(file);

// Register three observers:
// 1. 'state_changed' observer, called any time the state changes
// 2. Error observer, called on failure
// 3. Completion observer, called on successful completion
uploadTask.on('state_changed', function(snapshot) {
  // Observe state change events such as progress, pause, and resume
  // Get task progress, including the number of bytes uploaded and the total number of bytes to be uploaded
  var progress = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
  console.log('Upload is ' + progress + '% done');
  switch (snapshot.state) {
    case firebase.storage.TaskState.PAUSED: // or 'paused'
      console.log('Upload is paused');
      break;
    case firebase.storage.TaskState.RUNNING: // or 'running'
      console.log('Upload is running');
      break;
  }
}, function(error) {
  // Handle unsuccessful uploads
}, function() {
  // Handle successful uploads on complete
  // For instance, get the download URL: https://firebasestorage.googleapis.com/...
  uploadTask.snapshot.ref.getDownloadURL().then(function(downloadURL) {
    console.log('File available at', downloadURL);
  });
});
I was trying to achieve something like the above code, taken from the Firebase documentation on uploading files.
This is where I've gotten so far:
class Task {
  constructor() {
    this.first = null;
    this.second = null;
  }
  on(keyword, callback) {
    switch (keyword) {
      case "first":
        this.first = callback;
        break;
      case "second":
        this.second = callback;
        break;
      default:
        // throw new error
        break;
    }
  }
}

const timeout = async time => {
  return new Promise(resolve => setTimeout(resolve, time));
};

const foo = () => {
  const task = new Task();
  timeout(2000).then(async () => {
    task.first && task.first();
    await timeout(2000);
    task.second && task.second();
  });
  console.log("returning");
  return task;
};

const taskObject = foo();
taskObject.on("first", () => console.log("executing first callback"));
taskObject.on("second", () => console.log("executing second callback"));
Is there a better way to do this - without having the nested thens? Which approach would be better and when? EDIT - removed nested then clauses and replaced with then and await
PS: for my requirements, having callbacks would be sufficient. This is just so I can understand the concept better. Thanks!
I'm going to assume there's a reason for you not simply calling some named method after each async step has completed, i.e., you want to be able to plug in different handlers for each event. Here is one way to go about it - whether or not it's the best is hard to tell from the little context provided:
const foo = async (handlers) => {
  handlers.onLoad && handlers.onLoad();
  await task1();
  handlers.onTask1Complete && handlers.onTask1Complete();
  await task2();
  handlers.onTask2Complete && handlers.onTask2Complete();
}

const myHandlers = {
  onLoad: () => {
    // do stuff
  },
  onTask1Complete: () => {
    // do other stuff
  },
  onTask2Complete: () => {
    // etc
  }
};

foo(myHandlers);
Note that it lets you specify only the handlers you need. A more flexible approach would be to use a publish-subscribe model, where a subscribe method pushes a function onto an array of handlers, all of which are called when the event occurs.
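A minimal sketch of that publish-subscribe idea (my own illustration, not part of the original answer; the StatusEmitter name is made up, and task1/task2 are assumed to be the async tasks from the question):

class StatusEmitter {
  constructor() {
    this.listeners = {};
  }
  on(event, callback) {
    // Subscribe: push the callback onto the array for this event
    (this.listeners[event] = this.listeners[event] || []).push(callback);
  }
  emit(event, payload) {
    // Publish: call every subscriber registered for this event
    (this.listeners[event] || []).forEach(cb => cb(payload));
  }
}

const foo = () => {
  const emitter = new StatusEmitter();
  (async () => {
    // Yield once so the caller can attach listeners before the first emit
    await Promise.resolve();
    emitter.emit('status_change', 'loading');
    await task1();
    emitter.emit('status_change', 'task1 done');
    await task2();
    emitter.emit('status_change', 'done');
  })();
  return emitter;
};

const bar = foo();
bar.on('status_change', status => console.log('status is now', status));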
The best option would be to make use of promises: every time a promise is resolved, you get notified, and then the next promise in the chain is executed.
An example of chaining promises is below:
var function3 = function(resolve, reject) {
  try {
    // do some thing
    console.log('function3 called');
    resolve('function3 success');
  } catch (err) {
    reject(err);
  }
}

var function2 = function(resolve, reject) {
  try {
    // do some thing
    console.log('function2 called');
    resolve('function2 success');
    //return new Promise(function3);
  } catch (err) {
    reject(err);
  }
}

var function1 = function(resolve, reject) {
  try {
    // do some thing
    console.log('function1 called');
    resolve('function1 success');
  } catch (err) {
    reject(err);
  }
}

var promise = new Promise(function1);
promise
  .then(function(response) {
    console.log(response);
    return new Promise(function2);
  }, function(error) {
    console.log(error);
  })
  .then(function(response) {
    console.log(response);
    return new Promise(function3);
  }, function(err) {
    console.log(err);
  })
  .then(function(response) {
    console.log(response);
  }, function(err) {
    console.log(err);
  })
//output
"function1 called"
"function1 success"
"function2 called"
"function2 success"
"function3 called"
"function3 success"

socket.io and async events

I'm using socket.io and mongoose in my express server.
My socket is listening for events using the following code:
socket.on('do something', async () => {
  try {
    await doA();
    doX();
    await doB();
    doY();
    await doC();
  } catch (error) {
    console.log(error);
  }
});
doA, doB and doC are async operations that write to the database using mongoose, but in general they can be any methods returning a promise.
I want 'do something' to run synchronously.
If the event queue processes multiple events at the same time, I get consistency problems in my MongoDB.
In other words, if the server receives two 'do something' events, I want the second event to be processed only when the first event has been fully processed (after the await doC). Unfortunately, the 'do something' callback is async.
How to handle this?
It's possible to implement a queue by adding the functions you want to run to an array, and then running them one by one. I've created an example below.
let queue = [];
let running = false;

const delay = (t, v) => {
  return new Promise((resolve) => {
    setTimeout(resolve.bind(null, "Returned value from Promise"), t)
  });
}

const onSocketEvent = async () => {
  console.log("Got event");
  if (!running) {
    console.log("Nothing in queue, fire right away");
    return doStuff();
  }
  // There's something in the queue, so add it to it
  console.log("Queuing item")
  queue.push(doStuff);
}

const doStuff = async () => {
  running = true;
  const promiseResult = await delay(2000);
  console.log(promiseResult);
  if (queue.length > 0) {
    console.log("There's more in the queue, run the next one now")
    queue.shift()();
  } else {
    console.log("Queue empty!")
    running = false;
  }
}

onSocketEvent();
setTimeout(() => onSocketEvent(), 1000);
setTimeout(() => onSocketEvent(), 1500);
setTimeout(() => onSocketEvent(), 2000);
setTimeout(() => onSocketEvent(), 2500);
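An alternative way to get the same serialization (a sketch of my own, not part of the original answer, reusing socket and the do* functions from the question) is to chain every incoming event onto a shared promise, so each handler starts only after the previous one has settled:

let chain = Promise.resolve();

socket.on('do something', () => {
  chain = chain
    .then(async () => {
      await doA();
      doX();
      await doB();
      doY();
      await doC();
    })
    .catch((error) => {
      // Log and swallow errors so one failed handler doesn't block the queue
      console.log(error);
    });
});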
I would suggest adding a delay between each await. This will prevent deadlocks from occurring and fix your issue. For such things, I would suggest using Caolan's async library.
Task delay example:
setTimeout(function() { your_function(); }, 5000); // 5 seconds
If your function has no parameters and no explicit receiver, you can call setTimeout(func, 5000) directly.
Useful jQuery timers plugin
