How to handle redux-saga errors in the wrapper?

This is the Normal Way:
function* saga1() {
  try {
    // do stuff
  } catch (err) {
    // handle err
  }
}
function* saga2() {
  try {
  } catch (err) {
  }
}
function* wrapper() {
  yield [
    takeLatest('saga1', saga1),
    takeLatest('saga2', saga2),
  ];
}
This is the Expected Way:
function* saga1() {
}
function* saga2() {
}
function* wrapper() {
  try {
    yield [
      takeLatest('saga1', saga1),
      takeLatest('saga2', saga2),
    ];
  } catch (err) {
    // handle errors
  }
}
Is there any way to achieve the above way of handling errors? Using the normal way sometimes leads to handling the same error repeatedly.

The easiest way for this case is to fork parallel dynamic effects in the saga. Of course, these are not real threads, just a way to write a sequence of asynchronous operations.
Let's look at your example in depth. A construction like yield [ takeA(), takeB() ] means you delegate the A and B operations to the saga workflow with the supplied callbacks; in other words, execution of wrapper is finished at that point, so a try/catch around it is not appropriate.
As an alternative, you can fork two or more independent saga processes as parallel dynamic effects and run an infinite loop inside each of them.
Code:
import { call } from 'redux-saga/effects';

function* proc1() {
  while (true) {
    yield call(() => new Promise(resolve => setTimeout(resolve, 1500)));
    throw new Error('Err1');
  }
}
function* proc2() {
  while (true) {
    yield call(() => new Promise(resolve => setTimeout(resolve, 2000)));
    throw new Error('Err2');
  }
}
function* watchLoadRequest() {
  try {
    yield [call(proc1), call(proc2)];
  } catch (err) {
    console.log('##ERROR##', err);
  }
}
Of course, you should implement your custom business logic in the parallel procedures. If you need persistent/shared state between them, pass an object argument with the appropriate fields.
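For example, here is a minimal sketch of sharing state through a common object argument (the shared object and its lastRun field are purely illustrative):
import { call } from 'redux-saga/effects';

function* proc1(shared) {
  while (true) {
    yield call(() => new Promise(resolve => setTimeout(resolve, 1500)));
    // update the shared object instead of keeping local state
    shared.lastRun = Date.now();
  }
}
function* proc2(shared) {
  while (true) {
    yield call(() => new Promise(resolve => setTimeout(resolve, 2000)));
    console.log('proc1 last ran at', shared.lastRun);
  }
}
function* watchLoadRequest() {
  // both procedures receive the same object reference
  const shared = { lastRun: null };
  try {
    yield [call(proc1, shared), call(proc2, shared)];
  } catch (err) {
    console.log('##ERROR##', err);
  }
}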

Retry failed API call in for...of loop

I am trying to build a wrapper over the Notion JS SDK's iteratePaginatedAPI that handles errors as well. I feel particularly lost on how to catch API errors in such a way that I can actually retry them (i.e. retry the iteration that failed). Here's my attempt:
async function* queryNotion(listFn, firstPageArgs) {
  try {
    for await (const result of iteratePaginatedAPI(listFn, firstPageArgs)) {
      yield* result
    }
  } catch (error) {
    if (error.code === APIErrorCode.RateLimited) {
      console.log('rate_limited');
      console.log(error);
      sleep(1);
      // How would I retry the last iteration?
    }
  }
}
Coming from the Ruby world, I would reach for retry inside a rescue block. Any help would be appreciated!
Very interesting problem. The issue is that the exception comes from the for await itself, not from its body, so you cannot catch it inside the loop; once the exception hits, the loop is over.
Note that the iterator might already be done after a rejection/exception, in which case there is nothing you can do except start a new one.
That said, you can always call the iterator's next() yourself and process the result manually. The mock below is a synchronous generator that yields promises, so its next() returns an object like {value: Promise<any>, done: boolean}; running it in a loop, you can await the promise in a try..catch and only exit the loop when done becomes true. (With a true async iterator, next() itself returns a promise for that object, which you would await first.)
async function* queryNotion(listFn, firstPageArgs) {
  const asyncGenerator = mockIteratePaginatedAPI(listFn, firstPageArgs)
  while (true) {
    const current = asyncGenerator.next()
    if (current.done) {
      break
    }
    try {
      yield* await current.value
    } catch (e) {
      console.log(`got exception: "${e}" - trying again`)
      continue
    }
  }
}
function* mockIteratePaginatedAPI(a, b) {
  for (let i = 0; i < 8; i++) {
    yield new Promise((resolve, reject) => setTimeout(() => [3, 5].includes(i) ? reject(`le error at ${i}`) : resolve([i]), 500))
  }
}
(async function() {
  for await (const n of queryNotion('foo', 'bar')) {
    console.log(n)
  }
})()
If we keep a reference to the generator, we can also put it back into a for await...of. This might be easier to read; however, a for await...of will call the iterator's return() when exiting the loop early, which will usually finish the iterator, and in that case this approach will not work:
async function* queryNotion(listFn, firstPageArgs) {
  const asyncGenerator = mockIteratePaginatedAPI(listFn, firstPageArgs)
  while (true) {
    try {
      for await (const result of asyncGenerator) {
        yield* result
      }
      break
    } catch (e) {
      console.log('got exception:', e, 'trying again')
    }
  }
}
function* mockIteratePaginatedAPI(a, b) {
  for (let i = 0; i < 8; i++) {
    yield new Promise((resolve, reject) => setTimeout(() => [3, 5].includes(i) ? reject(`le error at ${i}`) : resolve([i]), 500))
  }
}
(async function () {
  for await (const n of queryNotion('foo', 'bar')) {
    console.log(n)
  }
})()
Simply wrap the try/catch in a loop so that a continue statement inside your if can retry:
async function* queryNotion(listFn, firstPageArgs) {
  while (true) {
    try {
      for await (const result of iteratePaginatedAPI(listFn, firstPageArgs)) {
        yield* result
      }
      break
    } catch (error) {
      if (error.code === APIErrorCode.RateLimited) {
        console.log('rate_limited');
        console.log(error);
        await sleep(1);
        continue; // retry (note: this restarts the pagination from the first page)
      }
      throw error; // rethrow anything we don't know how to handle
    }
  }
}
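As a side note, the sleep used above is not built into JavaScript; a minimal promise-based helper (assuming it takes seconds, as in Ruby) could be:
// resolves after the given number of seconds
function sleep(seconds) {
  return new Promise(resolve => setTimeout(resolve, seconds * 1000));
}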

Best way to handle recursion with async functions and promises?

The following is pseudocode to illustrate my problem. The parent function must ultimately return a promise when all of the tasks are done (I've omitted the others for clarity). The parent function calls child functions and some of the child functions have to perform their tasks recursively and so, for clarity, I've separated them into worker functions. If there is a cleaner way I would love to learn it.
How best to handle the recursion in this example?
// This function must ultimately return a Promise.
async function parentFunction(uId) {
  try {
    await childFunction(uId);
    return Promise.resolve(uId);
  } catch (error) {
    console.log(error);
  }
}
async function childFunction(uId) {
  try {
    const done = await workerFunction(uId);
    if (done) {
      return Promise.resolve(true);
    } else {
      // There are more files to delete; best way to handle recursion?
    }
  } catch (error) {
    console.log(error);
  }
}
async function workerFunction(uId) {
  try {
    // Query the database, limit to 100 files.
    const query = await db.queryFiles().limit(100);
    if (query.size == 0) {
      // Nothing to delete, we're done!
      return Promise.resolve(true);
    }
    // Perform an atomic (all-or-none) batch delete that can only take 100 files at most.
    await db.batchDelete(query);
    // Batch delete successful!
    if (query.size < 100) {
      // The query was less than 100 files so there can be no more files to delete.
      return Promise.resolve(true);
    } else {
      // There may possibly be more files to delete.
      // Return a promise or handle recursion here?
      return Promise.resolve(false);
    }
  } catch (error) {
    console.log(error);
  }
}
Just do recursion, it's fine!
async function deleteFiles() {
  const query = await db.queryFiles().limit(100)
  if (query.size > 0) {
    await db.batchDelete(query)
  }
  if (query.size === 100) {
    return deleteFiles()
  }
  return true;
}
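Plugged back into the question's structure, a minimal sketch could look like this (assuming the same db object; deleteFiles does not need the uId here):
async function parentFunction(uId) {
  try {
    // recurses internally until there is nothing left to delete
    await deleteFiles();
    return uId; // an async function already wraps the return value in a Promise
  } catch (error) {
    console.log(error);
  }
}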

Events handled by another saga (Redux-Saga)

I'm trying to create an error interceptor for all my sagas. After analyzing redux-saga's eventChannel I tried to create a saga like this:
export function interceptor() {
  return eventChannel(() => {
    api.interceptors.response.use(
      (response) => response,
      (error) => {
        const { response } = error;
        if ([401, 403].includes(response.status)) {
          emit(AuthCreators.logoutRequest());
        }
        return Promise.reject(error);
      }
    );
    return () => null;
  });
}
In rootSaga it is being called this way:
export default function* rootSaga() {
  return yield all([fork(interceptor), anotherSaga, anotherSaga2]);
}
This way, every time one of my other sagas hits a catch the interceptor is triggered; however, the emit that should trigger the logoutRequest that lives in another saga is never fired.
How can the emit call the other saga?
Is this the best way to create an error interceptor?
Thanks in advance.
Maybe I don't understand your question exactly, but this might help:
const sagas = [
  anotherSaga, anotherSaga2
];

export default function* rootSaga() {
  yield all(sagas.map((saga) => spawn(function* () {
    try {
      yield call(saga);
    } catch (e) {
      // console.log(e);
      // here we should store all errors in some service...
    }
  })));
}
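For completeness: the emit in the question only reaches another saga if the channel's emitter argument is actually used and some saga takes from the channel and re-dispatches what was emitted. A minimal sketch of that wiring, assuming the same api instance and AuthCreators from the question, might look like this:
import { eventChannel } from 'redux-saga';
import { call, put, take } from 'redux-saga/effects';

function createInterceptorChannel() {
  return eventChannel((emit) => {
    // the subscriber receives the emitter as its argument
    api.interceptors.response.use(
      (response) => response,
      (error) => {
        if ([401, 403].includes(error.response.status)) {
          emit(AuthCreators.logoutRequest());
        }
        return Promise.reject(error);
      }
    );
    return () => null; // unsubscribe function
  });
}

export function* interceptor() {
  const channel = yield call(createInterceptorChannel);
  while (true) {
    // every emitted action is dispatched to the store,
    // so the saga watching the logout action can react to it
    const action = yield take(channel);
    yield put(action);
  }
}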

redux-saga yield all cancels the other effects when one fails

I have an issue with the yield all saga effect; my sample code is below:
function* fetchData(item) {
  try {
    const data = yield call(request, url);
    yield put(fetchDataSuccess(data));
  } catch (error) {
    yield put(fetchDataFailure(error));
    throw error;
  }
}
function* fetchSummary(action) {
  try {
    yield all(
      list.map(item =>
        call(fetchData, item)
      )
    );
  } catch (error) {
    yield put(
      enqueueSnackbar({
        message: "Has Error",
        options: { variant: "error" }
      })
    );
  }
}
The logic is that I want to make multiple requests, some of which succeed and some of which fail.
Expected: if a request fails, the error is caught after yield all, but the successful requests still continue, and "fetchDataSuccess" is dispatched after each individual successful request (Promise.all can do this).
Actual: if a request fails, the error is caught after yield all, and then saga immediately cancels all the other "fetchData" calls.
Can anyone help me achieve this logic? Thanks in advance.
The "Actual" behavior that you are describing fits with what I am seeing in your code. As soon as any error is thrown, we leave the try block and enter the catch block.
When we yield an array of effects, the generator is blocked until all the effects are resolved or as soon as one is rejected (just like how Promise.all behaves). - docs
If you want every fetch to run to completion, you need to catch the error inside each individual call instead of letting it bubble up through the yield all. You can either map to an array of true/false results or set a value on error. Or, if you don't mind having multiple snackbars, you could put enqueueSnackbar inside fetchData instead of in fetchSummary (a sketch of that variant follows the code below).
Here's one way to do it:
// modified to return either true or false
function* fetchData(item) {
  try {
    const data = yield call(request, item);
    yield put(fetchDataSuccess({ item, data }));
    return true;
  } catch (error) {
    yield put(fetchDataFailure({ item, error }));
    return false;
  }
}
function* fetchSummary(action) {
  const results = yield all(
    action.payload.list.map((item) => call(fetchData, item))
  );
  // check if any of the results were false
  const hasError = results.some((res) => !res);
  if (hasError) {
    yield put(
      enqueueSnackbar({
        message: "Has Error",
        options: { variant: "error" }
      })
    );
  }
}
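And a sketch of the multiple-snackbars variant mentioned above, with enqueueSnackbar moved into fetchData (the message text here is just illustrative):
// variant: report each failure where it happens instead of aggregating
function* fetchData(item) {
  try {
    const data = yield call(request, item);
    yield put(fetchDataSuccess({ item, data }));
  } catch (error) {
    yield put(fetchDataFailure({ item, error }));
    yield put(
      enqueueSnackbar({
        message: `Request for ${item} failed`,
        options: { variant: "error" }
      })
    );
    // no rethrow, so the other calls in the yield all keep running
  }
}
function* fetchSummary(action) {
  yield all(action.payload.list.map((item) => call(fetchData, item)));
}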

Redux-saga isn't waiting for API calls to resolve, keeps returning promises. How do I make "yield call" wait for API calls?

Where I'm making the API request:
function* search(value){
  // return new Promise(async (resolve, reject) => {
  //   try {
  //     const res = await axios.get(`https://www.breakingbadapi.com/api/characters?name=${value}`)
  //     console.log(res.data)
  //     resolve(res.data)
  //   } catch (error) {
  //     reject(error)
  //   }
  // })
  return axios.get(`https://www.breakingbadapi.com/api/characters?name=${value}`)
    .then(res => {
      console.log(res.data)
    })
    .catch(err => {
      // console.log(err)
    })
}
Where I'm calling this function to get the result and put it in state:
function* startSearch(value){
  try {
    yield put({ type: 'loading' })
    const person = yield call(search, value)
    console.log("SAGA", person)
    yield put({ type: 'success', payload: person })
  } catch (error) {
    yield put({ type: 'failure' })
  }
}
As you can see, I've tried wrapping the API call in a promise and in a typical .then/.catch. No matter what, I keep logging a promise and no object is stored in state as intended. According to the docs, yield call is supposed to pause the generator if it returns a promise, but that doesn't seem to happen.
Edit: This is why you need to step away from the screen, folks. All I needed to do was remove the "*" from the search function. Super easy.
Deceptively simple. All you need to do is wrap the promise-returning call in its own plain function, which you can then invoke with redux-saga's call().
I think you just need to make your search function a regular function rather than a generator function (remove the *):
function search(url) {
  return axios({ url })
    .then(response => response.data)
    .catch(err => {
      log.error(err);
    });
}
function* startSearch() {
  const response = yield call(search, downloadURL);
}
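Adapted to the question's own code, a minimal sketch (keeping the Breaking Bad API URL and the action types from the question) could be:
import axios from 'axios';
import { call, put } from 'redux-saga/effects';

// plain function, no *: it just returns the promise for the data
function search(value) {
  return axios
    .get(`https://www.breakingbadapi.com/api/characters?name=${value}`)
    .then(res => res.data);
}

function* startSearch(value) {
  try {
    yield put({ type: 'loading' });
    // yield call pauses the saga until the returned promise resolves
    const person = yield call(search, value);
    yield put({ type: 'success', payload: person });
  } catch (error) {
    yield put({ type: 'failure' });
  }
}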
