ReactJS: how to make two backend requests at once? - javascript

Is it possible to make two backend requests at once from React?
The code below makes the first backend call. The POST request gets sent to the backend, and then I would like to make another request. Is that possible at all, or do I have to wait for the backend response before the next request can be made?
What I basically want is to get information about how many files have been uploaded. The upload can take 3 minutes, and right now the user only sees a loading icon. I want to additionally show a text like "50 of 800 literatures uploaded" and, 10 seconds later, "100 of 800 literatures uploaded".
This is basically my code:
class ProjectLiterature extends Component {
  constructor(props) {
    super(props);
    this.state = {
      isLoading: false,
    };
  }

  addLiterature(data, project_name) {
    this.setState({ isLoading: true }, () => {
      axios.post("http://127.0.0.1:5000/sendLiterature", data)
        .then(res => {
          this.setState({ isLoading: false });
        });
    });
  }
}

If both requests do not depend on each other, you can make use of JavaScript's Promise.all() for the above purpose.
const request1 = axios.get('http://127.0.0.1:5000/sendLiterature');
const request2 = axios.get(url2);

Promise.all([request1, request2]).then(([res1, res2]) => {
  // handle the rest
}).catch((error) => {
  console.error(error);
  // carry out error handling
});
If the second request relies on the response of the first request, you will have to wait for the first request to complete, since both requests have to be carried out in sequence.
// inside an async function
const res = await axios.get('http://127.0.0.1:5000/sendLiterature');
// carry out the rest
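Since the real goal here is to show upload progress ("50 of 800 literatures uploaded"), one common pattern is to fire the long-running POST and, while it is pending, poll a progress endpoint on an interval. The sketch below assumes a hypothetical GET /uploadProgress endpoint returning { uploaded, total }; only /sendLiterature comes from the question.
```
import axios from 'axios';

// '/uploadProgress' and its { uploaded, total } shape are assumptions about the backend;
// '/sendLiterature' is the endpoint from the question.
async function uploadWithProgress(data, onProgress) {
  const progressTimer = setInterval(async () => {
    try {
      const { data: progress } = await axios.get('http://127.0.0.1:5000/uploadProgress');
      onProgress(`${progress.uploaded} of ${progress.total} literatures uploaded`);
    } catch (err) {
      console.error(err); // a failed progress check should not break the upload
    }
  }, 10000); // every 10 seconds, as described in the question

  try {
    return await axios.post('http://127.0.0.1:5000/sendLiterature', data);
  } finally {
    clearInterval(progressTimer); // stop polling whether the upload succeeds or fails
  }
}

// In the component:
// this.setState({ isLoading: true });
// uploadWithProgress(data, text => this.setState({ progressText: text }))
//   .finally(() => this.setState({ isLoading: false }));
```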

You can check the axios docs for this purpose; axios supports multiple requests out of the box.
You can use Promise.all instead of axios.all as well, but if one of the requests fails you won't be able to get the responses of the successful calls. If you want the successful responses even though some calls fail, you can use Promise.allSettled.
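For example, a minimal sketch (the first URL is reused from the question; url2 is a placeholder for the second endpoint):
```
import axios from 'axios';

const url2 = '...'; // placeholder for the second endpoint

async function fetchBoth() {
  // allSettled waits for every request to either fulfill or reject, so the
  // successful responses are still available even if one of the calls fails.
  const results = await Promise.allSettled([
    axios.get('http://127.0.0.1:5000/sendLiterature'),
    axios.get(url2),
  ]);

  results.forEach((result, i) => {
    if (result.status === 'fulfilled') {
      console.log(`request ${i} succeeded`, result.value.data);
    } else {
      console.error(`request ${i} failed`, result.reason);
    }
  });
}
```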

Related

Request to API and waiting to answer

I need to send a GET request with username=alison&date=2021 to file.php, and resend the request every 200 ms as long as I don't get one of the two possible responses, "yes" or "no" (i.e. retry on a blank response or an error).
When a response is received, I want to perform different actions: one function after receiving "yes" and another function after receiving "no".
I'm not 100% sure what you're asking for here, but here is my method, based on my own interpretation of your question, using a call to GitHub's API. We are using fetch to pull the data; our .then is our resolve and our .catch is our reject.
const url = 'https://api.github.com/users'

const callApi = function (fetchUrl) {
  fetch(fetchUrl)
    .then(response => {
      return response.json(); // Turn the response into JSON
    })
    .then(data => { // If the request was successful, this runs
      console.log(data); // Do whatever you want with the data from the API here
    })
    .catch(err => { // If the request was unsuccessful, this runs
      console.log(err); // Log the error
      setTimeout(() => callApi(fetchUrl), 200); // If it failed, try again in 200 ms
    });
}

callApi(url) // Initial function call
Some things to note: if you're using an API that limits the number of calls you can make per day/month, this will eat through those allotted requests very quickly.
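For the "yes"/"no" part of the question, a minimal sketch could look like the following. handleYes and handleNo are placeholders for the two functions you want to run, and it assumes file.php answers with the plain text "yes" or "no":
```
// Keep requesting file.php every 200 ms until the body is exactly "yes" or "no".
const pollUrl = 'file.php?username=alison&date=2021';

function handleYes() { console.log('got yes'); } // placeholder
function handleNo() { console.log('got no'); }   // placeholder

function poll() {
  fetch(pollUrl)
    .then(response => response.text())
    .then(text => {
      const answer = text.trim();
      if (answer === 'yes') return handleYes();
      if (answer === 'no') return handleNo();
      setTimeout(poll, 200); // blank or unexpected body: try again in 200 ms
    })
    .catch(() => setTimeout(poll, 200)); // network/server error: try again in 200 ms
}

poll();
```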

Angular, how to make consecutive, dependent http get requests

I'd like to execute one http get request after another has completed. The endpoint URL for the second request will need to depend on the first request.
I've tried nesting the requests with something like this in Angular:
this.http.get(endpoint1).subscribe(
  success => {
    this.http.get(endpoint2 + success).subscribe(
      anotherSuccess => console.log('hello stackoverflow!')
    );
  }
);
First question here on stackoverflow, please let me know if I should provide more detail.
Here you can find how to do that; you have various options:
with subscribe:
this.http.get('/api/people/1').subscribe(character => {
  this.http.get(character.homeworld).subscribe(homeworld => {
    character.homeworld = homeworld;
    this.loadedCharacter = character;
  });
});
with mergeMap
this.homeworld = this.http
  .get('/api/people/1')
  .pipe(mergeMap(character => this.http.get(character.homeworld)));
There is also a version with forkJoin, but it isn't as explicit as the subscribe and mergeMap versions.
You could also try using NgRx (Redux-style state management) and trigger the second request from a success action.
The flow would be something like this: dispatch an action from a component -> call the first request -> the request triggers a success action -> a success effect triggers the second request with a payload from the previous request, as in the sketch below.
Read the docs here
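If you go the NgRx route, the "success action triggers the second request" step usually lives in an effect. This is only a rough sketch: the action type strings, the effect class and the payload shape are made up for illustration; only the overall flow comes from the answer above.
```
import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Actions, createEffect, ofType } from '@ngrx/effects';
import { of } from 'rxjs';
import { map, mergeMap, catchError } from 'rxjs/operators';

@Injectable()
export class CharacterEffects {
  // Wait for the success action of the first request, then use its payload
  // (the character) to build and fire the second, dependent request.
  loadHomeworld$ = createEffect(() =>
    this.actions$.pipe(
      ofType('[Character] Load Success'),
      mergeMap((action: any) =>
        this.http.get(action.character.homeworld).pipe(
          map(homeworld => ({ type: '[Character] Load Homeworld Success', homeworld })),
          catchError(error => of({ type: '[Character] Load Homeworld Failure', error }))
        )
      )
    )
  );

  constructor(private actions$: Actions, private http: HttpClient) {}
}
```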

I'm currently making 3 GET requests for JSON files using Axios, are they loaded simultaneously or one after another?

The application that I'm making loads 3 JSON files in order to get information about a game's characters, spells and more. Currently I have 3 functions that use axios to make GET requests and then store the responses; however, I'm wondering whether I'm slowing down my load time, because frankly I'm not sure if these JSON files are loaded simultaneously or one after another. Each file takes about 45 ms to load, so if they're loaded one after another I'm looking at around 135 ms of load time, and I'm not liking it.
Currently I've tried 2 ways, but frankly I don't see a difference in the loading time in Chrome's network tab. If you're wondering, the functions are located in my Vue.js Vuex store and the calls are executed in App.vue's mounted hook.
The first way uses 3 separate functions and each makes its own GET request. Then these functions are called one after another.
The call:
this.$store.dispatch('getChampions')
this.$store.dispatch('getSummonerSpells')
this.$store.dispatch('getSummonerRunes')
The functions:
getChampions({commit, state}){
  axios.get("https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/champion.json")
    .then((response) => {
      commit('champions', {
        champions: response.data.data
      })
    })
    .catch(function (error) {
      console.log(error);
    })
},
getSummonerSpells({commit, state}){
  axios.get("http://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/summoner.json")
    .then((response) => {
      commit('summonerSpells', {
        summonerSpells: response.data.data
      })
    })
    .catch(function (error) {
      console.log(error);
    })
},
getSummonerRunes({commit, state}){
  axios.get("https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/runesReforged.json")
    .then((response) => {
      commit('summonerRunes', {
        summonerRunes: response.data
      })
    })
    .catch(function (error) {
      console.log(error);
    })
}
And using the second way, I have 1 function like this:
The call:
this.$store.dispatch('getRequirements')
The function:
getRequirements({commit, state}){
  axios.all([
    axios.get('https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/champion.json'),
    axios.get('http://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/summoner.json'),
    axios.get('https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US/runesReforged.json')
  ])
  .then(axios.spread((response1, response2, response3) => {
    commit('champions', {
      champions: response1.data.data
    })
    commit('summonerSpells', {
      summonerSpells: response2.data.data
    })
    commit('summonerRunes', {
      summonerRunes: response3.data
    })
  }))
}
You're executing the requests in parallel, so your browser will attempt to perform them simultaneously. Whether or not it actually does is up to the browser.
You can use your browser's network console timing column (the Waterfall column in Chrome) to see what's going on.
If your question is "is there a difference between these two approaches?", the answer is "no" as far as timing goes.
If you start running into errors with any particular request, your first option is more robust, since axios.all (used in the second option) will reject the whole promise if any single request fails.
If you want to speed this up, you could create a service that combines the three results into one so you're only making a single request. Then throw in a cache for an extra speed-up.
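As a rough illustration of that last suggestion (and only as a sketch: it assumes a Node/Express server, which is not part of the original Vue setup), a combining endpoint with a naive in-memory cache could look like this:
```
// Hypothetical Express endpoint that fetches the three Data Dragon files once,
// caches the combined result in memory, and serves it as a single response.
import express from 'express';
import axios from 'axios';

const app = express();
const BASE = 'https://ddragon.leagueoflegends.com/cdn/9.14.1/data/en_US';

let cache = null; // naive in-memory cache; add an expiry for real use

app.get('/static-data', async (req, res) => {
  try {
    if (!cache) {
      const [champions, spells, runes] = await Promise.all([
        axios.get(`${BASE}/champion.json`),
        axios.get(`${BASE}/summoner.json`),
        axios.get(`${BASE}/runesReforged.json`),
      ]);
      cache = {
        champions: champions.data.data,
        summonerSpells: spells.data.data,
        summonerRunes: runes.data,
      };
    }
    res.json(cache); // the client now only needs a single request
  } catch (err) {
    res.status(502).json({ error: 'Failed to load static data' });
  }
});

app.listen(3001);
```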
When all the requests are complete, you receive an array containing the response objects in the same order the requests were passed in. The commits run only after all of your requests have completed.

Redirect/ Routing issues in React and Express

I am creating a flight search app that makes external API calls. The data is reorganized and returned in a search results component.
On submit, the data is sent to Express, and it takes about 10 seconds or more to complete all the API calls.
I think I need a loader at some point to cover the delay of the API calls, but I am also unsure how to send/render the data.
As it stands, I have two pages: home.js ('/'), where I make the search that is sent to the server side, and prices.js ('/search'), which, when loaded, fetches the data from the JSON file. But I do not have them connected.
Both files work, but I need to connect them. When I press submit, the user inputs are sent to the server and the API calls are made, but in order to see the results I have to manually refresh localhost:3000/search.
In the Express app, after all the API calls, I tried the res.redirect method; however, the error given was about setting headers after they were sent to the client.
In React, I tried to redirect to the search page after submitting. However, I could not get it to redirect, and as soon as the /search page is loaded it fetches the data from the file. This happens before the API has finished writing to the file, so the previous search results are rendered.
--in app.js
setTimeout(() => {
  Promise.all([promise, promise2]).then(values => {
    return res.redirect('http://localhost:3000/search');
  });
}, 25000);
I had to wrap the API calls in promises so it only redirects after everything has been written to the file.
(in react prices.js)
componentDidMount() {
  fetch('/search')
    .then(res => {
      return res.json()
    })
    .then(res => {
      console.log(res.toString());
      this.setState({flightData: res});
    })
    .catch(error => console.log(error));
}
home.js
```
onChange = (e) => {
  this.setState({
    originOne: e.target.value, originTwo: e.target.value
  });
};

onSubmit = (e) => {
  e.preventDefault();
  const { originOne, originTwo, redirectToResult } = this.state;
};
```
app.js - I have all the functions calling each other in a waterfall style (probably not the best way, I know)
app.post('/', function getOrigins(req,res) {
var OrigOne;
var OrigTwo;
....
function IataCodeSearchOrg1(res, OrigOne, OrigTwo) {
...
findPrices(res,A,B)
}
function findPrices(res, A, B) {
promise = new Promise(function (resolve) {
...
}
}
All the methods are called within each other. The API calls are in a loop, and after each iteration the results are written to the JSON file.
All these functions are inside the app.post method, and I tried to res.redirect from there, but it did not work.
EDIT:
You can't redirect server-side from an XHR request. You would need to redirect client-side.
e.g.
componentDidMount() {
  fetch('/search')
    .then(res => res.json())
    ...
    .then(() => window.location = '/myredirecturl')
}
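Putting that together with the redirectToResult flag that already exists in home.js's state, the client-side flow could look roughly like the sketch below. It is not a drop-in fix: it assumes react-router v5's Redirect, and that the Express route answers the POST with JSON once Promise.all resolves instead of calling res.redirect.
```
// home.js (sketch): POST the search, wait for the server to finish its API calls,
// then flip redirectToResult so react-router navigates to /search.
import React, { Component } from 'react';
import { Redirect } from 'react-router-dom';

class Home extends Component {
  state = { originOne: '', originTwo: '', redirectToResult: false };

  onSubmit = (e) => {
    e.preventDefault();
    const { originOne, originTwo } = this.state;
    fetch('/', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ originOne, originTwo }),
    })
      .then(res => res.json()) // server should res.json(...) after Promise.all resolves
      .then(() => this.setState({ redirectToResult: true }))
      .catch(err => console.log(err));
  };

  render() {
    if (this.state.redirectToResult) {
      return <Redirect to="/search" />; // prices.js then fetches the finished results
    }
    return <form onSubmit={this.onSubmit}>{/* search inputs */}</form>;
  }
}

export default Home;
```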

Requests through service-worker are done twice

I've written a simple service worker to defer requests that fail for my JS application (following this example) and it works well.
But I still have a problem when requests succeed: the requests are made twice, once normally and once by the service worker, due to the fetch() call I guess.
It's a real problem, because when the client wants to save data, it is saved twice...
Here is the code:
const queue = new workbox.backgroundSync.Queue('deferredRequestsQueue');

const requestsToDefer = [
  { urlPattern: /\/sf\/observation$/, method: 'POST' }
]

function isRequestAllowedToBeDeferred (request) {
  for (let i = 0; i < requestsToDefer.length; i++) {
    if (request.method && request.method.toLowerCase() === requestsToDefer[i].method.toLowerCase()
      && requestsToDefer[i].urlPattern.test(request.url)) {
      return true
    }
  }
  return false
}

self.addEventListener('fetch', (event) => {
  if (isRequestAllowedToBeDeferred(event.request)) {
    const requestClone = event.request.clone()
    const promiseChain = fetch(requestClone)
      .catch((err) => {
        console.log(`Request added to queue: ${event.request.url}`)
        queue.addRequest(event.request)
        event.respondWith(new Response({ deferred: true, request: requestClone }))
      })
    event.waitUntil(promiseChain)
  }
})
How can I do this properly?
EDIT:
I think I shouldn't re-fetch() the request (because THIS is the cause of the 2nd request) and should instead wait for the response of the initial request that triggered the fetchEvent, but I have no idea how to do that. The fetchEvent seems to have no way to wait for (and read) the response.
Am I on the right track? How can I know when the request that triggered the fetchEvent has received a response?
You're calling event.respondWith(...) asynchronously, inside of promiseChain.
You need to call event.respondWith() synchronously, during the initial execution of the fetch event handler. That's the "signal" to the service worker that it's your fetch handler, and not another registered fetch handler (or the browser default) that will provide the response to the incoming request.
(While you're calling event.waitUntil(promiseChain) synchronously during the initial execution, that doesn't actually do anything with regard to responding to the request; it just ensures that the service worker isn't automatically killed while promiseChain is executing.)
Taking a step back, I think you might have better luck accomplishing what you're trying to do if you use the workbox.backgroundSync.Plugin along with workbox.routing.registerRoute(), following the example from the docs:
workbox.routing.registerRoute(
  /\/sf\/observation$/,
  workbox.strategies.networkOnly({
    plugins: [new workbox.backgroundSync.Plugin('deferredRequestsQueue')]
  }),
  'POST'
);
That will tell Workbox to intercept any POST requests that match your RegExp, attempt to make those requests using the network, and if it fails, to automatically queue up and retry them via the Background Sync API.
Piggybacking Jeff Posnick's answer, you need to call event.respondWith() and include the fetch() call inside its async function().
For example:
self.addEventListener('fetch', function(event) {
  if (isRequestAllowedToBeDeferred(event.request)) {
    // respondWith() is called synchronously; the async IIFE produces the Response.
    event.respondWith(async function () {
      const promiseChain = fetch(event.request.clone())
        .catch(function (err) {
          // Queue the failed request for a later retry, then hand back a synthetic
          // Response so respondWith() still receives a valid Response object.
          return queue.addRequest(event.request).then(function () {
            return new Response(JSON.stringify({ deferred: true }), {
              headers: { 'Content-Type': 'application/json' }
            });
          });
        });
      event.waitUntil(promiseChain);
      return promiseChain;
    }());
  }
});
This will avoid the issue you're having with the second ajax call.
