Fetch multiple URLs at the same time? - javascript

I'm looking for a way to fetch multiple URLs at the same time. As far as I know, the API can only retrieve the data I want with a single product lookup, so I need to fetch multiple products at once using the URL structure "/products/productID/". Note: this is in Vue.js. This is what my code looks like so far:
In my productServices.js:
const productsService = {
  getCategory(productID) {
    const url = `${config.apiRoot}/products/${productID}`
    return fetch(url, {
      method: 'GET',
      headers: {
        'content-type': 'application/json',
        'Authorization': `Bearer ${authService.getToken()}`
      },
    })
  }
}
In my view:
data() {
  return {
    featuredProduct: [13, 14, 15],
    productName: [],
    productImg: []
  }
}
async mounted() {
  const response = await productsService.getCategory(this.featuredProduct)
  const resJSON = JSON.parse(response._bodyInit)
  this.loading = false
  this.productName = resJSON.name
  this.productImg = resJSON.custom_attributes[0].value
}
So I need to hit all three featuredProduct IDs and store the data. I'm not really sure how to loop through multiple URLs. All of my other API calls have had all the data readily available via search params, but the specific data I need here (the product image) can only be seen by requesting a single product.
Any help is much appreciated!

Like Ricardo suggested, I'd use Promise.all. It takes an array of promises and resolves the promise it returns once all the passed ones have finished (it resolves to an array whose results are in the same order as the requests).
Docs
Promise.all([
  fetch('https://jsonplaceholder.typicode.com/todos/1').then(resp => resp.json()),
  fetch('https://jsonplaceholder.typicode.com/todos/2').then(resp => resp.json()),
  fetch('https://jsonplaceholder.typicode.com/todos/3').then(resp => resp.json()),
]).then(console.log)
Using map + Promise.all (tested)
Promise.all([1, 2, 3].map(id =>
  fetch(`https://jsonplaceholder.typicode.com/todos/${id}`).then(resp => resp.json())
)).then(console.log);
If you have multiple products in an array which need to be fetched, you could just use:
Code not tested
Promise.all(productIds.map(productId =>
  fetch(`https://url/products/${productId}`)
)).then(() => {/* DO STUFF */});
A little suggestion on storing your data: if you store everything in one array, it makes the whole job way easier. So you could do
fetchFunction().then(results => this.products = results);
/*
this.products would then have a structure something like this:
Array of Objects: {
  name: "I'm a name",
  displayName: "Please display me",
  price: 10.4
  // And so on
}
*/

Because you have an array of products, I'd start by changing your state names:
data() {
  return {
    productIds: [13, 14, 15],
    productNames: [],
    productImages: [],
  };
},
Then you can use Promise.all to fetch the products in parallel:
async mounted() {
  const responses = await Promise.all(
    this.productIds.map(id => productsService.getCategory(id))
  );
  responses.forEach((response, index) => {
    const resJSON = JSON.parse(response._bodyInit);
    this.productNames[index] = resJSON.name;
    this.productImages[index] = resJSON.custom_attributes[0].value;
  });
  this.loading = false;
}
You could also consider refactoring getCategory to do the parsing for you and return an object containing a name and an image. That way, mounted wouldn't have to know about the internal response structure.
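An untested sketch of what that refactor could look like (config and authService come from the question's service file; the standard response.json() replaces the _bodyInit access):
const productsService = {
  // Fetch one product and return only the fields the view needs.
  async getCategory(productID) {
    const url = `${config.apiRoot}/products/${productID}`;
    const response = await fetch(url, {
      method: 'GET',
      headers: {
        'content-type': 'application/json',
        'Authorization': `Bearer ${authService.getToken()}`,
      },
    });
    const product = await response.json(); // standard parsing instead of response._bodyInit
    return {
      name: product.name,
      image: product.custom_attributes[0].value, // same field the question reads
    };
  }
}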

Check the Promise.all method
Maybe you can create the calls that you need by iterating over your data and then request them in bulk.
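For instance, inside an async mounted() hook, a minimal untested sketch reusing the question's featuredProduct array and service:
// Build one call per product id, then run them all in bulk.
const calls = this.featuredProduct.map(id => productsService.getCategory(id));
const responses = await Promise.all(calls);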

Related

React Native API FETCH Different names for each objects

I am connecting to a REST API from a React Native app. I have a JSON response with file objects that have different names, but all the objects have the same variables: filename, message, and display.
The number of objects changes with each request to the (REST) API, and the names of the objects in the response differ between requests, but the variables in each object are the same as above.
The information I need from this response is only the filename text, but a list of objects would also be acceptable, so I can read through the messages from errors.
The image shows what my objects look like.
This is my fetch request:
const getGists = async () => {
  await axios
    .get(`https://api.github.com/gists/public?per_page=30`)
    .then((r) => {
      let n;
      for (n = 0; n < 30; n++) {
        console.log(r.data[n].files.filename);
        // console.log("____________________");
        // console.log(r.data[n].owner.avatar_url);
        // console.log("____________________");
        // console.log(JSON.stringify(r.data[n].files));
      }
    })
    .catch((e) => {
      console.log("ERROR", e);
    });
};
How is it possible to get every filename from these requests, even if the object names are not the same in each iteration? Thanks for the help.
Working with the result of the API call and some higher-order functions, this will work fine:
const getGists = async () => {
  await axios
    .get(`https://api.github.com/gists/public?per_page=30`)
    .then((response) => {
      const myDesireResult = response.data.reduce((acc, item) => {
        const files = Object.values(item.files);
        if (files.length > 1) {
          files.forEach((file) => acc.push(file.filename));
        } else {
          acc.push(files[0].filename);
        }
        return acc;
      }, []);
      console.log(myDesireResult);
    })
    .catch((e) => {
      console.log("ERROR", e);
    });
};
Explanation:
In the then block, we can get the API call result with response.data.
With the reduce function, we loop through the data.
Since the objects inside files have different names, we can get the files easily with Object.values().
Some files contain several items while most have just one, so by checking the length of the files array we can take the proper action: if it has more than one element, another simple loop traverses it.
Check the working example on codesandbox
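As a design note, the length check isn't strictly required, since looping works for one-element arrays too; a more compact sketch of the same idea using flatMap (assuming the same response shape):
// Every gist's files object becomes its filenames; flatMap flattens the per-gist arrays.
const filenames = response.data.flatMap((gist) =>
  Object.values(gist.files).map((file) => file.filename)
);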

React.JS, how to edit the response of a first API call with data from a second API call?

I need to display some data in my component. Unfortunately, the first call to my API returns just part of the information I want to display, plus some IDs, and I need another call on those IDs to retrieve the other meaningful data. The first call is wrapped in a React useEffect() function:
useEffect(() => {
  const getData = async () => {
    try {
      const { data } = await fetchContext.authAxios.get(
        '/myapi/' + auth.authState.id
      );
      setData(data);
    } catch (err) {
      console.log(err);
    }
  };
  getData();
}, [fetchContext]);
It returns an array of objects, each representing an appointment for a given employee, as follows:
[
  {
    "appointmentID": 1,
    "employeeID": 1,
    "customerID": 1,
    "appointmentTime": "11:30",
    "confirmed": true
  },
  ... many more appointments
]
Now I would like to retrieve information about the customer as well, like name, telephone number, etc. I tried setting up another method like getData() that would return the piece of information I needed as I looped through the appointments to display them as rows of a table, but I learned the hard way that functions called in render methods should not have any side effects. What is the best approach to make another API call, replacing each "customerID" with an object that stores the customer's ID plus the other data?
[Below is the approach I've tried; it returns an [object Promise]]
const AppointmentElements = () => {
  // Loop through each Appointment to create a single row
  var output = Object.values(data).map((i) =>
    <Appointment
      key={i['appointmentID'].toString()}
      employee={i["employeeID"]} // returned an [object Promise]
      customer={getEmployeeData((i['doctorID']))} // returns an [object Promise]
      time={index['appointmentTime']}
      confirmed={i['confirmed']}
    />
  );
  return output;
};
As you yourself mentioned, functions called in render methods should not have any side effects, so you shouldn't be calling the getEmployeeData function inside render.
What you can do instead is, inside the same useEffect and the same getData where you call the first API, call the second API as well, nested within the first API call, and put the complete data in a state variable. Then, inside the render method, loop through this complete data instead of the data from just the first API.
Let me know if you need help in calling the second api in getData, I would help you with the code.
Update (added the code)
Your useEffect should look something like this (note that the map callback has to be async, and the resulting array of promises has to be awaited with Promise.all):
useEffect(() => {
  const getData = async () => {
    try {
      const { data } = await fetchContext.authAxios.get('/myapi/' + auth.authState.id);
      const updatedData = await Promise.all(data.map(async (value) => {
        // please make necessary changes to the api call
        const customer = await fetchContext.authAxios.get('/mySecondApi/?customerId=' + value.customerID);
        return {
          ...value, // spread the original appointment fields
          customerID: customer.data
          // as you asked, the customer data replaces the customerID field
        };
      }));
      setData(updatedData); // this data contains the customer's details in its customerID field, along with all other fields returned by your first api call
    } catch (err) {
      console.log(err);
    }
  };
  getData();
}, [fetchContext]);
This is assuming that you have an api which accepts only one customer ID at a time.
If you have a better api which accepts a list of customer IDs, then the above code can be modified to:
useEffect(() => {
  const getData = async () => {
    try {
      const { data } = await fetchContext.authAxios.get('/myapi/' + auth.authState.id);
      const customerIdList = data.map(value => value.customerID);
      // this fetches the details of all customers in one go
      const customersDetails = (await fetchContext.authAxios.post('/mySecondApi/', { customerIdList })).data;
      // please make necessary changes to the api call
      const updatedData = data.map(value => {
        // find this particular customer's details and merge them into the data from the first api call
        const customerDetails = customersDetails.find(c => c.customerID === value.customerID);
        return {
          ...value,
          customerID: customerDetails
          // as you asked, the customer data replaces the customerID field
        };
      });
      setData(updatedData); // as above, the customer details end up in the customerID field
    } catch (err) {
      console.log(err);
    }
  };
  getData();
}, [fetchContext]);
This reduces the number of network calls and is generally the preferred way, if your API supports it.

How to use dataloader?

I'm trying to figure this out.
I want to get all my users from my database, cache them,
and then, when making a new request, get the ones I've cached plus any new ones that have been created.
So far:
const batchUsers = async ({ user }) => {
  const users = await user.findAll({});
  return users;
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: {
    userLoader: new DataLoader(() => batchUsers(db)), // not sending keys since I'm after all users
  },
});
my resolver:
users: async (obj, args, context, info) => {
  return context.userLoader.load();
}
The load method requires a parameter, but in this case I don't want a specific user, I want all of them.
I don't understand how to implement this; can someone please explain?
If you're trying to just load all records, then there's not much of a point in utilizing DataLoader to begin with. The purpose behind DataLoader is to batch multiple calls like load(7) and load(22) into a single call that's then executed against your data source. If you need to get all users, you should just call user.findAll directly.
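In that case the resolver can hit the model directly; a minimal sketch, assuming the db wrapper from the question is exposed on the context:
// No DataLoader involved: fetching all rows is a single query anyway.
users: (obj, args, { db }) => db.user.findAll(),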
Also, if you do end up using DataLoader, make sure you pass in a function, not an object as your context. The function will be ran on each request, which will ensure you're using a fresh instance of DataLoader instead of one with a stale cache.
context: () => ({
  userLoader: new DataLoader(async (ids) => {
    const users = await User.findAll({
      where: { id: ids }
    })
    // Note that we need to map over the original ids instead of
    // just returning the results of User.findAll because the
    // length of the returned array needs to match the length of the ids
    return ids.map(id => users.find(user => user.id === id) || null)
  }),
}),
Note that you could also return an instance of an error instead of null inside the array if you want load to reject.
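For instance, the map above could become:
// A missing id now rejects the corresponding load() call instead of resolving to null.
return ids.map(id => users.find(user => user.id === id) || new Error(`No user with id ${id}`))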
Took me a while but I got this working:
const batchUsers = async (keys, { user }) => {
  const users = await user.findAll({
    raw: true,
    where: {
      Id: {
        // #ts-ignore
        // eslint-disable-next-line no-undef
        [op.in]: keys,
      },
    },
  });
  const gs = _.groupBy(users, 'Id');
  return keys.map(k => gs[k] || []);
};

const apolloServer = new ApolloServer({
  schema,
  playground: true,
  context: () => ({
    userLoader: new DataLoader(keys => batchUsers(keys, db)),
  }),
});
resolver:
user: {
  myUsers: ({ Id }, args, { userLoader }) => {
    return userLoader.load(Id);
  },
},
playground:
{
  users {
    Id
    myUsers {
      Id
    }
  }
}
Playground explained: users fetches all users, and then myUsers does the same thing by inheriting the Id from the first call.
I think I chose a poor example here, since I did not see any performance gains from this. I did see, however, that the query turned into:
SELECT ... FROM User WHERE ID IN (...)

How to make a multiple api call on submission of a form in react

I have a form with 10 (may vary) rows of data. On submission, I need to make 10 (may vary) API calls, one for each row. I am using axios to make the API calls. How can I make multiple API calls on a single click in the best and most efficient way?
Axios supports the Promise API, so you can use Promise.all to handle all 10 requests at once. Here is a small example:
const requests = [
  { url: "https://some.url", body: { some: "body" } },
  { url: "https://some.other.url", body: { some: "other body" } },
  // As many as you like
];

const promises = requests.map(request => axios.post(request.url, request.body));

Promise.all(promises)
  .then(results => { /* results[i] is the response for requests[i] */ })
  .catch(error => console.log(`Something went wrong: ${error}`));
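If the submit handler is async anyway, the same thing reads naturally with await; a sketch with a hypothetical handleSubmit name:
// handleSubmit is a hypothetical name for the form's onSubmit callback.
const handleSubmit = async () => {
  try {
    const results = await Promise.all(
      requests.map(request => axios.post(request.url, request.body))
    );
    // All responses are available here, in request order.
  } catch (error) {
    console.log(`Something went wrong: ${error}`);
  }
};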
You can use Bluebird as well.
import Bluebird from 'bluebird';

const deleteRequests = (requests) => {
  let promiseCollection = [];
  try {
    requests.map((request, index) => {
      promiseCollection.push(axios.delete(request.API + request.ids));
    });
    return Bluebird.all(promiseCollection);
  } catch (error) {
    // handle or log the error here
  }
}

Batch update in knex

I'd like to perform a batch update using Knex.js
For example:
'UPDATE foo SET [theValues] WHERE idFoo = 1'
'UPDATE foo SET [theValues] WHERE idFoo = 2'
with values:
{ name: "FooName1", checked: true } // to `idFoo = 1`
{ name: "FooName2", checked: false } // to `idFoo = 2`
I was using node-mysql previously, which allows multiple statements. While using that, I simply built a multiple-statement query string and sent it over the wire in a single run.
I'm not sure how to achieve the same with Knex. I can see batchInsert as an API method I can use, but nothing as far as batchUpdate is concerned.
Note:
I can do an async iteration and update each row separately. That's bad because it means lots of round trips between the server and the DB.
I can use Knex's raw() and probably do something similar to what I do with node-mysql. However, that defeats the whole purpose of Knex being a DB abstraction layer (it introduces strong DB coupling).
So I'd like to do this using something "knex-y".
Any ideas welcome.
I needed to perform a batch update inside a transaction (I didn't want to have partial updates in case something went wrong).
I resolved it the following way:
// I wrap knex as 'connection'
return connection.transaction(trx => {
  const queries = [];
  users.forEach(user => {
    const query = connection('users')
      .where('id', user.id)
      .update({
        lastActivity: user.lastActivity,
        points: user.points,
      })
      .transacting(trx); // This makes every update be part of the same transaction
    queries.push(query);
  });

  Promise.all(queries) // Once every query is written
    .then(trx.commit) // We try to execute all of them
    .catch(trx.rollback); // And rollback in case any of them goes wrong
});
Assuming you have a collection of valid keys/values for the given table:
// abstract transactional batch update
function batchUpdate(table, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(table)
        .where('id', tuple.id)
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate('user', [...]);
Are you unfortunately subject to non-conventional column names? No worries, I got you fam:
function batchUpdate(options, collection) {
  return knex.transaction(trx => {
    const queries = collection.map(tuple =>
      knex(options.table)
        .where(options.column, tuple[options.column])
        .update(tuple)
        .transacting(trx)
    );
    return Promise.all(queries)
      .then(trx.commit)
      .catch(trx.rollback);
  });
}
To call it
batchUpdate({ table: 'user', column: 'user_id' }, [...]);
Modern Syntax Version (note the function has to be async for the awaits to work):
const batchUpdate = async (options, collection) => {
  const { table, column } = options;
  const trx = await knex.transaction();
  try {
    await Promise.all(collection.map(tuple =>
      knex(table)
        .where(column, tuple[column])
        .update(tuple)
        .transacting(trx)
    ));
    await trx.commit();
  } catch (error) {
    await trx.rollback();
  }
}
You have a good idea of the pros and cons of each approach. I would recommend a raw query that bulk updates over several async updates. Yes, you can run them in parallel, but your bottleneck becomes the time it takes for the db to run each update. Details can be found here.
Below is an example of a batch upsert using knex.raw. Assume that records is an array of objects (one per row we want to update) whose property names line up with the columns in the database table you want to update:
var knex = require('knex'),
  _ = require('underscore');

function bulkUpdate(records) {
  var updateQuery = [
      'INSERT INTO mytable (primaryKeyCol, col2, colN) VALUES',
      _.map(records, () => '(?)').join(','),
      'ON DUPLICATE KEY UPDATE',
      'col2 = VALUES(col2),',
      'colN = VALUES(colN)'
    ].join(' '),
    vals = [];

  _(records).map(record => {
    vals.push(_(record).values());
  });

  return knex.raw(updateQuery, vals);
}
This answer does a great job explaining the runtime relationship between the two approaches.
Edit:
It was requested that I show what records would look like in this example.
var records = [
  { primaryKeyCol: 123, col2: 'foo', colN: 'bar' },
  { /* some other record, same props */ },
];
Please note that if your records have more properties than the ones you specified in the query, you cannot do:
_(records).map(record => {
  vals.push(_(record).values());
});
Because you will hand too many values to the query per record, and knex will fail to match the property values of each record with the ? characters in the query. Instead, you will need to explicitly push the values of each record that you want to insert into an array, like so:
// assume a record has an additional property `type` that you don't want to
// insert into the database
// example: { primaryKeyCol: 123, col2: 'foo', colN: 'bar', type: 'baz' }
_(records).map(record => {
  vals.push(record.primaryKeyCol);
  vals.push(record.col2);
  vals.push(record.colN);
});
There are less repetitive ways of doing the above explicit references; one is sketched below. Hope this helps!
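For instance, a column whitelist keeps the pushes in sync with the INSERT statement (untested sketch, plain JS instead of underscore):
// Columns listed once, in the same order as in the INSERT statement above.
var cols = ['primaryKeyCol', 'col2', 'colN'];
records.forEach(function (record) {
  cols.forEach(function (col) {
    vals.push(record[col]);
  });
});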
The solution works great for me! I just included an ID parameter to make it dynamic across tables with custom ID tags. Chenhai, here's my snippet, including a way to return a single array of ID values for the transaction:
function batchUpdate(table, id, collection) {
  return knex.transaction((trx) => {
    const queries = collection.map(async (tuple) => {
      const [tupleId] = await knex(table)
        .where(`${id}`, tuple[id])
        .update(tuple)
        .transacting(trx)
        .returning(id);
      return tupleId;
    });
    return Promise.all(queries).then(trx.commit).catch(trx.rollback);
  });
}
You can use
response = await batchUpdate("table_name", "custom_table_id", [array of rows to update])
to get the returned array of IDs.
The update can be done in batches, e.g. 1000 rows per batch.
As long as it is done in batches, Bluebird's map can be used.
For more information on Bluebird's map: http://bluebirdjs.com/docs/api/promise.map.html
const { map } = require('bluebird'); // Bluebird's Promise.map

const limit = 1000;
const totalRows = 50000;
// one page number per batch of `limit` rows: [0, 1, 2, ...]
const seq = count => [...Array(Math.ceil(count / limit)).keys()];

const updateTable = async (dbTable, page) => {
  let offset = limit * page;
  return knex(dbTable).pluck('id').limit(limit).offset(offset).then(ids => {
    return knex(dbTable)
      .whereIn('id', ids)
      .update({ date: new Date() })
      .then((rows) => {
        console.log(`${page} - Updated rows of the table ${dbTable} from ${offset} to ${offset + limit}: `, rows);
      })
      .catch((err) => {
        console.log({ err });
      });
  })
  .catch((err) => {
    console.log({ err });
  });
};

map(seq(totalRows), page => updateTable(dbTable, page), { concurrency: 1 });
Here pluck('id') is used to get the ids as a plain array (e.g. [1, 2, 3]), which is the form whereIn expects.
