Promises One At A Time? - javascript

So it seems I don't quite understand promises, but I've been using them in the low-code software my company uses for internal tools, as a way to perform the same query on different data a certain number of times.
Anyway, I'm currently using promises with a Mailgun query, and when I try to resolve Promise.all(promises), I assume I'm hitting the API too quickly and too often. So what I would like to do, without refactoring the entirety of my code, is take what I have and resolve those promises one at a time.
let query = Mailgun_MailList_Add_Members;
//let arr = testEmailData.value;
let reps = repInfo.value;
let tableData = table1.selectedRow.data;
let finalResult = [];

for (let i = 0; i < reps.length; i++) {
  let emailArr = [];
  let allRepEmails = [];

  /* function that takes an array and checks inside for subarrays, pushing all subvalues into one new array */
  let getAllRepEmails = (arr) => {
    if (arr instanceof Array) {
      for (let i = 0; i < arr.length; i++) {
        getAllRepEmails(arr[i]);
      }
    }
    else allRepEmails.push(arr);
  }

  for (let j = 0; j < tableData.length; j++) {
    /* check if the current record's owningrep is equal to the current repInfo lastName */
    if (tableData[j].owningrep.toUpperCase() == reps[i].lastName.toUpperCase()) {
      /* take all the emails from the table data at the current index and push them into the array */
      emailArr.push(tableData[j].Emails.replace(/;/g, ",").replace(/:/g, ",").replace(/ +/g, "").replace(/,+/g, ",").split(','));
    }
  }

  /* check inside emailArr for subarrays of emails, pushing emails into the new array */
  getAllRepEmails(emailArr);

  /* filter the array of all emails for the current rep to exclude empty strings */
  let noEmptyEmails = _.filter(allRepEmails, el => el != "");

  /* loop over the final array of emails, creating objects for each rep with arrays of up to 1000 emails per API request, and push them into the final array */
  while (noEmptyEmails.length) {
    finalResult.push({
      owningrep: reps[i].lastName.toUpperCase(),
      /* convert the email array into JSON format as required by the API */
      Emails: JSON.stringify(noEmptyEmails.splice(0, 1000))
    });
  }
}

/* map finalResult to create the promises that perform the query for each record */
let promises = finalResult.map((item) => {
  /* get lastName from repInfo for the address variable */
  let name = _.filter(repInfo.value, obj => obj.lastName == item.owningrep)[0].lastName.toLowerCase();
  /* use the name and the repInfo fromAddress to build the alias for the mail list we are adding members to */
  let address = _.filter(repInfo.value, obj => obj.lastName == item.owningrep)[0].fromAddress.replace(/^[^#]*/, name + "test");
  query.trigger({
    additionalScope: {
      members: finalResult[finalResult.indexOf(item)].Emails,
      alias: address
    }
  })
});

return Promise.all(promises);
I've tried using the different methods on Promise to see what happens, and I've tried splicing the promises array and resolving one at a time. I think the only thing I've learned is that I don't understand promises.
Does anyone have any ideas?

Two things:
Your finalResult.map((item) => { ... }) doesn't seem to return any promise, as TJ explained. I think you meant to do return query.trigger(...). Either way, map runs immediately (and in parallel), so the function you've written doesn't actually wait for anything. It could be that other chained calls to your function are invoked immediately because Promise.all has nothing to actually wait for.
The let promises = seems to be an array of undefined values, so again Promise.all(promises) does nothing for you.
If you want to run one at a time, then remove finalResult.map((item) => and instead use something like a classic for loop with async/await:
for (const item of finalResult) {
  await query.trigger(...)
}
Your function is required to have the async keyword if you want to use await: async function foo() { ... }
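For illustration, here is a minimal sketch of what the sequential version of the whole mapping step might look like, assuming query.trigger returns a promise and you want the results collected (the buildAlias helper is made up here to stand in for the lodash address logic above):
async function runQueriesSequentially(finalResult) {
  const results = [];
  for (const item of finalResult) {
    /* each trigger only starts after the previous one has settled */
    const res = await query.trigger({
      additionalScope: {
        members: item.Emails,
        alias: buildAlias(item) /* hypothetical helper standing in for the address lookup above */
      }
    });
    results.push(res);
  }
  return results;
}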

Related

How do I prevent duplicate API Calls in JavaScript FOR Loop

I have an array of URLs I scraped from a webpage, and I make an API call to validate each URL to see if it is malicious. The only problem is that I am limited to a certain number of API calls per day and the array contains duplicate URLs. I am trying to loop through the array and save an API call for duplicate values, and I am struggling to find the best way to do it since there could be multiple duplicates. If the loop encounters a duplicate value, I want it to skip the API call and just return the values already saved from the previous API call. I included some basic pseudocode inside the code below and I am unsure of what to populate the pseudocode with.
/* urlToValidate is a list of URLs */
urlToValidate = ["ups.com", "redfin.com", "ups.com", "redfin.com", "redfin.com", "redfin.com"];
var isValid = false;
/* API Overview https://www.ipqualityscore.com/documentation/malicious-url-scanner-api/overview */
for (let i = 0; i < urlToValidate.length; i++) {
  if (i == 0 || Is Not A DUPLICATE) {
    $.getJSON('https://ipqualityscore.com/api/json/url/<API_KEY>/' + urlToValidate[i], function( json ) {
      if (!json.phishing && !json.spamming && json.risk_score < 80) {
        isValid = true;
        returnMessage(isValid, json.risk_score, i)
      } else {
        isValid = false;
        returnMessage(isValid, json.risk_score, i)
      }
    });
  } else {
    returnMessage(alreadySaved duplicateValue, alreadySaved duplicate risk_score, i)
  }
}
Desired Output:
URL Valid: true Risk Rating: 0 Position: 7
or
Duplicate URL: true Risk Rating: 0 Position: 7
This is a simple matter of caching.
Outside of your for loop, maintain some kind of mapping of URLs to their corresponding fetch results. That way, you can store not only whether that URL has been called but also the result, if it exists. An easy way to do that is with a basic object, where the "keys" are strings corresponding to the URLs, and the "values" are the results of your fetches.
const resultCache = {};
Inside of your loop, before you do a fetch you should first check whether the cache already has a result for that URL.
let result;
if (resultCache[urlToFetch]) {
  // use the previously saved result
  result = resultCache[urlToFetch];
} else {
  result = await fetch(/* whatever */);
  // remember to also store the result in the cache
  resultCache[urlToFetch] = result;
}
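Put together, a hedged sketch of how this caching pattern might look applied to the question's loop (assuming the returnMessage helper and API URL from the question, and using fetch in place of $.getJSON):
const resultCache = {};
async function validateAll(urlToValidate) {
  for (let i = 0; i < urlToValidate.length; i++) {
    const url = urlToValidate[i];
    let json = resultCache[url]; // cache hit: no API call spent on a duplicate
    if (!json) {
      const res = await fetch('https://ipqualityscore.com/api/json/url/<API_KEY>/' + url);
      json = await res.json();
      resultCache[url] = json; // cache miss: save the result for later duplicates
    }
    const isValid = !json.phishing && !json.spamming && json.risk_score < 80;
    returnMessage(isValid, json.risk_score, i);
  }
}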
You have a few options.
First, you could convert your URLs to a Set, which prevents any duplicates from occurring at all:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set
Another option would be to store the result in an object with the key being the URL, and in your if statement check whether a value for that key already exists.
*** UPDATE using a Set ***
Note that a Set has no .length property or index access, so spread it back into an array after deduplicating:
/* urlToValidate is a list of URLs */
urlToValidate = ["ups.com", "redfin.com", "ups.com", "redfin.com", "redfin.com", "redfin.com"];
var urls = [...new Set(urlToValidate)]; /* deduplicated array */
var isValid = false;
/* API Overview https://www.ipqualityscore.com/documentation/malicious-url-scanner-api/overview */
for (let i = 0; i < urls.length; i++) {
  $.getJSON('https://ipqualityscore.com/api/json/url/<API_KEY>/' + urls[i], function( json ) {
    if (!json.phishing && !json.spamming && json.risk_score < 80) {
      isValid = true;
      returnMessage(isValid, json.risk_score, i)
    } else {
      isValid = false;
      returnMessage(isValid, json.risk_score, i)
    }
  });
}

Perform some action on each element of an array but in small portions using JavaScript

I'm trying to understand how to perform some action on each element of an array, but by working in portions of that array, until each element has been touched.
As a more specific example, let's assume I have an array of 990 elements and want to perform some action on each element, but in portions of 200. What would be the most efficient way to do this?
function foo(array) {
  results = []
  if (array.length > 200) {
    // Loop over and perform action on first 200 elements, then next 200, and so on...
    // for each element, push result to results array
  }
  return results;
}
EDIT:
For my specific use case, each element in the array is a URL. I'm making a GET request with each URL using Axios. There is potential for my array to contain thousands of URLs, so I don't want to make a request and wait for a response one at a time; however, the server I'm making the requests to can only handle so many requests at one time (about 200).
There are lots of ways to do this, some better than others. I will assume you don't want to modify the original array and want to handle 200 elements at a time on different occasions:
function stepArray(arr) {
  /* create a custom index on the array object so it knows where to continue from */
  if (typeof arr.myIndex == 'undefined') { arr.myIndex = 0; }
  for (var k = 0; k < 200; k++) {
    if (k + arr.myIndex >= arr.length) { return; }
    process(arr[k + arr.myIndex]); /* process() is whatever per-element action you need */
  }
  arr.myIndex += 200; /* advance the index so the next call continues where this one stopped */
}
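A quick usage sketch, assuming process just logs each element:
const data = Array.from({ length: 990 }, (_, i) => i);
function process(el) { console.log(el); }
stepArray(data); // elements 0-199
stepArray(data); // elements 200-399, and so on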
To make multiple chunks you can use reduce, like this:
var perChunk = 200 // chunk size
var inputArray = [] // your array
const result = inputArray.reduce((resultArray, item, index) => {
  const chunkIndex = Math.floor(index / perChunk)
  if (!resultArray[chunkIndex]) {
    resultArray[chunkIndex] = [] // start a new chunk
  }
  resultArray[chunkIndex].push(item)
  return resultArray
}, [])
console.log(result);
Then you can iterate over the single chunks and make your axios requests:
for (let i = 0; i < result.length; i++) {
  let delay = 3000 * i
  setTimeout(() => {
    console.log(/* your action with the chunk result[i] */)
  }, delay)
}
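Alternatively, instead of guessing at delays, you could await each chunk of requests before starting the next. A sketch assuming axios and the result chunk array from above, where each chunk holds URLs:
async function fetchInChunks(chunks) {
  const responses = [];
  for (const chunk of chunks) {
    /* at most perChunk requests in flight at once; wait for all of them before the next chunk */
    responses.push(...await Promise.all(chunk.map(url => axios.get(url))));
  }
  return responses;
}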
While it may not be the most efficient solution per my initial request, I found the following easy to understand:
while (array.length != 0) {
  array.splice(0, 200).forEach(function (url) {
    // Perform some action
  });
}

How to access and update the value of a map after creating it?

This might be a basic question, but I was struggling a little. The code traverses an entire string passed to it and maps the occurrences of each letter, for instance {P: 28, A: 15, ...}. What I wanted to know is: if I want to access a value (28 in the case of P) and perform some action on it, such as 28 / (length of message) * 100, how can I do that and replace the value in the map? I would really appreciate it if someone could share. Thanks.
P.S.: I tried updating at every iteration, but I need to update only once the for loop ends to get the correct percentage of occurrence.
const frequency = (message) => {
  const map = new Map();
  message = message.toUpperCase().replace(/\s/g, '');
  for (let i = 0; i < message.length; i++) {
    let letter = message.charAt(i);
    if (map.has(letter)) {
      map.set(letter, map.get(letter) + 1);
    }
    else {
      map.set(letter, 1); /* first occurrence of this letter */
    }
  }
  // const sortedMap = new Map([...map.entries()].sort());
  return map;
}
You do it the same way you did in your function: call set. You can use for...of to iterate over all the entries in the map.
for (const [letter, count] of map) {
  map.set(letter, count / message.length * 100);
}
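Put together with the function above, a sketch of converting the counts to percentages in a second pass once counting is complete (assuming the whitespace-stripped length is the denominator you want):
const frequencyPercent = (message) => {
  message = message.toUpperCase().replace(/\s/g, '');
  const map = new Map();
  for (const letter of message) {
    map.set(letter, (map.get(letter) || 0) + 1);
  }
  /* second pass, only after the counting loop has finished */
  for (const [letter, count] of map) {
    map.set(letter, (count / message.length) * 100);
  }
  return map;
}
console.log(frequencyPercent('Hello world')); // Map { H: 10, E: 10, L: 30, O: 20, W: 10, R: 10, D: 10 }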
This is where the get function of the map will help you. If you already know the key (e.g. 'P'), you can do this:
let valueToChange = map.get('P')
valueToChange += 1
map.set('P', valueToChange)
If you do not know the keys, map.keys() will return the list of keys for you.

How to perform fast search on JSON file?

I have a JSON file that contains many objects and options.
Each is of this kind:
{"item": "name", "itemId": 78, "data": "Some data", ..., "option": number or string}
There are about 10,000 objects in the file.
When part of an item value ("ame", "nam", "na", etc.) is entered, it should display all the objects and their options that match that part.
RegExp is the only thing that comes to mind, but on a 200 MB+ file the search takes a long time (2+ seconds).
That's how I'm getting the objects right now:
let reg = new RegExp(enteredName, 'gi'), // enteredName, for example "nam"
    data = await fetch("myFile.json"),
    jsonData = await data.json();
let results = jsonData.filter(jsonObj => {
  let item = jsonObj.item,
      itemId = String(jsonObj.itemId);
  return reg.test(item) || reg.test(itemId);
});
But that option is too slow for me. What would be a faster way to perform such a search in JS?
Looking up items by item number should be easy enough by creating a hash table, which others have already suggested. The big problem here is searching for items by name. You could burn a ton of RAM by creating a tree, but I'm going to go out on a limb and guess that you're not necessarily looking for raw lookup speed. Instead, I'm assuming that you just want something that'll update a list on-the-fly as you type, without actually interrupting your typing, is that correct?
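(As a quick aside, a sketch of that id lookup, assuming the file has already been parsed into a jsonData array: build the table once, and every lookup afterwards is O(1) with no scanning.)
const byId = new Map();
for (const obj of jsonData) byId.set(obj.itemId, obj);
byId.get(78); // -> the matching object, if any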
To that end, what you need is a search function that won't lock-up the main thread, allowing the DOM to be updated between returned results. Interval timers are one way to tackle this, as they can be set up to iterate through large, time-consuming volumes of data while allowing for other functions (such as DOM updates) to be executed between each iteration.
I've created a Fiddle that does just that:
// Create a big array containing items with names generated randomly for testing purposes
let jsonData = [];
for (let i = 0; i < 10000; i++) {
  jsonData.push({ item: Math.random().toString(36).substring(2, 15) + Math.random().toString(36).substring(2, 15) });
}
// Now on to the actual search part
let returnLimit = 1000; // Maximum number of results to return
let intervalItr = null; // A handle used for iterating through the array with an interval timer
function nameInput (e) {
  document.getElementById('output').innerHTML = '';
  if (intervalItr) clearInterval(intervalItr); // If we were iterating through a previous search, stop it.
  if (e.value.length > 0) search(e.value);
}
let reg, idx;
function search (enteredName) {
  reg = new RegExp(enteredName, 'i');
  idx = 0;
  // Kick off the search by creating an interval that'll call searchNext() with a 0ms delay.
  // This will prevent the search function from locking the main thread while it's working,
  // allowing the DOM to be updated as you type
  intervalItr = setInterval(searchNext, 0);
}
function searchNext() {
  if (idx >= jsonData.length || idx > returnLimit) {
    clearInterval(intervalItr);
    return;
  }
  let item = jsonData[idx].item;
  if (reg.test(item)) document.getElementById('output').innerHTML += '<br>' + item;
  idx++;
}
https://jsfiddle.net/FlimFlamboyant/we4r36tp/26/
Note that this could also be handled with a WebWorker, but I'm not sure it's strictly necessary.
Additionally, this could be further optimized by utilizing a secondary array that is filled as the search takes place. When you enter an additional character and a new search is started, the new search could begin with this secondary array, switching to the original if it runs out of data.
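If you did want to go the worker route, here is a minimal hedged sketch using an inline Blob worker, assuming the same jsonData array as above; the filtering then happens entirely off the main thread:
const workerSrc = `onmessage = (e) => {
  const reg = new RegExp(e.data.pattern, 'i');
  postMessage(e.data.items.filter(o => reg.test(o.item)));
};`;
const worker = new Worker(URL.createObjectURL(new Blob([workerSrc], { type: 'application/javascript' })));
worker.onmessage = (e) => console.log(e.data.length + ' matches'); // update the DOM here instead
worker.postMessage({ items: jsonData, pattern: 'nam' });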

node.js: expensive request

Hi, everyone! I need some help with my first app:
I'm creating an application with express + node.js on the back end. There is no database; I'm using a 3rd-party solution with some functions that do the calculations instead.
Front
50 objects. Every object has one unique value: a random number. At the start I have all these objects; I need to calculate some values for every object and position it on the form based on the calculated results.
Each object sends axios.get('/calculations?value=uniqueValue') and I accumulate the results in an array. When array.length equals 50, I compare the array elements to each other and define the (x, y) coordinates of each object. After that, the objects appear on the form.
Back
let value = uniqueValue; // a unique value received from an object
let requests = [];
for (let i = 0; i < 1500; i++) { // this loop is necessary due to the application's design
  requests.push(calculateData(value)); // 3rd-party function
  value += 1250;
}
let result = await Promise.all(requests);
let newData = transform(result); // here I transform the calculated result and then return it
return newData
Calculations for one object take about 700 ms; all calculations for all 50 objects take ≈10 seconds. The 3rd-party function receives only one value at a time but works very quickly. It's the loop for (let i = 0; i < 1500; i++) {…} that is very expensive.
Issues
10 seconds is not a good result; the user can't wait that long. Maybe I should change my approach to the calculations?
The server is very busy while calculating, and other requests (e.g. axios.get('/getSomething?params=something')) are left pending.
Any advice will be much appreciated!
You can make the calls in chunks of data using async.eachLimit:
var values = [];
for (let i = 0; i < 1500; i++) { // this loop is necessary due to the application's design
  values.push(value);
  value += 1250;
}
var arrayOfItemArrays = _.chunk(values, 50);
var results = [];
async.eachLimit(arrayOfItemArrays, 5, eachUpdate, function (err) {
  if (err) throw err;
  let newData = transform(results); // transform once every chunk has finished
  // use newData here
});
function eachUpdate(reqArr, cb) {
  /* run one chunk of 50 calculations in parallel, then signal that the chunk is done */
  Promise.all(reqArr.map(item => calculateData(item)))
    .then(chunkResults => {
      results.push(...chunkResults);
      cb();
    })
    .catch(cb);
}
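If you'd rather not pull in the async library, here is a hedged sketch of the same chunking idea with plain promises (assuming the calculateData and transform functions from the question):
async function calculateInChunks(uniqueValue, chunkSize = 50) {
  const values = [];
  let value = uniqueValue;
  for (let i = 0; i < 1500; i++) {
    values.push(value);
    value += 1250;
  }
  const results = [];
  for (let i = 0; i < values.length; i += chunkSize) {
    /* only chunkSize calculations in flight at once, easing the load on the server */
    const chunk = values.slice(i, i + chunkSize);
    results.push(...await Promise.all(chunk.map(v => calculateData(v))));
  }
  return transform(results);
}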
