Potential infinite loop: exceeded 10001 iterations - javascript

I have part of my project here in a sandbox where I've used cheerio, https://cors-anywhere.herokuapp.com/, axios, and manually built functions to check for pagination, navigate to the next page, scrape the data, and finally push the scraped data into an array of objects.
The code somewhat works, but it gets stuck in an infinite loop while pushing the objects to the array.
It successfully scrapes and pushes the data from the first page, but gets stuck in an infinite loop on the other pages of the pagination. Additionally, the code also gets stuck in an infinite loop if a URL with no pagination is given.
Can anyone help me find my mistake? I've been trying to resolve this error for about a week.
Type Teacher for a URL with no pagination and manager for a URL with pagination.
Files to look at: src/store/modules/Site.js and src/store/modules/Helpers.js

The bug appears to be in makeObject(), shown below:
const makeObject = (jobs, img, org) => {
  let mjobs = [];
  for (let i = 0; i < jobs.length; i++) {
    jobs.push({ // FIXME: Pushing into `jobs`, which is being iterated
      title: jobs[i],
      img: "https://merojob.com" + img[i],
      org: org[i]
    });
  }
  return mjobs;
}
The for-loop iterates over jobs while pushing new objects into it. The termination condition checks that i is less than jobs.length, but jobs.length grows on every iteration, so the loop never exits.
I think you meant to push into mjobs inside the for-loop, as in the sketch below.
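For reference, a minimal corrected version (same signature; only the push target changes):

const makeObject = (jobs, img, org) => {
  let mjobs = [];
  for (let i = 0; i < jobs.length; i++) {
    mjobs.push({ // push into the result array, not the array being iterated
      title: jobs[i],
      img: "https://merojob.com" + img[i],
      org: org[i]
    });
  }
  return mjobs;
}

Since jobs.length no longer grows, the loop terminates after one pass over the scraped data.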

Related

Fetch remaining data on last page in pagination

I'm making simple pagination for the comments on the project I'm working on, but I have an issue where I can keep requesting more comments and end up with a blank page.
My API URL for fetching comments is: {{URL}}/responses?id={{id}}&skip={{skip}}&take=10
nextComments = () => {
  if (this.state.skip <= this.state.responses.total) {
    this.setState((prevState) => ({
      skip: prevState.skip + 10
    }), async () => {
      const responsesbyId = await getResponsesbyOrgId(this.state.orgId, this.state.skip);
      this.setState({
        responses: responsesbyId
      });
      console.log(this.state.responses);
    });
  }
};
I've tried setting a max, but then another issue is that when there are, say, 16 comments, I can skip 10, then 10 more, and end up with a blank page again.
Is there a smarter way to deal with this, so that I only skip 6 when there are fewer than 10 comments left?
I hope my question is clear.
Based on the question, I'm assuming you have control over your backend, so ideally this is something your API should make the front end aware of. The API should return the total number of results for a query, as well as the number of results per page and which page you're currently viewing (starting at 1). The front end can then look at those values and dynamically control the pagination logic.
So for example, API response could be something like:
const res = {
  results: [...],
  resultCount: 16,
  page: 1,
  resultsPerPage: 10
};
(You'd be storing the current page in front-end state, of course; I just added it to the backend response since it usually doesn't hurt to return the request params.)
Then in your front end, where you're storing the current page value, the logic could be
if ((currentPage * resultsPerPage) < resultCount) { /* You can let them fetch another page */}
This would satisfy your requirements of not letting them view more pages if they shouldn't be able to, and also lets the results per page variable change in the backend without the front end having to do a thing.
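Applied to your nextComments handler, that check could look roughly like this (just a sketch, assuming the response shape suggested above is stored in this.state.responses; getResponsesbyOrgId and the state fields come from your snippet):

nextComments = () => {
  const { page, resultsPerPage, resultCount } = this.state.responses;
  // Only fetch another page while there are results left to show
  if (page * resultsPerPage < resultCount) {
    this.setState((prevState) => ({
      skip: prevState.skip + resultsPerPage
    }), async () => {
      const responsesbyId = await getResponsesbyOrgId(this.state.orgId, this.state.skip);
      this.setState({ responses: responsesbyId });
    });
  }
};

This also removes the need to special-case the last, partially filled page: once the final 6 of 16 comments are shown, page * resultsPerPage already meets or exceeds resultCount and no further request is made.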

Trying to display an array part after part

I'm currently displaying the contents of an array in a view, let's say $scope.array.
I load the contents of the array with a request to my server.
Sadly, $scope.array contains a lot of elements, and displaying every element at once in the view takes a while.
In order to improve the user experience, I'd like to display the array part by part. At first I thought $scope would be able to handle it if I just added the data chunk by chunk to $scope.array, but no.
I figured out that the current $digest loop would only finish once my array was full. I tried using the async library to add chunks asynchronously to $scope, hoping to dodge the $digest issue, but it doesn't work.
Now I've kind of run out of ideas for displaying the data properly, so if you have any experience with this kind of issue, I'd be glad to hear about it!
Thanks a lot.
If pagination, periodic requests, etc. are ruled out, then...
You can keep all of your data in one array that is not bound to the UI.
You then periodically add the data into a second array that is bound to the UI, similar to how you mentioned you are already doing it, but add the chunks in a $timeout or an $interval. This lets the $digest cycles and page renders complete.
Simple Example:
<table>
  <tr ng-repeat="item in shownItems track by $index">
    <td>{{item}}</td>
  </tr>
</table>
and in controller
app.controller('MainCtrl', ['$scope', '$timeout', function($scope, $timeout) {
  // Array that is bound to the UI
  $scope.shownItems = [];
  // Backing data
  var fullItemsList = [];
  // Create some fake data. You wouldn't do this in your program;
  // this is where you would fetch from the server.
  for (var ii = 0; ii < 50000; ii++) {
    fullItemsList.push("AYYYLMAO " + ii);
  }
  // How many items to add at a time
  var chunkSize = 500;
  // Keep track of how many chunks we have already added
  var currentChunkIndex = 0;
  // Transfers a chunk of items to the UI-bound array
  function addMoreItems() {
    var start = currentChunkIndex * chunkSize;
    var end = (currentChunkIndex + 1) * chunkSize;
    for (var ii = start; ii < end; ii++) {
      if (!fullItemsList[ii]) {
        break;
      }
      $scope.shownItems.push(fullItemsList[ii]);
    }
    currentChunkIndex++;
  }
  // Transfer a chunk of items in a timeout, and trigger another timeout to
  // add more if there are still items to transfer
  function periodicAdd() {
    $timeout(function() {
      addMoreItems();
      if (currentChunkIndex * chunkSize < fullItemsList.length) {
        periodicAdd();
      }
    }, 0);
  }
  // Add the first chunk straight away, otherwise it would wait for the
  // first timeout.
  addMoreItems();
  // Start the periodic add.
  periodicAdd();
}]);
Keep in mind that this example is very rough. For instance, the initial "load" of 50k rows runs on the UI thread, whereas your data load will presumably come asynchronously from your server. This also means you should kick off the periodic adding only when your request completes.
Note that I use a 0 millisecond timeout. This still pushes the callback to the end of the current processing queue; it won't execute straight away despite being 0 milliseconds. You might want to increase it a little to give your app some breathing room. Remember to properly dispose of timeouts.
Use server-side pagination. Even one-time bindings are not always the solution, especially if there is complex template logic (showing/hiding parts depending on the data properties) and if editing is required.
If you want to filter your array by some criteria (for example, month and year), implement this on the backend and pass your criteria to the server: GET /my_data?year=2017&month=7&page=1.
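On the AngularJS side, the paged request could look roughly like this (a minimal sketch only; the service name, endpoint, and parameters are illustrative, not part of the original answer):

app.factory('MyDataService', ['$http', function($http) {
  return {
    // Ask the server for one page of pre-filtered data
    getPage: function(year, month, page) {
      return $http.get('/my_data', { params: { year: year, month: month, page: page } })
        .then(function(response) { return response.data; });
    }
  };
}]);

The view then only ever binds the current page of results, which keeps the $digest cycles cheap.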

x-ray for grabbing html info from many urls with "for" loop are letting objs undefined

I'm trying to grab info from many URLs automatically. I have an array of addresses called arrayDep and a "for" loop that goes over the array and visits the web sites. Then I use x-ray to grab the info I want. At the moment I'm using console.log to see it, but later I will add it to my database. The problem is that I'm receiving undefined objects at random times and sometimes server-busy messages. I think it has something to do with the timing of when I run x-ray, so I tried to add a timeout, sadly without success :(
Code:
for (var i = 0; i < arrayDep.length; i++) {
  x.timeout(4000);
  x('http://www.camara.leg.br/internet/deputado/' + arrayDep[i], {
    title: 'a'
  })(function(err, obj) {
    console.log(obj.title);
  })
};
You can also use the async library to control concurrency, for example using the eachLimit function to do no more than 5 requests at a time:
async.eachLimit(arrayDep, 5, function(item, done) {
  x('http://www.camara.leg.br/internet/deputado/' + item, {
    title: 'a'
  })(function(err, obj) {
    if (err) { return done(err); }
    console.log(obj.title);
    done();
  });
}, function(err) {
  if (err) { console.error(err); }
});
You can use a promises library. I would suggest you use Kriskowal's Q promises.
Here's the link to the GitHub repo: https://github.com/kriskowal/q
There are tonnes of tutorials on the web regarding Q integrations. Kriskowal also has a one-hour-long video on YouTube where he explains Q and its uses.
I hope this helps.
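For illustration, a rough sketch of wrapping the x-ray call in a Q promise and collecting all results with Q.all (assuming the same arrayDep and selector as in the question):

var Q = require('q');
var Xray = require('x-ray');
var x = Xray();

// Wrap a single x-ray scrape in a promise
function scrape(dep) {
  var deferred = Q.defer();
  x('http://www.camara.leg.br/internet/deputado/' + dep, { title: 'a' })(function(err, obj) {
    if (err) {
      deferred.reject(err);
    } else {
      deferred.resolve(obj);
    }
  });
  return deferred.promise;
}

// Fire off all requests and wait for every one of them
Q.all(arrayDep.map(scrape))
  .then(function(results) {
    results.forEach(function(obj) { console.log(obj.title); });
  })
  .catch(function(err) { console.error(err); });

Note that this launches all requests at once; if the server still complains about load, combine it with the eachLimit approach above to cap concurrency.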

paginating through a CRUD API

I am writing a client that will query a CRUD web API. I will be using socket.io.get('/api'). The problem is that I want to paginate the results, so I can start displaying stuff while my client is still receiving the data.
The results from the API come as JSON, like
[
  {
    "id": "216754",
    "date": "2015-07-30T02:00:00.000Z"
  },
  {
    "id": "216755",
    "date": "2015-08-30T02:00:00.000Z"
  }
]
The API lets me construct a URL query where I can limit the size of each result array. So I can make a query like /api&skip=10&limit=10, and it will get me the results from item 10 to item 19. What I want is to keep looping and receiving results until the results array has fewer than 10 items (which means we've reached the end of the dataset). And I need that to be asynchronous, so I can start working on the data right from the start and update whatever work I've done each time a new page is received.
Is it an infinite scroll that you are trying to do, or do you want to call all the pages asynchronously and possibly receive page 3 before page 2? Reading the question, I understand it is the second.
You can't rely on "until the results array has fewer than 10 items" since you want to launch all the calls at the same time.
You should do a first query to retrieve the number of records. Then you will know how many pages there are, and you can generate all the URLs you need and call them asynchronously.
It could look like this (code not tested):
var nbItemsPerPage = 10;
socket.io.get(
  '/api/count', // <= You have to code the controller that returns the count
  function(resCount) {
    var nbPages = Math.ceil(resCount / nbItemsPerPage);
    for (var i = 0; i < nbPages; i++) {
      // JavaScript will loop without waiting for the responses
      (function(pageNum) {
        socket.io.get(
          '/api',
          { skip: nbItemsPerPage * pageNum, limit: nbItemsPerPage },
          function(resGet) {
            console.log('Result of page ' + pageNum + ' received');
            console.log(resGet);
          }
        );
      })(i); // <= immediately-invoked function, passing "i" as an argument
      // Indeed, by the time the callback runs, the value of "i" will have changed.
      // The immediately-invoked function creates a new scope that stores pageNum for each iteration.
    }
  }
)
If what you are trying to achieve is an infinite scroll page, then you have to load page n+1 only after you have received the content of page n, and in that case you can rely on results.length < 10.
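A minimal sketch of that sequential variant, assuming the same socket.io.get(url, params, callback) call used in the code above:

var nbItemsPerPage = 10;

function loadNextPage(skip, onPage) {
  socket.io.get(
    '/api',
    { skip: skip, limit: nbItemsPerPage },
    function(results) {
      onPage(results); // work on the data as soon as each page arrives
      if (results.length === nbItemsPerPage) {
        // A full page means more items may remain: fetch the next one
        loadNextPage(skip + nbItemsPerPage, onPage);
      }
    }
  );
}

loadNextPage(0, function(page) {
  console.log('Received ' + page.length + ' items');
});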

SQLite DELETE transaction not working when grabbing value from Array

This is driving me crazy.
I have a message queue that uses a local web database to store messages when the device is offline; when there is an internet connection, it sends out these messages. After it sends the messages, I want them deleted from the table.
When I send out the messages, I keep an array called MessageIDs so that I can reference which rows need to be deleted.
I loop over MessageIDs, grab each ID, and run a DELETE transaction. My alert() gets the right value, but when the transaction executes, the value is undefined. I tried hard-coding a known ID into the transaction and it worked. Any thoughts?
var MessageIDs = new Array();
// In the block of code not shown I populate MessageIDs and send out messages
for (var j = 0; j < MessageIDs.length; j++) {
  alert(MessageIDs[j]); // Pulls the right value
  site.data.database.transaction(
    function (transaction) {
      // [MessageIDs[j]] has a value of undefined and thus doesn't get deleted, but the transaction doesn't technically fail either
      transaction.executeSql("DELETE FROM Messages WHERE id = ?;", [MessageIDs[j]],
        site.contact.removeQueuedMessagesSuccess, site.contact.removeQueuedMessagesError);
    }
  );
}
Sorry, I can't put this as a comment, so here it is as an answer, extending DCoder's answer: you can also put the loop inside the transaction, and then it will work too. His solution is cleaner, though.
Edit: Maybe I should give a reason why this addition matters: obviously, you can't always combine the queries in such a way. So before you start nesting transactions, just put the loop of queries inside one transaction, as sketched below.
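A minimal sketch of that approach, reusing the objects from the question (the loop runs synchronously inside the transaction callback, so MessageIDs[j] is read while j still holds the right index):

site.data.database.transaction(function (transaction) {
  for (var j = 0; j < MessageIDs.length; j++) {
    transaction.executeSql(
      "DELETE FROM Messages WHERE id = ?;",
      [MessageIDs[j]], // evaluated immediately here, so the stale-closure problem goes away
      site.contact.removeQueuedMessagesSuccess,
      site.contact.removeQueuedMessagesError
    );
  }
});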
