Postman Retry logic goes in infinite loop - javascript

I am using Postman to test my API.
One of the endpoints takes some time to process its data, so I have added retry logic that tries up to 3 times to check whether the endpoint works fine.
The initial value of retryCount is 3, which is set a few tests before the one where the retry is executed.
Below is the code:
let retryCount = pm.environment.get('retryCount');
let responseData = pm.response.json();
console.log(responseData.data.events.length);
console.log(retryCount);

if (responseData.data.events.length == 0 && retryCount > 0) {
    retryCount = retryCount - 1;
    console.log(retryCount);
    pm.environment.set('retryCount', retryCount);
    postman.setNextRequest("GetEvents");
} else {
    pm.environment.set('data-response', responseData.data);
}
After 3 retries it should stop; however, it goes into an infinite loop. The problem is that the variable is always 3 when the next call happens, when it should decrease by 1 and eventually reach 0.
What could be the reason the above code loops forever?

You might be resetting the value somewhere in your collection scripts or somewhere else in your collection. Create a new variable and try the method below:
// Initialise the counter once per collection run
pm.variables.get("retryCounter") === undefined ? pm.variables.set('retryCounter', 3) : null;

let responseData = pm.response.json();
console.log(responseData.data.events.length);

let retryCount = pm.variables.get("retryCounter");
console.log(retryCount);

if (responseData.data.events.length === 0 && retryCount > 0) {
    retryCount = retryCount - 1;
    console.log(retryCount);
    pm.variables.set('retryCounter', retryCount);
    // this gives the current request's name, so you don't have to hardcode it
    postman.setNextRequest(pm.info.requestName);
} else {
    pm.environment.set('data-response', responseData.data);
}
Try this code. Here we are using pm.variables.set, which creates local variables. The lifetime of a local variable is the collection run, after which it is destroyed, so at the start of every new collection run the value will be undefined.
We set the value to 3 if it is undefined, and then keep sending the request until the value drops below 1.
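If you also want the run to stop explicitly once the retries are used up and the response is still empty, a small variation of the same script could look like this (a sketch, not part of the original answer; postman.setNextRequest(null) ends the collection run):
// Sketch: same retry pattern, but end the whole run when retries are exhausted
if (pm.variables.get("retryCounter") === undefined) {
    pm.variables.set("retryCounter", 3);
}

const responseData = pm.response.json();
let retryCount = pm.variables.get("retryCounter");

if (responseData.data.events.length === 0 && retryCount > 0) {
    pm.variables.set("retryCounter", retryCount - 1);
    postman.setNextRequest(pm.info.requestName);   // run this same request again
} else if (responseData.data.events.length === 0) {
    postman.setNextRequest(null);                  // out of retries: stop the collection run
} else {
    pm.environment.set("data-response", responseData.data);
}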

Related

How to make this loop wait some time before showing next result when trying to loop a construct function?

I'm currently learning constructor functions in JavaScript and wanted to experiment with them, but I've been stuck here for about 5 hours and don't know what to do.
So, I have this (superclass) constructor function to provide the basic messaging functionality and to store who the sender is. Why did I make this superclass? Because there will be more than one subclass that needs the same functionality (sending a message), and later I will add different features to each subclass. I won't put all of my code here because the rest works perfectly.
const receiverLists = ['Ahmad', 'John', 'David'];

class SenderProfile {
    // property
    constructor(sender) {
        this.sender = String(sender);
    }
    // method
    sendMessage(message, receiver) {
        console.log(`${this.sender} sent \`${message}\` to ${receiver}`);
    }
}
/* Let's try
const emailUser = new SenderProfile('adam#test.com');
emailUser.sendMessage('hi', 'john#test.com');
output: adam#test.com sent hi to john#test.com
*/
And this is my subclass, called TelegramSender. My expectation is that this code prints each result every 1 second.
class TelegramSender extends SenderProfile {
    constructor(sender) {
        super(sender);
    }
    // add delay feature
    delayMessage(message) {
        for (let i = 0; i <= receiverLists.length; i++) {
            setTimeout(() => {
                console.log(`${this.sender} on Telegram was send '${message}' to ${receiverLists[i]}`);
            }, 1000)
        }
    }
}
The result is close to my expectation because the output is delayed 1 second before showing up, but the lines still all appear at the same time. I don't want that; I want each result to show up after waiting 1 second,
like this:
first result shows up after 1 sec -- wait 1 sec before the second result shows up
second result shows up, and so on...
Please help, because I've run out of energy to solve this.
I think you are looking for this:
class TelegramSender extends SenderProfile {
    ...
    delayMessage(message) {
        for (let i = 0; i <= receiverLists.length; i++) {
            setTimeout(() => {
                ...
            }, (i + 1) * 1000) // <--------------- increase the timeout
        }
    }
}
Basically what happens is the following:
Your for loop runs through in no time at all. While it does, it schedules all of your setTimeouts "at the same time". They then each wait for the defined 1000ms and execute. It looks like they don't wait, because they were all created at almost the same moment.
What you have to do is define a "waiting" variable and increase the timer by 1000ms on each cycle of your for loop. Like so:
delayMessage(message) {
    // use < rather than <= here so the loop stops at the last receiver
    for (let i = 0; i < receiverLists.length; i++) {
        const waitTime = (i + 1) * 1000;
        setTimeout(() => {
            console.log(`${this.sender} on Telegram was send '${message}' to ${receiverLists[i]}`);
        }, waitTime)
    }
}
What now happens is the following: Your for-loop runs through super fast, and it defines the waitTime as follows:
Cycle 1: i=0 -> waitTime = (0 + 1) * 1000 -> 1000ms
Cycle 2: i=1 -> waitTime = (1 + 1) * 1000 -> 2000ms
Cycle 3: i=2 -> waitTime = (2 + 1) * 1000 -> 3000ms
...
That should be your desired behavior.
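An alternative sketch, not from the original answer, that spaces the messages out with async/await and a small delay helper instead of pre-computed timeouts (it reuses SenderProfile and receiverLists from the question):
// Sketch: same spacing effect with async/await; "delay" is a helper defined here
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

class TelegramSenderAsync extends SenderProfile {
    async delayMessage(message) {
        for (let i = 0; i < receiverLists.length; i++) {
            await delay(1000); // wait 1 second before each message
            console.log(`${this.sender} on Telegram sent '${message}' to ${receiverLists[i]}`);
        }
    }
}

// Usage: new TelegramSenderAsync('adam#test.com').delayMessage('hi');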

Keep clicking Refresh button until data appears

I have one page for uploading a file, which will be processed by the server in the background. I then have a second page that shows only files that have been processed, which can take up to 5 seconds.
At the moment my code does this:
cy.visit('/')
cy.get('.busy-spinner').should('not.exist', { timeout: 10000 })
cy.contains('Submit file').click()
cy.get('[id="requestinputFile"]').attachFile('test-file.txt');
cy.contains('Upload').click()
cy.contains('.notifications', 'Your file has been uploaded', { timeout: 10000 })
cy.wait(5000)
cy.visit('/processed-files')
cy.get('[data-render-row-index="1"] > [data-col-index="1"]').contains(filename)
Sometimes the wait is far too long, sometimes it is not long enough. What I want to do is go to /processed-files immediately and check whether the row with my filename exists.
If it does, then continue. Otherwise:
Pause for 1 second
Click a specific button (to reload the data on the page)
Wait until .busy-spinner does not exist (the data has been reloaded)
Check if the row exists
If it does, then pass; otherwise loop, but for a maximum of 30 seconds.
This pattern will be repeated in many places. What is the best way to achieve this?
Can you just wait on the filename?
cy.contains('[data-render-row-index="1"] > [data-col-index="1"]', filename,
{ timeout: 30_000 }
)
If the reload is needed to get the correct row entry, a repeating function is a possibility
function refreshForData(filename, attempt = 0) {
  if (attempt > 30) {                 // ~30 seconds with the 1s wait below
    throw 'File did not appear'
  }
  // Synchronous jQuery check so a missing row doesn't fail the test
  const found = Cypress.$(`[data-render-row-index="1"] > [data-col-index="1"]:contains('${filename}')`)
  if (found.length === 0) {           // Cypress.$ returns a jQuery object, so test its length
    cy.wait(1_000)
    cy.get('Reload button').click()   // placeholder selector for the reload button
    cy.get('Spinner').should('not.be.visible')
    refreshForData(filename, ++attempt)
  }
}

refreshForData(filename)  // pass in filename, function can be globalized
                          // maybe also pass in selector?
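Following up on that last comment, a generalized version that also takes the selector (a sketch, not part of the original answer) might look like this:
// Sketch: same retry-and-reload idea, with the selector passed in as well
function refreshForText(selector, text, attempt = 0) {
  if (attempt > 30) {                  // ~30 seconds with the 1s wait below
    throw `"${text}" did not appear`
  }
  // Synchronous jQuery check so a missing element doesn't fail the test
  const found = Cypress.$(`${selector}:contains('${text}')`)
  if (found.length === 0) {
    cy.wait(1_000)
    cy.get('Reload button').click()    // placeholder selector, as in the answer above
    cy.get('Spinner').should('not.be.visible')
    refreshForText(selector, text, ++attempt)
  }
}

// Usage
refreshForText('[data-render-row-index="1"] > [data-col-index="1"]', filename)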
You can create a recursive function that checks for the presence of the file name, reloading the page, for a maximum of 30 seconds. Once you find the file name, you exit the function.
let retry = 0

function isElementVisible() {
  if (retry < 15 && Cypress.$(`[data-render-row-index="1"] > [data-col-index="1"]:not(:contains('${filename}'))`).length) {
    // Increment retry
    retry++
    // Wait 2 seconds
    cy.wait(2000)
    // Reload the page by clicking the button
    cy.get('button').click()
    // Check the busy spinner is first visible and then not visible
    cy.get('.busy-spinner').should('be.visible')
    cy.get('.busy-spinner').should('not.be.visible')
    // Call the recursive function again
    isElementVisible()
  } else if (retry < 15 && Cypress.$(`[data-render-row-index="1"] > [data-col-index="1"]:contains('${filename}')`).length) {
    // Row found, do something
    return
  } else {
    // Exceeded the allowed number of retries (a maximum of about 30 seconds of waiting)
    return
  }
}

// Trigger the recursive function
isElementVisible()

How to repeat iteration based on number of retries

I am uploading file chunks to Dropbox and I need to add simple retry logic to my loop, so that if the first attempt fails, it retries another 2 times before giving up.
To give some context: I'm uploading a file in chunks to Dropbox, but I need to allow the script to fail gracefully, repeating the upload up to 3 times before I kill the upload and give the user an error.
For example (not an actual attempt, just a concept):
var retries = 3;
jQuery(dropbox.chunks).each(function (index, chunk) {
    var result = anotherFunction();
    if (result == true) {
        // continue the loop
    }
    if (result == false) {
        retries--;
        if (retries > 0) {
            // Retry this iteration
        }
        if (retries === 0) {
            // Kill the entire loop as this upload clearly is not going to happen.
        }
    }
});
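One way to express that concept is a plain retry counter per chunk inside the loop; the sketch below assumes a synchronous uploadChunk(chunk) helper (a hypothetical name standing in for the real upload call) that returns true on success:
// Sketch only: retry each chunk up to 3 times before aborting the whole upload.
// "uploadChunk" is a hypothetical placeholder for the real Dropbox upload call.
var maxRetries = 3;
var uploadFailed = false;

jQuery(dropbox.chunks).each(function (index, chunk) {
    var succeeded = false;
    for (var attempt = 1; attempt <= maxRetries && !succeeded; attempt++) {
        succeeded = uploadChunk(chunk);   // hypothetical upload helper
    }
    if (!succeeded) {
        uploadFailed = true;   // report an error to the user elsewhere
        return false;          // returning false stops jQuery's .each() loop
    }
});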

Recursive Steam API Call does not terminate

I am calling an API endpoint for one of Steam's games through their web API, using axios and promises in Node.js. Each JSON response from the endpoint returns 100 match objects, of which only about 10 to 40 (on average) are of interest to my use case. Moreover, I have observed that the data tends to repeat if the endpoint is called many times within, say, a split second.
What I am trying to achieve is to collect 100 match_ids (not whole match objects) that fit my criteria in an array, by continuously (recursively) calling the API until I have 100 unique match_ids that serve my purpose.
I am aware that calling the endpoint within a loop is naive and exceeds the limit of 1 request per second set by their web API. This is why I've resorted to recursion, to ensure that each promise is resolved and the array filled with match_ids before proceeding. The issue I am having is that my code does not terminate, and at each stage of the recursive calls the values are the same (e.g. the last match id, the built-up array, etc.).
function makeRequestV2(matchesArray, lastId) {
    // base case
    if (matchesArray.length >= BATCH_SIZE) {
        console.log(matchesArray);
        return;
    }
    steamapi
        .getRawMatches(lastId)
        .then(response => {
            const matches = response.data.result.matches;
            // get the last id of fetched chunk (before filter)
            const lastIdFetched = matches[matches.length - 1].match_id;
            console.log(`The last Id fetched: ${lastIdFetched}`);
            let filteredMatches = matches
                .filter(m => m.lobby_type === 7)
                .map(x => x.match_id);
            // removing potential dups
            matchesArray = [...new Set([...matchesArray, ...filteredMatches])];
            // recursive api call
            makeRequestV2(matchesArray, lastIdFetched);
        })
        .catch(error => {
            console.log(
                "HTTP " + error.response.status + ": " + error.response.statusText
            );
        });
}

makeRequestV2(_matchIds);

// this function lies in a different file where the axios call happens
module.exports = {
    getRawMatches: function (matchIdBefore) {
        console.log("getRawMatches() executing.");
        let getURL = `${url}${config.ENDPOINTS.GetMatchHistory}/v1`;
        let parameters = {
            params: {
                key: `${config.API_KEY}`,
                min_players: `${initialConfig.min_players}`,
                skill: `${initialConfig.skill}`
            }
        };
        if (matchIdBefore) {
            parameters.start_at_match_id = `${matchIdBefore}`;
        }
        console.log(`GET: ${getURL}`);
        return axios.get(getURL, parameters);
    }
}
I'm not exceeding the request limits and all that, but the same results keep coming up.
BATCH_SIZE is 100 and
_matchIds = []
I would start with replacing the line:
matchesArray = [...new Set([...matchesArray, ...filteredMatches])];
with this one:
filteredMatches
    .filter(item => matchesArray.indexOf(item) === -1)
    .forEach(item => {
        matchesArray.push(item)
    })
What you were doing was effectively replacing the matchesArray variable inside your function with a new reference: the variable you passed in as a function parameter from outside was no longer the same variable inside the function. If you use matchesArray.push, you do not change the reference, and the variable in the outer scope is updated correctly, just as you intended.
This is also the reason why _matchIds remains empty: each time makeRequestV2 is called, the inner matchesArray becomes detached from the outer scope (at the assignment statement), and although it gets populated, the outer variable still points to the original array and stays untouched.
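A tiny standalone illustration of that difference:
// Reassigning a parameter rebinds it locally; mutating it updates the caller's array
function reassign(arr) {
    arr = [...arr, 4];   // only the local parameter now points to a new array
}
function mutate(arr) {
    arr.push(4);         // mutates the array the caller passed in
}

const a = [1, 2, 3];
reassign(a);
console.log(a);  // [1, 2, 3]    - caller's array unchanged
mutate(a);
console.log(a);  // [1, 2, 3, 4] - caller's array updated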

Node.js - http request is not working when inside a while loop

I'm using the unirest library to fetch all of the data from an API, which is split up by offset and limit parameters and has no fixed number of results.
I'm using a while condition to iterate through the data, and at the point where no results are returned I end the loop by setting an incomplete variable to false.
But for some reason, when I run the following code nothing happens (no data is added to my database and nothing is output to the console) until I get the 'call_and_retry_last allocation failed' error (which I assume happens when a while loop runs for too long). When I remove the while condition altogether, the code works fine.
Is there a particular reason why this isn't working?
Here's my code:
var limit = 50,
    offset = 0,
    incomplete = true;

while (incomplete) {
    // make api call
    unirest.get("https://www.theapiurl.com")
        .header("Accept", "application/json")
        .send({ "limit": limit, "offset": offset })
        .end(function (result) {
            // parse the json response
            var data = JSON.parse(result.raw_body);
            // if there is data
            if (data.length > 0) {
                // save the api data
                // + increase the offset value for next set of data
                offset += limit;
            } else {
                // if there is no data left, end loop
                incomplete = false;
                console.log("finished!");
            }
        });
}
You can use a recursive function instead, so the next request is only made after the previous one has finished:
function getServerData(offset) {
    // Call your API service here with a callback; if the response
    // contains data, call getServerData again with the new offset.
}

getServerData(0);
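For completeness, a fuller sketch of that idea, reusing the unirest call from the question (saveData is a hypothetical placeholder for whatever writes the results to the database):
// Sketch: fetch one page at a time and only request the next page
// once the current one has arrived, instead of looping with while.
var limit = 50;

function getServerData(offset) {
    unirest.get("https://www.theapiurl.com")
        .header("Accept", "application/json")
        .send({ "limit": limit, "offset": offset })
        .end(function (result) {
            var data = JSON.parse(result.raw_body);
            if (data.length > 0) {
                // saveData(data);              // hypothetical: persist this page
                getServerData(offset + limit);  // fetch the next page
            } else {
                console.log("finished!");
            }
        });
}

getServerData(0);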
