I have cleaned it up a bit, but in my Postman collection I run a complex test that needs to retrieve data 100,000 times from an endpoint and then run some other tests. The API is hosted in the cloud and replies super fast, but the issue is on the Postman side...
My code works fine if I run about 200 tests, but with 100,000 Postman crashes.
I am pretty sure the issue is that all the HTTP requests are being sent asynchronously and the application cannot handle that many.
So I tried to add a delay between the requests using setTimeout, but it does not wait at all; it just spams HTTP requests.
This is Postman, so I don't think I can import other external libraries...
let counter = 0;
const counterCompleted = 100000;

// My loop for 100000 elements
pm.test(`my test should reply OK`, (done) => {
  setTimeout(() => {
    const payload = {
      // data
    };
    pm.sendRequest({
      url: pm.request.url,
      method: 'POST',
      header: {
        'Content-Type': 'application/json'
      },
      body: {
        mode: 'raw',
        raw: JSON.stringify(payload),
      },
    },
    (err, response) => {
      pm.expect(err).to.be.null;
      pm.expect(response).to.have.status(200);
      const data = response.json()[0];
      // storing my data
      counter++;
      if (counter === counterCompleted) {
        // When all requests are completed, do more tests
      }
      done();
    });
  }, 1000);
});
What can I do to slow down the requests?
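One direction, sketched below and not tested against this exact collection (the sendNext name and the 50 ms delay are only illustrative), is to serialize the requests so each one is fired from the previous one's callback instead of scheduling 100,000 timers up front:

const total = 100000;  // same as counterCompleted above
let sent = 0;

function sendNext() {
  if (sent === total) {
    // all requests completed: run the follow-up tests here
    return;
  }
  const payload = {
    // data
  };
  pm.sendRequest({
    url: pm.request.url,
    method: 'POST',
    header: { 'Content-Type': 'application/json' },
    body: { mode: 'raw', raw: JSON.stringify(payload) },
  }, (err, response) => {
    pm.expect(err).to.be.null;
    pm.expect(response).to.have.status(200);
    // store the data from response.json() here
    sent++;
    setTimeout(sendNext, 50);  // small pause before the next request
  });
}

sendNext();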
I would like to use Google Cloud Tasks in my Firebase Cloud Function to call another function and return the work. Is this something that can be done with Google Cloud Tasks?
I am trying to do something similar to a Redis queue, where once the work is done I can get the results and continue with my app.
Here I am trying to show what I want to do with a simple addition function. In reality, I would like to call another HTTP endpoint and use my queue to rate-limit requests. For now, this example gets the point across; however, I can't seem to send the data back.
exports.addNumbersFunc = functions.https.onRequest(async (data, res) => {
  const payload = data.body;
  console.log("Received: ", payload);
  // Work done
  const answer = payload["numbers"]["a"] + payload["numbers"]["b"];
  // Can I send back the answer here?
  res.json({ "answer": answer });
});

exports.exampleFunction1 = functions.https.onCall(async (data) => {
  // create json object to send to server with data
  const jsonPayload = JSON.stringify({
    "numbers": { "a": 1, "b": 2 }
  });
  const url =
    `https://${location}-${project}.cloudfunctions.net/addNumbersFunc`;
  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: url,
      headers: {
        'Content-Type': 'application/json'
      },
      body: Buffer.from(jsonPayload).toString('base64')
    },
    oidcToken: {
      serviceAccountEmail
    }
  };
  const request = {
    parent: parent,
    task: task,
  };
  // Send create task request.
  console.log('Sending task:');
  const [response] = await client.createTask(request);
  console.log(`Response: ${response}`);
  console.log(`Created task ${response.name}`);
  // Would the response be here??
  console.log(`Response httpRequest: ${response.httpRequest.body}`);
  return response;
});
Cloud Tasks are async. You can't get the HTTP response of the task. –
guillaume blaquiere
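Since the createTask response only contains the task's metadata (its name, schedule time, and the request that was enqueued), one common pattern is to have the target function persist its result somewhere the app can read it, for example Firestore, instead of trying to return it through the queue. A rough sketch under that assumption (firebase-admin already initialised; the taskResults collection and the resultId payload field are made up for illustration):

// Sketch only: the worker writes its answer to Firestore keyed by an id that the
// caller put into the task payload, so the client can look up the result later.
const admin = require('firebase-admin');

exports.addNumbersFunc = functions.https.onRequest(async (req, res) => {
  const payload = req.body;
  const answer = payload.numbers.a + payload.numbers.b;

  // `resultId` is an illustrative field the caller would add to the task payload.
  await admin.firestore()
    .collection('taskResults')          // hypothetical collection name
    .doc(payload.resultId)
    .set({ answer, finishedAt: admin.firestore.FieldValue.serverTimestamp() });

  res.status(200).end();                // tells Cloud Tasks the task succeeded
});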
I am attempting to use a streaming strategy to send data in chunks to the browser. However, when the data is read, it is not delivered in chunks by the code written to stream the results. It reads and sends the first batch and then prints a message that there are some more items left. Why isn't the rest of the data streamed? I thought that was how Observables work: reading the data in chunks in the next callback. Here is how the results are displayed, with the "... more items" shown below:
[
...,
{
productCode: 1829222,
productName: 'Twizzlers'
} ,
... 141 more items
]
Here is the code that tries to stream the data:
// rxjs v6-style import paths
import { of } from 'rxjs';
import { fromFetch } from 'rxjs/fetch';
import { switchMap, catchError } from 'rxjs/operators';

const fetch = (url, payload) => {
  try {
    const requestOptions = {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload),
    };
    const request = new Request(url, requestOptions);
    const data$ = fromFetch(request).pipe(
      switchMap(response => {
        if (response.ok) {
          return response.json();
        } else {
          return of({ error: true, message: `Error ${ response.status }` });
        }
      }),
      catchError(err => {
        console.error(err);
        return of({ error: true, message: err.message });
      })
    );
    data$.subscribe({
      next: result => console.log(result),
      complete: () => console.log('done')
    });
  } catch (e) {
    console.error(e);
  }
};
I am attempting to use a streaming strategy to send data in chunks to the browser.
Data is already sent to the browser in chunks. When the server sends data to the browser, it sends it in "chunks" and the browser stores them in a buffer. You are decoding that buffer into something usable when you run response.json().
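If you want to watch those chunks arrive yourself, here is a minimal sketch using the Fetch API's ReadableStream, independent of the RxJS code above (the url argument is a placeholder):

// Read the raw response stream chunk by chunk instead of buffering it with response.json().
async function logChunks(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;                       // stream finished
    console.log(`chunk of ${value.byteLength} bytes:`, decoder.decode(value, { stream: true }));
  }
}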
For further reading on TCP connections (how GET, POST, etc. work under the hood), I suggest the "Building Blocks of TCP" chapter in the book "High Performance Browser Networking": https://hpbn.co/building-blocks-of-tcp/
Your code works well with a standard GET request for some dummy JSON. I can see the response and I'm not seeing the error you are seeing:
https://stackblitz.com/edit/rxjs-xwyctu?file=index.ts
If you are actually trying to "stream" data to the browser, you can look into Server-Sent Events (SSE): https://hpbn.co/server-sent-events-sse/. This is how you can establish a long-running GET request to stream data from the server to the client.
You can see an example here: https://github.com/Fallenstedt/server-sent-events-example
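For completeness, the client side of SSE is only a few lines; this is a generic sketch (the /events URL is made up, and the server would need to respond with Content-Type: text/event-stream):

// Open a long-lived connection and log each event the server pushes.
const source = new EventSource('/events');

source.onmessage = (event) => {
  console.log('received:', event.data);
};

source.onerror = () => {
  console.error('SSE connection error');
};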
I know there is an official limitation on sending network requests in a loop in Cypress,
but there is probably some unofficial way to do it.
The use case is to send some cy.request() calls wrapped in a for() or while() loop, pass different values in the header each time from some array or directly from the database, and then assert on the results.
e.g.
let query = 'query bla bla bla';
let projectId = 'some value from array or db';
let result;

describe('Tests', () => {
  it('send graphql request to endpoint', () => {
    for (let i = 0; 0 > 3; i++) {
      cy.request({
        method: 'POST',
        url: 'https://www.blabla.con/api2',
        body: {
          'operationName': 'bla bla',
          'variables': {
            'campaignProjectId': null,
            'ids': [ { 'type': 'project', 'id': projectId } ],
            'userData': null,
          },
          query,
        },
        headers: {
          'accept': '*/*',
          'content-type': 'application/json',
        },
      }).then((response: any) => {
        // placeholder for assert - will compare between the results
        expect(JSON.stringify(response.body.data)).is.equal(JSON.stringify(result));
      });
    }
  });
});
With the code above, it just keeps looping without sending the request; it seems like a recursion issue or something else.
Take a look at #ArtjomProzorov's question Cy requests retries.
The same approach can work with some modifications. Maybe set the attempts limit to the total number of data items to send, and return instead of throwing an error.
const data = [...]

function req (attempts = 0) {
  if (attempts === data.length) return  // finished the data set

  const dataset = data[attempts]

  cy.request(...)                       // format request from dataset
    .then((resp) => {
      // handle response
      cy.wait(300)                      // 300 ms delay so as not to slam the server
      req(++attempts)
    })
}
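For instance, kicked off from a test it might look like this (illustrative only; the contents of data are placeholders for the per-iteration values):

const data = ['project-id-1', 'project-id-2', 'project-id-3']  // example values

describe('Tests', () => {
  it('sends one request per data item', () => {
    req()  // each response schedules the next request, so they run sequentially
  })
})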
I am not sure why, but my webhook is being fired twice in my cron job. The cron job is supposed to run once every 15 minutes, which it does, but it is firing off twice. I will post the logs, handler, and yml file to help out.
Basically my cron job makes some requests to a Salsify API to store a URL inside MongoDB. Once that file has been completed and built, the next time the cron job runs it should trigger the webhook for Netlify. Then the process starts all over again.
In Netlify I noticed the build was being run twice, and I have pinpointed the source to the serverless cron job.
EDIT: Something I should add is that even if my cron job runs twice, it should technically still only call the webhook once if there is a file in MongoDB. Yet it is still calling it twice somehow, which is causing my Netlify build to fail because it needs that file in order to build.
function part of serverless.yml:
functions:
  salsifyCron:
    handler: src/handler.salsifyCron
    events:
      - schedule:
          rate: cron(*/15 * * * ? *)
          enabled: true
Logs:
2018-05-17 10:00:41.121 (-05:00) 10d87735-59e3-11e8-be56-69e06899fa1f Trigger Webhook
2018-05-17 10:01:45.941 (-05:00) 10d87735-59e3-11e8-be56-69e06899fa1f Trigger Webhook
handler:
require('envdotjs').load();

import fetch from 'isomorphic-fetch';
import axios from 'axios';
import middy from 'middy';
import { jsonBodyParser, httpErrorHandler, cors } from 'middy/middlewares';

import { connectToDatabase } from '../utils/db';
import Sheet from '../models/Sheet';
import config from '../utils/config';

module.exports.salsifyCron = middy(async (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;

  let sheetId;

  const options = {
    url: `https://app.salsify.com/api/orgs/${
      process.env.SALSIFY_ORG_ID
    }/export_runs`,
    headers: {
      Authorization: `Bearer ${process.env.SALSIFY_API_KEY}`,
      'Content-Type': 'application/json'
    }
  };

  await connectToDatabase();
  const storedData = await Sheet.find({});

  if (
    storedData.length > 0 &&
    storedData[0] &&
    storedData[0].status === 'completed' &&
    storedData[0].url !== null
  ) {
    console.log('Trigger WebHook');
    axios.post('https://api.netlify.com/build_hooks/*****************');
    process.exit(0);
    return;
  }

  if (storedData[0]) {
    sheetId = storedData[0].sheetId;
  }

  if (storedData.length === 0) {
    const res = await fetch(options.url, {
      method: 'POST',
      headers: options.headers,
      body: JSON.stringify(config)
    }).then(res => res.json());

    if (res.id && res.status) {
      await Sheet.create({
        sheetId: res.id,
        url: null,
        status: res.status
      });
      sheetId = res.id;
    } else {
      console.log(res);
      process.exit(1);
    }
  }

  const resWithId = await fetch(`${options.url}/${sheetId}`, {
    method: 'GET',
    headers: options.headers
  }).then(res => res.json());

  if (resWithId.status === 'running') {
    console.log('running cron job');
    console.log(resWithId.estimated_time_remaining);
  }

  if (resWithId.status === 'completed') {
    console.log('completed cron job');
    await Sheet.findByIdAndUpdate(
      storedData[0],
      { status: resWithId.status, url: resWithId.url },
      { new: true }
    );
  }
})
  .use(cors())
  .use(jsonBodyParser())
  .use(httpErrorHandler());
Lambda timeout. This might not have been the problem in your case, but it is a common problem that causes this result.
Your lambdas are not getting executed simultaneously but with a bit of a delay. This is a clue that it is not just getting a duplicate execution.
I would guess that your lambda is first terminating with an error (for example timing out - the default lambda timeout is quite small) and the lambda is being rerun after it fails.
I have had this problem with timeouts and it is quite confusing if you don't notice that the lambda has timed out.
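If the timeout does turn out to be the cause, raising it is a one-line change in the function definition. As a sketch against the serverless.yml above (the 60-second value is arbitrary; the Serverless Framework default is 6 seconds):

functions:
  salsifyCron:
    handler: src/handler.salsifyCron
    timeout: 60   # seconds; generous enough for the external API calls
    events:
      - schedule:
          rate: cron(*/15 * * * ? *)
          enabled: true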
I have an HTTP request similar to this:
const timeout = $q.defer()
let timedOut = false

setTimeout(() => {
  timedOut = true
  timeout.resolve()
}, 4000)

$http({
  method: 'GET',
  url: 'www.someurl.com',
  timeout: timeout.promise
}).then((response) => {
  return response.data
})
.catch(() => {
  if (timedOut) {
    return "server is not responding"
  } else {
    return "some other error"
  }
})
The purpose of this code is to send an HTTP request and, if there is no response after 4 seconds, cancel the request and return a message that the server is unresponsive.
The problem is that I have no idea how to test something like this to see if it actually works. I normally use unit tests with $httpBackend to mock requests, but in this case that would not work. Is this the correct approach?
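One thing that may make this easier to unit test is replacing the bare setTimeout/$q.defer pair with $timeout: $timeout already returns a promise that $http accepts as its timeout option, and ngMock's $timeout.flush() can trigger it deterministically in a test. A rough sketch of that variant (not the original code; the getData wrapper and its injection are assumed):

// Same cancellation logic, but driven by $timeout so a test can flush it.
function getData($http, $timeout) {
  let timedOut = false;

  // $timeout returns a promise that resolves after 4s; $http aborts when it resolves.
  const cancel = $timeout(() => { timedOut = true; }, 4000);

  return $http({ method: 'GET', url: 'www.someurl.com', timeout: cancel })
    .then((response) => {
      $timeout.cancel(cancel);   // got a response in time, stop the timer
      return response.data;
    })
    .catch(() => (timedOut ? 'server is not responding' : 'some other error'));
}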