REST API: measuring server-side response times (performance) - javascript

I developed some REST APIs based on Node.js and I want to test their performance. Is there any tool that can easily measure the time of each API call?
Or how can I implement measuring the time a REST API takes to respond to requests?

Here is an example of how to hook into each request with precise time measurement using express.js.
Add this middleware before your routes:
app.all('*', function (req, res, next) {
  var start = process.hrtime();
  // the 'finish' event fires when express is done sending the response
  res.on('finish', function () {
    var hrtime = process.hrtime(start);
    var elapsedMs = hrtime[0] * 1000 + hrtime[1] / 1e6;
    console.log(elapsedMs.toFixed(3) + 'ms');
  });
  next();
});
It saves the start time of each request and triggers on 'finish', after the response has been sent to the client.
Thanks to user419127 for pointing to the 'finish' event.

What about a performance measuring tool like Apache JMeter? You can easily use it to simulate a (heavy) load on your server and then measure response times and other performance indicators. It provides multiple graphical representations for this.
This blog post shows how you can set up an HTTP-based performance test for web APIs. I do it this way to test my RESTful web services.

Keep it simple: use the console.time('timerName'); console.timeEnd('timerName') features in Node. 'timerName' is obviously configurable.
Example:
console.time('getCustomers');
console.timeEnd('getCustomers');
Output:
getCustomers: 58ms
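In an Express route, a minimal sketch might look like this (app is an Express app as in the first answer; fetchCustomersFromDb is a hypothetical async helper, not something from the question):
app.get('/customers', function (req, res) {
  console.time('getCustomers');
  fetchCustomersFromDb(function (err, customers) {
    // prints e.g. "getCustomers: 58ms" to the console
    console.timeEnd('getCustomers');
    if (err) return res.status(500).send(String(err));
    res.json(customers);
  });
});
Note that with concurrent requests the same label overlaps (Node warns that the label already exists), so a per-request label or the process.hrtime() middleware above is more robust.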

Related

Specific responses to specific clients while making requests at the same time to the server

Is there any way to return the right response to each client when another client makes a different request to the same server at the same time?
This is a code snippet for an exchange server. The function exchange.fetchMarkets() comes from a library named "ccxt"; it calls the API of a third-party exchange server such as 'bitfinex', 'crex24', 'binance', etc. The issue I am facing: when one client requests an exchange like 'crex24' at the same time as another client requests a different exchange like 'binance', both get the same response, as if the function call used the most recent exchange for both.
I want it to give each client a response according to its own request, independent of the others.
This is the controller function:
const ccxt = require("ccxt");

exports.fetchMarkets = async function (req, res) {
  let API = req.params.exchangeId;
  let exchange = new ccxt[API]();
  if (exchange.has["fetchMarkets"]) {
    try {
      var markets = await exchange.fetchMarkets();
      res.send(markets);
    } catch (err) {
      let error = String(err);
      res.send({ failed: error });
    }
  } else {
    res.send({ loadMarkets: "not available" });
  }
};
This is the endpoint for the server request:
app.route('/markets/:exchangeId')
  .get(exchange.fetchMarkets);
You can find the ccxt library here: https://github.com/ccxt/ccxt/wiki/Manual. It can be added to the project with "npm install ccxt".
I don't see why the code you mentioned wouldn't work the way you expect it to. I created a small app and it is working as expected. You can check it here:
https://repl.it/repls/IllfatedStrangeRepo
I am hitting four different requests with different ids and I am getting different responses.
Hope it clears the doubts.
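If you want to verify this yourself, a minimal concurrency check could look like the sketch below; it assumes the server above is running locally on port 3000 (just an example port) and only compares response sizes:
const http = require('http');

function getMarkets(exchangeId) {
  return new Promise((resolve, reject) => {
    http.get('http://localhost:3000/markets/' + exchangeId, (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => resolve({ exchangeId, bytes: body.length }));
    }).on('error', reject);
  });
}

// Fire two requests for different exchanges at the same time;
// each entry in the output should reflect its own exchange.
Promise.all([getMarkets('binance'), getMarkets('crex24')])
  .then((results) => console.log(results))
  .catch(console.error);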

Run each express request in a new fork

So I'm trying to fork every express request in order to set up a different uid per fork.
My current approach just sets the euid and restores it after the request, like this:
const mainUID = process.geteuid();

app.get('/', () => {
  process.seteuid(500);
  // some action1 that requires privileges of user id 500
  // some action2 that requires privileges of user id 500
  process.seteuid(mainUID);
});
But with concurrent requests this fails, because while action1 and action2 are running, code from other requests can execute under the changed euid.
So I read some information about the cluster module (https://nodejs.org/api/cluster.html), but I have no idea how to use it in my case.
What I'd prefer is to create a fork for each express request without actually splitting the JavaScript code into separate files... So is it possible?
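A minimal sketch of one way this could work, assuming the process starts with enough privileges for seteuid; the 'child' argv flag, uid 500, port 3000 and the dummy work are illustrative only. It forks the same file per request with child_process.fork and drops privileges inside the child:
const express = require('express');
const { fork } = require('child_process');

if (process.argv[2] === 'child') {
  // child process: change the effective uid, do the privileged work, report back
  process.seteuid(500);
  process.send('work done as uid ' + process.geteuid(), () => process.exit(0));
} else {
  const app = express();
  app.get('/', (req, res) => {
    // one short-lived fork per request; the parent's euid is never touched
    const child = fork(__filename, ['child']);
    child.once('message', (result) => res.send(result));
    child.once('error', (err) => res.status(500).send(String(err)));
  });
  app.listen(3000);
}
Forking per request is expensive, so a pool of pre-forked workers (one per uid) would be the more practical variant of the same idea.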

Angular 5 InMemoryWebAPI - How to simulate CRUD with HTTP Verbs?

While working with the Angular docs / tutorial on the in-memory web api, I want to return some JSON values that indicate the success or failure of a request, e.g.:
{success:true, error:""}
or
{success:false, error:"Database error"}
But, the code in the example for the in-memory-data.service.ts file only has the one method: createDb().
How do I update that service code to respond to a PUT/POST/DELETE request differently than to a GET?
Note: In real-life / production, the backend will be PHP, and we can return these values any way we want (with the correct status codes). This question is specifically directed at making the In Memory Web API mock those responses.
Example:
Executing:
response = this.http.post(url, someJsonData, httpHeaders);
I would want response to be:
{success: 'true', id: 1234} with an HTTP status code of 200.
Later, to delete that record that was just created:
url = '/foo/' + id + '/'; // url = '/foo/1234/';
this.http.delete(url);
This wouldn't really need a JSON meta data response. An HTTP Status code of 200 is sufficient.
How do I update that service code to respond to a PUT/POST/DELETE request differently than to a GET?
The server will always respond to the requests you make. When you fire an Ajax request it uses one of the known HTTP methods (POST, GET, PUT, ...), and afterwards you wait for the answer.
/** POST: add a new hero to the database */
addHero(hero: Hero): Observable<Hero> {
  return this.http.post<Hero>(this.heroesUrl, hero, httpOptions)
    .pipe(
      catchError(this.handleError('addHero', hero))
    );
}

/** GET: get a hero from the database */
getHero(heroId: number): Observable<Hero> {
  return this.http.get(this.heroesUrl + '/' + heroId);
}
I found the answer I was looking for: it's HttpInterceptors.
This blog post and the corresponding GitHub repo demonstrate exactly how to simulate CRUD operations in Angular 2/5 without having to set up a testing server.

How to use long polling in native JavaScript and node.js?

I need to implement long polling for a chat application. I've searched around, but I only find how to implement it in JavaScript using jQuery. How can I implement it using only native JavaScript and Node.js? Can you point me to some relevant articles or materials?
Q: How to do long polling in native JavaScript in Node.js?
A: I guess first of all you need to understand how the long polling model works. If you don't have a clue yet, the RFC 6202 specification is a good starting point.
It is about the client sending a request to the server and waiting until a response is returned.
From the specification we know that the client first has to issue an HTTP request with an infinite, or at least very high, timeout value. The server, which is your Node.js application, is expected to stash all incoming requests in a data structure, basically a holding area. Your application essentially holds on to all the response objects until an event is triggered, and then replies to them appropriately.
Consider this Pseudo code:
const express = require('express');
const bodyParser = require('body-parser');
const app = express();

app.use(bodyParser.text({ type: 'text/*' }));
app.use(bodyParser.json());

var requestCounter = 0;
var responses = {
  /* keyed by room id */
  "room_abc": [ /* array of stashed response objects */ ]
};

app.get('/', function (req, res) {
  requestCounter += 1;
  var room = /* assuming the request is for room_abc */ "room_abc";
  // Stash the response and reply later when an event comes through.
  responses[room].push(res);
  // Every 3rd request, assume there is an event for the chat room room_abc
  // and reply to all stashed responses for that room.
  if (requestCounter % 3 === 0) {
    responses["room_abc"].forEach((res) => {
      res.send("room member 123 says: hi there!");
    });
    responses["room_abc"] = []; // clear the stash once everyone has been answered
  }
});

app.listen(9999, function () {
  console.log('Example app listening on port 9999!');
});
It is relatively time consuming to write a fully working example here, but the code above is a good illustration of how you can implement long polling in Node.js.
If you have Postman or curl installed, you can make GET requests to http://localhost:9999/. You should notice that the first two calls get no response; when you fire the third one, you receive a response for all previous and current calls.
The idea is that you stash each request's response object first, and when an event comes through (here assumed to be every 3rd HTTP call), you loop through all of the stashed responses and reply to them. For your chat application, the event that triggers a response would probably be someone posting a message to a chat room.
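Since the question also asks about the browser side in native JavaScript, a minimal long-polling loop (no jQuery) against the server above might look like this; the URL and what you do with the message are placeholders, and the page is assumed to be served from the same origin as the Node server:
async function longPoll() {
  while (true) {
    try {
      // the request stays open until the server decides to answer
      const response = await fetch('/');
      const message = await response.text();
      console.log('new chat message:', message); // update the chat UI here
    } catch (err) {
      // network error: wait a moment before reconnecting
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }
  }
}
longPoll();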

How to stream data to feed a progress bar

I have a NodeJS app that does some computation, and I'd like to fill a progress bar on the client (AngularJS) showing how much of the computation is done. For now I do something like this:
Server side:
var compute_percent = 0;

router.post('/compute', function(req, res) {
  myTask.compute1(function(result) {
    compute_percent = 33;
    myTask.compute2(function(result2) {
      compute_percent = 66;
      myTask.compute3(function(result3) {
        compute_percent = 100;
        res.json(result3);
      });
    });
  });
});

router.get('/compute_percent', function(req, res) {
  res.json(compute_percent);
});
Client side:
angular.module('myApp').controller('myCtrl',
  function($scope, $interval, $http) {
    $interval(function() {
      $http.get('/compute_percent').success(function(result) {
        $scope.percent = result;
      });
    }, 500);
  });
What I don't like is that I end up making a lot of requests (if I want the progress bar to be accurate) and almost all of them are useless (the state didn't change server side).
How can I 'invert' this code, so that the server sends a message to the listening client when the computing state changes?
You have 3 possibilities:
Standard push
By using sockets or anything else that can communicate both ways, you can do fast information exchange. It will use many requests to update the progress bar, but it's pretty fast and can support tons of requests per second.
Long polling
The browser sends a request, and the server does not respond immediately; instead it waits for an event to occur before responding. The browser then applies the update and sends another request that is put on hold in the same way.
With this technique you only get an update when the server wants to send one. But if you want great accuracy, this will still produce a lot of requests.
Pushlet
The client sends only one request, and the server fills it with JavaScript that updates the progress bar.
The response is streamed to keep the connection open, and JavaScript updates are sent when needed.
The response will look like:
set_progress(0);
// nothing for X seconds
set_progress(10);
// nothing for X seconds
set_progress(20);
....
Compared to the others, you still send the same amount of information, but in one request, so it's lighter.
The easiest to implement is long polling, I think.
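To make the pushlet option more concrete, here is a minimal sketch that reuses the myTask callbacks from the question; the route name, the <script> framing and the set_progress function being defined on the client page are assumptions, not part of the original code:
router.get('/compute_stream', function (req, res) {
  res.setHeader('Content-Type', 'text/html');
  // each chunk is a small script that the streaming page executes as it arrives
  const push = (percent) => res.write('<script>set_progress(' + percent + ');</script>\n');

  push(0);
  myTask.compute1(function (result) {
    push(33);
    myTask.compute2(function (result2) {
      push(66);
      myTask.compute3(function (result3) {
        push(100);
        res.end(); // close the single long-lived response when done
      });
    });
  });
});
The client loads this route in a hidden iframe (the classic pushlet setup), so a single request carries all the progress updates.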

Categories

Resources