How to bring a gRPC defined API to the web browser - javascript

We want to build a JavaScript/HTML GUI for our gRPC microservices. Since gRPC is not supported in the browser, we thought of using WebSockets to connect to a Node.js server, which calls the target service via gRPC.
We struggle to find an elegant solution to do this, especially since we use gRPC streams to push events between our microservices.
It seems that we need a second RPC system, just to communicate between the front end and the node.js server. This seems to be a lot of overhead and additional code that must be maintained.
Does anyone have experience doing something like this or has an idea how this could be solved?

Edit: Since October 23, 2018, the gRPC-Web project is GA, which might be the most official/standardized way to solve your problem. (Even if it's already 2018 now... ;) )
From the GA-Blog: "gRPC-Web, just like gRPC, lets you define the service “contract” between client (web) and backend gRPC services using Protocol Buffers. The client can then be auto generated. [...]"
We recently built gRPC-Web (https://github.com/improbable-eng/grpc-web) - a browser client and server wrapper that follows the proposed gRPC-Web protocol. The example in that repo should provide a good starting point.
It requires either a standalone proxy or a wrapper for your gRPC server if you're using Golang. The proxy/wrapper modifies the response to package the trailers in the response body so that they can be read by the browser.
Disclosure: I'm a maintainer of the project.
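For a flavor of the browser side, here is a rough sketch of a server-streaming call with this client, based on the repo's book-service example (the package name, generated file paths and field names are illustrative and may not match the current README exactly):

const { grpc } = require("@improbable-eng/grpc-web");
// Generated by protoc from book_service.proto (illustrative paths):
const { BookService } = require("./generated/book_service_pb_service");
const { QueryBooksRequest } = require("./generated/book_service_pb");

const request = new QueryBooksRequest();
request.setAuthorPrefix("Geor");

grpc.invoke(BookService.QueryBooks, {
  request: request,
  host: "https://example.com:8080",        // the proxy/wrapper in front of the gRPC server
  onMessage: (book) => {
    console.log("got book:", book.toObject());
  },
  onEnd: (code, message, trailers) => {     // fires once the stream and its trailers finish
    console.log("stream ended with code", code);
  }
});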

Unfortunately, there isn't any good answer for you yet.
Fully supporting streaming RPCs from the browser requires HTTP/2 trailers to be supported by browsers, and at the time of writing this answer, they aren't.
See this issue for the discussion on the topic.
Otherwise, yes, you'd require a full translation system between WebSockets and gRPC. Maybe getting inspiration from grpc-gateway could be the start of such a project, but that's still a very long shot.
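To give an idea of what such a translation layer involves, here is a minimal sketch of a Node.js bridge that forwards a single gRPC server-streaming call over a WebSocket, using the ws and @grpc/grpc-js packages (the proto, service and method names are made up):

const WebSocket = require('ws');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

// Hypothetical proto: package events; service EventService { rpc Subscribe(SubscribeRequest) returns (stream Event); }
const def = protoLoader.loadSync('events.proto');
const proto = grpc.loadPackageDefinition(def).events;
const backend = new proto.EventService('localhost:50051', grpc.credentials.createInsecure());

const wss = new WebSocket.Server({ port: 8080 });
wss.on('connection', (ws) => {
  // One backend gRPC stream per browser connection.
  const call = backend.Subscribe({});
  call.on('data', (event) => ws.send(JSON.stringify(event)));   // forward each event as JSON
  call.on('error', (err) => ws.close(1011, err.message));
  ws.on('close', () => call.cancel());                          // tear down the gRPC stream with the socket
});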

An official grpc-web (beta) implementation was released on 3/23/2018. You can find it at
https://github.com/grpc/grpc-web
The following instructions are taken from the README:
Define your gRPC service:
service EchoService {
  rpc Echo(EchoRequest) returns (EchoResponse);
  rpc ServerStreamingEcho(ServerStreamingEchoRequest)
      returns (stream ServerStreamingEchoResponse);
}
Build the server in whatever language you want.
Create your JS client to make calls from the browser:
var echoService = new proto.grpc.gateway.testing.EchoServiceClient(
    'http://localhost:8080');
Make a unary RPC call:
var unaryRequest = new proto.grpc.gateway.testing.EchoRequest();
unaryRequest.setMessage(msg);
echoService.echo(unaryRequest, {},
  function(err, response) {
    console.log(response.getMessage());
  });
Streams from the server to the browser are supported:
var stream = echoService.serverStreamingEcho(streamRequest, {});
stream.on('data', function(response) {
  console.log(response.getMessage());
});
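For completeness, streamRequest is a ServerStreamingEchoRequest built the same way as the unary request, and the returned stream also emits lifecycle events you will usually want to handle. A sketch follows; the field setters and event payloads are assumed to follow the grpc-web docs, so double-check the exact shapes:

var streamRequest = new proto.grpc.gateway.testing.ServerStreamingEchoRequest();
streamRequest.setMessage(msg);

var stream = echoService.serverStreamingEcho(streamRequest, {});
stream.on('data', function(response) {
  console.log(response.getMessage());
});
stream.on('status', function(status) {
  console.log('status:', status.code, status.details);   // gRPC status delivered via the trailers
});
stream.on('end', function() {
  console.log('stream ended');
});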
Bidirectional streams are NOT supported:
This is a work in progress and on the grpc-web roadmap. While there is an example protobuf showing bidi streaming, this comment makes it clear that the example doesn't actually work yet.
Hopefully this will change soon. :)

https://github.com/tmc/grpc-websocket-proxy sounds like it may meet your needs. It translates JSON over WebSockets to gRPC (a layer on top of grpc-gateway).
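The browser side then just speaks JSON over a plain WebSocket, roughly like this (the route and message shape are hypothetical and depend on your .proto and grpc-gateway configuration):

// Open a WebSocket against a grpc-gateway route wrapped by grpc-websocket-proxy.
var ws = new WebSocket('ws://localhost:8080/v1/events');    // hypothetical route
ws.onopen = function() {
  ws.send(JSON.stringify({ topic: 'orders' }));             // request message as JSON
};
ws.onmessage = function(evt) {
  var msg = JSON.parse(evt.data);                           // each streamed response arrives as JSON
  console.log(msg);
};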

The gRPC people at https://github.com/grpc/ are currently building a JS implementation.
The repo is at https://github.com/grpc/grpc-web (it currently gives a 404), because as of 2016-12-20 it is in early access, so you need to request access.

GRPC Bus WebSocket Proxy does exactly this by proxying all GRPC calls over a WebSocket connection to give you something that looks very similar to the Node GRPC API in the browser. Unlike GRPC-Gateway, it works with both streaming requests and streaming responses, as well as non-streaming calls.
There is both a server and client component.
The GRPC Bus WebSocket Proxy server can be run with Docker by doing docker run gabrielgrant/grpc-bus-websocket-proxy
On the browser side, you'll need to install the GRPC Bus WebSocket Proxy client with npm install grpc-bus-websocket-client
and then create a new GBC object with: new GBC(<grpc-bus-websocket-proxy address>, <protofile-url>, <service map>)
For example:
var GBC = require("grpc-bus-websocket-client");
new GBC("ws://localhost:8080/", 'helloworld.proto', {helloworld: {Greeter: 'localhost:50051'}})
  .connect()
  .then(function(gbc) {
    gbc.services.helloworld.Greeter.sayHello({name: 'Gabriel'}, function(err, res) {
      console.log(res);  // --> Hello Gabriel
    });
  });
The client library expects to be able to download the .proto file with an AJAX request. The service-map provides the URLs of the different services defined in your proto file as seen by the proxy server.
For more details, see the GRPC Bus WebSocket Proxy client README

I see that a lot of answers don't point to a bidirectional solution over WebSocket, which is what the OP asked for in the browser.
You may use JSON-RPC instead of gRPC, to get a bidirectional RPC over WebSocket, which supports a lot more, including WebRTC (browser to browser).
I guess it could be modified to support gRPC if you really need this type of serialization.
However, for browser tab to browser tab, request objects are not serialized but transferred natively, and the same goes for Node.js cluster or worker threads, which offers a lot more performance.
Also, you can transfer "pointers" to SharedArrayBuffer, instead of serializing through the gRPC format.
JSON serialization and deserialization in V8 is also unbeatable.
https://github.com/bigstepinc/jsonrpc-bidirectional
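To illustrate the idea independently of that particular library: bidirectional JSON-RPC 2.0 over a WebSocket is just a matter of matching responses to request ids, and both ends can issue requests. A minimal sketch (endpoint and method names are made up):

// Minimal JSON-RPC 2.0 over WebSocket: both sides can send requests
// and answer them, which is what makes it bidirectional.
var ws = new WebSocket('ws://localhost:8080/rpc');   // hypothetical endpoint
var nextId = 1;
var pending = {};                                    // id -> callback

function call(method, params, cb) {
  var id = nextId++;
  pending[id] = cb;
  ws.send(JSON.stringify({ jsonrpc: '2.0', id: id, method: method, params: params }));
}

ws.onmessage = function(evt) {
  var msg = JSON.parse(evt.data);
  if (msg.id && pending[msg.id]) {                   // response to one of our requests
    pending[msg.id](msg.error, msg.result);
    delete pending[msg.id];
  } else if (msg.method) {                           // request coming from the server
    ws.send(JSON.stringify({ jsonrpc: '2.0', id: msg.id, result: 'ok' }));
  }
};

ws.onopen = function() {
  call('echo', { message: 'hello' }, function(err, result) {
    console.log(result);
  });
};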

Looking at the current solutions for gRPC over the web, here is what's available at the time of writing (and what I found):
gRPC-web: requires TypeScript for client
gRPC-web-proxy: requires Go
gRPC-gateway: requires .proto modification and decorations
gRPC-bus-websocket-proxy-server: as of writing this document it lacks tests and seems abandoned (edit: look at the comments by the original author!)
gRPC-dynamic-gateway: a bit of an overkill for simple gRPC services and authentication is awkward
gRPC-bus: requires something for the transport
I also want to shamelessly plug my own solution, which I wrote for my company and which is used in production to proxy requests to a gRPC service that only includes unary and server-streaming calls:
gRPC-express
Every inch of the code is covered by tests. It's an Express middleware, so it needs no additional modifications to your gRPC setup. You can also delegate HTTP authentication to Express (e.g. with Passport).
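For context, the general shape of such a proxy (not gRPC-express's actual API, just the underlying idea: an Express route forwarding the HTTP request to a unary gRPC call via @grpc/grpc-js) looks roughly like this:

const express = require('express');
const grpc = require('@grpc/grpc-js');
const protoLoader = require('@grpc/proto-loader');

// Hypothetical proto: package echo; service EchoService { rpc Echo(EchoRequest) returns (EchoResponse); }
const def = protoLoader.loadSync('echo.proto');
const proto = grpc.loadPackageDefinition(def).echo;
const client = new proto.EchoService('localhost:50051', grpc.credentials.createInsecure());

const app = express();
app.use(express.json());

// Forward the JSON body to the unary gRPC call and return the reply as JSON.
app.post('/echo', (req, res) => {
  client.Echo(req.body, (err, reply) => {
    if (err) return res.status(500).json({ error: err.message });
    res.json(reply);
  });
});

app.listen(8080);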

Related

Communication between JavaScript and WCF Service Library (WCF Test Client)

There is a web service (WCF Service Library); when I debug the web service project (in Visual Studio), the "WCF Test Client" is launched (so I guess it's hosted via the Test Client). I have a web service method called "Test" which returns a string. When I call that method with the WCF Test Client, it works.
When I want to use the browser as a client, I go to http://localhost:9001/Name/WebService/WebAPI and I see the web service (XML with some info about the methods). Now I want to use JavaScript to call that Test method.
I created a client similar to this https://stackoverflow.com/a/11404133, replaced the sr variable (SOAP request) with the request shown in the XML part of the Test method in the WCF Test Client, and used http://localhost:9001/Name/WebService/WebAPI as the URL. I tried that JavaScript client, but I got a client error:
content-type 'text-xml' is invalid, server wanted
'application/soap+xml; charset=utf-8'
(Unfortunately I can't reach the web service right now, so I don't know the error number and exact message, but there was no other information besides the content type.) So I changed the request header to 'application/soap+xml; charset=utf-8', but then I got an error telling me:
The message cannot be processed at the receiver, due to an
AddressFilter mismatch at the EndpointDispatcher. Check that the
sender and receiver's EndpointAddresses agree
(Or something like that - I had to translate it to English.)
I also tried the "JavaScript client" with an existing service that I found on the internet, using the text/xml content type, and it works fine.
Do you have any advice on how to call the Test method with JavaScript? Thanks.
A service invoked from JavaScript like this is a RESTful-style service. WCF can create a RESTful-style service too, but it needs some additional configuration. The service is hosted in IIS Express when we test it in Visual Studio, and it uses the default binding configuration (BasicHttpBinding), i.e. a SOAP web service. The usual way to invoke such a service is through a service proxy class, which is what the WCF Test Client does.
If we want to invoke the service from JavaScript, here is a simple demo; I hope it is useful to you. Please be aware that the project template is a WCF Service Application instead of a Service Library project.
https://stackoverflow.com/questions/56873239/how-to-fix-err-aborted-400-bad-request-error-with-jquery-call-to-c-sharp-wcf/56879896#56879896
Feel free to let me know if there is something I can help with.
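For reference, calling a SOAP 1.2 endpoint from the browser generally means matching both the content type and the SOAP action the service expects. A rough sketch with fetch follows; the envelope, namespace and action are placeholders you would copy from the request XML shown in the WCF Test Client:

var envelope =
  '<?xml version="1.0" encoding="utf-8"?>' +
  '<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope">' +
  '  <s:Body>' +
  '    <Test xmlns="http://tempuri.org/" />' +   // placeholder operation and namespace
  '  </s:Body>' +
  '</s:Envelope>';

fetch('http://localhost:9001/Name/WebService/WebAPI', {
  method: 'POST',
  headers: {
    // SOAP 1.2 carries the action in the content type instead of a SOAPAction header.
    'Content-Type': 'application/soap+xml; charset=utf-8; action="http://tempuri.org/IService/Test"'
  },
  body: envelope
}).then(function(resp) {
  return resp.text();
}).then(function(xml) {
  console.log(xml);   // parse the returned SOAP envelope for the Test result
});

The AddressFilter mismatch error quoted in the question typically points to missing WS-Addressing headers or a To address the dispatcher doesn't recognize, which is one more reason a RESTful endpoint (webHttpBinding) is usually easier to call from JavaScript.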

NodeJS && browser cross compatible websocket client

My goal is to implement a WebSocket client in JavaScript with a specific protocol.
This client should be platform independent: it should be able to run in a NodeJS environment as well as in browsers. I know that one option is to use browserify on NodeJS packages and then use the result in the browser, but I wonder if there is another, more elegant solution?
Also, the GitHub repo of the ws NodeJS package (the most popular one, I think) says that it returns global.WebSocket, which has a slightly different API.
Can I still use it for platform independent client application?
Thanks for any reply.
In the end I used the https://websockets.github.io/ws/ package for NodeJS, which is compatible with the browser WebSocket API.
// Use the browser's native WebSocket when available, otherwise fall back to the ws package in Node.
var WebSock = global.WebSocket || global.MozWebSocket || require('ws');
this.sock = new WebSock(url, this.subProtocol);
this.sock.onopen = (evt) => {};
this.sock.onmessage = (msg) => {};
this.sock.onerror = (err) => {};
Depending on your specific requirements, I have found Socket.IO to work well for both NodeJS (using socket.io-client) and the browser. It does more than just WebSockets, though, so look into it further, as it may not suit your specific purposes.
I am using it for a task queue and have been pleased with it.
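For reference, the same socket.io-client code runs unchanged in Node and in the browser (bundled, or via the /socket.io/socket.io.js script the server serves). A small sketch, assuming a Socket.IO server at the URL shown; the event names are made up:

// In Node: npm install socket.io-client. In the browser, bundle the same module
// or load the client script served by the Socket.IO server.
var io = require('socket.io-client');
var socket = io('http://localhost:3000');

socket.on('connect', function() {
  socket.emit('task', { id: 42 });            // hypothetical event for a task queue
});
socket.on('result', function(data) {
  console.log('got result', data);
});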
Take a look at SockJS: SockJS-client as a JavaScript client library, and
SockJS-node for the server implementation.
http://sockjs.org
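The SockJS client mirrors the browser WebSocket API closely, so a wrapper like the one above still applies. A minimal sketch (the URL is a placeholder):

var SockJS = require('sockjs-client');          // or a <script> tag in the browser
var sock = new SockJS('http://localhost:9999/echo');

sock.onopen = function() {
  sock.send('test');
};
sock.onmessage = function(e) {
  console.log('message', e.data);
};
sock.onclose = function() {
  console.log('closed');
};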

Keeping a client-side sync of Sails.js collection, using sockets

I very much like Meteor's pub/sub. I wonder if there is a way to get a similar workflow, using sails.js or just a socket library in general.
In particular, what I would like to be able to do is something along the lines of:
// Server-side:
App.publish('myCollection', -> collection.find({}))
// Client-side:
let myCollection = App.subscribe('myCollection')
let bob = myCollection.find({name: 'Bob'})
myCollection.insert({name: 'Amelie'}, callback)
All interaction with the server should happen in the background.
I very much like Meteor's pub/sub. I wonder if there is a way to get a similar workflow, using sails.js or just a socket library in general
Basically yes, at least regarding realtime sync between backend and frontend. Let's review what Meteor has and answer point by point.
Pub/sub
The pub/sub concept, as stated by Sabbir, is also supported by sails.js, though the basics are slightly different:
In Meteor, the client can subscribe to anything it wants, and the server controls what each client receives by publishing only to the clients it wants;
whereas in sails.js, the server both subscribes the client sockets and publishes to all bound sockets.
Note that, by default:
Meteor contains the autopublish package, which just notifies every client without any kind of filtering. To achieve some filtering, you have to meteor remove autopublish; then you can control what your clients receive by adding a Mongo query to the publication, as explained here.
Sails, by default, on its automatic "select" blueprint actions, auto-subscribes the calling socket to the events on the objects returned by the "select".
As a server-side conclusion:
Subscribe: just call the find or findOne default blueprint action through a socket (with or without where filters), and your socket will automatically be subscribed to every event concerning the returned objects => you don't have to code anything on the server, in most cases, for the subscribe logic.
Publish: every default blueprint action (create, update, destroy, add, remove) auto-publishes to the subscribed sockets => you don't have to code anything on the server, in most cases, for the publish logic.
(Though, if you find yourself implementing some manual controller actions, the sails API helps you publish and subscribe easily.)
Client handling
Therefore, with both meteor and sails, clients only receive what they're supposed to receive. Time for front-end now.
Philosophy
Meteor, on the one hand, with its isomorphic dimension, provides a front-end connector by nature, exposing its data-bound collections.
Sails, on the other hand, is front-end agnostic and can be hit by any HTTP REST connector (JS or not), such as $http, $resource, or more advanced ones like Restangular.
Though, being aware of the complexity of using raw sockets with their API (when it comes to sessions, CORS, CSRF and the like), they developed a JavaScript socket.io wrapper called sails.io.js, designed to be REST-like-over-socket, and it just works like a charm.
Basically, the main difference is that Meteor is one step higher-level than Sails, because it provides the logic of syncing collections and objects.
All interaction with the server should happen in the background.
sails.io.js, the official front-end component, is just not that high-level. When it comes to Angular.js, though, you can find some community connectors that aim to provide, more or less, the same feature as Mongo data-bound collections and objects. There are sails-resource, spinnaker and angular resource sails. I tried a couple of them, and I must say I was disappointed. The abstraction level is so high that it just becomes annoying, IMHO. For example, with not-very-RESTful-friendly custom actions, like a login, it becomes very hard to adapt them to your needs.
==> I would advise using a low-level connector, such as angularSails or (my preferred) https://github.com/janpantel/angular-sails, or even raw sails.io.js if you're not using Angular.
Edit: just found a Backbone version, by the Sails creator.
It just works great, and believe me, the "keep my collection in sync with that socket" code is so ridiculously small that finding a module for it is just not worth it.
Some code please, stop talking
In particular, what I would like to be able to do is something along the lines of:
Server
Meteor
# Server-side:
App.publish('myCollection', -> collection.find({}))
Sails
//Nothing to do, just sails generate api myCollection
Client
Meteor
# Client-side:
myCollection = App.subscribe('myCollection')
Sails, with sails.io.js
(Here using lodash for convenience)
var myCollection;

sails.io.get('/myCollection').then(
  function(res) {
    myCollection = res.data;
  },
  function(err) {
    // Handle error
  }
);

sails.io.on('myCollection', function(msg) {
  switch (msg.verb) {
    case 'created':
      myCollection.push(msg.data);
      break;
    case 'updated':
      _.extend(_.find(myCollection, {id: msg.id}), msg.data);
      break;
    case 'destroyed':
      _.remove(myCollection, {id: msg.id});
      break;
  }
});
(I leave the find with where filters and the create to your imagination, with the help of the docs; a rough sketch follows.)
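Roughly, and following the same promise-style sketch as above (check the sails.io.js docs for the exact call signatures):

// Filtered read, mirroring myCollection.find({name: 'Bob'}):
sails.io.get('/myCollection?name=Bob').then(function(res) {
  var bob = res.data[0];
});

// Create, mirroring myCollection.insert({name: 'Amelie'}, callback):
sails.io.post('/myCollection', {name: 'Amelie'}).then(function(res) {
  // the 'created' event above will also fire for every subscribed socket
});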
All interaction with the server should happen in the background.
Well, in Sails, only for Angular, with sails resources.
I'm not very used to that process, so I leave you to read here or here, but once again I'd choose the manual .on() method.
Since I asked this question, I've learned a few things and some new projects have popped up. I decided against sails.io, because when developing with React.js, most of the community's weight is behind webpack, but sails.io uses gulp. I realize these can be used together and there is even an npm package for this, but I wasn't too keen on making my stack bigger than it had to be, so I went with a simple express.js server that I could tailor to my needs.
In order to sync my data, I'm using rethinkdb which allows me to asynchronously watch the database for changes and then publish the changes to the clients through websockets.
I've set up a simple script where I keep an instance of a baobab tree on both the client and the server.
When the tree gets modified on the server, it sends transaction data to the appropriate clients through the websocket
The client merges the transaction with the tree.
This method does not make use of local storage and keeps the data in memory in the node.js process. The data in the transaction is also quite redundant.
The future plan has always been to set something up using redis and local storage ...
... until yesterday when I found deepstream.io!
This is a tool that does exactly what I want and need! Nothing more, nothing less.
Another project worth mentioning is meatier: "like meteor, but meatier". It is composed of many other well-supported open source projects, so you could even pick and choose.

Relationship of node.js / Express / Connect / Socket.io

I'm confused. The main question I have is: when should I use pure node.js, and when should I use a framework/MVC like "express" or "connect"?
I know that "express" just adds a bunch of functionality to "connect", but what is it really good for? Let's say I want all my HTTP stuff handled by an "Apache" server and only do some things with node.js (like WebSocket connections, CouchDB, etc.).
Would it make sense in this scenario to use "express" or "connect" for some reason?
As far as I know, Socket.IO also handles HTTP requests as a fallback, so would it be enough to just use Socket.IO for those needs?
What else is the big advantage using Express/Connect ?
Express (or Connect) is an application framework for HTTP web applications. It's the entry point of your application. It provides some very common functionality such as:
HTTP Server
URL Routing
Request args
Sessions
It also allows other functionality to be added easily (as middleware), such as authentication handling and templating.
If you just want to implement a pub/sub service through Socket.IO, without any pages or other API, just use the Socket.io library (Socket.io homepage example):
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
  socket.on('my other event', function (data) {
    console.log(data);
  });
});
If you want to use Socket.io within a more complex application, serving pages and an API, you can (must?) integrate it with Express (see How To Use).
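The integration itself is short: attach Socket.IO to the HTTP server that Express creates, so one port serves both the pages/API and the sockets. A minimal sketch:

var express = require('express');
var app = express();
var http = require('http').createServer(app);      // Express handles the HTTP routes
var io = require('socket.io')(http);                // Socket.IO shares the same server

app.get('/', function (req, res) {
  res.send('<h1>hello</h1>');
});

io.on('connection', function (socket) {
  socket.emit('news', { hello: 'world' });
});

http.listen(80);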
Hi, I have been using Express.js for some time now and find it particularly useful for the Jade templating engine it provides by default. Jade is a new templating language, and though I admit it takes some time to get familiar with, it's pretty useful. You can write conditionals and mixins, pass variables to your pages, use partials, etc. It just makes client-side HTML really easy. Express.js also sets up your view, JavaScript and stylesheet directory structure for you. If followed properly, catching requests and rendering HTML pages takes only a couple of lines of code. As discussed above, the HTTP middleware is also a lot easier to implement.

How do I do an LDAP query with JavaScript?

I'm trying to make a sidebar gadget that has an LDAP query function but haven't been able to find very good, or any, useful documentation on the matter. I'm not hugely experienced with JavaScript and know little to nothing about how LDAP queries work, so any information at all would be useful.
info:
host: a.b.c.d.e
port: 389
ou: people
o: x_y_z
c: us
first snippet:
var sSearchURL = "ldap://a.b.c.d.e:389/o=x_y_z,c=us";
var URLsuffix = "dc=" + form.SearchData.value;
document.location = sSearchURL + URLsuffix;
other snippet:
var ldap = GetObject('LDAP:');
var ad = ldap.OpenDSObject(
  'LDAP://a.b.c.d.e:389/o=x_y_z',
  'cn=Administrator,ou=People,o=rootname',
  'password',
  0
);
As long as you want to run your JavaScript in a web browser, you are limited to the HTTP protocol and to the domain from which your script was loaded in the first place.
So, talking to an LDAP server will not be possible from a web browser's JavaScript engine.
There are JavaScript runtime environments with fewer limitations, where you can implement socket servers and clients. For LDAP connectivity you'd have to write your own library or find an existing one.
You could write a proxy web service that translates your HTTP requests into LDAP queries, forwards them to an LDAP server and returns the results back to you. Of course that'd have both security and scalability implications and is far from trivial.
As Selfawaresoup already mentioned, there are limitations to performing this on the client side alone; however, if you're able to host your application/page on Node.js you can use an LDAP plugin with it.
Links to nodejs are as follows:
https://nodejs.org/en/
https://nodejs.org/en/download/
Nodejs LDAP plugin:
http://ldapjs.org/
Instructions on setting up Node.js to serve HTTP:
https://www.sitepoint.com/build-a-simple-web-server-with-node-js/
https://blog.risingstack.com/your-first-node-js-http-server/
Although it's written for a specific application, here's a guide demonstrating LDAP query integration via Node.js:
https://www.ibm.com/developerworks/library/se-use-ldap-authentication-authorization-node.js-bluemix-application/index.html
Here's a working demo of it (note this is for querying public facing LDAP servers):
https://login-using-ldap.mybluemix.net/
Best of luck to you however you resolve this.
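To make the ldapjs suggestion concrete, here is a rough Node-side sketch using the host, base DN and credentials from the question (the exact API, e.g. entry.object vs entry.pojo, differs between ldapjs versions):

var ldap = require('ldapjs');

var client = ldap.createClient({ url: 'ldap://a.b.c.d.e:389' });

client.bind('cn=Administrator,ou=People,o=rootname', 'password', function (err) {
  if (err) throw err;

  var opts = {
    scope: 'sub',
    filter: '(ou=people)'          // adjust the filter to your actual search
  };

  client.search('o=x_y_z,c=us', opts, function (err, res) {
    if (err) throw err;
    res.on('searchEntry', function (entry) {
      console.log(entry.object);   // one result object per matching entry
    });
    res.on('end', function () {
      client.unbind();
    });
  });
});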
I am not sure answer 1 is correct. The domain would be limited to the client's domain for an Active Directory LDAP query. However, LDAP://server is not limited to just the local domain; it's limited to 'reachable' domains. If you can ping it, you should be able to query it, given proper credentials.
