gRPC-Web with TLS on localhost - JavaScript

I have a Go server and client using a locally trusted certificate, and they communicate flawlessly with each other. Now I want the Go server to talk to a gRPC-Web instance too. Insecure connections didn't work, since browsers either force HTTP/2 over TLS or refuse it entirely; besides, it should run over TLS in production anyway. CORS is another issue: I can't figure out yet how to get https://github.com/improbable-eng/grpc-web 's version of the server implementation to add the origin headers. But first things first.
I serve a simple HTML page and a Webpack-built JS bundle with a simple Go FileServer over TLS.
I first generated a new TLS cert/key (the same pair the already-working Go server/client use successfully):
openssl req -new -newkey rsa:4096 -days 365 -nodes -x509 -subj "/C=US/ST=State/L=Town/O=Office/CN=www.wp-ts.loc" -keyout ./www.wp-ts.loc.key -out ./www.wp-ts.loc
Then I added it to the macOS keychain to be trusted:
sudo security add-trusted-cert -d -r trustRoot -k /Library/Keychains/System.keychain www.wp-ts.loc
This is the Protobuf file I use:
syntax = "proto3";

package pb;

service GCDService {
  rpc Compute (GCDRequest) returns (GCDResponse) {}
}

message GCDRequest {
  uint64 a = 1;
  uint64 b = 2;
}

message GCDResponse {
  uint64 result = 1;
}
Then build with:
protoc -I=. gcd.proto \
--js_out=import_style=commonjs:. \
--grpc-web_out=import_style=commonjs,mode=grpcwebtext:.
This is the simple HTML page I serve:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <title>gRPC</title>
  <script src="./dist/main.js"></script>
</head>
<body>
</body>
</html>
Then the JS module:
import { grpc } from "grpc";
import { GCDRequest, GCDResponse } from "./gcd_pb.js";
import { GCDServiceClient } from "./gcd_grpc_web_pb.js";
var root_certs = ["www.wp-ts.loc", "www.wp-ts.loc.key"];
var sslCredentials = grpc.credentials.createSsl(...root_certs);
var client = new GCDServiceClient("https://www.wp-ts.loc:3000", sslCredentials);
var request = new GCDRequest();
request.setA(294);
request.setB(462);
client.compute(request, {}, (err, response) => {
  console.log(err);
  console.log(response);
  // console.log(response.getResult());
});
This is built with Webpack like so (it outputs to ./dist/main.js, which index.html then loads):
npx webpack client.js
The www.wp-ts.loc test domain is in my /etc/hosts, so it can stand in as a real domain for the certificate while routing all traffic to localhost.
Now here's the issue, which I can't really pin down in all the big library overhead, since most of it seems to be meant for Node.js. The Webpack build with new GCDServiceClient() without credentials compiles fine, but of course the browser won't allow the non-TLS connection (leaving CORS out of the equation for now). Using the credentials as a test (which of course is dangerous, but I'm experimenting along the way and can't find good docs for it in grpc-web style) gives the obvious Node.js problem that the browser bundle can't use the filesystem:
... // Many lines here, but this is clear enough I guess
ERROR in ./node_modules/grpc/src/grpc_extension.js
Module not found: Error: Can't resolve 'fs' in '/Users/#USERNAME/Downloads/wp-ts/node_modules/grpc/src'
# ./node_modules/grpc/src/grpc_extension.js 34:11-24
# ./node_modules/grpc/index.js
# ./client.js
Maybe I'm just approaching this totally wrong, and I'm also aware that the grpc-web implementation is still in a very brittle phase. But how do you make it connect correctly over HTTP/2 with TLS (and a certificate), and, if known, how do you get the CORS headers added to https://github.com/improbable-eng/grpc-web 's server, which I attach to my Go listener on port 3000?
Sorry for the big gist, but I hope someone can help me with this. I'm really excited to get going with Go + browser gRPC/Protobuf :D
Thank you so much in advance!

You need a grpcwebproxy to forward your browser requests to the real gRPC server:
https://github.com/improbable-eng/grpc-web/tree/master/go/grpcwebproxy
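For illustration, a minimal sketch of that setup; the flag names come from the grpcwebproxy README and may differ between versions, and the ports (9090 for the gRPC backend, 8443 for the proxy's TLS listener) are assumptions:
grpcwebproxy --backend_addr=localhost:9090 --run_tls_server=true --server_tls_cert_file=./www.wp-ts.loc --server_tls_key_file=./www.wp-ts.loc.key --allow_all_origins
The browser client then needs no credentials object at all: the browser negotiates TLS itself from the https:// URL, and the Node-only grpc package (the source of the "Can't resolve 'fs'" build error) drops out of the bundle entirely:
import { GCDRequest } from "./gcd_pb.js";
import { GCDServiceClient } from "./gcd_grpc_web_pb.js";

// Point at the proxy's TLS port (8443 is an assumption), not at the gRPC port.
var client = new GCDServiceClient("https://www.wp-ts.loc:8443");

var request = new GCDRequest();
request.setA(294);
request.setB(462);

client.compute(request, {}, (err, response) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(response.getResult()); // GCD(294, 462) = 42
});
If you would rather keep serving gRPC-Web from your own Go listener on port 3000 instead of running the standalone proxy, the improbable-eng Go package also exposes an origin hook for the CORS part (grpcweb.WithOriginFunc, at least in the versions I've seen).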

Related

How to implement and use HTTP/2 in Node.js

I have a Node.js app. I was using http to create the server, as shown below:
...
var http=require('http');
var server=http.createServer(app);
...
And this works perfectly.
Then I installed the http2 module from npm, as shown below:
npm install http2
Now that I have the package, I changed my code to:
var http2 = require('http2');
var server = http2.createServer(app);
But this doesn't work; I get a "could not get any response" error from Postman.
Why doesn't it work, and how can I fix it and use http2?
EDIT: I found the HTTP/2 documentation. It shows how to implement HTTP/2, what to do on the server side, what to do on the client side, and other information about http2. I also found out that, to test http2 with browsers, you need to generate a certificate and key, as described in the same document. The document:
https://nodejs.org/api/http2.html#http2_server_side_example
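For reference, a minimal sketch along the lines of that server-side example, using Node's built-in http2 module (the pem file names are the placeholders the docs use, and the port is an assumption):
const http2 = require('http2');
const fs = require('fs');

// Browsers only speak HTTP/2 over TLS, so use createSecureServer rather than createServer.
const server = http2.createSecureServer({
  key: fs.readFileSync('localhost-privkey.pem'),
  cert: fs.readFileSync('localhost-cert.pem'),
});

server.on('stream', (stream, headers) => {
  // Answer every request with a small plain-text body.
  stream.respond({ ':status': 200, 'content-type': 'text/plain' });
  stream.end('hello over HTTP/2');
});

server.listen(8443);
Note that this is the core http2 stream API, which differs from the http module's request/response API that Express builds on, so an Express app can't simply be passed in the way it can to http.createServer.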
Postman does not support HTTP/2.
Source: https://github.com/postmanlabs/postman-app-support/issues/2701

Node.js file to run a local server with access-control-allow-origin

I have an html file that has resources in its directory
(example file tree)
index.html
imgs
  > img1.jpg
  > img2.jpg
  > img3.jpg
js
  > js1.js
  > js2.js
How do I run a node.js server that will allow me to view the HTML file, as well as allow me to access certain websites with the access-control-allow-origin header set to *?
I am unfamiliar with node, so the simpler, the better!
Extra: it does not necessarily have to be node, just any server that will allow access control
Since you're learning and starting from scratch, it's preferable to learn how it's done rather than installing a super-duper Swiss-army toolset that hides the logic from you and makes you a lazy developer.
If you just want a quick result and don't want to learn, you may use the serve package, which will do what you need.
But if you're learning nodejs from zero to hero, read my answer.
It's better to start with simple things.
Let's go (:
Create a folder and, inside it, run the following commands in a terminal (or cmd on Windows):
1) Init the app:
npm init
2) Install the express module:
npm i --save express
3) Install the cors module/middleware:
npm i --save cors
4) Create a public folder and put your html files there
5) Create an app.js file as a sibling of the public folder:
"use strict";
const
express = require('express'),
app = express(),
cors = require('cors');
app.use(cors()); // attach cors middleware (must be set before of most route handlers to populate appropriate headers to response context)
app.use('/', express.static('public'));
app.listen(8080, () => console.log('APP STARTED'));
6) Run it: node app.js
7) Open in browser: http://127.0.0.1:8080
For more, search YouTube for nodejs express tutorials, nodejs MEAN-stack tutorials, and so on (:
For a quick fix, you can also check out the Web Server for Chrome app, which creates a local server giving access to your local files over localhost:
https://chrome.google.com/webstore/detail/web-server-for-chrome/ofhbbkphhbklhfoeikjpcbhemlocgigb

What's the cause of the error 'getaddrinfo EAI_AGAIN'?

My server threw this today, which is a Node.js error I've never seen before:
Error: getaddrinfo EAI_AGAIN my-store.myshopify.com:443
at Object.exports._errnoException (util.js:870:11)
at errnoException (dns.js:32:15)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:78:26)
I'm wondering if this is related to the DynDns DDOS attack which affected Shopify and many other services today. Here's an article about that.
My main question is what does dns.js do? What part of node is it a part of? How can I recreate this error with a different domain?
If you get this error with Firebase Cloud Functions, this is due to the limitations of the free tier (outbound networking only allowed to Google services).
Upgrade to the Flame or Blaze plans for it to work.
EAI_AGAIN is a DNS-lookup-timed-out error, which means it is a network connectivity error or a proxy-related error.
My main question is what does dns.js do?
dns.js is the part of Node that resolves a domain name to its IP address (in brief).
Some more info:
http://www.codingdefined.com/2015/06/nodejs-error-errno-eaiagain.html
If you get this error from within a docker container, e.g. when running npm install inside an alpine container, the cause could be that the network changed since the container was started.
To solve this, just stop and restart the container:
docker-compose down
docker-compose up
Source: https://github.com/moby/moby/issues/32106#issuecomment-578725551
As xerq's excellent answer explains, this is a DNS timeout issue.
I wanted to contribute another possible answer for those of you using Windows Subsystem for Linux - there are some cases where something seems to be askew in the client OS after Windows resumes from sleep. Restarting the host OS will fix these issues (it's also likely restarting the WSL service will do the same).
For those who perform thousands or millions of requests per day and need a solution to this issue:
It's quite normal to get getaddrinfo EAI_AGAIN errors when performing a lot of requests on your server. Node.js itself doesn't perform any DNS caching; it delegates everything DNS-related to the OS.
Keep in mind that every http/https request performs a DNS lookup. This can become quite expensive, and to avoid this bottleneck and the getaddrinfo errors, you can implement a DNS cache.
http.request (and https) accepts a lookup property, which defaults to dns.lookup():
http.get('http://example.com', { lookup: yourLookupImplementation }, response => {
  // do something here with response
});
I strongly recommend using an already-tested module instead of writing a DNS cache yourself, since you'll have to handle TTL correctly, among other things, to avoid hard-to-track bugs.
I personally use cacheable-lookup, which is the one that got uses (see its dnsCache option).
You can use it on specific requests
const http = require('http');
const CacheableLookup = require('cacheable-lookup');
const cacheable = new CacheableLookup();
http.get('http://example.com', { lookup: cacheable.lookup }, response => {
  // Handle the response here
});
or globally
const http = require('http');
const https = require('https');
const CacheableLookup = require('cacheable-lookup');
const cacheable = new CacheableLookup();
cacheable.install(http.globalAgent);
cacheable.install(https.globalAgent);
NOTE: keep in mind that if a request is not performed through Node.js's http/https modules, using .install on the global agent won't have any effect on that request, for example requests made using undici.
The OP's error specifies a host (my-store.myshopify.com).
The error I encountered is the same in all respects except that no domain is specified.
My solution may help others who are drawn here by the title "Error: getaddrinfo EAI_AGAIN"
I encountered the error when trying to serve a NodeJs & VueJs app from a different VM than the one where the code was originally developed.
The file vue.config.js read:
module.exports = {
  devServer: {
    host: 'tstvm01',
    port: 3030,
  },
};
When served on the original machine the start-up output is:
App running at:
- Local: http://tstvm01:3030/
- Network: http://tstvm01:3030/
Using the same settings on a VM tstvm07 got me a very similar error to the one the OP describes:
INFO Starting development server...
10% building modules 1/1 modules 0 activeevents.js:183
throw er; // Unhandled 'error' event
^
Error: getaddrinfo EAI_AGAIN
at Object._errnoException (util.js:1022:11)
at errnoException (dns.js:55:15)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:92:26)
If it ain't already obvious, changing vue.config.js to read ...
module.exports = {
  devServer: {
    host: 'tstvm07',
    port: 3030,
  },
};
... solved the problem.
I started getting this error (different stack trace though) after making a trivial update to my GraphQL API application that is operated inside a docker container. For whatever reason, the container was having difficulty resolving a back-end service being used by the API.
After poking around to see if some change had been made in the docker base image I was building from (node:13-alpine, incidentally), I decided to try the oldest computer science trick of rebooting... I stopped and started the docker container and all went back to normal.
Clearly, this isn't a meaningful solution to the underlying problem - I am merely posting this since it did clear up the issue for me without going too deep down rabbit holes.
I was having this issue with docker-compose. It turned out I had forgotten to add my custom isolated named network to one of my services, so it couldn't be found.
TL;DR: Make sure that, in your compose file, the custom network is defined on both services that need to talk to each other.
My error looked like this: Error: getaddrinfo EAI_AGAIN minio-service. The error came from my server's backend when making a call to minio-service using its hostname, which tells me that the running minio-service was not reachable from my server's running service. The way I fixed it was to change minio-service in my docker-compose from this:
docker-compose.yml
version: "3.8"
# ...
services:
server:
# ...
networks:
my-network:
# ...
minio-service:
# ... (missing networks: section)
# ...
networks:
my-network:
To include my custom isolated named network, like this:
docker-compose.yml
version: "3.8"
# ...
services:
server:
# ...
networks:
my-network:
# ...
minio-service:
# ...
networks:
my-network:
# ...
# ...
networks:
my-network:
More details on docker-compose networking can be found here.
This issue is related to the hosts file setup.
Add the following line to your hosts file:
On Ubuntu (/etc/hosts):
127.0.0.1 localhost
On Windows (c:\windows\System32\drivers\etc\hosts):
127.0.0.1 localhost
In my case the problem was the Docker networks' IP allocation range; see this post for details.
@xerq pointed it out correctly; here's some more reference:
http://www.codingdefined.com/2015/06/nodejs-error-errno-eaiagain.html
I got the same error; I solved it by updating the "hosts" file at this location on Windows:
C:\Windows\System32\drivers\etc
Hope it helps!
In my case, while connected to a VPN, the error happened when running Ubuntu from inside Windows Terminal, but it didn't happen when opening Ubuntu directly from Windows (not from inside Windows Terminal).
I had the same problem with AWS and Serverless. I tried the eu-central-1 region and it didn't work, so I had to change it to us-east-2 for the example.
I was getting this error after I recently added a new network to my docker-compose file.
I initially had these services:
services:
  frontend:
    depends_on:
      - backend
    ports:
      - 3005:3000
  backend:
    ports:
      - 8005:8000
I decided to add a new network which hosts other services I wanted my frontend service to have access to, so I did this:
networks:
  moar:
    name: moar-network
    attachable: true
services:
  frontend:
    networks:
      - moar
    depends_on:
      - backend
    ports:
      - 3005:3000
  backend:
    ports:
      - 8005:8000
Unfortunately, the above made it so that my frontend service was no longer visible on the default network, and only visible in the moar network. This meant that the frontend service could no longer proxy requests to backend, therefore I was getting errors like:
Error occured while trying to proxy to: localhost:3005/graphql/
The solution is to add the default network to the frontend service's network list, like so:
networks:
  moar:
    name: moar-network
    attachable: true
services:
  frontend:
    networks:
      - moar
      - default # here
    depends_on:
      - backend
    ports:
      - 3005:3000
  backend:
    ports:
      - 8005:8000
Now we're peachy!
One last thing, if you want to see which services are running within a given network, you can use the docker network inspect <network_name> command to do so. This is what helped me discover that the frontend service was not part of the default network anymore.
Enabled Blaze and it still doesn't work?
Most probably you need to load .env from the right path: require('dotenv').config({ path: __dirname + './../.env' }); won't work (nor will any other path). Simply put the .env file in the functions directory, from which you deploy to Firebase.

how to run node.js on windows with apache server installed in?

I'm a node.js beginner. Let's say I have an Apache server (XAMPP) and node.js installed at C:\Program Files\nodejs\nodejs.exe on Windows 7.
How can I run node.js alongside my Apache server to try out my code?
I mean, I know how to write node.js code, but I don't know how it runs on my server.
You don't need an Apache server for Node.js.
To create your own Node.js server:
Download and install Node.js
Create a file hello.js:
var http = require("http");
var server = http.createServer().listen(3000); // create the server and listen on port 3000

server.on("request", function(req, res){
  res.writeHead(200, {"Content-Type": "text/plain"});
  // the body shown at http://localhost:3000
  res.write("Hello world");
  res.end();
});

server.on("listening", function(){
  // logged to the console once the server starts listening
  console.log("Listen: 3000...");
});
In a terminal, go to the directory containing hello.js and type:
node hello.js
Open your browser and point it at http://localhost:3000/. This should display a web page that says:
Hello world
A basic HTTP server
Node.js Manual & Documentation
If you'd like a replacement for XAMPP, you should finally take a look at MEAN.io.
At NpmJS.org you will find different solutions for most of your needs.
And, as Reagan Gallant commented, you should take a look at this famous Stack Overflow post (if you need ideas).
NodeSchool is indeed a good entry point for your first steps. After that npmjs will make sense, and finally you will love MEAN.io.
You just make Node use a different port than Apache (for example port 3000, which is the default for express-js and others) -- that is, assuming you don't need the two to work together.
If you do need them to work together, you add a forwarding module to Apache and configure Apache to forward certain URLs to your local Node.js port, as sketched below.
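For example, a minimal sketch with Apache's mod_proxy, assuming mod_proxy and mod_proxy_http are enabled, the Node app listens on port 3000, and the /node path prefix is an arbitrary choice:
# forward http://yourhost/node/... to the Node.js app on port 3000
ProxyPass /node/ http://localhost:3000/
ProxyPassReverse /node/ http://localhost:3000/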

Can a proxy (like fiddler) be used with Node.js's ClientRequest

Can node.js be set up to recognize a proxy (like Fiddler, for example) and route all ClientRequests through the proxy?
I am using node on Windows and would like to debug HTTP requests much like I would use Fiddler for JavaScript in the browser.
Just to be clear, I am not trying to create a proxy, nor to proxy requests received by a server. I want to route requests made by http.request() through a proxy. I would like to use Fiddler to inspect both the request and the response, as I would if I were performing the request in a browser.
I find the following nifty: the request module reads proxy information from the Windows environment variables.
Typing the following in the Windows command prompt will set them for the lifetime of the shell; you just have to run your node app from that shell. (Note that NODE_TLS_REJECT_UNAUTHORIZED=0 disables TLS certificate validation, so that Fiddler's self-signed certificate is accepted.)
set https_proxy=http://127.0.0.1:8888
set http_proxy=http://127.0.0.1:8888
set NODE_TLS_REJECT_UNAUTHORIZED=0
To route your client requests via Fiddler, alter your options object like this (e.g. just before you create the http.request):
options.path = 'http://' + options.host + ':' + options.port + options.path;
options.headers.host = options.host;
options.host = '127.0.0.1';
options.port = 8888;

var myReq = http.request(options, function (result) {
  ...
});
If you want to monitor outgoing requests from node, you can use the request module and just set the proxy property in the options, like this:
request.post('http://204.145.74.56:3003/test', {
  headers: { 'content-type': 'application/octet-stream' },
  body: buf,
  proxy: 'http://127.0.0.1:8888'
}, function() {
  // callback
});
8888 is the default port of Fiddler.
process.env.https_proxy = "http://127.0.0.1:8888";
process.env.http_proxy = "http://127.0.0.1:8888";
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";
Answering my own question: according to https://github.com/joyent/node/issues/1514, the answer is no, but you can use the request module (http://search.npmjs.org/#/request), which does support proxies.
If you want to configure a proxy in the general case, the other answers are right: you need to manually configure that for the library you're using as node intentionally ignores your system proxy settings out of the box.
If, however, you're simply looking for a Fiddler-like HTTP debugging tool for Node.js, I've been working for a little while on an open-source project to do this (with built-in node support) called HTTP Toolkit. It lets you:
Open a terminal from the app with one click
Start any node CLI/server/script from that terminal
All the HTTP or HTTPS requests it sends get proxied automatically, so you can see and rewrite everything. No code changes or npm packages necessary.
[Demo: HTTP Toolkit debugging a bunch of NPM, node & browser traffic]
Internally, the way this works is that it injects an extra JS script into started Node processes, which hooks into require() to automatically reconfigure proxy settings for you, for every module which doesn't use the global settings.
