Electron doesn't execute HTTPS call to local Spring server, curl works

I'm simply trying to connect to a local Spring server configured with SSL/TLS 1.2.
Context (server): the server is built with Spring and logs requests when they are made via curl. It also requires a certificate from the client (X.509 certificate-based authentication).
Client (curl, command line): works fine!
Client (Electron): when the code below is executed, nothing happens. No requests reach the Spring server (nothing is logged), and no error occurs. Nothing. Electron is such a drag to debug...
Code:
const fs = require('fs');
const https = require('https');

let options = {
    hostname: 'localhost',
    port: 8443,
    path: '/',
    cert: fs.readFileSync(global.relativePaths.config + 'client.crt'),
    key: fs.readFileSync(global.relativePaths.config + 'clientprivate.key'),
    passphrase: '[phrase_here]',
    rejectUnauthorized: false,
    requestCert: true
};

let request = https.request(options);
request.on('error', () => {
    console.log("Error!");
});
Also, this is just localhost, hence the rejectUnauthorized - I'm working with self-signed certificates until ready for production. :)
Thank you in advance. :)
EDIT:
Using the test code from the wiki (a call to github) works with no problem... what could it be?
Also, I'm doing this in the main process (not in the renderer).

Try using an HTTP client library like axios, a promise-based HTTP client for the browser and Node.js. It's worth giving widely used libraries like axios a try; it's simple to use. If the issue still persists, try setting the User-Agent header in your request.
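For illustration, a minimal sketch of the same mutual-TLS request done with axios; the certificate paths and passphrase placeholder are copied from the question, and the client-certificate options go on a custom https.Agent:

const fs = require('fs');
const https = require('https');
const axios = require('axios');

// The client certificate and key are supplied through a custom agent.
const agent = new https.Agent({
    cert: fs.readFileSync(global.relativePaths.config + 'client.crt'),
    key: fs.readFileSync(global.relativePaths.config + 'clientprivate.key'),
    passphrase: '[phrase_here]',
    rejectUnauthorized: false // self-signed certificate during development
});

axios.get('https://localhost:8443/', { httpsAgent: agent })
    .then(response => console.log(response.status, response.data))
    .catch(error => console.error('Error!', error.message));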

Related

CRA with manual proxy config not sending the correct cookie

I have a create-react-app dev server running on http://localhost:3000. This dev server needs to fetch data from a remote server on a different domain, https://mydomain.in. To achieve this, I have the below proxy config in my setupProxy.js file:
const { createProxyMiddleware } = require('http-proxy-middleware');

const CLOUD_URL = 'https://mydomain.in';

module.exports = function (app) {
    app.use(
        '/api',
        createProxyMiddleware({
            target: CLOUD_URL,
            changeOrigin: true,
            secure: false,
            ws: true,
            cookieDomainRewrite: 'localhost',
            logLevel: 'debug'
        })
    );
};
The requests are being proxied, but the cookie being sent is not the target's cookie. The response to all the requests is the login page of my remote server. Please suggest if I am missing something. Thank you so much in advance!
Note - I have logged in to my application in a different tab of the same browser as the dev server. The remote server has CORS enabled.
In order for the cookie rewriting to work properly, you need the cookie to be present on localhost. You can use an extension like EditThisCookie to set it up.
Once the cookie exists on localhost, cookieDomainRewrite should begin working and your request should no longer hit the login page.
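As a side note, cookieDomainRewrite also accepts an object that maps specific cookie domains to new ones; a sketch of what that could look like in the config from the question (the mapping values are assumptions for illustration):

createProxyMiddleware({
    target: CLOUD_URL,
    changeOrigin: true,
    cookieDomainRewrite: {
        'mydomain.in': 'localhost', // re-scope the remote server's cookies
        '*': ''                     // strip the domain from all other cookies
    }
})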

NodeJS keeps throwing `UNABLE_TO_VERIFY_LEAF_SIGNATURE` no matter what

I have a basic Node.js application that connects to a server via a secured websocket (ws library) connection. When connecting, Node.js keeps throwing a UNABLE_TO_VERIFY_LEAF_SIGNATURE error. I have no control over the server; it's external. The endpoint starts with wss:// (not sure if it makes a difference).
I have tried so many things, but none of them seem to work:
- rejectUnauthorized: false works, but the connection breaks after some time with code 1006
- export NODE_EXTRA_CA_CERTS=[your CA certificate file path] doesn't work
- npm config set cafile [your CA certificate file path] doesn't work
- the npm modules syswide-cas and node-ssl-root-cas don't work
- adding ca: [fs.readFileSync("ca.ca-bundle", {encoding: 'utf-8'})] to the socket options doesn't work
However, when I navigate to the endpoint in a browser, the connection is valid and secured. I tried extracting the certificates from the browser and including them in Node.js: no results, same error. The same goes for the certificates I got from https://whatsmychaincert.com/ ...
Basic code:
this.websocket = new WebSocket(this.apiUrl, this.applicationId, {
    rejectUnauthorized: true,
});
...
this.websocket.on('open', () => {
    console.log('Connection successfully opened');
});
this.websocket.on('error', error => {
    console.log(error);
});
No matter what I do, it seems that regardless of the included certificates, Node.js keeps throwing the UNABLE_TO_VERIFY_LEAF_SIGNATURE error.
Does anybody have an idea what is happening?
After days and days of trying, I finally managed to solve this issue!
I was using the wrong certificate chain the whole time. The endpoint I'm connecting to has a wss:// prefix, but I was getting the certificates from the URL with the https:// prefix. Apparently that has a different certificate chain.
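A minimal sketch of what the working setup looks like once the right chain is in hand; the bundle file name is an assumption, and the ws connection options are forwarded to tls.connect(), so the CA chain can be passed directly:

const fs = require('fs');
const WebSocket = require('ws');

// Use the chain extracted from the wss:// endpoint itself,
// not the one served on the https:// URL.
this.websocket = new WebSocket(this.apiUrl, this.applicationId, {
    ca: [fs.readFileSync('wss-endpoint.ca-bundle')], // hypothetical file name
    rejectUnauthorized: true,
});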

Connecting SOCKET.IO-client (angular) to SOCKET.IO-server (node.js) over https net::ERR_CONNECTION_CLOSED

Our website has been running on an internal test machine where it could be accessed by all computers inside the network.
Now we want to deploy this website on a webserver (Apache2) to make it available to our clients. We want to use HTTPS, and this is where we encountered a problem.
The Socket.io client can't connect to the node.js server since switching to HTTPS. We are using a signed certificate from a trusted CA. I have tried every solution I could find, but none seem to work for our case.
Constructor used with ngx-socket-io in angular:
constructor() {
    super({
        url: 'https://mywebPage.com:8080',
        options: { secure: true, origin: '*', transport: ['websocket'] }
    });
}
Our certificate seems to be valid since it works for our Angular page. We are also able to make HTTPS GET/POST requests to our API, which is located on the same server.
node.js socket.io server code:
var fs = require('fs');

var options = {
    key: fs.readFileSync('/etc/apache2/folder/certificate.com.key'),
    cert: fs.readFileSync('/etc/apache2/folder/certificate.com.public.crt'),
    ca: fs.readFileSync('/etc/apache2/folder/certificate-intermediate.crt'),
    requestCert: true
};

let server = require('https').createServer(options);
let io = require('socket.io')(server);
server.listen(8080);
console.log("Server started on Port 8080");
The client tries to connect to the socket server but fails and gets net::ERR_CONNECTION_CLOSED; the rest of the web page loads fine and has a valid certificate.
We have also tested whether the port on the web server is accessible, and it seems to be open in netstat and nmap.
If any more information is needed, I am glad to provide it.
EDIT: I have tested the setup with rejectUnauthorized: false on the client side, but that does not change the error.
Similar Stack Overflow questions which I have considered in solving the problem:
socket.io net::ERR_CONNECTION_CLOSED
Setup Server-Server SSL communication using socket.io in node.js
EDIT 2: I added requestCert: false, rejectUnauthorized: false to my node.js options. The previous error has been resolved, but now I get:
error during WebSocket handshake: Unexpected response code: 400
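For reference, a sketch of the adjusted server options described in EDIT 2 (same certificate files as above; only the last two flags changed):

var options = {
    key: fs.readFileSync('/etc/apache2/folder/certificate.com.key'),
    cert: fs.readFileSync('/etc/apache2/folder/certificate.com.public.crt'),
    ca: fs.readFileSync('/etc/apache2/folder/certificate-intermediate.crt'),
    requestCert: false,       // browsers do not present a client certificate by default
    rejectUnauthorized: false
};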

How can I properly proxy requests to /api/ressource in prod?

I have a webpack dev configuration with my front-end dev server running on port 8080 and my backend server running on port 3000.
So in dev mode, my webpack dev server is configured as follows:
proxy: {
    '/api': 'http://localhost:3000',
}
How can I do the same thing on the prod server that serves the built static files of my front end?
I have the following code for my prod server that serves the static files of my front end:
const express = require('express');
const path = require('path');
const proxy = require('http-proxy-middleware');

const app = express();

app.use(express.static(dir));

/**
 * Redirect everything that starts with /api/* to the backend REST server.
 */
app.use('/api', proxy({ target: backendUrl }));

app.get('*', (req, res) => {
    res.sendFile(path.resolve(dir + '/index.html'));
});
This is not working, as the cookies seem to be lost in the proxying (unlike with the webpack proxying, where everything works).
Am I going about this problem the correct way?
In this case, you can create a reverse proxy which receives all the information from the frontend, makes the requests to the other address, and then returns the proper answer to the frontend. I used to develop a lot of these back in the day; there is a package that I created which can help you.
Basically the flow is:
Frontend -> endpoint on your render server (port 8080) -> backend (port 3000) -> render server (port 8080) -> frontend
You can try using:
- A server (e.g. nginx) as a reverse proxy.
- node-http-proxy as a reverse proxy.
- A vhost middleware if each domain is served from the same Express codebase and Node.js instance.
You may also want to check your cookie flags (changeOrigin, secure, cookieDomainRewrite, etc.), as in the sketch below.
Note: if running over plain HTTP on localhost, the cookie will not be set if the Secure flag is present in the response.
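A sketch of how those flags could be applied to the prod server from the question; newer versions of http-proxy-middleware export createProxyMiddleware instead of the default proxy function used above, backendUrl is the variable from the question, and the flag values here are assumptions for illustration:

const { createProxyMiddleware } = require('http-proxy-middleware');

app.use('/api', createProxyMiddleware({
    target: backendUrl,
    changeOrigin: true,      // rewrite the Host header to the target
    cookieDomainRewrite: '', // re-scope Set-Cookie domains to the requesting host
}));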

Express.js piping request to php built-in server on localhost results in ECONNREFUSED

I am building a simple website using npm for development, and it is hosted with a provider with PHP support.
The only functionality that uses PHP is the contact form that sends email; the rest is plain HTML and JavaScript.
In development I use PHP's simple built-in server, started with php -S localhost:8000, to test a simple PHP email script, and I reverse-proxy requests for email.php to this local PHP server.
The Node app is on port 3000 and the PHP server is on port 8000. The problem is that I get a connection refused error with the following Express server configuration when a request goes through localhost:3000/email.php:
#!/usr/bin/env node
var express = require('express');
var app = express(),
    request = require('request'),
    port = +(process.env.PORT || 3000);

app.set('case sensitive routing', false);

app.post('/email.php', function (req, res) {
    req.pipe(request({
        url: 'http://localhost:8000/email.php',
        qs: req.query,
        method: req.method
    }, function (error) {
        // error is null when the proxied request succeeds
        if (error && error.code === 'ECONNREFUSED') {
            console.error('Refused connection');
        } else if (error) {
            throw error;
        }
    })).pipe(res);
});

// other request handlers here

app.listen(port, function () {
    console.log('listening');
});
The PHP server is definitely up and serving all the pages on port 8000, which I can browse with the browser. I tested it with curl, and it handles the request just fine when posted directly to localhost:8000.
Not sure why I get this error; I'm scratching my head and can't think of any reason.
Help is much appreciated.
I figured out what it was, d'oh! Well, I am going to post the answer in case someone else stumbles upon this.
PHP is to blame, it seems. Checking the sockets listening on a port with ss -ltn (I am on Linux; this might not work for you), I realised the PHP server is listening on IPv6 only. Relevant output as follows:
State Recv-Q Send-Q Local Address:Port Peer Address:Port
LISTEN 0 128 ::1:8000
With a relevant search I found the answer on the PHP web server documentation page, in the user-contributed notes. See the post here. The solution is to use 127.0.0.1 rather than localhost:
As it turned out, if you started the php server with "php -S localhost:80" the server will be started with ipv6 support only! To access it via ipv4, you need to change the start up command like so: "php -S 127.0.0.1:80" which starts the server in ipv4 mode only.
