Do you have any idea why the cookie is not set in the client? It is sent from the backend:
func loginEmail(_ req: Request) throws -> Response
{
    let response = Response(status: .ok)
    let cookie = HTTPCookies.Value(string: "abcdef")
    response.cookies["userId2"] = cookie
    return response
}
It is visible in the browser's Network tab:
set-cookie: userId2=abcdef; Path=/; SameSite=Lax
but not in the Application tab.
A GET request is sent to the backend. The backend runs on port 8080, the frontend on port 3000.
I use axios in a React / Next.js app to call the endpoint:
const login = () => {
  axios
    .get(`http://localhost:8080/loginEmail`)
    .then((res) => {})
    .catch((err) => console.error(err));
};
I am using Vapor as the backend and have the following configuration, which may be relevant:
app.middleware.use(CORSMiddleware(configuration: .init(
    allowedOrigin: .originBased,
    allowedMethods: [.GET, .POST, .PUT, .OPTIONS, .DELETE, .PATCH],
    allowedHeaders: [.accept, .authorization, .contentType, .origin, .xRequestedWith, .userAgent, .accessControlAllowOrigin, .init("crossDomain"), .accessControlAllowCredentials, .xRequestedWith]
)))

app.sessions.configuration.cookieName = "userId2"

// Configures cookie value creation.
app.sessions.configuration.cookieFactory = { sessionID in
    print("sessionID.string: \(sessionID.string)")
    return .init(string: sessionID.string, isSecure: false)
}

app.middleware.use(app.sessions.middleware)
I'd like to send HTTP/2 requests to a server via a proxy using Node.js's http2 library.
I'm using Charles v4.2.7 as a proxy for testing purposes, but Charles is not able to proxy the request. Charles shows Malformed request URL "*" errors, as the request it receives is PRI * HTTP/2.0 (the HTTP/2 Connection Preface). I can successfully send HTTP/2 requests via my Charles proxy using cURL (e.g. curl --http2 -x localhost:8888 https://cypher.codes), so I don't think this is an issue with Charles, but rather an issue with my Node.js implementation.
Here's my Node.js HTTP/2 client implementation which tries to send a GET request to https://cypher.codes via my Charles proxy listening at http://localhost:8888:
const http2 = require('http2');

const client = http2.connect('http://localhost:8888');
client.on('error', (err) => console.error(err));

const req = client.request({
  ':scheme': 'https',
  ':method': 'GET',
  ':authority': 'cypher.codes',
  ':path': '/',
});

req.on('response', (headers, flags) => {
  for (const name in headers) {
    console.log(`${name}: ${headers[name]}`);
  }
});

req.setEncoding('utf8');
let data = '';
req.on('data', (chunk) => { data += chunk; });
req.on('end', () => {
  console.log(`\n${data}`);
  client.close();
});
req.end();
Here's the Node.js error I get when running node proxy.js (proxy.js is the file containing the above code):
events.js:200
throw er; // Unhandled 'error' event
^
Error [ERR_HTTP2_ERROR]: Protocol error
at Http2Session.onSessionInternalError (internal/http2/core.js:746:26)
Emitted 'error' event on ClientHttp2Stream instance at:
at emitErrorNT (internal/streams/destroy.js:92:8)
at emitErrorAndCloseNT (internal/streams/destroy.js:60:3)
at processTicksAndRejections (internal/process/task_queues.js:81:21) {
code: 'ERR_HTTP2_ERROR',
errno: -505
}
I reran the above cURL request with verbose output and it looks like cURL first sends a CONNECT to the proxy using HTTP/1, before sending the GET request using HTTP/2.
$ curl -v --http2 -x localhost:8888 https://cypher.codes
* Trying ::1...
* TCP_NODELAY set
* Connected to localhost (::1) port 8888 (#0)
* allocate connect buffer!
* Establish HTTP proxy tunnel to cypher.codes:443
> CONNECT cypher.codes:443 HTTP/1.1
> Host: cypher.codes:443
> User-Agent: curl/7.64.1
> Proxy-Connection: Keep-Alive
>
< HTTP/1.1 200 Connection established
<
* Proxy replied 200 to CONNECT request
* CONNECT phase completed!
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/cert.pem
CApath: none
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* CONNECT phase completed!
* CONNECT phase completed!
* TLSv1.2 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256
* ALPN, server accepted to use h2
* Server certificate:
* subject: CN=cypher.codes
* start date: Jun 21 04:38:35 2020 GMT
* expire date: Sep 19 04:38:35 2020 GMT
* subjectAltName: host "cypher.codes" matched cert's "cypher.codes"
* issuer: CN=Charles Proxy CA (8 Oct 2018, mcypher-mbp.local); OU=https://charlesproxy.com/ssl; O=XK72 Ltd; L=Auckland; ST=Auckland; C=NZ
* SSL certificate verify ok.
* Using HTTP2, server supports multi-use
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
* Using Stream ID: 1 (easy handle 0x7ff50d00d600)
> GET / HTTP/2
> Host: cypher.codes
> User-Agent: curl/7.64.1
> Accept: */*
>
...
I'd like to try doing the same via Node.js (first sending an HTTP/1 CONNECT request and then sending my HTTP/2 request on the same TCP connection), but I'm not sure how to do this. The very act of creating the HTTP/2 client (i.e. http2.connect('http://localhost:8888')) sends the HTTP/2 Connection Preface. I thought about first creating a connection using HTTP/1 (e.g. using the http library) and then upgrading it to HTTP/2, but I couldn't find any examples of how to do this.
Could someone help me send an HTTP/2 request via a proxy using Node.js?
Update (2020-07-13): I made more progress towards first creating a connection using HTTP/1, sending a CONNECT request, and then trying to send a GET request using HTTP/2 over the same socket. I can see the CONNECT request come through in Charles, but not the additional GET request, which indicates that I'm still doing something wrong when trying to use the same socket for HTTP/2 requests. Here's my updated code:
const http = require('http');
const http2 = require('http2');

const options = {
  hostname: 'localhost',
  port: 8888,
  method: 'CONNECT',
  path: 'cypher.codes:80',
  headers: {
    Host: 'cypher.codes:80',
    'Proxy-Connection': 'Keep-Alive',
    'Connection': 'Keep-Alive',
  },
};

const connReq = http.request(options);
connReq.end();

connReq.on('connect', (_, socket) => {
  const client = http2.connect('https://cypher.codes', {
    createConnection: () => { return socket },
  });
  client.on('connect', () => console.log('http2 client connect success'));
  client.on('error', (err) => console.error(`http2 client connect error: ${err}`));

  const req = client.request({
    ':path': '/',
  });
  req.setEncoding('utf8');
  req.on('response', (headers, flags) => {
    let data = '';
    req.on('data', (chunk) => { data += chunk; });
    req.on('end', () => {
      console.log(data);
      client.close();
    });
  });
  req.end();
});
To tunnel HTTP/2 through a proxy that doesn't understand it, you need to use HTTP/1.1 for the initial connection, and then use HTTP/2 only in the tunnel. Your code uses HTTP/2 right from the start, which isn't going to work.
To actually make that tunnel, you first send an HTTP CONNECT request for the target host and receive a 200 response; from then on, everything else sent on that connection is forwarded back and forth between you and the target host.
Once you have that tunnel working, you can send HTTP/2 (or anything else the target server understands) and it'll go straight to your target.
The code to do that in node looks like this:
const http = require('http');
const http2 = require('http2');
const tls = require('tls'); // needed for tls.connect below

// Build an HTTP/1.1 CONNECT request for a tunnel:
const req = http.request({
  method: 'CONNECT',
  host: '127.0.0.1',
  port: 8888,
  path: 'cypher.codes'
});
req.end(); // Send it

req.on('connect', (res, socket) => {
  // When you get a successful response, use the tunnelled socket
  // to make your new request.
  const client = http2.connect('https://cypher.codes', {
    // Use your existing socket, wrapped with TLS for HTTPS:
    createConnection: () => tls.connect({
      socket: socket,
      ALPNProtocols: ['h2']
    })
  });

  // From here, use 'client' to do HTTP/2 as normal through the tunnel
});
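To round that out, here's a minimal sketch of that last step (my addition, mirroring the request code from the question rather than anything specific to this answer), placed inside the 'connect' handler where client is in scope:

// Named differently from the outer CONNECT `req` to avoid shadowing it:
const tunnelledReq = client.request({ ':path': '/' });
tunnelledReq.setEncoding('utf8');

let body = '';
tunnelledReq.on('response', (headers) => console.log(`status: ${headers[':status']}`));
tunnelledReq.on('data', (chunk) => { body += chunk; });
tunnelledReq.on('end', () => {
  console.log(body);
  client.close();
});
tunnelledReq.end();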
I've recently been working on the internals of my own tool as well, adding full HTTP/2 support for proxying, and writing that up, which is probably very relevant for you. The tests for that in https://github.com/httptoolkit/mockttp/blob/h2/test/integration/http2.spec.ts have more and larger examples of tunnelling HTTP/2 in Node like this, so those are definitely worth a look too. That's all still under development of course, so let me know if you have any questions or find any mistakes there.
#TimPerry's answer almost worked for me, but it missed a couple of things: authentication and how to avoid a TLS certificate error.
So here is my updated version:
const http = require('http');
const http2 = require('http2');
const tls = require('tls');

// Build a HTTP/1.1 CONNECT request for a tunnel:
const username = '...';
const password = '...';

const req = http.request({
  method: 'CONNECT',
  host: '127.0.0.1',
  port: 8888,
  path: 'website.com', // the destination domain
  headers: { // this is how we authorize the proxy, skip it if you don't need it
    'Proxy-Authorization': 'Basic ' + Buffer.from(username + ':' + password).toString('base64')
  }
});
req.end(); // Send it

req.on('connect', (res, socket) => {
  // When you get a successful response, use the tunnelled socket to make your new request
  const client = http2.connect('https://website.com', {
    createConnection: () => tls.connect({
      host: 'website.com', // this is necessary to avoid certificate errors
      socket: socket,
      ALPNProtocols: ['h2']
    })
  });

  // From here, use 'client' to do HTTP/2 as normal through the tunnel
});
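One extra wrinkle worth noting (my addition, not part of this answer): if the proxy re-signs TLS with its own root certificate, as Charles does in the original question, Node may still reject the certificate even with the host option set. Assuming you have exported the proxy's root certificate to a file (the charles-root.pem path below is hypothetical), you can pass it to tls.connect via the ca option instead of disabling verification:

const fs = require('fs');

// Inside the 'connect' handler above, replacing the tls.connect call:
const client = http2.connect('https://website.com', {
  createConnection: () => tls.connect({
    host: 'website.com',
    socket: socket,
    ALPNProtocols: ['h2'],
    ca: [fs.readFileSync('charles-root.pem')] // hypothetical path to the proxy's exported root CA
  })
});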
I was getting "ERR_HTTP2_ERROR" and "ERR_HTTP2_PROTOCOL_ERROR" errors with #Stalinko's answer, so I needed to find an alternative.
To demonstrate my solution, we will make a request to an API that returns your IP as JSON; you can then adapt it to your needs.
Here is the code:
const http = require('http')
const http2 = require('http2')

/**
 * A URL without the path.
 */
const TARGET_AUTHORITY = 'https://api4.my-ip.io'

/**
 * You should use the host with the port matching the protocol:
 * HTTP  => 80
 * HTTPS => 443
 */
const TARGET_HOST = 'api4.my-ip.io:443'

/**
 * Proxy configuration
 */
const PROXY_HOST = '<your_proxy_host>'
const PROXY_PORT = '<your_proxy_port>'
const PROXY_USERNAME = '<your_proxy_username>'
const PROXY_PASSWORD = '<your_proxy_password>'

/**
 * Establishes a connection to the target server through the HTTP/1.x
 * proxy server.
 *
 * The CONNECT method tells the PROXY server where this connection should arrive.
 *
 * After the connection is established you will be able to use the TCP socket to send data
 * to the TARGET server.
 */
const request = http.request({
  method: 'CONNECT',
  host: PROXY_HOST,
  port: PROXY_PORT,
  path: TARGET_HOST,
  headers: {
    'Host': TARGET_HOST,
    'Proxy-Authorization': `Basic ${Buffer.from(`${PROXY_USERNAME}:${PROXY_PASSWORD}`).toString('base64')}`
  }
})

/**
 * Waits for the "connect" event and then uses the TCP socket to carry the HTTP/2.0 connection through.
 */
request.on('connect', (res, socket) => {
  /**
   * Check if it has successfully connected to the server
   */
  if (res.statusCode !== 200)
    throw new Error('Connection rejected by the proxy')

  /**
   * Uses the TCP socket from the HTTP/1.x CONNECT request as the socket for this new connection,
   * without the need to establish the TLS connection manually and handle its errors
   * manually too.
   *
   * This method accepts all TCP and TLS options.
   */
  const client = http2.connect(TARGET_AUTHORITY, { socket })

  client.on('connect', () => {
    console.log('Connected to the page!')
  })

  /**
   * Request to check your IP
   */
  const req = client.request({
    ':path': '/ip.json',
  })

  req.on('response', (headers) => {
    console.log('Received a response')
  })

  /**
   * Stores the data received as a response
   */
  const buffers = []
  req.on('data', (buffer) => {
    buffers.push(buffer)
  })

  req.on('end', () => {
    console.log(Buffer.concat(buffers).toString('utf-8'))

    // Closes the connection with the server
    client.close()
  })

  req.end()
})

request.end()
Instead of creating a TLS socket, I just inject my TCP socket into the HTTP/2.0 client.
The socket option is not explicitly listed in the method documentation, but the method accepts all net.connect() and tls.connect() options.
You can find all the documentation about the http2.connect method here: HTTP/2 Node.js documentation
I am working on a speech-to-text web app using the IBM Watson Speech to Text API. The API is called on the click of a button, but whenever I click the button I get the above-mentioned error. I have stored my API key and URL in a .env file.
I have tried a lot but keep getting this error. Please help me out, as I am new to all this.
I got server.js from the Watson GitHub repo.
Server.js
'use strict';
/* eslint-env node, es6 */

const env = require('dotenv');
env.config();

const express = require('express');
const app = express();
const AuthorizationV1 = require('watson-developer-cloud/authorization/v1');
const SpeechToTextV1 = require('watson-developer-cloud/speech-to-text/v1');
const TextToSpeechV1 = require('watson-developer-cloud/text-to-speech/v1');
const vcapServices = require('vcap_services');
const cors = require('cors');

// allows environment properties to be set in a file named .env

// on bluemix, enable rate-limiting and force https
if (process.env.VCAP_SERVICES) {
  // enable rate-limiting
  const RateLimit = require('express-rate-limit');
  app.enable('trust proxy'); // required to work properly behind Bluemix's reverse proxy

  const limiter = new RateLimit({
    windowMs: 15 * 60 * 1000, // 15 minutes
    max: 100, // limit each IP to 100 requests per windowMs
    delayMs: 0 // disable delaying - full speed until the max limit is reached
  });

  // apply to /api/*
  app.use('/api/', limiter);

  // force https - microphone access requires https in Chrome and possibly other browsers
  // (*.mybluemix.net domains all have built-in https support)
  const secure = require('express-secure-only');
  app.use(secure());
}

app.use(express.static(__dirname + '/static'));
app.use(cors());

// token endpoints
// **Warning**: these endpoints should probably be guarded with additional authentication & authorization for production use

// speech to text token endpoint
var sttAuthService = new AuthorizationV1(
  Object.assign(
    {
      iam_apikey: process.env.SPEECH_TO_TEXT_IAM_APIKEY, // if using an RC service
      url: process.env.SPEECH_TO_TEXT_URL ? process.env.SPEECH_TO_TEXT_URL : SpeechToTextV1.URL
    },
    vcapServices.getCredentials('speech_to_text') // pulls credentials from environment in bluemix, otherwise returns {}
  )
);

app.use('/api/speech-to-text/token', function(req, res) {
  sttAuthService.getToken(function(err, token) {
    if (err) {
      console.log('Error retrieving token: ', err);
      res.status(500).send('Error retrieving token');
      return;
    }
    res.send(token);
  });
});

const port = process.env.PORT || process.env.VCAP_APP_PORT || 3002;
app.listen(port, function() {
  console.log('Example IBM Watson Speech JS SDK client app & token server live at http://localhost:%s/', port);
});

// Chrome requires https to access the user's microphone unless it's a localhost url so
// this sets up a basic server on port 3001 using an included self-signed certificate
// note: this is not suitable for production use
// however bluemix automatically adds https support at https://<myapp>.mybluemix.net
if (!process.env.VCAP_SERVICES) {
  const fs = require('fs');
  const https = require('https');
  const HTTPS_PORT = 3001;

  const options = {
    key: fs.readFileSync(__dirname + '/keys/localhost.pem'),
    cert: fs.readFileSync(__dirname + '/keys/localhost.cert')
  };

  https.createServer(options, app).listen(HTTPS_PORT, function() {
    console.log('Secure server live at https://localhost:%s/', HTTPS_PORT);
  });
}
App.js
import React, { Component } from 'react';
import 'tachyons';
//import WatsonSpeech from 'ibm-watson';
var recognizeMic = require('watson-speech/speech-to-text/recognize-microphone');

class App extends Component {
  onListenClick = () => {
    fetch('http://localhost:3002/api/speech-to-text/token')
      .then(function(response) {
        return response.text();
      }).then(function(token) {
        var stream = recognizeMic({
          token: token, // use `access_token` as the parameter name if using an RC service
          objectMode: true, // send objects instead of text
          extractResults: true, // convert {results: [{alternatives:[...]}], result_index: 0} to {alternatives: [...], index: 0}
          format: false // optional - performs basic formatting on the results such as capitals and periods
        });
        stream.on('data', function(data) {
          console.log('error 1')
          console.log(data);
        });
        stream.on('error', function(err) {
          console.log('error 2')
          console.log(err);
        });
        //document.querySelector('#stop').onclick = stream.stop.bind(stream);
      }).catch(function(error) {
        console.log('error 3')
        console.log(error);
      });
  }

  render() {
    return (
      <div>
        <h2 className="tc"> Hello, and welcome to Watson Speech to text api</h2>
        <button onClick={this.onListenClick}>Listen to Microphone</button>
      </div>
    );
  }
}

export default App;
Since the only code you show is fetching an authorisation token, I guess that is what is throwing the authentication failure. I am not sure how old the code you are using is, but the mechanism you are using dates from when the STT service credentials were a userid / password pair. That mechanism became unreliable when IAM keys started to be used.
Your sample is still using watson-developer-cloud, but that has been superseded by ibm-watson. As migrating the code to ibm-watson will take a lot of rework, you can continue to use watson-developer-cloud.
If you do stick with watson-developer-cloud and want to get hold of a token with an IAM key, then use:
const AuthIAMV1 = require('ibm-cloud-sdk-core/iam-token-manager/v1');
...
const tokenService = new AuthIAMV1.IamTokenManagerV1({ iamApikey: apikey });
...
tokenService.getToken((err, res) => {
  if (err) {
    ...
  } else {
    token = res;
    ...
  }
});
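To tie that back to the server.js above, a minimal sketch (my adaptation, not code from this answer) of swapping the token manager into the existing /api/speech-to-text/token endpoint could look like this; the client would then pass the result as access_token rather than token, as noted in the comment in App.js:

const AuthIAMV1 = require('ibm-cloud-sdk-core/iam-token-manager/v1');

// Create the token manager from the IAM API key already used in server.js:
const tokenService = new AuthIAMV1.IamTokenManagerV1({
  iamApikey: process.env.SPEECH_TO_TEXT_IAM_APIKEY
});

// Replace the body of the existing token endpoint:
app.use('/api/speech-to-text/token', function(req, res) {
  tokenService.getToken(function(err, token) {
    if (err) {
      console.log('Error retrieving token: ', err);
      return res.status(500).send('Error retrieving token');
    }
    res.send(token); // an IAM access token; pass it as `access_token` on the client
  });
});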
I use Helmet with Express to set quite a few security HTTP headers from the server side. This works nicely when rendering client pages on top of the Node.js app, using:
var app = express();
app.use(helmet());
..
res.render("pages/index", data);
All the resources on the index page will have the Helmet headers. Unfortunately, socket.io does its own header management, so anything served under /socket.io/ will have its own (insecure) headers. For example:
<https_path>/socket.io/socket.io.js
<https_path>/socket.io/?EIO=3&transport=polling&t=Lj4CFnj&sid=ILskOFWbHUaU6grTAAAA
Hence, I want to set custom headers for all socket.io items manually.
This is how I require socket.io (excerpt only):
/src/app.js
var express = require("express");
var sio = require("socket.io");
var app = express();
var io = require("./../lib/io.js").initialize(app.listen(REST_PORT, () => {
  logger.info("Application ready on port " + REST_PORT + " . Environment: " + NODE_ENV);
}));
/lib/io.js
exports = module.exports = {};
var sio = require("socket.io");

exports.initialize = function(server) {
  var options = {
    cookie: false,
    extraHeaders: {
      "X-Custom-Header-For-My-Project": "Custom stuff",
    }
  };
  var io = sio(server, options);
  io.on("connection", function(socket) {
    // logic
  });
  return io;
};
The "extraHeaders" option doesn´t work, I guess it could only with socket.io-client. I did large amount of googling around, but not luck on this.
Also looked around how to use socket.request (apparently it helps with headers, according to: here), but I couldn´t figure that out either.
Could you guys help?
The extraHeaders option will work as below: remove "transports: ['polling']" if you are using it, and use the pattern below instead. This worked for me, and I was able to send custom headers.
Package used: "socket.io-client": "^2.2.0"
this.socket = io(environment.host, {
  path: `/api/backend/socket.io`,
  origins: '*:*',
  // transports: ['polling'],
  transportOptions: {
    polling: {
      extraHeaders: {
        'authorization': token,
        'user-id': userId
      }
    }
  }
})
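On the server side, headers sent this way arrive with the polling handshake, so (as a rough sketch, assuming a socket.io v2 server like the one in the question) you can read them from socket.handshake.headers in your connection handler:

io.on("connection", function(socket) {
  // Headers sent via transportOptions.polling.extraHeaders show up here:
  var token = socket.handshake.headers["authorization"];
  var userId = socket.handshake.headers["user-id"];
  console.log("client connected with user-id:", userId);
});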
Ref: https://socket.io/docs/client-api/#With-extraHeaders
Using the Node http module (native modules only), how can I recreate app.listen() and app.get()? I want to use the http module with a constructor:
var app = function(opts) {
  this.token = opts.token
}

app.prototype.get = function(callback) {
  // use request and response of app.listen()
}

app.prototype.active = function(callback) {
  // use request and response of app.listen()
  // return some manipulated request params via the callback
}

app.prototype.listen = function() {
  // start http or https server
}
Then import the module and work with it like this:
var app = require(...)

var client = new app({
  token: 0000
})

client.get(function(error, reply) {})
client.listen()
It's pretty easy to build your own very simple HTTP framework on top of Node's http module. Here's a quick one I made which implements the app.get() and app.listen() methods; you can see how it could grow into something more Express-like:
'use strict';

const Http = require('http');
const Url = require('url');

// Framework

const Framework = function (options) {
  this.options = options;
  this.routes = [];
  this.listener = Http.createServer(this._onRequest.bind(this));
};

Framework.prototype.get = function (path, handler) {
  this.routes.push({ path, method: 'GET', handler });
};

Framework.prototype.post = function (path, handler) {
  this.routes.push({ path, method: 'POST', handler });
};

Framework.prototype.listen = function (callback) {
  this.listener.listen(this.options.port, callback);
};

Framework.prototype._onRequest = function (req, res) {
  // Find the first matching route
  for (let i = 0; i < this.routes.length; ++i) {
    const route = this.routes[i];
    const url = Url.parse(req.url);

    if (route.method === req.method && url.path === route.path) {
      return route.handler(req, res);
    }
  }

  // No matching routes
  res.writeHead(404);
  res.end('Not found');
};
You can use this mini framework like so:
const app = new Framework({ port: 4000 });

app.get('/', (req, res) => {
  res.end('Home page');
});

app.get('/about', (req, res) => {
  res.end('About page');
});

app.listen(() => {
  console.log('Started server!');
});
You can test it with a few cURL requests:
$ curl -i http://localhost:4000/
HTTP/1.1 200 OK
Date: Sun, 24 Apr 2016 14:38:02 GMT
Connection: keep-alive
Content-Length: 9
Home page
$ curl -i http://localhost:4000/about
HTTP/1.1 200 OK
Date: Sun, 24 Apr 2016 14:38:08 GMT
Connection: keep-alive
Content-Length: 10
About page
$ curl -i http://localhost:4000/spaghetti
HTTP/1.1 404 Not Found
Date: Sun, 24 Apr 2016 14:38:14 GMT
Connection: keep-alive
Transfer-Encoding: chunked
Not found
Obviously this is a really basic framework and suffers from many problems that frameworks like hapi have solved:
There's no support for parameters in paths, e.g. /users/{id}; the URL paths must match the route path exactly (see the sketch after this list)
The order that you add routes is important (this can lead to issues)
Conflicting paths are permitted
Missing a lot of nice features like serving files and rendering templates (although you could do this in the handlers manually)
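As a rough illustration of the first point, here's one hedged way the mini framework above could be extended to match simple path parameters like /users/{id}; the _match helper and the req.params attachment are my own additions, not part of the original answer:

// Returns an object of extracted params if routePath matches urlPath, otherwise null.
Framework.prototype._match = function (routePath, urlPath) {
  const routeParts = routePath.split('/');
  const urlParts = urlPath.split('/');
  if (routeParts.length !== urlParts.length) {
    return null;
  }

  const params = {};
  for (let i = 0; i < routeParts.length; ++i) {
    const part = routeParts[i];
    if (part.startsWith('{') && part.endsWith('}')) {
      params[part.slice(1, -1)] = urlParts[i]; // capture the parameter segment
    } else if (part !== urlParts[i]) {
      return null; // literal segment mismatch
    }
  }
  return params;
};

// _onRequest would then call this._match(route.path, url.path) for each route and,
// on a match, attach the result to the request (e.g. req.params = params)
// before invoking route.handler(req, res).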