We are running our Spring-based Java app, including the UI modules, in a Tomcat container. Calling Tomcat directly at http://localhost:8080 displays a login page, and a 302 redirect then leads to the web app.
Now we want to develop the UI modules separately from the Java app by running an Express server with http-proxy-middleware and browser-sync. The modules have not been extracted out of the WAR file and are still running on the Tomcat instance. For testing purposes we just copied a UI module to another directory to set up Express and the corresponding modules.
The problem is that we are not able to get the authorization cookies (JSESSIONID) and CSRF tokens set correctly.
How can the 302 redirect be intercepted and redirected to the separately hosted UI app?
We've got the authorization working, so no login is required, but calling the "copied app" does not work and results in "auth error" or "forbidden".
We have already checked the documentation and other posts here.
var cookie;

// Re-attach the last cookie received from Tomcat to every outgoing proxy request.
function relayRequestHeaders(proxyReq, req) {
  if (cookie) {
    proxyReq.setHeader('cookie', cookie);
  }
}

// Remember the cookie(s) Tomcat sets (e.g. JSESSIONID) so they can be relayed back.
function relayResponseHeaders(proxyRes, req, res) {
  var proxyCookie = proxyRes.headers['set-cookie'];
  if (proxyCookie) {
    cookie = proxyCookie;
  }
}
const oOptions = {
  target: 'http://localhost:8080',
  changeOrigin: true,
  auth: 'user:pw',
  pathRewrite: {
    '^/dispatcher': '/dispatcher',
  },
  // cookieDomainRewrite: 'localhost',
  onProxyReq: relayRequestHeaders,
  onProxyRes: relayResponseHeaders,
  logLevel: 'debug'
};

const wildcardProxy = createProxyMiddleware(oOptions);
app.use(wildcardProxy);
Any ideas on how to get that solved?
Thanks.
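For reference, node-http-proxy (which http-proxy-middleware wraps) also has options for rewriting the host in the Location header of 3xx responses; whether this fits the setup above is an assumption, but a minimal sketch based on the options shown earlier would look like this:

// Sketch: have the proxy rewrite redirect Location headers so the 302 from Tomcat
// points back at the Express host instead of http://localhost:8080.
const redirectOptions = {
  target: 'http://localhost:8080',
  changeOrigin: true,
  autoRewrite: true,                 // rewrite host:port in Location headers of 301/302/307/308 responses
  // hostRewrite: 'localhost:3000',  // or force a specific host (this port is hypothetical)
  cookieDomainRewrite: 'localhost',  // keep JSESSIONID usable on the Express origin
  logLevel: 'debug'
};
app.use(createProxyMiddleware(redirectOptions));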
Update:
We also tried filtering context paths, which works, but then the resources of the hosted webapp are not accessed via Express.
const oOptions = {
  target: 'http://localhost:8080',
  changeOrigin: true,
  auth: 'user:pw',
  logLevel: 'debug'
};

const wildcardProxy = createProxyMiddleware(['index.html', 'index.jsp', '!/resources/scripts/**/*.js'], oOptions);
app.use(wildcardProxy);
This is because we are proxying "/". How can we proxy only the login and the initial index.jsp, but serve the resources (e.g. *.js) from the "webapp" copy rather than from Tomcat? Is this possible via app.use by binding to certain files and paths?
We got that solved. We just excluded certain files from being served by the target host.
config.json:
{
  ...
  "ignoreFilesProxy": ["!/**/*.js", "!/**/*.css", "!/**/*.xml", "!/**/*.properties"],
  ...
}
...
createProxyMiddleware(config.ignoreFilesProxy, oOptions);
...
The next step was to change the Maven build so that the UI modules are not served/packaged with the WAR file in the local dev environment. We solved that by introducing two Maven profiles to include or exclude the UI modules in the WAR project.
Related
I'm working on a website made with React, run with npm. The website currently uses a JS API run with the website code, but we're migrating to using an external REST API. Everything's configured correctly to run locally with a webpack dev server:
proxy: {
  '/apiv1': 'http://localhost:5000/', // in prod, run in the same container
  '/api': {
    target: 'http://localhost:8080/', // in prod, run separately on a different url (api.website.com)
    pathRewrite: { '^/api/': '' },
  },
},
In the console, I see errors complaining that some data is undefined (the minified variable names make it difficult to track down, but it's almost certainly data from the API).
I checked if it was a CORS issue per this question, but I'm still having the same issue with CORS disabled completely.
I can successfully ping the api with a direct request, placed at the beginning of the base App's render method:
axios.get("https://api.website.com/")
I've tried to add the following to my package.json per this:
"homepage": ".",
"proxy": {
"/api": {
"target": "https://api.website.com",
"pathRewrite": {
"^/api": ""
},
"changeOrigin": true
}
},
In general -- how can I proxy requests for website.com/api/request to api.website.com/request in production? Is there something else I'm configuring incorrectly?
Please let me know if there's any more information I can add!
Edit:
I've also tried configuring the proxy with http-proxy-middleware:
// src/setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');
module.exports = function (app) {
  app.use(
    '/api',
    createProxyMiddleware({
      target: "https://api.website.com",
      pathRewrite: {
        "^/api": ""
      },
      changeOrigin: true,
    })
  );
};
Proxying api requests in production for React/Express app
In production we can't use proxy. Instead we can set a variable in the frontend for the backend URL, and vice versa.
https://github.com/facebook/create-react-app/issues/1087#issuecomment-262611096
proxy is just that: a development feature. It is not meant for production.
These suggest that it's entirely impossible. Makes sense that the proxy option won't work, but I can't say I understand why there's no equivalent functionality for a production environment. It seems the best option for me is making all calls to the full domain instead of proxying.
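For example, a sketch of that approach using an axios instance whose base URL comes from the build environment (the REACT_APP_API_URL variable name is an assumption):

// api.js: call the API on its full domain in production, fall back to the dev proxy path locally.
import axios from "axios";

export const api = axios.create({
  baseURL: process.env.REACT_APP_API_URL || "/api",
});

// usage elsewhere: api.get("/request") goes to https://api.website.com/request
// when REACT_APP_API_URL=https://api.website.com is set at build time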
If you're using nginx, the linked answers suggest using that to proxy the requests:
upstream backend-server {
    server api.example.com;
}

server {
    listen 80;
    server_name example.com;
    root /var/www/build;
    index index.html;

    location /api/ {
        proxy_pass http://backend-server;
    }

    location / {
        try_files $uri /index.html;
    }
}
I am implementing a login feature to a website project. The backend is Express and the frontend is Nuxt 3. Upon successfully authenticating a user login, the Express backend returns necessary data to the webserver, which then creates an httpOnly cookie and sets any necessary data in a Pinia store. On page refresh, I would like the Nuxt 3 server to look at the cookie and setup the Pinia store (since it is lost on page refresh).
Can someone provide some guidance? I have looked at the useNuxtApp() composable, and I can see the cookie in nuxtApp.ssrContext.req.headers.cookie, but that only provides a K/V pairing of all set cookies, which I would need to parse. I know of the useCookie composable, but it only works during lifecycle hooks and seems to only resolve to undefined.
Thanks.
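For reference, the raw cookie header mentioned above could be split into key/value pairs with a small helper (a sketch; which cookie name to read depends on the setup):

// Parse an "a=1; b=2" style cookie header into an object.
function parseCookieHeader(header) {
  const out = {};
  if (!header) return out;
  for (const pair of header.split(';')) {
    const idx = pair.indexOf('=');
    if (idx === -1) continue;
    out[pair.slice(0, idx).trim()] = decodeURIComponent(pair.slice(idx + 1).trim());
  }
  return out;
}

// e.g. const cookies = parseCookieHeader(nuxtApp.ssrContext.req.headers.cookie);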
Not sure if this is the right way, but it's a solution I used to get through a similar case (a dotnet API + Nuxt 3 client).
First, we need to proxy the API (Express in your case); this makes the cookie live on the same domain, so the browser will start sending it to the /api/ endpoints.
Install @nuxtjs-alt/proxy: npm i @nuxtjs-alt/proxy.
Add the configuration to nuxt.config.ts (my API is running on localhost:3000):
nuxt.config.ts:
export default defineNuxtConfig({
  modules: [
    '@nuxtjs-alt/proxy'
  ],
  proxy: {
    enableProxy: true,
    fetch: true,
    proxies: {
      '/proxy': {
        target: 'http://localhost:3000',
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/proxy/, '')
      }
    }
  }
});
Then, anywhere on the client, make the request that sets the cookie through the newly set-up proxy instead of calling the API directly. Adjust the parameters based on your setup.
await $fetch('/proxy/user/sign-in', {
  method: 'POST',
  body: {
    email: 'example@mail.com',
    password: 'password'
  }
});
Ultimately, we should end up with a cookie set on our client domain.
Lastly, when we handle the request on the Nuxt server side, we read the cookie and set it on the forwarded request.
Replace COOKIE_NAME and the API URL accordingly.
server/api/user/me.get.ts:
export default defineEventHandler(async (event) => {
  return await $fetch('http://localhost:3000/user/me', {
    headers: {
      Cookie: `COOKIE_NAME=${getCookie(event, 'COOKIE_NAME')}`
    }
  });
});
The API call will use the same cookie we got from the first request made through the proxy, and the server should be able to read it.
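To connect this back to the original question, a sketch of hydrating a Pinia store from such an endpoint on startup could look like the following (the plugin file name, the useUserStore store, and its profile field are assumptions):

// plugins/restore-session.ts (file name is an assumption)
export default defineNuxtPlugin(async () => {
  const user = useUserStore(); // hypothetical Pinia store from the question's setup
  if (!user.profile) {
    try {
      // Forward the incoming cookie header so the server route above can read it during SSR.
      const headers = useRequestHeaders(['cookie']);
      user.profile = await $fetch('/api/user/me', { headers });
    } catch {
      // Not authenticated; leave the store empty.
    }
  }
});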
After switching to Vite, I am trying to mimic the proxy: "http://localhost:5000" setting that I previously used in package.json.
Here is my Vite config:
export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:5000",
        changeOrigin: true,
        secure: false,
      },
    },
  },
});
I have a React app running on port 3000. When I send a request from the root URL (http://localhost:3000), everything works fine:
const { data } = await axios.get("api/user/me");
Well, not really fine. Even though the proper data is returned in the response, in the console the request gets sent to http://localhost:3000/api/user/me instead of http://localhost:5000/api/user/me. Can anyone explain this behaviour?
The main problem is that when I navigate to another page (e.g. http://localhost:3000/dashboard), the same request gets sent to http://localhost:3000/dashboard/api/user/me.
What am I doing wrong? I want to send requests to http://localhost:5000, no matter the location.
I found a workaround by specifying the frontend URL before every request (const { data } = await axios.get("http://localhost:3000/api/user/me");), but is there still a way to mimic the package.json proxy behaviour?
I resolved the issue by changing the axios defaults:
axios.defaults.baseURL = `http://localhost:5000`
By doing this I achieved what I was going for: requests get sent to the proper endpoint no matter the location.
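It may also be worth noting that the behaviour described in the question follows from the missing leading slash: a relative path is resolved against the current page URL, while a root-relative path always goes to /api and is picked up by the Vite proxy. For example:

// From http://localhost:3000/dashboard:
await axios.get("api/user/me");   // resolves to /dashboard/api/user/me
await axios.get("/api/user/me");  // resolves to /api/user/me and is forwarded to port 5000 by the proxy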
I solved this problem by using Axios.
Create a new file. I called mine emailApi.
Set up your axios instance, like so:
export const emailApi = axios.create({baseURL: "http://localhost:<yourPortNumber>"})
Done! Whenever you want to send a request to your server, import the emailApi and you're good.
Also in your server, make sure to install cors and set it as a middleware, like so:
express().use(cors({origin: <viteLocalServer>}))
I'm trying to run the passport-saml example below with OneLogin SSO, but I couldn't make it work. I have set the OneLogin HTTP-Redirect URL as the SAML entry point (config.js). It redirects to the OneLogin authentication page and back to the application page, but the application does not load.
https://github.com/gbraad/passport-saml-example
Please advise what I am missing here.
module.exports = {
  development: {
    app: {
      name: 'Passport SAML strategy example',
      port: process.env.PORT || 3000
    },
    passport: {
      strategy: 'saml',
      saml: {
        path: process.env.SAML_PATH || '/login/callback',
        entryPoint: process.env.SAML_ENTRY_POINT || 'https://domain.onelogin.com/trust/saml2/http-redirect/slo/200908',
        issuer: 'passport-saml',
        cert: process.env.SAML_CERT || null
      }
    }
  }
};
The SAML entryPoint doesn't look right in the passport-saml configuration.
It is currently configured with the single logout (SLO) service URL, whereas
the single sign-on (SSO) service URL should read similar to:
'https://domain.onelogin.com/trust/saml2/http-post/sso/200908'
The protocol binding used in the entry point above is also the right one, because the AuthnRequest sent by the passport-saml module at version 0.5.0 uses the HTTP-POST protocol binding for the authentication request to the identity provider, not the HTTP-Redirect binding.
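Applied to the config.js from the question, only the entryPoint changes (the URL below simply follows the SSO pattern above and is illustrative):

// config.js (fragment): entryPoint now points at the SSO endpoint instead of the SLO endpoint
saml: {
  path: process.env.SAML_PATH || '/login/callback',
  entryPoint: process.env.SAML_ENTRY_POINT || 'https://domain.onelogin.com/trust/saml2/http-post/sso/200908',
  issuer: 'passport-saml',
  cert: process.env.SAML_CERT || null
}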
I am working on a multi-environment API based on the Express framework. I want to keep my configuration dynamic; e.g. this API should be able to serve both mobile apps and web apps. If the request comes from a mobile source, then config-app-1.json should be included, otherwise config-app-2.json.
Currently I have config-app-1.json, config-app-2.json, config-db-1.json, config-db-2.json, and a configManager.js class which sets the required configuration in app.listen(). In other application modules I require configManager and use the necessary configurations. This, however, leads to code duplication in individual functions: each function has to get a reference to the db and application settings in its local scope.
I would like to know the best practices for a multi-environment API built with the Express framework.
These are configuration files; here is my approach.
File Structure
.
├── app.js
├── _configs
| ├── configManager.js
| ├── database.js
| └── platform
| ├── mobile.js
| └── desktop.js
Environment Configs
Configuration files are JS modules, one per device type; the configManager then decides which one is active based on the device.
// mobile.js example
module.exports = {
  device: 'mobile',
  configVar: 3000,
  urls: {
    base: 'DEVICE_SPECIFIC_BASE_URL',
    api: 'DEVICE_SPECIFIC_BASE_URL'
  },
  mixpanelKey: 'DEVICE_SPECIFIC_BASE_URL',
  apiKey: 'DEVICE_SPECIFIC_BASE_URL',
};
Database Config
Database configurations should be centralized.
Usually you can connect to multiple databases within the same Node instance, but it is not recommended. If you absolutely have to, just use two objects (instead of "mongodb", use "mobileMongoDb" and "desktopMongoDb"), but I recommend that you use one database and divide it into two main documents, or use certain prefixes set in your platform-specific configs.
// database.js example
module.exports = {
  mongodb: {
    host: 'localhost',
    port: 27017,
    user: '',
    password: '',
    database: 'DB_NAME'
  },
};
configManager.js (Putting things together)
This is a simple file, for demonstration only.
// Require them all so they are cached when Node starts.
var configs = {
  mobile: require('./platform/mobile'),
  desktop: require('./platform/desktop')
};
var dbConfigs = require('./database');

var mongoose = require('mongoose');
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  // Detect mobile clients from the User-Agent header (Node lower-cases header names).
  var userAgent = req.headers['user-agent'];
  var isMobile = /Mobile|Android/i.test(userAgent);
  var activeConfig = isMobile ? configs.mobile : configs.desktop;

  var finalresp = 'Hello from ';
  finalresp += isMobile ? 'mobile ' : 'desktop ';
  finalresp += activeConfig.configVar;
  res.send(finalresp);
});

// Build a connection string from the centralized database config.
mongoose.connect(
  'mongodb://' + dbConfigs.mongodb.host + ':' + dbConfigs.mongodb.port + '/' + dbConfigs.mongodb.database,
  function (err) {
    if (err) { /* handle connection error */ }
  }
);
Detect Mobile from header
Read more here: https://gist.github.com/dalethedeveloper/1503252
You can set environment variables. What I usually do is have multiple config files, as you have mentioned.
Then set the environment variable NODE_ENV to "LOCAL", "DEVELOPMENT", or "PRODUCTION" in the local, development, and production environments respectively.
Then you can reference the environment with the following code:
const fs = require('fs');
const path = require('path');

const ENV = process.env.NODE_ENV;
let mainConf, dbConf;

if (ENV === 'PRODUCTION') {
  mainConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/main-production.json')));
  dbConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/db-production.json')));
} else if (ENV === 'DEVELOPMENT') {
  mainConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/main-development.json')));
  dbConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/db-development.json')));
} else if (ENV === 'LOCAL') {
  mainConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/main-local.json')));
  dbConf = JSON.parse(fs.readFileSync(path.join(__dirname, '/config/db-local.json')));
}
Make sure you set the environment variables properly in each server's environment.
Use the config JSON loaded by the code above the way you want.
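For example, continuing the snippet above (the port and host fields are assumptions about what the JSON files contain):

// Continuing the snippet above: use the loaded config objects wherever needed.
const express = require('express');
const app = express();

app.listen(mainConf.port || 3000, function () {
  console.log('Running in ' + ENV + ' mode, DB host: ' + dbConf.host);
});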
Can the source of the request (e.g. Mobile - Web) change during runtime? In other words can request 1 come from a mobile device and request 2 from the web?
If so, you could look at the user agent in the headers to determine the kind of device you're dealing with. This does make you dependent on the user agent though, and if it's not sent you won't have a way of identifying your client.
req.headers['user-agent']; // header names are lower-cased by Node
If you own the clients yourself, you can add a property to every request, say an extra header: req.headers['X-Client-Type'] = 'Mobile'; // or 'Web'.
This way you aren't dependent on the user agent and still able to identify the type of each client.
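As an illustration, a small Express middleware could pick the configuration per request based on such a header (the X-Client-Type name follows the suggestion above; the configs object mirrors the earlier answer and is an assumption):

// Choose a per-request config from a custom X-Client-Type header sent by your own clients.
const express = require('express');
const app = express();

const configs = {
  mobile: require('./platform/mobile'),   // assumed config modules
  desktop: require('./platform/desktop')
};

app.use(function (req, res, next) {
  const clientType = req.headers['x-client-type'] || 'Web'; // Node lower-cases header names
  req.appConfig = clientType === 'Mobile' ? configs.mobile : configs.desktop;
  next();
});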
Lastly, if you are dealing with third-party clients (other people making applications that hit your API), you might want to make them register their application (name, developer name, contact information, maybe agree to some type of service agreement, and also state the type of client, web vs. mobile).
You'd then be able to fetch the type of each client on every new request.