Vite React proxy sends requests to different endpoints depending on current location

After switching to Vite, I am trying to mimic the "proxy": "http://localhost:5000" setting I previously used in package.json.
Here is my Vite config:
export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      "/api": {
        target: "http://localhost:5000",
        changeOrigin: true,
        secure: false,
      },
    },
  },
});
I have a React app running on port 3000. When I send a request from the root URL (http://localhost:3000), everything works fine:
const { data } = await axios.get("api/user/me");
Well, not really fine. Even though the proper data is returned in the response, the console shows the request going to http://localhost:3000/api/user/me instead of http://localhost:5000/api/user/me. Can anyone explain this behaviour?
The main problem is that when I navigate to another page (e.g. http://localhost:3000/dashboard), the same request gets sent to http://localhost:3000/dashboard/api/user/me.
What am I doing wrong? I want requests to go to http://localhost:5000, no matter the location.
I found a workaround by specifying the full URL before every request, const { data } = await axios.get("http://localhost:3000/api/user/me");, but still, is there a way to mimic the package.json proxy behaviour?
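The location-dependent behaviour comes from the missing leading slash: a relative URL like "api/user/me" is resolved against the current page's path, while "/api/user/me" always resolves against the origin root. A minimal sketch with the standard URL class (the base URLs here are illustrative):

```javascript
// Relative URLs resolve against the current page's path;
// a leading slash anchors the path at the origin root.
const fromRoot = new URL("api/user/me", "http://localhost:3000/").href;
const fromDashboard = new URL("api/user/me", "http://localhost:3000/dashboard/").href;
const absolute = new URL("/api/user/me", "http://localhost:3000/dashboard/").href;

console.log(fromRoot);      // http://localhost:3000/api/user/me
console.log(fromDashboard); // http://localhost:3000/dashboard/api/user/me
console.log(absolute);      // http://localhost:3000/api/user/me
```

So axios.get("/api/user/me"), with the leading slash, would hit the dev-server proxy from any page. Note also that seeing localhost:3000 in the network tab is expected: the browser talks only to the Vite dev server, which forwards matching requests to port 5000 on the server side.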

I resolved the issue by changing the axios defaults:
axios.defaults.baseURL = `http://localhost:5000`
By doing this I achieved what I was going for: requests get sent to the proper endpoint no matter the location.

I solved this problem using Axios.
Create a new file. I called mine emailApi.
Set up your Axios instance, like so:
export const emailApi = axios.create({ baseURL: "http://localhost:<yourPortNumber>" })
Done! Whenever you want to send a request to your server, import the emailApi and you're good.
Also, in your server, make sure to install cors and register it as middleware on your app instance, like so:
app.use(cors({ origin: "<viteLocalServer>" }))


Proxying requests from React to External REST API

I'm working on a website made with React, run with npm. The website currently uses a JS API that ships with the website code, but we're migrating to an external REST API. Everything's configured correctly to run locally with a webpack dev server:
proxy: {
  '/apiv1': 'http://localhost:5000/', // in prod, run in the same container
  '/api': {
    target: 'http://localhost:8080/', // in prod, run separately on a different url (api.website.com)
    pathRewrite: { '^/api/': '' },
  },
},
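For clarity, here is a small sketch of the rewrite semantics: http-proxy-middleware (which webpack-dev-server uses under the hood) treats each pathRewrite key as a regular expression and replaces matches with the value. The applyRewrite helper below is illustrative, not the library's implementation:

```javascript
// pathRewrite keys are regex patterns; values are their replacements.
const pathRewrite = { "^/api/": "" };

function applyRewrite(path, rules) {
  // Apply each rule in turn to the request path.
  return Object.entries(rules).reduce(
    (p, [pattern, replacement]) => p.replace(new RegExp(pattern), replacement),
    path
  );
}

console.log(applyRewrite("/api/users/42", pathRewrite));  // "users/42"
console.log(applyRewrite("/apiv1/users", pathRewrite));   // unchanged: "/apiv1/users"
```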
In the console, I see errors complaining that some data is undefined (they use minified variable names, so they're difficult to track down, but it's almost certainly data from the API).
I checked whether it was a CORS issue per this question, but I'm still having the same issue with CORS disabled completely.
I can successfully ping the api with a direct request, placed at the beginning of the base App's render method:
axios.get("https://api.website.com/")
I've tried adding the following to my package.json per this:
"homepage": ".",
"proxy": {
  "/api": {
    "target": "https://api.website.com",
    "pathRewrite": {
      "^/api": ""
    },
    "changeOrigin": true
  }
},
In general: how can I proxy requests for website.com/api/request to api.website.com/request in production? Is there something else I'm configuring incorrectly?
Please let me know if there's any more information I can add!
Edit:
I've also tried configuring the proxy with http-proxy-middleware:
// src/setupProxy.js
const { createProxyMiddleware } = require('http-proxy-middleware');

module.exports = function (app) {
  app.use(
    '/api',
    createProxyMiddleware({
      target: "https://api.website.com",
      pathRewrite: {
        "^/api": ""
      },
      changeOrigin: true,
    })
  );
};
Proxying api requests in production for React/Express app
In production we can't use (proxy). Instead we can set a variable in the frontend for the backend URL, and vice versa.
https://github.com/facebook/create-react-app/issues/1087#issuecomment-262611096
proxy is just that: a development feature. It is not meant for production.
These suggest that it's entirely impossible. It makes sense that the proxy option won't work, but I can't say I understand why there's no equivalent functionality for a production environment. It seems the best option for me is making all calls to the full domain instead of proxying.
If you're using nginx, the linked answers suggest using that to proxy the requests:
upstream backend-server {
    server api.example.com;
}

server {
    listen 80;
    server_name example.com;
    root /var/www/build;
    index index.html;

    location /api/ {
        proxy_pass http://backend-server;
    }

    location / {
        try_files $uri /index.html;
    }
}

How to access httpOnly cookies from Nuxt 3 server

I am implementing a login feature to a website project. The backend is Express and the frontend is Nuxt 3. Upon successfully authenticating a user login, the Express backend returns necessary data to the webserver, which then creates an httpOnly cookie and sets any necessary data in a Pinia store. On page refresh, I would like the Nuxt 3 server to look at the cookie and setup the Pinia store (since it is lost on page refresh).
Can someone provide some guidance? I have looked at the useNuxtApp() composable, and I can see the cookie in nuxtApp.ssrContext.req.headers.cookie, but that only provides a K/V pairing of all set cookies, which I would need to parse. I know of the useCookie composable, but that only works during lifecycle hooks and seems to only resolve undefined.
Thanks.
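For the parsing step the asker mentions, a minimal sketch of turning the raw Cookie header string into an object (assumes simple name=value pairs; parseCookieHeader is an illustrative helper, not a library function):

```javascript
// Split the raw Cookie header ("a=1; b=2") into a name -> value map.
function parseCookieHeader(header) {
  return Object.fromEntries(
    header.split(";").map((pair) => {
      const [name, ...rest] = pair.trim().split("=");
      // Re-join in case the value itself contains "=".
      return [name, decodeURIComponent(rest.join("="))];
    })
  );
}

console.log(parseCookieHeader("session=abc123; theme=dark"));
// { session: "abc123", theme: "dark" }
```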
Not sure if this is the right way, but it's a solution I used to get through a similar case (a .NET API + Nuxt 3 client).
First, we need to proxy the API (Express in your case); this makes it so our cookie is on the same domain, and the browser will start sending it to /api/ endpoints.
Install @nuxtjs-alt/proxy: npm i @nuxtjs-alt/proxy.
Add the configuration to nuxt.config.ts (my API is running on localhost:3000):
nuxt.config.ts:
nuxt.config.ts:
export default defineNuxtConfig({
  modules: [
    '@nuxtjs-alt/proxy'
  ],
  proxy: {
    enableProxy: true,
    fetch: true,
    proxies: {
      '/proxy': {
        target: 'http://localhost:3000',
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/proxy/, '')
      }
    }
  }
});
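The rewrite option above is a plain function; it strips the /proxy prefix before the request is forwarded to the target:

```javascript
// Same rewrite as in the config above: strip the leading /proxy segment.
const rewrite = (path) => path.replace(/^\/proxy/, "");

console.log(rewrite("/proxy/user/sign-in")); // "/user/sign-in"
console.log(rewrite("/user/me"));            // unchanged: "/user/me"
```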
Then, anywhere on the client, make the request that sets the cookie through the newly set up proxy instead of calling the API directly. Adjust the parameters based on your setup.
await $fetch('/proxy/user/sign-in', {
  method: 'POST',
  body: {
    email: 'example@mail.com',
    password: 'password'
  }
});
Ultimately, we should end up with a cookie set on our client's domain.
Lastly, when we handle the request on the Nuxt server, we read the cookie and set it on the forwarded request. Replace COOKIE_NAME and the API URL accordingly.
server/api/user/me.get.ts:
export default defineEventHandler(async (event) => {
  return await $fetch('http://localhost:3000/user/me', {
    headers: {
      Cookie: `COOKIE_NAME=${getCookie(event, 'COOKIE_NAME')}`
    }
  });
});
The API call will use the same cookie we got when we made the first request, and the server should be able to read it.

Cookie not set in request with NodeJS and NextJS

I'm developing a fullstack app with a Node + Express backend and a Next.js frontend (separate servers), and I'm having trouble getting the browser to attach the cookie vended as part of the response header from the Node server. Here's the setup:
The Node server is running on localhost:3000 and the Next.js server is running on localhost:3001.
I have set up an alias in /etc/hosts to route someAlias.com to 127.0.0.1.
Using the front end UI (port 3001), I was able to vend the cookie with JsHttp's cookie module using the following code on the backend (port 3000):
import { serialize } from 'cookie';
...
const cookie = serialize(TOKEN_NAME, TOKEN_VAL, {
  httpOnly: true,
  sameSite: 'none',
});
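For reference, here is a rough sketch of the kind of header string serialize produces for these options. serializeCookie is a simplified stand-in for illustration, not the cookie package's actual implementation:

```javascript
// Build a Set-Cookie header value: "name=value" followed by attributes.
function serializeCookie(name, value, opts = {}) {
  let str = `${name}=${encodeURIComponent(value)}`;
  if (opts.httpOnly) str += "; HttpOnly";
  if (opts.sameSite) str += `; SameSite=${opts.sameSite}`;
  if (opts.secure) str += "; Secure";
  if (opts.domain) str += `; Domain=${opts.domain}`;
  if (opts.path) str += `; Path=${opts.path}`;
  return str;
}

console.log(serializeCookie("token", "abc", { httpOnly: true, sameSite: "none" }));
// "token=abc; HttpOnly; SameSite=none"
```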
I was able to observe the Set-Cookie header in the response.
However, in subsequent requests, I did not see the cookie being attached. I have tried fiddling with the above cookie serialization params with no success. Here are the arguments I've tried:
domain: ['.someAlias.com:3000', '.someAlias.com:3001']
path: '/'
domain: '.someAlias.com'
I have a feeling it might just be due to the front end and back end server ports being different, but all requests have been initiated on the client side going to localhost:3000 (the backend port). So I'm not sure what I've possibly done wrong here.
====== UPDATE =======
I've run a couple more experiments and found that when I access a URL directly, Next.js renders the page server-side. When I transition between pages within the app, the page is rendered client-side, where it queries the backend port 3000 directly. Unfortunately, in neither scenario did I see any cookie being set...
The cookies must be sent directly from the browser to the server, which is not the case when you use Next.js. When you access your app, Next.js server-side renders your page, and the request is then sent from the Next.js server to your Node.js server, so the browser sends the cookies to the Next.js server, not to your Node.js server.
The solution is to forward the cookies manually from the Next.js server to Node.js.
Example with the Fetch API and the getServerSideProps function:
export async function getServerSideProps(context) {
  let data;
  try {
    const res = await fetch(`your-api-endpoint`, {
      method: 'GET',
      credentials: 'include',
      headers: {
        // forward the cookies the browser sent to the Next.js server
        Cookie: context.req.headers.cookie
      },
    });
    data = await res.json();
    if (!res.ok) {
      throw data;
    }
  } catch (err) {
    console.log(err);
    return { props: {} };
  }
  return {
    props: {
      data
    }
  };
}
SAMESITE NONE
If you use SameSite=none you also need to use Secure, meaning SSL must also be used. You are then saying that your two servers are from unrelated domains, which is probably not what you want, since browsers will drop such cookies aggressively.
LOCAL PC OPTIONS
Use these settings initially on your development computer, and ensure that all URLs used in the browser and in Ajax calls use http://somealias.com:3000 and http://somealias.com:3001 rather than localhost. Cookies will then stick on a development computer:
Domain=.somealias.com
Path=/
SameSite=strict
HttpOnly
DEPLOYED OPTIONS
When you deploy to a proper environment, also use SSL and set the cookie option Secure. Most importantly, the two domains must meet the hosting prerequisite of sharing the same base domain, e.g.:
https://www.example.com
https://api.example.com
This ensures that cookies issued are considered first party and in the same site, so that they are not dropped. If preconditions are not met, there is nothing you can do in code to fix the problem.
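The "same base domain" precondition can be checked with a quick heuristic. This is only a sketch: a real check should consult the Public Suffix List, since naive two-label splitting fails for suffixes like .co.uk:

```javascript
// Naive check: do two hosts share the same registrable base domain?
// (Takes the last two labels; real code should use a public-suffix list.)
function sameBaseDomain(hostA, hostB) {
  const base = (h) => h.split(".").slice(-2).join(".");
  return base(hostA) === base(hostB);
}

console.log(sameBaseDomain("www.example.com", "api.example.com")); // true
console.log(sameBaseDomain("www.example.com", "api.another.com")); // false
```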
SIMILAR RESOURCE
This Curity code example uses local development domains and same-site cookie settings similar to those I've used above, and may be useful to compare against.
You should set the serialized cookie with the res.set Express method.
Alternatively, you can use the res.cookie method without the additional cookie package, like this:
res.cookie(TOKEN_NAME, TOKEN_VAL, {
  httpOnly: true,
  sameSite: 'none',
});
Note, you shouldn't worry about different ports on the same domain, since cookies are isolated by domain only, not by port. No matter what port you use, cookies should be visible.
Since you said "I was able to observe the Set-Cookie header in the response", I believe your Node.js setup is correct.
When you get the response from Node.js, you need to set the cookies, which can be done easily with an npm package. I will demonstrate with js-cookie:
import Cookies from "js-cookie";
You write a reusable function to set the cookies:
// however your project returns the auth result
setSession(authResult) {
  // convert expiresIn (seconds) to an absolute expiry time in milliseconds
  const expiresAt = JSON.stringify(authResult.expiresIn * 1000 + new Date().getTime());
  // I just put these properties; adjust to how your project sets them
  Cookies.set("user", authResult.idTokenPayload);
  Cookies.set("jwt", authResult.idToken);
  Cookies.set("expiresAt", expiresAt);
}
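The expiry arithmetic is worth spelling out: expiresIn is a duration in seconds, so it must be multiplied into milliseconds and added to the current epoch time before being stringified. expiryTimestamp below is an illustrative helper:

```javascript
// expiresIn is a duration in seconds; produce an absolute epoch-ms timestamp.
function expiryTimestamp(expiresInSeconds, nowMs = Date.now()) {
  return nowMs + expiresInSeconds * 1000;
}

console.log(expiryTimestamp(3600, 0)); // 3600000 (one hour in ms)
```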
Every time you make a request, you have to set the headers, and you have to retrieve the cookies differently depending on whether you are in the browser or on the server. Since I demonstrated setting cookies with js-cookie, you can get them easily in the browser; here is a reusable function to retrieve a cookie when you are on the server:
// cookieKey: I set three: "user", "jwt", "expiresAt"
export const getCookieFromReq = (req, cookieKey) => {
  // cookies are attached to req.headers.cookie as a single string
  const cookie = (req.headers.cookie || "")
    .split(";")
    .find((c) => c.trim().startsWith(`${cookieKey}=`));
  if (!cookie) return undefined;
  return cookie.split("=")[1];
};
Now you have to write a function to set the headers:
import Cookies from "js-cookie";
import { getCookieFromReq } from "./directoryOf";

export const setAuthHeader = (req) => {
  const token = req ? getCookieFromReq(req, "jwt") : Cookies.getJSON("jwt");
  if (token) {
    return {
      headers: { authorization: `Bearer ${token}` },
    };
  }
  return undefined;
};
Now when you make request, you have to use this setAuthHeader. For example:
await axiosInstance
  .post("/blogs", blogData, setAuthHeader())
  .then((response) => response.data)
  .catch((error) => rejectPromise(error));

Node http-proxy-middleware and Express against Tomcat

We are running our Java app (Spring-based), including the UI modules, in a Tomcat container. When calling Tomcat directly at http://localhost:8080, a login page is displayed and a 302 redirect to the web app occurs.
Now we want to develop the UI modules separately from the Java app by running an Express server with http-proxy-middleware and browser-sync. The modules have not been extracted out of the WAR file and are running on the Tomcat instance. For testing purposes, we just copied a UI module to another directory to set up Express and the corresponding modules.
The problem is that we are not able to get the authorization cookies (JSESSIONID) and CSRF tokens correctly set.
How can the 302 redirect be intercepted and redirected to the separately hosted UI app?
We've got authorization working, so no login is required, but calling the "copied app" does not work and results in "auth error" or "forbidden".
We have already checked the documentation and other posts on here.
var cookie;

function relayRequestHeaders(proxyReq, req) {
  if (cookie) {
    proxyReq.setHeader('cookie', cookie);
  }
}

function relayResponseHeaders(proxyRes, req, res) {
  var proxyCookie = proxyRes.headers["set-cookie"];
  if (proxyCookie) {
    cookie = proxyCookie;
  }
}

const oOptions = {
  target: 'http://localhost:8080',
  changeOrigin: true,
  auth: 'user:pw',
  pathRewrite: {
    '^/dispatcher': '/dispatcher',
  },
  //cookieDomainRewrite: 'localhost',
  onProxyReq: relayRequestHeaders,
  onProxyRes: relayResponseHeaders,
  logLevel: 'debug'
};

const wildcardProxy = createProxyMiddleware(oOptions);
app.use(wildcardProxy);
Any ideas on how to get this solved?
Thanks.
Update:
We also tried filtering context paths, which works, but then it does not access the resources of the hosted webapp via Express.
const oOptions = {
  target: 'http://localhost:8080',
  changeOrigin: true,
  auth: 'user:pw',
  logLevel: 'debug'
};

const wildcardProxy = createProxyMiddleware(['index.html', 'index.jsp', '!/resources/scripts/**/*.js'], oOptions);
app.use(wildcardProxy);
This is because we are proxying "/". How can we proxy only the login and the initial index.jsp, but then use the resources of "webapp" and not the Tomcat resources (e.g. *.js)? Is it somehow possible via app.use to bind to certain files and paths?
We got that solved. We just excluded certain files from being served by the target host.
config.json:
{
  ...
  "ignoreFilesProxy": ["!/**/*.js", "!/**/*.css", "!/**/*.xml", "!/**/*.properties"],
  ...
}
...
createProxyMiddleware(config.ignoreFilesProxy, oOptions);
...
The next step was to change the Maven build so that the UI modules are not served/packaged with the WAR file in the local dev environment. We solved that by introducing two Maven profiles to include or exclude the UI modules in the WAR project.

CORS policy in vue.js. Access only in backend?

If I have a project in vue-cli with no Node.js (or Express.js), can I somehow unblock this CORS access?
I tried adding this code in vue.config.js:
vue.config.js:
module.exports = {
  devServer: {
    proxy: {
      '/api': {
        target: 'http://18.700.121.3:8000/',
        ws: true,
        changeOrigin: true
      }
    }
  }
}
Vue template:
import axios from 'axios'

export default {
  name: "Jokes",
  data () {
    return {
      songs: null,
    }
  },
  mounted () {
    const config = { headers: { 'Access-Control-Allow-Origin': '*' } };
    axios
      .get(`http://2.788.121.2:4000/`, config) // a sample API
      .then(response => (this.songs = response.data))
  }
}
but it didn't help. I also tried switching on the Chrome plugin Access-Control-Allow-Origin, where I added access for localhost:8080, but it still doesn't work.
So is it possible that the only option is to install Node.js and add res.header("Access-Control-Allow-Headers", "*")?
Try removing the header const config = {headers: {'Access-Control-Allow-Origin': '*'}}; from the request.
Access-Control-Allow-Origin is a response header set by the server, not a request header; sending it will just confuse the browser.
First way: use a CORS plugin in Chrome, e.g. Moesif CORS, or downgrade your Chrome to a version below 71.
Second way: if you can modify the server-side code (if using Express), add the code below and you can easily solve this problem.
var cors = require('cors')
var app = express()
app.use(cors())
Third way: run Chrome with web security disabled (for development only):
Close Chrome.
Hold down the Windows key and press R. The "Run" dialog box will open.
Enter chrome --disable-web-security --user-data-dir in the input field and run it.
Now run your Vue.js app.
Hope it helps.
