Axios proxy request doesn't seem to work - javascript

It seems like the axios library doesn't work as it should, and I don't know what's wrong.
First, to explain: I am new to axios, and the company I work for has set up the API server using Nginx with proxy_pass.
So let's say the API server is under the domain https://www.api.com and the API endpoints are located under https://www.api.com/api/.
We are now building an SSR application that uses the axios library to make requests to this server, and the point is to use axios's proxy settings so that the final requests don't look like https://www.api.com/api/endpoint but like https://www.js-app.com/api/endpoint.
So, currently I have the following class:
class WebApiService {
  constructor() {
    this.deferred = Q.defer();
    this.$http = axios.create();
  }

  async call(config) {
    WebApiService._setDefaultApiCallHeader(config);
    try {
      const result = await this.$http(config);
      this.deferred.resolve(result);
    } catch (error) {
      this.deferred.reject(error);
    }
    return this.deferred.promise;
  }

  static _setDefaultApiCallHeader(config) {
    config.headers = config.headers || {};
    if (window.sessionStorage['Authorization'] !== 'undefined') {
      config.headers['Authorization'] = `Bearer ${window.sessionStorage['Authorization']}`;
    }
    // HERE I SETUP THE PROXY SETTINGS
    if (/^\/api\//.test(config.url)) {
      config.proxy = config.proxy || {};
      config.proxy.host = 'https://www.api.com';
    }
  }
}
So now, when I utilize this class from another place in my app using code like this:
const options = {
  method: 'POST',
  url: '/api/v0/configs',
  data: { domain: window.location.hostname }
};
const websiteConfig = await this.webApiService.call(options);
In my browser console I get a 404 error, and that's because axios still sends the request to my localhost instead of the remote server.
So, do you think I'm doing something wrong? Am I trying to achieve something that's not possible? Is there any solution to this situation, or any idea on how to approach it?

I think you are misusing axios's proxy option. As far as I understand, that option is intended for an actual proxy server, and you don't need any of this in your case.
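For example, a minimal sketch of what the browser side could look like (illustrative, based on the call from the question) with no proxy option at all - the request stays same-origin and Nginx does the forwarding:
async function loadWebsiteConfig() {
  const $http = axios.create();
  // Same-origin request: the browser hits https://www.js-app.com/api/v0/configs
  // and Nginx forwards it to the API server behind the scenes.
  return $http({
    method: 'POST',
    url: '/api/v0/configs',
    data: { domain: window.location.hostname }
  });
}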
You should serve your client app from your server https://www.js-app.com; below is an example Nginx config:
location / {
    proxy_pass http://localhost:8080;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}
And for the API Server:
location /api {
    proxy_pass http://localhost:8081; # here you put your backend server port
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
    proxy_read_timeout 600s;
}
These examples are from a nice article that may help.
Or you can google for others with a title similar to "How to Set Up a Node.js Application for Production with Nginx Reverse Proxy".
If you cannot put your backend server behind Nginx like that, you can always do something behind Nginx, or in the Nginx config itself, to actually send the requests to api.com/api.
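For instance, a rough sketch of what that could look like in the js-app.com server block (assuming the API host is reachable from the Nginx machine; the exact directives depend on your setup):
location /api/ {
    proxy_pass https://www.api.com/api/;
    proxy_set_header Host www.api.com;
    proxy_ssl_server_name on;
}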

Related

NextJS FetchError ETIMEDOUT

I deployed my project on CentOS 7, port-forwarded to 8080, which means we access the site using the IP followed by :8080. Here is the Nginx config I used for my front-end and backend reverse proxy.
Site nginx config
server {
    listen 80;
    listen [::]:80;
    server_name myapp.com etc.com;

    # my api
    location / {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Auth-Request-Redirect "http://api.myapp.com";
        proxy_cache_bypass $http_upgrade;
        proxy_pass http://127.0.0.1:3333;
        proxy_http_version 1.1;
        proxy_buffer_size 128k;
        proxy_buffers 4 256k;
        proxy_busy_buffers_size 256k;
        proxy_redirect off;
        #proxy_cookie_path / "/; SameSite=lax; HTTPOnly; Secure";
    }

    # my app
    location /myapp {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Auth-Request-Redirect "http://ipaddress:8080/adminpage";
        proxy_cache_bypass $http_upgrade;
        proxy_pass http://127.0.0.1:3000;
        proxy_http_version 1.1;
        proxy_buffer_size 128k;
        proxy_buffers 4 256k;
        proxy_busy_buffers_size 256k;
        #proxy_cookie_path / "/; SameSite=lax; HTTPOnly; Secure";
    }
}
I got this error
FetchError: request to http://ipaddress:8080/auth/checkauth failed, reason: connect ETIMEDOUT ipaddress:8080
at ClientRequest.<anonymous> (/root/web/myapp/node_modules/next/dist/compiled/node-fetch/index.js:1:64142)
at ClientRequest.emit (node:events:527:28)
at Socket.socketErrorListener (node:_http_client:454:9)
at Socket.emit (node:events:527:28)
at emitErrorNT (node:internal/streams/destroy:157:8)
at emitErrorCloseNT (node:internal/streams/destroy:122:3)
at processTicksAndRejections (node:internal/process/task_queues:83:21) {
type: 'system',
errno: 'ETIMEDOUT',
code: 'ETIMEDOUT'
}
_middleware.ts
import type { NextRequest } from 'next/server';
import { NextResponse } from 'next/server';

export async function middleware(req: NextRequest) {
  const token = req.cookies;
  const urlClone = req.nextUrl.clone();
  urlClone.pathname = `/404`;
  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/auth/checkauth`, {
    method: 'GET',
    headers: {
      Authorization: `Bearer ${token.app_token}`,
    },
  });
  if (res.status === 200) {
    return NextResponse.next();
  }
  return NextResponse.rewrite(urlClone);
}
The app works fine in production, including my Axios API - I can log in through my app using axios requests. But my middleware, which uses the fetch API, and my getServerSideProps, which also uses fetch, get the connect ETIMEDOUT error.
What I tried so far:
a proxy agent
switching the fetch API to axios with an adapter
playing with the URL
setting CORS on my API to allow all origins and credentials
I can also call my API endpoint using curl inside the server and with Postman on my local machine.
My other Next.js app, deployed on a Linux server with the same procedure, also uses the fetch API in middleware and getServerSideProps and works fine, but that server is not port-forwarded to any port. I'm wondering if that could be the issue.
I used NextJS v12.1.6
I solved this by replacing the base URL of my await fetch with localhost, http://127.0.0.1:3333 (the API host), on the server. Works perfectly in my case.
const res = await fetch(`http://127.0.0.1:3333/auth/checkauth`, {
  method: 'GET',
  headers: {
    Authorization: `Bearer ${token.app_token}`,
  },
});
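A variation on this (a sketch, not part of the original answer): keep the public URL for the browser and read a separate, hypothetical API_INTERNAL_URL environment variable in server-side code such as middleware and getServerSideProps, so the 127.0.0.1 address isn't hard-coded:
// Server-side only: API_INTERNAL_URL is an assumed env var, e.g. http://127.0.0.1:3333
const apiBase = process.env.API_INTERNAL_URL || process.env.NEXT_PUBLIC_API_URL;
const res = await fetch(`${apiBase}/auth/checkauth`, {
  method: 'GET',
  headers: {
    Authorization: `Bearer ${token.app_token}`,
  },
});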

Serving a React app from Node while also having an API

So I am trying to run a Node/React setup on Ubuntu, behind an Nginx server.
The React app works fine; however, when I try to add API endpoints (in Node) for the React app to call, those endpoints don't work - neither for the app, nor when going to those endpoints directly from a browser.
This is what some of the code looks like:
const express = require('express');
const path = require('path');

const app = express();

app.use(express.static(path.join(__dirname, 'client/build')));
app.use(express.json());

app.get('/api/contactinfo', async (req, res) => {
  // Note: `Information` is a model (e.g. a Sequelize model) that is not required
  // anywhere in this file; if it is undefined at runtime, this handler will throw.
  let contactinfo = await Information.findAll({
    plain: true,
    attributes: ["phone", "email", "address"],
  });
  res.json(contactinfo);
});

app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname + '/client/build/index.html'));
});

const port = process.env.PORT || 5000;
app.listen(port);
So, for example, I might go to the React app's contact page (example.com/contact), and that loads fine. But the API call that the React app makes to the Node server fails. So it seems like the React routing is working, but the Node routing is not.
Likewise, if I go to the Node API directly (example.com/api/contactinfo), that fails with a 502 Bad Gateway.
My Nginx setup looks like this:
location / {
    proxy_pass http://localhost:5000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}
I've also got some SSL cert setup, though I'm not sure if that is relevant.
When I look at the Nginx error.log, this is what I see:
2020/09/02 15:36:54 [error] 1424#1424: *325 upstream prematurely closed connection while reading response header from upstream, client: 35.3.25.220, server: exampledomain.com, request: "GET /api/contactinfo HTTP/1.1", upstream: "http://127.0.0.1:5000/api/contactinfo", host: "exampledomain.com"
What exactly is causing my Node app's API endpoints to fail? I've tried increasing the timeout and several other things, and nothing seems to work. I've been trying to fix this problem for hours, but despite the fact that I can successfully get React to load, I can't get any Node endpoints to do so.
How do I fix this?
I don't know the error in your implementation, but here is how I would do it.
// Nginx
server {
    charset utf-8;
    listen 80 default_server;
    server_name _;

    # front-end files
    location / {
        root /opt/front-end;
        try_files $uri /index.html;
    }

    # node api reverse proxy
    location /api/ {
        proxy_pass http://localhost:5000/;
    }
}

// folder structure
.
opt
+-- front-end
|   +-- react app build
+-- back-end
    +-- node app

// Node app
...
app.listen(5000);
Based on the article How to Deploy a MEAN Stack App to Amazon EC2
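One detail worth noting (my reading of that config, not something stated in the article): because proxy_pass http://localhost:5000/ ends with a slash, Nginx strips the /api/ prefix before handing the request to Node, so with this layout the Express route would be defined without the prefix:
// With `location /api/ { proxy_pass http://localhost:5000/; }`,
// a request to /api/contactinfo reaches Node as /contactinfo.
app.get('/contactinfo', async (req, res) => {
  const contactinfo = await Information.findAll({
    plain: true,
    attributes: ['phone', 'email', 'address'],
  });
  res.json(contactinfo);
});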
The "problem" is certbot. If you have a self-signed certificate, it means that .... depending on how you configure certbot, basically everything now runs under port 443, not 80. So if your request is something like `http://www.domain.com/api/endpoint`, you'll get the error you are getting. What you need to do is use the https module from Node.
import bodyParser from "body-parser";
import express, { Express } from "express";
import fs from "fs";
import path from "path";
// import https
import https from "https";
import { routes } from "./routes";
import { logger } from "./utils/logger";

// The paths of these key files will depend on where certbot stored them
const servOptions = {
  cert: fs.readFileSync("/etc/letsencrypt/live/feikdomain.com/fullchain.pem"),
  key: fs.readFileSync("/etc/letsencrypt/live/feikdomain.com/privkey.pem"),
  ca: fs.readFileSync("/etc/letsencrypt/live/feikdomain.com/chain.pem"),
};

/**
 * Creates an instance of the `express` framework.
 *
 * @returns {import("express").Express} `express` instance.
 */
logger.info("express::expressApp");
const app: Express = express();
const build = path.join(__dirname, "../html");

app.use(express.static(build));
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true, limit: "5mb" }));
app.use("/static", express.static(build));
app.use(`${process.env.ENETO_CURRENT}`, routes());
app.use("*", function (req, res) {
  return res.status(200).sendFile(`${build}/index.html`);
});

const secure = https.createServer(servOptions, app);
secure.listen(Number(process.env.PORT), () => {
  console.log("servOptions: ", servOptions);
  logger.info("APP RUNNING");
});
The issue was that I had a misunderstanding of how Node logging worked under Nginx.
I thought any problems or console logs with the Node setup would be logged to the nginx/error.log.
This was not in fact the case.
The Node setup had another problem, which caused a crash whenever the endpoints were accessed.
The solution here is better logging that is not dependent on any sort of Nginx logs.
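As a rough sketch of what "better logging" can mean here (illustrative, not the exact code that was used): make sure errors surface in the Node process's own output (pm2 logs, journalctl, a log file) rather than expecting them in nginx/error.log:
// Express error-handling middleware: failed routes are logged instead of silently dying.
app.use((err, req, res, next) => {
  console.error('API error on', req.path, err);
  res.status(500).json({ error: 'internal error' });
});

// Process-level handlers so crashes show up in the Node logs.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason);
});
process.on('uncaughtException', (err) => {
  console.error('Uncaught exception:', err);
  process.exit(1);
});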
Solution
I just got this problem, and there are a couple of things you need.
First
you still need the https server setup shown in the code above (the servOptions object plus https.createServer(servOptions, app)).
Second
in your location block:
location / {
    proxy_pass http://localhost:5000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}
more specifically the proxy_pass http://localhost:5000; line:
change the http by adding an s at the end, like this:
proxy_pass https://localhost:5000;
location / {
    proxy_pass https://localhost:5000;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection 'upgrade';
    proxy_set_header Host $host;
    proxy_cache_bypass $http_upgrade;
}

Nginx audio files (wav/ogg/mp3) not working

Audio files on prod are not working, while they work fine on the dev environment (Angular 7).
Prod config (VPS):
Ubuntu 18
Nginx
Let's encrypt
AudioService:
export class AudioService {
  audio = new Audio();

  constructor() { }

  isPlaying() {
    return this.audio.currentTime > 0 && !this.audio.paused && !this.audio.ended && this.audio.readyState > 2;
  }

  play(name: string): void {
    this.audio.src = `assets/audio/${name}`;
    this.audio.crossOrigin = 'anonymous';
    this.audio.load();
    if (!this.isPlaying()) {
      this.audio.play();
    }
  }

  pause(): void {
    if (this.isPlaying()) {
      this.audio.pause();
    }
  }
}
CORS is enabled on the Node.js side (using NestJS). main.ts:
app.enableCors();
Chrome log:
Uncaught (in promise) DOMException: Failed to load because no supported source was found.
Firefox log:
NotSupportedError: The media resource indicated by the src attribute or assigned media provider object was not suitable.
Looking at the Network console, we can see myaudio.wav with:
Status Code: 206 Partial Content
Note: loading images works fine!
EDIT:
Nginx config /etc/nginx/sites-available/mywebsite:
# Redirection
server {
    # if ($host = mywebsite.com) {
    #     return 301 https://$host$request_uri;
    # } # managed by Certbot

    listen 80;
    listen [::]:80;
    server_name mywebsite.com www.mywebsite.com;
    return 301 https://$host$request_uri;
    #return 404; # managed by Certbot
}

# Config
server {
    server_name mywebsite.com www.mywebsite.com;
    root /home/foo/mywebsite/gui;
    index index.html index.htm;

    location / {
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header Host $http_host;
        proxy_set_header X-NginX-Proxy true;
        proxy_pass http://my.ip:3000/;

        # Websocket
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    if ($host = 'www.mywebsite.com') {
        return 301 https://mywebsite.com$request_uri;
    }

    listen [::]:443 ssl ipv6only=on; # managed by Certbot
    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/mywebsite.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/mywebsite.com/privkey.pem; # managed by Certbot
}
On the dev environment, localhost:4200/assets/audio/myaudio.wav → works fine.
On the prod environment, https://mywebsite.com/assets/audio/myaudio.wav → returns the home page.
Meanwhile https://mywebsite.com/assets/image.jpg → works fine.
Only the audio files don't work.
Set max_ranges to 0.
For your case, it would look something like this:
location ~ \.wav$ {
    max_ranges 0;
}
This means the rule applies to every wav file, regardless of its location. (max_ranges 0 disables byte-range support, so Nginx serves the whole file with a 200 instead of 206 Partial Content responses.)

How to correctly configure Nginx for Node.js REST API?

I have a Node application running on a server with Apache and Nginx as a reverse proxy.
On the same server also a Node REST API is running.
The JavaScript code looks as follows:
api.js
// Express
const express = require('express');
const bodyParser = require('body-parser');
const cors = require('cors');

// Express App
const app = express();

// Env
const PORT = process.env.PORT || 3000;
const NODE_ENV = process.env.NODE_ENV || 'development';

// Config
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
app.use(cors());

// Routes
const users = require('./routes/users');

// Angular Http content type for POST etc. defaults to text/plain
app.use(bodyParser.text(), function ngHttpFix(req, res, next) {
  try {
    req.body = JSON.parse(req.body);
    next();
  } catch (e) {
    next();
  }
});

app.use('/api', users);

app.listen(PORT, function() {
  console.log('Listen on http://localhost:' + PORT + ' in ' + NODE_ENV);
});
/routes/users.js
var models = require('../models');
var express = require('express');
var router = express.Router();

// get all users
router.get('/users', function(req, res) {
  models.Beekeeper.findAll({}).then(function(users) {
    res.json(users);
  });
});

module.exports = router;
The Nginx configuration looks as follows:
index index.html index.htm;

upstream api {
    server 127.0.0.1:3000;
}

server {
    listen 80;
    server_name example.com;
    return 301 https://$server_name$request_uri;
}

server {
    listen 443;
    root /var/www;

    ssl on;
    ssl_prefer_server_ciphers On;
    ssl_protocols TLSv1.2;
    ssl_session_cache shared:SSL:10m;
    ssl_session_timeout 10m;
    ssl_ciphers AES256+EECDH:AES256+EDH:!aNULL;
    add_header Strict-Transport-Security "max-age=63072000; includeSubdomains; preload";
    add_header X-Frame-Options DENY;
    add_header X-Content-Type-Options nosniff;
    ssl_dhparam /etc/nginx/ssl/dhparam.pem;
    ssl_certificate /etc/letsencrypt/live/example.com/cert.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-Ip $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /api {
        proxy_pass http://api;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-Ip $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        rewrite ^/api/?(.*) /$1 break;
        proxy_redirect off;
    }
}
The problem is that if I make an API call on my development server, for example localhost:3000/api/users, it works as expected.
However, if I make an API call on my production server, for example https://example.com/api/users, I get Cannot GET /users and a 404 NOT FOUND.
I suppose there is something wrong with my Nginx configuration; however, although I have already read numerous other posts about similar problems here on Stack Overflow, I could not solve the problem.
Notice that you're requesting this:
https://example.com/api/users
But the error says this:
Cannot GET /users
So the /api prefix is being stripped off the request path before being passed to your Node server.
Which is done by this line:
rewrite ^/api/?(.*) /$1 break;
Solution: remove that line.
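Alternatively (a sketch, not part of the accepted answer): if you prefer to keep the rewrite in Nginx, mount the router without the /api prefix on the Node side so the stripped path still matches:
// api.js - if the `rewrite ^/api/?(.*) /$1 break;` line stays in Nginx,
// mount the routes at the root instead of '/api':
app.use('/', users);
// https://example.com/api/users -> Nginx strips /api -> Express serves GET /users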

How to check with JavaScript if web server port is being blocked

I use an Nginx proxy to pipe data back and forth over WebSocket between a web application and a Node.js application.
In doing this, all requests that open the web page and call PHP functions route through port 80, and all data going through the WebSocket is directed to port 8080 or 8008.
The problem: port 8080 is blocked on the mobile network, so people cannot use a mobile phone (Android) to run the web app, and port 8008 is blocked in some offices and homes, so those users cannot use a PC to send and receive data through the app.
I'm able to set up the Nginx proxy to receive this data separately, but how do I check which port is blocked at the time? I've tried this workaround, but it isn't working:
var socket = new WebSocket('ws://62.57.141.143:8008/') || new WebSocket('ws:// 62.57.141.143:8080/');
I want something like this:
var socket;
if ( port 8080 is blocked ) {
  socket = new WebSocket('ws://62.57.141.143:8008/');
} else {
  socket = new WebSocket('ws://62.57.141.143:8080/');
}
Thank you and hope I make myself clear
Finally, I found a solution.
function openWebsocket(url) {
  try {
    socket = new WebSocket(url);
    socket.onopen = function() {
      console.log('Socket is now open.');
    };
    socket.onerror = function(error) {
      console.error('There was an un-identified Web Socket error');
    };
    socket.onmessage = function(message) {
      console.info("Message: %o", message.data);
    };
  } catch (e) {
    console.error('Sorry, the web socket at "%s" is un-available', url);
  }
}
openWebsocket("ws://62.57.141.143:8008");
openWebsocket("ws://62.57.141.143:8080");
Try to connect to it. If you get an error, fall back to another port.
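Here is a sketch of that fallback approach (illustrative; connectWithFallback is not from the original answer). Note that a blocked port usually surfaces asynchronously via onerror/onclose rather than as an exception you can catch, so the retry has to happen in those handlers:
function connectWithFallback(urls, onOpen) {
  if (urls.length === 0) {
    console.error('No reachable WebSocket endpoint.');
    return;
  }
  var ws = new WebSocket(urls[0]);
  var settled = false;
  // Treat a handshake that hangs (e.g. a silently dropped port) as blocked.
  var timer = setTimeout(function () { ws.close(); }, 5000);

  ws.onopen = function () {
    settled = true;
    clearTimeout(timer);
    onOpen(ws);
  };
  ws.onerror = ws.onclose = function () {
    if (settled) { return; }  // already open, or already retried
    settled = true;
    clearTimeout(timer);
    connectWithFallback(urls.slice(1), onOpen);
  };
}

connectWithFallback(['ws://62.57.141.143:8080/', 'ws://62.57.141.143:8008/'], function (socket) {
  console.log('Connected via', socket.url);
});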
If you are already using nginx, take advantage of the proxy_pass:
Users will still connect to port 80, and Nginx will proxy it to port 8080.
Example: Proxy pass everything under /socket.io/ to http://127.0.0.1:8080.
location /socket.io/ {
    proxy_pass http://127.0.0.1:8080;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}
For the client side, try this:
If the client can connect, the corresponding networkPort80xx variable will be set to true.
In your web application:
<script src="//62.57.141.143:8080/detect.js"></script>
<script src="//62.57.141.143:8008/detect.js"></script>
<script>
  var networkPort8080 = false,
      networkPort8008 = false;

  if (window.NetworkPort8080) { networkPort8080 = true; }
  if (window.NetworkPort8008) { networkPort8008 = true; }

  var socket;
  if (networkPort8080) {
    socket = new WebSocket('ws://62.57.141.143:8080/');
  } else if (networkPort8008) {
    socket = new WebSocket('ws://62.57.141.143:8008/');
  }
</script>
In 62.57.141.143:8080/detect.js
window.NetworkPort8080=function(){return true;}
In 62.57.141.143:8008/detect.js
window.NetworkPort8008=function(){return true;}
