Why is my remote server not routing requests correctly? - javascript

I am having trouble getting my VPS set up correctly. I copied all the files recursively from my local server to the remote server, but POST and GET requests to the server fail with a 404 Not Found error. The routing works perfectly on my localhost server, but unfortunately it doesn't on the remote.
Server Code:
// getting datastores
const datastore = require('nedb');
const customerRecordsdb = new datastore("CustomerRecords.db");
customerRecordsdb.loadDatabase();

// importing express
const port = 3000;
const express = require('express');
const application = express();
const path = require('path');
const newPath = path.join(__dirname + '../../..');

application.listen(port, () => console.log("Listening at " + port));
application.use(express.json({ limit: '1mb' }));

application.get('/RetrieveCustomerInformation', (request, response) => {
    console.log("SUCCESS!");
    customerRecordsdb.find({}, (error, data) => {
        if (error) {
            response.end();
            return;
        }
        response.json(data);
    });
});
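Note that newPath is computed but never used, and nothing in the posted server code serves the front-end files, so presumably that happens elsewhere. For reference, a minimal sketch of how the static wiring usually looks in Express (the public directory name is an assumption, not taken from the question):

// hypothetical: serve the front-end files so the page that calls
// /RetrieveCustomerInformation is reachable on the remote server too
application.use(express.static(path.join(__dirname, 'public')));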
Client Code:
// loading data
let numberRecords = 0;

async function loadClientRecords() {
    const response = await fetch('/RetrieveCustomerInformation');
    const data = await response.json();
    for (let i = 0; i < data.length; i++) {
        numberRecords++;
        insertNewRecord(data[i]);
    }
}

loadClientRecords()
    .then(response => {
        // was successful
    })
    .catch(error => {
        //console.log(error);
    });
[VPS page output] https://i.stack.imgur.com/ghoIa.png (the server is not outputting anything)
[Localhost page output] https://i.stack.imgur.com/I6mqE.png (the server outputs "SUCCESS!" on every refresh)
As previously mentioned, I simply copied the directory over to the remote server.
Any help as to what the problem could be will be greatly appreciated!
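Since the catch block in the client code swallows errors, one quick diagnostic (a sketch, not part of the original post) is to check response.ok before parsing, so the remote 404 surfaces as a readable error instead of a silent failure:

async function loadClientRecords() {
    const response = await fetch('/RetrieveCustomerInformation');
    if (!response.ok) {
        // surfaces e.g. "Request failed: 404 Not Found" instead of a JSON parse error
        throw new Error('Request failed: ' + response.status + ' ' + response.statusText);
    }
    const data = await response.json();
    for (let i = 0; i < data.length; i++) {
        numberRecords++;
        insertNewRecord(data[i]);
    }
}

If the 404 only appears on the remote box, the usual suspects are a reverse proxy that doesn't forward the route to the Node process, or the process serving a different directory than the one that was copied over.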

Related

HTML page loading but CSS is not loading in Node.js app

I've made a weather app using pure Node.js, but when I load the website, it only loads the HTML and not the CSS.
Here is my index.js:
const fs = require("fs");
const http = require("http");
var requests = require("requests");

const homeFile = fs.readFileSync('home.html', 'utf-8');

const replaceVal = (tempVal, orgVal) => {
    let temperature = tempVal.replace("{%tempVal%}", orgVal.main.temp);
    temperature = temperature.replace("{%minTemp%}", orgVal.main.temp_min);
    temperature = temperature.replace("{%maxTemp%}", orgVal.main.temp_max);
    temperature = temperature.replace("{%country%}", orgVal.sys.country);
    return temperature;
};

const server = http.createServer((req, res) => {
    if (req.url == '/') {
        requests('https://api.openweathermap.org/data/2.5/weather?q=Pune&appid=0bfd89f4982a2d416a4ac5d299d03f9e&units=metric')
            .on('data', function (chunk) {
                const objData = JSON.parse(chunk);
                const arr = [objData];
                const realTimeData = arr.map(val => {
                    return replaceVal(homeFile, val);
                }).join("");
                res.write(realTimeData);
            })
            .on('end', function (err) {
                if (err) return console.log('connection closed due to errors', err);
                res.end();
            });
    }
});

server.listen(8000, "127.0.0.1");
Can anyone help in solving this issue?
It looks like you are only sending homeFile in your response for the path "/". If you want to serve your CSS as well, you will need another path that matches CSS files. That way, when the HTML page requests the CSS file, the server can respond to it.
if(req.url === "/"){
//send html here
}else if(req.url.match("\.css$")){
//send css here
}
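A sketch of what "send css here" could look like, assuming the stylesheet sits next to index.js (the file layout is an assumption, not from the question):

else if (req.url.match(/\.css$/)) {
    // read the requested stylesheet and serve it with the text/css MIME type,
    // otherwise the browser will refuse to apply it
    fs.readFile('.' + req.url, (err, css) => {
        if (err) {
            res.writeHead(404);
            return res.end('Not found');
        }
        res.writeHead(200, { 'Content-Type': 'text/css' });
        res.end(css);
    });
}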

How to get data from a custom function in JS using the https Node module

How do I get data from my function Data() instead of the JSONPlaceholder mock API using the Node https/http module, and how do I make an endpoint for this data so the front end can consume the response, just like in Angular?
My mock backend.js file:
const https = require('https');

https.get(Data.data, res => {
    let data = [];
    const headerDate = res.headers && res.headers.date ? res.headers.date : 'no response date';
    console.log('Status Code:', res.statusCode);
    console.log('Date in Response header:', headerDate);

    res.on('data', chunk => {
        data.push(chunk);
    });

    res.on('end', () => {
        console.log('Response ended: ');
        const users = JSON.parse(Buffer.concat(data).toString());
        for (const user of users) {
            console.log(`Got user with id: ${user.id}, name: ${user.name}`);
        }
    });
}).on('error', err => {
    console.log('Error: ', err.message);
});
function Data() {
    var data = {};
    // ........
    return data;
}
Your time and help will be really appreciated. Thanks :)
Hurray! I got it using the following code and Express in Node.js. I simply call my custom method that creates the data inside an Express get endpoint. When I hit the endpoint, the response is my custom method's result.
const express = require('express');
const cors = require('cors');

const app = express();
app.use(cors());

app.get('/data', (req, res) => {
    res.send(Data());
});

app.listen(3000, () => {
    console.log('server is listening on port 3000');
});

function Data() {
    var data = {};
    // ..........
    //console.log(JSON.stringify(data));
    return JSON.stringify(data);
}
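The original question also asked how to consume the endpoint from the front end; a minimal, hypothetical fetch example (not part of the answer) would be:

// hypothetical client-side usage of the /data endpoint above
fetch('http://localhost:3000/data')
    .then(res => res.json())
    .then(data => console.log('Dashboard data:', data))
    .catch(err => console.error(err));

Since the server registers the cors() middleware, this works even when the front end is served from a different origin than port 3000.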

Node JS HTTP Proxy hanging up

I have an http-proxy that proxies any website and injects a custom JS file before serving the HTML back to the client. Whenever I try to access the proxied website, it hangs up, or the browser seems to load indefinitely. But when I check the HTML source, my custom JavaScript file was successfully injected. Here is the code:
const cheerio = require('cheerio');
const http = require('http');
const httpProxy = require('http-proxy');
const { ungzip } = require('node-gzip');

_initProxy(host: string) {
    let proxy = httpProxy.createProxyServer({});

    let option = {
        target: host,
        selfHandleResponse: true
    };

    proxy.on('proxyRes', function (proxyRes, req, res) {
        let body = [];
        proxyRes.on('data', function (chunk) {
            body.push(chunk);
        });
        proxyRes.on('end', async function () {
            let buffer = Buffer.concat(body);
            if (proxyRes.headers['content-encoding'] === 'gzip') {
                try {
                    let $ = null;
                    const decompressed = await ungzip(buffer);
                    const scriptTag = '<script src="my-customjs.js"></script>';
                    $ = await cheerio.load(decompressed.toString());
                    await $('body').append(scriptTag);
                    res.end($.html());
                } catch (e) {
                    console.log(e);
                }
            }
        });
    });

    let server = http.createServer(function (req, res) {
        proxy.web(req, res, option, function (e) {
            console.log(e);
        });
    });

    console.log("listening on port 5051");
    server.listen(5051);
}
Can someone please tell me if I am doing anything wrong? It looks like node-http-proxy dies a lot, and I can't rely on it much, since the proxy sometimes works and then dies on the next run, depending on how many times I've run the server.
Your code looked fine so I was curious and tried it.
Although you do log a few errors, you don't handle several cases:
- The server returns a response with no body (cheerio will generate an empty HTML document when this happens)
- The server returns a response that is not gzipped (your code will silently discard the response)
I made a few modifications to your code.
Change the initial options:
let proxy = httpProxy.createProxyServer({
    secure: false,
    changeOrigin: true
});
secure: false skips TLS certificate verification; changeOrigin: true sends the correct Host header to the target.
Remove the if statement and replace it with a ternary
const isCompressed = proxyRes.headers['content-encoding'] === 'gzip';
const decompressed = isCompressed ? await ungzip(buffer) : buffer;
You can also remove the two awaits on the cheerio calls; cheerio is not async and doesn't return an awaitable.
Final code
Here's the final code, which works. You mentioned that "it looks like node-http-proxy is dying a lot [...] depending on how many times I ran the server." I experienced no such stability issues, so if that is happening, your problem may lie elsewhere (bad RAM?).
const cheerio = require('cheerio');
const http = require('http');
const httpProxy = require('http-proxy');
const { ungzip } = require('node-gzip');

const host = 'https://github.com';

let proxy = httpProxy.createProxyServer({
    secure: false,
    changeOrigin: true
});

let option = {
    target: host,
    selfHandleResponse: true
};

proxy.on('proxyRes', function (proxyRes, req, res) {
    console.log(`Proxy response with status code: ${proxyRes.statusCode} to url ${req.url}`);
    if (proxyRes.statusCode == 301) {
        throw new Error('You should probably do something here, I think there may be an httpProxy option to handle redirects');
    }
    let body = [];
    proxyRes.on('data', function (chunk) {
        body.push(chunk);
    });
    proxyRes.on('end', async function () {
        let buffer = Buffer.concat(body);
        try {
            let $ = null;
            const isCompressed = proxyRes.headers['content-encoding'] === 'gzip';
            const decompressed = isCompressed ? await ungzip(buffer) : buffer;
            const scriptTag = '<script src="my-customjs.js"></script>';
            $ = cheerio.load(decompressed.toString());
            $('body').append(scriptTag);
            res.end($.html());
        } catch (e) {
            console.log(e);
        }
    });
});

let server = http.createServer(function (req, res) {
    proxy.web(req, res, option, function (e) {
        console.log(e);
    });
});

console.log("listening on port 5051");
server.listen(5051);
I ended up writing a small Python server using CherryPy and proxied the web app with mitmproxy. Everything is now working smoothly. Maybe I was doing it wrong with node-http-proxy, but I also became skeptical about using it in a production environment.

Socket connection in an API route controller retains the data of a previous call

I have an API endpoint in my Node/Express app. The endpoint is responsible for uploading files. There are several stages involved in the upload process, like image conversion, sending images to another third-party API, etc. I am using socket.io to tell the client about the current stage of the upload.
The problem is that the socket connection works fine on the first call, but on my second call to the endpoint, the connection handler runs twice and retains the data I sent in the previous call.
Here's my code:
server.js
import ClientsRouter from './api/routes/clients';
import bodyParser from 'body-parser';
import express from 'express';
import http from 'http';
import io from 'socket.io';

const port = process.env.PORT || 3000;
const app = express();
const server = http.Server(app);
const socket = io(server);

app.set('socket', socket);
app.use(bodyParser.urlencoded({ extended: false }));
app.use(bodyParser.json());
app.set('view engine', 'ejs');
app.use(express.static('dist'));
app.use('/uploads', express.static('uploads'));
app.use('/api/clients', ClientsRouter);

server.listen(port, () => console.log(`Server Listening on ${process.env.URL}`));
api/routes/clients.js
import express from 'express';
import ClientsController from '../controllers/clients';
ClientsRouter.post('/uploadClientData/', clientDataUpload.array('client_data'), ClientsController.uploadClientData);
controllers/clients.js
const uploadClientData = async (req, res) => {
    try {
        const files = req.files;
        const clientFolder = req.body.client_folder;
        const { dbxUser } = req;
        const team_member_id = req.body.team_member_id;
        const data = req.files.map(i => ({ team_member_id, destination: i.destination.substring(1), filename: i.filename, status: 1 }));
        const io = req.app.get("socket");
        console.log("Outside Socket", data); // This contains my currently passed data
        io.on('connection', async socket => {
            console.log("Socket Connection established");
            console.log("Inside Socket", data); // But this contains my current data along with the data that I passed in the previous call
            await uploadQueue.collection.insertMany(data);
            socket.emit('upload stage', { upload_stage: 2, progress: 33 });
            await helpers.convertImagesToWebResolution(team_member_id, req.body.dpi, req.body.resolution);
            socket.emit('upload stage', { upload_stage: 3, progress: 66 });
            await helpers.uploadImagesToDropbox(team_member_id, dbxUser, clientFolder);
            socket.emit('upload stage', { upload_stage: 4, progress: 100 });
        });
        res.status(200).json({ message: "Uploaded" });
    } catch (error) {
        console.log(error);
        res.status(500).json({
            error
        });
    }
};
And in my front-end React component:
componentDidMount() {
    const { currentFolder } = this.props;
    this.setState({ client_folder: currentFolder }, () => this.afterFileSelect());
}

componentDidUpdate(prevProps) {
    const { selectedFiles } = this.props;
    if (prevProps.selectedFiles !== selectedFiles) {
        this.afterFileSelect();
    }
}

afterFileSelect = async () => {
    const { selectedFiles, setSelectedFiles, currentFolder, user, uploadSettings } = this.props;
    let formData = new FormData();
    formData.append('client_folder', currentFolder);
    formData.append('team_member_id', user.team_member_id);
    formData.append('resolution', uploadSettings.resolution.split("x")[0]);
    formData.append('dpi', uploadSettings.dpi);
    for (let selectedFile of selectedFiles) {
        formData.append('client_data', selectedFile);
    }
    let uploadResp = uploadSettings.convert_web_res ? await uploadClientData(formData) : await dropboxDirectUpload(formData);
    const endpoint = uploadResp.config.url;
    const host = endpoint.substring(0, endpoint.indexOf("api"));
    const socket = socketIOClient(host);
    socket.on("upload stage", data => {
        this.setState({ upload_stage: data.upload_stage, progress: data.progress });
        data.upload_stage === 4 && this.setState({ client_folder: "" });
    });
}
Also, I want to know if this is the correct way to track upload progress.
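The accumulation happens because the controller registers a new io.on('connection') listener on every request, and each listener keeps the data it closed over, so a later connection replays earlier uploads as well. A minimal sketch of one common alternative (the event and room names here are assumptions, not from the question): register the connection handler once at startup and emit from the controller to an already-joined room:

// hypothetical: in server.js, register the connection handler once
socket.on('connection', client => {
    // the client announces which upload it wants progress events for
    client.on('join upload', id => client.join('upload-' + id));
});

// hypothetical: in the controller, emit to the room instead of
// registering another 'connection' listener per request
const io = req.app.get('socket');
await uploadQueue.collection.insertMany(data);
io.to('upload-' + team_member_id).emit('upload stage', { upload_stage: 2, progress: 33 });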

Streaming JSON with Node.js

I have been given a task: I'm trying to make routes/endpoints using just Node, to better demonstrate what is happening under the covers with Express. But I have to use streams.
I have two routes, GET /songs and GET /refresh-songs, with the following requirements:
We need the following two endpoints:
- GET /songs
- GET /refresh-songs
All endpoints should return JSON with the correct headers.
/songs should...
- stream data from a src/data/songs.json file
- not crash the server if that file or directory doesn't exist
- bonus points for triggering refresh code in this case
- bonus points for compressing the response
/refresh-songs should...
- return immediately
- not hold the response while songs are being refreshed
- return a 202 status code with a status JSON response
- continue getting songs from iTunes
The frontend should have:
- a UI/button to trigger this endpoint
This is what I have so far in my server.js file where my endpoints will live.
const { createServer } = require('http');
const { parse: parseUrl } = require('url');
const { createGzip } = require('zlib');
const { songs, refreshSongs } = require('./songs');
const fs = require('fs');

const stream = fs.createReadStream('./src/data/songs.jso');

const PORT = 4000;

const server = createServer(({ headers, method, url }, res) => {
    const baseResHeaders = {
        // CORS stuff is gone. :(
        'content-type': 'application/json'
    };

    // Routing ¯\_(ツ)_/¯
    var path = url.parseUrl(url).pathname;

    function onRequest(request, response) {
        response.writeHead(200, 'Content-Type': baseResHeaders['content-type']);
        stream.on('data', function (err, data) {
            if (err) {
                response.writeHead(404);
                response.write('File not found');
                refreshSongs();
            } else {
                response.write(createGzip(data));
            }
            response.end();
        });
    }

    switch (path) {
        case '/songs':
            onRequest();
            break;
        case '/refresh-songs':
            onRequest();
            break;
    }

server.on('listening', () => {
    console.log(`Server running at http://localhost:${PORT}/`);
});
I am wondering whether I am wiring up the onRequest method correctly, and whether the switch statement will correctly intercept those URLs.
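For comparison, here is a minimal sketch of how /songs could stream and gzip the file per request while satisfying the "don't crash if the file is missing" requirement. It is one possible shape under the stated requirements, not a finished solution; the actual iTunes refresh logic is omitted:

const { createServer } = require('http');
const { createGzip } = require('zlib');
const fs = require('fs');

const PORT = 4000;

createServer((req, res) => {
    if (req.url === '/songs') {
        // a new stream per request; a single shared stream is consumed
        // after the first request and never emits data again
        const stream = fs.createReadStream('./src/data/songs.json');
        stream.on('error', () => {
            // a missing file emits 'error' here instead of crashing the server
            res.writeHead(404, { 'Content-Type': 'application/json' });
            res.end(JSON.stringify({ error: 'songs.json not found' }));
        });
        stream.once('open', () => {
            // headers are only written once the file is known to exist
            res.writeHead(200, {
                'Content-Type': 'application/json',
                'Content-Encoding': 'gzip'
            });
            stream.pipe(createGzip()).pipe(res);
        });
    } else if (req.url === '/refresh-songs') {
        // return immediately; the refresh itself would run in the background
        res.writeHead(202, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ status: 'refresh started' }));
    } else {
        res.writeHead(404, { 'Content-Type': 'application/json' });
        res.end(JSON.stringify({ error: 'not found' }));
    }
}).listen(PORT, () => console.log(`Server running at http://localhost:${PORT}/`));

Waiting for the stream's 'open' event before writing the 200 header avoids the trap where a missing file is discovered only after headers have already been sent.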
