I'm developing a React web app and I'm using the create-react-app npm utility.
My app communicates with a server which, during development, runs on my local machine. For this reason, all the Ajax requests I make use a localhost:port address.
Of course, when I build and deploy the project to production, I need those addresses to change to the production ones.
I am used to the workflow of the preprocess Grunt plugin (https://github.com/jsoverson/grunt-preprocess), which makes it possible to mark parts of the code to be excluded, included or changed at build time.
For example:
//#if DEV
const SERVER_PATH = "localhost:8888";
//#endif
//#if !DEV
const SERVER_PATH = "prot://example.com:8888";
//#endif
Do you know if there is a way to do such a thing inside the create-react-app development environment?
Thank you in advance!
I'm not sure exactly how your server-side code handles requests, but you shouldn't have to change your code when deploying to production if you use relative paths in your Ajax queries. For example, here's an Ajax query that uses a relative path:
$.ajax({
  url: "something/getthing/",
  dataType: 'json',
  success: function (data) {
    // do a thing
  }
});
Hopefully that helps :)
When creating your networkInterface, use process.env.NODE_ENV to determine which path to use.
// In a create-react-app build, NODE_ENV is 'production', so pick the path accordingly.
// Declared with a ternary so SERVER_PATH is visible outside the check.
const SERVER_PATH = process.env.NODE_ENV !== 'production'
  ? "localhost:8888"            // development
  : "prot://example.com:8888";  // production
Your application will automatically detect whether you are in production or development and therefore create the const SERVER_PATH with the correct value for the environment.
According to the docs, the dev server can proxy your requests. You can configure it in your package.json like this:
"proxy": "http://localhost:4000",
Another option is to ask the browser for the current location. It works well when your API and static files are on the same backend, which is common with Node.js and React.
Here you go:
const { protocol, host } = window.location
// protocol already ends with ':' (e.g. "https:"), so add the '//' separator
const endpoint = `${protocol}//${host}`
// fetch(endpoint)
I am working on my Next.js project under Docker, and when using getStaticProps my backend API is not available (it is also running under Docker). So I connected the frontend to the backend via networks, and if I hardcode the API address for the SSR request, it works. But when I try to use serverRuntimeConfig and publicRuntimeConfig so I can switch between them depending on where the code is running, I get {} for serverRuntimeConfig. However, publicRuntimeConfig is fine and I can access the API from it.
My next.config.js is:
module.exports = {
  publicRuntimeConfig: {
    // Will be available on both server and client
    baseUrl: 'http://localhost/api/v1',
  },
  serverRuntimeConfig: {
    // Will only be available on the server side
    baseUrl: 'http://backend_nginx_1/api/v1/',
  },
}
Am I missing something?
This will sound dumb, but I spent two hours watching the file get recognized by the system and still getting {}.
Restarting the server gives you access to the contents of the file.
That was my solution.
It was not mentioned in the documentation:
https://nextjs.org/docs/api-reference/next.config.js/runtime-configuration
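For completeness, the config is read on the page side with getConfig from next/config. A rough sketch using the baseUrl keys from the question; the /items endpoint and the fallback logic are my own assumptions, and it presumes a Next.js version that polyfills fetch on the server:
// pages/index.js (sketch only)
import getConfig from 'next/config'

const { serverRuntimeConfig, publicRuntimeConfig } = getConfig()

export async function getStaticProps() {
  // getStaticProps runs on the server, so the internal Docker URL can be used,
  // with the public URL as a fallback
  const baseUrl = serverRuntimeConfig.baseUrl || publicRuntimeConfig.baseUrl
  const res = await fetch(`${baseUrl}/items`)   // '/items' is a hypothetical endpoint
  const items = await res.json()
  return { props: { items } }
}

export default function Home({ items }) {
  return <pre>{JSON.stringify(items, null, 2)}</pre>
}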
I'm new to coding. I have a Node.js application which I have deployed using Heroku. I want to use a custom domain for the application, as well as have SSL active on that custom domain. When the application uses the default domain given by Heroku, SSL is automatically in place; however, for SSL to work when I use a custom domain, I have to include the following code in my app.js file:
app.use((req, res, next) => {
  if (req.header('x-forwarded-proto') !== 'https')
    res.redirect(`https://${req.header('host')}${req.url}`)
  else
    next()
});
This works fine; however, when I am maintaining my app locally (in VS Code) and use localhost:3000 for testing purposes, I have to comment out the code above in order to view the app on localhost, because of something to do with localhost not working with HTTPS.
So my question is: is there some code (an if statement or something of the like) that will run that block only in a live environment and skip it on localhost? This is mainly so I don't have to keep commenting the code out and back in before and after deployment and testing.
If you have any other advice or better solutions for this kind of thing I would appreciate it.
Cheers,
Sam
You can use environment variables to differentiate between the production environment (the live website on Heroku) and the development environment (localhost). Specifically, you can set an environment variable NODE_ENV (just a popular naming convention, nothing built-in) to a value that can be used inside your code to perform actions based on the environment.
You can access the value by writing
const env = process.env.NODE_ENV;
Note: You have to set the environment variable first, otherwise process.env.NODE_ENV is just going to be undefined.
How to set the environment variables?
There are a couple of ways, like having a .env file, passing it through the CLI, etc. I'll show you a quick way. While running your server on localhost, write this:
NODE_ENV=development node server.js
Now, inside your server.js, you can do something like
// If NODE_ENV is undefined, assume production
const env = process.env.NODE_ENV || 'production';
app.use((req, res, next) => {
  if (env === 'production' && req.header('x-forwarded-proto') !== 'https') {
    res.redirect(`https://${req.header('host')}${req.url}`)
  } else {
    next()
  }
})
You can have as many environments as you like (development, testing, production, staging, etc.). Also check out the dotenv module.
[Edit]
Create a folder called config and add settings.js inside it:
package.json
app.js
--config // Will hold your configuration settings
----settings.js // Your settings
In settings.js
const settings = {
  development: {
    // Development configuration settings
    email: 'development@gmail.com'
  },
  staging: {
    // Staging configuration settings
    email: 'staging@gmail.com'
  },
  production: {
    // Production configuration settings
    email: 'production@gmail.com'
  },
}
const getSettings = () => {
  if (!process.env.NODE_ENV) return settings.development
  if (process.env.NODE_ENV === 'staging') return settings.staging
  return settings.production
}
module.exports = getSettings()
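For example, in app.js (assuming the folder layout above) the settings can then be consumed like this:
// app.js (usage sketch)
const settings = require('./config/settings');

console.log(settings.email); // 'development@gmail.com' when NODE_ENV is unset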
What is process.env.NODE_ENV? It's an environment variable which, if you haven't declared it, will be undefined; you can give it different values by setting it from the terminal.
When you deploy to production, run this command in the host's terminal:
export NODE_ENV=production
DON'T RUN THIS IN YOUR LOCAL TERMINAL; ask your web host how to run a command on their platform to store an environment variable.
For Heroku: heroku config:set NODE_ENV=production
DON'T WRITE ANY SENSITIVE DATA IN YOUR CODE! Store it in environment variables too.
Also, for the port, you would want to do something like this in your app.js:
const port = process.env.PORT || 3000
I am building my first React app and am not sure about front-end security. I am making a call to the following third-party library: emailjs.sendForm(serviceID, templateID, templateParams, userID);
The userID field is sensitive information. I make the following call in my onSubmit handler. I am wondering if I need to secure this information somehow? Also, is there a way to check whether a user could see this information by inspecting the page and finding the code?
emailjs
  .sendForm(
    "gmail",
    "client-email",
    "#form",
    "**here_is_the_sensitive_info**"
  )
  .then(() => {
    resetForm({});
  })
  .catch(() => {
    const acknowledgement = document.createElement("H6");
    acknowledgement.innerHTML = "Something went wrong, please try again.";
    document.getElementById("form").appendChild(acknowledgement);
  });
In this case, EmailJS is meant to be used in the browser, so I don't think that the userId is sensitive at all.
In their own documentation, you can see the following instruction to get started.
<script type="text/javascript"
        src="https://cdn.jsdelivr.net/npm/emailjs-com@2.4.1/dist/email.min.js">
</script>
<script type="text/javascript">
  (function(){
    emailjs.init("YOUR_USER_ID");
  })();
</script>
That said, anyone can definitely see this in the source of the page in their browser. You are right to be cautious with anything sensitive in client-side JavaScript.
To avoid anyone using your userId on their own website (which is very unlikely, since it only triggers the emails that you configured), you can apparently whitelist your own domain with their paid plan.
The .env file, when used in a frontend project, only serves to set environment variables that are used at compilation time. The file never gets to the browser, but the values are often just interpolated (e.g. with the DefinePlugin) in the final bundle source, so there's nothing necessarily more secure here.
WARNING: Do not store any secrets (such as private API keys) in your React app!
Environment variables are embedded into the build, meaning anyone can view them by inspecting your app's files.
# (s) for sensitive info
.env -> compilation -> bundle -> browser -> third-party
(s) (s) (s) (s) (s)
That said, when used in a Node.js server, the .env file serves to set, again, environment variables, but this time, at the start of the application. These values are not shared with the frontend though, so one way to use this as a secure solution is to expose your own endpoint, whitelisting only your own domain, which then uses the sensitive information only on the server.
.env -> Node.js server -> third-party
(s) (s) (s)
^
/ (api call)
...bundle -> browser
But then again, here, EmailJS' userId is not sensitive information.
You should never have sensitive info in the frontend. You should, for instance, have a Node.js instance running that exposes an endpoint to the frontend, and call that endpoint. Then, inside your Node.js application, you should have a .env file with your credentials.
Then just use the .env info from your Node.js server.
If you have sensitive info in the frontend, you are exposing everything.
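As a rough sketch of that idea, assuming an Express backend; the route name and the EMAILJS_PRIVATE_KEY variable are illustrative placeholders rather than anything from this thread:
// server.js: the credential stays on the server; the React app only ever calls /api/send-email
require('dotenv').config();                       // loads EMAILJS_PRIVATE_KEY from .env into process.env
const express = require('express');

const app = express();
app.use(express.json());

app.post('/api/send-email', async (req, res) => {
  const apiKey = process.env.EMAILJS_PRIVATE_KEY; // never shipped to the browser
  // ...call the third-party email service here with apiKey and req.body...
  res.json({ ok: true });
});

app.listen(process.env.PORT || 3000);
The frontend then calls fetch('/api/send-email', { method: 'POST', ... }) and never sees the key.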
1. First we need to install dotenv in the project.
Command: npm install dotenv
Now check your package.json to confirm it was installed; you should see something like "dotenv": "^10.0.0". Then configure it at the top of the file where you want to use the .env values, like this: require('dotenv').config();
After that, any value defined in the .env file can be read through process.env.
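A minimal sketch, with MY_API_KEY being a made-up name for illustration:
// .env file contents:   MY_API_KEY=abc123
// index.js
require('dotenv').config();           // must run before process.env is read
console.log(process.env.MY_API_KEY);  // -> 'abc123'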
For more information about handling sensitive info, please see:
https://www.youtube.com/watch?v=17UVejOw3zA
Thank you.
I have a pretty standard setup with a Node backend that serves an SPA as a webpack bundle, as well as the API that the SPA uses. The backend is using Koa 2.
Hot reload has been working just fine for the front-end part, but I now have a bit more work to do on the backend and need to make my round trips faster.
I'm wondering what the best approach is. I started out by:
1. Bootstrapping from webpack
I used webpack-serve, which at the time seemed standard, and added the Koa app as middleware in the webpack config:
serve: {
  add: app => {
    require('./src/node/backend')(app)
  }
  ....
This however doesn't hot reload the backend, and it's pretty painful since I have to restart the whole webpack-serve command on every backend change.
So then I tried:
2. Bootstrapping from the node backend with webpack as middleware
const Koa = require('koa')
const koaWebpack = require('koa-webpack')
const webpack = require('webpack')

const app = new Koa()
const config = require('../../webpack.dev.js')
const compiler = webpack(config)

koaWebpack({ compiler }).then(middleware => {
  require('./backend')(app) // delegate to the common backend setup
  app.use(middleware)       // register the webpack middleware before listening
  app.listen(process.env.port)
  return app
})
This works fine for the frontend too, but I still have no reloading of the backend, so it's basically the same experience.
3. Running webpack-dev-server and the backend as different processes.
This works fine, and I can then use nodemon for the Koa backend, which is good enough for me, but I then have to do some shuffling of ports, I think.
I guess webpack-dev-server/webpack-serve can act as a proxy and pass requests through to the backend unless they hit any of my frontend stuff. But it all feels tedious; I'd rather stick it all together on the same port.
So is there any other easy way to have the two builds hot reloaded but still running together on the same port…?
I found this project, https://github.com/vlazh/node-hot-loader, which might be interesting, but I really felt the need to ask first in case I'm missing something more obvious, since I'm actually happy with restarting my server nodemon-style… (it's small and fast).
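For reference, the proxy setup from option 3 above would look roughly like this in a webpack-dev-server config, assuming the Koa backend listens on port 3001 and all API routes start with /api (both the port and the path prefix are assumptions):
// webpack.dev.js (sketch only)
module.exports = {
  // ...rest of the dev config...
  devServer: {
    port: 8080,                        // where the frontend is served
    proxy: {
      '/api': 'http://localhost:3001'  // anything under /api is forwarded to the Koa backend
    }
  }
}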
After extensive searching I found a few tools tackling this, but they all used webpack to bundle the server-side code too, which I feel is overkill for me at this point.
I ended up doing an in-app reload for the server side, staying with alternative 2 above. For the Koa backend I just added a file watcher that invalidates the require cache in the dev server:
const chokidar = require('chokidar')

const watcher = chokidar.watch('.')
watcher.on('ready', () => {
  watcher.on('all', () => {
    // On any file change, drop the backend modules from the require cache
    Object.keys(require.cache).forEach(function (id) {
      if (id.indexOf('src/node') > 0 && id.indexOf('node_modules') < 0) {
        delete require.cache[id]
      }
    })
  })
})
Then an indirection so that the require call is always re-evaluated in dev:
const reloadable =
  process.env.NODE_ENV === 'development'
    ? id => (ctx, next) => require(id).then(r => r(ctx, next))
    : id => require(id)
which I then use in all the route definitions like:
const use = app.use.bind(app)
reloadable('./auth').then(use)
reloadable('./users').then(use)
reloadable('./ingredients').then(use)
This was good enough for me.
Can Node.js be set up to recognize a proxy (like Fiddler, for example) and route all ClientRequests through the proxy?
I am using node on Windows and would like to debug the http requests much like I would using Fiddler for JavaScript in the browser.
Just to be clear, I am not trying to create a proxy, nor to proxy requests received by a server. I want to route requests made by http.request() through a proxy. I would like to use Fiddler to inspect both the request and the response, just as I would if I was performing the request in a browser.
I find the following to be nifty. The request module reads proxy information from the Windows environment variables.
Typing the following in the Windows command prompt will set them for the lifetime of that shell. You just have to run your node app from the same shell.
set https_proxy=http://127.0.0.1:8888
set http_proxy=http://127.0.0.1:8888
set NODE_TLS_REJECT_UNAUTHORIZED=0
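With those variables set in the same shell, a plain call through the request module goes out via Fiddler without any proxy option in the code. A small sketch (the URL is just an example):
// request honours the http_proxy / https_proxy variables set above
const request = require('request');

request('http://example.com', (err, res, body) => {
  // The request and its response now show up in Fiddler on 127.0.0.1:8888
  console.log(err || res.statusCode);
});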
To route your client requests via Fiddler, alter your options object like this (e.g. just before you create the http.request):
options.path = 'http://' + options.host + ':' + options.port + options.path;
options.headers.host = options.host;
options.host = '127.0.0.1';
options.port = 8888;
myReq = http.request(options, function (result) {
...
});
If you want to monitor outgoing requests from node,
you can use the request module
and just set the proxy property in the options, like this:
request.post('http://204.145.74.56:3003/test', {
  headers: { 'content-type': 'application/octet-stream' },
  body: buf,
  proxy: 'http://127.0.0.1:8888'
}, function () {
  // callback
});
8888 is the default port of Fiddler.
process.env.https_proxy = "http://127.0.0.1:8888";
process.env.http_proxy = "http://127.0.0.1:8888";
process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";
Answering my own question: according to https://github.com/joyent/node/issues/1514 the answer is no, but you can use the request module, http://search.npmjs.org/#/request, which does support proxies.
If you want to configure a proxy in the general case, the other answers are right: you need to manually configure that for the library you're using as node intentionally ignores your system proxy settings out of the box.
If, however, you're simply looking for a Fiddler-like HTTP debugging tool for Node.js, I've been working for a little while on an open-source project to do this (with built-in Node support) called HTTP Toolkit. It lets you:
Open a terminal from the app with one click
Start any node CLI/server/script from that terminal
All the HTTP or HTTPS requests it sends get proxied automatically, so you can see and rewrite everything. No code changes or npm packages necessary.
Here's a demo of it debugging a bunch of NPM, node & browser traffic:
Internally, the way this works is that it injects an extra JS script into started Node processes, which hooks into require() to automatically reconfigure proxy settings for you, for every module which doesn't use the global settings.