I'm refactoring my Hapi server to use reusable modules instead of performing logic in my route handlers. I have a plugin registered in my Hapi server for MongoDB connection pooling, which I'd like to be able to access in these modules. Is there a way to export the server object itself, or do I need to rewrite my modules to accept the request object as an argument? I'm using node 0.12.12 and Hapi 8.4.0.
I already tried module.exports = server; in the file where my server is defined, and then requiring the server object from a different file (both with var server = require('../index.js').server; and var server = require('../index.js')(server);), but I either get an error or undefined.
The closest thing I could find to an answer was this issue from a few years ago, on an older version of Hapi: https://github.com/hapijs/hapi/issues/1260 - but it looks like it was never really resolved.
Well, I'm an idiot, but maybe this will help somebody else out:
It seems module.exports cannot be assigned within a callback, according to the Node documentation (the assignment has to happen at the top level of the module, not asynchronously). So I moved this statement to the bottom of my index.js:
module.exports.server = server
And then in my other modules, called:
var server = require('../index.js');
And was able to access the plugin's contents as server.server.plugins.
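For anyone else hitting this, here's a minimal sketch of the whole pattern, assuming Hapi 8.x and a MongoDB pooling plugin that exposes a db object (the plugin name 'my-mongodb' and the db property are placeholders for whatever your plugin actually exposes):

// index.js
var Hapi = require('hapi');
var server = new Hapi.Server();
server.connection({ port: 3000 });

// ...plugin registration (e.g. the MongoDB pooling plugin) and server.start() here...

// at the bottom, at the top level, not inside a callback
module.exports.server = server;

// someModule.js
var app = require('../index.js');

// 'my-mongodb' is a hypothetical plugin name; use whatever name your
// plugin exposes itself under in server.plugins
var db = app.server.plugins['my-mongodb'].db;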
HTH
I'm using Keystone 6 for my backend and I'm trying to integrate Stripe, which requires access to the underlying express app to pass a client secret to the client from the server. This is highlighted in the Stripe documentation here: https://stripe.com/docs/payments/payment-intents#passing-to-client
I'm having trouble figuring out how to access the express app in Keystone 6 though and they don't seem to mention anything about this in the documentation. Any help would be appreciated.
The short answer is Keystone 6 doesn't support this yet.
The longer answer has two parts:
This functionality is coming
We've been discussing this requirement internally and the priority of it has been raised. We're updating the public roadmap to reflect this next week.
The functionality itself should arrive soon after. (Unfortunately I can't commit to a release date.)
Getting access to the Express app is possible, it's just a real pain right now
If you look at Keystone's start command you can see where it calls createExpressServer().
This just returns an express app with the GraphQL API and a few other bits and bobs.
But there's actually nothing forcing you to use the built-in keystone start command – you can copy this code, hack it up and just run it directly yourself.
Eg. you could replace this...
const server = await createExpressServer(
  config,
  graphQLSchema,
  keystone.createContext,
  false,
  getAdminPath(cwd)
);
With...
const server = express();

server.get('/hello-world', (req, res) => {
  res.send('Hello');
});

const keystoneServer = await createExpressServer(
  config,
  graphQLSchema,
  keystone.createContext,
  false,
  getAdminPath(cwd)
);

server.use(keystoneServer);
And your /hello-world endpoint should take precedence over the stuff Keystone adds.
Unfortunately, this doesn't work for the dev command, so in your local environment you'll need to do it differently.
One option is to start a second express server that you control and put it on a different port and include your custom routes there.
You can still do this from within your Keystone app codebase but having different URLs in different environments can be annoying.
You'll probably need an environment variable just for your custom endpoints URL, with values like this in production:
# Production
GRAPHQL_ENDPOINT="https://api.example.com/api/graphql"
CUSTOM_ENDPOINT="https://api.example.com/hello-world"
And this in dev:
# Dev
GRAPHQL_ENDPOINT="http://localhost:3000/api/graphql"
CUSTOM_ENDPOINT="http://localhost:3100/hello-world"
It's ugly but it does work.
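A minimal sketch of that second server (the file name, port and route are placeholders, not anything Keystone prescribes) could look like this:

// custom-server.js: run alongside `keystone dev` during local development
const express = require('express');

const app = express();

// your custom endpoints live here
app.get('/hello-world', (req, res) => {
  res.send('Hello');
});

const port = process.env.CUSTOM_PORT || 3100;
app.listen(port, () => {
  console.log(`Custom endpoints listening on http://localhost:${port}`);
});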
I'll update this answer when the "official" functionality lands.
I'm building my first Express app, which needs to interact with an API, using an API key that ideally remains secure.
So I wanted to follow a basic pattern of keeping the key (and any future environment variables), in a .gitignored .env file in the root directory.
To not reinvent the wheel, I used this package, and set my env variables like so, in my app.coffee file (the root file of the application):
env = require('node-env-file')
env __dirname + '/.env'
console.log process.env.MY_API_KEY
That console.log prints out the right key to the server logs. The problem arises later:
If I try to access that same variable in one of the JS files loaded later on by my app, process.env is an empty object, so the API key is undefined. This doesn't appear to be a problem with the above package, because if I define the variable on the command line (API_KEY=whatever npm start), the behavior is the same -- it console logs correctly from app.coffee but is unavailable later.
Some information on how the files in which the key is unavailable are being loaded:
The app is running React, which I write to a few .jsx files in public/javascripts/src, and which are compiled by gulp into public/javascripts/build/*.js.
I'm trying to access the key in a .js file in public/javascripts/ which is required by one of the .jsx files.
In that required .js file, process.env returns an empty object. When I try to access process.env in the .jsx files, I'm actually told that process itself is undefined.
Any ideas what's going on here? I'm new to Express/React, and unclear where this process object, which I thought was global and defined on npm start is defined, and what's happening to all the env info in it.
Thanks! Please let me know if any other information would be helpful, or if anyone has any suggestions for how better to handle private env info in my situation.
EDIT:
I tried the suggestions below, and created a separate endpoint internally, which hits the external API and then returns a response. I've wired things up so that this responds correctly:
router.get '/images', (req, res, next) ->
  res.json({ some: 'json' })
but this (which uses a separate class to make a request to an external API), throws an error:
router.get '/images', (req, res, next) ->
  new Images('nature').fetch (images) ->
    res.json({ some: 'json' })
Essentially, it looks like the asynchrony of the response from the external API (and not even the data itself, which I ignored), is creating a problem. How do I hit this external endpoint and then respond to the internal request with the incoming data?
Back-end vs Front-end
It seems like you are trying to access back-end data from a front-end location, in the wrong way.
The great power of Node.js is having JavaScript in the front and in the back, but it is quite confusing in the beginning to understand on which side each script is executed.
In an Express project, all the JavaScript files that are sent to the front-end (the ones that directly interact with the client's page) are located in public/javascripts/. Generally you will have some AJAX functions in some of those files to exchange data and communicate with the back-end.
The back-end files are located everywhere else: in the root directory, in routes/, and all the other folders you create. Those files are pretty much all connected to your Node instance, and can therefore communicate with each other using global objects like process, for example.
Your script in public/javascripts/, which is executed on the client's computer, is trying to directly access a variable located on the server running your Node instance: that's why your code doesn't work. If you wish to access data from the back-end, you must use AJAX calls from the front-end.
Server     <---(AJAX only)---     Client
------                            ------
app.js                            public/javascripts/script.js
routes.js
...
That being said, you wanted to keep your API key private, which will not happen if you send it to every client who's on that specific page. What you should do is make the call from the back-end, using the xhr module for example, and then deliver the data to the front-end, without the secret API key.
I hope I was clear. Node is quite confusing at first, but very soon you will get over these little mistakes!
All .jsx is, is some code; what matters is where that code is being executed. process.env is a variable that is accessible inside the Node.js runtime. When your .jsx code gets transpiled down to .js and served to the browser, the process.env variable will no longer exist. If you're making an API call inside the browser, the API key will fundamentally be available to the client. If you want to secure the key, you have to have your Node.js server expose an API route, which your React app will hit. That Node.js server then makes the call to the external service using the API key. Because that call is made by the server, process.env is available, and the key remains hidden from the client. You can then forward the result of the API call back to the user.
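As a rough sketch of the server-side proxy both answers describe, written in plain Express/JavaScript rather than CoffeeScript (the route path, environment variable name, and external URL are made up for illustration):

// routes/images.js (runs on the server, so process.env is available)
const express = require('express');
const https = require('https');

const router = express.Router();

router.get('/images', (req, res, next) => {
  // the external URL, query string and env variable name are placeholders
  const url = 'https://api.example.com/search?q=nature&key=' + process.env.MY_API_KEY;

  https.get(url, (apiRes) => {
    let body = '';
    apiRes.on('data', (chunk) => { body += chunk; });
    apiRes.on('end', () => {
      // forward the data (or a subset of it) to the browser;
      // the key itself never leaves the server
      res.json(JSON.parse(body));
    });
  }).on('error', next);
});

module.exports = router;

The point relevant to the EDIT above is that res.json() is only called inside the callback of the outgoing request, i.e. once the external API has actually responded.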
I have to access the config variables defined in the file called test.js, which has --
var aws = require('aws-sdk');
exports.connect = function () {
  return aws;
};
Now I need to access it when the OnClick event occurs in the browser. I have this script, but the require call does not work.
clientScript.js
var aws = require('../scripts/test.js').connect();
function getValue() {
  aws.describe({}, function () { ... })
}
How can I access this aws variable?
Hopefully I'm not too far off the mark with what you're trying to do here. My understanding (cobbled together between this and your previous question) is that you would like something in the browser that, upon click, will retrieve some status information from an external API, which will then be displayed in the client.
What I would recommend doing (based on the above assumption) is defining your desired function as something to be triggered by an HTTP request to the Express server, which can perform your function and send whatever you'd like from its process back to the client.
In your server, define (assuming your Express variable is app):
app.get('/request', someFunction);
In someFunction define what it is you'd like to do, how it relates to the request and response, and what to send back to the client. Express will expect the function to take request and response as arguments, but you don't necessarily need to use them:
function someFunction(req, res) {
  // do whatever I'd like with aws or anything else
  res.send(foo); // where foo is whatever JSON or text or anything else I'd like the client to have to manipulate
}
On the client, you would have a function bound to onclick that would make the request to that /request endpoint. This function could use AJAX or simply render another page entirely.
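For example, a minimal sketch of that client-side handler (the element id and response handling are just placeholders):

// clientScript.js (served to the browser, no require() needed)
// '/request' matches the route registered on the Express side above
function getValue() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/request');
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    // update the page with whatever the server sent back
    document.getElementById('status').textContent = JSON.stringify(data);
  };
  xhr.send();
}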
This sort of organization also leaves display and behavior to the client, while the server deals with retrieving and manipulating data. This layout also resolves any concerns about require() statements on the clientside (which, while possible with things like Browserify, are not necessary and may make the code more confusing).
You can't use require() on the client side; it is a server-side function provided by Node.js. If you need config options that are shared between server and client, then you will need to make a few changes to your test.js file so it will work in both. I'm sure there are a number of ways to do this, but the way I prefer is:
Put all your configuration variables inside test.js into an object like:
this.ConfigOptions = {option1:value1, option2:value2};
The client would include the file like this:
<script src="test.js" type="text/javascript"></script>
and can access the config options via the ConfigOptions object
while the server would use require() to include the file and access the config options like this:
var ConfigOptions = require('./test.js').ConfigOptions;
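Putting that together, a minimal sketch of such a shared test.js might look like this (the option names and values are placeholders; only put values here that are safe to ship to the browser):

// test.js: included in the browser via <script src="test.js"> and on the
// server via require('./test.js'). At the top level, `this` is window in the
// browser and module.exports in Node, so ConfigOptions is reachable in both.
this.ConfigOptions = {
  apiBaseUrl: 'https://api.example.com', // placeholder values
  itemsPerPage: 25
};

Note that anything in a file served to the browser is visible to every visitor, so secret API keys should stay in server-only modules.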
I can't seem to figure out how to get my express server to run a simple function when the server starts up. Where is the appropriate place to call a function to run on server startup, and the proper syntax?
I have the function in my routes file as exports.myFunction = function() { code here};
I've tried sticking it in the app.configure block as routes.myFunction. I've also tried changing it in routes to just be myfunction() { code }, then calling it in the configure block as routes.myfunction(), but no luck there either. The function needs to stay in the file containing my routes since it alters some global variables there.
I know it's some stupidly simple syntax thing, but I can't seem to find any hints here or on google. Much thanks for any help!
Use this event:
app.on('listening', function () {
  // server ready to accept connections here
});
To be honest, the app returned by express.createServer() is just an http.Server, so everything described in the Node.js docs related to http.Server makes sense for Express and RailwayJS.
I would keep it simple. In the module where you call app.listen(port), just call your startup function right before (or after) that. If you need that function to reside in a separate module full of other routes, just export it so your main server.js module can invoke it on startup. If you are still not satisfied with that, consider maybe binding an event listener somewhere in express/connect, although I'm not sure an explicit "startup" event is emitted.
In server.js (or whatever module you start your express server), do this:
var myRoutes = require("./myroutes");
var app = express.createServer();
...
app.listen(8080, "127.0.0.1", function() {
  myRoutes.myFunction();
});
You can also bind to the "listening" event as #Anatoly says. The docs for the listening event are here.
Let's say I have 2 web servers. Both of them just installed Node.js and is running a website (using Express). Pretty basic stuff.
How can Server-A tell Server-B to execute a function? (inside node.js)
Preferably...is there a npm module for this that makes it really easy for me?
How can Server-A tell Server-B to execute a function?
You can use one of the RPC modules, for example dnode.
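For example, a minimal sketch with dnode, based on its documented usage (the port number and function name are arbitrary):

// Server-B: exposes a function over RPC
var dnode = require('dnode');

var server = dnode({
  transform: function (s, cb) {
    // do some work on Server-B and hand the result back
    cb(s.toUpperCase());
  }
});
server.listen(5004);

// Server-A: calls the function on Server-B
var dnode = require('dnode');

var d = dnode.connect(5004);
d.on('remote', function (remote) {
  remote.transform('beep', function (result) {
    console.log('beep => ' + result); // "beep => BEEP"
    d.end();
  });
});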
Check out Wildcard API, it's an RPC implementation for JavaScript.
It works between the browser and a Node.js server and also works between multiple Node.js processes:
// Node.js process 1
const express = require('express');
const wildcardMiddleware = require('@wildcard-api/server/express');
const {endpoints} = require('@wildcard-api/server');

endpoints.hello = async function() {
  const msg = 'Hello from process 1';
  return msg;
};

const app = express();
app.use(wildcardMiddleware());
app.listen(3000);
// Node.js process 2
const wildcard = require('@wildcard-api/client');
const {endpoints} = require('@wildcard-api/client');

wildcard.serverUrl = 'http://localhost:3000';

(async () => {
  const msg = await endpoints.hello();
  console.log(msg); // Prints "Hello from process 1"
})();
You can browse the code of the example here.
You most likely want something like a JSON-RPC module for Node. After some quick searching, here is a JSON-RPC middleware module for Connect that would be perfect to use with Express.
Also, this one looks promising too.
Update: The library I've made and linked below isn't currently maintained. Please check out the other answers on this thread.
What you need is called RPC. It is possible to build your own, but depending on the features you need, it can be time consuming.
Given the amount of time I had to invest, I'd recommend finding a decent library that suits your purpose, instead of hand rolling. My use case required additional complex features like selective RPC calls, for which I couldn't find anything lightweight enough, so I had to roll my own.
Here it is https://github.com/DhavalW/octopus.