How to implement Express + Node JS + Browserify properly?

So I'm somewhat new to the whole web development thing with Node.js, and I'm wondering if someone could help me understand how to implement my application correctly.
The app is a simple landing page with an email form that takes an email and sends it to the API. I designed this functionality without issue, except that when I launched my website I got a "require is not defined" error.
I understand that this is because Node.js is a server-side technology, so when the application goes live the client doesn't understand what require means.
Through further research, I figured out that I have two options: either bundle my dependencies synchronously with something like Browserify, or load them asynchronously with something like RequireJS.
Right now I've decided on using Browserify (unless someone can convince me otherwise); I just need help figuring out how to implement it for my specific app.
app.js
//The dependencies my Node.js app needs (also where the "require is not defined" error occurs)
//------------------------------------------------------------------------------------------------------------------------
const express = require('express'); //Require the express package that we installed with npm install express
const request = require('request'); //Require the request package that we installed with npm install request
const bodyParser = require('body-parser'); //Require the body-parser package that we installed with npm install body-parser
const path = require('path'); //Require the path module that is built into Node.js (no install needed)
//------------------------------------------------------------------------------------------------------------------------
const app = express(); //Creates the application with express
//Middleware
app.use(express.json()); //Tells express to parse JSON request bodies
app.use(bodyParser.urlencoded({ extended: false })); //Parses URL-encoded form bodies, like the one the subscribe form sends
app.use(express.static(path.join(__dirname, '/site'))); //Tells the app to serve static files from D:\Gaming Galaxy\Website\Website\main\site (the /site folder next to app.js)
console.log("The directory used is", path.join(__dirname, '/site')); //Logs the resolved static directory
app.post('/subscribe', (req, res) => { //Handles POST requests to /subscribe; req is the request and res is the response
    const { email, js } = req.body; //Pull the email (and a js flag) out of the request body
    const mcData = { //Create a JSON object mcData that contains the email from above and a status
        members: [
            {
                email_address: email,
                status: 'pending'
            }
        ]
    }
    const mcDataPost = JSON.stringify(mcData); //Turns the JSON object into a string
    const options = { //Options that the request package will use to call Mailchimp
        url: 'https://us20.api.mailchimp.com/3.0/lists/f10300bacb',
        method: 'POST',
        headers: {
            Authorization: 'auth f24c3169da044653d1437842e39bece5-us20'
        },
        body: mcDataPost
    }
    if (email) { //If an email was submitted
        request(options, (err, response, body) => { //Send a request to Mailchimp
            if (err) { //If there's an error
                res.json({ error: err }) //Return said error
            } else { //If there's no error
                if (js) { //If JavaScript is enabled (boolean)
                    res.sendStatus(200); //Send a success status
                } else {
                    res.redirect('/success.html'); //If it's disabled, redirect to a success HTML page
                }
            }
        })
    } else {
        res.status(404).send({ message: 'Failed' }) //If no email was submitted, fail
    }
});
app.listen(5000, () => console.log('Server started!')) //Console log that confirms the start of the server
package.json
{
  "name": "gaminggalaxy",
  "version": "1.0.0",
  "main": "site/js/app.js",
  "dependencies": {
    "body-parser": "^1.19.0",
    "commonjs": "^0.0.1",
    "express": "^4.17.1",
    "index": "^0.4.0",
    "node-fetch": "^2.6.6",
    "prototype": "^0.0.5",
    "request": "^2.65.0",
    "requirejs": "^2.3.6",
    "uniq": "^1.0.1"
  },
  "devDependencies": {
    "nodemon": "^2.0.15"
  },
  "scripts": {
    "serve": "node app",
    "dev": "nodemon app"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/InvertedTNT/Main.git"
  },
  "bugs": {
    "url": "https://github.com/InvertedTNT/Main/issues"
  },
  "homepage": "https://github.com/InvertedTNT/Main#readme",
  "description": ""
}
index.html (the form itself)
<form action="/subscribe" method="POST">
    <div class="newsletter-form-grp">
        <i class="far fa-envelope"></i>
        <input type="email" name="email" id="email"
               placeholder="Enter your email..." required>
    </div>
    <button id="cta">SUBSCRIBE <i class="fas fa-paper-plane"></i></button>
</form>
folder structure
node_modules
site
  - index.html
  - css
  - js
    - app.js
  - images
app.js
package-lock.json
package.json
Thank you for your help; I would appreciate any advice on how I can use those dependencies and on the overall implementation of Browserify.

A browser is an HTTP client.
Express is a framework for building HTTP servers.
HTTP clients make requests to HTTP servers which then send responses back.
Express depends on Node.js. It requires features provided by Node.js (like the ability to listen for network requests) which are not available in browsers.
Browserify can bundle JavaScript that is written using modules into non-module code that can run in a browser, but only if that code does not depend on Node.js-specific features (i.e. the modules are either pure JS or depend only on browser-specific features).
Browserify cannot make Express run inside the browser.
When you run your JS program using Node.js you can then type the URL to the server the program creates into the browser’s address bar to connect to it.
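In practical terms for your project: keep the Express code in the app.js at the project root, run it with node app.js, and don't load that file from the HTML page at all. Anything the browser itself needs to do (for example, submitting the form without a full page reload) goes into a separate plain script under site/js that uses no require() calls. A minimal sketch of such a browser-side file (the name client.js and the use of fetch are assumptions for illustration, not taken from your code; the js flag matches the { email, js } your route destructures):
// site/js/client.js: plain browser JavaScript, no require() anywhere
document.querySelector('form').addEventListener('submit', async (event) => {
    event.preventDefault(); // stop the normal full-page form submission

    const email = document.getElementById('email').value;

    // POST to the /subscribe route that the Express server (run with node app.js) exposes
    const response = await fetch('/subscribe', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ email: email, js: true }) // js: true tells the server that JavaScript is enabled
    });

    if (response.ok) {
        console.log('Subscribed!');
    } else {
        console.error('Subscription failed');
    }
});
The page would load it with <script src="/js/client.js"></script>. Browserify only becomes relevant if you want to write that browser-side code itself with require() against browser-safe npm packages; it cannot (and does not need to) bundle Express.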

Related

mongoose post.save failing on heroku, works on localhost

I'm trying to simply post to my MongoDB Atlas db via Node, Express, Mongoose and Heroku. A Postman POST request with a raw JSON body:
{
    "title": "heroku post",
    "description": "post me plsssss"
}
works on localhost with this exact code, but when deployed via Heroku the try/catch block fails at post.save(), as the response is the error:
{
    "error": "there's an error",
    "message": {}
}
But the error is empty and I'm not sure how to debug it. I've put mongoose.set('debug', true); in app.js and I've modified my package.json: "start": "node app.js DEBUG=mquery", but those have produced no extra output that I can see. Is there any other way to know why post.save() is throwing an error, some logs that I am not utilising, and how can I see those logs? Or do you just know what the issue is?
App.js
//use dotenv to store secret keys
require("dotenv").config();
const express = require('express');
const mongoose = require('mongoose');
const app = express();
const cors = require('cors');
//connect to DB
mongoose.connect("mongodb+srv://grushevskiy:intercom#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {
    console.log('connected to DB!')
})
mongoose.set('debug', true);
const db = mongoose.connection;
db.once('open', () => {
    console.log('connection opened')
});
//import routes for middleware
const postsRoute = require('./routes/posts');
//middleware routes
app.use('/posts', postsRoute)
//ROUTES
app.get('/', (req, res) => {
    res.send('we are on home')
})
//MIDDLEWARES
//cors
app.use(cors());
//decode url special characters
app.use(express.urlencoded({ extended: true }));
//parse json POSTs
app.use(express.json());
//start listening on the port provided by the environment (or 3000)
app.listen(process.env.PORT || 3000);
posts.js
const express = require('express')
const router = express.Router();
const Post = require('../models/Post')
//SUBMIT A POST
router.post('/', async (req, res) => {
    const post = new Post({
        title: req.body.title,
        description: req.body.description
    });
    console.log(post)
    try {
        const savedPost = await post.save();
        res.json(savedPost);
        console.log(savedPost)
    } catch (err) {
        res.json({ error: "there's an error", message: err, })
    }
})
module.exports = router;
Post.js Model
const mongoose = require('mongoose')
const PostSchema = mongoose.Schema({
    title: {
        type: String,
        required: true
    },
    description: {
        type: String,
        required: true
    },
    date: {
        type: Date,
        default: Date.now
    }
})
module.exports = mongoose.model('Post', PostSchema)
When I type heroku logs --tail there are no errors; also, initially, the 'connected to DB!' message comes in a bit late. I'm wondering if maybe this is an issue with async/await? My package.json:
{
  "name": "22-npmexpressrestapi",
  "version": "1.0.0",
  "engines": {
    "node": "14.15.3",
    "npm": "6.14.9"
  },
  "description": "",
  "main": "index.js",
  "scripts": {
    "start": "node app.js DEBUG=mquery",
    "start:dev": "node app.js"
  },
  "author": "",
  "license": "ISC",
  "devDependencies": {
    "express": "^4.17.1",
    "nodemon": "^2.0.7"
  },
  "dependencies": {
    "dotenv": "^8.2.0",
    "express": "4.17.1",
    "cors": "^2.8.5",
    "mongoose": "^5.11.11"
  }
}
After reading your question carefully I have a couple of suggestions that I think you might want to try. First, let me explain myself:
In my experience, when one learns how to build REST APIs with Node.js, using mongoose to communicate with a MongoDB cluster, this is what the sequence of events of their index.js file looks like:
Execute a function that uses environment variables to establish a connection with the database cluster. This function runs only once and fills the mongoose instance that the app is going to use with whatever models have been designed for it.
Set up the app by requiring the appropriate package, defining whatever middleware it's going to require, and registering all the routes that have been defined for it.
After the app has incorporated everything it needs in order to run properly, app.listen() is invoked, a valid port is provided and... voila! You've got your server running.
When the app and the database are simple enough, this works like a charm. I have built several services using these very same steps and brought them to production without noticing any miscommunication between mongoose and the app. But if you check the App.js code you provided, you'll realize that there is a difference in your case: you only connect to your Mongo cluster after you have set up your app, its middleware, and its routes. If any of those depend on the mongoose instance or its models, there is a good chance that by the time Heroku gets to the point where your routes need to talk to your database, the mongoose.connect() part simply hasn't run yet.
Simple solution
I think that Heroku is simply taking a little bit longer to run your whole code than your local machine. Also, if your local version is connecting to a locally stored MongoDB, there is a good chance things run quite a bit quicker than against a cloud service. Your statement
the 'connected to DB!' message comes in a bit late
gives me hope that this might be the case. If that were so, then I guess that moving
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {console.log('connected to DB!')});
to the second line of App.js, right after you call dotenv, would do the trick. Your mongoose instance would go into App already connected to the database and models that App is going to use.
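As a minimal sketch of that reordering (everything else in your App.js stays exactly as it is):
// App.js: establish the mongoose connection first, right after dotenv
require("dotenv").config();

const mongoose = require('mongoose');
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    dbName: "cluster-rest"
}, () => {
    console.log('connected to DB!')
});

const express = require('express');
const app = express();
const cors = require('cors');
// ...middleware, routes and app.listen() follow exactly as before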
Asynchronous solution
Even if what I previously wrote fixed your problem, I would like to expand the explanation because, in my experience, that version of the solution wouldn't be right either. The way of doing things that I exposed at the beginning of this answer is ok... As long as you keep your MongoDB cluster and your mongoose instance simple. Not very long ago I had to implement a service that involved a much more complex database pattern, with several different databases being used by the same App, and many of its routes depending directly on those databases' models.
When I tried to use the logic I described before on such a project, involving heavily populated databases and several models contained in the mongoose instance, I realized that it took Node far less time to build the App, its middleware, and its routes than to connect to mongoose and load all the models that that very same App required in order to run. Meaning that I got my "Server connected to PORT ****" message before the database was actually connected. And that caused trouble.
That made me realize that the only way to do things properly was to ensure asynchronicity, forcing App to run only after all the databases and the models were loaded into the mongoose instance, and feeding it that very same instance. This way, I ended up having an index.js that looked like this:
const Dotenv = require("dotenv").config();
const { connectMongoose } = require("./server/db/mongoose");
const wrappedApp = require("./app");

connectMongoose().then((mongoose) => {
    const App = wrappedApp(mongoose);
    App.listen(process.env.PORT, () => {
        console.log(
            `Hello, I'm your server and I'm working on port ${process.env.PORT}`
        );
    });
});
A file called mongoose.js holds several functions governing the connections to my databases and models. The function that implements all the connections in the right order looks like this:
const connectMongoose = async (mongoose = require("mongoose")) => {
    try {
        // Close any previous connections
        await mongoose.disconnect();
        // Connect the main database
        console.log(
            "--> Loading databases and models from MongoDB. Please don't use our server yet. <--"
        );
        mongoose = await connectMother(mongoose);
        // Connect all the client databases
        mongoose = await connectChildren(mongoose);
        console.log(
            "--> All databases and models loaded correctly. You can work now. <--"
        );
        return mongoose;
    } catch (e) {
        console.error(e);
    }
};

module.exports = { connectMongoose };
With this, the file app.js returns a function that needs to be fed a mongoose instance and sets up the whole Express environment:
module.exports = (Mongoose) => {
    const Express = require("express"); //Enrouting
    // Other required packages go here

    // EXPRESS SETTINGS
    const App = Express();
    App.set("port", process.env.PORT || 3000); //Use port from cloud server

    // All the middleware App needs
    App.use(Flash());

    // Routing: load your routes

    // Return fully formed App
    return App;
};
Only using this pattern I made sure that the chain of events made sense:
Connect all the databases and load their models.
Only after the last database has been connected and its last model loaded, set up the App, its middleware, and your routes.
Use App.listen() to serve the App.
If your database isn't very large, then the first part of the answer might do the trick. But it's always useful to know the whole story.
I am able to reproduce an empty error message with your code when I use a wrong connection string.
The console.log in this callback is incorrect:
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" }, () => {
    console.log('connected to DB!')
})
If you want to know whether you are connected to the db, use this:
mongoose.connect("mongodb+srv://xxxxx:xxxxx#cluster-rest.4luv0.mongodb.net/cluster-rest?retryWrites=true&w=majority", { useNewUrlParser: true, useUnifiedTopology: true, dbName: "cluster-rest" });
const db = mongoose.connection;
db.once('open', () => {
    console.log('connection opened')
});
I also noticed you are using dotenv; maybe you are not adding the variables to your Heroku host.
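If that's the case, a common fix (MONGO_URI below is a hypothetical variable name, not something from your code) is to read the connection string from the environment instead of hard-coding it:
// App.js: no credentials hard-coded in the source
require("dotenv").config(); // locally this loads .env; on Heroku the config vars are already in process.env

const mongoose = require('mongoose');
mongoose.connect(process.env.MONGO_URI, {
    useNewUrlParser: true,
    useUnifiedTopology: true,
    dbName: "cluster-rest"
});
Locally the value lives in your .env file (which is not committed); on Heroku you set it once with heroku config:set MONGO_URI=... or through Settings, Config Vars in the dashboard.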

Running Node.js Application with html server

I'm currently learning Node.js development and trying to get my web code started. I'm working with Visual Studio Code on Windows.
When I start the application with "npm start", the website on "localhost:3000/" just shows "Server connect". But how do I see all of the other files, like index.html? Do I have to add them anywhere?
Thank you in advance.
HTTP server file:
const http = require('http');

http.createServer(function(req, res) {
    res.write('Server connect');
    res.end();
}).listen(3000);
package.json
{
"name": "project-1",
"version": "1.0.0",
"description": "Project 1",
"main": "index.html",
"scripts": {
"start": "nodemon server.js"
},
"author": "",
"license": "ISC",
"dependencies": {
"chartjs": "^0.3.24",
"express": "^4.17.1",
"http": "0.0.1-security",
"mongodb": "^3.6.2",
"mongoose": "^5.10.11",
"nodemon": "^2.0.6"
}
}
Write code which looks at req. A property on that object will tell you the path that is being asked for. (The API documentation for the http module will tell you what property that is.)
Then decide what you want to respond with based on that path.
If it is a static file, then read that static file from the file system and output it in the response.
Make sure you set the right Content-Type header. (You'll get problems if you send HTML but say it is plain text, or send a JPEG image but say it is HTML.)
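A rough sketch of that manual approach (the public folder name and the extension list are assumptions for the example, not something from your project):
const http = require('http');
const fs = require('fs');
const path = require('path');

// Map file extensions to Content-Type values
const types = { '.html': 'text/html', '.js': 'text/javascript', '.css': 'text/css' };

http.createServer(function(req, res) {
    // '/' becomes '/index.html'; everything is served from a ./public folder
    const filePath = path.join(__dirname, 'public', req.url === '/' ? 'index.html' : req.url);

    fs.readFile(filePath, function(err, data) {
        if (err) {
            res.writeHead(404, { 'Content-Type': 'text/plain' });
            res.end('Not found');
            return;
        }
        res.writeHead(200, { 'Content-Type': types[path.extname(filePath)] || 'application/octet-stream' });
        res.end(data);
        // (a real implementation would also need to guard against '..' in req.url)
    });
}).listen(3000);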
You are reinventing the wheel here. You should probably look at the Express framework and its express.static middleware.
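With Express (which is already listed in your package.json), serving the whole folder is roughly this (a sketch assuming your index.html and the other files sit in a public folder next to server.js):
const express = require('express');
const path = require('path');

const app = express();

// Serve index.html, CSS, client-side JS, images, etc. from ./public
app.use(express.static(path.join(__dirname, 'public')));

app.listen(3000, () => console.log('Server listening on http://localhost:3000'));
With that in place, http://localhost:3000/ serves public/index.html automatically and the other files are available by their paths.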
To add to what @Quentin said, here's a good example from W3Schools.
https://www.w3schools.com/nodejs/shownodejs.asp?filename=demo_ref_http
var http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, {'Content-Type': 'text/plain'});
    res.write('Hello World!');
    res.end();
}).listen(3000);

How to compress build files in create react app without ejecting?

I'm trying to figure out how best to optimize my build files and ran across the idea of compressing text files like JS and CSS. Every article I've come across either assumes you have access to the webpack config file or that you've ejected from CRA. I don't want to do either.
So I added a post build script to my package.json file:
"scripts": {
"build": "npm run build --prefix client",
"prod-compress": "gzip client/build/static/js/*.js && gzip client/build/static/css/*.css",
which results in the /client/build/static/js and /client/build/static/css folders containing .js.gz and .css.gz files instead of the originals.
I then went into my app.js file and added the following code:
app.get('*.js', function(req, res, next) {
    req.url = req.url + '.gz';
    res.set('Content-Encoding', 'gzip');
    res.set('Content-Type', 'text/javascript');
    next();
});

app.get('*.css', function(req, res, next) {
    req.url = req.url + '.gz';
    res.set('Content-Encoding', 'gzip');
    res.set('Content-Type', 'text/css');
    next();
});
If I understand what's happening correctly, the front-end /client/public/index.html file will still reference a regular .js file; however, when the file is requested from the server, it will respond with the .js.gz file.
However, when I compress the files the entire site goes blank, like it can't find anything to serve up.
If you don't mind adding a new dependency, I would recommend using express-static-gzip which will automatically serve your compressed files:
const express = require("express");
const expressStaticGzip = require("express-static-gzip");
const path = require("path");

const app = express();

const buildPath = path.join(__dirname, '..', 'build', 'static');
app.use("/", expressStaticGzip(buildPath));
You also have the choice to add other types of compression like brotli by passing an options object:
const buildPath = path.join(__dirname, '..', 'build', 'static');
app.use(
    '/',
    expressStaticGzip(buildPath, {
        enableBrotli: true,
        orderPreference: ['br', 'gz']
    })
);
Brotli gives you even smaller files than gzip, but it's not supported by all browsers. Thankfully, express-static-gzip automatically picks the correct file to send based on the Accept-Encoding header the user's browser sends, and sets the matching Content-Encoding on the response.
If you want to use brotli, I recommend taking a look at compress-create-react-app. It's specifically made for React apps but should work with any files.
Disclaimer: I'm the author of compress-create-react-app

Deploy OpenShift Node.js application from github

I created a Node.js (with Express) application and have been unsuccessfully trying to deploy it to OpenShift from GitHub. I am attempting to deploy from the web interface (providing the URL of the GitHub repository root and "master" in the branch/tag field) and am getting an error I'm having trouble understanding:
The initial build for the application failed:
Shell command '/sbin/runuser -s /bin/sh 5724c3b42d5271363b000191 -c "exec /usr/bin/runcon 'unconfined_u:system_r:openshift_t:s0:c4,c687' /bin/sh -c \"gear postreceive --init >> /tmp/initial-build.log 2>&1\""' returned an error. rc=255 .
Last 10 kB of build output: Stopping NodeJS cartridge
Repairing links for 1 deployments
Building git ref 'master', commit a5ca0f7
Building NodeJS cartridge Preparing build for deployment
Deployment id is c2527992
Activating deployment
Starting NodeJS cartridge Sat Apr 30 2016 10:41:09 GMT-0400 (EDT):
Starting application 'profile' ...
Script = server.js
Script Args = Node Options = !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! It is highly recommended that you add a package.json file to your application. !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Waiting for application port (8080) become available ...
Application 'profile' failed to start (port 8080 not available) -------------------------
Git Post-Receive Result: failure
Activation status: failure
Activation failed for the following gears: 5724c3b42d5271363b000191
(Error activating gear: CLIENT_ERROR: Failed to execute: 'control start' for /var/lib/openshift/5724c3b42d5271363b000191/nodejs #<IO:0x000000019b0298> #<IO:0x000000019b0220> )
Deployment completed with status: failure postreceive failed
I read a couple of posts about some of the errors above, like "port 8080 not available" and "failed to execute control start", but the directives I was able to follow did not solve my issue. I find the line saying that adding a package.json file is highly recommended strange, as I do have one. My package.json file is:
{
  "name": "Portfolio_Memoria",
  "version": "0.0.0",
  "private": true,
  "scripts": {
    "start": "node server.js"
  },
  "main": "server.js",
  "description": "Portfolio_Memoria",
  "author": {
    "name": "gorra",
    "email": ""
  },
  "dependencies": {
    "express": "~4.9.0",
    "body-parser": "~1.8.1",
    "cookie-parser": "~1.3.3",
    "morgan": "~1.3.0",
    "serve-favicon": "~2.1.3",
    "debug": "~2.0.0",
    "jade": "~1.6.0",
    "stylus": "0.42.3"
  }
}
And server.js file is:
#!/usr/bin/env node
var debug = require('debug')('Portfolio_Memoria');
var app = require('./app');

if (typeof process.env.OPENSHIFT_NODEJS_PORT === 'undefined') {
    app.set('port', process.env.PORT || 3000);

    var server = app.listen(app.get('port'), function() {
        debug('Express server listening on port ' + server.address().port);
    });
} else {
    app.set('port', process.env.OPENSHIFT_NODEJS_PORT || 3000);
    app.set('ip', process.env.OPENSHIFT_NODEJS_IP || '127.0.0.1');

    var server = app.listen(app.get('port'), app.get('ip'), function() {
        debug('Express server listening on port ' + server.address().port);
    });
}
Of course, the application runs without issue locally. I don't know what I am missing here.
EDIT:
I got it to work by creating a blank application in OpenShift, cloning the repository OpenShift creates via the command line, copying my whole project into it, and pushing it back. This is a workaround, though, not a solution to the original problem.

Unit Testing Express app in Jenkins

OK, I've been searching for a couple of hours for a solution to run the Mocha unit tests for my Express JS app inside Jenkins.
Writing tests is quite easy, but connecting the tests to my app is a bit harder. An example of my current test:
/tests/backend/route.test.js
var should = require('should'),
    assert = require('assert'),
    request = require('supertest'),
    express = require('express');

describe('Routing', function() {
    var app = 'http://someurl.com';

    describe('Account', function() {
        it('should return error trying to save duplicate username', function(done) {
            var profile = {
                username: 'vgheri',
                password: 'test',
                firstName: 'Valerio',
                lastName: 'Gheri'
            };
            // once we have specified the info we want to send to the server via POST verb,
            // we need to actually perform the action on the resource, in this case we want to
            // POST on /api/profiles and we want to send some info
            // We do this using the request object, requiring supertest!
            request(app)
                .post('/api/profiles')
                .send(profile)
                // end handles the response
                .end(function(err, res) {
                    if (err) {
                        throw err;
                    }
                    // this is should.js syntax, very clear
                    res.should.have.status(400);
                    done();
                });
        });
    });
});
In the above example I connect to a running app (see the line var app = 'http://someurl.com'). Obviously this does not work inside Jenkins, unless I can tell Jenkins to first run the app and then check a localhost URL. But how do I do that?
Now if I take a look at https://www.npmjs.org/package/supertest this should be enough to test my express app:
var request = require('supertest'),
    express = require('express');

var app = express();
But it's not. I receive 404 errors on all urls which I would like to test.
Anyone who knows how to test my app inside Jenkins?
Is the app you referenced at http://someurl.com the app you're trying to write this test for? If so...
Supertest should allow you to run tests without needing an external server running. The way you accomplish this is by separating the Express routing code from the code that actually starts the server, and using Supertest on the routing code.
Here is an example:
Given this directory structure:
./package.json
./server/index.js
./app.js
./test/server-test.js
Here are my files:
./package.json
Not all of these dependencies are required for this example; I just pulled this from the package.json I'm using in my project.
{
  "name": "StackOverflowExample",
  "version": "1.0.0",
  "description": "Example for stackoverflow",
  "main": "app.js",
  "scripts": {
    "test": "mocha"
  },
  "author": "Mark",
  "license": "ISC",
  "dependencies": {
    "body-parser": "^1.15.2",
    "ejs": "^2.5.2",
    "express": "^4.14.0",
    "express-validator": "^2.21.0"
  },
  "devDependencies": {
    "chai": "^3.5.0",
    "mocha": "^3.1.2",
    "supertest": "^2.0.1"
  }
}
./server/index.js
var express = require('express');
var app = express();

app.get('/', function(req, res) {
    res.send('Hello World');
});

module.exports = app;
./app.js
var app = require('./server');

var server = app.listen(8000, function() {
    console.log('Listening on port 8000');
});
./test/server-test.js
var request = require('supertest');
var app = require('../server');
var assert = require('assert'); //mocha
var chai = require('chai');

var expect = null;
chai.should();
expect = chai.expect;

describe('GET /', function() {
    it('should return a 200', function(done) {
        request(app).get('/').expect(200, done);
    });
});
I'm using Mocha and Chai here (didn't have time to refactor and make the example simpler, sorry). The main point is that within the it statement you can run your tests.
Doing it this way does not require a separate server running. Supertest magically starts one up for you and runs it based on the express server you exported from the server module. I think this approach should allow you to run your tests seamlessly through Jenkins.
In ./server/index.js, an Express object is created. That object is exported, and imported via require in both ./app.js and ./test/server-test.js. In ./app.js it's used to start a server via listen(). In the test file, Supertest uses it to start a server for testing internally.
When you are running the server in this example, you would run it via ./app.js. That would set the port you choose (8000 in this example) and start listening for connections on that port.
Hope this helps! Good luck!
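On the Jenkins side, the job then only needs a build step that runs npm install && npm test. If you also want Jenkins to publish a test report, one common approach (this part is my own suggestion, so treat the exact versions and options as assumptions to verify) is to have Mocha emit JUnit XML via the mocha-junit-reporter package, for example by adding a script like this to package.json:
"scripts": {
  "test": "mocha",
  "test:ci": "mocha --reporter mocha-junit-reporter --reporter-options mochaFile=./reports/test-results.xml"
}
A Jenkins freestyle job or pipeline can then run npm run test:ci and publish ./reports/test-results.xml with the JUnit plugin.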
