Node JS Cannot Get / - javascript

I am following this Node JS tutorial, but I have encountered a problem where I get the response Cannot GET / instead of Hello. Here is the structure for my code:
myapp/
├── app
│   └── routes
│       ├── index.js
│       └── note_routes.js
├── conf
│   ├── httpd-app.conf
│   └── httpd-prefix.conf
├── htdocs
├── node_modules
│   ├── ...
├── package.json
├── package-lock.json
└── server.js
server.js looks like this:
const express = require('express');
const MongoClient = require('mongodb').MongoClient;
const bodyParser = require('body-parser');
const app = express();
const port = 3000;
require('./app/routes')(app, {});
app.listen(port, () => {
  console.log('We are live on ' + port);
});
index.js is this:
const noteRoutes = require('./note_routes');
module.exports = function(app, db) {
  noteRoutes(app, db);
  // Other route groups could go here, in the future
};
And note_routes.js looks like this:
module.exports = function(app, db) {
  app.post('/notes', (req, res) => {
    // You'll create your note here.
    res.send('Hello')
  });
};
As can be seen, this is a very simple setup. According to the above-linked tutorial, this should be enough and should respond with Hello. However, this is not the case.
I am almost 100% sure that the port is correct, because I am getting a response at all rather than a 503.
What's wrong here, and how can I get this to work as intended?

Your app.post() in note_routes should be app.get(), or if you plan on making POST requests as well you can chain them like
app.post()
    .get();
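For concreteness, here is a minimal sketch of what that chaining could look like using app.route() (the handler bodies are placeholders, not the tutorial's code):
// note_routes.js -- both verbs registered on the same path
module.exports = function(app, db) {
  app.route('/notes')
    .get((req, res) => {
      res.send('Hello');          // handles the browser's GET request
    })
    .post((req, res) => {
      res.send('Note created');   // placeholder for the real create logic
    });
};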

This is wrong on a whole new level. I would imagine you are starting your server with node server.js.
Firstly, server.js does not require index.js anywhere.
Secondly, note_routes should contain .get(), NOT .post().
Finally, in your browser you should be visiting http://localhost:3000/notes, NOT plain old http://localhost:3000.
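Putting that advice together, a minimal sketch of note_routes.js (route path and response text taken from the question):
module.exports = function(app, db) {
  // GET instead of POST, so visiting http://localhost:3000/notes in a browser works
  app.get('/notes', (req, res) => {
    res.send('Hello');
  });
};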

Related

Sequelize sync doesn't create table in db

I have a project structure like
├── index.ts
├── package.json
├── package-lock.json
├── src
│   ├── controllers
│   ├── models
│   │   └── repository.ts
│   ├── routes
│   │   └── repository.ts
│   └── services
│       ├── minio.ts
│       └── sequelize.ts
└── tsconfig.json
// services/sequelize.ts
import { Sequelize } from 'sequelize'
const sequelizeConnection = new Sequelize("randomDb", "randomUser", "randomPassword", {
  host: "randomHost",
  dialect: "mysql"
})
export default sequelizeConnection
// models/user.ts
import { DataTypes } from 'sequelize'
import sequelizeConnection from '../services/sequelize'
const User = sequelizeConnection.define("user", {
  name: {
    type: DataTypes.STRING(30),
    allowNull: false
  }
})
export default User
// index.ts
import express from 'express'
import sequelize from './src/services/sequelize'
const app = express()
const startApp = async () => {
  try {
    await sequelize.sync()
    app.listen(3000, () => {
      console.log(`Listen on port 3000`)
    })
  } catch (error) {
    console.log("Synchronization error")
  }
}
startApp()
When I start the app, sequelize.sync() should create the "users" table, but for some reason it doesn't. I tried calling the sync() method separately on "User" and it worked, so there's no problem with the connection to the db.
await User.sync()
But in my case I have many more models and I don't want to call the sync method on each of them.
I believe the issue here is just with your folder structure. The models directory should be on the same level as the index.ts.
Documentation on the .sync() method isn't great, likely because migrations are the preferred approach for production applications (Reference). It's possible there is a config attribute you can set somewhere to maintain your current structure, though I dug around a little bit and couldn't find anything.

The `uri` parameter to `openUri()` must be a string, got "undefined"

I searched a lot about this, but nothing I found could help me.
When I run my project, I get this error:
/home/ali/Desktop/personalitytest-backend/node_modules/mongoose/lib/connection.js:428
throw new MongooseError('The uri parameter to openUri() must be a ' +
^
MongooseError: The uri parameter to openUri() must be a string, got "undefined". Make sure the first parameter to
mongoose.connect() or mongoose.createConnection() is a string.
My index.js file:
const express = require('express'),
  app = express(),
  mongoose = require('mongoose'),
  rateLimit = new require('express-rate-limit')({
    windowMs: 1000 * 60 * 10,
    max: 500,
    handler: (req, res) => {
      res.json({
        data: 'Your request was too much, please try again in 10 minutes later.',
        status: 'error'
      })
    }
  });

const Application = new class {
  constructor() {
    this.setConfig();
    this.setupDB();
    this.setRouters();
    this.setupExpress();
  }

  setConfig() {
    require('dotenv').config();
    app.use(require('helmet')());
    app.use(express.json());
  }

  setupDB() {
    mongoose.Promise = global.Promise;
    mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true, useCreateIndex: true });
  }

  setRouters() {
    app.use('/', require('./routes'));
  }

  setupExpress() {
    app.listen(process.env.PORT, () => console.log(`Listening on port ${process.env.PORT}.`));
    // app.listen(process.env.PORT, process.env.IP, () => console.log(`Listening on port ${process.env.PORT}.`));
  }
}
My .env file:
PORT=3000
DATABASE_URL=mongodb://localhost:27017/PersonalityTest
JWT_SECRETKEY=asfdawetq312etr%!#$qe
If I simply write the database URL in the mongoose.connect method, there is no error.
For example, this doesn't have error:
mongoose.connect("mongodb://localhost:27017/PersonalityTest", { useNewUrlParser: true, useCreateIndex: true });
To read the .env file you'll need to install something that will read that file, for instance the dotenv package:
npm install dotenv --save
Then you require that package in your code:
require('dotenv').config();
And according to the dotenv documentation, you should do it
As early as possible in your application, require and configure dotenv.
Next, you might need to add double quotation marks around your DATABASE_URL value:
DATABASE_URL="mongodb://localhost:27017/PersonalityTest"
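For illustration, a minimal sketch of that advice applied to the question's index.js (only the relevant lines are shown; the actual file keeps the class setup):
// index.js -- configure dotenv before anything reads process.env
require('dotenv').config();

const mongoose = require('mongoose');

// process.env.DATABASE_URL is now populated from the .env file
mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true, useCreateIndex: true });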
Have you checked if your .env variables can be read from that index.js file?
For example, check out what you get when you log some of them to the console:
console.log(process.env.DATABASE_URL);
If you get 'undefined', then you could try specifying the absolute path for your .env file like this:
const path = require('path');
require('dotenv').config({ path: path.resolve(__dirname, './.env') });
I struggled with this problem recently and in my case that solved the issue. Regards.
This worked for me:
const dotenv = require('dotenv')
dotenv.config({path:__dirname+'/.env'});
You have to declare the .env file after installing the dotenv package.
Make sure the .env file is named .env and not config.env or anything else.
I had this problem from following a tutorial online and only recently figured out why I got this error.
[screenshots of the file structure and console output]
Put:
require('dotenv').config();
at the top of the file, where all the other require calls are.
Then store process.env.DATABASE_URL in a variable:
const source = process.env.DATABASE_URL;
So at the top of the file you will have:
require('dotenv').config();
const source = process.env.DATABASE_URL;
and lower down:
mongoose.connect(source, { useNewUrlParser: true });
I had this error and, funny enough, I had another project that uses a similar setup. I went to that project, started it up with the same .env values, and it had no issues.
So I copied the code over to my current project and started it, but it would not connect to MongoDB when set up like this: mongoose.connect(process.env.DATABASE_URL, { useNewUrlParser: true, useCreateIndex: true, useUnifiedTopology: true }),
and would only work if I used it like this: mongoose.connect("mongodb://localhost:3000/posts", { useNewUrlParser: true, useCreateIndex: true });
The thing that bothered me is that it worked in one project but not in the other, so I decided to delete my node_modules folder and package-lock.json file and reinstall everything.
After that everything worked.
Also check that you don't have another node_modules folder clashing with your current one. If so, delete both along with your package-lock.json file and reinstall again. Make sure you are in the correct directory as well.
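For reference, a sketch of the reinstall steps described above (assuming npm):
rm -rf node_modules package-lock.json
npm install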
Move the .env file out of the routes folder or any other folder. Don't place it in any particular folder; just let it be in the main folder, alongside app.js or index.js, whichever you have. This might work!
Make sure that your .env file is also located at the same path where you are executing the nodemon command. Otherwise you will have to declare the path of the .env file, as in oxk4r's answer.
const dotenv = require('dotenv');
dotenv.config({ path: './config.env' });

const DB = process.env.DATABASE.replace(
  '<PASSWORD>',
  process.env.DATABASE_PASSWORD
);

mongoose
  .connect(DB, {
    useNewUrlParser: true,
    useCreateIndex: true,
    useFindAndModify: false,
    useUnifiedTopology: true,
  })
  .then((con) => {
    // console.log(con.connections);
    console.log('DB connection successful');
  });

const port = process.env.PORT || 3000;
app.listen(port, () => {
  console.log(`app running on port ${port}...`);
});
The problem you are facing is related to the path of config.env. To check that you are accessing it correctly, run console.log(process.env); it will show whether you have access to the .env values. If not, set the path:
require('dotenv').config({ path: path.resolve(__dirname, '../.env') });
Now run console.log(process.env) again; if the properties from your config.env file show up in the console, the file is being read.
Also check that your DATABASE link and DATABASE_PASSWORD are correct, or go to MongoDB Atlas, change the password for your cluster, and try again.
I know what the problem is. Make sure that you first call dotenv.config({ path: './config.env' }); and only then use process.env.DATABASE_URL.
Don't use process.env.DATABASE_URL before dotenv.config({ path: './config.env' });
Reference to the github repository:
https://github.com/realabbas/serverless-lambda-node-express-mongodb
Extending oxk4r's answer, I have the following tree showing the directory structure of the app:
.
├── code-of-conduct.md
├── .env
├── .gitignore
├── _config.yml
├── contributing.md
├── lib
│   ├── app.js
│   ├── db.js
│   └── routes
│       ├── index.js
│       └── notes
│           ├── note.js
│           └── notes.controller.js
├── LICENSE
├── package.json
├── package-lock.json
├── README.md
├── secrets.json
├── server.js
└── serverless.yml
We primarily use the dotenv package to reference the environmental variables in the .env file.
The important things to note here are db.js and the .env file. We need to reference the .env file from the db.js file. We do it as below:
db.js
const path = require('path');
const mongoose = require('mongoose')
require('dotenv').config({ path: path.resolve(__dirname, '../.env') });
console.log('here...', path.resolve(__dirname, '../.env'), process.env.MONGODB_URL)
mongoose.connect(process.env.MONGODB_URL, { useNewUrlParser: true })
.env
MONGODB_URL="mongodb://127.0.0.1:27017/game-of-thrones"
As we see in the tree, the .env file is one level above the db.js file. So we reference it via the path package as:
path.resolve(__dirname, '../.env')
I had the same issue and it still persisted even after following all that was stated here. I was losing my mind until I went over the .env file and realised I had typed DB_NAME:mongodb+srv:/..... Once I changed it to DB_NAME=mongodb+srv:/....., the variable was no longer undefined and Mongoose was able to read it.
Check your key and value in your Heroku Config Vars under your app settings. Seems silly, but I had MONGODB_URL instead of _URI and it took me forever to realize my mistake.
mongoose.connect(process.env.MONGO_URI).then(() => {
  console.log('DB Connected...')
})
In my case, while connecting to the database, I wrote the wrong spelling of MONGO_URI: I wrote MANGO_URI there (but in the .env file I had given the name as MONGO_URI). Please do check and correct it. I know it's not a proper solution, but sometimes we make such small mistakes.
I found that when I write
dotenv.config();
before
mongoose.connect(process.env.LOCAL_MONGO);
my problem is solved.
It should not be written in the constructor.
import dotenv from "dotenv";
dotenv.config();
Or:
const dotenv = require("dotenv");
dotenv.config();
No need to add "" for database URL.
Move the .env file to the root of your app, at the same level as the package.json.
1. Create a .env file and provide MONGO_URI in it. Example:
MONGO_URI=mongodb+srv://<password>@cluster0.wlhevoe.mongodb.net/test
2. In index.js, require it like this:
const dotenv = require('dotenv')
dotenv.config({ path: './.env' }) // Here you must take care to provide the correct path of the .env file

mongoose.connect(process.env.MONGO_URI, { useNewUrlParser: true })
  .then(() => console.log('DB Connection Successful'));
This worked for me. Thanks
In my case, I had left out the parentheses after config. So the correct call is below. No more errors now.
require('dotenv').config()

Mocking a yaml file in jest

I have a yaml file that has some config information, and I use it in a module that I want to test. When I test that module, I want to mock the file so it only has simplified, static data; that makes it easy to test, and if the config changes I don't have to edit the tests. Here is what I tried so far:
// config/index.js
const yaml = require('js-yaml');
const fs = require('fs');
const path = require('path');
const filePath = path.join(__dirname, 'stuff.yaml');
module.exports = {
  getStuff() {
    return yaml.safeLoad(fs.readFileSync(filePath, 'utf8'));
  },
  setStuff(stuff) {
    fs.writeFile(filePath, yaml.safeDump(stuff), err => console.log);
  }
}
// test/config.test.js
const config = require("../config")
test('getStuff', () => {
  jest.mock('../config/stuff.yaml')
  expect(config.getStuff()).toEqual({/*..*/});
});
My file structure being:
project-root/
├── config/
│   ├── __mocks__/
│   │   └── stuff.yaml (the mock file)
│   ├── stuff.yaml (the real file)
│   └── index.js
└── test/
    └── config.test.js
But the test still returns the data from the real file. Summarizing, I want to mock a text file in the file system, so that any module reads it instead of the real one.
Note: I don't really care if the mock version is on the disk or I just have it as a string in memory. Having it in memory would even be beneficial in the sense of the tests being faster.
You can probably update your Jest configuration and leverage moduleNameMapper to handle this.
{
"moduleNameMapper": {
"config/stuff.yaml": "<rootDir>/config/__mocks__/stuff.yaml"
}
}
You could also try setMock - https://facebook.github.io/jest/docs/en/jest-object.html#jestsetmockmodulename-moduleexports
jest.setMock('../config/stuff.yaml', require('../config/__mocks__/stuff.yaml'));
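Alternatively, since the question notes that keeping the mock data in memory would be fine, a different approach (not covered by the answers above) is to let Jest automock the fs module and stub readFileSync with an in-memory YAML string; the 'foo: bar' content here is just an example:
// test/config.test.js
jest.mock('fs');                  // automock fs, so readFileSync becomes a jest.fn()
const fs = require('fs');
const config = require('../config');

test('getStuff', () => {
  fs.readFileSync.mockReturnValue('foo: bar\n');     // in-memory yaml instead of the real file
  expect(config.getStuff()).toEqual({ foo: 'bar' });
});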

Unable to require() from inside an already "required" module

I have this project structure:
myApp
├── gulpfile.js
├── package.json
└── source
   └── scripts
      ├── modules
      │ └── utils.js
      ├── background.js
      └── data.json
My browserify task:
gulp.task('browserify', function () {
  return gulp.src(['./source/scripts/**/*.js'])
    .pipe($.browserify({
      debug: true, //for source maps
      standalone: pkg['export-symbol']
    }))
    .on('error', function (err) {
      console.log(err.message);
      this.emit('end');
    })
    .pipe(gulp.dest('./build/scripts/'));
});
My sample utils.js:
const data = require('../data.json');
const utils = (function () {
  const output = function () {
    console.log(data);
  };

  return {
    output: output,
  };
}());
module.exports = utils;
If I try to build it with the current directory structure, I get this error:
module "../data.json" not found from "/dev/myApp/source/scripts/fake_4d8cf8a4.js"
I can only build it if I put data.json inside the modules directory AND inside the scripts directory, i.e. it only works if I duplicate the file:
myApp
├── gulpfile.js
├── package.json
└── source
   └── scripts
      ├── modules
      │ ├── utils.js
      │ └── data.json
      ├── background.js
      └── data.json
Obviously this is not okay... what am I doing wrong?
Thank you
I'm inferring from your use of gulp.src to pass files to $.browserify that you are using a Gulp plugin, probably gulp-browserify. It is generally not recommended to use a plugin to invoke Browserify from Gulp. The recommended way to do it is to just call Browserify directly. Indeed, Gulp's blacklist of plugins states:
"gulp-browserify": "use the browserify module directly",
I've replicated your directory structure and put some reasonable values for the files for which you did not provide contents (data.json, background.js) and indeed, I get the same error you get when I try to run the Gulp code you show. However, if I switch to calling Browserify directly, I do not get any error. Here is the code I have:
const gulp = require("gulp");
const browserify = require("browserify");
const source = require('vinyl-source-stream');
gulp.task('browserify', function () {
  return browserify({
    entries: ["./source/scripts/background.js",
              "./source/scripts/modules/utils.js"],
    debug: true, //for source maps
    standalone: "foo",
  })
    .bundle()
    .pipe(source('bundle.js')) // This sets the name of the output file.
    .pipe(gulp.dest('./build/scripts/'));
});
You use gulp.src(['./source/scripts/**/*.js']) in your code, which means that Browserify will take all your .js files as entries into the bundle.
So I've put two entries in my code above, which manually replicates the pattern you use with the plugin. However, while Browserify does not produce an error with this setup, I suspect you don't actually want to have multiple entries. Typically, we pass one entry point to Browserify and let it trace the require calls to figure out what it needs to pull in.
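For comparison, a minimal sketch of that single-entry variant, assuming background.js itself require()s modules/utils.js so Browserify can trace it:
const gulp = require("gulp");
const browserify = require("browserify");
const source = require('vinyl-source-stream');

gulp.task('browserify', function () {
  return browserify({
    entries: ["./source/scripts/background.js"], // single entry point
    debug: true,
    standalone: "foo",
  })
    .bundle()
    .pipe(source('bundle.js'))
    .pipe(gulp.dest('./build/scripts/'));
});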

Gulp watch, incremental build

I'm struggling to make gulp-watch behave as I desire. This small project is tooling to build HTML5 email templates from Jade+SASS, in a repeatable way. The directory structure is as such:
.
├── Gulpfile.coffee
├── Gulpfile.js
├── build
│   ├── hello-world.html
│   └── styles
│       └── ink.css
├── node_modules
│   ├── ...snip...
├── package.json
└── source
    ├── hello-world.jade
    ├── layouts
    │   └── default.jade
    └── styles
        └── ink.scss
My wish list is thus:
Build all templates when styles or templates change. This is what I can't do.
Don't have a separate "cold" start; always use the gulp incremental build. (This seems to work.)
Live reload would reload the browser, that'd be cool. (This too.)
The Gulpfile, in CoffeeScript notation for brevity, is included below. It's predominantly based on the documentation on incremental rebuilding, including operating on full file sets.
gulp = require 'gulp'
inlineCss = require 'gulp-inline-css'
jade = require 'gulp-jade'
marked = require 'gulp-marked'
plumber = require 'gulp-plumber'
rename = require 'gulp-rename'
sass = require 'gulp-sass'
cached = require 'gulp-cached'
util = require 'gulp-util'
watch = require 'gulp-watch'
webserver = require 'gulp-webserver'
styleGlob = "source/styles/*.scss"
templateAndLayouysGlob = "source/**/*.jade"
templateGlob = "source/*.jade"
styleChangeHandler = (event) ->
  if event.type is "deleted"
    delete cached.caches.scripts[event.path]

templateChangeHandler = (event) ->
  if event.type is "deleted"
    delete cached.caches.templates[event.path]

gulp.task "styleWatcher", ->
  gulp.src(styleGlob)
    .pipe(cached('styles'))
    .pipe(watch(styleGlob))
    .pipe(sass())
    .pipe(gulp.dest("build/styles"))
    .on('error', util.log)

gulp.task "templateWatcher", ->
  gulp.src(templateGlob)
    .pipe(cached('templates'))
    .pipe(watch(templateGlob))
    .pipe(jade(pretty: true))
    .pipe(inlineCss())
    .pipe(gulp.dest("build/"))
    .on('error', util.log)

gulp.task 'webserver', ->
  buildPath = 'build/'
  gulp.src(buildPath)
    .pipe(webserver({
      livereload: true,
      directoryListing: {
        enable: true,
        path: buildPath
      },
      open: true
    }))

gulp.task "watch", ->
  styleWatcher = gulp.watch(styleGlob, ["styleWatcher"])
  styleWatcher.on 'change', styleChangeHandler

  templateWatcher = gulp.watch(templateGlob, ["templateWatcher"])
  templateWatcher.on 'change', templateChangeHandler

  # I would expect this to fire when something in build/styles/*.css
  # is updated by the style watcher?
  templateStyleWatcher = gulp.watch('build/styles/*.css', ["templateWatcher"])
  templateStyleWatcher.on 'change', templateChangeHandler

gulp.task "default", ["styleWatcher", "templateWatcher", "watch", "webserver"]
If it were possible, and I'd written this watcher in GNU Make or similar, I would have had the option to express the dependencies between the files in build/ and their sources, and rely on the tooling to rebuild those files if the ones upon which they depend are out of date.
I've seen that there are a number of gulp-<something about inlining> plugins, but none of them make clear whether they support this conditional recompilation by watching paths that were imported for changes (I doubt it).
Given my background in systems programming, I may well be approaching Javascript build tooling in a completely incorrect way.
