I am trying to make a backend using Hapi for the first time, but every time a request is sent to the server it crashes. Sometimes I do get a response, but the server eventually crashes on its own.
The error I get is:
TypeError: Cannot read properties of null (reading 'statusCode')
at Request._finalize (C:\Users\prakh\Desktop\Angular\buy-and-sell-backend\node_modules\@hapi\hapi\lib\request.js:491:31)
at Request._reply (C:\Users\prakh\Desktop\Angular\buy-and-sell-backend\node_modules\@hapi\hapi\lib\request.js:428:18)
at Request._execute (C:\Users\prakh\Desktop\Angular\buy-and-sell-backend\node_modules\@hapi\hapi\lib\request.js:274:14)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
The code is simple since I am only testing right now:
import Hapi from '@hapi/hapi'
const start = async () => {
  const server = Hapi.server({
    port: 8000,
    host: 'localhost',
  });
  server.route({
    method: 'GET',
    path: '/hello',
    handler: (req, h) => {
      return 'Hello!';
    }
  });
  await server.start();
  console.log(`Server is listening on ${server.info.uri}`)
}
process.on('unhandledRejection', err => {
  console.log(err);
  process.exit(1);
});
start();
I am using Node v16.17.0 and the command I use to run it is npx babel-node src/server.js
I am not sure what I am doing wrong here.
Found the fix.
Had to update the Hapi version that I was using.
Use: npm uninstall @hapi/hapi
and then: npm install @hapi/hapi
Hapi 20.2.2 works
Related
I am developing an application using Node.js and Docker. Within the code, I need to make a request to the GitLab API to fetch some data. When I run the application through Docker with docker-compose exec web sh -c "project=GitLab type=New npm start", I get an error because it cannot get a response from the API call, yet the same code and API request work perfectly when run directly with node index.js.
Following is the code I have:
./web/index.js:
const express = require('express');
const http = require("http");
const bodyParser = require('body-parser');
const app = express();
const port = process.env.PORT || 9000;
const gitlabDump = require("./controller/GitLabDump");
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
// Make Node.js listen on a particular port on localhost
app.listen(port, function(){
  var project = process.env.project;
  var type = process.env.type;
  if(project.trim() === "GitLab" && (type.trim() === "New" || type.trim() === "Update")){
    // If project is GitLab then fetch the data from GitLab
    console.log("Fetching GitLab Data.......");
    gitlabDump.gitlabDump(type, function(data){
      console.log("Completed Execution for GitLab")
      process.exit();
    })
  }
});
Following is my controller code where I am making the API request:
./web/controller/GitLabDump.js
const request = require('request');
exports.gitlabDump = function(type, callback){
  var gitlabAPI = "https://gitlab.com/api/v4/projects/<project_id>/repository/tree?ref=<target_branch>&path=path/to/subdirectory";
  console.log("BEFORE \n")
  request(gitlabAPI, function(error, response, body) {
    console.log(JSON.parse(body))
    callback("Completed");
  })
}
Following is my docker-compose file:
./docker-compose.yml
version: '3'
services:
  db:
    container_name: db
    image: mysql:5.7
    volumes:
      - "./data/mysql:/var/lib/mysql:rw"
    environment:
      MYSQL_DATABASE: myDatabase
      MYSQL_ROOT_PASSWORD: myPassword
      MYSQL_PASSWORD: myPassword
      DATABASE_HOST: localhost
    restart: always
  web:
    container_name: web
    image: node:8
    volumes:
      - ./web:/usr/src/app
    working_dir: /usr/src/app
    depends_on:
      - db
    restart: on-failure
    command: "tail -f /dev/null"
    environment: ["project=${project}", "type=${type}"]
Following is the command I am using to run the application:
docker-compose exec web sh -c "project=GitLab type=New npm start"
Following is the error that I get:
Fetching GitLab Data.......
BEFORE
undefined:1
undefined
^
The error comes from the line console.log(JSON.parse(body)), because body is undefined: the API call returns no data.
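As a side note, guarding the parse step surfaces the real failure instead of the opaque undefined:1 crash. safeParse below is a hypothetical helper for illustration, not part of the request library:

```javascript
// safeParse: fail loudly with the underlying cause instead of letting
// JSON.parse(undefined) produce the cryptic "undefined:1" error.
function safeParse(error, body) {
  if (error) {
    throw new Error('request failed: ' + error.message);
  }
  if (body === undefined || body === null) {
    throw new Error('empty response body');
  }
  return JSON.parse(body);
}
```

Calling safeParse(error, body) inside the request callback would have surfaced the underlying certificate error immediately.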
Please Note:
The API URL is correct and I have the proper access, because the same URL gives me data when accessed through Chrome, Postman, and even when running the code with the node index.js command.
I am using the same application to make other API calls apart from GitLab and they are working fine.
Can someone please help me understand what the issue is here and why it is failing for the GitLab API?
Posting the answer here as it can be useful to someone else in the future:
Finally, I was able to find the resolution. The issue is not caused by Docker but by Node.js itself. If we console.log the error, we get a "certificate has expired" message.
The fix is to make the request like this:
request({
  url: gitlabAPI,
  agentOptions: {
    rejectUnauthorized: false
  }
}, function (error, response, body) {
  console.log(JSON.parse(response.body))
});
Refer to the question: Node.js request CERT_HAS_EXPIRED
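A note of caution: rejectUnauthorized: false turns off certificate verification entirely. If the underlying cause is an outdated CA bundle shipped with the Node version in use, a safer workaround is Node's NODE_EXTRA_CA_CERTS environment variable (the path below is an example for a typical Linux system):

```shell
# Safer than rejectUnauthorized: false — keep verification on and
# point Node at an up-to-date CA bundle instead (path is an example):
NODE_EXTRA_CA_CERTS=/etc/ssl/certs/ca-certificates.crt node index.js
```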
I'm trying to add Log4js-Node to a Node.js server running on Apache. Here's my code:
const path = require("path");
const express = require("express");
const log4js = require('log4js');
const app = express();
const logger = log4js.getLogger();
logger.level = "debug";
const port = 443;
log4js.configure({
  appenders: { everything: { type: 'file', filename: 'logs.log', flags: 'w' } },
  categories: { default: { appenders: ['everything'], level: 'ALL' } }
});
const server = app.listen(port, () => {
  logger.debug("listening to requests on port " + port);
});
app.get("/log", (req, res) => {
  res.sendFile(path.join(__dirname, "logs.log"));
});
When I run the script on Node.js on my computer and navigate to localhost:443/log I see what I expect, which is this:
[2020-03-17T22:50:43.145] [DEBUG] default - listening to requests on port 443
But when I run the code on a remote server it crashes and I get this in the error page (with part of the path replaced by me with "[removed]"):
App 25925 output: at Server. ([removed]/index.js:27:9)
App 25925 output: at Logger. [as debug] ([removed]/12/lib/node_modules/log4js/lib/logger.js:124:10)
App 25925 output: at Logger.log ([removed]/12/lib/node_modules/log4js/lib/logger.js:73:12)
App 25925 output: at Logger._log ([removed]/12/lib/node_modules/log4js/lib/logger.js:90:16)
App 25925 output: at Object.send ([removed]/12/lib/node_modules/log4js/lib/clustering.js:97:15)
App 25925 output: [removed]/12/lib/node_modules/log4js/lib/clustering.js:97
App 25925 output: at Object. ([removed]/12/lib/node_modules/log4js/lib/clustering.js:8:13)
I'm using A2 Hosting which uses Apache 2.4.41. I opted for Node.js 12.9.0, and Log4js 6.1.2. The package.json should be the same on both my computer and the server, and I've run npm install on both.
Is this just an issue with Log4js and the server, or have I missed something somewhere?
This was actually a relatively simple fix. The path referenced by the last error in the stack trace is a Log4js module that implements clustering support through Node's "cluster" module. The line "8" referenced is cluster = require("cluster"). It's wrapped in a try/catch block like this:
try {
  cluster = require("cluster"); //eslint-disable-line
} catch (e) {
  debug("cluster module not present");
  disabled = true;
}
The installation of Node.js on my computer came with the "cluster" module; however, as far as I can tell, the server I'm using doesn't support it. Also, the version of Node I'm using on my computer is newer than what I'm using on the server (so I've now installed 12.9 on my machine). I believe the older version of Node fails while trying to load the cluster module and throws the error before the catch can handle it.
So the simple fix was to comment out most of the "try/catch" block, leaving just the contents of "catch" like this:
// try {
//   cluster = require("cluster"); //eslint-disable-line
// } catch (e) {
debug("cluster module not present");
disabled = true;
// }
If someone has a better fix, I'm open to suggestions.
Same as the response from @skittleswrapper, thanks, it worked for me.
I use Node.js 14.18.1 with log4js 6.3.0.
But I'm wondering why this 'cluster' module is necessary, and whether we can add it to our app in some other way.
LT027296-Mac:~$ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
mongo latest 394204d45d87 3 weeks ago 410MB
redis latest a55fbf438dfd 4 weeks ago 95MB
nginx latest 2bcb04bdb83f 4 weeks ago 109MB
bitnami/mysql latest c5c056b8435c 3 months ago 287MB
LT027296-Mac:~$ docker run --name some-redis -d redis
15e126e26ea452b2b8c2933c549a15d74bb49aece1fe8b5e4b746e67bced6c20
LTB0207296-Mac:~ b0207296$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
15e126e26ea4 redis "docker-entrypoint.s…" About a minute ago Up About a minute 6379/tcp some-redis
LT027296-Mac:~$
Hi,
I am learning Redis from the tutorial at the link below:
https://medium.com/tech-tajawal/introduction-to-caching-redis-node-js-e477eb969eab
I did the following steps:
Download the redis image from Docker Hub
Then run the container (as shown above)
Then I run my code:
const express = require('express')
const fetch = require("node-fetch");
const redis = require('redis')

// create express application instance
const app = express()

// create and connect redis client to local instance.
const client = redis.createClient()

// echo redis errors to the console
client.on('error', (err) => {
  console.log("Error " + err)
});

// get photos list
app.get('/photos', (req, res) => {
  // key to store results in Redis store
  const photosRedisKey = 'user:photos';
  // Try fetching the result from Redis first in case we have it cached
  return client.get(photosRedisKey, (err, photos) => {
    // If that key exists in Redis store
    if (photos) {
      return res.json({ source: 'cache', data: JSON.parse(photos) })
    } else { // Key does not exist in Redis store
      // Fetch directly from remote api
      fetch('https://jsonplaceholder.typicode.com/photos')
        .then(response => response.json())
        .then(photos => {
          // Save the API response in Redis store, data expire time in 3600 seconds, it means one hour
          client.setex(photosRedisKey, 3600, JSON.stringify(photos))
          // Send JSON response to client
          return res.json({ source: 'api', data: photos })
        })
        .catch(error => {
          // log error message
          console.log(error)
          // send error to the client
          return res.json(error.toString())
        })
    }
  });
});

// start express server at 3000 port
app.listen(3000, () => {
  console.log('Server listening on port: ', 3000)
});
I am getting this error:
Redis connection to 127.0.0.1:6379 failed - connect ECONNREFUSED 127.0.0.1:6379
While running the Docker image you have to publish the port as well:
docker run --name some-redis -p 6379:6379 -d redis
Note that the -p flag has to come before the image name; anything after the image name is passed to the container as its command.
This will map the container's port to the host port.
Optionally, you can also bind to a specific host address:
-p 127.0.0.1:8001:8001
Read more about publish here
My server has Node-RED embedded. I'm trying to create a new WebSocket listener on the server, but when this code executes, the websockets in the Node-RED application stop working.
const WebSocket = require('ws');
const wss = new WebSocket.Server({
  server: server,
  path: '/test'
});
wss.on('connection', function connection(ws, req) {
  console.log('test');
});
Websocket in node-red admin panel:
Problem is related to:
https://github.com/websockets/ws/issues/381
How can I access the Node-RED websocket and handle messages on my own path?
I know this is an old thread, but I thought I'd throw in that you can use the OP's code in Node-RED like this:
var WebSocket = global.get('ws');
const wss = new WebSocket.Server({
  server: <your http(s) server>,
  path: '/'
});
wss.on('connection', function connection(ws, req) {
  node.warn('connection');
});
You just need to:
npm install ws
edit settings.js
under functionGlobalContext: add
ws:require('ws')
It does work, I'm using it like this because I couldn't get the websocket node to work in my configuration.
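For reference, the settings.js change from the steps above looks roughly like this (only the functionGlobalContext entry is shown; keep whatever else you already have in it):

```javascript
// settings.js (Node-RED) — expose the installed ws package to function nodes
functionGlobalContext: {
    ws: require('ws')
},
```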
I am trying to start and stop gulp-webserver in different gulp tasks like below:
gulp.task('webserver', ['build'], function () {
  gulp.src([config.build, config.base])
    .pipe(webserver({
      livereload: false,
      directoryListing: false,
      open: true
    }));
});
gulp.task('webserver-stop', function () {
  var stream = gulp.src([config.build, config.base])
    .pipe(webserver());
  stream.emit('kill');
});
I am able to successfully start the server, but when I try to stop it using gulp webserver-stop, it gives the following error.
[19:36:30] Finished 'webserver-stop' after 27 ms
events.js:160
throw er; // Unhandled 'error' event
^
Error: listen EADDRINUSE 127.0.0.1:8000
at Object.exports._errnoException (util.js:1008:11)
at exports._exceptionWithHostPort (util.js:1031:20)
at Server._listen2 (net.js:1253:14)
at listen (net.js:1289:10)
at net.js:1399:9
at GetAddrInfoReqWrap.asyncCallback [as callback] (dns.js:65:16)
at GetAddrInfoReqWrap.onlookup [as oncomplete] (dns.js:84:10)
I have no significant prior experience with gulp and JavaScript,
so any help fixing this would be appreciated.
When you run gulp webserver and then gulp webserver-stop, you have two processes. Those two processes know nothing of each other.
So when you invoke webserver() in webserver-stop, that just starts another webserver instance. It doesn't connect to the already-running instance or anything like that. And since there is already one webserver running on port 8000, you get an EADDRINUSE error.
Instead, your webserver-stop task needs to send a message to the running webserver that causes it to shut down.
Since you're already running a webserver, you might as well send that message over HTTP. You can use the middleware option of gulp-webserver to achieve this:
var gulp = require('gulp');
var webserver = require('gulp-webserver');
var http = require('http');

gulp.task('webserver', function () {
  var stream = gulp.src(['.'])
    .pipe(webserver({
      livereload: false,
      directoryListing: false,
      open: true,
      middleware: function(req, res, next) {
        if (/_kill_\/?/.test(req.url)) {
          res.end();
          stream.emit('kill');
        }
        next();
      }
    }));
});

gulp.task('webserver-stop', function (cb) {
  http.request('http://localhost:8000/_kill_').on('close', cb).end();
});
Now you can invoke gulp webserver-stop in another terminal window or simply open http://localhost:8000/_kill_ in a browser to shut down the running webserver instance.