I am using Node.js and Socket.io for my web application.
I want to broadcast a file which can be somewhat large (>15 MB) to all the connected sockets and then use it in my clients. Is there a way to do this?
PS: if you can have a demo with babylon.js + socket.io that would be awesome
EDIT:
As requested, my server code:
import { createServer } from 'http';
import { createSocketServer } from "./socket";
import cookieParser from "cookie-parser";
import express from "express";
import morgan from "morgan";
import path from "path";
const port = 3000;
// Create a new express application instance
const app: express.Application = express();
app.use(cookieParser());
app.use(morgan('dev'));
const server = createServer(app);
// create a socket.io server
createSocketServer(server);
app.use('/', express.static(path.join(__dirname, 'public')));
server.listen(port,'0.0.0.0', function () {
console.log('Server is listening on port ' + port + ' !');
});
Babylon Assets Loading code:
this.assetsManager = new BABYLON.AssetsManager(this.scene);
this.assetsManager.addMeshTask('obj task', '',
'http://192.168.0.100:3000/babylon-files/dir1/', 'objectFile.obj');
this.assetsManager.addMeshTask('mtl task', '',
'http://192.168.0.100:3000/babylon-files/dir1/', 'materialFile.mtl');
this.assetsManager.addTextureTask('text1 task',
'http://192.168.0.100:3000/babylon-files/dir1/texture1.jpg');
this.assetsManager.addTextureTask('text2 task',
'http://192.168.0.100:3000/babylon-files/dir1/texture2.jpg');
this.assetsManager.onFinish = ((tasks) => {
this.engine.runRenderLoop(() => {
this.scene.render();
});
}).bind(this);
this.assetsManager.load();
You should store the *.obj & *.mtl files on the same server.
Note that your Node and web server code can't run on the same port.
You should use a reverse proxy or listen on different ports.
This prevents cross-site issues.
For example: create in your web server a directory that is publicly accessible: http://example.com/assets/babylon-files
In your node code you can now trigger clients to load files from that path:
// socket.io logic above
// waiting for connections, auth, etc...
// tell connected clients what they should load from
// http://example.com/assets/babylon-files
socket.broadcast.emit('loadAsset', 'house.obj');
socket.broadcast.emit('loadAsset', 'car.obj');
socket.broadcast.emit('loadAsset', 'wall.obj');
The client should look something like this:
// listen for socket.io events from the server here
const socket = io();
socket.on("loadAsset", (filename) => {
// tell babylon to load assets
BABYLON.SceneLoader.Load("/assets/babylon-files", filename, engine, function (scene) {
// do something with the scene
});
// - or -
// tell babylon to append assets
BABYLON.SceneLoader.Append("/assets/babylon-files", filename, null, function (scene) {
// do something with the scene
});
});
In the same way you can send binary data to the clients:
fs.readFile("/path/to/obj<or>mtl/file", (err, buff) => {
if (err) {
res.status(500).end();
return;
}
socket.binary(true).emit("loadAsset", buff);
});
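If you go the binary route, the client has to turn the received buffer into something Babylon can load. A rough sketch of the receiving side (not part of the original answer): the blob-URL approach and the '.obj' plugin-extension hint are assumptions, and the OBJ loader plugin from babylonjs-loaders must be registered for this to work.
const socket = io();

socket.on('loadAsset', (buff) => {
  // wrap the received ArrayBuffer in a Blob and expose it via an object URL
  const blob = new Blob([buff], { type: 'application/octet-stream' });
  const url = URL.createObjectURL(blob);

  // append into the current scene; the last argument tells Babylon which loader
  // plugin to use, since the blob URL has no file extension
  BABYLON.SceneLoader.Append('', url, scene, (loadedScene) => {
    URL.revokeObjectURL(url); // free the blob once loaded
  }, null, null, '.obj');
});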
I have a project which uses Express to receive webhooks which are sent to a specific URL. The project also contains an index.html which includes (via a script tag) the script game.js with a simple tic-tac-toe game. Up to this point everything works fine: the web app is displayed, the tic-tac-toe game works, and if a webhook is received console.log(req.body); gets executed.
Now to my question:
I want that, when server.js receives a webhook, the function webhookEvent() in game.js gets called. But I have no idea how to achieve this as I am pretty new to JavaScript/Node.js. I don't understand how I can call a function in game.js from server.js. I would be pleased if somebody could help me with that.
The project has following folder structure and file contents:
project
│ server.js
│ package.json
│
└───src
│ │ index.html
│ │ game.js
server.js (start script):
const express = require("express");
var app = express()
const path = require('path');
app.use(express.static(path.join(__dirname, 'src')));
app.get('/', (req, res) => {
res.sendFile(path.join(__dirname, 'src', 'index.html'));
});
app.use(express.urlencoded({ extended: false }));
app.post("/webhook", (req, res) => {
console.log(req.body);
res.sendStatus(200);
});
var listener = app.listen(process.env.PORT, function () {
console.log("Your bot is running on port " + listener.address().port);
});
package.json (sets server.js to start script):
{
"name": "project",
...
"main": "server.js",
"scripts": {
"start": "node server.js"
},
...
}
src/index.html (implements game.js at the bottom):
<!DOCTYPE html>
<html lang="en">
<head>
...
</head>
<body>
...
<script type="module" src="game.js"></script>
</body>
</html>
src/game.js (here I want to do something if server.js receives a webhook)
if (document.querySelector) {
document.documentElement.classList.add("js");
var ticTacToeElement = document.querySelector("#tic-tac-toe");
...
}
function webhookEvent() {
// do something when webhook is received
}
What I have already tried:
Exporting app from server.js with module.exports = app; --> issue: when I try to import this in game.js with var app = require("../server.js");, the script game.js stops working.
Exporting webhookEvent from game.js with export {webhookEvent}; and importing it in server.js --> I think this does not work because server.js runs before game.js and therefore can't import functions from it.
You can use WebSockets, which allow you to communicate back and forth between the server and the client. I put together a very simple WebSockets sandbox for you to understand and try implementing in your code.
There are many WebSocket libraries to choose from, such as Socket.io, ws, etc. The example I provided has a React frontend and an Express + Node backend. Even if you don't use React, the concepts are the same.
index.js
import express from "express";
import WebSocket, { WebSocketServer } from "ws";
import http from "http";
const heartbeat = (ws) => {
ws.isAlive = true;
};
const app = express();
const server = http.createServer(app);
const wss = new WebSocketServer({ port: 4443 });
wss.on("close", function close() {
console.log("closed");
});
wss.on("open", function connection(client) {
console.log('sent "open" to client');
client.send("open");
});
// USER CONNECTED (onload events)
wss.on("connection", async function connection(client, request) {
console.log("user connected", Date.now());
client.send(JSON.stringify({ ready: true }));
// CLIENT ALIVE-CHECK
client.isAlive = true;
client.on("pong", () => heartbeat(client));
// message all clients in the ui
client.on("message", async function message(d, isBinary) {
if (client.readyState === WebSocket.OPEN) {
const data = JSON.parse(d.toString());
console.log("client message", { data });
client.send(
JSON.stringify({
score: Math.random(),
session: Math.random(),
...data,
})
);
}
});
});
server.listen(process.env.port || 4444, () => {
console.log(`*~~% wesley's server %~~*`);
});
In your frontend you then make a connection to the WebSocket server and set up your listeners. These allow your frontend to know when your backend sent it a message. Likewise, on the server there are listeners that let it know when the client emits or sends a message.
websocket.addEventListener("message", (e) => {});
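Tying this back to the original question, the missing piece is pushing a message from the /webhook route to the browsers and reacting to it in game.js. A hedged sketch under the same assumptions as the example above (a ws server on port 4443, the wss instance in scope; the 'webhook' message type is made up for illustration):
// server.js: after receiving the webhook, fan the payload out to every open client
app.post("/webhook", (req, res) => {
  console.log(req.body);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) {
      client.send(JSON.stringify({ type: "webhook", payload: req.body }));
    }
  });
  res.sendStatus(200);
});
On the client, game.js opens a connection and calls webhookEvent() whenever such a message arrives (ws://localhost:4443 matches the port used in the example and only applies to local development):
// game.js
const websocket = new WebSocket("ws://localhost:4443");
websocket.addEventListener("message", (e) => {
  const msg = JSON.parse(e.data);
  if (msg.type === "webhook") webhookEvent();
});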
I am making a Next.js application.
Deployment works fine in Vercel.
For deploying the same project to another server, I got help from https://stackoverflow.com/a/63660079/13270726 and used the same instructions in our app.
Deployed the out directory to the server using an FTP client.
Issue
-> When we go to http://your-domain.com, it works fine. (Even a page refresh works fine on this page.)
-> If we move to the about page using the URL http://your-domain.com/about, it also works, but a page refresh on http://your-domain.com/about results in the error.
-> This page refresh also results in a console error like:
GET http://your-domain.com/about Not found
next.config.js: (With public path)
const config = {
webpack: (config, { isServer }) => {
// ...
config.devServer = {
historyApiFallback: true
}
config.output.publicPath = "/"
return config;
}
}
module.exports = withPlugins(config);
The issue arises only on page refresh or when we manually type the URL; when we navigate to the page for the first time via client-side routing, the issue is not there.
Any help would be much appreciated as I have been stuck on this for a long time.
Edit:
I have a server.js file and its code looks like this:
const dotenv = require("dotenv");
// import ENVs from ".env.local" and append to process
dotenv.config({ path: ".env.local" });
const express = require("express");
const address = require("address");
const chalk = require("chalk");
// create express web server instance
const app = express();
// pull out ENVs from process
const { LOCALHOST, PORT } = process.env;
// get the Local IP address
const LOCALIP = address.ip();
// tell express to serve up production assets from the out directory
app.use(express.static("out" + '/'));
app.get('/*', (req, res) => {
res.send('ok')
});
app.all('*', function(req, res) {
res.redirect('/index.html');
});
// tell express to listen for incoming connections on the specified PORT
app.listen(PORT, (err) => {
if (!err) {
// log the LOCALHOST and LOCALIP addresses where the app is running
console.log(
`\n${chalk.rgb(7, 54, 66).bgRgb(38, 139, 210)(" I ")} ${chalk.blue(
"Application is running at"
)} ${chalk.rgb(235, 220, 52).bold(LOCALHOST)} ${chalk.blue(
"or"
)} ${chalk.rgb(235, 220, 52).bold(`http://${LOCALIP}:${PORT}`)}\n`
);
} else {
console.error(`\nUnable to start server: ${err}`);
}
});
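For reference, a minimal sketch of an Express fallback that usually avoids the refresh 404 with a static export, assuming next export wrote about.html etc. into out/ and that a 404.html exists (neither is confirmed by the question, and a real server should also sanitize req.path):
const path = require("path");
const express = require("express");
const app = express();

// serve the exported assets first
app.use(express.static(path.join(__dirname, "out")));

// on a hard refresh of /about, try out/about.html before falling back to 404.html
app.get("*", (req, res) => {
  const page = path.join(__dirname, "out", `${req.path}.html`);
  res.sendFile(page, (err) => {
    if (err) res.status(404).sendFile(path.join(__dirname, "out", "404.html"));
  });
});

app.listen(process.env.PORT || 3000);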
I am working on a speech-to-text web app using the IBM Watson Speech to Text API. The API is fetched on the click of a button, but whenever I click the button I get the above-mentioned error. I have stored my API key and URL in a .env file.
I have tried a lot but keep on getting this error. Please help me out as I am new to all this.
I got server.js from the Watson GitHub repo.
Server.js
'use strict';
/* eslint-env node, es6 */
const env = require('dotenv');
env.config();
const express = require('express');
const app = express();
const AuthorizationV1 = require('watson-developer-cloud/authorization/v1');
const SpeechToTextV1 = require('watson-developer-cloud/speech-to-text/v1');
const TextToSpeechV1 = require('watson-developer-cloud/text-to-speech/v1');
const vcapServices = require('vcap_services');
const cors = require('cors');
// allows environment properties to be set in a file named .env
// on bluemix, enable rate-limiting and force https
if (process.env.VCAP_SERVICES) {
// enable rate-limiting
const RateLimit = require('express-rate-limit');
app.enable('trust proxy'); // required to work properly behind Bluemix's reverse proxy
const limiter = new RateLimit({
windowMs: 15 * 60 * 1000, // 15 minutes
max: 100, // limit each IP to 100 requests per windowMs
delayMs: 0 // disable delaying - full speed until the max limit is reached
});
// apply to /api/*
app.use('/api/', limiter);
// force https - microphone access requires https in Chrome and possibly other browsers
// (*.mybluemix.net domains all have built-in https support)
const secure = require('express-secure-only');
app.use(secure());
}
app.use(express.static(__dirname + '/static'));
app.use(cors())
// token endpoints
// **Warning**: these endpoints should probably be guarded with additional authentication & authorization for production use
// speech to text token endpoint
var sttAuthService = new AuthorizationV1(
Object.assign(
{
iam_apikey: process.env.SPEECH_TO_TEXT_IAM_APIKEY, // if using an RC service
url: process.env.SPEECH_TO_TEXT_URL ? process.env.SPEECH_TO_TEXT_URL : SpeechToTextV1.URL
},
vcapServices.getCredentials('speech_to_text') // pulls credentials from environment in bluemix, otherwise returns {}
)
);
app.use('/api/speech-to-text/token', function(req, res) {
sttAuthService.getToken(function(err, token) {
if (err) {
console.log('Error retrieving token: ', err);
res.status(500).send('Error retrieving token');
return;
}
res.send(token);
});
});
const port = process.env.PORT || process.env.VCAP_APP_PORT || 3002;
app.listen(port, function() {
console.log('Example IBM Watson Speech JS SDK client app & token server live at http://localhost:%s/', port);
});
// Chrome requires https to access the user's microphone unless it's a localhost url so
// this sets up a basic server on port 3001 using an included self-signed certificate
// note: this is not suitable for production use
// however bluemix automatically adds https support at https://<myapp>.mybluemix.net
if (!process.env.VCAP_SERVICES) {
const fs = require('fs');
const https = require('https');
const HTTPS_PORT = 3001;
const options = {
key: fs.readFileSync(__dirname + '/keys/localhost.pem'),
cert: fs.readFileSync(__dirname + '/keys/localhost.cert')
};
https.createServer(options, app).listen(HTTPS_PORT, function() {
console.log('Secure server live at https://localhost:%s/', HTTPS_PORT);
});
}
App.js
import React, {Component} from 'react';
import 'tachyons';
//import WatsonSpeech from 'ibm-watson';
var recognizeMic = require('watson-speech/speech-to-text/recognize-microphone');
class App extends Component {
onListenClick = () => {
fetch('http://localhost:3002/api/speech-to-text/token')
.then(function(response) {
return response.text();
}).then(function (token) {
var stream = recognizeMic({
token: token, // use `access_token` as the parameter name if using an RC service
objectMode: true, // send objects instead of text
extractResults: true, // convert {results: [{alternatives:[...]}], result_index: 0} to {alternatives: [...], index: 0}
format: false // optional - performs basic formatting on the results such as capitals and periods
});
stream.on('data', function(data) {
console.log('error 1')
console.log(data);
});
stream.on('error', function(err) {
console.log('error 2')
console.log(err);
});
//document.querySelector('#stop').onclick = stream.stop.bind(stream);
}).catch(function(error) {
console.log('error 3')
console.log(error);
});
}
render() {
return(
<div>
<h2 className="tc"> Hello, and welcome to Watson Speech to text api</h2>
<button onClick={this.onListenClick}>Listen to Microphone</button>
</div>
);
}
}
export default App
Since the only code you show is fetching an authorisation token, I guess that is what is throwing the authentication failure. I am not sure how old the code you are using is, but the mechanism you are using dates from when the STT service credentials were userid / password. The mechanism became unreliable when IAM keys started to be used.
Your sample is still using watson-developer-cloud, but that has been superseded by ibm-watson. As migrating the code to ibm-watson would take a lot of rework, you can continue to use watson-developer-cloud.
If you do stick with watson-developer-cloud and you want to get hold of a token with an IAM key, then use:
const AuthIAMV1 = require('ibm-cloud-sdk-core/iam-token-manager/v1');
...
const tokenService = new AuthIAMV1.IamTokenManagerV1({ iamApikey: apikey });
...
tokenService.getToken((err, res) => {
if (err) {
...
} else {
token = res;
...
}
});
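Wired into the token endpoint from the question's server.js, that might look roughly like this. This is a sketch only: it reuses the SPEECH_TO_TEXT_IAM_APIKEY variable already read in server.js and the getToken callback shape shown above.
const AuthIAMV1 = require('ibm-cloud-sdk-core/iam-token-manager/v1');

// one token manager for the STT service, built from the IAM API key
const sttTokenManager = new AuthIAMV1.IamTokenManagerV1({
  iamApikey: process.env.SPEECH_TO_TEXT_IAM_APIKEY
});

// same route as before, but backed by the IAM token manager
app.use('/api/speech-to-text/token', function (req, res) {
  sttTokenManager.getToken(function (err, token) {
    if (err) {
      console.log('Error retrieving token: ', err);
      res.status(500).send('Error retrieving token');
      return;
    }
    res.send(token);
  });
});
On the client, the comment in App.js already hints that an IAM-style token may need to be passed as access_token instead of token to recognizeMic.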
I'm trying to set up my server with WebSockets so that when I update something via my routes I can also emit a WebSocket message when something on that route is updated.
The idea is to save something to my Mongo DB when someone hits the route /add-team-member, for example, then emit a message to everyone who is connected via WebSocket and is part of whatever WebSocket room corresponds to that team.
I've followed the socket.io documentation to set up my app in the following way:
App.js
// there's a lot of code in here which sets what to use on my app but here's the important lines
const app = express();
const routes = require('./routes/index');
const sessionObj = {
secret: process.env.SECRET,
key: process.env.KEY,
resave: false,
saveUninitialized: false,
store: new MongoStore({ mongooseConnection: mongoose.connection }),
secret : 'test',
cookie:{_expires : Number(process.env.COOKIETIME)}, // time im ms
}
app.use(session(sessionObj));
app.use(passport.initialize());
app.use(passport.session());
module.exports = {app,sessionObj};
start.js
const mongoose = require('mongoose');
const passportSocketIo = require("passport.socketio");
const cookieParser = require('cookie-parser');
// import environmental variables from our variables.env file
require('dotenv').config({ path: 'variables.env' });
// Connect to our Database and handle an bad connections
mongoose.connect(process.env.DATABASE);
// import mongo db models
require('./models/user');
require('./models/team');
// Start our app!
const app = require('./app');
app.app.set('port', process.env.PORT || 7777);
const server = app.app.listen(app.app.get('port'), () => {
console.log(`Express running → PORT ${server.address().port}`);
});
const io = require('socket.io')(server);
io.set('authorization', passportSocketIo.authorize({
cookieParser: cookieParser,
key: app.sessionObj.key, // the name of the cookie where express/connect stores its session_id
secret: app.sessionObj.secret, // the session_secret to parse the cookie
store: app.sessionObj.store, // we NEED to use a sessionstore. no memorystore please
success: onAuthorizeSuccess, // *optional* callback on success - read more below
fail: onAuthorizeFail, // *optional* callback on fail/error - read more below
}));
function onAuthorizeSuccess(data, accept){}
function onAuthorizeFail(data, message, error, accept){}
io.on('connection', function(client) {
client.on('join', function(data) {
client.emit('messages',"server socket response!!");
});
client.on('getmessage', function(data) {
client.emit('messages',data);
});
});
My problem is that I have a lot of Mongo DB save actions going on in my ./routes/index file, and I would like to be able to emit messages from my routes rather than from the end of start.js where socket.io is connected.
Is there any way that I could emit a WebSocket message from my ./routes/index file even though io is set up further down the line in start.js?
For example, something like this:
router.get('/add-team-member', (req, res) => {
// some io.emit action here
});
Maybe I need to move where I'm initializing the socket.io stuff, but I haven't been able to find any documentation on this. Or perhaps I can already access socket.io from my routes somehow?
Thanks, and I appreciate the help. Let me know if anything is unclear!
As mentioned above, io is in your global scope. If you do
router.get('/add-team-member', (req, res) => {
io.sockets.emit('AddTeamMember');
});
Then every client connected, if listening for that AddTeamMember event, will run its associated .on handler on their respective client. This is probably the easiest solution, and unless you're expecting a huge wave of users without any plans for load balancing, it should be suitable for the time being.
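For completeness, the matching listener on the client would look roughly like this (the callback body is just a placeholder):
const socket = io();

socket.on('AddTeamMember', () => {
  // refresh the team list, show a notification, etc.
});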
Another alternative you can go with:
The socket.io lib has a rooms functionality where you can join and emit using the io object itself: https://socket.io/docs/rooms-and-namespaces/ . If you have a knack for this, it'd look something like this:
io.sockets.in('yourroom').emit('AddTeamMember');
This would essentially do the same thing as the above, only instead of broadcasting to every client, it'd only broadcast to those that are in that room. You'd basically have to figure out a way to get that user's socket into the room *before* they make the GET request, or in other words, make them exclusive. That way you can reduce the amount of load your server has to push out whenever that route request is made.
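A rough sketch of that room-based variant, assuming clients send a 'joinTeam' event with their team id after connecting (both the event name and the teamId query parameter below are made up for illustration):
io.on('connection', (client) => {
  // put the socket into its team's room as soon as we know the team
  client.on('joinTeam', (teamId) => {
    client.join(`team-${teamId}`);
  });
});

router.get('/add-team-member', (req, res) => {
  // only sockets in this team's room receive the event
  io.to(`team-${req.query.teamId}`).emit('AddTeamMember');
  res.sendStatus(200);
});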
Lastly, if neither of the above options works for you and you just absolutely have to send to that singular client when they initiate it, then it's going to get messy, because you have to have some sort of id for that person, and since you have no reference, you'd have to store all your sockets upon connection and then make a comparison. I don't fully recommend something like this, because I haven't ever tested it and don't know what type of repercussions could happen, but here is the gist of an idea I had:
app.set('trust proxy', true)
var SOCKETS = []
io.on('connection', function(client) {
SOCKETS.push(client);
client.on('join', function(data) {
client.emit('messages',"server socket response!!");
});
client.on('getmessage', function(data) {
client.emit('messages',data);
});
});
router.get('/add-team-member', (req, res) => {
for (let i=0; i< SOCKETS.length; i++){
if(SOCKETS[i].request.connection.remoteAddress == req.ip)
SOCKETS[i].emit('AddTeamMember');
}
});
Keep in mind, if you do go down this route, you're gonna need to maintain that array when users disconnect, and if you're doing session management, that's gonna get hairy really really quick.
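A small sketch of that bookkeeping, so SOCKETS doesn't grow forever as clients come and go:
io.on('connection', function(client) {
  SOCKETS.push(client);
  client.on('disconnect', function() {
    // drop the socket from the array once the client is gone
    const i = SOCKETS.indexOf(client);
    if (i !== -1) SOCKETS.splice(i, 1);
  });
});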
Good luck, let us know your results.
Yes, it is possible: you just have to attach the socket.io instance to the request whenever your server receives one.
Looking at your file start.js, you just have to change your setup like this:
// Start our app!
const app = require('./app');
app.app.set('port', process.env.PORT || 7777);
const server = app.app.listen(app.app.get('port'), () => {
console.log(`Express running → PORT ${server.address().port}`);
});
// attach socket.io to the http server, then expose it on each incoming request
const io = require('socket.io')(server);
server.on('request', function(request, response){
request.io = io;
});
Now, when you receive an event that you want to emit to the clients, you can use the io instance from the request object:
router.get('/add-team-member', (req, res) => {
req.io.sockets.emit('addteammember', {member: 6});
// as you are doing a broadcast, a plain emit to all connected sockets is enough
....
res.status(200)
res.end()
});
Doing that, I was also able to integrate with test frameworks like mocha and test the emitted events too...
I did some integrations like that, and in my experience the last thing to do was emit the message to the socket instances.
As good practice, the very beginning of the middleware functions I had did data validation, data sanitization and data cleanup.
Here is my working example:
var app = require('../app');
var server = require('http').Server(app);
var io = require('socket.io')(server);
io.on('connection', function(client) {
client.emit('connected');
client.on('disconnect', function() {
console.log('disconnected', client.id);
});
});
server.on('request', function(request, response) {
request.io = io;
});
pg.initialize(app.config.DATABASEURL, function(err){
if(err){
throw err;
}
app.set('port', process.env.PORT || 3000);
var server1 = server.listen(app.get('port'), function(){
var host = 'localhost';
var port = server1.address().port;
console.log('Example app listening at http://%s:%s', host, port);
});
});
Your io is actually the socket object; you can emit events from this object to any specific user with:
io.to(userSocketId).emit('eventName', data);
Or you can broadcast with:
io.emit('eventName', data);
Just make sure to require socket.io before using it :)
You can use an emitter adapter to emit data to clients in another process/server. It uses a Redis DB as the backend for emitting messages.
I did something similar in the past, using namespaces.
Let's say your clients connect to your server using "Frontend" as the namespace.
My solution was to create the socket.io instance as a class in a separate file:
websockets/index.js
const socket = require('socket.io');
class websockets {
constructor(server) {
this.io = socket(server);
this.frontend = new Frontend(this.io);
this.io.use((socket, next) => {
// put here the logic to authorize your users..
// even better in a separate file :-)
next();
});
}
}
class Frontend {
constructor(io) {
this.nsp = io.of('/Frontend');
[ ... ]
}
}
module.exports = websockets;
Then in App.js
const app = require('express')();
const server = require('http').createServer(app);
const websockets = require('./websockets/index');
const WS = new websockets(server);
app.use('/', (req, res, next) => {
req.websocket = WS;
next();
}, require('./routes/index'));
[ ... ]
Finally, your routes can do:
routes/index.js
router.get('/add-team-member', (req, res) => {
req.websocket.frontend.nsp.emit('whatever', { ... });
[ ... ]
});
My Issue
I've coded a very simple CRUD API and I've recently started writing some tests using chai and chai-http, but I'm having an issue when running my tests with $ mocha.
When I run the tests I get the following error on the shell:
TypeError: app.address is not a function
My Code
Here is a sample of one of my tests (/tests/server-test.js):
var chai = require('chai');
var mongoose = require('mongoose');
var chaiHttp = require('chai-http');
var server = require('../server/app'); // my express app
var should = chai.should();
var testUtils = require('./test-utils');
chai.use(chaiHttp);
describe('API Tests', function() {
before(function() {
mongoose.createConnection('mongodb://localhost/bot-test', myOptionsObj);
});
beforeEach(function(done) {
// I do stuff like populating db
});
afterEach(function(done) {
// I do stuff like deleting populated db
});
after(function() {
mongoose.connection.close();
});
describe('Boxes', function() {
it.only('should list ALL boxes on /boxes GET', function(done) {
chai.request(server)
.get('/api/boxes')
.end(function(err, res){
res.should.have.status(200);
done();
});
});
// the rest of the tests would continue here...
});
});
And my express app files (/server/app.js):
var mongoose = require('mongoose');
var express = require('express');
var api = require('./routes/api.js');
var app = express();
mongoose.connect('mongodb://localhost/db-dev', myOptionsObj);
// application configuration
require('./config/express')(app);
// routing set up
app.use('/api', api);
var server = app.listen(3000, function () {
var host = server.address().address;
var port = server.address().port;
console.log('App listening at http://%s:%s', host, port);
});
and (/server/routes/api.js):
var express = require('express');
var boxController = require('../modules/box/controller');
var thingController = require('../modules/thing/controller');
var router = express.Router();
// API routing
router.get('/boxes', boxController.getAll);
// etc.
module.exports = router;
Extra notes
I've tried logging out the server variable in the /tests/server-test.js file before running the tests:
...
var server = require('../server/app'); // my express app
...
console.log('server: ', server);
...
and the result of that is an empty object: server: {}.
You don't export anything in your app module. Try adding this to your app.js file:
module.exports = server
It's important to export the http.Server object returned by app.listen(3000) instead of just the function app, otherwise you will get TypeError: app.address is not a function.
Example:
index.js
const koa = require('koa');
const app = new koa();
module.exports = app.listen(3000);
index.spec.js
const request = require('supertest');
const app = require('./index.js');
describe('User Registration', () => {
const agent = request.agent(app);
it('should ...', () => {
// ...
});
});
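The same idea applied to the Express app from the question would be, roughly (a sketch, not the asker's full app.js):
// app.js
const express = require('express');
const app = express();
// ... middleware and routes ...
const server = app.listen(3000);
module.exports = server; // export the http.Server, not just `app`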
This may also help, and addresses @dman's point about changing application code to fit a test.
Make your request to localhost and the port as needed:
chai.request('http://localhost:5000')
instead of
chai.request(server)
This fixed the same error message I had using Koa JS (v2) and ava js.
The answers above correctly address the issue: supertest wants an http.Server to work on. However, calling app.listen() to get a server will also start a listening server; this is bad practice and unnecessary.
You can get around this by using http.createServer():
import * as http from 'http';
import * as supertest from 'supertest';
import * as test from 'tape';
import * as Koa from 'koa';
const app = new Koa();
// add some routes here
const apptest = supertest(http.createServer(app.callback()));
test('GET /healthcheck', (t) => {
apptest.get('/healthcheck')
.expect(200)
.expect(res => {
t.equal(res.text, 'Ok');
})
.end(t.end.bind(t));
});
Just in case someone uses Hapi.js: the issue still occurs, because it does not use Express.js, and thus the address() function does not exist.
TypeError: app.address is not a function
at serverAddress (node_modules/chai-http/lib/request.js:282:18)
The workaround to make it work
// this makes the server to start up
let server = require('../../server')
// pass this instead of server to avoid error
const API = 'http://localhost:3000'
describe('/GET token ', () => {
it('JWT token', (done) => {
chai.request(API)
.get('/api/token?....')
.end((err, res) => {
res.should.have.status(200)
res.body.should.be.a('object')
res.body.should.have.property('token')
done()
})
})
})
Export app at the end of the main API file like index.js.
module.exports = app;
We had the same issue when we ran mocha using ts-node in our Node + TypeScript serverless project.
Our tsconfig.json had "sourceMap": true, so the generated .js and .js.map files caused some funny transpiling issues (similar to this) when we ran the mocha runner using ts-node. So I set the sourceMap flag to false and deleted all .js and .js.map files in our src directory. Then the issue was gone.
If you have already generated files in your src folder, the commands below would be really helpful:
find src -name "*.js.map" -exec rm {} \;
find src -name "*.js" -exec rm {} \;
I am using Jest and Supertest, but was receiving the same error. It was because my server takes time to set up (it is async to set up the DB, read config, etc.). I needed to use Jest's beforeAll helper to allow the async setup to run. I also needed to refactor my server to separate listening, and instead use @Whyhankee's suggestion to create the test's server.
index.js
export async function createServer() {
//setup db, server,config, middleware
return express();
}
async function startServer(){
let app = await createServer();
await app.listen({ port: 4000 });
console.log("Server has started!");
}
if(process.env.NODE_ENV ==="dev") startServer();
test.ts
import {createServer as createMyAppServer} from '#index';
import { test, expect, beforeAll } from '#jest/globals'
const supertest = require("supertest");
import * as http from 'http';
let request :any;
beforeAll(async ()=>{
request = supertest(http.createServer(await createMyAppServer()));
})
test("fetch users", async (done: any) => {
request
.post("/graphql")
.send({
query: "{ getQueryFromGqlServer (id:1) { id} }",
})
.set("Accept", "application/json")
.expect("Content-Type", /json/)
.expect(200)
.end(function (err: any, res: any) {
if (err) return done(err);
expect(res.body).toBeInstanceOf(Object);
let serverErrors = JSON.parse(res.text)['errors'];
expect(serverErrors.length).toEqual(0);
expect(res.body.data.id).toEqual(1);
done();
});
});
Edit:
I also had errors when using data.forEach(async () => ...); I should have used for (let x of ...) in my tests.
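The reason the switch matters: forEach never awaits its callback, so the test can finish before the assertions run, whereas for...of lets each await complete in order. A tiny sketch, where check() stands in for whatever async assertion the test makes:
async function checkAll(data, check) {
  // broken: the async callbacks are fired and forgotten, nothing is awaited
  // data.forEach(async (item) => { await check(item); });

  // correct: each iteration is awaited before the next one starts
  for (const item of data) {
    await check(item);
  }
}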