Socket.io not globally available when using Webpack - javascript

Everything was working fine before adding webpack. Here's my current config (below). Inside login.js, there's a window.onload function, then shortly after there's a socket.on function, which is now breaking the program. It seems as if "socket" is not globally available to the other files after adding webpack.
Here's the error code I receive in console when running my app:
"login.js:1 Uncaught TypeError: socket.on is not a function at window.onload (login.js:1)"
Any help or insight would be appreciated.
webpack.config
const path = require('path')
module.exports = {
  entry: {
    scripts: './src/scripts.js',
    login: './src/login.js'
  },
  output: {
    path: path.resolve(__dirname, 'public/js'),
    filename: '[name].js'
  },
  module: {
    rules: [{
      test: /\.js$/,
      exclude: /node_modules/,
      use: {
        loader: 'babel-loader',
        options: {
          presets: ['env']
        }
      }
    }]
  },
  devServer: {
    contentBase: path.resolve(__dirname, 'public'),
    publicPath: '/js/'
  }
}
Script section of index.html
<script src="./socket.io/socket.io.js"></script>
<script src="./js/scripts.js"></script>
<script src="./js/login.js"></script>
</body>
Worked fine before, so I don't believe there's an issue here:
Server
const http = require('http').Server(app);
const io = require('socket.io')(http);
Client
scripts.js (not in a function, globally declared):
const socket = io();
let socketid;
socket.on('connect', () => socketid = socket.io.engine.id);
login.js (inside window.onload function):
socket.on('check email', () => {
  setTimeout(() => refreshPage(), 7500);
  popupbox({
    titletext: 'Verify email',
    messagetext: 'A verification link has been sent, please verify your email address within 24 hours.',
    okaytext: 'Okay',
    okayfunction: () => refreshPage(),
    customcolor: "#007C5B"
  });
});

The const socket is local to the module defined by scripts.js; it is not a global variable, so it does not exist in login.js.
While it would be possible to make it global, the whole idea of modules and bundlers is to encapsulate your code into small logical parts so that you don't need to pollute the global namespace.
So you need to pass your socket to login.js in some way. How you want to do that depends on the overall structure of the project.
One way is to export it from its module and import/require it in login.js, or wherever else you need it.
socket.js
const socket = io();
let socketid;
socket.on('connect', () => socketid = socket.io.engine.id);
module.exports.socket = socket; // export the socket
// could also look that way:
// module.exports = socket;
// the require would then look like that:
// const socket = require('./socket.js');
login.js
const socket = require('./socket.js').socket;
socket.on('check email', () => {
  setTimeout(() => refreshPage(), 7500);
  popupbox({
    titletext: 'Verify email',
    messagetext: 'A verification link has been sent, please verify your email address within 24 hours.',
    okaytext: 'Okay',
    okayfunction: () => refreshPage(),
    customcolor: "#007C5B"
  });
});
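Since these files already go through babel-loader, the same idea can also be written with ES module syntax; this is just a sketch of the equivalent export/import, assuming login.js can resolve './socket.js' relative to itself:

// socket.js
const socket = io();
export default socket;

// login.js
import socket from './socket.js';

socket.on('check email', () => {
  // ... same handler as above
});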

You can also use global to declare your variables globally, like
global.io = io();
and then use the io variable in other files like
io.on('connect', ...)
It might help someone in the future.
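Since this particular code runs in the browser, the equivalent there is usually to hang the socket off window; a small sketch (the appSocket property name is just an example):

// scripts.js
window.appSocket = io();

// login.js (inside window.onload)
window.appSocket.on('check email', () => { /* ... */ });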

Related

Confused by module exports/requires

I am working on a piece where I am basically refactoring existing code. I have two files: index and server. My index is:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
// Now that the nonsense is over... load up the server entry point
require('./server');
My server is like:
import path from 'path';
import Loadable from 'react-loadable';
...
const main = async () => {
  // tell React Loadable to load all required assets
  await Loadable.preloadAll();
  process.on('unhandledRejection', err => {
    console.error(err);
    process.exit(1);
  });
  const server = fastify(config.fastify);
  server.register(require('./routes'), config);
  server.register(fastifyStatic, {
    root: path.resolve(__dirname, '../build')
  });
  if (require.main === module) {
    // called directly i.e. $ node index.js
    const address = await server.listen(config.address);
    // start listening - ROCK AND ROLL!
    server.log.info(`Server running at: ${address}`);
  } else {
    // required as a module => executed on aws lambda
    module.exports = server;
  }
};
main();
The server should run a REST service when executed locally, and export the server instance for the inject method. This way a proxy can attach to it when running under AWS Lambda.
I used the same setup before, multiple times, only with the two pieces in the same file, i.e. the server was inside the index. A single-file version works fine: the require.main comparison tells the program how it is being run, module.exports exposes the server instance (with the needed inject method) when running under Lambda, and a direct invocation runs the REST service.
However, since I need to import react-loadable this time, I split the files in two pieces.
Now, I probably need to figure out, inside index.js, how the code is being run (since server.js is no longer invoked directly) and then pass that on to the server. That is probably not too difficult.
My main problem is that when I do console.log(require('./server')) from index, it prints {}. The server instance is created successfully, and inject() is present, yet I am somehow unable to export it from server.js and import it into index.js correctly, and therefore to [re-]export it from index.js for the proxy to attach.
Obviously, I am doing something incorrectly. To me it seems that my require does not have the server instance because the instance gets created after the require has finished. Since my main() is async, that is plausible.
What is the right way to accomplish this?
First, notice that your main() function in ./server.js is async and you only assign module.exports from within that asynchronous function. Second, you're calling require('./server.js') from ./index.js without waiting for that asynchronous work to finish. require() returns synchronously, and at that point module.exports is still the default empty object (that's the {} you're getting); reassigning module.exports later, inside the async function, does not update the object that index.js already received. So that's why you're seeing what you're seeing.
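A minimal illustration of the timing issue (a hypothetical two-file example, not your code):

// late.js
const main = async () => {
  await Promise.resolve();          // anything asynchronous defers the rest of the function
  module.exports = { ready: true }; // runs only after require() has already returned
};
main();

// index.js
console.log(require('./late'));     // prints {}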
Which solutions will or will not fit your use case will depend on the details of how your AWS/direct invocation is supposed to work. Here's a suggestion:
const md5File = require('md5-file');
const fs = require('fs');
const path = require('path');
const ignoreStyles = require('ignore-styles');
const register = ignoreStyles.default;
const extensions = ['.gif', '.jpeg', '.jpg', '.png', '.svg'];
// ...
require('@babel/polyfill');
require('@babel/register')({
  ignore: [/\/(build|node_modules)\//],
  presets: ['@babel/preset-env', '@babel/preset-react'],
  plugins: [
    '@babel/plugin-syntax-dynamic-import',
    '@babel/plugin-proposal-class-properties',
    'dynamic-import-node',
    'react-loadable/babel'
  ]
});
process.env.serverRunLocally = require.main === module;
// Now that the nonsense is over... load up the server entry point
require('./server').then(listen => listen());
import path from 'path';
import Loadable from 'react-loadable';
// ...
const main = async () => {
  // tell React Loadable to load all required assets
  await Loadable.preloadAll();
  process.on('unhandledRejection', err => {
    console.error(err);
    process.exit(1);
  });
  const server = fastify(config.fastify);
  server.register(require('./routes'), config);
  server.register(fastifyStatic, {
    root: path.resolve(__dirname, '../build')
  });
  // env vars are always strings, so compare against 'true'
  if (process.env.serverRunLocally === 'true') {
    // called directly i.e. $ node index.js
    return async () => {
      const address = await server.listen(config.address);
      // start listening - ROCK AND ROLL!
      server.log.info(`Server running at: ${address}`);
    };
  } else {
    // required as a module => executed on aws lambda
    return server;
  }
};
module.exports = main();
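Note that module.exports is now a Promise (for either the listen function or the server instance), so whatever requires ./server has to await it. A rough sketch of the Lambda side; the handler and proxy wiring here are assumptions, not part of the original code:

// lambda entry point (sketch)
const serverPromise = require('./server');

exports.handler = async (event, context) => {
  const server = await serverPromise; // the fastify instance when not running locally
  // hand the instance to your aws-lambda proxy / server.inject() here
};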

How to use Jest global Setup and Teardown in a nodeJS project?

I added tests to my Node.js project using Jest, but each test suite has a beforeAll method that creates a new test server and connects to a Mongo database, and an afterAll method that closes both the test server and the database. I would like to perform these tasks globally for all the test suites, not one at a time. Below is a sample of my code.
app.js
const express = require("express");
const app = express();
const { connectToDb } = require("./startup/db");
require("./startup/routes")(app);
connectToDb();
...
const port = process.env.PORT || 3000;
if (process.env.NODE_ENV !== "test") {
  app.listen(port, () => winston.info(`Listening on port ${port}...`));
}
module.exports = app;
auth.test.js
const request = require("supertest");
const http = require("http");
const { disconnectDb } = require("../../startup/db");
describe("auth middleware", () => {
let server;
beforeAll((done) => {
const app = require("../../app");
server = http.createServer(app);
server.listen(done);
});
afterAll((done) => {
server.close(done);
disconnectDb();
});
it("should return 401 if no token is provided", async () => {
const res = request(server)
.post("/api/genres")
.set("x-auth-token", "")
.send({ name: "genre1" });
expect(res.status).toBe(401);
});
...
jest.config.js
module.exports = {
  testEnvironment: "node",
};
Try with this jest.config.js:
module.exports = {
  testEnvironment: "node",
  globalSetup: '<rootDir>/src/testSetup.ts'
};
And in testSetup.ts you can do:
// testSetup.ts
const http = require("http");

module.exports = async () => {
  const app = require("../../app");
  const server = http.createServer(app);
  await new Promise((resolve) => server.listen(resolve));
  // keep a reference so a globalTeardown file can close the server later
  global.__SERVER__ = server;
};
Use this config: setupFiles: ['./tests/setup.js']
Your setup file should look like this:
// setup.js
(async () => {
  const app = require('../app.js')
  global.app = app
})()
Then you will be able to use app globally in every test suite.
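For example, a test file could then use it directly; a sketch based on the auth test from the question, assuming supertest is installed:

const request = require("supertest");

it("should return 401 if no token is provided", async () => {
  const res = await request(global.app)
    .post("/api/genres")
    .set("x-auth-token", "")
    .send({ name: "genre1" });
  expect(res.status).toBe(401);
});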
I had the same problem: I wanted to make one database connection before all test files and close the connection after all tests in all files.
But I did not achieve exactly that, and maybe we don't need to.
What I did find is a way to define functions like beforeAll() and afterAll() once, in a single file, and have them run for every test file.
To do that, all we need is to create a setupFile.ts and add the path to this file in jest.config or in package.json: "setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"],
Here is an example of my jest configuration.
"jest": {
"preset": "ts-jest",
"testEnvironment": "node",
"setupFilesAfterEnv": ["<rootDir>/__tests__/settings/setupTests.ts"],
"rootDir": "src",
"verbose": true,
"clearMocks": true,
"testMatch": [
"**/**/*.test.ts"
]
},
Here is an example of setupFile.ts
import usersCollection from "../../database/user-schema";
import mongoose from "mongoose";
beforeAll(async () => {
  try {
    await mongoose.connect(process.env.MONGODB_URL!);
    await usersCollection.deleteMany({});
  } catch (error) {
    console.log(error);
  }
});

afterAll(async () => {
  try {
    await mongoose.disconnect();
  } catch (error) {
    console.log(error);
  }
});
This means we establish a connection to the database for every test file, before all tests in that file, and close the connection after all tests in that file.
What I realized for myself:
In real life we have many test files, and not every file needs a connection to a database.
It's perfectly fine to open a database connection in the files that need one and close it after all tests in that file, for example in integration tests that hit API endpoints.
For the many unit tests that should not touch a real database, we can instead mock (simulate) the database. It's another very interesting topic 😊 (a tiny sketch is below).
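As a tiny sketch of that idea (the services/users module and its findUserByEmail function are made up for illustration; only the user-schema path comes from the code above):

// users.unit.test.js
// replace the real mongoose model with a hand-written mock
jest.mock("../../database/user-schema", () => ({
  findOne: jest.fn(),
}));

const usersCollection = require("../../database/user-schema");
const { findUserByEmail } = require("../services/users"); // hypothetical module under test

it("returns null when no user matches", async () => {
  usersCollection.findOne.mockResolvedValue(null);
  await expect(findUserByEmail("nobody@example.com")).resolves.toBeNull();
});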
If I say something wrong you can correct me
P.S. I also want to mention what is written in the Mongoose documentation:
"Do not use globalSetup to call mongoose.connect() or mongoose.createConnection(). Jest runs globalSetup in a separate environment, so you cannot use any connections you create in globalSetup in your tests."
https://mongoosejs.com/docs/jest.html

How would I get webpack or another JS bundler to bundle remotely hosted files?

I have a distributed system and all JS files are exposed through HTTP. So a normal module would look like this:
http://example.com/path/to/main.js
import * as core from 'http://local.example.com/path/to/core.js';
import * as redux from 'http://cdn.example.com/redux.js#version';
// code
export default {
  ...
}
So each import will be using either a local resource to the system or possibly remotely available resources using CDN.
Though when I run webpack, I get this error:
It is trying to parse a locally generated file with this content:
import * as main from 'http://example.com/path/to/main.js';
ERROR in ./src/index.js Module not found: Error: Can't resolve
'http://example.com/path/to/main.js' in '/home/.../index.js'
Is it possible to tell webpack to fetch the URLs and include them inside the bundle? Packaging CDN URLs isn't a big deal for now; I'd be happy if I could simply ignore the ones with a certain URL.
Though being able to bundle all the remote http:// files would be a good start.
Also, any remote resource linking to other resources should have its remotely linked resources loaded recursively too.
Here's my current webpack config (though nothing much to see here):
const path = require('path');
module.exports = {
  mode: 'development',
  entry: './src/index.js',
  output: {
    filename: 'main.js',
    path: path.resolve(__dirname, 'dist'),
  },
  module: {
    rules: [
    ]
  },
};
Edit: after reading a bit, I started writing a resolver but now I'm stuck again:
const path = require('path');
const fetch = require('node-fetch');
const url = require('url')
const fs = require('promise-fs');
const sha1 = require('sha1')
class CustomResolver {
  async download_save(request, resolveContext) {
    console.log(request, resolveContext)
    var target = url.parse(request.request)
    var response = await fetch(request.request)
    var content = await response.text()
    try {
      await fs.stat('_remote')
    } catch (exc) {
      await fs.mkdir('_remote')
    }
    var filename = `${sha1(request.request)}.js`
    var file_path = `_remote/${filename}`
    await fs.writeFile(file_path, content)
    var abs_path = path.resolve(file_path)
    var url_path = `${target.protocol}://${target.hostname}/`
    var obj = {
      path: abs_path,
      request: request.request,
      query: '',
    }
    console.log(`${request.request} saved to ${abs_path}`)
    return obj
  }

  apply(resolver) {
    var self = this
    const target = resolver.ensureHook("resolved")
    resolver.getHook("module")
      .tapAsync("FetchResolverPlugin", (request, resolveContext, callback) => {
        self.download_save(request, resolveContext)
          .then((obj) => resolver.doResolve(target, obj, resolveContext, callback))
          .catch((err) => {
            console.log(err)
            callback()
          })
      })
  }
}
It does currently fetch URLs starting with https://, but it seems to struggle to resolve URLs relative to an http resource. For example:
ERROR in _remote/88f978ae6c4a58e98a0a39996416d923ef9ca531.js
Module not found: Error: Can't resolve '/-/#pika/polyfill#v0.0.3/dist=es2017/polyfill.js' in '_remote/'
# _remote/88f978ae6c4a58e98a0a39996416d923ef9ca531.js 25:0-58
# _remote/f80b922b2dd42bdfaaba4e9f4fc3c84b9cc04fca.js
# ./src/index.js
It doesn't look like it tries to resolve relative paths against the already resolved files. Is there a way to tell the resolver to try to resolve everything?
The main point is: if you have CDN files, you don't need a bundler for them.
They are already minified and ready to use. Just include the files in the root of your project and call the libraries globally.
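If you still want to write import statements for those libraries while loading them from a CDN at runtime, webpack's externals option can map the module name to a global; a rough sketch (the redux / Redux names are just an example):

// webpack.config.js (sketch)
module.exports = {
  // ...
  externals: {
    // "import redux from 'redux'" will resolve to the global Redux
    // object provided by a <script src="https://cdn.example.com/redux.js"> tag
    redux: 'Redux'
  }
};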

Identifier 'browserSync' has already been declared

I keep getting the error "Identifier 'browserSync' has already been declared" but I can't see where the problem is. Here is my code:
// Watch files
function watchFiles() {
  gulp.watch("*.js", gulp.series(scriptsLint, scripts, browserSyncReload));
  gulp.watch(["processHTML"], gulp.series(browserSyncReload));
}

// Task Live Reload
function browserSync(done) {
  browserSync.init({
    server: './dist',
    port: 8080,
    ui: {
      port: 8081
    }
  })
  done()
};

// BrowserSync Reload
function browserSyncReload(done) {
  browserSync.reload();
  done();
}

// define complex tasks
const js = gulp.series(scriptsLint, scripts);
const build = gulp.parallel(processHTML, js);
const watch = gulp.parallel(watchFiles, browserSync);
You need to rename your browserSync function to something else, because that identifier is already used for the BrowserSync library object you call browserSync.init() on, so the two declarations clash.
Something like this:
// Watch files
function watchFiles() {
  gulp.watch("*.js", gulp.series(scriptsLint, scripts, reload));
  gulp.watch(["processHTML"], gulp.series(reload));
}

// Task Live Reload
function localServer(done) {
  browserSync.init({
    server: './dist',
    port: 8080,
    ui: {
      port: 8081
    }
  })
  done()
};

// BrowserSync Reload
function reload(done) {
  browserSync.reload();
  done();
}

// define complex tasks
const js = gulp.series(scriptsLint, scripts);
const build = gulp.parallel(processHTML, js);
const watch = gulp.parallel(watchFiles, localServer);
Your browserSync() function is named the same as another variable in its scope, the browserSync object whose .init() it calls on the next line, and needs to be renamed.
// Watch files
function watchFiles() {
  gulp.watch("*.js", gulp.series(scriptsLint, scripts, browserSyncReload));
  gulp.watch(["processHTML"], gulp.series(browserSyncReload));
}

// Task Live Reload
function browserSyncFunc(done) {
  browserSync.init({
    server: './dist',
    port: 8080,
    ui: {
      port: 8081
    }
  })
  done()
};

// BrowserSync Reload
function browserSyncReload(done) {
  browserSync.reload();
  done();
}

// define complex tasks
const js = gulp.series(scriptsLint, scripts);
const build = gulp.parallel(processHTML, js);
const watch = gulp.parallel(watchFiles, browserSyncFunc /* I'm guessing you meant to use the browserSync function here, not the object */);

dotenv not working with serverless/webpack

EDIT: If I log out dotenv.config() I get an error of: Error: ENOENT: no such file or directory, open '/Users/myPathToApplication/.webpack/test/.env'
I am bundling my serverless handler in order to use es6/es7 code. I have some env variables that I am trying to use as well. The problem is it seems that dotenv is not working when I bundle the handler.
For example one of the utils I am using is connecting mongoose to my application. In here I store the DB_URI as an env variable. import envdotjs from 'envdotjs';
import mongoose from 'mongoose';
mongoose.Promise = global.Promise;
require('dotenv').config();
let isConnected;

const connectToDatabase = () => {
  if (isConnected) {
    console.log('=> using existing database connection');
    return Promise.resolve();
  }
  console.log('=> using new database connection');
  return mongoose.connect(process.env.DB_URI).then(db => {
    isConnected = db.connections[0].readyState;
  });
};

module.exports = {
  connectToDatabase
};
However the DB_URI is undefined and the code breaks.
Here is my webpack:
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');
module.exports = {
  entry: slsw.lib.entries,
  target: 'node',
  devtool: 'source-map',
  externals: [nodeExternals()],
  mode: slsw.lib.webpack.isLocal ? 'development' : 'production',
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: 'babel-loader',
        include: __dirname,
        exclude: /node_modules/
      }
    ]
  }
};
I am running this in order to use ES6/7 in the serverless handler, which works just fine, but the env variables are breaking. I also tried a module called envdotjs and got the same result (the env variables are undefined), so I don't think this is a problem specific to dotenv.
I found a package, dotenv-webpack, also recommended by @apokryfos. Just require it with const Dotenv = require('dotenv-webpack') and include it in the webpack.config.js.
module.exports = {
  ...
  plugins: [new Dotenv()]
}
Just keep your .env in the root next to your webpack.config.js, and you can reference your process.env values anywhere you need to with no other configuration.
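For reference, merged into the config from the question it could look like this (a sketch; only the Dotenv require and the plugins entry are new):

const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');
const Dotenv = require('dotenv-webpack');

module.exports = {
  entry: slsw.lib.entries,
  target: 'node',
  devtool: 'source-map',
  externals: [nodeExternals()],
  mode: slsw.lib.webpack.isLocal ? 'development' : 'production',
  module: {
    rules: [
      {
        test: /\.js$/,
        loader: 'babel-loader',
        include: __dirname,
        exclude: /node_modules/
      }
    ]
  },
  // dotenv-webpack reads the .env file at build time and replaces
  // process.env.* references in the bundle with the values it finds
  plugins: [new Dotenv()]
};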
