Mocking a yaml file in jest - javascript

I have a yaml file, that has some config information and I use it in a module that I want to test. But when I test it I want to mock it so it only has simplified and static data, so it's easy to test and if the config is changed I don't have to edit the tests. Here is what I tried so far:
// config/index.js
const yaml = require('js-yaml');
const fs = require('fs');
const path = require('path');

const filePath = path.join(__dirname, 'stuff.yaml');

module.exports = {
  getStuff() {
    return yaml.safeLoad(fs.readFileSync(filePath, 'utf8'));
  },
  setStuff(stuff) {
    fs.writeFile(filePath, yaml.safeDump(stuff), err => {
      if (err) console.log(err); // the original `err => console.log` never actually logged
    });
  }
};
// test/config.test.js
const config = require('../config');

test('getStuff', () => {
  jest.mock('../config/stuff.yaml');
  expect(config.getStuff()).toEqual({/*..*/});
});
My file structure being:
project-root/
├── config/
│   ├── __mocks__/
│   │   └── stuff.yaml (the mock file)
│   ├── stuff.yaml (the real file)
│   └── index.js
└── test/
    └── config.test.js
But the test still returns the data from the real file. To summarize: I want to mock a text file on the file system, so that any module reading it gets the mock instead of the real one.
Note: I don't really care whether the mock version is on disk or just a string in memory. Having it in memory would even be beneficial, since the tests would run faster.

You can probably update your Jest configuration and leverage moduleNameMapper to handle this.
{
  "moduleNameMapper": {
    "config/stuff.yaml": "<rootDir>/config/__mocks__/stuff.yaml"
  }
}

You could also try jest.setMock - https://facebook.github.io/jest/docs/en/jest-object.html#jestsetmockmodulename-moduleexports - where the first argument is the module being mocked and the second is what requires of it should return:
jest.setMock('config/stuff.yaml', require('config/__mocks__/stuff.yaml'));
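Since the question notes that an in-memory mock would be fine, another option is to stub fs itself so readFileSync returns a YAML string and no mock file is needed at all. A minimal sketch using Jest's jest.spyOn (the YAML content here is just an example):

// test/config.test.js -- sketch: stub fs.readFileSync with in-memory YAML
const fs = require('fs');
const config = require('../config');

test('getStuff', () => {
  // Any readFileSync call now returns this simplified document
  const spy = jest.spyOn(fs, 'readFileSync').mockReturnValue('foo: bar\n');
  expect(config.getStuff()).toEqual({ foo: 'bar' });
  spy.mockRestore(); // restore the real fs for other tests
});

This keeps the mock entirely in memory, which also matches the note about faster tests.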

Related

How to make ChaiJS 'expect' function available in all files inside NodeJS 'test' folder?

I'm using Truffle which has Mocha & Chai included by default.
File tree:
├── test
│   ├── ManagedWallet.js
│   ├── AnotherTest.js
│   └── AndAnotherTest.js
└── truffle.js
This is one of my test files:
var expect = require('chai').expect;
var managedWallet = artifacts.require("./ManagedWallet.sol");

contract('ManagedWallet', function(accounts) {
  it("belongs to customer's account address", async function() {
    var contract = await managedWallet.deployed();
    var customer_account = accounts[1];
    var owner = await contract.owner.call();
    expect(owner).to.equal(customer_account);
  });
});
Since I have a lot of test files inside the test folder, I currently have to put that var expect = require('chai').expect; line in every one of them.
Is there a way to make the expect function available globally, without adding that line to each test file?
I tried putting var expect = require('chai').expect; line inside truffle.js file (above module.exports) but it didn't work and I got ReferenceError: expect is not defined error when I ran the tests.
Example of my truffle.js file (located in the root folder of the project):
module.exports = {
  networks: {
    development: {
      host: "127.0.0.1",
      port: 7545,
      network_id: "*" // match any network
    }
  }
};
Here's my .mocharc.json file that enables this:
{
  "spec": "src/**/*.test.ts",
  "require": [
    "esm",
    "ts-node/register",
    "chai/register-assert",
    "chai/register-expect",
    "chai/register-should"
  ],
  "file": "./test-setup.cjs",
  "recursive": true
}
I got the answer from GitHub (thanks @keithamus).
require('chai/register-expect');
//... rest of code
expect(...)
My truffle.js looks like:
require('chai/register-expect');
module.exports = {
  // the truffle configs
};
And using expect inside my test files no longer throws any errors. 🚀
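For reference, chai/register-expect just assigns Chai's expect onto the global object, so you can also do the registration yourself in a setup file that runs before the tests. A sketch (the file name test/setup.js is just an example):

// test/setup.js -- hypothetical manual equivalent of chai/register-expect
global.expect = require('chai').expect;

Load it via Mocha's --file or --require flags, or the "file"/"require" entries of a .mocharc.json like the one shown above.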

Node JS Cannot Get /

I am following this Node JS tutorial, but I have encountered a problem where I get the response Cannot GET / instead of Hello. Here is the structure for my code:
myapp/
├── app
│   └── routes
│       ├── index.js
│       └── note_routes.js
├── conf
│   ├── httpd-app.conf
│   └── httpd-prefix.conf
├── htdocs
├── node_modules
│   └── ...
├── package.json
├── package-lock.json
└── server.js
server.js looks like this:
const express = require('express');
const MongoClient = require('mongodb').MongoClient;
const bodyParser = require('body-parser');

const app = express();
const port = 3000;

require('./app/routes')(app, {});

app.listen(port, () => {
  console.log('We are live on ' + port);
});
index.js is this:
const noteRoutes = require('./note_routes');

module.exports = function(app, db) {
  noteRoutes(app, db);
  // Other route groups could go here, in the future
};
And note_routes.js looks like this:
module.exports = function(app, db) {
  app.post('/notes', (req, res) => {
    // You'll create your note here.
    res.send('Hello');
  });
};
As can be seen, this is a very simple setup. According to the above-linked tutorial, this should be enough and should respond with Hello. However, this is not the case.
I am almost 100% sure the port is correct, because I get a response rather than a 503.
What's wrong here, and how can I get this to work as intended?
Your app.post() in note_routes should be app.get(). If you also plan on handling POST requests to the same path, you can register both verbs, for example with app.route():
app.route('/notes')
  .get((req, res) => res.send('Hello'))
  .post((req, res) => { /* create the note */ });
A couple of things to check here.
I would imagine you are starting your server with node server.js.
Note that require('./app/routes') in server.js does resolve to index.js, so the routes are wired up.
The real issue is that note_routes should contain .get() NOT .post(): a browser address-bar request is a GET, and no GET route is defined.
Also, in your browser you should be visiting http://localhost:3000/notes NOT plain old
http://localhost:3000, since no route is registered for /.
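A minimal corrected note_routes.js along those lines (a sketch keeping the tutorial's structure):

// app/routes/note_routes.js -- respond to GET so a browser visit matches
module.exports = function(app, db) {
  app.get('/notes', (req, res) => {
    res.send('Hello');
  });
};

With this in place, visiting http://localhost:3000/notes returns Hello.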

Unable to require() from inside an already "required" module

I have this project structure:
myApp
├── gulpfile.js
├── package.json
└── source
    └── scripts
        ├── modules
        │   └── utils.js
        ├── background.js
        └── data.json
My browserify task:
gulp.task('browserify', function () {
  return gulp.src(['./source/scripts/**/*.js'])
    .pipe($.browserify({
      debug: true, // for source maps
      standalone: pkg['export-symbol']
    }))
    .on('error', function (err) {
      console.log(err.message);
      this.emit('end');
    })
    .pipe(gulp.dest('./build/scripts/'));
});
My sample utils.js:
const data = require('../data.json');

const utils = (function () {
  const output = function () {
    console.log(data);
  };

  return {
    output: output,
  };
}());

module.exports = utils;
If I try to build it with the current directory structure, I get this error:
module "../data.json" not found from "/dev/myApp/source/scripts/fake_4d8cf8a4.js"
I can only build it, if I put data.json inside the modules directory AND inside the scripts directory, ie. it only works if I duplicate the file:
myApp
├── gulpfile.js
├── package.json
└── source
    └── scripts
        ├── modules
        │   ├── utils.js
        │   └── data.json
        ├── background.js
        └── data.json
Obviously this is not okay... what am I doing wrong?
Thank you
I'm inferring from your use of gulp.src to pass files to $.browserify that you are using a Gulp plugin, probably gulp-browserify. It is generally not recommended to invoke Browserify through a plugin; the recommended way is to call Browserify directly. Indeed, Gulp's blacklist of plugins states:
"gulp-browserify": "use the browserify module directly",
I've replicated your directory structure and put some reasonable values for the files for which you did not provide contents (data.json, background.js) and indeed, I get the same error you get when I try to run the Gulp code you show. However, if I switch to calling Browserify directly, I do not get any error. Here is the code I have:
const gulp = require("gulp");
const browserify = require("browserify");
const source = require('vinyl-source-stream');
gulp.task('browserify', function () {
return browserify({
entries: ["./source/scripts/background.js",
"./source/scripts/modules/utils.js"],
debug: true,//for source maps
standalone: "foo",
})
.bundle()
.pipe(source('bundle.js')) // This sets the name of the output file.
.pipe(gulp.dest('./build/scripts/'));
});
You use gulp.src(['./source/scripts/**/*.js']) in your code, which means Browserify takes all your .js files as entries into the bundle.
So I've put two entries in the code above, which manually replicates the pattern the plugin produced. However, while Browserify does not produce an error with this setup, I suspect you don't actually want multiple entries. Typically you pass a single entry point to Browserify and let it trace the require calls to figure out what it needs to pull in.
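A single-entry version might look like this (a sketch; it assumes background.js requires ./modules/utils.js, and that pkg is your parsed package.json as in the original task):

const gulp = require('gulp');
const browserify = require('browserify');
const source = require('vinyl-source-stream');
const pkg = require('./package.json');

gulp.task('browserify', function () {
  // One entry point; Browserify follows require() calls from here,
  // so utils.js and data.json are pulled in automatically.
  return browserify({
    entries: ['./source/scripts/background.js'],
    debug: true, // for source maps
    standalone: pkg['export-symbol'],
  })
    .bundle()
    .pipe(source('bundle.js'))
    .pipe(gulp.dest('./build/scripts/'));
});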

How to copy multiple files and keep the folder structure with Gulp

I am trying to copy files from one folder to another folder using Gulp:
gulp.task('move-css', function () {
  return gulp.src([
    './source/css/one.css',
    './source/other/css/two.css'
  ]).pipe(gulp.dest('./public/assets/css/'));
});
The above code is copying one.css & two.css to the public/assets/css folder.
And if I use gulp.src('./source/css/*.css'), it copies all CSS files to the public/assets/css folder, which is not what I want.
How do I select multiple files and keep the folder structure?
To achieve this, specify the base option. From the gulp.src documentation:
base - Specify the folder relative to the cwd. Default is where the glob begins. This is used to determine the file names when saving in .dest().
In your case it would be:
gulp.task('move-css', function () {
  return gulp.src([
    './source/css/one.css',
    './source/other/css/two.css'
  ], { base: './source/' })
    .pipe(gulp.dest('./public/assets/'));
});
Folder structure:
.
├── gulpfile.js
├── source
│   ├── css
│   └── other
│       └── css
└── public
    └── assets
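With base: './source/', the portion of each path below source/ is what gets preserved under the destination, so the two files land like this (paths worked out from the example above):

source/css/one.css        →  public/assets/css/one.css
source/other/css/two.css  →  public/assets/other/css/two.css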
I use gulp-flatten with this configuration:
var gulp = require('gulp'),
    gulpFlatten = require('gulp-flatten');

var routeSources = {
  dist: './public/',
  app: './app/',
  html_views: {
    path: 'app/views/**/*.*',
    dist: 'public/views/'
  }
};

gulp.task('copy-html-views', task_Copy_html_views);

function task_Copy_html_views() {
  return gulp.src([routeSources.html_views.path])
    .pipe(gulpFlatten({ includeParents: 1 }))
    .pipe(gulp.dest(routeSources.html_views.dist));
}
See the gulp-flatten documentation for details on the includeParents option.
gulp.task('move-css', function () {
  return gulp
    .src(['source/**'], { base: './' })
    .pipe(gulp.dest('./public/assets/css/'));
});
Your own code didn't include the entire directory tree of source ('source/**') or the base ({ base: './' }) when calling gulp.src, which is why the folder structure was lost.
The other parts were fine.
gulp.task('move-css', function () {
  return gulp.src([
    './source/css/one.css',
    './source/other/css/two.css'
  ]).pipe(gulp.dest('./public/assets/css/'));
});

Intern path error in browser client

I have a conventional recommended Intern directory structure:
MyProject
├── node_modules
│   └── intern-geezer
│       └── client.html
├── src
│   └── myFunction.js
└── tests
    ├── intern.js
    └── unit
        └── ps.js
with a very simple config:
useLoader: {
  'host-node': 'dojo/dojo',
  'host-browser': 'node_modules/dojo/dojo.js'
},
loader: {
  packages: []
},
suites: [ 'tests/unit/ps' ]
and tests:
define(function (require) {
  var tdd = require('intern!tdd');
  var assert = require('intern/chai!assert');
  // Global function to test, not an AMD module
  var parseF = require('src/myFunction.js');
  var he = require('tests/he');

  tdd.suite('My tests', function() {
    // etc
  });
});
but when I open the browser client, the loader looks for the test suite inside the intern-geezer directory.
I am not setting a baseUrl in the config (or in the browser URL). I didn't have this trouble going through the regular (non-geezer) intern tutorial. Since the baseUrl defaults to two directories up from client.html I don't see what I'm doing wrong. Thanks for any help. (Yes, I will need geezer for ancient IE. No, I do not want to rewrite the function I'm testing as an AMD module.)
The Intern loader doesn't know how to get to your tests because they haven't been registered as a package. When using the recommended directory structure, you'll want to also set up the recommended loader configuration so that Intern knows where to find your code and tests.
loader: {
  packages: [
    { name: 'app', location: 'src/' },
    { name: 'tests', location: 'tests/' }
  ]
}
Then, update your tests to correctly find the code you need to test.
// Global function to test, not an AMD module
var parseF = require('app/myFunction.js');
Now, Intern should be able to correctly find the code and tests.
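Putting it together, the relevant parts of tests/intern.js would look something like this (a sketch that simply merges the original config with the package registrations above):

// tests/intern.js -- original config plus package registrations
define({
  useLoader: {
    'host-node': 'dojo/dojo',
    'host-browser': 'node_modules/dojo/dojo.js'
  },
  loader: {
    packages: [
      { name: 'app', location: 'src/' },
      { name: 'tests', location: 'tests/' }
    ]
  },
  suites: [ 'tests/unit/ps' ]
});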
