Jest: mock pdfmake and fs.createWriteStream (JavaScript)

I am very new to Jest and unit testing in general. I am trying to write a test for a piece of code that uses pdfmake and fs.createWriteStream to create and write to a PDF file.
As I read tutorials (and get a bit confused, since this is all new to me) I have tried to put a couple of things together: an fs.js module to try to get the write stream mocked. About pdfmake I am not so sure; could I perhaps skip that, assume I have a target file, and therefore mock just the stream creation?
describe('createWriteStream', () => {
  const MOCK_FILE_INFO = {
    '/path/to/file1.pdf': 'console.log("file1 pdf contents");'
  };
});
Something like the above?
And then just have fs.createWriteStream return a successful result?
The original code looks something like:
let pdfDoc = new pdfPrinter({
  Roboto: {
    normal: new Buffer(require('pdfmake/build/vfs_fonts.js').pdfMake.vfs['Roboto-Regular.ttf'], 'base64')
  }
}).createPdfKitDocument(docDefinition);
pdfDoc.pipe(fs.createWriteStream(path));
pdfDoc.end();
I understand this is not a lot of info, but someone well versed in unit testing might know how best to shape this. Thanks
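You can usually get away with mocking only fs and letting pdfmake run for real. A minimal sketch, assuming the snippet above is wrapped in a hypothetical generatePdf(path) function exported from generatePdf.js:
jest.mock('fs');

const fs = require('fs');
const { PassThrough } = require('stream');
const generatePdf = require('../generatePdf'); // hypothetical module under test

it('creates a write stream for the target path', () => {
  // a real duplex stream, so pdfDoc.pipe() still behaves like a stream
  const fakeStream = new PassThrough();
  fs.createWriteStream.mockReturnValue(fakeStream);

  generatePdf('/path/to/file1.pdf');

  expect(fs.createWriteStream).toHaveBeenCalledWith('/path/to/file1.pdf');
});
If pdfkit happens to read font data through fs internally, automocking the whole module will break it; in that case, mock only the one function with jest.spyOn(fs, 'createWriteStream').mockReturnValue(fakeStream) instead.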

Related

How to run a pickle file in node js

Working on an AI/ML project, I need a way to run the pickle file inside Node.js so I can use it to run the algorithm on the submitted data.
Try the node-pickle library to convert the pickle file to a JSON object. Here is the documentation for node-pickle.
const nodePickle = require('node-pickle');

// Convert pickled data to a JSON object
nodePickle.load(pickledData)
  .then(data => {
    // data is a JSON object here
  });
Then you can use TensorFlow.js to run that JSON object as a model.
TensorFlow.js Documentation
The only one I could find from a quick Google search is node-jpickle. According to its docs, it seems able to handle most of the situations pickle can, even some of the more advanced ones such as classes:
var jpickle = require('jpickle');

function MyClass() {}

jpickle.emulated['__main__.MyClass'] = MyClass;
var unpickled = jpickle.loads(pickled);
If you're simply trying to pickle/unpickle, you can do it like you would in Python:
var unpickled = jpickle.loads(pickled);
However, the docs don't mention a regular load function.
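If the pickled data lives in a file, you would read it in as a string first; a sketch (the 'binary' encoding and the file name are assumptions to verify against the jpickle README):
var fs = require('fs');
var jpickle = require('jpickle');

// read the raw pickle contents as a binary string, then unpickle
var pickled = fs.readFileSync('model.pickle', 'binary');
var unpickled = jpickle.loads(pickled);
console.log(unpickled);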

Postman: how to set up a library of (semi-)complicated reusable scripts for a collection

Update
I've completely rewritten this question based on subsequent investigation. Hopefully this will generate some answers.
I'm new to Postman, and trying to figure out how to most efficiently build a collection of tests for a REST application. There are a bunch of utility functions that I'd like to have accessible in each of my test scripts, but cutting and pasting them into each test script seems like a horrible solution.
In looking at the various "scopes" that Postman allows you to squirrel away data (e.g. globals, environment, collection), it seems that all of these are merely string/number stores. In other words, it properly stores them if you can/do stringify the results. But it doesn't actually allow you to store proper objects or functions. This makes sense, since each script seems to be run as a separate execution, so the idea of sharing pointers to things between different scripts doesn't make sense.
It seems like the accepted way to share utility functions is to toString() the function in the defining script (e.g. the Collection Pre-Req script), and then eval() that stringified version in the test script. For instance:
Collection Pre-Req Script
const utilFunc = () => { console.log("I am a utility function"); };
pm.environment.set("utilFunc",utilFunc.toString() );
Test Script
const utilFunc = eval(pm.environment.get("utilFunc"));
utilFunc();
The test script will successfully print to console "I am a utility function".
I've seen people do more complicated things: if they have more than one utility function, they put them into an object like utils.func1 and utils.func2, and have the overall function return the utils object, so the test script still only needs a single line at the top to import the whole thing.
The problem I'm running into is scoping: since the literal text of the function is executed in the Test Script, everything the utility function needs must be contained in that code, or must otherwise exist at eval() time in the Test Script. For instance, if I do:
Collection Pre-Req Script
const baseUtilFunc = (foo) => { console.log(foo); };
const utilFunc1 = (param) => { baseUtilFunc("One: " + param); };
const utilFunc2 = (param) => { baseUtilFunc("Two: " + param); };
pm.environment.set("utilFunc1",utilFunc1.toString() );
pm.environment.set("utilFunc2",utilFunc2.toString() );
Test Script
const utilFunc1 = eval(pm.environment.get("utilFunc1"));
const utilFunc2 = eval(pm.environment.get("utilFunc2"));
utilFunc1("Test");
This fails because baseUtilFunc does not exist in the Test Script. Obviously, in this example, it'd be easy to fix. But in a more complicated world, where the utility functions I use in my Test Scripts are themselves built on top of underlying helper functions, it gets more difficult.
So what is the right way to handle this issue? Do people just cram all the relevant logic into one big function that they then call toString() on? Do they embed an extract-from-environment-and-eval step in each util function's definition, so that it works in the Test Script context? Do they export each individual method?
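For illustration, the first option can work if the helpers live inside one factory function, so the stringified source carries its own dependencies; a sketch (makeUtils is a made-up name):
// Collection Pre-Request Script: helpers live inside the factory's closure
const makeUtils = () => {
  const baseUtilFunc = (foo) => { console.log(foo); };
  return {
    utilFunc1: (param) => baseUtilFunc("One: " + param),
    utilFunc2: (param) => baseUtilFunc("Two: " + param)
  };
};
pm.environment.set("makeUtils", makeUtils.toString());

// Test Script: baseUtilFunc comes along for free inside the closure
const utils = eval(pm.environment.get("makeUtils"))();
utils.utilFunc1("Test"); // logs "One: Test"
Because baseUtilFunc is defined inside makeUtils, its source travels along with the toString(), and the eval'd copy in the Test Script is self-contained.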
There are different ways to do it. What I did recently for one of my projects was to create a Git repository and then use the raw URL to fetch the code. I have a sample in the repo below:
https://github.com/tarunlalwani/postman-utils
To load the file, you will need to attach the code below at the collection level:
if (typeof pmutil == "undefined") {
  var url = "https://raw.githubusercontent.com/tarunlalwani/postman-utils/master/pmutils.js";
  if (pm.globals.has("pmutiljs")) {
    eval(pm.globals.get("pmutiljs"));
  } else {
    console.log("pmutil not found. loading from " + url);
    pm.sendRequest(url, function (err, res) {
      eval(res.text());
      pm.globals.set('pmutiljs', res.text());
    });
  }
}
Later, in your Tests or Pre-Request scripts, run the line below to load it:
eval(pm.globals.get("pmutiljs"))
You can then use the functions in your tests.

Jest testing: make a variable accessible to all test files

The thing is that I have some preparation to do before starting tests. I've got several test files, and each of them requires a RabbitMQ connection to be opened before the test file runs and closed after all its tests have finished. As things stand, I have to duplicate the open/close code in every test file. How can I deal with the connection in one file? I am using TypeScript, so several solutions such as jest-environment-node and jest-set are not helpful in this case. There is also a chance I can solve this problem using setup files:
"setupFiles": [
"./src/inttests/partial/mocks/setup.ts"
],
And write something like this in the setup file:
let channel: PubChannel;

beforeAll(() => {
  channel = new PubChannel('amqp://localhost', true);
  exports.ss = new SystemService(channel); // or using globals, but it doesn't actually work in either case
});

afterAll(async () => {
  await channel.closeConnection();
});
The thing is that I need the ss variable in my tests in order to call some functions. But I don't know how to make it available from the setup.ts file, as I can't use the export keyword because the value is assigned inside the beforeAll scope. Is there any chance of solving this ridiculous problem?
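A sketch of one workaround, with hypothetical import paths: move the file to setupFilesAfterEach (beforeAll/afterAll are not available in plain setupFiles) and stash the value on the global object instead of exporting it:
// jest config: "setupFilesAfterEach": ["./src/inttests/partial/mocks/setup.ts"]

// setup.ts
import { PubChannel } from './PubChannel';       // hypothetical path
import { SystemService } from './SystemService'; // hypothetical path

let channel: PubChannel;

beforeAll(() => {
  channel = new PubChannel('amqp://localhost', true);
  // stash the service on the global object so every test file can reach it
  (global as any).ss = new SystemService(channel);
});

afterAll(async () => {
  await channel.closeConnection();
});
In a test file, read (global as any).ss inside a test or hook rather than at module top level, since the beforeAll above has not yet run when the test module is first evaluated.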

NodeJS & Gulp Streams & Vinyl File Objects - Gulp wrapper for NPM package producing incorrect output

Goal
I am currently trying to write a Gulp wrapper for NPM Flat that can be easily used in Gulp tasks. I feel this would be useful to the Node community and would also accomplish my goal. The repository is here for everyone to view, contribute to, play with and pull request. I am attempting to make flattened (using dot notation) copies of multiple JSON files. I then want to copy them to the same folder and just change the file extension from *.json to *.flat.json.
My problem
The results I am getting back in my JSON files look like vinyl files or raw byte arrays. For example, I expect output like
"views.login.usernamepassword.login.text": "Login", but I am getting something like {"0":123,"1":13,"2":10,"3":9,"4":34,"5":100,"6":105 ...etc
My approach
I am brand new to developing Gulp tasks and node modules, so definitely keep your eyes out for fundamentally wrong things.
The repository will be the most up to date code, but I'll also try to keep the question up to date with it too.
Gulp-Task File
var gulp = require('gulp'),
    plugins = require('gulp-load-plugins')({ camelize: true });
var gulpFlat = require('gulp-flat');
var gulpRename = require('gulp-rename');
var flatten = require('flat');

gulp.task('language:file:flatten', function () {
  return gulp.src(gulp.files.lang_file_src)
    .pipe(gulpFlat())
    .pipe(gulpRename(function (path) {
      path.extname = '.flat.json';
    }))
    .pipe(gulp.dest("App/Languages"));
});
Node module's index.js (a.k.a. what I hope becomes gulp-flat)
var through = require('through2');
var gutil = require('gulp-util');
var flatten = require('flat');
var PluginError = gutil.PluginError;

// consts
const PLUGIN_NAME = 'gulp-flat';

// plugin level function (dealing with files)
function flattenGulp() {
  // creating a stream through which each file will pass
  var stream = through.obj(function (file, enc, cb) {
    if (file.isBuffer()) {
      //FIXME: I believe this is the problem line!!
      var flatJSON = new Buffer(JSON.stringify(
        flatten(file.contents)));
      file.contents = flatJSON;
    }
    if (file.isStream()) {
      this.emit('error', new PluginError(PLUGIN_NAME, 'Streams not supported! NYI'));
      return cb();
    }
    // make sure the file goes through the next gulp plugin
    this.push(file);
    // tell the stream engine that we are done with this file
    cb();
  });
  // returning the file stream
  return stream;
}

// exporting the plugin main function
module.exports = flattenGulp;
// exporting the plugin main function
module.exports = flattenGulp;
Resources
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/README.md
https://github.com/gulpjs/gulp/blob/master/docs/writing-a-plugin/using-buffers.md
https://github.com/substack/stream-handbook
You are right about where the error is. The fix is simple. You just need to parse file.contents, since the flatten function operates on an object, not on a Buffer.
...
var flatJSON = new Buffer(JSON.stringify(
  flatten(JSON.parse(file.contents))));
file.contents = flatJSON;
...
That should fix your problem.
And since you are new to the Gulp plugin thing, I hope you don't mind if I make a suggestion. You might want to consider giving your users the option to prettify the JSON output. To do so, just have your main function accept an options object, and then you can do something like this:
...
var flatJson = flatten(JSON.parse(file.contents));
var jsonString = JSON.stringify(flatJson, null, options.pretty ? 2 : null);
file.contents = new Buffer(jsonString);
...
You might find that the options object comes in useful for other things, if you plan to expand on your plugin in future.
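To make the effect of the pretty option concrete, here is a tiny standalone sketch (stringifyFlat is a made-up name, not part of the plugin):
var flatten = require('flat');

// demonstrates how the pretty flag changes the emitted JSON
function stringifyFlat(contents, options) {
  options = options || {};
  var flatJson = flatten(JSON.parse(contents));
  // JSON.stringify's third argument controls indentation
  return JSON.stringify(flatJson, null, options.pretty ? 2 : null);
}

console.log(stringifyFlat('{"views":{"login":{"text":"Login"}}}', { pretty: true }));
// {
//   "views.login.text": "Login"
// }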
Feel free to have a look at the repository for a plugin I wrote called gulp-transform. I am happy to answer any questions about it. (For example, I could give you some guidance on implementing the streaming-mode version of your plugin if you would like).
Update
I decided to take you up on your invitation for contributions. You can view my fork here and the issue I opened up here. You're welcome to use as much or as little as you like, and in case you really like it, I can always submit a pull request. Hopefully it gives you some ideas at least.
Thank you for getting this project going.

How can I use Sinon to stub functions for the neo4j Thingdom module

I am having some issues writing unit tests where I would like to stub out the functionality of the neo4j Thingdom module.
After a few hours of failed attempts I searched around the web, and the only point of reference I found was a sample project which used sinon.createStubInstance(neo4j.GraphDatabase) to stub out the entire object. For me, because this seemed to be a throwaway project, I wanted a more fine-grained approach, so I can test that, for instance, as the Thingdom API outlines, saving a node means creating it (non-persisted), persisting it, and then optionally indexing it. Those are three calls that could be covered in multiple specific tests, which I am not sure can be achieved with the createStubInstance setup (i.e. finding out whether a function was called once).
Example "create node" function (this is just to illustrate the function; I am trying to build it out using the tests):
User.create = function (data, next) {
  var node = db.createNode(data);
  node.save(function (err, node) {
    next(null, node);
  });
};
I am able to stub functions of the top-level object (neo4j.GraphDatabase), so this works:
it('should create a node for persistence', function () {
  var stub = sinon.stub(neo4j.GraphDatabase.prototype, 'createNode');
  User.create({}, res);
  stub.calledOnce.should.be.ok;
  stub.restore();
});
The issue comes with the next set of tests I wish to run, which check whether the call to persist the node to the database (the node.save method) is made:
I am not sure if this is possible or how it can be achieved, but I have tried several variations of the stub (on neo4j.Node, on neo4j.Node.prototype) and none seem to work; they all come back with varying errors such as "can't wrap undefined". This is probably because the createNode function generates the node, rather than my code creating it directly.
Is there something I am glaringly doing wrong, am I missing a trick, or can this just not be done? If not, what are the best tactics for dealing with situations like this?
A possible solution is to return a stubbed or mocked object, giving you control over what happens after the node is created:
it('should create a node for persistence and call save', function () {
  // build the object first: referencing stubbedNode inside its own
  // initializer would capture undefined
  var stubbedNode = {};
  stubbedNode.save = sinon.stub().yields(undefined, stubbedNode);
  var stub = sinon.stub(neo4j.GraphDatabase.prototype, 'createNode').returns(stubbedNode);
  User.create({}, res);
  stub.calledOnce.should.be.ok;
  stub.restore();
  stubbedNode.save.calledOnce.should.be.ok;
});
We couldn't do it directly; the way the module is set up, it doesn't play too well with Sinon. What we are doing instead is abstracting the module away and wrapping it in a simple facade/adapter, which we are able to stub in our unit tests.
As we are not doing anything in that class beyond calling the neo4j module, we cover that part with integration tests (and will validate it when regression testing) to make sure we are actually hitting the neo4j database.
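A sketch of that facade idea, with hypothetical file and function names, so unit tests only ever stub your own thin wrapper:
// db-adapter.js: a hypothetical facade over the neo4j Thingdom module
var neo4j = require('neo4j');
var db = new neo4j.GraphDatabase('http://localhost:7474');

module.exports = {
  // create and persist a node in one call; tests stub this method instead
  saveNode: function (data, next) {
    var node = db.createNode(data);
    node.save(next);
  }
};
A unit test can then do sinon.stub(require('./db-adapter'), 'saveNode').yields(null, fakeNode) without ever touching neo4j internals.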
