Get tests running time with Jest - javascript

Is there any way to know how long my tests take without doing it programmatically with Jest?
To be clear, I know that if I add a variable to get the current time before each test and then log this when my test completes I'll get this information, but I want this automatically, maybe with some Jest configuration.

You shouldn't need any configuration to get the running time for your tests:
PASS src/containers/Dashboard/Dashboard.test.tsx (12.902s)
That 12.902s in parentheses is the total time since the test command was run.
If you want to see the running time per test you can run jest with the --verbose flag and it will show you the time for each test as well as the whole suite.
Dashboard Container
✓ render without crashing (1090ms)
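If you prefer configuration over a command-line flag, Jest also accepts a `verbose` option in its config file; a minimal sketch:

```javascript
// jest.config.js — equivalent to passing --verbose on the command line.
// With this set, Jest reports each individual test with its timing.
module.exports = {
  verbose: true,
};
```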

Related

When carrying out my unit tests, how can I execute custom code if at least one of my unit tests fails?

In a test file, I've added several unit tests using the Tape test harness. What I'd like to do now is ensure that, if at least one of my unit tests fails (screenshot), some custom JS code is executed. How would I approach that?
In this case, the custom code I'd like to run will play a sound (which I plan to do using the sound-play Node package).
If it matters, I'm running the unit tests in VSCode, and the Tape output is currently printed to VSCode's output panel.
Thanks.
I figured out that Tape has an onFailure method, which fires a callback whenever at least one test fails. This was exactly what I needed.
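A minimal sketch of that approach, assuming Tape's `onFailure` hook and the sound-play package mentioned in the question (the `alert.wav` path is a placeholder):

```javascript
const test = require('tape');
const sound = require('sound-play'); // third-party package from the question

// Runs once if any test in this file fails.
test.onFailure(() => {
  sound.play('alert.wav'); // placeholder path to the sound file
});

test('example', (t) => {
  t.equal(1 + 1, 2);
  t.end();
});
```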

Log to terminal when QUnit test suite completes?

When my test suite completes, I need to output some stats, i.e. meta info about the tests collected during test execution.
I'm trying this:
QUnit.done(() => console.log("some meta info here"))
This works when I run tests in the browser.
But when I run tests in the terminal, the console.log output is not displayed.
There's probably some debug flag, but it will enable all console.log messages and pollute the output greatly.
Instead, I need to output one specific message to the terminal, so that it's logged to CI.
PS console.log messages sent during test execution seem to make it into the terminal successfully.
PPS Using QUnit in an Ember CLI app, against Chrome headless.
This was a tricky one, as I've never had a need to interact with QUnit like this, but here are my findings each step of the way:
Attempt 1:
That's a weird error, I thought I was passing a callback function :-\
Attempt 2:
After looking up the documentation for QUnit.log, I could see I was using it wrong. Switching to console.log shows the beginning message -- but not the ending message.
Attempt 3:
moduleDone will print something at the end -- but it also fires for every module (after everything inside it finishes running). So, as a hack, if QUnit.done never ends up working, you could keep track of the number of modules started and modules done, make sure every started module completes, and when that count drops back to zero at the end, your test suite is done?
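The counting hack described above could be sketched like this, assuming QUnit's standard `moduleStart`/`moduleDone` callbacks (and with the caveat the answer itself raises -- the counter can hit zero between sequential top-level modules, so this is fragile):

```javascript
// Fragile sketch: count modules in flight and report when the
// counter returns to zero. Not a reliable end-of-suite signal.
let modulesRunning = 0;

QUnit.moduleStart(() => {
  modulesRunning++;
});

QUnit.moduleDone(() => {
  modulesRunning--;
  if (modulesRunning === 0) {
    console.log('some meta info here');
  }
});
```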
Attempt 4
Turns out, this is only actually helpful if you want to know that the outermost module is done, because it seems multiple tests don't run in parallel (which is probably better anyway for test stability).
Attempt 5
https://github.com/qunitjs/qunit/issues/1308
It looks like an issue with the testem adapter :(

WDIO waitForExist not accepting large values when run with Cucumber?

I have a test environment where the element I want to wait for can take a minute or more to load. I find that the .waitForExist() function gives up after about 20 seconds, regardless of whether I pass in 60000 or 600000 as the millisecond value. It does not seem to throw an exception on the console.
I am using NPM and Cucumber to call the test like this:
$ npm test -- --spec "features/b2c/campaigns/Viewing_a_campaign.feature" --cucumberOpts.tags="#smoke"
What is the problem with supplying large values to this method?
Note: I don't have any problems with values < 20000.
Thanks iamdanchiv
The issue was the timeout value in the cucumberOpts section of wdio.config.js, which was set to 20000, creating an upper limit on how long any individual step can run regardless of the WDIO waitFor commands.
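In other words, the Cucumber step timeout must be at least as large as the longest wait you pass to waitForExist. A sketch of the relevant config excerpt (values are illustrative):

```javascript
// wdio config file (excerpt) — raising the Cucumber step timeout so
// long waitForExist calls are not cut off after 20 seconds.
exports.config = {
  // ...
  cucumberOpts: {
    // Maximum time (ms) a single Cucumber step may run.
    timeout: 600000,
  },
};
```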

Passing cmd parameters to Jasmine tests

I would like to pass command line parameters to my Node.js/Jasmine tests. At the moment I have basic Jasmine file/dir structure set up (jasmine.json file and specs - as in the example: Jasmine documentation).
I run the specs from the command line by executing: jasmine.
I would like to pass some command line parameter so that I can use it in my specs. I would run the tests with a command like: jasmine --param value or jasmine param_value.
Is it possible (how to do that)?
The parameter I want to pass is a password and I don't want to hardcode it - maybe you can suggest any better solution?
Thanks in advance!
First off, in general, if you want to send parameters to a test in Jasmine you would use jasmine -- --param value or jasmine -- param_value. No, that extra -- isn't a typo: it tells Jasmine to stop parsing options there and pass everything after it through.
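Inside a spec, those passed-through arguments show up in process.argv, and you can pick them out yourself. A minimal sketch, using a hypothetical helper name (parseCliParams is our own, not part of Jasmine):

```javascript
// Collect "--name value" pairs from an argv-style array.
function parseCliParams(argv) {
  const params = {};
  for (let i = 0; i < argv.length; i++) {
    if (argv[i].startsWith('--')) {
      params[argv[i].slice(2)] = argv[i + 1];
      i++; // skip the consumed value
    }
  }
  return params;
}

// In a real spec you would call it on the live arguments:
// const params = parseCliParams(process.argv.slice(2));
console.log(JSON.stringify(parseCliParams(['--param', 'value'])));
```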
That said, passwords and command-line parameters don't mix. Virtually every shell has some form of history, and that history records every command just as it's typed. This is a major security no-no: it leaves an unencrypted password recorded on disk.
If the password can't be hard-coded, the test itself should ask for it. That at most creates a transient in-memory temporary holding the value, which is much safer than having it sit in a history file somewhere. Node has modules such as readline that can prompt for input without echoing the password as it is typed.

Protractor Test Result variations

I am working with Protractor on some projects, and when I run them there are differences in the messages at the end of a successful test run. In one project I wrote the tests normally, and when they run this message appears.
But the other is written using Page Objects and has one test with 4 assertions in it; when it runs successfully this message appears.
What I want to know is why, in the second scenario, the assertions are not shown and the output is not green. What is the reason for this difference, is it an issue, and if so how can I fix it?
That's configured via jasmineNodeOpts in your Protractor configuration file. It's the configuration for the default reporter.
The properties you are looking for are:
jasmineNodeOpts: {
  silent: false,
  showColors: true
}
The values above are the defaults. In your second screenshot they are inverted.
Take a look at jasmine-spec-reporter for another reporter with more options.