When I generate a coverage report using nyc, it always gives coverage results per file (what percentage of the file has been covered). Is there any way to figure out which test case contributed to the coverage of a given file?
I don't think it is possible to see which tests hit the lines of code or provide the coverage for a specific file. Nyc can be integrated with coverage services such as Codecov and Coveralls, but I haven't found an option for this in those tools either.
I'm not sure what you need the per-file test names for, but for debugging purposes there is also a Visual Studio Code extension for Mocha that lets you debug your tests (https://marketplace.visualstudio.com/items?itemName=maty.vscode-mocha-sidebar).
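One rough workaround (just a sketch; the spec file names and report directories below are placeholders) is to run nyc once per test file with a separate --report-dir, which at least shows what each spec covers on its own:

    # one nyc run, and one report, per spec file
    nyc --reporter=html --report-dir=coverage/model-spec mocha test/model.spec.js
    nyc --reporter=html --report-dir=coverage/collection-spec mocha test/collection.spec.js

You can then compare the per-spec reports to see which test file is responsible for covering which lines.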
My current SonarQube version is 9.2.4.50792.
I am using the Community Edition.
Languages being used: Python, JavaScript, HTML, CSS.
The APIs have been developed in Python.
I am trying to figure out how I can integrate Selenium (an automated testing framework, with test cases written in Python) with SonarQube so that SonarQube generates a runtime code coverage report.
When I run the test cases with Selenium, SonarQube should be able to generate a code coverage report, i.e. the percentage of code covered while the automated test cases are running.
I have set up SonarQube and a report is generated, but the code coverage is shown as 0. I tried to calculate coverage for Python with coverage.py to produce an .xml file that can be fed to SonarQube; the major setback is how to measure coverage of the API itself. The analysis currently done is static, whereas I want runtime code coverage analysis.
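What has worked in similar setups (a sketch only; the entry point app.py, the test directory, and the report file name are assumptions, and the Selenium suite is assumed to drive the API over HTTP) is to start the API process under coverage.py, run the Selenium tests against it, then convert the data to XML and point the scanner at it via sonar.python.coverage.reportPaths:

    # start the API under coverage.py (assumed entry point: app.py)
    coverage run --branch app.py &

    # exercise the running API with the Selenium test suite
    python -m pytest selenium_tests/

    # stop the API gracefully (e.g. SIGINT) so coverage.py can write its data,
    # then produce the XML report SonarQube understands
    coverage xml -o coverage.xml

    # sonar-project.properties should contain:
    #   sonar.python.coverage.reportPaths=coverage.xml
    sonar-scanner

The important point is that coverage.py has to wrap the process that actually serves the API; the scanner itself only reads the XML afterwards.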
I teach JavaScript at college and need to create exams that will run Jest unit tests on the code that my students write.
Much like Edabit or Codewars, where you have challenges with predefined tests...
At some point the students are also required to write their own Jest unit tests and run them against their own code, like you can do in CodeSandbox.
I'm using the Ace editor for now to allow students to write code in the browser. I need help with the following:
What sort of Jest variations / configurations are needed to write pre-defined unit tests in order to test the students' code in the browser?
What is the best way to run code written in a browser-based editor? I assume a simple eval() is problematic and not secure... I need to run it in the browser, so Node.js is not an option.
How do you support importing/exporting of ES6 modules? Is it true that you need to write your own module bundler to execute them in the browser? Is that the only way? Do any of the existing module bundlers support browser-based bundling?
When Intern tests don't load certain source files (0% covered), those files don't show up in the (lcov) coverage report (running in Node.js).
Typically a problem JS tools struggle with, I think.
E.g. Jest has a simple workaround.
I'm looking for the simplest workaround for Intern, ideally with v3.
Since Intern uses Istanbul under the covers, I wonder whether the --include-all-sources flag works and can be passed easily?
Is there a standard recipe to make the loader aware of all files?
I have files that don't load well in Node.js too; can they be included?
Taking a look at the Intern project itself, there is an option in the config called coverage, which is defined as:
An array of file paths or globs that should be instrumented for code
coverage, or false to completely disable coverage. This property
should point to the actual JavaScript files that will be executed, not
pre-transpiled sources (coverage results will still be mapped back to
original sources). Coverage data will be collected for these files
even if they’re not loaded by Intern for tests, ALLOWING A TEST WRITER TO SEE WHICH FILES HAVEN’T BEEN TESTED, as well as coverage
on files that were tested. When this value is unset, Intern will still
look for coverage data on a global coverage variable, and it will
request coverage data from remote sessions. Explicitly setting
coverage to false will prevent Intern from even checking for coverage
data. 💡This property replaces the excludeInstrumentation property
used in previous versions of Intern, which acted as a filter rather
than an inclusive list.
Sorry for the uppercase; it was just supposed to highlight that sentence.
coverage uses globs just as Istanbul does, so you could specify something like coverage: ['src/**/*.js'].
I realized this because Intern itself uses this configuration to collect coverage, and it seems to work for them.
Edit: As pointed out in the comments, this feature only appears in v4 of Intern.
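For example, a minimal intern.json along these lines (the suite and source paths are just placeholders) should collect coverage for everything under src, whether or not the tests actually load it:

    {
      "suites": ["tests/unit/**/*.js"],
      "coverage": ["src/**/*.js"]
    }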
How can I combine multiple code coverage reports to tie in with TeamCity's code coverage statistics?
I have a Grunt pipeline that does JavaScript code coverage reporting (using karma-coverage) and C# code coverage reporting (using OpenCover and ReportGenerator). Both offer a multitude of reporting options.
For JavaScript I am currently generating:
HTML report
TeamCity-interpretable console output - see this for details
Text summary
For C#:
HTML report
XML report
My own TeamCity-interpretable console output using the ReportGenerator interfaces
I am running my pipeline in TeamCity and want to use the nice TeamCity coverage statistics. This works well for one of the reports via the console output, but TeamCity does not merge them: the next coverage reporter (C# comes after JavaScript for me) overwrites the previous one.
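One workaround I have seen for this (a sketch only; it assumes you have already extracted covered/total line counts from the karma-coverage and ReportGenerator outputs, and the numbers below are placeholders) is to let neither reporter talk to TeamCity directly and instead emit a single, already-combined set of statistics via buildStatisticValue service messages:

    // combine-coverage.js - run as a final step of the pipeline
    // (placeholder numbers; in practice, parse them from the two summaries)
    const js = { covered: 820, total: 1000 };
    const cs = { covered: 450, total: 600 };

    const covered = js.covered + cs.covered;
    const total = js.total + cs.total;

    // TeamCity reads these service messages from stdout
    console.log(`##teamcity[buildStatisticValue key='CodeCoverageAbsLCovered' value='${covered}']`);
    console.log(`##teamcity[buildStatisticValue key='CodeCoverageAbsLTotal' value='${total}']`);
    console.log(`##teamcity[buildStatisticValue key='CodeCoverageL' value='${(100 * covered / total).toFixed(2)}']`);

Since only one combined set of CodeCoverage* statistics is reported, nothing gets overwritten.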
The Blanket.js website says that Mocha is supported. I followed the procedure of including blanket.js on my test page and put the data-cover attribute on the appropriate script elements, but no coverage report is shown on the generated test page.
I have only 2 unit tests: 1 for a Backbone.js Model and 1 for a Backbone.js Collection. Can somebody assist me in getting the code coverage report?
Never mind, to solve this problem you also need to include mocha-blanket.js.
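For reference, a test page along these lines should work (the paths are just examples; what matters is that blanket.js and its mocha-blanket.js adapter are both loaded, and that the sources under test carry data-cover):

    <script src="lib/mocha.js"></script>
    <script>mocha.setup('bdd');</script>

    <!-- blanket itself, followed by its Mocha adapter -->
    <script src="lib/blanket.js"></script>
    <script src="lib/mocha-blanket.js"></script>

    <!-- sources to instrument -->
    <script src="js/model.js" data-cover></script>
    <script src="js/collection.js" data-cover></script>

    <!-- the tests, then run them -->
    <script src="test/model.test.js"></script>
    <script src="test/collection.test.js"></script>
    <script>mocha.run();</script>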