I'm using detox with cucumber and the setup works great until the very last step: calling the AfterAll hooks.
In the hook I do the following:
AfterAll(async () => {
  console.log('CLEANING DETOX');
  await detox.cleanup();
});
To run cucumber I use the following script from package.json:
"bdd": "cucumber-js --require-module #babel/register",
The problem happens when I interrupt the cucumber run for any reason. The AfterAll hook won't run in that case, so detox won't clear its stale files.
That wouldn't be a problem by itself, but detox uses the ~/Library/Detox/device.registry.state.lock file to track which simulators are in use by the runner. Since this file never gets cleared, detox ends up constantly launching new simulator devices.
I thought I could just create a simple wrapper script:
const { execSync } = require('child_process');
const detox = require('detox');

// Register the cleanup handler before the blocking cucumber run
process.on('SIGINT', async function () {
  console.log('Cleaning up detox');
  await detox.cleanup();
});

const stdout = execSync('cucumber-js --require-module @babel/register');
console.log(stdout);
However, that didn't work either, since detox.cleanup() only removes the file when detox.device has been set up, which happens in the BeforeAll hook:
BeforeAll({ timeout: 120 * 1000 }, async () => {
  const detoxConfig = { selectedConfiguration: 'ios.debug2' };
  // console.log('CONFIG::', detoxConfig);
  await detox.init(detoxConfig);
  await detox.device.launchApp();
});
My only remaining idea is to clear the file manually; I should be able to grab the lock file path from detox internals somehow. My worry with that approach is the tight coupling to detox's implementation, which is why I would rather call detox.cleanup().
EDIT:
Ended up doing this workaround for now:
const detox = require('detox');
const { unlink } = require('fs');
// getDeviceLockFilePathIOS is pulled from detox internals (the tight coupling mentioned above)

BeforeAll({ timeout: 120 * 1000 }, async () => {
  await startDetox();
});

async function startDetox() {
  const detoxConfig = { selectedConfiguration: 'ios.debug' };
  const lockFile = getDeviceLockFilePathIOS();
  // Best-effort delete of the stale lock file before detox starts
  unlink(lockFile, (err) => {
    if (err) {
      console.debug('Lock file was not deleted:::', err);
    }
  });
  await detox.init(detoxConfig);
  await detox.device.launchApp();
}
I wonder if anyone has a better idea?
Have you tried adding
detox clean-framework-cache && detox build-framework-cache into your command like so:
detox clean-framework-cache && detox build-framework-cache && cucumber-js --require-module @babel/register
These are detox CLI commands that clear the framework cache and rebuild it, so detox starts fresh every time. They barely add any time to the execution, so they can be run on every test; the first thing the build does before starting the simulator is clear the cache and start from fresh. I had the exact same issue with our cucumber detox builds and this fixed it for us.
After fiddling with workarounds for way too long, I ended up migrating to jest-cucumber.
It works much better with detox and doesn't require all these workarounds.
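For anyone making the same move: the detox init/cleanup simply relocates into Jest's own hooks. Roughly like this (a sketch only; the file name is a placeholder, registered through setupFilesAfterEnv in the Jest config):
// detox-hooks.js (placeholder name), registered via setupFilesAfterEnv in jest.config.js
const detox = require('detox');

// Mirrors the cucumber BeforeAll hook above
beforeAll(async () => {
  await detox.init({ selectedConfiguration: 'ios.debug' });
  await detox.device.launchApp();
}, 120 * 1000);

// Mirrors the cucumber AfterAll hook; Jest runs it per test file
afterAll(async () => {
  await detox.cleanup();
});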
Inside my main file, I have
const loadWorker = async () => {
  const SyncWorker = await import("$lib/canvas.worker?worker");
  syncWorker = new SyncWorker.default();
  syncWorker?.postMessage({});
};
Then in my onMount I have
onMount(() => {
  console.log("Canvas: mounted");
  loadWorker();
});
Then in my canvas.worker.ts file, I have a simple
onmessage = () => {
  console.log("Hello from the worker!");
};
export {};
This message prints successfully in Chrome, but in Firefox all I get is
SyntaxError: import declarations may only appear at top level of a module
Is this because the worker is stored on my local system, and maybe there's a special flag to allow loading of system files as workers (as that seems like it could be a security concern)? Firefox docs say that my browser should support workers.
Well, I should've read the documentation better.
Service workers only work in the production build, not in development.
To test it locally, use vite preview
https://kit.svelte.dev/docs/service-workers
Or in my case, "npm run build && npm run preview" worked.
I want to test my API with jest. Before executing the tests of a test file I want to reset my database using an npm command like this:
import { execSync } from 'child_process'
import prisma from 'lib/prisma'
import { foods } from 'prisma/seed-data'
import { getFoods } from './foods'

beforeAll(() => {
  execSync('npm run db:test:reset')
})

afterEach(async () => {
  await prisma.$disconnect()
})

test('getFoods', async () => {
  const fetchedFoods = await getFoods()
  expect(fetchedFoods).toEqual(foods)
})
The npm run db:test:reset command executes this script:
echo "DROP DATABASE foodplanet WITH (FORCE); CREATE DATABASE foodplanet;" | sudo docker exec -i foodplanet_test psql -U postgres
cat prisma/backups/test/backup.sql | sudo docker exec -i foodplanet_test psql -U postgres foodplanet
Everything works fine if I run this test file by itself. However, if I execute all tests, multiple tests end up running this script at the same time. Running the tests sequentially with -i still results in errors like:
ERROR: relation "Account" already exists
or
ERROR: database "foodplanet" does not exist
I get different errors every time I run my tests, so I think there are still multiple instances trying to execute this script at the same time.
What could be the reason that the tests don't get executed sequentially?
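One thing worth checking: with beforeAll the reset runs once per test file, not once per run. A rough sketch of moving it into Jest's globalSetup instead, which runs a single async function before any test file starts (file name here is just a placeholder):
// jest.config.js -> globalSetup: './global-setup.js'

// global-setup.js
const { execSync } = require('child_process')

module.exports = async () => {
  // Runs exactly once per test run, so the reset cannot race itself
  execSync('npm run db:test:reset', { stdio: 'inherit' })
}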
How can I remove cookies, local storage and other crap from the "AppData\roaming\MyApp" folder, when the electron application quits?
I tried deleting the whole directory on app quit, but it throws me EBUSY errors. Apparently the files are locked or something, almost like someone doesn't want us to be able to remove the bloat?
const { app } = require('electron');
const path = require('path');
const fs = require('fs-extra');

const clearBloat = async () => fs.remove(path.join(app.getPath('appData'), app.name));

app.on('window-all-closed', async () => {
  await clearBloat();
});
After doing some testing, I've found that you have to delete the files after your Electron process has ended. Trying to delete the files in the quit or will-quit app events doesn't work; they get re-created right away. Something in Electron (likely Chromium) wants these files/folders to exist while the app is running, and it's too much work to figure out how to hook into it.
What works for me is spawning a detached cmd off a shell that waits 3 seconds and then deletes all files/folders in a given application folder. Left as an exercise to the reader: hiding the output of the ping command (or hiding the window, though there's been mixed success on that front), or choosing a different delay command. I've found timeout works, but sleep and choice (i.e. something like this) do not.
Here's what you will need to add:
const { app } = require("electron");
const { spawn } = require("child_process");
const path = require("path");
...
app.on("will-quit", async (event) => {
const folder = path.join(app.getPath("appData"), app.name);
// Wait 3 seconds, navigate into your app folder and delete all files/folders
const cmd = `ping localhost -n 3 > nul 2>&1 && pushd "${folder}" && (rd /s /q "${folder}" 2>nul & popd)`;
// shell = true prevents EONENT errors
// stdio = ignore allows the pipes to continue processing w/o handling command output
// detached = true allows the command to run once your app is [completely] shut down
const process = spawn(cmd, { shell: true, stdio: "ignore", detached: true });
// Prevents the parent process from waiting for the child (this) process to finish
process.unref();
});
As another user mentioned, there's a native API available on your Electron session that clears all of these files/folders. However, it returns a promise, and I could not figure out how to execute it synchronously within one of these app events.
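For reference, this is roughly how I'd expect awaiting that session API to look if you hold the quit yourself (a sketch only; I haven't verified it gets around the re-creation problem described above):
const { app, session } = require("electron");

let storageCleared = false;
app.on("will-quit", async (event) => {
  if (storageCleared) return;   // second pass: let the quit continue
  event.preventDefault();       // hold the quit while the promise resolves
  try {
    // clearStorageData returns a promise and wipes cookies, local storage, caches, etc.
    await session.defaultSession.clearStorageData();
  } finally {
    storageCleared = true;
    app.quit();                 // triggers will-quit again, this time without blocking
  }
});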
I have a function that must find a file (.zip) and return it, so that I can use it as an argument for another function. I also have a test in Karma/Jasmine where I launch my search function, and when I do this it throws the error 'fs.readFile is not a function'.
test code:
const fs = require('fs');
const JSZip = require("jszip");

const searchfile = () => {
  fs.readFile('./data/2-11253540.zip', function (err, data) {
    if (err) throw err;
    JSZip.loadAsync(data).then(function (zip) {
      console.log('Process Zip: ', zip);
    });
  });
};

describe('Process', () => {
  const process = require('./process');
  searchfile();
  it('001', () => expect(process()).toEqual(null));
});
It doesn't look very similar to what I described above, but it was a test version to check whether it works or not. In my Karma config I have browserify to handle require.
So, the searchfile function searches for a file and the process function will use this file. When I run this test I get the error that fs.readFile is not a function.
However, if I put the code of searchfile into the process function and run it directly, it works fine.
Why doesn't it work?
Look at the last couple of lines of green text in the screenshot (why are you posting PICTURES of text?!).
You are running the tests in Chrome.
The fs module is a Node.js feature.
The fs module is not available in Chrome (and it would be silly if a web page could read arbitrary files from the visitor’s file system).
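If the test really has to run in the browser under Karma, the zip has to come over HTTP instead of fs. A rough sketch, assuming the zip is served by the Karma server (e.g. listed in the files config with included: false, which makes it available under /base/):
const JSZip = require('jszip');

// There is no fs in the browser; fetch the file over HTTP instead
const searchfile = async () => {
  const response = await fetch('/base/data/2-11253540.zip');
  const data = await response.arrayBuffer();
  const zip = await JSZip.loadAsync(data);
  console.log('Process Zip: ', zip);
  return zip;
};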
I am trying to run multiple Karma test files in parallel from inside a Node script and get to know which tests are passing or failing. Right now what I have is this:
const exec = require("child_process").exec;
exec("karma start " + filename, (error, stdout, stderr) => {
// handle errors and test results...
});
The code above works well, and I can get the information on tests passed or failed from stdout. However, it requires having installed Karma and all of the associated dependencies (reporters, browser launchers, etc.) globally. I am looking for a solution that doesn't require me to install all dependencies globally.
My first thought was this:
const karma = require("karma");
const server = new karma.Server(config, () => {
  // some logic
});
However, when trying this other approach, I have been unable to gather the test results programmatically.
When using new karma.Server(), is there any way in which I could know which tests have passed or failed (and, ideally, a stack trace of the error)? Alternatively, is there any other way in which I can execute my tests and get the desired information programmatically without the need to install dependencies globally?
Actually, changing the exec line to this seems to do the trick:
exec("node node_modules/karma/bin/karma start " + filename, (error, stdout, stderr) => {
It turns out I'd only need to run the locally installed version of Karma instead of the global one. :-)
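For completeness, if you'd rather collect the results programmatically than parse stdout, karma.Server is an event emitter; something along these lines should work (a sketch based on Karma's run_complete event, adjust to your Karma version):
const karma = require("karma");

const server = new karma.Server(config, (exitCode) => {
  console.log("Karma exited with code " + exitCode);
});

// Fires once all browsers have finished; results aggregates the pass/fail counts
server.on("run_complete", (browsers, results) => {
  console.log("passed: " + results.success + ", failed: " + results.failed);
});

server.start();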