I have a static web app: HTML, JS (RequireJS modules), and some CSS.
Currently the 'serverUrl' is set through a properties module, which I can 'require' and use values from:
define({
    serverUrl: 'https://some.api/path/'
});
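For context, it's consumed elsewhere roughly like this (a sketch; the module path 'app/properties' is made up):

// Hypothetical consumer module
define(['app/properties'], function (props) {
    // build request URLs from the configured server
    var url = props.serverUrl + 'items';
});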
I have Intern set up to run functional tests in the browser, using src/index.html as the entry point:
return this.remote
    .get(require.toUrl('src/index.html'))
Given that the serverUrl is hardcoded in the properties file, I'm trying to find a way to run tests against the web app where serverUrl points to localhost:1234/someFakeServer, so I can test error scenarios and the like.
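In other words, for the tests I'd want the app to pick up a properties module like this instead (sketch; file name hypothetical):

// properties.test.js (hypothetical)
define({
    serverUrl: 'http://localhost:1234/someFakeServer/'
});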
I've trawled the web but can't find anyone doing anything remotely similar, which makes me think I'm doing something obviously wrong. There are solutions for Node apps using config modules, but because I never 'start' my web app (it's just files), these won't work for me.
Some solutions I've thought about, but can't figure out how to achieve:
Intern is proxying the files on :9000, so if I could somehow 'build' the application with another properties file pointing to localhost, all would be good. But I've no idea how to do that; I've looked at webpack and similar tools, but they don't seem to do what I want.
I've looked at Intern's 'setup' config item, which allows a function to be run before the tests start, so I thought about modifying the properties file there. But that seems too hacky, and I'm not sure how I'd put it back afterwards...
Assuming the properties file is accessible to Intern, you could simply have Intern load the properties file and pull the server URL out of it. If you have multiple potential properties files, the one being used can be set in the Intern config or passed in as a custom command line variable (which would be used to set a property in the Intern config). The test module can get the name of the properties file from Intern's config and then load the relevant file. It could look something like this (untested):
// intern config
define({
    // ...
    propertiesFile: 'whatever'
});

// test file
define([ 'intern', ... ], function (intern, ...) {
    registerSuite({
        // ...
        'a test': function () {
            var dfd = this.async();
            var remote = this.remote;
            require([
                intern.config.propertiesFile
            ], dfd.callback(function (props) {
                return remote.get(props.url)
                    .otherStuff;
            }));
        }
    });
});
I have a Vue project using TypeScript and built with Vue CLI 3.
What I'm trying to achieve is to get webpack to build separate bundles for my workers. I've read about webpack Code Splitting and about configureWebpack in vue.config.js, but so far I've had no luck putting them together.
The project setup is the standard vue create type. I have ./src/main.ts as the main entry point and a bunch of TypeScript modules that I want as separate bundles with their own dependency trees (I'm fine with code duplication if it can't be avoided).
I'd like to get:
./dist/js/all main stuff
./dist/js/workers/worker1.6e3ebec8.js
./dist/js/workers/worker2.712f2df5.js
./dist/js/workers/worker3.83041b4b.js
So I could do new Worker('worker1.6e3ebec8.js') in the main code.
I could launch workers from the main package by generating JavaScript code, putting it into a blob, and instantiating from that, but it looks rather awkward. Besides, my worker code imports other modules, so it doesn't seem to be an option anyway.
I'm quite new to all of this, so maybe I'm not even heading in the right direction.
What is the usual way of doing that on this stack?
You can use import(); it returns a Promise that resolves to your module.
As you are using Vue CLI 3, webpack is already configured and should split your bundle automatically.
const moduleName = 'coolModuleName'

import(
    /* webpackChunkName: "[moduleName]" */
    `#/my/module/path/${moduleName}.js`
).then(moduleCode => {
    // use your module
})
// load them in parallel
const getModuleDynamically = (path, moduleName) => import(
    /* webpackChunkName: "[moduleName]" */
    `#/${path}/${moduleName}.js`
)
Promise.all([
    getModuleDynamically(path, moduleName1),
    getModuleDynamically(path, moduleName2),
    getModuleDynamically(path, moduleName3)
])
Got there! #aquilesb's answer did help, although after plenty of experimenting I failed to get getModuleDynamically() from that answer working.
Update: it looks like with this solution I'm not able to use imports of npm modules. I've experimented with worker-loader but haven't got anywhere so far.
Here are a few takeaways:
Create a separate webpack config for packing workers. The target: 'webworker' setting must be there. Call it with webpack --config ./webpack.config.workers.js, since Vue won't know about it.
Create a separate tsconfig.json for the workers' TypeScript. The lib setting for workers must be there: "lib": ["esnext","webworker","scripthost"], as well as the proper include:[...]/exclude:[...] settings.
You may need to tell Vue to use the main tsconfig.json, which has its own "lib": ["esnext","dom","dom.iterable","scripthost"] and include/exclude. This is done in vue.config.js, which you will probably need to create. I use the chainWebpack configuration option of the Vue config.
Let webpack know you have dynamic loading by making calls to import() with static (i.e. not variable) names. I haven't found a way to do this in a config file, but it doesn't matter: you can't help hardcoding the names somewhere; how else would webpack know it has to bundle and split the code?
Somehow get the name(s) of the generated files, as you need at least one of them at runtime to do new Worker(filename). I used the --json option of the webpack CLI for that.
There are many ways all of this can be achieved. This is what this ended up looking like in my project:
Folder structure:
webpack.config.workers.js
vue.config.js
tsconfig.base.json
src/main/
src/main/tsconfig.json -- extends tsconfig.base.json
src/shared/ -- this code may be duplicated by the Vue app bundles and by workers bundle
src/workers/
src/workers/tsconfig.json -- extends tsconfig.base.json
webpack.config.workers.js: contains a single entry, the main worker file, which loads the other stuff.
entry: {
    worker: './src/workers/worker.ts'
}
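For reference, a fuller sketch of what such a webpack.config.workers.js could look like; the output paths, hashing scheme, and ts-loader wiring are assumptions, not the project's exact config:

// webpack.config.workers.js (sketch; paths and loader options are assumptions)
const path = require('path');

module.exports = {
    target: 'webworker', // required for worker bundles
    entry: {
        worker: './src/workers/worker.ts'
    },
    output: {
        path: path.resolve(__dirname, 'dist/js/workers'), // assumed output folder
        filename: '[name].[contenthash].js' // hashed names, as in the question
    },
    resolve: {
        extensions: ['.ts', '.js']
    },
    module: {
        rules: [
            {
                test: /\.ts$/,
                loader: 'ts-loader',
                options: { configFile: 'src/workers/tsconfig.json' } // the workers' own tsconfig
            }
        ]
    }
};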
build.workers.sh: this script calls the webpack CLI and produces a JSON file with the resulting worker bundle names (trivial actions on folders are omitted). The only one I need directly is called "worker"; the rest are dynamically loaded by it.
#!/bin/bash
# Map entry name -> bundle file name
# "assetsByChunkName":{"entryN":"entryN.[hash].js", ...}
json=$(webpack --config ./webpack.config.workers.js --json "$@"|tr -d "\n\r\t "|grep -Eo '"assetsByChunkName":.+?}')
# Remove "assetsByChunkName"
json=$(echo "${json:20}")
echo $json
echo $json > "$target/$folder/workers.json"
Load workers.json at runtime. The other option would be to use it at compile time by providing Vue config with const VUE_APP_MAIN_WORKER = require("path to/workers.json").worker and using this env constant.
Now that we have the name of the main worker file, we can do new Worker("main worker file path we've got from webpack").
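For illustration, the runtime side could look something like this (the file layout and message protocol here are assumptions):

// main app code (sketch): look up the generated name, then start the worker
fetch('/js/workers/workers.json')
    .then(function (res) { return res.json(); })
    .then(function (names) {
        var mainWorker = new Worker('/js/workers/' + names.worker);
        mainWorker.postMessage({ load: 'sodium' }); // app-specific protocol
    });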
The main worker file contains the function that statically references other modules and dynamically loads them. This way Webpack knows what to bundle and how to split the code.
enum WorkerName {
    sodium = "sodium",
    socket = "socket"
}

function importModule(name: WorkerName): Promise<any> {
    switch (name) {
        case WorkerName.sodium:
            return import(
                /* webpackChunkName: "sodium" */
                "workers/sodium"
            );
        case WorkerName.socket:
            return import(
                /* webpackChunkName: "socket" */
                "workers/socket"
            );
    }
}
Use the postMessage/message event API to tell your main worker code what to load.
const messageHandler = (e: MessageEvent) => {
    // here goes the app-specific handling of events
    // that gets you the moduleName in the end
    importModule(moduleName).then((work) => {
        // do smth
    });
};
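The handler is then wired up in the worker's global scope, along these lines (a sketch):

// worker.ts entry (sketch): register the handler for messages from the main thread
self.addEventListener("message", messageHandler);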
Now, to the correct answer.
To achieve the following:
Using webworkers
Using both dynamic imports and normal imports in webworker code
Sharing code between webworkers and main app
I had to add a separate rule for worker-loader in vue.config.js and also add babel-loader. It took me some time to find the correct solution, but in the end I dropped the previous one (in my other answer). I still use separate tsconfig.json files for the main app and for the webworkers.
What I'm still not happy with is that vue-cli (or rather the fork-ts-checker plugin) doesn't seem to know the webworker-specific types in my worker classes (so I can't use DedicatedWorkerGlobalScope, for instance).
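For illustration, the kind of rule described above might look like this in vue.config.js (the test pattern and loader chain are assumptions, not the project's exact config):

// vue.config.js (sketch)
module.exports = {
    chainWebpack: (config) => {
        config.module
            .rule('worker')
            .test(/\.worker\.ts$/) // assumed naming convention
            .use('worker-loader')
            .loader('worker-loader')
            .end()
            .use('babel-loader') // runs before worker-loader (loaders apply right-to-left)
            .loader('babel-loader');
    }
};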
I use require.js to load files at runtime, like the following.
This works as expected when I run the file in the right context (I mean when the call comes from the right path):
// module1.js
define(["otherModule"], function(otherModule) {
    // working!!!
    // ...
});
Now I want to create some unit tests for this file (module1) from another context (a tests folder located in a different place in the project), and I get this error:
require.js:145 Uncaught Error: Script error for: otherModule
This happens because the unit test, which lives in a different part of the project structure, tries to fetch the module from this path:
https://app/path1/path2/path3/otherModule.js
At runtime, where it works (from a different context), it finds the module at:
https://app/path1/path2/path3/path4/path5/otherModule.js
The request that works has the additional path4 and path5 segments.
How should I solve this so it works in both cases (unit tests and runtime)?
I think you should be able to get it working by applying a RequireJS configuration file, so that the module name is abstracted from its path:
E.g. in the test context, call something like this as an initialization step:
require.config({
    baseUrl: "/path1/path2/path3"
});
Alternatively, you can also remap single modules like so (this can also be used to inject a different implementation of a specific module for testing etc.):
require.config({
    paths: {
        "otherModule": "/path1/path2/path3/otherModule"
    }
});
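The same idea can inject a stub implementation for tests via map config, e.g. (the stub path here is hypothetical):

require.config({
    map: {
        '*': {
            "otherModule": "test/stubs/otherModuleStub"
        }
    }
});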
See here: http://requirejs.org/docs/api.html#config
I am trying to port a library from grunt/requirejs to webpack and stumbled upon a problem that might be a game-breaker for this endeavor.
The library I am trying to port has a function that loads and evaluates multiple modules into our app, based on filenames we get from a config file. The code looks like this (coffee):
loadModules = (arrayOfFilePaths) ->
  new Promise (resolve) ->
    require arrayOfFilePaths, (ms...) ->
      for module in ms
        module ModuleAPI
      resolve()
The require here needs to be called at runtime and behave like it did with RequireJS. Webpack seems to only care about what happens at build time.
Is this something that webpack fundamentally doesn't care about? If so, can I still use requireJS with it? What is a good solution to load assets dynamically during runtime?
edit: loadModules can load modules that are not present at the build time of this library. They will be provided by the app that implements my library.
So I found that my requirement (having files loaded at runtime that are only available at app compile time, not at library compile time) is not easily possible with webpack.
I will change the mechanism so that my library doesn't require the files anymore but needs to be passed the required modules. Something tells me this is going to be the better API anyway.
edit to clarify:
Basically, instead of:
# in my library
load = (path_to_file) ->
  (require path_to_file).do_something()

# in my app (using the 'compiled' library)
cool_library.load("file_that_exists_in_my_app")
I do this:
# in my library
load = (module) ->
  module.do_something()

# in my app (using the 'compiled' library)
module = require("file_that_exists_in_my_app")
cool_library.load(module)
The first snippet worked in require.js but not in webpack.
In hindsight I feel it's pretty wrong to have a third-party library load files at runtime anyway.
There is a concept named context (http://webpack.github.io/docs/context.html); it allows you to make dynamic requires.
There is also the possibility to define code-split points: http://webpack.github.io/docs/code-splitting.html
function loadInContext(filename) {
    return new Promise(function(resolve) {
        require(['./' + filename], resolve);
    });
}

function loadModules(namesInContext) {
    return Promise.all(namesInContext.map(loadInContext));
}
And use it like the following:
loadModules(arrayOfFiles).then(function(modules) {
    modules.forEach(function(module) {
        module(moduleAPI);
    });
});
But this is likely not what you need: you will have a lot of chunks instead of one bundle with all the required modules, which would probably not be optimal.
It is better to define the module requires in your config file and include it in your build:
// modulesConfig.js
module.exports = [
    require(...),
    ....
]

// run.js
require('modulesConfig').forEach(function(module) {
    module(moduleAPI);
})
You can also try using a library such as this: https://github.com/Venryx/webpack-runtime-require
Disclaimer: I'm its developer. I wrote it because I was also frustrated with the inability to freely access module contents at runtime (in my case, for testing from the console).
I have a file named test/helper.js that I use to run Mocha tests on my Node.js apps. My tests structure looks like:
test/
test/helper.js # global before/after
test/api/sometest.spec.js
test/models/somemodel.spec.js
... more here
The file helper.js has to be loaded because it contains global hooks for my test suite. When I run Mocha to execute the whole test suite like this:
mocha --recursive test/
the helper.js file is loaded before my tests and my before hook gets executed as expected.
However, when I run just one specific test, helper.js is not loaded before the test. This is how I run it:
mocha test/api/sometest.spec.js
No global before is called; not even a console.log('I WAS HERE'); fires.
So how can I get Mocha to always load my helper.js file?
Mocha does not have any notion of a special file named helper.js that it would load before other files.
What you are trying to do works when you run mocha --recursive because of the order in which Mocha happens to load your files. Because helper.js is one level higher than the other files, it is loaded first. When you specify an individual file to Mocha, then Mocha just loads this file and, as you discovered, your helper.js file is not loaded at all.
So what you want to do is load a file such that it will set top level ("global") hooks (e.g. before, after, etc.). Options:
You could use Mocha programmatically and feed it the files in the order you want (see the sketch after this list).
You could force yourself to always specify your helper file on the command line first before you list any other file. (I would not do this, but it is possible.)
Another option would be to organize your suite like I've detailed in this answer. Basically, you have one "top level" file that loads the rest of the suite into it. With this method you'd lose the ability of running Mocha on individual files, but you could use --grep to select what is being run.
You cannot use the -r option. It loads a module before running the suite but, unfortunately, the loaded module does not have access to any of the testing interface that Mocha makes available to your tests so it cannot set hooks.
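For the first option, a minimal sketch of driving Mocha programmatically (the file paths are illustrative):

// run-tests.js (sketch)
var Mocha = require('mocha');
var mocha = new Mocha();

mocha.addFile('test/helper.js'); // load the global hooks first
mocha.addFile('test/api/sometest.spec.js'); // then the suite under test

mocha.run(function (failures) {
    process.exitCode = failures ? 1 : 0;
});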
What I do is create a test/test_helper.js file, which exports all the helpers I create:
// test/test_helper.js
module.exports = {
    MyHelper: require('./helpers/MyHelper')
}
Then I require the helper on any test I need to use it:
// test/spec/MySpec.js
var helper = require('../test_helper');
// Or if you need just "MyHelper"
var myHelper = require('../test_helper').MyHelper;
describe('MySpec', function () {
    // Tests here...
});
I prefer the above approach because it's easy to understand and flexible. You can see it in action in my demo: https://github.com/paulredmond/karma-browserify-demo/tree/master/test
First, I would definitely use mocha.opts so that you don't have to include the options you want every time. As pointed out, one option is to use --grep, but I am not a huge fan of that personally; it requires you to name everything in an overly simplistic way. If the before hook is NOT async, you can use --require in your mocha.opts, e.g.:
#mocha.opts
--recursive
--require test/helpers.js
It sounds like this wouldn't work for you, because you want a global after hook as well. What I do is call the full test suite every time; but if I am in the middle of developing and only want to test one suite, or even one specific test, I use the exclusivity feature, .only (https://mochajs.org/#exclusive-tests). You can write it.only('... or describe.only('... If you do this, Mocha looks through all tests and sets up exactly like your full test harness would, but then only executes the test or suite you have specified.
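For example, while developing you can temporarily mark one suite (the suite name here is illustrative):

// Only this suite runs, but the global hooks still fire.
describe.only('MyModel', function() {
    it('saves a record', function() {
        // ...
    });
});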
Now you can include those global hooks no problem. #Louis mentions that your helper.js is loading in the proper order only coincidentally. That is not true: if you place any hooks outside of a describe block, they automatically become global hooks. This can be accomplished by either putting them in their own file
// helpers.js
before(function() { console.log('testing...'); });
or within a test file
// some.spec.js
before(function() { console.log('testing...'); });

describe('Something', function() {
    it('will be tested', function() {
        ...
    });
});
Obviously, I think putting it in its own file is cleaner (I called mine hooks.js). The point is, this is not a result of the order in which files were loaded.
Just one gotcha that might be obvious to others but that I struggled with briefly: hooks not placed in a describe block are ALL global. They are not directory specific. So if you copy helpers.js into a subdirectory of tests, the before and after hooks will now fire twice. Also, if you place a beforeEach hook in there, it will fire before every single test, not just the tests in that directory.
Anyway, I know this post is a bit old, but hopefully this will help others having similar issues.
Late addition to the answers:
Mocha (v7.0.0 as of writing) supports specifying files to load as an option.
As per the docs:

--file    Specify file(s) to be loaded prior to root suite execution
.mocharc.json
{
    "watch-files": [
        "test/**/*.js"
    ],
    "recursive": true,
    "file": "test/setup"
}
./test/setup.js
const request = require('supertest');
const app = require('../app'); // express app
global.request = request(app);
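A spec can then use the global request directly, e.g. (the route here is illustrative):

// test/api/users.spec.js (sketch)
describe('GET /users', function () {
    it('responds with 200', function () {
        return request.get('/users').expect(200);
    });
});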
A worthy mention: I found that the above setup loaded all .js files anyway, probably because of Mocha's extension config option, which is set to js by default. Since I had the convention of naming all test files with .spec.js, I can ignore the other files by adding "ignore": ["test/**/!(*.spec.js)"].
I came to this question after trying all sorts of things to get my tests to connect to a database once before running a bunch of CRUD tests on my models.
Then I found mocha-prepare which solved my problems.
In your helper.js file you can just call its prepare function.
const prepare = require('mocha-prepare')

prepare(done => {
    console.log('do something asynchronously here')
    done()
}, done => {
    console.log('asynchronously clean up after here')
    done()
})
works a treat.
In our project, we are using helpers somewhat like this:
clientHelper = require("../../../utils/clientHelper")
You need to configure the relative path of your helper properly.
And then calling it like this:
clientHelper.setCompanyId(clientId)
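For illustration, such a helper might be as simple as this (a sketch; the actual implementation isn't shown in the answer):

// utils/clientHelper.js (hypothetical implementation)
var companyId = null;

module.exports = {
    setCompanyId: function (id) { companyId = id; },
    getCompanyId: function () { return companyId; }
};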
So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is provided by Node.js and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";

requirejs.config({
    baseUrl: "/",
    paths: {
        app: "scripts/appapp",
        "socket.io": socketIoPath
    }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (so the optimizer doesn't fail trying to pull it in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, the build runs great, but I can't run the raw source from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
1. Use almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute pointing at your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to use the "empty:" scheme in your build config (but not in your main config).
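For reference, the "empty:" scheme goes in the r.js build config along these lines (a sketch; the file names are assumptions):

// build.js (r.js build config sketch)
({
    mainConfigFile: 'client.js', // as in the error message above
    name: 'scripts/appapp',
    out: 'dist/app.built.js',
    paths: {
        'socket.io': 'empty:' // tell the optimizer to skip this module
    }
})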
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
    var io = window.io;
    window.io = null;
    return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.