I'm using the Worker API to create a new worker; it's as simple as:
var myWorker = new Worker('src/worker.js');
Unfortunately I'm using webpack to build my project and linking to src/worker.js returns 404.
How can I use it to create my Worker?
import hardWorker from './src/hardWorker.js';
var myWorker = new Worker(HOW TO PUT HARDWORKER HERE?);
Try using the worker-loader plugin for webpack, which can be found here: https://github.com/webpack-contrib/worker-loader
This registers scripts loaded through it as Web Workers and has webpack bundle the worker code as part of your build. Using it in your app is as simple as this:
import MyWorker from 'worker-loader!./Worker.js';
const worker = new MyWorker();
Or you can add loader configuration parameters to your webpack.config so all files matching a specific pattern are loaded and registered as workers. The worker-loader has an example of this.
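A sketch of such a rule (the .worker.js file-name pattern is just a convention, not something the loader requires):

module.exports = {
  module: {
    rules: [
      {
        // Any import whose filename ends in .worker.js goes through worker-loader.
        test: /\.worker\.js$/,
        use: { loader: 'worker-loader' }
      }
    ]
  }
};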
The issue is that the Worker API expects a URL to load the worker script from, while webpack tries to bundle everything into a single file, meaning the src/worker.js file isn't where it needs to be when the browser tries to load it.
There are a few options.
You could manually copy the worker.js file to wherever the build is trying to load the file from.
You could set up webpack to copy the file to the right location using the CopyWebpackPlugin.
Depending on how complicated your worker is, you could just put the code in a string and create an object URL for it, so the browser doesn't have to load anything from the server at all. This is an approach I've used in the past, and I've written a helper library to make Workers a little easier to use.
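As an illustration of the object-URL approach (not the helper library itself, just a minimal sketch):

// Build the worker from a string so nothing has to be fetched from the server.
var workerSource = 'self.onmessage = function (e) { self.postMessage(e.data * 2); };';
var blob = new Blob([workerSource], { type: 'application/javascript' });
var worker = new Worker(URL.createObjectURL(blob));

worker.onmessage = function (e) { console.log('worker said:', e.data); };
worker.postMessage(21); // logs "worker said: 42"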
Hope that helps!
I have been reading through the Vite documentation for Web workers, and it mentioned importing a file using "Query Suffixes," which I have never encountered and am not sure what to search for to learn more. I'm not sure if this is native to Node.js, Vite.js, or if it is a plugin that is built into Vite.
This is the specific section that I am referring to:
Import with Query Suffixes
A web worker script can be directly imported by appending ?worker or ?sharedworker to the import request. The default export will be a custom worker constructor:
import MyWorker from './worker?worker'
const worker = new MyWorker()
The worker script can also use import statements instead of importScripts() - note during dev this relies on browser native support and currently only works in Chrome, but for the production build it is compiled away.
By default, the worker script will be emitted as a separate chunk in the production build. If you wish to inline the worker as base64 strings, add the inline query:
import MyWorker from './worker?worker&inline'
If you wish to retrieve the worker as a URL, add the url query:
import MyWorker from './worker?worker&url'
See Worker Options for details on configuring the bundling of all workers.
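For context, the worker file being imported this way can itself use ESM imports, along these lines (the imported module name below is just a placeholder):

// worker.js -- uses an import statement instead of importScripts()
import { fib } from './fib.js'; // placeholder module

self.onmessage = (e) => {
  self.postMessage(fib(e.data));
};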
Update:
I found an MDN page on import that seems like a step in the direction that I am looking for, as well as this MDN page on import.meta that looks a lot like what I am searching for. I tried following that lead, but it didn't help me understand this Vite feature any better.
Is the ?worker query suffix a custom Vite implementation of import.meta?
I'm building an application that uses Vue for our front end, specifically the Vue 3 Composition API + TypeScript. I need the ability to use web workers for long-running processes in the background. I was able to get a vanilla JS web worker to run alongside the Vue application no problem. My issue is that I need my web worker to have access to classes and functions that are written inside the Vue app. Both my web worker and my Vue app need to use the same classes and functions, and I don't want to have to write them in two places. Is there a way to share them? Any help is appreciated.
The answer is yes:
There is a way to let the web worker have access to your Vue.js code through one single JavaScript file.
The library that will help you with this in your Vue.js project is vue-worker. This library uses the simple-web-worker package (the inline web worker approach).
As you may know, a standard web worker requires an extra JS file that contains the worker's code. With inline web workers that extra file isn't necessary. The vue-worker plugin injects a property into Vue (and passes it to every child component), with a default name of $worker. You can think of it as behaving like a JavaScript Promise, but without there really being a second JavaScript file.
Now your turn:
There is really good documentation to follow along with for the approach I described, plus a second, "old-fashioned" way. Try it out :)
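A minimal sketch of what using it looks like, going by the vue-worker README (note this is the Vue 2 plugin style):

import Vue from 'vue';
import VueWorker from 'vue-worker';

Vue.use(VueWorker); // injects this.$worker into every component

// Inside a component: $worker.run() returns a promise that resolves with the
// return value of the function, which executes in an inline worker.
export default {
  methods: {
    heavySum(n) {
      return this.$worker.run((limit) => {
        let total = 0;
        for (let i = 0; i < limit; i++) total += i;
        return total;
      }, [n]);
    }
  }
};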
I was able to figure out a solution using esbuild. I added a step to my npm run serve and npm run build scripts so that node ./esbuild.js runs before the vue-cli commands. This compiles my TypeScript web worker to JavaScript and puts it in the public folder before compiling the Vue app. Now I can create a web worker as normal using:
const worker = new Worker('/worker.js', { type: 'module' })
And here is my esbuild.js file:
const esbuild = require('esbuild')
const { nodeExternalsPlugin } = require('esbuild-node-externals')

const config = {
  entryPoints: ['./src/workers/worker.ts'], // the TypeScript worker source
  outfile: 'public/worker.js',              // emitted where vue-cli serves static assets
  bundle: true,                             // pull the worker's imports into one file
  minify: false,
  plugins: [nodeExternalsPlugin()]          // leave Node built-ins/modules out of the bundle
}

esbuild.build(config).catch(() => process.exit(1))
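The relevant package.json scripts look roughly like this (assuming the default vue-cli-service commands):

"scripts": {
  "serve": "node ./esbuild.js && vue-cli-service serve",
  "build": "node ./esbuild.js && vue-cli-service build"
}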
I'm attempting to make use of this library: https://github.com/MagicTheGathering/mtg-sdk-javascript in an Angular2 application.
Unfortunately, I've been going in circles trying to load it into my application.
Firstly, on the TypeScript side if I import it using:
import { } from 'mtgsdk';
there are no types to load into the {}.
If I attempt to load it using something similar to:
import * as mtg from 'mtgsdk'
I'm unable to because it says that it's unable to find a module named mtgsdk.
I've installed the module using
npm install --save mtgsdk
Also, npm installs work fine for other modules.
The application compiles fine if I load it in using require via something similar to this:
var mtg = require('mtgsdk');
Taking that approach, I'm able to compile and launch but in the browser I get a number of errors about modules that it can't find. I figure they are prerequisites for the sdk that didn't get loaded so I start bringing them in via package.json.
For every one that I bring in, I then have to go to systemjs.config.js and add an entry pointing to the module's entry point and often have to specify a default extension using blocks like this:
Path pointers:
'mtgsdk': 'npm:mtgsdk/lib/index.js',
'request-promise': 'npm:request-promise/lib/rp.js',
'ramda': 'npm:ramda/dist/ramda.js',
'emitter20': 'npm:emitter20/index.js',
'bluebird': 'npm:bluebird/js/browser/bluebird.js',
'request': 'npm:request/index.js'
Default extension:
'request-promise': {
  defaultExtension: 'js'
}
I'm not sure if that's the right approach though because the more dependencies I add, the more that I end up requiring. At one point I had literally gotten up to 50 extra dependencies added because every time I launched, the browser console would find more that were needed.
Is there any easier way to load all of these in?
Also, some of them (such as tough-cookie and request-promise-core) were very problematic to load and I couldn't get the browser console to stop complaining about them. Finally, some of them seemed very basic such as url, http, https. Those seem like they should already be present.
SystemJS was used in earlier versions of Angular 2. However, Angular 2 has since evolved into Angular 4, with new tooling like the Angular CLI.
I recommend you use the Angular CLI, via @angular/cli.
Importing Node modules
Since mtgsdk is a node-module, you can easily import it using
import * as mtg from 'mtgsdk'
However, for your program to compile, you must install a type definition for it or declare one for it in /typings.json, or your app might not build.
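If the package ships no types, one way to keep the compiler happy is an ambient module declaration; the file name below is just an assumption and depends on what your tsconfig includes:

// typings.d.ts (assumed location) -- tells TypeScript the module exists,
// typed as any, so `import * as mtg from 'mtgsdk'` compiles.
declare module 'mtgsdk';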
Importing Client Scripts
For client scripts like firebase.js you won't need to add entries in systemjs.config.js any more.
Using @angular/cli, you can simply add them to the scripts[] array in your angular-cli.json so they are picked up automatically.
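For example, in angular-cli.json (the firebase path is only an illustration):

"apps": [
  {
    "scripts": [
      "../node_modules/firebase/firebase.js"
    ]
  }
]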
Then access them like this
declare const firebase: any;
Here is a quickstart tutorial to set up Angular with @angular/cli.
I'm trying to build a functional test using CasperJS.
casperjs is run by a backend test suite using the following command:
PHANTOMJS_EXECUTABLE=../client/node_modules/phantomjs/bin/phantomjs ../client/ext_modules/casperjs/bin/casperjs test ../client/test/functional/init.coffee
In init.coffee I want to import/include another module (a file that sits right next to it). How do I do that?
The following doesn't work:
require("user")
All I want is to get the contents of another file into init.coffee.
After trying a number of the other suggestions (each expected to work in the context of its corresponding environment), I hit on this solution:
phantom.page.injectJs('script.js');
As of 1.1, CasperJS relies on PhantomJS’ native require():
https://groups.google.com/forum/#!topic/phantomjs/0-DYnNn_6Bs
Injecting dependencies
When injecting additional scripts, CasperJS looks for paths relative to the current working directory
(the place where you run the casperjs command).
We can inject a dependency using the clientScripts option. However, the injected dependencies can't
use require, because they are injected directly into every page that is loaded.
casper.options.clientScripts = ["path/relative/to/cur/dir"]
We can also inject modules using command-line arguments:
casperjs test --includes=foo.js,bar.js path/to/the/test/file
Using require
To import a user module, use:
require "./user-module.coffee"
In user modules you can also use require; there, paths are resolved
relative to the current file (the one where require is called).
If you want to import CasperJS libs from a user module, you need to patch require;
see: https://casperjs.readthedocs.org/en/latest/writing_modules.html
There's a section about that in the docs
var require = patchRequire(global.require);
require('./user');
In your case you should use global.require since you're using CoffeeScript.
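Putting it together, a minimal user module might look like this sketch (names are illustrative):

// user.js -- a CasperJS user module
var require = patchRequire(require); // patch require so CasperJS modules resolve
var utils = require('utils');        // a built-in CasperJS module

exports.greet = function (casper) {
  utils.dump('hello from user module');
  casper.echo('greeting done');
};

// In init.coffee the module is then loaded relative to the current file:
//   user = require './user'
//   user.greet casper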
So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";
requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's besides the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, the r.js build runs great, but I can't run the raw (unbuilt) app from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with a few options:
1. Use almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
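A build-config sketch of that "empty:" scheme (other options elided; file names are assumptions):

// r.js build config (excerpt) -- socket.io is left out of the bundle and
// resolved at runtime from the path set in the main config.
({
  mainConfigFile: 'client.js',
  name: 'app',
  out: 'dist/app.js',
  paths: {
    'socket.io': 'empty:'
  }
})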
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  // Grab the global io that the socket.io <script> tag created, then remove it
  // from the global scope so everything has to go through this shim module.
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
And require it as normal! The build works fine this way.
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.
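A sketch of that fallback idea using RequireJS path fallbacks (the local path is wherever you copy the dist/ file to):

requirejs.config({
  paths: {
    // Try the socket.io server first; if that fails, fall back to a local copy.
    'socket.io': [
      "http://" + window.location.hostname + ":3000/socket.io/socket.io",
      "lib/socket.io"
    ]
  }
});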