I have a Chrome extension that includes a complicated function comp_func(data) which consumes a lot of CPU by performing many bitwise operations. Because of that, I'm trying to use WebAssembly.
I've tried to follow several tutorials, for example this one and this one.
The first link says:
fetch('simple.wasm').then(response =>
response.arrayBuffer()
).then(bytes =>
WebAssembly.instantiate(bytes, importObject)
).then(results => {
results.instance.exports.exported_func();
});
but I get an error:
Uncaught (in promise) TypeError: WebAssembly Instantiation: Import #0 module="env" error: module is not an object or function
I've tried this approach many times, but it didn't work. I can't understand how to use WebAssembly that is loaded from a .wasm file.
So I've tried an easier approach:
The second link says to put this line in the html file:
<script src="index.js"></script>
and then just use the exported function:
var result = _roll_dice();
BUT, I'm in an extension so I only have a background.html file.
So I'm looking for a way to access the Module which was loaded in the background file.
And things get complicated, because the function comp_func(data) is called from a Worker.
This is what I've tried so far:
If I call chrome.extension.getBackgroundPage() I can access the Module
but I can't send it to the Worker:
Failed to execute 'postMessage' on 'Worker': # could not be cloned.
And if I try to stringify it first:
Uncaught TypeError: Converting circular structure to JSON
(I tried to remove the circular references; it didn't work...)
And I can't call chrome.extension.getBackgroundPage() from the Worker because I can't access chrome API from there.
So my questions are:
1. Did someone try to load a .wasm file in a Chrome extension and succeed? (The second approach, loading the js file, sounds simpler, so a working example for it would be great.)
or 2. How can I access the Module that has been loaded in background.html (from the second example)?
or 3. How can I pass the functions I need from the js file to the Worker (via postMessage)?
To summarize: has anyone tried to use WebAssembly in a Chrome extension and survived to tell the tale?
EDIT:
I eventually abandoned the WebAssembly approach.
I also posted this question on the Chromium bug tracker, and after a few months got an answer. I'm not sure whether it really works, but maybe it, along with the marked answer, will help someone.
I've been fiddling with WebAssembly recently, and found a way to make it work. Here are the script files:
main.js
chrome.browserAction.onClicked.addListener(function(tab) {
chrome.tabs.executeScript(null, {file: "content_script.js"});
});
content_script.js
var importObject = { imports: { imported_func: arg => console.log(arg) } };
const url = 'data:application/wasm;base64,' + "AGFzbQEAAAABCAJgAX8AYAAAAhkBB2ltcG9ydHMNaW1wb3J0ZWRfZnVuYwAAAwIBAQcRAQ1leHBvcnRlZF9mdW5jAAEKCAEGAEEqEAAL";
WebAssembly.instantiateStreaming(fetch(url), importObject)
.then(obj => obj.instance.exports.exported_func());
The data URL belongs to the common tutorial wasm sample (simple.wasm), which logs 42 to the console.
PS. If it seems like cheating or bad practice to you, this content_script.js also works:
var importObject = {
imports: {
imported_func: function(arg) {
console.log(arg);
}
}
};
var wasmPath = chrome.runtime.getURL("simple.wasm");
fetch(wasmPath).then(response =>
response.arrayBuffer()
).then(bytes =>
WebAssembly.instantiate(bytes, importObject)
).then(results => {
results.instance.exports.exported_func();
});
Only if you include the code files in the web_accessible_resources section in manifest.json, though:
...
"web_accessible_resources": [
"content_script.js",
"main.js",
"simple.wasm"
],
...
Github: https://github.com/inflatablegrade/Extension-with-WASM
It can also be made compatible with Manifest V3.
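Roughly, the Manifest V3 pieces would look like this (an untested sketch: in MV3 web_accessible_resources takes objects with resources/matches, and compiling WebAssembly in extension pages needs 'wasm-unsafe-eval' in the CSP). Note that in MV3 the background page becomes a service worker and chrome.tabs.executeScript is replaced by chrome.scripting.executeScript, so main.js needs adjusting as well.
manifest.json (MV3 excerpt, sketch)
"web_accessible_resources": [
  {
    "resources": ["simple.wasm", "content_script.js", "main.js"],
    "matches": ["<all_urls>"]
  }
],
"content_security_policy": {
  "extension_pages": "script-src 'self' 'wasm-unsafe-eval'; object-src 'self'"
}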
What I tried:
I tried to work around it via if (browser), more specifically if (!browser) { let DailyIframe = await import('@daily-co/daily-js') } in the load function inside <script context="module"> (so the code is always executed on the server), and then pass the result as a prop to a component. However, although it worked on the server, the local dev environment re-runs the load function (which then returns an empty prop because it never imported anything) and overrides DailyIframe's value (might be a bug with Vite/SvelteKit).
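Reconstructed, that first attempt looked roughly like this (a sketch based on the description above, using the pre-1.0 SvelteKit load API; the file layout is assumed):
<script context="module">
  import { browser } from '$app/env'

  export async function load() {
    let DailyIframe = null
    if (!browser) {
      // only import on the server, where `global` exists
      DailyIframe = (await import('@daily-co/daily-js')).default
    }
    // passed as a prop to the component
    return { props: { DailyIframe } }
  }
</script>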
I also tried to import the library in an endpoint, e.g. api.json.js, which is always executed on the server. However, an endpoint has to return JSON, and I can't pass an entire library through it.
After research
It seems to be a combination of problems between Vite, SvelteKit and certain libraries where global is undefined (see: SvelteKit With MongoDB ReferenceError: global is not defined).
But I cannot use that solution of putting it in an endpoint, because I need DailyIframe and the mic audio stream from the client to create a video conference room.
Also, why would certain libraries like Daily (and, judging by other related Stack Overflow posts, MongoDB) throw this error in the first place, while other libraries are safe to use?
Any suggestion is appreciated!
Why?
Vite doesn't include shims for Node builtin variables such as global.
Read these to understand why:
https://github.com/vitejs/vite/issues/728
https://github.com/angular/angular-cli/issues/9827#issuecomment-369578814
In index.html, add:
<script>
var global = global || window;
</script>
then, for example, in App.svelte:
<script>
import { onMount } from 'svelte'
import DailyIframe from '@daily-co/daily-js'
onMount(async () => {
let callObject = DailyIframe.createFrame()
const stream = await navigator.mediaDevices.getUserMedia({ audio: true })
let recorder = new MediaRecorder(stream)
recorder.start()
})
</script>
👉 Demo
https://stackblitz.com/edit/sveltekit-1yn6pz?devtoolsheight=33
(screenshot: logs preview)
I've been googling for hours now, so I thought I'd just ask here. For some reason my require() does not work. RequireJS is included, and as far as I can see my return value should be my data in the exact same order as my JSON file.
Here is my code:
$(document).ready(async function() {
let data = await fetchData('./data/file.json');
console.log(data);
});
// fetch and return data
function fetchData(path) {
return require([path]);
}
I originally had this solution (which worked with a local host but I need it without a host):
function fetchData(path) {
return fetch(path).then(response => {
return response.json().then((data) => {
return data;
}).catch((err) => {
console.log(err);
})
});
}
It gives me several script errors and MIME type mismatches plus it logs this instead of my data:
s(e, t, i)
  arguments: null
  caller: null
  defined: function defined(e)
  isBrowser: true
  length: 3
  name: "s"
  prototype: Object { … }
  specified: function specified(e)
  toUrl: function toUrl(e)
  undef: undef(i)
I don't know what else I should try.
Thank you!
RequireJS is not compatible with Node.js's require method. It is designed for AMD modules, not CommonJS modules, and it does not support loading plain JSON files. This is why your first attempt does not work.
Your second attempt does not work because file system requests (file:// URLs) are treated as cross-origin requests.
The only way to load a JSON file when working from the local filesystem is to have the user select it with an <input type="file"> and then read it with JavaScript.
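A minimal sketch of that approach (the element id is made up):
<input type="file" id="json-picker" accept=".json">

document.getElementById('json-picker').addEventListener('change', async (event) => {
  const file = event.target.files[0];
  if (!file) return;
  const data = JSON.parse(await file.text()); // read the selected file and parse it
  console.log(data);
});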
If you want to read hard-coded JSON then you might consider baking it into your app. The simple way to do that would be to just paste it in as a JS object literal. More complex programs might benefit from using a tool like Webpack (which would need a JSON loader) and pulling it in at build time rather than development time (the aforementioned pasting approach) or run time (which is impossible, as explained above).
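The pasting approach just means turning the file into a small script that is loaded with a <script> tag (file name and contents are illustrative):
// data.js — the contents of file.json pasted in as an object literal
var fileData = {
  "name": "example",
  "items": [1, 2, 3]
};
// index.js can then use fileData directly instead of fetching it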
You have two options:
Use a tool like Webpack to put all your frontend files into the bundle
Download the JSON file from a web server:
I assume you are using jQuery already, so:
$.get('https://petstore.swagger.io/v2/swagger.json').then(function(data) {
console.log(data.swagger);
});
In this case you have to make your JSON file available on your web server:
$.get('/dist/data/file.json').then(function(data) {
console.log(data);
});
I'm attempting to test the "Fast Google Fonts" implementation from Cloudflare's blog, which uses Cloudflare Workers to inline the Google Fonts stylesheet directly into the HTML. The implementation itself seems to work fine when running in the Cloudflare Worker, but when I run my tests I get an error saying TransformStream is not defined.
Error when running my Jest test via npm run test:
ReferenceError: TransformStream is not defined
This is the code that the test is hitting:
async function processHtmlResponse(request, response) {
...
// This is where the reference error comes up
const { readable, writable } = new TransformStream();
const newResponse = new Response(readable, response);
modifyHtmlStream(response.body, writable, request, embedStylesheet);
return newResponse;
}
This test looks something like this, where we basically expect that the stylesheet link will be replaced by the stylesheet itself, containing all the @font-face styles.
describe('inlineGoogleFontsHtml', () => {
it('inlines the response of a Google Fonts stylesheet link within an HTML response', async () => {
const request = new Request('https://example.com')
const response = new Response(
'<html><head><link rel="stylesheet" media="screen" href="https://fonts.googleapis.com/css?family=Lato:300"></head><body></body></html>',
{
headers: {
'Content-Type': 'text/html'
}
}
)
const rewrittenResponse = await processHtmlResponse(request, response)
const rewrittenResponseText = await rewrittenResponse.text()
expect(rewrittenResponseText).toContain('@font-face')
})
})
I'm not really sure what the issue is here. Does TransformStream work in Node? Is there some polyfill that's needed?
Related:
Cloudflare streams
TransformStream is part of the Streams API, a browser-side standard. It is not implemented by Node (because they had their own streams long before this spec existed), so you will need a polyfill when testing your code in Node.
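One way to do that in Jest (a sketch, assuming the web-streams-polyfill package) is to expose the polyfill's classes as globals in a setup file:
// jest.setup.js — wired up via setupFiles: ['./jest.setup.js'] in jest.config.js
const { TransformStream, ReadableStream, WritableStream } = require('web-streams-polyfill/ponyfill');

global.TransformStream = TransformStream;
global.ReadableStream = ReadableStream;
global.WritableStream = WritableStream;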
Incidentally, the example you're following is fairly old. These days, it would be better to use HTMLRewriter to implement this kind of transformation -- it is much more efficient for rewriting HTML specifically. (However, it is a Cloudflare-specific feature, so you wouldn't be able to test it under Node at all.)
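For reference, a rough sketch of what an HTMLRewriter version could look like (the selector and function name are illustrative, not the blog's code):
async function inlineGoogleFonts(response) {
  return new HTMLRewriter()
    .on('link[href*="fonts.googleapis.com"]', {
      async element(element) {
        const href = element.getAttribute('href');
        const css = await fetch(href).then((res) => res.text());
        // swap the <link> tag for an inline <style> block
        element.replace(`<style>${css}</style>`, { html: true });
      },
    })
    .transform(response);
}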
We are building an Electron app that allows users to supply their own 'modules' to run. We are looking for a way to require the modules but then delete or kill the modules if need be.
We have looked at a few tutorials that seem to discuss this topic, but we can't get the modules to fully terminate. We explored this by using timers inside the modules and can observe the timers still running even after the module reference is deleted.
https://repl.it/repls/QuerulousSorrowfulQuery
index.js
// Load module
let Mod = require('./mod.js');
// Call the module function (which starts a setInterval)
Mod();
// Delete the module after 3 seconds
setTimeout(function () {
Mod = null;
delete Mod;
console.log('Deleted!')
}, 3000);
./mod.js
function Mod() {
setInterval(function () {
console.log('Mod log');
}, 1000);
}
module.exports = Mod;
Expected output
Mod log
Mod log
Deleted!
Actual output
Mod log
Mod log
Deleted!
Mod log
...
(continues to log 'Mod log' indefinitely)
Maybe we are overthinking it and maybe the modules won't be memory hogs, but the modules we load will have very intensive workloads and having the ability to stop them at will seems important.
Edit with real use-case
This is how we are currently using this technique. The two issues are loading the module in the proper fashion and unloading the module after it is done.
renderer.js (runs in a browser context with access to document, etc)
const webview = document.getElementById('webview'); // A webview object essentially gives us control over a webpage similar to how one can control an iframe in a regular browser.
const url = 'https://ourserver.com/module.js';
let mod;
request({
method: 'get',
url: url,
}, function (err, httpResponse, body) {
if (!err) {
mod = requireFromString(body, url); // Module is loaded
mod(webview); // Module is run
// ...
// Some time later, the module needs to be 'unloaded'.
// We are currently 'unloading' it by dereferencing the 'mod' variable, but as mentioned above, this doesn't really work.
// We would like a way to wipe the module, its timers, etc. and free up any memory or resources it was using!
mod = null;
delete mod;
}
})
function requireFromString(src, filename) {
var Module = module.constructor;
var m = new Module();
m._compile(src, filename);
return m.exports;
}
https://ourserver.com/module.js
// This code module will only have access to node modules that are packaged with our app but that is OK for now!
let _ = require('lodash');
let obj = {
key: 'value'
}
async function main(webview) {
console.log(_.get(obj, 'key')) // prints 'value'
webview.loadURL('https://google.com') // loads Google in the web browser
}
module.exports = main;
Just in case anyone reading is not familiar with Electron: renderer.js has access to 'webview' elements, which are almost identical to iframes. This is why passing one to 'module.js' allows the module to access and manipulate the webpage, such as changing the URL, clicking buttons on that page, etc.
There is no way to kill a module and stop or close the resources it is using. That's just not a feature of node.js. Such a module could have timers, open files, open sockets, running servers, etc. In addition, node.js does not provide a means of "unloading" code that was once loaded.
You can remove a module from the module cache, but that doesn't affect the existing, already loaded code or its resources.
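For example, with the mod.js from the question:
// removing mod.js from the cache only affects future require() calls;
// the interval started by the earlier Mod() call keeps running
delete require.cache[require.resolve('./mod.js')];
const ModAgain = require('./mod.js'); // re-reads the file, old timer untouched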
The only foolproof way I know of would be to load the user's module in a separate node.js app run as a child process; then you can exit or kill that process, and the OS will reclaim any resources it was using and unload everything from memory. This child-process scheme also has the advantage that the user's code is more isolated from your main server code. You could isolate it even further by running that other process in a VM if you wanted to.
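A minimal sketch of that idea (file names and the message protocol are made up for illustration):
// parent.js — load the user's code in its own process so it can be killed cleanly
const { fork } = require('child_process');

const child = fork('./run-user-module.js');
child.send({ cmd: 'load', path: '/path/to/user-module.js' });

// later, when the module should be "unloaded":
setTimeout(() => {
  child.kill(); // the OS reclaims timers, sockets and memory along with the process
  console.log('User module terminated');
}, 3000);

// run-user-module.js — the child side
process.on('message', ({ cmd, path }) => {
  if (cmd === 'load') {
    const mod = require(path);
    mod(); // any setInterval the module starts lives only in this process
  }
});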
I'm planning on using a somewhat more sophisticated set of conventions to import assets in my webpack project. So I'm trying to write a plugin that rewrites parts of requested module locators and then passes them down the resolver waterfall.
Let's assume we just want to
check if a requested module starts with the # character and
if so, replace that with ./lib/. The new module locator should now be looked up by the default resolver.
This means when a file /var/www/source.js does require("#example"), it should then actually get /var/www/lib/example.js.
So far I've figured out I'm apparently supposed to use the module event hook for this purpose. That's also the way chosen by other answers which unfortunately did not help me too much.
So this is my take on the custom resolve plugin, it's pretty straightforward:
function MyResolver () {}
MyResolver.prototype.apply = function (compiler) {
compiler.plugin('module', function (init, callback) {
// Check if rewrite is necessary
if (init.request.startsWith('#')) {
// Create a new payload
const modified = Object.assign({}, init, {
request: './lib/' + init.request.slice(1)
})
// Continue the waterfall with modified payload
callback(null, modified)
} else {
// Continue the waterfall with original payload
callback(null, init)
}
})
}
However, using this (in resolve.plugins) doesn't work. Running webpack, I get the following error:
ERROR in .
Module build failed: Error: EISDIR: illegal operation on a directory, read
# ./source.js 1:0-30
Apparently, this is not the way to do things. But since I couldn't find much example material out there on the matter, I'm a little bit out of ideas.
To make this easier to reproduce, I've put this exact configuration into a GitHub repo. So if you're interested in helping, you may just fetch it:
git clone https://github.com/Loilo/webpack-custom-resolver.git
Then just run npm install and npm run webpack to see the error.
Update: Note that the plugin architecture changed significantly in webpack 4. The code below will no longer work on current webpack versions.
If you're interested in a webpack 4 compliant version, leave a comment and I'll add it to this answer.
I've found the solution; it was mainly triggered by reading the short doResolve() line in the docs.
The solution was a multiple-step process:
1. Running callback() is not sufficient to continue the waterfall.
To pass the resolving task back to webpack, I needed to replace
callback(null, modified)
with
this.doResolve(
'resolve',
modified,
`Looking up ${modified.request}`,
callback
)
(2. Fix the webpack documentation)
The docs were missing the third parameter (message) of the doResolve() method, resulting in an error when using the code as shown there. That's why I had given up on the doResolve() method when I found it before putting the question up on SO.
I've made a pull request, the docs should be fixed shortly.
3. Don't use Object.assign()
It seems that the original request object (named init in the question) must not be duplicated via Object.assign() to be passed on to the resolver.
Apparently it contains internal information that tricks the resolver into looking up the wrong paths.
So this line
const modified = Object.assign({}, init, {
request: './lib/' + init.request.slice(1)
})
needs to be replaced by this:
const modified = {
path: init.path,
request: './lib/' + init.request.slice(1),
query: init.query,
directory: init.directory
}
That's it. To see it a bit clearer, here's the whole MyResolver plugin from above now working with the mentioned modifications:
function MyResolver () {}
MyResolver.prototype.apply = function (compiler) {
compiler.plugin('module', function (init, callback) {
// Check if rewrite is necessary
if (init.request.startsWith('#')) {
// Create a new payload
const modified = {
path: init.path,
request: './lib/' + init.request.slice(1),
query: init.query,
directory: init.directory
}
// Continue the waterfall with modified payload
this.doResolve(
// "resolve" just re-runs the whole resolving of this module,
// but this time with our modified request.
'resolve',
modified,
`Looking up ${modified.request}`,
callback
)
} else {
this.doResolve(
// Using "resolve" here would cause an infinite recursion,
// use an array of the possibilities instead.
[ 'module', 'file', 'directory' ],
init,
`Looking up ${init.request}`,
callback
)
}
})
}
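For completeness, the resolver plugin is registered via resolve.plugins in the webpack configuration (entry/output values below are placeholders; this matches the webpack 2/3-era API used above):
// webpack.config.js
module.exports = {
  entry: './source.js',
  output: { filename: 'bundle.js' },
  resolve: {
    plugins: [ new MyResolver() ]
  }
}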