I'm building a chrome extension with vue cli 3. I've got the basics working well, but I was hoping to also run my content and background javascript through the build process instead of just putting them in my public folder and having them copied into dist. This is mostly just so I can use import/export to clean up my file structure.
I was able to add them as new "pages" in the vue config and even without the html template file, they get built properly and moved into dist.
The problem is, they then get the cache busting string appended to their filename so I'm unable to reference them in the extension manifest. For example, background.js becomes background.d8f9c902.js
Is it possible to tell the vue config that certain "pages" should not get the cache busting? The documentation here does not seem to expose that as a parameter.
Thanks in advance!
Filename hashing can be disabled for all files:
https://cli.vuejs.org/config/#filenamehashing
It works in my case using the vue.config.js below:
// vue.config.js
module.exports = {
  lintOnSave: true,
  filenameHashing: false
}
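For completeness, here's a sketch combining the extra "pages" from the question with hashing disabled (the entry paths are assumptions; adjust them to your project layout):

// vue.config.js
module.exports = {
  // emit background.js / content.js without a hash,
  // so the extension manifest can reference them by name
  filenameHashing: false,
  pages: {
    // the regular app page
    index: 'src/main.js',
    // content and background scripts built as extra "pages";
    // they build fine without an HTML template file
    background: { entry: 'src/background.js' },
    content: { entry: 'src/content.js' }
  }
};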
Related
I have a fairly simple webpack setup with a bit of a twist. I can think of a few different ways to create my intended behavior, but I'm wondering if there are better approaches (I'm still fairly new to webpack).
I have a basic TypeScript/Scss application and all of the src files exist in a src directory. I also have a component library bundle that's dynamically generated through a separate build process (triggered via Node). This bundle also ends up in the src directory (it contains some sass variables and a few other assets that belong in src). This is src/custom-library-bundle. My current webpack setup moves the appropriate bundle files to a public (dist) directory via the CopyWebpackPlugin. My webpack dev server watches for changes and reloads as expected. This all works beautifully.
Now for the twist. I've set up a custom watcher that exists elsewhere to watch for the custom component library bundle changes. When a change occurs there, it triggers that custom build process mentioned above. This (as expected) modifies the src/custom-library-bundle directory and sets off several watches/reloads as the bundle is populated. Technically it works? And the behavior is expected, but ideally, I could tell it to wait until the custom installation work is "done" and only then have it trigger the compilation/reload.
Phew. This isn't an easy one to explain. Here's a webpack config that will (hopefully) help clarify:
devServer: {
  contentBase: path.join(__dirname, 'public'),
  port: 9000,
  host: '127.0.0.1',
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
    });
  },
  watchOptions: {
    ignored: [
      /node_modules/
      // I've been experimenting with the ignored feature a bit
      // path.resolve(__dirname, 'src/custom-library-bundle/')
    ]
  }
}
Ideal Approach: In my most ideal scenario, I want to just manually trigger webpack dev server to do its thing in my custom watch callback; have it ignore the src/custom-library-bundle until I tell it to pay attention. I can't seem to find a way to do this, however. Is this possible?
Alternate Approach #1: I could ignore the src/custom-library-bundle directory, move the updated files to public (not utilizing the webpack approach), and then trigger just a reload when I know that's complete. I don't love this because I want to utilize the same process whether I'm watching or just doing a one-off build (I want everything to end up in the public directory because webpack did that work, not because I wrote a script to put it there under specific scenarios). But, let's say I get over that, how can I trigger a browser reload for the dev server? Is this possible?
Alternate Approach #2: This is the one I'm leaning towards, but it feels like extra, unnecessary work. I could have my custom build process output to a different directory (one that my webpack setup doesn't care about at all). Once the build process is complete, I could move all the files to src/custom-library-bundle, where the watch would pick up one change and do a single compilation/reload. This gets me so close, but it feels like I'm adding a step I don't want to.
Alternate Approach #3? Can you think of a better way to solve this?
Update (including versions):
webpack 4.26.1
webpack-dev-server 3.1.14
Add the following server.sockWrite call to your after method:
devServer: {
  after: (app, server) => {
    new MyCustomWatcherForOtherThings().watch(() => {
      // invoked after the src/custom-library-bundle is done doing its thing (each time)
      // now that I know it's done, I want to trigger the normal compilation/reload
      // calling this on `server` triggers a full page refresh
      server.sockWrite(server.sockets, "content-changed");
    });
  }
}
I've never found this in the documentation, but one of webpack's core devs mentioned it in a comment on GitHub, so it's sort of sanctioned.
Useful things that webpack provides that come to mind are multi-compiler builds, creating a child compiler from a Compiler or Compilation instance, the DllPlugin, and programmatically managing the compiler by calling watch() or run() on a compiler instance.
The multi-compiler setup is useful when you want to build several compilations in a single run whose outputs are independent of each other.
Child compilers allow you to set up dependencies between compilers; using hooks, they let you block the parent until, say, the child compilation has finished emitting the latest bundle into the assets.
The DllPlugin allows you to create a compilation and its manifest; it can produce chunks containing modules that can be used as dependencies for compilations that have not been built yet (the manifest needs to be produced and passed to them beforehand).
Programmatically managing your compiler lets you write a straightforward Node.js script that can accomplish most of this manually.
If I understood correctly, your webpack compiler doesn't really have any dependencies on your bundle aside from expecting it to have something added to the output assets, so you might consider doing a multi-compiler build. If that doesn't work for you, you can write a simple plugin that creates a child compiler that watches all the component bundle dependencies and emits the built bundle into the main compilation assets on changes. Ultimately, as you mentioned yourself, you could write a simple script and programmatically weave everything together. All of these approaches offload tracking build dependencies to webpack, so if you put the compilers into watch mode webpack will keep track when something needs updating.
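As a sketch of that last option, here is roughly what programmatically managing the compiler could look like with the webpack 4 Node API (the config path and the coordination with your custom watcher are assumptions):

// build.js – drive webpack from a plain Node script
const webpack = require('webpack');
const config = require('./webpack.config.js');

const compiler = webpack(config);

// Put the compiler into watch mode, ignoring the custom bundle directory
// so its intermediate states never trigger a rebuild.
const watching = compiler.watch(
  { ignored: [/node_modules/, /src\/custom-library-bundle/] },
  (err, stats) => {
    if (err) return console.error(err);
    console.log(stats.toString({ colors: true }));
  }
);

// MyCustomWatcherForOtherThings is the asker's own watcher; once it reports
// that the bundle is fully rebuilt, force a single recompile.
new MyCustomWatcherForOtherThings().watch(() => {
  watching.invalidate();
});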
If you're interested in taking a closer look at how child compilers are created and used, I would heartily recommend reading through the sources of html-webpack-plugin. It doesn't seem like the plugin handles the same kind of build setup as you have, but it's worth noting that the HTML plugin works with files that aren't part of the build dependencies (nothing in the sources references or depends on the files/templates that are used for creating the HTML files that are added to the output).
I'm using Grunt to build the Durandal starter kit pro package.
It all works fine, except for one tiny detail. I would like to exclude one file (app-config below) from the optimizer and keep it as a non-minified file when my build is done.
Based on other SO thread suggestions, I'm currently excluding it using empty:, which removes it from the optimized file as expected. However, when I open the built project I get an error in the console:
Uncaught Error: main missing app-config
options: {
  name: '../lib/require/almond-custom',
  baseUrl: requireConfig.baseUrl,
  mainPath: 'app/main',
  paths: mixIn({ }, requireConfig.paths, {
    'almond': 'lib/require/almond-custom',
    'app-config': 'empty:'
  }),
  optimize: 'none',
  out: 'build/app/main.js',
  preserveLicenseComments: false
}
Is almond the problem? I tried switching it to the full requirejs using include: ['path/to/require'], without success.
If you want to reproduce it locally you can either download the starter kit from the above link, or use a slightly configured version which is closer to my example. Just run npm install in the folder and you're all set.
I downloaded your source code and performed the following steps:
Extracted the zip file, opened a command prompt, and changed the directory to the folder.
Ran npm install to install all the dependencies.
Ran grunt to start building the project.
When I opened http://localhost:8999/, I saw the alert 1, which comes from the alert(appConfig.foo); call in your main.js.
After clicking OK to dismiss the alert, the web page works fine, so I'm not sure how you are running into this issue. Can you provide any more details?
From the Durandal issues referenced in this particular link:
grunt-durandal
The main module controls the implementation of the Durandal services; the link can be found in main.js.
There you can see the call to system.debug(true). You can remove it, as described in the post here in the documentation. The function, as quoted in the article, "overrides request execution timeout making it effectively infinite".
Also, when using uglify in grunt, debug is set to false as per the documentation.
So, as per the documentation, you need to set system.debug(false).
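For reference, a minimal sketch of the relevant change in main.js:

// main.js (Durandal starter kit)
// Turn Durandal's debug logging off before building for production:
system.debug(false); // or remove the system.debug(true) call entirely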
Hope this might help a bit.
try:
....
paths: mixIn({ }, requireConfig.paths, {
  'almond': ['lib/require/almond-custom', '!lib/require/almond-custom/app-config.js']
}),
....
Just note that the second path, for app-config.js, needs to be correct. I think you should be able to find your way from here; the above is a hint, if not a direct solution.
I recently started exploring Browserify for bundling Node modules and using them in the browser. It's neat and works great; however, I want to improve the workflow. In my use case, I have a script.js file that requires node modules like Cylon etc.
For brevity, script.js looks something like:
"use strict";
var Cylon = require('cylon');
Cylon.robot({
name: "BrowserBot",
connections: {
arduino: { adaptor: 'firmata', port: '/dev/tty.usbmodem1411' }
},
devices: {
led: { driver: 'led', pin: 8 }
},
work: function(my) {
Cylon.Logger.info("Hi, my name is " + my.name)
every((2).seconds(), function() {
Cylon.Logger.info("Toggling the LED");
my.led.toggle();
});
}
});
Cylon.start();
I was looking at the bundle.js file that browserify generates, and I could find the exact code block mentioned above; I think a node process is started with this code and some bindings. I want the script.js file to be dynamic, to allow the user to use a different pin for an LED or make any other small change for that matter. Since I am not changing any dependencies for this file, I should just be able to replace that block in bundle.js with the new contents of the script.js file, as the other modules are already loaded and bundled in bundle.js, right?
I want to know if this is possible in a browser setting. Chrome Apps allow file storage, so is it possible for me to generate bundle.js dynamically after its initial creation, where I just plug in the contents of script.js and load bundle.js in an HTML file? How do I go about this?
While the question is not specific to Cylon, I am still adding it as a tag for my specific use case.
All the .js files should be specified in the app's manifest.json. I don't think you can edit items in the app's folder (even when accessing it through file storage).
I'd like to be able to install Javascript dependencies through bower and use them in a sails.js app, but I can't figure out a way to do this without just copying and pasting files from the bower_components folder to the Sails assets folder.
Ideally I think I'd like to use requirejs and point to the bower components in the main.js file. I may be trying to pound a square peg into a round hole; please let me know if so. Any thoughts on managing components/libraries with Sails are welcome.
In Sails 0.10 the 'assets/linker' directory no longer exists; however, I found a lead on a simple solution that adds some automation to the bower -> asset linker workflow while also allowing granular control over what exactly ends up getting linked.
The solution is adding grunt-bower to your Sails.js compileAssets task:
grunt.registerTask('compileAssets', [
  'clean:dev',
  'bower:dev',
  'jst:dev',
  'less:dev',
  'copy:dev',
  'coffee:dev'
]);
Then configure your grunt-bower task like so (as tasks/config/bower.js):
module.exports = function(grunt) {
  grunt.config.set('bower', {
    dev: {
      dest: '.tmp/public',
      js_dest: '.tmp/public/js',
      css_dest: '.tmp/public/styles'
    }
  });

  grunt.loadNpmTasks('grunt-bower');
};
This will automatically copy your bower js and css files to the proper place for Sails' asset linker to find and add to your project's layout template. Any other js or css files will still be added after your bower files.
However, this is still no silver bullet, as this setup has two big caveats:
1 - The files that are added through grunt-bower have to be listed in each package's bower.json main array. If a file you expect isn't being loaded, you must either edit that package's bower.json file OR add the dependency manually through grunt-bower's packageSpecific options (see the sketch after this list).
2 - The order of bower files in the asset linker is currently alphabetical. My only recourse for adjusting this order so far is tinkering with an additional grunt task that manually re-orders files prior to the rest of Sails' compileAssets task. However, I'm confident this is something grunt-bower could address by supporting package copy ordering.
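For the first caveat, here is a sketch of a packageSpecific override (the package name and file paths are hypothetical; check grunt-bower's README for the exact fields):

// tasks/config/bower.js – force specific files for one package
module.exports = function(grunt) {
  grunt.config.set('bower', {
    dev: {
      dest: '.tmp/public',
      js_dest: '.tmp/public/js',
      css_dest: '.tmp/public/styles',
      options: {
        packageSpecific: {
          // hypothetical package whose bower.json "main" array is incomplete
          'some-package': {
            files: ['dist/some-package.js', 'dist/some-package.css']
          }
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-bower');
};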
Note: the following answer is no longer completely relevant to the current version of SailsJS because there is no support for the linker folder as of SailsJS 0.10.
See: Sails not generating linker
Original answer:
I was able to figure out a solution for this, which is actually pretty simple. I had not realized you could configure where bower places its files.
Create a .bowerrc file and change the directory where bower components are installed; in the case of SailsJS they should be put into the assets folder.
/*
 * Create a file called .bowerrc and put the following in it.
 * This file should be in the root directory of the sails app.
 */
{
  "directory": "assets/linker/bower_components"
}
Sails will then use grunt to copy them to the .tmp/public/assets folder whenever a file is changed. If you don't wish to have sails continually deleting and then recopying those files, you can exclude them in the Gruntfile.
/*
 * This is not necessary, but if you have a lot of components and don't want
 * them constantly being deleted and copied at every file change you can update
 * your Gruntfile.js with the below.
 */
clean: {
  dev: ['.tmp/public/**',
        '!.tmp/public',
        '!.tmp/public/bower_components/**'],
  build: ['www']
},
One tip on using requirejs with sails: by default you will get an error from the socket.io file, since sails loads it without using requirejs while socket.io supports AMD-style loading. More details here: http://requirejs.org/docs/errors.html#mismatch.
The simplest way to fix this is to just comment out the lines near the end of socket.io.js.
/*
 * Comment the below out in the file assets/js/socket.io.js, if using requirejs
 * and you don't want to modify the default sails setup or socket.io.
 */
if (typeof define === "function" && define.amd) {
  define([], function () { return io; });
}
The other way would be to recode the sails files in assets/js named "socket.io.js", "sails.io.js", and "app.js" to be AMD modules.
The simplest way I've found of achieving this is to add the individual Bower components to your tasks/pipeline.js file. For example, here's how you might add Modernizr:
// Client-side javascript files to inject in order
// (uses Grunt-style wildcard/glob/splat expressions)
var jsFilesToInject = [

  // Load sails.io before everything else
  'js/dependencies/sails.io.js',

  // Dependencies like jQuery, or Angular are brought in here
  'js/dependencies/**/*.js',

  // =========================================================
  // Modernizr:
  'bower_components/modernizr/modernizr.js',
  // =========================================================

  // All of the rest of your client-side js files
  // will be injected here in no particular order.
  'js/**/*.js'
];
A link to bower_components/modernizr/modernizr.js then gets inserted in the <!--SCRIPTS--> placeholder in your layout template, and it also gets minified, etc, when you run sails lift --prod.
Sorry for my late reply.
I think including bower_components in the linker is a bad idea, because when you lift sails, everything in it will be copied into .tmp and included in script tags.
For example, if you have installed Angular with bower, you will get two inclusions in your markup: one for angular.js and one for angular.min.js. Having both is a mistake.
Here is the solution I use on my projects:
I created a folder bower_components in my root directory.
I added this line to my Gruntfile.js, in the files array of copy:dev:
{ '.tmp/public/linker/js/angular.js': './bower_components/angular/angular.js' }
And for each file I want to include, I have to manually add a new line to this array. I haven't found another automatic solution. If someone finds a better one... ;)
There is more than one approach to hooking up SailsJS and Bower.
A package for SailsJS that integrates Bower via a custom generator exists here:
https://www.npmjs.com/package/sails-generate-bower
There is one for Gulp as well.
So, I have an app that is using requireJS. Quite happily. For the most part.
This app makes use of Socket.IO. Socket.IO is being provided by nodejs, and does not run on the same port as the main webserver.
To deal with this, in our main js file, we do something like this:
var hostname = window.location.hostname;
var socketIoPath = "http://" + hostname + ":3000/socket.io/socket.io";

requirejs.config({
  baseUrl: "/",
  paths: {
    app: "scripts/appapp",
    "socket.io": socketIoPath
  }
});
More complicated than this, but you get the gist.
Now, in interactive mode, this works swimmingly.
The ugliness starts when we try to use r.js to compile this (technically we're using grunt to run r.js, but that's beside the point).
In the config for r.js, we set an empty path for socket.io (to avoid it failing to pull in), and we set our main file as the mainConfigFile.
The compiler yells about this, saying:
Running "requirejs:dist" (requirejs) task
>> Error: Error: The config in mainConfigFile /…/client.js cannot be used because it cannot be evaluated correctly while running in the optimizer. Try only using a config that is also valid JSON, or do not use mainConfigFile and instead copy the config values needed into a build file or command line arguments given to the optimizer.
>> at Function.build.createConfig (/…/r.js:23636:23)
Now, as near as I can figure, this is due to the fact that I'm using a variable to set the path for "socket.io". If I take this out, require runs great, but I can't run the raw (unbuilt) app from a server. If I leave it in, my debug server is happy, but the build breaks.
Is there a way that I can lazily assign the path of "socket.io" at runtime so that it doesn't have to go into the requirejs.config() method at that point?
Edit: Did some extensive research on this. Here are the results.
Loading from CDN with RequireJS is possible with a build. However, if you're using the smaller Almond loader, it's not possible.
This leaves you with three options:
1. Use Almond along with a local copy of the file in your build.
2. Use the full require.js loader and try to use a CDN.
3. Use a <script> tag just for that resource.
I say try for #2 because there are some caveats. You'll need to include require.js in your HTML with the data-main attribute for your built file. But if you do this, require and define will be global functions, allowing users to require any of your internal modules and mess around with them. If you're okay with this, you'll need to follow the "empty: scheme" in your build config (but not in your main config).
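For reference, the "empty:" scheme in an r.js build config looks roughly like this (a sketch; the file and module names are assumptions):

// build.js – r.js build config
({
  mainConfigFile: 'client.js',
  name: 'app',
  out: 'dist/client.built.js',
  paths: {
    // don't bundle socket.io; it is loaded from the CDN at runtime
    'socket.io': 'empty:'
  }
})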
But the fact remains that you now have another HTTP request. If you only want one built file, which includes the require.js loader, you'll need to optimize for only one file.
Now, if you want to avoid users being able to require your modules, you'll have to do something like wrap:true in your build. But as far as I can tell, once your module comes down from CDN, if it's AMD, it's going to look for a global define function to register itself with, and that won't exist because it's now wrapped in a closure.
The lesson I took away from all this: inline your resources to your build. It makes sense. You reduce HTTP requests, minify it all and get gzip compression. You don't expose your modules to the world and everything is a lot simpler. If you cache your resources properly you won't even need to worry about it.
But since new versions of socket.io don't like AMD, here's how I did it. Make sure to include the socket.io <script> tag before requirejs. Then create a requirejs module named socket.io with the following contents:
define([], function () {
  // capture the global `io` created by the socket.io <script> tag,
  // then remove it from the global scope
  var io = window.io;
  window.io = null;
  return io;
});
Set the path like so: 'socket.io': 'core/socket.io' or wherever you want.
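In config form, that's something like (a sketch, using the example path above):

requirejs.config({
  paths: {
    // point the module id at the local shim created above
    'socket.io': 'core/socket.io'
  }
});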
And require it as normal! The build works fine this way.
Original answer
Is it possible that you could make use of the path config fallbacks specified in the RequireJS API? Maybe you could save the file locally as a fallback so your build will work.
The socket.io GitHub repository specifies that you can serve the client with the files in the socket.io-client package's dist/ directory.