My workflow is as follows:
1) I build my front-end using webpack with [chunkhash] in the output file names, so that the generated files get new names whenever their contents change (busting the cache), and I end up with something like:
app.asdf4354234asdfchunkhashname.js
app.asdf4354234asdfchunkhashname.css
index.html
Awesome!
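For reference, a minimal webpack output configuration that produces hashed names like these might look roughly as follows (a sketch, not the actual config from the question; the entry path is illustrative):

// webpack.config.js (sketch)
var path = require('path');

module.exports = {
  entry: './src/app.js', // illustrative entry point
  output: {
    path: path.join(__dirname, '../client'),
    filename: '[name].[chunkhash].js' // e.g. app.asdf4354234asdf.js
  }
  // a plugin such as html-webpack-plugin is typically used to inject
  // the hashed file names into index.html on every build
};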
2) I then have a node express server to serve the above pages and an API, short version:
var express = require('express');
var path = require('path');
var app = express();
app.use(express.static(path.join(__dirname, '../client'))); // where the webpack output is located after the build
...app.listen(...
So great, my JS and CSS have unique names for each version, but the problem is that index.html itself gets cached, so the browser never notices the new JS/CSS files. I don't think renaming index.html and having node redirect requests to newIndex.html is a great solution.
Question: Is anything wrong with my workflow? How do I keep index.html from being cached, or let the browser know there is a new version of my files?
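One common approach (a sketch based on the setup above, not a confirmed fix) is to cache the hashed assets aggressively while telling the browser to always revalidate index.html:

var express = require('express');
var path = require('path');
var app = express();

app.use(express.static(path.join(__dirname, '../client'), {
  maxAge: '1y', // safe for the hashed JS/CSS, since their names change on every build
  setHeaders: function (res, filePath) {
    if (path.basename(filePath) === 'index.html') {
      // index.html keeps a stable name, so force revalidation on every request
      res.setHeader('Cache-Control', 'no-cache');
    }
  }
}));

app.listen(3000);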
I am following the code structure generated by yeoman for angular fullstack.
I want to include a script called core.js in a file called app.html.
<script src="core.js"></script>
I do not see express.static anywhere in this setup for serving static files.
I tried using it, but it did not help.
The server cannot locate the script and returns a 404.
How do I get around this?
This has happened before as well, but back then I could get around it by using express.static and serving files from the location it pointed to.
It did not help this time, though.
Update:
I have app.html in a folder called Music. In the same folder, I have a subfolder called js where I have placed the core.js file that is to be included in app.html. I tried to reference it using both absolute and relative paths, but neither helped and it still gives a 404.
In Angular, the scripts go in the relevant subfolder of /scripts: /controllers, /services, /directives, etc. You then reference them in your HTML like so:
<script src="scripts/controllers/core.js"></script>
As for express.static: Express is a Node.js web framework built on top of Node's HTTP module, so it is the server-side service you create, not part of the browser code. express.static lets that Node server deliver static files from a directory on the server's file system. It does not go in your Angular application.
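If you do want Express itself to serve the Music folder directly, a minimal sketch would be (the folder name comes from the question, the port is illustrative):

var express = require('express');
var path = require('path');
var app = express();

// serve everything under Music/ (including Music/js/core.js) as static files
app.use(express.static(path.join(__dirname, 'Music')));

app.listen(9000);

With that in place, app.html can reference the script as <script src="js/core.js"></script>, since paths are resolved relative to the static root.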
I actually have:
Nginx running to serve static files (css, js, etc.)
Node with Express.js, template engine: ECT (I may switch to Swig)
I am currently trying to find the best way to serve static files under a custom prefix, with versioning:
'http://static.mydomain.com/' in production
'/path/to/static' in devel
So to do that, I only set a variable containing the prefix (which depends on the environment).
Then, for each request, I add the prefix to res.locals in an Express middleware so that the variable is accessible in any HTML template:
this.use(function(req, res, next) {
  res.locals.staticPrefix = staticPrefix;
  next();
});
But since I also want these static files to be cached by the client's browser, Nginx serves them with expires = 30d.
Now, to force a client to re-fetch a static file (when it has changed, for example), I need to append a dynamic query parameter to the static URLs.
My first idea is to set a version variable when the Node.js app starts and append it to the final URL:
var staticVersion = new Date().getTime();
So in the HTML template, the final URL for 'myFile.css' would look like this: staticPrefix + 'myFile.css?' + staticVersion
In this case, I only need to restart the Node.js application when one of the static files has been updated. That changes the URL (since the date changes) and forces the client to request the file again.
Is there a better way to handle this situation with node?
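To make the idea above concrete, here is a sketch of the described setup (variable names are illustrative, and the environment check is an assumption):

// chosen once per process, at startup
var staticPrefix = process.env.NODE_ENV === 'production'
  ? 'http://static.mydomain.com/'   // served by Nginx in production
  : '/path/to/static/';             // local path in development
var staticVersion = new Date().getTime();

// expose both values to every template via res.locals
app.use(function (req, res, next) {
  res.locals.staticPrefix = staticPrefix;
  res.locals.staticVersion = staticVersion;
  next();
});

// a stylesheet URL in a template is then built as:
//   staticPrefix + 'myFile.css?' + staticVersion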
The best way to handle static assets like CSS/JS files is to minify them in production and use file names based on the file contents (a content hash). That way, every time you change anything in the JS/CSS files, the minification step generates a new file name when needed, and clients pick up the new version. You can hook the minification script to run post-deployment.
I have written a package, smush, to help with minification tasks. Head over to its GitHub page for example usage and sample code.
You could use other tools/packages for minification if they suit your use case better.
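As an illustration of content-based naming (independent of any particular minifier), here is a small Node sketch; the file names are hypothetical:

var fs = require('fs');
var path = require('path');
var crypto = require('crypto');

// copy a built file to a name that includes a hash of its contents,
// e.g. app.min.css -> app.min.3f2a9c1b.css
function fingerprint(filePath) {
  var contents = fs.readFileSync(filePath);
  var hash = crypto.createHash('md5').update(contents).digest('hex').slice(0, 8);
  var ext = path.extname(filePath);
  var hashedPath = filePath.slice(0, -ext.length) + '.' + hash + ext;
  fs.writeFileSync(hashedPath, contents);
  return hashedPath;
}

console.log(fingerprint('public/app.min.css'));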
Coming back to your question, you can point Nginx's root at your Node server's static directory (/path/to/node/server/public?). That way Nginx will cache and serve your static files, and the Node server will not be bothered with static assets at all.
Let me know if this makes sense or if you need any further clarification.
I'm surprised I can't google my answer here... it seems no one else is having the issue.
When you run the Meteor server, the JS, HTML, etc. is packaged into the .meteor/local/build folder, but it appears to exclude anything that isn't JS or HTML. I have a folder called "magicsets" and one called "magicimgs", and neither is in the /local/build folder. This is obviously why, when I attempt to use fs to read a file, it fails to find "magicsets/M14.json".
I tried putting the magicsets folder into a folder named "private", but that didn't accomplish anything.
How do I make files accessible locally to my server via fs, and how do I make files publicly accessible via raw URLs?
I'm sure I'm missing something very simple, because there are lots of more complicated questions and answers on SO, yet there is no answer for this. Thanks.
Meteor 0.6.5, which was released yesterday, has a new feature which helps a lot with this.
Make a directory called /private which you can access with the new Assets.getText or Assets.getBinary functions.
Files in the /private directory will then be bundled up into a directory called assets in /program/server/assets. They will not be accessible from the web, and you don't need to worry about using fs either; you can just use Assets.getText instead.
To make a file publicly accessible, put it in /public. So if you had a.jpg at /public/a.jpg, it would be accessible at http://yourdomain.com/a.jpg.
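For the JSON file from the question, a sketch (assuming it has been moved to private/magicsets/M14.json):

// server-side only; asset paths are relative to the private/ directory
var m14 = JSON.parse(Assets.getText('magicsets/M14.json'));
console.log(m14);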
If you want files to be available through the web server (i.e. the server that defaults to port 3000), create a folder called public in the root of the project/app directory and drop your folders and files there. You would then be able to access them at http://localhost:3000/magicsets/M14.json
Update: it looks like you can override the bundler, but it requires changing some of the core code; there's no .meteorignore file yet. Check out this SO answer: https://stackoverflow.com/a/16742853/105282
To serve a directory of files publicly, independent of what Meteor is doing, you can use the following approach. I do this, for example, when I need to link an entire (JavaScript) git repo into my Meteor app so I can work on a checked-out version of the library.
The following works for 0.6.5. It basically serves up a checked-out folder of OpenLayers at /lib:
var connect = Npm.require('connect');
RoutePolicy.declare('/lib', 'network');
WebApp.connectHandlers
  .use(connect.bodyParser())
  .use('/lib', connect.static("/home/mao/projects/openlayers/lib"));
For more information, see https://github.com/meteor/meteor/issues/1229.
When creating static sites I often start a new Rails app. This makes quite a few things easier, like compilation (CoffeeScript, SCSS), minification (JS, CSS), and avoiding browser limitations (the page is served from localhost:3000, so external sources can be loaded, etc.).
At the end I want to export the app so I can put it online; then I just need the HTML+CSS+JS. One could pluck the files out manually, but there is probably an easier way.
So: is there a tool that stores the compiled, minimized HTML+CSS+JS files from a Rails app?
If you basically just want to copy the website as it will be rendered by Rails (and there is no need for server-side code execution), you could simply mirror the Rails site using:
wget --page-requisites --convert-links http://URL-to-Start
However, this will only download those files that are referenced from the entry URL, so you might need to run it on all sub-URLs individually.
Source: Download a working local copy of a webpage
Agree with Screenmutt. I've tried a couple of the ones mentioned but have had most success with:
http://middlemanapp.com/
It does pretty much everything you are asking for and lets you export to static HTML.
install:
gem install middleman
create project:
middleman init my_new_project (or even better, with a template: --template=html5)
run in local server for live edits:
bundle exec middleman
dump static code:
bundle exec middleman build
Perhaps you can 'scrape' the HTML from the localhost serving it?
There seem to be some tools for downloading sites in general... You can probably limit them to download resources from localhost:3000 only.
http://www.httrack.com/
http://www.linuxjournal.com/content/downloading-entire-web-site-wget
UPDATE: Here's another tutorial that might help: Use Rails 3.1 for Static Sites
This is not a common usage. You might be able to extract all the static pages by manually caching everything.
I would recommend taking a look at some alternatives.
I'm sorry that this isn't a good answer, but to be honest... You are using Rails for something that it was never intended to do. There are much better ways of making static sites.
Also, a static site is not an "app". :)
All you have to do is switch Rails to production mode locally so that assets are combined and minified. Then view the source to grab the HTML, CSS, and JS. This should only take a few seconds.
So the steps are:
config.assets.compress = true in development.rb
view the app locally
view source, copy and paste into an index.html file
click on the compressed CSS and JS from source and save those relative to your index.html, making sure they link correctly
You can use Wget (as it's already mentioned). I would go with:
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://www.yourdomain.com
You can also use HTTrack.
When you set up HTTrack, be sure to exclude all external websites with scripts so you don't download, for example, Google Analytics JS files, AdSense, or Facebook scripts. In HTTrack, you exclude them in Preferences with:
-*.googlesyndication.com/* -*.facebook.net/* -*.google-analytics.com/*
After you are done, you still need to rewrite all the links, because they will point at .../some-page/index.html and you need .../some-page/ instead; a dynamic-to-static rewrite script takes care of this.
You shouldn't serve them from Rails or do anything that binds your static files to being served from Rails. You may one day decide to serve your app from a CDN.
JS
One big tip would be to look at using AMD (asynchronous module definition), which lets you specify your JS file dependencies. Then you can use require.js and r.js (a tool that crawls and bundles your dependencies in your precompile step). That would take care of your JS.
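For instance, a minimal AMD module might look like this (the module names are illustrative):

// scripts/cart.js -- declares a dependency on another module
define(['scripts/currency'], function (currency) {
  return {
    total: function (items) {
      var sum = items.reduce(function (acc, item) { return acc + item.price; }, 0);
      return currency.format(sum);
    }
  };
});

r.js can then walk these define() calls at precompile time and concatenate the modules into a single optimized file.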
CSS
For CSS, you could use Sass or Less. At the end of the day you'd include one file on your page, but the compilation process would merge your CSS files together. Once again, this can be done at the precompile step.
CDN
There are gems out there that take your assets and push them over to something like S3; this answer and others like it would help: Is there a way to asset pipeline assets to s3 when pushing to heroku? However, that isn't necessary when you are first starting.
I did it with a Rake task that fetches each of the Rails routes one at a time. It needed a bit of jiggery-pokery to handle the fact that you might have conflicting routes - e.g. wget would fetch /objects as a file called "objects", but then when you fetch /objects/4 it would overwrite that file with a folder called "objects" containing a nested file called "4". So I move each downloaded page to "index.html" inside a directory with the same name.
Here's my code, which I put in lib/tasks/export.rake:
def adjust_paths(path)
  text = File.read(path)
  new_contents = text.gsub(/("|\.\.\/)(assets|packs)\//, "\\1../\\2/")
  new_contents = new_contents.gsub("http://localhost:3020", "")
  File.write(path, new_contents)
end

namespace :static do
  desc 'Generate static site in ./out/ directory'
  task :export => [
    'assets:clean',
    'assets:precompile',
    :start_rails_server
  ] do
    begin
      out_prefix = "dist"
      paths = Rails.application.routes.routes.map do |route|
        route.path.spec.to_s
      end.uniq.reject { |p| p.starts_with?("/rails") || p == "/cable" || p == "/assets" }
      paths = paths.map { |p| p.sub("(.:format)", "") }
      paths.sort_by(&:length).each do |path|
        if path.include?(":id")
          # You'll have to use your own method for deciding which ids to use
          ids = ["1", "2", "3", "4"]
        else
          ids = [""]
        end
        ids.each do |id|
          id_path = path.sub(":id", id)
          `wget -P #{out_prefix} -nH -p -k http://localhost:3020#{id_path}`
          if id_path != "/"
            file_path = "#{out_prefix}#{id_path}"
            FileUtils.mv(file_path, "#{file_path}.tmp", force: true)
            FileUtils.mkdir_p(file_path)
            result = FileUtils.mv("#{file_path}.tmp", "#{file_path}/index.html", force: true)
            puts "Moving #{id_path} to #{id_path}/index.html: #{result}"
            # Will then need to relativise all of the asset paths, since we've moved it
            adjust_paths("#{file_path}/index.html")
          end
        end
      end
    ensure
      # stop the server when we're done
      Rake::Task['static:stop_rails_server'].reenable
      Rake::Task['static:stop_rails_server'].invoke
    end
  end

  desc 'Start a Rails server in the static Rails.env on port 3020'
  task :start_rails_server do
    `RAILS_SERVE_STATIC_FILES=1 RAILS_ENV=static rails s -p 3020 -d`
  end

  desc 'Stop Rails server'
  task :stop_rails_server do
    `cat tmp/pids/server.pid | xargs -I {} kill {}`
  end
end
Then you can just do bundle exec rake static:export
I don't know if this is a limitation of node-static or a bug in my code, but I can't seem to get it to serve files above the current directory. My current directory structure is this:
project
  public
    ...public stuff here...
  system
    core
      server.js
server.js lives in the core directory, making the path to public ../../public - but this code won't work; it returns a 404.
var static = require('node-static');
var http = require('http');

var staticServer = new static.Server('../../public');
var webServer = http.createServer(function (request, response) {
  staticServer.serve(request, response);
});
webServer.listen(appServerConfig.port, appServerConfig.address);
However, if I change the structure to make the public folder live beside server.js and change the code accordingly, it works:
project
  system
    core
      server.js
      public
        ...public stuff here...
var staticServer = new static.Server('./public');
var webServer = http.createServer(function (request, response) {
  staticServer.serve(request, response);
});
webServer.listen(appServerConfig.port, appServerConfig.address);
Why is this so?
Be aware that using relative paths will resolve those paths relative to the current working directory of the Node.js process, that is, the directory you were in when you ran node server.js. So as coded, your code looks OK to me as long as you are in the core directory when you launch node. Are you sure you always launch node from the core directory?
If you want to be independent of the cwd (more robust IMHO), use __dirname to get the absolute directory path of the current file and then tack on your relative paths to that: __dirname + '/../../public'.
Beyond that, you can use fs.realpath to resolve those relative paths to absolute ones. I can't say whether node-static in particular has special code to prevent serving directories outside your project, but most other modules I've seen, such as connect's static middleware, will happily serve any arbitrary directory without special restrictions.
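A minimal sketch of the __dirname approach, assuming the same project layout and the node-static usage from the question (the port is illustrative):

var static = require('node-static');
var http = require('http');
var path = require('path');

// resolve public/ relative to this file, not the process's working directory
var publicDir = path.join(__dirname, '../../public');
var staticServer = new static.Server(publicDir);

http.createServer(function (request, response) {
  staticServer.serve(request, response);
}).listen(8080);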