When creating static apps I often start with a new Rails app. It makes several things easier: compilation (CoffeeScript, SCSS), minification (JS, CSS), and working around browser limitations (the page is served from localhost:3000, so external sources can be loaded, etc.).
In the end I want to export the app so I can put it online; at that point I only need the HTML, CSS and JS. I could pluck the files out manually, but there is probably an easier way.
So: is there a tool that exports the compiled, minified HTML+CSS+JS files from a Rails app?
If you basically just want a copy of the website as Rails renders it (and there is no need for server-side code execution), you can mirror the Rails site with:
wget --page-requisites --convert-links http://URL-to-Start
However, this will only download the files that are referenced from the entry URL, so you may need to run it on each sub-URL individually (see the loop sketched below).
Source: Download a working local copy of a webpage
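If you have more than a handful of pages, a small Ruby loop can drive wget over a list of sub-URLs. A minimal sketch (the paths and the mirror/ output directory are placeholders for your own):

# Mirror each sub-URL individually; system() with a list avoids shell quoting issues.
%w[/ /about /contact].each do |path|
  system("wget", "--page-requisites", "--convert-links",
         "-P", "mirror", "http://localhost:3000#{path}")
end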
I agree with Screenmutt. I've tried a couple of the ones mentioned, but have had the most success with:
http://middlemanapp.com/
Does pretty much everything you are asking for and lets you export to static HTML.
install:
gem install middleman
create project:
middleman init my_new_project (or, even better, with a template: --template=html5)
run in local server for live edits:
bundle exec middleman
dump static code:
bundle exec middleman build
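If you also want the build step to minify, a typical config.rb looks something like this (a sketch; the exact extensions available depend on your Middleman version):

# config.rb -- these settings only apply when running "middleman build"
configure :build do
  activate :minify_css
  activate :minify_javascript
  activate :relative_assets   # optional: makes the exported HTML easier to host anywhere
end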
Perhaps you can 'scrape' the HTML from the localhost server?
There seem to be some tools for downloading sites in general... You can probably limit them to downloading resources from localhost:3000 only.
http://www.httrack.com/
http://www.linuxjournal.com/content/downloading-entire-web-site-wget
UPDATE: Here's another tutorial that might help: Use Rails 3.1 for Static Sites
This is not a common usage. You might be able to extract all the static pages by manually caching everything.
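For example, classic page caching writes each rendered page to public/ as plain HTML that you can then copy out. A rough sketch (the controller and action names are made up; Rails 4+ needs the actionpack-page_caching gem and, depending on its version, config.action_controller.page_cache_directory set):

class PagesController < ApplicationController
  # Each cached action is written to public/ as a static .html file
  caches_page :index, :about
end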
I would recommend taking a look at some alternatives.
I'm sorry that this isn't a good answer, but to be honest... You are using Rails for something that it was never intended to do. There are much better ways of making static sites.
Also, a static site is not an "app". :)
All you have to do is switch Rails to production-style asset settings locally so that assets are combined and minified. Then just view source to grab the HTML, CSS, and JS. This should only take a few seconds.
So the steps are:
config.assets.compress = true in development.rb (see the sketch after these steps)
view the app locally
view source, copy and paste into an index.html file
click the compressed CSS and JS links from the source and save those files relative to your index.html, making sure they link correctly
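A sketch of the relevant settings (option names vary between Rails versions, so treat this as a starting point rather than a recipe):

# config/environments/development.rb
Rails.application.configure do
  # Serve one concatenated file per asset instead of individual source files
  config.assets.debug = false
  # Compress JS/CSS locally; on newer Rails use config.assets.js_compressor
  # and config.assets.css_compressor instead
  config.assets.compress = true
end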
You can use wget (as already mentioned). I would go with:
wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://www.yourdomain.com
You can also use HTTrack.
When you set up HTTrack, be sure to exclude all external script hosts so you don't download e.g. Google Analytics, AdSense or Facebook scripts. In HTTrack you exclude them under Preferences with:
-*.googlesyndication.com/* -*.facebook.net/* -*.google-analytics.com/*
After you are done you still need to rewrite all links, because they will point at .../some-page/index.html and you need .../some-page/ instead. A small dynamic-to-static rewrite script solves this (see the sketch below).
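A minimal Ruby sketch of such a rewrite (the mirror/ directory name is made up):

# Strip trailing "index.html" from local links so /some-page/index.html
# becomes /some-page/ in every downloaded page.
Dir.glob("mirror/**/*.html").each do |file|
  html = File.read(file)
  html.gsub!(%r{(href="[^"]*/)index\.html"}, '\1"')
  File.write(file, html)
end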
You shouldn't serve them from Rails, or do anything that ties your static files to being served from Rails. You may one day decide to serve your app from a CDN.
JS
One big tip would be to look at AMD (asynchronous module definition), which lets you declare your JS file dependencies. You can then use RequireJS together with r.js (a tool that crawls and compiles your dependencies in your precompile step). That would take care of your JS.
CSS
For CSS, you could use Sass or Less. At the end of the day you'd include a single file on your page, but the compilation process merges your CSS files together. Once again, this can be done at the precompile step.
CDN
There are gems out there that take your assets and push them to something like S3; this answer and others like it would help: Is there a way to asset pipeline assets to s3 when pushing to heroku? However, that isn't necessary when you are first starting out.
I did it with a Rake task that fetches each of the Rails routes one at a time. It needed a bit of jiggery-pokery to handle conflicting routes: e.g. wget would fetch /objects as a file called "objects", but then when you fetch /objects/4 it would overwrite that file with a folder called "objects" containing a nested file called "4". So I move each downloaded page to "index.html" inside a directory with the same name.
Here's my code, which I put in lib/tasks/export.rake:
def adjust_paths(path)
  text = File.read(path)
  new_contents = text.gsub(/("|\.\.\/)(assets|packs)\//, "\\1../\\2/")
  new_contents = new_contents.gsub("http://localhost:3020", "")
  File.write(path, new_contents)
end

namespace :static do
  desc 'Generate static site in ./dist/ directory'
  task :export => [
    'assets:clean',
    'assets:precompile',
    :start_rails_server
  ] do
    begin
      out_prefix = "dist"
      paths = Rails.application.routes.routes.map do |route|
        route.path.spec.to_s
      end.uniq.reject { |p| p.starts_with?("/rails") || p == "/cable" || p == "/assets" }
      paths = paths.map { |p| p.sub("(.:format)", "") }
      paths.sort_by(&:length).each do |path|
        if path.include?(":id")
          # You'll have to use your own method for deciding which ids to use
          ids = ["1", "2", "3", "4"]
        else
          ids = [""]
        end
        ids.each do |id|
          id_path = path.sub(":id", id)
          `wget -P #{out_prefix} -nH -p -k http://localhost:3020#{id_path}`
          if id_path != "/"
            file_path = "#{out_prefix}#{id_path}"
            FileUtils.mv(file_path, "#{file_path}.tmp", force: true)
            FileUtils.mkdir_p(file_path)
            result = FileUtils.mv("#{file_path}.tmp", "#{file_path}/index.html", force: true)
            puts "Moving #{id_path} to #{id_path}/index.html: #{result}"
            # Will then need to relativise all of the asset paths, since we've moved it
            adjust_paths("#{file_path}/index.html")
          end
        end
      end
    ensure
      # stop the server when we're done
      Rake::Task['static:stop_rails_server'].reenable
      Rake::Task['static:stop_rails_server'].invoke
    end
  end

  desc 'Start a Rails server in the static Rails.env on port 3020'
  task :start_rails_server do
    `RAILS_SERVE_STATIC_FILES=1 RAILS_ENV=static rails s -p 3020 -d`
  end

  desc 'Stop Rails server'
  task :stop_rails_server do
    `cat tmp/pids/server.pid | xargs -I {} kill {}`
  end
end
Then you can just run bundle exec rake static:export.
Related
I'm trying to follow this tutorial:
https://ampersandjs.com/learn/npm-browserify-and-modules/#npm-browserify-amp-modules
But after installing Browserify I don't see a node_modules/.bin folder.
Instead I see a node_modules/browserify folder. Inside it there is a bin folder, and inside that are cmd.js and args.js.
How should I change this command in my case: ./node_modules/.bin/browserify app.js -o app.bundle.js, so that it compiles all my JS files into one file?
Or maybe I need to install Browserify some other way?
You simply need to point your command prompt at the Browserify node module, so drop the .bin if it isn't there => ./node_modules/browserify yourjsfile.js myjsfile.bundle.js
As far as I understand this guide, the app.js file (or yourjsfile.js) needs to have all the library requirements included in order for it to work.
var squareNumbers = require('./square-numbers');
This means you write this file as the entry point that requires all the scripts you need to bundle.
TIP: try to find a YouTube video or similar to get a better understanding of this guide.
The dot in front of such directories marks them as a 'system' folder; in this case not your operating system's, but another system/application's, like node. It also sorts these folders alphabetically to the top, which makes them easy to distinguish.
I'm trying to integrate Angular into a Rails app. I extended the asset paths with
config.assets.paths << Rails.root.join('node_modules')
and it works well, but my console shows several error messages:
http://localhost:3000/assets/rxjs/Subject.js.map
The Rails app can't handle the .map extension. How do I load these files?
Those are "javascript source map" files. If your Subject.js javascript file has been minimized/uglified, then you can generate and add a "map" file to let browsers know how to "unminimize"/"unuglify" your js file.
If the Subject.js file is written by you, then I guess your environment setup with different node.js and gulp.js modules has it enabled. Make sure those files are copied to your /assets folder as well.
Alternatively, you can disable them by removing the special comment at the end of your JavaScript file:
//# sourceMappingURL=/assets/rxjs/Subject.js.map
Or, less likely, your server might be sending an X-SourceMap header.
I currently have:
Nginx running to serve static files (css, js, etc.)
Node with Express.js; template engine: ECT (I may switch to Swig)
I am currently trying to find the best way to serve static files with a custom prefix and versioning:
'http://static.mydomain.com/' in production
'/path/to/static' in devel
So to do that, I only set a variable containing the prefix (which depends on the environment).
Then, for each request, I add the prefix to locals in an Express middleware so that this variable is accessible in any HTML template:
this.use(function(req, res, next) {
  res.locals.staticPrefix = staticPrefix;
  next();
});
But since I also want these static files to be cached by the client's browser, Nginx serves them with expires 30d.
Now, to force a client to re-fetch a static file (if it has changed, for example), I need to append a dynamic URL parameter to the static URLs.
My first idea would be to set a version variable when I start the nodejs app to append it to the final url:
var staticVersion = new Date().getTime();
So in the HTML template, the final URL for 'myFile.css' would look like this: staticPrefix + 'myFile.css?' + staticVersion
In this case, I only need to restart the Node.js application when one of the static files has been updated. That changes the URL (based on the new date) and makes the client request the file again.
Is there a better way to handle this situation with node?
The best way to handle static assets like CSS/JS files is to minify them in production and to use file names based on the file contents. That way, every time you change anything in your JS/CSS files, the minification step generates a new file name as needed. You can hook the minification script to run post-deployment.
I have written a package, smush, to help with minification tasks. Head over to its GitHub page for example usage and sample code.
You could use other tools/packages for minification if they suit your use case better.
Coming back to your question, you can set the nginx root dir to the static dir of your Node server (/path/to/node/server/public?). This way nginx will cache and serve your static files, and the Node server will not be bothered with serving static assets afterwards.
Let me know if this makes sense or if you need any further clarification.
I have a script that 3rd-party websites are using: /assets/script.js. For obvious reasons, I can't ask them to change the link every time I deploy to point at the latest fingerprinted version of the script. I've had a few caching issues where users still see old versions of /script.js. Is there any way to bust the cache directly for script.js instead of script-9dc5afea3571ba2a883a72b0da0bb623.js?
More information: Rails on Passenger + Nginx. I'm looking for ways to serve the script.js file instead of the fingerprinted file and invalidate the cache on every deployment.
I thought about adding ETags based on the deployment's git revision, but have no idea how to do this. Nginx has no built-in ETag support; there are only old, unsupported third-party modules that do this. I can use add_header Etag="something", but how do I get the git revision in there?
Any other ideas and options?
Thanks!
If you have a script whose name is part of your public interface, then you need to start versioning that script explicitly and keep old versions around for older clients.
e.g. /assets/script.1.0.js, /assets/script.1.1.js, etc.
The key part is that you keep the old versions around, and that the code never changes without the name changing explicitly. The Rails asset pipeline can't do this for you, since it only keeps the very latest version of a script current.
As with all public interfaces, you will need to spend more time on managing this process than you would for an internal-only script.
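As a rough illustration of explicit versioning (every name and path here is hypothetical), a small Rake task could freeze the current script under a new version number and refuse to overwrite anything already published:

# lib/tasks/publish_script.rake
require "fileutils"

namespace :script do
  desc "Publish the current script under an explicit version, e.g. rake script:publish[1.1]"
  task :publish, [:version] do |_t, args|
    version = args[:version] || abort("usage: rake script:publish[1.1]")
    src  = Rails.root.join("app/assets/javascripts/script.js")
    dest = Rails.root.join("public/assets/script.#{version}.js")
    abort "#{dest} already exists -- published versions must never change" if File.exist?(dest)
    FileUtils.cp(src, dest)
    puts "Published #{dest}"
  end
end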
I recommend using an ETag.
Add an ETag header to your response
http://en.wikipedia.org/wiki/HTTP_ETag
Set the ETag header to a different, unique string for each version of your script.
This will make sure browsers get a new version of the script whenever you deploy a new version.
nginx is able to generate etags in the latest version: http://nginx.org/en/docs/http/ngx_http_core_module.html#etag
I've also seen the configuration below here: https://serverfault.com/questions/426260/nginx-cache-control
location /static {
    alias /opt/static/blog/;
    access_log off;
    etags on;
    etag_hash on;
    etag_hash_method md5;
    expires 1d;
    add_header Pragma "public";
    add_header Cache-Control "public, must-revalidate, proxy-revalidate";
}
Following up on the ETag suggestion, you might find this gem useful: bust_rails_etags. It allows you to set a key on each deployment which is used in the generation of ETags; that way, your ETags will change (and the cached script will be invalidated) every time your app is deployed. The author uses the example of Heroku release numbers as a key that changes on each deploy.
What I'm using for updating assets is:
Increment config.assets.version in config/application.rb, like:
# Version of your assets; change this if you want to expire all your assets
config.assets.version = '1.1'
bundle exec rake assets:precompile RAILS_ENV=production RAILS_GROUPS=assets
Restart the app and empty the web server cache, if any.
If you are using Capistrano, you could write a task that copies the script from public/assets to another directory within public (e.g. public/scripts) after your assets are precompiled.
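A rough Capistrano 3 sketch of such a task (the task name, the paths and the deploy:compile_assets hook from capistrano-rails are all assumptions):

namespace :deploy do
  desc "Copy the fingerprinted script to a stable, non-fingerprinted public path"
  task :copy_public_script do
    on roles(:web) do
      within release_path do
        execute :mkdir, "-p", "public/scripts"
        # pick the newest fingerprinted build and give it a stable name
        latest = capture(:ls, "-t", "public/assets/script-*.js").lines.first.strip
        execute :cp, latest, "public/scripts/script.js"
      end
    end
  end
end

after "deploy:compile_assets", "deploy:copy_public_script"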
Well, you can remove the fingerprint from the file:
asset_path('script.js', :digest => false)
Hope it helps.
Or you can use this gem if you want: https://github.com/spohlenz/digestion
But: the Rails asset pipeline now compiles asset files both with and without digests.
So after you generate your assets, you usually get both script-xxxxx.js and script.js in your public/assets folder.
You want a non-fingerprinted asset URL for third-party websites.
For example: assets/public_api.js
There have been plugins and gems that exclude specified assets from fingerprinting. However, Rails has changed the precompilation process in a way that also creates non-fingerprinted files, so this isn't a problem any more. More info here.
How to make sure your clients are loading the latest deployed script, when the asset is not fingerprinted?
I would suggest the solution YouTube uses to expose its API. Basically, all your assets/public_api.js does is inject another script tag into the DOM; that injected tag loads the actual API code. Now your assets/public_api.js becomes assets/public_api.js.erb and looks something like this:
var tag = document.createElement('script');
tag.src = "<%=asset_path('/assets/javascripts/acctual_api')%>";
var firstScriptTag = document.getElementsByTagName('script')[0];
firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);
Please note how tag.src is set to the current fingerprinted path to /assets/javascripts/acctual_api. This way your users will always get the latest compiled acctual_api script.
How do you update the ETag for assets/public_api.js?
I suppose you use Capistrano or a similar recipe-based deployment solution. Maybe you could add a deployment step that updates the server config file before the server is restarted. It should just update:
add_header Etag="update_me_on_deploy"
Please note that you should still use versioned (assets/public_api.0.js) public scripts even with this approach.
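For the deployment step above, a rough Capistrano sketch might look like this (the config path, the task name and the assumption that the deploy user may reload nginx via sudo are all mine):

namespace :deploy do
  desc "Stamp a fresh ETag for the public script into the nginx config"
  task :stamp_etag do
    on roles(:web) do
      stamp = fetch(:release_timestamp, Time.now.utc.strftime("%Y%m%d%H%M%S"))
      # rewrite the Etag value in place, then reload nginx to pick it up
      execute :sed, "-i", %Q('s/Etag="[^"]*"/Etag="#{stamp}"/'), "/etc/nginx/conf.d/public_api.conf"
      execute :sudo, :nginx, "-s", "reload"
    end
  end
end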
I've been looking everywhere for an answer to my questions for the last few hours and couldn't find anything, so I decided to ask.
I followed the installation instructions in the Dajaxice docs and got everything set up exactly the same, but unfortunately my dajaxice.core.js file is not getting parsed, so when I click on the JavaScript link in the page's HTML source it still contains template tags. I included the Dajaxice finder in STATICFILES_FINDERS (actually, I've got everything set up as in the instructions).
I am using the Django 1.4.1 development server for testing at the moment, and the latest Dajaxice version, which is 0.9. Does that make any difference?
Does the order of variables in settings.py matter?
What are the main reasons JavaScript files would not get parsed, and when should they actually be parsed?
Please help, as I would really love to use this app but just can't get it to work.
Thanks in advance.
I advise you to check the STATICFILES_FINDERS setting and other settings related to the django.contrib.staticfiles app. Dajaxice uses a hook in this app to generate the dajaxice.core.js file.
When you use the debug server, this static file is generated on the fly; in a production environment the file is generated when you run the collectstatic command.
In your case it looks like the dajaxice.core.js file is being found by another static finder or served in some other way.
To check this, please run the following command:
python manage.py findstatic dajaxice/dajaxice.core.js
The output should look like:
Found 'dajaxice/dajaxice.core.js' here:
/tmp/tmp9nzeEd
The filename in the tmp dir will be different.
Also, two pitfalls with the collectstatic app:
When you update your ajax.py file to include new dajaxice views, you have to run collectstatic again.
The file is generated in the /tmp/ folder, so if you use the -l option to create links instead of copying files, make sure you don't remove this file by accident.