I am hosting a widget on another domain than the site in which I am embedding the widget.
The dashboard.js loads fine, but the request for the HTML template fails with:
XMLHttpRequest cannot load http://192.168.2.72:8081/widgets/templates/dashboard.html. Origin http://192.168.2.72:8080 is not allowed by Access-Control-Allow-Origin.
The url to the template is correct, so I can only assume this is a cross domain error. In the widget, the template is referred to like:
templatePath: dojo.moduleUrl("monitor/dashboard", "../templates/dashboard.html"),
This all works when it's a local widget. Is there any way to get Dojo to load the HTML template in a better way?
This is how I have defined the loader:
<script data-dojo-config="async: 0, dojoBlankHtmlUrl: '/blank.html', parseOnLoad:true,
packages: [
{name: 'monitor', location: 'http://192.168.2.72:8081' + '/widgets'},
]"
src="/media/scripts/dojo/dojo/dojo.js"></script>
Well, there are several ways to solve it.
The first solution is a server-side one: using CORS (Cross-Origin Resource Sharing). If you can set the CORS headers like:
Access-Control-Allow-Origin: *
Your browser will detect this and will allow the XMLHttpRequest.
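For example, if the server hosting the widgets on port 8081 happens to be a Node/Express app (an assumption; any server or proxy that can set response headers will do), a minimal sketch could look like this:

var express = require('express');
var app = express();

// Allow any origin to read these responses; tighten this in production.
app.use(function (req, res, next) {
  res.setHeader('Access-Control-Allow-Origin', '*');
  next();
});

// Serve the widget modules and their HTML templates.
app.use('/widgets', express.static('widgets'));

app.listen(8081);

With that header in place, the template XHR from the page on port 8080 is no longer blocked.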
While this solution is probably the best, there are alternatives, for example JSONP (e.g. with dojo/request/script). However, using JSONP also means that you cannot use a plain HTML template; you have to convert your HTML template to a JavaScript string.
If you then use the templateString property, you can pass the template as a string instead of specifying a path.
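As a rough sketch (the widget markup here is made up), a templated widget with an inlined string could look like:

define([
  "dojo/_base/declare",
  "dijit/_WidgetBase",
  "dijit/_TemplatedMixin"
], function (declare, _WidgetBase, _TemplatedMixin) {
  // The template is embedded as a string, so no cross-domain XHR is needed.
  return declare([_WidgetBase, _TemplatedMixin], {
    templateString: '<div class="dashboard"><h1 data-dojo-attach-point="titleNode"></h1></div>'
  });
});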
The templateString property also lends itself to a build step: if you can generate your templates as JavaScript strings, you can automate that, for example with Grunt and the grunt-html-convert task.
You might be able to do a similar thing with the Dojo build system by using depsScan. This build transform should scan modules and convert legacy code to AMD, and it should also look for things like dojo.cache(), dojo.moduleUrl() and templatePath and convert them to templateString.
Look at the documentation for more info.
The last (and also pretty common) solution is to use a reverse proxy. If you have to host your HTML templates on a different domain, you can still define a reverse proxy in your HTTP server and forward certain calls to that domain, for example (Apache 2):
ProxyPass /templates http://other-domain.com
ProxyPassReverse /templates http://other-domain.com
This allows you to go to /templates/my-template.html, which will be proxied through to http://other-domain.com/my-template.html.
How do I set preload headers in Next.js when the files have random build names?
I've seen from the page source that the static files are preloaded using link tags in the html. I'd much rather have these preloaded in the headers, as that also enables HTTP/2 Server Push.
According to the documentation, you can set custom headers in next.config.js. This is fine, but the main problem is that the file names get random strings every build.
For example, I've also got some local font files that I'd like to push, as well as the CSS file generated by Tailwind.
How would you set preload headers for the resources in Next.js?
EDIT:
I have managed to hardcode the font files in the headers, since their hashed names stay the same across rebuilds. The Tailwind CSS file seems impossible to hardcode this way, as it gets a new name right after I rebuild. I guess I could modify the build folder in that case, but both of these methods are less than ideal.
Why isn't this a more common issue with people using React/Next.js? As far as I know, using HTTP/2 Server Push makes everything much faster as long as the server supports it.
Here is a working (but is it efficient?) solution, which requires setting up an Apache2 or NGINX reverse proxy:
1) Use a custom server.
2) Intercept the response body, search it for <link rel="preload"> HTML tags, and set a Link HTTP header for each of them (a related sketch follows below). You could use this library.
3) Configure the reverse proxy (NGINX or Apache2) to automatically push resources by intercepting the Link HTTP headers.
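As a rough sketch of steps 1 and 2, assuming an Express-based custom server: instead of parsing the response body, this variant resolves the hashed CSS filename from the build output at startup, which also deals with the random build names from the question. The paths and the font filename are assumptions about a default Next.js + Tailwind setup.

// server.js -- minimal custom Next.js server that sets Link preload headers.
const express = require('express');
const next = require('next');
const fs = require('fs');
const path = require('path');

const dev = process.env.NODE_ENV !== 'production';
const app = next({ dev });
const handle = app.getRequestHandler();

app.prepare().then(() => {
  const server = express();

  // Resolve the hashed CSS filenames once at startup (they change every build).
  const cssDir = path.join(__dirname, '.next', 'static', 'css');
  const cssFiles = fs.existsSync(cssDir)
    ? fs.readdirSync(cssDir).filter((f) => f.endsWith('.css'))
    : [];

  // Naively set the header on every response; you would probably restrict this to page routes.
  server.use((req, res, nextMiddleware) => {
    const links = cssFiles.map(
      (f) => `</_next/static/css/${f}>; rel=preload; as=style`
    );
    // Hardcoded font, as mentioned in the question; the path is illustrative.
    links.push('</fonts/my-font.woff2>; rel=preload; as=font; crossorigin');
    res.setHeader('Link', links.join(', '));
    nextMiddleware();
  });

  server.all('*', (req, res) => handle(req, res));
  server.listen(3000);
});

The reverse proxy from step 3 can then turn those Link headers into actual HTTP/2 pushes.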
See also: https://github.com/vercel/next.js/issues/2961
This is not really a Mustache.js or Handlebars.js specific question but both frameworks would have this problem if you are trying to optimize the performance loading templates for your web app.
Right now I am loading the templates from the same domain as the rest of the app, but I would like to load my templates from the CDN if possible. The largest problem with this is that, cross-browser, I can't load text files via AJAX from another domain.
What other methods can I try to optimize the load time of individual templates cross domain?
I have tried optimizing the load order (which worked), and loading the templates as cross-domain script files in the head (which failed):
<script type='text/html' src="http://domain.cdn.com"></script>
I think CORS would be limited by browser support.
Using YQL would be slow.
Can I somehow do what JSONP does, but with XML, XHTML, or HTML, obviously without the JavaScript callback? Maybe the end of the template could have a small callback function, but I wouldn't want to wrap the whole thing and have to escape it as JSON.
One idea off the top of my head.
Use RequireJS to build the templates into a single file. Each template would be wrapped as a define module and the template will be properly escaped as a string.
Because the file would be .js, it could be loaded as normal from another domain.
So if the text plugin determines that the request for the resource is
on another domain, it will try to access a ".js" version of the
resource by using a script tag. Script tag GET requests are allowed
across domains. The .js version of the resource should just be a
script with a define() call in it that returns a string for the module
value.
Example: if the resource is 'text!example.html' and that resolves to a
path on another web domain, the text plugin will do a script tag load
for 'example.html.js'.
https://github.com/requirejs/text#xhr-restrictions
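As a sketch of what that quoted fallback expects (the file name and template content here are illustrative), example.html.js would just be an AMD module returning the template string:

// example.html.js -- served from the CDN next to example.html
define(function () {
  // The template, escaped as a plain JavaScript string.
  return '<ul>{{#items}}<li>{{name}}</li>{{/items}}</ul>';
});

Since it is loaded via a script tag, the same-origin restriction on XHR never comes into play.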
If you also compiled your Mustache/Handlebars templates it would be even more performant. Here is an example of a compiled Handlebars template wrapped in a define call. The Handlebars compiler takes care of the output and then the RequireJS builder will include all of them in one file for you.
Again, not tried this solution but might put you on the right track.
I'm aware that I'm trying to answer an old question...
This is quite doable by embedding template strings in a static HTML document, wrapped in a <script> element with the attribute type="text/x-handlebars-template". This is known as micro-templating. Because of the type attribute, the browser won't try to execute the script. For example:
<script id="list-template" type="text/x-handlebars-template">
<p>YUI is brought to you by:</p>
<ul>
{{#items}}
<li>{{name}}</li>
{{/items}}
</ul>
</script>
Once that is done, we have to compile the template and then render it, passing in the data object.
<script>
YUI().use('handlebars', 'node-base', function (Y) {
// Extract the template string and compile it into a reusable function.
var source = Y.one('#list-template').getHTML(),
template = Y.Handlebars.compile(source),
html;
// Render the template to HTML using the specified data.
html = template({
items: [
{name: 'pie', url: 'http://pieisgood.org/'},
{name: 'mountain dew', url: 'http://www.mountaindew.com/'},
{name: 'kittens', url: 'http://www.flickr.com/search/?q=kittens'},
{name: 'rainbows', url: 'http://www.youtube.com/watch?v=OQSNhk5ICTI'}
]
});
// Append the rendered template to the page.
Y.one('body').append(html);
});
</script>
For details please refer here - http://yuilibrary.com/yui/docs/handlebars/
Say my html file is from http://foo.com/index.html, in it, there's a <script> tag to http://bar.com/bar.js. In bar.js, I want to start a SharedWorker where the url is http://bar.com/worker.js. Is there a way to achieve this (maybe something like jsonp)?
The preferred way to do this sort of cross-domain access these days is using the W3 CORS specification.
Cross-Origin Resource Sharing
However, this might not be suitable for you if you do not control the site at bar.com. If you do, then CORS is definitely a good option, but you may need to resort to JSONP if bar.com is run by another party, since CORS depends on the site sending back specific headers authorizing your browser to download the resource you requested.
This is a solution I found (a sketch follows after the steps):
1) Write the script inside a function (it can be an inner function).
2) Get the text using the function's toString() (removing the function declaration and closing brace).
3) Append the text to a BlobBuilder and get the blob.
4) Use window.URL.createObjectURL to convert the blob to a URL.
5) Use that URL for the worker.
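A minimal sketch of those steps, with Blob standing in for the now-deprecated BlobBuilder (the worker body is just an illustrative echo):

function workerBody() {
  // Code that should run inside the shared worker.
  onconnect = function (e) {
    var port = e.ports[0];
    port.onmessage = function (msg) {
      port.postMessage('echo: ' + msg.data);
    };
  };
}

// Extract the source between the function's braces.
var src = workerBody.toString();
src = src.substring(src.indexOf('{') + 1, src.lastIndexOf('}'));

// Turn the source into a same-origin blob URL and start the worker from it.
var blob = new Blob([src], { type: 'application/javascript' });
var url = window.URL.createObjectURL(blob);
var worker = new SharedWorker(url);
worker.port.start();

Note that browser support for constructing a SharedWorker from a blob URL varies, so treat this as a sketch rather than a guaranteed cross-origin solution.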
So, I want to add versioning to my css and js files. The way I would like to do this is by appending a query string to the end of the asset path so
/foo/bar/baz.css
Becomes
/foo/bar/baz.css?version=1
This will work for proxies and browser caches; however, I was wondering whether Akamai will know this is a new file and re-request it from the origin server. My assumption is that it would, but I figured I'd ask if anyone knew for sure.
Yes. It matches exact URLs for all GET requests.
Not quite. It depends on the CDN configuration. Query string values are usually not part of the cache key. So, when setting up the CDN delivery config, make sure you explicitly add the option to include the query string as part of the cache key. Otherwise, you will end up serving inconsistent versions, because the cache key will not vary with the query string value, in this case the asset version.
I prefer to have a URL like '/css/DEVELOPER_BASE/foo/baz/style.css'.
Your build/deploy scripts do a global find and replace on '/css/DEVELOPER_BASE/' with '/css/[version_number]/'
To make this work you then have two options:
1) Your deploy script copies the CSS files from '/css/DEVELOPER_BASE/' to '/css/[version_number]/'.
2) Your web server sets up an alias (not a redirect) from '/css/[version_number]/' to '/css/DEVELOPER_BASE/'.
This will keep you from having to worry about how browsers and CDNs handle query parameters.
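A minimal sketch of the find-and-replace step described above (the dist directory, the file set, and the way the version number is supplied are all assumptions about your build):

// replace-version.js -- run as: node replace-version.js 42
var fs = require('fs');
var path = require('path');

var version = process.argv[2] || 'dev';
var distDir = path.join(__dirname, 'dist');

fs.readdirSync(distDir)
  .filter(function (name) { return name.endsWith('.html'); })
  .forEach(function (name) {
    var file = path.join(distDir, name);
    var contents = fs.readFileSync(file, 'utf8');
    // Rewrite the stable DEVELOPER_BASE path to the versioned path.
    contents = contents.replace(/\/css\/DEVELOPER_BASE\//g, '/css/' + version + '/');
    fs.writeFileSync(file, contents, 'utf8');
  });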
Requesting data from any location on my domain with .load() (or any jQuery ajax functions) works just fine.
Trying to access a URL in a different domain doesn't work though. How do you do it? The other domain also happens to be mine.
I read about a trick where you use PHP to make a proxy that fetches the content, and then you use jQuery's AJAX functions against that PHP location on your own server, but that's still using jQuery AJAX on your own server, so that doesn't count.
Is there a good plugin?
EDIT: I found a very nice plugin for jQuery that allows you to request content from other pages using any of the jQuery functions, in just the same way you would make a normal AJAX request in your own domain.
The post: http://james.padolsey.com/javascript/cross-domain-requests-with-jquery/
The plugin: https://github.com/jamespadolsey/jQuery-Plugins/tree/master/cross-domain-ajax/
This is because of the cross-domain policy, which, in short, means that using a client-side script (a.k.a. JavaScript...) you cannot request data from another domain. Lucky for us, this restriction does not exist in most server-side scripts.
So...
Javascript:
$("#google-html").load("google-html.php");
PHP in "google-html.php":
echo file_get_contents("http://www.google.com/");
would work.
Different domains = different servers as far as your browser is concerned. Either use JSONP to do the request or use PHP to proxy. You can use jQuery.ajax() to do a cross-domain JSONP request.
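For example, a JSONP request with jQuery.ajax() could look like this (the endpoint here is made up and must actually support JSONP, i.e. wrap its response in the callback parameter jQuery appends):

$.ajax({
  url: 'http://other-domain.example.com/api/items',
  dataType: 'jsonp', // jQuery adds a callback=? parameter and loads the response via a script tag
  timeout: 5000,     // without a timeout, JSONP failures may never trigger the error handler
  success: function (data) {
    console.log('got', data);
  },
  error: function () {
    console.log('JSONP request failed');
  }
});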
One really easy workaround is to use Yahoo's YQL service, which can retrieve content from any external site.
I've successfully done this on a few sites following this example which uses just JavaScript and YQL.
http://icant.co.uk/articles/crossdomain-ajax-with-jquery/using-yql.html
This example is a part of a blog post which outlines a few other solutions as well.
http://www.wait-till-i.com/2010/01/10/loading-external-content-with-ajax-using-jquery-and-yql/
I know of another solution which works.
It does not require that you alter jQuery. It does require that you can stand up an ASP page in your domain. I have used this method myself.
1) Create a proxy.asp page like the one on this page http://www.itbsllc.com/zip/proxyscripts.html
2) You can then use a jQuery load() call and feed it proxy.asp?url=.......
There is an example at that link of how exactly to format it.
Anyway, you feed the foreign page URL and your desired MIME type as GET variables to your local proxy.asp page. The two MIME types I have used are text/html and image/jpg.
Note, if your target page has images with relative source links those probably won't load.
I hope this helps.