How to include absolute URL for JavaScript, CSS and Menu Items - javascript

I'm working on an ASP.NET MVC 3 web application. As part of an integration with a third-party application, we need to provide a master page (template) for our web application, which they will use to inject the necessary data on the page. The third party will poll our template every few minutes and update it if anything has changed.
For this to work, all the JavaScript, CSS and menu links must be absolute URLs pointing to our domain name, so that if a user clicks a link it goes to our site, and all JavaScript and CSS refer to resources living on our domain.
Given that our web application will be hosted in different environments, e.g. DEV, UAT, QA and Production (with the actual real domain name), could someone help me with how we can provide absolute URLs for CSS, JavaScript and menu links that would work in all environments?

Related

How does angularjs not refresh on page change?

I've been learning about AngularJS and have been very confused about how Angular manages to change pages without refreshing and yet display a completely different view.
Are they actually changing the page URL, or just hiding all the elements of one page and showing those of another?
This video by CodeSchool explains it quite well.
AngularJS is just a tool that allows you to build single-page web applications with relative ease. What you are looking for is actually the definition of Single-Page Application:
Single-Page Applications (SPAs) are Web apps that load a single HTML page and dynamically update that page as the user interacts with the app. SPAs use AJAX and HTML5 to create fluid and responsive Web apps, without constant page reloads. However, this means much of the work happens on the client side, in JavaScript.
Also, from http://www.johnpapa.net/:
A SPA is fully (or close) loaded on the initial page load, its key resources are preloaded, and progressively downloads features as required.
And, more specific to your particular question:
When a user clicks on a menu item, the SPA sees that url and translates it to a View that should be displayed. If the view has not been seen before, the application may make an HTTP request to retrieve the HTML template for the view. Then it will compose the view, fill in the template, and display the view in the appropriate location within the shell. If the view has already been viewed once, the browser may have cached it and the router will be smart enough not to make the request. This is one way a SPA can reduce round-tripping to and from a server, and thus improve performance.
Keep in mind that this behavior is attained with the use of JavaScript, and does NOT require any specific library or framework (such as AngularJS), although you will probably want to learn how to use one to facilitate the process.
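As a rough illustration of that behaviour, here is a minimal, framework-free sketch of a hash-based router; the #shell element and the route table are illustrative assumptions, not something from the question:
// Clicking a menu link only changes location.hash; the handler swaps the
// matching template into the shell element, so no full page load happens.
var routes = {
  '#/home':  '<h1>Home</h1><p>Welcome.</p>',
  '#/about': '<h1>About</h1><p>About us.</p>'
};
function render() {
  var view = routes[window.location.hash] || routes['#/home'];
  document.getElementById('shell').innerHTML = view; // swap the view in place
}
window.addEventListener('hashchange', render); // fires when a menu link changes the hash
window.addEventListener('load', render);       // render the initial view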
I also recommend you check these resources:
http://johnpapa.net/building-single-page-apps-with-knockout-jquery-and-web-api-ndash-the-story-begins/
http://www.johnpapa.net/pageinspa/
If your URLs are mapped with $routeProvider, you can reload a controller by invoking $route.reload().
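A minimal AngularJS sketch of that, assuming ngRoute is loaded (the module, template and controller names are made up for illustration):
// Routes mapped with $routeProvider; $route.reload() re-instantiates the
// current controller and re-renders the view without a full page refresh.
angular.module('app', ['ngRoute'])
  .config(function ($routeProvider) {
    $routeProvider
      .when('/items', { templateUrl: 'items.html', controller: 'ItemsCtrl' })
      .otherwise({ redirectTo: '/items' });
  })
  .controller('ItemsCtrl', function ($scope, $route) {
    $scope.refresh = function () {
      $route.reload(); // reload the controller for the current route
    };
  });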

How to get page.js to work in a single page AND multipage on the same site?

I am using page.js for routing in a Grails application, using '/' to point to /HomeController/index, which serves up a single-page web application. I just installed the Grails Spring Security Core plugin, and I am using the Grails scaffolding to create the user admin/permissions views, with the goal of serving them in the traditional multi-page way to avoid having to do a lot of UI work on the admin pages. The bulk of the application will be served using a single-page architecture, with just the admin pages being served multi-page.
In their documentation, page.js says, "By default when a route is not matched, page.js will invoke page.stop() to unbind itself, and proceed with redirecting to the location requested. This means you may use page.js with a multi-page application without explicitly binding to certain links." But, I cannot get it to work...
I am using page.js like so:
page('/', SCM.Dashboard.home);
page('/hx', SCM.HX.summary);
page('/hx/vendor', SCM.HX.vendors);
page('/hx/customer', SCM.HX.customers);
page('/customer/list', SCM.Customer.list);
page('/maintenance/activity', SCM.Maintenance.activity);
page(); // no arguments: start the router and dispatch the current URL
When I click a link to '/user', based on their documentation, I expect it to forward directly to 'http://domain.com/user'. It adds the correct path to browser location bar (http://domain.com/user), but the browser never forwards to the page. In order to see the page, I have to click the link, and after the location bar has changed, if I refresh the browser window, the correct page appears - obviously unacceptable. Yet, I cannot find in their documentation how to implement this correctly. I have experimented with various settings for hours with no luck. If I comment out the page.js code above, the multi-page admin pages work fine, and I am able to navigate from page to page no problem. Has anyone solved this problem?
I just upgraded from version 1.4.0 to version 1.5.0, and it now links between the single-page (main app) and multi-page (admin functionality) portions of the application seamlessly, with no configuration needed. Excellent feature addition!
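For anyone stuck on an older version, one possible workaround (a hedged sketch, not something taken from the page.js docs) is a catch-all route, registered just before the final page() call, that unbinds page.js and forces a normal browser navigation for paths the SPA does not handle:
// ctx.path is the unmatched path (e.g. '/user'); page.stop() unbinds page.js,
// and assigning location.href lets the server render the multi-page view.
page('*', function (ctx) {
  page.stop();
  window.location.href = ctx.path;
});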

What is a good approach to implementing a web widget

I have a dilemma about the way web widget apps should be implemented.
The scenario is that website A should present content from website B as a widget. Let's say that the widget's content type is a webshop, so everything a webshop can offer will be inside the widget: items, cart, login, checkout, etc., but with no redirection to the other site. The window stays on site A.
There is no interaction between the websites; no data is passed from one to the other.
The technologies that will be used are .NET Web API on the server side and Angular as the JS framework, and everything will be implemented as a single-page app.
I see two scenarios.
1. Website B will be embedded in website A through an iframe.
2. All JS, CSS, and the initial HTML will somehow be embedded into website A and make calls to the Web API services.
I'm not clear on how option 2 should be done. Giving bundles of JS, CSS, and HTML to site A to implement is a lot of overhead for both me and site A, and I see a lot of trouble there. Maybe it should all be injected dynamically somehow (see the sketch below).
On the other side, the iframe: this seems like the right scenario for using an iframe, but is it?
Any thoughts are appreciated.
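To make option 2 a bit more concrete, this is a rough sketch of the kind of loader snippet site A could embed: site B hosts everything, and the loader injects the CSS and bootstraps the single-page widget into a placeholder element. All URLs, the global WebshopWidget object and the element id are illustrative assumptions:
// Site A embeds only this snippet plus <div id="webshop-widget"></div>.
(function () {
  // inject the widget's stylesheet
  var css = document.createElement('link');
  css.rel = 'stylesheet';
  css.href = 'https://widget.example.com/widget.css';
  document.head.appendChild(css);

  // inject the widget's JavaScript bundle and bootstrap it once loaded
  var script = document.createElement('script');
  script.src = 'https://widget.example.com/widget.js';
  script.async = true;
  script.onload = function () {
    window.WebshopWidget.init(document.getElementById('webshop-widget'));
  };
  document.head.appendChild(script);
})();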

JavaScript Injection On Third-Party Pages

I've recently stumbled upon a website called Overlay101 which allows you to create tours for other websites.
I was very interested to see the technique they use to load the third party websites for editing.
When you type the address of the website, it is loaded as a subdomain of the overlay101.com website.
For example, if I type https://stackoverflow.com/questions/111102/how-do-javascript-closures-work - it is loaded as http://stackoverflow.com.www.overlay101.com/questions/111102/how-do-javascript-closures-work
I was wondering how that subdomain creation is achieved. I also saw in the source code of the page that JavaScript is injected, and I was wondering how that was possible too.
What intrigued me most is that Stackoverflow.com does not allow pages to be loaded within frames - I was wondering how they managed to load up the page so that tour popups could be added.
They simply use wildcard DNS entries to make all subdomains work. They then use the Host header to get the original domain name and download the HTML code of the site. Since they do this on the server side, they do not need any frames.
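A hedged Node.js sketch of that server-side trick (the domain names, the injected script URL and the port are assumptions, and real code would also need to handle redirects, compression and error cases):
var http = require('http');
var https = require('https');

http.createServer(function (req, res) {
  // e.g. Host: stackoverflow.com.www.overlay101.com  ->  stackoverflow.com
  var target = req.headers.host.replace(/\.www\.overlay101\.com$/, '');

  // fetch the original page on the server side (no frames involved)
  https.get('https://' + target + req.url, function (upstream) {
    var html = '';
    upstream.setEncoding('utf8');
    upstream.on('data', function (chunk) { html += chunk; });
    upstream.on('end', function () {
      // inject the overlay script so the tour popups can run on the page
      html = html.replace('</body>', '<script src="/overlay.js"></script></body>');
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(html);
    });
  });
}).listen(8080);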

Twitter Cards using Backbone's HTML5 History

I'm working on a web app which uses Backbone's HTML5 History option. In order to avoid having to code everything on the client and on the server, I'm using this method to route every request to index.html
I was wondering if there is a way to get Twitter Cards to work with this setup, as currently Twitter can't read the page because everything is loaded dynamically with JavaScript.
I was thinking about using User Agents to detect whether it's the TwitterBot, and if it is, serving a static version of the page with the required meta-tags. Would this work?
Thanks.
Yes.
At one job we did this for all the SEO/search/facebook stuff etc.
We would sniff the user-agent, and if it was one of the following sniffers
Facebook Open Graph
Google
Bing
Twitter
Yandex
(a few others I can't remember)
we would redirect to a special page that was written to dump all the relevant data about the page for SEO purposes into a nicely formatted (but completely unstyled) page.
This allowed us to retain our google index position and proper facebook sharing even though our site was a total single-page app in backbone.
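A minimal sketch of that user-agent check, written here as Express-style middleware (the bot pattern and the /seo path are assumptions; the real list of crawlers and the pre-rendered pages depend on your stack):
var express = require('express');
var app = express();

// crawlers that cannot execute the JavaScript of a single-page app
var BOTS = /facebookexternalhit|Twitterbot|Googlebot|bingbot|YandexBot/i;

app.use(function (req, res, next) {
  var isBot = BOTS.test(req.headers['user-agent'] || '');
  if (isBot && req.path.indexOf('/seo') !== 0) {
    // send crawlers to a pre-rendered, unstyled page carrying the meta tags
    return res.redirect('/seo' + req.originalUrl);
  }
  next(); // normal browsers (and /seo requests) fall through to the SPA shell
});

app.listen(3000);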
Yes, serving a specific page for Twitterbot with the right metadata markup will work.
You can test your results while developing using the card's preview tool.
https://dev.twitter.com/docs/cards/preview (with your static URL or just the tags).
