I'm developing a small application with HTML, CSS, JavaScript, jQuery and jQTouch.
This is a LOCAL application. I have an index.html with some divs. When I click a link in this file, I want to load the HTML of page1.html (same folder) and insert the element I want from page1.html into a div of index.html.
Example: Inject the content of (page1.html) into (index.html).
I tried: http://api.jquery.com/load/
$('#loadContent').load('page1.html #test');
But the content of loadContent doesn't change. I do include the jQuery script in index.html...
I tried http://api.jquery.com/html/ too, but I think it connects to the server.
Any idea? Thanks!
Make sure you're calling it after loadContent has been created. The following will run your load code once the document is ready:
$(function() {
    $('#loadContent').load('page1.html #test');
});
Or you could run a local server. If you have Python, go to the directory with your files and run python -m SimpleHTTPServer for Python 2.7 or python -m http.server for Python 3.x.
Most browsers will, by default, block this on a local file system as a security precaution. Have you tried it on a remote server?
I don't know much about jQuery, but you can still do this by loading page1.html into a hidden iframe, reading the document.body.innerHTML of that page, and appending that node to the div you need. It's plain HTML DOM in JavaScript.
Performance-wise, loading an iframe is always costly, but it would do the job. There may well be better solutions in jQuery or other libraries; sorry, I don't know much about them.
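A rough sketch of that idea in plain DOM JavaScript, reusing the #test and #loadContent ids from the question (the iframe id is just a placeholder, and browsers may still block frame access for file:// pages):
<iframe id="pageLoader" src="page1.html" style="display:none"></iframe>
<script>
document.getElementById('pageLoader').onload = function () {
    // read the document loaded into the hidden iframe
    var doc = this.contentDocument || this.contentWindow.document;
    // copy the fragment we want (falls back to the whole body)
    var fragment = doc.getElementById('test');
    document.getElementById('loadContent').innerHTML =
        fragment ? fragment.innerHTML : doc.body.innerHTML;
};
</script>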
It sounds like the problem is that the jQuery library isn't loading when you're running locally, or the AJAX request is failing for the same reason. This is due to protection built into the browser to prevent cross-site scripting.
See this "additional note" from the documentation for load:
Due to browser security restrictions, most "Ajax" requests are subject to the same origin policy; the request can not successfully retrieve data from a different domain, subdomain, or protocol.
If you use any AJAX, you'll have to serve this page from a local web server rather than opening it from the filesystem; then you won't need any workarounds.
If you're on Windows, you could use UniServer.
If you aren't going to use any AJAX whatsoever (aren't using load), then you could write your code so that it falls back to a local version of jQuery if the remote version didn't load.
Here's an example of how, grabbed from this page:
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js"></script>
<script>!window.jQuery && document.write('<script src="/Scripts/lib/jquery/jquery-1.4.4.min.js"><\/script>')</script>
<script src="//ajax.aspnetcdn.com/ajax/jquery.validate/1.7/jquery.validate.min.js"></script>
<script>!window.jQuery.validator && document.write('<script src="/Scripts/lib/jquery/jquery.validate.min.js"><\/script>')</script>
<script src="//ajax.aspnetcdn.com/ajax/mvc/3.0/jquery.validate.unobtrusive.min.js"></script>
<script>!window.jQuery.validator.unobtrusive && document.write('<script src="/Scripts/lib/jquery/jquery.validate.unobtrusive.min.js"><\/script>')</script>
Related
I have an HTML file with JavaScript that is supposed to load and process an XML file. The main obstacle is that the script may also be run locally, without an HTTP server, and it also has to support Internet Explorer (11).
In the normal case I would use XMLHttpRequest, but as far as I know it cannot be used with locally stored files (or at least it doesn't work in my test cases in Chrome and IE).
I tried using <script> blocks with the src and type="text/xml" attributes set, and the content of the XML does get loaded "somewhere" (it is visible in the network trace), but I cannot find a way to extract the content of the XML from the <script> node.
Most sources (e.g. Getting content of <script> tag) suggest using XHR, but AFAIK that cannot be done in this case.
Is there a sensible way to implement this without even a minimal HTTP server?
I am looking for a clean solution without jQuery.
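For reference, this is roughly what the script-block attempt described above looks like (data.xml is just an example file name):
<!-- the browser fetches data.xml, but an external src never becomes the node's text -->
<script id="xmlData" type="text/xml" src="data.xml"></script>
<script>
    var node = document.getElementById('xmlData');
    // stays empty: the fetched file is not exposed through the <script> node
    console.log(node.text, node.textContent);
</script>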
There is JavaScript on my webpage that I need to hide from my users (I don't want them to be able to see it because it contains some answers to the game).
So I tried using jQuery's .load in order to hide the content (I load the content from an external js file with that call), but it failed to load. So I tried ajax and it failed too.
Maybe the problem comes from the fact that I'm trying to load a file located in my root directory, while the original page is located in "root/public_html/main/pages":
<script type="text/javascript">
$(document).ready(function() {
$.ajax({
url : "../../../secret_code.js",
dataType: "text",
success : function (data) {
$("#ajaxcontent").html(data);
}
});
});
</script>
1) Why can't I load a file from the root directory with ajax or load method?
2) Is there another way around?
PS: I'm putting the file in the root directory so people can't access it directly from their browsers...
1) If the file isn't accessible via web browsers, then it's not accessible via ajax (ajax is part of the web browser).
2) Try /secret_code.js instead of ../../../secret_code.js.
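For point 2, the call from the question would then look something like this (assuming secret_code.js is reachable from the web root, which somewhat defeats the goal of hiding it):
$.ajax({
    url : "/secret_code.js",   // root-relative URL instead of a filesystem-style path
    dataType: "text",
    success : function (data) {
        $("#ajaxcontent").html(data);
    }
});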
What is your system setup? Are you using a CMS?
Even if you add the JavaScript to the page after page load, a user with a tool like Firebug can go and view it, so I don't think what you are doing is really going to secure it. An alternative is to minify and obfuscate the JavaScript you use in your production environment. This produces near-unreadable but still functioning JavaScript code. There are a number of tools you can run your code through to minify and obfuscate it; here is one: http://www.refresh-sf.com/yui/
If that isn't enough, maybe you could keep the answers to the game on your server side and pull them via ajax. I don't know your setup, so I don't know whether that is viable for you.
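A rough sketch of that idea; the endpoint name and response format here are made up for illustration:
// the server keeps the answers and only reports right/wrong,
// so nothing secret ever reaches the browser
function checkAnswer(level, guess) {
    $.post("/check_answer.php", { level: level, guess: guess }, function (response) {
        alert(response.correct ? "Correct!" : "Try again");
    }, "json");
}

checkAnswer(1, "my guess");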
Navigate to the URL, not the directory. Like this:
$.ajax({
url : "http://domain.com/js/secret_code.js",
..
Even if you load your content dynamically, it's quite easy to see the content of the file using Firebug, Fiddler or any kind of proxy. I suggest you use an obfuscator; it will make it harder for the user to find the answers.
Take a look at the jQuery.getScript() function; it's designed for loading JavaScript files over AJAX and should do what you need.
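For example (the path is just a placeholder):
// fetches the script and executes it; the callback runs once it has loaded
$.getScript("js/secret_code.js", function () {
    console.log("secret_code.js loaded and executed");
});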
Try jQuery's $.getScript() method for loading external script files. However, you can easily see the contents of the script file using Firebug or the developer toolbar!
Security first
You can't access your root directory with JavaScript because, if that were possible, people would be able to read out your database passwords, FTP password and so on.
You can only load files that are accessible directly from browsers, for example, http://www.mydomain.com/secret_code.js
If it can't be accessed directly by the browser, it can't be accessed by the browser via ajax. You can, however, use .htaccess to prevent users from opening a js file directly, though that doesn't keep them from looking at it in the Google Chrome or Firebug consoles.
If you want to keep it secret, don't let it get to the browser.
I'm developing a web server for Android and I have some problems with external JavaScript files (.js).
An external CSS works fine: the server receives the TCP request for the CSS file and then sends it as a normal file.
With JavaScript files it doesn't receive any GET/POST request.
Is there a tag I can include to tell the browser to fetch a js file?
At the moment I've only tried this one: <script type="text/javascript" src="js/javascript.js"></script>
EDIT:
I just added the "text/javascript" content type, but nothing seems to have changed. If I open http://ip/js/javascript.js directly, I get the text of javascript.js. Then, if I go back to my index.html, all the JavaScript functions work... why?
EDIT 2:
My server (at the moment) doesn't use threads: for each request it sends the page and restarts the connection. Could this be the problem?
But in that case, shouldn't at least something work?
EDIT 3:
I got a confirmation that it may be a threading problem:
if I swap the JavaScript and the CSS tags in the HTML file, the JavaScript works and the CSS doesn't. What do you think?
Make sure you aren't caching the file on the client side. If the js file is linked in the page, the browser will always attempt to download it, unless it is cached.
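One quick way to rule caching out while testing is a cache-busting query string on the script tag, for example:
<script type="text/javascript" src="js/javascript.js?nocache=12345"></script>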
I'm very new to XML with JavaScript, so I thought I would learn from w3schools, but this page
http://www.w3schools.com/xml/xml_to_html.asp shows an example that I can't mimic locally. I copy/pasted the .js and downloaded the XML, but I just get a blank screen!
It works there (try it yourself), but not for me. Do I need to put it on a server or something?
Yes, that code retrieves the XML data from a web server using AJAX. Since you don't have a server running locally, you can change the URL to point directly to the w3schools version:
xmlhttp.open("GET","http://www.w3schools.com/xml/cd_catalog.xml",false);
Alternatively, play around on their online version ;)
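For context, the loading part of that example boils down to something like this (a synchronous XMLHttpRequest; the CD element names follow w3schools' cd_catalog.xml):
var xmlhttp = new XMLHttpRequest();
xmlhttp.open("GET", "http://www.w3schools.com/xml/cd_catalog.xml", false); // synchronous GET
xmlhttp.send();
var xmlDoc = xmlhttp.responseXML; // parsed XML document, or null if the request was blocked
console.log(xmlDoc.getElementsByTagName("CD").length + " CD entries loaded");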
Well, I guess you have to add the example XML (cd_catalog.xml) to your file system, and you definitely have to access the HTML file through a server (e.g. Apache).
First, ensure that both the HTML file (with the JavaScript block in it) and the XML file are placed in the same directory.
Next, you probably need to place those files under your local web server and open the HTML like this:
http://[local server host]/ajax.html
instead of opening the file directly from e.g. Windows Explorer:
C:\[path to the file]\ajax.html
For the latter case you'll get an "Access is denied" error.
Are you running this under a web server or just creating a couple of text files and loading them in your browser?
The "GET" request this relies upon could possibly be failing.
Use Apache or another similar HTTP server and run the example as if it were hosted on the web.
I'm building a local HTML file to show some data dynamically from an XML file, and using Chrome's Inspector I figured out my XML file is not being parsed because it is "not hosted on a webserver":
XMLHttpRequest cannot load data.xml. Cross origin requests are only supported for HTTP.
I know there are a few flags I could pass to Chrome or another browser to work around this limitation, but I'm looking for an alternative method. I will probably have to distribute these files to a few people, and teaching them how to pass flags to the browser is not an option for me. Hosting the files on a web server is not an option either.
Thanks a lot in advance.
Not possible unless you set up a local server or host the XML file somewhere. Ajax has to follow the same-origin policy.
If you're willing to use a library, you can use jQuery's AJAX method to perform a cross-domain request (I'm not entirely certain that jQuery will support what you're trying to do). JSONP works, but you have XML data...
You could try loading it in a script tag anyway and see if you can get the innerHTML value without breaking the script; once you're done getting the text from it, remove the script from the page. You may be able to get at the data before the browser tries to parse the script by attaching onload and onreadystatechange events to the script element.
var s = document.createElement('script');
s.src = '/path/to/file.xml';
s.onload = s.onreadystatechange = getData;

function getData(e)
{
    // get the text out of the script element
    // remove the event handler from the script element
    // remove the script from the page
}

document.getElementsByTagName('body')[0].appendChild(s);
I didn't test it, but it might work.
How about setting up a local web server? XAMPP should be easy to install even for a novice. Just tell them to put your files in the htdocs folder, run XAMPP and start the Apache server.
Unless there's a compelling reason to have a separate html and xml file, you could just put the data directly in the html file.
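A minimal sketch of that approach (the element id and the XML content are placeholders): an inline script block with no src keeps its content, so it can be read and parsed without any request.
<script id="data" type="text/xml">
    <items>
        <item name="first"/>
        <item name="second"/>
    </items>
</script>
<script>
    var xmlText = document.getElementById('data').textContent;
    var xmlDoc = new DOMParser().parseFromString(xmlText, "text/xml");
    console.log(xmlDoc.getElementsByTagName("item").length); // 2
</script>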
I'm afraid that if Chrome only offers options you're not willing to use, your application and Chrome just won't work together. Access via iframe and object doesn't work either, and the load() method of createDocument is not supported by Chrome (and if it were, I guess you'd get the same error).
Whatever you try will be a way of getting around Chrome's restrictions, which can't be good, because I think they have good reasons for putting those restrictions in place.