I have a web page displayed inside an iframe. When I access it from my local machine:
http://localhost/mypage.html
it displays the following text correctly in Spanish:
Búsqueda
But if I call it from my website
http://mywebsiteurl.com/mypage.html
I get the following:
BÃºsqueda
Notice the ú has been replaced by Ãº. I have tried changing fonts but the results are the same. The files on the web server are the same as on my localhost. Any ideas? Could it have something to do with my Apache or PHP configuration being different from the one on my localhost machine?
In your html tag, add the following:
<html lang="es">
What you need might be
AddDefaultCharset UTF-8
in your .htaccess
For more insight, check this thread: How to change the default encoding to UTF-8 for Apache?
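To see whether the live server is actually sending a different charset than your localhost, you can compare the Content-Type response headers of the two URLs from the question, for example with curl:
curl -I http://localhost/mypage.html
curl -I http://mywebsiteurl.com/mypage.html
If the live server reports ISO-8859-1 (or no charset at all), the browser decodes the UTF-8 bytes as Latin-1, which is exactly what turns ú into Ãº.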
I have run into a situation where my company has decided to archive/move all the old html files to a new domain, say archive.company.com. What I would like is that for every request to the archive domain, after the content is loaded, a popup appears saying that this content is old. Since there are thousands of html files, I do not want to touch them. Is there any way to inject a piece of javascript code from .htaccess or by any other means?
If you already use a single CSS file in all of your html files, you can use ::before and ::after to insert content.
e.g.:
h1::before {
content: "This content is old!";
}
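If every archived page already pulls in that shared stylesheet, a rough sketch of a more banner-like variant (the wording and colours below are placeholders) could be:
body::before {
    content: "This page has been archived and may be out of date.";
    display: block;
    background: #ffffcc;
    padding: 10px;
    text-align: center;
}
Note that CSS generated content is purely visual; it cannot execute JavaScript, so this gives you a static notice rather than a popup.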
Another idea would be to add a CGI handler for html files on that subdomain. Then create your own html-CGI-"interpreter" that just outputs the original html with some additional javascript.
Sample Apache configuration snippet:
AddHandler application/x-old-html html
Action application/x-old-html /path/to/the/script
The sample script may simply use awk or sed to replace <body> with <body><script language="JavaScript">alert("Old content");</script>.
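A minimal sketch of such a script, assuming the usual mod_actions behaviour where Apache passes the originally requested file in the PATH_TRANSLATED environment variable (the alert text is just a placeholder):
#!/bin/sh
echo "Content-type: text/html"
echo ""
# Emit the original html, injecting an alert right after the opening body tag
sed 's|<body>|<body><script type="text/javascript">alert("This content is old");</script>|' "$PATH_TRANSLATED"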
It is a rather weird problem. Consider the following small Perl code:
#!/usr/bin/perl
use strict;
use warnings;
use CGI qw{ :standard };
use CGI::Carp qw{ fatalsToBrowser };
my $q = CGI->new;
print "Content-type: text/html\n\n";
print "<head>\n";
print "<script src='/home/bloodcount/Desktop/pm.js' type='text/javascript'></script>\n";
print "</head>\n";
print "<body>\n";
print "<h1>Click any number to see its factors</h1>\n";
print "</body></html>";
It prints a very small html page and includes a javascript file. The problem is that the javascript file isn't loaded. The "physical" copy is in the correct place. I thought that something might be wrong with the code I am generating, so I copied the raw html which comes out if you run this file in the console, which is:
Content-type: text/html
<head>
<script src='/home/bloodcount/Desktop/pm.js' type='text/javascript'></script>
</head>
<body>
<h1>Click any number to see its factors</h1>
</body></html>
I ran it in chrome and it worked perfectly. The javascript file has exactly one line of code, which is:
console.log("It works!");
Any ideas what may be causing this?
Note: I know that the second code listing doesn't have !DOCTYPE.
Since you are able to execute the CGI within your browser, you must have a local web server running. Your <script src='...'> path is likely unreachable from the browser due to a lack of access rights or a missing alias configured within your web server.
It works from the static file because the browser is then going through the filesystem directly, so the JS file path resolves.
You have to put the .js file somewhere that the web server knows about, and then formulate your src path correctly.
Check your web server logs and documentation to see how to set up the proper access rights and/or aliases. Note you probably do not want to expose ~/Desktop to the internet.
As an example, if you are using Apache, see USERDIR, ACCESS CONTROL, ALIAS.
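For instance, a hedged sketch of an Alias that maps a URL path onto a directory the server is allowed to read (the paths here are made up for illustration):
Alias /js/ "/var/www/mysite/js/"
After copying pm.js into that directory you would reference it as <script src='/js/pm.js' type='text/javascript'></script>. Depending on your Apache version you may also need a <Directory> block granting access to that path.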
After some tinkering I found the solution:
Apache serves scripts and files only from the folder configured for the website, meaning that each website has one specific folder (its document root) where you must put the scripts. The base folder path is /var/www/, and from there you must find your website's folder.
This means that when the path was set to /home/bloodcount/Desktop/pm.js, Apache actually looked for /var/www/home/bloodcount/Desktop/pm.js, which didn't exist. It wasn't searching the real Desktop, nor was there a permission problem.
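So, as a sketch, if pm.js is copied into the site's document root (assumed here to be /var/www), the script tag in the Perl code becomes:
print "<script src='/pm.js' type='text/javascript'></script>\n";
and Apache then resolves /pm.js to /var/www/pm.js.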
Let's assume that I put a valid xml string as the body of an iframe.
Is there any way to save it as an xml file from the client side? For example, creating a link to the iframe, right-clicking the link and choosing "save target as" or something similar. Maybe using javascript?
Thanks
Post back to the server, and add a Content-Disposition: attachment; filename=1234.xml header.
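A sketch of the response the server would send for that post-back (the filename is just the example from the answer):
Content-Type: application/xml
Content-Disposition: attachment; filename=1234.xml
The browser then offers to save the returned body as 1234.xml instead of rendering it.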
I might be wrong, but I believe there are security restrictions in place that prevent the DOM from accessing contents within an iframe.
I don't know how I should title this question, but I hope my friends will understand the problem and help me :)
I want to show a log message in Arabic using the JavaScript alert() function, for which I wrote:
alert('أدخل سعر الافتتاح');
which means
alert('Enter opening price');
but when I save the .js file, Dreamweaver shows a warning about the file's encoding, and if I run the script the browser shows the alert text as unreadable characters (??????).
The page contains
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
and I am using a lot of text in Arabic in the page, which works fine.
Now how can I use alert() with a different language?
Just add your script like this:
<script src="/js/intlTelInput.min.js" charset="utf-8"></script>
Just like any other text file, .js files have specific encodings they are saved in. This message means you are saving the .js file with a non-UTF8 encoding (probably ASCII), and so your non-ASCII characters never even make it to the disk.
That is, the problem is not at the level of HTML or <meta charset> or Content-Type headers, but instead a very basic issue of how your text file is saved to disk.
To fix this, you'll need to change the encoding that Dreamweaver saves files in. It looks like this page outlines how to do so; choose UTF8 without saving a Byte Order Mark (BOM). This Super User answer (to a somewhat-related question) even includes screenshots.
Try putting the following in the head section of your html:
<meta charset='utf-8'>
I think this needs to be the first element in the head section. More information about charset: Meta Charset
Same problem here, solved with this:
In Eclipse (with the .js file open and in focus), go to "File" > "Properties" > "Resource" > "Text file encoding", choose "Other:" UTF-8, put the correct characters in the code, save your file and you are done!
I think you just need to add
<meta http-equiv="Content-Type" content="text/html;charset=UTF-8">
before calling your .js files or code.
For others: I just had a similar problem and had to copy all the code from my file, paste it into plain Notepad, save it with UTF-8 encoding and then replace my original file.
The problem on my side was caused by using the PSpad editor.
The encoding for the page is not set correctly. Either add a header
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
or set the appropriate HTTP header:
Content-Type:text/html; charset=UTF-8
Firefox also allows you to change the encoding in View -> Character encoding.
If that's ok, I think javascript should handle UTF8 just fine.
This is a quite old question to reply to, but I want to give a short answer for newcomers. I had the same problem while working on an eight-language site. The problem is IDE-based, and my solution was to use Komodo Edit as the code editor. I tried many editors until I found one that doesn't change the charset settings of my pages. Dreamweaver (and almost all of the others) changes the code-page/charset settings of all open pages whenever you change it for one page. When you have changes in more than one page, have changed the charset of any file, and then click "Save all", all open pages (including unchanged ones the editor assumes changed because of the charset) are silently re-assigned the new charset, and all mismatching pages break. I lost months re-translating messages again and again until I discovered that Komodo Edit keeps the settings separately for each file.
I too had this issue. I copied the whole piece of code into Notepad; before pasting into Notepad, make sure you set the file type to "All files" and save the document in UTF-8 format. Then you can paste your code back and run it, and it should work. ?????? obviously means unreadable characters.
RobW is right in the first comment.
You have to save the file in your IDE with encoding UTF-8.
I moved my alert from the .js file to my .html file and this solved the issue, because Visual Studio saves .html files with UTF-8 encoding.
I found a solution to a problem of mine that seems like yours.
For some reason a script loaded from an external file doesn't work with charset="UTF-8"; instead I had to use charset="ISO-8859-1" in the script tag.
Now I'm trying to figure out why that works.
Thanks friends. After trying everything and not getting the desired result, I used a hidden div with the Arabic message and jQuery fading effects, which solved the problem. The script I wrote is:
.js file
$('#enterOpeningPrice').fadeIn();
$('#enterOpeningPrice').fadeOut(10000);
.html file
<div id="enterOpeningPrice" style="display:none">
<p>أدخل سعر الافتتاح</p>
</div>
Thanks to all..