Loading an XML file into a class with JavaScript - javascript

How can I load an XML file into a class using JavaScript?

Unfortunately, each browser presents its own way of parsing a string containing XML. Here are the ways that I know of for each of the big 3 browsers. Please note, I haven't had a chance to try each of these as they're cobbled together from various blogs and my own memory.
Firefox has an object called DOMParser that can be used to parse XML in a string. The API is pretty simple -- instantiate the DOMParser and call its parseFromString method. Here is an example:
var xmlString = '<?xml version="1.0"?>...';
var parser = new DOMParser();
var dom = parser.parseFromString(xmlString, "text/xml");
// use dom
IE uses the Microsoft ActiveX XMLDOM control, so you must instantiate the DOM control and use its methods. Again, here's an example:
var xmlString = '<?xml version="1.0"?>...';
var dom = new ActiveXObject("Microsoft.XMLDOM");
dom.async = false;
dom.loadXML(xmlString);
// use dom
And lastly, the weirdo Safari version. Safari doesn't have a string parser built in, and since it isn't a Windows application it doesn't support ActiveX controls. However, Safari does support data: URLs. In Safari you create a URL containing the document and request it through an XMLHttpRequest. Like all XMLHttpRequests, you then use the standard responseXML property of the XMLHttpRequest to access the DOM.
var xmlString = '<?xml version="1.0"?>...';
var url = "data:text/xml;charset=utf-8," + encodeURIComponent(xmlString);
var xhr = new XMLHttpRequest();
xhr.open("GET", url, false);
xhr.send(null);
var dom = xhr.responseXML;
// Use dom here

I'm not aware of a built-in XML serializer/deserializer in JavaScript. Have you considered something native to JavaScript, like JSON?

Here is an XML to JSON Javascript converter that might steer you in the right direction.
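As a rough illustration of what such a converter does, here is a minimal sketch (not the linked library) that walks a parsed XML DOM and builds a plain JavaScript object; attributes, repeated element names, and mixed content are ignored for brevity:
// Minimal sketch: convert an XML element into a nested plain object.
// Attributes and repeated child names are not handled here.
function xmlToObject(node) {
    // Leaf element with a single text child: return its text
    if (node.childNodes.length === 1 && node.firstChild.nodeType === 3) {
        return node.firstChild.nodeValue;
    }
    var obj = {};
    for (var i = 0; i < node.childNodes.length; i++) {
        var child = node.childNodes[i];
        if (child.nodeType === 1) { // element node
            obj[child.nodeName] = xmlToObject(child);
        }
    }
    return obj;
}
// Usage (assuming "dom" was produced by one of the parsers above):
// var data = xmlToObject(dom.documentElement);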

This code should work in all kinds of browsers:
var url = "file.xml";
var xmlDoc = "";
if (window.XMLHttpRequest && !window.ActiveXObject)
{
    var Gz = new XMLHttpRequest();
    Gz.open('GET', url, false);
    Gz.send(null);
    xmlDoc = Gz.responseXML;
}
else
{
    xmlDoc = new ActiveXObject("Microsoft.XMLDOM");
    xmlDoc.async = false;
    xmlDoc.load(url);
}
After using this, you can parse the tag and retrieve the data.
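For example, a minimal sketch of reading a value out of the loaded document; the <name> element is hypothetical, so use whatever tags your file.xml actually contains (old IE lacks textContent, hence the text-node fallback):
// Sketch: read the text of the first <name> element in the loaded document.
var nodes = xmlDoc.getElementsByTagName("name");
if (nodes.length > 0) {
    var firstNode = nodes[0];
    // textContent isn't available on old IE's XMLDOM, so fall back to the text node
    var value = firstNode.textContent !== undefined
        ? firstNode.textContent
        : firstNode.firstChild.nodeValue;
    alert(value);
}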

Related

How to edit data of an XML node with JavaScript

I want to write some data into an existing local XML file with JavaScript, using some text from an HTML page. Is it possible to change the content of nodes?
Here is XML sample:
<Notepad>
<Name>Player1</Name>
<Notes>text1</Notes>
</Notepad>
I will get some more text from input and want to add it after "text1", but can't find a solution.
function SaveNotes(content,player)
{
var xml = "serialize.xml";
var xmlTree = parseXml("<Notepad></Notepad>");
var str = xmlTree.createElement("Notes");
$(xmlTree).find("Notepad").find(player).append(str);
$(xmlTree).find("Notes").find(player).append(content);
var xmlString = (new XMLSerializer()).serializeToString(xmlTree);
}
Here is the code to manipulate XML content or an XML file:
[Update]
Please check this Fiddle
var parseXml;
parseXml = function(xmlStr) {
return (new window.DOMParser()).parseFromString(xmlStr, "text/xml");
};
var xmlTree = parseXml("<root></root>");
function add_children(child_name, parent_name) {
    var str = xmlTree.createElement(child_name);
    $(xmlTree).find(parent_name).append(str);
    $(xmlTree).find(child_name).append("hello");
    var xmlString = (new XMLSerializer()).serializeToString(xmlTree);
    alert(xmlString);
}
add_children("apple", "root");
add_children("orange", "root");
add_children("lychee", "root");
You can use it for searching in XML as well as adding new nodes with content in them. (Sorry, I don't know how to load XML from the client side and display it.)
But this fiddle demo will be helpful for adding content to XML and searching in it.
Hope it helps :)
If you want to achieve this on the client side you can parse your XML into a document object:
See
https://developer.mozilla.org/en-US/docs/Web/Guide/Parsing_and_serializing_XML
and
http://www.w3schools.com/xml/tryit.asp?filename=tryxml_parsertest2
And then manipulate it like you would the DOM of any HTML doc, e.g. createElement, appendChild, etc.
See https://developer.mozilla.org/en-US/docs/Web/API/Document/createElement
Then to serialize it into a string again you could use https://developer.mozilla.org/en-US/docs/Web/API/Element/outerHTML
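Putting those pieces together, here is a minimal sketch using the question's <Notepad>/<Notes> structure (DOMParser to parse, ordinary DOM calls to edit, XMLSerializer to serialize):
// Parse the XML string into a Document
var parser = new DOMParser();
var doc = parser.parseFromString("<Notepad><Name>Player1</Name><Notes>text1</Notes></Notepad>", "text/xml");
// Manipulate it like any DOM: append extra text to <Notes>
var notes = doc.getElementsByTagName("Notes")[0];
notes.appendChild(doc.createTextNode(" more text from the input"));
// Serialize back to a string
var xmlString = new XMLSerializer().serializeToString(doc);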
Persisting the data
Writing to a local file is not possible in a cross-browser way. In IE you could use ActiveX to read/write files.
You could use cookies to store data on the client side, if your data stays small enough.
In HTML5 you could use local storage, see http://www.w3schools.com/html/html5_webstorage.asp
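For example, a small sketch of persisting the serialized XML string (the xmlString from the sketch above) in localStorage; the key name "notepadXml" is just an illustration:
// Store the serialized XML under a made-up key
localStorage.setItem("notepadXml", xmlString);
// Later, read it back and re-parse it
var saved = localStorage.getItem("notepadXml");
if (saved) {
    var savedDoc = new DOMParser().parseFromString(saved, "text/xml");
}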
Try using these two packages: one to convert to JSON and, when you are finished, the other to convert back.
https://www.npmjs.com/package/xml2json
https://www.npmjs.com/package/js2xmlparser
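A rough Node.js sketch of how these packages are typically used; the exact function names can vary between package versions, so check each README:
// Rough sketch; verify the exact APIs against each package's documentation.
var xml2json = require("xml2json");
var js2xmlparser = require("js2xmlparser");
// XML string -> JSON string
var json = xml2json.toJson("<Notepad><Notes>text1</Notes></Notepad>");
// Plain object -> XML string (root element name passed explicitly)
var xml = js2xmlparser.parse("Notepad", { Notes: "text1 and some new text" });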

Internet Explorer 8 XML parsing doesn't work on huge response size

I have a huge page with lots of data. Sometimes I need to reload this data using Ajax, so here is the Ajax request:
if (window.XMLHttpRequest) {
    this._request = new XMLHttpRequest();
} else {
    this._request = new ActiveXObject("Microsoft.XMLHTTP");
}
Then, I access this data with request.responseXML. I faced a problem when the size of the response is more than 800k characters.
IE doesn't generate responseXML, it's just an empty XML object, but in responseText everything is OK. I've tried to parse the XML using the following code:
if (request.responseXML) {
    var responseXML = request.responseXML;
    // TODO: for huge XML responses some versions of IE can't automatically parse XML
    if (O$.isExplorer && responseXML && !responseXML.firstChild) {
        var originalResponseText = request.responseText;
        if (window.DOMParser) {
            var parser = new DOMParser();
            responseXML = parser.parseFromString(originalResponseText, 'text/xml');
        } else {
            responseXML = new ActiveXObject("Microsoft.XMLDOM");
            responseXML.async = false;
            responseXML.loadXML(originalResponseText);
        }
    }
}
But I faced the same problem.
Then, I tried to parse XML using jQuery:
responseXML = jQuery.parseXml(request.responseXML);
But the problem is still the same: everything is fine when the response length is small, but for a huge response I still get an empty XML object with the parse error inside.
errorCode : -2147467259
filepos : 814853
reason : "Unspecified error\r\n";
I checked that position inside the response string and everything is correct, just some ordinary symbol. Also, I've rechecked the XML lots of times and I am sure that it's valid. I don't know what to do at all.
I also tried to write my own XML parser, but I think this problem has a simpler solution.
Thanks in advance.
It would seem that IE notoriously has difficulties parsing XML data, and there may not be that much you can really do about it. A workaround is to let IE try to parse the XML data and, if that fails, construct a new parser that builds the XML data out of the plain text (which, as you mentioned, seems to work regardless of size). Take a look at the similar problem and answer here. It boils down to roughly this:
function getResponseXML(request) {
    var xmlData = request.responseXML;
    // If IE failed to produce a usable document, re-parse the raw text
    if (!xmlData || !xmlData.documentElement) {
        var parser = new DOMParser();
        xmlData = parser.parseFromString(request.responseText, "text/xml");
    }
    return xmlData;
}

Multiple HTML DOMs - Parse and Transfer Data

I am requesting full HTML5 documents via Ajax using jQuery. I want to be able to parse them and transfer elements to my main page DOM, ideally with all major browsers, including mobile. I don't want to create an iframe as I want the process to be as quick as possible. With Chrome & Firefox I can do the following:
var contents = $(document.createElement('html'));
contents[0].innerHTML = data; // data : HTML document string
This will create a proper document, somewhat surprisingly, just without a doctype. In IE9, however, one may not use the innerHTML to set the contents of the html element. I tried to do the following, without any luck:
Create a DOM, open it, write to it and close it. Issue: on doc.open, IE9 throws an exception called "Unspecified error".
var doc = document.implementation.createHTMLDocument('');
doc.open();
doc.write(data);
doc.close();
Create an ActiveX DOM. This time the result is better, but IE9 crashes upon transferring / copying elements between documents. That's bad, because this is the option that would also cover IE8 (which lacks adoptNode / importNode support).
var doc = new ActiveXObject('htmlfile');
doc.open();
doc.write(data);
doc.close();
contents = $(doc.documentElement);
document.adoptNode(contents);
I was thinking about recursively recreating the elements, instead of transferring them between my documents, but that seems like an expensive task, given that I can have a lot of nodes to transfer. I like my last ActiveX example as that will most likely work in IE8 and earlier (for parsing, at least).
Any ideas on this? Again, not only do I need to be able to parse the head and body, but I also need to be able to append these new elements to my main DOM.
Thanks much!
Answering my own question... To solve my issue I used all solutions mentioned in my post, with try/catch blocks if a browser throws an error (oh, how we love thee IE!). The following works in IE8, IE9, Chrome 23, Firefox 17, iOS 4 and 5, Android 3 & 4. I have not tested Android 2.1-2.3 and IE7.
var contents = $('');
try {
contents = $(document.createElement('html'));
contents[0].innerHTML = data;
}
catch(e) {
try {
var doc = document.implementation.createHTMLDocument('');
doc.open();
doc.write(data);
doc.close();
contents = $(doc.documentElement);
}
catch(e) {
var doc = new ActiveXObject('htmlfile');
doc.open();
doc.write(data);
doc.close();
contents = $(doc.documentElement);
}
}
At this point we can find elements using jQuery. Transferring them to a different DOM creates a bit of a problem. There are a couple of methods that do this, but they are not widely supported yet (importNode & adoptNode) and/or are buggy. Given that our selector string is called 'selector', below I re-create the found elements and append them to '.someDiv'.
var fnd = contents.find(selector);
if(fnd.length) {
var newSelection = $('');
fnd.each(function() {
var n = document.createElement(this.tagName);
var attr = $(this).prop('attributes');
n.innerHTML = this.innerHTML;
$.each(attr,function() { $(n).attr(this.name, this.value); });
newSelection.push(n);
});
$('.someDiv').append(newSelection);
}

Read xml in browser

I have the below string in JavaScript:
var output = '<?xml version="1.0" encoding="UTF-8" standalone="yes"?><abc><xyz><xyzResponse><URL>http%3A%2F%2Flocalhost%3A8080%2Fnet%2Fxyz.do%3Fpartner%3Ddummy%26id%3Dba0e245f-ae67-40b6-986d-3242acea4c04</URL><StatusMsg>SUCCESS</StatusMsg><ID>hello.com</ID><AID>test</AID></xyzResponse></xyz></abc>';
I want to parse this as xml and get values out of it.
I have tried below code
var xmlObj = $(output);
alert(xmlObj.find('URL').text())
It works in Firefox but does not work in IE. It does not give any error, but it does not show any content.
How can I read XML that is in string format and use its content with JavaScript across browsers?
Any help is appreciated.
jQuery's $() function doesn't parse XML: it treats it as HTML and inserts it into the HTML DOM, which doesn't work in general. If you're using jQuery 1.5, you can use its new parseXML() method:
var xmlObj = $.parseXML(output);
alert( $(xmlObj).find('URL').text() );
If you can't use jQuery 1.5, you'll need an XML parsing function such as the one I posted here: Strange jQuery XML problem
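Such a function boils down to the DOMParser / ActiveX branches shown earlier; here is a minimal sketch (not necessarily identical to the linked answer):
// Minimal cross-browser XML string parser: DOMParser for modern browsers,
// ActiveX fallback for old IE.
function parseXMLString(str) {
    if (window.DOMParser) {
        return new DOMParser().parseFromString(str, "text/xml");
    }
    var doc = new ActiveXObject("Microsoft.XMLDOM");
    doc.async = false;
    doc.loadXML(str);
    return doc;
}
var xmlObj = parseXMLString(output);
alert($(xmlObj).find('URL').text());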
I did the following for parsing XML in all browsers. I hope you will find it helpful too.
if (window.DOMParser) // Firefox, Chrome and other browsers
{
    var xmlString = (new XMLSerializer()).serializeToString(response);
    var parser = new DOMParser();
    xmlDoc = parser.parseFromString(xmlString, "text/xml");
}
else // Internet Explorer
{
    xmlDoc = new ActiveXObject("Microsoft.XMLDOM");
    xmlDoc.async = false;
    xmlDoc.load(response);
}

Caching client-side XSLT imports in Internet Explorer

I'm transforming an XML document with XSLT in Internet Explorer 7. My XSLT imports/includes -- I've tried both -- another XSLT with the following line:
<xsl:import href="utils.xsl" />
This results in an HTTP request for the included file every time the including XSLT is used, even if a reference to the parent XSLT is cached and re-used. IE sends a Pragma: no-cache header on each request for the imported/included file.
Is it possible to prevent these repeated HTTP requests?
Can I get IE to cache the file in the client?
If not, can I get IE to send an "If-Modified-Since" header?
For completeness, here's the corresponding transformation JavaScript:
var XMLUtil = {
// transforms the sourceStr using the given xslDoc
transformString: function(sourceStr, xslDoc /*XMLDOM doc*/) {
var sourceDoc = XMLUtil.loadFromString(sourceStr);
var resultDoc = new ActiveXObject("Microsoft.XMLDOM");
sourceDoc.transformNodeToObject(xslDoc, resultDoc);
return resultDoc;
},
// creates an XMLDOM document from a string containing XML
loadFromString: function(xml) {
var doc = new ActiveXObject("Microsoft.XMLDOM");
doc.async = false;
doc.loadXML(xml);
if (doc.parseError.errorCode != 0)
throw "Error parsing XML: " + doc.parseError.errorCode;
return doc;
}
}
The responses to a similar question recommend setting ForcedResync to false.
But Qi Samuel Zhang's response cautions
ForcedResync should work for most cases, but the ForcedResync in MSXML3 has known issues to mitigate backward compatibility; please use MSXML2.DOMDocument.6.0 when possible.
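Applied to the code above, that would mean loading the stylesheet with MSXML 6.0 and turning ForcedResync off before loading it. A rough sketch, assuming ForcedResync is settable via setProperty on MSXML2.DOMDocument.6.0 as the quoted answer suggests (the stylesheet URL is hypothetical):
// Sketch: load the stylesheet with MSXML 6.0 and disable ForcedResync,
// so imported/included stylesheets can be served from the browser cache.
var xslDoc = new ActiveXObject("MSXML2.DOMDocument.6.0");
xslDoc.async = false;
xslDoc.setProperty("ForcedResync", false); // let imports come from the cache
xslDoc.load("main.xsl"); // hypothetical URL of the stylesheet that imports utils.xsl
var resultDoc = XMLUtil.transformString(sourceStr, xslDoc);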
