AJAX request to only load/open a specific div without using jQuery? - javascript

I'm looking to do something like this:
$('#thisdiv').load(document.URL + ' #thisdiv');
where only the elements matching the specified selector are loaded from the server, but I can't use jQuery. Does anyone know how to do it?

Read the XHR documentation here.
Basically, you create an XMLHttpRequest object (be wary of older browsers), call its methods to load the content, then take the response and copy it where you want it.
The link above goes on to discuss XML parsing, which may come in handy if you want to use just the "#thisdiv" part of the XHR response.
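For reference, here is a hedged sketch of how those pieces fit together: an XMLHttpRequest fetches the page, DOMParser turns the response into a detached document, and only the matching element is copied across. It assumes a same-origin URL and a reasonably modern browser (older ones lack DOMParser support for "text/html").

var xhr = new XMLHttpRequest();
xhr.open('GET', document.URL, true);
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    // Parse the response into a detached document, then pull out just the fragment we want.
    var doc = new DOMParser().parseFromString(xhr.responseText, 'text/html');
    var fresh = doc.getElementById('thisdiv');
    if (fresh) {
      document.getElementById('thisdiv').innerHTML = fresh.innerHTML;
    }
  }
};
xhr.send();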

Related

Is there a way to get DOM-element's actual HTML even if malformed? [duplicate]

OK, I don't use JS enough to know, but is there a way to get the real source code of the page with it?
document.body.innerHTML for example gives some kind of "fixed up" version where malformed tags have been removed.
I'm guessing using XMLHttpRequest on the original page might work, but it seems kind of stupid.
This happens because browsers parse the HTML into a DOM and don't keep the original source in memory. What is returned to you is the browser's serialization of the current DOM back to HTML, which is the reason for the uppercase tags and the missing self-closing tags where applicable.
An XMLHttpRequest would be the best way to go. In most cases, assuming the server doesn't send the no-cache header, and the HTML page has finished downloading, the XMLHttpRequest would be almost instant because the file is fetched from the cache.
For a same-origin page, XMLHttpRequest is quite fine. You can access any same-origin document in "raw" format using this technique without the browser getting in the way (i.e. no conversion to DOM and back).
I am not sure I understand your comment re: XMLHttpRequest being stupid: is it because you are worried about the potential duplication of work, i.e. getting the code twice from the origin server?
I typically use Firebug when I want to peruse or copy source files.
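As a minimal sketch of the XHR route described above (assuming the page is same-origin and, ideally, cacheable so the re-request hits the HTTP cache):

var xhr = new XMLHttpRequest();
xhr.open('GET', document.location.href, true);
xhr.onload = function () {
  // responseText is the raw HTML as the server sent it,
  // not the browser's re-serialization of the DOM.
  console.log(xhr.responseText);
};
xhr.send();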

Parsing RSS in JS without a third-party service (vanilla JS or Angular)

I want to fetch an RSS feed in JS.
I searched the web for a whole day, and found that nearly everybody uses the Google Feed API, the Yahoo API, or a Node.js/PHP page for the processing and JSON-ification. I don't want to depend on a service like the Google Feed API.
My goal is to fetch an RSS feed and then create an array where each article in the feed is an object, in pure JavaScript.
I'm using AngularJS, so if the answer can take advantage of this lib, that would be great, but I'm not opposed to vanilla JS code if needed.
For those who may want to ask why: it is for a Firefox OS application, and that's why I can't have any PHP/Node.js. Everything has to be done in JS.
Thanks,
Tom
What is the problem with fetching the XML structure directly?
I think a regular AJAX request using the systemXHR permission should work fine for you.
Then you'll be able to get whatever you need from the XML in any way you like.
So my best guess would be to just use a normal DOM parser, and then query the document:
var parser = new DOMParser();
var xmlDoc = parser.parseFromString(txt, "text/xml"); // txt is the raw XML string from the response
I think nowadays you can also use things like querySelectorAll to quickly iterate over the document, just like with the normal DOM. E.g. something like this would work:
[].forEach.call(xmlDoc.querySelectorAll('item'), function(item) {
console.log(item.querySelector('title').textContent);
});
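Putting it together, here is a hedged end-to-end sketch under the asker's assumptions: a Firefox OS app with the systemXHR permission (the mozSystem option is Mozilla-specific), a feed URL and field names chosen purely for illustration, and items that actually contain title/link/description elements.

function loadFeed(url, callback) {
  var xhr = new XMLHttpRequest({ mozSystem: true }); // mozSystem requires the systemXHR permission
  xhr.open('GET', url, true);
  xhr.onload = function () {
    var doc = new DOMParser().parseFromString(xhr.responseText, 'text/xml');
    // Turn each <item> into a plain object.
    var articles = [].map.call(doc.querySelectorAll('item'), function (item) {
      return {
        title: item.querySelector('title').textContent,
        link: item.querySelector('link').textContent,
        description: item.querySelector('description').textContent
      };
    });
    callback(articles);
  };
  xhr.send();
}

loadFeed('http://example.com/feed.xml', function (articles) {
  console.log(articles); // array of { title, link, description }
});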
The short answer is that you can't fetch and parse XML feeds on the client without using a 3rd party service because of the browser's Same Origin Policy.
From there, there are 2 options:
fetch and parse on the server side. You'll have to do all the grunt work yourself, but then you can easily load the data from the browser, because it will be under the same domain and hence the Same Origin Policy won't apply
compromise on your requirement to not use a 3rd party, and use one that transforms the XML feeds into JSON to circumvent the SOP.
In both cases, I suggest you check Superfeedr (which I created!), which I believe can help a lot... we also have an Angular module for feeds.
Thanks to the people who took the time to answer me :)
It appears it is not really possible without any server-side computing.
I have to confess that I'm pretty lucky, because the service I wanted to call has just released a new API, so happy ending for me :)
Thanks everybody!

Ajax response as DOM object

Is there a way to get a response from the typical AJAX function so that it can be dissected with getElement* methods? I've tried query.responseText.getElementById but it works just as badly as it looks. You should be able to tell what I'm trying to achieve from that snippet, though. I just need to get elements from an AJAX response the same way I would from a normal DOM object.
Also, please do not suggest using jQuery. I use it when I have a lot of script and can use a lot of its functions, but in this case I only have a short script, and a library 70x its size would seem like a waste.
Parsing an SVG or HTML document
var parser = new DOMParser();
var doc = parser.parseFromString(stringContainingHTMLSource, "text/html");
doc will be a valid HTML document.
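A minimal sketch combining this with an XHR response, assuming a same-origin URL and a browser that supports DOMParser with "text/html" (the URL and element id below are hypothetical):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/some/page.html', true); // hypothetical URL
xhr.onload = function () {
  var doc = new DOMParser().parseFromString(xhr.responseText, 'text/html');
  // doc behaves like a normal document, so the usual getters work:
  var el = doc.getElementById('thisdiv'); // hypothetical id
  console.log(el && el.textContent);
};
xhr.send();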
Well, you could have a hidden div on your page and set its innerHTML to the AJAX response you receive. You could then call the usual DOM methods on it (e.g. div.querySelector() or div.getElementsByTagName()), since it is then just another DOM node.
Refer to this article: Parsing XML response in Ajax
In this case I am using responseXML. You can make use of getElementsByTagName and other getElement*() methods to get your data.
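A hedged sketch of the responseXML route; note that the server must send an XML content type (e.g. application/xml) for responseXML to be populated, and the endpoint and tag name below are made up for illustration:

var xhr = new XMLHttpRequest();
xhr.open('GET', '/data.xml', true); // hypothetical endpoint
xhr.onload = function () {
  var doc = xhr.responseXML; // a Document, or null if the response wasn't XML
  var items = doc.getElementsByTagName('item'); // hypothetical tag name
  for (var i = 0; i < items.length; i++) {
    console.log(items[i].textContent);
  }
};
xhr.send();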
If your response is TEXT, I've seen people use ...xhr.responseText.split('html>...body>...div id="yourTargetsParent">')[1].split('/div>.../body>.../html>')[0]; //just split the string up however!
Another way is to use iframe.contentWindow.document.body... (or contentDocument for some browsers)... just hide the iframe, you know.
Obviously, if you have control over the target that totally changes things (and this post probably wouldn't be here), but I've also seen some mean workarounds with the target's use of scripting its host DOM, localStorage, splits/joins, Web SQL databases, ...for string manipulation.
Honestly, I used to use a hidden div (thank you asleepysamurai!), but I thought I came across a more getElementById/jQuery.load type way... I'll post back if I find it...
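For completeness, a hedged sketch of the hidden-iframe approach mentioned above; it only works for same-origin pages, and the URL and element id are hypothetical:

var frame = document.createElement('iframe');
frame.style.display = 'none';
frame.src = '/some/page.html'; // hypothetical same-origin URL
frame.onload = function () {
  var doc = frame.contentDocument || frame.contentWindow.document;
  var target = doc.getElementById('thisdiv'); // hypothetical id
  console.log(target && target.innerHTML);
  document.body.removeChild(frame); // clean up once we have what we need
};
document.body.appendChild(frame);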

Using jQuery to listen for an AJAX load that is not loaded using jQuery.AJAX

Okay, I have a bit of a tricky one (for me anyway, I'm pretty rubbish at jQuery/JavaScript).
I'm pulling in data using standard AJAX (i.e. NOT using a framework like jQuery or whatnot... there is a reason for it).
However, I then need to load up a jQuery script as soon as the page has been loaded in. So, here is the question: how do I bind the script once the DOM has been updated? I have been using Ariel Flesler's listen plugin (http://flesler.blogspot.com/2007/10/jquerylisten.html) for picking up on events such as clicks, which works a treat, but I can't see how this can be used to listen for a load event.
Any ideas? I'm pretty stumped on this one!!
What's wrong with:
yourXMLHTTPRequest.onreadystatechange = function () {
  if (this.readyState === 4) { // 4 = DONE, the response has arrived
    doJQueryStuff();
  }
};
Also, what's the reason for not using the AJAX functionality provided in jQuery?
As an aside, it looks like the plugin you mention just replicates the functionality given by the live() and delegate() methods brought into jQuery in 1.4.
In order to run JavaScript loaded via AJAX, you'll need to use eval(). Although, as far as these things go, a lot of coders consider eval() to be, um, evil... (a link, another link).
Regardless of the rights and wrongs of eval, this thread on webdeveloper should get you a lot of the way towards a solution.
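A hedged sketch of that eval() approach: inject the fetched markup, then evaluate any inline script blocks it contained, since scripts inserted via innerHTML are not executed automatically (the URL and container id are hypothetical, and the usual eval caveats apply):

var xhr = new XMLHttpRequest();
xhr.open('GET', '/fragment.html', true); // hypothetical URL
xhr.onreadystatechange = function () {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var container = document.getElementById('target'); // hypothetical container
    container.innerHTML = xhr.responseText;
    // Run each inline script from the fragment by hand.
    [].forEach.call(container.getElementsByTagName('script'), function (s) {
      eval(s.textContent);
    });
  }
};
xhr.send();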

Get the response headers with prototype.js

Is there an easy way to pull out the response headers of a page with Prototype.js without using Ajax.Request?
You can't, for the current document (i.e. the one in which the executing JavaScript code was referenced). See various search results.
What's your use case for this? Maybe there's another way to accomplish what you need to do.
