I was browsing a site called BSEINDIA.com (http://www.bseindia.com/stockreach/stockreach.htm?scripcd=532667) and noticed that clicking "Get Quote" seems to fire an Ajax request that fetches the price of the selected equity. I tried to isolate this request and fire it separately, but it doesn't seem to work.
I copied the code from the HTML of that same page (http://www.bseindia.com/stockreach/stockreach.htm?scripcd=532667).
Any pointers on why this is not working? Is there some sort of authentication going on? I am not even a member of this site.
The following is what I am trying to do:
<script type="text/javascript">
// Create the request object and fire the same GET the page uses.
var oHTTP = getHTTPObject();
// Random token sent as GUID, presumably to defeat caching.
var seconds = Math.random().toString(16).substring(2);

if (oHTTP) {
    oHTTP.open("GET", "http://www.bseindia.com/DotNetStockReachs/DetailedStockReach.aspx?GUID=" + seconds + "&scripcd=532667", true);
    oHTTP.onreadystatechange = AJAXRes;
    oHTTP.send(null);
}

function AJAXRes() {
    if (oHTTP.readyState == 4) alert(oHTTP.responseText);
}

function getHTTPObject() {
    var obj;
    try {
        obj = new ActiveXObject("Msxml2.XMLHTTP");
    } catch (e) {
        try {
            obj = new ActiveXObject("Microsoft.XMLHTTP");
        } catch (e1) {
            obj = null;
        }
    }
    if (!obj && typeof XMLHttpRequest != 'undefined') {
        try {
            obj = new XMLHttpRequest();
        } catch (e) {
            obj = false;
        }
    }
    return obj;
}
</script>
Found my answer here:
http://msdn.microsoft.com/en-us/library/system.net.httpwebrequest.referer%28VS.71%29.aspx
Actually, it is fairly easy. When you send an HTTP request, a header called Referer (that misspelling is how the HTTP spec defines it) gets sent with the request. The Referer is basically the URL of the page that initiated the request.
BSEINDIA checks the Referer value to make sure that the request is coming from their site. If it is, it sends the data. If not, it sends its 404 page.
You can easily test that theory by disabling the Referer in your browser. In Firefox, you can do that by typing about:config and setting network.http.sendRefererHeader to 0.
If you still want to get the data, you will need to write a script (in PHP or another language) which makes the request with the proper Referer and outputs the results.
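Purely as an illustration, here is roughly what that could look like in Node instead of PHP. The query string is copied from the question; whether the Referer is the only thing the site checks is an assumption.
// Hypothetical Node sketch: make the same GET server-side, sending a Referer
// header that points back at the BSEINDIA page (assumed to be the only check).
const http = require('http');

const guid = Math.random().toString(16).substring(2);
const path = '/DotNetStockReachs/DetailedStockReach.aspx?GUID=' + guid + '&scripcd=532667';

http.get({
  host: 'www.bseindia.com',
  path: path,
  headers: {
    Referer: 'http://www.bseindia.com/stockreach/stockreach.htm?scripcd=532667'
  }
}, function (res) {
  let body = '';
  res.on('data', chunk => body += chunk);   // accumulate the response
  res.on('end', () => console.log(body));   // print the quote HTML
});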
There might also be some form of IP restriction in place on the files/data, to stop third-party scripts from accessing their data through their own endpoints. That's what I'd do.
Possibly the HTTP Referer header. Make sure you do not break any copyright restrictions.
Currently I have a page where, when you fill out a text box and click a button, it redirects you to another page.
That page needs to be loaded, since it updates and shows XML. (I cannot currently change how this works.)
However, what I want to do is, after the page has been redirected once, redirect it again or just load another page in general.
The thing to note about the XML link is that part of it is built from the text box, so it will be dynamic.
I currently have something along the lines of this
// please note that username is a TextBox; I've just left its markup out
<script runat="server">
    void Button_Click(Object sender, EventArgs e)
    {
        var url = "http://website.com/scripts/" + username.Text + "/value/0";
        try
        {
            Response.Redirect(url, true);
        }
        catch (Exception ex)
        {
            // From what I learnt, passing true to Redirect throws an exception,
            // which is how I tried executing another redirect, but it doesn't seem
            // to load the first redirect and skips straight to this one. I also put
            // this in finally, because it seemed more appropriate, to no avail.
            Response.Redirect(someurl, true);
        }
    }
</script>
So I'm wondering if this is actually possible; I also wonder if I'm just looking up the wrong keywords to find a solution.
I've spent a bit of time on this and have yet to come to any sort of solution, but I'm new to web development, so I may just be missing something incredibly simple.
Also, I only really understand how C# works in ASP, but I am willing to learn how to add in JavaScript or VB if necessary.
Thanks in advance for the help.
Edit: Solution!
So I managed to use JavaScript to append the textbox value to the XML link and request it without showing anything to the user (showing the user is not necessary in this case).
After that, a popup confirms that it was successful and then the page reloads.
It is fairly self-explanatory, but what I did was:
var url = "website";                     // the dynamic XML link built from the textbox value
var xmlHttp = new XMLHttpRequest();
xmlHttp.open("GET", url, true);
xmlHttp.send();                          // without send() the request is never actually made
window.alert("success");                 // note: this fires immediately, not after the response
return true;                             // this reloads the page, or just use window.location.reload();
For an added check, I will see if I can verify that the username is a valid username, and popup with failure text if not.
You seem to have a misunderstanding about what Response.Redirect(...) actually does. The method name is, in my opinion, a bit misleading. It suggests that somehow the Response to the currently executing request will be sent somewhere else than the requesting browser. This is not the case. The name could as well have been Response.SendRedirectResponseToBrowser, because that's what Response.Redirect does.
So when you do Response.Redirect(url) you are telling the server that is executing your page that it should send a response to the browser, telling the browser to do a GET request for the supplied url. The browser will then do that, at which point that page needs to send a separate redirect in order to tell the browser where to go next.
In this case, then, the page at "http://website.com/scripts/" + username.Text + "/value/0" needs to be patched up so that after processing the request it also sends a redirect response with the url you want to display next.
If you have no control over that page, then you must solve this some other way. Some options:
Use ajax to request the "http://website.com/scripts/" + username.Text + "/value/0" url. Then, after completion, set the page location to the url you want to show next (see the sketch after this list).
Open the http://website.com/.... url in a _blank target, then set the location to the next page.
Use System.Net.Http.HttpClient in your code behind method to request the http://website.com/.... url, then do a redirect. This means that the server requests the url as part of processing the button click.
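A rough sketch of the first option with plain XMLHttpRequest. The element id 'username', the target URL pattern and 'nextpage.aspx' are placeholders taken from the question, not real endpoints.
// Hypothetical client-side sketch of option 1: fire the request, then navigate.
var username = document.getElementById('username').value;
var url = 'http://website.com/scripts/' + encodeURIComponent(username) + '/value/0';

var xhr = new XMLHttpRequest();
xhr.open('GET', url, true);
xhr.onload = function () {
  // Only move on once the scripts page has actually been hit.
  window.location.href = 'nextpage.aspx';   // placeholder for the page you want to show next
};
xhr.send();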
Notes:
If the http://website.com/.... url updates some state (like store some changes in a database or similar), then you should request it using a POST request, not a GET. GET requests can get a cached response which means that the server might never actually see the request, and therefore not do any processing.
Piecing together the url like "http://website.com/scripts/" + username.Text + "/value/0" looks risky. You should at the very minimum URL-encode the username: HttpUtility.UrlEncode(username.Text). Better yet would be to first validate that the entered username is actually a valid user name.
You can add a Refresh header (not a meta-refresh element) to the response that contains the XML. In the header, you can specify another URL and the number of seconds to wait before redirecting.
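For reference, the raw header in that XML response would look roughly like this (the delay and target URL are placeholders):
Refresh: 0; url=http://website.com/nextpage.aspx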
I guess this should be handled with JavaScript (front end) rather than back-end error handling, because it goes to another page. Use a promise to handle the exception.
I am analyzing code from an HTML page, and saw that when the page is loaded, there will be a POST request like the following:
$.post('/video_info/html5',{v:video_id},function(data){
//it does something here with data
},'html');
The page URL is in format: https://example.com/watch?v=123#video=456.
So then, I tried to use Postman to send a POST request to https://example.com/video_info/html5 with the parameter v = video_id (for example: The_Lord_of_the_Rings_2002), but I don't get any response, while the code above does (I used Firebug and could see the response).
Is there something wrong in the URL: https://example.com/video_info/html5 ?
Is there something wrong in the URL: https://example.com/video_info/html5 ?
No, that's the correct resolution of the URL in the post (ajax) call relative to the page URL you've described.
Ajax requests carry headers and such that may make them look different from other requests. The site in question may also be looking at the referer (sic) and/or keeping server-side track of the fact that page X was just loaded from IP 1.2.3.4, and so it's okay to reply to the request for the video.
People sometimes go to great lengths to protect their content from being used except in the ways they want it used.
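If you want to experiment, one hedged guess is to replay the headers the page's own call would carry: jQuery adds an X-Requested-With header, and the browser adds a Referer. You can set these manually in Postman, or in Node (18+); whether these are the headers the server actually checks is not certain.
// Hypothetical sketch (Node 18+; browsers won't let page script set Referer).
// Whether these are the headers the server actually checks is a guess.
fetch('https://example.com/video_info/html5', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
    'X-Requested-With': 'XMLHttpRequest',          // jQuery adds this to its ajax calls
    'Referer': 'https://example.com/watch?v=123'   // the page the real call comes from
  },
  body: new URLSearchParams({ v: 'The_Lord_of_the_Rings_2002' })
})
  .then(res => res.text())
  .then(console.log);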
Is it possible to make an http request that has been sent to a server by the browser fail without having to alter the javascript?
I have a POST request that my website is sending to the server and we are trying to test how our code reacts when the request fails (e.g. an HTTP 500 response). Unfortunately, the environment that I need to test it in has uglified and compressed javascript, so inserting a breakpoint or altering the javascript isn't an option. Is there a way for us to utilize any browser to simulate a failed request?
The request takes a long time to complete, so using the browser's console to run a javascript command is a possibility.
I have tried using window.stop(); however, this does not work since I need the failure code to execute.
I am aware of the option of setting up a proxy server, but would like to avoid this if possible.
In Chrome (just checked v63), you can actually block a specific URL (or even a whole domain) from the Network tab. You only need to right-click on the entry and select Block request URL (or Block request domain.)
One possible solution is to modify the XMLHttpRequest objects that will be used by the browser. Running this code in a javascript console will cause all future AJAX calls on the page to be redirected to a different URL (which will probably give a 404 error):
XMLHttpRequest.prototype._old_open =
XMLHttpRequest.prototype._old_open || XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function(method, url, async, user, pass) {
return XMLHttpRequest.prototype._old_open.call(
this, method, 'TEST-'+url, async, user, pass);
};
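If you later want to undo the override in the same console session, you can put the saved original back:
// Restore the unmodified open() that was saved above.
XMLHttpRequest.prototype.open = XMLHttpRequest.prototype._old_open;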
Don't overlook the simplest solution: disconnect your computer from the Internet, and then trigger the AJAX call.
Chrome's dev tools have an option to "beautify" (i.e. re-indent) minified JavaScript (press the "{}" button at the bottom left). This can be combined with the "XHR breakpoint" option to break when the request is made. XHR breakpoints don't support modifying the response though AFAIK, but you should be able to find a way to do it via code.
To block a specific URL and make an API call fail, you just need to follow the steps below:
Go to the Network tab in your browser's dev tools.
Find the API call that needs to fail (as per your requirement).
Right-click on that API call.
Click 'Block Request URL'. You can unblock it in the same manner, as the option will turn into 'Unblock'.
Just type a changed URL into the browser: take the well-formed URL, e.g. http://thedomain.com/welcome/, and alter it by adding "XX": http://thedomain.com/welcomeXX/. That will cause a 404 (Not Found) error.
I am calling window.location.href = "some url";
I want to check whether that URL opens or not. Meaning, if the URL is wrong, how do I check that in JavaScript so that I can redirect to another URL?
Thanks.
You can't, not really anyway. Once you change the location, the browser will unload your page and begin to load the new one, there's no way to find out where the browser is going or if it failed to reach the target page.
If the URL is on the same domain you could check to see if it returns a status 200 with an AJAX request before setting the location, but if it's on another domain then you're out of luck due to the same origin policy.
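A minimal sketch of that same-origin check; '/some/path' and '/fallback' are placeholder URLs.
// Hypothetical sketch: probe a same-origin URL before navigating to it.
var target = '/some/path';         // placeholder for the URL you want to open
var xhr = new XMLHttpRequest();
xhr.open('HEAD', target, true);    // HEAD is enough to read the status code
xhr.onload = function () {
  window.location.href = (xhr.status === 200) ? target : '/fallback';
};
xhr.onerror = function () {
  window.location.href = '/fallback';   // network error: go to the fallback page
};
xhr.send();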
You need to use a server side component for this to work.
Expose an ajax service that will perform a HEAD on the given url and return a status.
Mind that this service needs to be protected to avoid being used for DDOS.
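The answer doesn't name a platform; purely as an illustration, such a service might look like this in Node/Express (Express, the /check route, and Node 18+'s global fetch are assumptions). The allow-list is one way to keep it from being abused.
// Hypothetical Node/Express sketch of the HEAD-checking service.
const express = require('express');
const app = express();

const ALLOWED_HOSTS = ['www.example.com'];   // protect the service: only probe known hosts

app.get('/check', async (req, res) => {
  let target;
  try {
    target = new URL(req.query.url);         // e.g. /check?url=https://www.example.com/page
  } catch {
    return res.json({ ok: false, status: 0 });
  }
  if (!ALLOWED_HOSTS.includes(target.hostname)) {
    return res.status(403).json({ ok: false });
  }
  try {
    const head = await fetch(target, { method: 'HEAD' });
    res.json({ ok: head.ok, status: head.status });
  } catch {
    res.json({ ok: false, status: 0 });
  }
});

app.listen(3000);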
Here's the problem:
1.) We have page here... www.blah.com/mypage.html
2.) That page requests a JS file from www.foo.com like this...
<script type="text/javascript" src="http://www.foo.com/jsfile.js"></script>
3.) "jsfile.js" uses Prototype to make an Ajax request back to www.foo.com.
4.) The ajax request calls www.foo.com/blah.html. The callback function gets the html response and throws it into a div.
This doesn't seem to work, though; I guess it is a cross-site (XSS) issue. Is that correct?
If so, how can I solve this problem? Is there any other way to get my HTML from www.foo.com to www.blah.com on the client without using an iframe?
It is the browser's same-origin policy (often loosely called XSS in this context), and it is forbidden. You should really not do things that way.
If you really need to, make your AJAX code call local code (PHP, ASP, whatever) on blah.com and have that code behave like a client, fetching whatever you need from foo.com and returning it to the browser. If you use PHP, you can do this with fopen('http://www.foo.com/blah.html', 'r') and then reading the contents as if it were a regular file.
Of course, allow_url_fopen needs to be enabled in your php.ini.
There is a W3C proposal (what eventually became CORS) for allowing sites to specify which other sites are allowed to make cross-site requests to them. (Wikipedia might want to allow all requests for articles, say, but Google Mail wouldn't want to allow requests, since that might let any website open while you are logged into Google Mail read your mail.)
This might be available at some point in the future.
As mentioned above, JSONP is a way around this. However, the site you are requesting the data from needs to support JSONP in order for you to use it on the client. (JSONP essentially injects a script tag into the page and provides a callback function that is called with the results.)
If the site you are making a request to does not support JSONP, you will have to proxy the request on your server. As mentioned above, you can do this on your own server, or what I have done in the past is use http://www.jsonpit.com, which will proxy the request for you.
One option is to implement a proxy page which takes the needed url as a parameter. e.g. http://blah.com/proxy?uri=http://foo.com/actualRequest
JSONP was partially designed to get around the problem you are having
http://ajaxian.com/archives/jsonp-json-with-padding
jQuery has it in its $.getJSON method
http://docs.jquery.com/Ajax/jQuery.getJSON
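For example, a minimal sketch with jQuery's getJSON. This only works if foo.com's endpoint actually supports JSONP and wraps its JSON in the supplied callback name; the URL and the "html" field are made up.
// Hypothetical JSONP sketch: "callback=?" tells jQuery to use JSONP.
$.getJSON('http://www.foo.com/blah?callback=?', function (data) {
    // jQuery injects a <script> tag; the response runs as script and
    // calls the generated callback with the parsed data.
    $('#target').html(data.html);   // "html" is a made-up field name
});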
The method shown above could become a large security hole.
Suggest you verify the site name against a white list and build the actual URI being proxied on the server side.
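For example, a sketch of that whitelisted proxy in Node/Express (the stack and route are assumptions; the original suggestion is language-agnostic, and the earlier answers use PHP/ASP):
// Hypothetical Node/Express sketch of the whitelisted proxy idea above.
const express = require('express');
const app = express();

const ALLOWED_HOSTS = ['www.foo.com'];   // white list of sites we will proxy

app.get('/proxy', async (req, res) => {
  let target;
  try {
    target = new URL(req.query.uri);     // e.g. /proxy?uri=http://www.foo.com/blah.html
  } catch {
    return res.status(400).send('Bad uri');
  }
  if (!ALLOWED_HOSTS.includes(target.hostname)) {
    return res.status(403).send('Host not allowed');
  }
  const upstream = await fetch(target);  // server-side request, no same-origin limits
  res.status(upstream.status).send(await upstream.text());
});

app.listen(3000);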
For cross-domain hits this is a good working example, and it is now considered somewhat "standard": http://www.xml.com/pub/a/2005/12/21/json-dynamic-script-tag.html.
There are other ways as well, for example injecting iframes with document.domain altered:
http://fettig.net/weblog/2005/11/28/how-to-make-xmlhttprequest-connections-to-another-server-in-your-domain/
I still agree that the easy way is calling a proxy on the same domain, but then it's not a truly client-side WS call.