How can I monitor outgoing requests from my browser in JavaScript?

I'm trying to log all the requests that sites in my browser make behind the scenes. I can do it manually using Chrome's DevTools or Firebug, but I want either (a) a quick JS bookmarklet that I can run on sites when I want to log requests, or (b) a Chrome/Firefox extension that does so. I found this thread asking roughly the same thing, but I want to catch AJAX requests too. How can I go about this?

Fiddler
http://www.telerik.com/fiddler
This application runs outside of your browser to inspect all data transmitted between your computer and the internet. It's what I use to debug application design and I think it would be great for what you need.
Note that once it is running it will automatically log all requests, and they can easily be saved for later review. There are also loads of extensions for the application that may do the same for you.
Key Features
HTTP/HTTPS Traffic Recording
Fiddler is a free web debugging proxy which logs all HTTP(s) traffic between your computer and the Internet. Use it to debug traffic from virtually any application that supports a proxy like IE, Chrome, Safari, Firefox, Opera and more.
Web Session Manipulation
Easily manipulate and edit web sessions. All you need to do is set a breakpoint to pause the processing of the session and permit alteration of the request/response. You can also compose your own HTTP requests to run through Fiddler.
Web Debugging
Debug traffic from PC, Mac or Linux systems and mobile devices. Ensure the proper cookies, headers and cache directives are transferred between the client and server. Supports any framework, including .NET, Java, Ruby, etc.
Security Testing
Use Fiddler for security testing your web applications -- decrypt HTTPS traffic, and display and modify requests using a man-in-the-middle decryption technique. Configure Fiddler to decrypt all traffic, or only specific sessions.
Performance Testing
Fiddler lets you see the “total page weight,” HTTP caching and compression at a glance. Isolate performance bottlenecks with rules like “Flag any uncompressed responses larger than 25kb.”
Update:
Google Chrome's Developer Tools (specifically the Network tab) let you easily see network traffic directly from the current webpage and monitor all HTTP information, such as request and response headers, cookies, and timing.

Try using the jQuery Global Ajax Event Handlers:
These methods register handlers to be called when certain events, such as initialization or completion, take place for any Ajax request on the page. The global events are fired on each Ajax request if the global property in jQuery.ajaxSetup() is true, which it is by default. Note: Global events are never fired for cross-domain script or JSONP requests, regardless of the value of global.
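For example, a minimal sketch of logging with these handlers, assuming jQuery is loaded on the page (since jQuery 1.9 the global handlers must be attached to document):

// Log every jQuery Ajax request as it is sent.
$(document).ajaxSend(function (event, jqXHR, settings) {
  console.log('Ajax request:', settings.type, settings.url);
});

// Log completions, including the HTTP status.
$(document).ajaxComplete(function (event, jqXHR, settings) {
  console.log('Ajax complete:', settings.url, 'status:', jqXHR.status);
});

Keep in mind this only catches requests made through jQuery itself; plain XMLHttpRequest or fetch calls bypass the global handlers.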

Related

Turning your browser into a proxy server

I have logic on my server that mostly makes curl requests (e.g. accessing social networks). However, some of the sites will be blocking my servers' IPs soon.
I can, of course, use a VPN or deploy multiple servers per location, but it won't be accurate, and some of the networks might still block the user's account.
I am trying to find a creative solution to run it from the user's browser (it is OK to ask for the user's permission, as it is an action they are explicitly trying to execute), though I am trying to avoid extra installations (e.g. downloadable plugins/extensions or a desktop app).
Is there a way to turn the client browser into a proxy for the server, so those curl calls run from the user's machine instead of being sent from my own server (e.g. using WebSockets, polling, etc.)?
It depends on exactly what sort of curl requests you are making. In theory, you could simulate these using an XMLHttpRequest. However, for security reasons these are generally not allowed to access resources hosted on a different site. (Imagine the sort of issues it could cause for instance if visiting any website could cause your browser to start making requests to Facebook to send messages on your behalf.) Basically it will depend on the Cross-origin request policy of the social networks that you are querying. If the requests being sent are intended to be publicly available without authentication then it is possible that your system will work, otherwise it will probably be blocked.
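For illustration, a minimal sketch of such a browser-side request; the endpoint is a hypothetical placeholder, and whether it succeeds depends entirely on the target's CORS headers:

// Attempt a cross-origin request from the user's browser.
// https://social.example.com is a hypothetical endpoint.
fetch('https://social.example.com/public-data')
  .then(function (response) {
    if (!response.ok) { throw new Error('HTTP ' + response.status); }
    return response.json();
  })
  .then(function (data) { console.log('Fetched:', data); })
  .catch(function (err) {
    // Without Access-Control-Allow-Origin from the server,
    // the browser blocks the response and you land here.
    console.error('Blocked or failed:', err);
  });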

How do I console.log POST requests with my Chrome extension?

I'm trying to console.log POST requests with my Chrome extension but can't figure out how to do it. Can anyone give me an example? I've looked at the Chrome extension API but still can't seem to get it working.
In Google Chrome, browser requests such as POST and GET are visible in the Network tab of the Inspector.
(Screenshot: the Network tab, from the Chrome DevTools overview.)
If you are looking for a natural way to hook into browser requests from JavaScript (for example, to log them), you will have more trouble: for security reasons, there is no native way for JavaScript to hook into requests at the browser's scale.
But if you are okay with using a dedicated extension for the job, have a look at the webRequest API for Chrome:
https://developer.chrome.com/extensions/webRequest
Use the chrome.webRequest API to observe and analyze traffic and to intercept, block, or modify requests in-flight.
Here's an example of listening for the onBeforeRequest event:
chrome.webRequest.onBeforeRequest.addListener(callback, filter, opt_extraInfoSpec);
Beware, there are security requirements and limitations:
You must declare the "webRequest" permission in the extension manifest to use the web request API, along with host permissions for any hosts whose network requests you want to access.
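Putting that together, a minimal background-script sketch that logs POST requests, assuming the manifest declares the "webRequest" permission and host permissions such as "<all_urls>":

// Log every POST request the browser makes.
// (Logs appear in the extension's background page console.)
chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    if (details.method === 'POST') {
      console.log('POST to', details.url, details.requestBody);
    }
  },
  { urls: ['<all_urls>'] },  // filter: match every URL
  ['requestBody']            // needed to expose details.requestBody
);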
Note that due to the ordered or asynchronous nature of webpage resource loading (HTTP/1.x or HTTP/2.0), you are not guaranteed to catch requests the browser made before your JavaScript hooks were set up.
Alternatively, there are some tricks, such as those referenced here, for detecting AJAX calls through JavaScript proxification mechanisms, as sketched below.
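One such proxification sketch wraps XMLHttpRequest.prototype.open; it only sees requests issued after it runs:

(function () {
  var originalOpen = XMLHttpRequest.prototype.open;
  XMLHttpRequest.prototype.open = function (method, url) {
    console.log('XHR:', method, url);  // log every Ajax call
    return originalOpen.apply(this, arguments);
  };
})();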
Another strategy would be to offload the request detection to the server and notify the client that it sent a request (through WebSockets or queues, for example). It would only work for requests targeting domains you manage, though, and it sounds like a somewhat expensive solution. It all depends on what your ultimate needs are.

Can Fiddler's HTTP packet tampering be replicated with Javascript?

I want to capture particular HTTP requests from a Flash game and then alter the HTTP responses it receives. I can currently do this using Fiddler, but I want to write some Javascript that achieves the same programmatically. Is it possible to capture and alter HTTP browser traffic with JS like this?
Regarding my motivations, I am part of a community who enjoy playing an ancient Flash game. Part of the game involves uploading your own levels to the game's server. Unfortunately, this is broken - when you request the level from the server via the game, the server always reports failure, presumably due to no longer being maintained. So, in order to play our levels, we are using Fiddler to capture the game's HTTP requests that ask the server for the level data and then altering the server failure response by inserting our level data. I am trying to automate this process on a webpage.
Is HTTP packet sniffing feasible in Javascript? Or will we continue to be limited by native desktop solutions like Fiddler?
Web-based proxies are totally a thing. In the same manner that your current solution uses Fiddler as an intermediary between your web browser and the game server, a website can act as an intermediary between your browser and another website by simply making HTTP requests itself and then sending the modified results to the user.
To diagram:
Browser -> Fiddler -> WebPage (Game) -> Fiddler -> Browser
...is roughly equivalent to...
Browser -> WebPage (Proxy Server) -> WebPage (Game) -> WebPage (Proxy Server)-> Browser
And you could in theory write your proxy server entirely in javascript (see: full stack javascript)!
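As a rough sketch of that idea in Node.js, where the game host and the response rewrite are hypothetical placeholders:

var http = require('http');
var https = require('https');

http.createServer(function (clientReq, clientRes) {
  // Forward the request to the game server (hypothetical host).
  https.get('https://game.example.com' + clientReq.url, function (gameRes) {
    var body = '';
    gameRes.on('data', function (chunk) { body += chunk; });
    gameRes.on('end', function () {
      // Swap the server's failure response for our own level data.
      var modified = body.replace('FAILURE', 'LEVEL_DATA_HERE');
      clientRes.writeHead(gameRes.statusCode, { 'Content-Type': 'text/plain' });
      clientRes.end(modified);
    });
  });
}).listen(8080);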
But based on the fact that you ask about javascript specifically, I'm going to guess that you are not interested in your proxy page having a meaningful back end. This may be a problem. If you would like your proxy website to be entirely client side javascript, your diagram suddenly looks more like this:
Browser -> WebPage (Proxy Server) -> Browser -> WebPage (Game) -> Browser -> WebPage (Proxy Server)-> Browser
This is a problem because web browsers take steps to prevent this behavior by default (see: Same Origin Policy). Most client-side JavaScript proxy solutions I can imagine violate the Same Origin Policy to a significant degree (if you have control of the site serving the game you could look into CORS headers or JSONP requests, but it doesn't sound like this is an option).
If you can engineer a solution that doesn't violate the same-origin policy, you may succeed with an entirely client-side solution. In that case, I would recommend looking into async calls as a starting point (see: jQuery AJAX).
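For instance, a minimal sketch of that starting point, assuming the game server sends permissive CORS headers (the URL is a hypothetical placeholder):

$.ajax({
  url: 'https://game.example.com/levels/42',
  method: 'GET',
  success: function (data) {
    // Substitute our own level data before handing it to the game.
    console.log('Level data received:', data);
  },
  error: function (jqXHR) {
    // Same-origin policy violations surface here as failed requests.
    console.error('Request failed:', jqXHR.status);
  }
});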

CORS with IE11+ Access Denied with SSL to localhost

The Very Short Version: is anybody successfully requesting local resources via AJAX, in IE, over SSL? I cannot get past an "access denied" error.
The Longer Version:
I am using AJAX to retrieve JSON from an application that runs a local web service. The web service channel is encrypted so that if the remote site is being served over HTTPS, no "insecure resource on a secure page" errors appear.
So, in the address bar is a remote site of some sort... mysite.com. It is receiving information from https://localhost/.
The web service is setting correct headers for CORS and everything works in Chrome and Firefox. In IE, if I put my https://localhost resource into the address bar, the correct resource is returned and displayed. However, when using AJAX (not just the address bar), a security setting in IE is denying access. This is documented (in part) here:
Access denied in IE 10 and 11 when ajax target is localhost
The only proper solution among the replies is to add the requesting domain (mysite.com in this case) to the trusted sites. This works, but we would prefer to avoid user intervention... pointing to a knowledge-base article on how to add a trusted site is hardly a great user experience. The other replies to that question are invalid for the same reasons as explained below.
Some more stumbling around and I discovered this:
CORS with IE, XMLHttpRequest and ssl (https)
It had a reply containing a wrapper for AJAX requests in IE. It seemed promising, but as it turns out, IE11 has now deprecated the XDomainRequest API. This was probably the right thing for Microsoft to do, but now the "hack" workaround of adding a void onProgress handler to the XDR object is obviously not an option, and the once-promising workaround wrapper is rendered null and void.
Has anybody come across either:
a) a way to get those requests through without needing to modify the trusted sites in IE? In other words, an updated version of the workaround in the second link?
b) as a "next best" case: a way to prompt the user to add the site to their trusted zone? "mysite.com wishes to be added to your trusted zones. Confirm Yes/No" and have it done, without them actually needing to open up their native settings dialogues and doing it manually?
For security reasons, Internet Explorer's XDomainRequest object blocks access (see #6 here) to the Intranet Zone from the Internet Zone. I would not be surprised to learn that this block was ported into the IE10+ CORS implementation for the XMLHttpRequest object.
One approach which may help is to simply change from localhost to 127.0.0.1 as the latter is treated as Internet Zone rather than Intranet Zone and as a consequence the zone-crossing is avoided.
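In practice that is just a change to the request URL, for example (the endpoint path is a hypothetical placeholder):

var xhr = new XMLHttpRequest();
xhr.open('GET', 'https://127.0.0.1/service/data', true); // was https://localhost/...
xhr.withCredentials = true; // still needed to send cookies with CORS
xhr.onload = function () { console.log(xhr.responseText); };
xhr.send();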
However, you should be aware that Internet Explorer 10+ will block all access to the local computer (via any address) when a site is running in Enhanced Protected Mode (EPM)-- see "Loopback blocked" in this post. Currently, IE uses EPM only for Internet sites when run in the Metro/Immersive browsing mode (not in Desktop) but this could change in the future.
No, there's no mechanism to show the Zones-Configuration UI from JavaScript or to automatically move a site from one zone to another. However, the fact that you have a local server implies that you are running code on the client already, which means you could use the appropriate API to update the Zone Mapping on the client. Note that such a change requires that you CLEARLY obtain user permission first, lest your installer be treated as malware by Windows Defender and other security products.
So, in summary, using the IP address should serve as a workaround for many, but not all platforms.
Since those are two different domains, one solution would be to create an application which proxies the requests in the direction you want.
If you have control over the example.com end and want to support users who bring their own localhost service, this is harder, as you would have to impose more requirements on what they bring.
If, however, you have control over what runs on localhost, and want to access example.com and have it reach the localhost service, set up a redirect in your web server of preference, or use a reverse proxy. You could add an endpoint to the same localhost app that doesn't overlap existing paths, for example routing http://localhost/proxy/%1 to http://%1 and leaving the rest of localhost alone. Alternatively, run a proxy on e.g. http://localhost:8080 which performs a similar redirection and can serve example.com from one path and the API from another.
This winds up being a type of "glue" or integration code, which should allow you to mock interactions up to a point.
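A minimal sketch of that /proxy/ endpoint idea, here using Express (an assumed dependency) and the global fetch available in Node 18+:

var express = require('express');
var app = express();

// Route http://localhost:8080/proxy/<host>/<path> to https://<host>/<path>.
app.get('/proxy/:host/*', function (req, res) {
  var target = 'https://' + req.params.host + '/' + req.params[0];
  fetch(target)
    .then(function (upstream) {
      return upstream.text().then(function (body) {
        res.status(upstream.status).send(body);
      });
    })
    .catch(function (err) { res.status(502).send(String(err)); });
});

app.listen(8080);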

source map HTTP request does not send cookie header

Regarding source maps, I came across a strange behavior in chromium (build 181620).
In my app I'm using minified jQuery, and after logging in I started seeing HTTP requests for "jquery.min.map" in the server log file. Those requests lacked cookie headers (all other requests were fine).
Those requests are not even shown in the Network tab in Developer Tools (which doesn't bug me that much).
The point is, the JS files in this app are only supposed to be available to logged-in clients, so in this setup the source maps either won't work or I'd have to move them to a public directory.
My question is: is this a desired behavior (meaning - source map requests should not send cookies) or is it a bug in Chromium?
The String InspectorFrontendHost::loadResourceSynchronously(const String& url) implementation in InspectorFrontendHost.cpp, which is called for loading sourcemap resources, uses the DoNotAllowStoredCredentials flag, which I believe results in the behavior you are observing.
This method is potentially dangerous, so this flag is there for us (you) to be on the safe side and avoid leaking sensitive data.
As a side note, serving jquery.min.js only to logged-in users (that is, not from a cookieless domain) is not a very good idea in a production environment. I'm not sure about your idea behind this, but if you definitely need to avoid giving the file to clients not visiting your site, you may resort to checking the Referer HTTP request header.
I encountered this problem and became curious as to why certain authentication cookies were not sent in requests for .js.map files to our application.
In my testing using Chrome 71.0.3578.98, if the SameSite cookie attribute is set to either Strict or Lax for a cookie, Chrome will not send that cookie when requesting the .js.map file. When there is no SameSite restriction, the cookie is sent.
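For illustration, here is how a server might set such cookies, as an Express-style sketch with hypothetical cookie names; in the tests above, only the cookie without a SameSite restriction accompanied Chrome's .js.map request:

var express = require('express');
var app = express();

app.get('/login', function (req, res) {
  // Omitted from Chrome's .js.map request in the tests above:
  res.cookie('strictSession', '1', { sameSite: 'strict', secure: true });
  // Sent with the .js.map request (no SameSite restriction):
  res.cookie('plainSession', '2', { secure: true });
  res.send('logged in');
});

app.listen(3000);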
I'm not aware of any specification of the intended behavior.
