I'm trying to console.log POST requests with my Chrome extension but can't figure out how to do it. Can anyone give me an example? I've looked at the Chrome extension API but still can't seem to get to it.
In Google Chrome, browser requests such as POST and GET are visible in the Network tab of the Inspector.
(Screenshot from the Chrome DevTools Overview.)
If you are looking for a natural way to hook into browser requests from JavaScript (for example, to log them), you will have a harder time: for security reasons, there is no native way for JavaScript to hook into requests at the browser's scale.
But if you are okay with using a dedicated extension for the job, you can look at the chrome.webRequest API for Chrome extensions:
https://developer.chrome.com/extensions/webRequest
Use the chrome.webRequest API to observe and analyze traffic and to intercept, block, or modify requests in-flight.
Here's an example of listening for the onBeforeRequest event:
chrome.webRequest.onBeforeRequest.addListener(callback, filter, opt_extraInfoSpec);
Beware, there are security requirements and limitations:
You must declare the "webRequest" permission in the extension manifest to use the web request API, along with host permissions for any hosts whose network requests you want to access.
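Putting this together, here is a minimal sketch (Manifest V2 style, assuming a background script plus the "webRequest" permission and host permissions in the manifest) that logs POST requests to the console:

// background.js — log every POST request the browser makes
chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    if (details.method === "POST") {
      // details.requestBody is populated because "requestBody" is requested below
      console.log("POST to", details.url, details.requestBody);
    }
  },
  { urls: ["<all_urls>"] }, // filter: listen to all URLs
  ["requestBody"]           // extraInfoSpec: include the request body
);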
Note that due to the ordered or asynchronous way web page resources load (HTTP/1.x or HTTP/2), you are not guaranteed to catch all the requests the browser made before your JavaScript hooks were set up.
Alternatively, there are some tricks, such as those referenced here, for detecting AJAX calls through JavaScript proxification mechanisms.
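As a rough illustration of that proxification idea, here is a minimal sketch that wraps XMLHttpRequest and fetch to log outgoing calls; it only catches requests made after it runs, so it must execute before the page's other scripts:

// Wrap XMLHttpRequest.open to log every XHR the page makes
const originalOpen = XMLHttpRequest.prototype.open;
XMLHttpRequest.prototype.open = function (method, url, ...rest) {
  console.log("XHR:", method, url);
  return originalOpen.call(this, method, url, ...rest);
};

// Wrap fetch the same way
const originalFetch = window.fetch;
window.fetch = function (input, init) {
  console.log("fetch:", (init && init.method) || "GET", input);
  return originalFetch.call(this, input, init);
};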
Another strategy would be to move the request detection to the server and inform the client that it sent a request (through WebSockets or queues, for example); a sketch of the client side follows below. It would only work for requests targeting domains you manage, though, and it sounds like a somewhat expensive solution. It all depends on what your ultimate needs are.
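For instance, the client could listen on a WebSocket and let the server push a notification whenever it observes a request (wss://example.com/notify and the message format are made up for illustration):

// Client side: the server pushes a message for each request it sees
const socket = new WebSocket("wss://example.com/notify");
socket.onmessage = function (event) {
  const info = JSON.parse(event.data);
  console.log("Server observed request:", info.method, info.url);
};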
Related
I have logic on my server that mostly performs curl requests (e.g. accessing social networks). However, some of the sites will be blocking my server's IPs soon.
I can, of course, use a VPN or deploy multiple servers per location, but it won't be accurate, and some of the networks might still block the user's account.
I am trying to find a creative solution to run it from the user's browser (it is OK to ask for permission, as it is an action the user is explicitly trying to execute). However, I am trying to avoid extra installations (e.g. downloadable plugins/extensions or a desktop app).
Is there a way to turn the client browser into a server proxy, to run those curl calls from the user's machine instead of sending them from my own server (e.g. using WebSockets, polling, etc.)?
It depends on exactly what sort of curl requests you are making. In theory, you could simulate these using an XMLHttpRequest. However, for security reasons, browsers generally do not allow these to access resources hosted on a different site. (Imagine the issues it could cause if merely visiting any website could make your browser send requests to Facebook to post messages on your behalf.)
Basically, it will depend on the cross-origin request policy of the social networks that you are querying. If the requests being sent are intended to be publicly available without authentication, then it is possible that your system will work; otherwise it will probably be blocked.
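To make the constraint concrete, here is a sketch of the browser-side equivalent of a simple curl GET, assuming a hypothetical endpoint https://api.example.com/data; whether it succeeds depends entirely on that server's CORS headers:

fetch("https://api.example.com/data")
  .then(function (response) { return response.json(); })
  .then(function (data) { console.log(data); })
  .catch(function (err) {
    // A cross-origin response without Access-Control-Allow-Origin
    // lands here: the browser blocks the page from reading it
    console.error("Blocked or failed:", err);
  });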
Sometimes, one needs to add special headers to each request, or to specific requests, made from a browser. The common approach is to use a browser extension that allows us to modify request headers. Is there another way to do this, without any browser extension?
PS - I have searched SO and not found a single post which actually suggests or shows how to do what I need.
Outside of the APIs designed to make custom HTTP requests (XMLHttpRequest and fetch), it is impossible for JavaScript embedded in a page to add arbitrary HTTP headers to the requests the browser makes.
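With those two APIs, though, setting a header on your own requests is straightforward; a minimal sketch (X-Custom-Header is a made-up name for illustration):

// This only affects requests your own script issues, not requests the
// browser makes by itself (navigation, images, stylesheets, ...)
fetch("/api/data", {
  headers: { "X-Custom-Header": "my-value" },
});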
If you control the websites on which you want this functionality, you could achieve it by having each application install a ServiceWorker. In a nutshell, service workers run as a proxy server within your browser. They can do things like notify you of updates even when you don't have the website open.
Within a ServiceWorker you are able to set up event listeners that can do some asynchronous task on behalf of the client app. This includes the fetch event which is fired every time the web page makes a request.
Here's a write-up by someone implementing a ServiceWorker who also needed to intercept network requests. You could follow most of it and just alter the logic when inspecting the request type. At that point you could add any special headers before dispatching on the application's behalf.
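A minimal sketch of such a fetch handler, restricted to GET requests for simplicity (X-Custom-Header is a hypothetical name):

// sw.js — registered from the page via navigator.serviceWorker.register("/sw.js")
self.addEventListener("fetch", function (event) {
  const request = event.request;
  if (request.method !== "GET") return; // let everything else hit the network as-is

  // Copy the original headers and add the extra one
  const headers = new Headers(request.headers);
  headers.set("X-Custom-Header", "my-value");

  event.respondWith(
    fetch(request.url, { headers: headers, credentials: "same-origin" })
  );
});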
There's no way to edit requests in an existing page without external tools. The most suitable tool is a browser extension, which can modify the existing DOM and the HTTP requests (XMLHttpRequest and fetch) made by JavaScript code.
If you are the owner of the website, there are countless ways to add headers to requests, and the right solution depends on which library you use to make them.
But in general, it's not recommended to modify website data that isn't yours. The browser extension you found is exactly the right tool for your problem. Hope this helps.
I'm working on a Google Chrome extension which gets the page URL and analyzes it. How can I intercept the browser request and serve that request conditionally based on some criteria? I've been searching but couldn't find any material.
That's going to be very tricky, if at all possible.
The closest thing the extension APIs provide is the blocking webRequest API. There, you can intercept a request and decide to allow it or block it (see the sketch after the two caveats below), but:
You can only do that before the request is sent out, so you can only rely on the URL and maybe the request headers. Even in later events (when it's too late to redirect), at no point does the webRequest API give you access to the response itself.
You have to make the decision synchronously, which severely limits your processing options.
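Here is a sketch of such a blocking listener, cancelling or redirecting based on nothing more than the URL (the "tracker" substring is an arbitrary example; in Manifest V2 this also requires the "webRequestBlocking" permission):

chrome.webRequest.onBeforeRequest.addListener(
  function (details) {
    if (details.url.includes("tracker")) {
      return { cancel: true }; // block this request outright
    }
    // Or redirect instead:
    // return { redirectUrl: "https://example.com/loading.html" };
  },
  { urls: ["<all_urls>"] },
  ["blocking"] // makes the listener synchronous and its return value binding
);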
What you could do (very much in theory) is always redirect the request to your own "loading" page, meanwhile trying to replicate the request yourself (near-impossible to do fully; also consider side effects), analyze the response, and then substitute the "loading" page with the real one.
It's going to be either very complicated or impossible to do in complex cases. You're basically trying to implement an intercepting proxy in a Chrome extension - it doesn't really provide the full toolset to do so.
I'm trying to log all the requests that sites in my browser make behind the scenes. I can do it manually using Chrome's developer tools or Firebug, but I want either (a) a quick JS snippet that I can bookmark and run on sites when I want to log requests, or (b) a Chrome/Firefox extension to do so. I found this thread asking roughly the same thing, but I want to catch AJAX requests too. How can I go about this?
Fiddler
http://www.telerik.com/fiddler
This application runs outside of your browser and inspects all data transmitted between your computer and the internet. It's what I use to debug application design, and I think it would be great for what you need.
Note that once running, it will automatically log all requests, and they can easily be saved for reviewing later. There are also loads of extensions to the application that may do the same for you.
Key Features
HTTP/HTTPS Traffic Recording
Fiddler is a free web debugging proxy which logs all HTTP(s) traffic between your computer and the Internet. Use it to debug traffic from virtually any application that supports a proxy like IE, Chrome, Safari, Firefox, Opera and more.
Web Session Manipulation
Easily manipulate and edit web sessions. All you need to do is set a breakpoint to pause the processing of the session and permit alteration of the request/response. You can also compose your own HTTP requests to run through Fiddler.
Web Debugging
Debug traffic from PC, Mac or Linux systems and mobile devices. Ensure the proper cookies, headers and cache directives are transferred between the client and server. Supports any framework, including .NET, Java, Ruby, etc.
Security Testing
Use Fiddler for security testing your web applications -- decrypt HTTPS traffic, and display and modify requests using a man-in-the-middle decryption technique. Configure Fiddler to decrypt all traffic, or only specific sessions.
Performance Testing
Fiddler lets you see the “total page weight,” HTTP caching and compression at a glance. Isolate performance bottlenecks with rules like “Flag any uncompressed responses larger than 25kb.”
Update:
Google Chrome Developer Tools (specifically the Network tab): you can easily see network traffic directly from the current web page and monitor all HTTP information, such as request and response headers, cookies, and timing.
Try to use jQuery Global Ajax Event Handlers
These methods register handlers to be called when certain events, such as initialization or completion, take place for any Ajax request on the page. The global events are fired on each Ajax request if the global property in jQuery.ajaxSetup() is true, which it is by default. Note: Global events are never fired for cross-domain script or JSONP requests, regardless of the value of global.
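For example, a sketch that logs every jQuery-issued Ajax request on the page using those global handlers:

// Fires just before each jQuery Ajax request is sent
$(document).ajaxSend(function (event, jqXHR, settings) {
  console.log("Ajax request:", settings.type, settings.url);
});

// Fires when each jQuery Ajax request completes (success or error)
$(document).ajaxComplete(function (event, jqXHR, settings) {
  console.log("Ajax complete:", settings.url, "status", jqXHR.status);
});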
Regarding source maps, I came across a strange behavior in Chromium (build 181620).
In my app I'm using minified jQuery, and after logging in, I started seeing HTTP requests for "jquery.min.map" in the server log file. Those requests were lacking cookie headers (all other requests were fine).
Those requests are not even exposed in the Network tab in Developer Tools (which doesn't bug me that much).
The point is, js files in this app are only supposed to be available to logged-in clients, so in this setup, the source maps either won't work or I'd have to change the location of source map to a public directory.
My question is: is this a desired behavior (meaning - source map requests should not send cookies) or is it a bug in Chromium?
The String InspectorFrontendHost::loadResourceSynchronously(const String& url) implementation in InspectorFrontendHost.cpp, which is called to load source map resources, uses the DoNotAllowStoredCredentials flag, which I believe results in the behavior you are observing.
This method is potentially dangerous, so this flag is there for us (you) to be on the safe side and avoid leaking sensitive data.
As a side note, serving jquery.min.js only to logged-in users (that is, not from a cookieless domain) is not a very good idea in a production environment. I'm not sure about the idea behind this, but if you definitely need to avoid giving the file to clients not visiting your site, you may resort to checking the Referer HTTP request header.
I encountered this problem and became curious as to why certain authentication cookies were not sent in requests for .js.map files to our application.
In my testing using Chrome 71.0.3578.98, if the SameSite cookie attribute is set to either strict or lax for a cookie, Chrome will not send that cookie when requesting the .js.map file. When there is no SameSite restriction, the cookie is sent.
I'm not aware of any specification of the intended behavior.
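For anyone wanting to reproduce this, a sketch of setting such a cookie (Node with Express assumed; the cookie name and value are illustrative):

const express = require("express");
const app = express();

app.get("/login", function (req, res) {
  // With sameSite "strict" or "lax", Chrome omitted this cookie from
  // the follow-up .js.map request in the test described above
  res.cookie("session", "abc123", { sameSite: "strict", httpOnly: true });
  res.send("logged in");
});

app.listen(3000);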