Cypress: Add cookie to external api calls from localhost frontend - javascript

I have an external API deployed to a dev server and a frontend app running on localhost. I need to attach the cookies I get from logging in to both my localhost and external API domains.
I can see the cookies are indeed there using cy.getCookies({ domain: null }), however all external API calls within the React app happen without the cookies. From what I can see, you can only add cookies to baseUrl requests, but I want to run my tests against a deployed backend, so developers don't need a running local backend instance (which is what's currently working fine, since both run on localhost).
cy.setCookie(name, value, { domain: 'localhost' })
cy.setCookie(name, value, { domain: externalApiDomain })

If there is a request originating from the app to the back-end that needs a specific cookie attached, an intercept may be able to attach it.
This is untested & on-the-fly code, may need adjusting
cy.getCookie(name).then((cookie) => {
  cy.intercept('POST', myUrl, (req) => {
    req.headers['Cookie'] = `${cookie.name}=${cookie.value}`;
    req.continue();
  });
});
(Note: cy commands can't run inside an intercept handler, so the cookie is read first and the intercept is registered inside the .then().)
Assumptions
I'm assuming the frontend initiates the request that the backend then passes on to another server, and backend gets all info (including cookies) from the frontend request.
Debugging
You can check the backend phase is working with cy.request(). Ultimately you want the FE to do it, but in case the problem lies in the FE and not the test, cy.request() can be useful.
For reference: Cypress request and cookies
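To sketch that debugging step (untested, Cypress-runner-only; the cookie name and endpoint URL are placeholders, not from the question), you can read the cookie and pass it to cy.request explicitly:

```javascript
// Placeholder cookie name and endpoint -- adjust to your app.
cy.getCookie('session_id').then((cookie) => {
  cy.request({
    method: 'GET',
    url: 'https://external-api.example.com/me',
    headers: {
      // Attach the session cookie manually for the cross-origin call.
      Cookie: `${cookie.name}=${cookie.value}`,
    },
  })
    .its('status')
    .should('eq', 200);
});
```

If this passes but the FE-initiated call still fails, the problem is in the app's requests rather than the test setup.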

Related

neo4j server side javascript

I have a Neo4j Desktop (1.4.3) database on my Windows PC. In an HTML page, I am connecting to the DB using
const driver = neo4j.driver("bolt://IP_ADDRESS:7687", neo4j.auth.basic("neo4j", "PASSWORD"));
After that I query the DB and display the results on the web page (I use leafletjs maps, but this is not the issue)
var session = driver.session();
session
  .run(`MATCH....etc.... return ....`)
  .subscribe({
    ...... etc
Everything is fine so far. I run the page on my PC or from another PC on my home network and everything works. The Neo4j setting is dbms.default_listen_address=0.0.0.0, so no issues there.
The question is how do I expose this page to the colleagues outside my network?
Using noip.com, I got a temporary domain mapped to my external IP.
I also configured the router to forward port 80.
But when the page's JavaScript gets loaded on an external client, it tries to connect to Neo4j from that client. When I put the external IP address in "const driver ...", the connection does not work.
How do I make the connection to the DB from my server, but the queries to the DB come from the client who loaded the Javascript?
Edit: Forgot to mention that I am also using Apache Web Server (Xampp) to serve the page to remote users.
A simple architecture that does what you want, and also mitigates the risk of opening your database up to everyone, uses an HTTP server + API that are accessible via your noip provider.
Your public-facing frontend (HTML + JavaScript for making API calls, etc.) makes HTTP(S) calls to your publicly accessible API (for example a Node.js server), which in turn makes the database calls. Cypher / a direct database connection to Neo4j has no place in your users' browsers.
You can also use a starter like the GRANDstack.

Fetch request to localhost with basic auth

Is it possible for a webpage, like https://example.com, to perform an HTTP request to localhost?
This is an uncommon situation. Let me explain:
My visitors have a local server running. They can access their server with a request to http://localhost:12039/wallet, which returns JSON.
When a visitor opens my page (at example.com), could the JS on that page perform a request to their local server?
To make it even more complicated: the server doesn't add CORS headers, and it also requires Basic Auth.
From Postman, everything works as expected. But not from the browser (if not called from localhost).
I did some tests, but it's not working yet.
I'm not able to send the Authorization header. When I run this code, there's a prompt from the browser asking for a username and password.
It's really hard to Google these terms. Most of the results are related to localhost calls to your own API during development.
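One way to avoid the browser's native login prompt is to send the credentials pre-emptively instead of waiting for the 401 challenge. A sketch (the credentials are placeholders). Note this alone won't fix the cross-origin problem: without CORS headers on the local server, the browser will still block the response from a page on example.com.

```javascript
// Placeholder credentials for the local wallet server.
const user = "user";
const pass = "secret";

// Send Authorization up front instead of waiting for the 401 challenge,
// which is what triggers the browser's built-in login prompt.
const authHeader = "Basic " + btoa(`${user}:${pass}`);

fetch("http://localhost:12039/wallet", {
  headers: { Authorization: authHeader },
})
  .then((res) => res.json())
  .then((wallet) => console.log(wallet))
  .catch((err) => console.error("request failed:", err));
```

For this to work from example.com, the local server would also have to answer the CORS preflight (Access-Control-Allow-Origin plus Access-Control-Allow-Headers: Authorization), since a custom Authorization header makes the request non-simple.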

Cypress E2E fails on basic auth

I'm trying to use cypress to cover our dev-server by a set of complete e2e tests.
We have a requirement to lock our dev environment behind basic HTTP authentication so nobody can access it without proper credentials (both client and API).
When I try to run the Cypress tests, they fail because of that: the server (nginx) just responds with a 401 HTTP code.
I tried to pass credentials in the URL like user:password@domain.com, and it partially works: Cypress is able to reach the frontend on domain.com, but it's still unable to send any requests to our backend (api.domain.com) from within the page (using fetch), probably because of the different subdomain or something.
I'm looking for a way to force it to use those credentials on all requests to a domain, or any other workaround that may help me run the tests.
Thanks!
This might not be an auth issue: fetch does not work with Cypress. See https://github.com/cypress-io/cypress/issues/687
A workaround for it is to put this into your support/index.js file:
Cypress.on("window:before:load", win => {
  win.fetch = null;
});
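As an alternative on newer Cypress versions (where cy.intercept does handle fetch), you could attach the credentials yourself: cy.visit accepts an auth option for the page itself, and an intercept can add the Authorization header to API-subdomain requests. Untested sketch; the domains and credentials are placeholders:

```javascript
// Placeholder credentials -- pull these from Cypress.env() in real tests.
const credentials = btoa('user:password');

// Add Basic auth to every request the app sends to the API subdomain.
cy.intercept('https://api.domain.com/**', (req) => {
  req.headers['Authorization'] = `Basic ${credentials}`;
});

// cy.visit has a built-in auth option for the page itself.
cy.visit('https://domain.com', {
  auth: { username: 'user', password: 'password' },
});
```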

How to create a server that is only internal to the app in node / npm or modify any response body from outgoing requests

I am trying to develop a Node app and need a server that can only be used by the app internally.
I have tried instantiating a server without listening to the port, but can't do anything with it from that point forwards:
let http = require("http");
http.createServer(function (req, res) {
  // custom code
});
This app is being built with NWJS and I need to intercept any outgoing requests (including file resources; CSS, JS, images, etc.) and modify the response, but I am not having any success with it except if I use a server for this purpose.
The problem is that it becomes possible to open that server in any browser, and I want it to be usable only inside the app; alternatively, I'd take some other way to intercept outgoing requests from the app so that the response body can be modified.
I have tried Keith's suggestion of using a service worker to intercept requests, but in my case I could not load a service worker from the local environment into a live environment (for example, running a local SW file in the Stack Overflow page), so that suggestion ended there.
Stdob's suggestion of using a proxy ended up being redundant and more troublesome than my original attempt.
In the end I went with my original attempt as follows:
Using chrome.webRequest.onBeforeRequest (Chrome API) and a local Node server.
The server is created on an arbitrary port to reduce the risk of hitting an already-used port.
The Chrome API redirects all connections to the local server (e.g. url: http://127.0.0.1:5050), and the server then handles the requests as needed, returning the requested files modified or intact.
Last step: add a unique header with a unique value that only the app knows, so that no server access can be made from outside the app.
It is not the best solution; ideally I would prefer something like Firefox's webRequest.filterResponseData, but until Chrome implements that, this will have to do.

How to get JSON data via Javascript with cookies authentication?

I'm building an application with Ruby on Rails. The application is a JavaScript application that gets data via JSON calls from the API application. It also provides cross-domain authentication for the application itself and the API.
I handle the cross-domain authentication by giving both applications cookies with the same secret session key and the same name, e.g. _app_an_api_session.
Now I'll write down the scenario and show you where it fails.
The application domain is domain.local, the API domain is api.domain.local.
Let's say that api.domain.local/me is a protected page. When I open it, I get unauthenticated.
When I go to the application, sign in, and go to api.domain.local/me again, I can see the data in it. [Pass] [It also works for the opposite actions.]
The problem is, for example, that after sign-in I want to load the api.domain.local/me contents [JSON data] in domain.local, and I can see in the console that the request status is (canceled).
In addition, for debugging's sake, I tried to see the env variable on each request to the API. When I access api.domain.local/me directly, I can see the cookies hash in the console, but if it's via the real application, there are no cookies/sessions at all.
So, how do I do this right?
Your issue is the Same-Origin Policy, which prevents communication between different domains and subdomains. You can resolve this either by routing the calls through a server-side proxy or by using JSON-P.
