I am trying to test a player that is loaded by the webpage I am currently testing.
When I test manually, everything works as expected over https.
But when I run my Cypress test, the player does not load and I get a Mixed Content error, because the page seems to request its resources via http.
I have already tried adding an upgrade-insecure-requests: 1 header to every request and setting chromeWebSecurity: false in the config, but neither seems to work.
EDIT:
After some further research I found the HTML script tags requesting the sources. Their URLs are protocol-relative (they start with //), and Cypress seems to resolve them to http instead of the protocol the page was loaded with (https).
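A rough sketch of the workaround I am now considering with cy.intercept, rewriting the protocol-relative URLs in the page HTML before the browser requests them (the route pattern below is a placeholder, and I have not confirmed this actually makes the player load):

// Sketch: rewrite protocol-relative script URLs to https in the intercepted HTML.
// '**/player-page' is a placeholder for the page under test.
cy.intercept('GET', '**/player-page', (req) => {
  req.continue((res) => {
    res.body = res.body.replace(/src="\/\//g, 'src="https://');
  });
});
cy.visit('/player-page');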
Has someone already experienced this or found a working solution?
I had the same problem; you can avoid this error by setting chromeWebSecurity to false in your Cypress config file.
cypress.config.js
const { defineConfig } = require("cypress");

module.exports = defineConfig({
  e2e: {
    setupNodeEvents(on, config) {
      // ...
    },
  },
  chromeWebSecurity: false, // add this param to your config file
});
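If you are still on Cypress 9 or earlier, the configuration lives in cypress.json instead, and the equivalent setting would be:

{
  "chromeWebSecurity": false
}

Also note that chromeWebSecurity only has an effect in Chromium-based browsers.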
Related: my question is similar to this issue, but the solution provided there does not address it.
I'm trying to integrate Tealium tags into my Electron app. I have added the code snippet per the documentation, as shown below:
<script type="text/javascript">
(function(a,b,c,d) {
a='//tags.tiqcdn.com/utag/ACCOUNT/PROFILE/ENVIRONMENT/utag.js';
b=document;c='script';d=b.createElement(c);d.src=a;
d.type='text/java'+c;d.async=true;
a=b.getElementsByTagName(c)[0];a.parentNode.insertBefore(d,a)})();
</script>
The above script in turn triggers calls to multiple utag.x.js files. The issue is that the utag files are not available locally, so resolving them via the file protocol fails with ERR_FILE_NOT_FOUND.
When I change the value of variable 'a' to start with 'https', utag.js is downloaded successfully, but the next call to utag.x.js still fails. To work around this I'm trying to intercept the file protocol using the interceptFileProtocol() API.
app
  .whenReady()
  .then(() => {
    protocol.interceptFileProtocol('file', (request, callback) => {
      if (request.url.includes('utag')) {
        console.log('UTAG request...');
        request.url = `https://${request.url.slice('file://'.length)}`;
      }
      callback(request);
    });
  });
In the console I can see that the call is intercepted and the URL gets updated, but the network request for the resource still uses the file protocol instead of https, so I still get ERR_FILE_NOT_FOUND.
Am I correct in using interceptFileProtocol, or is there another way to do this?
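For reference, one direction I am looking into (untested, and based only on my reading of the Electron protocol docs) is interceptHttpProtocol, whose callback accepts an object with a url property, so a file:// request can be answered with an HTTP request instead of a local file. The obvious caveat is that it intercepts every file:// request, so the app's own local files would need separate handling:

// Untested sketch: answer file:// requests for utag scripts with an https request.
const { app, protocol } = require('electron');

app.whenReady().then(() => {
  protocol.interceptHttpProtocol('file', (request, callback) => {
    if (request.url.includes('utag')) {
      callback({ url: `https://${request.url.slice('file://'.length)}`, method: request.method });
    } else {
      // The app's own file:// assets would still need a real response here,
      // which is exactly the part I am unsure about.
      callback({ url: request.url, method: request.method });
    }
  });
});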
I have a Next.js app with a page, somePage.js. I would like to make an HTTP request to the Airtable API from within getServerSideProps. The truncated component is as follows:
pages/somePage.js
import { Component } from 'react';
import Airtable from 'airtable';

export const config = {
  runtime: 'experimental-edge',
};

export async function getServerSideProps({ query }) {
  Airtable.configure({
    endpointUrl: 'https://api.airtable.com',
    apiKey: process.env.AIRTABLE_API_KEY,
  });
  const base = Airtable.base(process.env.AIRTABLE_BASE);
  base('someTable').select({...});
  return { props: { items: [] } };
}

class SomePage extends Component {
  constructor(props) {
    super(props);
  }

  render() {
    return (<>...</>);
  }
}

export default SomePage;
I get the following error when I do yarn run next dev:
Error: URL is malformed "". Please use only absolute URLs - https://nextjs.org/docs/messages/middleware-relative-urls
This error happened while generating the page. Any console logs will be displayed in the terminal window.
With a stack trace starting in node_modules/next and ending in node_modules/airtable.
The error goes away if I remove airtable related code.
The website linked in the error indicates that URLs must be absolute when used in specific functions, specifically in middleware, as in the Next.js middleware paradigm (within a middleware directory). I checked: none of those functions are used in my code, and I don't have a middleware directory either. I assumed the airtable.js library was trying to make some relative request, which I tried to resolve by setting the endpointUrl explicitly, but that didn't solve the issue.
I also tried doing Airtable.configure outside of getServerSideProps, but that didn't change the error.
I checked various other answers on Stack Overflow, but their issues all revolved around using middleware, which I don't use, and in any case their answers didn't seem relevant to my issue:
Next JS - Middlewares - Error: URLs is malformed. Please use only absolute URLs: to my knowledge, I'm not making any relative URL requests
Next JS Middlewares - URLs is malformed. Please use only absolute URLs: same as above
I checked the Airtable API docs, as well as the airtable.js library, but didn't find anything about ensuring all URLs are absolute, nor any helpful tutorials about using next.js with airtable. The tutorials I did find didn't seem to be using airtable in any significantly different way than me.
According to my understanding of the getServerSideProps paradigm of next.js, I should be able to make cross-origin API calls within this function, so I don't see why it would be specifically disallowed.
How can I make API calls to Airtable from within Next JS getServerSideProps?
My versions are as follows:
"airtable": "^0.11.6",
"next": "^13.1.1",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"webpack": "^4.29.6"
Edit: I have confirmed that literally just importing airtable, or requiring it, causes the error. No need to invoke it.
Edit: This seems to be related to the experimental edge runtime.
This is because the experimental edge runtime is incompatible with the airtable.js library. I don't know the exact cause of the mentioned bug, but essentially the experimental edge runtime isn't Node, so the airtable library ends up calling APIs that aren't available there.
See: https://github.com/vercel/next.js/discussions/44843
https://community.cloudflare.com/t/is-the-airtable-js-library-compatible-with-cloudflare-pages/452308/2
airtable.js can be used with Next.js, just not on Cloudflare, because Cloudflare requires the experimental edge runtime for server-side rendered Next.js. Your deployment target has to be able to run the Next.js Node server if you want to use airtable.js in a server-side rendered app.
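In practice that means removing the experimental-edge config from the page so getServerSideProps runs on the default Node.js runtime. A rough sketch (the table name and query options are placeholders, and I have not run this against a real base):

// pages/somePage.js (sketch): no `runtime: 'experimental-edge'` export,
// so this runs on the default Node.js runtime where airtable.js works.
import Airtable from 'airtable';

export async function getServerSideProps({ query }) {
  Airtable.configure({
    endpointUrl: 'https://api.airtable.com',
    apiKey: process.env.AIRTABLE_API_KEY,
  });
  const base = Airtable.base(process.env.AIRTABLE_BASE);

  // firstPage() resolves the query to an array of records
  const records = await base('someTable').select({ maxRecords: 10 }).firstPage();

  return { props: { items: records.map((record) => record.fields) } };
}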
I'm trying to fetch data from my Python RESTful API implemented in Flask, and I have the following code set up.
from flask import Flask, jsonify

app = Flask(__name__)

tasks = {
    '1': 'Learn',
    '2': 'Build',
    '3': 'Apply',
    '4': 'Succeed'
}

@app.route('/alltasks')
def get_tasks():
    return jsonify(tasks)
The routing works perfectly fine when I enter the raw URL in the url bar as http://127.0.0.1:5000/alltasks and it displays the JSON data
{
"1": "Learn",
"2": "Build",
"3": "Apply",
"4": "Succeed"
}
The issue is that I'm trying to access the data from my Vue.js application, but Chrome and the other browsers keep giving me this error:
Cross-Origin Read Blocking (CORB) blocked cross-origin response http://127.0.0.1:5000/alltasks with MIME type application/json. See https://www.chromestatus.com/feature/5629709824032768 for more details.
I confirmed that the API was doing what it should because I made a dummy route that just returned a message:
@app.route('/hi')
def say_hello():
    return 'Thank you for checking out my API'
and Vue accepted it just fine; the response showed up in the Network tab of Chrome's developer tools. This is the code using Axios:
created: function () {
  this.loadAllTasks();
},
methods: {
  loadAllTasks: function () {
    alert('At it');
    axios.get('http://127.0.0.1:5000/alltasks')
      .then(function (res) {
        alert(res);
      });
  }
}
The alert function isn't being executed because an error is being thrown.
Now I know this question has been asked before, and most of the answers were to run Chrome with web security disabled, but do you really think every user is going to do that?
What is causing the error, and what libraries can I use or code modifications can I make to fix it?
What port is your Vue.js web app running on? If it's running on a different port from your API (Vue templates usually run on port 8080), the request will be blocked because it is viewed as coming from a different origin (i.e. 127.0.0.1:8080 --> 127.0.0.1:5000). To resolve this, you can use the flask-cors library and define a list of origins that are allowed to access your API. I've never used it before, so I can't give specifics, but the documentation is here (the Simple Usage section should give you what you need):
https://flask-cors.readthedocs.io/en/latest/
In particular, you should just need to add this to the top of your code (right after app is defined), along with the corresponding import:
from flask_cors import CORS

cors = CORS(app, resources={r"/*": {"origins": "*"}})
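Alternatively, if you would rather not touch the Flask side, you could proxy the API through the Vue dev server so the request stays same-origin. A rough sketch, assuming a Vue CLI project (I have not tested this against your setup):

// vue.config.js (sketch): forward /alltasks from the dev server to Flask
module.exports = {
  devServer: {
    proxy: {
      '/alltasks': {
        target: 'http://127.0.0.1:5000',
        changeOrigin: true,
      },
    },
  },
};

The Axios call would then use the relative URL '/alltasks' instead of the absolute one.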
I am trying to use an AudioWorklet within my Electron app for metering etc. This works fine when executed in dev mode, where the worklet is served by an Express dev server, e.g. http://localhost:3000/processor.js.
However, if I run the app in prod mode, the file is served locally, like file://tmp/etc/etc/build/processor.js. In the developer console I can even see the file being previewed correctly, but I get this error message:
Uncaught (in promise) DOMException: The user aborted a request.
I saw that someone else had a similar problem before over here, but unfortunately my reputation on Stack Overflow is not high enough to comment directly. The suggestion there to change the MIME type to application/javascript or text/javascript sounds good, but I have no idea how to force Electron to use a specific MIME type for a specific file. Furthermore, in the Network tab of the developer console it looks like Chromium is already treating my processor.js as a JavaScript file.
I already tried to load the worklet with a custom protocol like this:
protocol.registerStandardSchemes(['worklet']);

app.on('ready', () => {
  protocol.registerHttpProtocol('worklet', (req, cb) => {
    fs.readFile(req.url.replace('worklet://', ''), (err, data) => {
      cb({ mimeType: 'text/javascript', data });
    });
  });
});
and then, when adding the worklet:
await ctx.audioWorklet.addModule('worklet://processor.js');
Unfortunately, this only results in the following errors, followed by the first error:
GET worklet://processor.js/ 0 ()
Uncaught Error: The error you provided does not contain a stack trace.
...
I found a hacky solution, if anybody is interested.
To force a MIME type that Electron/Chromium is happy with, I load the worklet file as a string via the file API, convert it to a Blob with MIME type text/javascript, and then create an object URL from that:
const processorPath = isDevMode ? 'public/processor.js' : `${global.__dirname}/processor.js`;
const processorSource = await readFile(processorPath); // just a promisified version of fs.readFile
const processorBlob = new Blob([processorSource.toString()], { type: 'text/javascript' });
const processorURL = URL.createObjectURL(processorBlob);
await ctx.audioWorklet.addModule(processorURL);
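Small follow-up: once addModule has resolved, the object URL can be released again so it doesn't leak:

URL.revokeObjectURL(processorURL);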
Hope this helps anyone having the same problem...
If you're using webpack to compile your source, you should be able to use the web-worker loader for your custom worker scripts.
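For example, with the worker-loader package (assuming that is the loader meant here), the webpack rule might look roughly like this; whether it also covers AudioWorklet scripts depends on the loader, so treat it as a starting point rather than a verified fix:

// webpack.config.js (sketch, assuming worker-loader is installed)
// The .worker.js test pattern is the loader's convention and may need adjusting
// for a file named processor.js.
module.exports = {
  module: {
    rules: [
      {
        test: /\.worker\.js$/,
        use: { loader: 'worker-loader' },
      },
    ],
  },
};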
So I wanted to get into Test Driven Development and decided to use Jasmine on my project.
The thing is, I can't load fixtures.
The two solutions commonly proposed are:
Run Chrome with --allow-file-access-from-files
Serve the file from your local server
So I used the first solution, but with no result.
Then I set up the routes of my web server so that localhost/fixture/my_fixture would return the content of my_fixture.html.
So when I manually access localhost/fixture/my_fixture, the content of the fixture is displayed on screen. But in my Jasmine spec file, when I use:
jasmine.getFixtures().fixturesPath = 'http://localhost/fixture'
loadFixtures('quizz_fixture')
I get the following errors:
Error: Fixture could not be loaded: http://localhost/fixture/quizz_fixture
(status: error, message: Failed to execute 'send' on 'XMLHttpRequest': Failed to load 'http://localhost/fixture/quizz_fixture?_=1455854875950'.)
When I use the URL given in the error, my browser displays the content of the fixture without errors.
Therefore, I don't understand the reason for this error. Does anyone have an insight?
Edit:
Web server : Apache
Browser : Chrome
OS : Windows 7
Edit 2
The issue comes from jasmine-jquery, on line 139 below, where the fail function is called. I can't figure out what's happening, as the URL that supposedly can't be loaded actually loads just fine in my browser:
jasmine.Fixtures.prototype.loadFixtureIntoCache_ = function (relativeUrl) {
  var self = this
    , url = this.makeFixtureUrl_(relativeUrl)
    , htmlText = ''
    , request = $.ajax({
        async: false, // must be synchronous to guarantee that no tests are run before fixture is loaded
        cache: false,
        url: url,
        dataType: 'html',
        success: function (data, status, $xhr) {
          htmlText = $xhr.responseText
        }
      }).fail(function ($xhr, status, err) {
        throw new Error('Fixture could not be loaded: ' + url + ' (status: ' + status + ', message: ' + err.message + ')')
      })
The result is:
Failed to load 'http://localhost/fixture/quizz_fixture.html?_=1456886216017'
That URL works when called in the browser. I just don't get it.
Thanks.
It's really hard to answer without knowing at least a little about the nature of your server, or what the fixture looks like. Is the server just a simple file server like node-http-server, or is this pointing to your app? Is it serving the fixture correctly? Does your fixture have a mistake in it? I can't tell any of that from here.
What I would say though is that if you are just beginning TDD you should probably avoid fixtures entirely. One of the biggest challenges to somebody new to TDD is writing small enough tests, and Jasmine fixtures make it easy to write really big tests.
Instead, I would recommend manually adding the bare minimum of DOM you need to the page and removing it in an after hook; jasmine-fixture is a tool that essentially does this. This will force you to consider how much of the DOM you actually need to write a test, and will make the DOM changes you are making visible in the tests themselves.
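As a rough illustration of what that looks like without any fixture files (the element and class names here are made up, and this assumes jQuery is on the page):

// Sketch: build the minimum DOM in beforeEach and tear it down in afterEach
describe('quizz widget', function () {
  var $container;

  beforeEach(function () {
    // '#quizz' and '.answer' are placeholders for your real markup
    $container = $('<div id="quizz"><button class="answer">A</button></div>').appendTo('body');
  });

  afterEach(function () {
    $container.remove();
  });

  it('finds the answer button', function () {
    expect($('#quizz .answer').length).toBe(1);
  });
});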
So I found a very unsatisfying solution, but a solution nonetheless.
To summarize:
1. Using Chrome, I tried to load the Jasmine fixture from a local file, which doesn't work in Chrome (this is known; it is disabled for security reasons).
2. I tried using the Chrome flag --allow-file-access-from-files, but it didn't work, so I gave up on loading the fixture from a local file.
3. I understood that the fixture file had to be served by my web server, which I did. But that didn't work either, because of some Ajax error related to the caching of fixtures. I tried updating my version of jQuery (which was a bit old), but it didn't help. In the end, I wasn't able to understand what the issue was.
4. I downloaded Firefox and tried executing the Jasmine SpecRunner with the configuration of point 3 above (fixture served by the web server), but again, it didn't work.
5. Using Firefox, I reverted to the method in point 1, i.e. using a local fixture file, and it did work. I hate that solution, but I need to move forward, so that will do.
Conclusion
If you are stuck with this kind of issue, save yourself some time and use Firefox, which will allow the use of a local fixture file.
In the command line you can write:
start chrome --allow-file-access-from-files "path_to_test/SpecRunner.html"
That solved it for me... I hope it can help more people.