So I am writing an application in Node-Webkit/NW.js that needs a "Login to LinkedIn" button. We have to use a custom protocol/domain in order to hook into the LinkedIn API (let's call it app://example).
In the Application Details on the developer portal, under "JavaScript API Domains" I have entered "app://example" and "example". However, when I attempt to use my API key inside the application, I get the following error:
Uncaught Error: JavaScript API Domain is restricted to example
Does LinkedIn not allow custom protocols, only http/https? This is a big problem for us, and I hope someone is able to answer.
To avoid this error, use a server in a controlled environment rather than the developer console, and redirect from the https:// protocol to the app:// protocol in that controlled environment. Here is the process:
Essentially, where I was previously seeing "Not allowed to load local resource: app://whatever/somefile.html", addOriginAccessWhitelistEntry eliminates that error: I now see the appropriate app:// resource in the address bar, with a new error in the console: "Uncaught ReferenceError: require is not defined"
That said, if I force a refresh at this point, the resource renders as expected.
nw.App.addOriginAccessWhitelistEntry('http://github.com/', 'app', 'myapp', true);
References
nw.js: app:// Protocol doesn't load for OAuth Redirect
nw.js: addOriginAccessWhitelistEntry
Related
If I have an iframe that's trying to launch a custom URL scheme (e.g. twitter://user?data=value), Chrome throws an error in the console:
Failed to launch 'twitter://user?data=value' because the scheme does not have a registered handler
I need to access that URL without taking any action (like launching Twitter, for example). If the console can display this error, the client must have access to the URL and its data.
Is there a way to intercept these errors and handle the URL myself?
My model (tf.keras.Sequential) was trained in Python, and I converted it to the TF.js Layers format using tfjs.converters.save_keras_model().
I served the folder (which contains the *.bin files and 'model.json') using 'http-server' from cmd.
After that, I ran this code to load the model:
(async () => {
  const model = await tf.loadLayersModel('http://127.0.0.1:8080/model.json');
  console.log('done');
})();
It doesn't work for me; these three errors appear in my console:
Access to fetch at 'http://127.0.0.1:8080/model.json' from origin 'null' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
GET http://127.0.0.1:8080/model.json net::ERR_FAILED
Uncaught (in promise) Error: Request for http://127.0.0.1:8080/model.json failed due to error: TypeError: Failed to fetch
at tf.min.js:2
I have no idea how to fix it.
So this is an issue with CORS (Cross-Origin Resource Sharing): you need more than just a simple web server to serve the files.
For any static files (like the bin and json files you have) that you want to use across domains, the web server needs to set the right headers on those files so the browser knows it's OK to use them on such sites.
This is about web security across domains, and while it catches a lot of folks out, it's important to have. I'm not sure what web server you are running, but if you are using Express with Node.js, then check this simple tutorial:
https://enable-cors.org/server_expressjs.html
or this lib:
https://medium.com/@alexishevia/using-cors-in-express-cac7e29b005b
Note the Allow-Origin part, which is where you set the domain you plan to use the files on. You can also use the wildcard * to allow all domains, if you want anyone to be able to use those files on their sites too without issue.
Oh, and a fun fact: if you don't want to deal with a web server at all, try Glitch.com, which lets you host experimental projects and upload assets if this is just for fun. It sets all the CORS headers correctly and is easy to use for prototyping. https://glitch.com/@TensorFlowJS
Browser: Firefox 58.0.2 (64-bit)
I'm trying to write a very simple service worker to cache content for offline mode, following the guidance here and here.
When I load the page the first time, the service worker is installed properly. I can confirm it's running by looking in about:debugging#workers.
However, at this point if I attempt to refresh the page (whether online or offline), or navigate to any other page in the site, I get the following error:
The site at https://[my url] has experienced a network protocol
violation that cannot be repaired.
The page you are trying to view cannot be shown because an error in
the data transmission was detected.
Please contact the website owners to inform them of this problem.
The console shows this error:
Failed to load ‘https://[my url]’. A ServiceWorker passed a redirected Response to FetchEvent.respondWith() while RedirectMode is not ‘follow’.
In Chrome, I get this:
Uncaught (in promise) TypeError: Failed to execute 'fetch' on 'ServiceWorkerGlobalScope': Cannot construct a Request with a Request whose mode is 'navigate' and a non-empty RequestInit.
Per this thread, I added the { redirect: "follow" } parameter to the fetch() function, but to no avail.
(Yes I did manually uninstall the Service Worker from the about:debugging page after making the change.)
From what I understand, however, it's the response, not the fetch, that's causing the problem, right? And this is due to my server issuing a redirect when serving the requested content?
So how do I deal with redirects in the service worker? There are obviously going to be some, and I still want to cache the data.
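One workaround I have seen, sketched here under the assumption that the server really is issuing a redirect: fetch by URL instead of by Request (so the new request no longer has 'navigate' mode, which is what Chrome complains about), and copy any redirected response into a fresh one so the redirected flag is cleared before it reaches respondWith(). stripRedirect is my own name, not a standard API, and this sketch does no caching:

```javascript
// Sketch: rebuild a redirected Response so the "redirected" flag is
// dropped before respondWith() sees it. stripRedirect is an
// illustrative name, not part of the Service Worker API.
async function stripRedirect(response) {
  if (!response.redirected) return response;
  const body = await response.blob();
  return new Response(body, {
    status: response.status,
    statusText: response.statusText,
    headers: response.headers,
  });
}

// This branch only runs inside a service worker scope:
if (typeof ServiceWorkerGlobalScope !== 'undefined') {
  self.addEventListener('fetch', event => {
    // Fetching by URL avoids re-using the 'navigate' request mode.
    event.respondWith(fetch(event.request.url).then(stripRedirect));
  });
}
```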
Partly spun off from https://superuser.com/a/1388762/84988
I sometimes get the problem with Gmail with Waterfox 56.2.6 on FreeBSD-CURRENT. (Waterfox 56 was based on Firefox 56.0.2.) Sometimes when simply reloading the page; sometimes when loading the page in a restored session; and so on.
FetchEvent.respondWith() | MDN begins with an alert:
This is an experimental technology …
At a glance, the two bugs found by https://bugzilla.mozilla.org/buglist.cgi?quicksearch=FetchEvent.respondWith%28%29 are unrelated.
Across the Internet there are numerous reports, from users of Gmail with Firefox, of Corrupted Content Error, network protocol violation, etc. Found:
Mozilla bug 1495275 - Corrupted Content Error for gmail
I am integrating Yammer features into our app (web front-end stack, using the Yammer JS SDK). I want to get all the groups of a logged-in user.
To get all the groups, I have tried to call the endpoint in two ways: groups.json?mine=1 using the SDK (the response says the method is not authorized), and www.yammer.com/api/v1/groups.json?mine=1 using a normal AJAX GET request (which throws an access-control-origin error).
The API works perfectly when tested in the Google Chrome browser with web security disabled.
My question is: how can I call the www.yammer.com/api/v1 endpoints without a cross-origin issue, using the Yammer JS SDK or any other technique?
I found a related answer in this Q&A, but it still shows an error for me.
Error :
XMLHttpRequest cannot load
https://www.yammer.com/api/v1/groups.json?mine=1. Response to
preflight request doesn't pass access control check: No
'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'https://xxx.dev.com' is therefore not allowed
access.
Code I have tried after the login code:
1. GET request: return $http.get('https://www.yammer.com/api/v1/groups.json?mine=1')
2. Yammer JS SDK:
yam.platform.request({
  url: "groups.json?mine=1",
  method: "GET",
  success: function (group) {
    console.log(group);
  },
  error: function (group) {
    console.error("There was an error with the request.", group);
  }
});
I have already commented on different Q&As asking for opinions, but no luck; no one replied.
PS: All my other Yammer APIs (login, post, message, etc.) listed in the Yammer REST API docs are working. I am only facing problems with the APIs hosted at www.yammer.com/api/v1, not api.yammer.com.
Thanks in advance
You should register an app in the Yammer dev console, specify the allowed origins, get an API key, and send it along with the request.
See https://developer.yammer.com/docs/api-requests for more info.
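As a sketch of the "send it along with the request" step (buildAuthHeaders and the placeholder token are illustrative names of mine, and the exact flow depends on your app registration):

```javascript
// Sketch: attach the OAuth access token obtained for your registered
// Yammer app as a Bearer token. buildAuthHeaders and the placeholder
// token are illustrative only.
function buildAuthHeaders(token) {
  return { Authorization: `Bearer ${token}` };
}

// Against the api.yammer.com host (which the asker notes already
// works for the other endpoints):
// fetch('https://api.yammer.com/api/v1/groups.json?mine=1', {
//   headers: buildAuthHeaders('YOUR-ACCESS-TOKEN'),
// }).then(res => res.json()).then(console.log);
```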
I have two copies of the same site. My 1st site is http://educationaboveall.org/ and the 2nd is http://www.savantgenius.com .
The 1st site loads properly on every device without any error, but the 2nd (www.savantgenius.com) does not load properly on mobile and tablet devices; it only loads properly in desktop browsers. I have also found 32 console errors.
Are there any jQuery issues? Please tell me how to fix it.
I'm getting the "XMLHttpRequest cannot load
file:///D:/Work%20File/My%20Work%20File/mY%20Work%20Backup/Sophie/Work%20File/footer.html.
Cross origin requests are only supported for HTTP." and "Error: Failed
to execute 'send' on 'XMLHttpRequest': Failed to load
'file:///D:/Work%20File/My%20Work%20File/mY%20Work%20Backup/Sophie/Work%20File/footer.html"
errors, but I don't know what's causing them or how to fix them.
Please see the screenshot - http://prntscr.com/4fm0d8
I think you should request it from an HTTP web server, not open it as a plain file in the browser. That means requesting the file from a web server, e.g. http://localhost/XML/catalog.html, not from file:///E:/Projects/XML/catalog.html.
It is as the message says:
cannot load file:///D:/Work%20File/My%20Work%20File/mY%20Work%20Backup/Sophie/Work%20File/footer.html
You are referencing a file on a Windows box's filesystem, not one in a web server's folder.
Second, you have a CORS issue (which in this case is caused by the filesystem reference):
Cross origin requests are only supported for HTTP
See MDN for more info.
To solve the issue, you have to configure your web server to allow such requests; check your web server's manual.
I had the same problem with my InfluxDB connection, and it turned out I had not prefixed the URL in the datasource settings with 'http://'. Grafana could be nicer here, e.g. by mentioning that no protocol is defined for accessing the source.
In your case, it's clear that you somehow configured Grafana to look at D:\, which is not accessible to your browser. So check your data source URL.