Google Maps Remote .js and Access-Control-Allow-Origin

I wrote a Google Maps API wrapper in JS, did some local tests with static html, and everything worked just fine. Then I loaded the files into a local web server running in localhost:8080, tested the map panels, and once again, everything worked just fine.
Then a week went by, I added a map to a page, and I couldn't get it to load. Nothing has changed (that I know of anyway, obviously something has), and static tests continue to work just fine, but when I try to load the API from the server I get this in the debug console:
XMLHttpRequest cannot load http://maps.google.com/maps/api/js?sensor=false.
Origin http://localhost:8080 is not allowed by Access-Control-Allow-Origin.
I've read around a bit, but I still can't understand the error, much less fix it. Can someone please give me a hand?
Edit: I use a plain script tag to load Google's JS. No jQuery, no nothing.
Thanks in advance!

@Santiago, hopefully I can help you now :) The error you are receiving is due to cross-site request restrictions (the browser's same-origin policy). You can find the info in Google's documentation, which also references the relevant Wikipedia article.
It looks like you'll need to create a proxy service for your client on your public-facing web server. Since the request then comes from your public web server and the reply goes back to that same server, it satisfies the same-domain requirement. I don't know enough Python yet to write a Pythonic CGI proxy, but there must be many existing solutions out there already. You will also be limited by your server as to what type of solution you can employ.
HTH!
~MWR
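A minimal sketch of such a proxy service, written in Node rather than Python, and assuming an illustrative /maps-api route and port (neither is from the original answer):

// Minimal same-origin proxy sketch (route and port are illustrative assumptions).
// The page requests /maps-api from its own origin; the server fetches Google's
// script on its behalf and streams it back.
const http = require("http");
const https = require("https");

http.createServer((req, res) => {
  if (req.url === "/maps-api") {
    https.get("https://maps.google.com/maps/api/js?sensor=false", (upstream) => {
      res.writeHead(upstream.statusCode, { "Content-Type": "application/javascript" });
      upstream.pipe(res);
    }).on("error", () => {
      res.writeHead(502);
      res.end("Upstream request failed");
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000); // in practice this handler would live in the same server that serves the page

The page would then load /maps-api instead of the Google URL, so the browser only ever sees a same-origin request.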

Related

How do I make a chrome extension that will take the current webpage and upload it to my custom domain via FTP so I can view it on a phone?

Overview
I am trying to make a Chrome Extension that takes the currently open html page and all its dependencies (CSS, JS) and uploads it to a custom domain via FTP. I would then be able to open it on my phone to make sure the website looks good on a phone.
Basically, I am trying to replicate the VSCode extension Live Server's functionality, but with it uploading the file to a custom domain. I know you'd normally be able to access live server's locally hosted server from a phone, but my university's internet setup doesn't seem to allow for this, hence my desire for an extension like this.
All I know about my hosting service is that it uses cPanel and supports FTP, which I assume is all I need. I can set up new FTP connections and logins. All the FTP details in the code will be hardcoded, but drawn from a separate file and .gitignored so they aren't in my commit history, which I hope is enough.
What I've Tried & What I'm Stuck On
I have most of the Chrome extension stuff figured out; the FTP transfer process is what's giving me issues.
I first tried using chrome-app-ftp, but quickly realized that was old and was running into issues, so I switched to jsftp.
I used browserify to fix the "require" issue, and that cleared up some stuff.
I'm currently stuck on the following bug:
Error: TypeError: createConnection is not a function
I've done my research, and I do not think the error is because of an issue in my code; I believe that it is just a limitation of the tools I am using. This seems to be an issue with front-end JS not supporting the "net" module, which brings me to my question.
My Question
How do I circumvent the front-end's lack of support for the "net" module? Do I need to set up some sort of local back-end for this with Node or something like that? I have basically zero experience with anything back-end, so I might need to be pointed towards what sort of back-end is best for this; I mainly just need to know which tech stack to use.
If additional information is necessary, I'll be checking back frequently and am happy to provide it. Thanks in advance.
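One rough sketch of the kind of small local Node back-end this could use, assuming the express and basic-ftp npm packages; the /upload endpoint, port, remote path, and ftp-config.json file are all illustrative placeholders:

// Rough sketch of a local Node back-end the extension could POST the page to.
// FTP runs here in Node, so the browser's missing "net" module no longer matters.
const express = require("express");
const ftp = require("basic-ftp");
const { Readable } = require("stream");
const ftpConfig = require("./ftp-config.json"); // gitignored credentials file (placeholder name)

const app = express();
app.use(express.text({ type: "text/html", limit: "10mb" }));

app.post("/upload", async (req, res) => {
  const client = new ftp.Client();
  try {
    await client.access({
      host: ftpConfig.host,
      user: ftpConfig.user,
      password: ftpConfig.password,
      secure: true, // FTPS, if the cPanel host supports it
    });
    // Upload the posted HTML as a preview page (remote path is a placeholder).
    await client.uploadFrom(Readable.from(req.body), "public_html/preview/index.html");
    res.sendStatus(200);
  } catch (err) {
    res.status(500).send(String(err));
  } finally {
    client.close();
  }
});

// The extension would then call
// fetch("http://localhost:3000/upload", { method: "POST", body: html, headers: { "Content-Type": "text/html" } })
app.listen(3000);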

analytics.js net::ERR_CONNECTION_REFUSED

Could anyone help me understand why this happens with the Google Analytics code?
This is the error that then prevents the tracking code from loading:
https://www.google-analytics.com/analytics.js Failed to load resource: net::ERR_CONNECTION_REFUSED
Is there any solution to this? I was trying to fix it all morning.
My situation:
When this error occurs, the tracking code does not work.
Events that are being monitored are not recorded, and the callback functions specified in the hitCallback property are never executed. This causes a deficiency in the functionality of the website and also in the tracking statistics.
Types of events that have failed for me: behavioral events and enhanced e-commerce events (the proceed-to-cart step).
My attempts:
I have disabled my antivirus, the antivirus firewall, the Windows firewall, and Windows Defender, without success, thinking one of them could be the cause.
I installed the extension for debugging Google Analytics and Googled for hours. I tried loading the script directly in the browser, and I can't get it on any PC; I could only fetch https://www.google-analytics.com/analytics.js with wget from my server console.
Could it be a problem with the ISP?
Thank you very much!
Update:
Confirmed: it is the ISP.
Although they have not yet fixed it in my case, because the technician went on vacation.
If someone else comes across this problem, one workaround occurs to me: edit the Analytics tracking code to call a URL such as "mydomain.com/analytics.js", have mod_rewrite route that request to a PHP file, and have that file download the real JS from www.google-analytics.com/analytics.js and return it in the response, acting as a PHP proxy. It's just an idea; I have not tested it.
Thank you all; feel free to close this question if you want.
It may be that you are using a VPN (virtual private network), which Analytics blocks. Try loading https://www.google-analytics.com/analytics.js directly.
If it does not work, disable the VPN and try again.
Check your hosts file (on Mac: Finder > Go > Go to Folder, and paste /etc/hosts; on Windows: search for the hosts file). In my case www.google-analytics.com was blocked in the hosts file (an entry such as 127.0.0.1 www.google-analytics.com).

Why can Chrome execute javascript on other pages but I can't?

Apologies if this is a roundabout way of asking this question, but I am a little confused about how the web and javascript work.
What I want to do: execute JavaScript on all pages in a list of URLs I have found (specifically, use jQuery to pull info from them).
Problem: I can't execute JavaScript on these pages because they aren't mine and don't send the Access-Control-Allow-Origin header, so I can't load them (with AJAX) in order to use jQuery on them.
BUT Google Chrome can both load these pages and execute JavaScript on them (via its developer console). So if I wanted to, I could go to each page, open the developer console, and pull the information from there. If there's nothing stopping Chrome from accessing these, then why am I stopped? And is there a way around this?
Thank you, and I hope my description makes sense. I've been researching this for a while but have found nothing that explains why CORS seems so inconsistent.
I could go to each page, open the developer console, and pull the information from there. If there's nothing stopping Chrome from accessing these, then why am I stopped?
You're not stopped. You, the human at the keyboard, can do exactly as you say, by visiting each page as a top-level page.
What is stopped -- happily -- is any and all scripts on the Web you happen to run having the same level of visibility that you do. Based on your cookies and your network topology, you have a unique view into the Web. You can see your home router's control interface (on 192.168.1.1 or similar). You can see any local web server you're running on 127.0.0.1. No one else can see these. If the same-origin policy were not in place, then any script that you loaded on the Web could inspect these.
And, is there a way around this?
If you have some scripts that you trust absolutely (hopefully a significant subset of "all scripts that exist on the Web") that you want to be able to bypass the same-origin policy and see your full, cross-domain view of the Web, you could load them as an extension, which can act with elevated permissions beyond the abilities of normal web pages. (See How does Same Origin Policy apply to browser extensions?)
I'm going to assume that you are looking to grab data from these pages that aren't yours and store it somewhere. I have done this before with cURL in PHP. If you are looking to display these sites for users to interact with in a different way, but starting from a page that is yours, you may be able to render these pages by grabbing the source HTML with cURL and serving it as a sort of proxy.
I've used this tutorial for something similar: https://www.youtube.com/watch?v=_kQN-3aNCeI. Hopefully this gives you a start. I think you should be a little more detailed in your question, though, to get more help.
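Sketching that idea in Node rather than PHP/cURL, with cheerio standing in for the jQuery-style extraction; the URL and selector below are placeholders:

// Server-side fetch-and-extract sketch (Node 18+ for built-in fetch; cheerio from npm).
// The same-origin policy constrains browsers, not server-side code, so pages
// can be fetched here and queried with a jQuery-like API.
const cheerio = require("cheerio");

async function scrapeTitle(url) {
  const res = await fetch(url);   // no CORS restrictions on the server
  const html = await res.text();
  const $ = cheerio.load(html);
  return $("title").text();       // placeholder selector
}

scrapeTitle("https://example.com") // placeholder URL
  .then((title) => console.log(title))
  .catch((err) => console.error(err));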

Why not CDN everything?

It looks like AJAX is indeed unable (for all practical purposes, at least) to fetch foreign HTML and write it into the current page. But what if your CDN site hosted, say, a JS file that would simply document.write() everything? Then your HTML document would contain nothing but a remote script:
<html>
<script src="https://pastebin.com/raw.php?i=0wm5v7i6">
</script>
</html>
I tried this. The funny thing is, sometimes it works and other times it gives a kind of security error.
Why doesn't this work? What if, on your own website, you simply put everything on an easy host like Google Drive?
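For concreteness, the remotely hosted script in a setup like this would itself just write the page out, along these lines (purely illustrative content, not the actual paste):

// Contents of the remote script: the entire page, emitted at parse time.
document.write("<head><title>My page</title></head>");
document.write("<body><h1>Served entirely from the remote script</h1></body>");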
What if, on your own website, you simply put everything on an easy host like Google Drive?
That is possible, unless:
You want control over your website and don't want to depend on the security and availability of another site, or risk somebody reporting your pastebin as abuse and getting it deleted.
You want to make proper use of security features like Content Security Policy and don't want to allow everything from pastebin.com.
You want search engines to find you. Although Google at least does some limited interpretation of JavaScript, I doubt they will handle this content the way you'd like.
From the looks of it, PasteBin doesn't supply content over SSL (https). You've put https in the URL to your script, but PasteBin just redirects this request to http, and the net effect is that you are trying to access a script over http when the page is accessed over https, and Chrome prevents that.
Just try going to https://pastebin.com/raw.php?i=0wm5v7i6: your browser will be redirected to http://pastebin.com/raw.php?i=0wm5v7i6.

Implementing the StockTwits Widget over SSL without warnings

Trying to use the StockTwits Widget in a web page that is (and must remain) served over SSL.
Since the widget-loader.min.js script link was being called via http, I copied the code to our domain so it would be served over our SSL. Problem still not solved. Chrome 25 says my page "ran insecure content" and completely refused to run or even display the widget.
So I dug into the .js file and found this little bit:
m=b.ssl?"https://":"http://"
Figuring the warning was coming from the widget's CALL to the service, I hacked this line as follows:
// add one character ↓
m=b.ssl?"https://":"https://"
Initially I thought I had success, because Chrome 25 loaded the widget! But my glee was short lived once I saw that the nice GREEN https:// in the address bar had changed to a yellow warning sign. Clicking it for info revealed a warning: "...displayed insecure content from http://assets1.stocktwits.net....". {darn}
Since the .js is now served securely, and the CALL to StockTwits is made over https thanks to the hack, I have to conclude that StockTwits either isn't able, or isn't configured, to reply over HTTPS.
Do you have any experience with this widget, or see something I am not doing correctly? TIA.
We've fixed the widget to use protocol-less URLs now. Let us know if that fixes this issue for you.
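For reference, a protocol-relative ("protocol-less") URL drops the scheme so the browser reuses whichever scheme the embedding page was loaded over; applied to the snippet above it amounts to something like this (illustrative, not the actual widget source):

// instead of choosing http:// or https://, inherit the page's scheme
m = "//"; // e.g. "//assets1.stocktwits.net/..." loads over https on an https page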
Currently the widget is not fully SSL compatible.
We will look into seeing what would be involved to have an SSL option with the widget. Seems like it could be done with minimal impact.
