I have a website that is served entirely over HTTPS, and I am using a JavaScript SDK for a local network printer whose IP address starts with 192.168.x.x.
My printer is listening on that IP address on port 8008 (HTTP).
When my application starts printing, the printer is initialized with something like this:
http://192.168.199.69:8008/socket.io/1/?t=1512574905603
Chrome is blocking this request and I am not able to print. The error message is below.
How can I tell Chrome that it's a local URL and make it allow requests to that URL?
epos-2.6.0.js:6 Mixed Content: The page at 'https://mywebsite/order-list.html' was loaded over HTTPS, but requested an insecure XMLHttpRequest endpoint 'http://192.168.199.69:8008/socket.io/1/?t=1512574905603'. This request has been blocked; the content must be served over HTTPS.
Get a self-signed certificate, upload it to your service/server, and there you go.
However, the basic question of getting around HTTP in a mixed-content context still remains; otherwise the request won't be allowed.
Firefox lets you add an exception, but I'm not sure about Chrome.
Self-signed certificates are no longer accepted...
I really don't understand why mixed-content blocking applies to IP addresses on the local network; only localhost is allowed.
That means it is not possible to create a web app (a PWA, for example) that communicates with small IoT devices (they are offline and just waiting for commands over HTTP).
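Since the page's own protocol dictates which printer endpoint the browser will accept, one workaround is to pick the port at runtime. A minimal sketch, assuming the ePOS SDK's two ports mentioned elsewhere in this thread (8008 for plain HTTP, 8043 for TLS) and its connect(ip, port, callback) signature; adjust for your device:

```javascript
// Choose the ePOS connection port based on the page's own protocol.
// 8008 (HTTP) and 8043 (TLS) are the ports discussed in this thread;
// check your printer's configuration if yours differ.
function eposPort(pageProtocol) {
  // An HTTPS page may only open secure connections, so use the TLS port.
  return pageProtocol === 'https:' ? 8043 : 8008;
}

// In the browser this would be used roughly as:
//   var device = new epson.ePOSDevice();
//   device.connect('192.168.199.69', eposPort(location.protocol), callback);

console.log(eposPort('https:')); // 8043
console.log(eposPort('http:'));  // 8008
```

Note that connecting on the TLS port still requires the printer's certificate to be trusted by the browser.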
I am trying to deploy my React app using an AWS S3 bucket. However, I am fairly new to AWS and am having quite a difficult time. This React app communicates with a Node/Express server, which is hosted in an Elastic Beanstalk environment. I previously had the following error:
Mixed Content: The page at 'https://myReactApp.s3.amazonaws.com/index.html' was loaded over HTTPS, but requested an insecure resource 'http://myElasticBeanstalkServer.us-east-1.elasticbeanstalk.com/signIn?username=lsharon&password=test4321'. This request has been blocked; the content must be served over HTTPS.
I began the process of trying to figure out how to "convert" my EB URL to HTTPS. I found lots of information about obtaining an SSL certificate, but I am pretty confused about the whole process. Here is what I did:
I do have a domain name, registered through Google Domains, and I used it to obtain an SSL certificate, which is verified in the AWS Certificate Manager console. However, I am a bit confused about how this relates to my Node server hosted on Elastic Beanstalk. I connect to this API using the EB URL...not my domain name. How can I use my SSL certificate to secure my server URL?
I did find a little info about creating a hosted zone in Route 53, as well as adding 443 ports on the load balancer in my elastic beanstalk environment. However, I got lost pretty quickly. Do I just use a 443 listener in the EB environment, or do I also need a 443 process? Could someone explain this to me? Also, relating to the Route 53 hosted zone, do I create the hosted zone using my domain name, or my API (elastic beanstalk) url? And when I create an alias, where do I route the traffic to? My domain name, my EB url, or my s3 bucket?
Currently, when I load my static web page in the browser, it renders fine and says secure. However, when I click one of my buttons (therefore sending a fetch request to my EB url), it ALSO works but changes to insecure and says my ssl is insecure.
I do apologize for all the questions. I just feel rather lost and seem to be finding lots of information but can't seem to make it work quite right. If anyone could help me out, I sure would appreciate it.
To answer your first question, you have to set up DNS somewhere, either in your Google Domains account or in Route 53 depending on who you want to use as your DNS host, pointing the domain name to the load balancer.
I connect to this API using the EB url...not my domain name. How can I
use my ssl certificate to secure my server url?
You have to change your front-end application to use the domain that matches the SSL certificate when making API calls (and also use https instead of http for those API calls). There is no other option if you want the web browser to stop complaining about these security issues.
To answer your second question, SSL certificates in AWS Certificate Manager can be attached to a load balancer which will handle terminating the SSL for you. The load balancer can still communicate with the EC2 instance using non-encrypted HTTP. So all you need to do is attach the SSL certificate to the port 443 listener in the load balancer.
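To make the first point concrete: the React front end should build its API URLs from the HTTPS domain the certificate was issued for, not from the raw Elastic Beanstalk hostname. A minimal sketch — api.example.com below is a placeholder for your own domain:

```javascript
// Placeholder domain: substitute the domain your ACM certificate covers.
const API_BASE = 'https://api.example.com';

function apiUrl(path) {
  return API_BASE + path;
}

// Usage from the React app (also note: sending credentials in a POST
// body keeps them out of URLs and server logs, unlike ?username=...):
//   fetch(apiUrl('/signIn'), {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: JSON.stringify({ username, password }),
//   });

console.log(apiUrl('/signIn')); // https://api.example.com/signIn
```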
I have made a weather app using the AccuWeather API. It works fine on localhost, but when it is deployed it does not perform its functions.
This is the error I am getting:
Please check this blog post about mixed content:
What is Mixed Content?
When a user visits a page served over HTTP,
their connection is open for eavesdropping and man-in-the-middle
(MITM) attacks. When a user visits a page served over HTTPS, their
connection with the web server is authenticated and encrypted with SSL
and hence safeguarded from eavesdroppers and MITM attacks.
However, if an HTTPS page includes HTTP content, the HTTP portion can
be read or modified by attackers, even though the main page is served
over HTTPS. When an HTTPS page has HTTP content, we call that content
“mixed”. The webpage that the user is visiting is only partially
encrypted, since some of the content is retrieved unencrypted over
HTTP. The Mixed Content Blocker blocks certain HTTP requests on HTTPS
pages.
As I can see in your screenshot, the API is integrated over HTTP. You probably use HTTPS on your production server.
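If the weather API is also reachable over HTTPS (AccuWeather's endpoints are), the fix is usually just switching the scheme of the request URL. A small sketch — the endpoint below is a placeholder for illustration, not AccuWeather's real URL:

```javascript
// Upgrade a plain-HTTP API URL to HTTPS so an HTTPS page may fetch it.
function toHttps(url) {
  return url.replace(/^http:\/\//, 'https://');
}

// Placeholder endpoint for illustration:
console.log(toHttps('http://api.example-weather.com/forecast'));
// https://api.example-weather.com/forecast
```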
I am working on a web application built with the MERN stack and have integrated the Epson JavaScript SDK to print receipts, as we are using an Epson TM-M30 printer. The web app will mostly run in Safari, but it can sometimes be Chrome or Firefox.
I doubt whether assigning a CA certificate to a private IP will work. Correct me if I am wrong.
It works perfectly on localhost on port 8008 with the respective printer IPs and deviceID, as described in the SDK documentation's connection function.
When I try to access the same printer from the same web application hosted on a domain running over HTTPS, using port 8043 (which is specifically mentioned in the SDK documentation's connection function), the browser blocks the web application's request to print a receipt and throws ERR_CERTIFICATE_INVALID.
I checked the printer configuration and it shows that the printer is running with a SELF-SIGNED CERTIFICATE.
Is there a way to make it accessible?
I am requesting connection using the following piece of code:
new epson.ePOSDevice().connect(ipAddress, port);
(port is 8008 on localhost and 8043 from the hosted HTTPS domain)
Your options here are a bit limited: either you need to install a signed certificate on the printer, or you need to configure your browser to allow self-signed certs, which is something that browser makers have made harder over the years.
You don't say which printer or browser you are using, so it's hard to give a full answer; however, if you're using Chrome you can follow these instructions.
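A common manual workaround with this kind of device is to open the printer's HTTPS endpoint directly in a browser tab and accept the certificate warning once, so the self-signed certificate becomes trusted in that browser. A tiny sketch for building that URL — the IP below is a placeholder, and 8043 is the TLS port from the question:

```javascript
// URL to visit once in the browser so the printer's self-signed
// certificate can be inspected and trusted manually.
function printerCertUrl(ip, port) {
  return 'https://' + ip + ':' + port + '/';
}

console.log(printerCertUrl('192.168.1.50', 8043));
// https://192.168.1.50:8043/
```

Remember to repeat this in each browser that will be used (Safari, Chrome, Firefox), since each keeps its own trust decisions.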
I'm working on a website in my local development environment (Ubuntu 16.04) and testing the website on Chrome (58) via http://localhost.example/ - which connects to the local web server.
Running this Javascript:
$(document).ready(function() {
    if (navigator.geolocation) {
        navigator.geolocation.getCurrentPosition(showPosition);
    }
});
Triggers this error:
[Deprecation] getCurrentPosition() and watchPosition() no longer work
on insecure origins. To use this feature, you should consider
switching your application to a secure origin, such as HTTPS. See
https://sites.google.com/a/chromium.org/dev/Home/chromium-security/deprecating-powerful-features-on-insecure-origins for more details.
Why is that? I understand that public facing websites need to be running HTTPS for the geolocation library/ functionality to work. We have a number of public websites running similar code across HTTPS.
However, according to the deprecation documentation:
localhost is treated as a secure origin over HTTP, so if you're able
to run your server from localhost, you should be able to test the
feature on that server.
The above Javascript is running in-line in the HTML body loaded via http://localhost.example/test-page/ - so why am I getting the "insecure origins" error in Chrome?
Firefox (53) shows the in browser access location prompt, as expected.
Chrome considers localhost over HTTP to be secure. Since you are using the hostname localhost.example over HTTP, it is not considered secure.
Note: Firefox will behave similarly as of Firefox 55
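Chrome's rule can be sketched as a simple predicate (simplified — Chrome also treats file:// URLs and *.localhost subdomains as secure), which shows why localhost passes but localhost.example does not:

```javascript
// Simplified version of Chrome's "secure origin" check: any https:
// origin is secure, and over http: only localhost / 127.0.0.1 are.
function isSecureOrigin(protocol, hostname) {
  if (protocol === 'https:') return true;
  return protocol === 'http:' &&
    (hostname === 'localhost' || hostname === '127.0.0.1');
}

console.log(isSecureOrigin('http:', 'localhost'));         // true
console.log(isSecureOrigin('http:', 'localhost.example')); // false
```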
SSL over HTTP (i.e. HTTPS) ensures private communication between client and server. Without it, information may pass through untrusted networks in transit, and any third party (an attacker) on the network can steal it. To prevent that, browsers force the user onto a secure connection before exposing sensitive features.
On a local server, the information does not go beyond our private local network, so there is no need for this kind of security. So we could expect a browser to allow geolocation without SSL on a local server; indeed, browsers skip this validation for localhost, 127.0.0.1, and similar origins.
There are tricks to avoid such issues: for example, you can install a self-signed SSL certificate on the local server, or you can edit the Chrome configuration to allow specific domains to access geolocation, the webcam, etc.
Helpful links,
https://sites.google.com/a/chromium.org/dev/Home/chromium-security/deprecating-powerful-features-on-insecure-origins
https://ngrok.com/
Is it possible to allow an "insecure" HTTPS connection to load a KML file from a server? Right now, if there is an HTTPS error, the KML does not load. Google Earth loads the KML but asks for approval; the API just does nothing...
Nope.
This is one of my major gripes with the plugin. It'll only pull data off an HTTPS connection if there are no errors. This means that:
The SSL certificate must be valid
The SSL certificate must be trusted
There can be no authentication prompts
Passthru authentication that produces no prompting works fine
The only workaround I've found is to go in and manually trust the certificate on the client's machine. Make sure you trust the certificate in each browser that will be used (Chrome, IE, Firefox).
After speaking with Google directly about this, I wonder if this is something that can be solved, or if it's just one of the "brutal realities" put in place by the web browser container.