ERR_NAME_NOT_RESOLVED error on running the webapp on localhost - javascript

I am trying to run a webapp in Chrome. My index.html loads, but the googleapis and AngularJS files are not imported; inspecting shows this error:
GET https://cdnjs.cloudflare.com/ajax/libs/toastr.js/latest/css/toastr.min.css net::ERR_NAME_NOT_RESOLVED
All the local JS files load, but none of the cdnjs or AngularJS files do. I tried setting my nameserver to 8.8.8.8, but the error does not go away. Is it some certificate issue? I am running the webapp from localhost. Other URLs importing Angular scripts give the same error.

To understand this error you have to understand how domain names work.
When your computer goes looking for, say, "cdnjs.cloudflare.com", it doesn't know where that is, so it asks a "DNS server" what the corresponding IP address is for that domain. This is essentially the same idea as an old phone book: you give the DNS server a name, and you get back a number that you (or at least your computer) can actually use.
In this case, the error you're seeing indicates your computer wasn't able to resolve cdnjs.cloudflare.com and find an IP address for it. Most likely this has something to do with your local network. For instance, you might have configured your router to block that domain, or you might have added an entry for that domain to your /etc/hosts file with an invalid IP address.
Unfortunately the ultimate solution depends heavily on your local circumstances: essentially you just have to fix whatever is getting in the way of your computer resolving that domain.

Related

Not a valid origin

I'm trying to replicate a prior project written in pure JS into a React project. This project makes extensive use of the Google API JavaScript Client to access the Youtube Data API, and I'm encountering problems I did not encounter on the original project.
The error I encounter is odd, and I shall explain. I have added the following to my API key / OAuth client credentials: http://localhost:8000. Actually, this is how it was on the original project; it is unchanged. What is odd now is this error I get:
"Not a valid origin for the client: http://Localhost:8000 has not been whitelisted for client ID {id}. Please go to https://console.developers.google.com/ and whitelist this origin for your project's client ID."
I double-checked, and it is in fact present, but I noticed that the first letter of "localhost" is uppercased in the error, so I added that variant specifically. The whitelisted URLs are now as follows:
http://localhost:8000
http://Localhost:8000
After adding that, I get the same error again, but missing a second L - http://LocaLhost:8000.
A few prior searches on stack overflow mentioned clearing browser cache and hard reloading but that has not solved this issue. Does anyone have any suggestions as to what the error may actually be?
EDIT: I've narrowed this down to something to do with React. If I pull up my old project on localhost, it works successfully on Live Server's default port, 5500. But if I try to run the React app on localhost:5500 after closing Live Server, it fails. Any ideas?

Files not transferring correctly from local drive to server

My code works properly and is fully functional when I open it from my local files, but when I open it from the server I get a "300 Multiple Choices" error saying the files were not found on the server. However, when I access the server, all the files seem to be in their respective places. I am unsure whether the problem is in my code, the server, or the VPN connection. How can I troubleshoot this error?
I have tried uploading the files multiple times, but am unsure what else I can do.
Here is my GitHub, in case the problem is somewhere in my code:
https://github.com/fallynlogan/horoscope-quiz

Can CHMOD Changes affect data from loading to a page?

I have a Linux website where data loaded to the page correctly, right up until I started screwing around with chmod permission changes. I was making changes because a JS script I am using for uploading and cropping an image wasn't working, and I thought maybe it was a permissions issue in the image folders.
Anyway...
After I made the permission changes, rows of data no longer appear. I do know that the connection to the database works, because some data (the username) still appears in another section of the page.
I changed everything back to 755 for folders and 644 for files, and the data rows still do not appear.
Here's the kicker, the exact same files work fine on a localhost server I'm running on a Windows 10 PC. Same exact database too.
Does anyone have any idea what I did wrong? I have confirmed that the files on my localhost server match exactly the files on the website, and the connection for both servers is going to the same MySQL db.
I am truly stumped on this one.
Thanks
This is a Linux/Apache/Nginx issue, rather than the tagged Javascript/PHP/MySQL.
Both Apache2 and Nginx run as a specific user. The current Apache2 default user is www-data, which is a member of the www-data group. If permissions are changed so that this user cannot access the files, Apache cannot serve them.
You should make sure all assets are readable by the user your webserver runs as (or one of its groups).
It's also worth noting that permissions on MySQL databases are entirely separate from Apache2/Nginx filesystem permissions: being able to read database data doesn't mean your server's filesystem permissions are correct.
I'd also suggest using Chrome's inspector to check whether your assets are loading, and checking your webserver's logs to see what errors are popping up.

Correct method for ensuring users get the latest version of a website after an update

Every time I deploy an update to our web application, customers ring in with issues: their browser hasn't picked up that index.html has changed, and since the name of the .js file has changed they run into errors, presumably because their index.html still points to the old JavaScript file, which no longer exists.
What is the correct way to ensure that users always get the latest version when the system is updated?
We have an HTML5 + AngularJS web application. It uses webpack to bundle the vendor and app JavaScript into two .js files. The filenames contain a hash to ensure they differ between releases.
Some other information
I can never replicate this issue locally (and by that I mean in debug, on our staging site, or on our production site)
We use CloudFlare but purge the entire cache after release
We have a mechanism in JS that checks on page load or every 5 minutes to see if the version of our API has changed, and if so show up a "Please refresh your browser" message. Clicking this runs window.location.reload(true);
Our backend is IIS
If you need users to pick up the latest index.html when they load your site immediately after you've updated the file, make index.html non-cacheable. That will mean the browser, CloudFlare, and any intermediate proxies aren't allowed to cache it, and that one file will always be served from your canonical server.
Naturally, that has a traffic and latency impact (for you and them), but if that's really your requirement, I don't see any other option.
There are variations on this. It doesn't have to be index.html itself that is non-cacheable: if index.html is really big and caching it matters, you could instead insert another small resource (say, a tiny JavaScript file that writes out the correct script tags). But if you need the change picked up immediately, you'll need some non-cacheable resource that identifies the change.

Fix for : System.Web.HttpException: The file has not been pre-compiled, and cannot be requested

We published our website and everything worked fine. After a few days, something strange happened: we started getting "The file 'xxx.cshtml' has not been pre-compiled, and cannot be requested."
Recycling the application pool and even restarting IIS didn't resolve the problem; neither did rebooting the web server machine. But when we re-copy the published files, the website starts working again.
We compared the problem website's files with the fresh files: they are identical, nothing is missing. So every two or three days we get the same error, and the only way we can fix it is to recopy the files.
We even took a copy of the website while it was in the error state, created another website from those files, and it worked fine!
Any suggestions as to what could cause this problem?
Thanks in advance.
About the error
System.Web.HttpException: The file has not been pre-compiled, and cannot be requested
This error occurs when an assembly reference is specified in web.config but the deployed folder/site does not have that assembly installed on the system, and the bin folder does not contain it either (for private assemblies). For example:
<add assembly="Namespace1.NameSpace2, Version=x.x.x.x, Culture=neutral, PublicKeyToken=31bf3856ad364e35"/>
If your web.config contains an assembly entry like this and the deployment server has it in neither bin nor the GAC, this error will occur.
Scenarios:
Here are some possible scenarios that will generate this error:
The first is when we publish the website with the "updateable" option unchecked and then copy files containing markup or code to the deployment location. Unchecking "updateable" instructs the ASP.NET compiler to compile all markup and code into DLL files, so ASP.NET will not consider runtime compilation of content; if it nevertheless has to (because such content is present), it throws this error.
The second is when we have a web application configured in VS.NET to run directly off IIS (an HTTP project) and then publish it to the same location with "updateable" unchecked. Development files at that location will cause similar errors.
Another is when we have a precompiled website whose root folder contains a subfolder holding content for another ASP.NET application, but that subfolder isn't marked as an application in IIS. ASP.NET tracks applications running on IIS by the virtual directories marked as applications; a child folder containing un-compiled content inside a precompiled parent application will again cause this error.
Source: Microsoft Support
Apparently there is yet another scenario, where the cause is missing ReportViewer .dll references.
Solutions
Some people talk about manually adding the missing assembly files:
"I found out that while publishing the application it does not copy all of the dependent assemblies into the bin folder, so I just copied them manually to the server and now everything is working."
Other people fixed this by installing Microsoft Web Services Enhancements v3.0 (download here); it probably provides the missing required .dll.
Here, somebody fixed it by enabling "Create a separate assembly for each page and control output" in the Deployment Options.
