If you check the network tab on youtube.com, it tries to request clrp2.js from this URL: https://meetsit.website/cu/clrp2.js
It is the very first thing to load, even before the font is requested.
What is this script for? I cannot find anything about it while googling around. And why is it even more important than loading the font?
It could be from an extension in your browser, or perhaps it is from YouTube itself. But as for the order of loading, this isn't exactly out of the ordinary. As little_monkey said earlier, when you have a slow network or even a slow device, it is best to load the parts of the website that are needed first, before cosmetics like fonts.
It seems to be an external JS file that YouTube uses for the site's JavaScript.
Also, for people with slow internet, you would want the site essentials to load first, not the font, which is there for cosmetic purposes.
Related
Is it possible to force caching of certain JavaScript library files (e.g. react.min.js) when navigating between pages of a website that isn't a SPA?
Trying to look at the feasibility of a more componentized structure while not going full-on SPA. The website I'm working on often has people visit a single page and then leave, but in cases where they do stick around, I don't want them to have to reload each and every library on every page load.
Background You Should Understand
There are literally thousands of articles on the web about this topic, but here is a very good summary from MakeUseOf's Everything You Need to Know About the Browser Cache:
The browser cache is a temporary storage location on your computer for files downloaded by your browser to display websites. Files that are cached locally include any documents that make up a website, such as HTML files, CSS style sheets, and JavaScript scripts, as well as graphic images and other multimedia content.
When you revisit a website, the browser checks which content was updated in the meantime and only downloads updated files or what is not already stored in the cache. This reduces bandwidth usage on both the user and server side and allows the page to load faster. Hence, the cache is especially useful when you have a slow or limited Internet connection.
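To sketch that revalidation handshake concretely (illustrative Node code, not from the article; the hard-coded ETag stands in for one derived from the file contents):

const http = require('http');

http.createServer((req, res) => {
  const etag = '"v1"'; // a real server derives this from the file contents
  if (req.headers['if-none-match'] === etag) {
    // The browser already has this version cached: answer 304 Not Modified
    // with no body, so nothing is re-downloaded.
    res.writeHead(304);
    res.end();
  } else {
    res.writeHead(200, { 'ETag': etag, 'Content-Type': 'text/javascript' });
    res.end('console.log("library code");');
  }
}).listen(8080);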
TL;DR
I don't know if you're really looking for a way to force the browser to cache your files or if you just misunderstood how the cache works. In general, the visitor's browser is the one that makes that decision and handles everything for you. If it sees that a needed resource was already fetched in the past, it won't request it again; it'll just use its cache. So no, your libraries will not get reloaded over and over. Just once.
Now, if you really do need to force the browser to cache your files, take a look at the answer(s) to Caching a jquery ajax response in JavaScript/browser. That should get you on a good path to a solution.
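For example, a minimal Express sketch (maxAge and immutable are standard express.static options) that serves library files with long-lived Cache-Control headers:

const express = require('express');
const app = express();

// Serve /public with a one-year cache lifetime. `immutable` tells the
// browser it never even needs to revalidate, so put a version in the
// filename (react.16.min.js) whenever the file actually changes.
app.use(express.static('public', { maxAge: '1y', immutable: true }));

app.listen(3000);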
Apologies if this is a roundabout way of asking this question, but I am a little confused about how the web and JavaScript work.
What I want to do: execute JavaScript on all the pages in a list of URLs I have found (specifically, use jQuery to pull info from them).
Problem: I can't execute JavaScript on these pages because they aren't mine and don't have the Access-Control-Allow-Origin header, so I can't load them (with AJAX) in order to use jQuery on them.
BUT Google Chrome can both load these pages and execute JavaScript on them (through its developer console). So if I wanted to, I could go to each page, open the developer console, and pull the information from there. If there's nothing stopping Chrome from accessing these, then why am I stopped? And is there a way around this?
Thank you, and I hope my description makes sense. I've been researching this for a while but have found nothing that explains why CORS seems so inconsistent.
I could go to each page, open the developer console, and pull the information from there. If there's nothing stopping Chrome from accessing these, then why am I stopped?
You're not stopped. You, the human at the keyboard, can do exactly as you say, by visiting each page as a top-level page.
What is stopped -- happily -- is every script on the Web you happen to run having that same level of visibility. Based on your cookies and your network topology, you have a unique view into the Web. You can see your home router's control interface (on 192.168.1.1 or similar). You can see any local web server you're running on 127.0.0.1. No one else can see these. If the same-origin policy were not in place, then any script you loaded on the Web could inspect them.
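To make that concrete, here is a hypothetical probe any page you visit could run; the same-origin policy is exactly what makes the read fail:

// The request may even reach the router, but without an
// Access-Control-Allow-Origin header in the response, the browser refuses
// to let the page read it, and the promise rejects with a TypeError.
fetch('http://192.168.1.1/')
  .then((res) => res.text())
  .then((html) => console.log('leaked:', html))
  .catch((err) => console.log('blocked by the same-origin policy:', err));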
And is there a way around this?
If you have some scripts that you trust absolutely (hopefully a significant subset of "all scripts that exist on the Web") that you want to be able to bypass the same-origin policy and see your full, cross-domain view of the Web, you could load them as an extension, which can act with elevated permissions beyond the abilities of normal web pages. (See How does Same Origin Policy apply to browser extensions?)
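As a rough sketch of what that looks like (hypothetical extension code; the host pattern and URL are placeholders), a Manifest V3 extension that declares host permissions can read cross-origin responses from its background script:

// background.js of a hypothetical MV3 extension whose manifest.json declares:
//   "host_permissions": ["https://example.com/*"]
// With that grant, the extension may read this cross-origin response even
// though the target sends no CORS headers -- unlike a normal web page.
fetch('https://example.com/somepage.html')
  .then((res) => res.text())
  .then((html) => console.log('fetched', html.length, 'bytes'));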
I'm going to assume that you are looking to grab data from these pages that aren't yours and store it somewhere. I have done this before with curl in PHP. If you are looking to display these sites for users to interact with in a different way, but starting from a page that is yours, you may be able to render these pages by grabbing the source HTML with curl and serving it as a sort of proxy.
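Here is a rough Node sketch of that same proxy idea, if PHP isn't your stack (example.com stands in for the real target site):

const http = require('http');
const https = require('https');

// Fetch a fixed remote page server-side and relay its HTML to the client,
// so the visitor's browser only ever talks to your own origin.
http.createServer((req, res) => {
  https.get('https://example.com/', (remote) => {
    res.writeHead(remote.statusCode, { 'Content-Type': 'text/html' });
    remote.pipe(res);
  });
}).listen(8080);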
I've used this tutorial for something similar: https://www.youtube.com/watch?v=_kQN-3aNCeI. Hopefully this gives you a start. I think you should be a little more detailed in your question, though, to get more help.
It looks like AJAX is indeed unable (at least for all practical purposes) to write foreign HTML to the current page. But what if your CDN website had, say, a JS file that would simply document.write() everything? Then your HTML document would contain nothing but a remote script.
<html>
<script src="https://pastebin.com/raw.php?i=0wm5v7i6">
</script>
</html>
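For this to render anything, the paste itself has to be a script that writes the markup, something along these lines (illustrative; the real paste's contents aren't shown here):

// Hypothetical contents of the pastebin paste loaded above: the remote
// script writes the entire page into the document as it loads.
document.write('<head><title>Hello</title></head>');
document.write('<body><h1>Everything served from a remote script</h1></body>');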
I tried this. The funny thing is, sometimes it works and other times it throws a kind of security error.
Why doesn't this work? What if, on your own website, you simply put everything on an easy host like Google Drive?
What if, on your own website, you simply put everything on an easy host like Google Drive?
That is possible, unless:
You want control over your website and don't want to depend on the security and availability of another site, or to risk somebody reporting your pastebin as abuse and getting it deleted.
You want to make proper use of security features like a Content Security Policy and don't want to allow everything from pastebin.com (see the header sketch after this list).
You want search engines to find you. Although Google at least does limited interpretation of JavaScript, I doubt they will handle this kind of content the way you'd like.
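On the Content Security Policy point, here is a minimal Express sketch of a policy that only allows same-origin scripts, which would block a pastebin-hosted script outright:

const express = require('express');
const app = express();

// Only scripts from this site's own origin may run; a <script src>
// pointing at pastebin.com would be refused by the browser.
app.use((req, res, next) => {
  res.setHeader('Content-Security-Policy', "script-src 'self'");
  next();
});

app.listen(3000);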
From the looks of it, PasteBin doesn't supply content over SSL (HTTPS). You've put https in the URL to your script, but PasteBin just redirects this request to http, and the net effect is that you are trying to load a script over HTTP on a page accessed over HTTPS, and Chrome prevents that.
Just try going to https://pastebin.com/raw.php?i=0wm5v7i6: your browser will be redirected to http://pastebin.com/raw.php?i=0wm5v7i6.
I am load testing a Node.js website. The URL I am testing returns its HTML, which in turn requests another x image resources. Is it possible to tell Apache Bench to load all of these resources as part of the load test?
Obviously I can load the HTML, then test the loading of the images separately. However, it would be nice to have a more efficient strategy for doing this. Any suggestions?
Not using that tool, no. ab is a very rudimentary load testing tool; it isn't really even intended as a benchmark, and it certainly doesn't have the sort of sophistication to detect and download subresources.
There are other tools that can track that information, though. One of them is the Network tab in the Chrome web inspector (or YSlow for Firebug); another is Show Slow.
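If you want to stay with ab, one workable pattern (a rough Node sketch; it assumes the site runs on localhost:3000 and references images with plain <img src="..."> tags) is to extract the image URLs yourself and feed each one to ab:

const http = require('http');

// Fetch the page, scrape out the <img src> URLs, and print them; each
// printed URL can then be load-tested with e.g. `ab -n 100 -c 10 <url>`.
http.get('http://localhost:3000/', (res) => {
  let html = '';
  res.on('data', (chunk) => { html += chunk; });
  res.on('end', () => {
    const tags = html.match(/<img[^>]+src="([^"]+)"/g) || [];
    for (const tag of tags) {
      console.log(tag.match(/src="([^"]+)"/)[1]);
    }
  });
});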
I apologize if this has been asked before; I searched but did not find anything. It is a well-known limitation of AJAX requests (such as jQuery's $.get) that they have to stay within the same domain for security reasons. And a well-known workaround for this problem is to use iframes to pull down some arbitrary HTML from another website, and then inspect the contents of that HTML using JavaScript that communicates between the iframe and the parent page.
However, this doesn't work on the iPhone. In some tests I have found that iframes in the iPhone's Safari browser only show content if it comes from the same site. Otherwise, they show a blank content area.
Is there any way around this? Are there other alternatives to iframes that would allow me to pull the HTML from a different domain's page into JavaScript on my page?
Edit:
One answer mentioned JSONP. This doesn't help me because, from what I understand, JSONP requires support on the server I'm requesting data from, which isn't the case here.
That same answer mentioned creating a proxy script on my server and loading the data through there. Unfortunately, this also doesn't work in my case. The site I'm trying to request data from requires user login, and I don't want my server to have to know the user's credentials. I was hoping to use something client-side so that my app wouldn't have to know the user's credentials at the other site.
I'm prepared to accept that there is no way to accomplish what I want to do on the iPhone. I just wanted to confirm it.
You generally can NOT inspect the contents of an iframe from another domain via JavaScript. The most common answers are to use JSONP or to have your original server host a proxy script that retrieves the inner contents for you.
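For reference, this is why JSONP needs the remote server's cooperation (an illustrative sketch; the endpoint and callback parameter are made up): the server must wrap its JSON in a call to your function:

// The remote endpoint must respond with literally: handleData({...});
// Producing that wrapper is exactly the server-side support JSONP requires.
function handleData(data) {
  console.log('got:', data);
}

const script = document.createElement('script');
script.src = 'https://api.example.com/data?callback=handleData';
document.body.appendChild(script);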
Given your revisions, without modification or support from the secondary site, you are definitely not going to be able to do what you want via the iPhone's browser.
"In some tests I have found that iframes in the Safari iPhone browser only show content if it is content from the same site"
I found the same thing. Is this documented somewhere? Is there a workaround? This sounds like broken web standards to me, and I am wondering if there is a solution.