Most of my shared hosting accounts have changed theme from X3 to paper_lantern. However, I find the paper_lantern theme takes much longer to load, often 15 to 20 seconds before the page is fully displayed. During this time, the GENERAL INFORMATION side panel shows a loading spinner beside the THEME item, so I suspect that is somehow involved in the delay.
In DevTools, the Network tab has a bunch of entries like the ones in the image below which, when added up, account for almost the entire wait time.
Does anyone know of a way, perhaps using GreaseMonkey/TamperMonkey or AdBlock(?), to abort whatever process is happening here and just get on with displaying the cPanel panels/icons?
(Browser in use is Google Chrome)
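A TamperMonkey userscript could short-circuit the slow requests by patching XMLHttpRequest before the page's own scripts run. This is only a sketch: the URL fragments below are made up, so you would need to read the real request paths off the Network tab and substitute them.

```javascript
// ==UserScript==
// @name         Skip slow cPanel stats requests (sketch)
// @match        https://*:2083/*
// @run-at       document-start
// ==/UserScript==

// Hypothetical URL fragments; replace with the actual slow request
// paths you see in DevTools.
var BLOCKED_FRAGMENTS = ['statistics', 'updatenotices'];

function shouldBlock(url, fragments) {
  return fragments.some(function (f) { return url.indexOf(f) !== -1; });
}

// In the browser, swallow matching XHRs before they are ever sent.
if (typeof XMLHttpRequest !== 'undefined') {
  var realOpen = XMLHttpRequest.prototype.open;
  XMLHttpRequest.prototype.open = function (method, url) {
    if (shouldBlock(String(url), BLOCKED_FRAGMENTS)) {
      this.send = function () {};  // request is never dispatched
      return;
    }
    return realOpen.apply(this, arguments);
  };
}
```

Note this only helps if the spinner is waiting on XHRs; if cPanel uses a different transport for these calls, the same idea would have to be applied to that API instead.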
The application checks when a certain websocket goes offline.
Whenever this happens, a few popups are shown in order, each with its own functionality.
This is the only time these popups will be shown.
On some browsers, not all images are shown on the popups.
On other browsers, the 2nd popup is not shown at all.
I cannot prerender anything on the server side. Everything is hosted in an iframe by another party and I have no control over the parent framework or any of the servers.
I have no idea how to start tackling this issue.
Should I just render it anyway and hide it in CSS? (Will the browser load/show it, or is it optimized to not even download the images?)
Should I create the component but update the z-index?
... a better solution?
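One possible middle ground, regardless of how the popups themselves are rendered, is to warm the image cache up front so that whenever a popup does appear its images are already downloaded. A minimal sketch (the constructor is injected here purely to make it testable; in the browser you would pass `window.Image`):

```javascript
// Preload a list of image URLs so they are cached before any popup
// needs them. ImageCtor is injectable for testing; in the browser it
// would simply be Image.
function preloadImages(urls, ImageCtor) {
  return urls.map(function (url) {
    var img = new ImageCtor();
    img.src = url;  // setting src triggers the download
    return img;
  });
}

// Browser usage (hypothetical URLs):
//   preloadImages(['/img/popup1.png', '/img/popup2.png'], Image);
```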
I have a situation in which I want to allow the user to download a PDF. Generating this PDF can take a couple of seconds, and I don't want a webserver thread to wait for the generation of the PDF, since that means the thread isn't available to handle other incoming requests.
So, what I would like to do is introduce the following URL patterns:
/request_download which invokes a background job to generate the PDF;
/document.pdf which will serve the PDF once it is generated
The use case is as follows.
A user clicks 'Download PDF'. This invokes a piece of JavaScript that shows a spinner, makes a request to /request_download and receives an HTTP 202 Accepted, indicating the request was accepted and a background job was created. The JS should then poll the /request_download URL periodically until it gets an HTTP 201 Created, indicating that the PDF has been created. A Location header is included, which the JS uses to forward the client to /document.pdf. This has to happen in a new window (or tab; at the least it shouldn't replace the current page). The level of expertise of our users is very low, so if I forwarded the current page to the URL, they might not know how to get back to the website.
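The polling half of that flow could be sketched roughly like this. `fetchFn`, `intervalMs` and `maxAttempts` are made-up parameter names; `fetchFn` is injectable for testing and would wrap the browser's `fetch` (or an XHR shim for IE) in practice:

```javascript
// Poll /request_download until the server answers 201 Created, then
// return the Location header pointing at the finished PDF.
async function pollForDocument(url, opts) {
  opts = opts || {};
  var fetchFn = opts.fetchFn || function (u) { return fetch(u); };
  var intervalMs = opts.intervalMs || 1000;
  var maxAttempts = opts.maxAttempts || 60;
  for (var i = 0; i < maxAttempts; i++) {
    var res = await fetchFn(url);
    if (res.status === 201) {
      return res.headers.get('Location');  // e.g. /document.pdf
    }
    // Still 202 Accepted: wait and try again.
    await new Promise(function (r) { setTimeout(r, intervalMs); });
  }
  throw new Error('PDF generation timed out');
}
```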
The problem
Now, window.open works fine if it is invoked by the user via a click event. However, I want to invoke window.open in a callback function through setInterval as soon as I see that 201 Created response. Browsers won't like that: they interpret it as a popup, so it gets caught by the popup blockers in IE8-10 as well as Chrome. Which makes sense.
I also can't open a new window and invoke the JavaScript over there. In that case, users would see a blank page with a spinner. If that page then forwards to /document.pdf, IE shows its yellow warning bar saying it prevented files from being downloaded. Chrome and Firefox handle this without any problems, but a large percentage of our users are on IE.
What I've tried, and didn't work
window.open(location)
Setting the src of an iframe to the retrieved location
Manually adding an <a> tag to the document body and trying to click it using javascript
Adding an <a> tag to the document body and invoking a MouseEvent on it
Partially works: opening a new window on user click and storing a reference to it, then performing the asynchronous requests and, upon completion, setting the location of the earlier opened window. But IE will block this and say it prevented files from being downloaded. So still not a fully working solution.
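For reference, that last "partially works" approach looks roughly like this, with the two browser pieces injected so the flow can be exercised outside a browser (`openWindow` stands in for `window.open` and `poll` for the 201-polling request loop):

```javascript
// Open the window synchronously inside the click handler (so popup
// blockers treat it as user-initiated), then point it at the PDF once
// polling succeeds.
function downloadViaPreopenedWindow(openWindow, poll) {
  var win = openWindow('about:blank');  // must run in the click event
  return poll().then(
    function (location) { win.location = location; return win; },
    function (err) { win.close(); throw err; }
  );
}
```

In the browser this would be wired up as `downloadViaPreopenedWindow(window.open, pollFn)` from the click handler; it still runs into IE's file-download warning, as described above.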
I'm fresh out of ideas on how I can make this work and decided to ask The Internet for help. I'm hoping you guys can help me with this, or recognise the problem. Thanks in advance!
I have a demo site built in WordPress.
In the head of the template I injected a script tag that loads a special script.
This script does a document.write to load another script from a server.
The script from the server can in turn do other document.writes, loading as many as 10 scripts.
It is extremely important for these to load like this, as these scripts make changes to the page before it is first shown and thus eliminate any visible change of content for the visitor.
Problem:
I am using a visitor experience recording service that loads my page inside an iframe and overlays its mouse actions and heat maps.
When viewing the heat maps built in Flash, there is an odd behavior that, I presume, causes my script to load asynchronously: the result is that the page gets cleared, because the second document.write happens after DOM ready and thus wipes all content.
I am presuming that, in this particular case, the page for some reason continues to load while my script runs its series of document.writes.
The only difference from mouse actions recordings that work fine is the presence of the Flash heat map.
I have Googled this left and right and have found some similar reports, some also mentioning the presence of Flash, but no clue that would point me toward a solution.
Has anyone seen this behavior before and found a solution?
Please do not ask for links, as I can only give you the link to the site, which has no issues outside the recording service's iframe, and I cannot give out my credentials for that service.
Note: asynchronous loading of my script would cause the default content to show for a second or two before being replaced. This would be visible to the visitor and thus unacceptable.
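The clearing behavior itself is spec-mandated: calling document.write() on a document whose parser has already finished implies a document.open(), which resets the whole document. A tiny illustrative mock of that rule (this is a model of the behavior, not the real DOM API):

```javascript
// Mock document showing why a late document.write wipes the page:
// write() on a finished document implies open(), which clears it.
function makeMockDocument() {
  return {
    content: '<p>original page</p>',
    parsingDone: true,
    open: function () {
      this.content = '';        // the implicit open wipes everything
      this.parsingDone = false;
    },
    write: function (html) {
      if (this.parsingDone) this.open();
      this.content += html;
    }
  };
}
```

So if the Flash heat map somehow delays the injected script until after the parser closes the document, the first late write() would blank the page exactly as described.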
I have a site using some ajax here: http://deezteez.com/
If you sort by "Newest" (top-right drop-down box) you will notice that the new images (of products that were added recently) take about 30 seconds to actually load in, even though the page is done loading. Older images don't do this, even if I start with a clear cache.
Anyone have an idea of what would be causing this?
Chrome's console seems to show that your server is simply slow. The graph below shows how your images load in. The light colored bar is when the image is requested. The dark colored bar is the image actually being downloaded.
And you can see they all get requested at the same time. But then it takes a while for the server to respond to those requests. Once the server responds, things seem to download quickly, but that response seems quite lagged.
I have no idea what is going on behind the scenes on your server, but here are some suggestions:
Drastically lower the product count per page, so that far fewer images are requested at once.
Use CDN services to speed up static asset delivery and even provide geographically local image download servers.
If you have image data being generated on the fly or pulled from the database on each request, DO NOT DO THAT. Or, if you need to do that, use server-side caching to prevent doing it over and over again.
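That last point can be sketched as a minimal in-memory cache (Node-flavored JavaScript here; the names and the TTL value are made up, and `loadFn` stands in for whatever expensive per-request work you currently do, e.g. pulling image data from the database):

```javascript
// Minimal TTL cache: only call loadFn when there is no fresh entry.
var cache = new Map();

function getCached(key, loadFn, ttlMs, now) {
  now = now || Date.now;
  var hit = cache.get(key);
  if (hit && now() - hit.at < ttlMs) return hit.value;  // fresh: reuse
  var value = loadFn(key);                              // stale/missing: rebuild
  cache.set(key, { value: value, at: now() });
  return value;
}
```

A real deployment would more likely use memcached, Redis, or the framework's own cache layer, but the shape of the fix is the same: compute once, serve many.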
I need a way, or tools, to test the actual perceived time for the browser to render the entire page to users. Any suggestions?
The reason I ask is that Firebug and YSlow only report the DOMContentLoaded and onLoad times.
For instance, my application reports 547 ms (onLoad: 621 ms) for the contents. But the actual content is rendered after around 3 seconds. I know this because I actually counted 1, 2, 3 slowly from the moment I hit Enter in the browser's URL field to the moment the content appeared in front of my eyes. So I know that neither 547 ms nor 621 ms represents the actual time it takes for the page to load.
Not sure if this helps, but my application
renders JSON data on the server side and saves it as a JavaScript variable, along with the rest of the page's HTML, before the server returns the entire HTML to the browser
loads jQuery 1.5 and the jQuery template plugin
grabs the JSON data from the variable defined in step 1
uses jQuery templates to render the page.
Technically, no Ajax is involved here, and the images on the page are all cached. I don't see Firebug downloading any of them.
[Edit]
What I'm trying to figure out is: after the Firebug-reported onLoad time, which in my case is 621 ms, until the moment the page is completely loaded to my eyes (at least 3 seconds), what happened to the 2.4 s in between? What took place there? Is the browser doing something? Is something blocking? The network? What is it?
Google Chrome has excellent auditing built in. Your results will be skewed because it's one of the fastest browsers right now, but it will give you exact measurements of how long it takes for Chrome to render. =)
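You can also put a number on that post-onload gap yourself with the Navigation Timing API. A rough sketch, assuming the jQuery template render is the last thing that puts content on screen (`renderGapMs` is a made-up helper name, kept pure so it is easy to test):

```javascript
// Gap between the browser's onload time and when content actually
// appeared. timing is a PerformanceTiming-like object; firstFrameMs is
// a millisecond timestamp (since navigation start) for the first
// visible frame of content.
function renderGapMs(timing, firstFrameMs) {
  var onloadMs = timing.loadEventEnd - timing.navigationStart;
  return firstFrameMs - onloadMs;
}

// Browser usage sketch: call right after the jQuery template renders,
// letting requestAnimationFrame mark the first painted frame.
//   $('#tmpl').tmpl(data).appendTo('#content');
//   requestAnimationFrame(function () {
//     console.log('gap after onload:',
//       renderGapMs(performance.timing, performance.now()), 'ms');
//   });
```

With the numbers above, an onLoad of 621 ms and content appearing around 3000 ms would give a gap of roughly 2.4 s, which is the time the jQuery templating and painting are taking after the load event fires.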