On my live website, there's a big problem I'm having. On the page linked to, which is also the home page of the site, there is an image gallery that works as follows:
1) I have a directory in my HTML file structure called public_html/images/lightbox where all of the images to be displayed are stored. A PHP script dynamically reads the contents of that directory with the scandir function, implodes the resulting PHP array into a string, assigns that string to a JavaScript string, and then splits the JavaScript string into a JavaScript array (which stores the file paths of the images in the public_html/images/lightbox directory).
2) There are two JavaScript functions called prev() and next(). next() is called automatically every 4.5 seconds, incrementing an index variable that moves through the array of images. This goes on in the background regardless of user interaction with the page; a timer set with setTimeout makes next() execute every 4500 ms.
3) The user can also trigger prev() and next() by pressing two buttons, which are absolutely positioned relative to the div that contains the gallery. When either button is pressed, the corresponding function executes immediately, and the timer for the next automatic call to next() restarts from 4500 ms.
It all works fine, but when I first open the site in a browser, the images lag as they change. After all the images have been cycled through once, they no longer lag (is this because they have been cached by the browser?), but on that first viewing the lag as the image URL changes spoils the user experience and makes the website come across as poorly designed.
Here's the code. Many thanks.
<?php
$path = "images/lightbox/";
$gallery = scandir($path);
// Drop the "." and ".." entries that scandir() returns first
array_shift($gallery); array_shift($gallery);
foreach ($gallery as &$image) {
    $image = $path . $image;
}
unset($image); // break the reference left over from the foreach
$gallery_string = implode(" ", $gallery);
?>
<script>
$(function() {
    $("#prev").on("click", $.prev);
    $("#next").on("click", $.next);
});
</script>
<script>
var gallery = [];
var gallery_string = "<?php echo $gallery_string; ?>";
gallery = gallery_string.split(" ");
var ImageCnt = 0;

$.prev = function(event) {
    if (ImageCnt == 0) { ImageCnt = gallery.length - 1; }
    else { ImageCnt -= 1; }
    $("#lightbox").css("background", "url(" + gallery[ImageCnt] + ")").css("background-size", "670px");
    clearTimeout(t);
    t = setTimeout($.next, 4500);
    event.stopPropagation();
};

$.next = function(event) {
    if (ImageCnt < gallery.length - 1) { ImageCnt++; }
    else { ImageCnt = 0; }
    $("#lightbox").css("background", "url(" + gallery[ImageCnt] + ")").css("background-size", "670px");
    if (event != undefined) {
        clearTimeout(t);
        event.stopPropagation();
    }
    t = setTimeout($.next, 4500);
};

var t = setTimeout($.next, 4500);
</script>
The lag has nothing to do with the script; it's mainly due to these reasons:
1) There are many resources being loaded on your site (images, sound).
2) The background music you have on is ~4.5 MB in size, and that seems to be the main cause.
3) This also depends on your hosting platform and its speed.
I recommend getting rid of the music: besides causing the lag, it would, in my opinion, hurt the experience of your site's visitors. Also try using fewer graphics, use GIF/PNG when possible rather than JPEG, and if you must use JPEG, try compressing the files.
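Independently of that, a common way to reduce the first-cycle stutter the question describes is to preload the gallery images once, up front, so the browser has them cached before next() swaps the background. A minimal sketch, assuming the gallery array from the question's script is in scope:

// Preload each gallery image once so the first pass through the
// slideshow doesn't wait on a network fetch for every change.
var preloaded = [];
for (var i = 0; i < gallery.length; i++) {
    var img = new Image();
    img.src = gallery[i];   // starts the download; the browser caches the file
    preloaded.push(img);    // keep references so the objects aren't collected
}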
I have some images that I am displaying through JSON. The file refreshes the content every 10 seconds so that newly added images show without a page refresh.
I am struggling to add slideshow code without the two refreshes clashing with each other.
I would really appreciate some help.
This is my current code.
function update_content() {
    $.getJSON("showImages.php", function(data) {
        $("#slides").html(
            data.result.map(({image1}) => `<img class="slides" src="data:image/png;base64,${image1}" />`).join("")
        );
        setTimeout(update_content, 10000);
        var index = 0;
        slideshow();
        function slideshow() {
            var i;
            var x = document.getElementsByClassName("slides");
            for (i = 0; i < x.length; i++) {
                x[i].style.display = "none";
            }
            index++;
            if (index > x.length) { index = 1; }
            x[index - 1].style.display = "block";
            setTimeout(slideshow, 20000);
        }
    })
}
$(function() {
    update_content()
})
The way this is written, there's no way the refreshes wouldn't clash with each other and cause a large mess of updates. What you have here will, every 10 seconds, ping the server for some JSON and then spawn what is essentially a thread (not in the technical sense, but in behavior) that every 20 seconds hides all the slides and shows one slide. By about 60 seconds into the page running, you have six instances of the slideshow() function queued to run: the newly created one trying to show the first slide, the two next most recently created showing the second, the next two showing the third, and so on. And because network lag is unpredictable, they'll all fire at slightly different times in an unpredictable order.
The main problem is setTimeout(slideshow, 20000). It's not needed as this is currently written: slideshow() is already being run every 10 seconds by the outer function, so it doesn't need to re-schedule itself in its own timeout. And if you're refreshing at that interval anyway, the slideshow logic is useless: the server only needs to return one image in its JSON, and the whole slideshow() function can be deleted.
Though I question why you need a network round trip every 10 seconds to begin with. Unless this is a real-time snapshot of a camera feed or something, you can just give the JavaScript a large array of images to cycle through and ping the server for new images every 10 minutes or so instead. If you go this route, move slideshow() out of update_content() and call it once from the jQuery ready function to set it running, then leave it be, as in the sketch below. If you do need to call slideshow() in the getJSON callback, be sure to clearTimeout on the previous setTimeout(slideshow, ...) return value so you don't create the pseudo-threads described above.
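A minimal sketch of that restructuring, keeping the question's endpoint and markup (showImages.php, #slides) and assuming a 10-minute refresh is acceptable:

var index = 0;

function update_content() {
    $.getJSON("showImages.php", function(data) {
        // Rebuild the slides, then schedule the next refresh
        $("#slides").html(
            data.result.map(({image1}) => `<img class="slides" src="data:image/png;base64,${image1}" />`).join("")
        );
        setTimeout(update_content, 10 * 60 * 1000); // every 10 minutes
    });
}

function slideshow() {
    var x = document.getElementsByClassName("slides");
    if (x.length > 0) {
        for (var i = 0; i < x.length; i++) {
            x[i].style.display = "none";
        }
        index++;
        if (index > x.length) { index = 1; }
        x[index - 1].style.display = "block";
    }
    setTimeout(slideshow, 20000); // the only place slideshow re-schedules itself
}

$(function() {
    update_content(); // fetch the images and keep them fresh
    slideshow();      // start the single slideshow loop
});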
I have an internet radio station and I need a script that will display a picture of the current song in a particular div with an id. The image is automatically uploaded via FTP to the server each time the song changes.
HTML:
<div id="auto"></div>
JS:
$(document).ready(function() {
    $('#auto').html('<img src="artwork.png">');
    refresh();
});
function refresh() {
    setTimeout(function() {
        $('#auto').html('<img src="artwork.png">');
        refresh();
    }, 1000);
}
I tried this, but all I get is the image loading once; when it changes, I have to manually refresh the whole page to see the new one.
I'll point out multiple things here.
I think your code is just fine if you are going for recursive setTimeout calls instead of a single repeating setInterval.
File Caching
Your problem is probably the browser's cache, since you are using the same image name and directory all the time. Browsers compare the file name and directory to decide whether to load an image from cache or request it from the server. There are different tricks you can use to force a reload from the server in this particular case:
1) Use different file names/directories for the songs loaded dynamically.
2) Use a randomized GET query (e.g. image.png?v=<current timestamp>).
Your method for switching
You are replacing the file over FTP; I wouldn't recommend that. Instead, have all your albums and thumbnails uploaded to the server and switch between them dynamically. That is more efficient and less error-prone, and it also helps you achieve method #1 from the previous section.
Loading with constant refresh
I would like to highlight that if you are using Node.js or nginx servers, which are event-based, you can achieve the same functionality with much less traffic. You don't need a refresh method, since those servers can push data to the browser on specific events, telling it to load a specific resource at that moment; no constant polling is required.
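For example, with Server-Sent Events (one browser API for this kind of push), the client side could look roughly like the sketch below; the /nowplaying endpoint is hypothetical and would have to be implemented on the server:

// Hedged sketch: listen for track-change events pushed by the server.
// '/nowplaying' is a hypothetical SSE endpoint that sends the new artwork URL.
var source = new EventSource('/nowplaying');
source.onmessage = function(e) {
    // e.data is assumed to carry the artwork URL for the new track
    document.getElementById('auto').innerHTML = '<img src="' + e.data + '">';
};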
Consider your options; I tried to be as comprehensive as I could.
At the top level: the browser caches the image based on its absolute URL. You can add an extra query string to the URL to trick the browser into treating it as a new image. In this case, the new URL of artwork.png would be artwork.png?timestamp=123.
Check this out for the refresh():
function refresh() {
    setTimeout(function() {
        var timestamp = new Date().getTime();
        // reassign the url to be like artwork.png?timestamp=456784512 based on the timestamp
        $('#auto').html('<img src="artwork.png?timestamp=' + timestamp + '">');
        refresh();
    }, 1000);
}
Alternatively, you may assign an id attribute to the image and change only its src URL:
HTML:
<img id="myArtworkId" src="artwork.png"/>
JS, in the refresh method:
$('#myArtworkId').attr('src', 'artwork.png?timestamp=' + new Date().getTime());
You can use window.setInterval() to call a method every x seconds and clearInterval() to stop calling it. View this answer for more information.
// Array containing image srcs for the demo
var $srcs = [
    'https://www.petmd.com/sites/default/files/Acute-Dog-Diarrhea-47066074.jpg',
    'https://www.catster.com/wp-content/uploads/2018/05/Sad-cat-black-and-white-looking-out-the-window.jpg',
    'https://img.buzzfeed.com/buzzfeed-static/static/2017-05/17/13/asset/buzzfeed-prod-fastlane-03/sub-buzz-25320-1495040572-8.jpg?downsize=700:*&output-format=auto&output-quality=auto'
];
var $i = 0;
$(document).ready(function() {
    $('#auto').html('<img src="https://images.pexels.com/photos/617278/pexels-photo-617278.jpeg?auto=compress&cs=tinysrgb&dpr=1&w=500">');
    // call the method every 2 seconds
    var intervalId = window.setInterval(function() {
        refresh();
    }, 2000);
    // To stop calling the refresh method, uncomment the line below
    //clearInterval(intervalId);
});
function refresh() {
    $('#auto').html('<img src="' + $srcs[$i++] + '">');
    // Wrap the index back to 0 at the end of the array
    if ($srcs.length == $i) {
        $i = 0;
    }
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
<div id="auto"></div>
I'm looking for something that will allow me to have a page in the root directory randomly selected to load each time the page is first loaded, and again on subsequent reloads/refreshes. For example, I have an index.html and an index2.html in the root directory that are slightly different from each other.
I have tried the Googles but can't find anything; everything I have tried isn't working.
The JavaScript I'm currently trying to use is:
var howMany = 3; // number of pages below; count them
howMany = howMany - 1;
var page = new Array(howMany + 1);
page[0] = "index.html";
page[1] = "index2.html";
page[2] = "index.html";
function rndnumber() {
    var randscript = -1;
    while (randscript < 0 || randscript > howMany || isNaN(randscript)) {
        randscript = parseInt(Math.random() * (howMany + 1));
    }
    return randscript;
}
quo = rndnumber();
quox = page[quo];
location.href = (quox);
If I test it locally, it seems to work but gets stuck in an infinite, automatically reloading/refreshing loop for some reason. When I upload it to the server there's no reloading loop, but the randomization doesn't work; it just loads index.html.
I have a set up a test page here: http://www.samnorris.co.nz/test2/ which has both the index.html and index2.html in the root directory
Can anyone offer any clue why this might not be working properly and/or a better solution?
Thanks!
The JavaScript is executed on the client on page load. So every time you load the page, it randomly selects a page and redirects to it. That page then has the same JavaScript embedded, which, when executed, selects a random page and loads that, and so on.
A better solution would be to handle the serving of the page on the server, using a server-side language such as PHP. The code would be something like this:
index.php
<?php
// Pick one of the two pages with equal probability
$randNumber = mt_rand(1, 2);
if ($randNumber == 1)
{
    include 'index_1.html';
}
else
{
    include 'index_2.html';
}
You have to, somehow, tell the script that it has already been redirected. This could probably be achieved using the location hash:
if (location.hash === "#redirected") {
    location.hash = "";
}
else {
    quo = rndnumber();
    quox = page[quo];
    location.href = quox + "#redirected";
}
Create a separate HTML file with the following content in the same directory as your "index*.html" files:
<script type="text/javascript">
var pages = [
    "index.html",
    "index2.html",
    "index3.html"
];
function randomPage() {
    // Math.floor gives each page an equal chance; Math.round would make
    // the first and last pages half as likely as the others
    return pages[Math.floor(Math.random() * pages.length)];
}
location.href = randomPage();
</script>
I'm trying to download the HTML of a website that is almost entirely generated by JavaScript. So, I need to simulate browser access and have been playing around with PhantomJS. Problem is, the site uses hashbang URLs and I can't seem to get PhantomJS to process the hashbang -- it just keeps calling up the homepage.
The site is http://www.regulations.gov. The default takes you to #!home. I've tried using the following code (from here) to try and process different hashbangs.
if (phantom.state.length === 0) {
    if (phantom.args.length === 0) {
        console.log('Usage: loadreg_1.js <some hash>');
        phantom.exit();
    }
    var address = 'http://www.regulations.gov/';
    console.log(address);
    phantom.state = Date.now().toString();
    phantom.open(address);
} else {
    var hash = phantom.args[0];
    document.location = hash;
    console.log(document.location.hash);
    var elapsed = Date.now() - new Date().setTime(phantom.state);
    if (phantom.loadStatus === 'success') {
        if (!first_time) {
            var first_time = true;
            if (!document.addEventListener) {
                console.log('Not SUPPORTED!');
            }
            phantom.render('result.png');
            var markup = document.documentElement.innerHTML;
            console.log(markup);
            phantom.exit();
        }
    } else {
        console.log('FAIL to load the address');
        phantom.exit();
    }
}
This code produces the correct hashbang (for instance, I can set the hash to '#!contactus'), but it doesn't dynamically generate any different HTML, just the default page. It does, however, correctly output that hash when I print document.location.hash.
I've also tried to set the initial address to the hashbang, but then the script just hangs and doesn't do anything. For example, if I set the url to http://www.regulations.gov/#!searchResults;rpp=10;po=0 the script just hangs after printing the address to the terminal and nothing ever happens.
The issue here is that the content of the page loads asynchronously, but you're expecting it to be available as soon as the page is loaded.
In order to scrape a page that loads content asynchronously, you need to wait to scrape until the content you're interested in has been loaded. Depending on the page, there might be different ways of checking, but the easiest is just to check at regular intervals for something you expect to see, until you find it.
The trick here is figuring out what to look for: you need something that won't be present on the page until your desired content has been loaded. In this case, the easiest option I found for top-level pages is to manually enter the H1 text you expect to see on each page, keyed to the hash:
var titleMap = {
    '#!contactUs': 'Contact Us',
    '#!aboutUs': 'About Us'
    // etc. for the other pages
};
Then, in your success block, you can set a recurring interval that looks for the expected title in an h1 tag. When it shows up, you know you can render the page:
if (phantom.loadStatus === 'success') {
    // set a recurring check every 300 milliseconds
    var timeoutId = window.setInterval(function () {
        // look for the title element you expect to see
        var h1s = document.querySelectorAll('h1');
        if (h1s.length > 0) {
            // h1s is a NodeList, not an array, hence the odd syntax here
            Array.prototype.forEach.call(h1s, function(h1) {
                if (h1.textContent.trim() === titleMap[hash]) {
                    // we found it!
                    console.log('Found H1: ' + h1.textContent.trim());
                    phantom.render('result.png');
                    console.log("Rendered image.");
                    // stop the cycle
                    window.clearInterval(timeoutId);
                    phantom.exit();
                }
            });
            console.log('Found H1 tags, but not ' + titleMap[hash]);
        } else {
            console.log('No H1 tags found.');
        }
    }, 300);
}
The above code works for me. But it won't work if you need to scrape search results: you'll need to figure out an identifying element or bit of text that you can look for without having to know the title ahead of time.
Edit: Also, it looks like the newest version of PhantomJS now triggers an onResourceReceived event when it gets new data. I haven't looked into this, but you might be able to bind a listener to this event to achieve the same effect.
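A rough sketch of what that might look like with the newer PhantomJS WebPage API (untested against this site; note it is a different API from the legacy phantom.* calls in the question):

// Sketch using PhantomJS's webpage module rather than the legacy phantom.* API.
var page = require('webpage').create();
page.onResourceReceived = function(response) {
    // fires for every resource (and chunk) the page fetches
    console.log('Received: ' + response.url);
};
page.open('http://www.regulations.gov/#!contactUs', function(status) {
    // resources may still be arriving at this point, so you would still
    // combine this with a readiness check like the interval above
    console.log('Load status: ' + status);
    phantom.exit();
});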
I'm searching for a JS script which will show some message (something like "Loading, please wait") until the page has loaded all of its images.
Important: it mustn't use any JS framework (jQuery, MooTools, etc.); it must be plain JavaScript.
The message must disappear when the page is loaded.
Yeah an old-school question!
This goes back to those days when we used to preload images...
Anyway, here's some code. The magic is the "complete" property on the document.images collection (Image objects).
// set up a timer; adjust the 200 to some other number of milliseconds if desired
var _timer = setInterval(imgloaded, 200);
function imgloaded() {
    // assume they're all loaded
    var loaded = true;
    // test all images for the "complete" property
    for (var i = 0, len = document.images.length; i < len; i++) {
        if (!document.images[i].complete) { loaded = false; break; }
    }
    // if loaded is still true, change the HTML
    if (loaded) {
        document.getElementById("msg").innerHTML = "Done.";
        // clear the timer
        clearInterval(_timer);
    }
}
Of course, this assumes you have some DIV thrown in somewhere:
<div id="msg">Loading...</div>
Just add a static <div> to the page informing the user that the page is loading, then add a window.onload handler that removes the div.
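A minimal sketch of that approach, assuming the same #msg div as in the earlier answer:

// Remove the loading message once the page (including all images) has loaded.
window.onload = function() {
    var msg = document.getElementById("msg");
    msg.parentNode.removeChild(msg);
};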
BTW, what’s the reason for this? Don’t users already have page load indicators in their browsers?
You could do async AJAX requests for the images and add a callback for when they're finished.
Here's some code to illustrate it:
var R = new XMLHttpRequest();
R.onreadystatechange = function() {
    if (R.readyState == 4) {
        // Do something with R.responseXML/Text ...
        stopWaiting();
    }
};
// the request still has to be opened and sent, e.g.:
R.open("GET", "image-url-here", true);
R.send();
Theoretically you could attach an onload event to every image object, running a function that checks whether all images have loaded. That way you don't need a setTimeout(). However, this would fail if an image didn't load, so you would have to take onerror into account as well.
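A sketch of that per-image idea, again assuming the #msg div from the earlier answers; onerror counts a broken image as finished so the message still clears:

// Count down as each image finishes loading (or fails), with no timer at all.
var remaining = document.images.length;
function imageDone() {
    remaining--;
    if (remaining === 0) {
        document.getElementById("msg").innerHTML = "Done.";
    }
}
for (var i = 0; i < document.images.length; i++) {
    if (document.images[i].complete) {
        imageDone(); // already loaded, e.g. from cache
    } else {
        document.images[i].onload = imageDone;
        document.images[i].onerror = imageDone; // don't hang on broken images
    }
}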