Loading images sequentially using jQuery in Google App Engine - javascript

Solution:
I used setTimeout(ajaxcall, timeoutmillis) instead of making all the Ajax calls instantly.
The images were updated perfectly. No problem.
Never send multiple Ajax requests in a loop without giving the browser some time to breathe.
:)
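A minimal sketch of that staggering idea, assuming a hypothetical uploadFile(file) helper that performs the single-file Ajax call (the 200 ms delay is an arbitrary example value):
function uploadSequentially(files) {
    for (var i = 0; i < files.length; i++) {
        (function(file, index) {
            // Schedule each call instead of firing them all in the same tick,
            // so the browser gets a chance to render thumbnails as responses arrive.
            // uploadFile() is a hypothetical single-upload helper.
            setTimeout(function() {
                uploadFile(file);
            }, index * 200);
        })(files[i], i);
    }
}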
I am uploading multiple images to Google App Engine using JavaScript. I am sending the images to the server one by one and receiving a response for each one. The response contains the thumbnail link of the uploaded image. I want to display those thumbnails as they arrive, one by one. The problem is that, for example, if I have 100 images, none of them are displayed until the 100th response is received from the server. Until then the page behaves as if it is loading something, but no images are visible. All the images do show up once the Ajax calls are complete, though.
Update: I have found a not-so-elegant workaround. If you create image placeholders with
a dummy image and change the img src later during the Ajax load, it works. Not a very elegant solution, but if you use a 1-pixel invisible image the effect is more or less the same.
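A minimal sketch of that placeholder workaround, assuming a hypothetical #thumbnails container, a dummy placeholder.gif, and that each Ajax response supplies the real thumbnail URL:
// Sketch of the placeholder idea: render the <img> tags immediately with a
// dummy src, then swap in the real thumbnail once the server responds.
// "#thumbnails", "placeholder.gif" and the id scheme are illustrative names.
function addPlaceholder(index) {
    $("#thumbnails").append('<img id="thumb' + index + '" src="placeholder.gif"/>');
}

function setThumbnail(index, thumbnaillink) {
    $("#thumb" + index).attr("src", thumbnaillink);
}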
Here is the original upload code.
this.handlephotoupload = function(input) {
    var uploadedfiles = input.files;
    var toolarge = "";
    var maxsize = 10240000;
    var counter = 1;
    var downloadcounter = 0;
    var rownumber = 0;
    var images = new Array();
    var arraycount = 0;
    var totalimagecount = 0;
    $("#phototable").append("<tr><td><div id=loading>Loading images please wait......</div></td></tr>");
    for (var i = 0; i < uploadedfiles.length; i++) {
        if (uploadedfiles[i].size > maxsize) {
            // Skip files over the size limit and remember them for the alert below.
            toolarge += uploadedfiles[i].name + "\n";
            totalimagecount += 1;
        } else {
            var filedata = new FormData();
            filedata.append("uploadimage", uploadedfiles[i]);
            $("#loading").show();
            $.ajax({
                url: 'photodownloader',
                data: filedata,
                cache: false,
                contentType: false,
                processData: false,
                type: 'POST',
                success: function(receiveddata) {
                    var imagedata = JSON.parse(receiveddata);
                    var data = imagedata['imageinfo'];
                    var imagelink = data['imagelink'];
                    var thumbnaillink = data['thumbnailLink'];
                    var imageID = data['uniqueID'];
                    var imagename = data['imagename'];
                    // Start a new table row after every third thumbnail.
                    if (downloadcounter % 3 == 0) {
                        rownumber += 1;
                        var row = $('<tr id=thumbnailsrow' + rownumber + '></tr>');
                        $("#phototable").append(row);
                    } else {
                        var row = $("#thumbnailsrow" + rownumber);
                    }
                    $(row).append('<td align=center><a href=' + imagelink + '><img src=' + thumbnaillink + '/></a></td>');
                    downloadcounter += 1;
                    totalimagecount += 1;
                    // Hide the loading indicator once every file has been handled.
                    if (totalimagecount == uploadedfiles.length) {
                        $("#loading").hide();
                    }
                }
            });
        }
    }
    if (toolarge != "") {
        alert("These images were not uploaded due to size limit of 1MB\n" + toolarge);
    }
}

If you want separate responses, you have to make separate requests.
Don't fire 100 asynchronous requests at once, though. Fire X of them and keep a counter that you check with a timer: each time you receive a response you decrease the counter, and each time the timer fires you send X - counter new requests. That way you only ever have X requests in flight at a time...
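A minimal sketch of that throttling idea, assuming a hypothetical uploadFile(file, onDone) helper that performs one upload and invokes a callback when its response arrives (X and the 500 ms polling interval are arbitrary example values):
// Sketch: keep at most X uploads in flight at any time.
function uploadThrottled(files, X) {
    var inFlight = 0;   // requests currently waiting for a response
    var nextIndex = 0;  // next file to send

    var timer = setInterval(function() {
        // Top up to X concurrent requests on every tick.
        while (inFlight < X && nextIndex < files.length) {
            inFlight++;
            uploadFile(files[nextIndex++], function() {
                inFlight--; // response received, free a slot
            });
        }
        if (nextIndex >= files.length && inFlight === 0) {
            clearInterval(timer); // everything sent and answered
        }
    }, 500);
}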

Related

Why does the Resources Received count change so frequently for the same URL (page.open(url))

I am currently working on fetching/scraping all the images received when requesting a URL.
The problem I am facing is that the response changes after a few tries, or is very inconsistent, even for the same URL, when using PhantomJS.
I have tried clearing the cache multiple times and at different locations in my code, but the request numbers don't appear to be the same.
Output:
1st call to the URL: 19 images and a total of 49 responses from the server
2nd call: 14 images and 48 responses from the server
3rd call: 14 images and 38 responses from the server
Output shown for two executions.
I am using phantomjs-2.1.1-windows, running the JavaScript file with the phantomjs command.
I am very new to JavaScript; I have managed to write the following so far:
var page = require('webpage').create();
var fs = require('fs');
var url = "https://..........";

page.settings.clearMemoryCaches = true;
page.clearMemoryCache();
page.clearCookies();
page.viewportSize = {width: 1280, height: 1024};

var imageCounter = 0;
var responseCounter = 0;

// Register the resource handler before opening the page, and count every
// response and every image resource as it starts arriving.
page.onResourceReceived = function(response) {
    if (response.stage == "start") {
        responseCounter++;
        var respType = response.contentType;
        // Guard against responses without a Content-Type header.
        if (respType && respType.indexOf("image") == 0) {
            imageCounter++;
            //console.log('Content-Type : ' + response.contentType)
            //console.log('Status : ' + response.status)
            //console.log('Image Size in byte : ' + response.bodySize)
            //console.log('Image Url : ' + response.url)
            console.log(imageCounter + '^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n');
        }
    }
};

page.open(url, function (status) {
    console.log(status + '*/*/*/*/*/*/**/*/*/*/*/*/*/*/*/*/*/*/*/*/');
    if (status == 'success') {
        console.log('The entire page is loaded.............################');
        console.log('\n' + imageCounter + '^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n');
        console.log('\n' + responseCounter + '^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n');
        imageCounter = 0;
        page.clearMemoryCache();
        page.clearCookies();
        page.close();
        phantom.exit();
    }
});
I want to get a consistent count for the images at least; I am really confused about how PhantomJS caches these resources on a second attempt.

Can't figure out why my PhantomJS script isn't showing the status and taking a screenshot of the page

I am not getting a status of the website, nor an error, when executing the code with the URL that is being generated. However, when I use 'http://google.com', the status shows and it takes a screenshot of the page. The URL being used is correct; when I open it in a browser it is the page I want. It seems like the page is unreachable, but I am not getting an error or a status update.
var size = 9;
var model = 'CQ2626';

// Build the product URL for the given shoe size and model number.
function URL(size, model) {
    var baseSize = 580;
    var shoeSize = size - 6.5;
    shoeSize *= 20;
    var rawSize = shoeSize + baseSize;
    var URL = 'http://www.adidas.com/us/' + model + '.html?forceSelSize=' + model + '_' + rawSize;
    return URL;
}

console.log(URL(size, model));

var page = require('webpage');
var webPage = page.create();

webPage.open(URL(size, model), function(status) {
    console.log("Status: " + status);
    if (status === "success") {
        webPage.render('example.png');
    }
    phantom.exit();
});

Unable to receive parameters in the query string when redirected to another page

Hi, I am developing an application in JavaScript. I have two pages, default.aspx and addnewitem.aspx. There is an HTML table and a button in default.aspx. When I click the button I want to redirect to the addnewitem.aspx page, and I have some parameters to send in the query string. I am able to redirect to addnewitem.aspx, but I am getting a page not found error and I am not sure why. I am trying the following.
function getValues() {
    var Title = "dfd";
    var PrimarySkills = "fdfd";
    var SecondarySkills = "dfdf";
    var url = "http://sites/APPSTEST/JobRequisitionApp/Pages/AddNewItem.aspx?Title=" + encodeURIComponent($(Title)) + "&PrimarySkills=" + encodeURIComponent($(PrimarySkills)) + "&SecondarySkills=" + encodeURIComponent($(SecondarySkills));
    window.location.href = url;
}
I am checking the query string in addnewitem.aspx as below.
<script type="text/javascript">
    var queryString = new Array();
    $(function () {
        if (queryString.length == 0) {
            if (window.location.search.split('?').length > 1) {
                var params = window.location.search.split('?')[1].split('&');
                for (var i = 0; i < params.length; i++) {
                    var key = params[i].split('=')[0];
                    var value = decodeURIComponent(params[i].split('=')[1]);
                    queryString[key] = value;
                }
            }
        }
        if (queryString["Title"] != null && queryString["PrimarySkills"] != null) {
            var data = "<u>Values from QueryString</u><br /><br />";
            data += "<b>Title:</b> " + queryString["Title"] + " <b>PrimarySkills:</b> " + queryString["PrimarySkills"] + " <b>SecondarySkills:</b> " + queryString["SecondarySkills"];
            $("#lblData").html(data);
            alert(data);
        }
    });
</script>
"http://sites/APPSTEST/JobRequisitionApp/Pages/AddNewItem.aspx?Title=%5Bobject%20Object%5D&PrimarySkills=%5Bobject%20Object%5D&SecondarySkills=%5Bobject%20Object%5D"
I tried lot to fix this. May i know where i am doing wrong? Thanks for your help.
You should use a relative path in your URL instead of hard-coding the entire folder structure, which is probably incorrect since you are getting a 404. You would also need to change the URL every time you publish the site to the hosting environment when you hard-code it like that.
So change
var url = "http://sites/APPSTEST/JobRequisitionApp/Pages/AddNewItem.aspx?Title=...
into
var url = "/AddNewItem.aspx?Title=...
if both pages are in the same folder. Should AddNewItem.aspx be located in the Pages folder, you have to add that folder of course: var url = "/Pages/AddNewItem.aspx?Title=...
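As a side note, the %5Bobject%20Object%5D values in the resulting URL come from wrapping the plain strings in $() before encoding them; a minimal sketch of building the query string from the string variables directly, using the relative path suggested above:
// Sketch: pass the plain strings to encodeURIComponent rather than jQuery
// objects, which stringify to "[object Object]".
function getValues() {
    var Title = "dfd";
    var PrimarySkills = "fdfd";
    var SecondarySkills = "dfdf";
    var url = "/Pages/AddNewItem.aspx"
        + "?Title=" + encodeURIComponent(Title)
        + "&PrimarySkills=" + encodeURIComponent(PrimarySkills)
        + "&SecondarySkills=" + encodeURIComponent(SecondarySkills);
    window.location.href = url;
}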

JavaScript - How to get a script to run with Ajax-requested data

Battlefield Page
In the image above, there is a page that has a battlefield with 20 users on it. I have written JavaScript to capture the data and store it in a MySQL db. The problem arises when I hit next to go to the next page and need to gather that data as well.
It fetches the next 20 users with an Ajax call. Obviously, when this happens, the script can't log the new information, because the page never reloads on an Ajax call, which means the script doesn't execute again. Is there a way to force a page load when the Ajax link is clicked?
Here's the code:
grabData();

// Re-run the scrape whenever the "next" link in the navigation bar is clicked.
var nav = document.getElementsByClassName('nav')[0].getElementsByTagName('td')[2].getElementsByTagName('a')[0];
nav.addEventListener("click", function() {
    grabData();
});

function grabData() {
    var rows = document.getElementsByClassName('table_lines battlefield')[0].rows;
    var sendData = '';
    for (var i = 1; i < rows.length - 1; i++) {
        var tr = document.getElementsByClassName('table_lines battlefield')[0].getElementsByTagName('tr')[i];
        var getSid = tr.getElementsByTagName('td')[2].getElementsByTagName('a')[0].href;
        var statsID = getSid.substr(getSid.indexOf("=") + 1); // grabs the ID out of the stats link
        var name = tr.getElementsByTagName('td')[2].textContent.replace(/\,/g, "");
        var tff = tr.getElementsByTagName('td')[3].textContent.replace(/\,/g, "");
        var rank = tr.getElementsByTagName('td')[6].textContent.replace(/\,/g, "");
        var alliance = tr.getElementsByTagName('td')[1].textContent.trim();
        var gold = tr.getElementsByTagName('td')[5].textContent.replace(/\,/g, "");
        if (alliance == '') {
            alliance = 'None';
        }
        if (gold == '??? Gold') {
            gold = 0;
        } else {
            gold = gold.replace(/[^\/\d]/g, '');
        }
        sendData += statsID + "=" + name + "=" + tff + "=" + rank + "=" + alliance + "=" + gold + "#";
    }
    // Send the collected rows to the server-side script that stores them in MySQL.
    $.ajax({
        type: "POST",
        url: "url",
        data: { sendData: sendData },
        // callback for a server message:
        success: function(msg) {
            //alert(msg);
        },
        // callback for a server error message or an ajax error
        error: function(msg) {
            alert("Data was not saved: " + msg);
        }
    });
}
So, as stated, this grabs the info and sends it to the PHP file on the backend. When I hit next on the battlefield page, I need to be able to execute this script again.
UPDATE: Problem solved. I was able to do this by drilling down the DOM tree until I hit the "next" anchor tag. I simply added an event listener for whenever it is clicked and had it re-execute the JavaScript.
Yes, you can force a page load like this:
window.location.reload(true);
However, the whole point of AJAX is to not reload the page, so you often have to write JavaScript code that duplicates the server-side code that builds your page initially.
However, if the page-load code under discussion runs in JavaScript on page load, then you can turn it into a function and call that function again in the AJAX success handler.
Reference:
How can I refresh a page with jQuery?
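A minimal sketch of that second approach, assuming a hypothetical buildPage() function that wraps the code currently run on page load and a hypothetical "nextPageUrl" endpoint fetched with jQuery:
// Sketch: wrap the page-load logic in a function and call it again from the
// success handler of the Ajax call that fetches the next page of data.
function buildPage(data) {
    // ...code that previously ran on page load, now parameterised by data...
}

$.get("nextPageUrl", function(data) {
    // Re-run the same logic once the Ajax-requested data has arrived.
    buildPage(data);
});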

Parsing links from an XML file with JavaScript

To display my latest blog posts on a different page, I want to parse the RSS feed from the blog and then generate elements from it.
I first tried to parse a fixed .xml file, for which I wrote the following code:
var maxBlogposts = 5;
var blogPosts = 0;

$.get("rss.xml", function(data) {
    $(data).find("item").each(function() {
        if (blogPosts === maxBlogposts) return;
        var el = $(this);
        // Only display 3 posts on small devices.
        var extra = (blogPosts >= 3) ? "not-small 12u(small)" : "12u(small)";
        var div = $('<div class="6u ' + extra + '" class="blog-entry"></div>');
        var h = $('<h4>' + el.find("title").text() + '</h4>');
        var description = el.find("description").text().replace('[…]', '[…]');
        var p = $('<p>' + description + '</p>');
        div.append(h);
        div.append(p);
        $('#blog').append(div);
        blogPosts++;
    });
});
This worked perfectly fine. Now I want to parse the actual RSS feed. For this I wrote a PHP script which simply gets the feed and echoes it.
<?php
$rss = file_get_contents('http://xn--der-grne-baum-1ob.net/feed/');
die($rss);
?>
And again I get the correct XML file on the frontend.
The problem is that now my code is no longer working. Getting the description failed, as did the links. I fixed the description by accessing
el.find("description")[0].innerHTML
However, I can't seem to get the links to work. The data returned from the PHP file contains a node with the link in it. The "el" element also contains children named "link", but those no longer contain the actual link.
I feel like the links may get "escaped" during parsing? At least that is the only reason I can think of that would produce what I am observing.
The XML I am parsing comes from http://xn--der-grne-baum-1ob.net/feed/
Try
var maxBlogposts = 5
  , blogPosts = 0;

$.get("https://query.yahooapis.com/v1/public/yql?q=select"
      + " * from feed where url='http://xn--der-grne-baum-1ob.net/feed/'")
.then(function(data) {
    $(data.documentElement).find("results item")
        .each(function() {
            if (blogPosts === maxBlogposts) return;
            var el = $(this);
            // Only display 3 posts on small devices.
            var extra = (blogPosts >= 3) ? "not-small 12u(small)" : "12u(small)";
            var div = $('<div class="6u ' + extra + '" class="blog-entry"></div>');
            var h = $('<h4>' + el.find("title").text() + '</h4>');
            var description = el.find("description").text().replace('[…]', '[…]');
            var p = $('<p>' + description + '</p>');
            div.append(h);
            div.append(p);
            $('#blog').append(div);
            blogPosts++;
        });
}, function(jqxhr, textStatus, errorThrown) {
    console.log(textStatus, errorThrown);
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<div id="blog"></div>
See YQL Console
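As a side note on the original PHP proxy approach: if the echoed feed is handed to jQuery as a plain string and parsed as HTML, <link> elements lose their text content because link is a void element in HTML. A minimal sketch, assuming the PHP proxy is served at a hypothetical feed.php, of forcing XML parsing so the links survive:
// Sketch: request the proxied feed with dataType "xml" so jQuery hands the
// callback a parsed XML document and <link> elements keep their text content.
// "feed.php" is a hypothetical path to the PHP proxy script.
$.get("feed.php", function(data) {
    $(data).find("item").each(function() {
        var link = $(this).find("link").text();
        console.log(link);
    });
}, "xml");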
