I am using jQuery.autocomplete(1.02) on my search box and I want exact string and substring matching. I don't care (yet!) about the database load, I'm happy for it to fire off a query every keystroke and bypass the caching entirely - I just don't want anything missed.
To this end I have tried setting cacheLength=1, the minimum permitted, but the autocomplete function refuses to fire off a GET request on each keyup.
searchbox   GET request
'a'   -> http://localhost/service_search_request?q=a
'ar'  -> http://localhost/service_search_request?q=ar
'ars' -> http://localhost/service_search_request?q=ars
Instead, it sends the first and the third and misses the second, giving me the wrong results for 'ar' :-/ I've cleared my cache and sessions but it looks like some sort of caching is still going on. AFAIK I have no proxying going on and I'm shift-refreshing each time. It looks likely then that this behavior is from jQuery.autocomplete itself.
So my questions are...
A) Does this seem likely? i.e. is it a feature, or maybe a bug?
B) If so, is there a clean way around it?
C) If not, what autocomplete would you use instead?
Naturally D) "No, you're just using it incorrectly, you douche!" is always a possibility, and indeed the one I'd prefer, having spent time going down this road - assuming it comes with a link to the docs I've failed to find / read!
Cheers,
Roger :)
I wonder why cacheLength doesn't work; I had trouble with autocomplete too. IMHO, there are errors in it. However, in the list of options there is a matchSubset option you could set to false, which stops the plugin from answering 'ar' locally out of the cached results for 'a'.
EDIT:
Somewhere around line 335 there is a function called "request". You could add some debug messages to it to see what happens (note: you need Firebug installed, or "console" will be unknown):
function request(term, success, failure) {
    console.debug("ac request...");
    if (!options.matchCase)
        term = term.toLowerCase();
    var data = cache.load(term);
    console.debug("ac request 1, loaded data from cache: " + data + " term: " + term);
    // receive the cached data
    if (data && data.length) {
        success(term, data);
    // if an AJAX url has been supplied, try loading the data now
    } else if( (typeof options.url == "string") && (options.url.length > 0) ){
        console.debug("ac request 2, data is not in the cache, request it");
"flushCache" can easily be used in the function you can attach / set as options. I used this, to clear the Cache, if there could be more data in the backend:
formatItem: function (data, i, n, value) {
    // flush the cache when the last of 'max' rows is being rendered,
    // i.e. when there could be more matches in the backend than we received
    if (i === (this.max - 1)) {
        console.debug("flushCache");
        jQuery(this).flushCache();
    }
    return data[1] + " (" + data[0] + ")";
}
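For context, this is roughly how that hooks into the plugin setup (the "#search" selector, url variable and max value here are just my guesses at a typical setup, not something from this thread):

jQuery("#search").autocomplete(url, {
    max: 25,
    formatItem: function (data, i, n, value) {
        // flushing on the last rendered row means the next keystroke
        // goes back to the server instead of the local cache
        if (i === (this.max - 1)) {
            jQuery(this).flushCache();
        }
        return data[1] + " (" + data[0] + ")";
    }
});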
I am having the same problem. Caching doesn't work although I have set the option cacheLength to 1.
With your solution of calling the flushCache function after each printed term it works. I couldn't use the

if (i === (this.max - 1)) {

check, since 'i' was e.g. 1 after filtering but 'this.max' was still 25, as the original backend query returned 25 rows.
However, this bug ONLY appears when typing words that contain the Swedish characters 'å', 'ä' or 'ö'. So maybe the caching works as expected, but not with these special characters.
Anyway, the solution for me was to always call flushCache in the formatItem() function:
function formatItem(row, position, n, term) {
    if ($("#keywords-h").length > 0) {
        $("#keywords-h").flushCache();
    }
    // format Item
    return "<span>" + row[0] + "</span>";
}
Hope this helps someone and if someone is having the same problems with special characters please post a reply.
Have obviously come to this 18 months on, but
cacheLength: 0
in the options worked for me. So maybe the latest release has fixed the bug?
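For reference, a minimal sketch of a call with caching fully disabled, assuming the 1.x plugin signature where the URL is the first argument ("#searchbox" and url are placeholders):

$("#searchbox").autocomplete(url, {
    cacheLength: 0,    // don't cache results at all
    matchSubset: false // never answer 'ar' locally from the cached results for 'a'
});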
This worked for me.
function requestData(q) {
    if (!options.matchCase) q = q.toLowerCase();
    //-- I turned off this line
    // var data = options.cacheLength ? loadFromCache(q) : null;
    //-- And added this line of code
    var data = null;
There is an option to disable subset matching, e.g.
$("#query").autocomplete(
url,
{
matchSubset: false
}
)
Running in IE is a legacy app built with frames that makes a lot of cross-frame references like parent.header.blah.blah and parent.sidebar.so.and.so. It worked fine in old IE compatibility mode, and works in Chrome and Edge (Chromium).
But in regular IE without compatibility mode on, it's throwing a permission denied error on line 43. Thing is, it throws the error in the console NO MATTER WHAT IS ON LINE 43!!! I added superfluous lines of code to push other code down, took out code to move other code up. Doesn't matter, the console ALWAYS says it's on line 43.
I put breakpoints in and noticed the error doesn't actually get added to the console until AFTER the JavaScript has finished running. The page is very large with a LOT of JavaScript, and it's difficult to comment a section out without breaking the page to experiment with what might be causing the permission denied.
Permission Denied is supposed to indicate a same-origin violation as I understand it, but all frames and files are coming through the same servlet on the same URL with only parameters changing. I printed out the document.domain of every frame, and they are all identical.
So... I'm not even sure what to do at this point to narrow it down. How can I figure out what the offending piece of code really is... or even which section?
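One diagnostic I'm considering (just a sketch, and I'm assuming IE routes these late-reported errors through window.onerror at all) is a global error hook in the top frame, to get the real source file and line:

// classic IE signature: message, script URL, line number of the throw site
window.onerror = function (message, source, lineno) {
    alert(message + "\n" + source + " : " + lineno);
    return false; // let the default console reporting happen too
};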
UPDATE - So it seems that the error is actually coming from a function in another frame being called from this frame (nowhere near line 43 by the way). That function is managing the options in a select list. The actual error comes here:
for (var k = 0; k < assetListz.options.length; k++) {
    if (assetListz.options[k].value == currentAsset) { // permission denied!
        inList = true;
        assetListz.options[k].selected = true;
        break;
    }
}
assetListz didn't have a 'z' on it until I just did that to make sure I wasn't accidentally getting scope to some OTHER assetList. I can test the length of the assetList, but as soon as I check the value on that second line, kaboom. Ideas?
Update 2 -
I changed the code to fetch the asset list on each reference, without storing it. It blows up in the same place.
for (var k = 0; k < document.getElementById('assetList').options.length; k++) {
    if (document.getElementById('assetList').options[k].value == currentAsset) {
        inList = true;
        document.getElementById('assetList').options[k].selected = true;
        break;
    }
}
Okay, I got this fixed. I'm not sure exactly WHY this fix works, though I can guess. It seems that by rewriting the method so that it avoids use of the options array on the select object, everything works fine. I reference it just once to get the length... which it allows. But if I try to get at a specific option via assetList.options[i].anything, then I get permission denied.
I still think this is a bug in IE11's same-origin code, but lucky for me, it seems like MS didn't re-use code, so the same bug didn't 'protect' all means of accessing the select options, just the array property. Or maybe something else is going on. I just know this worked for me.
// By changing the value attribute, we change the current selection.
assetList.value = currentAsset.toUpperCase();
if (assetList.selectedIndex == -1) {
    // this means the current asset wasn't in the list.
    if (assetList.options.length >= 1000) {
        // not allowed to add more than 1000; if so, set the selectedIndex back
        alert("The Active Asset List contains the maximum of 1000 entries. \n" +
              "The current Asset ID '" + currentAsset + "' was not added to the Active Asset List.");
        assetList.selectedIndex = currentSelectedIndex;
        return;
    } else {
        var option = document.createElement("OPTION");
        option.value = currentAsset;
        option.text = currentAsset;
        assetList.add(option, 0);
        assetList.selectedIndex = 0;
    }
}
I've built an Angular/Express/Node app that runs in Google Cloud and currently uses a JSON file as the data source for my application. For some reason (and this only happens in the cloud), when saving data through an AJAX call and writing it to the JSON file, everything seems to work fine. However, when refreshing the page, the server (sometimes!) sends me the version before the edit. I can't tell whether this is an Express-related, Node-related or even Angular-related problem, but what I know for sure is that I'm checking the JSON that comes in the response from the server, and it really is sometimes the modified version, sometimes not, so it most probably isn't Angular cache-related.
The GET:
router.get('/concerts', function (request, response) {
    delete require.cache[require.resolve('../database/data.json')];
    var db = require('../database/data.json');
    response.send(db.concerts);
});
The POST:
router.post('/concerts/save', function (request, response) {
    delete require.cache[require.resolve('../database/data.json')];
    var db = require('../database/data.json');
    var concert = request.body;
    console.log('Received concert id ' + concert.id + ' for saving.');
    if (concert.id != 0) {
        var indexOfItemToSave = db.concerts.map(function (e) {
            return e.id;
        }).indexOf(concert.id);
        if (indexOfItemToSave == -1) {
            console.log('Couldn\'t find concert with id ' + concert.id + ' in database!');
            response.sendStatus(404);
            return;
        }
        db.concerts[indexOfItemToSave] = concert;
    }
    else if (concert.id == 0) {
        concert.id = db.concerts[db.concerts.length - 1].id + 1;
        console.log('Concert id was 0, adding it with id ' + concert.id + '.');
        db.concerts.push(concert);
    }
    console.log("Added stuff to temporary db");
    var error = commit(db);
    if (error)
        response.send(error);
    else
        response.status(200).send(concert.id + '');
});
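Incidentally, an equivalent way to read the file that sidesteps the require cache entirely (a sketch of an alternative, assuming the same relative path) would be:

var fs = require('fs');
var path = require('path');

router.get('/concerts', function (request, response) {
    // fs.readFile never consults Node's module cache, so each request
    // reads whatever is currently on this instance's disk
    fs.readFile(path.join(__dirname, '../database/data.json'), 'utf8', function (err, raw) {
        if (err) return response.sendStatus(500);
        response.send(JSON.parse(raw).concerts);
    });
});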
This probably doesn't say much, so if someone is interested in helping, you can see the issue live here. If you click on modify for the first concert and change the programme to something like asd and then save, everything looks fine. But if you try to refresh the page a few times (usually up to 6-7 tries are needed), the old, unchanged programme is shown. Any clue or advice greatly appreciated, thanks.
To solve: do not use local files to store data in the cloud! This is what databases are for!
What was actually the problem?
The problem was caused by the fact that App Engine had 2 VM instances running for my application. This caused the POST request to be sent to one instance; it did its job, saved the data by modifying its local JSON file, and returned a 200. However, after a few refreshes, the load balancing causes the GET to arrive at the other machine, which has its own copy of the source code, including the initial, unmodified JSON. I am now using a MongoDB instance, and everything seems to be solved. Hopefully this discourages people who attempt to do the same thing I did.
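For anyone curious, the replacement routes look roughly like this (a sketch only; the connection URL and the database/collection names are placeholders, not my real config):

var MongoClient = require('mongodb').MongoClient;
var concerts; // shared collection handle; every instance sees the same data

MongoClient.connect('mongodb://localhost:27017', function (err, client) {
    if (err) throw err;
    concerts = client.db('concertdb').collection('concerts');
});

router.get('/concerts', function (request, response) {
    // all instances now read from the same shared store, so GETs are consistent
    concerts.find().toArray(function (err, docs) {
        response.send(docs);
    });
});

router.post('/concerts/save', function (request, response) {
    var concert = request.body;
    // upsert: update the concert if it exists, insert it otherwise
    concerts.updateOne({ id: concert.id }, { $set: concert }, { upsert: true }, function (err) {
        if (err) response.send(err);
        else response.status(200).send(concert.id + '');
    });
});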
So, I appear to have hit a problem for which I can't find any relevant information!
Essentially, I have been trying to write this YouTube API call pretty much all day, and when I finally think it is complete I realise that it is only displaying 5 results, and not 7 as it should.
Edit: All 7 do display in the JSON response if I visit the $.getJSON URL directly in my browser. The two videos appear to be going missing during the parsing?
The jQuery is as follows:
$.getJSON('https://www.googleapis.com/youtube/v3/playlistItems?part=snippet&maxResults=7&playlistId=UUmaGgGFQU_1cv3X4pIUzW9g&key={API_KEY}', function (data) {
    var i = 0;
    $.each(data, function () {
        if (typeof(data.items[0]) != "undefined") {
            console.log('video exists ' + data.items[i].snippet.title);
            title = data.items[i].snippet.title;
            description = data.items[i].snippet.description;
            videoID = data.items[i].snippet.resourceId.videoId;
            if (i <= 0) {
                $('#player').append('<div class="first-videocontainer"><h3>' + title + '</h3><iframe width="1120" height="630" src="https://www.youtube.com/embed/' + videoID + '" frameborder="0" allowfullscreen></iframe></div>');
            }
            else {
                $('#player').append('<div class="videocontainer"><h3>' + title + '</h3><iframe width="365" height="205" src="https://www.youtube.com/embed/' + videoID + '" frameborder="0" allowfullscreen></iframe></div>');
            }
            console.log(i);
            i++;
        } else {
            console.log('video not exists');
        }
    });
});
I have set the maxResults=7, as I believe this is the only parameter available using APIv3(?).
I'm receiving these errors in the log too, though from Googling them I don't even know if they're of any help because I certainly couldn't find anything constructive from them:
GET chrome-extension://fjhoaacokmgbjemoflkofnenfaiekifl/cast_sender.js net::ERR_FAILED
XHR failed loading: GET "chrome-extension://fjhoaacokmgbjemoflkofnenfaiekifl/cast_sender.js".
So I'm wondering if it's something simple in the jQuery I've just overlooked, or whether my API call is just terrible. This is the first time I've tried to work with this!
Any help is appreciated.
Edit 2: It looks as though it is possibly returning the default value and ignoring my parameter? However, if I set maxResults=4 it does listen.
Documentation: https://developers.google.com/youtube/v3/docs/videos/list#id
"The maxResults parameter specifies the maximum number of items that should be returned in the result set.
Note: This parameter is supported for use in conjunction with the myRating parameter, but it is not supported for use in conjunction with the id parameter. Acceptable values are 1 to 50, inclusive. The default value is 5."
I know I'm late in replying, but use this URL:
https://www.googleapis.com/youtube/v3/search?part=snippet&channelId={channelId}&maxResults=50&key={apikey}
Happy coding
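To sketch how that response might be consumed (an assumption about the intended usage, not part of the original answer; the important detail is iterating over data.items rather than over the response object itself, whose handful of top-level keys - kind, etag, pageInfo, nextPageToken, items - would cap a naive $.each(data, ...) loop at about five iterations):

$.getJSON('https://www.googleapis.com/youtube/v3/search?part=snippet&channelId={channelId}&maxResults=50&key={apikey}', function (data) {
    // data.items is the array of results
    $.each(data.items, function (i, item) {
        console.log(item.snippet.title);
    });
});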
I have written a jQuery add-on, with a little help from the internet, which retrieves data from Facebook and works as intended on all browsers tested so far, apart from IE9.
I work for local government and unfortunately we still use IE9 in our builds (it was still IE8 a few weeks back, so it could have been a lot worse, I expect :).
Anyway, I digress. I have added the section of code below, which never completes in IE9 but does in IE10 and other browsers...
Can anyone explain, or help me adapt or fix, this snippet so that I can get it working in IE9, without breaking it in any other browsers in the process? :)
$.when($.getJSON(ogUSER), $.getJSON(ogPOSTS)).done(function (user, posts) {
    // user[0] contains information about the user (name and picture);
    // posts[0].data is an array with wall posts;
    var fb = {
        user: user[0],
        posts: []
    };
    var idxLimit = 0;
    $.each(posts[0].data, function () {
        // We only show links and statuses from the posts feed:
        if (this.type != 'link' && this.type != 'status') {
            return true;
        }
        // Copying the user avatar to each post, so it is
        // easier to generate the templates:
        this.from.picture = fb.user.picture.data.url;
        // Converting the created_time (a UNIX timestamp) to
        // a relative time offset (e.g. 5 minutes ago):
        this.created_time = relativeTime(this.created_time * 1000);
        // Converting URL strings to actual hyperlinks:
        this.message = urlHyperlinks(this.message);
        // remove all anchors
        //var content = $('<div>' + this.message + '</div>');
        //content.find('a').remove();
        //this.message = content.html();
        fb.posts.push(this);
        idxLimit++;
        if (idxLimit === 2) {
            return false;
        }
    });
});
In all browsers, not including IE9, if I insert a breakpoint anywhere within the .done() callback, it stops execution and I can debug. With IE9 the breakpoint is never reached, leading me to believe there is an issue with the IE9 script engine and the jQuery.when() API call, or the .done() callback method...
But, I'm just guessing at the mo... I've been searching the web for the last few hours to see if anyone else has happened upon a similar issue but to no avail. I hope some of the more experienced coders here can help... would be very much appreciated. Until then the search goes on :)
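For what it's worth, one cheap check (a sketch; I'm assuming the combined promise is being rejected silently rather than never settling) would be to hang a .fail handler off the same $.when call:

$.when($.getJSON(ogUSER), $.getJSON(ogPOSTS))
    .done(function (user, posts) {
        if (window.console) console.log('done fired');
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // $.when rejects as soon as either request fails, so this would
        // pinpoint a failed or aborted request (IE9's native XMLHttpRequest
        // has no CORS support, which could make a cross-domain call fail)
        if (window.console) console.log('failed: ' + textStatus);
    });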
Thanks for your time folks ;)
PS. I don't receive any console errors whatsoever in IE9 running the script...
TartanBono
Is there a way to tell, after the fact, whether an image (placed with the <img> tag, not via JS) has loaded correctly into a page? I have a gallery of head shots, and occasionally the third-party image server ends up serving up a 404. I can change the server-side code to use an onerror="showGenericHeadshot()", but I really want to avoid making changes to server-side code. Ultimately, I want to determine if an image is missing or broken and replace it with a generic "Image Not Found" graphic. Things I've tried:
Image.prototype.onerror = showGenericHeadshot -- doesn't work for <img> tags
$('img[src*=thirdpartyserver.com]').error(showGenericHeadshot) -- doesn't work in IE
$('img[src*=thirdpartyserver.com]').css('backgroundImage','url(replacementimage.gif)') -- works, but still doesn't get rid of the broken image icon in IE
<img src='someUrl' id="testImage" />

jQuery('#testImage').bind('load', function () {
    alert('image loaded');
});
To avoid a race condition (if the image is already in the cache, the load event can fire before the handler is bound), do as below:
<img _src="http://www.caregiving.org/intcaregiving/flags/UK.gif" />
// i have added an underscore character before src
jQuery('img').each(function(){
var _elm=jQuery(this);
_elm.bind('load',_imageLoaded).attr('src',_elm.attr('_src'))
});
function _imageLoaded()
{
alert('img loaded');
}
Unfortunately, I'm not able to accept either #TJ Crowder's or #Praveen's excellent answers, though both do perform the desired image replacement. #Praveen's answer would require a change to the HTML (in which case I should just hook into the <img> tag's own onerror event attribute). And judging by network activity, it looks like if you try to create a new image using the URL of an image that just 404ed in the same page, the request actually does get sent a second time. Part of the reason the image server is failing is, at least partly, our traffic; so I really have to do everything I can to keep requests down, or the problem will only get worse.
The SO question referred to in #danp's comment to my question actually had the answer for me, though it was not the accepted answer there. I'm able to confirm that it works with IE 7 & 8, FF and WebKit browsers. I'm doubtful it will work with older browsers, so I've got a try/catch in there to handle any exceptions. The worst case will be that no image replacement happens, which is no different from what happens now without doing anything. The implementation I'm using is below:
$(function() {
    $('img[src*=images.3rdparty.com]').each(
        function() {
            try {
                if (!this.complete || (!$.browser.msie && (typeof this.naturalWidth == "undefined" || this.naturalWidth == 0))) {
                    this.src = 'http://myserver.com/images/no_photo.gif';
                }
            } catch (e) {}
        }
    );
});
Would alternate text be sufficient? If so, you can use the alt attribute of the img tag.
I think I've got it: when the DOM is loaded (or even on the window.load event — after all, you want to do this when all images are as complete as they're going to get), you can retroactively check that the images are okay by creating one new img element, hooking its load and error events, and then cycling through, grabbing the src from each of your headshots. Something like the code below (live example). That code was just dashed off; it's not production quality — for instance, you'll probably want a timeout after which, if you haven't received either load or error, you assume error. (You'll probably have to replace your checker image to handle that reliably.)
This technique assumes that reusing a src does not reload the image, which I think is a fairly reliable assumption (it is certainly an easily testable one) because this technique has been used for precaching images forever.
I've tested the below on Chrome, Firefox, and Opera for Linux as well as IE6 (yes, really) and IE8 for Windows. Worked a treat.
jQuery(function($) {
    var imgs, checker, index, start;

    // Obviously, adjust this selector to match just your headshots
    imgs = $('img');
    if (imgs.length > 0) {
        // Create the checker, hide it, and append it
        checker = $("<img>").hide().appendTo(document.body);
        // Hook it up
        checker.load(imageLoaded).error(imageFailed);
        // Start our loop
        index = 0;
        display("Verifying");
        start = now();
        verify();
    }

    function now() {
        return +new Date();
    }

    function verify() {
        if (!imgs || index >= imgs.length) {
            display("Done verifying, total time = " + (now() - start) + "ms");
            checker.remove();
            checker = undefined;
            return;
        }
        checker[0].src = imgs[index].src;
    }

    function imageLoaded() {
        display("Image " + index + " loaded successfully");
        ++index;
        verify();
    }

    function imageFailed() {
        display("Image " + index + " failed");
        ++index;
        verify();
    }

    function display(msg) {
        $("<p>" + now() + ": " + msg + "</p>").appendTo(document.body);
    }
});