This is part of a larger program which is handling a scrollbar on a <div> when modifying the height.
When logging the various values used to move the scrollbar, values with decimal places are being produced, but only on Opera (version 44.0.2510.1449), and only on my friend's machine. On my own Opera (same version) I do not encounter the problem.
Though it's probably irrelevant, the purpose of the code is to find out where the scrollbar is in the div with id #mydiv and do something based on the result.
Similar code with changed variable names:
var myDivHeight = $('#mydiv').height();
$('#mydiv').height(myDivHeight + 50); //10 extra for padding
var scrollTop = $('#mydiv').scrollTop();
var scrollHeight = $('#mydiv').prop('scrollHeight');
console.log(scrollHeight + '-' + scrollTop + '=' + (scrollHeight - scrollTop));
console.log(myDivHeight + 60);
Note: 60 is due to the changes made to the page dynamically, so the div height has been changed. The result of the output should be that scrollHeight - scrollTop = myDivHeight + 60.
Here's my friend's console output on Opera (giving fractional scrollTop):
Here's my console output on Opera:
Here's the console output on Chrome:
Here's the console output from Firefox:
I can't find anyone else reporting this. Has this been reported or seen by anyone else? Is there any way to overcome this?
Thank you.
It turns out that taking the time to ask the question helped me answer it on my own.
First of all, to overcome the problem, it's a case of wrapping the scrollTop value in Math.round().
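For example, a minimal sketch of the fix applied to the snippet above (same hypothetical #mydiv):
var scrollTop = Math.round($('#mydiv').scrollTop()); // integer, even where the browser reports a fractional value
var scrollHeight = $('#mydiv').prop('scrollHeight');
console.log(scrollHeight + '-' + scrollTop + '=' + (scrollHeight - scrollTop)); // now a whole-pixel difference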
The specification for scrollTop shows it is an unrestricted double, so Opera is handling this within spec. Reference: https://drafts.csswg.org/cssom-view/#dom-element-scrolltop
An issue where this came up in jQuery before the type was changed from integer to Number: https://github.com/jquery/api.jquery.com/issues/608
Related
I am encountering some strange behavior for the following code.
function linkFunc(scope, element, attribute) {
    var page = angular.element($window);
    page.bind('scroll', function() {
        // viewport height, with a fallback for browsers without window.innerHeight
        var windowHeight = "innerHeight" in window ? window.innerHeight : document.documentElement.offsetHeight;
        var body = document.body, html = document.documentElement;
        // full document height: take the largest of the candidate measurements
        var docHeight = Math.max(body.scrollHeight, body.offsetHeight, html.clientHeight, html.scrollHeight, html.offsetHeight);
        // bottom edge of the viewport, measured from the top of the document
        var windowBottom = windowHeight + window.pageYOffset;
        if (windowBottom >= docHeight) {
            scope.$apply(attribute.myDirective);
        }
    });
}
The above is a piece of code that detects whether the bottom of the page has been reached; if it has, it calls whatever function is bound to myDirective.
The main issue is that most of the time the lazy loading works and myDirective gets successfully called. However, sometimes the lazy loading won't work, and I wasn't able to reproduce the bug.
I tried different screen sizes and different browsers, but it seems like the bug just happens randomly.
Maybe someone has had this happen to them before and can point me in a direction?
Edit:
More information
I was able to reproduce the bug after a bit of experimenting.
Basically, when the zoom percentage of the browser is < 100%, window.pageYOffset returns a decimal value that is slightly inaccurate, which causes windowBottom to be off by 0.1 to 0.9.
e.g.
console.log(windowBottom); // 1646.7747712336175
console.log(docHeight); // 1647
Does anyone know why this happens?
Edit 2:
The above behavior is also non-deterministic, but the decimal values do show up.
0.1 + 0.2 !== 0.3
This one is an oddity not just in JavaScript; it’s actually a prevailing problem in computer science, and it affects many languages. The output of this is 0.30000000000000004.
This has to do with an issue called machine precision. When JavaScript tries to execute the line above, it converts the values to their binary equivalents.
This is where the problem starts. 0.1 is not really 0.1 but rather its binary equivalent, which is a near-ish (but not identical) value. In essence, as soon as you write the values, they are doomed to lose their precision. You might have just wanted two simple decimals, but what you get, as Chris Pine notes, is binary floating-point arithmetic. Sort of like wanting your text translated into Russian but getting Belorussian. Similar, but not the same.
You can read more here. Without digging into browser source, I would guess that your problem stems from this.
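A quick way to see this for yourself in the console (the exact digits come from how doubles are stored, not from any browser bug):
console.log(0.1 + 0.2);             // 0.30000000000000004
console.log((0.1).toPrecision(21)); // "0.100000000000000005551" - the stored double is slightly above 0.1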
Given the floating-point precision issues, you may want to loosen your condition and instead check whether the two values are less than 1 pixel apart. For example:
if (Math.abs(windowBottom - docHeight) < 1) {
    scope.$apply(attribute.myDirective);
}
I have this script for virtual scrolling, and for some reason it breaks in Internet Explorer 11 after about 68500 rows, but works in FF and Chrome...
https://jsfiddle.net/dLq2284r/5/
At the end you can see the rows overlap, but only after over 65k rows.
I think something is wrong here:
positionPage: function(inPage)
t is getting quite big :) over 1535274
So I think setting a CSS top: 1535274px; or more is the problem, but I might be wrong :D
positionPage: function(inPage) {
    var pn = inPage.pageNum;
    var t; // top (or left, when horizontal) offset of the page, in px
    if (this.fixedHeight) {
        t = pn * this.rowHeight * this.pageSize;
    } else {
        if (this.pageTops[pn]) {
            t = this.pageTops[pn];
        } else {
            // sum the heights of all preceding pages
            var n = 0;
            t = 0;
            while (n < pn) {
                t += this.getPageHeight(n);
                n++;
            }
        }
    }
    var t0 = inPage.style[this.horiz ? 'left' : 'top'].slice(0, -2);
    // update pageTops cache
    this.pageTops[pn] = t;
    this.pageTops[pn+1] = t + this.getPageHeight(pn);
    // set the page's top
    inPage.style[this.horiz ? 'left' : 'top'] = t + 'px';
    if (t0) {
        return t0 - t;
    }
}
I have tried everything... any help, hints or anything will be appreciated.
I use this for a database with over 90k rows and I would love it to work on IE for at least 100k rows.
Also, please don't suggest frameworks or anything else; I have tried them all.
Thanks
IE has a long way to go before it can compete with other browsers... So, to answer my own question: based on @Sam Segers' comment and my research, it seems that beyond a specific height IE miscalculates positions, based on
Determine Maximum Possible DIV Height
IE can handle about 10M pixels, but after about 1.5M it is not accurate. For example, my element had a style of top: 1535274px; and the browser was adding the content at 1534484px, so 790px higher, and my JS stopped working after that point. And since everything was OK up to that height, I concluded that my script is doing its job.
I just added a warning in my script to let IE users know that beyond that height the results are not accurate and that they should use another browser.
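For reference, a rough sketch of such a warning (the 1.5M threshold is just the point observed above, the Trident-based IE check and the warning text are my own choices, not anything IE documents):
var IE_SAFE_TOP = 1500000; // px - roughly where the misplacement described above starts
var isIE = /Trident\//.test(navigator.userAgent); // matches IE 11
if (isIE && t > IE_SAFE_TOP) {
    console.warn('IE positions content inaccurately beyond ~1.5M pixels; rows past this point may be off. Consider another browser.');
}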
Thank you for your help
I'm running into some strange error with the Date.now() function in JS.
Background info: I'm developing a web app that sends and receives packets to/from a server. All my incoming packets are marked with a timestamp generated by Date.now(). Among these packets is data that is plotted on a line chart, so I get a value on the y axis and the timestamp on the x axis. So far so good.
Letting the app run has shown me some strange behaviour: my line chart sometimes draws data points back in the past, right before already present data points! While checking the timestamps of these points I saw that my chart does everything right, but the timestamps are wrong.
So I wrote a little test script, to collect some more evidence:
var i = 0;
var ts = [Date.now()];
setInterval(function () {
    ts.push(Date.now());
    console.info("checking next timestamp ...");
    if (ts[i] > ts[i+1]) {
        console.error("old timestamp (" + ts[i] + ") is BIGGER than current timestamp (" + ts[i+1] + ")!");
    }
    i++;
}, 100);
Running this over a few minutes prints one error:
Way later I get another one, and so on.
For this test I chose a 100ms interval to check many timestamps. But my app has an update interval of maybe 5s, and this error still shows up sometimes. Not as often as with 100ms, but it's there :(
Has anyone else seen this behaviour before? Am I doing something wrong here?
BTW: The packets in my app come in one by one, so it's impossible that the order of the packets gets mixed up or anything like that.
Well, having taken a look at the Performance API yesterday, I think this is the way to go. I implemented it in my application as follows:
Every time I receive an input packet from my server I give it a timestamp. Until now I had done this with Date.now(), but that seems to be inconsistent, as the OS or the VM (or both) re-adjust their current time from time to time.
So now I calculate my own timestamp. For this I have to read the timestamp from when my application started. The Performance API gives me this value via
var startTS = window.performance.timing.navigationStart; //returns a timestamp in ms
So this value is available everywhere in my app, since it lives in the window namespace and does not require a global variable of my own. Very nice.
Now, to get the current timestamp, I have to add a time value that indicates how long my application has been running since startTS. To get this we just use:
var timeSinceStartup = window.performance.now(); //returns sth. like 1337.9836718 (ms with fractional part)
If needed, one can round this value as the fractional part might not be needed:
var rounded = (timeSinceStartup + 0.5) | 0; // same as Math.round(timeSinceStartup) for positive values, but faster
Well, the rest is easy. We just have to add those values (as both are milliseconds) and voila:
var currentTS = startTS + timeSinceStartup;
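Wrapped up as a small helper (my own packaging of the steps above, with a plain Date.now() fallback for browsers that lack the Performance API):
// monotonic wall-clock timestamp in ms: navigationStart + time elapsed since navigation
function monotonicNow() {
    if (window.performance && performance.now && performance.timing) {
        return performance.timing.navigationStart + performance.now();
    }
    return Date.now(); // fallback - may jump if the OS/VM re-adjusts the clock
}
var currentTS = monotonicNow();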
I have updated the little code snippet from my question to test this:
var i = 0;
var start = window.performance.timing.navigationStart;
var ts = [start + window.performance.now()];
setInterval(function () {
    ts.push(start + window.performance.now());
    console.info("checking next timestamp ...");
    if (ts[i] > ts[i+1]) {
        console.error("old timestamp (" + ts[i] + ") is BIGGER than current timestamp (" + ts[i+1] + ")!");
    }
    i++;
}, 100);
Chrome is running this test at the moment (in a VM) and it looks like everything is just fine (17500 checks without an error). I would have been surprised if anything else had happened, as we just add a constant to a monotonically rising value ;)
The only drawback of this solution seems to be browser support. As of now every modern browser but Safari supports this. As my app is developed to run only on modern browsers this is OK for me, but the lack of Safari support is bad.
http://caniuse.com/#feat=high-resolution-time
I will have to take a look into cross-browser solutions/polyfills/frameworks or something. Thanks everybody for your help. Anyway, I'm still curious whether there might be a solution for Date.now() in Chrome, as FF and IE had no problems with using it.
https://code.google.com/p/chromium/issues/detail?id=408077
EDIT
This seems to be a good polyfill for a missing Performance API:
https://gist.github.com/paulirish/5438650
Of course it falls back to Date.now(), so in browsers that don't support the API I might run into the same problem as before. For this case I just keep checking my timestamps in the app, and if a timing error occurs I'm going to handle it silently (e.g. don't draw it to the chart; instead wait for the next value).
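A minimal sketch of that silent handling (addPoint and lastTS are just illustrative names, not part of my actual app):
var lastTS = 0;
function addPoint(ts, value) {
    if (ts < lastTS) {
        return; // out-of-order timestamp: skip the sample instead of drawing it behind existing points
    }
    lastTS = ts;
    // ...plot (ts, value) on the chart here...
}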
So long...
Hello fellow code people :)
I am a front-end web developer and as such constantly need to know the actual viewport size in order to see where, in a responsive design, breakpoints start and end.
I know FF's own 'test window size' function, but came across a very handy extension: FireSizer.
The extension has one itsy bitsy drawback: it reports the window size including FF's borders and scrollbar. I need the viewport size, though. So I need the extension hacked, but I don't know enough JavaScript to do so. Maybe someone is willing to help me out here?
I would love the extension to actually look for the scrollbar, and subtract from the width
a) 14 if no scrollbar present or
b) 30 if scrollbar present
I found what I think is the right place to alter the code:
//
// Update the status bar panel with the current window size
//
function FiresizerUpdateStatus() {
var width = window.outerWidth + ''; // <- Think code needs to be edited here
var height = window.outerHeight + '';
document.getElementById("firesizer-statuspanel").label = width + 'x' + height;
}
Thanks for any effort!
AO
@Chen Asraf:
Well, thank you very much. I didn't know there was an element property that gives the document width. I changed the code to the following, and that did the trick (when compared to FF's own 'Responsive Design View' mode, which is spot on, it's off by 2px, which I subtract from clientWidth).
function FiresizerUpdateStatus() {
    // var width = window.outerWidth + ''; // changed this line to:
    var width = document.documentElement.clientWidth - 2 + '';
    var height = window.outerHeight + '';
    document.getElementById("firesizer-statuspanel").label = width + 'x' + height;
}
Thanks
AO
Possible duplicate of Get the browser viewport dimensions with JavaScript
Seems like you can get the window's inner dimensions by using:
// My window is maximized; screen is 1366x768
alert(document.documentElement.clientWidth);
// ^ returns 1349 (17 missing pixels because of scrollbar)
alert(document.documentElement.clientHeight);
// ^ returns 643 (125 pixels missing because of start bar & Chrome toolbars)
You can then compare these with whatever else you need (for example, compare the client width with the window width to find out whether the difference is big enough to be a scrollbar; just experiment with the sizes).
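For example, a rough sketch of that comparison (assuming window and document here resolve to the content page, as in the change above; the 14/30 offsets are the values from the question, not something Firefox guarantees):
// a vertical scrollbar is present when the window's inner width exceeds the document's client width
var hasScrollbar = window.innerWidth > document.documentElement.clientWidth;
var width = window.outerWidth - (hasScrollbar ? 30 : 14); // 30/14 offsets taken from the question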
I'm trying to get the browser viewport size.
When the page initially loads (in jQuery(function() { ... })), both of these show the correct value (e.g. 560):
console.log($(window).height());
console.log(document.documentElement.clientHeight);
But later, when I do the same thing, it shows the height of the whole document (e.g. 11675).
There's a lot of HTML and JS and it would take a while to figure out what's going on. I was just wondering: has anyone seen anything like this? If so, what can cause it, and how can I get the correct size of the viewport? All Google hits show that this is the correct way to retrieve the value.
Note: I'm using Chrome.
I recently bumped into the same problem in one of my projects. I didn't have time to dig and isolate this weird bug, and I ended up using this function (adapted from this answer) to correctly get the viewport dimensions :
var getViewportSize = (function(){
    var w = window,
        d = document,
        e = d.documentElement,
        g = d.getElementsByTagName('body')[0];
    return function(){
        return {
            w : Math.max(w.innerWidth || e.clientWidth || g.clientWidth, app.config.minWidth),
            h : Math.max(w.innerHeight || e.clientHeight || g.clientHeight, app.config.minHeight)
        };
    };
})();
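Usage is then simply:
var size = getViewportSize();
console.log(size.w + 'x' + size.h); // viewport dimensions, clamped to app.config.minWidth / minHeight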
From what I've tested, jQuery returned the incorrect size when the console or some other browser extension/toolbar was occupying some of the viewport space.
Hope this helps, but I'm also curious and trying to figure this one out, because it's hard to believe that a mature lib such as jQuery 2.0 has this kind of bug.