Day overflows in JavaScript Date constructor - javascript

I am currently writing a JavaScript-based client-side calendar and have run into some issues. All over the net I can find code samples where people use day overflows in the Date constructor, for example:
// get the first day of the next month
var myDate = new Date(someDate.getFullYear(),someDate.getMonth(),32);
myDate.setDate(1);
The general idea is that, since no month has 32 days, the constructor rolls the overflow into the next month. I even saw code samples with negative overflows, for example:
// get the last day of the previous month
var myDate = new Date(someDate.getFullYear(),someDate.getMonth(),1);
myDate.setDate(-1);
Now while this seems to work in many cases, I finally found a contradiction:
// this prints "2012-12-30", but "2012-12-31" was expected
var myDate = new Date(2013,0,1);
myDate.setDate(-1);
Further examination revealed that dates like
new Date(2013,0,23) or new Date(2013,0,16) combined with setDate(-1) all end up as "2012-12-30". Finally I observed that using -1 seems to subtract two days (to get the expected result, setDate(0) has to be used).
Is this a bug in the browser implementations, or are the code samples spread across the internet crap?
Furthermore, is setDate with positive and negative overflow safe to use, and is it uniformly implemented by all major browsers?

From MDN:
If the parameter you specify is outside of the expected range, setDate attempts to update the date information in the Date object accordingly. For example, if you use 0 for dayValue, the date will be set to the last day of the previous month.
It's logical if you think about it: setDate(1) sets the date to the first of the month. To get the last day of the previous month, that is, the day before the first of this month, you subtract one from the argument and get 0. If you subtract two (1 - 2 = -1) you get the second-to-last day.
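For instance, a quick sketch of the difference between setDate(0) and setDate(-1):
var d = new Date(2013, 0, 1);   // Jan 1, 2013 (months are zero-based)
d.setDate(0);                   // day 0 = last day of the previous month
console.log(d.toDateString());  // Mon Dec 31 2012
var e = new Date(2013, 0, 1);
e.setDate(-1);                  // day -1 = one day earlier still
console.log(e.toDateString());  // Sun Dec 30 2012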
are [..] code samples spread across the internet crap?
Yes. This is true at least 90% of the time.

At MDN they say:
If the parameter you specify is outside of the expected range, setDate attempts to update the date information in the Date object accordingly. For example, if you use 0 for dayValue, the date will be set to the last day of the previous month.
So you're getting coherent results:
1 - Jan 1
0 - Dec 31
-1 - Dec 30
-2 - Dec 29
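A small snippet to confirm that mapping (nothing here is browser-specific; it follows from setDate's definition above):
[1, 0, -1, -2].forEach(function (day) {
  var d = new Date(2013, 0, 1);   // Jan 1, 2013
  d.setDate(day);
  console.log(day, d.toDateString());
});
// 1  Tue Jan 01 2013
// 0  Mon Dec 31 2012
// -1 Sun Dec 30 2012
// -2 Sat Dec 29 2012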
Edit: It may look counter-intuitive if you think of it as a mere relative value, like the argument of PHP's strtotime() function:
strtotime('-1 day');
It's not the case ;-)

Related

Date Handling Unix Date Incorrectly (or I'm using Date incorrectly?)

I have the following data structure. The first column is intervals. The first row of the interval column is a Unix time, and the subsequent rows are interval offsets (i.e. 300*1, 300*2, etc.). The other column is the data values. Here is the head of the data:
a1521207300,555.45
1,554.53
2,554.07
3,553.9
4,552.67
And here is how I went about converting the Unix time to a Date object. The a here is ornamental, so I slice() at 1 like so:
var rawTime = data[0].interval;
var timeValue = Math.round(rawTime.slice(1));
console.log(timeValue)
console.log(new Date(timeValue))
I also tried using parseInt() instead of Math.round(). The console shows that this Unix time is equivalent to Jan 18 1970, which I had quite the guffaw at. Then I got to thinking: maybe I did something wrong. It's supposed to be a very recent date -- March 16th 2018. This is strange, because my understanding is that JavaScript can be passed a Unix timestamp directly, as per this answer.
I also checked the unix time at a conversion site: www.onlineconversion.com/unix_time.htm
Which confirmed that it's indeed a March 16th 2018 timestamp.
Question: Why is this Unix date for my March 2018 data being treated like a 1970s date? Maybe the a is actually doing something after all... Anyway, what is the correct way to handle this timestamp? It's only 10 numerical digits, so it does not seem to be a precision problem. Date can handle Unix times up to 13 digits, I believe.
As per the documentation, when you invoke new Date(value) with an integer value, it is used as the number of milliseconds since January 1, 1970. To get the date you want, the value 1521207300 appears to be a number of seconds instead of milliseconds; that is, you missed a factor of 1000. new Date(1521207300000) gives Fri Mar 16 2018.
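Putting it together for the data format in the question (a sketch; data[0].interval and the leading "a" are as described above):
var rawTime = data[0].interval;               // e.g. "a1521207300"
var seconds = parseInt(rawTime.slice(1), 10); // strip the ornamental "a"
var date = new Date(seconds * 1000);          // Date expects milliseconds
console.log(date.toString());                 // Fri Mar 16 2018 ... (local time)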
When I take away new from new Date, it seems to be OK. Not sure why, though.
The documentation mentions the different behavior:
Note: JavaScript Date objects can only be instantiated by calling JavaScript Date as a constructor: calling it as a regular function (i.e. without the new operator) will return a string rather than a Date object; unlike other JavaScript object types, JavaScript Date objects have no literal syntax.
Note that when Date is called as a regular function rather than as a constructor, it ignores any arguments and simply returns a string for the current date and time, so the result only looks right if you happen to check it on the day you expected. Either way, the documentation says not to use it this way (and since it gives a string instead of a Date object, it's not very useful anyway).
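A small illustration of the difference (the string that Date() returns depends on when you run it):
console.log(new Date(1521207300000)); // Fri Mar 16 2018 ... -- a Date object
console.log(Date(1521207300000));     // a string with the *current* date/time; the argument is ignored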

Create a Date Object with the year only

I'm used to creating Date objects using the fourth syntax from MDN, i.e. new Date(year, month, day, hours, minutes, seconds, milliseconds);. But lately I tried to set a Date object with only a year (as new Date(2017)), and as you might expect it was treated as a timestamp value, so the year was interpreted as a number of milliseconds.
Is there any way to still easily use the year as-is, without changing the syntax, and get a correctly set Date?
Two solutions come to my mind:
(1) Set the year argument to 2017 and set the month argument to 0 when constructing the date:
let d = new Date(2017, 0);
console.log(d.toString());
The arguments will be treated as local time; month and day of month will be January 1; all time components will be set to 0.
(2) Specify "2017T00:00" as the first and only argument when constructing the date:
let d = new Date("2017T00:00");
console.log(d.toString());
According to the current spec this is a valid format, and browsers are supposed to treat it as local time. The behavior is the same as that of the previous example.
If you pass a single parameter (number or string), it is interpreted as per the docs:
value
Integer value representing the number of milliseconds since January 1, 1970 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider that most Unix time stamp functions count in seconds).
dateString
String value representing a date. The string should be in a format recognized by the Date.parse() method (IETF-compliant RFC 2822 timestamps and also a version of ISO8601).
Also, as per the docs:
If at least two arguments are supplied, missing arguments are either set to 1 (if day is missing) or 0 for all others.
You can pass one more parameter as 0 or null (or the actual value you want to set)
new Date(2017,0);
Demo
var date = new Date(2017,0);
console.log( date );
You could pass null as the second argument:
new Date(2017, null);
However, without knowing the details of how missing values are interpreted, what do you think happens now? Will the date be initialized with the current month, day, hour, etc? Or something different?
Better to be explicit and pass all the arguments, so that you know what the code is doing half a year later.
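For instance, spelling every argument out leaves nothing to guess (a sketch; use whatever month and day you actually mean):
// Midnight local time on January 1st, 2017
var d = new Date(2017, 0, 1, 0, 0, 0, 0);
console.log(d.toString());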
I have another suggestion: you can just create a regular Date object and set its year. That way you at least know what to expect for the rest of the Date object's values.
var year = "2014";
var date = new Date();
date.setFullYear(year);
// console.log(date) => e.g. Sat Dec 27 2014 16:25:28 GMT+0200 (month, day and time come from the current date)
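As a side note, setFullYear() also accepts optional month and day arguments, so you can pin those down as well instead of inheriting them from "now" (a small sketch):
var date = new Date();
date.setFullYear(2014, 0, 1); // year, month (zero-based), day
date.setHours(0, 0, 0, 0);    // time-of-day still comes from "now", so zero it out if needed
console.log(date.toString()); // Wed Jan 01 2014 00:00:00 ...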
Further reading - Date.prototype.setFullYear()

fullcalendar confusion with UTC and local date

I let fullcalendar initialize normally, so it represents the current date (midnight to midnight, 1-day view, 1-hour slots).
From another data source I get data with timestamps. The format is "YYYY-MM-DD HH:mm" (transmitted as a string, no timezone information).
So I convert that string to a moment object and test it against fullcalendar's .start and .end to see if it falls within the view.
moment("2016-04-07 00:00") == $('#calendar').fullCalendar('getView').end
This results in false, even though the following command
$('#calendar').fullCalendar('getView').end.format("YYYY-MM-DD HH:mm")
returns
"2016-04-07 00:00"
I also tried to compare with diff
moment("2016-04-07 00:00").diff( $('#calendar').fullCalendar('getView').end,"minutes")
which returns
120
Some research on the calendar's end object in the Chrome Dev Tools revealed that it is internally represented as
2016-04-07 02:00 GMT+0200
This looks strange to me. I am in a timezone 2 hours ahead of GMT, so it should correctly say 2016-04-07 00:00 GMT+0200, should it not?
This also explains why the diff test above resulted in 120 minutes.
Can someone help? I do not get where the conversion problem comes from. I am only using dates with no timezone information. And as said above, fullcalendar initializes with no gotoDate information and shows a time bar from 00:00 to 00:00. So where does this 2-hour difference come from?
Thanks a lot. I do understand things a lot better now.
Some of the dates I tried to compare were 'now'. I got 'now' by
var n = moment()
That turned out to be a datetime in my local timezone.
E.g. moment().format() resulted in '2016-04-07 00:00 GMT+0200', and I now see how this went wrong: I expected a comparison against fullcalendar's .end to be true, but it was false, since '2016-04-07 00:00 GMT+0200' is '2016-04-06 22:00' at UTC.
As
moment.utc()
does not work, I now ended up using
moment.utc(moment().format('YYYY-MM-DD HH:mm'))
This now seems to work, as it treats my local time as if it were the 'numerically same' time at UTC, thus matching how fullcalendar handles times internally (ambiguously-zoned moments).
Thanks
A few things:
The timezone parameter controls how FullCalendar works with time zones.
By default, FullCalendar uses "ambiguously-zoned moments". These are customizations to moment.js made within fullCalendar. The docs state:
The moment object has also been extended to represent a date with no specified timezone. Under the hood, these moments are represented in UTC-mode.
Thus, to compare dates in this mode, treat them as if they were in UTC.
moment.utc("2016-04-07 00:00")
To compare moments, use the moment query functions: isSame, isBefore, isAfter, isSameOrBefore, isSameOrAfter, and isBetween.
In this case, since FullCalendar's start is inclusive but the end date is exclusive, you probably want to compare like this:
var cal = $('#calendar').fullCalendar('getView');
var start = cal.start;
var end = cal.end;
var m = moment.utc("2016-04-07 00:00"); // your input
var between = m.isSameOrAfter(start) && m.isBefore(end);
Note that there's a pending enhancement to moment's isBetween functionality for a future release that will give you control over exclusivity, but currently isBetween is fully inclusive, so you have to use the combination of functions shown here.

later.js - February and End of Month

I am creating a platform for recurring monthly orders.
I am using later.js for the recurrence. I have come across the following two cases and I am wondering if anybody has suggestions on how to better handle these (or if later.js handles them natively somehow):
later.parse.recur().on(31).dayOfMonth()
The date is the 31st of a given month. The current result is that it skips months that end on the 30th. WORKAROUND: use last().dayOfMonth().
later.parse.recur().on(30).dayOfMonth()
later.parse.recur().on(31).dayOfMonth()
The month of February, which ends on the 28th or 29th. How should I handle the case where the date is the 30th (or 31st)? WORKAROUND: if the date is > the 28th, add .and().on(59).dayOfYear().
Thanks!
I don't know the specifics of later.js, but apparently you can write something called a custom modifier: https://github.com/bunkat/later/blob/master/example/modifier.js
In addition to this, if you take a JavaScript date, move it to the first day of the following month (it doesn't matter if the month number overflows past 11/December), and then subtract 1 millisecond, you'll get the last moment of the originally given month. For example:
var a = new Date("2000-02-25");
// First day of the month after a, minus 1 ms = last moment of a's month
var b = new Date(new Date(a.getFullYear(), a.getMonth() + 1, 1) - 1);
console.log(b); // e.g. Feb 29 2000 23:59:59.999 local time (2000 is a leap year)

Will assigning 0 to the 3rd parameter of a JavaScript Date() object always create an end of month date?

I'm working on a jQuery credit card expiration date validation script. Credit cards expire after the last day of the expiration month. For instance, if the card expires on 8/2013 then it's good through 8/31/2013.
In the past on the server side I've determined the last day of the month by adding 1 to the current month, then subtracting 1 day.
Today I noticed that when creating a new date, if 0 is applied to the 3rd parameter of the JavaScript Date() object, the resulting date will be the end-of-month day. But I've been unable to locate any online documentation to affirm this observation.
Here is some sample code.
var month = 10;
var year = 2013;
var expires = new Date(year, month, 0);
alert(expires);
And here is a jsFiddle example that I created.
This is a bit confusing, because I thought in JavaScript months were zero based. I've tested this in Chrome, Firefox, IE, and Safari, and the behavior appears consistent. The returned date consistently displays the last day of the month. This looks like a lucky find, but I'd really like to understand what is happening here.
Am I safe to run with this approach to assigning an end of month date, and if so is there some online documentation that I can point to which affirms this? Thanks.
Months are zero-based. That creates an end-of-month date in the previous month. Month 10 is November, so creating a date with day 0 in November gives you the end of October (month 9).
That is, day 0 in November means "the day before 1 November", which is the last day of October. Day -1 in November would be 30 October.
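A short demonstration:
// Month 10 is November; day 0 backs up to the last day of October
console.log(new Date(2013, 10, 0).toDateString());  // Thu Oct 31 2013
// Day -1 backs up one day further
console.log(new Date(2013, 10, -1).toDateString()); // Wed Oct 30 2013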
