In the D3 library, I find the set of functions for handling dates a bit inconsistent. For example, running the following four steps in the console of a page that loads D3, I get:
> start = new Date(2010, 11, 30)
Thu Dec 30 2010 00:00:00 GMT+0000 (GMT Standard Time)
> end = new Date(2011, 0, 2)
Sun Jan 02 2011 00:00:00 GMT+0000 (GMT Standard Time)
> d3.time.months(start, end, 1)
[Sat Jan 01 2011 00:00:00 GMT+0000 (GMT Standard Time)]
> d3.time.days(start, end, 1)
[Thu Dec 30 2010 00:00:00 GMT+0000 (GMT Standard Time), Fri Dec 31 2010 00:00:00 GMT+0000 (GMT Standard Time), Sat Jan 01 2011 00:00:00 GMT+0000 (GMT Standard Time)]
The above suggests that day.range starts from the first argument and ends just before the second, while month.range seems to do the opposite.
The documentation states:
# d3.time.months(start, stop[, step])
Alias for d3.time.month.range. Returns the month boundaries (e.g., January 01)
after or equal to start and before stop. If step is specified, then every step'th
month will be returned, based on the month of the year. For example, a step of 3
will return January, April, July, etc.
The phrase "after or equal to start and before stop" is also used for d3.time.days, yet the results appear to differ. Also, when do these functions return dates after the start, and when dates equal to it? What makes the difference?
NB: my wish would be for these functions to return arrays of days, months, and years that include both the start and end parameters.
As clearly explained here, the behaviour is in fact consistent. Both day.range and month.range return the day and month boundaries, respectively, that fall between the start and end parameters. A date is included only when it is itself a boundary: Dec 30 is a day boundary, so d3.time.days includes it, while the only month boundary (a first-of-the-month midnight) inside the interval is Jan 01, which is exactly what d3.time.months returns.
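You can verify this reading in the console: a start date that is itself a month boundary is included, matching the "after or equal to start" wording (reusing the end variable defined above):
> d3.time.months(new Date(2011, 0, 1), end)
[Sat Jan 01 2011 00:00:00 GMT+0000 (GMT Standard Time)]
> d3.time.days(new Date(2011, 0, 1), end)
[Sat Jan 01 2011 00:00:00 GMT+0000 (GMT Standard Time)]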
Related
I've been struggling for days with some DateTime values.
I have an API backend that uses Entity Framework and SQL Server with .NET Core.
The big issue is when I want to send a datetime from Angular to the C# backend. I noticed that Date() in TypeScript/JavaScript uses my timezone by default, and I don't know how to exclude it.
For example, my date looks like this:
Wed Jul 11 2019 21:00:00 GMT+0300
And when it arrives in C# it becomes 07/10/2019 (MM-dd-yyyy); it subtracts one day due to the timezone.
Is there a way to standardize the Date variable to ignore the timezone and always keep the same DD-MM-YYYY format?
I've also tried MomentJS and still can't figure it out; even my MomentJS comparisons are acting strangely due to this issue.
For example:
const VacationStart = moment(calendarEntity.Vacation.StartTime).utc(false);
const VacationEnd = moment(calendarEntity.Vacation.EndTime).utc(false);
if (VacationStart.isSameOrBefore(ColumnDate,'day') && VacationEnd.isSameOrAfter(ColumnDate,'day')) {
return '#FF0000';
}
In the above example:
VacationStart is Wed Jul 10 2019 21:00:00 GMT+0300
VacationEnd is Wed Jul 17 2019 00:00:00 GMT+0300
ColumnDate is Thu Aug 15 2019 03:00:00 GMT+0300 (incremental value)
Yet for some reason, even though I use isSameOrBefore(ColumnDate, 'day') to compare only down to the day, it still does not work: when VacationEnd should be equal to ColumnDate, it returns false.
Note: everything is in a foreach loop where ColumnDate increases by +1 day.
You just need to use UTC time (Greenwich Mean Time)
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date/UTC
https://learn.microsoft.com/en-us/dotnet/api/system.datetime.utcnow?view=netcore-2.2
So something like this:
new Date(new Date().toUTCString()); -- "Mon Jul 01 2019 17:55:41 GMT-0700 (Pacific Daylight Time)"
new Date().toUTCString(); -- "Tue, 02 Jul 2019 00:56:38 GMT"
new Date().toString(); -- "Mon Jul 01 2019 17:57:03 GMT-0700 (Pacific Daylight Time)"
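If what matters is the calendar day rather than the instant, another option is to pin the date to UTC midnight before serializing, so the day cannot shift when the backend parses it. A minimal sketch (the helper name toUtcMidnight is made up for illustration):
// Rebuild the date from its local calendar components at UTC midnight,
// then send the ISO string; the day part survives any timezone.
function toUtcMidnight(d) {
    return new Date(Date.UTC(d.getFullYear(), d.getMonth(), d.getDate()));
}
toUtcMidnight(new Date(2019, 6, 11)).toISOString(); // "2019-07-11T00:00:00.000Z"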
I use date.js for certain date calculations.
I am able to find out whether a date falls in the current week; the following returns true:
dateFld.between(Date.monday(), Date.friday())
But I want to check whether the date falls in the previous week. I am using the following code without luck:
alert(dateFld.between(Date.last().week().monday(), Date.last().week().sunday()));
Please help.
Sunday is the first day of the week.
Date.last().week().monday()
Mon Sep 07 2015 00:00:00 GMT+0100 (GMT Daylight Time) Correct
Date.last().week().sunday()
Sun Sep 06 2015 00:00:00 GMT+0100 (GMT Daylight Time) Incorrect
Date.last().sunday()
Sun Sep 13 2015 00:00:00 GMT+0100 (GMT Daylight Time) Correct
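So for a Monday-to-Sunday "previous week", the end of the range should be Date.last().sunday() rather than Date.last().week().sunday(). Combining the same date.js calls shown above, the corrected check would presumably be:
alert(dateFld.between(Date.last().week().monday(), Date.last().sunday()));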
Say I have some data that can span anywhere from 28-31 days. I don't know how many days the data spans, but I know the beginning and ending date, and I create a d3 time scale using the two dates as the domain.
If I specify that I want 1 tick per day, is there a way to get the axis to return how many ticks it's going to create?
Or, put another way, is there another method to determine in JavaScript how many days are in a range between two dates?
Check out d3.time.day. It's what d3.time.scale uses to do "time math". It's kinda hard to figure out how to use, but it looks like there's a method that'll return every day between two dates (each represented as a Date object at midnight of that day).
For example, here are the days that elapsed between Jan 24th and now:
d3.time.day.range(new Date(2015,0,24), new Date())
/* returns
[
Sat Jan 24 2015 00:00:00 GMT-0500 (EST),
Sun Jan 25 2015 00:00:00 GMT-0500 (EST),
Mon Jan 26 2015 00:00:00 GMT-0500 (EST),
...
Tue Feb 03 2015 00:00:00 GMT-0500 (EST),
Wed Feb 04 2015 00:00:00 GMT-0500 (EST)
]
*/
So then you can take the .length of that array and there you have it... There are also equivalent intervals for counting hours, weeks, months, etc.
Maybe there's also a way to get just the number of days, without producing the actual array of Dates, but I couldn't find one.
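For a plain day count you don't strictly need D3 at all; millisecond arithmetic works in plain JavaScript. A small sketch (the helper name daysBetween is made up; Math.round absorbs the one-hour wobble from DST transitions):
// 864e5 is 86,400,000, the number of milliseconds in a day.
function daysBetween(start, end) {
    return Math.round((end - start) / 864e5);
}
daysBetween(new Date(2015, 0, 24), new Date(2015, 1, 4)); // 11
Note this counts the whole days between two midnights, so it is one less than the length of the inclusive array above.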
> new Date('2015-1-1')
Thu Jan 01 2015 00:00:00 GMT-0500 (EST)
> new Date('2015-01-1')
Thu Jan 01 2015 00:00:00 GMT-0500 (EST)
> new Date('2015-1-01')
Thu Jan 01 2015 00:00:00 GMT-0500 (EST)
// Yet...
> new Date('2015-01-01')
Wed Dec 31 2014 19:00:00 GMT-0500 (EST)
// Similarly:
> new Date('2015-1-10')
Sat Jan 10 2015 00:00:00 GMT-0500 (EST)
> new Date('2015-01-10')
Fri Jan 09 2015 19:00:00 GMT-0500 (EST)
Can't figure out why this is happening (Chrome 39). Is it related to octal parsing?
Firefox only accepts new Date('2015-01-10'), and returns what I expect: Date 2015-01-10T00:00:00.000Z
Found the answer in a related question; it appears Chrome parses the YYYY-MM-DD format as UTC time, then converts it to the local timezone. So 2015-01-01 00:00:00 in UTC is Dec 31 in EST.
See Inconsistencies when creating new date objects:
It looks like the form '1979-04-05' is interpreted as a UTC date (and then that UTC date is converted to local time when displayed).
Apparently, a possible cross-browser solution is to replace the dashes with slashes to force local-time parsing:
new Date('2015-01-10'.replace(/-/g, '/'))
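If you'd rather not rely on the slash-parsing quirk, another option is to split the string yourself and use the numeric constructor, which is always interpreted in local time (parseLocalDate is an illustrative name, not a library function):
function parseLocalDate(s) {
    var parts = s.split('-');                            // ["2015", "01", "10"]
    return new Date(+parts[0], parts[1] - 1, +parts[2]); // month is zero-indexed
}
parseLocalDate('2015-01-10'); // Sat Jan 10 2015 00:00:00, local time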
I am unsure of your problem, since my Chrome (39.0.2171.99) gives me Jan 01 in all cases. But having said this, I would like to point out that you should probably use
new Date(2015, 0, 1)
This is how a JS Date is supposed to be initialised (note that the month argument is zero-indexed, so 0 is January).
I have just noticed some strange behaviour in one of my arrays. I am sure the issue is with how JavaScript stores object references in arrays. I will demonstrate the issue with a bit of code I posted as an answer to another question on SO. The code just loops to get today's date and the six previous dates of the month; pretty self-explanatory.
var dates = [];
var date = new Date();
for (var i = 0; i < 7; i++) {
    var tempDate = new Date();
    tempDate.setDate(date.getDate() - i);
    dates.push(tempDate);
}
console.log(dates);
Output: [Thu Jun 05 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Wed Jun 04 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Tue Jun 03 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Mon Jun 02 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Sun Jun 01 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Sat May 31 2014 14:54:14 GMT+0100 (GMT Daylight Time),
Fri May 30 2014 14:54:14 GMT+0100 (GMT Daylight Time)]
This is correct and expected, as tempDate is recreated as a new Date object on each pass of the loop.
When I take tempDate out of the loop, however, it seems to update all of the objects in the array on every iteration (the loop also seems to go one month and a day too far, to Apr 29):
var dates = [];
var date = new Date();
var tempDate = new Date();
for (var i = 0; i < 7; i++) {
    tempDate.setDate(date.getDate() - i);
    dates.push(tempDate);
}
console.log(dates);
Output: [Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time),
Tue Apr 29 2014 14:52:21 GMT+0100 (GMT Daylight Time)]
So the two questions I pose are:
Why does the object stored in the array keep mutating with every iteration? (I suspect this is because of the way JavaScript stores references to objects, but an explanation would be nice.)
Why does the loop run one too far in the second bit of code? The first example ends after 7 successful iterations, on May 30th (7 days in the past); the second example ends on Apr 29th, more than a month into the past. Why?
Edit: I have a rudimentary jsfiddle to allow testing the code.
var tempDate = new Date(); creates one new object, so in this example:
var array = [tempDate, tempDate, tempDate, tempDate, tempDate];
array[0] is the same object as array[1]. This is just how objects work in almost every programming language:
var a = tempDate;
var b = tempDate;
var c = b;
(In the above, a, b, c, and tempDate all refer to the same object.)
Regarding question 2, please have a look at this fiddle: http://jsfiddle.net/Mqnmm/
With a new object, tempDate is always today before setDate() is applied, so every offset lands where you expect. With the single old object, the date before the last iteration is Sat, 31 May 2014 14:23:34 GMT; the final pass then calls tempDate.setDate(-1), and since the object is now sitting in May, day 0 means Apr 30 and day -1 means Apr 29. setDate counts non-positive values backwards from the end of the previous month.
See Subtract days from a date in JavaScript, and in particular Rob Dawley's answer, for how to properly adjust dates.
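As a minimal sketch of the usual fix, clone the base date on every pass so each array entry is a distinct object, and always compute the offset from the untouched original so the month can never drift:
var dates = [];
var date = new Date();
for (var i = 0; i < 7; i++) {
    var tempDate = new Date(date.getTime()); // fresh copy each pass
    tempDate.setDate(date.getDate() - i);    // offset relative to today, not to the last pass
    dates.push(tempDate);                    // each entry is its own object
}
console.log(dates);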