JavaScript getTime since beginning of 2012 in milliseconds - javascript

getTime() gives you milliseconds since January 1, 1970.
How can I get the milliseconds since January 1, 2012?
This is what I currently have:
var n = new Date();
var a = n.getTime();
console.log(a);

How about:
var diff = new Date() - new Date(2012, 0, 1); // diff is in milliseconds
for a difference measured from local midnight (including any local time zone deviation), or
var diff = new Date() - Date.UTC(2012, 0, 1); // diff in ms
for a time-zone-independent result measured from UTC midnight.
Note that months in JavaScript are zero-based.
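For instance, a quick sanity check of the zero-based months, and of how the two variants above differ (a minimal sketch; the last value depends on your time zone):
var jan = new Date(2012, 0, 1);
console.log(jan.toDateString()); // "Sun Jan 01 2012" - month 0 is January
var diffLocal = new Date() - new Date(2012, 0, 1); // measured from local midnight
var diffUTC = new Date() - Date.UTC(2012, 0, 1); // measured from UTC midnight
console.log(diffLocal - diffUTC); // your UTC offset on that date, in ms (0 in GMT)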

var ms = +new Date() - new Date('2012/01/01')
or make that second Date object new Date('2012/01/01 GMT') if you want it anchored to GMT.

Here is an example:
http://dl.dropbox.com/u/56150009/html/SO8780297/example.htm
Sample output below. Notice how DurationGMT and DurationLocal differ; when comparing dates, always use GMT.
Now: 1,326,054,979,124 ms (Sun, 08 Jan 2012 20:36:19 GMT)
Start1: 1,325,376,000,000 ms (Sun, 01 Jan 2012 00:00:00 GMT)
Start2: 1,325,376,000,000 ms (Sun, 01 Jan 2012 00:00:00 GMT)
Start3: 1,325,376,000,000 ms (Sun, 01 Jan 2012 00:00:00 GMT)
DurationGMT: 678,979,124 ms (Accurate method)
StartLocal1: 1,325,397,600,000 ms (Sun, 01 Jan 2012 06:00:00 GMT)
DurationLocal: 657,379,124 ms !!! Don't use this method
Here are three methods to get a GMT date; #3 is what you want.
var now = new Date();
var startOfYear1 = createGMTDate1(2012, 0, 1, 0, 0, 0, 0);
var startOfYear2 = createGMTDate2(2012, 0, 1, 0, 0, 0, 0);
var startOfYear3 = createGMTDate3(2012, 0, 1, 0, 0, 0, 0);
var durationGMTMillis = now.getTime() - startOfYear1.getTime(); // accurate
var startOfYearLocal1 = new Date(2012, 0, 1, 0, 0, 0, 0);
var durationLocalMillis = now.getTime() - startOfYearLocal1.getTime(); // inaccurate
function createGMTDate1(year, month, date, hours, mins, secs, millis) {
    var dateDefaultTz = new Date(year, month, date, hours, mins, secs, millis);
    var localeTzGMTMillis = dateDefaultTz.getTime();
    var localeTzGMTOffsetMillis = dateDefaultTz.getTimezoneOffset() * 60 * 1000;
    var dateGMT = new Date(localeTzGMTMillis - localeTzGMTOffsetMillis);
    return dateGMT;
}
function createGMTDate2(year, month, date, hours, mins, secs, millis) {
    var dateGMT = new Date(0);
    dateGMT.setUTCFullYear(year);
    dateGMT.setUTCMonth(month);
    dateGMT.setUTCDate(date);
    dateGMT.setUTCHours(hours);
    dateGMT.setUTCMinutes(mins);
    dateGMT.setUTCSeconds(secs);
    dateGMT.setUTCMilliseconds(millis);
    return dateGMT;
}
function createGMTDate3(year, month, date, hours, mins, secs, millis) {
    var dateGMT = new Date(Date.UTC(year, month, date, hours, mins, secs, millis));
    return dateGMT;
}

As others have said, the solution is subtracting Date instances:
var ms = now - before;
This works because the - operator converts its operands to Number (ECMAScript Language Specification, Edition 5.1, section 11.6.2). The corresponding ToNumber algorithm checks if the object has a valueOf method, and calls it if it exists (sections 9.3, 9.1, and 8.12.8).
It now happens that the Date.prototype.valueOf() method, that Date instances inherit, returns the same value as Date.prototype.getTime() for a given Date instance (section 15.9.5.8). Which is the number of milliseconds since January 1, 1970 (CE) 00:00:00.000 UTC ("epoch") (section 15.9.1.1).
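For illustration, the coercion can be observed directly (a minimal sketch):
var d = new Date();
console.log(d.valueOf() === d.getTime()); // true: same ms-since-epoch value
console.log(+d === d.getTime()); // true: unary + applies ToNumber, which calls valueOf
console.log((d - 0) === d.getTime()); // true: the - operator converts the same way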
The first operand is obvious if you want to compare against the local time:
var now = new Date();
The second operand is a bit trickier because you want to count from January 1, 2012 (CE) 00:00:00.000 GMT. For that you cannot use
var before = new Date(2012, 0, 1);
(or variations thereof) because it uses 00:00:00.000 local time (section 15.9.3.1). There are at least two ways to make this work:
Use a string value in a date format that must be recognized by conforming implementations of ECMAScript Edition 5.1 (section 15.9.1.15):
var ms = new Date() - new Date("2012-01-01T00:00:00.000Z");
If you are concerned about backwards compatibility, you can set the time explicitly (section 15.9.5):
var before = new Date();
before.setUTCFullYear(2012, 0, 1);
before.setUTCHours(0, 0, 0, 0);
var ms = now - before;

Related

Why is Daylight Saving Time not detected by JavaScript in my code?

I have tried the code below to detect whether Daylight Saving Time is observed, but although it is in effect, it always says it is not observed.
var myDate = new Date();
document.write(myDate.getTimezoneOffset());
var rightNow = new Date();
var jan1 = new Date(rightNow.getFullYear(), 0, 1, 0, 0, 0, 0);
var temp = jan1.toUTCString();
var jan2 = new Date(temp.substring(0, temp.lastIndexOf(" ")-1));
var std_time_offset = (jan1 - jan2) / (1000 * 60 * 60);
var june1 = new Date(rightNow.getFullYear(), 6, 1, 0, 0, 0, 0);
temp = june1.toUTCString();
var june2 = new Date(temp.substring(0, temp.lastIndexOf(" ")-1));
var daylight_time_offset = (june1 - june2) / (1000 * 60 * 60);
var dst;
if (std_time_offset == daylight_time_offset) {
    dst = "0"; // daylight savings time is NOT observed
} else {
    dst = "1"; // daylight savings time is observed
}
It currently returns "not observed" no matter which approach I use. (I can't add all the methods here, as the question would get too long.)
Hence, I just want to confirm:
Does this require any settings on my machine?
Do you have to be in a specific country for DST to be observed?
That code seems really complicated for what it needs to do. Here's a simple function that makes the same assumption you do (that DST is not in effect on Jan 1st of the year) and detects whether DST is in effect for the given Date instance:
function isDST(dt) {
    // Northern or southern hemisphere?
    // NOTE: Assumes that Jan 1st (southern hemisphere) or July 1st (northern hemisphere) will be DST.
    // This may be a "good enough" assumption, but if not, you'll need to download timezone information.
    const jan1 = new Date(dt.getFullYear(), 0, 1);
    const jul1 = new Date(dt.getFullYear(), 6, 1);
    const ref = jan1.getTimezoneOffset() < jul1.getTimezoneOffset() ? jul1 : jan1;
    return dt.getTimezoneOffset() !== ref.getTimezoneOffset();
}
console.log(isDST(new Date(2018, 6, 1))); // true (if July 1st is DST in your timezone)
console.log(isDST(new Date(2018, 1, 1))); // false (if Feb 1st is not DST in your timezone)
It takes advantage of the fact that getTimezoneOffset includes the DST offset if DST is in effect.
Of course, if a location permanently changes its timezone between the two dates, it would give a false positive, but that's a relatively rare occurrence.
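For example, the raw offsets the function compares look like this (a sketch; the numbers shown assume U.S. Central Time and will differ in your time zone):
var jan1 = new Date(2018, 0, 1);
var jul1 = new Date(2018, 6, 1);
console.log(jan1.getTimezoneOffset()); // e.g. 360 for CST (UTC-6), standard time
console.log(jul1.getTimezoneOffset()); // e.g. 300 for CDT (UTC-5): smaller while DST is in effect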
The OP has been modified since my original answer so that it no longer applies. T.J. has alternative code, but so as not to waste an answer, here's a similar function that:
Gets the timezone offset for 1 Jan and 1 Jul of the relevant year.
Assumes DST is in force and returns true if either:
the place observes southern hemisphere DST (i.e. the Jan offset is less than the Jul offset, noting that JavaScript offsets have the opposite sign to ISO and most other commonly used systems) and the date's offset equals the January offset, or
the place observes northern hemisphere DST (same caveat as above) and the date's offset equals the July offset.
In any other case (the Jan offset equals the Jul offset, so there is no DST at all, or the date's offset matches the non-DST offset), the date is not in DST.
function inDST(inDate) {
    var inOffset = inDate.getTimezoneOffset();
    var janOffset = new Date(inDate.getFullYear(), 0, 1).getTimezoneOffset();
    var julOffset = new Date(inDate.getFullYear(), 6, 1).getTimezoneOffset();
    return (janOffset < julOffset && janOffset == inOffset) ||
           (janOffset > julOffset && julOffset == inOffset);
}
// Just a helper for testing
function fDate(d) {
    var z = n => (n < 10 ? '0' : '') + n;
    return `${z(d.getDate())}-${d.toLocaleString(undefined, {month: 'short'})}-${d.getFullYear()}`;
}
// Tests
[new Date(2018, 1, 2),   // 2 Feb 2018
 new Date(2018, 5, 30)]  // 30 Jun 2018
.forEach(d =>
    console.log(`${fDate(d)}: ${inDST(d)}`)
);

JavaScript new Date() / UTC - GMT cross-browser

The issue:
Different formats for new Date() in IE 10 - IE 11.
Javascript:
IE 11 / Chrome :
var m = new Date("2014-07-04T04:00:00");
console.log(m); // Fri Jul 04 2014 06:00:00 GMT+0200 (W. Europe Summer Time)
IE 10:
var m = new Date("2014-07-04T04:00:00");
console.log(m); // Fri Jul 4 04:00:00 UTC+0200 2014
Is it possible to use one ring to rule them all?
You shouldn't pass a string to new Date, specifically for this reason.
Instead, you should either give it the individual arguments:
new Date(2014, 6, 4, 4, 0, 0); // remember months are zero-based
Or, if you want to give it a time in UTC, try:
var d = new Date();
d.setUTCFullYear(2014);
d.setUTCMonth(6);
d.setUTCDate(4);
d.setUTCHours(4);
d.setUTCMinutes(0);
d.setUTCSeconds(0);
d.setUTCMilliseconds(0);
You can, of course, make a function to do this.
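For instance, such a function might look like this (a minimal sketch; makeUTCDate is a hypothetical name, not a built-in):
function makeUTCDate(year, month, day, hours, minutes, seconds, ms) {
    // hypothetical helper: build a Date from UTC components
    var d = new Date(0); // start at the epoch, so the day-of-month is 1 and nothing rolls over
    d.setUTCFullYear(year, month, day); // year, month and day are set in one call
    d.setUTCHours(hours || 0, minutes || 0, seconds || 0, ms || 0);
    return d;
}
var utc = makeUTCDate(2014, 6, 4, 4); // Fri, 04 Jul 2014 04:00:00 GMT
Starting from new Date(0) also sidesteps the day-31 rollover problem described below.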
Alternatively, if you have a timestamp, you can simply do:
var d = new Date();
d.setTime(1404446400000);
To complete the answer a bit: the setUTC* example above is dangerous. If you execute the following on the 31st of May (or the 31st of any other month):
var d = new Date();
d.setUTCFullYear(2014);
d.setUTCMonth(5);
d.setUTCDate(4);
d.setUTCHours(4);
d.setUTCMinutes(0);
d.setUTCSeconds(0);
d.setUTCMilliseconds(0);
it will produce "2014 July 4 04:00:00": June has only 30 days, so setting the month while the day-of-month is still 31 rolls the date over into July.
So prefer the Date.UTC function instead:
new Date(Date.UTC(2014, 5, 4, 4, 0, 0, 0))
it will produce "2014 June 4 04:00:00".
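Note that Date.UTC returns a millisecond timestamp (a plain number) rather than a Date object, which is why it is wrapped in new Date(...) above. A quick check:
console.log(typeof Date.UTC(2014, 5, 4)); // "number" - a ms-since-epoch timestamp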

Javascript: find beginning of Advent weeks each year

I have created the following code (which works) to print something different based on the weeks of a specified month:
<script language="javascript">
<!--
var advent;
mytime = new Date();
mymonth = mytime.getMonth() + 1;
mydate = mytime.getDate();
if (mymonth == 12 && (mydate >= 1 && mydate <= 6)) { document.write("xxx"); }
if (mymonth == 12 && (mydate >= 7 && mydate <= 13)) { document.write("yyy"); }
if (mymonth == 12 && (mydate >= 14 && mydate <= 20)) { document.write("zzz"); }
if (mymonth == 12 && (mydate >= 21 && mydate <= 30)) { document.write("qqq"); }
//-->
</script>
But I need this to change for Advent each year and Advent changes based on when Christmas falls each year:
Advent starts on the Sunday four weeks before Christmas Day. There are
four Sundays in Advent, then Christmas Day. The date changes from year
to year, depending on which day of the week Christmas falls. Thus, in
2010, Advent began on 28 November. In 2011, it will occur on 27
November.
How do I calculate when the weeks of Advent begin each year?
Start with a Date that's exactly 3 weeks before Christmas Eve. Then, walk backwards until the day-of-week is Sunday:
function getAdvent(year) {
    // in JavaScript months are zero-indexed: January is 0, December is 11
    var d = new Date(new Date(year, 11, 24, 0, 0, 0, 0).getTime() - 3 * 7 * 24 * 60 * 60 * 1000);
    while (d.getDay() != 0) {
        d = new Date(d.getTime() - 24 * 60 * 60 * 1000);
    }
    return d;
}
getAdvent(2013);
// Sun Dec 01 2013 00:00:00 GMT-0600 (CST)
getAdvent(2012);
// Sun Dec 02 2012 00:00:00 GMT-0600 (CST)
getAdvent(2011);
// Sun Nov 27 2011 00:00:00 GMT-0600 (CST)
(2013 and 2012 were tested and verified against the calendar on http://usccb.org/. 2011 was verified against http://christianity.about.com/od/christmas/qt/adventdates2011.htm)
Here's what I was talking about in my comment:
function getAdvent(year) {
    var date = new Date(year, 11, 25);
    var sundays = 0;
    while (sundays < 4) {
        date.setDate(date.getDate() - 1);
        if (date.getDay() === 0) {
            sundays++;
        }
    }
    return date;
}
DEMO: http://jsfiddle.net/eyUjX/1/
It starts on Christmas Day of the given year and goes into the past, day by day, checking for Sunday (where .getDay() returns 0). After 4 Sundays have been encountered, the loop stops and that Date is returned.
So to get 2009's beginning of Advent, use: getAdvent(2009);. It returns a Date object, so you can still work with its methods.
As a reference of its methods: https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Date
You can get Advent Sunday by adding 3 days to the last Thursday in November, which seems simpler:
function getAdventDay(y) {
    var advent = new Date();
    advent.setHours(0, 0, 0, 0);
    // set the year:
    if (typeof y != 'number') y = advent.getFullYear();
    // get the last day of November:
    advent.setFullYear(y, 10, 30);
    // back up to the last Thursday in November:
    while (advent.getDay() !== 4) advent.setDate(advent.getDate() - 1);
    // add 3 days to get Sunday:
    advent.setDate(advent.getDate() + 3);
    return advent;
}
getAdventDay(2013)
/*
Sun Dec 01 2013 00:00:00 GMT-0500 (Eastern Standard Time)
*/
const getFirstAdvent = function (y) {
    const firstAdvent = new Date(y, 11, 3);
    firstAdvent.setDate(firstAdvent.getDate() - firstAdvent.getDay());
    return firstAdvent;
};
alert(getFirstAdvent(2020));
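This works because December 3rd is the latest possible date for the first Sunday of Advent (the range is 27 November to 3 December), and setDate(getDate() - getDay()) snaps back to the Sunday on or before it. A quick check against the results above (a small sketch):
console.log(getFirstAdvent(2013)); // Sun Dec 01 2013, matching the earlier answers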
I love these challenges; here's how it can be done with recursion. First I find the fourth Sunday, then I just keep subtracting 7 days until I have the other three. The variables firstSunday, secondSunday, thirdSunday and fourthSunday contain the dates.
EDIT: I believe I misunderstood, but the firstSunday variable will be the date you are looking for.
var year = 2011; // new Date().getFullYear();
var sevenDays = (24 * 60 * 60 * 1000) * 7;
var foundDate;
var findClosestSunday = function (date) {
    foundDate = date;
    if (foundDate.getDay() != 0)
        findClosestSunday(new Date(year, 11, date.getDate() - 1));
    return foundDate;
};
// start at Dec 24, not Dec 23: the fourth Sunday of Advent can fall on Christmas Eve itself
var fourthSunday = findClosestSunday(new Date(year, 11, 24));
var thirdSunday = new Date(fourthSunday.getTime() - sevenDays);
var secondSunday = new Date(fourthSunday.getTime() - sevenDays * 2);
var firstSunday = new Date(fourthSunday.getTime() - sevenDays * 3);
console.log(firstSunday, secondSunday, thirdSunday, fourthSunday);
JavaScript works with time in terms of milliseconds since the epoch. There are 1000 * 60 * 60 * 24 * 7 = 604800000 milliseconds in a week.
You can create a new Date in JavaScript that is offset from a known date like this:
var weekTicks, christmas, week0, week1, week2, week3;
weekTicks = 604800000;
christmas = new Date(2013, 11, 25); // months are zero-based: 11 = December (12 would roll into January 2014)
week0 = new Date(christmas - weekTicks);
week1 = new Date(week0 - weekTicks);
week2 = new Date(week1 - weekTicks);
week3 = new Date(week2 - weekTicks);
See how that works for you.
Also, the Date.prototype.getDay method will help you find which day of the month is the first Sunday, as sketched below.
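For example, a sketch of finding the first Sunday of a given month with getDay (firstSunday is a hypothetical helper):
function firstSunday(year, month) {
    var d = new Date(year, month, 1); // start at the 1st of the month
    while (d.getDay() !== 0) { // getDay() returns 0 on Sundays
        d.setDate(d.getDate() + 1);
    }
    return d;
}
console.log(firstSunday(2013, 11)); // Sun Dec 01 2013 (the 1st was itself a Sunday)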

Different ways to initialize Javascript Date object and getting different results each way [duplicate]

Why are these two dates different:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10); // month (from 0-11)
date1.setDate(1); // day of the month (from 1-31)
var date2 = new Date(2012, 10, 1, 0, 0, 0, 0);
Result:
Date 1 : Sat Dec 01 2012 14:56:16 GMT+0100
Date 2 : Thu Nov 01 2012 00:00:00 GMT+0100
whereas these two dates are equal:
var date3 = new Date();
date3.setFullYear(2012); // year (four digits)
date3.setMonth(9); // month (from 0-11)
date3.setDate(1); // day of the month (from 1-31)
var date4 = new Date(2012, 9, 1, 0, 0, 0, 0);
Result:
Date 3 : Mon Oct 01 2012 14:56:16 GMT+0200
Date 4 : Mon Oct 01 2012 00:00:00 GMT+0200
Another question is why date1.setMonth(10) gives a date in December (it should be November).
Finally got it. new Date() sets the date to the current date and time. In other words, October 31st (at the time of this writing).
When you then try to set the month to November, what's it to do? November only has 30 days... so it wraps it round to December.
If you change the order so that you set the day-of-month before the month, it works:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setDate(1); // day of the month (from 1-31)
date1.setMonth(10); // month (from 0-11)
Or as implied by jbabey's answer:
var date1 = new Date();
date1.setFullYear(2012); // year (four digits)
date1.setMonth(10, 1); // month (from 0-11) and day (1-31)
The documentation isn't terribly clear, but it's at least suggestive:
If a parameter you specify is outside of the expected range, setMonth attempts to update the date information in the Date object accordingly. For example, if you use 15 for monthValue, the year will be incremented by 1 (year + 1), and 3 will be used for month.
("Accordingly" is far from precise, but it means the implementation is at least arguably correct...)
setMonth accepts a second parameter:
If you do not specify the dayValue parameter, the value returned from the getDate method is used.
When you set the month to 10 (November), it grabs the current day value (31) and sets that as the day. Since there are only 30 days in November, it rolls you over to December 1st.
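A small sketch of the difference, pinned to a date where the rollover actually happens (October 31st, 2012):
var a = new Date(2012, 9, 31); // Oct 31, 2012
a.setMonth(10); // day stays 31; November has 30 days, so it rolls over
console.log(a.toDateString()); // "Sat Dec 01 2012"
var b = new Date(2012, 9, 31);
b.setMonth(10, 1); // month and day set together
console.log(b.toDateString()); // "Thu Nov 01 2012"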
You're creating a var containing the current date (new Date()) and then changing some of its fields (year, month and day).
On the other hand, new Date(2012, 10, 1, 0, 0, 0, 0) means "create a date object with those exact values".
And that's why your date objects aren't equal.
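In other words, the mutated Date keeps the current time of day; zeroing the time fields makes the two objects compare equal (a minimal sketch):
var date1 = new Date();
date1.setFullYear(2012);
date1.setDate(1); // set the day before the month to avoid rollover
date1.setMonth(10);
date1.setHours(0, 0, 0, 0); // drop the leftover time of day
var date2 = new Date(2012, 10, 1, 0, 0, 0, 0);
console.log(date1.getTime() === date2.getTime()); // true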

Dates before Jan. 01 1970?

I'm trying to write a JavaScript function that calculates the time since Oct 11th, 1910, so I can throw it into a timer for a project I'm working on. I get that JavaScript's milliseconds count from the epoch, but I can't find a way to get the milliseconds since a date earlier than 01.01.1970.
Does anyone have any loose code that can do the above that they may be willing to share?
var oldGoodTimes = new Date(1910, 9, 11); // January = 0
var actualDate = new Date();
console.log(actualDate.getTime() - oldGoodTimes.getTime());
Try this:
var yeOldeTimes = new Date();
yeOldeTimes.setFullYear(1910, 9, 11);
var myNewDate = new Date();
console.log("Milliseconds since Ye Olde Times: " + (myNewDate - yeOldeTimes));
Number of milliseconds since Oct 11th, 1910
console.log(new Date() - new Date(1910, 9, 11)) // month 9 = October (zero-based)
// new Date().valueOf() - milliseconds since 1970
// -(new Date(1910, 9, 11)).valueOf() - milliseconds from 1910 to 1970
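Note that getTime simply returns a negative number for dates before the epoch, so the subtraction needs no special handling (a small sketch):
var old = new Date(1910, 9, 11); // months are zero-based: 9 = October
console.log(old.getTime()); // negative: ms before Jan 1, 1970
console.log(new Date() - old); // positive: ms elapsed since Oct 11, 1910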
