I'm used to creating Date objects by using the fourth syntax from MDN, as in new Date(year, month, day, hours, minutes, seconds, milliseconds). But lately I tried to set a Date object with only a year (as new Date(2017)), and as you might expect it was treated as a value argument, so the year was interpreted as a number of milliseconds.
Is there any way to still easily use the year as is, without changing the syntax, and get a correctly set Date?
Two solutions come to my mind:
(1) Set the year argument to 2017 and set the month argument to 0 when constructing the date:
let d = new Date(2017, 0);
console.log(d.toString());
The arguments will be treated as local time; month and day of month will be January 1; all time components will be set to 0.
(2) Specify "2017T00:00" as the first and only argument when constructing the date:
let d = new Date("2017T00:00");
console.log(d.toString());
According to current specs this is a valid format and browsers are supposed to treat it as local time. The behavior is the same as that of the previous example.
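If you want to convince yourself that the two give the same instant, a quick check (assuming an engine that parses "2017T00:00" per the current spec):
let a = new Date(2017, 0);       // year + month arguments, local time
let b = new Date("2017T00:00");  // year-only string with an explicit local time
console.log(a.getTime() === b.getTime()); // true in spec-compliant engines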
If you pass a single parameter (number or string), it is interpreted as follows, per the documentation:
value
Integer value representing the number of milliseconds since January 1,
1970 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider
that most Unix time stamp functions count in seconds).
dateString
String value representing a date. The string should be in a format
recognized by the Date.parse() method (IETF-compliant RFC 2822
timestamps and also a version of ISO8601).
Also, as per the docs:
If at least two arguments are supplied, missing arguments are either
set to 1 (if day is missing) or 0 for all others.
You can pass one more parameter as 0 or null (or the actual value you want to set)
new Date(2017,0);
Demo
var date = new Date(2017,0);
console.log( date );
You could pass null as second argument:
new Date(2017, null);
However, without knowing the details of how missing values are interpreted, what do you think happens now? Will the date be initialized with the current month, day, hour, etc? Or something different?
Better be explicit and pass all arguments, so that you know what the code is doing half a year later.
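For example, a fully explicit construction (a sketch; the values are arbitrary):
// Year, month (0-based), day, hours, minutes, seconds, milliseconds
var d = new Date(2017, 0, 1, 0, 0, 0, 0);
console.log(d.toString()); // Jan 1 2017, 00:00:00 local time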
I have another suggestion. You can just create a regular date object and set its year. That way you at least know what to expect for the rest of the Date object's values.
var year = "2014";
var date = new Date();
date.setFullYear(year);
// console.log(date) => Wed Dec 27 2014 16:25:28 GMT+0200
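If you also want the month and day to be deterministic, setFullYear optionally accepts them as extra arguments; only the time-of-day part is then left over from "now" (a sketch):
var d = new Date();          // current date and time
d.setFullYear(2014, 0, 1);   // year, month (0-based) and day in one call
// The hours/minutes/seconds still come from the moment the object was created
console.log(d.toString());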
Further reading - Date.prototype.setFullYear()
Related
Is there a way to indicate only the date portion of a Date() object, without indicating the time?
e.g.
var d = new Date();
d.setFullYear(2015, 0, 13);
d.toString();
"Tue Jan 13 2015 00:00:00 GMT-0500 (EST)" // Wrong - I didn't set a time!
"Tue Jan 13 2015 NULL GMT-0500 (EST)" // Expected Result
I want to be able to tell the difference between a user who only entered the Date portion and one who explicitly entered both a Date and a Time.
Not really. A Javascript Date object always has a time. You can leave it at midnight and ignore it if you want, but it'll still be there. It's up to you how you interpret it.
If you want to be able to represent a null time, you could interpret midnight to mean that, though then you would have no way to represent times that actually are midnight. If you want to be able to have a null time and still represent every possible time you would need to have two variables.
You could have:
// Date with null time
var date = new Date(2015, 0, 13); // time component ignored
var time = null;
// Date with non-null time
var date = new Date(2015, 0, 13); // time component ignored
var time = new Date(1970, 0, 1, 9, 30); // date component ignored
Note in the second example the year, month and day in the time component are arbitrary and won't be used, but they still need to be there if you want to create a Date object.
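If you later need a single instant again (say, for storage), you could merge the two variables back together; a minimal sketch assuming the convention above (combine is just an illustrative helper name):
function combine(date, time) {
  if (time === null) {
    return new Date(date.getTime()); // date-only: keep midnight
  }
  return new Date(date.getFullYear(), date.getMonth(), date.getDate(),
                  time.getHours(), time.getMinutes());
}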
JavaScript Date objects are internally defined using number of milliseconds since January 1, 1970 UTC. Therefore you are stuck with the time part.
Try this code
var date = new Date(2015, 0, 13);
console.log(date.valueOf());
You'll get output like this (the exact number depends on your timezone):
1421125200000
Here is the standard definition...
ECMAScript Language Spec (see page 165)
From ECMA standard:
A Date object contains a Number indicating a particular instant in time to within a millisecond. Such a Number
is called a time value. A time value may also be NaN, indicating that the Date object does not represent a
specific instant of time.
Time is measured in ECMAScript in milliseconds since 01 January, 1970 UTC. In time values leap seconds
are ignored. It is assumed that there are exactly 86,400,000 milliseconds per day. ECMAScript Number values
can represent all integers from –9,007,199,254,740,992 to 9,007,199,254,740,992; this range suffices to
measure times to millisecond precision for any instant that is within approximately 285,616 years, either
forward or backward, from 01 January, 1970 UTC.
The actual range of times supported by ECMAScript Date objects is slightly smaller: exactly –100,000,000
days to 100,000,000 days measured relative to midnight at the beginning of 01 January, 1970 UTC. This gives
a range of 8,640,000,000,000,000 milliseconds to either side of 01 January, 1970 UTC.
The exact moment of midnight at the beginning of 01 January, 1970 UTC is represented by the value +0
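You can see those limits for yourself (a quick sketch; 8640000000000000 is the 100,000,000-day bound expressed in milliseconds):
console.log(new Date(8640000000000000).toISOString()); // +275760-09-13T00:00:00.000Z
console.log(new Date(8640000000000001).toString());    // "Invalid Date" - out of range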
Dates are objects. As such, you can add properties to them as needed:
var date_only = new Date("2015-03-04");
date_only.time_is_meaningful = false;
var date_with_time = new Date("2015-03-04T10:47");
date_with_time.time_is_meaningful = true;
This is simpler and cleaner than your millisecond hack, and more
convenient than having two separate variables. You could then, if you
wish, subclass Date with a custom toString that checks the
time_is_meaningful property and acts accordingly.
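A minimal sketch of such a subclass (time_is_meaningful is just the ad-hoc property from the example above):
class DateMaybeTime extends Date {
  toString() {
    // Fall back to the date-only rendering when the time carries no meaning
    return this.time_is_meaningful ? super.toString() : this.toDateString();
  }
}
var d = new DateMaybeTime("2015-03-04T10:47");
d.time_is_meaningful = false;
console.log(d.toString()); // "Wed Mar 04 2015"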
You cannot remove the time information from a Date.
If you want to have both pieces of information independently, use a Date for the date and ignore the time (e.g. ensure it's always exactly midnight), and use another field to hold the time (it could be a Date where you ignore the date but not the time, or it could be one input text with formatted time, or several input texts with hour, minute, etc.).
The UI representation is up to you.
Why does a string of numbers work differently than actual numbers in a new Date():
var myfirstDate = new Date("2013, 10, 15"); //returns Tue Oct 15 2013 00:00:00 GMT-0500 (CDT)
var mysecondDate = new Date(2013, 9, 15); // also returns Tue Oct 15 2013 00:00:00 GMT-0500 (CDT)
myfirstDate.value == mysecondDate.value; //returns true
I looked at several tutorials and the idea of having a string like myfirstDate above isn't even mentioned. Does javascript automatically parse the string?
See the docs.
You're effectively invoking two different constructors.
The first one is parsed as a human-readable date:
new Date(dateString)
The second one expects two or more parameters, providing a year, a 0-based month number, and optionally a day and time components:
new Date(year, month[, day[, hour[, minute[, second[, millisecond]]]]]);
year
Integer value representing the year. For compatibility (in order to avoid the Y2K problem), you should always specify the year in full; use 1998, rather than 98.
month
Integer value representing the month, beginning with 0 for January to 11 for December.
day
Integer value representing the day of the month (1-31).
Until ES5, parsing of date strings was entirely implementation dependent, though there were one or two strings that were consistently parsed by several browsers. ES5 introduced parsing of a version of ISO8601, however it's not supported by all browsers in use.
It is best to manually parse date strings to ensure they are correct. There are various libraries to assist with that, but it isn't difficult (2 lines of code).
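For example, for an ISO-style yyyy-mm-dd string the parsing really is just a couple of lines (a sketch with no validation):
var parts = "2013-10-15".split("-");
var date = new Date(+parts[0], parts[1] - 1, +parts[2]); // local midnight; month is 0-based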
Incidentally, there is no Date.prototype.value method, so likely you are comparing undefined with itself. You should be comparing the time value, so:
myfirstDate.getTime() == mysecondDate.getTime();
or, coercing both to numbers:
+myfirstDate == +mysecondDate;
Oh, to answer the question: when the Date function is called as a constructor with a single string argument, it is treated as a date string and parsed (see above). So "10" represents October.
When Date is called as a constructor with more than one argument, they are treated as date values so 9 is treated as October since month arguments are zero indexed (0=January, 1=February, etc.).
I was trying to convert a date object into a long format (maybe in milliseconds) as we do in Java.
So to fulfill my need, after some trial and error, I found the below way which works for me:
var date = new Date();
var longFormat = date*1; // don't know what it does internally
console.log(longFormat); // output was 1380625095292
To verify, I reversed it using new Date(longFormat) and it gave me the correct output. In short, I was able to fulfill my need somehow, but I am still unclear about what the multiplication does internally. When I tried to multiply the current date by 2, it gave me some date in the year 2057!! Does anyone know what exactly is happening?
The long format is the number of milliseconds since 01.01.1970, so for now that's about 43 years' worth.
The * operator forces its argument to be converted to a number; a Date object provides that conversion, yielding the same value as getTime().
Doubling the number of milliseconds gives you roughly 43 more years, hence the year 2057 (or so).
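Roughly speaking (a sketch):
var now = Date.now();               // milliseconds since 1 January 1970 UTC
var doubled = new Date(now * 2);    // about twice as far from 1970 as today
console.log(doubled.getFullYear()); // ~2057 at the time the question was asked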
What you are getting when you multiply is ticks.
Visit: How to convert JavaScript date object into ticks
Also, when you * 2 it, you get double the number of ticks, so the date is in the future.
var date = new Date()
var ticks = date.getTime()
ref: Javascript Date Ticks
getTime returns the number of milliseconds since January 1, 1970. So when you * 1 it, you get the value of those milliseconds. When you * 2 it, those milliseconds are doubled, and you get a date around 2057!
Dates are internally stored as a timestamp, i.e. a (long) number of milliseconds (more info on timestamps). This is why you can create Dates with new Date(milliseconds). If you try to multiply a Date by an integer, this is what happens:
var date = new Date();
var longFormat = date*1;
// date*1 => date.getTime() * 1
console.log(longFormat); // output is 1380.....
JavaScript converts the date to the simplest form that can be multiplied by the factor 1, which in this case is the internal millisecond timestamp.
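Concretely, all of the following print the same number (a small sketch):
var date = new Date();
console.log(date.getTime());  // explicit
console.log(date.valueOf());  // what number conversion uses under the hood
console.log(date * 1);        // coerced to a number, then multiplied by 1
console.log(+date);           // unary plus coerces the same way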
Just use the Date object's methods.
Read the docs: JavaScript Date object
var milliseconds = yourDateObject.getTime(); // milliseconds since 1 January 1970 UTC
If you want to get ticks:
var ticks = ((yourDateObject.getTime() * 10000) + 621355968000000000);
or
var ticks = someDate.getTime();
JavaScript date objects are based on a UTC time value that is milliseconds since 1 January 1970. Java happens to use the same epoch and the same millisecond unit (Unix timestamps, by contrast, count seconds).
To get the time value, the getTime method can be used, or a mathematic operation can be applied to the date object, e.g.
var d = new Date();
alert(d.getTime()); // shows time value
alert(+d); // shows time value
The Date constructor also accepts a time value as an argument to create a date object, so to copy a date object you can do:
var d2 = new Date(+d);
If you do:
var d3 = new Date(2 * d);
you are effectively creating a date that is (very roughly):
1970 + (2013 - 1970) * 2 = 2056
You could try the parsing functionality of the Date constructor, whose result you then can stringify:
> new Date("04/06/13").toString()
"Sun Apr 06 1913 00:00:00 GMT+0200"
// or something
But the parsing is implementation-dependent, and there won't be many engines that interpret your odd DD/MM/YY format correctly. If you had used MM/DD/YYYY, it probably would be recognized everywhere.
Instead, you want to be sure how it is parsed, so you have to do it yourself and feed the individual parts into the constructor:
var parts = "04/06/13".split("/"),
date = new Date(+parts[2]+2000, parts[1]-1, +parts[0]);
console.log(date.toString()); // Tue Jun 04 2013 00:00:00 GMT+0200
I need to convert date to Java epoch and then read it and convert back. Not sure what I'm doing wrong here?
var date = new Date('1/3/2013');
var timeStamp = date.getTime();
console.log(timeStamp);
var revertDate = new Date(timeStamp);
console.log(revertDate.getDate()+'/'+revertDate.getMonth()+'/'+revertDate.getFullYear());
The output is 3/0/2013 instead of 1/3/2013?
fiddle link
You've got two problems here:
The Date constructor is assuming M/d/yyyy format - whereas you're logging d/M/yyyy format. Personally I'd suggest using an ISO-8601 format if at all possible: yyyy-MM-dd
You're not taking into account the fact that getMonth() returns a 0-based value
For the formatting side, you'd be better off using toISOString or something similar, rather than doing the formatting yourself.
(Note that looking at the documentation for the Date constructor it's not clear that the code you've got should work at all, as it's neither an RFC822 nor ISO-8601 format.)
Neither of the problems are to do with converting between Date and a numeric value. If you change your logging, you'll see that clearly:
var date = new Date('1/3/2013');
var timeStamp = date.getTime();
console.log(date);
var revertDate = new Date(timeStamp);
console.log(revertDate);
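As suggested above, toISOString gives an unambiguous rendering; note that it is expressed in UTC (a sketch):
var date = new Date(2013, 0, 3); // 3 Jan 2013, local time
console.log(date.toISOString()); // e.g. "2013-01-03T00:00:00.000Z" if your offset is UTC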
var date = new Date('1/3/2013');
The Date constructor parses the given string this way:
Month / Day / Year
So, in this case, Month is 1, Day is 3 and Year is 2013. What's going on there? It's quite simple: the European representation of a date (Day / Month / Year) isn't the one used by the Date constructor, so it parses the 1 (the month) as January, the 3 as the third day of the month (the third of Jan), and the year correctly as 2013. Then, because of its 0-based month indexing, getMonth() on the constructed Date object returns a number one less than the month you wrote. That's why you're getting 3/0/2013: it is the third day (3) of month 0 (which is January) of 2013. If you want to get your real date you have to do this:
var date = new Date('3/1/2013');
console.log(date.getDate()+'/'+(date.getMonth()+1)+'/'+date.getFullYear());
I am facing a weird problem while initializing a JavaScript Date object: no matter what I initialize it to, it shows the date as 1 JAN 1970 05:30.
This is the way I try to initialize it:
var d=new date(27-02-1989);
Alerting 'd' shows 1 JAN 1970...; also, sometimes it takes a date passed from the database, but in the mm/dd/yyyy format rather than the format I want, i.e. dd/mm/yyyy.
This problem has suddenly popped up; everything was working smoothly a couple of days ago, but today, after opening the project again (after 2 days), this issue is irritating me.
I see you've accepted an answer, but it isn't the best you can do. There is no one format that is parsed correctly by all browsers in common use, the accepted answer will fail in IE 8 at least.
The only safe way to convert a string to a date is to parse it, e.g.
var s = '27-02-1989';
var bits = s.split('-');
var date = new Date(bits[2], --bits[1], bits[0]);
// Transform the European dd-mm-yyyy date into ISO-like yyyy-mm-dd order
var date = '27-02-1989'.split('-').reverse().join('-');
// And this works
var d = new Date( date );
You're doing an initialization with a negative integer value (27-02-1989 == -1964). The Date object's constructor takes arguments listed here.
If you want to pass strings, they need to be in an RFC2822-compliant format (see here).
According to here, you can try:
new Date()
new Date(milliseconds)
new Date(dateString)
new Date(year, month, day [, hour, minute, second, millisecond ])
So for your case, use (edit: you need to remember that months are zero-based):
var d = new Date(1989, 1, 27);
Please notice: use Date (with a capital D).
First of all
var d=new date(27-02-1989);
is a totally wrong expression in JavaScript; moreover, even if we rewrite it more correctly:
var d=new Date('27-02-1989');
there is no way to parse this date string natively in JS.
Here are solutions you can try:
transform the string to ISO 8601 (YYYY-MM-DD); this can be parsed by most modern browsers, or you can use one of many JS libraries as a polyfill
split the string by '-' and then use the Date constructor: new Date(year, month - 1, day)
split the string and use the setDate, setMonth and setFullYear methods on a new Date() object (see the sketch below)
Note that in the last two methods you need to subtract 1 from the month value, because months are zero-based (0 stands for January, 11 for December).
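A sketch of the last approach (setYear is deprecated, so setFullYear is used instead; the split order assumes dd-mm-yyyy):
var bits = '27-02-1989'.split('-');
var d = new Date();
d.setFullYear(+bits[2], bits[1] - 1, +bits[0]); // year, 0-based month, day of month
d.setHours(0, 0, 0, 0);                         // clear the time-of-day part
console.log(d.toString());                      // Mon Feb 27 1989 00:00:00 ... (local time)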