I have a date object:
var thedate = new Date("2012-05-02T11:00:00.000+0000");
When I call getMonth() I get 4, but when I call getDay() I get 3. I want getDay to return what's reflected in the original string (2). I could just subtract 1 from getDay(), but I'm not sure that's the right way to do it, or whether it works for all dates.
According to MDN, getMonth will return a number in the range 0-11 (so 0 is for January), and getDay will return the day of the week in the range 0-6 (so 0 is for Sunday). If you want to get the day in month, you should use getDate, which will return a number in the range 1-31.
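A minimal sketch with the date string from the question (getDay() and getDate() use local time, so the exact values assume a time zone where 2012-05-02T11:00Z is still May 2):
var thedate = new Date("2012-05-02T11:00:00.000+0000");
console.log(thedate.getMonth()); // 4 -> May (months are zero-based)
console.log(thedate.getDay());   // 3 -> Wednesday (days of the week are zero-based, 0 = Sunday)
console.log(thedate.getDate());  // 2 -> day of the month, the value from the original string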
getDay() and getMonth() return zero-based indexes, which is why you add 1 when you need the human-readable number.
getDay() Returns the day of the week (from 0-6)
Read here: http://www.w3schools.com/jsref/jsref_obj_date.asp
I have an object with a date property. I set the date to 01 April 2000 and can see in the debugger that it is set correctly to that date. However, when I call getMonth() on the same date object, it returns the month as 3 (March). Why is this happening? Does it have anything to do with UTC or localization, neither of which I am using?
You need to add 1 to the result of the getMonth function:
var month = date.getMonth() + 1
This is normal behavior. Months start at zero (i.e. January is 0), which is why it gives 3 for April. Add 1 to the month to get the usual one-based digit value.
I guess JavaScript starts months at zero to help with indexing: if you keep an array of month names as strings, you can use getMonth() directly as the index and don't have to worry about fetching the correct month name.
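For example, with a zero-based month index you can index an array of month names directly (the array below is just for illustration):
var monthNames = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];
var d = new Date(2000, 3, 1); // 1 April 2000
console.log(monthNames[d.getMonth()]); // "April" - no off-by-one adjustment needed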
I'm used to creating Date objects using the fourth syntax from MDN, as in new Date(year, month, day, hours, minutes, seconds, milliseconds). But lately I tried to set a Date object with only a year (as new Date(2017)), and as you might expect it was treated as a single value, so the year was interpreted as a number of milliseconds.
Is there any way to still easily use the year as-is, without changing the syntax, and get a correctly set Date?
Two solutions come to my mind:
(1) Set the year argument to 2017 and set the month argument to 0 when constructing the date:
let d = new Date(2017, 0);
console.log(d.toString());
The arguments will be treated as local time; month and day of month will be January 1; all time components will be set to 0.
(2) Specify "2017T00:00" as the first and only argument when constructing the date:
let d = new Date("2017T00:00");
console.log(d.toString());
According to the current spec this is a valid format, and browsers are supposed to treat it as local time. The behavior is the same as in the previous example.
If you pass a single parameter (number or string), it is interpreted as follows, per the documentation:
value
Integer value representing the number of milliseconds since January 1,
1970 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider
that most Unix time stamp functions count in seconds).
dateString
String value representing a date. The string should be in a format
recognized by the Date.parse() method (IETF-compliant RFC 2822
timestamps and also a version of ISO8601).
Also, per the documentation:
If at least two arguments are supplied, missing arguments are either
set to 1 (if day is missing) or 0 for all others.
You can pass a second parameter as 0 or null (or the actual month value you want to set):
new Date(2017,0);
Demo
var date = new Date(2017,0);
console.log( date );
You could pass null as second argument:
new Date(2017, null);
However, without knowing the details of how missing values are interpreted, what do you think happens now? Will the date be initialized with the current month, day, hour, etc? Or something different?
Better to be explicit and pass all arguments, so that you know what the code is doing half a year later.
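For instance, a fully explicit call (spelling out the defaults, as a sketch) looks like this:
var d = new Date(2017, 0, 1, 0, 0, 0, 0); // 1 January 2017, 00:00:00.000 local time
console.log(d.toString());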
I have another suggestion. You can just create a regular Date object and set its year, so at least you know what to expect for the rest of the Date object's values.
var year = "2014";
var date = new Date();
date.setFullYear(year);
// console.log(date) => Sat Dec 27 2014 16:25:28 GMT+0200
Further reading - Date.prototype.setFullYear()
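Note that setFullYear also accepts optional month and day arguments, so you can pin those down at the same time (a sketch):
var date = new Date();
date.setFullYear(2014, 0, 1); // 1 January 2014, time-of-day components unchanged
console.log(date);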
I am using Moment.js for adding dates to an option list I am making, so that I can use those dates to show available appointments, for example, someone can select Friday, February 3rd from the option list and then a list of available times will show up for February 3rd.
What I am having trouble with is that I am using an online scheduling API that takes month values starting at 1 (January is 01, February is 02, etc.), but moment.js months start at 0 and only go up to 11 (January is 0, February is 1, etc.). I need to convert the value from moment.js into an integer so I can add 1 to it to account for the difference.
The real problem is I tried using parseInt(Month) to get the int value to add one to it, but that didn't work. Here is my code attempting to do just that:
var d = moment(),
    Month = d.month();
var GetMonthint = parseInt(Month),
    GetMonth = GetMonthint++;
GetAppointmentDates(GetMonth, Year);
GetMonth still only returns 1 for February, is there some special way I can get the int value from the month?
The line
GetMonth = GetMonthint++;
is the problem. The postfix ++ operator returns the original value, not the incremented value. You should do this:
GetMonth = GetMonthint + 1;
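A quick sketch of the difference (the variable names just mirror the question):
var GetMonthint = 1;          // February, zero-based as moment's .month() returns it
var wrong = GetMonthint++;    // wrong is 1: postfix ++ yields the old value (and mutates GetMonthint)
GetMonthint = 1;              // reset for comparison
var right = GetMonthint + 1;  // right is 2: plain addition, GetMonthint is left untouched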
I am trying to convert a UNIX timestamp to a JavaScript date object. It works perfectly fine for all other inputs except the following,
dt = new Date(1421020800 * 1000);
dt.getMonth(); // yields '0'
For some mysterious reason, the output value for the month is 0, whereas the expected value, per this link, should be 1.
Does the month in Javascript date object starts with 0?
It's normal; you can read it on W3Schools (or anywhere else):
The getMonth() method returns the month (from 0 to 11) for the
specified date, according to local time.
Note: January is 0, February is 1, and so on.
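So for the timestamp in the question, a minimal sketch of the usual fix:
var dt = new Date(1421020800 * 1000); // 2015-01-12T00:00:00Z
console.log(dt.getMonth());           // 0 -> January
console.log(dt.getMonth() + 1);       // 1 -> the one-based month number most other APIs expect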
I am trying to format a date in JavaScript, but the Date object is returning the wrong date unless I use toUTCString(), which returns the correct date. I've tried different ways of passing the date to the Date() constructor, and both the get and getUTC functions to read the date. I've also tried different browsers (Chrome, Safari, Firefox), and what makes it even more confusing is that if I do it in Chrome's inspector it works perfectly. Am I missing something obvious?
var d = new Date(1324141200000);
// returns "Sat, 17 Dec 2011 17:00:00 GMT" - Correct!
alert(d.toUTCString());
// returns "6-11-2011" - Wrong!
alert(d.getUTCDay() +'-'+ d.getUTCMonth() +'-'+ d.getUTCFullYear());
The getUTCDay() function returns the day of the week, and months are numbered from zero. Saturday is day 6 (days of the week are indexed from Sunday = 0), and 11 is the twelfth month counting from zero.
Thus, all is well.
The day of the month can be retrieved with "d.getUTCDate()".
d.getUTCDay() // day of week
d.getUTCMonth() // zero based index
Instead of getUTCDay, you want getUTCDate. And getUTCMonth returns 0-11 (0 = January). Section 15.9.1 of the specification may help, but the language is heavy-going.
Use getFullYear() to get the year, getMonth() to get the month, and getDate() to get the day.
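Putting that together for the timestamp above, using the UTC variants since the question compares against toUTCString() (the dd-mm-yyyy layout is just an assumption):
var d = new Date(1324141200000);
var formatted = d.getUTCDate() + '-' + (d.getUTCMonth() + 1) + '-' + d.getUTCFullYear();
console.log(formatted); // "17-12-2011"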