I am trying to use getTime() to sort my date strings, but it is returning some odd values, e.g.:
1428303000000 16/06/2014 16:50
1389074040000 01/07/2014 16:54
The first date is earlier than the second, so its millisecond value should also be smaller.
You can also check it on http://www.tutorialspoint.com/cgi-binpractice.cgi?file=javascript_178
I don't know why it is behaving like this. Any help?
Test whether you are creating the Date object correctly. Note that the month argument is 0-indexed:
// new Date(year, month, day, hour, minute, second, millisecond);
// Working with date 16/06/2014 16:50 (month 5 = June)
var foo = new Date(2014, 5, 16, 16, 50);
foo.getTime(); // the exact value depends on your timezone
// Working with date 01/07/2014 16:54 (month 6 = July)
var bar = new Date(2014, 6, 1, 16, 54);
bar.getTime(); // a larger value, since 1 July is later than 16 June
Read more in the Date object reference.
Until we see your code and how you get from "16/06/2014 16:50" to "1428303000000", I can't help more.
You're probably creating the date using 16/06/2014 and intending this to mean the 16th day of the 6th month. However, this is not how it's parsed. The first element is treated as the month; the second element is the day. Since there aren't 16 months in a year, the date is rounded forward to the next year (i.e. the 16th month of 2014 is the 4th month of 2015).
In other words:
Date.parse("16/06/2014 16:50") === Date.parse("04/06/2015 16:50"); // => true
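To parse a DD/MM/YYYY string reliably, split it yourself and feed the pieces to the Date constructor. This is a minimal sketch; the helper name `parseDmy` and the split logic are mine, not from the original post:

```javascript
// Hypothetical helper: parse a "DD/MM/YYYY HH:mm" string into a Date.
function parseDmy(str) {
  var parts = str.split(' ');              // ["16/06/2014", "16:50"]
  var d = parts[0].split('/');             // ["16", "06", "2014"]
  var t = (parts[1] || '0:0').split(':');  // ["16", "50"]
  // new Date(year, monthIndex, day, hours, minutes); monthIndex is 0-based
  return new Date(+d[2], +d[1] - 1, +d[0], +t[0], +t[1]);
}

parseDmy("16/06/2014 16:50").getTime() < parseDmy("01/07/2014 16:54").getTime(); // true
```

With explicit parsing, the two timestamps from the question sort in the expected order.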
I have a column that contains a date with the hour, and I was trying to create a Date variable with the same day, month, year, and hour so I could compare it with that column. That didn't work, so I then tried creating the same date but ignoring the hour when comparing, and I am facing difficulties with that too. Either of the two approaches would be fine.
I wrote several other versions, but none of them worked; this was the last one:
var date = new Date();
var year = date.getFullYear(); // getYear() is deprecated and returns year - 1900
var month = date.getMonth() + 1;
if (month.toString().length == 1) { month = '0' + month; }
var day = date.getDate();
if (day.toString().length == 1) { day = '0' + day; }
var dateString = month + '/' + day + '/' + year;
Logger.log(dateString);
I'm using JavaScript in Google Apps Script.
Thank you!
From MDN
The first step is to create a Date object:
let today = new Date()
let birthday = new Date('December 17, 1995 03:24:00')
let birthday = new Date('1995-12-17T03:24:00')
let birthday = new Date(1995, 11, 17) // the month is 0-indexed
let birthday = new Date(1995, 11, 17, 3, 24, 0)
let birthday = new Date(628021800000) // passing epoch timestamp
You can create your Date object following whichever example above fits best. I also recommend taking a good look at this page.
The second step: from there, you can use Date.now(). As explained here, this returns "A Number representing the milliseconds elapsed since the UNIX epoch."
The third step is comparing both numbers: the smaller one is the earlier date, and vice versa.
If some dates don't have a time, I would treat it as midnight. Using the default Date format, that would look like this:
yyyy-mm-ddThh:mm:ssZ
Ex:
2022-02-21T09:39:23Z
The Z at the end means UTC+0.
More about this on this link.
So, a date without time would be:
2022-02-21T00:00:00Z
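Putting the three steps together, here is a minimal sketch for comparing two dates while ignoring the time of day. The helper name `stripTime` is mine, not from the post:

```javascript
// Hypothetical helper: reset a copy of the date to local midnight,
// so that only the calendar day takes part in the comparison.
function stripTime(date) {
  var copy = new Date(date.getTime());
  copy.setHours(0, 0, 0, 0);
  return copy;
}

var stored = new Date(2022, 1, 21, 9, 39, 23); // 21 Feb 2022, 09:39:23 local
var other  = new Date(2022, 1, 21, 18, 0, 0);  // same day, different time

stripTime(stored).getTime() === stripTime(other).getTime(); // true
```

Copying the date before calling setHours keeps the original value intact, since Date objects are mutable.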
I'm used to creating Date objects with the fourth syntax from MDN, new Date(year, month, day, hours, minutes, seconds, milliseconds). But recently I tried to create a Date with only a year (new Date(2017)), and as you might expect, the single argument was treated as a timestamp, so the year was interpreted as a number of milliseconds.
Is there any way to use just the year, without changing the syntax, and still get a correctly set Date?
Two solutions come to my mind:
(1) Set the year argument to 2017 and set the month argument to 0 when constructing the date:
let d = new Date(2017, 0);
console.log(d.toString());
The arguments will be treated as local time; month and day of month will be January 1; all time components will be set to 0.
(2) Specify "2017T00:00" as the first and only argument when constructing the date:
let d = new Date("2017T00:00");
console.log(d.toString());
According to the current spec this is a valid format, and browsers are supposed to treat it as local time. The behavior is the same as in the previous example.
If you pass a single parameter (number or string), it is interpreted as follows, per the docs:
value
Integer value representing the number of milliseconds since January 1,
1970 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider
that most Unix time stamp functions count in seconds).
dateString
String value representing a date. The string should be in a format
recognized by the Date.parse() method (IETF-compliant RFC 2822
timestamps and also a version of ISO8601).
The docs also say:
If at least two arguments are supplied, missing arguments are either
set to 1 (if day is missing) or 0 for all others.
You can pass one more parameter as 0 or null (or the actual value you want to set)
new Date(2017,0);
Demo
var date = new Date(2017,0);
console.log( date );
You could pass null as the second argument:
new Date(2017, null);
However, without knowing the details of how missing values are interpreted, what do you think happens now? Will the date be initialized with the current month, day, hour, etc? Or something different?
Better be explicit and pass all arguments, so that you know what the code is doing half a year later.
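For example, spelling out all seven arguments leaves nothing to the defaults:

```javascript
// Explicit: year, month (0-based), day, hours, minutes, seconds, milliseconds
var d = new Date(2017, 0, 1, 0, 0, 0, 0);
d.getFullYear(); // 2017
d.getMonth();    // 0 (January)
d.getDate();     // 1
```

Anyone reading this later can see at a glance that the date is local midnight on 1 January 2017.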
I have another suggestion: create a regular Date object and set its year. That way you at least know what the rest of the Date object's values will be.
var year = "2014";
var date = new Date();
date.setFullYear(year);
// console.log(date) => Wed Dec 27 2014 16:25:28 GMT+0200
Further reading - Date.prototype.setFullYear()
I am using Moment.js for adding dates to an option list I am making, so that I can use those dates to show available appointments, for example, someone can select Friday, February 3rd from the option list and then a list of available times will show up for February 3rd.
What I am having trouble with is I am using an online scheduling api that takes month values that start at 1, for example, January is 01, February is 02, etc. but moment.js months start at 0 and only go up to 11, for example, January is 0, February is 1, etc. I need to be able to convert the values from moment.js into an integer so I can add 1 to each to account for the difference.
The real problem is I tried using parseInt(Month) to get the int value to add one to it, but that didn't work. Here is my code attempting to do just that:
var d = moment(),
Month = d.month();
var GetMonthint = parseInt(Month),
GetMonth = GetMonthint++;
GetAppointmentDates(GetMonth, Year);
GetMonth still returns 1 for February. Is there some special way to get the int value from the month?
The line
GetMonth = GetMonthint++;
is the problem. The postfix ++ operator returns the original value, not the incremented one. You should do this instead:
GetMonth = GetMonthint + 1;
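The difference is easy to see in isolation: postfix ++ yields the value before the update, while prefix ++ yields the updated value.

```javascript
var a = 1;
var post = a++; // post gets the old value: 1; a is now 2
var b = 1;
var pre = ++b;  // pre gets the new value: 2; b is now 2
```

Since the incremented `a` was never used in the original code, plain `+ 1` is the clearer fix anyway.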
I would like to add 24 hours to the current Unix timestamp in Node.js or in JavaScript. I would also like to know whether there is a direct function or property on the Date object for this.
I found the relevant function in PHP. This code returns a new Unix time after adding 24 hours to the current Unix timestamp:
$currentUnixTime = time();
$newUnixTime = strtotime('+1 day', $currentUnixTime);
return $newUnixTime;
var myDate = new Date();
myDate.setHours(myDate.getHours() + 24);
return myDate; // myDate.getTime() gives the Unix timestamp in milliseconds
If you specifically want to add one day, you should use the momentjs library, which is available both for the frontend and for Node.js.
var now = new Date()
var tomorrow = moment(now).add(1, 'day');
This is more robust than adding 24 hours because it takes into account DST changes. For that sole reason you should avoid direct Date manipulation in JS in most of the cases.
http://momentjs.com/docs/#/manipulating/add/
Special considerations for months and years
If the day of the month on the original date is greater than the
number of days in the final month, the day of the month will change to
the last day in the final month.
moment([2010, 0, 31]); // January 31
moment([2010, 0, 31]).add(1, 'months'); // February 28
There are also special considerations to keep in mind when adding time
that crosses over daylight saving time. If you are adding years,
months, weeks, or days, the original hour will always match the added
hour.
var m = moment(new Date(2011, 2, 12, 5, 0, 0)); // the day before DST in the US
m.hours(); // 5
m.add(1, 'days').hours(); // 5
If you are adding hours, minutes, seconds, or milliseconds, the
assumption is that you want precision to the hour, and will result in
a different hour.
var m = moment(new Date(2011, 2, 12, 5, 0, 0)); // the day before DST in the US
m.hours(); // 5
m.add(24, 'hours').hours(); // 6
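For completeness, if you do stay with plain Date, a calendar day (rather than a fixed 24 hours) can be added with setDate. It rolls over month and year boundaries automatically, and because it works in local time it keeps the wall-clock hour across DST transitions. A minimal sketch:

```javascript
var d = new Date(2014, 0, 31, 5, 0, 0); // 31 Jan 2014, 05:00 local
d.setDate(d.getDate() + 1);             // rolls over into February
d.getMonth(); // 1 (February)
d.getDate();  // 1
d.getHours(); // 5
```

The momentjs caution above still applies when you need precise elapsed-time arithmetic instead of calendar-day arithmetic.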
My js code is as follows:-
var earlierdate=new Date(2012,09,22);
alert(earlierdate.getDay());
var date2 = new Date('2012,09,22');
alert(date2.getDay());
The problem is that the first alert gives me 1 (Monday), which is incorrect, while the second one gives me 6 (Saturday), which is correct. So the date given in quotes produces the correct result. Now, what if I want to use variables instead of hard-coded values, like
var date1 = new Date(a,b,c);
alert(date1.getDay());
What should be the syntax. I have tried a lot of variations but failed.
Thanks in advance.
The month parameter of Date is 0-indexed.
month
Integer value representing the month, beginning with 0 for January to
11 for December.
So if you mean September, that should be var earlierdate=new Date(2012, 8, 22);
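With the corrected month index, getDay() agrees with the string-parsed version from the question:

```javascript
var earlierdate = new Date(2012, 8, 22); // month 8 = September
earlierdate.getDay(); // 6 (Saturday)
```

The same works with variables, as long as the month variable holds the 0-based index (or you subtract 1 from a human-readable month number).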
//Option 1
var myDate=new Date();
myDate.setFullYear(2010,0,14);
//Option 2 (Neater)
var myDate=new Date(2010,0,14);
This will set the date to 14 January 2010.