I was trying to convert a date object into a long format (maybe in milliseconds) as we do in Java.
To fulfill my need, after some trial and error, I found the way below, which works for me:
var date = new Date();
var longFormat = date*1; // don't know what it does internally
console.log(longFormat); // output was 1380625095292
To verify, I reversed it using new Date(longFormat); and it gave me the correct output. In short, I was able to fulfill my need somehow, but I am still unclear about what the multiplication does internally. When I tried to multiply the current date by 2, it gave me some date in the year 2057! Does anyone know what exactly is happening?
The long format is the number of milliseconds since 01.01.1970, so for now it's about 43 years' worth.
The * operator forces its argument to be cast to a number; I suppose the Date object implements that cast with something like getTime().
You double the number of milliseconds, so you get 43 more years, hence the year 2057 (or so).
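A minimal sketch of the coercion being described here (plain JavaScript, nothing beyond what the answer already names):
var date = new Date();
// Arithmetic operators coerce the Date to its primitive number value,
// which is the same millisecond count that getTime() returns.
console.log(date * 1 === date.getTime()); // true
console.log(+date === date.getTime());    // true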
What you are getting when you multiply is the number of milliseconds (ticks).
Visit: How to convert JavaScript date object into ticks
Also, when you * 2 it, you get double that value, so the date is in the future.
var date = new Date()
var ticks = date.getTime()
ref: Javascript Date Ticks
getTime returns the number of milliseconds since January 1, 1970. So when you * 1 it, you get the value of those milliseconds. When you * 2 it, those milliseconds are doubled, and you get a date in 2057!
Dates are internally stored as a timestamp, a plain number of milliseconds (more info on timestamps). This is why you can create Dates with new Date(milliseconds). If you try to multiply a Date by an integer, this is what happens:
var date = new Date();
var longFormat = date*1;
// date*1 => date.getTime() * 1
console.log(longFormat); // output is 1380.....
JavaScript tries to find the easiest conversion from the date to something that can be multiplied by the factor 1, which in this case is the internal millisecond timestamp.
Just use the Date object's methods.
Read the docs: JavaScript Date object
var milliseconds = yourDateObject.getMilliseconds(); // note: this returns only the 0–999 millisecond component
If you want to get .NET-style ticks:
var ticks = ((yourDateObject.getTime() * 10000) + 621355968000000000);
or
var ticks = someDate.getTime();
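For illustration, a small sketch of the round trip between a JavaScript time value and .NET-style ticks (10000 ticks per millisecond, and 621355968000000000 ticks between 0001-01-01 and the Unix epoch):
var someDate = new Date();
var ms = someDate.getTime();                          // milliseconds since 1970-01-01 UTC
var ticks = ms * 10000 + 621355968000000000;          // 100-nanosecond intervals since 0001-01-01
var backToMs = (ticks - 621355968000000000) / 10000;  // and back again
// Tick values exceed Number.MAX_SAFE_INTEGER, so round when converting back.
console.log(Math.round(backToMs) === ms); // true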
JavaScript date objects are based on a UTC time value that is milliseconds since 1 January 1970. Java happens to use the same epoch and the same millisecond time value (plain Unix timestamps, by contrast, are usually in seconds).
To get the time value, the getTime method can be used, or a mathematical operation can be applied to the date object, e.g.
var d = new Date();
alert(d.getTime()); // shows time value
alert(+d); // shows time value
The Date constructor also accepts a time value as an argument to create a date object, so to copy a date object you can do:
var d2 = new Date(+d);
If you do:
var d3 = new Date(2 * d);
you are effectively creating a date that is (very roughly):
1970 + (2013 - 1970) * 2 = 2056
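A quick sketch to verify that rough arithmetic (the exact year depends on when you run it):
var d = new Date();
var doubled = new Date(2 * d); // 2 * d coerces d to its millisecond time value
console.log(doubled.getFullYear()); // roughly 1970 + (current year - 1970) * 2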
You could try the parsing functionality of the Date constructor, whose result you then can stringify:
> new Date("04/06/13").toString()
"Sun Apr 06 1913 00:00:00 GMT+0200" // or something
But the parsing is implementation-dependent, and there won't be many engines that interpret your odd DD/MM/YY format correctly. If you had used MM/DD/YYYY, it probably would be recognized everywhere.
Instead, you want to control how it is parsed, so you have to do it yourself and feed the individual parts into the constructor:
var parts = "04/06/13".split("/"),
date = new Date(+parts[2]+2000, parts[1]-1, +parts[0]);
console.log(date.toString()); // Tue Jun 04 2013 00:00:00 GMT+0200
I thought that with the C# call:
var x = DateTime.Now.Ticks
and with the JavaScript call:
var x = Date.now()
I should be getting the same result.
But in C# I am getting: 637593006969672760
While JavaScript returns: 1623750547564
(Those are not from the same moment, but they should be extremely close. However, the two values differ by A LOT.)
I thought both returned the number of milliseconds since the 1st of January 1970, 00:00?
So why are both values so different?
And how can I translate the C# call to JavaScript?
For C# Ticks:
The value of this property represents the number of 100-nanosecond intervals that have elapsed since 12:00:00 midnight, January 1, 0001 in the Gregorian calendar
For JavaScript Date:
JavaScript Date objects represent a single moment in time in a platform-independent format. Date objects contain a Number that represents milliseconds since 1 January 1970 UTC.
Sources:
https://learn.microsoft.com/en-us/dotnet/api/system.datetime.ticks?view=net-5.0
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date
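So the two values differ in both epoch (year 0001 vs. 1970) and resolution (100 nanoseconds vs. 1 millisecond). Below is a minimal sketch of converting a C# tick count to a JavaScript Date; it assumes the ticks came from DateTime.UtcNow (for DateTime.Now you would also have to account for the local UTC offset), and the function name and constants are only for illustration:
const TICKS_AT_UNIX_EPOCH = 621355968000000000; // ticks between 0001-01-01 and 1970-01-01
const TICKS_PER_MILLISECOND = 10000;

function ticksToDate(ticks) {
  // Tick values exceed Number.MAX_SAFE_INTEGER, so expect sub-millisecond rounding.
  return new Date((ticks - TICKS_AT_UNIX_EPOCH) / TICKS_PER_MILLISECOND);
}

console.log(ticksToDate(637593006969672760)); // the C# value quoted in the question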
In C#, to get the number of milliseconds since some point in time (e.g. 1/1/1970), you could use:
var datum = new DateTime(1970,1,1);
var msSinceDatum = DateTime.Now.Subtract(datum).TotalMilliseconds;
Having run that a few seconds ago it gave the answer 1623751729961.4617, whereas Date.now() in JavaScript gave 1623751739058. Should be close enough for your needs.
Note you can also use DateTimeOffset.ToUnixTimeMilliseconds
var msSinceUnix = DateTimeOffset.Now.ToUnixTimeMilliseconds();
Going the other way (C# ticks to a JavaScript date) is a little more involved.
var ticks = 637593490842364954; // ticks retrieved a few moments ago
var ticksToMicrotime = ticks / 10000;
var epochMicrotimeDiff = Math.abs(new Date(0, 0, 1).setFullYear(1));
var date = new Date(ticksToMicrotime - epochMicrotimeDiff);
console.log(date)
I'd like to convert a timestamp into a date:
status_transitions:
{ finalized_at: 1557419382,
marked_uncollectible_at: null,
paid_at: 1557419384,
voided_at: null },
In particular, paid_at: 1557419384. But when I try new Date(1557419384) I do not get the expected result. If I use Date.setMilliseconds() I do.
What's especially strange is the output of Date.setMilliseconds()
const ms = 1557419384;
const fromConstructor = new Date(ms);
const fromFn = new Date();
const strangeOutput = fromFn.setMilliseconds(ms);
console.log(`Milliseconds: ${ms}`);
console.log(`Output from setMilliseconds: ${strangeOutput}`);
console.log(`Date from new Date(ms): ${fromConstructor}`);
console.log(`Date using setMilliseconds(ms): ${fromFn}`);
The output of the code above is:
Milliseconds: 1557419384
Output from setMilliseconds: 1558978824384
Date from new Date(ms): Sun Jan 18 1970 18:36:59 GMT-0600 (Central Standard Time)
Date using setMilliseconds(ms): Mon May 27 2019 12:40:24 GMT-0500 (Central Daylight Time)
Why does creating a new date from a number not yield the same result as setMilliseconds()? Also, why is the output from setMilliseconds() different from the actual milliseconds passed in?
I've read the docs and they seem to imply there should be no difference between these two methods.
Consider the example shown in the setMilliseconds() part of the docs here.
var theBigDay = new Date();
theBigDay.setMilliseconds(100);
Running this right now gives me the value 1557421875100. Notice only the last three digits are 100. Thus, it doesn't set the entire date object to 100, but only sets the milliseconds portion of the value. The rest comes from new Date(), which is the current UTC-based timestamp.
As far as why you don't get the expected result from new Date(1557419384), that timestamp would appear to be in seconds rather than milliseconds. Multiply the value by 1000 and it gives a more reasonable value. (Unix timestamps are commonly expressed in whole seconds, which appears to be the case here.)
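For example, a minimal sketch using the paid_at value from the question:
const paidAtSeconds = 1557419384;              // Unix timestamp in seconds
const paidAt = new Date(paidAtSeconds * 1000); // the Date constructor wants milliseconds
console.log(paidAt.toISOString());             // 2019-05-09T16:29:44.000Z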
Why is the output from setMilliseconds() different from the actual milliseconds passed in?
Check the docs more closely. setMilliseconds takes a parameter that is "A number between 0 and 999, representing the milliseconds", and its description states: "If you specify a number outside the expected range, the date information in the Date object is updated accordingly. For example, if you specify 1005, the number of seconds is incremented by 1, and 5 is used for the milliseconds."
So new Date() creates a date with the current time, and then you add your millisecond value to that.
Why does creating a new date from a number not yield the same result as setMilliseconds()?
I've read the docs and they seem to imply there should be no difference between these two methods.
What you were looking for is setTime. Calling new Date().setTime(milliseconds) sets the date so that it is equal to new Date(milliseconds).
When I try new Date(1557419384) I do not get the expected result.
As Matt Johnson noted, this value appears to be in seconds, not milliseconds. Multiply it by 1000.
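A small sketch contrasting the three approaches (the setMilliseconds result depends on the current time, so that part is only illustrative):
const ms = 1557419384 * 1000;  // the question's timestamp, converted to milliseconds

const viaConstructor = new Date(ms);
const viaSetTime = new Date();
viaSetTime.setTime(ms);        // replaces the entire time value
console.log(viaConstructor.getTime() === viaSetTime.getTime()); // true

const viaSetMs = new Date();
viaSetMs.setMilliseconds(500); // only changes the milliseconds component of "now"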
I understand that dealing with dates, in any environment, can be quite confusing, but I'm in a nightmare with a function that should be a trivial job.
I want to manipulate some dates in different ways, but I get errors or wrong results.
I give below a very simple example made to test the execution; the goal here is to get the date of the beginning of the current month, just to show what happens:
function DateAdjust() {
var newdate = new Date(); //1: 2018-12-12T21:00:20.099Z
newdate = newdate.setDate(1); //2: 1543698020099
newdate=Date(newdate); //3: Wed Dec 12 2018 21:01:43 GMT+0000 (Ora standard dell’Europa occidentale)
var d = newdate.getDate(); //4: newdate.getDate is not a function
}
4 lines, 3 unexpected results (as shown by Firefox's debugger):
1. the starting date has no day-of-week and no timezone;
2. after setting the day, the result is transformed into milliseconds (why?); I do not know if it is correct;
3. converting back to a string gives the original date, unmodified (why?), but now with a weekday and timezone;
4. trying to get the day value throws an error (why?)
My environment:
Win 7 32bits SP1
Firefox 63.0.3 (32 bit)
jquery-2.2.4.min.js
I know these questions are boring, but I hope someone will find a few minutes to clear my mind.
Regarding line 1, the Z at the end is the timezone designation for UTC in ISO 8601 (see Wikipedia).
If the time is in UTC, add a Z directly after the time without a space. Z is the zone designator for the zero UTC offset. "09:30 UTC" is therefore represented as "09:30Z" or "0930Z". "14:45:15 UTC" would be "14:45:15Z" or "144515Z".
Regarding line 2 see the MDN article on setDate (emphasis mine):
The number of milliseconds between 1 January 1970 00:00:00 UTC and the given date (the Date object is also changed in place).
So you can see the 'correct' behavior you probably expect simply by ignoring the return value:
var newdate = new Date(); //1: 2018-12-12T21:00:20.099Z
newdate.setDate(1); //2: 1543698020099
console.log(newdate); //3: 2018-12-01T21:00:20.099Z
Regarding line 3, see MDN article on Date (emphasis mine):
Note: JavaScript Date objects can only be instantiated by calling
JavaScript Date as a constructor: calling it as a regular function
(i.e. without the new operator) will return a string rather than a
Date object; unlike other JavaScript object types, JavaScript Date
objects have no literal syntax.
Regarding line 4, the above also explains this error, since newdate is now a string rather than a Date object.
For what it's worth, I agree with the other commenters. JavaScript's date functions are pretty messy compared to many other modern languages. I strongly recommend using a library like moment, luxon, or date-fns. It'll make your life much easier.
I do recommend using moment.js
But there are 2 problems with your code:
1-
newdate = newdate.setDate(1);
setDate mutates newdate in place and returns the time in milliseconds, not a new Date object. If you just want to set the day, do this instead:
newdate.setDate(1);
2-
newdate=Date(newdate);
Not really sure why you are trying to get a new Date object, but you need the new operator, otherwise you will just get a string:
newdate= new Date(newdate);
Fixing problem 1 should eliminate the need for the code of problem 2
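Putting both fixes together, here is a sketch of the corrected function (getting the first day of the current month, as the original example intended):
function DateAdjust() {
  var newdate = new Date();  // current date and time
  newdate.setDate(1);        // mutates in place: move to the 1st of the current month
  var d = newdate.getDate(); // 1
  return newdate;
}
console.log(DateAdjust().toString());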
var newdate = new Date(); // 1
console.log(typeof newdate, newdate); // object Wed Dec 12 2018 23:00:44 GMT+0200 (Eastern European Standard Time)
newdate = newdate.setDate(1); // 2
console.log(typeof newdate, newdate); //number 1543698085383
newdate=Date(newdate); //3
console.log(typeof newdate, newdate); //string Wed Dec 12 2018 23:04:44 GMT+0200 (Eastern European Standard Time)
var d = newdate.getDate(); // 4
console.log(typeof d, d); //
1. A Date object is assigned to newdate.
2. A number is assigned to newdate: the millisecond time value (the "ticks").
3. Date() called without new returns a string, so newdate is now a string.
4. Strings have no getDate method, so calling it throws a TypeError.
Hope it helps.
I'm used to creating Date objects by using the fourth syntax from MDN, as in new Date(year, month, day, hours, minutes, seconds, milliseconds). But lately I tried to set a Date object with only a year (as new Date(2017)), and as you could expect it was treated as a time value and the year was taken as a number of milliseconds.
Is there any way to still easily use the year as is, without changing the syntax, and get a correctly set Date?
Two solutions come to my mind:
(1) Set the year argument to 2017 and set the month argument to 0 when constructing the date:
let d = new Date(2017, 0);
console.log(d.toString());
The arguments will be treated as local time; month and day of month will be January 1; all time components will be set to 0.
(2) Specify "2017T00:00" as the first and only argument when constructing the date:
let d = new Date("2017T00:00");
console.log(d.toString());
According to current specs this is a valid format and browsers are supposed to treat it as local time. The behavior is same as that of previous example.
If you are passing a single parameter (number or string), then it is interpreted as described in the docs:
value
Integer value representing the number of milliseconds since January 1,
1970 00:00:00 UTC, with leap seconds ignored (Unix Epoch; but consider
that most Unix time stamp functions count in seconds).
dateString
String value representing a date. The string should be in a format
recognized by the Date.parse() method (IETF-compliant RFC 2822
timestamps and also a version of ISO8601).
Also, as per the docs:
If at least two arguments are supplied, missing arguments are either
set to 1 (if day is missing) or 0 for all others.
You can pass one more parameter as 0 or null (or the actual value you want to set)
new Date(2017,0);
Demo
var date = new Date(2017,0);
console.log( date );
You could pass null as the second argument:
new Date(2017, null);
However, without knowing the details of how missing values are interpreted, what do you think happens now? Will the date be initialized with the current month, day, hour, etc? Or something different?
Better be explicit and pass all arguments, so that you know what the code is doing half a year later.
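For example, a fully explicit version, so there is nothing left to guess at:
let d = new Date(2017, 0, 1, 0, 0, 0, 0); // year, month (0-based), day, hours, minutes, seconds, ms
console.log(d.toString()); // Sun Jan 01 2017 00:00:00 in the local time zone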
I have another suggestion. You can just create a regular date object and set its year. That way you at least know what to expect for the rest of the Date object's values.
var year = "2014";
var date = new Date();
date.setFullYear(year);
// console.log(date) => Sat Dec 27 2014 16:25:28 GMT+0200
Further reading - Date.prototype.setFullYear()
this.D = new Date(1433760825 * 1000);
this.NewD = this.D.getHours();
D = "2015-06-08T10:53:45.000Z" - this is fine, it is what I was expecting to get.
But... NewD results in 11 and not 10. Why?
Thanks!
When you instantiate the Date object using a value like this, you get a date based on UTC. From MDN:
Integer value representing the number of milliseconds since 1
January 1970 00:00:00 UTC (Unix Epoch).
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date
When you subsequently call the getHours() method, you get the hours in your local time zone.
So for your example:
var sampleDate = new Date(1433760825 * 1000);
var hours = sampleDate.getUTCHours(); // 10, the hours in UTC
alert(sampleDate);
alert(hours);
Should get you the result you are looking for.
A couple of quick points:
1. Don't use single-character variable names, and don't capitalize them if you do.
2. D is not equal to the date that you put in.
3. new Date(value) is expecting an integer and you are giving it something larger than an integer, so it is defaulting back to the current time.
Try using a DateString, or other method as described in this documentation:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date
When you create a Date object from a timestamp and print it as an ISO string (as with your D value), the datetime is shown according to UTC. But when you call the getHours() method, it tells you the hours according to your own local time zone.
(new Date()).getHours()
This will return the hours as UTC plus your time zone offset, i.e. your local time.
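A short sketch of the relationship, using the timestamp from the question; the local value shown assumes an offset of UTC+1 at that instant (matching the question), so yours may differ:
var d = new Date(1433760825 * 1000);
console.log(d.toISOString());       // 2015-06-08T10:53:45.000Z (always UTC)
console.log(d.getUTCHours());       // 10 (UTC)
console.log(d.getHours());          // 11 if your local offset at that moment is UTC+1
console.log(d.getTimezoneOffset()); // difference between UTC and local time, in minutes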