JavaScript Date object increment goes wrong - javascript

I want to increment the year value of a given date, but this goes wrong.
This is my code:
var endDate = entry.start;
endDate.setDate(endDate.getFullYear() + 5);
And for comparison, the output (console.log) before and after is:
Date {Thu Jun 30 2011 11:30:10 GMT+0200}
Date {Tue Dec 06 2016 11:30:10 GMT+0100}
As you can see, it also incremented the month and day.
What am I missing?
Thanks in advance.

You have to set the year only, using the setFullYear method (setYear is deprecated):
endDate.setFullYear(endDate.getFullYear() + 5);
Using setDate(getFullYear() + 5) you set the day of the month to 2016, and the excess days roll over into the following months and years.

You're effectively adding about 2016 days to it, not modifying the year; that's roughly five and a half years, which matches the jump from Jun 2011 to Dec 2016.
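A minimal sketch of the fix (the sample date and the +5 come from the question; copying via new Date is an extra suggestion, since var endDate = entry.start only copies the reference and the original code was also mutating entry.start):

var endDate = new Date(entry.start); // copy, so entry.start itself is not changed
endDate.setFullYear(endDate.getFullYear() + 5);
console.log(endDate); // e.g. Thu Jun 30 2016 11:30:10 GMT+0200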

Related

How to get last month day 15 and current month day 16 in moment

I am trying to get day 15 of last month and day 16 of the current month in Moment, but I have failed. Could someone please help me resolve this issue?
Expected result => 15 Oct, 15 Nov 2020
To get the 15th of last month you need this:
moment().subtract(1, 'month').date(15);
You subtract one month and set the date to 15. This returns 15 October.
To get the 15th of the current month, just remove the subtract part.
To get exactly the result you asked for then:
const currentMonthDate15 = moment().date(15);
const lastMonthDate15 = moment().date(15).subtract(1, 'month');
const string = lastMonthDate15.format('DD MMM') + ', ' + currentMonthDate15.format('DD MMM YYYY');
Where string is 15 Oct, 15 Nov 2020 when run in November 2020 (moment() with no arguments is the current date).

Get week number from date on a whole-week basis in JavaScript

I have searched the web and found a script to get the week number within the year. However, my counting differs. The image below shows the week numbering I want. When I tested using '1/5/2015', my code returned week number 2, but in my scheme the week number should be 1. Could someone help me? Thanks in advance.
I found the JavaScript at IamSilviu/Get week number.
Here is my code:
function myWeekNumber(thisDate) {
  var dt = new Date(thisDate);
  var onejan = new Date(dt.getFullYear(), 0, 2);
  return Math.ceil((((dt - onejan) / 86400000) + onejan.getDay() + 1) / 7);
}
The algorithm you're trying to implement seems to be:
Weeks start on Sunday.
The first week of the year is the one containing 1 January, e.g. 1 Jan 2016 was a Friday, so the first week of 2016 started on Sunday, 27 December 2015.
In this case, it's best to use UTC methods to avoid daylight saving issues:
function getWeekNumberNonISO(d) {
  // Create the UTC equivalent of 23:59:59.999 on the passed-in date
  var sat = new Date(Date.UTC(d.getFullYear(), d.getMonth(), d.getDate(), 23, 59, 59, 999));
  // Move to the Saturday at the end of that week
  sat.setUTCDate(sat.getUTCDate() + 6 - sat.getUTCDay());
  // Get the first day of the year
  var firstDay = new Date(Date.UTC(sat.getUTCFullYear(), 0, 1));
  // Move back to the Sunday on or before, i.e. the first day of the first week of the year.
  // Note that setUTCDate returns a number (the timestamp), which is all the subtraction needs.
  firstDay = firstDay.setUTCDate(firstDay.getUTCDate() - firstDay.getUTCDay());
  // Week number is the difference between the dates divided by ms/week, rounded
  return Math.round((sat - firstDay) / 6.048e8);
}
// Get week number for Mon 5 Jan 2015
console.log(getWeekNumberNonISO(new Date(2015,0,5))); // 2
// Get week number for Sat 31 Dec 2011
console.log(getWeekNumberNonISO(new Date(2011,11,31))); //53
// Get week number for Sat 1 Jan 2011
console.log(getWeekNumberNonISO(new Date(2011,0,1))); // 1
// Get week number for Sun 2 Jan 2011
console.log(getWeekNumberNonISO(new Date(2011,0,2))); // 2
Note: JavaScript has no built-in getWeek() method on Date, so code like
var week = date.getWeek();
only works if you or a library have defined getWeek yourself.
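If you want that calling style, a minimal sketch is to wrap the function above in a prototype extension (whether extending built-in prototypes is wise is a matter of taste):

Date.prototype.getWeek = function () {
  // Delegate to the getWeekNumberNonISO function defined above
  return getWeekNumberNonISO(this);
};

console.log(new Date(2015, 0, 5).getWeek()); // 2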

MomentJS returns obscure date for 1st of month

I'm having a small problem with MomentJS returning a nonsense date. I am attempting to set the date to the first of a given month and year. I have tried the following:-
var _year = 2015;
var _month = 10;
var _dateString = _year.toString() + '-' + _month.toString() + '-1';
var _date = moment(_dateString, 'YYYY-MM-D');
console.log('_date', _date.format('dddd, do MMMM YYYY'));
This gives Thursday, 4th October 2015 as the _date, which doesn't exist. I tried using .set() and .date(); both give the same result:
var _date = moment(_dateString, 'YYYY-MM-D').set('date', 1);
> Thursday, 4th October 2015
var _date = moment(_dateString, 'YYYY-MM-D').date(1);
> Thursday, 4th October 2015
So, I can't see what I'm doing wrong; can anyone offer any suggestions or help?
Many thanks.
Your code is correct, except you should use a capital D, not a small d, in do:
console.log('_date', _date.format('dddd, Do MMMM YYYY'));
The difference between Do and do is:
do is the ordinal index of the day within the week. For example, if you check the calendar you will find that 1 October 2015 was a Thursday, which is the 4th day of the week since the index starts at 0 (Sunday). If you change to 2 October, a Friday, it gives you 5th, and likewise 3 Oct => 6th; then a new week starts, so Sunday 4 Oct => 0th, and it starts over again.
Do is the ordinal day of the month, which is what you expected the result to be: 1 Oct is 1st, 2 Oct => 2nd, and so on.
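A quick check of the two tokens for 1 October 2015 (a Thursday):

var d = moment('2015-10-1', 'YYYY-MM-D');
console.log(d.format('do')); // "4th" - ordinal day of the week, Sunday = 0th
console.log(d.format('Do')); // "1st" - ordinal day of the month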
Check the docs here for more info

Why is there a need to add 1 to the month for this date conversion?

I have this date variable in JavaScript, $scope.dt, and its contents are Tue Jul 08 2014 00:00:00 GMT+0800 (Malay Peninsula Standard Time). I want to convert it to a string that is 2014-7-8 (YYYY-MM-DD).
Below is the function I wrote;
function convertDate_YYYYMMDD(d)
{
var curr_date = d.getDate();
var curr_month = d.getMonth()+1; //why need to add one?
var curr_year = d.getFullYear();
return (curr_year + "-" + curr_month + "-" + curr_date );
}
It works fine. What I don't understand is why I need to add 1 to get the correct curr_month. If I do not, the month is always off by one. The code works but I don't know why.
Can someone advise?
That's a legacy of C: the months in timestamps are zero-based.
Compare Date.getMonth():
The value returned by getMonth is an integer between 0 and 11. 0 corresponds to January, 1 to February, and so on.
struct tm:
int tm_mon month of year [0,11]
Why months start with zero in many programming languages is explained here: Zero-based month numbering. Paraphrased: using January == 0 was useful in ancient times, and now we're stuck with it.
http://www.w3schools.com/jsref/jsref_getmonth.asp
The getMonth() method returns the month (from 0 to 11) for the
specified date, according to local time.
Note: January is 0, February is 1, and so on.
The month range is 0-11, i.e. for January it returns 0 and for December it returns 11. Therefore we need to add 1 to it.
Check this
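A quick demonstration using the date from the question (note that the Date constructor also takes a zero-based month):

var d = new Date(2014, 6, 8); // month index 6 => July
console.log(d.getMonth()); // 6
console.log(d.getMonth() + 1); // 7, the human-readable month number
console.log(convertDate_YYYYMMDD(d)); // "2014-7-8", using the function from the question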

Problem adding days / months / years to a given date using JavaScript

I am trying to add days / months / years to a given date and map it to an input field:
var d = new Date();
d.setDate(15);
d.setMonth(06);
d.setYear(2011);
document.getElementById("test").innerHTML=d;
d.setDate(d.getDate()+20);
document.getElementById("test").innerHTML+=""+d.getDate()+"/"+d.getMonth()+"/"+d.getYear("YY");
This actually prints out:
Fri Jul 15 2011 12:45:48 GMT+0530 (India Standard Time)
4/7/111
This is wrong; it should print out 5/7/2011. I think by default the system takes a month as "30" days and adds the +20 days, but Jun actually has 30 days, so the result should be 5/7/2011.
Any suggestion about what goes wrong here? Any alternative for this?
First of all, you had better use getFullYear to get the year as 2011; getYear returns years since 1900, which is where the 111 comes from, and getMonth is zero-based, which is why August prints as 7. As an alternative way of adding days, you can take the millisecond value from getTime() and add milliseconds:
<div id="test"></div>
<script>
  var d = new Date();
  d.setDate(15);
  d.setMonth(6); // zero-based, so 6 is July; 06 is also a legacy octal literal, best avoided
  d.setFullYear(2011);
  document.getElementById("test").innerHTML += " " + d.getDate() + "/" + d.getMonth() + "/" + d.getFullYear();
  // Add 20 days' worth of milliseconds (note: across a DST change the local time can shift by an hour)
  d.setTime(d.getTime() + 1000 * 60 * 60 * 24 * 20);
  document.getElementById("test").innerHTML += " " + d.getDate() + "/" + d.getMonth() + "/" + d.getFullYear();
</script>
> var d = new Date();
> d.setDate(15);
> d.setMonth(06);
> d.setYear(2011);
is equivalent to:
var d = new Date(2011,6,15); // 15 Jul 2011
Months are zero based (January = 0, December = 11).
Date.prototype.getYear is specified in ECMA-262 ed5 as Return YearFromTime(LocalTime(t)) − 1900, so:
alert(d.getYear()); // 111
whereas:
alert(d.getFullYear()); // 2011
> i think by default the system takes as "30" days for a month and adds the +20 days.. but actually Jun has 30 days so that result should be 5/7/2011.
You are interpreting it the wrong way: the month in a Date starts with 0 - Jan.
So as per the date you entered, it comes out as Jul 15 2011 with month number 6.
When you add 20 to the date it becomes Aug 04 2011, and you are printing the raw month number, which is 7 - i.e. Aug - which misleads your calculation. And for the year, yes: you should use getFullYear.
Read this to get your basics correct.
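Putting it together, a minimal sketch of adding to each component; addToDate is an illustrative helper name, not a built-in:

function addToDate(date, days, months, years) {
  var d = new Date(date); // copy so the original is untouched
  d.setFullYear(d.getFullYear() + years);
  d.setMonth(d.getMonth() + months);
  d.setDate(d.getDate() + days); // setDate rolls over month and year boundaries
  return d;
}

var start = new Date(2011, 6, 15); // 15 Jul 2011 (months are zero-based)
var later = addToDate(start, 20, 0, 0);
console.log(later.getDate() + "/" + (later.getMonth() + 1) + "/" + later.getFullYear()); // 4/8/2011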
