Convert UTC in milliseconds to text format results in local time
I'm having some confusion working with datetime data. I have datetime values represented in milliseconds, but when I convert/format them to a human-readable format in MySQL, I get local time. Here's an example where the data comes from TypeScript:
// format the current UTC date as an ISO string.
const time = moment.utc().format();
console.log(time);
// get the time in milliseconds since the epoch.
const timeStamp = moment.utc().valueOf();
console.log(timeStamp);
and this is the output (in UTC):
2022-08-03T21:07:32Z
1659560852242
however, when I format the same value in MySQL (the POWER/LOG10 factor just scales the 13-digit millisecond value down to seconds, i.e. divides it by 1000), I get local time (CST):
SELECT FROM_UNIXTIME(1659560852242 * POWER(10, 9 - FLOOR(LOG10(1659560852242))));
and the output is converted to CST:
8/3/2022 4:07:32.242000 PM
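I assume the shift comes from the server's time zone setting, which I'd check with something like:

SELECT @@global.time_zone, @@session.time_zone;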
Can someone please explain why this is happening, and how to convert milliseconds to the actual UTC time rather than the local time of the server?
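From what I understand, FROM_UNIXTIME renders the result in the session time zone, so my guess is that forcing the session to UTC first would give back the original UTC value. This is just a sketch of what I've been trying, not something I'm sure is the right approach:

-- force the session to UTC so FROM_UNIXTIME is not shifted to the server's local time
SET time_zone = '+00:00';
-- divide the millisecond value by 1000 to get seconds; FROM_UNIXTIME keeps the fractional part
SELECT FROM_UNIXTIME(1659560852242 / 1000);
-- expecting something like: 2022-08-03 21:07:32.2420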
Thank you.