It is funny (or maddening, depending on your mood) how long something can sit under your nose before you come to assume you understand it. Only when you are forced to look at that thing in detail do you realize you have no idea how it works.
That happened to me just now with the Java method System.currentTimeMillis().
I’ve been writing Java for more than a decade and never looked at the Javadoc for that method until today. It reads:
… the difference, measured in milliseconds, between the current time and midnight, January 1, 1970 UTC.
Sounds easy enough, so this is the elapsed time since the Unix epoch, but in milliseconds (1/1000th of a second), got it!
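To make that concrete, here is a minimal sketch (the class name is my own, and it uses nothing beyond the standard library) that prints the raw value alongside its whole-second equivalent:

```java
public class EpochDemo {
    public static void main(String[] args) {
        // Raw value: milliseconds elapsed since 1970-01-01T00:00:00 UTC.
        long millis = System.currentTimeMillis();
        System.out.println("millis since epoch:  " + millis);

        // Dividing by 1000 gives the familiar Unix timestamp in seconds.
        System.out.println("seconds since epoch: " + millis / 1000);
    }
}
```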
Is the returned value the difference between the current LOCAL time and the UTC epoch (mixing two time zones), or is the local time first CONVERTED to UTC and then the UTC epoch subtracted from it?
Put another way: depending on my time zone and daylight saving time, if I run this code on different clients in different places around the globe, all at the same instant, will I get different values?
If I get different values, does that mean that when I am writing a web service or REST API, I need the client to send me its timestamp as well as its timezone so I can properly compare the times? Damnit all!
Fortunately, Age of Code is all over this and explains in great detail (with examples) that the method computes the difference between two UTC times, so the value you get back is the elapsed time in UTC.
So it’s all UTC!
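A quick sanity check bears this out. The toy sketch below (my own, using only java.util.TimeZone) flips the JVM’s default time zone between two readings; the two values differ only by the few milliseconds that elapsed between the calls, never by a zone offset:

```java
import java.util.TimeZone;

public class TimeZoneIndependence {
    public static void main(String[] args) {
        long before = System.currentTimeMillis();

        // Swap the JVM's default time zone between the two readings.
        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Tokyo"));
        long after = System.currentTimeMillis();

        // Prints ~0: only real elapsed time, no hours-sized zone offset.
        System.out.println("difference: " + (after - before) + " ms");
    }
}
```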
As it turns out, Java treats all millisecond times as milliseconds-since-the-Unix-epoch-in-UTC, so when you convert a timestamp using a Calendar or Date (or, I imagine, JodaTime), Java applies the time zone shift for you and hands back a time-zone-friendly time; the millisecond value itself is never timezone-shifted.
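To see that shift in action, here is a short sketch (again just the standard library; the two zone choices are arbitrary) that formats one and the same millisecond value in two zones. The printed wall-clock strings differ, but the underlying long never changes:

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class ZoneShiftDemo {
    public static void main(String[] args) {
        // One UTC-based millisecond value, wrapped in a Date.
        Date now = new Date(System.currentTimeMillis());
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        // Same millis, rendered as UTC wall-clock time...
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println("UTC:      " + fmt.format(now));

        // ...and as New York wall-clock time: shifted display, identical value.
        fmt.setTimeZone(TimeZone.getTimeZone("America/New_York"));
        System.out.println("New York: " + fmt.format(now));
    }
}
```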
Hope that helps!