UNIX timestamps are used almost everywhere. We will cover the main reasons behind their popularity, but for now, let's say it is because working with dates and times in the UNIX time format is easier and more resource-efficient. In this article, our goal is to demystify the concept, learn the theory behind it, and ultimately write algorithms that convert dates back and forth between the traditional ISO 8601 standard and UNIX format.
In short, UNIX time is the count of seconds elapsed since midnight UTC (Coordinated Universal Time) on January 1, 1970, not counting leap seconds. I won't dwell on the precise definitions of UTC and GMT here: the difference between the two never exceeds 0.9 seconds, and it is beyond the scope of this article. For our purposes, we can treat them as the same.
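Most languages expose this count directly. As a quick sanity check, here is a minimal Python sketch that prints the current UNIX timestamp:

```python
import time

# time.time() returns the current UNIX timestamp as a float;
# truncating to int gives whole seconds since 1970-01-01T00:00:00Z.
print(int(time.time()))
```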
Just as its name suggests, it's related to UNIX. The UNIX epoch is 00:00:00 UTC on January 1, 1970; in the ISO 8601 standard format, that is 1970-01-01T00:00:00Z. It is the reference point from which this measurement unit (UNIX time) starts, so UNIX time zero falls exactly at the epoch.
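To illustrate, a short Python sketch confirms that timestamp zero maps back to exactly that epoch:

```python
from datetime import datetime, timezone

# UNIX time 0 converted back to a calendar date is exactly the epoch.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
print(epoch.isoformat())  # 1970-01-01T00:00:00+00:00
```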
UNIX time is not a truly linear representation of elapsed time because it ignores leap seconds; therefore, it does not correspond exactly to UTC. There are workarounds that account for leap seconds as well, but that's not what we care about right now. What's important to us is that any date and time can be encoded in UNIX format.
This is valuable because an entire date and time fits into a single scalar number, which from a programmer's point of view is just a sequence of bits. I mentioned earlier that one of the main advantages of the UNIX time format is that it makes working with dates and times very easy. The reason is simple.
Consider a scenario where you need to move a date variable forward three (3) months. The situation can get tricky when the initial timestamp looks like "2008-07-04T00:00:00" (I've stripped the +00:00 suffix; it's UTC/GMT). You need to remember that July has 31 days, as does August, but September has only 30. You also need to work out which part of the string stands for the year, the month, and the day; different locales order them differently.
Isn't it easier when you have the number 1215129600 and adding three months boils down to adding 92 days (31 + 31 + 30) to the initial date/time variable? In UNIX time, every day is exactly 86,400 seconds long. That means in order to move forward 92 days, we add 92 * 86400 to the initial value. The result is 1223078400. Voila! No hassle at all.
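This jump can be sketched in a few lines of Python, starting from the timestamp in the example:

```python
from datetime import datetime, timezone

start = 1215129600          # 2008-07-04T00:00:00Z
days = 31 + 31 + 30         # July + August + September = 92 days
end = start + days * 86400  # every UNIX-time day is 86,400 seconds

print(end)  # 1223078400
# Convert back to a calendar date to confirm we landed three months later:
print(datetime.fromtimestamp(end, tz=timezone.utc).isoformat())
# 2008-10-04T00:00:00+00:00
```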
As you can see, working with UNIX timestamps isn't a black art; ultimately, it all reduces to simple arithmetic. Explain it to an elementary school kid and they might be fascinated by it.
Now let's walk through a step-by-step calculation that accounts for leap years. We start with the following date: Mon, 31 Mar 2008 21:15:30 UTC. First, we calculate the offset from the year 1970 (the UNIX epoch): 2008 - 1970 = 38 years. Within this range, a leap year occurs every 4 years, so 38 / 4 = 9.5, which we truncate to 9 leap days.

The calculation continues, and now we convert those 38 years to a number of days: 38 * 365 = 13870. Adding the 9 leap days gives 13879 days, which takes us to January 1, 2008. Moving on, we calculate the offset from January 1: 31 (January) + 29 (February has 29 days this year) + 30 (March 1 through March 30) = 90 days. Note that March 31 itself is not counted, because it has not yet completed. We add these 90 days to our running total: 13879 + 90 = 13969 days.

As stated earlier, each day has 86,400 seconds. Let's multiply this by our count of days: 13969 * 86400 = 1206921600. All that's left is adding the time of day. 9 PM is hour 21, so the offset is 21 hours since midnight. Each hour is 3,600 seconds: 21 * 3600 = 75600. Then we also have 15 minutes and 30 seconds; a minute is 60 seconds, so we end up with 15 * 60 + 30 = 930 seconds. Together, 75600 + 930 = 76530.

And finally, we add this number of seconds to our running total. That's how we end up with the final result: 1206921600 + 76530 = 1206998130. Class dismissed. End of math for today; programming is next!
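Reproducing the calculation in code also lets us cross-check it against the standard library, since `calendar.timegm` interprets a time tuple as UTC and returns UNIX time:

```python
import calendar

# Manual conversion of 2008-03-31 21:15:30 UTC to UNIX time.
years = 2008 - 1970                  # 38 years since the epoch
leap_days = years // 4               # 9 leap days between 1970 and 2008
days = years * 365 + leap_days       # 13879 days up to 2008-01-01
days += 31 + 29 + 30                 # full days of 2008 elapsed before Mar 31
seconds = days * 86400               # 1206921600
seconds += 21 * 3600 + 15 * 60 + 30  # time of day: 76530 seconds
print(seconds)                       # 1206998130

# Cross-check against the standard library.
assert seconds == calendar.timegm((2008, 3, 31, 21, 15, 30, 0, 0, 0))
```

Note that the `years // 4` shortcut only works because every fourth year between 1970 and 2008 (including 2000) really is a leap year; the century rules of the Gregorian calendar would break it for longer spans.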