What's a Unix Timestamp and Why Should You Care?
That giant number in your database isn't random. Here's what it means and how to work with it.
You're debugging an API response. There's a field called created_at with the value 1704067200. What day is that?
That's a Unix timestamp. It's the number of seconds since January 1, 1970. And once you understand it, you'll see it everywhere.
Why 1970?
Unix was being developed in the late 1960s. The designers needed a starting point for time calculations. January 1, 1970 was recent enough to be practical, early enough to cover most computing history.
That moment—midnight UTC on January 1, 1970—is called the Unix epoch.
Why Seconds Instead of Dates?
Dates are complicated. Time zones, daylight saving, leap years, varying month lengths. "March 15, 2024 at 3pm Eastern time" requires a lot of parsing.
A timestamp is just a number. 1710522000. Easy to store, easy to compare, easy to do math with.
Want to know which event happened first? Compare two numbers. Want to know how long something took? Subtract timestamps. Want to add 24 hours? Add 86400.
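That arithmetic is as simple as it sounds. A quick sketch in JavaScript, using two made-up event timestamps:

```javascript
// Two hypothetical event timestamps (seconds since the epoch)
const deployStarted = 1704067200;  // Jan 1, 2024, 00:00:00 UTC
const deployFinished = 1704067500; // five minutes later

// Which happened first? Compare two numbers.
console.log(deployStarted < deployFinished); // true

// How long did it take? Subtract.
console.log(deployFinished - deployStarted); // 300 seconds

// Same moment tomorrow? Add a day's worth of seconds.
const tomorrow = deployStarted + 86400;
```

No date parsing, no timezone lookups; it's plain integer math.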
The Timezone Advantage
Here's the clever part: timestamps are always UTC.
When someone in Tokyo and someone in New York both record the same moment, they get the same timestamp. The display might differ—one sees 9am, the other sees 7pm—but the underlying number is identical.
This makes distributed systems possible. Databases, APIs, log files—all using the same reference point regardless of where the servers are.
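You can see the one-number-many-displays idea directly in JavaScript, rendering a single timestamp in two zones (the zone names are standard IANA identifiers):

```javascript
const ts = 1704067200; // one moment: Jan 1, 2024, 00:00:00 UTC
const moment = new Date(ts * 1000);

// Same number, different local displays
console.log(moment.toLocaleString('en-US', { timeZone: 'Asia/Tokyo' }));
// Tokyo sees 9:00 AM on January 1
console.log(moment.toLocaleString('en-US', { timeZone: 'America/New_York' }));
// New York sees 7:00 PM on December 31
```

The timestamp never changes; only the formatting step knows about time zones.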
Milliseconds vs Seconds
JavaScript counts milliseconds since the epoch (Java does too). Most Unix tools and other languages default to seconds.
1704067200 (seconds) = January 1, 2024, 00:00:00 UTC
1704067200000 (milliseconds) = Same moment
If a timestamp has 13 digits instead of 10, it's almost certainly milliseconds. Divide by 1000.
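One defensive trick is to normalize incoming timestamps to seconds. This is a heuristic, not a guarantee (the threshold and the `toSeconds` name here are illustrative choices, not a standard API):

```javascript
// Heuristic: values above ~1e11 are treated as milliseconds.
// (1e11 *seconds* would be the year 5138 — far beyond plausible data.)
function toSeconds(ts) {
  return ts > 1e11 ? Math.floor(ts / 1000) : ts;
}

console.log(toSeconds(1704067200));    // 1704067200 (already seconds)
console.log(toSeconds(1704067200000)); // 1704067200 (was milliseconds)
```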
Reading Timestamps Quickly
A few reference points to calibrate your mental math:
- 1000000000 (1 billion) = September 9, 2001
- 1600000000 = September 2020
- 1700000000 = November 2023
- 1800000000 = January 2027
Current timestamps are in the 1.7 billion range. If you see something starting with 17, it's probably a recent timestamp.
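You can verify those anchor points yourself in a Node console:

```javascript
// Convert a seconds timestamp to a readable UTC date string
const iso = (ts) => new Date(ts * 1000).toISOString();

console.log(iso(1000000000)); // 2001-09-09T01:46:40.000Z
console.log(iso(1600000000)); // 2020-09-13T12:26:40.000Z
console.log(iso(1700000000)); // 2023-11-14T22:13:20.000Z
console.log(iso(1800000000)); // 2027-01-15T08:00:00.000Z
```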
Common Operations
Current timestamp:
Math.floor(Date.now() / 1000) // JavaScript
import time; int(time.time()) # Python
Timestamp to date:
new Date(1704067200 * 1000) // JavaScript needs milliseconds
Date to timestamp:
Math.floor(new Date('2024-01-01').getTime() / 1000)
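Putting those together, here's a round trip from date string to timestamp and back. One subtlety worth knowing: JavaScript parses a date-only string like '2024-01-01' as midnight UTC.

```javascript
// Date string -> timestamp (seconds)
const ts = Math.floor(new Date('2024-01-01').getTime() / 1000);
console.log(ts); // 1704067200

// Timestamp -> Date (JavaScript wants milliseconds)
const d = new Date(ts * 1000);
console.log(d.toISOString()); // 2024-01-01T00:00:00.000Z
```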
The 2038 Problem
Unix timestamps were originally stored as 32-bit signed integers. The maximum value? 2,147,483,647.
That's January 19, 2038, 03:14:07 UTC.
After that moment, 32-bit systems will overflow. It's Y2K for timestamps.
Most modern systems use 64-bit integers now, which won't overflow for billions of years. But legacy systems exist. If you're working with older code, it's worth checking.
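You can simulate the wraparound yourself by forcing a timestamp through JavaScript's 32-bit integer conversion (`| 0`):

```javascript
const max32 = 2147483647; // 2^31 - 1, the largest 32-bit signed integer

console.log(new Date(max32 * 1000).toISOString());
// 2038-01-19T03:14:07.000Z — the last representable second

// One second later, a 32-bit signed integer wraps to a large negative number
const overflowed = (max32 + 1) | 0;
console.log(overflowed); // -2147483648
console.log(new Date(overflowed * 1000).toISOString());
// 1901-12-13T20:45:52.000Z
```

That's the failure mode: one second after the limit, an affected system thinks it's 1901.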
When Timestamps Get Weird
Negative timestamps. Dates before 1970 are negative numbers. -86400 is December 31, 1969.
Leap seconds. UTC occasionally adds a second to stay synchronized with Earth's rotation. Most systems ignore this. Occasionally it causes chaos.
Clock drift. Servers' clocks can disagree. Two timestamps from different machines might not be directly comparable if the clocks weren't synchronized.
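The negative-timestamp case is easy to see firsthand:

```javascript
// Dates before the epoch are negative timestamps
console.log(new Date(-86400 * 1000).toISOString());
// 1969-12-31T00:00:00.000Z — exactly one day before the epoch
```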
Timestamps are everywhere once you start looking. API responses, database records, log files, cookies. Understanding them makes debugging faster and time-related code less mysterious.