How to Convert Unix Timestamp to Time (Human-Readable Format)
You're looking at a number like 1710691200 and thinking... what? That's a Unix timestamp. It's the number of seconds that have passed since January 1, 1970 at midnight UTC. No time zones, no AM/PM, no daylight saving headaches. Just a raw number.
Problem is, no human being can look at that and know it means March 17, 2024. So let's convert it.
Need It Converted Right Now?
If you just want the answer and don't care about the how, paste your timestamp into our Unix timestamp converter. It takes seconds or milliseconds, shows your local time zone, and goes both ways. Done.
But if you need to do this in code or want to actually understand what's going on — keep going.
The Seconds vs. Milliseconds Thing
Okay, this trips people up ALL the time, and I've personally wasted hours on it.
Some systems give you seconds since epoch — that's a 10-digit number like 1710691200. Others give you milliseconds — 13 digits, like 1710691200000. JavaScript's Date.now()? Milliseconds. Python's time.time()? Seconds with decimals.
Easy rule: count the digits. 10 = seconds. 13 = milliseconds. If you accidentally feed milliseconds into something expecting seconds, you'll get a date in the year 56,000 or something ridiculous. That's your clue.
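If you'd rather not eyeball digit counts every time, the rule is easy to automate. Here's a minimal Python sketch of that heuristic (the `to_seconds` helper is mine, not from any library):

```python
def to_seconds(ts):
    """Normalize a Unix timestamp to seconds using the digit-count rule."""
    # Values of 1e12 or more (13+ digits) are almost certainly milliseconds
    if abs(ts) >= 1_000_000_000_000:
        return ts / 1000
    return ts

print(to_seconds(1710691200))     # seconds pass through unchanged
print(to_seconds(1710691200000))  # milliseconds become 1710691200.0
```

It's a heuristic, not a guarantee — a millisecond timestamp from before mid-2001 has fewer than 13 digits — but for current dates it works every time.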
JavaScript
// From seconds
const timestamp = 1710691200;
const date = new Date(timestamp * 1000);
console.log(date.toLocaleString());
// "3/17/2024, 4:00:00 PM" (depends on your locale and time zone)
// From milliseconds
const tsMs = 1710691200000;
const date2 = new Date(tsMs);
console.log(date2.toISOString());
// "2024-03-17T16:00:00.000Z"
See that * 1000 on the first one? JavaScript's Date constructor wants milliseconds. If your timestamp is in seconds, you gotta multiply. Forget that and you get January 20, 1970. Don't ask me how I know this.
Python
import datetime
timestamp = 1710691200
# UTC time (utcfromtimestamp is deprecated since Python 3.12 — pass tz instead)
utc_time = datetime.datetime.fromtimestamp(timestamp, tz=datetime.timezone.utc)
print(utc_time) # 2024-03-17 16:00:00+00:00
# Local time
local_time = datetime.datetime.fromtimestamp(timestamp)
print(local_time) # depends on your system timezone
# With timezone info (Python 3.9+)
from zoneinfo import ZoneInfo
eastern = datetime.datetime.fromtimestamp(
timestamp, tz=ZoneInfo("America/New_York")
)
print(eastern) # 2024-03-17 12:00:00-04:00
SQL
-- PostgreSQL
SELECT to_timestamp(1710691200);
-- Result: 2024-03-17 16:00:00+00
-- MySQL
SELECT FROM_UNIXTIME(1710691200);
-- Result: 2024-03-17 16:00:00 (in the session time zone; this assumes UTC)
-- SQLite
SELECT datetime(1710691200, 'unixepoch');
-- Result: 2024-03-17 16:00:00
Terminal (Mac vs. Linux)
# macOS
date -r 1710691200
# Linux
date -d @1710691200
# Both spit out something like:
# Sun Mar 17 16:00:00 UTC 2024
That -r vs -d @ difference between Mac and Linux? One of those tiny annoyances that eats 10 minutes of your life every time you switch machines. Now you won't have to google it again.
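And if you'd rather sidestep the flag difference entirely, Python behaves identically on both systems. A quick sketch that mimics the `date` output above:

```python
import datetime

# Same code, same output, on macOS, Linux, or anywhere Python runs
ts = 1710691200
utc = datetime.datetime.fromtimestamp(ts, tz=datetime.timezone.utc)
print(utc.strftime("%a %b %d %H:%M:%S UTC %Y"))
# Sun Mar 17 16:00:00 UTC 2024
```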
Going Backwards: Date to Unix Timestamp
Sometimes you need the reverse — you've got a readable date and need the timestamp number.
// JavaScript
const ts = Math.floor(new Date('2024-03-17T16:00:00Z').getTime() / 1000);
// 1710691200
# Python
import datetime
dt = datetime.datetime(2024, 3, 17, 16, 0, 0)
ts = int(dt.timestamp())
# 1710691200 only if your system is on UTC (naive datetimes assume local time!)
Careful with time zones on this one. If you create a datetime without explicitly saying it's UTC, most languages assume your local time zone. Same code running on a server in Tokyo produces a different timestamp than the same code in New York. Always specify UTC when accuracy matters.
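To make that concrete, here's a short sketch contrasting an explicitly-UTC datetime with a naive one:

```python
import datetime

# Explicit UTC: the timestamp is the same on every machine
dt_utc = datetime.datetime(2024, 3, 17, 16, 0, 0, tzinfo=datetime.timezone.utc)
print(int(dt_utc.timestamp()))    # 1710691200, everywhere

# Naive datetime: interpreted in the machine's local time zone,
# so servers in different zones produce different numbers
dt_naive = datetime.datetime(2024, 3, 17, 16, 0, 0)
print(int(dt_naive.timestamp()))  # varies with the system time zone
```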
The 2038 Thing
Here's a fun one for your CS exam. If a Unix timestamp gets stored as a 32-bit signed integer, it maxes out at 2,147,483,647. That's January 19, 2038, at 03:14:07 UTC. One second after that, overflow — the number wraps around to -2,147,483,648, which translates to December 13, 1901.
Most modern systems are on 64-bit now, which pushes the limit out about 292 billion years. So we're fine. But old embedded systems and legacy firmware? Those are gonna have a really bad day in 2038.
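You can simulate the wraparound yourself with a little integer arithmetic — a sketch of what happens to a 32-bit signed counter one second past the limit:

```python
import datetime

INT32_MAX = 2**31 - 1  # 2,147,483,647

# The last moment a 32-bit signed timestamp can represent
print(datetime.datetime.fromtimestamp(INT32_MAX, tz=datetime.timezone.utc))
# 2038-01-19 03:14:07+00:00

# One second later, the counter wraps around to -2**31
wrapped = INT32_MAX + 1 - 2**32  # -2,147,483,648
print(datetime.datetime.fromtimestamp(wrapped, tz=datetime.timezone.utc))
# 1901-12-13 20:45:52+00:00
```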
Stuff That'll Bite You
- Seconds vs. milliseconds. I know I already said it. It's still the number one mistake. Always check.
- Time zone confusion. A Unix timestamp is always UTC. Always. When you convert it to local time, the display changes but the underlying number stays the same.
- Negative timestamps. Dates before 1970 have negative Unix timestamps. -86400 is December 31, 1969. Some older systems choke on negative values.
- Leap seconds. Unix time just... ignores them. There've been 27 leap seconds added since 1972, and Unix timestamps act like none of them happened. For 99% of applications, this doesn't matter. But if you're doing astronomical calculations or something, be aware.
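Negative values behave the way you'd hope in most modern languages, as long as you stay timezone-aware. A quick sketch in Python:

```python
import datetime

# -86400 seconds = exactly one day before the epoch
before_epoch = datetime.datetime.fromtimestamp(-86400, tz=datetime.timezone.utc)
print(before_epoch)  # 1969-12-31 00:00:00+00:00
```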
When Should You Use Unix Timestamps?
They're actually great for a bunch of stuff:
- Database storage — zero time zone ambiguity
- Comparing dates — it's just comparing two numbers
- APIs — universal, works everywhere
- Log files — sortable and compact
Where they're terrible: showing times to actual humans. Nobody wants "1710691200" on their receipt. Always convert before displaying.
If you don't want to write code every time you need to convert a Unix timestamp to time, our converter tool does both directions and shows results across time zones. I've bookmarked it myself — I end up using it way more often than I expected.