What Is a Unix Timestamp?
Learn how Unix timestamps work, why developers use epoch time, and how to convert timestamps to readable dates.
Unix timestamps give software systems a compact way to store a single moment in time. Instead of saving a formatted date like 2026-04-18 09:30:00, many APIs, databases, logs, and schedulers store the number of seconds or milliseconds since the Unix epoch.
Unix epoch time in plain English
The Unix epoch starts at 00:00:00 UTC on January 1, 1970. A timestamp of 0 points to that exact moment. A timestamp of 86400 points to one day later, because there are 86,400 seconds in a day.
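Those two reference points can be checked directly with Python's standard datetime module:

```python
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself; 86400 is exactly one day later.
epoch = datetime.fromtimestamp(0, tz=timezone.utc)
one_day = datetime.fromtimestamp(86400, tz=timezone.utc)

print(epoch.isoformat())    # 1970-01-01T00:00:00+00:00
print(one_day.isoformat())  # 1970-01-02T00:00:00+00:00
```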
Developers use epoch time because it is timezone-neutral. A single integer can represent the same moment for every user, server, and database. The display layer can then format that moment in UTC, local time, or any target timezone.
- Seconds are common in Unix tools, PHP, Python, and many APIs.
- Milliseconds are common in JavaScript, browser APIs, and event logs.
- UTC is the safest reference point when comparing timestamps across systems.
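The timezone-neutral idea can be sketched in Python: one integer, rendered differently per zone. The fixed UTC+2 offset below is an illustrative assumption; real code would look up a named zone with zoneinfo.

```python
from datetime import datetime, timedelta, timezone

ts = 1713427200  # one integer, the same instant everywhere

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
# Illustrative fixed UTC+2 offset; production code would use zoneinfo names.
plus_two = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=2)))

print(utc.isoformat())       # 2024-04-18T08:00:00+00:00
print(plus_two.isoformat())  # 2024-04-18T10:00:00+02:00
```

Both lines describe the same moment; only the display offset differs.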
Seconds vs milliseconds
The most common timestamp bug is mixing seconds and milliseconds. A 10-digit value such as 1713427200 is usually seconds. A 13-digit value such as 1713427200000 is usually milliseconds.
If a converted date lands in early 1970, the app probably treated a seconds value as milliseconds, shrinking it by a factor of 1,000. If the result is thousands of years in the future, the app probably treated a milliseconds value as seconds.
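A common defensive pattern is to normalize incoming values to seconds by magnitude. This is a heuristic sketch, not a library API; always prefer an explicit unit from the schema when one is documented.

```python
def to_seconds(ts: int) -> float:
    """Heuristic: values at or above 10**12 (13 digits) are treated as
    milliseconds. Assumes modern timestamps; a genuine seconds value
    would not reach 10**12 until roughly the year 33658."""
    return ts / 1000 if ts >= 10**12 else float(ts)

print(to_seconds(1713427200))     # 1713427200.0  (seconds, unchanged)
print(to_seconds(1713427200000))  # 1713427200.0  (milliseconds, scaled)
```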
Common developer use cases
Epoch values appear in API responses, authentication tokens, audit logs, queue messages, database rows, metrics, and cron-like scheduling workflows. They are easy to sort, compare, and serialize without worrying about locale-specific date formats.
When debugging, convert the timestamp first, confirm whether it is seconds or milliseconds, then compare the result against the expected timezone and event order.
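One way to apply that checklist is to print both interpretations side by side; the implausible one usually jumps out. A debugging sketch using the standard library:

```python
from datetime import datetime, timezone

def debug_timestamp(ts: int) -> None:
    """Show a raw value under both unit assumptions."""
    print("as seconds:     ", datetime.fromtimestamp(ts, tz=timezone.utc))
    print("as milliseconds:", datetime.fromtimestamp(ts / 1000, tz=timezone.utc))

debug_timestamp(1713427200)
# "as seconds" prints a plausible 2024 date; "as milliseconds"
# lands in January 1970, the classic sign of a units mix-up.
```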
Frequently Asked Questions
What is the current Unix timestamp?
It is the number of seconds that have passed since January 1, 1970 at 00:00:00 UTC. The value changes every second.
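In Python, the current value comes from time.time, which returns a float of seconds; multiplying by 1,000 mirrors what JavaScript's Date.now() reports:

```python
import time

now_seconds = int(time.time())        # currently a 10-digit integer
now_millis = int(time.time() * 1000)  # currently a 13-digit integer
print(now_seconds, now_millis)
```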
Is Unix time always in UTC?
The timestamp itself represents a UTC-based instant. Human-readable output can be formatted in UTC or a local timezone.
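For example, the same datetime object can be rendered in UTC or shifted to the machine's local zone without changing the instant it represents (a sketch using the standard library):

```python
from datetime import datetime, timezone

instant = datetime.fromtimestamp(1713427200, tz=timezone.utc)

print(instant.isoformat())               # the UTC rendering
print(instant.astimezone().isoformat())  # same instant, local timezone
```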
How do I know if a timestamp is seconds or milliseconds?
A 10-digit value is usually seconds, while a 13-digit value is usually milliseconds. Some systems document this explicitly, so check the API or database schema when possible.
