Developer Tools

Unix Timestamps: The Developer's Universal Clock

Unix timestamps appear in logs, APIs, databases, and tokens. Here's exactly what they are, why every developer should understand them, and the gotchas to watch for.


DevPulse Team

Open any server log, API response, or JWT and you'll likely see a large integer like 1714046400. That's a Unix timestamp. It's a deceptively simple format that underpins most of how computers record time.

What a Unix Timestamp Is

A Unix timestamp is the number of seconds elapsed since the Unix epoch: 00:00:00 UTC on January 1, 1970. That's it. No timezones, no daylight saving, no calendar complexity — just a count of seconds from a fixed reference point.

This simplicity makes it ideal for storage and computation. Comparing two timestamps is a single integer comparison. Calculating time differences is subtraction. Timestamps sort correctly as integers. No date-parsing library required.
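A quick Python sketch of those properties (the timestamp values here are arbitrary examples):

```python
# Two events recorded as Unix timestamps (seconds since the epoch, UTC)
deploy_started = 1714046400
deploy_finished = 1714046675

# Ordering is a plain integer comparison
assert deploy_started < deploy_finished

# A duration is plain subtraction, no date-parsing library required
duration_seconds = deploy_finished - deploy_started
print(duration_seconds)  # 275
```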

Reading Timestamps in Practice

The timestamp 1714046400 is meaningless at a glance. A converter tells you it's April 25, 2024 at 12:00 UTC. When you're reading logs or debugging API responses, translating timestamps to human dates is something you do constantly.
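In Python, for example, the translation is a one-liner; pass an explicit timezone so you get UTC rather than the machine's local time:

```python
from datetime import datetime, timezone

ts = 1714046400
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2024-04-25T12:00:00+00:00
```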

Common places you'll see Unix timestamps:

  • JWT tokens — the exp, iat, and nbf claims are all Unix timestamps
  • HTTP headers — Last-Modified and Expires use the human-readable HTTP-date format (RFC 9110), but many APIs use Unix time in response bodies
  • Database timestamps — some schemas store timestamps as integers for simplicity and storage efficiency
  • Log files — syslog and many application log formats include Unix timestamps
  • File metadata — stat on Linux reports access, modification, and inode-change times as timestamps
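As one illustration of the JWT case, here's a sketch of a validity check against already-decoded claims. The payload values and the fixed "now" are made-up examples, and real code would verify the token's signature first:

```python
# Hypothetical decoded JWT payload: exp, iat, and nbf are Unix timestamps (seconds)
claims = {"iat": 1714046400, "nbf": 1714046400, "exp": 1714050000}

# A fixed "now" for illustration; real code would use int(time.time())
now = 1714048000

# The token is usable only between its not-before and expiry instants
token_is_valid = claims["nbf"] <= now < claims["exp"]
print(token_is_valid)  # True
```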

Milliseconds vs Seconds

Standard Unix timestamps are in seconds. But JavaScript's Date.now() and many web APIs return milliseconds — so the same moment is either 1714046400 (seconds) or 1714046400000 (milliseconds). This is a common source of bugs: a timestamp that's 1000× too large looks like a date thousands of years in the future.

The quick sanity check: if the timestamp has 13 digits, it's likely milliseconds. 10 digits is seconds. Second-based timestamps won't grow past 10 digits until November 2286.
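That heuristic is easy to encode. This small helper (the function name is ours) normalizes either unit to seconds:

```python
def to_seconds(ts: int) -> int:
    """Treat 13-digit values (>= 1e12) as milliseconds; pass seconds through."""
    return ts // 1000 if ts >= 1_000_000_000_000 else ts

print(to_seconds(1714046400))     # 1714046400, already seconds
print(to_seconds(1714046400000))  # 1714046400, milliseconds divided down
```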

The Year 2038 Problem

Systems that store Unix timestamps as 32-bit signed integers can only represent values up to 2147483647 — January 19, 2038 at 03:14:07 UTC. After that, the value overflows to a large negative number. This is the "Year 2038 Problem", analogous to Y2K.

Modern 64-bit systems don't have this issue — a 64-bit signed integer can hold timestamps until the year 292 billion CE. But embedded systems and legacy code using 32-bit timestamps remain at risk. It's still worth checking any code you maintain.
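You can watch the wraparound happen by forcing a value through a 32-bit signed integer, for instance with Python's ctypes:

```python
import ctypes
from datetime import datetime, timezone

limit = 2_147_483_647  # INT32_MAX: Jan 19, 2038 at 03:14:07 UTC

# One second past the limit, a 32-bit signed counter wraps negative
wrapped = ctypes.c_int32(limit + 1).value
print(wrapped)  # -2147483648

# Interpreted as a timestamp, that lands back in 1901
print(datetime.fromtimestamp(wrapped, tz=timezone.utc).isoformat())
```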

Timezones

Unix timestamps are always UTC. Converting to a local time for display is the application's job. The conversion is straightforward in every language, but always be explicit about which timezone you're displaying. Showing a timestamp in a user's local timezone without labelling it is a common UX mistake that causes confusion in international systems.
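A sketch of the display step in Python, rendering the same instant in two explicitly labelled zones (the zone choice here is arbitrary):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

ts = 1714046400  # one instant, no timezone attached

utc = datetime.fromtimestamp(ts, tz=timezone.utc)
tokyo = datetime.fromtimestamp(ts, tz=ZoneInfo("Asia/Tokyo"))

# Always label the zone you're displaying
print(utc.strftime("%Y-%m-%d %H:%M %Z"))    # 2024-04-25 12:00 UTC
print(tokyo.strftime("%Y-%m-%d %H:%M %Z"))  # 2024-04-25 21:00 JST
```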

Use our Unix Timestamp Converter to quickly translate between timestamps and human-readable dates in any timezone.
