What is a Unix Timestamp?
A Unix timestamp (also called Epoch time) is the number of seconds elapsed since January 1, 1970 at 00:00:00 UTC, not counting leap seconds. It's the de facto universal time reference used in databases, APIs, log files, and programming languages worldwide.
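In JavaScript, for instance, converting between timestamps and dates takes one line each way (a minimal sketch; the value 1700000000 is just an illustrative timestamp):

```javascript
// Current Unix timestamp in seconds (Date.now() returns milliseconds).
const nowSeconds = Math.floor(Date.now() / 1000);

// Convert a known timestamp (in seconds) back to a human-readable UTC date.
const ts = 1700000000;
const date = new Date(ts * 1000); // Date expects milliseconds
console.log(date.toISOString());  // "2023-11-14T22:13:20.000Z"
```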
1. Why does 1970-01-01 matter?
It's largely arbitrary: early Unix engineers picked January 1, 1970 as a convenient, recent round date for the epoch. All Unix-like systems count time from this point.
2. What's the difference between seconds and milliseconds timestamps?
Most Unix timestamps are in seconds, but JavaScript's Date.now() returns milliseconds. The tool auto-detects which you're using from the magnitude of the value: millisecond timestamps for contemporary dates are roughly 1,000 times larger (13 digits instead of 10).
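A magnitude-based detection heuristic can be sketched like this (the cutoff 1e11 is an assumption for illustration, not necessarily the tool's actual threshold):

```javascript
// Heuristic: a second-precision timestamp won't reach 1e11 until roughly
// the year 5138, while any millisecond timestamp after 1973 exceeds it.
// So values at or above the threshold are treated as milliseconds.
function normalizeToSeconds(ts) {
  const MS_THRESHOLD = 1e11; // assumed cutoff for this sketch
  return ts >= MS_THRESHOLD ? Math.floor(ts / 1000) : ts;
}

console.log(normalizeToSeconds(1700000000));    // seconds, returned as-is
console.log(normalizeToSeconds(1700000000000)); // milliseconds, divided down
```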
3. What is the Year 2038 problem?
Many 32-bit systems store timestamps as signed 32-bit integers, which overflow at 03:14:07 UTC on January 19, 2038 (2^31 − 1 seconds after the epoch). 64-bit timestamps push the limit out by billions of years.
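You can see the exact overflow moment by feeding the signed 32-bit limit into JavaScript's Date (which itself uses 64-bit floating point and is unaffected by the 2038 limit):

```javascript
// The largest value a signed 32-bit integer can hold is 2^31 - 1.
const INT32_MAX = 2 ** 31 - 1; // 2147483647

// Interpreted as a Unix timestamp in seconds, that is the overflow moment.
console.log(new Date(INT32_MAX * 1000).toISOString());
// "2038-01-19T03:14:07.000Z"
```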
4. Why does my timestamp show the wrong timezone?
We always display UTC for consistency. To see your local timezone instead, tick the 'local' checkbox or convert the UTC result yourself.
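Converting the UTC result yourself is straightforward in JavaScript, since a Date represents an absolute instant and only the formatting differs (a sketch; the timezone name is just an example):

```javascript
const ts = 1700000000;
const d = new Date(ts * 1000);

// UTC rendering.
console.log(d.toISOString()); // "2023-11-14T22:13:20.000Z"

// Local rendering: the same instant, formatted in the runtime's timezone.
console.log(d.toString());

// Or pick an explicit zone with the Intl API:
console.log(d.toLocaleString("en-US", { timeZone: "America/New_York" }));
```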
5. Is this timestamp generator accurate?
Yes. It reads the browser's native Date object, which follows your operating system clock; on a machine that syncs its clock (e.g. via NTP), this is typically accurate to within a few milliseconds.