Hal Fulton wrote:
> Strangely the math isn't making sense to me at the moment.
> 
> A billion seconds should be very roughly thirty years.
> That's 10**9 or roughly 2**27. The epoch is 68 years long.
> Surely we're not storing it in 28 bits. Someone show me
> where I'm being stupid.

Sorry to reply to myself. Of course, a billion ~= 2**30, not 2**27,
since (2**10)**3 = 2**30.

But then one more factor of two should be sufficient to store
the date: 2**31 seconds is roughly 68 years, so 31 bits cover
the whole epoch forward from 1970.
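
A quick back-of-the-envelope check in Ruby (just my own rough
arithmetic, using 365.25-day years):

  seconds_per_year = 365.25 * 24 * 3600   # ~31_557_600
  10**9 / seconds_per_year                # => ~31.7 years in a billion seconds
  2**31 / seconds_per_year                # => ~68.05 years in 31 bits' worth of seconds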

The answer apparently is that Unix time is signed (was that
always the case?), so it can also run backwards ~68 years from
1970. In theory, then, we should be able to represent dates
roughly back to the end of 1901, and checking in irb, I see
that this works.
Hmm, did I get this wrong in TRW ch 2?
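
Roughly what the check looks like in irb, at the 32-bit signed
limits (times quoted from memory; the display format, and whether
negative times work at all, will vary by Ruby version and platform):

  Time.at(2**31 - 1).utc   # => 2038-01-19 03:14:07 UTC (largest signed 32-bit time)
  Time.at(-2**31).utc      # => 1901-12-13 20:45:52 UTC (smallest signed 32-bit time)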


Hal