Felipe Contreras wrote in post #1144885:
>
>> If you provide both %s and %z, you provide *too much* information.
>
> But we already have that information!
>
> Here, I'll put it yet one more time:
>
>   DateTime.parse("1970-01-01 01:00 +0100").strftime("%s %z")
>   => "0 +0100"
>
> If we follow your rationale, that should return "0 +0000", or maybe even
> fail.
>
> But we don't do that. Why? Because if we *already* have the timezone
> information, it doesn't hurt to simply display it.
>

I don't think this is what happens at all. If you do

   t = DateTime.parse("1970-01-01 01:00 +0100")

you'll get a DateTime object complete with a timezone, because that is
what parse does. But if you then call

   t.strftime('%s %z')

you do not get some sort of "timestring" back. What you actually do is
ask it to build a string in the following manner:

   '%s %z' => '0' + ' ' + '+0100'
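
To see that '%s' and '%z' really are formatted independently, compare
two DateTime objects that denote the same instant but carry different
offsets (a minimal check, nothing more):

   require 'date'

   utc   = DateTime.parse("1970-01-01 00:00 +0000")
   paris = DateTime.parse("1970-01-01 01:00 +0100")

   utc.strftime('%s %z')    # => "0 +0000"
   paris.strftime('%s %z')  # => "0 +0100"

Same instant, so the same '%s'; only the stored offset differs, and
'%z' simply prints whatever offset the object happens to carry.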


And now that I think about it:

From https://bugs.ruby-lang.org/issues/9794:
>Time.strptime() works correctly:
>
>  Time.strptime('0 +0100', '%s %z').strftime('%s %z')
>  => "0 +0100"

It indeed works perfectly fine, just not the way you think:

   Time.strptime('0 +0200', '%s %z').strftime('%s %z')
   => "0 +0100"

You ask Time to infer a time from a string and give it *hints* as to how
to interpret what it finds. But since you say the first thing in your
string should be the number of seconds since the epoch ('%s'), it stops
right there at the first space. It already has sufficient data to infer
a time, so the timezone in the string never gets parsed. But I guess
all Time and DateTime objects get a timezone by default, so the local
timezone is assigned, and that is what gets printed if you ask strftime
for it.
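
You can make that visible with a small check (the offset you see
depends on your local zone, and later Ruby versions may behave
differently here):

   require 'time'

   a = Time.strptime('0 +0200', '%s %z')
   b = Time.strptime('0 -0500', '%s %z')

   a.to_i   # => 0 -- '%s' alone already fixes the instant
   b.to_i   # => 0
   a.utc_offset == b.utc_offset
   # => true on the Ruby described here: both objects carry the local
   #    offset, so the '+0200' and '-0500' hints never mattered.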

Cordially,

daemor