Except for my desktop computers and mobile phones, all my networked devices live in the UTC timezone (sometimes incorrectly referred to as GMT).
First, the most obvious reason is that my servers and devices live in two very different locations. Most of them are in the USA, but a few still remain in Croatia (yep, I have a transcontinental offsite backup). For anything that needs time sync, I would need to calculate the time difference manually. And not just once – thanks to the different daylight saving schedules, the offset between the two changes four times throughout the year. With multiple devices around, mistakes are practically assured.
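Those yearly offset changes are easy to see with Python's standard-library `zoneinfo`. The two zones below are just illustrative picks on my part (the exact cities aren't the point):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

# Example zones, one per country (chosen for illustration only).
us = ZoneInfo("America/New_York")
hr = ZoneInfo("Europe/Zagreb")

# The US switches DST in March/November, the EU in March/October,
# so the gap between the two zones changes four times a year.
for month, day in [(1, 15), (3, 20), (7, 15), (11, 1)]:
    t = datetime(2024, month, day, 12, tzinfo=timezone.utc)
    gap = t.astimezone(hr).utcoffset() - t.astimezone(us).utcoffset()
    print(t.date(), gap)
```

Running it shows the gap flipping back and forth between values during the mismatched weeks in spring and autumn – exactly the kind of thing you don't want to track in your head.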
However, I would use UTC even with all devices in the same location, and the reason is the aforementioned daylight saving time. Before I switched to UTC, every year after daylight saving started or ended I would have a one-hour difference on something. Bigger devices (e.g. my NAS) would usually switch time automatically, but smaller IoT devices would not.
Since my network has centralized logging, I can be sure that some devices will be one hour off at any given time. And I am sure to notice this only when I actually need the logs, forcing me to add mental calculations to an already annoying troubleshooting task. Even if I remember to reconfigure a device, I can be sure the damn daylight saving screws it up again later.
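For anything that writes its own logs, the same idea can be applied at the application level by timestamping entries in UTC. A minimal sketch using Python's stdlib `logging` (the timestamp format here is my own choice, not anything specific from my setup):

```python
import logging
import time

# Make the logging module render timestamps with gmtime (UTC) instead of
# localtime, so entries from every device sort consistently regardless of
# each one's DST state.
logging.Formatter.converter = time.gmtime
logging.basicConfig(
    format="%(asctime)sZ %(levelname)s %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%S",
    level=logging.INFO,
)

logging.info("health check ok")  # the printed timestamp is UTC
```

The trailing `Z` in the format string marks the timestamps as UTC, so nobody reading the merged logs later has to guess which zone a device was in.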
And yes, in the grand scheme of things it might not be strictly necessary for all my servers and devices to share the same time. But UTC makes it easy enough, and adjusting to it is reasonably painless.
If you have the same issues, jump in – you won't be sorry.
PS: The only downside is that my server sends me its e-mail health report at a different time depending on whether it is winter or summer.
PPS: Why the heck do we still use daylight saving time?