It was back in 1675 that Greenwich Mean Time (GMT) was first established to assist mariners in determining longitude at sea. Less than 200 years later, countries around the world had begun to base their own standard times on offsets from GMT. Sadly, digital photography has yet to catch on to this trend, and continues to treat time as something of a local phenomenon.
One of the advantages of digital photography over film is that most modern digital cameras support EXIF tagging, a system of automatically recording the camera settings within each image at the time of capture. This typically includes the date and time the picture was taken, exposure settings, camera and lens details, and other pertinent details about the circumstances under which each shot was taken. With some cameras, your precise position on the planet can be recorded too, so you don't even have to remember where each picture was shot. Most photo management software, such as Adobe Lightroom and Aperture, not to mention photo sharing sites such as Flickr and Picasaweb, can extract this information and use it to help present or organise your pictures in meaningful ways. Even the simple act of importing pictures onto your desktop makes use of the EXIF data – the date and time associated with each picture file on your hard drive is invariably synced to the capture time rather than the time it was imported.
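To see why the zone problem is so deeply baked in, it helps to look at what the camera actually writes. EXIF stores the capture time as a bare string with no offset field at all; a minimal Python sketch (using a made-up tag value) illustrates:

```python
from datetime import datetime

# EXIF's DateTimeOriginal tag is a plain "YYYY:MM:DD HH:MM:SS" string.
# Crucially, the format has no field for a UTC offset or zone name.
raw = "2011:01:15 12:00:00"  # hypothetical tag value

captured = datetime.strptime(raw, "%Y:%m:%d %H:%M:%S")
print(captured.tzinfo)  # None -- the timestamp is "naive"
```

Every program that reads the tag is left to guess which 12:00, of the world's two dozen or so, this one was.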
However, EXIF data in its current incarnation suffers from what I can only imagine was a very unfortunate and permanent oversight, one which threatens to undermine much of its potential usefulness: it does not support time zones, and nobody seems to be in any hurry to accommodate this either. During a recent trip to Europe, I captured a lot of images using a Canon EOS 5D Mark II (still a high-end camera) and an Apple iPhone 4. The time on the 5D had to be adjusted manually in much the same way you would an analogue wristwatch, but the iPhone already had time zone support – I only needed to specify the city I was in and it would make all the necessary adjustments. Despite this difference, when used as cameras, both performed exactly the same in terms of time zone accommodation: not at all.
Here's a semi-hypothetical scenario to illustrate the problem (precise times have been simplified): having gone to the UK and set both camera and phone to UK time, I took pictures with both at 12:00 (midday). The EXIF data records the images as having been taken at 12:00, which is 21:00 in Japan thanks to the 9-hour time difference. On returning to Japan, setting the camera and phone clocks back to Japanese time, and importing the pictures onto my computer, it now shows the capture time as... 12:00 midday. Local time. Japan. A whole 9 hours before I even took them. I upload them to Flickr, and that agrees they were taken at "12:00 JST".
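The nine-hour jump is easy to reproduce. This sketch uses fixed UTC offsets for GMT and JST rather than full zone rules (real zones are messier, but these offsets match the winter scenario above):

```python
from datetime import datetime, timedelta, timezone

GMT = timezone.utc
JST = timezone(timedelta(hours=9), "JST")

# The naive timestamp the camera wrote in the UK.
naive = datetime(2011, 1, 15, 12, 0)

# Importing software in Japan has no zone to go on, so it assumes
# local time -- but the shot was actually taken at 12:00 GMT.
assumed = naive.replace(tzinfo=JST)  # what the software displays
actual = naive.replace(tzinfo=GMT)   # what really happened

print(actual - assumed)  # 9:00:00 -- the capture "moves" nine hours
```

The same naive string yields two instants nine hours apart, and nothing in the file says which is right.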
Now, I appreciate that both Lightroom and Flickr (both of which I use frequently) have systems allowing you to manually shift capture times by a set number of hours to accommodate different time zones, but that's a far cry from basing the times on a global standard. It still works on the assumption that time cannot be absolute (despite whatever the wise King Charles II acknowledged in 1675), that local time is everything, and it renders meaningless the whole process of adjusting your camera's clock in the first place. And if Flickr is going to go to the bother of displaying a time zone with each picture, it should at least make sure it's the correct time zone, and have a system for adjusting it – show the capture time as 12:00 GMT, or adjust it to local time as 21:00 JST, but don't try to do both at the same time.
The iPhone's GPS throws another spanner of bafflement into the equation. On November 22nd last year, Apple released the iOS 4.2 update, and one of the undocumented (and possibly unintended) changes was to the way global position data (geotags) is stored in photos. GPS satellites transmit a time signal in UTC (GMT for all intents and purposes) from which the position is calculated, and that time signal is then recorded in the EXIF data together with the longitude, latitude and altitude. Until the iOS 4.2 update, the time recorded was localised to the current time zone. Now, it's recorded as transmitted, in UTC. So all subsequent iPhone photos have both a local capture time AND a global capture time recorded, which would be very useful if anybody ever wanted to make use of this information. Again, Lightroom, Aperture, Flickr, Picasaweb – none of these is interested in global time even when it IS available. Local time is everything.
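Ironically, a post-4.2 photo carries everything needed to resolve the ambiguity: subtracting the UTC GPS stamp from the naive local stamp yields exactly the zone offset that EXIF never recorded. A sketch, assuming hypothetical tag values from a photo taken in Tokyo:

```python
from datetime import datetime, timedelta

FMT = "%Y:%m:%d %H:%M:%S"

# Hypothetical tag values: DateTimeOriginal is local and naive, while
# GPSDateStamp + GPSTimeStamp together give the same instant in UTC.
local_naive = datetime.strptime("2011:01:15 12:00:07", FMT)
gps_utc = datetime.strptime("2011:01:15 03:00:00", FMT)

# Round to whole minutes, since the camera clock typically drifts a
# few seconds from GPS time.
raw = local_naive - gps_utc
offset = timedelta(minutes=round(raw.total_seconds() / 60))
print(offset)  # 9:00:00 -- the camera was set to UTC+9
```

Any of the big photo management packages could do this arithmetic on import; none of them bothers.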
But wait. It becomes yet more baffling, as the way the iPhone records capture time now varies between photos and videos: photos get a local time stamp while video gets a global time stamp. So a video shot at 12:00 in the UK looks very different to the picture I captured moments earlier when brought onto my desktop in Tokyo – the picture file shows the time as 12:00 midday (JST), but the video file shows 21:01. It's like they're not even trying to make sense.
It's just as well that nobody ever really travels that far from home these days, nor shows their digital photos to anyone further away than the next town, because otherwise this failure to progress past the mid-19th century would start to prove extremely inconvenient, and both the camera and photo management software giants might have to start looking beyond their own back yards for a change.