gfish
Tuesday, May 31st, 2011 12:06 pm
When I was researching clock design extensively last year, I had an odd linguistic revelation. All my life I'd been saying a clock was "fast" or "slow" to indicate it was ahead of or behind the correct time. This purely meant a fixed offset -- a clock that was 5 minutes fast would still be 5 minutes fast next week. In fact, if a clock really WAS fast, getting further and further ahead, I'd have to specify that explicitly. It was a sloppy use of "fast", but, eh, natural language is full of that. I never really thought about it.

It wasn't until getting into the history of clocks that I realized this hadn't always been an idiosyncratic phrasing. It used to be quite literal. Clocks used to suck, even very expensive ones. If your clock was fast, it would be noticeably gaining time day after day. We live in an era of incredibly cheap, incredibly accurate clocks. Even the cheapest quartz wristwatch might only gain a few seconds over an entire year, a level of accuracy that used to be found only in the best observatory regulators. So while the immediate observation is the same (a clock is 5 minutes ahead), the underlying condition is completely different. It's like seeing a plane take off over a continental plate boundary and assuming the plane and the plate have the same velocity because, at that moment, they're the same distance away from you.

We take it so for granted that clocks are (for >99.99% of human activity) perfectly accurate that we have taken the indefinite integral of a common figure of speech without even realizing it. "Fast" used to describe the rate of a clock's error; now it describes the accumulated offset, which is the integral of that rate. That's pretty cool. I can't think of any other examples of that happening.
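To put numbers on that, here's a minimal sketch of the two senses (my own illustration, not from the original post; the helper name and the 30-seconds-a-day drift figure are made up):

    # Hypothetical illustration: the total error a clock shows is its fixed
    # offset plus its rate error integrated over the elapsed time.
    def total_error_seconds(days, offset_s, gain_per_day_s):
        return offset_s + gain_per_day_s * days

    # Modern sense of "fast": a pure 5-minute offset, no drift.
    print(total_error_seconds(7, 300, 0))   # 300 s -- still 5 minutes fast next week
    # Old sense of "fast": the same offset, but also gaining 30 s/day (an assumed figure).
    print(total_error_seconds(7, 300, 30))  # 510 s -- 8.5 minutes fast next week

Same snapshot today (5 minutes ahead), completely different condition a week later.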