History of Horology: From Sundials to Atomic Clocks
Mankind has always been preoccupied with measuring and recording the passage of time. Timekeeping has been essential to the development of civilisations, from knowing when to plant or harvest crops to marking important events in the year.
Time has historically been measured in relation to the movement of the Earth: a day is one rotation of the planet on its axis, while a year is one complete orbit of the Sun. Calendars were developed as far back as 20,000 years ago, when hunter-gatherers scratched lines and gouged holes in sticks and bones, possibly to count the days between phases of the moon.
Civilisations from the Ancient Egyptians to the Roman Empire used differing methods to determine the day of the year. However, measuring time as it passed throughout the day had always proved difficult for early mankind. Sundials were perhaps the first timepieces, and their origins can be traced back over five thousand years, when obelisks were built, possibly to allow the telling of time by the shadows they cast.
However, the time told on a sundial was based on the position of the sun in the sky, which varies with the seasons, and a sundial is of course useless on cloudy days or at night. Other methods such as water clocks or the hourglass served only as crude timers. Telling the time of day was difficult, and people relied on comparisons as time references, such as: “as long as it would take a man to walk a quarter mile.”
People were reliant on these methods, and others such as bell ringing to mark important moments, until the 14th century, when mechanical clocks first appeared. These were driven by weights and regulated by a verge-and-foliot escapement (a mechanism that advances the gear train at regular intervals, or ‘ticks’). Such clocks were far more dependable than sundials or other methods, allowing accurate and reliable telling of the time of day for the first time in human history.
The next step forward in horology came in the 17th century, when the pendulum was developed to help clocks maintain their accuracy. Clock making soon became widespread, but it was not for another three hundred years that the next revolutionary step in horology would take place: the development of electronic clocks. These rely on the vibration of a crystal (usually quartz) to create an electric signal with a highly precise frequency.
While electronic clocks were far more accurate than mechanical clocks, it wasn’t until the development of atomic clocks, around fifty years ago, that modern technologies such as communication satellites, GPS and global computer networks became possible.
Most atomic clocks use the resonance of the caesium-133 atom, which oscillates at exactly 9,192,631,770 cycles per second. Since 1967 the International System of Units (SI) has defined the second as the duration of that number of cycles of this radiation, which makes atomic clocks (sometimes called caesium oscillators) the standard for time measurement.
Atomic clocks are accurate to within 2 nanoseconds per day, which equates to about one second in 1.4 million years. Because of this accuracy, a universal time scale, UTC (Coordinated Universal Time, or Temps Universel Coordonné), has been developed. It maintains a continuous and stable time scale and supports features such as leap seconds, which are added to compensate for the slowing of the Earth’s rotation.
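Those figures are easy to check with a little arithmetic. The short Python sketch below, using only the numbers quoted above, derives the duration of a single caesium oscillation from the SI definition and converts a 2-nanosecond daily drift into the roughly 1.4-million-year figure.

```python
# Arithmetic behind the figures above (values taken from the text).

CAESIUM_HZ = 9_192_631_770   # caesium-133 cycles per second (SI definition of the second)
DRIFT_PER_DAY = 2e-9         # clock error in seconds accumulated per day

# Duration of a single caesium-133 oscillation, in seconds.
period = 1 / CAESIUM_HZ
print(f"One caesium cycle lasts about {period:.3e} s")  # ~1.088e-10 s

# Days (then years) needed for a 2 ns/day drift to add up to one full second.
days_to_one_second = 1 / DRIFT_PER_DAY
years_to_one_second = days_to_one_second / 365.25
print(f"One second of error after about {years_to_one_second:,.0f} years")  # ~1.4 million
```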
Atomic clocks are, however, extremely expensive and are generally found only in large-scale physics laboratories. Fortunately, NTP (Network Time Protocol) servers, the standard means of achieving time synchronisation on computer networks, can synchronise a network to an atomic clock by using either the Global Positioning System (GPS) or specialist radio transmissions.
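To illustrate NTP at its simplest, the sketch below sends a minimal SNTP (Simple Network Time Protocol) request to a public time server and reads back the server’s transmit timestamp. The server name is just an example, and real deployments would use a full NTP client with polling and drift correction rather than this bare-bones one-shot query.

```python
import socket
import struct
import time

NTP_SERVER = "pool.ntp.org"    # example public server; any NTP server would do
NTP_PORT = 123
NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 (NTP epoch) and 1970-01-01 (Unix epoch)

def sntp_time(server=NTP_SERVER):
    # Minimal 48-byte SNTP request: first byte 0x1B = LI 0, version 3, mode 3 (client).
    packet = b"\x1b" + 47 * b"\0"
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(5)
        sock.sendto(packet, (server, NTP_PORT))
        data, _ = sock.recvfrom(48)
    # The transmit timestamp's integer seconds live in bytes 40-43 of the reply.
    seconds = struct.unpack("!I", data[40:44])[0]
    return seconds - NTP_EPOCH_OFFSET  # convert NTP time to Unix time

if __name__ == "__main__":
    server_time = sntp_time()
    print("NTP server time:", time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(server_time)))
    print("Local clock time:", time.strftime("%Y-%m-%d %H:%M:%S", time.gmtime(time.time())))
```

Comparing the two printed times shows, at one-second resolution, how far the local clock has drifted from the UTC-referenced server.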
The development of atomic clocks, GPS and NTP time servers has been vital for modern technologies, allowing computer networks all over the world to be synchronised to UTC.