A Brief History of Telling Time

From sundials to atomic clocks, a journey through the way humans have measured time.

Credit: fintbo/Flickr CC BY-NC-ND 2.0

We live in a world where time is all-important. Nanoseconds mark the difference between the success and failure of an electronic transaction, and we are continually reminded of “the time”: of being early or late, of having missed an appointment or of arriving “before time”. Time now governs our lives.

In his bestseller, A Brief History of Time, physicist Stephen Hawking reminded us that: “The increase of disorder or entropy is what distinguishes the past from the future, giving a direction to time.”

There is no evidence that we can move backwards in time or that “time tourists” from the future are with us. But the arrow of time does carry us forward, and humans have measured this time through the ages in different ways.

Sundials and water clocks

We will never know who first tried to give structure to the measurement of time, although in the Bible the book of Genesis describes change on a day-by-day basis, marked by evening and morning. The ancient Egyptians used simple sundials and divided days into smaller parts; it has been suggested that as early as 1500 BC they divided the interval between sunrise and sunset into 12 parts.

An ancient Egyptian sundial. Credit: University of Basel/The Conversation

Our familiar divisions of time are more recent, and current terminology about time and timekeeping originated with the Babylonians and the Jews (the seven-day week in Genesis). The ancient Romans, during the republic, used an eight-day week that included a market day on which people would buy and sell goods. When the Roman emperor Constantine made Christianity the state religion early in the 4th century AD, the seven-day week was officially adopted.

The sundial (effective, of course, only when the sun shines) was refined by the Greeks and taken further by the Romans a few centuries later. The Romans also used water clocks, which they calibrated against a sundial, so they could measure time even when the sun was not shining, at night or on foggy days. Known as a clepsydra, this device uses a flow of water to measure time: typically a container is filled with water, which drains slowly and evenly out of it, and markings show the passage of time.

But the changing length of the day with the seasons made time measurement in the Roman world much more fluid than it is today: hours were originally daytime hours, calculated by dividing the period of daylight into equal parts, so their length varied through the year. The water clock made it possible to measure time in a simple and reasonably reliable way.
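As a rough illustration (the daylight figures here are illustrative assumptions, not values from the article), dividing the daylight period into 12 equal parts shows how the length of such a seasonal hour drifts with the time of year:

```python
# A minimal sketch of the "seasonal hour": daylight divided into 12 equal parts.
# The daylight durations are illustrative values for a mid-latitude site,
# not figures taken from the article.

def seasonal_hour_minutes(daylight_hours: float) -> float:
    """Length, in modern minutes, of one twelfth of the daylight period."""
    return daylight_hours * 60 / 12

for season, daylight in [("winter solstice", 9.0), ("equinox", 12.0), ("summer solstice", 15.0)]:
    print(f"{season}: a daytime 'hour' lasts about {seasonal_hour_minutes(daylight):.0f} minutes")
```

At the equinox the seasonal hour matches the modern 60-minute hour; in midwinter and midsummer it is noticeably shorter or longer.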

Clocks come of age

The better measurement of time has fascinated humans for centuries, but in the 18th century the clock emerged as a scientific instrument in its own right, beyond its conventional role of marking the passing of the hours.

The pendulum clock owes its refinement to Galileo, who noticed the regularity of a suspended lamp swinging back and forth in the cathedral of Pisa while he was still a student there.

The high water mark of an instrument of measuring time that was both perfectly fit for purpose and elegant was the marine chronometer invented by John Harrison in England. It was a response to the need to measure time on board ship to a high level of precision, and so to be able to determine longitude (the pendulum clock was unsuitable for marine use due to the motion of the ship).

Harrison’s device drew on his brilliance in design and knowledge of the best materials. His clock enabled the measurement of time, and so a position at sea, to high accuracy. It gave the Royal Navy an unprecedented tool for navigation.
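The navigational principle is simple, even if Harrison's engineering was not: the Earth turns through 15 degrees of longitude each hour, so comparing local solar time (found from the sun) with the time kept by a chronometer set to a reference meridian gives the ship's longitude. Here is a minimal sketch of that arithmetic, with illustrative numbers rather than anything from Harrison's records:

```python
# Minimal sketch of longitude from time difference (an illustration of the principle,
# not Harrison's actual procedure): the Earth rotates 15 degrees of longitude per hour.

DEGREES_PER_HOUR = 360 / 24  # 15 degrees

def longitude_west_degrees(reference_time_h: float, local_solar_time_h: float) -> float:
    """Degrees of longitude west of the reference meridian (negative means east)."""
    return (reference_time_h - local_solar_time_h) * DEGREES_PER_HOUR

# Example: local noon is observed while the chronometer, still keeping reference-meridian
# time, reads 15:00 -- so the ship is 45 degrees west of that meridian.
print(longitude_west_degrees(15.0, 12.0))  # 45.0
```

An error of even a few minutes in the clock therefore translates into a position error of many miles, which is why the chronometer's precision mattered so much.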

The work of 20th-century watch and clock makers continued that tradition – the skill of George Daniels in Britain in creating some of the best and most beautiful timepieces using traditional and hand-crafted methods can be seen in the permanent exhibition now at the Science Museum in London.

Atoms and lasers

Measuring time changed again in the 20th century with the development of the atomic clock at the National Physical Laboratory in the 1950s. This allowed for a new and better definition of time, with the second as its prime measure.

The invention of the laser in 1960 changed time measurement for ever. Lasers can produce pulses lasting only a few attoseconds – 10⁻¹⁸ seconds – and the accuracy of international time measurement must reflect this.

Today the second is no longer defined as the fraction of the day – 1/86,400 – that we might have expected. Instead, it is defined through an atomic frequency, formally known as the “caesium standard”: the exact number of “cycles” of radiation – 9,192,631,770 – associated with the transition of a caesium-133 atom from one energy state to another.
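As a quick check on the numbers involved (a simple sketch of the arithmetic, not part of the formal definition), the old day-based second and the caesium frequency relate as follows:

```python
# Minimal sketch of the arithmetic behind the two definitions of the second mentioned above.

CAESIUM_CYCLES_PER_SECOND = 9_192_631_770  # defined number of caesium-133 radiation cycles per second
SECONDS_PER_DAY = 24 * 60 * 60             # 86,400 -- the fraction of a day the second was once tied to

print(SECONDS_PER_DAY)                              # 86400
print(CAESIUM_CYCLES_PER_SECOND * SECONDS_PER_DAY)  # caesium cycles in a nominal 86,400-second day
print(1 / CAESIUM_CYCLES_PER_SECOND)                # ~1.09e-10 s per cycle, still far longer than an attosecond
```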

Time has moved away from terrestrial measurement to one that could, in principle, be carried out on another planet or anywhere in the universe. The accuracy of this atomic time continues to be refined through research, and the National Physical Laboratory in the UK remains a world-leading presence in this work.

And the future? To quote Hawking again: “Only time (whatever that may be) will tell.” We know it will involve the continuing work of scientists to increase the accuracy with which we measure time, as our lives become ever more ruled by time, its measurement and how it dictates what we do and when we do it.

Kenneth Grattan is George Daniels Professor of Scientific Instrumentation at City University London.

This article was originally published on The Conversation. Read the original article.