How did watches evolve into wristwatches?
Buffy Acacia

If you take the short version of the story at face value, here’s how it sounds: “In ye olden days, men wore pocket watches. Then, some clever watchmakers made movements small enough to fit into women’s jewellery. When World War I erupted, the men who fought realised that wearing them on their wrists was far more practical. From that point onwards, the wristwatch industry became what it is today.”
As with all oversimplifications, this contains a speck of truth but loses the nuance of hundreds of years of history. It skips over the trailblazers who actually did wear wristwatches first, and unfairly diminishes those earlier watches made for women. It also overlooks the importance of mass production and the development of new technologies, without which we’d never have progressed past bespoke watches for Renaissance royalty.
The oldest known wristwatches
One of the first mental reframings we need is the acknowledgement that the wristwatch wasn’t “invented” by anyone at all. Sure, there must have been a first time, but watches were put absolutely everywhere as soon as the parts were small enough. Mechanical clocks emerged in Europe as a replacement for water clocks over the 13th and 14th centuries, building on centuries of innovation such as the miniaturised automata of Ismail al-Jazari of Mesopotamia during the Islamic Golden Age, and portable clocks existed (although still rare and exclusively for the elite) by the 1450s. By the late 16th and 17th centuries, they were widespread.
As soon as clocks were small enough to wear, they were worn in every way imaginable: hung from necklaces or belts, set into hand-held mirrors, and even mounted in something as small as a ring. Nor were they always simple, time-only movements, with many featuring some kind of repeating complication. The first written record of something most likely to be a wristwatch is an “armlet” gifted to Queen Elizabeth I by Robert Dudley, Earl of Leicester, in 1571. Still, the miniature clocks of those days were not considered instruments of high precision. As technically complex as they could be, even serving as astronomical tools, most people still used sundials to set their clocks until the invention of highly accurate pendulum clocks in the 1650s.
Because watches weren’t expected to be wholly accurate, wearing them on the wrist for the convenience of checking the time wasn’t a priority for many, and especially not for the royalty and aristocracy who were actually wealthy enough to own them. It’s not as if they’d ever be running late for work or worrying about parking meters. Even though wristwatches existed and were actively developed for centuries, it wasn’t until the 1800s that cultural and technological circumstances led to their proliferation.
The Industrial Revolution as a catalyst
So, what changed between the 18th and 19th centuries? The Industrial Revolution happened, for starters. Workshops became factories, peasants became the workforce, and small towns that had barely left the Middle Ages were suddenly expanding into cities. Industrial mining led to gold rushes and discoveries of precious minerals, whose deposits injected wealth into corporations and birthed a new upper class, entirely separate from monarchies and aristocrats. The availability of steel helped railroads spread throughout the whole world, allowing people, culture, and technology to move more freely than ever before.
All of this change made watches available to a much larger percentage of the population, and that’s why pocket watches became so popular. Movements made from machined brass could be fabricated relatively quickly on an American-style assembly line, and on an average wage you could even buy yourself a watch that looked like solid gold thanks to electroplating. But while railroad networks allowed such commercialisation to take place across entire continents, the speed and efficiency of trains were fundamentally at odds with the world’s whole relationship with time.
How trains changed our perception of time
If you were to start walking westward at sunrise and follow the path of the sun until sunset, you would extend the length of your day by approximately one minute. Horse-drawn carriages move at roughly the same speed as a person on foot, if not slightly slower depending on the journey. Long before ordinary citizens owned personal watches, medieval towns would set their central clock towers by solar measurements, so there was naturally a time difference between towns reflecting their longitude. For the average watch owner travelling between towns in the 1700s and 1800s, that difference would have been completely negligible, and probably eclipsed by the daily inaccuracy of their pocket watch.
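The arithmetic behind those local time differences is worth making explicit: the Earth rotates 360 degrees in 24 hours, so local solar time shifts by four minutes for every degree of longitude. A minimal sketch of the calculation (the Bristol/London longitudes are approximate, and the function name is our own):

```python
# The Earth turns 360 degrees of longitude in 24 hours:
# 24 * 60 minutes / 360 degrees = 4 minutes of solar time per degree.
MINUTES_PER_DEGREE = 24 * 60 / 360  # = 4.0

def solar_time_offset_minutes(lon_a: float, lon_b: float) -> float:
    """Difference in local solar time (minutes) between two longitudes.

    Negative result means location A's solar noon arrives later
    (i.e. A lies west of B). East longitudes are positive.
    """
    return (lon_a - lon_b) * MINUTES_PER_DEGREE

# Example: Bristol sits roughly 2.6 degrees west of London, so its
# local solar noon falls about 10 minutes behind London's -- which is
# why early British railways had to impose a single "railway time".
print(solar_time_offset_minutes(-2.6, 0.0))  # about -10.4 minutes
```

At walking or carriage speeds you cross a fraction of a degree per day, so the drift stays within a minute or two; a train crossing several degrees in an afternoon makes the discrepancy impossible to ignore.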
Trains, which usually ran at between 60 and 80 kilometres per hour, travel around 15 times faster than a walking pace. Telegraph lines, which shot up in the mid-1800s, allowed near-instant communication across whole countries. What happens when two trains running on slightly different timetables share a track between towns? They can end up crashing. Nearby towns keeping their time mere minutes apart had once been a natural occurrence, but now it was a serious liability with deadly consequences. Chronometers, which had been under development for centuries as marine navigation tools, were now needed back on land for the simple purpose of keeping accurate time. But for those highly accurate pocket watches to make a difference, the towns themselves needed to agree on a standardised time.
Time zones worked by clustering towns and cities together under a single, unified time standard. Most countries adopted Greenwich Mean Time plus or minus a certain number of hours, with some larger nations spanning multiple zones. It was the railroads, in conjunction with standardised time zones, that created the demand for accurate, mass-produced pocket watches. That isn’t to say accuracy had never mattered before, but the way we now think of time as a fixed constant, rather than something fluid and local based on the sun, is largely down to the cognitive shift the railroads demanded. Time zones were also one of the major steps towards globalisation, making it far easier for people to travel or conduct business overseas.
Wristwatches for utility rather than decoration
For the period between 1450 and 1850, wristwatches had mostly been a rare novelty enjoyed by the ruling class. A famous example is the Breguet wristwatch made for the Queen of Naples in 1810, which is commonly accepted to be the first “modern” wristwatch. The only reason they weren’t more common was that there were frankly more interesting things to do with watches from an artistic and astronomical perspective, and although the wrist was a practical place to wear a watch, the watches themselves were not yet practical tools. The railroads’ demand for accuracy spurred the development of movements resistant to shock, extreme temperatures, water and moisture ingress, and eventually magnetism, drawing on lessons from marine chronometry. Once all of that was in place, a utilitarian wristwatch was feasible for the first time.
Enter the wristlet. By the 1880s, notably several decades before World War I, both women and men were wearing watches on their wrists. The truly ornate examples for wealthy women could be tied to the wrist with ribbons, or even strapped on with strings of pearls. For most of history, men’s fashion had involved even more jewellery than women’s, but the Great Male Renunciation of the late 18th century had cut back on jewellery and colour in favour of what would become the modern lounge suit. The idea that wearing jewellery was emasculating had taken hold, even if the jewellery was functional.
Still, for those who needed them, wristlets for men were used without regard to snobbish fashion critics. Men’s wristlets were typically leather holders for smaller pocket watches, not unlike bund straps, and these were also worn by women who couldn’t afford the higher-end jewellery pieces. The leather straps didn’t just keep the watches in a convenient place, they also added to their ruggedness, making them perfect for military officers operating in the outdoors. That’s exactly how they were used by the British Hazara Expedition of 1888, a photo from which shows many if not all of the men wearing them. The Cartier Santos-Dumont was released to the public in 1911, essentially marking the beginning of wristwatches as we now know them, both practical and fashionable for any gender.
The World Wars as the turning point
As the First World War dawned, militaries across Europe were already aware of the practical superiority of wristwatches. The idea that wristwatches were invented during WWI is obviously misplaced, because we’ve now explored how they had existed for centuries. However, it was for the soldiers of WWI that wristwatches began to be mass produced. They still shared many elements with pocket watches, such as the pebble-like cases, enamel dials, and occasionally being made from precious metals, but they gained soldered wire lugs for inserting a strap, a 3 o’clock crown for ease of winding and setting with your right hand, and large, legible Arabic numerals often filled with luminous (radium) paint.
By the time WWI concluded, the wristwatch industry as we know it today had more or less been fully established. Many of the same Swiss factories that produced trench watches explored more decorative, art-informed styles during peacetime, leading to design classics such as the Jaeger-LeCoultre Reverso in 1930. World War II completely cemented the world’s reliance on wristwatches, and that’s when the Swiss truly took over from the English, French, and American establishments as the world’s premier watch nation. Of course, development never stopped, and events such as the quartz crisis also played a large role in shaping the wristwatch’s place in contemporary culture. But if you’re looking for someone to thank for conceptualising the watch on your wrist, you need to look much further back than the 20th century.