
Historical standards of length

The measurements made by surveyors in Warren in the late 19th century are impressively precise.

Until 1912, the measurement of an inch in the United States was based on the metre — one metre = 39.37 inches.

So, if you needed to know how long an inch was and you had a metre-stick, all you had to do was divide the stick into 39 equal parts plus a leftover piece just over one-third the length of one of those parts. Each full part was your inch.
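For anyone who wants to check that arithmetic, here is a minimal sketch in Python. The only input is the 39.37 figure above; the rest is plain division.

```python
# Pre-1959 U.S. relationship cited above: 1 metre = 39.37 inches,
# so one inch is 1/39.37 of a metre.
METRE_IN_INCHES = 39.37

inch_in_metres = 1 / METRE_IN_INCHES      # about 0.0254 m
marked_off = 39 * inch_in_metres          # 39 full "parts", about 0.9906 m
leftover = 1 - marked_off                 # about 0.0094 m of stick remaining

print(f"One inch is about {inch_in_metres:.4f} m")
print(f"Leftover after 39 parts: {leftover / inch_in_metres:.2f} of a part")
# The leftover works out to 0.37 of a part -- just over one-third.
```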

Inches have a rich history, but cubits are perhaps the oldest formal measure of length.

A cubit was the length from the elbow to the tip of an extended finger.

Cubits were subdivided into palms (four per cubit) and fingers (four per palm).

Most cubits were about 20 of today’s inches.

Royal cubits were longer (can’t expect the nobility to use the same units of measure as the common folk).

The forearm cubit — from elbow to wrist — was shorter by some seven inches.

Inches may have originally been based on the Roman foot. One inch was one-twelfth of a foot.

To help with consistency, the King of Scotland around 1150 declared an inch to be the breadth of a man’s thumb at the base of the nail.

Since thumbs come in different sizes, the tradition of the time for getting a better inch was to take the average thumb width of three men: one large, one medium, and one small, according to Britannica.

Surprisingly, even taking the average of three men’s thumbs was not precise enough.

In the 14th century, the inch was refined. It was defined as “three grains of barley, dry and round, placed end-to-end lengthwise,” according to Britannica.

Sounds great. Except barleycorns range from about one-sixth of an inch to nearly half an inch long.

In the 1800s, the barleycorn was the unit on which the English long measure system was based. The inch still existed, but it was defined as ‘three barleycorns.’

At least with three barleycorns to an inch there was a chance of averaging out the oddballs. If the barleycorn itself was the primary unit of measure and you happened to pick up a gigantic one, you would simply multiply that oversized barleycorn through your whole measurement. With the inch, the person holding the over-large barleycorn needed two more and might notice that one of the three looked unusual.

One of the people who decided a barleycorn made more sense as the base of a system than an inch made up of three barleycorns admitted to some problems.

“The length of the barley-corn cannot be fixed, so the inch according to this method will be uncertain,” teacher Charles Butler said in 1814, according to Academic Dictionaries and Encyclopedias (enacademic.com).

There was a formal inch measurement locked away in a vault.

In 1842, George Long wrote that the loss of that measure (A measuring stick? A barleycorn-sized rock? Ye olde official 30-year-old one-inch barleycorn?) would require taking the average of a large number of barleycorns to find a new inch, and that doing so could introduce meaningful error.

After a while, the base unit became the yard, and inches were defined as 1/36th of one of those. At least that’s easier than 1/39.37th.

The original yard was the length of a man’s belt.

Then King Henry I came along and said it was the distance from his nose to the thumb of his outstretched arm, according to factmonster.com.

The ell, a slightly longer measure, took the yard’s place for a while. Then, Queen Elizabeth decided that England would return from the French-derived ell to the traditional English yard.

Rods (five and a half yards), furlongs (40 rods, or 220 yards), miles, and leagues (three miles) will have to wait. A distance of 20,000 leagues is a lot more than the diameter of the Earth. The title refers to the distance traveled underwater, not the depth. Still, 20,000 leagues would take one around the world twice… quite a trip.
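A quick back-of-the-envelope check of that last claim, as a Python sketch. The three-miles-per-league figure comes from the paragraph above; the Earth diameter and circumference values are approximate figures assumed here for illustration.

```python
# Comparing 20,000 leagues to the size of the Earth.
MILES_PER_LEAGUE = 3                 # figure used in the paragraph above
EARTH_DIAMETER_MILES = 7_918         # approximate mean diameter (assumed)
EARTH_CIRCUMFERENCE_MILES = 24_901   # approximate equatorial circumference (assumed)

distance_miles = 20_000 * MILES_PER_LEAGUE   # 60,000 miles

print(distance_miles / EARTH_DIAMETER_MILES)       # roughly 7.6 Earth diameters
print(distance_miles / EARTH_CIRCUMFERENCE_MILES)  # roughly 2.4 trips around the world
```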
