Author Archives: Richard Romano

About Richard Romano

Richard Romano has been involved in the graphic arts since before birth. He is a writer and analyst for the graphic communications industry and a regular contributor to WhatTheyThink.com, for which he oversees the Wide-Format and Production Inkjet special topic areas. For eight years, he was the senior analyst for The Industry Measure (formerly TrendWatch Graphic Arts), until its demise in March 2008. He has also worked on consulting and market research projects for many other organizations and companies and has contributed to such magazines as Graphic Arts Monthly, GATFWorld, Printing News, and HOW; is the former executive editor of CrossMedia magazine; and is the former managing editor of Micro Publishing News and Digital Imaging magazines. As if that weren’t enough, he is also the author or coauthor of more than a half dozen books, the last three with WhatTheyThink’s Dr. Joe Webb, including Disrupting the Future, which has been translated into Japanese and Portuguese. Their most recent title is "The Home Office That Works! Make Working At Home a Success—A Guide for Entrepreneurs and Telecommuters." He has vague recollections of having graduated from Syracuse University’s Newhouse School of Public Communications in 1989, and has a 1994 certificate in Multimedia Production from New York University. He is currently in the final throes of a master’s program at the University at Buffalo, which he really does need to wrap up at some point. He lives in Saratoga Springs, NY.

Orange Was the New Red

By Richard Romano

If there is one color associated with this time of year (autumn, in case you are reading this in the future), it is orange. From the leaves that are falling, to pumpkins, to Halloween decorations, and, well, to Syracuse football, it’s an orange world. And yet, if I were writing this prior to the 16th century, I would have called it a red world.

The word “orange” was first used in English around 1300, coming from Old French orange or orenge, originally norange, which came from the Arabic naranj, which came from the Persian narang, and which ultimately came from the Sanskrit naranga-s. In all of these cases, the word referred only to the fruit and the tree the fruit came from, not the color. But how did people refer to things that were what we would call orange? (This was long before the Pantone Matching System.) Occasionally the word saffron was used, and there was also the word crog, which referred to things that were saffron-colored, ġeolurēad for a reddish orange, or ġeolucrog for yellowish orange. More often than not, though, they simply used the word red. So if you see a red deer, a red fox, or a robin redbreast, the thing is, they’re not really red, are they? And red-headed people do not have hair that we would really consider red, except in those cases where it has been specifically dyed red. It’s just that these things were named at a time before the word orange referred to a color.

According to the Oxford English Dictionary, the first recorded use of orange as a color is 1557 in Great Britain Statutes at Large:

Coloured cloth of any other colour or colours..hereafter mentioned, that is to say, scarlet, red, crimson, morrey, violet, pewke, brown, blue, black, green, yellow, blue, orange, [etc.].

The color orange has its origin in a legal document.

Wait…back up a sec. “Pewke”? Yes, that’s an obsolete spelling of puke, an archaic term that referred to a type of woolen cloth (there is a reference from 1499 to someone making “a longe gowne of pewke,” which today would be an unfortunate après-party euphemism) as well as the color of that cloth (a bluish black, by all accounts). And, actually, there is no connection between that word puke and what we typically mean today.

But there really is no need to bring up puke….

Anyway, back to orange. The word is somewhat infamous in that there is said to be no word that rhymes with it, although some have tried to force a rhyme, most notably Tom Lehrer:

Eating an orange

While making love

Makes for bizarre enj-

oyment thereof

And Charles Fox and Norman Gimbel wrote the song “Oranges Poranges” (“oranges poranges, who says? there ain’t no rhyme for oranges!”) for Sid and Marty Krofft’s TV show H.R. Pufnstuf. There is a hill in Wales called The Blorenge, but I expect that’s pretty difficult to work into a song or poem.

You may also have noticed, going back through the etymological history of the word, that somewhere along the line the word norange lost an “n.” This is because in French it would have been called un norange, and in the same way that in English “a norange” could easily become “an orange,” that’s pretty much what happened to our fruit as it made its way up through Europe. (Orange trees were originally native to Southeast Asia—northeastern India and southern China specifically—and oranges were brought to Europe by Arabs through North Africa and then Sicily, where they migrated north, arriving in Britain around the 14th century.)

Indeed, the letter “n” comes and goes rather arbitrarily, especially in English. Notice how someone who sends or relays a message is suddenly a messenger (the original word messager gained an “n” around 1300) and someone who books passage is mysteriously a passenger (passager gained its “n” around the 15th century). Why? No one really knows, except that people seemed to prefer to pronounce those words that way. (I do not know if Henry Kissinger had an ancestor named Kissiger.)

In some cases, it’s easy to figure out the origin of the “parasitic n.” A newt was originally an ewt, and a nickname was originally an ekename (“an additional name”). The letter can vanish as abruptly as it arrived. An apron was originally a napron (basically a small tablecloth, with the same root as napkin), and for baseball fans, an umpire comes from a noumper, from the Old French nonper (“not even,” as in an odd number). The idea was that a noumper was a third person to settle differences between two parties.

Sometimes letter migrations can get fairly complicated. Take the phrase humble pie. The Middle English word numbles (“edible inner parts of an animal”) became umbles, and umble pie was an actual dish made from organ meat (it was considered a low-class food). The word humble existed at the time—etymologically unrelated to umbles—although the “h” was not pronounced. “To eat humble pie,” then, was coined essentially as a pun to humble oneself by eating umble pie.

If you don’t like humble pie (or umble pie), you might prefer to eat crow. No one is entirely certain where that phrase came from, but one of the OED’s definitions of crow is “intestine or mesentery of an animal”—and thus the meaning could be virtually the same as “humble pie.” Or it could be that, since crows are scavengers feeding on carrion, eating them was perceived as being rather disgusting, and thus being forced to eat one would be a form of humiliation.

Oh, and about the word pie. There is some evidence that it derives, at least in part, from magpie, a bird that tends to collect miscellaneous objects. By the way, it is believed that the term pie fonts—aka pi fonts—also comes from the magpie, as pi font was originally 16th-century printer’s slang for random bits of type jumbled together.

One kind of pie that is pretty agreeable to a fair number of people this time of year is pumpkin pie. Pumpkin has a pretty boring etymology (Middle French pompon for melon), but pumpkins are orange. And, in fact, it is entirely possible that, had pumpkins been more popular in Europe than oranges, we would be using the word pumpkin instead of orange to refer to that color. So the TV series could indeed have been called, in an alternate reality, Pumpkin Is the New Black. And Pantone’s Hexachrome system would refer to CMYK plus green and pumpkin. Ah, what might have been.

 

References:

Sam Dean, “The Etymology of the Orange,” Bon Appetit, February 28, 2013, http://www.bonappetit.com/test-kitchen/ingredients/article/the-etymology-of-the-orange.

John Lawler, “The Data Fetishist’s Guide to Rime Coherence,” Style 40 (1&2), 2006.

“Messenger,” “Napkin,” “Newt,” “Nickname,” “Passenger,” “Pie,” “Umpire,” Online Etymology Dictionary, http://www.etymonline.com.

“Orange,” Online Etymology Dictionary, http://www.etymonline.com/index.php?term=orange&allowed_in_frame=0.

“Orange,” Oxford English Dictionary, http://www.oed.com/view/Entry/132163.

“Puke,” Oxford English Dictionary, via Word Finder, http://findwords.info/term/puke.

“Eating crow,” Wikipedia, last modified on October 25, 2015, retrieved on October 27, 2015, https://en.wikipedia.org/wiki/Eating_crow.

“Humble Pie,” Wikipedia, last modified on September 15, 2015, retrieved on October 27, 2015, https://en.wikipedia.org/wiki/Humble_pie.

“Orange (colour),” Wikipedia, last modified on October 27, 2015, retrieved on October 27, 2015, https://en.wikipedia.org/wiki/Orange_(colour).

“Orange (word),” Wikipedia, last modified on September 16, 2015, retrieved on October 27, 2015, https://en.wikipedia.org/wiki/Orange_(word).

Of Bricks and Vick’s: Not-So-First-Class Mail

By Richard Romano

For years, my bank has been trying to get me to “go paperless,” not so much to save a tree, methinks, but to save them printing and mailing costs. But, you know, there could be worse things than sending a bank statement through the mail; I could be trying to send an actual bank through the mail. Which is what W.H. Coltharp famously did.

Coltharp was a businessman in the town of Vernal, Utah, and in 1916, upon receiving permission to construct a new building in town—a portion of which would be used for a bank—he decided that he wanted to use bricks made in Salt Lake City, 120 miles away as the crow, if not the brick, flies. It being the 1910s, shipping options were decidedly limited, and to ship the 80,000 bricks (some sources say it was more like 15,000) would cost four times what the bricks themselves cost. What to do…what to do?

Three years earlier, the United States Post Office had launched its Parcel Post service for sending packages from one place to another. (Before that, the Post Office handled only letters and very small packages.) Parcel Post was an affordable way for rural folk—of which there were many in 1913—to send and receive packages, and quickly became a phenomenally useful and popular service.

It also helped Coltharp solve his problem. He would mail the bricks to himself via Parcel Post. Each package could weigh no more than 50 pounds, so Coltharp wrapped each brick individually, packaged them in crates that weighed no more than that upper limit, and sent out 40 of them at a time—basically shipping a ton of bricks each day for many days. But while Coltharp was shipping bricks, postmasters were sh***ing bricks. Since there was no direct route from Salt Lake City to Vernal, the bricks had to be sent by railroad to Colorado, transferred onto another railroad, then sent by freight wagon to Vernal. And, by postal law at the time, packages had to be handed over the counter. Oy.

Telegrammed the Vernal postmaster to Washington: “Some S.O.B. is trying to ship a whole building through the U.S. mail.”

Actually, it wasn’t the whole building, just the exterior bricks, but still… The bank, when finally constructed, was nicknamed “The Parcel Post Bank” and it still stands today. If I owned a UPS Store or FedEx Office franchise, I would make it a point to locate it in that building, just for the irony.

Needless to say, the Post Office changed its rules after Coltharp’s stunt, limiting customers to 200 pounds a day.

No doubt everyone involved in the brickscapade was feeling exceedingly sore by the end of it, although I bet a lot of people got a really good workout, and could have used some healing and soothing salve. For that, they would probably have turned to products made by the Vick’s Chemical Company, and the Post Office would later have the founder of that company to thank (and the rest of us to curse, depending upon your point of view) for something else that started arriving en masse through the mail.

Lunsford Richardson was born in Selma, N.C., in 1854. After college, he taught Latin at a local academy for a bit, and soon bought a drugstore in Selma, where he became a pharmacist. After a short time, he sold that drugstore and relocated to Greensboro, where he bought the Porter and Tate Drug Store. (The “Porter” was the uncle of William Sydney Porter, who would achieve fame as the writer O. Henry. Ironically, there is no irony in that.)

Anyway, whilst beavering away in Greensboro, Richardson began concocting various menthol-based ointments, initially for babies with croup, but eventually he compiled a portfolio of 21 ointments and salves, which he collectively called “Vick’s Family Remedies,” taking the name “Vick’s” from, it is said, his brother-in-law Dr. Joshua Vick, who helped him get started in business, as well as an advertisement for a product called Vick’s Seeds.

In 1891, Richardson compounded a menthol-based ointment initially called Vick’s Croup and Pneumonia Salve, later renamed Vick’s Magic Croup Salve, and, in 1912, at the insistence of his son Smith Richardson, who had become active in the business, Vick’s VapoRub. That was the magic formula, or at least the magic name, and after a major flu outbreak in 1918, sales of VapoRub went through the roof.

Richardson was not only a pharmacist but also a marketing dynamo, availing himself of virtually all the media channels available in the 1910s. He gave free samples to druggists, he ran coupons in newspapers, and he advertised on billboards. (In 1925, six years after Richardson’s death, the company published a children’s book about two elves that treat a sick child; three guesses what product plays a very large role.)

One other marketing strategy that Richardson pioneered was sending free samples to renters of Post Office boxes. He didn’t have access to a mailing list, and postal regulations required the name of a recipient in order for mail to be delivered. What to do…what to do? Aha! Address each sample simply to “Boxholder.” Much later, this would become “Occupant,” “Resident,” or the like. (I once got a promotional flyer addressed to “Smart Shopper at…” I sent it back marked “Addressee Unknown.”)

Anyway, Richardson is thus often considered “the father of junk mail.”

Richardson is probably also lucky that samples of Vick’s VapoRub weighed much less than bricks. The Post Office bristles at the term “junk mail”—but had used much stronger language for Coltharp’s bank mailing.

 

References:

John Hollenhorst, “Vernal bank built by bricks sent through the mail—partly true,” KSL-TV, November 19, 2014, http://www.ksl.com/?nid=148&sid=32424611.

“Lunsford Richardson, Inventor of VapoRub and Junk Mail,” NC Cultural Resources Blog, December 30, 2014, http://www.ncdcr.gov/blog/2014-12-30/lunsford-richardson-inventor-vaporub-and-junk-mail.

“The Bank That Was Sent Through the Post Office,” Stamps of Distinction, July 11, 2008, http://www.stampsofdistinction.com/2008/07/bank-that-was-sent-through-post-office.html.

Jimmy Tomlin, “The Father of Vick’s,” Our State North Carolina, December 3, 2012, http://www.ourstate.com/lunsford-richardson.

“Lunsford Richardson,” Wikipedia, last modified November 19, 2014, retrieved September 9, 2015, https://en.wikipedia.org/wiki/Lunsford_Richardson.

“Vicks,” Wikipedia, last modified August 28, 2014, retrieved September 9, 2015, https://en.wikipedia.org/wiki/Vicks.

 

Plato vs. Play-Doh

By Richard Romano

Sometimes Wikipedia can be a source of inadvertent amusement, and as much as I miss print encyclopedias, you’d be hard-pressed to get this kind of entertainment out of Britannica. For example, if you go to the Wikipedia page for Play-Doh, you get this at the top:

This article is about the children’s modeling material. For the ancient Greek philosopher, see Plato.

It’s a fine line, I know.

It would be fitting if anyone ever sculpted Plato out of Play-Doh. (And if you’ve seen carved busts of Plato, you know he had a pretty substantial beard, making him a prime candidate for the Play-Doh Fuzzy Pumper Barber Shop, a popular toy in my youth, and whilst I don’t think I ever actually owned one, just saying the name alone was enough to get my friends to shoot milk from their noses.)

Silliness aside, Plato of course was one of the most important figures in the history of human civilization, the center of three generations of important thinkers. His teacher was Socrates, his student Aristotle, and if you suddenly find yourself singing Monty Python’s “Philosophers’ Song,” you are not alone. Anyway, Alfred North Whitehead once famously wrote, “the safest general characterization of the European philosophical tradition is that it consists of a series of footnotes to Plato.”

Despite the fact that a fair amount of Plato’s writing has survived to the present day, very little is known of his early life—not even his real name. There is some evidence that suggests that he was born Aristocles, after—supposedly—his grandfather. Trouble is, near as anyone can tell, the only Aristocles on record has no direct connection to Plato’s family. As for the name “Plato,” which is the name he wrote under, where it came from is also open to debate. Some historians say that his wrestling coach dubbed him Platon (“broad,” as he was a big guy, something you look for in a wrestler), others that he himself took his name from platytēs (“breadth,” as in the breadth of his eloquence, if not his humility), and others still that someone unknown started calling him platys (“wide,” in that he had a big forehead). Who knows? Maybe someone even did name him after Play-Doh, for—I don’t know—an easily molded, non-toxic mind.

Adds Wikipedia: “one story…suggests Plato died in his bed, whilst a young Thracian girl played the flute to him.” I’ll bet it was only a Platonic relationship.

We give a great deal of credit to these early thinkers, and for good reason, although they sometimes fell a bit short when it came to empirically proving certain assertions. For example, Aristotle was known to insist that women had fewer teeth than men, which you’d think could be pretty easily disproven, if by no one else than Mrs. Aristotle.

Forgive the simplified timeline, but the Golden Age of Ancient Greece was superseded by the Roman Empire, and the fall of Rome marked the beginning of what we know of as the Middle Ages, sometimes called—unfairly, I think—the Dark Ages. It was the Italian scholar Petrarch who, in the 1330s, came up with the idea of referring to the period after the fall of Rome as a time of cultural and economic “darkness,” certainly when compared to the “light” of the golden years. (The phrase “Dark Age” itself—saeculum obscurum—was coined by Caesar Baronius in 1602.)

Granted, with the coming of the plague in the Middle Ages, things became rather grim, and it certainly seemed like a bit of a dark age.

There were a number of factors that led Europe out of the darkness and into the light of what would later be dubbed the Renaissance. (By the way, the first significant use of the term “Renaissance” was by French historian Jules Michelet in 1858.)

(It should also be noted that there was more than one “Renaissance”; although we usually associate that term with the period from the 14th to the 17th centuries, there was an earlier “Renaissance” in the 12th century, also referred to as the High Middle Ages.)

One of the biggest contributing factors to the “rebirth” of art, science, and philosophy was the advent of printing in the 1450s. Printing made the writings of the ancient Greeks and Romans widely available, and the Renaissance can be characterized by nothing so much as a rediscovery of antiquity: its thinkers and artists picked up where the ancients had left off, thanks largely to readily available books.

We usually associate the Renaissance with Italy—Florence in particular—but rebirthings were happening all over Europe. From Italy, the new art and culture migrated north, and the 15th and 16th centuries saw a German Renaissance. One of the biggest names of the German Renaissance—at least in terms of painting—was Albrecht Dürer (1471–1528). Dürer was born in Nuremberg, the son of a successful goldsmith (although the name “Dürer” derives from a translation of a Hungarian word for “doormaker”). His godfather, Anton Koberger, was the most successful printer and publisher in Germany, owning 24 presses and having offices throughout Germany and Europe. Koberger is perhaps most famous for the Nuremberg Chronicle, published in 1493. Although it sounds like a newspaper, it was actually a lavishly illustrated history of the world, structured around the Bible. It contained more than 1,800 woodcut illustrations produced by the workshop of Michael Wolgemut, Nuremberg’s most prominent printmaker and painter. It is believed that young Albrecht worked on at least some of these illustrations, as he was a student of Wolgemut’s at the time.

Dürer became a dab hand at all manner of art forms—woodcuts, etchings, oil paintings, watercolors, you name it. He was also known for making wallpaper. A type of printmaking, wallpaper began to be fashionable in the 15th and 16th centuries, superseding the handwoven tapestries that had for centuries been used not only for décor but also for insulation. Tapestries were far too expensive for anyone but the upper classes to afford, so paper-based wall prints—while not as insulating as cloth wallcoverings—were a good substitute and added a little je ne sais quoi to your average fashionable hovel. The prints were pasted onto the walls, and often comprised several smaller prints that were tiled, much in the way that wide-format prints are tiled today.

Dürer is known to have made a number of these, including one commissioned by the Holy Roman Emperor Maximilian I: The Triumphal Arch. It was a woodcut that measured 116 x 141 inches and was printed on 36 sheets of paper from 195 separate wood blocks. Designed to be affixed to walls in city halls or palaces, it was one of the largest prints ever made up to that time. (Dürer also produced three more “superwide-format” prints for Maximilian.)

Wallpaper has evolved over the years as printing technologies have evolved, and like any kind of fashion, has waxed and waned in popularity. In Britain in 1712, under Queen Anne, a wallpaper tax was levied (one penny per square yard, rising to one shilling per square yard by 1809), which lasted until 1836. Today, digitally printed wallcoverings and décor are becoming popular.

If you have wallpaper, you know very well that it needs to be cleaned every now and then. One company that manufactured a wallpaper cleaner was founded in 1912. Kutol Products Company of Cincinnati produced (and still produces) a wide variety of personal and industrial soaps, sanitizers, washes, and cleaners. In the 1930s, one of Kutol’s employees was Noah McVicker, who, at the request of Kroger Grocery, developed a material that could remove coal residue from wallpaper. What he came up with was a cleaning putty consisting of flour, water, salt, boric acid, and mineral oil. It was quite successful for a time, but after World War II, the combination of a decline in coal heating and the emergence of easier-to-clean vinyl wallpaper killed the market for the putty—at least as a cleaner.

McVicker noticed, however, that schoolchildren were using the wallpaper cleaning putty as a modeling compound to make Christmas ornaments. The clay lightbulb went off over McVicker’s head and he, with his nephew Joe, began selling the putty as Rainbow Modeling Compound. The material was originally available only in white (other colors of the rainbow would be added shortly); the McVickers test-marketed it in schools and kindergartens, and in 1956 founded the Rainbow Crafts Company to make and sell what they now called “Play-Doh.” It was a massive hit. More than two billion cans of Play-Doh have been sold (and a good deal of it probably eaten) since 1955, and the material has become one of the most successful children’s products in the history of toys.

And that’s how you get from Plato to Play-Doh.

 

References:

“Fascinating facts about the invention of Play-Doh,” The Great Idea Finder, http://www.ideafinder.com/history/inventions/playdoh.htm.

“Play-Doh Was Originally Wallpaper Cleaner,” Today I Found Out, November 12, 2011, http://www.todayifoundout.com/index.php/2011/11/play-doh-was-originally-wallpaper-cleaner/.

“Albrecht Dürer,” Wikipedia, modified on September 22, 2015, retrieved September 30, 2015, https://en.wikipedia.org/wiki/Albrecht_Dürer.

“Plato,” Wikipedia, modified on September 5, 2015, retrieved September 30, 2015, https://en.wikipedia.org/wiki/Plato.

“Play-Doh,” Wikipedia, modified on September 26, 2015, retrieved September 30, 2015, https://en.wikipedia.org/wiki/Play-Doh.

“Wallpaper,” Wikipedia, modified on August 17, 2015, retrieved September 30, 2015, https://en.wikipedia.org/wiki/Wallpaper.

Jetways

By Richard Romano

I had a minor bit of home office construction a few weeks ago—I was getting my windows upgraded to version 2.0—and the need to move the furniture around offered me the opportunity (or, more correctly, the necessity) to clean my desk. In one of the various cubbyholes I came across a souvenir from an EFI press conference at Print 13: a personalized, printed ceramic tile from EFI’s Cretaprint inkjet ceramic decoration system.

Ceramic printing and decoration has become a hot application, driven, as many things are these days, by new digital printing technologies—especially inkjet technologies—that expand the range of materials that can be printed on. In the case of ceramics, in a weird way, it was the substrate itself that played a role in the development of the system that would eventually be used to print on it.

You may recall that in a previous edition of these VPRHDs (Vaguely Printing-Related Historical Digressions), we met the Wedgwood family. Ralph Wedgwood had invented the first writing system that used carbon paper, one of the side benefits of which was that the visually impaired could easily use it to write.

One of Ralph’s relations was Josiah Wedgwood, a prominent potter and founder of the Wedgwood company. By the age of nine, Josiah had demonstrated a skill for making pottery, but a bout of smallpox left his knee too weak to operate a potter’s wheel. So, when life gives you lemons, I guess you…design the ceramic vessels in which to put lemonade, which is essentially what he did. In addition to designing pottery, he allied himself with one of the greatest potters of the day, Thomas Whieldon, and in 1754 they formed a business partnership. Always one to experiment, Wedgwood played around with new materials and techniques, and, in a case of being in the right place at the right time, was based in the Staffordshire Potteries, a part of England just then emerging as an industrial powerhouse. He soon created what was the first true pottery “factory” and was selling his wares to the royal family.

Speaking of industrial-scale production, Wedgwood would father eight children. His son Thomas Wedgwood grew up with a highly artistic disposition, and spent much of his time hanging around painters, sculptors, poets, and other aesthetic types. (When he inherited his father’s vast wealth, he was able to be a patron to many of them, including Samuel Taylor Coleridge.) Thomas Wedgwood never married, nor did he have any children of his own. His biographer, Richard Buckley Litchfield, notes that “neither his extant letters nor family tradition tell us of his caring for any woman outside the circle of his relations” (Litchfield, 1903).

Anyway, although he had no children of his own, Wedgwood became interested in education, and studied infants to try to understand how they processed all the information that came at them. He reasoned, not unreasonably, that babies’ primary “data collectors” were the eyes, and that thus light and images were the most important bits of information to the infant brain. This led Wedgwood to experiment with light-sensitive materials in order to capture images on paper. He thus earned a reputation among some historians as “the first photographer.” He was able to capture the images seen in a camera obscura on paper that had been rendered light-sensitive, but all he was able to capture were shadows and indistinct blurs (kind of the way I take pictures today).

The exact dates of Wedgwood’s experiments are not known, but they are believed to have taken place sometime in the 1790s. Wedgwood himself, who had always been sickly, died at the age of 34.

The art and science of photography, as we all know, proceeded apace over the course of the 19th century, and was very much what we would today call a disruptive technology. It was a great boon, though, to scientists and other tinkerers who needed a way to capture processes or actions that went by too fast to see with the naked eye. The famous example of Eadweard Muybridge using a series of cameras to capture a racehorse in motion not only helped settle a bet (he had been hired by Leland Stanford, the former governor of California, a businessman and racehorse owner, to prove that, as it gallops, a racehorse will at one point have all four hooves off the ground), but also led to the development of motion picture photography.

For our purposes here, though, the advent of photography also helped researchers in another area of study answer the question, does a water jet break up into droplets?

Some of the earliest research on drop formation was conducted by Félix Savart. Although Savart’s specialty was acoustics and sound, he also dabbled in other areas, such as water jets. Are jets of water continuous, or at some point do they break into drops? Photography wasn’t up to speed, as it were, at that time, so Savart had to improvise as best he could:

Savart was able to extract a remarkably accurate and complete picture of the actual breakup process using his naked eye alone. To this end he used a black belt, interrupted by narrow white stripes, which moved in a direction parallel to the jet. This effectively permitted a stroboscopic observation of the jet. To confirm beyond doubt the fact that the jet breaks up into drops and thus becomes discontinuous, Savart moved a “slender object” swiftly across the jet, and found that it stayed dry most of the time. Being an experienced swordsman, he undoubtedly used this weapon for his purpose (Eggers, 2003).

Am I the only one imagining a John Belushi-esque “Samurai scientist”? Probably. By the way, early attempts—such as Savart’s—to understand drop formation were missing one key element: the role of surface tension, which was first recognized in this context by Joseph Plateau. (Plateau, by the way, was also the first person to create the illusion of a moving image, via a device he invented in 1832 called the phenakistoscope.)

One researcher who was able to avail himself of photography to better understand how water drops form and behave was Lord Rayleigh, born John William Strutt. Rayleigh would be right to Strutt his stuff; he co-discovered argon, as well as what we now call Rayleigh scattering, the phenomenon that explains why the sky is blue. And his Theory of Sound will still ring a bell with any sound engineer today.

One of the things that Rayleigh discovered, in 1878, was that

a stream of liquid droplets issuing from a nozzle can be made uniform in size and spacing by applying cyclic energy or vibrations to the droplets as they form at the nozzle orifice (Romano, 2008).

What does this remind you of? Indeed, RCA (Radio Corporation of America) pursued this line of research and in 1946 applied for a patent (2,512,743, granted in 1950) for what was the very first drop-on-demand piezo inkjet device.
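For the curious, the heart of Rayleigh’s result can be stated compactly. What follows is the standard textbook form of the Plateau–Rayleigh instability for an idealized, inviscid jet; the notation is mine, not Rayleigh’s, Romano’s, or the patent’s:

\[
\lambda > 2\pi R \quad \text{(disturbances longer than the jet's circumference grow)}, \qquad \lambda_{\text{fastest}} \approx 9.02\,R,
\]

where \(R\) is the jet radius and \(\lambda_{\text{fastest}}\) is the wavelength of the fastest-growing disturbance. Vibrate the nozzle so that the disturbance imposed on the stream has a wavelength near \(\lambda_{\text{fastest}}\) (equivalently, at a frequency of roughly \(f \approx v/\lambda_{\text{fastest}}\) for jet speed \(v\)), and the stream obligingly breaks into drops of uniform size and spacing, which is exactly the behavior described in the quote above.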

The further refinement of inkjet printing takes a long, circuitous route through the 20th century before it ends up where we are now, with myriad inkjet devices transforming entire industries—and, indeed, printing on ceramics. Imagine what Josiah Wedgwood could have done with a Cretaprint.

 

References:

“Freeze Frame: Eadweard Muybridge’s Photography of Motion,” Smithsonian National Museum of American History, http://americanhistory.si.edu/muybridge/.

Jens Eggers, “A Brief History of Drop Formation,” School of Mathematics, University of Bristol, UK, http://www.maths.bris.ac.uk/~majge/moreau.pdf.

R. B. Litchfield, Tom Wedgwood, the first photographer; an account of his life, his discovery and his friendship with Samuel Taylor Coleridge, including the letters of Coleridge to the Wedgwoods and an examination of accounts of alleged earlier photographic discoveries, London: Duckworth & Co., 1903, https://archive.org/stream/tomwedgwoodfirst00litcrich/tomwedgwoodfirst00litcrich_djvu.txt.

Frank Romano, Inkjet!, Sewickley, Pa.: PIA GATF Press, 2008.

“Félix Savart,” Wikipedia, last modified on August 21, 2015, retrieved September 9, 2015, https://en.wikipedia.org/wiki/Félix_Savart.

“Josiah Wedgwood,” Wikipedia, last modified on August 30, 2015, retrieved September 9, 2015, https://en.wikipedia.org/wiki/Josiah_Wedgwood.

“Thomas Wedgwood,” Wikipedia, last modified on May 1, 2015, retrieved September 9, 2015, https://en.wikipedia.org/wiki/Thomas_Wedgwood_(photographer).

 

How Sweet It Is

By Richard Romano

I subscribe to a marvelous daily e-newsletter called Today I Found Out, which on more than one occasion has given me an idea for these rather long historical digressions here on The Digital Nirvana, especially if there is some way—even a tangential one—I can work printing or graphics into it. A few weeks ago, the topic was “How Do They Get the ‘Ms’ on ‘M&Ms’?” I think most people reading this have a general idea; they use a variety of offset printing—which I expect I don’t need to explain—and a vegetable dye-based ink.

Printing of an older variety also played a role in the M&Ms origin story, which begins in the north of England almost a century and a half ago.

Henry Isaac Rowntree (1837–1883) was born in York and raised as a Quaker. In his adulthood, he became active in the politics of the day and realized that his particular cause, Quaker Liberal Reform, didn’t have a newspaper. (Back then, newspapers were specifically aligned with political parties and movements. The idea of an objective and neutral press was a 20th-century idea.) So, in 1868, he founded one, the weekly Yorkshire Express. It was a propitious time to start a newspaper; the telegraph made it possible for Rowntree to publish foreign news, stock market updates, and other items that came in via Reuters Telegram, one of the earliest wire services.

Running a newspaper is a full-time job, and, alas, Rowntree found that his attention was so devoted to the Express that he was neglecting his primary business: a confectionery company. Rowntree had started his career working for the Tuke family, owners of a candymaking company in Walmgate in York. In 1862, Henry bought the chocolate-making part of the business from the Tukes and ran it himself, expanding into a disused foundry two years later. However, the business began to founder after Henry started the Yorkshire Express, and by early 1869 was teetering on the edge of bankruptcy. It didn’t help that Henry was not the greatest businessman in the world, and the account books were a complete shambles. So he took on his older brother Joseph as a full partner and changed the name to H.I. Rowntree & Co. The first thing Joseph insisted upon was that Henry sell the Yorkshire Express, which he did. Joseph was horrified by the deplorable state of the accounts, so it was decided that Joseph would handle all the business aspects of the company, whilst Henry would attend to the manufacturing aspects. That made everyone happy.

And it worked. Alas, Henry did not live to see what would become the company’s massive success; he died abruptly of peritonitis on May 2, 1883. So, Joseph and company carried on. In the 1880s, Rowntree’s Fruit Pastilles were effectively competing against popular French imports of the time, and by the 1890s the company had scaled up to become a major manufacturer of confectionery to compete with the venerable Cadbury’s. In the 1930s, Rowntree & Co., as the company had renamed itself, launched a chocolate-covered wafer called the Kit Kat, which you may have heard of, and in fact may be eating right now.

Meanwhile, in 1882, shortly before Henry’s death, the company began making what it called “Chocolate Beans.” In 1937, these were renamed “Smarties Chocolate Beans.” However, the powers that regulate trade thought the use of the word “beans” was misleading, so the company went with “Milk Chocolate in a Crisp Sugar Shell”—which was rather more of a mouthful than the candies—before settling on just calling them “Smarties.” They became massively popular and today are produced by Nestlé.

In the 1930s, during the Spanish Civil War, soldiers were quite fond of Smarties, or at least that was the impression made on Forrest Mars, Sr. Mars (1904–1999) was born in Minnesota, raised in Canada, and earned a degree in industrial engineering from Yale. His father, Frank C. Mars, had founded the candy company Mars, Inc., and invented the Milky Way bar in 1923. (Mars père and his wife would go on to invent the 3 Musketeers and the Snickers bars by the end of the 1920s.) Feuding with his father over where to take the business, Forrest went to Europe, where he invented the Mars Bar and did stints with Nestlé and Tobler. While bopping around Spain, he noticed the popularity of Smarties and made a note of it. A few years later, he returned to the U.S., where he founded his own food products company, launching Uncle Ben’s Rice and a somewhat less successful line of gourmet foods he called Pedigree, a name since given to a brand of dog food.

In 1940, Forrest Mars formed a partnership with Bruce Murrie, son of the president of Hershey Chocolate, and the pair developed their own variant of Smarties, which they called M&Ms, using their initials (Mars & Murrie).

In a case of history repeating itself, M&Ms became very popular with American soldiers during World War II, and during the war Mars and Murrie exclusively made M&Ms for the military.

It was in 1950 that the company began printing the tiny Ms on each M&M, first in black and then, in 1954, in white. Peanut M&Ms also came in the late 1950s, despite the fact that Mars had a peanut allergy.

And in case you’re wondering, yes, you can indeed have personalized M&Ms printed.

 

References:

Nat Bodian, “Looking Back at Newark Origins of World-Famous M&M Chocolates,” Old Newark Memories, http://www.virtualnewarknj.com/memories/newark/bodianmm.htm.

Elizabeth Jackson, Henry Isaac Rowntree: his life and legacy (reprinted from York Historian vol. 28), http://www.rowntreesociety.org.uk/assets/HIR-life-legacy-FINAL-as-PDF.pdf.

“How Do They Get the ‘Ms’ On ‘M&Ms’?” Today I Found Out, August 10, 2015, http://www.todayifoundout.com/index.php/2015/08/get-ms-mms/.

“Henry Isaac Rowntree,” Wikipedia, last modified on July 23, 2015, accessed August 17, 2015, https://en.wikipedia.org/wiki/Henry_Isaac_Rowntree.

“M&Ms,” Wikipedia, last modified on August 7, 2015, accessed August 17, 2015, https://en.wikipedia.org/wiki/M%26M%27s.

“Rowntree,” Wikipedia, last modified on July 23, 2015, accessed August 17, 2015, https://en.wikipedia.org/wiki/Rowntree%27.

“Smarties,” Wikipedia, last modified on July 6, 2015, accessed August 17, 2015, https://en.wikipedia.org/wiki/Smarties.

Carbon-Based Life

By Richard Romano

I was recently given a gift subscription to Mental Floss magazine (in print), which is one of those miscellanies of random information that I can’t help but find endlessly fascinating. (The magazine also earned my utmost respect in December 2013 when they scored an interview with Bill Watterson, who, since ending the great Calvin & Hobbes, has become the J.D. Salinger of cartooning.) I occasionally check out the magazine’s website, and a couple of months ago there was a curious little listicle called “15 Common Expressions Younger Generations Won’t Understand,” which is one of those clickbaity attempts at making me feel old by pointing out that common phrases such as “hanging up a phone,” “dialing a phone,” “rolling up a window,” etc., are anachronistic terms that no longer literally describe the actions they refer to.

The trouble is, the way technology changes—and has always changed—means that you could go back 20 years, or to any era, and find expressions that became equally archaic. Technology changes faster than language. (Look through any Adobe Creative Suite application’s menus sometime and you’ll see that many commands take their names from graphic arts terms that are likewise technologically obsolete. Photoshop’s Unsharp Mask, for example. When’s the last time anyone used an actual unsharp mask? What’s a pasteboard? And we don’t use actual paste when we paste something. Screening? Even terms like “upper- and lowercase” letters are, strictly speaking, archaic.)

I do, however, take exception to the notion that “clockwise” is somehow an outdated term. Oh, come on; they still make analog clocks. Certainly younger generations occasionally wander into a Target or a bank, or look up at a clock tower. Big Ben has not gone digital just yet.

One that I will grant them is the term “cc” when referring to email that is copied to other recipients. “Cc” in this context obviously stands for “carbon copy,” referring to carbon paper, which, in the days before the photocopier, was how documents were copied. (Even in 1990, when I began my post-collegiate career at a New York City book publisher, we were still using carbon paper, largely because the cut-rate photocopiers we were using were always broken.) The process, you may recall (or be horrified to discover), went something like this: in between two sheets of blank paper you inserted a third sheet (the carbon paper) that had one side coated with some kind of ink or pigment. When you wrote or typed on the top sheet, pressure transferred the ink or pigment to the second blank sheet, and you had a copy. You could insert a few sheets of carbon paper between several blank sheets if you needed to make more than one copy, although after a few layers, the bottommost copies got progressively lighter.

Believe it or not, they still do make and sell carbon paper, although there are few uses for it anymore. (Some credit card slips still use carbons.) True carbon paper was replaced first by carbonless paper, then by photocopiers. Now with word processing and desktop printers, we rarely need to copy originals anymore. In fact, it could be argued that there aren’t originals anymore—or it could be that every copy is an original. (I think I shall lie down for a moment.)

Carbon paper was invented as part of a “Stylographic Writer,” patented in 1806 by Ralph Wedgwood, an English inventor and potter. Yes, “potter”; that isn’t a typo or Autocorrectism. Wedgwood’s main claim to fame was inventing things for the ceramics industry, but he borrowed £200 to develop what he later called the Noctograph. The carbon paper—or, as he called it, “carbonated paper” (there’s an amusing mental image for you)—was just one part of the system. The paper, both sides of which were coated with ink and then dried, was placed between two blank sheets, then clipped to a metal board. You wrote on it with a metal stylus, and the bottom sheet was the original, while the top sheet—with the writing on the bottom—was the copy, albeit with the writing reversed. It also had guides to help keep the writing straight. Wedgwood designed it to help the blind write, although it also found favor among the sighted who used it to write in the dark. By Wedgwood’s estimate, in seven years, he made £10,000 in profits from the Noctograph. He later toyed with an idea for a system that could write the same thing in multiple places simultaneously, sort of a proto-telegraph/fax machine, but alas it never came to fruition.

(A side note: the person from whom Wedgwood borrowed the £200 was Josiah Wedgwood II, the son of his cousin and business partner. Josiah and his brother Thomas were good friends with the poet Samuel Taylor Coleridge, and supported him financially so that Coleridge could dedicate himself to his poetry without having to muck about with “uncreative endeavors” like earning a living. Xanadu, indeed.)

Anyway, a prominent user of the Noctograph was James Holman, the so-called “Blind Traveler,” a British adventurer who, despite being completely blind, traveled the globe—at a time when such journeys were rare even among the sighted—and wrote extensively about his exploits. (He got around using what has been called “human echolocation”—yes, Holman was a little batty.)

Another famous Noctographer was an American named William H. Prescott. While a student at Harvard, he took part in a food fight and was hit in the eye by a crust of bread, which permanently damaged his eyesight. (That’s gotta be embarrassing.) He never went totally blind, but the condition got worse over time. Still, Prescott went on to become one of the most celebrated and respected historians of the 19th century, specializing in Spanish history. Shortly after graduating college, he began a series of travels, and spent some time in London, staying with renowned English vascular surgeon Astley Cooper and oculist William Adams, the latter of whom introduced him to the Noctograph.

Prescott and his Noctograph became as inseparable as…well, as inseparable as the iPhone and the young folks who have no idea what “cc” means.

References:

Arika Okrent, “15 Common Expressions Younger Generations Won’t Understand,” Mental Floss, http://mentalfloss.com/article/64669/15-common-expressions-younger-generations-wont-understand.

“James Holman,” Wikipedia, last modified on May 5, 2015, accessed June 19, 2015, http://en.wikipedia.org/wiki/James_Holman.

“William H. Prescott,” Wikipedia, last modified on May 24, 2015, accessed June 19, 2015, http://en.wikipedia.org/wiki/William_H._Prescott.

“Ralph Wedgwood (inventor),” Wikipedia, last modified on April 29, 2015, accessed June 19, 2015, http://en.wikipedia.org/wiki/Ralph_Wedgwood_(inventor).

 

Long Distance Voyager

By Richard Romano

(Optional soundtrack to this post.)

History was made last week, when a space probe the size of a piano flew by and took the first close-up photographs of the last of the “classic nine” planets of the Solar System. I am speaking of course of Pluto, and while astronomy may be fairly far afield of what we usually post here on the Digital Nirvana (even by the standards of what I usually post here), the New Horizons flyby last week was really the end of a story that began well over a century ago. The discovery of Pluto itself illustrates what I deem to be the three forces essential to any successful endeavor, be it scientific discovery or a business: a dream, dogged perseverance, and—last but not least—luck, or at the very least being in the right place at the right time.

We’re often told to follow our dreams, and it’s the pursuit of those dreams that leads to ultimate success. Sometimes, the pursuit of a dream can lead to obsession, which sounds like a bad thing—just ask the crew of the Pequod and a certain white whale—but sometimes crazy obsessions can yield real, non-crazy fruit.

Most people reading this are likely familiar with the meaning and origin of the word “quixotic.” Derived from the name of Cervantes’ titular hero Don Quixote, it means “foolishly impractical, especially in the pursuit of ideals; marked by rash lofty romantic ideas or extravagantly chivalrous action.” This week, we mark a major scientific milestone, one that began as a quixotic, obsessive, perhaps even downright barmy quest, the success of which was all the more remarkable for being based almost entirely on false premises and incorrect information. And for one man, spending lonely nights in an unheated dome high above the Arizona desert, it could very well have been an interplanetary snipe hunt.

Our story begins with one basic error: a mistranslation.

Giovanni Schiaparelli (1835–1910) was an Italian astronomer who, in 1877, made the first detailed surface maps of Mars, aided by advances in telescopes, as well as the fact that in that year Mars was unusually close to the Earth. Among the surface features Schiaparelli saw and drew were crisscrossing lines that he called canali, which in Italian means “channels” or “grooves.” (Canali is not to be confused with cannoli, although finding cannoli on Mars would sure be something.) The word canali was inaccurately translated into English as “canals,” not just for the obvious reason (it looks like a cognate, and in some contexts can in fact mean “canals”), but because the Suez Canal, the engineering marvel of its day, had been completed a few years earlier and people still had “canal fever.” The word “canal” connotes a manmade (or, perhaps, Martian-made) waterway, and canal fever soon gave way to Mars fever. Speculation about life on Mars was rife, and would lead to one of the earliest science-fiction classics, H.G. Wells’ War of the Worlds (1898).

Mars mania also afflicted a man named Percival Lowell, scion of a wealthy New England family (after which Lowell, Mass., was named). After graduating from Harvard, Lowell worked in his family’s textile business for a while, then spent several years traveling the world in various diplomatic capacities, lingering in Japan and writing about its culture and customs. He had always been an astronomy enthusiast, and was intrigued by Schiaparelli’s Martian canali. He returned to the U.S. in 1893 and began his quest to gain a better understanding of Mars. He founded Lowell Observatory on a mountaintop just outside Flagstaff, Ariz., and, making his own observations, saw the same canali that Schiaparelli had seen. He made many intricate drawings of Mars, wrote several books, and became one of the primary proponents of the belief that intelligent life lived on the Red Planet.

Unfortunately, the scientific community thought he was a bit of a kook, and Lowell Observatory was not seen as a “real” research institution, at least not for a while.

Then, in 1906, Lowell got another celestial bee in his bonnet.

Slight rewind. In 1781, William Herschel discovered the planet Uranus, noted for being the butt, as it were, of juvenile puns, but also as the first planet to be discovered since ancient times. So, it was a pretty big deal. Astronomers began to wonder, well, what else could be out there? In 1821, Alexis Bouvard published tables of Uranus’ orbit—or, that is, what should be its orbit. However, later observations found that Uranus was not where it was supposed to be, as if there were another large body nearby whose gravitational field was perturbing the orbit of Uranus. A large body like…another planet? Teams of astronomers scanned the skies and, in 1846, Johann Gottfried Galle in Berlin—working from positions predicted by fellow astronomer Urbain Le Verrier in France—discovered Neptune.

Astronomers soon worked out the masses and orbits of our new celestial neighbors—except… something was wrong. Uranus, and now Neptune, didn’t move the way the math said they should. Hence, the logical question: was there yet another planet out there? (What they didn’t know—this was the 19th century, remember—was that they had gotten the mass of Neptune wrong, and that, coupled with later findings, makes the next chapter in this tale all the more remarkable.) By the turn of the century, finding this new planet became the new obsession.

Re-enter Lowell. Just like celestial billiards, this new planet knocked Mars from his attention. In 1906, he dedicated Lowell Observatory to the task of finding what he called “Planet X.” Alas, Lowell died in 1916 of a stroke, not realizing that his observatory actually had photographed what he was looking for.

Although Lowell passed on, the search for Planet X lived on in the dogged persistence of a Kansas farm boy.

Clyde Tombaugh was born in 1906, the same year that Lowell began his quixotic quest to find Planet X. Tombaugh grew up on the family farm, and his dreams of going to college were scuppered when a hailstorm destroyed his family’s crops. But he was still smitten with astronomy, and when a Sears telescope proved inadequate, he built his own. Observing Jupiter and Mars, he sent drawings to the Lowell Observatory, which, in 1929, hired him. As a new hire, he was given the typical low-man-on-the-totem-pole job: searching for Planet X. (Since Lowell’s death, finding it had ceased to be a priority at the observatory.) The 24-year-old Tombaugh attached a camera to a telescope and took a series of photographic plates of the night sky in the general vicinity of where Planet X was believed to be. The photos were taken one to two weeks apart. He would then place two separate plates into a device called a “blink comparator,” which essentially toggled between the two pictures. The goal was to see which of the many, many white dots had moved (stars are fixed, planets move about). It was a long, tedious, laborious process, but on February 18, 1930, Tombaugh announced he had found it: a new planet.
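(As an aside for the programmers in the audience: the logic of the blink comparator is simple enough to sketch in a few lines of code. The snippet below is purely illustrative, with made-up coordinates, and is obviously not anything Tombaugh or Lowell Observatory ever used; it just matches each dot measured on one plate to its nearest neighbor on the other and flags the one that has jumped.)

import math

def blink_compare(plate1, plate2, tolerance=0.5):
    """Toy 'blink comparator': given (x, y) dot positions measured on two
    photographic plates of the same star field, return the dots on plate1
    whose nearest counterpart on plate2 lies farther away than `tolerance`.
    Fixed stars line up almost exactly; a planet shows up as the one dot
    that has shifted between exposures."""
    movers = []
    for (x1, y1) in plate1:
        nearest = min(math.hypot(x1 - x2, y1 - y2) for (x2, y2) in plate2)
        if nearest > tolerance:
            movers.append((x1, y1, nearest))
    return movers

# A made-up star field: the dot near (40, 12) creeps to (41.5, 12.3)
# between exposures, the way Pluto crept across Tombaugh's plates.
plate_one = [(10.0, 20.0), (35.5, 80.2), (40.0, 12.0), (90.1, 44.4)]
plate_two = [(10.0, 20.1), (35.4, 80.2), (41.5, 12.3), (90.1, 44.3)]

print(blink_compare(plate_one, plate_two))
# -> [(40.0, 12.0, 1.529...)]  the one dot that moved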

This new planet went nameless for a while—Lowell’s widow favored Zeus—until Venetia Burney, an 11-year-old British girl whose grandfather had connections in the astronomical community, suggested “Pluto,” not after the Disney dog but rather the Roman god of the underworld (aka Hades in Greek mythology). The directors of Lowell Observatory rather liked the idea—especially in that the name Pluto starts with Percival Lowell’s initials. Although Pluto was demoted from planethood in 2006—it’s now considered a dwarf planet (when I toured Lowell Observatory in 2003 they were having none of this “trans-Neptunian object” crap)—very little was known about the last of the “classic nine” planets most of us grew up with. Until now.

Last week, NASA’s New Horizons probe made a historic flyby of the tiny planet, and for the past few weeks has been sending back an increasingly extraordinary set of photographs of Pluto and its moon Charon. As of this writing, the best is yet to come.

Although New Horizons won’t be orbiting or landing on Pluto (Pluto is so small, and thus its gravitational force so weak, that in order for the probe to decelerate enough for orbital insertion or landing it would have needed to carry more fuel than it could have feasibly launched with), the flyby will still yield enough information to keep astronomers and “Plutocrats” busy for years. And even though Clyde Tombaugh passed away in 1997, New Horizons is carrying some of his ashes.

Lowell had one crazy dream that eventually panned out, and it was his, and later Tombaugh’s, dogged persistence that made it happen. And, since the “evidence” that led to the search for Planet X was wrong, it also took luck of the most astronomical sort.

 

References:

Robert Roy Britt, “Mars: A History of False Impressions,” Space.com, September 26, 2005, http://www.space.com/1583-mars-history-false-impressions.html.

Nadia Drake, “Pluto at Last,” National Geographic, July 2015, http://ngm.nationalgeographic.com/2015/07/pluto/drake-text.

“Percival Lowell,” Wikipedia, https://en.wikipedia.org/wiki/Percival_Lowell, last modified on July 16, 2015, accessed July 17, 2015.

“Pluto,” Wikipedia, https://en.wikipedia.org/wiki/Pluto#Discovery, last modified on July 20, 2015, accessed July 20, 2015.

“Clyde Tombaugh,” Wikipedia, https://en.wikipedia.org/wiki/Clyde_Tombaugh, last modified on July 17, 2015, accessed July 17, 2015.

 

Editing Independence

By Richard Romano

Whenever I do speaking gigs in connection with the various books I have written with Dr. Joe Webb—Disrupting the Future, This Point Forward, and The Home Office That Works!—one question I am invariably asked is, “How do you write with a coauthor?” My standard response is, “I handle the nouns and adjectives, Joe handles the verbs and adverbs, and we split infinitives.” Thank you, I’m here all week.

Writing with a coauthor is actually of enormous benefit, because it reins in some of my more…let’s say, distracting writing tropes and tendencies, of which there are many. That’s largely what I miss about having a proper editor. Back in my Micro Publishing News days in the 1990s, editorial director David Griffith was a very hands-on editor, and we would often talk about what worked and what didn’t, and he was always trying to refine and improve my style, which I appreciated (well, most of the time). It’s very rare in trade publishing to have that kind of collaborative relationship with an editor, the way book authors and editors do. (Commented David once: “I swear I’m going to remove the paragraph keys from your computer.”) And in this age of blogging, tweeting, and other social media, writers are all left to their own devices, which has its pluses and minuses.

There are good editors and bad editors, of course, but a good editor should approach copy the way a doctor approaches a patient: “first, do no harm.”

Sometimes, though, editors actually are doctors.

Benjamin Rush (1746–1813) was born in Byberry, Penn., now a Philadelphia neighborhood. He attended the College of New Jersey (today called Princeton University) and, after a lengthy apprenticeship under prominent Philadelphia physician Dr. John Redman, studied medicine at the University of Edinburgh in Scotland. His doctoral thesis, “On the Digestion of Food in the Stomach,” was based on—says Princeton’s biography of Rush—“some heroic experiments with emetics on his own person.” (Nice euphemism. In my time, we called them “dorm parties.”)

Rush got his M.D. in 1768 and trained for several months at St. Thomas’ Hospital in London, where by chance he met Benjamin Franklin. Franklin persuaded Rush to visit France with him—on Franklin’s dime—where Rush got to hobnob with French doctors, scientists, and other leading members of the intelligentsia.

He returned to Philadelphia in 1769, opening his own medical practice and becoming a Professor of Chemistry at the College of Philadelphia (whose medical school is today the University of Pennsylvania’s Perelman School of Medicine). He also wrote A Syllabus of a Course of Lectures on Chemistry, the first American chemistry textbook, and became involved in the nascent independence movement afoot in the colonies, joining the Sons of Liberty.

It was in Philadelphia in 1775 that Rush met a British expatriate and together they would help change the world.

Thomas Paine (né Pain) was born in 1736 in Norfolk, England, and in adolescence was apprenticed to his father, who made rope stays for shipping vessels. In his 20s and 30s, he kicked around England working in various capacities—rarely lucratively—and was married, widowed, and married again. The onset of the 1770s saw Paine working as an excise officer while also running a tobacco shop. In 1772, he published his first political work, a 21-page treatise called The Case of the Officers of Excise, essentially a long argument for Parliament to increase the pay and improve the working conditions of excise officers. He was soon fired from his job (he tended to play hooky a lot) and his tobacco shop went under. Threatened with debtor’s prison, he sold all his possessions.

In 1774, he left his wife and moved to London where a mutual acquaintance introduced him to—yep, you guessed it—Benjamin Franklin. Franklin suggested Paine move to the British Colonies in America, and even gave Paine a letter of recommendation. So Paine packed up whatever meager possessions he had and put to sea.

He barely made it.

On board the ship, a tainted water supply led to a typhoid fever outbreak, and five fellow passengers never made it across the Atlantic. When the ship finally landed in Philadelphia, a doctor had to be sent for to physically carry the ailing Paine off, and Paine spent weeks recovering. Soon, though, he was back up to speed and, like so many people who came to America, started a completely new life. He began as a publicist, then in the spring of 1775 published a pamphlet called African Slavery in America, a condemnation of that institution. He then landed a position at Pennsylvania Magazine, which had been founded by Robert Aitken, a printer who would go on to run the official press for the Continental Congress. Paine was a major contributor and part-time editor for the magazine, and was promoted to sole editor within six months.

By the middle of 1775, Revolutionary fervor was starting to kick into high gear. The Boston Tea Party had taken place in 1773, and April 1775 saw the battles of Lexington and Concord, the first shots fired in the American Revolution. In the latter half of 1775, Paine began composing his thoughts in a long pamphlet laying out the case for American independence from Great Britain, but he needed help editing it. He turned to a friend of his, a prominent Philadelphia physician and fellow Son of Liberty: Benjamin Rush. Rush helped him hone the text—which ended up being a 41-page treatise—as well as find a printer to publish it. Rush recommended Robert Bell, whose competitive advantage (as we might say today) was that he wouldn’t refuse to print it because of its incendiary content, a serious concern at the time.

Rush also suggested one other change. Paine’s original title was Plain Truth; Rush suggested, instead, Common Sense. It was a fortuitous change: the new title attracted everyday “common” folk to the fight for independence, people who up until that point had felt that politics and government were strictly the purview of society’s more elite members.

The pamphlet was a hit; even people who couldn’t read would attend public readings of Common Sense. It sold 120,000 copies in its first three months of publication and, by some accounts, more than 500,000 worldwide in its first year. Paine would have been a household name, but—for obvious reasons—he’d had to publish it anonymously. (Bell the printer had added “Written by an Englishman” to the second edition, which greatly pained Paine.) Paine donated all his profits from Common Sense to George Washington’s Continental Army and, perhaps to get back at Bell, gave up his copyright, granting all American printers the right to reproduce it. (Print clients…go figure.)

Seven months after the publication of Common Sense, in July 1776, the Declaration of Independence was written by Thomas Jefferson and edited by the Continental Congress; independence was voted on July 2, and the Declaration itself was adopted on July 4. It was initially issued as a broadside by Philadelphia printer John Dunlap. One of its signers: Benjamin Rush.

On behalf of Digital Nirvana, have a happy and safe Independence Day weekend.

 

References:

Alexander Leitch, A Princeton Companion, Princeton University Press, 1978, http://etcweb.princeton.edu/CampusWWW/Companion/rush_benjamin.html.

“Benjamin Rush,” Wikipedia, last modified on June 21, 2015, retrieved June 25, 2015, https://en.wikipedia.org/wiki/Benjamin_Rush.

“Benjamin Rush (1746-1813),” Penn Biographies, University of Pennsylvania University Archives and Records Center, retrieved June 25, 2015, http://www.archives.upenn.edu/people/1700s/rush_benj.html.

“A Biography of Thomas Paine,” American History from Revolution to Reconstruction and Beyond, University of Groningen, http://www.let.rug.nl/usa/biographies/thomas-paine/.

“Thomas Paine,” Wikipedia, last modified on June 18, 2015, retrieved June 25, 2015, https://en.wikipedia.org/wiki/Thomas_Paine.

“Common Sense (pamphlet),” Wikipedia, last modified on June 18, 2015, retrieved June 25, 2015, https://en.wikipedia.org/wiki/Common_Sense_(pamphlet).

Stop, In the Name of Love

By

It would seem The X-Files is coming back to TV. As a big fan of the original back in the 1990s, I have mixed feelings about its return, a combination of anticipation and dread. Recently I was watching a few old episodes and, admittedly, in some ways it really hasn’t aged very well. And the long story arcs tended to fizzle out and disappear, as if abducted by aliens. Still, it had that certain je ne sais quoi.

The show used to poke occasional fun at what was supposedly a real “alien autopsy” videotape—Alien Autopsy: Fact or Fiction?—that the Fox network aired in 1995. It was said to have been shot in 1947 following what some believed to have been the crash and recovery of an extraterrestrial spacecraft near Roswell, N.M., and it depicted a rubbery alien being, well, autopsied. The debunkings began almost immediately; all sorts of holes could be poked in it, certainly in the alien, but also in the veracity of the tape itself. One interesting debunking, however, came via a letter that appeared in Skeptical Inquirer magazine at the height of alien-autopsy fever. It pointed out one overlooked detail in the video: a bit of OSHA signage visible in the background.

During the early 1980s I was responsible for re-signing a large industrial facility in southern California to bring the various hazard signs up to OSHA (Occupational Safety and Health Administration) requirements for that time. This involved the review and replacement of signs indicating dangerous and hazardous environments in and around the facility.

When I first saw the “Alien Autopsy” film, I felt that the Danger sign looked all too familiar. I decided to research the graphic format of the sign. This involved an archival search with OSHA and an additional search of the ANSI (American National Standard Institute) archives. The results were most interesting: they confirmed what I originally suspected. The graphic format used in the Danger sign was adopted by ANSI in 1967, Ref. ANSI index Z53.1-1967, and approved for OSHA in 1973, with implementation to be achieved by 1983, Ref. OSHA index 1910.145.

All said and done, it is very unlikely that a sign with a graphic design originating in 1967 would be available for use in a 1940s environment or film.

The devil’s in the details.

Signage. We often take it for granted, so it seems an appropriate time to commemorate the fact that 2015 marks the 100th anniversary of a sign we see every day and that is recognized virtually everywhere in the world, even if not everyone obeys it: the Stop sign.

First appearing in Detroit, Mich., in 1915, the original Stop sign was a 2-foot by 2-foot sheet of metal with “Stop” written in black letters on a white background. The birth of such traffic signage in Detroit makes sense; it was the center of the nascent automobile industry, and as the number of vehicles on the road began to increase, there was the correspondingly increasing need to regulate traffic. At the time, there really were no traffic control mechanisms in existence at all, and traffic was quickly becoming a nightmarish, chaotic free-for-all. Kind of like Massachusetts.

It was in 1923 that the Stop sign became octagonal, and for a rather abstruse reason: the idea at the time was that the number of sides a sign had communicated its level of importance or potential peril. A circle, strictly speaking, has an infinite number of sides, and communicated (the thinking went) the highest level of danger and thus caution. So circular signs were used for railroad crossings.

The octagon, having eight sides, albeit a few less than infinity, was used for “second-tier” danger levels like intersections. Thus, it was deemed perfect for the Stop sign. Diamonds were used for lesser warnings and square and rectangular signs for informational signage, even though diamonds, squares, and rectangles all have the same number of sides.

“You have to realize this was done by engineers, and engineers can be overly analytical,” says Gene Hawkins, a professor of civil engineering at Texas A&M University and the nation’s pre-eminent expert on the history of the stop sign.

True dat.

The shape of the Stop sign was settled some time before its color. For a while it featured black text on a yellow background and, in fact, it wasn’t until the 1950s that the Stop sign turned red—the delay being primarily because signmakers couldn’t find a red reflective material durable enough for outdoor applications. So we can thank signage substrate manufacturers for the color of the modern Stop sign.

It was primarily the advent of the automobile that triggered the need for traffic control signage and signaling. The first non-electric traffic signals were installed in London outside the Houses of Parliament in the 1860s to control horse-drawn traffic. Gas-powered, they had the unfortunate tendency to leak and explode—very bad form. So they were rather quickly discontinued. The first electric, non-exploding traffic light was installed in Cleveland, Ohio, in 1914, by which time horseless traffic was on the rise.

It was the age of the bicycle, though, that changed the demands for traffic signage.

As far back as ancient Rome, traffic signage consisted of little more than “milestones,” markers or columns that offered directions and/or distances to destinations. By the Middle Ages, as the number of roads and thus intersections increased, multidirectional signs were used to send travelers the right way. In 1686, in Lisbon, Portugal, the narrowness of some city streets led King Peter II to install the first “right of way” signage—one, which still exists, reads, “Year of 1686. His Majesty commands all coaches, seges and litters coming from Salvador’s entrance to back up to the same part.” (Maybe something got lost in translation, but it’s still clearer than some New York City parking signs.)

In the late 19th century, bicycling became all the rage. The earliest bikes were a bit ungainly and difficult to control, but people still liked to ride at high speed on unfamiliar roads. Cycling organizations took it upon themselves to post signs warning of upcoming dangers, like hills, bridges, and other potential hazards. Think of it as the first crowdsourcing of traffic information, a kind of primitive Waze.

Traffic signage is still evolving, though we don’t often think about it beyond the extent we need to comply with the law (or disregard it) or find our way someplace. Highway departments have had to adapt signs to changes in automobile headlight design (all traffic signage is required to have a precise degree of reflectivity at night: not too faint, not too blinding), and the next challenge is adapting signage to an aging population. According to the DOT, there were 34 million licensed older (65+) drivers in 2010 (the latest data available), a 22% increase from 2001. These older drivers comprised 16% of all licensed drivers in 2010, up from 14% in 2001, and the percentage of older drivers on the road will only increase in the next decade.

If self-driving cars ever become viable and widespread (I have my doubts, but it would be nice), maybe we won’t even need traffic signage anymore.

 

References:

Hilary Greenbaum and Dana Rubinstein, “The Stop Sign Wasn’t Always Red,” New York Times, December 9, 2011, http://www.nytimes.com/2011/12/11/magazine/stop-sign.html.

Traffic Safety Facts, U.S. Department of Transportation, National Highway Traffic Safety Administration, July 2012, http://www-nrd.nhtsa.dot.gov/Pubs/811640.pdf.

“Traffic Sign,” Wikipedia, last modified June 2, 2015, accessed June 8, 2015, http://en.wikipedia.org/wiki/Traffic_sign.

“Westminster Road Semaphore,” Victoria County History, http://www.victoriacountyhistory.ac.uk/explore/items/westminster-road-semaphore.

Stormwatch

By

One afternoon a couple of weeks ago, I was deciding whether to go for my afternoon run, keeping an eye on my iPhone’s weather app and trying to gauge the ETA of an approaching thunderstorm. There had also been talk of hail, which, at the same time last year, was golf-ball-sized and literally totaled a friend’s car. Not something I want to be caught out in on a running trail.

You know what they say: everyone talks about the weather, but no one ever does anything about it. But the weather is one of those things that has been the killer app for virtually every new communication medium and technology.

Weather forecasting dates back to the Babylonians circa 650 B.C.E., and Aristotle got his feet wet in the subject in Meteorologica. Other ancient Greek, Chinese, and Indian writers also compiled what might more correctly be called “weather lore,” which relied on observed patterns: if a sunset was especially red, for instance, the next day would be sunny. And so on. (In a similar vein, when I see Jim Cantore in my general vicinity, I head indoors as quickly as possible.)

What is believed to be one of the world’s first true weather reports—and indeed the first work of modern journalism—was The Storm, written by Daniel Defoe and published in 1704. It is an account of the Great Storm of 1703, one of the worst natural disasters ever to hit the south of England. A fierce maelstrom of wind and rain that struck in late November 1703, it knocked down 2,000 chimneys in London alone, tore the roof off Westminster Abbey, wrecked 700 ships in the Thames—including one-fifth of the Royal Navy—and sent more than 1,000 seamen to Davy Jones’ locker. Coastal towns such as Portsmouth, Defoe wrote, “looked as if the enemy had sackt them and were most miserably torn to pieces.” The Storm combined his firsthand account of the event with those of others. (He had placed ads looking for witnesses to share their stories.) Defoe also wrote that the wreck of the naval vessels was divine retribution for losing to France and Spain at the beginning of the War of the Spanish Succession. That seems a bit picayune for God, but what do I know?

Defoe was quite a character in his own right. A very prolific writer—mostly pamphlets of the political muckraking type—he caused a great storm of his own with the newly crowned Queen Anne and ended up in the pillory and, shortly thereafter, Newgate Prison. Upon his release, he would go on to pioneer economic journalism and, with 1719’s Robinson Crusoe, write what is often considered the first English novel.

The Storm was a weather report, but a past-tense one. It wasn’t until the advent of the telegraph that weather forecasting really got going, largely because, until then, information traveled more slowly than the weather itself. For example, if I want to know what the weather is going to do in an hour or so, I look at the radar map and see what is happening to the west: if a long line of many-colored thunderstorms is trundling across Syracuse and Utica, chances are it’ll be here shortly. But until the mid-19th century, there was no quick way to get that information.

Two pioneers of weather forecasting were British Navy men. Sir Francis Beaufort was a Rear Admiral in the Royal Navy who invented both the wind scale that bears his name and a set of weather notations. In 1831, Beaufort used his influence to intercede on behalf of a young captain named Robert FitzRoy, who was trying to get his ship back after losing a bid for a seat in Parliament. The ship was the HMS Beagle, which FitzRoy had taken command of following the suicide of its previous captain. The mission was to head to Tierra del Fuego, and FitzRoy was looking for an educated traveling companion to break up the boredom of months at sea. He asked Beaufort if he knew anyone. Beaufort did: a man named Charles Darwin. But that’s another story for another time.

In 1854, FitzRoy was appointed chief of a new department to collect weather data at sea, a service designed to help mariners. (This would become today’s Meteorological Office.) In 1859, a massive storm resulted in the wreck of a clipper called the Royal Charter. It was a famous shipwreck at the time (Dickens wrote an account of it in The Uncommercial Traveller), but it so affected FitzRoy—who was prone to depressive moments—that he was motivated to figure out how to use all the weather data he had been collecting to make predictions or, to use the phrase he coined, “weather forecasts.” He used the telegraph to collect reports from various places at various times of day, and from those reports issued storm warnings to ports that appeared to be in harm’s way. The first storm warnings began in 1861 and quickly expanded to include a forecast for the next day or two. Making use of another communication medium, the newspaper, FitzRoy began publishing daily weather reports in The Times that same year, and they were soon syndicated to papers throughout Britain. They became exceedingly popular.

FitzRoy didn’t have a wealth of data to work with, and it’s not like weather forecasting had ever really been attempted in a scientific fashion before, so his forecasts were more than a little hit or miss. When he was really wrong, he would write apologies on the Letters page of the Times. There’s an idea today’s weather forecasters might do well to emulate.

FitzRoy became a bit of a media celebrity, as he was accurate a good chunk of the time. Still, there came the inevitable backlash; haters gotta hate. The scientific community thought he was full of unseasonably hot air, MPs grumbled about the cost of all the telegraphing (the British Government was paying for it), and everyone cursed him out when he made an errant forecast.

The increasing criticism, combined with financial problems and health issues, stirred up his depressive condition and, after one final forecast on April 29, 1865 (thunderstorms over London), he slit his throat with a razor and died.

On this side of the Atlantic, weather forecasting progressed less tragically. In 1871, an astronomer turned meteorologist named Cleveland Abbe was appointed chief meteorologist for the United States Weather Bureau, then part of the Signal Corps. He assembled a team of correspondents who also availed themselves of the telegraph to feed him information, and he would devise and issue weather forecasts. He also checked his forecasts against the actual weather after the fact; in his first year of operation, 69 percent of them were verified, and he apologized for the other 31 percent, blaming “time constraints.” Uh huh.

As newspapers gained in prominence, weather reports and forecasts played a greater and greater role, and as new media emerged, weather was one of the first things drawn on for content. In 1911, the U.K.’s Meteorological Office began sending weather reports via radio, and in the U.S., Boston radio station WEEI began broadcasting reports in 1925. On television, the BBC experimented with weather broadcasts as early as 1936, but didn’t really get going until after the war. Over here, sometime in the 1940s, James C. Fidler did some nascent TV weather broadcasts in Cincinnati on the DuMont Network. TV weather went high-tech in the late ’70s and early ’80s, when Good Morning America’s John Coleman began using satellite imagery and computer graphics. Coleman would go on to found The Weather Channel in 1982, which was a flop at first, but its launch coincided with the massive build-out of cable television in the ’80s and ’90s. The channel also realized it could attract more viewers by getting its weather people out from behind desks and into the storms themselves. (It helped that video equipment had become less expensive, more portable, and more rugged.)

Today, instead of looking at an outdoor-mounted thermometer to see what the temperature is, or even turning on the TV, I look at my phone. And if, for some reason, I check out Facebook, I often see people posting screenshots of their iPhone weather app screens, especially last winter. Even on social media, we can’t stop talking about the weather.

Uh oh…is that Jim Cantore coming down the street?

References:

Peter Moore, “The birth of the weather forecast,” BBC, April 30, 2015, http://www.bbc.com/news/magazine-32483678.

“Cleveland Abbe,” Wikipedia, last modified on June 7, 2015, accessed June 8, 2015, https://en.wikipedia.org/wiki/Cleveland_Abbe.

“Daniel Defoe,” Wikipedia, last modified on May 21, 2015, accessed June 8, 2015, http://en.wikipedia.org/wiki/Daniel_Defoe.

“Robert FitzRoy,” Wikipedia, last modified on May 17, 2015, accessed June 8, 2015, https://en.wikipedia.org/wiki/Robert_FitzRoy.

“Weather Forecasting,” Wikipedia, last modified on May 26, 2015, accessed June 8, 2015, https://en.wikipedia.org/wiki/Weather_forecasting.