Author Archives: Richard Romano

About Richard Romano

Richard Romano has been involved in the graphic arts since before birth. He is a writer and analyst for the graphic communications industry and a regular contributor to WhatTheyThink, for which he oversees the Wide-Format and Production Inkjet special topic areas. For eight years, he was the senior analyst for The Industry Measure (formerly TrendWatch Graphic Arts), until its demise in March 2008. He has also worked on consulting and market research projects for many other organizations and companies and has contributed to such magazines as Graphic Arts Monthly, GATFWorld, Printing News, and HOW; is the former executive editor of CrossMedia magazine; and is the former managing editor of Micro Publishing News and Digital Imaging magazines. As if that weren’t enough, he is also the author or coauthor of more than a half dozen books, the last three with WhatTheyThink’s Dr. Joe Webb, including Disrupting the Future, which has been translated into Japanese and Portuguese. Their most recent title is "The Home Office That Works! Make Working At Home a Success—A Guide for Entrepreneurs and Telecommuters." He has vague recollections of having graduated from Syracuse University’s Newhouse School of Public Communications in 1989, and has a 1994 certificate in Multimedia Production from New York University. He is currently in the final throes of a Masters program at the University at Buffalo, which he really does need to wrap up at some point. He lives in Saratoga Springs, NY.

Poll Position


We’re in the home stretch, folks—The Election That Won’t End is finally on its last legs. Pundits have referred to this year’s contest as the “Twitter election,” for better or worse, but that shouldn’t be surprising. New communications technologies—even though Twitter isn’t really all that new at this point—have always played roles in politics, and indeed the first “modern” Presidential campaign was enabled by print. Not that print was a new technology at the time—that time being 1840—but shortly before that campaign, printing had seen its first major technological advances since Gutenberg.

To identify these changes, we’ll start down in some English mines, and quickly ascend to an abandoned German monastery.

In the late 17th century, British mining operations had kicked into high gear. Trouble was, a lot of mines were located along the coast, and in Cornwall, mines even extended under the sea some distance offshore. This meant that mines flooded rather routinely, so there was a pressing need for drainage. Enter Thomas Savery, who had built a drainage pump in the hope of solving the problem of massively moist miners. Savery—who called his pump the “Miner’s Friend”—had based his design on the work of a Frenchman, Denis Papin, who had invented a pump that was powered by gunpowder, although he later wisely substituted steam power. Savery made a few alterations to Papin’s pump, and soon had the Miner’s Friend. Alas, it was a fickle friend; when put into actual operation, it couldn’t withstand the high heat and pressure and fell apart.

Now enter an ironworker named Thomas Newcomen, who tweaked the Savery pump to create his own “Newcomen engine.” It took a few tries, but soon his engine worked, was a hit, and within a decade was being used in mines throughout Europe.

Flash forward nearly half a century to 1763, when a scale model of the Newcomen engine owned by Glasgow University in Scotland broke down and the resident repairman was tasked with fixing it. In the course of the repair, he realized that the engine was not as efficient as it could be. He added a separate condenser and made some other tweaks, and it is for this that the repairman, James Watt, is acknowledged as the inventor of the steam engine and all that it led to.

It took a while, but steam power had transformed—well, everything. Which brings us to Oberzell Monastery in Germany. It had been abandoned some years earlier, and it seemed just the right site for inventor Friedrich Koenig to set up shop. Koenig was born in 1774 to a farming family, but had apprenticed as a printer. He matriculated at Leipzig University, and began to dedicate his researches to eliminating what he called the “horse-work” from printing. In other words, make it more of an automated process. His first attempt was in 1803–04, when he designed what he called the Suhl(er) press, which was really just a powered hand press. It’s not known if he actually built one, but regardless, German printers were not warm to the idea so Koenig went abroad to try his luck with the British.

He had some nibbles and some business agreements, but, long story short, he eventually hooked up with an old friend from Germany, Andreas Bauer. The two pooled their ideas and began working on what would become the first steam-powered printing press. It was patented on March 29, 1810, and a year later it was first used to print 3,000 copies of a portion of the Annual Register, the first book, Koenig claimed, to be printed at least in part by a machine. The press had a top speed of about 400 impressions an hour—which was a bit of an improvement over the old hand press. Koenig and Bauer continued development, and in 1812 began work on a newspaper version of the press. It took some doing—not least because once newspaper pressmen got wind of a steam-powered press they threatened Koenig’s life—but in the wee small hours of November 29, 1814, the first newspaper ever printed on a steam-powered press was produced at a speed of (it was claimed) 1,100 single-sided sheets per hour.

Koenig made a few more improvements, took out more patents, but after falling out with a British business partner who felt he owned all the rights to the machine, he and Bauer fled in discouragement back to Germany. In 1817, Koenig and Bauer—and, yes, they are the “K” and the “B” of the company that would one day be called, and is still called today, “KBA”—bought the Oberzell Monastery near Würzburg and began development of steam-powered presses. They had difficulty finding skilled artisans, alas, and manufacturing was a struggle. Koenig also had a vision of a rotary press, but he died in 1833 before he could realize that dream.

Still, the steam-powered printing press, particularly the newspaper configuration, would change media and politics in ways that we are still recovering from today. The automation and, more importantly, the speed made printing more economical to produce—and thus to buy. As a result, the early 19th century saw the explosion of what would be called the “penny press,” newspapers that sold for a penny (most other papers sold for about six cents a copy). On July 24, 1830, Lynde M. Walter launched the Boston Transcript, the first “penny press” newspaper. Its primary topics were, naively enough, literature and theater. That would not last.

The real penny press pioneer was Benjamin Day, who launched the New York Sun in 1833. Day was a mere slip of a lad (by today’s standards anyway) at 23, and his Sun was the first to use an advertising- rather than subscription-based revenue model. He also introduced to America the “London Plan,” whereby he sold the papers in batches of 100 to newsboys who would then sell them on the street (whether they ever really yelled “Extra! Extra!” is open to debate). Day soon became the most important publisher of his, well, day, and not only because of his business acumen. It was also because he chose to publish stories that today would be considered “tabloid” journalism—even though, like other penny press papers, the Sun was physically a broadsheet and not tabloid size. One of his most infamous stories was written by Richard Adams Locke and published in 1835. It was an ostensibly serious report about life that had supposedly been found on the moon. (Some years later, Edgar Allan Poe claimed it had been plagiarized from him.) Anyway, the penny press quickly went for the sensationalistic and lurid.

It wasn’t long before the press became active in politics. Newspapers had always been politically active, at least since the founding of the United States; individual papers were highly partisan and were more often than not official political party organs. It wasn’t until the 20th century that the media were even expected to be unbiased, and then only in theory.

We rightly blanch at the tenor of political debate today, but if you look back at the history of political campaigns, they have only very rarely been based on high-minded discussion of important issues. The campaign of 1800 between Thomas Jefferson and John Adams still could win awards for the ugliest and nastiest Presidential campaign in history—and they were two Founding Fathers, supposed children of the Enlightenment and the Age of Reason, and yet they couldn’t stay out of the mire.

The first more or less modern Presidential campaign was the 1840 contest between Democrat Martin Van Buren, the incumbent President seeking a second term, and Whig William Henry Harrison, a decorated general who had won the Battle of Tippecanoe, although one of the Democrats’ ploys was to question whether he really played an important role in the battle. (It was the precursor of what in 2004 would come to be called “swiftboating.” Some things never change.)

The Harrison campaign had a catchy slogan: “Tippecanoe and Tyler Too” (John Tyler was Harrison’s running mate, a choice the Whigs would come to regret). This was the “Log Cabin and Cider” campaign, where Harrison was depicted as a rural man of the people, easily at home in a simple log cabin drinking hard cider, while Van Buren was a foppish and pampered dandy. Actually, though, Harrison was a wealthy landowner, scion of one of the “First Families” of Virginia, where his father had been governor and one of the signers of the Declaration of Independence. It was Van Buren who had grown up in penury in upstate New York as the son of a tavernkeeper.

It was one of the first mass media campaigns; printing presses were physically part of campaign rallies. The Whigs—they were a new party at the time—launched Log Cabin newspapers, and it was the first campaign to feature banners, merchandise, and even songs; a popular item during the campaign was the Harrison and Log Cabin Songbook. Whigs would lug printing presses down the street during parades and print off copies of song sheets so people could sing along with anti-Van Buren ditties, such as:

What has caused the great commotion, motion, motion

Our country through?

It’s the ball a-rolling on, on

For Tippecanoe and Tyler too

Some of this took Van Buren and company by surprise, even though Van Buren was by then a highly experienced political operator who had been instrumental in establishing the norms of party politics, norms that continue to this day.

Still, no one could resist a good song, and Harrison and Tyler won the election. Alas, as we all likely recall, Harrison died a month after Inauguration Day, and Tyler—soon to be labeled “His Accidency”—would become the first sitting Vice President to become President.

So for the next few days, until our current campaign ends, I’ll be hiding down in a mine. Someone fetch me a Miner’s Friend.



James Burke, Connections, Boston: Little, Brown & Company, 1978, pp. 170–174.

John Dickerson, Whistlestop: My Favorite Stories from Presidential Campaign History, New York: Twelve, 2016, p. 334.

Bill Elligett, “Koenig: his first Powered Printing Machines 1803–1818,” Letterpress Printing.

“Friedrich Koenig’s invention changed the media world,” KBA, November 27, 2014.

Alan Kingsbury, “William Henry Harrison, Martin Van Buren, and the Birth of the Modern Political Campaign,” US News & World Report, January 17, 2008.

“Centenary of Friedrich Koenig, 1774–1833,” Nature 131, 51–52 (14 January 1933).

“Benjamin Day (publisher),” Wikipedia, last modified October 23, 2016, retrieved October 31, 2016.

“Penny Press,” Wikipedia, last modified March 19, 2016, retrieved October 31, 2016.

Let’s Drink to Paper!


Trade show season has come and gone (for some of us), and every time I travel I can’t help but notice that the most popular drink on an airplane seems to be the Bloody Mary. It’s not a drink one sees people ordering often in one’s neighborhood bar (not that I pay an awful lot of attention, or that I am Dylan Thomas), but the vodka, tomato juice, and Tabasco concoction seems to be largely the province of the air traveler. And there could be a very good reason. I recently came across a study, published in 2015, that found some scientific basis for preferring a Bloody Mary on a plane: it tastes better there.

We all know that our five senses don’t operate independently of each other and, specifically, that the sense of smell is important to the sense of taste, which we know intimately because food tastes awful, if we can taste it at all, when we have a cold.

Interestingly, the sense of hearing is also involved in taste; sound can affect the taste of food. In a study conducted by Cornell University’s Department of Food Science, it was found that when noise levels reach about 85 decibels, such as that found inside an airplane cabin, sensitivity to sweet tastes is diminished. While saltiness, sourness, and bitterness were generally unchanged, the fifth taste, savory or umami, was enhanced. As a result, at that decibel level, people tended to crave tomato-based products, like a Bloody Mary. It also explains why airline food gets a bad rap as being bland and tasteless; it may actually taste much better on the ground. The dryness of the cabin also desiccates airline food, especially meats, which is why they are inevitably served in sauces and gravies. Then again, I almost always fly Southwest, which doesn’t serve meals, and the peanuts and pretzels are just fine. Anyway, the flavors in things like tomatoes, carrots, mushrooms, and potatoes tend to be enhanced whilst aloft. Lufthansa once found, in a study of its own, that airline passengers ordered tomato juice as often as beer. Ach du meine Güte!

Speaking of airplanes, did you know that the paper airplane was invented about 2,300 years before the airplane itself? Not that anyone called it a “paper airplane,” of course. Until the advent of aviation in the 20th century, it was known as a “paper dart,” and is still occasionally called that. Origami and the art of paper folding—using paper for religious and other ceremonial rituals—were among the first uses for paper in ancient China. (Kites, by the way, have been traced back to fifth century B.C.E. China, but they were originally made of silk stretched over a bamboo frame rather than paper.)

As for when paper itself was invented, that’s a bit difficult to pin down. The oldest piece of true paper ever found, in Lu Lan, China, dates from 252 B.C.E.

You have to admit, as Mark Kurlansky points out in the prologue to an excellent new book called Paper: Paging Through History, paper isn’t the most obvious invention. Even today, creating it is a fairly complicated, counterintuitive process. (Cut down a tree, boil it into muck, filter the muck through a screen, and let it dry? Who does that?) Originally, though, it was the end of a natural progression of using plants as writing materials. After stone and clay, leaves and bark were used by early civilizations as writing surfaces, which were much easier to carry around (provided you removed the leaves or bark from the tree, of course), and the Egyptians took things a bit further with their processing of the papyrus plant into a remarkably paper-like material. So perfect was papyrus as a writing surface at the time that Egypt sold its papyrus to other countries, becoming the first “paper” distributor. Although papyrus plants grew throughout the Middle East, those grown in Egypt were of the best quality, so the Egyptians pretty much cornered the market on the stuff, and it became the de facto writing material throughout the civilized world. All the books that were being collected by Ptolemy, ruler of Egypt, for the great library at Alexandria, were written on papyrus. (This was Ptolemy I Soter I, c. 367 B.C.E.–283 B.C.E., successor of Alexander the Great and founder of the Ptolemaic dynasty, not to be confused with Claudius Ptolemy, famous for his star charts.)

Interestingly, as Kurlansky relates, Eumenes, who ruled the Greek city of Pergamum, also wanted to build a big library. This did not sit well with Ptolemy, who wanted the only whopping great library in the world, so he refused to sell Eumenes any papyrus. Eumenes, not to be deterred, tried to find an alternative writing material. It took more than a century, but the Pergamumians (Pergamese?) figured out a way to make a good quality writing material out of animal hides. This material was named for the city, pergamum, which over the centuries evolved into the English word parchment.

Over the years, various materials have been used as, or even specifically called, “paper,” but to be considered proper paper, the substance needs to be composed of a network of cellulose fibers—that is, derived from plant matter. Although various civilizations used the occasional plants and grasses for papermaking, for centuries—until the 19th century—papermakers obtained cellulose fibers secondhand from discarded clothing and other textiles like flax and cotton derived originally from plant matter. Papermaking used to be a smelly, disgusting business, as this was long before people were as fastidious about (or even cognizant of) personal hygiene as we (hopefully) are today, so sorting through discarded underwear as a prelude to pulping was not one of society’s glamour professions. It was also a good way to catch all manner of diseases, such as the Plague.

The basic building block of paper—cellulose—itself wasn’t discovered until 1838, when French chemist Anselme Payen (1795–1878) first isolated and named it. Payen also co-discovered the first enzyme, diastase, which he named after the Greek word διάστασις (diastasis, meaning “parting, separation”). Payen and his colleague Jean-François Persoz heated a beer mash (a mixture of barley and water), and found that it was the presence of diastase that converted the starch in the barley seed into soluble sugars. Payen and Persoz also began the convention of naming enzymes such that they ended with -ase, which continues today.

We’ve covered the transition from rag- to wood-based papermaking in a previous post, but papermaking had been witness to a few technological innovations over the course of its development. One particular technology that caused no small amount of friction was the introduction of the Hollander beater, developed by the Dutch in the mid-1600s. The Hollander beater was, essentially, an efficient way to chop up rags into paper pulp. (It was the advent of the Hollander beater that made the Netherlands a major player in the international paper market in the latter half of the 17th century.) It was faster, and as a result produced cheaper paper—cheaper in price, not necessarily in quality, despite the early insistence by some that it produced inferior paper. It spread throughout Europe, especially when paper mill owners realized that, by some estimates, using the Hollander beater could cut paper manufacturing costs by as much as 75 percent.

As you would imagine, not everyone was happy about this new technological innovation, and when one French paper company installed it, it became the site of a famous labor contretemps. The company owned several paper mills in Vidalon-le-Haut in the Rhône valley south of Lyon, and employed about 150 people, 50 of them children (this was before child labor laws; c’est dommage). A key employee at the time was called a “vatman,” who was basically the craftsperson who dipped the wire mesh into the slurry of pulp. (I could make a Jake & the Vatman joke here, but I’m trying to wean myself off increasingly obscure Gen X cultural references.) The creation of the pulp itself was also the purview of the vatmen, and they were highly skilled and in-demand employees. Vatmen’s best practices were referred to as les modes, and any mill that eschewed les modes could find itself, shall we say, beaten to pulp by the labor market. Enter the Hollander beater, and a confrontation with the vatmen was inevitable. In 1781, there was a lockout, and mill management sought to recruit younger workers not as entrenched in les modes. It became a pivotal moment in labor relations in the paper industry.

These mills were owned by a papermaking family named Montgolfier. Pierre Montgolfier and his wife, Anne Duret, had 16 children (yikes), and while son Raymond was groomed to take over the family business, two of the brothers—Joseph-Michel (1740–1810) and Jacques-Étienne (1745–1799)—had other ideas. As legend has it, Joseph happened to be watching laundry drying (and why not? It’s still better than reality TV). He noticed that the laundry, suspended above a fire, tended to billow upward. This was before it was understood that hot air is less dense than cool air and thus rises, so Joseph believed that smoke contained a special gas, which he called Montgolfier Gas, with a property he called—no kidding—“levity.” The lightbulb (or candle, perhaps, as the lightbulb had yet to be invented) went off over his head, and he and his brother went about designing and building what would become the first hot-air balloon. In 1783, a balloon of the Montgolfiers’ design carried the first humans to slip the surly bonds of Earth and fly in a lighter-than-air craft.

And, to bring things full circle, the first-ever in-flight beverage—not a Bloody Mary—was provided by Benjamin Franklin, who popped the cork on a bottle of champagne that was carried aloft on that first balloon flight. I bet it tasted great.



Michael Huberman, review of Papermaking in Eighteenth-Century France: Management, Labor, and Revolution at the Montgolfier Mill, 1761–1805, EH.NET, July 2001.

Mark Kurlansky, Paper: Paging Through History, New York: Norton, 2016.

James Maynard, “Airplane Food Tastes Bad Because Of Noise, Study Reveals—So Why Are The Bloody Marys So Good?” Tech Times, May 19, 2015.

Chris Nichols, “Fine Dining at 30,000 Feet: The History of Airline Food,” Los Angeles Magazine, September 16, 2015.

Leonard N. Rosenband, Papermaking in Eighteenth-Century France: Management, Labor, and Revolution at the Montgolfier Mill, 1761–1805, Baltimore: Johns Hopkins University Press, 2000.

Dan Stone, “Why Your Bloody Mary Tastes Better on a Plane,” National Geographic, May 21, 2015.

“Anselme Payen,” Wikipedia, last modified October 15, 2016, retrieved October 21, 2016.

“Diastase,” Wikipedia, last modified October 8, 2016, retrieved October 21, 2016.

“Montgolfier Brothers,” Wikipedia, last modified September 9, 2016, retrieved October 21, 2016.


A Serpentine Tale with a Photo Finish


I confess I rarely spend more than a few seconds a week on Facebook, but the germ—and final chapter—of this essay came from a linked story I fortuitously happened to catch, posted by that modern-day wizard of light and glass, Andrew Gordon.

The other day, someone asked me, “what’s the good word?” and my usual response is “penultimate,” as I think it’s a really good word. (It sounds so ominous and demands to be spoken by someone with a deep, sonorous voice like James Earl Jones—and yet it only means “next to last.” You have to admire a word like that.) However, I recently came across another good word that I rather like, although it’s perhaps even less useful in normal conversation: “aposematism.” Coined by British evolutionary biologist Edward Bagnall Poulton in his 1890 book The Colours of Animals: Their Meaning and Use Especially Considered in the Case of Insects, it refers to an animal’s warning coloration. You know all those yellow, red, or other vibrantly colored frogs and newts and other critters? They’re so colored as a kind of advertising display: the coloration signals to would-be predators that they’re poisonous. You can see the advantage to this; it doesn’t do a frog much good if the creature that eats it only later dies from poisoning.

So the perception of color in the animal world can very often be a matter of life and death. It can be for humans, too, particularly in our dealings with the natural world.

Reader participation time: quick, name a poisonous snake.

If you said “rattlesnake,” “cobra,” “copperhead,” or something similar—well, sorry, that’s wrong. Those snakes are not poisonous, they’re venomous. It sounds horribly pedantic, but there is a distinction. A poison is a toxin that does its damage by being inhaled or ingested. A venom is a toxin that requires direct injection into the bloodstream, which is what a venomous snake will attempt (fangs for the memories). You could actually eat a rattlesnake or cobra with no problem (OK, if it’s still alive there will be some problem), and it probably tastes like chicken. You could even conceivably drink snake venom and be fine, as long as you don’t have any cuts in your mouth or any other way for the venom to get into your bloodstream (digestive juices would break down the toxins in the stomach). However, don’t try this at home!

(There is a drink called a snakebite, a mix of lager and cider, that sounds about as appealing as its namesake.)

There are only a few known snakes that are actually poisonous: some species in the Rhabdophis genus, such as the red-necked keelback snake indigenous to Thailand, eat poisonous toads, sequester the poison, then secrete it from glands in their necks to ward off predators. (They’re also properly and highly venomous.) And there is a type of garter snake native to Oregon that eats poisonous newts, and while the snake itself is immune to the poison, enough remains in the snake’s liver après dîner that it becomes poisonous to small predators such as birds and rodents.

Alas, aposematism doesn’t help these guys out, although it does come in handy for some kinds of venomous snakes. How can you tell, say, a lethal coral snake from a generally harmless scarlet king snake, both of which, to the relatively untrained eye, look pretty similar? There are little mnemonic rhymes based on these snakes’ color schemes that can offer guidance: “Yellow touching red: You’re dead,” “Red against yellow can kill a fellow,” or, on the happier side, “Red touching black: Safe for Jack.”

Mind you, these are fairly specific to these species, and may not be of much help when you consider all the various kinds of venomous snakes. Generally, it’s a good idea to keep a discreet distance from any unfamiliar snake, especially since venom is only one way they can kill you. (An anaconda, if Hollywood is to be believed, could comfortably swallow Jon Voight whole.)

Venomous or benign, colorful or drab, the snake has been one of humankind’s most potent symbols, and virtually every religion has a snake involved at some point. Indeed, the oldest known example of a religious ritual is serpentine; in 2006, 70,000-year-old spearheads were found near carvings on a snake-shaped rock, providing, said the investigators, the earliest evidence of ritual behavior.

One of western civilization’s most iconic images is that of a snake eating its own tail, called the ouroboros (from the Greek οὐροβόρος ὄφις, “tail-devouring snake”), often used as a symbol for self-reflexivity or cyclicality. The oldest known use of the symbol is from a 14th-century B.C.E. Egyptian funerary text found in the tomb of King Tutankhamen (who was not, as the song goes, “born in Arizona and moved to Babylonia” nor was he “buried with his donkey”). The ouroboros motif is found in many disparate cultures, and is even used in alchemy as, says Wikipedia somewhat poetically, “a symbol of the eternal unity of all things, the cycle of birth and death from which the alchemist sought release and liberation.”

The ouroboros also played a role in decidedly more modern chemistry in a famous story involving German organic chemist Friedrich August Kekulé. (By the way, the family name was originally Kekule, sans accented “e.” His father added the accent after Napoléon invaded Hesse, where they lived, to ensure that the French pronounced their name correctly. The things people worry about.) Anyway, Kekulé was a pioneer in identifying the chemical structure of compounds, and one of his biggest challenges was the hydrocarbon benzene (C6H6). He wrote about his “aha” moment:

I was sitting, writing at my text-book; but the work did not progress; my thoughts were elsewhere. I turned my chair to the fire and dozed. Again the atoms were gamboling before my eyes. This time the smaller groups kept modestly in the background. My mental eye, rendered more acute by the repeated visions of the kind, could now distinguish larger structures of manifold conformation: long rows, sometimes more closely fitted together; all twining and twisting in snake-like motion. But look! What was that? One of the snakes had seized hold of its own tail, and the form whirled mockingly before my eyes. As if by a flash of lightning I awoke; and this time also I spent the rest of the night in working out the consequences of the hypothesis (via Read, 1957).

And thus had Kekulé ouroborotically divined the ring structure of benzene.

Benzene takes its name from gum benzoin, an aromatic resin that had been known since the 15th century and used in medieval medicine and perfumery. Benzene itself was first chemically isolated in 1825 by Michael Faraday (1791–1867), who, known today for his contributions to electricity and magnetism (he discovered the principles of electromagnetic induction, diamagnetism, and electrolysis), began his career as a chemist. In fact, he invented an early version of what would later be known as the Bunsen burner. From 1821 onward, Faraday was heavily involved with the Royal Institution of Great Britain, and in 1833 he was appointed the Institution’s first Fullerian Professor of Chemistry, which was a lifetime gig. Late in his life, he had frequent conversations with a Scottish physicist named James Clerk Maxwell (1831–1879), who would often attend lectures at the Royal Institution, particularly after being inducted (not electromagnetically) into the Royal Society in 1861.

Maxwell was one of the great polymaths, even dabbling, Vogon-like, in poetry, and the opening line of his “To the Chief Musician upon Nabla: A Tyndallic Ode”—“I come from fields of fractured ice”—perhaps anticipates Led Zeppelin’s “Immigrant Song” by about 100 years. Maxwell is most famous for his equations governing electricity and magnetism, but he also did pioneering work in chemistry, astronomy (he was the first to propose that Saturn’s rings were actually composed of tiny particles), mathematics, and engineering, to name just a few. He is also famous for “Maxwell’s demon,” a thought experiment meant to demonstrate how one might possibly violate the Second Law of Thermodynamics.

In 1860, Maxwell was granted the Chair of Natural Philosophy at King’s College, London, and it was during the five years he spent at King’s College that he would be involved in a wide variety of scientific endeavors—including one that has particular relevance to us.

As I’ve mentioned in this space before, in 1826 or 1827, Joseph Nicéphore Niépce took what is believed to be the first photograph. Not a selfie, thankfully, but rather the “View from the Window at Le Gras.” It was a black-and-white image, and photography would soon explode into the full gamut of glorious grayscale. But what of color?

In an 1855 paper, Maxwell theorized how the human eye processes color, proposing that the eye has separate receptors that variously detect red light, green light, and blue light. He then sought a way to demonstrate his color theory; his intent was not to create a new technology or art form. Working with photographer Thomas Sutton, who had invented the single-lens reflex (SLR) camera, he took three photographs of the same tartan ribbon (he was Scottish, remember), each one shot through a separate filter—one filter was red, one was green, and one was blue. Sound familiar? Yes, they were basically color separations. (He had also made a fourth separation, yellow, that he ultimately didn’t use.)

In a live lecture on May 17, 1861, he took the three images, developed them as slides, and when he projected each one through the appropriate color filter, the composite image was a full-color photograph. Well, sort of. The problem was that the red and green photographic plates were less sensitive than the blue, so the color balance wasn’t quite true to life. But still…

Those conversant in color theory will also note that Maxwell’s demonstration was what we know as “additive color mixing.” It wasn’t until the advent of a subtractive mixing process, using colorants like inks, that color photography—and color printing—could come to life. But that’s another story for another time.
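Maxwell’s three-exposure method amounts to what we would now call channel separation followed by additive recombination. Here is a minimal sketch of that idea; the “scene” and its pixel values are invented purely for illustration:

```python
# Maxwell's demonstration as channel separation plus additive recombination.
# The scene and all pixel values below are made up for illustration only.

def expose(scene, channel):
    """Photograph the scene through one color filter: the plate records
    only that channel's intensity, as a grayscale image."""
    return [pixel[channel] for pixel in scene]

def project(red_plate, green_plate, blue_plate):
    """Project all three plates through their filters onto one screen:
    the lights add, recombining into full-color pixels."""
    return list(zip(red_plate, green_plate, blue_plate))

# A tiny "tartan ribbon": each pixel is an (R, G, B) triple, 0-255.
scene = [(200, 30, 40), (10, 180, 60), (90, 90, 220)]

plates = [expose(scene, c) for c in range(3)]   # the three separations
reconstructed = project(*plates)
assert reconstructed == scene   # additive mixing restores the original colors
```

In Maxwell’s actual demonstration the recombination was imperfect because the red- and green-sensitive plates under-recorded relative to the blue, which is why his color balance was off.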

In the meantime…watch out for snakes, whatever color they may be.



Tim Barribeau, “Color Photography Turns 150 Years Old Today,” Popular Photography, May 17, 2011,

Andrew Durso, “Poisonous Snakes Can’t Resist Toxic Toad Tucker…or Can They?” Scientific American Guest Blog, August 21, 2012,

Josh Jones, “Behold the Very First Color Photograph (1861): Taken by Scottish Physicist (and Poet!) James Clerk Maxwell,” Open Culture, August 22, 2016,

JR Minkel, “Offerings to a Stone Snake Provide the Earliest Evidence of Religion,” Scientific American, December 1, 2006,

Sir Edward Bagnall Poulton, The Colours of Animals: Their Meaning and Use, Especially Considered in the Case of Insects, New York: D. Appleton and Company, 1890, p. 337.

John Read, From Alchemy to Chemistry, New York: Dover Publications, 1957, pp. 179–180.

Rhabdophis subminiatus (Red-Necked Keelback Snake), Thailand Snakes, June 23, 2015,

Frequently Asked Questions About Venomous Snakes, Department of Wildlife Conservation & Ecology, University of Florida,

“Snake Rhyme – Red Touch Yellow,” Snake Removal National Service,

“August Kekulé,” Wikipedia, last modified on March 20, 2016, retrieved August 24, 2016,

“James Clerk Maxwell,” Wikipedia, last modified on July 9, 2016, retrieved August 24, 2016,

“Michael Faraday,” Wikipedia, last modified on July 19, 2016, retrieved August 24, 2016,

“Ouroboros,” Wikipedia, last modified on August 23, 2016, retrieved August 24, 2016,

“Snake,” Wikipedia, last modified on August 21, 2016, retrieved August 24, 2016,

Olympic Printing?


The 2016 Rio Olympics are in their final weekend, another slate of medalists goes into the record books, and Michael Phelps returns to his tank at SeaWorld, cups and all. Truthfully, though, I can’t say I’m much of an Olympics fan (although I do like the Winter Games marginally more), but I probably would be more of one if they reintroduced graphic arts as an Olympic event.

That’s not a joke; printmaking actually was, at one time, part of the Olympics.

As we all know, the Olympic games date back to ancient Greece, although no one is certain exactly when the first games were held. Legend has it that the Games were founded by Heracles (aka Hercules, his better-known Roman name), but since he was a mythological figure, one questions the veracity of that claim. The first Olympic Games for which there is any record were held in 776 B.C., and the first recorded Olympic champion was a cook named Coroebus of Elis, who won the sprint competition. He was quite the standout athlete, largely because, at the time, there really was only one Olympic event: a footrace known as a stade (from which we get the word “stadium”). Over the ensuing decades, new events were added to the games, such as longer footraces, wrestling, boxing, pentathlon, chariot racing, and an event called pancratium, which was a fairly brutal combination of wrestling, boxing, and general fisticuffs.

The competitors were athletes from Greece’s various city-states, as well as from Greek colonies in what is now Italy, Africa, and Asia Minor. They didn’t have to worry about designing special uniforms because, as was the custom back in ancient Greece, most athletes competed nude. (Etymologically, the word “gymnasium” comes from the Greek gymnos, or “naked”—ergo, a gymnasium is a “naked place,” which is pretty disturbing if you think about your local gym.) This may seem strange to us latter-day Victorians, but nudity was no big deal to the ancient Greeks (a bigger issue for the athletes, you would think, would be aerodynamics….). And, alas, Olympic competition was males-only, although there are records of women and girls competing in small, local Olympic-type games.

Initially, the ancient Olympics were held on a single day, but over the centuries expanded to four days of competition, with a fifth day for a closing ceremony and awarding prizes. The original Olympics were also highly religious events.

By the 2nd century B.C., Greece had been conquered by Rome, and the Romans were not too keen on the Olympic Games; they much preferred chariot races and bloody gladiatorial combat to athletic prowess. The Olympic Games petered out over the years and were officially abolished somewhere around 400 A.D. by the Roman emperor Theodosius I. Shortly thereafter, his son ordered the destruction of Greek temples. So that was that.

Fast forward a millennium and a half or so to 1821, when Greece began its war of independence from the Ottoman Empire and interest in reviving the old games started to percolate. In 1833, poet and newspaper editor Panagiotis Soutsos published a poem called “Dialogue of the Dead” in which he officially floated the idea of bringing the Games back. For a while, an idea was all it was, until 1856, when a wealthy Greek-Romanian named Evangelos Zappas offered to fund a revival of the Olympics. Greece’s King Otto took him up on it, and in 1859 the first Olympic Games in almost two millennia were held in Athens. For the next couple of decades, the Games were held in Athens every four or five years, and it’s been somewhat reliably estimated that the 1870 Games drew a crowd of more than 70,000. It seems they were on to something.

Enter Baron Pierre de Coubertin (1863–1937), a French aristocrat turned educator and historian. He became infatuated with ancient Greece, especially the Hellenic idea of the combination of physical and intellectual fitness. He was also stung by France’s humiliating defeat in the recent Franco-Prussian War, and felt that physical fitness was the route to developing better soldiers. One of his missions was to incorporate physical fitness into French schools, but to no avail. C’est dommage.

Anyway, long story short: Coubertin was inspired by the revival of the Greek Olympic Games and, in 1890, he took the mantle from Zappas and other Olympic organizers and created the International Olympic Committee (IOC), which today remains the governing body of the Games, for better or worse. The nascent IOC’s plan was to make the Olympics an international “moveable feast,” held in a different city every four years. The 1896 Olympics were the first to take place under the new organization, and the Games were afoot! In 1900, Paris became the first host city outside Athens. The Winter Games would not arrive until 1924 (for obvious reasons, there were no Winter Olympics in ancient Greece), but in 1912 another component was added to the Games….

Remember how Coubertin was devoted to the idea of developing both body and mind? In 1908, Coubertin floated the idea of adding artistic competitions to the Olympic Games. The IOC identified five areas of artistic competition: architecture, literature, music, painting, and sculpture, and the rub was that the works of art created in these competitions had to be inspired by sport. The plan was to launch the “Concours d’Art” at the 1908 Olympics, but for various reasons it was delayed until the 1912 Summer Games, held in Stockholm.

Honestly, though, not everyone was on board with the idea of Olympic art competitions, and they drew few entrants. Few of these competitions were held live; for example, at the 1928 Games, Jan Wils won the gold medal in architecture for designing the stadium in which the 1928 Games were actually being held. Talk about a home-field advantage!

Meanwhile, the categories kept changing from Olympic year to Olympic year, and it’s hard to imagine that town planning (introduced as an Olympic event in 1928) would have merited much TV coverage.

Other artistic competitions included literature (dramatic, epic, and lyric), music (the judging was based on written sheet music, not performance, which also would not have made for good TV), sculpture, and painting.

(If you’re like me, the idea of an Olympic literature competition reminds you of Monty Python’s “Wide World of Novel Writing” sketch in which Thomas Hardy writes Return of the Native live.)

Anyway, the “painting” category was in constant flux. Until 1928, there was only a single painting category, but that year it was divided into three sub-categories: drawings, graphic arts, and painting. Four years later, the categories were changed to paintings, prints, and watercolors/drawings, and four years later still, “prints” had been replaced by “graphic arts and commercial graphic arts.” The first gold medalist in the graphic arts was American painter (and later Rear Admiral in the U.S. Navy) Joseph Webster Golinkin, who won in the “prints” category in 1932 for a work called “Leg Scissors.”

The artistic competitions were ultimately dropped after the 1948 Games, largely because it was difficult to gauge the amateur status of the competitors.

Maybe we should bring back the graphic arts to the Olympics. After all, some wide-format output is actually a significant physical challenge to move around. Let’s also not forget the various vehicle wrapping “Olympics.”

And many printers today have been known to jump through hoops for clients—which certainly qualifies as a sub-category of gymnastics.



Olympic Museum, Art Competition 1912–1948,

Harold Maurice Abrahams, “Olympic Games,” Encyclopedia Britannica,

“Art competitions at the Summer Olympics,” Wikipedia, last modified on January 1, 2016, retrieved August 11, 2016,

“Art competitions at the 1932 Summer Olympics,” Wikipedia, last modified on January 1, 2016, retrieved August 11, 2016,

“Olympic Games,” Wikipedia, last modified on August 9, 2016,

“Pierre de Coubertin,” Wikipedia, last modified on August 10, 2016, retrieved August 10, 2016,

A Nose By Any Other Name…


Last month, the Saratoga Shakespeare Company kicked off its annual “Shakespeare in the Park” performances up here in Saratoga Springs, N.Y. They are doing Romeo and Juliet in August, but the July performance was actually “Rostand in the Park,” a production—and a highly enjoyable one—of Edmond Rostand’s Cyrano de Bergerac, the 1897 play about the proboscally prodigious swordsman and the odd love triangle he finds himself in. (Steve Martin fans may be familiar with the 1987 film Roxanne, which was a modern adaptation of Cyrano.) To my shame, I had never read nor seen a production of the original play before, so I ordered a “pocket edition” of the play. Aldus Manutius would have been pleased.


Aldus Manutius (1449–1515) was one of the most important figures in early book printing and publishing. In his youth, while studying at the University of Rome in the 1460s and ’70s, he developed a passion for the classics of Greek and Roman literature. He learned Greek and kicked around Italy for a bit, working as a tutor to the children of various royal families (nice work if you can get it). Later, a friend introduced him to the Prince of Carpi, who gave him the funds to set up a printing press for the purpose of promoting Greek scholarship.

In 1489, Manutius gave up teaching and went into publishing full time, moving to Venice and setting up the Aldine Press as a partnership with printer Andrea Torresano. The Aldine Press was staffed with the usual contingent of compositors and printers—and, yes, probably someone who might have been referred to as an “Aldus pagemaker” (sorry)—as well as a team of Greek scholars. Manutius did most of the work but technically owned only about 10 percent of the business, although his situation improved when he married Maria Torresano, his partner’s daughter (no fool he).

Manutius’ goal was to preserve the Greek and Roman texts he so loved, since there were few editions of the classics in print at the time—not surprising given that printing was less than 50 years old. Manutius pioneered many printing and publishing innovations. First, he invented the first “pocket editions,” or small-format books that could be easily carried around. His books came in three sizes: folio, quarto, and octavo, each term referring to how many times a sheet of paper was folded to produce a given number of pages. His innovation, the octavo, folded each sheet three times to produce 16 pages (or eight leaves), and as you would expect these pages were pretty small.
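The arithmetic behind those format names is simple: each fold doubles the number of leaves, and every leaf carries two pages (front and back). A quick sketch:

```python
# Each fold of a sheet doubles the number of leaves;
# each leaf has two pages (front and back).
def leaves(folds):
    return 2 ** folds

def pages(folds):
    return 2 * leaves(folds)

formats = {"folio": 1, "quarto": 2, "octavo": 3}
for name, folds in formats.items():
    print(f"{name}: {folds} fold(s), {leaves(folds)} leaves, {pages(folds)} pages")
# The octavo's three folds yield 8 leaves and 16 pages per sheet.
```

Since the sheet size stays fixed, every extra fold also halves the page dimensions, which is why the octavo pages were so small.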

The problem, as you can imagine, was how to fit all the text he wanted into such a small format without the book ending up prohibitively thick. He then created (or, rather, hired Venetian type designer Francesco Griffo to create) a new kind of typeface, which he called “italic,” a reference to classical Italy. Manutius’ italic, which he used for all the text, not just emphasized words or phrases, was based on the cursive handwriting style used in Italian government offices for—yes—the “fine print” of official documents. His first octavo edition—Virgil’s Opera—appeared in 1501, and the editions would prove to be wildly successful. The Aldine editions became the definitive texts for scholars and were largely responsible for kicking off the Renaissance interest in all things ancient.

Manutius also was the first printer to use the semicolon, which would greatly inspire the works of Virginia Woolf. (That’s an obscure joke unless you’ve read To the Lighthouse.) He also developed the modern look of the comma. He was punctual, you have to give him that.

Manutius had tried to patent his italic typeface, but nevertheless various copies of it began to spread inside and outside Italy. In France, a German-born printer, publisher, and bookseller—and popularizer of the Aldine editions—named Sébastien Gryphe (aka Sebastian Gryphius) had set up shop in Lyons circa 1520, initially publishing law books. He was also a big fan of the classics, and began printing his own editions of the Greek and Latin texts. In 1536, he founded l’Atelier du Griffon, and by the 1540s he was the “Prince of Lyons booksellers” (Febvre and Martin, 1976) and was turning out Aldine-esque editions.

One of Gryphe’s Latin text editors was a young former Franciscan monk and medical student named François Rabelais, who had originally come to Gryphe to publish his own translations of works by ancient physicians such as Hippocrates and Galen. In the 1540s, Rabelais juggled his job as a physician with editing Gryphe’s Latin texts, and in his spare time he wrote humorous pamphlets and other works. In 1532, he had published, under the anagrammatic pseudonym Alcofribas Nasier, a book called Pantagruel, which would become the first in a highly successful series and the reason we know of Rabelais today: The Lives of Gargantua and Pantagruel, a pentalogy published between 1532 and 1564. The books were crude, bawdy, vulgar, scatological, yet funny tales about the two titular father-and-son giants and their adventures. The Sorbonne deemed the book(s) obscene, and the Roman Catholic Church was not a fan either, as the tales poked a bit of fun at religion; Rabelais had a few heresy scares. Despite all this, the series was phenomenally popular.

Gargantua also inspired some writers who did have subversive intentions. In 1533, only a year after Rabelais’ first volume appeared, a French Protestant pastor named Antoine Marcourt published a satire of the Catholic Church called Le Livre des Marchans, very much in the style of Pantagruel (he also included references to Rabelais’ book in the text). (Le Livre des Marchans was published by Pierre de Vingle, who would, two years later, print the first Bible in French.)

Marcourt is perhaps most infamous for instigating what has become known as “The Affair of the Placards.” During the evening of October 17, 1534, anti-Catholic posters mysteriously appeared around several cities in France. One even appeared on the bedroom door of King François I, which was a pretty cheeky breach of security and scared the bejesus out of him. It also ended whatever tolerance the king had for the Protestants. He closed bookstores and publishing houses, and went on to lead a grand procession that ended in the execution of six Protestants implicated in the Affair.

One of those executed was Audebert Valeton, a Nantes property tax collector whose daughter Catherine was the maternal grandmother of a colorful character named Cyrano de Bergerac (1619–1655). A novelist, playwright, soldier, and duelist, de Bergerac was an actual historical figure, although very little is known about his life. Two of his novels, The Comical History of the States and Empires of the Moon and The Comical History of the States and Empires of the Sun, both published posthumously, were among the earliest works of science fiction; he was perhaps the first person to describe a rocket leaving the Earth.

De Bergerac was the inspiration for Rostand’s fictional Cyrano, and judging from the one engraving that was made of the real Cyrano, he did seem to have a bit of a schnozz. (The plot involving him ghostwriting Christian’s love letters to Roxane was completely Rostand’s invention.) The real de Bergerac died at age 35, although no one is entirely certain how. Some speculate he was mortally wounded in an accident; some claim he was assassinated (he had some pretty bitter enemies).

Rostand’s Cyrano was an immediate hit. The original Cyrano, Constant Coquelin, performed the role 410 times in France before taking it to North America. The play has been performed, adapted, and filmed many, many times ever since, and its basic plot has been recycled in countless rom-coms and sitcoms.

And Aldus Manutius would not have been pleased by the poor quality of the pocket edition of the play that arrived in my mailbox.

If you enjoy these historical digressions, 21 of my Digital Nirvana posts from last year have been collected into a new book called Printing Links, available in paperback from Amazon.



Etudes rabelaisiennes, Volume 11, Librairie Droz, 1974, p. 117n.

“1535: Six Protestants for the Affair of the Placards,”, January 21, 2013,

Frank Romano and Richard Romano, The GATF Encyclopedia of Graphic Communication, Sewickley, Pa.: GATF Press, 1997.

Lynne Truss, Eats, Shoots & Leaves: The Zero Tolerance Approach to Punctuation, New York: Gotham Books, 2004.

Lucien Febvre and Henri-Jean Martin, The Coming of the Book: The Impact of Printing, 1450–1800, London: Verso, 1976, p. 149.

“Affair of the Placards,” Wikipedia, last modified on May 17, 2016, retrieved August 1, 2016,

“Aldus Manutius,” Wikipedia, last modified on July 26, 2016, retrieved August 1, 2016,

“Antoine Marcourt,” Wikipedia, last modified on October 1, 2015, retrieved August 1, 2016,

“Cyrano de Bergerac,” Wikipedia, last modified on July 29, 2016, retrieved August 1, 2016,

“Cyrano de Bergerac (play),” Wikipedia, last modified on July 27, 2016, retrieved August 1, 2016,

“François Rabelais,” Wikipedia, last modified on May 13, 2016, retrieved August 1, 2016,

“Gargantua and Pantagruel,” Wikipedia, last modified on April 11, 2016, retrieved August 1, 2016,

“Sebastian Gryphius,” Wikipedia, last modified on June 5, 2016, retrieved August 1, 2016,

Heavens to Betsy!


One of the thorniest issues in modern electronic publishing has been the traditional problem of converting on-screen colors (RGB) to printable colors (CMYK). We often hear of difficulty matching certain brand colors, but there is actually another popular and important set of colors that can’t be completely reproduced using either RGB or CMYK colors. Know why? Because they were designed to be reproduced on cloth. These colors are specified using a swatchbook called the Standard Color Reference of America produced by the Color Association of the United States (CAUS), formerly (until 1955) the Textile Color Card Association of the United States. Various Pantone Matching System approximations have been devised (186 C, 193 C, or 200 C for one of the colors, and 281 C or 288 C for the other). The third color is simply “white,” which is easy enough. Got it yet? One of the colors is officially called “Old Glory Red” and the other is “Old Glory Blue.” Of course, we are talking about the official colors used on the American flag.
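The conversion headache this paragraph opens with can be seen even in the standard textbook RGB-to-CMYK formula sketched below. Real prepress workflows rely on ICC profiles and gamut mapping, which this sketch deliberately omits, and the sample input is an illustrative deep red, not any official flag specification:

```python
# Naive RGB -> CMYK conversion (textbook formula, not a production
# workflow: no ICC profiles, no gamut mapping, no ink limits).
def rgb_to_cmyk(r, g, b):
    """r, g, b in 0-255; returns (c, m, y, k), each in 0.0-1.0."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    k = 1.0 - max(r, g, b)        # black replaces the common component
    if k == 1.0:                  # pure black: avoid division by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return (c, m, y, k)

# An illustrative deep flag-like red (made-up RGB value):
c, m, y, k = rgb_to_cmyk(191, 13, 62)
print(round(c, 3), round(m, 3), round(y, 3), round(k, 3))
```

Note that this formula is purely arithmetic: it says nothing about how a given press, ink set, or dyed cloth will actually render the color, which is exactly why textile colors defined on swatch cards resist faithful reproduction in either color space.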

The flag has a colorful history (as it were), and has obviously changed subtly as new stars needed to be added. The current 50-star version was designed in 1958 by Robert G. Heft, then a 17-year-old high school student in Lancaster, Ohio. His flag design started as a school project, and Heft only got a B– on it. That didn’t matter, though, because Ike liked it; President Eisenhower chose Heft’s design out of 1,500 submissions, and on August 21, 1959, Executive Order No. 10834 hefted Heft’s design into place as the official 50-star flag of the United States, which we still use today.

There remains some debate as to who designed the original “stars and stripes,” although a stars and stripes motif is common in heraldry, which is likely where it originated. (The Betsy Ross story we learned in grade school is about as apocryphal as George Washington chopping down the cherry tree.) Also of some modest debate is where the name “America” came from. It is commonly accepted that America was named after Italian explorer Amerigo Vespucci. Clouding matters is that it was not common at the time to name countries or continents after an individual’s given name unless s/he were royalty. It would have been more common to use the surname, which means that technically we should be the United States of Vespucci, which would have been pleasing (says a guy named Romano).

It has been claimed that America was actually named after Richard Amerike (c. 1440–1503; also written ap Meryk or Ameryk), a Welshman who relocated to England and became a wealthy merchant and eventually sheriff of Bristol. The evidence—mostly circumstantial—was offered in 1908 by Bristol historian Alfred Hudd, who noted that Amerike had largely financed the 1497 expedition of Genoese explorer John Cabot (né Giovanni Caboto), who was the first European to reach North America (obviously, the ancestors of the Native Americans got here first). Hudd also cited as evidence a letter from 1497 that indicated that the name “America” was well-known—at least in Bristol—at that time.

This year (1497), on St. John the Baptist’s day (June 24th), the land of America was found by the merchants of Bristow, in a ship of Bristowe called the ‘Mathew,’ [Cabot’s ship] the which said ship departed from the port of Bristowe the 2nd of May and came home again the 6th August following (Hudd, 1908).

And while it may have been well-known in Bristol in 1497, the name “America” wasn’t known to Europe in general until a decade later, when it appeared on an extraordinary map.

Martin Waldseemüller and Matthias Ringmann were German cartographers who produced a 1507 masterpiece of mapmaking called, succinctly, “Universalis cosmographia secunda Ptholemei traditionem et Americi Vespucci aliorum que lustrationes” (aka “A drawing of the whole earth following the tradition of Ptolemy and the travels of Amerigo Vespucci and others”). It was woodblock-printed on 12 separate 18×24-inch sheets and, when tiled, measured more than 4×8 feet. It was the first attempt to map the entire world, and one of those revolutionary documents that change everyone’s conception of it, as it was the first map to include the New World. Waldseemüller and Ringmann had been working from a letter called “Mundus Novus” (“New World”), attributed to Vespucci (though it was later disputed and believed to be a fabrication based on a real letter of his), that ostensibly chronicled his voyages across the Atlantic between 1497 and 1504, identified the land he had reached as indeed a new continent, and dismissed Columbus’ notion that the Western Hemisphere was actually Asia.

To honor Vespucci, the cartographers bestowed on this new land the name “America” (a Latinization and feminization of Amerigo). It was rare to name such an important new place after a living person, and Waldseemüller apparently thought as much, because when he revised the map in 1513, he removed the name America and replaced it with “Terra Incognita.” Later cartographic works restored the name America, and it seems to have stuck. (By the way, Hudd’s theory about Amerike is not accepted by many historians. It has also been suggested that the “stars and stripes” design was based on Amerike’s coat of arms, but that’s even more of a reach.)

In this age of GPS and Google Maps, we take actual printed maps for granted (if we take them at all), but cartography—the art and study of mapmaking—is equal parts science and craft, precision and beauty, and mapmaking technology has evolved with the times and new printing techniques. Originally drawn and painted laboriously by hand using brushes on parchment, maps soon were created using relief printing (via wood blocks), copperplate engraving (aka intaglio), and lithography.

A unique map printing technique called cerography (from the Greek for “wax printing”) took the cartographic world by storm in the latter half of the 19th century. It was invented in the 1830s (patented in 1839) by Sidney E. Morse (1794–1871), brother of our old friend Samuel Morse. See Purinton, 2003, for the exact details, but cerography is a type of wax engraving. A thin layer of wax (the exact composition of which was a “secret sauce” proprietary to different manufacturers) is applied to a blackened copperplate. The image is either drawn, traced, or stamped into the wax, and a printing plate is created from the engraved wax via stereotyping or electrotyping. This plate could then be used on conventional letterpress presses. There were many advantages to this process—for example, text and images could be combined on the same plate—and later developments incorporated photographic imaging. Cerography was eventually rendered obsolete by photoengraving. So it goes.

Before he dabbled with mapmaking, Sidney Morse had embarked on some publishing ventures, mostly religious publications. In 1823, with his brother Richard Cary Morse, he founded the New York Observer, and before that, in 1816, he was the original editor of a brand-new religious publication called The Boston Recorder. Known as just The Recorder, it was established by Nathaniel Willis (1780–1870), himself the son of a newspaperman named Nathaniel Willis. In 1827, Willis fils launched a weekly religious publication for children called The Youth’s Companion, and served as its editor for about 30 years. Its original vision, said Willis and co-publisher Asa Rand, was to encourage “virtue and piety, and…warn against the ways of transgression.” Originally published by the Perry Mason Company (whence the name of Erle Stanley Gardner’s crime-solving lawyer—Gardner was a fan of the magazine as a youth), it took a while to become a success, and was taken over by a series of publishers throughout the years. In the 1890s, it changed its target audience to include adults, and when it added a health column for older folks, its circulation soared. Funny how things don’t change. Contributing writers included Harriet Beecher Stowe, Mark Twain, Emily Dickinson, Booker T. Washington, Jack London, and other literati.

The September 8, 1892, issue included a special feature. One of The Youth’s Companion’s promotional activities was selling American flags (they sold 12×18-inch silk flags for 30 cents each) and the publication was planning a national “flag day” for schoolchildren in conjunction with Columbus Day, which that year commemorated the 400th anniversary of his sailing the ocean blue. The editors of the magazine went one step further and decided to go for the patriotic gusto and come up with something that kids could recite as they saluted the flag. Coming up with this text fell to Francis Bellamy, a former Baptist minister who worked in the magazine’s promotion department. He locked himself away for two hours and what he came up with—and, after a few tweaks, was published in the September 8, 1892, issue—began, “I pledge allegiance to my flag and the Republic for which it stands…” The Pledge was first recited in public schools on October 12, 1892, and has been a staple ever since.

And if you’re pledging allegiance to the old red, white, and blue, make sure they’re the right colors.

On behalf of The Digital Nirvana, have a happy and safe Independence Day weekend.

If you enjoy these historical digressions, 21 of my Digital Nirvana posts from last year have been collected into a new book called Printing Links, available in paperback from Amazon.



Standard Proportions For The United States Flag,

John R. Hébert, “The Map That Named America: Library Acquires 1507 Waldseemüller Map of the World,” Library of Congress Information Bulletin, September 2003,

Alfred E. Hudd, “Richard Ameryk and the name America,” read May 21st, 1908, Proceedings of the Clifton Antiquarian Club, Vol. VII, 1909–10,

Frederic Hudson, Journalism in the United States, from 1690–1872, Harper & Brothers, 1873, pp. 295–296.

Peter MacDonald, “The Naming of America,” BBC, March 29, 2011,

Scott Miller, The President and the Assassin, New York: Random House, 2011, p. 120.

Nancy Purinton, “A Historical Map-Printing Technique: Wax Engraving,” Journal of the American Institute for Conservation, JAIC 2003, Volume 42, Number 3, Article 4 (pp. 419 to 424),

Frederick N. Rasmussen, “A half-century ago, new 50-star American flag debuted in Baltimore,” The Baltimore Sun, July 02, 2010,

Jim Sielicki, “Robert G. Heft: Designer of America’s Current National Flag,” United Press International via The Exchange, July–August 1988, archived at

“Cerography,” Wikipedia, last modified on May 28, 2016, retrieved June 14, 2016,

“Flag of the United States,” Wikipedia, last modified on June 12, 2016, retrieved June 14, 2016,

“Pledge of Allegiance,” Wikipedia, last modified on May 29, 2016, retrieved June 14, 2016,

“The Youth’s Companion,” Wikipedia, last modified on February 28, 2016, retrieved June 14, 2016,



A Whole New Ballgame


People of a certain age—and people older than me, for once—have fond memories of the prizes that used to come in boxes of Cracker Jack, the snack food consisting of molasses-flavored, caramel-coated popcorn and peanuts. These folks were likely disheartened by the announcement in April that Frito-Lay (the current owner of Cracker Jack) is doing away with the physical prize. Actually, the company did away with toys and tchotchkes years ago, replacing them with printed jokes and riddles. Now, instead, packages will come with a sticker containing a QR code. You scan it with your smartphone, download a Blippar-based app, and you can play a baseball game. I guess that’s progress.

Honestly, I never really liked Cracker Jack, but the prize and iconic box (also discontinued long ago) were all part of the pop culture tapestry, and could actually be useful shorthand. Back in the 1990s, when I wrote for Digital Imaging magazine, a somewhat new application for wide-format digital printing was lenticular imaging, and I used to describe it as an image that, when looked at from one angle, showed one picture and, when looked at from another angle, showed a different picture. When I added the phrase, “like those little prizes you get in boxes of Cracker Jack,” suddenly everyone understood. Ah, well. (It occurs to me that maybe in a few years, Heidi Tolliver-Walker can explain AR and QR as “like those codes you get in packages of Cracker Jack.”)

Lenticular images use a plastic sheet that contains very thin (millimeters wide or smaller) lenses, called lenticles. Underneath the lenticles is a printed sheet containing one or more images that have been sliced into tiny stripes the same width as the lenticles, and interlaced such that each stripe lines up with a lenticle. The lenticles then refract the image in such a way that from one angle you see one image, and from another you see a different one. The lenticles can also be used to impart a 3D effect. Lenticular is a variety of what is known as autostereoscopic imaging—or, in essence, “glasses-free 3D display.”
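The interlacing step can be sketched in a few lines. This toy version makes two simplifying assumptions not in the text above: only two source images, and strips exactly one pixel column wide (a real prepress workflow slices to the measured lenticle pitch and often interlaces many frames for 3D or animation effects):

```python
# Toy two-frame lenticular interlace: alternate the columns of two
# equal-sized images so adjacent strips sit under alternating lens facets.
# One-column strips and exactly two frames are simplifying assumptions.
def interlace(img_a, img_b):
    out = []
    for row_a, row_b in zip(img_a, img_b):
        # even columns come from image A, odd columns from image B
        out.append([row_a[x] if x % 2 == 0 else row_b[x]
                    for x in range(len(row_a))])
    return out

img_a = [["A"] * 4 for _ in range(2)]   # toy 2x4 image of "A" pixels
img_b = [["B"] * 4 for _ in range(2)]   # toy 2x4 image of "B" pixels
print(interlace(img_a, img_b))          # each row alternates A, B, A, B
```

The lens sheet then does the rest of the work optically: from one viewing angle the lenticles magnify only the A strips, and from the other angle only the B strips.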

Examples of autostereoscopy date back to the 17th century and the works of French painter Gaspard Antoine Bois-Clair (ca. 1654–1704). Bois-Clair referred to himself as “Pastor Pictor Poeta,” since he was, over the course of his career, priest, painter, and poet. (I like that the alliteration still works in English.) Born in Lyon, France, he began as a Jesuit priest, then later converted to Lutheranism and relocated to Copenhagen, where he remained for the rest of his life.

As a painter, he is not the most famous or celebrated artist in the world (he doesn’t even have a Wikipedia page—mon dieu!), and his obscure oeuvre is distinguished by one unusual painting: Double Portrait of King Frederik IV and Queen Louise of Mecklenburg-Güstrow of Denmark. Painted in 1692, it is the earliest surviving autostereoscopic image. Here’s how Bois-Clair did it:

To achieve such effect the artist painted on a series of triangularly cut strips of wood. One facet remains against the backing of the painting, while each of the other two equilateral sides are oriented at 60° to it. When the viewer passes in front of the painting he sees successively one image from the right side and then the other from the left (Robert Simon Fine Art).

Bois-Clair is known to have produced at least two other “double paintings” in the same way.

Although Bois-Clair’s works are the earliest examples that we physically have, the concept is a bit older. There is a reference to it in Shakespeare’s Richard II (although I have never liked sequels, Richard III is better):

Each substance of a grief hath twenty shadows,
Which shows like grief itself, but is not so;
For sorrow’s eye, glazed with blinding tears,
Divides one thing entire to many objects;
Like perspectives, which rightly gazed upon
Show nothing but confusion, eyed awry
Distinguish form: so your sweet majesty,
Looking awry upon your lord’s departure,
Find shapes of grief, more than himself, to wail;
Which, look’d on as it is, is nought but shadows
Of what it is not. (II, ii, 14–24)

My annotated Yale Shakespeare edition of the play, published in 1957 (the play itself was written circa 1595), defines “perspectives” in the fifth line above as “optical toys of various kinds,” which Shickman (1977) interprets as a common form of “anamorphic perspective”: the corrugated or pleated panel.

The corrugated type combined two different pictures on a pleated surface, so that one image would be visible when observed directly from the left and another from the right. Looked upon directly, neither subject would be clear (Shickman, 1977).

Think of a louvered door with a different image on each side of the louvers. Shakespeare used this analogy in several plays, as did his contemporaries.

The “turning panel” and Bois-Clair autostereoscopic displays used what is called the “barrier” technique, whereby an image is divided into stripes and aligned behind opaque “barriers” or bars. In the 1890s, the barrier technique went photographic, thanks to the work of Auguste Berthier, and in 1903 American inventor Frederic E. Ives (1856–1937) patented the “parallax stereogram,” which is the earliest precursor to today’s glasses-free 3D imaging.

In 1908, Franco-Luxembourgish inventor Jonas Ferdinand Gabriel Lippmann (1845–1921) developed what he called “integral photography,” an approach to autostereoscopic imaging that used small lenses instead of barrier lines. A scene was photographed through the tiny vertical lenses, and when the image was later viewed through a set of similar lenses, it imparted a lenticular effect. It wasn’t perfect, and it wouldn’t be until the 1920s and 30s that other investigators—including Herbert Ives, Frederic’s son—would improve upon Lippmann’s approach. In the 1930s, the company that would become Vari-Vue developed the first multiple-image lenticular print, and even coined the term “lenticular.” Vari-Vue would become the go-to business for lenticular “flip cards” and other items.

Meanwhile, in 1908, the same year that Lippmann was devising his integral photographic process, an everyday 2D image caught the eye of one Jack Norworth. Born John Godfrey Knauff, Norworth was a singer, songwriter, and vaudeville performer. One day, he was riding the New York City subway and happened to see an advertisement that read “Baseball Today—Polo Grounds.” Creatively inspired, by the end of the subway ride, he had a set of lyrics about a character named Katie, whom a gentleman caller asks out on a date. She agrees—but only if they can go to a baseball game. The lyrics were set to music by Tin Pan Alley composer Albert Von Tilzer and, in 1908, the song “Take Me Out to the Ball Game” became a major hit, selling loads of records, sheet music, and piano rolls. Strangely, it wouldn’t get its first performance at a baseball game until 1934 (and a high school game at that). Perhaps even more strangely, Norworth had never actually been to a baseball game, and it would be 32 years before he went to one.

The song, which is much longer than the short excerpt we usually hear (the chorus), features the line “Buy me some peanuts and Cracker Jack!”—this was before nut allergies were an issue, no doubt—and that particular popcorn and peanut confection was only 12 years old in 1908, having been created in Chicago by Frederick William Rueckheim and his brother Lewis, German immigrants who owned a popcorn-making business. As company lore has it, an early taster of the snack exclaimed, “That’s a crackerjack!” and the name stuck like molasses. You could say they hit a home run with the product. “A Prize in Every Box” first appeared in 1912, when the little toys and tchotchkes were added to the package. Some of them were actually pretty cool, even by today’s standards, and over the years included several lenticular card series, including Alphabet Magic Motion Fun Cards, Canadian 3-D Animals cards, Kaleidoscope Action Card/Sticker, 1984 Olympic Tilt Cards, Robotrons Cards, and more. Vari-Vue, Inc. was the original supplier of lenticular cards, before Toppan Printing, Ltd. and Optigraphics Corporation got in the game.

Now, with QR and mobile apps taking over as the prize inside Cracker Jack, it’s a whole new ballgame.



“Brief History,” archived at

Randee Dawn, “Cracker Jack is replacing toy prizes inside with digital codes,”, April 22, 2016,

Ned Lukacher, Time-fetishes: The Secret History of Eternal Recurrence, Duke University Press, 1998, p. 75.

Jeffrey Scott Maxwell, The Alphabet26Dictionary of Cracker Jack Prize Collecting Terms,

David E. Roberts, History Of Lenticular And Related Autostereoscopic Methods, Leap Technologies, 2003,

Allan Shickman, “‘Turning Pictures’ in Shakespeare’s England,” The Art Bulletin Vol. 59, No. 1 (Mar., 1977), College Art Association, pp. 67–70,

“Gaspar Antoine de Bois-Clair,” Robert Simon Fine Art,

“Vari-Vue: Inventor of the Lenticular Imaging Technique,” DIDIK/VariVue,

Alissa Walker, “The Cracker Jack ‘Prize’ Is Now a QR Code,” Gizmodo, April 22, 2016,

“Frederic Eugene Ives,” Wikipedia, last modified on February 5, 2016, retrieved May 31, 2016,

“Gabriel Lippmann,” Wikipedia, last modified on April 23, 2016, retrieved May 31, 2016,

“Take Me Out to the Ball Game,” Wikipedia, last modified on May 20, 2016, retrieved May 31, 2016,


It’s About Time


About 10 years ago, the battery in my wristwatch died. It was one of those watches that didn’t have a user-replaceable battery; it needed to be taken to a jeweler’s, and—given my penchant for both cheapness and laziness—I never quite got around to it. But at the time, I was routinely carrying a mobile phone with me, which told the time, and this seemed to suffice. A year later, the iPhone was introduced and now—nine years on—everyone is constantly staring at their phones. Oddly, I now find myself in the market for a wristwatch.

The obsession with checking “mobile devices” goes back centuries at least, and some of us (OK, maybe just me) can’t help but think of the smartphone as a kind of Pandora’s Box—especially when using it to play music through the Pandora app.

In Greek mythology, Pandora was the first human woman created by the gods. Acting on instructions from Zeus, Hephaestus and Athena introduced what I guess would be called Woman 1.0, aka Pandora, who during her creation was given a set of unique gifts, not all of them good. Her name, Pandora or Πανδώρα, comes from πᾶν, “pan” or “all,” and δῶρον, “dōron” or “gift.” Ergo, Pandora means “all-giving” or “all-gifted.” Whence the name of Pandora Music. According to the company:

The name Pandora means “all gifted” in Greek. In ancient Greek mythology, Pandora received many gifts from the gods, including the gift of music, from Apollo.

One of the more dubious gifts Pandora was given is perhaps the one she is most known for: “Pandora’s Box,” although in the original Greek myth, it was not a box, but a jar. Opening it, she unwittingly unleashed all the evils of humanity. That’ll happen.

The Pandora myth first appeared in the writings of Hesiod, specifically in Theogony and at greater length in Works and Days (both written circa 700 B.C.). Hesiod wrote that Pandora opened a jar (pithos), releasing plagues, diseases, and all the other—albeit unnamed—evils of humanity.

(Unlike Pandora, Spotify was not named for a character from Greek myth, although its tendency to inevitably play Eminem at the gym is worse than unleashing the evils of humanity.)

Fast forward to the 16th century, when Dutch humanist and theologian Desiderius Erasmus of Rotterdam (that’s the city in the Netherlands, not the suburb of Schenectady, N.Y.) was translating Hesiod’s Pandora story into Latin. However, he confused the word pithos (“storage jar”) with pyxis (“box”). Thus, “Pandora’s box” caught on in a way that “Pandora’s storage jar” probably wouldn’t have (although Monty Python’s “Storage Jars” lives on, and deservedly so).

Erasmus is perhaps most famous for his satirical treatise In Praise of Folly, but he also had an eye for religious reform. Living during the burgeoning Reformation, he came out against some of the abuses of the Catholic Church, but he kept his distance from Martin Luther and strove to stay on the Pope’s good side (no fool he). He also published one of the first Greek editions of the New Testament. (The bulk of the original New Testament was written in a version of Greek, and was later translated into Latin, the official language of the Church.)

A Greek version of the New Testament was first printed in 1514 by Cardinal Francisco Jiménez de Cisneros (it was part of a polyglot edition of the Bible), but it wasn’t officially published until 1522, as Cisneros was waiting for the Pope to approve the Old Testament. (Printers waiting for clients to approve pages. Some things never change.) While Cisneros was waiting on Pope Leo, Erasmus took the opportunity to publish his own New Testament first, featuring a brand spankin’ new Latin translation that he produced himself. It also featured the original Greek text, which he included so that readers could compare and contrast his translation with the original, although we use the word “original” advisedly. He was a little less than honest in places; in some cases, he translated his own Latin back into Greek. (Part of the problem was that he didn’t have access to a complete Greek manuscript.) He would spend years making corrections. Anyway, when it was published in 1516—or, as he phrased it, “rushed into print rather than edited” (ha! wait until the Internet)—he dedicated it to Pope Leo X (again, no fool he).

Erasmus’ new New Testament was a bit of a blockbuster, and editions of it were used as the primary source for translations into other languages, most notably Martin Luther’s German translation of the Bible.

Erasmus’ work also inspired an English scholar named William Tyndale (c. 1494–1536), one of the first to attempt an English version of the Bible—and the first to have at least part of the Bible printed in English. Tyndale drew from original Hebrew and Greek texts for his own translation, but only got through the entire New Testament and half of the Old. At the time, you didn’t just translate the Bible willy-nilly into whatever language you liked. This was frowned upon by the Catholic Church, and one thing you did not want to be was frowned upon by the Church. Tyndale tried to get permission from the Bishop of London, but his project was deemed “heretical.” He fled to continental Europe and published some partial editions, but his translations were banned and burned in England. Tyndale was caught in Antwerp in 1535, convicted of heresy, and “strangled to death while tied at the stake, and then his dead body was burned in the ritualistic fashion then in vogue” (Farris, 2007).

At this time, the hunt was on for any kind of heretical writings, especially unauthorized Bible translations. While Tyndale’s Bible pages were on press in Cologne, Tyndale was betrayed to the authorities by Johann Cochlaeus, an opponent of Luther’s. A prolific writer and “controversialist,” Cochlaeus had the distinction—if you want to call it that—of being reviled by both the Catholic Church and the Reformers at the same time, which takes talent.

For our purposes, though, we will confine our conversation to one passage from Cochlaeus’ 1511–1512 book Cosmographia Pomponii Melae:

Peter Hele, still a young man, fashions works which even the most learned mathematicians admire. He shapes many-wheeled clocks out of small bits of iron, which run and chime the hours without weights for forty hours, whether carried at the breast or in a handbag (Dohrn-van Rossum, 1996).

“Peter Hele” is Peter Henlein (1485–1542), a Nuremberg locksmith and clockmaker, and inventor of the first watch. From the get-go, the term “watch” was used to refer to a timepiece that—unlike a clock—could be moved relatively easily, or even worn. (You could probably wear a pendulum clock, but I’ll save that for Lady Gaga.) The earliest watches were more than three inches in diameter, and their general size and shape gave them the nickname “Nuremberg eggs.” They were accurate to within 15 minutes, were worn dangling outside the clothes like pendants, and were prized more for status or fashion than for their usefulness as timepieces (funny how some things never change).

A century later, fashions changed, as they inevitably do, and it was thanks to technology: watches got smaller and could fit inside pockets—hence, pocketwatches. (And it wasn’t just the watches: the pocket itself had to be invented! That happened in the 17th century, with the advent of the waistcoat.)

So this brings us to May 13, 1665, when British man of letters Samuel Pepys wrote in his famous diary:

To the ‘Change [Royal Exchange] after office, and received my watch from the watchmaker, and a very fine [one] it is, given me by Briggs, the Scrivener. Home to dinner, and then I abroad to the Atturney Generall, about advice upon the Act for Land Carriage, which he desired not to give me before I had received the King’s and Council’s order therein; going home bespoke the King’s works, will cost me 50s., I believe. So home and late at my office. But, Lord! to see how much of my old folly and childishnesse hangs upon me still that I cannot forbear carrying my watch in my hand in the coach all this afternoon, and seeing what o’clock it is one hundred times; and am apt to think with myself, how could I be so long without one… (Pepys, 1665; emphasis added).

Yes, the venerable 17th-century diarist was obsessed with his new pocketwatch and kept obsessively pulling it out and checking the time. Imagine what he’d do with a smartphone.

If you enjoy these historical digressions, 21 of my Digital Nirvana posts from last year have been collected into a new book called Printing Links, available in paperback from Amazon.



“Clocks And Watches—Encyclopedia Of Antiques,” Antiques Digest, Old and Sold,

Andrew Atherstone, Reformation: A World in Turmoil, Oxford: Lion Books, 2011, p. 111.

Isabella Bradford/Susan Holloway Scott, “Samuel Pepys Checks His Smartphone…er, Watch, 1665,” Two Nerdy History Girls, March 24, 2016,

Rodney Carlisle, Scientific American Inventions and Discoveries: All the Milestones in Ingenuity—from the Discovery of Fire to the Invention of the Microwave Oven, Hoboken, N.J.: John Wiley & Sons, 2004, p. 143.

Gerhard Dohrn-van Rossum, History of the Hour: Clocks and Modern Temporal Orders, Chicago: University of Chicago Press, 1996, p. 122.

Michael Farris, From Tyndale to Madison: How the Death of an English Martyr Led to the American Bill of Rights, B&H Publishing Group, 2007, p. 37.

Cecilia A. Hatt, ed., English Works of John Fisher, Bishop of Rochester: Sermons and Other Writings, 1520 to 1535, Oxford: Oxford University Press, 2002, p. 57.

Samuel Pepys, “Saturday 13 May 1665,” The Diary of Samuel Pepys, 1665,

“Desiderius Erasmus,” Wikipedia, last modified on May 15, 2016, retrieved May 17, 2016,

“Johann Cochlaeus,” Wikipedia, last modified on February 2, 2016, retrieved May 17, 2016,

“Pandora,” Wikipedia, last modified on May 9, 2016, retrieved May 17, 2016,

“Peter Henlein,” Wikipedia, last modified on May 10, 2016, retrieved May 17, 2016,

“Tyndale Bible,” Wikipedia, last modified on May 4, 2016, retrieved May 17, 2016,

“William Tyndale,” Wikipedia, last modified on May 6, 2016, retrieved May 17, 2016,


The Sting


Some years ago, I was flying from Albany to L.A. and, thanks to a blizzard, I had missed my connection in Chicago. As a result, I had to go standby on the next flight out which, as you likely know, is one of the most unpleasant things you can do. I was literally the last one shoehorned into the Sardine Express and I found the last remaining empty seat (a middle, natch). As I opened the overhead bin to stow my carry-on, I noticed that someone had put a large origami swan right in the middle of the bin. “How rude,” I thought. “Taking up an entire bin on a fully packed flight with a damn paper swan.” As I raised my bag to stuff it into the bin, a rather large and imposing man—and I guess if you’re going to do this sort of thing, you had better be physically large and imposing—stood up and offered to store my bag for me without damaging his swan. Fine.

Origami, the Japanese art of paper-folding (it also has a long Chinese tradition), dates to sometime after the 6th century A.D. Initially a feature of only special celebrations—largely due to the high cost of paper at the time—it didn’t become more or less mainstream until somewhere around the 17th century. The earliest reference is a 1680 poem by Ihara Saikaku about origami butterflies.

Beyond origami figures, insects and paper do share a close relationship.

One of civilization’s oldest sciences is in fact entomology, the study of insects. That’s not surprising, really; no matter where you live, there are lots of bugs around. For much of history, it wasn’t so much a rigorously pursued science as a body of observations used to aid agriculture, pest control, and beekeeping. Indeed, the earliest insects depicted by humans—in rock paintings dating from 13,000 B.C.—are bees.

The first documented case of forensic entomology—the science of using insects in crime-solving—was recorded by Song Ci (1186–1249) in a “true crime” book published in China in 1247 called Xi Yuan Ji Lu (Collected Cases of Injustice Rectified or Washing Away of Wrongs—or, perhaps, CSI: Shanghai). In one case, from 1235, a peasant was found murdered, stabbed with a sickle. But whose sickle? The investigators realized that blowflies are attracted to invisible remnants of blood and tissue—which led them to the culprit, who then broke down and confessed in one of those mea culpa moments we all know from episodes of Columbo.

(Getting back to butterflies, one of the earliest butterfly experts was English entomologist Lady Eleanor Glanville [1654–1709]. She not only amassed a large collection of British butterflies, but also inherited a substantial fortune, which her husband and son were eager to get their hands on. After her death, her will—which bequeathed her fortune outside the family—was successfully contested, her butterfly obsession cited as evidence that she was not of sound mind.)

Butterflies notwithstanding, one of the big problems impeding progress in entomology was actually a very small problem: most insects are really tiny, which makes it hard to get a good look at them. For most of us, that’s just as well, but for entomologists, it was a challenge. Enter the microscope, a physicist named Robert Hooke, and a 1665 bestseller called Micrographia, the first attempt to observe insects under a microscope and then draw them. It’s really an extraordinary book, and the illustrations are quite stunning. Hooke did have one problem, though: how do you get an ant to stand still long enough to draw it? He didn’t want to kill it, as that would distort it, so what did he do? Exactly what I would do:

Having insnar’d several of these [ants] into a small Box, I made choice of the tallest grown among them, and separating it from the rest, I gave it a Gill of Brandy, or Spirit of Wine, which after a while e’en knock’d him down dead drunk, so that he became moveless [methinks the ant wasn’t drinking alone —Ed.], though at first putting in he struggled for a pretty while very much, till at last, certain bubbles issuing out of its mouth, it ceased to move (Hooke, 1665).

By the way, a “gill” of brandy is about a quarter of a pint! That ant is going to have a problem and start seeing bugs. But don’t call Insect PETA on Hooke: the ant eventually woke up and scuttled off, seemingly none the worse for wear, but probably looking for the medicine chest. Who says scientific progress is without sacrifice?

It was a prominent entomologist, by the way, who inadvertently helped solve a problem the printing industry was having.

After the invention of printing in the 1450s, as print volumes went up, so did the demand for paper. For the first few centuries of printing, paper was made from cotton and linen rags. It was an effective recycling stream: people wore out their clothes and gave them to the ragpickers, who conveyed them to papermakers, who in turn produced paper. Yesterday’s soiled knickers became tomorrow’s newspapers. (Insert own joke here.) And paper wasn’t just used for printed materials; soldiers used to wrap powder and musket balls in paper, for example.

By the 1700s, there was a dire paper shortage. England even passed a law stipulating that bodies could be buried only in wool or other animal fiber unsuitable for papermaking. Still, an alternate source of paper pulp was desperately needed.

Enter René Antoine Ferchault de Réaumur (1683–1757), a French scientist who, among many other things, invented the thermometer and temperature scale named for him (it was the first to use 0° as the freezing point of water). He studied all manner of creepy crawlies, and showed that the old wives’ tale about crustaceans regrowing lost limbs was in fact true. He also wrote about the possibility of using spiders to produce silk, which intrigued the emperor of China. In 1719, Réaumur was observing North American yellowjackets, relatives of the insects known as paper wasps. In a treatise he presented to the French Royal Academy, he wrote:

“The American wasps form very fine paper, like ours; they extract the fibres of common wood of the countries where they live. They teach us that paper can be made from the fibres of plants without the use of rags and linen and seem to invite us to try whether we cannot make fine and good paper from the use of certain woods” (via Hunter, 1978).

Réaumur never followed through on his own advice (which, later in life, he regretted), and while it took a while for this idea to percolate, one percolator was Jacob Christian Schäffer (1718–1790), a German botanist, mycologist (one who studies fungi), ornithologist, and entomologist. He also invented various prisms and lenses, and even a primitive washing machine. Between 1765 and 1771, Schäffer published a treatise with the catchy name Versuche und Muster, ohne alle Lumpen oder doch mit einem geringen Zusatze derselben, Papier zu machen (roughly, “Experiments and Samples for Making Paper Without Any Rags, or with Only a Small Addition of Them”), in which he described his experiments in papermaking using various kinds of plant matter.

You would have thought that, given the need for alternative fibers, the inventor of a wood-pulping machine would essentially be granted a license to print money. However, that turned out not to be the case.

Around the beginning of the 19th century, a British papermaker named Matthias Koops began experimenting with papermaking using such materials as straw, hay, and wood pulp. He was even granted two 1801 patents for pulping machinery, and launched the Straw Paper Manufacturing Company, a pioneering industrial paper mill. Alas, Koops was still reeling from an earlier bankruptcy, and his various creditors—seeking to, ahem, re-Koop their investment—seized his equipment. The company was sold at auction, and Koops and his patents disappeared without a trace, save for a book he wrote detailing his experiments. (Despite rumors, it was not printed on his own paper.)

Fast forward half a century or so to a disciple of Réaumur’s named Friedrich Gottlob Keller (1816–1895). As a youth, Keller read Réaumur’s account of wasps and papermaking, and, being an inveterate tinkerer, in 1841, at the age of 25, began to build a wood-pulping machine. He was unable to interest the German government in helping finance further development of it, so he sold his invention to a papermaker named Heinrich Voelter for a pittance, and a patent was later issued jointly to both Keller and Voelter. Around 1848, Voelter began manufacturing Keller’s wood-pulping machine in quantity, but when the patent was up for renewal in 1852, Keller couldn’t afford his share of the renewal fee. The patent ended up solely in Voelter’s name; Voelter made quite a bundle, and Keller was left penniless. Eventually, after wood pulp became the standard, European papermakers, recognizing the part Keller had played in the invention of wood pulping, took pity and financed his meager retirement.

Keller wasn’t the only acolyte of Réaumur who had a tough time of it. Keller was working on his wood-pulping machine at virtually the same time as Canadian inventor Charles Fenerty (1821–1892), who, by 1844, had perfected a wood-pulping system. Based in Halifax, Nova Scotia, Fenerty pitched his process to the Acadian Recorder, a top newspaper at the time, even writing them a letter on his home-made paper. Alas, they, too, were not interested. Discouraged, he never pursued the idea or patented it, and took up a variety of other vocations, including that of poet.

Since wood pulp was inspired by wasps, perhaps it’s fitting that many of the earliest inventors of wood-based papermaking ended up getting stung.



Robert Hooke, Micrographia, “Observ. xlix. Of an Ant or Pismire,” 1665,

Dard Hunter, Papermaking: The History and Technique of an Ancient Craft, Courier Corporation, 1978, pp. 314–315.

John H. Lienhard, “Of Wasps Making Paper,” Engines of Our Ingenuity, University of Houston,

“Lady Eleanor and her elusive butterfly,” Beyond Pharmacy Blog, Pharmaceutical Journal, September 19, 2012,

“Collected Cases of Injustice Rectified,” Wikipedia, last modified on March 20, 2016, retrieved April 25, 2016,

“Entomology,” Wikipedia, last modified on March 18, 2016, retrieved April 25, 2016,

“Charles Fenerty,” Wikipedia, last modified on March 29, 2016, retrieved May 3, 2016,

“History of Origami,” Wikipedia, last modified on April 11, 2016, retrieved May 4, 2016,

“Friedrich Gottlob Keller,” Wikipedia, last modified on April 22, 2016, retrieved May 3, 2016,

“Matthias Koops,” Wikipedia, last modified on August 11, 2015, retrieved May 3, 2016,

“René Antoine Ferchault de Réaumur,” Wikipedia, last modified on February 6, 2016, retrieved April 25, 2016,

Fan Club


I confess that I am not a big fan of emoji, so my first thought when I read that the Oxford English Dictionary’s 2015 Word of the Year was an emoji symbol was, “Civilization continues its downward spiral.” The actual “Word” of the Year that wasn’t actually a word was “Face with Tears of Joy,” and there is no way on Earth I am reproducing it here.

It’s tempting (at least for me) to think that the word “emoji” is related to “emoticon,” the combination of letters, numbers, and punctuation to form smiley and frowny faces, and other little pre-emoji ways of expressing oneself without using actual words. (Japanese emoticons are called “kaomoji.”) Admittedly, there is one emoticon I have used on occasion: I8-#)““’. I came across it in a clue in an emoticon-themed New York Times crossword puzzle back in 1995 (puzzle by Dean Niles): it’s a Groucho Marx emoticon. Other historical, fictional, or otherwise famous characters included in the puzzle were Abraham Lincoln ==} : ‡]] , Colonel Klink [g-}] , and Charlie Chaplin cI[ : -= )I .

Anyway, back to emoji (if we must). The word is unrelated to emoticon, and in fact comes from the Japanese 絵文字, or e (“picture”) + moji (“letter, character”). Emoji were invented in the late 1990s by Shigetaka Kurita, an employee at Japan’s Nippon Telegraph and Telephone (NTT) Docomo, a company that made pagers which, at the time, were all the rage among Japanese teenagers. However, NTT was trying to make its mobile devices more business-friendly—which turned off the teens. So they made an about-face (as it were) and decided to find a reason for teens to stop fleeing to rival devices. That reason, apparently, was emoji.

Emoji started appearing on Japanese mobile devices in the late 1990s, but were slow to make it across the Pacific. As Oxford implies, the use of emoji hit critical mass among English users in 2015.

Sometimes emoji are obvious, but I confess I often have no idea what 90 percent of emoji symbols mean. Someone once showed me a text conversation that consisted of 99-percent emoji and all I could think was, “Is there a secret decoder ring for this?”

But then I was reminded (at least for the sake of this essay) that in Victorian England, similarly cryptic conversations were carried on using hand fans. Yes, hand fans. In fact, there was said to be a whole “language of the fan,” and, as with texting abbreviations and (I would imagine) emoji, dictionaries were needed to define it.

When one thinks of the dictionary (assuming anyone thinks of the dictionary), one usually thinks of Noah Webster, but my mind more readily goes to Dr. Samuel Johnson, if only because of this classic exchange from the British TV comedy Blackadder the Third (1987):

Dr. Johnson (Robbie Coltrane): Here it is, sir: the very cornerstone of English scholarship. This book, sir, contains every word in our beloved language.

Edmund (Rowan Atkinson): Every single one, sir?

Dr. Johnson (confidently): Every single word, sir!

Edmund (to Prince): Oh, well, in that case, sir, I hope you will not object if I also offer the Doctor my most enthusiastic contrafribularities.

Dr. Johnson: What?

Edmund: “Contrafribularities,” sir? It is a common word down our way…

Dr. Johnson: Damn! (writes in the book)

Edmund: Oh, I’m sorry, sir. I’m anispeptic, frasmotic, even compunctuous to have caused you such pericombobulation.

Dr. Johnson: What? What? WHAT?

We continue. The need for a good dictionary was driven largely by the invention of printing. When the printed book was first produced in the latter half of the 15th century, only one percent of the population of Europe was literate. As a result, there was little demand for a dictionary: hardly anyone read or wrote, apart from academics and the clergy, so hardly anyone needed anything defined. As I’ve mentioned in this space before, the earliest dictionaries were foreign-language dictionaries, containing little more than lists of English words translated into French, Italian, and other languages.

By the mid-1500s, as the printed book proliferated, literacy in Europe had reached 50 percent and kept climbing. Then there were newspapers, chapbooks, pamphlets—all manner of printed materials were readily available. The more people became capable of reading, the more often they needed to look up words they didn’t know. Dozens of dictionaries started to appear, but more often than not they were just lexicons of antiquated, foreign, obscure, or what we would call “five-dollar” words. “The early lexicographers failed to give sufficient sense of [the English] language as it appeared in use. All proceeded by plagiarizing their predecessors,” according to historian Henry Hitchings (Hitchings, 2005).

In 1746, a group of London booksellers contacted Johnson and offered him 1,500 guineas (about $315,000 in today’s dollars) to write a dictionary. It took him nine years, and amazingly he accomplished it virtually single-handedly. (By the way, the house where he compiled the dictionary, 17 Gough Square in London, is now a museum, located right around the corner from the excellent Ye Olde Cheshire Cheese pub. I highly recommend both of these attractions—not necessarily in that order—should one find oneself in London.)

Anyway, the first edition of Johnson’s Dictionary of the English Language, published in 1755, contained 42,773 words and their definitions. The truly revolutionary thing about Johnson’s dictionary was that he was the first lexicographer to follow the definition of a word with a quotation showing the word used in context, usually via literary quotations from the likes of Shakespeare and Milton. In his definitions, he sometimes editorialized, often at the expense of the Scottish. His infamous definition of oats was, “a grain, which in England is generally given to horses, but in Scotland supports the people.” On the other hand, he defined “lexicographer” as “a writer of dictionaries; a harmless drudge that busies himself in tracing the original and detailing the signification of words.”

One might wonder why the booksellers had approached Johnson. He was extraordinarily well-read and knowledgeable (his family owned a bookstore) and he had been making a bit of a name for himself as a poet, playwright, and biographer. His first steady writing gig was for The Gentleman’s Magazine. Founded in 1731 by London printer Edward Cave, The Gentleman’s Magazine was a monthly general interest magazine, and was the first publication to use the word “magazine,” which Cave borrowed from the French magasin, or “storehouse.” The first true magazine, although it didn’t call itself that, was Erbauliche Monaths Unterredungen (“Edifying Monthly Discussions”), a German philosophy periodical that was published from 1663 to 1668.

Johnson was a regular contributor to The Gentleman’s Magazine, and in a 1738 story about British Parliamentary debates, Johnson coined the term “Columbia” to poetically refer to America. It stuck.

As for The Gentleman’s Magazine, it ran for almost 200 years, finally ceasing publication in 1922. If you had been reading it in 1740, well, you’d be really old now, but you might have seen an ad for “The New Fashioned Speaking FAN!” You might see a similar headline in an electronics publication today, where it would probably refer to some kind of electronic fan that actually spoke to you as it cooled a room. Back in the 18th century, though, what it referred to was a kind of sign language in which the motions of a hand fan could be translated into letters of the alphabet. I kid you not. It could be a little complicated:

The alphabet, with the exception of J, was split into five sections, each of which corresponded to one of the following movements:

  • Moving the fan with the left hand to the left arm
  • Moving the fan with the right hand to the left arm
  • Placing the fan against the bosom
  • Raising the fan to the mouth
  • Raising the fan to the forehead

In order to signal a letter, two movements were necessary. The first corresponded to one of the five alphabet groups, and the second gave the letter’s position within the group. For example, to signal “D”, one would use movement 1 (first section of the alphabet), followed by movement 4 (fourth letter in that section of the alphabet).
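For the programming-minded, the two-movement scheme above is just a base-5 encoding of a 25-letter alphabet. Here is a playful sketch of it in Python—purely my own reconstruction from the description above, with the five movement names taken from the bulleted list (and the assumption that the 25 letters split evenly into five groups of five):

```python
# A sketch of the two-movement fan alphabet: 25 letters (A-Z minus J),
# split into five groups of five. A letter is signalled by two movements:
# the first names its group, the second its position within the group.

ALPHABET = "ABCDEFGHIKLMNOPQRSTUVWXYZ"  # no J

MOVEMENTS = [
    "fan with left hand to left arm",
    "fan with right hand to left arm",
    "fan against the bosom",
    "fan raised to the mouth",
    "fan raised to the forehead",
]

def encode(letter: str) -> tuple[str, str]:
    """Return the pair of movements that signals a single letter."""
    i = ALPHABET.index(letter.upper())
    group, position = divmod(i, 5)  # group -> 1st movement, position -> 2nd
    return (MOVEMENTS[group], MOVEMENTS[position])

# "D" is the fourth letter of the first group, so:
# encode("D") -> ("fan with left hand to left arm", "fan raised to the mouth")
```

Spelling out even a short sentence this way, two movements per letter, makes it obvious why the phrase-based shortcuts described next would have caught on.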

Over time, an alternative to this laborious letter-by-letter approach developed, in which specific fan movements denoted certain phrases. Some examples:

  • To hold the fan with the right hand in front of the face. “Follow me.”
  • To hold it to the left ear. “I want you to leave me alone.”
  • To move it with the left hand. “They are watching us.”
  • To change it to the right hand. “You are imprudent.”

The language of fans wasn’t always subtle:

  • To hold the fan to the lips. “Kiss me.”
  • To throw the fan. “I hate you.”

And I certainly can’t think of hand fans without thinking of the flirting scene in Woody Allen’s Love and Death.

Now, whether this “language of the fan” was ever actually used is open to debate. The idea apparently derived from what is believed to be a satirical letter to The Spectator in 1711, and such a language is said to be a 19th-century marketing invention by fan sellers. (Indeed, there are no references to it outside of current fan merchants’ websites.)

Still, a secret language of fans makes about as much sense as emoji. Hmm. Is there an emoji figure with a fan over its face?

If you enjoy these historical digressions, 21 of my Digital Nirvana posts from last year have been collected into a new book called Printing Links, available in paperback from Amazon.



“Ladies Decorative Fans: Bootcamps for Coquettes,” Jane Austen’s World, July 25, 2009,

Jeff Blagdon, “How emoji conquered the world,” The Verge, March 4, 2013,

“The language of the hand fan,” TheHandFan,

“Johnson’s Dictionary—Oats,” British Library,

Henry Hitchings, Defining the World: The Extraordinary Story of Dr. Johnson’s Dictionary, New York: Farrar, Straus and Giroux, 2005, p. 55.

“Oxford Dictionaries Word of the Year 2015 is…,” Oxford Dictionaries Blog, November 15, 2015,

“A Dictionary of the English Language,” Wikipedia, last modified on March 5, 2016, retrieved March 28, 2016,

“Emoji,” Wikipedia, last modified on March 29, 2016, retrieved March 29, 2016,

“European hand fans in the 18th century,” Wikipedia, last modified on September 6, 2015, retrieved March 28, 2016,

“The Gentleman’s Magazine,” Wikipedia, last modified on March 4, 2016, retrieved March 28, 2016,

“Samuel Johnson,” Wikipedia, last modified on March 8, 2016, retrieved March 28, 2016,