Author Archives: Richard Romano

About Richard Romano

Richard Romano has been involved in the graphic arts since before birth. He is a writer and analyst for the graphic communications industry and a regular contributor to WhatTheyThink, for which he oversees the Wide-Format and Production Inkjet special topic areas. For eight years, he was the senior analyst for The Industry Measure (formerly TrendWatch Graphic Arts), until its demise in March 2008. He has also worked on consulting and market research projects for many other organizations and companies and has contributed to such magazines as Graphic Arts Monthly, GATFWorld, Printing News, and HOW; is the former executive editor of CrossMedia magazine; and is the former managing editor of Micro Publishing News and Digital Imaging magazines. As if that weren’t enough, he is also the author or coauthor of more than half a dozen books, the last three with WhatTheyThink’s Dr. Joe Webb, including Disrupting the Future, which has been translated into Japanese and Portuguese. Their most recent title is "The Home Office That Works! Make Working At Home a Success—A Guide for Entrepreneurs and Telecommuters." He has vague recollections of having graduated from Syracuse University’s Newhouse School of Public Communications in 1989, and has a 1994 certificate in Multimedia Production from New York University. He is currently in the final throes of a Masters program at the University at Buffalo, which he really does need to wrap up at some point. He lives in Saratoga Springs, NY.

The Secret of Success Is…


I’ve long been a fan of English author Thomas Hardy, and upon learning that a new adaptation of his novel Far From the Madding Crowd was headed for movie theaters, I decided to go back to the source material. While Wikipediaing to reacquaint myself with Mr. Hardy, I came across the line:

The term “cliffhanger” is considered to have originated with the serialised version of this story [A Pair of Blue Eyes] (which was published in Tinsley’s Magazine between September 1872 and July 1873) in which Henry Knight, one of the protagonists, is left literally hanging off a cliff.

The idea of the cliffhanger as a plot device dates back as far as The Odyssey, but has become a familiar device in novels, movies, and television. I can still remember “Who Shot J.R.?”/Dallas mania back in 1980. (There was also a 1979 TV series called Cliffhangers that featured three different serialized stories per episode—one a mystery, one a sci-fi western, and one a vampire story. It was very well done but, like most shows I have ever liked, it was cancelled after only 10 episodes. So it goes.)

What’s interesting, though, is that while I can recall the whole “summer of ‘Who Shot J.R.?’,” I have no recollection of who actually did shoot him, even though I am almost positive I watched it (there weren’t many choices back then). I also remember the show Cliffhangers because it was cancelled before the stories could be wrapped up.

But then maybe that’s not that unusual. There is a phenomenon called “the Zeigarnik effect,” described in the 1920s by Russian psychologist and psychiatrist Bluma Zeigarnik, which states that “people remember uncompleted or interrupted tasks better than completed tasks.”

Zeigarnik began this line of inquiry when she stumbled upon the observation that waiters had better recollections of unpaid orders—I know many people in the service industry and do not find this surprising—but as soon as customers paid, they had no recollection of the orders. One of the conclusions of Zeigarnik’s subsequent research into this was that “students who suspend their study, during which they do unrelated activities (such as studying unrelated subjects or playing games), will remember material better than students who complete study sessions without a break.”

Essentially, the mind hates unfinished business; it’s a source of tension that only the completion of a task can resolve.

Zeigarnik’s research is not without its dissenters; attempts to reproduce her findings have not always been successful, and it seems that the effect is dependent upon other factors, like how important the interrupted task is to the subject. (Unpaid orders are obviously of great import to waiters.)

The Zeigarnik effect has been applied to marketing (big surprise), specifically email marketing, and if you have ever opened an email or read a blogpost because the subject line contained an ellipsis, you have experienced the Zeigarnik effect.

When you have a subject line that finishes with a period, you are basically encouraging the recipient’s mind to think of the message as a completed task. But without the end punctuation, the subject line is perceived as unfinished, and the brain will not be happy with the idea of moving on without finishing the sentence.

The caveat to this is, again, what Zeigarnik’s critics found: the initial subject itself has to be of sufficient importance for our minds to want to care about seeing its resolution. If you don’t have a cat, the unfinished phrase “The best kitty litter is…” will mean nothing to you.

A Saucerful of Sucrets


A couple of years ago, I was getting a new passport photo taken in my local CVS, and it took several goes to get a usable one, since the overhead lights kept reflecting off my glasses, and the clerk said that the State Department doesn’t accept that. (We couldn’t quite get rid of all the glare, but it turned out that no one has ever cared.) It didn’t help that I was vaguely sick at the time and had a cough that made the interminable photo session even more interminable.

I very rarely buy any kind of cold medication as I very rarely get sick and anything I buy expires before I ever get to use it again. But, after we had a photo we deemed usable, I trundled over to the Cough and Cold aisle and picked up a tin of Sucrets for the first time in something like a decade. Suddenly feeling like I was turning into my grandmother, I was shocked to discover that Sucrets are no longer sold in tins but in plastic containers. When did that happen?

The cough suppressant dates back to 1000 B.C. and ancient Egypt, where honey and various herbs and spices were used to suppress coughs and soothe sore throats, but it wasn’t until the 19th century and the patent medicine explosion that people started developing cough drops in earnest. In a perfectly deadpan sentence, Wikipedia says:

In the 19th century, physicians discovered morphine and heroin, which suppress coughing at its source—the brain.

Great—nothing like having a Smith Brother on your back.

Two pioneers in the field of cough suppressants (where few are chosen and fewer still are called) were, indeed, William Wallace Smith and Andrew Smith—aka the Smith Brothers. They were the sons of James Smith, a Scottish immigrant who opened an ice cream shop in Poughkeepsie, N.Y. As the company lore has it (so make of it what you will), James bought a cough drop recipe from a wandering peddler named Sly Hawkins—and if that isn’t the name of a wandering 19th-century peddler then I don’t know what is—and began to sell them in the shop. The cough drops were originally sold from a jar at the ice cream counter, but when William and Andrew took over the business, they started selling them in boxes. For purposes of brand protection, they printed their portraits on the boxes, thus becoming the most famous bearded men in the country until ZZ Top. To stress the fact that their hirsute countenances were a company trademark, they put the words “trade” and “mark” on the boxes—the word “trade” appeared under William’s face and “mark” under Andrew’s. And thus for years the two Smith Brothers were often mistakenly called Trade Smith and Mark Smith. They did not object—and in fact it became part of their branding.

When the company was bought out in the 1970s, the new owners phased out the bearded portraits—and eventually the name Smith Brothers. However, the brand was relaunched in 2011 and Trade and Mark are back and more bearded than ever.

Cough drop makers stopped using heroin (I mean in the cough drops, not recreationally) before the Smiths came along, which is probably a good thing, if for no other reason than it would have resulted in some very strange Velvet Underground songs.

One prominent ingredient in modern cough suppressants is eucalyptus oil, specifically that derived from the species Eucalyptus globulus, E. kochii, and E. polybractea, the latter two of which have the highest concentration of cineole, an organic compound that serves as the active ingredient in many cough drops.

Eucalyptus oil is used in a wide range of applications, from cough suppressants, to insect repellants, to fragrances for soaps and lotions, to—I kid you not—fuel additives. In fact, eucalyptus oil could be used as a fuel in its own right, albeit not economically, though I bet it would make car exhausts smell a whole lot better. Oddly enough, eucalyptus trees present a fire hazard due to the flammability of eucalyptus oil—trees have been known to literally explode. Who knew koala bears led such lives of danger?

There are more than 700 species of Eucalyptus trees, virtually all of which are native to Australia, with a few species ranging as far as New Guinea and Indonesia. Eucalyptus was introduced to Europe following Captain James Cook’s expedition Down Under in 1770. The botanist on the expedition was a man named Sir Joseph Banks (1743–1820). His participation in Cook’s three-year mission (1768–1771)—which included ports of call at Brazil, Tahiti, New Zealand, and Australia—brought him instant fame upon his return home and opened some scientific doors. He served as president of the Royal Society for 41 years and made the Royal Botanic Gardens, Kew, the crown jewel of the botanical world. Banks also appeared as a character in the novel and movie (the 1935 version) Mutiny on the Bounty.

By the way, Cook, Banks, et al., landed in Australia at a place they named Botany Bay. Wrote Cook afterward,

The great quantity of plants Mr. Banks and Dr. Solander found in this place occasioned my giving it the name of Botanist Botany Bay. (“Botanist” struck through in the original)

In 1790, Banks was introduced to Francis (né Franz) Bauer (1758–1840), a talented artist who specialized in botanical illustrations. Banks was impressed with Bauer’s art, pulled some strings, and got Bauer a gig as botanical illustrator for the Royal Botanic Gardens. It was a gig that would last for the rest of Bauer’s life. He created beautiful drawings of flowers and plants, often at the microscopic level, many of which he turned into painstakingly hand-colored lithographs. Bauer himself later became a member of the Royal Society and was appointed “Botanick Painter to His Majesty King George III.”

In 1827, Bauer received a visitor from France, Joseph Nicéphore Niépce (1765–1833). Niépce was an inventor who had brought along some specimens of a project he was working on, specimens and a project that caught Bauer’s fancy. Bauer encouraged Niépce to present his work to the Royal Society. However, Niépce refused to share any technical details about what he was up to, and NDAs were apparently not in vogue back then, so the Royal Society pretty much told him to beat cheeks. Niépce went back to France, but left the specimens and his presentation to the Royal Society with Bauer, where they remained until Bauer’s death. They then drifted from place to place and eventually vanished, not to be tracked down until the middle of the 20th century.

Tracked down, because one of these specimens was the first photograph ever taken.

Called “View from the Window at Le Gras,” it was taken by Niépce in 1826 or early 1827 using a process that Niépce had invented called heliography, one of the first iterations of what would eventually become what we know as photography. Niépce had been mucking about with lithographic printmaking and grew frustrated that he was unable to draw anything by hand. So he began looking into other ways of creating images. He hit upon a process in which Bitumen of Judea (a naturally occurring type of asphalt and not an Old Testament prophet) was coated on a glass or metal plate. When exposed to light, it hardened, and when the plate was rinsed with oil of lavender, the hardened areas—those exposed to light—remained. And you had, essentially, a proto-photographic image.

The drawback to the process was that it required an extraordinarily long exposure time. Later analysts of “View from the Window at Le Gras” estimate that it took eight hours to capture, and some researchers have even found evidence that the exposure lasted days. Not exactly an Instamatic.

Even as photographic processes evolved, exposure times were problematic, a problem that resulted in some of the funniest—and creepiest—images ever recorded on film.

By the 1850s, photography was starting to become all the rage, and one of the killer apps of the new process was—then as now—baby pictures. Parents up and down England’s socio-economic ladder were eager to immortalize their children on the new medium of film. Unfortunately—and any parent reading this can sympathize—the combination of squirmy children and long exposure times was not exactly conducive to ready-for-framing photographs. Even though exposures had been trimmed down to only half a minute or so, 30 seconds is still an eternity for a child to sit unmoving, and any movement would result in an indistinguishable blur (like the photographs I take even with today’s high-speed digital cameras). What to do?

Well, to help keep Baby still, you put Mommy in close proximity to Baby, or even Baby on top of Mommy, but you disguise Mommy as furniture. (I swear I am not making this up.) Hence, there emerged a whole genre of Victorian-era photographs called Hidden Mother Photographs—baby pictures that included conspicuously human-shaped lumps or adult figures crouching not entirely invisibly behind chairs. Some of them are quite hilarious—and some will haunt your dreams. A representative sample can be found in the sources below.

Photography, and perhaps even children, have gotten better since then, but even today when we pose for photos, it seems an eternity to wait for the shutter to click. Especially if you have a cough.



Bella Bathurst, “The lady vanishes: Victorian photography’s hidden mothers,” The Guardian, December 2, 2013.

“Hidden Mothers: Spooky Photographs of Victorian Babies Held by Their Mothers,” Bored Panda.

“The First Photograph,” The Harry Ransom Center at the University of Texas at Austin, accessed February 16, 2015.

“Joseph Banks,” Wikipedia, last modified February 5, 2015, accessed February 16, 2015.

“Franz Bauer,” Wikipedia, last modified August 4, 2014, accessed February 16, 2015.

“Eucalyptus oil,” Wikipedia, last modified January 13, 2015, accessed February 16, 2015.

“Smith Brothers,” Wikipedia, last modified December 21, 2014, accessed February 16, 2015.

“View from the Window at Le Gras,” Wikipedia, last modified January 20, 2015, accessed February 16, 2015.


Jive Talkin’


Whilst on an early morning flight recently, I ordered a cup of coffee and, the coffee being rather hot, I said to the flight attendant, “Do you have a zarf?” I got a blank look, as it appears few people know that “zarf” is the word for those cardboard sleeves around coffee cups that keep you from burning your hand (Forsyth, 2012). The word comes from the Arabic ظرف, zarf, meaning “container or envelope.” It has its origins in 13th-century Turkey P.S. (Pre-Starbucks), where coffee was consumed in an elaborate ritual using handle-less cups. The zarf was a cover used to protect the cup from damage and the hand from getting burned. Zarfs (also zarves) were decorative, and could be adorned with silver, gold, copper, brass, or other metals, as well as wood, ivory, bone, and other materials—or advertising, in the case of today’s zarfs. (It is also a legal Scrabble word, worth 16 points, more if you land on a double/triple letter/word space.)
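
If you want to check my Scrabble math, the claim is easy to verify against the standard English tile values (Z = 10, A and R = 1 each, F = 4). Here is a quick, purely illustrative Python sketch; the values are hard-coded from the standard English tile set, and the scoring function is my own toy, not any official scorer:

```python
# Verify the claim that "zarf" scores 16 points in Scrabble, using the
# standard English-edition tile values (before any bonus squares).
TILE_VALUES = {
    **dict.fromkeys("aeilnorstu", 1), **dict.fromkeys("dg", 2),
    **dict.fromkeys("bcmp", 3), **dict.fromkeys("fhvwy", 4),
    "k": 5, **dict.fromkeys("jx", 8), **dict.fromkeys("qz", 10),
}

def scrabble_score(word: str) -> int:
    """Sum the face value of each tile in the word."""
    return sum(TILE_VALUES[ch] for ch in word.lower())

print(scrabble_score("zarf"))  # 10 + 1 + 1 + 4 = 16
```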

At any rate, it’s hard to explain word etymologies in a loud airplane cabin at cruising altitude, and I suddenly had an image of Barbara Billingsley standing up à la Airplane! and saying, “Oh, stewardess? I speak jive.”

The word stewardess has gone out of fashion, which I find regrettable only because I like the mock-plural stewardii coined by Thomas Pynchon in the novel Inherent Vice (I haven’t seen the movie yet).

At any rate, back to jive. At one time, you could actually learn to speak jive. Cab Calloway was a famous jazz singer and bandleader, leading one of the country’s most popular big bands in the 1930s and 40s. His signature hit was 1931’s “Minnie the Moocher”—with its scatted “hi de hi de hi de hi” refrain—but few people know that Calloway also wrote a dictionary, Cab Calloway’s Hepster’s Dictionary: Language of Jive, in 1939. The idea was to instruct people living outside cities (in Squaresville, baby) how to communicate should they come up to town and encounter a jazzman in his natural habitat. Needless to say, many of the terms relate to music, as well as, uh, other recreational activities to be found there.

Dictionaries are nearly as old as written language. The earliest known to historians date from somewhere around 2300 B.C.E. and were cuneiform tablets created during the Akkadian Empire and consisted of lists of Sumerian-Akkadian words. Indeed, the earliest purpose of dictionaries was to translate words from one language to another. Likewise, the first English dictionaries were simply English translations of Latin, French, or Italian words.

The word “dictionary” itself was coined by John of Garland in 1220 in his book Dictionarius, which was a primer on Latin vocabulary and diction. John of Garland—his exact birth and death dates are unknown—was a philologist and grammarian as well as a prolific author and poet. His works were highly popular in England, particularly after 1476, when William Caxton installed the first printing press in England. Caxton himself wasn’t as much a fan of John of Garland’s works as was his assistant, Wynkyn de Worde (né Jan van Wynkyn). De Worde took over Caxton’s print shop after Caxton’s death in 1491, and it was de Worde who, even during Caxton’s lifetime, sought to improve the quality of printed books. Thus is de Worde commonly thought of as “England’s first typographer.” The reason was aesthetic, true, but also practical: as book printing and the number of print shops started to grow around the turn of the century, improved book quality was an important competitive advantage.

One of the most important results of the establishment of printing in England was the standardization of the English language—or to the extent that English has ever been standardized. In an age before there had been any kind of mass communication, people living in disparate parts of Britain spoke different dialects, some of which could even be considered different languages. Even if you travel around the U.K. today, it can be difficult to understand people the further you get from London.

What the earliest English printers did—Caxton, de Worde, as well as another prolific printer and contemporary of de Worde’s named Richard Pynson—was translate and print books in an English dialect called Chancery Standard. It was the dialect used in London, which was, after all, the political and economic center of England. It was printing more than anything that started to “fix” the English language.

OK, blog participation time. Say the phrase “Ye Olde Curiosity Shoppe” out loud.

Let me guess: you pronounced “Ye” with a y sound, right? If you did, that’s actually not correct. It is pronounced “the,” because the Y in that context isn’t really the letter Y. It’s an Old English letter called a “thorn,” pronounced with a th sound. The thorn was written in various ways before printing (commonly Þ), but as it evolved with other letters, it began to look vaguely like the letter Y. When Caxton started printing, he had to import type from Germany or Italy, which did not include a thorn character, but did include the letter Y. So Caxton fudged it a bit and took to setting the word “the” as Ye (Y with a superscript e). It was, however, always pronounced with a th sound and not with a y sound. (The word “that” was similarly typeset as Yt.) When the first edition of the King James Bible was printed in 1611, it used Ye in various spots. We occasionally see ye used today, usually in a mock-antique way, but pronounced in a way it was never pronounced in antiquity. The only language in which the thorn is used today is Icelandic; those of us who follow the CrossFit Games know that the surname of Icelandic champion athlete Annie Thorisdottir is written as Þórisdóttir in her native language.
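
Caxton’s workaround is easy to mimic, as it happens. Here is a toy Python sketch of my own devising, purely illustrative of the one-letter swap described above and not a model of actual fifteenth-century composing-room practice:

```python
# A toy illustration of Caxton's workaround: his imported type lacked the
# thorn (Þ/þ), so compositors set the visually similar Y in its place.
# Illustrative only; the mapping below is just the substitution described
# in the text above.
THORN_TO_Y = {"þ": "y", "Þ": "Y"}

def set_in_caxtons_type(text: str) -> str:
    """Swap thorns for Y/y, as an early English compositor would."""
    return "".join(THORN_TO_Y.get(ch, ch) for ch in text)

print(set_in_caxtons_type("Þe olde curiosity shoppe"))
# -> "Ye olde curiosity shoppe" (still pronounced "the")
```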

There are actually a bunch of letters that were shed from English over the centuries: wynn, yogh, ash, eth, the long s (aka the f, although it wasn’t really an f, used as an s in certain cases until about the late 18th century), and even the ampersand (&) was a proper letter at one time (it stood for et, the Latin for and).

One of the simultaneously great and terrible things about the English language is that it has always been an organic language. That is, the grammar police to the contrary, there is no central authority determining what is proper English and what isn’t. (This is unlike French and other languages, which do have academies that vet changes and additions to the language.) There are conventions—and I refer you to “Weird Al” Yankovic’s “Word Crimes” for a good compilation—but generally speaking those conventions are wont to change. Sure, the Oxford English Dictionary adds new words to the lexicon every year, but that’s not really official. Kind of like the Weather Channel naming snowstorms, much to the annoyance of the American Meteorological Society.

English changes according to how it is actually used. The terrifying prospect is that Internet-ese or texting shorthand will infiltrate “real” English. Two weeks ago, while I was touring Canon Solutions America’s Customer Experience Center in Boca Raton, they were printing a small pocket lexicon of texting abbreviations. OMG!

Speaking of OMG…did you know that little bit of shorthand for “Oh My God!” was actually coined in 1917? Its first recorded use was in a letter by British Admiral John Arbuthnot Fisher who wrote to Winston Churchill on September 9, 1917:

I hear that a new order of knighthood is on the tapis—O.M.G. (Oh! My God!)—Shower it on the Admiralty! (Fisher, 1919).

The context is a bit involved (it was in the midst of World War I), but suffice to say, spelling out what the abbreviation stands for probably defeats the point of using the abbreviation at all.

If these abbreviations do creep into the language…well, c’est la vie. (See how foreign words and phrases make their way into English?) And that would be OK. (And I could probably do a whole post on the etymology of OK.) Let’s hope, though, that text message abbreviations don’t become the only way that people enjoy classic English literature—back in 2005, British mobile communications network Dot Mobile had announced a plan to “translate” classic works of literature into text messages, for the purpose of helping students study for exams. (“2b? Nt2b? ???” Is that a question?) Dot Mobile went out of business before anything could come of it, though.

English is always evolving and borrowing words from every which way. Which is a good thing, otherwise we would not have lovely and useful words like zarf. And that ain’t jive.



Associated Press, “Get the classics in text messages,” USA Today, November 18, 2005.

Asher Cantrell, “12 Letters that Didn’t Make the Alphabet,” Mental Floss, December 13, 2012.

John Arbuthnot Fisher, Memories, London: Hodder and Stoughton, 1919, p. 78.

Mark Forsyth, The Horologicon: A Day’s Jaunt Through the Lost Words of the English Language, New York: Berkley Books, 2012, p. 55.

“Glossary of jive talk,” Wikipedia, last modified December 27, 2014, accessed February 12, 2015.

“Richard Pynson,” Wikipedia, last modified January 28, 2015, accessed February 12, 2015.

“Thorn,” Wikipedia, last modified January 29, 2015, accessed February 12, 2015.

“Wynkyn de Worde,” Wikipedia, last modified January 28, 2015, accessed February 12, 2015.

“Zarf,” Wikipedia, last modified January 30, 2015, accessed February 12, 2015.

The Nancy Drew


I was watching some early episodes of Seinfeld recently, which were first broadcast in 1991, and what struck me (aside from what a great show it was) was the fairly “archaic” communication technology they (we!) all used at the time. One episode centered around George’s trying to retrieve an answering machine cassette tape (a what?) from a woman he was dating. Another featured George getting annoyed at someone hogging a public pay phone (huh?). And Jerry does a standup bit about how he hates cordless phones because you can’t slam them the way you can corded phones (OK, there are still corded phones…for now). I would imagine that for anyone under 30, watching shows from the 90s is kind of like my generation watching Humphrey Bogart using those candlestick phones from the 1930s and 40s.

A few weeks ago, I was in Barnes & Noble shopping for my niece’s birthday, figuring she was just old enough to start reading Nancy Drew books. (When I was her age, I had been a Hardy Boys boy.) I was talking to the clerk in the children’s book section and she said that there was the original series, but there was also a newly revised and updated series of Nancy Drew books since, she told me, “kids today have no idea what a rotary dial phone or a phone booth is.” Fair point. And if you’re going to engage young readers, it makes sense to make the stories, characters, and settings reasonably contemporary. In fact, the Nancy Drew and Hardy Boys books were often updated over the years.

As everyone likely knows, there was no Carolyn Keene (the bylined author of all the classic Nancy Drew books) or Franklin W. Dixon (the Hardy Boys books). The Hardy Boys were conceived (as it were) in 1926 by Edward Stratemeyer, head of the Stratemeyer Syndicate, which was the first book packager to specialize in children’s books. Stratemeyer developed a number of other kids’ book series, including the Bobbsey Twins and Tom Swift, which were all immensely popular. (Indeed, the “Taser”—the name of the electric stun gun—is actually an acronym for “Thomas A. Swift’s Electric Rifle,” named by the device’s inventor, NASA researcher Jack Cover, after his childhood hero from the Tom Swift books.)

The Stratemeyer Syndicate’s books were launched in 1899 with The Rover Boys, a series that chronicled the hijinks of a trio of adolescents at a military boarding school. The mystery-solving Hardy Boys were launched in 1927, and, noticing that many girls bought the Hardy Boys books, Stratemeyer launched girl sleuth Nancy Drew in 1930. All the books, though credited to a single author, were written by a revolving crew of ghostwriters, and sometimes even by Stratemeyer himself.

The Hardy Boys and Nancy Drew books were updated a few times over the decades to reflect not only changing technology (like the advent of cars, phones, etc.) but also—at the request of Grosset & Dunlap, the publisher—changing cultural attitudes. Specifically, to remove the racial and ethnic stereotypes that permeated the original editions.

Edward Stratemeyer was a prolific author, said to have penned more than 1,300 books in his lifetime. He got his start writing for a magazine called Good News, published by Street & Smith Publications, which specialized in pulp magazines and dime novels. The term “pulp magazines” or “pulp fiction” comes from the cheap wood pulp-based paper used to print the magazines (in contrast to the upmarket “slicks” which were printed on better paper), and given that these publications tended to include stories that were deemed inferior in quality to “literary fiction,” the term “pulp” came to refer to that kind of content—detective stories, murder mysteries, horror tales, science-fiction yarns, and so forth. Dime novels, as the term indicates, were novels that sold for—wait for it—ten cents (although sometimes more as the years wore on), and the term came to encompass all of what we would consider “mass market paperbacks” today.

What was the first dime novel? It can be traced to a frontier tale called Malaeska, the Indian Wife of the White Hunter, written by Ann S. Stephens. Published in 1860, it kicked off Beadle & Adams’ Beadle’s Dime Novels series of books. Stephens was herself a prolific author of dime novels and stories for magazines (she sometimes used the pseudonym Jonathan Slick). Based in Portland, Maine, she was also cofounder, publisher, and editor (with her husband, a printer named Edward Stephens) of Portland Magazine, a monthly collection of literary fiction.

Ann Stephens also contributed to many other publications, including Godey’s Lady’s Book, which—nicknamed “queen of the monthlies”—was the most widely circulated magazine in the pre-Civil War era. Louis Godey launched it in 1830 to capitalize on the then-popularity of what were called “gift books,” or literary annuals; a decade later its circulation had risen to 70,000 and, by 1860, it had soared to 150,000. Though published monthly, it featured poems, stories, engravings, and other items of interest largely to women. The magazine’s longtime editor was Sarah Josepha Hale, who used her success as its editor—she became quite the tastemaker—to champion several women’s causes. She was a primary advocate for the establishment of the holiday of Thanksgiving, and, as a New Englander (born in New Hampshire), she was also involved in the completion of the Bunker Hill Monument.

Today, we may not recognize the name Sarah Josepha Hale, but one of her works endures: she was the author of the nursery rhyme “Mary Had a Little Lamb.”

You can probably see where I’m going with this. “Mary Had a Little Lamb” was of course the first thing that Thomas Edison recorded in 1877 on his brand new phonograph. (Whether Buddy Guy’s 1968 version could be considered a cover of Edison’s original is open to debate—well, OK, not really.)

Although the phonograph would have profound effects on modern music (and remember how well it helped the careers of people like Enrico Caruso), that really wasn’t what Edison was trying to do. Essentially, Edison was trying to invent a telephone answering machine.

Patented on March 7, 1876, Alexander Graham Bell’s invention was the first practical telephone. Being so new, it could stand to use some improvements (heck, it still could), and who better than the “Wizard of Menlo Park” to tweak it? Edison set to work improving the microphone or transmitter, so callers wouldn’t have to bellow into the phone at the top of their lungs to be heard (where have you gone, Thomas Edison, a nation of cellphone users turns its lonely ears to you!). Whilst working on this, it occurred to Edison that when you received a phone call, you actually had to be present to get any message conveyed through it—unlike the telegraph, where messages were written down. So he began to think about how phone messages could be recorded and played back later. Noodling with a telephone diaphragm, he found that sound conveyed through the phone could make indentations on paraffin paper (and later tinfoil) that, when transmitted through a second telephone diaphragm, played the recorded sound—those indentations—back. Not exactly high-fidelity, but the fact that it worked at all surprised even Edison.

The idea of recording sound would eventually lead to the answering machine, although it would take until the advent of magnetic recording media for that to happen; the first working means of recording phone conversations was invented in 1898 by Danish engineer Valdemar Poulsen. Of course, there weren’t an awful lot of phones in 1898, so hopefully Poulsen didn’t feel too bad about not getting a lot of messages. Anyway, that’s whom George Costanza can blame.

It occurs to me that I should really call my niece and see if she liked the Nancy Drew book…. Dang, the call went to voicemail.

Pass the Bubbly


A month or so ago, I was binge-watching on Hulu+ the British comedy panel quiz series Q.I. (Quite Interesting), simultaneously the most fascinating, funniest and, at times, bawdiest TV program on the air (Stephen Fry hosts four British comedians who answer impossible questions about obscure knowledge—right up my alley!). In an episode called “Kitsch,” the subject of Bubble Wrap came up, and I learned that today, January 26, is “Bubble-Wrap Awareness Day” or, alternatively, “Bubble-Wrap Appreciation Day.” (It was started by a radio station in 2001.)

It turns out that there are an awful lot of appreciation days, or even weeks, from the important and worthwhile (Down Syndrome Awareness Week, National Cervical Cancer Prevention Week, World Autism Awareness Day, and various other medical awareness days and weeks) to the frivolous (National Popcorn Day, International Pillow Fight Day, and National Flip Flop Day).

There is actually a 3D Printing Day on December 3, but there does not seem to be any kind of general “Print/Printer Awareness Day.” Perhaps it’s high time we started one—sounds like a job for Two Sides?

We might want to start our appreciation with a 19th-century French printer named Édouard-Léon Scott de Martinville (1817–1879), who kicked off a bunch of things that ultimately led to the establishment of Bubble Wrap Awareness Day. (We could probably go even further back, but we have to start somewhere, and these posts are long enough as it is!)

Scott de Martinville was a Parisian printer and bookseller, and amongst the things he printed were science textbooks. Not content with just printing them, he also read them, and sought to stay up-to-date on many of the latest advances in science. Inspired by the latest developments—as it were—in photography, he had the idea of doing for sound and voice what photography did for light and image: capture them. While proofreading a physics textbook, he came across illustrations of how the human auditory system worked. Thus inspired, Scott de Martinville went on to patent, in 1857, the phonautograph, the earliest known device for recording sound. However, while it did record sound, it was unable to play it back, unlike later inventions. What the phonautograph did was transcribe a visual representation of a particular sound. Not intended for home entertainment, it was primarily meant as a research tool for the investigation of sound waves.

(In 2008, researchers at the Lawrence Berkeley National Laboratory in Berkeley, Calif., did successfully convert a “phonautogram”—“squiggles on paper”—recorded in 1860 to a digital audio file. Not exactly a progressive-rock epic, it was a 10-second clip of a singer, possibly female, crooning “Au clair de la lune.” It is believed to be the earliest known sound recording, preceding Thomas Edison’s “Mary had a little lamb” by almost two decades [Rosen, 2008]).

Scott de Martinville, alas, never really got the credit he deserved, and Edison is known as the inventor of the phonograph, the first device that was capable of both recording and playing back recorded sound, using cylinders wrapped first in tinfoil and later coated in wax.

Sounds like music.

(In 1996, the band They Might Be Giants recorded several songs at the Edison Laboratory on wax cylinders. One was the great “I Can Hear You,” a look at then-modern communication devices that sounded no better than old wax cylinders. And still don’t.)

The (arguably) first “pop music star” owed much of his success to the early phonograph. Even those generally unfamiliar with opera—and who likely couldn’t name a contemporary opera star beyond maybe Pavarotti (I was only ever familiar with Beverly Sills because she once appeared on The Muppet Show)—will likely know the name Enrico Caruso. Born in 1873, he took the opera world by storm, but what made him stand out amongst his peers was his embrace of new technology: the phonograph. Between 1903 and 1920, he made somewhere in the neighborhood of 290 commercially released recordings, which are still available in modern formats today (go to the iTunes Store and you can purchase Caruso’s recordings—interpret “digitally remastered” with a grain of salt). Later generations of opera singers (Mario Lanza, et al.) all cite Caruso as their chief inspiration, and in large part this was due to their being able to listen to him in their own homes. Many of his contemporaries in the opera world dismissed the phonograph, believing the quality to be too poor. They changed their tunes—so to speak—once they found out how much money Caruso was making from commercial recordings.

Caruso died in 1921 at the age of 48, his end no doubt hastened by his fondness for cigarettes. However, he does play a major role in another new technological development, a key one in the history of 20th-century mass media.

On January 12, 1910, part of the performance of Tosca, starring Caruso, was broadcast live from the Metropolitan Opera House in New York City. Broadcast live…on what? Well, it was the first live radio broadcast. It was an experiment conducted by Lee DeForest, one of the inventors of what we know today as “radio” (it was originally called “wireless” but DeForest disliked that term and preferred “radio”). DeForest dubbed himself “The Father of Radio,” and is famous for the quote, “I discovered an Invisible Empire of the Air, intangible, yet solid as granite.” You could say he could see DeForest for the trees. But anyway.

DeForest was embroiled in a variety of patent lawsuits, but he is generally acknowledged as the inventor, in 1906, of the Audion, an electronic amplifying vacuum tube originally developed for use in radio receivers and other types of nascent electronic equipment. The three-electrode “triode” version of the Audion was what essentially spawned the electronic age. Whilst the vacuum tube was eventually superseded by the transistor, virtually all the precursors of today’s electronic devices—TVs, radios, computers, and myriad scientific equipment—used vacuum tubes. The vacuum tube, or rather the delicate electronics it begat, also has a role to play in the final chapter of our story.

In 1957, in a Hawthorne, N.J., garage, two engineers—Alfred W. Fielding and Marc Chavannes—were beavering away on something they hoped would change interior décor as we (or more likely they) know it. They sealed two plastic shower curtains together and attempted to market the result as wallpaper. Alas, the world was not ready for plastic wallpaper. Strike one. (They were probably a few years too early; add a few psychedelic images and it could have decorated the set of Rowan and Martin’s Laugh-In a decade later.) They then tried to sell it as insulation for greenhouses. Nope. Strike two. However, there was one distinguishing characteristic that eventually made the material a success; in between the layers of shower curtain were small pockets of air. Bubbles, you might say.

Timing is everything, isn’t it, and the Sealed Air Corporation, founded by Fielding and Chavannes in 1960, had time on its side. In 1959, IBM had introduced the 1401, the first in its 1400 series of business computers. It was one of the world’s first mass-produced computers and, though it had traded the vacuum tubes of its predecessors for transistors, it was still full of delicate internal components. How to protect them during shipping to customers?

According to the Sealed Air Corporation’s company lore, a marketing expert named Frederick Bowers brought the shower-curtains-with-air-bubbles to IBM and it proved to be the perfect material to protect delicate glass and electronic computer components.

And thus was born Bubble Wrap.

In another blow to the printing industry, however, the subsequent popularity of Bubble Wrap for packaging and shipping displaced newspaper for these purposes; crumpled-up newspaper had been the previous low-cost packaging material. So it goes.

Since then, Bubble Wrap has become almost a cultural icon—if not for packaging then certainly for the popping of the bubbles. (There is even an iPhone app that lets you pop “virtual Bubble Wrap,” for reasons passing understanding.) Bubble Wrap also played an interesting role in a 2013 study of “cuteness” and the extent to which cuteness triggers aggression (Pappas, 2013):

Dyer and her colleagues asked 90 male and female volunteers to come into a psychology laboratory and view a slideshow of cute, funny and neutral animals.

Researchers told the participants that this was a study of motor activity and memory, and then gave the subjects sheets of bubble wrap. The participants were instructed to pop as many or as few bubbles as they wanted, just as long as they were doing something involving motion.

In fact, the researchers really wanted to know if people would respond to cute animals with an outward display of aggression, popping more bubbles, compared with people looking at neutral or funny animals.

That’s exactly what happened. The people watching a cute slideshow popped 120 bubbles, on average, compared with 80 for the funny slideshow and just a hair over 100 for the neutral one.

Quite interesting.

At any rate, Happy Bubble Wrap Awareness Day.



Monte Burke, “Wrap Star,” Forbes, April 28, 2006.

Stephanie Pappas, “‘I Wanna Eat You Up!’ Why We Go Crazy for Cute,” Live Science, January 21, 2013.

John Potter, “Almost as good as Presley: Caruso the pop idol,” The Public Domain Review, February 13, 2012.

Q.I. (Quite Interesting).

Jody Rosen, “Researchers Play Tune Recorded Before Edison,” New York Times, March 27, 2008.

“Enrico Caruso,” Wikipedia, last modified December 28, 2014, accessed January 13, 2015.

“Lee De Forest,” Wikipedia, last modified January 4, 2015, accessed January 13, 2015.

“Édouard-Léon Scott de Martinville,” Wikipedia, last modified September 25, 2014, accessed January 13, 2015.

Bicycle Couriers


Last November, I spent a night in Northampton, Mass., and no trip to Northampton can be complete without a stop at the Northampton Brewery. One of the specialties at the time was called the Juggernaut IPA, which was very good. (Hoppy? Well, it was rather like having one’s sinuses filled with thousands of tiny, hyperactive nano-rabbits.) I got to thinking about the word “juggernaut”—and well, why not?—which has always been one of my favorite words, if only because of its etymology.

The word means, says Oxford, “A huge, powerful, and overwhelming force or institution,” as in “WhatTheyThink is an industry information juggernaut.” The word comes from the Sanskrit Jagannātha, one of the names of Krishna. There’s a temple to Jagannātha and an annual celebration that comprises a procession of immense chariots. It has been said, apocryphally, that the more enthusiastic of Jagannātha’s devotees would hurl themselves in front of these chariots and be crushed beneath their wheels.

The ritual itself was first described to the West in the 14th century in a book called The Travels of Sir John Mandeville. The thing is, no one has ever been able to prove that there ever was anyone named John Mandeville who made these travels. In any case, a lot of the things “John Mandeville” wrote about were actually made up.

Be that as it may, it took a few centuries to percolate, but by the 19th century, the word juggernaut had come into prominent use. Charlotte Brontë used it in Jane Eyre and Robert Louis Stevenson used it to describe his titular Mr. Hyde (Jane Eyre and Mr. Hyde—now there’s a mashup I’d love to see!).

H.G. Wells wrote this passage in his 1895 novel The Wheels of Chance:

Anon Mr. Hoopdriver found himself riding out of the darkness of non-existence, pedalling Ezekiel’s Wheels across the Weald of Surrey, jolting over the hills and smashing villages in his course, while the other man in brown cursed and swore at him and shouted to stop his career. There was the Putney heath-keeper, too, and the man in drab raging at him. He felt an awful fool, a—what was it?—a juggins, ah!—a Juggernaut.

The Wheels of Chance is a far cry from what Wells is typically known for (Victorian science fiction) and is subtitled “A Bicycling Idyll.” It was written during what was considered to be “the golden age of bicycling,” those halcyon days before the invention of the automobile. The bicycle had just recently come onto the market and took Europe like…well, like a juggernaut. (If you’ve ever walked in New York City, bicycle couriers almost regularly run down pedestrians like those ostensible devotees of Jagannātha.)

The bicycle went through a bit of an evolution before it became commercially successful, but the precursor was something called the “Laufmaschine” (“running machine”), invented circa 1817 by Baron Karl von Drais. It has been suggested (more via circumstantial evidence than anything else) that von Drais was motivated to invent the Laufmaschine because of a climate anomaly. 1816 has been called “The Year Without a Summer”: due to a combination of low sunspot activity and a series of major volcanic eruptions, global temperatures plummeted by as much as 1.7°F. Indeed, in Europe, it snowed in the summer of 1816. This caused agricultural disasters, which led to the starvation and slaughtering of horses, and thus—among other things—a transportation crisis, since at the time everyone pretty much needed horses to get anywhere. Hence the need for something horseless, and the “horseless carriage” was still a ways away.

Von Drais was a flamboyant character and his life later took a few bad turns: he was fired from his day job as a forester after being deemed “unfit,” and he got embroiled in retribution for a political murder. For a complicated series of reasons, he had to spend much of his later life in exile in Brazil. He died penniless. The Laufmaschine and what it eventually led to were his legacy—even if he didn’t profit from it in his lifetime—but so is one other thing. He also invented the typewriter. Well, okay, a typewriter. Well, yes, okay, not even a typewriter, really, but more of a shorthand or stenography machine. Wikipedia says that it was the first typewriter with a keyboard, but that’s not really true.

Von Drais invented and marketed two typewriter-like devices, a 25-character model in 1821 and a 16-character model in the early 1830s. Von Drais used to claim, in good PR fashion, that his device was capable of typing a thousand characters a minute. Wrote typewriter historian Michael Adler in Antique Typewriters:

That kind of flamboyant extravagance was consistent with the inventor’s well-documented character and, if at all credible, must surely be related simply to the maximum number of random marks the machine was physically capable of making using all fingers…and perhaps a few toes, for good measure (Messenger, 2014).

Ouch. Poor von Drais; even among his contemporaries he was the Rodney Dangerfield of inventors. At an exhibition of his machine in Frankfurt in 1831, one wag described it as “eine mechanische Narrheit und alberne Erfindung” (“a mechanical madness and an absurd invention”). Double ouch.

The typewriter as we know it (assuming there are people who still know what a typewriter is!) was invented by Christopher Latham Sholes. Or, to be more exact:

The fifty-second person to invent the typewriter and the first person to call it that, was Christopher Latham Sholes (Romano, 1986).

The dominance of the typewriter for written communication led to a number of typographic conventions that still remain with us—even if they are anachronisms in today’s word processing, desktop, and online publishing worlds. One of my pet peeves is the tendency to put two word spaces after a period. This is said to date from the Age of the Typewriter, but that is not entirely true. Back before any kind of automated typography, if you wanted justified text, you had little recourse but to noodle with word spacing, and typesetters used to routinely add entire en and em spaces after periods. (Today’s desktop publishing programs noodle far more deftly with a combination of word and character spacing to justify text.)
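
To make that noodling concrete, here is a minimal, purely illustrative Python sketch of the crudest form of the trick: justifying a line by padding word spaces alone, roughly what a hand compositor or typewriter user effectively did. Real layout engines also adjust character spacing and choose line breaks far more cleverly:

```python
# A minimal sketch of space-only justification: widen the gaps between
# words until the line fills the target measure. Real DTP engines
# distribute much subtler adjustments across word *and* character spacing.
def justify(words: list[str], width: int) -> str:
    if len(words) < 2:
        return "".join(words)
    gaps = len(words) - 1
    slack = width - sum(len(w) for w in words)  # total space to distribute
    base, extra = divmod(slack, gaps)           # spread as evenly as possible
    line = ""
    for i, word in enumerate(words[:-1]):
        # The first `extra` gaps absorb one additional space each.
        line += word + " " * (base + (1 if i < extra else 0))
    return line + words[-1]

print(justify("the quick brown fox".split(), 25))
# the   quick   brown   fox   (each 1-space gap widened to 3)
```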

The practice of adding additional space after periods was later adopted by typewriter users when typewriters were only capable of using monospaced typefaces like Courier. With such faces, each character and each word space has exactly the same width, which adversely affects legibility. The two-word-space convention was thus a visual cue to make it clearer that a sentence had ended. CreativePro has a nice essay on this, saying:

It’s a question of balancing the white space bound up in each character with the spaces around them. In addition, a single word space simply lacks the visual impact to cue the reader that a sentence has ended. The punctuation mark alone, in short, isn’t enough to punctuate the texture of the type flow.

Makes perfect sense in retrospect. But, alas, it makes little sense when using a proportional-width typeface like Times.

Monospaced typefaces like Courier (or a similar typeface called, cleverly enough, American Typewriter) are still common; in fact, they’re required of professional playwrights and screenwriters (monospaced typefaces and standard script formats make it easy to gauge timing). Those of us who have done electronic prepress are no doubt intimately familiar with the infamous “Courier substitution,” or what RIPs used to put into page layouts—or on expensive film—when the correct font wasn’t available, although the advent of PDF has largely made the Courier substitution history. (I remember around late 2000 or so I picked up a print edition of my local newspaper, The Saratogian, and on the front page, every headline and photo caption was in Courier. In the Help Wanted ads, I noticed a big ad saying that the paper was looking for a managing editor. I bet.)

Why was Courier almost always the default font? Why not something more appealing or less obtrusive?

Courier was designed by Howard “Bud” Kettler in 1955 and later redrawn by Adrian Frutiger for the IBM Selectric Composer series of electric typewriters. The typeface had been commissioned by IBM, but the company chose not to copyright, trademark, or patent it—unlike other typefaces—so Courier has always been completely royalty-free, which is why it became, and remains, so ubiquitous. No one has to pay for it.

Why the name Courier? It was originally called Messenger, but, Kettler once said, “A letter can be just an ordinary messenger, or it can be the courier, which radiates dignity, prestige, and stability.”

I’m not sure that Courier—or even couriers—still radiate those traits, but that was the thinking.

So thanks to IBM’s decision to not patent or copyright the typeface, Courier, for better or worse, has become a typographic juggernaut.



James Felici, “To Double-Space or Not to Double-Space…,” CreativePro, August 24, 2009.

Robert Messenger, “1000 Characters a Minute! The Karl Drais ‘Typewriter,’” OzTypewriter, the Australian Typewriter Museum blog, January 14, 2014.

“Juggernaut,” Oxford Dictionaries, accessed December 31, 2014.

Frank J. Romano, Machine Writing and Typesetting (Salem, N.H., 1986), p. 1.

Tom Vanderbilt, “Courier, Dispatched,” Slate, February 20, 2004.

H.G. Wells, The Wheels of Chance: A Bicycling Idyll.

“Courier (typeface),” Wikipedia, last modified December 30, 2014, accessed December 31, 2014.

“John Mandeville,” Wikipedia, last modified December 29, 2014, accessed December 31, 2014.

“Juggernaut,” Wikipedia, last modified December 25, 2014, accessed December 31, 2014.

“Karl Drais,” Wikipedia, last modified September 16, 2014, accessed December 31, 2014.

“Year Without a Summer,” Wikipedia, last modified December 2, 2014, accessed December 31, 2014.

Power Windows


A few years ago, while in England, I visited Canterbury Cathedral. (We were on a pilgrimage and we all told tales as we trekked southward, a doughy poet feverishly writing them all down in rhyming couplets; and as the Miller told his tale, our faces, at first just ghostly, turned a whiter shade of pale. We continue.) The first cathedral at Canterbury dates from A.D. 602, dedicated by St. Augustine. It was destroyed by fire in 1067 and over the next couple of centuries was rebuilt into the magnificent structure we can visit today.

One of the most notable features of the cathedral—aside from the shrine to Thomas Becket, who had rather a bad day there in 1170—is the stained glass. Indeed, Canterbury Cathedral contains more than 1,200 square meters of stained glass which, like the stained glass found in many a medieval church and cathedral, depicts stories from the Bible, as well as other related events, lives of the saints, and so on. Some of Canterbury’s windows also depict the life of St. Thomas (Becket). The stained glass was not just decorative, although it certainly is that; medieval stained glass is quite beautiful. Much of it did, however, serve a more practical purpose: back in the Middle Ages, prior to the invention of printing, the vast majority of the population was illiterate. Stained glass windows (not all of them, but most cathedrals had various picture series) were a communication medium, a form of visual storytelling. (The verbally related tales that Chaucer’s pilgrims told on the way to the shrine to Thomas Becket were also a dominant form of communication at the time.)

(Quick quiz: what’s the difference between a church and a cathedral? The latter contains the cathedra, or the seat where the bishop sits. It’s not true, though, that bishops can only move diagonally.)

Anyway, in the sixteenth century, there was a revolt against all things iconic, and many churches and cathedrals throughout Europe saw their stained glass and other iconography destroyed. (Whence the word iconoclast, “breaker or destroyer of images.”) Canterbury was spared much of this destruction, although the English Civil War brought damage to some of the windows. (The German bombing of England during World War II also took a bit of a toll.)

However, a recent Wall Street Journal story (via Gizmodo) identifies a new threat to stained glass or, more specifically, the stained glass industry—one with which our own industry is not unfamiliar. (And if you didn’t know that there even was a stained glass industry, you are not alone.)

[C]hurch architects and experts say modern churches rely more on video and photo slideshows, which they say connect with attendees more than the static imagery of stained glass. “They want to have it dark, so they can project PowerPoint onto a screen,” says Richard Gross, editor of Stained Glass Quarterly.

There is a Stained Glass Association of America, the industry trade group, which

has seen membership dwindle to around half of its peak size of 900 during the 1970s. The annual conference draws about half the number of attendees it once did, and SGAA officials say they have privately considered broadening the group’s name.

One can’t ignore the fact that stained glass is really expensive. That, combined with the inevitable force of modernization, the desire to appear modern to bring in the parishioners, and the fact that stained glass uses decidedly static imagery, has led new churches to use LED screens or HD projectors to display dynamic content like text, video, images, and PowerPoints (and, perhaps, St. Paul’s First Email to the Ephesians).

As a result, stained glass manufacturers are looking at potential secular installations to keep their businesses going, such as casinos, retail establishments, restaurants, hospitals, and private homes.

And, hey, the stained glass industry has a “Dr. Doom,” too:

Kenneth F. von Roenn saw this trend coming nearly four decades ago. At an industry conference inside a Nevada hotel, Mr. von Roenn says he tried to warn his fellow artisans, urging them to shift focus to nonreligious buildings, in a speech called “Time to Jump Ship.”

His advice won a lot of glassy-eyed stares and little applause…

You might even call him an iconoclast.

Here is a case of individuals and companies whose market is seeking newer, more technological alternatives, and who, as a result, are altering their production (such as adopting cheaper, more efficient ways of producing stained glass) and actively seeking new markets beyond their traditional ones.

Troy Story


’Twas the night before Christmas…

As the year draws to a close, it’s a time for reflecting on the past and anticipating the future. The month of January was named for Janus, the Roman god of transitions and beginnings, who is notable for having two faces, one looking to the past, the other looking to the future (it’s a good thing beards were fashionable amongst the gods; he’d have gone through razor blades like crazy).

The first known New Year’s celebration was held by the Babylonians circa 2000 B.C. They didn’t have a January 1 per se (or Dick Clark), so it took place after the vernal equinox and was intended to herald the coming of spring, a time when the world is reborn. Their festival was known as Akitu and included parades and other religious rites. There is one aspect of the Akitu that raises an eyebrow or two: the ritual humiliation of the king.

[T]he king [is] brought before a statue of the god Marduk, stripped of his royal regalia and forced to swear that he had led the city with honor. A high priest would then slap the monarch and drag him by his ears in the hope of making him cry. If royal tears were shed, it was seen as a sign that Marduk was satisfied and had symbolically extended the king’s rule (Andrews, 2012).

I like this idea; perhaps we should revive it for our elected leaders. It might make New Year’s Rockin’ Eve a bit more entertaining.

Anyway, this time of year is also a time for forecasts and predictions. Everyone loves forecasts, even though the vast majority of them will end up being wrong. (So-called professional psychics and astrologers inevitably make predictions and are almost universally wrong.) Still, forecasting has always been big business, and the Old Farmer’s Almanac, which debuted in 1792, is still going and is the oldest continuously published periodical in North America.

It wasn’t the first almanac, though; that distinction belongs to (no, not Benjamin Franklin) a man named Leonard Digges. Digges (c. 1515–c. 1559) was an English mathematician and scientist who is believed—albeit dubiously—to have invented a reflecting telescope decades before Galileo ever pointed one skyward. Regardless, he was one of the first popularizers of science, sort of the Carl Sagan or Neil deGrasse Tyson of his day. He wrote many books, the first of which was a 1553 bestseller called A General Prognostication. It contained a perpetual calendar, weather lore, facts about astronomy, and so forth, and it was revised a few times in the ensuing years. (By the way, Digges’ son Thomas picked up where his father left off and eventually played a strong role in popularizing a subversive little book by Copernicus called De revolutionibus orbium coelestium, which posited that the Earth revolves around the sun and not vice versa.)

At any rate, Digges père was by trade a surveyor, and his true claim to fame was the invention of the theodolite. The what? Basically, it’s a surveying instrument that has a rotating telescope for measuring horizontal and vertical angles. Digges named the device in a 1571 surveying textbook called A geometric practice named Pantometria (it was published posthumously) and the thing is, no one can quite figure out where the word “theodolite” actually comes from, as it appears to be a mélange of Latin and Greek words that only seem vaguely relevant. Nobody really cared about that, though; it got the job done and that was really all that mattered.

As the Age of Exploration kicked into high gear, those venturing into parts unknown required instruments of all kinds—navigation, surveying, etc.—that had greater and greater precision. In the New World, one of the seminal and most famous surveying projects was the Mason-Dixon line, drawn between 1763 and 1768 by Brits Charles Mason and Jeremiah Dixon, which was only possible because the team had access to the most sophisticated instruments at the time; Mason had served as assistant astronomer at the Royal Greenwich Observatory near London. (For an entertaining fictionalized account of Mason and Dixon’s exploits, I recommend Thomas Pynchon’s Mason & Dixon, or Mark Knopfler’s 2000 song “Sailing to Philadelphia.”)

As America continued to move west, particularly after the Civil War, surveyors of the ever-advancing frontier were aided by precision theodolites, and in the 19th century, no one’s theodolites were more respected than those made by W. & L. E. Gurley Co. In fact, theirs was the gold standard for surveying equipment until the advent of lasers and digital technology. The company had been founded by two brothers, William Gurley and Lewis E. Gurley, graduates of the Rensselaer Polytechnic Institute in Troy, N.Y.

Conveniently located near the intersection of the Hudson River and the Erie Canal, Troy at the time was a major industrial manufacturing center, and there was no better location for the Gurleys to base their company.

Troy, N.Y., also has a role to play at this time of year. On December 23, 1823, the Troy Sentinel first published an anonymous poem, later revealed to have been written by a New York City-based Episcopalian professor. Reprinted in many other publications, it was that particular poem that created much of our contemporary Christmas imagery and lore, particularly regarding the poem’s main character, said to have been inspired by a Dutch handyman the poet knew. That main character was Santa Claus, the author was Clement Clarke Moore, and the poem, of course, was “A Visit From St. Nicholas” (aka “The Night Before Christmas”).

It was Moore’s poem, followed by an 1881 illustration by cartoonist Thomas Nast, that helped define the modern conception of St. Nicholas, aka Santa Claus.

…and to all a good night.

Happy Holidays from The Digital Nirvana.



Evan Andrews, “5 Ancient New Year’s Celebrations,” December 31, 2012.

“Clement Clarke Moore,” Wikipedia, last modified November 18, 2014, retrieved December 3, 2014.

“Gurley Precision Instruments,” Wikipedia, last modified April 22, 2014, retrieved December 3, 2014.

“Leonard Digges,” Wikipedia, last modified September 28, 2014, retrieved December 3, 2014.

“Old Farmer’s Almanac,” Wikipedia, last modified October 6, 2014, retrieved December 3, 2014.

“Santa Claus,” Wikipedia, last modified October 6, 2014, retrieved December 3, 2014.

“Theodolite,” Wikipedia, last modified November 30, 2014, retrieved December 3, 2014.


We Blog


It has become common these days for industry events—like major product announcements—to be “liveblogged,” or written up in more or less real time on a news site, blog, or Twitter. Apple’s splashy press events are inevitably liveblogged and, closer to home, HP’s Blended Reality press event in October was liveblogged in a number of places. Other industry events are liveblogged as well, and liveblogs can be a great way for non-attendees to follow along with the action.

Liveblogging is not just for press events. I don’t get cable TV, so this fall I often followed Syracuse football games via the Syracuse Post-Standard’s livetweets of the games. (The games were far less depressing when I could “watch” them in 140-character bursts of inept offense.)

Last fall, one of my favorite musicians, Kate Bush, performed live in London for the first time in 35 years. Alas, I didn’t get a chance to attend, although the Guardian did liveblog one of the shows. I noticed from the setlist that she didn’t play one of her most famous songs, her first hit single way back in 1978 (at the tender age of 19): “Wuthering Heights,” based on the Emily Brontë novel. The title “Wuthering Heights” immediately conjures up two competing associations in my mind: the Kate Bush song, yes, but also the Monty Python sketch, “The Semaphore Version of Wuthering Heights,” in which Heathcliff and Catherine express their tempestuous love affair via signal flags.

Did you know that the term “semaphore” (from the Greek sema, “sign, signal,” and phoros, “bearer”) was coined by a late 18th-century French inventor named Claude Chappe (1763–1805)? Chappe’s approach to semaphore was not signal flags but rather a series of stone towers that used large pivoting rods and movable shutters to relay messages—read by telescope—from one tower to the next. It was a successful form of long-distance communication and was eventually used across Europe, most notably by Napoléon, who implemented the system to move his armies around his growing empire. Oh, and another term that Chappe coined? “Telegraph,” which was actually what he called his system. Others working on electric long-distance communication, like Samuel Morse, would later adopt the term.

The telegraph would trigger a whole host of telecommunications revolutions culminating in what you are reading right now (the Internet, presumably), while another technology that emerged at about the same time is connected both to the ability to pick up and read Wuthering Heights on paper and to the ability to read this blog post. (Choose wisely…)

It begins with someone trying to find a cheap way to print something.

The story of Alois Senefelder (full name: Aloys Johann Nepomuk Franz Senefelder—boy, did they know how to name ’em back then!) is pretty well known, certainly to this crowd. Senefelder (1771–1834) was born in Prague but trod the boards in Germany, where he was primarily an actor and, even more so, a playwright. A play roguishly called Connoisseur of Girls was a success, but he ran into problems getting a follow-up play, Mathilde von Altenstein, printed. (It was not “a play about nothing”—that was Seinfeld, not Senefelder.) He fell into debt and couldn’t afford to publish another play he had written (been there, done that). However, in one of those random twists of fate, an entire industry was soon to be born—all because Senefelder had to do his laundry.

He had been trying to come up with an alternative to the letterpress printing methods in use at the time. He had been mucking about with copper plates, to no avail, until one day, he recalled, he wrote a laundry list on a slab of Bavarian limestone with a grease pencil. He wasn’t immediately sure of what he had, but after some more mucking about, he essentially invented what became known as lithography (Romano and Romano, 1997). He developed the image carrier (litho stones) as well as a special lithographic press that used the stones, and in 1796, partnering with composer Franz Gleißner, he started a music publishing firm based on lithographic printing.

Lithography—or planography (printing from a flat surface)—was largely used to print illustrations (lithographs), but there were early uses in packaging—specifically, printing on tin cans. As the technology advanced, lithographic presses started using metal rather than stone as an image carrier and, thanks to Robert Barclay, went from flatbed to rotary.

By the end of the 19th century, a cutting-edge new technology—photography—was reducing demand for lithographs (sound familiar?). By the early 20th century, lithography—using either stone or metal—was a low-cost printing process used for printing books, photographs, and transactional documents (sound familiar?).

Enter Ira Rubel, a commercial printer in Nutley, N.J. He had been using lithographic stone presses to print bank deposit slips and, as was common at the time, the press’s impression cylinder was covered with a rubber blanket. However, good help is hard to find and automatic feeding had yet to be invented, so occasionally the person feeding the paper into the press missed a sheet, and the lithographic plate would print directly on the blanket. The next sheet through the press would then pick up the impression from the blanket, albeit reversed. However, Rubel noticed that the blanket-printed image was sharper than the image from the plate itself (Romano and Romano, 1997). Ding ding ding! Owwwoooogaahh!! Rubel had himself an idea, and thus was born offset lithography, which would eventually kill off letterpress, although far too late for it to do Senefelder any good.

Rubel made his inadvertent discovery in 1904. Literature nerds know that 1904 is the year in which expatriate Irish author James Joyce set his classic (and controversial) book Ulysses. Indeed, literati around the world celebrate “Bloomsday” on June 16, the day in 1904 that Joyce’s protagonist, advertising canvasser Leopold Bloom, has his Homeric odyssey around Dublin.

Joyce’s book was a publishing sensation, and not in the best way; it was subject to censorship and court cases (Molly Bloom’s lengthy internal monologue in the book’s final chapter is a bit TMI), but it has also inspired many other writers and musicians. Allan Sherman got a laugh with a Ulysses reference in his 1963 novelty hit “Hello Muddah, Hello Fadduh” and, at the other end of the musical spectrum, Grace Slick sings about Molly Bloom in “rejoyce” on Jefferson Airplane’s 1967 album After Bathing at Baxter’s.

And Kate Bush originally intended her 1989 song “The Sensual World” to set Molly Bloom’s soliloquy to music, but the Joyce estate wouldn’t give her permission, so she paraphrased. Yes.

Anyway, James Joyce leads us to Jorn Barger, who launched the Robot Wisdom WebLog in 1997. It was one of the very first weblogs—indeed, it was Barger who coined the term “weblog,” by which he originally meant the act of “logging the web” as he went from site to site. He had wide-ranging interests, a significant one being the works of James Joyce, about which he wrote frequently.

The term “weblog” was abbreviated to “blog” by Peter Merholz in 1999.

So, today, we have liveblogging, video blogging, slow blogging, and so on. Senefelder wanted to find a cheap way of publishing his plays, and as it turns out, a blog is a pretty cheap way of doing it. Blogging—as this site amply demonstrates (except for the present post, I expect)—is a good way for companies to offer interesting and useful information to present and prospective customers, and should be considered one of the primary tools in one’s marketing arsenal.



Frank Romano and Richard Romano, The GATF Encyclopedia of Graphic Communication (Sewickley, Penn., 1997).

“Alois Senefelder,” Wikipedia, last modified June 4, 2014, retrieved December 2, 2014.

“Claude Chappe,” Wikipedia, last modified November 29, 2014, retrieved December 2, 2014.

To Moon! Variable-Data Printing Flies Into New Directions


Recently, I was working on a project for one of our clients, and in a marked-up Word document that came back to me was a comment that read, in part, “which came first, digital print or one-to-one marketing?” That got me thinking—which is always dangerous—and then some poking around—which is even more dangerous.

Today’s notion of “one-to-one communication,” aka “variable-data printing,” has something of an unlikely origin and, like just about any technology, is the result of a fortuitous confluence of other related and unrelated technologies. Curiously enough, it begins with an attempt at pre-Internet “blogging” of a sort, which turned into one of American publishing’s biggest success stories. And that Word doc’s chicken-and-egg question is apt in more ways than one: this story begins down on the farm.

DeWitt Wallace was born in St. Paul, Minn., and in 1912, after college, he got a job with a magazine publisher that specialized in farming literature. A few years later, World War I intruded and Wallace enlisted in the army. He was wounded in combat and, while recovering in a French hospital, passed the time reading magazines. Realizing that rural Americans—of whom there were many back in the 1910s—didn’t have access to a newsstand, he thought he would compile a “digest” of articles from various magazines that caught his eye and promote it via direct mail. You can probably see where this is going…

When he returned Stateside, he went through the magazines at the Minneapolis Public Library and put together a diverse collection of articles, condensed and often rewritten. He was the blogger of his age, in some ways. His project was officially launched in 1922, and the resulting Reader’s Digest became one of the most popular periodicals in the world.

For our purposes, Reader’s Digest also holds the distinction of what is believed to be the first use of in-letter personalization—the use of a person’s name in the body of a computer-generated letter. (Political mailers—especially those on the conservative side of the aisle—would finesse Reader’s Digest’s early experiment and perfect database marketing.) What do you need to produce a computer-generated letter? Well, obviously, you need a computer.

The history of computing is a long and winding road indeed, but the Reader’s Digest condensed version of this story would lead us very quickly to the 1960s and the advent of the IBM System/360, which was announced in 1964 and started shipping a year later. It was the first commercially popular and upgradable mainframe computer (this was more than a decade before desktop computers). Essentially, it was affordable for businesses large and small, not just for massive academic or government research labs or the largest of corporations. It was the IBM 360 that gave a jolt to direct mailers, because of another invention that appeared around the same time.

The advent of “merge/purge” software is often credited to Alan Drey, a Chicago-based mailing list professional. In the early 1960s, he helped develop the seminal System DupliMatch, software for cleaning up mailing lists. Other types of data analytics software started appearing at this time, as well.
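What did merge/purge actually do? In essence: combine multiple mailing lists into one and weed out the duplicates. Here is a minimal sketch in Python—with hypothetical data and field names, and no claim to resemble DupliMatch’s actual logic—that merges two lists and purges duplicates via a crudely normalized name-and-address key:

```python
def match_key(record):
    """Build a crude match key: concatenate fields, lowercase, strip non-alphanumerics."""
    raw = record["name"] + record["street"] + record["zip"]
    return "".join(ch for ch in raw.lower() if ch.isalnum())

def merge_purge(*mailing_lists):
    """Merge any number of lists, keeping the first record seen for each key."""
    seen = {}
    for mailing_list in mailing_lists:
        for record in mailing_list:
            seen.setdefault(match_key(record), record)
    return list(seen.values())

list_a = [{"name": "J. Smith", "street": "12 Elm St.", "zip": "12180"}]
list_b = [{"name": "J Smith", "street": "12 Elm St", "zip": "12180"},
          {"name": "W. Gurley", "street": "514 Fulton St.", "zip": "12180"}]

print(merge_purge(list_a, list_b))  # two records; the near-duplicate is purged
```

Real merge/purge systems go much further—fuzzy matching, nickname tables, address standardization—but the principle is the same: one clean list, mailed once per household.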

So, by the end of the 1960s, you had the hardware that was powerful enough to process really big mailing lists and could be afforded by a large number of businesses, you had software for managing mailing lists, and you had many proofs-of-concept (Reader’s Digest and political mailers). One other element would be needed to take one-to-one marketing to the next level.

Robert Moon may not be a household name, but if you are in any way involved in direct mail—or are a fan of Beverly Hills 90210—it should be. Moon was born in 1917, went to work for the post office as a mail carrier and then a postal clerk, and soon passed the exam to become a postal inspector. In the 1940s, he had an epiphany and felt that

the existing rail-based system would no longer be adequate for huge new volumes of mail. He believed the future was in airplanes.

To that end, he did become an amateur pilot, true, but in 1944 he also proposed to the postal powers that be an idea he had for streamlining delivery of the mail. For a variety of reasons—Moon’s widow felt it was political—the idea languished until 1963, when it was finally adopted: the ZIP code. Short for “Zone Improvement Plan,” it revolutionized mail delivery in general, and direct mail in particular. It also revolutionized the ability to segment recipients.
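Why did the ZIP code matter so much to mailers? Because those five digits are hierarchical—the first digit denotes a broad region of the country and the first three identify a sectional center facility—so a list can be sorted, bundled, and targeted by geography. A minimal sketch, with entirely hypothetical recipients:

```python
from collections import defaultdict

# Hypothetical recipients; a real mailing list would come from a database.
recipients = [
    {"name": "A. Reader", "zip": "12866"},  # Saratoga Springs, N.Y.
    {"name": "B. Reader", "zip": "12180"},  # Troy, N.Y.
    {"name": "C. Reader", "zip": "90210"},  # Beverly Hills, Calif.
]

# Group by sectional center facility: the first three digits of the ZIP code.
segments = defaultdict(list)
for r in recipients:
    segments[r["zip"][:3]].append(r)

for scf, group in sorted(segments.items()):
    print(scf, [g["name"] for g in group])
# "121" and "128" bundle to upstate New York; "902" heads to Southern California.
```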

By the way, Moon—living up to his name or perhaps the space-race euphoria of the 1960s—apparently also left behind a ZIP code scheme for interplanetary mail. (I kid you not!) So have no fear: if we colonize Mars, Harry & David will still be able to find us.

Anyway, by the 1970s, direct mail had exploded, and we all began to be inundated with mail that was seemingly written by a human being just for us. In a previous post, I used the following as an example of the kinds of things we used to receive:

Dear Mr. Ramono,

We very much want to put you, Mr. Ramono, in a new car. Mr. Ramono, have you ever seen yourself behind the wheel of a luxurious yet sporty new vehicle? Have you ever envisioned your own vehicle, Mr. Ramono, being the envy of your neighborhood? Surely the entire Ramono family would derive nothing but benefits from this…
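Underneath the unctuousness, a letter like that is just string substitution: a template with fields filled in from a database record. Here is a minimal sketch of the idea—hypothetical template and data; real VDP front ends also swap in variable images, offers, and entire layouts per recipient:

```python
# Hypothetical template; {title} and {last_name} would be database fields.
template = (
    "Dear {title} {last_name},\n\n"
    "We very much want to put you, {title} {last_name}, in a new car. "
    "Surely the entire {last_name} family would derive nothing but benefits..."
)

recipients = [
    {"title": "Mr.", "last_name": "Ramono"},  # garbage in, garbage out
    {"title": "Ms.", "last_name": "Smith"},
]

for r in recipients:
    print(template.format(**r))
    print("-" * 40)
```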

By the end of the 1970s, all the pieces were in place for what eventually became known as “variable-data printing” (VDP). Digital printing itself emerged circa 1994, and the earliest VDP programs—the “killer apps” for digital printing—appeared not long after that. Variable-data printing wasn’t so much a revolution as an evolution of all that had gone before; it was more a matter of digital front ends gaining the horsepower to process ever more unwieldy databases, as well as variable images. The key to keeping VDP-based campaigns effective is to make them unique and eye-catching; novelty (and relevance, of course) is the most important aspect of one-to-one marketing. After all, no one is especially impressed by seeing their name in the body of a letter anymore.

Any VDP expert will be the first to tell you that, despite whatever technological bells and whistles you care to add—and as printed electronics become more prevalent, who knows, we may be printing actual bells and whistles on direct mail—the most important element of a VDP campaign is the content and the message—or the offer—itself. That is the one thing that has not changed since the advent of personalized direct mail all those years ago.