Archive for the ‘Vertical Markets’ Category

Eye Books

Thursday, October 30th, 2014

Herein a long tale of history, technology, and media change.

Several years ago, one of the community arts organizations I am involved with—the Saratoga Film Forum, an art house movie theater in downtown Saratoga Springs, N.Y.—had on its programming committee a serious film buff. He was, essentially, a veritable walking (or sitting, as the case may be) encyclopedia of cinema. This is, of course, not surprising. What was surprising was that he was almost totally blind, suffering from severe macular degeneration and needing elaborate optics that resembled a wearable Viewmaster to watch movies or read books.

Today, optometrists and ophthalmologists understand macular degeneration thanks in large part to the work of Samuel Thomas von Sömmerring. Von Sömmerring (1755–1830) was a German physician and one of the most renowned anatomists in Germany at the time. Amongst his many contributions to our knowledge of physiology was his discovery of the macula in the retina of the human eye. The macula contains the fovea and foveola. They contain a high density of cones, which, with their partners the rods, are the photoreceptors that allow us to see. Macular degeneration, as you would expect, involves damage to these photoreceptors.

Von Sömmerring was, like many men of his age, a bit of a polymath and an inventor. He designed a telescope, among other things, and in 1809 created one of the first electric telegraph systems. Based on a crude earlier design, his system used as many as 35 electrical wires, each of which represented a different letter or number. Thus:

messages could be conveyed electrically up to a few kilometers…with each of the telegraph receiver’s wires immersed in a separate glass tube of acid. An electric current was sequentially applied by the sender through the various wires representing each digit of a message; at the recipient’s end the currents electrolysed the acid in the tubes in sequence, releasing streams of hydrogen bubbles next to each associated letter or numeral (Wikipedia, 2014).

Not the most elegant of designs, but it did trigger several decades of development that produced an effective working telegraph. The first commercially successful electric telegraph was co-developed by William Fothergill Cooke and Charles Wheatstone in the UK. In 1838, it was installed by the Great Western Railway between Paddington Station and West Drayton.

Across the pond, Samuel Morse had patented his own version of a telegraph, as well as the eponymous code (the “Morse code” was devised by Morse with his assistant, Alfred Vail). In 1844, the famous “What hath God wrought” telegram was transmitted, and the rest is history.

The legacy of the telegraph is easy to spot today; what is texting, really, but a high-tech version of the telegram? And all those texting abbreviations and emoji are not a million miles removed from the Morse code, although they’re often less comprehensible.

The telegraph did help solve a problem that had briefly plagued U.S. President Andrew Jackson. For the first 125 or so years of U.S. history, mail delivery was literally 24/7. Indeed, the postal service was the primary means of long-distance communication back then, and few things were more important than the mail. Postmaster General was a Cabinet position, and until 1971, the Postmaster General was in the line of Presidential succession. Post offices were also great gathering places, as people socialized, drank, and played cards or what have you while they waited for the mail to arrive (there was no home delivery until the 1860s).

“The advance of the human race in intelligence, in virtue and religion itself depends in part, upon the speed with which…knowledge…is disseminated,” wrote Colonel Richard M. Johnson, a Kentucky Congressman who served during the Jackson presidency (he was later Martin Van Buren’s veep) (Meacham, 2008). Why did he write this?

The fact that there was mail delivery every day of the week meant, logically, that there was mail delivery on Sunday, aka the Sabbath. This didn’t sit well with some of the more religiously inclined personalities of the time—in particular, one Reverend Ezra Stiles Ely, who was a man on a mission. That mission was to end what he called “the national evil of great magnitude”: mail delivery on Sunday. (The things they worried about back then…) He took it up directly with President Jackson—one of the problems of being a populist like Jackson was that you were constantly being accosted by the public—and even though Jackson had other things to contend with (like, say, nullification), Congressman Johnson was appointed to head a committee to investigate closing the Post Office on Sunday. The committee ultimately decided, “The mail is the chief means by which intellectual light irradiates to the extremes of the republic. Stop it one day in seven, and you retard one-seventh of the advancement of our country” (Meacham, 2008). (Boy, did they have a way with words back then!) So Sunday mail delivery stayed. (Another of Johnson’s arguments was that since some religions celebrate the Sabbath on Saturday, singling out Sunday would give unfair—and unconstitutional—preference to one particular faith.)

After 1844, however, the volume of mail in general—and Sunday mail in particular—started to drop thanks to the telegraph, which became a prominent tool of business communication.

Remember, too, that businesses tended to operate seven days a week back then. Reverend Ely and his successors were still eager to get the Sunday Sabbath free, so by the end of the century, religious leaders formed an alliance with organized labor, which was starting to become an influential force. Both parties, religious leaders and labor leaders, wanted the same basic thing—Sundays off—albeit for different reasons. By the early 20th century, technology had made the issue, as far as the mails were concerned, moot. The telegraph and the railroad made businesspeople less reliant on the mail, so in 1912, when Congress decided to eliminate Sunday mail delivery, a bill which President Taft signed without complaint, there really wasn’t much hue and cry.

As Dr. Joe Webb has pointed out many times, mail volumes have continued to drop thanks to all the communications revolutions of the 20th century—the telephone, radio, television, the Internet, and now all the various mobile and social media. And while debate centers around whether mail delivery should be pared back to five days a week, last year Amazon partnered with the USPS to restore Sunday delivery, if only in selected cities (at first).

One of the things you could have Amazon deliver to you on a Sunday is a new Kindle.

It was the Kindle, more than anything, that triggered the ebook revolution. Electronic books were nothing really new; Project Gutenberg dates back to 1971, after all, and by the turn of the millennium there were at least a dozen companies and platforms jockeying for market share in the nascent ebook space, including such giants as Microsoft and Adobe. The early Palm devices—precursors to today’s smartphones—were highly touted as an ebook platform. (Have you ever read a long novel on a Palm Pilot? It was not fun.) The E Ink approach to “electronic paper”—the reflective electrophoretic technology that essentially made reading a screen as comfortable as reading ink on paper—started to gain traction, and the Sony Reader was the first commercially successful ereader. It debuted first in Japan and was introduced in the U.S. in 2006. It was a modest hit, but it wasn’t until the Amazon Kindle, based on the same E Ink technology, launched in 2007 that the ebook market took off. (The poor Sony Reader; discontinued in 2013, it is alas a mere footnote, albeit an important one, in the history of ebooks.) Although ebook growth has been flat in the past couple of years, in 2013 ebook sales still amounted to $3 billion, which ain’t nothin’. Even if ebooks aren’t exactly cannibalizing print book sales, they are still an important part of the cross-media mix.

Ebooks like those available for the Kindle have found favor amongst older readers for a very basic reason: it’s easy to make the type bigger. And thus book lovers who may have failing eyesight—either from basic aging or specific problems like macular degeneration—are still able to read. And Apple’s perhaps aptly named “Retina” displays make even backlit screens easy to read.

Samuel von Sömmerring would approve.



“BookStats: Ebooks Flat in 2013,” DigitalBookWorld, June 26, 2014.

Megan Garber, “The Unlikely Alliance That Ended Sunday Mail Delivery…in 1912,” The Atlantic, November 12, 2013.

Tiffany Hsu, “U.S. Postal Service to deliver Amazon packages on Sundays,” Los Angeles Times, November 10, 2013.

Jon Meacham, American Lion: Andrew Jackson in the White House (New York, 2008), pp. 87–88.

“About Project Gutenberg,” Project Gutenberg.

“Samuel Thomas von Sömmerring,” Wikipedia, modified September 26, 2014, accessed October 29, 2014.

Printing Is Easy, Marketing Is Hard

Wednesday, March 26th, 2014

“Outside of a dog, a book is man’s best friend. Inside of a dog, it’s too dark to read.” —Groucho Marx

It has been said (by whom, I’m not entirely sure) that everyone has a book inside them (insert your own “Marxist” joke here), or at least everyone thinks they do. I am regularly asked by friends and colleagues, both inside and especially outside the printing industry, how to self-publish a book. Almost universally, the questions are about the physical production and printing process (“How many pages/words do I need to write?” “How expensive is it?”, etc.) or how ebooks work. However, in my experience, the questions one asks about self-publishing should focus less on production and more on marketing—and even whether there is an audience at all for the book you want to write.

There are success stories, of course. The Fifty Shades of Grey franchise (to my horror, I discovered too late that it had nothing to do with color management) is perhaps the emblematic example of the self-publishing experiment that was enough of a hit to lead to mainstream publishing success. (Imagine, erotica being a saleable commodity. Who’d’a thunk it?)

Regular WhatTheyThink readers may know (or be in denial about the fact) that Dr. Joe Webb and I have co-written and self-published almost half a dozen books (see in particular here, as well as here, here, here, and here), and the half-dozenth is on the drawing board—and, no, it will not be called 128 Levels of Gray and will not chronicle the erotic adventures of a prepress department manager. The one thing that we have learned in our self-publishing adventures is that production, printing, and even writing all comprise the easy part of the self-publishing process. Today’s digital and on-demand printing technologies make it easy and inexpensive to publish your own books, and services like Amazon and Lulu, to name two that we have used, handle the physical production and offer an online storefront for a book. But that is, again, only the smallest of first steps.

Some serious questions and considerations to ponder before even setting finger to keyboard include:

  • What is the real market for the book? Be honest. What is the competition like? Do your due diligence. Search Amazon, Barnes & Noble—even venture to the nearest physical bookstore to see what books may exist on your topic. You may very well be entering a very crowded or even saturated market—even if you have a unique take on a well-trodden topic—and being self-published is one major strike against you if your closest competition is from an established publishing company.
  • Is there a lot of free competition? Our recent book, The Home Office That Works!, is about setting up a productive home office. There are few published titles (that we found) that cover the topic the way we did (most are about launching a specific home business), but we discovered after the fact that there are a lot of blogs and online articles about various aspects of running a home office. That information is strewn piecemeal all over the Internet, but the challenge is getting people to buy something they can probably search out and get for free. If I were to write a book offering tips for prospective self-publishers, I would be in trouble because of blog posts like this one.
  • Do you have a promotional/marketing apparatus already in place? That is, are you a fairly well-known speaker in your industry who can use speaking gigs as marketing tools for the book (and/or vice versa)? When we published Disrupting the Future in 2010, it hit enough of a nerve in the industry that it led to Joe and me getting speaking gigs that, in turn, promoted the book. It helped that we were known quantities (for better or worse) in the industry.
  • How popular are you on social media? I’ll get in trouble for saying this, but I think social media has become vastly overrated as a marketing and publicity tool—but that’s not to say it is without value. Are you active enough in these areas, or do you—like me, I hasten to add—have to be dragged kicking and screaming into social media? If you are like me (and my thoughts and prayers go out to you), do you know someone who can do your social media stuff for you?

Self-publishing is not looked down upon the way the vanity presses of yore were, but there is still a stigma attached to it, as in “you couldn’t get a real publisher, could you?”—even though the questions you should ask yourself before self-publishing are the same ones you should ask before seeking out any publisher.

Digital printing technology has truly enabled the small, independent, or self-publisher—but that really is only the beginning of the process.

Are You Selling 1:1 Printing for Its Efficiency?

Tuesday, November 12th, 2013

When we think about data-driven printing, we think about elevated response and conversion rates, but we should not forget to promote its benefits for cost savings as well. These savings can come in hard costs, such as the ability to reduce postal costs through co-mingling, and in soft costs, such as labor reduction, fewer calls to call centers, and improved cash flow.

I ran across a great case study from XMPie recently that talks about these “hidden” cost savers.

Time Magazine Europe wanted to increase subscriptions, and it also wanted to improve the customer experience when subscribing to the magazine. So it partnered with Latcham Direct (UK) to drive respondents to personalized URLs where they could sign up themselves.

Thirty percent of people responding to this campaign used the personalized URLs.

When people responded using this channel, not only did this improve their customer experience over the impersonal, disconnected process of sending back forms by direct mail, but it helped Time’s bottom line in three ways:

  1. It reduced the costs for the manual input of subscriptions coming in from direct mail cards.
  2. Conversion rate for personalized URL responders was 75%, reducing the cost of follow-ups.
  3. Cash flow improved, since subscriptions were processed more quickly and readers got into the system earlier.

To date, this campaign has reached more than 1.6 million people. Because the response rate via personalized URL was so high (30% of people responding), the cost savings reaped by the company were significant.
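The arithmetic behind those savings is easy to sketch. The model below is purely illustrative: aside from the 30% personalized-URL share cited above, every number in it (responder counts, per-response costs) is a hypothetical assumption, not a figure from the Time/Latcham Direct case study.

```python
def campaign_savings(responders, purl_share, cost_manual_entry, cost_auto_entry):
    """Estimate data-entry savings when a share of responders self-serve
    via personalized URLs instead of mail-back subscription cards."""
    purl_responses = responders * purl_share
    card_responses = responders - purl_responses
    # Baseline: every response keyed in by hand from a mail-back card.
    baseline_cost = responders * cost_manual_entry
    # Actual: only card responses need manual input; PURL sign-ups
    # flow into the subscription system automatically.
    actual_cost = (card_responses * cost_manual_entry
                   + purl_responses * cost_auto_entry)
    return baseline_cost - actual_cost

# Hypothetical figures: 50,000 total responders, $1.50 to key a mailed
# card manually, $0.10 to process an automated PURL sign-up.
savings = campaign_savings(50_000, 0.30, 1.50, 0.10)
print(f"Estimated data-entry savings: ${savings:,.0f}")  # → $21,000
```

Even under these modest assumed costs, shifting 30% of responses to a self-service channel eliminates a meaningful chunk of manual-input expense, before counting the faster cash flow and higher conversion rate the case study reports.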

How could your clients use a similar model to reduce costs?

Personalized URLs were created in XMPie PersonalEffect and metrics were tracked by UProduce Marketing Console.

Printing in Spook Country

Monday, July 29th, 2013

“Spook Country,” the 2007 novel by William Gibson, introduced the concept of “locative art” to the reading public. Gibson’s character Hollis Henry is constantly searching for works of art with her smartphone; art that Gibson describes as akin to techno graffiti. His descriptions of art tied to a particular GPS location and viewable with a smartphone or VR glasses include a virtual image of F. Scott Fitzgerald dying at the very spot in Hollywood where he had a fatal heart attack, and Archie, a 90-foot giant squid (Architeuthis, for those in the know). In the book, Archie was designed as a display for a Tokyo department store with “an endless rush of digital imagery along Archie’s distal surface.”

The Museum of Vancouver took a page from Gibson’s book this month by launching their augmented reality museum app “The Visible City.” Truly a work of locative art, Visible City enables a walking tour augmented by your smart device in which the tourist sees the streets of Vancouver as they were in their “neon era.” The application overlays pictures and interviews with local personalities to create an immersive experience.


However, augmented reality today is as much about commerce as it is about art. Like the Tokyo department store in Gibson’s novel, retail is the main early adopter. Major brands realize that the opportunity for consumers to interact with products in retail locations can drive sales. There are many examples of AR used for product marketing, including LEGO toys, Heinz Ketchup, Budweiser, and Audi. While the first three involve interactions at the point of sale, Audi used Metaio to develop an AR-enhanced brochure and a virtual user’s guide (it’s in German, but it’s so clear it doesn’t matter). There are also numerous examples of catalogs enhanced with augmented reality apps to deliver 3D product views as the reader directs their smart device at a specific item.

While the early adopters were in retail, other brands are getting on board, most recently PNC bank with their Finder AR-based bank locator app. It’s really not anything that couldn’t be accomplished with a Google search or asking “Siri, where’s the nearest PNC Bank?” Nonetheless, it demonstrates the conservative banking industry’s interest in embracing the new cool thing.


Direct Marketing is a natural fit for augmented reality; just ask Omni Hotels and Resorts. Omni-live, their AR app, was released in June and is part of a multi-media campaign tailored to meeting and event planners. It includes print, social media, online video, and web advertising in concert with augmented reality. In addition to making the campaign more interesting and interactive, AR also makes the campaign more measurable. As soon as the consumer launches the app, the marketer knows that the campaign is being read and how much time the consumer spends interacting with the contents. With a really well done augmented reality application, consumers will return again and again.

There is also potential for AR with transaction printing, from mundane explanations to incredibly creative advertising. With AR, a financial institution or wireless/internet/cable provider could virtually welcome new customers on board, walking them through their statement or invoice and offering detailed instructions (like the Audi user manual above).

There are plenty of agencies and AR developers out there ready to partner with you to bring new services to your clients. All it takes is a creative vision of how your current print products can deliver more value. Adding a virtual layer between the reality of print and a virtual world revealed through smart apps is the next step in business communications – are you ready to take that step?

For a nice primer on Augmented Reality (written well before AR was on the tip of people’s tongues) visit Common Craft’s Youtube presentation (sorry, there is advertising on the site.)

Elizabeth Gooding is the President of Gooding Communications Group and editor of the Insight Forums blog. She writes, presents and provides training on trends and opportunities for business communications professionals within regulated vertical industries.

Desperately Seeking… A Utility Bill

Wednesday, May 1st, 2013

As a utility consumer, I have needs. I need to be asked how I’m doing. I need to feel needed. I need to be understood. I desire warmth from more than just my HVAC unit.

I want to know where my money is going and why I owe as much as I do. Once I come to terms with the hard fact that I indeed do need to part with my hard-earned money, I want it to be as convenient and easy to decipher as possible. I want to be able to check my bill from my phone or computer and have the option to pay from my mobile phone.

I don’t want to call a customer service line, and I don’t want to navigate through a series of voice prompts. Parting with my hard earned money isn’t an intrinsically fun thing to do, so when I have an experience with my utility company, I’m already on the defensive. I need my utility company to open a communication with me, not just a one-way message. I don’t at all mind the utility company sharing a third-party deal with me, as long as it applies to me, and isn’t a hassle to read through.

What I can’t deal with is poor design that lacks graphics to clarify my statement. I’m a visual learner, so I need to see where my money is going. I want to see the crucial information front and center. If I have to call customer service, I want to easily find my account number and all other pertinent information in one place. I want an e-statement that looks like my bill. I find it helpful to see why I’m using so much energy, and I like to see whether I demonstrated better or worse habits than in the prior year (or better habits than my neighbors!). I want to see actual meter readings, and I want to know how to lower my consumption. I also don’t like getting a water bill, a sewer bill, and a waste collection bill separately, when all three are paid with the same invoice!

Also, I need reminders. A printed bill in the mail is a great reminder, but for some bills, I prefer e-presentment and mobile solutions. When I use e-statements, it really helps to get a reminder in my email or a text to my phone. If there’s one thing I hate more than having to pay bills, it’s paying late fees. A simple reminder and an easy-to-use payment portal help me make late fees a thing of the past. I have some bills on autopay from my bank, some I pay monthly with my credit card, and some I send a check for, so I count on my utility provider to make it easy on me with a reminder. The worst is getting hassled by customer service or risking a service interruption from a late payment when, literally, “the check is in the mail!” Please track your remittance efforts as well, and save us all some time!

I understand that some providers have an outdated legacy system in place, but that is no excuse not to get with the times. Work with a provider to transform your legacy system into a more modern one, and begin a statement archival system for easy access in the future. Offer me online and offline options for my statement. Offer an electronic bill pay system.

Is that too much to ask?

Insurance and Retail Get Married

Monday, April 29th, 2013

About this time last year I posted a release about the new retail sales branch opened by Horizon Blue Cross Blue Shield of New Jersey. Horizon was one of the first health insurance companies to take a “retail” approach to selling individual insurance policies under the then newly approved Affordable Care Act.

In May of 2012, Forbes reported on the partnership between Aetna and Costco to offer the Costco Personal Health Insurance medical and dental program. Consumers who buy the Aetna coverage through Costco will get extra discounts when they buy prescriptions through Costco pharmacies. Costco had already developed banking partnerships to allow it to sell mortgages.

This year we are starting to see the life insurance industry, particularly products geared to lower- and middle-income consumers, pursue retail sales opportunities. MetLife, for example, has set up kiosks in hundreds of Walmart stores. Unlike the Horizon branch, which has specially trained staff to answer questions, visitors to a MetLife kiosk pick up their “box of insurance” in the form of a prepaid card and take it to the checkout. They then have to call MetLife’s toll-free number to answer health questions posed by a life agent. If the customer qualifies for coverage, the policy is activated; otherwise, the card can be returned for a full refund.

Two key things we can learn from this trend:

1. As more insurance companies start courting retail partners as distribution channels, or opening up direct branches, they will need a new “retail approach” to their communications as well. This opens up new opportunities for graphic arts services like signage, sell sheets, and packaging for direct branches. It should also increase potential for transaction printers to offer statement marketing to highlight approved retail partners. Design services are a potential “foot in the door” as so much new material will need to be developed for the retail audience.

2. Partnerships, particularly distribution partnerships, can be wonderful things. Printers and other business communications professionals may also find value in new distribution channels and regional partnerships. Insurers are able to reach a broader audience that will pay a premium for convenience through retail relationships. Perhaps there are similar opportunities out there for your business.

If retail and insurance are getting married, let’s crash the wedding or at least get some good dating advice.

Elizabeth Gooding is the President of Gooding Communications Group and editor of the Insight Forums blog. She writes and speaks and provides training on trends and opportunities for business communications professionals within regulated vertical industries.


P&C: Agents of Change?

Monday, March 11th, 2013

There are nearly 1 million insurance agents and brokers employed in the U.S., and you would think that they would be fiercely competitive with each other. You’d be wrong. In fact, more and more, agencies are merging, consolidating, and forming agency networks to compete with the real enemy: Direct Writers. Many large carriers with captive agency forces, or who sell insurance directly online or over the phone (“Direct Writers”), spend as much as $200 to $700 million per year on advertising, particularly in the home and auto insurance market. According to estimates from the Independent Insurance Agents & Brokers of America and A.M. Best Co., direct writers dominated the overall personal lines market in 2010, writing over 53 percent of total premium. Independent agents are fighting back and asserting their value to insurance customers.

“Personal touch is what will keep independent agents alive in the future,” says Christopher Misterka, Marketing Coordinator with the Kaplansky Insurance Agency, which has offices in 11 locations and has experienced 13% growth in each of the past two years. Misterka says that client correspondence used to be primarily letters, but “now it’s primarily email, social media and a monthly online newsletter that offers customer education on timely issues like potential tax scams during tax season.” While Kaplansky has moved most of its personal lines marketing online due to the size of the household audience it wants to reach, it continues to prospect for commercial clients using direct mail. To keep content fresh and campaigns timely, Misterka engaged an insurance specialty organization called Agency Revolution with professional writers, an existing library of content, and a digital marketing platform that enables quick generation of marketing campaigns.

Angelyn Treutel, President of SouthGroup Insurance, agrees that providing educational content is critical in positioning an agency well with customers and that consistency of communication with the customer builds trust. Her organization leverages the Trusted Choice solution available through the IA&B; however, their direct marketing is all managed in-house. “We use a multi-channel, multi-touch approach recognizing that it may take 2 or 3 or 4 touches before a customer takes action,” says Treutel. “We need multi-channel because different customers are in all different places,” relative to their acceptance of online versus print communications. SouthGroup has had particular success with personalized direct mail that includes pictures of agents as part of the mailing. Personalization, careful segmentation of campaigns, and ensuring that the customer never gets the exact same message twice are important in crafting an effective campaign, according to Treutel.

Differentiation between segments of the P&C business such as commercial versus personal lines is one simple method, but many agencies are looking deeper at markets, customers and the communication preferences of individuals. For example, the High-Net-Worth Personal Lines market is more often served by independent agents than direct writers as the agencies give affluent customers the specialized services they have come to expect. In this market, agencies can create campaigns to educate customers on the superior products, pricing, loss prevention services and risk management services that are available through carriers that specialize in the HNW market versus more generic solutions from direct writers. Since many HNW clients are also business owners or senior executives, there are great opportunities to cross sell HNW personal lines insurance to commercial clients and vice versa. Independent agents currently sell the lion’s share of premium in the commercial market and can strengthen that position by building trusted personal relationships with business owners and managers.

Creating innovative communications has historically not been a core competency of insurance agencies, but most recognize that this needs to change. As the demand for more effective and consistent customer touches continues to grow, agencies are looking for partners to help them execute regular, cost-effective communications programs with their customers. If you are a service provider with a truly robust multi-channel offering and the strategic services to become an agent of change in the P&C industry, opportunities abound. If your offering is not quite as developed, don’t despair: there are still opportunities to pursue business from the “Agency Agencies” that are primarily focused on providing marketing content and digital distribution but typically outsource printing and mailing services. As agencies cooperate and grow larger, the opportunities to serve them grow larger as well.

Elizabeth Gooding is the President of Gooding Communications Group and editor of the Insight Forums blog. She writes and speaks and provides training on trends and opportunities for business communications professionals within regulated vertical industries.

Risky Business

Monday, February 11th, 2013

Property and Casualty (P&C) Insurance carriers are in the business of assessing risk; risk of theft, damage, injury, professional malpractice and catastrophe as well as investment risk. They make their money by laying odds on the likelihood that things will go sideways for their customers and that they will earn enough money by investing the pool of premium dollars to pay out on the bet if things do. Lately it seems that climate change is blowing up all the models for setting the odds of a natural disaster and insurers are dealing with defining and delineating coverage for new threats like cyber-terrorism that have completely changed the game.

The core systems most insurers have in place are woefully inadequate to handle the scope and pace of this new insurance game. In order to keep up, companies have built add-on modules and work-arounds to their core systems, often relying on Microsoft Excel or Microsoft Access “Band-Aids” to keep business moving. Many carriers that have upgraded their core systems did it on a “go-forward” basis leaving existing business on the old policy administration or claims system and writing new business on the new platform. At some companies this has happened more than once and there are now several “core” systems in production for different lines of business. All of the Band-Aids, work-arounds and go-forward solutions have left data scattered in multiple repositories just when carriers need data in one place more than ever.

In order to adequately assess risk, insurance carriers need large amounts of policy, claims, fraud and customer demographic data all in one place so that they can use risk modeling and data analytics to determine which types of risk are profitable to insure. According to Accenture's 2012 North American Claims Investment Survey, 54% of P&C insurers have core systems that are more than five years old, 66% say their claims systems are not optimized to collect and analyze data, and 78% regard their capabilities as inadequate to manage new forms and levels of risk, such as those presented by cybercrime, terrorism and increasingly frequent and severe natural catastrophes. So, after years of avoiding the disruption, expense and, well, risk of a major core systems upgrade, many companies have realized that they just can't avoid taking the leap. A small study of 37 insurance carriers by Novarica indicated that 25 percent of large P&C insurers and more than 40 percent of midsize carriers were in the middle of converting their policy administration systems, or planning to start a conversion, at the end of 2011.
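To make the payoff concrete, here is a minimal sketch of the kind of segment-level analysis that becomes possible once policy and claims data live in one repository. The segments and dollar figures below are invented for illustration, not drawn from any carrier's books.

```python
# Toy example: with premium and claims data consolidated, a carrier can
# compute a loss ratio per risk segment to see which lines are profitable.
# All segment names and figures are invented for illustration.

def loss_ratio(premiums, claims_paid):
    """Incurred claims as a fraction of earned premium."""
    return claims_paid / premiums

segments = {
    # segment: (earned premium, claims paid)
    "homeowners_coastal": (1_000_000, 1_150_000),
    "homeowners_inland":  (2_500_000, 1_400_000),
    "commercial_cyber":   (  750_000,   900_000),
}

for name, (premium, claims) in segments.items():
    ratio = loss_ratio(premium, claims)
    flag = "unprofitable" if ratio > 1.0 else "profitable"
    print(f"{name}: loss ratio {ratio:.2f} ({flag})")
```

Real risk models layer expenses, reinsurance and investment income on top of this, but the basic question they answer is the same: which segments pay out more than they take in.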

Keep in mind that the typical core systems upgrade will take from an incredibly fast eighteen months to a more typical three-plus years to complete, depending on the number of undocumented work-arounds that need to be incorporated into the system and the level of data conversion to be completed. This means that a large percentage of the industry is either planning a core system upgrade or in the midst of completing one. And what comes out of these systems, you ask? Documents, lots and lots of documents: quotes, policies, premium invoices, notices, claims reports, payments and more.

Opportunities abound for reducing the costs of producing documents in parallel with core systems conversion. Bringing systems together increases the opportunity for postal optimization, targeting analytics and improvements to the design of the documents themselves. The core systems upgrades have a larger implication as well; they enable insurers to develop more segmented and personalized products to appeal to different age, risk, ethnic and geographic groups of consumers. Direct marketing and agency marketing support is becoming more tailored and personalized as well with multi-touch, multi-channel and multi-language campaigns hitting the paper, airwaves and cyberspace simultaneously.

P&C insurers are expected to spend an average of $17.5 million on claims system upgrades alone. This seems like a pretty substantial number until you consider that the top 16 P&C insurers spend an average of $315 million each on advertising. GEICO alone spent over $993 million on advertising in 2011. This does not count direct marketing spend: P&C affinity mail alone exceeded 500 million mailings in 2011, according to Mintel Comperemedia.

Savvy service providers are positioning themselves to help insurers take advantage of newly upgraded systems and a wealth of new data to improve the customer experience throughout the insurance lifecycle. With their plates full to overflowing with core systems conversion initiatives, insurers need help to ensure that the tangible representation of their value to consumers, namely insurance documents, is not put at risk by the very projects intended to reduce risk. Now is the time to show insurers how to redirect some of those advertising dollars toward investments in customer experience and cross-selling, using low-risk, high-reward solutions like direct mail, statement marketing and personalized collateral in tandem with QR codes and other calls to action that drive social media engagement and leverage consumers' interest in mobile insurance applications. If your company isn't positioned to help them, maybe you should be looking at some core systems upgrades too.


Elizabeth Gooding is the President of Gooding Communications Group and the editor of the Insight Forums blog. She covers business communications trends in highly regulated industries such as insurance, financial services, healthcare and telecommunications.

Jell-O, Healthcare and the New Normal

Wednesday, January 16th, 2013

Running a hospital or healthcare practice is already labor- and capital-intensive, highly regulated and impenetrably complex. The Affordable Care Act and the growing trend toward consumerism have added constant change to the list of industry challenges. While the ACA itself is the law of the land and implementation is moving forward, the foundational elements to be implemented are as firm as warm Jell-O. Change is the new normal:

  • States may or may not expand their Medicaid programs;
  • Health Insurance Exchanges (HIX) may be set up by states or by the Federal government in certain states, and the Federal HIX implementation structure is not fully defined;
  • Accountable Care Organizations (ACOs) are being formed and tested in near real-time;
  • Definitions of Essential Health Benefits (EHBs) can vary by state and guidance is required;
  • The “standard” eight-page format for the newly mandated Summary of Benefits & Coverage (SBC) can now be any length insurance companies deem necessary. (Note: the purpose of this new, somewhat redundant, document was to provide a standardized plan comparison for consumers.)

Providers’ biggest concern may be potential changes and interpretations surrounding “Necessary Care.” According to the Journal of the American Medical Association, “care that did not show a proven health benefit, and where a less costly alternative was not used,” accounted for between $158 billion and $226 billion in 2011. Proposed regulation around necessary care shifts financial risk to doctors and hospitals, and this, along with other regulations and stricter Medicare compliance requirements, will require investment in Electronic Health Records (EHR) and other major infrastructure upgrades that smaller providers are not equipped to fund.

The combination of independent providers’ flight from risk, a need to dramatically reduce costs and increased capital requirements is driving the next big source of change: Mergers and Acquisitions.

Market consolidation


The pace of consolidation is mind-boggling: the annualized number of hospital acquisitions or mergers nearly doubled between 2009 and 2012. Plus, physicians are merging with health plans and hospitals; hospitals are merging with hospitals and long-term care providers; health insurers are investing in hospitals and physician practices. That's not to mention non-provider consolidation among biotech and pharmaceutical manufacturers, disease management companies and all along the healthcare supply chain. In an interview with The Huffington Post, Robert Laszewski, president of Health Policy and Strategy Associates, referred to the M&A climate in healthcare as an arms race in which the players are merging into bigger entities in hopes of restraining their own costs and grabbing larger shares of the markets.

According to PWC’s Healthcare Executive Agenda, consolidation is not a panacea, and even small healthcare mergers carry a lot of risk. We’ve seen the same thing in the merger-prone print industry: nearly two-thirds of deals do not meet pre-merger expectations. This lack of stellar success is not likely to stem the tide of mergers; however, it does present many opportunities for print service providers and industry consultants to make these newly consolidated entities more successful. Here are a few thoughts:

  • While individual providers and provider groups have low volumes of communications, larger merged entities have volumes that are more attractive for outsourcing.
  • Newly formed entities have redundant documents and systems that need to be unified or eliminated in order to gain the sought-after cost savings from the merger. Consultants and outsourcers can help meet those needs more quickly.
  • The ability to consolidate volumes, processes and technology allows outsourcers to deliver immediate savings from householding, postal optimization, white paper processing and electronic services such as electronic payment and presentment.

What needs to happen after a merger? There are plenty of situations where service providers can add value:

  • Determine brand strategy. Research demonstrates that capital markets respond more favorably to brand strategies that involve combining elements of the two companies than strategies that replace one entirely or leave both untouched. This requires an analysis of the strength of both companies.
  • In parallel with re-branding considerations, a business communications audit needs to be performed to identify the people, processes and technology used for generating business documents. This audit should generate documentation on current processes and recommendations for leveraging the best practices within each firm, the combined volumes of the merged firms and eliminating redundancies.
  • New branding (and likely new regulatory language) will need to be incorporated in the systems that are proposed to be maintained going forward. Efficient implementation will typically require an additional analysis and redesign step to create document standards and streamline implementation.

The newly merged company will likely also be evaluating their supply chain; eliminating vendors or “right-sizing” with vendors that fit their new status as a larger organization. The ability to support these firms with the analysis and streamlining processes makes it more likely that you will be considered for additional outsourcing opportunities rather than dropped from the vendor list.

While it would be prudent for these companies to go through a detailed pre-merger fit and synergy analysis from both a financial and a customer perspective, most often the customer and customer communications strategy is in that “warm Jell-O” mentioned earlier. The opportunity to help companies evaluate these issues pre-merger or immediately post-merger can be of huge benefit in achieving the hoped-for cost savings, and also in maintaining market share by communicating effectively with customers and making sure the bills get paid amidst the merger madness. Let’s call that preventative care.

The bottom line is that if constant change is the new normal for health care providers, there will be constant opportunities for companies who can help them deal with those changes.

Editor’s Note: Additional information on changes in the Health care industry is available from our sponsor, Canon Solutions America. See the PressGo! Industry Guide to Healthcare.


Elizabeth Gooding


Elizabeth Gooding is the president of Gooding Communications Group and the editor of the Insight Forums blog, helping clients in highly regulated industries, and the service providers they depend on, to optimize the designs, processes and production technology used for multi-channel communications.

Health Insurance – Change Brings Opportunities

Thursday, December 13th, 2012

It’s fair to say that the business model for health insurance is in the process of being completely redefined by the Patient Protection and Affordable Care Act (PPACA or ACA). Health insurers can expect to spend the bulk of 2013 getting ready for the new post-ACA marketplace. How far reaching are these changes? Well, they impact critical factors like:

  • Who insurers can sell to: individuals in addition to groups.
  • Who insurers must sell to: no ability to deny coverage for pre-existing conditions.
  • Where they sell their products: new Health Insurance Exchanges (HIE) in addition to the usual channels plus new retail branches.
  • How they can sell their products: products offered through exchanges must conform to one of 5 standardized options.
  • How they can price their products: they must devote 80% (in some cases 85%) of premiums to actual customer medical expenses leaving only 15% to 20% for all administration and overhead.
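The pricing constraint in that last bullet is the ACA's medical loss ratio (MLR) rule. As a simplified sketch (the actual regulation adds credibility adjustments and lets quality-improvement spending count toward the ratio), the basic 80%/85% rebate math looks like this:

```python
# Simplified medical loss ratio (MLR) check. The real ACA calculation
# includes adjustments for quality-improvement expenses, taxes and
# credibility; this sketch shows only the basic 80%/85% mechanics.

def mlr_rebate(premiums, medical_claims, required_mlr=0.80):
    """Return the rebate owed if the insurer's MLR falls below the floor."""
    mlr = medical_claims / premiums
    if mlr >= required_mlr:
        return 0.0
    # Rebate the shortfall between required and actual medical spend.
    return (required_mlr - mlr) * premiums

# Individual/small-group market: 80% floor; only 75% went to care,
# so 5% of premium must be rebated to members.
print(mlr_rebate(100_000_000, 75_000_000))
# Large-group market: 85% floor; 86% went to care, so no rebate is owed.
print(mlr_rebate(100_000_000, 86_000_000, required_mlr=0.85))
```

In other words, every administrative dollar, including the cost of producing member communications, now competes inside a capped 15-20% slice of premium, which is exactly why insurers are looking for efficiency help.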

In addition to the changes that are mandated by the plan, there are many changes that just naturally flow from adapting to a consumer-driven market. In 2011 approximately 50 million people – or about 16% of the US population – had no health insurance coverage or eligibility for government sponsored health programs. In 2014 approximately 60% of that population is expected to purchase private health insurance coverage – that’s about 30 million new customers. In addition, another 17 million customers may come on the books as states expand Medicaid eligibility to more low-income Americans since most states contract Medicaid coverage to private insurers.

Insurers are trying to turn their marketing and sales organizations into retail operations to tap the consumer market. Like retailers, they are trying to leverage data on their customer base to drive effective marketing and communications programs. Since, other than marketing Medicare supplement programs, most insurers have had little or no consumer marketing experience, they need help in this area. Compounding the problem, according to PWC, this new insurance market is made up of consumers who are likely to be less educated, and many will need material in a language other than English.

Since many of these new insurance consumers have never enrolled in a health plan before, they are likely to shop for health insurance the way that they would shop for any other major purchase like a home appliance or a car: by seeking out a familiar brand. To become top of mind before these people enter the market, insurers are investing in a wide array of advertising across TV, radio, web, print and billboards to build awareness. Direct mail, email and mobile marketing will only increase as new products become available and market data is refined.

But the retail transformation goes beyond branding: insurers are opening branches where consumers can learn about insurance options and buy on the spot. In May, Horizon BCBS announced that it would be opening a new retail center in New Jersey, and Blue Shield of California recently opened a “Blue Shield Store” inside of Lucky’s Supermarket in San Francisco. These are two of several retail storefronts in 5 or 6 states, with more to come in 2013.

These retail operations will naturally need to be staffed with knowledgeable people and supported with kiosks and other technology, but they will also need printed collateral, the ability to order and manage collateral across locations, and the kind of seasonal and tailored signage seen in the best branch banks and retail stores.

I’ve skimmed the issues affecting health insurers and haven’t even touched on the impact to health care providers, but I think you can see that this is a market in transition. And where there is transition, there is opportunity. It may be difficult to get the attention of insurance executives with everything on their plate; however, if you do get their attention and have solutions to help them market more effectively and efficiently to consumers while driving down the costs of servicing their insured members, you could be busy for years!


Elizabeth Gooding is the President of Gooding Communications Group and the Editor of the Insight Forums blog. She covers key issues affecting business communications in highly regulated industries.




Editor’s Note: White papers and podcasts on the impact of the ACA on business communications are available on Océ PressGo!, a business development program for Océ customers.



Encyclopedia Britannica Ceases Print Edition After 244 Years

Friday, March 16th, 2012

The Encyclopedia Britannica made headlines earlier this week when it announced that it was “stopping the presses” and ceasing publication of its print edition after a strong 244-year run. From a business standpoint, one can understand why this inevitably needed to happen: Encyclopedia Britannica Inc. has sold just 8,000 sets of its latest 32-volume, $1,395 print edition released in 2010, with another 4,000 sitting in a warehouse waiting to be ordered. When the last set is shipped, that will be that. Sales of Britannica’s print edition peaked around 1990 at 120,000 sets, with significant decreases in volume through the 1990s and into the 2000s. For the company itself, the print edition represented only a small portion of revenue, with the majority derived from selling curriculum products to schools, as well as online subscriptions and other digital versions of its content.

In my view, this move is not revolutionary, but it is certainly evolutionary. It serves as a reflection point on multiple fronts, including the transformation occurring in the publishing industry and in education; it also highlights the true impact that the Internet and digital media continue to have in the way we learn, work, and play.

Is the sunsetting of Encyclopedia Britannica’s printed set just another death knell for the demise of the printed book or other printed publications? No… BUT… it does serve as a reminder that it is imperative for publishers to have a digital media strategy. Luckily for Encyclopedia Britannica, the company has been working to publish its vast repository of the world’s facts and figures to digital channels since the 1980s. It released the first CD-ROM (remember those?) of Britannica in 1989. It put its collection online in 1994, seven years before Jimmy Wales launched Wikipedia in 2001.

Encyclopedia Britannica was actually ahead of its time in its digital publishing efforts, and ensured that it built up a strong digital business before deciding to end its print edition. The company reports having 500,000 subscribers to its $69.95/year premium Britannica Online service, which users can access via the Web and also through its iPad application. Think about that: what was once a 129-pound set of books now fits on a device of just over 1 pound… and it’s searchable, browsable, interactive, and constantly updated.

Some are of the opinion that more searchable and hyperlinked content, while efficient, takes away some of the serendipitous nature of perusing a printed encyclopedia or other printed publications. Apparently those people have never gone on a Wikipedia bender, letting the hours melt away while clicking through dozens (or hundreds) of interconnected articles. Of course, there is definitely something about looking through a tome like Encyclopedia Britannica that is hard to replicate in the digital world, but the reality is that in today’s world, efficiency is paramount. Furthermore, I believe that information is power, and limiting that type of high-quality, trusted reference information to the confines of a fixed-length format is, in the end, inhibitive.

Another thing this news made me really reflect on is the impact of technology on education. While print is going to continue to play an important role in education well into the future, digital media can be used in conjunction or even on its own to more effectively help students learn new concepts and expand their knowledge. A lighthouse example of how digital media can be used as an effective teaching tool is Khan Academy, whose mission is “to provide a free world-class education to anyone anywhere.” Now that is revolutionary.

Through short, instructive video lessons often taught by the site’s founder, Sal Khan, students can work their way from the basics of a particular subject all the way through to the most complex applications. While the information is freely available online, the not-for-profit is piloting programs in 23 schools with its math curriculum, where the video lessons are their primary instructor and teachers are used in more of a support role. Students’ progress is tied back to analytics that help pinpoint where they are having problems and in what subject. Sal Khan and his team may have cracked the code for how to effectively use the Web and digital media to enhance learning.

In the 60 Minutes piece on Khan Academy from this past weekend, Sal Khan was asked how he approaches learning about a topic he is going to create a video for. His answer? Textbooks. “If I’m doing something that I haven’t visited for a long time, you know, since high school I’ll go buy five textbooks in it. And I’ll try to read every textbook,” says Khan. He, of course, also uses the Internet. Clearly there is still value in trustworthy, authoritative reference information, and print is a symbol of that trust. Digital media, however, is becoming just as trustworthy, and its use along with other technology can help optimize the learning experience like never before.

What do you think? Are you lamenting the loss of Encyclopedia Britannica’s print edition or is it inconsequential?

Photo Publishing Opportunities with Instagram

Monday, February 27th, 2012

For the past few years, many companies have attempted to find opportunities for the print publishing of social online content. Content-rich social networks like Twitter and Facebook have Application Programming Interfaces (APIs) that enable third parties, with user approval through social sign-in, to pull information and graphics from a user’s account into their service. As more consumers centralize their photos around a handful of online services, particularly in the social media realm, it becomes increasingly important for photo publishers to offer integration with these sites. How much does social matter? According to an InfoTrends study from 2010, close to 60% of consumers that upload photos to the Internet reported using Facebook most often to accomplish this task (up from just 30% in 2009, and continuing on a growth trajectory).

While networks like Facebook are critically important due to the sheer size of their user base (i.e., potential opportunity), niche social networks that revolve around photo sharing are making a big splash. Specifically, photo sharing network Instagram has garnered a lot of attention recently. If you’re not familiar with Instagram, it is an iOS app that lets users take photos, apply different filters to those photos, and share them with other Instagram friends or with other networks like Facebook and Twitter. Social sign-on is a tenet of Instagram’s popularity: users connect their existing social network accounts to Instagram to find friends, share photos, and build a following.

Instagram has experienced incredible growth over the past year-plus. At the end of 2010, the network had around 100,000 users; at the end of 2011, it surpassed 15 million users… not bad for a start-up with 10 employees. Social integration, filter types, and the quality of iOS device cameras (ranging from 5 megapixels to 8 megapixels, depending on the model) are all success factors for Instagram. That quality level is also important for photo publishers looking to tap into the Instagram opportunity.

Around this time last year, Instagram launched an API for developers looking for new and interesting ways to tap into the photo sharing network’s content. One result of this launch was the proliferation of a number of tools and services that enable Instagram users to print their photos in a variety of different formats. All users have to do to use most of these services is log in with their Instagram account (again, the power of social sign-on); depending on the service, different methods will be provided for selecting and printing your photos. A number of existing services include:

  • Postagram: This service, which is available through iOS and Android mobile apps, lets users send personalized print postcards with a photo of their choice to friends and family. While there are many similar services available (including Apple’s own “Cards” app), what’s unique is that the photo area is die-cut, enabling photos to be popped out of the postcard and posted elsewhere. While Postagram takes its name from Instagram, the app also lets users send postcards including their Facebook photos, as well as photos residing on their mobile device.
  • Blurb: As many of you may know, Blurb is a prominent photo publisher, primarily with its photo book offerings. To complement its existing services, Blurb launched its Instagram integration in July last year, providing templates that easily let people turn their Instagram albums into long-lasting physical keepsakes.
  • CanvasPop: CanvasPop specializes in providing online services for canvas printing, and its Instagram integration was activated just before Christmas last year, enabling people to order 12″ x 12″ or 20″ x 20″ canvas prints of their Instagram photos.
  • Stickygram: A project birthed from digital ad agency MintDigital, Stickygram provides an interface for people to order a pack of nine magnets featuring different Instagram photos for just $14.99, and also lets users buy them as gifts for others. This company markets its product particularly well, with lots of different promotions and a strong social media presence.
  • Other Instagram-inspired printing services include Instagoodies (1″ stickers), Instamaker (photo merchandise), (various photo products), and Casetagram (iPhone cases).
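For developers sizing up this kind of integration, a minimal sketch of the post-sign-in step might look like the following. The field names mirror the shape of Instagram's v1 media responses as documented at the time (`data` → `images` → `standard_resolution`), but treat the structure, the pixel threshold, and the sample payload as illustrative assumptions rather than a definitive client:

```python
# Sketch: once OAuth sign-in has yielded an access token and a media
# request has returned JSON, pick out photos large enough to print.
# Field names follow Instagram's v1 response format at the time of
# writing; the 612px threshold is an illustrative assumption.

MIN_PRINT_WIDTH = 612  # px; a plausible floor for small-format prints

def printable_photos(api_response, min_width=MIN_PRINT_WIDTH):
    """Extract image URLs large enough to print from a response dict."""
    urls = []
    for media in api_response.get("data", []):
        image = media.get("images", {}).get("standard_resolution", {})
        if image.get("width", 0) >= min_width:
            urls.append(image["url"])
    return urls

# A trimmed-down stand-in for the JSON a media endpoint returns:
sample = {"data": [
    {"images": {"standard_resolution":
        {"url": "http://example.com/a.jpg", "width": 612, "height": 612}}},
    {"images": {"standard_resolution":
        {"url": "http://example.com/b.jpg", "width": 306, "height": 306}}},
]}

print(printable_photos(sample))  # only the larger photo qualifies
```

Filtering on resolution up front matters for print services in particular, since a photo that looks fine on a phone screen may not survive enlargement to a canvas or postcard.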

It is important to note that integrating with the API is just the first step; in other words, if you build it, they will not necessarily come. The aforementioned companies all do a fair amount of marketing, especially on social networks and through daily deal sites like Groupon. Additionally, these sites are dealing with user-generated content, and even though most people use Instagram to take and share their own photos, they can also post third-party, non-original content, which could run afoul of copyright laws. These types of factors need to be considered before embarking on your quest to capture print volume from Instagram.

Clearly, there is a lot of interest and potential opportunity in leveraging Instagram (and other online social content) to drive photo publishing services. That opportunity will likely increase dramatically, as Instagram plans to release an Android version of its app at some point this year (date still TBA); this move will bring millions more users to the photo sharing network, especially considering Android’s market share dominance in the smartphone space.

The ultimate point is that today, Internet start-ups can gain user traction extremely rapidly (just look at Pinterest… that topic is for another blog, though). Additionally, to promote growth, these services inevitably offer sharing and integration capabilities, providing opportunities for third-parties to utilize content in new and interesting ways. Building useful services around these new platforms, especially around photo publishing and on-demand printing, can open the door to new customers and more volume.

Understanding Different Applications for Personalization

Tuesday, January 24th, 2012

“Personalization” continues to be a prominent topic in a number of different circles: marketing, publishing, eCommerce, social networking, and search. It’s no wonder why: personalization helps boost response rates and profitability in cross-media campaigns, helps marketers drive conversion on their Websites & landing pages, and much more.

Wikipedia provides a very broad definition of personalization, which I do like: “using technology to accommodate the differences between individuals.” Specific to the groups that I am referring to, I believe that personalization can be more precisely defined as leveraging data to deliver relevant content to specific individuals.

That’s still pretty broad: what kind of data? What kind of content? What channels are being used? With this many constituencies looking to use personalization in their own ways to meet specific goals, the answers can range extensively. Furthermore, when these groups end up talking to each other about personalization, it can cause confusion and miscommunication. To clear the air, so to speak, I wanted to shed some light on the different ways personalization is being employed by these different groups.

  • Cross-media Direct Marketing: You’re likely familiar with the personalization model for cross-media campaigns: a digitally-printed direct mail piece (or e-mail) with variable text and graphic elements and a personalized URL, which links to a personalized microsite with variable text and graphic elements, often highlighting the recipient’s name in some way. Personal and demographic data is primarily used to drive the personalization in these applications. Depending on the client/campaign, additional data may be used for more granular, relevant content.
  • Digital Marketing: Personalization is popular with digital marketers. E-mail is a popular spot for personalization: according to a 2011 study by marketing technology provider Alterian, 72% of marketing professionals surveyed reported using personalization for their e-mail campaigns. E-mail marketing complexity ranges from mass blasts to segmentation to real-time individualization, typically using customer data and purchase history data to make recommendations. Another prominent personalization tactic for marketers is retargeting, which involves serving ads to a specific user after they have left a Website in efforts to raise brand awareness, recapture their attention, and drive people back to their Website.
  • eCommerce: Business-to-consumer eCommerce was and still is a center of innovation in Web personalization, driven by and other eTailers looking to provide a custom-tailored experience for each individual user in hopes of getting them to buy more. For these sites, personalization often comes in the form of a recommendation engine, which tracks your browsing habits, shopping cart, wish list, reviews, purchase history, and other facets to deliver personalized recommendations on what the system thinks you would like. It should be noted that digital marketing goes hand-in-hand with eCommerce; real-time individualized e-mail marketing is common for eCommerce companies, and retargeting helps bring back shoppers that left the conversion funnel.
  • Publishing: For print publishing, personalization often means mass customization, specifically in the print-on-demand model for books, where eCommerce orders trigger specific books to be printed, often in one-off fashion. Services like MagCloud and Time Inc’s Mine Magazine endeavor represent personalization efforts for magazines. On the Web and in digital media, personalization is geared more toward delivering relevant content based on an individual’s specific interests or preferences. Sometimes meeting this objective requires readers to input specific information about their tastes; other times, information like a Twitter, Facebook, or Google Reader account may be analyzed to assess your interests and deliver content based on who you’re friends with, who you follow, or what news you already read. A great example of this method is exhibited through Zite, a “personalized digital magazine” mobile app.
  • Social Networking: Social networks are rife with different types of individuals’ data, making them ideal for personalization. Social networks typically employ personalization to deliver relevant content feeds from a user’s friends or connections on a network, as well as to deliver highly-targeted display advertising. For content delivery, networks may use algorithms to interpret connections, interactions, and profile information among users and deliver content based on what it believes is most relevant to each user. For advertising, networks typically act as a facilitator between advertisers and users, presenting key profile characteristics of users that advertisers can choose to target. Facebook generated over $3.5 billion in revenue through this type of advertising.
  • Search: Search engines have always utilized algorithms to determine the display results of a user’s query, but these algorithms have recently started to take user information, such as profile or location data, into consideration before displaying results. Just recently, Google stepped up its game in this area, launching “Search, plus Your World,” which integrates a user’s Google+ data into everyday search queries. Advertising is a critical component of search, and generated over $35 billion in revenue worldwide for Google in 2011. Up until now, most search ads have been delivered based on the content of users’ search queries, but location information and even personal information are starting to be used to deliver more targeted search ads to users.

At its core, all that is needed to enable personalization is data, content, and a mechanism to have one drive the other. As has been covered, applying personalization to different use cases has a substantial impact on the type of data being used, the content that is being tied to that data, and the types of delivery mechanisms that enable that personalization. Understanding these differences and requirements for each application can help different stakeholders communicate more effectively when pursuing personalization, as well as open the door to new opportunities.
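The data–content–mechanism triad described above can be illustrated in a few lines of code. The following is a minimal, hypothetical sketch in Python (the articles, tags, and interests are all made up for illustration): a user's interest tags serve as the data, a small catalog of articles serves as the content, and a simple tag-overlap score serves as the mechanism that ties the two together.

```python
# Toy sketch of the data + content + mechanism model of personalization.
# All titles, tags, and interests here are hypothetical examples.

def personalize(user_interests, articles, top_n=2):
    """Rank articles by how many tags overlap with the user's interests."""
    def relevance(article):
        return len(set(article["tags"]) & set(user_interests))
    # Sort by overlap score, highest first
    ranked = sorted(articles, key=relevance, reverse=True)
    # Keep only articles with at least one matching tag
    return [a["title"] for a in ranked if relevance(a) > 0][:top_n]

articles = [
    {"title": "HTML5 for Publishers", "tags": ["publishing", "web"]},
    {"title": "Print-on-Demand Economics", "tags": ["publishing", "print"]},
    {"title": "Social Ad Targeting", "tags": ["social", "advertising"]},
]

print(personalize(["web", "publishing"], articles))
# → ['HTML5 for Publishers', 'Print-on-Demand Economics']
```

Real systems replace the hand-entered interest tags with inferred data (social graph analysis, reading history) and the overlap score with far more sophisticated ranking models, but the basic shape, data driving the selection of content through a scoring mechanism, is the same.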

Adobe Refocuses on Digital Media, Digital Marketing

Friday, December 9th, 2011

Adobe has been making waves with its series of acquisitions over the past few years, including Web analytics provider Omniture and content management provider Day Software. More recently, Adobe acquired web font specialist Typekit, electronic signature provider EchoSign, and video enhancement software provider Iridas Technology.

At a financial analyst briefing in November, Adobe made a number of announcements about what it is doing with those acquisitions and, more broadly, the direction in which the company is headed. Most of the news coverage in the tech community surrounding this briefing focused on Adobe's intention to stop any future development of its Flash for mobile platform. Instead, the company is opting to focus on leveraging HTML5 and other standard Web technologies in the mobile arena. Adobe is also putting more emphasis on these technologies in general, as showcased by some of the concept products it has released for testing, including Muse (aimed at helping users design and publish HTML websites without the need to write code) and Edge (an application that is meant to help people create animated Web content using HTML5, CSS3, and JavaScript).

Some noted the scaling back of Flash as a posthumous win for the late Apple CEO Steve Jobs, who was adamantly opposed to putting Flash on Apple’s iDevices because of what he felt were flaws that made Flash inferior in the mobile realm. What was substantially under-reported in the tech world was Adobe’s clear shift in direction, as highlighted by a reorganization that re-targets the company to focus on two main areas: digital media and digital marketing.

The company is also pushing its users to move from a perpetual licensing model of buying and upgrading its Creative Suite product line to a cloud-based subscription pricing model that lets users pay for access to Creative Suite tools on a monthly basis. To do this, Adobe has developed the Creative Cloud, a Web-based community and portal for users to manage their Creative Suite applications and connect with other creative professionals. While the company will continue to sell perpetual licenses in the near future, it has very clear plans to migrate all Creative Suite users to the Creative Cloud over time.

Complementing the Creative Cloud on the digital media side is Adobe's cloud-based Digital Marketing Suite, which encompasses the company's solutions for digital marketing, including Web and social analytics, content management, digital asset management, eCommerce, display advertising, e-mail marketing, and customer relationship management. Adobe's goal is to provide a suite of solutions for marketing professionals that can help them compete effectively in the online channel.

Furthermore, Adobe is shifting its business strategy from simply being a technology provider to being a company that also provides services to help businesses with things like content monetization. In this sense, Adobe's transformation pushes it closer to competing with some of its customers and partners; it will be interesting to see how this plays out in the near future. Just weeks after its financial analyst briefing, the company announced the acquisition of Efficient Frontier, a provider of digital ad buying and performance management solutions. This acquisition is further proof that Adobe is intent not just on providing tools to create content, but also on providing solutions to help its customers monetize the content they are creating.

All in all, Adobe's changes are much more substantial than no longer developing Flash for mobile; the company is totally revamping its strategy to focus on digital media and digital marketing, and expanding its scope to offer companies help with content monetization. As an unfortunate by-product of this reorganization, Adobe is also laying off about 750 people, or around 7% of its workforce. Layoffs aside, the company is, of course, painting a compelling future for itself, as well as for digital media and marketing in general. With the marketing and media landscapes still undergoing a high degree of transformation, it may not be a bad bet.

What do you think of Adobe’s recent moves? Can it refocus its business while maintaining trust and good relationships with its long-standing customer base? Have you already moved from a perpetual licensing model to a monthly subscription via the Creative Cloud? We’d love to hear your thoughts.

Exploring Opportunities with Small and Medium Businesses

Tuesday, November 1st, 2011

While many companies compete to do business with large companies that can deliver sizeable long-term contracts for print and marketing services, a trend has emerged over the past few years of targeting small and medium sized businesses across a wide range of vertical markets. Many point to providers like Vistaprint on the print marketing side and Constant Contact on the digital marketing side as paving the way for the so-called Long Tail of services for SMBs.

Indeed, by offering self-service tools and a broad array of vertically-focused templates, these companies have grown tremendously, and their services are used by millions of businesses worldwide to do everything from buying business cards and managing e-mail newsletters to launching full-blown direct marketing campaigns. While we often talk in terms of business-to-business (B2B) and business-to-consumer (B2C) models, these types of services have blurred those lines by making businesses buy more like consumers, while keeping the systems open enough to even attract consumer users.

Why are these services so popular and still growing? According to the Small Business Administration, there are over 27 million small and medium businesses in the United States, accounting for between 60% and 80% of all U.S. jobs. SMBs are typically characterized as establishments with fewer than 500 full-time employees. Thus, the market opportunity is tremendous, even if you are only able to reach a fraction of the SMBs in the country. The power and flexibility of the Web, and in the case of Vistaprint and other online print businesses, the power of a highly-automated production environment, have enabled companies to service large volumes of small orders, something which is becoming more common even in larger organizations.

InfoTrends saw this trend becoming prominent and, in 2009, conducted an in-depth study on the topic entitled Capturing the SMB Business Communication Services Opportunity, which surveyed over 2,000 small and medium businesses across 13 major vertical markets to understand how these companies were utilizing some of the very services just mentioned. We found that, just like larger companies, SMBs were diversifying their marketing mix, with traditional media still being an important component, but also heavily emphasizing the use of the Web and e-mail to reach their target audiences. Social media was also increasing in importance. At the time, 32.1% of SMBs indicated using Facebook to promote their businesses, while 16.9% indicated using Twitter to do the same. We hypothesize that these numbers have increased substantially in just the last few years.

Furthermore, we found that SMBs had a preference for a "one stop shop" type of experience for printing needs, and we feel that preference translates into marketing services as well. There are a number of vertically-focused services and service providers on the market that cater to a specific set of small and medium businesses. For instance, GuestEngine and Fishbowl provide turnkey marketing services and tools to restaurant owners. Demandforce originally focused on automotive services and dentists, but has expanded its marketing platform to personal services and other healthcare specialists. SharperAgent provides self-service, cross-media marketing campaigns to independent real estate agents, and was recently acquired by real estate software developer Market Leader.

These types of vertically-focused platforms are the next step in the evolution of SMB marketing services. InfoTrends is referring to them as SMB marketing automation services, as many of them aim to automate various aspects of the marketing process for companies while tailoring the services to meet the intricate needs of a particular market. Focusing on one vertical market, or a particular set of them, also makes applications more replicable. We are currently conducting a follow-up to our 2009 research on this topic with a study entitled Capturing the SMB Marketing Automation Opportunity, which should glean valuable insight into how this market continues to evolve.

As is often said, small and medium businesses are the backbone of the U.S. economy, and they want to succeed and be effective in their marketing just as much as a large enterprise corporation does. Through the power of the Web, many SMBs now have the tools to market smarter. Nevertheless, there is always room for improvement, and plenty of opportunity exists for service providers that want to take on the task of making print and marketing services more effective for a particular market.