Editor’s Note: Nicholas Barzelay is a recent graduate of the RIT School of Print Media’s graduate program. Mr. Barzelay’s research area was variable data printing. He is the co-author of Upstream Database and Digital Asset Management in Variable Data Printing (Executive Summary available here). Mr. Barzelay is working on two books on the subjects of data management and variable data printing. He is sharing some of his work to get industry feedback.
Insights on VDP
By Nicholas Barzelay
What we now call “transactional” variable data printing (VDP) has its origins in the earliest accounting and financial programs: Accounts Receivable, Invoicing and Billing, Accounts Payable, Demand Deposit Accounting, Financial Asset Reporting, and Explanation of Benefits for insurance.
Initially, these applications ran on mainframe computers. Later they were modified to run on minicomputers, and eventually they were re-architected as multi-tier, enterprise-scale client-server applications.
Early VDP was generally output on chain impact printers. The addressing and transactional data were printed on continuous-feed preprinted forms. Identification cards were printed in a similar manner on heavier card stock. Background designs for print stock were preprinted on an offset press.
Digital printers ultimately replaced impact printing. It is interesting to note, however, that from a programmatic standpoint the migration from impact to digital printing was accomplished by changing out the printing hardware, with little effect on application program code. It was not until the advent of desktop publishing that VDP began to evolve into a more design-intensive graphic product.
Infrastructure associated with early VDP (transactional printing) was complex, but it did not present the integration problems of current system infrastructure. Processing was centralized on a mainframe computer running a single standard operating system and integrated application packages certified for that operating system. Standard programming languages and data management applications offered limited choices, but they also reduced the required skill sets to a few.
CRTs (cathode ray tube terminals) connected directly to the mainframe and provided online viewing of data. Printers, also directly connected to the mainframe, provided printed output. Integration issues were localized to the mainframe and managed by system programmers who worked at the operating system level.
Infrastructure & Applications Now
Client-server system environments have decentralized processing in a number of ways that complicate system integration. In the distributed processing environment of client-server, different physical processing units may be running different operating systems and application systems that may not readily talk to each other; the units are only loosely coupled via network connections.
This means that keeping the entire system running requires more than a single set of tools and skills. It can easily translate into additional staff: people who know how to manage the network, support the varying hardware, and run the applications. Application knowledge is similarly diverse and specialized: database management, digital asset management, and graphic design software, to name just a few.
An installation’s applications must be integrated so they can share one another’s data and content. Those applications have become more complex than the original transactional applications related to accounting. Beyond the inclusion of digital imagery, many VDP solutions require matching and merging disparate data from different sources, drawing conclusions from that data, and developing business logic to meet solution objectives. Solution objectives extend well beyond accounting requirements.
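To make the matching-and-merging step concrete, here is a minimal sketch in Python. The file names (crm.csv, orders.csv), the field names, and the discount rule are hypothetical assumptions chosen for illustration, not a prescribed design; real sources might be databases or exports from a digital asset manager.

    # Match and merge two hypothetical CSV exports on a shared key,
    # then apply a simple business rule to each merged record.
    import csv

    def load_by_key(path, key):
        """Index a CSV file's rows by the given key column."""
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    crm = load_by_key("crm.csv", "customer_id")       # customer_id, name, segment
    orders = load_by_key("orders.csv", "customer_id") # customer_id, last_order_days, total_spend

    merged = []
    for cid, person in crm.items():
        activity = orders.get(cid)
        if activity is None:
            continue  # unmatched records need an explicit handling policy
        record = {**person, **activity}
        # Illustrative business rule: flag lapsed, high-value customers
        # for a discount offer in the printed piece.
        record["offer_discount"] = (
            int(activity["last_order_days"]) > 90
            and float(activity["total_spend"]) > 500.0
        )
        merged.append(record)

The point is not the particular rule but that a few lines of ordinary code can join disparate sources and embed business logic without a heavyweight tool.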
Those objectives include maintaining customer relationships, marketing to new customers, applying pricing and discount models, moving aging inventory, and a host of other information communications. These must be accomplished in a highly competitive environment that ultimately demands the measurement of cost-effectiveness in order to obtain optimal results. The most successful VDP solutions are closed loop: results are measured and fed back to improve the next run.
Therefore, simplification is a necessity. Simplifying does not mean scaling back VDP solution results (an all-too-common approach). Rather, simplification means removing complexity from the workflow and integration path, beginning with data and content storage and progressing through data stream generation, document creation, and production.
The best way to simplify is to identify basic processes and tools that are understandable and easy to use. This may mean temporarily abandoning applications with all the “bells and whistles” in favor of a few small components that are easy to understand and easy to connect together (integrate), as the sketch below suggests.
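As an example of what a few small components connected together can look like, the following Python fragment walks the stages named above: stored records in, a data stream, document creation from a template, and a stand-in production step. The sample records, the template text, and the output files are hypothetical; a real workflow would hand finished pieces to a composition engine or print stream rather than plain text files.

    # Three small, easy-to-understand components wired into a VDP workflow.
    from string import Template

    # Storage: hypothetical merged records, as a matching step like the
    # one sketched earlier might produce.
    records = [
        {"customer_id": "1001", "name": "A. Reader", "segment": "retail"},
        {"customer_id": "1002", "name": "B. Printer", "segment": "trade"},
    ]

    # Document creation: one template, variable fields filled per record.
    TEMPLATE = Template(
        "Dear $name,\n"
        "As a valued $segment customer, this offer was selected for you.\n"
    )

    def generate_documents(rows):
        """Data stream in, one finished document per record out."""
        for row in rows:
            yield row["customer_id"], TEMPLATE.substitute(row)

    # Production stand-in: write each rendered piece to its own file.
    for cid, doc in generate_documents(records):
        with open(f"doc_{cid}.txt", "w", encoding="utf-8") as out:
            out.write(doc)

Each component can be understood, tested, and replaced on its own, which is precisely what integrating a monolithic “bells and whistles” application rarely allows.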
A variety of very sophisticated VDP solutions can be created using a few fundamental tools. In this way a company may ultimately accomplish a greater variety of increasingly complex VDP solutions than it did with the more expensive tool sets, which may in time be relegated to producing “fancy” mail merge.
A related benefit of simplifying the workflow and the VDP development system lies in the required skill sets. It may be easier and less expensive to find people who can use the basic tools and then grow their expertise than to recruit people who walk in the door with fully matured skill sets for the most complex integration workflows. People with such mature skills are likely to be few in number and to command premium prices.