Print Verification

From ColorWiki


Are Your Proofs Bona Fide?

Reserved Article

This page is a Reserved Article. For more details see Reserved ColorWiki Articles.

ColorNews

This reserved article originally appeared in CHROMiX ColorNews Issue 48 on May 16th, 2012.

Click here to see this article in its original context.
Email colornews(at)chromix.com to subscribe to the ColorNews newsletter.

an article by Terry Wyse, CHROMiX partner

(All about print verification and what it may mean....or not).
With proofing being a large percentage of my color management business, the topic of proof verification comes up quite often. I thought an article covering proof and print verification, what it means... and what it DOESN'T NECESSARILY mean... would be valuable. This article is intended not only for those in print and proof production but also for those receiving proofs from outside vendors. If you're receiving a proof and it has some sort of "pass/fail" label on it, it's critical to know EXACTLY what that pass/fail label means to you as the person accepting the proof.

I'll break things down into three categories of verification:

  • Proof verification (external)
  • Proof verification (internal)
  • Calibration verification


Proof Verification (external)

Let's talk about "external" proof verification first since this is probably the easiest one to define and likely what most people understand the term "proof verification" to mean when someone is handing them a verified proof.

Generally, a "verified proof" is meant to convey to the party receiving the proof that this proof has passed some sort of quality assurance test (measured with a spectrophotometer such as an i1Pro) with the implication that the proof you received was compared to some "standard" and was deemed acceptable to within some tolerance, usually a "delta e" tolerance (I'll refer to "delta e" as simply "dE" from now on). What's usually measured in this case is a small color bar that's included on the proof, usually something like the IDEAlliance 12647-7 Control Strip (see graphic) but other types of control strips can be used as well such as the FOGRA Media Wedge or similar. The point here is that the control strip should include, at minimum, both primary (CMYK) and secondary (RGB) colors plus tints. Most control strips will also include a selection of "memory colors" such as skin tones plus several steps of CMY neutrals and similar steps of K only....all-in-all a couple dozen patches of colors.

IDEAlliance 12647-7 Control Strip


The first critical question here is: WHAT standard is being used for the comparison? If the proofing is targeted to a standard print specification such as GRACoL, SWOP or one of the many FOGRA specifications, then the logical thing would be to compare the control strip colors against the standard colorimetric (L*a*b*) values specified by that standard. If you don't know what they are, you can visit the various "owners" of these specifications; usually they will publish exactly what the L*a*b* values should be for the control strip they support. If they don't, the values can usually be derived either from their standard characterization data set (ECI2002 or IT8.7/4 data sets) or via an ICC profile made from their standard data set (this is easy to do in ColorThink Pro via the Worksheet). It's generally understood in this scenario that you'll be using some form of "absolute colorimetric" rendering since you're comparing directly to an external standard which, by definition, would include the paper white point of that standard.
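
If you want to pull those reference values out of a characterization data set yourself, the data set is just a CGATS-format text file with L*a*b* columns. Here's a minimal sketch of the idea in Python; it assumes the usual CGATS field names (SAMPLE_ID, LAB_L, LAB_A, LAB_B), and the file name is hypothetical:

    def read_cgats_lab(path):
        """Pull per-patch L*a*b* values out of a CGATS characterization file.
        Assumes the usual SAMPLE_ID / LAB_L / LAB_A / LAB_B field names."""
        fields, rows = [], {}
        in_format = in_data = False
        with open(path) as f:
            for raw in f:
                line = raw.strip()
                if line == "BEGIN_DATA_FORMAT":
                    in_format = True
                elif line == "END_DATA_FORMAT":
                    in_format = False
                elif line == "BEGIN_DATA":
                    in_data = True
                elif line == "END_DATA":
                    in_data = False
                elif in_format:
                    fields.extend(line.split())
                elif in_data and line:
                    values = dict(zip(fields, line.split()))
                    rows[values["SAMPLE_ID"]] = (float(values["LAB_L"]),
                                                 float(values["LAB_A"]),
                                                 float(values["LAB_B"]))
        return rows

    # Hypothetical file name; substitute whichever characterization data set you use.
    reference_lab = read_cgats_lab("GRACoL2006_Coated1.txt")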

So the proof is printed, including the control strip, and then measured and compared, either in software made specifically for proof verification or by hand in an Excel spreadsheet (this is rather clumsy, and some of the dE formulae used for the comparison are not for the faint of heart or weak of stomach!). Once verified, typically a small adhesive "pass/fail" label is printed and affixed to the proof to show the person receiving the proof that it's "all good" and can be trusted to represent the final printed job, assuming the press run is targeted to that same printing specification (it doesn't do anyone any good if you produce the perfect GRACoL proof only to have the job printed via web offset on a #5 press stock... they won't match!).
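
Under the hood, the verification software isn't doing anything more exotic than this: compute a dE between each measured patch and its reference L*a*b* value, then check the results against your tolerances. A bare-bones sketch in Python (the patch names, reference/measured values, and tolerances are invented for illustration):

    from math import sqrt

    def de76(ref, meas):
        """CIE 1976 color difference between two (L*, a*, b*) triplets."""
        return sqrt(sum((r - m) ** 2 for r, m in zip(ref, meas)))

    # Invented reference and measured values for a few control-strip patches.
    reference = {"C 100": (55.0, -37.0, -50.0), "M 100": (48.0, 74.0, -3.0), "Paper": (95.0, 1.0, -4.0)}
    measured  = {"C 100": (55.8, -36.1, -49.2), "M 100": (47.5, 73.2, -2.6), "Paper": (95.3, 0.8, -3.5)}

    deltas = {name: de76(reference[name], measured[name]) for name in reference}
    avg_de = sum(deltas.values()) / len(deltas)
    max_de = max(deltas.values())

    # Example tolerances only: average dE and worst-patch dE both have to pass.
    ok = avg_de <= 2.0 and max_de <= 4.0
    print(f"avg dE76 {avg_de:.2f}, max dE76 {max_de:.2f} -> {'PASS' if ok else 'FAIL'}")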

Proof Verification (internal)

So far, so good... but let's say you're really not interested in comparing your proof to some "absolute" external standard and are more interested in proof consistency than in accuracy. If that's your goal (and it's equally valid in my opinion), then your "standard" becomes the proof that you agreed was a good visual match to your reference when the proofing system was first installed and profiled. I call this "internal" proof verification. The simulation in your proofing system could still be based on a specification such as GRACoL, but you simply want to verify against your proofing system's interpretation of that specification and monitor your proofing system's consistency.

Print your "stake in the ground" proof (include a control strip) that everybody agrees is a good proof and then measure the control strip, the same one you'll use later for verification. This measurement is established as the standard that all subsequent proofs will be compared against. Usually the verification software you're using will accommodate custom standards or at least will give you a way of manually entering the L*a*b* values from your "golden" proof into the software.

Calibration Verification

A third option is similar to the internal verification above. I call it calibration verification. In this case, instead of verifying against your proof's color-managed interpretation of a standard specification, you want to verify that your proofing system's calibration (ink limiting and linearization generally) is consistent. The major difference here is that you print a control strip with color management DISabled but with calibration (linearization) ENabled. Again, if you're primarily interested in consistency from proof-to-proof, this method has several advantages:

You're verifying proof consistency using the entire color gamut of the printer, not just a dumbed-down version that's been run through a profile conversion ("color-managed"). By testing/comparing using the entire color gamut of the printer, your proof verification should pick up on proof consistency issues sooner than a color-managed verification would (a color-managed verification could actually "mask" or hide proof consistency problems).

Since a printer's calibration is generally common for all color-managed conversions on that media, you only need to perform a single proof verification. On the other hand, if you verify to a proof standard and use several production proofing simulations (GRACoL, SWOP, uncoated, etc.), you may need to establish proof verification parameters for all those proofing simulations....even though they all use the same basic calibration parameters. "Calibration" verification eliminates the need for separate verification of each standard, assuming your production proof standards share the same media calibration.

Disadvantages of this method

Some proofing systems require that the same color management applied to the images also be applied to the control strip; only a few of the high-end systems allow you to include a control strip on the proof and print it without color management. This method is also possibly overly sensitive to proof inconsistency... you may be led to overreact to a calibration issue that never shows up visually on a proof.

No matter which of these methodologies you employ, you'll likely be setting a tolerance based on dE....but it's important to know WHAT dE formula is appropriate (there are several) for these different scenarios. Since a thorough discussion of the different dE calculation methods is really beyond the scope of this article, I'll focus on the methods most commonly employed and how they differ.

Delta E

The first and most common dE calculation is called dE 1976, or simply dE76 for short. dE76 is a very simple calculation: it just gives you the mathematical distance between one color and another in L*a*b* colorimetry, dE76 = √((ΔL*)² + (Δa*)² + (Δb*)²). It's important to note that the mathematical difference between two colors is not the same as the visual difference. In other words, a dE76 difference of "2" between two yellow colors would not elicit the same visual response as that same difference between two blue colors... you would likely see a visual difference between the two blues but perhaps not perceive any difference at all between the two yellows, even though they both had the same dE76 color difference.
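
To make the yellow-versus-blue point concrete: in the little Python example below, the two pairs differ by exactly the same amounts in L*, a* and b*, so dE76 reports the same number for both, even though the blue pair would look noticeably more different (the L*a*b* values are invented for illustration):

    from math import sqrt

    def de76(c1, c2):
        return sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

    yellow_pair = ((90.0, -5.0, 90.0), (91.0, -4.0, 91.5))    # two similar yellows
    blue_pair   = ((30.0, 20.0, -55.0), (31.0, 21.0, -53.5))  # two similar blues

    print(de76(*yellow_pair))   # about 2.06
    print(de76(*blue_pair))     # also about 2.06, yet visually a bigger difference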

Fortunately, we have another dE calculation that is more relevant to visual differences, not just mathematical ones. That formula is called dE 2000 or simply dE00. The beauty of this formula is that it more accurately accounts for how we humans respond visually to various colors across the spectrum. With dE00, a difference of, say, "2" elicits roughly the same visual response no matter what color pairs you're comparing. In colorimetric terms, dE00 is more "sensitive" to hue and lightness differences as opposed to chroma or "saturation" differences. These explanations oversimplify the differences between these two dE calculation methods but I think you get the idea.
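
The dE00 formula itself is far too hairy to work out by hand, so in practice you let software or a library handle it. A quick sketch, assuming the third-party Python colour-science package (an assumption for this sketch, not something prescribed here), using the same invented blue pair as above:

    import numpy as np
    import colour  # third-party "colour-science" package (assumed available)

    lab_ref  = np.array([30.0, 20.0, -55.0])
    lab_meas = np.array([31.0, 21.0, -53.5])

    de76 = colour.delta_E(lab_ref, lab_meas, method="CIE 1976")
    de00 = colour.delta_E(lab_ref, lab_meas, method="CIE 2000")
    print(f"dE76 {de76:.2f}  dE00 {de00:.2f}")  # the two formulas will generally disagree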

For more information on deltaE, see ColorNews #17: The Color Difference

What's important here is that when exchanging "verified" proofs and using dE criteria as the tolerance, you need to know WHAT dE calculation is being employed. From the discussion above, it would seem obvious that if we're interested in comparing visual differences between proofs and press sheets, dE00 would make the most sense. Wish it were that simple! Unfortunately, there's a long history of using dE76 as the color difference metric, both for legacy reasons and because it's the simpler formula to use. If you see dE tolerances in print standards and specifications, you can almost be assured that they are using dE76 as the calculation method.

Which dE to use?

Here's where I come down on what dE method should be used for these different types of proof verification:

If you're using the "external" proof verification method and you're being asked to use the same dE tolerances as established by the standards bodies (IDEAlliance, Fogra), then you're pretty much stuck with using the "straight" dE76 method. But as the standards bodies transition from dE76 to more modern "visual" difference calculations such as dE00, by all means use these newer methods instead. If it's an "internal" or closed proof verification you're doing, I would suggest you use dE00 since this will alert you to any real visual differences that may be happening.

For "calibration" verification, I would likely stick to using dE76 since this tends to be the more sensitive method......I would rather have my calibration verification routine alert me to color problem before it actually becomes an issue on my production proofs. The dE76 method may be overly-sensitive where visual differences are concerned but when it comes to checking calibration, I would want to run a fairly tight ship. If you feel it's too sensitive, then simply adjust your dE tolerance criteria upwards a bit to give yourself a bit more calibration slack.

Summary

If you take only one thing away from all this verification and dE talk, it's that you need to communicate with your proof provider: 1) insist on some sort of proof verification, and 2) understand exactly how they are verifying the proofs you receive, what standard (if any) they're being compared against, and what dE method they are employing. It would also behoove you to be able to verify the proofs you're receiving for yourself. The investment in hardware and software to do this yourself is minimal. CHROMiX plug: Talk to someone at CHROMiX about their excellent Maxwell system. With Maxwell, not only can you verify and share your proof/press verification data, the cost of entry is extremely low compared to other stand-alone solutions. I've started using it myself with some of my customers to monitor their proofing systems remotely and it's been extremely helpful.


Terry Wyse




[Editor's note: Terry Wyse is a well-known and recognized industry expert in color management. Terry is a G7 Certified Expert and provides press profiling and press optimization services. He also provides knowledge and services for pre-press, proofing and other related areas. Terry is familiar with a wide range of products, too many to list here. Finally, Terry is a valued partner of CHROMiX and is much appreciated.]
