I've been using the 1280, and before that the Epson 1270, for some time now. I've gotten many beautiful prints; however, I have found that certain colors seem to be hard to reproduce accurately. The prints just don't match my color-calibrated monitor in certain colors; in particular, the greens and blues are often off. I've tried having custom profiles made, and this still failed to solve the problem. I read in a recent review that the R1800 uses an inkset better suited for RGB, whereas the 2200 series of printers uses inks better suited for CMYK prints. I print from the sRGB or Adobe RGB (1998) color spaces. Which is likely to give me better results? Is the problem with blues and greens a known issue with the 1270/1280 printers? Thanks
The CMYK ink palette has long been seen by the industry as having some inherent weaknesses due to the difficulty of making the ink colors pure. Apparently Epson felt there is a sufficiently large purely photographic market to justify altering the ink palette to include red and blue inks, and thereby achieve better reproduction fidelity of photographic images without re-engineering the entire ink technology. So the R800 and R1800 use essentially the same ink as the 2200 but, by substituting red for light magenta and blue for light cyan, are able to provide improved photographic reproduction, at the expense of making these printers less capable of producing simulated CMYK press-proof prints.
In the tests I did printing photographic image files, I found improved color matching, with particularly better reproduction of skin tones and foliage.
In other words, by making these two printers specialized for the advanced photographic market, Epson has improved color reproduction performance with two changes to the ink palette.
Since I am only interested in printing photos from the sRGB and Adobe RGB (1998) color spaces, it sounds like I would be better off with the R1800 series printers rather than the CMYK-type inkset printers.
Since I am reasonably happy with my 1280 my issue is whether the color accuracy of the newer R1800 is better and warrants my changing printers. I have had some problems getting accurate skin tones and foliage with the 1280 so it is good to hear that the R1800 should be better in this regard. You didn't comment on blues and cyans. I find that in certain images, the sky on my 1280 prints comes out rather strange looking. It is hard to get a truly accurate match with my monitor display. Did you notice any problems in your experience with the sky on the R1800? Getting accurate blues and cyan shades seems to be my biggest problem, even worse than skin tones and foliage....
The fact that the R1800 has a blue ink does seem to reduce the occasional problem of a sky printing differently from what you would expect. However, sometimes it is not the printer that fouls the color matching but the fact that the image data is actually outside the gamut of what your display can reproduce, not to mention your printer's. In other words, if your monitor's gamut is short relative to the data in the file representing the sky, then when that data is sent to the printer an unexpected result occurs, which may be a shortcoming not of the printer but of your monitor.
Interesting, I hadn't thought of that possibility. Is there any way to see if this is the case?
>>Is there any way to see if this is the case?<<
Not really in a practical sense unless you do what I do, which is work with many of these printer products as they come out over the years and accumulate print results from specific images that can be compared.
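One rough way to probe the out-of-gamut question numerically (this sketch is my illustration, not something from the exchange above): convert a color from a wide source space such as Adobe RGB (1998) into sRGB, a reasonable stand-in for a typical monitor gamut, and see whether the result falls outside the 0-1 range. The matrices are the standard published RGB-to-XYZ conversions for D65; treating the Adobe RGB transfer function as a plain 2.2 power is an approximation, but close enough for a go/no-go test.

```python
# Rough out-of-gamut check: does an Adobe RGB (1998) color survive
# conversion into the smaller sRGB gamut?

ADOBE_TO_XYZ = [
    (0.5767309, 0.1855540, 0.1881852),
    (0.2973769, 0.6273491, 0.0752741),
    (0.0270343, 0.0706872, 0.9911085),
]
XYZ_TO_SRGB = [
    (3.2404542, -1.5371385, -0.4985314),
    (-0.9692660, 1.8760108, 0.0415560),
    (0.0556434, -0.2040259, 1.0572252),
]

def _mat(m, v):
    """Multiply a 3x3 matrix (rows of tuples) by a 3-vector."""
    return [sum(row[i] * v[i] for i in range(3)) for row in m]

def out_of_srgb_gamut(r, g, b, tol=1e-4):
    """True if an Adobe RGB color (components in 0..1) falls outside sRGB."""
    linear = [c ** 2.2 for c in (r, g, b)]   # decode gamma (approximation)
    xyz = _mat(ADOBE_TO_XYZ, linear)
    srgb_linear = _mat(XYZ_TO_SRGB, xyz)
    return any(c < -tol or c > 1 + tol for c in srgb_linear)

print(out_of_srgb_gamut(0.0, 1.0, 0.0))   # saturated green: True
print(out_of_srgb_gamut(0.5, 0.5, 0.5))   # mid gray: False
```

A saturated Adobe RGB green lands well outside sRGB (a negative red component), which is exactly the kind of foliage or sky color that can print "off" through no fault of the printer.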
I am currently using a 1280, and while it does a very good job, I have noticed less than stellar greens. Being in the Northwest, well, it's mostly green here. I am considering either the R1800 or the R2400. I might rarely do a B&W, but very seldom. Taking the cost issue out of the decision, and based on the color/quality of photographic prints rather than press proofs, do you have a feeling as to which one to steer towards? I know it is a subjective decision, but I see a lot of posts about both.
Thanks for your time.
When I tested the R1800 I was very much impressed with the much improved rendering of both foliage and skin tones (I lived and worked in Seattle from '89 through '95). I was dubious, before I got the R2400, that it would do as well with skin tones and foliage, but once I did a thorough run of print testing I found there was so little difference between the printers as far as foliage and complexion reproduction that most people, I am sure, would not be able to detect any distinction in the prints. However, the K3 inks in the R2400 are capable of reproducing a somewhat higher print Dmax with some papers that can handle the ink load. So my choice, ignoring the cost difference, would be the R2400. But you should also take into account that I do a lot of B&W, and I cannot entirely ignore the fact that this influences my favoring the R2400. If I were printing only color, I am sure I would be quite satisfied with the R1800.
Sorry if I don't make the choice any easier.
There may be another cause for bad monitor-to-print matching. I know David Brooks won't agree with me, but I and many others believe that it is very important for the monitor and the lighting used to view your prints to match, not only in brightness but also in color temperature. Also, your lighting needs to be of high quality, without spikes in its color spectrum. If you want to study this issue more, I recommend you read my article on this subject at: www.solux.net/ies_files/Digital%20Darkroom%20Lighting.pdf
Technically what you said in your post is true as far as it goes. We don't really disagree except in the particulars of what "works" for a digital darkroom.
However, most of us make prints for a final destination of display in lighting environments which may be quite different from the ambient illumination in our digital darkroom work area.
Printer color profiles are the key to resolving "print matching" for a particular print display environment's ambient illumination. For most folks who display print images in the living space of their homes, the ambient illumination may be quite different from the specific "quality" illumination of your darkroom area. And usually it is not a constant ambient light condition in a home, changing during the course of a day from mostly indirect sunlight to whatever artificial light is used in the evening. That would require adjusting print color balance to an ideal compromise between what looks best during the day and at night. Of course, if the print is to be displayed in a gallery or some other specifically and constantly illuminated environment, that allows balancing the print to that ambient light color temperature specifically, which may be quite different from the parameters of your digital darkroom.
As I said, that can be accomplished through the printer/paper/ink color management profile used with your printer. Professional-level software for creating custom printer profiles (essential if you are using third-party printing papers) has the capability of adjusting the red/blue output color balance (color temperature) to accommodate just about any display illumination's ambient light color temperature.
Of course, having a light in your digital darkroom that matches the illumination of the display environment you are making prints for is an advantage, letting you immediately evaluate the printer output in or near your digital darkroom.
The color and tonality match in the digital darkroom is the most important one, because that is where you edit, finalize and print your images. If the monitor-to-print match in the digital darkroom is bad, then you lose the tight correlation between what you see on the screen and what your print looks like, right there where you do all of your editing. Any other consideration, including the final viewing environment, is secondary at best.
When an image looks good on your monitor and in print in a digital darkroom with high quality lighting, matched for intensity and color temperature with your monitor, then there is no need to tweak it for different viewing environments. This is because of an important feature of our visual system, color constancy and I quote from the Real World Color Management book: "Color constancy... is the tendency to perceive objects as having a constant color, even if the lighting conditions change. In other words, even if the wavelength composition (the spectral energy) of the light coming from the object changes, our visual system picks up cues from the surrounding objects and attributes that change to the lighting, not to the object." After I installed very specific lighting and calibrated my monitor accordingly, I have never needed to change my images once I finalized them in my digital darkroom and credit color constancy for it.
>>If the monitor-to-print match in the digital darkroom is bad<<
I don't dispute that at all, and agree wholeheartedly. And, I would say unless very extreme the ambient light environment in the digital darkroom is as you say "secondary at best".
>>in print in a digital darkroom with high quality lighting, matched for intensity and color temperature with your monitor<<
Other than for viewing an immediate print output to evaluate it, the ambient light in the darkroom is inconsequential, as long as it is not shining direct light on the monitor face and skewing its appearance. Yes, the surroundings immediately around your monitor should be neutral and at a brightness level substantially lower than the display image.
Individuals vary considerably in visual adaptation. Within the range of typical indoor lighting, subdued as it should be in the area surrounding your work (display), even a difference of several hundred degrees in color temperature will not affect perception of the image on screen, unless you are unusually sensitive in adaptive visual response or the ambient light level is very high, which it should not be.
If this were as critical an issue as you state, based on your particular personal experience, you would see a great deal more attention paid to ambient light work environments in the country's top-end graphic arts and design studios, ad agencies and other environments where the most critical color work involving millions of dollars in reproduction takes place.
However, if you make and evaluate a print in 4000K light that is red/blue balanced to reproduce a neutral gray as viewed, and then display the print in a room that is lit by indirect sunlight and skylight measuring 6000K color temperature, the color balance would not look correct, and the neutral gray would appear bluish.
When I say "Any other consideration, including the final viewing environment, is secondary at best," my definition of the final viewing environment is the one OUTSIDE of the digital darkroom. That environment is not critical because of color constancy. The lighting conditions INSIDE the digital darkroom are very critical and NOT inconsequential. There are two tasks for this lighting (both can be performed by the same light(s) if carefully chosen): 1) illuminate the area immediately around your monitor; it should be at a brightness level substantially lower than the screen and of the same color temperature; and 2) illuminate the prints at about the same brightness, also at the same color temperature.
I agree with your statement that "even several hundred degrees in color temperature balance will not affect perception of the image". The sad fact is that many people have a setup with a far greater difference. Just do the math: most lights are 5000K or less and most monitors are set to 6500K or more, resulting in a difference of more than 1500K, and that is VERY noticeable and easily verified: look at the image on the monitor and on paper, then change your monitor setting from 6500K to 5000K or from 5000K to 6500K and you will be amazed at how much change you notice; colors will absolutely look too warm or too cold for at least one of the monitor settings, if not both.
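As an aside on the arithmetic: raw kelvin differences understate or overstate the perceived gap depending on where on the scale you are, which is why photographers traditionally compare color temperatures in mireds (1,000,000 divided by kelvins). A quick sketch (my illustration, not part of the original argument):

```python
# Compare color-temperature gaps in mireds (micro reciprocal degrees).
# Equal mired shifts correspond roughly to equal perceived shifts,
# which raw kelvin differences do not.

def mired(kelvin):
    return 1_000_000 / kelvin

def mired_shift(k1, k2):
    return abs(mired(k1) - mired(k2))

# The 5000K-light vs 6500K-monitor mismatch discussed above:
print(round(mired_shift(5000, 6500), 1))   # 46.2

# For comparison, a much smaller kelvin gap at tungsten temperatures
# is actually a *larger* perceptual shift:
print(round(mired_shift(2900, 3400), 1))   # 50.7
```

So the 1500K gap between a 5000K print light and a 6500K monitor is on the order of a strong warming/cooling filter's worth of shift, consistent with the claim that it is very noticeable when both are in view at once.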
Yes, this issue is as critical as I stated and cannot be overemphasized. There is a great deal of attention being paid to lighting in top-end digital darkrooms. I happen to have advised on the lighting for a new digital darkroom for the graphic arts department of a major design company and yes, they decided to spend the extra money on low-voltage tungsten-halogen lighting because of the advantages of a decently high color temperature and pure, spike-free color spectrum. More and more companies are offering this type of lighting with good reason: there simply isn't any better alternative.
As for your last paragraph: color constancy will make a good image that looks good at 4000K, look good at 6000K.
>>I happen to have advised on the lighting for a new digital darkroom for the graphic arts department of a major design company and yes, they decided to spend the extra money on low-voltage tungsten-halogen lighting because of the advantages of a decently high color temperature and pure, spike-free color spectrum.<<
I would say that ONE instance is an exception and not the rule. And I would be pleased to spend the time and money to establish the facts on this if you are willing to put up an equal investment. Second, most office environments probably have just the opposite bias to what you describe for the home: in professional office environments the ambient illumination is usually cooler than 5000K. Third, other than in pre-press these days, more people in graphics, and particularly professional photographers, are using a monitor color temperature setting of 6500K rather than the old Apple standard of 5000K (the Windows default has always been 6500K). And I would also suggest that a lot of homes have artificial lighting that is cooler than the tungsten of the past, which ranged from 2700-3000K, because more and more are lighting for energy efficiency with fluorescent. The only tungsten lamp I have in my home and office is a Solux.
As for your last paragraph: >>color constancy will make a good image that looks good at 4000K, look good at 6000K.<<
Really? You just argued that changing the color temperature by just 1500 degrees will make a huge difference; then you argue that a larger change will not be noticed!
As for Solux, which is just a standard quartz-halogen tungsten bulb with a dichroic filter to raise the color temperature: it is neither clean nor good-quality illumination. All you need to do to establish that is to use dichroic-filtered tungsten light (like Solux) as the sole illumination to make some color photographs under that light. Even with a white balance correction (if done with digital), I will bet you will obtain an image that has all kinds of strange color casts that are almost impossible to remove. There is only one kind of light that will produce worse (dirtier) results, and that is what you find in some commercial buildings that have a mixture of cool-white fluorescents and sodium-vapor lighting.
I realize Solux works for you, and I do not begrudge you that. But it does not work for me, and knowing there is a wide range in human perception, I am sure it will not suit others as well. Finally, as an advocate of Solux it should be noted that as you have said as a viewing and ambient work light source it is a match if you are using a 5000K monitor color setting. I think most serious digital photography enthusiasts are more likely to conform to the standard of 6500K which is default for both Apple and Windows currently and is also the recommended workspace color temperature recommended by Adobe for doing Photoshop digital photography editing and processing.
We've had much of this argument before, and it is at a point where we should agree to disagree.
Yes, we disagree; I am fully aware of that. However, I would like the users of the Shutterbug Forum to get a better presentation of the facts. The application of color science can clarify many issues that more often than not are misrepresented and misunderstood. So, let's one more time look at some facts and figures.
You state that the color temperature of most professional office environments is cooler than 5000K.
Here are the facts about all kinds of light sources:
- incandescent lamps: 2500-2900K
- tungsten photoflood lamps: 3000-3400K
- generic low-voltage tungsten-halogen lamps: 3000K
- SoLux low-voltage tungsten-halogen lamps: 3500-5000K
- "cool" fluorescent lights: 5000K
Conclusion: most lighting is less than 5000K.
You state that many graphics and photography professionals use a monitor color temperature of 6500K.
You are quite correct. However, fact is there are no high-quality 6500K light sources.
Conclusion: using a monitor setting of 6500K virtually guarantees a mismatch between the monitor and the light used to illuminate prints.
You state that a lot of homes have artificial lighting that is cooler than the tungsten of the past that ranged from 2700-3000K, because more and more are lighting for energy efficiency with fluorescent.
You may be quite right, but as I showed above, most lighting is less than 5000K.
Conclusion: lighting in the home is still at 5000K or less.
You state that I argue that changing the color temperature just 1500 degrees will make a huge difference, then I argue that a larger change will not be noticed.
Fact is that I argued that when there are two light sources at work SIMULTANEOUSLY (the monitor and the light used to illuminate the print) with a color temperature difference of 1500K, there is a very obvious color mismatch. I also argued that color constancy will cause us to perceive an image as correct in color when it is first viewed in 4000K and then in 6000K (but NOT simultaneously).
Conclusion: there is nothing contradictory in my reasoning.
You state that SoLux lamps are neither clean nor good quality illumination; make some color photographs under that light and even with a white balance correction you bet you will obtain an image that has all kinds of strange color casts that are almost impossible to remove.
Fact is that SoLux has a clean color spectrum, without any spikes and with the highest color rendering index in the industry. Fact is also that I have made many photographs with SoLux illumination and, as color science would predict, there are no color casts whatsoever.
Conclusion: SoLux bulbs are about the best light sources available.
You state that most serious digital photography enthusiasts are more likely to conform to the standard of 6500K which is default for both Apple and Windows currently and is also the recommended workspace color temperature recommended by Adobe for doing Photoshop digital photography editing and processing.
Fact is that workspace color temperature has nothing, absolutely nothing, to do with monitor or lighting color temperature. It doesn't really matter if the workspace is at 6500K and the monitor/lighting is at a radically different color temperature of, say, 5000K. What matters, IF one wants a close correlation between what an image looks like on the monitor and on paper, is that both the monitor and the lighting are at the same brightness and color temperature.
And that is all I have to say about that.
>>You state that many graphics and photography professionals use a monitor color temperature of 6500K. You are quite correct. However, fact is there are no high-quality 6500K light sources. Conclusion: using a monitor setting of 6500K virtually guarantees a mismatch between the monitor and the light used to illuminate prints.<<
On the basis of your above statement the only PRACTICAL conclusion is that the mismatch does not have an effect sufficient to influence the quality of output, because that is what the REALITY is.
>>Fact is that I argued that when there are two light sources at work SIMULTANEOUSLY (the monitor and the light used to illuminate the print) with a color temperature difference of 1500K, there is a very obvious color mismatch. I also argued that color constancy will cause us to perceive an image as correct in color when it is first viewed in 4000K and then in 6000K (but NOT simultaneously).
Conclusion: there is nothing contradictory in my reasoning.<<
There is, in these facts. You are talking about the purported importance of the ambient light of a work area, which we both agree should be at a subdued level, lower than the brightness of the computer monitor. If the environment's light is at a lower level, and is also in the periphery of the user's vision, I would argue its effect in terms of color temperature differences is small, which would explain why the great predominance of users in professional graphics can produce critical color work using 6500K display settings in environments that are usually closer to 5000K.
Then I spoke of a 2000-degree difference in the red/blue balance of a print. If the print is balanced to look neutral in typical artificial light and is then illuminated by daylight/skylight, which is very cool, conservatively 7000K, the neutrals in the print will appear bluish. Here we are talking about the primary light source, not a lower-level peripheral light source like the one that surrounds a computer monitor.
>>You state that SoLux lamps are neither clean nor good quality illumination; make some color photographs under that light and even with a white balance correction you bet you will obtain an image that has all kinds of strange color casts that are almost impossible to remove.
Fact is that SoLux has a clean color spectrum, without any spikes and with the highest color rendering index in the industry. Fact is also that I have made many photographs with SoLux illumination and, as color science would predict, there are no color casts whatsoever.
Conclusion: SoLux bulbs are about the best light sources available.<<
All you are saying above is repeating the claims and specifications provided by Solux. You have not made the test I suggested.
There is really nothing new about the Solux lights. That basic design first appeared when Kodak introduced low-voltage quartz-halogen lamps for the Kodak Carousel projectors, and it was later adapted for commercial retail lighting. But before either became popular, motion picture production adopted quartz-halogen lighting and added dichroic daylight filters to simulate daylight, used primarily in banks of lamps as fill for daylight. (Previously motion picture production used very powerful arc lamps, and the "daylight" dichroic lamps became popular because they were more compact, lighter, and required less electric power.) I became aware of how dichroic-filtered quartz-halogen light affects a color photographic exposure when I was shooting stills on motion picture sets in Hollywood, and later in my own daylight studios, because I used the same sources as accent lights. I have made color photographs using tungsten-halogen lamps with dichroic daylight filters as the primary source, and the results are images with serious color casts, not from spikes but from the frequency-modulation effects of the dichroic filter.
>>Fact is that workspace color temperature has nothing, absolutely nothing to do with monitor or lighting color temperature. It doesn't really matter if the workspace is at 6500K and the monitor/lighting is at a radically different color temperature of say 5000K. What matters, IF one wants to have a close correlation between what an image looks like on the monitor and on paper, that both the monitor and the lighting are at the same brightness and color temperature.<<
I doubt that you will find corroboration of the statement you made above from any major source in color management, because it is not a description of reality. Give it a try: refer your statement to the experts at Apple ColorSync, or GretagMacbeth, or any established authority in color management. I think you will revise your thinking.
Clearly, this discussion is going nowhere and I will not continue to participate in it. If other people want to continue this discussion with me, I would be more than happy to do so. Please email me at: email@example.com
Whew. Mine will be much shorter.
While I don't have the expertise and the technical data, I do know there is a difference between looking at my prints in the digital darkroom and in the place they will be shown. Since it is not realistic to paint the walls and ceiling of my digital darkroom (some things take priority to keep our home life running smoothly), I try not to make the final judgment on my prints right next to the printer/monitor (well, I've learned not to). If I take them into the living area, I also notice a difference between the fluorescent kitchen light and the tungsten light in the rest of the house. Or I wait till the next day and view them in natural indirect light.
Again, I am not a professional, however I do know there is a difference in the environments.
Thanks to both of you for the efforts you have gone to in providing us with the knowledge and information to make the best use of it all.
My position is, and my experience bears it out, that you can judge your prints in the digital darkroom if certain precautions are taken. One of the most overlooked and ignored issues, in my opinion, is the match, or in most cases mismatch, between the monitor calibration and the lights used to illuminate and judge your prints. Again, if you are interested, let's take it off-forum; email me at firstname.lastname@example.org.
Probably your situation is what most of us deal with, compromises. However that does not mean you can't find some things within the scope of possibilities you can do to obtain both a monitor image that is not skewed by ambient conditions, and to make prints which are balanced to look neutral in a particular display environment.
If, for instance, the light illuminating the area where your computer is located is not ideal, with some shining on the monitor face, you can shield the monitor screen with a hood. Hoods are available for some monitor sizes, from LaCie for instance, as well as from some large dealers if you make an inquiry. But a hood is not difficult to make and install: black FoamCore or mount board will do to make the top and sides of a hood about 6 inches deep. Then use black cloth tape, available at most hardware stores, to hinge the top and two sides together. Finally, get a roll of stick-on Velcro tape to secure the sides and top of the hood to the bezel surrounding your monitor screen.
Of course, keep the room illumination falling on the area surrounding the monitor screen less bright than the screen image. And try not to have anything intensely colored within your field of vision while you are looking at the monitor screen. As long as the light level on the area surrounding your monitor is subdued (not dark), smaller color variations will not adversely affect your perception of the monitor colors. When your eyes are focused on the monitor screen and it takes up half or more of your field of vision, the periphery of sight involves a part of the eye that contains a reduced ratio of color receptors. In other words, the periphery of human sight is less sensitive to color than the center.
You are quite correct in observing that your prints look different in different display locations where the light source varies from warm incandescent household lighting, to cool fluorescent, to even cooler window light at midday or especially light from the sky coming in through a window. The primary difference that you notice is color temperature, how warm or cool the light is. This can be matched in a print so the neutral areas (gray) of the image look neutral by changing the red/blue balance of the image printed.
This can be most easily accomplished if you use Adobe Photoshop Elements 3.0 and switch to the QuickFix mode to edit your image for print output and manually adjust the color temperature slider. For advanced users who do custom printer profiles, you can actually adjust an output printer/paper/ink profile's red-blue balance to custom print to match any display illumination color temperature.
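Purely as an illustration of the red/blue rebalancing idea (this is not how Photoshop Elements or a profiling package actually works internally), one can estimate the white point for each correlated color temperature using the standard CIE daylight-locus polynomial and derive crude per-channel gains:

```python
# Sketch of the "red/blue balance" adjustment: estimate daylight white
# points for two correlated color temperatures via the CIE daylight-locus
# polynomial (valid roughly 4000-7000 K), then derive naive per-channel
# gains. Real ICC profile tools perform this adaptation properly in a
# profile connection space; this only illustrates the direction of shift.

def daylight_xy(cct):
    """CIE daylight-locus chromaticity for 4000 K <= cct <= 7000 K."""
    t = cct
    x = 0.244063 + 99.11 / t + 2.9678e6 / t**2 - 4.6070e9 / t**3
    y = -3.000 * x * x + 2.870 * x - 0.275
    return x, y

def xy_to_xyz(x, y):
    """Chromaticity to XYZ with Y (luminance) normalized to 1."""
    return x / y, 1.0, (1 - x - y) / y

def channel_gains(from_cct, to_cct):
    """Naive von Kries-style gains between two daylight white points."""
    fx = xy_to_xyz(*daylight_xy(from_cct))
    tx = xy_to_xyz(*daylight_xy(to_cct))
    return tuple(t / f for f, t in zip(fx, tx))

gx, gy, gz = channel_gains(5000, 6500)
# Moving a print balanced for 5000 K viewing light to 6500 K light means
# pulling down the red-ish channel and pushing up the blue-ish one:
print(gx < 1.0, gz > 1.0)
```

This matches the point made in the thread: the same print needs less red and more blue as the display illumination gets cooler, which is exactly what the color-temperature slider, or a profile's red/blue balance edit, is doing at a coarse level.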
David and Frans,
You are so right that my photography and prints are a matter of compromise. As is life in general. I will view and print with the ambient light less bright than the monitor, and a small lamp is set behind the monitor so no light falls on the face of the monitor. One of my problems is the color decor: walls and ceiling a wonderful coral color, and my desk a beautiful banker's desk in that stunning brown walnut with a hint of red. So, as you see, life is a compromise. I think I will get something neutral gray to cover the desktop; however, painting the walls/ceiling is a bit more of an issue.
I do plan on replacing my 5 yr + trinitron with either the LaCie 319 or the Apple Cinema some time next year. (waiting for David's reviews of LCDs).
So while my darkroom is far from neutral in color, these are the choices that allow us to live in harmony and practice our hobby to the best of our ability (and budget).
While I am new to this forum, I really do appreciate all the posts and opinions and advice.
Thanks for sharing, and for your patience with us part-timers.
Regarding LCD monitors, my plan is to test and review several, one at a time, over the next year. New models, more and more of which are claimed to be capable of supporting pro graphics and photography, are coming out from different producers. For instance, LaCie just the other day announced even newer 19- and 20-inch models. I am currently evaluating the top end of Samsung's SyncMaster line, the 244T, and our editor George Schaub is working with Samsung's newest 19" model, as you will notice on our web site. I am finding this newest Samsung provides much better screen image quality than what I experienced just a year ago with a 20.1-inch LaCie and Sony's best at that time.
So if you are looking ahead a ways before you make a decision, just a few months from now the choices may be quite different than they are today. The industry news mavens are indicating that for 2006 there may be some over-production and ever more competition, as well as a continued lowering of prices.