CCD vs CMOS: does one sensor have any advantage over the other? I ask because Nikon has introduced their D200 with a 10 mpix CCD sensor and Sony has released their DSC-R1 using a 10 mpix CMOS sensor. Makes you wonder what KM will use, since they teamed up with Sony.
The problem is that the people who might know aren't talking, and those who are talking have little to go on. However, we do know two things: one, CMOS as a sensor technology is much newer than CCD technology and might have more potential for further development; and two, the one company that makes both sensors and cameras, Canon, is betting on CMOS. Sony, on the other hand, although they make video cameras, is largely dependent on other camera companies to provide a market for their sensor production. Kodak is becoming less and less competitive in the faster-growing segment of the digital camera market. So unless a dark horse enters the race and changes things, it looks like next year will just bring more of the same kinds of choices.
that's a great question...and one that made me go out and do a little research.... hopefully some other people will speak up with what they know...
I checked out a couple of web pages that discuss the topic...
I did find one site that claimed the CCD was worse at attracting dust...that "seems" logical since the CCD uses more power...but who knows if that's really true or not.
both sites discuss the issue from a very technical perspective but both also seem to come to the same conclusions...
1. CCDs are more expensive to make....but they are EASIER to make, especially with older-tech processes
2. CMOS sensors are cheaper to make but require some pretty advanced tech processes to do it properly
3. CMOS sensors use less power and can be made smaller
nowhere did I see anyone claim that either chip produced any optical difference vs the other.... it seems each is simply a different way of doing the same thing. In the case of CMOS it's a cheaper way to do it.....once you pay for the tooling.
I'd love to see other folks add to this...I have seen some pages that go on and on about one chip being better than the other, but I couldn't find any page that offered any real support for their opinions..
several pages touted the CMOS as "better" but the only FACTUAL support for that opinion was that it was the "new thing"..
I found several pages that said the CCD was better but the only factual support they gave for the opinion was that it cost more...
Thanks for the info on the sensor. It seems that Kodak and Canon are aiming at CMOS technology with full-frame sensors for the pro camera crowd. Nikon is using a JFET, CMOS, and now a CCD in the APS-C size. If you consider the D200 a pro-grade camera, which the info seems to suggest, then they are trying to use three technologies in their pro-grade cameras. This has got to be expensive to develop and produce.
yes, it is interesting that Nikon uses all three and it will be VERY interesting to see what they do in the near future.. a couple of things that stand out in my mind...
1. what is the future of the D2H...with that JFET sensor? it seems like such a niche camera... the only real benefit it seems to offer anyone is the 8 frames per second...and the D2X can do that.....and it's tough to even argue that anyone needs that kind of FPS anyway....other than that it's a D2X with a much smaller sensor and twice the price of the D200
2. notice the D100 went away, thank God... but one has to wonder about the D50... is it the D70s replacement? it seems the D70s doesn't offer much more for the price.
3. what does this mean for the D2X? I have a D2X...wonderful camera.... but it's pretty hard to justify the extra $3000 for the camera.. There used to be a lot of reasons to buy the D2X vs the D70 if you had the cash, but the D200 pretty much covers all the limitations of the D70 (which is a great camera too) and does it so well that there isn't much the D2X offers vs the D200. I almost regret having dropped $5K on a D2X last year, because if I had the choice today I'd just buy two D200's and save $1500. It makes me wonder what Nikon has up its sleeve for the D2X...and it makes me drool at the possibilities :-)
With Sony (CMOS) and Sharp (CCD) now using 10 mpix sensors, the Nikon and Canon 12 mpix cameras aren't as impressive as they were a month ago. Just nice to own. I bought my 10D 3 years ago and am trying to wait, but it is hard, especially with my current plans. Things are changing fast.
My question has nothing to do with the subject (CMOS/CCD). My question: why do you "thank God" the D100 went away? I have one, and I guess I'm not experienced enough to know what I'm missing. Maybe I'm just lucky, but I have had awfully good luck with mine. The only thing I miss is a way to fire it remotely. Is there a way to do that?
Yes, there is a way to fire Nikon dSLRs remotely via a USB cable and download the image to your computer. It's called Nikon Capture 4 (version 4.3) software and sells for $100 at B&H Photo Video.
There is also a $17 infrared remote ML-L3 to fire many Nikon cameras, but I'm not sure if it fires the D100.
Consider these factors:
*Does the sensor require power to detect light? This will require more accurate amplification/power, i.e. variations will result in noise. A sensor might be manufactured by a single entity, but the post-processing is where it counts.
*How does the camera handle the image before creating the file? Post-processing is a big factor.
*There are variations in noise reduction algorithms.
*Does the camera compress the file, and what does it leave out (shadow details)? JPEG is fine, but more happens before that, even in raw mode.
*10 megapixels: what kind of buffer is there to back that up?
*How sharp is the lens (at all zoom lengths)? There will be a compromise somewhere. Some lenses will never get away from CA; it's a cost trade-off.
*Is there a moiré filter over the sensor? This can degrade pixel values, trading sharpness for smoothness (to avoid rainbow artifacts, etc.)
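The amplification point in the list above can be sketched numerically. This is a hypothetical toy model, not the behavior of any real sensor; the signal level, gain spread, and read-noise figures are all made-up assumptions for illustration:

```python
import numpy as np

# Toy model of per-pixel amplifier variation adding noise on top of
# unavoidable photon (shot) noise. All numbers are illustrative assumptions.
rng = np.random.default_rng(0)

signal = np.full((100, 100), 1000.0)          # ideal photon count per pixel
shot = rng.poisson(signal).astype(float)      # photon (shot) noise
gain = rng.normal(1.0, 0.02, signal.shape)    # assumed 2% per-pixel gain variation
read = rng.normal(0.0, 10.0, signal.shape)    # assumed amplifier/read noise

raw = shot * gain + read                      # what the ADC would digitize
snr = raw.mean() / raw.std()                  # higher means a cleaner image
```

The point is just that a 2% gain spread on a 1000-count signal contributes noise comparable to or larger than the read noise itself, which is why tight, uniform amplification matters.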
I would rather shoot a D70 w/ 80-200 AF than a 12 mpix fixed-zoom-lens camera with an electronic viewfinder. Why? Because I have full flexibility over the camera settings and I never feel restricted to a single lens; CMOS or CCD.
Why should you think lens "sharpness" is a significant factor with digital cameras when the average area-array sensor cell site is 5 to 10 microns across? You could fit quite a number of ISO 100 film grains in the area taken up by a single sensor cell site.
That is not to say lens quality is not an issue, but more so in terms of aberration correction and its being rectilinear than its being particularly 'sharp'.
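To put rough numbers on the cell-size point: taking a 7-micron pitch (the middle of the 5-10 micron range quoted above) and assuming, purely for illustration, a fine silver grain on the order of 1 micron:

```python
# Back-of-envelope arithmetic for the cell-size claim above.
cell_pitch_um = 7.0   # middle of the 5-10 micron range quoted
grain_um = 1.0        # assumed size of a fine ISO 100 grain (illustrative only)

grains_per_cell = (cell_pitch_um ** 2) / (grain_um ** 2)
print(grains_per_cell)   # 49.0 grain-sized areas per cell site
```

So on these assumed numbers, dozens of grain-sized areas fit inside one cell site, which is the scale argument being made.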
Secondly, all sensor arrays, whether linear or area arrays, require considerable software/firmware sharpening.
Then, on the business of raw versus JPEG: the post-exposure processing is quite different. The purpose of raw is to collect essentially unadjusted data from the sensor so the user can do the "processing" individually in a computer's image editing application. JPEG, by contrast, involves automatic processing that optimizes the gamut, adjusts brightness (mid-point placement) as well as contrast and saturation to pre-determined standards, applies sharpening, and then usually outputs into sRGB, a very small, limited colorspace.
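The raw-versus-JPEG split described above can be sketched as a toy pipeline. This is a deliberately simplified, hypothetical model (the function names, curve shapes, and parameters are my own inventions); real in-camera processing is far more elaborate:

```python
import numpy as np

def jpeg_path(raw, gamma=2.2, contrast=1.2, sharpen=0.5):
    """Camera-style path: bake fixed adjustments into an 8-bit image."""
    img = raw / raw.max()                               # normalize sensor counts
    img = np.clip((img - 0.5) * contrast + 0.5, 0, 1)   # preset contrast curve
    img = img ** (1.0 / gamma)                          # brightness/tone mapping
    blur = np.roll(img, 1, axis=0)                      # crude stand-in for a blur
    img = np.clip(img + sharpen * (img - blur), 0, 1)   # unsharp-mask-style sharpening
    return (img * 255).astype(np.uint8)                 # small, fixed-gamut 8-bit output

def raw_path(raw):
    """Raw path: hand the unadjusted sensor data to the user."""
    return raw.copy()

sensor = np.linspace(0, 4095, 16).reshape(4, 4)         # fake 12-bit sensor readout
jpeg = jpeg_path(sensor)                                # adjustments baked in, 8-bit
untouched = raw_path(sensor)                            # full data, processed later
```

The design point is that every step in `jpeg_path` is irreversible once the 8-bit file is written, whereas `raw_path` defers all of those choices to the user's editing software.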