Does a higher-density sensor imply more noise? 
Thursday, November 30, 2006, 11:49 AM - Photography
This year, there is a trend toward increased sensor density in DSLRs, mainly because of the new 10 megapixel cameras.
Many people (including me) are wondering about the noise level of these cameras compared to lower-density sensors. We have the following competing models:

Canon 400D (10Mpix) vs Canon 350D (8Mpix)
Sony alpha-100 (10Mpix) vs KM 5D/7D (6Mpix)
Pentax K10 (10Mpix) vs Pentax K100 (6Mpix)
Olympus E400 (10Mpix) vs Olympus E500 (8Mpix)

The first observation is usually that the 10Mpix cameras produce more noise at higher ISO sensitivities. Comparing full-scale crops side by side shows an increased noise level.

But there is something that should be considered: the increased resolution. Let's compare the noise of a 10Mpix image with what a 6/8Mpix sensor would give you, by resizing the 10Mpix image down to 6/8Mpix. Downscaling averages neighbouring pixels, which averages away part of the per-pixel noise, so once resized, the increased noise level is not obvious anymore.
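To illustrate the averaging effect, here is a toy numpy sketch, not a measurement of any real camera: the flat gray patch, the noise level and the 2x2 averaging factor are all made-up values (a real 10 to 6/8Mpix resize uses a smaller linear factor), but averaging n pixels reduces random noise by roughly sqrt(n) all the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: a flat gray patch with per-pixel Gaussian noise,
# standing in for a high-ISO shot of a uniform surface.
signal, sigma = 128.0, 8.0
patch = signal + rng.normal(0.0, sigma, size=(1000, 1000))

# Resize by 2x2 block averaging (a crude stand-in for downscaling
# a 10Mpix image toward 6/8Mpix).
small = patch.reshape(500, 2, 500, 2).mean(axis=(1, 3))

print(f"noise at full size:   {patch.std():.2f}")  # ~8.0
print(f"noise after resizing: {small.std():.2f}")  # ~4.0, i.e. sigma/sqrt(4)
```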
So for 10Mpix DSLRs we have:

*increased noise at full size
*similar noise when resized to 6/8Mpix
*higher resolution
*similar noise level at low ISO
*lack of a 3200 ISO setting (usually)

It seems that, after all, the only real drawback might be the lack of a 3200 ISO setting, not a genuinely increased noise level.

Switching raw negatives to DNG? 
Thursday, November 16, 2006, 11:02 PM - Photography
I am wondering about switching my raw files to DNG, mainly because of space constraints.

Right now, my Minolta (5D) raw files are about 9 MB each. Of course, this quickly consumes storage space, but I'm still reluctant to keep only jpeg versions. When you decide to edit a file, having the full 12 bits to start with seems to add a lot more freedom, and avoids the flat uniform areas that tend to appear if you push curves too hard on jpeg files.

So I decided to take a serious look at DNG. I downloaded the Adobe DNG Converter and tried it on a few files. My first impressions of the converter were positive:
*the DNG converter is a single executable, no installation needed
*the converter is quite fast, and seems to use all the available CPU cores
*it provided a great size reduction over the original raw files (about 30% smaller)

Then I tried to open the DNG files with Paint Shop Pro X, and things started to go wrong: even though PSP X is supposed to support DNG, it failed to open my newly created DNG files. I then tried to browse the files with XnView, which displayed them with some obviously broken parts. Quite disappointing. DNG support will likely improve in the future, as some camera manufacturers (especially Pentax) are starting to use it as a native file format in their DSLRs, but right now there are still a few problems.

I still decided to have a look at the compressibility of my raw files:

original mrw file: 8.77MB
DNG converted: 5.13MB

A good size reduction, but simply compressing the original mrw file with general-purpose tools gives:

zip: 5.79MB
bz2: 4.76MB
7zip: 5.13MB

So if I ever face serious storage space issues, I could still compress my older raw files with 7zip (or bzip2, which did even better here).
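For the curious, this comparison is easy to reproduce with Python's standard compression modules; lzma is the algorithm behind 7zip's default format, so the last line is a rough stand-in for it. The file name below is hypothetical, and the numbers will of course vary with the actual raw file.

```python
import bz2
import lzma
import zlib
from pathlib import Path

# Hypothetical file name; any Minolta .mrw raw file would do.
raw = Path("PICT0001.MRW").read_bytes()

mb = 1024 * 1024
print(f"original:    {len(raw) / mb:.2f}MB")
print(f"zip (max):   {len(zlib.compress(raw, 9)) / mb:.2f}MB")
print(f"bz2 (max):   {len(bz2.compress(raw, 9)) / mb:.2f}MB")
print(f"lzma (7zip): {len(lzma.compress(raw)) / mb:.2f}MB")
```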

Conclusion: not yet switching to DNG

Quad core benchmarks and LAME 
Tuesday, November 7, 2006, 12:39 PM - LAME
Some samples of Intel "quad core" processors are now available. As with the introduction of dual cores, we'll see a lot of hardware websites running benchmarks, and several of them will use LAME encoding as one of the tests.

In the current version (3.97), LAME is not multithreaded at all, so adding cores will not change anything for a single encoding. There is no need to run a benchmark to know this, and no need to display bar graphs of encoding speed results, as they are quite pointless.


Message to benchmarkers:

If you want to use mp3 encoding for a multicore/multiprocessor benchmark, you have the following choices:

*use a natively multithreaded encoder (example: iTunes)

*use a multithreaded frontend and encode several files in parallel. As an example, EAC lets you choose the number of threads; if you use more than one, you will of course benefit from the added cores (see the sketch below).
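As a rough illustration of the second approach, here is a minimal Python sketch that encodes a batch of wav files with several concurrent lame processes. The directory name and worker count are arbitrary assumptions; the point is only that the parallelism comes from running several single-threaded encoders at once.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def encode(wav: Path) -> Path:
    """Encode one wav file with a single-threaded lame process."""
    mp3 = wav.with_suffix(".mp3")
    subprocess.run(["lame", str(wav), str(mp3)],
                   check=True, capture_output=True)
    return mp3

# Hypothetical directory of ripped tracks; with 4 workers, a quad
# core stays busy even though each lame process uses one core.
wavs = sorted(Path("rips").glob("*.wav"))
with ThreadPoolExecutor(max_workers=4) as pool:
    for mp3 in pool.map(encode, wavs):
        print("done:", mp3)
```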

LAME encoding speed evolution 
Saturday, October 28, 2006, 01:27 PM - LAME
Out of curiosity, I tested how LAME's encoding speed has evolved across versions. To keep things simple, encoding is done at default settings (128kbps CBR). The computer features MMX/3DNow!/SSE/SSE2, and the audio file is about 1 minute long.

Here are the results:

3.20: 3.51s
3.50: 6.49s
3.70: 3.78s
3.80: 3.70s
3.90: 3.74s
3.93.1: 4.58s
3.96.1: 5.53s
3.97: 5.31s

Looking at this, it's clear that overall LAME has become slower over time. We tried to keep speed reasonable, but as we increased quality, our speed optimisations were not enough to compensate for the extra computations.

There was a notable speed decrease with the release of 3.94, where we switched the default psychoacoustic model from GPsycho to NSPsytune. NSPsytune was already used (starting with 3.90) by the preset/alt-preset settings. For comparison, here is the "preset cbr 128" speed of versions prior to 3.94:

3.90 --alt-preset cbr 128: 9.62s
3.93.1 --preset cbr 128: 6.55s

So between 3.90 and 3.97, NSPsytune's encoding time decreased from 9.62s to 5.31s. Not bad at all, but still quite a bit slower than early LAME releases.
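In case anyone wants to reproduce this kind of run, here is a minimal Python sketch of how the timing could be scripted. It assumes, hypothetically, one lame binary per version under bin/, and uses default settings as in the table above; real benchmarking would also average several runs.

```python
import subprocess
import time

# Hypothetical layout: one binary per version, e.g. bin/lame-3.97.
versions = ["3.90", "3.93.1", "3.96.1", "3.97"]
wav = "sample.wav"  # the ~1 minute test file

for v in versions:
    start = time.perf_counter()
    subprocess.run([f"bin/lame-{v}", wav, "out.mp3"],
                   check=True, capture_output=True)
    print(f"{v}: {time.perf_counter() - start:.2f}s")
```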

