Reducing noise with multiple exposures

Reducing noise with multiple exposures giuliogiuseppecarlo@interfree.it 24 Nov 20:03
  Reducing noise with multiple exposures GSR - FR 24 Nov 20:47
  Reducing noise with multiple exposures Toby Haynes 28 Nov 02:00
   Reducing noise with multiple exposures Rich 03 Dec 06:34
Reducing noise with multiple exposures giuliogiuseppecarlo@interfree.it 27 Nov 21:26
  Reducing noise with multiple exposures GSR - FR 27 Nov 23:51
Reducing noise with multiple exposures giuliogiuseppecarlo@interfree.it 02 Dec 10:02
  Reducing noise with multiple exposures GSR - FR 02 Dec 19:51
Reducing noise with multiple exposures giuliogiuseppecarlo@interfree.it 02 Dec 10:03
giuliogiuseppecarlo@interfree.it
2007-11-24 20:03:37 UTC

Reducing noise with multiple exposures

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a flatbed scanner with a lot of noise, especially in dark areas...).

So far I have tried with my old broken camera, which had strong noise at 400 ISO, and the first results are not bad at all.

Samples:
http://img90.imageshack.us/img90/6967/prova1wc2.jpg
http://img131.imageshack.us/img131/448/prova2xh7.jpg
http://img116.imageshack.us/img116/7589/prova3on7.jpg
(The heavy purple fringing is probably due to the fact that I chose 5/7 of the maximum resolution, 5 MP instead of 7.)

But I still have some questions:

I used the first layer, as normal, with 100% opacity, then I added another 9 layers (9 photos), each with opacity 100/9 = 11.1 and mode set to normal.

If I use 10 layers at 10% opacity each, the brightness of the result is different from the original (lighter).

An example: I would like to use a grayscale image, shot 3 times.

For a given pixel, the first image is 200, the second 190 and the third 192.

So I would like to get (200 + 190 + 192) / 3 = 194 as the result.

So what do you think I should use?

Thank you in advance.

Giulio.


GSR - FR
2007-11-24 20:47:16 UTC

Reducing noise with multiple exposures

Hi,
giuliogiuseppecarlo@interfree.it (2007-11-24 at 1903.37 -0000):

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a flatbed scanner with a lot of noise, especially in dark areas...).

[...]

I used the first layer, as normal, with 100% opacity, then I added another 9 layers (9 photos), each with opacity 100/9 = 11.1 and mode set to normal.

If I use 10 layers at 10% opacity each, the brightness of the result is different from the original (lighter).

The trick for mixing this way is that the Nth layer has to be set to 1/N opacity. The reasoning is that the Nth layer will contribute that factor, and the rest ((N-1)/N) has to come from the previously mixed layers, so the result always totals 100% ((1 + (N-1))/N = N/N). First (base) layer 1/1 -> 100%; second layer (the one just above the base) 1/2 -> 50% (and the other 50% comes from the first); third layer 1/3 -> 33.33% (and the other 66.66% comes from layers 1 and 2); and so on. This approach will probably have rounding errors.
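A quick check of the 1/N rule against the three grayscale values from the original question (200, 190 and 192), worked through with bc purely as a sketch of the arithmetic:

# layer 2 at 1/2 opacity: plain average of layers 1 and 2
# layer 3 at 1/3 opacity: 2/3 of the previous mix plus 1/3 of the new value
echo "scale=4; b=(200+190)/2; c=(b*2+192)/3; c" | bc
# prints 194.0000, the same as (200+190+192)/3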

So what do you think I should use?

Imagemagick. Simpler, faster and maybe with even fewer rounding issues (it can add all the images at once, then do a single multiply by 1/N; at least that is how I would code such an op). :]

convert image1.png ... imageN.png -average result.png
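For instance, with ten shots named a.jpg through j.jpg (hypothetical names, though the same pattern appears later in this thread), a shell glob keeps the command short:

convert [a-j].jpg -average result.png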

I investigated the topic of mixing different frames some time ago, to create more film-like renders (there are lots of other things in the trick, but finding a fast and nice average op was one of the core issues I had to solve): http://www.infernal-iceberg.com/blender/mblur/

GSR

giuliogiuseppecarlo@interfree.it
2007-11-27 21:26:23 UTC

Reducing noise with multiple exposures

Hi,
giuliogiuseppecarlo@interfree.it (2007-11-24 at 1903.37 -0000):

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a flatbed scanner with a lot of noise, especially in dark areas...). [...]

I used the first layer, as normal, with 100% opacity, then I added another 9 layers (9 photos), each with opacity 100/9 = 11.1 and mode set to normal.

If I use 10 layers at 10% opacity each, the brightness of the result is different from the original (lighter).

The trick for mixing this way is that the Nth layer has to be set to 1/N opacity. The reasoning is that the Nth layer will contribute that factor, and the rest ((N-1)/N) has to come from the previously mixed layers, so the result always totals 100% ((1 + (N-1))/N = N/N). First (base) layer 1/1 -> 100%; second layer (the one just above the base) 1/2 -> 50% (and the other 50% comes from the first); third layer 1/3 -> 33.33% (and the other 66.66% comes from layers 1 and 2); and so on. This approach will probably have rounding errors.

Ok, thank you, works perfectly :D

So what do you think I should use?

Imagemagick. Simpler, faster and maybe with even fewer rounding issues (it can add all the images at once, then do a single multiply by 1/N; at least that is how I would code such an op). :]

convert image1.png ... imageN.png -average result.png

I have tried, but it requires more memory than I have: after 30 minutes (with 10 images, each of 5 megapixels) it was still working (256 MB RAM + 768 swap), whilst with Gimp everything, from loading to adjusting the opacities, took only about 3 minutes.

I investigated the topic of mixing different frames some time ago, to create more film-like renders (there are lots of other things in the trick, but finding a fast and nice average op was one of the core issues I had to solve): http://www.infernal-iceberg.com/blender/mblur/

I'll take a look.

Thank you!


GSR - FR
2007-11-27 23:51:08 UTC

Reducing noise with multiple exposures

Hi,
giuliogiuseppecarlo@interfree.it (2007-11-27 at 2026.23 -0000):

I have tried, but it requires more memory than I have: after 30 minutes (with 10 images, each of 5 megapixels) it was still working (256 MB RAM + 768 swap), whilst with Gimp everything, from loading to adjusting the opacities, took only about 3 minutes.

I would try the following then: cut the images into manageable parts (using PNG as the format), average the separate "stacks", then glue the parts back together. For example, chunks of 50 lines (I looked up a 5 MP camera and it said 2560*1920, so 2560*50 chunks). I ran the following commands after running ulimit -d 131072 -m 131072 -v 131072, to make sure everything fits in 128M:

# First cut all
for i in [a-j].jpg
do
convert ${i} -crop 2560x50 +repage ${i}_%02d.png
done
# ~120 sec

# Look up the maximum number generated, in this case 38 (starts at 0)
# And average the stacks
for i in $( seq -w 0 38 )
do
convert [a-j].jpg_$i.png -average result_$i.png
done
# ~25 sec

# Join back, 1x means one column (= full width parts, simpler)
montage -mode concatenate -tile 1x result_*.png indirect.png
# ~20 sec
# Total: ~165 sec
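As a side note, a 1920-line image cut into 50-line strips gives ceil(1920/50) = 39 parts, numbered 00 to 38, which is where the 38 above comes from. If you would rather not look the number up by hand, something like this (only a sketch with standard tools, assuming the a.jpg_NN.png names produced above) finds it:

ls a.jpg_*.png | sed 's/.*_\([0-9]*\)\.png$/\1/' | sort -n | tail -1
# prints the highest strip index, 38 in this case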

I also tried the original approach; it took ~120 secs (~30 without setting limits, no swapping in either case) and did not crash due to the forced memory limit (the original JPEGs were 2560*1920 and ~3 MBytes each; the work dir ended up being ~180 MB with all the intermediate PNGs and the two final versions):

convert [a-j].jpg -average direct.png

So if you figure out why your system was so slow (disk? too many other apps running? really old CPU?), it would be nice to know. Also, memory is nicer than swap, especially if you are into image editing (just for when you have to get a new computer or upgrade the current one ;] ).

GSR

Toby Haynes
2007-11-28 02:00:57 UTC

Reducing noise with multiple exposures

giuliogiuseppecarlo@interfree.it wrote:

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a flatbed scanner with a lot of noise, especially in dark areas...).

So far I have tried with my old broken camera, which had strong noise at 400 ISO, and the first results are not bad at all.

Samples:
http://img90.imageshack.us/img90/6967/prova1wc2.jpg
http://img131.imageshack.us/img131/448/prova2xh7.jpg
http://img116.imageshack.us/img116/7589/prova3on7.jpg
(The heavy purple fringing is probably due to the fact that I chose 5/7 of the maximum resolution, 5 MP instead of 7.)

You might consider the Anti-Lameness Engine, which is expressly designed for exactly this sort of task (including re-aligning the images and suppressing noise).

http://auricle.dyndns.org/ALE/

Hugin (http://hugin.sf.net/) is useful if you need to exactly align a number of images prior to any stacking. It can also be used to remove chromatic aberration with a little care (although it's fiddly and time consuming).

Cheers,
Toby Haynes

giuliogiuseppecarlo@interfree.it
2007-12-02 10:02:40 UTC

Reducing noise with multiple exposures

Hi,
giuliogiuseppecarlo@interfree.it (2007-11-27 at 2026.23 -0000):

I have tried, but it requires more memory than I have: after 30 minutes (with 10 images, each of 5 megapixels) it was still working (256 MB RAM + 768 swap), whilst with Gimp everything, from loading to adjusting the opacities, took only about 3 minutes.

I would try the following then: cut the images into manageable parts (using PNG as the format), average the separate "stacks", then glue the parts back together. For example, chunks of 50 lines (I looked up a 5 MP camera and it said 2560*1920, so 2560*50 chunks). I ran the following commands after running ulimit -d 131072 -m 131072 -v 131072, to make sure everything fits in 128M:

# First cut all
for i in [a-j].jpg
do
convert ${i} -crop 2560x50 +repage ${i}_%02d.png
done
# ~120 sec

Here:
real 0m53.639s
user 0m42.179s
sys 0m3.112s

# Look up the maximum number generated, in this case 38 (starts at 0)
# And average the stacks
for i in $( seq -w 0 38 )
do
convert [a-j].jpg_$i.png -average result_$i.png
done
# ~25 sec

real 0m38.739s
user 0m26.850s
sys 0m3.420s

# Join back, 1x means one column (= full width parts, simpler)
montage -mode concatenate -tile 1x result_*.png indirect.png
# ~20 sec
# Total: ~165 sec

real 0m25.854s
user 0m21.225s
sys 0m0.660s

I also tried the original approach; it took ~120 secs (~30 without setting limits, no swapping in either case) and did not crash due to the forced memory limit (the original JPEGs were 2560*1920 and ~3 MBytes each; the work dir ended up being ~180 MB with all the intermediate PNGs and the two final versions):

convert [a-j].jpg -average direct.png

Let's try again.

free -m:

             total       used       free     shared    buffers     cached
Mem:           250        115        135          0          2         57
-/+ buffers/cache:         55        195
Swap:          729          0        728

After a few seconds, the HD starts swapping.

After 10 minutes it is still swapping.

free -m reports:

             total       used       free     shared    buffers     cached
Mem:           250        247          3          0          0         36
-/+ buffers/cache:        210         40
Swap:          729        377        352

Then, ctrl-c .

So if you figure out why your system was so slow (disk? too many other apps running? really old CPU?), it would be nice to know. Also, memory is nicer than swap, especially if you are into image editing (just for when you have to get a new computer or upgrade the current one ;] ).

The PC is a Pentium 4 1800 MHz with 256 MB of RAM (unfortunately RIMM; here I'm unable to find modules at a decent price, and I once found a 256 MB expansion for "only" 90€). The disk is a 60 GB Maxtor, doing 40 MB/s according to hdparm.

Btw, I'm using ImageMagick from Debian Etch (imagemagick 6.2.4.5.dfsg1-0.14).

And I have always found ImageMagick extremely slow (converting images, for example, is much faster with the netpbm tools).

Thank you.


giuliogiuseppecarlo@interfree.it
2007-12-02 10:03:36 UTC

Reducing noise with multiple exposures

giuliogiuseppecarlo@interfree.it wrote:

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a

...

You might consider the Anti-Lameness Engine, which is expressly designed for exactly this sort of task (including re-aligning the images and suppressing noise).

http://auricle.dyndns.org/ALE/

Hugin (http://hugin.sf.net/) is useful if you need to exactly align a number of images prior to any stacking. It can also be used to remove chromatic aberration with a little care (although it's fiddly and time consuming).

Thank you, I'll take a look.


GSR - FR
2007-12-02 19:51:59 UTC

Reducing noise with multiple exposures

Hi,
giuliogiuseppecarlo@interfree.it (2007-12-02 at 0902.40 -0000):

I also tried the original approach; it took ~120 secs (~30 without setting limits, no swapping in either case) and did not crash due to the forced memory limit (the original JPEGs were 2560*1920 and ~3 MBytes each; the work dir ended up being ~180 MB with all the intermediate PNGs and the two final versions):

convert [a-j].jpg -average direct.png

Let's try again.

free -m:

             total       used       free     shared    buffers     cached
Mem:           250        115        135          0          2         57
-/+ buffers/cache:         55        195
Swap:          729          0        728

So you have 135 free, that is good; 195 if cache and buffers are not counted.

After a few seconds, the HD starts swapping.

After 10 minutes it is still swapping.

free -m reports:

             total       used       free     shared    buffers     cached
Mem:           250        247          3          0          0         36
-/+ buffers/cache:        210         40
Swap:          729        377        352

377 in swap, not good now. :[

Then, ctrl-c .

I think the issue you have can be solved by the same trick I used to simulate small memory. Your system has free memory before starting (and all the swap free too), so ImageMagick requests memory, and as it never gets a "no more memory", it keeps on requesting, until swap usage is so big that the system is mostly thrashing the disk instead of doing real work.

But if ImageMagick gets a "there is no more memory" (due to ulimit, or to really hitting the hardware's maximum space), it completes the task with what it already got, even if a bit more slowly than it would on a computer with lots of RAM.

Remember, for apps the "memory" is RAM + swap, but in practice, once you have to use the "slow memory" (swap), it is often not worth it. So try again after running ulimit -S -d 131072 -m 131072 -v 131072, then convert [a-j].jpg -average direct.png. That should keep ImageMagick in memory, or at least most of it, instead of forcing the system to use over 300 MB of "really slow memory".
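Spelled out as it would be typed in one shell session (ulimit counts in KiB, so the 131072 figures are 128 MiB soft caps on data, resident and virtual size):

# soft (-S) 128 MiB caps; must be run in the same shell as convert
ulimit -S -d 131072 -m 131072 -v 131072
convert [a-j].jpg -average direct.png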

Personally, I prefer computers with a really small swap, except when it is used to cover the needs of tmpfs, swsusp or similar systems. If a process goes mad, it will die soon, instead of making the computer barely usable for minutes. In your case, maybe I would set up a soft ulimit of 192 or so for user accounts (that would leave 64 free for the other processes running at the same time), so no app can request more than that. As it is the soft kind, not hard, users can raise it if they really, really need to (at their own risk).
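A sketch of that last suggestion, assuming bash and remembering that ulimit counts in KiB, so 192 MiB is 196608 (a line like this could go in each user's shell profile):

# soft cap of 192 MiB of address space per process; users may raise it by hand
ulimit -S -v 196608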

[...]

And I have always found ImageMagick extremely slow (converting images, for example, is much faster with the netpbm tools).

Yes, it is not exactly fast; it focuses more on features.

GSR

Rich
2007-12-03 06:34:58 UTC

Reducing noise with multiple exposures

Toby Haynes wrote:

giuliogiuseppecarlo@interfree.it wrote:

I'm trying to reduce noise using multiple photos of the very same subject, using either a tripod or simply repeating a scan a number of times (I have a flatbed scanner with a lot of noise, especially in dark areas...).

So far I have tried with my old broken camera, which had strong noise at 400 ISO, and the first results are not bad at all.

Samples:
http://img90.imageshack.us/img90/6967/prova1wc2.jpg
http://img131.imageshack.us/img131/448/prova2xh7.jpg
http://img116.imageshack.us/img116/7589/prova3on7.jpg
(The heavy purple fringing is probably due to the fact that I chose 5/7 of the maximum resolution, 5 MP instead of 7.)

You might consider the Anti-Lameness Engine, which is expressly designed for exactly this sort of task (including re-aligning the images and suppressing noise).

http://auricle.dyndns.org/ALE/

Hugin (http://hugin.sf.net/) is useful if you need to exactly align a number of images prior to any stacking. It can also be used to remove chromatic aberration with a little care (although it's fiddly and time consuming).

Hi

I normally take photos in raw, which seems to eliminate a lot of the color noise on long exposures.
My camera is usually okay with 5-10 sec exposures, but this time I took the shots in JPEG, which tends to generate more noise when there is a lot of low-level light on a long exposure.
It looks like lint or squiggly lines all throughout the image.

I tried using Despeckle, which does a fine job; however, it tends to kill the sharpness and distorts anything that was smooth, especially circles and droplet edges.
I tried Selective Gaussian, which seems to help with some of the color blockiness.

Is there another method I can use to help eliminate those squiggly artifacts?
Would using multiple images help eliminate the noise?

Thanks,
Rich