Discussion:
[SciPy-user] ndimage convolve vs. RAM issue...
fred
2009-01-16 10:58:00 UTC
Permalink
Hi all,

On a bi-Xeon quad-core machine (64-bit Debian) with 8 GB of RAM, if I want to
convolve a 102*122*143 float array (~7 MB) with a kernel of 77*77*41
cells (~1 MB), I get a MemoryError in correlate:

File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
331, in convolve
origin, True)
File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
312, in _correlate_or_convolve
_nd_image.correlate(input, weights, output, mode, cval, origins)
MemoryError

Why?

Is there a workaround to compute such a convolution?

TIA.


Cheers,
--
Fred
fred
2009-01-20 10:33:53 UTC
Permalink
Post by fred
Hi all,
On a bi-xeon quad core (debian 64 bits) with 8 GB of RAM, if I want to
convolve a 102*122*143 float array (~7 MB) with a kernel of 77*77*41
File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
331, in convolve
origin, True)
File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
312, in _correlate_or_convolve
_nd_image.correlate(input, weights, output, mode, cval, origins)
MemoryError
Can nobody help me with this issue?

I really need some help, since ndimage.convolve is _very_ efficient ;-)

TIA


Cheers,
--
Fred
Gael Varoquaux
2009-01-20 10:35:12 UTC
Permalink
Post by fred
I really need some help, since ndimage.convolve is _very_ efficient ;-)
Did you try fftconvolve?

Gaël
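A minimal sketch of the suggestion, assuming SciPy's `scipy.signal.fftconvolve` (the data shape is taken from the original post; the small kernel here is just a stand-in):

```python
import numpy as np
from scipy.signal import fftconvolve

data = np.random.rand(102, 122, 143)   # dimensions from the original post
kernel = np.ones((7, 7, 5))            # small stand-in kernel
kernel /= kernel.sum()                 # normalize so the output stays in range

# mode='same' returns an output the size of `data`, like ndimage.convolve
result = fftconvolve(data, kernel, mode='same')
print(result.shape)  # (102, 122, 143)
```

Note that fftconvolve pads both arrays up to the FFT size, so its peak memory use grows with the padded volume rather than with the kernel volume.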
fred
2009-01-20 11:22:17 UTC
Permalink
Post by Gael Varoquaux
Post by fred
I really need some help, since ndimage.convolve is _very_ efficient ;-)
Did you try fftconvolve?
Yep.

On a smaller kernel:

data: 600x800x720
kernel: 361

ndimage.convolve: 184 s

signal.fftconvolve: MemoryError


Another one:

data: 300x400x360
kernel: 361

ndimage.convolve: 22 s

signal.fftconvolve: 37 s


Besides this, ndimage.convolve can handle NaN values; signal.fftconvolve cannot.
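The comparison above can be sketched on much smaller stand-in arrays so it runs quickly; the relative cost depends heavily on kernel size (ndimage.convolve scales with the kernel volume, fftconvolve with the FFT of the padded arrays):

```python
import time
import numpy as np
from scipy import ndimage, signal

data = np.random.rand(60, 80, 72)      # scaled-down stand-in data
kernel = np.ones((7, 7, 7)) / 343.0    # small normalized kernel

t0 = time.time()
a = ndimage.convolve(data, kernel)
t1 = time.time()
b = signal.fftconvolve(data, kernel, mode='same')
t2 = time.time()

print("ndimage.convolve:   %.3f s" % (t1 - t0))
print("signal.fftconvolve: %.3f s" % (t2 - t1))

# The two agree in the interior; edges differ because ndimage.convolve
# reflects at the boundary while fftconvolve zero-pads.
print("max interior difference:",
      np.abs(a[10:-10, 10:-10, 10:-10] - b[10:-10, 10:-10, 10:-10]).max())
```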


Cheers,
--
Fred
fred
2009-01-20 12:17:43 UTC
Permalink
Post by fred
Hi all,
On a bi-xeon quad core (debian 64 bits) with 8 GB of RAM, if I want to
convolve a 102*122*143 float array (~7 MB) with a kernel of 77*77*41
File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
331, in convolve
origin, True)
File "/usr/lib/python2.5/site-packages/scipy/ndimage/filters.py", line
312, in _correlate_or_convolve
_nd_image.correlate(input, weights, output, mode, cval, origins)
MemoryError
Can someone give me an explanation, if not a solution? (I do have one idea: multiprocessing ;-))
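One way the multiprocessing idea could be cut up, sketched here with a hypothetical helper `chunked_convolve`: split the data along one axis into slabs that overlap by the kernel half-width, convolve each slab independently (each call could then be farmed out with the multiprocessing module), and keep only the valid core of each result:

```python
import numpy as np
from scipy import ndimage

def chunked_convolve(data, kernel, n_chunks=4):
    """Convolve `data` slab by slab along axis 0 (hypothetical helper)."""
    half = kernel.shape[0] // 2            # overlap needed along axis 0
    out = np.empty_like(data)
    bounds = np.linspace(0, data.shape[0], n_chunks + 1).astype(int)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        # Extend the slab by the kernel half-width on each side so the
        # core region lo:hi is unaffected by the slab boundaries.
        a = max(lo - half, 0)
        b = min(hi + half, data.shape[0])
        sub = ndimage.convolve(data[a:b], kernel)
        out[lo:hi] = sub[lo - a : (lo - a) + (hi - lo)]
    return out
```

Each slab only needs slab-sized temporaries, so peak memory per call shrinks roughly by the number of chunks; with the overlap included, the stitched result matches a single full ndimage.convolve call.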


Cheers,
--
Fred
fred
2009-01-20 16:29:38 UTC
Permalink
Post by fred
Can someone give me an explanation, if not a solution? (I do have one idea: multiprocessing ;-))
Stupid me.

I tested the wrong example.

It does not work :-(((((((((((


Cheers,
--
Fred