PyCNN: Cellular Neural Networks Image Processing Python Library (github.com/ankitaggarwal011)
64 points by bane on Aug 20, 2016 | 18 comments


I'm reading up on Cellular Neural Networks and it seems like they are simple convolutions for which you can specify the kernel. In fact, this library is just calling scipy.signal.convolve2d() with different kernels [2].

The research (from the original 1988 paper [1]) apparently tried tackling the signal processing problem from a different angle (namely, having connected neighboring 'cells'), but the end result is the same as what we know today. I actually wouldn't be surprised if CNNs were a precursor to today's ConvNets in some way.

TLDR: these 'CNN's are convolutions. They aren't ConvNets because they don't have pooling or fully connected layers.

[1] http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600 [2] https://github.com/ankitaggarwal011/PyCNN/blob/master/cnnimg...
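For concreteness, the "it's just a convolution with a configurable kernel" reading can be sketched like this (the image and the 3x3 kernel below are made up for illustration, not taken from PyCNN's source):

```python
import numpy as np
from scipy.signal import convolve2d

image = np.random.rand(64, 64)            # stand-in for a grayscale image
kernel = np.array([[-1.0, -1.0, -1.0],
                   [-1.0,  8.0, -1.0],
                   [-1.0, -1.0, -1.0]])   # hypothetical edge-style template

# Essentially the operation the comment describes: a single 2-D
# convolution of the image with a user-specified kernel.
result = convolve2d(image, kernel, mode="same", boundary="symm")
print(result.shape)  # (64, 64)
```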


Hi, author here. Even though I was on HN, I didn't realize that this was posted over here :) I'm glad about all the feedback. Thank you.

> I'm reading up on Cellular Neural Networks and it seems like they are simple convolutions, of which you can specify the kernel.

Actually, it's more than that. Simply put, cellular neural networks are a parallel computing paradigm similar to neural networks, with the difference that communication is allowed only between neighboring units [1].

> In fact, this library is just calling scipy.signal.convolve2d() with different kernels.

The part you're referring to performs the convolution between the kernel function and the feedback template to get the result of the feedback loop. Please note that the kernel function is sigmoidal (or an approximation of it) and remains unchanged.

It will be easier to understand if you'll visualize it as a control system as shown in [2] with a feedback template and a control template. These templates (coefficients) are configurable and produce different results for different configurations.
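That control-system view can be sketched roughly as follows. This is an illustrative implementation of the standard Chua-Yang state equation from the 1988 paper, with a simple Euler integrator and made-up templates; it is not PyCNN's actual code:

```python
import numpy as np
from scipy.signal import convolve2d

def cnn_step(x, u, A, B, z, dt=0.1):
    """One Euler step of the standard CNN state equation:
        dx/dt = -x + A * y + B * u + z
    where y = 0.5 * (|x + 1| - |x - 1|) is the piecewise-linear
    output (the sigmoid approximation), A is the feedback template,
    B is the control template, and z is the bias."""
    y = 0.5 * (np.abs(x + 1) - np.abs(x - 1))
    dx = (-x
          + convolve2d(y, A, mode="same")
          + convolve2d(u, B, mode="same")
          + z)
    return x + dt * dx

# Hypothetical templates for illustration only; real template
# libraries tune these coefficients per task.
A = np.array([[0.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 0.0]])
B = np.array([[-1.0, -1.0, -1.0], [-1.0, 8.0, -1.0], [-1.0, -1.0, -1.0]])
u = np.random.rand(32, 32)   # input image
x = np.zeros_like(u)         # initial state
for _ in range(50):          # run the dynamics toward equilibrium
    x = cnn_step(x, u, A, B, z=-0.5)
print(x.shape)  # (32, 32)
```

Changing the A/B coefficients is what produces the different behaviors (edge extraction, hole filling, etc.) for different configurations.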

One of the applications of these networks is image processing as stated in [3] "CNN processors were designed to perform image processing; specifically, the original application of CNN processors was to perform real-time ultra-high frame-rate (>10,000 frame/s) processing unachievable by digital processors needed for applications like particle detection in jet engine fluids and spark-plug detection.".

[1] https://en.wikipedia.org/wiki/Cellular_neural_network

[2] http://www.isiweb.ee.ethz.ch/haenggi/CNN_web/CNN_figures/blo...

[3] https://en.wikipedia.org/wiki/Cellular_neural_network#Applic...


Hi there. I'm completely unfamiliar with CNN. Do they relate to Cellular Automata in any way other than sharing part of a name? And of course the fact that they communicate only with neighbors. Are there patterns of emergent behavior in the classical sense, or is it more closely related to neural networks?


Nice catch! It is indeed closely related to cellular automata. CNN processors could be thought of as a hybrid between ANN (artificial neural networks) and CA (continuous/cellular automata) [1]. Like neural networks, they are large-scale nonlinear analog circuits that process signals in real time. Like cellular automata, they consist of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors [2].

The topology and dynamics of CNN processors closely resemble those of CA. Like most CNN processors, CA consist of a fixed number of identical processors that are spatially discrete and topologically uniform. The difference is that most CNN processors are continuous-valued whereas CA are discrete-valued [1].
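A toy sketch of that discrete-vs-continuous distinction (the grids, rules, and constants here are all illustrative assumptions, not from either literature): a CA cell jumps between {0, 1} by a rule on its neighborhood, while a CNN cell's real-valued state relaxes smoothly.

```python
import numpy as np

rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(16, 16))    # CA: discrete states {0, 1}
state = rng.uniform(-1, 1, size=(16, 16))   # CNN: continuous states

def neighbor_sum(a):
    # Sum of the 8 neighbors via shifted copies,
    # periodic (wrap-around) boundary via np.roll.
    s = np.zeros(a.shape)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            s += np.roll(np.roll(a, di, axis=0), dj, axis=1)
    return s

# CA update (Game-of-Life rule): a discrete jump each time step.
n = neighbor_sum(grid)
grid = ((n == 3) | ((grid == 1) & (n == 2))).astype(int)

# CNN-style update: continuous relaxation toward a saturated
# function of the neighborhood, one small Euler step at a time.
dt = 0.1
state = state + dt * (-state + np.tanh(neighbor_sum(state) / 8))
```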

You can also read this: http://www.nims.go.jp/nanophys6/Anirban%20Bandyopadhyay/site...

[1] https://en.wikipedia.org/wiki/Cellular_neural_network#Relate...

[2] http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7600


For the people interested in reading about different template configurations and their results, this is a great resource:

[1] http://cnn-technology.itk.ppke.hu/Template_library_v3.1.pdf

[2] http://cnn-technology.itk.ppke.hu/Template_library_v4.0alpha...


The name CNN is quite unfortunate, as it is most often used for convolutional neural networks.


Speaking of unfortunate naming, there already exists a neural network package called PyCNN, which binds Python to the neural network library CNN, named so because it is a neural network library written in C++.

https://github.com/clab/cnn


Actually, it's more related to the original naming: the term CNN was adopted for cellular neural networks first. Convolutional neural networks were originally referred to as ConvNets (though the term CNN is now used for them as well).

IMHO, I don't think this naming is misleading, since CNN was originally used to refer to cellular neural networks.


The illustration looks like a 3x3 convolution too.


Those examples seem like standard image processing to me, could someone explain why a neural network is useful for these?


Please take a look at the image processing specific applications [1] and advantages of the cellular neural networks.

[1] https://en.wikipedia.org/wiki/Cellular_neural_network#Applic...


the tl;dr version is: cnns or cellnets (how's that for a name) offer speedier image processing with lower computational costs?

am i right?


Yes, this is one of the applications of these networks.


> Yes, this is one of the applications of these networks.

Thanks, would you mind elaborating more on the other advantages? I did read the wikipedia link and other links you posted, however it's not entirely clear what other benefits exist. May I suggest writing a blog post about what you perceive to be the benefits of CNN (cellular networks) vis-a-vis the other CNNs.


Thanks!


For those interested in studying the dynamics of networks with lateral/recurrent connections, Stephen Grossberg pretty much wrote the book on mathematically analyzing these systems: http://www.scholarpedia.org/article/Recurrent_neural_network....

A number of really interesting properties emerge like automatic gain control and contrast enhancement when you include network properties similar to what is seen in the brain.


Hmm, I wonder if this could be adapted to be a good VapourSynth filter.


Making a play for the CNN TLA I see...



