In the 1980s, I was really into NN technology: spent a year on a DARPA advisory panel, implemented all 12 neural models in the SAIC ANSim product, and advised on several interesting projects (e.g., a bomb detector that worked well, and image analysis).
I started to read Jeff Hawkins's most interesting book "On Intelligence" last night, and that got me thinking and reminiscing about NNs. Jeff mentioned the first IEEE neural network conference in 1987: I presented a paper there on phoneme recognition with NNs, and I also manned the SAIC NN booth. Lots of fun!!
One of the most interesting neural models that I used was Adaptive Resonance Theory (ART), developed by Stephen Grossberg and Gail Carpenter (his wife). Gail kindly spent some time on the telephone with me to make sure I got it right - very helpful, and very interesting technology.
I think that Jeff Hawkins's book will be very influential, and his hierarchical model of the cortex definitely reminds me of ART. I have often thought that, if I did not have to work for a living, I would be quite happy trying to make a theoretical contribution by using hierarchical associative memory models to build an AI Go player. I also like Hawkins's emphasis on temporal processing, something often ignored outside of recurrent NNs.