If you are working with Neural Networks, "Perceptrons" is perhaps one of the books you should definitely read. It is the first systematic study of parallelism in computation and remained a classic work on threshold automata networks for nearly two decades. The connectionist revolution, which was overshadowed by the symbolic era for a while, started when this book came out. Marvin Minsky (co-founder of the MIT AI Lab) and Seymour Papert carried out a great deal of mathematical analysis of the original "perceptron". The "neuron" in an Artificial Neural Network owes its existence to the concept of the perceptron. While the content is excellent, you might find the language a bit difficult to follow, a major reason being that many of the concepts we use seamlessly today emerged only after this book was published.