It has been known for quite a while that the Vapnik-Chervonenkis dimension
(VC-dimension) of a feedforward neural net with linear threshold gates is at
most O(w log w), where w is the total number of weights in the
neural net. We show in this paper that this bound is in fact asymptotically
optimal. More precisely, we exhibit for any depth d ≥ 3 a large class of
feedforward neural nets of depth d with w weights that have
VC-dimension Ω(w log w)
. This lower bound holds even if the
inputs are restricted to Boolean values. The proof of this result relies on a
new method that allows us to encode more "program-bits" in the weights of a
neural net than previously thought possible.
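
For clarity, the matching bounds stated above can be summarized as follows (a restatement under the abstract's own claims; the symbols N and N_d are illustrative notation introduced here, not the paper's):

\[
\operatorname{VCdim}(N) = O(w \log w) \quad \text{for every feedforward linear threshold net } N \text{ with } w \text{ weights},
\]
\[
\operatorname{VCdim}(N_d) = \Omega(w \log w) \quad \text{for the constructed nets } N_d \text{ of depth } d \ge 3 \text{ with } w \text{ weights},
\]

so the VC-dimension of the constructed nets is \(\Theta(w \log w)\), matching the upper bound up to a constant factor.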