Abstract

In this dissertation I investigate the neural mechanisms underlying the human ability to learn, store, and use grammatical structure, so-called syntax, in language. In doing so I incorporate insights from linguistics, cognitive psychology, and neurobiology. Linguistic research has established that the structure of nearly all languages exhibits two essential characteristics. First, language is productive: from a limited number of words and rules one can produce and understand an unlimited number of novel sentences. Second, language is hierarchical: sentences are constructed from phrases, which in turn can be constructed from other phrases, and so on. These two structural properties provide minimum requirements that any system for language processing, such as the brain, must satisfy. A first contribution of this dissertation is that it attempts to formulate these requirements as concisely as possible, allowing for a strict evaluation of existing models of neural processing in the brain (so-called neural networks). From this evaluation it is concluded that conventional types of networks (in particular so-called recurrent, fully connected networks) are unsuited for modeling language, due to certain oversimplifying assumptions. In the remainder of this thesis I therefore develop a novel type of network, based on a neural theory of syntax that does take into account the hierarchical structure of language.