To “know a language” is to know, in part, the rules by which individual words can be combined to make new meaningful expressions. Theories of syntax aim to specify the mental representations that constitute this knowledge. Evidence from diverse spoken and manual languages indicates that these representations are hierarchically structured and include dependencies between elements, pointing to a constrained class of rules characteristic of human language. Experimental studies show that language users recognize and interpret these representations rapidly, in real time. Debates center on the precise format of these representations and the degree to which they share fundamental, perhaps universal, properties across languages. Theories are constrained by the fact that syntax is acquired without explicit instruction by young children, who show exquisite sensitivity to the usage patterns of their language community while also inducing rules that go beyond the surface patterns of the input they receive. Standing at the intersection of multiple scholarly traditions, syntax has faced historical tensions with adjacent disciplines in the cognitive sciences. Interdisciplinary cross-fertilization is supported by open discussion of methodological practices as well as shared interest in rigorous computational accounts of human language and linguistic diversity.