In this paper, I include a wide range of grammatical and syntactic complexity features to test the extent to which different features can predict grades in upper-secondary student writing. The data consists of a selection of graded example texts (n=142), provided by the Swedish National Agency for Education (SNAE) to teachers as examples of how to assess the tests, and of texts graded by teachers (n=190) during the actual exams. Grammatical and syntactic features identified by the SNAE as positively influencing grades include, for example, varied sentence structure and the use of conjunctions. The aim of the paper is thus to better understand how the SNAE and teachers assess grammatical and syntactic complexity and how this is reflected in the grades that different texts receive. It is important to note that I am interested in identifying features that show a stable distribution in sequential order across grades, i.e. features that exhibit a cline from A to F, or from F to A. Preliminary results show that very few grammatical complexity features predict grades in any meaningful way, suggesting that grammatical and syntactic complexity is largely overlooked in the assessment of national tests in Sweden. Moreover, the SNAE and teachers appear to value different aspects of grammatical and syntactic complexity. In the paper, I will also discuss the general implications of the findings for the assessment of national tests, and the ways in which teachers can be supported in their assessment work moving forward.
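To make the cline criterion concrete, the sketch below illustrates one way such a check could be operationalised: a feature counts as grade-predictive only if its mean value rises or falls monotonically across the ordered grades F to A. This is a minimal illustration, not the paper's actual analysis; the feature name and per-grade values are hypothetical placeholders.

```python
# Minimal sketch of the cline criterion (hypothetical data, not the study's results).
from scipy.stats import spearmanr

GRADES = ["F", "E", "D", "C", "B", "A"]  # Swedish grading scale, lowest to highest


def is_cline(mean_by_grade: dict) -> bool:
    """Return True if grade means are strictly monotonic from F to A (in either direction)."""
    values = [mean_by_grade[g] for g in GRADES]
    diffs = [b - a for a, b in zip(values, values[1:])]
    return all(d > 0 for d in diffs) or all(d < 0 for d in diffs)


# Hypothetical per-grade means for an illustrative complexity feature
subordination_ratio = {"F": 0.21, "E": 0.24, "D": 0.27, "C": 0.29, "B": 0.33, "A": 0.36}

print(is_cline(subordination_ratio))  # True: the feature exhibits a cline from F to A

# A rank-based association between grade order and feature values can be
# reported alongside the monotonicity check.
rho, p = spearmanr(range(len(GRADES)), [subordination_ratio[g] for g in GRADES])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```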