Reducing Vector Space Dimensionality in Automatic Classification for Authorship Attribution


  • Antonio Rico Sulayes, Universidad de las Américas Puebla

Keywords:

Vector space modelling, Classifying features, Feature reduction


For automatic classification, having too many classificatory features has twofold implications. On the one hand, some features may not help discriminate between classes and should be removed from the classification. On the other hand, redundant features may produce negative effects as their number grows, and their detrimental impact should be minimized or limited. In text classification tasks, where word and word-derived features are commonly employed, the number of distinct features extracted from text samples can grow quickly. In the specific context of authorship attribution, a number of traditionally used features, such as n-grams or word sequences, can produce long lists of distinct features, the great majority of which have very few instances. Previous research has shown that in this task feature reduction can outperform noise-tolerant algorithms in solving the issues associated with an abundance of classificatory features. However, there has been no attempt to explain the motivation behind this solution. This article shows that even in the small data collections characteristically used in authorship attribution, the frequency rank of common elements remains stable as their instances accumulate, while novel, uncommon words are constantly found. Given this general vocabulary property, present even in very small text collections, techniques that reduce vector space dimensionality are especially beneficial across the various experimental settings typical of this task. These findings may also be helpful for other automatic classification tasks with similar conditions.
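The kind of frequency-based feature reduction the abstract alludes to can be sketched briefly. The toy corpus, the `MIN_FREQ` cutoff, and all variable names below are illustrative assumptions, not the article's actual data or method; the point is only that most word types sit in the low-frequency tail, so a small frequency threshold removes most vector space dimensions.

```python
from collections import Counter

# Toy corpus; in authorship attribution these would be text samples per author.
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "a cat and a dog met on the mat",
]

# Count every word token across the corpus.
counts = Counter(w for doc in docs for w in doc.split())

# Many word types occur only once (the long tail of the frequency
# distribution), so even a small minimum-frequency cutoff discards
# a large share of the candidate features.
MIN_FREQ = 2  # illustrative cutoff, not a value from the article
vocab = sorted(w for w, c in counts.items() if c >= MIN_FREQ)

print(f"{len(counts)} word types before filtering")
print(f"{len(vocab)} word types kept: {vocab}")
```

On this toy corpus, 3 of the 10 word types are hapax legomena and are dropped, leaving a 7-dimensional vector space; on real collections the tail is far larger, which is why such reduction pays off.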

Author Biography

Antonio Rico Sulayes, Universidad de las Américas Puebla

Full-Time Associate Professor, Department of Languages; Red Temática de Tecnologías del Lenguaje, CONACYT