Power-law regularities in human language
Department of Physics, Faculty of Science, Noshirvani University of Technology, Babol, Iran
Received: 8 July 2016
Received in final form: 31 August 2016
Published online: 9 November 2016
The complex structure of human language enables us to exchange highly complicated information. This communication system obeys some common nonlinear statistical regularities. We investigate four important long-range features of human language, performing our calculations on selected works of seven famous writers. Zipf’s law and Heaps’ law, two well-known power-law behaviors, are established in human language and show a qualitative inverse relation with each other. Furthermore, the informational content associated with the ordering of words is measured using an entropic metric. We also calculate the fractal dimension of words in a text using the box-counting method. The fractal dimension of each word, a positive value less than or equal to one, characterizes its spatial distribution in the text. Overall, we can claim that human language follows the mentioned power-law regularities. These power-law relations imply the existence of long-range correlations between the word types used to convey a particular idea.
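Three of the quantities named in the abstract can be estimated directly from a tokenized text. The following is a minimal sketch, assuming a plain list of word tokens; the function names and the simple log-log least-squares fits are illustrative choices, not the authors' actual code:

```python
import math
from collections import Counter

def _slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def zipf_heaps(tokens):
    """Estimate the Zipf and Heaps exponents of a token stream.

    Zipf's law: the frequency of the r-th most common word ~ r^(-alpha).
    Heaps' law: the number of distinct words V(n) ~ n^beta.
    Both exponents are read off as slopes in log-log coordinates.
    """
    freqs = sorted(Counter(tokens).values(), reverse=True)
    xs = [math.log(r + 1) for r in range(len(freqs))]
    ys = [math.log(f) for f in freqs]
    alpha = -_slope(xs, ys)          # frequency falls with rank

    seen, growth = set(), []         # vocabulary growth V(n)
    for w in tokens:
        seen.add(w)
        growth.append(len(seen))
    xs = [math.log(n + 1) for n in range(len(growth))]
    ys = [math.log(v) for v in growth]
    beta = _slope(xs, ys)
    return alpha, beta

def box_counting_dimension(positions, length):
    """Box-counting fractal dimension of one word's occurrence positions.

    Cover the text [0, length) with boxes of size s at doubling scales,
    count the boxes containing at least one occurrence, and fit
    log N(s) against log(1/s).
    """
    xs, ys = [], []
    s = 1
    while s < length:
        occupied = {p // s for p in positions}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(occupied)))
        s *= 2
    return _slope(xs, ys)
```

A word that occurs at every position yields a dimension of one, while a word confined to a few isolated positions yields a value near zero, consistent with the range stated in the abstract.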
Key words: Statistical and Nonlinear Physics
© EDP Sciences, Società Italiana di Fisica, Springer-Verlag, 2016