Abstract
In the present study, a systematic statistical analysis of word usage in a continuous Garhwali speech corpus has been performed. The Garhwali words were drawn from diverse sources, viz. newspapers, storybooks, poems, song lyrics and magazines. The analysis shows a quantitative relation between the role of content words in Garhwali and the Shannon information entropy [S] defined by the probability distribution of word occurrences. Very little research has so far been conducted on the Garhwali language, and there is no previous work on its syntactic structure. A finite continuous corpus of Garhwali was taken, and the word frequencies were found to follow an inverse power law, i.e. Zipf's law, with an exponent very close to 1.
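The two quantities named in the abstract, the Shannon entropy S of the word distribution and the Zipf exponent of the rank-frequency curve, can be sketched in a few lines. The snippet below is a minimal illustration on a toy English word list (the study itself uses a Garhwali corpus), not the authors' implementation; the function names and sample corpus are hypothetical.

```python
import math
from collections import Counter

def shannon_entropy(words):
    """S = -sum(p_i * log2 p_i) over the empirical word distribution."""
    counts = Counter(words)
    n = len(words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_exponent(words):
    """Least-squares slope of log(frequency) vs log(rank).

    Zipf's law predicts a slope close to -1.
    """
    freqs = sorted(Counter(words).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Toy stand-in corpus; the paper's data are Garhwali text.
corpus = "the cat sat on the mat the dog sat on the log the cat ran".split()
print(round(shannon_entropy(corpus), 3))
print(round(zipf_exponent(corpus), 3))
```

On a real corpus one would tokenize the raw text first; the entropy grows with vocabulary diversity, while the fitted slope approaching -1 is the signature of Zipf's law reported in the abstract.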
References
Lawnik M. Shannon's entropy in literary works and their translations. http://computer.scientific-journal.com/articles/1/23.pdf
Nigam K, Lafferty J, McCallum A (1999) Using maximum entropy for text classification. In: IJCAI-99 workshop on machine learning for information filtering, vol 1
Papadimitriou C, Karamanos K, Diakonos FK, Constantoudis V, Papageorgiou H (2010) Entropy analysis of natural language written texts. Physica A 389(16):3260–3266
Shannon CE (1948) A mathematical theory of communication. Bell Syst Tech J 27:379–423
Kalimeri M, Constantoudis V, Papadimitriou C, Karamanos K, Diakonos FK, Papageorgiou H (2012) Entropy analysis of word-length series of natural language texts: Effects of text language and genre. Int J Bifurcat Chaos 22(09):1250223
Ebeling W, Pöschel T (1994) Entropy and long-range correlations in literary English. EPL (Europhys Lett) 26(4):241
Kalimeri M, Constantoudis V, Papadimitriou C, Karamanos K, Diakonos FK, Papageorgiou H (2015) Word-length entropies and correlations of natural language written texts. J Quant Linguist 22(2):101–118
Montemurro MA, Zanette DH (2002) Entropic analysis of the role of words in literary texts. Adv Complex Syst 5(01):7–17
Ospanova R (2013) Calculating information entropy of language texts. World Appl Sci J 22(1):41–45
Zipf GK (1949) Human behavior and the principle of least effort. Addison-Wesley, Reading, MA
Ferrer i Cancho R, Solé RV (2003) Least effort and the origins of scaling in human language. Proc Natl Acad Sci USA 100(3):788–791
Devadoss S, Luckstead J, Danforth D, Akhundjanov S (2016) The power law distribution for lower tail cities in India. Physica A 15(442):193–196
Riyal MK, Rajput NK, Khanduri VP, Rawat L (2016) Rank-frequency analysis of characters in Garhwali text: emergence of Zipf’s law. Curr Sci 110(3):429–434
Manin DY (2009) Mandelbrot’s model for Zipf’s law: can Mandelbrot’s model explain Zipf’s law for language? J Quant Linguist 16(3):274–285
Montemurro MA (2001) Beyond the Zipf-Mandelbrot law in quantitative linguistics. Physica A 300(3–4):567–578
http://e-agazineofuttarakhand.blogspot.in/2009/10/garhwali-kumaoni-himalayan-literature_6723.html
http://www.language-archives.org/language/gbm#language_descriptions
© 2021 Springer Nature Singapore Pte Ltd.
Cite this paper
Riyal, M.K., Upadhyay, R.K., Kumar, S. (2021). Entropic Analysis of Garhwali Text. In: Singh, M., Rafat, Y. (eds) Recent Developments in Acoustics. Lecture Notes in Mechanical Engineering. Springer, Singapore. https://doi.org/10.1007/978-981-15-5776-7_3
Print ISBN: 978-981-15-5775-0
Online ISBN: 978-981-15-5776-7