Entropic Analysis of Garhwali Text

  • Conference paper

Part of the book series: Lecture Notes in Mechanical Engineering (LNME)

Abstract

In the present study, a systematic statistical analysis of word use in a continuous Garhwali speech corpus has been performed. The Garhwali words in the corpus were drawn from several sources, viz. newspapers, storybooks, poems, song lyrics and magazines, and the analysis shows a quantitative relation between the role of content words in Garhwali and the Shannon information entropy S defined by their probability distribution. So far, very little research has been conducted on the Garhwali language, and little is known about its syntactic structure. We have taken a finite continuous corpus of Garhwali; the word occurrences (frequencies) follow an approximately inverse power-law distribution, i.e. Zipf's law, with an exponent very close to 1.
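The two quantities the abstract rests on, the Shannon entropy S of the word probability distribution and the Zipf exponent of the rank-frequency curve, can be sketched in a few lines of Python. This is an illustrative sketch on a toy English word list, not the authors' pipeline, and the Garhwali corpus itself is not reproduced here; under Zipf's law the fitted slope of log(frequency) versus log(rank) is expected to be close to −1.

```python
import math
from collections import Counter

def shannon_entropy(words):
    """Shannon information entropy S = -sum p_i * log2(p_i) over word probabilities."""
    counts = Counter(words)
    n = sum(counts.values())
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def zipf_exponent(words):
    """Least-squares slope of log(frequency) vs. log(rank); Zipf's law predicts about -1."""
    freqs = sorted(Counter(words).values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least-squares slope through the log-log points.
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

corpus = "the cat sat on the mat and the dog sat on the log".split()
print(round(shannon_entropy(corpus), 3))   # entropy in bits
print(round(zipf_exponent(corpus), 3))     # negative slope; near -1 for Zipfian text
```

For a real corpus the word list would be obtained by tokenising the collected texts; the toy list here merely shows the shape of the computation.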



Author information

Correspondence to Manoj Kumar Riyal.


Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Riyal, M.K., Upadhyay, R.K., Kumar, S. (2021). Entropic Analysis of Garhwali Text. In: Singh, M., Rafat, Y. (eds) Recent Developments in Acoustics. Lecture Notes in Mechanical Engineering. Springer, Singapore. https://doi.org/10.1007/978-981-15-5776-7_3

  • DOI: https://doi.org/10.1007/978-981-15-5776-7_3

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-5775-0

  • Online ISBN: 978-981-15-5776-7

  • eBook Packages: Engineering (R0)
