Nuzhi Meyen
1 min read · Nov 5, 2017


Firstly, thank you for pointing me to the paper; I had not seen it before. However, the paper is predicated on Chomsky's claim that context-free grammars (CFGs) and probabilistic context-free grammars (PCFGs) are better suited to modeling natural language. (There are alternative views based on regular grammars, which are what HMMs model.) HMMs perform reasonably well when the assumption of a regular grammar holds. According to the paper, when measured using mutual information (MI), LSTMs outperform HMMs on data generated by a PCFG. Either way, whether you model language with PCFGs or with regular-grammar HMMs, probability is still involved, so your heading is still a trifle misleading.
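To make the PCFG side of that comparison concrete, here is a minimal sketch of sampling strings from a toy probabilistic context-free grammar. The grammar, its symbols, and its probabilities are invented for illustration; they are not taken from the paper.

```python
import random

# Toy PCFG (invented for illustration): each nonterminal maps to a list of
# (probability, expansion) rules. Symbols absent from the table are terminals.
PCFG = {
    "S":  [(0.7, ["NP", "VP"]), (0.3, ["VP"])],
    "NP": [(0.6, ["the", "N"]), (0.4, ["N"])],
    "VP": [(1.0, ["V", "NP"])],
    "N":  [(0.5, ["dog"]), (0.5, ["cat"])],
    "V":  [(0.5, ["sees"]), (0.5, ["chases"])],
}

def sample(symbol="S", rng=random):
    """Recursively expand `symbol` into a list of terminal words."""
    if symbol not in PCFG:  # terminal: emit the word itself
        return [symbol]
    r, acc = rng.random(), 0.0
    for prob, expansion in PCFG[symbol]:
        acc += prob
        if r <= acc:
            return [w for s in expansion for w in sample(s, rng)]
    # Guard against floating-point rounding: fall back to the last rule.
    return [w for s in PCFG[symbol][-1][1] for w in sample(s, rng)]

sentence = " ".join(sample())
```

A corpus of such samples is the kind of dataset on which the paper's MI comparison between LSTMs and HMMs would be run; the center-embedding that a CFG allows (here, NP inside VP inside S) is exactly what a regular grammar, and hence a plain HMM, cannot capture in general.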

