Advanced topics

Still under construction!

  1. Hidden Markov models
    • What is a hidden Markov model?
    • What are typical applications of HMMs in NLP?
    • What are typical tasks that can be solved by using an HMM?
    • What kind of training data is required?
    • How can a hidden Markov model be trained?
    • What kinds of algorithmic approaches are needed to solve the different HMM tasks? (See the Viterbi sketch below the list.)
  2. Probabilistic context-free grammars (optional)
    • How can a context-free grammar be extended to a probabilistic model?
    • How can the rule probabilities be estimated? (See the estimation sketch below the list.)
    • How can the most probable parse tree be determined?
  3. Recurrent neural language models
    • How can the network of word2vec be modified for language modeling?
    • What, in contrast, is a typical architecture of a recurrent neural language model? (See the sketch below the list.)
    • What are the advantages of recurrent networks compared to the window-based predictions used in word2vec?
    • How can a recurrent model be extended to sequence-to-sequence transformation?
    • What are applications of sequence-to-sequence models?
    • What are the limitations of recurrent language models?
  4. Long Short-Term Memory and Attention
    • What's a long short-term memory?
    • What's an attention mechanism? (See the sketch below the list.)
    • Why are they needed? Which benefits do they provide?
  5. Representation learning
    • What are pretrained representations?
    • What are multi-layer architectures for representation learning?
    • What are typical applications for pretrained representations?
  6. Character-based neural models
  7. Neural models for speech recognition
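
To illustrate one of the algorithmic approaches asked about under topic 1, here is a minimal sketch of Viterbi decoding for a first-order HMM (e.g. for POS tagging). The dictionaries start, trans and emit hold log probabilities; their names and structure are illustrative assumptions, not material from the course.

def viterbi(words, states, start, trans, emit):
    """Return the most probable state sequence for the observed words."""
    NEG_INF = float("-inf")
    # best[t][s]: log probability of the best path ending in state s at position t
    best = [{s: start[s] + emit[s].get(words[0], NEG_INF) for s in states}]
    back = [{}]
    for t in range(1, len(words)):
        best.append({})
        back.append({})
        for s in states:
            # pick the best predecessor state for s
            prev, score = max(((p, best[t - 1][p] + trans[p][s]) for p in states),
                              key=lambda x: x[1])
            best[t][s] = score + emit[s].get(words[t], NEG_INF)
            back[t][s] = prev
    # follow the back-pointers from the best final state
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(words) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))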
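
For the question on estimating PCFG probabilities (topic 2), a standard approach over a treebank is relative-frequency estimation: P(A → β) = count(A → β) / count(A). A minimal sketch, assuming the rule counts have already been collected into a dictionary keyed by (lhs, rhs) pairs (a hypothetical input format):

from collections import defaultdict

def estimate_pcfg(rule_counts):
    """Relative-frequency estimates: P(A -> beta) = count(A -> beta) / count(A)."""
    lhs_totals = defaultdict(int)
    for (lhs, _), n in rule_counts.items():
        lhs_totals[lhs] += n
    return {(lhs, rhs): n / lhs_totals[lhs] for (lhs, rhs), n in rule_counts.items()}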
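
For the question on a typical recurrent language-model architecture (topic 3), here is a minimal PyTorch sketch: an embedding layer, a recurrent layer (an LSTM in this case), and an output projection over the vocabulary that feeds a softmax. The class name and hyperparameters are illustrative assumptions, not the architecture discussed in the lecture.

import torch.nn as nn

class RNNLanguageModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, state=None):
        # token_ids: (batch, seq_len) integer tensor; returns next-word logits
        embedded = self.emb(token_ids)
        hidden, state = self.rnn(embedded, state)
        return self.out(hidden), state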
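
For the question on attention mechanisms (topic 4), a minimal NumPy sketch of scaled dot-product attention: every query vector is turned into a weighted combination of the value vectors, with the weights obtained by a softmax over query-key similarities. Variable names and shapes are illustrative assumptions.

import numpy as np

def attention(queries, keys, values):
    """queries: (m, d), keys: (n, d), values: (n, d) -> (m, d) context vectors."""
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])      # similarity matrix (m, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
    return weights @ values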

-- WolfgangMenzel - 07 Apr 2022
 