Questions for Self-monitoring: RNN and LSTM
What's the difference between recurrent and recursive NNs?
What are the benefits of RNN compared to window-based feedforward NNs?
What is the most frequently used RNN architecture for NLP and what are its characteristics?
How are training and inference performed in a simple RNN? (A sketch of the forward step follows after this list.)
To which kinds of NLP tasks can RNNs be applied?
What do the training data for these tasks look like?
How does a stacked RNN architecture look like?
What's the benefit of a stacked RNN?
How does a bidirectional RNN look like?
What are the major disadvantages of a simple RNN?
[Couldn't the problem be solved by using more nodes on the hidden layer?]
How does the LSTM address these problems? (See the second sketch after this list.)
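
To make the forward computation of a simple RNN concrete, here is a minimal sketch of an Elman-style network in Python/NumPy. All sizes, names, and the tanh/softmax choices are illustrative assumptions, not taken from the course materials; full training would additionally require backpropagation through time.

<verbatim>
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, d_out = 4, 8, 3            # illustrative sizes, not from the course

# Weights are shared across all time steps
W_xh = rng.normal(scale=0.1, size=(d_hid, d_in))    # input  -> hidden
W_hh = rng.normal(scale=0.1, size=(d_hid, d_hid))   # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(d_out, d_hid))   # hidden -> output
b_h, b_y = np.zeros(d_hid), np.zeros(d_out)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def rnn_forward(xs):
    """h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h);  y_t = softmax(W_hy h_t + b_y)"""
    h = np.zeros(d_hid)                 # initial hidden state
    ys = []
    for x in xs:                        # one step per input token
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        ys.append(softmax(W_hy @ h + b_y))
    return ys

# Example: a random "sentence" of five input vectors
outputs = rnn_forward([rng.normal(size=d_in) for _ in range(5)])
print(len(outputs), outputs[0].shape)   # 5 time steps, one output distribution each
</verbatim>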
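The second sketch shows one LSTM cell step under the same illustrative assumptions (gate parameters stacked into single matrices, arbitrary sizes). Its point is that the cell state is updated additively under the control of gates, which mitigates the vanishing-gradient problem of the simple RNN above.

<verbatim>
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step.  W (4d x d_in), U (4d x d), and b (4d,) hold the
    parameters of the input, forget, and output gates and the candidate
    update, stacked along the first axis."""
    d = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0*d:1*d])             # input gate
    f = sigmoid(z[1*d:2*d])             # forget gate
    o = sigmoid(z[2*d:3*d])             # output gate
    g = np.tanh(z[3*d:4*d])             # candidate cell update
    c = f * c_prev + i * g              # additive, gated cell-state update
    h = o * np.tanh(c)                  # hidden state exposed to the next layer
    return h, c

# Example: one step with illustrative sizes
rng = np.random.default_rng(0)
d_in, d = 4, 8
h, c = np.zeros(d), np.zeros(d)
W = rng.normal(scale=0.1, size=(4*d, d_in))
U = rng.normal(scale=0.1, size=(4*d, d))
b = np.zeros(4*d)
h, c = lstm_step(rng.normal(size=d_in), h, c, W, U, b)
print(h.shape, c.shape)                 # (8,) (8,)
</verbatim>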
-- WolfgangMenzel - 06 Mar 2023