dennissm/mtgru

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

8 Commits
 
 
 
 

Repository files navigation

Multiple Timescale Gated Recurrent Unit (MTGRU)

This is the original implementation of the MTGRU, developed at the Artificial Brain Research Lab, School of Electronics Engineering, Kyungpook National University, Daegu, South Korea.

Figure: A Multiple Timescale Gated Recurrent Unit.

Equation

Figure: The MTGRU update equations.
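Since the equation image does not render here, the following is a minimal NumPy sketch of a single MTGRU step, assuming the formulation described in the referenced papers: a standard GRU update that is leaky-integrated with the previous hidden state at a fixed timescale constant `tau` (with `tau = 1` recovering the plain GRU). The function and parameter names are illustrative, not taken from this repository's code; check the papers for the exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mtgru_step(x, h_prev, params, tau):
    """One MTGRU step (illustrative sketch, not the repo's API).

    A GRU update interpolated with the previous state at timescale
    tau >= 1: larger tau makes the layer update more slowly.
    """
    Wr, Ur, br, Wz, Uz, bz, Wh, Uh, bh = params
    r = sigmoid(x @ Wr + h_prev @ Ur + br)               # reset gate
    z = sigmoid(x @ Wz + h_prev @ Uz + bz)               # update gate
    h_tilde = np.tanh(x @ Wh + (r * h_prev) @ Uh + bh)   # candidate state
    h_gru = (1.0 - z) * h_prev + z * h_tilde             # standard GRU update
    # Multiple-timescale leaky integration with constant tau.
    return h_gru * (1.0 / tau) + (1.0 - 1.0 / tau) * h_prev
```

Stacking layers with increasing `tau` values gives the slow-to-fast temporal hierarchy that the papers describe.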

Requirements

  • TensorFlow 0.9
  • Python 2.x

References

  • Minsoo Kim, Dennis Singh Moirangthem, and Minho Lee. 2016. Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization. In Proceedings of the 1st Workshop on Representation Learning for NLP. Association for Computational Linguistics, pages 70–77.
  • Dennis Singh Moirangthem and Minho Lee. 2017. Temporal Hierarchies in Multilayer Gated Recurrent Neural Networks for Language Models. In 2017 International Joint Conference on Neural Networks (IJCNN). IEEE, pages 2152–2157.
  • Dennis Singh Moirangthem, Jegyung Son, and Minho Lee. 2017. Representing Compositionality Based on Multiple Timescales Gated Recurrent Neural Networks with Adaptive Temporal Hierarchy for Character-Level Language Models. In Proceedings of the 2nd Workshop on Representation Learning for NLP. Association for Computational Linguistics, pages 131–138.
