RNN with a recurrent output layer for learning of naturalness

Ján Dolinský, Hideyuki Takagi

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    The behavior of recurrent neural networks with a recurrent output layer (ROL) is described mathematically, and it is shown that using an ROL is not only advantageous but in fact crucial to obtaining satisfactory performance in the proposed naturalness learning. Conventional belief holds that employing an ROL often substantially degrades a network's performance or renders the network unstable, so ROLs are consequently rarely used. The objective of this paper is to demonstrate that there are cases where an ROL is necessary; the concrete example shown models naturalness in handwritten letters.
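
    The core architectural idea named in the abstract, an output layer that feeds its own previous activation back into itself, can be sketched in a few lines. The NumPy snippet below is a minimal illustrative sketch, not the authors' implementation; the weight names (W_xh, W_hh, W_hy, W_yy), layer sizes, and the tanh nonlinearity are all assumptions made for the example.

        import numpy as np

        # Minimal sketch (an assumption, not the paper's code) of an Elman-style RNN
        # whose output layer is itself recurrent: y_t depends on y_{t-1} as well as h_t.
        rng = np.random.default_rng(0)

        n_in, n_hidden, n_out = 3, 8, 2
        W_xh = rng.normal(scale=0.1, size=(n_hidden, n_in))      # input  -> hidden
        W_hh = rng.normal(scale=0.1, size=(n_hidden, n_hidden))  # hidden -> hidden (standard recurrence)
        W_hy = rng.normal(scale=0.1, size=(n_out, n_hidden))     # hidden -> output
        W_yy = rng.normal(scale=0.1, size=(n_out, n_out))        # output -> output (the ROL feedback)

        def forward(xs):
            """Run the network over a sequence xs of input vectors."""
            h = np.zeros(n_hidden)
            y = np.zeros(n_out)
            ys = []
            for x in xs:
                h = np.tanh(W_xh @ x + W_hh @ h)  # hidden layer: usual recurrent update
                y = np.tanh(W_hy @ h + W_yy @ y)  # output layer: recurrent in its own output (ROL)
                ys.append(y)
            return np.array(ys)

        seq = rng.normal(size=(5, n_in))  # dummy 5-step input sequence
        print(forward(seq).shape)         # (5, 2): one output vector per time step

    The W_yy @ y term is what distinguishes a recurrent output layer from a conventional feedforward one; setting W_yy to zero recovers a standard Elman network.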

    Original language: English
    Title of host publication: Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers
    Pages: 248-257
    Number of pages: 10
    Edition: PART 1
    DOIs
    Publication status: Published - 2008
    Event: 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu, Japan
    Duration: Nov 13 2007 – Nov 16 2007

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 1
    Volume: 4984 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Other

    Other: 14th International Conference on Neural Information Processing, ICONIP 2007
    Country/Territory: Japan
    City: Kitakyushu
    Period: 11/13/07 – 11/16/07

    All Science Journal Classification (ASJC) codes

    • Theoretical Computer Science
    • Computer Science (all)
