
Using an effective Boltzmann machine to learn context dependencies of a sequence

Bellgard, M.I. and Tsang, C.P. (1995) Using an effective Boltzmann machine to learn context dependencies of a sequence. In: 1995 IEEE International Conference on Neural Networks (ICNN '95), 27 November - 1 December 1995, Perth, W.A., pp. 2841-2846.

PDF - Published Version
Link to Published Version: http://dx.doi.org/10.1109/ICNN.1995.488184
*Subscription may be required

Abstract

It has been recognised that current Artificial Neural Network (ANN) systems that employ windowing techniques to learn context dependencies in sequences have many deficiencies. One variant of this type of system that attempts to overcome many of these deficiencies is the Effective Boltzmann Machine (EBM). The EBM, which is based on the Boltzmann Machine (BM), has the ability to perform completion and to provide an energy measure for the solution. In this paper, we extend the EBM and show that the system has many desirable properties. This paper reports two major improvements to the EBM. First, whereas in the past a BM is used to learn the local contexts of a sequence, we show that the EBM itself is an architecture suitable for learning local contexts. Our initial experiments show that it outperforms a window-based BM. Second, we demonstrate that, with this new training scheme, a multilayer EBM can be constructed to extend the effective window size. This means that learning contextual dependencies is not limited by the window size. This is done by utilising the state information of the first EBM hidden layer to train that of the second EBM layer. The effect of this long range effective window is demonstrated by experiments.
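To illustrate the abstract's notions of an energy measure and completion over a local-context window, the following is a minimal sketch only. It assumes the standard Boltzmann machine energy E(s) = -1/2 s^T W s - b^T s applied to a one-hot-coded window of a sequence; the alphabet, window size, and random weights are hypothetical placeholders, and the paper's actual EBM training rules and multilayer construction (which reuses the first layer's hidden-unit states to train a second layer) are not reproduced here.

import numpy as np

ALPHABET = "ACGT"          # hypothetical sequence alphabet
WINDOW = 3                 # hypothetical local-context window size


def encode_window(window):
    """One-hot encode a window of symbols into a single state vector."""
    state = np.zeros(WINDOW * len(ALPHABET))
    for i, ch in enumerate(window):
        state[i * len(ALPHABET) + ALPHABET.index(ch)] = 1.0
    return state


def energy(state, weights, bias):
    """Standard Boltzmann machine energy: lower energy = better-fitting pattern."""
    return -0.5 * state @ weights @ state - bias @ state


def complete(window, missing_pos, weights, bias):
    """Completion: try every symbol at the missing position, keep the lowest energy."""
    candidates = []
    for ch in ALPHABET:
        trial = list(window)
        trial[missing_pos] = ch
        candidates.append((energy(encode_window(trial), weights, bias), ch))
    return min(candidates)  # (energy of best completion, chosen symbol)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = WINDOW * len(ALPHABET)
    W = rng.normal(scale=0.1, size=(n, n))
    W = (W + W.T) / 2          # symmetric weights, as in a Boltzmann machine
    np.fill_diagonal(W, 0.0)   # no self-connections
    b = rng.normal(scale=0.1, size=n)

    e, symbol = complete("A?G", missing_pos=1, weights=W, bias=b)
    print(f"Best completion at position 1: {symbol} (energy {e:.3f})")

In the stacked scheme described in the abstract, the effective window would be extended by feeding the first layer's hidden-state information into a second EBM layer rather than by enlarging WINDOW itself.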

Publication Type: Conference Paper
Murdoch Affiliation: School of Mathematical and Physical Sciences
URI: http://researchrepository.murdoch.edu.au/id/eprint/17458
