Patel ND, Nguang SK, Coghill GG. Neural network implementation using bit streams. IEEE Transactions on Neural Networks 2007;18:1488-1504. [PMID: 18220196] [DOI: 10.1109/TNN.2007.895822]
Abstract
A new method for the parallel hardware implementation of artificial neural networks (ANNs) using digital techniques is presented. Signals are represented using uniformly weighted single-bit streams. Techniques for generating bit streams from analog or multibit inputs are also presented. This single-bit representation offers significant advantages over multibit representations since it mitigates the fan-in and fan-out issues that are typical of distributed systems. To process these bit streams using ANN concepts, functional elements that perform summing, scaling, and squashing have been implemented. These elements are modular and have been designed so that they can be easily interconnected. Two new architectures that act as monotonically increasing differentiable nonlinear squashing functions are also presented. Using these functional elements, a multilayer perceptron (MLP) can be easily constructed. Two examples successfully demonstrate the use of bit streams in the implementation of ANNs. Since every functional element is individually instantiated, the implementation is genuinely parallel. The results clearly show that this bit-stream technique is viable for the hardware implementation of a variety of distributed systems and for ANNs in particular.
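The sketch below is a minimal software illustration of the general bit-stream idea the abstract describes: a value is represented as a uniformly weighted single-bit stream, and scaling and summing reduce to simple bitwise operations. It is a generic stochastic-computing style example under a unipolar encoding assumption, not the authors' hardware functional elements or their two squashing-function architectures; the function names (encode, scale, scaled_add) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, n_bits):
    """Encode a value x in [0, 1] as a uniformly weighted single-bit stream:
    each bit is 1 with probability x (unipolar encoding assumed)."""
    return (rng.random(n_bits) < x).astype(np.uint8)

def decode(stream):
    """Recover the represented value as the mean of the bit stream."""
    return stream.mean()

def scale(stream_a, stream_b):
    """Scaling (multiplication) of two independent unipolar streams
    reduces to a bitwise AND gate."""
    return stream_a & stream_b

def scaled_add(stream_a, stream_b):
    """Summing via a 2-to-1 multiplexer with a random select line:
    the output stream represents (a + b) / 2."""
    select = encode(0.5, len(stream_a))
    return np.where(select == 1, stream_a, stream_b)

if __name__ == "__main__":
    N = 100_000                      # longer streams -> lower variance
    a, b = encode(0.6, N), encode(0.3, N)
    print(decode(scale(a, b)))       # ~ 0.18 (0.6 * 0.3)
    print(decode(scaled_add(a, b)))  # ~ 0.45 ((0.6 + 0.3) / 2)
```

As in the hardware setting described by the abstract, accuracy here depends on stream length: longer streams lower the variance of the decoded value at the cost of latency.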