
An Artificial Neural Network (ANN) in Electronic Product Design

Tom Page
Loughborough Design School, Loughborough University
Epinal Way, Loughborough, UNITED KINGDOM, LE11 3TU

Abstract:

This paper reports on the development of a wearable gesture recognition device for communicating with children that have acute communication difficulties. The device was designed with the aid of an artificial neural network (ANN) to develop and refine the physical gestures made by the user. The findings of the evaluation of the data handling are discussed with a view to improving on both its functions in gesture recognition and further development of its database.

Keywords:

Artificial Neural Network, Product Design, Gesture Recognition.

CITE THIS PAPER AS:
Tom PAGE, An Artificial Neural Network (ANN) in Electronic Product Design, Studies in Informatics and Control, ISSN 1220-1766, vol. 21 (3), pp. 259-266, 2012. https://doi.org/10.24846/v21i3y201204

1. Introduction

An ‘artificial neural network’ is, as its name suggests, a man-made system designed to mimic the cognitive ability of an animal brain. Gurney (1997) provided a neural model of the way in which artificial neural networks (ANNs) operate; it is estimated that the human brain comprises around 100 billion neurons. These neurons combine to make highly complex networks, as each neuron may have up to 10,000 connections. They communicate with each other via electrical impulses, thus making up the brain (Drachman, 2005). Interneuron connections are mediated by synapses, which are located on the ends of the dendrites that feed into the cell body, the soma. Here the synaptic inputs are integrated, or summed together, in some way, and if the resultant is greater than a threshold level the neuron will ‘fire’ a voltage impulse (Gurney, 1997). This output then feeds into the inputs of other neurons.

Unlike conventional alphanumeric, equation-based computing, artificial neural networks provide for adaptability and ‘learning’ within a computer program. However, biological and artificial neurons differ in how they make each input more or less important than the others through a weighting technique. Depending on the synaptic connection, each synapse’s electrochemical response can be polarised so that it is either excitatory (promoting synaptic firing) or inhibitory (discouraging synaptic firing). Additionally, these ‘weights’ can change over time, and this change in weight is interpreted as learning (Russell and Norvig, 2010).

In artificial neural networks, the equivalent of the neuron is called a node or unit. A simple node works on the same principles as the biological neurons just described. As artificial neurons currently operate on digital inputs, the weighting technique differs slightly. Inputs, as in conventional computing, are still essentially made up of 1s and 0s. According to Gurney (1997), to achieve ‘weighting’, each input is multiplied by its weight:

Input × Weight = Weighted Input
e.g. 1 (input) × 5 (weight) = 5 (weighted input)

These weighted inputs are then summed together and the result is compared with a threshold. If the resultant is higher than the threshold, the node produces a 1; if not, it produces a 0. This type of artificial neuron is the simplest and historically the earliest; it was put forward by McCulloch and Pitts (1943) and is known as a threshold logic unit.
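The threshold logic unit just described can be sketched in a few lines of Python. The weights and threshold below are illustrative values (not taken from the paper), chosen so that the unit computes a logical AND of two binary inputs:

```python
# Minimal sketch of a McCulloch-Pitts threshold logic unit (TLU).
# The weights and threshold are illustrative, chosen so the unit
# computes logical AND of two binary inputs.

def tlu(inputs, weights, threshold):
    """Output 1 if the weighted sum of the inputs reaches the threshold, else 0."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# With both inputs weighted 1 and a threshold of 2, only (1, 1) fires.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", tlu([a, b], [1, 1], 2))
```

A negative weight would act as an ‘inhibitory’ input in the sense described above, while a positive weight is ‘excitatory’.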

In the brain, learning is achieved by neurons becoming more or less sensitive to certain synaptic responses. This can be emulated in artificial neurons by altering an input’s weight. The modification of weights is usually done automatically, by comparing the network’s output against known classifications (Gurney, 1997). Consider, for example, pattern recognition of text: say a handwritten page is scanned and input into the neural network. The ANN is told to analyse the individual characters, looking for specific characteristics, e.g. roundness, how many straight lines there are, whether there are any holes, and so on. This information becomes the input data for the neural network. The network is then given the output data (the correct translation of the input data). Through back-propagation, the neural network attempts to work out how to get from the output back to the input and adjusts the weights accordingly (Lampinen, Laaksonen and Oja, 1998).

This process is called ‘training’, and it is what effectively allows the neural network to ‘learn’. All artificial neural networks need to go through this process hundreds of times before they can start to make logical and accurate assumptions (Gurney, 1997). If you have ever used voice recognition software, you will be aware that before you can use it effectively you must read from passages of text. Here you are literally teaching the software, familiarising it with the way that you speak and pronounce words (Van Daalen and Bots, 2010).
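The training loop can be made concrete with the classic perceptron learning rule, a simpler ancestor of the back-propagation mentioned above. This is an illustrative sketch, not the method used in the paper: the network is shown labelled examples repeatedly, and each weight is nudged in proportion to the output error until every example is classified correctly.

```python
# Illustrative training sketch using the perceptron learning rule.
# A simplified stand-in for back-propagation, shown only to make the
# idea of repeated weight adjustment ('learning') concrete.

def train_perceptron(samples, epochs=50, rate=0.1):
    """Repeatedly adjust weights and bias over the labelled samples."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
            output = 1 if weighted_sum >= 0 else 0
            error = target - output  # compare output with the 'correct translation'
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# Labelled examples (here, logical AND) play the role of the correct
# translations given to the network during training.
samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train_perceptron(samples)
```

After enough passes over the data the learned weights classify all of the examples, which is the repeated exposure the text refers to: the network must see its training data many times before its ‘assumptions’ become accurate.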

References:

  1. Bolle, R. M., J. H. Connel, S. Pankanti, N. K. Ratha, A. W. Senior, Basics of Biometrics, in Guide to Biometrics, Springer, New York, 2004.
  2. Brown, S. L., K. M. Eisenhardt, Product Development: Past Research, Present Findings, and Future Directions, Academy of Management Review, Vol. 20, 1995, pp. 343-378.
  3. Bourg, D. M., G. Seemann, Neural Networks, in: AI for Game Developers, O’Reilly Media Inc., USA, 2004.
  4. Chu, S. R., R. Shoureshi, M. Tenorio, Neural Networks for System Identification, IEEE Control Systems. Vol. 10, 1990, pp. 31-35.
  5. Curran, K., L. Xuelong, and N. Mc Caughley, The Use of Neural Networks in Real-time Face Detection. Journal of Computer Sciences, Vol. 1, No. 1, 2005, pp. 47-62.
  6. Drachman, D. A., Do We Have Brain to Spare?, Journal of American Academy of Neurology, Vol. 64, No. 12, 2004-5. Available online: http://www.neurology.org/cgi/content/citation/64/12/2004
  7. Erol, A., G. Bebis, M. Nicolescu, R. D. Boyle, X. Twombly, Vision-based Hand Pose Estimation: A Review, Computer Vision and Image Understanding Vol. 108, No. 1-2, 2007, pp. 52-73 Special Issue on Vision for Human-Computer Interaction.
  8. Fels, S. S., G. Hinton, GloveTalk: A Neural Network Interface Between a DataGlove and a Speech Synthesizer, IEEE Transactions on Neural Networks. Vol. 4, 1993, pp. 2-8.
  9. Gardner, J. W., J. Yinon, Review of Conventional Electronic Noses and their Possible Application to the Detection of Explosives, in Electronic Noses & Sensors for the Detection of Explosives. Kluwer Academic Publishers, Massachusetts, USA, 2004.
  10. Gero, J. S. (Ed.), Applications of AI in Engineering V, Vol. 1: Design, Proc. 5th International Conference on Applications of AI in Engineering, Computational Mechanics Publications & Springer-Verlag, Boston, MA, 1990.
  11. Greening, C., Sudoku Grab – How does it all work? 2009, Available at: http://sudokugrab.blogspot.com/2009/07/how-does-it-all-work.html, [Accessed 6 March 2011].
  12. Gurney, K., Neural Networks – An Overview. An Introduction to Neural Networks. UCL Press, London, 1997.
  13. Lampinen, J., J. Laaksonen, E. Oja, Pattern Recognition Problem, in Image Processing and Pattern Recognition, edited by Cornelius T. Leondes. Neural Network Systems Techniques and Applications, Vol. 5, Academic Press Limited, London. 1998.
  14. Lehrman, R. L., Notions of Motion, in Physics the Easy Way, 3rd ed. Barron’s Educational Series Inc., New York, USA, 1998.
  15. Lin, Y.-C., H.-H. Lai, C.-H. Yeh, Consumer-oriented Product Form Design based on Fuzzy Logic: A Case Study of Mobile Phones, International Journal of Industrial Ergonomics, Vol. 37, 2007, pp. 531-543.
  16. Malmgren, B. A., L. Niklassan, ART Neural Networks for Medical Data Analysis and Fast Distributed Learning, in Artificial Neural Networks in Medicine and Biology, Springer-Verlag, London, 2000.
  17. McCulloch, W., W. Pitts, A Logical Calculus of the Ideas Immanent in Nervous Activity, Bulletin of Mathematical Biophysics, Vol. 7, 1943, pp. 115-133.
  18. Mellis, D. A., What is Arduino?, 2007, Available at: http://www.arduino.cc/en/Guide/Introduction [Accessed 13 March 2011].
  19. National Health Service, Symptoms of Cerebral Palsy, UK, 2008, Available at: http://www.nhs.uk/Conditions/Cerebral-palsy/Pages/Symptoms.aspx [Accessed 19 October 2010].
  20. Page, T., Prospects for the Design of Electronic Products in Second Life, Studies in Informatics and Control, Vol. 20, No. 3, 2011, pp. 293-303.
  21. Page, T., G. Thorsteinsson, A. Niculescu, Management of Knowledge in a Problem Based Learning Environment, Studies in Informatics and Control, Vol. 18, No. 3, 2009, pp. 225-262, ISSN: 1220-1766.
  22. Porta, M., Vision-based User Interfaces: Methods and Applications, International Journal of Human-Computer Studies, Vol. 57, No. 11, 2001, pp. 27-73.
  23. Rozendaal, M. C., D. V. Keyson, Designing User Interfaces with Gestures and Sound: Towards the Performance and Appeal of Voice Mail Browsing, Journal of Design Research, Vol. 5, No. 1, 2006, pp. 96-115.
  24. Russell, S., P. Norvig, The Nature of Environments, in Artificial Intelligence: A Modern Approach, 3rd ed. Pearson Education, Inc., New Jersey, United States, 2010.
  25. Schifferstein, H. N. J., J. J. Otten, F. Thoolen, P. Hekkert, The Experimental Assessment of Sensory Dominance in a Product Development Context, Journal of Design Research, Vol. 8, No. 2, 2010, pp. 119-144.
  26. Starner, T., A. Pentland, Visual Recognition of American Sign Language using Hidden Markov Models. Proceedings of the International Workshop on Automatic Face and Gesture Recognition, Zurich, Switzerland, June, 1995.
  27. Thorsteinsson, G., Piloting a Use of Graphic Tablets to Support Students Drawing within a Secondary School in Iceland, Studies in Informatics and Control, Vol. 21, No. 2, 2012, pp. 201-209.
  28. Van Daalen, C. E., P. W. G. Bots, Using a Systems Perspective to Design a Problem Solving Process, Journal of Design Research, Vol. 8, No. 4, 2010, pp. 301-316.