E-book, English, 354 pages
Series: Biological Techniques Series
Dempster: The Laboratory Computer
1st edition, 2001
ISBN: 978-0-08-052155-8
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark
A Practical Guide for Physiologists and Neuroscientists
The Laboratory Computer: A Practical Guide for Physiologists and Neuroscientists introduces the reader to both the basic principles and the actual practice of recording physiological signals using the computer. It describes the basic operation of the computer, the types of transducers used to measure physical quantities such as temperature and pressure, how these signals are amplified and converted into digital form, and the mathematical analysis techniques that can then be applied. It is aimed at the physiologist or neuroscientist using modern computer data acquisition systems in the laboratory, providing both an understanding of how such systems work and a guide to their purchase and implementation.

- The key facts and concepts that are vital for the effective use of computer data acquisition systems
- A unique overview of the commonly available laboratory hardware and software, including both commercial and free software
- A practical guide to designing one's own or choosing commercial data acquisition hardware and software
Authors/Editors
Further Information & Material
Front Cover
The Laboratory Computer
Copyright Page
Series Preface
Preface
Contents
Chapter One. Introduction
  1.1 The rise of the laboratory computer
  1.2 The data acquisition system
  1.3 Analysing digitised signals
  1.4 Analysis of electrophysiological signals
  1.5 Image analysis
  1.6 Software development
  1.7 Summary
Chapter Two. The Personal Computer
  2.1 Computer families
  2.2 Main components of a computer system
  2.3 The central processing unit
  2.4 Random access memory
  2.5 Cache memory
  2.6 Motherboards
  2.7 Magnetic disc storage
  2.8 Removable disc storage
  2.9 Interface buses and expansion slots
  2.10 Input devices
  2.11 Video displays
  2.12 Peripheral device interfaces
  2.13 Printers and other output devices
  2.14 Operating systems
  2.15 Computer networks
  2.16 Further reading
Chapter Three. Digital Data Acquisition
  3.1 Digitising analogue signals
  3.2 The Nyquist criterion
  3.3 The A/D converter
  3.4 The laboratory interface unit
  3.5 Laboratory interface–host computer connections
  3.6 Laboratory interfaces and suppliers
  3.7 Recording modes
  3.8 Data acquisition software
  3.9 Choosing a data acquisition system
  3.10 Further reading
Chapter Four. Signal Conditioning
  4.1 Amplifiers
  4.2 Analogue filtering
  4.3 Event detectors
  4.4 Signal conditioners
  4.5 Interference and its elimination
  4.6 Stimulators
  4.7 Further reading
Chapter Five. Transducers and Sensors
  5.1 Basic transducer properties
  5.2 Temperature transducers
  5.3 Light detectors
  5.4 Force transducers
  5.5 Pressure transducers
  5.6 Chemical sensors
  5.7 Further reading
Chapter Six. Signal Analysis and Measurement
  6.1 Signal measurement
  6.2 Basic waveform characteristics
  6.3 Signal averaging
  6.4 Digital filters
  6.5 Frequency domain analysis
  6.6 Curve fitting
  6.7 Analysis of random distributions
  6.8 Further reading
Chapter Seven. Recording and Analysis of Intracellular Electrophysiological Signals
  7.1 Origin of bioelectrical signals
  7.2 Cell equivalent circuits
  7.3 Intracellular recording techniques
  7.4 The intracellular data acquisition system
  7.5 Experimental paradigms
  7.6 Analysis of voltage-activated currents
  7.7 Analysis of synaptic signals
  7.8 Single-channel currents
  7.9 Noise analysis
  7.10 Cell capacitance
  7.11 Further reading
Chapter Eight. Recording and Analysis of Extracellular Electrophysiological Signals
  8.1 Extracellular potentials
  8.2 Recording electrodes
  8.3 Electromyography
  8.4 Electrocardiography
  8.5 Electroencephalography
  8.6 Recording activity of single neurons
  8.7 Analysis of neural spike trains
  8.8 Neural signal acquisition systems
  8.9 Further reading
Chapter Nine. Image Analysis
  9.1 Digitisation of images
  9.2 Image acquisition devices
  9.3 Charge-coupled devices
  9.4 CCD readout architectures
  9.5 CCD performance
  9.6 Electronic cameras
  9.7 Analogue video signal formats
  9.8 Analogue video cameras
  9.9 Camera performance specifications
  9.10 Digitising analogue video signals
  9.11 Digital cameras
  9.12 Digital frame grabbers
  9.13 Scanners
  9.14 Confocal microscopy
  9.15 Image analysis
  9.16 Image calibration
  9.17 Image arithmetic
  9.18 Spatial filtering
  9.19 Image analysis software
  9.20 Analysis of moving images
  9.21 Three-dimensional imaging
  9.22 Further reading
Chapter Ten. Software Development
  10.1 Computer programs
  10.2 Assembler code
  10.3 Programming language features
  10.4 User interface design
  10.5 Software development tools
  10.6 Visual Basic
  10.7 Borland Delphi
  10.8 Visual C++
  10.9 Multiplatform software development
  10.10 Matlab
  10.11 LabVIEW
  10.12 Choosing a development system
  10.13 Further reading
References
Suppliers
Index
CHAPTER ONE Introduction
The computer now plays a central role in the laboratory, as a means of acquiring experimental data, analysing that data, and controlling the progress of experiments. An understanding of the computer, and of the principles by which experimental data are digitised, has become an essential part of the (ever lengthening) skill set of the researcher. This book provides an introduction to the principles and practical application of computer-based data acquisition systems in the physiological sciences. The aim here is to provide a coherent view of the methodology, drawing together material from disparate sources, usually found in highly compressed form in the methods sections of scientific papers, short technical articles, or manufacturers' product notes. An emphasis is placed on both principles and practice. An understanding of the principles by which the physiological systems one is studying are measured is necessary to avoid error through the introduction of artefacts into the recorded data. A similar appreciation of the theoretical basis of any analysis methods employed is also required. Throughout the text, reference is therefore made to the key papers that underpin the development of the measurement and analysis methodologies being discussed. At the same time, it is important to have concrete examples and to know, in purely practical terms, where such data acquisition hardware and software can be obtained, and what is involved in using it in the laboratory. The main commercially available hardware and software packages used in this field are therefore discussed along with their capabilities and limitations. In all cases, the supplier's physical and website addresses are given. A significant amount of public domain, or 'freeware', software is also available and the reader's attention is drawn to the role that this kind of software plays in research.
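The digitisation process mentioned above can be illustrated in a few lines of Python. This is a minimal sketch, not taken from the book: the 5 Hz sine wave, 100 Hz sampling rate and 8-bit converter with a ±1 V input range are illustrative values chosen to satisfy the Nyquist criterion discussed in Chapter Three.

```python
import math

def sample_and_quantise(freq_hz, rate_hz, n_samples, bits=8, v_range=1.0):
    """Sample a sine 'voltage' of amplitude v_range/2 and quantise it
    with a hypothetical ADC spanning -v_range..+v_range volts."""
    levels = 2 ** bits                   # number of quantisation levels
    lsb = 2 * v_range / levels           # volts per quantisation step
    codes = []
    for i in range(n_samples):
        t = i / rate_hz                  # sample time in seconds
        v = 0.5 * v_range * math.sin(2 * math.pi * freq_hz * t)
        code = int(round((v + v_range) / lsb))   # offset-binary code
        codes.append(min(levels - 1, max(0, code)))  # clamp to ADC range
    return codes

# 100 Hz sampling comfortably exceeds twice the 5 Hz signal frequency.
codes = sample_and_quantise(5.0, 100.0, 20)
```

A zero-volt input maps to the mid-scale code (128 for an 8-bit converter); the analysis techniques described in later chapters all operate on integer sample streams of this kind.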
Physiology – the study of bodily function, and particularly of how the internal state of the body is regulated – can, more than any other of the life sciences, be considered a study of signals. A physiological signal is a time-varying change in some property of a physiological system, at the cellular, tissue or whole-animal level. Many such signals are electrical in nature, cell membrane potential and current for instance, or chemical, such as intracellular ion concentrations (H+, Ca++). But almost any of the fundamental physical variables – temperature, force, pressure, light intensity – finds some physiological role. Records of such signals provide the raw material from which an understanding of body function is constructed, with advances in physiology often closely associated with improved measurement techniques. Physiologists, and particularly electrophysiologists, have always been ready to exploit new measurement and recording technology, and computer-based data acquisition is no exception.

1.1 THE RISE OF THE LABORATORY COMPUTER
Computers first started to be used in the laboratory about 45 years ago, about 10 years after the first digital computer, the ENIAC (Electronic Numerical Integrator And Calculator), had gone into operation at the University of Pennsylvania. Initially, these machines were very large, room-sized devices, seen exclusively as calculating machines. However, by the mid-1950s laboratory applications were becoming conceivable. Interestingly enough, the earliest of these applications was in the physiological (or at least psychophysiological) field. The Whirlwind system developed by Kenneth Olsen and others at the Massachusetts Institute of Technology, with primitive cathode ray tube (CRT) display systems, was used for studies into the visual perception of patterns associated with the air defence project that lay behind the funding of the computer (Green et al., 1959). The Whirlwind was of course still a huge device, powered by vacuum tubes, and reputed to dim the lights of Cambridge, Massachusetts, when operated, but the basic principles of modern laboratory computing could be discerned. It was a system, controlled by the experimenter, that acquired data in real time from an experimental subject and displayed the results dynamically. Olsen went on to found Digital Equipment Corporation (DEC), which pioneered the development of the minicomputer. Taking advantage of the developments in integrated circuit technology in the 1960s, minicomputers were much smaller and cheaper (although slower) than the mainframe computers of the time. While a mainframe, designed for maximum performance and storage capacity, occupied a large room and required specialised air conditioning and other support, a minicomputer took up little more space than a filing cabinet and could operate in the normal laboratory environment.
Clark & Molnar (1964) describe the LINC (Laboratory INstrument Computer), a typical paper-tape-driven system of that time (magnetic disc drives were still the province of the mainframe). However, it could digitise experimental signals, generate stimuli, and display results on a CRT. The DEC PDP-8 (Programmable Data Processor) minicomputer was the first to go into widespread commercial production, and a variant of it, the LINC-8, was designed specifically for laboratory use. The PDP-8 became a mainstay of laboratory computing throughout the 1960s, being replaced by the even more successful PDP-11 series in the 1970s. Although the minicomputer made the use of a dedicated computer within the experimental laboratory feasible, it was still costly compared to conventional laboratory recording devices such as paper chart recorders. Consequently, applications were restricted to areas where a strong justification for their use could be made. One area where a case could be made was in the clinical field, and systems for the computer-based analysis of electrocardiograms and electroencephalograms began to appear (e.g. Stark et al., 1964). Electrophysiological research was another area where the rapid acquisition and analysis of signals could be seen to be beneficial. H.K. Hartline was one of the earliest to apply the computer to physiological experimentation, using it to record the frequency of nerve firing of the Limulus (horseshoe crab) eye in response to a variety of computer-generated light stimuli (see Schonfeld, 1964, for a review). By the early 1980s most well-equipped electrophysiological laboratories could boast at least one minicomputer. Applications had arisen, such as the spectral analysis of ionic current fluctuations or the analysis of single ion channel currents, that could only be successfully handled using computer methods. Specialised software for these applications was being developed by a number of groups (e.g.
D’Agrosa & Marlinghaus, 1975; Black et al., 1976; Colquhoun & Sigworth, 1995; Dempster, 1985; Re & Di Sarra, 1988). The utility of this kind of software was becoming widely recognised, but it was also becoming obvious that its production was difficult and time consuming. Because of this, software was often exchanged informally between laboratories which had existing links with the software developer or had been attracted by demonstrations at scientific meetings. Nevertheless, the cost of minicomputer technology, right up to its obsolescence in the late 1980s, prevented it from replacing the bulk of conventional laboratory recording devices. Real change started to occur with the development of the microprocessor – a complete computer central processing unit on a single integrated circuit chip – by Intel Corp. in 1974. As with the minicomputer in its own day, the first microprocessor-based computers were substantially slower than contemporary minicomputers, but their order-of-magnitude lower cost opened up a host of new opportunities for their use. New companies appeared to exploit the new technology, and computers such as the Apple II and the Commodore PET began to appear in the laboratory (examples of their use can be found in Kerkut, 1985; or Mize, 1985). Not only that: computers had become affordable to individuals for the first time, and they began to appear in the home and small office. The era of the personal computer had begun. As integrated circuit technology improved, it became possible to cram more and more transistors onto each silicon chip. Over the past 25 years this has led to a constant improvement in computing power and reduction in cost. Initially, each new personal computer was based on a different design. Software written for one computer could not be expected to run on another.
As the industry matured, standardisation began to be introduced, first with the CP/M operating system and then with the development of the IBM (International Business Machines) Personal Computer in 1981. As IBM was the world's largest computer manufacturer at the time, the IBM PC became a de facto standard, with many other manufacturers copying its design and producing IBM PC-compatible computers or 'clones'. Equally important was the appearance of the Apple Macintosh in 1984, the first widely available computer with a graphical user interface (GUI), which used the mouse as a pointing device. Until the introduction of the Macintosh, using a computer required the user to learn its operating system command language, a significant disincentive to many. The Macintosh, on the other hand, could be operated by selecting options from a...