Computer Vision
and Applications
A Guide for Students and Practitioners
OR PRODUCTION OF THE ACCOMPANYING CODE ("THE PRODUCT") CANNOT AND DO NOT WARRANT THE PERFORMANCE OR RESULTS THAT MAY BE OBTAINED BY USING THE PRODUCT. THE PRODUCT IS SOLD "AS IS" WITHOUT WARRANTY OF MERCHANTABILITY OR FITNESS FOR ANY PARTICULAR PURPOSE. AP WARRANTS ONLY THAT THE MAGNETIC CD-ROM(S) ON WHICH THE CODE IS RECORDED IS FREE FROM DEFECTS IN MATERIAL AND FAULTY WORKMANSHIP UNDER THE NORMAL USE AND SERVICE FOR A PERIOD OF NINETY (90) DAYS FROM THE DATE THE PRODUCT IS DELIVERED. THE PURCHASER'S SOLE AND EXCLUSIVE REMEDY IN THE EVENT OF A DEFECT IS EXPRESSLY LIMITED TO EITHER REPLACEMENT OF THE CD-ROM(S) OR REFUND OF THE PURCHASE PRICE, AT AP'S SOLE DISCRETION.
IN NO EVENT, WHETHER AS A RESULT OF BREACH OF CONTRACT, WARRANTY, OR TORT (INCLUDING NEGLIGENCE), WILL AP OR ANYONE WHO HAS BEEN INVOLVED IN THE CREATION OR PRODUCTION OF THE PRODUCT BE LIABLE TO PURCHASER FOR ANY DAMAGES, INCLUDING ANY LOST PROFITS, LOST SAVINGS OR OTHER INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PRODUCT OR ANY MODIFICATIONS THEREOF, OR DUE TO THE CONTENTS OF THE CODE, EVEN IF AP HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES, OR FOR ANY CLAIM BY ANY OTHER PARTY.
ANY REQUEST FOR REPLACEMENT OF A DEFECTIVE CD-ROM MUST BE POSTAGE PREPAID AND MUST BE ACCOMPANIED BY THE ORIGINAL DEFECTIVE CD-ROM, YOUR MAILING ADDRESS AND TELEPHONE NUMBER, AND PROOF OF DATE OF PURCHASE AND PURCHASE PRICE. SEND SUCH REQUESTS, STATING THE NATURE OF THE PROBLEM, TO ACADEMIC PRESS CUSTOMER SERVICE, 6277 SEA HARBOR DRIVE, ORLANDO, FL 32887, 1-800-321-5068. AP SHALL HAVE NO OBLIGATION TO REFUND THE PURCHASE PRICE OR TO REPLACE A CD-ROM BASED ON CLAIMS OF DEFECTS IN THE NATURE OR OPERATION OF THE PRODUCT.
SOME STATES DO NOT ALLOW LIMITATION ON HOW LONG AN IMPLIED WARRANTY LASTS, NOR EXCLUSIONS OR LIMITATIONS OF INCIDENTAL OR CONSEQUENTIAL DAMAGE, SO THE ABOVE LIMITATIONS AND EXCLUSIONS MAY NOT APPLY TO YOU. THIS WARRANTY GIVES YOU SPECIFIC LEGAL RIGHTS, AND YOU MAY ALSO HAVE OTHER RIGHTS WHICH VARY FROM JURISDICTION TO JURISDICTION.
THE RE-EXPORT OF UNITED STATES ORIGIN SOFTWARE IS SUBJECT TO THE UNITED STATES LAWS UNDER THE EXPORT ADMINISTRATION ACT OF 1969 AS AMENDED. ANY FURTHER SALE OF THE PRODUCT SHALL BE IN COMPLIANCE WITH THE UNITED STATES DEPARTMENT OF COMMERCE ADMINISTRATION REGULATIONS. COMPLIANCE WITH SUCH REGULATIONS IS YOUR RESPONSIBILITY AND NOT THE RESPONSIBILITY OF AP.
Computer Vision
and Applications
A Guide for Students and Practitioners
Editors
Bernd Jähne
Interdisciplinary Center for Scientific Computing
University of Heidelberg,Heidelberg,Germany
and Scripps Institution of Oceanography, University of California, San Diego
Horst Haußecker
Xerox Palo Alto Research Center
San Diego San Francisco New York Boston
London Sydney Tokyo
All rights reserved.
No part of this publication may be reproduced or transmitted in any form or
by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher.
Requests for permission to make copies of any part of the work should be mailed to: Permissions Department, Harcourt, Inc., 6277 Sea Harbor Drive, Orlando, Florida, 32887-6777.
ACADEMIC PRESS
A Harcourt Science and Technology Company
525 B Street, Suite 1900, San Diego, CA 92101-4495, USA
http://www.academicpress.com
Academic Press
24-28 Oval Road, London NW1 7DX, UK
http://www.hbuk.co.uk/ap/
Library of Congress Catalog Number: 99-68829
International Standard Book Number: 0-12-379777-2
Printed in the United States of America
00 01 02 03 04 EB 9 8 7 6 5 4 3 2 1
Contents

1 Introduction 1
B. Jähne
1.1 Components of a vision system 1
1.2 Imaging systems 2
1.3 Signal processing for computer vision 3
1.4 Pattern recognition for computer vision 4
1.5 Performance evaluation of algorithms 5
1.6 Classes of tasks 6
1.7 References 8
I Sensors and Imaging

2 Radiation and Illumination 11
H. Haußecker
2.1 Introduction 12
2.2 Fundamentals of electromagnetic radiation 13
2.3 Radiometric quantities 17
2.4 Fundamental concepts of photometry 27
2.5 Interaction of radiation with matter 31
2.6 Illumination techniques 46
2.7 References 51
3 Imaging Optics 53
P. Geißler
3.1 Introduction 54
3.2 Basic concepts of geometric optics 54
3.3 Lenses 56
3.4 Optical properties of glasses 66
3.5 Aberrations 67
3.6 Optical image formation 75
3.7 Wave and Fourier optics 80
3.8 References 84
4 Radiometry of Imaging 85
H. Haußecker
4.1 Introduction 85
4.2 Observing surfaces 86
4.3 Propagating radiance 88
4.4 Radiance of imaging 91
4.5 Detecting radiance 94
4.6 Concluding summary 108
4.7 References 109
5 Solid-State Image Sensing 111
P. Seitz
5.1 Introduction 112
5.2 Fundamentals of solid-state photosensing 113
5.3 Photocurrent processing 120
5.4 Transportation of photosignals 127
5.5 Electronic signal detection 130
5.6 Architectures of image sensors 134
5.7 Color vision and color imaging 139
5.8 Practical limitations of semiconductor photosensors 146
5.9 Conclusions 148
5.10 References 149
6 Geometric Calibration of Digital Imaging Systems 153
R. Godding
6.1 Introduction 153
6.2 Calibration terminology 154
6.3 Parameters influencing geometrical performance 155
6.4 Optical systems model of image formation 157
6.5 Camera models 158
6.6 Calibration and orientation techniques 163
6.7 Photogrammetric applications 170
6.8 Summary 173
6.9 References 173
7 Three-Dimensional Imaging Techniques 177
R. Schwarte, G. Häusler, R. W. Malz
7.1 Introduction 178
7.2 Characteristics of 3-D sensors 179
7.3 Triangulation 182
7.4 Time-of-flight (TOF) of modulated light 196
7.5 Optical Interferometry (OI) 199
7.6 Conclusion 205
7.7 References 205
II Signal Processing and Pattern Recognition
8 Representation of Multidimensional Signals 211
B. Jähne
8.1 Introduction 212
8.2 Continuous signals 212
8.3 Discrete signals 215
8.4 Relation between continuous and discrete signals 224
8.5 Vector spaces and unitary transforms 232
8.6 Continuous Fourier transform (FT) 237
8.7 The discrete Fourier transform (DFT) 246
8.8 Scale of signals 252
8.9 Scale space and diffusion 260
8.10 Multigrid representations 267
8.11 References 271
9 Neighborhood Operators 273
B. Jähne
9.1 Basics 274
9.2 Linear shift-invariant filters 278
9.3 Recursive filters 285
9.4 Classes of nonlinear filters 292
9.5 Local averaging 296
9.6 Interpolation 311
9.7 Edge detection 325
9.8 Tensor representation of simple neighborhoods 335
9.9 References 344
10 Motion 347
H. Haußecker and H. Spies
10.1 Introduction 347
10.2 Basics: flow and correspondence 349
10.3 Optical flow-based motion estimation 358
10.4 Quadrature filter techniques 372
10.5 Correlation and matching 379
10.6 Modeling of flow fields 382
10.7 References 392
11 Three-Dimensional Imaging Algorithms 397
P. Geißler, T. Dierig, H. A. Mallot
11.1 Introduction 397
11.2 Stereopsis 398
11.3 Depth-from-focus 414
11.4 References 435
12 Design of Nonlinear Diffusion Filters 439
J. Weickert
12.1 Introduction 439
12.2 Filter design 440
12.3 Parameter selection 448
12.4 Extensions 451
12.5 Relations to variational image restoration 452
12.6 Summary 454
12.7 References 454
13 Variational Adaptive Smoothing and Segmentation 459
C. Schnörr
13.1 Introduction 459
13.2 Processing of two- and three-dimensional images 463
13.3 Processing of vector-valued images 474
13.4 Processing of image sequences 476
13.5 References 480
14 Morphological Operators 483
P. Soille
14.1 Introduction 483
14.2 Preliminaries 484
14.3 Basic morphological operators 489
14.4 Advanced morphological operators 495
14.5 References 515
15 Probabilistic Modeling in Computer Vision 517
J. Hornegger, D. Paulus, and H. Niemann
15.1 Introduction 517
15.2 Why probabilistic models? 518
15.3 Object recognition as probabilistic modeling 519
15.4 Model densities 524
15.5 Practical issues 536
15.6 Summary, conclusions, and discussion 538
15.7 References 539
16 Fuzzy Image Processing 541
H. Haußecker and H. R. Tizhoosh
16.1 Introduction 541
16.2 Fuzzy image understanding 548
16.3 Fuzzy image processing systems 553
16.4 Theoretical components of fuzzy image processing 556
16.5 Selected application examples 564
16.6 Conclusions 570
16.7 References 571
17 Neural Net Computing for Image Processing 577
A. Meyer-Bäse
17.1 Introduction 577
17.2 Multilayer perceptron (MLP) 579
17.3 Self-organizing neural networks 585
17.4 Radial-basis neural networks (RBNN) 590
17.5 Transformation radial-basis networks (TRBNN) 593
17.6 Hopfield neural networks 596
17.7 Application examples of neural networks 601
17.8 Concluding remarks 604
17.9 References 605
III Application Gallery
A1 Object Recognition with Intelligent Cameras 610
T. Wagner and P. Plankensteiner
A2 3-D Image Metrology of Wing Roots 612
H. Beyer
A3 Quality Control in a Shipyard 614
H.-G. Maas
A4 Topographical Maps of Microstructures 616
Torsten Scheuermann, Georg Wiora and Matthias Graf
A5 Fast 3-D Full Body Scanning for Humans and Other Objects 618
N. Stein and B. Minge
A6 Reverse Engineering Using Optical Range Sensors 620
S. Karbacher and G. Häusler
A7 3-D Surface Reconstruction from Image Sequences 622
R. Koch, M. Pollefeys and L. Van Gool
A8 Motion Tracking 624
R. Frischholz
A9 Tracking "Fuzzy" Storms in Doppler Radar Images 626
J. L. Barron, R. E. Mercer, D. Cheng, and P. Joe
A10 3-D Model-Driven Person Detection 628
Ch. Ridder, O. Munkelt and D. Hansel
A11 Knowledge-Based Image Retrieval 630
Th. Hermes and O. Herzog
A12 Monitoring Living Biomass with in situ Microscopy 632
P. Geißler and T. Scholz
A13 Analyzing Size Spectra of Oceanic Air Bubbles 634
P. Geißler and B. Jähne
A14 Thermography to Measure Water Relations of Plant Leaves 636
B. Kümmerlen, S. Dauwe, D. Schmundt and U. Schurr
A15 Small-Scale Air-Sea Interaction with Thermography 638
U. Schimpf, H. Haußecker and B. Jähne
A16 Optical Leaf Growth Analysis 640
D. Schmundt and U. Schurr
A17 Analysis of Motility Assay Data 642
D. Uttenweiler and R. H. A. Fink
A18 Fluorescence Imaging of Air-Water Gas Exchange 644
S. Eichkorn, T. Münsterer, U. Lode and B. Jähne
A19 Particle-Tracking Velocimetry 646
D. Engelmann, M. Stöhr, C. Garbe, and F. Hering
A20 Analyzing Particle Movements at Soil Interfaces 648
H. Spies, H. Gröning, and H. Haußecker
A21 3-D Velocity Fields from Flow Tomography Data 650
H.-G. Maas
A22 Cloud Classification Analyzing Image Sequences 652
M. Wenig, C. Leue
A23 NOx Emissions Retrieved from Satellite Images 654
C. Leue, M. Wenig and U. Platt
A24 Multicolor Classification of Astronomical Objects 656
C. Wolf, K. Meisenheimer, and H.-J. Roeser
A25 Model-Based Fluorescence Imaging 658
D. Uttenweiler and R. H. A. Fink
A26 Analyzing the 3-D Genome Topology 660
H. Bornfleth, P. Edelmann, and C. Cremer
A27 References 662
What this book is about
This book offers a fresh approach to computer vision. The whole vision process from image formation to measuring, recognition, or reacting is regarded as an integral process. Computer vision is understood as the host of techniques to acquire, process, analyze, and understand complex higher-dimensional data from our environment for scientific and technical exploration.

In this sense this book takes into account the interdisciplinary nature of computer vision with its links to virtually all natural sciences and attempts to bridge two important gaps. The first is between modern physical sciences and the many novel techniques to acquire images. The second is between basic research and applications. When a reader with a background in one of the fields related to computer vision feels he has learned something from one of the many other facets of computer vision, the book will have fulfilled its purpose.

This book comprises three parts. The first part, Sensors and Imaging, covers image formation and acquisition. The second part, Signal Processing and Pattern Recognition, focuses on processing of the spatial and spatiotemporal signals acquired by imaging sensors. The third part consists of an Application Gallery, which shows in a concise overview a wide range of application examples from both industry and science. This part illustrates how computer vision is integrated into a variety of systems and applications.
Computer Vision and Applications was designed as a concise edition
of the three-volume handbook:
Handbook of Computer Vision and Applications
edited by B. Jähne, H. Haußecker, and P. Geißler
Vol. 1: Sensors and Imaging;
Vol. 2: Signal Processing and Pattern Recognition;
Vol. 3: Systems and Applications
Academic Press, 1999
It condenses the content of the handbook into one single volume and contains a selection of shortened versions of the most important contributions of the full edition. Although it cannot detail every single technique, this book still covers the entire spectrum of computer vision, ranging from the imaging process to high-end algorithms and applications. Students in particular can benefit from the concise overview of the field of computer vision. It is perfectly suited for sequential reading into the subject and it is complemented by the more detailed Handbook of Computer Vision and Applications. The reader will find references to the full edition of the handbook whenever applicable. In order to simplify notation we refer to supplementary information in the handbook by the abbreviations [CVA1, Chapter N], [CVA2, Chapter N], and [CVA3, Chapter N] for the Nth chapter in the first, second, and third volume, respectively. Similarly, direct references to individual sections in the handbook are given by [CVA1, Section N], [CVA2, Section N], and [CVA3, Section N] for section number N.
Prerequisites
It is assumed that the reader is familiar with elementary mathematical concepts commonly used in computer vision and in many other areas of natural sciences and technical disciplines. This includes the basics of set theory, matrix algebra, differential and integral equations, complex numbers, Fourier transform, probability, random variables, and graph theory. Wherever possible, mathematical topics are described intuitively. In this respect it is very helpful that complex mathematical relations can often be visualized intuitively by images. For a more formal treatment of the corresponding subject including proofs, suitable references are given.

How to use this book
The book has been designed to cover the different needs of its readership. First, it is suitable for sequential reading. In this way the reader gets an up-to-date account of the state of computer vision. It is presented in a way that makes it accessible for readers with different backgrounds. Second, the reader can look up specific topics of interest. The individual chapters are written in a self-consistent way with extensive cross-referencing to other chapters of the book and external references. Additionally, a detailed glossary allows easy access to the most important topics independently of individual chapters. The CD that accompanies this book contains the complete text of the book in the Adobe Acrobat portable document file format (PDF). This format can be read on all major platforms. Free Acrobat™ Reader version 4.0 for all major computing platforms is included on the CDs. The texts are hyperlinked in multiple ways. Thus the reader can collect the information of interest with ease. Third, the reader can delve more deeply into a subject with the material on the CDs. They contain additional reference material, interactive software components, code examples, image material, and references to sources on the Internet. For more details see the readme file on the CDs.

Acknowledgments
Writing a book on computer vision with this breadth of topics is a major undertaking that can succeed only in a coordinated effort that involves many co-workers. Thus the editors would like to thank first all contributors who were willing to participate in this effort. Their cooperation with the constrained time schedule made it possible that this concise edition of the Handbook of Computer Vision and Applications could be published in such a short period following the release of the handbook in May 1999. The editors are deeply grateful for the dedicated and professional work of the staff at AEON Verlag & Studio who did most of the editorial work. We also express our sincere thanks to Academic Press for the opportunity to write this book and for all professional advice. Last but not least, we encourage the reader to send us any hints on errors, omissions, typing errors, or any other shortcomings of the book. Actual information about the book can be found at the editors' homepage http://klimt.iwr.uni-heidelberg.de.
Heidelberg, Germany, and Palo Alto, California
Bernd Jähne, Horst Haußecker
Contributors

Prof. Dr. John L. Barron
Dept. of Computer Science, Middlesex College
The University of Western Ontario, London, Ontario, N6A 5B7, Canada
barron@csd.uwo.ca
Horst A. Beyer
Imetric SA, Technopole, CH-2900 Porrentruy, Switzerland
imetric@dial.eunet.ch, http://www.imetric.com
Dr. Harald Bornfleth
Institut für Angewandte Physik, Universität Heidelberg
Albert-Überle-Str. 3-5, D-69120 Heidelberg, Germany
Harald.Bornfleth@iwr.uni-heidelberg.de
http://www.aphys.uni-heidelberg.de/AG_Cremer/
David Cheng
Dept. of Computer Science, Middlesex College
The University of Western Ontario, London, Ontario, N6A 5B7, Canada
cheng@csd.uwo.ca
Prof. Dr. Christoph Cremer
Institut für Angewandte Physik, Universität Heidelberg
Albert-Überle-Str. 3-5, D-69120 Heidelberg, Germany
cremer@popeye.aphys2.uni-heidelberg.de
http://www.aphys.uni-heidelberg.de/AG_Cremer/
Tobias Dierig
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg, Germany
Tobias.Dierig@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de
Stefan Dauwe
Botanisches Institut, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
Peter U. Edelmann
Institut für Angewandte Physik, Universität Heidelberg
Albert-Überle-Str. 3-5, D-69120 Heidelberg, Germany
Sven Eichkorn
Max-Planck-Institut für Kernphysik
Saupfercheckweg 1, D-69117 Heidelberg, Germany
Sven.Eichkorn@mpi-hd.mpg.de
Dirk Engelmann
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg, Germany
Dirk.Engelmann@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de/˜dengel
Prof. Dr. Rainer H. A. Fink
II. Physiologisches Institut, Universität Heidelberg
Im Neuenheimer Feld 326, D-69120 Heidelberg, Germany
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg
Dipl.-Ing. Robert Godding
AICON GmbH, Celler Straße 32, D-38114 Braunschweig, Germany
robert.godding@aicon.de, http://www.aicon.de
Matthias Graf
Institut für Kunststoffprüfung und Kunststoffkunde (IKP)
Pfaffenwaldring 32, D-70569 Stuttgart, Germany
graf@ikp.uni-stuttgart.de, Matthias.Graf@t-online.de
http://www.ikp.uni-stuttgart.de
Hermann Gröning
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
Prof. Dr. Gerd Häusler
Chair for Optics, Universität Erlangen-Nürnberg
Staudtstraße 7/B2, D-91056 Erlangen, Germany
haeusler@physik.uni-erlangen.de
http://www.physik.uni-erlangen.de/optik/haeusler
Dr. Horst Haußecker
Xerox Palo Alto Research Center (PARC)
3333 Coyote Hill Road, Palo Alto, CA 94304
hhaussec@parc.xerox.com, http://www.parc.xerox.com
Dr. Frank Hering
SAP AG, Neurottstraße 16, D-69190 Walldorf, Germany
frank.hering@sap.com
Dipl.-Inform. Thorsten Hermes
Center for Computing Technology, Image Processing Department
University of Bremen, P.O. Box 33 0440, D-28334 Bremen, Germany
hermes@tzi.org, http://www.tzi.org/˜hermes
Prof. Dr. Otthein Herzog
Center for Computing Technology, Image Processing Department
University of Bremen, P.O. Box 33 0440, D-28334 Bremen, Germany
herzog@tzi.org, http://www.tzi.org/˜herzog
Dr. Joachim Hornegger
Lehrstuhl für Mustererkennung (Informatik 5)
Universität Erlangen-Nürnberg, Martensstraße 3, D-91058 Erlangen, Germany
hornegger@informatik.uni-erlangen.de
http://www5.informatik.uni-erlangen.de
Prof. Dr. Bernd Jähne
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg, Germany
Bernd.Jaehne@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de
Dr. Paul Joe
King City Radar Station, Atmospheric Environmental Services
4905 Dufferin St., Toronto, Ontario M3H 5T4, Canada
joep@aestor.dots.doe.ca
Stefan Karbacher
Chair for Optics, Universität Erlangen-Nürnberg
Staudtstraße 7/B2, D-91056 Erlangen, Germany
sbk@physik.uni-erlangen.de, http://www.physik.uni-erlangen.de
Prof. Dr.-Ing. Reinhard Koch
Institut für Informatik und Praktische Mathematik
Christian-Albrechts-Universität Kiel, Olshausenstr. 40, D-24098 Kiel, Germany
rk@is.informatik.uni-kiel.de
Bernd Kümmerlen
Botanisches Institut, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
Dr. Carsten Leue
Institut für Umweltphysik, Universität Heidelberg
Im Neuenheimer Feld 229, D-69120 Heidelberg, Germany
Carsten.Leue@iwr.uni-heidelberg.de
Ulrike Lode
Institut für Umweltphysik, Universität Heidelberg
Im Neuenheimer Feld 229, D-69120 Heidelberg, Germany
http://klimt.iwr.uni-heidelberg.de
Prof. Dr.-Ing. Hans-Gerd Maas
Institute for Photogrammetry and Remote Sensing
Technical University Dresden, D-01062 Dresden, Germany
maas@rcs.urz.tu-dresden.de
Prof. Dr.-Ing. Reinhard Malz
Fachhochschule Esslingen, Fachbereich Informationstechnik
Flandernstr. 101, D-73732 Esslingen, Germany
reinhard.malz@fht-esslingen.de
Dr. Hanspeter A. Mallot
Max-Planck-Institut für biologische Kybernetik
Spemannstr. 38, D-72076 Tübingen, Germany
Hanspeter.Mallot@tuebingen.mpg.de
http://www.kyb.tuebingen.mpg.de/bu/
Prof. Robert E. Mercer
Dept. of Computer Science, Middlesex College
The University of Western Ontario, London, Ontario, N6A 5B7, Canada
mercer@csd.uwo.ca
Dr. Anke Meyer-Bäse
Dept. of Electrical Engineering and Computer Science
University of Florida, 454 New Engineering Building 33, Center Drive
P.O. Box 116130, Gainesville, FL 32611-6130, U.S.
anke@alpha.ee.ufl.edu
Bernhard Minge
VITRONIC Dr.-Ing. Stein Bildverarbeitungssysteme GmbH
Hasengartenstrasse 14a, D-65189 Wiesbaden, Germany
bm@vitronic.de, http://www.vitronic.de
Dr. Olaf Munkelt
FORWISS, Bayerisches Forschungszentrum für Wissensbasierte Systeme
Forschungsgruppe Kognitive Systeme, Orleansstr. 34, D-81667 München
munkelt@forwiss.de, http://www.forwiss.de/˜munkelt
Dr. Thomas Münsterer
VITRONIC Dr.-Ing. Stein Bildverarbeitungssysteme GmbH
Hasengartenstr. 14a, D-65189 Wiesbaden, Germany
Phone: +49-611-7152-38, tm@vitronic.de
Prof. Dr.-Ing. Heinrich Niemann
Lehrstuhl für Mustererkennung (Informatik 5)
Universität Erlangen-Nürnberg, Martensstraße 3, D-91058 Erlangen, Germany
niemann@informatik.uni-erlangen.de
http://www5.informatik.uni-erlangen.de
Dr. Dietrich Paulus
Lehrstuhl für Mustererkennung (Informatik 5)
Universität Erlangen-Nürnberg, Martensstraße 3, D-91058 Erlangen, Germany
paulus@informatik.uni-erlangen.de
http://www5.informatik.uni-erlangen.de
Dipl.-Math. Peter Plankensteiner
Intego Plankensteiner Wagner GbR
Am Weichselgarten 7, D-91058 Erlangen, Germany
ppl@intego.de
Prof. Dr. Ulrich Platt
Institut für Umweltphysik, Universität Heidelberg
Im Neuenheimer Feld 229, D-69120 Heidelberg, Germany
pl@uphys1.uphys.uni-heidelberg.de
http://www.iup.uni-heidelberg.de/urmel/atmos.html
Dr. Marc Pollefeys
Katholieke Universiteit Leuven, ESAT-PSI/VISICS
Kardinaal Mercierlaan 94, B-3001 Heverlee, Belgium
Dr. Torsten Scheuermann
Fraunhofer USA, Headquarters
24 Frank Lloyd Wright Drive, Ann Arbor, MI 48106-0335, U.S.
tscheuermann@fraunhofer.org, http://www.fraunhofer.org
Dr. Uwe Schimpf
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
Uwe.Schimpf@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de
Dr. Dominik Schmundt
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
Dr. Ulrich Schurr
Botanisches Institut, Universität Heidelberg
Im Neuenheimer Feld 360, D-69120 Heidelberg, Germany
uschurr@botanik1.bot.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de/PublicFG/index.html
Prof. Dr. Rudolf Schwarte
Institut für Nachrichtenverarbeitung (INV)
Universität-GH Siegen, Hölderlinstr. 3, D-57068 Siegen, Germany
schwarte@nv.et-inf.uni-siegen.de
http://www.nv.et-inf.uni-siegen.de/inv/inv.html
Prof. Dr. Peter Seitz
Centre Suisse d'Electronique et de Microtechnique SA (CSEM)
Badenerstrasse 569, CH-8048 Zurich, Switzerland
peter.seitz@csem.ch, http://www.csem.ch/
Prof. Dr. Pierre Soille
Silsoe Research Institute, Wrest Park
Silsoe, Bedfordshire, MK45 4HS, United Kingdom
Pierre.Soille@bbsrc.ac.uk, http://www.bbsrc.ac.uk/
Hagen Spies
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg, Germany
Hagen.Spies@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de
Dr.-Ing. Norbert Stein
VITRONIC Dr.-Ing. Stein Bildverarbeitungssysteme GmbH
Hasengartenstrasse 14a, D-65189 Wiesbaden, Germany
st@vitronic.de, http://www.vitronic.de
Michael Stöhr
Forschungsgruppe Bildverarbeitung, IWR, Universität Heidelberg
Im Neuenheimer Feld 368, D-69120 Heidelberg, Germany
Michael.Stoehr@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de
Hamid R. Tizhoosh
Universität Magdeburg (IPE)
P.O. Box 4120, D-39016 Magdeburg, Germany
tizhoosh@ipe.et.uni-magdeburg.de
http://pmt05.et.uni-magdeburg.de/˜hamid/
Dr. Dietmar Uttenweiler
II. Physiologisches Institut, Universität Heidelberg
Im Neuenheimer Feld 326, D-69120 Heidelberg, Germany
dietmar.uttenweiler@urz.uni-heidelberg.de
Prof. Dr. Luc Van Gool
Katholieke Universiteit Leuven, ESAT-PSI/VISICS
Kardinaal Mercierlaan 94, B-3001 Heverlee, Belgium
Mark Wenig
Institut für Umweltphysik, Universität Heidelberg
Im Neuenheimer Feld 229, D-69120 Heidelberg, Germany
Mark.Wenig@iwr.uni-heidelberg.de
http://klimt.iwr.uni-heidelberg.de/˜mwenig
Georg Wiora
DaimlerChrysler AG, Research and Development
Wilhelm-Runge-Str. 11, D-89081 Ulm, Germany
1 Introduction

1.1 Components of a vision system
Computer vision is a complex subject. As such it is helpful to divide it into its various components or function modules. On this level, it is also much easier to compare a technical system with a biological system. In this sense, the basic common functionality of biological and machine vision includes the following components (see also Table 1.1):
Radiation source. If no radiation is emitted from the scene or the object of interest, nothing can be observed or processed. Thus appropriate illumination is necessary for objects that are themselves not radiant.

Camera. The "camera" collects the radiation received from the object in such a way that the radiation's origins can be pinpointed. In the simplest case this is just an optical lens. But it could also be a completely different system, for example, an imaging optical spectrometer, an x-ray tomograph, or a microwave dish.

Sensor. The sensor converts the received radiative flux density into a suitable signal for further processing. For an imaging system normally a 2-D array of sensors is required to capture the spatial distribution of the radiation. With an appropriate scanning system in some cases a single sensor or a row of sensors could be sufficient.
Table 1.1: Function modules of human and machine vision

Visualization
Human vision: passive, mainly by reflection of light from opaque surfaces.
Machine vision: passive and active (controlled illumination) using electromagnetic, particulate, and acoustic radiation.

Image formation
Human vision: refractive optical system.
Machine vision: various systems.

Control of irradiance
Human vision: muscle-controlled pupil.
Machine vision: motorized apertures, filter wheels, tunable filters.

Focusing
Human vision: muscle-controlled change of focal length.
Machine vision: autofocus systems based on various principles of distance measurements.

Irradiance resolution
Human vision: logarithmic sensitivity.
Machine vision: linear sensitivity, quantization between 8 and 16 bits; logarithmic sensitivity.

Tracking
Human vision: highly mobile eyeball.
Machine vision: scanner and robot-mounted cameras.

Processing and analysis
Human vision: hierarchically organized massively parallel processing.
Machine vision: serial processing still dominant; parallel processing not in general use.
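The contrast drawn in Table 1.1 between the logarithmic sensitivity of the eye and the linear sensitivity of most image sensors can be made concrete with a small numerical sketch. The dynamic range, bit depth, and sensor characteristics below are illustrative assumptions, not data from the table:

```python
import math

# Compare how linear and logarithmic sensor characteristics cover a
# wide irradiance range (assumed dynamic range 1:10 000, 8-bit codes).

def linear_code(E, E_max=1.0, bits=8):
    """Linear sensitivity: equal *absolute* irradiance steps per code value."""
    levels = 2 ** bits
    return min(levels - 1, max(0, int(E / E_max * levels)))

def log_code(E, E_min=1e-4, E_max=1.0, bits=8):
    """Logarithmic sensitivity: equal *relative* irradiance steps per code value."""
    levels = 2 ** bits
    frac = math.log(E / E_min) / math.log(E_max / E_min)
    return min(levels - 1, max(0, int(frac * levels)))

# A dim scene detail at 0.1 % of full scale:
E = 1e-3
print(linear_code(E))   # collapses into the lowest code value: detail lost
print(log_code(E))      # still resolved with a usable code value
```

The point of the sketch is that a linear characteristic spends most of its code values on the brightest decade, while a logarithmic one distributes them evenly over relative intensity changes.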
Processing unit. It processes the incoming, generally higher-dimensional data, extracting suitable features that can be used to measure object properties and categorize them into classes. Another important component is a memory system to collect and store knowledge about the scene, including mechanisms to delete unimportant things.

Actors. Actors react to the result of the visual observation. They become an integral part of the vision system when the vision system is actively responding to the observation by, for example, tracking an object of interest or by using vision-guided navigation (active vision, perception action cycle).
1.2 Imaging systems
Imaging systems cover all processes involved in the formation of an image from objects and the sensors that convert radiation into electric signals, and further into digital signals that can be processed by a computer. Generally the goal is to attain a signal from an object in such a form that we know where it is (geometry), and what it is or what properties it has.
Figure 1.1: Chain of steps linking an object property to the signal measured by an imaging system (object property → irradiance E(x) → photosensor → electric signal g(x) → ADC/sampling → digital image G_mn).
It is important to note that the type of answer we receive from these two implicit questions depends on the purpose of the vision system. The answer could be of either a qualitative or a quantitative nature. For some applications it could be sufficient to obtain a qualitative answer like "there is a car on the left coming towards you." The "what" and "where" questions can thus cover the entire range from "there is something," a specification of the object in the form of a class, to a detailed quantitative description of various properties of the objects of interest.
The relation that links the object property to the signal measured by an imaging system is a complex chain of processes (Fig. 1.1). Interaction of the radiation with the object (possibly using an appropriate illumination system) causes the object to emit radiation. A portion (usually only a very small part) of the emitted radiative energy is collected by the optical system and perceived as an irradiance (radiative energy/area). A sensor (or rather an array of sensors) converts the received radiation into an electrical signal that is subsequently sampled and digitized to form a digital image as an array of digital numbers.
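The last two steps of this chain, sampling on a sensor grid and analog-to-digital conversion, can be sketched as a toy model. The grid size, bit depth, and test pattern below are illustrative assumptions, not a real sensor characteristic:

```python
import math

# Toy model of the final steps of the imaging chain: a continuous
# irradiance pattern E(y, x) is sampled at the pixel centers of a
# regular sensor grid and quantized by an ADC into a digital image
# G_mn of integer gray values.

def sample_and_quantize(irradiance, rows, cols, bits=8):
    """Sample irradiance (a function of (y, x) in [0,1)^2 returning
    values in [0, 1]) at pixel centers and quantize to `bits` bits."""
    levels = 2 ** bits
    image = []
    for m in range(rows):
        y = (m + 0.5) / rows              # pixel-center coordinate
        row = []
        for n in range(cols):
            x = (n + 0.5) / cols
            E = irradiance(y, x)          # irradiance at this sensor site
            g = int(E * (levels - 1) + 0.5)   # linear quantization
            row.append(min(levels - 1, max(0, g)))
        image.append(row)
    return image

# A smooth test pattern standing in for the radiance collected by the optics
pattern = lambda y, x: 0.5 + 0.5 * math.sin(2 * math.pi * x) * math.cos(2 * math.pi * y)

G = sample_and_quantize(pattern, rows=4, cols=4, bits=8)
print(len(G), len(G[0]), min(min(r) for r in G), max(max(r) for r in G))
```

Increasing `rows` and `cols` refines the spatial sampling, while `bits` controls the irradiance resolution; both choices limit what object properties survive into the digital image.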
Only direct imaging systems provide a direct point-to-point correspondence between points of the objects in the 3-D world and at the image plane. Indirect imaging systems also give a spatially distributed irradiance but with no such one-to-one relation. Generation of an image requires reconstruction of the object from the perceived irradiance. Examples of such imaging techniques include radar imaging, various techniques for spectral imaging, acoustic imaging, tomographic imaging, and magnetic resonance imaging.

1.3 Signal processing for computer vision
One-dimensional linear signal processing and system theory is a standard topic in electrical engineering and is covered by many standard textbooks (e.g., [1, 2]). There is a clear trend that the classical signal processing community is moving into multidimensional signals, as indicated, for example, by the new annual international IEEE conference on image processing (ICIP). This can also be seen from some recently published handbooks on this subject. The digital signal processing handbook by Madisetti and Williams [3] includes several chapters that deal with image processing. Likewise the transforms and applications handbook by Poularikas [4] is not restricted to 1-D transforms.
There are, however, only a few monographs that treat signal processing specifically for computer vision and image processing. The monograph by Lim [5] deals with 2-D signal and image processing and tries to transfer the classical techniques for the analysis of time series to 2-D spatial data. Granlund and Knutsson [6] were the first to publish a monograph on signal processing for computer vision and elaborate on a number of novel ideas such as tensorial image processing and normalized convolution that did not have their origin in classical signal processing.
nor-Time series are 1-D, signals in computer vision are of higher mension They are not restricted to digital images, that is, 2-D spatialsignals (Chapter8) Volumetric sampling, image sequences, and hyper-
di-spectral imaging all result in 3-D signals, a combination of any of these
techniques in even higher-dimensional signals
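Before turning to the processing itself, it is worth quantifying the data volumes that higher-dimensional signals imply. The following sketch computes raw storage as samples-per-dimension raised to the number of dimensions; the resolutions and byte depths are illustrative choices, not fixed specifications:

```python
# Estimate raw storage for a multidimensional signal, assuming the same
# number of samples along each dimension and a fixed byte depth per sample.
def storage_bytes(samples_per_dim: int, n_dims: int, bytes_per_sample: int = 1) -> int:
    """Raw size in bytes of an n_dims-dimensional signal."""
    return samples_per_dim ** n_dims * bytes_per_sample

# A medium-resolution volumetric image: 512^3 voxels at one byte per voxel.
volume = storage_bytes(512, 3)
print(volume // 2**20, "MB")  # 128 MB, the figure quoted in the text

# One more dimension (e.g., a volumetric sequence of 512 time steps)
# multiplies the requirement by another factor of 512.
sequence = storage_bytes(512, 4)
print(sequence // 2**30, "GB")  # 64 GB
```

The exponential growth with dimension, not any single factor, is what makes higher-dimensional signal processing expensive.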
How much more complex does signal processing become with increasing dimension? First, there is the explosion in the number of data points. Already a medium resolution volumetric image with 512³ voxels requires 128 MB if one voxel carries just one byte. Storage of even higher-dimensional data at comparable resolution is thus beyond the capabilities of today's computers.

Higher-dimensional signals pose another problem. While we do not have difficulty in grasping 2-D data, it is already significantly more demanding to visualize 3-D data because the human visual system is built only to see surfaces in 3-D but not volumetric 3-D data. The more dimensions are processed, the more important it is that computer graphics and computer vision move closer together.

The elementary framework for low-level signal processing for computer vision is worked out in Chapters 8 and 9. Of central importance are neighborhood operations (Chapter 9), including fast algorithms for local averaging (Section 9.5), and accurate interpolation (Section 9.6).

1.4 Pattern recognition for computer vision
The basic goal of signal processing in computer vision is the extraction of "suitable features" for subsequent processing to recognize and classify objects. But what is a suitable feature? This is still less well defined than in other applications of signal processing. Certainly a mathematically well-defined description of local structure as discussed in Section 9.8 is an important basis. As signals processed in computer vision come from dynamical 3-D scenes, important features also include motion (Chapter 10) and various techniques to infer the depth in scenes including stereo (Section 11.2), shape from shading and photometric stereo, and depth from focus (Section 11.3).
There is little doubt that nonlinear techniques are crucial for feature extraction in computer vision. However, compared to linear filter techniques, these techniques are still in their infancy. There is also no single nonlinear technique but there are a host of such techniques often specifically adapted to a certain purpose [7]. In this volume, we give an overview of the various classes of nonlinear filter techniques (Section 9.4) and focus on a first-order tensor representation of nonlinear filters by combination of linear convolution and nonlinear point operations (Section 9.8) and nonlinear diffusion filtering (Chapter 12).
In principle, pattern classification is nothing complex. Take some appropriate features and partition the feature space into classes. Why is it then so difficult for a computer vision system to recognize objects? The basic trouble is related to the fact that the dimensionality of the input space is so large. In principle, it would be possible to use the image itself as the input for a classification task, but no real-world classification technique—be it statistical, neuronal, or fuzzy—would be able to handle such high-dimensional feature spaces. Therefore, the need arises to extract features and to use them for classification.
Unfortunately, techniques for feature selection have very often been neglected in computer vision. They have not been developed to the same degree of sophistication as classification, where it is meanwhile well understood that the different techniques, especially statistical and neural techniques, can be considered under a unified view [8]. This book focuses in part on some more advanced feature-extraction techniques. An important role in this aspect is played by morphological operators (Chapter 14) because they manipulate the shape of objects in images. Fuzzy image processing (Chapter 16) contributes a tool to handle vague data and information.

Object recognition can be performed only if it is possible to represent the knowledge in an appropriate way. In simple cases the knowledge can just rest in simple models. Probabilistic modeling in computer vision is discussed in Chapter 15. In more complex cases this is not sufficient.

1.5 Performance evaluation of algorithms
A systematic evaluation of the algorithms for computer vision has been widely neglected. For a newcomer to computer vision with an engineering background or a general education in natural sciences this is a strange experience. It appears to him/her as if one would present results of measurements without giving error bars or even thinking about possible statistical and systematic errors.

What is the cause of this situation? On the one side, it is certainly true that some problems in computer vision are very hard and that it is even harder to perform a sophisticated error analysis. On the other hand, the computer vision community has ignored the fact to a large extent that any algorithm is only as good as its objective and solid evaluation and verification.
Fortunately, this misconception has been recognized in the meantime and there are serious efforts underway to establish generally accepted rules for the performance analysis of computer vision algorithms [9]. The three major criteria for the performance of computer vision algorithms are:

Successful solution of task. Any practitioner gives this a top priority. But also the designer of an algorithm should define precisely for which task it is suitable and what the limits are.
Accuracy. This includes an analysis of the statistical and systematic errors under carefully defined conditions (such as a given signal-to-noise ratio (SNR), etc.).
Speed. Again, this is an important criterion for the applicability of an algorithm.
There are different ways to evaluate algorithms according to the aforementioned criteria. Ideally this should include three classes of studies:

Analytical studies. This is the mathematically most rigorous way to verify algorithms, check error propagation, and predict catastrophic failures.

Performance tests with computer generated images. These tests are useful as they can be carried out under carefully controlled conditions.

Performance tests with real-world images. This is the final test for practical applications.
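As a minimal illustration of the second class of studies, the sketch below evaluates a simple box-average smoothing filter against a computer-generated signal with known ground truth and controlled noise. The test signal, noise level, and filter width are arbitrary demonstration choices, not a standardized benchmark:

```python
import math
import random

def box_filter(signal, width=9):
    """Moving average with reflected borders (a simple smoothing filter)."""
    half = width // 2
    padded = signal[half:0:-1] + signal + signal[-2:-half - 2:-1]
    return [sum(padded[i:i + width]) / width for i in range(len(signal))]

def rmse(a, b):
    """Root-mean-square error between two equally long sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

random.seed(0)                                        # reproducible test data
n = 1000
truth = [math.sin(2 * math.pi * i / 200) for i in range(n)]  # known ground truth
noisy = [t + random.gauss(0, 0.3) for t in truth]            # controlled noise level
smoothed = box_filter(noisy, width=9)

# Because the ground truth is known exactly, the error reduction achieved
# by the filter can be measured directly -- the point of synthetic tests.
print(rmse(smoothed, truth) < rmse(noisy, truth))  # True
```

Real-world images would then probe whether this measured behavior survives conditions the synthetic model does not capture.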
Much of the material presented in this volume is written in the spirit of a careful and mathematically well-founded analysis of the methods that are described, although the performance evaluation techniques are certainly more advanced in some areas than in others.
1.6 Classes of tasks
Applications of computer vision can be found today in almost every technical and scientific area. Thus it is not very helpful to list applications according to their field. In order to transfer experience from one application to another it is most useful to specify the problems that have to be solved and to categorize them into different classes.
Table 1.2: Classes of tasks (references are to chapters, sections, and application examples A1-A26):

Depth, 3-D optical metrology: 11.2, A2, A4, A5, A6, A26
2-D form & 2-D shape: 14, A13
3-D object shape: 6, 7, A2, A4, A5, A6, A7
Radiometry-related: 2
  Fluorescence: A17, A18, A25, A26
  Hyperspectral imaging: A22, A23, A24, A26
Motion: 10
  2-D motion field: 10, A16, A17, A19, A20
  3-D motion field: A19, A21
Spatial structure and texture:
  Local wave number; scale: 8.9, 10.4, 12, 13
  Local orientation: 9.8, 13
High-level tasks:
  Segmentation: 13, 14, A12, A13
  Object identification: A1, A12
  Object classification: A1, A22, ??
  Model- and knowledge-based recognition and retrieval: A1, A11, A12
3-D modeling:
  3-D object recognition: A6, A10, A7
  3-D object synthesis: A7
  Tracking: A8, A9, A10, A19, A20
An attempt at such a classification is made in Table 1.2. The table categorizes both the tasks with respect to 2-D imaging and the analysis of dynamical 3-D scenes. The second column contains references to chapters dealing with the corresponding task.
[4] Poularikas, A. D. (ed.), (1996). The Transforms and Applications Handbook. Boca Raton, FL: CRC Press.

[5] Lim, J. S., (1990). Two-dimensional Signal and Image Processing. Englewood Cliffs, NJ: Prentice-Hall.

[9] Haralick, R. M., Klette, R., Stiehl, H.-S., and Viergever, M. (eds.), (1999). Evaluation and Validation of Computer Vision Algorithms. Boston: Kluwer.

Part I
Sensors and Imaging
2 Radiation and Illumination
Horst Haußecker
Xerox Palo Alto Research Center (PARC)
2.1 Introduction
2.2 Fundamentals of electromagnetic radiation
  2.2.1 Electromagnetic waves
  2.2.2 Dispersion and attenuation
  2.2.3 Polarization of radiation
  2.2.4 Coherence of radiation
2.3 Radiometric quantities
  2.3.1 Solid angle
  2.3.2 Conventions and overview
  2.3.3 Definition of radiometric quantities
  2.3.4 Relationship of radiometric quantities
  2.3.5 Spectral distribution of radiation
2.4 Fundamental concepts of photometry
  2.4.1 Spectral response of the human eye
  2.4.2 Definition of photometric quantities
  2.4.3 Luminous efficacy
2.5 Interaction of radiation with matter
  2.5.1 Basic definitions and terminology
  2.5.2 Properties related to interfaces and surfaces
  2.5.3 Bulk-related properties of objects
2.6 Illumination techniques
  2.6.1 Directional illumination
  2.6.2 Diffuse illumination
  2.6.3 Rear illumination
  2.6.4 Light and dark field illumination
  2.6.5 Telecentric illumination
  2.6.6 Pulsed and modulated illumination
2.7 References

2.1 Introduction
Visual perception of scenes depends on appropriate illumination to visualize objects. The human visual system is limited to a very narrow portion of the spectrum of electromagnetic radiation, called light. In some cases natural sources, such as solar radiation, moonlight, lightning flashes, or bioluminescence, provide sufficient ambient light to navigate our environment. Because humankind was mainly restricted to daylight, one of the first attempts was to invent an artificial light source: fire (not only as a food preparation method).

Computer vision is not dependent upon visual radiation, fire, or glowing objects to illuminate scenes. As soon as imaging detector systems became available other types of radiation were used to probe scenes and objects of interest. Recent developments in imaging sensors cover almost the whole electromagnetic spectrum from x-rays to radio waves (Chapter 5). In standard computer vision applications illumination is frequently taken as given and optimized to illuminate objects evenly with high contrast. Such setups are appropriate for object identification and geometric measurements. Radiation, however, can also be used to visualize quantitatively physical properties of objects by analyzing their interaction with radiation (Section 2.5).
Physical quantities such as penetration depth or surface reflectivity are essential to probe the internal structures of objects, scene geometry, and surface-related properties. The properties of physical objects therefore can be encoded not only in the geometrical distribution of emitted radiation but also in the portion of radiation that is emitted, scattered, absorbed or reflected, and finally reaches the imaging system. Most of these processes are sensitive to certain wavelengths and additional information might be hidden in the spectral distribution of radiation. Using different types of radiation allows taking images from different depths or different object properties. As an example, infrared radiation of between 3 and 5 µm is absorbed by human skin to a depth of < 1 mm, while x-rays penetrate an entire body without major attenuation. Therefore, totally different properties of the human body (such as skin temperature as well as skeletal structures) can be revealed for medical diagnosis.
This chapter provides the fundamentals for a quantitative description of radiation emitted from sources, as well as the interaction of radiation with objects and matter. We will also show, using a few selected examples, how this knowledge can be used to design illumination setups for practical applications such that different physical properties of objects are visualized. Radiometry, the measurement of radiation properties by imaging systems, will be detailed in Chapter 4.
2.2 Fundamentals of electromagnetic radiation
2.2.1 Electromagnetic waves
Electromagnetic radiation consists of electromagnetic waves carrying energy and propagating through space. Electrical and magnetic fields are alternating with a temporal frequency ν and a spatial wavelength λ. The metric units of ν and λ are cycles per second (s⁻¹) and meter (m), respectively. The unit 1 s⁻¹ is also called one hertz (1 Hz). Wavelength and frequency of waves are related by the speed of light c:

    ν = c/λ    (2.1)
Photon energy. In addition to electromagnetic theory, radiation can be treated as a flow of particles, discrete packets of energy called photons. One photon travels at the speed of light c and carries the energy

    e_p = hν = hc/λ    (2.2)

where h = 6.626 × 10⁻³⁴ J s is Planck's constant. Therefore the energy content of radiation is quantized and can only be a multiple of hν for a certain frequency ν. While the energy per photon is given by Eq. (2.2), the total energy of radiation is given by the number of photons. It was this quantization of radiation that gave birth to the theory of quantum mechanics at the beginning of the twentieth century.
The energy of a single photon is usually given in electron volts (1 eV = 1.602 × 10⁻¹⁹ J). One eV constitutes the energy of an electron being accelerated in an electrical field with a potential difference of one volt. Although photons do not carry electrical charge this unit is useful in radiometry, as electromagnetic radiation is usually detected by interaction of radiation with electrical charges in sensors (Chapter 5). In solid-state sensors, for example, the energy of absorbed photons is used to lift electrons from the valence band into the conduction band of a semiconductor. The bandgap energy E_g defines the minimum photon energy required for this process. As a rule of thumb the detector material is sensitive to radiation with energies e_p > E_g. As an example,
indium antimonide (InSb) is a doped semiconductor with a bandgap of only 0.18 eV. It is sensitive to wavelengths below 6.9 µm (which can be derived from Eq. (2.2)).

[Figure 2.1: The electromagnetic spectrum. Frequency (Hz) and wavelength (m) axes span from γ-rays and x-rays through the ultraviolet bands (UV-C, UV-B, UV-A; 280 nm, 315 nm, 380 nm), the visible range (380-780 nm), and the infrared (1.4 µm, 3 µm, 100 µm, 1 mm) to microwaves (radar).]

Silicon (Si) has a bandgap of 1.1 eV and requires wavelengths below 1.1 µm to be detected. This shows why InSb can be used as detector material for infrared cameras in the 3-5 µm wavelength region, while silicon sensors are used for visible radiation. It also shows, however, that the sensitivity of standard silicon sensors extends beyond the visible range up to approximately 1 µm, which is often neglected in applications (Chapter 5).
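The cutoff wavelengths quoted above follow directly from Eq. (2.2): a photon can lift an electron across the bandgap only if e_p = hc/λ exceeds E_g, that is, if λ < hc/E_g. A quick numerical check, with the constants rounded as in the text:

```python
H = 6.626e-34      # Planck's constant [J s]
C = 2.998e8        # speed of light [m/s]
EV = 1.602e-19     # one electron volt [J]

def cutoff_wavelength_um(bandgap_ev: float) -> float:
    """Longest detectable wavelength in micrometers for a given bandgap."""
    return H * C / (bandgap_ev * EV) * 1e6  # hc / E_g, converted m -> um

print(round(cutoff_wavelength_um(0.18), 1))  # 6.9  (InSb, infrared)
print(round(cutoff_wavelength_um(1.1), 1))   # 1.1  (Si, visible/near infrared)
```

Both values reproduce the figures stated in the text for InSb and silicon.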
Electromagnetic spectrum. Monochromatic radiation consists of only one frequency and wavelength. The distribution of radiation over the range of possible wavelengths is called spectrum or spectral distribution. Figure 2.1 shows the spectrum of electromagnetic radiation together with the standardized terminology1 separating different parts. Electromagnetic radiation covers the whole range from very high energy cosmic rays with wavelengths on the order of 10⁻¹⁶ m (ν = 10²⁴ Hz) to sound frequencies above wavelengths of 10⁶ m (ν = 10² Hz). Only a very narrow band of radiation between 380 and 780 nm is visible to the human eye.
Each portion of the electromagnetic spectrum obeys the same principal physical laws. Radiation of different wavelengths, however, appears to have different properties in terms of interaction with matter and detectability that can be used for wavelength selective detectors. For the last one hundred years detectors have been developed for radiation of almost any region of the electromagnetic spectrum. Recent developments in detector technology incorporate point sensors into integrated detector arrays, which allows setting up imaging radiometers instead of point measuring devices. Quantitative measurements of the spatial distribution of radiometric properties are now available for remote sensing at almost any wavelength.

2.2.2 Dispersion and attenuation
A mixture of radiation consisting of different wavelengths is subject to different speeds of light within the medium it is propagating in. This fact is the basic reason for optical phenomena such as refraction and dispersion. While refraction changes the propagation direction of a beam of radiation passing the interface between two media with different optical properties, dispersion separates radiation of different wavelengths (Section 2.5.2).

2.2.3 Polarization of radiation
In electromagnetic theory, radiation is described as oscillating electric and magnetic fields, denoted by the electric field strength E and the magnetic field strength B, respectively. Both vector fields are given by the solution of a set of differential equations, referred to as Maxwell's equations.

In free space, that is, without electric sources and currents, a special solution is a harmonic planar wave, propagating linearly in space and time. As Maxwell's equations are linear equations, the superposition of two solutions also yields a solution. This fact is commonly referred to
as the superposition principle. The superposition principle allows us to explain the phenomenon of polarization, another important property of electromagnetic radiation. In general, the 3-D orientation of vector E changes over time and mixtures of electromagnetic waves show
1 International Commission on Illumination (Commission Internationale de l’Eclairage, CIE); http://www.cie.co.at/cie
Trang 39Figure 2.2: Illustration of a linear and b circular polarization of
electromag-netic radiation (By C Garbe, University of Heidelberg.)
randomly distributed orientation directions of E. If, however, the electromagnetic field vector E is confined to a plane, the radiation is called linearly polarized (Fig. 2.2a).
If two linearly polarized electromagnetic waves are traveling in the same direction, the resulting electric field vector is given by E = E1 + E2. Depending on the phase shift Φ in the oscillations of E1 and E2, the net electric field vector E remains linearly polarized (Φ = 0), or rotates around the propagation direction of the wave. For a phase shift of Φ = 90°, the wave is called circularly polarized (Fig. 2.2b). The general case consists of elliptical polarization, that is, mixtures between both cases.
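The role of the phase shift Φ can be checked numerically by superposing two orthogonal, linearly polarized components Ex = cos ωt and Ey = cos(ωt − Φ) and tracing the tip of the resulting field vector over one period; the unit amplitudes are an arbitrary choice for this sketch:

```python
import math

def field_tip(phi: float, steps: int = 360):
    """Sample the tip of E = (Ex, Ey) over one period for phase shift phi."""
    return [(math.cos(t), math.cos(t - phi))
            for t in (2 * math.pi * k / steps for k in range(steps))]

linear = field_tip(0.0)            # Phi = 0: Ex equals Ey, the tip stays on a line
circular = field_tip(math.pi / 2)  # Phi = 90 deg: |E| is constant, the tip traces a circle

print(all(abs(ex - ey) < 1e-12 for ex, ey in linear))                  # True
print(all(abs(ex * ex + ey * ey - 1.0) < 1e-12 for ex, ey in circular))  # True
```

Intermediate phase shifts make the same loop trace an ellipse, the general case described above.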
Due to polarization, radiation exhibits different properties in different directions, such as, for example, directional reflectivity or polarization-dependent transmissivity.

2.2.4 Coherence of radiation
Mixtures of electromagnetic waves, which are emitted from conventional light sources, do not show any spatial and temporal relation. The phase shifts between the electric field vectors E and the corresponding orientations are randomly distributed. Such radiation is called incoherent.

Special types of light sources, mainly those operating by stimulated emission of radiation (e.g., lasers), emit radiation with a fixed systematic relationship between the phases of the electromagnetic field vectors, a property called coherence. Such radiation can be subject to constructive and destructive interference if it is superposed. As the electric field vectors can add up to high amplitudes, the local energy impact of coherent radiation is much more severe and can cause damage to delicate body tissue.
the concept of plane angle into 3-D space. A plane angle θ is defined as the ratio of the arc length s on a circle to the radius r centered at the point of definition:
    θ = s/r    (2.3)
The arc length s can be considered as the projection of an arbitrary line in the plane onto the circle (Fig. 2.3). Plane angles are measured in rad (radians). A plane angle θ quantifies the angular subtense of a line segment in the plane viewed from the point of definition. A circle has a circumference of 2πr and, therefore, subtends a plane angle of 2π rad.
A solid angle Ω is similarly defined as the ratio of an area A on the surface of a sphere to the squared radius, as shown in Fig. 2.4:
    Ω = A/r²    (2.4)
The area segment A can be considered as the projection of an arbitrarily shaped area in 3-D space onto the surface of a sphere. Solid angles are measured in sr (steradian). They quantify the areal subtense of a 2-D surface area in 3-D space viewed from the point of definition. A sphere subtends a surface area of 4πr², which corresponds to a solid angle of 4π sr. Given a surface area A that is tilted under some angle θ between the surface normal and the line of sight, the solid angle is reduced by a factor of cos θ:
    Ω = A cos θ / r²    (2.5)
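These definitions are easy to verify numerically; the radius, patch area, and tilt angle below are arbitrary values chosen for the check:

```python
import math

def solid_angle(area: float, r: float, theta: float = 0.0) -> float:
    """Solid angle [sr] subtended by area A at distance r, tilted by theta."""
    return area * math.cos(theta) / r**2

r = 2.0

# The full sphere surface 4*pi*r^2 subtends 4*pi sr, independent of r.
print(math.isclose(solid_angle(4 * math.pi * r**2, r), 4 * math.pi))  # True

# Tilting a small patch by 60 degrees halves its solid angle (cos 60 deg = 0.5).
patch = 1e-4
print(math.isclose(solid_angle(patch, r, math.radians(60)),
                   0.5 * solid_angle(patch, r)))  # True
```

The same function with theta = 0 also reproduces the untilted definition, so one formula covers both cases.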