1 General Characteristic of Control Systems...............................................1
1.1 Subject and Scope of Control Theory................................................1
1.2 Basic Terms .......................................................................................2
1.2.1 Control Plant ...............................................................................4
1.2.2 Controller ....................................................................................6
1.3 Classification of Control Systems......................................................7
1.3.1 Classification with Respect to Connection Between Plant and Controller .......7
1.3.2 Classification with Respect to Control Goal...............................9
1.3.3 Other Cases ...............................................................................11
1.4 Stages of Control System Design ....................................................13
1.5 Relations Between Control Science and Related Areas in Science and Technology ......14
1.6 Character, Scope and Composition of the Book..............................15
2 Formal Models of Control Systems ........................................................17
2.1 Description of a Signal ....................................................................17
2.2 Static Plant .......................................................................................18
2.3 Continuous Dynamical Plant ...........................................................19
2.3.1 State Vector Description...........................................................20
2.3.2 “Input-output” Description by Means of Differential Equation ......24
2.3.3 Operational Form of “Input-output” Description......................25
2.4 Discrete Dynamical Plant ................................................................29
2.5 Control Algorithm ...........................................................................31
2.6 Introduction to Control System Analysis.........................................33
2.6.1 Continuous System...................................................................35
2.6.2 Discrete System........................................................................37
3 Control for the Given State (the Given Output)......................................41
3.1 Control of a Static Plant...................................................................41
3.2 Control of a Dynamical Plant. Controllability.................................44
3.3 Control of a Measurable Plant in the Closed-loop System.............47
3.4 Observability....................................................................................50
3.5 Control with an Observer in the Closed-loop System .....................55
3.6 Structural Approach.........................................................................59
3.7 Additional Remarks .........................................................................62
4 Optimal Control with Complete Information on the Plant .....................65
4.1 Control of a Static Plant...................................................................65
4.2 Problems of Optimal Control for Dynamical Plants........................69
4.2.1 Discrete Plant ............................................................................69
4.2.2 Continuous Plant.......................................................................72
4.3 Principle of Optimality and Dynamic Programming .......................74
4.4 Bellman Equation ............................................................................79
4.5 Maximum Principle .........................................................................85
4.6 Linear-quadratic Problem................................................................93
5 Parametric Optimization.........................................................................97
5.1 General Idea of Parametric Optimization ........................................97
5.2 Continuous Linear Control System..................................................99
5.3 Discrete Linear Control System.....................................................105
5.4 System with the Measurement of Disturbances.............................107
5.5 Typical Forms of Control Algorithms in Closed-loop Systems ....110
5.5.1 Linear Controller.....................................................................111
5.5.2 Two-position Controller .........................................................112
5.5.3 Neuron-like Controller............................................................112
5.5.4 Fuzzy Controller .....................................................................113
6 Application of Relational Description of Uncertainty.........................117
6.1 Uncertainty and Relational Knowledge Representation ................117
6.2 Analysis Problem...........................................................................122
6.3 Decision Making Problem.............................................................127
6.4 Dynamical Relational Plant ...........................................................130
6.5 Determinization .............................................................................136
7 Application of Probabilistic Descriptions of Uncertainty.....................143
7.1 Basic Problems for Static Plant and Parametric Uncertainty........143
7.2 Basic Problems for Static Plant and Non-parametric Uncertainty ......152
7.3 Control of Static Plant Using Results of Observations..................157
7.3.1 Indirect Approach ...................................................................158
7.3.2 Direct Approach......................................................................164
7.4 Application of Game Theory........................................................165
7.5 Basic Problem for Dynamical Plant...............................................170
7.6 Stationary Stochastic Process ........................................................174
7.7 Analysis and Parametric Optimization of Linear Closed-loop Control System with Stationary Stochastic Disturbances ..........178
7.8 Non-parametric Optimization of Linear Closed-loop Control System with Stationary Stochastic Disturbances ..........183
7.9 Relational Plant with Random Parameter ......................................188
8 Uncertain Variables and Their Applications.........................................193
8.1 Uncertain Variables .......................................................................193
8.2 Application of Uncertain Variables to Analysis and Decision Making (Control) for Static Plant ..........201
8.2.1 Parametric Uncertainty ...........................................................201
8.2.2 Non-parametric Uncertainty ...................................................205
8.3 Relational Plant with Uncertain Parameter....................................211
8.4 Control for Dynamical Plants. Uncertain Controller .....................216
9 Fuzzy Variables, Analogies and Soft Variables ...................................221
9.1 Fuzzy Sets and Fuzzy Numbers.....................................................221
9.2 Application of Fuzzy Description to Decision Making (Control) for Static Plant ..........228
9.2.1 Plant without Disturbances .....................................................228
9.2.2 Plant with External Disturbances............................................233
9.3 Comparison of Uncertain Variables with Random and Fuzzy Variables ..........238
9.4 Comparisons and Analogies for Non-parametric Problems ..........242
9.5 Introduction to Soft Variables........................................................246
9.6 Descriptive and Prescriptive Approaches. Quality of Decisions ...249
9.7 Control for Dynamical Plants. Fuzzy Controller ...........................255
10 Control in Closed-loop System. Stability ...........................................259
10.1 General Problem Description.......................................................259
10.2 Stability Conditions for Linear Stationary System......................264
10.2.1 Continuous System...............................................................264
10.2.2 Discrete System....................................................................266
10.3 Stability of Non-linear and Non-stationary Discrete Systems .....270
10.4 Stability of Non-linear and Non-stationary Continuous Systems .....277
10.5 Special Case. Describing Function Method.................................278
10.6 Stability of Uncertain Systems. Robustness ................................282
10.7 An Approach Based on Random and Uncertain Variables..........291
10.8 Convergence of Static Optimization Process...............................295