Probability and Statistics by Example
By Yuri Suhov and Mark Kelbert
Publisher: Cambridge University Press
Number Of Pages: 372
Publication Date: 2005-11-07
Sales Rank: 1280294
ISBN / ASIN: 0521612330
EAN: 9780521612333
Binding: Paperback
Because probability and statistics are as much about intuition and problem solving as they are about theorem proving, students can find it very difficult to make a successful transition from lectures to examinations and practice. Since the subject is critical in many modern applications, Yuri Suhov and Mark Kelbert address the shortcomings of traditional lecture-based teaching by combining a wealth of exercises with complete solutions. These solutions are adapted to the needs and skills of students and include basic mathematical facts as needed.
http://mihd.net/37ce9j
http://rapidshare.com/files/49604290/Suho0521612330.rar
Part I Basic probability 1
1 Discrete outcomes 3
1.1 A uniform distribution 3
1.2 Conditional Probabilities. The Bayes Theorem. Independent trials 6
1.3 The exclusion–inclusion formula. The ballot problem 27
1.4 Random variables. Expectation and conditional expectation. Joint distributions 33
1.5 The binomial, Poisson and geometric distributions. Probability generating, moment generating and characteristic functions 54
1.6 Chebyshev’s and Markov’s inequalities. Jensen’s inequality. The Law of Large Numbers and the De Moivre–Laplace Theorem 75
1.7 Branching processes 96
2 Continuous outcomes 108
2.1 Uniform distribution. Probability density functions. Random variables. Independence 108
2.2 Expectation, conditional expectation, variance, generating function, characteristic function 142
2.3 Normal distributions. Convergence of random variables and distributions. The Central Limit Theorem 168
Part II Basic statistics 191
3 Parameter estimation 193
3.1 Preliminaries. Some important probability distributions 193
3.2 Estimators. Unbiasedness 204
3.3 Sufficient statistics. The factorisation criterion 209
3.4 Maximum likelihood estimators 213
3.5 Normal samples. The Fisher Theorem 215
3.6 Mean square errors. The Rao–Blackwell Theorem. The Cramér–Rao inequality 218
3.7 Exponential families 225
3.8 Confidence intervals 229
3.9 Bayesian estimation 233
4 Hypothesis testing 242
4.1 Type I and type II error probabilities. Most powerful tests 242
4.2 Likelihood ratio tests. The Neyman–Pearson Lemma and beyond 243
4.3 Goodness of fit. Testing normal distributions, 1: homogeneous samples 252
4.4 The Pearson χ² test. The Pearson Theorem 257
4.5 Generalised likelihood ratio tests. The Wilks Theorem 261
4.6 Contingency tables 270
4.7 Testing normal distributions, 2: non-homogeneous samples 276
4.8 Linear regression. The least squares estimators 289
4.9 Linear regression for normal distributions 292
5 Cambridge University Mathematical Tripos examination questions in IB Statistics (1992–1999) 298
Appendix 1 Tables of random variables and probability distributions 346
Appendix 2 Index of Cambridge University Mathematical Tripos examination questions in IA Probability (1992–1999) 349
Bibliography 352
Index 358