2008-02-02
It is over, thank you everyone.

[This post was last edited by the author on 2009-2-5 1:45:04]

Preface
"We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it .... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge .... The most important questions of life are, for the most part, really only problems of probability." So said the famous French mathematician and astronomer (the "Newton of France") Pierre Simon, Marquis de Laplace. Although many people might feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual had learned to ask not "Is it so?" but rather "What is the probability that it is so?" .
This book is intended as an elementary introduction to the theory of probability for students in mathematics, statistics, engineering, and the sciences (including computer science, the social sciences and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.
In Chapter 1 we present the basic principles of combinatorial analysis, which are most useful in computing probabilities.
In Chapter 2 we consider the axioms of probability theory and show how they can be applied to compute various probabilities of interest.
Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by "conditioning" reappears in Chapter 7, where we use it to obtain expectations.
In Chapters 4, 5, and 6 we introduce the concept of random variables. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5: These quantities are then determined for many of the common types of random variables.
Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and moment generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
In Chapter 8 we present the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Levy's continuity theorem. Also in this chapter we present such probability inequalities as Markov's inequality, Chebyshev's inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability for a Poisson random variable having the same expected value.
Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.
The sixth edition continues the evolution and fine tuning of the text. There are many new exercises and examples. Among the latter are examples on utility (Example 4c of Chapter 4), on normal approximations (Example 4i of Chapter 5), on applying the lognormal distribution to finance (Example 3d of Chapter 6), and on coupon collecting with general collection probabilities (Example 2v of Chapter 7). There are also new optional subsections in Chapter 7 dealing with the probabilistic method (Subsection 7.2.1), and with the maximum-minimums identity (Subsection 7.2.2).
As in the previous edition, three sets of exercises are given at the end of each chapter. They are designated as Problems, Theoretical Exercises, and Self-Test Problems and Exercises. This last set of exercises, for which complete solutions appear in Appendix B, is designed to help students test their comprehension and study for exams.
All materials included on the Probability Models diskette from previous editions can now be downloaded from the Ross companion website at http://www.prenhall.com/Ross. Using the website, students will be able to perform calculations and simulations quickly and easily in six key areas:
Three of the modules derive probabilities for, respectively, binomial, Poisson, and normal random variables.
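As a rough illustration (not the website's actual modules), the same three kinds of probabilities can be computed in a few lines of Python with scipy.stats; the parameter values below are arbitrary examples, not ones prescribed by the text.

```python
# Minimal sketch of the three probability calculations (binomial, Poisson, normal).
# The parameter values are arbitrary examples.
from scipy.stats import binom, poisson, norm

print(binom.cdf(3, n=10, p=0.4))   # P(X <= 3) for X ~ Binomial(10, 0.4)
print(poisson.pmf(2, mu=1.5))      # P(X = 2)  for X ~ Poisson(1.5)
print(norm.cdf(1.96))              # P(Z <= 1.96) for Z ~ Normal(0, 1)
```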
Another module illustrates the central limit theorem. It considers random variables that take on one of the values 0, 1, 2, 3, 4 and allows the user to enter the probabilities for these values along with a number n. The module then plots the probability mass function of the sum of n independent random variables of this type. By increasing n one can "see" the mass function converge to the shape of a normal density function.
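The idea behind that module can be sketched directly: the mass function of a sum of n independent copies is the n-fold convolution of the single-variable mass function, so repeatedly convolving and plotting shows the bell shape emerge. The probabilities and n below are assumed example values, not ones taken from the module.

```python
# Sketch of the central-limit-theorem demonstration: convolve the pmf with
# itself n times to get the pmf of the sum, then plot it.
import numpy as np
import matplotlib.pyplot as plt

p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # assumed probabilities for the values 0..4
n = 30                                   # assumed number of summands

pmf = p.copy()
for _ in range(n - 1):
    pmf = np.convolve(pmf, p)            # pmf of the sum after adding one more variable

plt.bar(np.arange(pmf.size), pmf)
plt.title(f"pmf of the sum of {n} i.i.d. variables on 0..4")
plt.show()
```

Increasing n makes the bar chart look progressively more like a normal density, which is exactly the convergence the module lets the user "see".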
The other two modules illustrate the strong law of large numbers. Again the user enters probabilities for the five possible values of the random variable along with an integer n. The program then uses random numbers to simulate n random variables having the prescribed distribution. The modules graph the number of times each outcome occurs along with the average of all outcomes. The modules differ in how they graph the results of the trials.
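A similar simulation is easy to reproduce. The sketch below, with assumed probabilities and an assumed sample size, counts how often each outcome occurs and tracks the running average, which the strong law says converges to the true mean.

```python
# Sketch of the strong-law demonstration: simulate n draws, count each outcome,
# and compare the running average with the theoretical mean.
import numpy as np

p = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # assumed probabilities for the values 0..4
n = 10_000                               # assumed number of simulated variables

rng = np.random.default_rng(0)
draws = rng.choice(5, size=n, p=p)

counts = np.bincount(draws, minlength=5)              # times each outcome occurred
running_avg = np.cumsum(draws) / np.arange(1, n + 1)  # average after each trial

print("outcome counts:", counts)
print("final average:", running_avg[-1])
print("theoretical mean:", float(np.arange(5) @ p))
```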
We would like to thank the following reviewers whose helpful comments and suggestions contributed to recent editions of this book: Anastasia Ivanova, University of North Carolina; Richard Bass, University of Connecticut; Ed Wheeler, University of Tennessee; Jean Cadet, State University of New York at Stony Brook; Jim Propp, University of Wisconsin; Mike Hardy, Massachusetts Institute of Technology; Anant Godbole, Michigan Technical University; Zakkula Govindarajulu, University of Kentucky; Richard Groeneveld, Iowa State University; Bernard Harris, University of Wisconsin; Stephen Herschkorn, Rutgers University; Robert Keener, University of Michigan; Thomas Liggett, University of California, Los Angeles; Bill McCormick, University of Georgia; and Kathryn Prewitt, Arizona State University. Special thanks go to Hossein Hamedani, Marquette University, and Ben Perles for their hard work in accuracy checking this manuscript.
We also express gratitude to the reviewers on earlier editions: Thomas R. Fischer, Texas A & M University; Jay DeVore, California Polytechnic University, San Luis Obispo; Robb J. Muirhead, University of Michigan; David Heath, Cornell University; Myra Samuels, Purdue University; I. R. Savage, Yale University; R. Miller, Stanford University; K. B. Athreya, Iowa State University; Phillip Beckwith, Michigan Tech; Howard Bird, St. Cloud State University; Steven Chiappari, Santa Clara University; James Clay, University of Arizona at Tucson; Francis Conlan, University of Santa Clara; Fred Leysieffer, Florida State University; Ian McKeague, Florida State University; Helmut Mayer, University of Georgia; N. U. Prabhu, Cornell University; Art Schwartz, University of Michigan at Ann Arbor; Therese Shelton, Southwestern University; and Allen Webster, Bradley University.
S.R.

Attachments

191786.rar (9.54 MB), 1 forum coin: designated textbook for the North American actuarial Exam P - A First Course in Probability
191787.rar (9.54 MB), 1 forum coin: designated textbook for the North American actuarial Exam P - A First Course in Probability
191788.rar (9.54 MB), 1 forum coin: designated textbook for the North American actuarial Exam P - A First Course in Probability
191789.rar (8.29 MB), 1 forum coin: designated textbook for the North American actuarial Exam P - A First Course in Probability


All replies
2008-2-2 02:13:00
Forgot to mention: it's the English version.

2008-2-5 12:23:00
aaaaaa

2008-2-9 01:52:00
"Designated textbook for the North American actuarial Exam P - A First Course in Probability"
How about I exchange the above book with you?

2008-2-9 01:57:00
"Designated textbook for the North American actuarial Exam P - A First Course in Probability"
I wonder whether your file is this one? If yes, I don't need to exchange.
Attachments

192306.jpg (original size: 40.87 KB): designated textbook for the North American actuarial Exam P - A First Course in Probability


2008-2-14 02:44:00
ding
