
A FIRST COURSE IN PROBABILITY SHELDON ROSS 5TH ED

01-02-2014, 11:57
At the time of checking, this auction had already ended.
Buy-it-now price: 90 zł
Seller: bazar-wiedzy
Auction number: 3849774860
Location: Warszawa
Views: 3
Ended: 01-02-2014 11:44:05

Additional information:
Condition: New
Cover: paperback
Year of publication: 1998
State: no signs of use
Language: English

We recommend books for professionals at the lowest prices on Allegro.

Remember, you can always call us at
533 [zasłonięte] 038
/ on weekdays from 10 a.m. to 5 p.m. / or write to
[zasłonięte]@ksiegarnia-fachowa.pl

 


PRENTICE HALL, 514 PAGES

Ex-display copy

A First Course in Probability

"We see that the theory of probability is at bottom only common sense reduced to calculation; it makes us appreciate with exactitude what reasonable minds feel by a sort of instinct, often without being able to account for it.... It is remarkable that this science, which originated in the consideration of games of chance, should have become the most important object of human knowledge.... The most important questions of life are, for the most part, really only problems of probability." So said the famous French mathematician and astronomer (the "Newton of France") Pierre Simon, Marquis de Laplace. Although many people might feel that the famous marquis, who was also one of the great contributors to the development of probability, might have exaggerated somewhat, it is nevertheless true that probability theory has become a tool of fundamental importance to nearly all scientists, engineers, medical practitioners, jurists, and industrialists. In fact, the enlightened individual has learned to ask not "Is it so?" but rather "What is the probability that it is so?"

This book is intended as an elementary introduction to the mathematical theory of probability for students in mathematics, engineering, and the sciences (including the social sciences and management science) who possess the prerequisite knowledge of elementary calculus. It attempts to present not only the mathematics of probability theory, but also, through numerous examples, the many diverse possible applications of this subject.

In Chapter 1 we present the basic principles of combinatorial analysis, which are most useful in computing probabilities.

In Chapter 2 we consider the axioms of probability theory and show how they can be applied to compute various probabilities of interest. This chapter includes a proof of the important (and, unfortunately, often neglected) continuity property of probabilities, which is then used in the study of a "logical paradox."

Chapter 3 deals with the extremely important subjects of conditional probability and independence of events. By a series of examples we illustrate how conditional probabilities come into play not only when some partial information is available, but also as a tool to enable us to compute probabilities more easily, even when no partial information is present. This extremely important technique of obtaining probabilities by "conditioning" reappears in Chapter 7, where we use it to obtain expectations.

In Chapters 4, 5, and 6 we introduce the concept of random variables. Discrete random variables are dealt with in Chapter 4, continuous random variables in Chapter 5, and jointly distributed random variables in Chapter 6. The important concepts of the expected value and the variance of a random variable are introduced in Chapters 4 and 5. These quantities are then determined for many of the common types of random variables.

Additional properties of the expected value are considered in Chapter 7. Many examples illustrating the usefulness of the result that the expected value of a sum of random variables is equal to the sum of their expected values are presented. Sections on conditional expectation, including its use in prediction, and moment generating functions are contained in this chapter. In addition, the final section introduces the multivariate normal distribution and presents a simple proof concerning the joint distribution of the sample mean and sample variance of a sample from a normal distribution.
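The "expected value of a sum equals the sum of expected values" result mentioned above can be illustrated with a small simulation. This is a sketch of my own (the fixed-point example and all numbers are mine, not taken from the book): the expected number of fixed points of a random permutation of n items is n · (1/n) = 1 by linearity of expectation, even though the indicator variables involved are not independent.

```python
import random

def mean_fixed_points(n, trials, seed=0):
    """Estimate the expected number of fixed points of a random
    permutation of n items by Monte Carlo simulation."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        rng.shuffle(perm)
        # Count positions i with perm[i] == i (the fixed points).
        total += sum(1 for i, p in enumerate(perm) if i == p)
    return total / trials

# Linearity of expectation predicts the answer is exactly 1 for any n.
est = mean_fixed_points(10, 20000)
```

The point of the example is that linearity requires no independence assumption, which is what makes it such a convenient computational tool.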

In Chapter 8 we present the major theoretical results of probability theory. In particular, we prove the strong law of large numbers and the central limit theorem. Our proof of the strong law is a relatively simple one which assumes that the random variables have a finite fourth moment, and our proof of the central limit theorem assumes Lévy's continuity theorem. Also in this chapter we present such probability inequalities as Markov's inequality, Chebyshev's inequality, and Chernoff bounds. The final section of Chapter 8 gives a bound on the error involved when a probability concerning a sum of independent Bernoulli random variables is approximated by the corresponding probability for a Poisson random variable having the same expected value.
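The Poisson approximation described at the end of that paragraph is easy to check numerically. The sketch below (my own illustration; the parameters n = 100, p = 0.02 are arbitrary choices, and the code does not reproduce the book's error bound) compares a binomial pmf with the Poisson pmf having the same mean:

```python
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(lam, k):
    """P(Y = k) for Y ~ Poisson(lam)."""
    return exp(-lam) * lam**k / factorial(k)

# A sum of n independent Bernoulli(p) variables is Binomial(n, p);
# for large n and small p it is close to Poisson with the same mean n*p.
n, p = 100, 0.02
lam = n * p
max_diff = max(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
               for k in range(n + 1))
```

For these parameters the largest pointwise discrepancy is only a few thousandths, consistent with the kind of error bound the chapter derives.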

Chapter 9 presents some additional topics, such as Markov chains, the Poisson process, and an introduction to information and coding theory, and Chapter 10 considers simulation.

NEW TO THE FIFTH EDITION

Each chapter in the fifth edition has been updated in response to reviewers' comments. Professors who wish to move through the first chapters quickly will appreciate the addition of asterisks to denote optional sections that may safely be skipped. New text material includes a discussion of the odds ratio in Chapter 3 and two new discussions in Chapter 6: a section on exchangeable random variables and a discussion of the fact that independence is a symmetric relation.

A goal of the Fifth Edition is to make the book more accessible to students. The examples have been updated to include many interesting and practical ones, including the counterintuitive ace of spades versus the two of clubs problem (Example 5j in Chapter 2); the two girls problem (Example 3j in Chapter 3); the analysis of the quicksort algorithm (Example 2o of Chapter 7); and the best prize problem (Example 4l in Chapter 7). In addition, the problems have been thoroughly revised, with over 25% being new to this edition. The chapter exercises have been reorganized to present the more mechanical problems before the theoretical exercises. Prose summaries now conclude each chapter, and a new study tool is included in the book. The new Self-Test Problems and Exercises section is designed to help students test their comprehension and study for exams. After working through the problems and theoretical exercises in each chapter, students are encouraged to do the self-test problems and to check their work against the complete solutions that appear in Appendix B.

Another new feature of the Fifth Edition is the addition of the Probability Models Disk. This easy-to-use PC disk is packaged in the back of each copy of the book. Referenced in the text, it allows students to quickly and easily perform calculations and simulations in six key areas.

Three of the modules derive probabilities for, respectively, binomial, Poisson, and normal random variables.

Another illustrates the central limit theorem. It considers random variables that take on one of the values 0, 1, 2, 3, 4 and allows the user to enter the probabilities for these values along with a number n. The module then plots the probability mass function of the sum of n independent random variables of this type. By increasing n one can "see" the mass function converge to the shape of a normal density function.
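The computation behind that module can be sketched in a few lines. This is not the disk's code; it is a minimal reconstruction of the idea, with an illustrative pmf of my own choosing: convolving the pmf with itself n − 1 times yields the exact pmf of the sum of n independent copies, whose shape approaches a normal curve as n grows.

```python
def convolve(p, q):
    """Pmf of X + Y for independent X ~ p and Y ~ q on {0, 1, 2, ...}."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def sum_pmf(pmf, n):
    """Exact pmf of the sum of n i.i.d. variables with the given pmf."""
    result = [1.0]  # pmf of the constant 0
    for _ in range(n):
        result = convolve(result, pmf)
    return result

# Illustrative distribution on {0, 1, 2, 3, 4} with mean 2.
pmf = [0.1, 0.2, 0.4, 0.2, 0.1]
s = sum_pmf(pmf, 30)  # pmf of the sum of 30 independent copies
```

Plotting `s` for increasing n shows the bell shape the module is meant to demonstrate; the mean of the sum here is 30 × 2 = 60.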

The other two modules illustrate the strong law of large numbers. Again the user enters probabilities for the five possible values of the random variable along with an integer n. The program then uses random numbers to simulate n random variables having the prescribed distribution. The modules graph the number of times each outcome occurs along with the average of all outcomes. The modules differ in how they graph the results of the trials.
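The simulation those two modules run can be sketched as follows (again my own reconstruction, not the disk's code, with an illustrative distribution): draw n values from the user's distribution and record the running average, which by the strong law of large numbers settles near the true mean.

```python
import random

def running_average(probs, n, seed=1):
    """Simulate n draws from a distribution on {0, ..., len(probs)-1}
    and return the running average after each draw."""
    rng = random.Random(seed)
    values = range(len(probs))
    total = 0.0
    averages = []
    for i in range(1, n + 1):
        total += rng.choices(values, weights=probs)[0]
        averages.append(total / i)
    return averages

probs = [0.1, 0.2, 0.4, 0.2, 0.1]
true_mean = sum(k * p for k, p in enumerate(probs))  # = 2.0
avgs = running_average(probs, 50000)
```

Plotting `avgs` against the trial index reproduces the convergence picture the modules graph, with early volatility flattening toward the true mean.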