Markov's Inequality and Chebyshev's Inequality --- aka, the First Moment Method and the Second Moment Method. How to bound the probability of a random variable being far from its mean when you only know its mean, or when you know both its mean and its variance. Lecture 5a of "CS Theory Toolkit": a semester-long graduate course on math and CS fundamentals for research in theoretical computer science, taught at Carnegie Mellon University.
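As a quick illustration of the two bounds from the lecture, here is a minimal sketch (not from the course materials) comparing them on an assumed example: X ~ Binomial(100, 0.5), so E[X] = 50 and Var(X) = 25. Markov gives P(X >= a) <= E[X]/a for nonnegative X; Chebyshev gives P(|X - mu| >= t) <= Var(X)/t^2, applied with t = a - mu.

```python
import random

# Markov (first moment):     P(X >= a) <= E[X] / a,   for X >= 0
# Chebyshev (second moment): P(|X - mu| >= t) <= Var(X) / t^2

n, p = 100, 0.5                 # X ~ Binomial(100, 0.5) (an assumed toy example)
mu, var = n * p, n * p * (1 - p)  # mean 50, variance 25
a = 75                          # tail threshold

markov = mu / a                       # = 2/3: knows only the mean
chebyshev = var / (a - mu) ** 2       # = 0.04: X >= a implies |X - mu| >= a - mu

# Empirical tail probability from simulation, for comparison
random.seed(0)
samples = [sum(random.random() < p for _ in range(n)) for _ in range(20000)]
empirical = sum(x >= a for x in samples) / len(samples)

print(markov, chebyshev, empirical)
```

The variance buys a lot: Chebyshev's 0.04 is far tighter than Markov's 2/3, though both are still very loose compared with the true (exponentially small) tail, which motivates the Chernoff bounds appearing later in the course.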
Resources for this lecture:
Chapter 2, "Basic tail and concentration bounds" of Wainwright's book "High dimensional statistics"
Dubhashi--Panconesi book "Concentration of measure for the analysis of randomized algorithms"
Mitzenmacher--Upfal book "Probability and computing"
McDiarmid's article "On the method of bounded differences"
Joag-Dev and Proschan's article "Negative association of random variables and applications"
Taught by Ryan O'Donnell ([ Link ])
Course homepage on CMU's Diderot system: [ Link ]
Filmed by Cole H. for Panopto ([ Link ])
Thumbnail photo by Rebecca Kiger ([ Link ])