CS 206 - Introduction to Discrete Structures II
Spring 2016
Course Overview
This course is an introduction to probability theory and
combinatorics, including their basic mathematical foundations as well as
several applications of each to computer science, and to life. Your work will
involve solving problems through rigorous mathematical reasoning, often by constructing proofs, and the course is designed to
teach you how to do this.
By studying probability theory you will learn to think about
randomness in a rigorous and sensible way. This is harder (and more
fun) than it sounds at first: one's intuition is easily misled when it comes
to probability, and we will discover surprisingly complex behavior in the world
around us by examining simple processes carefully.
Combinatorics is about counting the number of objects
fitting a given description. It is intimately related to probability theory:
Knowing the number of possibilities for a random outcome often goes a long way
to understanding that random process. But its relevance goes well beyond
probability and thus we will look at several other applications as well.
Course Topics
The course covers the following list of topics, which are broken into three
parts. Lecture summaries will be posted at the bottom of this page. You can
also find previous semesters' class web pages linked from my homepage.
- Part I: Sets, Counting and Probability
- Simple probability: Sample spaces and events.
- Review of prerequisites: Set basics, countability, proof by induction.
- Basic counting techniques: Addition and multiplication rules, binomial and multinomial coefficients.
- Basic probability: Random processes where all outcomes are equally likely.
- Part II: Probability Theory
- Formal probability theory: probability measures.
- Conditional probability, Bayes's Theorem, independence.
- Random variables, with several examples.
- Expectation and variance.
- Part III: Advanced Topics
- Advanced counting: Recurrences and Generating Functions
- Randomized data structures and algorithms
- Random walks and Markov chains
- Cryptography
General Information
- Instructor: David Cash (david.cash@cs.rutgers.edu, office Hill 411)
- Course website: http://www.cs.rutgers.edu/~dc789/206-spring-16
- Recommended (not required) textbooks:
- (primary) K. Rosen, Discrete Mathematics and Its Applications, any recent edition.
- J. K. Blitzstein and J. Hwang, Introduction to Probability, any edition.
- S. Ross, A First Course in Probability, any edition.
- Lecture meetings:
Tuesday and Friday 8:40am -- 10:00am in SERC room 117 (Busch Campus).
- Recitations:
- Section 01: Tuesday 12:15pm -- 1:10pm in Hill Center room 009
- Section 02: Friday 10:35am -- 11:30am in ARC room 107
- Section H1: Tuesday 12:15pm -- 1:10pm in ARC room 206
- Midterms: February 19 and April 1 (in lecture)
- Final exam: (time/place TBD)
- Instructor office hours:
- 3:00pm -- 4:00pm Mondays in Hill 411
- By appointment (email or talk to me at class)
- Teaching assistants (with contact and office hours):
- Abdul Basit (abasit@cs.rutgers.edu, office hours Weds 1:30pm-3:00pm in Hill 414)
- Hai Pham (hxp1@cs.rutgers.edu, office hours by appointment)
Expectations, Assignments and Grading
You are expected to attend every lecture.
Required readings will be assigned from lecture notes occasionally
posted on Sakai, and from the texts.
Semester grades will be a weighted average of the following:
- Homeworks (20%)
- Quizzes (15%)
- 2 Midterms (35% total)
- Final Exam (30% total)
Homeworks
Homeworks will be assigned every week. You are allowed to collaborate
on homeworks, but you must write up your own submission. Copying
someone else's work is considered a violation of the Honor Code and will be
addressed accordingly. Homework is worth a significant part of your grade, but
not as much as exams. I recommend viewing homeworks as a chance to practice thinking
about problems before exam time -- if you find solutions via collaboration, you
will likely not get much out of them, only making things more difficult
later.
Important notes regarding homework:
- You may submit your homework in person or via Sakai. If you use Sakai,
you must type your work up or alternatively scan handwritten pages.
Illegible photographs will not be accepted. (There are iPhone/Android apps
that can improve the readability of submissions by imitating scanners.)
- Late homeworks will not be accepted without an approved excuse (like illness). You may be asked for documentation.
Quizzes
There will be several in-class quizzes (the exact number is TBD). These will be
scheduled in advance.
Midterms and Final
There will be two midterms and one final exam. The midterms
will be given in lecture. Practice material to prepare for the
midterms and final will be provided. You may bring a single-sided, handwritten
(not copied) cheat sheet to the exams.
A formula sheet will be provided on
all exams. Apart from these, no books, notes, calculators, phones, laptops, or any other
resources will be allowed during exams.
The midterms will be on February 19 and April 1.
The final exam will be scheduled by the university.
Homework Assignments
Homework assignments will be posted on Sakai.
Lecture Schedule
A brief summary of each lecture and the associated reading will be posted here.
For readings I refer to each of the texts (Ross, Rosen, and BH).
- Tue Jan 19 (Lecture 1): Introduction
- Topics: Class organization. Why we study probability.
Set theory.
- Reading: None
- Fri Jan 22 (Lecture 2): Basic Probability
- Topics: Probability: Sample spaces, events, operations on events.
"Naive" probability where all outcomes are equally likely. Begin counting.
- Reading:
- Ross: 2.2, then 1.2, 1.3
- Rosen: 6.1 (up to "More Complex Counting Problems"), then also read the end of 6.1 about "Tree Diagrams"; 7.1 (this section assumes some material that we'll cover soon; you can skip Examples 4, 5, 6)
- BH: 1.1, 1.2, 1.3, 1.4 up to 1.4.2
- Tue Jan 26 (Lecture 3): Counting
- Topics: Multiplication principle, sampling with/without replacement. Permutations (i.e., arranging n objects in order). The birthday problem.
Overcounting and correcting for overcounting.
- Reading:
- Ross: 1.2, 1.3 up to Ex. 3d
- Rosen: Same as Lecture 2.
- BH: 1.4 up to 1.4.2
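For concreteness, the birthday problem from this lecture can be computed exactly in a few lines of Python. This is an illustrative sketch of the standard complement argument; the function name and the 365-equally-likely-days simplification are my own:

```python
from math import prod

def birthday_collision_prob(k, days=365):
    """Exact probability that among k people (uniform, independent
    birthdays over `days` days) at least two share a birthday,
    via the complement: P(all distinct) = prod_i (days - i)/days."""
    if k > days:
        return 1.0  # pigeonhole: a collision is forced
    p_distinct = prod((days - i) / days for i in range(k))
    return 1 - p_distinct

# With only 23 people the collision probability already exceeds 1/2.
print(round(birthday_collision_prob(23), 4))  # 0.5073
```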
- Fri Jan 29 (Lecture 4): Binomial Coefficients and Applications
- Topics: Definition of binomial coefficients and the formula. Arrangements of words with repeated
letters. Poker hand probabilities. Overcounting errors.
- Reading:
- Ross: 1.4 (we will cover the missing examples next time)
- Rosen: Binomial coefficients are in 6.3
- BH: 1.4 (we will cover the missing examples next time)
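The poker-hand counting from this lecture can be checked mechanically. As an illustration (the full house is my choice of example), using the standard library's `math.comb`:

```python
from math import comb

# Probability of a full house (three cards of one rank, two of another)
# in a 5-card hand: choose the triple's rank and its 3 suits, then the
# pair's rank and its 2 suits, divided by all C(52, 5) hands.
full_house = comb(13, 1) * comb(4, 3) * comb(12, 1) * comb(4, 2)
total_hands = comb(52, 5)
print(full_house, total_hands)             # 3744 2598960
print(round(full_house / total_hands, 6))  # 0.001441
```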
- Tue Feb 2 (Lecture 5): More on counting
- Topics: Counting the complement (Newton-Pepys question about the
chance of getting some number of 6s on a group of dice). Counting arrangements when some objects are indistinguishable (correcting for overcounting, and directly). Bose-Einstein.
- Reading:
- Ross: 1.3, 1.6
- Rosen: 6.5 (this section contains a few ideas that
we did not cover)
- BH: 1.4 (Newton-Pepys is Ex. 1.4.19)
- Fri Feb 5 (Lecture 6): Correcting for overcounting; Inclusion-Exclusion
- Topics: Dividing kids into teams, where the order of teams
does not matter. More general idea of correcting for overcounting.
Begin inclusion-exclusion (for two sets, three sets, and the general
formula for many sets)
- Reading:
- Ross: Inclusion-exclusion is not covered
in this language by Ross; The probability version of it
is in 2.4, Prop 4.4.
- Rosen: 8.5 (this corresponds closely to the lecture)
- BH: Inclusion-exclusion is not covered
in this language by BH; The probability version of it
is in 1.6, Theorem 1.6.3. Correcting for overcounting is done in
Ex 1.4.11 in section 1.4.
- Tue Feb 9 (Lecture 7): Inclusion-Exclusion
- Topics: General version of inclusion-exclusion with n sets. Examples
with cards (Problem 48 in the text is similar to the example in class, and has a solution online). de Montmort's card problem.
- Reading: See inclusion-exclusion reading above.
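The inclusion-exclusion formula for de Montmort's matching problem can be evaluated directly; a short sketch for illustration (the function name is my own):

```python
from math import factorial, exp

def match_prob(n):
    """P(at least one match) in de Montmort's problem: n cards are laid
    down in random order, and a 'match' means card i lands in position i.
    Inclusion-exclusion gives sum_{k=1}^{n} (-1)^{k+1} / k!."""
    return sum((-1) ** (k + 1) / factorial(k) for k in range(1, n + 1))

# The sum converges very quickly to 1 - 1/e ≈ 0.6321.
print(round(match_prob(10), 4))  # 0.6321
```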
- Fri Feb 12 (Lecture 8): Story Proofs; Bloom Filters
- Topics: Using story proofs to establish combinatorial identities.
Bloom filters.
- Reading:
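Since there is no assigned reading on Bloom filters, here is a minimal Python sketch of the data structure. The parameters m and k and the salted-SHA-256 hashing are my own illustrative choices, not from lecture:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter sketch: m bits and k hash functions
    (derived by salting SHA-256). Lookups may return false positives
    but never false negatives."""
    def __init__(self, m=1024, k=3):
        self.m, self.k, self.bits = m, k, [False] * m

    def _indices(self, item):
        # k pseudo-independent hash positions for the item
        for salt in range(self.k):
            h = hashlib.sha256(f"{salt}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for i in self._indices(item):
            self.bits[i] = True

    def __contains__(self, item):
        return all(self.bits[i] for i in self._indices(item))

bf = BloomFilter()
bf.add("alice")
print("alice" in bf)  # True: inserted items are always reported present
```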
- Tue Feb 16 (Lecture 9): Midterm I Review
- Fri Feb 19 (Lecture 10): Midterm I
- Tue Feb 23 (Lecture 11): General Probability Measures; Conditional Probability
- Topics: Probability measures when all outcomes might not
be equally likely. Proofs of basic facts in this case. Conditional
probability definition and basic examples.
- Reading:
- Ross: 3.2, through Example 2c
- Rosen: 7.2, through Conditional Probability
- BH: 1.6, 2.2
- Fri Feb 26 (Lecture 12): Chain Rule, Law of Total Probability, and Bayes's
Rule
- Topics: Definition of the chain rule and how to apply it to
some card problems. Basic version of LOTP, and how it feeds into
Bayes's Rule. Problems without explicit sample spaces (like the
coins in a hat).
- Reading:
- Ross: 3.2, 3.3 (note: We will not cover the "odds" version of Bayes's)
- Rosen: 7.3 (LOTP is baked into Bayes's rule in this text)
- BH: 2.3 (note: We will not cover the "odds" version of Bayes's)
- Tue Mar 1 (Lecture 13): Independence
- Viewing conditional probability as a probability measure. Definition of independent events. Equivalent definition via
conditional probability.
Examples: coin tossing, dice roll sums. Example:
tossing a coin chosen from a hat containing two coins, one fair and one biased. Definition of conditional independence: events may be independent but
not conditionally independent, and vice versa.
- Reading:
- Ross: 3.4, 3.5 (these sections contain some examples
that we won't cover, but may be helpful anyway)
- Rosen: 7.2 covers independence; conditional independence
is not covered
- BH: 2.4, 2.5
- Fri Mar 4 (Lecture 14): Conditional, Pairwise and Mutual Independence; The Monty Hall Problem
- Definitions of independent events, and pairwise independent vs
mutually independent events. Examples of events that are pairwise
independent but not mutually independent. The Monty Hall Problem,
and the solution via conditional probability. Begin random variables.
- Reading:
- Ross: See previous reading for independence. Monty
Hall does not appear to be covered here. Random variables begin in
4.1, 4.2
- Rosen: See previous reading for independence. Monty
Hall is at the very end of 7.1, and also Exercise 15 of 7.3. Mutual
independence is only covered in an exercise (Exercise 25 of the
Supplementary Exercises for Chapter 7). Random variables are in
7.2.
- BH: See previous reading for independence. Monty
Hall is in 2.7.1. Random variables are in 3.1.
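The Monty Hall analysis can be sanity-checked by simulation. A sketch for illustration (the function and its parameters are my own):

```python
import random

def monty_hall(switch, trials=100_000, seed=0):
    """Simulate the Monty Hall game. The host always opens a door that
    is neither the player's pick nor the car; `switch` says whether the
    player then switches. Returns the empirical win frequency."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # Host opens some door that hides a goat and wasn't picked.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

# Switching wins about 2/3 of the time; staying wins about 1/3.
print(monty_hall(switch=True), monty_hall(switch=False))
```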
- Tue Mar 8 (Lecture 15): Random Variables
- Definition of a random variable, with examples from dice,
cards, coins, and balls-into-bins. Indicator random variables.
How to define P(X=k) and statements like P(1 ≤ X ≤ 4). The distribution
of a random variable. The Bernoulli distribution, our first named distribution.
- Reading:
- Ross: 4.1
- Rosen: 7.2
- BH: 3.1, 3.2
- Fri Mar 11 (Lecture 16): Bernoulli, Binomial, Hypergeometric, and
Uniform Distributions
- Definitions of the Bernoulli, Binomial, Hypergeometric, and Uniform
distributions, including their stories, ranges, and distribution functions.
How to determine which of these distributions a random variable has.
- Reading:
- Ross: Ber and Bin are in 4.6. HGeom is in 4.8.3. Uniform
is covered in various examples in Chapter 4.
- Rosen: Ber and Bin are in 7.2. HGeom is not covered.
- BH: 3.3, 3.4, 3.5
- Tue Mar 22 (Lecture 17): Functions of a r.v.; Independence of r.v.s; Start Expectation
- Functions of a random variable, like 2X, |X|, etc., and how
their distributions relate to that of X. Example with a random walk.
Definition of independence for r.v.s, and how it
differs from independence of events. How to show r.v.s are/aren't
independent.
Expressing Binomial and Hypergeometric r.v.s as sums.
Definition and intuition of expectation as a weighted
average, with examples from one die, two dice, and a deck of cards.
- Reading:
- Ross: Expectation is in 4.3. Functions of an r.v. are not covered in exactly the language from class.
- Rosen: Independence is in 7.4, as is expectation. Functions of an r.v. are not covered in exactly the language from class.
- BH: 3.8 up to Def 3.8.9, and 4.1
- Fri Mar 25 (Lecture 18): Linearity of Expectation
- Alternative formula for E(X) that sums over the sample space. Small
examples with that formula. Linearity of expectation: E(X+Y) = E(X) + E(Y) for any random variables X,Y (even if they're dependent). Expected number
of Aces in a poker hand, done with and without replacement (and why it
doesn't change the expectation). Expectations of Bernoulli, Binomial, and
Hypergeometric distributions. Using linearity to solve harder
expectations, like the expected number of matches in de Montmort's
problem.
- Reading:
- Ross: The coverage of this topic is quite poor in
this text. It appears in Chapter 7, but I would not recommend trying
to learn it there.
- Rosen: Linearity of expectation is a subsection of 7.4.
- BH: 4.2. Start 4.4 (we worked example 4.4.4)
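The expected-aces example from this lecture can be verified both ways in a few lines (an illustrative sketch):

```python
from math import comb

# Expected number of aces in a 5-card hand, two ways.
# (1) Directly from the distribution (hypergeometric PMF):
direct = sum(k * comb(4, k) * comb(48, 5 - k) / comb(52, 5)
             for k in range(5))
# (2) By linearity of expectation: each of the 5 cards is an ace with
#     probability 4/52, so E = 5 * (4/52), even though the indicator
#     events are dependent.
linear = 5 * 4 / 52
print(direct, linear)  # both equal 5/13 ≈ 0.3846
```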
- Tue Mar 29 (Lecture 19): The Indicator Method for Expectation
- Refresher on indicator random variables. The expectation
of an indicator is just a probability. Representing a
random variable as a sum of indicators. Combining these observations
to find expectations in problems with couples sitting in chairs, card problems,
coin problems, and dice problems.
- Reading:
- Ross: The indicator method is not covered well here.
- Rosen: The indicator method is not covered well here.
See exercises 24 and 25 in 7.4.
- BH: 4.4
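A quick illustration of the indicator method on the matching problem, with a simulation check (the choice n = 20 and the function name are mine):

```python
import random

# Indicator method: in a random permutation of n cards, let I_j indicate
# a match at position j. E[I_j] = 1/n, so by linearity
# E[# matches] = n * (1/n) = 1, for every n.
def avg_matches(n=20, trials=200_000, seed=1):
    """Empirical average number of fixed points of a random permutation."""
    rng = random.Random(seed)
    total = 0
    perm = list(range(n))
    for _ in range(trials):
        rng.shuffle(perm)
        total += sum(perm[j] == j for j in range(n))
    return total / trials

print(round(avg_matches(), 1))  # ≈ 1.0, matching the indicator argument
```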
- Fri Apr 1 (Lecture 20): Midterm 2
- Tue Apr 5 (Lecture 21): The Geometric and Negative Binomial
Distributions
- The story, range, distribution, and expectation of geometric
and negative binomial random variables. Coupon collector problem
(and how it differs from negative binomial).
- Reading:
- Ross: 4.8.1, 4.8.2. Coupon collector is in 7.2 (Example 2i).
- Rosen: The geometric distribution is in 7.2; the negative binomial is not covered.
- BH: 4.3
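The coupon collector expectation from this lecture reduces to a sum of geometric means; a small sketch (the function name is my own):

```python
from math import isclose

def coupon_collector_expectation(n):
    """Expected draws to collect all n coupon types: after i distinct
    types have been seen, the wait for a new type is
    Geometric((n - i)/n) with mean n/(n - i); summing over i gives
    n * H_n, where H_n is the n-th harmonic number."""
    return sum(n / (n - i) for i in range(n))

# For n = 10: E = 10 * (1 + 1/2 + ... + 1/10) ≈ 29.29 draws.
print(round(coupon_collector_expectation(10), 2))  # 29.29
```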
- Fri Apr 8 (Lecture 22): Variance
- Three facts about expectation: 1) It might be infinite,
2) Law of the unconscious statistician (LOTUS) for calculating
E(g(X)), and 3) E(XY) = E(X)E(Y) when X and Y are independent.
The definition and intuition behind variance, and some alternative
attempts at a good definition. The easier formula for variance and
some example calculations using LOTUS.
- Reading:
- Ross: The infinite expectation example is Exercise 30
at the end of Chapter 4. Variance is in 4.5.
- Rosen: The infinite expectation example is Exercise 21
at the end of Chapter 7. Variance is in 7.4.
- BH: The infinite expectation example is
at the end of 4.3. Variance is in 4.5 and 4.6.
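The two variance formulas from this lecture agree on a small example; a sketch using a fair die (my own choice of example):

```python
# Variance of a single fair die roll X, computed two ways.
outcomes = range(1, 7)
ex = sum(x / 6 for x in outcomes)        # E[X] = 3.5
ex2 = sum(x * x / 6 for x in outcomes)   # E[X^2] via LOTUS
var_formula = ex2 - ex ** 2              # "easier" formula E[X^2] - E[X]^2
var_def = sum((x - ex) ** 2 / 6 for x in outcomes)  # definition E[(X - E[X])^2]
print(var_formula, var_def)  # both 35/12 ≈ 2.9167
```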
- Tue Apr 12 (Lecture 23): Properties of Variance
- Facts about variance. Covariance and
correlation (positive correlation, negative correlation, and uncorrelated).
Var(X + Y) = Var(X) + Var(Y) when X and Y are independent.
- Reading:
- Ross: 4.5.
- Rosen: 7.4.
- BH: 4.6 (covariance is covered in a later chapter but
it is mixed with other topics that we don't need)
- Fri Apr 15 (Lecture 24): Inequalities
- Markov's, Chebyshev's, and Chernoff's bounds. Intuition for
each, and the proofs of Markov's and Chebyshev's. Applications to
simple problems.
- Reading: Note on Sakai.
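As a small illustration (my own example, not from the note), Chebyshev's bound can be compared against the exact tail probability of a Binomial(100, 1/2) random variable:

```python
from math import comb

# X ~ Binomial(100, 1/2) has E[X] = 50 and Var(X) = 25, so Chebyshev
# gives P(|X - 50| >= 10) <= 25 / 10^2 = 0.25. Compare with the exact
# tail computed from the PMF.
n, p = 100, 0.5
pmf = [comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(n + 1)]
tail = sum(pmf[k] for k in range(n + 1) if abs(k - 50) >= 10)
print(tail, "<= 0.25")  # the exact tail is far below the bound
```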
- Tue Apr 19 (Lecture 25): The Probabilistic Method
- Proving that certain objects exist using probability.
Examples: The square with red corners, and forming committees
with intersections. A lower bound for Ramsey numbers.
- Reading: Note on Sakai.
- Ross: N/A
- Rosen: N/A
- BH: 4.9 up to 4.9.1 (the two examples I covered are
in this text)
- Fri Apr 22 (Lecture 26): Markov Chains I
- Definition of a Markov chain (i.e. the Markov property).
Notation and examples of Markov chains. The state diagram and
transition matrix of a Markov chain.
- Reading: 6.1 and 6.2 of the PDF on Sakai.
- Tue Apr 26 (Lecture 27): Markov Chains II
- Multi-step transition probabilities and the multi-step transition matrix. Computing the multi-step transition matrix by matrix multiplication. Initial distributions as vectors, and computing unconditional probabilities via vector-matrix multiplication. Regular Markov chains and the steady state theorem.
- Reading: 6.3 and 6.4 of the PDF on Sakai.
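The multi-step computation from this lecture can be sketched in plain Python. The two-state chain below is my own illustrative example, not one from the notes:

```python
# Two-state Markov chain (say, state 0 = sunny, state 1 = rainy).
# Row i of P holds the transition probabilities out of state i; the
# k-step transition matrix is P^k.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def mat_mul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, k):
    """k-th power of a 2x2 matrix by repeated multiplication."""
    out = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(k):
        out = mat_mul(out, P)
    return out

# This chain is regular, so every row of P^k approaches the steady
# state, here (5/6, 1/6).
P50 = mat_pow(P, 50)
print([round(x, 4) for x in P50[0]])  # [0.8333, 0.1667]
```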