CS 206 - Introduction to Discrete Structures II
This course is an introduction to probability theory and
combinatorics, including their basic mathematical foundations as well as
several applications of each to computer science, and to life. Your work will
involve solving problems through rigorous mathematical reasoning, often by
constructing proofs, and the course is designed to teach you how to do this.
By studying probability theory you will learn to think about
randomness in a rigorous and sensible way. This is harder (and more
fun) than it sounds at first: one's intuition is easily misled when it comes
to probability, and we will discover surprisingly complex behavior in the world
around us by examining simple processes carefully.
Combinatorics is about counting the number of objects
fitting a given description. It is intimately related to probability theory:
Knowing the number of possibilities for a random outcome often goes a long way
to understanding that random process. But its relevance goes well beyond
probability and thus we will look at several other applications as well.
The course covers the following list of topics, which are broken into three
parts. Lecture summaries will be posted at the bottom of this page. You can
also find previous semesters' class web pages linked from my homepage.
- Part I: Sets, Basic Counting and Probability
- Simple probability: Sample spaces and events.
- Review of prerequisites: Set basics, countability, proof by induction.
- Basic counting techniques: Addition and multiplication rules, binomial and multinomial coefficients.
- Basic probability: Random processes where all outcomes are equally likely.
- Part II: Probability Theory
- Formal probability theory: probability spaces and their axioms
- Conditional probability, Bayes's Theorem, Independence
- Random Variables, with several examples
- Expectation and Variance
- Part III: Advanced Topics
- Advanced counting: Recurrences and Generating Functions
- Randomized data structures and algorithms
- Random walks and Markov chains
- Instructor: David Cash (email@example.com, office Hill 411)
- Course website: http://www.cs.rutgers.edu/~dc789/206-spring-15
- Required textbook: Joseph K. Blitzstein and Jessica Hwang, Introduction to Probability
- Lecture meetings:
- Note: You must attend your assigned lecture to get credit for in-class quizzes and to submit homework.
- Sections 01 and 02: Tuesday and Friday 8:40am -- 10:00am in SERC room 117 (Busch Campus).
- Sections 05 and 06: Tuesday and Friday 12:00pm -- 1:20pm in Lucy Stone Hall room A143 (Livingston Campus).
- Recitation meetings:
- Section 01: (Fatma) Tuesday 12:15pm -- 1:10pm in Hill Center room 009
- Section 02: (Fatma) Friday 10:35am -- 11:30am in ARC room 107
- Section 05: (Hai) Tuesday 1:55pm -- 2:50pm in Business Rockafeller Road room 4085
- Section 06: (Hai) Friday 1:55pm -- 2:50pm in Tillett Hall room 253
- Midterms: (place TBD)
- February 23, 9:40pm-11:00pm
- April 13, 9:40pm-11:00pm
- Final exam: May 11, 4:00pm-7:00pm (place TBD)
- Instructor office hours (all sections welcome at either time):
- 3:00pm -- 4:00pm Mondays in Hill 411
- 10:00am -- 11:00am Thursdays in Hill 411
- By appointment (email or talk to me at class)
- Teaching assistants (with office hours):
- Fatma Durak (firstname.lastname@example.org): 3:00pm-4:00pm Tuesdays in Hill 410
- Amr Bakry (email@example.com)
- Hai Pham (firstname.lastname@example.org): 5:00pm-6:00pm Mondays in Hill 405
Expectations, Assignments and Grading
You are expected to attend every lecture.
Required readings will be assigned from the text and from lecture notes
occasionally posted on Sakai.
Semester grades will be a weighted average of the following:
- Homeworks (20%)
- Quizzes (15%)
- 2 Midterms (35% total)
- Final Exam (30% total)
Homeworks will be assigned every week. You are allowed to collaborate
on homeworks, but you must write up your own submission. Copying
someone else's work is considered a violation of the Honor Code and will be
addressed accordingly. Homework is worth a significant part of your grade, but
not as much as exams. I recommend viewing the homework as a chance to practice
thinking about problems before exam time -- if you only find solutions via
collaboration, you will not get much out of them, and exams will only be more
difficult.
Important notes regarding homework:
- You may submit your homework in person or via Sakai. If you use Sakai,
you must type your work up or alternatively scan handwritten pages.
Photographs will not be accepted.
- If you submit your homework in person, you must submit it in your
assigned lecture section.
- Late homeworks will not be accepted without an approved excuse (like illness). You may be asked for documentation.
There will be several in-class quizzes (the exact number is TBD). These will
be scheduled in advance. You must attend your assigned lecture section to take
the quizzes.
Midterms and Final
There will be two midterms and one final exam. All sections
will take the midterms and the final together. Practice material to prepare for the
midterms and final will be provided. A formula sheet will be provided on
all exams. No books, notes, calculators, phones, laptops, or any other
resource will be allowed during exams.
The midterms will be on February 23 and April 13, 9:40pm-11:00pm.
The final exam is scheduled by the university for May 11, 4:00pm-7:00pm.
Homework assignments will be posted on Sakai.
A brief summary of each lecture and the associated reading will be posted here.
- Tue Jan 20 (Lecture 1): Introduction
- Topics: Class organization. Why we study probability.
- Reading: 1.1, and A.1
- Fri Jan 23 (Lecture 2): Basic Probability
- Topics: Probability: Sample spaces, events, operations on events.
"Naive" probability where all outcomes are equally likely. Begin counting.
- Reading: 1.2-1.3, begin 1.4
- Tue Jan 27 (Lecture 3): Canceled due to snow
- Fri Jan 30 (Lecture 4): Counting
- Topics: Multiplication principle, sampling with/without replacement. Permutations. The birthday problem.
Overcounting and correcting for overcounting.
- Reading: 1.4 up to Definition 1.4.12
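The birthday problem from this lecture is easy to check numerically. A minimal sketch (the function name and the 365-day, no-leap-year assumptions are mine, not from the course materials):

```python
def p_shared_birthday(n: int) -> float:
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely birthdays and no leap years."""
    # Complement: probability that all n birthdays are distinct.
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (365 - i) / 365
    return 1 - p_all_distinct

# With 23 people the probability already exceeds 1/2.
print(round(p_shared_birthday(23), 4))  # 0.5073
```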
- Tue Feb 3 (Lecture 5): Binomial Coefficients and Applications
- Topics: Definition of binomial coefficients and the formula. Arrangements of words with repeated
letters. Newton-Pepys (example 1.4.19). Poker hand probabilities (example 1.4.18, and others).
- Reading: 1.4 up to Definition 1.4.20
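Binomial coefficients make the poker-hand computations from this lecture mechanical. As one illustration (the full-house count below is the standard one; which specific hands Example 1.4.18 covers is not restated on this page):

```python
from math import comb

total_hands = comb(52, 5)  # all 5-card hands from a 52-card deck

# Full house: choose the rank of the triple, then 3 of its 4 suits,
# then the rank of the pair, then 2 of its 4 suits.
full_houses = 13 * comb(4, 3) * 12 * comb(4, 2)

print(full_houses, total_hands)             # 3744 2598960
print(round(full_houses / total_hands, 6))  # 0.001441
```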
- Fri Feb 6 (Lecture 6): Finishing counting
- Topics: Sampling with replacement but without order (Bose-Einstein). The binomial theorem. Story proofs.
- Reading: Finish 1.4, 1.5.
- Tue Feb 10 (Lecture 7): Formal Probability Spaces
- (Started with a counting example from the homework, about the
tree method and how it can overcount.) Definition of a probability space
via two rules. Proofs using the rules of a probability space. Started
inclusion-exclusion for unions of two or three sets.
- Reading: 1.6 up to Theorem 1.6.3
- Fri Feb 13 (Lecture 8): Inclusion-Exclusion
- General version of inclusion-exclusion with n sets. Examples
with cards (Problem 48 in the text is similar to the example in class, and has a solution online). de Montmort's card problem. Started conditional probability.
- Reading: Finish 1.6, start 2.2
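Inclusion-exclusion gives the classic answer to de Montmort's matching problem: with n cards, the probability of no match is the alternating sum of (-1)^k/k! for k = 0..n, which converges to 1/e. A quick numerical check (my own sketch, not from the course notes):

```python
from math import e, factorial

def p_no_match(n: int) -> float:
    """P(no card lands in its own position) among n shuffled cards,
    via inclusion-exclusion: sum_{k=0}^{n} (-1)^k / k!."""
    return sum((-1) ** k / factorial(k) for k in range(n + 1))

# Converges to 1/e extremely fast.
print(round(p_no_match(10), 6))  # 0.367879
```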
- Tue Feb 17 (Lecture 9): Conditional Probability and Bayes's Rule
- Definition of conditional probability. Examples: getting at
least one Heads vs getting the first coin as Heads (Example 2.2.5 deals
with the same concept). Bayes's rule and law of total probability (LOTP).
Examples with flipping a coin that may be fair or biased (Example 2.3.7)
and testing for diseases (Example 2.3.9). Intuition for why the testing
problem gave such a low probability.
- Reading: 2.2 and 2.3 (skip "odds" part of 2.3, i.e. Definition 2.3.4 and Theorem 2.3.5)
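The disease-testing computation from this lecture is Bayes's rule plus LOTP. A sketch with illustrative numbers (the 1% prevalence and 95% accuracy figures are my stand-ins; see Example 2.3.9 for the book's exact setup):

```python
def posterior_positive(prior, sensitivity, specificity):
    """P(disease | positive test) via Bayes's rule and LOTP:
    P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|not D)P(not D)]."""
    p_pos = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_pos

# Rare disease + good-but-imperfect test => surprisingly low posterior.
print(round(posterior_positive(0.01, 0.95, 0.95), 3))  # 0.161
```

The low answer is the intuition discussed in class: most positives come from the large healthy population, not the small sick one.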
- Fri Feb 20 (Lecture 10): Independence
- Definition of independence for events. Equivalent definition with
conditional probability. The difference between disjoint and independent.
Examples: Coin tossing, dice roll sums. Example:
tossing a coin chosen from a hat containing two coins, one fair and one biased. Definition of conditional independence: Events may be independent but
not conditionally independent, and vice versa. The Monty Hall problem.
- Reading: 2.5, 2.7.1
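The Monty Hall problem from this lecture can be settled by brute-force enumeration rather than simulation. A minimal sketch, assuming the standard rules (the host always opens a non-chosen door with no car behind it):

```python
from fractions import Fraction

# Enumerate all 9 equally likely (car position, initial pick) pairs.
switch_wins = stay_wins = 0
for car in range(3):
    for pick in range(3):
        if pick == car:
            stay_wins += 1    # staying wins only when the first pick was right
        else:
            switch_wins += 1  # otherwise the host's reveal forces the switch onto the car

print(Fraction(switch_wins, 9), Fraction(stay_wins, 9))  # 2/3 1/3
```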
- Tue Feb 24 (Lecture 11): Midterm 1
- Fri Feb 27 (Lecture 12): Random Variables
- Definition of a random variable. Examples with dice and balls
into bins. Building random variables from others (like Z=X-Y). Probability mass functions and support of a random variable. Bernoulli random variables.
- Reading: 3.1, 3.2, start 3.3 (through Story 3.3.3)
- Tue Mar 3 (Lecture 13): Named Distributions, PMFs, Applications
- Binomial and Bernoulli distributions. Derivation of each from a story, and applications. Graphing PMFs and solving word problems with repeated trials.
- Reading: 3.3
- Fri Mar 6 (Lecture 14): Hypergeometric and Discrete Uniform Distributions;
Functions of a Random Variable
- Hypergeometric and discrete uniform distributions. Story, support,
and PMF of both. Functions of a random variable (like g(X)), and how they
change the PMF.
- Reading: 3.4, 3.5, 3.7 (skip CDFs and 3.6)
- Tue Mar 10 (Lecture 15): Independence of Random Variables; Start Expectation
- (Started with a review of what a PMF really means, drawing a picture of a r.v. as a function.) Definition of independence for r.v.s, and how it differs from independence of events. How to show r.v.s are/aren't independent. Definition and intuition of expectation as a weighted average.
- Reading: 3.8 up to Def 3.8.9, 4.1
- Fri Mar 13 (Lecture 16): Expectation continued; Geometric Distribution
- Expectation of Bernoulli, expectation of Binomial the hard way. Expectation of a sum (like X+Y) and why it is hard to compute. Linearity of expectation, and the easy way. Expectation of Binomial and Hypergeometric. The Geometric distribution,
its support, PMF, and expectation.
- Reading: 4.2, 4.3 up to Ex 4.3.6
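The Geometric expectation from this lecture is easy to sanity-check. Using the book's convention that Geom(p) counts failures before the first success, E[X] = (1-p)/p; a truncated sum of k * P(X=k) should agree (the truncation point below is my choice, large enough that the tail is negligible):

```python
p = 0.3
q = 1 - p

# Geom(p) PMF: P(X = k) = q^k * p for k = 0, 1, 2, ...
# E[X] = sum over k of k * P(X = k); truncate once terms are negligible.
approx_mean = sum(k * q**k * p for k in range(500))

print(round(approx_mean, 6), q / p)  # both approximately 2.333333
```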
- Tue Mar 24 (Lecture 17): Problems Involving Expectation
- The dice game where you can decide to stop after one roll
or continue. The difference between sampling a random family and
sampling a random child to estimate average household size. The
First Success distribution and Negative Binomial distribution.
- Reading: 4.3 up to Ex 4.3.12
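The one-or-two-roll dice game from this lecture has a clean expectation computation. A sketch of one natural strategy (the threshold rule is my illustration; the class discussion may have framed it differently): keep the first roll if it beats the second roll's expected value of 3.5, i.e. if it is at least 4.

```python
from fractions import Fraction

faces = range(1, 7)
e_single_roll = sum(Fraction(f, 6) for f in faces)  # 7/2

# Strategy: keep the first roll if it is >= 4, otherwise reroll and keep roll two.
e_game = sum(
    Fraction(1, 6) * (Fraction(f) if f >= 4 else e_single_roll)
    for f in faces
)

print(e_game)  # 17/4
```

So the option to reroll raises the expected value from 3.5 to 4.25.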
- Fri Mar 27 (Lecture 18): Linearity of Expectation
- Expectation of First Success and Negative Binomial distributions.
Coupon collector problem (Ex 4.3.11). The indicator method. Expectations using the indicator method: the number of matches in de Montmort's problem,
the number of matching birthdays in a group of people, the number
of times HTH appears when flipping a coin several times.
- Reading: 4.4 (we do not need the details of Theorem 4.4.1 or
Example 4.4.3, and you can also skip Example 4.4.6 and Theorem 4.4.8).
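Linearity of expectation makes the coupon collector computation from this lecture short: waiting for the k-th new coupon type is a First Success wait with success probability (n-k+1)/n, so the expected total is n(1/1 + 1/2 + ... + 1/n). A quick check (my sketch):

```python
def expected_draws(n: int) -> float:
    """Expected number of draws to collect all n coupon types.
    By linearity of expectation, this is the sum of the expected
    waits for each new type: n * (1 + 1/2 + ... + 1/n)."""
    return sum(n / k for k in range(1, n + 1))

print(round(expected_draws(10), 2))  # 29.29
```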
- Tue Mar 31 (Lecture 19): Expectation of g(X); Variance
- The St. Petersburg paradox.
LOTUS and computing E[g(X)] the easy way. Intuition and definition
of variance.
- Reading: St. Petersburg is Example 4.3.13 (page 149). 4.5 (LOTUS), start 4.6.
- Fri Apr 3 (Lecture 20): Negative Hypergeometric, Variance
- We started with a hint about the Negative Hypergeometric (Example 4.4.7)
and about choosing a random child vs. picking a random family and then
a random child from the family.
- Reading: 4.6
- Tue Apr 7 (Lecture 21): Variance of named distributions
- Variance of each of the named distributions, using V(X+Y) = V(X) + V(Y)
when X and Y are independent. Began inequalities, starting with Markov's.
- Reading: 4.6
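The V(X+Y) = V(X) + V(Y) rule for independent X and Y gives Var(Binomial(n,p)) = np(1-p), since a Binomial is a sum of n independent Bernoulli(p)'s, each with variance p(1-p). A direct check against the definition of variance over the PMF (my sketch; n and p are arbitrary):

```python
from math import comb

n, p = 10, 0.3

# Variance straight from the definition: E[X^2] - (E[X])^2 over the Binomial PMF.
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
mean = sum(k * pk for k, pk in enumerate(pmf))
var = sum(k * k * pk for k, pk in enumerate(pmf)) - mean**2

print(round(var, 6), n * p * (1 - p))  # both 2.1
```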
- Fri Apr 10 (Lecture 22): Midterm II Review
- Tue Apr 14 (Lecture 23): Midterm II
- Fri Apr 17 (Lecture 24): Inequalities
- Intuition and statement for Markov's, Chebyshev's,
and Chernoff's. Applications to a coin problem and bounding the length of
a chain in a hash table.
- Reading: Notes posted to Sakai. The text has these bounds in a very
different form that we will not use.
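The inequalities from this lecture are easy to compare on a concrete case (the Binomial example here is mine). For X ~ Binomial(100, 1/2), Markov gives P(X >= 75) <= E[X]/75 = 2/3, Chebyshev gives P(|X - 50| >= 25) <= Var(X)/25^2 = 0.04, and the exact tail is far smaller than either bound:

```python
from math import comb

n = 100
mean, var = n / 2, n / 4  # Binomial(100, 1/2): E[X] = 50, Var(X) = 25

markov_bound = mean / 75       # Markov: P(X >= 75) <= E[X]/75
chebyshev_bound = var / 25**2  # Chebyshev: P(|X - 50| >= 25) <= Var/25^2
exact_tail = sum(comb(n, k) for k in range(75, n + 1)) / 2**n

print(markov_bound, chebyshev_bound)
print(exact_tail < chebyshev_bound < markov_bound)  # True
```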
- Tue Apr 21 (Lecture 25): Poisson Approximation and The Probabilistic Method
- The idea behind Poisson approximation, and how to estimate some
probabilities using it. Examples with birthday matches and almost matches.
The probabilistic method for proving that certain objects exist. (The
hard example was 4.9.1 in the text.)
- Reading: 4.7 (Poisson) and 4.9 up to subsection 4.9.1 (but through
example 4.9.1 -- both confusingly have the same number)
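The Poisson approximation for birthday matches can be checked against the exact answer: with n people there are C(n,2) pairs, each matching with probability 1/365, so the number of matches is approximately Poisson with lambda = C(n,2)/365, and P(at least one match) is approximately 1 - e^(-lambda). A quick comparison (my sketch, same 365-day assumption as in class):

```python
from math import comb, exp

def p_match_exact(n: int) -> float:
    """Exact P(some birthday match) among n people (365 equally likely days)."""
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (365 - i) / 365
    return 1 - p_distinct

def p_match_poisson(n: int) -> float:
    """Poisson approximation with lambda = C(n,2)/365 expected matching pairs."""
    lam = comb(n, 2) / 365
    return 1 - exp(-lam)

print(round(p_match_exact(23), 4), round(p_match_poisson(23), 4))  # 0.5073 0.5
```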