
CS 237 - Probability in Computing

Fall, 2018

Instructor and Lecture Times and Locations


Instructor: Wayne Snyder

     Email: waysnyder at gmail dot com


      Office: MCS 290

      General CS Office Hours: W 4-6
      CS 237 Office Hours: Th 4-6

      Tutoring Hours in Rich Hall: T 7:30-10:30pm

      Cell Phone: 617-966-(22+41) (text in an emergency; in general, email vastly preferred, but if you call, leave a message!)



Lecture: T, Th 11 - 12:15pm in CAS B12

Discussion/Labs:

      A2: M 12:20 - 1:10pm (MCS B33)
      A3: M 1:25 - 2:15pm (KCB 103)
      A4: M 3:35 - 4:25pm (MCS B23)
      A5: M 5:45 - 6:35pm (EOP 269, located at 890 Commonwealth Avenue)

TF: Jiatong ("Lenka") Hao (jthao at bu dot edu)

Office Hours: W 2-4 in MCS 204

 

Useful Links

Lecture | Date | Lecture & Lab Topics | Readings | Practice Problems, Homeworks, and Tests | Labs

Abbreviations used in the Readings column: Prob = "Schaum's Outline of Probability"; MCS = "Mathematics for Computer Science"
1 T 9/4

Administrative matters; Goals of the course; Motivating Examples: Why should you know probability and statistics? Probability and Statistics as the science of quantifying uncertainty. What is randomness? Probability and Computer Science.

Lecture Slides: PDF

Just for Fun: Here is a short YT video with various clips from movies which involve probability (we will return to the Monty Hall Problem, the first clip, soon): HTML

Here is a nice exploration of the nature of randomness and our perceptions: HTML

   
2 Th 9/6 Basic definitions of probability theory: outcomes, sample spaces, probability functions, axioms for probability functions; examples of finite, countably infinite, and uncountable sample spaces and typical problems in each. Probability functions for finite, equiprobable spaces, and extension to the uncountable case. Anomalies arising from infinite sample spaces.

Read through Ch. 1 & 2 in Prob if you need a review of the basic math background.

Prob Sections 3.1 - 3.5.

Practice problems from Prob (not to hand in): 3.2, 3.3, 3.5, 3.6, 3.7, 3.16, 3.18, 3.26.

HW 01: IPYNB: due Th 9/13 @ midnight

HW 01 Solution: PDF

YT Instructions for converting IPYNB to PDF: PDF
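For the finite, equiprobable case discussed in this lecture, here is a minimal code sketch (illustrative only, not an assigned problem; the two-dice example is my own choice) of computing a probability as |event| / |sample space|:

```python
# A first finite, equiprobable example in code: P(sum of two fair dice is 7).
from itertools import product

sample_space = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
event = [w for w in sample_space if sum(w) == 7]
print(len(event), "/", len(sample_space), "=", len(event) / len(sample_space))   # 6/36
```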

 
  M 9/10      

Lab 01: IPYNB (Jupyter Notebook -- right click and select Save Link As...). Due Th 9/13 @ midnight along with HW 01.

Lab 01 Solution: PDF

3 T 9/11 Set operations on events and axiomatic method for proving theorems about probability functions; non-equiprobable probability spaces.

Same readings as last time; also read MCS sections 17.1 & 17.2 on the "Monty Hall Problem" and the "Four Step Method." Look at the link above to a video of the "Monty Hall Problem."

Optional: Section 17.5 of MCS covers the same material as Prob Chapter 3, but more rigorously; it is worth looking at for its application of the tree diagram technique.

Optional: Section 4.3 of Prob has a slightly different presentation of the tree diagram technique.

Prob p.94 has a nice explanation of trees resulting from independent trials (the simplest case).

 

The practice problems listed in the previous row apply to this lecture as well.
4 Th 9/13

Problem solving strategies; analysis using tree diagrams and the "Four Step Method"; the inverse method. Example: The Monty Hall Problem.

Conditional Probability; Independence.

Prob, Ch.4.

MCS 18.5 - 18.7 cover the same material.

 

Practice problems from Prob (not to hand in): 4.1, 4.4, 4.5, 4.6, 4.21, 4.22, 4.25, 4.26, 4.29, 4.36

Tree diagram problems from Prob: 2.32, 2.33

HW 02: ZIP (due 9/20 @ 11:59pm)

Solution: PDF
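As a companion to the tree-diagram analysis of the Monty Hall Problem in this lecture, here is a minimal simulation sketch (illustrative only, not part of any homework or lab) estimating the win probabilities for switching versus staying:

```python
# Monty Hall simulation sketch: switching should win about 2/3 of the time.
import random

def monty_hall_trial(switch):
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # Monty opens a door that is neither the contestant's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
print("P(win | switch) ~", sum(monty_hall_trial(True)  for _ in range(trials)) / trials)
print("P(win | stay)   ~", sum(monty_hall_trial(False) for _ in range(trials)) / trials)
```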

 

 
  M 9/17

Lab 2 Problem Set: ZIP

5 T 9/18

Independence reviewed; Bayes' Rule.

Counting principles and combinatorics; counting as sampling and constructing outcomes. Selection with and without replacement; permutations; counting sets vs sequences; the Unordering Principle; overcounting due to duplicates, multinomial coefficients.

Lecture Slides: PDF

Look at this summary of problem-solving strategies, most of which involve combinatorics: HTML. It is a good overview of what I will cover, although it does not explain all the formulae.

The second chapter of Prob has a brief treatment of these topics as well.

Bayes' Rule is covered well in Prob pp.90-91, including a nice explanation in terms of trees.

The form of Bayes' Rule in Prob p.90 is very general; we will discuss only the simple case P(B|A).

Prob p. 9 and MCS 14.9 discuss the inclusion-exclusion principle.

Prob discusses random sampling briefly on p.38.

You should also read this link on Permutations with Repetitions.

 

Practice problems from Prob (not to hand in): 4.22, 4.25 (Bayes), 2.2, 2.8, 2.10
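For the simple form of Bayes' Rule mentioned above, here is a small numeric sketch; the diagnostic-test numbers are invented for illustration and are not taken from Prob:

```python
# Illustrative check of the simple form of Bayes' Rule: P(B|A) = P(A|B)P(B) / P(A).
p_B = 0.01             # P(disease)
p_A_given_B = 0.95     # P(positive | disease)
p_A_given_notB = 0.05  # P(positive | no disease)

# Total probability: P(A) = P(A|B)P(B) + P(A|~B)P(~B)
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)
p_B_given_A = p_A_given_B * p_B / p_A
print("P(disease | positive) =", round(p_B_given_A, 4))   # about 0.161
```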

 

 
6 Th 9/20 Combinations; applications to finite probability problems; poker probabilities.

Lecture Slides: PDF

Much of this should be review from CS 131!

Prob, Ch. 2; MCS Sections 14.1 - 14.7 are a rather more advanced treatment of the same material.

Combinations are covered in the previous list of readings; the Wiki article section on 5-card poker is very good.

 

Practice problems from Prob (not to hand in): 2.15, 2.16, 2.17, 2.19, 2.20, 2.12, 2.24, 2.25

HW 3: IPYNB

Solution: PDF

 
  M 9/24      

Lab 3: Poker Probabilities -- verifying the formulae by simulation: IPYNB
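In the spirit of Lab 3, here is a hedged sketch (not the lab solution) of checking one poker probability, "exactly one pair," by simulation against the counting formula:

```python
# Estimate P(exactly one pair) in a 5-card hand by simulation and compare
# against the counting formula C(13,1) C(4,2) C(12,3) 4^3 / C(52,5).
import random
from math import comb

def exactly_one_pair(hand):
    ranks = [card % 13 for card in hand]
    counts = sorted(ranks.count(r) for r in set(ranks))
    return counts == [1, 1, 1, 2]   # three singletons and one pair

deck = list(range(52))
trials = 200_000
hits = sum(exactly_one_pair(random.sample(deck, 5)) for _ in range(trials))

exact = comb(13, 1) * comb(4, 2) * comb(12, 3) * 4**3 / comb(52, 5)
print("simulated:", hits / trials)
print("exact:    ", exact)          # about 0.4226
```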

7 T 9/25 Counting continued: the power set and counting subsets; using combinations to count partitions; ordered partitions vs. unordered partitions; overcounting due to duplicates among subsets. Applications to probability problems.

Lecture Slides: PDF

MCS 14.5 concerns powersets and counting subsets; here is a nice explanation about how to count partitions, with intuitive examples.

Practice problems from Prob (not to hand in): 2.26, 2.28, 2.29, 2.72, 2.73
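A few quick numeric checks of the counting facts from this lecture (the specific numbers below are made up for illustration):

```python
# Counting subsets and ordered partitions with small illustrative numbers.
from math import comb, factorial

n = 5
print("number of subsets of a 5-set:", 2**n)        # 32
print("number of 2-subsets:         ", comb(5, 2))  # 10

# Ordered partitions of 10 items into groups of sizes 5, 3, 2 (multinomial coefficient):
multinomial = factorial(10) // (factorial(5) * factorial(3) * factorial(2))
print("10! / (5! 3! 2!) =", multinomial)            # 2520
```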

 
8 Th 9/27 Discrete Random Variables; Probability Distributions; Functions and expressions of RVs; Expected Value

Lecture Slides: PDF

Read Prob 5.1 - 5.3

Practice problems from Prob (not to hand in): 5.1 - 5.5

HW 04: IPYNB

HW Solution: PDF

 
  M 10/1       Lab 4: Generating Random Variates: IPYNB
9 T 10/2 Expected Value and fair games; Variance and Standard Deviation of RVs

Lecture Slides: PDF

Practice problems from Prob (not to hand in): 5.8, 5.9, 5.15, 5.18, 5.20, 5.19, 5.21, 5.23
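A small worked example for this lecture (a fair die, chosen only for illustration): expected value, variance, and standard deviation computed directly from the distribution:

```python
# E[X], Var(X), and SD(X) for a fair six-sided die.
from math import sqrt

values = [1, 2, 3, 4, 5, 6]
probs  = [1/6] * 6

E   = sum(x * p for x, p in zip(values, probs))      # E[X] = 3.5
E2  = sum(x**2 * p for x, p in zip(values, probs))   # E[X^2]
var = E2 - E**2                                      # Var(X) = E[X^2] - E[X]^2 ~ 2.9167
print("E[X] =", E, " Var(X) =", round(var, 4), " SD(X) =", round(sqrt(var), 4))
```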

 

 
10 Th 10/4 Important Discrete Distributions: Uniform, Bernoulli, Binomial; Geometric
Lecture Slides: PDF

Read Prob 5.4 - 5.5, 5.8

In Prob the Bernoulli and Binomial are explained in greater depth on pp.177-80, and the Geometric on p.195.

Here is a summary of some of the most useful discrete distributions: IPYNB (you do not need to know any that we do not study in lecture).

For an illustration of the Binomial, take a look at this Quincunx animation: HTML

Practice problems on Binomial from Prob (not to hand in): 6.2, 6.3, 6.5, 6.8, 6.12, 6.15

HW 05: IPYNB

HW Solution: PDF
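If you want to experiment with these distributions in a notebook, here is a minimal sketch using scipy.stats; the parameter values n = 10, p = 0.3 are made up, and note that scipy's geom counts the trial on which the first success occurs:

```python
# Illustrative use of scipy.stats for the Binomial and Geometric distributions.
from scipy.stats import binom, geom

n, p = 10, 0.3
print("Binomial(10, 0.3): P(X = 4)  =", binom.pmf(4, n, p))
print("Binomial(10, 0.3): P(X <= 4) =", binom.cdf(4, n, p))
print("Binomial mean, var:", binom.mean(n, p), binom.var(n, p))   # np, np(1-p)

# Geometric with support 1, 2, 3, ...: P(X = 3) = (1-p)^2 * p
print("Geometric(0.3): P(X = 3) =", geom.pmf(3, p))
```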

 
  T 10/9 Monday schedule: Labs will be held but no lecture!     Lab 5: Generating Random Variates II; Comparing Distributions: IPYNB
11 Th 10/11

Discrete Distributions concluded: Geometric; Memory-less property of Geometric; Poisson Processes and Poisson Distribution; relationship between Poisson and Binomial.

Lecture Slides: PDF

On Poisson Distribution: Prob section 6.7.

Reading on the Poisson Processes: HTML

Practice problems on Geometric: Example 6.18 on p. 195, plus 6.46 and 6.87.

Practice problems on Poisson from Prob: 6.39, 6.40, and 6.81.

HW 06: IPYNB
Solution: PDF
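A quick numerical illustration of the Binomial/Poisson relationship mentioned above, with made-up parameters (n = 1000, p = 0.003, so λ = np = 3):

```python
# For large n and small p with λ = np fixed, Binomial(n, p) pmf values
# approach Poisson(λ) pmf values.
from math import comb, exp, factorial

n, p = 1000, 0.003
lam = n * p   # λ = 3

for k in range(6):
    binom_pmf   = comb(n, k) * p**k * (1 - p)**(n - k)
    poisson_pmf = exp(-lam) * lam**k / factorial(k)
    print(f"k={k}:  Binomial {binom_pmf:.5f}   Poisson {poisson_pmf:.5f}")
```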
  M 10/15      

Lab 6 -- Queueing Simulation with Discrete Distributions: IPYNB

12 T 10/16 General (continuous) Random Variables; Uniform Distribution; Importance of the CDF; Introduction to the Normal Distribution

Lecture Slides: PDF

Prob: 5.10, 5.11, 6.8

Brief tutorial on integration: PDF

 

Practice problems on continuous distributions from Prob: 5.12, 5.37
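A tiny sketch of using the CDF for a continuous Uniform random variable; the interval and endpoints below are arbitrary illustrations:

```python
# For X ~ Uniform(a, b), P(c < X < d) = F(d) - F(c), where F is the CDF.
a, b = 0.0, 10.0
F = lambda x: (min(max(x, a), b) - a) / (b - a)   # CDF of Uniform(a, b)
c, d = 2.5, 7.0
print("P(2.5 < X < 7.0) =", F(d) - F(c))          # 0.45
```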

 

 
13 Th 10/18 Normal Distribution continued; Exponential distribution; Memory-less property of Exponential Distribution; Relationship between the Exponential and Poisson Distribution.

Lecture Slides: PDF

Prob: 6.3, 6.4, & 6.5

Practice problems on Normal from Prob (use a calculator rather than converting to Z scores with the table!): 6.28, 6.29, 6.21, 6.22, 6.33, 6.34

Practice problems on Exponential from Prob: 6.89; also look at Examples 6.19 and 6.20 on p.198.

No homework this week, study for the midterm!
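A hedged sketch of computing Normal probabilities directly (no Z table) and checking the memoryless property of the Exponential numerically; the parameters are invented, and scipy.stats is assumed to be available as in the labs:

```python
# Normal probabilities via the CDF, plus a numerical check that
# P(X > s + t | X > s) = P(X > t) for the Exponential distribution.
from scipy.stats import norm, expon

# P(90 < X < 120) for X ~ N(mu=100, sigma=15)
mu, sigma = 100, 15
print("P(90 < X < 120) =", norm.cdf(120, mu, sigma) - norm.cdf(90, mu, sigma))

# Memoryless property for X ~ Exponential with rate lam (scipy uses scale = 1/lam)
lam = 0.5
s, t = 2.0, 3.0
sf = lambda x: expon.sf(x, scale=1/lam)     # survival function P(X > x)
print("P(X > s+t | X > s) =", sf(s + t) / sf(s))
print("P(X > t)           =", sf(t))        # same value
```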

 
  M 10/22      

No Lab

 

14 T 10/23 Review Session for Midterm. Sample Midterms: Soon!
15 Th 10/25 Midterm Exam

Midterm A Solution: PDF

Midterm B Solution: PDF

The exam will have 6 problems, and you must eliminate one, completing the other five. Material covered will be up through lecture 12, but you are not responsible for the Normal or Exponential Distributions (those will be on the final exam); however, continuous distributions as in problem 9 on HW 6 are fair game. There will be no questions about the labs or requiring you to write code. For questions with complicated calculations, you may just give the appropriate formula.

Most problems will have multiple parts. One question will be taken literally from the list of practice problems from Schaum's on this page up through lecture 12 (but nothing about the Normal or Exponential), perhaps modified in some simple way so it can be done in the test environment. The rest will be similar (but not identical) to homework problems, again modified to make them doable in a test environment.

I will post some additional problems from Schaum's (maybe a dozen, not more) by Saturday midday, as well as some past midterms and solutions. These practice exams are just for you to see the format and so on; paying too much attention to those problems is not a good idea, because those exact problems won't appear! Focus on the Schaum's practice problems and the homeworks.

Solutions to homeworks 1 - 5 are on this page; the solution to HW 06 will be posted by Sunday night.

No books, no calculators, but you may bring a single 3x5 card or a piece of paper of the same dimensions. You may typeset it, but no magnifying glasses :-).
Midterm Examples with Solutions: DIR

HW 07: IPYNB

 
  M 10/29       Lab 7: Discrete Event Simulation -- Queueing with Scheduling: IPYNB

Pre-Lab Video Lecture: YT

Lab 07 with Diagrams (what you should be aiming for): PDF

16 T 10/30

Normal approximation to the Binomial; the Continuity Correction.

Central Limit Theorem

Lecture Slides: PDF

Notebook on CLT: IPYNB

Notebook on CLT: PDF

The CLT is stated on p.189 of Prob. Not much discussion.

Here is a reasonable tutorial, with videos: HTML
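A small numerical check of the Normal approximation with the continuity correction; the Binomial(100, 0.5) example is chosen only for illustration:

```python
# Compare the exact Binomial probability P(45 <= X <= 55) with the Normal
# approximation using the continuity correction.
from math import sqrt
from scipy.stats import binom, norm

n, p = 100, 0.5
mu, sigma = n * p, sqrt(n * p * (1 - p))

exact  = binom.cdf(55, n, p) - binom.cdf(44, n, p)          # exact Binomial
approx = norm.cdf(55.5, mu, sigma) - norm.cdf(44.5, mu, sigma)  # continuity-corrected

print("exact:", round(exact, 5), " normal approx:", round(approx, 5))
```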

   
17 Th 11/1

Statistics as Applications of the CLT. Sampling theory and point estimation when population variance is known. Confidence Intervals.

Lecture Slides: PDF

Read Prob, Appendix A1 - A4 (up to p.251)

Here is a very precise and readable summary of all the major points about sampling, confidence intervals, and hypothesis testing: PDF

 

HW 08: IPYNB
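A minimal sketch of a 95% confidence interval for a mean when the population standard deviation is known (all numbers below are invented):

```python
# 95% confidence interval for the mean, known population sd, via the CLT.
from math import sqrt
from scipy.stats import norm

n, xbar, sigma = 50, 4.2, 1.5      # sample size, sample mean, known population sd
z = norm.ppf(0.975)                # ~1.96 for a 95% interval
half_width = z * sigma / sqrt(n)
print(f"95% CI for the mean: ({xbar - half_width:.3f}, {xbar + half_width:.3f})")
```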

 
  M 11/5      

Lab 8 -- Stats Lab 1: ZIP

Here is documentation on the scipy.norm statistics library: HTML

18 T 11/6 Hypothesis Testing

Lecture Slides: PDF

On Hypothesis Testing, here is a nice summary of the major points, with a video lecture: HTML  
19 Th 11/8 Hypothesis testing concluded: One-tailed vs Two-tailed tests; Sampling when the population parameters are unknown; Small samples and the T-Distribution.

Lecture Slides: PDF

 

HW 09: IPYNB
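A hedged sketch tying together the two lectures above: a two-tailed z-test when the population sd is known, and the t-based version for a small sample with unknown sd; the data values are made up:

```python
# Two-tailed one-sample tests of H0: mu = mu0 with invented numbers.
from math import sqrt
from scipy.stats import norm, t

n, xbar, mu0 = 25, 52.1, 50.0

# Known population sd: z statistic and two-tailed p-value
sigma = 6.0
z = (xbar - mu0) / (sigma / sqrt(n))
print("z =", round(z, 3), " p =", round(2 * norm.sf(abs(z)), 4))

# Unknown sd, small sample: use the sample sd s and the t distribution with n-1 df
s = 5.4
t_stat = (xbar - mu0) / (s / sqrt(n))
print("t =", round(t_stat, 3), " p =", round(2 * t.sf(abs(t_stat), df=n - 1), 4))
```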

 
  M 11/12       Lab 9 -- Display of Multivariate Data: IPYNB

What you are aiming for: PDF

20 T 11/13 Joint Random Variables (discrete and continuous); Independence of JRVs; Conditional Random Variables

Read Schaum's 5.6 - 5.7, but correct the bad typo in the definition of correlation at the bottom of page 130:

ρ(X,Y) = cov(X,Y) / (σ_X σ_Y)

Here is my spreadsheet from lecture: XLS
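A quick numeric check of the corrected correlation formula above, using small invented samples:

```python
# Compute rho(X,Y) = cov(X,Y) / (sigma_X sigma_Y) by hand and compare with numpy.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

cov = np.cov(x, y, ddof=0)[0, 1]              # population covariance
rho = cov / (np.std(x, ddof=0) * np.std(y, ddof=0))
print("rho from the formula:", rho)
print("numpy's corrcoef:    ", np.corrcoef(x, y)[0, 1])   # should match
```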

   
21 Th 11/15 Correlation and Autocorrelation of Joint Random Variables    
  M 11/19      

Lab 10 -- Correlation and Autocorrelation: IPYNB

22 T 11/20 Least Squares and Linear Regression; Linear Models

Read Prob, Appendix A.6.
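A minimal least-squares sketch (invented data) showing the closed-form slope and intercept alongside numpy's polyfit:

```python
# Fit y ~ a + b*x by least squares: closed-form formulas vs. numpy.polyfit.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.2, 2.8, 4.5, 4.9, 6.1])

b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)  # slope
a = y.mean() - b * x.mean()                                              # intercept
print("by hand:       a =", round(a, 4), " b =", round(b, 4))

b_np, a_np = np.polyfit(x, y, 1)   # polyfit returns [slope, intercept] for degree 1
print("numpy.polyfit: a =", round(a_np, 4), " b =", round(b_np, 4))
```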

 

 
    Thanksgiving Recess  

HW 10:

 
  M 11/26       Lab 11 -- Linear Regression: IPYNB
23 T 11/27 Multiple Linear Regression

I will follow closely the treatment in sections 1.4, 1.5, 1.6, 1.9 in this link.
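An illustrative multiple-regression sketch using a design matrix and numpy's least-squares solver; the synthetic data and true coefficients below are made up:

```python
# Fit y ~ b0 + b1*x1 + b2*x2 by least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(0, 10, size=50)
x2 = rng.uniform(0, 5, size=50)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(0, 0.5, size=50)

X = np.column_stack([np.ones_like(x1), x1, x2])   # design matrix with intercept column
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("estimated [b0, b1, b2]:", np.round(coeffs, 3))   # close to [1, 2, -3]
```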

   
24 Th 11/29 Logistic Regression Video 1 (simpler, first of a series of 5)
Video 2 (more detailed)

Nice summary of Logistic Regression: HTML

No more homeworks!  
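A tiny sketch of the logistic model from this lecture: P(y = 1 | x) modeled as the sigmoid of a linear function of x; the coefficients below are hypothetical:

```python
# The logistic (sigmoid) model: predicted probabilities always lie in (0, 1).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

b0, b1 = -3.0, 1.5            # hypothetical intercept and slope
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
p = sigmoid(b0 + b1 * x)      # P(y = 1 | x) under the model
print(np.round(p, 3))
```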
  M 12/3       Final Project: IPYNB
Dataset for project: CSV
25 T 12/4 Logistic Regression concluded; Introduction to Machine Learning    
26 Th 12/6 Machine Learning and Data Mining    
  M 12/10       No Lab (work on the project)
27 T 12/11 Conclusions; Review for Final Exam; Course Evaluations      
F 12/14 Final Project Due @ 12 midnight      
  M 12/17 Office Hours and Review for Final      
  T 12/18 Final Exam: 12:30 - 2:30pm Table for standard normal distribution (will be provided on exam): PDF