
Notes on Takahara's lecture notes on applied probability

edited December 7 in General

From a course taught by Glen Takahara at Queen's University at Kingston:

Good exposition of concepts, combining informal language with mathematical precision. Calculations get laborious at times.

Comments

  • 1.

    Notes from Introduction.

    Examples of stochastic systems: CPU with jobs arriving in random fashion; network multiplexor with packets arriving randomly; a store with random demands on its inventory.

    Prob. A monkey hits keys on a typewriter randomly and forever. Show that he eventually types the complete works of Shakespeare with probability 1.

    Ans. Show that the probability that he never types the complete works is zero. Let \(S\) be the string, of length \(n\), that is the complete works of Shakespeare. Let \(A\) be the set of infinite strings that do not contain \(S\) anywhere as a substring, and let \(B\) be the set of infinite strings that do not contain \(S\) as a substring starting at an index which is a multiple of \(n\). Then \(A \subseteq B\), so \(P(A) \le P(B)\). The length-\(n\) blocks starting at multiples of \(n\) are independent, and each spells out \(S\) with the same probability \(x > 0\) (with a \(k\)-key typewriter, \(x = (1/k)^n\)). Hence \(P(B) = \lim_{m \to \infty} (1 - x)^m = 0\), and so \(P(A) = 0\).
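    A quick numerical illustration of the bound (with toy parameters of my choosing, not from the notes: a 2-letter alphabet and a length-3 target string) showing \((1-x)^m \to 0\), plus a seeded Monte Carlo check:

```python
import random

# Toy parameters (assumed for illustration): a k-letter alphabet and a
# target string S of length n, so one aligned block equals S with
# probability x = (1/k)**n.
k, n = 2, 3
x = (1.0 / k) ** n

# (1 - x)^m is the probability that S misses all of the first m aligned blocks.
for m in (10, 100, 1000, 10000):
    print(m, (1 - x) ** m)

# Monte Carlo check: fraction of random strings of length m*n whose aligned
# length-n blocks all miss S, compared against (1 - x)^m.
random.seed(0)
S = "aba"
m, trials = 50, 2000
miss = 0
for _ in range(trials):
    s = "".join(random.choice("ab") for _ in range(m * n))
    blocks = {s[i:i + n] for i in range(0, m * n, n)}
    if S not in blocks:
        miss += 1
print(miss / trials, (1 - x) ** m)
```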

    Prob. Suppose that in one week a chicken lays \(N\) eggs, where \(N\) is a random variable with a Poisson(\(\lambda\)) distribution. Also, each egg independently has probability \(p\) of hatching a chick. Let \(X\) be the number of chicks hatched in the week. What is \(E[X]\)?

    Ans. Simple and clear: \(E[X] = \lambda p\).
    The proof involves conditioning on the event \(N = k\).
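    Written out, the conditioning argument (a standard computation, sketched here) runs:

    $$ E[X] = \sum_{k=0}^{\infty} E[X \mid N = k] \, P(N = k) = \sum_{k=0}^{\infty} kp \, \frac{e^{-\lambda} \lambda^k}{k!} = \lambda p \sum_{k=1}^{\infty} \frac{e^{-\lambda} \lambda^{k-1}}{(k-1)!} = \lambda p $$

    using \(E[X \mid N = k] = kp\), the mean of a Binomial(\(k, p\)) random variable.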

  • 2.

    From Introduction, cont'd.

    Analysis of QUICKSORT.

    QUICKSORT(\(x_1, ..., x_n\)):

    If \(n = 0\) or \(n = 1\), no sorting is needed, so return.

    Randomly choose \(x_i\).

    Divide up the remaining values into two sets \(L\) and \(H\), where \(L\) is the set of values less than or equal to \(x_i\), and \(H\) is the set of values greater than \(x_i\).

    return QUICKSORT(\(L\)), \(x_i\), QUICKSORT(\(H\))
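    A minimal Python sketch of the procedure above, with a comparison counter added (the function and variable names are my own, not from the notes; distinct values are assumed):

```python
import random

def quicksort(values, counter):
    """Randomized QUICKSORT as in the notes; counter[0] accumulates the
    number of comparisons made against pivots."""
    if len(values) <= 1:           # n = 0 or n = 1: already sorted
        return list(values)
    pivot = random.choice(values)  # randomly choose x_i
    rest = list(values)
    rest.remove(pivot)
    counter[0] += len(rest)        # each remaining value is compared to the pivot
    L = [v for v in rest if v <= pivot]   # values <= pivot
    H = [v for v in rest if v > pivot]    # values > pivot
    return quicksort(L, counter) + [pivot] + quicksort(H, counter)

random.seed(1)
counter = [0]
print(quicksort([5, 3, 8, 1, 9, 2], counter), counter[0])
```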

  • 3.

    The number of comparisons QUICKSORT makes to sort a list of \(n\) values is a random variable.

    Let \(X_n\) be the number of comparisons QUICKSORT makes to sort a list of \(n\) values, and let \(M_n = E[X_n]\).

    Let \(Y\) be the rank of the chosen value \(x_i\) among the \(n\) values; i.e., \(Y = j\) if \(x_i\) is the \(j\)th smallest.

    Condition on \(Y\) and use the Law of Total Expectation to get:

    $$ M_n = E[X_n] = E[E[X_n \mid Y]] = \sum_{j=1}^{n} E[X_n \mid Y = j] \cdot \frac{1}{n} $$ since the rank \(Y\) is uniform on \(\{1, \dots, n\}\). Next,

    $$ E[X_n \mid Y = j] = n - 1 + E[X_{j-1}] + E[X_{n-j}] $$ because \(n - 1\) comparisons are required to partition the values into \(L\) and \(H\), and the other two terms stem from the recursive structure of the algorithm.
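    Averaging this conditional expectation over the uniformly chosen rank \(j\) gives the recurrence (a sketch of the intermediate step; note that \(\sum_{j=1}^{n} M_{j-1}\) and \(\sum_{j=1}^{n} M_{n-j}\) run over the same terms):

    $$ M_n = \frac{1}{n} \sum_{j=1}^{n} \left( n - 1 + M_{j-1} + M_{n-j} \right) = n - 1 + \frac{2}{n} \sum_{k=0}^{n-1} M_k $$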

    ...

    After grinding through the calculation, the final result is:

    $$ M_{n+1} = (n + 2) \sum_{i=1}^{n} \frac{2i}{(i+1)(i+2)} $$ which grows as \(O(n \log n)\).
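    As a sanity check (mine, not from the notes), a short Python script can confirm that the closed form matches the recurrence \(M_n = n - 1 + \frac{2}{n} \sum_{k=0}^{n-1} M_k\) and exhibit the \(n \log n\) growth; the function names are hypothetical:

```python
import math

def M_recurrence(N):
    """M_0, ..., M_N from the recurrence M_n = n - 1 + (2/n) * sum_{k<n} M_k."""
    M = [0.0, 0.0]                 # M_0 = M_1 = 0: no comparisons needed
    for n in range(2, N + 1):
        M.append(n - 1 + 2.0 / n * sum(M))
    return M

def M_closed(n):
    """Closed form for M_{n+1}: (n + 2) * sum_{i=1}^{n} 2i / ((i+1)(i+2))."""
    return (n + 2) * sum(2.0 * i / ((i + 1) * (i + 2)) for i in range(1, n + 1))

M = M_recurrence(1000)
for n in (1, 10, 100, 999):        # closed form agrees with the recurrence
    assert abs(M[n + 1] - M_closed(n)) < 1e-6 * (1 + M[n + 1])

# The ratio M_n / (2 n ln n) tends to 1, though convergence is slow.
print(M[1000] / (2 * 1000 * math.log(1000)))
```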
