The run time analysis of these algorithms is based on two key lemmas:
Lemma 1: If V (defined in the algorithms) is nonempty, it contains a constraint of S*.
Lemma 2: Let V* be a subset of S, and let R be a random subset of S \ V* of size r, with |S \ V*| = n. Let V be the set of constraints violated by x*(R union V*). Then the expected size of V is no more than d(n - r + 1) / (r - d).
For proofs of these lemmas see Ken Clarkson's paper.
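To make the quantities in Lemma 2 concrete, here is a minimal sketch of one sampling trial for a linear program written as minimising c^T x subject to A x <= b; the constraint format, the use of scipy's linprog as the LP solver, and the helper names are assumptions made for the example rather than part of Clarkson's presentation.

```python
# Sketch of one sampling trial from Lemma 2 (illustrative names and format).
import numpy as np
from scipy.optimize import linprog

def x_star(c, A, b, idx):
    """x*(Q): optimum of min c.x s.t. A[idx] x <= b[idx] (assumed bounded and feasible)."""
    res = linprog(c, A_ub=A[idx], b_ub=b[idx], bounds=(None, None))
    return res.x

def sample_violators(c, A, b, V_star, r, rng):
    """Draw R of size r from S \\ V* and return V, the constraints violated by x*(R union V*)."""
    pool = np.setdiff1d(np.arange(len(b)), V_star)   # S \ V*, of size n
    R = rng.choice(pool, size=r, replace=False)
    x = x_star(c, A, b, np.union1d(R, V_star))
    return np.flatnonzero(A @ x > b + 1e-9)          # V; Lemma 2: E[|V|] <= d(n - r + 1)/(r - d)
```

Repeating sample_violators and averaging |V| gives an empirical check of the lemma's bound.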
Lemma 3: The probability that |V| <= 2n^(1/2) is at least 1/2. So on average two executions are required to obtain a successful one.
This follows directly from Lemma 2 and Markov's inequality, substituting r = d n^(1/2). For the iterative case, take V* = {} and replace n by w(S) (the sample size there is r = 9d^2); it follows that on average two iterations are needed before w(V) <= 2w(S)/(9d - 1).
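Written out, the substitution is as follows (a sketch of the arithmetic; the restriction d >= 2 is only there to make the middle inequality exact):

```latex
% Lemma 2 with r = d\sqrt{n}:
\[
  \mathbb{E}\bigl[|V|\bigr] \;\le\; \frac{d(n - r + 1)}{r - d}
  \;=\; \frac{n - d\sqrt{n} + 1}{\sqrt{n} - 1}
  \;\le\; \sqrt{n} \qquad (d \ge 2),
\]
% and Markov's inequality then gives
\[
  \Pr\bigl[\,|V| > 2\sqrt{n}\,\bigr] \;\le\; \frac{\mathbb{E}[|V|]}{2\sqrt{n}} \;\le\; \tfrac{1}{2}.
\]
```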
Theorem 4: The iterative algorithm takes O(d^2 n log n) + (d log n) O(d)^(d/2 + O(1)) expected time as n approaches infinity.
It can be shown that the loop body of the algorithm runs an expected O(d log n) times, by showing that w(S*) grows much faster than w(S), so after O(d log n) iterations either V is empty or w(S*) > w(S), a contradiction. Solving the subproblem x*(R) with the simplex algorithm takes O(d)^(d/2 + O(1)) time, and determining V takes O(dn) time, so the body of the loop takes O(dn) + O(d)^(d/2 + O(1)) time and the bound of the theorem follows.
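For concreteness, here is a minimal sketch of the loop whose body is being charged, following the reweighting rule described above; the A x <= b constraint format, the tolerance, and the base-case solver base_lp (e.g. a simplex routine for the O(d)^(d/2 + O(1)) subproblems) are illustrative assumptions, and the sample size 9d^2 is the value implied by the 2w(S)/(9d - 1) threshold.

```python
# Sketch of the iterative (reweighting) algorithm (illustrative names and format).
import numpy as np

def iterative_lp(c, A, b, base_lp, rng=np.random.default_rng()):
    n, d = A.shape                            # n constraints, d variables
    r = 9 * d * d
    if n <= r:
        return base_lp(c, A, b)               # small case: solve directly
    w = np.ones(n)                            # multiplicative weights on the constraints
    while True:
        # Draw R of size r with probability proportional to the current weights.
        R = rng.choice(n, size=r, replace=False, p=w / w.sum())
        x = base_lp(c, A[R], b[R])            # x*(R): O(d)^(d/2 + O(1)) via simplex
        V = np.flatnonzero(A @ x > b + 1e-9)  # violated constraints; O(dn) to determine
        if V.size == 0:
            return x                          # x*(R) satisfies all of S
        if w[V].sum() <= 2 * w.sum() / (9 * d - 1):
            w[V] *= 2                         # "successful" iteration: double weights on V
```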
Theorem 5: The mixed algorithm takes expected time O(d^2 n) + (d^2 log n) O(d)^(d/2 + O(1)) + O(d^4 n^(1/2) log n) as n approaches infinity.
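Before the proof, a minimal sketch of the mixed algorithm being analysed; it reuses iterative_lp from the sketch above, and the names, constraint format, and small-case cutoff are again illustrative assumptions.

```python
# Sketch of the mixed algorithm (illustrative names and format).
import numpy as np

def mixed_lp(c, A, b, base_lp, rng=np.random.default_rng()):
    n, d = A.shape
    if n <= 9 * (d + 1) ** 2:                 # small case (cutoff chosen for this sketch)
        return iterative_lp(c, A, b, base_lp, rng)
    r = int(d * np.sqrt(n))
    V_star = np.array([], dtype=int)          # grows by at most 2*sqrt(n) per kept trial
    while True:
        pool = np.setdiff1d(np.arange(n), V_star)         # S \ V*
        R = rng.choice(pool, size=r, replace=False)
        idx = np.union1d(R, V_star)                       # |R union V*| <= 3d*sqrt(n)
        x = iterative_lp(c, A[idx], b[idx], base_lp, rng)  # cost T_i(3d*sqrt(n))
        V = np.flatnonzero(A @ x > b + 1e-9)              # violators of x*(R union V*); O(dn)
        if V.size == 0:
            return x
        if V.size <= 2 * np.sqrt(n):
            V_star = np.union1d(V_star, V)                # keep only small violator sets
```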
V* grows by at most 2n^(1/2) in each iteration with |V| <= 2n^(1/2), so after at most d+1 such iterations V* will contain S*. It follows that the maximum size of R union V* is (9d^2 n)^(1/2) = 3d n^(1/2). Let T_m(n) be the expected time of the mixed algorithm with n constraints and d variables, and similarly let T_i(n) be the expected time of the iterative algorithm. Since the probability that |V| <= 2n^(1/2) is at least 1/2, it follows that the expected run time of the mixed algorithm is: