Brief solutions:
For Q.1 you should have stated that you will decide according to
the max of P(C_i|X), and by Bayes' rule you are then choosing the max
of P(X|C_i), since P(X) doesn't depend on i and the priors P(C_i) are equal.
By independence, you transform this into a product of the P(x_j|C_i) and
then take the log, so your discriminant function for class i is the
sum of log P(x_j|C_i).
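As a sketch, the chain above in symbols (g_i here denotes the discriminant for class C_i; the prior term drops out because the priors are equal):

```latex
g_i(X) = \log P(X \mid C_i)
       = \log \prod_{j} P(x_j \mid C_i)
       = \sum_{j} \log P(x_j \mid C_i)
```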
You can write log(x_j*p + (1-x_j)*(1-p)) for each term of f_1 (for
binary x_j this is just log p when x_j = 1 and log(1-p) when x_j = 0)
and similarly set up f_2. Subtract f_2 from f_1 and in 2-3 lines
you're done, showing that if p > 0.5 the difference is positive.
To rephrase: each discriminant function, f_1 or f_2, is a sum of terms
that are either log p or log(1-p). In f_1 you get log p for every
observed 1, and log(1-p) for every observed 0. In f_2 the opposite holds.
Suppose that you have k more observed 1's than 0's. Then when you
subtract f_2 from f_1, the matched terms cancel and you are left with
k*log p - k*log(1-p) = k*log(p/(1-p)).
k is positive and log(p/(1-p)) is positive when p > 0.5, so f_1 - f_2 is positive.
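A quick numeric check of that cancellation (a sketch, assuming a single Bernoulli parameter p > 0.5 for class 1, with the roles of p and 1-p swapped for class 2; the vector x is made up):

```python
import math

def discriminant(x, p):
    # Sum of per-feature log-likelihoods of the binary vector x under a
    # Bernoulli model where each bit is 1 with probability p (independence).
    return sum(math.log(p) if xj == 1 else math.log(1 - p) for xj in x)

p = 0.8                      # assumed value, p > 0.5
x = [1, 1, 1, 0, 1, 0]       # 4 ones, 2 zeros, so k = 4 - 2 = 2
f1 = discriminant(x, p)      # class 1: P(x_j = 1) = p
f2 = discriminant(x, 1 - p)  # class 2: P(x_j = 1) = 1 - p
k = sum(x) - (len(x) - sum(x))
# f1 - f2 collapses to k * log(p / (1 - p)), which is positive for p > 0.5
print(f1 - f2, k * math.log(p / (1 - p)))
```

The matched terms cancel pairwise, so only the k surplus 1's contribute to the difference.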
-------------------------------------
For Q.2 there is a counterexample.
You may check the answer to a similar question, question 3 of the
2002 exam, which involves Gabriel graph editing instead of RNG editing.
The answer is almost the same.
-------------------------------------------------------
Q.3 was straight from homework 5. I believe it's the second half of
the third question.