Consider the following statements about two random variables X and Y :
i. If X and Y are independent, the correlation of X and Y must be 0.
ii. If X and Y are uncorrelated, the correlation of X and Y must be 0.
iii. If X and Y are independent, E[XY] must equal E[X]⋅E[Y].
iv. If X and Y are correlated, E[X+Y] may not equal E[X]+E[Y].
Which of the statements above is(are) TRUE?
i
ii
iii
iv
None of the above.
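The distinctions above can be checked exactly on a small discrete example. The joint PMF below is hypothetical, chosen only to illustrate statements i, iii, and iv; nothing about it comes from the exam itself:

```python
from itertools import product

# Hypothetical joint PMF: X uniform on {0, 1}, Y uniform on {0, 2}, independent.
px = {0: 0.5, 1: 0.5}
py = {0: 0.5, 2: 0.5}
joint = {(x, y): px[x] * py[y] for x, y in product(px, py)}

def E(g):
    """Expectation of g(X, Y) under the joint PMF."""
    return sum(p * g(x, y) for (x, y), p in joint.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
exy = E(lambda x, y: x * y)
cov = exy - ex * ey            # independence gives E[XY] = E[X]E[Y], so cov = 0
lin = E(lambda x, y: x + y)    # linearity: E[X + Y] = E[X] + E[Y] holds always
```

Note that linearity of expectation holds whether or not X and Y are correlated, which is the point statement iv probes.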
Consider a random variable X with the PDF:
i. The value of the constant C is 0.1.
ii. E[X] = 2.
iii. The variance of X is 3.2.
Which of the statements above is (are) TRUE?
i
ii
iii
None of the above.
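The PDF itself is not reproduced above, so the statements cannot be checked directly. As a sketch of the method, take a labeled-hypothetical density f(x) = C·x on [0, 2] and recover C, E[X], and Var[X] by numerical integration:

```python
# Hypothetical density f(x) = C * x on [0, 2]; C, E[X], Var[X] follow from
# the usual integrals, approximated here with the trapezoidal rule.
def integrate(g, a, b, n=20_000):
    """Composite trapezoidal rule on [a, b]."""
    h = (b - a) / n
    s = 0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))
    return s * h

raw = lambda x: x                      # unnormalized density (hypothetical)
C = 1.0 / integrate(raw, 0.0, 2.0)     # normalization: total probability = 1
f = lambda x: C * raw(x)

mean = integrate(lambda x: x * f(x), 0.0, 2.0)                  # E[X] = 4/3
var = integrate(lambda x: x * x * f(x), 0.0, 2.0) - mean ** 2   # Var[X] = 2/9
```

The same three steps (normalize, first moment, second moment minus squared mean) apply to whatever PDF the original question specified.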
Random variables W, X , Y, Z have the following relations:
Consider the following statements about the correlation coefficients:
Which of the statements above is(are) TRUE?
i
ii
iii
iv
None of the above.
Consider an experiment that produces observations of sample values of a random
variable X with unknown but finite variance Var[X] and mean E[X]. The
observed sample value of the i-th trial is denoted by Xi. Define the estimators
Mn(X) and Vn(X), and consider the following statements:
i. Mn(X) is an unbiased estimate of E[X].
ii. Vn(X) is an unbiased estimate of Var[X].
iii. Vn(X) is an asymptotically unbiased estimate of Var[X].
iv. {Mn(X)} is a sequence of consistent estimates of E[X].
Which of the statements above is(are) TRUE?
i
ii
iii
iv
None of the above.
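The bias claims can be verified exactly by enumerating every sample of a tiny distribution. This sketch assumes the common definitions Mn = (1/n)ΣXi and Vn = (1/n)Σ(Xi − Mn)² (the 1/n variance estimator, which is what statements ii–iii suggest), applied to a hypothetical fair coin:

```python
from fractions import Fraction
from itertools import product

# Exact bias check for i.i.d. fair-coin samples X_i in {0, 1}.
# Assumed definitions: M_n = (1/n) * sum(X_i),
#                      V_n = (1/n) * sum((X_i - M_n)**2).
n = 3
mu, sigma2 = Fraction(1, 2), Fraction(1, 4)   # true E[X] and Var[X]

e_mn = Fraction(0)
e_vn = Fraction(0)
for xs in product((0, 1), repeat=n):          # all 2**n equally likely samples
    p = Fraction(1, 2 ** n)
    mn = Fraction(sum(xs), n)
    vn = sum((Fraction(x) - mn) ** 2 for x in xs) / n
    e_mn += p * mn
    e_vn += p * vn

# E[M_n] = mu exactly (unbiased); E[V_n] = ((n-1)/n) * sigma2, so V_n is
# biased for any fixed n but the factor (n-1)/n tends to 1 as n grows.
```

Dividing by n − 1 instead of n (Bessel's correction) would make the variance estimate exactly unbiased.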
Which of the following subsets of ℝ³ is(are) linearly independent?
{(1,0,0), (0,2,0), (0,0,1)}
{(1,2,3), (0,−1,0)}
{(1,0,1), (1,1,0), (0,0,1), (0,1,1)}
{(1,2,1), (2,4,−2)}
None of the above.
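Such sets can be checked mechanically: a set of vectors is linearly independent exactly when the matrix they form has rank equal to the number of vectors. A sketch with exact rational arithmetic (no external libraries assumed):

```python
from fractions import Fraction

def rank(vectors):
    """Row rank via Gaussian elimination over the rationals."""
    rows = [[Fraction(x) for x in v] for v in vectors]
    r, col, ncols = 0, 0, len(rows[0])
    while r < len(rows) and col < ncols:
        pivot = next((i for i in range(r, len(rows)) if rows[i][col] != 0), None)
        if pivot is None:
            col += 1
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(r + 1, len(rows)):
            factor = rows[i][col] / rows[r][col]
            rows[i] = [a - factor * b for a, b in zip(rows[i], rows[r])]
        r, col = r + 1, col + 1
    return r

def independent(vectors):
    return rank(vectors) == len(vectors)

sets = [
    [(1, 0, 0), (0, 2, 0), (0, 0, 1)],
    [(1, 2, 3), (0, -1, 0)],
    [(1, 0, 1), (1, 1, 0), (0, 0, 1), (0, 1, 1)],  # 4 vectors in R^3
    [(1, 2, 1), (2, 4, -2)],
]
results = [independent(s) for s in sets]
```

Four vectors in ℝ³ can never be independent, since the rank is capped at 3.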
Which of the following subsets of ℝ³ is(are) an orthogonal basis(bases) thereof?
{(1,0,0), (0,2,0), (0,0,1)}
{(1,0,0), (0,1,0), (0,0,0)}
{(3,4,0), (−4,3,0), (0,0,7)}
{(1,1,1), (1,−1,0), (1,1,−2)}
None of the above.
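An orthogonal basis of ℝ³ is three nonzero, pairwise orthogonal vectors (pairwise orthogonality of nonzero vectors already implies independence, so spanning follows from the count). A minimal check:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def is_orthogonal_basis(vectors, dim=3):
    """Exactly `dim` nonzero, pairwise-orthogonal vectors form an
    orthogonal basis of R^dim."""
    if len(vectors) != dim or any(dot(v, v) == 0 for v in vectors):
        return False
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors)) for j in range(i + 1, len(vectors)))

candidates = [
    [(1, 0, 0), (0, 2, 0), (0, 0, 1)],
    [(1, 0, 0), (0, 1, 0), (0, 0, 0)],      # contains the zero vector
    [(3, 4, 0), (-4, 3, 0), (0, 0, 7)],
    [(1, 1, 1), (1, -1, 0), (1, 1, -2)],
]
verdicts = [is_orthogonal_basis(c) for c in candidates]
```

Note the question asks for orthogonal bases, not orthonormal ones, so the vectors need not have unit length.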
Which of the following is(are) a linear transformation(s)?
None of the above.
Consider an m×n matrix A . Which of the following is(are) TRUE?
rank(A) + nullity(A) = m
The dimension of the row space of A equals that of the column space of A.
1 ≤ rank(A) ≤ min(n,m) .
AAᵀ is always invertible.
None of the above.
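When weighing these options, recall that the rank–nullity theorem counts columns: rank(A) + nullity(A) = n for an m×n matrix. A concrete 2×3 sanity check (the matrix is a made-up example):

```python
# Rank-nullity check on a concrete 2x3 matrix (m = 2 rows, n = 3 columns).
A = [[1, 2, 3],
     [2, 4, 6]]          # second row = 2 * first row, so rank(A) = 1

def matvec(M, v):
    return [sum(a * b for a, b in zip(row, v)) for row in M]

# Two independent null-space vectors, so nullity(A) = 2; with rank 1 this
# gives rank + nullity = 3 = n (the column count, not the row count m = 2).
null_basis = [(-2, 1, 0), (-3, 0, 1)]
in_null_space = all(matvec(A, v) == [0, 0] for v in null_basis)
```

The example also illustrates why row rank equals column rank: both equal 1 here, even though the matrix is not square.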
Please derive the following MGFs:
A mathematician decides to reconstruct the whole of probability theory based on a new
set of probability axioms. The new set has three axioms. Two of them are the same
as the axioms used in the original probability theory. The only different one states
P{S} = 3, replacing the axiom P{S} = 1 that we have commonly used. This means
that under the new axioms, probabilities sum to 3, not 1. Based on this
new set of probability axioms, the new probability theory is constructed, which is
quite different from the original probability theory we have now. However, the same
definition of expectation is still used for the new probability theory, i.e.
E[X] = Σx x⋅P[X = x] for discrete X, where P[X = x] denotes the probability of X = x. Answer the following questions:
(1) Consider the experiment of flipping a coin n times. Let random variable X
denote the number of heads we see in the experiment. Assume that the coin is a
fair coin. According to the new set of probability axioms, what should the
probability of X = x be in the new probability theory? Clearly explain why your
answer is correct in detail.
(2) Does the original law of large numbers (LLN) still hold in the new probability
theory? Clearly explain why yes or why not in detail.
fine in the new probability theory? Clearly explain why yes or why not in detail.
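For part (1), one natural reading (an assumption to be justified in your own answer, not the official solution): the 2ⁿ flip sequences remain equally likely, and additivity plus P{S} = 3 forces each outcome to carry mass 3/2ⁿ, giving P[X = x] = 3·C(n, x)/2ⁿ. A quick check that this assignment respects the new normalization:

```python
from math import comb
from fractions import Fraction

# Candidate new-axiom probability for x heads in n fair flips:
# each of the 2**n outcomes gets mass 3 / 2**n, so
# P[X = x] = 3 * C(n, x) / 2**n  (exact rational arithmetic).
def p_new(n, x):
    return Fraction(3 * comb(n, x), 2 ** n)

n = 5
total = sum(p_new(n, x) for x in range(n + 1))   # must equal 3, not 1
```

For part (2), note that with this scaling E[X] under the new definition picks up the same factor of 3, which is the kind of observation the LLN question is probing.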
Let V be a vector space over a field F with addition "+" and scalar
multiplication "⋅". Prove that c⋅0 = 0 for any c ∈ F, where 0 is the zero vector in
V. Use only the axioms of vector spaces (as well as those of fields).
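A standard route for this identity, sketched in LaTeX (each step invokes one axiom; filling in the justifications is the exercise):

```latex
c \cdot \mathbf{0}
  = c \cdot (\mathbf{0} + \mathbf{0})          % \mathbf{0} is the additive identity
  = c \cdot \mathbf{0} + c \cdot \mathbf{0}    % distributivity over vector addition
```

Adding the additive inverse of c⋅0 to both sides then yields 0 = c⋅0.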
Let W be a subspace of V. Suppose B = {v1, v2, …, vm} ⊂ V is a basis for W.
Prove that any subset S of W containing more than m vectors is linearly
dependent.
(Hint: Use the fact that a homogeneous system of linear equations has an infinite
number of solutions if it has fewer equations than variables.)
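Following the hint, the argument can be sketched in LaTeX as follows (a sketch, not the full proof):

```latex
% Take u_1, \dots, u_{m+1} \in S and expand each in the basis B:
u_j = \sum_{i=1}^{m} a_{ij} v_i .
% A vanishing combination \sum_{j} c_j u_j = \mathbf{0} becomes
\sum_{i=1}^{m} \Big( \sum_{j=1}^{m+1} a_{ij} c_j \Big) v_i = \mathbf{0} ,
% i.e., since B is independent, m homogeneous equations in the
% m+1 unknowns c_1, \dots, c_{m+1}; with fewer equations than
% unknowns there is a nontrivial solution, so S is dependent.
```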
(1) Apply the Gram-Schmidt orthonormalization process on the subspace of ℝ⁴ defined
as W = span{[1 1 1 1]T , [1 1 0 0]T , [2 0 1 −1]T }
and give an orthonormal basis for W.
(2) Find an orthonormal basis for W⊥ , the orthogonal complement of W .
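For checking a hand computation of part (1), here is a floating-point Gram-Schmidt sketch on the three given vectors:

```python
from math import sqrt

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Classical Gram-Schmidt; returns an orthonormal list."""
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(w, u)                      # component of w along u
            w = [wi - c * ui for wi, ui in zip(w, u)]
        norm = sqrt(dot(w, w))
        if norm > 1e-12:                       # skip (near-)dependent inputs
            basis.append([wi / norm for wi in w])
    return basis

W = [(1, 1, 1, 1), (1, 1, 0, 0), (2, 0, 1, -1)]
Q = gram_schmidt(W)
```

For part (2), one route is to extend Q with any vector outside W (e.g. a standard basis vector), run the same process, and keep the new unit vector: since W is 3-dimensional in ℝ⁴, its orthogonal complement is spanned by a single unit vector.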