Page title matches

Page text matches

  • If the brightness values in the x and y directions are thought of as random variables then C is a scaled version of their covariance matrix.
    14 KB (2,253 words) - 12:21, 9 January 2009
  • ...le="padding-right: 1em;" | Friday || 02/27/09 || Circular convolution, one random variable || 1.6.5., 3.1.1 ...ign="right" style="padding-right: 1em;" | Monday || 03/02/09 || two random variables || 3.1.2
    6 KB (689 words) - 07:59, 2 August 2010
  • ...an/ece438/lecture/module_1/1.1_signals/1.1.5_complex_variables.pdf complex variables] ==Random sequences ==
    8 KB (1,226 words) - 11:40, 1 May 2009
  • *[[ECE600|ECE 600]]: "Random Variables and Stochastic Processes"
    4 KB (474 words) - 07:08, 4 November 2013
  • Let <math>X</math> denote a binomial random variable with parameters <math>(N, p)</math>. *(a) Show that <math>Y = N - X</math> is a binomial random variable with parameters <math>(N,1-p)</math>
    6 KB (883 words) - 12:55, 22 November 2011
  • '''Definition and basic concepts of random variables, PMFs''' Random Variable: a map/function from outcomes to real values
    3 KB (525 words) - 13:04, 22 November 2011
  • This part deals with Binomial Random Variables.
    401 B (68 words) - 15:04, 23 September 2008
  • ...figure out what the point of this question ("Is W one of the common random variables we have seen in class?") is. Is there any way that I can prove that W is one of the common random variables?
    532 B (101 words) - 05:43, 24 September 2008
  • ...e coupons in it, with all being equally likely. Let <math>X</math> be the (random) number of candy bars you eat before you have all coupons. What are the mea ...t is the PDF of <math>Y</math>? Is <math>Y</math> one of the common random variables?
    4 KB (656 words) - 12:56, 22 November 2011
  • <math>X</math> is an exponential random variable with parameter <math>\lambda</math>. <math>Y = \mathrm{ceil}(X)</ma What is the PMF of <math>Y</math>? Is it one of the common random variables? (Hint: for all <math>k</math>, find the quantity <math>P(Y > k)</math>. T
    3 KB (449 words) - 12:57, 22 November 2011
  • == Problem 1: Arbitrary Random Variables == Let <math>U</math> be a uniform random variable on [0,1].
    4 KB (596 words) - 12:57, 22 November 2011
  • ...comes (1/2)*e^(-d/2), which is the pdf. And D is one of the common random variables, because its pdf is exponential with parameter lambda = 1/2.
    297 B (54 words) - 12:54, 16 October 2008
  • * For Continuous Random Variable: ==Theorem of Total Probability for Continuous Random Variables==
    4 KB (722 words) - 13:05, 22 November 2011
  • The PDF of the sum of two independent random variables is the convolution of the two PDFs. The lecture notes from 10/10 are helpf
    133 B (23 words) - 19:13, 19 October 2008
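    A minimal MATLAB sketch of the statement above (the pdf of a sum of two independent random variables is the convolution of their pdfs), assuming both variables are Uniform(0,1) so the result should be the triangular density on [0,2]; the grid spacing and variable names below are illustrative, not from the original page:
        % pdfs of two independent Uniform(0,1) random variables sampled on a grid
        dx = 0.01;
        x  = 0:dx:1;
        fX = ones(size(x));      % pdf of X on [0,1]
        fY = ones(size(x));      % pdf of Y on [0,1]
        fZ = conv(fX, fY)*dx;    % pdf of Z = X + Y: discrete convolution scaled by dx
        z  = 0:dx:2;
        plot(z, fZ);             % triangular pdf peaking near z = 1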
  • ...ding P[H2|H1], and H2 and H1 are both events rather than continuous random variables, we can do this. We don't have to worry about finding the conditional PDF
    333 B (64 words) - 10:26, 20 October 2008
  • ...e former is denoted P(A|X = 0) and the latter P(A|X = 1). Now define a new random variable Y, whose value is P(A|X = 0) if X = 0 and P(A|X = 1) if X = 1. Tha ...s said to be the conditional probability of the event A given the discrete random variable X:
    2 KB (332 words) - 16:52, 20 October 2008
  • We create variables: Therefore, in C, to produce a random variable with a Gaussian distribution you simply do the following
    560 B (112 words) - 18:03, 20 October 2008
  • OK, so what we have initially is a uniform random variable on the interval [0,1]. ...exponential random variable with λ=0.5 can be constructed from two Gaussian random variables via the relationship '''<math>D=X^2+Y^2</math>'''
    1 KB (186 words) - 11:47, 21 October 2008
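    A quick MATLAB check of the relationship above, assuming X and Y are independent standard Gaussian samples (the sample size and names are illustrative): the histogram of D = X^2 + Y^2 should match the exponential pdf with lambda = 0.5.
        N = 1e5;
        X = randn(N,1);                         % standard Gaussian samples
        Y = randn(N,1);
        D = X.^2 + Y.^2;                        % should be exponential with rate 1/2
        histogram(D, 'Normalization', 'pdf');   % empirical density of D
        hold on;
        d = 0:0.1:15;
        plot(d, 0.5*exp(-0.5*d), 'r');          % exponential pdf with lambda = 0.5
        hold off;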
  • == Problem 1: Random Point, Revisited== In the following problems, the random point (X , Y) is uniformly distributed on the shaded region shown.
    4 KB (703 words) - 12:58, 22 November 2011
  • ...observed should be the sum or mean of many independent random variables (the variables need not be i.i.d.; see the proof). Undirected graphs (Markov random fields), probabilistic decision trees/models have a number of
    31 KB (4,832 words) - 18:13, 22 October 2010
  • ...ormally distributed random numbers : ex) RANDN(N) is an N-by-N matrix with random entries, chosen from a normal distribution with mean zero, variance one and ...to generate a vector of n Gaussian random variables? Can this be called a random vector? Basically my question is how do we simulate Gaussian data which h
    10 KB (1,594 words) - 11:41, 24 March 2008
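    One common way to simulate Gaussian data with a prescribed mean and covariance is to transform RANDN output with a Cholesky factor; a minimal MATLAB sketch (the particular mu and Sigma below are made-up examples, not from the original discussion):
        mu    = [1; 2];                  % desired mean vector
        Sigma = [2 0.8; 0.8 1];          % desired covariance matrix (positive definite)
        N     = 1000;                    % number of sample vectors
        Z     = randn(2, N);             % i.i.d. standard Gaussian entries
        A     = chol(Sigma, 'lower');    % Sigma = A*A'
        X     = A*Z + repmat(mu, 1, N);  % each column of X is a sample from N(mu, Sigma)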
  • ...ce of random variables since <math>p_i(\vec{x_0})</math> depends on random variables |sample_space_i|. What do we mean by convergence of a sequence of random variables? (There are many definitions.) We pick the "convergence in mean square" sense, i
    7 KB (1,212 words) - 08:38, 17 January 2013
  • Deterministic (single, non-random) estimate of parameters, theta_ML ...Bayesian formulation, the parameters to be estimated are treated as random variables. The Bayes estimate is the one that minimizes the Bayes risk by minimizing
    6 KB (995 words) - 10:39, 20 May 2013
  • which datasets with tens or hundreds of thousands of variables are available. These areas include ...on for each criterion is compared with the optimal two-group separation of variables found by total enumeration of the possible groupings.
    39 KB (5,715 words) - 10:52, 25 April 2008
  • ...imit Theorem`_ says that the sum of independent identically distributed random variables approximates the normal distribution. So, considering the pattern recognitio The following histograms of N uniformly distributed random variables for different values of N can be given to visualize the [http://en.wikipedi
    2 KB (247 words) - 08:32, 10 April 2008
  • ...iable" being observed should be the sum or mean of many independent random variables.
    213 B (35 words) - 10:01, 31 March 2008
  • ...he principal components of a data set. The principal components are random variables of maximal variance constructed from linear combinations of the input featu
    657 B (104 words) - 01:45, 17 April 2008
  • ...</math> and <math>\mathbb{Y}</math> be jointly distributed discrete random variables with ranges <math>X = \{0, 1, 2, 3, 4\}</math> and <math>Y = \{0, 1, 2\}</m
    7 KB (948 words) - 04:35, 2 February 2010
  • Expectation and Variance of Random Variables ... Binomial random variable with parameters n and p
    3 KB (491 words) - 12:54, 3 March 2015
  • ...tral density functions. Random processes and response of linear systems to random inputs.<br/><br/> <br/>ii. an ability to model complex families of signals by means of random processes.
    2 KB (231 words) - 07:20, 4 May 2010
  • Let X1,X2,...,Xn be n independent and identically distributed (i.i.d.) variables with finite mean <math>\mu</math> and finite variance <math>\sigma^ More precisely, the random variable <math>Z_n = \frac{\Sigma_{i=1}^n X_i - n \mu}{\sigma \sqrt{n}}</ma
    5 KB (806 words) - 09:08, 11 May 2010
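    A short MATLAB illustration of the statement above, assuming the X_i are Uniform(0,1) (so mu = 1/2 and sigma^2 = 1/12); the histogram of Z_n should look approximately standard normal for moderately large n:
        n     = 30;                                    % i.i.d. terms per sum
        M     = 1e4;                                   % number of realizations of Z_n
        X     = rand(n, M);                            % Uniform(0,1) samples
        mu    = 0.5;
        sigma = sqrt(1/12);
        Zn    = (sum(X,1) - n*mu) ./ (sigma*sqrt(n));  % standardized sums
        histogram(Zn, 'Normalization', 'pdf');
        hold on;
        t = -4:0.01:4;
        plot(t, exp(-t.^2/2)/sqrt(2*pi), 'r');         % standard normal pdf
        hold off;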
  • ...I reduced it to [1 2 3; 0 -3 -3]. I'm not even sure whether plugging in random values was the right idea, but I'm stuck here. How do I proceed from here? ...That's like doing an experiment in science. You'd have to plug in lots of random values if you were doing science, but you'd miss the key points in math. Y
    4 KB (756 words) - 04:25, 8 September 2010
  • :*[[ECE 600 Sequences of Random Variables|ECE 600 Sequences of Random Variables]]
    2 KB (250 words) - 10:07, 16 December 2010
  • ...observed should be the sum or mean of many independent random variables (the variables need not be i.i.d.; see the proof). Undirected graphs (Markov random fields), probabilistic decision trees/models have a number of
    31 KB (4,787 words) - 18:21, 22 October 2010
  • *[[2010_Fall_ECE_600_Comer|ECE 600]]: "Random Variables and Stochastic Processes"
    3 KB (380 words) - 18:29, 9 January 2015
  • = [[ECE]] 600: Random Variables and Stochastic Processes = :*[[ECE 600 Sequences of Random Variables|2. Sequences of Random Variables]]
    2 KB (238 words) - 12:14, 25 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (273 words) - 17:40, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    1 KB (191 words) - 17:42, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (928 words) - 17:46, 13 March 2015
  • =Addition of two independent Poisson random variables = ...athbf{X}</math> and <math>\mathbf{Y}</math> are independent Poisson random variables with means <math>\lambda</math> and <math>\mu</math>, respectively.
    3 KB (557 words) - 12:11, 25 September 2013
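    For reference, the standard convolution argument (a sketch, not quoted from the page) gives, for <math>\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math>,
    <math>P(\mathbf{Z}=n)=\sum_{k=0}^{n}\frac{e^{-\lambda}\lambda^{k}}{k!}\cdot\frac{e^{-\mu}\mu^{n-k}}{(n-k)!}=\frac{e^{-(\lambda+\mu)}}{n!}\sum_{k=0}^{n}\binom{n}{k}\lambda^{k}\mu^{n-k}=\frac{e^{-(\lambda+\mu)}(\lambda+\mu)^{n}}{n!},</math>
    so the sum is again Poisson, with mean <math>\lambda+\mu</math>.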
  • == Example. Two jointly distributed random variables == Two jointly distributed random variables <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> have joint pdf
    7 KB (1,103 words) - 05:27, 15 November 2010
  • == Example. Addition of two independent Gaussian random variables == ...is the pdf you determined in part (b)? What is the mean and variance of a random variable with this pdf?
    6 KB (939 words) - 04:20, 15 November 2010
  • == Example. Addition of multiple independent Exponential random variables == ...h parameter <math>\lambda</math> and <math>\mathbf{N}</math> is Geometric random variable with parameter <math>p</math>. Find the distribution of <math>\mat
    2 KB (268 words) - 04:18, 15 November 2010
  • [[Category:random variables]] *[[ECE 600 Prerequisites Discrete Random Variables|Discrete Random Variables]]
    1 KB (139 words) - 13:13, 16 November 2010
  • =='''1.4 Discrete Random Variables'''== ...}_{2},\cdots</math> are i.i.d. Bernoulli random variables, then the Binomial random variable is defined as <math class="inline">\mathbf{X}=\mathbf{Y}_{1}+\math
    5 KB (921 words) - 11:25, 30 November 2010
  • ...s="inline">\mathbf{Y}_{1},\mathbf{Y}_{2},\cdots</math> are i.i.d. random variables. <math class="inline">\mathbf{N}\left(t\right)</math> is Poisson process <
    5 KB (920 words) - 11:26, 30 November 2010
  • ='''1.6 Continuous Random Variables'''= ...tribution, then <math class="inline">\mathbf{Y}=\ln\mathbf{X}</math> is a random variable with Gaussian distribution. This distribution is characterized wit
    5 KB (843 words) - 11:27, 30 November 2010
  • ='''1.10 Two Random Variables'''= ...bf{Y}</math> be two jointly-distributed, statistically independent random variables, having pdfs <math class="inline">f_{\mathbf{X}}\left(x\right)</math> and
    6 KB (952 words) - 11:31, 30 November 2010
  • [[Category:random variables]] =Sequences of Random Variables=
    1 KB (194 words) - 11:35, 30 November 2010
  • Given a random sequence <math class="inline">\mathbf{X}_{1}\left(\omega\right),\mathbf{X}_ We say a sequence of random variables converges everywhere (e) if the sequence <math class="inline">\mathbf{X}_{1
    10 KB (1,667 words) - 11:37, 30 November 2010
  • Let <math class="inline">\mathbf{X}</math> be a random variable with mean <math class="inline">\mu</math> and variance <math clas [[ECE 600 Sequences of Random Variables|Back to Sequences of Random Variables]]
    3 KB (435 words) - 11:38, 30 November 2010
  • ...ght\}</math> be a sequence of <math class="inline">i.i.d.</math> random variables with mean <math class="inline">\mu</math> and variance <math class="inline [[ECE 600 Sequences of Random Variables|Back to Sequences of Random Variables]]
    2 KB (303 words) - 11:39, 30 November 2010
  • ...bf{X}_{n}\right\}</math> be a sequence of identically distributed random variables with mean <math class="inline">\mu</math> and variance <math class="inline [[ECE 600 Sequences of Random Variables|Back to Sequences of Random Variables]]
    795 B (126 words) - 11:41, 30 November 2010
  • [[Category:random variables]] ...[ECE_600_Sequences_of_Random_Variables|course notes on "sequence of random variables"]] of [[user:han84|Sangchun Han]], [[ECE]] PhD student.
    4 KB (657 words) - 11:42, 30 November 2010
  • =2.6 Random Sum= Example. Addition of multiple independent Exponential random variables
    2 KB (310 words) - 11:44, 30 November 2010
  • [[Category:random variables]]
    525 B (66 words) - 13:11, 22 November 2010
  • ...ead of mapping each <math class="inline">\omega\in\mathcal{S}</math> of a random experiment to a number <math class="inline">\mathbf{X}\left(\omega\right)</ ...andom about the sample functions. The randomness comes from the underlying random experiment.
    16 KB (2,732 words) - 11:47, 30 November 2010
  • ...cdots</math> be a sequence of independent, identically distributed random variables, each having pdf ...ht)}\left(x\right).</math> Let <math class="inline">Y_{n}</math> be a new random variable defined by
    10 KB (1,713 words) - 07:17, 1 December 2010
  • ...class="inline">\mathbf{X}\left(t,\omega\right)</math> , then we have a new random process <math class="inline">\mathbf{Y}\left(t\right)</math> : <math class= We will assume that <math class="inline">T</math> is deterministic (NOT random). Think of <math class="inline">\mathbf{X}\left(t\right)=\text{input to a s
    11 KB (1,964 words) - 11:52, 30 November 2010
  • [[Category:random variables]] We place at random n points in the interval <math class="inline">\left(0,1\right)</math> and
    5 KB (859 words) - 11:55, 30 November 2010
  • ...endent Poisson random variables|Addition of two independent Poisson random variables]] ...dent Gaussian random variables|Addition of two independent Gaussian random variables]]
    1 KB (188 words) - 11:57, 30 November 2010
  • ...1 dime. One of the boxes is selected at random, and a coin is selected at random from that box. The coin selected is a quarter. What is the probability that – A = Box selected at random contains at least one dime.
    22 KB (3,780 words) - 07:18, 1 December 2010
  • ...th> be a sequence of random variables that converge in mean square to the random variable <math class="inline">\mathbf{X}</math> . Does the sequence also co ...> A sequence of random variables that converges in the mean square sense to the random variable <math class="inline">\mathbf{X}</math> also converges in probabi
    6 KB (1,093 words) - 08:23, 27 June 2012
  • Consider the following random experiment: A fair coin is repeatedly tossed until the same outcome (H or T ...math> , respectively. Let <math class="inline">\mathbf{Z}</math> be a new random variable defined as <math class="inline">\mathbf{Z}=\mathbf{X}+\mathbf{Y}.<
    10 KB (1,827 words) - 08:33, 27 June 2012
  • ...irst coin is fair and the second coin has two heads. One coin is picked at random and tossed two times. It shows heads both times. What is the probability th ...mathbf{Y}_{t}</math> be jointly wide sense stationary continuous parameter random processes with <math class="inline">E\left[\left|\mathbf{X}\left(0\right)-\
    9 KB (1,534 words) - 08:33, 27 June 2012
  • ...ft(x\right)=P\left(\left\{ \mathbf{X}\leq x\right\} \right)</math> of the random variable <math class="inline">\mathbf{X}</math> . Make sure and specify you ...inline">\mathbf{Y}</math> is <math class="inline">r</math> . Define a new random variable <math class="inline">\mathbf{Z}</math> by <math class="inline">\m
    10 KB (1,652 words) - 08:32, 27 June 2012
  • ...inline">\mathbf{Y}</math> be jointly Gaussian (normal) distributed random variables with mean <math class="inline">0</math> , <math class="inline">E\left[\math ...}</math> . Note: <math class="inline">\mathbf{V}</math> is not a Gaussian random variable.
    6 KB (916 words) - 08:26, 27 June 2012
  • State the definition of a random variable; use notation from your answer in part (a). A random variable <math class="inline">\mathbf{X}</math> is a process of assigning
    10 KB (1,608 words) - 08:31, 27 June 2012
  • ...f{Y}</math> be two independent identically distributed exponential random variables having mean <math class="inline">\mu</math> . Let <math class="inline">\mat ...that it deals with the exponential random variable rather than the Poisson random variable.
    14 KB (2,358 words) - 08:31, 27 June 2012
  • Assume that <math class="inline">\mathbf{X}</math> is a binomial distributed random variable with probability mass function (pmf) given by <math class="inline" ...athbf{X}_{n},\cdots</math> be a sequence of binomially distributed random variables, with <math class="inline">\mathbf{X}_{n}</math> having probability mass f
    10 KB (1,754 words) - 08:30, 27 June 2012
  • ...d <math class="inline">\mathbf{Y}</math> be two joinly distributed random variables having joint pdf Let <math class="inline">\mathbf{Z}</math> be a new random variable defined as <math class="inline">\mathbf{Z}=\mathbf{X}+\mathbf{Y}</
    9 KB (1,560 words) - 08:30, 27 June 2012
  • ...ependent, identically distributed zero-mean, unit-variance Gaussian random variables. The sequence <math class="inline">\mathbf{X}_{n}</math> , <math class="inl ...ath class="inline">\mathbf{X}_{n}</math> is a sequence of Gaussian random variables with zero mean and variance <math class="inline">\sigma_{\mathbf{X}_{n}}^{2
    14 KB (2,439 words) - 08:29, 27 June 2012
  • ...line">\mathbf{Y}</math> be two independent identically distributed random variables taking on values in <math class="inline">\mathbf{N}</math> (the natural nu ...y distributed random variables, with the <math class="inline">n</math> -th random variable <math class="inline">\mathbf{X}_{n}</math> having pmf <math class
    10 KB (1,636 words) - 08:29, 27 June 2012
  • ...athbf{X}_{2},\mathbf{X}_{3},\cdots</math> is a sequence of i.i.d. random variables with finite mean <math class="inline">E\left[\mathbf{X}_{i}\right]=\mu</mat ...{2},\mathbf{X}_{3},\cdots</math> be a sequence of i.i.d Bernoulli random variables with <math class="inline">p=1/2</math> , and let <math class="inline">\math
    12 KB (1,920 words) - 08:28, 27 June 2012
  • =Example. Addition of two independent Poisson random variables= ...and <math class="inline">\mathbf{Y}</math> are independent Poisson random variables with means <math class="inline">\lambda</math> and <math class="inline">\m
    3 KB (532 words) - 11:58, 30 November 2010
  • =Example. Addition of two independent Gaussian random variables= ...is the pdf you determined in part (b)? What is the mean and variance of a random variable with this pdf?
    7 KB (1,015 words) - 11:59, 30 November 2010
  • =Example. Addition of two jointly distributed Gaussian random variables= ...inline">\mathbf{Y}</math> is <math class="inline">r</math> . Define a new random variable <math class="inline">\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math> .
    3 KB (504 words) - 12:00, 30 November 2010
  • =Example. Two jointly distributed random variables= Two jointly distributed random variables <math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}
    2 KB (416 words) - 11:47, 3 December 2010
  • =Example. Two jointly distributed independent random variables= ..."inline">\mathbf{Y}</math> be two jointly distributed, independent random variables. The pdf of <math class="inline">\mathbf{X}</math> is
    5 KB (803 words) - 12:08, 30 November 2010
  • =Example. Two jointly distributed independent random variables= ..."inline">\mathbf{Y}</math> be two jointly distributed, independent random variables. The pdf of <math class="inline">\mathbf{X}</math> is
    5 KB (803 words) - 12:10, 30 November 2010
  • =Example. Geometric random variable= Let <math class="inline">\mathbf{X}</math> be a random variable with probability mass function
    5 KB (793 words) - 12:10, 30 November 2010
  • =Example. Sequence of binomially distributed random variables= ...of binomially distributed random variables, with the <math>n_{th}</math> random variable <math>\mathbf{X}_{n}</math> having pmf
    3 KB (470 words) - 13:02, 23 November 2010
  • =Example. Sequence of binomially distributed random variables= ...distributed random variables, with the <math class="inline">n_{th}</math> random variable <math class="inline">\mathbf{X}_{n}</math> having pmf
    3 KB (539 words) - 12:14, 30 November 2010
  • =Example. Sequence of exponentially distributed random variables= ...X}_{n}</math> be a collection of i.i.d. exponentially distributed random variables, each having mean <math class="inline">\mu</math> . Define
    3 KB (486 words) - 07:13, 1 December 2010
  • =Example. Sequence of uniformly distributed random variables= ...erval <math class="inline">\left[0,1\right]</math> . Define the new random variables <math class="inline">\mathbf{W}=\max\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\
    3 KB (456 words) - 07:14, 1 December 2010
  • =Example. Mean of i.i.d. random variables= ...ath> be <math class="inline">M</math> jointly distributed i.i.d. random variables with mean <math class="inline">\mu</math> and variance <math class="inline
    2 KB (420 words) - 11:25, 16 July 2012
  • =Example. A sum of a random number of i.i.d. Gaussians= ...{ \mathbf{X}_{n}\right\}</math> be a sequence of i.i.d. Gaussian random variables, each having characteristic function
    2 KB (426 words) - 07:15, 1 December 2010
  • ...ot certain what it says in reverse, but I have to believe that whatever the random collection of notes is that would be produced by playing it in reverse woul % Note: function takes 2 variables
    4 KB (570 words) - 12:13, 15 January 2011
  • * [[Methods of generating random variables|Methods of generating random variables, a class project by Zhenming Zhang]] * [[Applications of Poisson Random Variables|Applications of Poisson Random Variables, a class project by Trevor Holloway]]
    1 KB (195 words) - 07:52, 15 May 2013
  • *Discrete Random Variables ...on_ECE302S13Boutin|Normalizing the probability mass function of a discrete random variable]]
    7 KB (960 words) - 18:17, 23 February 2015
  • ...noisy data set. Complex data sets may hide significant relationships between variables, and the aim of PCA is to extract the relevant information shrouded by ...three dimensional world, so we decide to set up three cameras in rather random directions to collect the data on the motion of the spring.
    6 KB (1,043 words) - 12:45, 3 March 2015
  • ...ce of random variables since <math>p_i(\vec{x_0})</math> depends on random variables |sample_space_i|. What do we mean by convergence of a sequence of random variables? (There are many definitions.) We pick the "convergence in mean square" sense, i
    8 KB (1,246 words) - 11:21, 10 June 2013
  • The linear combination of independent Gaussian random variables is also Gaussian. ...</math> are <math>n</math> independent Gaussian random variables, then the random variable <math>Y</math> is also Gaussian, where <math>Y</math> is a linear
    2 KB (453 words) - 14:19, 13 June 2013
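    For reference, the standard form of this fact (not quoted from the page): if <math>X_1,\dots,X_n</math> are independent with <math>X_i \sim N(\mu_i,\sigma_i^2)</math> and <math>Y=\sum_{i=1}^{n}a_iX_i</math>, then <math>Y \sim N\left(\sum_{i=1}^{n}a_i\mu_i,\ \sum_{i=1}^{n}a_i^2\sigma_i^2\right)</math>.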
  • ...h>M_{Xi}(t)</math>, <math>i = 1,2,...,n</math>, and if <math>Y</math> is a random variable resulting from a linear combination of <math>X_i</math>s such that ...s can be written as the product of the expectations of functions of random variables (proof). <br/>
    1 KB (261 words) - 14:17, 13 June 2013
  • ...s value, which is also an address.<br>• Nothing is changed to the actual variables in the main function so the output is still 12 -9<br>• Since none of the ...like stack memory. <br>Stack Memory: first in → last out<br>Heap Memory: random access<br>However, it means that you need to allocate and release. You need
    3 KB (540 words) - 09:24, 13 February 2012
  • | [[Media:Walther_MA375_02February2012.pdf| Bernoulli Trials,Random Variables]]
    3 KB (418 words) - 06:38, 21 March 2013
  • Deterministic (single, non-random) estimate of parameters, theta_ML ...Bayesian formulation, the parameters to be estimated are treated as random variables. The Bayes estimate is the one that minimizes the Bayes risk by minimizing
    6 KB (976 words) - 13:25, 8 March 2012
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (406 words) - 10:19, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (763 words) - 10:57, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (780 words) - 01:25, 9 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (766 words) - 00:16, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (729 words) - 00:51, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (735 words) - 01:17, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (609 words) - 01:54, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (726 words) - 10:35, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (632 words) - 11:05, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (643 words) - 11:16, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (616 words) - 10:19, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (572 words) - 10:24, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (748 words) - 01:01, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (358 words) - 10:33, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (638 words) - 10:34, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (248 words) - 10:34, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (282 words) - 10:34, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (532 words) - 10:36, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (655 words) - 10:36, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (357 words) - 10:36, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (234 words) - 10:37, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (496 words) - 10:37, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (547 words) - 16:40, 30 March 2015
  • ...(25 pts) Let X, Y, and Z be three jointly distributed random variables with joint pdf f_{XYZ}(x,y,z) = \frac{3z^{2}}{7\sqrt{2\pi}...
    5 KB (711 words) - 09:05, 27 July 2012
  • [[Category:random variables]] Question 1: Probability and Random Processes
    8 KB (1,247 words) - 10:29, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    6 KB (932 words) - 10:30, 13 September 2013
  • Probability, Statistics, and Random Processes for Electrical Engineering, 3rd Edition, by Alberto Leon-Garcia, *Discrete Random Variables
    10 KB (1,422 words) - 20:14, 30 April 2013
  • ==Part 2: Discrete Random Variables (To be tested in the second intra-semestrial exam)== *2.2 Functions of a discrete random variable
    4 KB (498 words) - 10:18, 17 April 2013
  • ...e having trouble using the image() command in Matlab for displaying their 'random image' in part 2? Mine comes up as just a black square every time. I set th ...ifferent numbers of arguments. I'd like to not have to create 25 different variables for each window. Can anyone help me with this?
    5 KB (957 words) - 08:11, 9 April 2013
  • ...The formula for obtaining the probability mass function of a function of a random variable was given, and we illustrated it with two simple examples. We fini
    2 KB (307 words) - 10:26, 4 February 2013
  • In Lecture 10, we defined the concept of a discrete random variable and gave several examples. The concept that caused the most confusion seems
    2 KB (289 words) - 11:08, 30 January 2013
  • ...the random variable does not change the variance, and that multiplying the random variable by a constant "a" has the effect of multiplying the variance by <m
    2 KB (336 words) - 12:59, 18 February 2013
  • ...n Part III of the material with a definition of the concept of "continuous random variable" along with two examples.
    2 KB (321 words) - 11:12, 15 February 2013
  • ...tative robot spiraling 'inward' or 'outward'. Normally distributed random variables are used to modify the magnitude (M) of the complex vector and rotate the v % generate random initial state with complex magnitude 1
    2 KB (289 words) - 15:14, 1 May 2016
  • ...t it is spades, and both probabilities sum up to 1 (since we only have two variables). We can therefore use the following decision rule; that if ''P(x<sub>1</su ...r), we can describe this as a variable ''y'' and we consider ''y'' to be a random variable whose distribution depends on the state of the card and is express
    5 KB (844 words) - 23:32, 28 February 2013
  • ...looked at an example of continuous random variable, namely the exponential random variable.
    2 KB (329 words) - 08:16, 20 February 2013
  • In Lecture 19, we continued our discussion of continuous random variables. ...outin|Invent a problem about the expectation and/or variance of a discrete random variable]]
    2 KB (252 words) - 08:20, 20 February 2013
  • ...discrete) and we began discussing normally distributed (continuous) random variables. ...on_ECE302S13Boutin|Normalizing the probability mass function of a Gaussian random variable]]
    2 KB (304 words) - 07:43, 23 February 2013
  • ...a normally distributed random variable: it was observed that the resulting random variable Y=aX+b is also normally distributed. The relation between the mean
    3 KB (393 words) - 08:21, 27 February 2013
  • ...lso had a little bit of time to start talking about two dimensional random variables.
    3 KB (387 words) - 07:09, 28 February 2013
  • [[Category:independent random variables]] ...e Problem]]: obtaining the joint pdf from the marginals of two independent variables =
    2 KB (394 words) - 12:03, 26 March 2013
  • ...on_ECE302S13Boutin|Normalizing the probability mass function of a Gaussian random variable]] ...13Boutin|Obtaining the joint pdf from the marginal pdfs of two independent variables]]
    2 KB (337 words) - 06:28, 1 March 2013
  • ...of a 2D random variable. In particular, we looked at the covariance of two variables. We finished the lecture by giving the definition of conditional probabili
    2 KB (324 words) - 13:11, 5 March 2013
  • ...ind the pdf of a random variable Y defined as a function Y=g(X) of another random variable X.
    2 KB (328 words) - 04:58, 9 March 2013
  • ...particular, we obtain a formula for the pdf of a sum of independent random variables (namely, the convolution of their respective pdf's).
    2 KB (286 words) - 09:11, 29 March 2013
  • [[Category:independent random variables]] Two continuous random variables X and Y have the following joint probability density function:
    2 KB (290 words) - 10:17, 27 March 2013
  • A discrete random variable X has a moment generating (characteristic) function <math>M_X(s)</math> suc
    1 KB (211 words) - 03:47, 27 March 2013
  • ...vation of the conditional distributions for continuous and discrete random variables, you may wish to go over Professor Mary Comer's [[ECE600_F13_rv_conditional * Alberto Leon-Garcia, ''Probability, Statistics, and Random Processes for Electrical Engineering,'' Third Edition
    4 KB (649 words) - 13:08, 25 November 2013
  • [[Category:normal random variable]] be a two-dimensional Gaussian random variable with mean <math>\mu</math> and standard deviation matrix <math>\Si
    2 KB (273 words) - 03:22, 26 March 2013
  • ...tudent, and let Y be the arrival time of the professor. Assume that the 2D random variable (X,Y) is uniformly distributed in the square [2 , 3]x[2,3]. '''2.''' Let (X,Y) be a 2D random variable that is uniformly distributed in the rectangle [1,3]x[5,10].
    3 KB (559 words) - 07:02, 22 March 2013
  • ...also a quiz where we re-emphasized how easy it is to compute the mean of a random variable with a symmetric pmf/pdf. (The trick is to guess the answer m, and *Read Sections 2.1.1-2.1.6 of Prof. Pollak's notes on random variables [https://engineering.purdue.edu/~ipollak/ee438/FALL04/notes/Section2.1.pdf
    2 KB (330 words) - 06:16, 9 April 2013
  • [[Category:random process]] ...ariable with the same distribution as the random variable contained in the random process at the time found by differencing the two distinct times mentioned
    9 KB (1,507 words) - 16:23, 23 April 2013
  • ...short introduction to the topic, we covered the definition of a stationary random process.
    3 KB (376 words) - 10:23, 17 April 2013
  • '''Methods of Generating Random Variables''' == 1. Generating uniformly distributed random numbers between 0 and 1: U(0,1) ==
    3 KB (409 words) - 10:05, 17 April 2013
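    A minimal MATLAB sketch of one such method, inverse-transform sampling, starting from U(0,1) draws (the exponential target and rate below are illustrative assumptions): if U ~ U(0,1), then X = -log(1-U)/lambda is exponential with rate lambda.
        lambda = 2;                              % example rate parameter
        U = rand(1e5, 1);                        % uniformly distributed on (0,1)
        X = -log(1 - U) / lambda;                % apply the inverse CDF of the exponential
        histogram(X, 'Normalization', 'pdf');    % empirical density
        hold on;
        x = 0:0.01:4;
        plot(x, lambda*exp(-lambda*x), 'r');     % target exponential pdf
        hold off;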
  • '''Applications of Poisson Random Variables''' == Poisson Random Variables==
    5 KB (708 words) - 07:22, 22 April 2013
  • ...e different statistical numbers describing relations of datasets or random variables. So, I decided to crack down on some research and bring the important ideas '''Covariance:''' This is a measure of two random variables' association with each other.
    7 KB (1,146 words) - 06:19, 5 May 2013
  • where <math>X_1</math> and <math>X_2</math> are iid scalar random variables. ...two Gaussian variables, then <math>X</math> is also a Gaussian distributed random variable [[Linear_combinations_of_independent_gaussian_RVs|(proof)]] charac
    6 KB (1,084 words) - 13:20, 13 June 2013
  • ...nations_of_independent_gaussian_RVs|Linear Combinations of Gaussian Random Variables]] ..._RVs_mhossain|Moment Generating Functions of Linear Combinations of Random Variables]]
    2 KB (227 words) - 11:15, 6 October 2013
  • ...is the expectation function, <math>X</math> and <math>Y</math> are random variables with distribution functions <math>f_X(x)</math> and <math>f_Y(y)</math> res where <math>X_i</math>'s are random variables and <math>a_i</math>'s are real constants ∀<math>i=1,2,...,N</math>.
    3 KB (585 words) - 14:15, 13 June 2013
  • Let <math>X</math> and <math>Y</math> be two random variables with variances <math>Var(X)</math> and <math>Var(Y)</math> respectively and By definition, we have that the variance of random variable <math>Z</math> is given by <br/>
    2 KB (333 words) - 14:17, 13 June 2013
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] ...ECE 600. Class Lecture. [https://engineering.purdue.edu/~comerm/600 Random Variables and Signals]. Faculty of Electrical Engineering, Purdue University. Fall 20
    10 KB (1,697 words) - 12:10, 21 May 2014
  • '''The Comer Lectures on Random Variables and Signals''' *[[ECE600_F13_rv_definition_mhossain|Random Variables: Definition]]
    2 KB (227 words) - 12:10, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] ...finition''' <math>\qquad</math> An '''outcome''' is a possible result of a random experiment.
    20 KB (3,448 words) - 12:11, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] ...es' Theorem. We will see other equivalent expressions when we cover random variables.
    6 KB (1,023 words) - 12:11, 21 May 2014
  • [[ECE600_F13_rv_distribution_mhossain|Next Topic: Random Variables: Distributions]] [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    7 KB (1,194 words) - 12:11, 21 May 2014
  • [[ECE600_F13_rv_definition_mhossain|Next Topic: Random Variables: Definition]] [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    9 KB (1,543 words) - 12:11, 21 May 2014
  • [[ECE600_F13_rv_definition_mhossain|Previous Topic: Random Variables: Definitions]]<br/> [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    15 KB (2,637 words) - 12:11, 21 May 2014
  • [[ECE600_F13_rv_distribution_mhossain|Previous Topic: Random Variables: Distributions]]<br/> ...00_F13_rv_Functions_of_random_variable_mhossain|Next Topic: Functions of a Random Variable]]
    6 KB (1,109 words) - 12:11, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 8: Functions of Random Variables</font size>
    9 KB (1,723 words) - 12:11, 21 May 2014
  • ...13_rv_Functions_of_random_variable_mhossain|Previous Topic: Functions of a Random Variable]]<br/> [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    8 KB (1,474 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] The pdf f<math>_X</math> of a random variable X is a function of a real valued variable x. It is sometimes usefu
    5 KB (804 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 11: Two Random Variables: Joint Distribution</font size>
    8 KB (1,524 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 12: Independent Random Variables</font size>
    2 KB (449 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 13: Functions of Two Random Variables</font size>
    9 KB (1,568 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] Given random variables X and Y, let Z = g(X,Y) for some g:'''R'''<math>^2</math>→R. Then E[Z] ca
    7 KB (1,307 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 15: Conditional Distributions for Two Random Variables</font size>
    6 KB (1,139 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 16: Conditional Expectation for Two Random Variables</font size>
    4 KB (875 words) - 12:13, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 17: Random Vectors</font size>
    12 KB (1,897 words) - 12:13, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] We will now consider infinite sequences of random variables. We will discuss what it means for such a sequence to converge. This will l
    15 KB (2,578 words) - 12:13, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] ...n discrete-time random processes, but we will now formalize the concept of random process, including both discrete-time and continuous time.
    10 KB (1,690 words) - 12:13, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 20: Linear Systems with Random Inputs</font size>
    8 KB (1,476 words) - 12:13, 21 May 2014
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (449 words) - 21:36, 5 August 2018
  • <font size="4">Question 1: Probability and Random Processes </font> ...cting your proof, keep in mind that may be either a discrete or continuous random variable.
    6 KB (995 words) - 09:21, 15 August 2014
  • <font size="4">Question 1: Probability and Random Processes </font> ...} \dots </math> be a sequence of independent, identical distributed random variables, each uniformly distributed on the interval [0, 1], an hence having pdf <br
    12 KB (1,948 words) - 10:16, 15 August 2014
  • ...A stochastic process { X(t), t∈T } is an ordered collection of random variables, where T is the index set and, if t is a time in the set, X(t) is the proc ...s that use X1,…,Xn as independently identically distributed (iid) random variables. However, note that states do not necessarily have to be independently iden
    19 KB (3,004 words) - 09:39, 23 April 2014
  • ...between the prior probability and the posterior probability of two random variables or events. Given two events <math>A</math> and <math>B</math>, we may want t
    19 KB (3,255 words) - 10:47, 22 January 2015
  • ...ollows a multivariate Gaussian distribution in 2D. The data comes from two random seeds which are of equal probability and are well separated for better illus #Hotelling, H. (1933). Analysis of a complex of statistical variables into principal components. Journal of educational psychology, 24(6), 417.
    22 KB (3,459 words) - 10:40, 22 January 2015
  • ..._S14_MH|Whitening and Coloring Transforms for Multivariate Gaussian Random Variables]] ...'R''' where '''X''' ∈ '''R'''<math>^d</math> is a d-dimensional Gaussian random vector with mean '''μ''' and covariance matrix '''Σ'''. This slecture ass
    17 KB (2,603 words) - 10:38, 22 January 2015
  • The generic situation is that we observe a n-dimensional random vector X with probability<br>density (or mass) function <span class="texhtm <br> Treating <span class="texhtml">''X''<sub>''j''</sub></span> as random variables in the above equations, we have
    25 KB (4,187 words) - 10:49, 22 January 2015
  • ...nd way was using a uniform random variable in vector operations. A uniform random vector of the same size as the whole dataset is first generated, whose each ...asy to understand even for people who do not have deep knowledge in random variables & probabilities. The demonstrations (figures & codes) in MATLAB were good i
    3 KB (508 words) - 16:12, 14 May 2014
  • ...or a given value of θ is denoted by p(x|θ ). It should be noted that the random variable X and the parameter θ can be vector-valued. Now we obtain a set o ...s parameter estimation, the parameter θ is viewed as a random variable or random vector following the distribution p(θ ). Then the probability density func
    15 KB (2,273 words) - 10:51, 22 January 2015
  • ...random variables and probability mass function in case of discrete random variables and 'θ' is the parameter being estimated. ...es) or the probability of the probability mass (in case of discrete random variables)'''
    12 KB (1,986 words) - 10:49, 22 January 2015
  • ...tor, <math> \theta \in \Omega</math>. So for example, after we observe the random vector <math>Y \in \mathbb{R}^{n}</math>, then our objective is to use <mat ...andom vector <math>Y</math>, the estimate, <math>\hat{\theta}</math>, is a random vector. The mean of the estimator, <math>\bar{\theta}</math>, can be comput
    14 KB (2,356 words) - 20:48, 30 April 2014
  • ...by flipping coins (heads or tails). As the size of the sample increases, a random classifier's ROC point migrates towards (0.5,0.5). ...ring the area under the curve, we can tell the accuracy of a classifier. A random guess is just the line y = x with an area of 0.5. A perfect model will have
    11 KB (1,823 words) - 10:48, 22 January 2015
  • \section{Title: Generation of normally distributed random numbers under a binary prior probability} ...a_1)]$, label the sample as class 1, then continue by generating a normal random number based on the class 1 statistics $(\mu, \sigma)$.
    16 KB (2,400 words) - 23:34, 29 April 2014
  • For discrete random variables, Bayes rule formula is given by, For continuous random variables, Bayes rule formula is given by,
    7 KB (1,106 words) - 10:42, 22 January 2015
  • <font size="4">Generation of normally distributed random numbers from two categories with different priors </font> ...2), 1]</math> and should be labeled as class 2, then, move onto the normal random number generation step with the class 2 statistics like the same way as we
    18 KB (2,852 words) - 10:40, 22 January 2015
  • ...tor, <math> \theta \in \Omega</math>. So for example, after we observe the random vector <math>Y \in \mathbb{R}^{n}</math>, then our objective is to use <mat ...andom vector <math>Y</math>, the estimate, <math>\hat{\theta}</math>, is a random vector. The mean of the estimator, <math>\bar{\theta}</math>, can be comput
    19 KB (3,418 words) - 10:50, 22 January 2015
  • If the data-points from each class are random variables, then it can be proven that the optimal decision rule to classify a point < Therefore, it is desirable to assume that the data-points are random variables, and attempt to estimate <math> P(w_i|x_0) </math>, in order to use it to c
    9 KB (1,604 words) - 10:54, 22 January 2015
  • The principle of how to generate a Gaussian random variable ...od for pseudo-random number sampling first. Then, we will explain the Gaussian random sample generation method based on the Box-Muller transform. Finally, we will in
    8 KB (1,189 words) - 10:39, 22 January 2015
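    A minimal MATLAB sketch of the Box-Muller transform mentioned above (sample size and names are illustrative): two independent U(0,1) samples are mapped to two independent standard Gaussian samples.
        N  = 1e5;
        U1 = rand(N,1);
        U2 = rand(N,1);
        R     = sqrt(-2*log(U1));                % Rayleigh-distributed radius
        Theta = 2*pi*U2;                         % uniform angle on [0, 2*pi)
        Z1 = R .* cos(Theta);                    % standard Gaussian
        Z2 = R .* sin(Theta);                    % independent standard Gaussian
        histogram(Z1, 'Normalization', 'pdf');   % should match N(0,1)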
  • ...,X_N be the independent and identically distributed (iid) Poisson random variables. Then, we will have a joint frequency function that is the product of margi ..._N be the independent and identically distributed (iid) exponential random variables. As P(X=x)=0 when x<0, no samples can sit in the x<0 region. Thus, for al
    13 KB (1,966 words) - 10:50, 22 January 2015
  • The K-means algorithm also introduces a set of binary variables to represent assignment of data points to specific clusters: <br /> This set of binary variables is interpreted as follows: if data point <math>n</math> is assigned to clus
    8 KB (1,350 words) - 10:57, 22 January 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (470 words) - 07:47, 4 November 2014
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (566 words) - 17:28, 23 February 2017
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (927 words) - 16:43, 24 February 2017
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (534 words) - 21:02, 5 August 2018
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (548 words) - 07:33, 20 November 2014
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (700 words) - 17:48, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    1 KB (223 words) - 17:35, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (643 words) - 17:36, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (331 words) - 17:37, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (302 words) - 17:38, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (284 words) - 17:39, 13 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (374 words) - 00:19, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (525 words) - 00:20, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (528 words) - 00:21, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (384 words) - 00:22, 10 March 2015
  • State the definition of a random variable; use notation from your answer in part (a). [[Category:random variables]]
    2 KB (287 words) - 00:52, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    6 KB (895 words) - 00:41, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (247 words) - 00:53, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (328 words) - 00:54, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (538 words) - 00:54, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (414 words) - 00:55, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (740 words) - 01:03, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    1 KB (187 words) - 01:03, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    874 B (125 words) - 01:04, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (318 words) - 01:05, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (698 words) - 01:35, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (366 words) - 01:36, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (452 words) - 01:37, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    6 KB (1,002 words) - 01:38, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (758 words) - 01:56, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (429 words) - 01:57, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (679 words) - 01:58, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (571 words) - 10:24, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (454 words) - 10:25, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (646 words) - 10:26, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (692 words) - 10:36, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (490 words) - 10:36, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (939 words) - 10:37, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (422 words) - 10:38, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (403 words) - 10:48, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (258 words) - 10:58, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (451 words) - 10:56, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (451 words) - 10:58, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (699 words) - 11:08, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (906 words) - 11:09, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (379 words) - 14:42, 10 August 2018
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (882 words) - 01:54, 31 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    8 KB (1,336 words) - 01:53, 31 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    2 KB (351 words) - 00:17, 4 December 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (851 words) - 23:04, 31 January 2016
  • For this problem, it is very useful to note that for any independent random variables <math>X</math> and <math>Y</math> and their characteristic functions <math> We then note that the characteristic function of an exponential random variable <math>Z</math> is written as
    2 KB (243 words) - 22:00, 7 March 2016
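    For reference, the standard facts being used (not quoted from the page): for independent <math>X</math> and <math>Y</math>, <math>\Phi_{X+Y}(\omega)=\Phi_X(\omega)\Phi_Y(\omega)</math>, and for an exponential <math>Z</math> with rate <math>\lambda</math>, <math>\Phi_Z(\omega)=E[e^{i\omega Z}]=\frac{\lambda}{\lambda-i\omega}</math>.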

