• [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (780 words) - 01:25, 9 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (766 words) - 00:16, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (729 words) - 00:51, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (735 words) - 01:17, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (609 words) - 01:54, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (726 words) - 10:35, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (616 words) - 10:19, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (572 words) - 10:24, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    5 KB (748 words) - 01:01, 10 March 2015
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (655 words) - 10:36, 13 September 2013
  • [[Category:random variables]] Question 1: Probability and Random Processes
    4 KB (547 words) - 16:40, 30 March 2015
  • Probability, Statistics, and Random Processes for Electrical Engineering, 3rd Edition, by Alberto Leon-Garcia, *Discrete Random Variables
    10 KB (1,422 words) - 20:14, 30 April 2013
  • ...n Part III of the material with a definition of the concept of "continuous random variable" along with two examples.
    2 KB (321 words) - 11:12, 15 February 2013
  • ...discrete) and we began discussing normally distributed (continuous) random variables. ...on_ECE302S13Boutin|Normalizing the probability mass function of a Gaussian random variable]]
    2 KB (304 words) - 07:43, 23 February 2013
  • [[Category:independent random variables]] ...e Problem]]: obtaining the joint pdf from the marginals of two independent variables =
    2 KB (394 words) - 12:03, 26 March 2013
  • ...on_ECE302S13Boutin|Normalizing the probability mass function of a Gaussian random variable]] ...13Boutin|Obtaining the joint pdf from the marginal pdfs of two independent variables]]
    2 KB (337 words) - 06:28, 1 March 2013
  • ...ind the pdf of a random variable Y defined as a function Y=g(X) of another random variable X. **[[Practice_Question_independence_ECE302S13Boutin|Determine if X and Y independent from their joint density]]
    2 KB (328 words) - 04:58, 9 March 2013
  • ...particular, we obtain a formula for the pdf of a sum of independent random variables (namely, the convolution of their respective pdf's).
    2 KB (286 words) - 09:11, 29 March 2013
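The convolution result this entry names is the standard one; written out explicitly (notation assumed here, not quoted from the page), the pdf of a sum of two independent random variables is:

```latex
% pdf of Z = X + Y when X and Y are independent,
% with marginal pdfs f_X and f_Y:
f_Z(z) = (f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx
```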
  • [[Category:independent random variables]] = [[:Category:Problem solving|Practice Problem]]: Determine if X and Y are independent =
    2 KB (290 words) - 10:17, 27 March 2013
  • [[Category:normal random variable]] be a two-dimensional Gaussian random variable with mean <math>\mu</math> and standard deviation matrix <math>\Si
    2 KB (273 words) - 03:22, 26 March 2013
  • ...tudent, and let Y be the arrival time of the professor. Assume that the 2D random variable (X,Y) is uniformly distributed in the square [2 , 3]x[2,3]. '''2.''' Let (X,Y) be a 2D random variable that is uniformly distributed in the rectangle [1,3]x[5,10].
    3 KB (559 words) - 07:02, 22 March 2013
  • [[Category:random process]] ...ariable with the same distribution as the random variable contained in the random process at the time found by differencing the two distinct times mentioned
    9 KB (1,507 words) - 16:23, 23 April 2013
  • '''Methods of Generating Random Variables''' == 1. Generating uniformly distributed random numbers between 0 and 1: U(0,1) ==
    3 KB (409 words) - 10:05, 17 April 2013
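The entry above lists generating U(0,1) numbers as its first method but the snippet does not show which generator the page uses; a common textbook choice is a linear congruential generator. A minimal sketch (the constants are the well-known Numerical Recipes values, assumed here, not taken from the page):

```python
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator: x_{k+1} = (a*x_k + c) mod m.
    Returns n pseudo-random values scaled into [0, 1)."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)
    return out
```

These U(0,1) draws are the raw material for the other generation methods in the listing (e.g. the Box-Muller transform for Gaussian samples).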
  • ...e different statistical numbers describing relations of datasets or random variables. So, I decided to crack down on some research and bring the important ideas '''Covariance:''' This is a measure of two random variables' association with each other.
    7 KB (1,146 words) - 06:19, 5 May 2013
  • where <math>X_1</math> and <math>X_2</math> are iid scalar random variables. ...two Gaussian variables, then <math>X</math> is also a Gaussian distributed random variable [[Linear_combinations_of_independent_gaussian_RVs|(proof)]] charac
    6 KB (1,084 words) - 13:20, 13 June 2013
  • ...is the expectation function, <math>X</math> and <math>Y</math> are random variables with distribution functions <math>f_X(x)</math> and <math>f_Y(y)</math> res where <math>X_i</math>'s are random variables and <math>a_i</math>'s are real constants ∀<math>i=1,2,...,N</math>.
    3 KB (585 words) - 14:15, 13 June 2013
  • '''The Comer Lectures on Random Variables and Signals''' *[[ECE600_F13_rv_definition_mhossain|Random Variables: Definition]]
    2 KB (227 words) - 12:10, 21 May 2014
  • [[ECE600_F13_rv_definition_mhossain|Next Topic: Random Variables: Definition]] [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    9 KB (1,543 words) - 12:11, 21 May 2014
  • [[ECE600_F13_rv_definition_mhossain|Previous Topic: Random Variables: Definitions]]<br/> [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
    15 KB (2,637 words) - 12:11, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 12: Independent Random Variables</font size>
    2 KB (449 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 13: Functions of Two Random Variables</font size>
    9 KB (1,568 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] Given random variables X and Y, let Z = g(X,Y) for some g:'''R'''<math>^2</math>→'''R'''. Then E[Z] ca
    7 KB (1,307 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 15: Conditional Distributions for Two Random Variables</font size>
    6 KB (1,139 words) - 12:12, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] <font size= 3> Topic 17: Random Vectors</font size>
    12 KB (1,897 words) - 12:13, 21 May 2014
  • [[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']] We will now consider infinite sequences of random variables. We will discuss what it means for such a sequence to converge. This will l
    15 KB (2,578 words) - 12:13, 21 May 2014
  • [[Category:random variables]] Question 1: Probability and Random Processes
    3 KB (449 words) - 21:36, 5 August 2018
  • <font size="4">Question 1: Probability and Random Processes </font> ...} \dots </math> be a sequence of independent, identically distributed random variables, each uniformly distributed on the interval [0, 1], and hence having pdf <br
    12 KB (1,948 words) - 10:16, 15 August 2014
  • ...A stochastic process { X(t), t∈T } is an ordered collection of random variables, where T is the index set and if t is a time in the set, X(t) is the proc ...s that use X1,…,Xn as independent and identically distributed (iid) random variables. However, note that states do not necessarily have to be independently iden
    19 KB (3,004 words) - 09:39, 23 April 2014
  • ...between the prior probability and the posterior probability of two random variables or events. Given two events <math>A</math> and <math>B</math>, we may want t where <math>C_i</math> is a constant value which is independent of <math>x</math>. Finally, we can use the following discriminant function
    19 KB (3,255 words) - 10:47, 22 January 2015
  • ...ollows a multivariate Gaussian distribution in 2D. The data comes from two random seed which are of equal probability and are well separated for better illus ...subcomponents are non-Gaussian signals and that they are all statistically independent from each other.
    22 KB (3,459 words) - 10:40, 22 January 2015
  • ..._S14_MH|Whitening and Coloring Transforms for Multivariate Gaussian Random Variables]] ...y data processing techniques such as [[PCA|principal component analysis]], independent component analysis, etc. This slecture discusses how to whiten data that is
    17 KB (2,603 words) - 10:38, 22 January 2015
  • The generic situation is that we observe an n-dimensional random vector X with probability<br>density (or mass) function <span class="texhtm ...ume that samples'''''<math>x_{1}, x_{2} \ldots, x_{N}</math>&nbsp;'''''are independent events. <br>
    25 KB (4,187 words) - 10:49, 22 January 2015
  • ...riable X and the parameter θ can be vector-valued. Now we obtain a set of independent observations or samples S = {x<sub>1</sub>,x<sub>2</sub>,...,x<sub>n</sub>} ...s parameter estimation, the parameter θ is viewed as a random variable or random vector following the distribution p(θ ). Then the probability density func
    15 KB (2,273 words) - 10:51, 22 January 2015
  • ...random variables and probability mass function in case of discrete random variables and 'θ' is the parameter being estimated. ...es) or the probability of the probability mass (in case of discrete random variables)'''
    12 KB (1,986 words) - 10:49, 22 January 2015
  • ...tor, <math> \theta \in \Omega</math>. So for example, after we observe the random vector <math>Y \in \mathbb{R}^{n}</math>, then our objective is to use <mat ...andom vector <math>Y</math>, the estimate, <math>\hat{\theta}</math>, is a random vector. The mean of the estimator, <math>\bar{\theta}</math>, can be comput
    14 KB (2,356 words) - 20:48, 30 April 2014
  • \section{Title: Generation of normally distributed random numbers under a binary prior probability} ...a_1)]$, label the sample as class 1, then, continue to generating a normal random number based on the class 1 statistics $(\mu, \sigma)$.
    16 KB (2,400 words) - 23:34, 29 April 2014
  • <font size="4">Generation of normally distributed random numbers from two categories with different priors </font> ...2), 1]</math> and should be labeled as class 2, then, move onto the normal random number generation step with the class 2 statistics like the same way as we
    18 KB (2,852 words) - 10:40, 22 January 2015
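The two entries above describe the same procedure: draw a U(0,1) number, compare it against the class priors to pick a label, then draw a normal number from that class's statistics. A minimal sketch of that procedure (function and parameter names are illustrative, not from the original pages):

```python
import random

def sample_with_prior(p1, mu1, sigma1, mu2, sigma2):
    """Draw one labeled sample under a binary prior:
    with probability p1 label it class 1 and draw N(mu1, sigma1),
    otherwise label it class 2 and draw N(mu2, sigma2)."""
    if random.random() <= p1:
        return 1, random.gauss(mu1, sigma1)
    return 2, random.gauss(mu2, sigma2)
```

Over many draws the fraction of class-1 labels converges to the prior p1, which is the property both slectures rely on when generating labeled training data.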
  • ...tor, <math> \theta \in \Omega</math>. So for example, after we observe the random vector <math>Y \in \mathbb{R}^{n}</math>, then our objective is to use <mat ...andom vector <math>Y</math>, the estimate, <math>\hat{\theta}</math>, is a random vector. The mean of the estimator, <math>\bar{\theta}</math>, can be comput
    19 KB (3,418 words) - 10:50, 22 January 2015
  • The principle of how to generate a Gaussian random variable ...od for pseudo random number sampling first. Then, we will explain Gaussian random sample generation method based on Box Muller transform. Finally, we will in
    8 KB (1,189 words) - 10:39, 22 January 2015
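The Box-Muller transform named in this entry maps two independent U(0,1) draws to two independent N(0,1) draws. A minimal sketch of the basic (trigonometric) form, assumed from the standard method rather than quoted from the page:

```python
import math
import random

def box_muller():
    """One Box-Muller step: two U(0,1) draws -> two independent N(0,1) draws."""
    u1 = 1.0 - random.random()  # in (0, 1], avoids log(0)
    u2 = random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    return r * math.cos(theta), r * math.sin(theta)
```

A general N(mu, sigma) sample is then mu + sigma * z for a standard draw z.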
  &nbsp; Suppose we have a set of n independent and identically distributed observation samples. Then density function is f &nbsp; As each sample x_i is independent of the others, the likelihood of θ with the data set of samples x_1,x_2,
    13 KB (1,966 words) - 10:50, 22 January 2015

