Generating random variates - General approaches to generating random variates

11 important questions on generating random variates - General approaches to generating random variates

What is the inverse transform method for a discrete random variable X?

1) Generate U ~ U(0,1)
2) Determine the smallest positive integer I such that U <= F(x_I)
3) Return X = x_I
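
As an illustration (not part of the original notes), a minimal Python sketch of these three steps; the values x_i and probabilities p_i below are assumed example numbers:

    import random

    # Assumed example distribution: values x_i with probabilities p_i
    values = [1, 2, 3, 4]
    probs  = [0.1, 0.3, 0.4, 0.2]

    def discrete_inverse_transform():
        u = random.random()              # step 1: U ~ U(0,1)
        cumulative = 0.0
        for x, p in zip(values, probs):  # step 2: smallest I with U <= F(x_I)
            cumulative += p
            if u <= cumulative:
                return x                 # step 3: return X = x_I
        return values[-1]                # guard against floating-point round-off

    samples = [discrete_inverse_transform() for _ in range(5)]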

What are the advantages and disadvantages of the inverse transform approach?

+ Facilitates variance-reduction techniques that rely on inducing correlation between random variates
+ Ease of generating from truncated distributions (see the sketch after this list)
1) Generate U ~ U(0,1)
2) Let V = F(a) + [F(b)-F(a)]U
3) Return X = F^-1(V)
+ Can be useful for generating order statistics (the i-th order statistic of a sample of size n)
1) Generate V ~ Beta(i, n-i+1)
2) Return X = F^-1(V)

- Continuous case: the need to evaluate F^-1(U) numerically when no closed-form expression exists
- For a given distribution, the inverse transform may not be the fastest way to generate the corresponding random variate.
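
To illustrate the truncated-distribution advantage above, a minimal Python sketch for an exponential distribution truncated to [a, b]; the rate and interval are assumed example values, not from the notes:

    import math
    import random

    def truncated_exponential(rate=1.0, a=0.5, b=2.0):
        """Inverse-transform sampling from Exp(rate) truncated to [a, b] (illustrative parameters)."""
        F = lambda x: 1.0 - math.exp(-rate * x)       # cdf of the exponential
        F_inv = lambda v: -math.log(1.0 - v) / rate   # closed-form inverse cdf
        u = random.random()                           # step 1: U ~ U(0,1)
        v = F(a) + (F(b) - F(a)) * u                  # step 2: V = F(a) + [F(b)-F(a)]U
        return F_inv(v)                               # step 3: X = F^-1(V), guaranteed to lie in [a, b]

    samples = [truncated_exponential() for _ in range(5)]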

Which assumptions are needed for using the inverse transform method for a continuous distribution?

1) The cumulative distribution function F is a monotone function
2) F is invertible on its domain

(X is continuous and F is continuous and strictly increasing)
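
Under these assumptions X = F^-1(U) has distribution F. A minimal Python sketch for a Weibull distribution, whose cdf is continuous and strictly increasing and has a closed-form inverse; the shape and scale are assumed example parameters:

    import math
    import random

    def weibull_inverse_transform(shape=2.0, scale=1.0):
        """X = F^-1(U) for the Weibull cdf F(x) = 1 - exp(-(x/scale)**shape)."""
        u = random.random()                                    # U ~ U(0,1)
        return scale * (-math.log(1.0 - u)) ** (1.0 / shape)   # closed-form inverse cdf

    samples = [weibull_inverse_transform() for _ in range(5)]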

Give 4 methods for generating random variates other than the inverse-transform method

1) Composition method
2) Convolution method
3) Methods relying on special properties
4) Acceptance-rejection method

What is the composition method?

Applies when the distribution function F from which we wish to generate can be expressed as a convex combination of other distribution functions F1, F2, ...
Assume that for all x, F(x) = sum over j of pj*Fj(x), where the weights pj are nonnegative and sum to 1
Algorithm
1) Generate a positive random integer J such that P(J=j) = pj
2) Return X with distribution function Fj
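
A minimal Python sketch of the composition algorithm for an assumed two-component mixture of exponentials (the weights and rates are illustrative):

    import math
    import random

    # Assumed example: F(x) = 0.3*F1(x) + 0.7*F2(x), with F1 = Exp(1.0) and F2 = Exp(5.0)
    weights = [0.3, 0.7]
    rates   = [1.0, 5.0]

    def composition_sample():
        # step 1: pick component j with probability p_j
        j = random.choices(range(len(weights)), weights=weights)[0]
        # step 2: return X drawn from F_j (here via the exponential inverse transform)
        return -math.log(1.0 - random.random()) / rates[j]

    samples = [composition_sample() for _ in range(5)]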

What is the convolution method?

Assume that, for fixed m, there are IID random variables Y1, Y2, ..., Ym such that Y1+Y2+...+Ym has the same distribution as X
Algorithm
1) Generate Y1, Y2, ..., Ym IID, each with distribution function G
2) Return X = Y1+Y2+...+Ym
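
A minimal Python sketch of the convolution algorithm for the classic example of an Erlang random variable as a sum of m IID exponentials; m and the rate are assumed example values:

    import math
    import random

    def erlang_convolution(m=3, rate=2.0):
        """Return X = Y1 + ... + Ym with Yi IID Exp(rate); X then has an Erlang(m, rate) distribution."""
        ys = [-math.log(1.0 - random.random()) / rate for _ in range(m)]  # step 1: Y1, ..., Ym IID
        return sum(ys)                                                    # step 2: X = Y1 + ... + Ym

    samples = [erlang_convolution() for _ in range(5)]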

What is the difference between the composition method and the convolution method?

Composition: assume the distribution function of X is a (weighted) sum of other distribution functions
Convolution: assume the random variable X can be represented as a sum of other random variables

What is the acceptance-rejection method?

1) Let the density f(x) have domain [a,b]
2) Let t(x) be a majorizing function: t(x) >= f(x) for a <= x <= b
3) Let c be the integral of t(x) over [a,b]; c >= 1
==> then r(x) = t(x)/c is a density
Algorithm
1) Generate a random variate Y with density r(x)
2) Generate a random number U ~ U(0,1)
3) If U <= f(Y)/t(Y), return X = Y (accept);
otherwise, try again (reject): go back to step 1
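
A minimal Python sketch of acceptance-rejection for an assumed example density f(x) = 6x(1-x) on [0,1], majorized by the constant t(x) = 1.5, so that r(x) is simply the U(0,1) density:

    import random

    def f(x):
        return 6.0 * x * (1.0 - x)     # example density on [0, 1], with maximum f(0.5) = 1.5

    T_MAX = 1.5                        # constant majorizing function t(x) >= f(x) on [0, 1]

    def acceptance_rejection():
        while True:
            y = random.random()        # step 1: Y with density r(x) = t(x)/c, here U(0,1)
            u = random.random()        # step 2: U ~ U(0,1)
            if u <= f(y) / T_MAX:      # step 3: accept if U <= f(Y)/t(Y)
                return y               # accept: return X = Y
            # otherwise reject and try again from step 1

    samples = [acceptance_rejection() for _ in range(5)]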

Which assumptions are made for the acceptance-rejection method?

* Requires an upper-bound (majorizing) function on the density
* Uses two or more random numbers per random variate
* Does not use the cumulative distribution function or its inverse

What do the inverse-transform method, the composition method and the convolution method have in common?

They are all direct methods, because they directly use the distribution function or the random variables involved

What are the selection criteria for a method for generating random variates?

* Exactness: within the limitations of computer accuracy and RNG accuracy
* Efficiency: matters when generating a large number of samples
* Complexity: of understanding and implementation
* Robustness: with respect to parameter ranges
* One-to-one correspondence with random numbers: needed for variance-reduction techniques
