Confidence Hypercube in half finished state
s-baumann committed Mar 30, 2019
1 parent 4c80f50 commit 8d36abc
Showing 2 changed files with 25 additions and 0 deletions.

docs/src/4___ConfidenceHypercube.md
# 4 Applications
## 4.5 Finding a confidence hypercube for a multivariate normal distribution.

We can easily find a confidence interval that includes x\% of a univariate normal distribution. It is more difficult, however, to come up with a confidence interval (or confidence area) for a multivariate Gaussian distribution. The first reason is that there is some ambiguity in what such a confidence area should look like. Considering that some dimensions of the distribution may be correlated, it may be natural to look for an elliptically shaped area that gives the smallest possible area covering x\% of the probability mass of a multivariate normal distribution (see Korpela et al. 2017).
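For instance, the univariate case can be handled directly with quantiles (a minimal sketch; the variable names `upper` and `lower` are illustrative):

```
using Distributions

dist = Normal()   # standard univariate normal
x = 0.95          # desired coverage
# A symmetric interval covering x of the mass runs between
# the (1-x)/2 and (1+x)/2 quantiles.
upper = quantile(dist, (1 + x) / 2)
lower = quantile(dist, (1 - x) / 2)
```

Here `upper` and `lower` come out to roughly ±1.96, the familiar 95\% bounds for the standard normal.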

Parameterising an elliptical area may be difficult, however, and it may be more natural to define a hypercube along some basis of the multivariate normal distribution. A natural algorithm for this is: guess cutoff points marking the edges of the hypercube; integrate the pdf of the normal distribution over this hypercube; if the integral deviates from the desired mass, come up with a new guess with different cutoff points; iterate until the hypercube contains the desired x\% of the mass of the distribution.

```
# Generating an example distribution.
using Distributions
using FixedPointAcceleration
using HCubature
using LinearAlgebra
using Random
# Without loss of generality we use means of zero in every dimension.
# For the covariance we use a random matrix sampled from the Wishart distribution.
Random.seed!(1234)
dims = 3
covar = Symmetric(rand(Wishart(dims, Matrix{Float64}(I, dims, dims))))
dist = MvNormal(zeros(dims), covar)
```
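The iteration described above can be sketched as a self-contained example. Here `hypercube_mass` and `find_halfwidth` are hypothetical helper names (not part of FixedPointAcceleration), and for simplicity the search is over a single half-width `c` of a symmetric cube, solved by plain bisection rather than an accelerated fixed-point iteration:

```
using Distributions
using HCubature
using LinearAlgebra

# Probability mass of dist inside the symmetric hypercube [-c, c]^d,
# computed by numerically integrating the pdf with HCubature.
function hypercube_mass(dist, c)
    d = length(dist)
    mass, err = hcubature(x -> pdf(dist, x), fill(-c, d), fill(c, d))
    return mass
end

# Bisect on the half-width c until the cube holds the target mass.
function find_halfwidth(dist, target; lo = 0.0, hi = 10.0, tol = 1e-3)
    while hi - lo > tol
        mid = (lo + hi) / 2
        if hypercube_mass(dist, mid) < target
            lo = mid
        else
            hi = mid
        end
    end
    return (lo + hi) / 2
end

dist = MvNormal(zeros(2), Matrix{Float64}(I, 2, 2))
c = find_halfwidth(dist, 0.95)
```

For this independent bivariate standard normal the resulting half-width is about 2.24, slightly wider than the univariate 1.96 because both coordinates must fall inside the cube simultaneously.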
docs/src/99_refs.md
Cabay, S., and L.W. Jackson. 1976. "A Polynomial Extrapolation Method for Finding Limits and Antilimits of Vector Sequences." SIAM Journal on Numerical Analysis 13 (5): 734–752.

Fang, Haw-ren, and Yousef Saad. 2009. ["Two Classes of Multisecant Methods for Nonlinear Acceleration."](https://onlinelibrary.wiley.com/doi/abs/10.1002/nla.617) Numerical Linear Algebra with Applications 16 (3): 197–221.

Korpela, Jussi, Emilia Oikarinen, Kai Puolamäki, and Antti Ukkonen. 2017. ["Multivariate Confidence Intervals."](https://arxiv.org/abs/1701.05763)

Novikoff, A. 1963. ["On convergence proofs for perceptrons."](http://www.dtic.mil/dtic/tr/fulltext/u2/298258.pdf) Stanford Research Institute: Technical Report, 298258.

Rosenblatt, F. 1958. ["The perceptron: A probabilistic model for information storage and organization in the brain."](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.335.3398&rep=rep1&type=pdf) Psychological Review, 65(6): 386–408.