Chapter 19: Artificial Intelligence
a brief review of the second semester of the course (12/05/16, 6:46 AM)
probability
probability gives us a formal mechanism to reason about uncertainty; important concepts include:
* random variables
* joint distributions over two or more random variables
* conditional distributions Pr(X | Y) = Pr(X, Y) / Pr(Y)
* sum rule and product rule
* Bayes' rule
* independence and conditional independence
* the chain rule, which lets us factorize a joint distribution into a product of conditional distributions, e.g. Pr(X, Y, Z) = Pr(X) Pr(Y | X) Pr(Z | X, Y)
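The concepts above can be sketched numerically. Below is a minimal Bayes' rule computation on a made-up diagnostic example; all of the numbers are illustrative assumptions, not values from the notes.

```python
# Toy Bayes' rule sketch (all probabilities here are illustrative assumptions).
p_disease = 0.01                  # prior Pr(D)
p_pos_given_disease = 0.95        # likelihood Pr(+ | D)
p_pos_given_healthy = 0.05        # Pr(+ | not D)

# sum rule / total probability over the two cases of D:
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' rule: Pr(D | +) = Pr(+ | D) Pr(D) / Pr(+)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # prints 0.161
```

Note how a rare condition stays fairly unlikely even after a positive test, because the prior dominates the likelihood ratio.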
(Hidden) Markov models
Markov models (MMs) and hidden Markov models (HMMs) are special cases of Bayes nets that allow us to reason about time series of RVs
- structure of MMs/HMMs
- the stationary distribution
- filtering: we are always trying to infer Pr(X_t | E_1, ..., E_t)
- exact inference for filtering (elapse time/update/forward algorithm)
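The elapse-time/update loop of the forward algorithm can be sketched in a few lines. The 2-state transition/emission tables and prior below are illustrative assumptions, not a model from the notes.

```python
# Minimal forward-algorithm sketch for filtering Pr(X_t | e_1..e_t).
# Transition T, emission E, and prior are illustrative assumptions;
# states and observations are just indices 0/1.
T = [[0.7, 0.3],   # T[i][j] = Pr(X_t = j | X_{t-1} = i)
     [0.3, 0.7]]
E = [[0.9, 0.1],   # E[i][o] = Pr(e = o | X = i)
     [0.2, 0.8]]
prior = [0.5, 0.5]

def forward(observations):
    belief = prior[:]
    for obs in observations:
        # elapse time: push the belief through the transition model
        predicted = [sum(belief[i] * T[i][j] for i in range(2))
                     for j in range(2)]
        # update: weight by the evidence likelihood, then normalize
        weighted = [predicted[j] * E[j][obs] for j in range(2)]
        z = sum(weighted)
        belief = [w / z for w in weighted]
    return belief

print(forward([0, 0, 1]))  # belief over X_3 given evidence e_1=0, e_2=0, e_3=1
```

Each iteration is one elapse-time step followed by one observation update, exactly the two alternating phases named in the bullet above.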
Particle filtering
Particle filtering is a sampling approach to HMMs that is useful when the state space is very large
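As a sketch of the sampling approach, here is a particle filter for the same kind of 2-state HMM; the transition/emission numbers are illustrative assumptions. In a toy 2-state model exact inference is trivial, so this only illustrates the mechanics that pay off when the state space is huge.

```python
import random

# Particle-filtering sketch (model numbers are illustrative assumptions).
random.seed(0)
T = [[0.7, 0.3], [0.3, 0.7]]   # Pr(X_t = j | X_{t-1} = i)
E = [[0.9, 0.1], [0.2, 0.8]]   # Pr(e = o | X = i)

def particle_filter(observations, n=5000):
    particles = [random.randint(0, 1) for _ in range(n)]  # sample the prior
    for obs in observations:
        # elapse time: move each particle through the transition model
        particles = [0 if random.random() < T[x][0] else 1
                     for x in particles]
        # observe: weight each particle by the evidence likelihood...
        weights = [E[x][obs] for x in particles]
        # ...and resample particles in proportion to those weights
        particles = random.choices(particles, weights=weights, k=n)
    # the belief is just the fraction of particles in each state
    return [particles.count(s) / n for s in (0, 1)]

print(particle_filter([0, 0, 1]))  # approximates the exact forward filter
```

With enough particles this converges to the exact filtered distribution, while the memory cost scales with the particle count rather than the state space.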
Bayes nets
Bayes nets give us a mechanism to easily model situations and encode conditional independence
- nodes are RVs, edges indicate (often causal) relationships in a DAG
- the graph structure encodes conditional independences
- constructing a reasonable BN is possible with knowledge about the problem
- D-separation gives us an algorithm for determining whether two RVs are conditionally independent given a set of evidence
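A conditional independence implied by the graph structure can be verified numerically. The sketch below builds the joint of a common-cause net A ← C → B via the chain rule and checks that A and B are independent given C but dependent otherwise; the CPT numbers are illustrative assumptions.

```python
from itertools import product

# Numeric check of a conditional independence encoded by a tiny BN
# with common-cause structure A <- C -> B (CPT numbers are assumptions).
p_c = {0: 0.4, 1: 0.6}            # Pr(C = 1) = 0.6
p_a_given_c = {0: 0.1, 1: 0.8}    # Pr(A = 1 | C = c)
p_b_given_c = {0: 0.3, 1: 0.9}    # Pr(B = 1 | C = c)

def joint(a, b, c):
    # chain rule along the BN: Pr(A,B,C) = Pr(C) Pr(A|C) Pr(B|C)
    pa = p_a_given_c[c] if a else 1 - p_a_given_c[c]
    pb = p_b_given_c[c] if b else 1 - p_b_given_c[c]
    return p_c[c] * pa * pb

def pr(**fixed):
    # marginal of a partial assignment, summing out the other variables
    return sum(joint(a, b, c) for a, b, c in product((0, 1), repeat=3)
               if all({'a': a, 'b': b, 'c': c}[k] == v
                      for k, v in fixed.items()))

# given C, A and B are d-separated, so Pr(A,B|C) = Pr(A|C) Pr(B|C):
lhs = pr(a=1, b=1, c=1) / pr(c=1)
rhs = (pr(a=1, c=1) / pr(c=1)) * (pr(b=1, c=1) / pr(c=1))
print(abs(lhs - rhs) < 1e-12)                          # True: independent

# without conditioning, the common cause couples A and B:
print(abs(pr(a=1, b=1) - pr(a=1) * pr(b=1)) > 1e-3)    # True: dependent
```

This matches what d-separation predicts for a common-cause triple: observing the middle node blocks the path.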
Decision diagrams / VPI
decision diagrams / value of perfect information give us a framework for decision making under uncertainty that is compatible with probability
- maximize the expected utility!!
- value of perfect information allows us to determine if gathering information is worth the cost
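A one-shot VPI computation can be sketched directly: compute the MEU acting now, compute the MEU assuming a perfect observation arrives first, and take the difference. The world states, actions, and utilities below are illustrative assumptions.

```python
# VPI sketch for a one-shot decision (numbers are illustrative assumptions).
# World state W in {0, 1}; two actions, 'safe' and 'risky'.
p_w = {0: 0.5, 1: 0.5}                        # prior belief over W
utility = {('safe', 0): 50, ('safe', 1): 50,  # U(action, w)
           ('risky', 0): 0, ('risky', 1): 100}

def meu(belief):
    # maximize expected utility over actions under a belief about W
    return max(sum(belief[w] * utility[(a, w)] for w in (0, 1))
               for a in ('safe', 'risky'))

# MEU acting now, without observing W (both actions give 50 here):
meu_without = meu(p_w)

# MEU with perfect information: for each possible observation of W,
# act optimally, then average over what we expect to observe.
meu_with = sum(p_w[w] * meu({w: 1.0, 1 - w: 0.0}) for w in (0, 1))

vpi = meu_with - meu_without
print(vpi)  # gather the information iff its cost is below this value
```

Knowing W lets us take the risky action only when it pays off, so the information has positive value; VPI is always nonnegative, since at worst we ignore the observation.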