Active Sequential Hypothesis Testing: NCD Seminar

Seminar: Net/Comm/DSP | November 28 | 2:30-3:30 p.m. | Cory Hall, 400 Cory


Tara Javidi, UCSD

Qualcomm


The active sequential hypothesis testing problem arises in a broad spectrum of applications in cognition, communications, design of experiments, and sensor management. In all of these applications, a decision maker is responsible for dynamically taking actions so as to quickly enhance information about an underlying phenomenon of interest, while accounting for the cost of communication, sensing, or data collection. In particular, due to the sequential nature of the problem, the decision maker relies on the current information state to constantly (re-)evaluate the trade-off between precision and cost, as well as the influence that every action has over the entire decision-making horizon.
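As a concrete illustration of this setting, the sketch below simulates a decision maker who maintains a Bayesian belief over M hypotheses, chooses a sensing action, observes a sample, and updates the belief until confident. This is a minimal sketch of the generic problem, not the speaker's algorithm; the variable names, the 0.99 confidence threshold, and the placeholder random-action rule are all illustrative assumptions.

import numpy as np

# Minimal active sequential hypothesis testing loop (illustrative only).
# p_obs[a, i] is the observation distribution under hypothesis i when action a is taken.
rng = np.random.default_rng(0)
M, A, K = 4, 3, 5                                 # hypotheses, actions, observation alphabet
p_obs = rng.dirichlet(np.ones(K), size=(A, M))    # random observation kernels for the demo
true_h = 2                                        # hypothesis generating the data
rho = np.full(M, 1.0 / M)                         # uniform prior belief

while rho.max() < 0.99:                           # stop once the belief is concentrated
    a = rng.integers(A)                           # placeholder action rule (see EJS rule below)
    y = rng.choice(K, p=p_obs[a, true_h])         # observe a sample under the true hypothesis
    rho = rho * p_obs[a, :, y]                    # Bayes update of the belief
    rho /= rho.sum()

print("declared hypothesis:", rho.argmax())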

In this talk, we first discuss active sequential hypothesis testing as a partially observable Markov decision problem. In particular, we provide a brief survey of the design of experiment literature and the dynamic programming interpretation of information utility introduced by De Groot: To achieve the best performance, it is essential that the decision maker, in each step, selects the most ¡§informative¡¨ action among the available ones where the critical question is what the proper measure of information should be. Using the notion of asymptotic optimality due to Chernoff, we connect this stochastic control theoretic notion of information utility to the error exponents in statistics and information theory. One of the main drawbacks of Chernoff¡¦s asymptotic optimality notion is his neglecting the complimentary role of the number of hypothesis. In particular, his notion of asymptotic optimality falls short in showing the tension between using (asymptotically large number of) samples to discriminate among a few hypotheses with (asymptotically) high accuracy or a (asymptotically) large number of hypotheses with a lower degree of accuracy. Connecting De Groot's information utility framework with the Shannon theoretic concept of uncertainty reduction, we strengthen Chernoff's lower bound to account for the question of non-zero and maximum information acquisition rates as well as an upper bound on the error exponent as a function of rate.
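For orientation, the rate and exponent alluded to here are typically defined as follows (standard notation assumed; this is not taken verbatim from the abstract). With M hypotheses, stopping time \tau, and error probability P_e,

\[
  R \;=\; \frac{\log M}{\mathbb{E}[\tau]},
  \qquad
  E \;=\; \frac{-\log P_e}{\mathbb{E}[\tau]},
\]

so Chernoff's regime, which fixes M while \mathbb{E}[\tau] grows, corresponds to an information acquisition rate R tending to zero, whereas the bounds described in the talk treat the pair (R, E) jointly.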

Using the insights obtained, and to address the shortcomings of the proposed solutions, we introduce the Extrinsic Jensen–Shannon (EJS) divergence as a measure of information, based on which a heuristic policy for selecting actions is proposed. Via numerical and asymptotic analysis, the performance of the proposed policy, and hence the utility of the EJS divergence in the context of active hypothesis testing, is investigated. In particular, under a mild technical condition, the proposed heuristic is shown to achieve a strictly positive information acquisition rate with a strictly positive error exponent. Furthermore, in the special case of variable-length coding over discrete memoryless channels with perfect feedback, the EJS policy is shown to be, to the best of our knowledge, the first sequential, deterministic, one-phase coding scheme that achieves Burnashev's optimal error exponent.
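For readers wanting a concrete handle on the EJS divergence, the sketch below assumes the form in which each hypothesis's observation kernel is compared against the "extrinsic" mixture of the other hypotheses' kernels, weighted by the renormalized belief. The function names and the greedy action rule are illustrative, not the exact scheme from the talk.

import numpy as np

def kl(p, q, eps=1e-12):
    # Kullback-Leibler divergence D(p || q) in nats, with a small floor for stability.
    p, q = np.asarray(p) + eps, np.asarray(q) + eps
    return float(np.sum(p * np.log(p / q)))

def ejs(rho, P):
    # Extrinsic Jensen-Shannon divergence for belief rho and kernels P[i] = P(obs | hyp i)
    # under a fixed action:  EJS = sum_i rho_i * D( P_i || sum_{j != i} rho_j/(1 - rho_i) P_j ).
    M = len(rho)
    total = 0.0
    for i in range(M):
        if rho[i] <= 0.0 or rho[i] >= 1.0 - 1e-12:
            continue  # skip degenerate beliefs where the extrinsic mixture is undefined
        extrinsic = sum(rho[j] / (1.0 - rho[i]) * P[j] for j in range(M) if j != i)
        total += rho[i] * kl(P[i], extrinsic)
    return total

def ejs_action(rho, p_obs):
    # Greedy heuristic: choose the action whose observation kernels maximize the EJS divergence.
    return int(np.argmax([ejs(rho, p_obs[a]) for a in range(len(p_obs))]))

In the earlier loop, this rule would replace the random placeholder via a = ejs_action(rho, p_obs), so that each sample is taken with the action currently judged most informative under the belief rho.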

This is joint work with Mohammad Naghshvar, Ofer Shayevitz, and Michele Wigger.


gireeja@eecs.berkeley.edu