
The Entropy Filter


The entropy H(L) of the belief over L is defined as

  H(L) = - \sum_{l} Bel(L=l) \log Bel(L=l)

and is a measure of uncertainty about the outcome of the random variable L [Cover & Thomas 1991]. The higher the entropy, the greater the robot's uncertainty as to where it is. The entropy filter measures the relative change of entropy upon incorporating a sensor reading into the belief Bel(L). More specifically, let s denote the measurement of a sensor (in our case a single range measurement). The change of the entropy of Bel(L) given s is defined as:

  \Delta H(L|s) := H(L|s) - H(L)

The term H(L|s) is the entropy of the belief Bel(L) after incorporating the sensor measurement s (see Equations (18) - (20)). While a positive change of entropy indicates that the robot is less certain about its position after incorporating s, a negative change indicates an increase in certainty. The selection scheme of the entropy filter excludes all sensor measurements s with \Delta H(L|s) > 0. In other words, it uses only those sensor readings that confirm the robot's current belief.
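The selection scheme above can be sketched for a discrete belief as follows. This is a minimal illustration, not the paper's implementation: the belief is a plain list of probabilities over grid locations, the Bayes update is a simple multiply-and-normalize, and the likelihood vector P(s | L=l) is assumed to be given by some sensor model.

```python
import math

def entropy(bel):
    """Entropy H(L) of a discrete belief Bel(L), given as a list of
    probabilities over locations that sums to 1 (zero entries are skipped)."""
    return -sum(p * math.log(p) for p in bel if p > 0.0)

def incorporate(bel, likelihoods):
    """Bayes update: multiply the belief by the measurement likelihoods
    P(s | L=l) and renormalize. Both inputs are assumed hypothetical
    stand-ins for the grid-based belief and sensor model of the paper."""
    posterior = [b * l for b, l in zip(bel, likelihoods)]
    norm = sum(posterior)
    if norm == 0.0:
        return list(bel)  # reading impossible under every hypothesis; keep prior
    return [p / norm for p in posterior]

def entropy_filter_accepts(bel, likelihoods):
    """Entropy-filter decision: accept the reading only if incorporating it
    does not increase the entropy of Bel(L), i.e. only if it confirms
    (or sharpens) the robot's current belief."""
    delta_h = entropy(incorporate(bel, likelihoods)) - entropy(bel)
    return delta_h <= 0.0

# Example with a belief peaked at location 0: a reading that is likely at the
# believed location is accepted, one pointing elsewhere is filtered out.
bel = [0.7, 0.2, 0.1]
confirming = [0.9, 0.05, 0.05]
contradicting = [0.05, 0.05, 0.9]
```

With these numbers, `entropy_filter_accepts(bel, confirming)` returns True (the posterior concentrates on location 0, so entropy drops), while `entropy_filter_accepts(bel, contradicting)` returns False, illustrating how the filter discards readings that disagree with the current position estimate.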

Entropy filters work well when the robot's belief is focused on the correct hypothesis, but they may fail in situations in which the robot's belief state is incorrect. This topic will be analyzed systematically in the experiments described in Section 4.1. The advantage of the entropy filter is that it makes no assumptions about the nature of the sensor data or the kinds of disturbances occurring in dynamic environments.

Dieter Fox
Fri Nov 19 14:29:33 MET 1999