In this paper, we study distributed decision networks where uncertainties exist in the statistical environment. Specifically, each decision maker (DM) has an unknown probability of being jammed or defective and an unknown probability of providing an incorrect decision when jammed or defective. Each DM in the network processes its input data, consisting of external observations and decisions from preceding DMs, to produce a decision regarding an underlying binary hypothesis testing problem. The local observations are assumed conditionally independent given each hypothesis. The resulting binary hypothesis testing problem is solved using simple concepts from Dempster-Shafer theory. Each DM employs Dempster's combining rule to aggregate its input information into a decision. The uncertainty introduced by the unknown probabilities is treated by discounting the degree of confidence in decisions. As a result, the decision rule of each DM is a likelihood ratio test with a data-dependent threshold that is a function of the uncertainty discount rates. We compare the performance of the proposed decision rule to that of the minimax decision rule and the decision rule that is optimum when there are no jammed or defective DMs, for several distributed decision networks with different topologies, and we show that the proposed decision rule behaves very robustly.
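To make the fusion step concrete, the following is a minimal sketch (not the paper's exact formulation) of Dempster's combining rule over the binary frame {H0, H1}, with classical discounting used to model uncertainty about a DM's reliability. All function and variable names, and the numerical masses and discount rate, are illustrative assumptions.

```python
def discount(m, alpha):
    """Discount a mass function m = (m_H0, m_H1, m_Theta) by rate alpha.

    Discounted mass: (1 - alpha) * m(A) for A != Theta; the removed
    mass alpha is transferred to the full frame Theta (ignorance).
    """
    m0, m1, mT = m
    return ((1 - alpha) * m0, (1 - alpha) * m1, (1 - alpha) * mT + alpha)

def combine(ma, mb):
    """Dempster's rule of combination for two mass functions on {H0, H1}."""
    a0, a1, aT = ma
    b0, b1, bT = mb
    conflict = a0 * b1 + a1 * b0          # mass assigned to the empty set
    norm = 1.0 - conflict                 # renormalization constant
    m0 = (a0 * b0 + a0 * bT + aT * b0) / norm
    m1 = (a1 * b1 + a1 * bT + aT * b1) / norm
    mT = (aT * bT) / norm
    return (m0, m1, mT)

# Example: a preceding DM's decision favoring H1, discounted because that
# DM may be jammed or defective, fused with an undiscounted local
# observation that also favors H1.
peer = discount((0.1, 0.9, 0.0), alpha=0.3)   # alpha: uncertainty discount rate
local = (0.2, 0.7, 0.1)
fused = combine(peer, local)                  # normalized mass over {H0, H1, Theta}
```

Thresholding the fused belief in H1 against the fused belief in H0 yields a test whose effective threshold depends on the discount rates, mirroring the data-dependent threshold described above.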