
Journal article

Information sensitivity functions to assess parameter information gain and identifiability of dynamical systems

Sanjay Pant

Journal of The Royal Society Interface, Volume: 15, Issue: 142

Swansea University Author: Sanjay Pant

  • APCE004.Pant.20170871.full.pdf

    PDF | Version of Record (1.17 MB)

    Distributed under the terms of a Creative Commons CC BY 4.0 Licence.


DOI (Published version): 10.1098/rsif.2017.0871


Published in: Journal of The Royal Society Interface
ISSN: 1742-5689 (print); 1742-5662 (online)
Published: 2018
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa39546
Abstract: A new class of functions, called the ‘information sensitivity functions’ (ISFs), which quantify the information gain about the parameters through the measurements/observables of a dynamical system, is presented. These functions can be easily computed through classical sensitivity functions alone and are based on Bayesian and information-theoretic approaches. While marginal information gain is quantified by the decrease in differential entropy, correlations between arbitrary sets of parameters are assessed through mutual information. For individual parameters, these information gains are also presented as marginal posterior variances, and, to assess the effect of correlations, as conditional variances when other parameters are given. The easy-to-interpret ISFs can be used to (a) identify time intervals or regions in dynamical system behaviour where information about the parameters is concentrated; (b) assess the effect of measurement noise on the information gain for the parameters; (c) assess whether sufficient information in an experimental protocol (input, measurements and their frequency) is available to identify the parameters; (d) assess correlation in the posterior distribution of the parameters to identify the sets of parameters that are likely to be indistinguishable; and (e) assess identifiability problems for particular sets of parameters. (A minimal computational sketch of these quantities follows the record below.)
Item Description: Correction available: http://rsif.royalsocietypublishing.org/content/15/143/20180353
College: Faculty of Science and Engineering
Funders: UKRI, EPSRC
Issue: 142
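
The quantities named in the abstract can be illustrated with a short sketch. Assuming a linearised Gaussian (Laplace) approximation in which the Fisher information is built from classical sensitivity functions under additive Gaussian measurement noise, the entropy decrease, conditional variances and mutual information all reduce to simple covariance calculations. This is not the paper's code: the sensitivities, prior and noise level below are illustrative assumptions.

```python
# Minimal sketch: parameter information gain from classical sensitivity functions,
# under a linear-Gaussian (Laplace) approximation. All inputs are illustrative.
import numpy as np

def posterior_covariance(S, sigma, prior_cov):
    """Gaussian-approximation posterior covariance after observations with
    sensitivity matrix S (n_obs x n_params) and measurement noise std sigma."""
    fisher = S.T @ S / sigma**2                      # Fisher information from sensitivities
    return np.linalg.inv(np.linalg.inv(prior_cov) + fisher)

def marginal_information_gain(prior_cov, post_cov, i):
    """Decrease in differential entropy (nats) for the marginal of parameter i."""
    return 0.5 * np.log(prior_cov[i, i] / post_cov[i, i])

def conditional_variance(post_cov, i):
    """Posterior variance of parameter i when all other parameters are known."""
    return 1.0 / np.linalg.inv(post_cov)[i, i]

def mutual_information(post_cov, idx_a, idx_b):
    """Posterior mutual information (nats) between two disjoint parameter subsets."""
    a = np.ix_(idx_a, idx_a)
    b = np.ix_(idx_b, idx_b)
    joint = np.ix_(idx_a + idx_b, idx_a + idx_b)
    return 0.5 * (np.log(np.linalg.det(post_cov[a]))
                  + np.log(np.linalg.det(post_cov[b]))
                  - np.log(np.linalg.det(post_cov[joint])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    S = rng.normal(size=(50, 3))        # hypothetical output sensitivities at 50 time points
    prior = np.eye(3)                   # unit-variance Gaussian prior on 3 parameters
    post = posterior_covariance(S, sigma=0.5, prior_cov=prior)
    print("information gain for parameter 0:", marginal_information_gain(prior, post, 0))
    print("conditional variance of parameter 0:", conditional_variance(post, 0))
    print("MI between params [0] and [1, 2]:", mutual_information(post, [0], [1, 2]))
```

Accumulating the Fisher term only over observations up to a given time, and repeating the calculation as that time grows, gives one way to see where in the system's behaviour the information about each parameter is concentrated, in the spirit of point (a) of the abstract.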