Journal article
Information sensitivity functions to assess parameter information gain and identifiability of dynamical systems
Journal of The Royal Society Interface, Volume: 15, Issue: 142
Swansea University Author: Sanjay Pant
PDF, Version of Record (1.17 MB), distributed under the terms of a Creative Commons CC-BY 4.0 Licence.
DOI (Published version): 10.1098/rsif.2017.0871
Abstract
A new class of functions, called the 'information sensitivity functions' (ISFs), which quantify the information gain about the parameters through the measurements/observables of a dynamical system, is presented. These functions can be computed from classical sensitivity functions alone and are based on Bayesian and information-theoretic approaches. While marginal information gain is quantified by the decrease in differential entropy, correlations between arbitrary sets of parameters are assessed through mutual information. For individual parameters, these information gains are also presented as marginal posterior variances and, to assess the effect of correlations, as conditional variances when the other parameters are given.
The easy-to-interpret ISFs can be used to (a) identify time intervals or regions in dynamical system behaviour where information about the parameters is concentrated; (b) assess the effect of measurement noise on the information gain for the parameters; (c) assess whether sufficient information in an experimental protocol (input, measurements and their frequency) is available to identify the parameters; (d) assess correlation in the posterior distribution of the parameters to identify the sets of parameters that are likely to be indistinguishable; and (e) assess identifiability problems for particular sets of parameters.
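The quantities named in the abstract can be sketched under a standard linear-Gaussian (Laplace) approximation: with a sensitivity matrix, a noise variance and a Gaussian prior, the posterior covariance, the entropy-based information gain and the conditional variances all follow in a few lines. This is a minimal illustration of the general idea, not the paper's own implementation; the function name `isf_summaries` and the toy sensitivity matrix below are assumptions for the example.

```python
import numpy as np

def isf_summaries(S, sigma2, P0):
    """Information summaries for a linearised Gaussian model.

    S      : (n_obs, n_par) sensitivity matrix d(observable)/d(parameter)
    sigma2 : measurement noise variance (i.i.d. Gaussian noise assumed)
    P0     : (n_par, n_par) prior covariance of the parameters
    """
    # Fisher information accumulated from the classical sensitivity functions
    J = S.T @ S / sigma2
    # Posterior precision and covariance under the Laplace approximation
    precision = np.linalg.inv(P0) + J
    P = np.linalg.inv(precision)
    # Marginal information gain per parameter: decrease in differential
    # entropy of the marginal, in nats
    gain = 0.5 * np.log(np.diag(P0) / np.diag(P))
    # Conditional variance of each parameter when all others are given
    cond_var = 1.0 / np.diag(precision)
    return P, gain, cond_var

# Toy example: three observations of two parameters, unit noise, unit prior
S = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
P, gain, cond_var = isf_summaries(S, 1.0, np.eye(2))
```

The conditional variances are never larger than the marginal posterior variances; a large gap between the two signals parameter correlation, which the paper quantifies via mutual information.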
Published in: Journal of The Royal Society Interface
ISSN: 1742-5689 (print), 1742-5662 (electronic)
Published: 2018
URI: https://cronfa.swan.ac.uk/Record/cronfa39546
Published date: 31 May 2018
Correction available: http://rsif.royalsocietypublishing.org/content/15/143/20180353
Funding: EPSRC, EP/R010811/1
Department: School of Aerospace, Civil, Electrical, General and Mechanical Engineering - Mechanical Engineering, Faculty of Science and Engineering, Swansea University