Journal article

FedDGA: Federated Multitask Learning Based on Dynamic Guided Attention

Haoyun Sun, Hongwei Zhao, Liang Xu, Weishan Zhang, Hongqing Guan, Scott Yang

IEEE Transactions on Artificial Intelligence, Volume 6, Issue 2, Pages 268-280

Swansea University Author: Scott Yang

Full text not available from this repository.

Abstract

The proliferation of privacy-sensitive data has spurred the development of federated learning (FL), an important technology for state-of-the-art machine learning and responsible AI. However, most existing FL methods are constrained in their applicability and generalizability by their narrow focus on specific tasks. This article presents a federated multitask learning (FMTL) framework capable of acquiring knowledge across multiple tasks. To address the challenges posed by non-IID data and task imbalance in FMTL, it proposes a federated fusion strategy based on dynamic guided attention (FedDGA), which adaptively fine-tunes local models for multiple tasks with personalized attention. In addition, the article introduces dynamic batch weight (DBW) to balance task losses and improve convergence speed. Extensive experiments were conducted across various datasets, tasks, and settings, and the proposed method was compared with state-of-the-art methods such as FedAvg, FedProx, and SCAFFOLD. The results show up to an 11.1% increase in accuracy over the baselines.
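The paper's actual FedDGA and DBW procedures are not reproduced here. As a rough, hypothetical illustration of the building blocks such a federated multitask method rests on, the sketch below shows (a) plain FedAvg-style weighted parameter averaging and (b) a simple loss-ratio-based task weighting; both function names and the weighting formula are illustrative assumptions, and the paper's real attention-guided fusion and DBW formulation may differ substantially.

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """FedAvg-style aggregation: average each parameter tensor across
    clients, weighted by the number of local training samples.

    client_params: list of dicts mapping layer name -> np.ndarray
    client_sizes:  list of local sample counts, one per client
    """
    total = float(sum(client_sizes))
    agg = {}
    for name in client_params[0]:
        agg[name] = sum(
            (n / total) * p[name] for p, n in zip(client_params, client_sizes)
        )
    return agg

def dynamic_task_weights(task_losses, temperature=2.0):
    """Hypothetical dynamic loss weighting for multitask training:
    tasks with a higher loss relative to the mean receive a larger
    weight (softmax over loss ratios), so lagging tasks catch up.
    Weights sum to 1."""
    losses = np.asarray(task_losses, dtype=float)
    ratios = losses / (losses.mean() + 1e-8)
    exp = np.exp(ratios / temperature)
    return exp / exp.sum()
```

For example, aggregating two clients holding 3 and 1 samples gives the first client a 0.75 weight, and a task whose loss is three times another's receives the larger share of the combined loss.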


Published in: IEEE Transactions on Artificial Intelligence
ISSN: 2691-4581
Publisher: Institute of Electrical and Electronics Engineers (IEEE), 2025

URI: https://cronfa.swan.ac.uk/Record/cronfa69396
DOI: 10.1109/tai.2024.3350538
Published date: 1 February 2025
Department: School of Mathematics and Computer Science - Computer Science, Faculty of Science and Engineering, Swansea University
Funder: National Natural Science Foundation of China under Grant 62072469
Author ORCIDs: Haoyun Sun 0000-0002-8326-0152; Hongwei Zhao 0000-0001-5235-0748; Weishan Zhang 0000-0001-9800-1068; Scott Yang 0000-0002-6618-7483