FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025
Call For Paper (CFP) Description
[Call for Papers]
Foundation models (FMs) are typically associated with large language models (LLMs) such as ChatGPT, and are characterized by their scale and broad applicability. While these models provide transformative capabilities, they also introduce significant challenges, particularly around distributed model management and the associated concerns of data privacy, efficiency, and scalability. Training foundation models is data and resource intensive, and conventional methods are typically centralized; this creates significant challenges in real-world use cases, including training data scattered across organizations, the computational resources needed to manage distributed data repositories, and compliance with regulations (e.g., GDPR) that restrict the sharing of sensitive data.
Federated learning (FL) is an emerging paradigm that can mitigate these challenges by training a shared global model over distributed data without centralizing it. The growing use of machine learning to draw insight from real-world, distributed, and sensitive data makes FL a relevant and timely topic for the broader scientific community. Because FL allows self-interested data owners to train models collaboratively, end-users can become co-creators of AI solutions. By adopting federated learning approaches, we can leverage the distributed data and computing power available across different sources while respecting user privacy.
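To make the paradigm concrete, the canonical FL aggregation rule (federated averaging, FedAvg) can be sketched in a few lines: clients train on their own data and send only model parameters, which the server averages weighted by local dataset size. This is a minimal illustrative sketch, not a reference implementation of any system discussed at the workshop.

```python
def fed_avg(client_params, client_sizes):
    """Average per-client parameter vectors, weighted by local data size.

    client_params: list of parameter vectors (one per client).
    client_sizes:  number of local training examples per client.
    """
    total = sum(client_sizes)
    dim = len(client_params[0])
    global_params = [0.0] * dim
    for params, size in zip(client_params, client_sizes):
        weight = size / total
        for i, p in enumerate(params):
            global_params[i] += p * weight
    return global_params


# Two hypothetical clients with different data volumes; note that only
# parameters are shared -- the raw training data never leaves a client.
clients = [[1.0, 2.0], [3.0, 4.0]]
sizes = [10, 30]
print(fed_avg(clients, sizes))  # -> [2.5, 3.5]
```

In practice each round also includes local gradient steps, client sampling, and secure aggregation; the weighted average above is only the server-side core of the protocol.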
The rise of FMs amplifies the importance and relevance of FL as a crucial research direction. With FMs becoming the norm in machine learning development, the focus shifts from model architecture design to tackling the issues surrounding privacy-preserving and distributed learning. Advancements in FL methods have the potential to unlock the use of FMs, enabling efficient and scalable training while safeguarding sensitive data.
With this in mind, we invite original research contributions, position papers, and work-in-progress reports on various aspects of federated learning in the era of foundation models. Since the emergence of foundation models has been a relatively recent phenomenon, their full impact on federated learning has not yet been well explored or understood. We hope to provide a platform to facilitate interaction among students, scholars, and industry professionals from around the world to discuss the latest advancements, share insights, and identify future directions in this exciting field.
The workshop topics include but are not limited to:
Theory and algorithmic foundations:
-Impact of heterogeneity in FL of large models
-Multi-stage model training (e.g., base model + fine tuning)
-Optimization advances in FL (e.g., beyond first-order and local methods)
-Prompt tuning in federated settings
-Self-supervised learning in federated settings
Leveraging foundation models to improve federated learning:
-Adaptive aggregation strategies for FL in heterogeneous environments
-Foundation model enhanced FL knowledge distillation
-Overcoming data interoperability challenges using foundation models
-Personalization of FL with foundation models
Federated learning for training and tuning foundation models:
-Fairness, bias, and interpretability challenges in FL with foundation models
-Federated transfer learning with foundation models
-FL techniques for training large-scale foundation models
-Hardware for FL with foundation models
-Optimization algorithms for federated training of foundation models
-Privacy-preserving mechanisms in FL with foundation models
-Resource-efficient FL with foundation models
-Security and robustness considerations in FL with foundation models
-Systems and infrastructure for FL with foundation models
-Vertical federated learning with foundation models
-Vulnerabilities of FL with foundation models
[Submission Instructions]
Formatting Requirements. Submissions must be written in English, in double-column format, and must adhere to the ACM template and format (also available in Overleaf). Word users may use the Word Interim Template. The recommended setting for LaTeX is:
\documentclass[sigconf,review]{acmart}
Submissions must be a single PDF file of 4 (four) to 8 (eight) pages as the main paper, with up to 2 additional pages for references and optional appendix.
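The class options above correspond to a minimal acmart preamble along the following lines; this is an illustrative sketch of the standard ACM template, with placeholder title, author, and section names, not an official workshop file:

```latex
% Minimal sketch of an ACM double-column submission using acmart.
% The "review" option enables line numbers for reviewing.
\documentclass[sigconf,review]{acmart}

\begin{document}

\title{Your Paper Title}
\author{First Author}
\affiliation{%
  \institution{Your Institution}
  \country{Your Country}}

\begin{abstract}
Abstract text.
\end{abstract}

\maketitle

\section{Introduction}
Body text.

\end{document}
```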
Authorship. Submissions are not anonymous; authors should therefore list their names and affiliations.
Submission site: TBA.
For enquiries, please email the workshop general/program co-chairs.
[Publications]
For accepted papers, the authors decide whether they want them included in the WWW'25 Companion proceedings. Opting in means the paper is regarded as published in an archival venue, which precludes submitting it to other conferences or journals. Authors who opt out may still submit their papers for consideration elsewhere. We will contact the authors of accepted papers in due course to record this decision.
[Organizing Committee]
General Co-Chairs
-Irwin King (CUHK)
-Guodong Long (UTS)
Program Co-Chairs
-Zenglin Xu (Fudan)
-Han Yu (NTU)
Local Chair
-Yifei Zhang (NTU)
Frequently Asked Questions
What is FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
FL@FM-TheWebConf 2025 is the International Workshop on Federated Foundation Models for the Web, co-located with The Web Conference 2025. It explores the intersection of federated learning and foundation models; authors are invited to submit papers and contribute to the latest advancements in this field.
How do I submit my paper to FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
Submit your paper via the official submission portal at https://federated-learning.org/fl@fm-www-2025/. Follow the submission guidelines outlined in the CFP.
How do I register for the FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
Register at https://federated-learning.org/fl@fm-www-2025/. Early registration is recommended to secure your spot and avail discounts.
What topics are accepted at FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
The topics accepted at FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025 include machine learning, the Web, foundation models, and federated learning. Papers that explore innovative ideas or solutions in these areas are highly encouraged.
What are the important dates for FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
- Start Date: 28 Apr, 2025
- End Date: 29 Apr, 2025
What is the location and date of FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025 will be held on 28-29 Apr, 2025 in Sydney, Australia. More details about the event location and travel arrangements can be found on the conference's official website.
Can I submit more than one paper to FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
Yes, multiple submissions are allowed, provided they align with the conference’s themes and topics. Each submission will be reviewed independently.
What is the review process for submissions?
Papers will be reviewed by a panel of experts in the field, ensuring that only high-quality, relevant work is selected for presentation. Each paper will be evaluated on originality, significance, and clarity.
What presentation formats are available at FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
Presentations can be made in various formats including oral presentations, poster sessions, or virtual presentations. Specific details will be provided upon acceptance of your paper.
Can I make changes to my submission after I’ve submitted it?
Modifications to your submission are allowed until the submission deadline. After that, no changes can be made. Please make sure all details are correct before submitting.
What are the benefits of attending FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025?
Attending FL@FM-TheWebConf 2025 : International Workshop on Federated Foundation Models for the Web 2025 provides an opportunity to present your research, network with peers and experts in your field, and gain feedback on your work. Additionally, it is an excellent platform for career advancement and collaboration opportunities.
What should I include in my abstract or proposal submission?
Your abstract or proposal should include a concise summary of your paper, including its purpose, methodology, and key findings. Ensure that it aligns with the conference themes.