IEILM 2025 : The 2nd Workshop on Integrating Edge Intelligence and Large Model in Next Generation Networks

May 19, 2025, London, United Kingdom
Edge intelligence and large models in next generation networks are closely tied to the increasing convergence of networking, artificial intelligence, and cloud-edge technologies. As cloud-edge computing gains momentum, the demand for intelligent, context-aware, and efficient networking solutions is rising. Integrating edge intelligence with large models enables networks to become more adaptive, self-optimizing, and responsive to user and application needs.

The 2nd Workshop on Integrating Edge Intelligence and Large Models in Next Generation Networks provides a forum that brings together engineers and researchers from industry and academia to discuss up-to-date developments in integrating edge intelligence and large models in next generation networks. The workshop invites submissions of unpublished work on (but not limited to) the following topics:

End-to-end semantic communication systems
Microdata cells/centers for integrating edge intelligence and semantic communications
Large AI models for multi-task intelligent mobile edge computing
Distributed learning architectures for semantic communications
Generative AI for next generation networks
Architectures and protocols for integrating semantic communications and large models
Designs and optimizations for integrating semantic communications and large models
Energy-efficient hardware, software, networks and services with edge intelligence
Integrating edge intelligence and large models in future networks
Cache-enabled networks with semantic communications
Large models for multimedia services
Network functions virtualization (NFV) for cloud-edge networks
Efficient reasoning of large models via edge intelligence
Deploying large models for cloud-edge networking
Distributed and power-efficient training of large models in wireless networks
The role of semantic communications/large models in emerging 6G applications
Edge intelligence systems for semantic communication networks
Signal processing for integrating semantic communications
Information-centric and content-centric networks with large models
Interdisciplinary research for integrating edge intelligence/large models
Information theory for integrating semantic communications and large models
Personalization of federated learning (FL) with large models in cloud-edge networks
Overcoming data interoperability challenges using large models in cloud-edge networks
System architectures for inference and training in cloud-edge networks
Training on cloud/heterogeneous infrastructure with large models


Submission Details:

Papers must be formatted in the standard IEEE two-column format used by the INFOCOM 2025 main conference and must not exceed six pages in length, including references. All submitted papers will go through a peer-review process, and all accepted papers that are presented by one of the authors at the workshop will be published in the IEEE INFOCOM 2025 proceedings and on IEEE Xplore. Please submit your paper through EDAS at https://edas.info/N33128.


Important Dates:

Submission Deadline: January 8, 2025

Notification of Acceptance: February 5, 2025

Camera Ready: February 26, 2025 (Firm Deadline)

Workshop: May 19, 2025