Sunday, December 8, 2024

[DMANET] [CFP] Extended Deadline: December 15, 2024 – The 1st International Workshop on TwinNetApp in conjunction with IEEE WCNC 2025 | Milan, Italy

================================================
We apologize if you receive multiple copies of this CFP.
================================================

CALL FOR PAPERS
The First International Workshop on Data Driven and AI-Enabled Digital Twin Networks and Applications (TwinNetApp) in conjunction with IEEE WCNC 2025, Milan, Italy
https://wcnc25-twinnetapp.bcrg.uk/
https://wcnc2025.ieee-wcnc.org/workshop/ws09-twinnetapp-data-driven-and-ai-enabled-digital-twin-networks-and-applications

The next generation of technologies is envisioned to connect the physical and digital worlds. In this regard, the Digital Twin (DT) is an emerging technology that has attracted significant attention due to its benefits and its deployment in interdisciplinary applications, ranging from real-time remote monitoring and healthcare to industrial control systems and predictive maintenance in aerospace. A DT connects the physical and digital worlds by building an accurate virtual replica of system objects in real time. Real-time network monitoring, performance testing, optimization, and fast simulation are some examples that exploit DT advantages in the communication domain and beyond. DT delivers a new generation of platforms for analysing and testing complex systems in ways that are not available in traditional simulations and evaluations, which makes it a valuable tool for communication systems. Furthermore, when used together with next-generation mobile communications (5G/6G), Virtual Reality (VR), Internet of Things (IoT), Artificial Intelligence (AI), Transfer Learning (TL), 3D models, Augmented Reality (AR), distributed computing, and intelligent health applications, DT can deliver seamless analysis, monitoring, and prediction between the physical and digital counterparts of real-world systems. The workshop invites authors to submit papers presenting new research related to all aspects of DT networks, systems, and applications. Topics of interest include, but are not limited to:
• Data-driven and IoT-based DT networks for real-time communication systems
• Real-time communication protocols for DT networks
• DT-enabled health applications
• Wireless communications for cyber-physical DT applications
• Security and privacy concepts in DT
• Trustworthy AI enabled DT applications
• DT-assisted AI applications for smart cities
• Communications protocols for enabling DT deployment in real-world applications
• AI applications of DT systems
• DT in Edge/Fog/Cloud Computing
• DT for enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC), and Ultra Reliable Low Latency Communications (URLLC) applications
• DT for resource management and network optimization
• Connected networking systems for environmental sensing
• DT for precision agriculture, smart city and industry 4.0 applications
• Real-world DT simulations, prototypes, and testbed demonstrations

General Chairs
• Berk Canberk, Edinburgh Napier University, UK (B.Canberk@napier.ac.uk)
• Octavia Dobre, Memorial University, Canada (odobre@mun.ca)
• Marco Di Renzo, Université Paris-Saclay, CNRS, CentraleSupélec, Laboratoire des Signaux et Systèmes, France (marco.di-renzo@universite-paris-saclay.fr)

Keynote Speaker
• Trung Q. Duong, Memorial University, Canada (tduong@mun.ca)

Industrial Chair
• Gökhan Yurdakul, CEO, BTS Group, Turkey (yurdakulg@btsgrp.com)

Important Dates
Submission Deadline (extended): December 15, 2024 (originally December 1, 2024)
Notification of Acceptance: January 15, 2025
Camera Ready: February 1, 2025

Submission Details:
Initial submissions are limited to SIX (6) printed pages (10-point font) and must be written in English. Initial submissions longer than SIX (6) pages will be rejected without review. For further details, please refer to https://wcnc2025.ieee-wcnc.org/call-workshop-papers.

Submission Link:
https://edas.info/newPaper.php?c=32909&track=127537

Best Paper Award:
The workshop will feature a Best Paper Award with a cash prize of 800 dollars, sponsored by BTS Group (https://btslabs.ai/).

Regards,
Yagmur Yigit
PhD Student, Edinburgh Napier University


