*The International Conference on Intelligent Metaverse Technologies & Applications (iMETA2023)*
18–20 September, 2023 – Tartu, Estonia
Over the years, technology has advanced significantly, and the creation of
virtual environments (i.e., the metaverse) is one of the latest innovations
set to revolutionize how we interact, do business, and connect our real
lives with virtual ones. The metaverse presents a new realm that blurs the
lines between the physical and digital worlds, providing a new space for
communication, commerce, services, and entertainment.
The International Conference on Intelligent Metaverse Technologies &
Applications (iMETA) aims to bring together leading researchers, academics,
and industry experts to explore the various aspects of the distributed
metaverse, including its technologies, applications, and implications for
various industries. Attendees can expect exciting keynote speeches, panel
discussions, and presentations on cutting-edge research in the field.
Throughout the conference, there will be ample opportunities for attendees
to network, share their knowledge, and collaborate on future initiatives
that will drive the development of the metaverse forward. We are confident
that the iMETA conference will inspire new ideas, foster innovation, and
spark collaborations that push the boundaries of the metaverse and its
potential to change the world as we know it.
Overall, the iMETA conference aims to provide attendees with a
comprehensive understanding of the communication, computing, and system
requirements of the metaverse. Through keynote speeches, panel discussions,
the metaverse. Through keynote speeches, panel discussions, and
presentations, attendees will have the opportunity to engage with leading
experts and learn about the latest developments and future trends in the
field. The conference will also provide ample opportunities for networking,
sharing knowledge, and collaborating with others in the metaverse community.
*List of Tracks:*
Researchers are encouraged to submit original research contributions in all
major areas, which include, but are not limited to:
1. AI
2. Security and Privacy
3. Networking and Communications
4. Systems and Computing
5. Multimedia and Computer Vision
6. Immersive Technologies and Services
7. Storage and Processing
*Important Dates:*
- Papers due: June 20, 2023 (extended from May 10, 2023)
- Acceptance notification: August 05, 2023
- Camera-ready paper due: August 20, 2023
- Workshop Proposal due: June 01, 2023
*Keynote Speakers:*
- *Walid Saad*, Virginia Tech, USA
- *Merouane Debbah*, TII, UAE
- *Ian F. Akyildiz*, Truva Inc., USA
*Submission Guidelines:*
There are three categories of submission:
- *Long papers:* 7-8 pages.
- *Short papers:* 5-6 pages.
- *Poster papers:* 1-2 pages (undergraduate).
*Submission Link:* https://easychair.org/my/conference?conf=imeta2023
*Organizing Committee*
*Steering Committee*
- Giancarlo Fortino, University of Calabria, Italy
- Albert Zomaya, University of Sydney, Australia
- Nirwan Ansari, New Jersey Institute of Technology, USA
- Salil Kanhere, UNSW Sydney, Australia
- Yaser Jararweh, JUST, Jordan
*General Co-Chair*
- Ilir Capuni, University of Montenegro, Montenegro
*Program Co-Chairs*
- Feras Awaysheh, University of Tartu, Estonia
- Gautam Srivastava, Brandon University, Canada
- Jun Wu, Shanghai Jiao Tong University, China and Waseda University,
Japan
- Moayad Aloqaily, MBZUAI, UAE
*Local Committee*
- Karl Lomp, University of Tartu, Estonia
- Madis Vasser, University of Tartu, Estonia
- Stefania Tomasiello, University of Tartu, Estonia
- Ulrich Norbisrath, University of Tartu, Estonia
**********************************************************
*
* Contributions to be spread via DMANET are submitted to
*
* DMANET@zpr.uni-koeln.de
*
* Replies to a message carried on DMANET should NOT be
* addressed to DMANET but to the original sender. The
* original sender, however, is invited to prepare an
* update of the replies received and to communicate it
* via DMANET.
*
* DISCRETE MATHEMATICS AND ALGORITHMS NETWORK (DMANET)
* http://www.zaik.uni-koeln.de/AFS/publications/dmanet/
*
**********************************************************