The 3rd International Conference on Foundation and Large Language Models
(FLLM2025)
*https://fllm-conference.org/2025/*
25-28 November, 2025 | Vienna, Austria
Hybrid Conference and Technically Co-Sponsored by IEEE Austrian Section
*FLLM 2025 CFP:*
With the emergence of foundation models (FMs) and large language models
(LLMs), which are trained on vast amounts of data at scale and are adaptable
to a wide range of downstream applications, artificial intelligence is
experiencing a paradigm shift. BERT, T5, ChatGPT, GPT-4, Falcon 180B,
Codex, DALL-E, Whisper, and CLIP now serve as the foundation for new
applications ranging from computer vision to protein sequence analysis, and
from speech recognition to coding. Earlier models typically had to be
trained from scratch for each new task. The capacity to experiment with,
examine, and understand the capabilities and potential of next-generation
FMs is critical to undertaking this research and guiding its path.
Nevertheless, these models are currently inaccessible to most researchers:
the resources required to train them are highly concentrated in industry,
and even the assets (data, code) required to replicate their training are
frequently not released because of their commercial value. At the moment,
mostly large technology companies such as OpenAI, Google, Facebook, and
Baidu can afford to build FMs and LLMs. Despite the widely publicized use
of FMs and LLMs, we still lack a comprehensive understanding of how they
operate, why they underperform, and what they are even capable of, given
their emergent qualities. To address these problems, we believe that much
of the critical research on FMs and LLMs will require extensive
multidisciplinary collaboration, given their fundamentally sociotechnical
nature.
The International Conference on Foundation and Large Language Models (FLLM)
addresses the architectures, applications, challenges, approaches, and
future directions of these models. We invite submissions of original papers
on all topics, with special interest in, but not limited to:
- *Architectures and Systems*
- Transformers and Attention
- Bidirectional Encoding
- Autoregressive Models
- Massive GPU Systems
- Prompt Engineering
- Multimodal LLMs
- Fine-tuning
- *Challenges*
- Hallucination
- Cost of Creation and Training
- Energy and Sustainability Issues
- Integration
- Safety and Trustworthiness
- Interpretability
- Fairness
- Social Impact
- *Future Directions*
- Generative AI
- Explainability and Explainable AI
- Retrieval Augmented Generation (RAG)
- Federated Learning for FLLM
- Large Language Models Fine-Tuning on Graphs
- Data Augmentation
- *Natural Language Processing Applications*
- Generation
- Summarization
- Rewriting
- Search
- Question Answering
- Language Comprehension and Complex Reasoning
- Clustering and Classification
- *Applications*
- Natural Language Processing
- Communication Systems
- Security and Privacy
- Image Processing and Computer Vision
- Life Sciences
- Financial Systems
*Call for Workshop Papers:*
- The Artificial Intelligence Models and Systems Symposium (AIMS)
<https://aims-conference.ai/2025/>
- The 2nd International Workshop on Advances of GenAI in Software
Engineering (GenAISE)
<https://fllm-conference.org/2025/Workshops/GenAISE2025>
- The 2nd International Workshop on Large Language Models for
Cybersecurity (LLMCS)
<https://fllm-conference.org/2025/Workshops/LLMCS2025/>
- The 2nd International Workshop on Prompt Engineering Large Language
Models (PromptEng)
<https://fllm-conference.org/2025/Workshops/PromptEng2025/>
- The 2nd International Workshop on Generative AI for Textual Document
Analysis (GENAIDOC)
<https://sites.google.com/view/genaidoc-workshop-fllm-2025>
- The 2nd International Workshop on Sustainable AI for Natural Language
Processing (SusAI)
<https://fllm-conference.org/2025/Workshops/SusAI2025/>
- The 2nd International Symposium on Generative Artificial Intelligence
(GenAI) <https://fllm-conference.org/2025/Workshops/GenAI2025/>
- The 2nd International Workshop on Artificial Intelligence for
Cybersecurity (AICS)
<https://fllm-conference.org/2025/Workshops/AICS2025/>
*Journal Special Issue:*
Selected high-quality papers will be invited for submission to a special
issue of *Information Processing & Management (impact factor: 6.9)*:
https://www.sciencedirect.com/journal/information-processing-and-management
*Submissions Guidelines and Proceedings*
Manuscripts should be prepared in 10-point font using the IEEE 8.5" x 11"
two-column format. All papers should be in PDF format and submitted
electronically via the Paper Submission Link. A full paper can be up to 8
pages (including all figures, tables, and references); extra pages (up to 4)
can be purchased for a fee. Submitted papers must present original,
unpublished work that is not under consideration for any other conference
or journal. Papers not following these guidelines may be rejected without
review; submissions received after the deadline, exceeding the length
limit, or not appropriately structured may also not be considered. Authors
may contact the Program Chair for further information or clarification. All
submissions are peer-reviewed by at least three reviewers. Accepted papers
will appear in the FLLM Proceedings, published by the IEEE Computer Society
Conference Publishing Services, and will be submitted to IEEE Xplore for
inclusion. Authors of accepted papers are expected to present their work at
the conference. Submitted papers that are deemed of good quality but cannot
be accepted as regular papers may be accepted as short papers.
*Important Dates:*
- *Paper submission deadline: July 31, 2025 (Extended)*
- Notification of acceptance: September 15, 2025
- Camera-ready Submission: October 10, 2025
*Contact:*
Please send any inquiries about FLLM to: <emergingtechnetwork@gmail.com> or
<info@fllm-conference.org>