EPFL, 1015 Lausanne

Research Engineer - NLP & Large Language Models

Fixed-term

Description

EPFL, the Swiss Federal Institute of Technology in Lausanne, is one of the most dynamic university campuses in Europe and ranks among the top 20 universities worldwide. EPFL employs more than 6,500 people supporting the three main missions of the institution: education, research and innovation. The EPFL campus offers an exceptional working environment at the heart of a community of more than 17,000 people, including over 12,500 students and 4,000 researchers from more than 120 different countries.

About the Role

We are seeking a Research Engineer in Natural Language Processing (NLP) and Large Language Models (LLMs) to contribute to the design, training, and evaluation of next-generation foundation models. The role sits at the intersection of research and production-grade engineering, with a strong emphasis on post-training, multimodality, and advanced generative modeling techniques, including diffusion-based approaches. You will work closely with researchers and applied scientists to translate novel ideas into scalable, reproducible systems, and to push the state of the art in open, responsible, and multilingual AI.
Key Responsibilities

- Design, implement, and maintain training and post-training pipelines for large language and multimodal models (e.g., instruction tuning, alignment, preference optimization)
- Conduct research and engineering on post-training methods
- Contribute to multimodal modeling, integrating text with modalities such as vision, speech, or audio
- Explore and apply diffusion-based models and hybrid generative approaches for language and multimodal representation learning
- Optimize large-scale training and inference
- Develop evaluation pipelines and benchmarks for language understanding, reasoning, alignment, and multimodal performance
- Collaborate with researchers to prototype new ideas, reproduce results from the literature, and contribute to publications or technical reports
- Ensure code quality, reproducibility, and documentation suitable for long-term research and open-source release

Required Qualifications

- MSc or PhD in Computer Science, Machine Learning, AI, or a related field (or equivalent practical experience)
- Strong background in NLP and deep learning, with hands-on experience working with large language models
- Solid programming skills in Python, with experience using modern ML frameworks (e.g., PyTorch)
- Experience working with open-weight or open-data models, including releasing models, datasets, or benchmarks
- Familiarity with post-training techniques for LLMs (e.g., instruction tuning, preference optimization, alignment)
- Strong experimental rigor: ability to design controlled experiments, analyze results, and iterate efficiently

Desired / Bonus Qualifications

- Experience with diffusion models (e.g., text diffusion, latent diffusion, or multimodal diffusion)
- Hands-on work on multimodal models (e.g., text-image, text-audio, speech-language systems)
- Exposure to LLM alignment, safety, or evaluation beyond standard language modeling metrics
- Experience with distributed training and large-scale model experimentation
- Familiarity with multilingual or low-resource language settings
- Contributions to open-source ML or published research in NLP, multimodality, or generative modeling

What We Offer

- A research-driven environment with access to large-scale compute and modern ML infrastructure
- Close collaboration with leading researchers in NLP, multimodality, and generative modeling
- The opportunity to work on foundational, open, and socially responsible AI systems
- Support for publishing research, contributing to open-source projects, and engaging with the broader research community
- Competitive compensation and benefits, commensurate with experience

Information

Contract Start Date: to be agreed upon
Activity Rate: 100%
Duration: 1 year, renewable
Contract Type: Fixed-term contract (CDD)

Skills

PyTorch, NLP, LLM, Python, Machine Learning, Deep Learning, ML, AI, Natural Language Processing