CISPA Helmholtz Center for Information Security
Postdoctoral Position in Collaborative, Distributed, and Scalable Machine Learning
CISPA Helmholtz Center for Information Security, Saarbrücken, Germany
Organisation/Company: CISPA Helmholtz Center for Information Security
Research Field: Computer science
Researcher Profile: Recognised Researcher (R2)
Positions: Postdoc Positions
Country: Germany
Application Deadline: 6 Dec 2025 - 12:00 (Europe/Berlin)
Type of Contract: Temporary
Job Status: Full-time
Is the job funded through the EU Research Framework Programme? Not funded by an EU programme
Is the job related to a staff position within a Research Infrastructure? No
Offer Description

The group of Sebastian Stich works on the design and analysis of optimization methods for machine learning, with research topics ranging from decentralized and federated optimization, adaptive stochastic algorithms, and generalization in deep learning to robustness, privacy, and security aspects of modern ML systems.
Current research directions include the ERC Consolidator Grant CollectiveMinds, which explores how multiple smaller models can collaborate, adapt, and forget outdated knowledge intelligently to make learning more sustainable and efficient, and a DFG-funded project on distributed optimization and scalable training of deep neural networks, including transformer architectures.
We invite applications from strong and creative researchers in machine learning or optimization who are eager to engage in a collaborative and interdisciplinary environment. Postdocs in our group are encouraged to pursue their own research ideas while contributing to ongoing projects.
We value initiative and scientific independence, and we expect candidates to describe in their cover letter how their research interests connect to our group’s work and how this position supports their career development goals.
Possible research topics include (but are not limited to):
Optimization algorithms for machine learning (stochastic, adaptive, distributed, or large‑scale)
Theoretical foundations of deep learning and generalization
Federated, collaborative, and decentralized learning
Continual learning, model adaptation, and knowledge sharing
Training and optimization of large models (including transformer architectures)
Efficiency and sustainability in large‑scale ML
Robustness, fairness, and privacy in learning systems
Novel directions at the intersection of optimization, theory, and practical ML