Welcome to the AI Safety Collab Course
Do you want to help make AI beneficial to humanity? Explore AI Safety as a participant or facilitate discussions as a moderator in our comprehensive 8-week program.
Applications are open! Apply here
Course Highlights:
AI Governance Track by BlueDot Impact: Syllabus developed by researchers from Harvard, Oxford, and Cambridge.
AI Alignment Track by AI Safety Atlas: Created in collaboration with experts from OpenAI, Cambridge, and CeSIA.
Dates: March 31 - June 23, 2025 (8 weeks main course + 4 weeks optional project phase)
Format: Available online or in-person
Languages: Sessions in English and additional languages
Costs: Free of charge
Benefits: International networking, career opportunities, and a LinkedIn certificate upon completion
Application Deadline:
March 23, 2025, 23:59 (all time zones)
Applications submitted late may still be considered if this form remains open.
You choose between two tracks:

AI Governance Track:
Introduction to Foundational Concepts in Artificial Intelligence
Exploring the Positive Effects of AI
Understanding the Potential Perils of AI
Challenges in Controlling AI Systems
Policy Levers for AI Governance
Real-World AI Policy Analysis
Policy Proposal for Governing AI
Contributing to AI Governance

AI Alignment Track:
Capabilities
Risks
Strategies
Governance
Evaluations
Specification
Generalization
Scalable Oversight
Interpretability (optional)
Prerequisites for Applying:
A keen interest in "How can we make AI beneficial to humanity?"
No technical background is required (though one is helpful for the Alignment Track)
Basic English proficiency is required; translation tools such as DeepL are recommended
Open to all international applicants; non-EU applicants should consider time zone differences
New to AI Safety? Consider watching this introduction video: Introduction to AI Safety
Recommendations:
Alignment Track: Best for those with a CS, Math, or related technical background.
Governance Track: Ideal for those in Law, Economics, PR, or technical fields looking to explore regulatory aspects. There are also many interesting technical questions in AI governance.
Course Structure:
Duration: March 31 - June 23, 2025
Weekly Commitment: About 5 hours per week, including 2 hours for readings, 1 hour for exercises, and 1.5-2 hours for discussion sessions. Flexibility is provided during exam periods. Optional project phase after the main course.
Contact Team:
AI Safety Collab is an international initiative hosted by groups in multiple countries, in collaboration with ENAIS.
Email: contact@enais.co. We will strive to answer within 48 hours.
FAQ:
Concerned about the complexity of AI Safety?
We provide support at all learning levels to ensure success.
Worried about your English skills?
The course uses technical English, but peer support and translation tools are available to help.
Unsure about the time commitment?
We recommend dedicating approximately 5 hours per week to fully benefit from the course. If you cannot commit at least 3 hours per week, we wouldn't recommend applying.
Besides our curricula, there are plenty of options for learning about AI risk mitigation on your own and at your own pace. Here are some ideas:
Overview courses: Courses
Interactive AI Safety Guide with Chatbot: AI Safety Info
Simple & entertaining explainer about core terms: Lethal AI Guide