How to use your career to mitigate AGI risk

There are a number of great organisations working on this issue from a variety of angles, which means many skill sets are valuable.

This includes organisations working on AI policy and governance, like the Center for AI Safety, the nonprofit that wrote the Statement on AI Risk letter. They've hired policy experts, writers, research engineers, and operations staff such as finance managers and development officers.

Other organisations work on technical AI safety, like Model Evaluation and Threat Research, a team that evaluates the capabilities and alignment of advanced AI models. They've hired software engineers, researchers, and operations staff such as recruiters and administrators.

Some organisations work indirectly on the issue, like Blueprint Biosecurity, which works to achieve breakthroughs in humanity's capacity to prevent, mitigate, and suppress pandemics. Engineered pandemics are one potential avenue of attack for future terrorists or misaligned AI, so work in this area is valuable. They've hired researchers and policy analysts, as well as people in communication and administrative roles.

Finally, there are organisations working on growing the field, like BlueDot Impact (and us!). BlueDot Impact runs courses that aim to help participants develop the knowledge, community, and network needed to pursue high-impact careers. They've hired teaching fellows, program managers, software engineers, operations specialists, and more.

As an introduction to the field, see our job board for a list of organisations that we think have particularly promising roles.