The largest funder in existential risk reduction. Most funding is done via proactive research, but there are also frequent requests for proposals in certain areas. Previously called Open Philanthropy.
Type
Fund
Accepting applications
Yes – rolling basis
Funds organizations working on humanity's long-term survival and flourishing; best for mid- to large-scale projects. Speculation Grants are rolling; the full S-Process runs annually and is currently open.
Type
Fund
Accepting applications
Yes – closes 22 April 2026
Schmidt Sciences program making grants of up to $5M for technical research that improves our ability to understand, predict, and control risks from frontier AI systems.
Type
Grant program
Accepting applications
Yes – closes 17 May 2026
Large grantmaker funding individuals and small projects aiming to positively influence the long-term trajectory of civilization. Relatively straightforward application process.
Type
Fund
Accepting applications
Yes – rolling basis
Regranting platform where individual regranters have funding pools to direct towards publicly-listed projects.
Type
Platform
Accepting applications
Yes – rolling basis
Funding neglected approaches to AI alignment through grants and fiscal sponsorship to independent researchers, educational programs, and engineers developing mechanistic interpretability tools.
Type
Fund
Accepting applications
Yes – rolling basis
Funds impact-oriented religious initiatives to avert the risks from rapid AI development, as well as religious projects promoting pro-human visions of the future.
Type
Grant program
Accepting applications
Yes – closes 2 February 2026
Foresight's 'AI for Science & Safety Nodes' program offers funding, a community hub, and local compute in either San Francisco or Berlin.
Type
Grant program
Accepting applications
Yes – rolling basis
Pooled multi-donor fund structured to move fast and capture emerging opportunities, including in governance, technical alignment, and evaluations.
Type
Fund
Accepting applications
No
12- to 14-week program run by Fifty Years, helping scientists and engineers build startups working on AI safety.
Type
Incubator
Accepting applications
No
Provides $5k–$25k in project funding for journalists covering topics at the intersection of technology and society.
Type
Fund
Accepting applications
Yes – rolling basis
Aiming to reduce catastrophic risks from advanced AI through grants towards technical research, policy, and training programs for new researchers.
Type
Fund
Accepting applications
No
Aiming to increase the impact of effective altruism projects (including AI safety) by increasing their access to talent, capital, and knowledge.
Type
Fund
Accepting applications
Yes – rolling basis
Supporting digital content from creators raising awareness and understanding of ongoing AI developments and issues.
Type
Grant program
Accepting applications
Yes – rolling basis
Funding initiative giving $25M annually, backing people and ideas with the capital, strategic guidance, and networks they need to steer transformative AI toward beneficial outcomes.
Type
Fund
Accepting applications
No
Incubator helping start new organizations that steer transformative technology towards benefiting life and away from extreme large-scale risks.
Type
Incubator
Accepting applications
No
Provides financial support to future and current PhD students in the field of cooperative AI. Run by the Cooperative AI Foundation (CAIF).
Type
Grant-based fellowship
Accepting applications
No
2-year program for young people who want to build new things. Fellows skip or drop out of college to receive a grant and support from a network of founders, investors, and scientists.
Type
Grant-based fellowship
Accepting applications
Yes
Devises and executes bespoke giving strategies for major donors. Grants focus on reducing the risks posed by AI, nuclear war, and engineered pandemics.
Type
Fund
Accepting applications
No
For technologists who want to make an impact. Fellows attend summer retreats and receive access to project grants, as well as accountability and support mechanisms.
Type
Grant-based fellowship
Accepting applications
Yes – closes 9 February 2026
UK government organization running large-scale grant programs funding AI safety research. Collaborates with researchers and institutions to identify promising projects to fund.
Type
Grant program
Accepting applications
No
UK government R&D funding agency aiming to unlock scientific and technological breakthroughs that benefit everyone. Similar to DARPA in the US.
Type
Fund
Accepting applications
Yes – rolling basis
Helping entrepreneurs direct their charitable giving where it will do the most good by researching the world’s biggest problems and identifying the most effective solutions.
Type
Fund
Accepting applications
No
Grants of $1k–$20k to support journalism on AI and its impacts. Mainly focuses on written journalism, but also funds other formats.
Type
Grant program
Accepting applications
No
Incubating early-stage AI safety research organizations. The program involves co-founder matching, mentorship, and seed funding, culminating in an in-person building phase.
Type
Incubator
Accepting applications
No
Identifying leaders from business, policy, and academia, and helping them take on ambitious projects in AI safety.
Type
Incubator
Accepting applications
Yes – rolling basis
12-week program in San Francisco, USA, for pre-seed companies aiming to make an impact with AI safety and assurance technologies.
Type
Incubator
Accepting applications
No
Makes grants for studying the risks from powerful technologies and developing strategies for reducing them – through RFPs, contests, collaborations, and fellowships.
Type
Fund
Accepting applications
No
Charity foundation backed by a large philanthropic commitment, supporting research into improving the cooperative intelligence of advanced AI.
Type
Fund
Accepting applications
No
Initiative run by the Frontier Model Forum (i.e. frontier AI companies) to accelerate and expand the field of AI safety.
Type
Fund
Accepting applications
No
Private foundation funding AI governance: policymaker education on capabilities, policy design across a range of future scenarios, and civil society coordination on AI risk.
Type
Fund
Accepting applications
No
Supports projects and individuals aiming to address worst-case suffering risks from the development and deployment of advanced AI systems.
Type
Fund
Accepting applications
Yes – rolling basis
Supports the exploration of foundational technical topics that relate to the potential national security implications of AI over the long term.
Type
Grant program
Accepting applications
No
Network of donors funding charitable projects that work one level removed from direct impact, often cross-cutting between cause areas.
Type
Fund
Accepting applications
No
Helping philanthropists find, fund, and scale the most promising people and solutions to the world’s most pressing problems – including AI safety.
Type
Incubator
Accepting applications
No
Sponsors students and recent graduates to undertake independent research with mentor guidance or to serve as research assistants for law professors and other AI policy experts.
Type
Grant-based fellowship
Accepting applications
Yes – rolling basis
Year-long, paid position for researchers, engineers, entrepreneurs, and other professionals to pursue self-directed work on AI safety at the Constellation coworking space.
Type
Grant-based fellowship
Accepting applications
No
Empowering innovators and scientists to increase human agency by creating the next generation of responsible AI. Providing support, resources, and open-source software.
Type
Grant program
Accepting applications
Yes – rolling basis
Berlin-based foundation of tech investor Jan Beckers, funding research in LLM auditing/evaluations, interpretability, and global AI governance.
Type
Fund
Accepting applications
Yes – rolling basis
Small organization with a long history of finding people doing impactful work to prevent human-created existential risks and financially supporting them.
Type
Fund
Accepting applications
No
Swiss VC focused on reducing suffering risks, including those posed by catastrophic AI misuse and AI conflict. Makes both grants to nonprofits and investments in for-profit companies.
Type
Venture capitalist, Fund
Accepting applications
Yes – rolling basis
Run by the Mercatus Center, this program seeks to support entrepreneurs and brilliant minds with highly scalable, “zero to one” ideas for meaningfully improving society.
Type
Grant-based fellowship
Accepting applications
Yes – rolling basis
Evergreen VC and fund by Jaan Tallinn backing tech that supports humanity's long-term survival. Profits from the VC go towards funding AI safety nonprofits.
Type
Venture capitalist, Fund
Accepting applications
Yes – rolling basis
$100M fund by Menlo Ventures and Anthropic backing AI startups from seed to Series A, including those working on "trust and safety tooling".
Type
Venture capitalist
Accepting applications
Yes – rolling basis
VC firm investing in ethical founders developing transformative technologies that have the potential to impact humanity on a meaningful scale.
Type
Venture capitalist
Accepting applications
No
Private foundation providing grants to projects working to make the future with advanced AI go well. Also invests in for-profits in order to redeploy capital to AI safety nonprofits.
Type
Fund, Venture capitalist
Accepting applications
No
Foundation run by Dustin Moskovitz and Cari Tuna providing funding in a range of cause areas, including AI safety.
Type
Fund
Accepting applications
No
Program from Schmidt Sciences supporting exceptional people working on key opportunities and hard problems that are critical to get right for society to benefit from AI.
Type
Grant-based fellowship
Accepting applications
No
Funds research into the design and implementation of learning-enabled systems in which safety is ensured with high levels of confidence.
Type
Grant program
Accepting applications
No
Grant program providing funding to those raising awareness about AI risks or advocating for a pause in AI development.
Type
Fund
Accepting applications
No
Early-stage venture fund supporting startups developing tools to enhance AI safety. Provides both financial investment and mentorship.
Type
Venture capitalist
Accepting applications
Yes – rolling basis
VC firm investing in AI safety startups. Run by exited founders and backed by Reid Hoffman, Eric Ries, and Geoff Ralston.
Type
Venture capitalist
Accepting applications
Yes – rolling basis
VC investing in startups shaping the future of AI and humanity, committed to delivering financial returns while building a resilient and beneficial future.
Type
Venture capitalist
Accepting applications
Yes – rolling basis
Funding initiative housed at TED, helping entrepreneurs shape impactful ideas into viable multi-year plans and launch them to the world alongside visionary philanthropists.
Type
Incubator
Accepting applications
Yes – rolling basis
Private foundation of Elastic co-founder Steven Schuurman, providing €5M/year to various initiatives including AI safety.
Type
Fund
Accepting applications
No
Incubation program focusing on "defensive" tech – the idea that the most powerful solution to technological risk is often more technology.
Type
Incubator
Accepting applications
No
VC aiming to empower founders building a radically better world with safe AI systems by investing in ambitious teams with defensible strategies that can scale to post-AGI.
Type
Venture capitalist
Accepting applications
No
New VC fund by Igor Babuschkin (xAI co-founder). Backs AI safety research and startups building agentic systems.
Type
Venture capitalist
Accepting applications
Yes – rolling basis
Jed McCaleb's fund, making grants to organizations and projects in various cause areas including AI safety.
Type
Fund
Accepting applications
No
Crowdsourced charity evaluator and one-stop shop for AI safety funding applications. No longer active.
Type
Platform
Accepting applications
No
$5k–$50k grants for individuals building high-impact AI safety projects. Applicants must have completed the organization's AGI Strategy course.
Type
Grant program
Accepting applications
No
The EU has funded important AI safety research projects, and Tendery.ai offered free support in applying.
Type
Grant program
Accepting applications
No
Funded projects with a chance of substantially changing humanity's future trajectory for the better. Disbursed $8 million in the 2023 round but has been inactive since.
Type
Fund
Accepting applications
No
Fellowship run by Stripe for graduate students and early-career researchers interested in pursuing foundational academic research around the economics of AI.
Type
Grant-based fellowship
Accepting applications
No
Decentralized bounty platform offering prize money for delivering on specific AI safety projects.
Type
Platform
Accepting applications
No
Platform that took grant applications and presented them to a donor circle, whose members reached out if they were interested. Last round was in 2024.
Type
Platform
Accepting applications
No