Funding

These funders offer financial support to organizations and individuals working on AI safety.

Science of Trustworthy AI

Schmidt Sciences program making grants of up to $5M for technical research that improves our ability to understand, predict, and control risks from frontier AI systems.

Type: Grant program
Accepting applications: Yes – closes 17 May 2026

Long-Term Future Fund (LTFF)

Large grantmaker funding individuals and small projects aiming to positively influence the long-term trajectory of civilization. Relatively straightforward application process.

Type: Fund
Accepting applications: Yes – rolling basis

Manifund

Regranting platform where individual regranters have funding pools to direct towards publicly-listed projects.

Type: Platform
Accepting applications: Yes – rolling basis

AI Alignment Foundation (AIAF)

Funds neglected approaches to AI alignment through grants and fiscal sponsorship for independent researchers, educational programs, and engineers developing mechanistic interpretability tools.

Type: Fund
Accepting applications: Yes – rolling basis

Future of Life Institute (FLI): religious projects

Funds impact-oriented religious initiatives to avert the risks from rapid AI development, as well as religious projects promoting pro-human visions of the future.

Type: Grant program
Accepting applications: Yes – closes 2 February 2026

Foresight Institute

Foresight's 'AI for Science & Safety Nodes' program offers funding, a community hub, and local compute in either San Francisco or Berlin.

Type: Grant program
Accepting applications: Yes – rolling basis

AI Safety Tactical Opportunities Fund (AISTOF)

Pooled multi-donor fund structured to move fast and capture emerging opportunities, including in governance, technical alignment, and evaluations.

Type: Fund
Accepting applications: No

5050

12–14-week program run by Fifty Years, helping scientists and engineers build startups working on AI safety.

Type: Incubator
Accepting applications: No

Omidyar Network: Tech Journalism Fund

Provides $5k–25k project funding for journalists covering topics related to how technology and society intersect.

Type: Fund
Accepting applications: Yes – rolling basis

AI Risk Mitigation (ARM) Fund

Aiming to reduce catastrophic risks from advanced AI through grants towards technical research, policy, and training programs for new researchers.

Type: Fund
Accepting applications: No

Effective Altruism Infrastructure Fund (EAIF)

Aiming to increase the impact of effective altruism projects (including AI safety) by increasing their access to talent, capital, and knowledge.

Type: Fund
Accepting applications: Yes – rolling basis

Future of Life Institute (FLI): Digital Media Accelerator

Supporting digital content from creators raising awareness and understanding of ongoing AI developments and issues.

Type: Grant program
Accepting applications: Yes – rolling basis

Astralis Foundation

Funding initiative with $25M annual giving, backing people and ideas with the funding, strategic guidance, and networks they need to steer transformative AI toward beneficial outcomes.

Type: Fund
Accepting applications: No

Future of Life Foundation (FLF)

Incubator helping start new organizations that steer transformative technology towards benefiting life and away from extreme large-scale risks.

Type: Incubator
Accepting applications: No

Cooperative AI PhD Fellowship

Provides financial support to future and current PhD students in the field of cooperative AI. Run by the Cooperative AI Foundation (CAIF).

Type: Grant-based fellowship
Accepting applications: No

Thiel Fellowship

2-year program for young people who want to build new things. Fellows skip or drop out of college to receive a grant and support from a network of founders, investors, and scientists.

Type: Grant-based fellowship
Accepting applications: Yes

Longview Philanthropy

Devises and executes bespoke giving strategies for major donors. Grants focus on reducing the risks posed by AI, nuclear war, and engineered pandemics.

Type: Fund
Accepting applications: No

Interact Fellowship

For technologists who want to make an impact. Fellows attend summer retreats and receive access to project grants, along with accountability and support mechanisms.

Type: Grant-based fellowship
Accepting applications: Yes – closes 9 February 2026

UK AI Security Institute (AISI)

UK government organization running large-scale grant programs funding AI safety research. Collaborates with researchers and institutions to identify promising projects to fund.

Type: Grant program
Accepting applications: No

Advanced Research + Invention Agency (ARIA)

UK government R&D funding agency aiming to unlock scientific and technological breakthroughs that benefit everyone. Similar to DARPA in the US.

Type: Fund
Accepting applications: Yes – rolling basis

Founders Pledge

Helping entrepreneurs direct their charitable giving where it will do the most good by researching the world’s biggest problems and identifying the most effective solutions.

Type: Fund
Accepting applications: No

Tarbell: AI Reporting Grants

Grants of $1k–20k to support journalism on AI and its impacts. Mainly focuses on written journalism, but also funds other formats.

Type: Grant program
Accepting applications: No

Catalyze Impact

Incubating early-stage AI safety research organizations. The program involves co-founder matching, mentorship, and seed funding, culminating in an in-person building phase.

Type: Incubator
Accepting applications: No

Halcyon Futures

Identifying leaders from business, policy, and academia, and helping them take on ambitious projects in AI safety.

Type: Incubator
Accepting applications: Yes – rolling basis

Seldon Lab

12-week program in San Francisco, USA, for pre-seed companies aiming to make an impact with AI safety and assurance technologies.

Type: Incubator
Accepting applications: No

Future of Life Institute (FLI)

Makes grants for studying the risks from powerful technologies and developing strategies for reducing them – through RFPs, contests, collaborations, and fellowships.

Type: Fund
Accepting applications: No

Cooperative AI Foundation (CAIF)

Charity foundation backed by a large philanthropic commitment, supporting research into improving the cooperative intelligence of advanced AI.

Type: Fund
Accepting applications: No

AI Safety Fund (AISF)

Initiative run by the Frontier Model Forum (i.e. frontier AI companies) to accelerate and expand the field of AI safety.

Type: Fund
Accepting applications: No

Laidir Foundation

Private foundation funding AI governance: policymaker education on capabilities, policy design across a range of future scenarios, and civil society coordination on AI risk.

Type: Fund
Accepting applications: No

Center on Long-Term Risk (CLR) Fund

Supports projects and individuals aiming to address worst-case suffering risks from the development and deployment of advanced AI systems.

Type: Fund
Accepting applications: Yes – rolling basis

CSET: Foundational Research Grants

Supports the exploration of foundational technical topics that relate to the potential national security implications of AI over the long term.

Type: Grant program
Accepting applications: No

Meta Charity Funders

Network of donors funding charitable projects that work one level removed from direct impact, often cross-cutting between cause areas.

Type: Fund
Accepting applications: No

Ergo Impact

Helping philanthropists find, fund, and scale the most promising people and solutions to the world’s most pressing problems – including AI safety.

Type: Incubator
Accepting applications: No

Vista Institute for AI Policy

Sponsors students and recent graduates to undertake independent research with mentor guidance or to serve as research assistants for law professors and other AI policy experts.

Type: Grant-based fellowship
Accepting applications: Yes – rolling basis

Constellation Residency

Year-long, paid position for researchers, engineers, entrepreneurs, and other professionals to pursue self-directed work on AI safety at the Constellation coworking space.

Type: Grant-based fellowship
Accepting applications: No

AE Studio Research

Empowering innovators and scientists to increase human agency by creating the next generation of responsible AI. Providing support, resources, and open-source software.

Type: Grant program
Accepting applications: Yes – rolling basis

Global Technology Risk (GTR) Foundation

Berlin-based foundation of tech investor Jan Beckers, funding research in LLM auditing/evaluations, interpretability, and global AI governance.

Type: Fund
Accepting applications: Yes – rolling basis

Saving Humanity from Homo Sapiens (SHfHS)

Small organization with a long history of finding people doing impactful work to prevent human-created existential risks and financially supporting them.

Type: Fund
Accepting applications: No

Macroscopic Ventures

Swiss VC focused on reducing suffering risks, including those posed by catastrophic AI misuse and AI conflict. Makes both grants to nonprofits and investments in for-profit companies.

Type: Venture capitalist, Fund
Accepting applications: Yes – rolling basis

Emergent Ventures

Run by the Mercatus Center, this program seeks to support entrepreneurs and brilliant minds with highly scalable, “zero to one” ideas for meaningfully improving society.

Type: Grant-based fellowship
Accepting applications: Yes – rolling basis

Metaplanet

Evergreen VC and fund by Jaan Tallinn backing tech that supports humanity's long-term survival. Profits from the VC go towards funding AI safety nonprofits.

Type: Venture capitalist, Fund
Accepting applications: Yes – rolling basis

Anthology Fund

$100M fund by Menlo Ventures and Anthropic backing AI startups from seed to Series A, including those working on "trust and safety tooling".

Type: Venture capitalist
Accepting applications: Yes – rolling basis

Lionheart Ventures

VC firm investing in ethical founders developing transformative technologies that have the potential to impact humanity on a meaningful scale.

Type: Venture capitalist
Accepting applications: No

Astera Institute

Private foundation providing grants to projects working to make the future with advanced AI go well. Also invests in for-profits in order to redeploy capital to AI safety nonprofits.

Type: Fund, Venture capitalist
Accepting applications: No

Good Ventures (GV)

Foundation run by Dustin Moskovitz and Cari Tuna providing funding in a range of cause areas, including AI safety.

Type: Fund
Accepting applications: No

AI2050

Program from Schmidt Sciences supporting exceptional people working on key opportunities and hard problems that are critical to get right for society to benefit from AI.

Type: Grant-based fellowship
Accepting applications: No

National Science Foundation (NSF): Safe Learning-Enabled Systems

Funds research into the design and implementation of learning-enabled systems in which safety is ensured with high levels of confidence.

Type: Grant program
Accepting applications: No

Nonlinear AI Safety Advocacy Grants

Grant program providing funding to those raising awareness about AI risks or advocating for a pause in AI development.

Type: Fund
Accepting applications: No

Safe AI Fund (SAIF)

Early-stage venture fund supporting startups developing tools to enhance AI safety. Provides both financial investment and mentorship.

Type: Venture capitalist
Accepting applications: Yes – rolling basis

Juniper Ventures

VC firm investing in AI safety startups. Run by exited founders and backed by Reid Hoffman, Eric Ries, and Geoff Ralston.

Type: Venture capitalist
Accepting applications: Yes – rolling basis

Auspicious Fund

VC investing in startups shaping the future of AI and humanity, committed to delivering financial returns while building a resilient and beneficial future.

Type: Venture capitalist
Accepting applications: Yes – rolling basis

The Audacious Project

Funding initiative housed at TED helping entrepreneurs shape impactful ideas into viable multi-year plans and launching them to the world alongside visionary philanthropists.

Type: Incubator
Accepting applications: Yes – rolling basis

The Dreamery Foundation

Private foundation of Elastic co-founder Steven Schuurman, providing €5M/year to various initiatives including AI safety.

Type: Fund
Accepting applications: No

Def/acc at Entrepreneur First

Incubation program focusing on "defensive" tech – the idea that the most powerful solution to technological risk is often more technology.

Type: Incubator
Accepting applications: No

Mythos Ventures

VC aiming to empower founders building a radically better world with safe AI systems by investing in ambitious teams with defensible strategies that can scale to post-AGI.

Type: Venture capitalist
Accepting applications: No

Babuschkin Ventures

New VC fund by Igor Babuschkin (xAI co-founder). Backs AI safety research and startups building agentic systems.

Type: Venture capitalist
Accepting applications: Yes – rolling basis

The Navigation Fund

Jed McCaleb's fund, making grants to organizations and projects in various cause areas including AI safety.

Type: Fund
Accepting applications: No

GiveWiki

Crowdsourced charity evaluator and one-stop shop for AI safety funding applications. No longer active.

Type: Platform
Accepting applications: No

BlueDot AGI Strategy Fund

$5–50k grants for individuals building high-impact AI safety projects. Applicants must have completed their AGI Strategy course.

Type: Grant program
Accepting applications: No

EU AI Safety Tenders

The EU has funded AI safety research projects through public tenders, with Tendery.ai offering free support with applications.

Type: Grant program
Accepting applications: No

Lightspeed Grants

Funded projects with a chance of substantially changing humanity's future trajectory for the better. Disbursed $8 million in its 2023 round but has been inactive since.

Type: Fund
Accepting applications: No

Economics of AI Fellowship

Fellowship run by Stripe for graduate students and early-career researchers interested in pursuing foundational academic research around the economics of AI.

Type: Grant-based fellowship
Accepting applications: No

Superlinear Prizes

Decentralized bounty platform offering prize money for delivering on specific AI safety projects.

Type: Platform
Accepting applications: No

Nonlinear Network

Platform that took grant applications and presented them to a donor circle, whose members reached out if interested. Last round was in 2024.

Type: Platform
Accepting applications: No

Coefficient Giving (CG)

The largest funder in existential risk reduction. Most funding is via proactive research, but there are also frequent requests for proposals in certain areas. Previously called Open Philanthropy.

Type: Fund
Accepting applications: Yes – rolling basis

Survival and Flourishing Fund (SFF)

Provides financial support to organizations working to improve humanity’s long-term prospects for survival and flourishing. Speculation Grants accept applications on a rolling basis; the full S-Process runs annually.

Type: Fund
Accepting applications: Yes – rolling basis
