
    AI Safety Institute – Systemic Safety Grants

    Further to the AI Seoul Summit announcement, this programme will fund researchers who collaborate with the UK government to advance systemic approaches to AI safety. The grants aim to address the risks that AI poses to people and society by looking beyond the capabilities of the models themselves: AI risk management must also include understanding the impact on the systems and infrastructure in which AI models operate.

    Opening date: 22 May 2024, 12:00am

    Closing date: 1 Apr 2026, 12:00am

    In partnership with UKRI, the UK AI Safety Institute (AISI) will shortly invite researchers to submit grant proposals that directly address a problem in systemic AI safety or improve our understanding of it. We will prioritise applications that offer concrete, actionable approaches to significant systemic risks from AI.

    Because AI presents novel risks to all areas of society, we welcome proposals from researchers in any field. The only eligibility requirements are that you are:

    • Associated with a host organisation that can receive the awarded funds. This could be a university, a business, a civil society organisation, or part of government.

    • Based in the UK (but teams may involve international collaborators). Applicants must have a collaboration with a UK legal entity.

    Details of submitted proposals will be shared with UK Research and Innovation and The Alan Turing Institute. Details may also be shared with future partners and other AI Safety Institutes.

    If you would like to receive more information about the grant call, please register at https://www.aisi.gov.uk/grants.

    The initial goals are:

    1. To crowdsource ideas from academics, policymakers and entrepreneurs about which challenges AI is likely to pose for societal systems and infrastructure.

    2. To build a community of researchers with expertise in this area and found a new field of research focused on this problem. This grant will support the wider AI safety ecosystem by funding groups of which we have little awareness or which we have so far underestimated.

    3. To support closer ways of working between governments and the research community, ensuring that outputs from this research programme (e.g., new technical tools) can be rapidly taken up by governments.

    The programme’s application details will be live in due course. To sign up and receive updates about this grant programme, please follow this link: https://www.aisi.gov.uk/grants

    Applicants may submit only one bid as the lead organisation via the online portal (https://www.aisi.gov.uk/grants). However, applicants may participate in other bids as delivery partners.

    Please direct any clarification questions to the mailbox: AISIgrants@dsit.gov.uk

    IMPORTANT NOTICE: 

    Please be advised that the maximum and minimum amounts referred to under the sub-heading "How much you can get" do not reflect the true amounts. These will be updated in due course.
