New AI Methods Group to spearhead adoption across four leading evidence synthesis organizations
We are delighted to announce a new, joint Methods Group between the Cochrane Collaboration, the Campbell Collaboration, JBI and the Collaboration for Environmental Evidence (CEE) focusing on artificial intelligence (AI) and automation in evidence synthesis.
The rapidly growing evidence base and the increasing complexity of methods make completing timely, high-quality, and comprehensive evidence synthesis more and more challenging. Artificial intelligence (AI) and automation promise to help address this and make it possible to keep up with the demand and expectations of users of evidence synthesis. But to realize this potential, we - as a collective across the whole evidence synthesis ecosystem - need to ensure AI doesn’t compromise the principles of research integrity on which evidence synthesis is built. This Methods Group will therefore help define and support responsible AI use across four of the leading evidence synthesis organizations: the Cochrane Collaboration, the Campbell Collaboration, JBI and the Collaboration for Environmental Evidence (CEE).
As a Methods Group, we aim to:
- Spearhead methods research and development and act as a bridge between evidence synthesis organizations and the wider research community.
- Define best practice and ensure guidance for accepted methods is up to date.
- Support the implementation of new or amended methods by acting as an advisor or through involvement in methods implementation in our respective evidence synthesis organizations.
We are part of the International Collaboration for Automation in Systematic Reviews (ICASR) and recommend that those who want to share and discuss AI methods research, developments and opportunities do so via the ICASR LinkedIn group, as we will as a Methods Group.
Across our Methods Group, we are also involved in other key developments in the evidence synthesis ecosystem, including the Evidence Synthesis Infrastructure Collaborative safe and responsible use of AI working group.
Our aim is for the Methods Group to work across these organizations and developments in the field and facilitate discussion and critical thinking, particularly around standards for accuracy, evaluations and validation, with events, webinars and other activities.
Defining best practice and ensuring guidance for accepted methods is up to date
We are involved in the responsible AI use in evidence synthesis recommendations and guidance (RAISE), which offers tailored advice for a diverse range of roles in the evidence synthesis ecosystem. Whether you're an evidence synthesist, methodologist, AI developer, or an organization or publisher involved in evidence synthesis, this guidance is a first step to help clarify your responsibilities and alleviate some of the concerns around AI use. For more information, see the RAISE Open Science Framework project page.
One of our first actions as a Methods Group will be to provisionally endorse the next version of the RAISE recommendations and guidance for use in Cochrane, the Campbell Collaboration, JBI and the Collaboration for Environmental Evidence (CEE), which is due to be released soon.
Supporting the implementation of new or amended methods by acting as an advisor or through involvement in methods implementation projects
The Methods Group includes individuals from across four of the major evidence synthesis organizations. Even though implementation of AI and automation is the responsibility of each organization individually, by coming together we aim to align best practice and share lessons learned on effective approaches. Our implementation will be based on the RAISE recommendations, with more details to be shared in the coming months.
As a first step, we are in the process of defining our position on AI and automation for researchers and authors within our organizations, to ensure they have clarity and are empowered to use them in their evidence synthesis.
Our implementation will also consider how we can improve AI literacy across our organizations, including how we can work with methodologists and trainers, so researchers and editors have the skills they need to ensure AI is used responsibly and reported transparently.
Methods Group Convenors:
- Ella Flemyng (Cochrane, UK)
- Gerald Gartlehner (University for Continuing Education Krems and Cochrane Austria, Austria)
- Zoe Jordan (JBI, Australia)
- Biljana Macura (Stockholm Environmental Institute and the Collaboration for Environmental Evidence, Sweden)
- Joerg Meerpohl (University of Freiburg and Cochrane Germany, Germany)
- Will Moy (Campbell, UK)
- Anna Noel Storr (Cochrane, UK)
- James Thomas (UCL, UK)
Want to find out more?
Register for the webinar on ‘Recommendations and guidance on responsible AI in evidence synthesis’ on 3 June 2025, part of the Artificial Intelligence (AI) methods in evidence synthesis series, to find out why we all need to embrace responsible AI and how this Methods Group will support it.
Also see our AI Methods Group website, which includes a news and events section. You can also follow each individual organization for more news and information as the Group develops.
Friday, February 28, 2025