Open call to apply for AlgorithmWatch’s reporting fellowship on AI and power
For the fifth time, AlgorithmWatch is looking for new Algorithmic Accountability Reporting fellows. Apply now if you have research ideas on the relation between Artificial Intelligence and power, and its consequences. The application deadline is 15 September 2025.

The call for applications is now closed.
The expansion of generative AI infrastructure, the influence of big tech executives in public administrations, and their meddling in regulation have become pressing topics. Therefore, algorithmic and AI accountability reporting is more relevant than ever. In the framework of its ongoing reporting fellowship, AlgorithmWatch is calling on journalists and researchers across Europe to join the new cohort of reporters to unveil automated injustice and investigate the relation between AI and power.
The new round of the fellowship will run from November 2025 to May 2026. During this period, AlgorithmWatch will provide financial support, mentoring sessions, and organizational resources. We expect our fellows to produce at least one journalistic story, audio or video feature, research report, or a similar output by the end of the fellowship. AlgorithmWatch will also support the fellows in the publication process.
Following the line of investigation of the previous cohort, we will focus on the power structures and influence dynamics underpinning the use and development of AI. This opens up a wide range of possible investigation topics around the impact of AI on society, including algorithmic discrimination. Possible research topics include:
- Influence of big tech corporations on politics and the drafting of legislation, e.g., identifying pressure groups that manage to slow down the development or limit the scope of specific regulations, such as the watering down of the EU AI Act.
- Structural and systemic oppression of women, racialized people, or other vulnerable groups, exacerbated by the use of AI, e.g., automated software in public administration that systematically targets non-nationals in fraud detection.
- Powerful industry actors, such as big tech companies, facilitating the surveillance of groups, e.g., analyzing major contracts between regional or national governments and private firms that allow the mass scanning of citizens' personal data with automated tools, such as computer vision algorithms or remote biometric identification (RBI).
- Discrimination by automated systems in the fields of: health, education, finance, work, human resources, etc., e.g., public authorities using AI-powered apps to review welfare benefits.
- Exploitation of data workers, and the use of algorithmic management platforms to systematically replace traditional labor models, e.g., hospitals automatically allocating care services to cut employment costs; the erosion of creative freelance work caused by AI tools in translation, illustration, graphic design, marketing, etc.
- The political impact of European decision-makers and lobbyists who promote longtermism and effective altruism ideologies.
Candidates are allowed to pitch a joint proposal with a research partner. In that case, the money will be divided equally. We will also evaluate potential connections between proposals and may suggest that shortlisted candidates work together. In previous rounds, cross-border collaboration and joint research projects have proven successful, resulting in investigations on the expansion of data center infrastructure and algorithmic discrimination in the financial industry. The application deadline is 15 September 2025, 23:59 CET.
We value stories that provide empirical cases of AI’s social impact to demonstrate how automated systems are already affecting people at an individual or collective level. We are also looking for data-driven stories on the impact of AI development, e.g., analysis of a given infrastructure’s energy consumption or automated systems' performance.
We will host two Q&A sessions on Zoom to answer questions: 20 August at 11:00 CET and 8 September at 18:00 CET. Please find the link to each meeting attached to the respective date.
What to expect
The fellowship will start on 10 November 2025 and end on 10 May 2026.
We will choose a maximum of six applicants, who will each receive a total of 7,400€ (gross) to conduct their research. Candidates can present a joint proposal with a research partner; in this case, the money will be divided equally among the participants.
Fellows will be free to choose the media outlet for the publication and also decide whether they wish to sell their stories. They can otherwise be published on AlgorithmWatch’s platforms. The fellowship includes outreach support. Optionally, the fellowship will also provide mentorship sessions with AlgorithmWatch team members and external researchers in the algorithmic accountability field.
Fellows will also be invited to an in-person gathering in Berlin when the fellowship starts.
Who can apply
Any person above 18 is welcome to apply. We strongly encourage people from minoritized or marginalized groups and communities to apply.
Applicants do not need a background in computer science. Just as you do not need a degree in climate science to report on the climate crisis, the effects of automated systems can be researched by non-technical people. We do expect applicants to be familiar with the algorithmic field and to have experience in writing and in working with journalists.
There are some specific requirements the applicants must fulfill:
- Residence in a European Union country, an EFTA country (Iceland, Liechtenstein, Norway, or Switzerland), an EU candidate country, or a former EU member state.
- Written English at a B2 level in the Common European Framework of Reference for Languages.
- A very strong interest in the topic of AI accountability and automated decision-making.
- A commitment to complete the research within the timeline of the fellowship and to deliver at least one journalistic product, such as an article, an audio or video feature, a report or similar.
How to apply
Please take into account the following guidelines before completing your proposal:
APPROACH TO THE RESEARCH
We are looking for journalistic research and stories that follow a narrative – as opposed to theoretical hypotheses or academic research. Practical, real-life cases will be positively valued.
Here we ask you to provide an overview of the story you’d like to research, along with your research plan and goals. The central topic of the fellowship is the relation between AI and power structures, as well as the value chain AI is built on. We are looking for research projects that:
- Focus on the influence and power structure of AI.
- Take place in Europe.
- Take into account the impact of AI on society.
- Bring new information to light, or provide the point of view of people who are rarely given a voice in debates on AI.
What we are not looking for:
- Information on the commercialisation and manufacturing of tech products, such as hardware (e.g. “Nvidia releases new version of microchips”).
- Major tech announcements without research into their impact on society (e.g. “Meta plans to open a data center in X”).
- Theoretical and/or academic research (e.g. “What is the political economy of AI”).
Please read the FAQ section below. If you have further doubts about whether your proposal fits the scope of the fellowship, please email us at bellio@algorithmwatch.org.
FAQs
Are you offering an employment contract?
No. The allowance is paid against invoices. If fellows are unable to invoice, we will work with them to find a solution.
Who will own the copyright to the reporting I do?
You will have to publish the work under a CC-BY license.
Will I work together with AlgorithmWatch?
Yes! AlgorithmWatch will coordinate the work of the fellows, and fellows will be invited to connect with other members of the organization.
Will I work together with other fellows?
Yes, this is an option, and we strongly encourage applicants to propose joint projects. We will also hold at least one monthly meeting with all the fellows.
Will there be in-person meetings?
Yes. Fellows will meet together at least once, most likely at the beginning of the fellowship period.
Do you provide office space for fellows?
No.
Can I participate in the fellowship for less than 6 months?
We expect fellows to complete the full 6 months of the program, but we can offer some flexibility.
Do I have to publish in English?
No. You can publish in your own language, but communication within the fellowship is in English.
Is there an age limit?
Anyone above 18 is welcome to apply.
Can I apply although I’m not a journalist?
Yes.
Can I apply if I’m a student?
Yes.
Can I apply if I’m working as staff in a newsroom?
Yes, but make sure that the fellowship is compatible with your work and your media’s agreements.
Can I apply if I do not have a work permit (e.g. asylum-seeker)?
Yes, but you should check that you are allowed to take part in such a program.
What countries are EU members, former members, candidates, or EFTA countries?
Albania, Austria, Belgium, Bosnia and Herzegovina, Bulgaria, Croatia, Cyprus, Czechia, Denmark, Estonia, Finland, France, Georgia, Germany, Greece, Hungary, Iceland, Ireland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, Moldova, Montenegro, Netherlands, North Macedonia, Norway, Poland, Portugal, Romania, Serbia, Slovakia, Slovenia, Spain, Sweden, Switzerland, Türkiye, Ukraine, United Kingdom.
Can I apply if I live outside of these countries?
No.
Can I apply if I’m a national of these countries but don’t live there at the moment?
No.
Can I work on stories besides the main investigation during the fellowship?
We are open to working with the fellows on other stories for additional remuneration, to be discussed on a case-by-case basis. The fellowship does not preclude participation in other reporting grants or programmes.
Will AlgorithmWatch reimburse the travel expenses I incur?
This can also be discussed on a case-by-case basis if needed.
AlgorithmWatch is an advocacy organization. Will I have to do advocacy?
No. Reporting and advocacy are separate activities.
The Algorithmic Accountability Reporting fellowship is supported by:
