Ethical and Epistemological Dilemmas in Doing Networked Learning Research with AI
DOI: https://doi.org/10.54337/nlc.v15.11135

Abstract
This round table continues the conversation initiated in the workshop Doing Networked Learning Research with AI, expanding it toward the ethical, relational, and epistemological dilemmas that arise when AI becomes a research collaborator rather than a mere tool. While the workshop focused on how to use generative AI in qualitative research processes—data collection, analysis, and interpretation—this round table invites participants to engage with deeper questions about what it means to do such research responsibly and reflexively.
We will discuss how AI’s participation in research assemblages reconfigures human–non-human relations, challenges conventional notions of ethical accountability, and complicates the epistemic boundaries of interpretation. Using Barad’s concept of the agential cut, we will explore how researchers make decisions about what—and who—counts in knowledge production, and what ethical consequences follow from those decisions.
Importantly, this session is a standalone discussion: while it directly follows the workshop and extends its themes, participants who did not attend the workshop will be fully able to engage and contribute. The framing and discussion prompts will ensure an inclusive, open entry point for all.
License
Copyright (c) 2026 Kyungmee Lee, Nina Bonderup Dohn, Nataša Lacković

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
CC BY-NC-ND
This license enables reusers to copy and distribute the material in any medium or format in unadapted form only, for noncommercial purposes only, and only so long as attribution is given to the creator. CC BY-NC-ND includes the following elements:
BY: credit must be given to the creator.
NC: Only noncommercial uses of the work are permitted.
ND: No derivatives or adaptations of the work are permitted.