Opening address and "Generative AI, Large Language Models and Integrity Implications for Research"


ORCID

https://orcid.org/0000-0001-8412-5553

Document Type

Lecture

Creative Commons License

Creative Commons Attribution 4.0 International License
This work is licensed under a Creative Commons Attribution 4.0 International License.

CIT Disciplines

Library science

Disciplines

Library and Information Science

Publication Details

Welcome speech from MTU Research and Compliance Officer Séan Lacey, followed by a presentation with accompanying slides delivered by Prof. Tomáš Foltýnek at the MTU event "Generative AI, Large Language Models and Integrity Implications for Research".

Abstract

The integration of generative AI and large language models (LLMs) into research workflows presents both opportunities and challenges. This talk will begin with a concise and accessible explanation of the underlying technologies, ensuring clarity for a non-technical audience. The focus will then shift to the ethical implications of using AI tools across different phases of the research process. Key ethical issues will be addressed, including the potential for bias in AI models, the challenges of transparency and explainability in complex systems, and the importance of protecting data privacy and securing proper consent when working with sensitive datasets. Additionally, I will explore the impacts of AI on research reproducibility and integrity, particularly when AI systems generate non-deterministic outputs. The talk will also cover the broader regulatory and legal implications of AI use in research, along with the importance of interdisciplinary collaboration between AI experts and domain researchers to ensure the ethical and effective use of these tools. Finally, I will propose some solutions to these ethical concerns, such as thorough verification of LLM outputs, proper testing, a focus on responsibility and accountability, and improvements to review processes.

