Guide for Using Generative AI in Research

School of Law, University of KwaZulu-Natal

Recommended by the School Research and Higher Degrees Committee, 28 February 2024
Approved by the School Board, 11 March 2024

Application

This guide applies to research conducted by academic staff members of the School of Law, University of KwaZulu-Natal. Separate guides will be developed to govern the use of generative AI in teaching and learning, and in research by students. For clarity, this guide is not intended for student use.

Custodian

The custodian of this guide is the School Research and Higher Degrees Committee.

Dynamic nature

As the world of AI is developing apace, this guide is intended to be dynamic and to be revised regularly. Whenever the School Research and Higher Degrees Committee updates this guide, the updates will be highlighted in a general email to all academic staff members, and at the subsequent general academic staff meeting by the Academic Leader for Research and Higher Degrees or by a staff member mandated by the Committee.

Principles

  1. Appropriate Use: Legal academics are encouraged to use generative AI in appropriate ways.
    1. Generative AI may appropriately be used for routine research tasks, such as summarising articles, improving structure and language, or transcribing interviews.
    2. Generative AI outputs should never substitute for critical research tasks requiring original thought and analysis.
    3. Legal academics should be aware of the limitations of AI in generating original research, especially in drawing conclusions. Legal academics should always draw their own conclusions.
  2. Independent critical thinking: When using generative AI in research, legal academics should adopt as their primary goal the unbiased search for truth, and should apply rigorous independent thinking.
    1. Be aware that generative AI systems may prioritise high-citation articles in their search results, which is likely to draw attention away from under-represented scholars’ work. Also be aware that generative AI may provide responses which are biased in favour of dominant ideologies and narratives and which may side-line local South African and African perspectives.
    2. Legal academics should conduct their own thorough literature searches and their own independent analyses. More particularly, legal academics should, where appropriate in their search for truth, proactively seek out, engage with, and cite the work of fellow South African and African scholars.
    3. Legal academics should always consider the responses of a generative AI critically, and assess whether such responses are sufficiently representative of our local context and culture.
  3. Academic Integrity: Legal academics are responsible for the content of their own research.
    1. Be aware that generative AI may sometimes ‘hallucinate’, meaning that it produces incorrect, nonsensical, or entirely fabricated information. AI hallucinations can range from minor inaccuracies to completely erroneous statements.
    2. Generative AI cannot carry any responsibility for the accuracy of research. As such, generative AI cannot be a co-author of any research.
    3. Rather, generative AI is similar to a proof-reader, research assistant, or statistical consultant, whose contribution is typically not mentioned in research outputs. However, if a specific journal requires that the use of generative AI for certain functions be disclosed in a specified way, such as in a methodology section, that requirement should be followed.
    4. Any research content produced with generative AI assistance must be rigorously checked for accuracy by the legal academics making use of the generative AI.
  4. Data Privacy: Legal academics must comply with data privacy legislation.
    1. Be mindful of the data storage policies of the AI tool.
    2. Avoid entering personal information into publicly available generative AI tools.
    3. If there is good reason to enter personal information into generative AI tools, consent forms must be modified where necessary to include permission for the use of personal information in generative AI tools.

Shared understanding

The School Research and Higher Degrees Committee, as custodian of this guide, will take the lead in organising workshops where colleagues can share experiences and insights, with the aim of developing a shared understanding of the best use of generative AI in legal research.

Conclusion

This guide aims to balance the productive potential of generative AI with academic standards. It encourages legal academics to be both innovative and cautious, ensuring that the use of AI enhances the quality and integrity of legal research. Academics are also encouraged to read and consider other universities’ guides on the use of generative AI. Here are just two examples: