
AI and the Library: Effective and ethical use of AI tools

Prompt building

What is prompt building?

Prompt building is the process of making clear instructions (called prompts) that help generative AI tools create specific responses. A prompt can be a question, command, or statement that gives the AI the context and details it needs to produce relevant and accurate answers.

Why is it important?

The ability to create effective prompts for AI tools is important because it can:

  • Improve accuracy: Well-constructed prompts lead to more precise and relevant AI responses.
  • Save time: Effective prompts reduce the time spent revising AI inputs, making communication with your AI tool quicker and helping you find the information you need faster.
  • Enhance creativity: Good prompts can generate creative ideas and solutions by encouraging AI to look at tasks from different perspectives.
  • Facilitate learning: Building prompts helps users develop critical thinking and research skills, and in turn enhances their understanding of the subject matter.

All of these are skills that can improve your academic performance, and that are sought after by many employers.

What is an effective prompt?

A good AI prompt is concise, well-structured and specific. You can think of it as a formula in which each variable helps make the prompt clear and tailored to the desired outcome from the start, for example:

Task + Topic + Structure + Style + Level

Task: Explain

Topic: the impact of climate change on coastal cities

Structure: in a detailed report

Style: formal

Level of Detail: in-depth analysis

 

Prompt: "Explain the impact of climate change on coastal cities in a detailed report. Use a formal tone and provide an in-depth analysis."

 
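The formula above can be sketched as a small helper that assembles a prompt from its five components. This is an illustrative sketch only: the function name and the exact wording of the joins are assumptions, not part of any particular AI tool.

```python
def build_prompt(task, topic, structure, style, level):
    """Assemble a prompt using the Task + Topic + Structure + Style + Level formula."""
    return (
        f"{task} {topic} {structure}. "
        f"Use a {style} tone and provide {level}."
    )

prompt = build_prompt(
    task="Explain",
    topic="the impact of climate change on coastal cities",
    structure="in a detailed report",
    style="formal",
    level="an in-depth analysis",
)
print(prompt)
# → Explain the impact of climate change on coastal cities in a detailed report. Use a formal tone and provide an in-depth analysis.
```

Writing the components out separately like this makes it easy to check that none of the five elements is missing before you submit the prompt.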

  • Browse through the sections below to learn more about each element of this prompt formula.

AI - Risks and Limitations

Copyright

Many AI models are trained using copyrighted content scraped from the Internet. It is not yet clear whether this is considered an infringement of copyright, but you should be aware of the potential ethical implications.

  • Read our Copyright LibGuide for more information on how copyright affects you as a student, researcher or lecturer. 

Data protection

Be cautious and do not share any sensitive or personal information with AI tools. These tools can be susceptible to data breaches and it is not always made clear how any information you share will be used. Make sure you understand the privacy policies of any tools you decide to use.

Currency of information

Generative AI tools produce content based on training data, which may not always be up to date. Be mindful that AI tools may not be using the most recent information available when producing outputs; this is especially important in some disciplines. Always cross-reference details with reliable, current sources.

Access to information

Most AI tools are not able to access subscription-based information, only that which is widely available on the internet. Most high-quality academic research is located behind paywalls and is therefore inaccessible to AI platforms.

Make sure you make use of the Library's quality-assured subscription resources in your research and writing processes.

  • Search Primo, the library catalogue, to browse the library's resources. 

Bias

Generative AI models can exhibit various biases due to the data they are trained on and their algorithms. Here are five common biases that could be present in training data sets:

  1. Gender bias: AI models can replicate and perpetuate existing societal biases related to gender, such as gender stereotypes or gendered language usage.

  2. Racial and ethnic bias: AI models may generate content that reinforces racial or ethnic stereotypes or displays unequal treatment towards certain groups.

  3. Cultural bias: Specific cultural contexts in training data may produce content that is biased towards or excludes other cultures, leading to a lack of representation or misrepresentation.

  4. Confirmation bias: AI models can reinforce singular beliefs or opinions, potentially leading to biased outputs that align with specific perspectives or ideologies.

  5. Content bias: AI models can exhibit biases in the types of content they generate, favouring certain topics, themes, or perspectives over others.

It is important for students to understand that these biases may be present and to develop skills to critically evaluate AI outputs. 

Accuracy of information

AI tools can produce 'hallucinations' - false, misleading or inaccurate information that is presented as fact. It is an important information literacy skill to evaluate and fact-check the information you receive from any AI tool.

Academic Integrity


AI tools present challenges for academic integrity, some of which include:

  • Increased cheating: AI tools can make it easier for students to cheat on assignments and exams.
  • Plagiarism risks: Students may use AI to generate content that is not their own, leading to plagiarism issues.
  • Misleading information: AI can produce inaccurate or biased information, affecting the quality of research.
  • Loss of skills: Over-reliance on AI tools may hinder students' development of critical thinking and writing skills.
  • Assessment challenges: Traditional methods of evaluating student work may struggle to detect AI-generated content.
  • Policy confusion: Departments might struggle to create and enforce rules about using AI and maintaining academic integrity.

As a general rule, AI tools should only be used to supplement your own information skills, not to replace them. By using AI in a way which might undermine your own academic skills, you will not be achieving your learning outcomes.

Always check the latest guidance on the use of AI provided by your Department. If you are allowed to use AI tools for your assessed work, you must be transparent and acknowledge what you have used by completing your Department's Tool Use Statement and/or referencing the tool accordingly. If in doubt, speak to your tutor.

Important: If you use an AI tool in your assessed work in ways that are not specifically permitted, you could be committing Unacceptable Academic Practice (UAP).

Working with AI Guidance

Information Services has a help and support webpage for working with Artificial Intelligence at Aberystwyth University: https://www.aber.ac.uk/en/is/help/ai/

This page brings together policies and safety advice as well as guidance for using AI in your studies, teaching, research, and administration.

The elements of an effective prompt

The Task component of the formula defines what you want the tool to do.

  • Example: "Write," "Explain," "Describe," "Compare," etc.

Think about what you want the AI tool to do, and then choose an appropriate action word.

  • Spelling or grammar check
  • Summarise a piece of text
  • Rewrite a piece of text in a different style or tone
  • Outline a task (a presentation for example), or generate a structure 
  • Break down larger topics or pieces of work
  • Provide information
  • Provide a starting point for a new topic

The Topic provides the subject matter or the scenario in which the action should take place.

  • Example: "the benefits of a University education," "the causes of the French Revolution," "the impact of social media on young people," etc.

This component ensures that the response is relevant and focused. Be specific with your topic and make sure you include any context the tool will need (for example your notes, links or specific details).

The Structure outlines the format in which the response should be delivered.

  • Example: "in 6 paragraphs," "as a bullet-point list," "in a detailed report"

Think about how you would like the output presented.

  • One sentence
  • 200 words
  • Bullet point list
  • A table 
  • A graphic or chart

The Style indicates the manner in which the content should be written.

  • Example: "formal," "informal," "academic," "witty"

Specifying the style ensures that the response aligns with the intended audience and purpose of the output.

The Level of detail specifies the depth and breadth of the information required.

  • Example: "brief overview," "in-depth analysis," "high-level summary"

Think about the intended audience for the information to help you decide how comprehensive the response should be.
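Putting the five elements together, swapping a single component changes what you are asking for without rewriting the whole prompt. The sketch below is illustrative: the template wording and the example values are assumptions, chosen to match the formula described above.

```python
# Each element of the formula slots into a fixed template;
# varying one element (here, the level of detail) steers the output.
TEMPLATE = "{task} {topic} {structure}. Use a {style} tone and provide {level}."

base = {
    "task": "Describe",
    "topic": "the causes of the French Revolution",
    "structure": "as a bullet-point list",
    "style": "formal",
    "level": "a brief overview",
}

for level in ("a brief overview", "an in-depth analysis"):
    print(TEMPLATE.format(**{**base, "level": level}))
# → Describe the causes of the French Revolution as a bullet-point list. Use a formal tone and provide a brief overview.
# → Describe the causes of the French Revolution as a bullet-point list. Use a formal tone and provide an in-depth analysis.
```

Trying the same topic at different levels of detail, or in different structures, is a quick way to see which version best suits your intended audience.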