FAQ: About ChatGPT

AI tools like ChatGPT can make student life easier, but also more complicated. Here, the library's experts explain what you can use them for and where the boundaries are.

  • What happens to the data I enter into ChatGPT?
    No one (except the company behind it) really knows what happens to the data you put into the service. Use GPT UiO instead. GPT UiO is based on ChatGPT, but we know where the data is stored and what happens to it. You can find GPT UiO under Services and Tools on himolde.no.

  • Can I submit text written by ChatGPT?
    No. Submitting text produced by ChatGPT is considered cheating. You must always submit your own work.

  • Can I use AI-generated text in an assignment?
    Only if the person assigning the task explicitly allows it. Even then, you must write the assignment yourself. If you choose to include AI-generated content, you must reference it properly. Examples of how to do this can be found on the library’s website or at kildekompasset.no. A good rule of thumb: reference any content you cannot honestly say you wrote yourself. If you only used the tool to, for example, find synonyms, you do not need to reference it—just as you wouldn’t reference a thesaurus.

  • Can I use ChatGPT to help me study?
    Yes! You can ask questions, request explanations, or get help understanding complex topics.

  • Can ChatGPT be wrong?
    Yes. Large language models like ChatGPT are essentially pattern-generating machines. ChatGPT doesn't read or understand content; it guesses what a likely answer might be. This means it can be wrong. The main risk is when most of an answer sounds correct but small parts are inaccurate, resulting in misinformation or disinformation. This can be hard to detect, especially if you're unfamiliar with the subject.
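    To see why a "likely" answer is not the same as a correct one, here is a minimal Python sketch of next-word guessing. The prompt and the probabilities are invented for illustration; this is only a toy picture of the idea, not how ChatGPT is actually implemented.

        import random

        # Toy "language model": for a given prompt it only knows how likely
        # different next words are, based on patterns in text it has seen.
        # These probabilities are made up for the example.
        next_word_probabilities = {
            "Paris": 0.80,       # usually the guess is right...
            "Lyon": 0.12,        # ...but a plausible-sounding wrong word
            "Marseille": 0.08,   # can come out instead.
        }

        prompt = "The capital of France is"

        # The model does not look anything up or "understand" the question;
        # it simply draws one word according to the probabilities.
        words = list(next_word_probabilities)
        weights = list(next_word_probabilities.values())
        guess = random.choices(words, weights=weights, k=1)[0]

        print(prompt, guess)

    Run it a few times and you will occasionally get a confident-sounding wrong answer, which is exactly the kind of error described above.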

  • Can I use ChatGPT during an exam?
    Only if it's explicitly allowed. For most exams, it's not permitted. Always check the rules for your exam.

  • Can I use ChatGPT to find sources?
    No. ChatGPT may fabricate or mix up sources. Use your course reading list and academic literature instead.

  • Can I use ChatGPT as a source?
    ChatGPT and other language models are "dumb" machines that are very good at guessing. They can easily produce wrong information, especially in areas you don't know well. Unless your topic is AI itself, it's best not to use ChatGPT as a source.

  • What can I use ChatGPT for?
    • To get overviews or summaries, but remember that you're letting the tool decide what's "important." It's always best to read the material yourself.
    • To create questions or quizzes for self-testing.
    • To test your understanding of difficult topics, but keep in mind that ChatGPT tends to be a people-pleaser: if you challenge it, it may change its answer just to end the disagreement, even if the new answer is wrong.
    • To improve your language and structure after you've written your own text, but remember that AI is just a tool. You must still check and proofread the final result.

  • What is the biggest mistake you can make?
    Believing ChatGPT can do the work for you. You'll learn less, and you risk being accused of cheating.

  • What is the biggest misconception about AI tools?
    Believing they are "intelligent." They only imitate human intelligence by generating patterns. They're not truly smart, and not very good at math.