Students are seeing the benefits of using artificial intelligence (AI), but what happens when students are caught using AI in their assessments? In this article we discuss the risks AI poses to students, an issue we are already seeing and one we have already helped students navigate after it landed them in hot water.
We refer to universities in this article because as Education Lawyers, this is where we are seeing most of these matters occur, but the same is likely occurring in schools as well.
Why are students using AI?
Students are using AI for a number of reasons: intense competition, pressure to achieve high grades, the ease of using AI compared with carrying out in-depth research or reviewing multiple resources, or the fear of failing an assessment.
Changes in light of AI use
Students may not yet realise that universities are onto them. We have seen some universities update their policies and procedures to capture instances of students using AI and to warn students against doing so.
Just like a university would look for markers indicating plagiarism, contract cheating or any other form of collusion or cheating, universities are now on the lookout for signs that AI has been used in a student’s assessment.
AI is not always accurate
Students need to be aware that AI is not always accurate. A well-publicised incident in the United States involved an experienced lawyer, Steven Schwartz, who used ChatGPT to prepare court documents that cited certain cases in support of his client's position.
The other side’s lawyers contacted Schwartz asking for copies of the cases he mentioned as they were unable to find them.
It transpired that Schwartz had used AI to produce the court document. Although he had specifically asked the AI tool whether one of the cases was real, and was told that it was, the case did not in fact exist. Schwartz had not fact-checked the cited cases and had put inaccurate information before the court.
ChatGPT, as it turns out, is not a search engine: it generates text and does not verify that the information it produces is accurate or truthful.
Consequences of using AI
As the Schwartz case shows, a student who uses AI not only risks submitting an inaccurate assessment, but would also likely expose themselves to allegations of academic misconduct by the university.
Academic misconduct encompasses a wide range of behaviours, including plagiarism, cheating, collusion, fabrication of results and contract cheating.
Once a university suspects or has evidence of some form of misconduct, including the use of AI in an assessment, the misconduct process is triggered. The exact steps differ between universities, but the process generally looks something like the following:
- An allegation letter is sent to the student providing the student with an opportunity to respond and/or attend a misconduct hearing;
- A response is provided by the student and/or the student attends a misconduct hearing;
- A decision is made on whether misconduct has occurred and if so, a penalty (or penalties) is applied;
- The student can then lodge a review/appeal to challenge the decision if the decision is not sound.
The penalties that are applied when a finding of academic misconduct is made depend on the university’s policies and procedures, but it is not uncommon for them to include formal warnings, repeats of a subject, a zero grade or a reduction in marks for an assessment, academic probation, suspension, or even exclusion from the course and/or university.
A finding of academic misconduct may also be something that students planning to work in certain industries need to disclose to the relevant regulatory authorities or bodies, as is the case in the legal profession. Before someone can be admitted as a solicitor, they are required to make certain disclosures so that a decision can be made as to whether they are a fit and proper person to be admitted. So, depending on the industry the student eventually plans to work in, the consequences can extend well beyond the university's disciplinary processes.
Conclusion
While AI can offer an easy, convenient and attractive alternative to pulling all-nighters and tediously sifting through multiple resources and documents, students need to seriously weigh the risks involved in using it.
How can we help?
As Education Lawyers we help students who are facing allegations of academic misconduct by drafting a response to the allegations for the student to lodge with the university, and/or drafting review or appeal submissions if an academic misconduct decision has already been made.
One thing we see regularly, though, is students coming to us only after a finding of academic misconduct has been made, seeking to have it reviewed or appealed. While this is something we can help with, we always recommend that students come to us before responding to the allegations, so that they can protect their position and avoid saying anything that might make matters worse for them. The response stage also allows far more information to be put forward; at the review or appeal stages, the information that can be provided and the arguments that can be made are usually much more limited.
If you are a domestic or international student in need of help, reach out to us on 07 5636 5598 or via email to admin@blossomlawyers.com.au.