ChatGPT in Education

By Mark Dawes, May 2024

 

Technology in Education

Technology in education is not new.  It is easy to forget, however, that the effective use of technology by pupils is not something that just happens: it has always needed to be taught, and there are myriad ways that it can go wrong. 

 

Here is a maths example:  In order to work out the mean of 1, 5 and 6, we add them up and divide by 3.  1 + 5 + 6 = 12, and 12 ÷ 3 = 4.

If you type this directly into a calculator as 1 + 5 + 6 ÷ 3 (which seems reasonable to many Yr 7 pupils), the answer is given as 8.

What is the problem?  The mean is a measure of ‘central tendency’, so it cannot lie outside the data that is given (8 is bigger than all of 1, 5 and 6).  And the reason the calculator gives this answer is not because it is ‘wrong’, but because it expects to work out the division before it calculates the additions:

1 + 5 + 6 ÷ 3

= 1 + 5 + 2

= 8

Some pupils will realise the problem (and the solution) for themselves, but others will need to be shown that they can use at least three other methods to give the correct answer:

1)    Typing 1 + 5 + 6, pressing ‘=’ and then doing ÷ 3 will work.

2)    Using brackets: Typing (1 + 5 + 6) ÷ 3 uses the correct order of operations.

3)    Using a fraction: An equivalent to version (2) is to use the fraction key, with 1 + 5 + 6 in the numerator and 3 as the denominator.
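The calculator's behaviour is not unique: most programming languages follow the same order of operations. A minimal Python sketch of the naive and bracketed versions:

```python
# The mean of 1, 5 and 6 should be (1 + 5 + 6) / 3 = 4.

# Typed naively, division binds tighter than addition,
# so this evaluates as 1 + 5 + (6 / 3) = 1 + 5 + 2:
naive = 1 + 5 + 6 / 3
print(naive)   # 8.0

# Brackets force the sum to be computed first:
mean = (1 + 5 + 6) / 3
print(mean)    # 4.0
```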

 

Technology improves over time

Then there is the issue that early iterations of new technology are, frankly, not very good.  The first electronic calculators could do simple calculations (and perhaps find square roots).  Nowadays they include trigonometric functions and carry out advanced statistical calculations.

 

Google Translate was originally fairly weak, and languages teachers initially cautioned against its use. I recall the huge frustration of dealing with an early automated telephone system that couldn’t understand whether I was saying “yes” or “no” (as instructed) in response to a question.  Nowadays we trust Google Translate and happily talk to Siri or Alexa, and use automatic YouTube subtitles with few misunderstandings.

 

Teachers used to warn pupils about using Wikipedia, because it wasn’t well-referenced and was full of errors.  Over the past decade or so it has improved immensely, and now I am far more likely to use Wikipedia myself, and to trust that it is accurate.

 

ChatGPT and Friends

ChatGPT (and its competitors such as Co-Pilot, Bard, Bing, etc) is still in the early stages of its life.  Currently there are lots of weaknesses and inaccuracies, and it is important that we know what these are and how to use the technology effectively.

 

It is important to remember that (despite the worries that some people have) this technology is an example of a Large Language Model (LLM).  It uses sophisticated statistical techniques, and a very large set of pre-existing material, to predict/suggest the answer to questions that are posed.  When it is told it has made a mistake it then apologises profusely.  (In this regard it reminds me of certain Yr 11 pupils, who make a mistake, apologise, and then repeat the same error almost immediately!)

 

We therefore first need to decide whether we are asking a question that an LLM is likely to be able to answer.  If it is one that has been seen before then it is possible that it can.  If not, then there may be problems. 

 

Pythagorean Triples are three whole numbers that obey Pythagoras’ Theorem (5, 12 and 13 are an example of this, because 5² + 12² = 13²).  I asked ChatGPT to find three numbers where the sum of the squares of the first two was one away from the square of the third.  I could think of 4, 7 and 8 (which works because 4² + 7² = 8² + 1) and 2, 2, 3 (2² + 2² = 3² – 1) and wondered whether there were others.

 

ChatGPT initially thought that 3, 4, 5 was a suitable answer, and also gave me 5, 12, 13 and 7, 24, 25.  These are examples of Pythagorean Triples and therefore don’t include the extra 1 that I was interested in.  I rephrased my request, re-explained the question, used different mathematical notation and pointed out the problem with the answers.  ChatGPT apologised and then gave the same wrong answers again.  This is not a surprise, because it isn’t a ‘standard’ problem, and almost certainly won’t have appeared in the corpus of prior knowledge that ChatGPT was trained on.  It was therefore silly for me to ask ChatGPT to give me solutions to this problem.

 

Crucially, I knew enough to realise that the answers given, very confidently, by ChatGPT, were wrong.  Would pupils realise this? 

 

(I wonder if you, the reader, spotted the errors I made in the past two paragraphs? 

I wrote that “ChatGPT initially thought that …”.  LLMs don’t think, they use statistics and data.

“the answers given, very confidently” – I was again personifying ChatGPT, which cannot be ‘confident’ about anything.  This illustrates how easy it is to fall into the trap of assuming we are dealing with something sentient.)

 

Interestingly, a Computer Science teacher colleague suggested I ask ChatGPT to write a computer program to solve this problem.  It immediately produced some Python code that gave a systematic set of correct answers to the question!
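The article doesn’t reproduce the code ChatGPT wrote, but a brute-force search of this kind might look something like the following (my own sketch, not ChatGPT’s actual output; the function name and search limit are illustrative):

```python
# Search for whole numbers a <= b < c with a² + b² = c² ± 1
# ("near-miss" Pythagorean Triples).
def near_miss_triples(limit):
    found = []
    for c in range(2, limit + 1):
        for b in range(1, c):
            for a in range(1, b + 1):
                if abs(a * a + b * b - c * c) == 1:
                    found.append((a, b, c))
    return found

print(near_miss_triples(10))
```

Up to 10 this finds the (2, 2, 3) and (4, 7, 8) mentioned above, along with others such as (5, 5, 7), while correctly excluding true triples like (3, 4, 5).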

 

More worryingly, it gave plausible but error-strewn methods and answers when I asked it to carry out fairly simple factorisations, with the final line of its working claiming that its factorisation was correct.

It isn’t!  2(2x + 1)(3x – 4) is the correct answer.

Again, I could see that this was incorrect, but will pupils?
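One way pupils can check a factorisation for themselves is to expand it, or simply to compare the two forms at a few values of x.  Expanding 2(2x + 1)(3x – 4) gives 12x² – 10x – 8 (the original question is not shown above, so this expression is recovered from the answer).  A short check, again my own sketch rather than anything ChatGPT produced:

```python
# Compare the factorised and expanded forms at several values of x.
# If the factorisation is correct, they agree everywhere.
def factored(x):
    return 2 * (2 * x + 1) * (3 * x - 4)

def expanded(x):
    return 12 * x**2 - 10 * x - 8

assert all(factored(x) == expanded(x) for x in range(-10, 11))
print("factorisation agrees at all test points")
```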

 

Using ChatGPT with pupils

At the moment it appears that ChatGPT is too flaky (certainly in mathematics) to be used by pupils without a serious health-warning being applied, or without the support of a teacher.  I am not qualified to know whether it has similar issues in other subject areas, but suspect it is likely that it does.

 

My suggestion is that departments provide the following:

a)      Examples for pupils of when it might be useful to use ChatGPT (eg to find Pythagorean Triples) and when not to do so (anything out of the ordinary).

b)     Examples of errors that ChatGPT has made, to show pupils how careful they need to be, and how they can (and need to) check the answers they are given.

c)      Examples for teachers as to how they might use ChatGPT effectively in their planning and preparation.

 

If this technology improves in the way that Google Translate and Wikipedia have done, then these examples are likely to need to be updated frequently.

 

 

 
