Classifications with Teachable Machine

Year level 7-8 


Students are introduced to the concept of classification. By exploring Google’s Teachable Machine tool, students learn about supervised machine learning. Then students are asked to build a cat-dog classifier but are unknowingly given a biased dataset. When the classifier works better on cats than dogs, students have the opportunity to retrain their classifiers with their own new datasets.


Teaching point

For students to understand classification problems and the role training data plays in classification accuracy.


Suggested steps – Lesson 1

  1. Today we are going to talk about training datasets and supervised machine learning. In a supervised machine learning system, a computer learns by example. Give an example of how toddlers learn shapes, colours and so on.


  2. We’re going to focus on classification. Ask students for examples of classification they use in everyday life or have learned about in school (e.g. classification systems in science class, such as animal classification, or how they find books in the library).


  3. Demo Teachable Machine.


  4. Students complete the tutorial.


  5. Class discussion:
  • Share what you learned about the training and test datasets.
  • What happens when you change your pose?
  • What happens when both partners are in the frame?
  • What happens if you only train one class?
  • What happens as you increase your dataset?
  • What happens when your test dataset is different from your training dataset?
  6. Now we are going to build a cat-dog classifier using the new version of Teachable Machine. Demo how to train at least one class.


  7. In pairs, build a machine that classifies cats and dogs. You already have training sets loaded on your laptops and can use the cards and webcam to test with.


  8. Discussion:
  • Is this classifier useful if it only works well on cats?
  • Why do you think it works better on cats than dogs?
  • How could we make it better with our training data? (If students have trouble, ask them to notice similarities in the dataset, e.g., dogs were really fluffy and cat-like/not as diverse)


  9. Summary: When algorithms, specifically artificial intelligence systems, have outcomes that are unfair in a systematic way, we call that algorithmic bias. We would say that our cat-dog classifier shows algorithmic bias: it is biased towards cats, since it works really well for them, and biased against dogs, since it doesn’t work as well for them.
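For teachers who want a concrete illustration of the "learns by example" idea from step 1, here is a minimal sketch of a 1-nearest-neighbour classifier, one of the simplest supervised learning methods. The feature values (a made-up "size" and "fluffiness" score per animal) are invented for illustration and are not part of the lesson materials:

```python
# A minimal sketch of supervised learning: a 1-nearest-neighbour classifier.
# The computer "learns by example" simply by remembering labelled examples
# and labelling each new input the same way as its most similar example.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(training_data, features):
    """Return the label of the closest training example."""
    closest = min(training_data, key=lambda ex: distance(ex[0], features))
    return closest[1]

# Training set: (features, label) pairs -- like the examples a toddler is shown.
# Features are hypothetical (size, fluffiness) scores.
training_data = [
    ((2.0, 8.0), "cat"),
    ((3.0, 7.0), "cat"),
    ((9.0, 3.0), "dog"),
    ((8.0, 2.0), "dog"),
]

print(predict(training_data, (2.5, 7.5)))  # near the cat examples -> "cat"
print(predict(training_data, (8.5, 2.5)))  # near the dog examples -> "dog"
```

Teachable Machine uses a far more sophisticated model than this, but the principle is the same: the prediction depends entirely on the labelled examples it was given.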

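The algorithmic bias described in the summary can also be reproduced in a few lines. This sketch uses a simple nearest-neighbour classifier with invented feature values: the training set has many varied cats but only one very specific kind of dog, so the per-class accuracy on a balanced test set comes out unequal:

```python
# Sketch: a biased training set produces systematically unequal accuracy.
# All feature values are invented for illustration.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def predict(training_data, features):
    """Label a new example with the label of its nearest training example."""
    return min(training_data, key=lambda ex: distance(ex[0], features))[1]

# Biased training set: many varied cats, but only one very specific dog,
# so much of the real range of dogs is not represented at all.
biased_training = [
    ((1.0, 9.0), "cat"), ((2.0, 8.0), "cat"), ((3.0, 9.0), "cat"),
    ((4.0, 7.0), "cat"), ((2.0, 6.0), "cat"),
    ((9.0, 1.0), "dog"),
]

# Test set drawn from the full range of both classes.
test_set = [
    ((2.0, 8.5), "cat"), ((3.5, 7.5), "cat"),
    ((5.0, 5.0), "dog"), ((8.0, 2.0), "dog"), ((6.0, 4.0), "dog"),
]

def accuracy_for(label):
    """Fraction of test examples with this label that are classified correctly."""
    examples = [ex for ex in test_set if ex[1] == label]
    correct = sum(predict(biased_training, f) == lbl for f, lbl in examples)
    return correct / len(examples)

print("cat accuracy:", accuracy_for("cat"))  # 1.0  -> works well on cats
print("dog accuracy:", accuracy_for("dog"))  # ~0.33 -> works poorly on dogs
```

The classifier itself is not "unfair" by design; the systematic gap comes entirely from which examples it was trained on, which is exactly what students fix by re-curating their datasets.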

Suggested steps – Lesson 2

  1. Review the previous lesson’s work: datasets, bias, Teachable Machine and so on.


  2. Building on the previous lesson, give students time to re-curate their datasets.


  3. Discussion: What did you do to make it work better?


  4. Algorithmic bias – let’s see how it can happen in the real world. Play the Gender Shades facial detection video.


  5. Discussion:
  • What problem did Joy identify in the video?
  • Why is this a problem?
  • How does Joy suggest we can fix this problem?
  • How might you find images to better curate your dataset?


  6. Back to Teachable Machine. Give each pair a bag of musical instruments and ask them to build a classifier that identifies which instrument is being played. Then test the classifier by playing multiple instruments at once and see what happens.


  7. Discussion:
  • What happened when you played multiple instruments at once to test?
  • Did it ever predict both instruments?
  • Sometimes classification can be really hard because it can be hard to list all the possible categories – think about how many classes we would need in order to classify every combination of the musical instruments you have (at least 24 classes).


  8. Summary: This problem also showed up in Gender Shades. The biggest problem with the facial recognition systems studied is that they don’t work as well on darker female faces as on paler male faces. But not everyone identifies as male or female, and those facial recognition systems can’t capture that. This is something we always need to be careful of when we are classifying – to make sure that our classes don’t exclude anyone.
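The class-counting point in the Lesson 2 discussion (how many classes would be needed to cover every combination of instruments) can be checked with a short script. The lesson doesn’t specify the contents of each bag, so the five instrument names below are assumed for illustration; with n instruments, any non-empty combination could be played at once, giving 2^n − 1 possible classes:

```python
from itertools import combinations

# Hypothetical bag of instruments -- the actual contents of each bag aren't
# specified in the lesson, so five example names are assumed here.
instruments = ["drum", "shaker", "bell", "triangle", "tambourine"]

# Every non-empty combination of instruments that could be played at once
# would need its own class in the classifier.
classes = [combo
           for size in range(1, len(instruments) + 1)
           for combo in combinations(instruments, size)]

print(len(classes))               # 31 classes for 5 instruments
print(2 ** len(instruments) - 1)  # same count, computed directly: 2**n - 1
```

Even a modest bag of instruments quickly makes exhaustive class lists impractical, which is the point of the discussion.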


Curriculum links

Links with the Digital Technologies curriculum area


Years 7–8

Strand: Processes and Production Skills

Content descriptions:

• Analyse and visualise data using a range of software to create information, and use structured data to model objects or events (ACTDIP026)

• Design algorithms represented diagrammatically and in English, and trace algorithms to predict output for a given input and to identify errors (ACTDIP029)

• Implement and modify programs with user interfaces involving branching, iteration and functions in a general-purpose programming language (ACTDIP030)

• Evaluate how student solutions and existing information systems meet needs, are innovative, and take account of future risks and sustainability (ACTDIP031)


