Intro to Computational Thinking
The foundations of computational thinking presented here are derived from the four pillars defined by Davidson and Murphy in their UPenn/Coursera computational thinking course. Other foundational models exist; they mostly overlap, but each has its own nuances and emphases.
There are many definitions of computational thinking. The one we will use is: Computational Thinking is the mental discipline of thinking about a problem using concepts from computer science, with the ultimate goal of solving that problem with the help of a computer.
UPenn: “Computational Thinking is an approach to solving problems using concepts and ideas from computer science, and expressing solutions to those problems so that they can be run on a computer.”
Five Foundations of Computational Thinking
The UPenn four pillars are:
- Decomposition
- Pattern Recognition
- Data Representation and Abstraction
- Algorithms
We are going to adjust those to be:
- Abstraction
- Decomposition
- Pattern Recognition
- Data Representation
- Algorithms
Perhaps we can remember them as: ADPDA
Abstraction
Merriam-Webster defines abstraction as “the act or process of abstracting”, “the state of being abstracted”, or “an abstract idea or term”. But those definitions just lean on the same root word, abstract.
Abstract has many definitions; the ones relevant for us are:
- “n1a: relating to or involving general ideas or qualities rather than an actual object, person, etc.”
- “n1c: insufficiently factual”
- “n2: naming a quality apart from an object” Naming!
- “n3a: dealing with a subject in its theoretical aspects”
- “v2: to draw away the attention of”
- “v5: to think about or understand by separating general qualities, concepts, etc. from specific objects or instances”
Wikipedia:Abstraction says “Thinking in abstractions is considered by anthropologists, archaeologists, and sociologists to be one of the key traits in modern human behaviour”
Abstraction involves ignoring details that are unnecessary for the current issue at hand, and being able to talk about and reason about a thing apart from those details.
Abstraction is everywhere in our lives, so why treat it as a foundational idea of computational thinking? Because at the very bottom, computers are electronic devices operating on two signal levels we call bits (0 and 1), and so we must build abstractions on top and away from this baseline in order to do anything useful with computers.
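To make that concrete, here is a minimal sketch in Python (the values and variable names are ours, purely for illustration) of abstractions being built up, layer by layer, on top of bits:

```python
# A minimal sketch of abstraction layers stacking on top of raw bits.

bits = "01000001"        # at the bottom: just two signal levels, 0 and 1

number = int(bits, 2)    # abstraction 1: group the bits and read them as a number
print(number)            # 65

letter = chr(number)     # abstraction 2: treat that number as a character code
print(letter)            # 'A'

word = letter + "pple"   # abstraction 3: characters form text we actually care about
print(word)              # 'Apple'
```

Each layer lets us ignore the details of the layer beneath it: once we have characters, we never think about the bits again.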
Decomposition
We might think of decomposition as food rotting, but in computer science, decomposition is the act of breaking down a large, complex thing into smaller, more manageable, and understandable parts.
Wikipedia:Decomposition says “In computer science, decomposition is the process of identifying and organising a complex system into smaller components or layers of abstraction.”
That’s true, but decomposition applies not only to systems, but also to problems and concepts.
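As an illustration, here is a hedged sketch in Python; the problem and the function names are our own invention, chosen only to show one complex task broken into smaller, understandable parts:

```python
# Big problem: "What is the average word length in a piece of text?"
# Decomposed into three small parts, each easy to understand and test.

def split_into_words(text):
    """Part 1: break the text into words."""
    return text.split()

def word_lengths(words):
    """Part 2: measure each word."""
    return [len(word) for word in words]

def average(numbers):
    """Part 3: average a list of numbers."""
    return sum(numbers) / len(numbers)

# Recombine the parts to solve the original problem.
print(average(word_lengths(split_into_words("computational thinking is fun"))))  # 6.5
```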
Pattern Recognition
Pattern recognition sounds fairly obvious, but it is worth treating as a foundation of computational thinking because it is fundamental to computing. We need to consciously look for patterns such as the following (sketched in code after this list):
- When this happens, that should follow
- This happens repeatedly, based on these constraints
- If this condition is true, then this action should occur, or else that action should occur
- more here in the future…
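Here is a small Python sketch, using a toy example of our own, that expresses each of those patterns as code:

```python
# Toy example: each pattern above has a direct expression in code.

temperatures = [18, 21, 35, 12, 30]

# "This happens repeatedly, based on these constraints" -> a loop
for temp in temperatures:
    # "If this condition is true, then this action should occur,
    #  or else that action should occur" -> if/else
    if temp > 28:
        print(temp, "is hot")
    else:
        print(temp, "is not hot")

# "When this happens, that should follow" -> one step triggering the next
data_loaded = True
if data_loaded:
    print("now we can analyze")
```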
Data Representation
This foundation and the final one, algorithms, focus more directly on the computing part of computational thinking.
Things in the real world have all sorts of information about them: people can be short, or tall; everything is of some color; things can be sharp, dull, hard, soft, brittle, flexible, and on and on. These are all attributes, or data about the things in our world.
If we are going to use computers to help us solve problems that involve concepts about real world things, we have to think about representing this data. This involves two aspects:
- What data needs to be represented?
- How should we represent it?
Remember: on computers, everything is ultimately just bits. We have to make sense of those bits, decide what they mean to us, and decide how they represent our data. This heavily involves abstraction, but it also requires knowing the constraints that bits and computers place on us.
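As a minimal sketch (the example values are ours), here is Python showing that the same byte means different things depending on how we choose to represent our data, and that the bits also impose constraints:

```python
# The same byte, three representations.
raw = bytes([77])            # one byte: 01001101

print(raw[0])                # as a number: 77
print(raw.decode("ascii"))   # as text: 'M'
print(bool(raw[0]))          # as a truth value: True

# Constraints matter too: a single byte can only hold values 0..255.
try:
    bytes([300])             # one byte cannot represent 300
except ValueError as err:
    print("constraint:", err)
```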
Algorithms
At the heart of computing is telling a computer to do something for us. This is where algorithms come in. Computational thinking must result in the creation or application of algorithms if it is to make a computer useful to us.
Wikipedia:Algorithm says “An algorithm is an unambiguous method of solving a specific problem.”, but then goes on to say “In mathematics and computer science, an algorithm is a finite sequence of mathematically rigorous instructions, typically used to solve a class of specific problems or to perform a computation.”
So an algorithm is a written set of instructions that, when followed, solves a problem or accomplishes something. Humans have been using algorithms, in the form of sets of instructions, for basically ever. A recipe is an algorithm. A step-by-step guide to fixing something or assembling furniture is an algorithm. However, these are usually very simple algorithms: just a sequence of steps for one particular thing.
In computational thinking, we are more interested in algorithms that solve a general class of problems rather than one individual case. So we have to think hard about how our problem generalizes, and about what inputs the algorithm needs in order to operate on a specific case of the problem.
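As one hedged illustration (our choice of example, not from the course), binary search is an algorithm that solves the whole class of “find an item in any sorted list” problems; the same instructions then handle any specific case:

```python
# A general algorithm: it works for any sorted list and any target,
# not just one particular instance of the problem.

def binary_search(items, target):
    """Return the index of target in sorted items, or None if absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2          # repeatedly halve the search range
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return None

# The same instructions handle very different specific cases:
print(binary_search([2, 5, 8, 13, 21], 13))         # 3
print(binary_search(["ant", "bee", "cow"], "bee"))  # 1
```

Unlike a recipe, which produces one dish, this generalizes: the specific case is just the inputs we hand it.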