It is widely agreed among its attendees that the Asilomar meeting of the California Math Council is the best math teacher conference anywhere. Certainly, the setting is beautiful.
Over the decades, I have attended some great talks there, and this year was no exception. I will post some notes and reactions here, starting with two tech-oriented talks I attended.
Photomath and its implications
Photomath is a free smartphone app that can “read” exercises (even handwritten ones), solve them instantly, and display one or more paths to the answer. John Martin and Gale Bach introduced us to its power, and to some of its limitations. Here is an example. I handwrote a system of equations, aimed the phone at it, and this is what I got:
Each step can be expanded to show more details. Scrolling down gets you to the solution, and then you are shown how to check whether the answer is correct, once again with the option to expand each step. And that’s not all! The app displays a graph of the two lines, and for some reason their x- and y-intercepts. Photomath can do many things: solve equations, simplify radicals, find derivatives and integrals, and so on.
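To get a feel for what an app like Photomath is doing under the hood, here is a rough sketch in Python using SymPy, a computer algebra system. The system of equations below is a made-up example, not the one from my handwritten photo:

```python
# A sketch of what a CAS does with a linear system, using SymPy.
# The system below is an invented example, not the one from the talk.
from sympy import symbols, Eq, solve

x, y = symbols("x y")
eq1 = Eq(2 * x + y, 7)   # hypothetical first equation
eq2 = Eq(x - y, 2)       # hypothetical second equation

solution = solve([eq1, eq2], [x, y])
print(solution)          # {x: 3, y: 1}

# Check the answer by substituting back into each equation,
# much as the app does in its "check" step:
check1 = eq1.subs(solution)   # True when both sides agree
check2 = eq2.subs(solution)
print(check1, check2)         # True True
```

A solver like this produces the answer and can verify it by substitution; the step-by-step explanations, the graph, and the intercepts are the app’s added layer on top.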
John and Gale used a debate format to present two ways to respond to the existence of this new electronic tool: should we explain to students that they can use such a tool effectively only if they understand the underlying math? Or should we take advantage of it to assign different, more interesting problems? In my view, we should do both, and I suspect John and Gale agree — the debate was merely a way to structure their presentation. Neither of them advocated banning Photomath altogether, probably because they realized that is impossible. Their session started an important conversation, one which has been delayed too long.
I was disappointed that the subsequent discussion focused on how to handle cheating, and on how students could use this tool to teach themselves how to carry out these manipulations. To me, the more profound issues were the ones raised in the “debate”. We will need much more than one session at a conference to sort them out, but here are some initial thoughts, using an example.
Take basic linear equations, the subject of a lot of deadly drill in middle school. Students need to know what it means to solve an equation, but they do not need to be able to solve super-complicated examples. (Leave those to Photomath, or Wolfram Alpha, or GeoGebra, or…) But how do we teach the basic underlying concepts, you ask? One way is to solve a lot of equations mentally, perhaps using a number talks format. “If 3x = 18, what is x?” and increase the difficulty from there. (3x+2=20, 3x + 2 = 21 + 5x, and so on.) Another way, once the very basics have been established, is to ask questions like “create an equation whose solution is 6.” This is a good way to consolidate understandings about “doing the same thing to both sides”, and there are more correct answers than students. (See more ideas on this in my How To post from 2015.) And of course, there are word problems, modeling questions, and assorted applications, none of which can (yet) be solved by machine.
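The “create an equation whose solution is 6” task can itself be made concrete. Here is a small Python sketch of my own (not from the session) that builds such an equation by starting from x = 6 and doing the same thing to both sides:

```python
# Build an equation whose solution is 6 by "doing the same thing
# to both sides", starting from x = 6. The left side is tracked
# as a pair (a, b) meaning a*x + b; the right side is a number.
a, b = 1, 0          # left side: x
rhs = 6              # right side: 6

# Multiply both sides by 3:  3x = 18
a, b, rhs = 3 * a, 3 * b, 3 * rhs
# Add 2 to both sides:       3x + 2 = 20
b, rhs = b + 2, rhs + 2

print(f"{a}x + {b} = {rhs}")   # 3x + 2 = 20

# Undo the steps in reverse to solve, confirming the solution is still 6:
solution = (rhs - b) / a
print(solution)                # 6.0
```

Every sequence of such moves yields a different correct answer to the prompt, which is why there are more correct answers than students.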
More generally, speed and accuracy in paper-and-pencil computational manipulations are no longer priorities in math education. Teaching for understanding is really the only game in town. Trying to teach the same algorithmic material in the same old way becomes more obsolete every day as the technology races ahead. In my own lifetime, calculators displaced multidigit arithmetic, scientific calculators replaced log and trig tables, graphing calculators superseded tedious graphing by hand, and we are now in the fourth phase of this revolution. Of course, this does not mean that we know what to do about the new state of affairs. Many questions remain, but they will not be answered by trying to find ways to continue business as usual. Let’s keep the conversation going!
Functions from geometry
Tim Erickson presented several activities where data is derived from a geometric situation. The geometry is explored in the real world, using rulers and protractors, then the data is displayed in Desmos, which is supremely easy to do. Looking at the resulting data points hopefully yields an insight. Given the nature of the examples, Tim (a self-described “statistics guy”) discouraged the use of regression, suggesting instead various strategies to help students interpret what they see in the graphs. He gave many great pointers on how to help students see and think about the numbers: ask for a prediction (left to right, will the points generally go down or up? what will happen for small, large, or extreme x or y? and so on.) He also encouraged us to discuss the effect of measurement errors, issues with the displayed domain and range, etc. Tim was a master teacher at work, alternating between talking to the whole group, and looking at our work and pursuing conversations with individuals.
For me, the key question here is whether the approach he shared is even appropriate in geometry. My aesthetic sense says no. Look at the damn figure, think about it, discuss it. Learn to think geometrically! Save data analysis and modeling for a statistics class, of course, but also for an algebra class. In fact, this style of lesson is sure to improve the teaching of algebra, and in my view that’s where it belongs. For example, one outstanding activity Tim shared was an exploration of how much vertical space a given paragraph requires if you change the margins and keep everything else (font size, etc.) the same. This turns out to be a great example of a (nearly) constant product. This activity, and many, many others, can be found in Tim’s book, Functions from Geometry. Get your copy (and other great stuff from Tim) at eeps.com, and start using it in your algebra and precalculus classes!
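To see why the product in the margins activity is only nearly constant, here is a toy model in Python (all numbers invented for illustration, not Tim’s data): a fixed amount of text wraps into whole lines, so the height is quantized, and width times height hovers near, but not exactly at, a constant.

```python
import math

# Toy model of the margins activity (all numbers invented, not Tim's data):
# a paragraph of N_CHARS characters wraps into lines, and the vertical
# space it needs is the number of lines times a fixed line height.
N_CHARS = 600        # amount of text in the paragraph (made up)
LINE_HEIGHT = 0.2    # height of one line, in inches (made up)

products = []
for chars_per_line in [30, 40, 50, 60, 80]:   # narrower margins => more chars per line
    lines = math.ceil(N_CHARS / chars_per_line)   # line breaks quantize the height
    width = chars_per_line * 0.1                  # column width in inches (made-up scale)
    height = lines * LINE_HEIGHT
    products.append(width * height)
    print(f"width {width:.1f}: height {height:.1f}, product {width * height:.1f}")
```

Plotted in Desmos, points like these hug a curve of the form height = k/width, which is exactly the kind of insight the activity is after, while the rounding to whole lines explains the “nearly” in “nearly constant”.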