The app that would end math homework


I want to show you something. First, below, you’ll see real seventh-grade math homework, which I acquired from my father, who works at my old middle school. You probably recognize this sort of sheet from before “math” became algebra, geometry, and the calculus.

And then, the video below shows PhotoMath, a new app by the startup MicroBlink. Point a phone’s camera at an equation and it solves for a variable, sparkles twinkling around the letters and numbers. Since its debut a couple weeks ago, PhotoMath has become one of the most popular free applications in Apple’s App Store. It’s not the first software that can do basic algebra (that would be 1967’s MATHLAB). Nor is it the first app that can recognize letters and numbers that you show to it: those have been around for years. But it’s the seamless way PhotoMath combined character recognition and math skill that captured people’s imaginations. More than two million people have watched the video demoing its capabilities.
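To get a feel for the second half of that pipeline, here is a minimal sketch of what the equation-solving step might look like once the camera has already turned ink into text. This is not MicroBlink’s code; it assumes the recognized equation arrives as a plain string and uses the open-source SymPy library to do the algebra.

```python
# Illustrative sketch only: assumes character recognition has already
# produced a text string such as "m - 3 = -6". Not MicroBlink's code.
from sympy import Eq, solve, symbols, sympify

def solve_recognized_equation(text: str, variable: str = "m"):
    """Parse a recognized one-variable equation and solve it."""
    left, right = text.split("=")
    var = symbols(variable)
    equation = Eq(sympify(left), sympify(right))
    return solve(equation, var)

print(solve_recognized_equation("m - 3 = -6"))  # prints [-3]
```

The hard part, of course, is everything before that string exists: reading handwriting and textbook typography off a phone camera in real time.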

Could it be that math homework, as we once knew it, is obsolete? I tried the app on this seventh-grade assignment and got answers to all 14 problems in under a minute.

This seems like a technological miracle! On the list of robots children of the 1950s might have asked for, an awesome free math-homework robot would have been up there with a bed-maker.

So, what could such an app actually do to math education? I got in touch with Nam Nguyen, who teaches math at my alma mater, View Ridge Middle School in Ridgefield, Washington. A tech-engaged teacher, Nguyen was already familiar with the app and had invited his students to play with it. (They were annoyed it couldn’t solve absolute-value functions.)

Nguyen isn’t worried about the app’s impact on his classes. The kind of thinking he wants from his students isn’t something software can do for them. “My teaching has shifted more to asking questions like, ‘What do you think? What is the first question you have? What do you know right now? Take a guess. What information do we need? How can we get that information?’” he said.

The leading edge of math education does seem to be shifting toward more open-ended problem solving and away from grinding through problem sets. Even the controversial Common Core curriculum tests, as they might be implemented in New York, for example, would be immune to PhotoMath, because none of the questions for seventh graders are simple equations.

So there is a megahappy scenario here: the robots take on all the busy work, and the students end up doing more substantive thinking.

Dan Meyer, a former math teacher and Stanford PhD student in math education, summarized this hope in a recent blog post about PhotoMath. “It’s conceivable PhotoMath could be great for problems with verbs like ‘compute,’ ‘solve,’ and ‘evaluate.’ In some alternate universe where technology didn’t disappoint and PhotoMath worked perfectly, all the most fun verbs would then be left behind: ‘justify,’ ‘argue,’ ‘model,’ ‘generalize,’ ‘estimate,’ ‘construct,’ etc,” Meyer wrote. “In that alternate universe, we could quickly evaluate the value of our assignments: ‘Could PhotoMath solve this? Then why are we wasting our time?'”

In other words, we should root for both a perfect robot equation solver and hope that it catalyzes innovation in math education. But, as Meyer hints, the technology does disappoint. The ways the app doesn’t work paint a miniature portrait of what will be so confounding about a world laced with artificial intelligence produced by today’s tech industry.


The first disappointment is the accuracy. It’s good, but not anywhere close to perfect. That’s a problem, Nguyen told me, because students come to rely on these tools even when their mathematical intuition tells them something is off. “Students tend to think that if a calculator says it, it must be correct,” he said.

Even if the students did want to debug the PhotoMath solutions, they might find it difficult. The mistakes that the system’s artificial intelligence makes are different from the ones a human might.

For example, as I was going through the answers above, you may have noticed that the software confidently gave the answer to question 8 (m – 3 = -6) as 7. That’s wrong. The right answer is -3 (just add 3 to both sides). How could it possibly have gotten 7? You can’t even get a 7 by adding or subtracting 3 and 6 in any combination. This was clearly not an error that a human would make.

PhotoMath does provide the steps that it took to solve the equation, so I looked back at that problem and discovered that it had recognized the -6 as a 4. Imagine introducing that kind of error into a longer string of calculations or problem solving. You’d have no idea how you got this strange number.
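To see how a single misread character cascades, here is a quick sketch, again using SymPy rather than the app’s own code, of the same problem solved with the correct input and with the -6 misread as a 4:

```python
from sympy import Eq, solve, symbols

m = symbols("m")

# What the worksheet actually says: m - 3 = -6
print(solve(Eq(m - 3, -6), m))  # [-3]

# What the app apparently "saw": m - 3 = 4
print(solve(Eq(m - 3, 4), m))   # [7]
```

The algebra in both cases is flawless; the recognition step quietly swapped in a different problem.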

The interesting thing about replacing cheating off a friend with cheating from a computer is that machines are weirder than our friends, even the ones who like dipping Cheetos in ranch dressing. The machines may be able to solve our problems, but their methods are not human. Their intelligences are narrow. Their communication abilities are weak. PhotoMath might help someone polish off an assignment in record time, but PhotoMath is not your friend.

Now, it is possible that MicroBlink could fix up the app so that its accuracy is stunning, better than any human’s, and its explanations are clear and parseable by students. But it probably won’t.

Because PhotoMath is a marketing gimmick. That’s clear from the app itself, which isn’t programmed to recognize or solve many basic types of equations that someone taking seventh-grade math, let alone higher levels, might encounter.

A late October blog post by MicroBlink demonstrates that the company never really considered what it would mean to apply its actually pretty awesome machine-vision technology to the daily lives of thousands of school kids and their parents and teachers.

“If we can eliminate kids’ frustration at the point when they can’t do anything else but helplessly stare at the book, we’ll feel awesome. It’s as simple as that,” they write. “In fact, we’re sure that the same questions were raised when calculators entered classrooms. With PhotoMath, our goal is to make a much more useful calculator.”

But it is not really as simple as that. Tech startups’ most powerful strategy has been making previously difficult things—like hiring a car or booking a room in someone’s house—really, really easy. Sure, you could call a cab before Uber and Lyft, but the convenience of their services changed the way their users thought about grabbing a ride across town. Making a more “useful” calculator would change the way people used it.

If I had to guess (and I do because I have not heard back from MicroBlink’s executives), I would say that the folks at MicroBlink just didn’t think about the whole operation very much. The kids and teachers—to the extent the app disrupts their lives—are collateral winners or losers in MicroBlink’s promotional efforts.

“We are not an educational company,” MicroBlink co-founder and CEO Damir Sabol told TechCrunch. “We are promoting our machine vision technology with PhotoMath.” MicroBlink’s actual clients are banks. This app is a toy.

There probably would be great value in an app that could solve—and explain how it solved—any equation at which it was pointed. But building the business that builds that app is not going to make anyone a billionaire. And it would require tiresome and inefficient interfacing with teachers and students, even assistant superintendents sometimes. It ain’t WhatsApp.

So, instead, we get a neat demo app powered by the waste heat of a mobile machine vision startup that will probably disappear when MicroBlink is snatched up by Google, Microsoft, Amazon, or Facebook.

It’s not that MicroBlink owes the world any more than this. But this is how Technology™ disappoints.
