


"Show me some things my teachers may have been wrong about" seems to me an easier request to adequately address than "show me some things my AI tutors lied about", because the former tend to cluster around a common space and form an experience with a great deal of overlap between students, while I'm not sure the latter do. One can write a book addressing common misconceptions picked up in school, from human teachers, and cover much of it. Does that work for the lies AIs tell? They could be about almost anything; they could be broadly wrong or subtly wrong, etc.

At least misleading instruction from teachers tends to be of a sort that others also experience, so it can later be addressed en masse (e.g. mythologized history, or poor-but-common explanations for things like how airplane wings work). Or teachers spread the same political propaganda that can be found all around, so it is pretty easy to spot. It's not usually the case that you ask them a question they can't answer and they confidently make up something entirely wrong, possibly along with an incorrect explanation that few or no other people have ever seen.

Circa late 2015, early 2016, my cofounder and I spent ~6 months attempting to build exactly this. Our hypothesis was that students self-studying for the SAT used books (e.g. Kaplan), but the solutions in the back were poor at explaining how to arrive at the answer: they usually just cut to the chase rather than walking through the approach and helping you get there yourself. So we took each practice SAT math question and wrote out a solution spread over four swipeable pages.
