While previous generations relied on content knowledge (what you know) to be successful in academia or professions, education is shifting to focus more on process knowledge (what you can do). As educators retool to teach in new ways, assessment of students is also changing. A new wave of standardized testing is arriving that attempts to gauge students’ process knowledge.
We sought to create a tool that would help students prepare for these new types of exams and build confidence in their problem-solving abilities.
[Examples of content knowledge vs. examples of process knowledge]
We explored the existing landscape of test prep and general educational apps, including popular language learning, coding and STEM education apps targeted to high school and adult learners.
Everyone is copying everyone else
They all rely on more or less the same model of posing multiple-choice questions and providing instant feedback on the learner’s response.
None of them are actually fun
They rely on gold stars and badges to “gamify” an essentially dull experience. Many of the most popular of these apps make sure to carefully pace the challenges so that successes outweigh the failures and the task feels “easy.”
Teachers endorse these learning models
When we interviewed educators in the early stages of our design work, they felt confident that these repetitious interactions were successful in building knowledge as well as increasing students’ comfort with the testing format.
We built our first prototype copying many existing educational app conventions, planning to later layer on features that would make the interaction more engaging and rewarding.
Adapting the format to a new type of test
The first version of the app solved some unique challenges, such as rendering large complex graphs and diagrams on a small phone screen and organizing data in a way that would best mimic the testing context.
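One common way to fit a large diagram onto a phone screen in a web context is to serve it as an SVG and zoom by shrinking its viewBox around a focal point. A minimal sketch of that general technique, purely for illustration (the zoomSvg helper is hypothetical, not the app’s actual code):

```ts
// Illustrative sketch only: zoom a large SVG diagram by shrinking its
// viewBox. Assumes the <svg> element has a viewBox attribute set.
function zoomSvg(svg: SVGSVGElement, factor: number): void {
  const vb = svg.viewBox.baseVal; // the currently visible region
  const cx = vb.x + vb.width / 2; // center of the current view
  const cy = vb.y + vb.height / 2;
  vb.width /= factor;  // factor > 1 zooms in, factor < 1 zooms out
  vb.height /= factor;
  vb.x = cx - vb.width / 2; // keep the zoom centered on the same point
  vb.y = cy - vb.height / 2;
}

// Example: wire this to pinch or double-tap handlers elsewhere.
// zoomSvg(document.querySelector("svg")!, 1.5);
```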
Following existing patterns
Our early prototypes did not stray far from the patterns established by industry-leading educational apps. We utilized a familiar question and answer format with instantaneous feedback about whether an answer was correct.
Gamifying?
Knowing that the experience of using the app was not intrinsically rewarding, we brainstormed ways of incorporating external rewards. Unlocking AR experiences? Earning badges?
Getting student buy-in
Testing with users revealed that the app was usable, and learners felt cautiously optimistic that the app would help them get a better score on the exam. However, participants often communicated their growth in terms that referenced content knowledge rather than process knowledge.
Setting the Tone
We attempted to break student expectations about traditional content knowledge exams by reminding them that this was a different kind of test.
Starting fresh with new insights
Because our initial research only led us down a path of reproducing a well-worn solution to our problem, we wanted to zoom out and re-contextualize the challenge of creating an educational experience for demonstrating process knowledge.
How does developing process knowledge work in real life?
We took advantage of an opportunity to observe and analyze student experiences in a course that was taking a process knowledge approach to learning web development. (I was the teacher.)
How do learners’ beliefs about themselves as learners inform how they engage with process knowledge tasks?
Through in-person conversations, written self-reflections and course feedback from students, we began to better understand how students felt about a process-knowledge approach. Transitioning to a new mindset about learning after spending a decade or more in traditional content-oriented academic models was hard!
Synthesizing student experience
We began to develop fresh insights about the learner experience and create new design criteria to guide development of a truly unique solution.
Students have deeply ingrained preconceptions about testing and academic tasks.
While these may be useful in a content knowledge-based academic environment, they don’t serve students well when acquiring process knowledge.
We need to create an experience that isn’t burdened by existing expectations and assumptions.
We can do this by abandoning or purposefully inverting the traditional patterns and language of testing and academia.
In previous iterations, each question and answer stood alone, with no relationship to or dependency on any other question. Now the challenges are organized into multi-step missions, so that departing from the game after just a few questions feels like leaving a party early (almost).
The length of a challenge signals how much time is the right amount to spend on the app, and the creation of a finish line increases the intrinsic reward of completing a mission.
Testing with our teenage users revealed that they were already familiar with gestures for common interactions (such as scrolling, zooming and swiping). This allowed us to clean up many of our interfaces and reduce the number of controls on each screen.
While the “resource drawer” we created in the first iteration was a functional way of exploring large test content on a small screen, putting the materials on their own thematic planets and letting students explore those planets with the gyro controls on their phones was more playful.
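For a sense of how gyro-driven exploration can work, here is a minimal illustrative sketch using the browser’s DeviceOrientationEvent API; the PAN_RANGE constant and attachGyroPan helper are hypothetical, and this is not the app’s actual implementation:

```ts
// Illustrative only: pan a "planet" scene as the phone tilts.
// Note: iOS 13+ requires calling DeviceOrientationEvent.requestPermission()
// from a user gesture before these events start firing.
const PAN_RANGE = 40; // max pan in each direction, as % of scene size

function attachGyroPan(scene: HTMLElement): void {
  window.addEventListener("deviceorientation", (e: DeviceOrientationEvent) => {
    // gamma: left/right tilt in degrees (-90..90)
    // beta: front/back tilt in degrees (-180..180)
    const x = ((e.gamma ?? 0) / 90) * PAN_RANGE;
    const y = ((e.beta ?? 0) / 180) * PAN_RANGE;
    scene.style.transform = `translate(${x}%, ${y}%)`;
  });
}
```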
Our analogous research showed us just how frustrated students were when seeking answers that seemed inaccessible. To reinforce that, with process knowledge, it’s the journey rather than the destination that matters, we ask users to capture their journey (many times!) before we ever ask them to give an answer.
While existing process testing paradigms require that students ultimately provide an answer, we can prepare them to be successful by giving many opportunities to practice the process.
Initial Successes
Because of our "show, don't tell" approach to disrupting expectations about success in process knowledge tasks, students were less intimidated, less anxious and less frustrated when encountering questions that seemed impossible at first.
Observations during usability testing allowed us to discover ways for users to navigate within the app that were both more intuitive and visually simpler.
Measuring success by finding the relevant data, rather than arriving at the correct answer, makes the app less similar to the testing context, but a more accurate proxy for the skills being tested.
Long-term Goals
We would love to test this with students who are taking the NY Regents Earth Science exam to see how their use of the app impacts their score.
Two groups that are often prepping for the NY Regents exams are advanced students, who may take the exam as early as 8th grade to "get ahead" before high school, and special education students, who need to pass a minimum number of exams to graduate. Our testers did not include either of these groups, and we think understanding both would be important from an equity perspective.
We are eager to adapt In The Test to different test content. The State of California has recently introduced process knowledge science testing for 5th, 8th and 10th grade students. We would also be interested in adapting it for testing content outside the sciences.
Finally, we are also interested in exploring AR options for students who have printed versions of test references on hand and could capture images of them using the camera on their phone.