To App, Or Not To App? That Is Today’s Question

In learning and development, training, and other business settings, learning apps are sexy. They're interactive and fun to use. They grab data and spit out metrics. They claim to offer easy solutions to big problems.

Apps in the workplace are tools for accessing a curated body of information aligned with a single objective and agenda. A sales app, for example, would not deliver the world's knowledge of sales. Instead, it would provide its creators' perspective for you to emulate: select behaviors, situations, phrases, and tactics, all assembled for you to try on your own.

Apps have their place—no argument—but for learning and development, they fall short. Apps are not the panacea for deep understanding! On the contrary, they are poor crutches for thinking and decision-making. In part, that’s because Artificial Intelligence (AI) is by definition…artificial. Apps can’t problem-solve or improvise to better suit the learner!

What can, you ask? The human brain!

Along with my colleague Dr. Kieran O'Mahony[1], I had the immense pleasure of presenting last week at a tier-one innovative company in Beaverton, Oregon. For more than eight years, I've had the great fortune of collaborating with these creative people. As the first company to embrace Brain-centric Design™ on a global scale, they successfully optimized metrics that enabled them to monetize learning, both through more efficient instruction and through increased consumer satisfaction.

This collaborative engagement showcases what’s happening in spaces where innovation is not based on an app.

Here, and elsewhere in our business travels, the application that we see growing each day in importance and prominence is not in the tech space.

It’s in the brain.

While many companies (and management within those companies) hear the siren’s song in a phone-based app, innovative companies are instead turning to the one app we all have. The focus? Its user-interface, or how we can optimize the brain for learning and development.

It is no surprise to neuroscientists that apps (which tend to mean a single user, a single screen, and a distracted mind) rarely result in learning with deep understanding.

Learning with deep understanding is not a mystery. It simply requires a human interaction—one human speaking to other humans, collaborating and co-creating in a safe learning space.

As the biologist and sociobiologist E. O. Wilson is quick to point out, today's workers are inundated with information but starved for wisdom.[2] The solution is within reach, as every organization has the capacity to ignite its individual contributors, the generators of ideas and implementers of meaningful practices.

Teach For Deep Understanding

Sounds easy enough until you realize that presenting information and having learners interact with that information is not enough to create deep understanding.

Most of us were never taught a method that allows us to think about our own thinking. Yet this metacognitive stance is the essence of truly knowing a subject.

We were brought up in a school system that is two-dimensional: right answers or wrong answers, sit and listen, be a passive intake unit. The most common result is that instructional designers and trainers regurgitate that behaviorist approach (what we call sage-on-the-stage syndrome). Ergo, most presenters deliver new information to their learners in exactly this way.

Be it an app, meeting, lecture, proposal, or presentation, we often teach to deer in the headlights. We then evaluate learners on however much information they retained and stratify them accordingly. If attendees can remember even half of the information provided, we congratulate ourselves on a job well done, especially if we were witty enough to add a game, append the word "micro," defend our color palette, or sprinkle in video, graphics, and a host of other buzzwords. And all that flowery stuff is meant to do what? Emulate what the brain does naturally: think critically using the prefrontal cortex.

"Neuro is finally upon us," cognitive learning neuroscientist Dr. Kieran O'Mahony told a captivated audience of global learning leaders at a recent learning and development summit. "Companies who place their trust in applications and erstwhile behaviorist learning models are just moving deck chairs on the Titanic. Remembering facts is not learning; it is memorization. Critical thinking, assessing situations, making decisions, and synthesizing big ideas—these cognitive skills are uppermost in the executive function of the prefrontal cortex. We all have this capacity, but sadly we too often suffer from EFI (Executive Function Impairment)."

As Dr. O'Mahony points out, cognitive learning, or learning that is consistent with how the brain works and how people learn, is the new direction of training and education. It's the buzz in the hallowed hallways of truly innovative companies. There is a reason Google's DeepMind emulates the brain's ability to make new neural connections.

That’s the very definition of learning.

[1] Dr. O'Mahony is a learning scientist in cognitive neuroscience from the University of Washington College of Education and the National Science Foundation's first Science of Learning Center, LIFE (Learning in Informal and Formal Environments).

[2] American biologist, researcher, theorist, naturalist, and author E. O. Wilson is quoted from his work Darwin's Natural Heir (London, 2001).