In "6 Ways to Create a Smarter Workforce in 2014," Tim Geisert, Kenexa's Chief Marketing Officer, writes:
Use science, precision and data to hire the right people for the job the first time. According to a 2012 IBM study, 71 percent of CEOs surveyed cited human capital as their greatest source of sustained economic value. So, why does HR continue to rely on gut instinct alone to make such important decisions?
Perhaps Mr. Geisert should have spoken with Rudy Karsan, one of Kenexa's founders and its CEO, who wrote in "Listening to Your Gut Feeling":
On the big decisions I have gone against my gut on a couple of occasions and it’s been a train wreck. My gut is made up of my instinct, my faith, my intuition, my experiences and data that is currently inaccessible because it’s tucked away in the deep recesses of my brain.
Then again, perhaps Mr. Karsan should have spoken with Troy Kanter, President of Human Capital Management for Kenexa, who stated in a press release:
Now, instead of making hiring decisions based on 'gut' feelings and personal likes and dislikes, hiring managers and HR can select candidates based on objective data, which also prevents potential legal ramifications and mitigates risk in the hiring process.
So, two Kenexa executives believe that going with your gut is a bad idea, but the most senior Kenexa executive believes that going against your gut is an accident waiting to happen. Who's right?
What Does Data Tell Us?
The benefits provided by the use of pre-employment assessments, whether called workforce science, talent analytics or any other name, should be readily apparent and quantifiable. For example, has the rising use of pre-employment assessments over the past 10-15 years resulted in greater employee engagement?
Gallup has measured employee engagement since 2000, and it defines “engaged” employees as those who are involved in, enthusiastic about, and committed to their work and contribute to their organization in a positive manner. The 2013 Gallup report shows that 70% of American workers are “not engaged” or “actively disengaged” and are emotionally disconnected from their workplaces.
Is the employee engagement data from the 2013 report an anomaly? No. As the following chart from the 2013 Gallup report shows, there has been little change in workplace engagement levels since 2000.
Meanwhile, the marketing pitch from assessment vendors remains the same. As Kronos puts it: "Your employees are the face of your brand and the most vital asset of your business. They drive your productivity and profitability. What’s more important than selecting the right ones? Take the guesswork out of employee selection with industry-specific, behavioral-based assessments and interview guides."
Gallup’s research shows that employee engagement is strongly connected to business outcomes essential to an organization’s financial success, including productivity, profitability, and customer satisfaction. Yet, as the report states, "workplace engagement levels have hardly budged since Gallup began measuring them in 2000."
Brain vs Computer
In "
Thinking In Silicon," a December 2013 article in the MIT Technology Review, Tom Simonite writes:
Picture a person reading these words on a laptop in a coffee shop. The machine made of metal, plastic, and silicon consumes about 50 watts of power as it translates bits of information—a long string of 1s and 0s—into a pattern of dots on a screen. Meanwhile, inside that person’s skull, a gooey clump of proteins, salt, and water uses a fraction of that power not only to recognize those patterns as letters, words, and sentences but to recognize the song playing on the radio.
All today’s computers, from smartphones to supercomputers, have just two main components: a central processing unit, or CPU, to manipulate data, and a block of random access memory, or RAM, to store the data and the instructions on how to manipulate it. The CPU begins by fetching its first instruction from memory, followed by the data needed to execute it; after the instruction is performed, the result is sent back to memory and the cycle repeats. Even multicore chips that handle data in parallel are limited to just a few simultaneous linear processes.
Brains compute in parallel as the electrically active cells inside them, called neurons, operate simultaneously and unceasingly. Bound into intricate networks by threadlike appendages, neurons influence one another’s electrical pulses via connections called synapses. When information flows through a brain, it processes data as a fusillade of spikes that spread through its neurons and synapses. You recognize the words in this paragraph, for example, thanks to a particular pattern of electrical activity in your brain triggered by input from your eyes. Crucially, neural hardware is also flexible: new input can cause synapses to adjust so as to give some neurons more or less influence over others, a process that underpins learning. In computing terms, it’s a massively parallel system that can reprogram itself.
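Simonite's contrast can be sketched in a few lines of code. The toy Python below is an illustration only; the instruction set, weights, and learning rate are all invented, and no real CPU or brain works this simply. It pairs a serial fetch-execute loop with a "neural" update in which processing the data adjusts the synaptic weights themselves:

```python
# Toy contrast between a serial fetch-execute loop and a "neural" update.
# Purely illustrative: not a model of any real CPU, chip, or brain.

# --- Von Neumann style: one instruction at a time, fetched from memory ---
memory = [("LOAD", 5), ("ADD", 3), ("ADD", 2), ("HALT", None)]
accumulator = 0
pc = 0  # program counter
while True:
    op, arg = memory[pc]      # fetch the next instruction from memory
    if op == "LOAD":
        accumulator = arg     # execute it...
    elif op == "ADD":
        accumulator += arg
    elif op == "HALT":
        break
    pc += 1                   # ...then advance, strictly one step at a time
print("serial result:", accumulator)  # prints 10

# --- "Neural" style: every connection contributes, and the weights change ---
inputs = [1.0, 0.0, 1.0]      # incoming signals (toy values)
weights = [0.2, 0.5, 0.1]     # synapse strengths

# Conceptually all synapses act at once; the sum is one neuron's response.
activation = sum(w * x for w, x in zip(weights, inputs))

# Hebbian-style plasticity: co-active input and output strengthen a synapse,
# so processing the data rewrites the "program" (the weights) itself.
rate = 0.1
weights = [w + rate * activation * x for w, x in zip(weights, inputs)]
print("activation:", activation)     # 0.3
print("adjusted weights:", weights)  # approximately [0.23, 0.5, 0.13]
```

The structural difference is the point: in the first loop the program is fixed and the data moves through it one step at a time, while in the second every connection contributes at once and the act of processing reshapes the connections that will handle the next input.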
Okay, but what about computing at the bleeding edge, like the "cognitive computing" of IBM's Watson?
Not So Elementary
According to a January 9, 2014 article on CIO.com, IBM says cognitive computing systems like Watson are capable of understanding the subtleties, idiosyncrasies, idioms and nuance of human language by mimicking how humans reason and process information.
Whereas traditional computing systems are programmed to calculate rapidly and perform deterministic tasks, IBM says cognitive systems analyze information and draw insights from it using probabilistic analytics. In effect, they continuously reprogram themselves based on what they learn from their interactions with data.
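For readers unfamiliar with the distinction IBM is drawing, here is a minimal, hypothetical sketch of it in Python: a hard-coded deterministic rule next to a probabilistic estimate that is revised by evidence. The simple Beta-Bernoulli update below is chosen purely for illustration; nothing here reflects Watson's actual, proprietary internals.

```python
# A deterministic rule versus a probabilistic estimate revised by evidence.
# Hypothetical illustration only; not IBM's (proprietary) approach.

def deterministic_rule(score: float) -> bool:
    """Traditional style: a hard-coded threshold that never changes."""
    return score > 0.8

class BetaEstimate:
    """Probabilistic style: a belief about a hypothesis, updated by outcomes.

    Uses a simple Beta-Bernoulli model: each confirming or refuting
    observation shifts the estimated probability that the hypothesis holds.
    """
    def __init__(self, prior_correct: float = 1.0, prior_wrong: float = 1.0):
        self.correct = prior_correct  # pseudo-count of confirmations
        self.wrong = prior_wrong      # pseudo-count of refutations

    def observe(self, was_correct: bool) -> None:
        if was_correct:
            self.correct += 1         # new evidence raises the estimate
        else:
            self.wrong += 1           # new evidence lowers it

    @property
    def confidence(self) -> float:
        return self.correct / (self.correct + self.wrong)

# The deterministic rule gives the same verdict forever; the probabilistic
# estimate moves with every interaction, i.e. it learns from its data.
print("deterministic verdict:", deterministic_rule(0.75))  # always False

belief = BetaEstimate()
for outcome in [True, True, False, True]:  # feedback from four interactions
    belief.observe(outcome)
print(f"estimated probability of being correct: {belief.confidence:.2f}")  # 0.67
```

The contrast, not the particular model, is what matters: the rule never moves, while the estimate is reshaped by every new observation.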
Said IBM CEO Ginni Rometty, "In 2011, we introduced a new era [of computing] to you. It is cognitive. It was a new species, if I could call it that. It is taught, not programmed. It gets smarter over time. It makes better judgments over time." "It is not a super search engine," she added. "It can find a needle in a haystack, but it also understands the haystack."
Yet Watson is having more trouble solving real-life problems than "Jeopardy!" questions, according to a review of internal IBM documents and interviews with Watson's first customers.
For example, Watson's basic learning process requires IBM engineers to master the technicalities of a customer's business—and translate those requirements into usable software. The process has been arduous.
Klaus-Peter Adlassnig is a computer scientist at the Medical University of Vienna and the editor-in-chief of the journal Artificial Intelligence in Medicine. The problem with Watson, as he sees it, is that it’s essentially a really good search engine that can answer questions posed in natural language. Over time, Watson does learn from its mistakes, but Adlassnig suspects that the sort of knowledge Watson acquires from medical texts and case studies is “very flat and very broad.” In a clinical setting, the computer would make for a very thorough but cripplingly literal-minded doctor—not necessarily the most valuable addition to a medical staff.
As Hector J. Levesque, a professor at the University of Toronto and a founding member of the American Association for Artificial Intelligence, recently wrote:
"As a field, I believe that we tend to suffer from what might be called serial silver bulletism, defined as follows:
the tendency to believe in a silver bullet for AI, coupled with the belief that previous beliefs about silver bullets were hopelessly naıve.
We see this in the fads and fashions of AI research over the years: first, automated theorem proving is going to solve it all; then, the methods appear too weak, and we favour expert systems; then the programs are not situated enough, and we move to behaviour-based robotics; then we come to believe that learning from big data is the answer; and on it goes."
Similarly, assessment companies have marketed the benefits of "science, precision and data" over the past fifteen years under the guise of neural networks, artificial intelligence, big data and deep learning, yet what has changed? Employee engagement levels have hardly budged and employee turnover remains a continuing and expensive challenge for employers.
The more things change, the more they remain the same or, in deference to Monsieur Levesque, "plus ça change, plus c'est la même chose."