Predicting who will fail and when – a Marine Corps study with broad implications

This post explores the potential of people analytics in a military environment. Using a study from USC involving Marines in a training environment, it argues that a focus on quantifiable, objective data is necessary for widespread adoption of analytics in decision making.

A 2019 study by the University of Southern California’s Center for Body Computing followed 121 Marines through an elite training course, combining data collected beforehand with continuous collection during the 25-day course. The objective was to determine who would succeed. Because failures carry a high cost for both the organization and the individual, discovering reliable predictors could have a large positive impact on efficiency and readiness for the Marine Corps. Surprisingly, physiological factors such as hours of sleep, step counts, and heart rates were not significant predictors. Personality factors such as extroversion and positive affect were the strongest indicators. The researchers also identified points during the course when voluntary dropouts were more likely to occur (e.g., right before a physically challenging event). Only 56 of the participants passed.

The ability to collect data on large populations in a tightly controlled environment where there is no expectation of privacy makes the military a prime place to study people analytics. High dropout rates, which result from pushing people to mental and physical extremes, create a ripe environment for data collection and analysis. Another example of people analytics in the military is Angela Duckworth’s West Point study: through her study of 11,000 West Point cadets, she was able to engineer “grit” as a feature predicting success. Unfortunately, selection bias is hard to overcome in this area because of the volunteer nature of the US armed forces and the lack of ethnic and gender diversity. However, I think this Marine Corps study only scratches the surface of what is possible with military experiments, and that the lessons can be extrapolated to other organizations.

My main critique of USC’s study is its reliance on personality tests rather than quantifiable data. The researchers used surveys before training started to score candidates on six personality traits – openness, conscientiousness, extroversion, agreeableness, neuroticism, and ego resilience. During the training, they measured physiological data such as caloric intake, sleep duration, step count, and heart rate. It would be interesting to infer personality type from quantifiable data, or to expand the continuous data collection (voice data would be a great supplement). This study used a small sample and a short course; I think that limiting the surveys could increase the volunteer pool, and that connections between physiological and personality factors could be found. For example, a trainee with a higher step count than his peers might be volunteering more often and thus may have a higher degree of empathy. If voice data were collected, someone who speaks more during periods of duress could be showing leadership promise. Eliminating subjectivity is a major obstacle to implementation. It will take very concrete and defensible analytics to help make decisions about who gets to enter training and who gets failed. Personality tests and surveys will never make the cut.
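To make the shift I’m advocating concrete, here is a minimal sketch of how continuous physiological measurements could feed an objective dropout-risk model. Everything here is hypothetical – the features, the effect sizes, and the synthetic data are my assumptions for illustration, not the study’s actual model or results:

```python
# Hypothetical sketch: logistic regression predicting voluntary dropout
# from continuous physiological features (no external libraries needed).
# Feature names and the data-generating process are illustrative only.
import math
import random

random.seed(42)

def make_trainee():
    sleep = random.gauss(6.0, 1.0)    # hours of sleep per night
    steps = random.gauss(18.0, 3.0)   # daily steps, in thousands
    hr = random.gauss(60.0, 8.0)      # resting heart rate, bpm
    # Assumed ground truth: less sleep and a higher resting heart rate
    # raise dropout risk; step count matters only slightly.
    logit = -0.8 * (sleep - 6.0) + 0.1 * (hr - 60.0) - 0.05 * (steps - 18.0)
    dropped = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [sleep, steps, hr], dropped

raw = [make_trainee() for _ in range(500)]

# Standardize each feature so plain gradient descent behaves well.
means = [sum(x[i] for x, _ in raw) / len(raw) for i in range(3)]
stds = [math.sqrt(sum((x[i] - means[i]) ** 2 for x, _ in raw) / len(raw))
        for i in range(3)]
data = [([(x[i] - means[i]) / stds[i] for i in range(3)], y) for x, y in raw]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Batch gradient descent on the logistic-regression log-loss.
weights, bias, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(500):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        err = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias) - y
        for i in range(3):
            gw[i] += err * x[i]
        gb += err
    for i in range(3):
        weights[i] -= lr * gw[i] / len(data)
    bias -= lr * gb / len(data)

accuracy = sum(
    (sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias) >= 0.5) == (y == 1)
    for x, y in data
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The point of the sketch is that every input is a continuous, objective measurement, and the learned coefficients are auditable – a commander could see exactly why a trainee was flagged as high-risk, which is the kind of defensibility surveys cannot offer.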

Training has long been used as a selection process in the military: a training program filters out undesirable characteristics, so you get the people you want in the organization. But if analytics can fulfill that function, how will training adapt? Growth, learning, and bonding are also fundamental goals of training. Is a high degree of “fear of failure” necessary to get the most out of training? Out of an employee? During my training in the Marines, I felt like I had to prove myself every day. If selection becomes efficient and attrition rates dwindle, the selectivity and intensity of the training will be questioned. A corporate analogy would be an organization where everyone gets promoted – the complacency trap is hard to avoid. Both the means and the ends of surveys and subjective personality tests are undesirable; military studies should shift their focus to continuous collection of objective data. This shift can not only reduce bias, it can increase adoption.


Sources:

https://preprints.jmir.org/preprint/14116#Abstract

https://www.economist.com/science-and-technology/2020/02/27/how-to-decide-in-advance-who-will-pass-advanced-military-training

https://penntoday.upenn.edu/news/Penn-Angela-Duckworth-looks-beyond-grit-predict-success

Picture courtesy of:

https://www.military.com/military-fitness/marine-corps-special-operations/usmc-recon-fitness-training


2 thoughts on “Predicting who will fail and when – a Marine Corps study with broad implications”

  1. “Is a high degree of ‘fear of failure’ necessary to get the most out of training?” – That is such an interesting way of looking at this! It suggests to me that there is a virtue of bravery that may be inseparable from the Marine Corps experience. People overcome their shortcomings by facing their greatest fears and weaknesses. Analytics may substitute away the need to cultivate that virtue of bravery and thus change what it means to be a Marine. A counterpoint to consider: perhaps analytics pushes the line forward, and with its adoption more will be expected from candidates in an already rigorous recruiting process.

  2. Incredibly interesting article Blake – thanks so much for sharing!

    What struck me was your sentence “It will take very concrete and defensible analytics to help make decisions about who gets to enter training and who gets failed. Personality tests and surveys will never make the cut.” My question to you (as someone without a military background): would you ever want anyone to not even be able to enter training? Is there anything that would make the cut?

    An interesting approach to this is the engineering entrance exam (which I helped design as an Ed Rep at my university). In Belgium, we are firm believers that everyone deserves a chance to enter university, so we do not have any admissions process – you just show up. However, the government wanted to tackle the >50% drop-out rate after only one year in the engineering program, which was costing a lot of money. What we designed was an intake exam that was compulsory but not binding: it shows you the likelihood of succeeding, based on prior cohorts with the same scores, but you are still allowed to enter the program. The interesting thing is that you, as an applicant, know what you need to work on to get there. This might be interesting for the army as well?
