Matt Tufano

  • Alumni

Activity Feed

On April 14, 2021, Matt Tufano commented on Debugging the Pulse Check Survey :

I’m using a large pulse survey – the Census Bureau’s Household Pulse Survey – for a final project in Managing the Future of Work! (Sidebar: they started it about a year ago and are tracking people’s experience through the pandemic. There is a LOT of cool stuff in there if you want to learn more about it.)

Putting that aside, one sentence jumped out at me as a call to action: “No ideas have been incorporated by management.” It feels like an easy win would be to use pulse surveys to test ideas that are ready to be implemented but just need more supporting data. Beyond getting validation from employees, it also draws a clear link: the survey DOES inform workplace decisions, so we should contribute to create change! I like the quick cycle time of these surveys compared to bigger engagement questionnaires, and I worry that without a quantitative tool like this, it will be hard to track what’s happening in any structured way.

Also, I agree that averages throw out a lot of information – we build a robust tool and then compress all of the data into a few numbers that are easily skewed! One fix: we need to start talking about and using dispersion statistics more (histograms and standard deviations would be a good starting point!). I read a book called “The End of Average” that dives into alternatives to central-tendency analysis that you might enjoy; I remember it focused on finding ways to capture the “jaggedness” of data.
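To make the point concrete, here's a minimal sketch (with made-up survey scores, not real data) showing how two groups can report the identical average while telling very different stories – exactly the information that dispersion statistics like the standard deviation recover:

```python
# Hypothetical sketch: two teams' pulse-survey scores with the same mean
# but very different spread -- the average alone hides the difference.
from statistics import mean, stdev

team_a = [6, 6, 7, 7, 7, 7, 8, 8]    # tightly clustered around 7
team_b = [2, 3, 5, 7, 7, 9, 11, 12]  # same mean of 7, widely spread

# Both teams average exactly 7 ...
print(f"Team A: mean={mean(team_a)}, stdev={stdev(team_a):.2f}")
print(f"Team B: mean={mean(team_b)}, stdev={stdev(team_b):.2f}")
# ... but Team B's standard deviation is several times larger,
# signaling a polarized team that a single average would mask.
```

A histogram of each list would show the same contrast visually: one tall narrow peak versus a flat, spread-out shape.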

On April 14, 2021, Matt Tufano commented on Should your employer care how much sleep you’re getting? :

I agree with Kamal, Jamie, Korn, and Berten on privacy concerns! Even if it were designed with good intentions, collecting sleep data (or really, any biometrics) is creepy. What if companies focused benefits on subsidizing access to services like Calm/Headspace, nutrition apps/programs, and fitness programs (such as discounts on Peloton-type classes or Fitbits)? Instead of tracking biometrics, you could track app participation and usage at the aggregate level. As “people analysts,” we could evaluate participation and engagement in these benefits against sentiment, engagement, and other outcomes at work, but without collecting sensitive information. It also empowers people to take action without feeling watched.

On April 14, 2021, Matt Tufano commented on Narrative risks and the curse of the “10%” answer :

Confirmation bias: easy to identify, difficult to overcome! (I speak from experience). Laura, I think you’re on to something that we could take from LPA and apply to the future. If we’re involved in people analytics work (or really, any data-driven work), we could ask the question “what do we know to be true, and HOW do we know that?” Going through those discussions and surfacing assumptions could be a great first step to overcome bias. I think data transparency helps here, too: the more people can see the data and follow the analysis, the more likely we are to gather cool ideas and challenge any prevailing/confirmed views. It puts more onus on us as analytically-minded leaders to teach and lead through the data. If we hear signals of narrative bias (“the data should show…”, “this is how it’s always done…”, or “we know the answer already…”), we can respond with this trick: If we truly know the answer, why are we even asking the question and wasting everyone’s time??