-- Intro --
Welcome to Workplace Geeks, the podcast that is on a crusade to check out, prospect and probe the world's most exciting and innovative workplace research with the amazing brains behind it. I'm Chris Moriarty…
…and I'm Ian Ellison. And did you really say probe just then?
Possibly, and we are your hosts on this audacious workplace excursion.
Before we dive into today's interview, first, we will have a little rummage in Ian's mailbag. Ian, who has been in touch and what did they have to say?
So, Chris, we have had a message, a really lovely message, from Ellen Flood actually, from Cottee Parker Architects over in Australia, and I believe also now New Zealand. So, Ellen said: I've recently discovered your show, quickly making my way through all the episodes, loving what you guys are doing.
I'm a graduate architect from Australia. I did my master's thesis on productivity and wellbeing in the workplace. I'd love to hear an episode on the behavioural side of workplace design, i.e. our mood, focus, motivation, etc., and how it influences us, thus impacting employee performance.
So, Ellen, thank you, it's amazing to get feedback like that. It makes us really, really happy that somebody out there is listening. Our question to you then is: what paper, project or piece of research do you think Workplace Geeks should dig into on behalf of our listeners? You know that the format is that we have to have something to start with. So, is there a paper, is there something which you really, really valued from your own studies? Hit us up on LinkedIn, let us know.
Lovely stuff. And if you're listening from Australia, or any other far-flung location, and you want to pay for us to bring the Workplace Geeks roadshow to your local event, then please email email@example.com. In fact, that's a nice reminder for you to get in touch about anything: episode feedback, interview suggestions, reflections on workplace, exotic event bookings. You can always email us, find Ian and I on LinkedIn, or the Workplace Geeks page. Come on and join that growing community. Now that's our housekeeping done, let's hear more about who we're speaking to in today's episode, Ian.
So, on the show today, Christopher, we're joined by Rob Briner, Professor of Organisational Psychology in the School of Business and Management at Queen Mary University of London. So, Rob topped HR Magazine's Most Influential Thinker list on a number of occasions before the pandemic. Rob is also a super proponent of an approach to decision making, and I would argue to thinking more critically, which is called evidence-based practice. And I believe this is now even being embedded at the very heart of the CIPD's new profession map, which was launched a few years ago.
Now, evidence-based practice is an approach that Rob argues tends to get used when the stakes are high, from a people perspective. So, for example, in education, in medicine, in social services, policing, and even government policy development. We'll get well into evidence-based practice in our discussion. So, I'm not going to say too much more here. But it is worth saying that we use it to explore employee engagement, the future of work, absenteeism, and even the pros and cons of benchmarking, amongst other things.
But what I will say though, Chris, one more thing, is that Rob also gets major props from me for his northern roots. So, he's got his academic chops with degrees at Hull, at Durham, and at Sheffield. So big up the northern massive, I'm sure James will have something to say about that.
The northern powerhouse is real, so wonderful stuff. Now this chat with Rob was so good, we didn't want to edit it down too much, that means that we haven't got time for James to join us for a Pinder Ponder afterwards. He did mention to me recently that he signed up for a blacksmithing course, so perhaps it'll give him a chance to buy an anvil or something similar.
Instead, though, you will get some reflections from Ian and me, and then next week James will be back, where he'll reflect on this chat with Rob, and our next one, who may be another of HR's most influential folks, so look out for that. But for now, let's hear what Rob had to say.
-- Interview --
Hi, Rob, welcome to Workplace Geeks. For the people listening that might not have heard about you or your work, just give us a brief little overview of some of the stuff that you get up to.
Yeah, so I'm a Professor of Organisational Psychology at Queen Mary University of London. And I do teaching and research in the areas of organisational psychology and HR, particularly interested in things like wellbeing, the psychological contract, and various aspects of HR.
But actually, probably more importantly to me generally, is that I am a proponent of the idea of evidence-based practice and trying to make better informed decisions in areas such as FM and organisational psychology and HR and management. So that's the probably more important thing for me at the moment.
So, unpack that for us a little bit, the evidence-based practice, because I follow you on Twitter and I see from time to time you'll throw up a graphic that you have grabbed from somewhere. And it will make some bold claim about the five things you need to do to be much more productive, or something like that. And you'll, maybe a little tongue in cheek, have a little poke at that. So, talk to us about evidence-based practice as a very serious topic, but also why, at times, you focus in on some of those messages that we get hammered with on a daily basis.
I mean, I do throw up on Twitter and on LinkedIn particular kinds of posts objecting to some of these really oversimplified, exaggerated messages. Now, that's not what evidence-based practice is about. But I think in many fields of practice, we are bombarded with these simple, exaggerated solutions to problems we don't even understand. A part, a little part, I guess, of evidence-based practice is debunking, particularly in fields where there's so much noise.
And one way I think about it is, say you work in HR: there is so much noise out there, people telling us things, vying for your attention, trying to sell you stuff, telling you things work, telling you things are wonderful. And actually, you need to cut through that noise to get to, guess what, the signal. And by the signal, I mean good quality data and information of various kinds that will help you make well informed decisions. And in general, what we see often on Twitter, and LinkedIn, and in best-selling management books is the opposite: not good quality data and information.
Again, sort of drawing on your Twitter account, I saw, I think it was this morning actually, you put something up about leadership books, didn't you, sort of saying how they're kind of anecdote-rich. And, you know, me personally, I do like a leadership book. I like Simon Sinek. And, you know, a lot of that is simply his views of the world. And I suspect you've got some views on that.
But there is this kind of tension, I guess, between ideas that become really sticky and really popular, versus this sort of evidence base. But even then, I kind of get the sense, and we're going to talk about a paper in a moment, that even some of the stuff that pushes itself forward as evidence has its limitations, based on what I've read on your website and some of the work that you've done. So just explain the nuance of that spectrum, I guess.
I think the first point is you said you like some of this stuff. Now, it's great to like things, particularly things like the taste of food, music, your friends, a particular colour of paint on the wall. Fantastic, liking is the most important thing there. When it comes to making well informed decisions, liking is not helpful at all. And I think for many of us, and this is true not just for you, or for me, but for all of us, we sort of confuse how much we like something with how valid, useful, important and accurate it is.
And in fact, in practice, some of the most useful pieces of information might be things we don't like at all. You know, the building's burning down, I don't like that, let's go and do something else. Things we don't like are potentially incredibly important. So, it is difficult in a context where, and you've probably come across this term, edutainment, which cuts across education and entertainment. A lot of those popular management books, for example, including ones about leadership, are much more about entertainment than education, which is fine, but we shouldn't confuse the two. It's a bit like in my own teaching practice, or in universities: often we get rewarded for having high student satisfaction ratings. Did students enjoy this programme? Did they enjoy this course? Did they enjoy this module? That's fine, but how much students enjoy something has got nothing to do with how much they learned.
And in fact, the best available evidence I've seen on this suggests that the more students say they enjoy something, the less they learned. So, if you're reading a book and you're loving every page of it: if it's Harry Potter, brilliant; if it's supposed to be telling you something meaningful, in terms of giving you important information to help you make decisions about what to do, not good. So, it totally depends. But there is that real tension now, I think, and you're right: when people are vying to get people's attention, there's a lot of focus on will people like this, is it fun, is it interesting, does it grab them. And that's okay, but it's not what's important.
Let's go back to evidence-based practice. I've seen you talk about this on stage before, a few years ago at Workplace Trends. And I've also heard you on another podcast, and you might have a different way of exemplifying it now, but I've heard you use trying to go out for a meal…
The restaurant analogy.
Yes, exactly. So, can we use the restaurant analogy or another analogy of your choosing to sort of unlock what evidence-based practice is all about?
Because for me, it's a decision-making framework, I think that's principally what it is. And I think it's an encouragement to think more critically. But what I'm interested in is the sort of skills and whatnot, you need as well to be able to do that. So, let's start at the beginning and see how it unfolds.
So, the restaurant analogy is basically trying to get over the way in which we use information and data and evidence of different kinds to make better informed decisions. Now, I use this restaurant analogy, and it only really works if you're interested in restaurants and eating out, having a good dinner. If you're not, it doesn't work, but I'd bet that for nearly all of us, there are things that are very important to us.
And what is striking to me is that for things that are very important to us, we tend to do basically evidence-based practice without even knowing we're doing it, without even knowing what it's called or that it has a name. So, the restaurant analogy would be: I arrive in a city I don't know at all. So, for example, I'm spending a city break in Liverpool in a few weeks. I don't know Liverpool very well, I haven't been there for years. I will want to go out at least once, maybe, and have a nice meal. How am I going to get information about it?
So, if you think about the sources of information: I could just look on, say, Google; I could look at some reviews; I could look on TripAdvisor; I could look at reviews in newspapers, the Good Food Guide, Michelin. If I was in a hotel, I could ask the concierge; I could ask friends who live there. There are all these potential sources of information. If it was really important for me to get a good dinner, I'm gonna fork out a lot of money, it's really important I have a good experience, I would certainly look at more than one source of information. I would not just ask the concierge.
And that's principle number one: evidence-based practice is always about using multiple sources of information. The second thing I would do is think about the quality or the trustworthiness of that information. So, if I ask the concierge in the hotel, is what he or she says trustworthy? Well, possibly, but if you know they get kickbacks for recommending restaurants, maybe you treat it with, you know, with care. If you look at TripAdvisor reviews, is TripAdvisor reliable? If you know that businesses can pay to put up good reviews of one restaurant and bad reviews of others, maybe you don't trust it so much.
And you will tend to focus on the higher quality evidence. That doesn't mean you don't look at the other things, but you focus on the highest quality information, because that's more likely to give you the result you want. So the second principle is looking at the quality and reliability of the information you've got. The third main principle is taking a structured approach. So, I'm sitting in front of my laptop, banging away on Google, looking at all kinds of things; I might randomly throw up 10, 15, 20 restaurants in Liverpool.
But if I didn't go through some process, maybe record the information in some way, have an Excel spreadsheet, and believe me, I have done this on occasion, I will lose track of who's saying what. What do they mean? What did they say? Why do I think this restaurant's terrible? What's that based on? Was it in my 20th open tab, or wasn't it? So, a structured approach.
So, when things are important to us, we tend to take what I would regard as an evidence-based practice approach. And just to repeat: it's multiple sources of evidence and information of different kinds; it's thinking about and paying attention to the quality of the evidence, and focusing on the best quality evidence; and it's taking a structured, systematic approach to gathering and using that data. So that is basically the restaurant analogy. And to me, you can apply that to any question, problem, issue or opportunity.
So, when you bring that into, for example, an organisation, and let's say the organisation is thinking, as lots of organisations right now are, about the future of work, or what they do in this semi, maybe possibly post-pandemic situation, armed with the experiences they've had over the last couple of years. I think you've talked in the past about saying, well, look, we can go over there for that type of information, we can go over here for this type. I think you've got four different types of information. Is that right?
That's right, and when it comes to managers and organisations, and again this is borrowed from other fields, there are four main sources. The first is your own expertise and experience as a practitioner. So, part of what we're paid for is using our expertise. It would be a bit odd to go to work and say, I'm going to forget everything I've ever known, or any of my work experience. The point is, you should use that too. So that is a source of evidence.
But like any source of evidence, you have to think: how reliable and trustworthy is it? And what is it really based on? So as practitioners, we may all have a hunch that this thing will not work, and that's fine, but what's that based on? Is it based on 20 years of experience of seeing that thing 100 times and being really careful about what we understood about it? Or is it just because we don't like it, or because we experienced it once? So professional expertise is the first area, and these aren't in any particular order.
The second area of evidence would be organisational data. What's actually going on in the organisation? Do we have data and information around that might tell us something, from existing evidence, existing data? The third area is the scientific evidence. So, around a particular question or problem, if you go and look at the scientific evidence, what does the best available evidence tell you might be the problem, and what might it tell you is the best solution?
And the fourth area, and they're all equally important, is stakeholders' perspectives, views and perceptions, both on what the issue or problem is and on what the potential solution is. And in organisations that would, of course, very much include employees; it would include the senior management team, management, clients, customers, and maybe even the wider community. So, they're the main four sources of evidence, and there may be others as well, but they're the main four.
I find that absolutely fascinating, because you've said, in no particular order, which sort of implies that they all need to share space in this decision-making process. But I've done consultancy where, for example, we've relied very heavily on data, and we've used that data, that scientific sort of part of this, as an opportunity to explore things with stakeholders, and that's informed them, and, you know, we've brought our own expertise. So I can sort of see how that plays out.
But equally, we've done work where there might be a really powerful stakeholder, or a stakeholder whose opinions, we would say, almost overbear all of the other parts of it. So how do you find the space to allow all of those four things to breathe? And how do you bring them together so it becomes meaningful and actually productive in terms of the decision making?
That's called HiPPO decision-making, you've probably heard of it: highest paid person's opinion, HiPPO. And if you are in a situation where, say, one particularly powerful stakeholder's word goes, then basically it's really hard to do evidence-based practice, because that overrides absolutely everything. And you're right, that's a situation in which it's very hard to bring them together.
And obviously there's no easy way of overcoming that in particular. But what I tend to do is say to organisations: do you want to focus on what are the important problems and opportunities for your business? And most people say yeah. Then you go: do you want to do things that are more likely to capitalise on those opportunities and to solve those problems? Most people say yeah. Then you say: well, that means, yes, here's powerful stakeholder X, we can listen to them and base what we do only on them.
But how likely is that to help you focus on what's most important and what's most likely to work? It's very, very unlikely. So, you can do that, but that's not evidence-based practice, and you're very unlikely to get the outcomes you want.
So, actually, what you're also doing with this framework is being able to hold it up, show it almost in the mirror, and go: look, if we approach it this way, we may well get the most informed outcome we can get. And now that you've seen this, I can point back at it when we're going down blind alleys, or when we're overbearing in one particular area.
Yes, absolutely, but it's certainly the case, as you're implying, that sometimes people say, oh, but organisational politics rules everything, you can't get over it. There are probably circumstances where that is the case, yes. So, if you're in an organisation which is incredibly political, and people are constantly vying for power and resources, and that's the most important thing to them above anything else, then yeah, sure. And that's an organisation that probably is not going to be very successful in the medium or long term.
Yeah, there isn't much you can do about that. But I think it's about holding this up to the exec: what are you actually trying to do? If you want to do what's important, if you want to do what's more likely to work, then you need some kind of framework. And this is a framework to help you think about that and to take action around that.
Just on that point about leadership, and Ian, you were talking about leadership, and Rob, you mentioned organisational data: there is a piece of data that a lot of leadership teams are obsessed with, and I would love to get your opinion on this, and that is benchmarking data. There is this sort of constant need to look at our data and go, well, there's that. And I'm thinking about, and we'll talk a bit more about, things like engagement.
But there are other sorts of metrics where you sit there going, well, you know, how does that compare, and where does that place us? What's your take and view on benchmarking as a useful, or not so useful, tool for a leadership team?
I think it's quite limited. Where benchmarking has some value, up to a point, is where you say, we think we've got a problem with, say, absence, or the absence rate is high. And someone might say, but is it high? What does that even mean? And if you compare yourself with other organisations who are similar, in a similar sector, in a similar area of the country, etc., with a similar demographic of workforce, you might find out that actually your absence rate is more or less the same.
So, in that sense, it might be reassuring: okay, we think we've got this big problem with absence, but as it turns out, this is a normal level of absence. So, in that sense, it's okay. In another sense, it's not okay, in that it doesn't mean that an average level of absence isn't a problem for you and your organisation. So benchmarking can be either scary, oh my god, we need to do something, or reassuring, oh, we're fine.
But actually, it still doesn't answer the question: is whatever we're benchmarking okay for us or not, in our situation, given the goals of the organisation, what we're trying to do, this particular context we're in? So, I think benchmarking can sometimes be quite a lazy way of saying, does this matter, or are we fine? And it just isn't. It's useful up to a point, but then you need to look at your own context and setting, which is where those other sources of information come in.
And I suppose, I mean, you mentioned there about benchmarking and saying, you know, you might compare yourself to another organisation in a similar area with similar demographics. But there are a lot of those variables, I guess, that you have no way of finding out. So you're gonna run into a lot of dead ends, aren't you, if you're sitting there going, we've got to find a perfect profile that matches us and our strategy and our direction.
There might be different strategies. Suppose these organisations look incredibly similar, but one has a strategy, whatever the hell that means: it has a goal of being very sustainable; another organisation has a goal of, like, growth; another organisation has a goal of becoming very innovative; another organisation has a goal of reducing costs.
So, all those things would also feed very much into whether this level of absence matters or not. And again, if you dig down deeper into the organisational data, it might be: who is actually absent? From where, what jobs? What does that mean? What are the knock-on effects of that? So yeah, you're right, even if you can control for some of those things, there are lots of things it's quite hard to get information on.
We do a lot of this stuff on the podcast, where we'll talk to academics who've done a study, and they'll talk us through the methodology, they'll talk us through what they found and what they consider to be the implications off the back of that study and stuff.
But having read one of your papers that Ian is about to draw us into, you refer to these as cross-sectional studies. Typically the ones we talk about are quite specific envelopes of activity. We think they've got great value, and I'm sure you do too, but just explain them in the context of evidence-based practice, and the role they have as one of those information sources that you talk about.
Broadly speaking, I suppose it depends what you're trying to do, but often in management and organisations, and in life, we're interested in cause and effect: what is the effect of this thing on this outcome? What might the effect be, if we do this, on this other outcome? So we're interested in cause and effect. Cross-sectional studies essentially mean you measure everything at a single point in time. So it's correlational, and as we all know, correlation does not show causality.
We all know that, it's a sort of mantra. I mean, you can get some indication in some circumstances, but in general, it doesn't. So, within evidence-based practice, or in any decision, not just evidence-based practice, if you're interested in cause and effect, you will not regard cross-sectional data as very valid.
So, an obvious example people often give is: if you measure children, say, between the ages of five and 12, and you measure their shoe size and their reading ability, you will find a correlation.
So, if you're worried about the reading ability of your country or your school, would the answer be to give all kids bigger shoes, or to stretch their feet? No, because obviously shoe size is a proxy for age, and age ties closely, because of practice and rehearsal and things, to reading ability. So those studies can actually be not only not very useful, but also hugely misleading.
It doesn't mean you shouldn't look at them, it doesn't mean you shouldn't use them, like any source of evidence. Going back to the restaurant, I might look at TripAdvisor even though I don't really trust it. But you should say: what we really want is studies that are capable, in principle, of showing us cause and effect, not studies that just look at relationships at one point in time. So it's a huge challenge, particularly in fields where there are all kinds of other reasons why you might find correlations. Yeah, it's a real challenge.
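The shoe-size example above lends itself to a quick numerical sketch. The following Python snippet is our illustration, not something from the interview, and the numbers in it are invented for the demonstration: age (the confounder) drives both shoe size and reading ability, so the two correlate strongly in a cross-section, yet the correlation all but vanishes once age is adjusted for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Age is the confounder: it drives both shoe size and reading ability,
# which never directly influence each other in this simulation.
age = rng.uniform(5, 12, n)
shoe = 0.8 * age + rng.normal(0, 0.5, n)    # shoe size grows with age
reading = 10 * age + rng.normal(0, 5, n)    # reading improves with age

# The raw cross-sectional correlation looks impressive...
raw_r = np.corrcoef(shoe, reading)[0, 1]

# ...but residualising both variables on age removes the confounder.
shoe_resid = shoe - np.polyval(np.polyfit(age, shoe, 1), age)
read_resid = reading - np.polyval(np.polyfit(age, reading, 1), age)
partial_r = np.corrcoef(shoe_resid, read_resid)[0, 1]

print(f"raw correlation:    {raw_r:.2f}")      # strong, but spurious
print(f"age-adjusted corr.: {partial_r:.2f}")  # near zero
```

The raw correlation comes out strong even though shoe size has no effect on reading in the simulation; adjusting for the confounder exposes that. Real cross-sectional studies rarely let you observe and adjust for every confounder, which is exactly the limitation being described here.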
I guess the people we talk to, and in fairness to them, and not just the people we interview, you know, the wider academic community, whenever you see their papers, whenever they put the findings out, they clarify all this stuff to say, look, you know, the conclusion is we need more research. And any time we talk to our guests, we ask them, what would you do next? They're like, I would carry on this study for five years, and, you know, all this. And there are a lot of reasons why. I guess part of the challenge, though, is the communication of that.
And that's not necessarily by the academic that we talk to or, you know, the institution that they work for. Quite often it will be, particularly if it's of general national interest, some sort of media outlet, and they're not interested in telling people "it depends", and I know you've got a lot of views on those two words. You know, it does depend. They want to just give you a nice, neat, packaged-up bit of knowledge that you can then go off and tell other people about.
So, you're right, it's partly about the communication of it, but it's also about, you know, this classic "more research is needed". No, more research is not needed. Better research is needed, not more research. We've got lots of research; most of it is not very good at all. We don't need more of that. Sorry, we don't. And again, just like managers, or any practitioner, academics themselves are exposed to a series of incentives and punishments. And often the incentive for them is, back to your point, Chris: should I wait five years to do a longitudinal study and not publish other things for five years? What will that look like on my CV? What will that look like when it comes to promotion? What will that look like when I have to say what my citation index is, etc., etc.?
Researchers and academics are incentivised, like we all are, to do short-term stuff that shows activity. The question is, is that stuff useful? And I would say generally not so much, if you're interested in cause and effect. So what you have is, like I say, lots and lots and lots of not very high-quality studies, and a few okay ones. In my view, and other people's view, it would be much better to say to people: look, publish less, but do really, really good, focused research. But yes, there's also then the issue, Chris, with communication, in that, obviously, universities, researchers and academics are not going to start off telling you all the problems with the research; they're going to tell you how wonderful it is, and then mention some of the problems at the end. So yeah, that is an issue around, again, what are the incentives for people to talk about research?
See, that whole piece about incentives is fascinating. I was going to reflect on how you got there by saying we need better research. And I was busy pondering the pocket of workplace that Chris and I hail from, which is this sort of young, fledgling discipline. Architecture has been around for hundreds of years, but workspace design and facilities management, workspace management if you like, those sorts of bits are young and developing. And there's been some great work, but arguably, for lots of different systemic reasons, fundamentally the research capabilities and skills aren't that strong.
And maybe that's part of one of the limiting elements of all of this. But that was my first reflection. My second, I think more progressive, reflection off the back of what you just said there, Rob, is: knowledge is political, right? There's a politics to knowledge, and we often act like there isn't. And particularly over on the more natural sciences side of things, we just kind of treat knowledge as black and white. But actually, behind every piece of research, particularly human research, is a human with an agenda, with a politics, with a reason for doing what they're doing, even if it's just because they're interested in it.
And yet, one thing that fascinates me, and it's not quite fair, it's kind of half fair, is that sometimes researchers, scientists, will describe a result as disappointing, and you're thinking, what are you talking about? It's just your result. Why is it disappointing? Does this mean you really wanted to find this thing? And if you did really, really want to find this thing, should you be doing science? Should you be doing something else? Because you shouldn't want a particular result. I know what they mean, but it's like unconscious leakage, describing it as a disappointing finding. No, it's not; it's just your finding.
So that's interesting, isn't it? Because what that's displaying is that actually, even when you are trying to be objective, a human is, by the very definition of being human, subjective, and it creeps in; emotion creeps in.
You're right. Science has this reputation, which is wrong, for being very objective, really neutral. And it clearly isn't; it's as much part of social and political systems as any other area of human activity. And that's why in the last, I'd say, 15 or 20 years, there have been a lot of attempts to try and deal with some of these systemic problems in science.
A great example of this is the so-called replication crisis. When children first learn about science, they understand the idea of replication: you do that experiment, that's fine, but we need to do it again. Or there are going to be 30 kids in the classroom all doing the experiment to see what happens. The idea of replicating is really important.
But in my field, and I'm guessing your field and many other fields, particularly in the social sciences, if you simply try to do a replication, it'd be very, very hard to get it published. Because the journals are looking for novelty, a bit like the shiny things managers are maybe looking for. It's just the same. They're looking for novelty, new, exciting stuff. So, if you say, okay, there have been a couple of studies on this, I'm just going to do the same thing again, the same experiment, the same study, to see, as I say, you can't get it published.
So people are being disincentivised away from doing basic, important science. So the idea that you need to find systemic or structural ways of overcoming some of the, if you like, very human things that come into science is a really important part of it. It's recognising it is a human thing, with the politics, with the incentives, with the perverse incentives, and saying, okay, we can't actually change those exactly, but what we can do is put things in place to help reduce them to some extent.
Let's move across to this topic, and I know it's a topic which you've probably been invited to talk about hundreds of times now. But as you know, on Workplace Geeks, we take a single piece of work and use that as the source point to explore everything in all directions. So the thing that particularly caught our eye, Rob, is this piece of work you did for the CIPD, in partnership with, I think, an organisation, a body called Engage for Success, and it was part of their Future of Engagement project.
And it was called What is Employee Engagement? And Does it Matter? An Evidence-Based Approach. Obviously, we've had a read ready for this, but just for listeners, I know it's quite an old paper, but it's very readable, very accessible, and from a topic-based perspective it brings everything you've been discussing thus far to life. So do you want to pick up the thread and talk a little bit about it?
It is old, yes, but there are two things. One is that the idea of engagement is still around, although it is dying, at least using those terms. But also it's just a good example, and there could be others as well, a good example of how to approach a popular topic and think about how valuable or important it is. So around 15 or 20 years ago, employee engagement was really big. In the UK, the government was sponsoring studies into it, and even helping to support Engage for Success. And to me, even the very name Engage for Success tells you something about what those people really believed, rather than 'engagement doesn't matter', 'engagement doesn't make much difference', etc., etc.
So in this paper, I tried to think, okay, let's look at what Engage for Success, who are proponents of employee engagement, say. Let's look at the kind of claims that they're making, and let's look at the best available evidence, scientific evidence, published evidence we can get hold of, to see how good those claims are. So what I essentially did is address a few questions that express the assumptions they, and many others at the time, were making.
The first sort of statement is: engagement is clearly defined. If you read a lot of this stuff, it appears, and it might apply now to things like employee experience, and lots of other things I'm sure your listeners can think of, that there's a term or concept people talk about as though we all understand what it means, we all agree, we're all on the same page. Engagement is one of those examples where we really are not: there are many, many, many definitions that are all so different.
It seems to me hard to have a conversation or do anything practical if we can't even agree on what these words mean. So I think that's the first major obstacle: in the case of engagement, it can mean a behaviour, it can mean an attitude, it can mean the way you feel about an organisation, it can mean how focused you are in your job. It means lots and lots of different things. So that's the first problem: it's not clearly defined.
The second assumption in all of this work at that time, and probably still, is that employee engagement, as a concept, has valid and reliable measures. It's not just quantitative, but organisations certainly have used it as a quantitative thing. We measure employee engagement, we measure it every month, every year. We do pulse surveys, we find out how engaged people are, we see if it's going up or down, we try to understand what's happening with it.
And that only makes sense if the measures you have are basically valid, measuring the thing you want, and reliable, as in they behave effectively as measures. And again, if you look at the best available evidence, it appears there aren't really valid and reliable measures of engagement. There are measures, but it's very unclear how valid and reliable they are. Related to that, the third issue is that engagement, and again you could apply this to employee experience or other concepts, for it to make any sense to talk about it and get excited about it, has to be something different. There's no point getting excited, rolling out initiatives, spending millions of pounds and resources on this thing if it's exactly the same as something else, or other things we're already doing.
And in the case of employee engagement, it seems extremely similar, if not almost identical, to other concepts such as job satisfaction and organisational commitment. Those ideas have been around for about 70 years. It doesn't mean they're bad ideas or good ideas; it means they're historic. And it means this new idea doesn't really seem so new. So why are we getting excited about it? The fourth thing is, if you're going to get very excited about engagement, you have to assume there's plenty of quite good quality evidence that shows the value of engagement.
And that's in two ways. First, that if engagement goes up, performance goes up. What is the point of measuring engagement, and hoping it goes up, if we don't know whether or not an increase in engagement leads to an increase in performance? So that has to be there. Second, we have to know about interventions: that if we intervene to increase engagement, engagement increases, and then performance increases. Because even if we know that when one thing goes up, something else goes up, unless we know how to intervene, that's not very useful information. It's kind of interesting, but we can't do anything practical with it.
And again, if you look at the best available evidence, whether scientific or commercial, there is very little good quality evidence that supports either of those two things: very little that says if you increase engagement, back to your point about longitudinal studies, performance goes up; and very little, if any, good quality evidence that I've seen to suggest that if you intervene, it will improve employee engagement, and that increased employee engagement will subsequently improve performance. That's what you want to see. And we just don't know.
Finally, I think you'd have to say of any new exciting idea that the claims being made about it are reasonable and accurate. In the case of employee engagement, they are not reasonable and they're not accurate. So you could apply those same sorts of questions, I think, to many areas of practice in FM and other kinds of workplace stuff and say, here's this new exciting idea, let's just go through these questions and see what we come up with; maybe similar answers, maybe different answers. But if you don't have a way of asking questions and trying to answer them, it becomes very hard to work out the value, importance and relevance of this new exciting phenomenon.
Two things sprung into my head there, and they're sort of linked, because you said engagement is in decline. And yet the world is saturated. We work with organisations every day who use employee engagement tools, they might be called employee voice, but they're generally engagement tools. And they are paying significant stacks of cash, often for SaaS platforms, which allow them to understand, with measures which you would contest, what's going on longitudinally, picking up on a word you've used. So I find it absolutely fascinating that on the one hand you say it's in decline. And the other thing that bounced into my head was: you manage what you measure. So there's almost a thing here of having data that allows you to feel like you've got data.
So what I meant by engagement being in decline is the concept, the idea. If you go to an HR conference now, people are not really talking about employee engagement so much, whereas if you'd gone 10 years ago, everybody would have been talking about it as a concept. It doesn't mean organisations are not still interested in similar things. Absolutely, organisations have been interested in similar things for centuries.
But the label, the term, the idea is not such a popular thing now. And people have shifted, maybe to employee voice, maybe happiness at work, maybe purpose, maybe employee experience. These are not new ideas; these ideas have just slightly moved on. It's a bit like if you stood up at an HR conference and said, in my business we're measuring job satisfaction. People would go, hey, isn't that a bit old fashioned? Why are you doing that? And if you said, I'm doing it because it's basically the same as engagement, and possibly the same as all these other things, people would go, yeah, this doesn't sound very exciting, it sounds very last century. Which it is.
I mean, one way of thinking about this is that it's partly driven by consultancies, not all consultancies by any means, and it's partly driven by the practitioners, the clients, the customers. So for consultancies, and again I mean only some consultancies, the pressure is to develop products and services that will sell to your customers. That's your business model. If you develop, say, an employee engagement product, maybe a survey, maybe an online thing, it may have interfaces with management information systems, there may be an app, maybe training for managers, a whole suite of products on engagement. After a while, all your clients who are going to buy it will have bought it.
So what are you going to do? You need a new thing. And it might be mindfulness, or it might be employee experience; then you'll have a measure, an app, a survey, training. And again, it goes in these kinds of cycles. So it's partly that, and I think it's also partly clients and customers. The reason people keep trying new things, apart from the fact it's fun and cool, is because the things they're doing are not working very well. Or they can't tell if they are; they can't see the effects. So they're looking for something new all the time. And so again, there's this tension between those two things. So I think that's one reason why it has probably gone out of fashion: people just thought, that's old, we've done that, where's the next new thing we can do? So yeah, it has gone out of fashion, I think.
So whilst you were saying that stuff, my mind went to a Chief People Officer that I know very well and whose opinion I trust. She's seasoned and quite sceptical about things, actually, in a really positive, wanting-better kind of way. And I remember her saying to me one time that engagement is table stakes, right? If you are an HR/OD function that does not have an employee engagement tool in place, it actually looks wrong.
So in some respects, it's not super powerful. It's just the tip of the iceberg. If you really want to go deep, then you've got to start thinking about other concepts like, for example, wholesale employee experience. Now I know you might contest whether they're the same, different or whatever. But her point was that they are fundamentally different things: one is surface, and one is deep. And one of them is essentially just playing the game, because you have to be seen to play the game.
I think the question then is, who is setting the rules of the game? Who is letting the game continue? Why are people not trying to change the rules of the game? You get this with something like, and again I'm not being deliberately contentious, things like the MBTI, the Myers-Briggs Type Indicator, which has questionable psychometric properties, blah, blah, blah. It's quite controversial, but it's very popular. And when I talk to people about using it, they say, the reason I do it, yeah, I don't believe in it, but it allows me to have a conversation. To which I reply, well, why is having a conversation a good thing in and of itself? As though it's this amazing thing to have a conversation. It might be; it might not be; it might be the worst thing you can do.
They also say it's a way in, it opens things up, even though I don't believe in it. My question to any practitioner would be: why do something you don't believe in at all to get to something you sort of do believe in? They can do that, but I think as a practitioner it's a very odd space, being quite disingenuous and yet hoping that by being disingenuous, people are going to take you seriously. How does that work? So I'm going to pretend I believe in the survey, I'm going to pay a lot of money for it, I'm going to give all the managers their numbers every month, every year. But I think it's nonsense. Why are you doing that?
I'm not sure it's necessarily nonsense, but I think it was about the depth and the significance of what you could potentially do with it.
Yeah, I can live with that. Why spend important resources and time and money and energy on things of low significance? That's another way of putting it. Nonsense is a bit stronger, I agree; I apologise for saying nonsense. Low significance is nicer. Unimportant, then, or not very important. It is tricky, and I think every practitioner and profession has this sort of 'we need to do this to get to that, and this thing we're doing we don't really think is very important'. That's okay, but it feels like a bit of a trick to me somehow. And it feels to me as if, if you want to be a confident and respected profession, it's probably not great to do that stuff. It's probably not a good idea.
Because what will happen, and it probably has been happening with engagement, is the senior management teams you don't know anything about will talk to their friends and colleagues in similar positions in other organisations, who are telling them, no, we don't do engagement, it's nonsense. And they'll go, well, why have we been doing it for the last 15 years? What's wrong with us? So if you jump on that bandwagon, it has benefits, sure, but it also has costs, I think.
When we were reviewing this paper, Rob, and you've touched on it a few times in some of your explanations about approaches, one of the things I really enjoyed was the section where you break down: okay, here are the claims, and here's the evidence that's been presented to support the claims. And you kind of go, we're going back to our two fundamental questions. You stated at the beginning of that section that these are the two fundamental questions, the only things that actually matter to answer the exam question.
And you used it as a checklist to go through it. Now what I'm thinking about is practitioners saying, I've barely got time to do really bad evidence-based management, never mind good stuff. How would they go about building a body of evidence that they can march into a boardroom with and say, here's what I would like to do and here's the supporting information, which means we've made as good a decision as we can?
So the first thing: I would not say march into a boardroom with what you'd like to do. And I know you didn't mean it that way, but it implies the person has something they would like to do anyway, and is then trying to amass evidence that supports the thing they want to do. I had a long conversation with a group of HR practitioners a couple of weeks ago, discussing what they thought evidence-based practice was, and many of them said evidence-based practice is getting together information and data and evidence to support the case for what you want to do. I didn't say anything, but I was thinking, no, it's not. It's the opposite. Because what you might end up doing is not the thing you want to do, but something that will be important to the organisation and business and be more likely to work. So that's the first thing I'd say.
But in terms of convincing people with evidence and data, I think one of the hardest things is that often we, as people, are not very convinced by evidence and data; we're convinced by other things. So in terms of persuasion, I'm not sure how much difference evidence and data makes, which is one of the real challenges for anyone interested in making informed decisions and evidence-based practice. Because you can have all this information, and you can show it to people, but they go, thanks very much, I just want to get on with my thing. I want to do my thing. Thanks very much, bye bye, I'm going to do my thing.
So in the case you described, the board, obviously the first thing is to work out what you are trying to do. Are you trying to persuade? If it's just persuasion, there are all kinds of tactics and strategies you can use to do that. Or are you trying to show how you are working as a practitioner and professional, to, as Ian said before, let people into the way you think and the way you practice? If it's the latter, about practice, I would share explicitly the data and the information I've got from those four sources, in the most succinct way I can. I'd talk about the better quality stuff, say some of it wasn't particularly good quality. I would try to be, in inverted commas, 'as neutral as possible' in saying: this is what we think the problem or opportunity is, and this is why; and this is what we think is the most likely solution.
But actually, this other solution may be just as good, and this other solution is just as good. So there are options here. It's not about getting an answer; it's about saying there are different things you could do, with potentially different costs and benefits. And I would say, our team thinks option B is best, and I'd throw it back to people and say, well, you are stakeholders in this, too. What do you think? So I think it's almost about trying to show the way you work, showing that data and information. And for some people, that could be quite a surprise, and quite a shock.
But I think there are ways of doing it in a systematic way that just says, this is the process. We in our team, say HR or FM, this is what we did, we went through this process. Is it perfect? No. Are there gaps? Absolutely. But by going through this process, like me searching for a restaurant, am I more likely to get a better dinner? By using multiple sources, by focusing on the best quality evidence, by going through a structured approach, even if I don't do it that well, am I more likely to get a good dinner? The answer is yes, I am. So I think I'd present it all in quite a humble way: I haven't got the answer, but what I have is some evidence, data and information that says this is what we think is going on, and given that, these are the things we think are likely to help with it. But we're not sure. We think it's probably one of these things.
That's quite a brave position to take, isn't it? It feels to me there's an element of this which is: we're not programmed or conditioned to go in and say 'I don't know' fully, are we?
There's an element of this that is naive, for sure, that is idealistic, for sure. And it's not about doing it to the ultimate degree. So recently I wrote something about all the mistakes I've made in trying to promote evidence-based practice, I think it was 18, it's probably more. And part of one of the mistakes I think I've made, and I can't speak for others but I perceive many people in many professions making similar mistakes, is they imply evidence-based practice is about some sort of decision-making perfection.
Like you're going to get the answer, like we're doing a maths problem or a crossword, and we're going to get to the answer if we just gather everything. And it's not about that. It's about making a better-informed decision. So, going back to the restaurant example, I'm not trying to find the best restaurant in Liverpool; I'm trying to get a good dinner. In a sense, that's all we're trying to do. So I think there is an issue there, and the way you present it can sound naive and idealistic. But if you emphasise that it's about making a better-informed decision, not a perfect decision, I think it feels a bit less naive, and I think there might be a bit more sympathy towards the approach.
But as Ian mentioned at the start, if there's someone very senior who just does HiPPO decision making, the highest paid person's opinion, who just gets to decide what they're going to do anyway because they feel like it, then perhaps any kind of evidence from any source will not make any difference, or much difference.
There's a relationship there to opportunities for change. I would say change leadership rather than change management, because I see change management as a process, and there might be tools and techniques, but change leadership is that piece around inspiring people to think differently, to think bigger, to think about the opportunities, and then potentially to be more open to making more informed decisions.
But that feels to me actually quite Socratic, Rob: wisest is he who knows how much he does not know. Bolted to that is recognising that it's not just data that persuades; you know, Anthony Giddens wrote about this in the 70s. We see it every general election, when people can't agree on anything. So what you're doing is almost exposing it, lifting up the lid and going, look, this is what's going on. So why don't we be honest about what's going on and do our best with it, with the time and the resources available to us?
Exactly, I think it is. And another mistake I've made historically, to your point about organisational change, is not to emphasise that this is not supposed to be a one-off. It's not, like Chris's example, you go to the boardroom, present your thing, fingers crossed they like what you've done. Actually, it's something we should be doing as much as possible, if not for everything. And the idea of saying 'I don't know', back to your point about leadership, Ian: are you in an organisation where it's okay to say I don't know? Is that all right, or is it not all right? Or are you supposed to be uber confident all the time about everything?
Let me put it another way.
My limited personal experience, anecdata, is that often people at the top of organisations, the leaders, have not got there by being evidence-based practitioners, apart from possibly using evidence to get more power. They've got there, first and foremost, probably through being into power and politics, being good with people, being charismatic, and most importantly, getting stuff done. And getting stuff done seems to me to be massively overvalued.
Really, that we get stuff done, that we make things happen, is treated as more important than whether that stuff is any good. Is it valuable, is it helpful, is it destructive, is it a disaster? Who cares, I got it done, everybody. We can all think of an example of that in the UK. It's this culture that we promote, where we value getting things done. What we don't value is the 'I'm not sure, I don't know, there are different things going on, maybe there are alternatives, let's try this and see what happens, and maybe try something else again, and it's okay to be uncertain'. In my experience we generally don't value that kind of stuff very much at all. But the problem is, if we have over time valued that certainty, that kicking ass and getting things done, this evidence-based stuff doesn't rest very well with it. Which, interestingly, is exactly one of the main reasons why evidence-based practice got started in medicine around 30 years ago: a group of relatively young medical educators and physicians and other medical people were getting very frustrated about how much power consultants had.
And they would just decide what to do, whatever they said was best. If they liked a particular operation, they'd just do it again and again and again, because they could. They're good at doing stuff. Did it work? Did it harm? Who knows. So that's one reason why it got started there, and I think it would make sense, in terms of management, and again this is probably very naive, to take more of that approach if we want our organisations to be effective and successful. And Mintzberg talks a bit about this: being a manager or a leader is one of the most important jobs in the world. And people go, what do you mean, what about doctors, what about nurses, what about..? And what he meant was, organisations just cannot function effectively, or well, unless managers are reasonably good at their jobs. It's not just about being evidence-based, but that's partly my view about it.
Because if organisations don't work and don't thrive and aren't successful, the knock-on effect that has on individuals and society is huge. So in a way, you can have a great set of, say, medical practitioners in a hospital, but if that hospital is not managed well, they can't do their thing very well. So I think there's something semi-profound about that somewhere. I don't know what it is, but there's something about that. And it's quite different, as you're both saying, from what normally seems to happen, I think.
So just to round this all off, then: Engage for Success, with their leading title, how did they respond to your polemic when you offered it to them on a plate?
There was a very limited response. I did get harangued a bit at conferences, including one in Manchester, it might have been a CIPD conference, where somebody quite prominent in that movement at the time basically stood up and shook their head, saying, that's not true, that's not true, while I was going through the argument you've read in the paper. So there was a bit of anger, and there was some curiosity. And I'd get a lot of invitations to speak when there was an employee engagement event and they'd go, we need to find one person who sort of slightly disagrees, and the only person we can find is Rob.
There was that kind of response. And, again back to your point about politics, my sense is that at the time it served a political purpose, maybe for the government, possibly for HR, possibly for the CIPD, possibly for HR practitioners, to get into engagement, because, as you said, it's the thing you need to take to the table. So it served this kind of political purpose. So maybe people were fine about criticism: it doesn't really matter, because we're going to do it anyway.
Okay then, Rob. So if the employee engagement stuff was all published and triggered and contested before the pandemic, one of the things we've seen tons and tons of debate about online, in the media, in our organisational corridors and everywhere, is the future of work now that we've experienced hybrid over the last couple of years. Hybrid ways of working have been demonstrated as perfectly plausible for some organisations. So what's your view on that? And what's your view on the quality of information, and maybe what needs to happen to get us to a better decision-making place with all of this stuff?
I think one thing the pandemic has done, of course, and this is true of lots of areas, not just workplace, is that it's focused us on questions we were asking already. Take hybrid working, a great example. There was flexible working already. For hundreds of years, people have worked remotely; in the Roman Empire, people worked in teams that were not physically co-located. This is not new stuff, in a way.
But what it did is really focus attention on the important questions. Remember very early on in the lockdown, people got really sick of Zoom meetings, because they felt, as is often the case, that the way of proving they were working was to have endless meetings in their calendar. And they'd go, what is this meeting for? What is it about? Why do I have to be in it? I would say that's a great question. That's the question we should always be asking about, for example, any meeting.
So on the one hand, there's a bit of newness here; there's also a lot of oldness. A lot of this stuff is not new. So one of the things going forward is it's really important that people don't think all of this is totally novel and weird, and that there's no evidence we can draw on. I think in almost every case that's not really true. And also, in terms of going forward and thinking about what it all means, I don't think this means people have changed. It's not that the pandemic has rewired our brains or completely changed what we want to do as human beings; we're the same.
And I think some of the talk I see around this, the great reset, the great resignation, suggests everything has changed. Not really. I mean, it was really hard for some people, but in terms of overturning the world of work, not so much. And I think instead of focusing on the newness, we should focus on the importance of all those questions: what are meetings for, where is work best done? These are enduring questions which will not go away; they're always going to be there, and they're always important to ask. So I think it's about getting away from some of the hype towards: what do people really want to get from work? What do they want to give at work? What is the best way of organising it? And those are enduring, and often very tricky, questions to answer, but they're not new.
One, and maybe this is going to be unfair, one last reflection, because I think you touched on this a little when you were talking culturally. What's your hope for what I'm going to label the knowledge community, which might be leaders, practitioners, or organisations trying to sell you something who've got valid inputs to a debate as well? What's your hope for all of this moving forward? Because my sense is that we're probably slipping further and further from what you would recognise as useful knowledge filtering into people's everyday lives.
But what's your sense of where we're heading? Is it possible to turn this around and make it much more pragmatic, neutral, evidence-based practice? Or are we fighting a bit of a losing battle at the moment?
I think we are fighting a losing battle. And politically, for example in the US, the UK and elsewhere, I think that can set a tone in terms of the way decisions are made and the importance of evidence, or not. So I think there is a bit of an uphill battle. But I also hang on, for some reason, to this idea that, a bit like me choosing a restaurant, when people are making a decision that's important to them personally, quite often they're quite willing to put a bit of extra effort in, both to make a more informed decision and to learn about the thing they're trying to do.
And again, I think one of the mistakes made is that there's not enough emphasis on the fact that evidence-based practice is about learning. So if I'm trying to buy a new laptop or a new mobile phone, I can either just say I'm going to get this one because I always have, or I can say I'm going to find out. And as I look at reviews and different things, working out all the different functions a mobile phone can do, I learn about mobile phones through that process, and then I make a better decision. So my one hope is that this is not alien to people. I think it's alienating if you tell them it's about making a perfect decision. I think it's alienating if you tell them it's only something brainiacs can do. I think it's alienating if you say it's only about quantitative data from experiments. But if you say, no, it's just about trying to make a more informed decision, I think that's more approachable and more doable.
And I think, Ian, as you said, you don't have to spend a lot of time doing it; you can just try and do it a bit, and that's okay too, rather than deciding it's impossible. So personally, over the next few years, I want to spend much more time finding ways of making it accessible and doable. Even if it's not 100% evidence-based practice, making more informed decisions, and changing the way people think about the way they think and make decisions, is much more important to me.
So, Rob, if people have listened to this, and it's piqued their interest, and they want to do better evidence-based practice, where should they head to? What's the first thing they should look up? How do they find out more about your work?
Okay, so you can find me on LinkedIn, fairly easy to find; my name's fairly unusual. And on Twitter. There's also my, inevitably, very out-of-date website, robbriner.com, but it does have some resources and other stuff on there. Also the Center for Evidence-Based Management, cebma.org; you can find stuff there as well. They're probably the main sources.
-- Reflection Section --
Cool, so that was Rob Briner. As I said, no James today, so you're left with Ian and me. So over to you, Ian, what did you make of our chat? What are your big reflections off the back of that?
Well, I don't want to steal too much of the thunder from the Pinder Ponder coming soon, the double bill that we've got lined up. So, I have a couple of things to think about. The first thing came really early in the conversation, Chris, just that notion Rob mentioned about signal to noise. It really struck a chord with me, because one thing that I don't think anybody would contest is that from the pandemic onwards there's just been a huge amount of stuff about hybrid and the future of work, opinion pieces and evidence pieces, and we'll put "evidence" in inverted commas, shall we, after that conversation, and all of those things.
And the challenge, before you can even step back and validate what's being shown to you, is just that ability to stop, pause and get more critical about it, using evidence-based practice as a tool to essentially, I guess, sort the wheat from the chaff.
That whole noise thing, putting on my marketing cap, and I've done this before in past episodes, when we were talking to Mark Eltringham, actually, we were sort of talking about that. And as someone who's worked for, and will continue to work for, brands that want to get themselves noticed and recognised, there is absolutely a trend and a way of thinking that your brand needs to have something to say; it needs to have opinions.
And that's why content marketing is so massive, and organisations are spending lots of money, either commissioning or developing it themselves to varying degrees of competence: thought pieces, research and all the rest of it. And I guess, with Rob's approach, those things have to be looked at, because I would argue that just because something comes from a commercial organisation does not mean it's not valid, even though there are obvious biases behind it. Doing what he suggested, which is to really understand where that information's come from and why a certain organisation would say what they say, is good practice. But it's the absolute noise of it that is the challenge, because everybody is doing it.
There aren't many brands anymore that are comfortable just doing what they do, doing it well, and letting customers talk about it. They feel like they need to be active members of the community as well.
Well, and with all the platforms that enable that to happen, it gets me thinking, and I've said this before on the podcast, about critical thinking. I've got a couple of definitions of critical thinking which I think are really useful, and one of them links to this idea we were talking about with Rob, that knowledge is political, because agendas are always involved.
So, one definition of critical thinking is understanding that nothing is ever written without reason. Once that penny drops, it becomes a filter which helps you start to explore and validate why something might or might not be useful to you, or why you might, for example, find yourself readily agreeing with it because it supports something you believe in quite passionately. Spotting those sorts of creeping biases and blind spots is really important.
And I guess that brings us nicely on to one of the things that struck me in the conversation, which is when you were talking about employee engagement becoming table stakes, and Rob kind of challenged that, saying even if people believe engagement surveys are just table stakes, they feel they need to be done because people would be confused if you weren't doing them. And perhaps that is linked to the noise. You know, if you're not an HR practitioner, you might not get access to critical thinking about something like employee engagement, and therefore you take the surface level. Everyone seems to be talking about employee engagement, so if you're a business leader, your first question to your HR leader is: what's our engagement looking like? Maybe.
So, if that's the case, then think about the amount of money spent on these tools, and let's face it, they're not insignificant investments, particularly some of these online platforms that are more than just an employee survey gathering tool; there'll be lots of different things built into them. They're significant investments, and Rob talks about the wasted resource. But the one thing I thought about when he said that was that the one resource that perhaps gets forgotten is the attention of your employees.
And surveys typically take time. Some of these platforms ask you to fill out 120 questions; you've got to go through a big profiling exercise. That's hard work, that's a lot of time. And I guess employees would feel okay about that if they could see it was changing the organisation, both at a day-to-day level and at a fundamental level. But if it's table stakes, if it's just so that you've got that number so you can have that conversation, the chances are that nothing visible is going to come off the back of that particular survey contribution.
So, the next survey that comes out, which might be a bit more directive, a bit more about the everyday, might get a smaller response rate. So I guess for me, if it's something like employee engagement, and I know it feels like we're doubling down on employee engagement just because it's the example we talked about, but there'll be others, it kind of feels like a waste of employees' precious time, because let's face it, everyone's a bit time-pushed at the minute.
What I was hearing when you were explaining that, from your perspective, was the word scepticism. It's that thing about, if you're going to do something, you have to be doing it for a reason. And not just a reason from your perspective, but what's the perceived reason from the participants involved? Because they will understand things from their own mountaintop, because that's what humans do.
And if you want me to put effort into something, and I can only use myself as an example, then I want to understand that that effort is well invested. And if it turns out, over a series of months or years or whatever, to be a vacuous exercise, because it doesn't lead to any meaningful change or any meaningful improvement, then it calls into question my belief in the system, which then undermines what we're trying to achieve as an organisation. So everything's linked, isn't it?
What else kind of struck you from Rob’s chat?
Well, this struck me from Rob's chat, but it also picks up on a different chat. In preparation for this, I was listening to Rob's conversation with Bruce Daisley on Eat, Sleep, Work, Repeat, which is an episode from, from memory, early 2019.
And some of the subject matter was similar to what we discussed. They talked about evidence-based practice, and they talked about employee engagement as an example of perhaps a flawed metric, or a flawed narrative, which needs more investigation. So there is a crossover in terms of subject matter. But what they got into is this discussion about engagement versus experience, and whether or not they are one and the same.
Now, I didn't get the chance to ask Rob this in our conversation, so I can't give a definitive answer. But my thinking is that it felt like the way Rob and Bruce were talking, or more to the point the way Rob was explaining things to Bruce, engagement was a sort of macro, top-down way of viewing things: asking the same questions, which may or may not be good measures of engagement, depending on whether you agree with Rob or not, but asking the same top-down questions of everybody and then seeing a broad picture.
Whereas later in the conversation, when they were exploring experience, it felt like it was about those micro experiences and micro activities, felt individually by everybody in the organisation. And it almost felt like there was an opportunity for that to be bottom-up. So one camp might say they're one and the same; another camp might say, actually, they're coming at this from different perspectives. And that also gets me thinking about the CPO I've referenced before, who I've spoken to in the past about this topic, their comment being that engagement is almost table stakes. That led to a whole piece with Rob, didn't it, about whether it is or isn't, and whether it's a waste of resource; and about experience being a much more invested exercise, so that if you're going to go there, you need to be really serious about it, because it is a significant piece of work, but the benefits may ultimately have a much more meaningful impact on the people involved.
Yeah, I was thinking about this recently, looking at engagement, and Rob sort of saying, obviously, that on the conference stage there's a trend, to go to your point about your CPO, where it's either becoming unfashionable or it's just a standard now, so we don't need to talk about it on conference stages, because it's...
Sort of a given.
Yeah, yeah. The reality is, if you look at the market information about employee engagement tools, it's a billion-dollar global industry, so it's certainly not going away. And there are a lot of people who will want to maintain that global industry, which is set to triple over the next decade or so. So it's going in a northerly direction in chart terms.
However, I like what you're saying there. Engagement, and I'm sure there'll be someone listening who wants to put up a violent rebuttal to this hot take of mine, but engagement to me always feels like: how aligned are you with us? Do you understand our values? Do you understand what we're like here? And if you're not aligned, then we need to work out how to get you more aligned. That's the nature of the relationship when you think about it from there.
But experience, and again, to Rob's point, what do we mean by experience? If you take it from my marketing hat: we want to understand our customers, and we want to understand every touch point so we can make it as amazing as possible. That is true customer centricity, not "this is the experience you have to have". You know, we've all been frustrated by organisations when we ring them up to complain about the way something was handled, and their response is, "but that's our process".
Because that's more about you than it is about me and the ease of my doing business with you. So if we take that mindset to employee experience, what it gets to is: we understand the quality of your experience is paramount to us, and we need to understand it better so that we can make some decisions to change it. Not: there's a set of values, and if you can't remember them, then we're going to get a bad score.
Righto. So if I play that back to you: if engagement is "how aligned are you to us?", experience could be "how aligned are we to your needs?"
Lovely. There you go. You said in 30 seconds what I tried to say in three minutes.
Unusual. I know, I didn't have time to think about it that way.
So, the final thing for me, as we finish off this reflection. There was a moment where Rob challenged me back about a phrase of mine, this idea of marching into the boardroom and saying to people, here's what I'd like to do, and here's the evidence for it. As I said it, I didn't really think anything of it. But it was really interesting that he picked up on it and explored this idea that when we go about gathering information and evidence, whether external or internal, is there a part of us that is just looking for the evidence to support what we already think?
And how often do we go into an inquisitive project mode, where we want to understand and learn something, where we say to ourselves, this could go anywhere and we need to be prepared for that? We can't describe something that counters what we already thought as "disappointing results"; we have to accept that it is what it is. And it struck me that it's probably a skill and a mindset that people don't really spend enough time thinking about. Certainly me; I've definitely fallen into the trap Rob was describing, where you go and gather information like that. I just thought it was really interesting that he got quite animated about that idea, and I think it's fundamentally quite important to him.
I think rightly so. It's funny, because I clocked it as well, and I thought, this is interesting, let's see what happens. There's this thing, isn't there, about whether a turn of phrase demonstrates a behaviour or not. And what you did there, I think, demonstrated what not just people in the workplace industry are guilty of, but what a lot of professionals are guilty of, which is becoming wedded to their own way of doing things and then believing that's the only true way.
If you remember, James raised this point a couple of Pinder Ponders ago. He was talking about dissertation students on the undergraduate and postgraduate degrees that we've both supervised, who say: this project is to show that my way is the right way. I'm paraphrasing, but that's the point. Going through the process of research methods, and really thinking about the questions you should be asking, isn't an exercise in saying we're open to anything, because for a lot of organisations that's terrifying.
It's an exercise in saying either: I have some theories, and I need to test whether they fit and are appropriate or not; or: I have a hunch, and I need to see where this goes. But what I certainly don't have is a foregone conclusion.
What all that reminds me of, and I'm in danger of angering Rob further by talking about leadership books, is Malcolm Gladwell's Blink, and some of the stuff Rob talks about at the very start about personal experience. For anyone who hasn't read Blink, I'd certainly recommend it. Not very evidential, but I liked it, I liked it a lot.
What that book talks about is essentially gut reactions to problems, challenges and circumstances, decisions that feel like they're just finger-in-the-air, impulsive calls, when actually there's a whole series of micro experiences, logged in the deep, dark recesses of your brain, firing up and saying, we've been here before, we know what we're doing. I guess there'll be circumstances where people might feel, I know what I need to do here because, like Rob says, I've seen this exact scenario hundreds of times, and all the rest of it, but there might be times when you do need to go and back that up with a bit of evidence.
You kind of do need to try and demonstrate to people who haven't had that experience why that experience is valid and important, and therefore why your decision making is quite sound. But I guess it's a good habit to have that question in the back of your mind: have I come to this with an open mind? Have I been open-minded enough? Because my experience was in a different organisation, or a different sector, or whatever it might be. So, without wanting to sound too fence-sitting, it's something where you've got to balance all these things out, right?
-- Outro --
Well, that's that for now. So, like we said, James will be back for a proper Pinder Ponder on the next episode. Before we go, a quick reminder that you can still use the promo code for Sam Conniff's Uncertainty Experts course; the code GEEKS will get you 20% off, and that's valid until October this year. The year is 2022, if you're listening to this in the future. And finally, remember the three Rs: rate, review, recommend. It will do wonders for our listenership, and also bring a broader audience to the Workplace Geeks community. You can find us on LinkedIn, you can talk about the show using #workplacegeeks, or you can drop us an email on firstname.lastname@example.org.
And if you've got any research ideas, like Ellen from Australia at the start of the show, please remember the five Workplace Geeks commandments. One, it's got to be evidence based, not opinion. Rob, we thought about that before we talked to you, so we haven't stolen that one. Two, we want to talk to the research teams directly. Three, we want to learn together, so that we can, four, deliberately take different perspectives as we explore, five, the broad church that is workplace. So there you go, folks. That's the criteria we need to meet if Ian and I are going to speak to a fellow workplace geek. But that's all for today's episode. We'll speak to you soon.