Conference call: video on or off?

How does having your webcam on or off impact group decision making?
21 April 2021

Interview with 

Helen Keyes, ARU; Duncan Astle, Cambridge University


It's time for some Naked Neuroscience news! And if you're dialling in to a remote meeting, do you have your camera on, or off? Well, the paper that Anglia Ruskin University's perceptual psychologist Helen Keyes has looked at this month examines how group decision making stacks up with video on versus off, and she told Katie Haylor all about it...

Helen - For a lot of species, including humans, coming together as a group to solve problems is useful. And we call this collective intelligence. And previous research has shown that synchronising our non-verbal cues plays a key role in our ability to collaborate with each other and in our collective intelligence. So our ability to solve problems together.

Katie - What kind of cues are we talking about Helen?

Helen - So here, we're talking about synchrony in facial muscle activity. So essentially your facial expression and also what we call prosodic cues, which is essentially how you are using your voice - so your pitch, the rise and fall of your voice, the melodic part of your voice really. These are particularly important for cohesion within a group and they aid collaboration.

Katie - So right then, you went down at the end of your sentence, when you said "collaboration". Is that the kind of inflection that you're talking about? You're communicating information to me that you've finished your point.

Helen - That's exactly right. That prosody contains communication, but also the emphasis we place on words, how we stress words, the pitch and the speed at which we're talking, the quality of our voices - all of that comes together to form our prosody. So another key non-verbal cue that predicts our collective intelligence is equality in conversational turn-taking - how willing we are to take turns when we're speaking to each other, and to use cues to understand that one person has finished speaking and it's now the next person's turn.

Katie - I find this so difficult, Helen. I find it really hard not to interrupt people! Partly it's an internet lag, which isn't helpful. But if I'm not in front of somebody face to face, I think I do quite often interrupt people and it's difficult.

Helen - You're not at all alone in that. And we are finding in general - there are some side studies looking at turn-taking using video conferencing - that without those natural cues of being with somebody, we are actually a lot worse at knowing when to take turns in a conversation.

Katie - So what did they do to try and look at this?

Helen - Studies so far have largely focused on these behaviours in face-to-face environments. But this study wanted to look at how we use these cues and synchronise ourselves to each other over video calls, and also over calls that are audio only. They wanted to see whether the same cues would predict our collective intelligence.

They got 99 pairs of people and asked them to complete a series of collective intelligence tests - essentially some group problem-solving tests, where you would have to come together to make decisions or generate ideas together. Half of these pairs did these tests over a video call, where they could see each other and hear each other, and the other half of the pairs did it on an audio-only call. The researchers also measured how in sync the pairs were. They did this by looking at their facial expressions: they recorded the facial expressions and used software to detect movements and expressions, and matched them up with each other. They also recorded the participants' prosody, so they had software looking at people's pitch, loudness and voice quality, and even frame-to-frame differences in their speech, and they matched that up too.

And finally, they looked at speaking turn inequality. So how good people were at conversational turn-taking.
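(For the curious, here's a rough sketch of what measures like these can look like once the recordings have been turned into numbers. It is not the study's actual pipeline - the correlation-based synchrony score and the talk-time ratio below are simplified, illustrative choices - but it gives a flavour of how prosodic synchrony and turn-taking equality can be quantified.)

```python
# A rough illustration (not the study's actual pipeline): given two speakers'
# pitch contours sampled on a common time grid, score their prosodic synchrony
# as a simple correlation, and score turn-taking from their total speaking times.
import numpy as np
from scipy.stats import pearsonr

def prosodic_synchrony(pitch_a, pitch_b):
    """Correlate two speakers' pitch contours (NaNs mark unvoiced frames)."""
    a, b = np.asarray(pitch_a, float), np.asarray(pitch_b, float)
    ok = ~np.isnan(a) & ~np.isnan(b)           # keep frames where both are voiced
    r, _ = pearsonr(a[ok], b[ok])
    return r                                   # closer to 1 = more in sync

def turn_taking_inequality(speaking_seconds):
    """Share of total talk time taken by the most dominant speaker."""
    t = np.asarray(speaking_seconds, float)
    return t.max() / t.sum()                   # 0.5 = a perfectly equal pair

# Toy example for one pair
rng = np.random.default_rng(0)
shared = rng.normal(size=500)                          # a common melodic trend
pitch_a = 200 + 20 * shared + rng.normal(scale=5, size=500)
pitch_b = 180 + 18 * shared + rng.normal(scale=5, size=500)
print(prosodic_synchrony(pitch_a, pitch_b))            # high: the pair is "in sync"
print(turn_taking_inequality([310, 290]))              # ~0.52: fairly equal turns
```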

And the results showed that - well, the good news is there was no difference in collective intelligence scores depending on whether participants used video and audio, or just audio alone. However, video calling and audio calling involved participants using different methods to synchronise with each other. So for video calling, facial synchrony significantly predicted the collective intelligence scores, and this, not surprisingly, wasn't true for the audio-only calls.

However, prosodic synchrony - so syncing up how you're using your voice and how you're using language - significantly predicted collective intelligence overall, across both types of media. And this was higher in the audio-only condition. That was a bit surprising: when you didn't have any video, you did better at linking up and having that prosodic synchrony with each other. And that's quite interesting, because when you were just using audio only, you were actually better at the conversational turn-taking. And that led to this better synchrony in your prosody, this better communication between people, which in turn led to better collective intelligence scores.

So interestingly, when you have your video switched off, when it's audio only, you appear to be more tuned in to those voice cues, to the rules about etiquette, about turn-taking in a conversation, more tuned in to that. And you have essentially more pro-social behaviour in that way compared to when you're using video calling as well.

Katie - I wonder if it's partly because we're so used to talking on the phone for so many years, a lot of people feel comfortable doing that. I find video calling a bit of a distraction.

Helen - That's exactly what it is. It's the distraction of the video call. So whether you're looking at the other person or looking at your own face, like I'm doing right now, it's very distracting. And it's interesting that it does lead to less good turn-taking. So even though, overall, it wasn't that audio was better for collective intelligence compared to video, we can see that there are those differences. It is better in some ways - it's better in that prosodic synchrony. But then of course you lose the facial expression synchrony in the audio-only condition. And of course, something this study didn't look at was the other feature that we tend to use when we're video calling and video conferencing, which is the chat feature on the side. And I think that's a whole other study that could really be used to think about - is this helping us to synchronise with each other, because we're really getting our thoughts out there in real time? Or is this a further distraction from this pro-social synchronising that might help us to collectively solve a problem?

Katie - I'm being a bit flippant here, you know, giggling about not putting my video on. But so many people use these technologies professionally now - do you think this study supports the idea that it's actually quite justifiable to have a professional conversation and not have your video on?

Helen - Absolutely, it's quite justifiable. What I would say is that a mix of video on and video off probably isn't going to be of any benefit, because if some people have their video on, there's still that visual distraction. So it would possibly work better, in terms of tuning into each other's prosody and turn-taking, if we all had our cameras off. But again, you'd be missing out on that great emotional expression synchrony that we might expect with videos on.

 

The paper that Cambridge University cognitive neuroscientist Duncan Astle looked at this month is all about how the brain tracks volatility in the world around it - specifically, when applied to the stock market. And he told Katie Haylor about it.

Duncan - So they created a stock market game. Subjects were recruited and were paid a basic payment for being in the study, and then they were given the opportunity to earn more by making good investments. The subjects would perform what they called the asset pricing task in an fMRI scanner. The task displays trend lines which show you the recent history for individual stocks. You are shown these sequentially, and then you can make a decision with each one whether you want to invest, stick with your current investment, or sell your current investment. And then periodically those are updated as if it's the next day. And so over time, what subjects are doing is making some sort of evaluation as to which stocks they want to invest in and which stocks they don't want to invest in.

Katie - But these aren't professional stockbrokers, right? These are just participants in the study.

Duncan - Exactly, they are not professional stockbrokers. They could have invested in the stock market themselves if they wanted to, but only as Joe Public.

And so each time they play, they will see 14 different stocks - real stocks from the S&P 500, but taken from recent history, so a 30-day window from a few months before the experiment was conducted. So they're real stocks, or historical stocks. But the people themselves are not stockbrokers.
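(As an aside, a toy version of that task structure might look something like the sketch below. The random-walk price simulator and the placeholder decision function are invented for illustration - the real task used historical S&P 500 prices and human choices - but the loop of "see a 30-day window, decide, then reveal the next day" is the core idea.)

```python
# A toy sketch of the task described above (not the authors' code): show a
# rolling 30-day price window for each stock, take a decision, then move on
# one day. Prices are simulated here, not real S&P 500 data.
import numpy as np

rng = np.random.default_rng(1)

def simulate_prices(n_days=40, start=100.0):
    """Random-walk stand-in for one stock's historical price series."""
    daily_returns = rng.normal(0, 0.02, n_days)
    return start * np.cumprod(1 + daily_returns)

def play_stock(prices, window=30, decide=None):
    """Step through a stock day by day, collecting invest/hold/sell choices."""
    decide = decide or (lambda history: rng.choice(["invest", "hold", "sell"]))
    choices = []
    for day in range(window, len(prices)):
        history = prices[day - window:day]     # the trend line the subject sees
        choices.append(decide(history))        # choice made before the next price
    return choices

# One subject playing 14 simulated stocks with a placeholder (random) strategy
all_choices = [play_stock(simulate_prices()) for _ in range(14)]
```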

Katie - So they've put these people in a scanner so that the scientists can look at their brains whilst they're making these decisions, right? What exactly were they looking for?

Duncan - So they were interested in whether and which parts of your brain track what's gonna happen next. So does the activity in different bits of your brain make accurate predictions about what's going to happen in the future to the stock prices?

Katie - Ohh okay, so something like if it's likely to go down you see a little bit firing compared to if it goes up or something like that?

Duncan - Exactly, and you can imagine that actually when you're reading the stock market, when you see the lines appear on the screen, there's actually multiple different signals that you could extract from that visual input that will enable you to make a prediction about what's going to happen next.

Katie - Is that what they tracked then? Were they searching for different kinds of indications of which way you are going to go - are you going to invest, or are you going to pull out?

Duncan - So they could track activity in different parts of the brain that are known to be associated with reward processing, like the medial prefrontal cortex, the anterior insula, the nucleus accumbens - these are all areas that we know are involved in processing rewards. And in particular, they could test whether the activity in these areas tracks dynamic changes that are happening in the stocks. For example, do some areas code for the relative volatility in the stock, versus other areas coding for rises in stocks? So you can imagine all of these bits of information - the volatility, the rising of a stock, the decreasing of a stock - could all provide you with information that might guide your behaviour in the future. And they were interested in whether different parts of the brain track these different types of signal.

Katie - What did they find then, once they looked at these people making these decisions?

Duncan - Looking at people's behaviour, people's choices didn't predict the next day's stock prices, which is reassuring. But the stock prices themselves did: just like in a real stock market, when things rose one day they were more likely to decrease the next, and vice versa. So the idea is that this kind of predictability in the stock market provides the participants with the information that guides their choices.

Katie - It's just really saying that the stock market is volatile!

Duncan - Exactly. And the trends that you see in one day do make some predictions about what might happen the next day.
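(That "rose one day, more likely to dip the next" pattern is just negative lag-1 autocorrelation in the daily returns. A back-of-the-envelope check, on simulated rather than real returns, might look like this:)

```python
# Mean reversion shows up as negative lag-1 autocorrelation of daily returns.
# Simulated returns are used here purely for illustration.
import numpy as np

def lag1_autocorrelation(returns):
    r = np.asarray(returns, float)
    return np.corrcoef(r[:-1], r[1:])[0, 1]

rng = np.random.default_rng(2)
noise = rng.normal(size=1001)
mean_reverting = noise[1:] - 0.5 * noise[:-1]  # ups tend to be followed by downs
print(lag1_autocorrelation(mean_reverting))    # noticeably negative (~ -0.4)
```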

Katie - So you can get kind of good at playing the stock market?

Duncan - Yeah, exactly right. So some people will get very well tuned to knowing “this stock’s been climbing for five days straight now, I think we're pretty close to what's called an inflexion point” - where it’ll reach its peak and people will start selling off the stocks and it will start to go back down again.

Katie - And could they see that manifest itself in the brain activity then?

Duncan - Yes, so in terms of the brain activity, activity in the nucleus accumbens predicted the next day's stock prices. The nucleus accumbens is part of what's called the basal forebrain, just next to the hypothalamus - a part really quite low down in the brain. And that would predict the next day's stock prices. Whereas the anterior insula appeared to be particularly good at tracking drops in value following a rise, so-called inflexions. So the nucleus accumbens appears to make a quite general prediction about the future, whereas the anterior insula seems to be really good at tracking the moment at which a stock has reached its peak and is about to turn into a decrease.
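(Conceptually, the analysis boils down to asking whether trial-by-trial activity in a region of interest carries information about the next day's price change. A minimal, hypothetical sketch of that kind of regression - simulated data and made-up variable names, not the authors' actual model - could look like this:)

```python
# Does activity in a region of interest (e.g. the nucleus accumbens) predict
# the next day's return for the stock being viewed? Hypothetical, simulated data.
import numpy as np

rng = np.random.default_rng(3)
n_trials = 200
next_day_return = rng.normal(0, 0.02, n_trials)
# Pretend the ROI signal weakly encodes the upcoming return, plus noise.
roi_activity = 0.5 * next_day_return + rng.normal(0, 0.02, n_trials)

# Simple linear regression: next-day return ~ intercept + ROI activity
X = np.column_stack([np.ones(n_trials), roi_activity])
beta, *_ = np.linalg.lstsq(X, next_day_return, rcond=None)
print("slope:", beta[1])                       # a non-zero slope = predictive signal
```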

Katie - Do you think it would be sensible to apply those situations beyond just the stock market? Because I can imagine in everyday life you're trying to predict generally the pattern of something. But then if something really important is going to happen, maybe that would be a slightly different question.

Duncan - Really, we need to step back from the whole stock market scenario, because that's really just a game they've created to provide a kind of dynamic, competitive environment where people are trying to make predictions about what happens next.

In reality, you and I are making predictions about the future all the time - do I cross the road at this point? Do I buy a house for this amount? And interestingly, the areas that are implicated in this study have been associated with those who make particularly risky choices, or who are particularly risk averse.
For example, the nucleus accumbens has previously been associated with positive arousal and risk-seeking choices, and the anterior insula has been associated with general or negative arousal and risk-averse choices. So these areas of the brain - they're not the “stock market bits of your brain”, they're the parts of your brain that track important statistics about the environment around you, which we think you then use to make good decisions.

Katie - Oh, I see. Did the particular scientists in this study though, did they look and see out of the participants who was a bit more cautious, and who was a bit more carefree? Because I guess you could probably see that in the variations of the brain activity, right?

Duncan - Yeah, you should be able to, but my sense is that what you would need is quite a large sample size. They only have 30 to 40 participants per experiment, and so in order to pull out these individual differences in risk-taking, you might need larger numbers. Another really interesting thing to explore would be to compare people of different ages. There's some evidence in the literature that adolescents, for example, are more prone to making risky choices, especially in collective contexts - so when there are other people around. And so it might be interesting to explore whether there are age-related changes in the relative weighting that people place on these different sources of information about what's going to happen next.

Katie - Having said that this needs to be taken beyond the context of just the stock market, do you think if you work on the stock market, could tracking what the brain is doing in this way give you any advantage? Or are you just delving deeper into what the brain is actually already doing?

Duncan - I think it's that you're delving deeper into what the brain is already doing. So we know that if you practise tasks every day, then you can really tune performance. The classic example is London cab drivers: systems of spatial navigation are enhanced in London cab drivers. Now of course, you and I, even though we're not cab drivers, still have good spatial navigation skills, and we're using the same systems.

Katie - Erm….

Duncan - But they are so well tuned in those individuals. And I can imagine that if you were to recruit some stockbrokers, then you would soon find that they are somehow able to tune into these signals - whether consciously aware of it or not - these signals about what's about to happen next.
