Analyzing behaviors in aviation security, with Philip Baum

Aviation security professional Philip Baum (www.avsec.com) talks about analyzing behavior for aviation security and risk assessment purposes, and for security purposes in general. Transcript below.

Topics discussed include: looking for deviations from the baseline behaviors that are normal in an environment; successes of behavioral analysis for security purposes; what can make some of this work controversial; and thoughts on what aviation security gets wrong.

TRANSCRIPT

Zach Elwood:

This is the People Who Read People podcast, with me, Zachary Elwood. This is a podcast aimed at better understanding other people, and better understanding ourselves. You can learn more about it at www.behavior-podcast.com. And if you like this episode, you’ll probably find a good number of episodes you like in my back catalog. I have episodes on security and policing, I have episodes on mental health, I have episodes on reading behaviors in sports and games, and more.

On this episode I talk to Philip Baum, an aviation security consultant and trainer. Our talk is focused on behavioral analysis: the studying of human behavior to detect threats. We focus on the aviation industry, but much of what Philip says is applicable to security and threat detection work in general.

Philip has a long resume and you can learn more about him on his website avsec.com: that’s avsec.com, avsec is short for aviation security. I’ll read just a little bit from his website about his history:

He’s a security professional with more than 35 years’ experience, primarily gained in the international civil aviation environment. He started working in the aviation industry in the 1980s, when he joined Trans World Airlines’ security subsidiary at London Heathrow. From Duty Manager at Heathrow, he moved to TWA’s International HQ where he ultimately became Manager Security Training and Auditing. He left, in 1996, to establish his own company, Green Light, through which he serves as a subject matter expert for the Airports Council International (ACI) in the area of Behavioural Analysis, and runs training courses for them. He also designs and delivers the International Air Transport Association’s (IATA) Inflight Security courses.

He devised and developed a security system called Tactical Risk Assessment of People, which is based on non-racial profiling, observational and questioning techniques. He also established, and chairs, the Behavioural Analysis series of conferences, as well as the DISPAX World trade shows on hijacker and unruly airline passenger management.

He served 24 years as the editor-in-chief of Aviation Security International from 1997 until 2021. The general media use Philip’s services when in need of expert comment; he is a regular guest on CNN, Sky News and the BBC.

Philip’s first book was released in 2016: it was called Violence in the Skies: A History of Aircraft Hijacking and Bombing.

Okay, here’s the interview with Philip Baum…

Zach: Hi, Philip, thanks for coming on the show.

Philip Baum: It’s a pleasure to be here. Thank you for inviting me.

Zach: So, maybe we could start with you giving a quick summary of your career and what has interested you in this space.

Philip: I guess I started my professional career within the aviation industry, and I’d always had a dream about aviation. I grew up on the flight path into London Heathrow and would watch all of the aircraft flying in on their final approach from my bedroom. But I never wanted to be a pilot; I was always fascinated by both the people that worked within the industry and the people that were travelling. And so perhaps it was no surprise that I eventually found myself working at Heathrow Airport, and in a security capacity. And the programme I found myself working on was a passenger profiling programme. Now some people might think the word ‘profiling’ is a swear word, but it’s not, because I’m talking about non-racial profiling. I’m talking about identifying people that might be a risk. And that was something I was very actively involved with throughout the 1990s. In fact, in 1996 when TWA went into Chapter 11, I actually left TWA and set up on my own. I set up my own consultancy company called Green Light. For the last 27 years, that’s what we’ve been marketing: behavioral analysis, identifying hostile intent, and hopefully developing systems where we can identify the person rather than the item. So we’re less concerned about the gun, the grenade, or the bomb, and more concerned about the person and what their intent is.

Zach: What are the specific areas of the airport or flight process that you focus on?

Philip: The two areas of aviation security that I’m particularly interested in are in-flight security, so that’s unruly passenger management and hijacking management, and behavioral analysis and what happens at the airport. I guess, ultimately, it’s about the people and about identifying people that might be a threat, and identifying the best way to actually manage those threats in the worst-case scenario. So most of my work has been airport based, very much looking at how you can incorporate behavioral detection, or what I’m still happy to call passenger profiling, into the security operation, because I firmly believe it should be our first line of defence. And I think if we look around the world, we tend to find that everybody wants to use technology. They all want to actually have a system where the computer says, “This person is a threat.” Whereas I think we should be using the best technology of them all, and that is the human brain. Ultimately, it’s what we tell the general public to do. We tell people: see something, say something. And yet somehow, for some reason, we feel that when it comes to airport security, we would rather an archway metal detector or an x-ray machine do the job for us. And all of those technologies are great and they have their place, but they need to be used intelligently. And if you look at the history of attacks against aviation– and I have to say I did write the book on it, Violence in the Skies: A History of Aircraft Hijacking and Bombing– through my research for the book, I found that actually before almost every incident, there was somebody saying that they thought that something was wrong. But they often didn’t act on it. And I’m going back and including events like 9/11. You know, there were 11 out of the 18 hijackers who were identified, but people didn’t act appropriately having been concerned about people’s behavior.

Zach: Do you think part of that is people being afraid to be wrong and to be perceived as culturally or racially insensitive, things like that? Could that be a factor?

Philip: I think there’s no question that people are frightened of reporting. And yet, I find it really bizarre that when it comes to airports and airport security, for some reason the general public expect everybody to be treated the same. Now, of course, we’d all love everybody to be treated the same and everything to be fair, but unfortunately, security isn’t fair and security can’t be fair. And what really troubles me is the fact that we want everybody to be treated the same way before they get on an aircraft, and yet we accept the fact that when you get off an aircraft at the end of an international flight, you go through immigration controls. Immigration don’t treat everybody the same, and yet every day they find people doing something wrong after they’ve got off an aircraft. At Customs inspections, there’s the green channel or the red channel. And in the green channel, customs officers pick on certain people because of their appearance and their behavior. And every day they find somebody doing something wrong after they’ve got off a flight. So it always begs the question, why aren’t we doing that before people get on a flight? If we allow behavioral analysis to be used in airports on arrival, why don’t we use it within the pre-flight security screening process?

Zach: When it comes to the use of behavior, are there certain things that stand out to you as top of mind for, you know… And I don’t know how much of this you can talk about because I don’t know how much is kind of industry secret knowledge that people don’t talk about, but are there things you can talk about when it comes to behavioral analysis and prediction?

Philip: I think the key to it is actually understanding the baseline. And this isn’t only about aviation security, this is in any environment. If you’re at a sports stadium, you have expectations of behavior from different fans, from the players, from the people that work within the sports stadium, from the people that live in the local environment, from the taxi drivers, from all of the other security services that are operating at a given venue… You could do the same in a retail environment, you can do the same on a beach, in a casino, at a health club. Wherever it is, most people that are working there understand the baseline for the environment that they’re working in. And what we’re actually asking people to do is to identify when somebody doesn’t match the baseline. And if somebody doesn’t match the baseline, we’re not accusing them of anything. We’re just saying that that person warrants further inspection. So if it’s at a train station and somebody is actually giving you cause for concern because of their behavior, you’re not going up to arrest them. But you might actually be going up and starting a conversation with them. And that can be done in a very customer service-focused manner. It doesn’t have to be accusatory in any way. And actually, you might, through that conversation, elicit information that will help you resolve your causes for concern. Ultimately, the people in public places and in crowded places that we’re likely to end up focusing on more than any other will be people that are on their own. Because there is less leakage of emotion and behavior by somebody that is on their own than by people that are with their family members, with their friends and with their colleagues, where you see the normal banter, the normal interaction and normal facial expressions. So somebody on their own, yes, they might be more likely to be– I’m going to use the phrase ‘picked on’– but they might generate greater concern in the eyes of a trained security officer purely because they’re not actually displaying emotions that are present in normal day-to-day communication.

Zach: Are there other major things that stand out when it comes to behaviors? I’m wondering if there’s maybe some things about, you know, say on the security line if people are acting a certain way? Or can that be really hard to say because, you know, anxiety, people can be anxious for many different reasons and such.

Philip: There is no question that people can be anxious. And listen, I come from the aviation industry, and it is estimated that 40% of people arriving at an airport have some degree of concern when they get there. Whether it’s fear of flying or fear of the process or fear of losing their luggage, or just concerns about time and the queues, the lines that they might have to wait in. There’s a lot of stress at airports. So we’re not looking for the normal day-to-day stress; a trained security operative will be able to identify that and distinguish it from somebody that might actually have hostile or negative intent. What we’re trying to do is, as I’ve said, identify deviations from normal baseline behaviors and to identify a whole range of different threats. And the fact that we’re looking for a whole range of different threats actually helps address the concerns that people have that we might end up racially profiling people. Because, for example, we’re not only looking for the terrorist threat, we’re not only looking for the criminal that may be the shoplifter, we are also looking for the person with poor mental health that might be a threat to others or indeed to themselves, we’re looking for the insider threat, we’re looking for victims of human trafficking or the traffickers themselves. And obviously, the list of suspicious indicators that you might focus on will vary from location to location and from industry to industry.

And for me, the classic example of behavioral analysis, both working and not working: I can take you back to the Ariana Grande concert, which some of your listeners may be familiar with, that took place at the Manchester Arena five years ago, where Salman Abedi actually arrived at the arena well after the concert had actually begun. And he was carrying a backpack, which is not how people normally come to a pop concert. He arrived late, he was on his own, he sat in a secluded part of the arena kind of out of sight. He was observed for almost an hour and a half by various people, including one security guard who did absolutely nothing. He was seen first of all by members of the general public, and such was their concern that even members of the general public went up and spoke to Salman Abedi and even said to him, “You know what? It looks a bit strange, somebody like you arriving here with a backpack, sitting in this location. What are you doing here?” And Salman Abedi said, “Well, I’m waiting for somebody.” And the person that saw him wasn’t happy and went to speak to a security guard. And the security guard said, “Yeah, I’ve already seen him. I’ve already clocked him. I’m already looking at him.” But he didn’t do anything; he waited for somebody else to come. And then the supervisor or a more senior person came along. And the two security guards chatted with each other about what they would do if Salman Abedi were to do something dangerous. And they actually said, “Well, maybe we’ll jump on him.” But they were both clearly concerned. And that security guard, in the inquiry following the event, when he was asked, “Why didn’t you do something?” he said, “I thought I would be accused of racial profiling. What if I got it wrong?” And as a result, many people lost their lives and hundreds of people were injured, because Salman Abedi eventually blew himself up despite having been observed by security staff for nearly an hour and a half. And so it shows that people had concerns, and therefore behavioral analysis actually does work. But behavioral analysis doesn’t really work unless you’ve also got the mechanism and the operating protocols to make sure that people do act on their concerns, and that we don’t victimize security guards for getting it wrong. And providing they’re not just picking on somebody because of the colour of their skin or their sexuality or some discriminatory factor, and providing they’re doing it because they can actually put into words their concerns: that here, using the example of Salman Abedi, was a young male with a backpack who arrived after the concert had begun, sitting out of sight of most of the people and behaving unusually, not maintaining eye contact with anybody, and whose demeanor was not that of somebody waiting for a relative to come out of the concert at the end.

Zach: To your point about 9/11, I remember there were examples of people noticing unusual behaviour, and I think that even included people noticing the behaviours of the terrorists long before the attacks, when they were doing their test flights and such. Is that… Am I getting that right?

Philip: Yeah. No, there were people that reported. And even if you go to Richard Reid, the Shoe Bomber, who tried to carry out his attack a few months after the 9/11 attacks, you know, why did Richard Reid not attack El Al, the Israeli airline, as planned? Because he did a test flight and he was identified as a possible threat to the flight. And he went away and he said, “I know I’m not going to get through the security system. They are going to pick on me.” So he ended up targeting American Airlines. And indeed, what did Richard Reid do? He turned up on the 21st of December 2001, the anniversary of the Lockerbie disaster, and tried to get through the system. He was identified as a possible threat. He was delayed so much at Paris Charles de Gaulle Airport that he actually ended up missing the flight, was sent to a hotel at American Airlines’ expense, and the following day he came back and basically the people in charge said, “Hey, we gave this guy a hard time yesterday, let’s let him on the flight now.” So there are numerous examples of behavioral analysis working. Now I think there will be some people that will say, “Yeah, 11 out of 18 hijackers on September the 11th were identified. Why didn’t they do anything?” Well, there are various reasons why people don’t do something. Often, it is the time pressure that is put on the aviation industry in particular, with the desire for on-time performance. Because of course, beyond those 11 hijackers that were identified, there were probably another couple of hundred passengers that were also identified as being a possible threat that day who were not a threat. And it does take time to actually implement a behavioral analysis system, but personally, I think it is far more effective than the routine X-ray screening of all bags and asking everybody to walk through an archway metal detector. Which is, as I say, great theatre; it looks good, and many of these technologies are very useful tools to have. But we need to find a way to make sure that if people have got concerns, they act on those concerns.

Zach: The Ariana Grande bombing made me think of a very minor incident, but it was one that was interesting to me. It was a thing from a few months ago, a basketball game in America where an animal rights activist was trying to run onto the basketball court and was immediately caught by a security guard. It was interesting because I think there were things about that person’s behaviour sort of similar to what you were saying, where I think they arrived at their seat late and they didn’t seem interested in the game; they were just kind of looking tense and looking at the court. And so the person suddenly rushed from their seat to the court, and it seemed like the security guard had been eyeing them as unusual and immediately caught them basically right after they got out of their seat. I think it was another of those cases where the security guard was trained to know this is pretty unusual behavior from this person, they seem like they’re planning to do something.

Philip: It’s interesting because I’ve been involved in security operations at a number of major sporting events and indeed, what we are often asked to do is when we see somebody, we’re told to keep an eye on them. And I always have a problem with that because that might be fine from a police law enforcement surveillance perspective where you keep an eye on somebody, but if you’re brought in as the security professional to prevent potentially a suicide bomber actually reaching their target, you don’t keep an eye on somebody, you act. And as soon as you’ve identified that person, you take steps to try to neutralize the threat. Now, that doesn’t mean neutralize the individual, but it does mean neutralize the threat. And that means you need to interact with that person. You don’t wait for them to get to the security checkpoint. Again, I’m thinking about a sports event where you may be having patrols outside the stadium. You don’t wait for the person to get to the ticket counter to challenge them. Because if they were a suicide bomber and they did have suicidal intent, then the moment you challenge them at the security barrier and they detonate their device at that point, you have increased the number of casualties tenfold. You actually want to intercept that individual in advance of the security checkpoint, or indeed potentially, after the security checkpoint. But not at the most crowded densely packed area of your security operation.

Zach: And I guess the seemingly ambiguous nature of some of this work is part of what makes it hard, because even if you have your list or your training that’s done well, from an outside perspective the questioning of one person can seem kind of subjective and random to the outside eye, even if it’s done in a great way. So I think that… Would you agree that the subjective-seeming nature of it can add to the difficulty, and to the reasons that people can often find it controversial even if it’s done well?

Philip: Yeah, I have to agree. Listen, it is subjective by its very nature. And as I said, we have this great desire to treat everybody equally and everybody the same. But let’s face it, that’s not what law enforcement does. The police, when they’re going out on patrol, patrol some areas more than others, where there is a greater chance of there being a criminal act perpetrated. The types of technologies we implement at different locations are proportionate to the type of threat that exists at that location. And if you are trying to secure a premises, you need to act on concerns. You can’t just act because a system alarms or just because somebody sees something on an X-ray monitor. That is poor security. The terrorist plots that are identified around the world by law enforcement communities are not identified because we’re keeping every single member of society under surveillance. They’re identified because the security services are focusing their attention on certain areas more than others. And that is an unfortunate necessity of everyday life. That’s what we are paying our security services to do: to keep us secure. And yes, it will mean that some people are possibly picked on when they are completely innocent. And that’s where the training kicks in. Because that’s where you don’t pick on somebody and victimize them; you actually identify somebody that you’ve got cause for concern about. And using a customer service approach, you actually try and resolve your concerns. You have the conversation, and you actually try and elicit the information in a customer service manner.

We have to remember that there is no such thing as 100% security. We all know that. I find it amazing that after each incident, particularly an aviation-type incident, the media often sort of say, “How is it possible that somebody could get through airport security?” And I’m sitting there thinking, “Of course it’s possible that somebody can get through security!” Look at the prisons of the world. Nearly every prison around the world has got a problem with drugs or with small bladed items or cell phones and mobile phones managing to get onto the inside. How do they get inside? Well, they get through using insiders, they get through by using innovative concealment technologies or concealment techniques, possibly internal carries. Because if somebody has got the will to get through a system, they can find the way to do it. And a prison hasn’t got to worry about wait times or customer service; they know that they’re already dealing with people that are supposedly the bad guys in society, and their friends and family, and yet things get into prisons. So if things can get into prisons, things can get into airports, things can get into sports stadiums, and therefore we cannot just build a system that is based on screening technologies. We need to supplement that with a human approach to security that, yes, is subjective, but is based on common sense. It’s based on the very thing that we ask everybody else in society to do: to report concerns. To see something, to say something. If you see a bag on a bus or on a train, you’re told to report it. Well, the same if you see an individual that is giving you cause for concern. You need to report it. Then it’s down to the training as to how you respond to it.

Zach: Yeah, that gets into something I’ve talked about in a couple previous episodes about behavioral analysis and security and interrogation situations; the line between what’s a controversial use and what’s a good use of behavior is how much certainty you attach to it when acting on it. Like you say, if you’re using it as a reason to just look into something more, that’s not really going to go wrong unless you do something really bad. But the problems come in when practitioners are too certain, you know, and the stories about cops thinking, “Oh, this person’s guilty because they did XYZ behavior.” That’s really where the problems come in. As opposed to just using that as one of many data points and interrogating someone a little bit differently or something.

Philip: Well, listen, I’m really troubled by a lot of the stories that I see in the press, both from the United States and Canada, and indeed from my own country, the UK, where we see excess force used by law enforcement. Those are, of course, in the main isolated incidents and they are extremely regrettable. What I do know about law enforcement, and certainly I can speak about it from a British perspective, is that most police officers are actually almost not acting when they should, because they are terrified of being accused of profiling somebody based on their race, religion, colour of skin, or sexuality. Which is why when we are teaching them, we’re saying when you’re describing a reason for arrest, or even for having a conversation with somebody, you need to be clear in your mind what it is that is giving you cause for concern. It’s not about their ethnicity, it’s not about their gender, but it is about a behavior. So if you are saying that somebody was behaving suspiciously, write down what ‘suspicious’ means in that context. If you say somebody was standing on the corner of the street looking left and right, looking as if they were trying to identify somebody and perhaps carrying out surveillance for a future attack, then you have actually put into words what you were concerned about. And even then, that doesn’t justify immediate use of force; it justifies an intervention of going to have a conversation with that person to find out: is that person carrying out hostile reconnaissance for a future attack, or are they simply waiting for their girlfriend or boyfriend or partner to turn up because they’re late? So for every suspicious behaviour, there is a potentially good explanation for it. And we need to make sure that those people that are implementing behavior-detection programmes and reacting to it are trained to have those conversations using a friendly customer service-based approach. Obviously, in certain circumstances, that’s not always going to be possible. There will be people who will immediately respond to law enforcement– even a polite question– in an aggressive way. And that’s where these things can often start to escalate and we can end up in a place that we don’t want to end up in. But at least you need to justify your initial intervention on the basis of behaviors that you’ve witnessed that have given you cause for concern.

Zach: Any specific behaviours you’d like to talk about? Or do you think it’s mainly about the baseline, as you said, and just noticing major deviations from baseline?

Philip: It is about the baseline. I mean, you can come up with– and when I’m running training courses on this, we’ll often do an exercise in trying to develop a suspicious signs list based on your operational environment. And you can come up with 100 or 150 suspicious indicators if you wish. The trouble is that you might well witness something that isn’t even on that list of 100 or 150 suspicious indicators. And actually, I think-

Zach: And you can’t keep track of all those in any way.

Philip: Yeah. I think actually it is important to empower the security operative to be able to use their common sense and to be able to utilise their own words to describe what it is that they are concerned about, rather than simply having a checklist of concerns. Obviously, there are the things that people always talk about. Somebody that is perspiring profusely or shifting their weight from foot to foot. Well, those may be causes for concern. But if you’re at a sports stadium and you’re supporting your team, you might well be shifting your weight from foot to foot, you might well be perspiring. If you’re going on a Tinder date, you might be waiting for somebody in a restaurant and be very anxious. There are lots of reasons why somebody might not be behaving exactly in accordance with the baseline set. So it’s all about sensitivity and how you react to it. But you would actually have to customize a suspicious signs list for the environment that you’re going to work in. So yeah, I’ve done a programme quite recently for a beach security unit. Obviously, if you are seeing somebody with a heavy overcoat– and I’m pushing it to the extremes here, to the absolutely obvious– somebody sitting on a beach with a backpack and a heavy overcoat over them, well, obviously, you’re going to be wondering what is that person doing? But it may be a homeless person that carries their life around with them on their back and is simply going down to a beach. But it would certainly warrant further inspection.

Zach: Is there much use of video training of watching footage of actual criminals and people who did bad things, and using that in the training?

Philip: You definitely can use that, particularly if you’re doing training for retail staff that are trying to identify the behaviours of shoplifters. There’s nothing better than actually showing them how people actually shoplift. There are, of course, lots of TV series that actually even help us. I can’t remember what the American version of Border Force is, but I think in every country now, they show you customs inspections of people arriving in a given country. And you are seeing the video footage of the person that is actually picked up, and their behavior whilst they’re questioned and whilst their bag is searched, and the description as to why they were actually identified in the first place. Those are all really useful, but really the best way of training people is to take them down to a live operational environment, the environment in which they’re going to work, and talk to them and shadow them.

And you will see that in most environments, 99% of the people are normal law-abiding citizens. And the people that you will focus on are people who are also law-abiding citizens, but actually they’re not necessarily matching baseline expectations at that given time. I’ll give you an example from a sports event that we recently covered, where we were very concerned about an individual’s behaviour outside of a sports stadium. And eventually we went and we spoke to the person, and we found out that they were basically an autograph hunter. And that’s all they were doing. They were waiting for their sports personalities to exit the grounds so that they could actually get an autograph. And their behaviour was different to other autograph hunters, but we got the explanation. And so, nine times out of ten or 99 times out of 100, you have the conversation and actually, you realize that there was nothing wrong. But it doesn’t mean that you were wrong to intervene. And that’s where I get really frustrated with security managers, particularly in an airport environment: when somebody sounds the alarm or somebody goes to intercept somebody and then you don’t find anything on them, and the security manager turns around and says, “See, you were wrong.” That, for me, is a very poorly trained security manager because-

Zach: -being wrong is part of the… Yeah, part of the optimal strategy means you have to be wrong a good amount of the time.

Philip: Absolutely! People are going to be wrong. And you do not penalize somebody for being wrong because all you’re doing is dissuading them from actually sounding the alarm in the future. And anyway, who’s to say that the person that you did identify that wasn’t carrying anything wasn’t doing hostile reconnaissance for a future attack anyway? Exactly as Richard Reid did with his shoe bomb.

Zach: It’s like in poker; playing the most optimal strategy in that, or in a lot of games, requires you to be wrong with your decisions, you know, a good amount of the time, because it’s a game of incomplete information. So in any game of incomplete information, you will, by definition, be wrong a good percentage of the time with the optimal strategy.

Philip: And there’s absolutely nothing wrong with that. I mean, the other issue I have with many security managers and supervisors, where there is a screening checkpoint, is if somebody raises their concerns about somebody and the supervisor says, “Well, did you find anything? When you X-rayed their bag, did you find anything?” And the screener says, “No.” Or, “Did they alarm the archway metal detector?” And the screener says, “No.” And then the supervisor says, “Well, let them go then.” Now, for me, that does not make sense. The fact that we did not find something prohibited or restricted in somebody’s bag or on their body does not mean that they do not pose a threat. There is the possibility that they might acquire the weapon, or whatever it is they’re going to use, after the checkpoint using insiders. There are numerous types of attacks that can be perpetrated without the need for a weapon or explosive device at all, where somebody can actually pose a threat using their bare hands.

Zach: One thing I’ve wondered about airport security: are they recording conversations much and running automatic analysis for scary words like ‘explosion’ or ‘bomb’, things like that? Is that a thing that happens?

Philip: In Hollywood, yes, but not in the real world. [chuckles]

Zach: Gotcha. Okay. That’s good to know. It’s something I’ve always been curious about when having conversations in airports.

Philip: There may be very specific locations where there is some kind of analysis going on, and I certainly think that we might see that utilized more often in the future. But the reality is that, you know, I get exasperated every time I see that somebody’s been arrested because they used the word ‘bomb’ at a security checkpoint. I mean, do you really think that a bomber is going to come along and utilize the word ‘bomb’ at a security checkpoint and actually have a bomb? All I feel is that everybody becomes fixated on that person and ignores all the people that really could be a threat.

Zach: They’d probably be the least likely. Yeah.

Philip: It’s the same with this whole nonsense over the liquids, aerosols, and gels. I mean, talk about a distraction where you’ve got screeners that are almost excited because they found 125 mils of toothpaste or a bottle of perfume or a bottle of water. And that’s what the screeners are looking for. Because they know they’ll find bottles of water and tubes of toothpaste. For me, that is just a huge distraction from actually focusing on trying to marry up the bag, the contents of the bag, and the person that is carrying the bag. And that can be in a retail setting, in an airport setting, at a sports stadium, or in any environment. It’s about building a picture of the entire person that you’ve got in front of you.

Zach: I know you focus on unruly passenger behaviour, and I know that there are reports and statistics showing that that has gone up in recent years. Can you talk a little bit about how bad that problem is? Has it really gone up as much as people say? And maybe what do you see as some of the causes there? I know that’s a lot of questions I just asked, but…

Philip: Well, there are a lot of problems with statistics, aren’t there? They reveal interesting facts, but they also disguise interesting facts. What we do know is that there are more incidents being reported now than ever before. But maybe there aren’t actually more incidents; maybe people are just reporting them more than they did in the past, because that’s what crew members are being encouraged to do. And because of our greater use of social media, there are more incidents that are hitting the headlines, because there is video footage recorded on board aircraft that people simply weren’t recording 10 or 15 years ago. But there is no question there is a problem with unruly passengers. And I think what we’re seeing is a gradual breakdown of discipline in society, where people feel more entitled. And I know I sound like some extreme right-wing activist here, but I am extremely concerned. Even though I think I’m politically very moderate, I am extremely concerned about the fact that people on both sides of the political spectrum are becoming more and more extreme and more and more opinionated, and some of those opinions then turn into arguments, and often, arguments that have severe implications for public safety. Whether it’s on an aircraft or at a sports stadium. We are seeing a greater number of people with poor mental health since the pandemic– and there’s no question that the pandemic had a hugely detrimental impact on people’s mental health; we’ve seen a surge in the number of people having to report poor mental health over the last few years.

And the fact is, you know, an aircraft cabin reflects society. You have got people now flying in greater numbers that have poor mental health. And sometimes if you combine that with use of alcohol, use of antidepressants, depriving people of sleep, fear of flying or whatever, you get this potentially dangerous cocktail that results in people acting extremely unreasonably. But the airline industry itself also needs to hold its hands up and say, “Some of the ways that we do treat people are actually unacceptable. Some of the stresses that we do put people through are unacceptable.” I’m not saying that it’s done maliciously; it’s not done with negative intent. But as the customer, there are probably times where we all feel that what we’re being subjected to is unacceptable behaviour on behalf of the airline or the airport or the security services. And often, that’s just the result of insufficient staffing, or sufficient staffing but insufficient training; it’s a multitude of different factors that all combine together and can, in the wrong combination, have really serious consequences.

Zach: Yeah. And to your point, it’s completely not surprising to me that there would be more incidents. Even if, as you say, the reporting can be exaggerated, the incidents themselves can be on the upswing because people are just more sensitive to threats and insults and such. With the kind of extreme polarisation which a lot of countries are going through these days, it just makes sense that more people are on edge, more people are willing to say something insulting to people, you know? It’s not surprising to me that we have a pretty good upswing, combined with the stress of COVID and such.

Philip: You know, whether we’re in Europe or when we as Europeans are looking at things on your side of the pond, we’re seeing a greater polarisation of political opinions. And that simply reflects society, and that impacts people’s behaviour as well in places like the aircraft cabin.

Zach: So, as you’ve talked a little bit about, there’s kind of a cat-and-mouse aspect to all the security work, in the sense that most people trying to do bad things will be aware, or at least the more professional ones will be aware, of the security approaches. And the more aware people are of the security approaches, the less effective those approaches are. When it comes to specific things that the aviation security industry focuses on, how do we balance that risk? Is there maybe a rule of not talking about the specific strategies too much publicly, and these kinds of things?

Philip: No, I don’t think there is really. I think that people do talk about their strategies. I find it amazing that the airport screening process is so predictable. We are so transparent about what is going to happen to people. I often wonder why, for example, does an X-ray manufacturer have to have its name on the side of the X-ray machine?

Zach: [chuckles] That’s a good point.

Philip: I mean, let’s face it. You’re not going to sell another X-ray machine to another airport because the name’s on the side of it. It’s not really about the brand at that point, so why have we got it there? Why are we telling the people with negative intent that you’re using a certain manufacturer or system that they can then go away, go online, look up the angle at which the X-ray beam hits the bag, and therefore plan their attack around that? Why are we doing that? Why isn’t it just a black box that you put your bag in on one side and it comes out the other side?

Zach: Are there other aspects of security that strike you as a little bit too obvious and repetitive in that respect?

Philip: Well, I feel that a lot of what we do is– I’ve said it before– I feel a lot of what we do is theatre. It’s deterrent. But if you actually go to the real world, we do know that the terrorist fraternity out there understands the limitations of the security checkpoint, and they know what works and what doesn’t work. They know what type of devices won’t make it through a security checkpoint and which ones will. And I don’t think we often treat the enemy with sufficient respect. That’s why, ultimately, the aviation system has always been reactive. We’ve always waited for an attack to happen and then we patch the hole. For example, before Richard Reid, we knew that there was a problem with shoes. But we didn’t do anything about it because nobody had actually tried to conceal a bomb in their shoes. We didn’t commonly deploy body scanners until Umar Farouk Abdulmutallab went through with his underpants bomb. We didn’t lock cockpit doors and we didn’t restrict box cutters until September the 11th. And there are so many things that you think, “Why does it take an event to happen in order for us to actually implement effective security measures?” And I know people don’t like to hear about the Israeli security system, because they get bored of hearing about how amazing the Israeli system is, but the reality is Israel had an attack in 1968 and then implemented a whole series of measures that have pretty much ensured the security of El Al aircraft and other Israeli aircraft ever since. There were a couple of incidents in the early 1970s, but once their profiling system was implemented, that was it. Now, you can’t transplant the Israeli system and put it into the United States or the United Kingdom, because the scale of the aviation industry is so much greater in the United States or in Europe than it is in Israel. And the tolerance that people have got for more invasive security is much lower. But you can use elements of it. You can deploy common sense. We know that, for example, the terrorists out there are planning attacks that are going to be based on chemical and biological weapons. That’s not a shock, that’s nothing new, I’m not breaching any security by saying that. We know that that is out there. What measures are in place at airports to prevent a chemical or biological weapon attack? Well, I think your listeners probably know what the answer is; I don’t need to put it into words. The reality is we’re waiting for that type of attack to happen before we implement a measure to actually prevent it. But there is a measure that can prevent it. And actually, it is based on behavioural analysis and behaviour detection. And that means making hard decisions about people. You’re not going to resolve whether or not somebody can get onto a flight or get into a sports stadium simply on whether you detect a threat item in their bag or on their person; it means that you’re going to possibly prevent somebody going into a sports stadium, to a rock concert, to a shopping mall, to a theatre, to an airport, or getting on a bus because you’ve got sufficient concerns about them.
And you’re actually going to say, “It may not be fair to that person, but our primary objective here has got to be to safeguard lives and to do what we need to do.” We just need to make sure that the people that are doing it are trained to act sensitively and to be able to question people appropriately, because in 99.9% of cases where concern arises, you will be able to resolve those concerns by having a friendly conversation with an individual.

Zach: And I think most people are okay with the idea that if me or other people being occasionally inconvenienced is what it takes to save a lot of people, then that’s a fair trade-off. I think most people would-

Philip: I think you’re absolutely right. I mean, there is the occasional person that will say, “Why are you focusing on me? Do I look like a terrorist?” First of all, I’ve got no idea what a terrorist looks like, and nobody can tell you what a terrorist looks like. But we’re not even only looking for terrorists, are we? We’re looking for anybody that could be a problem, and somebody that could be used by a terrorist, and somebody that could be trafficked, and somebody that is a trafficker, as I’ve said before. We’re looking for a host of different people that are out there. And providing we do implement security sensitively, I think we can be far more effective than we are at the moment. And I think we should actually be getting rid of elements of the security system to make the whole system more user-friendly and customer-service friendly. I think that actually even having the checkpoint… I mean, this was something that was sort of born about 25 years ago, the whole concept of centralized screening in an airport where everybody is screened at the same place; I feel that was detrimental to effective passenger profiling or behavioural analysis. It was so much better when we used to do it at the gate. Because at the gate, you could have a flight that was departing to Shreveport, Louisiana, and another flight that was going to Cancun in Mexico, and another flight that was going to London Heathrow, and another flight that was going to Reykjavik, Iceland, and you will know that the behaviours of people going on those four different flights will be completely different, and that what they will be carrying will be completely different. And you would have a relatively small group of people that you will be able to analyze. But we moved away from that model of screening at the gates and moved it to a big centralized screening area, not because it was better security, but because it was cheaper and because it was better to put all of your resources in one place. And we have to recognize that security does cost. We know from 9/11 and other major terrorist events that short-term savings actually result in long-term huge expense.

Zach: This has been great, Philip. Thanks for your time and I really appreciate you coming on.

Philip: My pleasure.

Zach: That was aviation security professional Philip Baum. You can learn more about his work at his website avsec.com, that’s avsec.com.

This has been the People Who Read People podcast, with me, Zachary Elwood. If you enjoyed this episode, check out the back catalog of episodes on my website www.behavior-podcast.com. Some of the more popular episodes I’ve done have been about reading human behavior for security- and criminal investigation-related applications.

Thanks for listening.