Webinar Replay: Human Vulnerability and Cybersecurity

Human Vulnerability and Cybersecurity
Guest Speaker: Dr. Erik Huffman
Thursday, February 29th, Noon - 1:00 PM Pacific Time

Dr. Erik Huffman returned to speak on human vulnerability and cybersecurity. This conversation was tailored to resonate with founders, CEOs, COOs, operations managers, and office managers across the Puget Sound region.

The discussion began with fundamentals. Dr. Huffman explained the connection between psychology and cybersecurity, reviewed Cialdini's principles of influence, and offered a definition of amygdala hijacking.

With terminology and theory behind us, it was time to move on to real-world examples (phishing emails, voice cloning, etc.) that illustrate the principles of influence, as well as emotional instability, one of the "Big Five" personality traits that heightens vulnerability to cybercrime.

The last portion of the conversation included a glimpse of Dr. Huffman's current research into our ability to differentiate between photos of human faces and computer-generated images.

In conclusion, Dr. Huffman offered these words of encouragement:

"We are the difference between success and a data breach. I focus a lot on the human factor because I believe in people. I believe in us, and I believe that we can overcome this. It's showing how we're all vulnerable as people, yes, but also you're the biggest success or failure in your organization. You've blocked more phishing attacks than your spam filter could ever imagine, and so you're a valuable asset. It's just a level of awareness that we have to continually have to stop the 78% of data breaches. If we can cut that in half, we've changed the economics of cybercrime entirely, and I believe that is our method to success."

Dr. Erik J. Huffman

Video of the event and a transcript appear below.

Click here to view the video on YouTube.

Human Vulnerability and Cybersecurity Transcript:


Kelly Paletta, EXP: Good afternoon, everyone. Welcome to our webinar. I'm Kelly Paletta, Director of Sales and Marketing at EXP Technical. Today, we're going to be talking about human vulnerability and cybersecurity.

We have a very interesting, informative, and entertaining guest presenter to speak to you today. But before we get started with that, and before I introduce our guest, I do have a few administrative announcements that I'd like to run through at the top of these events.

First of all, as our last couple of webinars have been, this one's going to be pretty interactive. There will be times where you can participate. You're certainly welcome to ask questions as we go, and you can do that using the Q&A feature or the chat feature in your Zoom session. The difference between the two of those is if you use Q&A, that will be private. The message will only go to me. If you use the chat session, other attendees can see the questions or the comments that you're presenting to the audience here.

Now, one of the questions that comes up every time is, “Is this event being recorded?” And fortunately, our guest presenter has been gracious enough to allow us to record this event. It will be available at EXP Academy, which is our technology resources site, and also available on our YouTube page. It may take, be patient, it may take me about a week before that's available because I like to go through and edit the transcript and do a little bit of video post-processing if I need to. So bear with me on that, but be on the lookout for that. I will send you all an email when that is available, should be about a week from now.

And that's probably a good segue for me to start introductions here. So our guest today is Dr. Erik Huffman. He is an entrepreneur and a founder. He leads Handshake Leadership, which is an organization for scholarly research and cybersecurity training and awareness. He heads Pikes Peak Robotics, which I think is a robotics club, and he provides mentoring and coaching for kids through that organization. You can correct me if I'm wrong on that as we go, Erik. He also is a podcast host. He hosts the MiC Club podcast, and MiC stands for "Minorities in Cyber." I would encourage you to subscribe to that. I watched one of the podcasts this morning, and it covers really fascinating topics.

And on top of all of that, his day job is Director of IT for BombBomb. And BombBomb, if you don't know, is a video messaging platform, and it's a product I'm a big fan of. That's how Dr. Huffman showed up on my radar. I'm a fan of BombBomb. I use that product for video messaging, and it kind of led to me being aware of some of the presentations that he had done in the past. So after that long-winded introduction, welcome to our webinar, Dr. Erik Huffman.

Dr. Erik Huffman: Hey, I appreciate it. Thank you for having me.

EXP: Alright, well, and that's all you got to say, huh?

Dr. Erik Huffman: Yeah, I'm just ready to get into it, you know. There's a lot that I do, but this is not about me. It's about the content and us coming together and having a good conversation.

The Cybersecurity/Psychology Connection

EXP: Well, I am really excited to have you here. And I'm gonna make it about you. I'm going to throw some of your words at you to start off with.

I wanted to start off with a quote. This was something—it might have come from one of your TEDx talks. And if you're watching this and you're curious, Dr. Huffman has presented a couple of TEDx talks, and they are both really fascinating, entertaining. They're within that constrained format of, I think they have to be within 15 minutes. So it's pretty tight but packed with information.

Human Hacking: The Psychology Behind Cybersecurity | Dr. Erik J. Huffman | TEDxColoradoSprings

Chasing Security or Seeking Safety? AI in the Digital Age | Dr. Erik Huffman | TEDxManitouSprings

But one of the things—and I think it came from one of those presentations—you said, "Cybersecurity is more closely related to psychology than computer science." And I was wondering if you could expand on that or tell us a little bit about what you had in mind when you said that.

Dr. Erik Huffman: Yeah, definitely. When you think of cybersecurity, you're talking about attackers, and then you're talking about victims as well. It's much more than just ones and zeros or a network. We're talking about impact to people, and we're talking about attacks that impact people and organizations as a collective. We like to think of it as, "Hey, this hacker hacked this machine," until it happens personally. Then it's, "This hacker stole my data," and we use verbs like "stole" and "taken," which are physical.

We say things like, "You've stolen something." We think of it from a physical perspective. And all of that is more psychologically based.

And when we talk victims, we're talking about the mentality of victims. And a lot of times when we talk hackers, we're talking about the mentality of the hackers. Then after the fact, you know, us uber nerds, we think of like what mechanisms or what did they do to get that data. But I think it is much more psychologically based than it is computer science founded. And I will always stand by that.

EXP: Yeah, you know what? I think it's interesting because in my experience, which is kind of limited because I'm just a sales rep and I'm not doing the triage and the remediation, but it seems like there's a level of complacency that people think it's a technology problem until it happens to them. You know, they think, "Oh, there's firewalls and there's antivirus and there's endpoint detection and response and data loss prevention and all this technology to protect me."

Dr. Erik Huffman: Oh, yeah, for sure. I think the last Verizon Data Breach Investigations Report said 78% of all data breaches involve the human element. And so we're talking about that 78%, which involves people. We're not talking about the 10, 15, 5 percent, whatever, which are just the nerds down in the basement doing the hacking.

Most attacks are—they're attacking people. They're targeting humans first before they start targeting the machine, before they go toe-to-toe with the firewall.

They want to figure out if the easiest way to get into your network, your computer, your environment is to make you, the user, a participant in your own attack. If you are a participant in your own attack, there is nothing technology can do, no defense it can leverage.

The general population thinks, "Well, the spam filter is going to save me, the firewall is going to save me," and things like that. And they think that cybersecurity or security itself is a job title that someone holds.

No, it's actually a duty upon everybody in the organization. And everybody needs to take part in securing the organization because if you become a participant in your own attack, it's like unlocking the door to your house, allowing the person to run in and leave with everything and say, "Man, they stole my stuff!!!"

Like, man, you let them in!

And that's the 78%. That is the majority of attacks that we have going on.

In the intro, you said that cybercrime is the third largest economy. 78% of that economy involves the human element. And that's just from Verizon. Those are the attacks that we know about, not the everyday data breaches, not the, "Hey, someone scammed you," or, "Hey, this small company that no one's heard of has gotten attacked," because there are a lot of those as well that aren't included in that number, right?

Who Should Be Concerned About Human Vulnerability and Cybersecurity?

EXP: That's one thing that's really encouraging about this series: for one, that you're willing to speak to our audience, but also who's in attendance. I'm not sure if you're aware—we spoke a little bit about this previously—but the folks attending this event and our series of webinars typically are not IT experts. And I love that about this audience. They might be CEOs and founders and executives. They might be COOs and operations managers. There are a lot of office managers and a lot of people that, for lack of a better term, are end users: people that don't have a lot of cybersecurity expertise by way of training. They don't have degrees in computer science. They don't have certifications. But to their credit, they attend events like this because that helps them keep cybersecurity and safer computing practices top of mind. One of our values at EXP is share knowledge, share success, and this is one way that we feel like we can make the region, the Seattle area, the Puget Sound region, more secure.

I don't know, I'm kind of monologuing here, but it's off of your point that 78% of those attacks involve the human element. It's people. And again, I'll repeat a phrase that you've said recently, and that is that attackers log in, they don't hack in. You know, somebody invites them in, right?

Dr. Erik Huffman: That is true. And I'm glad you all are doing this, because a lot of times I speak at conferences and things like that, and you end up preaching to the choir; technical individuals and people that are passionate about security speak a lot to each other. But those messages need to hit the CEOs, the CISOs, the CFOs, the individuals that are not quite plugged in on a daily basis, because as a CEO, you have a lot to worry about on top of security. Just taking time out of your day to listen to a conversation like this is a huge step in the right direction. And we need more individuals that do not have information security or cybersecurity in their job title to tune in and listen to what's going on in the industry and in the threat landscape, because you're protecting it too. You can't just hire someone to protect your organization and expect them to protect it even from what you yourself might do against it. That's not fair to the security person, and it puts too much trust in one particular individual.

Terminology: Principles of Influence

EXP: Right, right. Well, now that you've said that, I want to dive kind of deep into the psychology of it. You presented to this audience about a year and a half ago, and some of this is review, but it's also so important to some of the things that we're going to talk about later. One of the concepts that you've spoken to in the past and that I've heard you mention are these principles of influence. And can you speak briefly to what are the principles of influence and how did they relate to a person's vulnerability to a cyber attack?

Dr. Erik Huffman: Alright, so the principles of influence—you can think of digital social engineering or social engineering at all, think of it like “malicious marketing.”

And think of a spam or phishing email as the campaign that they're casting out.

And just like you would as an owner of an organization or a company, or as an entrepreneur, you want your click rate high because you want people to click on your marketing. Cybercriminals are the same exact way. Just like you, they want to influence their target audience's behavior, using the same principles of influence.


Reciprocity

One is reciprocity. If they give you something, you're going to feel obligated to give something back to them. The example I like to use is when you go out to eat, they give you a mint on top of your bill, because you are likely to tip 20% more when they do so. The same with car dealerships: they give you something, and you are more likely to buy a car because they did so. That's just human nature.

Commitment and Consistency

Next, commitment and consistency is just kind of, “don't quit,” and “something's gonna happen.” That's pretty self-explanatory.

Social Proof

Then you have social proof. With attackers, they could try to prove that they're not AI, that they're not just—they're targeting you specifically, like in a spear-phishing attempt, by letting you know, like, "Hey, we saw your company acquired this," or "We saw that your company is utilizing this." That's social proofing.


Liking

Next, liking. If you like someone or something, it's going to influence your behavior, just like you're more likely to click on an ad with certain colors, or you're more likely to go out to eat if the food looks a certain way. They do the same thing in these social engineering attempts. If you like Amazon, you're more likely to buy something from Amazon. If you like Apple, you're more likely to buy something from Apple. So they will spoof or fake being those companies, or representatives of those companies, because you're more likely to click on that.


Scarcity

Then, scarcity. If there's one left, like on Amazon, they will put in red letters and numbers, "There is one left," and you're more likely to buy it. Just like with cars: if there's one in a hundred, the car can look ugly, but you're like, "Dude, I kind of want it because it's so rare."


Authority

And then, lastly, you have authority. Authority is like, "Do this or else!"

And we see a lot of that with faking a CEO's name or a CEO's job title, because with an authority figure, you're more likely to fall in line with what they're saying to do because of their position. It's not really about whether you respect or like the person; because of the person's position, you're more likely to fall in line and do whatever they're saying, because they have authority.

EXP: This is interesting. This will come up later because I'm going to throw some examples at you. But authority is a combination. It sounds like there's a couple of aspects of that.

One is an authority figure, somebody that has certain credentials or a certain role.

And the other is an authoritative command, like a parental command: "You do this because I'm telling you to do this," right?

Dr. Erik Huffman: Yeah, exactly. Yeah, exactly. So, with my research, I do a whole lot of phishing campaigns, and I work with a lot of organizations. And so, the easiest way into an organization is through sales or marketing. Nothing against them.

EXP: I am sales and marketing!

Dr. Erik Huffman: Exactly. Nothing against you at all. You're more likely to respond to me, so I'm more likely to have a conversation with you. Say we talk for a couple of days, and then I'm like, "Well, you know what? I've got to back out. Your competitors are doing this, and here's a link." And you click on that, because, what are they doing? Next thing you know, I'm trying to work my way in.

Conclusion: Principles of Influence Affect Us All

And so, those principles of influence just pile on top of each other. And it doesn't matter if you're a man, woman, or child, or what your ethnicity is; those are just human traits. All of us share them.

And it's not because I'm a professional in cybersecurity that I don't have these traits. No, you just have to be aware that you do have these traits and understand that you're in a unique environment where they could be exploited against you. So you just have to think twice and ask: is this a marketing attempt on me, or is this exactly what it seems to be? A lot of times, it's not, especially when you can't see or hear the person behind the other keyboard. That makes it significantly harder to pick the right answer through all of the confusion, because you have marketers marketing to you, you have hackers marketing to you, and every website you go to is marketing to you. So you're just trying to feel your way through what's real and what's not. And sometimes you get way too comfortable and you get a little click happy. And that's it.

EXP: Wow. This was shocking to me: just a few minutes ago you described an attack that I wasn't even aware of, one that I would be particularly vulnerable to. It's kind of a long con of a social engineering attack, where somebody poses through multiple emails as a prospective client, and during that process, they establish trust. Then at some point, they say, "This attachment is the RFP," or "Click on this link to see what your competitors are doing," or something that I'm very likely to click on. And your point is trust but verify, right?

Right, right. I mean, or the general rule in cybersecurity is tap the brakes, be sure that the person you're talking to is who they purport to be.

Dr. Erik Huffman: Exactly. So it's difficult for small businesses to understand that, you know, you may not, in your mind, make a lot of money, but would you work a week for a million dollars? Would you work a week for $225,000? And so it's not like, "Hey, they're only going to target these large companies." They're a decent payday. They're targeting everybody if possible. And there are individuals out there that are willing to send multiple emails, they're willing to send multiple messages. And a lot of those, we can see those. Some of those are small businesses. We've seen some of those with large businesses just recently. I apologize, I forget the organization, but if you look it up, there was an organization that sent $25 million because the individual posed as the CFO, and they convinced them in a deepfake attack to transfer $25 million.

Right. So they used the authority principle of influence. They posed as the CFO, and they had this deepfake attempt. And so this individual is listening, and they're like, "Alright, this person is who we think they are, because I can see or I can hear this person. So I'm going to transfer this $25 million based on the authority of the person that told me to do so." And $25 million gone. And that's a fairly decently sized organization. These people are doing a significant amount of work to make it happen. And it sounds vain, I apologize if it does, but they're good marketers. They're good at marketing. They're nailing it. Think of your click rate for your last marketing campaign, and imagine it being like 15, 20%, because that's how some of these social engineering attacks happen. That's how some of these phishing campaigns work. You send some out, and you get like a 20% click rate. Organizations would flip if they got a 20% rate on their ads.

Yeah, so they're doing it in a similar way that we're doing it on the business side, on the good side, on the well-intended side. They're just taking that and they're flipping these principles of influence on their head and just executing at a very high level on us.

EXP: Right, right. And, you know, I confess, even benevolent marketers have read Robert Cialdini's book, “Influence.” 

I can show behind the curtain. We're using those principles of influence right now.

We're giving away free information in hopes that people come to know, like, and trust EXP Technical. And, you know, if it comes time for them to decide on an IT support provider for their business, they might be familiar with us and feel some sort of endearment to our organization. And we're also using likability. You're a very likable speaker. And so, we're using that to our advantage too.

Terminology: Amygdala Hijacking

EXP: I want to jump to one other technical thing, and then I'm going to throw you some examples, and I think it's going to get fun. We're going to try some things that I've never done in a webinar before.

The other technical term that I want to ask you about is “amygdala hijacking.” It's one I've heard you use before. What is amygdala hijacking? What happens when that happens?

Did I say that right?

Dr. Erik Huffman: You did.

Amygdala hijacking is when your brain has an immediate response because your amygdala is part of your limbic system. Your limbic system in your brain is fight or flight.

Until recently, before AI and deepfakes became so prevalent in digital social engineering, you wouldn't see attackers using video to conduct attacks. Because if you can see the person, it's going to be like, "No, 'stranger danger!' I don't know this person. I don't know if I like this person. It sounds creepy. I'm out. Like, I'm done. This is not what I'm looking for."

Amygdala hijacking is what happens with that fight or flight response. You just stop thinking and you just start acting. You just start doing things.

And this happens with cyber attacks more than people realize. There was a phishing campaign that I conducted, I will never forget this, and I've used this as an example before. I sent messages out in three tiers. In the first tier, I'm cheating. I'm using everything I possibly can against the organization. I'm spoofing everything. You're most likely going to click on this. I'm cheating; that's where I want my highest click rate. Most of the messages are in the middle tier. And then the last one is kind of like, ain't nobody going to click on this. Like, just nobody's going to click on this.

Someone clicked on that!

So, when I sent it, I sent an email, and it was something along the lines of, like, "Hey, we hacked your computer. We saw you looking at adult websites. Send us three Bitcoin, which is like $10,000, and we'll make it go away."

And so this person clicked on it and responded back.

And me, I'm used to people just messing with me. So I'm like, "This person's messing with me." So I'm like, "Alright, fill out this form so we can process your payment information." They filled out the form and sent it back. And so, next, I'm like, "Time out, like, time out, hold up. I've got actual information here. I'm conducting a campaign. I do not conduct any attacks for real. That's not what I'm doing." So I brought the individual in and I said, "Hey, what's going on with this? Are you just messing with me? If so, you should probably use fake info, not your real info." And the individual, after a bit of conversation, breaks down in tears and says he's going through marital problems, and the marital problems involved some of that. He just said, "I just wanted it to go away." That is as close to amygdala hijacking as you can get.

And unfortunately, I will bring this up and then we'll move on, because I don't want to make this conversation dark: there are individuals that go the full distance and end their lives because they lost their life savings in an attack. That is amygdala hijacking to a level I can't articulate, because I've never experienced that. But those things are real. So you get some on the panic side, you get some on the really dark side, but then you get some on the super excited side. For a super basic example, you won some contest and you're like, "Oh crap!"

One of the largest attacks in human history was early in the days of the internet, when AOL used to send the little discs out to everybody's house in the cool cases. I kind of miss those. I kind of don't. But remember the "I love you" virus?

EXP: Oh yeah, yeah, yeah.

Dr. Erik Huffman: Everybody, when they saw that, their brain just stopped thinking and processing, and they just thought, "Who?" And so they just clicked.

Like, dude, sometimes it's small bits of data where you just stop thinking, sometimes it's super emotionally driven and you stop thinking, sometimes it's just crazy, crazy deep pain or hurt and you just stop processing information. That occurs online, and it occurs face to face. When you're online, you're still human. When you're face to face, you're still human.

When you're face to face, you're built for that, you're created for that, whatever you choose to believe. When you're online, no one thought we'd get this far. My elementary teacher told me I would not have a calculator with me everywhere I went.

EXP: I heard that line, and guess what, Mr. Ford?!

Dr. Erik Huffman: Guess what? No one expected us to make it this far. And so, this method of communication and deductive reasoning is new to us, and that's just it. We're bad at it. That's just being honest. We're just very, very bad at living and operating in this environment.

EXP: Yeah, indeed. And that, you know, that gets back to another one of my favorite of your quotes, which is that when we're in the cyber world, we're living in a world that we weren't built for.

One of our other webinar guests, Dr. Mark Dupuis, said practically the same thing in different words.

He was saying: you've got hundreds of thousands, millions of years of evolution that teach you to jump when you see a snake. You've got decades, hundreds of years that teach you to beware when you see a sign that says "warning." But in some cases, there are new attacks, like deepfakes, that are minutes old. Or they're days old or weeks old. So we don't really know how to process these things.

Human Vulnerability and Cybersecurity: Examples and Analogies

Example 1: Phishing
The Authority Principle

EXP: I'm going to switch gears a little bit here, and I warned you that I had some examples. I'm going to share one.

If I did it right, there should be a phishing email.

A while ago, Tony Lesirge and I were asked to talk to a group of educators about security awareness. I borrowed a little bit from Dr. Mark Dupuis on this.

In one of the slides, I was like… If I wanted to phish you teachers, and I'm not an expert by any means at phishing, but I might try something like this.

And I was wondering if you have any comments on this sort of a phishing attack or maybe some of the principles that we've talked about and how they work in this sort of instance.

Dr. Erik Huffman: I like it, um, just for a phishing attempt. If I say something positive, it's probably kind of negative.

I like it because when you're an educator… We all understand that feeling. If you've been in education or ever taught a course, if your friends are educators, you get it.

One of my best friends in the whole world is a teacher. In the subject line, "What were you thinking?" is triggering. It's triggering to the educators, triggering to the teacher.

They're already up in arms, already kind of flustered. That's starting the amygdala hijacking: "I need to reply to this. I don't need to reason through it. I need to reply to this, talk to this individual, not just make this go away but calm this person down," because that's what they do with a lot of parents.

So, you see this: "Emma showed me your comments on her homework," and it's attached. "What in the hell makes you think this comment is appropriate?" Everything leads the educator to think, "Oh my gosh, just another one." The only thing that can possibly save them from feeling doomed is hoping they don't have an Emma in their class. And picking the name Emma is smart, because it's a very common name.

EXP: I picked a common name.

Dr. Erik Huffman: It’s playing on a lot of the principles of influence.

Ironically, this is one of authority, which is not quite authority, but you know it's a parent that's demanding a response.

As an educator, you want to respond to the parent because they're demanding a response about their child. And the "Seattle CPA" Gmail address reads well. It doesn't raise any red flags, so creating a Gmail account was smart. This is a good one. You did a good job with this one.

EXP: Well, good. I passed the test. But part of it was I borrowed from some of the things that you've said, and also, again, hearkening back to Dr. Mark Dupuis, who is a cybersecurity expert and professor. He challenges his students to phish him.

He said the one time that a student was successful was when they posed as a parent because he was like, “This looks suspicious but I can't ignore an angry parent. I have to…” and he felt compelled to respond.

Hopefully the teachers or anyone that sees this sort of attack don't jump down that rabbit hole immediately. You pause, you tap the brakes, you confirm through some other means. You call Emma's parents directly on the phone rather than replying to this email and opening the attachment.

Part of the reason, the inspiration for this too, is my own experience with security awareness training.

At EXP Technical, we've created security awareness training and we give it away for free. It's available at EXP Academy, academy.exptechnical.com.

And in that training, the information is somewhat generic and broad.

We've had about 2,000 students complete the training, and only three or four have offered complaints. And the complaints have all been the same, and I feel like we've done a disservice to those students. They've said, "Well, this is common sense." "Anyone who gets phished this way deserves it," was a recent one.

Literally there have only been three or four…but I felt like they don't really realize that a just slightly more personalized attack that hits them at the wrong moment is very likely to be successful.

I worry a little bit about the arrogance or the overconfidence of that student as they go out and about in life.

But again, I'm kind of monologuing here and you're our guest speaker. So my question though is--if I turn this into a question: do you have any thoughts or comments about like overconfidence? One of your recent podcast guests talked about anti-fragility. Being overly confident is kind of a fragile state of, “Oh I'm not going to even entertain the idea that I could be a victim of this sort of attack.”

Do you have any comments about overconfidence or the antifragility posture when it comes to cybersecurity?

Dr. Erik Huffman: Yeah, overconfidence is the enemy.

Cyber professionals, IT professionals, a lot of them operate with that.

They know the technology, and so they “can't be hacked.”

That is the worst situation you can be in because one of the attack vectors we have as people is comfort. The more comfortable you are, the more hackable you become.

If you're guarded, if you're up in arms, just think of the internet as a new city. When you go on vacation, let's say we vacation in New York, and you walk around with your backpack on or your purse out, I guarantee you clutch that purse a little tighter because you don't want anyone to take it from you.

If you're too comfortable in this unique environment, on the internet, online, if you think you can't be hacked, you've got this bravado about you. That is a scary place to be, because you don't quite understand that all of us as people are vulnerable. You may protect your computer, you may protect your network very, very well, but are you protecting yourself? You're answering various emails, or you probably are hacked and didn't even know it, or you have that bravado, and you work in an organization, and you've been hacked, and you're scared to admit it. That's how you end up with, "Hey, three years ago this happened. How the hell did no one find this?" Because someone hid it, right?

EXP: And yeah, it's been lying dormant or people have been extracting information for all this time. There's even a comment in the Q&A from one of our audience members. She wrote, "It's not a matter of if, it's a matter of when," which is a great way to think about it.

You can't let your guard down. You can't think, "Oh yeah, it's only stupid people that fall for this." That's not the case at all.

Dr. Erik Huffman: No, it's analogous to drivers. Everyone says everyone sucks at driving, but everyone thinks they're a good driver. How can both of those things be true? If everyone sucks at driving, but everyone thinks they're a good driver, who are the crappy drivers out there?

Example 2: Voice Cloning
Authority and Liking Principles (Human Factor Authentication)

EXP: I'm gonna circle back to that in a minute, but I want to share another example. The things that you're saying are relevant to some of the illustration examples that I wanted to share.

There's a YouTube video. This is an excerpt from our security awareness training, and it speaks a little bit to some of the things you've already outlined, to authority and authoritative voices. This is a voice-cloning lesson. It's frankly some of the comic relief in our security awareness training. But I'll just play it, and you can hopefully see and hear it.

Video begins:

Kelly Paletta: In this lesson, we're going to talk about voice…

[“Hail to the Chief” cellphone ringtone plays.]

Hang on a second.

Important call…


DJT (impersonated voice): Kelly, listen, I need a favor.

Kelly Paletta: Okay, what's up?

DJT IV: Go to Walmart and buy all of the gift cards. Put it on your credit card.

Kelly Paletta: My…my company card?

DJT IV: Yes, your company credit card.

Kelly Paletta: Well, shouldn't I get approval from Tony or Bianca before I do that?

DJT IV: That's the stupidest thing I have ever heard.

Kelly, shut up! Just go to the store.

I will call back in one hour.

Kelly Paletta: It seems a little sketchy…

DJT IV: I can't talk. Heading into a very beautiful tunnel.

EXP: Okay, so I'll stop that right there. But, you know, you made me think of that earlier because you said "authoritative" and, uh, I don't know if everyone else hears it this way, but when he says "Kelly, shut up," it sounds like a parental command, and I feel like, "Oh, I better pay attention." Any thoughts on that or on voice cloning or just the principles of influence that we're seeing in this sort of video?

Dr. Erik Huffman: First of all, hilarious. Absolutely hilarious.

I think the principles of influence there…Authority is a very good play on that, and also we've seen attacks exactly like this be successful. We've seen it from a text perspective, but when you add the audio to it and the person can hear that, that's another check in the validation of the victim's mind or the potential victim's mind, because even if they are aware, the fact that they can hear someone and if the audio sounds smooth and it sounds like a command that they would follow, that makes it very hard.

Because as a human, as a person, if you can hear the person, great.

If you hear your mom's voice, that impacts the body. You begin to feel love. Like, you physically, your heart, your mind, the amino acids and the proteins go through your body, you feel love. And so you hear your mom or you hear your dad's voice, it's a calming effect to you. It's happened as a kid, that's why we do some of the things we do as children. The same thing goes in as an adult.

But if you hear like a coworker or the CEO's voice, you know, similar things happen, similar proteins go, like, if you're fearful of your job, man, like, during the pandemic, I would send out phishing campaigns saying, "Hey, layoffs, layoffs."

Even now layoffs are happening.

People would click those like crazy because people are fearful of their jobs. But if they could see or hear the CEO saying, "Hey, it's me,” or “I need you to click this,” or “I need you to do this because there is this potential negative impact happening on the company," people begin to act. They begin to react without fully thinking, thinking and reasoning their way through the situation.

So yeah, all the principles of influence are being put in play there, right? Commitment, consistency, liking, authority: a lot of those are being put into play there.

EXP: And voice cloning, I mean, that's an attack that we're starting to see, and like the example you mentioned earlier, they're not that common quite yet. I would encourage people to Google the 60 Minutes episode on voice cloning, because it talks a lot about these sorts of attacks that are aimed at our parents and our grandparents, and they fooled an intern.

But where I'm headed with this is: if you listen closely to that imitated voice, it's not a very good imitation, but I'm using phrases that we associate with that voice. And there are other things that kind of sell it. Again, the parental command is the one that gets me, where he says, "Shut up, just do this." That's the one that kind of snaps me to attention.

And it speaks to a concept that you have mentioned before, which you've referred to as "human factor authentication." We start to believe this is the person, right?

Dr. Erik Huffman: Yeah. So with human factor authentication, we all try to authenticate the person before we accept the message. And if you respect the person or if you trust the person, you're more likely to accept the message, regardless of whether it's real or fake.

We've seen a lot of that with misinformation and disinformation campaigns that go out. If you can get to a person that the other individual respects or trusts, they're more likely to trust that message. Just like if you go out to eat, you'll ask a friend, "Hey, is this new restaurant pretty good?" They say, "Yeah," and you're more likely to go. But if your friend says, "No, it is terrible, it is awful," you're less likely to go.

So, the human factor authentication is massive in a digital environment. You do it from a name: you try to read a name, and if you trust a name, you begin to feel something there. Or you read the email address, and if you trust the email address, if you know who that email address belongs to, you're more likely to accept that message as well. But if you can hear the voice, that kind of checks all the boxes, like, "Hey, I hear the voice, I think this voice is real." And if they get you to buy into that voice, it's hook, line, and sinker.

Not that you're guaranteed screwed, it's just, it's going to be harder for you to recognize that that attack is fake.

Example 3: Snickers Commercial
Moments in Which We Are Vulnerable

EXP: So I've got another one to share. This is from the pros, so these people are much better at comedy than I am, but you'll recognize it as soon as you see it. It speaks to something that you mentioned before.

[Snickers Commercial Plays]

I find that commercial hilarious, but it also speaks a little bit to the driver analogy. What I mean is, sometimes you talk about people's vulnerability to attack, and one thing I've heard you mention before is emotional instability. Just like driving, nobody would say, "Oh, I'm a bad driver." Everybody thinks they're a good driver. Everybody thinks they're emotionally stable. But when you're hungry, when you're distracted, when you're in fear of losing your job, when you're going through a divorce, or just late to pick up the kids, you're very likely emotionally unstable. And if you're scrolling your work email in those moments, you're particularly vulnerable, correct?

Dr. Erik Huffman: Yeah. So if you're less comfortable, if you're more distracted, you're more likely to fall victim to digital social engineering. What we call those psychologically are emotional hot states. If you're in an emotional hot state, like if you're way too busy at work, you're less likely to take time and breathe and think things through. It's something we've built into a lot of our organizations, these emotional hot states. If your organization says, "Hey, it's a fast-paced environment," that means you might be in more emotional hot states than someone else. Not to say every workplace needs to be calming and relaxed. You just gotta understand that individuals in emotional hot states are less likely to think things through and deductively reason their way through digital social engineering attempts.

Like, if you have a thousand emails to read through, are you really reading a thousand emails? You're kind of clicking through them. I've been there as well. You're kind of just clicking through them, and you're like, "All right, cool, they need this, they need this, send this off, send this off," so you can get through that stack of emails that you have because you're just currently in an emotional hot state. Doesn't mean that you're a bad person or you're stupid at that moment. You're just trying to get things done as quickly as you possibly can, so you're not thinking things through. And that's exactly what the attackers want.

If you really take your time and you think everything through from a third person point of view, you can see the mistake the person's making, and it looks obvious. But when you're in it, it's like, "Oh man, I just, I missed that one." So, the car analogy, like, "Hey, there's a lot of car accidents out there, and all of them made a mistake." None of them wish that would happen. Or I would say most of them, most of them made a mistake, and it's like, "Man, how did you miss that? How did you not see that stop sign?" It's like, "Well, I was texting, I was calling, I had my Easy Bake Oven going on, I was making cookies on the side," you know, they're just totally distracted. That's essentially, in a roundabout way, what we find ourselves in when we just jump into our inbox alone.

EXP: So, the question comes in, how do we recognize those states and respond? What do we do, you know? And I guess what they're asking is, what's the news they can use from this information?

Dr. Erik Huffman: Yeah, with the emotional hot states, I tell everybody to use the same analogy. When you go online, you're in a new city, you're actually in a whole new world. Understand, detach yourself from the situation and understand, like, hey, if I'm just way too busy, which means I probably need to slow down for this particular moment in time. Granted, I know everything in you just wants to catch up, but as you read messages and it requires you to send something off or requires your response and your attention, give it your attention. Don't give it part of your attention, give it all of your attention, and then reason your way through that. Because it's not the criminal putting you in an emotional hot state, your environment has put you in an emotional hot state, so you're clicking through all these things.

Understand that, yes, I'm in a whole new environment, I'm in a whole new world. I need to detach myself from the situation, put my guard up, and not get so comfortable. Put my guard up and say, hey, before I send off this message, is this who I think it is? It's better to take the time to slow down and pull yourself out of an emotional hot state than to just hope nothing bad happens. Right?

EXP: Right, yeah, to tap the brakes and think twice for sure.

In our training, we use the SLAM acronym: Sender, Links, Attachment, Message. Really, the point is that it's not a foolproof way to avoid being hacked, but it does force people to slow down and think, "Am I really expecting an email from this person?" or "Does this link lead to the organization it claims to be from?" and so on.
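The SLAM checks described above can be sketched as a simple screening heuristic. This is a minimal editorial illustration, not code from EXP's actual training: the `slam_flags` function, the trusted-domain list, and the pressure-word list are all hypothetical examples.

```python
# Minimal sketch of the SLAM idea: Sender, Links, Attachment, Message.
# The domain allow-list and keyword list below are hypothetical.
from urllib.parse import urlparse

TRUSTED_DOMAINS = {"exptechnical.com"}  # hypothetical allow-list
URGENCY_WORDS = {"urgent", "immediately", "gift card", "wire transfer"}

def slam_flags(sender: str, links: list[str], has_attachment: bool, message: str) -> list[str]:
    """Return a list of reasons to slow down before acting on an email."""
    flags = []
    # S -- Sender: is the address from a domain we actually know?
    sender_domain = sender.rsplit("@", 1)[-1].lower()
    if sender_domain not in TRUSTED_DOMAINS:
        flags.append(f"Sender domain not recognized: {sender_domain}")
    # L -- Links: does each link really lead to a trusted host?
    for url in links:
        host = (urlparse(url).hostname or "").lower()
        if not any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS):
            flags.append(f"Link points to unfamiliar host: {host}")
    # A -- Attachment: was an attachment expected at all?
    if has_attachment:
        flags.append("Unexpected attachment: verify out of band before opening")
    # M -- Message: urgency and pressure are classic hot-state triggers
    lowered = message.lower()
    for word in URGENCY_WORDS:
        if word in lowered:
            flags.append(f"Pressure language detected: {word!r}")
    return flags

flags = slam_flags(
    sender="ceo@paypa1-support.com",
    links=["http://login.paypa1-support.com/verify"],
    has_attachment=True,
    message="URGENT: buy gift cards immediately and reply with the codes.",
)
for f in flags:
    print("-", f)
```

A tool like this can only surface reasons to pause; the real defense is the human habit of slowing down and verifying through another channel.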

So, I know you have some information to share too, right? And this is kind of a clumsy segue, but I'm not sure if you're aware, but since the last time you presented, there's this thing called AI that's come out.

Dr. Erik Huffman: Yeah.

Continuing Research into Human Vulnerability and Cybersecurity: Can Humans Recognize Computer Generated Faces?

EXP: And I think you've been doing some research on that, correct? On people's ability to differentiate artificial-intelligence-generated images from real ones, things like that. So, I'll shut up; maybe you can speak to us on what your research has been and share some of the examples that you have too.

Dr. Erik Huffman: Yeah, so I've been conducting a significant amount of studies on AI in conjunction with social engineering, the tips, and the psychological effects of it. Because as we see, we hear deep fakes, we hear about vocal cloning.

And so, I was curious on how well does the human brain recognize human faces. Do we recognize human faces, and can we recognize artificially generated human faces? So, I've conducted two studies with nearing over a thousand people, and we're just trying to see if they can recognize if a face is real or artificially generated. And so, when we use artificially generated faces, we're using faces that a computer generated by itself. And so, we asked our AI-generated servers to create a human face. Sometimes we can ask, like, hey, create a male face, female face, an African-American face, a Caucasian face, something like that. But it's all faces that do not exist in the world.

And so what I have for us today, we're gonna run through 10 of these. I'll ask the audience, we won't do this long. I'll show a face and give you a couple of seconds to see if you can recognize if they're real or AI.

EXP: Let's use a poll too. I just launched a poll, so people should be able to answer with each one of these too. I'm going to try to do this as quickly as I can. So you should see a poll question: "Image one: real or computer-generated imagery?"

Dr. Erik Huffman: Okay, here we go. Is this face real or AI?

EXP: Alright, everybody vote and vote quickly because we're going to go through these pretty quick.

Dr. Erik Huffman: Alright, should I click to the next one? Do you have the answer for this?

EXP: I can end the poll and we can see the answer. Oh, that one's real. Okay, we were 52% to 48%—average. It's okay. Let me see if I can—this might get clumsy. I'll try to…

Dr. Erik Huffman: It's all good. Real or AI?

EXP: Let me—oh man, my polls. I'm not as fast on my fingers on the polls.

Dr. Erik Huffman: Ah, it's totally fine. We're having fun with it. It gives everyone a little bit more time, which isn't quite fair, but it's okay.

EXP: Let's see, go back. Oh, here we go.

So, we'll launch poll number two now. Real or CGI? Oh, we're pretty close here. I'm ending it now, so we'll share the results. The votes are 57 to 43, and it was real. That one was real. Let's do another. I'll launch it. This is kind of fun. Oh, I know my opinion on this one. Alright, that was AI. We were 61%—63% got that one. Nice. Alright, see, there they are. We're getting it. Not real, AI.

Here, this is fun. Everybody's clicking in. Look at that. The votes are in favor of real, 71%. Alright, that one—oh, that one got 'em. Alright, real or AI? I think that one's real. Alright, you ready for this? Sure. I'll share the results. You see that? Oh, wow.

Dr. Erik Huffman: That one got me. That was actually one of the best ones I've ever seen. Yeah, alright. Okay, I'm gonna click quickly.

EXP: Okay, we'll end the poll, we'll share the results. Ah, most people were wrong, 67% said CGI. We've just got a couple more of these, right? We're on number six. Yeah, we've got ten, so we're coming down to the nitty-gritty end of this. Yeah, this is fun though. Okay, last clicks. I'm closing it. Alright, and she's real. Oh, most people, 70% on that one, said computer-generated. Right, right.

Alright, show us the grand reveal. She's real. Alright, we got two more. We got this one and one more left. Okay, launch the poll.

EXP: I'm going to say out loud, I think that one's computer-generated. I'm gonna end the poll and share it. Sorry if you didn't get your votes in in time. Hey, I was right on that one.

Dr. Erik Huffman: Nice, yeah, you're right. And this is the very last one.

EXP: Okay, for all the marbles—well, there are no marbles. Well, this one's tough because the overexposure kind of adds an element of realism, doesn't it? I'm gonna vote real, but it could be a very clever fake. Most people are siding with me on real.

Dr. Erik Huffman: Good job. Alright, she's real. Alright, I'll stop my share.

EXP: So, what are the cybersecurity implications of all of this? I mean, it's a fun game to play, but I imagine this leads to deep fakes and facial recognition and other ways that attackers could use our gullibility against us, correct?

Dr. Erik Huffman: Exactly. So, you can play on a lot of the principles of influence. If you find the person attractive, it plays on the principle of liking. So individuals can use deep fakes, they can create fake accounts on social media to try to elicit different responses for business transactions, like create a fake LinkedIn account for a completely fake individual and come up with images of that person. And it's hard to detect that these are fake accounts on LinkedIn. What?

Oh, yeah, 100%. LinkedIn, Instagram, all of them. And so, what happens is that these individuals are playing on your principles of influence for malicious purposes; it could be business purposes and things like that. There is an AI-generated salesperson out there for a real organization, and that person makes $10,000 a month in sales, and they don't exist. Like, they don't exist at all.

And so, it's a unique way attackers can now target individuals. Before, they would use anonymity, the fact that you cannot see them and cannot hear them, as a way to mask themselves behind the environment while playing on those principles of influence. Because if you could see and hear the Nigerian prince, nobody's falling victim to that, right? Nobody's falling victim to anything of the sort. But now attackers can be selectively visible and selectively audible, in the way that they want you to see them or hear them, as they continue to conduct social engineering attempts.

It is a wild world out there. And know that if you got a bunch of them wrong, you're falling in line with the norm. You're human. Up to the minute, and I was just working on this data this morning, it's still 28%, almost rounding up to 29%, of people who get it correct. So, if you got most of them correct, congratulations, you're unique and you're doing very well. If you got a lot of them wrong, then you're just in line with the rest of us, and that includes cybersecurity professionals and everyone in the general population. We're only getting about 28 to 29% of AI-generated faces correct, 'cause AI's gotten good enough that the human brain just sees, like, hey, this is a face, that is a person.

So, where do we go from here? It is a scary sight because it really plays on the principles of influence when we're operating in this environment. Like, oh man, now it's scary. Now it's scary because social engineering is going to be very common.

Human Vulnerability and Cybersecurity: Conclusions

EXP: I take comfort from your last—I think it was your most recent—podcast with John Kindervag. I'm not sure if I'm saying his name right, but at the end of that, he said it is scary, but we're pretty smart and we've faced tough challenges before and we'll find a way to engineer our way out of them.

We've got just a couple of minutes remaining. We've had a couple of questions come in. I just want to let people know you can still submit questions if you have time—I mean, if you can type quickly—but otherwise, we're going to start wrapping up here in just a second. I don't see any questions jumping into the chat, so I did want to go back to you, Erik, in the last minute or so that we have remaining. Do you have any closing comments, ways to sum up, news that our audience can use, departing words you want to leave our audience with, or lessons you want to share before we wrap up?

Dr. Erik Huffman: Yeah, I would just like to say, first of all, thank you for having me on. And then lastly, I want to say that we are the difference between success and a data breach. I focus a lot on the human factor because I believe in people. I believe in us, and I believe that we can overcome this. It's showing how we're all vulnerable as people, yes, but also you're the biggest success or failure in your organization. You've blocked more phishing attacks than your spam filter could ever imagine, and so you're a valuable asset. It's just a level of awareness that we have to continually have to stop the 78% of data breaches. If we can cut that in half, we've changed the economics of cybercrime entirely, and I believe that is our method to success. And I believe in the strength of people, and we just got to understand that this is hard, this is unique, this is weird, but we're more than capable of overcoming this.

EXP: Oh, awesome. Thank you so much for joining us. And what you said just now speaks to the intent behind this series: we're trying to discuss cybersecurity regularly because we want to fine-tune that human firewall, and I'm borrowing a phrase you've used in the past. I really appreciate you taking the time to meet with us today, Dr. Erik Huffman. Thank you so much. And thank you to everyone in attendance for your good spirit in engaging with the polls and everything else. With that, I think we'll wrap up. My clock shows 1 PM. So, thank you, everyone, for attending. And once again, thank you, Dr. Erik Huffman.

Dr. Erik Huffman: Alright, thank you so much. I appreciate the opportunity.
