A Conversation with Annie Searle

EXP Technical invited Annie Searle, "The Risk Detective," to discuss emerging cyber threats in the post-corona era. The conversation was fascinating! Professor Searle connected lessons learned from the 2008 financial crisis to contemporary threats outlined in news articles published the day of our discussion.

We covered a wide range of topics including:

  • AI in Cyberattacks
  • Public/Private Partnerships in Disaster Recovery/Emergency Management
  • Cybersecurity for Small Business
  • Business Continuity Planning
  • The Importance of Community in Emergency Preparedness/Emergency Response
  • Risk Management for Small Business
  • Lessons Learned at Washington Mutual, Seattle Art Museum, and Elsewhere

Click on the video player to watch a recording of the event. A transcript appears below.

Emerging Cyber-Threats in the Post-Corona Era

Kelly Paletta: Welcome to our event, "Emerging Cyber-Threats in the 'Post Corona' Era." I'm delighted that we have an informative guest joining us today, Annie Searle. Annie, welcome to our event.

Annie Searle (AS): Thanks, Kelly.

Kelly Paletta (EXP): Thank you for being here.

For those of you who aren't familiar with these events, we try to avoid product pitches in our webinars. Instead, we focus on information that business leaders can use to run their businesses more securely and efficiently. For that reason, we look to executives, business leaders, and individuals from academia to speak to our audience. Annie checks many of those boxes, as she is currently the head of ASA Risk Consultants, a role she has held for many years. She is also an Associate Teaching Professor Emeritus of the University of Washington Information School, where she taught courses on cybersecurity, risk, and ethics. Prior to that, she was an executive at Washington Mutual, where she spent 10 years managing business technology continuity, disaster recovery, technology risk, and compliance.

(You're smiling.) I'm not done yet, but I'm close.

Prior to that, she had a career as an entrepreneur. For 15 years, she ran Delphi Computers and Peripherals, serving as the President and CEO of that organization. Did I miss anything?

AS: No, no, that's fine. It just starts to sound like an obituary if you go any further. I’m glad you stopped.

EXP: Well, it's nowhere near that. I'm sure we have some riveting topics to discuss today.

Before I jump in with questions, I want to mention to folks in attendance that this event is a little different from events we've hosted in the past. There's no PowerPoint today, and there's no continuous prepared presentation.

Instead, I have a lot of questions that I'm going to pose to our guest. Those of you in attendance are welcome to present questions as well. We won't put you on screen or unmute your mic, but you can pose them via the chat session or use the Q&A feature in Zoom.

I'll do my best to moderate those as we go through the event here today, during this hour.

But first, I'm going to jump in, Annie, with a kind of rudimentary question. I'll give you an easy pop quiz to start.

Threat/Vulnerability/Risk

In technology, terminology is often misused, with different words treated as if they mean the same thing. I thought it wouldn't hurt to start with basic definitions of words that we often use on the technology side and that sometimes get misused. From your perspective, could you speak to the definitions of, and differences between, a threat, a vulnerability, and a risk?

AS: A risk is the overall umbrella for the other two.

A threat is usually identified through analysis before something happens to your environment. It differs from a vulnerability in that it identifies something generally external to your environment.

A vulnerability is something found through testing, observation, or—knock on wood—by a regulator or auditor. It's a weakness, a problem in your infrastructure, whether it's in the cybersecurity area, on the mainframe, in Unix, or elsewhere. A vulnerability means that if someone were to get into your environment, they could take advantage of it and move more freely than they otherwise could.

EXP: I know that from the perspective of a company like EXP Technical, network administrators and IT managers often use the words "consequence" and "risk" interchangeably. When you're managing multiple environments and end users, risk is inevitable. A hard drive failure might be a once-in-a-lifetime event for an individual, but it happens all the time for IT support providers.

AS: That's right, and we have a lot more collected data now to tell us how important such a risk, like a hard drive failure, is compared to even 10 years ago.

EXP: It all falls under the umbrella of disaster.

AS: In my world, which is admittedly specific, a disaster involves chaos not only inside your business but often externally as well. It can relate to weather conditions or a terrorist event, a man-made event rather than a natural disaster.

And that word has a technical meaning within the vocabulary of emergency management. It's much like how you classify a hurricane (CAT 5, CAT 3, CAT 2), and when it becomes less consequential it drops to a "tropical storm."

Risk and consequences aren't necessarily the same thing. If you own your own business and you're at the top, one thing you'd want to ensure is that a broad cross-section of your leadership has sat around a table and identified, for example, the top 10 risks to the company. They should argue about how they should be prioritized and ranked.

Once something like that has been done, the next logical step is to create something called "a risk register," listing them out and assessing whether they have high or low probabilities and whether they are highly consequential or less consequential.

And that's when you're dealing inside the vocabulary of risk management and emergency management.
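[Note: As a rough illustration of the risk register Annie describes, here is a minimal sketch in Python. The 1-5 likelihood/impact scales, field names, and example risks are our own assumptions, not a formal standard.]

    from dataclasses import dataclass

    @dataclass
    class Risk:
        name: str
        likelihood: int   # 1 (rare) to 5 (almost certain)
        impact: int       # 1 (minor) to 5 (severe)
        owner: str        # who is accountable for monitoring and mitigation

        @property
        def score(self) -> int:
            return self.likelihood * self.impact

    register = [
        Risk("Ransomware on the file server", likelihood=3, impact=5, owner="IT"),
        Risk("Key supplier outage", likelihood=2, impact=4, owner="Operations"),
        Risk("Regional earthquake", likelihood=1, impact=5, owner="Facilities"),
    ]

    # Rank the register so leadership argues about the highest-scoring items first.
    for risk in sorted(register, key=lambda r: r.score, reverse=True):
        print(f"{risk.score:>2}  {risk.name}  (owner: {risk.owner})")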

A Backstage Pass to Pivotal Events in 21st Century History

EXP: Now, if we tie those back to experiences you personally had, you had a backstage pass to, I'd say, the three most significant events that shaped the direction of history in the first part of this century. I'm referring to the September 11th terrorist attack, the 2008 banking crisis, and more recently, the pandemic. I'm curious, what did you learn from those events? How did they change your evaluation of risk? What lessons can you share with our audience from your unique perspective and insight into those events?

AS: Well, I'll start with the last one first, the pandemic. When I started at Washington Mutual, one of the things I was assigned to was a task force that was national and sometimes international in focus. We were looking at the possible catastrophic impacts of what was swine flu at that time. I did a lot of writing for the financial sector, usually in British journals, on how ready the financial sector in the United States was to handle a pandemic.

We knew then a lot of the things that proved true just a few years ago here in this country. Things like: up to 40% of your workforce can be gone unexpectedly, either because they're sick, taking care of someone, or due to some other circumstance.

Back in those days (this was around 2006-2007), we really weren't set up for remote work. That's a difference between the two.

Nor did we…

In all of the work we did inside work groups at Washington Mutual, different business units with senior vice presidents got together, sat down, and had to answer three questions:

  1. which of our business processes do we have to maintain,
  2. which ones can we do less frequently than we do right now, and
  3. which ones can we suspend for some indeterminate period of time?

That gave us a kind of master plan or the foundation for a master plan that we created.

So, we actually had a business continuity plan for a pandemic that sat right next to our regular plans, called "all-hazard plans," that would handle events like wildfires or hurricanes.

What I learned was very useful to my boss at the University of Washington, the dean of the iSchool (the University of Washington Information School), Anind Dey.

Because of work groups that I still sit on, I received alerts, bulletins, and material not widely available to the public. I was able to pass those along and answer questions at times that he might have found it hard to get answers to otherwise.

As for 9/11, that's an entirely different matter, and I think the most important reason we should remember 9/11 is that it could so easily happen again despite everything we've done.

I learned from that and that was a hard lesson for me.

At the time, I had about 200 people in my group, and the leadership of the bank went into a room across the street with a TV set, and they never came out during the day. That left all the managers hanging, including the branch bank managers, who at least had an advantage in being able to make decisions about closing or opening a branch.

But I had trouble knowing whether I could send my people home or not, so I finally went over, knocked on the door, and asked my boss. What that got me was being assigned to the crisis management team for the bank for a couple of years, until ultimately I was asked to chair the crisis management team and remake it into a much smaller, tightly knit group of senior vice presidents, all of whom had the authority to take actions on behalf of the bank that would cost the bank money without getting further authorization.

Regarding what I learned from these events…

I learned just how troubling it is to be bound by the Securities and Exchange Commission as a publicly traded company. That meant the CEO and the executive vice president couldn't tell their people what was going on unless it had been announced to shareholders and investors.

I actually had the CEO of the bank sort of trapped in an elevator one day and I said, “Y’know Kerry, if you could just talk to people about how you see us coming through it…” and he looked at me and said, “I can't say anything that I haven't already communicated to the SEC or investors.”

That was a hard lesson to learn. Like many senior vice presidents, I hold myself accountable for not having seen the issues earlier and beaten the drum.

I was told consistently that it was a little rocky but things were under control or that conditions were improving—as opposed to what I was reading in the paper or seeing on CNN at night.

For people in senior positions in corporations, I would urge them to always be super attentive to the jungle drums, or the media, who often know things before senior management does.

Applying Those Lessons to Small Business

EXP: Now, tying it back to smaller businesses…

AS: Smaller and medium-sized businesses may not be publicly traded, so some aspects of regulation may not apply, but one thing applies regardless of size: if you want a smooth-operating team and you've established certain values that you cherish as the CEO, you need to exemplify those values, particularly in times of financial uncertainty.

I think, for small and medium businesses, this is the rockiest part of it all – figuring out at what point it's appropriate to invest more based on recent growth, say, in the last six to 12 months. At what point is it appropriate to just hold steady? At what point do you hire more salespeople? These are all key questions.

Budgeting for Risk

EXP: Right! “To what extent do you invest in mitigating risk?” This is a challenging question to answer.

AS: It's really hard! Especially since we don't read about all of the attacks. Many attacks that target small and medium-sized businesses never make it into the paper.

EXP: Yeah. The majority!

AS: Part of the reason why that can happen is that there is a form of either willful ignorance or complacency.

Business owners sometimes say, "That'll never happen here. We’re not a hospital. No one is ever gonna try ransomware on us.”

EXP: Oh yeah, security by obscurity. "We're too small."

AS: Yeah. "We're too small. No one is interested in us."

And yet, there's a new report on cybercrime that shows this is exactly where attackers are focusing their attacks.

News Reports about Cyber Attacks Can Be Misleading

EXP: I recently posted something on LinkedIn about this issue. Companies like EXP that support small businesses are, in a way, doing their clients a disservice when they only talk about the MGM and Caesars Palace breaches, because it leads to the complacency you're talking about.

People think it's only the big targets being attacked. The majority of attacks are on small and medium-sized businesses, and they are after individuals. They don't care what your data is worth; they care what it's worth to you. They want to hold it for ransom or disrupt your business.

AS: Easy money!

EXP: And less hardened security…fewer layers of control over sensitive data.

[Note: EXP Technical offers FREE Security Awareness Training at EXP Academy.]

Emerging Threats in the Post-Corona Era

I'd like to transition to the next big event and a chilling sentence from your most recent blog post. You ended a blog post about September 11th with the sentence, "Warnings of high risk often go unheeded."

That may be a segue into what we're now entering. I've dubbed this event "post-Corona," but I feel like we're on the threshold of "the artificial intelligence era."

About a year and a half ago, we had Midjourney and Dall-E, and even more recently, ChatGPT became available.

If you go to OpenAI's site and read their blog about safety, they discuss “the alignment problem,” the concern that artificial intelligence isn't always aligned with human intentions.

Maybe I'm overplaying this, but you can see where I'm headed. What are your thoughts about where we're headed with respect to the dawn of artificial intelligence?

AS: Well, I think there are some remarkable things that have been achieved with AI tools.

I think what you are talking about is generative AI.

The concerns primarily revolve around privacy and security issues related to the data these large language models consume.

A large language model only gets better if people are willing to let their data get sucked into it.

But there are challenges, and I want to share a recent observation from Gail Coury of F5, who wrote a piece on CISOs and AI for TechRadar (AI is here: how should CISOs respond? | TechRadar). She's talking about three different kinds of challenges, but let me just read the third one to you.

“The third major challenge is that AI-powered cyberattack software could try many possible approaches, learn from how we respond to each, and quickly adjust its tactics to devise an optimal strategy—all at a speed much faster than any human attacker. We have seen new sophisticated phishing attacks that are utilizing AI, including impersonating individuals both in writing and in speech. For example, an AI tool called PassGAN, short for Password Generative Adversarial Network, has been found to crack passwords faster and more efficiently than traditional methods.”

So I think I'm a lot more worried about that than I am about losing control of the tool. At least as we understand things right now, and I have no reason to believe this won't continue to be the case, there's always human oversight of the model.

And there's tweaking of the model to try to establish a lack of bias.

They may make the sample size larger. They may choose different audiences to include in the sample size.

We've seen all of the examples that are truly terrible, ranging from using AI to decide what a prison sentence should be to deciding when a prisoner should be let out for time served.

We've seen how, depending on the model that was used to capture the data, it really impacts people of color significantly differently than white people.

We've seen tools like Clearview, which police departments have opted into, where it's not only a database of photos taken if you're arrested for some reason or another, but one that also involves actual scraping from social media sites.

And there the problems are significant.

DHS put out an RFI a couple of years ago—for a "vetting enterprise project," they called it—to monitor immigrants. How do you like that?! And that uses AI tools as well.

And then we have examples from the past, like Google's Project Nightingale, which has been stopped now, but which amassed large amounts of medical data, including PHI and complete health histories.

So you could say with all of these…

…or the airlines using facial recognition to let you get on a plane now. American, Delta, a number of airlines use that.

With all of those, you could say that in the absence of any kind of guidance, which usually means "regulation," businesses have had to choose how they're going to do things. If they're an international company selling abroad, they would be constrained somewhat by GDPR, but not by any American regulation at this point.

Is Commercial Data Protected in Artificial Intelligence Chat Sessions?

EXP: Which circles back to your first point about data protection and the confidence that we have in the folks we're entrusting our data with.

I'm working on a blog post on this right now.

If you go to ChatGPT you have the free version.

Then there's, I think, the "professional" version… [Note: It's called "Plus," not "Professional."]

It's only when you get to the paid Enterprise version where they explicitly state that the model isn't training itself on the content that you feed it.

And even at that it's their assurance.

I don't know if there's any regulatory oversight as to whether that's a true statement or not (whether or not it’s true that the model is not training itself on your data).

So I think where I'm headed with this (and it sounds like you agree, or you can reiterate) is that conversations that you have with ChatGPT are not necessarily private.

And in fact, if you're talking about commercial information, intellectual property, trade secrets… they may eventually work their way into the language model, because it's effectively a predictive text model, figuring out how words work together.

AS: Exactly!

EXP: So beyond the "Jurassic Park" question or "The Matrix," or even the weaponization of AI in its current state… There is this danger of…

People need to be cognizant of how they're using it and ways in which they may be exposed that they didn't realize. Would you agree with that?

AS: Yes I agree.

I have only a short GeekWire piece to go by that came out yesterday or today (Amazon unveils new AI features in quest to make Alexa 'superhuman' – GeekWire), about Amazon's new model of Alexa, which allegedly has "superpowers," as they're calling it.

I'm going to be looking at that really closely even though I don't have any such device in my house.

EXP: Nor do I!

AS: I'm going to be looking closely at whether additional features really to boost that whole section of Amazon. I mean that division was cut significantly earlier and within the last year. Now they're hiring back up again.

It's not clear to me if all the safeguards are in place or not. That's something I want to look at really closely.

Fostering Innovation within a Culture that Values Cybersecurity

EXP: Now, this brings up another question that I had in mind. You were the executive sponsor of tech innovation at Washington Mutual.

AS: I was [executive sponsor] for tech innovation within the Technology Group.

It's a technology support group.

At Washington Mutual, any employee could propose an innovation, and it didn't have to be something entirely new; it could also be a significant improvement on an existing process or tool.

EXP: How can businesses encourage a culture of innovation while still managing cybersecurity and other risks? At times they can be competing forces.

AS: To foster innovation while managing risks, you want your employees to feel comfortable thinking outside the box. They're not just supporting the management of a function; you're constantly encouraging them to look for ways the business process could be improved.

At Washington Mutual, we awarded a $500 prize monthly for innovative suggestions.

We never awarded the cash to anyone if we were not able to implement their recommendation; that was crucial.

It had to be practical, not some pie-in-the-sky idea that would take five years. It had to be implementable.

EXP: You found that to be a good way to keep people involved?

AS: On their toes! Yes! And to feel included!

EXP: That's a significant part of it because it contributes to the culture from the employee's perspective, and morale, and feeling like they are part of something meaningful.

This goes back to the culture of cybersecurity, which you need to instill in individuals.

However, it's a challenge because many individuals perceive cybersecurity as a constraint or an inconvenience. MFA takes time, and these security measures feel to the employee like we're tying their shoelaces together.

Do you have any advice for the business leaders in attendance on how they can promote a culture that values cybersecurity?

AS: I believe the most useful technique, both in the security area and business continuity, was making parallel recommendations between home and work.

For example, a company should designate areas where employees gather outside the building during an emergency, ensuring everyone's safety. Or you should establish a phone tree that includes people outside your physical area, so you can let someone outside a disaster area know you're okay during a disaster.

We found that helping employees create a family plan that parallels what the company is doing was also effective. This made them understand both the risks and the importance of managing them. It showed them the bigger picture at the corporate level.

EXP: Because it makes them the risk manager of the individuals that they love.

AS: Of themselves!

That's what my book is about.

[Advice From A Risk Detective: At Work, At Home, Online And On The Road: Searle, Ms. Annie: 9780983934707: Amazon.com: Books]

How you manage your personal risk starts at home, and then it extends to other areas like school, work, online, and while traveling.

Situational awareness is at the heart of every one of those recommendations.

EXP: Speaking of situational awareness, it reminds me of Gavin de Becker's books, particularly "The Gift of Fear." [The Gift of Fear: de Becker, Gavin: 9780316235778: Amazon.com: Books]

It’s about situational awareness. It talks about listening to that little voice in your head that warns you when something doesn't feel right.

But now I'm gonna throw a big curveball at you. I'm gonna draw on something else that was really influential.

A year ago, we had Dr. Erik Huffman present to our audience.

He presented on “The Psychology of Cybersecurity.”

One of the things that he said was, “We're living in a world that we're not built for.”

What he means is that we have six million years of evolution that teaches us to jump back when we see a snake. We have a hundred years of experience that tells us that a plane is safe. We have a few decades that tell us how to engage online and with the computer, and [just] weeks, months, not even a year in how to interact with artificial intelligence.

So can you speak to that: how perception confuses us or may lead us down the wrong path? How can we navigate this new and unfamiliar world that we're all living in now?

AS: The biggest challenge I had after 9/11 was related to that.

It was getting employees to follow the advice to shelter in place during a range of emergency situations.

The perception was that they had to get out of the building, running into the street, which is more appropriate for natural disasters like earthquakes in South America.

It’s still the case to some extent.

People think they’ve got to get out because they've seen movies where everyone rushes out of buildings.

What they don't understand is that most people in those buildings on 9/11 did evacuate successfully. There were some 20,000 people in the towers. 3,000 people died, which is not insignificant (that's not what I am saying), but the majority evacuated safely.

Many had been well trained by emergency managers on how to move down from the 42nd story to the street.

In the context of cybersecurity and information and communication technology (ICT), we often perceive technology as a tool rather than a source of threat.

It just seems like it's a device that we're dealing with. All of the processes that we see in operating a computer every day seem well thought out. We've been, in fact, trained on how to use programs—depending on how custom they are. (It depends on the business you're in, I'm sure.)

But we don't tend to see technology as a source of threat to us.

We tend to think of the “threats” as external. I mean really external.

There's a gap between perception and reality. There's a gap between how significant a risk is and what we think it's worth on the risk scale.

Often we've overestimated the amount of risk.

The common examples are a doctor telling you it's no more likely that you would die from taking this medicine than that you would be hit on I-5 at 1 pm on a Thursday afternoon by another car…that kind of thing.

EXP: That implied authority, or the familiarity of the trust that we place in software…

 It used to be that people believed what they read in print. More recently, people have become more skeptical, but they have a high level of trust in software. One thing I'd like to bring up, which is very relevant this week, is the significant increase in phishing attacks that include a QR code.

This was just brought to my attention this week, and it's effective because people tend to trust QR codes, assuming they must be legitimate. They look sophisticated, but anyone can generate a QR code leading anywhere. People know to be suspicious of links, but they might be more trusting of a QR code.

They pull out their phones, go to it, thinking they're resetting multi-factor authentication, when in reality, someone is harvesting all their password information, compromising their business email, and causing all the resulting consequences.

This ties into what you were saying earlier about the level of trust in this tool, giving it some level of authority due to our long history of interaction and trust.

AS: And we have high expectations that it won't fail us, which explains why people get so upset when their hard drive fails.
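[Note: To underline Kelly's point about QR codes above, here is a minimal sketch showing how trivially anyone can generate one that points anywhere. It assumes the third-party Python package "qrcode" (pip install "qrcode[pil]"); the URL and filename are made up for illustration.]

    import qrcode

    # Nothing in the printed square reveals where it leads; the same two lines
    # could encode a credential-harvesting page just as easily as a real one.
    img = qrcode.make("https://example.com/reset-your-mfa")
    img.save("mfa-reset.png")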

Cost/Benefit/Risk Decisions

EXP: So, let's jump into that: the solution for many technology problems, which is backup and disaster recovery. When discussing these topics with business leaders, we talk about how long they can afford to be down and how far back in time they need to go, also known as the Recovery Time Objective (RTO) and the Recovery Point Objective (RPO).

These are easy topics for tech people to discuss, but the challenge arises when you realize that the closer you get to zero seconds of downtime, the more expensive it becomes.

Then, we face the challenge of quantifying the risk and calculating the cost of downtime and the likelihood of such an event. This is a crucial aspect of disaster recovery planning, especially for small and medium-sized businesses.
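[Note: One back-of-the-envelope way to frame that trade-off is sketched below. Every number here (downtime cost per hour, outage frequency, option pricing) is an illustrative assumption, not a benchmark.]

    # Expected yearly downtime cost for a given Recovery Time Objective (RTO),
    # compared against the yearly cost of the recovery solution itself.
    COST_PER_HOUR_DOWN = 2_000   # lost revenue plus idle payroll, in dollars
    OUTAGES_PER_YEAR = 1.5       # expected significant outages per year

    def expected_annual_loss(rto_hours: float) -> float:
        return OUTAGES_PER_YEAR * rto_hours * COST_PER_HOUR_DOWN

    options = {
        "Offsite restore (RTO ~72h)":        (72, 3_000),
        "Local image backup (RTO ~8h)":      (8, 12_000),
        "Hot standby/replication (RTO ~1h)": (1, 40_000),
    }

    for name, (rto_hours, solution_cost) in options.items():
        total = expected_annual_loss(rto_hours) + solution_cost
        print(f"{name}: downtime risk ${expected_annual_loss(rto_hours):,.0f} "
              f"+ solution ${solution_cost:,} = ${total:,.0f}/year")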

AS: I had it easy at WaMu, because if a senior vice president who owned a business platform or piece of software wanted an immediate response time of less than eight hours, they'd find out how much that would cost, and then they would say, "Well, maybe it's not so important after all."

And then they’d have to figure out how to back up.

If that's what you are going to say, we're going to send you to the chief risk officer to sign off on a document that says you're accepting the risk of 72 hours, or 48, or whatever they choose, even though, in the estimation of our office, we consider your application or platform to be high risk and high value to the company.

I don't know how often you can do that at a lower level.

Since you're a sales guy I'll say this: sometimes the sales guys from outside talk an owner of a business process into a piece of software. They're describing how “immediate” the recovery will be but what they’ve failed to do is line that out as a cash item in the sale. [The cost of recovery.]

So we're left picking up the pieces after the fact saying, “Well, you have to pay for recovery!”

EXP: Meaning like the uploading of data. There are costs associated with that. The recovery itself has a cost associated with it.

AS: When I went to the bank, things were pretty uneven. The bank had just started to buy some other banks in the retail banking area. I had to inventory the infrastructure; that's the first job as a technology architect. It was all in people's memory. It wasn't documented anywhere else, and there was no order of recovery established.

If the mainframe and everything else went down… No order of recovery.

What comes up first? That's another kind of problem that's not exactly the same because you're looking at an enterprise level. What do you need first?

And it's not just technical issues. It has to do with where the bank's got its money, where its revenue sources are, and a whole lot of other questions as well.

Public/Private Partnerships and Community

EXP: So what I'm inferring is: for a small business, there may be tremendous value in tabletop exercises.

AS: Absolutely!

EXP: Where all this comes to light.

AS: Yeah.

I'm working on what will be a talk but also a white paper on the digital trust landscape.

One of the things I'm going to push hard on is the need for more cooperation between our federal partners like CISA.

We have a Region X Infrastructure Security Group that meets maybe once a month.

You can get on the call. You can take one of your people, even if you're a 10 person firm, you can tell one person they need to be on the call. They need to just keep watching bulletins and other documents sent out that may point them to guidance they didn't know about, or cooperative training efforts or other things that the federal government is offering.

It's a big thing.

EXP: Let's make a point to follow up with that.  

I want to get links from you that I can share with our audience. These are things that I'm not as aware of. I am aware, at a higher level, that we are seeing a lot more cooperation between the public and private sectors. A recent release from the Department of Defense mentions CMMC and other regulations that involve public and private cooperation. The White House, too, in its strategy statement that came out in March, referred to a greater partnership between the public and private sectors.

So it sounds like you see that as an encouraging trend.

AS: Oh absolutely!

And that would include: before something happens, it would really be nice to have met your local FEMA private-sector representative, even if only on a call, right?

So you had that contact information handy if you ever needed it.

There are actual work groups that people can belong to.

I think that's really important.

And what goes along with that is scenario testing.

Whether it's just a test of a wildfire taking down your remote location, or a cluster bomb hitting, or how you're going to get your people out of a downtown core during an earthquake, there are lots of scenarios.

You can include just your own people, but what would be important to do (and Kelly, I don't know how you're going to feel about this) is to include your vendors as well.

EXP: Oh yeah! Of course!

AS: Especially if your vendor is responsible for bringing you back up. You need to include your vendor.

Then you look at what else you’re dependent on that belongs say to the city or to the state. Try to get representatives in.

If we were doing a dirty bomb downtown, we would have the fire department in there. We'd have the police department in there. We would have the state Emergency Management people along with Seattle's great emergency management office.

Something like that takes a lot of time. We did it once a month. You could do it quarterly. You could do a quarterly test, engage your public partners at the same time, and have a lot of built-in knowledge and ability in case something does happen.

EXP: I will plug one other test that's very small scale and highly informative. There are folks on this call that are not clients of EXP Technical. Especially for those folks: a test restore from your backups (just retrieving a random file) is one of the most valuable and informative checks you can do on your backup and disaster recovery mechanism.

AS: And humbling!

EXP: Yes! Eye-opening!

AS: That's right!

EXP: Because you're monitoring this process, it indicates that it completed successfully. You have all this faith in it, but then you go to retrieve data and you find out that all those files are corrupt. It's all gibberish.

AS:  That's right!

EXP: That comes out when you try to retrieve a random email message and a random file. Do that on at least a monthly basis.

The plug is: EXP Technical can do that for you, but even if you have internal IT resources, it should be a KPI in your meetings too: "How did we perform on our last test restore?" It's a quick test and a really informative spot check.
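[Note: For readers who want to automate the spot check Kelly describes, here is a minimal sketch. The paths, and the idea of comparing the restored copy against the live copy by hash, are assumptions to adapt to your own backup tooling.]

    import hashlib
    import pathlib
    import random

    def sha256(path: pathlib.Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    live_root = pathlib.Path("/data/shares")          # production file share
    restore_root = pathlib.Path("/mnt/restore-test")  # where the test restore landed

    # Pick one random file and verify it came back intact.
    candidates = [p for p in live_root.rglob("*") if p.is_file()]
    sample = random.choice(candidates)
    restored = restore_root / sample.relative_to(live_root)

    if not restored.exists():
        print(f"FAIL: {sample} is missing from the restore")
    elif sha256(sample) != sha256(restored):
        print(f"FAIL: {sample} was restored but the contents differ")
    else:
        print(f"OK: {sample} restored and verified")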

In your book you said that your mother was part of a community club. Can you tell me more about that?

AS: Sure!

That was a small town of 1,100 people when I was growing up.

My memories of it may be faulty in some ways but I think they met maybe once a month and that sometimes they had speakers and then sometimes they were working on things.

We have, in the neighborhood where I live now, a 300-house disaster preparedness area we've organized on a blog.

We've also broken the neighborhood up into zones. We know all the doctors, nurses, and EMTs who might live in the area. We have a good idea which homes might have supplies like chainsaws if we need them.

We have daycare and senior-care centers, which are houses within the area. There are five zones, and so we've done that.

We've also pre-purchased, through donations, a range of emergency supplies.

EXP: It's fascinating for me and for us at EXP Technical because community is one of our highest values at EXP. This event is a little bit more interactive than some of the events that we've had in the past.

We have ideas of establishing that sort of community.

Maybe not for emergency preparedness, but more of a "tech club" where we can talk about how to recover technology, or what you're doing to enhance operational efficiency in your organization.

Future EXP events may be a little bit more interactive.

I believe that we can learn a lot from each other when we share knowledge and share success. There's a lot that can be gained that we might not be able to discover on our own.

Share Knowledge, Share Success

EXP: You came to teaching later in life. Were you always a teacher?

AS: No.... Oh Yes! I was!

Anyone who's known me for a long time said, “You were always teaching!”

When I was at the Seattle Art Museum, I took interns from four different universities into the office of public affairs.

At Delphi, I set up relationships with North Seattle Community College, because of their tech program, and a couple of other places as well, like Lake Washington Voc/Tech, and had interns from those places too.

My husband is a professor. My son is a professor. I never thought I would teach.

And yet, when I finished up at WaMu and was setting up my company, two people from the faculty at the I-School met with me and said, "You should teach, because you have all this experience."

They knew me from a relationship I had with the I-School, as part of a corporate sponsorship of some of its programs.

And nothing I ever did in my career has been as rewarding as teaching.

I love working with people. I love being a perpetual student myself, so there's always something outside my comfort zone that I'm going to try to learn.

I'm not backing away from that even in retirement from teaching now. I say that you always get back more than you give when you teach, and that's really true. I like to think of myself as having spent all those years training the next generation of risk managers.

Several of my students, who are now corporate employees, are here in this call today.

I can almost hear them laughing.

I almost insist upon being challenged in class.

I expect to have people poke at me and say “why do you say that? I don't agree.” I encourage disagreement.

I encourage people to take a perspective that they're not comfortable with.

It's not their natural perspective.

Try to look at the problem from someone else's point of view. There are lots of exercises like that.

But mostly what I learned from them is about their situations and I'm able to help some of them as well.

I'm terribly proud of the work, because I have published outstanding student papers as research notes through ASA. They then have published material that's on their résumé, and that often helps them get a job as well.

Audience Q&A

EXP: We're in our final minutes here, we did receive a question through the Q&A, and while we touched on it a bit, you might want to elaborate. The question is, "How do you see AI being used against small businesses in cyberattacks?"

AS: I think it will just keep learning from the way the designers of that software are working. They'll just get smarter about how to approach you the next time.

EXP: It’s like the raptors in Jurassic Park – they keep testing the fences.

AS: That's exactly right.

EXP: I think I have time for one more question, and we do have one from the chat in the Q&A: "What role should larger companies play in supporting or requiring smaller suppliers or third parties to comply with security controls?"

AS: I'm not familiar with a situation in which that's not the case. I managed vendor security for Washington Mutual, and we had a separate attestation form, aside from the contract, with clauses written into the contract regarding business continuity, confidentiality, data usage, and all the expected aspects of a third-party relationship.

If you want to grow your business, you should be prepared to adhere to established frameworks and standards so that larger companies will consider hiring you.

EXP: We (at EXP) see this from various perspectives – companies like Microsoft have these requirements for their vendors. Government programs like CMMC encourage compliance down the supply chain as well.

Also, one upcoming topic for discussion is cyber insurance, which often requires attestation or validation of specific controls.

I think we've covered those questions, and we're running short on time. I'd like to thank you; this was both fun and informative. I learned a lot, and I hope the attendees did as well. Any final comments you'd like to share before we conclude?

AS: Just that I think that small businesses are the backbone of this country, and founding and growing a company can be one of the most exciting things you can do.

I took a lot of delight in doing that, and I was rewarded with a lot of great customers.

Our approach was focused on solving problems for our clients. That can lead to progressively larger contracts, but it's essential to decide how large you want to grow and what you need to get to that point.

I thank you for the opportunity to come back and discuss small and medium-sized businesses rather than just big corporations.

EXP: Where can people find you?

AS: If anyone wants to reach out, my website is www.anniesearle.com. It includes a blog. My email address is annie@anniesearle.com. You’re welcome to contact me if you're interested in the links I mentioned. I'll try to compile them into one document for Kelly to share, or you can reach out to me directly. If you're one of Kelly's clients, I'm sure he'll handle the heavy lifting.

EXP: Thank you so much, Annie. This was an insightful and enjoyable discussion. I'll ensure the audience has access to the recording, and we'll follow up via email. Thanks again!

AS: Thank you.

References

Here is a list of links to articles, web sites, and presentations that were mentioned during this conversation:

Gail Coury, "AI Is Here: How Should CISOs Respond?" TechRadar, September 19, 2023. https://www.techradar.com/pro/ai-is-here-how-should-cisos-respond

MLA-CCCC Joint Task Force on Writing and AI, "Working Paper," July 2023. https://hcommons.org/app/uploads/sites/1003160/2023/07/MLA-CCCC-Joint-Task-Force-on-Writing-and-AI-Working-Paper-1.pdf

AI Now Institute, "AI in 2019: A Year in Review -- The Growing Pushback Against Harmful AI." https://medium.com/@AINowInstitute/ai-in-2019-a-year-in-review-c1eba5107127

David Bollier, "Artificial Intelligence and the Good Society," Third Annual Aspen Institute Roundtable on Artificial Intelligence. https://www.aspeninstitute.org/programs/communications-and-society-program/roundtable-artificial-intelligence/

Luciano Floridi and Josh Cowls, "A Unified Framework of Five Principles for AI in Society," Harvard Data Science Review. https://hdsr.mitpress.mit.edu/pub/l0jsh9d1

The Institute for Ethical AI & Machine Learning, "The Eight Machine Learning Principles." https://ethical.institute/principles.html

Annie Searle, "Ethics, Risk and Artificial Intelligence," ASA News & Notes, November 11, 2019.

Luciano Floridi et al., "AI4People—An Ethical Framework for a Good AI Society: Opportunities, Risks, Principles, and Recommendations." https://link.springer.com/article/10.1007/s11023-018-9482-5

Erik Huffman, "The Psychology of Cybersecurity," EXP Technical webinar, August 3, 2022, Noon Pacific (exptechnical.com).

EXP Academy: https://academy.exptechnical.com/
