Cybersecurity Burnout, Deception Tech, and National Security with Cynthia Brumfield
Show Notes
In this episode of Cyber Focus, host Frank Cilluffo speaks with Cynthia Brumfield, a prolific cybersecurity journalist and analyst. Brumfield discusses her reporting on the human toll of cybersecurity incidents, including mental health challenges and burnout among cyber professionals. She also explores the evolving role of deception technology in cyber defense and highlights key cybersecurity provisions in the latest National Defense Authorization Act (NDAA). The conversation covers the growing threats posed by foreign adversaries, including China, and the importance of resilience in cybersecurity operations.
Main Topics Covered:
- The mental health impact of cybersecurity incidents and the need for better support systems
- Deception technology and its role in cyber defense beyond traditional honeypots
- Cybersecurity funding and policy changes in the NDAA, including a $30 billion investment in military cyber operations
- The rise of ransomware and its classification as a national security threat
- The establishment of the NSA’s AI Security Center and its implications for national security
- Supply chain security concerns, including Chinese technology risks in ports and telecommunications
Key Quotes:
“I don't think I realized until I wrote it and having talked to all the folks who have gone through this... I don't think I realized how traumatic it is to be in the middle of a cybersecurity incident. In fact, it's very much like any other emergency situation.” – Cynthia Brumfield
"You need to lay the baseline of an appropriate emotional and psychological response to these incidents before they occur, so that you don't have the burnout, that you don't have the PTSD.” – Cynthia Brumfield
"[Deception technology] is basically this term of coming up with a very broad strategic goal of tricking the enemy and getting them lured into dead ends on your network.” – Cynthia Brumfield
“I think [the Cyber Force discussion] has legs this time... There's some momentum on this. I'm getting asked more and more and more questions, including from skeptics.” – Frank Cilluffo
“It’s important when we're talking about Chinese supply chain threats and espionage threats to sort of separate the wheat from the chaff. There are some serious concerns... but we have to have a much more sophisticated grasp on what are the true threats and what are not really true.” – Cynthia Brumfield
Relevant Links and Resources:
Managing the emotional toll cybersecurity incidents can take on your team
Increasing the response level to ransomware
Guest Bio:
Cynthia Brumfield is a leading cybersecurity journalist and analyst, writing for publications such as CSO Online. She runs Metacurity.com, a cybersecurity news site, and has been covering the field for over a decade. Her work focuses on cyber policy, national security, and emerging threats, with an emphasis on making complex issues accessible to a broad audience.
Transcript
Welcome to Cyber Focus from the McCrary Institute, where we explore the people and ideas shaping and defending our digital world. I'm your host, Frank Cilluffo, and I have the privilege this week to sit down with Cynthia Brumfield. Cynthia is a prolific writer on all things cyber. She runs a major cyber news site at Metacurity, which I recommend everyone subscribe to. And I think most importantly, she goes in depth on a number of stories that don't get the attention they deserve in our ADD world in cybersecurity. So really excited to sit down with Cynthia today. And Cynthia, thank you for joining us. Thank you, Frank. I'm very honored that you invited me. Well, we're privileged to have you. And I thought I'd start with a very recent article you wrote looking at the human cost of cybersecurity: the mental health issues and burnout, themes that I don't think get as much attention as they probably deserve. But before we jump into the article, I'd be curious, what led you to write it? Have you been hearing from a lot of friends and colleagues, or what was that initial thought to be able to go deep? So initially, and
it's interesting that you were asking this question, because initially I had proposed writing a piece on something else. This is for a publication called CSO Online, which is aimed at CISOs who seek practical information on how to do their jobs. I had proposed an idea on how CISOs can talk to the board about the cost of cyber incidents, and my editor suggested, well, can you throw in there the human toll as well. And when I started talking to various people about that, I realized it needed its own piece. I wasn't going to do it justice with a paragraph in a piece on how to talk to the board about the cost of incidents. The more I dug into it, the more compelling the whole subject was. The piece that was published was a very long one for CSO, and generally for any publication: it was 2,000 words. And even at that I had to leave a lot on the cutting room floor, because it's just such an in-depth topic and a compelling one. I don't think I realized until I wrote it, having talked to all the folks who have gone through this and have experience in this, the psychologist who developed a program around it, the cyber professional who's trying to get organizations to pay attention to this issue, how traumatic it is to be in the middle of a cybersecurity incident. In fact, it's very much like any other emergency situation. I use the example in my piece of police departments and fire departments. And indeed, the program developed by the psychologist I cite in my piece, to help deal with the mental health issues surrounding cybersecurity professionals, particularly during an incident, he adapted from a PTSD program he had developed for the US Army. So I don't know that I realized, having written about cybersecurity for 10 years now and having spoken to many cybersecurity professionals, just how serious the mental health impact is, particularly if you're dealing with a very severe incident. True. And I would imagine,
and I'd be curious, since you did tie it to the C-suite. By and large, I often say security is always too much until the day it's not enough. And the time most executives are paying attention to these missions is obviously after something bad has happened. Are you starting to see some companies being maybe a little more focused on some of these issues? And if not, what do you think we should be discussing, thinking, and doing? So it's interesting, because
as I note in the piece, I spoke to several experts, one of whom you may know, Joe Sullivan, whose last job was as chief security officer at Uber. That created quite a bit of personal stress for him, and he has talked with his fellow CISOs about how they can survive these incidents. The sense I walked away from this piece with is that organizations are not doing enough. One of the things that Sullivan and other folks I spoke to stressed with me is that you need to lay the baseline of an appropriate emotional and psychological response to these incidents before they occur, so that you don't have the burnout, that you don't have the PTSD, so that people are looking at them as sort of normal. As Joe mentions, you think of yourself as a fire department: firefighters go into crisis situations, but they have the training, they have the psychological mindset, they have the social support of their peers and their superiors to talk about these issues, and that makes it a lot easier for them to handle these kinds of crises. The message that came through from my conversations is that we need to do that too in cybersecurity. We need to set up particular programs that enable those on the front line, particularly in the security operations centers where the heat is the highest, to deal with this a lot better, to talk about it, and otherwise to manage it better. What I'm hearing is that there's not enough of this going on in corporate America, and that the obstacle right now is trying to inform human resources departments that cybersecurity workers are different. These are not your regular stress-reduction mindfulness programs at lunch. This is something a little deeper, a little more serious.
And although CISOs, the people who organize their teams to deal with these incidents, are very amenable to it, the last holdout seems to be the HR departments. They need to be convinced that this is something they should undertake, which of course costs money. Everything in cybersecurity costs money, and that's a problem. Yeah, it is a cost center, and I think
it's fair to say it's more than just having a yoga room and a ping-pong table or whatever else. And the reality is, even in the first responder community, it didn't take on great salience until after we had a catastrophic incident. I remember after Oklahoma City, after the bombing of the Alfred P. Murrah building, and then again after 9/11, it suddenly became a more prevalent set of issues. So I think your tying it to first responders, first preventers, and emergency managers is a great approach, because like you said, they do have some training. And at the end of the day, if you look at our military, our special forces, they spend a lot of time on the behavioral side. And that battlefield is probably even more important than the kinetic one in some respects. So I'd be curious what some of your thoughts were around first responders. Anything in particular that we can maybe apply? Obviously it's not the exact same, but there are some parallels. Well, the one
main point that came through in my piece is to normalize, as I mentioned; normalizing the crisis lessens its emotional impact. Cybersecurity workers should be prepared that someday something really serious and bad will happen, and they will be in a pressure cooker and be responsible for getting things up and running. If they know that in advance, if they're able to talk about it in advance, and, when an incident does occur, able to talk about it rather than bottle up their emotions, they fare better. One of the experts I cite is somebody who spent 10 years as an emergency response dispatcher, who basically said that the police departments he worked with were the kind to say, all right, it's over, no need to talk about it, it's done. They would bottle up their feelings. But the firefighters did the opposite. They really talked things out, and as a consequence they were more resilient and less traumatized by things that happened. So I think the big takeaway there is: be prepared. Tell them in advance that something bad is going to happen. Be prepared and normalize the bad stuff. And ironically, that makes it a little easier to handle. And you mentioned Joe Sullivan. He
paid a very high price personally. He did. And obviously that signals to others: hey, if you take these jobs, you could be held to a bar that, even if you do try to do everything in your authority and capability, you still may not meet. He was prosecuted, and he himself had been a federal prosecutor, for obstruction of justice in a fairly complex case involving Uber. He was, I think, the very first CISO who ever had that happen to him. And that was a watershed moment where everyone got a little scared, thinking they could go to prison. In the end, he got probation. But it raised the prospect of going to prison for trying to handle an incident the best you can, which oftentimes does come with the fog of war. So, yeah, he learned, and his object lesson, I think, has been very helpful to the rest of the industry.
Like I said, they are also balancing accountability to their shareholders, their customers, their clients, and others. So it is a tough and complex set of issues; there's no simple set of solutions. But I do hope, I mean, any other parting thoughts before we jump to one of your other articles, on what companies could do, should do? I know about the HR side, but if you were staffing a Fortune 100 company, are there a couple of steps you think our viewers, listeners, and others can walk away with? Well, another point that Joe made that's in the piece is
that CISOs often shoulder the burden of security alone, even though they're not totally responsible for the policies that govern how organizations can respond to incidents. His recommendation is to share the burden with the other executives: to make them aware of what happens to the cybersecurity teams, to make them aware of the psychological and psychic toll that can fall on cybersecurity workers. It's maybe not top of mind for them, and it won't be until the senior security leader within an organization speaks up and tells the general counsel, or the CEO, or the board, or whoever it might need to be: hey, bear in mind that when these things happen, we need the resources and the consideration for the workers who are dealing with these incidents. So I think, from the first responder perspective and the perspective of sharing with the board, there's a lot more conversation that needs to be going on within the organization about the impact of cybersecurity incidents. And
just given the burnout, the workforce challenges, the talent deficit, I hope that your piece does trigger that discussion, a discussion that undoubtedly is needed and that we're only beginning to scratch the surface of.
Yeah, there's a lot more there. I'm not
a psychologist and don't play one on TV, so I don't know what I don't know. But I do know from a lot of friends, and this is not empirically based evidence, just anecdotal, that they are paying a price. And at the end of the day, this is a risk like any other risk that I think the corporate C-suite does have to take ownership of to a large extent. Not to pivot in a rough way to another story, but you had another really interesting piece, also for CSO Online, where you're a regular contributor, and they do great work. And that's looking at deception, not from an adversarial standpoint, but from a cyber defense perspective. I think every one of our viewers will have a lot of awareness around honeypots and some of these other more tactical defenses, but your piece went much more broadly. Maybe start with framing the piece, and then we'll jump into a couple of the big takeaways. Yeah, well,
so it's interesting. This piece came out of a conference that sadly had its last run in January here in Washington, called ShmooCon. I attended a session by a former FBI computer scientist named Russell Handorf; he was actually kind of a big wheel in the Philadelphia FBI office. He gave a presentation that was utterly fun and fascinating about this whole field of deception technology, which, as you pointed out, Frank, most of your readers are going to associate with honeypots. For a large portion of the industry, honeypots are kind of the beginning and the end of what deception technology is. Honeypots, just to shorthand it, are assets, typically servers, that are sacrificed or put into place to attract adversaries, to get more information on who they are, and to divert their attention away from the really good stuff, so that even if they're not completely diverted, they're slowed down and you learn more about them and can find out who they are. It's a relatively inexpensive proposition, from a couple thousand bucks a year maybe, to install the servers, attract the bad guys, and then just get rid of those servers. Deception technology as a concept also comes from the military, as do many things in cybersecurity. It was sort of pioneered by the NSA in the 1980s and 1990s as a theoretical and academic matter. And now it is basically this term for a very broad strategic goal of tricking the enemy and getting them lured into dead ends on your network and within your systems, so that you can, A, find out who they are, B, get signatures and footprint information and slow them down, and otherwise set up this very elaborate trap for them. It is beyond honeypots. It is a strategic and very complex view of things.
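The honeypot idea shorthanded above, a sacrificial server that logs who connects and what they first try, can be sketched in a few lines. The code below is a hypothetical, minimal illustration only; the port, banner, and log fields are invented for this sketch, and nothing here comes from the episode or any real deception product.

```python
import socket
from datetime import datetime, timezone

def run_honeypot(host="127.0.0.1", port=25222,
                 banner=b"SSH-2.0-OpenSSH_8.9\r\n", max_conns=1):
    """Minimal low-interaction honeypot: listen on a decoy port,
    present a fake service banner, and record who showed up and
    what they sent first. Returns the collected log entries."""
    log = []
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        for _ in range(max_conns):
            conn, (peer_ip, peer_port) = srv.accept()
            with conn:
                conn.sendall(banner)              # pose as a real service
                conn.settimeout(2.0)
                try:
                    first_bytes = conn.recv(256)  # the visitor's opening move
                except socket.timeout:
                    first_bytes = b""
            log.append({
                "time": datetime.now(timezone.utc).isoformat(),
                "ip": peer_ip,
                "port": peer_port,
                "data": first_bytes,
            })
    return log
```

In practice the value is entirely in the log: each entry is attribution and early-warning data on whoever touched an asset no legitimate user had any reason to touch.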
While intriguing, and I had a lot of fun writing this piece, and I'll explain why in a minute, because it had to do with Handorf's presentation, where he used a real-world analogy from his own life to illustrate the concept of deception technology, the reality is that launching a deception technology campaign is a very complex proposition. For the most part, it is very large organizations that do this level of trying to trick the adversary, and it's expensive. I had a conversation with Russ after his talk, and some organizations go to incredible lengths. They will actually set up fake departments within their organization, with fake workers typing in fake emails that are scripted, creating letterhead and fake invoices and everything else. For very sophisticated threat actors, that's what's necessary, because they will easily pick up on half-hearted attempts to create phony assets. The bill for that kind of deception operation is in the millions of dollars a year. What Russ sort of hinted at with me, though he didn't give me names, is that the organizations that do this are major financial institutions. They run deception operations where they lure people in and create assets that are realistic at first blush. But ultimately these very sophisticated threat actors, some of them nation-states, can figure it out: for example, if you have a thousand servers set up and they're all built exactly the same way, they will instantly spot that as a trap, and they will not fall for it, and you will have spent a lot of money for nothing. So you have to do it very realistically.
What made it really fun for me is that Russ told the story, and I have this in my piece, that his knowledge of deception technology led him, after he and his wife bought, I think it was, 40 acres of land in the countryside and discovered it was overrun by illegal hunters and squatters and other people, to create something called the Rattles. He literally formed a corporation or nonprofit organization, I believe, filed the paperwork, created something called the Rattles Rattlesnake Preserve, and put signs up all over his property saying, welcome to the Rattlesnake Preserve. He put up cameras, he put up microphones, and on the signage he put QR codes, so that people trespassing on his property would check it out, and then he would have their phone number and all their other identifying information that he collected, just for the purpose of providing law enforcement with information on people who were committing the very small crime of trespassing. But it was interesting, because he used that analogy to point out what you need to do at a corporate level. You need to actually make things look real to trap people into getting into your phony assets. Once all those
hunters realized there weren't rattlesnakes on every corner, did they stop? Did it work? Was it effective? Yeah, it was effective. It was effective. He didn't really
say out loud whether he gave any information to law enforcement, but the whole goal of it was to identify who these people were. And the reason he actually filed the incorporation papers and created the signs and everything else was so that they would think it was real, that there were rattlesnakes roaming around, even though they'd been hunting there for 30 years and never saw one. Yeah, that's a great story. And I want to push back on one thing you mentioned, sort of the NSA being the pioneer. The Greeks, and others during World War II, would say deception in technology and military environments has been around for a long time. But
that is correct. That is correct. But in the digital realm, they took these ancient practices of fooling the enemy and applied them anew. And I'm going to give you one guy to read about, because he was phenomenal during World War II: R.V. Jones, a British scientist. I don't think people have an appreciation for just how significant deception was in tricking the Nazis in so many battles; it ultimately saved lots of lives, not to mention possibly the war. Great set of issues. Just one more question on that, because I want to get to one more big story you wrote, for sure. Do you think deception technologies, and you touch on this, can be integrated with zero trust security models? Do you see that connection? So at least one of the experts I spent some time talking to really
hammered home that point. One of the things that happens if the enemy, the adversary, gets wise to the fact that they have been tricked is that frequently they will go on a rampage and destroy everything that is around. And therefore you want to make sure... If you do it, you do it for real. Right? Yeah, right. But you also want to insulate your real assets from any form of destruction by an angry adversary. And that's where zero trust comes in. One expert I spoke to said you've got to have a zero trust platform surrounding this particular deception trap: you only permit what has already been pre-authorized. Access follows a zero trust model where you don't trust anything, you don't trust anybody; you only permit in those things you're absolutely sure about. So when it comes to deception operations, that's how you insulate all this stuff. Yeah, and I also think, and again
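One way to picture the zero trust insulation the expert describes, a default-deny gate around the deception segment so that an intruder rampaging through the decoys can never pivot to real assets, is sketched below. All identity, zone, and asset names here are invented for illustration; this is a sketch of the general idea, not anything from the episode or a particular product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRequest:
    identity: str     # authenticated identity, not just a source IP
    source_zone: str  # e.g. "decoy", "corp", "soc"
    dest_asset: str
    action: str

# Explicitly pre-authorized (identity, zone, asset, action) tuples;
# everything not listed here is denied by default.
ALLOWLIST = {
    ("soc-analyst", "soc", "decoy-telemetry", "read"),
    ("deception-controller", "soc", "decoy-web-01", "configure"),
}

def is_permitted(req: AccessRequest) -> bool:
    """Default-deny gate: traffic originating in the decoy zone can
    never reach a real asset, no matter whose credentials it carries."""
    if req.source_zone == "decoy" and not req.dest_asset.startswith("decoy-"):
        return False  # insulate real assets from a rampaging intruder
    return (req.identity, req.source_zone,
            req.dest_asset, req.action) in ALLOWLIST
```

The design choice worth noting is the ordering: the decoy-zone check runs before the allowlist lookup, so even a stolen pre-authorized credential presented from inside the trap cannot reach real assets.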
disagree with me, I may be wrong, but it's also very effective for countering insider threats, by and large. Because at the end of the day, if you have an insider, whether directly an insider or one enabling an outside group, that's a much more difficult set of issues to defend against. So I would imagine that was high on the list for some of the experts you were speaking to. Yeah, you're correct. There's nothing to correct, because they did point
out that it's not only adversaries; it's insiders roaming around in places they should not go. And you can trick them as well into these sort of dead-end assets that are very effective for the organization. If somebody is roaming around thinking, oh, this looks really interesting, this is where all the financial accounts are, and they're really not someone who should be there, a deception operation can really get hold of those people. And I want to close
with a story that, I'm still amazed, hasn't gotten as much attention. A month or so ago you wrote about the National Defense Authorization Act, and you went through a long document to tease out all the cyber references. Not to steal your headline, but: $30 billion toward military-oriented cyber. Do you want to touch on a couple of the highlights? I've got a couple that are pet rocks of mine that I will go into if you don't. I still think it's amazing: it's been out for a couple of months, but there's so much in there, and I don't think enough of the discussion is around what is actually in there. So I'd be curious what your big takeaways are. Well, I mean,
yeah, thank you. There was a lot in there. Again, this is a piece where I left a lot of discussion on the cutting room floor, if that's the right analogy here. There are so many aspects. There was a provision to fund a shortfall the U.S. Federal Communications Commission had in its program to rip out gear from Chinese manufacturers we're concerned about for supply chain threats, namely Huawei and ZTE. The FCC needed something on the order of $5 billion to give out, mostly to local, rural, and small-town telephone companies who really don't have a lot of funds, to help enable them to rip out that gear. So the bill provided full funding, or at least, one would hope, sufficient funding, for the FCC to disburse those funds so they could replace the problematic gear with less problematic technology. There was a provision in there to protect Department of Defense mobile devices from the proliferation of foreign commercial spyware, which is a chronic problem; I don't think a day goes by where I don't read about spyware infections of mobile devices globally, not just in the US. And one element in there was a study, an independent assessment of the need for a cyber force. This comes up time and time again, and this was something that was part of the... I think it has legs this time.
I think it has some legs this time. Oh, interesting. So you might be able to tell me something I don't know, but it has been kicking around for a while, and there is a study in there to take a look at it. I'd be interested in why you think it has legs. There's a lot of
discussion on Capitol Hill, and it's not only that. I think there needs to be a true explainer differentiating what a Cyber Force could be and what US Cyber Command is. I've been outspoken that I don't think we need another reorg necessarily if we're doing certain things right. But this is about the services, and the reality is they're not all playing to the level of capability they ought to. It's not just the Cyber National Mission Force, but the services in particular. There's some momentum on this. I'm getting asked more and more questions, including from skeptics, and I'm having to become a little more open-minded, since I was a skeptic not too long ago. But I think it does have some legs.
Well, that's interesting. You know, I think our mutual colleague, or I don't even know that I can call you a colleague, you're far more experienced, but Mark Montgomery has really been beating the drum a long time. He has, and he's got some very sincere and hard analysis and papers. And I'm glad to hear that this has actually got legs. There are so many elements. The one thing that you mentioned at the beginning of the conversation is that there is a provision in there; the NDAA contains language that essentially elevates ransomware attacks, which have been very problematic to address. There are only so many tools the US can deploy in dealing with ransomware actors who are not in the United States, particularly if they're shielded by their home nation, which is, for example, the case with Russian ransomware actors. They're protected, in essence, by the Russian government. What are you going to do? Years ago I attended a conference where an FBI agent said that ransomware is the perfect crime. And it really is; it's very hard to stop, and the easiest way to get out of it is to pay the ransomware actor money. So how do you deal with something like ransomware? Well, the intelligence bill that was folded into the NDAA had this very long section, ultimately incorporated into the NDAA as a sense-of-Congress kind of thing, that basically said ransomware attacks are kind of like terrorist attacks. It proclaims ransomware organizations and the foreign affiliates associated with them to be something called hostile foreign cyber actors, instead of hostile foreign terrorist actors. Now the question in my mind becomes: does this mean anything?
It might. At least there's something on the books, in the statutes, that kind of equates the two and might give, and you probably have a better sense of this than I do, might give the US government a little more latitude to go after some of these ransomware actors. Without getting in depth, I think it could unleash authorities and tools that could have more than a little net effect on some of the outcomes. But at the end of the day, it does raise the bar. For years we didn't name or shame; now it not only names and shames, but, and what I felt was most interesting about this, and you touched on it, is that countries we lack extradition treaties with, or any law enforcement cooperation beyond just extradition, have provided safe haven for a number of these ransomware actors. And even when you deal with terrorist organizations, you have state sponsors, but you also have non-state sponsors, and that can bring different sorts of tools to bear. So more to follow on the net-net impact and outcome, but it's something I've spent some time thinking about, writing about, supporting, and testifying on. I don't want to cut the conversation short there, but our viewers have heard me on this one. I want to touch on one other point that you brought up, though, and that is the role AI will play in cyber operations. In particular, there is an AI security center being funded out of the National Security Agency. Anything there? Do you think that could have some positive effect? It's getting hard to delineate AI from cyber, just like physical and cyber are all converging. But I'd be curious what some of your thoughts are there. Well, I mean, it's a good
development, because I was really not aware that there was anything on the military side of things dealing with AI at this level. The NDAA directed the NSA to establish an AI security center within the agency's collaboration center to do a lot of interesting things that need to be done, such as developing guidance for the military, and presumably the intelligence community, on how to prevent or mitigate counter-artificial-intelligence techniques, and otherwise promoting secure artificial intelligence adoption practices for the managers of national security systems. We're in a new administration now, and I don't know that I have any more recent insight into what has happened with that. Have they begun work? I don't really have any information yet. I imagine that a lot of things are on hold while the dust settles, as the new administration finds its footing and figures out how it's going to deal with a lot of this stuff. But AI, for sure. I mentioned that there are some topics that come up every day; spyware is one of them, and AI is another. In fact, today a report came out from a major cybersecurity company called CrowdStrike, which you and your readers all know, that found that AI-generated phishing emails have a higher click-through rate, 54%, compared to human-written phishing emails, the ones intended to implant malware inside computers or devices. That to me was astonishing.
And they're probably learning, based on who's not clicking, how to entice those who are. So yeah, it's a constant, forever-learning model too. Cynthia, unfortunately the tyranny of time requires I be a bit of a tyrant. But before I let you go, what questions didn't I ask that I should have? You've covered a great set of issues. In all sincerity, you're a prolific writer, and I know many of our viewers already follow you, but I hope everyone does from here on out. So what didn't I ask that I should have? Oh, there are so many
questions. I mean, Don, who organizes these talks from the logistical and technical and planning perspective, I kept sending him notes like, oh, have him ask about this, have him ask about that. And we didn't get to all of them. But one thing, just broadly speaking, is the Chinese threat. One of the things that Don mentioned to me in our discussions ahead of this talk is the threat of port cranes, which is actually a subject that I took on and perhaps made much more public before any other publication. And that is Chinese port cranes, and mostly Chinese video technology, Hikvision and the like. Yep, yes, actually. And that's a sanctioned entity. But they're all connected; all those video technology companies in China are kind of the same in the sense that they're all protected by the Chinese government. That's a huge threat. But more recently there has been this idea that TP-Link routers are also a Chinese supply chain threat. I actually did a real deep dive on that in the fall, and it turns out not to be true. But it's an idea that has not gone away, despite the fact that there's no evidence that TP-Link routers made in China are any more of a threat than any other router, including Cisco and Netgear, which are made by US companies but probably manufactured in the same regions as TP-Link's. So it's important, when we're talking about Chinese supply chain threats and espionage threats, to sort of separate the wheat from the chaff. There are some serious concerns, obviously. We went through all the typhoons last year, Silk Typhoon and Salt Typhoon, Chinese threat groups which pose varying degrees of threats to US infrastructure. Some of those threats are real; some are not. But we have to have a much more sophisticated grasp on what are the true threats and what are not really true. We will have you on again talking port security
and supply chain. That is an evergreen topic for us, and as you can imagine, the Communist Party of China and its intentions are also an evergreen set of issues. I might just note, Flax Typhoon was the IoT one, and that got into a whole host of different issues. I think there's something brewing on the medical side right now, in healthcare in particular, causing some significant concern. Cynthia, thank you for taking so much time with us today. We will have you on again. I really appreciate all the good work you do, and thank you so much. Thank you. Frank,
thank you for inviting me. I enjoyed it. This was awesome. Thank you. Thank you
for joining us for this episode of Cyber Focus. If you liked what you heard, please consider subscribing. Your ratings and reviews help us reach more listeners. Drop us a line if you have any ideas in terms of topics, themes, or individuals you'd like for us to host. Until next time, stay safe, stay informed, and stay curious.