Since the COVID-19 pandemic hit, the way people work has changed, opening a new world of vulnerabilities. High-profile attacks on companies like Colonial Pipeline, SolarWinds and Kaseya have put cybersecurity on everyone’s radar. But in some ways, it feels like the defenders are still playing catch-up to the attackers. CFE recently put together a group of leading experts in the industrial cybersecurity field for an open conversation about some of the prevailing trends regarding government regulations and critical infrastructure.
Joining Gary Cohen, senior editor of Industrial Cybersecurity Pulse, are Ron Brash, Eric Byres and Dino Busalachi.
Formerly the director of cybersecurity insights at Verve Industrial Protection, Brash became the vice president of technical research and integrations at aDolus Technology in 2021 and has bridged the gap between technical and business domains. He is also a previous 40-Under-40 award winner.
Byres is the chief technology officer (CTO) and a board member at aDolus Technology. He is recognized as one of the world’s top supervisory control and data acquisition (SCADA) security experts and is the inventor of the Tofino Security technology, which provides industrial network and SCADA security products.
Busalachi is the CTO and co-founder of Velta Technology. He has 40 years of experience in the operational technology (OT) and industrial control system (ICS) fields. He has continued to use his expertise to provide solutions to protect critical industrial assets and devices.
The discussion has been edited for clarity.
ICS Pulse: Following some high-profile attacks, especially on critical national infrastructure, governments have tried to step in to standardize and shore up national cybersecurity, both in the U.S. and abroad. New regulations from the Cybersecurity Executive Order to things like the proposed Ransomware Disclosure Act have all been part of this effort. Why did this become necessary, and how can these things have an impact?
Eric Byres: It’s become clear after incidents like Colonial Pipeline and SolarWinds that this isn’t something individual companies can protect against. This isn’t something that you can just say, “Well, if we do good security, our company will be all fine.” The whole problem around supply chain and ransomware and all the issues that we’re seeing today requires a unified national solution, and that’s going to come about through guidelines or regulations. You can say that the regulations are to push the laggards, but it’s also to set a baseline. Also to determine how we’re going to cooperate as a country, as a nation, as an industry. How we work together to keep our systems secure.
Dino Busalachi: I would also add that the regulatory groups have recognized that critical infrastructure has a wide- and far-reaching impact on communities and on your nation. So you have to be able to put some parameters around [it]. If you think about wartime, critical infrastructure was always something that was highly maintained and controlled by the government — not owned by foreign entities — whether it be rail or transportation, your seaports, your power, your major manufacturing. Now, we’ve moved into the age of cyber, and it’s like, “How do we bring that into the fold?” Regulation is one way to do that, especially for folks that get attacked and don’t communicate what’s happened to them or what disruption there is.
What we see as practitioners in the industry, the vulnerabilities are pretty extensive. You can get into the argument over information technology (IT) and operational technology — what was attacked and how did they get in there? But the fact is there’s a lot of IT technologies controlling physical outcome-producing assets on the plant floor.
ICSP: With attacks on critical infrastructure, it’s also a human safety issue. If somebody gets into a system like what happened in Oldsmar, Florida, they can really have an impact on society.
Byres: Unfortunately, what Stuxnet did is it weaponized cyber. Security agencies, governments and militaries around the world asked two questions when they learned about Stuxnet. The first was, what do we do to make sure the Americans or whoever don’t do that to us? And the second was, how do we get some of that? Because we want to be able to do it, too. So we’ve weaponized industrial control system security. It’s sad, but it’s the way it is. Now, every major country in the world, I think, has a defensive posture and, unfortunately, probably an offensive capability, as well. Certainly, the major superpowers are all developing those offensive cyber capabilities to disrupt national infrastructure.
Busalachi: I like what you said, Gary: safety. We actually coined a term we like using. We call it digital safety. Saying cybersecurity to an organization is a big word, a boil-the-ocean kind of word. Depending on who you’re talking to, I find it a dangerous precedent to let folks make decisions on how you’re going to safely secure industrial control systems when, at the end of the day, they are not going to be held accountable if anything goes wrong. So it’s really relevant that the OT organizations have to own this. IT plays a role; it plays a significant role. But I don’t believe that they can own what goes on inside those control systems, and I see that when I walk on the plant floor. If I’m down there with an OT person, and we’re cracking open panels, and we’re looking at their PLC (programmable logic controller) architecture, and what kind of control systems we’re using, and how that plant floor is laid out for the control room, I get a whole different perspective.
I understand because I’ve been in several hundred manufacturing facilities in my career — versus when I’m out there with an IT person who, the farthest they can get is maybe to an intermediate distribution frame (IDF) or to a closet or to some data center within the plant because that’s their realm of responsibility and their realm of control. I think it should be safety because process integrity and operational resiliency are job one inside of a manufacturing environment. So OT is going to have to figure out a way to own this and manage this.
Byres: Safety and security are two sides of the same coin. In fact, I believe in German they’re actually the same word. You can’t have a safe system if it’s not a secure system. So I do think you’re absolutely nailing it there, Dino. That is what we have to come to terms with in this industry. We’re going to have to figure out how to make our systems safe, secure and reliable.
ICSP: Ron, you and I have had a conversation in the past about the IT/OT divide, where IT often doesn’t understand what’s really happening on the plant floor and vice versa.
Ron Brash: I didn’t want to tear away from the wonderful discussion with Eric and Dino, but I also see why we’re in this current state with regulation. It’s similar to when steam locomotives were constantly blowing up and killing people back in the day. We had the same thing with vehicles on the road. There was a need for regulation because of human [issues] — or I would say financial and societal problems and idiosyncrasies, whatever you want to call them, inherited attributes of these things. They drive the way that a lot of these facilities were designed, run and modified over time. The majority of OT facilities do not necessarily need more cybersecurity silver bullets, per se. What they need is fundamental and continual CapEx (capital expenditure) for concrete improvements overall. That’s probably the first piece because most of these facilities have been kind of derelict and have let their investments lapse for a very long time. In the majority of facilities, even the networking infrastructure, the fiber, is hokey. They need lines replaced.
Whose budget does that come from? You need to put in new routers and switches. Your general lifetime for networking gear is 10 years. Some of these are much older than that. So there are a lot of things that have cascaded into that technical deficit. Security degrades as a function of time, but all of that has added up to where we are now. And that’s just on the asset owner side. Then, we have engineers cutting corners on designs because nobody wanted to ask, “What should we think about our users’ passwords?” or, “Maybe we shouldn’t have systems that go in there with everything but the kitchen sink.” Nobody thought to pay attention to that. I don’t understand why, because generally engineers are quite thorough and logical in how they apply technology or their thought process. So something is broken in society, and you see this in software. You’ll see this inside those embedded programmable logic controllers (PLCs) and all that stuff, too.
There’s everything in them now, and it shouldn’t be that way. Systems have become so complex. They’re actually systems of systems now just inside of that PLC alone, which then talks to another system of systems. So the whole thing has kind of gotten away from itself, I think, and we need to get back to engineering out the risk based on the impacts that that facility or that process could enact. That’s what we’ve gotten away from. No doubt, there’s a problem there, and there are the weaponization aspects, but there’s something fundamental we have to change, and that’s probably going to have to start in schooling. Executives also need to understand that security and technology need a continuous budget. It’s not one-offs anymore, just like you wouldn’t drive your car down the street and never put a dime into it after a certain point. We need to change the way we look at this stuff.
ICSP: That makes perfect sense. It’s also important to convince the C-suite that there does need to be a significant budget for cybersecurity, which can be tricky.
Busalachi: It’s like insurance, right? That’s the other piece: waking up the board of directors and the C-suite to the risk aspect of it and the insurance aspect of it, because that whole dynamic is changing. If I was going to add a third leg to the stool of what Ron just described, it would be that piece. He started off talking about regulation and then got into the continuing funding aspect of it. Well, if you’re going to go for insurance and risk, how does that impact your business? And how are you going to demonstrate that you’re doing something in a continuous fashion to improve that? Assessments aren’t good enough anymore. The one-and-done strategy of coming in to run an assessment in an environment is invalid as soon as the team leaves the site because things have already changed in there.
So you have to have something continuous going on there. You have to demonstrate that you’re constantly watching what’s going on inside that environment and try to expand it as deep as you can into anything that’s connected to that environment, which is one of the trickier aspects. It goes back to what Ron was saying: You have to fund this. You have to recognize that you don’t have the right technology to get further and broader and deeper in that space, and things have to be updated. And there has to be a budget available for that. As for the insurance companies, we’re already seeing that they are denying policies to clients.
Byres: I actually think funding for this has become easier, thanks to the executive order, thanks to Colonial Pipeline. Five years ago, the board didn’t even know they had an OT cybersecurity problem. In a lot of companies, they probably didn’t even know they had OT. Now, I’d be surprised if there was a board anywhere in the world that wasn’t concerned about their OT cybersecurity risk if they have OT. Now, the question is, what are we going to do to reduce our risk, and how much is it going to cost us? If you’re a board member, you’re going to be protecting yourself. I don’t run into as many people now saying, “Well, we don’t have budget,” as I do people who’ll say, “We don’t have people. I have the budget, but I don’t have the people, or I don’t have a strategy. I’ve been told by the board I’ve got to reduce our risk, and I don’t know what to do next.”
Busalachi: That’s a great point. And you’re right, the human capital is definitely important. You have to have the skill sets to be able to go in there and do this work because everything that we do is net-new work. It’s not making anybody’s job easier. We’re not reducing a workload. If anything, we’re increasing it two-, three-, four- or five-fold and trying to dump that on the shoulders of somebody who’s already got 10 jobs in a plant. The question is, where do you find these resources? Are the services even available? You’re going to add full-time equivalents, because it is a multiyear strategy when you kick it off. Most manufacturers that we work with have an average of seven, eight, nine, 10 plants in their fleet.
Brash: With my previous hat, I was at a number of facilities, and it would be no problem for a facility to go get an $8 million or $10 million machine. It wasn’t a big deal to go get that money because that was something that would generate revenue. That group also wanted to improve their security at the same time, but the money the facility was generating never flowed back to OT security. It always went back to the corporate ivory tower, which got the cybersecurity budget. That somewhat trickled down eventually, but it was a war of attrition.
For that budget to get back to those facilities, to have the right resources, to redo the decades of neglect — and also the churn of resources, because a lot of people have retired and so on — it’s going to take multiple millions per site, and the boards and everyone have got to sit down and say, “You know what? Our quarterly or year-end isn’t going to look so good if we do this right. But if we do it right, then it’s going to save us so much more.” But those discussions have not occurred yet to my knowledge because the OT security budgets are still minute in comparison. I’ve seen OT cybersecurity budgets for very large organizations that are probably smaller than a medium-size university’s. And that’s negligent, to be honest.
Byres: Ron, I disagree. I’m starting to see budgets that are starting to match. Now, it depends on the industry. It depends on the board’s risk tolerance and their risk sophistication. But I remember sitting in on a meeting where the board had realized that they were spending X million on fire suppression, on platforms, and the risk from cyber disaster on a platform was the same as a fire. Therefore, the budget should be pretty darn close. So I think it depends on the industry, and it depends on the sophistication of the company, but I think we’re seeing a move in the right direction.
Brash: I can agree with that. It’s largely domain dependent. But with manufacturing being one of the current big issues — not necessarily oil and gas or energy — manufacturing is the one that’s [hurting], and mining, of course, and your typical commodities based on natural resources, like pulp and paper and that type of stuff. Those guys, they’re hurting. They’ve been hurting for many, many years. If I were to be concerned about things, that’s where I would be concerned. Everybody likes their toilet paper. We’d all go mad if we didn’t have it, right?
ICSP: It really starts with knowing where your risks are and being smart about how you allocate those resources. You could chase every vulnerability and keep throwing controls at things, but you also have to be intelligent about what the risks are to your specific company and then triage those risks.
Byres: You nailed it. I think that goes to what Dino was saying about IT not understanding OT. If you’re looking at the risks on an OT facility, you’ve got to be looking at the mission of that facility. You’ve got to be saying, “What is the worst thing that can happen here, and how do I make sure the worst thing doesn’t happen?” It could be a safety incident. It could be loss of human life. It could be environmental impact, but what’s that risk? That’s where you’re starting. And I see a problem, for example, with patching. There’s this enthusiasm for patching as a solution in its own right, which misses the risk question. I’m pretty sure that I have seen more outages caused by unnecessary patches in the energy industry than by all the foreign adversaries in the world.
We’ve done more harm from patching than the bad guys have done to us. The real critical thing is getting focused on real risk analysis, not compliance, not checking the box — have I patched absolutely everything? But what are the risks that my organization is facing? Then, what are we going to do to reduce those? That’s where really Dino comes in and your ability to see those facilities not just as a bunch of wires in a rack room, but actually, what is this place trying to make, and what’s the risk around it?
Busalachi: The internal threats are much more prevalent than external. No doubt about it. The unplanned, unscheduled downtime an organization experiences throughout the course of the year is definitely more from them shooting themselves in the foot — from others inadvertently pushing out patches or group policy changes, network changes, scanning the network at the wrong time. All of the things that they can do to disrupt their own ability to produce goods are much more prevalent compared to malware being in there or malicious attacks from somebody coming in. So to get through the risk, do the tabletop exercises. That’s one of the things we try to advocate, especially between IT and OT, to get them on the same page. Do the tabletop exercises. Do a workshop, and go through this to determine the risk and the activity. Get those IT guys out there on the plant floor and open those panels, and show them what you’re trying to secure and protect so they can get it. Explain it to them, because a lot of them don’t get it.
I’ve been in three plants in just the last three weeks, and all three were different scenarios based upon who I was out there with and who knew what. We’re out there with our hard hats and our safety shoes and our safety glasses, and the IT guys have got safety glasses and a headset. You go out there with the engineering team, and you get a whole different set of folks who know everything and walk you through what I call the gemba walk. If you’re a Rockwell person, you probably know what the gemba walk is. Basically, you walk in the front of a plant where the goods come in, and you walk all the way through the entire process area, all the way to finished goods and out the door, to get an understanding of what’s going on there. That’s how we match what our OT platforms are seeing on the machine versus what we’re actually seeing out there on the site. And then we try to get that IT and OT group to sit down together and work through the overlap between the two of them.
Byres: Dino, you nailed something there — getting them to work together. I remember Eric Cosman, who was head of OT security at Dow and who is the chair for the 62443 committees, used to tell his team there’s never just a leak in your end of the boat. I thought that was such a great analogy. It isn’t an IT problem or an OT problem. It’s an IT/OT problem. They’re not separate. They’re connected. Forget the isolation. It doesn’t exist. Sure, there’s separation, and there’s ways you can put firewalls. But at the end of the day, you’ve got an IT and an OT system, and they’re working together. You’ve got to work together as a team, or the bad guys will eat your lunch.
ICSP: One last question on regulations: These are government regulations, but so many of the companies that are responsible for critical infrastructure are not government entities. They are private industry. How do you get private industry on board, and how effective can government regulations like the executive order be on a private company?
Byres: I think the executive order is hugely impactful because it sets a baseline. Yes, you may not be selling to the government, but you’re selling to another company who says, “Well, if that’s what the government expects, we expect it, too.” For example, we’re seeing Middle Eastern sovereign oil companies now saying, “If the U.S. government is requiring software bills of materials from its OT suppliers, we are, too. We know they’re going to be supplying them to the U.S. government, so don’t tell me you don’t have them.”
Something the executive order did is set a minimum baseline on what’s expected of cybersecurity in a contract, particularly on the supplier side. At the same time, I think people don’t realize how much of a supply chain we have. You may not be selling to the U.S. government, but you’re selling to somebody who is selling to the U.S. government. Your client is a U.S. government supplier. We find this all the time, even in our company. We’re supplying to, say, a large company like Caterpillar, but they’re ultimately supplying the U.S. government, so we’ve got to comply.