A Current View of Gaps in Operational Technology Cybersecurity

Trend Micro Research
Written by Joe Weiss, Applied Control Solutions, LLC, and Richard Ku, Trend Micro Incorporated
Stock image used under license.

The information provided herein is for general information and educational purposes only. It is not intended and should not be construed to constitute legal advice. The information contained herein may not be applicable to all situations and may not reflect the most current situation. Nothing contained herein should be relied on or acted upon without the benefit of legal advice based on the particular facts and circumstances presented, and nothing herein should be construed otherwise. Trend Micro reserves the right to modify the contents of this document at any time without prior notice. Translations of any material into other languages are intended solely as a convenience. Translation accuracy is not guaranteed nor implied. If any questions arise related to the accuracy of a translation, please refer to the official version of the document in its original language. Any discrepancies or differences created in the translation are not binding and have no legal effect for compliance or enforcement purposes. Although Trend Micro uses reasonable efforts to include accurate and up-to-date information herein, Trend Micro makes no warranties or representations of any kind as to its accuracy, currency, or completeness. You agree that access to, use of, and reliance on this document and the content thereof is at your own risk. Trend Micro disclaims all warranties of any kind, express or implied. Neither Trend Micro nor any party involved in creating, producing, or delivering this document shall be liable for any consequence, loss, or damage, including direct, indirect, special, or consequential loss, loss of business profits, or special damages whatsoever, arising out of access to, use of, or inability to use, or in connection with the use of, this document, or any errors or omissions in the content thereof. Use of this information constitutes acceptance for use in an "as is" condition.

Contents

Cybersecurity
Control Systems
Control System Cybersecurity
The Need to Address the Growing Gap Between IT/OT and Engineering
Misconceptions
Nature of ICS Cyberthreats
Nature and History of Control System Cyber Incidents
Cybersecurity Strategy and Security Controls
Recommendations

The goal of engineering has always been to protect the physical processes it monitors and controls from electronic threats; that is, to "keep lights on and water flowing." Networks are a support function in the overall objective of safety and reliability. Because unintentional cyber incidents can be just as deadly and damaging as malicious attacks, both need to be addressed.

Monitoring and preventing the compromise of data has been an IT function since the late 1980s. Engineering, for its part, has decades of experience designing and implementing control systems, including those addressing seismic, environmental, fire, and reliability concerns. As these were all engineering considerations, control systems were originally the responsibility of the engineering organization, and that responsibility included cybersecurity. The focus was from the "bottom up": the main consideration was whether the process could be affected, which required an understanding of the process and the equipment. Over time, however, the cybersecurity function for control systems was moved to the IT organization and engineering was no longer involved. Consequently, all cybersecurity monitoring and mitigation were at the IP (Internet Protocol) network layer, that is, network anomaly detection. As a result, control system cybersecurity went from being mission assurance to information assurance.
Since engineering systems are not within IT's purview, control system devices, such as process sensors, actuators, and drives, still do not have capabilities for cybersecurity, authentication, or cyberlogging. Yet there is a need to protect all the systems at all levels of the industrial control system (ICS) environment, not just the IP networks. Moreover, because of the continuing profusion of ransomware attacks, there has not been sufficient focus on these lower-level control system gaps.

Glossary

The following are a few terms used in control system cybersecurity and their corresponding definitions. The terms IT, OT, IT/OT convergence, and IoT come from the ISA TS12 Industrial Networking and Security definitions.

Information technology (IT): This refers to the study or use of systems (especially computers and telecommunications) for storing, retrieving, and sending information.

Operational technology (OT): This refers to hardware and software that detects or causes a change, through the direct monitoring and/or control of industrial equipment, assets, processes, and events. OT is not the pumps, the valves, or other hardware, nor does it include the engineers and the technicians responsible for the equipment.

IT/OT convergence: This refers to the integration of IT with OT systems.

Internet of things (IoT): This refers to the internetworking of physical devices (also referred to as "connected devices" or "smart devices") and other items embedded with electronics, software, sensors, and network connectivity, which enable these objects to collect and exchange data. The term mostly refers to consumer devices such as smart watches, smart printers, and smart cars.

Cyber incident: The de facto IT definition of a cyber incident is when a computer system is connected to the internet and running Windows, and data is maliciously being manipulated or stolen; it is about privacy. The definition by the National Institute of Standards and Technology (NIST) is an occurrence that actually or potentially jeopardizes the confidentiality, integrity, or availability (CIA) of an information system or the information the system processes, stores, or transmits, or that constitutes a violation or imminent threat of violation of security policies, security procedures, or acceptable use policies; incidents may be intentional or unintentional. It should be noted that there is no mention of "malicious" or of safety. It should also be noted that for control systems, I and A are much more important than C.

Smart: When applied to things such as cities, grids, sensors, and manufacturing, this refers to two-way communications and programmability, and includes Industry 4.0 and the industrial internet of things (IIoT). All of these technologies are cyber-vulnerable.

Cybersecurity

Cybersecurity became an IT issue after the first virus or worm was identified in the late 1980s. The Morris worm of Nov. 2, 1988, usually considered the first computer worm and certainly the first to gain significant mainstream media attention, was distributed via the internet. This worm resulted in the first conviction in the US under the 1986 Computer Fraud and Abuse Act. IT cyberattacks have continued unabated, leading to widespread attention and legislation.
IT cybersecurity threats have also led to the development of the cybersecurity industry, with companies like Trend Micro, McAfee, and Symantec, and of cybersecurity policies, starting with ISO/IEC 27000, which is part of a growing family of ISO/IEC information security management system (ISMS) standards within the field of information and IT security. These standards include general methods, management system requirements, techniques, and guidelines for addressing both information security and privacy. However, they are IT-focused and do not address the unique issues associated with control systems, including reliability and safety. This has led to the establishment of ISA99, which is developing the ISA/IEC 62443 series of cybersecurity standards specific to automation and control systems, as illustrated in Figure 1. And as digital transformation happens across many verticals and industries, other standards will also need to be updated to ensure that they can meet the cybersecurity challenges of these fast-changing verticals and industries.

Figure 1. ISA/IEC 62443 control system cybersecurity standards

Control Systems

The Purdue Reference Model, shown in Figure 2, was developed in the 1990s to identify information flows in control systems. Cybersecurity was not an issue for the reference model. The Purdue Reference Model was also based on the technology existing at the time, which made discriminating between sensors, controllers, process control networks, and other components straightforward, as their capabilities were limited. With the microprocessor and communication revolution, the reference model levels are no longer so straightforward, as the technologies enable process sensors to also have programmable logic controller (PLC) capabilities and even communication gateway capabilities.

The Level 0,1 devices used in critical infrastructures are not cyber-secure. In fact, many instrumentation devices and low-level instrumentation networks may not be able to be secured. Levels 2 and 3 are critical to secure, as they generally use traditional networking architectures to communicate with other control systems within the facility and can also communicate with the cloud. The cloud level was not considered when the model was developed; it is currently being lumped within Level 5. Consideration should be given to creating a new level specifically for the cloud. This is especially important for verticals like manufacturing, healthcare, and retail, as these industries seem to adopt cloud and virtualization faster than industries like oil, gas, and power.

This is in contrast to the International Standards Organization (ISO) seven-layer model, which was developed for network communications and security. The ISO model divides network communication into Layers 1 to 4, the lower layers, which mostly concern themselves with moving data around, and Layers 5 to 7, the upper layers, which contain application-level data. Networks operate on one basic principle: "Pass it on." Each layer takes care of a very specific job and then passes the data on to the next layer. This is exactly what occurs at the Purdue Reference Model Level 2 and 3 networks.

Figure 2. The Purdue Reference Model
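To make the level distinctions above easier to follow, the sketch below encodes a simplified version of the Purdue Reference Model as a lookup table and flags field devices whose capabilities no longer fit a single level, which is the blurring described in this section. The level labels paraphrase the model as summarized here; the device record, field names, and warning logic are illustrative assumptions and not part of the model itself.

```python
# Simplified Purdue Reference Model levels as described in this paper.
# The device example is hypothetical and for illustration only.
PURDUE_LEVELS = {
    0: "Process: sensors, actuators, drives",
    1: "Basic control: PLCs, protective relays",
    2: "Area supervisory control: HMIs, process control networks",
    3: "Site operations: historians, long-term databases",
    3.5: "Industrial DMZ",
    4: "Enterprise: ERP, business planning, internet-facing systems",
}

def classify(device: dict) -> str:
    """Return the nominal level description, noting capability creep.

    A modern 'smart' sensor that also offers PLC logic or an Ethernet
    gateway spans several levels, which is exactly what blurs the model.
    """
    level = device["nominal_level"]
    label = PURDUE_LEVELS[level]
    extra = device.get("capabilities", set()) - {"sense"}
    if level <= 1 and extra:
        return f"{label} (WARNING: also provides {', '.join(sorted(extra))})"
    return label

if __name__ == "__main__":
    smart_transmitter = {  # hypothetical field device
        "name": "PT-101 pressure transmitter",
        "nominal_level": 0,
        "capabilities": {"sense", "plc_logic", "ethernet_gateway"},
    }
    print(classify(smart_transmitter))
```

Running the example prints a warning for the hypothetical transmitter: a Level 0 device that also exposes PLC logic and a gateway interface no longer maps cleanly onto a single level of the model.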
A typical control system is composed of Level 0,1 devices (sensors, actuators, and drives) connected to Level 2 controllers, which are connected to process control networks and human-machine interfaces (HMIs), also known as operator displays, at Level 3, which in turn are connected to long-term databases and off-site facilities, including the internet, at Level 4. Levels 3 and 4 have the capabilities for cybersecurity and cyberlogging, and generally use IP networks, as shown in Figure 2. The sensors and the actuators operate almost exclusively in near-real time (microseconds to milliseconds), whereas the HMIs provide operator information on the order of seconds to minutes. The sensors and the actuators can operate, and in most cases were designed to function, without the IP network.

Figure 2 provides a representation of the equipment and the information flows in a typical process system, from the process (Purdue Reference Model Level 0) to the enterprise resource planning (ERP) systems (Purdue Reference Model Level 4). Generally, the demilitarized zone (DMZ) server would reside at Level 3.5. However, as technology has moved intelligence further down to the lower-level devices, modern smart sensors can act not only as sensors but also as PLCs and gateways, since they are equipped with Ethernet ports that allow direct communication with the cloud or the internet, bypassing the Level 3.5 DMZ. This capability, which provides improved productivity, also introduces a very significant cyber risk, as the digital sensors have built-in backdoors to allow for calibration and other maintenance activities without a firewall or authorization.

As organizations transform their businesses with the adoption of the cloud and virtualization to help provide better visibility and improve productivity and efficiency, we believe there is a new level, Level 6: the cloud, which needs to be considered for cybersecurity.

Figure 3 shows how business risk and cyberthreats are directly connected, and we have seen this risk model prove correct over the last several decades across several big transformations: from client/server architecture, to LAN/WAN architecture, to the internet architecture, to the cloud/SaaS/container architecture, and now to the convergence of IT/OT and the OT digital transformation architecture.

Figure 3. Evolution of industrial cyber risk

Control System Cybersecurity

In order to understand the cybersecurity status of an organization's OT and control system environment, an example of which is shown in Figure 4, there is a need to understand how the control systems interact with the different threat vectors that could potentially affect the OT environment.

Figure 4. A typical organization's OT and control system environment
The Level 0,1 sensors are like the feeling in our fingers and toes: they provide the stimuli to our brain, which is the control system. If the sensing input to the brain is wrong for any reason, the actions of the brain will not be correct. For example, if our fingers are insensitive to a nearby flame, the brain will not react to pull the fingers away from the flame. In the physical world, sensors measure pressure, level, flow, temperature, voltage, current, strain, color, humidity, vibration, volume, chemistry, and other parameters. The measurements are input to control systems such as PLCs and electrical breakers, which are programmed to maintain systems within physical constraints based on sensor readings. The sensor readings are assumed to be stable and accurate. Consequently, calibration is generally scheduled every one to three years to "recorrect" the sensor readings as they "drift" over time.

From the 1970s through the mid-1990s, sensors and control systems were isolated systems not connected to the outside world. They were entirely within the purview of the engineers who designed, operated, and maintained them. Consequently, the design and operational requirements were for performance and safety, not cybersecurity. The "dumb" sensors and control systems provided engineering data that was useful only to the engineers. What changed was not the internet but the microprocessor. The microprocessor provided the calculation and conversion capability to take 1s and 0s that were not useful to anyone else and let the engineers convert them into information that could be used by multiple organizations outside the engineering organization. It was the availability of this useful information that led to the desire to share it within and outside the immediate engineering facility. This enabled productivity improvements like "just-in-time" operation through sharing data with multiple organizations. The internet and modern networking technologies were the vehicles for disseminating this valuable information.

Modern communication technologies with improved analytics, now employed at the smart device level, enable Industry 4.0, the IIoT, transactive energy, and more, but at the price of significant cyber vulnerabilities that could affect the entire process. What is common among all these modern technologies that provide improved productivity is the dependence on reliable, accurate, and secure sensors, controls, and actuators. But what is missing? Cyber-secure sensors, controls, and actuators.
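A minimal sketch can make the preceding point concrete: a controller that trusts its sensor implicitly will act on a drifted or spoofed value exactly as if it were real, and nothing at Level 0,1 records that anything was wrong. The threshold, variable names, and numbers below are hypothetical; this is not any vendor's control logic.

```python
PRESSURE_LIMIT_PSI = 120.0  # illustrative physical constraint

def control_step(reported_pressure_psi: float) -> str:
    """Minimal PLC-style logic: act purely on the reported value.

    There is no authentication or plausibility check here, which mirrors
    the point that Level 0,1 readings are trusted implicitly.
    """
    if reported_pressure_psi >= PRESSURE_LIMIT_PSI:
        return "OPEN relief valve"
    return "HOLD"

# The true process pressure is dangerously high, but the transmitter output
# has drifted (or been manipulated) low, so the controller does nothing.
true_pressure = 135.0
reported_pressure = true_pressure - 20.0  # drift/spoof of -20 psi
print(control_step(reported_pressure))    # prints "HOLD" despite 135 psi in the pipe
```

Running the sketch prints "HOLD" even though the true pressure is above the limit, which is the point of this section: nothing at this level detects, authenticates, or logs the bad reading.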
Figure 5 and Table 1 show some of the potential threat vectors to a control system environment. In some cases, the adversary is able to compromise the OT network from the IT environment. In other cases, the attacks come from physical attacks on the field network devices or from software attacks injecting malware into the system during patches or firmware or software updates. There appears to be a lack of understanding about the number of potential attackers as well as the ease of attacking OT networks.

Figure 5. ICS attack vectors

Table 1. Security challenges for OT environments

Area | Security issue | Impact
Local area networks for collecting and locally processing data from connected ICS objects | Lack of authentication and security in process sensors | Compromised data could lead to equipment damage, regulatory issues, and personal safety hazards.
Transmission of data to the cloud | Lack of security protocols | (same as above)
Processing and storage of data in the cloud by appropriate platforms and specific algorithms such as big data | Lack of data security | (same as above)
Interfacing between platforms and end users for monitoring | Lack of secure communication protocols | Use of the cloud could lead to unforeseen operation concerns.
Device/control system | Lack of security in the development life cycle, which introduces vulnerabilities and unsecure passwords | Compromised devices could lead to their use in botnet attacks or manipulation of equipment for performing harmful activities.

The Need to Address the Growing Gap Between IT/OT and Engineering

There has been a trend of highly integrated industrial automation sharing more constructs with IT (known as IT/OT convergence). As opposed to IT security, control system cybersecurity is still a developing area. Control system cybersecurity is an interdisciplinary field encompassing computer science, networking, public policy, and engineering control system theory and applications. Unfortunately, today's computer science curricula often do not address the unique aspects of control systems, as shown in Figure 6. Correspondingly, electrical engineering, chemical engineering, mechanical engineering, nuclear engineering, and industrial engineering curricula do not address computer security. Consequently, there is a need to form joint interdisciplinary programs for control system cybersecurity both in the university setting and in industry. The cultural gap between the cybersecurity and engineering organizations is alive and well, and it starts at the university level. The impact of this gap is felt in the disparity between engineering system and cybersecurity product designs, as they are diverging rather than converging.

Figure 6. IT/OT vs. engineering: Packets vs. process

Misconceptions

The prevailing view is that control system information is not publicly available. In reality, there are a limited number of control system suppliers, which supply control systems to all industries globally. Control system information often includes common passwords that cross industries and continents. There are a limited number of major system integrators, who also work across multiple industries worldwide. The control system vendor users' groups are open, with various information-sharing portals and other channels. Consequently, there is sharing of universal control system knowledge that is accessible by both defenders and attackers.

Another prevailing view is that network monitoring can detect all anomalies. However, it cannot detect communications from hardware backdoors.
Some transformers have been known to include hardware backdoors, which allow attackers to remotely compromise the transformer control devices, including the load tap changer and protective relays, and consequently damage the transformers.

There is also a prevailing assumption that supervisory control and data acquisition (SCADA) systems or HMIs (master stations) are used in all control systems. This is not true. For example, cruise control is a control system, yet there is no operator display specific to cruise control, just on or off. Many people assume that SCADA is needed to keep lights on or water flowing. SCADA is for process optimization and view. A US utility had its SCADA system hacked and lost for two weeks, but there was no loss of power and therefore no disclosure to the authorities. Many people also assume that the operator can prevent damage by using the HMI. The HMI responds in many seconds to minutes, while a compromise of a system can occur in milliseconds, which is too fast for any operator. This does not mean, however, that an organization does not need to secure its SCADA systems or HMIs. It still needs to do so, because lack of visibility and control into these systems could result in operational downtime and costly business impact.

Many people assume that control system devices can be accessed only from Ethernet networks. This is also not true. In fact, this assumption is key to the Maginot Line approach, in which all cybersecurity monitoring and mitigation assumes that all communications must go through the Ethernet networks. Monitoring the Ethernet networks is necessary, but it alone is not sufficient.

OT network security vendors and consultants assume the Level 0,1 process sensors or field devices are uncompromised, authenticated, and correct, and therefore that the packet is all that needs to be monitored. However, there is no cybersecurity, authentication, or cyberlogging at Level 0,1. Sensors have been demonstrated to drift, which is why they need to be recalibrated. Sensor configurations such as span, range, and damping cannot be monitored from the Ethernet networks, yet they can be compromised. The Corsair demonstrations from Moscow, Russia, at the ICS Cyber Security Conference in 2014 showed how Level 0,1 vulnerabilities could be exploited.

Many people assume that network vulnerabilities correspond to physical system impact. They do not. It is generally not possible to correlate the severity of a network vulnerability with the potential for hardware impact. It is also not possible to correlate a network vulnerability with specific equipment such as pumps, motors, or protective relays. Consequently, the question is: What should engineers do when they are apprised of cyber vulnerabilities?

Many people equate cybersecurity with safety. They are related but not the same. A process can be cyber-secure but not safe, since there are other features besides cybersecurity that can make the process unsafe.
Conversely, a process can be safe but not cyber-secure if devices that are independent of any network are used for process safety.

The gap between networking (whether IT or OT) and engineering is summarized in Table 2.

Table 2. Differences between networking and engineering

Networking (IT/OT) | Engineering
Zero trust | 100% trust
Worried about malware | Worried about process and equipment
IP networks with security | Lower-level non-IP networks without security
Assumes all communications go through Ethernet | Can get to Level 0,1 devices without Ethernet
Worried about advanced persistent threats | Design features with no security
Focus on malicious attacks | Focus on reliability/safety

As can be seen in Table 2, networking and engineering are, in many cases, fundamentally different. Issues such as zero trust versus 100% trust fundamentally affect architecture, training, and policies. The difference between networking systems that are nondeterministic and control systems that are deterministic directly affects technology and testing. This difference has resulted in control systems having been shut down or even damaged because of the use of inappropriate network technology or testing tools.

Nature of ICS Cyberthreats

Because of the potential damage that cyberattacks could have on businesses, the economy, and the defense industry, control system cybersecurity should be a top-level national security concern and a priority for every business. However, this is not the case. Arguably the greatest hindrance to critical infrastructure cybersecurity is the refusal to acknowledge the problem. Neither the Solarium Commission Report nor the Cyber Moonshot program, for example, addressed the unique issues with control systems. And in an article titled "Dismissing Cyber Catastrophe," James Andrew Lewis, a senior vice president and director of the Strategic Technologies Program at the Center for Strategic and International Studies (CSIS), says that a cyber catastrophe captures our imagination, but in the final analysis, it remains entirely imaginary and is of dubious value as a basis for policymaking. According to Lewis, there has never been a catastrophic cyberattack. These statements are obviously not true. Consequently, despite recent attempts to address the problem, public policy prescriptions, although helpful, are far from sufficient. In fact, articles such as Lewis' can dissuade organizations from focusing their attention on control system cybersecurity.

ICS honeypots have demonstrated that control system networks and devices are being targeted. In 2013, Trend Micro published research on a honeypot for a water system that mimicked a real system, including an HMI and other components of an ICS environment. In that research, there were 12 targeted attacks out of 39 total attacks. From March to June 2013, Trend Micro observed attacks originating in 16 countries, accounting for a total of 74 attacks on seven honeypots within the honeynet. Out of these 74 attacks, 11 were considered "critical." Some were even able to compromise the entire operation of an ICS device.

In 2015, Trend Micro released research around the Guardian AST monitoring system using a honeypot called GasPot, which simulated a gas tank monitoring system. The purpose of this honeypot was to deploy multiple unique systems that did not look the same but nonetheless responded like real deployed systems.
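Low-interaction ICS honeypots like the ones described above are, at their core, network listeners that expose an industrial-protocol port and record who connects. The sketch below is a hypothetical minimal example of that idea, a bare TCP listener on the Modbus/TCP port; it is not GasPot or any Trend Micro code, and, as the following paragraphs explain, a believable honeypot needs far more realistic protocol behavior and backstory.

```python
import socket
from datetime import datetime, timezone

LISTEN_PORT = 502  # Modbus/TCP; binding to a port below 1024 may need elevated privileges

def run_listener(port: int = LISTEN_PORT) -> None:
    """Accept connections, log the source, and send nothing useful back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("0.0.0.0", port))
        srv.listen()
        while True:
            conn, (src_ip, src_port) = srv.accept()
            with conn:
                stamp = datetime.now(timezone.utc).isoformat()
                print(f"{stamp} connection from {src_ip}:{src_port}")
                # Read whatever the client sends so the session looks alive,
                # then close; a real honeypot would emulate protocol replies.
                conn.settimeout(5)
                try:
                    conn.recv(1024)
                except socket.timeout:
                    pass

if __name__ == "__main__":
    run_listener()
```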
Trend Micro evolved the ICS honeypot by making it more and more realistic. The goal was to build a honeypot that appeared so real that not even a well-trained control system engineer would be able to tell that it was fake without diving deeply into the system. First, Trend Micro decided on what services and ports would be exposed to the internet to make the honeypot attractive to attackers. At the same time, there was a minimal number of exposed services to prevent the honeypot from being identified as such. Second, Trend Micro created a backstory for the fictitious company, which included made-up employee names, working phone numbers, and email addresses. The honeypot consisted of four PLCs from three different brands: one Siemens S7-1200, two Rockwell MicroLogix 1100 units, and one Omron CP1L. These PLCs were chosen for their popularity in control system markets around the world. Also, each PLC brand used a different protocol. Each PLC was loaded with logic and performed specific and associated tasks that together ran the manufacturing facility. These roles were agitator, burner control, conveyor belt control, and palletizer, which used a robotic arm. To make the manufacturing process realistic, incremental and decremental functions varied the feedback values, which imitated the starting and stopping seen in real motors and heaters. Random generator functions were also created to make slight fluctuations in the feedback values and to simulate actual variations.

Not only are current attackers accustomed to encountering honeypots, but advanced actors also typically perform in-depth investigation, using open-source intelligence (OSINT), for example, before attacking a target system to make sure that they are not about to be "caught" by a honeypot system. For this reason, the honeypot not only needed to look realistic from a design and technical implementation standpoint, but it also had to reflect a system that a real company would use.

The manufacturing honeypot went online in May 2019. For seven months, Trend Micro maintained the image of a real company and monitored the honeypot closely. The first attack we encountered came a month after the honeypot went live, with several others following in its wake. This showed that this sophisticated honeypot, designed as a small business with critical clients and inadequate security, was effective in luring threat actors.

During the May to December 2019 research period, it became apparent that there was increasing activity on the honeypot, with higher levels of interaction from day to day. The longer the honeypot was exposed, the greater the activity that we observed, and the more sophisticated the attacks appeared to be compared to standard penetration-testing techniques. This means that we created openings for attacks that could realistically be found in actual smart factories. This approach also demonstrated the need to have the different parties involved.

Cyberthreats and associated attacks are increasing, especially with more people working from home during the Covid-19 pandemic. This is affecting both IT and OT networks. However, many control system designers and operators assume the cyber risk is only about email, which does not affect their job function, and therefore ignore or do not participate in cyber assessments. Consequently, the nature of control systems leads to a much higher risk than many people appreciate.
Whereas the IT/OT communities operate under the premise of zero trust, the control system community operates under a 100% trust scenario, with many of the key organizations, such as instrument engineers or technicians and safety system engineers or technicians, not even part of many cybersecurity teams.

There is also another aspect, particularly with certain government agencies: a reluctance to make control system cyber information available because of concerns that adversaries might learn from it. Unfortunately, this reluctance to share information affects the defenders, as offensive attackers make it a point to know the latest information.

The most probable control system cyberthreat is the unintentional impact that can come from either the control system engineers and technicians or the cybersecurity personnel. Often, a cyber incident from an insider is automatically tagged as an unintentional incident. This should not always be the case. Another concern about malicious attacks is that they can be made to look like equipment malfunctions, as occurred with Stuxnet. Because there is limited ICS cyberforensics capability and training for control system engineers, most equipment malfunctions are not even investigated as possibly being cyber-related.

Culture and governance issues are critical to securing control systems. However, the governance model treats cybersecurity as a network problem, not an engineering problem. For control systems, this is a problem and it needs to change. The engineering organization is responsible for the control system equipment and understands how the control systems work and their system interactions. Many network-security-induced control system cyber incidents have occurred because of a lack of this knowledge.

In a new study, researchers demonstrate that weaponized disinformation campaigns could also hypothetically be exploited to execute relatively immediate attacks on critical infrastructures, using coercive methods to manipulate citizens into unwittingly wreaking havoc on the places they live. Attackers are becoming better system engineers than defenders, as they generally do not have organization charts, and the resultant silos, to contend with. Often, sophisticated attackers work "backward" by determining what damage they want to cause and then looking for tools to make that happen. For control systems, older network vulnerabilities are often sufficient to cause the desired impact, whereas defenders often focus on the latest network vulnerabilities without considering the physical impact that might or might not be created. Consequently, there is a need to understand and adapt to the myriad approaches attackers are using.

The culture gap has resulted in a focus on network forensics and training for IT/OT network security personnel. There is also a general willingness to share network attack details as the information becomes available. Unfortunately, cyberforensics does not exist for Level 0,1 devices, nor is there training for control system engineers. There has been a reluctance by governments to share actual control system cyber incidents. Control system and equipment vendors are often made aware of control system cyber incidents involving their equipment but cannot share the information because of nondisclosure agreements. Consequently, there has been minimal identification or disclosure of actual control system cyber incidents.
The July 23, 2020, Alert AA20-205A, in which the National Security Agency (NSA) and the Cybersecurity and Infrastructure Security Agency (CISA) recommend immediate actions to reduce exposure across operational technologies and control systems, states that control systems should not be connected directly to IT networks or the internet, or they will be compromised. However, as of 2014, there had been more than 2 million control system devices directly connected to the internet. Despite this and other warnings from authorities, there continues to be a push to connect control systems directly to the internet. The May/June 2015 issue of the ICS-CERT Monitor of the US Department of Homeland Security (DHS) specifically stated (sic): "If You're Connected, You're Likely Infected! Some asset owners may have missed the memo about disconnecting control system from the internet. Our recent experience in responding to organizations compromised during the BlackEnergy malware campaign continues to bring to light this major cybersecurity issue - Internet connected industrial control systems get compromised."

Nature and History of Control System Cyber Incidents

Trend Micro has been tracking threats to ICS environments since the early 1990s. Figure 7 shows some of the most notable attacks on multiple organizations that we tracked from the past decade.

Figure 7. A timeline of industrial cyberattacks and their impact

Figure 8 shows a timeline of publicly identified cyberattacks on control system environments over the past couple of decades. The first nation-state ICS attack intended to cause physical damage occurred in 2010 with Stuxnet, which damaged approximately 20% of the centrifuges in an Iranian centrifuge facility. From the early virus attacks such as Blaster and Zotob, which caused denial of service (DoS) across multiple sectors, to the recent malware and intrusion attempts by hacking groups that have primarily caused operational downtime, the impact has resonated across multiple industries globally.

Figure 8. A timeline of publicly identified cyberattacks on control system environments (entries include Blaster and Slammer in 2003, Stuxnet in 2010, BlackEnergy and KillDisk in 2015, Triton in 2017, and a WannaCry infection that shut down operations at Taiwan Semiconductor Manufacturing Company)

As of October 2020, Applied Control Solutions has amassed a database of almost 1,300 actual control system cyber incidents. Many of these cases are public, although the cyber aspects are not discussed. (The database is not public, but many incidents were provided to us in confidence.) The cases are global in reach and include power (nuclear, fossil, hydroelectric, renewable), electric transmission and distribution, water/wastewater, pipelines, manufacturing, transportation, space, and defense.
The impact ranges from trivial to significant environmental spills, significant equipment damage, and widespread blackouts. There have been more than 1,500 deaths and more than US$70 billion in direct damages to date. The focus of this database is control system cyber incidents that have had physical impact. Consequently, the database does not include the myriad network attacks and network vulnerabilities.

A team at Temple University in Philadelphia maintains a database of ransomware attacks on critical infrastructures. According to the team's leader, Aunshul Rege, her team updates its dataset of critical infrastructure ransomware incidents (CIRWs) that have been publicly disclosed in the media or security reports. This CIRW dataset now has 747 records assembled from publicly disclosed incidents between 2013 and September 2020. These incidents were not counted in our database.

The first control system cyber incident occurred on Feb. 5, 1971. The Apollo 14 astronauts Alan Shepard and Edgar Mitchell were orbiting the moon and preparing to land on board their lunar module. A rogue bit of solder floating around inside an emergency switch in the vehicle was shorting it out, thereby activating the abort button. In order to save the mission, Don Eyles, a computer engineer who worked on the computer systems in the lunar module, had to hack his own software. He came up with a few lines of instructions that he sent to the astronauts to lock out the emergency switch behavior. This enabled Apollo 14 to land on the moon later that day.

The first control system cyberattack occurred in February 1992 at the Ignalina Nuclear Power Plant in Lithuania, after which authorities arrested a computer programmer for attempting to sabotage the reactor with a computer virus. The first control system cyber incident that resulted in fatalities was the Olympic Pipeline Company's gasoline pipeline rupture in June 1999. The first publicly known control system cyberattack was the Maroochy Shire wastewater incident in March 2000, where more than 750,000 liters of sewage were dumped on the grounds of a hotel. The first cyber-related physical attack was the March 2007 Aurora demonstration at the Idaho National Laboratory, where a diesel generator was destroyed by remotely opening and closing relays, causing the system to go into a "forbidden" operating zone. The first nation-state targeted cyberattack was Stuxnet in 2010. The first widespread control system cyberattack for economic reasons occurred in the Volkswagen cheat device case in 2015, which affected approximately 800,000 Volkswagen and Audi vehicles. Arguably the first case where a nation-state installed rogue hardware devices in control system equipment (it is unclear whether the June 1982 Siberian pipeline explosion was truly cyber or not) was the hardware backdoors installed in a large electric transformer in August 2019. This case resulted in the issuance of Presidential Executive Order 13920 in the US.

One of the first significant cases where an unintentional cyber incident appeared to be a cyberattack was the penetration testing of protective relays in a large electric utility in 2017. In this case, the security group was scanning data center assets and then expanded the scanning into North American Electric Reliability Corporation Critical Infrastructure Protection (NERC CIP) substations, starting primarily at the 230/500 kV level.
The security group had no previous experience with scanning substations. No notification of the scanning change was given to the internal support groups responsible for this function. The OT team was notified that substation scanning had started with a new security port scanning tool. Following the scans, the relays showed trouble, but the DNP (Distributed Network Protocol) polling was working properly and the networks in most substations were stable, so SCADA was unaware of the problems. The port scanning of this new tool caused the real-time protocol operation of the relays (IEC 61850/GOOSE) to stop and suspend operation at the CPU (for two different relay suppliers) while leaving the DNP/non-real-time operations alone, the worst possible circumstance. To clear the trouble and restore operation, each relay had to be cut out and rebooted. Several hundred relays were affected, and all the devices in each substation were affected at the same time in every case.

Without knowing that a security scan had been initiated, it looked like a distributed denial-of-service (DDoS) attack resulting in equipment malfunction. IEC 61850 was one of the protocols affected. Additionally, a unique port scanner was used, which had the effect of a DoS disruption of relays that had to be manually reset. According to ESET's report on Industroyer, the attackers' arsenal included a port scanner that could be used to map the network and to find computers relevant to their attack. Instead of using existing software, the attackers built their own custom-made port scanner. The attackers could define a range of IP addresses and a range of network ports to be scanned by this tool. Another tool from the attackers' arsenal was a DoS tool that exploited the CVE-2015-5374 vulnerability to render a device unresponsive. (This vulnerability disclosure was for one specific vendor's relays, and the question is how vulnerable other vendors' relays would be.) Once this vulnerability is successfully exploited, the target device stops responding to any commands until it is rebooted manually. To exploit this vulnerability, the attackers hard-coded the device IP addresses into the tool. Once executed, the tool sends specifically crafted packets to port 50,000 of the target IP addresses using UDP (User Datagram Protocol). Because the impact at this utility was very similar to that of the Industroyer malware, this utility event could have been mistaken for a "test" run of the Industroyer malware.

The similarities between the impact at this utility and that described in the Industroyer report raise these questions:

+ Was this event totally coincidental to the impact of Industroyer? If so, what other unintentional incidents can cause equipment problems and be indistinguishable from cyberattacks?
+ Was the Industroyer malware somehow loaded onto the penetration tester's software because the attackers knew the utility's substation configuration? If so, why did the utility's cybersecurity program not detect the malware, particularly after being informed of Industroyer? Is the malware still resident? How many other utilities would be incapable of detecting this malware?
+ Did the developers of Industroyer know that "innocent" penetration-testing software could cause this kind of impact? If so, was the Industroyer malware developed to mimic the unintentional impact, making malware detection very difficult at best?
+ How many other software products that could cause grid disruptions have been mimicked?
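Because the scan's effects were so close to those of the Industroyer DoS tool, which sends crafted UDP packets to port 50,000 of targeted relays, one partial safeguard is to alert on any UDP traffic to that port aimed at protection-relay addresses, regardless of who sends it. The sketch below shows hypothetical detection logic of that kind over exported flow records; the CSV column names, file name, and relay subnet are assumptions for illustration, not a description of any utility's monitoring.

```python
import csv
import ipaddress

# Hypothetical inventory of protection-relay management addresses.
RELAY_SUBNETS = [ipaddress.ip_network("10.20.0.0/24")]
SUSPECT_PORT = 50000  # UDP port used by the Industroyer DoS tool (CVE-2015-5374)

def is_relay(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RELAY_SUBNETS)

def flag_suspect_flows(flow_csv: str) -> list:
    """Return flow records that send UDP traffic to port 50000 on a relay.

    Expected columns (assumed): src_ip, dst_ip, protocol, dst_port.
    Legitimate operations traffic should never match, so even a benign
    scanner tripping this rule deserves a phone call before it continues.
    """
    hits = []
    with open(flow_csv, newline="") as fh:
        for row in csv.DictReader(fh):
            if (row["protocol"].upper() == "UDP"
                    and int(row["dst_port"]) == SUSPECT_PORT
                    and is_relay(row["dst_ip"])):
                hits.append(row)
    return hits

if __name__ == "__main__":
    for record in flag_suspect_flows("netflow_export.csv"):  # assumed export file
        print("ALERT:", record["src_ip"], "->", record["dst_ip"], "udp/50000")
```

In practice, such a rule would be fed from whatever flow or packet capture the utility already collects; the design point is that the alert keys on physics-relevant destinations rather than on scanner or malware signatures, so it fires whether the traffic is an attack or an ill-advised penetration test.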
These cases lead to the inability to clearly distinguish between unintentional impact and a cyberattack, making it difficult at best to meet NERC CIP and NEI 08-09 malware identification requirements. These cases also make it clear that before using new penetration-testing software, there is a need to test the existing relays offline prior to using that software in a live condition.

The February 2017 report of the Office of Inspector General of the National Aeronautics and Space Administration (NASA) provided these three case histories where IT technologies had an impact on control systems and operations, demonstrating the need for engineering and cybersecurity "convergence":

+ A large-scale engineering oven that uses OT to monitor and regulate its temperature lost this ability when a connected computer was rebooted after application of a security patch update intended for standard IT systems. The reboot caused the control software to stop running, which resulted in the oven temperature rising and a fire that destroyed spacecraft hardware inside the oven. The reboot also impeded alarm activation, leaving the fire undetected for 3.5 hours before it was discovered by an employee.
+ Vulnerability scanning used to identify software flaws that could be exploited by an attacker caused failure of equipment and loss of communication with an Earth science spacecraft during an orbital pass. As a result, the pass was rendered unusable and data could not be collected until the next orbital pass.
+ Disabling of a chilled water heating, ventilation, and air conditioning (HVAC) system supporting a data center caused temperatures to rise 50 degrees in a matter of minutes, forcing a shutdown to prevent damage to critical IT equipment.

Cybersecurity Strategy and Security Controls

Reducing cyber risk in an ICS environment requires significant understanding of the network environment, including the sensors, the process controls, the protocols, and the communications across each level of the Purdue Reference Model (as shown in Figure 2), as well as the threats to and the vectors in the environment. The cyber risk affects all types of industries, including power, oil and gas, manufacturing, pharmaceutical, healthcare, and transportation, among others, and it is recommended that every organization implement a cybersecurity strategy. Implementing cybersecurity correctly in industrial control and critical infrastructure environments is critical because the wrong strategy and security controls can cause significant operational downtime or create safety issues. The reasons are:

+ Many organizations, especially on the OT side, have a shortage of domain knowledge and expertise.
+ There is uncertainty as to the roles and responsibilities or governance across the organization.
+ It is difficult to improve the return on investment (ROI).
+ Some organizations tend to put the economic considerations or profitability of the company over cybersecurity.

In addition to these business challenges, there are technical challenges to overcome. These include:

+ Unknown devices and connections to the OT network (shadow OT).
+ Lack of security in the original design.
+ Vulnerable and unsecure third-party applications and operating systems.
+ Legacy systems and environments that have been around for many decades and that may not be able to be secured.

Recommendations

To ensure the effectiveness of a cybersecurity strategy, there must be a cohesive interaction among its four pillars: people, process, technology, and culture.

People: No matter the industry it is in, any organization needs to develop appropriate training on cybersecurity and other people skills, as there is a need for integration throughout the organization. It also needs to formulate guidelines and lay out a plan that clearly outlines the collaboration between IT and OT teams, further strengthening the human factor. Over 62% of the respondents in a 2019 SANS survey on the state of OT/ICS cybersecurity considered the human factor the greatest security risk to their operations, yet most organizations' security budgets for it were less than US$100,000 in 2019. This needs to change: since a trained and security-aware organization can reduce security risk significantly, people should be the starting point of any organization's cybersecurity strategy investment.

Process: For a cybersecurity strategy to be successful, an organization must develop and implement procedures while ensuring that clear roles, responsibilities, and management systems are put in place. In addition, the process needs to include a governance, policy, and best practice framework, and it must be periodically tested to evaluate and ensure its efficacy. If the process is forgotten or broken, it could lead to cyber risk. The 2019 SANS survey left much to be desired in this regard: it found that only 14% of the respondents considered process the greatest security risk, and nearly half of organizations' budget allocation for it did not exceed US$500,000 in 2018.

Technology: The lack of understanding of appropriate OT deployment has led to many control system cyber incidents resulting in downtime or even creating safety challenges for organizations. For example, deploying an IT vulnerability scanner in an OT environment can cause process control system shutdowns because of incompatibility of protocols, the types of applications running, or operating systems. Cybersecurity stakeholders across the executive, IT, and OT segments of an organization should therefore ensure the proper testing, integration, and use of technology in the IT/OT environment. In the 2019 SANS study, only 22% of the respondents considered technology deployment the greatest security risk, yet most organizations did not invest sufficiently in OT/ICS security compared with their IT budget, which exceeded US$1 million in the case of more than 40% of them.

Culture: The 2019 SANS survey found that around 84% of organizations had already adopted or were planning to adopt an IT/OT convergence cybersecurity strategy. But for any such strategy to be effective, a cyber culture, which is essential to the reduction of cyber risk, must be developed and implemented within the organization. The fostering of this culture must start from the top and be communicated down to all levels of the organization. For the OT environment, this includes the engineering and operational organizations responsible for the equipment that is being secured.

Figure 9 summarizes the key components of the four pillars of an effective cybersecurity strategy.

Figure 9. The four pillars of an effective cybersecurity strategy (people, process, technology, and culture)

A good place to start in developing and implementing a successful cybersecurity strategy is with the industry standards that your organization needs to align with. Depending on your industry, you will need to look at standards that can be applied to your business or vertical. Industry standards, examples of which are indicated in Figure 10, can provide best practices for protecting ICSs.

Figure 10. Examples of industry standards

Next, develop a cybersecurity framework that can be appropriately applied to your organization's goals and objectives. Ensure you have the right people, process, technology, and culture to deal with different cyber incidents and to be able to recover quickly to ensure business continuity. At the same time, ensure that cyber incidents will not happen again in the future, or that if they do happen again, they will have minimal impact on the business and its operation.

Figure 11 shows a high-level example of a cybersecurity framework, illustrating the process that an organization needs to go through in the event of a cyberattack. Figure 12 shows a high-level example of a network architecture in a large enterprise, including both IT and OT environments and the different cybersecurity technologies that could be applied, as well as the risks and threats that could affect the environment.

Understand threats, their capabilities, and their risks to systems, assets, and data. Plan to anticipate
