This document summarizes a presentation about the tracking and implications of the Stuxnet computer worm. Stuxnet targeted Siemens industrial control systems and was designed to damage Iranian nuclear centrifuges. It spread using five Windows exploits and a hard-coded Siemens database password to infiltrate industrial networks. Stuxnet hid its activities using rootkit techniques and destroyed centrifuges by covertly manipulating their rotational speeds. Its discovery revealed vulnerabilities in critical infrastructure protection and demonstrated that industrial systems could be attacked remotely for sabotage.
6. Targeted Attack Stats
Worldwide, by industry sector, since 2008
18,172 targeted attacks during 2010 (Targeted Attacks - Infosec)
7. Targeted Attacks
Phase-by-phase comparison: Mass Attack vs. Targeted Attack

Incursion
  Mass Attack: Generic social engineering; by-chance infection method
  Targeted Attack: Handcrafted & personalized delivery

Discovery
  Mass Attack: Typically no discovery; assumes pre-defined content in a predictable location
  Targeted Attack: Examination of the infected resource; monitoring of the user; determining accessible resources & network enumeration

Capture
  Mass Attack: Pre-defined specific data matching a pre-defined pattern (e.g., credit card numbers)
  Targeted Attack: Manual analysis & inspection of the data

Exfiltration
  Mass Attack: Information sent to a dump site with little protection; the dump site is long-term storage
  Targeted Attack: Information sent back to the attacker; not stored in one location for an extended period
8. What?
1. Windows computer worm discovered in July 2010
2. 100k+ lines of code (complex)
3. 5 different exploits (4 zero-days, plus the previously patched MS08-067)
   1. LNK file bug - initial auto-exploitation via removable drive
   2. Task Scheduler - privilege escalation on Vista and later
   3. Keyboard Layout - privilege escalation on XP
   4. Print Spooler / MOF files - spreading/lateral movement
   5. SMB vulnerability (MS08-067) - spreading/lateral movement
4. Rootkit (hiding binaries)
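The LNK vector (CVE-2010-2568) meant that merely browsing an infected USB drive in Explorer caused Windows to load Stuxnet's payload. As an illustration only, the toy heuristic below flags a directory listing that matches the publicly reported on-disk pattern: shortcut (.LNK) files alongside the ~WTR4132.TMP / ~WTR4141.TMP payload file names. The function name and the rules are illustrative assumptions, not a real scanner.

```python
# Hedged sketch: a toy heuristic for the Stuxnet LNK-style infection vector.
# ~WTR4132.TMP and ~WTR4141.TMP are the payload names publicly reported for
# Stuxnet on infected removable drives; everything else here is illustrative.
from pathlib import PurePath

KNOWN_PAYLOAD_NAMES = {"~wtr4132.tmp", "~wtr4141.tmp"}

def flag_removable_media(file_names):
    """Return entries matching the reported pattern: shortcut files
    plus the known payload file names (case-insensitive)."""
    hits = []
    for name in file_names:
        lower = name.lower()
        if PurePath(lower).suffix == ".lnk" or lower in KNOWN_PAYLOAD_NAMES:
            hits.append(name)
    return hits

# Example: a listing from an infected USB stick trips both rules.
listing = ["report.docx", "Copy of Shortcut to.lnk", "~WTR4132.tmp"]
print(flag_removable_media(listing))
# ['Copy of Shortcut to.lnk', '~WTR4132.tmp']
```

A real detector would parse the LNK binary format and inspect the CPL/DLL target, but the point of the vector is the same: the exploit fired on icon rendering, with no user double-click required.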
28. Siemens Infections
[Bar chart: Distribution of Infected Systems with Siemens Software, by country. Iran leads at 67.60%; the remaining countries account for 12.15%, 8.10%, 4.98%, 2.18%, 2.18%, 1.56%, and 1.25%.]
34. 18 Critical Infrastructure Sectors
Homeland Security Presidential Directive 7 (HSPD-7), along with the National Infrastructure Protection Plan (NIPP), identified and categorized U.S. critical infrastructure into the 18 CIKR sectors.
35. Cross-Sector Interdependencies
Control systems security is not sector specific
Connectivity crosses geographic boundaries
Sectors are not operationally isolated
Countries other than Iran are likely to be collateral damage
CEOs and the technologists who work for them like to say the applications they rely on, especially the kind custom-written by specialists at banks and investment companies with fortunes behind them, are safe as houses. And they are, if you're talking about houses in Louisiana when the Gulf starts lashing hurricanes and tarballs.

Almost 60 percent of all the applications brought to security testing and risk-analysis company Veracode during the past 18 months couldn't meet the minimum standards for acceptable security, even when the criteria were dialed down to accommodate applications that don't pose a great security risk, according to Samskriti King, vice president of product marketing at the company.

Web-based apps carry their own special set of risks. "There are far more people on Web projects because they're often easier to develop; many components are already available so you can stand up Web applications very easily," King says. "Developer education usually focuses on applications generated and used in one place, but Web applications could touch many places, so a vulnerability in one component could manifest in many places if it's reused."

Unfortunately, developers trained with software that's generated and used in one location with a single set of servers often don't understand the precautions needed for Web applications that take code, data, and elements of the interface from many servers, she says.

[For more background on securing Web-based apps, see 5 Problems with SaaS Security.]

The typical number of security flaws, especially in legacy or other homegrown software, must be taken into account by cloud-computing service providers, says Thomas Kilbin, CEO of cloud and hosted-server provider Virtacore Systems. After all, he says, customers who want on-demand compute capacity don't want to rewrite all their applications just to run in an environment designed to save money and add convenience.
"Our customers are taking apps they had running in their back office and moving them to private clouds for the most part," Kilbin says. "They are not developing any apps geared towards only working in a cloud IaaS/SaaS model. We secure these apps via a number of methods, traditional firewalls, app specific firewalls from Zeus, etc." Keeping Web-based apps secure can be particularly tough for smaller IT teams. "The cloud model is more threat-rich than the shared hosting model, mainly because in shared hosting the core OS and apps—php, perl, mysql—are kept updated by the service provider," Kilbin says. "In the cloud, the customer has to keep the core OS updated, along with the application stacks, in addition to their code." Most customers don't have the expertise or the time to do so, Kilbin says. Some 2,922 applications were examined by Veracode in the past 18 months, with the results detailed in the company's recently released State of Software Security Report: The Intractable Problem of Insecure Software . Some of the applications sent to Veracode for testing come from ISVs or corporate programmers in the last stages of development. Another big chunk comes from developers who have to present certifications or risk analyses before closing a deal with government agencies or heavily regulated industries. Old App Flaws Revealed Before Web Moves Increasingly, however, Veracode is testing software that clients have used for a long time or are very confident in, but are now migrating to a cloud or Web-based service environment. The requests often come from corporate IT executives who turn out to be wrong in believing that their secure, homegrown applications are either homegrown or secure, especially when they're moved into multi-site environments for the first time. Both commercial and open-source applications failed Veracode's tests more often than homegrown—at 65 percent and 58 percent respectively. 
Homegrown applications failed 54 percent of the time, Veracode reports. Software written by outsourcing firms missed the mark an astonishing 93 percent of the time, Veracode says. Even applications used by banks and financial services companies failed 56 percent of the time on initial submission, though the criteria are tougher for those applications, because problems in those apps would create more havoc than, say, in an internally developed server-monitoring application, King says.

Internal developers shouldn't be complacent by comparison, however, King says. Though internal apps are generally assumed to be made of 70 percent homegrown code, reuse of code, objects, and procedures is so common that between 30 percent and 70 percent of the code in homegrown applications actually came from commercial software. Internal developers are also largely unaware of the most common exploits likely to be used against Web-fronting applications, resulting in an 80 percent failure rate for Web applications, which are tested against the list of the 10 most common security threats published and publicized by the Open Web Application Security Project (OWASP), King says. "At that point it just comes down to developer education," King says.

Cross-site scripting is the most common security flaw in all the types of software Veracode tests, but it is most noticeable in Web- and cloud-based software, King says. But the time it takes to fix problems and get an application to an acceptable level of security has dropped drastically, from 30 to 80 days a year or two ago to only 16 days now, mainly because developers of all stripes are putting greater emphasis on security, software quality, and shortening their time to market, King says.

There aren't any shortcuts, but Veracode does have some suggestions for IT teams to counter the most consistent app security problems:

1. Design apps assuming they'll link cross-site; secure those links and the processes that launch them.
Cross-site scripting (XSS) accounts for 51 percent of all vulnerabilities, according to Veracode. Apps written in .NET have an abnormally high number of XSS issues because many .NET controls don't automatically encode data before it is sent or stored. Check and encode all points of output. Inadequate or absent output encoding in non-.NET applications also created problems, but these are easy to fix once the sources of the unencoded output are identified.

2. Focus your efforts on the greatest source of vulnerabilities. You can assume software from any provider is likely to have vulnerabilities, but put extra QA and security-analysis effort into code from outsourced programming services and ISVs, and into components from either of those that find their way into homegrown applications.

3. Verify the security of the application itself in a cloud or SaaS environment. Whether the customer or the service provider supplies the application, check it for flaws or vulnerabilities in a realistic cloud/SaaS/shared-resource environment, not just in a workgroup on a LAN. Security in cloud platforms is still evolving, and the skills to write secure code for them are not widespread. Stick extra red flags on this part of your project plan.

4. Location is irrelevant. The new criteria are impact, impact, impact. A printer-management application with a flaw that allows hackers to draft a LaserJet into a bot army can cause headaches. An accounting, customer-data-management, or cashflow-automation app with a backdoor can put you out of business. Use level of risk as a multiplier to determine how important a particular app is to evaluate, and how much time or money you should spend getting it fixed.

5. Don't ignore the basics. The 10 most common attacks on Web applications are listed here by OWASP. The 25 most significant security errors that appear in applications are listed here.
They're easy to read and come with extra help to fix or avoid errors already known by everyone who might want to hack your systems.