
Program Security Richard Newman

What is Program Security? Security of executing software:
- Make software that functions properly (no bugs)
  – Good development practices/software engineering
- Make sure you run good software
  – Known, trustworthy source
- Make sure software that is run has not been changed
  – During distribution
  – While on host waiting for execution
- Make sure processes can't do bad things
  – Restrict process access (usual protection mechanisms) – a minimal sketch follows this slide
  – Sandbox
- Monitor processes in case they do bad things anyway
  – Audit
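A minimal sketch, assuming a POSIX system, of one crude way a launcher can restrict a process before it runs untrusted code: resource limits set with setrlimit. The program name ./untrusted and the specific limits are illustrative; real sandboxes (seccomp, jails, containers, VMs) restrict far more than resources.

```c
/* Sketch: cap CPU time and maximum created-file size, then exec the target.
 * "./untrusted" is a hypothetical program; the limits are inherited across exec. */
#include <stdio.h>
#include <sys/resource.h>
#include <unistd.h>

int main(void) {
    struct rlimit cpu = { .rlim_cur = 5,       .rlim_max = 5 };        /* 5 s of CPU */
    struct rlimit fsz = { .rlim_cur = 1 << 20, .rlim_max = 1 << 20 };  /* 1 MiB files */

    if (setrlimit(RLIMIT_CPU, &cpu) != 0 || setrlimit(RLIMIT_FSIZE, &fsz) != 0) {
        perror("setrlimit");
        return 1;
    }
    execl("./untrusted", "untrusted", (char *)NULL);   /* run the restricted program */
    perror("execl");                                   /* only reached if exec fails */
    return 1;
}
```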

What is a Programmed Threat? Potential source of harm from computer code. May be in form of:
- Executable program
- Executable code attached to another program
- Executable code pushed onto stack of running process
- Standalone script
- Commands run on startup of program
- Commands embedded in "non-executable" file
  – JPEG
  – Postscript
- Macros

Examples of Programmed Threats
1. Trojan Horse
  – Program that purports to do one thing but (also) does another
2. Virus
  – Embedded in another program/file (becomes a Trojan)
  – Must get user or system to run program/open file
  – Infects other files/drives
  – Hitchhikes to other file systems on host files via removable media or e-mail
3. Bacteria/Rabbits
  – Replicate so fast they use up all resources
4. Worm
  – Stand-alone program
  – Transfers itself to target system
  – Runs automatically on target system (generally)

More Programmed Threats
5. Buffer overflow attack
  – "Improper" parameters corrupt the stack
  – Includes executable code
  – Return pointer in activation frame may be changed to point to that code (see the sketch after this slide)
6. SQL Injection
  – Interpretable commands included in SQL query
  – SQL engine executes malicious commands
7. Run command script
  – Malicious commands included in .rc (or similar) file
  – Commands executed when program is started
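A minimal sketch of the buffer overflow flaw in item 5, assuming a C program compiled without stack protection. The function name and buffer size are illustrative, not from the slides; input longer than the buffer overruns it and can overwrite the saved return pointer in the activation frame.

```c
/* Classic stack buffer overflow: no bounds check on attacker-controlled input. */
#include <stdio.h>
#include <string.h>

static void greet(const char *name) {
    char buf[16];               /* fixed-size buffer on the stack */
    strcpy(buf, name);          /* no bounds check: the flaw */
    printf("Hello, %s\n", buf);
}

int main(int argc, char *argv[]) {
    if (argc > 1)
        greet(argv[1]);         /* attacker controls argv[1] */
    return 0;
}
```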

More Programmed Threats
8. Back Door/Trap Door
  – "Secret" way to get access to system
  – May be included for field technicians or administrators
  – Often first goal of intruders
9. Covert Channels
  – Violate information flow policy
  – Concern in MultiLevel Secure (MLS) systems
  – Type of Trojan Horse
10. Bugs
  – Most common :(
  – Traditionally, most costly

Exposures
1. Unmediated Access
  – Trap door/back door
  – Worm
  – Buffer overflow
2. Information Leaks
  – Covert channel
  – Virus or worm activity
  – Trap door/back door
3. Logic & Time Bombs
  – Trojan Horse
  – Virus/worm activity
  – Take special action when triggered by conditions
  – Time bomb a special case – condition is time (see the sketch after this slide)
4. Unavailability
  – Rabbits, worms, botnets
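A minimal sketch of the time-bomb pattern from item 3: a benign placeholder "payload" that fires only once a hard-coded trigger date has passed. The date and payload are hypothetical, purely to illustrate the triggered-by-condition idea.

```c
/* Sketch of a time bomb: the "special action" runs only after the trigger date. */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct tm trigger = {0};
    trigger.tm_year = 2030 - 1900;   /* trigger date: Jan 1, 2030 (illustrative) */
    trigger.tm_mday = 1;

    if (time(NULL) >= mktime(&trigger))
        puts("payload would run here");          /* the triggered special action */
    else
        puts("condition not met; behave normally");
    return 0;
}
```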

Virus Desiderata
1. Detection Resistant
  – Evade detection by stealth measures
2. Robust
  – Hard to deactivate/remove/destroy
3. Infectious
  – Wide-ranging
  – Reinfection
4. Easy to create
5. Machine/OS/Application-independent
  – Able to infect wide variety of targets

Virus Dimensions
1. Lifetime
  – Transient – run once each time program launched
  – Resident – continue to run periodically or on events
2. Target
  – Boot sector
  – TSR code
  – Library code
  – Application
  – Document/image
3. Attachment method
4. Infection route
  – Removable media – diskettes, CD/DVD, thumb drives
  – E-mail/MIME
  – Downloaded files – FTP/HTTP

Viruses
1. History
  – Von Neumann's self-reproducing automata in the 1960s
  – First seriously appeared in early 1980s – Elk Cloner, Brain
  – Big issue with PCs and floppy disks/bulletin boards
2. General MO
  – Infected program run – viral code runs first
  – Optionally takes measures to hide
  – Looks for new files/drives to infect, infects them
  – Does "other stuff"
    – Logic bomb
    – Time bomb
    – Password cracking
    – Install back door
    – Wreak havoc
  – Returns control to original program

Viruses
3. Boot Sector Virus
  – Copies boot sector (small bootstrap program) to unused disk block
  – Overwrites boot sector with viral code
  – Intercepts calls to disk drive/TSR code
  – Redirects reads of boot sector to read copy in other location
  – Looks for new disk to infect whenever disk is accessed
4. Executable Virus
  – Adds viral code to executable program
  – May rewrite JUMP instruction to jump to viral code first, then issue JUMP to program code when done
  – May modify itself (code transformation) or modify where it is stored to evade detection (polymorphic virus)

Viruses
5. Macro Virus
  – Included in "non-executable" file with format supporting macros
    – Spreadsheets
    – Document preparation software
    – Graphics editors
  – Copies macros into other files of same type
  – Modifies file contents to exercise macros
Any format that has "active content" can provide a way for a virus to take hold!

Basic Virus Figure 3-4 Virus Appended to a Program.

Virus attachment via GOTOs Figure 3-5 Virus Surrounding a Program.

Virus attachment inline Figure 3-6 Virus Integrated into a Program.

Virus replacement Figure 3-7 Virus Completely Replacing a Program.

Boot Sector Virus Figure 3-8 Boot Sector Virus Relocating Code.

Virus Detection
1. Recognize storage patterns
  – Modification date of file
  – Size of file
2. Recognize content change
  – Specific pattern in code
  – Checksum change (see the sketch after this slide)
  – MAC change
3. Recognize viral programs
  – Static code analysis – limited by undecidability (halting problem)
  – Still, can do triage!
4. Detect bad execution patterns
  – Attempt to access inappropriate files
  – Attempt to open abnormal network connections
  – Abnormal system call sequences
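A minimal sketch of the checksum approach in item 2: hash a file and compare it against a stored baseline. The FNV-1a hash and the file names are illustrative; a real integrity checker would use a cryptographic hash or keyed MAC and would protect the baseline database itself.

```c
/* Sketch: recompute a file hash and compare to a baseline recorded earlier. */
#include <stdio.h>
#include <stdint.h>

static uint64_t fnv1a_file(const char *path) {
    FILE *f = fopen(path, "rb");
    if (!f) return 0;
    uint64_t h = 14695981039346656037ULL;       /* FNV-1a 64-bit offset basis */
    int c;
    while ((c = fgetc(f)) != EOF) {
        h ^= (uint64_t)(unsigned char)c;
        h *= 1099511628211ULL;                  /* FNV-1a 64-bit prime */
    }
    fclose(f);
    return h;
}

int main(void) {
    uint64_t baseline = fnv1a_file("program.bin.baseline");  /* hash recorded at install */
    uint64_t current  = fnv1a_file("program.bin");           /* hash of file on disk now */
    puts(current == baseline ? "unchanged"
                             : "ALERT: checksum changed - possible infection");
    return 0;
}
```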

Virus Controls Figure 3-9 Recognizable Patterns in Viruses.

Virus Stealth Methods
1. Modify system meta-information
  – Modification date
  – Access date
  – Process information
  – File control block/i-node table/SFT/etc.
2. Intercept system calls
  – Modify call/results (man-in-the-middle)
3. Compress target and itself
  – So file size does not change
4. Modify itself
  – Polymorphism (don't change functionality)
  – Evolution (change functionality)
5. Encryption
  – Viral code encrypted to hide purposes, methods
  – Also gives "free" polymorphism to some extent

Virus Controls
1. Back-ups/restore points
2. Buy COTS software from reliable vendors
3. Test new code on isolated system
  – Observe behavior
  – Fiddle with date/time
4. Run virus scanner
  – Keep up to date (always behind!)
  – Test outgoing as well as incoming files
5. Access control
  – Limit damage of infected programs to user running code
6. Hardware-based protection
  – Prevent damage to other processes, illegal execution
  – Protected instructions, mode bit(s), VM, write protection, etc.
7. File signatures
  – Audit files, system configuration, OS, applications, libraries, etc.

Worms
1. History
  – 1971 "Creeper virus" at BBN – "Reaper" to kill it
  – Name coined in Brunner's 1975 "The Shockwave Rider"
  – Enabled by network/LAN technology
  – Xerox PARC worm for using idle workstations (1982)
  – Morris worm 1988
  – Code Red, etc.
2. General MO
  – Standalone program
  – Looks for target host
  – Transfers loader (micro-FTP) to target host

PARC Worm
3. Xerox PARC worm
  – Users ran server pgm on W/S when idle
  – Worm "head" found idle workstations, sent work
  – "Segments" did work, reported to head
  – Head had backup segments also
  – Had to shut down all stations to get it to stop!
  – See Shoch and Hupp, "The Worm Programs: Early Experience with a Distributed Computation," Xerox Palo Alto Research Center, 1982.

Morris Worm
4. Morris worm – November 2, 1988
  – Experiment by grad student at Cornell University
  – Looks for target host – random, /etc/hosts, .rhosts, hosts.equiv
  – Tried to get access
    – Sendmail "feature" – debug mode
    – Symmetry of trust
    – Finger flaw – buffer overflow
    – Password guessing, common accounts/passwords
  – Transferred "grappling hook" to target host (boot loader)
  – Grappling hook got rest of worm, ran it (one-time password)
  – Overwhelmed hosts with processes
  – Overwhelmed networks with traffic

Morris Worm
4. Morris worm (con't)
  – Stealth techniques
    – "Encrypted" code (flipped MSB in ASCII)
    – Changed process name to innocuous pgm
    – Changed process ID periodically – short life per proc
    – Died completely after short time
  – Sendmail access
    – Back door, poor configuration, poor interface
  – Symmetry of trust
    – Remote login without password required
    – Host lists trusted hosts
    – If host B is on list of A, likely host A is on list of B
  – See spaf.cerias.purdue.edu/tech-reps/823.pdf

Code Red Worm
5. Code Red Worm – July 2001
  – Attacked MS IIS
    – Buffer overflow attack
    – Patch had been available for a month
  – Spread
    – Only 1st–19th of month – look for other IIS servers
    – Did not determine if IIS server was vulnerable first
  – Mischief
    – Deface website – "Hacked by Chinese"
    – Launch DoS attack 20th–27th of month vs. fixed IP addr

Code Red Worm
5. Code Red Worm IIS buffer overflow:
GET /default.ida?NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNN%u9090%u6858%ucbd3%u7801%u9090%u6858%ucbd3%u7801%u9090%u6858%ucbd3%u7801%u9090%u9090%u8190%u00c3%u0003%u8b00%u531b%u53ff%u0078%u0000%u00=a HTTP/1.0
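A minimal sketch, in C, of signature-based detection for the request above: flag an HTTP request line that targets /default.ida and carries a long run of 'N' padding followed by %u-encoded bytes, as Code Red's exploit did. The length threshold and the sample request built in main are illustrative.

```c
/* Sketch of a Code Red request signature check. */
#include <stdio.h>
#include <string.h>

static int looks_like_code_red(const char *req) {
    const char *p = strstr(req, "GET /default.ida?");
    if (!p) return 0;
    p += strlen("GET /default.ida?");
    size_t pad = strspn(p, "N");               /* length of the 'N' padding run */
    return pad > 100 && strstr(p + pad, "%u") != NULL;
}

int main(void) {
    char req[512] = "GET /default.ida?";
    size_t n = strlen(req);
    memset(req + n, 'N', 224);                 /* N padding, as in the real exploit */
    req[n + 224] = '\0';
    strcat(req, "%u9090%u6858%ucbd3%u7801 HTTP/1.0");
    printf("%s\n", looks_like_code_red(req) ? "match" : "no match");
    return 0;
}
```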

Worm Controls
1. Prevent remote access
  – Only connect to network if necessary, when necessary
  – Shut down unneeded servers
  – Use firewall to limit access
  – Use VPNs
2. Protect remote access points
  – Dial-in callback
  – Proper configuration
3. Limit possible damage
  – Run servers at lowest possible privilege level (see the sketch after this slide)
  – Run servers on special hosts with limited access
  – Remove general utilities/tools from server hosts
4. Monitor
  – Look for unusual access/behavior patterns, traffic
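A minimal sketch of item 3's "lowest possible privilege" advice, assuming a POSIX server started as root: after binding its privileged port it permanently drops to an unprivileged account. UID/GID 65534 ("nobody") is an assumed account; a real server would look it up with getpwnam and handle partial failures carefully.

```c
/* Sketch: drop root privileges so a later compromise gains only "nobody"'s rights. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void) {
    /* ... as root: bind the privileged listening port here ... */

    if (setgid(65534) != 0 || setuid(65534) != 0) {   /* drop group, then user */
        perror("failed to drop privileges");
        exit(1);
    }
    /* From here on the process runs with the unprivileged account's rights. */
    printf("now running as uid %d\n", (int)getuid());
    return 0;
}
```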

Trapdoors
- Allow unauthorized access
- Local
  – Magic password
  – Unauthorized user name/password
- Remote
  – Can be used for remote administration
  – Allows access over network
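A minimal, purely illustrative sketch of a "magic password" trapdoor: the legitimate check is bypassed when a hard-coded string is supplied. The function and the string are hypothetical; this is exactly the kind of undocumented "feature" the controls on the next slide (testing, code reviews) aim to catch.

```c
/* Sketch of a magic-password trapdoor hidden in an authentication routine. */
#include <stdio.h>
#include <string.h>

static int check_password(const char *user, const char *pw) {
    (void)user;
    if (strcmp(pw, "joshua") == 0)      /* the trapdoor: hard-coded magic password */
        return 1;
    /* ... normal verification against the password database would go here ... */
    return 0;
}

int main(void) {
    printf("%s\n", check_password("anyuser", "joshua") ? "access granted" : "denied");
    return 0;
}
```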

Trapdoor Controls
1. Testing for undocumented "features"
2. Code reviews
3. Maintenance – verify patches/updates
4. Monitor for logins/remote accesses
5. Check for input testing/bounds checking
6. Test for undefined machine opcodes

Buffer Overflow Figure 3-1 Places Where a Buffer Can Overflow.

More directed malicious threats
- Salami attack
  – Collect small amounts of money/time/space
  – Remain undetected ("in the noise") – many drops of water make up the sea...
  – Remain because of rounding errors, poor processes, poor audit (see the sketch after this slide)
- Covert channels
  – Information leakage against system policy
  – Generally a modified utility program
  – Privileged user runs Trojan horse
  – Info transmitted in unusual way, "in the noise"
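A minimal sketch of the rounding-error salami: interest is rounded down to whole cents and the shaved fractions are swept into an attacker's accumulator. The balances and the 5% rate are made up purely for illustration.

```c
/* Sketch: skim the sub-cent remainders left by rounding interest down. */
#include <stdio.h>
#include <math.h>

int main(void) {
    long balances_cents[] = {100033, 250099, 999997, 123457};  /* four sample accounts */
    double rate = 0.05;
    double skimmed = 0.0;                          /* attacker's accumulator, in cents */

    for (int i = 0; i < 4; i++) {
        double interest = balances_cents[i] * rate;  /* exact interest in cents */
        double credited = floor(interest);           /* customer gets rounded-down cents */
        skimmed += interest - credited;              /* fraction of a cent per account */
    }
    printf("skimmed %.4f cents across 4 accounts\n", skimmed);
    /* Per account it stays "in the noise"; across millions of accounts it adds up. */
    return 0;
}
```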

Covert Channels
- Form of Trojan Horse
- Modified program/driver
- Legitimate user (with access rights) runs code
- TH accesses other data, leaks it to low-privilege process
Figure 3-11 Covert Channel Leaking Information.

Covert Channel Types
- Storage Channel
  – State of shared resource can be modified by service pgm
  – State can be "read" by receiver
  – Examples: file lock or file presence (see the sketch after this slide); resource exhaustion (disk blocks, memory, i-nodes, ...); numbers handed out (process ID, etc.); in noise of accessible file (steganography)
- Timing Channel
  – Rate or responsiveness of access to dynamic resource
  – Examples: CPU slice access in timeshared system; network access time, ACK response time, etc.
- Timing channel can be converted to storage channel
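A minimal sketch of the file-lock storage channel (sender side only), assuming a POSIX system: one bit per interval, where holding an exclusive lock on a shared, world-readable file signals a 1 and leaving it unlocked signals a 0. A receiver would poll with a non-blocking flock(fd, LOCK_EX | LOCK_NB) and decode. The file name, message, and timing are assumed.

```c
/* Sketch of the file-lock covert channel sender: locked = 1, unlocked = 0. */
#include <stdio.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/file.h>

int main(void) {
    int fd = open("/tmp/shared_resource", O_RDONLY | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    const int message[] = {1, 0, 0, 1, 1, 0, 1, 0};   /* bits to leak */
    for (int i = 0; i < 8; i++) {
        if (message[i])
            flock(fd, LOCK_EX);        /* locked file  => receiver reads a 1 */
        else
            flock(fd, LOCK_UN);        /* unlocked     => receiver reads a 0 */
        sleep(1);                      /* one bit per second: a slow, noisy channel */
    }
    flock(fd, LOCK_UN);
    close(fd);
    return 0;
}
```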

Covert Channel Example Figure 3-13 File Lock Covert Channel.

Covert Channel Example Figure 3-14 File Existence Channel Used to Signal 100.

Covert Channel Example Figure 3-15 Covert Timing Channel.

Covert Channel Controls
- Prevent unauthorized flows to begin with
  – Information flow policy and enforcement
  – Standard methods for preventing unauthorized code changes
- Review code/system for possible flows
  – Information flow and control flow analysis of code
  – Shared Resource Matrix (SRM) method
    – List shared resources
    – Determine which modules modify them, read state
    – Determine possible flows, then real flows
- Estimate maximum data rate for channels found
- Close channel if possible
- Make channel noisy or slow if can't close
- Audit for exercise of channel

Programmed Threat Controls
- Physical Access
- Access Control
- Process Isolation
- Virtualization
- Sandboxes
- Program verification
- Proof-carrying code
- Signed code
- Honey Pots
- Monitoring
- Development Controls
- Distribution and Deployment Controls

Development Controls
- Good design methodology
- Separation of Duty
- Version Management/Revision Control
- Configuration Management
- Verification
- Clean-room program development
- Design/Code Reviews
- Testing & Validation
- Proof-carrying Code

Good Development Practice Figure 3-19 Fault Discovery Rate Reported at Hewlett-Packard.

Configuration Management
- Change control
- Version control
- Configuration management – stable configs
- Backups, shadow copies, immutable versions
- Regression testing
- Audit trail
- Separation of duty

Good Design Practice
- Modularity
- Encapsulation – minimal coupling
- Information hiding
- Code reuse
- Design for testability/verification

Development Controls Figure 3-16 Modularity.

Good Design Practice Figure 3-17 Coupling.

Good Design Practice Figure 3-18 Information Hiding.

Best Design Practices
- Clean-room programming (Harlan Mills)
  – Design from requirements
  – Verify formally
  – Only then code and test
  – Testing is easy, fast
  – Premise is that understanding comes first
  – Results in better code, sooner
- Proof-carrying code (Necula)
  – Include computer-readable proof
  – Recipient has mechanical verifier

Cost of Bugs Figure from Software Engineering Economics by Barry Boehm

Process Improvement
- TQM/CQI/CPI/SEI Capability Maturity Model/ISO 9000/1, etc.
- Structured process so outcomes are
  – Predictable
  – Repeatable (not necessarily good!)
- Continuous improvement
  – Feedback from results of product
  – Feedback from results of process

SEI CMM Levels
1) Initial – chaotic
2) Repeatable
  – Planning
  – Islands of process
  – Configuration management
3) Defined
  – Management support, training
  – Standardization, communication
  – Peer reviews, documentation
4) Managed – quantitative measures, analysis
5) Optimizing – use of feedback

NSA's SSE CMM
NSA extended the CMM for System Security Engineering. Three areas:
1) Engineering (development)
  – SE development practices
  – Includes security analysis, vulnerability analysis
2) Project (management)
  – Quality assurance
3) Organizational
  – Training
  – Process improvement

Distribution & Deployment Controls
- Secure Path
  – From trusted source to current execution
  – Example: ctl-alt-del for login
- Signed Code
  – Code carries digital signature
  – Must check signature with verification key
  – Verification key must be secured and current
  – Certificate binding key must be trustworthy
- Proof-carrying Code
  – Does not rely on secure transmission
  – Does not rely on trustworthy source
  – Has proof of security properties attached
  – Simple theorem prover used to verify code