Policies have natural weaknesses
Four common pitfalls that limit the effectiveness of any security policy:
- Security is a barrier to progress
- Security is a learned behavior
- Expect the unexpected
- There's no perfect mousetrap
Security is a barrier to progress
- Protective measures are (by definition) either obstacles or impediments to commerce
  - They typically add zero benefit; sometimes they mitigate specific threats
  - They always reduce the ability to freely share information
- The balance between security and disruption varies
- Human nature begets desire (more! faster!)
  - Traffic lights exist for safety, but they're just annoying at vacant intersections
  - At some point our patience runs out
- Network users experience the same limit
  - No perceived benefit in compliance leads to disparate compliance, which leads to a security breach
Security is a learned behavior
- Self-preservation is instinctual; security isn't
  - It's a higher-level function requiring initial learning and occasional reinforcement
- Teach and preach the policy; tailor it for the audience
- Infosec procedures are often unintuitive
  - How do you recognize the value of assets? Evaluate risks? Estimate the costs of compromise?
  - "This is a stupid policy"
- Applies to management, too
  - Want commitment and funding? Better justify each component of the policy
Expect the unexpected
- Processes designed for global enterprises will process transactions at all hours for many users
- As the complexity of procedures increases, so does the chance they will fail
- Expect failures and disasters; look for signs
- Keep skills current
- Prepare, plan, practice
  - This weeds out faults and loopholes before they're exploited
There's no perfect mousetrap
- You can never be finished; securing is ongoing
- Technology changes
  - Systems become outdated, fail, lose effectiveness
- Threats always exist
  - And morph as attackers practice and improve
- Policies and processes require regular maintenance
The real threats
- Penetration of your network is unlikely, media histrionics notwithstanding
  - Complete protection might be a budget waster
- The real threat is often from within
  - More commonly: non-malicious damage from human error, denial of service, accidental disclosure
- Base the amount of protection on asset value
- Overt policy violations come from "borderline" hackers tempted by unsecured assets or complacent monitoring and enforcement
  - The policy must project an image of value onto assets
  - What hurts retail more: petty theft or vault cracking?
Where policies break down
Three vignettes that illustrate failures of typical security policies:
- Key under the doormat
- It's John Q. Public's fault!
- Burned by the backlog
Key under the doormat: analysis
- The policy's authors failed to consider its impact on workflow
  - They should have involved the users
- The security department was unable (or unwilling) to note that the policy was being thwarted
  - Proper auditing and follow-up would have revealed this
  - Possibly resulting in a new policy
Key under the doormat: outcome
- Expensive equipment was lost
- The morale of employees, managers, and security staff suffered
- A thief is at large
- The costly measures provided no security value
- The security policy itself caused the loss, because it was inconvenient and easily circumvented
It's John Q. Public's fault: analysis
- Nobody evaluated the policy's viability or effectiveness within the business cycle
  - Signatures are arbitrary and don't identify users
- The risks of granting access were not communicated to the VPs
  - Security services must always communicate value, risks, and protective measures
- The security department should have known blank signed forms were circulating
  - Assurance spot-checks were needed; they would have:
    - Revealed the VPs' ignorance of user accounts
    - Led to a new policy, or to buy-in on the existing model
It's John Q. Public's fault: outcome
- Proprietary information was compromised
- Reputation was lost through public disclosure
- A hacker is at large
Burned by the backlog: analysis
- Management didn't understand the importance of the servers or the ramifications of the business loss
  - And that was the security group's fault...
- The computer-room staff didn't know about the unprotected assets
  - The security group is at fault here, too
  - That knowledge would also have fixed the back-ups
- The backlog's placement certainly sends the wrong message
  - Its apparent value is about that of toilet paper
  - And it will be treated as such by the operators
Burned by the backlog: outcome
- Customers demanded refunds and/or defected to the competition
- Proprietary information was compromised
- The building and property were damaged
- Business was lost because of the fire and cleanup
- The company was fined
Don't let this happen to you
- A government agency
- A law firm
- An oil company
- A local newspaper
- A midwest (US) manufacturing company
- A west coast (US) manufacturing company
- A major online service company
Bad practices spread: why you need policies
- "If I just open a bunch of ports in the firewall, my app will work."
  - "Say, we run a network too. How do you configure your firewalls?"
- "I think I will wedge the computer room door open. Much easier."
  - "Why do we need the door locked?"
- "They have blocked my favorite Web site. Lucky I have a modem."
  - "Hey, nice modem. What's the number of that line?"
- "I think I will use my first name as a password."
  - "I can never think of a good password. What do you use?"
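One of the bad practices above, using your first name as a password, is exactly the kind of rule a written policy can encode and a system can enforce automatically. A minimal sketch of such a check (the specific rules, names, and threshold are illustrative, not from the slides):

```python
# Hypothetical password-policy check: reject trivially guessable passwords,
# including the "first name as a password" habit from the slide.

WEAK_PASSWORDS = {"password", "letmein", "123456"}  # stand-in for a real wordlist

def violates_policy(password: str, username: str, first_name: str) -> bool:
    """Return True if the password breaks the (illustrative) policy rules."""
    pw = password.lower()
    if len(pw) < 8:
        return True  # too short to resist guessing
    if pw in WEAK_PASSWORDS:
        return True  # appears on a common-password list
    if first_name.lower() in pw or username.lower() in pw:
        return True  # derived from easily discovered identity information
    return False

# The habit from the slide fails on two counts (name-based and short):
print(violates_policy("alice", "asmith", "Alice"))            # True
print(violates_policy("tr0ub4dor&3xtra", "asmith", "Alice"))  # False
```

A check like this turns an abstract policy clause into an enforced control, which matters because (as the slide shows) bad habits spread by imitation faster than written rules do.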
People vs. machines
Six problems that show the inherent conflict between carbon and silicon:
- How do people perceive risk?
- How do people handle exceptions?
- Why do people trust computers?
- Why do we think people can make intelligent security decisions?
- Are there malicious insiders?
- Why are people vulnerable to social engineering?
Poor perceivers of risk
- People overestimate risk for things that are:
  - Out of their control
  - Sensationalized in the media
- People underestimate risk for things that are:
  - Mundane
  - Ordinary
[Cartoon: workers tune out a blaring alarm]
- "Damn, this new Whyte Ryce album kicks!"
- "Hell, not again... we gotta fix that stupid alarm."
- "George'll shut it off when he looks up; he always does."
Awkward exception handling
- Computer mistakes are rare, so people don't know how to deal with them
  - Sometimes we just ignore or disable the alarm
  - "This computer never makes mistakes, so you must be lying"
- Attackers take advantage of mistakes
- Drills ensure people know what to do
Trusting the computer
- People don't sign or encrypt stuff... software does!
- It's necessary to securely transfer human volition into computer action
- Volition can be forged: make the computer lie
  - A Trojan horse feeds a malicious document into the signing system when the key is opened to sign something else
Malicious insiders
- Implicitly trusted
- The digital world is rife with insider knowledge
  - Authors of security programs
  - Installers of firewalls
  - Auditors
- Countermeasures:
  - Hire honest people
  - Integrity screening
  - Diffuse trust
  - Public code reviews
Social engineering
- Persuade someone to do what you want
  - But not wildly outside their normal behaviors
- Bypasses all controls
- Targets people
  - People are helpful
  - People just want to get their jobs done
- Plausibility + dread + novelty = compromise
Why are people so dangerous?
- Very vulnerable to mistakes and manipulation
- Not good at estimating risk
- Often too willing to extend trust
  - Duped by pleas for help; it's our natural desire to be helpful
- Can undermine all technical countermeasures
- People are often the weakest part of the system, and so should be accorded more scrutiny!
How to hack people
- Diffusion of responsibility: "The veep says you won't bear any responsibility..."
- Chance for ingratiation: "Look at what you might get out of this!"
- Trust relationships: "He's a good guy, I think I can trust him"
- Moral duty: "You must help me! Aren't you so mad about this?"
How to hack people (continued)
- Guilt: "What, you don't want to help me?"
- Identification: "You and I are really two of a kind, huh?"
- Desire to be helpful: "Would you help me here, please?"
- Cooperation: "Let's work together. We can do so much."
The help desk
- People are naturally helpful
  - Its function is to help, to provide answers
  - Like all customer service
- Generally not trained to question the validity of each call
  - Minimally educated about security
  - Don't get paid much
  - Objective: move on to the next call
How do you win?
- Remember, there's no perfect mousetrap
- Plan for the natural weaknesses of security policy
- Educate users in policy, enforcement, and the value of assets
- Perform regular health checks on the enforcement operations
- Make corrections when needed
A good policy
- Enables management to make a statement about the value of information to the business
- Permits actions that would otherwise backfire
  - Monitoring traffic is illegal in some countries
  - Unless there exists a policy stating that monitoring is likely to occur
  - Note the policy doesn't have to be discoverable...
- Informs workers of their information-protection duties
  - What they can and cannot do with it all
A good policy (continued)
- Defines how employees are permitted to:
  - Represent the organization, and what they may disclose
  - Use organizational computer resources for personal purposes
- Clearly defines protective measures
  - The policy might be a decisive factor in a court of law
  - It shows how you took steps to protect your intellectual property
- Enumerates acceptable and unacceptable behavior
  - Lists penalties for violations, up to and including termination
  - Provides the legal foundation for making such decisions
The policy drives all other decisions
[Diagram: the policy at the center, driving operations, process, implementation, documentation, and technology, within an ongoing review/audit/refine cycle]
The security lifecycle
- Policy: the discovery phase
  - Identify threats and risks
  - Determine the assets to be protected
  - Develop the enforcement strategy; it dictates technologies, resources, tactics, and training
- Enforcement: the action phase
  - Everything gets tested here and either survives or decays
  - Includes operational life and execution
- Assurance: the proof phase
  - Evaluate the policy, the strategy, and their effectiveness
  - Analyze failures and feed the lessons back into the policy
Policy: Determine its impact
- Security is inconvenient
  - Recognize and respect security's disruption
  - Build "user impact" into the design; invite discussion
- Avoid excessive complexity
  - Use tools that are already tested and proven
  - This controls costs and lessens the chance of attack
- To prosecute or not? Decide in advance how far to go
  - If yes: know what evidence to collect, and train the staff
- Make the punishment fit the crime
  - Often reprimands are sufficient
  - But what about the person who hacks the payroll?
Enforcement: Be visible
- Make security overt
  - Badges have huge psychological effects
- Remind constantly
  - Include reminders of information value
- Emergency service
  - Drill the troops
  - Know where legitimate users typically work
- Empower the enforcers
  - Training, training, training: frequent and short
- Know your environment
  - What's normal: people, jobs, traffic
- Walk in your users' shoes
  - It helps you avoid mistakes!
Assurance: Learn and refine
- Expect failure
- Conduct regular audits to detect leaks and flaws
  - Audit at a level representative of the risks you face
  - Audit user IDs to ensure they're still active
- Break into your own house
  - Try to thwart your own policies
  - See whether users and security staff can gain access in other ways (social engineering)
- Learn from your mistakes
  - Empower auditors with the authority and process to effect change and make the policy better
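The "audit user IDs to ensure they're still active" step above lends itself to automation. A minimal sketch, assuming account records carry a last-login date (the data source, field names, and 90-day threshold are illustrative; a real audit would pull from your directory service):

```python
# Hypothetical stale-account audit: flag user IDs with no recent login.
from datetime import date, timedelta

# Illustrative records: (user_id, last_login_date); None means never logged in.
accounts = [
    ("asmith",  date(2024, 6, 1)),
    ("jdoe",    date(2023, 1, 15)),
    ("vendor1", None),
]

def stale_accounts(accounts, today, max_idle_days=90):
    """Return the IDs that have been idle longer than the policy allows."""
    cutoff = today - timedelta(days=max_idle_days)
    return [uid for uid, last in accounts if last is None or last < cutoff]

print(stale_accounts(accounts, today=date(2024, 6, 30)))  # ['jdoe', 'vendor1']
```

Flagged IDs still need human follow-up (is the account obsolete, or just a long-absent employee?), which is exactly the feedback the assurance phase is meant to push back into the policy.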
User education
- Security management campaign
- Periodic refreshers
  - Newsletters
  - Group meetings
  - Screensavers
- Signatures on acceptable-use policies
- Shredders and bulk erasers
  - Keep the erasers updated; old ones are too weak
  - Consider: the band saw
- Regular audits
Security awareness
- Know what has value
  - What would you do if you suddenly lost all access?
- Friends aren't always friends
  - Don't allow trust to be exploited
  - Over-the-phone friendships lack trust
- Passwords are personal
  - And always undervalued
- Uniforms are cheap
- Mutually authenticate when your bank calls you!
Ongoing reminders
- Regular reminders keep people aware
  - One training session won't last forever
  - Police departments do this continually
- Be creative
  - Don't become yet another source of noise to be ignored
- Make the policy itself easily available
  - Post it on a web server
  - Provide simple searching and navigation
  - Keep it current!
Make the help desk better
- Help the staff learn to recognize attacks:
  - Refusal by the caller to give contact information
  - Rushing
  - Name-dropping
  - Intimidation
  - Misspellings
  - Odd questions
- Know when to say "no"
  - This needs the backing of management
Learn more
- Information Security Policies Made Easy, 9/e, by Charles Cresson Wood
  http://www.informationshield.com
- Information Security Policy World
  http://www.information-security-policies-and-standards.com
- SANS Security Policy Project
  http://www.sans.org/resources/policies/
- Site Security Handbook (RFC 2196)
  http://www.ietf.org/rfc/rfc2196.txt