T1 Network Security Assessments Workshop: Hands-On (Day 1 of 2)
David Rhoades, Maven Security Consulting, Inc.
10:30 a.m.–6:00 p.m.
Who should attend: Anyone who needs to understand how to perform an effective and safe network assessment.
How do you test a network for security vulnerabilities? Just plug
some IP addresses into a network-scanning tool and click SCAN,
right? If only it were that easy. Numerous commercial and freeware tools assist
in locating network-level security vulnerabilities. However, these
tools are fraught with dangers: accidental denial-of-service,
false positives, false negatives, and long-winded reporting, to name but
a few. Performing a security assessment (a.k.a. vulnerability assessment
or penetration test) against a network environment requires
preparation, the right tools, methodology, knowledge, and more.
This hands-on workshop will cover the essential topics for performing
an effective and safe network assessment.
Class exercises will require that students have an x86-based laptop
computer that can be booted from a KNOPPIX CD, along with a 10/100 Ethernet
network card. Please download a copy of KNOPPIX-STD
(https://www.knoppix-std.org), burn it to a CD-R, and try to boot your system
on a network offering DHCP. Be sure your network card is recognized by
Knoppix-STD; otherwise, you will not be able to participate in most classroom
exercises. Wireless access will not be supported during class.
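If you want a quick sanity check once Knoppix-STD is up, a short Python sketch such as the one below (not part of the official class materials; most Knoppix variants ship with Python) will tell you whether the laptop obtained a usable DHCP address rather than falling back to loopback or a 169.254.x.x link-local address. The 192.0.2.1 address is a TEST-NET placeholder and is never actually contacted.
  # dhcp_check.py -- rough check that this machine has a routable address.
  # Illustrative only; not part of the class materials.
  import socket

  def local_address():
      # Connecting a UDP socket sends no packets; it only asks the kernel
      # which local address it would use to reach the given host.
      s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      try:
          s.connect(("192.0.2.1", 9))   # TEST-NET placeholder, never contacted
          return s.getsockname()[0]
      finally:
          s.close()

  try:
      addr = local_address()
  except OSError:
      addr = "127.0.0.1"                # no usable route at all

  if addr.startswith(("127.", "169.254.")):
      print("No DHCP lease detected (got %s); check that Knoppix saw your card." % addr)
  else:
      print("Looks fine: this machine is using %s" % addr)
Run it from a shell on the booted CD with "python dhcp_check.py".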
Topics include:
- Preparation: What you need before you even begin
- Safety measures: This often-overlooked topic will cover important
practical steps to minimize or eliminate adverse effects on critical networks
- Architecture considerations: Where you scan from affects how you perform the assessment
- Inventory: Taking an accurate inventory of active systems and protocols
on the target network
- Tools of the trade: Effective use of both freeware and commercial tools, with an emphasis on common pitfalls
- Automated scanning: Best-of-class tools, with tips (mostly vendor-neutral) on their proper use
- Research and development: What to do when existing tools don't suffice
- Documentation and audit trail: How to keep accurate records easily
- How to compile useful reports: Planning for corrective action and tracking your security measures
Students will practice network assessment on a target network of Windows and UNIX-based servers and various routing components.
Day 1
- Lab setup and preparation
- Security assessment overview
- Types of assessments
- Choosing an assessment approach
- Assessment preparation
- Defining the purpose
- Rules of engagement
- Assessment logistics
- Open vs. closed testing
- Passive vs. active testing; depth of testing
- Denial of service (DoS)
- Enumeration of target information
- Permission
- Assessment safety
- Verification of tool authenticity
- Vetting tools
- Safety concepts
- The dangers of automated scanners
- Automated tool safety summary
- Documentation and audit trail
- Assessment phase 1: network inventory (a minimal ping-sweep sketch follows the Day 1 outline)
- Ping scanning
- Discrete port scanning (host inventory only)
- DNS queries
- Traceroute
- ARP scanning
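As a taste of the ping-scanning step referenced above, here is a deliberately minimal Python sketch that sweeps a placeholder /24 using the Linux iputils ping command. The 192.0.2.0/24 prefix is illustrative only; in class, the range, timing, and logging would follow the safety measures discussed earlier, and you should never sweep networks you are not authorized to test.
  # ping_sweep.py -- minimal ping scan for the network-inventory phase.
  # Placeholder prefix; many hosts and firewalls silently drop ICMP.
  import subprocess

  PREFIX = "192.0.2."   # placeholder /24 prefix

  def responds_to_ping(ip):
      # One echo request, one-second timeout (Linux iputils ping syntax).
      result = subprocess.run(
          ["ping", "-c", "1", "-W", "1", ip],
          stdout=subprocess.DEVNULL,
          stderr=subprocess.DEVNULL,
      )
      return result.returncode == 0

  for last_octet in range(1, 255):
      ip = PREFIX + str(last_octet)
      if responds_to_ping(ip):
          print("responded:", ip)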
Day 2
- Assessment phase 2: target analysis
- TCP port scanning (a bare-bones connect() scan is sketched after the Day 2 outline)
- UDP port scanning
- SNMP
- Assessment phase 3: exploitation and confirmation
- Automated vulnerability scanning tools
- (Online) brute-force attacks
- (Offline) password cracking
- Manual testing
- Special consideration testing
- Firewalls and routers
- Auditing email servers
- Web servers
- Stealth technique summary
- Vulnerability scanning tools
- Automated scanning tools
- Commercial scanners
- Nessus
- Nessus Clients
- Using Nessus
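For the TCP port scanning item at the top of the Day 2 outline, the following bare-bones connect() scan of a single placeholder target shows the underlying idea. Real assessments rely on a mature scanner such as Nmap plus the stealth and safety considerations listed above; scan only systems you have written permission to test.
  # tcp_portscan.py -- naive connect() scan of a single host.
  # TARGET is a placeholder; adjust the port range and timeout to taste.
  import socket

  TARGET = "192.0.2.10"
  TIMEOUT = 0.5

  open_ports = []
  for port in range(1, 1025):
      s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
      s.settimeout(TIMEOUT)
      try:
          s.connect((TARGET, port))
          open_ports.append(port)
      except OSError:
          pass          # closed, filtered, or unreachable
      finally:
          s.close()

  print("open TCP ports on %s: %s" % (TARGET, open_ports))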
David Rhoades (T1, W1, R1, F1) is a principal consultant with Maven Security
Consulting, Inc. Since 1996, David has provided information protection services
for various FORTUNE 500 customers. His work has taken him across the US
and abroad to Europe and Asia, where he has lectured and consulted in
various areas of information security. David has a B.S. in computer
engineering from the Pennsylvania State University and is an instructor
for the SANS Institute, the MIS Training Institute, and Sensecurity
(based in Singapore).
T2 Inside the Linux Kernel (Updated for Version 2.6)
Theodore Ts'o, IBM
10:30 a.m.–6:00 p.m.
Who should attend: Application programmers and kernel developers. You should be reasonably familiar with C
programming in the UNIX environment, but no prior experience with the UNIX or Linux kernel code is assumed.
This tutorial will give you an introduction to the structure of the Linux kernel, the basic features it provides, and the most important algorithms it employs.
The Linux kernel aims to achieve conformance with existing standards and compatibility with existing operating systems; however, it is not a reworking of existing UNIX kernel code. The Linux kernel was written from scratch to provide both standard and novel features, and it takes advantage of the best practices of existing UNIX kernel designs.
Although the material will focus on the latest release version of the Linux kernel (v. 2.6), it will also address aspects of the development kernel codebase (v. 2.7) where it differs substantially from 2.6. It will not contain any detailed examination of the source code but will, rather, offer an overview and roadmap of the kernel's design and functionality.
Topics include:
- How the kernel is organized (scheduler, virtual memory system,
filesystem layers, device driver layers, networking stacks)
- The interface between each module and the rest of the kernel
- Kernel support functions and algorithms used by each module
- How modules provide for multiple implementations of similar functionality
- Ground rules of kernel programming (races, deadlock conditions)
- Implementation and properties of the most important algorithms
- Portability
- Performance
- Functionality
- Comparison between Linux and UNIX kernels, with emphasis on differences in algorithms
- Details of the Linux scheduler (a brief /proc-based peek at scheduler and VM state follows this list)
- Its VM system
- The ext2fs filesystem
- The requirements for portability between architectures
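One low-effort way to poke at some of these subsystems from user space is to read the files the kernel exports under /proc. The small Python sketch below is an illustration, not material from the tutorial: it prints the scheduler's system-wide load summary and a few per-process VM and context-switch counters on a Linux system.
  # proc_peek.py -- read a few /proc interfaces exposed by the scheduler and VM
  # subsystems (Linux only; illustrative).

  def show(path, wanted=None):
      with open(path) as f:
          for line in f:
              if wanted is None or line.split(":")[0] in wanted:
                  print(line.rstrip())

  # System-wide run-queue/load summary maintained by the scheduler.
  show("/proc/loadavg")

  # Per-process VM and context-switch counters for this very process.
  show("/proc/self/status",
       wanted={"Name", "VmSize", "VmRSS",
               "voluntary_ctxt_switches", "nonvoluntary_ctxt_switches"})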
Theodore Ts'o (T2) has been a Linux kernel developer since almost the very
beginnings of Linux: he implemented POSIX job control in the
0.10 Linux kernel. He is the maintainer and author for the Linux COM
serial port driver and the Comtrol Rocketport driver. He architected
and implemented Linux's tty layer. Outside of the kernel, he is also
the maintainer of the e2fsck filesystem consistency checker. Ted is a Senior Technical Staff Member of IBM's Linux Technology Center.
T3 Administering Linux in Production Environments
Aeleen Frisch, Exponential Consulting
10:30 a.m.–6:00 p.m.
Who should attend: Both current Linux system administrators and
administrators from sites considering converting to Linux or adding
Linux systems to their current computing resources. We will focus on the
administrative issues that arise when Linux systems are deployed to address
a variety of real-world tasks and problems in both commercial and
research-and-development contexts.
Topics include:
- Recent kernel developments
- High-performance I/O
- Advanced filesystems and logical volumes
- Disk striping
- Optimizing I/O performance
- Advanced compute-server environments
- Beowulf
- Clustering
- Parallelization environments/facilities
- CPU performance optimization
- High availability Linux: fault tolerance options
- Enterprise-wide authentication
- Fixing the security problems you didn't know you had (or, what's good
enough for the researcher/hobbyist won't do for you)
- Automating installations and other mass operations (a minimal SSH loop is sketched after this list)
- Linux in the office environment
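As a bare-bones illustration of the mass-operations item above, the sketch below loops over a placeholder host list and runs one command on each machine via ssh, assuming key-based logins are already in place. Real sites would normally reach for a configuration-management or parallel-execution tool; this only shows the bare idea.
  # mass_run.py -- run one command on a list of hosts over ssh.
  # Hostnames and the command are placeholders; requires Python 3.7+ and
  # working key-based ssh logins.
  import subprocess

  HOSTS = ["node01", "node02", "node03"]   # placeholder host list
  COMMAND = "uptime"                       # replace with the real mass operation

  for host in HOSTS:
      result = subprocess.run(
          ["ssh", "-o", "BatchMode=yes", host, COMMAND],
          capture_output=True, text=True,
      )
      status = "ok" if result.returncode == 0 else "FAILED (%d)" % result.returncode
      print("%-8s %s  %s" % (host, status, result.stdout.strip()))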
Aeleen Frisch (T3, W3, R4) has been a system administrator for over 20 years. She currently
looks after a pathologically heterogeneous network of UNIX and Windows
systems. She is the author of several books, including Essential
System Administration (now in its 3rd edition).
T4 Building a Software Security Capability: How to Foster Best Practices in Software Security
Gary McGraw, Cigital
10:30 a.m.–6:00 p.m.
Who should attend: Software developers who want to improve the security (and salability) of their products. You will learn current best practices and come away with a clear action plan for attacking the software
security problem in your organization.
This tutorial explains why the key to proactive computer security is
making software behave, and then goes on to tell you how to do it.
Microsoft's Trustworthy Computing Initiative, begun in January 2002, has
changed the way Microsoft builds software. To date, Microsoft has spent
over $500 million (2000 worker years) on their software security push.
Given the emerging importance of software security and reliability to
high-profile software vendors, you need to figure out what to do about the software you develop.
Topics include:
- The role of awareness and training (for development staff)
- The importance of technology choices (language, OS, development tools, testing tools)
- How to weave security analysis throughout the software development lifecycle
- Building abuse and misuse cases
- The role of architectural risk analysis: who, how, and when
- The role of code review: use of advanced tools (a toy pattern-matching sketch follows this list)
- Security testing (and how it differs from functional testing)
- Post facto application security (deployment issues)
- Measuring return on investment
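As a toy illustration of tool-assisted code review (see the code review item above), the Python sketch below walks a source tree and flags calls to a few historically dangerous C string functions. The function list and file extensions are assumptions; genuine reviews use purpose-built static analysis tools rather than a simple pattern match like this.
  # risky_calls.py -- toy code-review helper: flag calls to a few dangerous
  # C functions in a source tree. Illustrative only.
  import os
  import re

  RISKY = re.compile(r"\b(strcpy|strcat|sprintf|gets)\s*\(")

  def scan_tree(root):
      for dirpath, _dirs, files in os.walk(root):
          for name in files:
              if not name.endswith((".c", ".h")):
                  continue
              path = os.path.join(dirpath, name)
              with open(path, errors="replace") as f:
                  for lineno, line in enumerate(f, 1):
                      match = RISKY.search(line)
                      if match:
                          print("%s:%d: %s()" % (path, lineno, match.group(1)))

  scan_tree(".")   # scan the current directory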
Gary McGraw (T4), Cigital, Inc.'s CTO, researches software security and sets
technical vision in the area of Software Quality Management. Dr. McGraw
is co-author of four popular books: Java Security (Wiley, 1996),
Securing Java (Wiley, 1999), Software Fault Injection (Wiley, 1998), and
Building Secure Software (Addison-Wesley, 2001). His fifth book,
Exploiting Software (Addison-Wesley), was released in February 2004. A
noted authority on software and application security, Dr. McGraw
consults with major software producers and consumers. Dr. McGraw has
written over sixty peer-reviewed technical publications and functions as
principal investigator on grants from the Air Force Research Labs, DARPA,
the National Science Foundation, and NIST's Advanced Technology Program. He
serves on the advisory boards of Authentica, Counterpane, Fortify Software,
and Indigo Security, as well as advising the CS Department at UC Davis.
Dr. McGraw holds a dual Ph.D. in Cognitive Science and Computer Science
from Indiana University and a B.A. in Philosophy from UVa. He regularly
contributes to popular trade publications and is often quoted in
national press articles.
T5 System Log Aggregation, Statistics, and Analysis
Marcus Ranum, Trusecure Corp.
10:30 a.m.–6:00 p.m.
Who should attend: System and network administrators who are interested in
learning what's going on in their firewalls, servers, networks,
and systems; anyone responsible for security and audit or
forensic analysis.
This tutorial covers techniques and software tools for
building your own log analysis system, from aggregating
all your data in a single place, through normalizing it,
searching, and summarizing, to generating statistics and
alerts and warehousing it. We will focus primarily on
open source tools for the UNIX environment, but will
also describe tools for dealing with Windows systems
and various devices such as routers and firewalls.
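To make the normalizing-and-summarizing step concrete, here is a minimal Python sketch that parses classic BSD-syslog lines and counts messages per host and program. The /var/log/messages path and the line format are assumptions you would adapt to your own loghost.
  # log_summary.py -- normalize BSD-syslog lines and count messages per program.
  # Path and line format are assumptions; adjust the pattern to your loghost.
  import re
  from collections import Counter

  # e.g. "Nov 12 23:59:01 fw01 sshd[1234]: Failed password for root from ..."
  PATTERN = re.compile(
      r"^(?P<timestamp>\w{3}\s+\d+\s[\d:]+)\s+"
      r"(?P<host>\S+)\s+"
      r"(?P<program>[^\[:]+)(\[\d+\])?:\s"
      r"(?P<message>.*)$"
  )

  counts = Counter()
  with open("/var/log/messages", errors="replace") as f:
      for line in f:
          m = PATTERN.match(line)
          if m:
              counts[(m.group("host"), m.group("program"))] += 1

  for (host, program), n in counts.most_common(20):
      print("%6d  %-12s %s" % (n, host, program))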
Topics include:
- Estimating log quantities and log system requirements
- Syslog: mediocre but pervasive logging protocol
- Back-hauling your logs
- Building a central loghost
- Dealing with Windows logs
- Logging on Windows loghosts
- Parsing and normalizing
- Finding needles in haystacks: searching logs
- I'm dumb, but it works: artificial ignorance (sketched after this list)
- Bayesian spam filters for logging
- Storage and rotation
- Databases and logs
- Leveraging the human eyeball: graphing log data
- Alerting
- Legalities of logs as evidence
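The artificial-ignorance item above boils down to discarding everything you already understand and reading what is left. The tiny filter below sketches the idea with a purely illustrative list of known-boring patterns; feed it a log on stdin, e.g. "python3 artificial_ignorance.py < /var/log/messages".
  # artificial_ignorance.py -- drop log lines matching patterns you already
  # understand and print whatever is left. The pattern list is illustrative.
  import re
  import sys

  KNOWN_BORING = [
      r"sshd\[\d+\]: Connection closed by",
      r"CRON\[\d+\]:",
      r"dhclient\[\d+\]: DHCPREQUEST",
  ]
  BORING = [re.compile(p) for p in KNOWN_BORING]

  for line in sys.stdin:
      if not any(p.search(line) for p in BORING):
          sys.stdout.write(line)    # anything unexplained deserves a human eyeball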
Marcus Ranum (T5, R5, F5) is a senior scientist at Trusecure Corp. and a world-renowned expert
on security system design and implementation.
He is recognized as the inventor of the proxy firewall and the
implementer of the first commercial firewall product. Since the
late 1980s, he has designed a number of groundbreaking security
products, including the DEC SEAL, the TIS firewall toolkit, the
Gauntlet firewall, and NFR's Network Flight Recorder intrusion
detection system. He has been involved in every level of operations
of a security product business, from developer to founder and CEO
of NFR. Marcus has served as a consultant to many FORTUNE 500 firms
and national governments, as well as serving as a guest lecturer
and instructor at numerous high-tech conferences. In 2001, he was
awarded the TISC Clue award for service to the security community,
and he holds the ISSA lifetime achievement award.