WORKSHOP SESSIONS
All sessions will take place in Salon Drummond East (Level 3) unless otherwise noted.
Session papers are available to workshop registrants immediately and to everyone beginning August 10, 2009.
Monday, August 10, 2009
8:30 a.m.–9:00 a.m. Continental Breakfast, Ballroom Foyer (Level 4)
9:00 a.m.–9:15 a.m.
Opening Remarks
Program Co-Chairs: Jelena Mirkovic, USC Information Sciences Institute (ISI); Angelos Stavrou, George Mason University
General Chair: Terry V. Benzel, USC Information Sciences Institute (ISI)
View the presentation slides
9:15 a.m.–9:45 a.m.
Keynote Address
The Future of Cyber Security Experimentation and Test
Michael VanPutte, DARPA
View the presentation slides
9:45 a.m.–10:00 a.m. Break

10:00 a.m.–11:00 a.m.
Security Education
A Highly Immersive Approach to Teaching Reverse Engineering
Golden G. Richard III, University of New Orleans
Paper in PDF | Slides
Collective Views of the NSA/CSS Cyber Defense Exercise on Curricula and Learning Objectives
William J. Adams, United States Military Academy; Efstratios Gavas, United States Merchant Marine Academy; Tim Lacey, Air Force Institute of Technology;
Sylvain P. Leblanc, Royal Military College of Canada
Paper in PDF | Slides
11:00 a.m.–noon
Security Experimentation
Evaluating Security Products with Clinical Trials
Anil Somayaji and Yiru Li, Carleton University;
Hajime Inoue, ATC-NY; José M. Fernandez, École Polytechnique Montréal; Richard Ford, Florida Institute of Technology
Paper in PDF
The Heisenberg: Measuring Uncertainty in Lightweight Virtualization Testbeds
Quan Jia, Zhaohui Wang, and Angelos Stavrou, George Mason University
Paper in PDF | Slides
Noon–1:00 p.m. Workshop Luncheon, Ballroom East (Level 4)

1:00 p.m.–2:30 p.m.
Testbeds
The Virtual Power System Testbed and Inter-Testbed Integration
David C. Bergman, Dong Jin, David M. Nicol, and Tim Yardley, University of Illinois at Urbana-Champaign
Paper in PDF | Slides
Dartmouth Internet Security Testbed (DIST): Building a Campus-wide Wireless Testbed
Sergey Bratus, David Kotz, Keren Tan, William Taylor, Anna Shubina,
and Bennet Vance, Dartmouth College; Michael E. Locasto, George Mason University
Paper in PDF | Slides
An Emulation of GENI Access Control
Soner Sevinc and Larry Peterson, Princeton University;
Trevor Jim and Mary Fernández, AT&T Labs Research
Paper in PDF
2:30 p.m.–2:45 p.m. Break

2:45 p.m.–3:45 p.m.
Experimentation Tools
Payoff Based IDS Evaluation
Michael Collins, RedJack, LLC
Paper in PDF
Toward Instrumenting Network Warfare Competitions to Generate Labeled Datasets
Benjamin Sangster, T.J. O'Connor, Thomas Cook, Robert Fanelli,
Erik Dean, William J. Adams, Chris Morrell, and Gregory Conti,
United States Military Academy
Paper in PDF | Slides
3:45 p.m.–4:00 p.m. Break

4:00 p.m.–5:30 p.m.
Panel on Science of Security Experimentation
Panelists: John McHugh, Dalhousie University; Jennifer Bayuk, Jennifer L Bayuk LLC; Minaxi Gupta, Indiana University; Roy Maxion, Carnegie Mellon University
Panel slides
There is currently no established best practice for evaluating practical security research solutions. Not only do we lack benchmarks and metrics for security testing, we also don't agree on testing approaches, test setup, or even evaluation goals. Published work abounds with ad hoc, unrealistic, and unrepeatable test strategies. It is impossible to compare related solutions because they have usually been tested in very different settings and their implementations were not made public. It is likewise impossible to build on the work of others without re-implementing their solutions and evaluation approaches from scratch. This dilutes the strength of the research community and slows progress.
This panel will discuss challenges to scientifically rigorous security experimentation, including:
- the choice of an appropriate evaluation approach from among theory, simulation, emulation, trace-based analysis, and deployment
- how and where to gather appropriate and realistic data to reproduce relevant security threats
- how to faithfully reproduce that data in an experimental setting
- how to promote reuse and sharing, and discourage reinvention, in the community
- requirements for and obstacles to the creation of widely accepted benchmarks for popular security areas