FOCI '11

USENIX Security '11

WORKSHOP PROGRAM ABSTRACTS

Three Researchers, Five Conjectures: An Empirical Analysis of TOM-Skype Censorship and Surveillance
We present an empirical analysis of TOM-Skype censorship and surveillance. TOM-Skype is an Internet telephony and chat program that is a joint venture between TOM Online (a mobile Internet company in China) and Skype Limited. TOM-Skype contains both voice-over-IP functionality and a chat client. The censorship and surveillance that we studied for this paper are specific to the chat client and are based on keywords that a user might type into a chat session. We were able to decrypt the keyword lists used for censorship and surveillance. We also tracked the lists over a period of time and witnessed changes. Censored keywords range from obscene references, such as "two girls one cup" (the motivation for our title), to specific passages from 2011 China Jasmine Revolution protest instructions, such as "McDonald's in front of Chunxi Road in Chengdu." Surveillance keywords are mostly related to demolitions in Beijing, such as "Ling Jing Alley demolition." Based on this data, we present five conjectures that we believe are formal enough to serve as hypotheses that the Internet censorship research community could answer with more data and appropriate computational and analytic techniques.
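The mechanism described above is client-side filtering driven by keyword lists matched against what the user types. As a rough illustration only, the Python sketch below shows what such keyword matching might look like; the example entries come from the abstract, but the function, the censor/report actions, and the matching logic are assumptions, not the decrypted TOM-Skype lists or code.

```python
# Illustrative keyword-triggered chat filtering; NOT TOM-Skype's actual logic.
CENSOR_KEYWORDS = {"two girls one cup",
                   "mcdonald's in front of chunxi road in chengdu"}
SURVEIL_KEYWORDS = {"ling jing alley demolition"}

def classify_message(text: str) -> str:
    """Return the action a keyword-based filter might take on a chat message."""
    lowered = text.lower()
    if any(k in lowered for k in CENSOR_KEYWORDS):
        return "censor"   # suppress the message before delivery
    if any(k in lowered for k in SURVEIL_KEYWORDS):
        return "report"   # deliver the message but log/upload it
    return "allow"

print(classify_message("Meet at McDonald's in front of Chunxi Road in Chengdu"))
```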

Fine-Grained Censorship Mapping: Information Sources, Legality and Ethics
We examine the problem of mapping internet filtering, or censorship, at a finer-grained level than the national, in the belief that users in different areas of a country, or users accessing the internet through different providers or services, may experience differences in the filtering applied to their internet connectivity. In investigating this possibility, we briefly consider services that may be used by researchers to experience a remote computer's view of the internet. More importantly, we seek to stimulate discussion concerning the potentially serious legal and ethical concerns that are intrinsic to this form of research.

CensMon: A Web Censorship Monitor
The Internet has traditionally been the freest medium for publishing and accessing information. It is also fast becoming the dominant medium for quick and easy access to news. It is therefore not surprising that there are significant efforts to censor certain news articles or even entire web sites. For this reason, it is paramount to try to detect what is censored and by whom. In this paper we present the design and implementation of a web censorship monitor, called CensMon. CensMon is distributed in nature, operates automatically and does not rely on Internet users to report censored web sites, can differentiate access-network failures from possible censorship, and uses multiple input streams to determine what kind of censored data to look for. Our evaluation shows that CensMon can successfully detect censored content and identify the filtering technique used by the censor.
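To make the distinction between network failures and censorship concrete, here is a hedged Python sketch of the kind of per-URL probe a distributed monitor could run; the status labels, block-page marker, and error handling are illustrative assumptions, not CensMon's actual agents or heuristics.

```python
# Sketch of a single probe that separates ordinary failures from likely filtering.
import socket
import urllib.error
import urllib.request

def probe(url: str, timeout: float = 10.0) -> str:
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read(4096).decode("utf-8", errors="replace")
    except urllib.error.HTTPError as e:
        # A 403 or 451, or an injected block page, signals filtering rather than an outage.
        return "possible_block" if e.code in (403, 451) else "http_error"
    except (urllib.error.URLError, socket.timeout):
        # DNS failure, refused connection, or timeout: could be censorship or a
        # plain outage; cross-check the same URL from an unfiltered vantage point.
        return "unreachable_needs_crosscheck"
    if "this page has been blocked" in body.lower():  # assumed block-page marker
        return "possible_block"
    return "accessible"
```

Comparing results for the same URL from agents inside and outside the filtered network (for example, "accessible" outside but "unreachable_needs_crosscheck" inside) is what lets a monitor attribute a failure to censorship rather than to the access network.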

Work-in-Progress: Automated Named Entity Extraction for Tracking Censorship of Current Events
Tracking Internet censorship is challenging because the content the censors target can change daily, even hourly, with current events. The process must be automated because of the large amount of data that needs to be processed. Our focus in this paper is on automated probing of keyword-based Internet censorship, in which natural language processing techniques are used to generate the keywords to probe with. In this paper we present a named entity extraction framework that can extract the names of people, places, and organizations from text such as a news story. Previous efforts to automate the study of keyword-based Internet censorship have been based on semantic analysis of existing bodies of text, such as Wikipedia, and so could not extract meaningful probe keywords from current news. We use a maximum entropy approach for named entity extraction because of its flexibility. Our preliminary results suggest that this approach gives good results with only a rudimentary understanding of the target language. While our current implementation is for Chinese, we anticipate that the maximum entropy approach will make it straightforward to extend the framework to other languages such as Arabic, Farsi, and Spanish. We present some testing results as well as preliminary results from probing China's GET request censorship and search engine filtering using this framework.
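As a concrete illustration of the probing step, the sketch below sends an HTTP GET containing a candidate keyword (such as an extracted named entity) and reports whether the connection is reset, the signature commonly associated with in-path keyword filtering of GET requests. The host, keyword handling, and result labels are placeholders; the paper's actual probing framework is not reproduced here.

```python
# Hedged sketch of probing for keyword-based GET request filtering.
import socket

def probe_keyword(host: str, keyword: str, port: int = 80, timeout: float = 10.0) -> str:
    # Real probes would handle URL encoding and repeat trials; this is simplified.
    request = (f"GET /?q={keyword} HTTP/1.1\r\n"
               f"Host: {host}\r\nConnection: close\r\n\r\n")
    try:
        with socket.create_connection((host, port), timeout=timeout) as s:
            s.sendall(request.encode("utf-8"))
            data = s.recv(1024)
    except ConnectionResetError:
        return "reset"       # consistent with in-path keyword filtering
    except (socket.timeout, OSError):
        return "error"
    return "response" if data else "closed"
```

A keyword that consistently yields "reset" from inside the censored network but a normal response from outside it is a candidate member of the filtered-keyword list.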

Redirecting DNS for Ads and Profit
Internet Service Providers (ISPs) increasingly try to grow their profit margins by employing "error traffic monetization," the practice of redirecting customers whose DNS lookups fail to advertisement-oriented Web servers. A small industry of companies provides the associated machinery for ISPs to engage in this monetization, with the companies often participating in operating the service as well. We conduct a technical analysis of DNS error traffic monetization evident in 66,000 Netalyzr sessions, including fingerprinting derived from patterns seen in the resulting ad landing pages. We identify major players in this industry, their ISP affiliations over time, and available user opt-out mechanisms. One monetization vendor, Paxfire, transgresses the error-based model and also reroutes all user search queries to Bing, Yahoo, and (sometimes) Google via proxy servers controlled or provided by Paxfire.
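One simple way to observe this behavior from the client side is to resolve a name that should not exist and check whether an address comes back instead of a name-not-found error. The sketch below illustrates that idea only; it is not Netalyzr's test code, and the generated domain is a throwaway placeholder.

```python
# Minimal check for NXDOMAIN rewriting by the configured resolver.
import socket
import uuid

def resolver_rewrites_nxdomain() -> bool:
    bogus = f"nx-{uuid.uuid4().hex}.example-{uuid.uuid4().hex[:8]}.com"
    try:
        addr = socket.gethostbyname(bogus)
    except socket.gaierror:
        return False  # honest name-not-found behavior
    # An answer for a random, never-registered name suggests the resolver is
    # redirecting error traffic, e.g. to an ad-oriented landing page whose
    # content can then be fingerprinted to identify the monetization vendor.
    print(f"unexpected answer for {bogus}: {addr}")
    return True
```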

User Freedom to Attach Devices
Much of the research on an Internet of Things assumes that users will be able to connect devices without consent by or interference from their service providers. However, in cable and satellite television networks, cellular networks, and some broadband Internet networks, the service provider often only allows use of set-top boxes, smart phones, and residential gateways obtained directly from the provider. The ability of a provider to implement such restrictions is limited by communications law. We propose a set of user and service provider rights. We identify the pertinent network architectural principles, and use these to propose a new legal framework for device attachment that, combined with standardized interfaces and protocols, can ensure an open network that supports innovation in devices.

Infrastructures of Censorship and Lessons from Copyright Resistance
U.S. policymakers proclaim their commitment to Internet freedom while simultaneously endorsing restrictions on Internet exchange. Unfortunately, the tools — legal and technical — built to block copyright infringement, counterfeit sales, online gambling, or indecency, often find use to censor lawful expression here and abroad. In particular, the United States and its entertainment industries have prioritized online copyright enforcement such that its attack and riposte can be instructive in the Internet freedom arena.

Decoy Routing: Toward Unblockable Internet Communication
We present decoy routing, a mechanism capable of circumventing common network filtering strategies. Unlike other circumvention techniques, decoy routing does not require a client to connect to a specific IP address (which is easily blocked) in order to provide circumvention. We show that if it is possible for a client to connect to any unblocked host/service, then decoy routing could be used to connect them to a blocked destination without cooperation from the host. This is accomplished by placing the circumvention service in the network itself — where a single device could proxy traffic between a significant fraction of hosts — instead of at the edge.
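To make the routing decision concrete, here is a small conceptual sketch of the choice an on-path decoy router faces; the covert tag, data structure, and return strings are illustrative assumptions rather than the paper's actual signaling or wire format.

```python
# Conceptual decision logic of an in-network decoy router (illustrative only).
from dataclasses import dataclass

SENTINEL = b"decoy-tag"  # placeholder for the covert signal a client embeds

@dataclass
class Flow:
    stated_destination: str  # the unblocked host the censor observes
    covert_destination: str  # the blocked host the client actually wants
    handshake: bytes         # client bytes that may carry the covert tag

def route(flow: Flow) -> str:
    if SENTINEL in flow.handshake:
        # Proxy inside the network: traffic still appears to go to the decoy,
        # but the router relays it to the covert, blocked destination.
        return f"proxy to {flow.covert_destination} (appears bound for {flow.stated_destination})"
    return f"forward normally to {flow.stated_destination}"
```

Because the proxying happens at a router that sits on paths to many unblocked hosts, the censor cannot disable the service by blacklisting a single IP address.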

Hiding Amongst the Clouds: A Proposal for Cloud-based Onion Routing
Internet censorship and surveillance have made anonymity tools increasingly critical for free and open Internet access. Tor, and its associated ecosystem of volunteer traffic relays, provides one of the most secure and widely-available means for achieving Internet anonymity today. Unfortunately, Tor has limitations, including poor performance, inadequate capacity, and a susceptibility to wholesale blocking. Rather than utilizing a large number of volunteers (as Tor does), we propose moving onion-routing services to the "cloud" to leverage the large capacities, robust connectivity, and economies of scale inherent to commercial datacenters. This paper describes Cloud-based Onion Routing (COR), which builds onion-routed tunnels over multiple anonymity service providers and through multiple cloud hosting providers, dividing trust while forcing censors to incur large collateral damage. We discuss the new security policies and mechanisms needed for such a provider-based ecosystem, and present some preliminary benchmarks. At today's prices, a user could gain fast, anonymous network access through COR for only pennies per day.
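The tunnels described above rest on layered (onion) encryption across relays rented from different providers. The sketch below shows that layering idea using symmetric Fernet keys purely for illustration; key distribution, hop addressing, and COR's actual protocol and provider split are not modelled, and the three hops simply stand in for relays at different cloud or anonymity providers.

```python
# Layered encryption sketch: each hop removes exactly one layer and learns only
# the next step, so no single provider sees both the user and the destination.
from cryptography.fernet import Fernet

hop_keys = [Fernet.generate_key() for _ in range(3)]  # one key per relay/provider

def wrap(payload: bytes, keys: list[bytes]) -> bytes:
    # Encrypt innermost layer first, so the first relay peels the outermost one.
    for key in reversed(keys):
        payload = Fernet(key).encrypt(payload)
    return payload

onion = wrap(b"GET / HTTP/1.1\r\nHost: blocked.example\r\n\r\n", hop_keys)
for key in hop_keys:  # each relay in turn strips its own layer
    onion = Fernet(key).decrypt(onion)
assert onion.startswith(b"GET /")
```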

Bypassing Internet Censorship for News Broadcasters
News organizations are often the targets of Internet censorship. This paper looks at two technical considerations for the BBC, blocking detection and circumvention, in the context of its distribution of non-English content into countries such as Iran and China, where its news services are permanently unavailable from the official BBC websites. The study examines an internal BBC prototype system built in 2010 to detect online censorship of its content, and evaluates potential improvements. It also reviews the BBC's use of circumvention tools, and considers the impact and execution of pilot services for Iran and China. Finally, the study considers the technical delivery of the BBC's news output and the methods it employs to bypass Internet censorship.

