
Workshop on Human-Computer Interaction and Security Systems


part of CHI2003, April 5-10, 2003, Fort Lauderdale, Florida


Call for Participants

Organizers
Andrew Patrick, National Research Council of Canada, Andrew.Patrick@nrc.gc.ca
A. Chris Long, Carnegie Mellon University, chrislong@acm.org
Scott Flinn, National Research Council of Canada, Scott.Flinn@nrc.gc.ca

Important Dates:
Position Papers Due: January 17, 2003
Notification of Acceptance or Rejection: February 7, 2003
Workshop Date: April 6, 2003

The human factor is often described as the weakest part of a security system and users are often described as the weakest link in the security chain. This workshop will seek to understand the roles and demands placed on users of security systems, and explore design solutions that can assist in making security systems usable and effective. In addition to examining end-users, this workshop will also examine the issues faced by security system developers and operators. The primary motivation for the workshop is that previous research on HCI and Security (HCISEC) has been scattered in different conferences and journals, and information sharing has been difficult. The goal of the workshop is to build a network of interested people, share research activities and results, discuss high priority areas for research and development, and explore opportunities for collaboration.

Security is a large topic, so there are many areas where HCI is important. Three obvious areas of interest are authentication (passwords, biometrics, etc.), security operations (intrusion detection, vigilance, policies and practices, etc.), and developing secure systems (developing for security, understanding users, installation and operation support, etc.). We are interested in receiving submissions on these topics, and suggestions of other possible topic areas are also welcome.

Contributions should be in the CHI 2003 Extended Abstracts style, 2 to 4 pages long. Participants will be selected based on their contribution to the goals of the workshop, shared interests with the other participants, and the likelihood of promoting discussion and collaboration.

Please send inquiries and submissions to Andrew Patrick, Andrew.Patrick@nrc.gc.ca


FORMAT OF THE WORKSHOP (subject to change)

Participation Solicitation and Selection

The goal is to bring together researchers and practitioners to facilitate information sharing and collaboration. We will solicit participation from people known to be interested in the HCISEC field, and advertise more widely (e.g., the Development Consortium, various security fora). Papers will be selected by the workshop organizers based on their contribution to the goals of the workshop, shared interests with the other participants, diversity and variety in topics and approaches, and the likelihood of promoting discussion and collaboration.

Method of Interaction: 1 Day Workshop

This workshop will include brief paper presentations, group discussion about the papers to identify key points, breakout sessions, reports from breakout groups, and a final group discussion and follow-up planning session. A summary of the workshop will be prepared as a poster for display during the conference. Participants will also have the option of a group dinner at the end of the day to encourage further networking and discussion.

Schedule for the Workshop

9.00-9.15	Introduction and context setting by organizers
9.15-10.15	Introductions, statements of interest, and presentations by attendees
          	(time will be adjusted according to the number of attendees)
10.15-10.30	Coffee Break
10.30-11.30	More presentations
11.30-12.30	Discussion of presentations, themes, shared interests
12.30-13.30	Lunch
13.30-13.45	Summarize morning discussions, organize breakout groups
13.45-15.00	Breakout group discussions (topics to be based on themes identified earlier)
15.00-15.45	Breakout group summaries
15.45-16.00	Coffee break
16.00-17.00	Summary, compose poster and report, decide on follow-up opportunities
17.00-17.30	Wrap up and leave
18.30-22.00	Group dinner

THE TOPIC: HCI AND SECURITY SYSTEMS

The human factor is often described as the weakest part of a security system and users are often described as the weakest link in the security chain. This workshop will seek to understand the roles and demands placed on users of security systems, and explore design solutions that can assist in making security systems usable and effective. In addition to examining end-users, this workshop will also examine the issues faced by security system developers and operators. The primary motivation for the workshop is that previous research on HCI and Security (HCISEC) has been scattered in different conferences and journals, and information sharing has been difficult. The goal of the workshop is to build a more cohesive and active HCISEC community of researchers and practitioners. This will be done by building a network of interested people, sharing research activities and results, discussing high priority areas for research and development, and exploring opportunities for collaboration.

Recent history in the USA and around the world has increased the apparent importance of security systems, both physical and electronic. A number of initiatives and responses have emerged ranging from new security systems and devices to large-scale national identification and security programs. Little attention has been paid, however, to the issue of whether such systems will be usable and effective. HCI researchers and practitioners have a lot to offer security system developers, purchasers, and users. This workshop will be an opportunity for these people to come together, share information, and develop relationships.

Scope of the Topic

Security is a large topic, so there are many areas where HCI is important. Three obvious areas of interest are authentication (passwords, biometrics, etc.), security operations (intrusion detection, vigilance, policies and practices, etc.), and developing secure systems (developing for security, understanding users, installation and operation support, etc.). Some previous research has been done in each of these areas, but there are many open issues.

Authentication

The most common authentication procedure is for the user to provide a user ID and a shared secret password that they have chosen. Users have been described as the weakest link in security systems because of their behavior when using user ID/password systems. Many studies have shown, for example, that users tend to choose short and/or guessable passwords [1]. Another very common problem is that users forget their passwords. One estimate is that 50 percent of all help desk calls are password related, and most of these are because a password has been forgotten (Murrer, in [3]).
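
To make the guessability problem concrete, the sketch below (in Python) shows the kind of proactive check a system could run when a password is chosen; the word list and length threshold are illustrative assumptions, not figures from the studies cited above.

    COMMON_WORDS = {"password", "secret", "letmein", "qwerty", "123456"}

    def is_guessable(password, min_length=8):
        """Return True if the password is short or based on a common word."""
        lowered = password.lower()
        if len(password) < min_length:
            return True
        # Stripping trailing digits catches simple variants like "password1".
        return lowered.rstrip("0123456789") in COMMON_WORDS

    print(is_guessable("password1"))    # True: dictionary word plus a digit
    print(is_guessable("k7#Qw9!zLm"))   # False under these illustrative rules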

Probably because of the difficulty of remembering them, users also have a tendency to write their passwords down. In one study, 50 percent of the users surveyed admitted to writing down their passwords, and the other 50 percent did not answer the question [1]. Other notorious password behaviors are: (1) users share their passwords with friends and colleagues, (2) users fail to change their passwords on a regular basis even when instructed to, (3) users may choose the same password (or closely related passwords) for multiple systems, and (4) users are often willing to tell their passwords to strangers who ask for them (asking was the most common technique used by Kevin Mitnick in his infamous security exploits [10]).

There are solutions to the security issues caused by the behavior of users, but they are not commonly used (see [1] for an excellent review). To alleviate the problem of remembering multiple passwords, for example, organizations can support synchronized passwords across systems. A related solution is a single sign-on system, where users are authenticated once and are then allowed to access multiple systems. Another technique is to reduce the memory load placed on users. It is well known that cued recall, where users are prompted for the information they must remember, is more accurate than free recall [4]. This can be used in security systems by requiring personal associations for passwords, such as "dear - god", "black - white", "spring - garden". Performance can also be improved by not asking users to recall at all, but rather to recognize certain material. Recognition is much easier and more accurate than recall [8]. There is some evidence, for example, that Passfaces are easier to remember than passwords, especially after long intervals with no use [3].
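
As a minimal sketch of the cued-recall idea, assuming a hypothetical enrollment of word associations (an illustration, not a prescribed design), a challenge could look like the following:

    import random

    # Hypothetical cued-recall challenge: the user enrolls word associations and is
    # later prompted with a cue word instead of being asked to free-recall a password.
    associations = {"dear": "god", "black": "white", "spring": "garden"}

    def challenge(enrolled, response_fn):
        cue = random.choice(list(enrolled))   # prompt the user with one cue word
        answer = response_fn(cue)             # the user supplies the associated word
        return answer.strip().lower() == enrolled[cue]

    # A user who remembers the enrolled associations passes the challenge.
    print(challenge(associations, lambda cue: associations[cue]))

A recognition-based scheme such as Passfaces replaces the typed response with picking the enrolled item out of a set of decoys, trading free recall for the easier recognition task.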

There has been much interest recently in using biometrics, such as fingerprints or voice patterns, for user identification [5] [9], but these systems can have their own problems. Biometrics can be hard to forge but easy to steal [11]. For example, fingerprints can be lifted from objects and used when the owner is not present. Also, the master file of biometric templates can be compromised, so that an intruder could replace a legitimate thumbprint file with their own. If the integrity of a biometric has been compromised (e.g., a thumbprint file has been widely distributed), the biometric system becomes unusable forever, since a thumbprint, unlike a password, cannot be changed. Finally, a biometric security network can be compromised by packet sniffing and insertion, where an illegitimate biometric file is inserted in place of a legitimate one that is being transmitted.
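
One commonly suggested mitigation for template substitution, sketched below purely as an illustration (the key and template bytes are hypothetical placeholders), is to store a keyed hash alongside each enrolled template so that a replaced file no longer verifies:

    import hmac, hashlib

    # Hypothetical keyed-hash integrity check for stored biometric templates.
    SERVER_KEY = b"example-integrity-key"

    def protect(template):
        """Return an HMAC-SHA256 tag to store alongside the enrolled template."""
        return hmac.new(SERVER_KEY, template, hashlib.sha256).digest()

    def verify(template, tag):
        """Reject a template whose tag no longer matches, e.g., after substitution."""
        return hmac.compare_digest(hmac.new(SERVER_KEY, template, hashlib.sha256).digest(), tag)

    enrolled_template = b"...thumbprint feature vector..."
    tag = protect(enrolled_template)
    print(verify(enrolled_template, tag))        # True: file is unchanged
    print(verify(b"attacker template", tag))     # False: substitution is detected

Note that such a check only detects substitution of the stored file; it does nothing about theft of the biometric itself, which is the harder problem noted above.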

Biometric systems can be based on physical characteristics, such as fingerprints, or behavioral characteristics, such as voice patterns [7] or typing styles [6]. The performance of behavioral biometrics (in terms of correct rejections and false acceptances) can be affected by health, stress, and other factors. Also, at least one behavioral biometric system, the one based on typing styles, appears to be less acceptable to users, who are afraid that their work performance may be monitored in some way [3].
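
As a toy sketch of the typing-style idea in [6] (the timings and acceptance threshold below are illustrative assumptions, not values from that work), a behavioral check might compare the inter-key latencies of a login attempt to an enrolled profile:

    # Toy keystroke-latency comparison; timings are in milliseconds.
    def mean_abs_difference(profile, attempt):
        return sum(abs(p - a) for p, a in zip(profile, attempt)) / len(profile)

    def accept(profile, attempt, threshold=40.0):
        """Accept the attempt if its typing rhythm is close enough to the enrolled profile."""
        return mean_abs_difference(profile, attempt) <= threshold

    enrolled = [120.0, 95.0, 150.0, 110.0]                  # enrolled mean inter-key latencies
    print(accept(enrolled, [125.0, 90.0, 160.0, 105.0]))    # True: similar rhythm
    print(accept(enrolled, [60.0, 200.0, 80.0, 220.0]))     # False: different typist

The choice of threshold directly trades correct rejections against false acceptances, which is exactly where the health and stress effects mentioned above become a design concern.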

Security Operations

Human factors problems are not restricted to end-users. System operators are also human and therefore have limitations and the potential to make mistakes. Perhaps the most serious behavioral problem of system operators is poor configuration of the system. This may be due to failure to understand the security technology, and/or failure to activate all of the necessary features. In one study, failures during installation and feature configuration were the most common sources of security problems in the banking and government sectors [2].

System operators of large installations also face the problems encountered in other domains of monitoring and controlling large complex systems. Tools such as distributed firewalls promise to improve security, but configuring, monitoring, and controlling these systems is difficult. Operators would benefit from better interfaces for these systems.

Another problem seen with system operators is poor operating procedures. This includes not keeping the system up-to-date, not responding to security notices, badly managing their own passwords, cost-cutting, and simple laziness. An interesting research area might be an analysis of factors that contribute to inappropriate system operator behaviors. Finally, operator fraud can be a serious problem in situations where security compromises can lead to financial gain [2].

Developing Secure Systems

Developers of secure systems face a serious challenge. If a security system is not user-friendly, developers face failure in the marketplace, or users who circumvent or ignore the security features. Although it often appears that security and usability are contrary product attributes, it need not be that way. For example, Yee [12] has recently laid out ten HCI design principles that can be used to improve the usability of security systems. In addition, tools are emerging to assist developers in checking for security vulnerabilities. Often the results and implications of the code scans can be complex and difficult to interpret. The field of HCI can likely contribute to improving security-enhancing development tools.
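
As a rough illustration of what such code scans involve (the pattern list below is an assumption, not any particular tool's rule set), even a naive scanner produces findings that developers must interpret and prioritize:

    import re, sys

    # Naive source scanner: flag calls to C library functions widely considered risky.
    RISKY_CALLS = re.compile(r"\b(gets|strcpy|strcat|sprintf)\s*\(")

    def scan(path):
        with open(path, encoding="utf-8", errors="replace") as src:
            for number, line in enumerate(src, start=1):
                if RISKY_CALLS.search(line):
                    print(f"{path}:{number}: possible unsafe call: {line.strip()}")

    if __name__ == "__main__":
        for filename in sys.argv[1:]:
            scan(filename)

Presenting, explaining, and prioritizing such findings for developers is precisely the kind of interface problem where HCI methods can help.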

Another development issue is design philosophy. Especially in the realm of Web applications and services, design typically proceeds from the bottom up, driven by well-established Web application design patterns and the constraints imposed by underlying technologies, such as public key cryptography. However, it is often difficult to retrofit these design patterns with acceptable security architectures. An alternative approach begins with a user-centered analysis of workflow and information flow (with emphasis on the boundaries), followed by a design approach that is driven from the top, taking care to use well-established security models to enforce access control and data separation where appropriate.

REFERENCES

1. Adams, A., & Sasse, M.A. (1999). Users are not the enemy: Why users compromise computer security mechanisms and how to take remedial measures. Communications of the ACM, 42, 41-46.
2. Anderson, R. (1994). Why cryptosystems fail. Communications of the ACM, 37, 32-40.
3. Brostoff, S., & Sasse, M.A. (2000). Are Passfaces more usable than passwords? A field trial investigation. In Proceedings of HCI 2000, Sept. 5-8, Sunderland, U.K., 405-424. Springer.
4. Crowder, R.G. (1976). Principles of learning and memory. Hillsdale, NJ: Lawrence Erlbaum Associates.
5. Jain, A., Hong, L., & Pankanti, S. (2000). Biometric identification. Communications of the ACM, 43, 91-98.
6. Joyce, R., & Gupta, G. (1990). Identity authentication based on keystroke latencies. Communications of the ACM, 33, 168-176.
7. Markowitz, J.A., (2000). Voice biometrics. Communications of the ACM, 43, 66-73.
8. Preece, J. (1994). Human-computer interaction. NY: Addison-Wesley.
9. Rejman-Greene, M. (2001). Biometrics -- Real identities for a virtual world. BT Technology Journal, 19, 115-121.
10. Sasse, M.A., Brostoff, S., & Weirich, D. (2001). Transforming the 'weakest link': A human/computer interaction approach to usable and effective security. BT Technology Journal, 19, 122-131.
11. Schneier, B. (1999). Biometrics: Uses and abuses. Communications of the ACM, 42 (8), 58.
12. Yee, K.-P. (2002). User Interaction Design for Secure Systems. http://zesty.ca/sid/uidss-may-28.pdf