Cover
Information Security and Cryptography Texts and Monographs
Towards Hardware-Intrinsic Security
ISBN 9783642144516
Foreword
Contents
List of Contributors
Part I Physically Unclonable Functions (PUFs)
Physically Unclonable Functions: A Study on the State of the Art and Future Research Directions
Roel Maes and Ingrid Verbauwhede
1 Introduction
2 PUF Terminology and Measures
2.1 Challenges and Responses
2.2 Inter- and Intra-distance Measures
2.3 Environmental Effects
3 PUF Instantiations
3.1 Non-electronic PUFs
3.2 Analog Electronic PUFs
3.3 Delay-Based Intrinsic PUFs
3.4 Memory-Based Intrinsic PUFs
3.5 PUF Concepts
4 PUF Properties
4.1 Property Description
4.2 Property Check
4.3 Least Common Subset of PUF Properties
5 PUF Application Scenarios
5.1 System Identification
5.2 Secret Key Generation
5.3 Hardware-Entangled Cryptography
6 PUF Discussions and Some Open Questions
6.1 Predictability Versus Implementation Size
6.2 Formalization of PUF Properties
6.3 Reporting on PUF Implementation Results
7 Conclusion
References
Hardware Intrinsic Security from Physically Unclonable Functions
Helena Handschuh, Geert-Jan Schrijen, and Pim Tuyls
1 Introduction
2 Rethinking Secure Key Storage Mechanisms
2.1 Limitations of Current Key Storage Mechanisms
2.2 A Radical New Approach to Secure Key Storage
3 Hardware Intrinsic Security
3.1 Physically Unclonable Functions
3.2 Examples of PUFs
3.3 Secure Key Storage Based on PUFs
4 Quality of a PUF
4.1 Reliability
4.2 Security
5 Conclusions
References
From Statistics to Circuits: Foundations for Future Physical Unclonable Functions
Inyoung Kim, Abhranil Maiti, Leyla Nazhandali, Patrick Schaumont, Vignesh Vivekraja, and Huaiye Zhang
1 Introduction
2 Components and Quality Factors of a PUF Design
2.1 Components of a PUF
2.2 PUF Quality Factors
2.3 Sources of CMOS Variability and Compensation of Unwanted Variability
3 Circuit-Level Optimization of PUF
3.1 Methodology
3.2 Background: Operating Voltage and Body Bias
3.3 Effect of Operating Voltage and Body Bias on PUF
4 Architecture-Level Optimization of PUF
4.1 Compensation of Environmental Effects
4.2 Compensation of Correlated Process Variations
5 Identity Mapping and Testing
5.1 Statistical Preliminaries
5.2 A New Test Statistic: Q
5.3 Experimental Results
5.4 Compensation of Environmental Effects
5.5 Open Challenges
6 Conclusions
References
Strong PUFs: Models, Constructions, and Security Proofs
Ulrich Rührmair, Heike Busch, and Stefan Katzenbeisser
1 Introduction
2 Implementations of Strong Physical Unclonable Functions
3 Physical Unclonable Functions: Toward a Formal Definition
3.1 Physical One-Way Functions
3.2 Physical Unclonable Functions
3.3 Physical Random Functions
4 Alternative Attack Models
4.1 Semi-formal Models for Strong PUFs
4.2 The Digital Attack Model
5 Identification Schemes Based on Strong PUFs
5.1 PUF-Based Identification Schemes
5.2 Security of PUF-Based Identification in the Digital Attack Model
6 Conclusions
References
Part II Hardware-Based Cryptography
Leakage Resilient Cryptography in Practice
François-Xavier Standaert, Olivier Pereira, Yu Yu, Jean-Jacques Quisquater, Moti Yung, and Elisabeth Oswald
1 Introduction
2 Background
2.1 Notations
2.2 Definition of a Leakage Function
3 Unpredictability vs. Indistinguishability
4 Physical Assumptions: Local vs. Global Approach
4.1 Analogy with Classical Cryptanalysis
5 Leakage Resilient PRGs
5.1 On the Difficulty of Modeling a Leakage Function
5.2 Theoretical Security Analysis and Limitations
5.3 Proving Leakage Resilience with Random Oracles
5.4 Practical Security Analysis
6 Initialization Issues
6.1 Breaking [34] with a Standard DPA
6.2 Secure Initialization Process
6.3 A More Elegant (and Standard) Construction
6.4 Remark on the Impossibility of a Secure Initialization Process with an Adaptive Selection of the Leakages
7 Generalization to PRFs
8 Remark on the Impossibility of Proving the Leakage Resilience for the Forward Secure PRG of Fig. 6a in the Standard Model
9 Open Problems
10 Further Details
10.1 Security Metric
10.2 Proof of Theorem 1
10.3 Proof of Theorem 2
References
Memory Leakage-Resilient Encryption Based on Physically Unclonable Functions
Frederik Armknecht, Roel Maes, Ahmad-Reza Sadeghi, Berk Sunar, and Pim Tuyls
1 Introduction
2 Related Work
3 Memory Attacks
4 Preliminaries
5 Physically Unclonable Functions
6 Pseudorandom Functions Based on PUFs
7 Encrypting with PUF-(w)PRFs
7.1 General Thoughts
7.2 A Stream Cipher Based on PUF-PRFs
7.3 A Block Cipher Based on PUF-PRFs
8 SRAM PRFs
8.1 Physical Implementation Details of Static Random Access Memory (SRAM)
8.2 The SRAM PUF Construction
8.3 SRAM PUF Parameters and Experimental Validation
8.4 From SRAM PUF to SRAM PRF
8.5 A Concrete Block Cipher Realization Based on SRAM-PRFs
9 Conclusions
References
Part III Hardware Attacks
Hardware Trojan Horses
Mohammad Tehranipoor and Berk Sunar
1 What Is the Untrusted Manufacturer Problem?
2 Hardware Trojans
3 A Taxonomy of Hardware Trojans
4 A High-Level Attack: Shadow Circuits
5 Trojan Detection Methodologies
5.1 Trojan Detection Using Side-Channel Signal Analysis
5.2 Trojan Activation Methods
6 Design-for-Hardware-Trust Techniques
7 Circuit Obfuscation as a Countermeasure
References
Extracting Unknown Keys from Unknown Algorithms Encrypting Unknown Fixed Messages and Returning No Results
Yoo-Jin Baek, Vanessa Gratzer, Sung-Hyun Kim, and David Naccache
1 Introduction
2 The Intuition
3 Notations and Statistical Tools
4 The Attack
4.1 The Exhaust Routine
5 Practical Experiments
6 Implications and Further Research
References
Part IV Hardware-Based Policy Enforcement
License Distribution Protocols from Optical Media Fingerprints
Ghaith Hammouri, Aykutlu Dana, and Berk Sunar
1 Introduction
2 Pits and Lands
2.1 Source of Variation
2.2 Single Location Characterization
3 Experimental Validation
4 CD Fingerprinting
4.1 Fuzzy Extractors
4.2 Fingerprint Extraction
4.3 Entropy Estimation and 128-Bit Security
5 Robustness of the Fingerprint
6 License Distribution Protocol
6.1 Simple Distribution Protocol
6.2 Secure Reader Protocol
6.3 Online Distribution Protocol
7 Conclusion
References
Anti-counterfeiting: Mixing the Physical and the Digital World
Darko Kirovski
1 Introduction
1.1 Classification
2 Desiderata for Anti-counterfeiting Technologies
3 Digitizing the Physical World
4 Applications
5 Review of Existing Methodologies
5.1 RF-DNA
5.2 Challenge/Response COA Systems
6 Conclusion
References
Part V Hardware Security in Contactless Tokens
Anti-counterfeiting, Untraceability and Other Security Challenges for RFID Systems: Public-Key-Based Protocols and Hardware
Yong Ki Lee, Lejla Batina, Dave Singelee, Bart Preneel, and Ingrid Verbauwhede
1 Introduction
2 Security and Privacy Requirements
2.1 Security Objectives
2.2 Privacy Objectives
2.3 General Objectives
3 State of the Art
3.1 Authentication Protocols Based on Private-Key Cryptography
3.2 Authentication Protocols Based on PUFs
3.3 Authentication Protocols Based on Public-Key Cryptography
4 Untraceable Authentication Protocols Based on ECC
4.1 Notation
4.2 EC-RAC II
4.3 Randomized Schnorr Protocol
4.4 Man-in-the-Middle Attacks
5 EC-RAC IV
6 Search Protocol
6.1 Protocol Description
6.2 Search Protocol Analysis
6.3 Combining Authentication Protocols
7 Implementation
7.1 Overall Architecture
7.2 New MALU Design
7.3 Performance Evaluation
8 Conclusions
References
Contactless Security Token Enhanced Security by Using New Hardware Features in Cryptographic-Based Security Mechanisms
Markus Ullmann and Matthias Vögeler
1 Introduction
1.1 Benefits of Contactless Smart Cards
1.2 Security Limitation of Supposed Security Mechanisms for an Authenticated Connection Establishment Between Terminals and Contactless Cards
1.3 Security Limitation of Device Authentication Protocols Based on Irrevocable Authentication Certificates
2 Contactless Security Token
2.1 Flexible Display Technology
2.2 Real-Time Clock
2.3 Buttons
3 Authenticated Connection Establishment
3.1 Password-Based Cryptographic Protocols
3.2 Password Authenticated Connection Establishment (PACE)
3.3 Security Token Operation
3.4 Security Analysis of PACE Using Fresh Passwords
3.5 Brute-Force Online-Attacks on Passwords
4 Secure Time Synchronization
4.1 Time Values
4.2 Time Server-Based Synchronization Protocols
4.3 Security Requirements for Time Synchronization
4.4 Secure Time Synchronization Protocols
4.5 Security and Performance Analysis
5 Applications
5.1 Authentication of Internet Services
6 Conclusion
References
Enhancing RFID Security and Privacy by Physically Unclonable Functions
Ahmad-Reza Sadeghi, Ivan Visconti, and Christian Wachsmann
1 Introduction
1.1 Contribution
2 High-Level RFID System and Requirement Analysis
2.1 System Model
2.2 Trust and Adversary Model
2.3 Security and Privacy Threats
2.4 Security and Privacy Objectives
3 Related Work
3.1 Privacy-Preserving RFID Protocols
3.2 RFID Protocols Based on Physically Unclonable Functions
3.3 Privacy Models for RFID
4 RFID Security and Privacy Model of Vaudenay [67]
4.1 General Notation
4.2 Pseudorandom Function (PRF)
4.3 Physically Unclonable Function (PUF)
4.4 System Model
4.5 Adversary Model
4.6 Definition of Correctness, Security, and Privacy
5 A PUF-Based Destructive-Private RFID Protocol
5.1 Correctness
6 Security Analysis
6.1 Tag Authentication
6.2 Destructive Privacy
7 Conclusion
References
Part VI Hardware-Based Security Architectures and Applications
Authentication of Processor Hardware Leveraging Performance Limits in Detailed Simulations and Emulations
Daniel Y. Deng, Andrew H. Chan, and G. Edward Suh
1 Introduction
2 Threat Model
3 Authentication Approach
4 Hardware Design
4.1 Microarchitectural Features
4.2 Checksum Computation
4.3 New Instructions
4.4 Non-determinism
5 Challenge Program
6 Evaluation
6.1 Overheads
6.2 Effectiveness
6.3 Deterministic Execution
6.4 Security Discussion
7 Related Work
8 Conclusion
References
Signal Authentication in Trusted Satellite Navigation Receivers
Markus G. Kuhn
1 Introduction
1.1 Environmental Assumptions
1.2 Related Technologies
1.3 Goals
2 Techniques
2.1 Secret Spreading Sequences
2.2 Individual Receiver Antenna Characteristics
2.3 Consistency with Reference Receivers
2.4 Receiver-Internal Plausibility Tests
2.5 Some Other Ideas
3 Comparison
4 Conclusions
References
On the Limits of Hypervisor- and Virtual Machine Monitor-Based Isolation
Loic Duflot, Olivier Grumelard, Olivier Levillain, and Benjamin Morin
1 Introduction
2 Compartmented Systems
2.1 Traditional Architectures and Definition of a Trusted Computing Base
2.2 Attacker Model
3 Attack Paths
3.1 Taxonomy of Attack Vectors
4 Design of a DIMM Backdoor
4.1 Overview of DDR DIMM
4.2 Principle of the Backdoor
4.3 Proof of Concept Implementation
5 Exploitation
5.1 Difficulties
5.2 Use of the Hidden Functions to Access Sensitive Data
5.3 Use of the Backdoor as a Means for Privilege Escalation
6 Countermeasures
7 Conclusion and Future Work
References
Efficient Secure Two-Party Computation with Untrusted Hardware Tokens
Kimmo Järvinen, Vladimir Kolesnikov, Ahmad-Reza Sadeghi, and Thomas Schneider
1 Introduction
1.1 Our Setting, Goals, and Approach
1.2 Envisioned Applications
1.3 Our Contributions and Outline
1.4 Related Work
2 Preliminaries
2.1 Garbled Circuits (GC)
3 Architecture, System, and Trust Model
4 Token-Assisted Garbled Circuit Protocols
4.1 Protocols Overview and Security
4.2 Circuit Representation
4.3 GC Creation with Stateful Token (Secure Counter)
4.4 GC Creation with Stateless Token (No Counter)
5 Further Optimizations
5.1 Optimizing Memory of Client
5.2 Optimizing Runtime of Token by Caching
6 Proof-of-Concept Implementation
6.1 Architecture
6.2 Prototype Implementation
References
Towards Reliable Remote Healthcare Applications Using Combined Fuzzy Extraction
Jorge Guajardo, Muhammad Asim, and Milan Petkovic
1 Introduction
2 Remote Patient Monitoring Services and Data Reliability Issues
2.1 Data Reliability Issues
3 Fuzzy Extractors, PUFs, and Biometrics
3.1 Preliminaries
3.2 Physical Unclonable Functions
3.3 Biometrics
3.4 The Need for Fuzzy Extractors
4 Combining PUFs and Biometrics
4.1 A Practical Simplification
4.2 Other Variations
4.3 Security and Safety
5 Conclusions
References
Authentication of Processor Hardware Leveraging Performance Limits in Detailed Simulations and Emulations
Daniel Y. Deng, Andrew H. Chan, and G. Edward Suh
1 Introduction
2 Threat Model
3 Authentication Approach
4 Hardware Design
4.1 Microarchitectural Features
4.2 Checksum Computation
4.3 New Instructions
4.4 Non-determinism
5 Challenge Program
6 Evaluation
6.1 Overheads
6.2 Effectiveness
6.3 Deterministic Execution
6.4 Security Discussion
7 Related Work
8 Conclusion
References
Signal Authentication in Trusted Satellite Navigation Receivers
Markus G. Kuhn
1 Introduction
1.1 Environmental Assumptions
1.2 Related Technologies
1.3 Goals
2 Techniques
2.1 Secret Spreading Sequences
2.2 Individual Receiver Antenna Characteristics
2.3 Consistency with Reference Receivers
2.4 Receiver-Internal Plausibility Tests
2.5 Some Other Ideas
3 Comparison
4 Conclusions
References
On the Limits of Hypervisor- and Virtual Machine Monitor-Based Isolation
Loïc Duflot, Olivier Grumelard, Olivier Levillain, and Benjamin Morin
1 Introduction
2 Compartmented Systems
2.1 Traditional Architectures and Definition of a Trusted Computing Base
2.2 Attacker Model
3 Attack Paths
3.1 Taxonomy of Attack Vectors
4 Design of a DIMM Backdoor
4.1 Overview of DDR DIMM
4.2 Principle of the Backdoor
4.3 Proof of Concept Implementation
5 Exploitation
5.1 Difficulties
5.2 Use of the Hidden Functions to Access Sensitive Data
5.3 Use of the Backdoor as a Means for Privilege Escalation
6 Countermeasures
7 Conclusion and Future Work
References