A two-round universal composable group key exchange protocol
Abstract
A group key exchange (GKE) protocol allows a group of parties to securely establish a common key within an untrusted network. For practical reasons, the number of communication rounds should be as low as possible, in the best case constant and independent of the group size. Furthermore, the security should be proven within a well-defined and established security model. One promising candidate is the universal composability (UC) framework, which ensures the secure concurrent run of protocols and the use of their outputs in any subsequent application. Katz and Shin formalized a security model within the UC framework with respect to attackers who can control the communication, adaptively corrupt parties, and take part in the protocol. Furthermore, they described a mechanism to turn GKE protocols with certain security properties into UC-secure protocols. This mechanism can be used to construct a UC-secure GKE protocol which requires three broadcast rounds. To the best of our knowledge, no other UC-secure GKE protocol has been published so far.
In this talk we present a UC-secure GKE protocol that requires only two broadcast rounds, i.e. one round less. The size of the messages is constant, that is, independent of the group size. The proof of security is in the standard model and relies on the decisional bilinear Diffie-Hellman assumption (or a variation, depending on whether the number of parties is even or odd).
This is a recent (still unpublished) result of J. Furukawa, K. Kurosawa and myself.
Title: Practical Decorrelation
Abstract:
In the first part of this talk we will recall several essential results of Serge Vaudenay's Decorrelation Theory. In particular we will review some of the basic tools which make it possible to prove the security of a block cipher in the Luby-Rackoff model. In this model, a computationally unbounded adversary tries to distinguish a random instance of a block cipher from a permutation drawn uniformly at random among all possible permutations (often referred to as "the perfect cipher"), the only limitation being the number of plaintext/ciphertext pairs available. In the second part of this talk, we will give two practical block cipher examples for which security can be proven using the above-mentioned techniques. More precisely, we will review the block cipher C (and detail the reasons why it is indistinguishable from the perfect cipher on the basis of two plaintext/ciphertext pairs) and the block cipher KFC (which presumably achieves higher levels of security, at the price of a large penalty in terms of efficiency).
This talk will essentially rely on Serge Vaudenay's Journal of Cryptology article on Decorrelation Theory and on some joint work with Matthieu Finiasz.
baigneres_practical_decorrelation.pdf
Title: Algebraic attacks on NFSR
Abstract:
Algebraic attacks have been extensively studied against LFSR-based ciphers, especially against filtered LFSRs and combinations of LFSRs. In this talk we will show that it is possible to extend algebraic attacks and fast algebraic attacks to certain types of Non-Linear Feedback Shift Registers (NFSRs) and to combinations of NFSRs and LFSRs. This allows us to derive new rules for designing stream ciphers with NFSRs.
berbain_nfsr.pdf
Item #1: ChaCha20, an improvement of Salsa20. See http://cr.yp.to/talks.html#2008.01.09-1.
Item #2: Better universal hashing, http://cr.yp.to/papers.html#pema, combining the stream-processing speed of UMAC et al. with the small key size and high key agility of Poly1305 et al. See http://cr.yp.to/talks.html#2008.01.09-2 for an application to authentication (MAC1271), and http://eprint.iacr.org/2008/004 by Sarkar for an application to disk-block encryption.
Item #3: http://cr.yp.to/cipherdag/20070924.html, a prototype tool for turning people's C/C++ cipher implementations into data-flow diagrams and then experimenting graphically with the diagrams. If you've written code for automated cryptanalysis (for example, searching for differentials) and if you're annoyed at the difficulty of converting people's ciphers into input to your code, let's see whether we can hook my data-flow diagrams into your code!
The trail backtracking attack
In the context of hash function evaluation, we presented the trail backtracking attack. This attack is based on the study of differential trails and the probability of building a collision from them.
Like many cryptanalysis tools, it can be used by designers to show that a hash function does not present trails suitable for attacking the design. On the other hand, a cryptanalyst can search for trails that could lead to a collision with a certain probability.
The presentation will cover the concept of differential trails applied to hash functions, the basics of the attack, practical examples of how we used trail backtracking to evaluate the RadioGatún design, and how the attacks on Panama (Rijmen et al., FSE 2001; Daemen and Van Assche, FSE 2007) and on Grindahl (Peyrin, ASIACRYPT 2007) can be seen as forms of the trail backtracking attack.
This is a joint work with Joan Daemen, Michaël Peeters and Gilles Van Assche.
"The Trail Backtracking Attack.pdf"
Approximation of a combining function by functions of fewer variables
Stream ciphers which combine several independent devices, such as combination generators or the recent Achterbahn proposal, are vulnerable to divide-and-conquer attacks. These attacks usually exploit an approximation of the combining function by a function of fewer variables. The accuracy of such an approximation is therefore an important parameter in the complexity of these attacks. In this context, we evaluate the correlations between a Boolean combining function and the functions depending on a small subset of its input variables. We notably show that the corresponding bias is upper-bounded by a quantity which depends on the nonlinearity of the function.
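For small functions, the best approximation by a function of a given subset of variables, and its bias, can be computed exhaustively. The following sketch (an illustration, not code from the talk) does so by taking, for each projection of the input onto the subset, the majority value of the function:

```python
from itertools import product

def bias_of_best_subfunction(f, n, subset):
    # Best approximation of the Boolean function f on n bits by a function
    # depending only on the variables in `subset`: for each projection onto
    # the subset, output the majority value of f. Returns the bias
    # (probability of agreement minus 1/2).
    groups = {}
    for x in product((0, 1), repeat=n):
        key = tuple(x[i] for i in subset)
        groups.setdefault(key, []).append(f(x))
    agree = sum(max(vals.count(0), vals.count(1)) for vals in groups.values())
    return agree / 2 ** n - 0.5

# Example: majority of 3 bits, approximated by its first variable alone
maj = lambda x: 1 if sum(x) >= 2 else 0
print(bias_of_best_subfunction(maj, 3, (0,)))  # 0.25
```

The bias of the best sub-function is what enters the complexity of a divide-and-conquer attack; for real combining functions, the upper bound in terms of the nonlinearity replaces this exhaustive computation.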
time info: arrival on Sunday, leave on Friday morning.
canteaut.pdf
"A few observations on APN and AB functions".
Abstract:
There remain many difficult open problems on general S-boxes (vectorial Boolean functions). In particular, little is known about the best possible nonlinearity of vectorial functions from $F_2^n$ to $F_2^m$ when $n$ is odd and $m\neq n$, or when $n$ is even and $m>n/2$; few Perfect Nonlinear (PN / bent) functions are known; and important questions remain unanswered about Almost Perfect Nonlinear (APN) and Almost Bent (AB) functions. Progress in these domains is necessary to support the design of future block ciphers. We shall recall these open questions and make some observations about them.
Echternach-carlet.pdf
A narrow trail in hash function design
With Guido Bertoni, Gilles Van Assche and Michaël Peeters, I have for several years formed a team whose main goal is to bring innovation to the field of hash function design. This has led to a concrete hash function design called RadioGatún, a new reference concept called the sponge function, a collision-generating attack on Panama, and work on other more general issues related to stream-oriented hashing such as trail backtracking and the belt-and-mill structure.
In fact, I have been active in hash functions since 1990, with attacks, a design approach quite different from the mainstream, and a number of concrete proposals. In general the corresponding publications have remained rather obscure, and not all of them have stood the test of time. Still, several ideas, constructions and component functions introduced in them have survived and are present in our recent proposals. In my presentation I will give a chronological overview of this early work, concentrating on the aspects that are still relevant today and on the lessons learnt from failures such as the complete lack of security of the Panama hash function, as shown by the attack of Vincent Rijmen et al.
"ESC Narrow Trail Hash.pdf"
Improved Meet-in-the-Middle Attacks on Reduced-Round DES
The Data Encryption Standard (DES) is a 64-bit block cipher. Despite its short key size of 56 bits, DES continues to be used to protect financial transactions valued at billions of Euros. In this work, we investigate the strength of DES against attacks that use a limited number of plaintexts and ciphertexts. By mounting meet-in-the-middle attacks on reduced-round DES, we find that DES reduced to up to 6 rounds is susceptible to this kind of attack.
This is joint work with Gautham Sekar and Bart Preneel.
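As an illustration of the meet-in-the-middle principle (on a toy two-stage cipher with 8-bit keys, not on DES itself), the attacker tabulates the middle values reachable from a plaintext under all first-stage keys, then matches them against decryptions of the ciphertext under all second-stage keys:

```python
# Toy meet-in-the-middle attack on a double cipher with two 8-bit keys.
# The stage function is a stand-in (XOR the key, rotate the byte); the
# point is the table-lookup that meets in the middle.

def enc_stage(x, k):
    x ^= k
    return ((x << 1) | (x >> 7)) & 0xFF  # rotate left by 1

def dec_stage(y, k):
    y = ((y >> 1) | (y << 7)) & 0xFF     # rotate right by 1
    return y ^ k

def double_enc(x, k1, k2):
    return enc_stage(enc_stage(x, k1), k2)

def mitm(pairs):
    p0, c0 = pairs[0]
    # Forward table: middle value after stage 1 -> candidate first-stage keys
    table = {}
    for k1 in range(256):
        table.setdefault(enc_stage(p0, k1), []).append(k1)
    candidates = []
    for k2 in range(256):
        mid = dec_stage(c0, k2)
        for k1 in table.get(mid, []):
            # Filter surviving key pairs against the remaining pairs
            if all(double_enc(p, k1, k2) == c for p, c in pairs[1:]):
                candidates.append((k1, k2))
    return candidates

k1, k2 = 0x3A, 0xC5
pairs = [(p, double_enc(p, k1, k2)) for p in (0x00, 0x55, 0xAA)]
print((k1, k2) in mitm(pairs))  # True: the correct key pair survives
```

The work is roughly 2 * 2^8 stage evaluations instead of 2^16 full encryptions, at the price of memory for the table; attacks on reduced-round DES refine this basic idea.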
Topic
sha-1.pdf
Title: Pruning and Extending the HB+ Family Tree (joint work with Matt Robshaw and Yannick Seurin)
esc_hb#.pdf
Title: Masking Does Not Protect Against Fault Attacks
Joint work with Arnaud Boscher, Spansion.
ESC08_rump.pdf
When e-th roots become easier than factoring
Abstract
TBA
pres_echternach.pdf
Dakota – hashing from a combination of modular arithmetic and symmetric cryptography
Abstract
We propose a cryptographic hash function, where collision resistance is based upon an assumption that involves squaring modulo an RSA modulus in combination with a one-way function that does not compress its input, and may therefore be constructed from standard techniques and assumptions. We are not able to reduce collision finding to factoring, but on the other hand, our hash function is more efficient than any known construction that makes use of modular squaring.
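A purely hypothetical sketch of the general shape described in the abstract, not the actual Dakota construction: a non-compressing map f (here a hash reduced mod n, an assumption for illustration only) combined with squaring modulo a modulus n whose factorisation is meant to be unknown. All names and parameters below are toy-sized illustrations:

```python
import hashlib

# Toy modulus: in a real scheme this would be an RSA modulus with
# unknown factorisation; these small primes are for illustration only.
p, q = 1000003, 1000000007
n = p * q

def f(block: bytes) -> int:
    # Stand-in for the non-compressing one-way function of the abstract;
    # here simply a hash reduced mod n, purely for illustration.
    return int.from_bytes(hashlib.sha256(block).digest(), "big") % n

def dakota_like(msg: bytes, block_size: int = 8) -> int:
    # Hypothetical chaining: mix in f(block), then square mod n.
    s = 1
    for i in range(0, len(msg), block_size):
        s = (s + f(msg[i:i + block_size])) ** 2 % n
    return s
```

The squaring step is where the number-theoretic hardness enters; the abstract's point is that f need not compress, so it can be built from standard symmetric techniques.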
TimeInfo: arrival on Sunday late and leave Friday early.
Dakota.pdf
Update on Factoring
"A kilobit special number field sieve factorization.ppt"
Title: Cryptanalysis of the GOST Hash Function
Abstract
In this talk, we analyze the security of the GOST hash function with respect to preimage resistance and collision resistance. The GOST hash function, defined in the Russian standard GOST-R 34.11-94, is an iterated hash function producing a 256-bit hash value. As opposed to most commonly used hash functions such as MD5 and SHA-1, the GOST hash function defines, in addition to the common iterative structure, a checksum computed over all input message blocks. This checksum is then part of the final hash value computation. For this hash function, we show how to construct second preimages and preimages with a complexity of about 2^225 compression function evaluations. The attacks are independent of the underlying block cipher GOST.
By exploiting the internal structure of the block cipher, we then show an improved preimage attack on the GOST hash function with a complexity of about 2^192. Furthermore, we present a collision attack on the hash function with a complexity of about 2^105 compression function evaluations. This is work in progress.
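The checksum structure described above can be sketched as follows; the compression function here is a stand-in (GOST builds it from its block cipher), but the interplay of chaining value, message length and block checksum in the final computation is the point:

```python
import hashlib

def compress(h: bytes, m: bytes) -> bytes:
    # Stand-in compression function (GOST uses one built from its block cipher)
    return hashlib.sha256(h + m).digest()

def checksum_hash(msg: bytes, block: int = 32) -> bytes:
    # Iterated hash with a checksum over all message blocks, in the style
    # of GOST-R 34.11-94 (simplified sketch, not the actual standard).
    orig_len = len(msg)
    padded = msg.ljust(max((orig_len + block - 1) // block * block, block), b"\0")
    h = b"\0" * 32
    sigma = 0
    for i in range(0, len(padded), block):
        m = padded[i:i + block]
        h = compress(h, m)
        sigma = (sigma + int.from_bytes(m, "big")) % 2 ** 256  # block checksum
    # Length and checksum enter the final computation, so an attacker must
    # control sigma as well as the chaining value.
    h = compress(h, orig_len.to_bytes(32, "big"))
    return compress(h, sigma.to_bytes(32, "big"))
```

The attacks in the talk show that, despite this extra checksum, preimages and collisions can be found well below the generic 2^256 and 2^128 bounds.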
A copy of the slides can be found here:
mendel_gost.pdf
Key Recovery with Probabilistic Neutral Bits.
Abstract
The initialization function of a stream cipher maps the key and the initialization vector (IV) to the initial state, and the automaton thereafter produces keystream bits using an output and an update function. The initialization should have good mixing properties with regard to key and IV bits. If mixing is not complete, a few key bits may have less influence on the value of output bits than others, which makes it possible to separate probabilistic neutral key bits from essential key bits. In a reduced-complexity key recovery, one considers approximations of the initialization that focus on determining the essential key bits. As the initialization is often a complex process, probabilistic neutral key bits may not be detected directly, but only show up in a well-chosen intermediate computation. This is exploited in chosen-IV key recovery attacks in two directions. First, a key recovery faster than exhaustive search is described for the eSTREAM candidate Salsa20 reduced to 8 rounds; it is based on a 4-round truncated differential together with an approximate backwards computation that exploits the existence of probabilistic neutral key bits. In a different direction, a recent framework for chosen-IV statistical distinguishing analysis of stream ciphers is exploited to provide new methods for key recovery attacks. This rests on a polynomial description of the output bits as a function of the key and the IV. It is demonstrated that a deviation of the algebraic normal form from random can be exploited to derive information on the key faster than exhaustive key search. This elaborates on suitable approximations of the polynomial description and on probabilistic neutral key bits. As applications, we give a reduced-complexity key recovery for Trivium with IV initialization reduced to 672 of its 1152 iterations, and a reduced-complexity key recovery for Grain-128 with IV initialization reduced to 180 of its 256 iterations.
These methods are not capable of providing key recovery faster than exhaustive key search for the eSTREAM candidates Trivium and Grain-128 with full initialization.
(This is joint work with Simon Fischer and Shahram Khazaei.)
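The notion of a probabilistic neutral key bit can be illustrated on a toy initialization function: estimate, over random key/IV pairs, how often flipping a given key bit leaves an output bit unchanged. In this deliberately weak 16-bit mixer (an illustration, not a real cipher), the top key bit never reaches the output bit within three rounds, while bit 0 is essential:

```python
import random

def toy_init(key, iv, rounds=3):
    # Toy 16-bit "initialization": a deliberately weak mixer, so that some
    # key bits have no influence on the output bit. Not a real cipher.
    s = key ^ ((iv << 1) & 0xFFFF)
    for _ in range(rounds):
        s = ((s * 5 + 0x1234) ^ (s >> 3)) & 0xFFFF
    return s & 1  # one output bit

def neutrality(bit, samples=2000):
    # Fraction of random (key, IV) pairs for which flipping `bit` of the
    # key leaves the output bit unchanged; close to 1 => neutral.
    rng = random.Random(0)
    same = 0
    for _ in range(samples):
        key = rng.randrange(1 << 16)
        iv = rng.randrange(1 << 16)
        if toy_init(key, iv) == toy_init(key ^ (1 << bit), iv):
            same += 1
    return same / samples
```

Here neutrality(15) is exactly 1.0: multiplication and addition only propagate differences upward and the right shift moves them down by at most three positions per round, so a bit-15 difference cannot reach bit 0 in three rounds. Bit 0, in contrast, is essential.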
TimeInfo: Arrival on Sunday, leave on Thursday morning
Neutralbits.pdf
Collisions for Step-Reduced SHA-256
"Collisions for Step-Reduced SHA 256.ppt"
A survey on linear cryptanalysis using multiple linear approximations
Abstract
Recently, various authors have presented different approaches to using multiple linear approximations. Some are based on the assumption that the approximations are statistically independent, while others use multidimensional distributions without such an assumption. Some authors tune their method for block ciphers and take the Markov cipher view of them, while others analyze stream ciphers. Some perform distinguishing attacks only, while others aim at recovering bits of the key. Different authors also use different statistics, and statistical inference is based on different hypothesis tests. By bringing these different approaches together, this survey aims towards developing a unified framework for multidimensional linear cryptanalysis.
Timeinfo: arrival late Sunday evening and leave late Thursday afternoon
kaisa.pdf
"Efficient Algorithm for Decomposing Multivariate Polynomials and its Applications to Cryptography "
Abstract
In this paper, we present an efficient and general algorithm for decomposing multivariate polynomials of arbitrary degree. This problem, also known as the Functional Decomposition Problem (FDP), is a classical problem in computer algebra. This is the first general method addressing the decomposition of multivariate polynomials (any degree, any number of polynomials). The method relies on the computation of the column ideal of the ideal generated by the partial derivatives of the polynomials obtained from the composition. This computation can be done efficiently using Gröbner bases. From the cryptographic point of view, the new algorithm is so efficient that the principle of two-round schemes, including 2R$^{-}$ schemes, becomes useless. Moreover, we believe that this algorithm might have applications in the cryptanalysis of block ciphers.
Title: Block Ciphers Interacting with Hash Functions
Abstract:
We discuss recent directions that consider analysis and design of block ciphers in interaction with those for hash functions, and give some further results.
This is ongoing unpublished work, partly joint work with Jean-Philippe Aumasson.
Universal hash functions are not so universal
This paper discusses some forgery and key recovery attacks on several universal hash function based MAC algorithms. The attacks use a substantial number of verification queries and in a few cases require nonce reuse. Some of these attacks exploit weak keys, while others can make use of partial information on a secret key, for example, due to a side channel attack. These results show that while universal hash functions offer provable security, high speeds and parallelism, their simple combinatorial properties make them less robust than conventional message authentication primitives.
(joint work with Helena Handschuh)
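The underlying weakness can be illustrated with a toy polynomial hash over a prime field: any two messages whose difference polynomial vanishes at the key collide, so partial key information immediately yields forgeries. Parameters are illustrative, not those of a real MAC:

```python
# Toy polynomial universal hash over GF(P); illustrates why leaked key
# information enables forgeries. P and the key are illustrative only.
P = 2 ** 61 - 1  # a Mersenne prime

def poly_hash(msg, k):
    # msg: list of field elements; h = m_0*k + m_1*k^2 + m_2*k^3 + ... mod P
    h = 0
    for c in reversed(msg):
        h = (h + c) * k % P
    return h

k = 1234567  # suppose this value leaks (weak key class, side channel, ...)
m = [11, 22, 33]
# Add the polynomial (x - k)*x^2 to the message polynomial:
# coefficient of k^2 decreases by k, coefficient of k^3 increases by 1,
# so the two contributions cancel at the point x = k.
forged = [m[0], (m[1] - k) % P, (m[2] + 1) % P]
print(poly_hash(m, k) == poly_hash(forged, k))  # True: a colliding forgery
```

With a nonce-respecting MAC this collision alone is not enough, which is why the attacks in the talk also need verification queries or nonce reuse.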
umac_slidesv4.pdf
Design Paradigms for Building Multi-Property Hash Functions
Abstract
Current usage demands that hash functions enjoy numerous disparate security properties, many of which are not implied by one-wayness and collision-resistance. In this talk, I'll discuss new approaches for building hash functions that incorporate techniques to ensure strong guarantees for multiple security properties.
In the first portion, I'll overview recent work on multi-property-preserving transforms, which describe how to securely expand the domain of compression functions that enjoy numerous security properties. This will include discussion of transforms for both traditional, unkeyed compression functions and (what we call) dedicated-key compression functions.
In the second part we'll look at provably collision-resistant functions (e.g., finding collisions is formally equivalent to solving a hard problem such as factoring). While providing strong CR guarantees, these functions are unsuitable to replace general hash functions, and so we provide a new approach for transforming a CR function into a secure hash function.
This talk will cover joint work with Mihir Bellare and Thomas Shrimpton.
echt08.pdf
Identity-Based Encryption: slow and probably not practical, but at least it doesn't use elliptic curves. (Short talk or Rump session)
Updates on two other projects within Qualcomm, if they are mature enough to talk about.
A Security Architecture for Body Area Networks
abstract
Body Area Network (BAN) applications offer an interesting and challenging environment when it comes to designing a cost-effective security architecture. Stringent constraints mean that it is usually impracticable to reuse standard protocols and cryptographic algorithms. Moreover, BAN systems are usually composed of many heterogeneous devices with different processing power. Hence, making adequate trade-offs and exploiting all available resources is a must.
In the first part of this presentation, we propose a lightweight security architecture for BANs that best fits the security requirements of the targeted application. A small MAC function is proposed, along with two possible ways of extending it. The security of the MAC function is analysed.
In the second part, we give some ideas on how to speed up the search for collision trails of minimum cost.
"Echternach - MPe - A Security Architecture for Body Area Networks - v1.00.pdf"
Masking Non-Linear Functions based on Secret Sharing
Abstract
Implementations of cryptographic algorithms are vulnerable to side-channel attacks. Masking techniques are employed to counter side-channel attacks that are based on multiple measurements of the same operation on different data. Most currently known masking techniques still leak information during the computation of non-linear functions due to the presence of glitches. We present a method based on secret sharing to protect the implementation of non-linear functions such as the AES S-box. Each non-linear function is split into shared functions such that the power consumption is independent of the unmasked values. The method has a higher computational complexity, but remains effective in the presence of glitches.
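For contrast with the glitch-resistant sharing of the talk, a classical two-share masked AND (an ISW-style sketch, not the scheme of the talk) looks as follows; each intermediate value is statistically independent of the unmasked bits, yet glitches in hardware can still leak:

```python
import random

def share(x, rng):
    # Split bit x into two Boolean shares: x = x1 XOR x2
    x1 = rng.getrandbits(1)
    return x1, x ^ x1

def masked_and(x1, x2, y1, y2, rng):
    # Two-share masked AND with a fresh random bit r: no intermediate
    # value equals the unmasked x, y, or x & y.
    r = rng.getrandbits(1)
    z1 = (x1 & y1) ^ r
    z2 = (x2 & y2) ^ (x1 & y2) ^ (x2 & y1) ^ r
    return z1, z2  # z1 XOR z2 == x AND y

x, y = 1, 1
rng = random.Random(7)
x1, x2 = share(x, rng)
y1, y2 = share(y, rng)
z1, z2 = masked_and(x1, x2, y1, y2, rng)
print(z1 ^ z2 == (x & y))  # True: recombining the output shares gives x AND y
```

The correctness follows from (x1^x2)(y1^y2) = x1y1 ^ x1y2 ^ x2y1 ^ x2y2; the glitch problem arises because a circuit may momentarily combine shares before r takes effect, which is what the secret-sharing approach of the talk avoids.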
Sharing.pdf
Collision-Resistant Compression Functions based on Simple Idealized Building Blocks
Abstract:
We consider the problem of domain and range extension of a public random function. Our main emphasis is on the construction of collision-resistant compression functions. The challenge then is to obtain good collision resistance for the compression function while keeping the number of calls to the public random function low.
Specifically, we exhibit a 2n-to-n bit compression function based on three calls to n-to-n bit public random functions, as well as a 3n-to-2n bit compression function based on a single call to a 3n-to-n bit public random function. For both compression functions, the collision resistance is close to what one would expect given the birthday bound.
Part of this work is joint with Thomas Shrimpton.
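The birthday bound referred to above says that about 2^{n/2} queries suffice to find a collision among n-bit outputs. A quick numerical check of the standard approximation against simulation (toy parameters, for illustration):

```python
import math
import random

def birthday_bound(q, n):
    # Standard approximation for the probability of at least one collision
    # among q uniformly random n-bit values.
    return 1 - math.exp(-q * (q - 1) / 2 ** (n + 1))

def simulate(q, n, trials=2000, seed=1):
    # Empirical collision probability over repeated experiments.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        seen = set()
        for _ in range(q):
            v = rng.randrange(1 << n)
            if v in seen:
                hits += 1
                break
            seen.add(v)
    return hits / trials
```

For n = 12 and q = 77 (about 2^{n/2} * 1.2), both values come out near one half, which is the regime the compression-function bounds above are measured against.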
Authenticated Encryption Mode for Beyond the Birthday Bound Security
Abstract
In this talk, I will describe an authenticated encryption mode for blockciphers, called AE1. It has provable security bounds which are better than the usual birthday bound. In addition, AE1 allows parallel execution of blockcipher calls and finite field multiplications. The design is based on the encrypt-then-PRF approach, where the encryption part uses the keystream generation of CENC, and the PRF part combines a hash function based on the inner product with a blockcipher.
slides-iwata.pdf
Cryptographic Sponges
A sponge function models the finite memory of an iterated hash function or an iterated stream cipher. Based on a random transformation or permutation, a random sponge can only be distinguished from a random oracle by inner collisions. Owing to this similarity, it can be used as a model for hash function (and stream cipher) designs with finite memory.
In this presentation, we first revisit the definition of sponge functions and their properties, present a number of new results and their implications, and finally discuss their applications.
This is joint work with Guido Bertoni, Joan Daemen and Michaël Peeters.
CryptographicSponges-ESC2008.pdf
See also http://sponge.noekeon.org/.
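A minimal structural sketch of a sponge (toy 16-bit permutation and rate/capacity sizes, nothing like Keccak or RadioGatún) showing the absorbing and squeezing phases:

```python
RATE, CAPACITY = 8, 8            # toy sizes, in bits
MASK = (1 << (RATE + CAPACITY)) - 1

def perm(s):
    # Toy permutation of the 16-bit state (illustration only): odd
    # multiplication, xorshift and constant addition are each bijective.
    for _ in range(3):
        s = (s * 0x2545) & MASK
        s ^= s >> 7
        s = (s + 0x1F1F) & MASK
    return s

def sponge(blocks, out_len):
    s = 0
    for b in blocks:                     # absorbing phase
        s ^= b & ((1 << RATE) - 1)       # XOR block into the outer (rate) part
        s = perm(s)
    out = []
    for _ in range(out_len):             # squeezing phase
        out.append(s & ((1 << RATE) - 1))
        s = perm(s)
    return out
```

The inner (capacity) part is never directly affected by input or exposed in output, which is exactly why inner collisions are the only way to distinguish a random sponge from a random oracle.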
RFID Security and Privacy
Abstract:
In this talk we provide formal definitions for RFID protocols and their security and privacy properties. We identify a hierarchy of privacy levels: the strongest one is impossible to achieve, the second strongest requires public-key cryptography, and other privacy levels can be achieved based on PRFs only.
Some material was already presented at ASIACRYPT'07.
rfid-esc08-prt1.pdf
Rump session talk #1: SAGE: open source mathematical software (also) for cryptanalysts
Rump session talk #2: iPhone Crypto
Cache Timing Analysis of eSTREAM Finalists
Abstract: Cache timing attacks have been primarily discussed in connection with the Advanced Encryption Standard (AES), where they are applicable in a very straightforward way. However, the underlying techniques can be applied to other cryptographic building blocks too, as becomes obvious when considering, e.g., the AES-based stream cipher LEX.
In this talk, we will briefly review cache timing attacks and discuss their significance. We will then present some findings from our analysis of the eSTREAM finalist stream ciphers. While these findings do not seem to endanger the practical security of the ciphers considered, they illustrate some design techniques that help prevent cache timing attacks. In addition, they may give rise to some deeper questions about what is usually considered a success in the more standard areas of cryptanalysis.
2008_Echternach_Talk_Erik_Zenner.pdf