2005, Lecture Notes in Computer Science
Amortization schemes for authenticating streamed data have been introduced as a solution to reduce the high overhead that sign-each schemes suffer from. The hash-chain structure of an amortization scheme and the number of hash values appended to other packets determine the efficiency of the authentication scheme, especially against packet loss. The questions of which packets should have their hashes appended to the signature packet, and how many hashes to append to it, have had no solutions yet. This paper introduces a new hash chain construction that achieves longer resistance against packet loss and reduces the overhead. The proposed scheme consists of multiple connected chains, each linking several packets together. Our scheme specifies clearly how to choose the packets that should have hashes appended to a signature packet, in addition to deriving their loss probability. We study the effect of the number of hashes appended to a signature packet on the overhead, and introduce a measure of the number of packets receivers need to buffer before they can authenticate the received packets. The number of chains in our model plays a main role in the efficiency of our scheme in terms of loss resistance and overhead.
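The core idea shared by these abstracts, amortizing one signature over many packets via hashes, can be illustrated with a minimal single-chain sketch. This is not the paper's multiple-connected-chains construction; it is the simplest degenerate case (one chain, no loss tolerance), with all function names being illustrative assumptions:

```python
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def build_chain(payloads):
    """Walk backwards so each packet carries the hash of its successor.
    Returns the augmented packets [(payload, next_hash)] and the head
    digest that a single signature would cover."""
    chain, nxt = [], b""
    for p in reversed(payloads):
        chain.append((p, nxt))
        nxt = H(p + nxt)
    chain.reverse()
    return chain, nxt

def verify_chain(chain, head_digest):
    """One signature check on head_digest authenticates every packet:
    each packet's hash must match the value carried by its predecessor."""
    expected = head_digest
    for payload, nxt in chain:
        if H(payload + nxt) != expected:
            return False
        expected = nxt
    return True
```

A lost packet here breaks the chain for everything after it, which is exactly the weakness that multi-chain constructions like the one above are designed to mitigate.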
2010
Signature amortization schemes have been introduced for authenticating multicast streams, in which a single signature is amortized over several packets. The hash value of each packet is computed, and some hash values are appended to other packets, forming what is known as a hash chain. These schemes divide the stream into blocks, each block being a number of packets; the signature packet in these schemes is either the first or the last packet of the block. Amortization schemes are efficient solutions in terms of computation and communication overhead, especially in real-time environments. The main factor in the effectiveness of an amortization scheme is its hash chain construction. Some studies show that signing the first packet of each block reduces the receiver's delay and prevents DoS attacks; other studies show that signing the last packet reduces the sender's delay. To our knowledge, there are no studies that show which is better, to sign the first or the last packet, in terms of authentication probab...
2002
We describe a novel method for authenticating multicast packets that is robust against packet loss. Our main focus is to minimize the size of the communication overhead required to authenticate the packets. Our approach is to encode the hash values and the signatures with Rabin's Information Dispersal Algorithm (IDA) to construct an authentication scheme that amortizes a single signature operation over multiple packets.
International Journal of Communication Systems
Asymmetric cryptography has been widely used to generate digital signatures for message authentication. However, such a strategy cannot be used for per-packet authentication: neither the source nor the receiver would be capable of handling the computational cost of asymmetric cryptography. For unicast communication, the solution adopted is based on symmetric cryptography, but solutions based on symmetric cryptography do not scale to multicast communication. Several solutions have been reported to authenticate multicast streams in the presence of packet losses. Proposed solutions are based on the concept of signature amortization, where a single signature is amortized over several packets. In this paper we present a new mechanism for multicast data-source authentication based on signature amortization. The proposed multi-layer connected chains mechanism divides the packet stream into a multi-layer structure, where each layer is a two-dimensional matrix. The hash of a packet is included in a forward chain of packets within the same layer as well as in a downward chain of packets across multiple layers. The values of the key parameters that influence the mechanism's efficiency and performance are selected following a mathematical analysis. Comparisons of performance results with the well-known efficient multi-chained stream signature scheme as well as the recently reported multiple connected chains model show that the proposed mechanism achieves stronger resistance to packet losses with low overhead and high authentication probability.
Encyclopedia of Cryptography and Security, 2011
Advanced Modeling and Optimization, 2005
We present a new graph-based amortization scheme for multicast stream authentication that achieves stronger resistance against packet loss and reduces the overhead at the same time. The hash chains of the existing amortization schemes have no systematic way to ...
2012
This paper presents the adoption of a new hash algorithm in digital signatures. A digital signature is a technique for endorsing the content of a message, confirming that the message has not been altered throughout the communication process; this increases the receiver's confidence that the message is unchanged. If the message is digitally signed, any change to the message will invalidate the signature. A comparison of digital signatures based on the Rivest, Shamir and Adleman (RSA) algorithm is summarized.
2009
An identity based signature scheme allows any pair of users to communicate securely and to verify each other's signatures without exchanging public key certificates. An aggregate signature scheme is a digital signature scheme which supports aggregation of signatures. Batch verification is a method to verify multiple signatures at once. Aggregate signatures are useful in reducing both communication and computation cost. In this paper, we describe the breaks possible in some of the aggregate signature schemes and batch verification schemes.
Asymmetric key cryptography is widely used in broadcasting for authentication, but it is considered too expensive for wireless sensor networks. The proposed system is a novel broadcast authentication scheme based on PKC with signature amortization. This scheme uses a single signature to authenticate a group of broadcast messages; as a result, the overhead is spread over that group of messages. Moreover, the scheme provides high security with low overhead. However, signature verification in ECDSA is slower than signature generation, so broadcast authentication with ECDSA also suffers from large energy consumption and lengthy verification delay. To reduce these costs, the system uses cooperation among sensor nodes, which helps to accelerate signature verification: during verification, sensor nodes with high energy are allowed to pass the intermediate results of the signature verification process to their neighbors. Simulation results show that the overhead of message authentication and the delay of verification of authenticated messages are reduced significantly.
Potentials, …, 2006
A design approach to create small-sized, high-speed implementations of the keyed-hash message authentication code (HMAC) is the focus of this article. The goal of this approach is to increase the HMAC throughput to a level that can be used in modern telecommunication applications such as virtual private networks (VPNs) and the oncoming 802.11n. We focus on increasing the maximum operating frequency, which, compared to commercially available IP cores, improves by 30% to 390%. The proposed implementation doesn't introduce a significant area penalty. More specifically, the overall increase in lookup tables required by our implementation is less than 10% compared to that of other implementations.
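The HMAC construction the article accelerates in hardware is specified in RFC 2104 and is available directly in Python's standard library; the sketch below shows the software baseline against which such hardware cores are measured (function names `make_tag`/`check_tag` are illustrative):

```python
import hmac
import hashlib

def make_tag(key: bytes, msg: bytes) -> bytes:
    # Keyed-hash MAC per RFC 2104 / FIPS 198-1, instantiated with SHA-256.
    return hmac.new(key, msg, hashlib.sha256).digest()

def check_tag(key: bytes, msg: bytes, tag: bytes) -> bool:
    # compare_digest performs a constant-time comparison,
    # preventing timing side-channel attacks on the tag check.
    return hmac.compare_digest(make_tag(key, msg), tag)
```

Unlike the digital signatures discussed elsewhere in this list, HMAC is symmetric: anyone who can verify a tag can also forge one, which is why it scales to unicast but not to multicast source authentication.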
Proceedings of the 18th International Conference on Security and Cryptography
We propose a new digital signature scheme based on combining cryptographic timestamping with an endorsement scheme, both of which can be constructed from one-way and collision-resistant hash functions. The signature scheme is efficient and allows balancing of key generation and signing time for signature size and verification time. The security analysis is based on a realistic model of timestamping. As part of our construction, we introduce the novel concept of endorsements, which may be of independent interest.
Lecture Notes in Computer Science, 2007
We study the multicast stream authentication problem when the communication channel is under the control of an opponent who can drop, reorder and inject data packets. In this work, we consider that the stream to be authenticated is divided into blocks of n packets and we assume that the sender can memorize λ such blocks. Two important parameters for stream authentication protocols are packet overhead and computing efficiency. Our construction exhibits the following advantages. First, our packet overhead is a few hashes long. Second, the number of signature verifications per family of λ blocks is O(1) as a function of both λ and n. Third, hash chains enable the receiver to check the validity of received elements upon reception; as a consequence, he only buffers those consistent with the original data packets. Fourth, the receiver is able to recover all the data packets emitted by the sender despite erasures and injections, by running the decoding algorithm of the maximum distance separable (MDS) code on the elements which have passed the previous filtering process. In broadcasting, the sequence of information sent into the network is called a stream. Jakimoski [Jak06]. Thus, constructions for multicast distribution rely on digital signatures to provide non-repudiation. Nevertheless, signing each data packet is not a practical solution, as digital signatures are generally too expensive to generate and/or verify. In addition, bandwidth limitations prevent one-time and k-time signatures [GR97, Roh99] from being used due to their size. That is why a general approach consists of generating a single signature and amortizing its computational cost and overhead over several data packets, using hash functions for instance. In order to deal with erasures, Perrig et al. [PCTS00, PT03], Challal et al. [CBB05], Golle and Modadugu [GM01]
… Security. Advanced Techniques for Network and …, 2003
We present a stream authentication framework featuring preemptive one-time signatures and reactive hash-graphs, thereby enabling simultaneous realisation of near-online performance and packet-loss tolerance. Stream authentication is executed on packet ...
We consider the problem of source and content authentication in video streaming applications in multicast satellite networks. In particular we propose a novel method which combines signature amortization by means of a hash chain with watermarking techniques. This approach does not introduce bandwidth overhead, and computational overhead is reduced using signature amortization by means of a hash chain. We present simulation results about authentication robustness to transmission over a lossy channel.
Several solutions have been introduced to authenticate streamed data delivered in real-time over insecure networks, where there is no guarantee that every packet will be delivered. Some solutions resist any type of packet loss; others resist burst loss. Amortization schemes reduce the overhead caused by other schemes, but suffer from several weak points: where to place the signature packet (that is, after how many packets to send the signature), how many hashes to append to each packet, and the absence of a clear chain-structure analysis showing the effect on efficiency in terms of authentication probability, loss resistance and overhead. In this paper we introduce a new chain construction for real-time multicast stream authentication using signature amortization, giving solutions for these shortcomings. We also introduce a theoretical analysis of the chain construction to show its effect on authentication efficiency. The proposed scheme consists of several odd-even chains, where the odd chains link some of the odd-numbered packets and the even chains link some of the even-numbered ones. The scheme achieves better performance in terms of loss resistance and low overhead by changing the number of chains: increasing the number of chains yields lower overhead and longer packet-loss resistance. The sender's buffer capacity is taken into consideration when choosing the number of chains. We also introduce equations to quantify requirements such as the buffer size and delay at the receiver.
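The odd-even idea above can be sketched with a toy two-chain variant: each packet carries the hash of the next packet of the same parity, so a single lost or corrupted packet breaks only its own chain. This is a simplified illustration under my own assumptions, not the paper's construction, which links only some packets of each parity and derives its parameters analytically:

```python
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def build_odd_even(payloads):
    """Augment each packet with the hash of the next same-parity packet.
    Returns the augmented packets and the two chain-head digests, which
    the scheme would place together in a single signed signature packet."""
    n = len(payloads)
    aug, dig = [None] * n, [b""] * n
    for i in range(n - 1, -1, -1):
        nxt = dig[i + 2] if i + 2 < n else b""
        aug[i] = (payloads[i], nxt)
        dig[i] = H(payloads[i] + nxt)
    return aug, (dig[0], dig[1] if n > 1 else b"")

def verify_parity_chain(aug, start, expected):
    """Verify one parity chain independently; damage to the other
    parity chain does not affect this one."""
    i = start
    while i < len(aug):
        payload, nxt = aug[i]
        if H(payload + nxt) != expected:
            return False
        expected, i = nxt, i + 2
    return True
```

With more than two chains the same pattern extends: packet i points to packet i + c for c chains, lengthening the burst of consecutive losses the stream can survive.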
ACM SIGOPS Operating Systems Review, 1998
We present a related family of authentication and digital signature protocols based on symmetric cryptographic primitives which perform substantially better than previous constructions. Previously, one-time digital signatures based on hash functions involved hundreds of hash function computations for each signature; we show that given online access to a timestamping service, we can sign messages using only two computations of a hash function. Previously, techniques to sign infinite streams involved one such one-time ...
2001
The security of ordinary digital signature schemes relies on a computational assumption. Fail-stop signature (FSS) schemes provide security for a signer against a forger with unlimited computational power by enabling the signer to provide a proof of forgery, if it occurs. Signing long messages using FSS requires a hash function with provable security which results in slow signature generation. In this paper we propose a new construction for FSS schemes based on linear authentication codes which does not require a hash function, and results in a much faster signature generation at the cost of slower verification, and a longer secret key and signature. An important advantage of the scheme is that the proof of forgery is the same as a traditional FSS and does not rely on the properties of the hash function. The scheme can be used in a distributed setting where signature generation requires collaboration of k signers. The paper concludes with some open problems.
Computer Networks, 2010
Network coding based applications are vulnerable to possible malicious pollution attacks. Signature schemes have been well recognized as the most effective approach to address this security issue. However, existing homomorphic signature schemes for network coding either incur high transmission/computation overhead, or are vulnerable to random forgery attacks. In this paper, we propose a novel dynamic-identity based signature scheme for network coding by signing linear vector subspaces. The scheme can rapidly detect and drop the packets that are generated from pollution attacks, and efficiently thwart random forgery attacks. By employing fast packet-based and generation-based batch verification approaches, a forwarding node can verify multiple received packets synchronously with dramatically reduced total verification cost. In addition, the proposed scheme provides one-way identity authentication without requiring any extra secure channels or separate certificates, so that the transmission cost can be significantly reduced. Simulation results demonstrate the practicality and efficiency of the proposed schemes.
eprint.iacr.org
Aggregation schemes allow combining several cryptographic values, like message authentication codes or signatures, into a shorter value such that, despite compression, some notion of unforgeability is preserved. Recently, Eikemeier et al. (SCN 2010) considered the notion of history-free sequential aggregation for message authentication codes, where the sequentially executed aggregation algorithm does not need to receive the previous messages in the sequence as input. Here we discuss the idea for signatures, where the new aggregate does not rely on the previous messages and public keys either, thus avoiding the costly verifications in each aggregation step found in previous schemes by Lysyanskaya et al. (Eurocrypt 2004) and Neven (Eurocrypt 2008). Analogously to MACs, we argue about new security definitions for such schemes and compare them to previous notions for history-dependent schemes. We finally give a construction based on the BLS signature scheme which satisfies our notion.
2005
The views and conclusions contained here are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either express or implied, of ARO, Bosch, Carnegie Mellon University, Intel, or the U.S. Government or any of its agencies. Keywords: One-way hash chains, efficient constructions, broadcast authentication. One-way chains are an important cryptographic primitive in many security applications. Lamport first proposed to use one-way chains for one-time password authentication [19]. Subsequently, researchers proposed one-way chains as a basic building block for digital cash, for extending the lifetime of digital certificates, for constructing one-time signatures, for packet authentication, etc. As one-way chains are very efficient to verify, they recently became increasingly popular for designing security protocols for resource-constrained mobile devices and sensor networks, as their low-powered processors can compute a one-way...
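Lamport's one-way chain, the primitive this abstract surveys, is straightforward to sketch: the prover commits to the end of a hash chain and later releases earlier values, which anyone can verify by re-hashing forward to the commitment. A minimal sketch, with illustrative names:

```python
import hashlib

H = lambda b: hashlib.sha256(b).digest()

def generate_chain(seed: bytes, n: int):
    """v_0 = seed, v_i = H(v_{i-1}). The prover publishes v_n as the
    commitment and discloses values in reverse order (v_{n-1}, v_{n-2}, ...)."""
    vals = [seed]
    for _ in range(n):
        vals.append(H(vals[-1]))
    return vals

def verify_release(value: bytes, commitment: bytes, max_steps: int) -> bool:
    """Accept if hashing `value` at most max_steps times reaches the
    commitment; verification is just repeated hashing, hence cheap enough
    for low-powered sensor and mobile devices."""
    v = value
    for _ in range(max_steps):
        if v == commitment:
            return True
        v = H(v)
    return v == commitment
```

The asymmetry is the point: producing a preimage earlier in the chain requires inverting the hash, while checking one needs only a few forward hashes.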
ACM Transactions on Information and System Security, 2011
We present a novel video stream authentication scheme which combines signature amortization by means of hash chains with an advanced watermarking technique. We propose a new hash chain construction, the Duplex Hash Chain, which allows us to achieve bit-by-bit authentication that is robust to low bit error rates. This construction is well suited for wireless broadcast communications characterized by low packet losses, such as satellite networks. Moreover, neither hardware upgrades nor specific end-user equipment are needed to enjoy the authentication services. The computation overhead experienced at the receiver amounts to only two hashes per block of pictures and one digital signature verification for the whole received stream. This overhead introduces a provably negligible decrease in video quality. A thorough analysis of the proposed solution is provided in conjunction with extensive simulations.