Publications

Results 26–28 of 28

The role of decimated sequences in scaling encryption speeds through parallelism

Pierson, Lyndon G.

Encryption performance, measured in bits encrypted per second, has not scaled well as network performance has increased. The authors argue that multiple encryption modules operating in parallel are the cornerstone of scalable encryption. One major problem with parallelizing encryption is ensuring that each encryption module receives the proper portion of the key sequence at the correct point in the encryption or decryption of the message. Many encryption schemes use linear recurring sequences, which may be generated by a linear feedback shift register. Instead of using a linear feedback shift register, the authors describe a method to generate the linear recurring sequence using parallel decimated sequences, one per encryption module. Because computing decimated sequences can be time consuming, the authors also describe a way to compute these sequences with logic gates rather than arithmetic operations.
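The decimation idea can be sketched in a few lines of Python (an illustration of the concept, not the authors' gate-level construction): an LFSR keystream s(0), s(1), s(2), … is split into d decimated subsequences s(i), s(i+d), s(i+2d), …, one per parallel module, and interleaving the d subsequences reproduces the original keystream. The register length, feedback taps, and module count below are arbitrary choices for illustration.

```python
def lfsr_bits(taps, state, n):
    """Generate n bits from a Fibonacci LFSR.
    taps  -- 0-based positions of stages XORed to form the feedback bit
    state -- list of 0/1 bits, initial fill (must not be all zero)
    """
    state = list(state)
    out = []
    for _ in range(n):
        out.append(state[-1])        # output the oldest stage
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = [fb] + state[:-1]    # shift right, insert feedback on the left
    return out

# These taps realize x^4 + x^3 + 1, a primitive polynomial,
# so the keystream period is 2^4 - 1 = 15.
stream = lfsr_bits(taps=[0, 3], state=[1, 0, 0, 0], n=30)

d = 3  # number of parallel encryption modules (illustrative)
decimated = [stream[i::d] for i in range(d)]  # one subsequence per module

# Interleaving the d decimated subsequences recovers the original keystream,
# so each module can consume only its own decimation.
recombined = [decimated[i % d][i // d] for i in range(len(stream))]
assert recombined == stream
```

Each module works through its decimated subsequence independently, which is what removes the serial bottleneck the abstract describes.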

Scalable ATM encryption

Pierson, Lyndon G.

To provide the needed security assurances for traffic carried in Asynchronous Transfer Mode (ATM) networks, methods of protecting the integrity and privacy of traffic must be employed. Cryptographic methods can assure authenticity and privacy, but they are hard to scale, and incorporating them into computer networks can severely impact functionality, reliability, and performance. To study these trade-offs, a research prototype encryptor/decryptor is under development. The prototype is intended to demonstrate the viability of implementing certain encryption techniques in high-speed networks by processing ATM cells carried in a SONET OC-3 payload. This paper describes the objectives and design trade-offs to be investigated with the prototype. User requirements for high-performance computing and communication have driven Sandia to work on the functionality, reliability, security, and performance of high-speed communication networks. Adherence to standards (including emerging standards) achieves greater functionality of high-speed computer networks by providing wide interoperability of applications, network hardware, and network software.
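The shape of cell-level encryption can be sketched as follows (an illustration only, not the Sandia prototype's design): an ATM cell is a 5-byte header plus a 48-byte payload, and a cell encryptor must leave the header in the clear, since switches need the VPI/VCI fields to route the cell, while transforming only the payload. The keystream here is a stand-in; a real encryptor would use a proper cipher.

```python
import itertools

HEADER_LEN, PAYLOAD_LEN = 5, 48
CELL_LEN = HEADER_LEN + PAYLOAD_LEN  # an ATM cell is 53 bytes

def encrypt_cell(cell: bytes, keystream) -> bytes:
    """XOR the 48-byte payload with keystream bytes; pass the header through."""
    assert len(cell) == CELL_LEN
    header, payload = cell[:HEADER_LEN], cell[HEADER_LEN:]
    pad = bytes(itertools.islice(keystream, PAYLOAD_LEN))
    return header + bytes(p ^ k for p, k in zip(payload, pad))

cell = bytes(range(CELL_LEN))
ciphered = encrypt_cell(cell, itertools.cycle(b"demo-key"))
```

Because XOR is symmetric, applying `encrypt_cell` again with an identical keystream recovers the original cell; keeping the two keystreams synchronized per cell is exactly the kind of issue such a prototype must resolve.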

Tuning computer communications networks and protocols

Pierson, Lyndon G.

Current computer network protocols are very robust and capable of being used in a variety of different environments. Typically, implementations of these protocols come to the user with preset parameters that provide reasonable performance in low delay-bandwidth-product environments with low error rates, but these defaults do not necessarily provide optimal performance in high delay-bandwidth-product, high-error-rate environments. To provide optimal performance from the user's perspective, which is application to application, all equivalent layers of the protocol must be tuned. The key to tuning protocols is reducing idle time on the links caused by various protocol layers waiting for acknowledgments. The circuit bandwidth, propagation delay, error rate, number of outstanding packets, buffer length, number of buffers, and buffer size can all affect the observed idle time. Experiments have been conducted on test-bed systems and on live satellite and terrestrial circuits; observations from these experiments led the authors to draw conclusions about the locations of common bottlenecks. This paper examines various aspects of network tuning and specific issues in tuning three protocols (DECnet, TCP/IP, NETEX) over various media types (point-to-point and broadcast) under several different conditions (terrestrial and satellite), and describes the lessons learned about protocol and network tuning.
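The tuning rule behind the abstract can be sketched numerically (the figures below are illustrative, not drawn from the paper): to keep a link from going idle while waiting for acknowledgments, the sender must keep at least one bandwidth-delay product of data outstanding.

```python
import math

def bdp_bytes(bandwidth_bps: float, rtt_s: float) -> float:
    """Bandwidth-delay product: bytes that must be in flight to fill the pipe."""
    return bandwidth_bps * rtt_s / 8

def min_outstanding_packets(bandwidth_bps: float, rtt_s: float,
                            packet_bytes: int) -> int:
    """Smallest window, in packets, that avoids acknowledgment stalls."""
    return math.ceil(bdp_bytes(bandwidth_bps, rtt_s) / packet_bytes)

# A geostationary satellite hop has roughly a 500 ms round-trip time.
# At T1 rate (1.544 Mbit/s) with 512-byte packets:
print(min_outstanding_packets(1.544e6, 0.5, 512))  # -> 189
```

A default window of a few packets, reasonable on a terrestrial LAN, leaves such a satellite link idle most of the time, which is the mismatch between preset parameters and high delay-bandwidth-product circuits that the paper investigates.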
