Network Working Group                                            J. Dunn
Request for Comments: 3116                                     C. Martin
Category: Informational                                        ANC, Inc.
                                                               June 2001


                    Methodology for ATM Benchmarking

Status of this Memo

   This memo provides information for the Internet community.  It does
   not specify an Internet standard of any kind.  Distribution of this
   memo is unlimited.

Copyright Notice

   Copyright (C) The Internet Society (2001).  All Rights Reserved.

Abstract

   This document discusses and defines a number of tests that may be
   used to describe the performance characteristics of ATM (Asynchronous
   Transfer Mode) based switching devices.  In addition to defining the
   tests, this document also describes specific formats for reporting
   the results of the tests.

   This memo is a product of the Benchmarking Methodology Working Group
   (BMWG) of the Internet Engineering Task Force (IETF).

Table of Contents

   1. Introduction . . . . . . . . . . . . . . . . . . . . . . . . .  4
   2. Background . . . . . . . . . . . . . . . . . . . . . . . . . .  5
   2.1. Test Device Requirements . . . . . . . . . . . . . . . . . .  5
   2.2. Systems Under Test (SUTs). . . . . . . . . . . . . . . . . .  5
   2.3. Test Result Evaluation . . . . . . . . . . . . . . . . . . .  5
   2.4. Requirements . . . . . . . . . . . . . . . . . . . . . . . .  5
   2.5. Test Configurations for SONET. . . . . . . . . . . . . . . .  6
   2.6. SUT Configuration. . . . . . . . . . . . . . . . . . . . . .  7
   2.7. Frame Formats. . . . . . . . . . . . . . . . . . . . . . . .  8
   2.8. Frame Sizes. . . . . . . . . . . . . . . . . . . . . . . . .  8
   2.9. Verifying Received IP PDUs . . . . . . . . . . . . . . . . .  9
   2.10. Modifiers . . . . . . . . . . . . . . . . . . . . . . . . .  9
   2.10.1. Management IP PDUs. . . . . . . . . . . . . . . . . . . .  9
   2.10.2. Routing Update IP PDUs. . . . . . . . . . . . . . . . . . 10
   2.11. Filters . . . . . . . . . . . . . . . . . . . . . . . . . . 10
   2.11.1. Filter Addresses. . . . . . . . . . . . . . . . . . . . . 11
   2.12. Protocol Addresses. . . . . . . . . . . . . . . . . . . . . 12



Dunn & Martin                Informational                      [Page 1]


RFC 3116            Methodology for ATM Benchmarking           June 2001


   2.13. Route Set Up. . . . . . . . . . . . . . . . . . . . . . . . 12
   2.14. Bidirectional Traffic . . . . . . . . . . . . . . . . . . . 12
   2.15. Single Stream Path. . . . . . . . . . . . . . . . . . . . . 12
   2.16. Multi-port. . . . . . . . . . . . . . . . . . . . . . . . . 13
   2.17. Multiple Protocols. . . . . . . . . . . . . . . . . . . . . 14
   2.18. Multiple IP PDU Sizes . . . . . . . . . . . . . . . . . . . 14
   2.19. Testing Beyond a Single SUT . . . . . . . . . . . . . . . . 14
   2.20. Maximum IP PDU Rate . . . . . . . . . . . . . . . . . . . . 15
   2.21. Bursty Traffic. . . . . . . . . . . . . . . . . . . . . . . 15
   2.22. Trial Description . . . . . . . . . . . . . . . . . . . . . 16
   2.23. Trial Duration. . . . . . . . . . . . . . . . . . . . . . . 16
   2.24. Address Resolution. . . . . . . . . . . . . . . . . . . . . 16
   2.25. Synchronized Payload Bit Pattern. . . . . . . . . . . . . . 16
   2.26. Burst Traffic Descriptors . . . . . . . . . . . . . . . . . 17
   3. Performance Metrics. . . . . . . . . . . . . . . . . . . . . . 17
   3.1. Physical Layer-SONET . . . . . . . . . . . . . . . . . . . . 17
   3.1.1. Pointer Movements. . . . . . . . . . . . . . . . . . . . . 17
   3.1.1.1. Pointer Movement Propagation . . . . . . . . . . . . . . 17
   3.1.1.2. Cell Loss due to Pointer Movement. . . . . . . . . . . . 19
   3.1.1.3. IP Packet Loss due to Pointer Movement . . . . . . . . . 20
   3.1.2. Transport Overhead (TOH) Error Count . . . . . . . . . . . 21
   3.1.2.1. TOH Error Propagation. . . . . . . . . . . . . . . . . . 21
   3.1.2.2. Cell Loss due to TOH Error . . . . . . . . . . . . . . . 22
   3.1.2.3. IP Packet Loss due to TOH Error. . . . . . . . . . . . . 23
   3.1.3. Path Overhead (POH) Error Count. . . . . . . . . . . . . . 24
   3.1.3.1. POH Error Propagation. . . . . . . . . . . . . . . . . . 24
   3.1.3.2. Cell Loss due to POH Error . . . . . . . . . . . . . . . 25
   3.1.3.3. IP Packet Loss due to POH Error. . . . . . . . . . . . . 26
   3.2. ATM Layer. . . . . . . . . . . . . . . . . . . . . . . . . . 27
   3.2.1. Two-Point Cell Delay Variation (CDV) . . . . . . . . . . . 27
   3.2.1.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 27
   3.2.1.2. Two-point CDV/Steady Load/One VCC. . . . . . . . . . . . 27
   3.2.1.3. Two-point CDV/Steady Load/Twelve VCCs. . . . . . . . . . 28
   3.2.1.4. Two-point CDV/Steady Load/Maximum VCCs . . . . . . . . . 30
   3.2.1.5. Two-point CDV/Bursty VBR Load/One VCC. . . . . . . . . . 31
   3.2.1.6. Two-point CDV/Bursty VBR Load/Twelve VCCs. . . . . . . . 32
   3.2.1.7. Two-point CDV/Bursty VBR Load/Maximum VCCs . . . . . . . 34
   3.2.1.8. Two-point CDV/Mixed Load/Three VCCs. . . . . . . . . . . 35
   3.2.1.9. Two-point CDV/Mixed Load/Twelve VCCs . . . . . . . . . . 36
   3.2.1.10. Two-point CDV/Mixed Load/Maximum VCCs . . . . . . . . . 38
   3.2.2. Cell Error Ratio (CER) . . . . . . . . . . . . . . . . . . 39
   3.2.2.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 39
   3.2.2.2. CER/Steady Load/One VCC. . . . . . . . . . . . . . . . . 40
   3.2.2.3. CER/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 41
   3.2.2.4. CER/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 42
   3.2.2.5. CER/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 43
   3.2.2.6. CER/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 44
   3.2.2.7. CER/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 46




   3.2.3. Cell Loss Ratio (CLR). . . . . . . . . . . . . . . . . . . 47
   3.2.3.1. CLR/Steady Load/One VCC. . . . . . . . . . . . . . . . . 47
   3.2.3.2. CLR/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 48
   3.2.3.3. CLR/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 49
   3.2.3.4. CLR/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 51
   3.2.3.5. CLR/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 52
   3.2.3.6. CLR/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 53
   3.2.4. Cell Misinsertion Rate (CMR) . . . . . . . . . . . . . . . 54
   3.2.4.1. CMR/Steady Load/One VCC. . . . . . . . . . . . . . . . . 54
   3.2.4.2. CMR/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 55
   3.2.4.3. CMR/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 57
   3.2.4.4. CMR/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 58
   3.2.4.5. CMR/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 59
   3.2.4.6. CMR/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 60
   3.2.5. CRC Error Ratio (CRC-ER) . . . . . . . . . . . . . . . . . 62
   3.2.5.1. CRC-ER/Steady Load/One VCC . . . . . . . . . . . . . . . 62
   3.2.5.2. CRC-ER/Steady Load/Twelve VCCs . . . . . . . . . . . . . 63
   3.2.5.3. CRC-ER/Steady Load/Maximum VCCs. . . . . . . . . . . . . 64
   3.2.5.4. CRC-ER/Bursty VBR Load/One VCC . . . . . . . . . . . . . 65
   3.2.5.5. CRC-ER/Bursty VBR Load/Twelve VCCs . . . . . . . . . . . 66
   3.2.5.6. CRC-ER/Bursty VBR Load/Maximum VCCs. . . . . . . . . . . 68
   3.2.5.7. CRC-ER/Bursty UBR Load/One VCC . . . . . . . . . . . . . 69
   3.2.5.8. CRC-ER/Bursty UBR Load/Twelve VCCs . . . . . . . . . . . 70
   3.2.5.9. CRC-ER/Bursty UBR Load/Maximum VCCs. . . . . . . . . . . 71
   3.2.5.10. CRC-ER/Bursty Mixed Load/Three VCCs . . . . . . . . . . 73
   3.2.5.11. CRC-ER/Bursty Mixed Load/Twelve VCCs. . . . . . . . . . 74
   3.2.5.12. CRC-ER/Bursty Mixed Load/Maximum VCCs . . . . . . . . . 75
   3.2.6. Cell Transfer Delay (CTD). . . . . . . . . . . . . . . . . 76
   3.2.6.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 76
   3.2.6.2. CTD/Steady Load/One VCC. . . . . . . . . . . . . . . . . 77
   3.2.6.3. CTD/Steady Load/Twelve VCCs. . . . . . . . . . . . . . . 78
   3.2.6.4. CTD/Steady Load/Maximum VCCs . . . . . . . . . . . . . . 79
   3.2.6.5. CTD/Bursty VBR Load/One VCC. . . . . . . . . . . . . . . 81
   3.2.6.6. CTD/Bursty VBR Load/Twelve VCCs. . . . . . . . . . . . . 82
   3.2.6.7. CTD/Bursty VBR Load/Maximum VCCs . . . . . . . . . . . . 83
   3.2.6.8. CTD/Bursty UBR Load/One VCC. . . . . . . . . . . . . . . 85
   3.2.6.9. CTD/Bursty UBR Load/Twelve VCCs. . . . . . . . . . . . . 86
   3.2.6.10. CTD/Bursty UBR Load/Maximum VCCs. . . . . . . . . . . . 87
   3.2.6.11. CTD/Mixed Load/Three VCCs . . . . . . . . . . . . . . . 88
   3.2.6.12. CTD/Mixed Load/Twelve VCCs. . . . . . . . . . . . . . . 90
   3.2.6.13. CTD/Mixed Load/Maximum VCCs . . . . . . . . . . . . . . 91
   3.3. ATM Adaptation Layer (AAL) Type 5 (AAL5) . . . . . . . . . . 93
   3.3.1. IP Packet Loss due to AAL5 Re-assembly Errors. . . . . . . 93
   3.3.2. AAL5 Re-assembly Time. . . . . . . . . . . . . . . . . . . 94
   3.3.3. AAL5 CRC Error Ratio . . . . . . . . . . . . . . . . . . . 95
   3.3.3.1. Test Setup . . . . . . . . . . . . . . . . . . . . . . . 95
   3.3.3.2. AAL5-CRC-ER/Steady Load/One VCC. . . . . . . . . . . . . 95
   3.3.3.3. AAL5-CRC-ER/Steady Load/Twelve VCCs. . . . . . . . . . . 96




   3.3.3.4. AAL5-CRC-ER/Steady Load/Maximum VCCs . . . . . . . . . . 97
   3.3.3.5. AAL5-CRC-ER/Bursty VBR Load/One VCC. . . . . . . . . . . 99
   3.3.3.6. AAL5-CRC-ER/Bursty VBR Load/Twelve VCCs. . . . . . . . .100
   3.3.3.7. AAL5-CRC-ER/Bursty VBR Load/Maximum VCCs . . . . . . . .101
   3.3.3.8. AAL5-CRC-ER/Mixed Load/Three VCCs. . . . . . . . . . . .102
   3.3.3.9. AAL5-CRC-ER/Mixed Load/Twelve VCCs . . . . . . . . . . .104
   3.3.3.10. AAL5-CRC-ER/Mixed Load/Maximum VCCs . . . . . . . . . .105
   3.4. ATM Service: Signaling . . . . . . . . . . . . . . . . . . .106
   3.4.1. CAC Denial Time and Connection Establishment Time. . . . .106
   3.4.2. Connection Teardown Time . . . . . . . . . . . . . . . . .107
   3.4.3. Crankback Time . . . . . . . . . . . . . . . . . . . . . .108
   3.4.4. Route Update Response Time . . . . . . . . . . . . . . . .109
   3.5. ATM Service: ILMI. . . . . . . . . . . . . . . . . . . . . .110
   3.5.1. MIB Alignment Time . . . . . . . . . . . . . . . . . . . .110
   3.5.2. Address Registration Time. . . . . . . . . . . . . . . . .111
   4. Security Considerations  . . . . . . . . . . . . . . . . . . .112
   5. Notices. . . . . . . . . . . . . . . . . . . . . . . . . . . .112
   6. References . . . . . . . . . . . . . . . . . . . . . . . . . .113
   7. Authors' Addresses . . . . . . . . . . . . . . . . . . . . . .113
   APPENDIX A  . . . . . . . . . . . . . . . . . . . . . . . . . . .114
   APPENDIX B  . . . . . . . . . . . . . . . . . . . . . . . . . . .114
   APPENDIX C  . . . . . . . . . . . . . . . . . . . . . . . . . . .116
   Full Copyright Statement  . . . . . . . . . . . . . . . . . . . .127

1. Introduction

   This document defines a specific set of tests that vendors can use to
   measure and report the performance characteristics of ATM network
   devices.  The results of these tests will provide the user with
   comparable data from different vendors with which to evaluate these
   devices.
   The methods defined in this memo are based on RFC 2544 "Benchmarking
   Methodology for Network Interconnect Devices".

   The document "Terminology for ATM Benchmarking" (RFC 2761) defines
   many of the terms that are used in this document.  The terminology
   document should be consulted before attempting to make use of this
   document.

   The BMWG produces two major classes of documents: Benchmarking
   Terminology documents and Benchmarking Methodology documents.  The
   Terminology documents present the benchmarks and other related terms.
   The Methodology documents define the procedures required to collect
   the benchmarks cited in the corresponding Terminology documents.









2. Background

2.1. Test Device Requirements

   This document is based on the requirement that a test device is
   available.  The test device can either be off the shelf or can be
   easily built with current technologies.  The test device must have a
   transmitting and receiving port for the interface type under test.
   The test device must be configured to transmit test PDUs and to
   analyze received PDUs.  The test device should be able to transmit
   and analyze received data at the same time.

2.2. Systems Under Test (SUTs)

   Not every test described in this document applies to every SUT.
   Vendors should perform all of the tests that can be supported by a
   specific product type.  It will take some time to
   perform all of the recommended tests under all of the recommended
   conditions.

2.3. Test Result Evaluation

   Performing all of the tests in this document will result in a great
   deal of data.  The applicability of this data to the evaluation of a
   particular SUT will depend on its expected use and the configuration
   of the network in which it will be used.  For example, the time
   required by a switch to provide ILMI services will not be a pertinent
   measurement in a network that does not use the ILMI protocol, such as
   an ATM WAN.  Evaluating data relevant to a particular network
   installation may require considerable experience, which may not be
   readily available.  Finally, test selection and evaluation of test
   results must be done with an understanding of generally accepted
   testing practices regarding repeatability, variance and the
   statistical significance of a small number of trials.

2.4. Requirements

   In this document, the words that are used to define the significance
   of each particular requirement are capitalized.  These words are:

   *  "MUST" This word, or the words "REQUIRED" and "SHALL" mean that
      the item is an absolute requirement of the specification.

   *  "SHOULD" This word or the adjective "RECOMMENDED" means that there
      may exist valid reasons in particular circumstances to ignore this
      item, but the full implications should be understood and the case
      carefully weighed before choosing a different course.





   *  "MAY" This word or the adjective "OPTIONAL" means that this item
      is truly optional.  One vendor may choose to include the item
      because a particular marketplace requires it or because it
      enhances the product; another vendor may omit the same item.

   An implementation is not compliant if it fails to satisfy one or more
   of the MUST requirements for the protocols it implements.  An
   implementation that satisfies all the MUST and all the SHOULD
   requirements for its protocols is said to be "unconditionally
   compliant"; one that satisfies all the MUST requirements but not all
   the SHOULD requirements for its protocols is said to be
   "conditionally compliant".

2.5. Test Configurations for SONET

   The test device can be connected to the SUT in a variety of
   configurations depending on the test point.  The following
   configurations will be used for the tests described in this document.

   1) Uni-directional connection: The test device's transmit port
      (labeled Tx) is connected to the SUT's receive port (labeled Rx).
      The SUT's transmit port is connected to the test device's receive
      port (see Figure 1).  In this configuration, the test device can
      verify that all transmitted packets are acknowledged correctly.
      Note that this configuration does not verify internal system
      functions, but verifies one port on the SUT.

            +-------------+               +-------------+
            |           Tx|-------------->|Rx           |
            |    Test   Rx|<--------------|Tx   SUT     |
            |   Device    |               |             |
            +-------------+               +-------------+

                            Figure 1

   2) Bi-directional connection: The test device's first transmit port
      is connected to the SUT's first receive port.  The SUT's first
      transmit port is connected to the test device's first receive
      port.  The test device's second transmit port is connected to the
      SUT's second receive port.  The SUT's second transmit port is
      connected to the test device's second receive port (see Figure 2).
      In this configuration, the test device can determine if all of the
      transmitted packets were received and forwarded correctly.  Note
      that this configuration does verify internal system functions,
      since it verifies two ports on the SUT.






            +-------------+               +-------------+
            |     Test  Tx|-------------->|Rx           |
            |    Device Rx|<--------------|Tx   SUT     |
            |    Tx   Rx  |               |   Tx   Rx   |
            +-------------+               +-------------+
                  |   ^                        |    ^
                  |   |                        |    |
                  |   +------------------------+    |
                  |                                 |
                  |---------------------------------|

                             Figure 2

   3) Uni-directional passthrough connection: The test device's first
      transmit port is connected to the SUT1 receive port.  The SUT1
      transmit port is connected to the test device's first receive
      port.  The test device's second transmit port is connected to the
      SUT2 receive port.  The SUT2 transmit port is connected to the
      test device's second receive port (see Figure 3).  In this
      configuration, the test device can determine if all of the packets
      transmitted by SUT1 were correctly acknowledged by SUT2.  Note
      that this configuration does not verify internal system functions,
      but verifies one port on each SUT.

   +-------------+           +-------------+           +-------------+
   |           Tx|---------->|Rx         Tx|---------->|Rx           |
   |     SUT1  Rx|<----------|Tx   Test  Rx|<----------|Tx   SUT2    |
   |             |           |    Device   |           |             |
   +-------------+           +-------------+           +-------------+

                              Figure 3

2.6. SUT Configuration

   The SUT MUST be configured as described in the SUT user's guide.
   Specifically, it is expected that all of the supported protocols will
   be configured and enabled.  It is expected that all of the tests will
   be run without changing the configuration or setup of the SUT in any
   way other than that required to do the specific test.  For example,
   it is not acceptable to disable all but one transport protocol when
   testing the throughput of that protocol.  If PNNI or BISUP is used to
   initiate switched virtual connections (SVCs), the SUT configuration
   SHOULD include the normally recommended routing update intervals and
   keep-alive frequency.  The specific version of the software and the
   exact SUT configuration, including what functions are disabled and
   used during the tests MUST be included as part of the report of the
   results.





2.7. Frame Formats

   The formats of the test IP PDUs to use for TCP/IP and UDP/IP over ATM
   are shown in Appendix C: Test Frame Formats.  Note that these IP PDUs
   are in accordance with RFC 2225.  These exact IP PDU formats SHOULD
   be used in the tests described in this document for this
   protocol/media combination.  These IP PDUs will be used as a template
   for testing other protocol/media combinations.  The specific formats
   that are used to define the test IP PDUs for a particular test series
   MUST be included in the report of the results.

2.8. Frame Sizes

   All of the described tests SHOULD be performed using a number of IP
   PDU sizes.  Specifically, the sizes SHOULD include the maximum and
   minimum legitimate sizes for the protocol under test on the media
   under test and enough sizes in between to be able to get a full
   characterization of the SUT performance.  Except where noted, at
   least five IP PDU sizes SHOULD be tested for each test condition.

   Theoretically the minimum size UDP Echo request IP PDU would consist
   of an IP header (minimum length 20 octets), a UDP header (8 octets),
   an AAL5 trailer (8 octets) and an LLC/SNAP code point header (8
   octets); at 44 octets, the minimum size PDU therefore fits within the
   48-octet payload of one ATM cell.  The theoretical maximum IP PDU
   size is determined by the size of the length field in the IP header.
   In almost all cases the actual maximum and minimum sizes are
   determined by the limitations of the media.  In the case of ATM, the
   maximum IP PDU size SHOULD be the ATM MTU size, which is 9180 octets.
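   The cell count arithmetic above can be sketched as follows.  This is
   an illustrative helper, not part of the methodology; it assumes the
   RFC 2225 LLC/SNAP encapsulation and absorbs AAL5 padding by rounding
   up to whole 48-octet cell payloads.

```python
import math

LLC_SNAP = 8       # LLC/SNAP code point header (octets)
AAL5_TRAILER = 8   # AAL5 CPCS trailer (octets)
CELL_PAYLOAD = 48  # ATM cell information field (octets)

def cells_for_ip_pdu(ip_pdu_len):
    """Number of ATM cells occupied by an LLC/SNAP-encapsulated IP PDU.

    The AAL5 CPCS-PDU (header + IP PDU + trailer) is padded out to a
    whole number of 48-octet cell payloads.
    """
    total = LLC_SNAP + ip_pdu_len + AAL5_TRAILER
    return math.ceil(total / CELL_PAYLOAD)

# Minimum UDP echo request: 20-octet IP header + 8-octet UDP header.
# 8 + 28 + 8 = 44 octets, which fits in a single cell payload.
print(cells_for_ip_pdu(28))    # 1
print(cells_for_ip_pdu(9180))  # cells needed for an ATM-MTU-sized PDU
```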

   In theory it would be ideal to distribute the IP PDU sizes in a way
   that would evenly distribute the theoretical IP PDU rates.  These
   recommendations incorporate this theory but specify IP PDU sizes
   that are easy to understand and remember.  In addition, many of the
   same IP PDU sizes are specified on each of the media types to allow
   for easy performance comparisons.

   Note: The inclusion of an unrealistically small IP PDU size on some
   of the media types (i.e., with little or no space for data) is to
   help characterize the per-IP PDU processing overhead of the SUT.

   The IP PDU sizes that will be used are:

   44, 64, 128, 256, 1024, 1518, 2048, 4472, 9180

   The minimum size IP PDU for UDP on ATM is 44 octets; this minimum of
   44 is recommended to allow direct comparison to token ring
   performance.  The IP PDU size of 4472 is recommended instead of the
   theoretical FDDI maximum size of 4500 octets in order to permit the
   same type of comparison.  An IP-only (i.e., not UDP) PDU may be used
   in addition if a higher data rate is desired, in which case the
   minimum IP PDU size is 28 octets.

2.9. Verifying Received IP PDUs

   The test equipment SHOULD discard any IP PDUs received during a test
   run that are not actual forwarded test IP PDUs.  For example, keep-
   alive and routing update IP PDUs SHOULD NOT be included in the count
   of received IP PDUs.  In any case, the test equipment SHOULD verify
   the length of the received IP PDUs and check that they match the
   expected length.

   Preferably, the test equipment SHOULD include sequence numbers in the
   transmitted IP PDUs and check for these numbers on the received IP
   PDUs.  If this is done, the reported results SHOULD include, in
   addition to the number of IP PDUs dropped, the number of IP PDUs that
   were received out of order, the number of duplicate IP PDUs received
   and the number of gaps in the received IP PDU numbering sequence.
   This functionality is required for some of the described tests.
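   The sequence number bookkeeping described above can be sketched as
   follows.  The function name and the trace format are hypothetical;
   real test equipment would operate on decoded IP PDUs rather than
   bare integers.

```python
def analyze_sequence(received, transmitted_count):
    """Classify received test-PDU sequence numbers (a simplified sketch).

    Returns counts of dropped, duplicate and out-of-order PDUs, plus the
    number of gaps (maximal runs of missing sequence numbers).
    """
    seen = set()
    duplicates = 0
    out_of_order = 0
    highest = -1
    for seq in received:
        if seq in seen:
            duplicates += 1
            continue
        seen.add(seq)
        if seq < highest:        # arrived after a later-numbered PDU
            out_of_order += 1
        highest = max(highest, seq)
    dropped = transmitted_count - len(seen)
    # Count maximal runs of missing numbers in 0..transmitted_count-1.
    gaps = 0
    in_gap = False
    for seq in range(transmitted_count):
        missing = seq not in seen
        if missing and not in_gap:
            gaps += 1
        in_gap = missing
    return {"dropped": dropped, "duplicates": duplicates,
            "out_of_order": out_of_order, "gaps": gaps}

print(analyze_sequence([0, 1, 3, 2, 3, 6], transmitted_count=8))
# {'dropped': 3, 'duplicates': 1, 'out_of_order': 1, 'gaps': 2}
```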

2.10. Modifiers

   It is useful to characterize the SUT's performance under a number of
   conditions.  Some of these conditions are noted below.  The reported
   results SHOULD include as many of these conditions as the test
   equipment is able to generate.  The suite of tests SHOULD be run
   first without any modifying conditions, then repeated under each of
   the modifying conditions separately.  To preserve the ability to
   compare the results of these tests, any IP PDUs that are required to
   generate the modifying conditions (excluding management queries) will
   be included in the same data stream as that of the normal test IP
   PDUs and in place of one of the test IP PDUs.  They MUST NOT be
   supplied to the SUT on a separate network port.

2.10.1. Management IP PDUs

   Most ATM data networks now make use of ILMI, signaling and OAM.  In
   many environments, there can be a number of management stations
   sending queries to the same SUT at the same time.

   Management queries MUST be made in accordance with the applicable
   specification, e.g., ILMI sysUpTime getNext requests will be made in
   accordance with ILMI 4.0.  The response to the query MUST be verified
   by the test equipment.  Note that, for each management protocol in
   use, this requires that the test equipment implement the associated
   protocol state machine.  One example of the specific query IP PDU
   (ICMP) that should be used is shown in Appendix C.

2.10.2. Routing Update IP PDUs

   The processing of PNNI updates could have a significant impact on the
   ability of a switch to forward cells and complete calls.  If PNNI is
   configured on the SUT, one routing update MUST be transmitted before
   the first test IP PDU is transmitted during the trial.  The test
   SHOULD verify that the SUT has properly processed the routing update.

   PNNI routing update IP PDUs SHOULD be sent at the rate specified in
   Appendix B.  Appendix C defines one routing update PDU for the TCP/IP
   over ATM example.  The routing updates are designed to change the
   routing on a number of networks that are not involved in the
   forwarding of the test data.  The first IP PDU sets the routing table
   state to "A", the second one changes the state to "B".  The IP PDUs
   MUST be alternated during the trial.  The test SHOULD verify that the
   SUT has properly processed the routing update.
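   One way to organize a trial stream that satisfies these rules is
   sketched below.  The PDU payloads and the update interval are
   placeholders; the real formats and rates come from Appendices B and
   C.

```python
import itertools

def trial_stream(test_pdus, update_interval):
    """Interleave alternating A/B routing updates into the test stream.

    A routing update replaces a test PDU every `update_interval` PDUs,
    so updates share the data stream rather than adding load, and the
    first PDU of the trial is a routing update, as required above.
    """
    states = itertools.cycle(["routing-update-A", "routing-update-B"])
    out = [next(states)]              # update sent before first test PDU
    for i, pdu in enumerate(test_pdus, start=1):
        if i % update_interval == 0:
            out.append(next(states))  # takes the slot of one test PDU
        else:
            out.append(pdu)
    return out

print(trial_stream([f"test-{n}" for n in range(6)], update_interval=3))
```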

2.11. Filters

   Filters are added to switches to selectively inhibit the forwarding
   of cells that would normally be forwarded.  This is usually done to
   implement security controls on the data that is accepted between one
   area and another.  Different products have different capabilities to
   implement filters.  Filters are applicable only if the SUT supports
   the filtering feature.

   The SUT SHOULD first be configured to add one filter condition and
   the tests performed.  This filter SHOULD permit the forwarding of the
   test data stream.  This filter SHOULD be of the form described in the
   SUT user's guide.

   The SUT SHOULD then be reconfigured to implement a total of 25
   filters.  The first 24 of these filters SHOULD be based on 24
   separate ATM NSAP Network Prefix addresses.

   The 24 ATM NSAP Network Prefix addresses SHOULD NOT be any that are
   represented in the test data stream.  The last filter SHOULD permit
   the forwarding of the test data stream.  By "first" and "last" we
   mean to ensure that in the second case, 25 conditions must be checked
   before the IP over ATM data PDUs will match the conditions that
   permit the forwarding of the IP PDU.  Of course, if the SUT reorders
   the filters or does not use a linear scan of the filter rules, the
   effect of the sequence in which the filters are entered is lost.




   The exact filter configuration command lines used SHOULD be included
   with the report of the results.

2.11.1. Filter Addresses

   Two sets of filter addresses are required, one for the single filter
   case and one for the 25 filter case.

   The single filter case should permit traffic from ATM address [Switch
   Network Prefix] 00 00 00 00 00 01 00 to ATM address [Switch Network
   Prefix] 00 00 00 00 00 02 00 and deny all other traffic.  Note that
   the 13 octet Switch Network Prefix MUST be configured before this
   test can be run.

   The 25 filter case should use the following sequence.

         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 03 00
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 04 00
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 05 00
         ...
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 0C 00
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 0D 00
         allow [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 02 00
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 0E 00
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 0F 00
          ...
         deny [Switch Network Prefix] 00 00 00 00 00 01 00
              to [Switch Network Prefix] 00 00 00 00 00 18 00
         deny all else

   All previous filter conditions should be cleared from the switch
   before this sequence is entered.  The sequence is selected to test
   whether the switch sorts the filter conditions or accepts them in the
   order in which they were entered.  Either procedure will result in a
   greater impact on performance than will some form of hash coding.
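   The distinction matters because a linear scan pays a per-rule cost.
   The toy model below abbreviates each address to its varying selector
   octet (a real SUT matches full ATM NSAP addresses), and its rule
   counts are illustrative rather than an exact accounting of the
   sequence above.

```python
def match_filter(rules, src, dst):
    """Linear-scan filter evaluation: the behavior the sequence probes.

    Returns (action, conditions_checked).
    """
    for checked, (action, rule_src, rule_dst) in enumerate(rules, start=1):
        if (rule_src, rule_dst) == (src, dst):
            return action, checked
    return "deny", len(rules)  # implicit "deny all else"

# Deny rules surrounding a single allow rule placed 12th, so the test
# stream (01 -> 02) is matched only after 11 failed comparisons.
rules = [("deny", 0x01, d) for d in range(0x03, 0x0E)]   # 11 deny rules
rules.append(("allow", 0x01, 0x02))                      # rule 12
rules += [("deny", 0x01, d) for d in range(0x0E, 0x19)]  # 11 more

print(match_filter(rules, 0x01, 0x02))  # ('allow', 12)
```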








2.12. Protocol Addresses

   It is easier to implement these tests using a single logical stream
   of data, with one source ATM address and one destination ATM address;
   for some conditions, such as the filters described above, a single
   stream is a practical requirement.  Networks in the real world are
   not limited to single streams of data.  The test suite SHOULD first
   be run with a single ATM source and destination address pair.  The
   tests SHOULD then be repeated using a random destination address.  In
   the case of
   testing single switches, the addresses SHOULD be random and uniformly
   distributed over a range of 256 seven octet user parts.  In the case
   of testing multiple interconnected switches, the addresses SHOULD be
   random and uniformly distributed over the 256 network prefixes, each
   of which should support 256 seven octet user parts.  The specific
   address ranges to use for ATM are shown in Appendix A.  IP to ATM
   address mapping MUST be accomplished as described in RFC 2225.
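   A uniformly distributed destination address might be drawn as in the
   sketch below.  The zero-filled octets are placeholders, not the
   Appendix A ranges, and varying only the final octet is just one way
   to obtain 256 distinct values.

```python
import random

def random_dst(multi_switch=False):
    """Draw a uniformly distributed 20-octet destination ATM NSAP address.

    Single switch: one fixed 13-octet network prefix, 256 user parts.
    Multiple switches: 256 network prefixes, each with 256 user parts.
    """
    prefix = bytearray(13)                   # 13-octet network prefix
    if multi_switch:
        prefix[-1] = random.randrange(256)   # one of 256 prefixes
    user = bytearray(7)                      # 7-octet user part
    user[-1] = random.randrange(256)         # one of 256 user parts
    return bytes(prefix + user)

print(random_dst().hex())
```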

2.13. Route Set Up

   It is not reasonable to expect that all of the routing information
   necessary to forward the test stream, especially in the multiple
   address case, will be set up manually.  If PNNI and/or ILMI are
   running, at the
   start of each trial a routing update MUST be sent to the SUT.  This
   routing update MUST include all of the ATM addresses that will be
   required for the trial.  This routing update will have to be repeated
   at the interval required by PNNI or ILMI.  An example of the format
   and repetition interval of the update IP PDUs is given in Appendix B
   (interval and size) and Appendix C (format).

2.14. Bidirectional traffic

   Bidirectional performance tests SHOULD be run with the same data rate
   being offered from each direction.  The sum of the data rates should
   not exceed the theoretical limit for the media.

2.15. Single stream path

   The full suite of tests SHOULD be run with the appropriate modifiers
   for a single receive and transmit port on the SUT.  If the internal
   design of the SUT has multiple distinct pathways, for example,
   multiple interface cards each with multiple network ports, then all
   possible permutations of pathways SHOULD be tested separately.  If
   multiple interconnected switches are tested, the test MUST specify
   routes that allow only one path between source and destination ATM
   addresses.








2.16. Multi-port

   Many switch products provide several network ports on the same
   interface module.  Each port on an interface module SHOULD be
   stimulated in an identical manner.  Specifically, half of the ports
   on each module SHOULD be receive ports and half SHOULD be transmit
   ports.  For example, if a SUT has two interface modules, each of
   which has four ports, two ports on each interface module will be
   receive ports and two will be transmit ports.  Each receive port
   MUST be offered
   the same data rate.  The addresses in the input data streams SHOULD
   be set so that an IP PDU will be directed to each of the transmit
   ports in sequence.  That is, all transmit ports will receive an
   identical distribution of IP PDUs from a particular receive port.

   Consider the following 6 port SUT:

               --------------
      ---------| Rx A   Tx X|--------
      ---------| Rx B   Tx Y|--------
      ---------| Rx C   Tx Z|--------
               --------------

   The addressing of the data streams for each of the inputs SHOULD be:

      stream sent to Rx A:
        IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z
      stream sent to Rx B:
        IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z
      stream sent to Rx C
        IP PDU to Tx X, IP PDU to Tx Y, IP PDU to Tx Z

   Note: Each stream contains the same sequence of IP destination
   addresses; therefore, each transmit port will receive 3 IP PDUs
   simultaneously.  This procedure ensures that the SUT will have to
   process multiple IP PDUs addressed to the same transmit port
   simultaneously.

   The same configuration MAY be used to perform a bi-directional
   multi-stream test.  In this case all of the ports are considered both
   receive and transmit ports.  Each data stream MUST consist of IP
   PDUs whose addresses correspond to the ATM addresses of all of the
   other ports.
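   The stream addressing above amounts to round-robining the transmit-
   port addresses within each input stream.  A non-normative sketch,
   using the hypothetical port labels from the figure:

```python
# Sketch (not normative): build the per-receive-port destination
# sequence so that every transmit port sees an identical distribution
# of IP PDUs from every receive port.
def stream_for_rx(tx_ports, pdus):
    """Round-robin the transmit-port addresses over `pdus` PDUs."""
    return [tx_ports[i % len(tx_ports)] for i in range(pdus)]

tx = ["Tx X", "Tx Y", "Tx Z"]
streams = {rx: stream_for_rx(tx, 6) for rx in ["Rx A", "Rx B", "Rx C"]}
# Every stream carries the same destination sequence, so at each step
# all three receive ports forward to the same transmit port.
```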











2.17. Multiple protocols

   This document does not address the issue of testing the effects of a
   mixed protocol environment other than to suggest that if such tests
   are wanted then PDUs SHOULD be distributed between all of the test
   protocols.  The distribution MAY approximate the conditions on the
   network in which the SUT would be used.

2.18. Multiple IP PDU sizes

   This document does not address the issue of testing the effects of a
   mixed IP PDU size environment other than to suggest that, if such
   tests are required, then IP PDU size SHOULD be evenly distributed
   among all of the PDU sizes listed in this document.  The distribution
   MAY approximate the conditions on the network in which the SUT would
   be used.

2.19. Testing beyond a single SUT

   In the performance testing of a single SUT, the paradigm can be
   described as applying some input to a SUT and monitoring the output.
   The results can be used to form a basis for characterizing that
   device under those test conditions.

   This model is useful when the test input and output are homogeneous
   (e.g., 64-byte IP PDUs in AAL5 into the SUT; 64-byte IP PDUs in
   AAL5 out).

   By extending the single SUT test model, reasonable benchmarks
   regarding multiple SUTs or heterogeneous environments may be
   collected.  In this extension, the single SUT is replaced by a system
   of interconnected network SUTs.  This test methodology would support
   the benchmarking of a variety of device/media/service/protocol
   combinations.  For example, a configuration for a LAN-to-WAN-to-LAN
   test might be:

      (1) ATM UNI -> SUT 1 -> BISUP -> SUT 2 -> ATM UNI

   Or an extended LAN configuration might be:

      (2) ATM UNI -> SUT 1 -> PNNI Network -> SUT 2 -> ATM UNI

   In both examples 1 and 2, end-to-end benchmarks of each system could
   be empirically ascertained.  Other behavior may be characterized
   through the use of intermediate devices.  In example 2, the
   configuration may be used to give an indication of the effect of PNNI
   routing on IP throughput.






   Because multiple SUTs are treated as a single system, there are
   limitations to this methodology.  For instance, this methodology may
   yield an aggregate benchmark for a tested system.  That benchmark
   alone, however, may not necessarily reflect asymmetries in behavior
   between the SUTs, latencies introduced by other apparatus (e.g.,
   CSUs/DSUs, switches), etc.

   Further, care must be taken when comparing benchmarks of different
   systems: the features and configurations of the tested systems must
   have the appropriate common denominators to allow comparison.

2.20. Maximum IP PDU rate

   The maximum IP PDU rate used when testing LAN connections SHOULD be
   the listed theoretical maximum rate for the IP PDU size on the
   media.

   The maximum IP PDU rate used when testing WAN connections SHOULD be
   greater than the listed theoretical maximum rate for the IP PDU size
   on that speed connection.  The higher rate
   for WAN tests is to compensate for the fact that some vendors employ
   various forms of header compression.

   A list of maximum IP PDU rates for LAN connections is included in
   Appendix B.
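   For an AAL5 VCC the theoretical maximum PDU rate can be derived from
   the cell rate of the line.  A non-normative sketch: the 353,207
   cells/s STS-3c payload figure is an assumption for the example, and
   the normative numbers are those listed in Appendix B.

```python
import math

# Sketch (not normative): theoretical maximum IP PDU rate over AAL5.
# CELLS_PER_SECOND is an assumed STS-3c/OC-3c ATM payload capacity.
CELLS_PER_SECOND = 353207

def cells_per_pdu(pdu_octets):
    """AAL5: PDU plus 8-octet trailer, padded to a multiple of 48."""
    return math.ceil((pdu_octets + 8) / 48)

def max_pdu_rate(pdu_octets):
    """Whole PDUs per second at full cell rate."""
    return CELLS_PER_SECOND // cells_per_pdu(pdu_octets)
```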

2.21. Bursty traffic

   It is convenient to measure the SUT performance under steady state
   load; however, this is an unrealistic way to gauge the functioning of
   a SUT.  Actual network traffic normally consists of bursts of IP
   PDUs.

   Some of the tests described below SHOULD be performed with constant
   bit rate traffic as well as with bursty Unspecified Bit Rate (UBR)
   Best Effort [AF-TM4.1] and Variable Bit Rate Non-real Time (VBR-nrt)
   Best Effort [AF-TM4.1] traffic.  The IP PDUs within a burst are
   transmitted with the minimum legitimate inter-IP PDU gap.

   The objective of the test is to determine the minimum interval
   between bursts that the SUT can process with no IP PDU loss.  Tests
   SHOULD be run with burst sizes of 10% of Maximum Burst Size (MBS),
   20% of MBS, 50% of MBS, and 100% of MBS.  Note that the number of IP
   PDUs in each burst will depend on the PDU size.  For UBR, the MBS
   refers
   to the associated VBR traffic parameters.
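   Since the number of IP PDUs per burst depends on the PDU size, the
   burst sizes above translate into PDU counts as in this non-normative
   sketch:

```python
import math

# Sketch (not normative): number of whole IP PDUs that fit in a burst
# of a given fraction of the Maximum Burst Size (MBS, in cells).
def pdus_per_burst(mbs_cells, fraction, pdu_octets):
    cells_per_pdu = math.ceil((pdu_octets + 8) / 48)  # AAL5 trailer + pad
    return int(mbs_cells * fraction) // cells_per_pdu

# The burst sizes the tests SHOULD use: 10%, 20%, 50% and 100% of MBS.
sizes = [pdus_per_burst(8192, f, 64) for f in (0.10, 0.20, 0.50, 1.00)]
```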







2.22. Trial description

   A particular test consists of multiple trials.  Each trial returns
   one piece of information, for example the loss rate at a particular
   input IP PDU rate.  Each trial consists of five phases:

   a) If the SUT is a switch supporting PNNI, send the routing update to
      the SUT receive port and wait two seconds to be sure that the
      routing has settled.

   b) Send an ATM ARP PDU to determine the ATM address corresponding to
      the destination IP address.  The formats of the ATM ARP PDU that
      should be used are shown in the Test Frame Formats document and
      MUST be in accordance with RFC 2225.

   c) Stimulate SUT with traffic load.

   d) Wait for two seconds for any residual IP PDUs to be received.

   e) Wait for at least five seconds for the SUT to restabilize.
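   The five phases can be sketched as a driver loop.  The routing-
   update, ARP, and load callables below are placeholders for test-
   equipment operations and are not defined by this document:

```python
import time

# Sketch (not normative): the five trial phases as a driver loop.
def run_trial(sut_is_pnni_switch, send_routing_update, send_atm_arp,
              offer_load, settle=2.0, residual=2.0, restabilize=5.0):
    if sut_is_pnni_switch:
        send_routing_update()    # a) routing update to SUT receive port
        time.sleep(settle)       #    wait for routing to settle
    send_atm_arp()               # b) resolve destination IP -> ATM (RFC 2225)
    result = offer_load()        # c) stimulate the SUT with the traffic load
    time.sleep(residual)         # d) wait for residual IP PDUs
    time.sleep(restabilize)      # e) let the SUT restabilize
    return result
```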

2.23. Trial duration

   The objective of the tests defined in this document is to accurately
   characterize the behavior of a particular piece of network equipment
   under varying traffic loads.  The choice of test duration must be a
   compromise between this objective and keeping the duration of the
   benchmarking test suite within reasonable bounds.  The SUT SHOULD be
   stimulated for at least 60 seconds.  If this time period results in a
   high variance in the test results, the SUT SHOULD be stimulated for
   at least 300 seconds.

2.24. Address resolution

   The SUT MUST be able to respond to address resolution requests sent
   by another SUT, an ATM ARP server or the test equipment in accordance
   with RFC 2225.

2.25. Synchronized Payload Bit Pattern.

   Some measurements assume that both the transmitter and receiver
   payload information is synchronized.  Synchronization MUST be
   achieved by supplying a known bit pattern to both the transmitter and
   receiver.  This bit pattern MUST be one of the following: PRBS-15,
   PRBS-23, 0xFF00, or 0xAA55.
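   A non-normative sketch of two of these patterns.  The PRBS-15
   generator uses the common x^15 + x^14 + 1 polynomial (an assumption;
   this document names the pattern but not the polynomial), and the
   fixed patterns are simple repeating octet pairs:

```python
# Sketch (not normative): generating two of the allowed bit patterns.
def prbs15_bits(n, state=0x7FFF):
    """Generate n bits of PRBS-15 from a nonzero 15-bit seed."""
    out = []
    for _ in range(n):
        newbit = ((state >> 14) ^ (state >> 13)) & 1  # taps 15 and 14
        state = ((state << 1) | newbit) & 0x7FFF
        out.append(newbit)
    return out

def fixed_pattern_bytes(n, pattern=(0xFF, 0x00)):
    """The 0xFF00 and 0xAA55 patterns are repeating octet pairs."""
    return bytes(pattern[i % 2] for i in range(n))
```

   Supplying the same seed to both transmitter and receiver keeps the
   two payload generators synchronized.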






Dunn & Martin                Informational                     [Page 16]


RFC 3116            Methodology for ATM Benchmarking           June 2001


2.26. Burst Traffic Descriptors.

   Some measurements require bursty traffic patterns.  These patterns
   MUST conform to one of the following traffic descriptors:

   1) PCR=100% allotted line rate, SCR=50% allotted line rate, and
      MBS=8192

   2) PCR=100% allotted line rate, SCR=50% allotted line rate, and
      MBS=4096

   3) PCR=90% allotted line rate, SCR=50% allotted line rate, and
      MBS=8192

   4) PCR=90% allotted line rate, SCR=50% allotted line rate, and
      MBS=4096

   5) PCR=90% allotted line rate, SCR=45% allotted line rate, and
      MBS=8192

   6) PCR=90% allotted line rate, SCR=45% allotted line rate, and
      MBS=4096

   7) PCR=80% allotted line rate, SCR=40% allotted line rate, and
      MBS=65536

   8) PCR=80% allotted line rate, SCR=40% allotted line rate, and
      MBS=32768

   The allotted line rate refers to the total available line rate
   divided by the number of VCCs in use.
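   The eight descriptors can be tabulated against the allotted line
   rate as in this non-normative sketch (rates in cells per second):

```python
# Sketch (not normative): the eight burst traffic descriptors above.
DESCRIPTORS = [  # (PCR fraction, SCR fraction, MBS in cells)
    (1.00, 0.50, 8192), (1.00, 0.50, 4096),
    (0.90, 0.50, 8192), (0.90, 0.50, 4096),
    (0.90, 0.45, 8192), (0.90, 0.45, 4096),
    (0.80, 0.40, 65536), (0.80, 0.40, 32768),
]

def descriptor(index, line_rate_cells, vccs):
    """Resolve descriptor 1..8 against the allotted line rate."""
    allotted = line_rate_cells / vccs  # total rate / number of VCCs
    pcr, scr, mbs = DESCRIPTORS[index - 1]
    return {"PCR": pcr * allotted, "SCR": scr * allotted, "MBS": mbs}
```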

3. Performance Metrics

3.1. Physical Layer-SONET

3.1.1. Pointer Movements

3.1.1.1. Pointer Movement Propagation.

   Objective: To determine that the SUT does not propagate pointer
   movements as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP PDUs at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.









   3)  Count the IP PDUs that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one forward payload pointer movement.  Verify that the SUT
       does not change the pointer.

   5)  Inject one forward payload pointer movement every 1 second.
       Verify that the SUT does not change the pointer.

   6)  Discontinue the payload pointer movement.

   7)  Inject five forward payload pointer movements every 1 second.
       Verify that the SUT does not change the pointer.

   8)  Discontinue the payload pointer movement.

   9)  Inject one backward payload pointer movement.  Verify that the
       SUT does not change the pointer.

   10) Inject one backward payload pointer movement every 1 second.
       Verify that the SUT does not change the pointer.

   11) Discontinue the payload pointer movement.

   12) Inject five backward payload pointer movements every 1 second.
       Verify that the SUT does not change the pointer.

   13) Discontinue the payload pointer movement.

   Reporting Format:

      The results of the pointer movement propagation test SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled single
      pointer movement, one pointer movement per second, and five
      pointer movements per second.  The columns SHOULD be labeled
      pointer movement and loss of pointer.  The elements of the table
      SHOULD be either True or False, indicating whether the particular
      condition was observed for each test.

      The table MUST also indicate the IP PDU size in octets and traffic
      rate in IP PDUs per second as generated by the test device.
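   One non-normative way to lay out such a table; the observation
   values below are placeholders filled in from the test device:

```python
# Sketch (not normative): formatting the reporting table above.
ROWS = ["single pointer movement",
        "one pointer movement per second",
        "five pointer movements per second"]
COLS = ["pointer movement", "loss of pointer"]

def report(results, pdu_size_octets, rate_pdus_per_s):
    """results maps each row label to {column label: True/False}."""
    lines = [f"IP PDU size: {pdu_size_octets} octets, "
             f"rate: {rate_pdus_per_s} IP PDUs/s",
             " " * 34 + "  ".join(COLS)]
    for row in ROWS:
        observed = results[row]
        cells = "  ".join(str(observed[c]).ljust(len(c)) for c in COLS)
        lines.append(row.ljust(34) + cells)
    return "\n".join(lines)
```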










3.1.1.2. Cell Loss due to Pointer Movement.

   Objective: To determine if the SUT will drop cells due to pointer
   movements as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of cells at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.

   3)  Count the cells that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one forward payload pointer movement.  Verify that the SUT
       does not drop any cells.

   5)  Inject one forward payload pointer movement every 1 second.
       Verify that the SUT does not drop any cells.

   6)  Discontinue the payload pointer movement.

   7)  Inject five forward payload pointer movements every 1 second.
       Verify that the SUT does not drop any cells.

   8)  Discontinue the payload pointer movement.

   9)  Inject one backward payload pointer movement.  Verify that the
       SUT does not drop any cells.

   10) Inject one backward payload pointer movement every 1 second.
       Verify that the SUT does not drop any cells.

   11) Discontinue the payload pointer movement.

   12) Inject five backward payload pointer movements every 1 second.
       Verify that the SUT does not drop any cells.

   13) Discontinue the payload pointer movement.








   Reporting Format:

      The results of the cell loss due to pointer movement test SHOULD
      be reported in the form of a table.  The rows SHOULD be labeled
      single pointer movement, one pointer movement per second, and five
      pointer movements per second.  The columns SHOULD be labeled cell
      loss and number of cells lost.  The elements of column 1 SHOULD be
      either True or False, indicating whether the particular condition
      was observed for each test.  The elements of column 2 SHOULD be
      non-negative integers.

      The table MUST also indicate the traffic rate in IP PDUs per
      second as generated by the test device.

3.1.1.3. IP Packet Loss due to Pointer Movement.

   Objective: To determine if the SUT will drop IP packets due to
   pointer movements as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP packets at a specific rate through
       the SUT.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   3)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one forward payload pointer movement.  Verify that the SUT
       does not drop any packets.

   5)  Inject one forward payload pointer movement every 1 second.
       Verify that the SUT does not drop any packets.

   6)  Discontinue the payload pointer movement.

   7)  Inject five forward payload pointer movements every 1 second.
       Verify that the SUT does not drop any packets.

   8)  Discontinue the payload pointer movement.






   9)  Inject one backward payload pointer movement.  Verify that the
       SUT does not drop any packets.

   10) Inject one backward payload pointer movement every 1 second.
       Verify that the SUT does not drop any packets.

   11) Discontinue the payload pointer movement.

   12) Inject five backward payload pointer movements every 1 second.
       Verify that the SUT does not drop any packets.

   13) Discontinue the payload pointer movement.

   Reporting Format:

      The results of the IP packet loss due to pointer movement test
      SHOULD be reported in the form of a table.  The rows SHOULD be
      labeled single pointer movement, one pointer movement per second,
      and five pointer movements per second.  The columns SHOULD be
      labeled packet loss and number of packets lost.  The elements of
      column 1 SHOULD be either True or False, indicating whether the
      particular condition was observed for each test.  The elements of
      column 2 SHOULD be non-negative integers.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.1.2. Transport Overhead (TOH) Error Count

3.1.2.1. TOH Error Propagation.

   Objective: To determine that the SUT does not propagate TOH errors as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP PDUs at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.

   3)  Count the IP PDUs that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.





   4)  Inject one error in the first bit of the A1 and A2 Frameword.
       Verify that the SUT does not propagate the error.

   5)  Inject one error in the first bit of the A1 and A2 Frameword
       every 1 second.  Verify that the SUT does not propagate the
       error.

   6)  Discontinue the Frameword error.

   7)  Inject one error in the first bit of the A1 and A2 Frameword for
       4 consecutive IP PDUs in every 6 IP PDUs.  Verify that the SUT
       indicates Loss of Frame.

   8)  Discontinue the Frameword error.

   Reporting Format:

      The results of the TOH error propagation test SHOULD be reported
      in the form of a table.  The rows SHOULD be labeled single error,
      one error per second, and four consecutive errors every 6 IP PDUs.
      The columns SHOULD be labeled error propagated and loss of IP PDU.
      The elements of the table SHOULD be either True or False,
      indicating whether the particular condition was observed for each
      test.

      The table MUST also indicate the IP PDU size in octets and traffic
      rate in IP PDUs per second as generated by the test device.

3.1.2.2. Cell Loss due to TOH Error.

   Objective: To determine if the SUT will drop cells due to TOH errors
   as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of cells at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.

   3)  Count the cells that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.






   4)  Inject one error in the first bit of the A1 and A2 Frameword.
       Verify that the SUT does not drop any cells.

   5)  Inject one error in the first bit of the A1 and A2 Frameword
       every 1 second.  Verify that the SUT does not drop any cells.

   6)  Discontinue the Frameword error.

   7)  Inject one error in the first bit of the A1 and A2 Frameword for
       4 consecutive IP PDUs in every 6 IP PDUs.  Verify that the SUT
       does drop cells.

   8)  Discontinue the Frameword error.

   Reporting Format:

      The results of the Cell Loss due to TOH errors test SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled single
      error, one error per second, and four consecutive errors every 6
      IP PDUs.  The columns SHOULD be labeled cell loss and number of
      cells lost.  The elements of column 1 SHOULD be either True or
      False, indicating whether the particular condition was observed
      for each test.  The elements of column 2 SHOULD be non-negative
      integers.

      The table MUST also indicate the traffic rate in IP PDUs per
      second as generated by the test device.

3.1.2.3. IP Packet Loss due to TOH Error.

   Objective: To determine if the SUT will drop IP packets due to TOH
   errors as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP packets at a specific rate through
       the SUT.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   3)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.






   4)  Inject one error in the first bit of the A1 and A2 Frameword.
       Verify that the SUT does not drop any packets.

   5)  Inject one error in the first bit of the A1 and A2 Frameword
       every 1 second.  Verify that the SUT does not drop any packets.

   6)  Discontinue the Frameword error.

   7)  Inject one error in the first bit of the A1 and A2 Frameword for
       4 consecutive IP PDUs in every 6 IP PDUs.  Verify that the SUT
       does drop packets.

   8)  Discontinue the Frameword error.

   Reporting Format:

      The results of the IP packet loss due to TOH errors test SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled single
      error, one error per second, and four consecutive errors every 6
      IP PDUs.  The columns SHOULD be labeled packet loss and number of
      packets lost.  The elements of column 1 SHOULD be either True or
      False, indicating whether the particular condition was observed
      for each test.  The elements of column 2 SHOULD be non-negative
      integers.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.1.3. Path Overhead (POH) Error Count

3.1.3.1. POH Error Propagation.

   Objective: To determine that the SUT does not propagate POH errors as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP PDUs at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.









   3)  Count the IP PDUs that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one error in the B3 (Path BIP8) byte.  Verify that the SUT
       does not propagate the error.

   5)  Inject one error in the B3 byte every 1 second.  Verify that the
       SUT does not propagate the error.

   6)  Discontinue the POH error.

   Reporting Format:

       The results of the POH error propagation test SHOULD be reported
       in the form of a table.  The rows SHOULD be labeled single error
       and one error per second.  The columns SHOULD be labeled error
       propagated and loss of IP PDU.  The elements of the table SHOULD
       be either True or False, indicating whether the particular
       condition was observed for each test.

       The table MUST also indicate the IP PDU size in octets and
       traffic rate in IP PDUs per second as generated by the test
       device.

3.1.3.2. Cell Loss due to POH Error.

   Objective: To determine if the SUT will drop cells due to POH errors
   as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of cells at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.

   3)  Count the cells that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one error in the B3 (Path BIP8) byte.  Verify that the SUT
       does not drop any cells.





   5)  Inject one error in the B3 byte every 1 second.  Verify that the
       SUT does not drop any cells.

   6)  Discontinue the POH error.

   Reporting Format:

      The results of the Cell Loss due to POH errors test SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled single
      error and one error per second.  The columns SHOULD be labeled
      cell loss and number of cells lost.  The elements of column 1
      SHOULD be either True or False, indicating whether the particular
      condition was observed for each test.  The elements of column 2
      SHOULD be non-negative integers.

      The table MUST also indicate the traffic rate in IP PDUs per
      second as generated by the test device.

3.1.3.3. IP Packet Loss due to POH Error.

   Objective: To determine if the SUT will drop IP packets due to POH
   errors as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP packets at a specific rate through
       the SUT.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   3)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   4)  Inject one error in the B3 (Path BIP8) byte.  Verify that the SUT
       does not drop any packets.

   5)  Inject one error in the B3 byte every 1 second.  Verify that the
       SUT does not drop any packets.

   6)  Discontinue the POH error.








   Reporting Format:

      The results of the IP packet loss due to POH errors test SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled single
      error and one error per second.  The columns SHOULD be labeled
      packet loss and number of packets lost.  The elements of column 1
      SHOULD be either True or False, indicating whether the particular
      condition was observed for each test.  The elements of column 2
      SHOULD be non-negative integers.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.2. ATM Layer

3.2.1. Two-Point Cell Delay Variation (CDV)

3.2.1.1. Test Setup

   The cell delay measurements assume that both the transmitter and
   receiver timestamp information is synchronized.  Synchronization
   SHOULD be achieved by supplying a common clock signal (minimum of 100
   MHz or 10 ns resolution) to both the transmitter and receiver.  The
   maximum timestamp values MUST be recorded to ensure synchronization
   in the case of counter rollover.  The cell delay measurements SHOULD
   utilize the O.191 cell (ITUT-O.191) encapsulated in a valid IP
   packet.  If the O.191 cell is not available, a test cell encapsulated
   in a valid IP packet MAY be used.  The test cell MUST contain a
   transmit timestamp which can be correlated with a receive timestamp.
   A description of the test cell MUST be included in the test results.
   The description MUST include the timestamp length (in bits), counter
   rollover value, and the timestamp accuracy (in ns).
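   Given synchronized timestamps, the transfer delay can be recovered
   modulo the counter rollover value, as in this non-normative sketch.
   The 32-bit counter and 10 ns tick are assumptions; the real values
   come from the required test-cell description.

```python
# Sketch (not normative): recovering cell transfer delay from transmit
# and receive timestamps when the timestamp counter can roll over.
def transfer_delay_ticks(tx_ts, rx_ts, rollover):
    """Delay in ticks, taken modulo the counter rollover value."""
    return (rx_ts - tx_ts) % rollover

# e.g., an assumed 32-bit counter at 10 ns per tick (100 MHz clock),
# with the counter rolling over between transmit and receive:
delay_ns = transfer_delay_ticks(0xFFFFFFF0, 0x00000010, 2**32) * 10
```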

3.2.1.2. Two-point CDV/Steady Load/One VCC

   Objective: To determine the SUT variation in cell transfer delay with
   one VCC as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).






   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCC.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise, lower the test
       device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device.
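
   The CDV figures reported below can be derived from the timestamps
   recorded in step 5; a minimal sketch, assuming the timestamps have
   already been converted to microseconds:

```python
# Sketch of the two-point CDV figures computed from per-packet
# (tx, rx) timestamp pairs.  Peak-to-peak CDV is the difference
# between the maximum and minimum observed transfer delay.

def cdv_report(tx_times, rx_times):
    delays = [rx - tx for tx, rx in zip(tx_times, rx_times)]
    d_max, d_min = max(delays), min(delays)
    return {
        "max_delay_us": d_max,
        "min_delay_us": d_min,
        "peak_to_peak_cdv_us": d_max - d_min,   # two-point CDV
    }

r = cdv_report([0.0, 100.0, 200.0], [52.0, 150.0, 255.0])
print(r["peak_to_peak_cdv_us"])   # delays 52, 50, 55 us -> 5.0
```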

   Reporting Format:

      The results of the Two-point CDV/Steady Load/One VCC test SHOULD
      be reported in the form of text, a graph, and a histogram.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, maximum
      and minimum CDV during the test in us, and peak-to-peak CDV in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay in us.  The integration time per point MUST be
      indicated.

      The histogram results SHOULD display the peak-to-peak cell delay.
      The x-coordinate SHOULD be the cell delay in us with at least 256
      bins.  The y-coordinate SHOULD be the number of cells observed in
      each bin.
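
      A minimal sketch of such a 256-bin histogram; deriving the bin
      edges from the observed minimum and maximum delay is an
      assumption, since the bin boundaries are not fixed here:

```python
# Sketch of the 256-bin peak-to-peak delay histogram.  Bin edges are
# assumed to span the observed min..max delay range.

def delay_histogram(delays_us, bins=256):
    lo, hi = min(delays_us), max(delays_us)
    width = (hi - lo) / bins or 1.0       # avoid zero-width bins
    counts = [0] * bins
    for d in delays_us:
        i = min(int((d - lo) / width), bins - 1)
        counts[i] += 1
    return lo, width, counts   # x-axis: lo + i*width, y-axis: counts[i]

lo, width, counts = delay_histogram([50.0, 52.0, 55.0, 50.0])
print(sum(counts))   # every observed cell lands in exactly one bin -> 4
```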

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.1.3. Two-point CDV/Steady Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be any of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCCs.
       All of the VPI/VCI pairs will generate traffic at the same
       traffic rate.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the Two-point CDV/Steady Load/Twelve VCCs test
      SHOULD be reported in the form of text, a graph, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay for each VCC in us.  There SHOULD be 12 curves
      on the graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.1.4. Two-point CDV/Steady Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be any of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCCs.
       All of the VPI/VCI pairs will generate traffic at the same
       traffic rate.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.
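
   The example allocation in step 2 (1024 VCCs as 256 VPIs with 4 VCIs
   per VPI) can be sketched as follows; starting the VCI range at 32 is
   an assumption, chosen so the reserved signaling channels are avoided
   by construction:

```python
# Sketch of the maximum-VCC allocation: walk VPI/VCI space in order,
# skipping reserved signaling channels, until max_vccs pairs exist.

RESERVED = {(0, 5), (0, 16)}     # reserved ATM signaling channels

def allocate_vccs(max_vccs, vcis_per_vpi=4, first_vci=32):
    pairs = []
    vpi = 0
    while len(pairs) < max_vccs:
        for vci in range(first_vci, first_vci + vcis_per_vpi):
            if (vpi, vci) not in RESERVED and len(pairs) < max_vccs:
                pairs.append((vpi, vci))
        vpi += 1
    return pairs

pairs = allocate_vccs(1024)
print(len(pairs), pairs[0], pairs[-1])   # 1024 (0, 32) (255, 35)
```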

   Reporting Format:

      The results of the Two-point CDV/Steady Load/Maximum VCCs test
      SHOULD be reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  There
      will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on
      each graph.  The x-coordinate SHOULD be the test run time in
      either seconds, minutes or days depending on the total length of
      the test.  The x-coordinate time SHOULD be configurable.  The y-
      coordinate SHOULD be the cell delay for each VCC in us.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.
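
      The graph layout above amounts to splitting the per-VCC series
      into groups of at most 10; a sketch using ceiling division, so a
      VCC count that is not a multiple of 10 yields one final, smaller
      graph:

```python
# Sketch of the per-graph grouping: at most 10 labeled curves per
# graph, giving ceil(max_vccs / 10) graphs in total.

def graph_groups(vcc_ids, per_graph=10):
    return [vcc_ids[i:i + per_graph]
            for i in range(0, len(vcc_ids), per_graph)]

groups = graph_groups(list(range(1024)))
print(len(groups), len(groups[0]), len(groups[-1]))   # 103 10 4
```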

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.1.5. Two-point CDV/Bursty VBR Load/One VCC

   Objective: To determine the SUT variation in cell transfer delay with
   one VCC as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCC.  Since
       this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise, lower the test
       device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device.
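
   One way to realize a bursty VBR load from the PCR, SCR, and MBS
   values reported below is an on/off source; a sketch with
   illustrative, assumed figures (rates in cells per second):

```python
# Sketch of a bursty VBR source: bursts of MBS cells at the peak cell
# rate (PCR), separated by an idle gap sized so that the long-run
# average equals the sustainable cell rate (SCR).

def burst_cycle(pcr, scr, mbs):
    """Return (burst_s, idle_s) for one on/off cycle in seconds."""
    burst = mbs / pcr                 # MBS cells sent at the peak rate
    cycle = mbs / scr                 # cycle length that averages to SCR
    return burst, cycle - burst

# Illustrative, assumed parameters: PCR 100,000 cells/s, SCR 25,000
# cells/s, MBS 100 cells -> mean rate works out to the SCR.
burst, idle = burst_cycle(pcr=100_000, scr=25_000, mbs=100)
print(burst, idle)
```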

   Reporting Format:

      The results of the Two-point CDV/Bursty VBR Load/One VCC test
      SHOULD be reported in the form of text, a graph, and a histogram.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, maximum
      and minimum CDV during the test in us, and peak-to-peak CDV in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay in us.  The integration time per point MUST be
      indicated.

      The histogram results SHOULD display the peak-to-peak cell delay.
      The x-coordinate SHOULD be the cell delay in us with at least 256
      bins.  The y-coordinate SHOULD be the number of cells observed in
      each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.1.6. Two-point CDV/Bursty VBR Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as CBR or VBR
       connections.  The VPI/VCIs MUST NOT be any of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the Two-point CDV/Bursty VBR Load/Twelve VCCs test
      SHOULD be reported in the form of text, a graph, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay for each VCC in us.  There SHOULD be 12 curves
      on the graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.1.7. Two-point CDV/Bursty VBR Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as CBR or VBR connections.
       The VPI/VCIs MUST NOT be any of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the Two-point CDV/Bursty VBR Load/Maximum VCCs test
      SHOULD be reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  There
      will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on
      each graph.  The x-coordinate SHOULD be the test run time in
      either seconds, minutes or days depending on the total length of
      the test.  The x-coordinate time SHOULD be configurable.  The y-
      coordinate SHOULD be the cell delay for each VCC in us.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.1.8. Two-point CDV/Mixed Load/Three VCC's

   Objective: To determine the SUT variation in cell transfer delay with
   three VCC's as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with three VCCs.  Each VCC
       MUST be defined as a different bearer class: one CBR, one UBR,
       and one VBR.  Each VCC SHOULD contain one VPI/VCI.  The VPI/VCI
       MUST NOT be one of the reserved ATM signaling channels (e.g.,
       [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise, lower the test
       device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the Two-point CDV/Mixed Load/Three VCC's test
      SHOULD be reported in the form of text, a graph, and a histogram.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, maximum
      and minimum CDV during the test in us, and peak-to-peak CDV in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay in us.  The integration time per point MUST be
      indicated.

      The histogram results SHOULD display the peak-to-peak cell delay.
      The x-coordinate SHOULD be the cell delay in us with at least 256
      bins.  The y-coordinate SHOULD be the number of cells observed in
      each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.1.9. Two-point CDV/Mixed Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs.  Each VCC
       MUST be defined as one of the bearer classes, for a total of
       four CBR, four UBR, and four VBR VCCs.  Each VCC SHOULD contain
       one VPI/VCI.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the Two-point CDV/Mixed Load/Twelve VCCs test
      SHOULD be reported in the form of text, a graph, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  The x-
      coordinate SHOULD be the test run time in either seconds, minutes
      or days depending on the total length of the test.  The x-
      coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell delay for each VCC in us.  There SHOULD be 12 curves
      on the graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.1.10. Two-point CDV/Mixed Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of
       VCCs supported on the SUT.  For example, if the maximum number
       of VCCs supported on the SUT is 1024, define 256 VPIs with 4
       VCIs per VPI.  Each VCC MUST be defined as one of the bearer
       classes, for a total of (max VCC/3) CBR, (max VCC/3) UBR, and
       (max VCC/3) VBR VCCs.  If the maximum number of VCCs is not
       divisible by 3, the totals for the bearer classes MUST be within
       3 VCCs of each other.  The VPI/VCIs MUST NOT be any of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.
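
   The bearer class split in step 2 can be sketched as follows; this
   division keeps the per-class totals within one VCC of each other,
   comfortably inside the stated 3-VCC tolerance:

```python
# Sketch of dividing max_vccs among the three bearer classes; when
# max_vccs is not divisible by 3, the first classes absorb one extra
# VCC each, so the totals differ by at most one.

def split_classes(max_vccs):
    base, rem = divmod(max_vccs, 3)
    return {cls: base + (1 if i < rem else 0)
            for i, cls in enumerate(("CBR", "UBR", "VBR"))}

print(split_classes(1024))   # {'CBR': 342, 'UBR': 341, 'VBR': 341}
```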

   Reporting Format:

      The results of the Two-point CDV/Mixed Load/Maximum VCCs test
      SHOULD be reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CDV.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CDV on each VCC during the test in us, and peak-to-peak CDV on
      each VCC in us.

      The graph results SHOULD display the cell delay values.  There
      will be (Max number of VCCs/10) graphs, with 10 VCCs indicated on
      each graph.  The x-coordinate SHOULD be the test run time in
      either seconds, minutes or days depending on the total length of
      the test.  The x-coordinate time SHOULD be configurable.  The y-
      coordinate SHOULD be the cell delay for each VCC in us.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.

      The histograms SHOULD display the peak-to-peak cell delay.  There
      will be one histogram for each VCC.  The x-coordinate SHOULD be
      the cell delay in us with at least 256 bins.  The y-coordinate
      SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.2. Cell Error Ratio (CER)

3.2.2.1. Test Setup

   The cell error ratio measurements assume that the transmitter and
   receiver payload bit patterns are synchronized.  Synchronization
   MUST be achieved by supplying a known bit pattern to both the
   transmitter and receiver.  If this bit pattern is longer than the
   packet size, the receiver MUST synchronize with the transmitter
   before tests can be run.
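
   The receiver alignment described above can be sketched as a search
   for the receiver's offset within the repeating pattern; the search
   strategy itself is an assumption, not specified by this document:

```python
# Sketch of receiver synchronization to a repeating known bit pattern
# longer than one packet: locate the offset at which the received
# bytes line up with the pattern before error counting starts.

def sync_offset(pattern: bytes, received: bytes) -> int:
    """Offset into the repeating pattern at which `received` begins,
    or -1 if no alignment matches (link not yet synchronized)."""
    doubled = pattern + pattern      # handles wrap-around alignment
    for off in range(len(pattern)):
        if doubled[off:off + len(received)] == received:
            return off
    return -1

pat = bytes(range(16))
print(sync_offset(pat, pat[5:9]))   # receiver joined mid-pattern -> 5
```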

3.2.2.2. CER/Steady Load/One VCC

   Objective: To determine the SUT ratio of errored cells on one VCC in
   a transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise, lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of bit errors at the receiver end of the test
       device.
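
   The CER reported below follows the RFC 2761 definition: errored
   cells divided by the total of successfully transferred plus errored
   cells; a minimal sketch:

```python
# Sketch of the cell error ratio (CER) figure: errored cells over the
# total of successfully transferred plus errored cells.

def cell_error_ratio(errored: int, successful: int) -> float:
    total = errored + successful
    return errored / total if total else 0.0

print(cell_error_ratio(errored=3, successful=999_997))   # 3e-06
```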

   Reporting Format:

      The results of the CER/Steady Load/One VCC test SHOULD be reported
      in the form of text and a graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CER for the entire test.

      The graph results SHOULD display the cell error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CER.  The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.2.2.3. CER/Steady Load/Twelve VCCs

   Objective: To determine the SUT ratio of errored cells on twelve VCCs
   in a transmission in relation to the total cells sent as defined in
   RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be any of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of bit errors at the receiver end of the test
       device for all VCCs.
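
   Step 2's channel assignment (one VPI carrying twelve VCIs, avoiding
   the reserved signaling channels) might be sketched as below.  The
   reserved-channel set and the starting VCI of 32 are assumptions for
   illustration; the text lists only [0,5] and [0,16] as examples.

```python
RESERVED_CHANNELS = {(0, 5), (0, 16)}  # example reserved signaling channels

def assign_vccs(vpi, count, first_vci=32):
    """Return `count` (VPI, VCI) pairs on a single VPI, skipping any
    reserved ATM signaling channels."""
    vccs, vci = [], first_vci
    while len(vccs) < count:
        if (vpi, vci) not in RESERVED_CHANNELS:
            vccs.append((vpi, vci))
        vci += 1
    return vccs
```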

   Reporting Format:

      The results of the CER/Steady Load/Twelve VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CER for the entire test.





      The graph results SHOULD display the cell error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CER for each VCC.  There SHOULD be 12 curves on the graph,
      one curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.2.2.4. CER/Steady Load/Maximum VCCs

   Objective: To determine the ratio of errored cells to the total
   cells sent through the SUT with the maximum number of VCCs that the
   SUT supports, as defined in RFC 2761, "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of bit errors at the receiver end of the test
       device for all VCCs.
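
   The example in step 2 (1024 VCCs laid out as 256 VPIs of 4 VCIs
   each) amounts to factoring the SUT's VCC limit across VPIs.  A
   hypothetical helper, for illustration only:

```python
def split_vccs(max_vccs, vcis_per_vpi):
    """Split a SUT's VCC limit into (number of VPIs, VCIs per VPI).
    E.g., 1024 VCCs with 4 VCIs per VPI -> 256 VPIs."""
    if max_vccs % vcis_per_vpi != 0:
        raise ValueError("VCC count must divide evenly across VPIs")
    return max_vccs // vcis_per_vpi, vcis_per_vpi
```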





   Reporting Format:

      The results of the CER/Steady Load/Maximum VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CER for the entire test.

      The graph results SHOULD display the cell error ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CER for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.
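
      The grouping rule above (at most 10 curves per graph, hence
      roughly max-VCCs/10 graphs) can be sketched as a simple
      partition; this helper is illustrative, not mandated by the
      methodology:

```python
def graph_groups(vccs, per_graph=10):
    """Partition a VCC list into groups of at most `per_graph`, one
    group per graph, preserving order."""
    return [vccs[i:i + per_graph] for i in range(0, len(vccs), per_graph)]
```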

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.2.2.5. CER/Bursty VBR Load/One VCC

   Objective: To determine the ratio of errored cells to the total
   cells sent through the SUT on one VCC, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and
       MBS MUST be configured using one of the specified traffic
       descriptors.








   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCC.  Since this test is not a throughput test,
       the rate should not be greater than 90% of line rate.  The IP
       PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of bit errors at the receiver end of the test
       device.
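
   One simple way to realize a bursty VBR source from the PCR/SCR/MBS
   descriptor in step 2 is to send MBS cells back-to-back at PCR, then
   stay idle long enough that the long-run average rate equals SCR.
   This is an illustrative on/off model only, not the GCRA conformance
   definition:

```python
def burst_cycle(pcr, scr, mbs):
    """Return (burst_seconds, idle_seconds) for one on/off cycle whose
    average cell rate equals SCR while bursting at PCR."""
    if not 0 < scr <= pcr:
        raise ValueError("need 0 < SCR <= PCR")
    burst = mbs / pcr          # time to emit MBS cells at PCR
    cycle = mbs / scr          # cycle length so average rate == SCR
    return burst, cycle - burst
```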

   Reporting Format:

      The results of the CER/Bursty VBR Load/One VCC test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CER for the entire test.

      The graph results SHOULD display the cell error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CER.  The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.2.2.6. CER/Bursty VBR Load/Twelve VCCs

   Objective: To determine the ratio of errored cells to the total
   cells sent through the SUT on twelve VCCs, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.





   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The PCR, SCR, and MBS MUST be indicated.  The IP PDUs MUST
       be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of bit errors at the receiver end of the test
       device for all VCCs.
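
   Step 3's constraint that the offered load stay at or below 90% of
   line rate can be computed from the 53-octet cell size.  A minimal
   sketch; the OC-3c payload figure in the test comment below is an
   assumed example link speed, not something this methodology
   specifies:

```python
CELL_BITS = 53 * 8  # an ATM cell is 53 octets = 424 bits

def max_test_cell_rate(payload_bps, fraction=0.9):
    """Largest whole cell rate at or below `fraction` of the link's
    ATM payload bandwidth, in cells per second."""
    return int(fraction * payload_bps / CELL_BITS)
```

   For an assumed OC-3c payload of about 149.76 Mb/s this caps the test
   at roughly 317,886 cells/s.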

   Reporting Format:

      The results of the CER/Bursty VBR Load/Twelve VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CER for the entire test.

      The graph results SHOULD display the cell error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CER for each VCC.  There SHOULD be 12 curves on the graph,
      one curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.





3.2.2.7. CER/Bursty VBR Load/Maximum VCCs

   Objective: To determine the ratio of errored cells to the total
   cells sent through the SUT with the maximum number of VCCs that the
   SUT supports, as defined in RFC 2761, "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of bit errors at the receiver end of the test
       device for all VCCs.

   Reporting Format:

      The results of the CER/Bursty VBR Load/Maximum VCCs test SHOULD
      be reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CER.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CER for the entire test.







      The graph results SHOULD display the cell error ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CER for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.2.3. Cell Loss Ratio (CLR)

3.2.3.1. CLR/Steady Load/One VCC

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT on one VCC, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCC.  Since this test is not
       a throughput test, the rate should not be greater than 90% of
       line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of cells transmitted and received on the test
       device.
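
   The CLR then follows directly from step 5's transmit and receive
   counts.  A minimal sketch, following the RFC 2761 treatment of lost
   cells as transmitted minus received:

```python
def cell_loss_ratio(cells_tx: int, cells_rx: int) -> float:
    """Cell Loss Ratio: (transmitted - received) / transmitted."""
    if cells_tx <= 0:
        raise ValueError("cells_tx must be positive")
    if cells_rx > cells_tx:
        raise ValueError("received more cells than were transmitted")
    return (cells_tx - cells_rx) / cells_tx
```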






   Reporting Format:

      The results of the CLR/Steady Load/One VCC test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CLR.  The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.3.2. CLR/Steady Load/Twelve VCCs

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT on twelve VCCs, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.









   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cells transmitted and received per VCC on
       the test device.
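
   The per-VCC text report that the reporting format calls for can be
   generated mechanically from the recorded counts.  The field layout
   here is illustrative, not mandated by the methodology:

```python
def clr_report(duration_s, counts):
    """Build text-report rows from {(vpi, vci): (tx, rx)} counts:
    one row per VCC with duration, VPI/VCI, totals, and CLR."""
    rows = []
    for (vpi, vci), (tx, rx) in sorted(counts.items()):
        clr = (tx - rx) / tx
        rows.append(
            f"{duration_s}s  VPI/VCI {vpi}/{vci}  tx={tx}  rx={rx}"
            f"  CLR={clr:.2e}")
    return rows
```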

   Reporting Format:

      The results of the CLR/Steady Load/Twelve VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CLR for each VCC.  There SHOULD be 12 curves on the graph,
      one curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.3.3. CLR/Steady Load/Maximum VCCs

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT with the maximum number of VCCs that the SUT
   supports, as defined in RFC 2761, "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cells transmitted and received per VCC on
       the test device.

   Reporting Format:

      The results of the CLR/Steady Load/Maximum VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CLR for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.









3.2.3.4. CLR/Bursty VBR Load/One VCC

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT on one VCC, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and
       MBS MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of cells transmitted and received on the test
       device.
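
   Step 4's rate adjustment is essentially a feedback loop: keep
   lowering the offered rate until the transmit and receive packet
   counts agree.  A hypothetical sketch, where `run_trial` is a
   stand-in for driving the test device at a given rate and returning
   the two counts:

```python
def settle_rate(run_trial, rate, step=0.05):
    """Lower `rate` by `step` (a fraction of the current rate) until
    run_trial(rate) reports equal packet counts on the test device
    and the SUT; return the settled rate."""
    while rate > 0:
        tx_count, sut_count = run_trial(rate)
        if tx_count == sut_count:
            return rate
        rate *= 1.0 - step
    raise RuntimeError("counts never converged")
```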

   Reporting Format:

      The results of the CLR/Bursty VBR Load/One VCC test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CLR.  The integration time per point MUST be indicated.





      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.3.5. CLR/Bursty VBR Load/Twelve VCCs

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT on twelve VCCs, as defined in RFC 2761,
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The PCR, SCR, and MBS MUST be indicated.  The IP PDUs MUST
       be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cells transmitted and received per VCC on
       the test device.

   Reporting Format:

      The results of the CLR/Bursty VBR Load/Twelve VCCs test SHOULD be
      reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CLR for each VCC.  There SHOULD be 12 curves on the graph,
      one curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.3.6. CLR/Bursty VBR Load/Maximum VCCs

   Objective: To determine the ratio of lost cells to the total cells
   sent through the SUT with the maximum number of VCCs that the SUT
   supports, as defined in RFC 2761, "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.









   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cells transmitted and received per VCC on
       the test device.

   Reporting Format:

      The results of the CLR/Bursty VBR Load/Maximum VCCs test SHOULD
      be reported in the form of text and graph.

      The text results SHOULD display the numerical values of the CLR.
      The values given SHOULD include: time period of test in seconds,
      test VPI/VCI value, total number of cells transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the CLR for the entire test.

      The graph results SHOULD display the Cell Loss ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CLR for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.4. Cell Misinsertion Rate (CMR)

3.2.4.1. CMR/Steady Load/One VCC

   Objective: To determine the SUT cell misinsertion rate on one VCC,
   as defined in RFC 2761, "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.






   2)  Configure the SUT and test device with one VCC.  The VCC MUST be
       configured as either a CBR, VBR, or UBR connection.  The VCC
       SHOULD contain one VPI/VCI.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCC.  Since this test is not
       a throughput test, the rate should not be greater than 90% of
       line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device.
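
   Step 5 reduces to a single division once the misinserted-cell count
   is in hand.  A minimal sketch, assuming the CMR is taken as
   misinserted cells divided by the test duration (a rate over a time
   interval, in the spirit of RFC 2761):

```python
# Illustrative sketch of step 5: derive the CMR from the receiver counts.
# The function simply divides misinserted cells by the test duration.
def cell_misinsertion_rate(misinserted_cells, duration_s):
    if duration_s <= 0:
        raise ValueError("test duration must be positive")
    return misinserted_cells / duration_s

# e.g., 3 misinserted cells observed over a 600 s test
print(cell_misinsertion_rate(3, 600))  # 0.005
```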

   Reporting Format:

      The results of the CMR/Steady Load/One VCC test SHOULD be reported
      in the form of text and graphs.

      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  The x-coordinate SHOULD be the test run time in either
      seconds, minutes or days depending on the total length of the
      test.  The x-coordinate time SHOULD be configurable.  The y-
      coordinate SHOULD be the CMR.  The integration time per point MUST
      be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.4.2. CMR/Steady Load/Twelve VCCs

   Objective: To determine the SUT rate of misinserted cells on twelve
   VCCs in a transmission in relation to the total cells sent as defined
   in RFC 2761 "Terminology for ATM Benchmarking".






   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR, VBR, or
       UBR connections.  The VPI/VCIs MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device per VCC.
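
   The count-and-lower procedure in step 4 is effectively a calibration
   loop.  A minimal sketch, in which send_at_rate handling, the SUT
   model, and the 100 packets/s step size are hypothetical stand-ins
   for the real test device:

```python
# Illustrative sketch of step 4: lower the test-device rate until the
# packet counts on the test device and the SUT agree.  model_sut() is a
# hypothetical stand-in for the real SUT; it forwards everything at or
# below a fixed capacity.
def calibrate(rate_pps, count_via_sut, step_pps=100):
    """Reduce rate_pps until the SUT forwards every packet sent."""
    while rate_pps > 0:
        sent = rate_pps          # packets offered in one interval
        received = count_via_sut(rate_pps)
        if received == sent:
            return rate_pps      # counts match: continue the test
        rate_pps -= step_pps     # else lower the traffic rate and retry

def model_sut(capacity_pps):
    return lambda rate: min(rate, capacity_pps)

print(calibrate(1000, model_sut(750)))  # 700
```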

   Reporting Format:

      The results of the CMR/Steady Load/Twelve VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  The x-coordinate SHOULD be the test run time in either
      seconds, minutes or days depending on the total length of the
      test.  The x-coordinate time SHOULD be configurable.  The
      y-coordinate SHOULD be the CMR for each VCC.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each VCC.
      The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.





3.2.4.3. CMR/Steady Load/Maximum VCCs

   Objective: To determine the SUT rate of misinserted cells with the
   maximum number of VCCs supported on the SUT in a transmission in
   relation to the total cells sent as defined in RFC 2761 "Terminology
   for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).
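
   The configuration in step 2 amounts to enumerating VPI/VCI pairs
   while skipping the reserved signaling channels.  In the sketch
   below, the layout (VCIs starting at 32) is an assumption of this
   example; the text only requires that reserved channels not be used.

```python
# Illustrative sketch of step 2: enumerate VPI/VCI pairs for the
# maximum-VCC configuration, skipping reserved signaling channels such
# as [0,5] and [0,16].  Starting VCIs at 32 is an assumption.
RESERVED = {(0, 5), (0, 16)}

def vcc_plan(n_vpis, vcis_per_vpi, first_vci=32):
    pairs = []
    for vpi in range(n_vpis):
        for vci in range(first_vci, first_vci + vcis_per_vpi):
            if (vpi, vci) not in RESERVED:
                pairs.append((vpi, vci))
    return pairs

plan = vcc_plan(256, 4)      # 256 VPIs x 4 VCIs = 1024 VCCs
print(len(plan), plan[:2])   # 1024 [(0, 32), (0, 33)]
```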

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device per VCC.

   Reporting Format:

      The results of the CMR/Steady Load/Maximum VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  There will be (Max number of VCCs/10) graphs, with 10
      VCCs indicated on each graph.  The x-coordinate SHOULD be the test
      run time in either seconds, minutes or days depending on the total





      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CMR for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.
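
   The graphing rule above (no more than 10 curves per graph, hence
   roughly Max VCCs/10 graphs) can be sketched as a simple bucketing
   step:

```python
# Illustrative sketch of the graphing rule: split N VCC curves into
# ceil(N/10) graphs with no more than 10 curves each.
import math

def graph_buckets(vccs, per_graph=10):
    return [vccs[i:i + per_graph] for i in range(0, len(vccs), per_graph)]

vccs = list(range(1024))             # e.g., the maximum-VCC case
graphs = graph_buckets(vccs)
print(len(graphs), math.ceil(1024 / 10))  # 103 103
assert all(len(g) <= 10 for g in graphs)
```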

3.2.4.4. CMR/Bursty VBR Load/One VCC

   Objective: To determine the SUT rate of misinserted cells on one VCC
   in a transmission in relation to the total cells sent as defined in
   RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and
       MBS MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device.

   Reporting Format:

      The results of the CMR/Bursty VBR Load/One VCC test SHOULD be
      reported in the form of text and graphs.





      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  The x-coordinate SHOULD be the test run time in either
      seconds, minutes or days depending on the total length of the
      test.  The x-coordinate time SHOULD be configurable.  The y-
      coordinate SHOULD be the CMR.  The integration time per point MUST
      be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.4.5. CMR/Bursty VBR Load/Twelve VCCs

   Objective: To determine the SUT rate of misinserted cells on twelve
   VCCs in a transmission in relation to the total cells sent as defined
   in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.









   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device per VCC.

   Reporting Format:

      The results of the CMR/Bursty VBR Load/Twelve VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  The x-coordinate SHOULD be the test run time in either
      seconds, minutes or days depending on the total length of the
      test.  The x-coordinate time SHOULD be configurable.  The
      y-coordinate SHOULD be the CMR for each VCC.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each VCC.
      The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.4.6. CMR/Bursty VBR Load/Maximum VCCs

   Objective: To determine the SUT rate of misinserted cells with the
   maximum number of VCCs supported on the SUT in a transmission in
   relation to the total cells sent as defined in RFC 2761 "Terminology
   for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per





       VPI.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of cell misinsertion errors at the receiver end
       of the test device per VCC.

   Reporting Format:

      The results of the CMR/Bursty VBR Load/Maximum VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CMR.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, and the
      CMR for the entire test.

      The graph results SHOULD display the Cell misinsertion rate
      values.  There will be (Max number of VCCs/10) graphs, with 10
      VCCs indicated on each graph.  The x-coordinate SHOULD be the test
      run time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CMR for each VCC.  There SHOULD be
      no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.






3.2.5. CRC Error Ratio (CRC-ER)

3.2.5.1. CRC-ER/Steady Load/One VCC

   Objective: To determine the SUT ratio of CRC errors on one VCC in a
   transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCC.  Since this test is not
       a throughput test, the rate should not be greater than 90% of
       line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of CRC errored cells received on the test
       device.
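
   For plotting CRC-ER against test run time, the counts in step 5 can
   be accumulated per integration window.  A minimal sketch, assuming
   per-window counters (the window length and counter names are
   assumptions of this example, not RFC text):

```python
# Illustrative sketch of the reporting above: compute the CRC error
# ratio per integration window so it can be plotted against run time.
def crc_er_series(errored_per_window, received_per_window):
    """Return one CRC-ER point per integration window."""
    series = []
    for errored, received in zip(errored_per_window, received_per_window):
        series.append(errored / received if received else 0.0)
    return series

# three windows: 2, 0, and 1 errored cells out of 1000 received each
print(crc_er_series([2, 0, 1], [1000, 1000, 1000]))  # [0.002, 0.0, 0.001]
```

   The integration time per point, which MUST be indicated in the
   report, is simply the window length used to bin the counters.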

   Reporting Format:

      The results of the CRC-ER/Steady Load/One VCC test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER.  The integration time per point MUST be indicated.






      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.2. CRC-ER/Steady Load/Twelve VCCs

   Objective: To determine the SUT ratio of CRC errors on twelve VCCs in
   a transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR, VBR,
       or UBR connections.  The VPI/VCIs MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device.

   Reporting Format:

      The results of the CRC-ER/Steady Load/Twelve VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.







      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.3. CRC-ER/Steady Load/Maximum VCCs

   Objective: To determine the SUT ratio of CRC errors with the maximum
   number of VCCs supported on the SUT in a transmission in relation to the
   total cells sent as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR, VBR, or UBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets at a specific constant rate
       through the SUT via the defined test VCCs.  All of the VPI/VCI
       pairs will generate traffic at the same traffic rate.  Since this
       test is not a throughput test, the rate should not be greater
       than 90% of line rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device.







   Reporting Format:

      The results of the CRC-ER/Steady Load/Maximum VCCs test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CRC-ER for each VCC.  There SHOULD
      be no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.4. CRC-ER/Bursty VBR Load/One VCC

   Objective: To determine the SUT ratio of CRC errors on one VCC in a
   transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and
       MBS MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the





       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; otherwise,
       lower the test device traffic rate until the counts are the
       same.

   5)  Record the number of CRC errored cells received on the test
       device.

   Reporting Format:

      The results of the CRC-ER/Bursty VBR Load/One VCC test SHOULD be
      reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER.  The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.5. CRC-ER/Bursty VBR Load/Twelve VCCs

   Objective: To determine the SUT ratio of CRC errors on twelve VCCs in
   a transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM





       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of CRC errored cells received on the test
       device for each VCC.

   Reporting Format:

      The results of the CRC-ER/Bursty VBR Load/Twelve VCCs test SHOULD
      be reported in the form of text and graphs.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.









3.2.5.6. CRC-ER/Bursty VBR Load/Maximum VCCs

   Objective: To determine the SUT ratio of CRC errors with the maximum
   number of VCCs supported on the SUT in a transmission in relation to the
   total cells sent as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test
       device is the same as the count on the SUT, continue the test;
       otherwise, lower the test device traffic rate until the counts
       are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device for all VCCs.
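
   As an informative illustration only, the maximum-VCC provisioning in
   step 2 can be sketched as follows.  The helper name, its parameters,
   and the starting VCI of 32 are assumptions of this sketch, not
   requirements of this memo; any layout that avoids the reserved
   signaling channels is acceptable.

```python
# Sketch: enumerate VPI/VCI pairs for the maximum-VCC configuration
# (e.g., 1024 VCCs as 256 VPIs x 4 VCIs), skipping the reserved ATM
# signaling channels (e.g., [0,5], [0,16]).

RESERVED_VCIS = set(range(0, 32))  # VCIs 0-31 are reserved/signaling

def allocate_vccs(max_vccs, vpis=256, vcis_per_vpi=4, base_vci=32):
    """Return a list of (vpi, vci) pairs for the test VCCs."""
    pairs = []
    for vpi in range(vpis):
        for vci in range(base_vci, base_vci + vcis_per_vpi):
            if vci in RESERVED_VCIS:
                continue  # never use a reserved signaling channel
            pairs.append((vpi, vci))
            if len(pairs) == max_vccs:
                return pairs
    return pairs

vccs = allocate_vccs(1024)
```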

   Reporting Format:

      The results of the CRC-ER/Bursty VBR Load/Maximum VCCs test SHOULD
      be reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.





      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CRC-ER for each VCC.  There SHOULD
      be no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.7. CRC-ER/Bursty UBR Load/One VCC

   Objective: To determine the SUT ratio of lost cells on one VCC in a
   transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as a UBR
       connection.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device.
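
   Since the IP PDUs are encapsulated in AAL5 (step 3), the cell load
   offered per packet follows from the AAL5 CPCS-PDU format: the
   payload plus an 8-octet trailer, padded to a multiple of the
   48-octet cell payload.  A minimal sketch (informative only):

```python
# Sketch: number of ATM cells produced when an IP PDU is carried in an
# AAL5 CPCS-PDU.  The CPCS-PDU is the payload plus an 8-octet trailer
# (UU + CPI + Length + CRC-32), padded up to a multiple of the
# 48-octet cell payload.

import math

AAL5_TRAILER = 8   # octets: UU + CPI + Length + CRC-32
CELL_PAYLOAD = 48  # octets of payload per 53-octet ATM cell

def aal5_cell_count(pdu_octets):
    """Cells needed for one AAL5-encapsulated PDU."""
    return math.ceil((pdu_octets + AAL5_TRAILER) / CELL_PAYLOAD)

# e.g., a 64-octet packet occupies 2 cells; 1500 octets occupy 32
n64, n1500 = aal5_cell_count(64), aal5_cell_count(1500)
```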







   Reporting Format:

      The results of the CRC-ER/Bursty UBR Load/One VCC test SHOULD be
      reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER.  The integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.8. CRC-ER/Bursty UBR Load/Twelve VCCs

   Objective: To determine the SUT ratio of lost cells on twelve VCCs in
   a transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCC MUST be configured as a UBR connection.
       The VPI/VCIs MUST NOT be one of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS MUST be
       configured using one of the specified traffic descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The PCR, SCR, and MBS MUST be indicated.  The IP PDUs MUST
       be encapsulated in AAL5.






   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device for all VCCs.
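
   The per-VCC counts recorded in step 5 reduce to the reported CRC-ER
   as a simple ratio.  This sketch assumes CRC-ER is the number of CRC
   errored cells divided by the total cells received on the VCC; see
   RFC 2761 for the normative definition:

```python
# Sketch: reduce the per-VCC counters from step 5 into the CRC-ER
# reported below.  Assumes CRC-ER = CRC errored cells / total cells
# received on the VCC (see RFC 2761 for the normative definition).

def crc_error_ratio(errored_cells, total_cells):
    """CRC error ratio for one VCC over the whole test period."""
    if total_cells == 0:
        raise ValueError("no cells received on this VCC")
    return errored_cells / total_cells

# Example: 42 errored cells out of 10,000,000 received
ratio = crc_error_ratio(42, 10_000_000)
```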

   Reporting Format:

      The results of the CRC-ER/Bursty UBR Load/Twelve VCCs test SHOULD
      be reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.9. CRC-ER/Bursty UBR Load/Maximum VCCs

   Objective: To determine the SUT ratio of lost cells with the maximum
   number of VCCs supported on the SUT in a transmission in relation to the
   total cells sent as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per





       VPI.  The VCC MUST be configured as a UBR connection.  The
       VPI/VCIs MUST NOT be one of the reserved ATM signaling channels
       (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS MUST be configured
       using one of the specified traffic descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device for all VCCs.

   Reporting Format:

      The results of the CRC-ER/Bursty UBR Load/Maximum VCCs test SHOULD
      be reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CRC-ER for each VCC.  There SHOULD
      be no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.







3.2.5.10. CRC-ER/Bursty Mixed Load/Three VCC

   Objective: To determine the SUT ratio of lost cells on three VCCs in
   relation to the total cells sent as defined in RFC 2761 "Terminology
   for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with three VCCs.  Each VCC
       MUST be defined as a different Bearer class; one CBR, one UBR and
       one VBR.  Each VCC SHOULD contain one VPI/VCI.  The VPI/VCI MUST
       NOT be one of the reserved ATM signaling channels (e.g., [0,5],
       [0,16]).  The PCR, SCR, and MBS MUST be configured using one of
       the specified traffic descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device.

   Reporting Format:

      The results of the CRC-ER/Bursty Mixed Load/Three VCC test SHOULD
      be reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The





      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER for each VCC.  There SHOULD be 3 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.11. CRC-ER/Bursty Mixed Load/Twelve VCCs

   Objective: To determine the SUT ratio of lost cells on twelve VCCs in
   a transmission in relation to the total cells sent as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs.  Each VCC
       MUST be defined as one of the Bearer classes for a total of four
       CBR, four UBR and four VBR VCCs.  Each VCC SHOULD contain one
       VPI/VCI.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device for all VCCs.

   Reporting Format:

      The results of the CRC-ER/Bursty Mixed Load/Twelve VCCs test
      SHOULD be reported in a form of text and graph.





      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.  The
      x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.5.12. CRC-ER/Bursty Mixed Load/Maximum VCCs

   Objective: To determine the SUT ratio of lost cells with the maximum
   number of VCCs supported on the SUT in a transmission in relation to the
   total cells sent as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  Each VCC MUST be defined as one of the Bearer classes for a
       total of (max VCC/3) CBR, (max VCC/3) UBR and (max VCC/3) VBR
       VCCs.  The VPI/VCI MUST NOT be one of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.






   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of CRC errored cells received per VCC on the
       test device for all VCCs.
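
   As an informative sketch of step 2's class assignment, a round-robin
   split yields the (max VCC/3) CBR, UBR and VBR connections.  The
   assignment order is an assumption of this sketch; when the maximum
   VCC count is not divisible by three, the remainder here falls to the
   earlier classes:

```python
# Sketch: assign bearer classes for the mixed-load maximum-VCC test,
# roughly one third each CBR, UBR and VBR as in step 2.  The class
# names and round-robin order are illustrative, not mandated.

def assign_bearer_classes(vcc_count):
    """Map VCC index -> bearer class, cycling CBR, UBR, VBR."""
    classes = ("CBR", "UBR", "VBR")
    return {i: classes[i % 3] for i in range(vcc_count)}

plan = assign_bearer_classes(1024)
```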

   Reporting Format:

      The results of the CRC-ER/Bursty Mixed Load/Maximum VCCs test
      SHOULD be reported in a form of text and graph.

      The text results SHOULD display the numerical values of the CRC-
      ER.  The values given SHOULD include: time period of test in s,
      test VPI/VCI value, total number of cells transmitted and received
      on the given VPI/VCI during the test in positive integers, and the
      CRC-ER for the entire test.

      The graph results SHOULD display the CRC Error ratio values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the CRC-ER for each VCC.  There SHOULD
      be no more than 10 curves on each graph, one curve indicated and
      labeled for each VCC.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.6. Cell Transfer Delay (CTD)

3.2.6.1. Test Setup

   The cell transfer delay measurements assume that both the transmitter
   and receiver timestamp information is synchronized.  Synchronization
   SHOULD be achieved by supplying a common clock signal (minimum of 100
   MHz or 10 ns resolution) to both the transmitter and receiver.  The
   maximum timestamp values MUST be recorded to ensure synchronization
   in the case of counter rollover.  The cell transfer delay
   measurements SHOULD utilize the O.191 cell (ITU-T O.191) encapsulated
   in a valid IP packet.  If the O.191 cell is not available, a test
   cell encapsulated in a valid IP packet MAY be used.  The test cell





   MUST contain a transmit timestamp which can be correlated with a
   receive timestamp.  A description of the test cell MUST be included
   in the test results.  The description MUST include the timestamp
   length (in bits), counter rollover value, and the timestamp accuracy
   (in ns).
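
   The rollover handling described above can be sketched as follows.
   The 32-bit rollover value and the helper name are assumptions of
   this sketch; the actual rollover value is taken from the recorded
   maximum timestamp:

```python
# Sketch: compute a cell transfer delay from transmit/receive
# timestamps taken from a free-running counter that rolls over.
# ROLLOVER is the recorded maximum timestamp value plus one (a 32-bit
# counter is assumed here); TICK_NS is the counter resolution
# (10 ns for a 100 MHz clock).

TICK_NS = 10          # 100 MHz clock -> 10 ns per tick
ROLLOVER = 2 ** 32    # example counter rollover value (assumption)

def transfer_delay_ns(tx_stamp, rx_stamp, rollover=ROLLOVER):
    """Delay in ns, correcting for at most one counter rollover."""
    return ((rx_stamp - tx_stamp) % rollover) * TICK_NS

# A receive stamp numerically smaller than the transmit stamp still
# yields a positive delay:
d = transfer_delay_ns(ROLLOVER - 5, 7)
```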

3.2.6.2. CTD/Steady Load/One VCC

   Objective: To determine the SUT variation in cell transfer delay with
   one VCC as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCC.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device.

   Reporting Format:

      The results of the CTD/Steady Load/One VCC test SHOULD be reported
      in a form of text, graph, and histogram.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, minimum,
      maximum, and mean CTD during the test in us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,





      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay in us.  The integration time per point
      MUST be indicated.

      The histogram results SHOULD display the cell transfer delay.  The
      x-coordinate SHOULD be the cell transfer delay in us with at least
      256 bins.  The y-coordinate SHOULD be the number of cells observed
      in each bin.
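
   As an informative sketch, the histogram above can be built by
   binning the recorded delays into at least 256 bins.  The
   equal-width bin layout derived from the observed minimum and
   maximum delay is an assumption of this sketch, not a requirement:

```python
# Sketch: bin recorded cell transfer delays (in us) into >= 256
# histogram bins.  Bin width is derived from the observed min/max
# delay; out-of-range rounding clamps into the last bin.

def ctd_histogram(delays_us, bins=256):
    """Return (lowest delay, bin width, per-bin cell counts)."""
    lo, hi = min(delays_us), max(delays_us)
    width = (hi - lo) / bins or 1.0   # guard against all-equal delays
    counts = [0] * bins
    for d in delays_us:
        idx = min(int((d - lo) / width), bins - 1)
        counts[idx] += 1
    return lo, width, counts

lo, width, counts = ctd_histogram([10.0, 10.5, 11.0, 250.0])
```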

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.6.3. CTD/Steady Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either a CBR, VBR,
       or UBR connection.  The VPI/VCIs MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCCs.
       All of the VPI/VCI pairs will generate traffic at the same
       traffic rate.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.








   Reporting Format:

      The results of the CTD/Steady Load/Twelve VCCs test SHOULD be
      reported in a form of text, graph, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay for each VCC in us.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each
      VCC.  The integration time per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.6.4. CTD/Steady Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC 2761
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either a CBR, VBR, or UBR
       connection.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).






   3)  Send a specific number of IP packets containing timestamps at a
       specific constant rate through the SUT via the defined test VCCs.
       All of the VPI/VCI pairs will generate traffic at the same
       traffic rate.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; else lower the test
       device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the CTD/Steady Load/Maximum VCCs test SHOULD be
      reported in a form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the cell transfer delay for each VCC in
      us.  There SHOULD be no more than 10 curves on each graph, one
      curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.







3.2.6.5. CTD/Bursty VBR Load/One VCC

   Objective: To determine the SUT variation in cell transfer delay with
   one VCC as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCC.  Since
       this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; else lower the test device
       traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device.

   Reporting Format:

      The results of the CTD/Bursty VBR Load/One VCC test SHOULD be
      reported in a form of text, graph, and histogram.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, minimum,
      maximum, and mean CTD during the test in us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay in us.  The integration time per point
      MUST be indicated.





Dunn & Martin                Informational                     [Page 81]




      The histogram results SHOULD display the cell transfer delay.  The
      x-coordinate SHOULD be the cell transfer delay in us with at least
      256 bins.  The y-coordinate SHOULD be the number of cells observed
      in each bin.
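   The histogram above can be accumulated directly from the per-cell
   delays.  The sketch below bins a delay list into 256 equal-width
   bins, which is one plausible reading of the requirement; the text
   does not mandate a particular binning rule.

```python
# 256-bin CTD histogram: x-axis is CTD in us, y-axis the number of
# cells observed per bin, as the reporting format requires.
def ctd_histogram(delays_us, bins=256):
    lo, hi = min(delays_us), max(delays_us)
    width = (hi - lo) / bins or 1.0        # guard a zero-width range
    counts = [0] * bins
    for d in delays_us:
        # Clamp the maximum delay into the last bin.
        counts[min(int((d - lo) / width), bins - 1)] += 1
    return lo, width, counts

lo, width, counts = ctd_histogram([250.0, 300.0, 290.0, 251.0])
```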

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.6.6. CTD/Bursty VBR Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.
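   Step 4's rate adjustment amounts to a simple control loop.  The
   sketch below backs the offered rate off in 5% steps until the
   transmit and receive counts agree; send_and_count is a hypothetical
   callback standing in for one real trial at the given rate.

```python
# Step 4 as a control loop: lower the offered rate until the test
# device and SUT counts agree.  send_and_count is a hypothetical
# callback that runs one trial and returns (tx_count, rx_count).
def settle_rate(rate_pps, send_and_count, step=0.05):
    while rate_pps > 0:
        tx, rx = send_and_count(rate_pps)
        if tx == rx:
            return rate_pps
        rate_pps = int(rate_pps * (1 - step))   # back off by 5%
    return 0

# A simulated SUT that silently drops everything above 900 packets/s:
rate = settle_rate(1000, lambda r: (r, min(r, 900)))
```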

   Reporting Format:

      The results of the CTD/Bursty VBR Load/Twelve VCCs test SHOULD be
      reported in the form of text, graph, and histograms.







      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay for each VCC in us.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each
      VCC.  The integration time per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.6.7. CTD/Bursty VBR Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as either CBR or VBR
       connections.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific VBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be





       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.
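   The step-2 example fan-out (1024 VCCs laid out as 256 VPIs with 4
   VCIs per VPI) can be enumerated as follows.  Starting the VCIs at 32
   is an assumption that keeps every channel clear of the conventional
   reserved range; the text itself only forbids the reserved signaling
   channels.

```python
# Enumerate the step-2 example: 256 VPIs x 4 VCIs = 1024 VCCs.
# first_vci=32 is an assumed offset that keeps every VCI out of the
# conventional reserved range (which includes [0,5] and [0,16]).
def vcc_layout(n_vpis=256, vcis_per_vpi=4, first_vci=32):
    return [(vpi, first_vci + i)
            for vpi in range(n_vpis)
            for i in range(vcis_per_vpi)]

vccs = vcc_layout()
```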

   Reporting Format:

      The results of the CTD/Bursty VBR Load/Maximum VCCs test SHOULD be
      reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the cell transfer delay for each VCC in
      us.  There SHOULD be no more than 10 curves on each graph, one
      curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.
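   The "(Max number of VCCs/10) graphs" requirement above is a simple
   chunking of the VCC list into groups of at most 10 curves; a sketch:

```python
# Split the VCC list into per-graph groups of at most 10 curves, as
# the reporting format requires.
def graph_groups(vccs, per_graph=10):
    return [vccs[i:i + per_graph] for i in range(0, len(vccs), per_graph)]

groups = graph_groups(list(range(1024)))   # 103 graphs for 1024 VCCs
```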

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.










3.2.6.8. CTD/Bursty UBR Load/One VCC

   Objective: To determine the SUT variation in cell transfer delay with
   one VCC as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as a UBR
       connection.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific UBR through the SUT via the defined test VCC.  Since
       this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; else lower the
       test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device.
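   The 90%-of-line-rate ceiling in step 3 depends on the interface in
   use.  As a worked example, assuming an OC-3c interface where 149.760
   Mbit/s of SONET payload is available to ATM and each 53-octet cell
   is 424 bits:

```python
# 90% cell-rate ceiling for step 3, assuming an OC-3c interface:
# 149.760 Mb/s of SPE payload available to ATM, 424 bits per cell.
LINE_RATE_BPS = 149_760_000
CELL_BITS = 53 * 8                            # 424 bits per ATM cell

max_cell_rate = LINE_RATE_BPS // CELL_BITS    # full line rate, cells/s
test_cell_rate = int(max_cell_rate * 0.90)    # ceiling used by the test
```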

   Reporting Format:

      The results of the CTD/Bursty UBR Load/One VCC test SHOULD be
      reported in the form of text, graph, and histogram.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI value, total number of cells transmitted and received on
      the given VPI/VCI during the test in positive integers, minimum,
      maximum, and mean CTD during the test in us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay in us.  The integration time per point
      MUST be indicated.







      The histogram results SHOULD display the cell transfer delay.  The
      x-coordinate SHOULD be the cell transfer delay in us with at least
      256 bins.  The y-coordinate SHOULD be the number of cells observed
      in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.6.9. CTD/Bursty UBR Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCCs MUST be configured as UBR connections.
       The VPI/VCIs MUST NOT be one of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific UBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the CTD/Bursty UBR Load/Twelve VCCs test SHOULD be
      reported in the form of text, graph, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test





      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay for each VCC in us.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each
      VCC.  The integration time per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.6.10. CTD/Bursty UBR Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCCs MUST be configured as UBR connections.  The
       VPI/VCIs MUST NOT be one of the reserved ATM signaling channels
       (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps at a
       specific UBR through the SUT via the defined test VCCs.  All of
       the VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.






   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the CTD/Bursty UBR Load/Maximum VCCs test SHOULD be
      reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the cell transfer delay for each VCC in
      us.  There SHOULD be no more than 10 curves on each graph, one
      curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      bearer class of the created VCC MUST also be indicated.

3.2.6.11. CTD/Mixed Load/Three VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   three VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".








   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with three VCCs.  Each VCC
       MUST be defined as a different Bearer class: one CBR, one UBR and
       one VBR.  Each VCC SHOULD contain one VPI/VCI.  The VPI/VCI MUST
       NOT be one of the reserved ATM signaling channels (e.g., [0,5],
       [0,16]).

   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; else lower the
       test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the CTD/Mixed Load/Three VCCs test SHOULD be
      reported in the form of text, graph, and histogram.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, and the minimum,
      maximum, and mean CTD on each VCC during the test in us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay in us.  The integration time per point
      MUST be indicated.









      The histogram results SHOULD display the cell transfer delay.  The
      x-coordinate SHOULD be the cell transfer delay in us with at least
      256 bins.  The y-coordinate SHOULD be the number of cells observed
      in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.6.12. CTD/Mixed Load/Twelve VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   twelve VCCs as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs.  Each VCC
       MUST be defined as one of the Bearer classes, for a total of four
       CBR, four UBR, and four VBR VCCs.  Each VCC SHOULD contain one
       VPI/VCI.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.

   Reporting Format:

      The results of the CTD/Mixed Load/Twelve VCCs test SHOULD be
      reported in the form of text, graph, and histograms.





      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the cell transfer delay for each VCC in us.  There SHOULD be 12
      curves on the graph, one curve indicated and labeled for each
      VCC.  The integration time per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.

3.2.6.13. CTD/Mixed Load/Maximum VCCs

   Objective: To determine the SUT variation in cell transfer delay with
   the maximum number of VCCs supported on the SUT as defined in RFC
   2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  Each VCC MUST be defined as one of the Bearer classes, for
       a total of (max VCC/3) CBR, (max VCC/3) UBR, and (max VCC/3) VBR
       VCCs.  If the maximum number of VCCs is not divisible by 3, the
       totals for each bearer class MUST be within 3 VCCs of each other.
       The VPI/VCIs MUST NOT be one of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).







   3)  Send a specific number of IP packets containing timestamps
       through the SUT via the defined test VCCs.  Each generated VCC
       stream MUST match the corresponding VCC Bearer class.  All of the
       VPI/VCI pairs will generate traffic at the same traffic rate.
       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as the count on the SUT, continue the test; else
       lower the test device traffic rate until the counts are the same.

   5)  Record the packet timestamps at the transmitter and receiver
       ends of the test device for all VCCs.
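   One way to satisfy the step-2 split across the three bearer classes
   is to spread the remainder one class at a time, which keeps the
   totals within 1 of each other (well inside the "within 3 VCCs"
   requirement).  This is a sketch; any split meeting the requirement
   is acceptable.

```python
# Split max_vccs across CBR, UBR, and VBR as evenly as possible.
# Spreading the remainder keeps the class totals within 1 of each
# other, comfortably inside the "within 3 VCCs" requirement.
def bearer_split(max_vccs):
    base, rem = divmod(max_vccs, 3)
    return {cls: base + (1 if i < rem else 0)
            for i, cls in enumerate(("CBR", "UBR", "VBR"))}

split = bearer_split(1024)      # e.g. the 1024-VCC example in step 2
```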

   Reporting Format:

      The results of the CTD/Mixed Load/Maximum VCCs test SHOULD be
      reported in the form of text, graphs, and histograms.

      The text results SHOULD display the numerical values of the CTD.
      The values given SHOULD include: time period of test in s, test
      VPI/VCI values, total number of cells transmitted and received on
      each VCC during the test in positive integers, maximum and minimum
      CTD on each VCC during the test in us, and mean CTD on each VCC in
      us.

      The graph results SHOULD display the cell transfer delay values.
      There will be (Max number of VCCs/10) graphs, with 10 VCCs
      indicated on each graph.  The x-coordinate SHOULD be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the cell transfer delay for each VCC in
      us.  There SHOULD be no more than 10 curves on each graph, one
      curve indicated and labeled for each VCC.  The integration time
      per point MUST be indicated.

      The histograms SHOULD display the cell transfer delay.  There will
      be one histogram for each VCC.  The x-coordinate SHOULD be the
      cell transfer delay in us with at least 256 bins.  The y-
      coordinate SHOULD be the number of cells observed in each bin.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST also be indicated.





3.3. ATM Adaptation Layer (AAL) Type 5 (AAL5)

3.3.1. IP Packet Loss due to AAL5 Re-assembly Errors

   Objective: To determine if the SUT will drop IP packets due to AAL5
   Re-assembly Errors as defined in RFC 2761 "Terminology for ATM
   Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of cells at a specific rate through the
       SUT.  Since this test is not a throughput test, the rate should
       not be greater than 90% of line rate.  The cell payload SHOULD
       contain valid IP PDUs.  The IP PDUs MUST be encapsulated in AAL5.

   3)  Count the cells that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; else lower the
       test device traffic rate until the counts are the same.

   4)  Inject one error in the first bit of the AAL5 payload.  Verify
       that the SUT does not drop any AAL5 PDU's.

   5)  Discontinue the AAL5 payload error.

   6)  Inject one error in the first bit of the AAL5 header for 4
       consecutive IP PDUs in every 6 IP PDUs.  Verify that the SUT does
       drop the AAL5 PDU's.

   7)  Discontinue the AAL5 header error.
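   The step-6 injection schedule (4 consecutive errored PDUs in every
   6) can be expressed as a simple index test; a sketch, assuming
   0-based PDU numbering:

```python
# Step-6 schedule: corrupt the first bit of the AAL5 header for 4
# consecutive PDUs out of every 6 (0-based PDU numbering assumed).
def inject_error(pdu_index: int) -> bool:
    return pdu_index % 6 < 4

schedule = [inject_error(i) for i in range(6)]
```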

   Reporting Format:

      The results of the AAL5 PDU Loss due to AAL5 PDU errors test
      SHOULD be reported in the form of a table.  The rows SHOULD be
      labeled single error, one error per second, and four consecutive
      errors every 6 IP PDUs.  The columns SHOULD be labeled AAL5 PDU
      loss and number of PDU's lost.  The elements of column 1 SHOULD be
      either True or False, indicating whether the particular condition
      was observed for each test.  The elements of column 2 SHOULD be
      non-negative integers.

      The table MUST also indicate the traffic rate in IP PDUs per
      second as generated by the test device.






3.3.2. AAL5 Reassembly Time.

   Objective: To determine the SUT AAL5 Reassembly Time as defined in
   RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       configuration.

   2)  Send a specific number of IP packets at a specific rate through
       the SUT.  Since this test is not a throughput test, the rate
       should not be greater than 90% of line rate.  The IP PDUs MUST be
       encapsulated in AAL5.  The AAL5 PDU size is 65535 octets, or 1366
       ATM cells once the 8-octet AAL5 trailer and padding are included.

   3)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as the count on the SUT, continue the test; else lower the
       test device traffic rate until the counts are the same.

   4)  Given an AAL5 reassembly timer of 'x' seconds, where 'x' is the
       actual value of the AAL5 reassembly timer on the SUT, send
       traffic at 1366 cells per 'x' seconds.  The expected result is
       that no AAL5 PDU's will be dropped.

   5)  Send traffic at 1360 cells per 'x' seconds.  The expected result
       is that all AAL5 PDU's will be dropped.
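   The cell count used above follows from the AAL5 framing: the CPCS-
   PDU carries the payload plus an 8-octet trailer, padded up to a
   whole number of 48-octet cell payloads.  A quick check of the
   arithmetic:

```python
# Cells needed to carry an AAL5 CPCS-PDU: payload + 8-octet trailer,
# rounded up to a whole number of 48-octet cell payloads.
def aal5_cells(payload_octets: int) -> int:
    return -(-(payload_octets + 8) // 48)     # ceiling division

cells = aal5_cells(65535)    # maximum-size PDU
```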

   Reporting Format:

      The results of the IP packet loss due to AAL5 reassembly timeout
      test SHOULD be reported in the form of a table.  The rows SHOULD
      be labeled 1366 cells per 'x' seconds and 1360 cells per 'x'
      seconds.  The columns SHOULD be labeled packet loss and number of
      packets lost.  The elements of column 1 SHOULD be either True or
      False, indicating whether the particular condition was observed
      for each test.  The elements of column 2 SHOULD be non-negative
      integers.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device,
      including the value of the AAL5 reassembly timer 'x'.











3.3.3. AAL5 CRC Error Ratio.

3.3.3.1. Test Setup

   The AAL5 CRC error ratio measurements assume that the transmitter
   and the receiver are synchronized on the payload information.
   Synchronization MUST be achieved by supplying a known bit pattern to
   both the transmitter and receiver.  If this bit pattern is longer
   than the packet size, the receiver MUST synchronize with the
   transmitter before tests can be run.

3.3.3.2. AAL5-CRC-ER/Steady Load/One VCC

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on one
   VCC in a transmission in relation to the total AAL5 PDU's sent as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR,
       VBR, or UBR connection.  The VPI/VCI MUST NOT be one of the
       reserved ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCC.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device.
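
   The ratio reported for this test is the AAL5 CRC error count from
   step 5 divided by the total number of AAL5 PDUs sent in step 3.  A
   minimal sketch of that computation follows; the function and
   variable names are illustrative, not part of this methodology:

   ```python
   def aal5_crc_error_ratio(errored_pdus: int, total_pdus: int) -> float:
       """AAL5-CRC-ER: AAL5 PDUs received with CRC errors divided by
       the total AAL5 PDUs sent (see RFC 2761 for the formal
       definition of the ratio)."""
       if total_pdus <= 0:
           raise ValueError("total_pdus must be a positive integer")
       if not 0 <= errored_pdus <= total_pdus:
           raise ValueError("errored_pdus must lie between 0 and total_pdus")
       return errored_pdus / total_pdus

   # Example: 3 CRC-errored PDUs observed out of 1,000,000 PDUs sent.
   print(aal5_crc_error_ratio(3, 1_000_000))  # 3e-06
   ```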

   Reporting Format:

      The results of the AAL5-CRC-ER/Steady Load/One VCC test SHOULD be
      reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.3. AAL5-CRC-ER/Steady Load/Twelve VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on
   twelve VCC's in a transmission in relation to the total AAL5 PDU's
   sent as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCC's MUST be configured as either a CBR, VBR,
       or UBR connection.  The VPI/VCIs MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.

       Since this test is not a throughput test, the rate should not be
       greater than 90% of line rate.  The IP PDUs MUST be encapsulated
       in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Steady Load/Twelve VCCs test SHOULD
      be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.4. AAL5-CRC-ER/Steady Load/Maximum VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors with the
   maximum number of VCCs supported on the SUT in a transmission in
   relation to the total AAL5 PDU's sent as defined in RFC 2761
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCC's MUST be configured as either a CBR, VBR, or UBR
        connection.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a constant rate through the SUT via the
       defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.
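
   The VPI/VCI assignment in step 2 (e.g., 256 VPIs with 4 VCIs per VPI
   for 1024 VCCs) can be sketched as below.  The reserved set shown
   contains only the two example channels named above, and starting
   VCIs at 32 to stay clear of the reserved signaling range is an
   assumption of this sketch, not a requirement of this memo:

   ```python
   RESERVED = {(0, 5), (0, 16)}  # example reserved signaling channels

   def assign_vccs(max_vccs, vcis_per_vpi=4, first_vci=32):
       """Enumerate (VPI, VCI) pairs for max_vccs connections, skipping
       any pair in the reserved set; e.g. 1024 VCCs -> 256 VPIs with
       4 VCIs per VPI."""
       pairs = []
       vpi = 0
       while len(pairs) < max_vccs:
           for vci in range(first_vci, first_vci + vcis_per_vpi):
               if (vpi, vci) not in RESERVED and len(pairs) < max_vccs:
                   pairs.append((vpi, vci))
           vpi += 1
       return pairs

   vccs = assign_vccs(1024)
   print(len(vccs), vccs[0], vccs[-1])  # 1024 (0, 32) (255, 35)
   ```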

   Reporting Format:

      The results of the AAL5-CRC-ER/Steady Load/Maximum VCCs test
      SHOULD be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      There will be (maximum number of VCCs/10) graphs, rounded up, with
      at most 10 VCCs indicated on each graph.  The x-coordinate SHOULD
      be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the AAL5-CRC-ER for each VCC.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.
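
      The grouping of curves into graphs described above amounts to
      chunking the ordered VCC list into groups of at most 10, giving
      ceil(total VCCs / 10) graphs.  A sketch, with illustrative names:

      ```python
      def group_for_graphs(vcc_ids, per_graph=10):
          """Split the VCC list into graphs of at most per_graph curves
          each; the number of graphs is ceil(len(vcc_ids) / per_graph)."""
          return [vcc_ids[i:i + per_graph]
                  for i in range(0, len(vcc_ids), per_graph)]

      graphs = group_for_graphs(list(range(1024)))
      print(len(graphs), len(graphs[-1]))  # 103 4
      ```

      For 1024 VCCs this yields 103 graphs, the last holding the
      remaining 4 curves.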

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.5. AAL5-CRC-ER/Bursty VBR Load/One VCC

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on one
   VCC in a transmission in relation to the total AAL5 PDU's sent as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with one VCC.  The VCC SHOULD
       contain one VPI/VCI.  The VCC MUST be configured as either a CBR
       or VBR connection.  The VPI/VCI MUST NOT be one of the reserved
       ATM signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and
       MBS MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCC.  Since this test is not a throughput test,
       the rate should not be greater than 90% of line rate.  The IP
       PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device.
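
   The PCR, SCR, and MBS traffic descriptor from step 2 can be modeled
   as a small structure.  The numeric values below are placeholders for
   illustration only (the PCR shown approximates an OC-3 line's cell
   rate); this methodology does not specify descriptor values:

   ```python
   from dataclasses import dataclass

   @dataclass(frozen=True)
   class VbrDescriptor:
       pcr: int  # Peak Cell Rate, cells per second
       scr: int  # Sustainable Cell Rate, cells per second
       mbs: int  # Maximum Burst Size, cells

       def __post_init__(self):
           # A VBR descriptor only makes sense when SCR <= PCR.
           if self.scr > self.pcr:
               raise ValueError("SCR must not exceed PCR")

   # Placeholder values: burst at PCR for up to MBS cells, sustain at SCR.
   desc = VbrDescriptor(pcr=353_207, scr=100_000, mbs=200)
   print(desc.scr <= desc.pcr)  # True
   ```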

   Reporting Format:

      The results of the AAL5-CRC-ER/Bursty VBR Load/One VCC test SHOULD
      be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER.  The integration time per point MUST be
      indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.6. AAL5-CRC-ER/Bursty VBR Load/Twelve VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on
   twelve VCC's in a transmission in relation to the total AAL5 PDU's
   sent as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCCs, using 1 VPI
       and 12 VCIs.  The VCC's MUST be configured as either a CBR or VBR
       connection.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
       MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The PCR, SCR, and MBS MUST be indicated.  The IP PDUs MUST
       be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Bursty VBR Load/Twelve VCCs test
      SHOULD be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.7. AAL5-CRC-ER/Bursty VBR Load/Maximum VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors with the
   maximum number of VCCs supported on the SUT in a transmission in
   relation to the total AAL5 PDU's sent as defined in RFC 2761
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  The VCC's MUST be configured as either a CBR or VBR
        connection.  The VPI/VCIs MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).  The PCR, SCR, and MBS
        MUST be configured using one of the specified traffic
       descriptors.

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns at a specific VBR rate through the SUT via
       the defined test VCCs.  All of the VPI/VCI pairs will generate
       traffic at the same traffic rate.  Since this test is not a
       throughput test, the rate should not be greater than 90% of line
       rate.  The IP PDUs MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Bursty VBR Load/Maximum VCCs test
      SHOULD be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      There will be (maximum number of VCCs/10) graphs, rounded up, with
      at most 10 VCCs indicated on each graph.  The x-coordinate SHOULD
      be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the AAL5-CRC-ER for each VCC.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.8. AAL5-CRC-ER/Mixed Load/Three VCC's

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on three
   VCC's in a transmission in relation to the total AAL5 PDU's sent as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with three VCC's.  Each VCC
       MUST be defined as a different Bearer class; one CBR, one UBR and
       one VBR.  Each VCC SHOULD contain one VPI/VCI.  The VPI/VCI MUST
       NOT be one of the reserved ATM signaling channels (e.g., [0,5],
       [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT to verify
       connectivity and load.  If the count on the test device is the
       same as on the SUT, continue the test; otherwise lower the test
       device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Mixed Load/Three VCC's test SHOULD
      be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER for each VCC.  There SHOULD be three curves on
      the graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.9. AAL5-CRC-ER/Mixed Load/Twelve VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors on
   twelve VCC's in a transmission in relation to the total AAL5 PDU's
   sent as defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with twelve VCC's.  Each VCC
       MUST be defined as one of the Bearer classes for a total of four
       CBR, four UBR and four VBR VCC's.  Each VCC SHOULD contain one
       VPI/VCI.  The VPI/VCI MUST NOT be one of the reserved ATM
       signaling channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Mixed Load/Twelve VCCs test SHOULD
      be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      The x-coordinate SHOULD be the test run time in either seconds,
      minutes or days depending on the total length of the test.  The
      x-coordinate time SHOULD be configurable.  The y-coordinate SHOULD
      be the AAL5-CRC-ER for each VCC.  There SHOULD be 12 curves on the
      graph, one curve indicated and labeled for each VCC.  The
      integration time per point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.3.3.10. AAL5-CRC-ER/Mixed Load/Maximum VCCs

   Objective: To determine the SUT ratio of AAL5 CRC PDU errors with the
   maximum number of VCCs supported on the SUT in a transmission in
   relation to the total AAL5 PDU's sent as defined in RFC 2761
   "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Configure the SUT and test device with the maximum number of VCCs
       supported on the SUT.  For example, if the maximum number of VCCs
       supported on the SUT is 1024, define 256 VPIs with 4 VCIs per
       VPI.  Each VCC MUST be defined as one of the Bearer classes for a
       total of (max VCC/3) CBR, (max VCC/3) UBR and (max VCC/3) VBR
       VCC's.  The VPI/VCI MUST NOT be one of the reserved ATM signaling
       channels (e.g., [0,5], [0,16]).

   3)  Send a specific number of IP packets containing one of the
       specified bit patterns through the SUT via the defined test VCCs.
       Each generated VCC stream MUST match the corresponding VCC Bearer
       class.  All of the VPI/VCI pairs will generate traffic at the
       same traffic rate.  Since this test is not a throughput test, the
       rate should not be greater than 90% of line rate.  The IP PDUs
       MUST be encapsulated in AAL5.

   4)  Count the IP packets that are transmitted by the SUT on all VCCs
       to verify connectivity and load.  If the count on the test device
       is the same as on the SUT, continue the test; otherwise lower the
       test device traffic rate until the counts are the same.

   5)  Record the number of AAL5 CRC errors at the receiver end of the
       test device for all VCCs.

   Reporting Format:

      The results of the AAL5-CRC-ER/Mixed Load/Maximum VCCs test SHOULD
      be reported in the form of text and a graph.

      The text results SHOULD display the numerical values of the AAL5-
      CRC-ER.  The values given SHOULD include: time period of test in
      s, test VPI/VCI value, total number of AAL5 PDU's transmitted and
      received on the given VPI/VCI during the test in positive
      integers, and the AAL5-CRC-ER for the entire test.

      The graph results SHOULD display the AAL5 CRC error ratio values.
      There will be (maximum number of VCCs/10) graphs, rounded up, with
      at most 10 VCCs indicated on each graph.  The x-coordinate SHOULD
      be the test run
      time in either seconds, minutes or days depending on the total
      length of the test.  The x-coordinate time SHOULD be configurable.
      The y-coordinate SHOULD be the AAL5-CRC-ER for each VCC.  There
      SHOULD be no more than 10 curves on each graph, one curve
      indicated and labeled for each VCC.  The integration time per
      point MUST be indicated.

      The results MUST also indicate the packet size in octets, traffic
      rate in packets per second, and bearer class as generated by the
      test device.  The VCC and VPI/VCI values MUST be indicated.  The
      PCR, SCR, and MBS MUST be indicated.  The bearer class of the
      created VCC MUST be indicated.  The generated bit pattern MUST
      also be indicated.

3.4. ATM Service: Signaling

3.4.1. CAC Denial Time and Connection Establishment Time

   Objective: To determine the CAC Denial Time and Connection
   Establishment Time on the SUT as defined in RFC 2761 "Terminology for
   ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Create a UNI signaling setup message, as described in Appendix C,
       specifying a PCR which will not allow CAC to reject the call.

   3)  Send the UNI signaling setup message.  Note the time the setup
       message was sent.  Verify that the SVC has been setup with the
       correct parameters.  Note the time the connect message was
       received.

   4)  Create a UNI signaling setup message, as described in Appendix C,
       specifying a PCR which will allow CAC to reject the call.

   5)  Send the UNI signaling setup message.  Note the time the setup
       message was sent.  Verify that the SVC has been rejected with the
       correct cause code.  Note the time the release complete message
       was received.

   6)  Compute the rejection time as the difference between the time the
       release complete message was received and the time the setup
       message was sent.
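
   Both measurements reduce to a timestamp difference around one
   signaling exchange.  A sketch with the test device's signaling
   interface stubbed out; the function arguments are stand-ins, as this
   memo does not define such an API:

   ```python
   import time

   def timed_exchange(send_message, wait_for_response):
       """Return (response, elapsed_seconds) for one signaling exchange.
       Connection Establishment Time: SETUP sent -> CONNECT received.
       CAC Denial Time: SETUP sent -> RELEASE COMPLETE received."""
       t_sent = time.monotonic()
       send_message()
       response = wait_for_response()
       return response, time.monotonic() - t_sent

   # Stubbed example: a "switch" that answers immediately.
   response, elapsed = timed_exchange(lambda: None, lambda: "CONNECT")
   print(response, elapsed >= 0.0)  # CONNECT True
   ```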

   Reporting Format:

      The results of the CAC Denial Time and Connection Establishment
      Time tests SHOULD be reported in a form of a table.  The rows
      SHOULD be labeled call accepted and call rejected.  The columns
      SHOULD be labeled time setup sent, time response received, and
      correct response.  The elements of columns 1 and 2 SHOULD be in
      seconds.  The elements of column 3 SHOULD be either True or
      False, indicating whether the particular condition was observed
      for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.4.2. Connection Teardown Time

   Objective: To determine the Connection Teardown Time on the SUT as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Create a UNI signaling setup message, as described in Appendix C,
       specifying a PCR which will not allow CAC to reject the call.

   3)  Send the UNI signaling setup message.  Note the time the setup
       message was sent.  Verify that the SVC has been setup with the
       correct parameters.  Note the time the connect message was
       received.

   4)  Create a UNI signaling release message, as described in Appendix
       C, specifying a cause code of normal call clearing.

   5)  Send the UNI signaling release message.  Note the time the
       release message was sent.  Verify that the SVC has been
       terminated with the correct cause code.  Note the time the
       release complete message was received.

   6)  Compute the release time as the difference between the time the
       release complete message was received and the time the release
       message was sent.

   Reporting Format:

      The results of the Connection Teardown Time tests SHOULD be
      reported in a form of a table.  The rows SHOULD be labeled call
      accepted and call released.  The columns SHOULD be labeled time
      message sent, time response received, and correct response.  The
      elements of columns 1 and 2 SHOULD be in seconds.  The elements
      of column 3 SHOULD be either True or False, indicating
      whether the particular condition was observed for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.4.3. Crankback Time

   Objective: To determine the Crankback Time on the SUT as defined in
   RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       passthrough configuration.

   2)  Create a PNNI signaling setup message, as described in Appendix
       C, specifying a DTL which is not blocked by the far end SUT.

   3)  Send the PNNI signaling setup message.  Note the time the setup
       message was sent.  Verify that the connect message has been
       received by the near-end switch.  Note the time the connect
       message was received.

   4)  Create a PNNI signaling setup message, as described in Appendix
       C, specifying a DTL which is blocked by the far end SUT.

   5)  Send the PNNI signaling setup message.  Note the time the setup
       message was sent.  Note the time the release complete message was
       received.  Note the time the near-end switch sends its own PNNI
       setup message (referred to as the near-end setup message)
       specifying the non-blocked DTL.

   6)  Compute the crankback time as the difference between the time the
       near-end setup message was received and the time the release
       complete message was received.
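
   The crankback computation in step 6 is again a difference of two
   recorded timestamps.  A sketch over an event log; the event names
   are illustrative:

   ```python
   def crankback_time(events):
       """Crankback time: interval from the release carrying the
       crankback indication to the near-end switch's re-attempted
       setup message."""
       return events["near_end_setup"] - events["release"]

   # Illustrative timestamps in seconds.
   events = {"release": 10.000, "near_end_setup": 10.042}
   print(round(crankback_time(events), 3))  # 0.042
   ```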

   Reporting Format:

      The results of the Crankback Time tests SHOULD be reported in the
      form of a table.  The rows SHOULD be labeled DTL call accepted
      and call released.  The columns SHOULD be labeled time message
      sent, time response received, and correct response.  The elements
      of columns 1 and 2 SHOULD be in seconds.  The elements of column
      3 SHOULD be either True or False, indicating whether the
      particular condition was observed for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.4.4. Route Update Response Time

   Objective: To determine the Route Update Response Time on the SUT as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the uni-directional
       passthrough configuration.

   2)  Create a PNNI PTSE as described in Appendix C, specifying a
       routing topology.  Verify that the routing tables on the far-end
       and near-end switches are empty.

   3)  Send the PTSE message to the far-end switch.  Note the time the
       PTSE message was sent.  Verify that the PTSE message has been
       received by the far-end switch.  Note the time the PTSE message
       was received.

   4)  Create another PNNI PTSE as described in Appendix C, specifying a
       change in the routing topology.  Verify that the routing tables
       on the far-end and near-end switches contain the previous PTSE
       routes.

   5)  Send the PTSE message to the far-end switch.  Note the time the
       PTSE message was sent.  Verify that the PTSE message has been
       received by the far-end switch.  Note the time the PTSE message
       was received.  Note the time the PTSE was sent to the near-end
       switch.  Note the time the PTSE message was received on the
       near-end switch.

   6)  Compute the Route Update Response time as the difference between
       the time the far-end PTSE message was sent and the time the
       far-end PTSE message was received by the near-end switch.

   Reporting Format:

      The results of the Route Update Response Time tests SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled PTSE
      call accepted, far-end PTSE message sent, and near-end PTSE
      message received.  The columns SHOULD be labeled time message
      sent, time response received, and correct response.  The elements
      of columns 1 and 2 SHOULD be in seconds.  The elements of column
      3 SHOULD be either True or False, indicating whether the
      particular condition was observed for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.5. ATM Service: ILMI

3.5.1. MIB Alignment Time

   Objective: To determine the MIB Alignment Time on the SUT as defined
   in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Send a Cold Start message to the SUT.  Note the time the message
       was sent to the SUT.  Verify that the Cold Start message has been
       received by the SUT.  Note the time the message was received.

   3)  Send a Get Request message to the SUT.  Note the time the message
       was sent to the SUT.  Verify that the Get Request message has
       been received by the SUT.  Note the time the message was
       received.

   4)  After all MIB elements are exchanged, verify that the final Get
       Request message has been received by the SUT.  Note the time the
       message was sent and received by the SUT.

   5)  Compute the MIB Alignment Time as the difference between the time
       the Cold Start message was sent and the time the final Get
       Request was received by the SUT.

   Reporting Format:

      The results of the MIB Alignment Time tests SHOULD be reported in
      the form of a table.  The rows SHOULD be labeled Cold Start sent,
      Cold Start accepted, Final Get Request sent, and Final Get
      Request received.  The columns SHOULD be labeled time message
      sent, time response received, and correct response.  The elements
      of columns 1 and 2 SHOULD be in seconds.  The elements of column
      3 SHOULD be either True or False, indicating whether the
      particular condition was observed for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

3.5.2. Address Registration Time

   Objective: To determine the Address Registration Time on the SUT as
   defined in RFC 2761 "Terminology for ATM Benchmarking".

   Procedure:

   1)  Set up the SUT and test device using the bi-directional
       configuration.

   2)  Send a Set Request message to the SUT.  Note the time the message
       was sent to the SUT.  Verify that the Set Request message has
       been received by the SUT.  Note the time the message was
       received.

   3)  Send a Get Request message to the SUT.  Note the time the message
       was sent to the SUT.  Verify that the Get Request message has
       been received by the SUT.  Note the time the message was
       received.

   4)  After all MIB elements are exchanged, verify that the final Get
       Request message has been received by the SUT.  Note the time the
       message was sent and received by the SUT.

   5)  Compute the Address Registration Time as the difference between
       the time the Set Request message was sent and the time the final
       Get Request was received by the SUT.

   Reporting Format:

      The results of the Address Registration Time tests SHOULD be
      reported in the form of a table.  The rows SHOULD be labeled Set
      Request sent, Set Request accepted, Final Get Request sent, and
      Final Get Request received.  The columns SHOULD be labeled time
      message sent, time response received, and correct response.  The
      elements of columns 1 and 2 SHOULD be in seconds.  The elements
      of column 3 SHOULD be either True or False, indicating whether
      the particular condition was observed for each test.

      The table MUST also indicate the packet size in octets and traffic
      rate in packets per second as generated by the test device.

4. Security Considerations

   As this document is solely for the purpose of providing methodology
   and describes neither a protocol nor an implementation, there are no
   security considerations associated with this document.

5. Notices

   The IETF takes no position regarding the validity or scope of any
   intellectual property or other rights that might be claimed to
   pertain to the implementation or use of the technology described in
   this document or the extent to which any license under such rights
   might or might not be available; neither does it represent that it
   has made any effort to identify any such rights.  Information on the
   IETF's procedures with respect to rights in standards-track and
   standards-related documentation can be found in BCP-11.  Copies of
   claims of rights made available for publication and any assurances of
   licenses to be made available, or the result of an attempt made to
   obtain a general license or permission for the use of such
   proprietary rights by implementors or users of this specification can
   be obtained from the IETF Secretariat.

   The IETF invites any interested party to bring to its attention any
   copyrights, patents or patent applications, or other proprietary
   rights which may cover technology that may be required to practice
   this standard.  Please address the information to the IETF Executive
   Director.

6. References

   [RFC2544]      Bradner, S. and J. McQuaid, "Benchmarking Methodology
                  for Network Interconnect Devices", RFC 2544, March
                  1999.

   [RFC2225]      Laubach, M. and J. Halpern, "Classical IP and ARP over
                  ATM", RFC 2225, April 1998.

   [RFC2761]      Dunn, J. and C. Martin, "Terminology for ATM
                  Benchmarking", RFC 2761, February 2000.

   [AF-ILMI4.0]   ATM Forum Integrated Local Management Interface
                  Version 4.0, af-ilmi-0065.000, September 1996.

   [AF-TEST-0022] Introduction to ATM Forum Test Specifications, af-
                  test-0022.00, December 1994.

   [AF-TM4.1]     ATM Forum, Traffic Management Specification Version
                  4.1, af-tm-0121.00, March 1999.

   [AF-UNI3.1]    ATM Forum, User Network Interface Specification
                  Version 3.1, September 1994.

   [AF-UNI4.0]    ATM Forum, User Network Interface Specification
                  Version 4.0, July 1996.

7. Authors' Addresses

   Jeffrey Dunn
   Advanced Network Consultants, Inc.
   4214 Crest Place
   Ellicott City, MD 21043, USA

   Phone: +1 (410) 750-1700
   EMail: Jeffrey.Dunn@worldnet.att.net


   Cynthia Martin
   Advanced Network Consultants, Inc.
   4214 Crest Place
   Ellicott City, MD 21043, USA

   Phone: +1 (410) 750-1700
   EMail: Cynthia.E.Martin@worldnet.att.net

Appendix A: Ranges

   ATM NSAP Network Prefix.
     39 0000 0000 0000 0000 0000 0000-39 0000 0000 0000 0000 0000 00FF
     39 0000 0000 0000 0000 0001 0000-39 0000 0000 0000 0000 0001 00FF
     39 0000 0000 0000 0001 0000 0000
     39 0000 0000 0000 0002 0020 0000
     39 0000 0000 0300 0002 0030 0000
     39 0000 0000 4000 0002 0060 0000
     39 0000 0006 0060 0002 0030 0000
     39 0000 0006 0050 0002 0030 0000
     39 0000 0009 0300 0002 0030 0000
     39 0000 00A0 0300 0002 0030 0000
     39 0000 0B00 0300 0002 0030 0000
     39 0000 C000 0300 0002 0030 0000

   ATM NSAP End System Identifier.
     1111 1111 1111 00-1111 1111 11FF 00
     2222 2222 2000 00-2222 2222 2222 00
     9999 999A 0000 00-9999 999C 0000 00

Appendix B: Rates

   PNNI Routing Update Size.

   1) 1 PNNI routing entry update on non-aggregated addresses

   2) 2 PNNI routing entry updates on non-aggregated addresses

   3) 5 PNNI routing entry updates on non-aggregated addresses

   4) 1 % of total available bandwidth or 1 Mb/s, whichever is less, on
      non-aggregated addresses

   5) 1 % of total available bandwidth or 1 Mb/s, whichever is less, on
      non-aggregated and aggregated addresses

   6) 1 % of total available bandwidth or 1 Mb/s, whichever is less, on
      aggregated addresses

   7) 2 % of total available bandwidth or 2 Mb/s, whichever is less, on
      non-aggregated addresses

   8) 2 % of total available bandwidth or 2 Mb/s, whichever is less, on
      non-aggregated and aggregated addresses

   9) 2 % of total available bandwidth or 2 Mb/s, whichever is less, on
      aggregated addresses

   PNNI Routing Update Repetition Interval.

   The repetition interval begins after the initial PNNI routing table
   stabilizes.

   1) 1 update every 1 hour, for 24 hours

   2) 1 update every 30 minutes, for 24 hours

   3) 1 update every 5 minutes, for 1 hour

   4) 1 update every 1 minute, for 15 minutes

   5) 1 update every 30 seconds, for 5 minutes

   6) 1 update every 30 seconds, for 1 minute

   7) 1 update every 1 second, for 30 seconds

   Maximum WAN Connection rates in packets per second (pps):

                    25.6        OC-3c       OC-12c
   IP Packet Size
   octets/cells
       44/2         30188       176603      706412
       64/2         30188       176603      706412
      128/3         20125       117735      470940
      256/6         10062        58867      235468
    1024/22          2744        16054       64216
    1518/32          1886        11037       44148
    2048/43          1404         8214       32856
    4472/94           642         3757       15028
   9180/192           314         1839        7356

   Maximum LAN Connection rates in packets per second (pps):

                    DS-1       DS-3       E1        E3
   IP Packet Size
   octets/cells
       44/2          1811      52133      2340     40000
       64/2          1811      52133      2340     40000
      128/3          1207      34755      1560     26666
      256/6           603      17377       780     13333
    1024/22           164       4739       212      3636
    1518/32           113       3258       146      2500
    2048/43            84       2424       108      1860
    4472/94            38       1109        49       851
    9180/192           18        543        24       416

   Notes: 1.  PDU size in cells is computed as ceiling((PDU size in
   octets + 16) / 48).  This assumes an 8-octet LLC/SNAP header and an
   8-octet AAL5 trailer.

   2.  Due to the number of possible configurations, IMA pps rates are
   not listed, but may be derived from the formula floor(IDCR / cells
   per packet), where cells per packet is computed as in note 1.

   3.  The following cell rates were used:
          DS-1      =    3622 cps (using ATM TC)
          E1        =    4681 cps
          25.6 Mb/s =   60377 cps
          E3        =   80000 cps (using ATM TC)
          DS-3      =  104266 cps (using ATM TC)
          OC-3c     =  353207 cps
          OC-12c    = 1412828 cps
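
   Taken together, notes 1 through 3 reproduce the rate tables above.
   The following is a short sketch of that computation; Python is used
   here purely for illustration and is not part of the methodology.

```python
import math

# Note 1: cells per PDU, assuming 16 octets of per-PDU overhead
# (8-octet LLC/SNAP header plus 8-octet AAL5 trailer) and 48 octets
# of payload per cell.
def cells_per_pdu(pdu_octets):
    return math.ceil((pdu_octets + 16) / 48)

# Note 2: maximum packet rate is the link cell rate divided by the
# cells per packet, rounded down.
def max_pps(cell_rate_cps, pdu_octets):
    return cell_rate_cps // cells_per_pdu(pdu_octets)

# Cell rates from note 3.
DS1, OC3C = 3622, 353207

print(cells_per_pdu(44))      # 2 cells
print(max_pps(OC3C, 128))     # 117735, the OC-3c / 128-octet entry
print(max_pps(DS1, 9180))     # 18, the DS-1 / 9180-octet entry
```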

Appendix C: PDU's

 TCP/IP over ATM Example 1.
    LLC:    DSAP                        0xAA (SNAP-SAP)
            SSAP                        0xAA (SNAP-SAP)
            Control                     0x03 (Unnumbered Information)
    SNAP:   OUI                         0x00-00-00 (Ethertype)
            PID                         0x0800 (Internet Protocol)
    IP:      Version = 4
             Header length = 20
             Type of service = 0
                 000. .... Precedence = Routine(0)
                 ...0 .... Delay = Normal (0)
                 .... 0... Throughput = Normal (0)
                 .... .0.. Reliability = Normal (0)
             Packet length = 40
             Id = 0
             Fragmentation Info = 0x0000
                 .0.. ....  .... .... Don't Fragment Bit = FALSE
                 ..0. ....  .... .... More Fragments Bit = FALSE
                 ...0 0000  0000 0000 Fragment offset = 0
             Time to live = 255
             Protocol = TCP (6)
             Header checksum = F9CF
             Source address = 15.19.209.236
             Destination address = 15.19.209.237
    TCP:     Source port = smtp (25)
             Destination port = smtp (25)
             Sequence number = 1
             Ack number = 0
             Data offset = 20
             Flags = 0x02
                 ..0. .... URGENT Flag = FALSE
                 ...0 .... ACK Flag = FALSE
                 .... 0... PUSH Flag = FALSE
                 .... .0.. RST Flag = FALSE
                 .... ..1. SYN Flag = TRUE
                 .... ...0 FIN Flag = FALSE
             Window = 0
             Checksum = EDAF
             Urgent pointer = 00000000

 TCP/IP over ATM Example 2.
    LLC:    DSAP                        0xAA (SNAP-SAP)
            SSAP                        0xAA (SNAP-SAP)
            Control                     0x03 (Unnumbered Information)
    SNAP:   OUI                         0x00-00-00 (Ethertype)
            PID                         0x0800 (Internet Protocol)
    IP:      Version = 4
             Header length = 20
             Type of service = 0
                 000. .... Precedence = Routine(0)
                 ...0 .... Delay = Normal (0)
                 .... 0... Throughput = Normal (0)
                 .... .0.. Reliability = Normal (0)
             Packet length = 40
             Id = 0
             Fragmentation Info = 0x0000
                 .0.. ....  .... .... Don't Fragment Bit = FALSE
                 ..0. ....  .... .... More Fragments Bit = FALSE
                 ...0 0000  0000 0000 Fragment offset = 0
             Time to live = 255
             Protocol = TCP (6)
             Header checksum = F9CF
             Source address = 15.19.209.236
             Destination address = 15.19.209.237
    TCP:     Source port = ftp-data (20)
             Destination port = 2000
             Sequence number = 1
             Ack number = 0
             Data offset = 20
             Flags = 0x02
                 ..0. .... URGENT Flag = FALSE
                 ...0 .... ACK Flag = FALSE
                 .... 0... PUSH Flag = FALSE
                 .... .0.. RST Flag = FALSE
                 .... ..1. SYN Flag = TRUE
                 .... ...0 FIN Flag = FALSE
             Window = 0
             Checksum = E5FD
             Urgent pointer = 00000000

 ICMP/IP over ATM Example.
    LLC:    DSAP                        0xAA (SNAP-SAP)
            SSAP                        0xAA (SNAP-SAP)
            Control                     0x03 (Unnumbered Information)
    SNAP:   OUI                         0x00-00-00 (Ethertype)
            PID                         0x0800 (Internet Protocol)
    IP:      Version = 4
             Header length = 20
             Type of service = 0
                 000. .... Precedence = Routine(0)
                 ...0 .... Delay = Normal (0)
                 .... 0... Throughput = Normal (0)
                 .... .0.. Reliability = Normal (0)
             Packet length = 28
             Id = 0
             Fragmentation Info = 0x0000
                 .0.. ....  .... .... Don't Fragment Bit = FALSE
                 ..0. ....  .... .... More Fragments Bit = FALSE
                 ...0 0000  0000 0000 Fragment offset = 0
             Time to live = 255
             Protocol = ICMP (1)
             Header checksum = F9E0
             Source address = 15.19.209.236
             Destination address = 15.19.209.237
    ICMP:    Type = Echo request (8)
             Code = 0
             Checksum = F7FF
             Identifier = 0 (0x0)
             Sequence Number = 0 (0x0)

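   The header checksums quoted in the examples above are ordinary
   Internet checksums: the one's complement of the one's-complement sum
   of the header's 16-bit words (RFC 1071).  The sketch below is
   illustrative only; it reproduces the IP header checksum F9E0 and the
   ICMP checksum F7FF of the echo request example.

```python
def inet_checksum(words):
    """Internet checksum over 16-bit words, with the checksum field
    itself taken as zero (RFC 1071)."""
    s = sum(words)
    while s > 0xFFFF:
        s = (s & 0xFFFF) + (s >> 16)   # fold carries back into the sum
    return ~s & 0xFFFF

# IP header of the echo request example: version/IHL 0x4500, TOS 0,
# packet length 28, id 0, flags/offset 0, TTL 255 and protocol 1
# (0xFF01), source 15.19.209.236, destination 15.19.209.237.
ip_words = [0x4500, 0x001C, 0x0000, 0x0000, 0xFF01,
            0x0F13, 0xD1EC, 0x0F13, 0xD1ED]
print('%04X' % inet_checksum(ip_words))    # F9E0, as in the example

# ICMP echo request: type 8, code 0, identifier 0, sequence number 0.
icmp_words = [0x0800, 0x0000, 0x0000]
print('%04X' % inet_checksum(icmp_words))  # F7FF, as in the example
```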
 RIP Routing Update over ATM.

    -- DATAGRAM HEADER
          offset data (hex)            description
          00     FF FF FF FF FF FF     dest MAC address is broadcast
          06     xx xx xx xx xx xx     source hardware address
          12     08 00                 type

          -- IP HEADER
          14     45                    IP version - 4, header length
                                       (4 byte units) - 5
          15     00                    service field
          16     00 EE                 total length
          18     00 00                 ID
          20     40 00                 flags (3 bits) - 4 (do not
                                       fragment), fragment offset - 0
          22     0A                    TTL
          23     11                    protocol - 17 (UDP)
          24     C4 8D                 header checksum
          26     xx xx xx xx           source IP address
          30     xx xx xx              destination IP address
          33     FF                    host part = FF for broadcast

          -- UDP HEADER
          34     02 08                 source port 520 (0x0208) = RIP
          36     02 08                 destination port 520 (0x0208) =
                                       RIP
          38     00 DA                 UDP message length
          40     00 00                 UDP checksum

          -- RIP packet
          42     02                  command = response
          43     01                  version = 1
          44     00 00               0

          -- net 1
          46     00 02               family = IP
          48     00 00               0
          50     xx xx xx            net 1 IP address
          53     00                  net not node
          54     00 00 00 00         0
          58     00 00 00 00         0
          62     00 00 00 07         metric 7

          -- net 2

          66     00 02               family = IP
          68     00 00               0
          70     xx xx xx            net 2 IP address
          73     00                  net not node
          74     00 00 00 00         0
          78     00 00 00 00         0
          82     00 00 00 07         metric 7

          -- net 3
          86     00 02               family = IP
          88     00 00               0
          90     xx xx xx            net 3 IP address
          93     00                  net not node
          94     00 00 00 00         0
          98     00 00 00 00         0
          102    00 00 00 07         metric 7

          -- net 4
          106    00 02               family = IP
          108    00 00               0
          110    xx xx xx            net 4 IP address
          113    00                  net not node
          114    00 00 00 00         0
          118    00 00 00 00         0
          122    00 00 00 07         metric 7

          -- net 5
          126    00 02               family = IP
          128    00 00               0
          130    xx xx xx            net 5 IP address
          133    00                  net not node
          134    00 00 00 00         0
          138    00 00 00 00         0
          142    00 00 00 07         metric 7

          -- net 6
          146    00 02               family = IP
          148    00 00               0
          150    xx xx xx            net 6 IP address
          153    00                  net not node
          154    00 00 00 00         0
          158    00 00 00 00         0
          162    00 00 00 07         metric 7
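
   The two-octet fields in the UDP header above are big-endian, so the
   octets 02 08 decode to decimal 520, the well-known RIP port.  The
   following is a small illustrative sketch, not part of the
   methodology.

```python
import struct

# UDP header octets from the listing above (network byte order).
udp_header = bytes([0x02, 0x08, 0x02, 0x08, 0x00, 0xDA, 0x00, 0x00])
src, dst, length, checksum = struct.unpack("!4H", udp_header)

print(src, dst)   # 520 520 -- 0x0208 is the well-known RIP port
print(length)     # 218 -- the UDP message length, 0x00DA
print(checksum)   # 0 -- the checksum is left unset in this example
```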

   UNI 3.1 Signaling Setup Message Example.  PCR will not allow CAC to
   reject the call.

    Protocol Discriminator    : Q.93B UNI call control
    Call Reference Length     : 3
    Call Reference Flag       : orig
    Call Reference Value      : 0
    Message Type              : SETUP
    Ext                       : last octet
    Action Indicator          : clear call
    Message Length            : 50
    Information Element ID    : ATM Traffic Descriptor
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 9
    Cell Rate Subfield ID     : forward peak CR(CLP=0+1)
    Forward Peak Cell Rate    : 1
    Cell Rate Subfield ID     : backward peak CR(CLP=0+1)
    Backward Peak Cell Rate   : 1
    Cell Rate Subfield ID     : best effort indicator
    Information Element ID    : Broadband Bearer Capability
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 2
    Ext                       : last octet
    Bearer Class              : BCOB-X
    Ext                       : last octet
    Clipping Susceptibility   : not susceptible to clipping
    User Plane Connection CFG : point-to-point
    Information Element ID    : Called Party Number
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 21
    Ext                       : last octet
    Addressing/Numbering Plan : ISO NSAP addressing
    ISO NSAP Address Octets   : 3900000000000000000000000011111111111100
    Information Element ID    : Quality of Service Parameter
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 2
    QoS Class Forward         : QoS class 0 - unspecified
    QoS Class Backward        : QoS class 0 - unspecified

   UNI 3.1 Signaling Setup Message Reject Example.  PCR will allow CAC
   to reject the call.

    Protocol Discriminator    : Q.93B UNI call control
    Call Reference Length     : 3
    Call Reference Flag       : orig
    Call Reference Value      : 0
    Message Type              : SETUP
    Ext                       : last octet
    Action Indicator          : clear call
    Message Length            : 50
    Information Element ID    : ATM Traffic Descriptor
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 8
    Cell Rate Subfield ID     : forward peak CR(CLP=0+1)
    Forward Peak Cell Rate    : 300000
    Cell Rate Subfield ID     : backward peak CR(CLP=0+1)
    Backward Peak Cell Rate   : 300000
    Information Element ID    : Broadband Bearer Capability
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Flag                      : not significant
    Action Indicator          : clear call
    IE Length                 : 3
    Ext                       : another octet
    Bearer Class              : BCOB-X
    Ext                       : last octet
    Traffic Type              : constant bit rate
    Timing Requirements       : end-to-end timing required
    Ext                       : last octet
    Clipping Susceptibility   : not susceptible to clipping
    User Plane Connection CFG : point-to-point
    Information Element ID    : Called Party Number
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 21
    Ext                       : last octet
    Addressing/Numbering Plan : ISO NSAP addressing
    ISO NSAP Address Octets   : 3900000000000000000000000011111111111100
    Information Element ID    : Quality of Service Parameter
    Ext                       : last octet
    Coding Standard           : ITU-T standard
    Action Indicator          : clear call
    IE Length                 : 2
    QoS Class Forward         : QoS class 0 - unspecified
    QoS Class Backward        : QoS class 0 - unspecified

   UNI 3.1 Signaling Release Message, specifying a cause code of normal
   call clearing.

    Protocol Discriminator   : Q.93B UNI call control
    Call Reference Length    : 3
    Call Reference Flag      : orig
    Call Reference Value     : 0
    Message Type             : RELEASE
    Ext                      : last octet
    Action Indicator         : clear call
    Message Length           : 6
    Information Element ID   : Cause
    Ext                      : last octet
    Coding Standard          : ITU-T standard
    Action Indicator         : clear call
    IE Length                : 2
    Ext                      : last octet
    Location                 : user
    Ext                      : last octet
    Cause Value              : NE:normal call clearing

   PNNI Signaling Setup Message, specifying a DTL which is not blocked
   by the far end SUT.

    Protocol Discriminator    : PNNI signalling
    Call Reference Length     : 3
    Call Reference Flag       : from
    Message Type              : SETUP
    Ext                       : last octet
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    Message Length            : 56
    Information Element ID    : ATM Traffic Descriptor
    Ext                       : last octet
    Coding Standard           : ITU-T standardized
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    IE Length                 : 0
    Information Element ID    : Broadband Bearer Capability
    Ext                       : last octet
    Coding Standard           : ITU-T standardized
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    IE Length                 : 3
    Ext                       : another octet
    Bearer Class              : BCOB-X
    Ext                       : last octet
    ATM Transfer Capability   : reserved for bwd compatibility
    Ext                       : last octet
    Clipping Susceptibility   : not susceptible to clipping
    User Plane Connection cfg : point-to-point
    Information Element ID    : Called Party Number
    Ext                       : last octet
    Coding Standard           : ITU-T standardized
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    IE Length                 : 8
    Ext                       : last octet
    Type of Number            : unknown
    Addressing/Numbering Plan : ATM endsystem address
    ATM Endsystem Address Oct : 11111111111101
    Information Element ID    : Designated Transit List
    Ext                       : last octet
    Coding Standard           : ATM Forum specific
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    IE Length                 : 29
    Current Transit Pointer   : 0
    Logical Node/Port Indicat : Logical Node/Port Indicator
    Logical Node Identifier   : 3900000000000000000000000011111111111100

   PNNI  Signaling Setup Message Reject, specifying a DTL which is
   blocked by the far end SUT.

    Protocol Discriminator  : PNNI signalling
    Call Reference Length   : 3
    Call Reference Flag     : from
    Call Reference Value    : 0
    Message Type            : SETUP
    Ext                     : last octet
    Pass Along Request      : no pass along request
    Action Indicator        : clear call
    Message Length          : 56
    Information Element ID  : ATM Traffic Descriptor
    Ext                     : last octet
    Coding Standard         : ITU-T standardized
    Pass Along Request      : no pass along request
    Action Indicator        : clear call
    IE Length               : 0
    Information Element ID  : Broadband Bearer Capability
    Ext                     : last octet
    Coding Standard         : ITU-T standardized
    Pass Along Request      : no pass along request
    Action Indicator        : clear call
    IE Length               : 3
    Bearer Class            : BCOB-X
    Ext                     : last octet
    ATM Transfer Capability : reserved for bwd compatibility
    Ext                     : last octet
    Clipping Susceptibility : not susceptible to clipping
    User Plane Connection cfg : point-to-point
    Information Element ID  : Called Party Number
    Ext                     : last octet
    Coding Standard         : ITU-T standardized
    Pass Along Request      : no pass along request
    Action Indicator        : clear call
    IE Length               : 8
    Ext                     : last octet
    Addressing/Numbering Plan : ATM endsystem address
    ATM Endsystem Address Oct : 11111111111101
    Information Element ID    : Designated Transit List
    Ext                       : last octet
    Coding Standard           : ATM Forum specific
    Pass Along Request        : no pass along request
    Action Indicator          : clear call
    IE Length                 : 29
    Current Transit Pointer   : 0
    Logical Node/Port Indicat : Logical Node/Port Indicator
    Logical Node Identifier   : 3900000000000000000000000011111111111100
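
   The SETUP message above is carried as a string of information
   elements, each a TLV: in Q.2931-style signaling an IE begins with a
   one-octet identifier, a one-octet instruction field (the
   Ext/Coding Standard/Flag/Action Indicator octet decoded above), and
   a two-octet contents length.  A minimal walker over such a stream
   (a sketch only; field widths are taken from Q.2931, and the sample
   identifier 0x59 for the ATM Traffic Descriptor IE is from that
   specification):

```python
import struct

def walk_ies(buf):
    """Yield (ie_id, instruction, contents) triples from a
    Q.2931-style information-element stream: 1-octet IE identifier,
    1-octet instruction field, 2-octet (network order) contents
    length, then the contents themselves."""
    off = 0
    ies = []
    while off + 4 <= len(buf):
        ie_id, instr, length = struct.unpack_from("!BBH", buf, off)
        off += 4
        ies.append((ie_id, instr, buf[off:off + length]))
        off += length
    return ies

# Example: an ATM Traffic Descriptor IE (id 0x59) with zero-length
# contents, matching "IE Length : 0" in the decode above.
sample = bytes([0x59, 0x80, 0x00, 0x00])
print(walk_ies(sample))   # [(89, 128, b'')]
```

   A decoder such as the one that produced the listing above would
   dispatch on the returned identifier to interpret each IE's
   contents.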



   PNNI Far End Request Message.

Header:  Packet Type                    5 (PTSE REQUEST)
             Packet Length             40
             Protocol Version           1
             Newest Version Supported   1
             Oldest Version Supported   0
             Reserved                   0
    IG:      Information Group Type   513 (Requested PTSE Header)
             Information Group Length  32
             Originating Node ID
                   00013900-00000000-00000000-00000011-11111111-1100
             PTSE Request Count     1
             PTSE Identifier        0
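
   The fixed header decoded above can be serialized directly.
   Assuming the PNNI 1.0 layout of a 2-octet packet type, a 2-octet
   packet length, and four 1-octet version/reserved fields, all in
   network byte order, a sketch:

```python
import struct

def pnni_header(pkt_type, pkt_len, proto_ver=1, newest=1, oldest=0):
    """Pack the 8-octet PNNI common packet header: type (2 octets),
    length (2), protocol version (1), newest and oldest supported
    versions (1 each), reserved (1).  Layout assumed from the PNNI
    1.0 specification."""
    return struct.pack("!HHBBBB", pkt_type, pkt_len, proto_ver,
                       newest, oldest, 0)

# The PTSE REQUEST header shown above: type 5, length 40.
hdr = pnni_header(5, 40)
print(hdr.hex())   # 0005002801010000
```

   Packing type 5 and length 40 as above yields the 8-octet header
   00 05 00 28 01 01 00 00, matching the field values in the listing.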

   PNNI PTSE, specifying a routing topology.

Header:  Packet Type                    4 (DATABASE SUMMARY)
             Packet Length             76
             Protocol Version           1
             Newest Version Supported   1
             Oldest Version Supported   0
             Reserved                   0
             Initialize (I)Bit          1 (during init. of DB sync
                                           process)
             More (M)Bit                1 (PTSEs to summarize)
             Master (MS)Bit             1 (both nodes)
             Reserved                   0
             Reserved                   0
             DS Sequence Number         0
    IG:      Information Group Type   512 (Nodal PTSE Summaries)
             Information Group Length  60
             Originating Node ID
                 00013900-00000000-00000000-00000011-11111111-1100
             Originating Node's Peer Group 00000000-00000000-00000000-
                                           0001
             Reserved                    0
             PTSE Summary Count          1
             PTSE Type                   0
             Reserved                    0
             PTSE Identifier             0
             PTSE Sequence Number        0
             PTSE Checksum               0
             PTSE Remaining Lifetime     0







   PNNI PTSE Update, specifying a change in the routing topology.

Header:  Packet Type                    2 (PTSP)
             Packet Length             96
             Protocol Version           1
             Newest Version Supported   1
             Oldest Version Supported   0
             Reserved                   0
             Originating Node ID
                 00013900-00000000-00000000-00000011-11111111-1100
             Originating Node's Peer Group 00000000-00000000-00000000-
                                           0001
    IG:      Information Group Type     64 (PTSE)
             Information Group Length   52
             PTSE Type                   0
             Reserved                    0
             PTSE Identifier             0
             PTSE Sequence Number        0
             PTSE Checksum           42252
             PTSE Remaining Lifetime  3600
    IG:      Information Group Type    224 (Internal Reachable ATM
                                            Addresses)
             Information Group Length   32
             VP Capability Flag          1 (VPCs supported)
             Reserved                    0
             Reserved                    0
             Port ID                     0
             Scope of Advertisement     96
             Address Information Length 14
             Address Information Count   1
             Prefix Length              13
             Reachable Address Prefix   39000000-00000000-00000000-01
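
   The PTSE Checksum field above is, per the PNNI 1.0 specification, a
   Fletcher-style checksum computed over the PTSE with the checksum
   octets set to zero (the same algorithm OSPF uses for LSAs).  A
   minimal sketch of the plain Fletcher sum over an octet string (the
   input below is a common illustrative test vector, not the PTSP
   above):

```python
def fletcher16(data):
    """Plain Fletcher checksum over an octet string: a running octet
    sum and a running sum-of-sums, each modulo 255, combined into a
    16-bit value."""
    c0 = c1 = 0
    for b in data:
        c0 = (c0 + b) % 255
        c1 = (c1 + c0) % 255
    return (c1 << 8) | c0

print(hex(fletcher16(b"abcde")))   # 0xc8f0
```

   A receiver would recompute this sum over the PTSE (checksum octets
   zeroed) and discard the PTSE on a mismatch.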


Full Copyright Statement

   Copyright (C) The Internet Society (2001).  All Rights Reserved.

   This document and translations of it may be copied and furnished to
   others, and derivative works that comment on or otherwise explain it
   or assist in its implementation may be prepared, copied, published
   and distributed, in whole or in part, without restriction of any
   kind, provided that the above copyright notice and this paragraph are
   included on all such copies and derivative works.  However, this
   document itself may not be modified in any way, such as by removing
   the copyright notice or references to the Internet Society or other
   Internet organizations, except as needed for the purpose of
   developing Internet standards in which case the procedures for
   copyrights defined in the Internet Standards process must be
   followed, or as required to translate it into languages other than
   English.

   The limited permissions granted above are perpetual and will not be
   revoked by the Internet Society or its successors or assigns.

   This document and the information contained herein is provided on an
   "AS IS" basis and THE INTERNET SOCIETY AND THE INTERNET ENGINEERING
   TASK FORCE DISCLAIMS ALL WARRANTIES, EXPRESS OR IMPLIED, INCLUDING
   BUT NOT LIMITED TO ANY WARRANTY THAT THE USE OF THE INFORMATION
   HEREIN WILL NOT INFRINGE ANY RIGHTS OR ANY IMPLIED WARRANTIES OF
   MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE.

Acknowledgement

   Funding for the RFC Editor function is currently provided by the
   Internet Society.