<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.24 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-03" category="std" consensus="true" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.28.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-03"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2025" month="March" day="15"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology Working Group</workgroup>
    <abstract>
      <?line 65?>

<t>This document defines methodologies for benchmarking the performance of source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing and have been implemented with various designs to perform SAV in their corresponding scenarios. This document takes the approach of considering a SAV device to be a black box, defining the methodology in a manner that is agnostic to the mechanisms. It provides a method for measuring the performance of both existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>
    <?line 69?>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is critically important for preventing source address spoofing. Operators are advised to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> depending on their network environments. However, existing intra-domain and inter-domain SAV mechanisms have problems with operational overhead and accuracy in various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document will help operators get a more accurate picture of SAV performance when their deployed devices enable SAV, and will also help vendors test the SAV performance of their device implementations.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support many SAV mechanisms. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy, convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a bare metal server, a virtual machine (VM) instance, or a container instance, any of which runs as a SAV device. This document is intended for those who want to measure a SAV device's performance as well as compare the performance of various SAV devices.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing ''which SAV mechanisms perform best'' over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as ''micro-benchmark'').</t>
          </li>
        </ul>
        <t>The benchmark aims to compare the SAV performance of individual devices, e.g., hardware or software routers. It will showcase the performance of various SAV mechanisms for a given device and network scenario, with the objective of helping operators deploy the appropriate SAV mechanism for their network scenario.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>
        <?line -18?>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>Improper Block: The validation result in which packets with legitimate source addresses are improperly blocked due to inaccurate SAV rules.</t>
      <t>Improper Permit: The validation result in which packets with spoofed source addresses are improperly permitted due to inaccurate SAV rules.</t>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An intra-domain router of an AS that is connected to a host network (i.e., a layer-2 network).</t>
      <t>Customer-facing Router: An intra-domain router of an AS that is connected to an intra-domain customer network running a routing protocol (i.e., a layer-3 network).</t>
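      <t>As a non-normative illustration of the SAV data plane behavior defined above, the sketch below stores prefix-based SAV rules per incoming interface and decides whether to permit or discard a packet based on its source address. The table layout and names are hypothetical and do not correspond to any particular SAV mechanism.</t>
      <sourcecode type="python"><![CDATA[
import ipaddress

# Hypothetical SAV table: incoming interface -> permitted source prefixes.
SAV_TABLE = {
    "customer0": [ipaddress.ip_network("10.0.0.0/15")],
}

def validate(interface, src_ip):
    """Permit a packet only if its source matches a SAV rule on its interface."""
    addr = ipaddress.ip_address(src_ip)
    for prefix in SAV_TABLE.get(interface, []):
        if addr in prefix:
            return "permit"
    return "discard"
]]></sourcecode>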
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup is in general compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topologies introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofing or legitimate traffic. It is <bcp14>OPTIONAL</bcp14> to choose various proportions of traffic, and the Tester needs to generate traffic at line rate to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The placement of the DUT within the network topology significantly influences the precision of SAV mechanisms. Consequently, the benchmarking process <bcp14>MUST</bcp14> involve positioning the DUT at various locations throughout the network to thoroughly evaluate its performance.</t>
        <t>The routing configurations of devices within the network topology can vary, and the SAV rules generated are contingent upon these configurations. It is imperative to delineate the specific device configurations employed during testing.</t>
        <t>Moreover, it is essential to denote the role of each device, such as a host-facing router, customer-facing router, or AS border router within an intra-domain network, and to clarify the business relationships between ASes in an inter-domain network context.</t>
        <t>When assessing the data plane forwarding performance, the network traffic produced by the Tester must be characterized by specified traffic rates, the ratio of spoofing to legitimate traffic, and the distribution of source addresses, as these factors can all impact the outcomes of the tests.</t>
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>MUST</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>MUST</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="proportion-of-improper-blocks">
        <name>Proportion of Improper Blocks</name>
        <t>The proportion of legitimate traffic that is improperly blocked by the DUT out of all legitimate traffic; this reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="proportion-of-improper-permits">
        <name>Proportion of Improper Permits</name>
        <t>The proportion of spoofing traffic that is improperly permitted by the DUT out of all spoofing traffic; this reflects the SAV accuracy of the DUT.</t>
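        <t>Both accuracy KPIs can be computed directly from the Tester's packet counters. The following non-normative sketch shows the calculation; the counter names are hypothetical.</t>
        <sourcecode type="python"><![CDATA[
def improper_block_proportion(legit_blocked, legit_total):
    # Fraction of all legitimate packets that the DUT improperly blocked.
    return legit_blocked / legit_total

def improper_permit_proportion(spoofed_permitted, spoofed_total):
    # Fraction of all spoofed packets that the DUT improperly permitted.
    return spoofed_permitted / spoofed_total
]]></sourcecode>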
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time represents the period during which the SAV control plane protocol converges to update the SAV rules when routing changes happen; it is the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI can indicate the convergence performance of the SAV protocol.</t>
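        <t>One non-normative way to approximate this KPI from outside the DUT is to timestamp the routing change and then poll the DUT's SAV table until the expected rule appears. In the sketch below, trigger_routing_change and get_sav_rules are hypothetical hooks into the test harness, not DUT interfaces defined by this document.</t>
        <sourcecode type="python"><![CDATA[
import time

def measure_convergence(trigger_routing_change, get_sav_rules, expected_rule,
                        timeout=60.0, interval=0.01):
    """Seconds from the routing change until expected_rule appears, or None."""
    start = time.monotonic()
    trigger_routing_change()
    while time.monotonic() - start < timeout:
        if expected_rule in get_sav_rules():
            return time.monotonic() - start
        time.sleep(interval)
    return None
]]></sourcecode>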
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane packets that communicate SAV-related information, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the throughput at which the SAV data plane processes and forwards data plane traffic, and it can indicate the SAV data plane performance of the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra-domain-sav">
        <name>Intra-domain SAV</name>
        <section anchor="sav-accuracy">
          <name>SAV Accuracy</name>
          <t><strong>Objective</strong>: Evaluate the DUT's accuracy in handling legitimate and spoofing traffic across diverse intra-domain network scenarios, encompassing SAV implementations for customer or host networks, Internet-facing networks, and aggregation-router-facing networks. This assessment is quantified by the ratio of legitimate traffic erroneously blocked by the DUT to the total volume of legitimate traffic, alongside the ratio of spoofing traffic mistakenly allowed by the DUT relative to the total volume of spoofing traffic.</t>
          <t>In the following, this document introduces classic scenarios for testing the accuracy of the DUT for intra-domain SAV.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> shows the case of SAV for a customer or host network in the intra-domain symmetric routing scenario, where the DUT performs SAV as a customer/host-facing router and connects to Router 1 to access the Internet. Network 1 is a customer/host network within the AS, connects to the DUT, and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 to advertise its prefix in the control plane and to generate spoofing and legitimate traffic in the data plane. In this case, the Tester is configured so that the inbound traffic destined for 10.0.0.0/15 comes from the DUT. The DUT learns the route to prefix 10.0.0.0/15 from the Tester, while the Tester can send outbound traffic with source addresses in prefix 10.0.0.0/15 to the DUT, which emulates a symmetric routing scenario between the Tester and the DUT. The IP addresses in this test case are examples; users can use other IP addresses, and this holds true for the other test cases as well.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
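          <t>As a non-normative sketch of step 3 above, the Tester's traffic mix can be built by drawing source addresses from the legitimate prefix 10.0.0.0/15 and the spoofed prefix 10.2.0.0/15 at a configurable ratio. The generator shown is hypothetical; any traffic generator capable of line rate can play this role.</t>
          <sourcecode type="python"><![CDATA[
import ipaddress
import random

def build_traffic_mix(total, spoof_ratio, seed=0):
    """Return (src_ip, label) pairs mixing spoofed and legitimate sources."""
    rng = random.Random(seed)
    legit = ipaddress.ip_network("10.0.0.0/15")
    spoof = ipaddress.ip_network("10.2.0.0/15")
    mix = []
    for _ in range(total):
        if rng.random() < spoof_ratio:
            net, label = spoof, "spoofing"
        else:
            net, label = legit, "legitimate"
        # Pick a host address inside the chosen prefix.
        offset = rng.randrange(1, net.num_addresses - 1)
        mix.append((str(net.network_address + offset), label))
    return mix
]]></sourcecode>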
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-customer-asyn"/> shows the case of SAV for a customer or host network in the intra-domain asymmetric routing scenario, where the DUT performs SAV as a customer/host-facing router. Network 1 is a customer/host network within the AS, connects to both the DUT and Router 1, and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 and perform its control plane and data plane functions. In this case, the Tester is configured so that the inbound traffic destined for 10.1.0.0/16 comes only from the DUT and the inbound traffic destined for 10.0.0.0/16 comes only from Router 1. The DUT only learns the route to prefix 10.1.0.0/16 from the Tester, while Router 1 only learns the route to prefix 10.0.0.0/16 from Network 1. Then, the DUT and Router 1 advertise their learned prefixes to Router 2. In addition, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester can send outbound traffic with source addresses in prefix 10.0.0.0/16 to the DUT, which emulates an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-asyn"/> to construct the test network environment. The Tester is connected to the DUT and Router 1 and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |     Network 1      |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network within an intra-domain symmetric routing context. In this scenario, the network topology mirrors that of <xref target="intra-domain-customer-syn"/>, with the key distinction being the DUT's placement within the network. Here, the DUT is linked to Router 1 and the Internet, with the Tester simulating the Internet's role. The DUT executes Internet-facing SAV, as opposed to customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |     Network 1      |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-internet-asyn"/> shows the test case of SAV for an Internet-facing network in the intra-domain asymmetric routing scenario. In this test case, the network topology is the same as <xref target="intra-domain-customer-asyn"/>; the difference is the location of the DUT in the network topology, where the DUT is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-asyn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network within an intra-domain symmetric routing environment. The test network setup in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is linked to Router 1 to simulate the operations of Network 1, thereby evaluating the SAV accuracy of the DUT as it faces the direction of Router 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for aggregation-router-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-syn"/> to construct the test network environment. The Tester is connected to Router 1 and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to Router 1, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The expected results are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 for this test case.</t>
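          <t>The accuracy check above can be sketched with a minimal model. This is illustrative only, not a vendor implementation: the SAV rule, the interface name, and the sav_permits/run_trial helpers are hypothetical, with the prefixes taken from the figure (Network 1's 10.0.0.0/15 as the legitimate source space, 10.2.0.0/15 as the spoofed one).</t>
          <sourcecode type="python"><![CDATA[
```python
import ipaddress
import random

# Hypothetical SAV rule on the DUT's Router-1-facing interface in the
# symmetric case: only Network 1's space is a valid source.
ALLOWED = {"router1": [ipaddress.ip_network("10.0.0.0/15")]}

def sav_permits(iface, src):
    """Return True if src matches an allowed source prefix on iface."""
    addr = ipaddress.ip_address(src)
    return any(addr in pfx for pfx in ALLOWED[iface])

def run_trial(spoof_ratio, n=1000, seed=7):
    """Offer a spoof/legitimate mix and count SAV verdict errors."""
    rng = random.Random(seed)
    wrong = 0
    for _ in range(n):
        spoofed = rng.random() < spoof_ratio
        octet = rng.randrange(256)
        src = ("10.2.0.%d" if spoofed else "10.0.0.%d") % octet
        permitted = sav_permits("router1", src)
        # A permitted spoofed packet or a blocked legitimate packet
        # counts as a validation error.
        wrong += permitted if spoofed else (not permitted)
    return wrong  # 0 for a perfectly accurate SAV rule
```
]]></sourcecode>
          <t>A perfectly accurate DUT yields zero validation errors for any spoofing-to-legitimate ratio offered by the Tester.</t>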
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-agg-asyn"/> shows the test case of SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario. The test network environment of <xref target="intra-domain-agg-asyn"/> is the same as that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to Router 1 and Router 2 to emulate the functions of Network 1, so that the SAV accuracy of the DUT can be tested facing the directions of Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-asyn"/> to construct the test network environment. The Tester is connected to Router 1 and Router 2 and performs the functions of Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to Router 1. The ratio of spoofing traffic to legitimate traffic can vary, for example from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The expected results are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 and Router 2 for this test case.</t>
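          <t>The asymmetric case can be illustrated with a small model: the DUT forwards toward Network 1 only via Router 2, yet legitimate traffic sourced from 10.0.0.0/16 arrives via Router 1, so a strict per-interface reverse-path check would wrongly drop legitimate traffic. An accurate SAV rule therefore keeps the allow entry on both interfaces. The table and helper below are a hypothetical sketch under the figure's addressing, not a vendor data structure.</t>
          <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Network 1's legitimate source space in the asymmetric figure.
NETWORK1 = ipaddress.ip_network("10.0.0.0/16")

# An accurate SAV table lists Network 1's prefix on BOTH interfaces,
# even though the DUT's FIB reaches Network 1 only via Router 2.
SAV_TABLE = {"router1": [NETWORK1], "router2": [NETWORK1]}

def sav_permits(iface, src):
    """Return True if src matches an allowed source prefix on iface."""
    addr = ipaddress.ip_address(src)
    return any(addr in pfx for pfx in SAV_TABLE[iface])
```
]]></sourcecode>
          <t>With this rule, legitimate 10.0.0.0/16 traffic is permitted from either direction while spoofed 10.1.0.0/16 sources are blocked, matching the expected results above.</t>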
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, encompassing both protocol convergence performance and protocol message processing performance in response to route changes triggered by network failures or operator configurations. The protocol convergence performance is quantified by the protocol convergence time, which is the duration from the initiation of a routing change to the completion of the SAV rule update. The protocol message processing performance is characterized by the protocol message processing throughput, defined as the total size of protocol messages processed per second.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The protocol convergence process of the DUT, which updates SAV rules, is initiated when route changes occur. These route changes, which necessitate the updating of SAV rules, can result from network failures or operator configurations. Consequently, in <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates route changes to trigger the DUT's convergence process by adding or withdrawing prefixes.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol convergence performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol convergence time of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively withdraws a certain percentage of the overall prefixes supported by the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol convergence time is calculated according to the logs of the DUT about the beginning and completion of the protocol convergence.</t>
            </li>
          </ol>
          <t>Please note that withdrawing prefixes proportionally for IGP can be accomplished by proportionally shutting down interfaces. For instance, the Tester is connected to an emulated network topology where each interface links to an emulated device. Suppose the Tester connects to ten emulated devices through ten interfaces. Initially, these ten emulated devices advertise their prefixes to the DUT. To withdraw 10% of the prefixes, the Tester can randomly disable one interface connected to an emulated device. Similarly, to withdraw 20%, it can shut down two interfaces randomly, and this method applies to other proportions accordingly. This is merely a suggested approach, and alternative methods achieving the same objective are also acceptable.</t>
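          <t>The suggested proportional-shutdown approach can be sketched as below. The interface names and the assumption that each emulated device advertises an equal share of the prefixes are illustrative.</t>
          <sourcecode type="python"><![CDATA[
```python
import random

def interfaces_to_disable(interfaces, fraction, seed=None):
    """Pick a random subset of Tester interfaces to shut down so that
    roughly `fraction` of the advertised prefixes are withdrawn,
    assuming each emulated device advertises an equal prefix share."""
    rng = random.Random(seed)
    k = round(len(interfaces) * fraction)
    return rng.sample(interfaces, k)

# Ten emulated devices behind ten interfaces; withdrawing 20% of the
# prefixes means randomly disabling two interfaces.
ifaces = ["eth%d" % i for i in range(10)]
to_shut = interfaces_to_disable(ifaces, 0.2, seed=1)
```
]]></sourcecode>
          <t>Disabling the selected interfaces causes the attached emulated devices to stop advertising, which withdraws the corresponding share of prefixes.</t>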
          <t>The protocol convergence time, which is the duration required for the DUT to complete the protocol convergence process, should be measured from the moment the last hello message is received on the DUT from the emulated device connected by the disabled interface until the SAV rule generation on the DUT is finalized.
To accurately measure the protocol convergence time, the DUT's logs should record the timestamp of receiving the last hello message and the timestamp when the SAV rule update is completed. The protocol convergence time is then determined by calculating the difference between these two timestamps.</t>
          <t>It is important to note that if the emulated device sends a "goodbye hello" message during the process of shutting down the Tester's interface, using the reception time of this goodbye hello message instead of the last hello message would yield a more precise measurement, as recommended by <xref target="RFC4061"/>.</t>
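          <t>Computing the convergence time from the two logged timestamps can be sketched as follows. The log format and event names here are hypothetical, since actual DUT logs are implementation-specific.</t>
          <sourcecode type="python"><![CDATA[
```python
from datetime import datetime

def convergence_time(log_lines):
    """Protocol convergence time from two DUT log events: the last (or
    'goodbye') hello received, and completion of the SAV rule update.
    Parsing must be adapted to the DUT's real log format."""
    stamps = {}
    for line in log_lines:
        ts, _, event = line.partition(" ")
        stamps[event.strip()] = datetime.fromisoformat(ts)
    return (stamps["sav-rules-updated"]
            - stamps["last-hello-received"]).total_seconds()

logs = [
    "2025-03-15T10:00:00.000 last-hello-received",
    "2025-03-15T10:00:01.250 sav-rules-updated",
]
delta = convergence_time(logs)  # difference of the two timestamps
```
]]></sourcecode>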
          <t><strong>Protocol Message Processing Performance</strong>: The test of the protocol message processing performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This test measures the throughput at which the DUT processes protocol messages. The Tester can therefore vary the rate of sending protocol messages, for example from 10% to 100% of the overall link capacity between the Tester and the DUT. The DUT then records the total size of the processed protocol messages and the processing time.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol message processing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol message processing throughput of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol message processing throughput is calculated according to the logs of the DUT about the overall size of the protocol messages and the overall processing time.</t>
            </li>
          </ol>
          <t>To measure the protocol message processing throughput, the logs of the DUT record the overall size of the protocol messages and the overall processing time; the throughput is then calculated by dividing the overall size of the protocol messages by the overall processing time.</t>
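          <t>The throughput calculation itself is a simple division, sketched below with invented example numbers.</t>
          <sourcecode type="python"><![CDATA[
```python
def processing_throughput(total_message_bytes, processing_seconds):
    """Protocol message processing throughput: total size of the
    processed protocol messages divided by the processing time
    (bytes per second), as read from the DUT's logs."""
    return total_message_bytes / processing_seconds

# e.g. 50 MB of protocol messages processed in 4 s -> 12.5 MB/s
rate = processing_throughput(50_000_000, 4.0)
```
]]></sourcecode>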
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, encompassing both the data plane SAV table refreshing performance and the data plane forwarding performance. The data plane SAV table refreshing performance is quantified by the data plane SAV table refreshing rate, which indicates the speed at which the DUT updates its SAV table with newly implemented SAV rules. Concurrently, the data plane forwarding performance is measured by the data plane forwarding rate, which represents the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The assessment of the data plane SAV table refreshing performance utilizes the identical test configuration depicted in <xref target="intra-convg-perf"/>. This performance metric gauges the velocity at which a DUT refreshes its SAV table with new SAV rules. To this end, the Tester can modulate the transmission rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. This variation influences the proportion of updated SAV rules and, consequently, the proportion of entries in the SAV table. Subsequently, the DUT logs the total count of updated SAV table entries and the duration of the refreshing process.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane SAV table refreshing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane SAV table refreshing rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the data plane SAV table refreshing rate is calculated according to the logs of the DUT about the overall number of updated SAV table entries and the overall refreshing time.</t>
            </li>
          </ol>
          <t>To measure the data plane SAV table refreshing rate, the logs of the DUT record the overall number of updated SAV table entries and the overall refreshing time; the refreshing rate is then calculated by dividing the overall number of updated SAV table entries by the overall refreshing time.</t>
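          <t>The refreshing-rate calculation can be sketched as below; the per-load observations (entries updated, refreshing seconds) are invented for illustration.</t>
          <sourcecode type="python"><![CDATA[
```python
def refreshing_rate(entries_updated, refreshing_seconds):
    """Data plane SAV table refreshing rate: updated SAV table entries
    per second, from the counts recorded in the DUT's logs."""
    return entries_updated / refreshing_seconds

# Hypothetical sweep over message loads (fractions of link capacity),
# each paired with the DUT's logged (updated entries, seconds).
runs = {0.1: (1_000, 0.5), 0.5: (5_000, 2.0), 1.0: (10_000, 4.0)}
rates = {load: refreshing_rate(*obs) for load, obs in runs.items()}
```
]]></sourcecode>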
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of the data plane forwarding performance employs the same test setup illustrated in <xref target="intra-convg-perf"/>. The Tester is required to transmit a blend of spoofing and legitimate traffic at a rate equivalent to the total link capacity between the Tester and the DUT, while the DUT constructs a SAV table that utilizes the entire allocated storage space. The proportion of spoofing traffic to legitimate traffic can be adjusted across a range, for example, from 1:9 to 9:1. The DUT then records the aggregate size of the packets forwarded and the total duration of the forwarding activity.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane forwarding performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane forwarding rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends the data plane traffic including spoofing and legitimate traffic to the DUT at the rate of the overall link capacity between the Tester and the DUT. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>Finally, the data plane forwarding rate is calculated according to the logs of the DUT about the overall size of the forwarded traffic and the overall forwarding time.</t>
            </li>
          </ol>
          <t>To measure the data plane forwarding rate, the logs of the DUT record the overall size of the forwarded traffic and the overall forwarding time; the forwarding rate is then calculated by dividing the overall size of the forwarded traffic by the overall forwarding time.</t>
        </section>
      </section>
      <section anchor="inter-domain-sav">
        <name>Inter-domain SAV</name>
        <section anchor="sav-accuracy-1">
          <name>SAV Accuracy</name>
          <t><strong>Objective</strong>: Measure the accuracy of the DUT in processing legitimate and spoofing traffic across various inter-domain network scenarios, including SAV for customer-facing ASes and SAV for provider/peer-facing ASes. Accuracy is defined as the proportion of legitimate traffic improperly blocked by the DUT out of all legitimate traffic, together with the proportion of spoofing traffic improperly permitted by the DUT out of all spoofing traffic.</t>
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case of SAV for customer-facing ASes in inter-domain symmetric routing scenario. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, and AS 2 further propagates the routes for prefixes P1 and P6 to the DUT. Consequently, the DUT can learn the routes for prefixes P1 and P6 from both AS 1 and AS 2. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT, and the Tester is connected to AS 1 so that the SAV for customer-facing ASes of the DUT can be tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain symmetric routing scenario, a testbed can be built as shown in <xref target="inter-customer-syn"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic), and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic), to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, for example from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-syn"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
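          <t>A simplified model of the expected customer-facing SAV rule is sketched below. The mapping from neighbors to customer-cone prefixes and the concrete documentation addresses standing in for P1, P6, and P5 are invented for illustration; a real DUT derives such rules from its BGP routes.</t>
          <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Hypothetical mapping from each customer-facing neighbor to the
# prefixes its customer cone can legitimately source, following the
# figure: AS 1's prefixes are reachable both directly and via AS 2.
CONE_PREFIXES = {
    "AS1": ["192.0.2.0/25", "198.51.100.0/25"],  # P1, P6
    "AS2": ["192.0.2.0/25", "198.51.100.0/25"],  # P1, P6 via AS 1
    "AS5": ["203.0.113.0/24"],                   # P5
}

def permits(neighbor, src):
    """SAV check on a customer-facing interface: accept a source only
    if it falls inside the neighbor's customer-cone prefixes."""
    addr = ipaddress.ip_address(src)
    return any(addr in ipaddress.ip_network(p)
               for p in CONE_PREFIXES[neighbor])
```
]]></sourcecode>
          <t>Under this model, traffic sourced from P1 arriving via AS 2 is permitted, while traffic spoofing P5 from the same direction is blocked, matching the expected results above.</t>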
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><xref target="inter-customer-lpp"/> presents a test case of SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by the NO_EXPORT configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 and adds the NO_EXPORT community attribute to the BGP advertisement sent to AS 2, preventing AS 2 from further propagating the route for prefix P1 to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute to the BGP advertisement sent to the DUT, so the DUT does not propagate the route for prefix P6 to AS 3. Consequently, the DUT learns the route for prefix P1 only directly from AS 1 in this scenario. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT, and the Tester is connected to AS 1 so that the SAV for customer-facing ASes of the DUT can be tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT, a testbed can be built as shown in <xref target="inter-customer-lpp"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic), and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic), to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, for example from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-lpp"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-dsr"/> presents a test case of SAV for customer-facing ASes in the scenario of direct server return (DSR). In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to the anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. The anycast servers in AS 3 receive the requests and tunnel them to the edge servers in AS 1. Finally, the edge servers send the content to the users with source addresses in prefix P3, and the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2. The Tester sends the traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct server return (DSR):</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of DSR, a testbed can be built as shown in <xref target="inter-customer-dsr"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of DSR.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P3 as source addresses and P2 as destination addresses (legitimate traffic) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>Note that in <xref target="inter-customer-dsr"/>, to direct the return traffic from the edge server to the user along the path AS 1-&gt;DUT-&gt;AS 2, this document recommends configuring a static route that directs the traffic with source addresses in P3 and destination addresses in P2 to the DUT.</t>
          <t>The <strong>expected results</strong> are that the DUT can permit the legitimate traffic with source addresses in P3 from the direction of AS 1 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-dsr"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 as spoofed by the attacker, which is 
inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-reflect"/> depicts the test case of SAV for customer-facing ASes in the scenario of reflection attacks. In this test case, the reflection attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs the victim's IP address (P1) and sends requests to servers' IP addresses (P5) that are designed to respond to such requests. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-reflect"/> illustrate the commercial relationships between ASes. AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="inter-customer-reflect"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of reflection attacks.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P5 as destination addresses (spoofing traffic) to AS 5 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-reflect"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
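          <t>The pass/fail logic behind the expected results can be sketched as a prefix allow-list lookup per ingress interface. The following is a minimal, non-normative illustration; the interface names and the RFC 5737 documentation prefixes standing in for P1, P2, and P6 are hypothetical and not part of this document:</t>

```python
import ipaddress

# Hypothetical SAV allow-list for the DUT's customer-facing interfaces.
# Per the test topology, only P2 is legitimately sourced from the AS 2
# direction, while P1 and P6 are sourced from the AS 1 direction.
# RFC 5737 documentation prefixes stand in for P2, P1, and P6.
SAV_RULES = {
    "cust_to_AS2": ["192.0.2.0/24"],                       # P2
    "cust_to_AS1": ["198.51.100.0/24", "203.0.113.0/24"],  # P1, P6
}

def sav_check(ingress_if: str, src_ip: str) -> bool:
    """Return True if the packet passes SAV and is forwarded."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in ipaddress.ip_network(p) for p in SAV_RULES[ingress_if])

# Reflection-attack test traffic: the Tester spoofs P1, but the packet
# arrives on the interface facing AS 2, so the DUT should block it.
assert sav_check("cust_to_AS2", "198.51.100.9") is False  # spoofed: blocked
assert sav_check("cust_to_AS2", "192.0.2.7") is True      # legitimate: forwarded
```

          <t>Legitimate traffic with sources in P2 from the same direction must still be forwarded; blocking it would count against the DUT's SAV accuracy.</t>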
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' denotes the source prefix P5 as spoofed by the attacker, which is 
inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-direct"/> presents the test case of SAV for customer-facing ASes in the scenario of direct attacks. In this test case, the direct attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-direct"/> illustrate the commercial relationships between ASes.  AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5.  Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="inter-customer-direct"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of direct attacks.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P5 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P5 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-direct"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 as spoofed by the attacker, which is 
inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> depicts the test case of SAV for provider-facing ASes in the scenario of reflection attacks. In this test case, the attacker spoofs the victim's IP address (P1) and sends requests to servers' IP addresses (P2) that respond to such requests. The Tester performs the source address spoofing function as an attacker. The servers then send overwhelming responses back to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="reflection-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of reflection attacks.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P2 as destination addresses (spoofing traffic) to AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="reflection-attack-p"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes the source prefix P2 as spoofed by the attacker, which is 
inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="direct-attack-p"/> presents the test case of SAV for provider-facing ASes in the scenario of direct attacks. In this test case, the attacker spoofs another source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="direct-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="direct-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of direct attacks.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P2 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P2 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="direct-attack-p"/> can be set at AS 1 and AS 2 to evaluate its SAV accuracy according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
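          <t>For the provider/peer-facing test cases above, the check is inverted relative to the customer-facing allow-list: traffic arriving from AS 3 fails SAV when its source address falls inside the DUT's own customer cone. The following is a minimal, non-normative sketch; the example prefixes standing in for P1, P2, P6, and P5 are hypothetical and not part of this document:</t>

```python
import ipaddress

# Hypothetical customer cone of the DUT (prefixes of AS 1, AS 2, AS 5);
# documentation/benchmarking prefixes stand in for P1, P2, P6, and P5.
CUSTOMER_CONE = [
    "198.51.100.0/24",  # P1
    "192.0.2.0/24",     # P2
    "203.0.113.0/24",   # P6
    "198.18.0.0/24",    # P5
]

def provider_facing_sav(src_ip: str) -> bool:
    """SAV on the provider/peer-facing interface: block sources that
    belong to the DUT's customer cone, forward everything else."""
    addr = ipaddress.ip_address(src_ip)
    return not any(addr in ipaddress.ip_network(p) for p in CUSTOMER_CONE)

# The Tester, behind AS 3, spoofs P1 (reflection case) or P2 (direct case).
assert provider_facing_sav("198.51.100.9") is False  # spoofed P1: blocked
assert provider_facing_sav("192.0.2.7") is False     # spoofed P2: blocked
assert provider_facing_sav("172.16.5.5") is True     # outside the cone: forwarded
```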
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and measurements described in <xref target="intra-control-plane-sec"/> can be reused for testing the protocol convergence performance and the protocol message processing performance.</t>
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedure, and measurements described in <xref target="intra-data-plane-sec"/> can be reused for testing the data plane SAV table refreshing performance and the data plane forwarding performance.</t>
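          <t>As a non-normative illustration of the forwarding-performance measurement, the frame loss rate and throughput search can be computed in the style of RFC 2544; the DUT capacity below (64-byte frames at 100 Mbit/s) is a hypothetical stand-in for a real trial against the device:</t>

```python
# Frame loss rate in the spirit of RFC 2544: the percentage of offered
# frames that the DUT failed to forward.
def frame_loss_rate(offered: int, received: int) -> float:
    return 100.0 * (offered - received) / offered

# Stand-in for a real trial: a hypothetical DUT that forwards at most
# its capacity in frames per second and drops the rest.
def trial(rate_fps: int, capacity_fps: int = 148_809) -> int:
    return min(rate_fps, capacity_fps)

# Throughput: the highest offered rate with zero frame loss, found by
# binary search over candidate rates (as in iterative RFC 2544 testing).
def throughput(lo: int, hi: int) -> int:
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if trial(mid) == mid:   # zero loss at this rate
            lo = mid
        else:
            hi = mid - 1
    return lo

assert frame_loss_rate(1000, 900) == 10.0
assert throughput(0, 1_000_000) == 148_809  # the hypothetical capacity
```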
        </section>
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test has a reporting format that contains some global components common to all tests and some components specific to the individual test. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be reflected in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Device CPU load</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
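      <t>A non-normative sketch of assembling such a report and checking that every required parameter is present; the field names and example values are hypothetical:</t>

```python
# The seven configuration parameters that MUST appear in each report,
# keyed by hypothetical field names.
REQUIRED_FIELDS = [
    "hw_sw_versions", "cpu_load", "topology", "traffic_attributes",
    "system_config", "device_config", "sav_mechanism",
]

def make_report(**fields):
    """Build a report dict, rejecting reports with missing parameters."""
    missing = [f for f in REQUIRED_FIELDS if f not in fields]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    return {f: fields[f] for f in REQUIRED_FIELDS}

report = make_report(
    hw_sw_versions="example-router 1.2.3",
    cpu_load="15% average during test",
    topology="customer-facing reflection-attack testbed",
    traffic_attributes="64-byte IPv4 frames, spoofed sources in P1",
    system_config="physical, 8-core CPU, 32 GB RAM, 10 GbE interfaces",
    device_config="NO_EXPORT on P1/P6 toward providers",
    sav_mechanism="EFP-uRPF",
)
assert set(report) == set(REQUIRED_FIELDS)
```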
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
      </references>
    </references>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, and others for their valuable comments on this document.</t>
    </section>
  </back>
  <!-- ##markdown-source:
H4sIAAAAAAAAA+19eXPbSJLv/4rwd8DasWHJlmhLsj0zjtneUdvubsf64Fpy
99s37pgAwRKJMQhwcEitsTyfZT/L+2QvjzqBAgge8tViR7hFEJWVlZWVlfmr
I/f29m5sFWWYjv8WJlkqHgdlXokbW/E8pz+L8uD+/T/dP7ixFYXl46Aox8Gt
4MlURO+hWDWaxUURZ2l5MYeSz5+d/HBjK8xF+Dj4UaQiD5Pgr2+eDV8cPXn2
642t84l65cbWOIvScAZlxnl4Wu5FU5HujWbnk70iPEtFif/bG4k0ms7C/H2c
TvbuH2KxMi4TKPS99UvwUpTTbJwl2eQiOM3y4Dir8kgER+NxLooi+DlM4nFY
ApPA2miUi7PHwfHRz0TixlYSpsCVSJF4WAGh/PGNrb0gTovHwY2tIGAmX8TY
5BQfZDm8/3+nWTqZVGEaVWnwIhxleVhm+QX+HsXlBTIY/x14owdZlZY5PHsy
jdMQhFcVIjh58TTYFr9FYl4Gb/9rB6iq96hGLCdmYZw8DpIYZfOXf06iJBwN
xLgaRKmHw6chMBJrBk8KqH1ahcHbND4TeQFMXQVzZYayTf9Syura+XsRj2Lk
sPo8MqyS0UIRvgBWQNST4L/jz9LT/4jTJKpzeWMrzfIZqO+ZeIzvvvnhyeEf
7j9Qf//R+vvg4QP994P7j/bxbxjH6alDIIaqw71xBnWme/OCngVBGeYTAcN7
Wpbw7N49GDAhvBe9F/kgFuXpAIRxD8bsPR6u+EiNVJdgno0SMdsDg1KKmUjL
e5I+j9vWoQlsBc8tQsErUZ5n+fsi+DGcB0dpmFwUcbEbDJl+cKzo7wZguYI3
4h9VnNODIuAagS5UeHD/4IFstcg33GqL4Hqt1oSuoNWma8I8mm64t5FkXIqo
rHLhNtnpy/b2b4MhfvXsZCc4sigt7sDVm3JeeTuwsyGmezbREDA8e3tBOCqQ
yxIH+Mk0LgJgscKODMbiNE5FEcz0rBbDN5zX7OkwKKcimIucxnYKPGWnQcHc
hZK7M5e7HaAYTcM0LmbFACdA63sAU3ZQlXES/1OMwagHE5q8S0Hv5VUCHMDT
OcydyGKtomKeZcDzhJVyGp4JYFWAcs/mCSkn0DyPy2kAnF4AW3mcVdBgUcST
tMAhkOVjkVMF3CCqFZ5jG6Msh0rmWTrGVheRSLE8tMCVWhm+Rx6hQDiH0RhG
U5RIlKVFDLSxaEhUx+IsBtahrhG8GowS0JRglP22y3JXkp1ZLgUwEiLrKfI4
DcsA6g0naVaUcYSE+H0jWpcxYOYMWCiQBhGlrpyJsKjyln4Uv8VAG1kGcabi
nMWhhEk9WgxQcVCRZvF4nAj8dosGXTauIvZ1bmwdL1AI4BP7ID6NozAtkwus
JMvBFSx79PYgeD0XNBmy+hTVZCKKkvVnLOZJdhGM49NTkSORmr59+CDnsY8f
+e8/8t+jsAACGXV9nEsyJMeULSN4amdxnqVk8wbQYuQrxjbtGrnZNoqEGDuD
2GWF9FUacNZGahZQBOc1A99pKsIxUQmjqIIxexFUKeqrUmStk9CS2uRKratN
PR8/Dlzr6GMQTElg26QGbfyxSV0+xe6AFs2zQo7mCjSQNI0HHSoZ6tVScgI6
RZacMR0lL9R24Rome+iwLRvzWLZHxXmcJMFUJHMpbVQiMjsljpMM+GdhgwUC
1kNkGKtFluzBcg7umqMqUBeP8AIUJQQWqQg2jWoMkyLjakGzx7LSErTWNwyb
o46GrqqOahk07bce8GRDwUS0GPJQCcyRs80ECDcDOUxjcWZ6L4cmnlZpxAoK
Huiua9lm4QUMxTmOY7a3bjfWzZOykMVC+5iLSZiDqQETIHtDKhMpjiMn1gqU
a9GuAmBysI5K6qjWIZcRiuewC+XY20WOYVCCcCOnx3jygR/BBCbBHMI6QU/Q
H5Bf4dVzaALK3JXyVBSKXcmV/J1tUQg2Ih+f46jKswpGCIp8hF+ha8FIFCI/
44dgmsoKnsyw16DG7Z9f7mCUUTKH0JZ/EIswvMCAmB/OpxDkwTwL02Ho9kS9
v+KCBikYoLHURhjkwG4G8gcyWXAurTfPMMIhdrtwdByqOhc4KqDZ2WyODfIM
A2XmDBnW+lu3gh8zaCsK+TiCYcxDocMagOwSVxfQJYNmRKAEBVn98yzIRn8H
mwfRCjrpN7buBEcF/IrxZXD7NgtKqTQbJskv9ltR3r5NNhuaXYAtAfaxhXtK
CY37wKRfOtMwKU88qmiooz9VjfaKC5jS2PrhK0g7BJEhB/KnmlC3yca8T7Pz
FAV7+/YsjvLMQBm3b+8MGpIKwpjrsPuhbuyAoxi8IDAuqGKyK2DWG0wGu0ZD
QSeK7LS0tBWnSWlxi2l2HoXFwn62zD4qWRhMoD9SNSbZLeHpWEl0lz08kpHq
QKTKZllJmHyzeR4rz9KYvVjZ8TphpWtOrAOROoTkE61y78VFAKXGRXDz5dvj
k5u7/P/g1Wv6+82z/377/M2zp/j38U9HL17oP7bkG8c/vX774qn5y5R88vrl
y2evnnJheBo4j7Zuvjz6n5tse26+Hp48f/3q6MXNprGjPiWbSlMsuFboKIXF
FtjQCHSOB8X3T4b/73/3H8Cs/m8Yzu/v/4mmePzyx/0/oH+EEx7XlqXgr/FX
ENzFFohWhDk5q9DRUTiPwTSBeoAOYq+nMO3lYrC1deevKJlfHwd/HkXz/Qff
yQfYYOehkpnzkGTWfNIozEL0PPJUo6XpPK9J2uX36H+c70ru1sM//ydammBv
/4//+d0WO8cnIp/FKVkifPB8hqoIluL7JIveP6YZy3KQweWqkrJgh59GC4aU
8IDUPBET8DlnqMaudyzYFx4hTUHTIlUCXTWuSAPiVDs2OrQaOPwMkc9ySYbI
KUcD5+NmThTL5fjBb0/kbDrE6ZM5wsfuJEseBE6dMNqBesT1xmmUVDTVTkLg
NlfhDBi4WZVCvFFK12cvF0lIvCmcCvwHxcBTnLxrtVsTegF+o4z6TJyK8qCQ
rZiLCEMbLgHhdsVBObIhBYtuIoaKwG02I7+AhMoRTEkKQ04mtkDGpyBJtLHj
uIjA5AZxOQiQ258gHNw7DSMk8obM7uPgKHU9bDbHKCdwMo6O5awf4+wLwWUk
QyfwNoCWtoPb8UAM0LdIwgvwzQ/UDzyLPKlABjN4vpmqa6UiSV0zAx6KDpGR
JIksz8osAnWocXrocspDEBpmofXStNPjY1FWc2XPySUv8AkatIlcSYjZT0ni
kCIIUHsKHRH2xKAKCz7lGeotBWhEd/vp25Mdj5DxR/QT0JRS96o2qviBZuOU
1YYarF4oIbCSwACH22y8P3xArnF23StEpBiS1SBkEIBkT1EfJbiS5Q7SosnL
t6iBOsaU4xodWoFBLhsFM85tFEXMKhxSxLSK1lFnLaslKyHPAHhTFpQaPc3Q
pVQ1UxyZE+ZAfr/kjpx+KpsKMa6BRlix0wwyxjAe+T0dcC32zVEh/qU/DKfd
/Vfr5y6/ccldL8HU4JmBDIJL9QZR2nM+d+mh8wb/qz+X6g0o+Z3zBmiZ/QZS
u7F1uYAO0zJv+fkxb3W13Ly14OO85a2x8ZaXeyMJ+aHHUt/xtz9rqixRPy37
tzo3bt9/eBzcoiCN7ALBs/9x09iOwc2PqC08DOkd8JfA6a0QaC3lNEGqB8P6
NJ5UuRPTt5mOQfALTyi6uAeJ2mUKoAMyfiQvz1gcGgRsZmzzkgscLSGNNVFe
0Pgiy2ejnNqGYGQPf4WjJC6mUGoM7nBUKruGTdEuOHKCAz5OwZ5MwVJPpswS
RLRxmF/oQI7ogzHBeVFG9Ra9EbRUMMIyw6E+jqOS5urRBb3qAtIUP9sxXtMi
/lDlKAQEeFhksmnTkLsH3NZwFCOqwQAkWddglFkeDhqepiEjsweTeqXMj40Z
qJahVHAuFAm8mlpi3lXmDHuP4jdjypS9K9mI8UOYQCgo7WfFrPoHuHYQM3aT
XBi1AeHmHOCQhVQADkwzEyRmAz4xK48JLcDdJ60B0cP0aKJzZf5O1HyFjZRa
/sQZAmrahUZEFGHZEjs36t+YAWvocXqaVIjLFBIjBP+rkLF0HYUCBgposMCC
u01lko5kQKFJnJ4R8DjPChKd8kCQO+gXNVOBz83YuFJ5EEiNbYRL6CfgVqtL
XBaNOefEcnAcc1FwUMtjuEs0qErA2QXrluuiKu0ak3uO7jTUg2Kv5gx9F6JW
q5qowYEncPpMsI+KOmmmeuXvch/X+BYzBZBKxIP8iAk19yUMyIwwLJ7T0acA
psDnolrSTFYBXj+vUKDTzNXsgrLCF4KtppYTrJCyqOahqudgnsAPHbHTIt1T
5brX3FApWylJsJsJ9PipNEFVgatlOHwSbug0nhfach0ds2/EJA22rboLZS9+
K0kIvyCYHGqwqdfQ3nU7XxqLuXILpZ1UBhxEgZMDjAJc+oOg6J/8juw5HPuS
As1YTJz6j+Ao5cqBCJoW0OgZhCYujFWLCQkTYCWDPiHone1eguoVSm9XWRNl
Csi8s2khTR5a9u15OsaIDihpOLyQU0hCoSHiM7ZBjHWBYPu/hs+LHWUicDJW
GFtzdgHrCc+xABuGkVAw51gtFzK0zmXMwoyDsjSnpSMw+SxsP+3TPJtxX1AQ
jrKZV7aRVCZ3qJ1l/NGFGAptZZ2XPHOZDs48MILUKDJ9UZ6B4pP9h0ftKhFz
B+fiNBGyd1umxwXtYGiipSFGPevN8OIP7Q2pE1qrGRyaPrHWDE7imcYNNZKh
XrQXF0CY2OMwjRUEOkrMNM60CeUWKkZcWKRBkmbtaj62PRSJWKDh0bMNTJH4
8hQBvXTXCrRoFCJPYOnmha2WI+h6mhVBBC4dhVtT3CxKazLGqiU7co0BdJ8E
LAenUN6gb7XFWZCTLW1I/SWYmxB4GPJkjlyd8MwMo8dSIn57Jt+em7dL/bYa
i1IK5rkBnfRiusTFTmmjVQ+0SS43u/3nuISORDxd7Xf1pDgMhkVFT2hV8o04
hcZMCbIBwkoY1nyD75b0bm7eJe8Tvotcr0jQI/CCWBdDGlDcrQV5NoYORQdq
OwFpntPI+riyeOndwB/MJNnSLGsaJc6dfq1Va71r9Tj2a63LrTKOzWjrvp5N
c3eZnlB88+GWM3lIEbi7nY5+5sc8UR5J+0QrTndeq8WRO3ceB8/siAWqvV0Y
awaEYASPE6zaMu3YrIallQZ0TDs9hdd5soMdkdIyE8vPs7eEh47C/uBvG5CE
8rQlCrdQSZfO/EK7JCYTXCpGSnvs19Xfk/aGXS21qvmPCoIIdoLk1KAdH88U
KXKIvAV4/TiTyFnSmlHk2CgzXJ6F4AEiJT8hYDnJ0kmh9kd4nC1Z4wwcmfC9
wFUXmKiyc7dC9j7PRFvNdXIM+bPdOc2QHu2fcheNNL4og+MEuyyy3BqCLtiN
5+U1aypErvD3uKaaDUitA1Ra9LnrBZxasbcuIMr9WDzdbXmlo/SlhMGD/Tra
5Jb+4fn3aPsVdLdk3U+xmerzCoKIv02zOfx57x3z0Pqh0vv3B/Tfvf2HgRbW
viq4qHT7r/jPu3urlLbAt5VkrjHQTplfRd19ZW59Tix0ulAytx+2lJax1POh
Bb3L0jYu7/7u1g1j0+58Wbr21Kl7vRHaLhSlLf53amCsB8qtEQps9DeoQbu1
d7etxu60v9vGg2kgw8LO1jiNOBQXqYKJVWzZNqvhdOtYyuJiNhMlbt9SDrXe
kMA48507iuYTiyYuA6rhjDN8bdeezdrHj7Qwr4DPQtgx8Ab4NIAAjkvp57A/
SGCNquJeE7VRW6kQByZHU9tTRD6jSCGfyhEYWBYsbtDWvFtw2dHxrlOBZFO5
bUWAOxYg8jqNf0OKlrY4kDh6d2q1y7CATI7BFSrjQqJ7kpDHz6cKNdqrp+kW
lFlSMP4j7UCVoWnhYtoMv1HrZuA40E9xOsqq1MA8bDPkVi57/CP0YuI7Ao5P
ZE8mIszTQi/BCrlVFxtoU9CFmR3aY5YIh0EQHkS2Y4QzXLZ4S0F9K0Gc+uqx
O49jENkhcjd2h4ba6wv2eqzTZGlKFQcka17FwSEDX7K53KuLRatCSDgLj97w
iottjC0sYZol44LOujHqxGv8inKhdsdpOPjOHYo6IPQH3x0rRmgLvUAB3pvj
im1wENMWuP1B8EOcF+Wus8pLrOq9Cdbyk9bm5s4OvZ1yczaG+EBkTa58jao4
Kc2uI8LbumxgY5m9bY2tvp7urOdrKAl6V5s6cq7lygn1pzYRvGfjgEjyBiqN
6ZuNKzZNbQAlYE/AOldNBxYIt2qfMqi6Q+xGa91HNkV1V6GHX0WBmT3IwqI5
HLeb9mmHtbtO5qCTTD002XGGNJ694IA1ueAeaA+RvLi0tRCiFgrINO0//hMW
+NPjfWuEid/m3KVytxMMNN4NKfc86RVWDPm8YKHSANym40dFuXozW/D6r21V
uLtsH2MND+xfbWFS7zhp3Uipm4AOlg4W+s+94iX5kh4wNoHukAlcUfyn5SUV
Ne2zMj9yoiYd8CAFM3Ccl2phF/ymG27KU+zk1GJe8gixEbd0doeHQCNs6oxd
r4ID/qge4C/eV5oEnHjJInCvPZJSBHxx1DtTvjWWqnHgRE2PFIV79ed1DtYe
zu0CIh4+R0hVj6iuJqQK14mpwoVBVZurEG4sXurgYY2AaWMBkONsuLPvgtio
X2jk+EdIq/vAjHKeNhzmaOtKYQ7tZLdjHd0PPcOlR/LkhE1KydDETfRbd/Ck
+WoJnrRpbqUl9724QdKjmsthu51NB1MFriUdh6BKoMVMVNjROLiv3wtErgtD
ysdTgxFVftetuaXsvrdsQ9eWiSRp0a4hpEWRZOfgXRRLfoIoroO7x8GXEMd1
277VA7nwqiI5Myg2H9btWmbWqgsHRXuI1zWDrRjj6eHVMzjzRnj3O4n4AsXP
FePVI7wvI77r6ye1OGuX2k3bVpDoTpsDtgxd9SHv/BLm4Hbjuhjnv7Q8aekg
K2Bg/UXA9v32uv4W1hYuZ3WsZl3tcpYMzC7Z0LK7p7o36A6Je9b9zS0hWlqx
fOm+y7ZXUfcKS4hukPuFLyFuvL/9YWTf0vzI1tnAK/v20s1w9wo4X88qmi+e
iDpWe2g8AXXL/pp11ijrJNuXKG3G2g7QaKdbuaG8w7qN75Zt3U321W5sHW0a
9xTrbuyxn8V5Tpd4oP8A6t+50mCdUMftyGO6K0adc7EcQzzPr89BNHf4D4Kf
hDq9Ik9u4DEbdhQdh9VeIbUql55CEVNwoypWL0LtuMnehK3iNxFVKP26dEHu
tJM7m+t7XlyEQjJsvb9GFLS+TqKuXWEUtD6Dq8RAtdHyKReznAX4L2c5S2MB
S65m9Yx0uteyvqRI51OuZtm6cB3wXHHA8691Y55g3bCnm0CP6KObgBP8dBPw
r/bVOHBfOnAIfGFLgiaQ61gSNC/5FuQsLfE/aAixLv1aGNS5QnsVHEgB/K6W
BNsVnSisNBr7RhqtBPqGSZ0E+kRKG2rC2kbVfOkKmXyLkH3dv6XXIJ1anTVI
swvNWojcBB86/NE1tMQ/8lBYEc6Eug+lC6G3j4jyvZyRUDTU4enaWXlftbg6
I3JhR0CO++oEQdp0WZEU8YsLoNYFdO52Utc5VlcE2veauO86C7aeQImu2sNr
PPFO2EVBUnD1UVJH738hYdKG14rqo+grCJTaFoiuYn3oOmj6GoKmL/Kk1GJn
f4VlDsufv17maHyulzm+1GWOb++k1JVu6wsnE9/6w8ITvOusRBy1Em9fk5B8
gt8wFvM4KhcvRSxuQu9FiYYz4rgq+pLCVpbx1p4x3qYT8XU6LYsVLpJcd3s8
Swx4TVRsecT62noCu7ShJD8jFyN94ZFycdpuyApxYyBOBHLFh68Zk6GB3la3
uo+8Se26yuMxn2dBwejNZlzkpbdQfQ0rCfuf7VyMf2vut7JpqnW4f+2HZDaK
iPv8ZBTwwTUifo2INwh8tYj4Zzoks4QzfakXIbUC77T4x0u40it70j5Yenln
Z2mAWlW9GJveLDNNX9jySTwOrsXnIty6hhb2d3C0WaihxcbXsV1j5y7qNndY
Sqh9arTr/aRucdd5gyv0izfK6Kou8kZhZL8Kbd5h/pSI8hd54uD36j07nd3m
SuN9cU7CDeeK1Q9yvpHn9fbogJ6+gq52r9xLmfsJOepzVWHtSji6b9t7JaaT
PQql0nF5o3vfa8BJHAs6VcbHy9RNl6Dck4nI+UI1NXZPwzihewHxag6ZIq5x
JzL27kI+vXfLtV74uWuuLaX+1LfEqx6OU+h8vVQa9rpvU80vtTs3uy+/rLWh
cWlwuaC8uTJxVydhk6thfDtdAZTk/ZkOkUKncSG1x6t8s3Tc6964uz4X6q7f
t6KA7xlP0mNzSfkl/4oOmy+XwHdWKNbmpS7Jhd+xQ8WY7GEnOKkGGO+Td1B2
K5682pKmH4WCem+ktca5DX4aDlr3YWtmTD7RRWx1jRp56bptGHg0qAtFrZtD
KREdDQXMb6BusDWjOkMvQuXWc35SRGECRlUtlZNGlWATrBtq4WW+m5TuXaZB
uJSFcK+Zt3wJW7jOxElpDtCIJxfti80KeC3qpixT1szaTe4TMYxfzFuaUl4W
dIHHeXjO3cfnfFfwJFvtmtX7ixzDTtvoKMYSfpujyCu7bJ290vTCZEHKBcx+
hpazTk/AJ6opWVQkcow5UFYRXoQ60a1V15Hr92VyBucCUOOF7N//993gAP8Z
DAa78PX+vw98flq7lOmUfRJJwwjeeCbvwM3kTp1J4eD1I5XngK6BTtX1Zc0p
yFcl8TZMBIaK8pr/sPRqpHXdN7aCMfIfh6r7kU/MzVRMWTC1t4tpVZKWjlk5
QNi0xDDAO4OtTJjuSKxnpxJqumhsheItSTKZlyROqyZFvajKq3lc0cEJB3G2
L2QQjTI6sQT9aLfhOVlC1beUTNRT2lxKV9LZfvtMv9JjTPmqpI+6ZPqO320g
5Dn0dTZL6EALXfOcpcISQasEtRjiWZyEObFuVU0aLO9Pxr7jfsP8nKbdum7r
VjWZ4zqcgypwy/hiNTuNlFZp9v1xJsGCOY7R0MojrTJ5y3uFExygfMsu11LI
5LjK+BGeYLJPoptPyVzwxsI53YGtrerS/p/OzqLTBfE1w3KUiQ7ryyZ/F+1i
lYz9iQVmGWcxx/Edgj2ciiTJtFdH2WEiAY1SubEZmlCla51qdbq0UFI3xpZi
VOARJ65nKmNG60p0ub3vFA0X+p0gP0pJzAABdNfMijQ6ZGomQ7JeUhLQJtAD
ngTgNTABszldYU9tVb3qkYfaxmhKqUTQdTdb54zDfJsdro8yvOWU0pyWnPqP
5KdssUGB9N5J684JHNQwODRHMqmjStpisqobIxufejsPl7zwCpmbkywbjy4E
t/2mbvzYyhtvHDbXvhobcbswfb4rw3f8FWVMFydaMzuw6tRp9M/snmzpkXPq
0ItYwL8ygTen/3HcYNrxib0+m3EGYxAvpfB7cP/R/sePg5qP7MkfUHOVNQRZ
n+EWxFOUb1ibDMuX7vRdlovZnJvte2Y54KRX+n7VRmRGHMBMUM+fpcAQ7lrU
ezRTqElOSKDI1DETmGborhcz3SinhzKIYVauCHNy9biv07rWhoe3lLOMNS2t
pSkJ49Bm/CmhBS0g0NB1/eHuzlrKLe7uwq/cR2br41U+Owjq6TMvoz5ruNDd
PbKyP61aUVNej7K6cYJHcTP/TLkAtvGxZ4+qjTBoTgWsJM8Rep5nsca5+/Ek
/ZIumd1yU4v4wFC8qqwbCXUybCxM++HDQWsFvWlZ6rBorUxLblMau8uQ9iKZ
fVLGaKdWJkKRNpnyserUMUrBupPHYJo/lS0ESpuc0Qi1gFuYWxn9FidGJL9f
+sLN9tRyxahW1FIy1cBMk/0Hy7pZOmpo5p07i3LzeNwNK2uJVJxlurACpxs4
Zc6tbYGikZaUdzqqjGV+ZyQuah4HrRNNwmoiawCTnpHprSUIktz1SBFEASm5
hjAxNHyOWTY2i6zAYVrM4oJSPpIT4gOXdzFq5KSWLb4Hd+iyngdwiIkgWXiN
TJR2ojJWcEt3kRTdxljLSOkWgx/yWF1CbnU1IgmjWkm6ig8Nt2lPlFWsMnb1
LHVFWZsOpQNSILYusaFc0RtaQlP7eES9slVd+0OfzB/q1R9re0NpNRuJvJ8m
qzIWE20+Ub95rK9HtAEmd31TeV+5tnlFffiq+UY+2Tkzl5V0zTNfqV3fxpws
npY5Vaw/QjYLUZ1Tkz0K7czGcpKAQR9Am9Oxs+LfknUjxLdJyEgI2iMYSllt
srAzYdDiv7IsCLqY3iCIxpmucbImSJHOykJzijLLcZAXUK9ZV+tKidm+lQER
9PHfK8Y9ZT5MmilBD9GMi99CdLt2G7sc9PlXgq7ssaA26tSi74aDpNE0EmV9
8rFUhIwiCHj92cevd0vOOfVUgt/MTNNMZ2jtJVo0WOyVytLgQeviOle0P6dj
Cqt370bDeKP99iYf+207+eTiKasRqqwSui/NlHeK6pZbn3C9yUdtSvLJRuXB
NDm2++fBtPcr+fZgWrCob3roSImpcsN7k3+bRIpmeNUvVVZ7GymRONakXgCG
QIwivwdRtPNWY7fNJ8v3bAM5nyIx81eRSjJoMNXY8d9Vmn44Og4Ot4eH/ttT
uuu+e++drPXeu7tL1h04Zy683HWWto8o+Ip3HxR2vnmKd5W+F2w/ORjudBRv
lvbsu7/bVrxZWn4Frd0ePtixHzWLe+rG3oH66F/+v+mpd4tKa2Gp504Jt7gp
PTz4K+jVwa9a0n5+nS/NunU3meetxRulTTdd2iWGD5Gzh7+6X7B0y1iyqbbU
jqXVWDrYHh5YY+kyGO5jHfu/thav1W2PJSj9SJf2F2+022IraPnBfNGldT2G
Lbe0r7jV3/u14g0tahZ3OefOUmzR13tDfOT8YL402t1uCj2/NEpfUuftbw/3
d0EWO1YDVMc+3B6qK6Y2XDd95PmrLnsZNKXmMNmv9MavEeBKmyeeGjavUZxL
WrcFNA4t+WgYfvSWVnBS+iRGdfydOHUdpxWSotr09ObWGi8Qd2lEP/QfXVqX
Pd+VWqjM9O8B/XuooVt2q1GfbXCgKxpckDooBWIDqs/NDoQtxPpdeIKeNN6z
0Qv23hp0Di3GG7/TYha+5IZ2xJTemKZSpcIfQz62MKS8LE0mm5mJCH+gF0+r
vFR7vsKJXm/irbLsO1PyF6cKzdGTBg6vjmhQghovKYdfijO5XSyNg9Yb1Swf
eh6CfNTmru70NaBqsjb7CKb7+wPsAWRi7zvkgP59YNSk40IrzblaZWjVf/uq
Nm6aGG8g0c2KY+yT5LhZkbf+gFDTNG3mjJnuVetglqLVhG48iWxki1Wc2sd6
daayWfFcmUGp3CNhOCA8J8FoSD7An/xDZcHlY8FZHFrGx63Rf2/Dwhq78p3q
+r7tW8+cU2psslsOpb3SuxUZ0IrMDTPW0lfLyJEjrRAlQpGORaYDumpvhlqD
1tBPA9zTpgwzjCUMroyyMwm6Lz7gdx4nSSBOT1WHSjleAxqBxyG+BjR8ha8B
DX/xa0Dj8wEaTUQCSr96/bdn/2f4+s1JW3EDaNQRiaUADVPPJgGNneYP14DG
NaBh+PECGsl8vhKg0ZVQNArp3unRhdF0+wKWWvWbwjCW5MjdNHgNc6wCc9j4
BsQJLAJbwrNZlfImSuiXkZXp9/sfh4YouXGF3JrC4oZaznDXCHU2etnogNfR
EX0ixwOP2BGhfVJPNmczvNqYDrjPyE9sjn+lWdmDVYUSHbbhN/6MyW5rDXAT
1/J/fQkAjh4xXzV+s6R9+fygzpIMr4b0sAH/RpGeVa8QuoZ6rqGexVAPD53f
NdRjPOU1EB/jrC906JsOcg8iR+lFhOd2j0UOTsBdLtOFA/XjBCJXBTD05cTz
UUG/FxPqJjI8xCD28FcHWdIPl+RkLYBJ07ChouVwprae7QM3mZ/oXws48qI4
izixASQHPuoGnyQkwfOV6pTLZsnFGFQdTvK1og2Kanw6IakFiJShITvWRqYM
tGT/bYi0jlUHoPL+jUTeFvZYreFMLtLkp+LlRPaqIfLIC1e9axLBjwd1qlsP
H5WaYJvgU+dunDp2Vf/YCNgiKu1d/M50sR7ENhZl/d1GpN04e35pI3JpcKlD
wqaWQqY2yQl91sOpHNavGq4yc3FXFfhPWzaPHhiW06AuKEu/uP1sPBFy7m3N
lOqv+sbWLyZ1WjsyMzxUd9aEcqaXgTX6YfALBeAaAKCgiWASdaXR9z8OBwtx
tnGR98XZyHFVgRr4kOyIgoeIIgD3razyNNh+evxmpw1Vg8rWQNWWqP8aNWtB
zX7BLT0QYuckUaqaEjLggS5orD4BqTTOvZF81z5MQEd9EKFhVIXxFIJWIPrk
g9WSCPeQqvFQ3XzEqJGqmNisIOqmnekzxYjAUeYS2AfyTnTrvMP5JaZ8TaqF
h3GjW9EjhVkdynBRILG2tu6bth44yEEzwG6Hqw4XwFWgJkmmLj0JuTedijeP
K/UfYZ8FRaqzB4ysBg6xHfpGwaGahFYDhA7b4ZmD5QEhsjOIzigpuOhEaxfR
BXZSCdlYkAY2IBHLANjDXQMQntEjD6dlUUWhu77CqpBqAZINihIaGElg2+Vk
E8O7rhXLgUndaFEXX61IUntSmBWRJB5nXz+S5Hp0m/Rifb5zb3yp7nN2OOIL
AaZliKmIpXPD0RKc1Xa2rEEMPxIY8ANPSxJzEuysScxCorzUliHmIFItsNYi
Yi0q4Ye3FhFrQ6i8MNdCztqQKh/c5SHWE7HywF5tnPVArprwVxsx9elCsBow
2AJi3UiWs18Ko3EHJ2i3Dn5EywGJCB9QAfPl3qW8zt3Drf3cTw2JbR+VJd5n
kO9cYgllauo4mVOJlxpztj3cv02bx/58yTfQ+zmzdnd5qXXKTBouLTN7s5eP
WqfS1jE4Z++Xh1qLavTA4jzUWog194X5VLBObbHS+rA5B5Fz9ol1EOua3fzI
mIfYzzF4BTOCZht7yOpLLC5Wd3WcretrgPbr1Ezo5OAVJ+wUmj0a8nR4KMed
ARzitIjHhMzc2CJnLcsboc+BRpv4UmgMZhaDTlB3Qp70asCTLE4+NXFdtAFO
8k1fmtlVUSdP5W1bWRqvoqxdp9zyPcP3uIspwZucZSZbmWxAoToge7oxT6h7
FVSPEQ15PRwpMZQy6ePQBnLKH472bKhHgia33ddBtcltRv8ZAph4knJ/c8IX
+pMWxRUlJz52Eiy1NVYlXpJwmWoJEwrzHJOe+QIK05/m0iIJ9kD0lkdxiLcr
JRydTON5oS9CIcVkr5VbXViXOdC1D8494CbSdq8WCulSIU9BO2pXhbG+8ThW
N+YbqM9b+hPgOU3d/SJwnCZbq8E6Rjs2Bu0cfLHQjs8ObXjrz8NlN+IQFF3H
eTa2R6Zr599yu2PWgDSMjl3DGo6r0eEE8eca1liZGH6uYY1+xK5hjf6cXcMa
Xy2s8fAa1nB/8jTvGtYIumEN582vCdZ4uAjWePg5YA21PrjWdpoFiAa/Ze+i
WRvRqFXchmY4ry2JZEhPdRk0I6zTJ92kNVV1MWsZ5hNRdgMfu3TFJVSTzOgY
FLyuwkB4h2r4hPiF7r1r+GL17ShSUfHuqS8AvHCZWnE/ilKL3wFuUbc2q2AW
HYeH9lfALPY/B2bx8BOe6NH69Y1hFv5P792+7oc9nB57f5vFjHe/VLHArE/2
LbZi26wq771btsRm9m/L01T9UKB3TnBbY6rXAaSVTlS1SqD9wRJk1j2atTFu
NnXGa61TXh4qK90g5KGy+kkvhTjwJaxL3ym0mdNe7bDTUue9Vj/xtckzXxs+
9bWZc18rnvxa++xXfRvBWqe/Nnj+qxdM46PTEPGGzoD1gWaadLo6fAPnwNY+
f7UYivEDMRviZkNz+Cb3lBz6wBdzgqkTfDELoHtcz17jxiAVpK68ocTcfjxU
ySGGteQQfAWyh5k+m07WYLANprmyXSEHclfIJ9oFos7zlHhiiQ71OEAOc4HB
1AjhKBmVcEN3If6YhhWDDB2Ijw3U+DtQJ6ZcBqXhU1c+pAV6HHO25JgiWTjn
vOxEHw6u0oBwOtCfDYA4weoozuq6fKU4zups9cRy/JqzMSTn8ItFcj7BDpSu
s0ZtaM6BugfGPfqoFHsxNHEV21AO14Z0/Gr2TQM6S4IdSwI4SwI3GrA56AXY
rALU9AdoNgPM9ABkVgdi1gRg1kRM1oc41oM21oM01oMyVoMw1oQu1oQslocq
NgFRbAiaWA+SWBKKWBOCUONqRehhA5DDulDDmhDD5qCFtSCFFaGEDUEInxI6
QMjgYBFkcPBJIAP2ytaGC1p2atSog3eGUQJ6euqqk1XD/547NBpbKFKWRmMj
xcGVbaRoxNVNoawUU3/VQfXVx9SugnwR8fRK+yKa2vI7iKM3sSPiYKM7Iva7
Q+gr3hlxcJVhdFPDvs4QGnMwP8nSMs+SYEhZoocm7brqKJIRNKya7xrmWENl
1uuCmg+TMFsHK4M6Ut6j/NN7hYhATvXk70CwzKIENRwmBhhYOJ0bHvhKW3oH
WJxB94YToZI+19LE6xY9xaTXm28O5tLubIuVbBu7uAxHCR3fBIrTGq88cy5K
fM8tCt4IytoMv/2AP5T49FkIfg01ZYq4N86H8h0qXLJOYA+E4PrAEJmJYJJk
I5jf6OK3Md4/H9F8qQrCJDrPUtwNytKgMnFKacGrMLF+D/TBTrwgOZZ2z3q1
NFD+aZbAvMRXX+XhTJQIxyvJuWkKdB7tmYimYRoXM+wn5K0IXr49PsEBJgEs
TE1t3XzHjZC4Mz544tAd6pof4xsBQtT02liAoyJAhPn4HJvDzT4t6QsuHODo
5yJgyJ/y20+Gb8E0hGN+DjZW4R5lNs+SbHLBPzyQdejLmdXN+5IgzP/HF2CP
ZzUhbIvBZACqOb0oqINAVDA9lSjWGXR6jBtegYVdkNIsw6ulI3iK6cUxTzff
xV8Q3V2+MP0Ud9BG4Rwm2vJih+t+pBvjrbtxMfiuCXgkhT8M3J6Sqd6D50ev
juiqf5zdpfX8cAuffuQBCEZX35eEqptmXCYkCy0nrlvBsQD7iAkLGrQK+ctH
NaBHYDSmszB/T+OQlp+05VQJA3SVejaVp5GlYdbmyBqi0sVmHSnUJuQwkGYt
AQIJWGbgDLrBvdgPhJbE/yQPt8jQrxwrZ6PQ819a05pAzOZJdgGv4uhwGjWr
eKiAqSnxFqkQHZ2xmMM8Th6wtmjg1JcU7cxYF8703IBJE/ANGCozsgjjuDDO
DU2VYXqhGyutR5WA9c3wArwYvQhyri9chwd0LEOGpPahL4tmdVzxjCtbSa2G
OJDW2biHj6L3aXae4K1bMzIqH27VH32kgCutZiMBrs9/3DwNk0JwoPQSuQUu
0/e02HgU/71Kg19CUlXg5keBfx1XBfz9E/Q5DI8f46oQ87kAlyiDOTcJA1FG
A7XhOYZRhpqAJpsv8UI1qmnPYOv/A3Vb5oEVHwEA

-->

</rfc>
