<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.29 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-04" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.28.1 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-04"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2025" month="May" day="20"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology</workgroup>
    <abstract>
      <?line 65?>

<t>This document defines methodologies for benchmarking the performance of source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing and have been implemented with various designs to perform SAV in the corresponding scenarios. This document takes the approach of considering a SAV device to be a black box, defining the methodology in a manner that is agnostic to the underlying mechanisms. This document provides a method for measuring the performance of existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>
    <?line 69?>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are encouraged to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> based on their network environments. However, existing intra-domain and inter-domain SAV mechanisms have problems with operational overhead and accuracy under various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that solve these problems. The benchmarking methodology defined in this document will help operators obtain a more accurate picture of SAV performance when their deployed devices enable SAV, and will also help vendors test the performance of the SAV implementations in their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support many SAV mechanisms. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy, convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a bare metal server, a virtual machine (VM) instance, or a container instance that runs as a SAV device. This document is intended for those who want to measure a SAV device's performance as well as compare the performance of various SAV devices.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing "which SAV mechanisms perform best" over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as a "micro-benchmark").</t>
          </li>
        </ul>
        <t>The benchmark aims to compare the SAV performance of individual devices, e.g., hardware or software routers. It will showcase the performance of various SAV mechanisms for a given device and network scenario, with the objective of helping operators deploy the appropriate SAV mechanism for their network scenario.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>
        <?line -18?>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes for gathering and communicating SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An intra-domain router of an AS that is connected to a host network (i.e., a layer-2 network).</t>
      <t>Customer-facing Router: An intra-domain router of an AS that is connected to an intra-domain customer network running a routing protocol (i.e., a layer-3 network).</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup in general is compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topology introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofing or legitimate traffic. Choosing various proportions of spoofing and legitimate traffic is <bcp14>OPTIONAL</bcp14>, and the traffic needs to be generated at line rate to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The placement of the DUT within the network topology significantly influences the precision of SAV mechanisms. Consequently, the benchmarking process <bcp14>MUST</bcp14> involve positioning the DUT at various locations throughout the network to thoroughly evaluate its performance.</t>
        <t>The routing configurations of devices within the network topology can vary, and the SAV rules generated are contingent upon these configurations. It is imperative to delineate the specific device configurations employed during testing.</t>
        <t>Moreover, it is essential to denote the role of each device, such as a host-facing router, customer-facing router, or AS border router within an intra-domain network, and to clarify the business relationships between ASes in an inter-domain network context.</t>
        <t>When assessing the data plane forwarding performance, the network traffic produced by the Tester must be characterized by specified traffic rates, the ratio of spoofing to legitimate traffic, and the distribution of source addresses, as these factors can all impact the outcomes of the tests.</t>
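        <t>As an illustration, the traffic mix described above can be emulated in software before being fed to a traffic generator. The following Python sketch is illustrative only; the prefixes, ratio, and counts are example parameters, not requirements of this document. It builds a pool of source addresses mixing a legitimate prefix and a spoofed prefix at a configurable ratio:</t>
        <sourcecode type="python"><![CDATA[
import ipaddress
import random

def build_source_pool(legit_prefix, spoof_prefix, spoof_ratio, count, seed=0):
    """Build a list of source addresses drawn from a legitimate prefix
    and a spoofed prefix at the requested spoofing ratio
    (e.g., spoof_ratio=0.5 emulates a 1:1 mix)."""
    rng = random.Random(seed)
    legit = ipaddress.ip_network(legit_prefix)
    spoof = ipaddress.ip_network(spoof_prefix)
    pool = []
    for _ in range(count):
        net = spoof if rng.random() < spoof_ratio else legit
        # Pick a random address inside the chosen prefix.
        pool.append(str(net.network_address + rng.randrange(net.num_addresses)))
    return pool

# Example: a 1:1 mix of sources from 10.0.0.0/15 (legitimate)
# and 10.2.0.0/15 (spoofed).
pool = build_source_pool("10.0.0.0/15", "10.2.0.0/15", 0.5, 1000)
]]></sourcecode>
        <t>Varying the spoof_ratio parameter covers the spoofing-to-legitimate ratios (e.g., 1:9 to 9:1) used in the benchmarking tests of this document.</t>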
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>MUST</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>MUST</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of legitimate traffic that is determined to be spoofing traffic by the DUT, measured across all legitimate traffic. This indicator reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of spoofing traffic that is determined to be legitimate traffic by the DUT, measured across all spoofing traffic. This indicator reflects the SAV accuracy of the DUT.</t>
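        <t>For clarity, the two accuracy indicators can be computed from packet counters as in the following illustrative sketch (the counter names are assumptions, not mandated by this document):</t>
        <sourcecode type="python"><![CDATA[
def false_positive_rate(legit_discarded, legit_total):
    # Legitimate packets wrongly discarded, over all legitimate packets sent.
    return legit_discarded / legit_total

def false_negative_rate(spoof_permitted, spoof_total):
    # Spoofed packets wrongly permitted, over all spoofed packets sent.
    return spoof_permitted / spoof_total

# Example: 2 of 100 legitimate packets discarded -> 2% false positives;
# 0 of 50 spoofed packets permitted -> 0% false negatives.
]]></sourcecode>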
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time represents the period during which the SAV control plane protocol converges to update the SAV rules when routing changes happen; it is the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI can indicate the convergence performance of the SAV protocol.</t>
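        <t>Operationally, this KPI can be derived from two timestamps: when the Tester injects the routing change and when the DUT logs completion of the SAV rule update. A minimal sketch follows, assuming ISO-style log timestamps (the timestamp format is an assumption for illustration):</t>
        <sourcecode type="python"><![CDATA[
from datetime import datetime

TS_FMT = "%Y-%m-%dT%H:%M:%S.%f"  # assumed log timestamp format

def convergence_time(route_change_ts, rule_update_ts):
    """Seconds elapsed from the beginning of the routing change to the
    completion of the corresponding SAV rule update on the DUT."""
    t0 = datetime.strptime(route_change_ts, TS_FMT)
    t1 = datetime.strptime(rule_update_ts, TS_FMT)
    return (t1 - t0).total_seconds()
]]></sourcecode>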
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane messages that communicate SAV-related information, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane forwarding throughput for processing the data plane traffic, and it can indicate the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>The resource utilization measures the CPU and memory utilization of the SAV process for intra-domain SAV and inter-domain SAV within the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the DUT's false positive rate and false negative rate in handling legitimate and spoofing traffic across diverse intra-domain network scenarios, encompassing SAV implementations for customer or host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document introduces the classic scenarios for testing the DUT for intra-domain SAV.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> shows the case of SAV for a customer or host network in the intra-domain symmetric routing scenario, where the DUT performs SAV as a customer/host-facing router and connects to Router 1 to access the Internet. Network 1 is a customer/host network within the AS; it connects to the DUT, and its own prefix is 10.0.0.0/15. The Tester can emulate Network 1 to advertise its prefix in the control plane and generate spoofing and legitimate traffic in the data plane. In this case, the Tester is configured so that the inbound traffic destined for 10.0.0.0/15 comes from the DUT. The DUT learns the route to prefix 10.0.0.0/15 from the Tester, while the Tester can send outbound traffic with source addresses in prefix 10.0.0.0/15 to the DUT, which emulates a symmetric routing scenario between the Tester and the DUT. The IP addresses in this test case are examples; users can use other IP addresses, and this holds true for the other test cases as well.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
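          <t>In this symmetric case, the SAV rule expected on the DUT's Tester-facing interface can be modeled as a simple prefix membership check. The sketch below is an illustrative model of the expected behavior, not an implementation requirement:</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

# Prefix learned from the Tester (emulating Network 1).
ALLOWED_SOURCES = [ipaddress.ip_network("10.0.0.0/15")]

def sav_permit(src_addr):
    """Permit a packet only if its source address falls within a prefix
    learned on the arrival interface (symmetric-routing assumption)."""
    addr = ipaddress.ip_address(src_addr)
    return any(addr in net for net in ALLOWED_SOURCES)

# 10.1.2.3 lies in 10.0.0.0/15 and is permitted;
# 10.2.0.1 lies outside and is discarded as spoofed.
]]></sourcecode>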
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-customer-asyn"/> shows the case of SAV for customer or host network in intra-domain asymmetric routing scenario, and the DUT performs SAV as a customer/host-facing router. Network 1 is a customer/host network within the AS, connects to the DUT and Router 1, respectively, and its own prefix is 10.0.0./15. The Tester can emulate Network 1 and performs its control plane and data plane functions. In this case, the Tester configs to make the inbound traffic destined for 10.1.0.0/16 come only from the DUT and the inbound traffic destined for 10.0.0.0/16 to come only from Router 1. The DUT only learns the route to prefix 10.1.0.0/16 from the Tester, while Router 1 only learns the route to the prefix 10.0.0.0/16 from Network 1. Then, the DUT and Router 1 avertise their learned prefixes to Router 2. Besides, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester can send outbound traffic with source addresses of prefix 10.0.0.0/16 to the DUT, which emulates the an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-asyn"/> to construct the test network environment. The Tester is connected to the DUT and Router 1 and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
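          <t>The asymmetric case highlights why accuracy must be tested separately from the symmetric case: a rule derived only from the DUT's local FIB entry for the Tester link (10.1.0.0/16) would wrongly discard legitimate traffic sourced from 10.0.0.0/16, whereas an accurate SAV mechanism also accepts Network 1's other prefix. The following sketch contrasts the two rule sets (illustrative only):</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

def permits(rules, src_addr):
    # Permit the packet if its source matches any allowed prefix.
    addr = ipaddress.ip_address(src_addr)
    return any(addr in ipaddress.ip_network(r) for r in rules)

# Rule built only from the DUT's local FIB entry for the Tester link:
fib_only_rule = ["10.1.0.0/16"]
# Accurate rule covering all of Network 1's prefixes:
accurate_rule = ["10.0.0.0/16", "10.1.0.0/16"]

src = "10.0.0.1"  # legitimate source arriving at the DUT
# permits(fib_only_rule, src) is False: a false positive under strict
# FIB-based filtering; permits(accurate_rule, src) is True.
]]></sourcecode>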
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |     Network 1      |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network within an intra-domain symmetric routing context. In this scenario, the network topology mirrors that of <xref target="intra-domain-customer-syn"/>, with the key distinction being the DUT's placement within the network. Here, the DUT is linked to Router 1 and the Internet, with the Tester simulating the Internet's role. The DUT executes Internet-facing SAV, as opposed to customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |     Network 1      |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-internet-asyn"/> shows the test case of SAV for Internet-facing network in intra-domain asymmetric routing scenario. In this test case, the network topology is the same with <xref target="intra-domain-customer-asyn"/>, and the difference is the location of the DUT in the network topology, where the DUT is connected to Router 1 and Router 2 within the same AS, as well as the Internet. The Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-asyn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network within an intra-domain symmetric routing environment. The test network setup in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is linked to Router 1 to simulate the operations of Network 1, thereby evaluating the false positive rate and false negative rate of the DUT as it faces the direction of Router 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-syn"/> to construct the test network environment. The Tester is connected to Router 1 and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to Router 1, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 for this test case.</t>
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-agg-asyn"/> shows the test case of SAV for aggregation-router-facing network in the intra-domain asymmetric routing scenario. The test network environment of <xref target="intra-domain-agg-asyn"/> is the same as that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to Router 1 and Router 2 to emulate the functions of Network 1, testing the false positive rate and false negative rate of the DUT facing the directions of Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for the aggregation-router-facing network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-asyn"/> to construct the test network environment. The Tester is connected to Router 1 and Router 2 and performs the functions of Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to Router 1. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the directions of Router 1 and Router 2 for this test case.</t>
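          <t>For each traffic mix, the false positive rate and false negative rate can be derived from Tester-side packet counters. The helper below is a minimal sketch; the counter names are assumptions and presume the Tester counts the packets it sent toward the DUT and the packets that arrived on the far side:</t>
          <sourcecode type="python"><![CDATA[
```python
def sav_error_rates(legit_sent, legit_received, spoof_sent, spoof_received):
    # False positive rate: share of legitimate packets wrongly blocked.
    fpr = (legit_sent - legit_received) / legit_sent
    # False negative rate: share of spoofing packets wrongly permitted.
    fnr = spoof_received / spoof_sent
    return fpr, fnr
```
]]></sourcecode>
          <t>An ideal DUT yields 0.0 for both rates in the test cases above.</t>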
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, encompassing both protocol convergence performance and protocol message processing performance in response to route changes triggered by network failures or operator configurations. The protocol convergence performance is quantified by the protocol convergence time, which is the duration from the initiation of a routing change to the completion of the SAV rule update. The protocol message processing performance is characterized by the protocol message processing throughput, defined as the total size of protocol messages processed per second.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The protocol convergence process of the DUT, which updates SAV rules, is initiated when route changes occur. These route changes, which necessitate the updating of SAV rules, can result from network failures or operator configurations. Consequently, in <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates route changes to trigger the DUT's convergence process by adding or withdrawing prefixes.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol convergence performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol convergence time of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively withdraws a certain percentage of the overall prefixes supported by the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol convergence time is calculated according to the logs of the DUT about the beginning and completion of the protocol convergence.</t>
            </li>
          </ol>
          <t>Please note that withdrawing prefixes proportionally for IGP can be accomplished by proportionally shutting down interfaces. For instance, the Tester is connected to an emulated network topology where each interface links to an emulated device. Suppose the Tester connects to ten emulated devices through ten interfaces. Initially, these ten emulated devices advertise their prefixes to the DUT. To withdraw 10% of the prefixes, the Tester can randomly disable one interface connected to an emulated device. Similarly, to withdraw 20%, it can shut down two interfaces randomly, and this method applies to other proportions accordingly. This is merely a suggested approach, and alternative methods achieving the same objective are also acceptable.</t>
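          <t>The proportional interface shutdown suggested above can be automated. The sketch below is one possible (assumed, non-normative) implementation that randomly selects which Tester interfaces to disable for a given withdrawal fraction:</t>
          <sourcecode type="python"><![CDATA[
```python
import random

def interfaces_to_disable(interfaces, fraction, seed=None):
    # With one emulated device (and its prefixes) per interface,
    # disabling round(n * fraction) interfaces withdraws roughly
    # that fraction of the advertised prefixes.
    count = round(len(interfaces) * fraction)
    return random.Random(seed).sample(interfaces, count)

# Hypothetical Tester ports, one per emulated device.
ports = [f"eth{i}" for i in range(10)]
```
]]></sourcecode>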
          <t>The protocol convergence time, which is the duration required for the DUT to complete the protocol convergence process, should be measured from the moment the last hello message is received on the DUT from the emulated device connected by the disabled interface until the SAV rule generation on the DUT is finalized.
To accurately measure the protocol convergence time, the DUT's logs should record the timestamp of receiving the last hello message and the timestamp when the SAV rule update is completed. The protocol convergence time is then determined by calculating the difference between these two timestamps.</t>
          <t>It is important to note that if the emulated device sends a "goodbye hello" message during the process of shutting down the Tester's interface, using the reception time of this goodbye hello message instead of the last hello message would yield a more precise measurement, as recommended by <xref target="RFC4061"/>.</t>
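          <t>As a concrete illustration of the timestamp arithmetic, the sketch below parses two hypothetical DUT log lines (the log format and event names are assumptions; real DUT logs will differ) and returns the convergence time in seconds:</t>
          <sourcecode type="python"><![CDATA[
```python
from datetime import datetime

def convergence_time(log_lines):
    stamps = {}
    for line in log_lines:
        ts, _, event = line.partition(" ")
        stamps[event] = datetime.fromisoformat(ts)
    # Duration from the last (or goodbye) hello message to the
    # completed SAV rule update.
    delta = stamps["sav-rule-update-complete"] - stamps["last-hello-received"]
    return delta.total_seconds()

logs = [
    "2025-05-20T10:00:00.000 last-hello-received",
    "2025-05-20T10:00:01.250 sav-rule-update-complete",
]
```
]]></sourcecode>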
          <t><strong>Protocol Message Processing Performance</strong>: The test of the protocol message processing performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The protocol message processing performance is measured as the throughput at which the DUT processes protocol messages. Therefore, the Tester can vary the rate of sending protocol messages, such as from 10% to 100% of the overall link capacity between the Tester and the DUT. Then, the DUT records the total size of the processed protocol messages and the processing time.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol message processing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol message processing throughput of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol message processing throughput is calculated according to the logs of the DUT about the overall size of the protocol messages and the overall processing time.</t>
            </li>
          </ol>
          <t>To measure the protocol message processing throughput, the logs of the DUT record the overall size of the protocol messages and the overall processing time; the throughput is then calculated by dividing the overall size of the protocol messages by the overall processing time.</t>
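          <t>Although the division itself is simple, units are easy to get wrong. The helper below (its name is illustrative) converts the byte and second totals taken from the DUT logs into a throughput in bits per second:</t>
          <sourcecode type="python"><![CDATA[
```python
def message_throughput_bps(total_message_bytes, processing_seconds):
    # Total size of the processed protocol messages divided by the
    # total processing time, expressed in bits per second.
    return total_message_bytes * 8 / processing_seconds
```
]]></sourcecode>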
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, encompassing both the data plane SAV table refreshing performance and the data plane forwarding performance. The data plane SAV table refreshing performance is quantified by the data plane SAV table refreshing rate, which indicates the speed at which the DUT updates its SAV table with newly implemented SAV rules. Concurrently, the data plane forwarding performance is measured by the data plane forwarding rate, which represents the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The assessment of the data plane SAV table refreshing performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This metric measures the rate at which the DUT refreshes its SAV table with new SAV rules. To this end, the Tester can vary the transmission rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. This variation changes the proportion of updated SAV rules and, consequently, the proportion of refreshed entries in the SAV table. The DUT then logs the total number of updated SAV table entries and the duration of the refreshing process.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane SAV table refreshing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane SAV table refreshing rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the data plane SAV table refreshing rate is calculated according to the logs of the DUT about the overall number of updated SAV table entries and the overall refreshing time.</t>
            </li>
          </ol>
          <t>To measure the data plane SAV table refreshing rate, the logs of the DUT record the overall number of updated SAV table entries and the overall refreshing time; the refreshing rate is then calculated by dividing the overall number of updated SAV table entries by the overall refreshing time.</t>
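          <t>The refreshing rate at each offered load can be tabulated in the same way; in the sketch below the measurement tuples are illustrative placeholders for values read from the DUT logs:</t>
          <sourcecode type="python"><![CDATA[
```python
def refresh_rates(measurements):
    # measurements maps an offered load label to a tuple of
    # (updated SAV table entries, refreshing time in seconds).
    return {load: entries / seconds
            for load, (entries, seconds) in measurements.items()}

rates = refresh_rates({"10%": (1_000, 0.5), "100%": (10_000, 4.0)})
```
]]></sourcecode>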
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of the data plane forwarding performance employs the same test setup illustrated in <xref target="intra-convg-perf"/>. The Tester is required to transmit a blend of spoofing and legitimate traffic at a rate equivalent to the total link capacity between the Tester and the DUT, while the DUT constructs a SAV table that utilizes the entire allocated storage space. The proportion of spoofing traffic to legitimate traffic can be adjusted across a range, for example, from 1:9 to 9:1. The DUT then records the aggregate size of the packets forwarded and the total duration of the forwarding activity.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane forwarding performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane forwarding rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends the data plane traffic including spoofing and legitimate traffic to the DUT at the rate of the overall link capacity between the Tester and the DUT. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>Finally, the data plane forwarding rate is calculated according to the logs of the DUT about the overall size of the forwarded traffic and the overall forwarding time.</t>
            </li>
          </ol>
          <t>To measure the data plane forwarding rate, the logs of the DUT record the overall size of the forwarded traffic and the overall forwarding time; the forwarding rate is then calculated by dividing the overall size of the forwarded traffic by the overall forwarding time.</t>
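          <t>Analogously, the forwarding rate follows from the DUT's logged totals; the sketch below (helper name illustrative) expresses it in bits per second:</t>
          <sourcecode type="python"><![CDATA[
```python
def forwarding_rate_bps(forwarded_bytes, forwarding_seconds):
    # Only traffic actually forwarded by the DUT (i.e., legitimate
    # traffic that passed SAV) contributes to forwarded_bytes.
    return forwarded_bytes * 8 / forwarding_seconds
```
]]></sourcecode>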
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT to process legitimate traffic and spoofing traffic across various inter-domain network scenarios including SAV for customer-facing ASes and SAV for provider/peer-facing ASes.</t>
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case of SAV for customer-facing ASes in the inter-domain symmetric routing scenario. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to both AS 2 and the DUT, and AS 2 further propagates the routes for prefixes P1 and P6 to the DUT. Consequently, the DUT can learn the routes for prefixes P1 and P6 from both AS 1 and AS 2. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT, and the Tester is connected to AS 1 while the SAV for customer-facing ASes of the DUT is tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain symmetric routing scenario, a testbed can be built as shown in <xref target="inter-customer-syn"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic), and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic), to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-syn"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><xref target="inter-customer-lpp"/> presents a test case of SAV for customer-facing ASes in the inter-domain asymmetric routing scenario caused by the NO_EXPORT configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and adds the NO_EXPORT community attribute to the BGP advertisements sent to AS 2, preventing AS 2 from further propagating the routes for prefixes P1 and P6 to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute to the BGP advertisements sent to the DUT, so the DUT does not propagate the routes for prefixes P1 and P6 to AS 3. Consequently, the DUT learns the route for prefix P1 only from AS 1 in this scenario. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT, and the Tester is connected to AS 1 while the SAV for customer-facing ASes of the DUT is tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT, a testbed can be built as shown in <xref target="inter-customer-lpp"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic), and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic), to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-lpp"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-dsr"/> presents a test case of SAV for customer-facing ASes in the scenario of direct server return (DSR). In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to the anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. The anycast servers in AS 3 receive the requests and tunnel them to the edge servers in AS 1. Finally, the edge servers send the content to the users with source addresses in prefix P3. The reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2. The Tester sends the traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct server return (DSR):</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of DSR, a testbed can be built as shown in <xref target="inter-customer-dsr"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of DSR.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P3 as source addresses and P2 as destination addresses (legitimate traffic) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>Note that in <xref target="inter-customer-dsr"/>, to direct the return traffic from the edge server to the user along the path AS 1-&gt;DUT-&gt;AS 2, this document recommends configuring a static route to direct the traffic with source addresses in P3 and destination addresses in P2 to the DUT.</t>
          <t>The <strong>expected results</strong> are that the DUT can permit the legitimate traffic with source addresses in P3 from the direction of AS 1 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-dsr"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the spoofed source prefix P1 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-reflect"/> depicts the test case of SAV for customer-facing ASes in the scenario of reflection attacks. In this test case, the reflection attack by source address spoofing takes place within DUT's customer cone, where the attacker spoofs the victim's IP address (P1) and sends requests to servers' IP address (P5) that are designed to respond to such requests. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-reflect"/> illustrate the commercial relationships between ASes.  AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5.  Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="inter-customer-reflect"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the reflection-attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P5 as destination addresses (spoofing traffic) to AS 5 via the DUT.</t>
            </li>
          </ol>
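          <t>The Tester's spoofing step (step 3 above) can be sketched in Python using only the standard library. This is an illustrative, non-normative sketch: the concrete addresses standing in for P1 and P5 are hypothetical documentation prefixes (192.0.2.0/24 and 203.0.113.0/24), not values defined by this document.</t>
          <sourcecode type="python"><![CDATA[
import random
import socket
import struct

def ipv4_checksum(header: bytes) -> int:
    """Standard one's-complement checksum over an IPv4 header."""
    if len(header) % 2:
        header += b"\x00"
    total = sum(struct.unpack("!%dH" % (len(header) // 2), header))
    while total >> 16:
        total = (total & 0xFFFF) + (total >> 16)
    return ~total & 0xFFFF

def build_spoofed_ipv4(src: str, dst: str, payload: bytes = b"") -> bytes:
    """Build an IPv4 packet whose source address is spoofed to 'src'."""
    ver_ihl, tos, ident = 0x45, 0, random.randint(0, 0xFFFF)
    total_len = 20 + len(payload)
    flags_frag, ttl, proto = 0, 64, 17  # UDP, typical for reflection requests
    hdr = struct.pack("!BBHHHBBH4s4s", ver_ihl, tos, total_len, ident,
                      flags_frag, ttl, proto, 0,
                      socket.inet_aton(src), socket.inet_aton(dst))
    hdr = hdr[:10] + struct.pack("!H", ipv4_checksum(hdr)) + hdr[12:]
    return hdr + payload

# Spoof a victim address in P1 toward a server in P5 (both hypothetical):
pkt = build_spoofed_ipv4("192.0.2.10", "203.0.113.5", b"request")
]]></sourcecode>
          <t>In a real trial, the Tester would emit such packets on the link toward the DUT at a controlled rate; the sketch only shows the spoofed-header construction.</t>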
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-reflect"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' is the spoofed source prefix P5 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-direct"/> presents the test case of SAV for customer-facing ASes in the scenario of direct attacks. In this test case, the direct attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-direct"/> illustrate the commercial relationships between ASes.  AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5.  Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="inter-customer-direct"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct-attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P5 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P5 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-direct"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
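          <t>The expected DUT behavior in these customer-facing test cases, permitting packets whose source address belongs to a prefix that is valid on the arrival interface and blocking the rest, can be sketched as a rule-table lookup. The table below is a hypothetical illustration using documentation prefixes (P1 = 192.0.2.0/24, P2 = 198.51.100.0/25, P6 = 198.51.100.128/25), not a normative SAV data structure.</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

# Hypothetical SAV table: arrival interface -> prefixes valid as sources
# there.  On the customer-facing interface toward AS 2, the customer cone's
# prefixes (P1, P2, P6) are valid; any other source is spoofed.
sav_table = {
    "to_AS2": [ipaddress.ip_network("192.0.2.0/24"),       # P1 (hypothetical)
               ipaddress.ip_network("198.51.100.0/25"),    # P2 (hypothetical)
               ipaddress.ip_network("198.51.100.128/25")]  # P6 (hypothetical)
}

def sav_check(iface: str, src: str) -> str:
    """Return 'permit' if src matches a valid prefix on iface, else 'block'."""
    addr = ipaddress.ip_address(src)
    rules = sav_table.get(iface, [])
    return "permit" if any(addr in p for p in rules) else "block"
]]></sourcecode>
          <t>For the direct-attack case, a source in P5 (e.g., 203.0.113.9, hypothetical) arriving from the AS 2 direction falls outside the table and is blocked, while legitimate customer-cone sources are permitted.</t>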
          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the spoofed source prefix P1 by the attacker which is inside of 
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> depicts the test case of SAV for provider-facing ASes in the scenario of reflection attacks. In this test case, the attacker spoofs the victim's IP address (P1) and sends requests to the server's IP address (P2), which responds to such requests. The Tester performs the source address spoofing function as an attacker. The server then sends overwhelming responses back to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="reflection-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the reflection-attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P2 as destination addresses (spoofing traffic) to AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="reflection-attack-p"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
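          <t>The false positive rate and false negative rate mentioned above can be computed from simple per-trial counters: false positives are legitimate packets wrongly blocked, and false negatives are spoofed packets wrongly permitted. A minimal sketch follows; the counter names and example values are hypothetical.</t>
          <sourcecode type="python"><![CDATA[
def sav_error_rates(legit_sent, legit_blocked, spoof_sent, spoof_permitted):
    """False positive rate: fraction of legitimate packets wrongly blocked.
    False negative rate: fraction of spoofed packets wrongly permitted."""
    fpr = legit_blocked / legit_sent if legit_sent else 0.0
    fnr = spoof_permitted / spoof_sent if spoof_sent else 0.0
    return fpr, fnr

# e.g., 10000 legitimate and 10000 spoofed packets offered to the DUT,
# of which 0 legitimate packets were blocked and 120 spoofed ones leaked:
fpr, fnr = sav_error_rates(10000, 0, 10000, 120)
]]></sourcecode>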
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' is the spoofed source prefix P2 by the attacker which is inside of 
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="direct-attack-p"/> presents the test case of SAV for provider-facing ASes in the scenario of direct attacks. In this test case, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="direct-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="direct-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct-attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P2 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P2 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="direct-attack-p"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and measurements for benchmarking the protocol convergence performance and protocol message processing performance can follow those described in <xref target="intra-control-plane-sec"/>.</t>
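          <t>Protocol convergence performance can be quantified as the time from a triggering event (e.g., a topology or prefix change) until the DUT's SAV rules stop changing. A minimal, hypothetical sketch of that computation from a timestamped trace of SAV-table snapshots:</t>
          <sourcecode type="python"><![CDATA[
def convergence_time(events):
    """events: list of (timestamp, sav_table_snapshot) pairs sampled after
    a trigger at t=0.  Returns the timestamp of the last snapshot change,
    i.e., when the SAV rules became stable."""
    last_change, prev = 0.0, None
    for t, snapshot in events:
        if snapshot != prev:
            last_change, prev = t, snapshot
    return last_change

# Toy trace: rules change at 0.2 s and 1.5 s, then stay stable (hypothetical).
t_conv = convergence_time([(0.2, {"P1"}), (0.7, {"P1"}),
                           (1.5, {"P1", "P2"}), (2.0, {"P1", "P2"})])
]]></sourcecode>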
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedure, and measurements for benchmarking the data plane SAV table refreshing performance and data plane forwarding performance can follow those described in <xref target="intra-data-plane-sec"/>.</t>
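          <t>For data plane forwarding performance, a common measurement (in the spirit of <xref target="RFC2544"/>) is a binary search for the highest offered load at which no frames are lost once SAV rules are installed. A hedged sketch of the search driver follows; the run_trial callable stands in for one timed traffic trial and is hypothetical.</t>
          <sourcecode type="python"><![CDATA[
def max_lossless_rate(run_trial, lo=0.0, hi=100.0, eps=0.1):
    """Binary-search the highest offered load (percent of line rate) at
    which run_trial(rate) reports zero lost frames."""
    best = 0.0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if run_trial(mid) == 0:   # no loss: search higher
            best, lo = mid, mid
        else:                     # loss observed: search lower
            hi = mid
    return best

# Toy DUT model that is lossless up to 87% of line rate (hypothetical):
rate = max_lossless_rate(lambda r: 0 if r <= 87.0 else 1)
]]></sourcecode>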
        </section>
      </section>
    </section>
    <section anchor="resource-utilization-1">
      <name>Resource Utilization</name>
      <t>During testing the DUT for intra-domain SAV (<xref target="intra_domain_sav"/>) and inter-domain SAV (<xref target="inter_domain_sav"/>), the CPU utilization and memory utilization of the DUT should be logged.</t>
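      <t>One way to log these values is a timestamped sampler polled during each trial; the read_cpu and read_mem callables below stand in for whatever management interface (SNMP, CLI, or streaming telemetry) the DUT exposes and are hypothetical:</t>
      <sourcecode type="python"><![CDATA[
import time

def sample_utilization(read_cpu, read_mem, samples=5, interval=0.0):
    """Collect (timestamp, cpu_percent, mem_percent) tuples for one trial."""
    log = []
    for _ in range(samples):
        log.append((time.time(), read_cpu(), read_mem()))
        if interval:
            time.sleep(interval)
    return log

def summarize(log):
    """Report peak and mean CPU/memory utilization over a trial."""
    cpus = [c for _, c, _ in log]
    mems = [m for _, _, m in log]
    return {"cpu_peak": max(cpus), "cpu_mean": sum(cpus) / len(cpus),
            "mem_peak": max(mems), "mem_mean": sum(mems) / len(mems)}
]]></sourcecode>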
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test has a reporting format that contains some global and identical reporting components, and some individual components that are specific to individual tests. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be reflected in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
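      <t>The six parameter groups above can be captured in a machine-readable record attached to each test report. The field names and values below are an illustrative, hypothetical mapping, not a normative schema:</t>
      <sourcecode type="python"><![CDATA[
import json

# Illustrative report skeleton covering the six required parameter groups;
# all values are hypothetical examples.
report = {
    "device": {"hardware": "router-x1", "software": "1.2.3"},
    "topology": "inter-domain testbed, customer-facing reflection case",
    "traffic": {"frame_size": 512, "rate_pps": 100000,
                "spoofed_prefix": "P1"},
    "system": {"platform": "physical", "cpu": "8-core", "memory_gb": 32,
               "os": "vendor-os", "interface": "10GbE"},
    "device_config": {"symmetric_routing": True, "no_export": True},
    "sav_mechanism": "EFP-uRPF",
}
record = json.dumps(report, sort_keys=True)
]]></sourcecode>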
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
      </references>
    </references>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, and others for their valuable comments and reviews of this document.</t>
    </section>
  </back>
  <!-- ##markdown-source:
0vzIltnA2/ftpZvh7hVwvplWNF88EXWs9ot4AuqWvSSbrFHWSbYvUdqMtR2a
0U63ckN5V3Ub3y1buZvsqx3YOto07mk58eyrn8Z5ThdToP8A4t+50mCdusYt
yCO6/0SdbbEcQzyjrs8+NHf194OfhDqxIk9r4NEadhQdh9VeIbUql55CEVNw
oypWL0LtuLHehK3iNxFV2Pv13oV+p93b2UzfXeIiFJJh6/0NoqDNZRJl7Qqj
oM0ZXCcGqs2WT7mY5SzAfznLWRoLWHE1a8lIp3st60uKdD7lapYtC9cBzxUH
PP/YNOYJNg17ugksEX10E3CCn24C/tW+GgfuS4cOgS9sSdAEch1LguYl34Kc
JSX+B41OrPd+LQzqXKG9Cg5kB/yulgTbBZ0orDUbl400WgksGyZ1ElgmUtpS
EzZWquZLV8jkW4Rc1v1beQ3SqdVZgzS70KyFyG3wocMfXUNL/CPPfxXhVKg7
ULoQevtYKN81GQlFQx2Yrp2P91WLqzMiF3YE5LivThCkVZcVSRG/uABqXarm
bid1nWN17Z19l4n7rrNg6wmU6Po4vJoS7zldFCQFVx8ldYz+FxImbXmtqD6L
voJAqW2B6CrWh66Dpq8haPoiT0otdvbXWOaw/PnrZY7G53qZ40td5vj2Tkpd
6ba+cDz2rT8sPK26yUrEcSvx9jUJySf4DSMxi6Ny8VLE4iYsvSjRcEYcV0Vf
TNjKMt7UM8IbdCK+QqdlscJFkutuj2eJAa+Gii2PWF/FTmCXVpTkZ+RiqC85
Ui7OKgevraAgxH2DaCfkghDfPCYjB73rbn0XepvCd5WnZz7PeoMRq+140Cvv
sPoaFhoOPtuxGf/O3W9lT1XrdP/az9BsFTD3udHYwYfXgPk1YN4g8NUC5p/p
DM0KvvalXqPUArzX4j6v4Gmv7Wj7UOvVnZ2V8WtV9WLoervMNF1lyyfx+L8W
n4tg7RqYuLyDo9VCDUw2vo7tOTvXU6/pLcsObLecNluf1GvuOq1whW7zVhld
14PeKgjtl7Dt+9OfEo/+Is8r/F6da2ew2zxtvJfNSdHhXMoqL3Pbl6f99ul4
n77+rXZJ20uZDQk5WuZOw9rlaXRDt/fuTCefEvZKxy2P7g2xAac1LOhMGh9O
U1dignCPxyLny3PV3D0L44Ru3MOLPWTStMYtyji6C/mEnv5bFaYlX8kr7ydt
vRm0Z647pfHU98qrEY5TGHy90BoudTEnPvFcztl9S2atDY1rhssF5c3dij2d
lkyupZUZJfICSvKiTYdIoRO/kNjj5b9ZOlrq1rm7Pg/rrt/1onjwGdvwkbnW
/JJ/RX/Ol33gOytSa3NiV+TC7/ehYIz3cRCc5ASMFsrLKrsFT14aSeZHYaje
q2uteW5Dp4aD1l3cmhmTYXMRW12zRt5YaSsGng3q5lHrilFKzUZTATMiqKtu
zazO0ItQ2eacnxRRMMAoqqXy4agSbIJ1lS28zJeY0k3NNAlX0hDuxfSWL2F3
rmM4KTECKvFk3r5UrWDboq7KMqXNrL3ovi6G+YuZPFPK5IIe8igPL3j4+JTw
Gp5kq16zRn+RY9ipGx3BWMFvcwR5bZetc1SaXpgsSNlx2c/Q/awTGvB5bEov
FYkcQxLsqwivDB3r1qoLzPX7Mp2D0cLUH8oLObj/z73gEP/p9/s9+Hr/n/s+
P629l+mMfhJJxQjeeCYvy83kPp9x4cD5Q5UZge6LTtXlZ00T5KuSeBskAiNJ
mRggLL0SaV0Tjq1gCP3HgRp+5BOzORUT7pja28WkKklKRywc0Nm0AtHHy4Wt
3JDuTKznsxLKXDQ2UvGGJpn+SxKnNZeiXlRlmjyp6NiFA0jb1zmIRhmdioJ+
tNvwnDShGltKr+kpba60K+lmAPtGACXHmARV9T7Kkhk7frcBoOcw1tk0oeMw
dB90lgqrC1p7UHdDPI2TMCfWrapJguVFyzh2PG6YsdK0W9dt3ckmsz6HMxAF
bhlfy2YnntIizb4/WhIsmOMcDa3Myiq3tbyBN8EJykE611LIdLFK+RHcYPIx
optP6V/wvsMZXZatterK/p/O56ITDMHUkyksYZaJDu3LKr+HerFKRv5UBNOM
83rj/A5BH05EkmTaq6N8MpGARqls0QxNqNK1QbUGXWooKRsjSzAq8IgT1zOV
MaN1d7rcHHiGigv9Tug/StLLAAEM19SKNDr61BhD0l6yJ6BNIAdsBOA1UAHT
Gd11T21Vo+rpD7UJ0pRSqZHrbrbOMocZKDtcH6V4ywkl/ixVnoPhXOtigwLp
nZfWjRU4qWFyaI7kddAqzYvJM26UbHzmHTxcEcMLaG6Os2w0nAtu+03d+JGV
Sd04bK5+NTridmHGvCfDd/wV+5iuXbQsO7Dq1Gnkz+y9bBmRCxrQeSzgX5nS
mhMGOW4w7RfFUZ9OOacvdC8l/Xtw/9HBx4/9mo/sSTRQc5U1Qlm3cAviKcrA
q1WG5Ut3+i6rxWzOnfFLpkPgNFn6dtZGZEYcgCWoZ9xSYAgPLco9qimUJCck
UGTqmAmYGbopxpgb5fRQzjHM4xVhFq8lbvu0LsXh6S37WcaaltSSScI4tBl/
SmhBdxBI6Kb+cPdgreQWdw/hV+4js/bxCp8dBC3pM68iPhu40N0jsrY/rVpR
E16PsLpxgkdwM7+lXADb+NizZ9VWGDRnCtbqzyF6nuexxrmX40n6JV19dsvN
QeIDQ/Gis24k1ElXsTA/iA8HrRX05m+pw6K1Mi3ZUGnurkLai2Quk1tGO7Uy
Y4rUyZTBVeeYUQLWnWUGEwOqvBpQWuM0BLWAW5hbOQAXp1Ikv1/6ws321JLK
qFbUcjfVwEyTJgjLOkF6Hc28c2dREh+Pu8Ep5ey0iqsMISeLkQNgbSoUjUSm
vE9S5TjzOyNxUfM4aJ1oHFZjWQOo9IxUby2TkORuiVxCFJCSawiGoeFzTLOR
WYMFDtNiGheUJFIto3p8D3iN02C2+B48oKt6HsAhpo7kzmvkrrQTnLGAW7KL
pOgux1oOS7cY/JDH6gpza6gRSRjWStJFfqi4TXuirGKRsavnXleUtepQMiA7
xJYlVpRrekMrSOoyHtFSaa2u/aFP5g8tNR4be0NpNR2KfDlJVmUsJtp8ouXs
2LIe0RaY7PlM+bL92uYVLcNXzTfy9Z1juazsbB57pfaMG3Wy2Cxzcll/hGwW
ojpNkz0L7VzI0kjApA+gzenIWfFvydkR4tvUyUgI2iMYSlnPWNh5NGjxX2kW
BF3MaBBE45hrNNYEKdJJW2hOUWY5TvIC6jXral2pNNu3MiCCPvprxbinTJxJ
lhLkENW4+C1Et6vX2OWgT88SdGXPBbVRpxZ9NxwkjaZRV9aNjyUipBShgze3
Pn65W9Hm1HMOfjOWppn30NpLtGiy2CuVpcGDNsV1rmh/TocJqw/vVsN4I/32
Jh/7bTtL5WKT1QhV1gndV2bKa6K6+22ZcL3JR80k+fpG5aB0U1pSpC7y7eag
tLc3rbmr0wJafQanIx2lyk/vTUBu8jiaCVu/5FntlqRk5liTegEYgoER+T2I
y523voqEj0GDqcbG+67S9MPxSXC0Ozjy33HSXffde+9krffe3V2x7sA5+uDl
rrO0fVLAV7z7OK/zzVO8q/S9YPfJ4WCvo3iztGf7+9224s3S8ivMod3Bgz37
UbO4p24cHaiP/uX/m5F6t6i07iz13CnhFjelB4d/Abk6/EX3tJ9f50uzbj1M
5nlr8UZpM0yXdonBQ+Ts4S/uFyzdMpdsqi21Y2k1lw53B4fWXLoMBgdYx8Ev
rcVrddtzCUo/0qX9xRvtttgKWn4wX3RpXY9hyy3tK26N90GteEOKmsVdznmw
FFv09d4AHzk/mC+NdrerQs8vjdKXNHgHu4ODHvTFntUANbAPdwfqIqgt100f
eQyqS18GzV5zmFyu9NYP+3OlzYNHDZ3XKM4lrTP9jbNDPhqGH711FEz3MulL
HS8gTl13Yo3UpTY9vYm0xgvENxo5D/0niDZlz3fxFQoz/XtI/x71rDAN3B+U
ZzsI74q6FiT4SYFYn+pzc/hgC7F+FwagJ433bJSA9wE16BxZjDd+p0UjfMkN
oYgpvQFMJTSFPwZ8PGBA2VOaTDbzB1GcTy+eVXmp9laFY72uw1tS2aOkFC1O
FZqjJw28Wx2FoDQyXlIOvxTPcbu4Nw5b7z2zfOxZCP2jNlF1J5kBUZO12Sch
3d8f4AggE/vfIQf07wMjJh3XTmnOFZrfKv/2hWrcNDHaQjqaNefYJ8lEsyZv
ywMvTdW0nbNcelStA1CKVhMi8aSbkS1W0dsy2qsz4cya57cMGuQevcIJ4Tlx
RVPyAf7knyoLrggLzuPQUj5ujf7rExbW2JWVVNf3bd9N5pwGY5Xdcvjrld4V
yMBRZO6BsZaYWmaOnGmFKBHyczQynZNVeyBwrXcVvKSBsWlNh2nCEj5RNMzO
Jfa9+JzdRZwkgTg7U+Mtu7nexdd4x+K6r/EOf/FrvOMa7/jUeEcTsIDSr17/
+uy/Bq/fnLYVN3hHHbBYCe8w9WwT79hr/nCNd1zjHYYfL96RzGZr4R1dWUGj
kC6PHs6NpNvXpNSq3xbEsSJH7t69axRkHRTEhj8gjOAusHt4Oq1S3ssI4zK0
0vV+/+PAECU3rpA7RLi7oZZz3LxBg41OOPrndfBEH4zxoCd2wGgfmJPN2Q6v
NuQD7jPyE5tTWGlWLsGqApGO2uAdf9pjt7UG14lrSby+BHxHz5ivGt5ZUb98
fsxnRYbXA4JYgX+jQNC6N/lcI0HXSNBiJIinzipI0O8RDTLO9AagkPHnF/r8
TR96CSLH6TzCE7YnIgc/4S6X6YKKluMEgluFQSzLieejcAEvbNRNZHCEce7R
Lw74pB+uyMlGGJSmYaNJq0FRbSO7DCJlfqJ/LWzJC/Qs4sTGmByEqRufkqgF
mzQ1KJfNkothqjri5GtFG1rV+HSiVgtAK0NDDqwNXhn0yf7bEGmdqw6G5f0b
ibwt7Llag6JcMMpPxcuJHFVD5JEX0XrXJIIfDzBV1x4+KrWObeJTnft56vBW
/WODZIuotA/xOzPEehLbcJX1dxuRduXs+aWNyKWBro4IvloJvNomJ/TZDMpy
WL9qRMvY4q4q8J+2rB1LwFxOg7rQLv3i7rPRWEjb25oR1V/1jZ2fTYq0dvBm
cKRulwmlpZexN3pe8AvF6BojoLiKkBR1+dD3Pw76C6G4UZEvC8WRb6tiOXAz
2VcFJxK7ANy3ssrTYPfpyZu9NuANKtsAeFuh/mtgrQVY+xk3BUEUnlOPUtWU
WQGPXkFj9VlFJXHu1eI9e9s/HcpBEIeBF4ZcCH2BAJWPQEsiPEKqxiN1RxED
S6piYrOCwJwuG5oqRgTOMpfAAZB3AmDnHU4UMeELTS3IjBvdCjApWOtIRpQC
ibW19cC09dABF5oxeDuidbQA0QIxSTJ1PUnIo+lUvH3oafkZ9lmApjp7wMh6
+BHroW8UP6r10HqY0VE7gnO4OmZEegYBHNULLoDROkR01ZwUQlYWJIEN1MRS
APZ01wCEZ/bIY2RZVFHori+bKqRYQM8GRQkNjCT27XKyjeldl4rV8KZuQKmL
r1awqT27y5pgE8+zb37bkevwbdPJ9bnWS8NPdZe0w09fiD+tQkwFNJ1bllbg
rLY3ZgNi+JG4gR+XWpGYk0hnQ2IWUOWltgoxB7BqQb0WEWsRCT/6tYhYG4Dl
RcEWctYGZPnQMA+xJQEtDyrWxtkSwFYTHWsjpj5dAFcDJVtArBvocnZcYbDu
wAjt2sEPeDkYEsEHKp6+3L+U97J7uLWf+6khsd3jssSLCfK9SyyhVE0dRnMq
8VJjznYHB7dp+9mfLvkqeT9n1v4wL7XOPpOKS/eZvV3MR61TaOsQnbN7zEOt
RTSWgOo81FqINXeW+USwTm2x0PqgOwewc3aadRDrsm5+4MxD7M8xeAVTQm4b
u9DqKzAulHd1nG3qa4D06xRM6OTgXSXsM5pdHvJIfyjnncEj4rSIRwTc3Ngh
Xy7LG5HRoQaj+HZneUrddp48mBTUnZCjvR4uJYuTy01cF214lHzTl212XVDK
U3nbZpjGq9jXrs9u+Z7he9wHleCVzDKhrcwaoEAf6Hu6+k7IGw/0iBENec8b
CTGUMmniUAdy7h4OBm0kSGIqt93XQbTJbUb/GeKbeJzyeHPmFvqTltUVJSd8
djIltTVWZVCSaJpqCRMK8xyTm/niDTOe5vYhiQVBcJdHcYjXJCUcvEziWaFv
NCHBZK+VW12oHCjqtgXnQm8TiLt3BIV0O5CnoB3Uq8JY32gUq6vvDRLoLf0J
4J6m7H4RME+TrfVQHyMdW0N+Dr9Y5Menh7a8eejhqlt5CKmuw0Bb22XTtXdw
tf01GyAeRsauUY9VPJEOH4k/16jH2sTwc416LEfsGvVYnrNr1OOrRT0eXqMe
7k+e5l2jHkE36uG8+TWhHg8XoR4PPwfqoVYXN9qMswDw4LfsPTgbAx61itvA
Due1FYEO6amuAnaEdfokm7Qiqy5gLcN8LMpuXKRHV1lCNcmUzlnB6ypKhHeo
hk8Ib+jRu0Y31t/MIgUV7776ArANl6k1d7MosfgdwBp1bbMOpNFxOulgDUjj
4HNAGg8/4ZEhLV+/L0jD/1l6K7H7YQdoiY3FzWLG+V+pWGBWN5cttmbbrCrv
vVu1xHY2h8ujWsuBRO+c2LfG1FKnm9Y6rtXaA+0PViCz6bmvrXGzrQNkGx0h
81BZ6wYjD5X1j5EpQILviF35TqPtHCVrR6VWOky2/nGybR4o2/KRsu0cKlvz
WNnGB8vqmxA2Olq2xcNlS6E4PjqNLt7SAbNlkJsmna4B38Ihs40Pdy1Gavw4
zZa42ZIN3+aOlCMfNmOOR3ViM2b5dJ/r2W/cWKRi2LW3o5jLmQcqo8OgltGB
b2j2MLPMlpUNGGxDca5sT8mh3FPyifaQqMNCJR6HohNDDs7DXGCsNUS0SkYl
3NAexB+TsGIMogMQsnEc/wDq/JSrgDh8pMsHxMCIY+qWHDMlC+cQmZ241YFd
GghPBzi0BYwnWB/kWV+WrxTmWZ+tJaEev+RsDeg5+mKBnk+wf6XrIFMb2HOo
7qFxz1UqwV4MTVzFJpajjREfv5j9nvGeFbGQFfGdFXEdjeccLoXnrIPjLI/f
bAe3WQKvWR+n2RCf2RBQ2RwB2Qz52Azx2AzpWA/h2BDZ2BDRWB3J2AaCsSXk
YjPEYkWkYkOEQs2rNZGJLSASmyIRGyIQ20MeNkIc1kQatoQwfEpkARGFw0WI
wuEnQRTYadsYTWjZ51GjDs4bBhHoCKprVtZFB5bc39HYgJFybzS2YRxe2TaM
Rtjd7JS1Qu6vOua++pDbFZAvItxea1dFU1p+B2H2NvZTHG51P8VBd4R9xfsq
Dq8yym5K2DcZYWPS5idZWuZZEgwo1fTA5G5X40hdCO2uZj3DHAuwTJ1dUO+A
jWblYaVhR8r7lMR6vxARdGM9gzwQLLMowQkAdgPmHVp7wwNfyEvvAItTGP1w
LFSe51qued2ip5g5e/vNwYTcnW2xMnajGizDYUJHR4HipMYrG1Zvhu9Gi4I3
0nAGb8s4if9OIou/PK1yqtpiAQUA2ZIcm6zdu7IVdtruj2zfnUvBzbtuiu+P
e6xhngzeBpVhQ3bbNMvnzmNrPoHerhK0Xpg8fSzvYcdGzbKc+P4BW1vi02ch
+HI0PhNcCkAfQL5DPVKyoKNYAVuoyKYiGCfZEGw6NWSEKQEi8hFUQXAcZlmK
+2d5iKlMnFLC9CpMrN8DfVIW76yOpa63Xi3N6sZZloAt5qvG8nAqSlyhUOLg
Zo7Q+cCnIpqEaVxMUfiQtyJ4+fbkFDtGYnpipCwkkeFGSCgeHzxx6A50zY/x
jQBRe3ptJMA5E9CF+egCm8PNPivpC66loMbjImC8FKZTZrMMBmjOPxxJWvpe
bJX0QBZ80A9O5mBrprXG7or+uA/zajIvaCCgS8D0lth9UxjcGLcCgwj1pMz0
YKZFEwFjk83Q4NIN6kS3x2J5hnuLo3AGTkQ53+O6wa96yk301t24k71ngjlJ
4VHfHRGZ7D54fvzqmLIsoOciLcOHW/j0I2sPMCj6HioU0TTjMiFZH2mUbwUn
AnwjzBXRoFXIXz4qbTQEjTeZhvl7NZMLo/ZVrgZdpfYU5DFuaXS0LrX0iwwf
WBYKtT07DKROToBAAmYFOMOp61yYyPOYvPciQ595pBypQtv2tCY1gZjOkmwO
r+IscBo1rXhKgJIqKzKEOKnEDHwU8u61OoaApaRIbsqycK4NG+arwDdgSkxp
5o/iwjhu5AaE6Vw3VmoJ1DqzDC8WjNFDosBh7jpzIGMZMiSlD/10tAmjir0J
2UpqNcS4tMTII3wcvU+ziwRvM5uS8vhwq/7oIwWTaTUdCnDr/vUmGX8OAl8i
t8Bl+p7WWY/jv1Zp8HNIogrc/Cjwr5OqgL9/gjGH6fFjXBViNhPg7mXgMCRh
L3gZp5P9V+MsCk5zeFGUUV9tDY9h1qFkoP3hy9LkrY059JC4wPtIXcnq7/w/
VTRchUYfAQA=

-->

</rfc>
