<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.29 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-05" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.30.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-05"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2025" month="July" day="20"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology</workgroup>
    <abstract>

<t>This document defines methodologies for benchmarking the performance of intra-domain and inter-domain source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing and have been implemented with a variety of designs to perform SAV in their corresponding scenarios. This document takes the approach of considering a SAV device to be a black box, defining the methodology in a manner that is agnostic to the mechanisms. It provides a method for measuring the performance of existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are advised to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> based on their network environments. In addition, existing intra-domain and inter-domain SAV mechanisms have problems with operational overhead and accuracy under various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that solve these problems. The benchmarking methodology defined in this document will help operators get a more accurate idea of SAV performance when their deployed devices enable SAV, and will also help vendors test the performance of the SAV implementations on their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support many SAV mechanisms. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy, convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a software router, a virtual machine (VM) instance, or a container instance running as a SAV device. This document outlines methodologies for assessing SAV device performance and comparing various SAV mechanisms.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing "which SAV mechanisms perform best" over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as "micro-benchmarking").</t>
          </li>
        </ul>
        <t>This benchmark evaluates the SAV performance of individual devices (e.g., hardware/software routers) by comparing different SAV mechanisms under specific network scenarios. The results help determine the appropriate SAV deployment for real-world network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An intra-domain router facing an intra-domain host network.</t>
      <t>Customer-facing Router: An intra-domain router facing an intra-domain customer network that includes routers and runs a routing protocol.</t>
      <t>AS Border Router: An intra-domain router facing an external AS.</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup in general is compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topology introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofing or legitimate traffic. Choosing various proportions of spoofing and legitimate traffic is <bcp14>OPTIONAL</bcp14>, and the traffic needs to be generated at line rate to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The placement of the DUT within the network topology significantly influences the SAV performance. Consequently, the benchmarking process <bcp14>MUST</bcp14> involve positioning the DUT at various locations throughout the network to thoroughly evaluate its performance.</t>
        <t>The routing configurations of devices within the network topology can vary, and the SAV rules generated are contingent upon these configurations. It is imperative to delineate the specific device configurations employed during testing.</t>
        <t>Moreover, it is essential to denote the role of each device, such as a host-facing router, customer-facing router, or AS border router within an intra-domain network, and to clarify the business relationships between ASes in an inter-domain network context.</t>
        <t>When assessing the data plane forwarding performance, the network traffic produced by the Tester must be characterized by specified traffic rates, the ratio of spoofing to legitimate traffic, and the distribution of source addresses, as these factors can all impact the outcomes of the tests.</t>
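        <t>As an illustration of these traffic parameters, the following Python sketch builds a mix of source addresses with a chosen spoofing share. It is a Tester-side post-processing sketch only; the function names and the use of the prefixes from the test cases are illustrative assumptions, not part of any SAV mechanism or Tester product.</t>
        <sourcecode type="python"><![CDATA[
import ipaddress
import random

def pick_source(prefix, rng):
    # Draw a uniformly random host address inside the given IPv4 prefix.
    net = ipaddress.ip_network(prefix)
    return str(net.network_address + rng.randrange(net.num_addresses))

def source_mix(legit_prefix, spoof_prefix, spoof_share, n, seed=0):
    # Build n source addresses with the requested share of spoofed
    # sources, e.g. spoof_share=0.1 for a 1:9 spoofing-to-legitimate
    # ratio; a fixed seed keeps test runs reproducible.
    rng = random.Random(seed)
    return [pick_source(spoof_prefix if rng.random() < spoof_share
                        else legit_prefix, rng)
            for _ in range(n)]
]]></sourcecode>
        <t>For example, a spoofing-to-legitimate ratio of 1:9 corresponds to a spoofing share of 0.1, and 9:1 to a share of 0.9.</t>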
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>MUST</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>MUST</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of all legitimate traffic that the DUT incorrectly determines to be spoofing traffic. This KPI reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of all spoofing traffic that the DUT incorrectly determines to be legitimate traffic. This KPI reflects the SAV accuracy of the DUT.</t>
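        <t>Both accuracy KPIs (this one and the false positive rate above) can be derived from Tester-side counters. The following sketch is a hypothetical post-processing step with assumed counter names, not part of any SAV mechanism:</t>
        <sourcecode type="python"><![CDATA[
def sav_accuracy(legit_sent, legit_blocked, spoof_sent, spoof_passed):
    # False positive rate: share of legitimate packets the DUT discarded.
    # False negative rate: share of spoofed packets the DUT permitted.
    fpr = legit_blocked / legit_sent if legit_sent else 0.0
    fnr = spoof_passed / spoof_sent if spoof_sent else 0.0
    return fpr, fnr
]]></sourcecode>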
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time represents the period during which the SAV control plane protocol converges to update the SAV rules when routing changes happen; it is the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI can indicate the convergence performance of the SAV protocol.</t>
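        <t>Assuming the DUT logs a timestamp for the routing change and another for the completed SAV rule update (the log timestamp format below is an assumption), this KPI can be computed as:</t>
        <sourcecode type="python"><![CDATA[
from datetime import datetime

TS_FMT = "%Y-%m-%dT%H:%M:%S.%f"  # assumed DUT log timestamp format

def convergence_time(route_change_ts, rule_update_done_ts, fmt=TS_FMT):
    # Seconds elapsed from the beginning of the routing change to the
    # completion of the SAV rule update, taken from two DUT log lines.
    t0 = datetime.strptime(route_change_ts, fmt)
    t1 = datetime.strptime(rule_update_done_ts, fmt)
    return (t1 - t0).total_seconds()
]]></sourcecode>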
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane packets carrying SAV-related information, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane forwarding throughput for processing the data plane traffic, and it can indicate the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>Resource utilization measures the CPU and memory usage of the SAV processes for intra-domain SAV and inter-domain SAV within the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the DUT's false positive rate and false negative rate in handling legitimate and spoofing traffic across diverse intra-domain network scenarios, encompassing SAV implementations for customer or host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document introduces classic scenarios for testing the DUT's intra-domain SAV.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> shows the case of SAV for a customer or host network in the intra-domain symmetric routing scenario, where the DUT performs SAV as a customer/host-facing router and connects to Router 1 to access the Internet. Network 1 is a customer/host network within the AS that connects to the DUT and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 to advertise its prefix in the control plane and generate spoofing and legitimate traffic in the data plane. In this case, the Tester is configured so that the inbound traffic destined for 10.0.0.0/15 comes from the DUT. The DUT learns the route to prefix 10.0.0.0/15 from the Tester, while the Tester can send outbound traffic with source addresses in prefix 10.0.0.0/15 to the DUT, which emulates a symmetric routing scenario between the Tester and the DUT. The IP addresses in this test case are examples; users can use other IP addresses, and this holds true for the other test cases as well.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for a customer or host network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
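          <t>The expected verdict for each test packet in this case can be expressed as a simple membership check against Network 1's prefix. The sketch below is illustrative only and is not part of any SAV implementation:</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

NETWORK1 = ipaddress.ip_network("10.0.0.0/15")  # Network 1's prefix

def expected_verdict(src):
    # In this test case the DUT should permit packets whose source lies
    # in Network 1's prefix and block all other sources arriving on the
    # Tester-facing interface.
    return "permit" if ipaddress.ip_address(src) in NETWORK1 else "block"
]]></sourcecode>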
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-customer-asyn"/> shows the case of SAV for a customer or host network in the intra-domain asymmetric routing scenario, where the DUT performs SAV as a customer/host-facing router. Network 1 is a customer/host network within the AS that connects to both the DUT and Router 1 and owns the prefix 10.0.0.0/15. The Tester can emulate Network 1 and perform its control plane and data plane functions. In this case, the Tester is configured so that the inbound traffic destined for 10.1.0.0/16 comes only from the DUT and the inbound traffic destined for 10.0.0.0/16 comes only from Router 1. The DUT only learns the route to prefix 10.1.0.0/16 from the Tester, while Router 1 only learns the route to prefix 10.0.0.0/16 from Network 1. Then, the DUT and Router 1 advertise their learned prefixes to Router 2. Besides, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester can send outbound traffic with source addresses in prefix 10.0.0.0/16 to the DUT, which emulates an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer or host network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for a customer or host network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-asyn"/> to construct the test network environment. The Tester is connected to the DUT and Router 1 and performs the functions of Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |     Network 1      |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network within an intra-domain symmetric routing context. In this scenario, the network topology mirrors that of <xref target="intra-domain-customer-syn"/>, with the key distinction being the DUT's placement within the network. Here, the DUT is linked to Router 1 and the Internet, with the Tester simulating the Internet's role. The DUT executes Internet-facing SAV, as opposed to customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for an Internet-facing network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |     Network 1      |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-internet-asyn"/> shows the test case of SAV for Internet-facing network in intra-domain asymmetric routing scenario. In this test case, the network topology is the same as in <xref target="intra-domain-customer-asyn"/>; the difference is the location of the DUT in the network topology, where the DUT is connected to Router 1 and Router 2 within the same AS, as well as the Internet. The Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for Internet-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for an Internet-facing network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-asyn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions of the Internet.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |       Tester       |
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network within an intra-domain symmetric routing environment. The test network setup in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is linked to Router 1 to simulate the operations of Network 1, thereby evaluating the false positive rate and false negative rate of the DUT as it faces the direction of Router 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for an aggregation-router-facing network in the intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-syn"/> to construct the test network environment. The Tester is connected to Router 1 and performs the functions of Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to Router 1, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 for this test case.</t>
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
          </figure>
          <t><xref target="intra-domain-agg-asyn"/> shows the test case of SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario. The test network environment of <xref target="intra-domain-agg-asyn"/> is identical to that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to Router 1 and Router 2 to emulate the functions of Network 1, so that the false positive rate and false negative rate of the DUT can be tested in the directions of Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-asyn"/> to construct the test network environment. The Tester is connected to Router 1 and Router 2 and performs the functions as Network 1.</t>
            </li>
            <li>
              <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) and sends it to Router 1. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the directions of Router 1 and Router 2 for this test case.</t>
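          <t>The pass/block decision tested above can be sketched as an interface-scoped prefix allowlist. The rule table below is a hypothetical stand-in for the DUT's SAV table, using the prefixes from the asymmetric test case (only sources in 10.0.0.0/16 arrive from the direction of Router 1 on the legitimate path):</t>
          <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Hypothetical SAV rules on the DUT interface facing Router 1.
SAV_RULES = {"to_router1": [ipaddress.ip_network("10.0.0.0/16")]}

def sav_check(interface, src_ip):
    # Permit the packet if its source matches any rule on the interface.
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in SAV_RULES[interface])

print(sav_check("to_router1", "10.0.1.2"))  # legitimate source: True
print(sav_check("to_router1", "10.1.1.2"))  # spoofing source: False
```
]]></sourcecode>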
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, encompassing both protocol convergence performance and protocol message processing performance in response to route changes triggered by network failures or operator configurations. The protocol convergence performance is quantified by the protocol convergence time, which is the duration from the initiation of a routing change to the completion of the SAV rule update. The protocol message processing performance is characterized by the protocol message processing throughput, defined as the total size of protocol messages processed per second.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The protocol convergence process of the DUT, which updates SAV rules, is initiated when route changes occur. These route changes, which necessitate the updating of SAV rules, can result from network failures or operator configurations. Consequently, in <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates route changes to trigger the DUT's convergence process by adding or withdrawing prefixes.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol convergence performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol convergence time of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively withdraws a certain percentage of the overall prefixes supported by the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol convergence time is calculated according to the logs of the DUT about the beginning and completion of the protocol convergence.</t>
            </li>
          </ol>
          <t>Please note that withdrawing prefixes proportionally for IGP can be accomplished by proportionally shutting down interfaces. For instance, the Tester is connected to an emulated network topology where each interface links to an emulated device. Suppose the Tester connects to ten emulated devices through ten interfaces. Initially, these ten emulated devices advertise their prefixes to the DUT. To withdraw 10% of the prefixes, the Tester can randomly disable one interface connected to an emulated device. Similarly, to withdraw 20%, it can shut down two interfaces randomly, and this method applies to other proportions accordingly. This is merely a suggested approach, and alternative methods achieving the same objective are also acceptable.</t>
          <t>The protocol convergence time, which is the duration required for the DUT to complete the protocol convergence process, should be measured from the moment the last hello message is received on the DUT from the emulated device connected by the disabled interface until the SAV rule generation on the DUT is finalized.
To accurately measure the protocol convergence time, the DUT's logs should record the timestamp of receiving the last hello message and the timestamp when the SAV rule update is completed. The protocol convergence time is then determined by calculating the difference between these two timestamps.</t>
          <t>It is important to note that if the emulated device sends a "goodbye hello" message during the process of shutting down the Tester's interface, using the reception time of this goodbye hello message instead of the last hello message would yield a more precise measurement, as recommended by <xref target="RFC4061"/>.</t>
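          <t>The timestamp-difference computation described above can be sketched as follows; the log line format and timestamps are assumptions for illustration, since the actual log format is implementation-specific:</t>
          <sourcecode type="python"><![CDATA[
```python
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f"

def convergence_time(start_line, end_line):
    # Convergence time is the difference between the timestamp of the
    # last hello message and the timestamp of SAV rule update completion.
    t0 = datetime.strptime(start_line[:23], FMT)
    t1 = datetime.strptime(end_line[:23], FMT)
    return (t1 - t0).total_seconds()

# Hypothetical DUT log lines.
begin = "2025-07-20 10:15:01.250 last hello received on interface eth1"
done  = "2025-07-20 10:15:03.750 SAV rule update completed"
print(convergence_time(begin, done))  # 2.5 seconds
```
]]></sourcecode>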
          <t><strong>Protocol Message Processing Performance</strong>: The test of the protocol message processing performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This test measures the protocol message processing throughput of the DUT. The Tester can vary the rate at which it sends protocol messages, such as from 10% to 100% of the overall link capacity between the Tester and the DUT. Then, the DUT records the total size of the processed protocol messages and the processing time.</t>
          <t>The <strong>procedure</strong> is listed below for testing the protocol message processing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the protocol message processing throughput of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the protocol message processing throughput is calculated according to the logs of the DUT about the overall size of the protocol messages and the overall processing time.</t>
            </li>
          </ol>
          <t>To measure the protocol message processing throughput, the logs of the DUT record the overall size of the protocol messages and the overall processing time, and the protocol message processing throughput is calculated by dividing the overall size of the protocol messages by the overall processing time.</t>
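          <t>The division above is straightforward; as a worked example with hypothetical log values (a total of 62.5 MB of protocol messages processed in 1.25 seconds):</t>
          <sourcecode type="python"><![CDATA[
```python
def processing_throughput(total_message_bytes, processing_seconds):
    # Throughput is the total size of processed protocol messages
    # divided by the overall processing time, in bytes per second.
    return total_message_bytes / processing_seconds

# Hypothetical values taken from DUT logs for one run.
print(processing_throughput(62_500_000, 1.25))  # 50000000.0 bytes/s
```
]]></sourcecode>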
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, encompassing both the data plane SAV table refreshing performance and the data plane forwarding performance. The data plane SAV table refreshing performance is quantified by the data plane SAV table refreshing rate, which indicates the speed at which the DUT updates its SAV table with newly implemented SAV rules. Concurrently, the data plane forwarding performance is measured by the data plane forwarding rate, which represents the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The assessment of the data plane SAV table refreshing performance uses the same test setup depicted in <xref target="intra-convg-perf"/>. This metric measures the rate at which a DUT refreshes its SAV table with new SAV rules. To this end, the Tester can vary the transmission rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. This variation influences the proportion of updated SAV rules and, consequently, the proportion of refreshed entries in the SAV table. The DUT then logs the total count of updated SAV table entries and the duration of the refreshing process.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane SAV table refreshing performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane SAV table refreshing rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the data plane SAV table refreshing rate is calculated according to the logs of the DUT about the overall number of updated SAV table entries and the overall refreshing time.</t>
            </li>
          </ol>
          <t>To measure the data plane SAV table refreshing rate, the logs of the DUT record the overall number of updated SAV table entries and the overall refreshing time, and the data plane SAV table refreshing rate is calculated by dividing the overall number of updated SAV table entries by the overall refreshing time.</t>
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of the data plane forwarding performance employs the same test setup illustrated in <xref target="intra-convg-perf"/>. The Tester is required to transmit a blend of spoofing and legitimate traffic at a rate equivalent to the total link capacity between the Tester and the DUT, while the DUT constructs a SAV table that utilizes the entire allocated storage space. The proportion of spoofing traffic to legitimate traffic can be adjusted across a range, for example, from 1:9 to 9:1. The DUT then records the aggregate size of the packets forwarded and the total duration of the forwarding activity.</t>
          <t>The <strong>procedure</strong> is listed below for testing the data plane forwarding performance:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane forwarding rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends the data plane traffic including spoofing and legitimate traffic to the DUT at the rate of the overall link capacity between the Tester and the DUT. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>Finally, the data plane forwarding rate is calculated according to the logs of the DUT about the overall size of the forwarded traffic and the overall forwarding time.</t>
            </li>
          </ol>
          <t>To measure the data plane forwarding rate, the logs of the DUT record the overall size of the forwarded traffic and the overall forwarding time, and the data plane forwarding rate is calculated by dividing the overall size of the forwarded traffic by the overall forwarding time.</t>
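          <t>As a sketch of the expected arithmetic, note that with an ideal SAV table only the legitimate share of the offered traffic should be forwarded; the byte and time figures below are hypothetical, not measurements:</t>
          <sourcecode type="python"><![CDATA[
```python
def forwarding_rate(forwarded_bytes, forwarding_seconds):
    # Forwarding rate is the total size of forwarded packets divided
    # by the overall forwarding time, in bytes per second.
    return forwarded_bytes / forwarding_seconds

# With a 1:9 spoofing-to-legitimate mix and an ideal SAV table, only
# the legitimate 90% of the offered bytes is forwarded.
offered_bytes = 1_000_000_000
forwarded_bytes = offered_bytes * 0.9
print(forwarding_rate(forwarded_bytes, 1.0))  # 900000000.0 bytes/s
```
]]></sourcecode>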
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT to process legitimate traffic and spoofing traffic across various inter-domain network scenarios including SAV for customer-facing ASes and SAV for provider/peer-facing ASes.</t>
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case of SAV for customer-facing ASes in inter-domain symmetric routing scenario. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to both AS 2 and the DUT, and AS 2 further propagates the routes for prefixes P1 and P6 to the DUT. Consequently, the DUT can learn the routes for prefixes P1 and P6 from both AS 1 and AS 2. In this test case, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain symmetric routing scenario:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain symmetric routing scenario, a testbed can be built as shown in <xref target="inter-customer-syn"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the symmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic) and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-syn"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
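          <t>The rule-generation logic exercised in this scenario can be sketched as building an allowlist per customer interface from the routes learned in that customer's cone. The concrete prefixes and the dictionary-based rule table below are illustrative assumptions, not the DUT's actual data structures, since the draft leaves P1, P2, P5, and P6 abstract:</t>
          <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Hypothetical concrete values for the symbolic prefixes in the figure.
P1 = "192.0.2.0/25"
P2 = "198.51.100.0/24"
P5 = "203.0.113.0/24"
P6 = "192.0.2.128/25"

# Routes learned on each customer-facing interface of the DUT: AS 1 and
# AS 2 both announce P1 and P6, AS 2 adds its own P2, AS 5 announces P5.
learned = {"as1": [P1, P6], "as2": [P1, P6, P2], "as5": [P5]}

def build_rules(learned_routes):
    # A customer-facing SAV rule permits, on each customer interface,
    # the source prefixes learned from that customer's cone.
    return {ifc: [ipaddress.ip_network(p) for p in prefixes]
            for ifc, prefixes in learned_routes.items()}

def sav_check(rules, interface, src_ip):
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in rules[interface])

rules = build_rules(learned)
print(sav_check(rules, "as2", "192.0.2.10"))    # source in P1: permitted
print(sav_check(rules, "as2", "203.0.113.10"))  # source in P5: blocked
```
]]></sourcecode>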
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><xref target="inter-customer-lpp"/> presents a test case of SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by the NO_EXPORT configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 and adds the NO_EXPORT community attribute to the BGP advertisement sent to AS 2, preventing AS 2 from further propagating the route for prefix P1 to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute to the BGP advertisement sent to the DUT, so the DUT does not propagate the route for prefix P6 to AS 3. Consequently, the DUT only learns the route for prefix P1 from AS 1 in this scenario. In this test case, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT, a testbed can be built as shown in <xref target="inter-customer-lpp"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends traffic using P1 as source addresses and P4 as destination addresses (legitimate traffic) and traffic using P5 as source addresses and P4 as destination addresses (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-lpp"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-dsr"/> presents a test case of SAV for customer-facing ASes in the scenario of direct server return (DSR). In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to the anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. The anycast servers in AS 3 receive the requests and tunnel them to the edge servers in AS 1. Finally, the edge servers send the content to the users with source addresses in prefix P3. The reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2. The Tester sends the traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct server return (DSR):</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of DSR, a testbed can be built as shown in <xref target="inter-customer-dsr"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, the ASes including  AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of DSR.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P3 as source addresses and P2 as destination addresses (legitimate traffic) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>Note that in <xref target="inter-customer-dsr"/>, to steer the return traffic from the edge server to the user along the path AS 1-&gt;DUT-&gt;AS 2, this document recommends configuring a static route that directs the traffic with source addresses in P3 and destination addresses in P2 to the DUT.</t>
          <t>The <strong>expected results</strong> are that the DUT can permit the legitimate traffic with source addresses in P3 from the direction of AS 1 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-dsr"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the spoofed source prefix P1 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-reflect"/> depicts the test case of SAV for customer-facing ASes in the scenario of reflection attacks. In this test case, the reflection attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs the victim's IP address (P1) and sends requests to the servers' IP addresses (P5), which are designed to respond to such requests. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-reflect"/> illustrate the commercial relationships between the ASes. AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="inter-customer-reflect"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the reflection attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P5 as destination addresses (spoofing traffic) to AS 5 via the DUT.</t>
            </li>
          </ol>
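          <t>For illustration, step 3 amounts to crafting packets whose IP source field carries an address inside P1 while the destination lies in P5. The following non-normative sketch builds such a spoofed IPv4 header using only the Python standard library; the concrete addresses (10.1.0.99 for P1 and 10.5.0.7 for P5) are hypothetical placeholders, not values mandated by this document.</t>
          <sourcecode type="python"><![CDATA[
import socket
import struct

def ipv4_header(src: str, dst: str, payload_len: int = 0) -> bytes:
    """Build a minimal 20-byte IPv4 header with a (possibly spoofed) source."""
    ver_ihl = (4 << 4) | 5                # version 4, IHL = 5 words, no options
    header = struct.pack("!BBHHHBBH4s4s",
                         ver_ihl, 0, 20 + payload_len,
                         0, 0,            # identification, flags/fragment offset
                         64, 17, 0,       # TTL, protocol = UDP, checksum placeholder
                         socket.inet_aton(src), socket.inet_aton(dst))
    # One's-complement checksum over the ten 16-bit header words
    s = sum(struct.unpack("!10H", header))
    s = (s & 0xFFFF) + (s >> 16)
    csum = ~((s & 0xFFFF) + (s >> 16)) & 0xFFFF
    return header[:10] + struct.pack("!H", csum) + header[12:]

# Spoofed packet: source inside the victim prefix P1, destination inside P5
hdr = ipv4_header("10.1.0.99", "10.5.0.7")
]]></sourcecode>
          <t>A Tester implementation would attach a UDP request payload and emit such packets toward the DUT at the configured rate.</t>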
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-reflect"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' is the source prefix P5 spoofed by the attacker, which is inside
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="inter-customer-direct"/> presents the test case of SAV for customer-facing ASes in the scenario of direct attacks. In this test case, the direct attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="inter-customer-direct"/> illustrate the commercial relationships between ASes.  AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5.  Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for customer-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="inter-customer-direct"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P5 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P5 from the direction of AS 2 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="inter-customer-direct"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the source prefix P1 spoofed by the attacker, which is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> depicts the test case of SAV for provider-facing ASes in the scenario of reflection attacks. In this test case, the attacker spoofs the victim's IP address (P1) and sends requests to the servers' IP addresses (P2), which respond to such requests. The Tester performs the source address spoofing function as an attacker. The servers then send overwhelming responses back to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of reflection attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="reflection-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the reflection attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P1 as source addresses and P2 as destination addresses (spoofing traffic) to AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="reflection-attack-p"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' is the source prefix P2 spoofed by the attacker, which is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="direct-attack-p"/> presents the test case of SAV for provider-facing ASes in the scenario of direct attacks. In this test case, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="direct-attack-p"/> represent the commercial relationships between ASes. AS 3 acts as the provider or lateral peer of the DUT and the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> is listed below for testing SAV for provider-facing ASes in the scenario of direct attacks:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for provider-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="direct-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic to the DUT.</t>
            </li>
            <li>
              <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the direct attack scenario.</t>
            </li>
            <li>
              <t>Finally, the Tester sends the traffic using P2 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> are that the DUT can block the spoofing traffic with source addresses in P2 from the direction of AS 3 for this test case.</t>
          <t>Note that the locations of the DUT in <xref target="direct-attack-p"/> can be set at AS 1 and AS 2 to evaluate its false positive rate and false negative rate according to the procedure outlined above. The expected results are that the DUT will effectively block spoofing traffic.</t>
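          <t>Across the test cases above, the false positive rate and false negative rate can be derived from Tester counters. The sketch below is non-normative and the counter names are illustrative: a false negative is a spoofed packet the DUT forwarded, and a false positive is a legitimate packet the DUT wrongly blocked.</t>
          <sourcecode type="python"><![CDATA[
def sav_error_rates(spoofed_sent: int, spoofed_forwarded: int,
                    legit_sent: int, legit_blocked: int) -> tuple:
    """Return (false_positive_rate, false_negative_rate) from Tester counters."""
    fn_rate = spoofed_forwarded / spoofed_sent if spoofed_sent else 0.0
    fp_rate = legit_blocked / legit_sent if legit_sent else 0.0
    return fp_rate, fn_rate

# e.g., 10,000 spoofed and 10,000 legitimate packets offered to the DUT
fp_rate, fn_rate = sav_error_rates(10_000, 20, 10_000, 5)
]]></sourcecode>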
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and measurements for testing the protocol convergence performance and the protocol message processing performance can follow those described in <xref target="intra-control-plane-sec"/>.</t>
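          <t>As a non-normative sketch of the convergence measurement, the protocol convergence time can be taken as the interval between injecting a routing or topology change and the DUT's SAV rules reaching the expected state. The hooks <tt>inject_change</tt> and <tt>rule_matches_expected</tt> are hypothetical interfaces into the Tester and the DUT's management plane, not part of this methodology.</t>
          <sourcecode type="python"><![CDATA[
import time

def measure_convergence(inject_change, rule_matches_expected,
                        timeout_s: float = 60.0, poll_s: float = 0.1) -> float:
    """Measure SAV protocol convergence time in seconds.

    `inject_change()` triggers the routing/topology change on the Tester;
    `rule_matches_expected()` polls the DUT and returns True once the SAV
    rules reach the expected state (both hooks are hypothetical).
    """
    start = time.monotonic()
    inject_change()
    while time.monotonic() - start < timeout_s:
        if rule_matches_expected():
            return time.monotonic() - start
        time.sleep(poll_s)
    raise TimeoutError("SAV rules did not converge within the timeout")
]]></sourcecode>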
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedure, and measurements for testing the data plane SAV table refreshing performance and the data plane forwarding performance can follow those described in <xref target="intra-data-plane-sec"/>.</t>
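          <t>For the forwarding-performance measurement, an RFC 2544-style throughput search can find the highest offered rate at which the DUT forwards all legitimate (non-spoofed) frames without loss. The sketch below is non-normative; <tt>send_trial</tt> is a hypothetical Tester hook that offers a given rate for a fixed trial duration and returns the observed loss count.</t>
          <sourcecode type="python"><![CDATA[
def throughput_search(send_trial, max_rate_pps: int,
                      precision_pps: int = 1000) -> int:
    """Binary-search the highest zero-loss rate in frames per second.

    `send_trial(rate_pps)` is a hypothetical Tester hook returning the
    number of lost frames for a trial at the offered rate.
    """
    lo, hi = 0, max_rate_pps
    while hi - lo > precision_pps:
        mid = (lo + hi) // 2
        if send_trial(mid) == 0:   # no loss: the rate is sustainable
            lo = mid
        else:                      # loss observed: back off
            hi = mid
    return lo
]]></sourcecode>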
        </section>
      </section>
      <section anchor="resource-utilization-1">
        <name>Resource Utilization</name>
        <t>When testing the DUT for both intra-domain (<xref target="intra_domain_sav"/>) and inter-domain SAV (<xref target="inter_domain_sav"/>) functionality, CPU utilization (both control plane and data plane) and memory utilization (both control plane and data plane) should be recorded. These measurements should be captured separately for each plane to enable detailed performance analysis.</t>
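        <t>The per-plane recording could be sketched as periodic sampling, as below. This is non-normative; <tt>read_utilization(plane)</tt> is a hypothetical hook into the DUT's management interface that returns (CPU percent, memory percent) for the named plane.</t>
        <sourcecode type="python"><![CDATA[
import time
from statistics import mean

def sample_utilization(read_utilization, planes=("control", "data"),
                       interval_s: float = 1.0, samples: int = 5) -> dict:
    """Poll CPU and memory utilization separately for each plane and
    return per-plane averages. `read_utilization` is a hypothetical hook."""
    record = {p: {"cpu": [], "mem": []} for p in planes}
    for _ in range(samples):
        for p in planes:
            cpu, mem = read_utilization(p)
            record[p]["cpu"].append(cpu)
            record[p]["mem"].append(mem)
        time.sleep(interval_s)
    return {p: {"cpu_avg": mean(v["cpu"]), "mem_avg": mean(v["mem"])}
            for p, v in record.items()}
]]></sourcecode>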
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test has a reporting format that contains some reporting components common to all tests and some components specific to individual tests. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be reflected in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
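      <t>The required parameters above could be captured in a machine-readable record such as the following non-normative sketch; the field names and values are illustrative examples, not mandated by this document.</t>
      <sourcecode type="python"><![CDATA[
import json

# Illustrative (non-normative) test report skeleton covering the six
# required configuration parameters; all values are placeholders.
report = {
    "device_versions": {"hardware": "example-hw-rev", "software": "example-sw-1.0"},
    "network_topology": "inter-domain, customer-facing reflection-attack testbed",
    "traffic_attributes": {"frame_sizes_bytes": [64, 512, 1518], "rate_pps": 100000},
    "system_configuration": {"type": "physical", "cpu_cores": 8,
                             "memory_gb": 32, "interface": "10GbE"},
    "device_configuration": {"symmetric_routing": True, "no_export": True},
    "sav_mechanism": "EFP-uRPF",
}
print(json.dumps(report, indent=2))
]]></sourcecode>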
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
      </references>
    </references>
    <?line 916?>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, Shengnan Yue, Changwang Lin, and others for their valuable comments and reviews on this document.</t>
    </section>
  </back>
  <!-- ##markdown-source:
H4sIAAAAAAAAA+19eXPcyJHv/4zQd8BKsSFSalIiKWlthXfWHEkzo1gdvSLl
eX7WxAQaXeyGhQbaOMihRfmz7Gd5n+zlUSdQQKMP6hq2wxo2GpWVlZWVlfmr
I3d3d29sFWWYjn8NkywVj4Myr8SNrXie059FeXD//h/vH9zYisLycRCnp1lw
K3gyFdF7KFeNZnFRxFlaXsyh6PNnJz/c2ApzET4OfhSpyMMk+NubZ8MXR0+e
/XJj63yiXrmxNc6iNJxBmXEenpa70VSku6PZ+WS3CM9SUeJ/dkcijaazMH8f
p5Pd+w+xWBmXCRT63voleCnKaTbOkmxyEZxmeXCcVXkkgqPxOBdFEfwlTOJx
WAKTwNpolIuzx8Hx0V+IxI2tJEyBK5Ei8bACQvnjG1u70M7icXBjKwiYyRcx
NjnFB1kO7//faZZOJlWYRlUavAhHWR6WWX6Bv0dxeYEMxn8H3uhBVqVlDs+e
TOM0BOFVhQhOXjwNtsVvkZiXwdv/3gGq6j2qEcuJWRgnj4MkRtn8+Z+TKAlH
e2Jc7UWph8OnITASawZPCqh9WoXB2zQ+E3kBTF0Fc2WGsk3/XMrq2vl7EY9i
5LD6PDKsktFCEb4AVkDUk+B/4s/S0/+I0ySqc3ljK83yGajvmXiM77754cnh
f9x/oP7+g/X3wcMH+u8H9x/t498wkGHE2gRiqDrcHWdQZ7o7L+hZEJRhPhEw
vqdlCc/u3YMBE8J70XuR78WiPN0DYdyDMXuPhys+UiPVJZhno0TMdsGilGIm
0vKepM/jtnVoAlvBc4tQ8EqU51n+vgh+DOfBURomF0VcDIIh0w+OFf1BAKYr
eCP+UcU5PSgCrhHoQoUH9w8eylaLfMOttgiu12pN6ApabbomzKPphnsbScal
iMoqF26Tnb5sb/82GOJXz052giOL0uIOXL0p55W3AzsbYrpnEw0Bw7O7G4Sj
ArkscYCfTOMiABYr7MhgLE7jVBTBTM9qMXzDec2eDoNyKoK5yGlsp8BTdur0
NumH3cSgYN5DyfuZy/sO1BdNwzQuZsUeTo/W9wAm9KAq4yT+pxiDyQ8mNLWX
gt7LqwT4g6dzmFmxAbWKinmWQYsmrLLT8ExAQwSo/myekOoCzfO4nAbQjgtg
K4+zCsQhiniSFjhAsnwscqqAm0u1wnOUQJTlUMk8S8cokyISKZaHFrgyLcP3
yCMUCOcwVsNoivKKsrSIgTYWDYnqWJzFwDrUNYJXg1ECehSMst8G3CtK7jPL
4UBRI+sp8jgNywDqDSdpVpRxhIT4fSNalzFg5gxYKJAGEaWOnomwqPKWXha/
xUAbWQZxpuKcxaGEST1a7KFaoZrN4vE4EfjtFg3JbFxF7And2DpeoBDAJ/ZB
fBpHYVomF1hJloOnWPbo7b3g9VzQVMnqU1STiShK1p+xmCfZRTCOT09FjkRq
+vbhg5zlPn7kv//Af4/CAghk1PVxLsmQHFO2m+DHncV5lpJF3IMWI18xtmlg
5NY9TGqskL5K887aSM0CiuDaZuBZTUU4JiphFFUwoi+CKkV9VYqsdRJaUpt6
qXW1ienjxz3XdvoYBEMT2BarQRt/bFKXT7E7oEXzrJCjuQINJE3jQYdKhnq1
lJyATpElZ0xHyQu1Xbhmyx46bOnGPJbtUXEeJ0kwFclcShuViMxOieMkA/5Z
2GCBgPUQGcZqkSV7sJyDM+eoCtTFI7wARQmBRSqCTaMaw6TIuFrQ7LGstASt
9Q3D5qijoauqo1r2mtZdD3iyoWAiWsx8qATmyNlmAoSbgRymsTgzvZdDE0+r
NGIFBf904Fq2WXgBQ3GO45jtrduNdfOkLGSx0D7mYhLmYGrABMjekMpEiuPI
ibUC5Vq0qwCYHKyjkjqqdchlhKI97EI59gbIMQxKEG7k9BhPPvAjmMAkmEPQ
J+gJegvyK7x6Dk1AmbtSnopCsSu5kr+zLQrBRuTjcxxVeVbBCEGRF9lpWXsE
hqmswGjMsM+gvu2/vNzBCKRk/qAlITEIgwvMh/nhfAoBIMyyMBmGbj/Uewuq
Slpch7AooGuUPknp2RrN4pnNQ5p2lPGqaQfNI7eCHzNoBhY4jmB8so53DHPJ
ltXJ6IkBWxH0bkHm/DwLstHfwZhBkIK++Y2tO8GRZvn2bZaB4oYtjuQeO6Qo
b98mY4ySByMBGngukmRXaZfxC5j0S2d+Ja2IRxWNYShaVKPd4gLmKjZr+ArS
DsFCIAfyp9uFI79tMh7v0+w8xW66fXsWR3lmEIzbt3eMNTDaLGDWrcCOFV4D
Rj7dOAaDgYqjTNe22JvsDbTW3avpWrETjC6svmydZXmeKuYiwileT6GOEwVE
RVEloPpkF8cCKpih9mpfap7HyhO05mPUuVyEyS6QTMYe2lKVnAgG4m8ItCda
o96LiwCKjYvg5su3xyc3B/zf4NVr+vvNs/95+/zNs6f49/FPRy9e6D+25BvH
P71+++Kp+cuUfPL65ctnr55yYXgaOI+2br48+utNthk3Xw9Pnr9+dfTiZtNI
odDZFtLUCC4ROjhhsQW2LwKVYp3//snw//3v/gOYjf8Ng/T9/T/S1Ixf/rD/
H+jX4ETFtWUp+Fn8FSR8sQUiFmFOTiaoXxTO4xIUbYAqVkxR16bQtXtbW3f+
hpL55XHwp1E033/wnXyADXYeKpk5D0lmzSeNwixEzyNPNVqazvOapF1+j/7q
fFdytx7+6b/QkAS7+3/4r++22Kk9IX0kQ0M+LajhE2nlh2jWH5MS42PX+NPM
hiYdxhhoMYwrtEVxGiUVTQGTEOSfKzcbRtOsSsEPLqUJ3c1FEpbUvxJdgXlN
MfAUJ5Va7dZEU4A/Yw14jp8wAqJQQo9HKgFBYsWhJLIhXXR0XzCEAW6zGc1X
GO6W7FmrAQpKhC2QcdMsLnGGGcdFBGYjiMu9ALn9CcKU3dMwQiJvyHo8Do5S
1/NjqxLIt8Lar1OgoIY3CeBJBe2bgZ+4FtlIUtGWg2cA7h9ovzR1JBWaGlGa
+JDEkWdlFmUJ8XN0HHzPEWRvTsRv8B39+6NjIsF6Bu20gGZpv+jxsSiruTJa
5C8W+ARH7USC4GA10CIncUjuLUS7FNcgYocePxZ8ypPyW7LKRHf76duTHS4K
4WUkg6eQfsS5Du0F9bESkpoh4C3Ub9Idkox6oQSvX0atHAuyhfrwAbmOILra
LUSkGJLVYDwbgLhOUSll5J/lDgygycu3qIE6AJIBInpbAiMwdpZlwCjcEF/M
KhxXxLQKJVFxEzGBMG5GP3ElECZRrK3MBDV6mkFQo2umICengJicUskdeaRU
NhViXEM0sGKnGWRxYFDyezoaWOw4okL8S38YCbr7r9bPXX7jkrte4oDBMxPP
BpfqDaK063zu0kPnDf5Xfy7VG1DyO+cN0DL7DaR2Y+tyAR2mZd7y82Pe6mq5
eWvBx3nLW2PjLS/3RhLyQ4+lvuNvf9JUWaJ+WvZvdW7cvv/wOLhFEQTZBUIW
//OmsR17Nz+itvAwpHfAKYCItEKMUDmHpHowrE/jSZU7AWeb6dgLfuZZRRf3
wCQDpgA6IIMbcmWMxaFBwGbGNi/g3JUYAuJYE+UFjS+yfK73KGWKYSf8BSF3
XEyh1Bh8vqhUdg2bQtUoTnDAxynYkynY5smUWYJwKw7zCx1bE30wJjg5ypDT
ojeClgoO/2c41MdxVNKEDb5xWY9WOLiz45SmRfyhylEIiD6wyGTTpiF3D/hm
4SjGkJvRMbKuEBtDs8iQCUaJmoaMzJ4MA7RToMEk2TKUCk5VIoFXU0vMA2XO
sPcoBjGmTNm7ko0YP8xkLNjPiln17yHsHTOwkFwYtQHh5uzFk4VU6AJMMxMk
ZqMRMSuP8Z/BpyWtAdHD9Ch0WKDM34mar7CRUsufOENATbvQiEhwFGwkdm7U
vzED1qDN9DSpEDTwBmJ7WGkBjRT48qCpQNKDDMjnjtMzQsLmWUHiUnEmcgR9
oWanJIsYrFVqDkKosRrgcjT+BBxqFYnLojHPnFjej2MiaOJT47ZLHKg+wNkF
65PrmyqNGlPAg3401IOiruaMxRaiVquanOMZo6Vngp1T1EMzvStHl/u1xreY
KcRORurkO0youS9hEGJIPpDzOPoRwBT4WVRLmskqwN1nyBy9Za5mAAoKXwhJ
mVrer8Jpopr7qp6DSQJPcsSOivQYlc9e8yelbKUkwVYm0OOn0uxUBSI0OGQS
bug0nhfaWh0dsz/EJA3YqroLZQ/eKQnhZ0Q3Da7TazgP3M6XBmKuXEFpG5XR
BlHghBBNQ1ypgmjon/yO7Dkc75ICzVJMnPqPYBTlvoEImlbP6BnEJC784iwo
CA52WcmgTwgLZluXoHqF0sNVFkQNfzLpbE5Ik4eWTXuejjGUA0oakSnktJFQ
TIjAg20EY10g2P7v4fNiRwHAOAErbKg5o4DFhOdYgA3DSMiVHYWFSSsiy5iV
Agc+aE5FR2DmWdh+2qd5NuO+IOwGZTOvbMOozOwPMGGIYEh2CoboG+gfbU+1
74zlPLOWDMgKE3JKoNbqePmqVCuyf1GegfaT4YdH7XoRcy/n4jQRsotb5kW3
Ma/EJFzQmAZ/7U3xtLu9MXW6qzdlKANYnHY0mn0Sz3SDNJahXrRhb2AYu34O
vU+wmly8iDNtS7nBihEXGGmQpCm7mo9t90RiFmiB9LQzDVN8eYqQVTqwoiwa
jsgTmLx5YevnCMRL0yOIwKWjgFcKmoXqOVW1ZEfi3zAISMBylArlCvrWAZyl
IhsmsKX+EuxOCDwMeVZHrk54ioZhZOkUvz2Tb8/N26V+Ww1KKQXz3MBOepmX
QByG63vhTXIh1O0/xx90JOLpar+fJ8VhUCwqekLrZW/EKTRmSsCONcCsiQff
Lend3LxLrid8F7mG1OkRuEOsiyENKO7WglwcQ4dCA7XQTZrnNLI+rixeejfw
BzNbtjTLmk+Jc6dfa9Va71o9jv1a63KrjGMz2rqvd9PeCDmLvqVdG6Fa8peA
Pv9Wmd/c5jwZviU2ZgKinQv7vaI2fsjpxYY5DpBaVG2sF1sOqGbW3cp5QpHY
h1vOlCcb9bxex4dbVO2v/OjXIjyTrzZmNuTGMz/Q/H/nzmu19HTnzuPgmR2K
AZe3oYFUcq6IseYCRX6eKoq8Hp2CBUzHCTbGmj3w9ca8I+ePMW3QFF4n0g70
REqLOmYZr7bpgy2HAk7hbxubhfK0kwl3PknX1vxC2xcmk5xakqW77N/W36P+
es7dd5olSXZOu3rcJRENLMqoOEF+I8u3kUABEsYh71OeBnTWAR4t+tz1Akut
GFsX4OR+LJ7utrzSUfpSotHBfh1Vckv/8Px7NPMKoluy7qfYTPV5BYHDr9Ns
Dn/ee8c8tH6o9P79Pfrfvf2HgRbWviq4qHT7r/jPu3urlLZAtpVkrrHOTplf
Rd19ZW59TiwUulAytx+2lJbW/fnQgthlaRt/d39364Yxane+LF176tS93ght
F4rSFv87NdDVA9nWCAU2yhvUINzau9tWY3fa323jwTSQ4V9nf5ZGGYqLVMHB
Kp5ss+A4tTiWsriYzUSJe4iU76ysrMST79xRNJ9YNHHNTw1nnPBqW8ds1j5+
pFVmBXAWwo57N8CnAQFwXEqXhl0/AmhUFfeaSI3az4N4L/mU2p4iwhlFCuFU
k96eZcHiBm2zzGg8lKPjgVOBZFN5aDCTnae4C/I0/g0pWtriQN/oyKlVLcMC
MjmGab+MC4noSUIel54q1Kiu9iJa0GRJwbiKtA1SRqGFi10z5Eatm4Xv2eWJ
01FWpQbaYZsheGOqPf4RbjGhHAHEJ7InExHm1pqskPtFsYE2BV2Y2aGtTolw
GAThQRA7RgjDZYsMYB0qwsZ76rE7j8MN2SFyS3CHhtrrCPa6q9NkaUoVByRr
Xq3BIQNfsrncMIpFq0JICAtPh/DKim2MLdhgmiXjgs5jMdLEC/qKMu0Hw31O
GgK+c4d8cojywZXFihHOQsxOgLPGq0XS9drgIKbtWvt7wQ9xXpQDZzWXWNUb
EaxlJq3Neh+nQRXUnr7N2RjiA9E0ucI1quKkNFtoCGPrsoGN5fS2tbT6urmz
bq9RI+hdberIl5YrJNSf2kTwBo0DIsm7gTSOb3ap2DS1AZQgPYHpXDXtmieI
qn3KoOoOsRut9R3ZFNVdhR5+FQUh9iALi+Zw3G7apx3W7jqZg04y9chpxxnS
eACA47fkQu5Za2LQZq3NYzPN4odaHCDTtP/4j1jgj4/3rREmfptzl8qNcTDQ
aBsYbv13VlKTLHrvxQWVBuCeHD8IytWb2YLXeW2rwt1l+xhreGD/aguTesdJ
60ZK3QR0sHSw0H/uFS/Jl/SAsQl0h0zgiuI/LS+pqGmflfmREzXpgAcpmIHj
vFQLu+A33XBTnmInpxbzkkeIjbilszs8BBphU2fsehUc8Ef1AH/xvtIk4MRL
FoF77ZGUIuCLo96Z8q2xVI0DJ2p6pCjcqz+vc7D2cG4XEPHwOUKqekR1NSFV
uE5MFS4MqtpchXBj8VIHD2sETBsLgBxnw519F8RG/UIjxz9CWt2nNpTztOEw
R1tXCnNoW7Yd6+h+6BkuPWIn0iGlZGjiJvqtO3jSfLUET9o0t9KitaZ6kPSo
5nLYbmfTwVSBa0mHnagSaDETFXY0Du7r9wJPEBWGlI+nBiOq/MCtuaXsvrds
Q9eWiSRpfa4hpEWRZOfgXRRLfoIoroO7x8GXEMd1277VA7nwqiI5Myg2H9YN
LDNr1YWDoj3E65rBVozx9PDqGZx5I7z7nUR8geLnivHqEd6XEd/19ZNanLVL
7aZtK0h0p80BW4au+pB3fglzcLtxXYzzX1qetHSQFTCw/iJg+756XX8LawuX
szpWs652OUsGZpdsaNndU90bdIfEPev+5pYQLa1YvnTfZdurqHuFJUQ3yP3C
lxA33t/+MLJvaX5k62zglX176Wa4ewWcr2cVzRdPRB2r/SKegLplL8k6a5R1
ku1LlDZjbQdltNOt3FDeVd3Gd8tW7ib7age2jjaNe1pOPfvqZ3Ge000S6D+A
+neuNAzMMRjcgjymC0vUeRbLMcSz5/q8Q3NX/17wk1CnVOQJDTxOw46i47Da
K6RW5dJTKGIKblTF6sXbBW2sN2Gr+E1EFUq/Ll2QO+3ezub6shEXoZAMW++v
EQWtr5Ooa1cYBa3P4CoxUG20fMrFLGcB/stZztJYwJKrWT0jne61rC8p0vmU
q1m2LlwHPFcc8Pxr3ZgnWDfs6SbQI/roJuAEP90E/Kt9NQ7clw4cAl/YkqAJ
5DqWBM1LvgU5S0v8DxpCrEu/FgZ1rtBeBQdSAL+rJcF2RScKK43GvpFGK4G+
YVIngT6R0oaasLZRNV+6QibfImRf92/pNUinVmcN0uxCsxYiN8GHDn90DS3x
jzz/VYQzoe496ULo7WOhfG1VJBQNdWC6dibeVy2uzohc2BGQ4746QZA2XVYk
RfziAqjcYtfwZmvOsbqnzr6/xH3XWbD1BEp04xveJYkXky4KkoKrj5I6ev8L
CZM2vFZUH0VfQaDUtkB0FetD10HT1xA0fZEnpRY7+yssc1j+/PUyR+Nzvczx
pS5zfHsnpa50W184mfjWHxaeVl1nJeKolXj7moTkE/yGsZjHUbl4KWJxE3ov
SjScEcdV0ZcRtrKMN/WM8QadiK/QaVmscJHkutvjWWLA66BiyyPWd6cT2KUN
JfkZuRjpS46Ui7PMwWsrKAhx3yDOE3JBiG8bk5GD3nW3ugu9SeW7ytMzn2e9
wajVZjzopXdYfQ0LDfuf7diMf+fut7KnqnW4f+1naDYKmPvcaBTwwTVgfg2Y
Nwh8tYD5ZzpDs4SvfanXKLUC77S4z0t42is72j7UenlnZ2n8WlW9GLreLDNN
V9nySTz+r8XnIli7Bib2d3C0WaiBycbXsT1n50rqFb1lKcD2mdNm65N6zV2n
Fa7Qbd4oo6t60BsFof0atnl/+lPi0V/keYXfq3PtdHabp433sjm5OZxLWeVl
brvytN8uHe/T17/VLmnjBD76pseFdxrWLk+jW7m9d2fWUyJ13fLo3hAbcB7C
gs6k8eE0dSUmKPdkInK+PFeN3dMwTujGPbzYQ2Y5a9yijL27kE+Q9D+qMC35
Sl55P2nrzaADc90p9ae+S171cJxC5+uF1rDXxZz4xHM5Z/ctmbU2NK4ZLheU
N3crDnQeMbmWVmZlmAQFUJIXbTpECp3xhdQeL//N0nGvW+fu+jysu37Xi+LB
ZzyHj81V5pf8K/pzvowD31mRWpsTuyQXfr8PFWOyi53gJCRgtFBeVtmtePLS
SJp+FIbqvbrWGuc2dGo4aN3FrZkxKTEXsdU1auSNlbZh4NGgbh61rhhFUJSH
AmZBUFfdmlGdoReh0sM5PymiMAGjqpbKh6NKsAnWVbbwMl9iSjc10yBcykK4
F9NbvoQtXGfipGQIaMSTi/alagXbFnVTlilrZu1F94kYxi+m3kwpewt6yOM8
POfu41PCK3iSrXbN6v1FjmGnbXQUYwm/zVHklV22zl5pemGyIKWzZT9Dy1ne
+azOY1NeqUjkGJKgrCK8MnSiW6suMNfvyxQOxgqTPJQXsn//3wfBAf6zt7c3
gK/3/33P56e1S5nO6CeRNIzgjWfystxM7vOZFA6cP1KZEei+6FRdftacgnxV
Em/DRGAkKRMDhKVXI61rwrEVDKH/OFTdj3xiBqdiyoKpvV1Mq5K0dMzKAcKm
FYg9vFzYSufojkQ3s5O+DmHc3EjFG5pk3i9JnNZcinpRlRzyuKJjFw4gbV/n
IBpldCoK+tFuw3OyhKpvKR+mp7S50q6kmwHsGwGUHmPWUiV91CXTd/xuA0DP
oa+zWULHYeg+6CwVlghaJajFEM/iJMyJdatq0mB50TL2HfcbZqI07dZ1W3ey
yTTN4RxUgVvG17LZyaa0SrPvjzMJFsxxjIZWKmSVjFrewJvgAOUgnWspZH5X
ZfwIbtCJMsnNp5QveN/hnC7L1lZ1af9P53DRSYVg6PGlFTjKRIf1ZZM/QLtY
JWN/KoJZxom4cXyHYA+nIkky7dVRDplIQKNUemeGJlTpWqdanS4tlNSNsaUY
FXjEieuZypjRujtdbg48RcOFfifIj7LqMkAA3TWzIo0OmZrJkKyXlAS0CfSA
JwF4DUzAbE533VNbVa965KE2QZpSKpdx3c3WmeUw9WSH66MMb4l0rDwHmCpU
2mKDAumdl9aNFTioYXBojuR10CrNi0kMboxsfOrtPFwRwwtobk6ybDy6ENz2
m7rxYyv1uXHYXPtqbMTtwvT5QIbv+CvKmK5dtGZ2YNWp0+if2XvZ0iPn1KEX
sYB/ZQ5qsFgR2jrLDab9otjrM/g2ZvFSor8H9x/tf/y4V/ORPYkGaq6yRijr
M9yCeIoy62qTYfnSnb7LcjGbc2d8z3QInBpL387aiMyIA5gJ6lm2FBjCXYt6
j2YKNckJCRSZOmYC0wzdFGOmG+X0UJ4xzN0VYeauHrd9Wpfi8PCWcpaxpqW1
NCVhHNqMPyW0oAUEGrquP9zdWUu5xd1d+JX7yGx9vMpnB0E9feZl1GcNF7q7
R1b2p1UrasrrUVY3TvAobuafKRfANj727FG1EQbNmYKV5DlCz/Ms1jh3P56k
X9Ils1tuDhIfGIoXnXUjoU66ioX5QXw4aK2gN39LHRatlWnJgEpjdxnSXiSz
T24Z7dTKjCnSJlPWVp1jRilYd5YZTAao8mpAaY3TENQCbmFu5QBcnD6R/H7p
CzfbU0sqo1pRy91UAzNNmiAs6wTpdTTzzp1FSXw87ganlLNTKS7ThZwsRnaA
talQNJKX8j5JlePM74zERc3joHWiSVhNZA1g0jMyvbVMQpK7HrmEKCAl1xAm
hobPMcvGZg0WOEyLWQxjB+MluYzq8T3gNU592eJ7cIcu63kAh5g6koVXy1fp
JjhjBbd0F0nRXY61HJZuMfghj9UV5lZXI5IwqpWki/zQcJv2RFnFKmNXz1JX
lLXpUDogBWLrEhvKFb2hJTS1j0fUK63VtT/0yfyhXv2xtjeUVrMRXlXYR5NV
GYuJNp+o3zzW1yPaAJMD31TeV65tXlEfvmq+kU92zsxlZWfzzFdqz7gxJ4un
ZU4u64+QzUJU59Rkj0I7/7GcJGDQB9DmdOys+Lfk7AjxbRIyEoL2CIZSVpss
7DwatPivLAuCLqY3CKJxpmucrAlSpJO20JyizHIc5AXUa9bVulJptm9lQAR9
/PeKcU+ZOJNmStBDNOPitxDdrkFjl4M+PUvQlT0W1EadWvTdcJA0mkairE8+
loqQUQQBrz/7+PVuyTmnnnPwm5lpmnkPrb1EiwaLvVJZGjxoXVznivbndExh
9e7daBhvtN/e5GO/bWepXDxlNUKVVUL3pZnyTlHdcusTrjf5qE1JPtmoHJRu
SkuK1EW+2RyU9vamFXd1WkCrb8LpSEep8tN7E5CbPI5mwNYveVa7JSmZOdak
XgCGoGNEfg/icuetryLhY9BgqrHxvqs0/XB0HBxuDw/9d5x013333jtZ6713
d5esO3COPni56yxtnxTwFe8+zut88xTvKn0v2H5yMNzpKN4s7dn+freteLO0
/ApjaHv4YMd+1CzuqRt7B+qjf/m/pqfeLSqthaWeOyXc4qb08OBvoFcHv2hJ
+/l1vjTr1t1knrcWb5Q23XRplxg+RM4e/uJ+wdItY8mm2lI7llZj6WB7eGCN
pctguI917P/SWrxWtz2WoPQjXdpfvNFui62g5QfzRZfW9Ri23NK+4lZ/79eK
N7SoWdzlnDtLsUVf7w3xkfOD+dJod7sp9PzSKH1Jnbe/PdwfgCx2rAaojn24
PVQXQW24bvrIY1Bd9jJoSs1hsl/pjR/250qbB48aNq9RnEtaZ/obZ4d8NAw/
eusoTN190pc6XkCcuu7ECqlLbXp6E2mNF4hvNHIe+k8Qrcue7+IrVGb694D+
PRxYYRq4P6jPdhDeFXUtSPCTArE9qs/N4YMtxPpdGICeNN6zUQLeB9Sgc2gx
3vidFo3wJTeEIqb0BjCV0BT+GPLxgCFlT2ky2cwfRHE+vXha5aXaWxVO9LoO
b0llj5JStDhVaI6eNPBudRSC0sh4STn8UjzH7WJpHLTee2b52PMQ5KM2UXUn
mQFVk7XZJyHd3x9gDyATu98hB/TvA6MmHddOac4Vmt+q//aFatw0Md5AOpoV
x9gnyUSzIm/9gZemadrMWS7dq9YBKEWrCZF40s3IFqvorY/16kw4s+L5LYMG
uUevcEB4TlzRkHyAP/mHyoIrwoKzOLSMj1uj//qEhTV2ZSXV9X3bd5M5p8HY
ZLcc/nqldwUycBSZe2CsJaaWkSNHWiFKhPwci0znZNUeCFzrXQYvaWBs2tJh
mrCETxSNsjOJfS8+Z3ceJ0kgTk9Vf0sx10V8jXcsrvsa7/AXv8Y7rvGOT413
NAELKP3q9a/P/s/w9ZuTtuIG76gDFkvhHaaeTeIdO80frvGOa7zD8OPFO5L5
fCW8oysraBTS5dGjC6Pp9jUpteo3BXEsyZG7d+8aBVkFBbHhDwgjWAS2hGez
KuW9jNAvIytd7/c/Dg1RcuMKuUOExQ21nOHmDepsdMLRP6+DJ/pgjAc9sQNG
+8CcbM5meLUhH3CfkZ/YnMJKs7IHqwpEOmyDd/xpj93WGlwnriXx+hLwHT1i
vmp4Z0n78vkxnyUZXg0IYgP+jQJBq97kc40EXSNBi5EgHjrLIEG/RzTIONNr
gELGn1/o8zd96B5EjtKLCE/YHosc/IS7XKYLKurHCQS3CoPoy4nno3ABL2zU
TWR4iHHu4S8O+KQfLsnJWhiUpmGjSctBUW092weRMj/Rvxa25AV6FnFiY0wO
wtSNT0nUgqc01SmXzZKLYao64uRrRRta1fh0olYLQCtDQ3asDV4Z9Mn+2xBp
HasOhuX9G4m8LeyxWoOiXDDKT8XLiexVQ+SRF9F61ySCHw8wVbcePio1wTbx
qc79PHV4q/6xQbJFVNq7+J3pYj2IbbjK+ruNSLtx9vzSRuTSQFeHBF8tBV5t
khP6rAdlOaxfNaJl5uKuKvCftqwdPWAup0FdaJd+cfvZeCLk3NuaEdVf9Y2t
n02KtHbwZniobpcJ5UwvY2/0vOAXitE1RkBxFSEp6vKh738c7i2E4sZF3heK
I99WxXLgZrKvCk4kigDct7LK02D76fGbnTbgDSpbA3hbov5rYK0FWPsZNwVB
FJ6TRKlqyqyAR6+gsfqsotI492rxgb3tnw7lIIjDwAtDLoS+QIDKR6AlEe4h
VeOhuqOIgSVVMbFZQWBOlw3NFCMCR5lLYB/IOwGw8w4nipjyhaYWZMaNbgWY
FKx1KCNKgcTa2rpv2nrggAvNGLwd0TpcgGiBmiSZup4k5N50Kt489NR/hH0W
oKnOHjCyGn7EdugbxY9qEloNMzpsR3AOlseMyM4ggKOk4AIYrV1EV81JJWRj
QRrYQE0sA2APdw1AeEaPPEaWRRWF7vqyqUKqBUg2KEpoYCSxb5eTTQzvulYs
hzd1A0pdfLWCTe3ZXVYEm3icffPbjlyHb5NOrs+17g0/1V3SDj99If60DDEV
0HRuWVqCs9remDWI4UfiBn5cakliTiKdNYlZQJWX2jLEHMCqBfVaRKxFJfzo
1yJibQCWFwVbyFkbkOVDwzzEegJaHlSsjbMewFYTHWsjpj5dAFcDJVtArBvo
cnZcYbDuwAjt1sEPeDkYEsEHKp6+3L2U97J7uLWf+6khse2jssSLCfKdSyyh
TE0dRnMq8VJjzraH+7dp+9mfLvkqeT9n1v4wL7VOmUnDpWVmbxfzUetU2jpE
5+we81BrUY0eUJ2HWgux5s4ynwrWqS1WWh905wB2zk6zDmJds5sfOPMQ+0sM
XsGMkNvGLrT6CowL5V0dZ+v6GqD9OgUTOjl4Vwn7jGaXhzzSH8pxZ/CIOC3i
MQE3N7bIl8vyRmR0oMEovt1ZnlK3nScPJgV1J+Ror4ZLyeLkchPXRRseJd/0
ZZtdFZTyVN62GabxKsra9dkt3zN8j/ugErySWSa0lVkDFOgDsqer74S88UD3
GNGQ97yREkMpkyYObSDn7uFg0EaCJKZy230dVJvcZvSfIb6JJyn3N2duoT9p
WV1RcsJnJ1NSW2NVBiWJpqmWMKEwzzG5mS/eMP1pbh+SWBAEd3kUh3hNUsLB
yzSeF/pGE1JM9lq51YXKgaJuW3Au9DaBuHtHUEi3A3kK2kG9Koz1jcexuvre
IIHe0p8A7mnq7hcB8zTZWg31MdqxMeTn4ItFfnx2aMObhx4uu5WHkOo6DLSx
XTZdeweX21+zBuJhdOwa9VjGE+nwkfhzjXqsTAw/16hHP2LXqEd/zq5Rj68W
9Xh4jXq4P3mad416BN2oh/Pm14R6PFyEejz8HKiHWl1cazPOAsCD37L34KwN
eNQqbgM7nNeWBDqkp7oM2BHW6ZNu0oqsuoC1DPOJKLtxkQFdZQnVJDM6ZwWv
qygR3qEaPiG8oXvvGt1YfTOLVFS8++oLwDZcplbczaLU4ncAa9StzSqQRsfp
pP0VII39zwFpPPyER4a0fv2+IA3/p/dWYvfDDlCPjcXNYsb5X6pYYFY3+xZb
sW1WlffeLVtiM5vD5VGtfiDROyf2rTHV63TTSse1WiXQ/mAJMuue+9oYN5s6
QLbWETIPlZVuMPJQWf0YmQIk+I7Ype802sxRsnZUaqnDZKsfJ9vkgbINHynb
zKGyFY+VrX2wrL4JYa2jZRs8XNYLxfHRaYh4QwfM+iA3TTpdHb6BQ2ZrH+5a
jNT4cZoNcbOhOXyTO1IOfdiMOR7Vic2Y5dNdrme3cWORimFX3o5iLmceqowO
w1pGB76h2cNMny0razDYhuJc2Z6SA7mn5BPtIVGHhUo8DkUnhhych7nAWGuE
aJWMSrihA4g/pmHFGEQHIGTjOP4O1PkplwFx+EiXD4iBHsfULTlmShbOITI7
casDuzQQng5waAMYT7A6yLO6Ll8pzLM6Wz2hHr/mbAzoOfxigZ5PsH+l6yBT
G9hzoO6hcc9VKsVeDE1cxSaWw7URH7+a/Z7xniWxkCXxnSVxHY3nHPTCc1bB
cfrjN5vBbXrgNavjNGviM2sCKusjIOshH+shHushHashHGsiG2siGssjGZtA
MDaEXKyHWCyJVKyJUKhxtSIysQFEYl0kYk0EYnPIw1qIw4pIw4YQhk+JLCCi
cLAIUTj4JIgCO21rowkt+zxq1MF5wyACHUF1zcqq6EDP/R2NDRgpS6OxDePg
yrZhNMLuplBWCrm/6pj76kNuV0G+iHB7pV0VTW35HYTZm9hPcbDR/RT73RH2
Fe+rOLjKKLupYd9khI1Jm59kaZlnSTCkVNNDk7td9SOJENpdzQeGOVZgmTq7
IOnAHM3Gw0rDjpR3KYn1biEiEGM9gzwQLLMowQEA8waMO5ztDQ98IS+9AyzO
oPfDiVB5nmu55nWLnmLm7M03BxNyd7bFytiNZrAMRwkdHQWK0xqvPLF6M3w3
WxS8kTNn8LaMk/ifpLN0Nx1i4jYH2P/IFV0pJrnmy763ZSvstN0feX53LgVH
xrflvqLauwqmD5O4BHvzZPg2qAw/wTZVKvtcNstt5Y4U8izLL5YuCua/SnAS
lDnXxZgGBIwn2Wsz2iRrXovCeUkGtRDzEIcaDAWUjQjBYeQ6cNym1EtjUYYw
049rfRQmF0XMybqxF+ZZTqL+Ad8o8ekzJEYaNcXFC/Ra5DtEpeShiS0DQaLp
nYlgkmQj8EJI9GNMYhCRV6MKgqszz1JsDCsllYlTSvFehYn1e6DP9uIt27Gc
naxXS7Mec5ol4D3w5Wh5OBMlrqkoBXZzXegM5jMRTcM0LmY4XJC3Inj59viE
+4BQSDFWczqR4UbIxQN88MShO9Q1P8Y3AlxnoNfGAtxJASLMx+fYHG72aUlf
cPUHbTQXgelWoVBlNs+SbHLBPxxKWvomb5WmQRZ8sBccX8DsOKs1dlvsTfbA
Ekyhp7EjQCTgLJQovhl0boybl0HVB1JvB6BXEagd+LlzdBHozneiO+CBdIq7
oUH3wO0pL3a4bvAEn3ITvXU3bpEfmPBTUni05/YIPkWdfH706ojyQqCvJeey
D7fw6Ue2dzAF6puzUEXTjMuENJSlG3ErOBbgzWF2iwatQv7yUdnPEdjo6SzM
35PRobVCPVGp7BK6Su3byIPncprU1t8abTLgYV0o1IbyUJkFHJwJTITAGZoP
54pHtiUUbxQZevlj5foV2htJa1oTiNk8yS7gVTKYdqNmFQ8JsKtlRVM3Diox
B6+K4hE9gUCIVVLsOWNdONNTMWbYwDdgSMxo5I/jwria5LiE6YVurLQSaLnm
GV6FGKNPR6HOhet+go5lyJDUPrTGOIuNK/Z/ZCup1RCV06Io9/BR9D7NzhO8
f40t5Ydb9UcfKfxNq9lIgN38z5vkrnDY+hK5BS7T97QyfBT/vUqDn0NSVeDm
R4F/HVcF/P0T9DkMjx/jCgzvXICDmoGLk4SD4GWcTndfTbIoOMlD8JiPYfqa
pFDkrxUOMqA+OYf/By+g30UZ7amd7jEMSVQbNNR895u8hDIH8YlzvF7VVbu9
rf8PJxqxo+QfAQA=

-->

</rfc>
