<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.29 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-06" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.30.1 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Intra-domain and Inter-domain Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-06"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2025" month="August" day="29"/>
    <area>OPS</area>
    <workgroup>Benchmarking Methodology Working Group</workgroup>
    <abstract>

<t>This document defines methodologies for benchmarking the performance of intra-domain and inter-domain source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing, and many different designs have been implemented to perform SAV in their corresponding scenarios. This document treats a SAV device as a black box, defining the methodology in a manner that is agnostic to the underlying mechanisms. It provides a method for measuring the performance of existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are encouraged to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> depending on their network environments. However, existing intra-domain (intra-AS) and inter-domain (inter-AS) SAV mechanisms suffer from operational overhead and limited SAV accuracy in various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document will help operators obtain a more accurate picture of SAV performance when their deployed devices enable SAV, and will also help vendors test the performance of the SAV implementations on their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support multiple SAV mechanisms, allowing operators to enable those most suitable for their specific network environments. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy (i.e., false positive and false negative rates), SAV protocol convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a software router, a virtual machine (VM) instance, or a container instance running as a SAV device. This document outlines methodologies for assessing SAV device performance and comparing various SAV mechanisms and implementations.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing “which SAV mechanism performs best” over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall performance of a SAV system (also known as a “micro-benchmark”).</t>
          </li>
        </ul>
        <t>This benchmark evaluates the SAV performance of individual devices (e.g., hardware/software routers) by comparing different SAV mechanisms under specific network scenarios. The results help determine the appropriate SAV deployment for real-world network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An intra-domain router facing an intra-domain host network.</t>
      <t>Customer-facing Router: An intra-domain router facing an intra-domain customer network, which contains routers and runs a routing protocol.</t>
      <t>AS Border Router: An intra-domain router facing an external AS.</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup is generally compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and to other network devices to construct the network topologies introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that produces network traffic with various source and destination addresses in order to emulate spoofed or legitimate traffic. Choosing various proportions of spoofed and legitimate traffic is <bcp14>OPTIONAL</bcp14>, but the Tester needs to generate traffic at line rate to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The positioning of the DUT within the network topology has an impact on SAV performance. Therefore, the benchmarking process <bcp14>MUST</bcp14> include evaluating the DUT at multiple locations across the network to ensure a comprehensive assessment.</t>
        <t>The routing configurations of network devices may differ, and the resulting SAV rules depend on these settings. It is essential to clearly document the specific device configurations used during testing.</t>
        <t>Furthermore, the role of each device, such as host-facing router, customer-facing router, or AS border router in an intra-domain network, <bcp14>SHOULD</bcp14> be clearly identified. In an inter-domain context, the business relationships between ASes <bcp14>MUST</bcp14> also be specified.</t>
        <t>When evaluating data plane forwarding performance, the traffic generated by the Tester must be characterized by defined traffic rates, the ratio of spoofed to legitimate traffic, and the distribution of source addresses, as all of these factors can influence test results.</t>
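        <t>For reproducibility, these traffic parameters can be recorded alongside the test results. The following sketch shows one possible, hypothetical way to capture them; the field names are illustrative and not defined by this document.</t>
        <sourcecode type="python"><![CDATA[
from dataclasses import dataclass

@dataclass
class TrafficProfile:
    # Traffic parameters that need to be documented for data plane tests.
    rate_pps: int            # offered load in packets per second
    spoofed_ratio: float     # fraction of packets carrying spoofed sources
    source_prefixes: tuple   # prefixes from which source addresses are drawn

# Hypothetical example: a 1:1 spoofed-to-legitimate mix at 1 Mpps.
profile = TrafficProfile(
    rate_pps=1_000_000,
    spoofed_ratio=0.5,
    source_prefixes=("10.0.0.0/15", "10.2.0.0/15"),
)
print(profile.spoofed_ratio)
]]></sourcecode>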
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for the overall benchmarking tests. All KPIs <bcp14>SHOULD</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>SHOULD</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of legitimate traffic that the DUT incorrectly identifies as spoofed traffic, measured across all legitimate traffic. This KPI reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of spoofed traffic that the DUT incorrectly identifies as legitimate traffic, measured across all spoofed traffic. This KPI reflects the SAV accuracy of the DUT.</t>
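        <t>For example, both the false positive rate and the false negative rate can be computed from simple packet counters collected by the Tester. The following sketch is illustrative only; the counter names are hypothetical.</t>
        <sourcecode type="python"><![CDATA[
def false_positive_rate(legit_blocked, legit_total):
    # Legitimate packets wrongly discarded / all legitimate packets sent.
    return legit_blocked / legit_total

def false_negative_rate(spoofed_passed, spoofed_total):
    # Spoofed packets wrongly permitted / all spoofed packets sent.
    return spoofed_passed / spoofed_total

# Example: 100,000 legitimate and 100,000 spoofed packets sent by the Tester.
fpr = false_positive_rate(legit_blocked=200, legit_total=100_000)
fnr = false_negative_rate(spoofed_passed=1_500, spoofed_total=100_000)
print(f"FPR={fpr:.4f} FNR={fnr:.4f}")  # FPR=0.0020 FNR=0.0150
]]></sourcecode>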
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time is the period during which the SAV control plane protocol converges to update the SAV rules when routing changes happen, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI indicates the convergence performance of the SAV protocol.</t>
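        <t>For example, this KPI can be derived by timestamping the routing change on the Tester and the completion of the SAV rule update on the DUT. The following sketch assumes hypothetical timestamps; real DUT log formats will vary.</t>
        <sourcecode type="python"><![CDATA[
from datetime import datetime

# Hypothetical timestamps: when the Tester triggered the routing change,
# and when the DUT logged completion of the SAV rule update.
route_change = datetime.fromisoformat("2025-08-29T10:00:00.000")
rule_updated = datetime.fromisoformat("2025-08-29T10:00:01.250")

# Convergence time is the elapsed interval between the two events.
convergence_time = (rule_updated - route_change).total_seconds()
print(f"SAV protocol convergence time: {convergence_time:.3f} s")  # 1.250 s
]]></sourcecode>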
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the control plane processes packets that communicate SAV-related information, and it can indicate the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane forwarding throughput for processing the data plane traffic, and it can indicate the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>The resource utilization refers to the CPU and memory usage of the SAV processes within the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the false positive rate and false negative rate of the DUT in processing both legitimate and spoofed traffic across various intra-domain network scenarios. These scenarios include SAV implementations for customer/host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document introduces the test scenarios for evaluating intra-domain SAV on the DUT.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> illustrates an intra-domain symmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router and connects to Router 1 for Internet access. A sub network, which resides within the AS and uses the prefix 10.0.0.0/15, is connected to the DUT. The Tester emulates a sub network by advertising this prefix in the control plane and generating both spoofed and legitimate traffic in the data plane. In this setup, the Tester is configured so that inbound traffic destined for 10.0.0.0/15 arrives via the DUT. The DUT learns the route to 10.0.0.0/15 from the Tester, while the Tester sends outbound traffic with source addresses within 10.0.0.0/15 to the DUT, simulating a symmetric routing scenario between the two. The IP addresses used in this test case are examples; users may substitute other addresses, and the same applies to the other test cases.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To verify whether the DUT can generate accurate SAV rules for customer or host network under symmetric routing conditions, construct a testbed as depicted in <xref target="intra-domain-customer-syn"/>. The Tester is connected to the DUT and acts as a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (with source addresses in 10.0.0.0/15) and spoofed traffic (with source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the sub network.</t>
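          <t>For example, the traffic mix in step 3 can be derived from the two prefixes above. The following sketch generates a seeded 1:9 spoofed-to-legitimate source address mix using only the Python standard library; actual packet construction and transmission are left to the Tester and are not specified here.</t>
          <sourcecode type="python"><![CDATA[
import ipaddress
import random

LEGIT = ipaddress.ip_network("10.0.0.0/15")  # sub network prefix
SPOOF = ipaddress.ip_network("10.2.0.0/15")  # outside the sub network

def source_addresses(count, spoofed_ratio, seed=0):
    """Yield (src, is_spoofed) pairs with the requested spoofed fraction."""
    rng = random.Random(seed)
    for _ in range(count):
        spoofed = rng.random() < spoofed_ratio
        net = SPOOF if spoofed else LEGIT
        offset = rng.randrange(net.num_addresses)
        yield str(net.network_address + offset), spoofed

# A 1:9 spoofed-to-legitimate mix of 10,000 source addresses.
mix = list(source_addresses(10_000, spoofed_ratio=0.1))
print(sum(spoofed for _, spoofed in mix))  # roughly 1,000 spoofed sources
]]></sourcecode>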
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-asyn"/> illustrates an intra-domain asymmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router. A sub network, i.e., a customer/host network within the AS, is connected to both the DUT and Router 1, and uses the prefix 10.0.0.0/15. The Tester emulates a sub network and handles both its control plane and data plane functions. In this setup, the Tester is configured so that inbound traffic destined for 10.1.0.0/16 is received only from the DUT, while inbound traffic for 10.0.0.0/16 is received only from Router 1. The DUT learns the route to prefix 10.1.0.0/16 from the Tester, and Router 1 learns the route to 10.0.0.0/16 from the Tester. Both the DUT and Router 1 then advertise their respective learned prefixes to Router 2. Consequently, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester sends outbound traffic with source addresses in 10.0.0.0/16 to the DUT, simulating an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To determine whether the DUT can generate accurate SAV rules under asymmetric routing conditions, set up the test environment as shown in <xref target="intra-domain-customer-asyn"/>. The Tester is connected to both the DUT and Router 1 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.1.0.0/16) and legitimate traffic (using source addresses in 10.0.0.0/16) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the sub network.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |    Sub Network     |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network under intra-domain symmetric routing conditions. The network topology resembles that of <xref target="intra-domain-customer-syn"/>, with the key difference being the positioning of the DUT. In this case, the DUT is connected to Router 1 and the Internet, while the Tester emulates the Internet. The DUT performs SAV from an Internet-facing perspective, as opposed to a customer/host-facing role.</t>
          <t>The <strong>procedure</strong> for testing SAV for an Internet-facing network in an intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under symmetric routing, set up the test environment as depicted in <xref target="intra-domain-internet-syn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |    Sub Network     |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-asyn"/> illustrates a test case for SAV in an Internet-facing network under intra-domain asymmetric routing conditions. The network topology is identical to that of <xref target="intra-domain-customer-asyn"/>, with the key distinction being the placement of the DUT. In this scenario, the DUT is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester emulates the Internet, and the DUT performs Internet-facing SAV rather than customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under asymmetric routing, construct the test environment as shown in <xref target="intra-domain-internet-asyn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network under intra-domain symmetric routing conditions. The network topology in <xref target="intra-domain-agg-syn"/> is identical to that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for aggregation-router-facing SAV under symmetric routing, construct the test environment as shown in <xref target="intra-domain-agg-syn"/>. The Tester is connected to Router 1 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (using source addresses in 10.0.0.0/15) and spoofed traffic (using source addresses in 10.2.0.0/15) toward Router 1. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1.</t>
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +----------------------+
                   | Tester (Sub Network) |
                   |    (10.0.0.0/15)     |
                   +----------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-asyn"/> illustrates the test case for SAV in an aggregation-router-facing network under intra-domain asymmetric routing conditions. The network topology in <xref target="intra-domain-agg-asyn"/> is identical to that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to both Router 1 and Router 2 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under asymmetric routing conditions, construct the test environment as shown in <xref target="intra-domain-agg-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish an asymmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.1.0.0/16) and legitimate traffic (using source addresses in 10.0.0.0/16) toward Router 1. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1 and Router 2.</t>
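          <t>Step 3 above leaves the traffic composition to the Tester. As one possible realization (an assumption about Tester tooling, not a required implementation), the sketch below draws source addresses from the legitimate prefix 10.0.0.0/16 and the spoofed prefix 10.1.0.0/16 of this scenario at a configurable spoofed-to-legitimate ratio.</t>
          <sourcecode type="python"><![CDATA[
import ipaddress
import random

def build_source_mix(legit_prefix, spoofed_prefix, spoofed_share, count, seed=1):
    # Return (source_ip, is_spoofed) pairs; spoofed_share = 0.1 gives a
    # 1:9 spoofed-to-legitimate mix, spoofed_share = 0.9 gives 9:1.
    rng = random.Random(seed)
    legit = ipaddress.ip_network(legit_prefix)
    spoof = ipaddress.ip_network(spoofed_prefix)
    mix = []
    for _ in range(count):
        is_spoofed = rng.random() < spoofed_share
        net = spoof if is_spoofed else legit
        host = rng.randrange(net.num_addresses)
        mix.append((str(net.network_address + host), is_spoofed))
    return mix

mix = build_source_mix("10.0.0.0/16", "10.1.0.0/16", spoofed_share=0.1, count=1000)
]]></sourcecode>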
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, including both protocol convergence performance and protocol message processing performance in response to route changes caused by network failures or operator configurations. Protocol convergence performance is quantified by the convergence time, defined as the duration from the onset of a routing change until the completion of the corresponding SAV rule update. Protocol message processing performance is measured by the processing throughput, represented by the total size of protocol messages processed per second.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The convergence process of the DUT, during which SAV rules are updated, is triggered by route changes resulting from network failures or operator configurations. In <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates these route changes by adding or withdrawing prefixes to initiate the DUT's convergence procedure.</t>
          <t>The <strong>procedure</strong> for testing protocol convergence performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol convergence time of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester withdraws a specified percentage of the total prefixes supported by the DUT, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The protocol convergence time is calculated based on DUT logs that record the start and completion times of the convergence process.</t>
            </li>
          </ol>
          <t>Note that for an IGP, proportional prefix withdrawal can be achieved by selectively shutting down interfaces. For instance, if the Tester is connected to ten emulated devices through ten interfaces, each advertising one prefix, withdrawing 10% of the prefixes can be accomplished by randomly disabling one interface; similarly, a 20% withdrawal corresponds to shutting down two interfaces, and so forth. This is one suggested method, and other approaches that achieve the same effect should also be acceptable.</t>
          <t>The protocol convergence time, defined as the duration required for the DUT to complete the convergence process, should be measured from the moment the last “hello” message is received from the emulated device on the disabled interface until SAV rule generation is finalized. To ensure accuracy, the DUT should log the timestamp of the last hello message received and the timestamp when SAV rule updates are complete. The convergence time is the difference between these two timestamps.</t>
          <t>If the emulated device sends a “goodbye hello” message during interface shutdown, it is recommended to use the receipt time of this message, rather than that of the last standard hello, as the starting point; this provides a more precise measurement, as advised in <xref target="RFC4061"/>.</t>
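          <t>The timestamp arithmetic described above is a simple difference of two logged times. The sketch below illustrates it, assuming (hypothetically) that the DUT logs ISO 8601 timestamps for the last hello received and for SAV rule update completion.</t>
          <sourcecode type="python"><![CDATA[
from datetime import datetime

def convergence_time(last_hello_ts, sav_update_done_ts):
    # Convergence time in seconds: from the last (goodbye) hello received
    # on the disabled interface to completion of the SAV rule update.
    t0 = datetime.fromisoformat(last_hello_ts)
    t1 = datetime.fromisoformat(sav_update_done_ts)
    return (t1 - t0).total_seconds()

t = convergence_time("2025-01-01T10:00:00.000", "2025-01-01T10:00:01.250")
]]></sourcecode>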
          <t><strong>Protocol Message Processing Performance</strong>: The test for protocol message processing performance uses the same setup illustrated in <xref target="intra-convg-perf"/>. This performance metric evaluates the protocol message processing throughput, the rate at which the DUT processes protocol messages. The Tester varies the sending rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. The DUT records both the total size of processed protocol messages and the corresponding processing time.</t>
          <t>The <strong>procedure</strong> for testing protocol message processing performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol message processing throughput of the DUT, set up the test environment as shown in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying rates, such as 10%, 20%, up to 100%, of the total link capacity between the Tester and the DUT.</t>
            </li>
            <li>
              <t>The protocol message processing throughput is calculated based on DUT logs that record the total size of processed protocol messages and the total processing time.</t>
            </li>
          </ol>
          <t>To compute the protocol message processing throughput, the DUT logs <bcp14>MUST</bcp14> include the total size of the protocol messages processed and the total time taken for processing. The throughput is then derived by dividing the total message size by the total processing time.</t>
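          <t>The throughput derivation above amounts to a single division over the logged totals; a minimal sketch with illustrative numbers:</t>
          <sourcecode type="python"><![CDATA[
def processing_throughput(total_message_bytes, total_processing_seconds):
    # Protocol message processing throughput in bytes per second.
    return total_message_bytes / total_processing_seconds

# e.g., 500 MB of protocol messages processed in 4 seconds.
throughput = processing_throughput(500 * 10**6, 4.0)  # 125 MB/s
]]></sourcecode>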
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, including both data plane SAV table refresh performance and data plane forwarding performance. Data plane SAV table refresh performance is quantified by the refresh rate, which indicates how quickly the DUT updates its SAV table with new SAV rules. Data plane forwarding performance is measured by the forwarding rate, defined as the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refresh Performance</strong>: The evaluation of data plane SAV table refresh performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This metric measures the rate at which the DUT refreshes its SAV table with new SAV rules. The Tester varies the transmission rate of protocol messages, from 10% to 100% of the total link capacity between the Tester and the DUT, to influence the proportion of updated SAV rules and corresponding SAV table entries. The DUT records the total number of updated SAV table entries and the time taken to complete the refresh process.</t>
          <t>The <strong>procedure</strong> for testing data plane SAV table refresh performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane SAV table refresh rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying percentages of the total link capacity, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The data plane SAV table refresh rate is calculated based on DUT logs that record the total number of updated SAV table entries and the total refresh time.</t>
            </li>
          </ol>
          <t>To compute the refresh rate, the DUT logs <bcp14>MUST</bcp14> capture the total number of updated SAV table entries and the total time required for refreshing. The refresh rate is then derived by dividing the total number of updated entries by the total refresh time.</t>
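          <t>When the DUT logs the refresh process in batches, the per-run totals can be aggregated before the division. A sketch assuming hypothetical (updated_entries, refresh_seconds) log records:</t>
          <sourcecode type="python"><![CDATA[
def refresh_rate(log_records):
    # SAV table refresh rate in entries per second, from DUT log
    # records of the form (updated_entries, refresh_seconds).
    total_entries = sum(entries for entries, _ in log_records)
    total_seconds = sum(seconds for _, seconds in log_records)
    return total_entries / total_seconds

rate = refresh_rate([(5000, 0.2), (5000, 0.3)])  # 20000 entries/s
]]></sourcecode>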
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester transmits a mixture of spoofed and legitimate traffic at a rate matching the total link capacity between the Tester and the DUT, while the DUT maintains a fully populated SAV table. The ratio of spoofed to legitimate traffic can be varied within a range, for example, from 1:9 to 9:1. The DUT records the total size of forwarded packets and the total duration of the forwarding process.</t>
          <t>The procedure for testing data plane forwarding performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane forwarding rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends a mix of spoofed and legitimate traffic to the DUT at the full link capacity between the Tester and the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>The data plane forwarding rate is calculated based on DUT logs that record the total size of forwarded traffic and the total forwarding time.</t>
            </li>
          </ol>
          <t>To compute the forwarding rate, the DUT logs <bcp14>MUST</bcp14> include the total size of forwarded traffic and the total time taken for forwarding. The forwarding rate is then derived by dividing the total traffic size by the total forwarding time.</t>
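          <t>The forwarding-rate computation, together with a sanity check that only the legitimate share of the offered load should be forwarded when the SAV table is accurate, can be sketched as follows (the numbers are illustrative):</t>
          <sourcecode type="python"><![CDATA[
def forwarding_rate(total_forwarded_bytes, total_forwarding_seconds):
    # Data plane forwarding rate in bytes per second.
    return total_forwarded_bytes / total_forwarding_seconds

def expected_forwarded_bytes(offered_bytes, spoofed_share):
    # With accurate SAV rules, only the legitimate share of the
    # offered traffic is forwarded; the spoofed share is blocked.
    return offered_bytes * (1.0 - spoofed_share)

# 9:1 spoofed-to-legitimate mix offered at line rate for one second:
# roughly 10% of the offered bytes should be forwarded.
expected = expected_forwarded_bytes(10 * 10**9, spoofed_share=0.9)
rate = forwarding_rate(expected, 1.0)
]]></sourcecode>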
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT when processing legitimate and spoofed traffic across multiple inter-domain network scenarios, including SAV implementations for both customer-facing ASes and provider-/peer-facing ASes.</t>
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case for SAV in customer-facing ASes under an inter-domain symmetric routing scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network environment, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which in turn is a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, respectively. AS 2 then propagates routes for P1 and P6 to the DUT, enabling the DUT to learn these prefixes from both AS 1 and AS 2. In this test, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for customer-facing ASes under symmetric inter-domain routing, construct the test environment as shown in <xref target="inter-customer-syn"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
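          <t>The DUT's customer-facing SAV rules in this scenario are determined by the prefixes originated within the customer cone of the neighbor AS. The sketch below models this derivation with a static relationship table; it is a simplification for illustration only (real SAV mechanisms derive such rules from routing information and protocol messages), with AS and prefix names mirroring the figure.</t>
          <sourcecode type="python"><![CDATA[
def customer_cone_prefixes(asn, customers, originated):
    # Prefixes originated by an AS plus those of all its direct
    # and indirect customers (its customer cone).
    cone = set(originated.get(asn, []))
    for customer in customers.get(asn, []):
        cone |= customer_cone_prefixes(customer, customers, originated)
    return cone

# Relationships and prefix originations from the figure above.
customers = {"DUT": ["AS1", "AS2", "AS5"], "AS2": ["AS1"]}
originated = {"AS1": ["P1", "P6"], "AS2": ["P2"], "AS5": ["P5"]}

# Allow list on the DUT interface facing AS 2.
allow_as2 = customer_cone_prefixes("AS2", customers, originated)
]]></sourcecode>
          <t>Legitimate sources in P1 fall inside AS 2's customer cone and are permitted, while spoofed sources in P5 do not and are blocked, matching the expected results.</t>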
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-syn"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-lpp"/> presents a test case for SAV in customer-facing ASes under an inter-domain asymmetric routing scenario induced by NO_EXPORT community configuration. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 with the NO_EXPORT community attribute, preventing AS 2 from propagating the route for P1 to the DUT. Similarly, AS 1 advertises prefix P6 to the DUT with the NO_EXPORT attribute, preventing the DUT from propagating this route to AS 3. As a result, the DUT learns the route for prefix P1 only from AS 1. The legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under NO_EXPORT-induced asymmetric routing, construct the test environment as shown in <xref target="inter-customer-lpp"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-lpp"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-dsr"/> presents a test case for SAV in customer-facing ASes under a Direct Server Return (DSR) scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to an anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. Anycast servers in AS 3 receive the requests and tunnel them to edge servers in AS 1. The edge servers then return content to the users with source addresses in prefix P3. If the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2. Alternatively, if the reverse forwarding path is AS 1-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;AS 2. In this case, AS 2 may serve as the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this DSR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under DSR conditions, construct the test environment as shown in <xref target="inter-customer-dsr"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the DSR scenario.</t>
            </li>
            <li>
              <t>The Tester sends legitimate traffic (with source addresses in P3 and destination addresses in P2) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT permits legitimate traffic with source addresses in P3 received from the direction of AS 1.</t>
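          <t>The DSR case highlights why the allow list on the AS 1-facing interface must also cover the anycast prefix P3, even though AS 1 advertises only P1 and P6 to the DUT. The toy check below makes this concrete; prefix names are symbolic, per the figure.</t>
          <sourcecode type="python"><![CDATA[
def sav_check(source_prefix, allow_list):
    # Interface-level SAV: permit a packet only if its source
    # prefix appears in the interface's allow list.
    return source_prefix in allow_list

# Allow list built only from the routes AS 1 advertises to the DUT.
naive_allow = {"P1", "P6"}
# DSR-aware allow list that additionally admits the anycast prefix
# P3 used as the source address by AS 1's edge servers.
dsr_aware_allow = naive_allow | {"P3"}

blocked = not sav_check("P3", naive_allow)    # improper block (false positive)
permitted = sav_check("P3", dsr_aware_allow)  # legitimate DSR traffic permitted
]]></sourcecode>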
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-dsr"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is prefix P1 spoofed by the attacker, which is inside AS 2 or
connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-reflect"/> illustrates a test case for SAV in customer-facing ASes under a reflection attack scenario. In this scenario, a reflection attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P5) that are configured to respond to such requests. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-reflect"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a reflection attack scenario, construct the test environment as shown in <xref target="inter-customer-reflect"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P5) toward AS 5 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-reflect"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' is the source prefix P5 spoofed by the attacker, who is inside
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-direct"/> presents a test case for SAV in customer-facing ASes under a direct attack scenario. In this scenario, a direct attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), aiming to overwhelm its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-direct"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a direct attack scenario, construct the test environment as shown in <xref target="inter-customer-direct"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P5 and destination addresses in P1) toward AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P5 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-direct"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
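          <t>The expected DUT behavior in the customer-facing test cases can be modeled as a per-interface source-prefix check: traffic entering from a customer direction passes only if its source prefix has been learned as legitimate from that direction. A minimal sketch with allowlist semantics; the interface names and prefixes are placeholders standing in for P1, P2, and P5, not values mandated by this document:</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

# Illustrative SAV table: for each inbound interface, the source
# prefixes the DUT has learned as legitimate from that direction.
SAV_TABLE = {
    "to_AS2": ["10.1.0.0/16", "10.2.0.0/16"],  # stand-ins for P1, P2
    "to_AS5": ["10.5.0.0/16"],                 # stand-in for P5
}

def sav_check(interface, src_ip):
    """Return True if the packet passes SAV, False if it is blocked."""
    src = ipaddress.ip_address(src_ip)
    return any(src in ipaddress.ip_network(p)
               for p in SAV_TABLE.get(interface, []))

# Spoofed P5 source arriving from the AS 2 direction is blocked,
# while a legitimate P1 source from the same direction passes.
assert sav_check("to_AS2", "10.5.0.1") is False
assert sav_check("to_AS2", "10.1.0.1") is True
]]></sourcecode>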
          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the source prefix P1 spoofed by the attacker, who is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> illustrates a test case for SAV in provider/peer-facing ASes under a reflection attack scenario. In this scenario, the attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P2) that are configured to respond. The Tester emulates the attacker by performing source address spoofing. The servers then send overwhelming responses to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider/peer-facing ASes in a reflection attack scenario, construct the test environment as shown in <xref target="reflection-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P2) toward AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="reflection-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
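          <t>For provider/peer-facing interfaces the expected rule has the opposite orientation: a packet arriving from a provider or lateral peer must not carry a source address inside the DUT's customer cone, because such traffic could only legitimately originate from below the DUT. A minimal blocklist-style sketch; the cone prefixes are placeholders for P1, P2, P5, and P6:</t>
          <sourcecode type="python"><![CDATA[
import ipaddress

# Illustrative customer-cone prefixes of the DUT.
CUSTOMER_CONE = ["10.1.0.0/16", "10.2.0.0/16",
                 "10.5.0.0/16", "10.6.0.0/16"]

def provider_facing_sav(src_ip):
    """Blocklist check on a provider/peer-facing interface: drop any
    packet whose source lies inside the DUT's customer cone."""
    src = ipaddress.ip_address(src_ip)
    spoofed = any(src in ipaddress.ip_network(p) for p in CUSTOMER_CONE)
    return not spoofed  # True = forward, False = block

# Spoofed P1 source from AS 3 is blocked; an unrelated source passes.
assert provider_facing_sav("10.1.0.1") is False
assert provider_facing_sav("192.0.2.1") is True
]]></sourcecode>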
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' is the source prefix P2 spoofed by the attacker, who is inside
AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><xref target="direct-attack-p"/> presents a test case for SAV in provider-facing ASes under a direct attack scenario. In this scenario, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), overwhelming its network resources. The arrows in <xref target="direct-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider-facing ASes in a direct attack scenario, construct the test environment as shown in <xref target="direct-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P2 and destination addresses in P1) toward AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P2 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="direct-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>For the test setup, procedure, and metrics used to evaluate protocol convergence performance and protocol message processing performance, refer to <xref target="intra-control-plane-sec"/>.</t>
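          <t>Protocol convergence is commonly measured as the interval between the triggering event (e.g., a link flap or an announced route change) and the last SAV rule update it causes on the DUT. A sketch of that bookkeeping, assuming the timestamps come from Tester logs:</t>
          <sourcecode type="python"><![CDATA[
def convergence_time(event_ts, update_timestamps):
    """Convergence time: from the triggering event to the last SAV
    table update it caused. All timestamps are seconds as floats."""
    caused = [t for t in update_timestamps if t >= event_ts]
    if not caused:
        raise ValueError("no SAV updates observed after the event")
    return max(caused) - event_ts

# Event at t=10.0 s; SAV updates observed at 10.2 s, 10.9 s, 11.4 s.
delta = convergence_time(10.0, [9.8, 10.2, 10.9, 11.4])
]]></sourcecode>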
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>For the test setup, procedure, and metrics used to evaluate data plane SAV table refresh performance and data plane forwarding performance, refer to <xref target="intra-data-plane-sec"/>.</t>
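          <t>Data plane forwarding performance is typically benchmarked with the RFC 2544 throughput procedure: find the highest offered load at which the DUT forwards every frame without loss, usually via a binary search over rates. A sketch of the search loop, where trial is a hypothetical callback that runs one timed traffic trial and returns the observed loss fraction:</t>
          <sourcecode type="python"><![CDATA[
def rfc2544_throughput(trial, max_rate, resolution=0.001):
    """Binary-search the highest zero-loss rate in [0, max_rate].

    trial(rate) returns the fraction of frames lost at that offered
    rate; 0.0 means the DUT forwarded every frame.
    """
    lo, hi = 0.0, max_rate
    while (hi - lo) / max_rate > resolution:
        mid = (lo + hi) / 2
        if trial(mid) == 0.0:
            lo = mid  # no loss: throughput is at least mid
        else:
            hi = mid  # loss observed: throughput is below mid
    return lo

# Toy DUT model that starts dropping frames above 7.5 Gbit/s.
rate = rfc2544_throughput(lambda r: 0.0 if r <= 7.5 else 0.01,
                          max_rate=10.0)
]]></sourcecode>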
        </section>
      </section>
      <section anchor="resource-utilization-1">
        <name>Resource Utilization</name>
        <t>When evaluating the DUT for both intra-domain (<xref target="intra_domain_sav"/>) and inter-domain (<xref target="inter_domain_sav"/>) SAV functionality, CPU and memory utilization <bcp14>MUST</bcp14> be recorded for both the control plane and the data plane. These metrics <bcp14>SHOULD</bcp14> be collected separately per plane to facilitate granular performance analysis.</t>
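        <t>One way to satisfy the per-plane collection requirement is to poll CPU and memory counters at a fixed interval during each trial and to report mean and peak values for each plane. A sketch of the aggregation step, with the sample collection itself left abstract:</t>
        <sourcecode type="python"><![CDATA[
from statistics import mean

def summarize_utilization(samples):
    """Summarize resource samples per plane.

    samples maps a plane name ("control" or "data") to a list of
    (cpu_percent, mem_percent) tuples taken at a fixed interval.
    """
    report = {}
    for plane, readings in samples.items():
        cpu = [c for c, _ in readings]
        mem = [m for _, m in readings]
        report[plane] = {"cpu_mean": mean(cpu), "cpu_peak": max(cpu),
                         "mem_mean": mean(mem), "mem_peak": max(mem)}
    return report

r = summarize_utilization({"control": [(10.0, 30.0), (20.0, 34.0)],
                           "data": [(55.0, 60.0), (65.0, 62.0)]})
]]></sourcecode>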
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test report follows a common format comprising global, standardized components together with elements specific to the individual test. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be documented in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
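      <t>These parameters can be captured in a structured record so that reports from different runs remain directly comparable. A sketch of one possible layout; the field names and values are illustrative, not mandated by this document:</t>
      <sourcecode type="python"><![CDATA[
import json

# Hypothetical report skeleton covering the six parameter groups.
report = {
    "dut": {"hardware": "example-router", "software": "v1.0"},
    "topology": "inter-domain, customer-facing, direct attack",
    "traffic": {"frame_size_bytes": 512, "rate_pps": 100000},
    "system": {"type": "physical", "cpu_cores": 8, "memory_gb": 32},
    "device_config": {"symmetric_routing": True, "no_export": True},
    "sav_mechanism": "EFP-uRPF",
}
print(json.dumps(report, indent=2))
]]></sourcecode>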
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
      </references>
    </references>

<section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, Shengnan Yue, Changwang Lin, and others for their valuable comments and reviews of this document.</t>
    </section>
  </back>
  <!-- ##markdown-source:
H4sIAAAAAAAAA+19+3LcRnb+/6zSOyBWJUtKQ1KkLGdXu3FCS7Ktii4TkVpn
s3K5MJjmDFYYYIwL6bGorTxEHiDPkkf5PcnvXPoKNDCYCyXZ5mytzMGgu0+f
Pn36fF/f9vf3b+0UZZiOfwiTLBUPgzKvxK2deJ7Tn0V5fO/eH+4d39qJwvJh
EKfnWXA7eDQV0VtIV41mcVHEWVou5pD06ZOzr2/thLkIHwbfiFTkYRL89dWT
4bOTR0++v7VzOVGv3NoZZ1EaziDNOA/Py/1oKtL90exysl+EF6ko8T/7I5FG
01mYv43Tyf69LzBZGZcJJPrK+iV4LsppNs6SbLIIzrM8eJqWebg/zmZhnAZQ
MXwgcvXgNKvySAQn43EuiiL4c5jE47CEKoDgo1EuLh4Gpyd/pgJu7SRhCjKL
FIsOKygmf3hrZx+0UDwMbu0EAVfhWYwKSfFBlsP7/zXN0smkCtOoSoNn4SjL
wzLLF/h7FJcLFD/+G0hOD7IKxIVnj6ZxGoJqq0IEZ88eB7vip0jMy+D1v+9B
ruo9KhHTCahM8jBIYtTcv/08iZJwdCDG1UGUeiR8HIIgsRbwrIDSp1UYvE7j
C5EXINR1CFdmqNv030pZXLt8z+JRjBJWH0eHVTJaqsJnIAqoehL8R/xRWvrH
OE2iupS3dtIsn4H5XoiH+O6rrx/d/+d7n6u/f2/9ffzgc/335/e+OMK/oZtD
f7YziK2+sz8v6FkQlGE+EdD7p2UJzw4PocOE8F70VuQHsSjPD0AZh9CjD7kz
4yPVj90M82yUiNk++JtSzERaHsr8uVe3dk0Qy+3VL0R5meVvi+CbcB6cpGGy
KOJiEAw5/+BU5T+g/v9K/FjFOT0oAi4R8oUCj+8dP5C11g5iW7W2Mtys1sZ1
bb/WpmnCPJpuubUxy7gUUVnlwq2y05bt9d8FR/ziydlecGLltLwB16/KZeVt
wM6K9BhZVqgIOJ79/SAcFShliR38bBoXAYhYYUMGY3Eep6IIZnrMi+Ebjnr2
YBmUUxHMRU59OwWZsnOntck+7CoGBcseStkvXNn3oLxoGqZxMSsOcHi0vgcw
3AdVGSfxz2IMLj+Y0MBfCnovrxKQD57OYWTFCtQKKuZZBjWasMlOwwsBFRFg
+rN5QqYLeV7G5TSAeixArDzOKlCHKOJJWmAHyfKxyKkAri6VCs9RA1GWQyHz
LB2jTopIpJgeauDqtAzfooyQIJxDXw2jKeorytIihrwxaUi5jsVFDKJDWSN4
NRglYEfBKPtpwK2i9D6zwhFUNYqeoozTsAyg3HCSZkUZR5gRv29U6woGwlyA
CAXmQZlSQ89EWFR5SyuLn2LIG0UGdabiktWhlEktWhygWaGZzeLxOBH47TZ1
yWxcRRwJ3do5XWIQICe2QXweR2FaJgssJMshjix7tPZB8HIuaKhk8ymqyUQU
JdvPWMyTbBGM4/NzkWMmNXt7906Ocu/f89+/579HYQEZZNT0cS6zIT2m7Dch
jruI8ywlj3gANUa5YqzTwOjN6Sa7/O3kdK/ZY3b5G/5WE5CsWDp9tlGqLJQD
4XAG8dZUhGPKEBOGUVRBX18EVYqWrExcWyvUsTYoU71rQ9b79wfNuNeRl11Q
YPuyRt74YzN3+RQbCmo1zwrZzyuwTbJB7o5ofmhx3Y6mpivIp8iSC85H6Qz7
gXAdmt2p2AeOuZfb/eUyTpJgKpK51DiaFzmkEntQBvKzssE3geghCozFokh2
N7qEMM8xIiiL+34BJhSCiNxw6ZhLDJMi42LB5sey0BLs2ddBm/2ROrUqjko5
aPp97QrIu4LzaBkAQqUwR8+2EKDcDPQwjcWFab0cqnhepREbKUSuA9fnzcIF
dNI59vBgViVlPJdKME0JKZIku8SyHe1LjYG0EO7OwPNBRnFJz0zFi7mI0JW0
dFVXF8ozF0v9ci4mYQ4uDlyPbGtpqmSWTiuwzWGrFe0GBq4Oy6hkD9AW6gqC
1XJ69m58IA4GwTkYCthDBnALwm2SgR+lICY9QtMs9gZskXlWZlGWYHXBZ0C7
R44x8YgJP4LfToI5IFXOEkMc+RVevYT6Y5O4BjAVhaqrrJL8nR1oCC4sH19i
h8+zCjovWkORnZe1R9BEZQU+bYbmBOXt/vn5HsKmkuUDNYQkIPR78Gzmh8sp
oFYIDWAED91GrDc1FJW0xDthUUC7KlOXqrc7G6tnNg9prFR+tR68NAyBe9/t
28E3GVSNvHQEBs1dssMrSVEtq8GQEkSNwFwKGpcusyAb/Q18L7Q1goxbO3eC
E12Nf0rGP1bZH1k53u5bQOlF+U85vUcjCbYLeDcw7kuRJPvKcE2ow4U8d0IG
spl4VJHzgaRFNdovFjD8sj/GVzBv6M4khvzpd45yd8npvU2zyxTbUMo+i6M8
M7SNlHTP+DPTYwREFBWau9cFU7w6jsHloX0p57srDibQj5RxHtZMstgLRgur
yVsjCB5pGz7HCRAhU1GApyvYs48FFDBDI9dx4jyPVZRrxRpomrkIk33IMhl7
8pbW5aCzZ2E6qcKJNrK3YhFAsnERfPb89enZZwP+b/DiJf396sl/vH766slj
/Pv025Nnz/QfO/KN029fvn722PxlUj56+fz5kxePOTE8DZxHO589P/nLZ+xa
Pns5PHv68sXJs8+ajhCVzv6WBncI9zB4C4sd8K8R2BZ3g68eDf/vf48+h3ji
H5CAODr6AwUX+OX3R/+MMRsOtVxalkIMyV9Bw4sdULEIcwqgwQ6jcA5jRoKD
DIRGU7S6KTTtwc7Onb+iZr5/GPxpFM2PPv9SPsAKOw+VzpyHpLPmk0ZiVqLn
kacYrU3neU3Trrwnf3G+K71bD//0r+hbgv2j3//rlzscsJ+RPZLvoXgdzPCR
HAyG6P0fkhHjY3eMoNETPT/0MbBi6FfonuI0SioaKWAomkrUww50VqUQ45fS
0+7nIglLal/JHMHYqQR4jGNPrXRrPCogJrA6PGNDRHcEk3R/pBQAgCuGySiG
hB8YgCE8A2mzGQ1rCOVLRg2qg4IRYQ0kJpzFJQ5E47iIwG0EcXkQoLTfQiCy
fx5GmMkr8h4Pg5PUjV3ZqwTyrbD26xRDGdm9SQGPKqjfDCLdjbKNZC7ac/B4
wO0D9ZeujrRCIyhqEx+SOmTMQPKcnAZfMTruLYn4Cb4jSjk5pSzYzqCeFsUu
/Rc9PhVlNVdOiyLeAp9gr51I+h+8BnrkJA4pQAckT5gN2UjELJjwMY/dr8kr
U767j1+f7XFSgM6RBIYh/YiDHvoLamOlJDVCwFto32Q7pBn1Qgm4RSJyxrns
od69Q6kjQI77hYiUQLIYxOoBqOscjVKyGlnuUBw6e/kWVVBDOAl+MSgTiC45
3JdgWLj0hYCwGnNEoRVMRsNNxATCxRn9xIUA0CMeQbkJqvQ0w/BalUwwLado
hgJfKR0FO5Q2FWJcY2uwYKca5HGgU/J7Gs8sjy/RIP6uP8xy3f176+cuv3HF
TS85zuCJAQDBlXqDctp3PnfpofMG/6s/V+oNSPml8wZYmf0G5nZr52pJPpyX
ecsvj3mrq+bmrSUf5y1viY23vNIbTcgPPZb2jr/9SefKGvXnZf9Wl8Zt+3cP
g9sENMgvEGv6L58Z33Hw2Xu0Fu6G9A4EBYCpK+Q/VXBIpgfd+jyeVLkDmdtc
x0HwHY8qOrkHVw44B7ABiYEolDEehzoBuxnbvUBwVyLMxL4mygX1L/J8bvQo
dYrAGf4CuBsXU0g1hpgvKpVfw6pQMUoS7PBxCv5kCr55MmWRAJXFYb7Q7ADl
D84EB0cJa638RlBTwQTGDLv6OI5KGrAhNi7rAIYxoA1dmh7x6ypHJSB/wiqT
VZuG3DwQm4WjGEkDZv7IuwL+hmqRIxPMdTUdGbk9CQN0UKBBs6wZagWHKpHA
q6ml5oFyZ9h6hEaMK1P+rmQnxg8zCRn7eTGr/AOk9GOmRpKFMRtQbs5RPHlI
xY/AMDPBzGw+JWbjMfEzxLRkNaB6GB6FhgXK/Z2p8QorKa38kdMF1LDLfEJG
9LOls0vTARpjILZcSBx7CLaYpXX4RRaWi3Pd4o7NyKAxoDBbxiWqHRXERAlC
iypKsojBNbRvnsk2MHJBtywo1KNYIRcAAwqiSKjBUGMHqroq1HH8QcHcoxsL
YNdjCMimUmpMp0gDDkABvIlUEccwiIIbwjcKNc7iWJ2WMcQy2P0TQCWAVMzk
AY3XKnLlhqrJRnTRWOJvCgQmVJ1Gx4IgnUl8jHE5rwGYFXyBBptaEauiYKJa
yKmegxuB6G/EwYWM8oiMdeM/qbFBIMEM+EBVv3iMdT6PxZiJ8tRlcRFSQLAo
raMqkKPB3pBwlafxvNCO6ORUSGOhXjrS+oK8UQ3fId1q2c/SrjlwYhXV67WD
U54XdEM1moY4lQaQ5md+R5EkKgMaaGQDoPhEiUjHBS3e9FvGnABVuEyKM90h
GK5iX+eOCcYFDUXEaEQaPU8qovXIFUi+gT0B2efQckdP0zGiMEiryZRCevyE
4BxyBrb/inWCYPffh0+LPcU+49ip+J3mYADODp5jAssqeMpJcVvSH8hUZqLC
wf7NceQEWp/V3Jb7eZ7NrG6KznFelbYnll7ya6JNh4pJfQWNo92hDn0xnWfQ
kXiqMIhRcrkm7FavSoMiZ8Zui/w2PGo3ipjbFrxnIiQGaRnW3Mq8UBxwe2Ua
8rVXxVPv9srU812/KkPFWT+yOOuzeKYrpKkIH7kNAmPTg/8viBWTsydxpr0n
V1gJ4vIajSxpxK3mYzu6kJQDehw9kEzDFF+eIuOUDiyQRF4GZQK3Ni9s+xyB
etWA6+ajCFTCvEK1nCpaiiNZbugG0g9QTxUqkvOx/c5clY3yba0/B6cTggxD
HqFRqjMOJKEbWTbFb8/k23PzdqnfVp1SasE8N6yRnoEmDoZJ+V50kRxq3fZz
wjlHI56m9odpUh2GhKKkZzTV9EqcQ2WmxMtYHcwaa/BdnpbKzbsUOcJ3kWtq
nB5BbMO2GFKH4mYFQF8WVj4U2as5eLI8p5L1fmXJ0ruCX5sBsqVa1hBKkjvt
WivWetdqcWzXWpNbaRyf0dZ8vav2Ssgh9DUtKAnVagTJx/Nvlfmt1jaPhq9J
jpmAoGoBgRfat9tzJN9phce6eHdV6RlBo3e3nWFMiumuX4KM392mwOoHfvRD
EV7IVxtjFYrn8fg0qt+581JND9258zB4YmOj2uQhW6F/BrEGnay2I1BmjQ2Y
gQ51FGjiwUFRSb6AsYZ1MWbWQYACBJ51J+whZMh6aPOn0DFoJRWuvJKhrPkF
hQwnk5yqmKX7HM/W36MmfMotep7JmehBbdpCk38WsWBER/GsODSuN3Lm2otN
c3SQPMs+d70EUCsX1kUMuR9Lprstr3SkvpKscXBUZ3/c1F8//QoVo6i0Fct+
jNVUnxeAKH6YZnP48/ANy9D6odRH9w7of4dHDwKtrCOVcFnq9l/xnzeH66S2
yLC1dK45yU6dX0fZfXVufc4strhQOrcftqSWbvzp0KLCZWqbJ3d/d8sGD2c3
vkxde+qUvVkPbVeKshb/OzVy1EOtWhlJ4Lp7Wo2UNe9dtbyL/+xald0LarTs
chlMBZmmdVaCaWahWKSKtlXgUU9Rwd+2F8fBxnGaxWI2EyWuVlJBsnK2kve9
c0fl+cjKE+fmlAJwHKwtUrNFq7HFdYKjXQCU1ay0IAglV3qd87KVljrKGSvo
pHpNBmGiwkpz2KRq1FodJGkpWtHOVW5YobEPoRWYO0BlXJNh6Bm5XkYUtALM
Cl1OTiljWmDCC+jEefyT3QkGjVk0TWpaLLWcfMJKWAUjZAzHgEbKWEZ+caGK
iD0hPMki6RgdbSyhgGU+JkAkuqlkeqOs5g7jzFUhag0XuWRySW06yqrUBDDs
Q2RL2v4gzHNceBNcxKGrBWxO5L2s2VQCcnZijf1YFmqURNjSAWgdE5fuikOO
sE4MqUa0SzBtMwiKmFqE1x53WLFF9uPyIq6O4ziJfFTrN3gmBYJZWsCRzZme
/iO+lBdykd8ItFdWHHXO7EmQGqs1nyfIaosfK2TFUXp+TZdRaM72zh2KQQHE
Q1zLczdMg5rF2nHRv+fSqjGO8XgV1RGtaQRDjc8XZvrfmtzRMwN6/achAzo9
mlwp1BAmwjXlFNQOrAnnkOo1opUw6FDiqFRkWIcHq885+3orB8HoO8JaJyUl
Hx/oyQHhJNFuBjl2axqqWSNrYuyAl0jcdwRTKiwaIEIZ+q7f0l0r3/Nijs6k
xzppmSE8dftuP8qWbBsgzWLAUf5PIUKTAffqo4d/wFR/eHhkWaz4ac6NIGlZ
ZbjNbkROSCl9lGTR26IJqrD5yGB9wmV5PKGoB2eNlJ+x2zjYIt74exvk6I05
NkUd3Rlo4HG8NBbthT3kS7on2Bl0ww8I6/CflpcUAjli8/zCQSAaPGAOxvyd
l2oQBn7TFTfpCYc4pZiXPEpsYIDO5vBk0IAgnTjwOiTgj2oB/uJ9pZmBgz2s
DA7bUYnKwIdJ3pj0rbikJoGDQL5QORzWn9cl2Lg7tyuIZPgY8KSOTq4HnoSb
4JPwmgFKuByhdIjwgSFKA3XwpoXQz9i5KKSJMihI8EUig2WIpQ8w4b2CKe7s
4JKQeO/eCiEXYxTbxxfaNdN6kEgAxpDLmfVYTjE9A4Z6bi5IactEaa8brxhl
apkasMWJCrtBTyP1QfBVW7Piw1SjRQpCY1wIj1P9RExTUVAnllHYIJij1wKQ
hMCdhGaBTS/xVCb9qnbkTevY3Eo4zoluv2jFcN0d3UZx1hJXm2xeE0Z1upcW
HOVZSd0TSjFa8pRpwyXcr4JLARX/bgEPs7i/AzOFy0FTq/OhL9qn0FSB8gs4
RIfNoNuHrAY1Z6YjMgdk4ctdA8xSiNUASRWxMC0mqAx7r41p6U5+Tyf/ZYMs
XvG/HsryrVvuE6y0RExXemHtriL49tqioFXyVR8Kka9gOGv3U8uJ6ysrnJVR
qsLbm89qtS/o1uW3iLZ0fqZjeuZ652ckOrripVQcQGn+thuX9iz7VzcnZlnF
6qn7zkNeR9lrzIm5SPMTnxPbenv7sVzf1PTIQpfmYd/UTcx5DZJv5hXNFw+s
jdUiCA+qbVkgscmkWz3LdkhrC9a2Q0MHlmr/O684bpObg8UlopvAUe56qC9l
x7WCsxGdMYMRAlh7J9c+MNstcL2s2iGM+xWEXs/mXVBvcCPGJM4mACf2dEJN
fEnV3zNj5ESi6r0W+E4Ri0ed8JLCWDQvk831AR2tcD8RPTEFEQ3tTehZUr4G
3tBbQNaZuamLhr+2TNoshR4dMzau/feesfE38DXM2GwZTLizNSuDid/ojI3T
wLafbw4jN0BitVFzAywRbAonujPoEdV3Z+CAiu4M/FNZNQncl46dDD6x+S4D
kDrmu8xLvtkmy0r8DxpKrGu/Bi86px+vQwKpgN/UfFe7oVMOa/XGvhF8awZ9
4UdnBn0QyJaqsLFTNV+6oIhvhq0vFlllgm0NMOKbX7OG+rVxSCeD3QJEkHmn
PZMRbxRdBkdCPx7Bbqf2cWtAkoSR4OOwPHBEKbQnJNHezZrAK8IZz+JBEIzn
SOF/ZSTropIu3DKw5ywMdvEF6Hko4/wwrUEUqVbr5Q89+3FNaKQpir1wbNXJ
kFoPuCY88knOc9xAk7XmOVqxySe5c2V5TL0GS2+FzTcsfeNzw9J/qiz9r2/n
yrUuDQsnEx99vnQH4SZE+klr5u1RrJQT4lfmHZcz6cursB1OvRlvGFl7hrkr
8KUO4SkPQQvdlWh0hDBRnnJ3Ju+ql8Pk74q+J+vyxv/acLBigNneCNcbWraX
20l5rx9j6jbv13pOeHntOxTW2p/QZ/1M2waF1YJKd7nerzeq5NPMZHd0OpTt
oj/hzQpbJW99sSZq9fiGvL0hbxsZ/GLJ24+0WWGFgPQq8AWkLVHmCgHp2vGo
j0NdPSBdhU1dNyL1UKo6cPCQqmuFpGvRq974JFwvKO27hNhPnH6kONURYmtB
6welSHutD98sYu3RtP5WXWVl+GaEqX8zwDWu5tj20vDfdHTb7IZ4yJNz8r5z
bqM8GWpf7k7ap+1I+iyp2olPfE+HPghu6ZFnA+vcfjKMZZfGsB46ToBzT5AM
+Pq0gnbP8DYadVxeFNLJAqOF9tXnYZzQyWJ4vqS8Aah2FOqBOauuTUBo3R+r
UJ49qg4trJ8SONCHeMoziMf6YGjVeCh1yV23dkJfBZknMlv7kD5+Yt8Y1zi2
b9hXc4U51FJWwXvK3sAcd2jeLLMShrEi/lnIw/acIgt9iBkZNB4CCtL2wlx3
fVHMXX94Q6jrCfvDsTmN+Ip/xZjJd2j4lxYeagsVV5TCH1uhQUz2UeXOmeJ8
yrg8sK7bzmT7kL9VwZP3+EqrM9sxk5GgdT2sFsbc2LdMLHapzo/ytGW70zun
YZrhlS5iJEsd04ZMGF8mEyFt0O295hxk6jArdeGnqVcJ9Z2U7D2TRftspNwg
xzorRE1EOnyGuiHIgFBonIeXrEOzgzBOwZOrI/I4xmooD2OjHvHSUotpC4dm
ltNuP9fUbr811qG6mtaLBqS6O3WtwhUrnFDqLMy1K+xOIjysz5yZyK5IK1ye
cG5cFdXGGc+P7v3jIDjGf7CCuOXy3j/a8Uy7gmhldRJJj6Mvc6R9oNlEru+G
amZy0hbiqbzUl3xJN445FcaZN7oRiTJMBIYOaVbK2IGm7r8ZDqzzdnW1ta7C
RN0TIC/PIy0UIuHhG5RfTKuST7Dm+BQ0DVE2Htj/dWZffxafNzcdW62Gp2Er
x6svHpD3AeCPJuMBnxRun9EUSrEHTpeBVuGxRDakrghfzVJMpY8AbWazhNah
SCSTpcIUeBCcxrM4wQPCqZEd3eiRky91dHSBN5/ZYlPvp+mNcioPw40LKsvc
BMp3qw2sy17U7axqsb99iSEtYRHn53igKuCDKhlTBfHAcTxUa04nsmov0GqF
7YGFPulfXz0BhknXzZDxeQ7vZYsbWOI0D7qeZfoQ+SQETyDvUZsKcDLqnjcV
a9g7w3UGNUtRZ1RyA4qx0boMe3RIo87pwmumwavFeE3Bz3ja+5k5jV8e9GxW
FsmqQH9k94C9rYSer3oc1YGE11JrkdXqIJOIUG4txuJRTCm1ORwqV8GVtHZv
6B3UBZ2EZYqRJ4OWUn8ZAK+U7r6hvf3nXi3yFvBQNccky8ajhQi8zSIHY6No
NH20enCB+sBc0sK8tIYCChEph4GzHkqrEb3FGOEWlTpQBkluj8arLFZXnsrL
QdUlp9DNI9yDb4U4fGTX+CIu1LAib4EHrFyLfTyHSNdCoDM1bjmB1pKQWJ/4
QD1VXhOlw6bWsU56BzdyI+DsXiDY71Rrfcq/dYKzXrGmzwZuhNzO2En3vsiq
iNQcrOyL1bFp04kOs9ANyzHRHWHpzhe8RyXCW1SWHAeg9wfxYFiY7e4N5KBw
QgNDqPxcvGPrDAx1laBpOSDqHzt1H0y+QiDVIIq2G0Wxl/AotySmQ1lGYe7w
8EVHgw1MwRdYdWtv1ShrdZNSMaPHlHi0lGcLrtRltXzOpTNNAX352ojZFZK8
Md7+ntYOOZeXATtqK3G0wvvgZeRHt5Mq/875qYqQNA6a96njtnuUu480wgNs
uhkj54zwpcesN/iirvPvG7xRj5uFH/fNz0vzqBex1wz0xYd8ljxegHMJaeLo
bWJusuh78L4jWcuFTx7SpnZ4fiM6rPUOcxECpnJgUp2ruXNn2TUFnkHXnVbo
3Xbu0GtxE53+UY68crR1rg3wD6Cy6F6t4R9PQYS0mMXQTzDgbh9UtzeYDphE
0PfwTOt3rkg+xWZZCHHWaUKuqsCLlVX97CHaSJdWsxGe2eVm7SR3QmXpn+pI
Q7ezhWu7h+re5tJnrO7MrHYHwSfAd/QYqQ35UXSY04psx3I1rTcer2RElEIV
2jYYu663OeSCAkrV+OvKIO/3sXC0uWlFXbntqqbHmNsUQxXuDMCN+jsO2LpG
ZRW32zKQrO9vtcFKT4jn/Qaz+Cd11eOSw7SRE5G3vIRlNHUVtZpbNGcVoB3g
9GoJ/0dxzis87nmezaXN6lZfaTpQMlDk/MfmDmiES2LJFGGHc1UjsRmB1Zjs
WqJmdWRPtxuy5lC1O21zpu3RxCpetH5NzyfoPskSe1ihzfIztYUmswbG3f7M
ctMt1/W+GT4ydmfPO5v37AuWWhxxI+R0fDFdatgOf5aVXwM8pizWi0cZPTyw
KqqJenzV5duTzE2S+vYkkW/39iR7Ln3Ny5OIoLSgW7+bk/Slp86NmY2rk2ws
1nZZEmG0+iWfdJ+mnMdH2i/fP5wL9+dfxC6v5s3YjSWXXanph5PT4P7u8L5/
p3V32XcP38hSD9/cXbHswFn06pWuM7W9RtSXvHu3k/PNk7wr9WGw++h4uNeR
vJnas+zxblvyZmr5FfrT7vDzPftRM7mnbGwdKI/+5f+alnqzLLVWlnrupHCT
m9TD47+CXR1/rzXtl9f50ixbN5N53pq8kdo005WdYvgAJXvwvfsFU7f0JTvX
ltIxtepLx7vDY6svXQXDIyzj6PvW5LWy7b4Eqb/Qqf3JG/W2xApafjBfdGpd
jhHLTe1LbrX3US15w4qayV3JubGUWPT1cIiPnB/Ml0a9212h55dG6itqvKPd
4dEAdLFnVUA17IPdoTqOYstl00cugO/yl0FTa46Q/VJvfS8kF9pcct7weY3k
nNKsgmquGfflYeTRy5lg6O5z25Yz/Me1+7jXOMjezk8vbKrJ8v59oG/jbTlZ
wyueXOjbX8j6oexozvTvMf17f2DQEEY+aNEoxMyAIhVfWeDIAj7WkRQq3pLY
BLJKxIVIDqhMgm3WyfrnHIKhIDY++SM/abxsw2firgNA7mnzPazSH01NWgu9
74IiklCfsa7u4oI/hrwsdUgHkNeFHVjHsCeLA/69lKHtPJwQgU5rrzjodDLT
eei17UqdBMbCXM25a1kId6kKHKlKHpsmxubi9rTi6XkICQhhLztuXYpnb3Jx
f/8c9Yll73+JBe9/CdLSn593LgsnaUtrdbus6O+Yy7axfVu3XHFV/vVuGO3o
mKYjOv1z7U2jDa+xXNHWVXFCruc2/IFzMEdjsX0f37DhZlImPFa/6Gq5ba56
/9XwwfIcLcIFr7bj3nY9BAob9wddmt9c8+Ssy6faomgv9HI+w1oueAXYSJ6h
NEanT+aHO6TIg7dZsOMJULK+W3fMmh9if80KVPZ/6CrdBbXOuU1as1g8rWbj
pYWkurrmblD+8rJvUL4/+Q3Kv0H5HxrlN2E6pH7x8ocn/zl8+eqsLblB+XWY
vhLKN+VsE+XvNX+4Qfk3KN/I40X5yXy+Fsrv2qxr9uNpS5fQvwv4e2A/CLdd
2N99xd24impi43TYrEpxltCJlLZHEnx8YoDWSInk/Dp5AUQjig7Q9fVpOSyh
cUYVTjZCygvcSE8tCuko5lY0gV7MTvu0JFVg3yhu7Q5pE8kmFXxS+WVRCTzi
xIW53w01A9pArTEy6bhKjhd8Kj2Zm/ZQcEZPH46d+MR4iY9wJoA2gH3lEbZx
YGrDrX1avAT+vNrJqb9NXuL//ff/3BATemT+jRETJq7bgJ8woeXS8LMZzvXI
5CRdRLhn61TkMOLd5TRdrEU/SQBnKTjcVxLPR0FUL4PRncnwPkKu+987PIh+
uKIkG9EhOg+b2FiNFWlr2T7kiPmJ/rVoDi/nsEwSm+5wyI5uqkQCaB5kVKNc
NVMuZ0zq5IevFm3ESePTSaAs4U9MHrJhbR7FECH23yaT1r7q0CnevzGT14Xd
V2usiMuL+HPxSiJb1WTyhZdcedPMBD8ejqTuPXy51BTbpEo6F1TUmZb6x+Zr
luXS3sRvTBPrTmwzJ9bfbZm0O2fPL22ZXBkW5T4xKSvxKNuUhD6bsSqO6NdN
rpixuKsI/KftVOkejItToS7iRb+4+2Q8EXLsbb0Yy1/0rZ3vzDUY7esIhvfV
nvNQjvQSOmK0RScXJAuDdonOIJSujm346pvhwVJWaFzkfVkhiuIUJMM9CRR4
AjxAFUBgSqsOdh+fvtpbf/kHiLMhDxQ8ZrG4ZYJXlljXsPLj10bqfIcrNCpo
UmpxkgDhH22ggWrTKRt4lJw0SPfQz0FjWwMyGMw8WCsikC2R6dl2VFn3FbqR
m4NkkSRgBZiZzs+aERjAvucmlvyJ8wutN5GGiSeb0akXjA+5jq1IU5E0IOvT
cykOZtlWvSNTvWPnTCIGz8sZnPtLkC3YRpJJKEXl1goFnSZ40GTIEEmf97JU
7A8tsbssh681JTtDfEoNp/a5Kr6jP4cEnfzaSSMsY5OTIxve7tOjh2w1tvJB
q1FBS41lT7PGyNo0m39NkqWDSekSdjnLcrRdloXHvV8iy+JGO9uM8HxxZW/u
pR6PdQSpS8mXVTJT0Xzn0pEVJKutUdggM/xI0OwnZVbMzDnAfcPMLJbGm9sq
mTlsTQvlsyyzFpPwUz/LMmtjb7wU0FLJ2lgcHxXkyawnm+OhhNok68HqNKmh
tszUp4vdaVBESzLrZnmclS+IVB0M3e4d/GyPQ6AQdlZg8mr/Sp5U6pHWfu7P
DTPbPSlL3Fmc711hCuVq6hySU4g3N5Zsd3j0O1oG9KcrPlzVL5m1TsebW6fO
pOPSOrOX7fhy6zTaOj/lrOLx5NZiGj14Kk9uLZk1V/j4TLCe23Kj9fFWDlvl
rPjpyKxrdPOzRp7M/hxDUDAj2rKxGqg+/eDyWNcn2aaxBli/4ldUjCOjQTND
LzcWh7LfWZg7LfDsO4gFb+1QUMdH1jqB+7FmYvgkSTkpbgdPHkIGysbDPdcl
ZWRyirBJ6mJ9Mkbm1e+y205OpiGVj4rRV8r63vcdFs/tRgeGIlor7Ntl5bG8
ivGAtpFHRei2pMTc/hdk3vC+dZs82Licnia4YxMgkvFybjkho+cDQnMTbbMt
yFN7KCmeyabyar/eVgsJBmhxSi3VlxXLcwC6XnRhNaQ82IpKGaFSMaNcJIwL
pvG80Kck8FIt4mWoxoUC5moLuNktrtCEwpT1ozxCvGnOl9rGpyrxQXAyZngd
JnJVz7HqqY3EPRkCNsSmWRkkv32+gM8WabX8zagD3aTL6IPjD0AfqDO1JePU
2tfbuIQVVoAsW1NirkMj4bZKJrSs2OgS9oMv1zBd/YZMaA89+HNDJqydGX5u
yIR+md2QCf0luyETfrFkwoMbMsH9yVO9GzIh6CYTnDd/SWTCg2VkwoOPQSbI
JRqbLfDYmEfgfDZd1+FIs4Q/cN+9Nu4grGdK1koARR12WIb5RJTdNANIHBO+
h6bOLkQOQDOZUcyu1gXBq1TQh6MMdJv9thkD15CunS3w2/iGiwxkS35yREFL
h94CSbBsm8iRTRIcfWSS4MEaJMEmKw1Ux/7F8wP+T++1nu6Ho4keKz+byUwk
vVKywMzA9U22Zt2sIg/frJpiO6t35V6afozLGwdI1oTqtf1krf00rRpof7BC
NptuzNmaNNva4bPRHh9PLmudduLJZf19Pgrd8ymKK59/sp29Pu0Uz0q7fdbf
77PNHT9b3vOznV0/a+772XjnT32ifKO9P1vc/dOLEvHl01DxlnYA9aFBmvl0
NfgWdgFtvPtmOe3hJz22JM2WxvBtrpq47yM67vcjOsws4z6Xs9843USfVL75
komhzOpwKHx8h0eYfosmlIiNs9TXXDlRekiKrS1wOF62wGG75ISziYM2oWhu
hO4KkJfEF2pHB9dxAAhjGlaM4zsoFJv68DefviR9ReZDMRciJvsNDQOR4TdU
So6XswlnP4/a1OWjLBrkSAe58mteUGF36maP2e5yC69NLKNQ7v+21loc2zTK
sTpuw91R9mkuuri/4aILv8f4lTEqK7INKzIoKzInmjE57sWYrMOU9GdItsOM
9GBE1mdCNmRANqQsNucYNuMWNuMUNuMS1uMQNuQONuQMVucKtsERbIkb2IwT
WJEL2JADUP1qTey/Bcy/KdbfEONvD9tvhOnXxPJbwvAfErsjZj9ehtmPPwhm
5yhsY7zuXZbw7l0t9x7LDbwFrr7cwAe6PSsDjtdaGeBg3p54tqmJ3zqW9d+n
+TGm+bsMfUvz/43m/+Rg6/XN/B+vPvP/CUDW4w8AWZtO4ZcCV/FGTrC3Ms8S
eXOydV2yaixz4fHASMQmyCedctdTtyrznbt8MTdeWyvAF+NhtvbBtvK2S+f2
bvtiTvtd7PYwlqIjyOz7cFHofbr0db8QEXRFVSHrHuit1Kb3RevUP5ZeJOyt
DybzVCZ4JYei4HUZJ/HP1Oh0yhUyt5aMygr0IjCZLZ/YvSuLsS9kfc9jZly/
u3VXLmupvXtepREPDXRr+qPh66AyIgW75nZTaU6uMoo9qeFZli9WTUkXlo+E
vKZXMBkOXUG11+m3L18/e4xvgDkl3AUKMQ+xTyXEissWAYXjuABVwI45ycMU
HGdea8MwWRQxH7GM+p9nOWn4a3yjxKdPQgjfyIzk4EXkqHqPcirpCuA8JmOm
yk2SbBQmg6AooXpoFT+DkPhSlnIcRW1Bt/FWMOwLvjoWPZ2IYnkLs1AFq9t9
sXQ+digHd1Eipa8GYddPUPbYujMRTcM0LmbYBVDeQmt3nEUVlskXT+vBkGvG
nhu/P3LyHeqSaQQP6KwoQcdXQeAlgilU9RK9OM2GZOclfcHJB/RenATGPMXS
lNk8S7LJgn+4L/My16HLs8Rlws8PgtMFDGazWmV3xcHkAHr3FBoyAmWCSmBA
L1GvsxCvURdkwANpjhADwFMBvjSb4zBN0yeU74C7B9iM0Ddd73HZEC095ip6
y/Ycdq3hmczhiwO3RfApmtzTkxcnqGXEBtLLB+9u49P37MPA06u2Av1C2Jpx
mpA6KAStnM+pgAgJj4Nv5FXIX94rnzgCBz2dhflbciU0VQVSJ6CosT4AShep
56hSOdK4XsjuTDC8YhXZFgpzK7zs6QlkkISjDCRDr+Cc0McugiLzIuPLu2V4
bk4/T2tWE4jZPMkWONSjT7ErRfdsU6gXlxVd94y9TcwhCKLQXQ8KMNKWhM1m
bAsXOiZQR9pBj0DHCBqJCxPuUXARpgtdWUoVZVUCI12Gx6PFGFUTLli4YSDY
GJ38Jq0PfSyOTOOKYxRZS6o1oNZgBEEGt/BJ9DbNLhM8l42dxbvb9UfvCR6m
1WwkcjH+l88o6mBY9xylBSnTtzTPdxL/rUqD70IyVZDmG4F/nVYF/P0ttDl0
j2/iCtzqXARfx1kErjYcBM/jdLr/YpJFwRl4U3gfxqVJCkn+UmEng9wnl/D/
4Bm0uyijAxnkiRi6JJoNDqV4cYH2gTmoT1ziCYyu2R3s/H8A409F2RABAA==

-->

</rfc>
