<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.29 (Ruby 3.2.3) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-08" category="info" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.31.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Intra-domain and Inter-domain Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-08"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2025" month="November" day="03"/>
    <area>Operations and Management</area>
    <workgroup>Benchmarking Methodology</workgroup>
    <abstract>

<t>This document defines methodologies for benchmarking the performance of intra-domain and inter-domain source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing, and many different designs have been implemented to perform SAV in their respective scenarios. This document treats a SAV device as a black box, defining the methodology in a manner that is agnostic to the underlying mechanism. It provides a method for measuring the performance of both existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are advised to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> depending on their network environments. However, existing intra-domain (intra-AS) and inter-domain (inter-AS) SAV mechanisms suffer from operational overhead and limited SAV accuracy in various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document helps operators obtain a more accurate picture of SAV performance when SAV is enabled on their deployed devices, and helps vendors test the performance of the SAV implementations in their devices.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. A SAV device may support multiple SAV mechanisms, allowing operators to enable those most suitable for their specific network environments. This document considers a SAV device to be a black box, regardless of its internal design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy (i.e., false positive and false negative rates), SAV protocol convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a software router, a virtual machine (VM) instance, or a container instance running as a SAV device. This document outlines methodologies for assessing SAV device performance and comparing various SAV mechanisms and implementations.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing “which SAV mechanism performs best” over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as a “micro-benchmark”).</t>
          </li>
        </ul>
        <t>This benchmark evaluates the SAV performance of individual devices (e.g., hardware/software routers) by comparing different SAV mechanisms under specific network scenarios. The results help determine the appropriate SAV deployment for real-world network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
      <t>Host-facing Router: An edge router directly connected to a layer-2 host network.</t>
      <t>Customer-facing Router: An edge router connected to a non-BGP customer network that includes routers and runs a routing protocol.</t>
      <t>AS Border Router: An intra-domain router facing an external AS.</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup in general is compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topology introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that produces network traffic with various source and destination addresses in order to emulate spoofed or legitimate traffic. Choosing various proportions of spoofed and legitimate traffic is <bcp14>OPTIONAL</bcp14>.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> illustrates the test configuration for the Device Under Test (DUT). Within the test network environment, the DUT can be interconnected with other devices to create a variety of test scenarios. The Tester may establish a direct connection with the DUT or link through intermediary devices. The nature of the connection between them is dictated by the benchmarking tests outlined in <xref target="testcase-sec"/>. Furthermore, the Tester has the capability to produce both spoofed and legitimate traffic to evaluate the SAV accuracy of the DUT in relevant scenarios, and it can also generate traffic at line rate to assess the data plane forwarding performance of the DUT. Additionally, the DUT is required to support logging functionalities to document all test outcomes.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The positioning of the DUT within the network topology has an impact on SAV performance. Therefore, the benchmarking process <bcp14>MUST</bcp14> include evaluating the DUT at multiple locations across the network to ensure a comprehensive assessment.</t>
        <t>The routing configurations of network devices may differ, and the resulting SAV rules depend on these settings. It is essential to clearly document the specific device configurations used during testing.</t>
        <t>Furthermore, the role of each device, such as host-facing router, customer-facing router, or AS border router in an intra-domain network, <bcp14>SHOULD</bcp14> be clearly identified. In an inter-domain context, the business relationships between ASes <bcp14>MUST</bcp14> also be specified.</t>
        <t>When evaluating data plane forwarding performance, the traffic generated by the Tester must be characterized by defined traffic rates, the ratio of spoofed to legitimate traffic, and the distribution of source addresses, as all of these factors can influence test results.</t>
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for overall benchmarking tests. All KPIs <bcp14>SHOULD</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>SHOULD</bcp14> be measured from the result output of the DUT. The standard deviation of the KPIs' test results <bcp14>SHOULD</bcp14> be analyzed for each fixed test setup, which helps characterize the stability of the DUT's performance. The data plane SAV table refreshing rate and data plane forwarding rate below <bcp14>SHOULD</bcp14> be tested with varying SAV table sizes for each fixed test setup, which helps measure the DUT's sensitivity to the SAV table size for these two KPIs.</t>
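      <t>As a minimal illustration of the stability analysis above, the following Python sketch summarizes repeated runs of one KPI with its mean and standard deviation. It is not part of the methodology, and the sample values are hypothetical.</t>
      <sourcecode type="python"><![CDATA[
# Illustrative sketch: summarizing repeated KPI runs (hypothetical values).
from statistics import mean, stdev

def summarize_kpi(samples):
    """Return (mean, standard deviation) for a list of KPI samples."""
    if len(samples) < 2:
        raise ValueError("need at least two runs to compute a deviation")
    return mean(samples), stdev(samples)

# Example: forwarding-rate samples (in Mpps) from five identical test runs.
rates = [14.8, 14.9, 14.7, 14.9, 14.8]
avg, sd = summarize_kpi(rates)
]]></sourcecode>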
      <section anchor="false-positive-rate">
        <name>False Positive Rate</name>
        <t>The proportion of legitimate traffic that the DUT incorrectly classifies as spoofed, measured over all legitimate traffic. This reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="false-negative-rate">
        <name>False Negative Rate</name>
        <t>The proportion of spoofed traffic that the DUT incorrectly classifies as legitimate, measured over all spoofed traffic. This reflects the SAV accuracy of the DUT.</t>
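        <t>The two accuracy KPIs above can be computed from Tester counters. The following Python sketch is illustrative only; the counter names and example numbers are assumptions, not part of any device interface.</t>
        <sourcecode type="python"><![CDATA[
# Illustrative sketch: SAV accuracy KPIs from hypothetical Tester counters.

def false_positive_rate(legit_sent, legit_blocked):
    """Share of legitimate packets the DUT wrongly discarded."""
    return legit_blocked / legit_sent

def false_negative_rate(spoofed_sent, spoofed_forwarded):
    """Share of spoofed packets the DUT wrongly permitted."""
    return spoofed_forwarded / spoofed_sent

# Example: 90,000 legitimate and 10,000 spoofed packets in one run.
fpr = false_positive_rate(90000, 45)    # 45 legitimate packets dropped
fnr = false_negative_rate(10000, 200)   # 200 spoofed packets let through
]]></sourcecode>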
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The control protocol convergence time is the period during which the SAV control plane protocol converges to update the SAV rules when routing changes occur, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI indicates the convergence performance of the SAV protocol.</t>
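        <t>Assuming the Tester timestamps the start of the routing change and the DUT logs the completion of the SAV rule update (both in ISO 8601 format, an assumption made only for illustration), the KPI can be derived as in this sketch:</t>
        <sourcecode type="python"><![CDATA[
# Illustrative sketch: convergence time from two ISO 8601 timestamps.
from datetime import datetime

def convergence_time(change_start_iso, rules_updated_iso):
    """Seconds from the start of the routing change to rule-update completion."""
    t0 = datetime.fromisoformat(change_start_iso)
    t1 = datetime.fromisoformat(rules_updated_iso)
    return (t1 - t0).total_seconds()

elapsed = convergence_time("2025-11-03T10:00:00.000",
                           "2025-11-03T10:00:01.250")
]]></sourcecode>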
      </section>
      <section anchor="protocol-message-processing-throughput">
        <name>Protocol Message Processing Throughput</name>
        <t>The protocol message processing throughput measures the rate at which the DUT processes control plane packets that communicate SAV-related information, and it indicates the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-sav-table-refreshing-rate">
        <name>Data Plane SAV Table Refreshing Rate</name>
        <t>The data plane SAV table refreshing rate refers to the rate at which a DUT updates its SAV table with new SAV rules, and it can reflect the SAV data plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the SAV data plane throughput for processing data plane traffic, and it indicates the SAV data plane performance of the DUT. It is suggested to measure the data plane forwarding rate of the DUT with SAV enabled and with SAV disabled, in order to determine the proportional decrease in forwarding rate. This helps analyze the efficiency of the DUT's SAV data plane implementation.</t>
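        <t>The suggested enabled-versus-disabled comparison reduces to a simple ratio, sketched below with hypothetical rates:</t>
        <sourcecode type="python"><![CDATA[
# Illustrative sketch: proportional forwarding-rate decrease due to SAV.

def forwarding_rate_decrease(rate_sav_off, rate_sav_on):
    """Fractional decrease in forwarding rate caused by enabling SAV."""
    return (rate_sav_off - rate_sav_on) / rate_sav_off

# Example: 10.0 Mpps with SAV disabled, 9.2 Mpps with SAV enabled.
decrease = forwarding_rate_decrease(10.0, 9.2)
]]></sourcecode>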
      </section>
      <section anchor="resource-utilization">
        <name>Resource Utilization</name>
        <t>Resource utilization refers to the CPU and memory usage of the SAV processes on the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra_domain_sav">
        <name>Intra-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Evaluate the false positive rate and false negative rate of the DUT in processing both legitimate and spoofed traffic across various intra-domain network scenarios. These scenarios include SAV implementations for customer/host networks, Internet-facing networks, and aggregation-router-facing networks.</t>
          <t>In the following, this document presents the test scenarios for evaluating intra-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as unused source addresses within the subnetwork, private network source addresses, internal-use-only source addresses of the subnetwork, and external source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different SAV mechanisms may differ in their capability to block packets with forged source addresses of various types. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
          <figure anchor="intra-domain-customer-syn">
            <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |    (10.0.0.0/15)   |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-syn"/> illustrates an intra-domain symmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router and connects to Router 1 for Internet access. A sub network, which resides within the AS and uses the prefix 10.0.0.0/15, is connected to the DUT. The Tester emulates the sub network by advertising this prefix in the control plane and generating both spoofed and legitimate traffic in the data plane. In this setup, the Tester is configured so that inbound traffic destined for 10.0.0.0/15 arrives via the DUT. The DUT learns the route to 10.0.0.0/15 from the Tester, while the Tester sends outbound traffic with source addresses within 10.0.0.0/15 to the DUT, simulating a symmetric routing scenario between the two. The IP addresses used in this test case are examples; users may substitute other addresses, and the same applies to the other test cases.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To verify whether the DUT can generate accurate SAV rules for customer or host network under symmetric routing conditions, construct a testbed as depicted in <xref target="intra-domain-customer-syn"/>. The Tester is connected to the DUT and acts as a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (with source addresses in 10.0.0.0/15) and spoofed traffic (with source addresses in 10.2.0.0/15) toward the DUT. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the sub network.</t>
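          <t>The source-address mix required in step 3 of the procedure above can be sketched as follows. This is an illustrative helper, not a normative tool; the prefixes match the example topology, and the ratio argument expresses spoofed:legitimate (e.g., (1, 9) for 1:9).</t>
          <sourcecode type="python"><![CDATA[
# Illustrative sketch: building the spoofed/legitimate source-address mix.
import ipaddress
import random

LEGIT = ipaddress.ip_network("10.0.0.0/15")    # advertised by the Tester
SPOOFED = ipaddress.ip_network("10.2.0.0/15")  # not advertised

def random_addr(net, rng):
    """Pick a random address inside the given prefix."""
    return str(net[rng.randrange(net.num_addresses)])

def traffic_mix(total, ratio, seed=0):
    """Return (label, source) pairs with the requested spoofed:legitimate ratio."""
    rng = random.Random(seed)
    spoofed_count = total * ratio[0] // (ratio[0] + ratio[1])
    flows = [("spoofed", random_addr(SPOOFED, rng)) for _ in range(spoofed_count)]
    flows += [("legit", random_addr(LEGIT, rng))
              for _ in range(total - spoofed_count)]
    rng.shuffle(flows)
    return flows

mix = traffic_mix(1000, (1, 9))    # 1:9 spoofed to legitimate
]]></sourcecode>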
          <figure anchor="intra-domain-customer-asyn">
            <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer or Host Network</strong>: <xref target="intra-domain-customer-asyn"/> illustrates an intra-domain asymmetric routing scenario in which SAV is deployed for a customer or host network. The DUT performs SAV as a customer/host-facing router. A sub network, i.e., a customer/host network within the AS, is connected to both the DUT and Router 1, and uses the prefix 10.0.0.0/15. The Tester emulates a sub network and handles both its control plane and data plane functions. In this setup, the Tester is configured so that inbound traffic destined for 10.1.0.0/16 is received only from the DUT, while inbound traffic for 10.0.0.0/16 is received only from Router 1. The DUT learns the route to prefix 10.1.0.0/16 from the Tester, and Router 1 learns the route to 10.0.0.0/16 from the Tester. Both the DUT and Router 1 then advertise their respective learned prefixes to Router 2. Consequently, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester sends outbound traffic with source addresses in 10.0.0.0/16 to the DUT, simulating an asymmetric routing scenario between the Tester and the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To determine whether the DUT can generate accurate SAV rules under asymmetric routing conditions, set up the test environment as shown in <xref target="intra-domain-customer-asyn"/>. The Tester is connected to both the DUT and Router 1 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.1.0.0/16) and legitimate traffic (using source addresses in 10.0.0.0/16) toward the DUT. The prefix 10.1.0.0/16 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the sub network.</t>
          <figure anchor="intra-domain-internet-syn">
            <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
|                          |    \/                         |
|                  +--------------------+                  |
|                  |    Sub Network     |                  |
|                  |   (10.0.0.0/15)    |                  |
|                  +--------------------+                  |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-syn"/> illustrates the test scenario for SAV in an Internet-facing network under intra-domain symmetric routing conditions. The network topology resembles that of <xref target="intra-domain-customer-syn"/>, with the key difference being the positioning of the DUT. In this case, the DUT is connected to Router 1 and the Internet, while the Tester emulates the Internet. The DUT performs SAV from an Internet-facing perspective, as opposed to a customer/host-facing role.</t>
          <t>The <strong>procedure</strong> for testing SAV for an Internet-facing network in an intra-domain symmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under symmetric routing, set up the test environment as depicted in <xref target="intra-domain-internet-syn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and allows legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-internet-asyn">
            <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/16 \      / of 10.0.0.0/16              |
|                         \    \/                             |
|                  +--------------------+                     |
|                  |    Sub Network     |                     |
|                  |   (10.0.0.0/15)    |                     |
|                  +--------------------+                     |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
]]></artwork>
          </figure>
          <t><strong>SAV for Internet-facing Network</strong>: <xref target="intra-domain-internet-asyn"/> illustrates a test case for SAV in an Internet-facing network under intra-domain asymmetric routing conditions. The network topology is identical to that of <xref target="intra-domain-customer-asyn"/>, with the key distinction being the placement of the DUT. In this scenario, the DUT is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester emulates the Internet, and the DUT performs Internet-facing SAV rather than customer/host-network-facing SAV.</t>
          <t>The <strong>procedure</strong> for testing SAV in this intra-domain asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for Internet-facing SAV under asymmetric routing, construct the test environment as shown in <xref target="intra-domain-internet-asyn"/>. The Tester is connected to the DUT and emulates the Internet.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.0.0.0/15) and legitimate traffic (using source addresses in 10.2.0.0/15) toward the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the Internet.</t>
          <figure anchor="intra-domain-agg-syn">
            <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|             Traffic with |    | Traffic with             |
|      source IP addresses |    | destination IP addresses |
|           of 10.0.0.0/15 |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                    +--------------------+
                    |Tester (Sub Network)|
                    |   (10.0.0.0/15)    |
                    +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-syn"/> depicts the test scenario for SAV in an aggregation-router-facing network under intra-domain symmetric routing conditions. The network topology in <xref target="intra-domain-agg-syn"/> resembles that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for aggregation-router-facing SAV under symmetric routing, construct the test environment as shown in <xref target="intra-domain-agg-syn"/>. The Tester is connected to Router 1 and emulates a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT and Router 1 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both legitimate traffic (using source addresses in 10.0.0.0/15, the prefix of the emulated sub network) and spoofed traffic (using source addresses in 10.2.0.0/15) toward Router 1. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1.</t>
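          <t>The false positive and false negative rates used throughout these test cases can be derived from per-class packet counters collected on the Tester. The following minimal sketch illustrates the computation; the counter names are illustrative and not taken from any particular DUT or Tester:</t>

```python
def sav_error_rates(legit_sent, legit_blocked, spoof_sent, spoof_passed):
    """False positive rate: fraction of legitimate packets wrongly blocked.
    False negative rate: fraction of spoofed packets wrongly permitted."""
    fpr = legit_blocked / legit_sent if legit_sent else 0.0
    fnr = spoof_passed / spoof_sent if spoof_sent else 0.0
    return fpr, fnr

# Example: a 9:1 legitimate-to-spoofed mix of 1,000,000 packets,
# with 1,500 spoofed packets leaking through the DUT.
fpr, fnr = sav_error_rates(900_000, 0, 100_000, 1_500)
print(fpr, fnr)  # 0.0 0.015
```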
          <figure anchor="intra-domain-agg-asyn">
            <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/15  Network 1  /        \   10.0.0.0/15  Network 1 |
| 10.0.0.0/15  DUT       /         \/  10.1.0.0/15  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|         Traffic with \          / Traffic with              |
|   source IP addresses \        / destination IP addresses   |
|         of 10.0.0.0/15 \      / of 10.0.0.0/15              |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \   \/
                   +--------------------+
                   |Tester (Sub Network)|
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Aggregation-router-facing Network</strong>: <xref target="intra-domain-agg-asyn"/> illustrates the test case for SAV in an aggregation-router-facing network under intra-domain asymmetric routing conditions. The network topology in <xref target="intra-domain-agg-asyn"/> is identical to that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to both Router 1 and Router 2 to emulate a sub network, enabling evaluation of the DUT's false positive and false negative rates when facing Router 1 and Router 2.</t>
          <t>The <strong>procedure</strong> for testing SAV in this aggregation-router-facing asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under asymmetric routing conditions, construct the test environment as shown in <xref target="intra-domain-agg-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates the functions of a sub network.</t>
            </li>
            <li>
              <t>Configure the DUT, Router 1, and Router 2 to establish an asymmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester generates both spoofed traffic (using source addresses in 10.2.0.0/15) and legitimate traffic (using source addresses in 10.0.0.0/15, the prefix of the emulated sub network) toward Router 1. The prefix 10.2.0.0/15 does not belong to the sub network and thus is not advertised by the Tester. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic originating from the direction of Router 1 and Router 2.</t>
          <figure anchor="intra-domain-frr-topo">
            <name>Intra-domain SAV under Fast Reroute (FRR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|     +------------+                     +------------+       |
|     |   Router2  |---------------------|   Router3  |       |
|     +------------+                     +------------+       |
|          /\                                  /\             |
|          |                                   |              |
|          | backup path                       | primary path |
|          |                                   |              |
|     +-----------------------------------------------+       |
|     |                     DUT                       |       |
|     +-----------------------------------------------+       |
|                           /\                                |
|                           | Legitimate and                  |
|                           | Spoofed Traffic                 |
|                           |                                 |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            |
                  +--------------------+
                  |Tester (Sub Network)|
                  +--------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Fast Reroute (FRR) Scenario</strong>: Fast Reroute (FRR) mechanisms such as Loop-Free Alternates (LFA) or Topology-Independent Loop-Free Alternates (TI-LFA) provide sub-second restoration of traffic forwarding after link or node failures. During FRR activation, temporary forwarding changes may occur before the control plane converges, potentially impacting SAV rule consistency and causing transient
false positives or false negatives.</t>
          <t>The <strong>procedure</strong> for testing SAV under the FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure the DUT and adjacent routers with FRR protection for the primary link (Router3–DUT).</t>
            </li>
            <li>
              <t>The Tester continuously sends legitimate and spoofed traffic toward the protected prefix.</t>
            </li>
            <li>
              <t>Trigger a link failure between Router3 and the DUT, causing FRR switchover to Router2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during the switchover and after reconvergence.</t>
            </li>
            <li>
              <t>Restore the primary link and verify that SAV rules revert correctly.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT should maintain correct SAV behavior throughout FRR activation and recovery. False positive and false negative rates <bcp14>SHOULD</bcp14> remain minimal during FRR events, and SAV rules <bcp14>SHOULD</bcp14> update promptly to reflect restored routing.</t>
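          <t>One way to separate transient errors during FRR switchover from steady-state errors after reconvergence is to bucket per-packet verdicts into time windows. The sketch below assumes the Tester can export a per-packet record of (timestamp, spoofed?, blocked?); this record format is hypothetical, not a feature of any particular Tester:</t>

```python
def windowed_rates(events, t_fail, t_converged):
    """Split per-packet verdicts into a switchover window and a
    post-reconvergence window, and compute (FPR, FNR) per window.
    events: iterable of (timestamp, is_spoofed, was_blocked)."""
    windows = {"switchover": [], "after": []}
    for ts, spoofed, blocked in events:
        if ts < t_fail:
            continue  # steady state before the failure is out of scope
        key = "switchover" if ts < t_converged else "after"
        windows[key].append((spoofed, blocked))
    rates = {}
    for name, samples in windows.items():
        legit = [b for s, b in samples if not s]   # blocked flags, legit pkts
        spoof = [b for s, b in samples if s]       # blocked flags, spoofed pkts
        fpr = sum(legit) / len(legit) if legit else 0.0
        fnr = sum(1 for b in spoof if not b) / len(spoof) if spoof else 0.0
        rates[name] = (fpr, fnr)
    return rates

events = [
    (10.5, False, True),   # legitimate packet dropped during switchover
    (10.6, False, False),  # legitimate packet forwarded
    (10.7, True,  False),  # spoofed packet leaked
    (13.0, True,  True),   # spoofed packet blocked after reconvergence
]
print(windowed_rates(events, t_fail=10.0, t_converged=12.0))
# {'switchover': (0.5, 1.0), 'after': (0.0, 0.0)}
```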
          <figure anchor="intra-domain-pbr-topo">
            <name>Intra-domain SAV under Policy-based Routing (PBR) scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|             Test Network Environment           |
|                 +------------+                 |
|                 |  Router2   |                 |
|                 +------------+                 |
|                       /\                       |
|                        | default path          |
|                        |                       |
|               +----------------+               |
|               |       DUT      |               |
|               +----------------+               |
|                 /\           /\                |
|    policy-based /             \ default path   |
|           path /               \               |
|         +-----------+      +-----------+       |
|         |  Router3  |      |  Router1  |       |
|         +-----------+      +-----------+       |
|              /\                 /\             |
|              |                   |             |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
               |                   |
          +-----------------------------+
          |     Tester (Sub Network)    |
          +-----------------------------+
]]></artwork>
          </figure>
          <t><strong>SAV under Policy-based Routing (PBR) Scenario</strong>: Policy-based Routing (PBR) enables forwarding decisions based on user-defined match conditions (e.g., source prefix, DSCP, or interface) instead of the standard routing table. Such policies can create asymmetric paths that challenge the SAV mechanism if rules are derived solely from RIB or FIB information.</t>
          <t>The <strong>procedure</strong> for testing SAV under the PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure PBR on the DUT to forward traffic matching a specific source prefix (e.g., 10.1.0.0/16) to Router3, while other traffic follows the default path to Router1.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate and spoofed traffic, including flows that match the PBR policy and flows that do not.</t>
            </li>
            <li>
              <t>Measure the false positive and false negative rates for both traffic types.</t>
            </li>
            <li>
              <t>Dynamically modify or remove the PBR policy and observe SAV rule adaptation.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> continue to correctly filter spoofed packets and permit legitimate packets under the PBR scenario. SAV rules <bcp14>MUST</bcp14> adapt to policy-based forwarding paths without misclassifying packets.</t>
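          <t>To illustrate why rules derived solely from the FIB are insufficient here, the sketch below models a SAV table as a mapping from source prefix to the set of interfaces on which that prefix is valid; one possible adaptation under the PBR policy is that the entry for 10.1.0.0/16 lists the policy-based path as well as the default path. All interface names and prefixes are illustrative:</t>

```python
import ipaddress

# Hypothetical SAV table: source prefix -> set of valid incoming interfaces.
# With the PBR policy active, traffic sourced from 10.1.0.0/16 may also
# arrive via the policy-based path through Router3.
sav_table = {
    ipaddress.ip_network("10.1.0.0/16"): {"to_Router1", "to_Router3"},
    ipaddress.ip_network("10.2.0.0/16"): {"to_Router1"},
}

def validate(src, in_if):
    """Permit a packet only if its source prefix is valid on the arrival
    interface (longest-prefix match omitted: prefixes are disjoint)."""
    addr = ipaddress.ip_address(src)
    for prefix, interfaces in sav_table.items():
        if addr in prefix:
            return in_if in interfaces
    return False  # unknown source address: blocked

print(validate("10.1.0.1", "to_Router3"))  # True: policy path accepted
print(validate("10.2.0.1", "to_Router3"))  # False: off-policy arrival
```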
        </section>
        <section anchor="intra-control-plane-sec">
          <name>Control Plane Performance</name>
          <t><strong>Objective</strong>: Measure the control plane performance of the DUT, including both protocol convergence performance and protocol message processing performance in response to route changes caused by network failures or operator configurations. Protocol convergence performance is quantified by the convergence time, defined as the duration from the onset of a routing change until the completion of the corresponding SAV rule update. Protocol message processing performance is measured by the processing throughput, represented by the total size of protocol messages processed per second.</t>
          <t>Note that the tests for control plane performance of a DUT performing intra-domain SAV are <bcp14>OPTIONAL</bcp14>. Only a DUT that implements the SAV mechanism using an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="intra-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for its control plane performance.</t>
          <figure anchor="intra-convg-perf">
            <name>Test setup for protocol convergence performance measurement.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
| Emulated Topology |------|   Tester    |<-------->|    DUT    |
+~~~~~~~~~~~~~~~~~~~+      +-------------+          +-----------+
]]></artwork>
          </figure>
          <t><strong>Protocol Convergence Performance</strong>: <xref target="intra-convg-perf"/> illustrates the test setup for measuring protocol convergence performance. The convergence process of the DUT, during which SAV rules are updated, is triggered by route changes resulting from network failures or operator configurations. In <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and simulates these route changes by adding or withdrawing prefixes to initiate the DUT's convergence procedure.</t>
          <t>The <strong>procedure</strong> for testing protocol convergence performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol convergence time of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester withdraws a specified percentage of the total prefixes supported by the DUT, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The protocol convergence time is calculated based on DUT logs that record the start and completion times of the convergence process.</t>
            </li>
          </ol>
          <t>Please note that for IGPs, proportional prefix withdrawal can be achieved by selectively shutting down interfaces. For instance, if the Tester is connected to ten emulated devices through ten interfaces, each advertising a prefix, withdrawing 10% of prefixes can be accomplished by randomly disabling one interface. Similarly, 20% withdrawal corresponds to shutting down two interfaces, and so forth. This is one suggested method; other approaches that achieve the same effect should also be acceptable.</t>
          <t>The protocol convergence time, defined as the duration required for the DUT to complete the convergence process, should be measured from the moment the last “hello” message is received from the emulated device on the disabled interface until SAV rule generation is finalized. To ensure accuracy, the DUT should log the timestamp of the last hello message received and the timestamp when SAV rule updates are complete. The convergence time is the difference between these two timestamps.</t>
          <t>If the emulated device sends a “goodbye hello” message during interface shutdown, it is recommended to use the receipt time of that message, rather than that of the last standard hello, as the starting point; this yields a more precise measurement, as advised in <xref target="RFC4061"/>.</t>
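          <t>Given DUT logs that record these two timestamps, the convergence time is a simple difference. A minimal sketch follows; the log format shown is purely hypothetical:</t>

```python
from datetime import datetime

# Hypothetical DUT log lines: ISO 8601 timestamp followed by an event name.
log = [
    "2024-05-01T12:00:00.120Z EVENT last_hello_received if=eth1",
    "2024-05-01T12:00:01.870Z EVENT sav_rules_updated entries=512",
]

def event_time(line):
    """The timestamp is the first whitespace-separated field."""
    return datetime.fromisoformat(line.split()[0].replace("Z", "+00:00"))

# Convergence time = SAV rule completion minus last hello received.
convergence_seconds = (event_time(log[1]) - event_time(log[0])).total_seconds()
print(convergence_seconds)  # 1.75
```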
          <t><strong>Protocol Message Processing Performance</strong>: The test for protocol message processing performance uses the same setup illustrated in <xref target="intra-convg-perf"/>. This performance metric evaluates the protocol message processing throughput, the rate at which the DUT processes protocol messages. The Tester varies the sending rate of protocol messages, ranging from 10% to 100% of the total link capacity between the Tester and the DUT. The DUT records both the total size of processed protocol messages and the corresponding processing time.</t>
          <t>The <strong>procedure</strong> for testing protocol message processing performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the protocol message processing throughput of the DUT, set up the test environment as shown in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying rates, such as 10%, 20%, up to 100%, of the total link capacity between the Tester and the DUT.</t>
            </li>
            <li>
              <t>The protocol message processing throughput is calculated based on DUT logs that record the total size of processed protocol messages and the total processing time.</t>
            </li>
          </ol>
          <t>To compute the protocol message processing throughput, the DUT logs <bcp14>MUST</bcp14> include the total size of the protocol messages processed and the total time taken for processing. The throughput is then derived by dividing the total message size by the total processing time.</t>
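          <t>As a worked example of this derivation (the figures below are illustrative, not measured values):</t>

```python
def processing_throughput(total_bytes, total_seconds):
    """Protocol message processing throughput in bits per second:
    total size of processed messages divided by total processing time."""
    return total_bytes * 8 / total_seconds

# Example: 250 MB of protocol messages processed in 20 s.
bps = processing_throughput(250_000_000, 20.0)
print(bps / 1e6)  # 100.0 (Mbit/s)
```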
        </section>
        <section anchor="intra-data-plane-sec">
          <name>Data Plane Performance</name>
          <t><strong>Objective</strong>: Evaluate the data plane performance of the DUT, including both data plane SAV table refresh performance and data plane forwarding performance. Data plane SAV table refresh performance is quantified by the refresh rate, which indicates how quickly the DUT updates its SAV table with new SAV rules. Data plane forwarding performance is measured by the forwarding rate, defined as the total size of packets forwarded by the DUT per second.</t>
          <t><strong>Data Plane SAV Table Refreshing Performance</strong>: The evaluation of data plane SAV table refresh performance uses the same test setup shown in <xref target="intra-convg-perf"/>. This metric measures the rate at which the DUT refreshes its SAV table with new SAV rules. The Tester varies the transmission rate of protocol messages, from 10% to 100% of the total link capacity between the Tester and the DUT, to influence the proportion of updated SAV rules and corresponding SAV table entries. The DUT records the total number of updated SAV table entries and the time taken to complete the refresh process.</t>
          <t>The <strong>procedure</strong> for testing data plane SAV table refresh performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane SAV table refreshing rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends protocol messages at varying percentages of the total link capacity, for example, 10%, 20%, up to 100%.</t>
            </li>
            <li>
              <t>The data plane SAV table refreshing rate is calculated based on DUT logs that record the total number of updated SAV table entries and the total refresh time.</t>
            </li>
          </ol>
          <t>To compute the refresh rate, the DUT logs <bcp14>MUST</bcp14> capture the total number of updated SAV table entries and the total time required for refreshing. The refresh rate is then derived by dividing the total number of updated entries by the total refresh time.</t>
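          <t>As a worked example, assuming per-run logs of offered load, updated SAV table entries, and refresh time (all figures illustrative):</t>

```python
# Hypothetical per-run records: (offered load as a fraction of link
# capacity, updated SAV table entries, refresh time in seconds).
runs = [(0.1, 1_000, 0.40), (0.5, 5_000, 1.60), (1.0, 10_000, 2.50)]

# Refresh rate per load level = updated entries / refresh time.
refresh_rates = {load: entries / secs for load, entries, secs in runs}
print(refresh_rates[1.0])  # 4000.0 entries per second at full load
```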
          <t><strong>Data Plane Forwarding Performance</strong>: The evaluation of data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester transmits a mixture of spoofed and legitimate traffic at a rate matching the total link capacity between the Tester and the DUT, while the DUT maintains a fully populated SAV table. The ratio of spoofed to legitimate traffic can be varied within a range, for example, from 1:9 to 9:1. The DUT records the total size of forwarded packets and the total duration of the forwarding process.</t>
          <t>The procedure for testing data plane forwarding performance is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To measure the data plane forwarding rate of the DUT, set up the test environment as depicted in <xref target="intra-convg-perf"/>, with the Tester directly connected to the DUT.</t>
            </li>
            <li>
              <t>The Tester sends a mix of spoofed and legitimate traffic to the DUT at the full link capacity between the Tester and the DUT. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>The data plane forwarding rate is calculated based on DUT logs that record the total size of forwarded traffic and the total forwarding time.</t>
            </li>
          </ol>
          <t>To compute the forwarding rate, the DUT logs <bcp14>MUST</bcp14> include the total size of forwarded traffic and the total time taken for forwarding. The forwarding rate is then derived by dividing the total traffic size by the total forwarding time.</t>
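          <t>Note that with an ideal, fully populated SAV table, the forwarding rate tracks the legitimate share of the offered load, which provides a useful sanity check. A worked example with illustrative figures:</t>

```python
def forwarding_rate(forwarded_bytes, seconds):
    """Data plane forwarding rate in bits per second."""
    return forwarded_bytes * 8 / seconds

# Example: 10 Gbit/s offered for 10 s at a 1:9 spoofed-to-legitimate
# ratio. With ideal SAV, only the legitimate 90% is forwarded.
offered_bytes = 10e9 / 8 * 10          # 12.5 GB offered in total
forwarded_bytes = 0.9 * offered_bytes  # legitimate share only
print(forwarding_rate(forwarded_bytes, 10) / 1e9)  # 9.0 (Gbit/s)
```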
        </section>
      </section>
      <section anchor="inter_domain_sav">
        <name>Inter-domain SAV</name>
        <section anchor="false-positive-and-false-negative-rates-1">
          <name>False Positive and False Negative Rates</name>
          <t><strong>Objective</strong>: Measure the false positive rate and false negative rate of the DUT when processing legitimate and spoofed traffic across multiple inter-domain network scenarios, including SAV implementations for both customer-facing ASes and provider-/peer-facing ASes.</t>
          <t>In the following, this document presents the test scenarios for evaluating inter-domain SAV performance on the DUT. Under each scenario, the generated spoofed traffic <bcp14>SHOULD</bcp14> include different types of forged source addresses, such as source addresses belonging to the local AS but not announced to external networks, private network source addresses, source addresses belonging to other ASes, and unallocated (unused) source addresses. The ratios among these different types of forged source addresses <bcp14>SHOULD</bcp14> vary, since different inter-domain SAV mechanisms may differ in their capability to block packets with forged source addresses of various origins. Nevertheless, for all these types of spoofed traffic, the expected result is that the DUT <bcp14>SHOULD</bcp14> block them.</t>
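          <t>The forged-source categories above can be made concrete with a small classifier on the Tester side. The sketch below uses illustrative documentation and RFC 1918 prefixes, not prefixes from the test topology:</t>

```python
import ipaddress

# Illustrative prefixes for each forged-source category.
LOCAL_UNANNOUNCED = ipaddress.ip_network("203.0.113.0/25")  # local AS, not exported
OTHER_AS = ipaddress.ip_network("198.51.100.0/24")          # belongs to another AS
RFC1918 = [ipaddress.ip_network(p)
           for p in ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def spoof_category(src):
    """Classify a forged source address into the categories listed above."""
    addr = ipaddress.ip_address(src)
    if addr in LOCAL_UNANNOUNCED:
        return "local-unannounced"
    if addr in OTHER_AS:
        return "other-AS"
    if any(addr in net for net in RFC1918):
        return "private"
    return "unallocated-or-unused"

print(spoof_category("10.0.0.1"))      # private
print(spoof_category("198.51.100.7"))  # other-AS
```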
          <figure anchor="inter-customer-syn">
            <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-syn"/> presents a test case for SAV in customer-facing ASes under an inter-domain symmetric routing scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network environment, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which in turn is a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to both AS 2 and the DUT. AS 2 then propagates routes for P1 and P6 to the DUT, enabling the DUT to learn these prefixes from both AS 1 and AS 2. In this test, the legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for customer-facing ASes under symmetric inter-domain routing, construct the test environment as shown in <xref target="inter-customer-syn"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish a symmetric routing environment.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-syn"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-lpp">
            <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-lpp"/> presents a test case for SAV in customer-facing ASes under an inter-domain asymmetric routing scenario induced by NO_EXPORT community configuration. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 with the NO_EXPORT community attribute, preventing AS 2 from propagating the route for P1 to the DUT. Similarly, AS 1 advertises prefix P6 to the DUT with the NO_EXPORT attribute, preventing the DUT from propagating this route to AS 3. As a result, the DUT learns the route for prefix P1 only from AS 1. The legitimate path for traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1 to evaluate the DUT's SAV performance for customer-facing ASes.</t>
          <t>The <strong>procedure</strong> for testing SAV in this asymmetric routing scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under NO_EXPORT-induced asymmetric routing, construct the test environment as shown in <xref target="inter-customer-lpp"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the asymmetric routing scenario.</t>
            </li>
            <li>
              <t>The Tester sends both legitimate traffic (with source addresses in P1 and destination addresses in P4) and spoofed traffic (with source addresses in P5 and destination addresses in P4) to the DUT via AS 2. The ratio of spoofed to legitimate traffic may vary, for example, from 1:9 to 9:1.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic and permits legitimate traffic received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-lpp"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-customer-dsr">
            <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           |             \           \       |
|           P1[AS 1] \          |              \           \      |
|                     \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-dsr"/> presents a test case for SAV in customer-facing ASes under a Direct Server Return (DSR) scenario. In this setup, AS 1, AS 2, AS 3, the DUT, and AS 5 form the test network, with the DUT performing SAV at the AS level. AS 1 is a customer of both AS 2 and the DUT; AS 2 is a customer of the DUT, which is itself a customer of AS 3; and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to an anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. Anycast servers in AS 3 receive the requests and tunnel them to edge servers in AS 1. The edge servers then return content to the users with source addresses in prefix P3. If the reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2. Alternatively, if the reverse forwarding path is AS 1-&gt;AS 2, the Tester sends traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;AS 2. In this case, AS 2 may serve as the DUT.</t>
          <t>The <strong>procedure</strong> for testing SAV in this DSR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules under DSR conditions, construct the test environment as shown in <xref target="inter-customer-dsr"/>. The Tester is connected to AS 1 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to establish the DSR scenario.</t>
            </li>
            <li>
              <t>The Tester sends legitimate traffic (with source addresses in P3 and destination addresses in P2) to AS 2 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT permits legitimate traffic with source addresses in P3 received from the direction of AS 1.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-dsr"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
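<t>The pass/fail decision in this DSR test case can be illustrated with a small, non-normative sketch of an interface-keyed SAV rule table. The interface names are hypothetical, and RFC 5737 documentation prefixes stand in for P1, P2, and P3; a real DUT derives such rules from routing information rather than static configuration:</t>
<sourcecode type="python"><![CDATA[
import ipaddress

# Hypothetical SAV rule table: interface -> acceptable source prefixes.
# In the DSR test case, the anycast prefix P3 must be acceptable on the
# interface toward AS 1, even though AS 1 does not originate P3.
SAV_TABLE = {
    "if_as1": [ipaddress.ip_network("203.0.113.0/24"),    # P3 (anycast)
               ipaddress.ip_network("198.51.100.0/24")],  # P1
    "if_as2": [ipaddress.ip_network("192.0.2.0/24")],     # P2
}

def sav_check(interface, src):
    """Return True if the source address passes SAV on this interface."""
    addr = ipaddress.ip_address(src)
    return any(addr in prefix for prefix in SAV_TABLE.get(interface, []))

# Legitimate DSR return traffic: source in P3 arriving from AS 1.
assert sav_check("if_as1", "203.0.113.7")
# Spoofed P2-sourced traffic from the AS 1 direction is still blocked.
assert not sav_check("if_as1", "192.0.2.9")
]]></sourcecode>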
          <figure anchor="inter-customer-reflect">
            <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 as spoofed by an attacker that is
inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-reflect"/> illustrates a test case for SAV in customer-facing ASes under a reflection attack scenario. In this scenario, a reflection attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P5) that are configured to respond to such requests. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-reflect"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a reflection attack scenario, construct the test environment as shown in <xref target="inter-customer-reflect"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P5) toward AS 5 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-reflect"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
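<t>The false positive and false negative rates referenced in these test cases can be computed directly from the Tester's packet counters. The sketch below is a minimal, non-normative illustration; the counter names are assumptions rather than terms defined in this document:</t>
<sourcecode type="python"><![CDATA[
def sav_error_rates(legit_sent, legit_blocked, spoofed_sent, spoofed_passed):
    """False positive rate: fraction of legitimate packets wrongly blocked.
    False negative rate: fraction of spoofed packets wrongly permitted."""
    fpr = legit_blocked / legit_sent if legit_sent else 0.0
    fnr = spoofed_passed / spoofed_sent if spoofed_sent else 0.0
    return fpr, fnr

# Example: 10000 legitimate packets with 20 blocked, and 10000 spoofed
# packets with none permitted.
fpr, fnr = sav_error_rates(10000, 20, 10000, 0)
assert (fpr, fnr) == (0.002, 0.0)
]]></sourcecode>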
          <figure anchor="inter-customer-direct">
            <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' denotes the source prefix P5 as spoofed by an attacker that is
inside AS 2 or connected to AS 2 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Customer-facing ASes</strong>: <xref target="inter-customer-direct"/> presents a test case for SAV in customer-facing ASes under a direct attack scenario. In this scenario, a direct attack using source address spoofing occurs within the DUT's customer cone. The attacker spoofs a source address (P5) and directly targets the victim's IP address (P1), aiming to overwhelm its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="inter-customer-direct"/> indicate the business relationships between ASes: AS 3 serves as the provider for both the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules in a direct attack scenario, construct the test environment as shown in <xref target="inter-customer-direct"/>. The Tester is connected to AS 2 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P5 and destination addresses in P1) toward AS 1 via the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P5 received from the direction of AS 2.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="inter-customer-direct"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>

          <figure anchor="reflection-attack-p">
            <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
            <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the source prefix P1 as spoofed by an attacker that is
inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="reflection-attack-p"/> illustrates a test case for SAV in provider/peer-facing ASes under a reflection attack scenario. In this scenario, the attacker spoofs the victim's IP address (P1) and sends requests to server IP addresses (P2) that are configured to respond. The Tester emulates the attacker by performing source address spoofing. The servers then send overwhelming responses to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under reflection attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider/peer-facing ASes in a reflection attack scenario, construct the test environment as shown in <xref target="reflection-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the reflection attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P1 and destination addresses in P2) toward AS 2 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P1 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="reflection-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="direct-attack-p">
            <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
            <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes the source prefix P2 as spoofed by an attacker that is
inside AS 3 or connected to AS 3 through other ASes.
]]></artwork>
          </figure>
          <t><strong>SAV for Provider/Peer-facing ASes</strong>: <xref target="direct-attack-p"/> presents a test case for SAV in provider/peer-facing ASes under a direct attack scenario. In this scenario, the attacker spoofs a source address (P2) and directly targets the victim's IP address (P1), aiming to overwhelm its network resources. The Tester emulates the attacker by performing source address spoofing. The arrows in <xref target="direct-attack-p"/> represent the business relationships between ASes: AS 3 acts as either a provider or a lateral peer of the DUT and is the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
          <t>The <strong>procedure</strong> for testing SAV under direct attack conditions is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>To evaluate whether the DUT can generate accurate SAV rules for provider-facing ASes in a direct attack scenario, construct the test environment as shown in <xref target="direct-attack-p"/>. The Tester is connected to AS 3 and generates test traffic toward the DUT.</t>
            </li>
            <li>
              <t>Configure AS 1, AS 2, AS 3, the DUT, and AS 5 to simulate the direct attack scenario.</t>
            </li>
            <li>
              <t>The Tester sends spoofed traffic (with source addresses in P2 and destination addresses in P1) toward AS 1 via AS 3 and the DUT.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT blocks spoofed traffic with source addresses in P2 received from the direction of AS 3.</t>
          <t>Note that the DUT may also be placed at AS 1 or AS 2 in <xref target="direct-attack-p"/> to evaluate its false positive and false negative rates using the same procedure. In these configurations, the DUT is expected to effectively block spoofed traffic.</t>
          <figure anchor="inter-domain-frr-topo">
            <name>Inter-domain SAV under FRR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment           |
|          +-----------+            +-----------+      |
|          |   AS3     |------------|   AS2     |      |
|          +-----------+            +-----------+      |
|               /\                       /\            |
|               |                        |             |
| primary link  |            backup link |             |
|               | (C2P)                  | (C2P)       |
|        +-----------------------------------------+   |
|        |                   DUT                   |   |
|        +-----------------------------------------+   |
|                           /\                         |
|                           |                          |
|                           | Legitimate and           |
|                           | Spoofed Traffic          |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                            | (C2P)
                     +-------------+
                     |    Tester   |
                     +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under FRR Scenario</strong>: Inter-domain Fast Reroute (FRR) mechanisms, such as BGP Prefix Independent Convergence (PIC) or MPLS-based FRR, allow rapid failover between ASes after a link or node failure. These events may temporarily desynchronize routing information and SAV rules.</t>
          <t>The <strong>procedure</strong> for testing SAV under FRR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure FRR or BGP PIC on the DUT for inter-AS links to AS3 (primary) and AS2 (backup).</t>
            </li>
            <li>
              <t>Continuously send legitimate and spoofed traffic from the Tester toward the DUT.</t>
            </li>
            <li>
              <t>Trigger a failure on the link between the DUT and AS3 to activate the FRR backup path via AS2.</t>
            </li>
            <li>
              <t>Measure false positive and false negative rates during and after switchover.</t>
            </li>
            <li>
              <t>Restore the AS3 link and verify SAV table consistency.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>MUST</bcp14> maintain consistent SAV filtering during FRR events. Transient topology changes <bcp14>SHOULD NOT</bcp14> lead to acceptance of spoofed traffic or unnecessary blocking of legitimate packets.</t>
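          <t>One way to report the measurement in step 4 is to classify the Tester's per-packet verdicts into pre-failover, switchover, and post-failover windows and compute the error rates per window. The sketch below is a non-normative illustration; the record format and timestamps are assumptions, not definitions from this document:</t>
          <sourcecode type="python"><![CDATA[
def rates_by_window(records, t_fail, t_converged):
    """records: (timestamp, is_spoofed, was_forwarded) tuples.
    Returns {window: (false_positive_rate, false_negative_rate)}."""
    windows = {"before": [], "during": [], "after": []}
    for ts, spoofed, fwd in records:
        key = "before" if ts < t_fail else ("during" if ts < t_converged else "after")
        windows[key].append((spoofed, fwd))
    out = {}
    for name, pkts in windows.items():
        legit = [f for s, f in pkts if not s]
        spoof = [f for s, f in pkts if s]
        fp = legit.count(False) / len(legit) if legit else 0.0  # legit dropped
        fn = spoof.count(True) / len(spoof) if spoof else 0.0   # spoof passed
        out[name] = (fp, fn)
    return out

log = [(0.5, False, True), (0.6, True, False),   # before failover: correct
       (1.2, False, False), (1.3, True, True),   # during switchover: both wrong
       (2.5, False, True), (2.6, True, False)]   # after convergence: correct
r = rates_by_window(log, t_fail=1.0, t_converged=2.0)
assert r["before"] == (0.0, 0.0)
assert r["during"] == (1.0, 1.0)
assert r["after"] == (0.0, 0.0)
]]></sourcecode>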
          <figure anchor="inter-domain-pbr-topo">
            <name>Inter-domain SAV under PBR scenario.</name>
            <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|               Test Network Environment           |
|     +-----------+            +-----------+       |
|     |   AS3     |------------|   AS2     |       |
|     +-----------+            +-----------+       |
|          /\                       /\             |
|           |                        |             |
|           | preferred path         | default path|
|           | (C2P)                  | (C2P)       |
|    +-----------------------------------------+   |
|    |                  DUT                    |   |
|    +-----------------------------------------+   |
|                        /\                        |
|                         | Legitimate and         |
|                         | Spoofed Traffic        |
|                         |                        |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                          | (C2P) 
                   +-------------+
                   |    Tester   |
                   +-------------+
]]></artwork>
          </figure>
          <t><strong>SAV under PBR Scenario</strong>: In inter-domain environments, routing policies such as local preference, route maps, or communities may alter path selection independently of shortest-path routing. Such policy-driven forwarding can affect how the SAV rules are derived and applied.</t>
          <t>The <strong>procedure</strong> for testing SAV under PBR scenario is as follows:</t>
          <ol spacing="normal" type="1"><li>
              <t>Configure a routing policy on the DUT (e.g., set local preference) to prefer AS3 for specific prefixes while maintaining AS2 as an alternative path.</t>
            </li>
            <li>
              <t>Generate legitimate and spoofed traffic from the Tester, covering both policy-affected and unaffected prefixes.</t>
            </li>
            <li>
              <t>Observe SAV filtering behavior before and after policy changes.</t>
            </li>
            <li>
              <t>Modify the routing policy dynamically and measure false positive and false negative rates.</t>
            </li>
          </ol>
          <t>The <strong>expected results</strong> for this test case are that the DUT <bcp14>SHOULD</bcp14> maintain correct SAV filtering regardless of routing policy changes. Legitimate traffic rerouted by policy <bcp14>MUST NOT</bcp14> be dropped, and spoofed traffic <bcp14>MUST NOT</bcp14> be forwarded during or after policy updates.</t>
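          <t>One way a DUT can satisfy this requirement is to derive SAV rules from all feasible paths rather than only the policy-selected best path, in the spirit of feasible-path uRPF <xref target="RFC8704"/>. The sketch below is a non-normative illustration of that idea; the data model and names are assumptions, not a specification:</t>
          <sourcecode type="python"><![CDATA[
def feasible_sav(adj_ribs_in):
    """adj_ribs_in: interface -> prefixes learned on that interface,
    including non-best paths.  Accepting a source prefix on every
    interface that advertised it keeps legitimate traffic from being
    dropped when policy (e.g., local preference) shifts the best path."""
    return {iface: set(prefixes) for iface, prefixes in adj_ribs_in.items()}

# P1 is advertised over both candidate paths; policy only changes which
# one is preferred for forwarding.
ribs = {"if_as3": ["198.51.100.0/24"], "if_as2": ["198.51.100.0/24"]}
table = feasible_sav(ribs)
# Whichever path policy prefers, P1-sourced traffic passes on either.
assert "198.51.100.0/24" in table["if_as3"]
assert "198.51.100.0/24" in table["if_as2"]
]]></sourcecode>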
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and metrics for evaluating protocol convergence performance and protocol message processing performance can follow those described in <xref target="intra-control-plane-sec"/>. Note that control plane performance tests for a DUT performing inter-domain SAV are <bcp14>OPTIONAL</bcp14>. Only a DUT that implements its SAV mechanism through an explicit control-plane communication protocol, such as the SAV-specific information communication mechanism proposed in <xref target="inter-domain-arch"/>, <bcp14>SHOULD</bcp14> be tested for control plane performance.</t>
        </section>
        <section anchor="data-plane-performance">
          <name>Data Plane Performance</name>
          <t>The test setup, procedure, and metrics for evaluating data plane SAV table refresh performance and data plane forwarding performance can follow those described in <xref target="intra-data-plane-sec"/>.</t>
        </section>
      </section>
      <section anchor="resource-utilization-1">
        <name>Resource Utilization</name>
        <t>When evaluating the DUT for both intra-domain (<xref target="intra_domain_sav"/>) and inter-domain SAV (<xref target="inter_domain_sav"/>) functionality, CPU utilization (for both control and data planes) and memory utilization (for both control and data planes) <bcp14>MUST</bcp14> be recorded. These metrics <bcp14>SHOULD</bcp14> be collected separately per plane to facilitate granular performance analysis.</t>
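        <t>The per-plane recording described above might be summarized as in the following non-normative sketch; the sample format and field names are assumptions for illustration only:</t>
        <sourcecode type="python"><![CDATA[
def summarize(samples):
    """samples: dicts like {"plane": "control", "cpu_pct": 12.0,
    "mem_mb": 256.0}, collected periodically during a test run.
    Returns per-plane average/peak CPU and peak memory."""
    acc = {}
    for s in samples:
        bucket = acc.setdefault(s["plane"], {"cpu": [], "mem": []})
        bucket["cpu"].append(s["cpu_pct"])
        bucket["mem"].append(s["mem_mb"])
    return {plane: {"cpu_avg": sum(v["cpu"]) / len(v["cpu"]),
                    "cpu_max": max(v["cpu"]),
                    "mem_max": max(v["mem"])}
            for plane, v in acc.items()}

data = [{"plane": "control", "cpu_pct": 10.0, "mem_mb": 200.0},
        {"plane": "control", "cpu_pct": 30.0, "mem_mb": 220.0},
        {"plane": "data", "cpu_pct": 50.0, "mem_mb": 512.0}]
s = summarize(data)
assert s["control"] == {"cpu_avg": 20.0, "cpu_max": 30.0, "mem_max": 220.0}
assert s["data"]["cpu_max"] == 50.0
]]></sourcecode>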
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test follows a reporting format comprising both global, standardized components and individual elements specific to each test. The following parameters for test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be documented in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests outlined in this document are confined to evaluating the performance of SAV devices within a controlled laboratory environment, utilizing isolated networks.</t>
      <t>The network topology employed for benchmarking must constitute an independent test setup. It is imperative that this setup remains disconnected from any devices that could potentially relay test traffic into an operational production network.</t>
    </section>
  </middle>
  <back>
    <references anchor="sec-combined-references">
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC4061">
          <front>
            <title>Benchmarking Basic OSPF Single Router Control Plane Convergence</title>
            <author fullname="V. Manral" initials="V." surname="Manral"/>
            <author fullname="R. White" initials="R." surname="White"/>
            <author fullname="A. Shaikh" initials="A." surname="Shaikh"/>
            <date month="April" year="2005"/>
            <abstract>
              <t>This document provides suggestions for measuring OSPF single router control plane convergence. Its initial emphasis is on the control plane of a single OSPF router. We do not address forwarding plane performance.</t>
              <t>NOTE: In this document, the word "convergence" relates to single router control plane convergence only. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="4061"/>
          <seriesInfo name="DOI" value="10.17487/RFC4061"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2025"/>
          </front>
        </reference>
      </references>
    </references>

    <section numbered="false" anchor="Acknowledgements">
      <name>Acknowledgements</name>
      <t>Many thanks to Aijun Wang, Nan Geng, Susan Hares, Giuseppe Fioccola, Minh-Ngoc Tran, Shengnan Yue, Changwang Lin, Yuanyuan Zhang, and Xueyan Song for their valuable comments and reviews of this document.
Apologies to anyone whose name the authors have inadvertently omitted.</t>
    </section>
  </back>

</rfc>
