<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.7.17 (Ruby 3.0.2) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-chen-bmwg-savnet-sav-benchmarking-00" category="std" consensus="true" submissionType="IETF" xml:lang="en" version="3">
  <!-- xml2rfc v2v3 conversion 3.22.0 -->
  <front>
    <title abbrev="SAVBench">Benchmarking Methodology for Source Address Validation</title>
    <seriesInfo name="Internet-Draft" value="draft-chen-bmwg-savnet-sav-benchmarking-00"/>
    <author initials="L." surname="Chen" fullname="Li Chen">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>lichen@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="D." surname="Li" fullname="Dan Li">
      <organization>Tsinghua University</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>tolidan@tsinghua.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Liu" fullname="Libin Liu">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>liulb@zgclab.edu.cn</email>
      </address>
    </author>
    <author initials="L." surname="Qin" fullname="Lancheng Qin">
      <organization>Zhongguancun Laboratory</organization>
      <address>
        <postal>
          <city>Beijing</city>
          <country>China</country>
        </postal>
        <email>qinlc@zgclab.edu.cn</email>
      </address>
    </author>
    <date year="2024" month="July" day="08"/>
    <area>General [REPLACE]</area>
    <workgroup>IETF</workgroup>
    <abstract>

<t>This document defines methodologies for benchmarking the performance of source address validation (SAV) mechanisms. SAV mechanisms generate SAV rules to prevent source address spoofing, and many different designs have been implemented to perform SAV in their corresponding scenarios. This document takes the approach of considering a SAV device to be a black box, defining the methodology in a manner that is agnostic to the mechanisms. It provides a method for measuring the performance of both existing and new SAV implementations.</t>
    </abstract>
  </front>
  <middle>

<section anchor="introduction">
      <name>Introduction</name>
      <t>Source address validation (SAV) is important for preventing source address spoofing. Operators are advised to deploy different SAV mechanisms <xref target="RFC3704"/> <xref target="RFC8704"/> based on their network environments. However, existing intra-domain and inter-domain SAV mechanisms have limitations in operational overhead and accuracy under various scenarios <xref target="intra-domain-ps"/> <xref target="inter-domain-ps"/>. Intra-domain and inter-domain SAVNET architectures <xref target="intra-domain-arch"/> <xref target="inter-domain-arch"/> have been proposed to guide the design of new intra-domain and inter-domain SAV mechanisms that address these problems. The benchmarking methodology defined in this document will help operators obtain a more accurate picture of SAV performance when their deployed devices enable SAV, and will also help vendors test the SAV performance of their devices' implementations.</t>
      <t>This document provides generic methodologies for benchmarking SAV mechanism performance. To achieve the desired functionality, a SAV device may support many SAV mechanisms. This document considers a SAV device to be a black box, regardless of its design and implementation. The tests defined in this document can be used to benchmark a SAV device for SAV accuracy, convergence performance, and control plane and data plane forwarding performance. These tests can be performed on a hardware router, a bare metal server, a virtual machine (VM) instance, or a container instance running as a SAV device. This document is intended for anyone who wants to measure a SAV device's performance or compare the performance of various SAV devices.</t>
      <section anchor="goal-and-scope">
        <name>Goal and Scope</name>
        <t>The benchmarking methodology outlined in this draft focuses on two objectives:</t>
        <ul spacing="normal">
          <li>
            <t>Assessing "which SAV mechanisms perform best" over a set of well-defined scenarios.</t>
          </li>
          <li>
            <t>Measuring the contribution of sub-systems to the overall SAV system's performance (also known as "micro-benchmarking").</t>
          </li>
        </ul>
        <t>The benchmark aims to compare the SAV performance of individual devices, e.g., hardware or software routers. It showcases the performance of various SAV mechanisms for a given device and network scenario, with the objective of helping operators deploy the appropriate SAV mechanism for their network scenarios.</t>
      </section>
      <section anchor="requirements-language">
        <name>Requirements Language</name>
        <t>The key words "<bcp14>MUST</bcp14>", "<bcp14>MUST NOT</bcp14>", "<bcp14>REQUIRED</bcp14>", "<bcp14>SHALL</bcp14>", "<bcp14>SHALL
NOT</bcp14>", "<bcp14>SHOULD</bcp14>", "<bcp14>SHOULD NOT</bcp14>", "<bcp14>RECOMMENDED</bcp14>", "<bcp14>NOT RECOMMENDED</bcp14>",
"<bcp14>MAY</bcp14>", and "<bcp14>OPTIONAL</bcp14>" in this document are to be interpreted as
described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/> when, and only when, they
appear in all capitals, as shown here.</t>

</section>
    </section>
    <section anchor="terminology">
      <name>Terminology</name>
      <t>Improper Block: The validation result in which packets with legitimate source addresses are improperly blocked due to inaccurate SAV rules.</t>
      <t>Improper Permit: The validation result in which packets with spoofed source addresses are improperly permitted due to inaccurate SAV rules.</t>
      <t>SAV Control Plane: The SAV control plane consists of the processes that gather and communicate SAV-related information.</t>
      <t>SAV Data Plane: The SAV data plane stores the SAV rules within a specific data structure and validates each incoming packet to determine whether to permit or discard it.</t>
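The SAV data plane behavior defined above can be pictured as a per-interface prefix check. The sketch below is a minimal illustration only; the rule layout, interface labels, and function names are assumptions for exposition, not anything this document specifies:

```python
import ipaddress

# Hypothetical sketch (not from this document): SAV rules stored as an
# allowlist of acceptable source prefixes per incoming interface; each
# incoming packet is validated against the rules for its arrival
# interface and either permitted or discarded.
sav_rules = {
    "eth0": [ipaddress.ip_network("10.0.0.0/15")],
}

def validate(interface, src_addr):
    """Return True to permit the packet, False to discard it."""
    addr = ipaddress.ip_address(src_addr)
    return any(addr in prefix for prefix in sav_rules.get(interface, []))

print(validate("eth0", "10.1.2.3"))  # inside 10.0.0.0/15 -> True (permit)
print(validate("eth0", "10.2.0.1"))  # outside the prefix -> False (discard)
```

Real data planes use specialized structures (e.g., TCAM or tries) for this lookup; the dictionary above only illustrates the permit/discard decision.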
      <t>Host-facing Router: An intra-domain router of an AS which is connected to a host network (i.e., a layer-2 network).</t>
      <t>Customer-facing Router: An intra-domain router of an AS which is connected to an intra-domain customer network running a routing protocol (i.e., a layer-3 network).</t>
    </section>
    <section anchor="test-methodology">
      <name>Test Methodology</name>
      <section anchor="test-setup">
        <name>Test Setup</name>
        <t>The test setup is in general compliant with <xref target="RFC2544"/>. The Device Under Test (DUT) is connected to a Tester and other network devices to construct the network topologies introduced in <xref target="testcase-sec"/>. The Tester is a traffic generator that generates network traffic with various source and destination addresses in order to emulate spoofing or legitimate traffic. It is <bcp14>OPTIONAL</bcp14> to choose various proportions of spoofing and legitimate traffic, but traffic at line rate is needed to test the data plane forwarding performance.</t>
        <figure anchor="testsetup">
          <name>Test Setup.</name>
          <artwork><![CDATA[
    +~~~~~~~~~~~~~~~~~~~~~~~~~~+
    | Test Network Environment |
    |     +--------------+     |
    |     |              |     |
+-->|     |      DUT     |     |---+
|   |     |              |     |   |
|   |     +--------------+     |   |
|   +~~~~~~~~~~~~~~~~~~~~~~~~~~+   |
|                                  |
|         +--------------+         |
|         |              |         |
+---------|    Tester    |<--------+
          |              |
          +--------------+
]]></artwork>
        </figure>
        <t><xref target="testsetup"/> shows the test setup for the DUT. In the test network environment, the DUT can be connected to other devices to construct various test scenarios. The Tester can be connected to the DUT either directly or through other devices; the connection type between them is determined by the benchmarking tests in <xref target="testcase-sec"/>. The Tester can generate spoofing or legitimate traffic to test the SAV accuracy of the DUT in the corresponding scenarios, and it can also generate traffic at line rate to test the data plane forwarding performance of the DUT. In addition, the DUT needs to support logging to record all the test results.</t>
      </section>
      <section anchor="network-topology-and-device-configuration">
        <name>Network Topology and Device Configuration</name>
        <t>The location where the DUT resides in the network topology affects the accuracy of SAV mechanisms. Therefore, the benchmark <bcp14>MUST</bcp14> put the DUT into different locations in the network to test it.</t>
        <t>The devices in the network topology can have various routing configurations, and the generated SAV rules also depend on these configurations. The device configurations used need to be specified as well.</t>
        <t>In addition, it is necessary to indicate the device role, such as host-facing router, customer-facing router, and AS border router in the intra-domain network, and the business relationship between ASes in the inter-domain network.</t>
        <t>The network traffic generated by the Tester must be specified in terms of traffic rate, the proportion of spoofing and legitimate traffic, and the distribution of source addresses when testing the data plane forwarding performance, as all of these may affect the test results.</t>
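As a minimal illustration of how these reporting parameters combine, the sketch below derives per-class packet counts from a traffic rate, a spoofed-to-legitimate ratio, and a test duration. The helper name and parameter names are illustrative assumptions, not part of the methodology:

```python
# Hypothetical sketch: per-class packet counts for one test run, given
# the parameters the Tester must report (rate, spoofed:legitimate
# ratio, duration). All names are illustrative.
def traffic_profile(rate_pps, spoof_ratio, legit_ratio, duration_s):
    total = rate_pps * duration_s
    spoofed = total * spoof_ratio // (spoof_ratio + legit_ratio)
    return {"total": total, "spoofed": spoofed, "legitimate": total - spoofed}

# A 1:9 spoofed:legitimate mix at 1 Mpps for 60 seconds.
profile = traffic_profile(rate_pps=1_000_000, spoof_ratio=1,
                          legit_ratio=9, duration_s=60)
print(profile)  # 60M packets total: 6M spoofed, 54M legitimate
```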
      </section>
    </section>
    <section anchor="sav-performance-indicators">
      <name>SAV Performance Indicators</name>
      <t>This section lists key performance indicators (KPIs) of SAV for the overall benchmarking tests. All KPIs <bcp14>MUST</bcp14> be measured in the benchmarking scenarios described in <xref target="testcase-sec"/>. Also, the KPIs <bcp14>MUST</bcp14> be measured from the result output of the DUT.</t>
      <section anchor="proportion-of-improper-blocks">
        <name>Proportion of Improper Blocks</name>
        <t>The proportion of legitimate traffic that is improperly blocked by the DUT out of all the legitimate traffic. This KPI reflects the SAV accuracy of the DUT.</t>
      </section>
      <section anchor="proportion-of-improper-permits">
        <name>Proportion of Improper Permits</name>
        <t>The proportion of spoofing traffic that is improperly permitted by the DUT out of all the spoofing traffic. This KPI reflects the SAV accuracy of the DUT.</t>
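Both accuracy KPIs reduce to simple ratios over the corresponding traffic class. A minimal sketch, with illustrative function names (the document itself defines only the proportions, not any computation interface):

```python
# Hypothetical sketch of the two SAV accuracy KPIs: each is a plain
# proportion over one traffic class as observed at the DUT.
def improper_block_rate(legit_sent, legit_blocked):
    """Fraction of legitimate packets the DUT wrongly blocked."""
    return legit_blocked / legit_sent

def improper_permit_rate(spoofed_sent, spoofed_permitted):
    """Fraction of spoofed packets the DUT wrongly permitted."""
    return spoofed_permitted / spoofed_sent

# e.g. 54M legitimate packets sent, 54,000 wrongly dropped;
#      6M spoofed packets sent, 30,000 wrongly forwarded.
print(improper_block_rate(54_000_000, 54_000))   # 0.001
print(improper_permit_rate(6_000_000, 30_000))   # 0.005
```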
      </section>
      <section anchor="protocol-convergence-time">
        <name>Protocol Convergence Time</name>
        <t>The protocol convergence time is the period during which the SAV control plane protocol converges to update the SAV rules when routing changes occur: the time elapsed from the beginning of the routing change to the completion of the SAV rule update. This KPI indicates the convergence performance of the SAV protocol.</t>
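The measurement described above can be sketched as a timed poll: start a timer when the routing change is triggered and stop it when the DUT's SAV rules reach the expected post-change state. The hooks below (`trigger_routing_change`, `sav_rules_updated`) are stand-ins for Tester/DUT integration that this document does not define:

```python
import time

# Hypothetical sketch: convergence time measured from triggering a
# routing change until the DUT's SAV rules reach the expected state.
def measure_convergence(trigger_routing_change, sav_rules_updated,
                        poll_interval_s=0.01, timeout_s=60.0):
    start = time.monotonic()
    trigger_routing_change()
    while time.monotonic() - start < timeout_s:
        if sav_rules_updated():  # e.g. poll the DUT's rule table
            return time.monotonic() - start
        time.sleep(poll_interval_s)
    raise TimeoutError("SAV rules did not converge within the timeout")

# Toy demonstration: an emulated DUT that "converges" on the third poll.
polls = {"n": 0}
def rules_updated():
    polls["n"] += 1
    return polls["n"] >= 3

elapsed = measure_convergence(lambda: None, rules_updated)
print(f"converged after {elapsed:.3f} s")
```

Note that the poll interval bounds the measurement resolution; a real Tester would time-stamp rule updates from the DUT's result output instead of polling.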
      </section>
      <section anchor="control-plane-processing-throughput">
        <name>Control Plane Processing Throughput</name>
        <t>The control plane processing throughput measures the rate at which the DUT processes packets carrying SAV-related information, and it indicates the SAV control plane performance of the DUT.</t>
      </section>
      <section anchor="data-plane-forwarding-rate">
        <name>Data Plane Forwarding Rate</name>
        <t>The data plane forwarding rate measures the throughput at which the SAV data plane processes data plane traffic, and it indicates the SAV data plane performance of the DUT.</t>
      </section>
    </section>
    <section anchor="testcase-sec">
      <name>Benchmarking Tests</name>
      <section anchor="intra-domain-sav">
        <name>Intra-domain SAV</name>
        <section anchor="sav-accuracy">
          <name>SAV Accuracy</name>
          <section anchor="objective">
            <name>Objective</name>
            <t>Measure the accuracy of the DUT in processing legitimate traffic and spoofing traffic across various intra-domain network scenarios, including SAV for customer or host network, SAV for Internet-facing network, and SAV for aggregation-router-facing network. Accuracy is defined as the proportion of legitimate traffic improperly blocked by the DUT out of all the legitimate traffic, together with the proportion of spoofing traffic improperly permitted by the DUT out of all the spoofing traffic.</t>
          </section>
          <section anchor="test-scenarios">
            <name>Test Scenarios</name>
            <section anchor="sav-for-customer-or-host-network">
              <name>SAV for Customer or Host Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-customer-syn">
                <name>SAV for customer or host network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
| FIB on DUT            +~~~~~~~~~~+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +----------+                       |
|                       |   DUT    |                       |
|                       +----------+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-customer-syn"/> shows the case of SAV for customer or host network in the intra-domain symmetric routing scenario. The DUT performs SAV as a customer/host-facing router and connects to Router 1 to access the Internet. Network 1 is a customer/host network within the AS; it connects to the DUT, and its own prefix is 10.0.0.0/15. The Tester can emulate Network 1 to advertise its prefix in the control plane and to generate spoofing and legitimate traffic in the data plane. In this case, the Tester is configured so that the inbound traffic destined for 10.0.0.0/15 comes from the DUT. The DUT learns the route to prefix 10.0.0.0/15 from the Tester, while the Tester can send outbound traffic with source addresses in prefix 10.0.0.0/15 to the DUT, which emulates a symmetric routing scenario between the Tester and the DUT. The IP addresses in this test case are examples; users can use other IP addresses, and this holds true for the other test cases as well.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester generates traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
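Step 3 above leaves the spoofed-to-legitimate ratio as a parameter. The sketch below builds such a mixed stream using this test case's prefixes (10.0.0.0/15 legitimate, 10.2.0.0/15 spoofed); the helper names and packet-descriptor layout are illustrative assumptions, not a Tester API defined here:

```python
import ipaddress
import random

LEGIT = ipaddress.ip_network("10.0.0.0/15")   # Network 1's prefix
SPOOF = ipaddress.ip_network("10.2.0.0/15")   # outside Network 1

def random_source(net, rng):
    # Uniformly random address inside the prefix.
    return ipaddress.ip_address(int(net.network_address)
                                + rng.randrange(net.num_addresses))

def packet_stream(n, spoof_fraction, seed=0):
    """Yield n packet descriptors with the requested spoofed fraction."""
    rng = random.Random(seed)
    for _ in range(n):
        spoofed = rng.random() < spoof_fraction
        yield {"src": str(random_source(SPOOF if spoofed else LEGIT, rng)),
               "spoofed": spoofed}

pkts = list(packet_stream(1000, spoof_fraction=0.1))  # roughly a 1:9 mix
print(sum(p["spoofed"] for p in pkts), "of", len(pkts), "packets spoofed")
```

Sweeping `spoof_fraction` from 0.1 to 0.9 reproduces the 1:9 through 9:1 ratios mentioned in the procedure.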
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-customer-asyn">
                <name>SAV for customer or host network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment               AS |
|                       +~~~~~~~~~~+                          |
|                       | Router 2 |                          |
| FIB on DUT            +~~~~~~~~~~+   FIB on Router 1        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  Router 2  /         \/  10.1.0.0/16  Router 2  |
|               +----------+     +~~~~~~~~~~+                 |
|               |   DUT    |     | Router 1 |                 |
|               +----------+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-customer-asyn"/> shows the case of SAV for customer or host network in the intra-domain asymmetric routing scenario. The DUT performs SAV as a customer/host-facing router. Network 1 is a customer/host network within the AS; it connects to both the DUT and Router 1, and its own prefix is 10.0.0.0/15. The Tester can emulate Network 1 and perform its control plane and data plane functions. In this case, the Tester is configured so that the inbound traffic destined for 10.1.0.0/16 comes only from the DUT and the inbound traffic destined for 10.0.0.0/16 comes only from Router 1. The DUT only learns the route to prefix 10.1.0.0/16 from the Tester, while Router 1 only learns the route to prefix 10.0.0.0/16 from Network 1. Then, the DUT and Router 1 advertise their learned prefixes to Router 2. Besides, the DUT learns the route to 10.0.0.0/16 from Router 2, and Router 1 learns the route to 10.1.0.0/16 from Router 2. The Tester can send outbound traffic with source addresses in prefix 10.0.0.0/16 to the DUT, which emulates an asymmetric routing scenario between the Tester and the DUT.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer or host network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-customer-asyn"/> to construct the test network environment. The Tester is connected to the DUT and Router 1 and performs the functions as Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2, are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester generates traffic using 10.1.0.0/16 as source addresses (spoofing traffic) and traffic using 10.0.0.0/16 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from Network 1 for this test case.</t>
            </section>
            <section anchor="sav-for-internet-facing-network">
              <name>SAV for Internet-facing Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-internet-syn">
                <name>SAV for Internet-facing network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
                   +---------------------+
                   |  Tester (Internet)  |
                   +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                          |
|                          |   \/                          |
|                       +----------+                       |
|                       |    DUT   | SAV facing Internet   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   |     Network 1      |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-internet-syn"/> shows the test case of SAV for Internet-facing network in the intra-domain symmetric routing scenario. In this test case, the network topology is the same as in <xref target="intra-domain-customer-syn"/>; the difference is the location of the DUT, which is connected to Router 1 and the Internet, with the Tester emulating the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-syn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-internet-asyn">
                <name>SAV for Internet-facing network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
                    +---------------------+
                    |  Tester (Internet)  |
                    +---------------------+
                           /\   | Inbound traffic with source 
                           |    | IP address of 10.2.0.0/15
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment |    |                             |
|                          |   \/                             |
|                       +----------+                          |
|                       |    DUT   |                          |
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   |     Network 1      |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-internet-asyn"/> shows the test case of SAV for Internet-facing network in the intra-domain asymmetric routing scenario. In this test case, the network topology is the same as in <xref target="intra-domain-customer-asyn"/>; the difference is the location of the DUT, which is connected to Router 1 and Router 2 within the same AS, as well as to the Internet. The Tester is used to emulate the Internet. The DUT performs Internet-facing SAV instead of customer/host-network-facing SAV.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for Internet-facing network in intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-internet-asyn"/> to construct the test network environment. The Tester is connected to the DUT and performs the functions as the Internet.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (spoofing traffic) and traffic using 10.2.0.0/15 as source addresses (legitimate traffic) to the DUT, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the Internet for this test case.</t>
            </section>
            <section anchor="sav-for-aggregation-router-facing-network">
              <name>SAV for Aggregation-router-facing Network</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="intra-domain-agg-syn">
                <name>SAV for aggregation-router-facing network in intra-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                       +----------+                       |
|                       |    DUT   | SAV facing Router 1   |
| FIB on DUT            +----------+                       |
| Dest         Next_hop   /\    |                          |
| 10.0.0.0/15  Network 1   |    |                          |
|                          |    \/                         |
|                       +~~~~~~~~~~+                       |
|                       | Router 1 |                       |
|                       +~~~~~~~~~~+                       |
|                         /\    |                          |
|Outbound traffic with     |    | Inbound traffic with     |
|source IP addresses       |    | destination IP addresses |
|of 10.0.0.0/15            |    | of 10.0.0.0/15           |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-agg-syn"/> shows the test case of SAV for aggregation-router-facing network in the intra-domain symmetric routing scenario. The test network environment of <xref target="intra-domain-agg-syn"/> is the same as that of <xref target="intra-domain-internet-syn"/>. The Tester is connected to Router 1 to emulate the functions of Network 1, in order to test the SAV accuracy of the DUT facing the direction of Router 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for aggregation-router-facing network in intra-domain symmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-syn"/> to construct the test network environment. The Tester is connected to Router 1 and performs the functions as Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT and Router 1 are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester can send traffic using 10.0.0.0/15 as source addresses (legitimate traffic) and traffic using 10.2.0.0/15 as source addresses (spoofing traffic) to Router 1, respectively. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="intra-domain-agg-asyn">
                <name>SAV for aggregation-router-facing network in intra-domain asymmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                   Test Network Environment                  |
|                       +----------+                          |
|                       |    DUT   | SAV facing Router 1 and 2|
| FIB on Router 1       +----------+   FIB on Router 2        |
| Dest         Next_hop   /\      \    Dest         Next_hop  |
| 10.1.0.0/16  Network 1  /        \   10.0.0.0/16  Network 1 |
| 10.0.0.0/16  DUT       /         \/  10.1.0.0/16  DUT       |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|               | Router 1 |     | Router 2 |                 |
|               +~~~~~~~~~~+     +~~~~~~~~~~+                 |
|                     /\           /                          |
|Outbound traffic with \          / Inbound traffic with      |
|source IP addresses    \        /  destination IP addresses  |
|of 10.0.0.0/16          \      /   of 10.0.0.0/16            |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           \  \/
                   +--------------------+
                   | Tester (Network 1) |
                   |   (10.0.0.0/15)    |
                   +--------------------+
]]></artwork>
              </figure>
              <t><xref target="intra-domain-agg-asyn"/> shows the test case of SAV for the aggregation-router-facing network in the intra-domain asymmetric routing scenario. The test network environment of <xref target="intra-domain-agg-asyn"/> is the same as that of <xref target="intra-domain-internet-asyn"/>. The Tester is connected to Router 1 and Router 2 and emulates the functions of Network 1, so that the SAV accuracy of the DUT can be tested in the directions of Router 1 and Router 2.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for the aggregation-router-facing network in the intra-domain asymmetric routing scenario, a testbed can be built as shown in <xref target="intra-domain-agg-asyn"/> to construct the test network environment. The Tester is connected to Router 1 and Router 2 and performs the functions of Network 1.</t>
                </li>
                <li>
                  <t>Then, the devices including the DUT, Router 1, and Router 2 are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in 10.1.0.0/16 (spoofing traffic) and traffic with source addresses in 10.0.0.0/16 (legitimate traffic) to Router 1. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of Router 1 and Router 2 for this test case.</t>
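              <t>The spoofing-to-legitimate traffic ratios used in the procedures above can be emulated with a simple traffic schedule. The sketch below is illustrative only (the prefixes follow Test Case 2, and the function name is hypothetical); a real Tester would use its own traffic generator.</t>

```python
import ipaddress
import random

SPOOF_NET = ipaddress.ip_network("10.1.0.0/16")  # spoofed source prefix
LEGIT_NET = ipaddress.ip_network("10.0.0.0/16")  # legitimate source prefix

def build_schedule(total_packets, spoof_parts, legit_parts, seed=0):
    """Return a shuffled list of (source_ip, is_spoofed) pairs whose
    spoof:legit ratio equals spoof_parts:legit_parts (e.g., 1:9 ... 9:1)."""
    rng = random.Random(seed)
    n_spoof = total_packets * spoof_parts // (spoof_parts + legit_parts)
    n_legit = total_packets - n_spoof

    def random_host(net):
        # Pick a uniformly random address inside the prefix.
        return str(net.network_address + rng.randrange(net.num_addresses))

    schedule = [(random_host(SPOOF_NET), True) for _ in range(n_spoof)]
    schedule += [(random_host(LEGIT_NET), False) for _ in range(n_legit)]
    rng.shuffle(schedule)
    return schedule
```

              <t>For a 1:9 ratio and 1000 packets, the schedule contains 100 spoofed and 900 legitimate source addresses.</t>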
            </section>
          </section>
        </section>
        <section anchor="intra-pcp">
          <name>Protocol Convergence Performance</name>
          <section anchor="objective-1">
            <name>Objective</name>
            <t>Measure the protocol convergence performance of the DUT when route changes happen due to network failures or operator configurations. It is defined as the protocol convergence time, i.e., the time elapsed from the beginning of the routing change to the completion of the SAV rule update.</t>
          </section>
          <section anchor="test-scenario">
            <name>Test Scenario</name>
            <figure anchor="intra-convg-perf">
              <name>Test setup for protocol convergence performance measurement.</name>
              <artwork><![CDATA[
+-------------+          +-----------+
|   Tester    |<-------->|    DUT    |
+-------------+          +-----------+
]]></artwork>
            </figure>
            <t><strong>Test Case</strong>:</t>
            <t><xref target="intra-convg-perf"/> shows the test setup for protocol convergence performance measurement. The protocol convergence process, in which the DUT updates its SAV rules, starts when route changes happen. Route changes are the cause of SAV rule updates and may result from network failures or operator configurations. Therefore, in <xref target="intra-convg-perf"/>, the Tester is directly connected to the DUT and emulates route changes by adding or withdrawing prefixes, thereby triggering the convergence process of the DUT.</t>
            <t><strong>Procedure</strong>:</t>
            <ol spacing="normal" type="1"><li>
                <t>First, in order to test the protocol convergence time of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
              </li>
              <li>
                <t>Then, the Tester proactively withdraws a certain percentage of the overall prefixes supported by the DUT, such as 10%, 20%, ..., 100%.</t>
              </li>
              <li>
                <t>Finally, the protocol convergence time is calculated according to the logs of the DUT about the beginning and completion of the protocol convergence.</t>
              </li>
            </ol>
            <t><strong>Measurements</strong>: The logs of the DUT record the beginning time and the completion time of the protocol convergence process, and the protocol convergence time is calculated by subtracting the beginning time from the completion time.</t>
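            <t>The subtraction above can be sketched as follows. The log format shown is hypothetical; an actual DUT will use its own timestamp and message formats.</t>

```python
from datetime import datetime

# Hypothetical DUT log lines marking the convergence process.
DUT_LOG = [
    "2024-07-08 10:00:01.250 SAV protocol convergence started",
    "2024-07-08 10:00:03.750 SAV protocol convergence completed",
]

def convergence_seconds(log_lines):
    """Convergence time = completion timestamp minus beginning timestamp."""
    def timestamp(line):
        # The first two whitespace-separated fields form the timestamp.
        return datetime.strptime(" ".join(line.split()[:2]),
                                 "%Y-%m-%d %H:%M:%S.%f")
    begin = next(timestamp(line) for line in log_lines if "started" in line)
    end = next(timestamp(line) for line in log_lines if "completed" in line)
    return (end - begin).total_seconds()
```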
          </section>
        </section>
        <section anchor="intra-cpp">
          <name>Control Plane Performance</name>
          <t><strong>Test Case</strong>:</t>
          <t>The test of the control plane performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The control plane performance measures the throughput at which the control plane processes protocol messages. Therefore, the Tester can vary the rate of sending protocol messages, such as from 10% to 100% of the overall link capacity between the Tester and the DUT. The DUT then records the total size of the processed protocol messages and the processing time.</t>
          <t><strong>Procedure</strong>:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the control plane processing throughput of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends protocol messages to the DUT at a certain percentage of the overall link capacity between the Tester and the DUT, such as 10%, 20%, ..., 100%.</t>
            </li>
            <li>
              <t>Finally, the control plane processing throughput is calculated according to the logs of the DUT about the overall size of the protocol messages and the overall processing time.</t>
            </li>
          </ol>
          <t><strong>Measurements</strong>: The logs of the DUT record the overall size of the protocol messages and the overall processing time, and the control plane processing throughput is calculated by dividing the overall size of the protocol messages by the overall processing time.</t>
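          <t>The throughput calculation above amounts to a single division over the logged counters. The sketch below is a minimal illustration with assumed counter values; the function name and the choice of bits per second are conventions of this example, not requirements of the methodology.</t>

```python
def control_plane_throughput_bps(total_message_bytes, processing_seconds):
    """Control plane processing throughput = overall size of the processed
    protocol messages divided by the overall processing time, in bits/s."""
    return total_message_bytes * 8 / processing_seconds

# Example: 1 MB of protocol messages processed in 2 seconds.
throughput = control_plane_throughput_bps(1_000_000, 2.0)  # 4.0 Mb/s
```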
        </section>
        <section anchor="intra-dpfp">
          <name>Data Plane Forwarding Performance</name>
          <t><strong>Test Case</strong>:</t>
          <t>The test of the data plane forwarding performance uses the same test setup shown in <xref target="intra-convg-perf"/>. The Tester needs to send traffic, including spoofing and legitimate traffic, at the rate of the overall link capacity between the Tester and the DUT, and the DUT builds a SAV table that occupies the entire allocated storage space. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1. The DUT records the overall size of the forwarded packets and the overall forwarding time.</t>
          <t><strong>Procedure</strong>:</t>
          <ol spacing="normal" type="1"><li>
              <t>First, in order to test the data plane forwarding rate of the DUT, a testbed can be built as shown in <xref target="intra-convg-perf"/> to construct the test network environment. The Tester is directly connected to the DUT.</t>
            </li>
            <li>
              <t>Then, the Tester proactively sends the data plane traffic including spoofing and legitimate traffic to the DUT at the rate of the overall link capacity between the Tester and the DUT. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
            </li>
            <li>
              <t>Finally, the data plane forwarding rate is calculated according to the logs of the DUT about the overall size of the forwarded traffic and the overall forwarding time.</t>
            </li>
          </ol>
          <t><strong>Measurements</strong>: The logs of the DUT record the overall size of the forwarded traffic and the overall forwarding time, and the data plane forwarding rate is calculated by dividing the overall size of the forwarded traffic by the overall forwarding time.</t>
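          <t>Analogously to the control plane measurement, the forwarding rate is a division of logged counters. The sketch below also shows the share of offered traffic that an ideally accurate DUT would forward at a given spoof:legit ratio; names and values are illustrative.</t>

```python
def forwarding_rate_bps(forwarded_bytes, forwarding_seconds):
    """Data plane forwarding rate = overall size of the forwarded traffic
    divided by the overall forwarding time, in bits/s."""
    return forwarded_bytes * 8 / forwarding_seconds

def expected_forwarded_share(spoof_parts, legit_parts):
    """With perfectly accurate SAV rules, all spoofed packets are dropped,
    so only the legitimate share of the offered load is forwarded."""
    return legit_parts / (spoof_parts + legit_parts)

# Example: at a 1:9 spoof:legit ratio, 90% of the offered load is forwarded.
share = expected_forwarded_share(1, 9)  # 0.9
```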
        </section>
      </section>
      <section anchor="inter-domain-sav">
        <name>Inter-domain SAV</name>
        <section anchor="sav-accuracy-1">
          <name>SAV Accuracy</name>
          <section anchor="objective-2">
            <name>Objective</name>
            <t>Measure the accuracy of the DUT in processing legitimate traffic and spoofing traffic across various inter-domain network scenarios, including SAV for customer-facing ASes and SAV for provider/peer-facing ASes. Accuracy is defined by the proportion of legitimate traffic which is improperly blocked by the DUT out of all legitimate traffic and the proportion of spoofing traffic which is improperly permitted by the DUT out of all spoofing traffic.</t>
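            <t>The two proportions defined above can be computed directly from traffic counters collected at the Tester and the DUT. The sketch below is illustrative; the counter names are assumptions of this example.</t>

```python
def sav_error_rates(legit_sent, legit_blocked, spoof_sent, spoof_permitted):
    """Return the two accuracy metrics:
    - improper-block rate: legitimate packets blocked out of all legitimate
    - improper-permit rate: spoofed packets permitted out of all spoofed
    A perfectly accurate DUT yields (0.0, 0.0)."""
    return legit_blocked / legit_sent, spoof_permitted / spoof_sent

# Example: 9 of 900 legitimate packets blocked, 1 of 100 spoofed permitted.
rates = sav_error_rates(900, 9, 100, 1)  # (0.01, 0.01)
```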
          </section>
          <section anchor="test-scenario-1">
            <name>Test Scenario</name>
            <section anchor="sav-for-customer-facing-ases">
              <name>SAV for Customer-facing ASes</name>
              <t><strong>Test Case 1</strong>:</t>
              <figure anchor="inter-customer-syn">
                <name>SAV for customer-facing ASes in inter-domain symmetric routing scenario.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|             \         |              \            \      |
|     P6[AS 1] \        |               \            \     |
|      P1[AS 1] \       |                \            \    |
|          (C2P) \      | (C2P/P2P) (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
              </figure>
              <t><xref target="inter-customer-syn"/> presents a test case of SAV for customer-facing ASes in the inter-domain symmetric routing scenario. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefixes P1 and P6 to AS 2 and the DUT, and AS 2 further propagates the routes for prefixes P1 and P6 to the DUT. Consequently, the DUT can learn the routes for prefixes P1 and P6 from both AS 1 and AS 2. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain symmetric routing scenario, a testbed can be built as shown in <xref target="inter-customer-syn"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the symmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P1 and destination addresses in P4 (legitimate traffic) and traffic with source addresses in P5 and destination addresses in P4 (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
              <t><strong>Test Case 2</strong>:</t>
              <figure anchor="inter-customer-lpp">
                <name>SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT.</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                |
|                        +~~~~~~~~~~~~~~~~+                |
|                        |    AS 3(P3)    |                |
|                        +~+/\~~~~~~+/\+~~+                |
|                           /         \                    |
|                          /           \                   |
|                         /             \                  |
|                        / (C2P)         \                 |
|              +------------------+       \                |
|              |      DUT(P4)     |        \               |
|              ++/\+--+/\+----+/\++         \              |
|                /      |       \            \             |
|      P2[AS 2] /       |        \            \            |
|              /        |         \            \           |
|             / (C2P)   |          \ P5[AS 5]   \ P5[AS 5] |
|+~~~~~~~~~~~~~~~~+     |           \            \         |
||    AS 2(P2)    |     | P1[AS 1]   \            \        |
|+~~~~~~~~~~+/\+~~+     | P6[AS 1]    \            \       |
|    P6[AS 1] \         | NO_EXPORT    \            \      |
|     P1[AS 1] \        |               \            \     |
|     NO_EXPORT \       |                \            \    |
|          (C2P) \      | (C2P)     (C2P) \      (C2P) \   |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|             |  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|             +~~~~~~~~~~~~~~~~+        +~~~~~~~~~~~~~~~~+ |
|                  /\     |                                |
|                  |      |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                   |     \/
              +----------------+
              |     Tester     |
              +----------------+
]]></artwork>
              </figure>
              <t><xref target="inter-customer-lpp"/> presents a test case of SAV for customer-facing ASes in the inter-domain asymmetric routing scenario caused by NO_EXPORT configuration. In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. AS 1 advertises prefix P1 to AS 2 and adds the NO_EXPORT community attribute to the BGP advertisement sent to AS 2, preventing AS 2 from further propagating the route for prefix P1 to the DUT. Similarly, AS 1 adds the NO_EXPORT community attribute to the BGP advertisement sent to the DUT, resulting in the DUT not propagating the route for prefix P6 to AS 3. Consequently, the DUT only learns the route for prefix P1 from AS 1 in this scenario. In this test case, the legitimate path for the traffic with source addresses in P1 and destination addresses in P4 is AS 1-&gt;AS 2-&gt;DUT. The Tester is connected to AS 1, and the SAV for customer-facing ASes of the DUT is tested.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in inter-domain asymmetric routing scenario caused by NO_EXPORT, a testbed can be built as shown in <xref target="inter-customer-lpp"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the asymmetric routing scenario.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends traffic with source addresses in P1 and destination addresses in P4 (legitimate traffic) and traffic with source addresses in P5 and destination addresses in P4 (spoofing traffic) to the DUT via AS 2. The ratio of spoofing traffic to legitimate traffic can vary, such as from 1:9 to 9:1.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic and permit the legitimate traffic from the direction of AS 2 for this test case.</t>
              <t><strong>Test Case 3</strong>:</t>
              <figure anchor="inter-customer-dsr">
                <name>SAV for customer-facing ASes in the scenario of direct server return (DSR).</name>
                <artwork><![CDATA[
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
|                  Test Network Environment                       |
|                                +----------------+               |
|                Anycast Server+-+    AS 3(P3)    |               |
|                                +-+/\----+/\+----+               |
|                                   /       \                     |
|                         P3[AS 3] /         \ P3[AS 3]           |
|                                 /           \                   |
|                                / (C2P)       \                  |
|                       +----------------+      \                 |
|                       |     DUT(P4)    |       \                |
|                       ++/\+--+/\+--+/\++        \               |
|          P6[AS 1, AS 2] /     |      \           \              |
|               P2[AS 2] /      |       \           \             |
|                       /       |        \           \            |
|                      / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|      +----------------+       |          \           \          |
|User+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|      +----------+/\+--+       | P6[AS 1]   \           \        |
|          P6[AS 1] \           | NO_EXPORT   \           \       |
|           P1[AS 1] \          |              \           \      |
|           NO_EXPORT \         |               \           \     |
|                      \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                    +----------------+        +----------------+ |
|                    |AS 1(P1, P3, P6)|        |    AS 5(P5)    | |
|                    +----------------+        +----------------+ |
|                         /\     |                                |
|                          |     |                                |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
                           |    \/
                     +----------------+
                     |     Tester     |
                     | (Edge Server)  |
                     +----------------+

Within the test network environment, P3 is the anycast prefix and is only advertised by AS 3 through BGP.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-dsr"/> presents a test case of SAV for customer-facing ASes in the scenario of direct server return (DSR). In this test case, AS 1, AS 2, AS 3, the DUT, and AS 5 construct the test network environment, and the DUT performs SAV as an AS. AS 1 is a customer of AS 2 and the DUT, AS 2 is a customer of the DUT, which is a customer of AS 3, and AS 5 is a customer of both AS 3 and the DUT. When users in AS 2 send requests to the anycast destination IP, the forwarding path is AS 2-&gt;DUT-&gt;AS 3. The anycast servers in AS 3 receive the requests and tunnel them to the edge servers in AS 1. Finally, the edge servers send the content to the users using source addresses in prefix P3. The reverse forwarding path is AS 1-&gt;DUT-&gt;AS 2. The Tester sends the traffic with source addresses in P3 and destination addresses in P2 along the path AS 1-&gt;DUT-&gt;AS 2.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of DSR, a testbed can be built as shown in <xref target="inter-customer-dsr"/> to construct the test network environment. The Tester is connected to AS 1 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of DSR.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the traffic using P3 as source addresses and P2 as destination addresses (legitimate traffic) to AS 2 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can permit the legitimate traffic with source addresses in P3 from the direction of AS 1 for this test case.</t>
              <t><strong>Test Case 4</strong>:</t>
              <figure anchor="inter-customer-reflect">
                <name>SAV for customer-facing ASes in the scenario of reflection attacks.</name>
                <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P1')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |  Server+-+    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' is the spoofed source prefix P1 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-reflect"/> depicts the test case of SAV for customer-facing ASes in the scenario of reflection attacks. In this test case, the reflection attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs the victim's IP addresses (in P1) and sends requests to servers (with IP addresses in P5) that are designed to respond to such requests. The Tester acts as the attacker, performing source address spoofing. The arrows in <xref target="inter-customer-reflect"/> illustrate the commercial relationships between ASes. AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="inter-customer-reflect"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of reflection attacks.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the traffic using P1 as source addresses and P5 as destination addresses (spoofing traffic) to AS 5 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: The DUT can block the spoofing traffic with source addresses in P1 from the direction of AS 2 for this test case.</t>
              <t><strong>Test Case 5</strong>:</t>
              <figure anchor="inter-customer-direct">
                <name>SAV for customer-facing ASes in the scenario of direct attacks.</name>
                <artwork><![CDATA[
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
             |                   Test Network Environment                 |
             |                          +----------------+                |
             |                          |    AS 3(P3)    |                |
             |                          +--+/\+--+/\+----+                |
             |                              /      \                      |
             |                             /        \                     |
             |                            /          \                    |
             |                           / (C2P)      \                   |
             |                  +----------------+     \                  |
             |                  |     DUT(P4)    |      \                 |
             |                  ++/\+--+/\+--+/\++       \                |
             |     P6[AS 1, AS 2] /     |      \          \               |
             |          P2[AS 2] /      |       \          \              |
             |                  /       |        \          \             |
             |                 / (C2P)  |         \ P5[AS 5] \ P5[AS 5]   |
+----------+ |  +----------------+      |          \          \           |
|  Tester  |-|->|                |      |           \          \          |
|(Attacker)| |  |    AS 2(P2)    |      |            \          \         |
|  (P5')   |<|--|                |      | P1[AS 1]    \          \        |
+----------+ |  +---------+/\+---+      | P6[AS 1]     \          \       |
             |     P6[AS 1] \           | NO_EXPORT     \          \      |
             |      P1[AS 1] \          |                \          \     |
             |      NO_EXPORT \         |                 \          \    |
             |                 \ (C2P)  | (C2P)      (C2P) \    (C2P) \   |
             |             +----------------+          +----------------+ |
             |     Victim+-+  AS 1(P1, P6)  |          |    AS 5(P5)    | |
             |             +----------------+          +----------------+ |
             +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P5' is the spoofed source prefix P5 by the attacker which is inside of 
AS 2 or connected to AS 2 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="inter-customer-direct"/> presents the test case of SAV for customer-facing ASes in the scenario of direct attacks. In this test case, the direct attack by source address spoofing takes place within the DUT's customer cone, where the attacker spoofs source addresses (in P5) and directly targets the victim's IP addresses (in P1), overwhelming its network resources. The Tester acts as the attacker, performing source address spoofing. The arrows in <xref target="inter-customer-direct"/> illustrate the commercial relationships between ASes. AS 3 serves as the provider for the DUT and AS 5, while the DUT acts as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, in order to test whether the DUT can generate accurate SAV rules for SAV for customer-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="inter-customer-direct"/> to construct the test network environment. The Tester is connected to AS 2 and generates the test traffic to the DUT.</t>
                </li>
                <li>
                  <t>Then, the ASes, including AS 1, AS 2, AS 3, the DUT, and AS 5, are configured to form the scenario of direct attacks.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the traffic using P5 as source addresses and P1 as destination addresses (spoofing traffic) to AS 1 via the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: In this test case, the DUT is expected to block the spoofing traffic with source addresses in P5 arriving from the direction of AS 2.</t>
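              <t>The validation behavior expected in this test case can be illustrated with a short Python sketch. The concrete prefix values and interface names below are assumptions chosen for illustration; the rule table mirrors the prefix announcements shown in the figure and is not the output of any particular SAV mechanism.</t>
              <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Assumed (illustrative) prefix values; the draft does not assign any.
PREFIXES = {
    "P1": "10.1.0.0/16",
    "P2": "10.2.0.0/16",
    "P5": "10.5.0.0/16",
    "P6": "10.6.0.0/16",
}

# SAV rules the DUT is expected to derive on its customer-facing
# interfaces: interface name -> set of acceptable source prefixes.
# Interface names are hypothetical.
SAV_RULES = {
    "to_AS1": {"P1", "P6"},
    "to_AS2": {"P2", "P6"},   # P5 is not a legitimate source from AS 2
    "to_AS5": {"P5"},
}

def validate(ingress_interface, src_ip):
    """Return True if a packet with this source passes SAV on the
    given customer-facing interface."""
    allowed = SAV_RULES[ingress_interface]
    src = ipaddress.ip_address(src_ip)
    return any(
        src in ipaddress.ip_network(PREFIXES[p]) for p in allowed
    )

# The spoofing traffic of this test case: source in P5 arriving from AS 2.
assert validate("to_AS2", "10.5.0.1") is False   # blocked
assert validate("to_AS2", "10.2.0.1") is True    # legitimate traffic passes
```
]]></sourcecode>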
            </section>
            <section anchor="sav-for-providerpeer-facing-ases">
              <name>SAV for Provider/Peer-facing ASes</name>
              <t><strong>Test case 1</strong>:</t>
              <figure anchor="reflection-attack-p">
                <name>SAV for provider-facing ASes in the scenario of reflection attacks.</name>
                <artwork><![CDATA[
                                   +----------------+
                                   |     Tester     |
                                   |   (Attacker)   |
                                   |      (P1')     |
                                   +----------------+
                                        |     /\
                                        |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment              \/     |                    |
|                                  +----------------+               |
|                                  |                |               |
|                                  |    AS 3(P3)    |               |
|                                  |                |               |
|                                  +-+/\----+/\+----+               |
|                                     /       \                     |
|                                    /         \                    |
|                                   /           \                   |
|                                  / (C2P/P2P)   \                  |
|                         +----------------+      \                 |
|                         |     DUT(P4)    |       \                |
|                         ++/\+--+/\+--+/\++        \               |
|            P6[AS 1, AS 2] /     |      \           \              |
|                 P2[AS 2] /      |       \           \             |
|                         /       |        \           \            |
|                        / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|        +----------------+       |          \           \          |
|Server+-+    AS 2(P2)    |       | P1[AS 1]  \           \         |
|        +----------+/\+--+       | P6[AS 1]   \           \        |
|            P6[AS 1] \           | NO_EXPORT   \           \       |
|             P1[AS 1] \          |              \           \      |
|             NO_EXPORT \         |               \           \     |
|                        \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|                      +----------------+        +----------------+ |
|              Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|                      +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P1' denotes the spoofed source prefix P1; the attacker is inside AS 3
or is connected to AS 3 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="reflection-attack-p"/> depicts the test case of SAV for provider-facing ASes in the scenario of reflection attacks. In this test case, the attacker spoofs the victim's IP address in P1 and sends requests to servers in P2 that respond to such requests. The Tester performs the source address spoofing function as an attacker. The servers then send overwhelming responses back to the victim, exhausting its network resources. The arrows in <xref target="reflection-attack-p"/> represent the commercial relationships between the ASes. AS 3 acts as the provider or lateral peer of the DUT and as the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, to test whether the DUT can generate accurate SAV rules for provider-facing ASes in the scenario of reflection attacks, a testbed can be built as shown in <xref target="reflection-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic toward the DUT.</t>
                </li>
                <li>
                  <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the scenario of reflection attacks.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the spoofing traffic, with source addresses in P1 and destination addresses in P2, to AS 2 via AS 3 and the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: In this test case, the DUT is expected to block the spoofing traffic with source addresses in P1 arriving from the direction of AS 3.</t>
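              <t>The provider-facing check exercised here can likewise be sketched in Python. The sketch models a check in the spirit of enhanced feasible-path uRPF: a source address inside the DUT's customer cone must not arrive from the provider or lateral-peer direction. All prefix values are illustrative assumptions.</t>
              <sourcecode type="python"><![CDATA[
```python
import ipaddress

# Assumed (illustrative) prefix values; the draft does not assign any.
PREFIXES = {
    "P1": "10.1.0.0/16",
    "P2": "10.2.0.0/16",
    "P3": "10.3.0.0/16",
    "P5": "10.5.0.0/16",
    "P6": "10.6.0.0/16",
}

# The DUT's customer cone: prefixes originated by or reachable via its
# customer ASes (AS 1, AS 2, and AS 5 in the figure).
CUSTOMER_CONE = {"P1", "P2", "P5", "P6"}

def accept_from_provider(src_ip):
    """Provider-facing SAV: a source inside the customer cone must not
    arrive from the provider/peer direction (AS 3)."""
    src = ipaddress.ip_address(src_ip)
    spoofed = any(
        src in ipaddress.ip_network(PREFIXES[p]) for p in CUSTOMER_CONE
    )
    return not spoofed

assert accept_from_provider("10.1.0.1") is False  # P1 spoofed by the Tester
assert accept_from_provider("10.3.0.1") is True   # AS 3's own prefix passes
```
]]></sourcecode>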
              <t><strong>Test case 2</strong>:</t>
              <figure anchor="direct-attack-p">
                <name>SAV for provider-facing ASes in the scenario of direct attacks.</name>
                <artwork><![CDATA[
                           +----------------+
                           |     Tester     |
                           |   (Attacker)   |
                           |      (P2')     |
                           +----------------+
                                |     /\
                                |      |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
| Test Network Environment      \/     |                    |
|                          +----------------+               |
|                          |    AS 3(P3)    |               |
|                          +-+/\----+/\+----+               |
|                             /       \                     |
|                            /         \                    |
|                           /           \                   |
|                          / (C2P/P2P)   \                  |
|                 +----------------+      \                 |
|                 |     DUT(P4)    |       \                |
|                 ++/\+--+/\+--+/\++        \               |
|    P6[AS 1, AS 2] /     |      \           \              |
|         P2[AS 2] /      |       \           \             |
|                 /       |        \           \            |
|                / (C2P)  |         \ P5[AS 5]  \ P5[AS 5]  |
|+----------------+       |          \           \          |
||    AS 2(P2)    |       | P1[AS 1]  \           \         |
|+----------+/\+--+       | P6[AS 1]   \           \        |
|    P6[AS 1] \           | NO_EXPORT   \           \       |
|     P1[AS 1] \          |              \           \      |
|     NO_EXPORT \         |               \           \     |
|                \ (C2P)  | (C2P)    (C2P) \     (C2P) \    |
|              +----------------+        +----------------+ |
|      Victim+-+  AS 1(P1, P6)  |        |    AS 5(P5)    | |
|              +----------------+        +----------------+ |
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~+
P2' denotes the spoofed source prefix P2; the attacker is inside AS 3
or is connected to AS 3 through other ASes.
]]></artwork>
              </figure>
              <t><xref target="direct-attack-p"/> presents the test case of SAV for provider-facing ASes in the scenario of direct attacks. In this test case, the attacker spoofs a source address in P2 and directly targets the victim's IP address in P1, overwhelming the victim's network resources. The Tester performs the source address spoofing function as an attacker. The arrows in <xref target="direct-attack-p"/> represent the commercial relationships between the ASes. AS 3 acts as the provider or lateral peer of the DUT and as the provider for AS 5, while the DUT serves as the provider for AS 1, AS 2, and AS 5. Additionally, AS 2 is the provider for AS 1.</t>
              <t><strong>Procedure</strong>:</t>
              <ol spacing="normal" type="1"><li>
                  <t>First, to test whether the DUT can generate accurate SAV rules for provider-facing ASes in the scenario of direct attacks, a testbed can be built as shown in <xref target="direct-attack-p"/> to construct the test network environment. The Tester is connected to AS 3 and generates the test traffic toward the DUT.</t>
                </li>
                <li>
                  <t>Then, AS 1, AS 2, AS 3, the DUT, and AS 5 are configured to form the scenario of direct attacks.</t>
                </li>
                <li>
                  <t>Finally, the Tester sends the spoofing traffic, with source addresses in P2 and destination addresses in P1, to AS 1 via AS 3 and the DUT.</t>
                </li>
              </ol>
              <t><strong>Expected Results</strong>: In this test case, the DUT is expected to block the spoofing traffic with source addresses in P2 arriving from the direction of AS 3.</t>
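              <t>Beyond the pass/fail expectation, a benchmark run typically reports what fraction of the offered spoofed and legitimate traffic was blocked. The following Python sketch shows one way such blocking rates could be computed offline from a trial; the filter rule and address values are hypothetical and purely illustrative.</t>
              <sourcecode type="python"><![CDATA[
```python
import random

def measure_blocking_rate(filter_fn, spoofed, legitimate):
    """Offer a shuffled mix of spoofed and legitimate source addresses
    to a SAV filter and report the blocked fraction of each class."""
    packets = [(s, True) for s in spoofed] + [(l, False) for l in legitimate]
    random.shuffle(packets)
    blocked_spoofed = blocked_legit = 0
    for src_ip, is_spoofed in packets:
        if not filter_fn(src_ip):
            if is_spoofed:
                blocked_spoofed += 1
            else:
                blocked_legit += 1
    return blocked_spoofed / len(spoofed), blocked_legit / len(legitimate)

def sav_filter(src_ip):
    # Hypothetical rule for this test case: sources in P2 (assumed to be
    # 10.2.0.0/16) must not arrive from the AS 3 direction.
    return not src_ip.startswith("10.2.")

spoofed = ["10.2.0.%d" % i for i in range(1, 101)]
legit = ["10.3.0.%d" % i for i in range(1, 101)]
spoof_rate, legit_rate = measure_blocking_rate(sav_filter, spoofed, legit)
assert spoof_rate == 1.0   # all spoofed traffic blocked
assert legit_rate == 0.0   # no false positives on legitimate traffic
```
]]></sourcecode>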
            </section>
          </section>
        </section>
        <section anchor="protocol-convergence-performance">
          <name>Protocol Convergence Performance</name>
          <t>The test setup, procedure, and measurement methods are the same as those described in <xref target="intra-pcp"/>.</t>
        </section>
        <section anchor="control-plane-performance">
          <name>Control Plane Performance</name>
          <t>The test setup, procedure, and measurement methods are the same as those described in <xref target="intra-cpp"/>.</t>
        </section>
        <section anchor="data-plane-forwarding-performance">
          <name>Data Plane Forwarding Performance</name>
          <t>The test setup, procedure, and measurement methods are the same as those described in <xref target="intra-dpfp"/>.</t>
        </section>
      </section>
    </section>
    <section anchor="reporting-format">
      <name>Reporting Format</name>
      <t>Each test has a reporting format that contains some reporting components common to all tests and some components specific to the individual test. The following parameters for the test configuration and SAV mechanism settings <bcp14>MUST</bcp14> be reflected in the test report.</t>
      <t>Test Configuration Parameters:</t>
      <ol spacing="normal" type="1"><li>
          <t>Test device hardware and software versions</t>
        </li>
        <li>
          <t>Device CPU load</t>
        </li>
        <li>
          <t>Network topology</t>
        </li>
        <li>
          <t>Test traffic attributes</t>
        </li>
        <li>
          <t>System configuration (e.g., physical or virtual machine, CPU, memory, caches, operating system, interface capacity)</t>
        </li>
        <li>
          <t>Device configuration (e.g., symmetric routing, NO_EXPORT)</t>
        </li>
        <li>
          <t>SAV mechanism</t>
        </li>
      </ol>
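      <t>A report covering the parameters above could, for example, be captured in a machine-readable form. The following Python sketch shows one possible rendering; all field names and values are illustrative and are not mandated by this document.</t>
      <sourcecode type="python"><![CDATA[
```python
import json

# One possible machine-readable rendering of the parameters that MUST be
# reported. Field names and values are hypothetical examples.
report = {
    "device": {"hardware": "example-router", "software": "example-os 1.2.3"},
    "cpu_load_percent": 35,
    "topology": "SAV for provider-facing ASes, direct-attack scenario",
    "traffic": {"frame_size_bytes": 64, "rate_pps": 100000},
    "system": {"type": "physical", "cpus": 8, "memory_gb": 32,
               "interface_capacity_gbps": 10},
    "device_config": {"symmetric_routing": True, "no_export": True},
    "sav_mechanism": "EFP-uRPF",
}
print(json.dumps(report, indent=2))
```
]]></sourcecode>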
    </section>
    <section anchor="IANA">
      <name>IANA Considerations</name>
      <t>This document has no IANA actions.</t>
    </section>
    <section anchor="security">
      <name>Security Considerations</name>
      <t>The benchmarking tests described in this document are limited to the performance characterization of SAV devices in a lab environment with isolated networks.</t>
      <t>The benchmarking network topology will be an independent test setup and <bcp14>MUST NOT</bcp14> be connected to devices that may forward the test traffic into a production network.</t>
    </section>
  </middle>
  <back>
    <references>
      <name>References</name>
      <references anchor="sec-normative-references">
        <name>Normative References</name>
        <reference anchor="RFC3704">
          <front>
            <title>Ingress Filtering for Multihomed Networks</title>
            <author fullname="F. Baker" initials="F." surname="Baker"/>
            <author fullname="P. Savola" initials="P." surname="Savola"/>
            <date month="March" year="2004"/>
            <abstract>
              <t>BCP 38, RFC 2827, is designed to limit the impact of distributed denial of service attacks, by denying traffic with spoofed addresses access to the network, and to help ensure that traffic is traceable to its correct source network. As a side effect of protecting the Internet against such attacks, the network implementing the solution also protects itself from this and other attacks, such as spoofed management access to networking equipment. There are cases when this may create problems, e.g., with multihoming. This document describes the current ingress filtering operational mechanisms, examines generic issues related to ingress filtering, and delves into the effects on multihoming in particular. This memo updates RFC 2827. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="3704"/>
          <seriesInfo name="DOI" value="10.17487/RFC3704"/>
        </reference>
        <reference anchor="RFC8704">
          <front>
            <title>Enhanced Feasible-Path Unicast Reverse Path Forwarding</title>
            <author fullname="K. Sriram" initials="K." surname="Sriram"/>
            <author fullname="D. Montgomery" initials="D." surname="Montgomery"/>
            <author fullname="J. Haas" initials="J." surname="Haas"/>
            <date month="February" year="2020"/>
            <abstract>
              <t>This document identifies a need for and proposes improvement of the unicast Reverse Path Forwarding (uRPF) techniques (see RFC 3704) for detection and mitigation of source address spoofing (see BCP 38). Strict uRPF is inflexible about directionality, the loose uRPF is oblivious to directionality, and the current feasible-path uRPF attempts to strike a balance between the two (see RFC 3704). However, as shown in this document, the existing feasible-path uRPF still has shortcomings. This document describes enhanced feasible-path uRPF (EFP-uRPF) techniques that are more flexible (in a meaningful way) about directionality than the feasible-path uRPF (RFC 3704). The proposed EFP-uRPF methods aim to significantly reduce false positives regarding invalid detection in source address validation (SAV). Hence, they can potentially alleviate ISPs' concerns about the possibility of disrupting service for their customers and encourage greater deployment of uRPF techniques. This document updates RFC 3704.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="84"/>
          <seriesInfo name="RFC" value="8704"/>
          <seriesInfo name="DOI" value="10.17487/RFC8704"/>
        </reference>
        <reference anchor="RFC2544">
          <front>
            <title>Benchmarking Methodology for Network Interconnect Devices</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <author fullname="J. McQuaid" initials="J." surname="McQuaid"/>
            <date month="March" year="1999"/>
            <abstract>
              <t>This document is a republication of RFC 1944 correcting the values for the IP addresses which were assigned to be used as the default addresses for networking test equipment. This memo provides information for the Internet community.</t>
            </abstract>
          </front>
          <seriesInfo name="RFC" value="2544"/>
          <seriesInfo name="DOI" value="10.17487/RFC2544"/>
        </reference>
        <reference anchor="RFC2119">
          <front>
            <title>Key words for use in RFCs to Indicate Requirement Levels</title>
            <author fullname="S. Bradner" initials="S." surname="Bradner"/>
            <date month="March" year="1997"/>
            <abstract>
              <t>In many standards track documents several words are used to signify the requirements in the specification. These words are often capitalized. This document defines these words as they should be interpreted in IETF documents. This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="2119"/>
          <seriesInfo name="DOI" value="10.17487/RFC2119"/>
        </reference>
        <reference anchor="RFC8174">
          <front>
            <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
            <author fullname="B. Leiba" initials="B." surname="Leiba"/>
            <date month="May" year="2017"/>
            <abstract>
              <t>RFC 2119 specifies common key words that may be used in protocol specifications. This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the defined special meanings.</t>
            </abstract>
          </front>
          <seriesInfo name="BCP" value="14"/>
          <seriesInfo name="RFC" value="8174"/>
          <seriesInfo name="DOI" value="10.17487/RFC8174"/>
        </reference>
      </references>
      <references anchor="sec-informative-references">
        <name>Informative References</name>
        <reference anchor="intra-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Intra-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-ps" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-inter-domain-problem-statement/">
          <front>
            <title>Source Address Validation in Inter-domain Networks Gap Analysis, Problem Statement, and Requirements</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="intra-domain-arch" target="https://datatracker.ietf.org/doc/draft-ietf-savnet-intra-domain-architecture/">
          <front>
            <title>Intra-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
        <reference anchor="inter-domain-arch" target="https://datatracker.ietf.org/doc/draft-wu-savnet-inter-domain-architecture/">
          <front>
            <title>Inter-domain Source Address Validation (SAVNET) Architecture</title>
            <author>
              <organization/>
            </author>
            <date year="2024"/>
          </front>
        </reference>
      </references>
    </references>
  </back>

</rfc>
