# Placement A/B Testing

Placement A/B Testing lets you experiment with different configurations of your Offer Wall placements and determine which settings best boost user engagement and revenue.

DT’s Placement A/B Test evaluates multiple variations of an ad placement within an app to determine which performs best based on engagement, conversions, and revenue. It runs two or more variations simultaneously, letting you test exchange rates, colors, banners, and layouts. You can analyze metrics such as Click-Through Rate (CTR), Offer Conversions, and Unique Users to identify the most effective configuration and improve ad performance.

## Creating an A/B Test Experiment <a href="#id-01h88zew723g6gphc2qzcsdr17" id="id-01h88zew723g6gphc2qzcsdr17"></a>

Set up the experiment by defining the placements and variables to compare before running the test. To start a test:

1. In the [DT Console](https://console.fyber.com/), go to **Monetization→Offer Wall**.\
   The Offer Wall Publisher Dashboard appears.
2. Select the **App** where you want to run the test.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2F3gZoJaJiJ9oMBArXJevi%2F2025-02-10_11-52-42.jpg?alt=media&#x26;token=42f4374a-d679-4ec8-9a99-38d1fb45af38" alt=""></div>

3. Select the **Placement** for the test.\
   The **Placement Details** window appears.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2Fmf4uipxL9bcZG7jDKLPU%2F2025-02-10_12-07-18.jpg?alt=media&#x26;token=ac292e32-f016-4986-96d3-7dcfd4d30b83" alt=""></div>

4. Click **Create Experiment**.\
   The **Experiment** window appears and shows all available settings.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FpmYNfAFpwDrEAgkD5s2o%2F2025-02-10_09-38-23.jpg?alt=media&#x26;token=b01e5989-31da-46a3-b0f0-33a7e68d2188" alt=""></div>

5. In the **Enter Experiment Name** field, enter an Experiment name.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FrX0qqPGjILmQfgjUazAv%2F2025-02-09_14-37-56.jpg?alt=media&#x26;token=36842065-994b-40c4-9d71-5fa026685f4e" alt=""></div>

6. Click **Add Variants** to add new variants.

{% hint style="info" %}
DT allows you to test the Baseline and up to three additional variants, for a total of four.
{% endhint %}

7. To edit the **Variant** name and **User Distribution** percentage, click the **Edit** icon in the **Variant** tab.\
   The **Set variant name and percentage** window appears.

* Enter the **Name** and the **User Distribution**, then click **Save**.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FzvkU4Tq1uFSohJpfxAt5%2F2025-02-10_12-25-23.jpg?alt=media&#x26;token=35cc53a1-b609-4dea-98f5-2dc5c9076bee" alt=""></div>

{% hint style="info" %}
Ensure that you make changes to the Variant, not the Baseline.
{% endhint %}
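DT assigns users to variants server-side according to the percentages you set. Purely as an illustration of how such a percentage split behaves (the variant names and 25/25/25/25 split below are hypothetical, not DT's implementation), a deterministic hash-based split maps each user ID to a stable bucket:

```python
import hashlib

# Illustration only: DT assigns users server-side. This sketch shows how a
# user distribution (Baseline + up to three variants, summing to 100%) can
# map a user ID deterministically into a variant bucket.
DISTRIBUTION = {"Baseline": 25, "Variant A": 25, "Variant B": 25, "Variant C": 25}

def assign_variant(user_id: str, distribution=DISTRIBUTION) -> str:
    assert sum(distribution.values()) == 100, "percentages must sum to 100"
    # Hash the user ID into a stable bucket in [0, 100).
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    cumulative = 0
    for variant, pct in distribution.items():
        cumulative += pct
        if bucket < cumulative:
            return variant
    return "Baseline"  # unreachable when percentages sum to 100
```

Because the assignment is a pure function of the user ID, a given user always sees the same variant for the duration of the experiment.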

8. To test the exchange rate:

* Click **Add Entry** in the **Exchange rate per country** area.
* Select the country from the **Countries** drop-down list and click the checkbox.
* Enter the required exchange rate in the **Exchange Rate Points per 1 USD** field.
* To add more exchange rates, click **Add Entry** again and define the parameters.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FjlrTVTFFZ5cdeZmkSGRi%2FScreen%20Shot%202023-08-23%20at%2011.09.37.png?alt=media&#x26;token=463799e2-8a83-4bb9-82c1-a10f19715bef" alt=""></div>
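The exchange rate determines how many in-app currency points a user earns per 1 USD of offer payout, with per-country entries overriding the placement default. As a minimal sketch (the rates and country codes below are hypothetical, not values from the DT Console):

```python
# Illustration only: hypothetical "Exchange Rate Points per 1 USD" values.
# Per-country entries override a default rate for all other countries.
RATES = {"US": 100, "DE": 90}   # points per 1 USD (hypothetical)
DEFAULT_RATE = 80               # fallback for countries without an entry

def payout_in_points(payout_usd: float, country: str) -> int:
    """Convert an offer payout in USD into in-app currency points."""
    rate = RATES.get(country, DEFAULT_RATE)
    return round(payout_usd * rate)
```

For example, with these rates a 1.50 USD payout yields 150 points for a US user and 120 points for a user in a country without its own entry.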

9. To test the **Tag Color**, **Button Color**, or **Button Text Color**, do one of the following:

* Enter the required hex code in the text field.
* Click the color picker to enter the RGB values or select a color from the palette.\
  The preview updates and reflects the new color selections.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FiNg8D8TuL3pMIywgxYaw%2F2025-02-09_14-58-38.jpg?alt=media&#x26;token=95dbc9e2-a7e7-483e-8dae-022f016e351f" alt=""></div>
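The hex code and RGB inputs in step 9 are two notations for the same color. A minimal sketch of the conversion between them:

```python
# Converting between '#RRGGBB' hex codes and (R, G, B) component values,
# the two equivalent ways to specify a color in step 9.
def hex_to_rgb(hex_code: str) -> tuple:
    """Convert '#RRGGBB' to an (R, G, B) tuple of 0-255 integers."""
    h = hex_code.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Convert (R, G, B) components back to a '#RRGGBB' hex code."""
    return f"#{r:02X}{g:02X}{b:02X}"

hex_to_rgb("#1E90FF")     # -> (30, 144, 255)
rgb_to_hex(30, 144, 255)  # -> "#1E90FF"
```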

10. To test a **Mobile banner image** or **Tablet banner image**:

* Click **Choose File**, and upload the required image.

11. To view the UI customization, click **Open Preview**.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2Fmx6MpJyzqFeNX8fTH6a1%2F2025-02-10_12-31-07%20(1).jpg?alt=media&#x26;token=07969c62-fbcd-448c-b4c7-80a01cb8274e" alt=""></div>

12. Click **Start Experiment**.

{% hint style="info" %}
You can change the user distribution percentages and variant configurations during the experiment. However, these changes might affect the statistical significance of the results.
{% endhint %}
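One common way to judge whether a difference in conversion rate between the Baseline and a variant is statistically significant, rather than noise, is a two-proportion z-test. The sketch below uses hypothetical numbers and is illustrative only; it is not part of the DT Console:

```python
import math

# Illustration only (hypothetical counts): a two-proportion z-test comparing
# conversion rates of the Baseline (a) and a variant (b).
def two_proportion_z(conv_a, users_a, conv_b, users_b):
    p_a, p_b = conv_a / users_a, conv_b / users_b
    p_pool = (conv_a + conv_b) / (users_a + users_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / users_a + 1 / users_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(conv_a=150, users_a=5000, conv_b=195, users_b=5000)
significant = p < 0.05
```

Changing the user split mid-experiment mixes populations gathered under different conditions, which is why such a test can become unreliable after reallocation.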

## Monitoring Test Results <a href="#id-01h8gqd3spps59ra43dkfhbfa5" id="id-01h8gqd3spps59ra43dkfhbfa5"></a>

The DT Offer Wall Report allows you to view the performance of your test. To monitor the test results:

1. In the DT console, go to **Dynamic Reports**→**Offer Wall Report**.\
   See [Offer Wall Report](https://docs.digitalturbine.com/dt-offer-wall/publishers/reporting/dt-offer-wall-report-for-publishers) for more information.
2. To filter the report, add the **Placement Experiment and Variant** dimension to your report.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FaAi3j4tGgnFQqNjRHJ01%2F2025-02-10_12-54-48.jpg?alt=media&#x26;token=1e7efb7e-814f-492a-bf7d-c4c77f3824ae" alt=""></div>

3. Filter the dimensions and select the required experiments.
4. To view the results, toggle the **Performance** button.\
   The report updates and shows the results of the experiments.
5. (Optional) To monitor the aggregated results of the experiment, select **Table**.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FgyxO3ppPj80Ey6dzRcHe%2F2025-02-10_12-38-06.jpg?alt=media&#x26;token=a01cde91-118f-4b81-8c11-9b3f69b9ab12" alt=""></div>

6. (Optional) To track the results over time, select **Line Chart** and split the data by the **Date/Time**.\
   The report appears as a line chart.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2Fhpc48HZFxNbyZZIIISfL%2F2025-02-10_12-48-17.jpg?alt=media&#x26;token=2873f78a-5055-417c-b200-12c829729e84" alt=""></div>

{% hint style="success" %}
You can monitor your KPIs for the experiment and include the following metrics in your report:
{% endhint %}

| **Metric**            | **Description**                                                                   |
| --------------------- | --------------------------------------------------------------------------------- |
| OFW Unique Users      | Unique users per variant; use this to verify the distribution between variants.   |
| Publisher Revenue     | Overall revenue generated by each variant.                                        |
| ARPDEU                | Average revenue per engaged user for each variant.                                |
| Container CTR         | Percentage of users who entered the Offer Wall and clicked on at least one offer. |
| Offer CTR             | Percentage of users who started offers.                                           |
| Click/Conversion Rate | Overall conversion rate for each variant.                                         |

## Choosing the Winning Variant <a href="#id-01h8k79k12h3zxxccbecqzs2e0" id="id-01h8k79k12h3zxxccbecqzs2e0"></a>

After the experiment is complete and you have analyzed the results, follow these steps to set the winning variant as the default:

1. From the **Placement Experiment** window, select the winning variant.
2. Click **End Experiment**.\
   The winning variant is now the default placement configuration.

<div align="left" data-with-frame="true"><img src="https://592572939-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F7drDlNSyycpmA7Zx8kgX%2Fuploads%2FhcJFAZ3KVfDPCOwZLhXg%2FScreen%20Shot%202023-08-30%20at%2014.33.49.png?alt=media&#x26;token=4e01fe03-9abd-4cd6-aeb8-50e57de781b0" alt=""></div>
