This tab allows you to configure and schedule the sending of newsletter issues that utilize A/B testing. It also provides a way to view the e‑mail tracking statistics measured for individual issue variants included in the test.
The slider in the upper part of the page is used to define the size of the subscriber test group. By moving the slider's handle, you can increase or decrease the number of subscribers that will receive the variants of the newsletter during the testing phase. The test group is automatically balanced so that each variant is sent to the same number of subscribers. Because of this, the overall test group size will always be a multiple of the total number of variants created for the issue.
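The balancing rule above can be illustrated with a small sketch (this is purely illustrative, not the product's actual code): the requested test group size is reduced to the nearest multiple of the variant count so that every variant gets an equal share.

```python
# Hypothetical sketch of balanced test group sizing: the requested size is
# rounded down so that it splits evenly among all variants.

def balanced_test_group(requested_size: int, variant_count: int) -> int:
    """Return the largest test group size <= requested_size that can be
    divided equally among variant_count variants."""
    per_variant = requested_size // variant_count
    return per_variant * variant_count

# e.g. a requested group of 1000 subscribers with 3 variants becomes 999
# (333 subscribers per variant)
print(balanced_test_group(1000, 3))  # 999
```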
The remaining subscribers who are not part of the test group will receive the variant selected as the winner once the testing is complete. Please note that the slider will be locked after the first variant is sent out and the testing begins.
Using a full test group
It is even possible to set up a scenario where the test group includes 100% of all subscribers. In this case, the A/B test simply provides a way to evenly distribute different versions of the issue between the subscribers, and the selection of the winner is only done for statistical purposes.
In the Schedule mail-out section, you can specify when individual issue variants should be sent to the corresponding test group of subscribers. To schedule the mail‑out, enter the required date and time into the field below the list (you may use the Calendar selector or the Now link) and then click OK. This can be done for a specific variant, for all variants, or only for those selected through the checkboxes in the list. If the mail-out time is the same for multiple variants, they will be sent in sequence at approximately one-minute intervals.
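The sequential sending of variants that share the same mail-out time can be sketched as follows (an illustrative model only; the product's scheduler works internally and the names here are hypothetical):

```python
from datetime import datetime, timedelta

# Illustrative sketch: variants scheduled for the same time are assigned
# effective send times spaced roughly one minute apart.

def stagger_mailout(variants, scheduled_time, gap_minutes=1):
    """Return (variant, effective_send_time) pairs, offsetting each
    successive variant by gap_minutes."""
    return [
        (variant, scheduled_time + timedelta(minutes=gap_minutes * i))
        for i, variant in enumerate(variants)
    ]

start = datetime(2024, 1, 1, 9, 0)
for name, when in stagger_mailout(["Variant A", "Variant B", "Variant C"], start):
    print(name, when.strftime("%H:%M"))
# Variant A 09:00
# Variant B 09:01
# Variant C 09:02
```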
After the testing phase begins, the variant list is instead used to display the Test results. The current tracking statistics are shown for each variant, specifically the number of opened e‑mails and the number of unique link clicks performed by subscribers. By clicking on these numbers, you can open a dialog with the details of the corresponding statistic for the given variant. During this phase, it is possible to reschedule the sending of variants that have not been mailed yet using the selector and date-time field below the list.
The Select as winner action may be used at any time to manually choose a winner. This opens a confirmation dialog where you can also schedule when the winning issue variant should be sent to the remaining subscribers. If you specify a date in the future, you will still have the option of choosing a different winner during the interval before the mail-out. Once the test is concluded and the winner is decided, the given variant will be highlighted by a green background. At this point, no further actions will be possible except for viewing the statistics.
The configuration made in the Winner selection section at the bottom of the tab determines how the winning variant of the A/B test will be chosen. You can select one of the following options:
•Number of opened e-mails - the system will automatically choose the variant with the highest number of opened e-mails as the winner. This type of testing focuses on optimizing the first impression of the newsletter, i.e. the subject of the e-mails and the sender name or address, not the actual content.
•Total unique clicks - the winner will be chosen automatically according to the number of link clicks measured for each variant. Each link placed in the issue's content will only be counted once per subscriber, even when clicked multiple times. This option is recommended if the primary goal of your newsletter is to encourage subscribers to follow the links provided in the issues.
•Manually - the winner of the A/B test will not be selected automatically. Instead, you can monitor the results of the test using the statistics provided in the list above and choose the winning variant at any time through the Select as winner action.
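The "counted once per subscriber" rule for unique clicks can be shown with a short sketch (the event names and data shape here are hypothetical, for illustration only):

```python
# Illustrative sketch of unique-click counting: a click on a given link is
# counted at most once per subscriber, regardless of how many times it is
# repeated.

def count_unique_clicks(click_events):
    """click_events: iterable of (subscriber_id, link_url) tuples.
    Deduplicating the pairs yields the unique click count."""
    return len(set(click_events))

events = [
    ("alice", "/offer"), ("alice", "/offer"),  # repeated click counted once
    ("alice", "/blog"),
    ("bob", "/offer"),
]
print(count_unique_clicks(events))  # 3
```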
When using an automatic selection option (one of the first two), it is also necessary to enter the duration of the testing period through the Select a winner after settings below. This way, you can specify how long the system should wait after the last variant is sent out before it chooses a winner and mails it to the remaining subscribers.
If a draw occurs (i.e. the top value in the tested statistic is achieved by multiple issue variants), the selection of the winner will be postponed and evaluated again after one hour. In special cases, you may need to choose the winner manually even when using automatic selection (for example if you are testing the number of opened e-mails and all subscribers in the test group view the received issue).
The winner selection settings may be changed at any point while the testing is still in progress.
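Putting the automatic selection rules together, the logic can be sketched roughly as follows (a simplified model under the assumptions described above, not the product's implementation; a draw yields no winner so that evaluation can be repeated later):

```python
# Illustrative sketch of automatic winner selection:
# - "opens": the variant with the most opened e-mails wins
# - "unique_clicks": the variant with the most unique link clicks wins
# - if multiple variants share the top value, None is returned to signal a
#   draw, and the evaluation is postponed (re-run after one hour)

def pick_winner(stats, criterion):
    """stats: variant name -> {"opens": int, "unique_clicks": int}.
    Returns the single leading variant, or None on a draw."""
    best = max(v[criterion] for v in stats.values())
    leaders = [name for name, v in stats.items() if v[criterion] == best]
    return leaders[0] if len(leaders) == 1 else None

stats = {
    "A": {"opens": 120, "unique_clicks": 40},
    "B": {"opens": 120, "unique_clicks": 55},
}
print(pick_winner(stats, "opens"))          # None -> draw, re-evaluated later
print(pick_winner(stats, "unique_clicks"))  # B
```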
Once everything is configured as required, you can confirm that the variants should be sent according to their mail-out scheduling time by clicking the Send button located in the header of the tab. If you only wish to save the configuration of the A/B test without actually starting the mail-out, use the Save button instead.
Further information about newsletters and A/B testing may be found in:
•Developer's Guide -> Modules -> Newsletters -> Overview
•Developer's Guide -> Modules -> Newsletters -> On-line marketing -> A/B testing
•Developer's Guide -> Modules -> Newsletters -> On-line marketing -> E-mail tracking