Interpret your InSight test results
When your test has finished running, you can navigate to the test results page by selecting the test from the all tests list page. The results page shows the impact your marketing action had on your target metric and includes the following information about your test:
- A summary of the results.
- Insights that help you interpret your results and recommendations for next steps.
- A granular breakdown of key metrics.
To help you understand your test results, Adjust provides Actionable insights. For each test, Adjust displays Details about the impact of your marketing action, as well as Recommendations that provide practical advice on what to do next based on the results.
For example, you’ve just completed an InSight test to evaluate the impact of your newly launched campaign and determine whether it achieved the desired incremental lift in both All and Organic traffic.
From your InSight test, you can see detailed information about what occurred when you executed your marketing actions. For example, you can assess the incremental effect of the campaign and determine whether All and Organic installs increased or decreased, or if organic cannibalization occurred. These details help you evaluate the effectiveness of your campaigns.
Additionally, Adjust provides Recommendations based on the test results. If your campaign led to a significant increase in All installs but also resulted in organic cannibalization, Adjust might suggest increasing your budget for this campaign while simultaneously shortening your attribution windows. By combining detailed results with actionable recommendations, Adjust empowers you to make informed decisions and refine your marketing strategies effectively.
Glossary of terms
The following terms are used on the results page to describe test results.
- Incremental lift
- Your campaign had a positive effect on your target metric that wouldn’t have occurred if the campaign hadn’t run.
- Loss
- Your campaign had a negative effect on your target metric that wouldn’t have occurred if the campaign hadn’t run.
- Organic cannibalization
- Events/installs that would have occurred without the marketing action taking place. Organic cannibalization data is available only when measuring Campaign start or Budget increase.
- Incremental effect absolute value
- The effect your action had on the metric value over the entire testing period. This is measured as the average daily difference between the actual value with the action and the predicted metric value without the action. The absolute value is calculated using the following formula:
average actual value - average predicted value
- Incremental effect percentage
- The effect your action had on your target metric, expressed as a percentage. This is measured by comparing the average daily actual value with the action against the predicted metric value without the action (a calculation sketch follows this glossary).
- MAPE (Model performance)
- The mean absolute percentage error (MAPE) value measures the average difference between the actual values of your target metric during the pre-period and the predicted values produced by the model. A lower percentage difference between these values indicates a more accurate model. A MAPE below 10% indicates high model performance, 10% to 20% medium, and above 20% low.
- P-value (Significance)
- The P-value (Probability value) represents the probability that your results are due to chance. A lower P-value indicates a higher probability that your marketing action directly impacted your target metric, while a higher P-value indicates that there is a higher probability that your results were due to other changes. Values of less than 0.05 indicate your measured action had a significant impact.
- P-value 0.01
- A P-value of 0.01 indicates a 1% probability that the observed effect could occur due to chance. This is considered strong evidence that your marketing action had a significant impact.
- P-value 0.2
- A P-value of 0.2 indicates a 20% probability that the observed effect could be due to random fluctuations rather than your marketing action. It suggests your marketing action might have had an impact, but the evidence isn’t strong enough to be sure without further data or analysis.
- P-value 0.5
- A P-value of 0.5 indicates a 50% probability that the observed effects could be due to chance, suggesting no evidence that your marketing action had any significant impact on your target metric.
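To make the Incremental effect and MAPE definitions above concrete, here is a minimal calculation sketch. It assumes daily values stored as NumPy arrays, takes the percentage relative to the predicted value, and uses invented numbers; it only illustrates the glossary definitions and is not Adjust's implementation.

```python
import numpy as np

def incremental_effect(actual: np.ndarray, predicted: np.ndarray) -> tuple[float, float]:
    """Incremental effect of an action over the test period.

    `actual` and `predicted` are daily values of the target metric after the
    action (with and without the action). Returns (absolute value, percentage),
    following the glossary definitions above.
    """
    absolute = actual.mean() - predicted.mean()       # average actual value - average predicted value
    percentage = 100 * absolute / predicted.mean()    # effect relative to the prediction (assumed denominator)
    return absolute, percentage

def mape(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error over the pre-period.

    Lower is better: below 10% is high model performance,
    10% to 20% is medium, above 20% is low.
    """
    return 100 * np.mean(np.abs((actual - predicted) / actual))

# Illustrative numbers only, not real test data.
post_actual = np.array([120, 130, 125, 140, 135], dtype=float)     # daily installs with the action
post_predicted = np.array([100, 105, 102, 110, 108], dtype=float)  # predicted installs without the action

abs_effect, pct_effect = incremental_effect(post_actual, post_predicted)
print(f"Incremental effect: {abs_effect:.1f} installs/day ({pct_effect:.1f}%)")
```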
Actionable insights
The results screen lets you analyze results for All installs and Organic installs across five sections that highlight all aspects of your incrementality test and provide you with actionable insights.
App bar
- Identifies key aspects of your incrementality test
- Action: The marketing action you selected for the test
- Campaign: Name of the campaign selected for the test
- Country: Country for which the test was run
- Channel: Channel selected for the test
- After action: Time period during which incrementality was analyzed
- Metric: The metric you chose to measure
- Performance: How reliable the model is
- Significance: How the change falls within the prediction range.
- If the actual results fall within the prediction range, the result is insignificant.
- If the result falls outside the prediction range, the result is significant.
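As a reading aid, the significance rule above amounts to a simple range check: significance depends on whether the actual result lands outside the prediction range. The sketch below is purely illustrative, with made-up bounds; Adjust's model produces the prediction range itself.

```python
def is_significant(actual: float, prediction_low: float, prediction_high: float) -> bool:
    """Significant only when the actual result falls outside the prediction range."""
    return not (prediction_low <= actual <= prediction_high)

# Illustrative: actual daily installs vs. a hypothetical prediction range.
print(is_significant(actual=1_450, prediction_low=1_000, prediction_high=1_300))  # True -> significant
```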
Action response
- The Result of the test: Whether the marketing action had a Positive, Mixed, or Negative impact on your campaign.
- Details referring to your action and its impact.
- Recommendations for next steps you can take based on the test results.
Adjust calculates the Result of a test by measuring the impact your marketing action had on your target metric for All traffic and for Organic traffic. The table below shows how each result is calculated.
Marketing action | Impact on metric for All traffic | Impact on metric for Organic traffic | InSight result |
---|---|---|---|
Campaign Start | Lift | Lift | Positive |
Campaign Start | Lift | Loss | Mixed |
Campaign Start | Loss | Lift | Mixed |
Campaign Start | Loss | Loss | Negative |
Budget Increase | Lift | Lift | Positive |
Budget Increase | Lift | Loss | Mixed |
Budget Increase | Loss | Lift | Mixed |
Budget Increase | Loss | Loss | Mixed |
Campaign Stop | Lift | Lift | Mixed |
Campaign Stop | Lift | Loss | Mixed |
Campaign Stop | Loss | Lift | Mixed |
Campaign Stop | Loss | Loss | Positive |
Budget Decrease | Lift | Lift | Negative |
Budget Decrease | Lift | Loss | Mixed |
Budget Decrease | Loss | Lift | Mixed |
Budget Decrease | Loss | Loss | Positive |
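If you need to reproduce this mapping programmatically, for example when post-processing exported results, it can be expressed as a simple lookup. The sketch below is a hypothetical helper that only restates the table above; it is not part of Adjust's product or API.

```python
# Maps (marketing action, impact on All traffic, impact on Organic traffic)
# to the InSight result, exactly as listed in the table above.
INSIGHT_RESULT = {
    ("Campaign Start", "Lift", "Lift"): "Positive",
    ("Campaign Start", "Lift", "Loss"): "Mixed",
    ("Campaign Start", "Loss", "Lift"): "Mixed",
    ("Campaign Start", "Loss", "Loss"): "Negative",
    ("Budget Increase", "Lift", "Lift"): "Positive",
    ("Budget Increase", "Lift", "Loss"): "Mixed",
    ("Budget Increase", "Loss", "Lift"): "Mixed",
    ("Budget Increase", "Loss", "Loss"): "Mixed",
    ("Campaign Stop", "Lift", "Lift"): "Mixed",
    ("Campaign Stop", "Lift", "Loss"): "Mixed",
    ("Campaign Stop", "Loss", "Lift"): "Mixed",
    ("Campaign Stop", "Loss", "Loss"): "Positive",
    ("Budget Decrease", "Lift", "Lift"): "Negative",
    ("Budget Decrease", "Lift", "Loss"): "Mixed",
    ("Budget Decrease", "Loss", "Lift"): "Mixed",
    ("Budget Decrease", "Loss", "Loss"): "Positive",
}

print(INSIGHT_RESULT[("Campaign Start", "Lift", "Loss")])  # -> "Mixed"
```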
Incremental effect
- The top two charts show the daily average incremental impact, expressed both as a percentage and as an absolute value, for All installs and Organic installs.
- The graph displays a comparison of daily average Actual (solid line) and Predicted (dotted line) values.
For each test, InSight displays a summary of the Incremental effect your marketing action had on your target metric over the test period. This is summarized in the following metrics, which are also available in Datascape and in the Report Service API. See Datascape metrics for a complete list of Datascape metrics.
Metric | Definition | Formula | Metric API ID
---|---|---|---
Average revenue per event | Average revenue generated per selected event from users who installed your app within the time period you selected | Total revenue of event / number of times the event was triggered | average_revenue_per_event
Incremental revenue | Extra revenue generated when compared to a control group | (Actual incremental value - mean incremental value) * Average revenue per event | incremental_revenue
Incremental ROAS | Return on advertising spend (ROAS), calculated using only in-app revenue, for a selected cohort period | - | incremental_roas
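As a reading aid, the two formulas in the table can be written as plain functions. This is a minimal sketch of the formulas exactly as stated above, with hypothetical argument names and invented numbers; it is not how Adjust or the Report Service API computes these metrics.

```python
def average_revenue_per_event(total_event_revenue: float, event_count: int) -> float:
    """Total revenue of the event / number of times the event was triggered."""
    return total_event_revenue / event_count

def incremental_revenue(actual_incremental_value: float,
                        mean_incremental_value: float,
                        avg_revenue_per_event: float) -> float:
    """(Actual incremental value - mean incremental value) * average revenue per event."""
    return (actual_incremental_value - mean_incremental_value) * avg_revenue_per_event

# Illustrative numbers only.
arpe = average_revenue_per_event(total_event_revenue=5_000.0, event_count=2_000)  # 2.50 per event
extra = incremental_revenue(actual_incremental_value=300,
                            mean_incremental_value=220,
                            avg_revenue_per_event=arpe)
print(f"Incremental revenue: {extra:.2f}")
```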
Incremental effect over time
- All installs with the action impact
- Organic installs with the action impact
- Graph that shows pre-action and after-action results
- The after-action results highlight the prediction range and where the actual and predicted installs fell
- Choose ‘Only show after the action’ to shift the view and isolate the after-action data points
Advanced data
The Advanced data section shows you a breakdown of key metrics related to your InSight test. For both All and Organic traffic, select the plus icon to show the value of each metric Before and After your marketing action, so you can see exactly what impact the action had on your campaign.
If your action is a campaign, you will see campaign-level granularity with the following metrics, where they apply:
- Installs (attribution)
- Installs (SKAN)
- CPI (all)
- Total sessions
- Ad spend