View your InSight results
Once your test finishes, open the results page to see the measurable impact of your marketing action on the selected target metric. You can see:
- Results summary: A clear verdict on test performance
- Insights & recommendations: Guidance to help you interpret the results and decide next steps
- Metric breakdown: A detailed view of key performance metrics
How to interpret your results
Adjust provides actionable insights to help you understand what happened during your test and what to do next.
The results page breaks down performance for both All installs and Organic installs, giving you a complete view of incremental impact. Results are organized into sections that highlight:
- The incremental effect of your marketing action
- Changes in All vs. Organic installs
- Whether organic growth was driven or cannibalized
- Statistical significance of the results
- Practical implications for future campaigns
Example
You’ve completed an InSight test to measure the impact of a newly launched campaign. Based on the outcome, Adjust provides tailored recommendations to guide optimization.
If your campaign significantly increased All installs but also caused organic cannibalization, Adjust may recommend:
- Increasing campaign budget to amplify incremental gains
- Shortening attribution windows to reduce cannibalization
By combining clear results with concrete recommendations, Adjust enables you to make faster, more confident decisions and continuously refine your marketing strategy.
App bar
The app bar identifies key aspects of your incrementality test:
- Action: The marketing action you selected for the test
- Campaign: Name of the campaign selected for the test
- Country: Country for which the test was run
- Channel: Channel selected for the test
- After action: Time period during which incrementality was analyzed
- Metric: The metric you chose to measure
- Performance: How reliable the model is
- Significance: Whether the change falls within the prediction range (see the sketch after this list).
  - If the actual results fall within the prediction range, the result is insignificant.
  - If the actual results fall outside the prediction range, the result is significant.
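As a rough illustration, this significance check can be thought of as a simple range comparison. The snippet below is a minimal sketch, not Adjust's implementation; the function name and example values are hypothetical.

```python
# Minimal sketch (not Adjust's implementation): classify a result as significant
# or insignificant depending on whether the actual value falls inside the
# model's prediction range, as described above.

def classify_significance(actual: float, lower: float, upper: float) -> str:
    """Return 'insignificant' if the actual value lies inside the
    prediction range [lower, upper], otherwise 'significant'."""
    return "insignificant" if lower <= actual <= upper else "significant"

# Hypothetical example: predicted range of 900-1,100 daily installs
print(classify_significance(actual=1050, lower=900, upper=1100))  # insignificant
print(classify_significance(actual=1300, lower=900, upper=1100))  # significant
```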
Action response
- The Result of the test: Whether the marketing action had a Positive, Mixed, or Negative impact on your campaign.
- Details about your action and its impact.
- Recommendations for next steps you can take based on the test results.
Adjust calculates the Result of a test by measuring the impact your marketing action had on your target metric for All traffic and for Organic traffic. The table below shows how each result is calculated.
| Marketing action | Impact on metric for All traffic | Impact on metric for Organic traffic | InSight result |
|---|---|---|---|
| Campaign Start | Lift | Lift | Positive |
| Campaign Start | Lift | Loss | Mixed |
| Campaign Start | Loss | Lift | Mixed |
| Campaign Start | Loss | Loss | Negative |
| Budget Increase | Lift | Lift | Positive |
| Budget Increase | Lift | Loss | Mixed |
| Budget Increase | Loss | Lift | Mixed |
| Budget Increase | Loss | Loss | Mixed |
| Campaign Stop | Lift | Lift | Mixed |
| Campaign Stop | Lift | Loss | Mixed |
| Campaign Stop | Loss | Lift | Mixed |
| Campaign Stop | Loss | Loss | Positive |
| Budget Decrease | Lift | Lift | Negative |
| Budget Decrease | Lift | Loss | Mixed |
| Budget Decrease | Loss | Lift | Mixed |
| Budget Decrease | Loss | Loss | Positive |
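Because the table above is a fixed mapping, it can be expressed as a simple lookup. The following sketch is purely illustrative and encodes the rows exactly as listed; the function and variable names are not part of any Adjust API.

```python
# Illustrative sketch only: encode the result table above as a lookup keyed by
# (marketing action, impact on All traffic, impact on Organic traffic).

RESULTS = {
    ("Campaign Start",  "Lift", "Lift"): "Positive",
    ("Campaign Start",  "Lift", "Loss"): "Mixed",
    ("Campaign Start",  "Loss", "Lift"): "Mixed",
    ("Campaign Start",  "Loss", "Loss"): "Negative",
    ("Budget Increase", "Lift", "Lift"): "Positive",
    ("Budget Increase", "Lift", "Loss"): "Mixed",
    ("Budget Increase", "Loss", "Lift"): "Mixed",
    ("Budget Increase", "Loss", "Loss"): "Mixed",
    ("Campaign Stop",   "Lift", "Lift"): "Mixed",
    ("Campaign Stop",   "Lift", "Loss"): "Mixed",
    ("Campaign Stop",   "Loss", "Lift"): "Mixed",
    ("Campaign Stop",   "Loss", "Loss"): "Positive",
    ("Budget Decrease", "Lift", "Lift"): "Negative",
    ("Budget Decrease", "Lift", "Loss"): "Mixed",
    ("Budget Decrease", "Loss", "Lift"): "Mixed",
    ("Budget Decrease", "Loss", "Loss"): "Positive",
}

def insight_result(action: str, all_impact: str, organic_impact: str) -> str:
    """Look up the InSight result for a given action and traffic impacts."""
    return RESULTS[(action, all_impact, organic_impact)]

print(insight_result("Campaign Start", "Lift", "Loss"))  # Mixed
```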
Results summary
For each test, InSight displays a summary of the Incremental effect your marketing action had on your target metric over the test period. The top two charts show the daily average incremental impact, expressed as both a percentage and an absolute value, for All installs and Organic installs. The graphs display a comparison of daily average Actual (solid line) and Predicted (dotted line) values.
Here are the specific metrics available for InSight reporting:
| Metric | Definition | Formula | Metric API ID |
|---|---|---|---|
| Average revenue per event | Average revenue generated by your selected event from users who installed your app within the selected time period | Total revenue of event / number of times the event was triggered | average_revenue_per_event |
| Incremental revenue | Extra revenue generated by your marketing action when compared to a control group | (Actual incremental value - mean incremental value) * Average revenue per event | incremental_revenue |
| Incremental ROAS | Return on advertising spend (ROAS), calculated using only in-app revenue, for a selected cohort period | - | incremental_roas |
See Datascape metrics for a complete list.
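As a rough illustration of the formulas in the table above, here is a minimal sketch using hypothetical numbers; the variable names are illustrative and not InSight API identifiers.

```python
# Minimal sketch of the metric formulas listed above, using hypothetical numbers.

total_event_revenue = 5_000.0   # total revenue of the selected event
event_count = 2_500             # number of times the event was triggered

# Average revenue per event = total revenue of event / number of times the event was triggered
average_revenue_per_event = total_event_revenue / event_count  # 2.0

# Incremental revenue = (actual incremental value - mean incremental value) * average revenue per event
actual_incremental_value = 1_200.0
mean_incremental_value = 1_000.0
incremental_revenue = (actual_incremental_value - mean_incremental_value) * average_revenue_per_event  # 400.0

print(average_revenue_per_event, incremental_revenue)
```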
Incremental effect over time
- All installs with the action impact
- Organic installs with the action impact
- A graph that shows pre- and post-action results
- The after-action results highlight the prediction range and show where the actual and predicted installs fell
- Select ‘Only show after the action’ to shift the view and isolate the after-action data points (see the sketch below)
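For illustration only, the following sketch recreates the general shape of this view with synthetic data: a solid actual line, a dotted predicted line, a shaded prediction range, and a toggle that mimics ‘Only show after the action’. It is not based on real InSight data.

```python
# Illustrative sketch only: recreate the style of the incremental-effect chart
# (actual vs. predicted installs with a prediction range) using synthetic data.
import numpy as np
import matplotlib.pyplot as plt

days = np.arange(60)
action_day = 30
predicted = 1000 + 5 * days                           # dotted prediction line
actual = predicted + np.where(days >= action_day, 150, 0) + np.random.normal(0, 20, days.size)
lower, upper = predicted - 80, predicted + 80         # hypothetical prediction range

only_show_after_action = False                        # mimics the view toggle
mask = days >= action_day if only_show_after_action else np.ones_like(days, dtype=bool)

plt.plot(days[mask], actual[mask], label="Actual")
plt.plot(days[mask], predicted[mask], linestyle="--", label="Predicted")
plt.fill_between(days[mask], lower[mask], upper[mask], alpha=0.2, label="Prediction range")
plt.axvline(action_day, color="grey", linestyle=":", label="Marketing action")
plt.xlabel("Day")
plt.ylabel("Installs")
plt.legend()
plt.show()
```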
Advanced data
The Advanced data section shows you a breakdown of key metrics related to your InSight test. For both All and Organic traffic, select the plus icon to show the value of each metric Before and After your selected marketing action, so you can see exactly what impact the action had on your campaign.
If your action is a campaign, you will see campaign-level granularity with the following metrics, as they apply (see the sketch after this list):
- Installs (attribution)
- Installs (SKAN)
- CPI (all)
- Total sessions
- Ad spend
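For illustration, the sketch below mimics the Before/After breakdown with hypothetical numbers; the values, and the assumption that CPI (all) equals ad spend divided by attributed installs, are examples only, not Adjust definitions.

```python
# Hypothetical sketch of the Before/After breakdown shown in the Advanced data
# section. All numbers are illustrative only.

advanced_data = {
    "Installs (attribution)": {"before": 12_000, "after": 15_500},
    "Installs (SKAN)":        {"before": 4_000,  "after": 5_200},
    "Total sessions":         {"before": 48_000, "after": 61_000},
    "Ad spend":               {"before": 20_000, "after": 27_000},
}

for metric, values in advanced_data.items():
    change = values["after"] - values["before"]
    pct = change / values["before"] * 100
    print(f"{metric}: {values['before']} -> {values['after']} ({pct:+.1f}%)")

# Assumed for illustration: CPI (all) = ad spend / attributed installs, before and after
cpi_before = advanced_data["Ad spend"]["before"] / advanced_data["Installs (attribution)"]["before"]
cpi_after = advanced_data["Ad spend"]["after"] / advanced_data["Installs (attribution)"]["after"]
print(f"CPI (all): {cpi_before:.2f} -> {cpi_after:.2f}")
```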
Glossary of terms
The following terms are used on the results page to describe test results.
- Incremental lift
- Your campaign had a positive effect on your target metric that wouldn’t have occurred if the campaign hadn’t run.
- Loss
- Your campaign had a negative effect on your target metric that wouldn’t have occurred if the campaign hadn’t run.
- Organic cannibalization
- Events/installs that would have occurred without the marketing action taking place. Organic cannibalization data is available only when measuring Campaign start or Budget increase.
- Incremental effect absolute value
- The effect your action had on the metric value over the entire testing period. This is measured by comparing the average daily difference between the actual value with the action and the average predicted metric value without the action. The absolute value is calculated using the following formula: average actual value - average predicted value.
- Incremental effect percentage
- The effect your action had on your target metric as a percentage. This is measured by comparing the average daily difference between the predicted metric value without the action and actual value with the action.
- MAPE (Model performance)
- The mean absolute percentage error (MAPE) measures the average difference between the actual values of your target metric during the pre-period and the predicted values produced by the model. A lower percentage difference between these values indicates a more accurate model. A value below 10% is high, 10% to 20% is medium, and above 20% is low. See the sketch after this glossary for an example calculation.
- P-value (Significance)
- The P-value (Probability value) represents the probability that your results are due to chance. A lower P-value indicates a higher probability that your marketing action directly impacted your target metric, while a higher P-value indicates that there is a higher probability that your results were due to other changes. Values of less than 0.05 indicate your measured action had a significant impact.
- P-value 0.01
- A P-value of 0.01 indicates a 1% probability that the observed effect could occur due to chance. This is considered strong evidence that your marketing action had a significant impact.
- P-value 0.2
- A P-value of 0.2 indicates a 20% probability that the observed effect could be due to random fluctuations rather than your marketing action. It suggests your marketing action might have had an impact, but the evidence isn’t strong enough to be sure without further data or analysis.
- P-value 0.5
- A P-value of 0.5 indicates a 50% probability that the observed effects could be due to chance, suggesting no evidence that your marketing action had any significant impact on your target metric.
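To make the glossary calculations above concrete, here is a minimal sketch with hypothetical daily values. It assumes the incremental effect percentage is the absolute effect relative to the average predicted value and uses a standard MAPE formula; neither is guaranteed to match Adjust's exact implementation.

```python
# Minimal sketch (not Adjust's code) of the glossary calculations above:
# incremental effect, MAPE model-performance grading, and the p-value
# significance threshold. Inputs are hypothetical daily values.

def incremental_effect(actual_daily, predicted_daily):
    """Absolute value: average actual value - average predicted value.
    Percentage (assumed): the same difference relative to the average predicted value."""
    avg_actual = sum(actual_daily) / len(actual_daily)
    avg_predicted = sum(predicted_daily) / len(predicted_daily)
    absolute = avg_actual - avg_predicted
    percentage = absolute / avg_predicted * 100
    return absolute, percentage

def mape_grade(actual_pre, predicted_pre):
    """Standard MAPE over the pre-period, graded as described above."""
    mape = sum(abs(a - p) / a for a, p in zip(actual_pre, predicted_pre)) / len(actual_pre) * 100
    if mape < 10:
        return mape, "high"
    if mape <= 20:
        return mape, "medium"
    return mape, "low"

def is_significant(p_value, threshold=0.05):
    """P-values below 0.05 indicate the measured action had a significant impact."""
    return p_value < threshold

# Hypothetical examples
print(incremental_effect([1150, 1180, 1210], [1000, 1010, 1020]))
print(mape_grade([980, 1000, 1020], [1010, 995, 1040]))
print(is_significant(0.01), is_significant(0.2))
```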

