Are you correctly applying Inertia Compensation?
When using any type of software inertial compensation to adjust your data for the Hp effects of accelerating the driveline mass, the dampening time period applied to the Measured Torque channel and the RPM channel should be matched. Why? Well, inertia compensation formulas monitor the rate of change in RPM for each of the dyno and engine "flywheel" masses. Those acceleration rates are then multiplied by the Polar Moment of Inertia for the various components. That math returns the Inertial Torque values to add to (or subtract from) the Measured (strain gauge) Torque readings.
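That calculation is just T = I x alpha (torque equals polar moment of inertia times angular acceleration). A minimal sketch in Python, assuming SI units; the function name, sample data, and the 0.5 kg*m^2 inertia value are all hypothetical, not from any particular dyno package:

```python
import math

def inertial_torque(rpm, dt, polar_moment):
    """Torque (N*m) consumed accelerating a rotating mass.

    rpm          : sampled shaft speeds (rev/min)
    dt           : sample interval (s)
    polar_moment : polar moment of inertia I (kg*m^2)
    """
    rad_per_rpm = 2.0 * math.pi / 60.0          # convert rpm to rad/s
    torques = []
    for i in range(1, len(rpm)):
        # angular acceleration: alpha = d(omega)/dt
        alpha = (rpm[i] - rpm[i - 1]) * rad_per_rpm / dt
        torques.append(polar_moment * alpha)    # T_inertial = I * alpha
    return torques

# Example: a steady 500 rpm/s ramp on a hypothetical 0.5 kg*m^2 drum
rpm_trace = [3000 + 50 * i for i in range(11)]  # 50 rpm per 0.1 s sample
t_inertial = inertial_torque(rpm_trace, dt=0.1, polar_moment=0.5)
# Each sample works out to about 26.2 N*m of inertial torque, which the
# software would add to (or subtract from) the strain gauge reading.
```

In a real system this is computed per component (engine flywheel, dyno drum, rollers), each with its own polar moment, then summed.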
However, when dampening is applied to that RPM data, it flattens out rapid RPM changes. Since the magnitude of inertial compensation is tied to the RATE of RPM change, this spreads the total Inertial Torque compensation over a slightly wider (in time) segment of run data. Although the total amount of Hp added (or subtracted) remains the same, the Measured (strain gauge) Torque channel needs to be dampened identically. If it is not, our dampened (milder but lengthened) RPM-based inertia correction is superimposed onto an undampened (peaky and narrow) Measured Torque spike. So, instead of correctly balancing one another out, they only partially cancel each other (at their apex) and actually induce two smaller spikes!
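A toy simulation makes the mismatch visible. All numbers here are invented, and the boxcar `moving_average` merely stands in for whatever dampening filter the software applies; the point is only that dampening RPM alone leaves residual spikes, while dampening both channels the same way cancels cleanly:

```python
def moving_average(x, n):
    """Simple boxcar 'dampening' filter of width n samples."""
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - n // 2), min(len(x), i + n // 2 + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def correction(rpm_trace):
    """Toy RPM-based inertial correction: proportional to d(rpm)/dt."""
    c = [0.0] * len(rpm_trace)
    for i in range(1, len(rpm_trace)):
        c[i] = (rpm_trace[i] - rpm_trace[i - 1]) * 0.5
    return c

# Synthetic run: a one-sample RPM blip and the matching inertial torque
# spike it would put on the strain gauge (made-up magnitudes).
rpm = [3000.0] * 10 + [3060.0] + [3000.0] * 10
measured = correction(rpm)   # the gauge sees the same inertial spike

# Mismatched: dampen only RPM -- the spread-out correction no longer
# lines up with the narrow torque spike, leaving residual spikes.
bad = [m - c for m, c in zip(measured, correction(moving_average(rpm, 5)))]

# Matched: dampen both channels identically -- residual is essentially zero.
good = [m - c for m, c in zip(moving_average(measured, 5),
                              correction(moving_average(rpm, 5)))]

print(max(abs(v) for v in bad), max(abs(v) for v in good))
```

Running this, the mismatched case leaves the un-cancelled apex plus the two smaller induced spikes described above, while the matched case cancels to zero.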
Since most testing is done at fairly slow acceleration rates, this error will probably be small even when the dampening is not matched. Unless you have high inertia components in your driveline (e.g. chassis dyno rollers), you may never have noticed it. But now you at least know how to avoid it completely!