Sensitivity is defined as the slope of the straight line through two defined points in the measuring range; which points are used depends on the sensor in question.
Sensitivity error is the relative deviation (in %) of the actual slope of the line through these points from the ideal, zero-deviation slope.
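The two definitions above can be sketched in a few lines of code. This is an illustrative example only; the calibration points and the ideal slope of 4.0 mV/bar are assumed values, not taken from any particular sensor datasheet.

```python
def sensitivity(p1, p2):
    """Slope of the straight line through two (input, output) points."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

def sensitivity_error(actual_slope, ideal_slope):
    """Relative deviation (%) of the actual slope from the ideal slope."""
    return 100.0 * (actual_slope - ideal_slope) / ideal_slope

# Hypothetical pressure sensor with an ideal sensitivity of 4.0 mV/bar,
# measured at two defined points in its measuring range:
s = sensitivity((0.0, 0.0), (10.0, 41.0))  # slope = 4.1 mV/bar
err = sensitivity_error(s, 4.0)            # deviation = +2.5 %
```

A positive sensitivity error means the sensor responds more strongly than its ideal characteristic; a negative value means it responds more weakly.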