I just responded to Bill on this before seeing your post.
RMS is a measure of error from the predicted data, which in this case is the error with respect to straight-line tracking, I would have thought? So isn't the RMS primarily determined by the variation ("error") caused by seeing? But maybe I'm misunderstanding how PHD2 calculates this value.
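Just to illustrate what I mean (this is only my understanding of the statistic, not necessarily how PHD2 itself computes it, and the numbers are made up): RMS over the per-frame DEC errors, measured relative to perfect straight-line tracking (i.e. zero error), would be something like:

```python
import math

# Hypothetical per-frame DEC errors in arc-sec, relative to ideal
# straight-line tracking (so the "predicted" value is simply zero).
dec_errors = [0.4, -0.3, 0.6, -0.5, 0.2, -0.4]

# RMS = square root of the mean of the squared errors.
rms = math.sqrt(sum(e * e for e in dec_errors) / len(dec_errors))
print(round(rms, 2))
```

With seeing-dominated errors like these, the RMS mostly tracks the typical seeing-induced scatter rather than any one excursion.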
Anyway, the max DEC error I see with guiding turned off can vary greatly. I just uploaded a screen grab from last night's GA calibration run. The reported peak DEC error was 3.71 arc-s or 1.25 px. I think using that as the DEC MinMo would be overly pessimistic.
The problem with using the peak value is that it is susceptible to random worst-case events: a sudden huge spike in seeing, a piece of dirt in the DEC worm, a moth landing on the camera :). It doesn't necessarily give a good idea of what the average seeing or tracking will be.
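To show why I think the peak is overly pessimistic, here's a toy sketch (entirely made-up numbers, not real guide data): a single 3.7" excursion completely dominates the peak value, while the RMS over the whole run barely moves:

```python
import math

def rms(xs):
    """Root-mean-square of a list of error samples."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# 100 samples of ordinary ~0.4" seeing-driven DEC error...
steady = [0.4, -0.4] * 50
# ...plus one 3.7" worst-case spike (seeing burst, worm dirt, moth).
with_spike = steady + [3.7]

print(round(max(abs(x) for x in steady), 2))       # peak without spike
print(round(rms(steady), 2))                       # RMS without spike
print(round(max(abs(x) for x in with_spike), 2))   # peak jumps to the spike
print(round(rms(with_spike), 2))                   # RMS changes only a little
```

So a MinMo based on the peak would be sized for the one-off event, while one based on RMS (or some multiple of it) would reflect the typical frame-to-frame error.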