The rules for determining the test pressure for a hydrostatic test include a temperature correction factor, ensuring that a proper level of stress is imposed on the cold piping during the test. The rules for a pneumatic test, by contrast, simply dictate that the test pressure shall be 110% of the design pressure. For "hot design" piping, this rule would yield a non-conservative strength test due to the lack of a temperature correction factor. Does anyone have any insight as to why the rules are stated this way?
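To make the discrepancy concrete, here is a rough sketch of the two calculations. This assumes the commonly cited B31.3-style formulas (hydro test = 1.5 x design pressure scaled by the ratio of allowable stress at test temperature to allowable stress at design temperature; pneumatic test = 1.1 x design pressure); all numeric values below are hypothetical, just to show how a hot design case diverges:

```python
def hydro_test_pressure(design_p, s_test, s_design):
    """Hydrostatic test pressure: 1.5 x design pressure, scaled by the
    ratio of allowable stress at the (ambient) test temperature to the
    allowable stress at the design temperature."""
    return 1.5 * design_p * (s_test / s_design)

def pneumatic_test_pressure(design_p):
    """Pneumatic test pressure: 110% of design pressure, with no
    temperature correction at all."""
    return 1.10 * design_p

# Hypothetical "hot design" case: the allowable stress at ambient is
# much higher than at the elevated design temperature, so the hydro
# test pressure is scaled up accordingly.
P = 100.0       # design pressure, psig (hypothetical)
S_amb = 20000   # allowable stress at test temperature, psi (hypothetical)
S_hot = 12500   # allowable stress at design temperature, psi (hypothetical)

print(hydro_test_pressure(P, S_amb, S_hot))  # well above 1.5 x P
print(pneumatic_test_pressure(P))            # fixed 1.1 x P, regardless of temperature
```

In this illustrative case the hydro test lands around 240 psig while the pneumatic test stays at 110 psig, which is the non-conservatism the question is pointing at: the hotter the design condition, the larger the gap.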