Definition: time required for a detector output to fall from a stated high percentage to a stated lower percentage of the maximum value when a steady input is instantaneously removed
NOTE – It is usual to consider a high percentage of 90% and a low percentage of 10%.
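As an illustrative sketch of the definition above (the helper name and its interface are our own, not part of the vocabulary entry), the 90%–10% fall time can be estimated from a sampled, monotonically decaying detector output by locating the two threshold crossings and interpolating between samples:

```python
import numpy as np

def fall_time(t, y, high=0.9, low=0.1):
    """Estimate the time for y to fall from high*max to low*max.

    Assumes y starts near its maximum and decays monotonically once
    the steady input is removed. Illustrative helper, not from any
    standard or library.
    """
    peak = y.max()

    def crossing(level):
        # first sample index where the output has fallen below `level`
        idx = np.argmax(y < level)
        # linear interpolation between the two bracketing samples
        t0, t1 = t[idx - 1], t[idx]
        y0, y1 = y[idx - 1], y[idx]
        return t0 + (level - y0) * (t1 - t0) / (y1 - y0)

    return crossing(low * peak) - crossing(high * peak)

# Example: exponential decay y = exp(-t/tau). Analytically the
# 90%-to-10% fall time is tau * ln(9) ≈ 2.197 * tau.
tau = 1.0
t = np.linspace(0, 10, 10001)
y = np.exp(-t / tau)
print(round(fall_time(t, y), 3))  # → 2.197
```

For an exponential decay the result is independent of the peak amplitude, which is why the definition is stated in percentages of the maximum rather than absolute levels.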