Help with fundamental RF problem

Joe (Guest)
Hello, I'm having trouble understanding something very fundamental about
RF measurements.

This I know:

Rollett stability factor (K):

determines whether a transistor can oscillate under the impedance
conditions present at its input and output ports. (i.e. if K > 1 and
|delta| = |(S11)(S22) - (S12)(S21)| < 1, the transistor/amplifier is
unconditionally stable, so no oscillation can occur for any passive
input/output termination)
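
As a concrete illustration, here is a minimal Python sketch of that
stability check, computing K and |delta| from the 2-port S-parameters at
one frequency (the numeric S-parameter values are made up purely for
illustration):

```python
def rollett_k(s11, s12, s21, s22):
    """Return (K, |delta|) for a 2-port from its S-parameters.

    K     = (1 - |S11|^2 - |S22|^2 + |delta|^2) / (2*|S12*S21|)
    delta = S11*S22 - S12*S21
    """
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11)**2 - abs(s22)**2 + abs(delta)**2) / (2 * abs(s12 * s21))
    return k, abs(delta)

# Made-up S-parameters at a single frequency, just to show the check:
k, d = rollett_k(0.38 - 0.43j, 0.05 + 0.02j, 4.20 + 1.10j, 0.45 - 0.30j)
print(f"K = {k:.3f}, |delta| = {d:.3f}")
print("unconditionally stable" if (k > 1 and d < 1) else "potentially unstable")
```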


Fmax:

- is defined as the maximum oscillation frequency of a transistor

- is the frequency at which the power gain of the transistor = 1
(i.e. input power = output power). It is found by extrapolating the
Maximum Available Gain (MAG) of the transistor, measured in the region
where K > 1, along a -20 dB/decade slope down to where MAG = 1 (0 dB);
the frequency at that point is Fmax (see the sketch after this list).
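
Here is a similar minimal sketch of the extrapolation itself, assuming
MAG (in dB) is known at a single measurement frequency inside the K > 1
region and rolls off at the ideal -20 dB/decade; the 18 dB at 10 GHz
figures are invented for illustration:

```python
import math

def mag_db(s11, s12, s21, s22):
    """Maximum Available Gain in dB; only defined where K > 1."""
    delta = s11 * s22 - s12 * s21
    k = (1 - abs(s11)**2 - abs(s22)**2 + abs(delta)**2) / (2 * abs(s12 * s21))
    if k <= 1:
        raise ValueError("MAG is undefined for K <= 1 (MSG is used there)")
    mag = abs(s21 / s12) * (k - math.sqrt(k**2 - 1))  # standard MAG formula
    return 10 * math.log10(mag)  # MAG is a power gain, hence 10*log10

def fmax_from_mag(f_meas_hz, mag_db_at_f):
    """Extrapolate MAG along a -20 dB/decade line down to 0 dB (gain = 1).

    Solves mag_db_at_f - 20*log10(fmax / f_meas) = 0 for fmax.
    """
    return f_meas_hz * 10 ** (mag_db_at_f / 20.0)

# e.g. MAG = 18 dB measured at 10 GHz (illustrative numbers):
print(fmax_from_mag(10e9, 18.0) / 1e9, "GHz")  # about 79.4 GHz
```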

This is where I'm confused.

If one measures MAG in a region where K > 1, then by definition the
transistor is NOT oscillating. So how can we determine the MAXIMUM
oscillation frequency from measurements taken in a region where the
transistor is not oscillating in the first place?

Also, why is the maximum oscillation frequency (Fmax) defined as the
frequency at which the power gain = 1 (input power = output power)?

What is the correlation?

Thanks for the help
 
