I am trying to work out the likely phase lag for a lowpass filter
whose job is to take out the steps/notches from a DAC output.
The DAC is 12 bit and will be generating a 400Hz sinewave. It will be
driven at around 250x the signal frequency i.e. 100kHz. So the lowest
frequency component to filter out will be around 100kHz (the update-rate
images).
I am not good at maths but from some digging around, it looks like a
simple RC has a 45 deg phase shift at the corner frequency 1/(2*pi*RC),
and roughly 0.6 deg (arctan(1/100)) when a factor of 100 below that,
e.g. at 400Hz with a 40kHz corner.
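That first-order figure is easy to check numerically. A quick sketch (the 40kHz corner is just the example value from above):

```python
import math

def rc_phase_deg(f, fc):
    """Phase lag in degrees of a first-order RC lowpass with corner fc."""
    return math.degrees(math.atan(f / fc))

fc = 40e3                        # example corner frequency, 40 kHz
print(rc_phase_deg(fc, fc))      # 45 degrees exactly at the corner
print(rc_phase_deg(400, fc))     # ~0.57 degrees at 400 Hz
```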
I am happy with 1-2 degrees of lag, but more importantly it needs to
be fairly constant from 400Hz to 500Hz, or at least quantifiable,
because the table feeding the DAC can be shifted to compensate.
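Converting a known lag into a table shift is simple arithmetic. A sketch, assuming one cycle spans 250 table entries (the 250x figure above); the 0.57 deg input is just the first-order example lag for a 40kHz corner:

```python
def phase_to_table_shift(phase_deg, table_len=250):
    """Number of table entries to advance the wavetable
    to cancel a given phase lag (fractional result)."""
    return phase_deg / 360.0 * table_len

print(phase_to_table_shift(0.57))   # ~0.4 of an entry
```

A fractional shift would need interpolation or a finer table, but the point is the correction is small and quantifiable.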
What about a 2nd order filter? The filter performance should be better
for a given phase lag, no?
Obviously perfection is impossible to achieve but I think a 10kHz
rolloff frequency would produce a really clean result. The question is
what phase lag can be achieved at that rolloff.
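For a 10kHz corner the two cases can be compared directly. A sketch using the standard first-order response and a 2nd-order Butterworth (Q = 0.707) as the assumed 2nd-order filter:

```python
import math

FC = 10e3   # assumed corner frequency, 10 kHz

def first_order(f):
    """Phase lag (deg) and gain (dB) of a first-order lowpass."""
    x = f / FC
    return math.degrees(math.atan(x)), -10 * math.log10(1 + x**2)

def butterworth2(f):
    """Phase lag (deg) and gain (dB) of a 2nd-order Butterworth lowpass,
    H(s) = 1 / (s^2 + sqrt(2)*s + 1) with s normalised to FC."""
    x = f / FC
    phase = math.degrees(math.atan2(math.sqrt(2) * x, 1 - x**2))
    return phase, -10 * math.log10(1 + x**4)

for f in (400, 500, 100e3):
    print(f, first_order(f), butterworth2(f))
```

The numbers say: at the same 10kHz corner the 2nd order actually lags slightly more at 400Hz (~3.2 deg vs ~2.3 deg), but it gives roughly 20dB more attenuation at 100kHz (~40dB vs ~20dB). So for a given stopband attenuation you can push the 2nd-order corner well above 10kHz and end up with less lag, which supports the intuition above. Either way, well below the corner the lag grows almost linearly with frequency, so the 400Hz-500Hz variation stays small and quantifiable.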
There is a huge amount of stuff online but a lot of it is audio
stuff, which is full of BS.