
The following MATLAB script (taken from the MATLAB help for fft) runs perfectly fine:

Fs = 1000;                    % Sampling frequency
T = 1/Fs;                     % Sample time
L = 1000;                     % Length of signal
t = (0:L-1)*T;                % Time vector
% Sum of a 50 Hz sinusoid and a 120 Hz sinusoid
x = 0.7*sin(2*pi*50*t) + sin(2*pi*120*t); 
y = x + 2*randn(size(t));     % Sinusoids plus noise
plot(Fs*t(1:50),y(1:50))
title('Signal Corrupted with Zero-Mean Random Noise')
xlabel('time (milliseconds)')

[plot: Signal Corrupted with Zero-Mean Random Noise]

But I am unable to understand why we need Fs*t in plot(). Why am I making it dimensionless?

  • If this is strange, then what did you expect? (And what happens if you try to plot that?) Commented Feb 8, 2013 at 10:15
  • It is consistent with the way you defined x and y. Commented Feb 8, 2013 at 10:22

1 Answer


Your vector t holds the time of each sample in seconds, i.e. t(10) is the instant at which the 10th sample is taken.

Since Fs = 1000 Hz, the sample period is T = 1 ms, so Fs*t = t/T is the (dimensionless) sample index, and because one sample elapses per millisecond it is numerically equal to the time in milliseconds.

If you don't scale, you plot the signal against t in seconds, and then the label "time (milliseconds)" is not correct.
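
For comparison, here is a minimal sketch (reusing the variables from the question) that plots the same samples against Fs*t and against an explicit seconds-to-milliseconds conversion, t*1000; the two axes coincide here precisely because Fs = 1000 Hz:

Fs = 1000; T = 1/Fs; L = 1000;
t = (0:L-1)*T;                         % time in seconds
x = 0.7*sin(2*pi*50*t) + sin(2*pi*120*t);
y = x + 2*randn(size(t));              % sinusoids plus noise

subplot(2,1,1)
plot(Fs*t(1:50),y(1:50))               % x-axis runs 0..49 (sample index = ms here)
xlabel('time (milliseconds)')

subplot(2,1,2)
plot(t(1:50)*1000,y(1:50))             % explicit seconds-to-milliseconds conversion
xlabel('time (milliseconds)')

If Fs were, say, 8000 Hz instead, Fs*t would still be the sample index, but it would no longer equal the time in milliseconds, and the x-axis label would have to change.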
