
Problem with Butterworth Filter

Hi! Hope you are doing well. I am a beginner in LabVIEW; this might have a simple solution, but I am struggling.

 

Background: We have a data acquisition system for our 3D-printed sensors built around an NI BNC-2090A in our lab. That system is coded in MATLAB (screenshot of the code attached). I am planning to replicate the same system with an NI USB-6212 and LabVIEW (attached file 2, the LabVIEW file), because my other instruments are controlled via LabVIEW, and working in LabVIEW lets me keep everything on one platform. I have already developed a VI (attached file 2) for multi-channel data acquisition. My signals background is weak, so I am having issues with a few points.

In summary, I want to replicate in my LabVIEW code the exact filters that are used in the MATLAB file.

 

1. I am writing the frequency I want to use as the data collection rate to the AIConv.Rate property. Is this the correct approach?

2. What is the sampling frequency input on the Butterworth Filter VI? Is it the same as the data collection rate I want to set?

3. How do I find the low cutoff frequency and high cutoff frequency for the Butterworth filter? I know that the normalized cutoff frequency (normalized cutoff = (2 × cutoff) / sampling frequency) should be 0.5 and the order is 2. Is there a VI in LabVIEW that takes the normalized cutoff frequency as an input? If not, do you have any recommendation on how to set the low and high cutoff frequencies in LabVIEW?
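(For anyone comparing the two environments: MATLAB's butter() takes a normalized cutoff, while LabVIEW's Butterworth Filter VI takes an absolute cutoff in Hz plus the sampling frequency, so the conversion is simple arithmetic. A minimal sketch, in plain Python for illustration; the function name is mine:)

```python
# MATLAB's butter() takes a normalized cutoff Wn = 2*fc/fs, where Wn = 1
# corresponds to the Nyquist frequency (fs/2). LabVIEW's Butterworth
# Filter VI instead takes the cutoff in Hz and the sampling frequency,
# so convert the normalized value back to Hz:
def normalized_to_hz(wn, fs):
    """wn: MATLAB-style normalized cutoff (0..1, Nyquist = 1);
    fs: sampling rate in Hz."""
    return wn * fs / 2.0

# Example: Wn = 0.5 at fs = 1000 Hz gives a 250 Hz cutoff
cutoff_hz = normalized_to_hz(0.5, 1000.0)
print(cutoff_hz)  # 250.0
```

So with a normalized cutoff of 0.5, the cutoff to wire into the VI is always a quarter of whatever sampling rate you use.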

4. I filtered a test signal of random values, and I see a delay between the raw data and the filtered data. How can I reduce this delay? (Picture attached.)

[Image: Jarir_0-1665609731346.png — raw vs. filtered signal showing the delay]

 

Thank you for your support. I appreciate your time. 

Message 1 of 3

Welcome to the Wonderful World of Digital Filtering.  Have you had a class in Signal Theory?  Do you know about "analog" filters, such as the tried-and-true RC Low Pass filter?

 

You ask about reducing the delay between the "raw signal" and the "filtered output".  Why is there a delay?  Because the filter cannot "predict the future" -- it is only looking at the signal now and in the past, and does not know how the signal will change in the future.  Have you heard the term "corner frequency" for a simple RC circuit?  How about "transfer function", which describes how the ratio of the output-to-input of a filter depends on the frequency of the signal coming into the filter?  [Ask an EE student about it, or one of your Engineering faculty].
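One common offline trick is worth mentioning: if the data is already recorded (not being filtered live), you can run the filter forward over the data and then again over the reversed result, which cancels the phase delay. This is what MATLAB's filtfilt does, and the same idea applies to recorded data in LabVIEW. Here is a toy illustration in plain Python (the one-pole smoother is a stand-in, not a Butterworth design):

```python
# Toy demo: a causal filter lags its input, but filtering the data
# forward and then backward cancels the phase delay (offline only).

def one_pole_lowpass(x, alpha):
    """Causal first-order low-pass: the output chases the input, so it lags."""
    y, prev = [], x[0]
    for s in x:
        prev = prev + alpha * (s - prev)
        y.append(prev)
    return y

def forward_backward(x, alpha):
    """Filter forward, then filter the reversed result and un-reverse it:
    the two passes' phase delays cancel, so the net delay is zero."""
    fwd = one_pole_lowpass(x, alpha)
    return one_pole_lowpass(fwd[::-1], alpha)[::-1]

# A step input: the causal output only starts rising AFTER the step,
# while the forward-backward output is centred on the step instead.
step = [0.0] * 20 + [1.0] * 20
causal = one_pole_lowpass(step, 0.2)
zero_phase = forward_backward(step, 0.2)
```

Note this cannot help a live display, where the lag is the unavoidable price of causality; there the only lever is the filter design itself (lower order, higher cutoff).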

 

Bob Schor

Message 2 of 3

Thank you for your suggestion. I will look into it.

Message 3 of 3