10-09-2020 07:41 AM
Dear Community,
I am currently trying to simulate a pre-recorded signal in LabVIEW. I have successfully plotted amplitude vs. data points on a chart from the Excel file. However, I am having trouble plotting at the speed I want in order to mimic real time. My goal is to plot 1000 data points per second from the Excel sheet, but I'm not sure how to adjust the timed loop to achieve that rate. Also, when I rerun the program, how do I make the x-axis of the chart start from zero again? Thank you in advance!
The files are attached below.
"Read_Signal3.vi" basically reads the data from "20.csv" and plots the points on a chart.
"20.csv" contains the recorded data points, with the first column as time and the second column as amplitude for the plot.
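For readers who don't open the attachments: a minimal sketch of parsing the described two-column layout (time in the first column, amplitude in the second), shown in Python since LabVIEW code is graphical. The sample values are made up for illustration; the actual "20.csv" contents are only in the attachment.

```python
import csv
import io

# Fabricated sample mimicking the described "20.csv" layout:
# column 1 = time, column 2 = amplitude (no header row assumed).
sample = "0.000,0.12\n0.001,0.15\n0.002,0.11\n"

times, amplitudes = [], []
for row in csv.reader(io.StringIO(sample)):
    times.append(float(row[0]))       # first column: time stamps
    amplitudes.append(float(row[1]))  # second column: amplitude values

print(times)       # [0.0, 0.001, 0.002]
print(amplitudes)  # [0.12, 0.15, 0.11]
```

In LabVIEW the equivalent would be Read Delimited Spreadsheet with a comma delimiter, then indexing out the two columns of the resulting 2D array.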
Solved! Go to Solution.
10-09-2020 08:29 AM
See if the attached can give you some ideas.
10-12-2020 11:19 PM
Thank you so much for the constructive feedback; I really appreciate your time!
My code indeed looked ugly... I really need to spend more time understanding the basics of dealing with arrays.
10-13-2020 08:59 AM
A small addition to Altenbach's #1: Windows is very bad at keeping timing at sub-3 ms periods. If you want to replay the signal in a realistic manner, it's probably better to plot 10 points at once in a 10 ms loop.
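The chunking idea above can be sketched outside LabVIEW too. A minimal Python sketch, assuming 1000 samples/s replayed as 10-sample batches every 10 ms; the `plot_chunk` callback stands in for appending to a chart, and the absolute-deadline scheduling keeps the average rate correct even when individual sleeps jitter:

```python
import time

SAMPLE_RATE = 1000             # samples per second to replay
CHUNK = 10                     # samples plotted per loop iteration
PERIOD = CHUNK / SAMPLE_RATE   # 10 ms loop period

def replay(samples, plot_chunk, sleep=time.sleep, now=time.monotonic):
    """Feed `samples` to `plot_chunk` in CHUNK-sized batches,
    one batch every PERIOD seconds (10 ms here)."""
    next_deadline = now()
    for i in range(0, len(samples), CHUNK):
        plot_chunk(samples[i:i + CHUNK])   # e.g. append 10 points to a chart
        next_deadline += PERIOD
        delay = next_deadline - now()
        if delay > 0:
            # Sleeping toward an absolute deadline avoids cumulative drift,
            # unlike sleeping a fixed PERIOD after the work finishes.
            sleep(delay)

# Usage: collect the chunks instead of plotting (sleep disabled for the demo).
chunks = []
replay(list(range(25)), chunks.append, sleep=lambda s: None)
print(len(chunks))   # 3 batches: 10 + 10 + 5 samples
```

In LabVIEW terms this corresponds to a While Loop with a 10 ms wait that appends a 10-element subarray to the chart each iteration, rather than one point per 1 ms iteration.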