12-06-2015 12:42 AM
I am working with an NI-2901 USRP device in LabVIEW Communications Suite 2015. The USRP is connected via USB to a 3.20 GHz quad-core PC with 8.00 GB of RAM. For a radar project, I am trying to generate a chirp signal with a bandwidth of about 10 MHz (give or take a few MHz). Obviously, a chirp spanning this frequency range requires a sampling rate greater than 10 MHz.
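For reference, the kind of waveform I mean can be sketched in a few lines of textual code (my actual implementation is a LabVIEW VI; this Python version, with made-up example numbers, is just to illustrate the linear chirp and the Nyquist constraint):

```python
import math

def linear_chirp(f0, f1, duration, fs):
    """Real-valued linear chirp sweeping from f0 to f1 Hz.

    For a real signal, fs must satisfy Nyquist: fs > 2 * f1
    (for a complex baseband chirp, fs > the sweep bandwidth).
    """
    n = int(duration * fs)
    k = (f1 - f0) / duration  # sweep rate in Hz per second
    return [math.cos(2 * math.pi * (f0 * t + 0.5 * k * t * t))
            for t in (i / fs for i in range(n))]

# Example: 0-10 MHz sweep over 100 us, sampled at 25 MS/s
samples = linear_chirp(0.0, 10e6, 100e-6, 25e6)
```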
My issue is the common underflow problem that occurs whenever I increase my sampling rate beyond 2 MHz. I have learned that the usual cure is to implement a producer-consumer model with queues. Attached is my Tx program, which generates the chirp programmatically and sends it to an enqueue loop; the USRP Tx process then pulls the data off in a dequeue loop. This still results in underflow errors.
What I believe may be happening is that my producer loop runs slower than the Tx consumer loop once I increase the sampling rate beyond 2 MHz. Is there any way to work around this limitation? Is my PC really the bottleneck (it is hardly a low-end eMachines-class box)? Is generating a chirp signal with this much bandwidth simply impossible with my current gear?
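In case it helps clarify the architecture I am describing: my LabVIEW code is graphical, but the producer-consumer pattern is the same as this minimal Python sketch (all names here are illustrative, not my actual VI). The bounded queue decouples the two loops; underflow corresponds to the consumer finding the queue empty when the hardware needs samples:

```python
import queue
import threading

def producer(q, blocks):
    # Generates (or streams) waveform blocks; q.put blocks when the
    # queue is full, giving natural back-pressure on the producer.
    for b in blocks:
        q.put(b)
    q.put(None)  # sentinel: no more data

def consumer(q, out):
    # Drains blocks at the radio's fixed rate; if the queue runs dry
    # on real hardware, that is exactly an underflow.
    while True:
        b = q.get()
        if b is None:
            break
        out.append(b)

q = queue.Queue(maxsize=8)              # bounded queue between the loops
blocks = [[i] * 4 for i in range(32)]   # stand-in for blocks of IQ samples
out = []
t = threading.Thread(target=producer, args=(q, blocks))
t.start()
consumer(q, out)
t.join()
```

The design point is that the producer should be able to stay ahead of the consumer on average; if chirp generation itself cannot keep up at the higher rate, pre-computing the waveform once and re-enqueueing it may be an option.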
12-07-2015 04:55 PM
Hi,
Thank you for posting your code. Can you also post the error code and a screenshot of the error?
Can you please explain why you are dequeueing elements in the TX VI? I believe you should only be enqueueing elements in this portion of your code.
It may be helpful to refer to this community example, even if you are not using OFDM. This shows the proper procedure for queueing/dequeueing elements using LabVIEW Communications and the USRP: https://decibel.ni.com/content/docs/DOC-34781
Thanks!