
How does LabVIEW Render the data in 2D Intensity Graphs?

Hello,

 

I was hoping someone could explain how LabVIEW renders its 2D intensity data. For instance, I have attached an image I get from LabVIEW's intensity graph. The array that went into the intensity graph is 90000x896, and the width/height of the image in pixels is 1056x636.

 

Something I know from zooming in on these images, as well as viewing the data in other programs (e.g. Matlab), is that LabVIEW is not simply decimating the data; it is doing something more intelligent. Some of our 90000 lines have great signal-to-noise, but a lot of them do not. If LabVIEW were simply decimating, our image would be primarily black, but instead there are very obvious features we can track.
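To make it concrete, here is roughly what I mean by "simply decimating" versus something smarter, sketched in Python/NumPy rather than G (the sizes and the row-to-pixel mapping are just assumptions for illustration):

import numpy as np

data = np.random.randn(90000, 896)   # stand-in for our acquired array
rows_out = 636                       # roughly the pixel height of the graph (assumption)

# Naive decimation: keep every k-th row and discard the rest.
k = data.shape[0] // rows_out
naive = data[::k][:rows_out]

# Max-style reduction: each output row is the max over a block of k input rows,
# so a single strong row survives even when its neighbours are noisy.
trimmed = data[:rows_out * k].reshape(rows_out, k, data.shape[1])
reduced = trimmed.max(axis=1)

With the naive version, most of our strong lines simply vanish; something like the second version is much closer to what the intensity graph appears to show.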

 

The reason I am asking is that we are trying to do a "Live Acquisition"-type program. I know that updating the intensity graph and forcing LabVIEW to choose how to render our data gives us a huge performance hit. We are already doing some processing of the data, and if we can be intelligent and help LabVIEW out so that it doesn't have to figure out how to render everything, while still getting the gorgeous images that LabVIEW generates, that would be great!

 

Any help would be appreciated! Thanks in advance!

0 Kudos
Message 1 of 9
(5,342 Views)

Hi ColeVV,

 

Unfortunately, the algorithms implemented for displaying data in LabVIEW indicators are proprietary and we cannot release them.  I can, however, say that the two easiest forms of decimation are simply discarding undisplayed data (unlikely, since you are seeing your image cleanly) and averaging across all points (maybe more likely if you are seeing the image through the noisy points - e.g. averaging out RMS noise).

 

You may try implementing your own decimation algorithm on your array such that your data points do not exceed the number of pixels in your indicator, avoiding the need for LabVIEW to decimate.  As for how you might implement that decimation, you may want to post in the Vision section of our forums for further assistance.
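As a very rough illustration of the idea (sketched here in Python/NumPy rather than G, with made-up sizes), you could reduce each block of source values to a single value before wiring the array to the graph:

import numpy as np

def block_reduce(data, out_rows, out_cols, mode="mean"):
    """Shrink a 2D array so it is no larger than out_rows x out_cols.

    Each output element summarizes one block of the input, either by
    averaging (smooths noise) or by taking the max (preserves peaks).
    """
    r = max(1, data.shape[0] // out_rows)
    c = max(1, data.shape[1] // out_cols)
    trimmed = data[: (data.shape[0] // r) * r, : (data.shape[1] // c) * c]
    blocks = trimmed.reshape(trimmed.shape[0] // r, r, trimmed.shape[1] // c, c)
    return blocks.mean(axis=(1, 3)) if mode == "mean" else blocks.max(axis=(1, 3))

small = block_reduce(np.random.randn(90000, 896), 636, 1056, mode="max")

Which reduction (mean, max, or something else) best matches what you expect to see is something you would have to experiment with.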

 

Regards,
Chris Elliott
x36772
0 Kudos
Message 2 of 9
(5,298 Views)

Hello Chris,

 

Thank you for responding to my message. Our company is trying to evaluate whether LabVIEW is the right software development solution for the initial versions of our software. I can understand that how LabVIEW renders the data is proprietary. It should be! You guys have done a great job with it.

 

However, relying on the LabVIEW renderer, compared to a few of the solutions I have tried to implement, means that I have to sacrifice a lot of performance in order to have an output that our customers can clearly interpret. For our real-time viewing software, this is a trade-off we can't make. I have tried decimation, averaging, and taking the column with the largest signal, and none of them come close enough to what we know we should be able to get.

 

It's possible that I am asking the wrong question, because all I really care about is great graphs and high performance. If there are other solutions that will allow me to speed up the application while keeping the output graphs the same, I would gladly use those! I don't need to know what the renderer is doing; I just haven't found any solutions that are good enough yet.

 

Any help you can give me on this would be appreciated. LabVIEW offers a lot of compelling reasons to use it as a software development solution but if we can't convince customers with great images and performance then those reasons won't be sufficient.

 

Thanks for your time,

 

Cole

0 Kudos
Message 3 of 9
(5,264 Views)

Hi Cole,

 

Thank you for your understanding.  I do have a few tips and tricks you may find helpful, though as I mentioned in my previous post, optimization for images or image-like data types (e.g. a 2D array of numbers) may best be discussed in the Machine Vision Forum.  That forum is monitored by our vision team (this one is not), who may have more input.

 

Here are some things to try:

 

  • Try adjusting the VI's priority (File»VI Properties»Category: Execution»Change Priority to "time critical priority")
  • Make sure Synchronous Display is disabled on your indicators (right-click indicator»Advanced»Synchronous Display)
  • Try some benchmarking to see where the most time is being taken in your code so you can focus on optimizing that 
  • Try putting an array constant into your graph and looping updates on your Front Panel as if you were viewing real-time data.  What is the performance of this?  Any better?

 

The first few tips there come from some larger sets of general performance improvement tips that we have online, which are located at the following links:

 

 

Beyond that, I'd need to take a look at your code to see if there's anything we can make changes to.  Are you able to post your VI here?

 

Regards,
Chris Elliott
x36772
0 Kudos
Message 4 of 9
(5,251 Views)

Well, the simple fact is that beauty requires effort, so in general, the prettier the desired outcome, the more computing needs to be done. Of course, if you do your own, you can more finely adjust the balance between beauty and speed as needed. 😉

 

Unless you have techniques to e.g. offload this task to the GPU (or maybe FPGA), I doubt you would get the same result as LabVIEW while spending less effort.

 

For regular (not intensity) graphs, you can get some ideas about what's involved by reading this article (e.g. about min/max decimation). You can also try to analyse the internal LabVIEW methods by taking greyscale screenshots of the same intensity data at various resolutions, then inspecting the pixels. While you might get a good idea about what they are doing, you might be orders of magnitude slower trying to do it in your own code. Just guessing. 😄

Note that the Z axis of intensity graphs is only 8 bits (a 256-element color ramp).
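Since everything ends up going through that 256-entry ramp anyway, pre-scaling your data to 0..255 yourself costs nothing in displayed detail. A rough sketch of such a mapping (Python here for brevity; LabVIEW's own mapping may differ in details such as rounding and out-of-range handling):

import numpy as np

def to_8bit(data, lo=None, hi=None):
    """Linearly map data into 0..255, the resolution the color ramp can actually show."""
    lo = np.min(data) if lo is None else lo
    hi = np.max(data) if hi is None else hi
    if hi <= lo:
        return np.zeros(data.shape, dtype=np.uint8)
    scaled = (data - lo) / (hi - lo)
    return np.clip(np.round(scaled * 255), 0, 255).astype(np.uint8)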

 

Do you have a VI containing a typical dataset as shown in your picture above? What are your speed requirements?

0 Kudos
Message 5 of 9
(5,250 Views)

I seem to recall that the intensity graph simply uses the max value in a 2D region.  Easy to verify: simply create a large 2D array of all zeros except for one row/column set to 0.1 or -0.1.  The 0.1 will always show up; the -0.1 won't.
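Something along these lines (in Python for brevity; the shapes and block sizes are arbitrary):

import numpy as np

test = np.zeros((10000, 500))
test[5000, :] = 0.1     # one positive row
test[:, 250] = -0.1     # one negative column

# Reduce 100x5 blocks with max: the 0.1 row survives in the output,
# while the -0.1 column is swamped by the surrounding zeros.
reduced = test.reshape(100, 100, 100, 5).max(axis=(1, 3))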

0 Kudos
Message 6 of 9
(5,220 Views)

Hi Chris,

 

Thanks for the extra info. I'll give those a try, and if I can put together a minimal working example that highlights my problem, I'll check in with the Machine Vision Forum.

0 Kudos
Message 7 of 9
(5,178 Views)

Hi altenbach,

 

That article was useful. I certainly did spend some time trying to find an unoptimized algorithm that would allow me to match the LabVIEW intensity graph, but it's only worth spending so much time on this one small portion of my code. That particular image was taken at 230 kHz, but I don't expect any live viewing application to keep up with that. The combination of high speed and large data sets is certainly what makes this a challenge. My idea to increase usability was to reduce the amount of data that the intensity plot has to deal with, which would make the program much more responsive.

0 Kudos
Message 8 of 9
(5,174 Views)

Hi Cole,

 

Your mention of image rate made me think - you may want to try not only decimating pixels, but also frames.  Your monitor likely doesn't have a refresh rate above 60 FPS, and most people are happy with a framerate of 30 FPS.  Perhaps a decimation in the time domain could be helpful as well.
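For instance, one simple pattern is to throttle how often the graph terminal actually gets new data (sketched here in Python; in LabVIEW this would just be a Case structure around the graph terminal driven by an elapsed-time check):

import time

class FrameThrottle:
    """Let frames through at most `fps` times per second; drop the rest."""
    def __init__(self, fps=30):
        self.interval = 1.0 / fps
        self.last = 0.0

    def ready(self):
        now = time.monotonic()
        if now - self.last >= self.interval:
            self.last = now
            return True
        return False

throttle = FrameThrottle(fps=30)
# In the acquisition loop:
#     if throttle.ready():
#         update_intensity_graph(frame)   # hypothetical display call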

 

Just a thought.

 

Regards,
Chris Elliott
x36772
0 Kudos
Message 9 of 9
(5,166 Views)