Machine Vision


How many cores

Hi

 

I use VB2005 with IMAQ, Vision and Machine Vision to develop industrial X-Ray applications. I am in the concept stage of a new project that will use a new image format. The camera format is 4168x4168 @ 1 fps or 2084x2084 @ 4 fps, grayscale 16-bit.

 

Processing these images can be quite tough. I am just wondering what type of computer I need to spec: Core2Duo, Xeon, or dual Xeon? How many cores will the NI components use?

 

Thanks

Bestbier

Message 1 of 6

Unless there has been some huge change in LabVIEW 8.6, LabVIEW 8.5 and 8.2.1 seem to use one core per VI. If you have multiple operations in parallel, then LabVIEW will scale across the cores accordingly. I've found that vision code tends to be extremely linear, at least if it's expected to be real time (one VI feeds into another, which feeds into another). In that case, you'd want fewer, faster cores rather than more, slower ones (for example, a higher-clocked dual core over a lower-clocked quad core). Also remember that more cores are better for system responsiveness: you can load 3 out of 4 cores on a quad-core computer and still be able to navigate and use the computer normally.

 

Also remember that, depending on your camera configuration, you may be using a lot of CPU just bringing in the frames. For example: my school used an Intel Q6600 (quad core) with a GigE camera, and just bringing the frames into LabVIEW (no processing) used an entire core. USB and GigE cameras are demanding, FireWire is generally light, but everything needs processing power unless something else is doing the processing for you.

 

If you don't need real time, and you're planning on processing multiple frames at a time, you can pipeline your code so that you can use more cores. Pipelining means that you perform multiple vision operations in parallel: if you had 4 operations to perform, you'd perform operation #1 on image 4, while performing operation #2 on image 3, operation #3 on image 2, and operation #4 on image 1. This consumes more memory and adds some overhead, but it's one of the only ways to get all the cores working for you.
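Purely as an illustration, here is what a two-stage pipeline could look like in VB 2005 using plain threads and a queue. This is a generic sketch, not NI's API: AcquireFrame, FilterImage and MeasureImage are hypothetical placeholders for whatever IMAQ/Vision calls you actually use.

Imports System.Threading
Imports System.Collections.Generic

Module PipelineSketch

    ' Frames finished by stage 1 wait here for stage 2.
    Private ReadOnly stage2Queue As New Queue(Of UShort())
    Private ReadOnly queueLock As New Object()
    Private stage1Done As Boolean = False

    ' Stage 1: grab a frame and run the first operation on it.
    Private Sub Stage1Worker()
        For i As Integer = 0 To 99                      ' e.g. 100 frames
            Dim frame() As UShort = AcquireFrame()      ' hypothetical grab
            FilterImage(frame)                          ' hypothetical operation #1
            SyncLock queueLock
                stage2Queue.Enqueue(frame)
                Monitor.Pulse(queueLock)
            End SyncLock
        Next
        SyncLock queueLock
            stage1Done = True
            Monitor.Pulse(queueLock)
        End SyncLock
    End Sub

    ' Stage 2: runs on another thread (and core) in parallel with stage 1.
    Private Sub Stage2Worker()
        Do
            Dim frame() As UShort = Nothing
            SyncLock queueLock
                Do While stage2Queue.Count = 0 AndAlso Not stage1Done
                    Monitor.Wait(queueLock)
                Loop
                If stage2Queue.Count > 0 Then frame = stage2Queue.Dequeue()
            End SyncLock
            If frame Is Nothing Then Exit Do            ' stage 1 finished, queue drained
            MeasureImage(frame)                         ' hypothetical operation #2
        Loop
    End Sub

    Sub Main()
        Dim t1 As New Thread(AddressOf Stage1Worker)
        Dim t2 As New Thread(AddressOf Stage2Worker)
        t1.Start()
        t2.Start()
        t1.Join()
        t2.Join()
    End Sub

    ' Placeholders so the sketch stands alone; swap in your real acquisition and IMAQ calls.
    Private Function AcquireFrame() As UShort()
        Return New UShort(2084 * 2084 - 1) {}
    End Function

    Private Sub FilterImage(ByVal frame() As UShort)
    End Sub

    Private Sub MeasureImage(ByVal frame() As UShort)
    End Sub

End Module

The queue here is unbounded, which is where the extra memory use comes from; each extra stage you split out is another thread that can keep another core busy.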

 

I'm a big fan of the Q6600, just because it's so cheap and powerful. I don't like the Xeons too much because they always seem overpriced for the power you're getting (I'm a gamer, just my opinion). 

Message 2 of 6

Hi

 

Please note that I don't use LabVIEW; I don't even know exactly what a VI is. I use Visual Basic 2005. The reason for this is that my application does image processing, PLC control, reporting in DOC format, and soon SQL.

 

I have a previous version that uses an NI PCIe-1427 Camera Link card at 1000x1000, 16-bit. On my development PC (1.8 GHz dual core), when I started the app it used both cores at 95-100%. Note that I am processing 30 fps for live display and then averaging 60 frames. When I installed the app on the client's computer, which is a quad-core, it used all four cores at about 60-70%. Processing time was slightly reduced (I think).
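For reference, the averaging step boils down to something like this (a simplified VB sketch, not my actual code; AverageFrames is just an illustrative name). It sums into a 32-bit accumulator so 60 frames of 16-bit data cannot overflow, then divides once at the end.

' Average a batch of 16-bit grayscale frames (one UShort array per frame).
Private Function AverageFrames(ByVal frames As UShort()()) As UShort()
    Dim pixelCount As Integer = frames(0).Length
    Dim accumulator(pixelCount - 1) As UInteger
    For Each frame As UShort() In frames
        For i As Integer = 0 To pixelCount - 1
            accumulator(i) += frame(i)      ' 60 x 65535 still fits in UInteger
        Next
    Next
    Dim result(pixelCount - 1) As UShort
    For i As Integer = 0 To pixelCount - 1
        result(i) = CUShort(accumulator(i) \ CUInt(frames.Length))
    Next
    Return result
End Function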

 

I was wondering: if I use a dual Xeon, will all 8 cores share the processing? My new camera is probably going to be 4200x4200, 16-bit, with a GigE interface.

 

Thanks

Message 3 of 6

Hello bestbier,

First off, awesome name. Deutsches Bier ist sehr gut! (German beer is very good!)

Second, I would select the Dual-Xeon 8-core PC for your application. Windows will automatically distribute the work across the available cores as best it can. Arguably, LabVIEW would do a better job dividing the tasks/threads, because in version 8.6 the programmer has the ability to target individual cores. In summary, your VB 2005 IMAQ / NI Vision code will perform better on the Dual-Xeon than on a single-Xeon PC, with more time before obsolescence.

Just so you know, LabVIEW can interface with SQL databases, Microsoft Office applications, and virtually any PLC, all with the convenience of graphical programming. And as music2131 mentioned, LabVIEW Real-Time offers deterministic performance unlike any Windows operating system.

What kind of hardware interface are you using to acquire images? Do you see a change in acquisition / image processing performance when Windows is interacting with the PLC or writing to Office?

I hope this information helps! 🙂

David G
Sales Engineer - SE Michigan & N Ohio
National Instruments
Message 4 of 6

Hi David

 

Thanks. Bestbier is actually my surname and what people call me; it usually takes some explaining.

 

Thanks for the info. I have never actually looked at LabVIEW; I always thought it was designed for testing and research and not suitable for my application. Generally one system will save about 300,000 images per year, each one interpreted by the operator. I will take a look at the links you posted.

 

I will probably use an NI PCIe-8231 for this system. Currently I use a PCIe-1427 for a Camera Link system. I use a component to do the reporting, so there is basically a mini word processor built into the software. I have reverse engineered the PLC communication to make it as streamlined as possible, so there is no lag from the PLC or reporting.

 

Regards

Bestbier

Message 5 of 6

Hey bestbier,

LabVIEW really is a powerful software tool for testing, measurement and control... I'm glad you will look at the links I posted. I should mention that the LabVIEW DSC Module is how LabVIEW communicates with PLCs... with virtually no "lag" or jitter.

The PCIe-8231 should work just fine with your Gig-E compliant camera.

Post back if you have any further questions. Thanks!

David G
Sales Engineer - SE Michigan & N Ohio
National Instruments
Message 6 of 6