LabVIEW

Implementing CANopen functionality in LabVIEW using only NI-XNET, without NI-Industrial Communications for CANopen

I am currently sending and receiving data (frames) between an accelerometer that communicates over the CANopen protocol and a Windows 11 PC through an NI 9862 module. The transmission and reception itself works, but a problem has come up.
When I receive and process the measurement data (PDO: Process Data Object) sent from the accelerometer in LabVIEW, the number of samples received clearly differs from the number the sensor transmitted. Specifically, regardless of the sampling frequency I configure, the values appear to be automatically adjusted to always yield 1000 sps (samples per second).
This automatic adjustment is not intentional. Is there a setting in LabVIEW or the NI-XNET driver that controls this behavior?
The same phenomenon (automatic adjustment or inflation of the sample count) occurs whether I use Signal Input Waveform, Frame Input, or other session configurations.
I have the wait time set to 100 ns; if I make it any shorter, an overflow occurs.
I suspected a synchronization problem, but I could not pin it down.
Any information at all, whether about the cause or similar symptoms, would be appreciated.

Message 1 of 7

@mkkkkk wrote:

Currently, I am using an accelerometer that transmits and receives data using the CANopen protocol to send and receive data (frames) on a Windows 11 computer through an NI9862 module. Although I am able to send and receive data, I am having problems.
When I try to receive and process the measurement data (PDO: Process Data Object) sent from the accelerometer in LabVIEW, the number of data received clearly changes from the number of data sent from the sensor. Specifically, regardless of the sampling frequency I set, the value seems to be automatically adjusted to always be 1000 sps (samples per second).
This automatic adjustment is not intentional. Are there any settings in LabVIEW or the NI-XNET driver that can control this automatic adjustment?
This phenomenon seems to be the same regardless of the session configuration, whether Signal Input Waveform, Frame Input, etc. (automatic adjustment or inflation of the sample count).
The wait time is also set to 100ns, but if it is set any faster than that, an overflow occurs.
I thought it might be a problem with synchronization, but I'm not sure.
I would appreciate any information you can give me, such as the cause or similar symptoms.


The NI-9862 does not support CANopen natively, so I believe you implemented the CANopen protocol layer yourself.

I am going to cover NI-XNET only. The XNET Read VI returns data differently depending on the session mode.

Assuming the signal has a transmit time of 0.001 s in the database:

- Signal Input Single-Point returns only the most recent value on each read.
- Signal Input Waveform resamples the signal at the session's Resample Rate property (1000 Hz by default), repeating the last received value between frames, so you get 1000 samples per second regardless of the actual frame rate.
- Signal Input XY returns one value/timestamp pair per received frame, with no resampling.
- Frame Input Queued and Frame Input Stream return the raw frames as they arrive on the bus.

Depending on what you are trying to do with the data, I can recommend which session mode suits your application the best.
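To make the waveform-mode behavior concrete, here is a minimal plain-Python sketch (not NI code) of sample-and-hold resampling; the 100 sps frame rate and 1000 Hz resample rate are example values chosen to mirror the symptom in this thread:

```python
# Sketch: simulates how a sample-and-hold resampler (as used by XNET
# Signal Input Waveform sessions) inflates a slow frame stream to the
# resample rate. Frame rate and resample rate are example values.

def resample_hold(frame_times, frame_values, resample_rate, duration):
    """Return waveform samples at resample_rate, holding the last frame value."""
    out = []
    idx = -1  # index of the most recent frame at or before the sample time
    for k in range(int(duration * resample_rate)):
        t = k / resample_rate
        while idx + 1 < len(frame_times) and frame_times[idx + 1] <= t:
            idx += 1
        out.append(frame_values[idx] if idx >= 0 else 0.0)
    return out

# Sensor transmits at 100 sps for 1 second -> only 100 frames on the bus
frame_times = [i / 100 for i in range(100)]
frame_values = [float(i) for i in range(100)]
waveform = resample_hold(frame_times, frame_values, 1000, 1.0)
print(len(waveform))  # 1000 samples, each frame value repeated 10 times
```

Every frame value is repeated until the next frame arrives, which is exactly why the received sample count looks "automatically adjusted" to 1000 sps.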

-------------------------------------------------------
Applications Engineer | TME Systems
https://tmesystems.net/
Message 2 of 7

Thank you for your response. To clarify my situation and needs:

  1. My goal is to receive the data transmitted from the sensor, save it to a CSV file, and display a graph on the front panel.
  2. Currently, we have a one-to-one relationship between the sensor and the computer, using asynchronous communication in CANopen.
  3. I've tried using Signal Input Waveform mode to create a graph, but I'm still experiencing the same symptoms (automatic adjustment to 1000 sps). I've also attempted other modes like Frame Input Queued mode and Signal Input XY mode, but the results remain unchanged.
  4. Which session mode do you think would be most suitable for resolving this issue? Do you have any advice on settings or methods to accurately receive and process the data?
  5. If there's any additional information that might be helpful in solving this problem, such as details about our CANopen implementation or database file settings, please let me know.

I appreciate your guidance on how to overcome this automatic adjustment issue and accurately capture the sensor data at the intended sampling rate.
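Since the stated goal is CSV logging plus a graph, the value/timestamp pairs produced by a per-frame read (as in Signal Input XY mode) can be written out directly. A minimal Python sketch; the file name, data values, and row layout are arbitrary examples:

```python
import csv

def append_pairs_to_csv(path, timestamps, values):
    """Append (timestamp, value) pairs to a CSV file, one row per sample."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for t, v in zip(timestamps, values):
            writer.writerow([t, v])

# Example: pairs as a timestamped per-frame read might return them
append_pairs_to_csv("accel_log.csv", [0.00, 0.01, 0.02], [0.12, 0.15, 0.11])
```

Logging the actual frame timestamps (rather than an assumed fixed interval) keeps the file valid no matter which sensor rate is configured.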

Message 3 of 7

What is the desired sample rate or interval of your data?

Do you want the sample interval to be fixed as defined in the database file? Or do you want it to be dynamic and reflect the actual timestamps of the CAN frames?

Message 4 of 7

Thank you for your questions. To clarify:
The desired sample rate should match the actual sampling rate of my accelerometer sensor. The sensor has the following sampling rates and intervals:
1,000 sps (1 ms interval)
500 sps (2 ms interval)
200 sps (5 ms interval)
100 sps (10 ms interval)
50 sps (20 ms interval)

I want the data acquisition system to adapt to these different sampling rates as I change the sensor settings. The goal is to accurately capture the data as it's sent from the accelerometer, without any automatic adjustments or resampling.

Ideally, the sample interval should reflect the actual timestamp of the CAN frames to maintain the integrity of the data stream from the sensor.

The most important aspect is to ensure we don't miss any samples or introduce artificial data points, regardless of the chosen sampling rate. I need the flexibility to change the sampling rate for different experiments without encountering the automatic adjustment issue to 1000 sps that I'm currently experiencing.
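One way to confirm that no samples are missed, once per-frame timestamps are available (e.g. from an XY-style read), is to check each inter-frame interval against the configured period. A minimal sketch; the 1.5x tolerance and the sample data are arbitrary choices:

```python
def find_gaps(timestamps, expected_interval, tolerance=1.5):
    """Return (index, gap) for each inter-frame gap exceeding tolerance * expected_interval."""
    gaps = []
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        if dt > tolerance * expected_interval:
            gaps.append((i, dt))
    return gaps

# 100 sps stream (10 ms period) with one dropped frame between samples 2 and 3
ts = [0.00, 0.01, 0.02, 0.04, 0.05]
print(find_gaps(ts, 0.010))  # reports one ~20 ms gap at index 3
```

The same check works unchanged at 50, 200, 500, or 1000 sps; only `expected_interval` changes with the sensor setting.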

Since my English skills are not very strong, I rely on tools like DeepL for translation. I apologize if there are any unclear or incorrect parts in my message.

Can you recommend the best approach in NI-XNET to achieve accurate data capture that matches these varying sensor output rates? Should I be using a specific session mode or configuration to handle this type of adaptive data acquisition more effectively?

 

Message 5 of 7

If I understand you correctly, the transmit rate of the CAN messages sent by the accelerometer can be configured.

And you want the NI-XNET device to capture and report the actual timestamps of the CAN messages, adapting to the transmit rate of the accelerometer.

In that case, Signal Input XY Mode should meet your requirements.

If you have tried that and it didn't work, please attach your code and an NI-XNET Bus Monitor log, and I can help investigate.

Message 6 of 7

Thank you for your response, and I apologize for the delay in getting back to you.

I tried using Signal Input XY Mode as you suggested, but unfortunately it didn't resolve the issue: the program still encounters a stack overflow and doesn't function properly. I believe this is also due to the automatic value adjustment. However, when I checked with the NI-XNET Bus Monitor, I didn't observe any automatic value adjustment there, which suggests the issue might be in my code rather than in the signal transmission.

I'm attaching my current code. While it doesn't use Signal Input XY Mode, it's the only version that works somewhat properly. I'm also including screenshots of the NI-XNET Bus Monitor log for your reference.

Thank you for your continued assistance; I look forward to your insights. (This message has been translated by machine. I apologize for any errors or unintended discourtesies that may have occurred in the translation process.)
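An overflow in a read loop often comes from accumulating every sample into an ever-growing array for display. Bounding the plot history keeps memory constant while the full stream still goes to disk; a minimal Python sketch of the idea, where the 10,000-sample depth and the simulated read sizes are arbitrary example values:

```python
from collections import deque

HISTORY_DEPTH = 10_000  # samples kept for display; arbitrary example value
history = deque(maxlen=HISTORY_DEPTH)

def on_samples(values):
    """Append newly read samples; the deque discards the oldest automatically."""
    history.extend(values)

# Simulate 50 reads of 1000 samples each: 50,000 samples in, 10,000 retained
for chunk in range(50):
    on_samples(range(chunk * 1000, (chunk + 1) * 1000))
print(len(history))  # bounded at 10000
```

In a LabVIEW diagram the equivalent is a fixed-size circular buffer (or a chart with a limited history length) instead of repeatedly appending to an array in a shift register.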

Message 7 of 7