Spectral analysis and low frequencies

And this is the front end utilizing DataSocket.

Forgive me please, I have no idea how to attach multiple files in one post! :~(
Message 11 of 27
You were pretty close with your example. I fixed up your VIs so that passing the waveform now works, and I included "dt" in the transferred data.

The odd "t0" was caused by "Simulate Signal". To get real timing information, choose "Simulate Acquisition Timing" and "Absolute Timing" within this Express VI.

See if you like it.

(Sorry, I deleted the TCP Error.vi in the examples - you can insert it yourself.)
- Philip Courtois, Thinkbot Solutions

Message 12 of 27
Thank you Philip, that works out quite well. I will have to look at it some more to understand what exactly is different. I have one last question for you if you would be ever so kind.

Ultimately, I will only use the client portion of the LabVIEW VI for my project. I wanted to test both client and server in LabVIEW to get a feel for TCP connections in LabVIEW. The "server" is going to be an ethernet microcontroller from Maxim IC (DSTINIm400) which will package data from an 8051 microcontroller and send it via an ethernet connection to a computer running the client VI. The DSTINIm400 can have Java, C/C++ code uploaded into its memory. I was going to use one of these languages to program a server on the board.

What I am wondering is: can I follow the same logic as the LabVIEW server and "convert" it to Java or C++ while still using the same LabVIEW client code? That is, can the DSTINIm400 send out the raw data, time stamp, and time between samples, with the LabVIEW client "understanding" what is coming through the TCP stream even though the server itself is not written in LabVIEW? I hope that makes sense!

Any suggestions would be greatly appreciated. Again, thank you very much for the help so far!
Message 13 of 27
If I understand correctly, for the final implementation: -
You have a process on one computer in LabVIEW
A process on another computer in Java
A communication mechanism between the two using TCP/IP

The short answer: -
LabVIEW would not 'know' about any change.

The long answer: -
This part of engineering is called a 'black box' approach. Others call it divide and conquer; either way, the principles are the same.

1) Isolate the process
2) Define the communication
3) Implement

It's really no different from any module you might already use on your computer. You have no knowledge of (and possibly don't care about) the tools that a DLL, ActiveX component, or other routine, be it Mac, Windows, or UNIX, was written with. What you are interested in is the communication between the component of interest and your module. The fact that in your instance part of the communication takes place over a network is actually of little consequence. With any communication, one should have taken account of the bandwidth available at the design phase (so, have you done this?).

This latter point could have more impact on your project than you anticipate if you haven't done the sums properly. In this instance it's a case of appreciating the difference between a simulation and the implementation. Of course, there is nothing to stop you modelling the properties of the actual communication medium in a module; this requires that you understand the properties that are of interest. Simply put, you need to consider how much information you are intending to send and how much of a delay there might be. Have you allowed for the fact that some of the data could get lost?
Message 14 of 27
Good. As far as I understand, a client and server can be written in two different languages (Java and C++), so the communication via TCP (or even UDP) would be independent of this... as long as you tell each what to send and what to receive. I was hoping LabVIEW was the same way; I do not know much about LabVIEW and wanted to make sure there are no "quirks" to speak of.

Honestly, we have not thought about bandwidth, and we chose TCP/IP to help take care of data transfer. Our project is basically a "proof of concept" and will be tested on an intranet. Obviously there is a possibility of losing data across the network, but we feel the intranet testing will suffice to prove that the concept can work, and we will include problems/issues in our final report. If all goes well, we can even keep working on it after it is all said and done.

Now the only thing I have to do is figure out how to send 4 channels of EEG information, heart rate, respiratory rate, and galvanic skin response via TCP using C++ and have it received correctly in LabVIEW! Oi!
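Actually, doing a quick back-of-envelope sum now (the sample rates below are just my guesses for illustration, not final design figures), the raw payload rate looks tiny:

```java
// Back-of-envelope throughput estimate for the proposed data set.
// All sample rates here are assumed placeholders, not real design figures.
public class BandwidthEstimate {
    // Returns the raw payload rate in bytes per second.
    public static int payloadBytesPerSecond(int eegChannels, int eegRateHz,
                                            int slowChannels, int slowRateHz,
                                            int bytesPerSample) {
        return (eegChannels * eegRateHz + slowChannels * slowRateHz) * bytesPerSample;
    }

    public static void main(String[] args) {
        // 4 EEG channels at an assumed 256 Hz, 3 slow vitals (heart rate,
        // respiratory rate, GSR) at 1 Hz, 16-bit samples.
        int rate = payloadBytesPerSecond(4, 256, 3, 1, 2);
        System.out.println(rate + " bytes/s"); // prints "2054 bytes/s"
    }
}
```

Even allowing for TCP/IP header overhead, a couple of kilobytes per second should be no problem at all on an intranet.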

Thank you very much for your suggestions.
Message 15 of 27
Here's a bit of advice: -
Keep the protocol simple
Keep the overhead low
Allow for validation
Try to allow for extensibility (If you get the above right, the rest normally falls into place very easily)

What is the client/server model that you intend to select as a basis for implementation?
Better yet, find a model/protocol that already exists and implement it (there might be standards, for example)!!

Have you thought about error detection in the received data? It's often the case with ECG data that information gets lost or corrupted, and you need to be able to tell the difference between the two. I for one would be rather unhappy if interference indicated I had palpitations or fibrillations!!!!! Then again, if the sensor falls off, someone might assume I have just died, when in fact a spasm in my right arm caused me to pull off the sensor, and I die because the monitor simply thought the sensor had become detached and so didn't bother to call out the doctor. 😞
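As a minimal illustration of the validation point, here is an additive 16-bit checksum over a frame's payload (just a sketch of the idea; a real design would more likely use CRC-16 or CRC-32, which catch far more error patterns):

```java
// Minimal frame-validation sketch: a 16-bit additive checksum over the
// payload bytes. Illustration only - a CRC is the better choice in practice.
public class FrameCheck {
    // Sum of payload bytes (treated as unsigned), truncated to 16 bits.
    public static int checksum16(byte[] payload) {
        int sum = 0;
        for (byte b : payload) {
            sum = (sum + (b & 0xFF)) & 0xFFFF;
        }
        return sum;
    }

    // Sender appends checksum16(payload) to the frame;
    // receiver recomputes it and compares.
    public static boolean verify(byte[] payload, int receivedChecksum) {
        return checksum16(payload) == receivedChecksum;
    }
}
```

If the checksums disagree, the frame was corrupted in transit; if a frame never arrives at all (a sequence number would reveal this), it was lost. The two cases can then be handled differently.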
Message 16 of 27
Just to clarify, I mistook EEG for ECG; sorry for the ramblings if this distracts from the point.
Message 17 of 27
Oh, my point about the communications protocol!

Well, if you take the project to extremes, then one could imagine that the central part of the project is in fact the communication protocol, as all the other bits are liable to change in both feature set and hardware platform. Thus, the thing of value to a company, and possibly the only constant, is an extensible, layered protocol stream that will allow you to support multiple device (target and host) platforms in a seamless manner.

Well it's possibly a different perspective to the one you started with?
Message 18 of 27
Well, my knowledge of networking is equivalent to that of a novice. I am assuming that when you ask for the client/server model, you are asking what responsibilities the client and server will each service. I am thinking that the server (patient) will acquire the needed data, while the client (doctor) will request this information in "real time" via TCP/IP and do the FFT, display, and recording of the data. Right now, the patient will be sitting at home and their device (consisting of sensors, ADC, and microcontrollers) will be the server, while the doctor's VI will be the client... who may be across town. We are thinking of applications in telemedicine with this.

There has been no thought given to the complexity of the networking scheme besides a "point to point" connection through the Internet, with one patient and one doctor at a time as of right now. This is a school project, so at the moment there is not an extreme importance on making a "perfected" network scheme.

We were thinking about using some kind of data tagging for data transfer and reliability. Since we are using a 12-bit ADC, we figured we could use the first nibble of a 16-bit signed integer as an identifier.
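In rough Java terms (placeholder names; this is just the packing idea, not tested hardware code), what we have in mind is a 4-bit channel ID in the top nibble and the 12-bit ADC code in the low 12 bits:

```java
// Sketch of the nibble-tagging idea: 4-bit channel ID in the top nibble
// of a 16-bit word, 12-bit ADC reading in the low 12 bits.
// Placeholder class/method names - not the actual project code.
public class TaggedSample {
    public static int pack(int channelId, int adcValue) {
        // channelId: 0..15, adcValue: 0..4095 (12-bit unsigned ADC code)
        return ((channelId & 0xF) << 12) | (adcValue & 0xFFF);
    }

    public static int channelOf(int word) {
        return (word >> 12) & 0xF;
    }

    public static int valueOf(int word) {
        return word & 0xFFF;
    }
}
```

One thing we will have to watch: with a channel ID of 8 or above, the top bit of the word is set, so the receiver must unpack the tag before interpreting the word as a signed 16-bit sample.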
Message 19 of 27
Here is the finalized LabVIEW client VI that I have worked on these past few days. It is very big and nasty... I am sure there is a better way to do this, maybe with some subVIs? Either way, it works, and I appreciate your suggestions. However, how do TCP Read and TCP Write actually work? Does TCP Write create one packet per iteration of the while loop, and does TCP Read read in one packet following the same logic? I am wondering because I am going to be writing a C++ or Java server to communicate with the LabVIEW client, and I need to know how many bytes to send in a header to tell the LabVIEW client how many bytes to expect before actually sending the values. I am a little confused as to the logic of how LabVIEW does this. Here is my client VI for you to look at.
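For what it's worth, my tentative plan for framing on the server side is below (a Java sketch with placeholder names; I am assuming the usual pattern of a 4-byte big-endian length header, which I believe matches what the example VIs typecast on the LabVIEW side):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

// Sketch of a length-prefixed frame: a 4-byte big-endian byte count,
// then the raw payload. DataOutputStream always writes big-endian,
// which matches LabVIEW's default (network) byte order, so no byte
// swapping should be needed on either side.
public class FrameSender {
    public static byte[] frame(byte[] payload) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(bos);
        try {
            dos.writeInt(payload.length); // 4-byte big-endian length header
            dos.write(payload);           // raw sample bytes
        } catch (IOException e) {
            throw new AssertionError(e);  // cannot happen for an in-memory stream
        }
        return bos.toByteArray();
    }
}
```

The server would write frame(payload) to the socket's OutputStream once per block of samples; the LabVIEW client would then do one TCP Read of 4 bytes, typecast that to an I32 byte count, and a second TCP Read of that many bytes.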

Thank you very much.
Message 20 of 27