How do you set a start and end time based on changes in data averages?

I am making a VI to collect data from a calorimeter. It should detect automatically when I have triggered the combustion by recognizing a rise in the average temperature, and then record data from that point until the temperature rise stops or plateaus. I have worked out how to average my data, how to write it to a file, etc., but I don't know the best way to construct my VI so that it responds automatically to changes in the average. It seems to me that there are a number of ways to do this, but I am fairly inexperienced with LabVIEW and don't know which is best.

 

Does anyone have any ideas, specific advice, or general thoughts about using fluctuations in data averages to set start and stop times for a VI?

 

 

Message 1 of 4

First, I suggest that you be careful about your terminology.  I think you are not interested so much in start and stop times as in selecting appropriate data subsets (which may include timestamps).

 

You will probably want a VI which runs before, during, and after the combustion process.  After extracting the data subset, the other data can be discarded.

 

Only you can decide what the best triggering approaches will be because no one else knows how your data behaves.  Are you measuring detonations or iron rusting?  The timing and the changes in the data will be quite different.  How much noise is in your data? Do you need to deal with plateaus due to heat of vaporization or heat of fusion as components vaporize or melt?  What are the consequences of deciding incorrectly where to start or stop?

 

If you can show some data, someone may be able to make more specific suggestions.

 

Lynn

Message 2 of 4

Lynn,

 

All great points - thank you for your thoughts. You are right, I am interested in selecting a subset of my data in which the temperature is rising, not setting start and stop times.

 

I realize that it is difficult to give concrete advice without more information, but I'm hoping that with some clarification on my part you might be able to give a few suggestions. The calorimeter I am working with is a microbomb calorimeter, which combusts small, dry biological samples in a pressurized oxygen environment; it does not have a water tank or the other components found on larger bomb calorimeters. The data I am recording is the mV signal from a thermocouple ring that the microbomb rests in. To answer your questions directly, I would say I am measuring detonations, or more specifically the electrical potential generated in the thermocouple by the temperature increase of the stainless steel bomb. The data is fairly noisy, so I have been averaging every 1000 data points together in order to observe the temperature increase clearly.
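The 1000-point averaging described above is a simple block (decimating) average. Since LabVIEW is graphical, here is a Python sketch of the equivalent logic (in LabVIEW this would typically be `Mean.vi` applied to each block inside a loop); the block size of 1000 is taken from the post, and the function name is my own:

```python
import numpy as np

def block_average(samples, block_size=1000):
    """Average consecutive blocks of raw samples to suppress noise.

    Trailing samples that do not fill a complete block are discarded.
    """
    n_blocks = len(samples) // block_size
    trimmed = np.asarray(samples[: n_blocks * block_size], dtype=float)
    return trimmed.reshape(n_blocks, block_size).mean(axis=1)
```

Each averaged point then represents 1000 raw thermocouple readings, which also reduces the data rate the trigger logic has to examine by the same factor.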

 

The only plateau in the data that I need to deal with is at the peak of the temperature increase, which is where I would like to put my second timestamp and end my data subset. There is a slight lag due to the rate at which the steel heats, so the entire measurement can take 30-40 seconds even though the combustion itself occurs in seconds. What I would like to do is have my VI observe the averages it is already computing and detect the start of the rise by some means, such as an increase beyond one or two standard deviations, then set the datapoint just before the increase as Time 0. I would then like to identify the end of the temperature increase, and thus the end of the data subset, by similar means as the rise levels off. Finally, I want to integrate the curve of the increase and display the integrated value for my users.
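The detection scheme described above (baseline statistics, a standard-deviation threshold for the start, a flatness test for the end, then integration) can be sketched in Python as follows. This is only an illustration of the logic, not a LabVIEW implementation; all parameter values (`n_baseline`, `k`, `flat_count`) and function names are hypothetical and would need tuning against real data:

```python
import numpy as np

def find_rise_window(avg, n_baseline=20, k=2.0, flat_count=5):
    """Locate the start and end of a temperature rise in averaged data.

    Start: first point more than k standard deviations above the mean
    of the first n_baseline points. The returned start index is the
    datapoint just before that (Time 0, as described in the post).
    End: first point after the start where flat_count consecutive
    point-to-point changes stay within the baseline noise band.
    """
    avg = np.asarray(avg, dtype=float)
    base_mean = avg[:n_baseline].mean()
    base_std = avg[:n_baseline].std()
    threshold = base_mean + k * base_std

    start = None
    for i in range(n_baseline, len(avg)):
        if avg[i] > threshold:
            start = i - 1          # datapoint before the rise = Time 0
            break
    if start is None:
        return None, None

    flat = 0
    for i in range(start + 1, len(avg)):
        if abs(avg[i] - avg[i - 1]) <= base_std:  # rise has leveled off
            flat += 1
            if flat >= flat_count:
                return start, i
        else:
            flat = 0
    return start, len(avg) - 1

def integrate_rise(avg, start, end, dt=1.0):
    """Trapezoidal integral of the curve above the pre-rise baseline."""
    avg = np.asarray(avg, dtype=float)
    seg = avg[start:end + 1] - avg[start]
    return float(np.sum((seg[:-1] + seg[1:]) / 2.0) * dt)
```

In LabVIEW the same comparisons would map onto the Statistics and Comparison palettes, with the integration done by `Numeric Integration.vi` or a point-by-point trapezoid in a loop.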

 

I would appreciate any thoughts. I don't have any raw data to share, but I do have a TDMS of the averaged data from a calibration test I did combusting benzoic acid. I'm not sure how much it will help, but I'll attach it regardless.

 

Thanks,

 

~nat

 

Message 3 of 4

nat,

 

I believe you should be able to implement the functionality you described. At a low level, you could use the statistics VIs to calculate the standard deviation of your averaged data set, and compare each data point to a multiple of that value to decide when the temperature increase is occurring. Once that point is reached, you could switch cases and begin saving the desired data set while checking for the end-point behavior you described (when the increase falls below a certain threshold). Depending on how you have set up your acquisition, you may be able to implement a state-machine structure (one of the provided templates in LabVIEW, if you're not familiar with it) with states like "check data", "increase detected", and "stop": the first would perform the comparison to the standard deviation, the second would save the in-range data, and the last would end acquisition and perform the desired integration.
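The three states suggested above can be sketched as follows. This is a Python toy model of the control flow, not LabVIEW code: in an actual LabVIEW state machine each `State` would be one case of a case structure inside a while loop, with the saved data and current state carried in shift registers. The parameter values here are illustrative only:

```python
from enum import Enum

class State(Enum):
    CHECK_DATA = 1         # compare incoming averages to the threshold
    INCREASE_DETECTED = 2  # save in-range data, watch for the plateau
    STOP = 3               # acquisition done; ready to integrate

def run_state_machine(averages, threshold, plateau_delta, flat_count=3):
    """Step through averaged points, mimicking the described case logic.

    threshold, plateau_delta, and flat_count are hypothetical tuning
    parameters; returns the final state and the saved data subset.
    """
    state = State.CHECK_DATA
    saved, flat = [], 0
    for x in averages:
        if state is State.CHECK_DATA:
            if x > threshold:          # rise detected: change cases
                state = State.INCREASE_DETECTED
                saved.append(x)
        elif state is State.INCREASE_DETECTED:
            # plateau: several consecutive changes smaller than delta
            if saved and abs(x - saved[-1]) < plateau_delta:
                flat += 1
            else:
                flat = 0
            saved.append(x)
            if flat >= flat_count:
                state = State.STOP
        if state is State.STOP:
            break
    return state, saved
```

One nice property of this structure is that adding a state later (say, "save to file" or "abort on timeout") only means adding a case, without rewiring the acquisition loop.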

 

The actual implementation would require fairly detailed knowledge of what ranges are expected and how critical the start and end positions are, as well as what sort of thresholds to expect and set. I think you could achieve your goals using some comparison and case logic. If I think of a more elegant solution I'll post my thoughts, but that's what I'm leaning toward at the moment.

 

Regards,

National Instruments
Message 4 of 4