09-08-2017 06:35 AM
When the LabVIEW adapter is configured to use the development system, the decimal point used within a LabVIEW module depends on the LabVIEW option "Use localized decimal point", which appears in the LabVIEW.ini configuration file.
But when the LabVIEW adapter is configured to use the Run-Time Engine, only the operating system's decimal point seems to be taken into account.
Is there any possibility to force/unforce the use of the local decimal point in the LabVIEW Run-Time Engine, like "useLocaleDecimalPt=True/False" in LabVIEW.ini?
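For reference, here is what the development-environment setting mentioned above looks like in LabVIEW.ini (a sketch: the token name is the one given in this question, and [LabVIEW] is the standard section header of that file):

```ini
; LabVIEW.ini - development environment configuration
[LabVIEW]
; Use the locale's decimal separator instead of a period
; (corresponds to the "Use localized decimal point" option)
useLocaleDecimalPt=True
```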
Jean-Louis SCHRICKE
├ CTA - Certified TestStand Architect (2008 - 2022)
├ CTD - Certified TestStand Developer (2004 & 2007)
└ CLD - Certified LabVIEW Developer (2003 & 2005)
09-08-2017 08:28 AM
You should still be able to use the token with the runtime engine:
Using LabVIEW INI Tokens with VIs Called from TestStand
The ini file is per process, so remember that you'll need to include it for each operator interface you use.
Hope this helps!
Trent
09-08-2017 09:39 AM
Thanks for your reply and the link.
We may use a LabVIEW Run-Time Engine operator interface that is not the same version as the LabVIEW Run-Time Engine of the modules it calls.
Does the TestExec.ini token "useLocaleDecimalPt" apply to every LabVIEW Run-Time Engine version used by the LabVIEW modules?
Jean-Louis SCHRICKE
├ CTA - Certified TestStand Architect (2008 - 2022)
├ CTD - Certified TestStand Developer (2004 & 2007)
└ CLD - Certified LabVIEW Developer (2003 & 2005)
09-08-2017 10:28 AM
Create an [LVRT] section in TestExec.ini if one doesn't already exist and put the token under it.
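Following that instruction, the resulting TestExec.ini fragment might look like this (a sketch under the assumption, per this thread, that [LVRT] is the section the LabVIEW Run-Time Engine reads its tokens from):

```ini
; TestExec.ini - tokens passed to the LabVIEW Run-Time Engine
[LVRT]
; Set True to use the locale's decimal separator, False to force a period
useLocaleDecimalPt=False
```

Since the INI file is per-process, the same fragment would need to be added to the configuration file of each operator interface process, as Trent noted above.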