SystemLink Forum


Generating a report from data in a DataFinder

Solved!

Split from another thread.

 

I'm trying to generate a basic report, populated with some data from a TDMS file in a DataFinder. If I understand correctly, the analysis module has provisions for doing this, but I think it's overkill for what I'm trying to accomplish. I don't need any complex calculations or filtering; I just need to look up a given DUT and display its test results graphically (with some minor calculations). It would also be a bonus if this could be done dynamically, so that I don't have to store a permanent report for every test on a server somewhere.

 

My current thought is to build a WebVI that serves as a 'dashboard' of sorts and allows users to view test results, etc. BUT there isn't a DataFinder API for NXG. I'd like to dig deeper into the HTTP APIs (presumably the Data Navigator Data Service), but I'm hitting a bit of a wall trying to decipher the documentation (specifically http://localhost/systemlink-tdm-offline-doc/DataNavigatorDataService/DataNavigatorDataService.html#_... ). I am also considering an indirect approach: running a LabVIEW 2019 web service strictly for the purpose of doing lookups and reporting back the required data, but that seems a bit kludgy.

 

So, all that said, I guess my direct questions would be:

  • What is the syntax for creating a session with the /ods POST method? (See below for details on what I've tried)
  • Should I pursue the 2019 Web Service approach instead?
  • Is there a better option that I'm missing?
  • On a tangential note, does the DataFinder database ingest the entire file, or does it just serve as a lookup?
    • More to the point, do I need to keep the TDMS files on disk to ensure the data stays available, or can the files themselves be ephemeral?

 

Regarding creating a session:

 

My approach is as follows:

1. Login using the Swagger UI

2. Go to the Data Navigator Data Service

3. Go to the Init section, select POST /ods and expand

4. Click "Try it out"

5. Change the body to be as follows, where my DataFinder name is literally MyInstance:

 

curl -X POST "http://localhost/ni/asam/ods" -H "accept: application/x-asamods+json" -H "authorization: Basic YWRtaW46TWlQWEllMjAxNw==" -H "Content-Type: application/x-asamods+json" -d "{\"variables\":{\"$URL\":{\"stringArray\":{\"values\":[\"corbaname::#@MYINSTANCE@.ASAM-ODS\"]}}}}"

6. Set Parameter content type and Response content type to both be JSON

7. Click Execute

 

Result:

Response Code: 400

Response body:

{
  "reason": "EXCEPTION: CORBA::SystemException (BAD_PARAM)"
}

Response headers:

 access-control-allow-credentials: true 
 access-control-allow-origin: http://localhost 
 access-control-expose-headers: location 
 connection: close 
 content-length: 58 
 content-type: application/x-asamods+json 
 date: Wed, 30 Oct 2019 20:26:44 GMT 
 server: Microsoft-HTTPAPI/2.0 
 vary: Origin,User-Agent,X-NI-Auth-Method 

Note:

I tried removing the # and both @ symbols from the POST body, and then got the following response body (still code 400):

{
  "reason": "EXCEPTION: CORBA::TRANSIENT (TRANSIENT)"
}

I *think* once I can figure this out, the rest should be doable, but I'm hitting a wall here.
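
In case it's easier to read than the escaped curl command, here's the same request as a minimal Python sketch (untested, with placeholder credentials, and the body written as a plain dict so the $URL string is visible without the shell escaping):

import json
import requests

headers = {
    "accept": "application/x-asamods+json",
    "Content-Type": "application/x-asamods+json",
}

# Context variable selecting the DataFinder instance; this $URL
# syntax is exactly what's in question here
body = {
    "variables": {
        "$URL": {
            "stringArray": {"values": ["corbaname::#@MYINSTANCE@.ASAM-ODS"]}
        }
    }
}

resp = requests.post(
    "http://localhost/ni/asam/ods",
    headers=headers,
    data=json.dumps(body),
    auth=("admin", "password"),  # placeholder credentials
)
print(resp.status_code, resp.text)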

--------------------------------------
Message 1 of 4
Solution
Accepted by topic author ChrisStrykesAgain

I think I figured it out, but it would be nice to have confirmation: 

 

{"variables":{"$URL":{"stringArray":{"values":["corbaname::#MyInstance.ASAM-ODS"]}}}}
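
For comparison, the only difference from my first attempt is the $URL string:

First attempt (rejected with BAD_PARAM):  corbaname::#@MYINSTANCE@.ASAM-ODS
Working form (@ wrappers removed, instance name in its original casing):  corbaname::#MyInstance.ASAM-ODS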

Unfortunately, now I'm getting a TIMEOUT error on this DataFinder, and I'm still getting a BAD_PARAM error on a different DataFinder.

The reason I think this syntax is correct, however, is that I'm getting the exact same errors when trying to open these instances via the web application (http://localhost/#datanavigator/server_overview). I think I might be hitting what I described over here.

 

Worth noting: I can still access these via the LabVIEW 2019 API, but based on packet sniffing, that API seems to use an RPC interface instead.

--------------------------------------
Message 2 of 4
Solution
Accepted by topic author ChrisStrykesAgain

Hi Chris,

 

The Data Navigator uses the ASAM-ODS API under the hood, and it comes in various flavors (RPC, CORBA, HTTP), as it has evolved over the last few decades. I could well believe that the LabVIEW DataFinder Toolkit VIs use RPC, but I don't know whether they are limited to that interface (I doubt it) or why they would use RPC when other clients use the CORBA or HTTP interfaces.

 

The DataFinder requires that a file exist for each record in its database: it only ingests metadata, and therefore you can only query metadata. Bulk data (the channel arrays) is always left in the data file. This minimizes the size and maximizes the performance of the database. It also means that any information you want to query has to already be exposed as a property by the DataPlugin when the file is indexed. If a file is indexed and then later deleted or moved out of a Search Area, all records from that file are automatically removed from the DataFinder.

It is possible to use "file stubs": tiny files that point to other data sources, such as locations in third-party databases, or other recipes for how to request metadata or bulk data on demand. The information handed to the DataFinder by the DataPlugin doesn't have to be included in the data file that was indexed, but there always has to be a file.
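
To make that split concrete, here's a small illustration using the open-source npTDMS Python library (not an NI API; the file name is hypothetical). The properties are the metadata a DataFinder can index and query; the channel arrays are the bulk data that stays in the file:

from nptdms import TdmsFile

tdms = TdmsFile.read("dut_12345.tdms")  # hypothetical file

# Metadata: file-, group-, and channel-level properties.
# This is what the DataFinder ingests into its database, and
# the only thing you can query on.
print(tdms.properties)
for group in tdms.groups():
    for channel in group.channels():
        print(group.name, channel.name, channel.properties)

# Bulk data: the channel arrays themselves stay in the TDMS file,
# so the file has to remain on disk for the data to be retrievable.
channel = tdms.groups()[0].channels()[0]
samples = channel[:]  # reads the raw samples from the file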

 

Brad Turpin

DIAdem Product Support Engineer

National Instruments

Message 3 of 4

Thanks Brad!

 

That all makes sense, and the 'file stub' approach you describe is intriguing, but I think it makes more sense to just maintain all of the TDMS files for my particular application.

FWIW, I finally got everything working, and my last post re: the syntax was correct.
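
For posterity, a sketch of what the working flow looks like from Python (untested as written; credentials are placeholders, and I'm assuming from the exposed 'location' response header earlier in the thread that the session URL comes back in the Location header and can be closed with a DELETE):

import json
import requests

AUTH = ("admin", "password")  # placeholder credentials
HEADERS = {
    "accept": "application/x-asamods+json",
    "Content-Type": "application/x-asamods+json",
}

# Corrected $URL syntax: keep the #, drop the @ wrappers
body = {
    "variables": {
        "$URL": {
            "stringArray": {"values": ["corbaname::#MyInstance.ASAM-ODS"]}
        }
    }
}

resp = requests.post(
    "http://localhost/ni/asam/ods",
    headers=HEADERS,
    data=json.dumps(body),
    auth=AUTH,
)
resp.raise_for_status()

# Session URL is expected in the Location header
session_url = resp.headers["Location"]
print("Session created at:", session_url)

# ... issue queries against session_url here ...

# Close the session when finished
requests.delete(session_url, auth=AUTH)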


--------------------------------------
Message 4 of 4