Continuous Integration


Do you build dependencies?

Who is responsible for a "dependency" being available and up to date?

Should downstream projects care?

 

An example is described below, with the following assumptions:

  • Code is spread throughout multiple repositories, each of which contains code for one "thing" that can be built/versioned/released (application or library)
  • A Directed Acyclic Graph can in principle be constructed (there are no "circular dependencies" between any of the repository build outputs)
  • Some repositories contain code for libraries (e.g. the output is a PPL or similar) that may depend on other libraries
  • Some repositories may contain applications, which can depend on arbitrary collections of libraries (and conceivably other applications, but I don't think this adds anything to the discussion)
  • A given library or application knows all of its own direct dependencies
    • Ideally it doesn't need to know about its dependencies' dependencies, although these would also be required for build/run, and appear in the project when loaded in LabVIEW
  • No library or application knows what depends on it

 

I'd be interested in thoughts on the following: suppose I want to build a library that depends on another library (build B.lvlibp, which depends on A.lvlibp):

  1. Should the code that builds B be able to check if A.lvlibp is up to date? If it is not, should it a) rebuild A, b) fail to build B, c) warn and continue with the old A.lvlibp, d) continue without warning (presumably in this case, don't check at all)?
  2. If A.lvlibp does not exist in some sort of repository/NIPKG directory/network drive (wherever it is placed on build), should the behaviour be similar to the above (build A, or fail B)?
  3. Should B's dependency on A include a version for A? Should it be strict (exact version), looser (same major version etc), configurable (specify what matches and what does not)?
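To make question 3 concrete, here is a minimal sketch of "strict vs. looser" matching in Python (the constraint syntax is an assumption, loosely modelled on semver ranges, not any existing LabVIEW tooling):

```python
# Hypothetical version-constraint check for question 3: "strict" pins an exact
# version, "^" allows the same major version at or above the stated version.
def parse(version):
    """Split a dotted version string into a tuple of ints, e.g. "1.4.2" -> (1, 4, 2)."""
    return tuple(int(part) for part in version.split("."))

def satisfies(installed, constraint):
    """Return True if 'installed' matches 'constraint'.

    Constraint forms (an assumed syntax, for illustration only):
      "1.4.2"   - strict: exact match only
      "^1.4.2"  - looser: same major version, at least this version
    """
    if constraint.startswith("^"):
        wanted = parse(constraint[1:])
        have = parse(installed)
        return have[0] == wanted[0] and have >= wanted
    return parse(installed) == parse(constraint)

# B's dependency list could then carry a constraint per dependency:
b_dependencies = {"A.lvlibp": "^1.4.0"}
```

The "configurable" option from question 3 is then just a matter of which constraint form each entry in the dependency list uses.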

 

Currently I have a system that effectively rebuilds "A" when it is out of date and "B" is requested, but this requires quite a chunk of time spent confirming whether or not "A" is up to date.

There are ways I could speed that up (probably by looking at "Releases" on GitHub for "A" and seeing if there are more commits since, potentially filtering by some sort of branching strategy if desired), but I wonder if this is "B"'s problem at all...
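That "Releases vs. commits since" check could, at its core, reduce to GitHub's compare result between the release tag and the branch head. A minimal sketch in Python (the `ahead_by` field is part of GitHub's real compare endpoint, `GET /repos/{owner}/{repo}/compare/{base}...{head}`; fetching the JSON and any branch filtering are left out, and the function name is an invention for illustration):

```python
def release_is_current(compare_result):
    """Decide whether A's last release still matches the branch head.

    'compare_result' mirrors the JSON returned by GitHub's compare endpoint
    (release tag as base, branch as head): 'ahead_by' counts commits on the
    branch that are not contained in the release.
    """
    return compare_result["ahead_by"] == 0
```

With that, "B" only needs one API round trip to decide whether "A" is stale, rather than re-checking "A"'s build outputs itself.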

In particular, if I specify a version for "A" in "B"'s dependencies, it shouldn't really matter, provided a suitable version of "A" can be found (and installed... without conflicting with C, D, E, or whatever other dependencies are listed - but if that is not possible, then there is a configuration problem and I'm fine with that being a failure).

 

If I give "B" the information it would need to produce a list of dependencies for e.g. an NIPKG, then I can presumably just install those packages before loading and building B. Nested dependencies would be resolved by NIPM. Versions can also be specified, although this imposes some limits on the way they can be specified.
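For what that could look like in a build script, here is a minimal sketch, assuming the dependencies are plain NIPM package names passed to `nipkg install` on the command line (check `nipkg help install` on your system for the exact syntax and any non-interactive flags; the package name below is hypothetical):

```python
import subprocess

# Sketch of "install B's declared dependencies, then load and build B".
# Nested dependencies are left to NIPM to resolve. 'runner' is injectable
# so the command line can be inspected or stubbed without running nipkg.
def install_dependencies(packages, runner=subprocess.run):
    """Install each declared dependency via the NIPM command line."""
    cmd = ["nipkg", "install", *packages]
    return runner(cmd, check=True)

# B's (hypothetical) direct dependencies; NIPM pulls in A's own dependencies.
b_dependencies = ["my-company-a-lvlibp"]
```

Injecting the runner keeps the "what do we install" decision testable separately from actually invoking NIPM on a build machine.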


GCentral
Message 1 of 5

What works for us:
- Distribute dependencies as packages (NIPM, VIPM or choco packages) - package is rebuild on commit

- If the application/library needs some dependencies, then CI will install the package (latest or a specific version)

 

It's quite fast (installing packages is usually very quick) and it's flexible (you can have multiple versions of a package and install the appropriate one for a particular project, not necessarily the latest).

 

So we basically follow the last paragraph from your post 😉

 

Message 2 of 5

Hey!

 

In our company we use Git submodules for most of the dependencies (source code) as long as they are not provided by some kind of package manager. We had this kind of discussion for our projects too, and right now we are quite happy with the submodule plus VIPM/NIPM package strategy.

 

A question you might ask yourself is: is it really necessary to always have the newest version of a dependency? How do you make sure an update in a dependent package didn't break code in your project? Just blindly using the most recent version of a dependency is something we almost never do. We always use a specific version of a dependency that is tested with the project; if there is an update, we have to update the submodule/dependency by hand and test it until we are sure nothing broke.
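That "specific tested version" policy can be made mechanical: record the pinned versions next to the project and fail the build on any drift. A minimal sketch (the map structure and names are illustrative only, not actual tooling):

```python
# Hypothetical pinned-dependency check: the build proceeds only if every
# installed dependency exactly matches the version tested with this project.
def check_pins(pinned, installed):
    """Return a list of human-readable mismatches; empty means the build may proceed."""
    problems = []
    for name, wanted in pinned.items():
        have = installed.get(name)
        if have is None:
            problems.append(f"{name}: pinned {wanted} but not installed")
        elif have != wanted:
            problems.append(f"{name}: pinned {wanted} but found {have}")
    return problems

pins = {"A.lvlibp": "1.4.2"}  # versions this project was last tested against
```

Updating a dependency then becomes an explicit, reviewable change to the pin file rather than something that happens silently on the next build.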

 

It might be worth mentioning that we use GitLab and its CI pipelines to build almost all of our dependencies into some kind of VIPM or NIPM package. Right now we don't use .lvlibp, so those might need a different approach.

 

1. In our case, no - if the necessary files cannot be found, the build fails.

2. Fail.

3. Yes, and in our projects it is always a strict (exact) version of "A".

 

Be aware that our way of doing things might not work for you. It is just one way to do it, and it depends heavily on how your projects work and what kind of dependencies they use/need. In the end you need to find a solution that meets your needs and, most importantly, is easy to maintain.

Message 3 of 5

I have not found it to be fast or easy dealing with this.

 

VIPM has a tendency to crash at the drop of a hat, so I have to kill it to ensure I do not get a "VIPM timed out" error. I have to include this in my build script:

# Kill any running VIPM instance (ignoring errors if it is not running),
# then clear the error collection so the script's status stays clean.
Get-Process -Name "VI Package Manager" -ErrorAction SilentlyContinue | Stop-Process -ErrorAction SilentlyContinue
$Error.Clear()

 

 

My hope is that moving to a container solution would work better, but I have not got there yet.

 

Message 4 of 5

We make a point of keeping as many dependencies as possible directly in each project's repository. That way, for those "local" dependencies, this is not a problem. For very generic reuse code, we use .vip packages more and more. Some of our repos also make use of git submodules (although we try to limit that to specific use cases).

 

When building through CI, our Release Automation Tools can

- apply .vipc file(s) if configured to do so

- clone any number of repositories to any given location

 

This works nicely for us and for our customers. So, all in all, the answer is: No, we don't build dependencies "on the fly".




DSH Pragmatic Software Development Workshops (Fab, Steve, Brian and me)
Release Automation Tools for LabVIEW (CI/CD integration with LabVIEW)
HSE Discord Server (Discuss our free and commercial tools and services)
DQMH® (The Future of Team-Based LabVIEW Development)


Message 5 of 5