LabVIEW Idea Exchange

JackDunaway

Maintain Mutation History as part of Enum Type Definition

Status: New

This Idea was spawned from my quest for the über-migration experience when upconverting previous versions of enum data to the most recent version - in other words, providing backwards compatibility for data sources (networked clients/servers, databases, files...) that use a previous version of an enum. 

 


 

When you flatten an enum, the result is only the numeric value of the name-value pair. Once flattened, an enum is indistinguishable from a vanilla numeric and contains no additional information:

 

[Snippet images: flattening the "Scrabble Club Member" cluster and the resulting flattened string]

(Yes, the snippet works in 2010, even with the cameo, but it's slow because the "Favorite Scrabble Word" enum has 47k+ items)

 

Looking at the flattened value, the first 4 bytes (00000001) represent the value of the I32, the next two bytes represent the 6809th enum element (1A99), the next two bytes represent the decimal year in hex (0797), and the final two bytes represent the birth month enum index and the day of the month, respectively. That's all the bytes there are - 20 hex nibbles of flattened data to describe the value.
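To make that layout concrete, here is a minimal sketch in plain Python (not LabVIEW) that unpacks those 10 bytes. The field names are hypothetical, the last two byte values are made up since the post doesn't give them, and big-endian ordering is assumed to match LabVIEW's default flattening byte order:

    import struct

    # The 20 hex nibbles described above: I32 value, U16 enum index, U16 year,
    # then two single-byte fields. "0609" is invented purely for illustration.
    flattened = bytes.fromhex("00000001" "1A99" "0797" "0609")

    # Big-endian unpack: I32, U16, U16, U8, U8
    value_i32, word_index, birth_year, birth_month_index, birth_day = \
        struct.unpack(">iHHBB", flattened)

    print(value_i32)    # 1
    print(word_index)   # 6809 -- only the index survives; the item's name is gone
    print(birth_year)   # 1943

Nothing in those bytes says "enum", let alone which enum or which revision of it.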

 

Storing this data structure as-is will be problematic if the type definition of the Scrabble Club Member ever changes. Fortunately, a typedef can be converted into a class, and the class definition will natively retain data structure mutation history, meaning flattened object data can be upconverted from previous versions. Cool!

 

[Image: flattened class data, with the value bytes underlined in red and the version bytes underlined in blue]

 

Above, we can see that if the data structure is converted to a class, the red-underlined bytes represent byte-for-byte the same information as the flattened cluster, but flattened classes carry additional goodies buried in the data, such as the version of the data, underlined in blue.

 

From Preserving Class Data:

 

"LabVIEW, as a graphical programming environment, has an advantage over other programming languages. When you edit the class, LabVIEW records the edits that you make. LabVIEW is context aware as you change inheritance, rename classes, and modify private data clusters. This allows LabVIEW to create the mutation routine for you. LabVIEW records the version number of the data as part of the flattened data, so that when LabVIEW unflattens the data, LabVIEW knows how to mutate it into the current version of the class."

 

But what if you mutate one of the enums by adding items, removing items, rearranging items, or renaming items? Now you're in a pickle, because enums currently do not store mutation history as part of their definition.

 


 

Consider the simple enum, GreekLetters, as part of a project developed by So-So Engineering, Inc. The SSEI developers launch the code into production with version 1 of the enum: v1: { Alfa, Betta, Hero, Delta } (The So-So engineers aren't too good at Greek, but so-so engineering brings home the bacon at SSEI).

 

Throughout a long development cycle, every few months this enum is mutated to account for mistakes, new features, or obsolete features:
 
  1. v2: { Alpha, Beta, Gyro, Delta }
  2. v3: { Alpha, Beta, Gyro, Delta, Gamma }
  3. v4: { Alpha, Beta, Delta, Gamma }
  4. v5: { Alpha, Beta, Gamma, Delta }
The SSEI developers pounded out code in blissful ignorance until v4, when the app started outputting thingamabobs rather than whirligigs.
 
Their superstar programmer - boasting the shiny CLAD certificate he had toiled for 3 years to obtain - realizes that data from persistent logs written against previous enum versions is being interpreted incorrectly. With the new v4 enum, v1 data with index 2 is now interpreted as "Delta" instead of "Hero", and v2 data with index 3 is now "Gamma" instead of "Delta".
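A two-line sketch of the problem (Python used only for illustration; the lists stand in for the typedef's item names):

    V1 = ["Alfa", "Betta", "Hero", "Delta"]   # what the logs were written against
    V4 = ["Alpha", "Beta", "Delta", "Gamma"]  # what the app now has in memory

    logged_index = 2
    print(V1[logged_index])   # "Hero"  -- what the logged value originally meant
    print(V4[logged_index])   # "Delta" -- how the v4 application now reads it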
 
v5 comes around with a homebrew-upconverting-state-machine-mutator, which continually ranks higher among the nastiest code the guys at SSEI have ever seen as they slowly discover Epsilon in v6, Zeta in v7, Eta in v8...
 

 
I propose that this revision history, or mutation history, be stored with the control type definition of the enum itself. Further, a flattened enum should include a four-digit version (like classes) in addition to the flattened value.
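Roughly what that buys you, sketched in plain Python: this is only a conceptual illustration of version-tagged values plus recorded mutations, not a proposal for LabVIEW's actual flattened format, and all names here are hypothetical.

    GREEK_V5 = ["Alpha", "Beta", "Gamma", "Delta"]
    CURRENT_VERSION = 5

    # Recorded edits, one map per version bump: old index -> index in the next
    # version. Renames leave indices alone; a missing key means the item was deleted.
    MUTATION_HISTORY = {
        1: {0: 0, 1: 1, 2: 2, 3: 3},   # v1 -> v2: renames only
        2: {0: 0, 1: 1, 2: 2, 3: 3},   # v2 -> v3: "Gamma" appended
        3: {0: 0, 1: 1, 3: 2, 4: 3},   # v3 -> v4: "Gyro" (index 2) removed
        4: {0: 0, 1: 1, 2: 3, 3: 2},   # v4 -> v5: "Gamma"/"Delta" rearranged
    }

    def upconvert(stored_version, stored_index):
        """Replay the recorded mutations to map an old index to the current version."""
        index = stored_index
        for version in range(stored_version, CURRENT_VERSION):
            index = MUTATION_HISTORY[version][index]   # KeyError: item was deleted
        return index

    # v2 data with index 3 meant "Delta"; with the history it still lands on "Delta".
    print(GREEK_V5[upconvert(2, 3)])   # "Delta"

That per-edit map is essentially the same information the class mutation routine quoted above already records for private data clusters; the idea is simply to record it for the enum typedef as well.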
 
This Idea may be manifested as a new data construct altogether - the 'Enhanced Enum' or the 'Enum with Benefits' 😉 - or more elegantly it could be included as a new flag 'IncludeVersionHistory' in the existing enum construct.
 

 
As a final note, even if you think you're immune to this problem because you never write enums to persistent files to be read by future versions, you're probably still affected. A subVI that uses an enum BD constant will be improperly upconverted to the most recent version if it is not in memory while the enum goes through multiple mutations - say, a rearrange plus a rename - that cannot be implicitly determined without explicit mutation history.
 
If that explanation does not make sense (hey, it doesn't even make sense to me), just ask yourself this question: "Do default values of controls and BD constants sometimes get screwed up after rearranging, adding, or renaming enums?"
 
[Image: enum constants and control default values scrambled after editing the typedef]
 

 
Finally, while you're excited about the Enhanced Enum, also get excited about the Enhanced Enum supporting sparseness and different datatypes. Feel free to post questions/comments/insults in the comments section below or on the original thread.
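For a taste of what "sparseness and different datatypes" could mean, here is how a text language expresses the same idea (Python's enum module, with made-up names and values; LabVIEW enums today are contiguous unsigned integers):

    from enum import Enum

    class DeviceError(Enum):           # hypothetical names and values
        OK = 0
        TIMEOUT = -1073807339          # sparse, negative I32-style code
        CHECKSUM = 0x3FFF0005
        NOT_FOUND = "no such channel"  # a non-numeric value, for good measure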

 

14 Comments
X.
Trusted Enthusiast

They allow us to assign human-readable names to magic number values, but the number, not the name, is what's important.


 

I am probably one of those erring on the dark side here. In my suggestion for a generalized global variable, I would actually be relying on the fact that "gamma" of version 2 in Jack's original post is actually the same as "gamma" in version 3, so that case structures to which this typedef enum is connected react accordingly to the changes made to the enum. However, as part of the suggestion, I mentioned that the "Case Structure" object needs to be modified too. That is, an option to "Track Typedef Enum Changes" should be added to the case structure, AND the "Add case for every value" action would add cases IN THE ORDER SPECIFIED BY THE REFNUM.

Let me expand on this:

If you did not select the "Track Typedef Enum Changes" option, the case structure would just treat the typedef enum as a number. If you did select the option, each modification of the typedef enum would have an impact on the case structure. Inserting a value in the typedef enum would break the case structure (provided you had also checked the "Add case for every value" OPTION - note that I treat it as an option AND an action), forcing you to reuse "Add case for every value".

Modifying an existing element of the enum would result in the same behavior as currently.

Deleting an element would break the case structure (showing a red case as currently).

 

I am not sure that addresses (indirectly) Jack's request, but I thought this might be a related discussion. The point is that this does not require storing "mutation history" as I understand Jack's discussion, but puts the burden on a specific object (the "linked" case structure) to track changes to the typedef refnum as they happen. Of course, that also means you have to have those case structures in memory when editing your refnum; otherwise you would of course have no way of knowing about the revision history... well, you know why.

nanocyte
Active Participant

Here's a possible enum editor interface where you'd use arrows to specify the mutation history. I think this sort of interface would also be useful for class mutation, so the user can explicitly specify what should happen. It might be possible to automatically generate the arrows and then let the user optionally fix them as needed.

 

[Attached image: enum mutation.PNG]

 

 

Intaris
Proven Zealot

Or we could do exactly the opposite. (This suggestion probably won't gain me any friends)

 

If I want to go hardcore, I could suggest that any Enum whose values have ever been persisted to storage (which, for completeness' sake, we need to assume is ALL of them 😐) should - on deletion or renaming of any elements - essentially become a different typedef (and thus break wires between them) 😛

 

OK, that's draconian for most, but it would get the perils of the change across very quickly. Because there ARE perils to doing this. Pretending there aren't is just not true. Then we at least get to see WHERE the problems are (constants, control default values, Scan To/From String, load/save, globals, queues, and so forth) and how far-reaching the change is, and we can go about fixing them accordingly or settle for NOT doing that. I'm seriously unsure whether adding another hidden mutation to fix what is essentially NOT a language problem is better than simply throwing up your hands and saying: look, now we have 289 places where that edit to your Enum possibly causes problems... are you sure you want to do that? With direct feedback, we very quickly learn to choose the wiser path. And if we decide we DO want to go ahead, the broken wires guide us effectively through the entire process.

AristosQueue (NI)
NI Employee (retired)

Intaris has a point. I've been known to rename the typedef file so it is missing in order to find all the places in the code that need to be fixed. And a lot of us warn against using "Default" so that if you add enum elements, case structures will break.

 

(Note that there are still good reasons to have automatic mutation, generally when maintaining systems where the enum was DESIGNED to be expandable/changeable, but not all systems are designed that way, and in those cases, breaking is often better.)