I feel that the current implementation of Scan From String with "output" defaults comes close enough to this idea that, for me personally, it isn't worthwhile.
Concatenate the input array into a single string (with a "space" delimiter), wire it up to a "Scan From String", slap a %s %s %s %s %s %s %f %f format string on it, wire all of the defaults, and you have essentially the same thing (perhaps not as efficient, but string conversions are generally not the fastest things LabVIEW ever does anyway).
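For anyone who wants the gist in text form, here is a rough Python sketch of that workaround, purely for illustration; scan_from_string below is a made-up stand-in for the primitive, not any LabVIEW or Python API:

```python
# Join the string array with spaces, then scan the result back out against a
# %-style format string, falling back to the wired default when a token is
# missing. Hypothetical helper for illustration only.

def scan_from_string(text, fmt, defaults):
    casts = {"%s": str, "%d": int, "%f": float}
    tokens = text.split()
    outputs = []
    for i, spec in enumerate(fmt.split()):
        if i < len(tokens):
            outputs.append(casts[spec](tokens[i]))
        else:
            outputs.append(defaults[i])   # missing token -> use default value
    return outputs

strings = ["ch0", "ch1", "ch2", "ch3", "ch4", "ch5", "1.5", "2.25"]
joined = " ".join(strings)                # concatenate with a space delimiter
print(scan_from_string(joined, "%s %s %s %s %s %s %f %f",
                       ["", "", "", "", "", "", 0.0, 0.0]))
# ['ch0', 'ch1', 'ch2', 'ch3', 'ch4', 'ch5', 1.5, 2.25]
```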
What I COULD get behind is a default behaviour of "Scan From String" that autopopulates the "Format String" based on the defaults wired to the output nodes (as above), thus doing away with the need for the format string when all defaults are wired.
The major problem I have with this idea is that it's fundamentally not an "index array" function anymore. The index inputs have been replaced with type inputs, meaning that the array is being pseudo-unbundled in order of elements. Even if you added indexing inputs, though, what you are really asking for is an array-aware "Scan From String" function that automatically translates the array elements into the outputs in order of appearance. This is much like what Intaris describes above.
Incidentally, you CAN leave off the format string of the "Scan From String" primitive in LV2015 (not sure how many versions back this goes) and wire only the default nodes. LV will default to parsing the input as a whitespace-delimited string, with a format string matching the default node types. The primitive will of course throw an error if the input string has fewer arguments than you have output nodes, and will output the default data for the missing ones.
There are some problems with relying on this default behavior, such as the fact that if your input array contains any actual strings with whitespace, the parsing goes to Hades. There is, of course, the usual issue that "Scan From String" is strict about the formatting of numbers, so unlike "Decimal String to Number", trying to scan "2.1" into an integer will derail everything that follows.
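A quick Python approximation of that format-less default behaviour and its two failure modes, again purely illustrative and not the actual primitive:

```python
# Whitespace-delimited scan where output types follow the wired defaults;
# too few tokens is treated as an error, and strict numeric scanning
# rejects "2.1" as an integer.

def scan_with_defaults(text, defaults):
    tokens = text.split()
    if len(tokens) < len(defaults):
        raise ValueError("fewer arguments in input string than output nodes")
    return [type(d)(tok) for d, tok in zip(defaults, tokens)]

print(scan_with_defaults("42 3.14 hello", [0, 0.0, ""]))  # [42, 3.14, 'hello']

# Failure 1: an element that itself contains whitespace shifts every token
# after it, so later conversions land on the wrong pieces.
try:
    scan_with_defaults("hello world 3.14", ["", 0.0, 0])
except ValueError as err:
    print("whitespace inside an element:", err)

# Failure 2: strict integer scanning rejects "2.1" outright.
try:
    scan_with_defaults("2.1 rest", [0, ""])
except ValueError as err:
    print("2.1 into an integer:", err)
```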
Just to satisfy my curiosity about whether a primitive could be created that essentially performs the "Variant to Data" function with string scanning, I wrote up a VI that approximates this (with a "Variant to Data" function outside it to convert to the cluster wire type, something that could be built in with XNodes, I'm guessing?). I won't make any boasts about performance given the overhead of variant flattening and such, nor any guarantees about documentation (there is none), error handling, or the types of data supported. It just covers signed and unsigned integers, floats, and strings as input types. Finally, I apologize, but it does have OpenG dependencies.
Notwithstanding all that CYA, it seems to do a passable job of getting your array of strings into a formatted cluster with only one VI and the "Variant to Data" function.
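For readers who can't open the VI, here is a rough Python analog of the idea (not the posted VI itself): a dataclass stands in for the cluster, its field order and annotated types play the role of the cluster's default datatype, and constructing it plays the role of "Variant to Data". All names below are illustrative.

```python
from dataclasses import dataclass, fields
from typing import get_type_hints

@dataclass
class Cluster:                 # stand-in for the target cluster wire type
    name: str = ""
    channel: int = 0
    gain: float = 1.0

def strings_to_cluster(strings, proto_cls):
    """Coerce each string to the type of the matching cluster element, in order."""
    hints = get_type_hints(proto_cls)
    names = [f.name for f in fields(proto_cls)]
    values = {n: hints[n](s) for n, s in zip(names, strings)}
    return proto_cls(**values)

print(strings_to_cluster(["AI0", "3", "2.5"], Cluster))
# Cluster(name='AI0', channel=3, gain=2.5)
```

The real VI of course works on variants and OpenG cluster tools rather than anything like dataclasses; this is only meant to show the shape of the mapping.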
That's not what this is at all. This involves conversion from strings into native types. I encourage you to download my VI and drop my snippet above onto a block diagram (assuming you have the OpenG lib). It isn't the most efficient, but I believe it illustrates the challenge well.
Essentially, the request is for the Scan From String function to support a 1D string array as input and a cluster as output, with the format automatically determined from the cluster's default datatypes and the bundling done according to cluster element order.
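The "format determined from the cluster defaults" part boils down to something like this (hypothetical Python helper, not an existing LabVIEW feature): walk the element defaults in order and emit the matching specifier for each.

```python
# Map each element's default value to a format specifier, in cluster order.
FORMAT_FOR_TYPE = {str: "%s", int: "%d", float: "%f"}

def derive_format(cluster_defaults):
    return " ".join(FORMAT_FOR_TYPE[type(v)] for v in cluster_defaults)

print(derive_format(("", 0, 0.0)))   # "%s %d %f"
```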
As for scenarios where this idea would come in handy: I often read random elements of my ASCII config files, which come back to me as an array of strings. CBlum's screenshot nicely shows the clutter that builds up in most of my INI cases. Kudos!