06-25-2015 07:56 PM
@RunningWreck wrote:
But if you're multiplying matrices with vectors, it does make sense to transpose a 1-D array (or vector), so that the dimensions of your multiplication operation will match. How can I transpose a vector (1D array) so that I can multiply it correctly with a matrix?
You don't have to transpose your vector. If it's on the left of the multiplication, it will be treated as a row; if it's on the right, it will be treated as a column.
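For readers more comfortable with text-based code, NumPy's matrix multiply follows the same convention; the sketch below is only an analogy to the LabVIEW behavior described above, not LabVIEW itself.

```python
import numpy as np

A = np.arange(6.0).reshape(2, 3)   # a 2x3 matrix
v = np.array([1.0, 2.0, 3.0])      # a 1D array: neither a row nor a column

# With the 1D array on the right, it is treated as a column vector.
print(A @ v)          # result has shape (2,)

# With a (compatible) 1D array on the left, it is treated as a row vector.
w = np.array([1.0, 2.0])
print(w @ A)          # result has shape (3,)
```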
03-20-2018 02:58 PM - edited 03-20-2018 03:13 PM
03-20-2018 03:11 PM
@alexalexalexalex wrote:
Nope.
It matters if you want to use the Build Array VI. If two 1D arrays are in rows, it will give you an appended 1D array. If they are in columns, it will give you a 2D array. So it matters a lot!
This is an old post, so I hesitated responding, but I do feel the need to correct the above statement. A 1D array is just a 1D array in LabVIEW; it is not a row or a column. It doesn't matter if you expand it one way or the other to see more elements, it's functionally the same.
When using Build Array with two 1D array inputs, those inputs will become the rows of a new 2D array. You get the concatenated 1D array by right-clicking and selecting "concatenate inputs", not by playing with your input arrays.
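If it helps to see the same distinction outside LabVIEW, here is a rough NumPy analogue (illustrative only): stacking two 1D arrays as rows versus concatenating them.

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

# Default Build Array behavior: each 1D input becomes a row of a 2D array.
as_rows = np.vstack([a, b])        # shape (2, 3)

# "Concatenate inputs" behavior: one longer 1D array.
joined = np.concatenate([a, b])    # shape (6,)
```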
02-01-2019 08:08 AM
This is helpful. I would like some clarification, if you don't mind? I'll give an example:
I have a 2D array of "Low Voltage Leakage Currents" and a 2D array of "High Voltage Leakage Currents". I would like to put these into a cluster, then output that cluster to Excel. Each leakage value is associated with a specific "Pin" (I am measuring leakage of digital pins on a MMIC). I have a 1D array that holds the "Pin Names". Right now, I am turning the 1D array into a 2D array, such that each "Pin Name" has its own column. This way I can use the "Pin Names" array as a label at the top of my Excel file for each leakage current. All of these 2D arrays are put into a cluster at the output of my VI, so the VI has a single output.
Is this the right way to go about this, or is there an easier way to accomplish this task? My end goal is to associate each leakage value with a pin name in Excel format on the output. The leakage values always come out as 2D arrays, each column being a new leakage and each row being a specific site (in my case I am using just a single site, so it will be a 1-row, N-column 2D array). The channel names always come out as a 1D array, but if I save it in the cluster as a plain 1D array and output it to Excel, I am worried the channel names will be saved as N rows and 1 column and will not line up with the leakage values.
02-01-2019 08:17 AM
Let's see some code!
What functions are you using?
If you are using Write to Spreadsheet File, now called Write Delimited Spreadsheet, you can wire in a 1D array, which will give you the headers across the top.
Then wire your 2D array into a second call of the function (make sure you set Append? to True). If needed, transpose the 2D array before you feed it in.
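If it helps to see the resulting file layout spelled out, here is a minimal Python sketch of the same write pattern; the pin names, values, and file path are made up for illustration, and in LabVIEW this would be the two Write Delimited Spreadsheet calls described above.

```python
import csv

pin_names = ["Pin_A", "Pin_B", "Pin_C"]          # hypothetical header row
leakage = [[1.2e-9, 3.4e-9, 5.6e-9]]             # hypothetical 1 site x N pins

with open("leakage.csv", "w", newline="") as f:  # hypothetical path
    writer = csv.writer(f)
    writer.writerow(pin_names)    # first write: the 1D header row
    writer.writerows(leakage)     # second write: the 2D data appended below
```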
02-01-2019 09:03 AM
Normally that is exactly what I would do. In this case, however, I am going to use this VI as a sequence step in TestStand 2017, which lets me sequence through various VIs and save the outputs as local variables in a test container that I can grab in subsequent sequences. This way I can carry information like temperature, lot ID, die ID, x-coordinate on wafer, y-coordinate on wafer, etc., so I know which die the leakage values correspond to and under what conditions.
I am already saving this data using "TSM_Publish.vi", which stores it in STDF format inside TestStand. That is useful for production test and limits/binning, but I also want my VI to be useful for characterization, which means the data is better saved/formatted in Excel files for pivot charts and datasheet graphs. I will be exporting the aforementioned cluster into a CSV file using a sequence step in TestStand that I have created, but I'll also be saving information like temperature, LOT No., Die No., etc., so that my CSV file is parametric and I can pivot-chart different temperatures, lots, die, etc. against the leakage information, all available in the test container. I'll have to see whether the cluster with the leakage values and pin names is saved appropriately inside the parametric CSV file when I grab the information from the test container in TestStand.
I guess I could just have the VI itself write to a spreadsheet file just like you described, and then edit that information using another VI that grabs the local variables I need to append/add to that CSV for parametric analysis.
02-25-2019 01:00 AM - edited 02-25-2019 01:05 AM
Why not just use the handy built-in Transpose 1D Array primitive?
12-03-2025 10:35 AM
I would recommend rethinking your strategy slightly. Generally, the best practice for this sort of thing is to store the data in memory in whatever form is best for working with it, display it in whatever form is best for display, and save it in whatever form is best for the file.
Trying to make your data a cluster so that it writes to Excel is already giving you headaches. First figure out how to wrangle the data in memory- maybe it's a 2D array, maybe an array of clusters, maybe it's objects, a database, an in-memory TDMS file, whatever. Then, decide how you want the data to look in Excel, and make a VI that converts your "optimized for usage" storage method into a spreadsheet-compatible format.
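As a text-based sketch of that separation (Python standing in for a LabVIEW typedef plus a "format for report" subVI, with made-up field names), it might look something like this:

```python
import csv
from dataclasses import dataclass

# Hypothetical in-memory record: store the data the way the program uses it...
@dataclass
class PinMeasurement:
    pin_name: str
    low_v_leakage: float
    high_v_leakage: float

def write_report(records: list[PinMeasurement], path: str) -> None:
    """...and convert to the spreadsheet layout only at save time."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Pin", "LV leakage (A)", "HV leakage (A)"])
        for r in records:
            writer.writerow([r.pin_name, r.low_v_leakage, r.high_v_leakage])
```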
Your comment "I'll have to see if the cluster [...] is saved appropriately inside the parametric CSV file" makes me think you're trying to lay out the data in memory to write as one big blob into the spreadsheet file. That's nearly always a bad idea; it's fine for quick-and-dirty stuff, but you really want it to be more robust for production usage.
12-03-2025 12:02 PM
@BertMcMahan wrote:
I would recommend rethinking your strategy slightly. [...]
It seems this ancient thread gets revived about every 5 years or so. Most of the posters haven't been seen in years. 😄
12-03-2025 12:06 PM
I was very surprised to see a response 6 years later! I did end up changing my approach :), pretty much in the way Bert recommended. I now have a dedicated function that converts from the optimal data structure in LabVIEW to the particular Excel spreadsheet format I want.