NI TestStand


General doubts about testing

Hi all, while 99.9% of the posts here regard details about the features of NI products, my question is about the general methodology of testing.

I recently received the task of testing the firmware that runs on an 8-bit microcontroller with 16 kB of ROM that controls a brushless motor.

Unfortunately management tossed the task at me, and no one is able to explain what testing actually involves.

There are no system/software requirements specifications written down on paper, so what am I supposed to test?

I am supposed to test something against a list of desired behaviours/constraints, but this list does not exist.

I am afraid I will have to write this list myself.

I have no budget for hardware/software purchases, and I do not know what I am expected to test, how deeply, or what documentation I am expected to produce.

 

I started to read everything I could find on the web about testing, but I am rather disappointed that almost all the material deals with PC software applications that run on the PC only and are in no way linked to the outside world.

Many, many papers are about testing web-based applications.

 

My questions are: how am I supposed to conduct the testing?

For sure LabVIEW and its countless hardware/software options are a great help, but in general, what am I supposed to do?

 

Isolate every function in the firmware and test it on the target by feeding inputs and logging outputs?

Do I also have to run a simulation on a PC?

Do I have to connect the micro's I/O to LabVIEW hardware and simulate the hardware/motor behaviour?

Do I have to document every test I run and the way I run it, and make it repeatable?

Everyone in my company is aware that firmware testing is good and avoids trouble in the future, but no one has the slightest idea how to do it.

So far we test the whole system, for example endurance tests in a climatic chamber and stressing the system in every possible way, but no one is able to go deeper into the firmware.

As you can see, I am rather disappointed about how this has started, and I am not sure anything good will come out of it, because when the inputs are vague or wrong, you can't expect good outputs.

They just tell me "TEST IT", while no one knows what testing is in detail.

 

So far I have no budget for how much I can spend on hardware/software.

I received an NI 9172 cDAQ with thermocouple and 4-channel analog input modules that was bought 3 years ago and then put away because no one used it.

I bought some analog output and I/O modules to try to simulate the external hardware and see what is possible to do.

 

We evaluated TestStand as well, but so far I have no idea how TestStand will be of any help at this stage.

They think that buying a tool will solve problems.

At this stage I can do everything with normal LabVIEW VIs.

It is not really clear to me what TestStand can do that a VI, or a project of VIs, cannot.

 

That is why I am asking you all: how do you, in general, do testing/validation/verification of software/hardware?

If you look around, there is little or no information about how to test firmware.

 

I think in the end, as is usual in the software world, everyone will do it their own way.

 

I'm hoping someone will throw out some ideas/suggestions about general practices, standards, and the time/money/resources involved in testing.

Maybe a good discussion/idea exchange will arise.

 

I see that you all on this board are concerned about LabVIEW behaviour in detail: why function A doesn't do B, why my LED doesn't switch on, why TestStand doesn't work, and so on.

Everyone seems to know perfectly well what their test system is supposed to do, and the problems are about details.

My problem is the opposite.





Message 1 of 4

Wow, I can feel your frustration!

 

Rather than expound on the mistakes (no specifications, no structure, etc.), I'll try to give you a bit of direction.

 

Check out some books by Watts S. Humphrey on the software process.  It sounds like your company has a lot to learn about getting its process in line with the rest of the industry.

 

DESIGN:

1. Create a specification as to what your product is intended to do.  (The design engineer should do this, but in lieu of any help, do it yourself.)

      If you can read the code, try to build a diagram of what the product is intended to do.

      Interview the designer: what was his design goal, and what is the code supposed to do?

      Try to break the specification up into sections.

          Inputs (what are the "gozintas"?)

          Outputs (what are the "gozoutas"?)

          Processing (what functions are intended for the product)

      At this point, consider it a "work in progress".  Keep updating it as you work through the stages of the process.

 

2. Create a set of VERIFICATION (test bench) test cases that you can use to verify each of the functions.

    You should be able to create a set of tests for each of the specifications, to verify that the product is performing correctly.

    Don't forget to create negative test cases (what if...).

    Also add VALIDATION (real-world, actual customer use) tests.  How does the customer expect to use the product?
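
To make step 2 concrete, here is a minimal host-side sketch in C of one verification case and one negative case. It assumes the firmware function under test can be compiled on the PC, and clamp_speed_setpoint() with its rpm limits is entirely made up for illustration; your real functions and limits come from your specification.

/* Minimal host-side unit-test sketch (hypothetical example).
 * Assumes the firmware function under test can be compiled on the PC.
 * clamp_speed_setpoint() and its limits are made up for illustration. */
#include <assert.h>
#include <stdio.h>
#include <stdint.h>

#define RPM_MIN    0
#define RPM_MAX 6000

/* Stand-in for a real firmware function: clamp a requested speed setpoint. */
static int16_t clamp_speed_setpoint(int16_t requested_rpm)
{
    if (requested_rpm < RPM_MIN) return RPM_MIN;
    if (requested_rpm > RPM_MAX) return RPM_MAX;
    return requested_rpm;
}

int main(void)
{
    /* Verification cases: nominal inputs produce the specified outputs. */
    assert(clamp_speed_setpoint(1500) == 1500);
    assert(clamp_speed_setpoint(RPM_MAX) == RPM_MAX);

    /* Negative cases: out-of-range ("what if...") inputs are handled safely. */
    assert(clamp_speed_setpoint(-100)  == RPM_MIN);
    assert(clamp_speed_setpoint(32000) == RPM_MAX);

    printf("clamp_speed_setpoint: all test cases passed\n");
    return 0;
}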

 

3. Figure out what hardware I/O you will need for testing the product.

    What inputs (signals) will you need to drive the motor?

    What outputs (rpm, state, current, etc.) do you need to measure the device performance?

 

4. Build your test system, both hardware and software. (This can seem to be a daunting task when faced with it up front.)

    Try to design it with future flexibility in mind so you don't have to rebuild for each product.
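
As one (heavily simplified) illustration of the software side of step 4, the following C sketch uses the NI-DAQmx C API to grab a finite block of samples from one analog input of the cDAQ and apply a pass/fail limit to the average. The channel name "cDAQ1Mod1/ai0", the sample rate, the 0.1 V-per-amp current-sense scaling and the 2 A limit are all assumptions; adapt them to your modules, wiring, and spec.

/* Minimal NI-DAQmx C sketch: acquire 1000 samples from one analog input.
 * Channel name, sample rate and scaling (0.1 V per A on a current-sense
 * path) are ASSUMPTIONS for illustration only - adapt to your hardware. */
#include <stdio.h>
#include <NIDAQmx.h>

#define CHANNEL   "cDAQ1Mod1/ai0"   /* hypothetical module/channel */
#define N_SAMPLES 1000

int main(void)
{
    TaskHandle task = 0;
    float64    data[N_SAMPLES];
    int32      read = 0;
    char       err[2048] = {0};

    if (DAQmxCreateTask("", &task) < 0) goto fail;
    if (DAQmxCreateAIVoltageChan(task, CHANNEL, "", DAQmx_Val_Cfg_Default,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL) < 0) goto fail;
    /* 1 kS/s, finite acquisition of N_SAMPLES samples */
    if (DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, N_SAMPLES) < 0) goto fail;
    if (DAQmxStartTask(task) < 0) goto fail;
    if (DAQmxReadAnalogF64(task, N_SAMPLES, 10.0, DAQmx_Val_GroupByChannel,
                           data, N_SAMPLES, &read, NULL) < 0) goto fail;

    /* Example pass/fail check: average motor current below an assumed limit. */
    if (read > 0) {
        double sum = 0.0;
        for (int i = 0; i < read; i++) sum += data[i];
        double avg_amps = (sum / read) / 0.1;     /* 0.1 V per A (assumed) */
        printf("average current: %.2f A -> %s\n",
               avg_amps, (avg_amps < 2.0) ? "PASS" : "FAIL");
    }

    DAQmxClearTask(task);
    return 0;

fail:
    DAQmxGetExtendedErrorInfo(err, sizeof(err));
    fprintf(stderr, "DAQmx error: %s\n", err);
    if (task) DAQmxClearTask(task);
    return 1;
}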

 

TEST

5. After you have designed and started testing your product, write up issues (bugs) and track them through the system (Bug, fix, verify).

    I would suggest that you refer to them as "issues".  There will be things that are not optimal, and if you call them bugs, the engineer will say "no, that's the way I designed it" (not that it is right!).

    Update your specification and test cases as you go through the testing process.  (It's a learning experience.)

 

6. Track any field software failures to learn what you missed.

 

7. Take what you've learned and apply it to your next product design cycle.

 

LAST RESORT

As a last resort, if your company has no patience to allow you to build and follow a process, then go the VALIDATION route.  Connect the product up as the customer would use it and try to break it.  Try faulty inputs (open or shorted switches, voltage variations, temperature, connections, etc.) as a customer would.  Pay particular attention to anything that could even remotely be safety related.  Keep careful track of your tests, results, problems, and solutions.

 

It's all a learning curve, and it sounds like you're behind it...  It depends on how hard the lessons are and what they cost the company in rework and retrofits versus the process time up front.

 

Best of Luck,

 

Mike

 

Message 2 of 4

Hi,

 

Here are a couple of specific suggestions.

 

The people who designed and developed the product absolutely should tell you what tests need to be run and list what data should be collected. They may do this specifically or generally. You should then write a "system requirements specification", and you may need to go back and ask them more questions as you write it. This spec should have a list of all the tests and the steps to perform them, a list of test equipment, a list of all the data to be acquired, and, if possible, pass/fail limits for each measurement. It might also cover the user interface, the accuracy of the measurements, and so on. Then you need to submit it back to them to get it approved. Get everything in writing. This is your first step, and nothing else should be done without it.
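
As a sketch of how those pass/fail limits can be captured in a form the test software can check directly, here is a small C example; every test name, unit, and limit in it is a made-up placeholder, and the real values have to come from the approved spec.

/* Sketch of how the spec's pass/fail limits could be captured in a form the
 * test software can check directly. Every name, unit and limit below is a
 * made-up placeholder - the real values must come from the approved spec. */
#include <stdio.h>

typedef struct {
    const char *test_name;     /* as listed in the requirements spec */
    const char *signal;        /* what is measured                   */
    const char *units;
    double      low_limit;
    double      high_limit;
} test_limit_t;

static const test_limit_t limits[] = {
    { "Startup time",        "time to reach setpoint",  "s",   0.0,   2.0  },
    { "Idle supply current", "supply current",          "A",   0.0,   0.05 },
    { "Speed regulation",    "steady-state rpm error",  "rpm", -50.0, 50.0 },
};

static int check(const test_limit_t *t, double measured)
{
    int pass = (measured >= t->low_limit) && (measured <= t->high_limit);
    printf("%-20s %8.3f %-4s [%g..%g] %s\n", t->test_name, measured,
           t->units, t->low_limit, t->high_limit, pass ? "PASS" : "FAIL");
    return pass;
}

int main(void)
{
    /* Example usage with made-up measurements. */
    check(&limits[0], 1.3);
    check(&limits[1], 0.08);
    check(&limits[2], 12.0);
    return 0;
}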

 

If you have to start from scratch and recommend all the tests yourself, that might mean you need to spend a month going through the drawing package, doing some research, and in some cases getting a BS in electrical or mechanical engineering. 🙂 If you don't feel comfortable doing this, I'd be honest with them.

 

This link has a system requirements spec template, but really you can use any format.

http://www.ni.com/automatedtest/guides.htm, then click on the Documentation Templates.

 

If you really are testing just the firmware, you could "detach" the micro from the product and then wire it up to some test equipment that is running a model of the motor, the "plant". This is hardware-in-the-loop testing: http://zone.ni.com/devzone/cda/tut/p/id/10345. You could also ask them whether they have already tested the firmware in software only.
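
To give an idea of what a plant model can be at its simplest, here is a toy first-order motor model in C. The gain and time constant are invented numbers, and a real brushless-motor model would be considerably richer, but the same kind of update function can run on the PC (software in the loop) or on real-time hardware driving the micro's inputs (hardware in the loop).

/* Toy "plant" model sketch: a first-order approximation of motor speed
 * responding to a PWM duty cycle. The gain and time constant are invented
 * numbers; a real brushless-motor model would be considerably richer. */
#include <stdio.h>

#define K_RPM_PER_DUTY 6000.0   /* assumed steady-state gain: 100% duty -> 6000 rpm */
#define TAU_S          0.15     /* assumed mechanical time constant [s] */

/* One simulation step: duty in [0..1], dt in seconds, returns new speed. */
static double plant_step(double rpm, double duty, double dt)
{
    double rpm_target = K_RPM_PER_DUTY * duty;
    return rpm + (rpm_target - rpm) * (dt / TAU_S);
}

int main(void)
{
    double rpm = 0.0, dt = 0.001;            /* 1 ms steps */

    /* Apply a 50% duty step and watch the simulated speed settle. */
    for (int i = 0; i <= 1000; i++) {
        rpm = plant_step(rpm, 0.5, dt);
        if (i % 100 == 0)
            printf("t=%.3f s  rpm=%.1f\n", i * dt, rpm);
    }
    return 0;
}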

 

cc

Message 3 of 4

Hi, downow and Mike.

 

I really appreciate your answers. I think I'll print them out and keep them beside the screen.

 

Right now I don't have time to reply extensively, but I'll do so shortly.

 

Just to reply to Mike: I have ordered a 400-page book about embedded software testing,

Testing Embedded Software

by Bart Broekman, Edwin Notenboom

 

It will surely have the effect of showing them that someone wrote a 400-page book about testing, and that it's not as simple as they think. They claim it can be done from a 5-minute chat of suggestions or a one-page paper someone wrote in 10 minutes. I'll show you later the page that, for them, is how to do testing.

 

For now, thanks to everyone.





Message 4 of 4