What does the device/code certification process at your job look like?
When new hardware comes in, or a new software update is released for an existing platform, what kind of testing is done to "validate" that the device/code performs as expected?
Do you document your results in a formal tool? If so, what is it called?
I'd like to compare how my company does things to others in the industry to see where we can improve.
My employer is very certification-heavy: we certify every small change, whether it's a software update or a new device. For example, when a new update comes out for a Nexus 9K, we deploy the update in the lab, execute hundreds of tests against the device manually, and record the results one by one in a document. Those results are then uploaded into an in-house tool. Given the high number of touch points, the certification process takes more than a week to complete and is highly prone to user error. We are in the process of using pyATS/Genie and Robot Framework to develop automated test cases, but once those tests are complete and the results are generated, it's not obvious to me how best to document them, or how best to report on the currently "approved" devices, software releases, and configurations.
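For context, here is a minimal sketch of the direction I've been imagining: have each automated run emit one structured, machine-readable record instead of a hand-filled document. Everything in it is an assumption for illustration only (the field names, the placeholder version string, the "approved means all tests passed" rule); none of this is native pyATS/Genie or Robot Framework output.

import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

# Hypothetical schema for a single certification run. The field names are
# assumptions for this sketch, not anything our in-house tool requires today.
@dataclass
class CertificationRecord:
    device_model: str          # e.g. "Nexus 9K"
    software_release: str      # image/version under test
    configuration: str         # label for the config baseline used
    executed_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    results: dict = field(default_factory=dict)   # test name -> "pass"/"fail"

    @property
    def approved(self) -> bool:
        # Assumed rule for this sketch: approved only if every test passed.
        return bool(self.results) and all(
            outcome == "pass" for outcome in self.results.values()
        )

# Example: results as they might be collected from an automated run
# (pyATS/Genie or Robot Framework), reduced to test name -> outcome.
record = CertificationRecord(
    device_model="Nexus 9K",
    software_release="10.3(4a)",        # placeholder version string
    configuration="baseline-v2",        # placeholder config label
    results={"show_version_check": "pass", "bgp_neighbors_up": "pass"},
)

# Serialize to JSON so a reporting layer or the in-house tool can ingest
# one artifact per certification run instead of a manually typed document.
payload = {**asdict(record), "approved": record.approved}
print(json.dumps(payload, indent=2))

The appeal of something like this is that "which devices, releases, and configurations are currently approved" becomes a query over stored records rather than a manually maintained report, but I'd like to hear how others actually handle it.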
How does your team handle this?