Michael L. Behm - Cedar Park TX, US Steven R. Farago - Austin TX, US Brian L. Kozitza - Georgetown TX, US John R. Reysa - Austin TX, US
International Classification:
G06F 17/50
US Classification:
703/15
Abstract:
A system for selecting a test case is disclosed. A test case with a high score is selected, and a simulation job is run on a device under test across a plurality of processors using the selected test case. Simulation performance and coverage data are collected for the selected test case, and the collected simulation performance and coverage data are stored in a database.
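The selection-and-record loop described in the abstract can be sketched as follows; all names (`select_test_case`, the score field, the in-memory `results_db` standing in for the database) are illustrative assumptions, not details from the patent.

```python
import random

def select_test_case(test_cases):
    """Pick the test case with the highest score (ties broken arbitrarily)."""
    return max(test_cases, key=lambda tc: tc["score"])

def run_simulation(test_case, num_processors=4):
    """Placeholder simulation job; returns synthetic performance/coverage data."""
    return {
        "test_case": test_case["name"],
        "processors": num_processors,
        "cycles": random.randint(1_000, 10_000),
        "coverage": round(random.random(), 3),
    }

results_db = []  # stands in for the database in the abstract

cases = [
    {"name": "tc_alu", "score": 0.82},
    {"name": "tc_fpu", "score": 0.91},
    {"name": "tc_lsu", "score": 0.67},
]
chosen = select_test_case(cases)
results_db.append(run_simulation(chosen))
```

In a real flow, the stored performance and coverage data would feed back into the scores used by the next round of selection.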
Clustering Simulation Failures For Triage And Debugging
- Armonk NY, US John Reysa - Austin TX, US Mohamed Baker Alawieh - Austin TX, US Brian Kozitza - Georgetown TX, US Erica Stuecheli - Austin TX, US Tuhin Mahmud - Austin TX, US Divya Joshi - Bangalore, IN
International Classification:
G06K 9/62 G06N 20/00 G06F 16/906 G06F 17/50
Abstract:
Methods, systems, and computer program products for clustering simulation failures are provided. Aspects include receiving simulation data comprising a plurality of simulation failure files, generating a token for each simulation failure file of the plurality of simulation failure files, determining a token score for each token for each simulation failure file of the plurality of simulation failure files, normalizing each token score for each token in the plurality of simulation failure files utilizing a weighting scheme to create a normalized token score for each token, determining a set of groups for the plurality of simulation failure files, and assigning one or more simulation failure files from the plurality of simulation failure files into a group in the set of groups based at least in part on the normalized token scores.
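A minimal sketch of the tokenize-score-normalize-group pipeline the abstract walks through. The abstract only says "a weighting scheme"; the TF-IDF-style normalization and the group-by-top-token rule below are assumptions chosen for concreteness.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(failure_text):
    """Split a failure log into lowercase word tokens."""
    return re.findall(r"[a-z0-9_]+", failure_text.lower())

def score_failures(failure_files):
    """Raw per-file token counts, normalized with a TF-IDF-style weighting
    (an assumed weighting scheme) so tokens common to every failure fade out."""
    counts = [Counter(tokenize(t)) for t in failure_files]
    n = len(failure_files)
    df = Counter()                      # document frequency of each token
    for c in counts:
        df.update(c.keys())
    return [{tok: tf * math.log(n / df[tok]) for tok, tf in c.items()}
            for c in counts]

def group_failures(scores):
    """Assign each failure file to a group keyed by its highest-scoring token."""
    groups = defaultdict(list)
    for i, s in enumerate(scores):
        top = max(s, key=s.get) if s else ""
        groups[top].append(i)
    return dict(groups)

logs = [
    "assert fail in alu stage",
    "assert fail in fpu pipe",
    "timeout waiting for lsu",
]
groups = group_failures(score_failures(logs))
```

Failures sharing a distinctive token (a signal name, an assertion ID) land in the same group, which is the triage behavior the clustering aims for; a production system would use a proper clustering step rather than the top-token shortcut.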
- Armonk NY, US Mohamed Baker Alawieh - Austin TX, US Brian L. Kozitza - Georgetown TX, US John R. Reysa - Austin TX, US Erica Stuecheli - Austin TX, US
International Classification:
G06F 17/50
Abstract:
A method, computer program product, and a fail recognition apparatus are disclosed for debugging one or more simulation fails in processor design verification that in one or more embodiments includes determining whether a prediction model exists; retrieving, in response to determining the prediction model exists, the prediction model; predicting one or more bug labels using the prediction model; determining whether a fix is available for the one or more predicted bug labels; and simulating, in response to determining the fix is available for the one or more predicted bug labels, the fix for the one or more predicted bug labels.
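The decision flow in this abstract (model exists? → predict labels → fix available? → re-simulate with the fix) can be outlined as below. The `StubModel`, `predict` interface, and `fix_table` lookup are hypothetical stand-ins, not the patent's actual apparatus.

```python
def triage_fail(fail_signature, model=None, fix_table=None):
    """Sketch of the abstract's flow: if a prediction model exists, predict
    bug labels; if a fix is available for a predicted label, re-simulate
    with that fix. Return value encodes the next action."""
    if model is None:
        return "train-model"           # no model yet: fall back to training one
    labels = model.predict(fail_signature)
    fixes = [label for label in labels if label in (fix_table or {})]
    if not fixes:
        return "manual-debug"          # no known fix: escalate to a human
    return f"resimulate-with:{(fix_table)[fixes[0]]}"

class StubModel:
    """Hypothetical prediction model returning a fixed bug label."""
    def predict(self, fail_signature):
        return ["bug-123"]
```

The useful property is that known, already-fixed bugs are filtered out automatically, so engineers only see fails the model cannot map to an existing fix.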
- Armonk NY, US Bryan Ronald Hunt - Cedar Park TX, US Stephen McCants - Austin TX, US Tierney Bruce McCaughrin - Georgetown TX, US Brian Lee Kozitza - Georgetown TX, US
International Classification:
G06F 9/50 G06F 9/48
Abstract:
A job submission technique includes a set of algorithms that provide automated workload selection to a batch processing system that has the ability to receive and run jobs on various computing resources simultaneously. The job submission technique provides for organizing workloads, assigning relative ratios between workloads, associating arbitrary workload validation algorithms with a workload or parent workload, associating arbitrary selection algorithms with a workload or workload group, defining high priority workloads that preserve fairness, and balancing the workload selection based on a current status of the batch system, validation status, and the workload ratios.
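One way the ratio-balancing step above could work is a deficit rule: submit from the valid workload whose share of currently running jobs falls furthest below its configured ratio. The field names and the deficit heuristic are assumptions for illustration.

```python
def pick_workload(workloads, running):
    """Choose the valid workload furthest below its configured ratio.

    workloads: list of {"name", "ratio", "valid"} dicts (assumed schema).
    running:   mapping of workload name -> number of jobs currently running.
    """
    total = sum(running.values()) or 1          # avoid divide-by-zero when idle
    eligible = [w for w in workloads if w["valid"]]
    if not eligible:
        return None
    # deficit = desired share minus actual share; the largest deficit wins,
    # which keeps long-run submissions near the configured ratios
    return max(eligible, key=lambda w: w["ratio"] - running.get(w["name"], 0) / total)

workloads = [
    {"name": "regress", "ratio": 0.7, "valid": True},
    {"name": "bringup", "ratio": 0.3, "valid": True},
]
running = {"regress": 9, "bringup": 1}   # bringup is under its 30% share
```

Here `pick_workload(workloads, running)` selects `bringup`, since regress already exceeds its 70% share; validation status gates eligibility before the ratio balancing is applied.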