W05_SJP_Development of a Scoring Model for the GAO Ten Best Practices Part 2

Issue Identification and Appraisal

In the W04 blog, the scoring model best suited to the GAO’s Ten Best Practices was determined; however, it left two items outstanding: i) the criteria to be used, and ii) what the acceptance criteria would be. This week is part two of our problem statement, “How to develop a scoring model based on the GAO’s Ten Best Practices.”

Feasible Alternatives

The challenge is determining the selection criteria to be used, as the ‘acceptance criteria’ should be uniform across the data ranges. There are four alternatives for the selection criteria: i) the individual checklists at the end of each Best Practice section, ii) Appendix II, iii) Appendix VI, and iv) a hybrid of i, ii and iii in a more concise form.

Develop the Outcomes for each

As the criteria sets all come from the same document, there is little we can do in terms of selection other than look at the potential benefits of each of the four alternatives. While any of them could be used, a hybrid version offers potential advantages.

Table 1: Benefits of each Alternative

Selection Criteria

The GAO Schedule Assessment Guide provides several criteria sets:

  • Individual checklists for the Ten Best Practices – these can be found on the following pages of the document: 25/26, 46/47, 61/62, 68/69, 73/74, 89, 96/97, 119/120, 132/133, 146/147.
  • Appendix II: An auditor’s key questions (pages 151 to 165).
  • Appendix VI: Standard Quantitative Measures for Assessing Schedule Health (pages 183 to 188).
  • A hybrid developed from the three above.

Analysis and Comparison of the Alternatives

Using the three existing data sets plus the hybrid, a simple analysis was performed with regard to meeting the GAO guidance, ensuring we have the correct number of points to monitor (ideally 100 points, so that each criterion contributes 1% to the overall score).
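As a rough illustration of that point allocation, the Python sketch below shows how a pass/fail result per criterion maps to an overall percentage score. The criterion names and pass/fail results are hypothetical placeholders, not items taken from the GAO guide or from Table 2.

```python
# Minimal sketch of the point-allocation idea: with 100 criteria in total,
# each criterion that a schedule satisfies contributes exactly 1% to its score.
# The criterion names and pass/fail results below are placeholders, not items
# from the GAO guide or from Table 2.

def schedule_score(results):
    """Return the overall score as the percentage of criteria met."""
    total = len(results)
    met = sum(1 for passed in results.values() if passed)
    return 100.0 * met / total

if __name__ == "__main__":
    # Hypothetical pass/fail results for a 100-criterion hybrid checklist.
    results = {f"criterion_{i:03d}": (i % 4 != 0) for i in range(1, 101)}
    print(f"Criteria met: {sum(results.values())} of {len(results)}")
    print(f"Each criterion is worth {100.0 / len(results):.1f}% of the total")
    print(f"Schedule score: {schedule_score(results):.1f}%")
```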

An excerpt of the criteria, showing all four data sets for Best Practice 2, is given in the table below.

Table 2: Criteria for Alternatives showing the Hybrid

Analysis of the alternatives is shown below; there were not many variables to score against.

Table 3: Analysis Results of the Alternatives
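For completeness, a minimal weighted-sum sketch of how such a comparison can be mechanised is given below. The attribute names, weights and scores are illustrative placeholders only; they are not the values recorded in Tables 1 and 3, and the simple weighted-sum form may differ from the scoring model selected in the W04 blog.

```python
# Sketch of a simple weighted-sum comparison of the four alternatives.
# The attribute names, weights and scores are illustrative placeholders only;
# the values actually used are those recorded in Tables 1 and 3.
WEIGHTS = {"conciseness": 0.4, "coverage": 0.4, "ease_of_scoring": 0.2}

ALTERNATIVES = {
    "Individual checklists": {"conciseness": 2, "coverage": 4, "ease_of_scoring": 3},
    "Appendix II":           {"conciseness": 3, "coverage": 3, "ease_of_scoring": 3},
    "Appendix VI":           {"conciseness": 4, "coverage": 2, "ease_of_scoring": 4},
    "Hybrid":                {"conciseness": 4, "coverage": 4, "ease_of_scoring": 4},
}

def weighted_score(scores):
    """Weighted sum of the attribute scores for one alternative."""
    return sum(WEIGHTS[attr] * value for attr, value in scores.items())

if __name__ == "__main__":
    ranked = sorted(ALTERNATIVES, key=lambda name: weighted_score(ALTERNATIVES[name]), reverse=True)
    for name in ranked:
        print(f"{name}: {weighted_score(ALTERNATIVES[name]):.2f}")
```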

Selection of Preferred Alternative

Due to the ambiguity of some of the item descriptions in the individual checklists, Appendix II and Appendix VI, the preferred alternative is the hybrid list.

Using the hybrid alternative for Best Practice 2 and adding the acceptance criteria, the initial sheet is shown in Table 4 below.

Table 4: Acceptance Criteria (Hybrid Model)

Research into several professional software packages (Deltek, XER Toolkit, Schedule Analyzer) confirmed that the ‘acceptance criteria’ fall within the ranges used by those tools, albeit they support the DCMA assessment.
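To show how such acceptance criteria could be applied mechanically, here is a small sketch that checks schedule-health metrics against acceptance ranges. The metric names and threshold values are assumptions made purely for illustration; they do not reproduce the values in Table 4, nor those used by the commercial tools.

```python
# Sketch of applying acceptance criteria to schedule-health metrics.
# The metric names and threshold ranges are assumed for illustration; they are
# not the values adopted in Table 4, nor those used by Deltek, XER Toolkit or
# Schedule Analyzer.
ACCEPTANCE_CRITERIA = {
    # metric name: (lower bound, upper bound), expressed as percentages
    "missing_logic_pct": (0.0, 5.0),
    "high_float_pct": (0.0, 5.0),
    "negative_float_pct": (0.0, 0.0),
}

def evaluate(metrics):
    """Return pass/fail for each metric against its acceptance range."""
    return {
        name: ACCEPTANCE_CRITERIA[name][0] <= value <= ACCEPTANCE_CRITERIA[name][1]
        for name, value in metrics.items()
    }

if __name__ == "__main__":
    # Hypothetical results for a single schedule.
    sample = {"missing_logic_pct": 2.1, "high_float_pct": 7.5, "negative_float_pct": 0.0}
    for metric, passed in evaluate(sample).items():
        print(f"{metric}: {'PASS' if passed else 'FAIL'}")
```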

Monitoring Post Evaluation Performance

The scoring method is now complete, and the next step is to perform a detailed analysis on a schedule to determine the results. The acceptance criteria must continue to meet industry requirements, so periodic checks on the system are needed.

Once the system has been fully tested and the results verified, a demonstration to Company management needs to be organized, with a view to incorporating key metrics into future contracting strategies.

References

  • GAO (United States Government Accountability Office). (2015). GAO-16-89G Schedule Assessment Guide.
  • Guild of Project Controls. (2015, October). Guild of Project Controls Compendium and Reference (CaR) | Project Controls – planning, scheduling, cost management and forensic analysis (Planning Planet). Retrieved June 14, 2017, from http://www.planningplanet.com/guild/gpccar/managing-change-the-owners-perspective
  • Ishizaka, A., & Nemery, P. (2013). Multi-criteria decision analysis: Methods and software [Kindle edition]. John Wiley & Sons.
  • W04_SJP_Development of a Scoring Model for the GAO Ten Best Practices – Achieving Guild of Project Controls / AACE Certification BLOG [Web log post]. (n.d.). Retrieved from https://js-pag-cert-2017.com/w04_sjp_Development of a Scoring Model for the GAO Ten Best Practices/
  • Deltek Acumen 8.0 [Computer software]. (2016).
  • Schedule Analyzer for the Enterprise, Build 4.4.312 [Computer software]. (2016).
  • XER Schedule Toolkit, version 132-0-0-19 [Computer software]. (2016).
 

One Reply to “W05_SJP_Development of a Scoring Model for the GAO Ten Best Practices Part 2”

  1. AWESOME job, Steve….!!!

    Going to be interesting to see if your scoring criteria for the hybrid model are going to be too strict or not strict enough…..

    But you are definitely heading in the right direction so carry on!!!

    BR,
    Dr. PDG, Orlando, Florida

     
