
Contestant Process Overview

Please view this document in its entirety in PDF format at

This document provides an overview of the process and deadlines associated with the Scheduling Contest.

The process includes four major stages:

  • Registration and assignment of test environment resources
  • Preparation and submission of contestant solution for evaluation
  • Testing:  Steps 1, 2A, and 2B
  • Final determination and announcement of awards

The flowchart on the following page illustrates the process flow of the Contest.  The remainder of this document expands on some of the key events in the flowchart.

Contest Opens and Contestant Registers

The Contest will open on December 14, 2012.  The Contest Web Site will contain a number of documents (including this overview document) that provide the necessary technical and administrative information for participants.  Registration is done via the Contest Web Site.  Upon receipt of a registration submission, the Department of Veterans Affairs (VA) will review it to ensure eligibility.  If the registration submission meets all requirements, it will be validated and passed to the VA technical team for action.  Note that registration will be accepted only until Noon on May 13, 2013, as indicated on the flowchart.

Virtual Machines Assigned

The VA’s technical team will assign three virtual machines (VMs) to each contestant.  Further details, including specific VM configurations and the format of the contestant-specific access information to be provided via email, are available on the Contest Web Site.  One of the items of information required in the registration is the contestant’s preference for Windows or Linux on two of the three machines.  This preference will dictate the exact configuration of the machines provided.

No real patient information will be used during the preparation or testing of submissions. Contestants should not include any personally identifiable information (PII) or third party confidential information in their submissions.

Contestant Produces Submission

Depending upon the promptness of their registration, contestants will nominally have several months to create and integrate their scheduling package solution with the VistA instances provided in the virtual machines.  In the process, contestants are allowed to modify the source code and database of the VistA instances.

A document available on the Contest Web Site provides a list of specific scheduling scenarios that will be used to evaluate the level of “Compatibility with open source VistA”.  These scenarios must be addressed on each of the three virtual machines provided, demonstrating compatibility with three different versions of VistA.  For one of those scenarios, an example automated software script will be provided (one in each virtual machine) that executes the preparation steps of the scenario, executes the required scheduling tasks, and evaluates whether each task was performed correctly.  A description of the example script will be available in a document on the Contest Web Site.

Using this example script as a guide, contestants will implement automated scripts for each of the scenarios described in the scenario document.  Each of those scripts must be automated in a way comparable to the provided example and must be configured so that it can be run multiple times and produce consistent results; that is, each script must either return the system to its pre-execution state or handle the changing baseline configuration in a meaningful fashion.  These automated scripts will be used during Step 1 testing.
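The prepare/execute/verify/restore pattern these scripts must follow can be sketched as below.  This is an illustrative Python sketch only: the in-memory `schedule` dictionary stands in for the VistA instance, and all function names are hypothetical, since the actual interfaces depend on the contestant's solution and the example script provided on each VM.

```python
# Minimal sketch of the setup / execute / verify / restore pattern the
# Contest requires of each scenario script. The in-memory `schedule`
# dict stands in for persistent scheduling state; names are hypothetical.

schedule = {}  # stand-in for the scheduling database

def prepare_scenario():
    # Capture the pre-execution state so it can be restored later.
    return dict(schedule)

def run_scheduling_task():
    # Stand-in for driving the scheduling package, e.g. booking a slot.
    schedule["2013-06-01T09:00"] = "PATIENT,TEST"

def verify_result():
    # Check that the task produced the expected state.
    return schedule.get("2013-06-01T09:00") == "PATIENT,TEST"

def restore_baseline(saved):
    # Return to the pre-execution state so repeated runs are consistent.
    schedule.clear()
    schedule.update(saved)

def run_scenario():
    saved = prepare_scenario()
    try:
        run_scheduling_task()
        return verify_result()
    finally:
        # Restoring the baseline is what lets the script be run
        # multiple times with consistent results, as the Contest requires.
        restore_baseline(saved)
```

Because the baseline is restored in a `finally` block, the script leaves no residue even when verification fails, so repeated runs start from the same state.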

Contestants will also be required to create the appropriate input for Step 2 testing, which addresses Attachments A, B, and C of the Contest Announcement (Scheduling-Centric, VA-Specific, and Non-Functional requirements, respectively).  Details on these requirements are contained in the Contest Rules.

Note that, in order to be judged, contestants must contribute the open APIs and any open source content in their entries to the Open Source Electronic Health Record Agent (OSEHRA).

Contestants must provide (in open source form) the code that integrates open source VistA with their scheduling package, the test scripts that demonstrate the Step 1 scenarios, and any modifications that they may have made to the VistA instance(s). This does not imply that the solutions provided by contestants must be open source.  Contestants may submit non-open-source scheduling solutions.  However, the integration code, tests, and any modifications to VistA code must be open source. Contestants must also provide detailed instructions on how to run the test scripts that demonstrate each one of the scenarios defined in the TCMS – Step 1 Test Cases.docx document. Evaluators will use these instructions during Step 1 testing.

Some Contest documents will be delivered using a required directory on each one of the three VMs.  The Contest Web site contains directions for creation of this directory, along with a checklist document that calls out the specific contents required. The documents in this directory provide additional information to the VA technical team necessary to support the evaluation of the submission.
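As an illustration of the kind of completeness check the checklist implies, the following Python sketch reports which required files are absent from a submission directory.  The function and its arguments are hypothetical; the official directory location and the actual list of required files are those specified in the checklist document on the Contest Web Site.

```python
# Hypothetical sketch: report which checklist entries are missing from
# the required submission directory. The path and file names passed in
# are placeholders, not the official ones.
from pathlib import Path

def missing_files(submission_dir, required_names):
    """Return the checklist entries not present in submission_dir."""
    present = {p.name for p in Path(submission_dir).iterdir()}
    return [name for name in required_names if name not in present]
```

Running such a check on each VM before uploading the Submission Letter would catch omissions early, since submissions missing required files fail Step 1 testing without further evaluation.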

Contestant Submits Submission Letter

Per the Contest Rules, the contestant must declare its submission ready for evaluation by uploading a Submission Letter to the Contest Web Site.  This may be done at any time up to the deadline of 12:00 Noon EDT on June 13, 2013.  Note that the Submission Letter must be received by VA before that deadline – otherwise the contestant’s submission will not be evaluated.

Virtual Machine Frozen (Access Closed) and Step 1 Evaluation

At 12:00 Noon on June 13, 2013 (or earlier if the contestant uploads a Submission Letter to the Contest Web Site before the deadline), the VA technical team will take control of the contestant’s virtual machines and remove contestant access.  At this point the VA technical team will initiate the evaluation of that submitted solution. Contest rules require that the contestant commit to not modifying any of the components of the submission that may remain under its control (for example, external servers, or Software as a Service (SaaS) components) from this time until the completion of the evaluation period.

Prior to commencing Step 1 testing, the VA technical team will verify that the special submission directory is present in each virtual machine and contains all of the required files specified in the checklist document. Submissions that do not have all required files will fail Step 1 testing without further evaluation.

Step 1 is focused on verifying “Compatibility with Open Source VistA” as defined in the Contest Web Site document on that topic.  The evaluators will follow the instructions provided by the contestant on how to run the automated scripts that demonstrate each of the required scenarios on each of the three virtual machines provided.  Evaluators will execute each automated script multiple times to verify reproducibility of results and will also inspect the open source components of the submitted solution, including at least the integration code, test scripts, and modifications to VistA.
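The reproducibility check amounts to executing each scenario script several times and requiring identical outcomes.  A hypothetical helper (not part of the official evaluation tooling) might look like this, where `run_scenario` is any callable that runs one scenario script and returns its result:

```python
# Hypothetical sketch of a reproducibility check: run a scenario
# several times and require every outcome to match the first.
def results_are_reproducible(run_scenario, runs=3):
    outcomes = [run_scenario() for _ in range(runs)]
    return all(outcome == outcomes[0] for outcome in outcomes)
```

A script that fails to restore its pre-execution state (or otherwise handle the changing baseline) would typically produce a different outcome on the second run and fail such a check.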

The output of Step 1 will be the set of results obtained by running the tests for each of the individual scenarios.  These elements will be provided to the VA judging panel for a pass-fail decision.

Step 2A Evaluation

Submissions that have successfully passed Step 1 will proceed to Step 2 testing.  Step 2 consists of two separate evaluations, 2A and 2B. Step 2A focuses on evaluating whether the submitted solution satisfies the requirements listed in Attachments A, B, and C of the Contest Announcement (Scheduling-Centric, VA-Specific, and Non-Functional requirements, respectively).

These Step 2A features do not have to be demonstrated via automated scripts.  However, contestants are required to provide written instructions on how evaluators should interact with the submitted solution to verify whether it satisfies each one of the requirements.  Evaluators may also ask contestants to participate in interactive events (remote or in person) to demonstrate some of those functionalities.

Testing for the requirements of Attachments A and C will be pass-fail.  Testing for Attachment B requirements will result in up to 120 points as described in the Contest Rules.

Step 2B Evaluation

Step 2B evaluation focuses on the open source content and quality of submission.  A Contest Web Site document entitled “Criteria for Evaluation of Open Source Compatibility” describes the criteria and points associated with this step.  The output of this step will be a written report summarizing the open source quality of each submission and points awarded.  Up to 30 points may be awarded in Step 2B, as described in the Contest Rules.

Contest Results Evaluated and Announced

The VA judges will review all supplied test results and make a determination of award(s).  Final results will be announced no later than September 30, 2013.