Instances of agreement between raters for nine outcomes (Omaha System knowledge, behavior, and status scores at discharge for Income, Mental health, and Substance use problems) were calculated for experts (in pairs and as a group) and for the experts compared with the agency. Inter-rater reliability of the scores was evaluated using the intraclass correlation coefficient (ICC). The ICC quantifies the degree of agreement among multiple observations of the same problems and reflects the percentage of score variance attributable to different sources. ICC values greater than a conventional cutoff are considered to represent acceptable inter-rater agreement (Streiner & Norman, 2008). The data collection period was three consecutive days. At the end of the third day, a post-processing meeting was convened to review the data collection process and any challenges encountered.
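To illustrate the kind of calculation involved, the following sketch computes one common ICC variant, the two-way random-effects, absolute-agreement, single-rater form often written ICC(2,1), from a subjects-by-raters score matrix. This is an illustrative implementation only; the text does not specify which ICC form or software was used, so the choice of ICC(2,1) and the function name `icc_2_1` are assumptions.

```python
import numpy as np

def icc_2_1(ratings):
    """Illustrative ICC(2,1): two-way random effects, absolute agreement,
    single rater. `ratings` is an (n_subjects, k_raters) array of scores.
    This variant is an assumption; the source does not name the ICC form used.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Mean squares from a two-way ANOVA without replication
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((ratings - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # ICC(2,1): subject variance relative to total variance, penalizing
    # systematic rater differences (absolute agreement)
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

With perfect agreement the ICC is 1; a constant offset between raters lowers it, because the absolute-agreement form counts systematic rater differences as disagreement.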