Grant Selection and Performance
September 1, 2017
PRESENT: Aaron Dombroski, Ed Barrett, Beth Lambert, Nicki Pellenz, Joe Young, M. Crofton, P. Fitzpatrick
The meeting began at 8:30 am by teleconference.
1. FY 18 Federal NOFO Overview
Below is a list of the major changes from last year’s NOFO:
* ASN Living allowance was increased due to an increase in the VISTA living allowance.
* Maximum cost per MSY was increased proportionally to the increase in the ASN living allowance.
* Application was streamlined with new page limits of 12 or 15 pages depending on the applicant.
* Learning memo added.
* Not providing the Segal Education Award amounts as Pell Grant amounts have not yet been set.
* Eliminating criteria that were redundant (intervention is covered in the Theory of Change and Logic Model and in the Cost Effectiveness and Budget Adequacy narrative).
* Eliminating criteria that did not differentiate applicants or are more appropriately addressed post-decision and pre-implementation (Member Training and Commitment to AmeriCorps Identification).
* Evidence Base points are now awarded based on evidence tier plus capacity to collect and use data, not evidence tier alone. This will level the playing field for applicants regardless of focus area and level of evidence.
* Added an Organizational Capacity criterion: Culture that Values Learning.
* Eliminated the narrative portion of the Cost Effectiveness and Budget Adequacy section. The section will be scored based on the budget submitted.
* Point values within the Program Design section were redistributed.
2. Revised Mandatory Performance Measures for ME AmeriCorps Grantees
When CNCS instituted standardized national performance measures in 2011, it directed grantees to measure only the impact of direct service activities. The stated reasoning was that the two purposes were required under the law, and data collected in other places (number of enrolled members, count of volunteers) would be sufficient to indicate whether programs were complying. MCCS felt differently.
In 2012 and again in 2015 (total 2 grant cycles), MCCS directed grantees to use a member development performance measure that counted the number of members with professional development plans (output) and the number of plans successfully completed (outcome). The thinking was this would let each program customize the content to suit the focus of their program.
Likewise, for community capacity building, MCCS directed grantees to help every host site agency develop a plan for improving volunteer management (output) and as the outcome, measure how many sites increase implementation of the essential volunteer management practices.
The member development measure set has not proven to be worded well. The output revision proposed below is a modified standard national measure used by programs providing employment skill training to community members. It is expected to tie grantee training plans more tightly to measurable AmeriCorps member development (skills, knowledge, abilities).
All research about volunteer engagement (recruitment through completion of service assignment) shows the quality of an organization’s volunteer management directly impacts volunteer retention. The revision for community capacity building links number of community volunteers recruited or managed and the number of hours those volunteers served with the outcome of improved volunteer management. It also modifies the language of G3-3.14 so the outcome measure for recruited volunteers is an increase in services by the host site. Then, the outcome for managed volunteers is linked to volunteer management based on improving use of essential practices.
Output: Number of completed professional development plans with at least two improvement goals (1 personal, 1 program)
Outcome: Number of members with improved skills, knowledge, and abilities needed to carry out service assignment responsibilities and tasks.
Output: Number of AmeriCorps program training and other formal development activities that result in increased AmeriCorps member skills, knowledge, and abilities related to the service assignment (community, tasks, and sector).
Outcome: Number of AmeriCorps members demonstrating increased competency in skills or application of knowledge.
COMMUNITY CAPACITY BUILDING
Current alignment does not match up the number of volunteers recruited with the hours they served (an indicator that the volunteers did more than fill out applications). The same issue applies to volunteers managed. The outcomes are either implementation of volunteer management practices or host sites reporting they are more effective because of volunteers. The latter is a yes/no answer, which is weak evidence of actual change.
* OUTPUT A:
G3-3.1 Number of community volunteers recruited by AmeriCorps members or program
G3-3.7 Hours of service contributed by community volunteers who were recruited by AmeriCorps members or program
* OUTCOME A:
Number of additional service activities and/or units completed for organizations by volunteers recruited/managed by AmeriCorps members.
* OUTPUT B:
G3-3.2 Number of community volunteers managed by AmeriCorps members or program
G3-3.8 Hours of service contributed by community volunteers who were managed by AmeriCorps members or program
* OUTCOME B:
G3-3.3 Number of organizations fully implementing three or more new effective volunteer management practices as a result of capacity building services provided by AmeriCorps members
3. Proposal to Modify the Rating Rubric for Peer Reviewers and Task Force Reviewers
The staff would like to modify the rating rubric and alter the percentage of total points each rating receives in order to yield a stronger distinction between proposals of differing quality. The five-option rating scale has consistently produced scores that do not substantially differentiate good proposals from weak ones. “Satisfactory” has become a de facto neutral.
4. Administrative Procedures: Updated Annual Monitoring Procedures
As a result of the risk management assessment, monitoring tasks will be scheduled for the year. The following tasks are routine monitoring for all grantees and are augmented according to risk level.
* Monthly monitoring review of Periodic Expense Reports for over/under expenditure based on program budget.
* Quarterly monitoring review of Aggregate Financial Report, Income Report, Enrollment Cycle, Fill Rate, Performance target achievement, and narrative Grantee Progress Report.
* Annual site visit to review Member Records for compliance. If grantee is in the first year of a multi-year grant cycle, the fiscal system will also be reviewed.
* Annual monitoring visit/interview with Member Service location to include interaction with supervisor and member.
The sample size will be determined based on the number of members a program has and the program's risk level. If the sample reveals instances of noncompliance or evidence that internal controls, policies, or procedures were not followed, the monitor will expand the sample to the next risk level's size. If that expanded sample reveals new instances, the monitor will conduct the review as if the grantee were high risk.
5. AmeriCorps Host Site Report
This report was prepared by Cecily C. to assess the long-term effect of hosting AmeriCorps members on the benefiting agencies/organizations. There are two versions of the report, one including and one excluding the Maine Conservation Corps. Both can be provided upon request.
6. Tentative Timeline for the FY18 AmeriCorps State and National Competitive Grant Application
The programs have approximately 8 weeks to write and submit their applications. There will be 2 Continuation reviews (USM-MCC and LearningWorks) and at least 1 Opioid Initiative application review. MCCS currently has no information on how many new applications to expect.
November 21: Submission date for all applications
November 27 - December 8: Peer Review, analysis of new applications, and performance review for the 2 continuing programs.
December 11th: All Task Force materials will be posted to the Hub including the Grants Officer’s Assessment.
The week of December 18th (exact date TBD): GTF Consensus Meetings for both Continuation and New applications. At least 3 GTF members are required to review each application, and members will have one week to review all provided materials prior to the GTF Consensus call.
January 5th or 8th (F or M): GTF Call
January 12th (F): Full Commission Meeting
January 17th (W): Grant Application Deadline
7. Audit findings for LearningWorks and TAKE 2
Both programs were contacted with a request for further documentation, and both provided the needed documentation in a timely manner. The Grants Officer will review the documents and respond to the programs with a list of findings and corrective actions to be taken. Although further investigation is needed, both programs are likely to have some compliance issues with member records.
8. New Date/Time for the GTF meetings
The GTF members showed no interest in changing the day/time for the GTF meetings.
9. Training for the new GTF members
In October, Maryalice C. and David D. will provide training for the new GTF members on the Task Force review process for both New and Continuing applications, to prepare for the upcoming competition.
The meeting ended at 9:40 am.