Appendix A: Indicator Framework

Process evaluation questions

Appropriateness

Evaluation questions Indicators Measure Data sources
What is the evidence of continued need for the program and role for government in delivering this program? (P1) Evidence that need is not being met by other programs for targeted cohort groups
  • evidence of perpetrator intervention programs reducing or preventing family violence
  • number of L17s
  • Royal Commission into Family Violence
  • literature review
  • Victorian crime statistics data
Inability to access MBCPs
  • wait list on MBCPs
  • reported pathway into MBCPs
  • other reported barriers to access
  • number of accepted participants who were deemed inappropriate for MBCPs
  • administrative data, including from MBCPs
  • literature review
  • trial referral data
Diversity of participants based on needs and circumstances
  • reason for program engagement
  • data collection tool
  • interview with peak body
  • interviews with referral agencies
Have the initiatives been implemented as designed? (P2) Realisation of delivery activities as outlined in submissions and program logic
  • whether activities and timeframes as outlined in submissions were realised
  • whether service providers keep appropriate case notes and records, perform intake, participate in the FVIS, and provide supervision and debriefing to service delivery staff
  • identification of barriers and enablers to implementation and how these were overcome
  • stakeholder consultations – service providers and government, and program participants
  • program documentation including reports and submissions
  • FSV program data
  • program logic
How are the initiatives innovative and contributing to best practice? (P3) Evidence of innovative program features and contribution to best practice
  • presence of innovative and best practice features in case management and intervention trials
  • stakeholder consultations – service providers and program participants
  • literature review

Effectiveness

Evaluation questions Indicators Measure Data sources
Are there early positive signs of change that might be attributable to the program? (P4) Increase in feelings of safety and support among people who experience violence
People who use violence report understanding the factors contributing to their behaviour, and how it impacts others
  • reported feelings of safety and support at baseline compared to follow-up
  • changes in participants' views on their responsibility in perpetrating violence or using force
  • stakeholder consultations – people who experience violence, people who use violence
  • data collection tool
To what extent are the outputs being realised? (P5) Uptake of programs among people who use violence and people who experience violence
  • number of people who use violence attending interventions
  • number of families involved in Aboriginal based programs
  • extent to which the participant numbers are as expected
  • document review and program administrative data
Have people who use violence and people who experience violence responded positively to the program, including enrolment, attendance/retention and satisfaction? (P6) Increase in people accessing the programs
  • number of enrolments across programs at the organisational level
  • attendance rates across programs, including changes over time at the individual and organisational level
  • attendance rates at programs compared with other MBCPs
  • reasons participants report not attending programs
  • program administrative data
  • stakeholder consultations – service providers and program participants
Increase in referrals
  • number of referrals providers receive, from where, and changes over time
  • stakeholder consultations with referral agencies
Reduction in number of referrals not taken up for case management and intervention programs
  • number of referrals not taken up decreases over time
  • program administrative data, including from MBCPs
People who use violence report satisfaction with the program
  • participants' views on what they liked, did not like, and found most and least helpful in the programs
  • stakeholder consultations – perpetrators and women who use force, and service providers
What are the barriers and enablers to effective referral of participants? (P7) Number of referrals and drivers of this
  • number of referrals providers receive, from where, and changes over time
  • stakeholder consultations with referral agencies
  • FSV program data
What governance and partnership arrangements have been established to support the implementation of the initiatives and are these appropriate? (P8) Presence of governance and partnership arrangements and attitudes toward these
  • presence and use of reference group or equivalent
  • presence of monitoring and reporting system to FSV and DHHS
  • program documents
  • stakeholder consultations – government and service providers
Frequency and nature of FSV and DHHS’s interaction with service providers
  • number and type of contacts/communications between FSV/DHHS and service providers, and the perceived value of these
  • program documentation
  • stakeholder consultations – service providers
Do the program workforces have a clear idea of their roles and responsibilities? (P9) Stakeholders report a clear understanding of their role in program delivery
  • presence of position descriptions, terms of reference, project plans, service agreements
  • stakeholder understanding
  • program documentation
  • stakeholder consultations
What components of the model are perceived to be the most valuable? (P10) Identification of enablers
  • features that service providers, government, victim survivors and program participants identify as enablers and of most value
  • stakeholder consultations – all stakeholders
What improvements to the service model could be made to enhance its impact? (P11) Identification of barriers and improvement opportunities
  • barriers and improvement opportunities reported by service providers, government, victim survivors and program participants
  • stakeholder consultations – all stakeholders
Have there been any unintended consequences, and if so, what have these been? (P12) Identification of unintended consequences
  • unintended consequences reported by service providers, government, victim survivors and program participants
  • stakeholder consultations – all stakeholders

Efficiency

Evaluation questions Indicators Measure Data sources
Has the department demonstrated efficiency in relation to the establishment and implementation of the program? (P13) FSV/DHHS resources were used efficiently to establish and implement the program
  • FSV/DHHS budget and FTE used to support program delivery
  • FSV/DHHS program implementation staff’s views on the resources required to effectively implement and monitor the programs
  • program documentation
  • stakeholder consultations – service providers

Impact evaluation

Appropriateness

Evaluation questions Indicators Measure Data sources

Are the programs responding to the identified need/problem? (I1)

Increase in perpetrators and women who use force accessing intervention programs and case management, including where they otherwise would not have (uptake)

  • reported access to similar programs prior to this intervention
  • number of program referrals
  • wait list on MBCPs
  • FSV program data
  • data collection tool

Perpetrators and women who use force report that the program has been appropriate for their needs

  • reported appropriateness of the programs
  • stakeholder consultations – perpetrators and women who use force

What are the design considerations of the program to support scalability? (I2)

Stakeholder assessment of program scalability

  • extent to which stakeholders believe the program could be scaled
  • reported enablers or barriers to scalability
  • stakeholder consultations

Effectiveness

Evaluation questions Indicators Measure Data sources

Have the program inputs, activities and outputs led to the desired change mapped out in the program logic? [1] (I3)

Service provider workers challenge violent, threatening and controlling attitudes and behaviours

  • service providers' reported ways of challenging violent, threatening and controlling attitudes and behaviours
  • stakeholder consultations – perpetrators and women who use force
  • stakeholder consultations – service providers
  • stakeholder consultations – victim survivors

Service provider workers encourage people who use violence to recognise the effects of their violence on others and take responsibility for their behaviour

  • service provider workers reporting how they have encouraged people who use violence to recognise the effects of their violence on others
  • service provider workers reporting how they have encouraged people who use violence to take responsibility for their behaviours
  • stakeholder consultations – perpetrators and women who use force
  • stakeholder consultations – service providers
  • stakeholder consultations – victim survivors

People who use violence report understanding the factors contributing to their behaviour, and how it impacts others

  • changes in participants' views on their responsibility in perpetrating violence or using force, at baseline compared to follow-up

Have program participants and victim/survivors responded positively to the program (enrolment, attendance, completion, satisfaction)? (I4)

As per the process evaluation question plus:

Number of enrolments, attendance rates and completion rates

  • proportion of participants who complete the programs
  • proportion of participants who complete the program compared to other MBCPs
  • number of enrolments across programs at the organisational level
  • decrease in referrals not taken up
  • attendance rates across programs, including changes over time at the individual and organisational level
  • attendance rates at programs compared with other MBCPs
  • program administrative data, including from other MBCPs
  • stakeholder consultations

What are the drivers for effective participant engagement in the programs? Does this differ according to the different cohorts? (I5)

Reasons for the increase in people accessing the programs

Reason for engagement in the program

  • reasons participants report not attending programs
  • reported reasons for continued engagement with the program
  • program administrative data
  • stakeholder consultations

What is the impact of the program on victims'/survivors' perceptions of safety? (I6)

Increase in feelings of safety and support among people who experience violence

  • reported feelings of safety and support at baseline compared to follow-up
  • stakeholder consultations – people who experience violence
  • data collection tool

What were the barriers and facilitators to the programs being integrated into the broader service system? (I7)

Stakeholders' views on system barriers and facilitators

  • identification of barriers and enablers
  • stakeholder consultations – all stakeholders

What impact has the program had on the management of risk associated with this cohort? (I8)

Providers' use and experience of MARAM (risk assessment framework)

  • providers reported use of the MARAM framework and its applicability to the interventions and case management
  • stakeholder consultations – providers
  • MARAM framework evaluation

Decrease in perpetrators' use of violence and women's use of force

  • reported frequency and nature of violence/use of force
  • program administrative data
  • stakeholder consultations – people who use violence and service providers

What impact has the program had on referral pathways and information transfer between community services and relevant authorities? (I9)

Increase or decrease in referral pathways for the programs

Comparison of referrals between regions with and without an Orange Door present

  • increase in program referrals
  • reduction in waitlist numbers for intervention programs
  • increase in program attendance rates
  • increase in program participation rates
  • difference in the number of referrals in regions where there is an Orange Door present compared to where there is not
  • program administrative data

What impact has the program had on the confidence, knowledge and skill of the case management and service delivery workforces in supporting the target cohort in the community? (I10)

Case managers report feeling confident in undertaking their role

  • reported confidence in working with people who use violence
  • stakeholder consultations – service providers

Are key stakeholders, including the program workforces, supportive of the model? (I11)

Stakeholders express support for the model

  • whether stakeholders agree with the design of the model
  • whether stakeholders think the program should continue/be expanded
  • stakeholder consultation

What would be the impact of ceasing the program (for example, service impact, jobs, community) and what strategies have been identified to minimise negative impacts? (I12)

Identification of the impact and mitigation strategies

  • number of people employed in the programs
  • adverse consequences of the program not existing
  • stakeholder consultations – service providers and government stakeholders

Efficiency

Evaluation questions Indicators Measure Data sources

Has the program been delivered within its scope, budget, expected timeframe, and in line with appropriate governance and risk management practices? (LP) (I13)

Extent to which the program was delivered with fidelity and within planned scope, budgets and timeframes

  • approved budget compared to costs incurred
  • original scope and any scope changes
  • planned and actual timeline of program delivery
  • program documentation
  • stakeholder consultations

Has the department demonstrated efficiency and economy in relation to the delivery of the program? (LP) (I14)

The program could not have been delivered in less time, or with fewer human or financial resources

  • total budget of the program
  • alignment with intended timeframes
  • program documentation
  • stakeholder consultation

The number of people who use violence referred to the program is as anticipated

  • number of people who use violence who accessed the program compared to the number that were estimated to access the program
  • program administrative data
  • provider submissions

Does the initial funding allocated reflect the true cost required to deliver the program? (I15)

Cost to deliver the program compared with original budget

  • approved budget compared to costs incurred
  • program documentation
  • stakeholder consultations

Italicised = lapsing program evaluation guidelines

'Program' refers to both the case management program and the perpetrator intervention trials

Italicised evaluation questions reflect those added by Deloitte Access Economics, in addition to the lapsing program guidelines and the questions posed by Family Safety Victoria in the RFP


[1] This question aligns with the lapsing program evaluation question: What is the evidence of the program’s progress toward its stated objectives and expected outcomes, including alignment between the program, its output, departmental objectives and any government priorities?
