Section Four:  How do I monitor?

Overview of the PD monitoring process: build a logic model, create a monitoring plan, and run an after-action review—roles, workflows, and best practices.

An overview of the Public Diplomacy monitoring process

R/PPR recommends that PD staff complete the PD monitoring process for a subset of PD section activities.  Guidance on what to monitor is included in the What should I monitor? section.

In this toolkit, we use the term PD monitoring process as shorthand to refer to the three main steps you should take to monitor your activities.  

For each initiative or section activity your section chooses to monitor, you will do three things to complete the full monitoring process.  

  1. Complete a logic model – A logic model is a planning tool that visually represents the logical relationships among the inputs, activities, outputs, and outcomes of a section activity.  A logic model should provide a snapshot of how staff think a section activity will work and illustrate the "if-then" (causal) relationships between its elements (i.e., how activities lead to outputs, and how outputs lead to outcomes).  A simplified, hypothetical example follows this list.  See the Logic Model section for more information.  
  2. Create a monitoring plan – A monitoring plan is a tool that articulates a section activity’s indicators, data sources, targets, and actuals.  An indicator shows progress or change in an output or outcome from your logic model.  See the Monitoring Plan section for more information.
  3. Conduct an after-action review (AAR) – An AAR is a debrief tool designed to help a team reflect and learn, and to document and prioritize specific management actions and timeframes for making changes.  It provides space for the activity team and stakeholders to reflect on a section activity’s implementation and identify what to keep or improve in future activities.  See the After-Action Review section for more information.
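
The sketch below is a minimal, hypothetical illustration of how a logic model's elements and their "if-then" chain might be recorded for one invented section activity (a speaker series); the activity name, elements, and values are assumptions for illustration only and are not drawn from the toolkit's templates.

```python
# Minimal, hypothetical sketch of a logic model's elements for an invented
# PD section activity.  All names and values below are illustrative only.
logic_model = {
    "activity": "Speaker series on media literacy",   # hypothetical activity
    "inputs": ["PD staff time", "speaker fees", "venue"],
    "activities": ["Recruit speakers", "Host three public talks"],
    "outputs": ["3 talks delivered", "150 attendees"],
    "outcomes": ["Attendees report improved ability to identify misinformation"],
}

# Walk the "if-then" (causal) chain: inputs enable activities, activities
# produce outputs, and outputs contribute to outcomes.
for earlier, later in zip(["inputs", "activities", "outputs"],
                          ["activities", "outputs", "outcomes"]):
    print(f"If {earlier}, then {later}: {logic_model[earlier]} -> {logic_model[later]}")
```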

After you use the monitoring selection criteria to identify which section activities to monitor, the next step is to determine who is responsible for completing the PD monitoring process.  For section activities conducted by implementing partners, the implementing partner is responsible for completing the logic model, monitoring plan, and the AAR in collaboration with the PD section.

For activities that the PD section directly implements, PD section staff are responsible for completing the logic model, monitoring plan, and the AAR.  In PD sections with staff who are fully or partially dedicated to M&E, those staff generally lead the monitoring work in close coordination with program staff.  In sections without dedicated M&E support, however, the team directly responsible for implementing the section activity, whether an implementing partner or the PD section itself, should complete these tasks.  That team may receive help from others in the section who have additional expertise in planning or whose role involves strategy, research, data collection, or coordination of M&E efforts.

For more ideas about how this works in practice, review Appendix B: Monitoring for grants and PD-implemented section activities.

How do I incorporate monitoring into my PD section’s everyday workflows?

General best practices

Regular monitoring, through consistent feedback and data flow, aids decision makers in effectively managing PD portfolios.  Regular monitoring also helps maintain institutional memory during staff changes by creating a record of data over time.  While finding time for monitoring can be challenging, PD sections can begin by integrating the following practices into their routine meetings, workflows, and planning cycles.  

  • Dedicate time to identifying objectives, outcomes, and outputs.  
    • The logic model is the basis for most monitoring work.  PD staff and implementing partners should define an objective, expected outputs, and expected outcomes for a section activity they will monitor as early in the planning process as possible.

In planning discussions, make sure everyone agrees on the desired outcomes of the section activity (e.g., changes in audience knowledge, attitudes, skills, or behaviors).  Identify the data you will collect to track progress and manage the activity.  Specify expected outputs, such as the number of recruits or attendees, along with corresponding targets.
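
As a minimal sketch of the kind of output-and-target data this discussion should produce, the example below records one hypothetical indicator with its data source, target, and actual value, and flags any shortfall for follow-up; the indicator, data source, and numbers are invented for illustration.

```python
# Hypothetical monitoring plan entry: one indicator with its data source,
# target, and actual.  All values are invented for illustration.
indicator = {
    "indicator": "Number of attendees across the three talks",
    "data_source": "Sign-in sheets",
    "target": 150,
    "actual": 132,
}

# Compare the actual against the target and note anything worth discussing
# later, for example at the after-action review.
shortfall = indicator["target"] - indicator["actual"]
if shortfall > 0:
    print(f"Target missed by {shortfall}; flag for discussion at the AAR.")
else:
    print("Target met or exceeded.")
```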

  • If your activity is grant-funded, include the completion of logic models, monitoring plans, and AARs as part of the grant life cycle.
    • Provide grant applicants and implementing partners with templates and basic instruction on the monitoring process.  
    • In the pre-award stage, ask grant applicants to include a logic model with their proposal, and ensure that they work with you to complete a monitoring plan before implementation begins.  
    • In the implementation phase, work with implementing partners to ensure the data they collect directly tracks the outputs and outcomes covered in the logic model.  Make sure the monitoring plan clearly specifies when and how they will collect data.
    • Incorporate an AAR into the grant close-out process to review, reflect, and learn.  Use the close-out process to:  review data together; document lessons learned and best practices; and identify actions for future improvements, continuations, discontinuations, or changes.
  • Set aside time in periodic project meetings to review monitoring data.  Monitoring data is only useful if it is analyzed and put to good use.  Blocking off dedicated time to review data in regularly scheduled meetings, such as in weekly or monthly project meetings, is a critical step in learning from and using data to inform planning and adapt future programming.  
  • Demonstrate data use in leadership decisions.  Highlight data use in strategic, resourcing, or programmatic decisions whenever possible.  Incorporating data collected during the monitoring process demonstrates to the team that leadership makes decisions based on data, not subjective opinion, and it encourages others to adopt the same practices.
  • Reward successes in data-based decision making.  Create incentives for completing the monitoring process, such as formal or informal staff awards for research and data use.  Doing so will help create a culture of learning and data-based decision making that boosts effectiveness.