Track & Report

Best Practice: Feedback Collection, iSpace

“We got a much deeper understanding of how we could deliver the next iteration just by talking to people”

“As part of our ‘Unlocking Women and Technology Alumni’ programme, we noticed the data we gathered from feedback forms was all very positive. We realised that people weren’t writing long stories or giving constructive feedback.

We decided on a verbal feedback module to help derive the right information, using an informal setup to encourage these conversations. We organised a dinner, a lunch and a brunch for the ladies so that they could join when it was convenient for them.

The results were astounding: they shared insights on their challenges within the programme and we got a much deeper understanding of how we could deliver the next iteration.”

Favor Nma

Operations Manager
iSpace, Ghana, Africa


Why do we need to track D&I data?
  1. Even if you change nothing else, tracking D&I data increases awareness among the people doing the tracking
  2. Data helps to benchmark your hub’s progress over time
  3. Tracking is key to ensuring that you’re delivering content and addressing issues that are relevant to your hub’s needs

Our 5 Step Process To Tracking and Reporting

Who is involved at each step?

1. Executive decision-makers & Data/Reporting team

2. D&I Champion (approval from Exec)

3. D&I Champion (approval from Exec)

4. Middle Managers & Entire Team

5. Entire Team & Data/Reporting team

Use this page to drill down into each of these sections in detail.

Define Your Reporting Output

Create a centralised resource to capture all the D&I activities going on within your hub. It can be something as simple as a deck, or a commitment to produce a regular report or update for your community.


For whichever output you choose, define:
  1. Who is responsible for compiling the report(s)
  2. How frequently this format will be used for reporting
Some Examples Of Reporting Outputs
  • Community Newsletter
  • Community Townhall
  • Company Offsite
  • Partner Reporting
Assigning Reporting Accountability

Sit down with key members from each of your hub’s verticals (e.g. community team, programmes team, events team, communications lead) and define:

  1. What they’re currently tracking
  2. Where D&I can be baked into this process.

When defining your process, be sure to outline the following:

  • Responsibilities/Who tracks this
  • Channels to track via
  • Metrics to track
  • Questions to ask

Define Which Groups Are Underrepresented In Your Ecosystem

Why define these groups?

There are vast differences between nations and regions which determine which diversity issues are relevant. These issues, as well as your hub’s interaction with them, change over time and your approach to D&I should reflect these changes.

Understanding the unique D&I needs of your ecosystem is a crucial first step in helping to effectively engage with relevant groups to have the most impact.

Framework For Defining Underrepresented Minorities (URMs)

In this deck you’ll find frameworks to help you broaden how you think about D&I within your ecosystem and identify opportunities.

Set Goals

Draft a measurable strategic inclusion plan using goals and tangible metrics. Goals should be inspirational but also realistic enough to be achievable and incrementally implemented over time.

Change Catalyst Toolkit Part 4: Creating An Inclusive Culture has a concise step-by-step guide for developing an inclusion plan on pages 8-11.

Frameworks for Goal Setting

Use this deck to help you set goals with a data-driven focus.

Remember – D&I programs are started and managed by those already in power. These leaders decide the value of each goal and how to measure them. The number of diversity programs that have been designed and led by those who are directly affected by the issue is extremely low, so consider whether there is scope to involve the targeted groups in the goal-setting process and throughout.

Tracking Your Data 

Your aim should be to systematise the process of collecting and reviewing data, or to integrate D&I data into your existing data collection processes.

You’ll need to design a regular data collection avenue because a once-off measure will not be an accurate representation of attitudes over time.

Below are some suggestions of channels you can use to gather feedback and some data. You’ll find more detailed suggestions for metrics in each of the operational pages i.e. community, corporate programmes etc.

Establish a Tracking Process
  • Have a measurable index of goals
  • Assign responsibility and establish accountability
    • Your goals should be clearly assigned to individuals or teams who are responsible for their achievement
    • Check out this change catalyst guide for a framework on how you can do this.
  • Clearly define the process for reporting and responding to findings.
  • Bake D&I tracking into your hub’s existing operations, e.g. member surveys, feedback forms for events, etc.
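Once D&I questions are baked into your existing surveys and forms, rolling responses up into a measurable metric can be very simple. Below is a minimal Python sketch (with hypothetical field names and data, not a prescribed schema) of turning self-identified survey responses into a representation percentage you can track over time:

```python
from collections import Counter

def representation_report(responses, field="gender"):
    """Tally self-identified answers for one demographic field
    and return each group's share as a percentage."""
    counts = Counter(r.get(field, "Prefer not to say") for r in responses)
    total = sum(counts.values())
    return {group: round(100 * n / total, 1) for group, n in counts.items()}

# Hypothetical event feedback entries (field names are illustrative)
feedback = [
    {"gender": "Woman"}, {"gender": "Man"},
    {"gender": "Woman"}, {"gender": "Non-binary"},
    {},  # respondent skipped the question
]

print(representation_report(feedback))
```

Running the same tally after every event or survey cycle gives you the benchmark-over-time data described above; note that skipped answers are counted explicitly rather than discarded, since a high “prefer not to say” rate is itself a signal.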
Online Avenues

Potential touchpoints:

  • Registration Forms (e.g. Eventbrite)
  • Registration Upon Arrival
  • Feedback Surveys

Note! Don’t forget to mention how accessible your facilities are at these touchpoints, too. A line like “Our space is accessible for all wheelchair users. We also have some capacity to support those with visual and hearing impairments” might be the difference between someone coming to your event or not.


  • Check out SurveyMonkey’s D&I guide which contains survey templates, stories from D&I leaders and strategies for timing, sending, and optimizing D&I surveys.
Offline Avenues

Events are a great opportunity to gather data.

Potential touchpoints:

  • Hard copy event feedback survey like this
  • Live QR code
  • Visitor check-in app to track who visits your space to meet with members or attend events. This doubles up as a security measure and a data collection method.
Focus groups

Type of data: Mixed (Qual and Quant)

Facilitate focus group discussions with your hub members to gather information on the issues and challenges facing diverse talent.

Potential avenues for collection:

  • Sticky Note Feedback Sessions at events/workshops
  • Dedicated meeting arranged for the sole purpose of gathering data.
Exit interviews

Type of data: Qualitative

Exit interviews are a great way to supplement the findings from the quantitative data you’ve collected. They also combat the risk of non-completion by employees who are not engaged. HR should also make a note of any grievances to spot any D&I themes. Make sure there’s a system in place for reporting on and responding to findings.

Potential avenues for collection: HR exit interviews

Undertake an Inclusion Review within your Hub

Take the opportunity to gather feedback internally on what D&I issues are important to your Community. You can do this for example by conducting an inclusion review.

Why? These reviews can help identify differences in employee perceptions of D&I which can exist between or within demographic groups. Understanding these perspectives helps you identify your hub’s current D&I strengths and opportunities for improvement. It also helps ensure that any initiative you plan will meaningfully engage members of your community.

Have a look at SurveyMonkey’s templates to get an idea of the targeted inclusion reviews you might run.

You can find other suggestions for ways of gathering data in the Tracking > ‘Capture Your Data’ heading of this playbook.

The Black Report is the first qualitative report on Black startup founders in the UK. It’s a phenomenal example of how data can give better insights into issues facing underrepresented groups.


Whether you’re a data analyst or a D&I champion, the only way to galvanise support and lobby for impact is to provide measurable indicators of success and failure within your hub.

Regular reporting is also a key forcing function for tracking D&I throughout your hub, helping to hold everyone accountable for D&I across its different teams.

Think about how you’ll present your findings and make suggestions to decision-makers based on the data you collect. This will also help you clearly communicate your rationale for any actions you plan to take within your hub.

Visually Presenting Your Insights

Visual Infographics

Infographics are some of the most effective tools you can employ when reporting your D&I data, particularly to stakeholders who don’t have a technical background. They can also be used in your communications content if well designed. An example of an effective D&I infographic is from our partner Unilever, which you can see below.

Graphical Visualisation

Technical readers: you can build your own visualisation model. Here’s an example of a diversity-in-tech dashboard which measures employee representation across large technology companies in the US to inspire you!

Non-technical readers: take a look at this open-source ‘People & Culture Health’ dashboard template.

Or, if you’re proficient in Excel, you might want to think about representing your data pictorially.

Reporting Impact


Scheduled reports act as a forcing function for measurement and accountability. Some examples include:

  • Embedding D&I into the biannual or annual hub performance report. At Dogpatch Labs, documenting our D&I progress is baked into how our hub operates, with every one of our team members feeding into the big picture of what’s going on in the different verticals within our hub.
  • Zapier used a different, bootstrapped tactic: cataloguing their D&I efforts and releasing them in a blog post

Use the Impact Management Project’s framework to help structure your impact when reporting on D&I.

You can find a simpler version of this framework which we’ve used.

Deeper Analysis

Your people analytics should be designed with D&I in mind from day 1 to help you track how engaged, integrated, and included your hub members feel.


This helps highlight equality discrepancies across your hub and also helps you understand your members’ needs more granularly, which can then feed into your programming and strategy.

The type of analysis you run on the data you collect will largely depend on the size and needs of your hub.

Note! This data will be heavily subjective, making it more sensitive to confounding variables, so it will require expert survey design and factor analysis to ensure statistical reliability and validity.
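One standard reliability check worth knowing here is Cronbach’s alpha, which measures the internal consistency of a multi-item survey scale. A minimal Python sketch, assuming hypothetical 1-5 Likert-style answers (this illustrates the statistic, not a full factor analysis):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a multi-item scale. `scores` is a list of
    per-respondent rows, each holding that respondent's answer to every item."""
    items = list(zip(*scores))  # transpose rows into per-item columns
    k = len(items)
    item_var = sum(pvariance(col) for col in items)      # variance of each item
    total_var = pvariance([sum(row) for row in scores])  # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical 1-5 inclusion-survey answers: 4 respondents x 3 items
answers = [[4, 5, 4], [3, 4, 3], [5, 5, 5], [2, 3, 2]]
alpha = cronbach_alpha(answers)  # values above roughly 0.7 suggest a reliable scale
```

If alpha is low, the items in your scale may be measuring different things and the combined score shouldn’t be treated as one metric; that’s the kind of finding that expert survey design helps you catch early.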

People Analytics – Basic

What is People Analytics? People analytics is the emerging trend in talent management that replaces gut decisions with data-driven practices.

We recommend a phased approach to people analytics within a hub so you don’t overwhelm your audience.

To start you can map:

  • The relative influence of individuals/positions in your hub
  • The opportunities available to them
  • Any explicit and unwritten rules in your hub’s culture

Then you can:

  • Define how those levels map onto relative power/influence within the hub
  • Get a deeper understanding of existing hierarchies within your hub
  • Identify where biased preferences might be creeping in and where there are opportunities to flatten out the structure
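As a rough illustration of the mapping steps above, relative influence can be approximated with a simple network measure such as normalised degree centrality: who is connected to the most people. A minimal Python sketch with hypothetical names and an invented interaction log (real organisational network analysis would use richer data and methods):

```python
from collections import defaultdict

def influence_scores(interactions):
    """Normalised degree centrality: each person's share of possible
    connections, a crude proxy for relative influence in the hub."""
    neighbours = defaultdict(set)
    for a, b in interactions:
        neighbours[a].add(b)
        neighbours[b].add(a)
    n = len(neighbours)  # number of people observed in the log
    return {p: round(len(links) / (n - 1), 2) for p, links in neighbours.items()}

# Hypothetical log of who worked with whom on events/programmes
log = [("Ada", "Ben"), ("Ada", "Chidi"), ("Ada", "Dara"), ("Ben", "Chidi")]
print(influence_scores(log))
```

Here Ada scores highest because she touches every other person. Cross-referencing scores like these against demographic groups is one way to spot where influence, and therefore opportunity, is unevenly distributed.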
People Analytics – Advanced

People analytics is increasingly drawing on dynamic network theory to run organizational network analysis to map D&I across different identity groups.

For deeper insights into the technicalities of this analysis, click through to the hyperlinked articles above.

Measuring D&I felt daunting at first, as it’s hard to know what kind of data to collect and what questions to ask to accurately measure cause and effect. Using this process I came to realise that D&I can be analysed and broken down into its component parts, giving us greater insight into the impact of our initiatives.

Gleb Sapunenko
Data Analyst
Dogpatch Labs

Data Collection Best Practices

It’s essential that your data collection, validation and analysis measures are sound.

Before capturing any data, think about the following considerations specific to D&I.

Ethical Considerations for D&I Data

Are there any specific laws which dictate the type of data you can gather?

For example, Ireland is in the EU and as such we need to be GDPR compliant, which you can read about in the blog article linked below.

The sensitive nature of D&I data means that privacy and respectful use of the data need to be front of mind regardless of where you’re based.

Tip! Transparency is one of the most effective ways of fostering trust and encouraging accountability within your hub. Not all metrics need to be disclosed, but withholding an unfavourable metric can erode trust in your hub’s commitment to D&I.

Sharing results externally can also strengthen your hub’s reputation by displaying your commitment to D&I. If your hub is lagging behind or has missed its diversity targets, respond honestly and sincerely, outlining a plan for rectifying these failures.

Social Desirability Bias

Social desirability (SD) is a well-known phenomenon that confounds research. SD bias arises because respondents’ answers reflect how they think they should answer rather than how they actually feel or act (read more here).

To combat this bias, make your metrics, data collection, and analysis methods as robust as possible, and aim to gather data from a representative audience.

Remember also that D&I data highlights subconscious biases that cause shame, fear, and uncertainty, which generally feed into an environment where people don’t want to get involved. Be conscious of your messaging when rallying support for D&I. Your objective should be to minimise resistance and maximise engagement across all groups.

How to Protect Yourself From Biased Analytics

Data analytics is a powerful tool in helping to understand, predict and mitigate exclusive behaviours in your hub. However, algorithms are built by people and are therefore prone to the biases of their developers. Take 5 minutes to read over some tips on how to mitigate bias from analytics. Some key takeaways are:

  • Predictive models need large amounts of data gathered over time. With less than 2-3 years of data, the validity of the model and analysis is significantly reduced.
  • The accuracy of a prediction depends on the data used to create the model.
  • The patterns behind why people make decisions aren’t always simple; differences that arise over time can mean a model may not maintain its statistical validity from one year to the next, even within the same company.


This section was developed based on lessons and advice from the following toolkits, articles and open source collaboration: