Creating a Protected Processing Environment

Have you created a protected processing environment for the analysis, if necessary, to limit access and guard against unauthorized data extraction?

Key Stakeholders:

  • Data Engineering
  • Partner

Keyword tags and Resources: Security, Partnership

Implementing Targeted Analysis

Have you determined if the analysis techniques are targeted and in alignment with the problem statement?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Purpose

Documenting Analysis

Have you documented relevant data, algorithms, assumptions, statistical techniques, and findings to enable replicability and internal or external audit?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Operations

Testing Multiple Techniques

Have you tested multiple analysis techniques and/or algorithms to determine which is most accurate, relevant, and scientifically rigorous?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Quality, Evaluation
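As an illustration of this step (not part of the questionnaire itself), the sketch below compares two candidate techniques on held-out data using k-fold cross-validation, in pure Python with synthetic data. The technique names and helper functions are hypothetical choices for the example.

```python
import random
import statistics

def kfold_mse(points, fit, k=5):
    """Average out-of-fold mean squared error for a model-fitting function."""
    random.Random(0).shuffle(points)          # fixed seed for reproducibility
    folds = [points[i::k] for i in range(k)]
    errors = []
    for i in range(k):
        test = folds[i]
        train = [p for j, f in enumerate(folds) if j != i for p in f]
        model = fit(train)
        errors.append(statistics.mean((model(x) - y) ** 2 for x, y in test))
    return statistics.mean(errors)

def fit_mean(train):
    """Baseline technique: always predict the training mean."""
    m = statistics.mean(y for _, y in train)
    return lambda x: m

def fit_line(train):
    """Alternative technique: least-squares line through the training data."""
    xs = [x for x, _ in train]
    mx = statistics.mean(xs)
    my = statistics.mean(y for _, y in train)
    slope = sum((x - mx) * (y - my) for x, y in train) / sum((x - mx) ** 2 for x in xs)
    return lambda x: my + slope * (x - mx)

# Synthetic data with a known linear trend plus noise.
data = [(x, 2 * x + random.Random(x).gauss(0, 1)) for x in range(40)]
scores = {"mean baseline": kfold_mse(list(data), fit_mean),
          "linear fit": kfold_mse(list(data), fit_line)}
best = min(scores, key=scores.get)
```

Comparing each candidate on the same out-of-fold error, rather than on the data it was trained on, is what makes the "most accurate" judgment defensible.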

Making Corrections

Have you defined processes to adjust incorrect or flawed analyses?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Quality

Vetting Training Data

Have you vetted any training datasets that will be fed into algorithms or machine learning models?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Quality, AI
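As a concrete illustration of vetting (the checks and function names below are assumptions for the example, not prescribed by the questionnaire), a minimal pre-training audit can flag missing values, duplicate records, and label imbalance before a dataset is fed into a model:

```python
from collections import Counter

def vet_dataset(rows, label_key):
    """Run basic pre-training checks on a list-of-dicts dataset; return issues found."""
    issues = []
    # Records with missing fields can silently skew a fitted model.
    missing = sum(1 for r in rows if any(v is None for v in r.values()))
    if missing:
        issues.append(f"{missing} row(s) contain missing values")
    # Exact duplicates can leak between training and test splits.
    seen = Counter(tuple(sorted(r.items())) for r in rows)
    dupes = sum(c - 1 for c in seen.values() if c > 1)
    if dupes:
        issues.append(f"{dupes} duplicate row(s)")
    # Severe label imbalance (here, >90% one class) can bias predictions.
    labels = Counter(r[label_key] for r in rows)
    if max(labels.values()) > 0.9 * len(rows):
        issues.append(f"label imbalance: {dict(labels)}")
    return issues

rows = [{"age": 34, "label": "approve"},
        {"age": 34, "label": "approve"},
        {"age": None, "label": "deny"}]
problems = vet_dataset(rows, "label")
```

Running the checks automatically, rather than relying on ad hoc inspection, makes the vetting step repeatable as the dataset is refreshed.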

Considering Proxy Bias and Blindspots

Have you assessed whether the analysis relies on proxies that might introduce bias or blindspots, e.g., using number of arrests as a proxy for prevalence of criminal activity?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: Bias, Quality

Assessing Inferred Personal Data

Have you assessed the potential implications of analytical inferences made about individuals?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: AI, Bias

Avoiding Reproducing Inequalities

Have you considered whether algorithmic analysis could reproduce existing inequities and data biases?

Key Stakeholders:

  • Partner
  • Data Science/Analytics

Keyword tags and Resources: Bias

Ensuring Algorithmic Interpretability

Have you determined whether you can interpret and articulate how algorithmic decisions are made?

Key Stakeholders:

  • Partner
  • Data Science/Analytics

Keyword tags and Resources: AI, Communications

Testing Model Predictions

Have you tested model predictions against real world quantities to identify and correct deficiencies?

Key Stakeholders:

  • Data Science/Analytics

Keyword tags and Resources: AI, Evaluation
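One way to test predictions against real-world quantities (a minimal sketch with made-up numbers and function names, not a prescribed method) is a simple backtest that reports not only average error but also systematic bias, which signals a correctable deficiency:

```python
import statistics

def backtest(predicted, observed):
    """Compare model predictions with observed real-world values."""
    residuals = [p - o for p, o in zip(predicted, observed)]
    return {
        "mae": statistics.mean(abs(r) for r in residuals),   # typical error size
        "bias": statistics.mean(residuals),                  # systematic over-/under-prediction
        "worst_abs_error": max(abs(r) for r in residuals),   # largest single miss
    }

predicted = [100, 110, 95, 120]
observed = [98, 112, 90, 130]
report = backtest(predicted, observed)
```

A negative bias here would indicate the model systematically under-predicts, pointing to a specific correction rather than a vague "the model is off".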

Introducing Human Evaluation

Have you ensured that humans are able to scrutinize algorithmic or computational results and retain ultimate control over decision-making?

Key Stakeholders:

  • Partner
  • Operations/HR
  • Data Subjects
  • Intended Beneficiaries

Keyword tags and Resources: Accountability, AI, Evaluation

Avoiding Monolithic Evaluation

Have you taken steps to avoid evaluating models based on a single accuracy score or metric?

Key Stakeholders:

  • Partner
  • Data Science/Analytics
  • Marketing/Communications
  • Operations/HR

Keyword tags and Resources: Quality, Evaluation
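The danger of a single accuracy score can be shown with a short sketch (illustrative only; the metric suite and names are assumptions, not part of the questionnaire). On an imbalanced dataset, a classifier that always predicts the majority class scores high accuracy while being useless, which recall exposes:

```python
def evaluation_suite(y_true, y_pred, positive=1):
    """Report several classification metrics instead of a single accuracy score."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# A degenerate classifier that always predicts the negative class:
y_true = [1, 0, 0, 0, 0, 0, 0, 0, 0, 0]
y_pred = [0] * 10
metrics = evaluation_suite(y_true, y_pred)
# accuracy looks good (0.9) but recall exposes the failure (0.0)
```

Reporting precision, recall, and F1 alongside accuracy (and, where relevant, per-subgroup breakdowns) guards against exactly this failure mode.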

Educating Engaged Parties

Have you educated all engaged parties about the algorithmic and machine learning techniques used in the data collaborative to avoid improper use, applications, or incorrect conclusions?

Key Stakeholders:

  • Partner

Keyword tags and Resources: Training, Evaluation, Operations, Partnership

Transforming Results into Action

Have you presented the findings of the analysis in a way that is clear and actionable?

Key Stakeholders:

  • Marketing/Communications

Keyword tags and Resources: Communications, Action
