Anomaly: a deviation from what is normal or expected.
Baseline: a starting point or point of reference, usually established through the initial collection of data, from which comparisons, evaluation and target setting can be conducted.
Benchmark: A standard or point of reference against which data may be compared or assessed. May be used in data analytics or in the absence of internal data.
Benchmarking: The process of continuously comparing and measuring one organization against another to gain information that will help the organization take action to improve its performance.
Capacity: the maximum amount something can achieve and/or the ability to do or understand something.
Correlated Variables: two or more contributors to an outcome which have a dependent relationship on each other (i.e. a change in one relates to, but does not necessarily cause, a change in another).
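For illustration, the strength of such a relationship is often summarized with the Pearson correlation coefficient (r), which ranges from -1 to 1. The sketch below uses invented temperature and pool-attendance figures; a high r indicates association, not causation.

```python
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient: the strength of the linear
    relationship between two variables, from -1 to 1."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented data: daily high temperature (F) and public-pool attendance
temps = [70, 75, 80, 85, 90, 95]
attendance = [110, 140, 155, 190, 205, 235]

r = pearson_r(temps, attendance)  # close to 1: strongly correlated
```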
Data: electronic records stored in computer systems. In the simplest terms, data are lists of things such as requests for service, inventories, or incidents, which include helpful details such as dates, locations, images, video, and more.
Data Hygiene: the process of “cleaning” data to ensure it is free of errors before analysis is conducted.
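A minimal data-hygiene sketch in Python, using invented service-request records: stray whitespace is stripped, case is normalized, and blank or duplicate entries are dropped before any analysis.

```python
# Invented raw records, with whitespace, blanks, and duplicates
raw_requests = ["Pothole ", "pothole", "  Graffiti", "", "Streetlight Out", "graffiti "]

cleaned = []
seen = set()
for record in raw_requests:
    value = record.strip().lower()   # remove stray whitespace, normalize case
    if not value or value in seen:   # drop blanks and duplicates
        continue
    seen.add(value)
    cleaned.append(value)

# cleaned == ["pothole", "graffiti", "streetlight out"]
```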
Data Science: an interdisciplinary field of study and practice which extracts insights from data to inform decision making. It includes, but is not limited to, mathematics, statistics, and computer science.
Data Visualization: the creation and study of the visual representation of data.
Descriptive Statistics: the practice of quantitatively describing a data set by using measures of central tendency, variability, and dispersion.
- Central Tendency: measures of central tendency describe a central or typical value of a data distribution (e.g. mean, median, mode).
- Variability & Dispersion: measures of variability & dispersion describe how compressed or stretched the values are in a data set (e.g. variance, standard deviation, interquartile range).
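The measures above can be computed with Python's standard `statistics` module; the eleven closure times below are invented for illustration.

```python
import statistics

# Invented data: days to close 11 service requests (note the outlier)
days_to_close = [2, 3, 3, 4, 5, 5, 5, 6, 7, 9, 30]

# Central tendency: a central or typical value
mean_days = statistics.mean(days_to_close)      # pulled upward by the outlier
median_days = statistics.median(days_to_close)  # 5
mode_days = statistics.mode(days_to_close)      # 5

# Variability & dispersion: how compressed or stretched the values are
variance = statistics.variance(days_to_close)
std_dev = statistics.stdev(days_to_close)
q1, q2, q3 = statistics.quantiles(days_to_close, n=4)  # quartile cut points
iqr = q3 - q1                                          # interquartile range
```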
Error Range (Margin of Error): a statistic which expresses the amount of random sampling error in a result; it gives the range around a sample estimate within which the true value for the entire population is likely to fall.
Exploratory Analysis: an approach to analyzing data which involves summarizing and understanding its main characteristics, often through descriptive statistics and data visualization.
Forecasting: the process of making predictions of the future based on past and present data and analysis of trends. It involves estimating specific values at certain future times. Forecasting is related to target setting, in that each process involves an analysis of past data and an analysis of trends; however, forecasting also involves using the latest available data to continuously improve the accuracy of the forecast.
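For illustration, the margin of error for a sample proportion is commonly approximated as z * sqrt(p(1 - p) / n), where z = 1.96 corresponds to roughly 95% confidence; the survey figures below are invented.

```python
from math import sqrt

def margin_of_error(p, n, z=1.96):
    """Margin of error for a sample proportion at roughly 95%
    confidence (z = 1.96), using the normal approximation."""
    return z * sqrt(p * (1 - p) / n)

# Invented survey: 62% of 400 sampled residents rate a service "good"
moe = margin_of_error(0.62, 400)   # roughly +/- 0.048 (about 4.8 points)
```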
Goal: see the definition for "Target"
Goal Statement: a broad statement describing a desired end state, more specific than an organization’s mission. Sometimes referred to as strategic goals or strategic objectives, goal statements combine qualitative and quantitative information to convey a clear overarching context and purpose for a specific level of desired achievement. A goal statement contains, but is not limited to, a goal/target. For example, in the following goal statement, “The City of Smallville will create stronger and safer communities for all residents by ending homelessness by 2018,” ending homelessness by 2018 is the goal/target. The preceding language in the goal statement makes it clear to stakeholders why ending homelessness is important.
Government Data: data which describes the operations of a government; electronic records which the government maintains to do its business; statistical information created or maintained by or on behalf of an agency that records a measurement, transaction, or determination related to the mission of the agency.
Inferential Statistics: the practice of using random sampling to predict (infer) outcomes for a larger population.
- Random Sampling: a method of extracting a sample data set from a larger population of data such that each item has an equal chance of being selected for the sample.
- Data Modeling: a process used to define and analyze existing data, identify its interrelationships, and determine an evolving framework through which new data can be incorporated and leveraged by software applications to make predictions.
- Statistical Inference: a process that uses the analysis of a randomized sample to infer properties of the larger population.
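The ideas above can be sketched in a few lines: draw a random sample from a (here, simulated) population, then use the sample mean to infer the population mean.

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

# Simulated population: response times (minutes) for 10,000 incidents
population = [random.gauss(30, 8) for _ in range(10_000)]

# Random sampling: each item has an equal chance of selection
sample = random.sample(population, 200)

# Statistical inference: estimate a population property from the sample
sample_mean = statistics.mean(sample)
population_mean = statistics.mean(population)
# sample_mean lands close to population_mean (about 30)
```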
Family of Measures: a combination of performance measures/metrics which, when taken together, provide multiple perspectives on an organization's achievement of its desired end state. A family of measures commonly includes the following categories.
- Effectiveness: the degree to which a process yields the desired outcome/result, regardless of cost (e.g. incidence of foodborne illness)
- Efficiency: the degree to which a process yields the desired output at minimum cost (e.g. cost per inspection)
- Input: the amount of resources contributed to the means of production, process or system (e.g. number of health inspectors)
- Outcome: the impacts or changes that result from a program’s outputs and advance its goals/targets (e.g., high school graduation rates increased by 90% over a period of 5 years)
- Output: the amount of goods or services generated by the means of production, process, or system (e.g., number of health inspections)
- Productivity: rate of output per unit of input (e.g. inspections per inspector)
- Service Quality: the extent to which the process or service meets requirements and/or expectations (e.g. number of customer complaints, error rates)
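The categories above can be illustrated with invented figures for a hypothetical restaurant-inspection program; each derived value maps to one category in the family.

```python
# Invented figures for a hypothetical restaurant-inspection program
inspectors = 12          # input: resources contributed
inspections = 3_000      # output: inspections produced
program_cost = 450_000   # annual cost in dollars
complaints = 45          # service-quality signal

cost_per_inspection = program_cost / inspections      # efficiency: 150.0
inspections_per_inspector = inspections / inspectors  # productivity: 250.0
complaint_rate = complaints / inspections             # service quality: 0.015
```

An effectiveness measure for this program (e.g. incidence of foodborne illness) would come from health data rather than from the program's own operational figures.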
Metadata: A set of data which describes or gives information on other data.
Open Data: the practice of making electronic data records accessible in whole or in part to the public. This is considered proactive disclosure: making information available without it being requested.
Performance Management: the process by which leaders, managers, employees and stakeholders work collaboratively to identify what they want to achieve, decide how to measure progress, take informed action based on evidence, and take stock of the results to inform future decisions. It includes performance measurement, performance measures, performance targets, data science practices, and transparency.
Performance Measurement: the building block of performance management, it is the process of establishing performance measures/metrics, collecting the relevant data for those measures, and reporting out on the results.
Performance Measure/Metric: a quantifiable unit which provides information about the success of a program, department, service, or outcome people care about achieving or maintaining. A government may identify a measure/metric by inventorying data that it already collects, collecting new data, or using validated external data. Measures/metrics can focus on inputs, outputs, service quality, efficiency (e.g. cost per application processed); productivity (e.g. throughput); and effectiveness/outcomes (e.g. unemployment rate).
- Key Performance Measure/Metric: commonly referred to as a KPI (Key Performance Indicator), it is the prevailing metric that a government identifies as the primary way to measure progress toward a goal statement. A goal statement may have one key measure that organizations monitor on a regular basis, along with multiple supporting measures.
Performance Priority: Category of subject matter on which a government wishes to achieve results. Categories may include public safety, public health, education, sustainability and the environment, jobs and the economy, and government operations and management.
Proxy Measure: A proxy is an indirect measure of the desired outcome which is itself strongly correlated to that outcome. It is commonly used when direct measures of the outcome are unobservable and/or unavailable. The most common example of a proxy measure is Gross Domestic Product (GDP), which is used by many organizations and research institutions as a proxy for standard of living or quality of life. An organization should use a proxy measure when there is little or no data available about the program being implemented, but the outcome the program is designed to influence has an existing and commonly accepted proxy.
Result: The ultimate desired end state, which is achieved through strategies and assessed through analysis and measures/metrics.
S.M.A.R.T Goal Statements: S.M.A.R.T. is a commonly used mnemonic acronym in performance management. It provides criteria for drafting strong goal statements. The concept originated in 1954, when Peter Drucker published a book about management by objectives. The idea is to ensure each goal statement fits all of the criteria in the acronym. The letters S and M usually mean specific and measurable; the other letters have evolved to mean different things to different authors, as described below.
- S = Specific, Strategic
- M = Measurable
- A = Achievable, Attainable, Action-Oriented, Agreed-upon, Aligned, Ambitious
- R = Relevant, Realistic, Resourced, Reasonable, Results-based
- T = Time-bound, Time-based, Time-limited, Timely, Time-sensitive
Strategic Framework: An overarching set of performance priorities for which a government wishes to achieve results.
Strategic Goal: a goal statement that guides an organization’s efforts to move toward a desired end state and advances the organization’s mission.
Strategy: An action or set of actions that departments may take, individually or in concert with one another and/or the Chief Executive’s office and/or external stakeholders, to achieve the goal and ultimately the end result.
Target: a mark to aim for; a desired change in the measure/metric that will advance progress toward a goal statement within a specified timeline. This is often referred to as a "goal."
Trend: a general direction in which something is developing or changing over time.