Attribute Data |
See discrete data. |

Average |
The arithmetic mean of a set of observations. |

Bar Chart |
A chart that displays data using rectangles of fixed width whose heights are proportional to the number of observations or the probability of occurrence. |

Benchmarking |
Investigation of the best practices of similar processes. Benchmarking can be conducted within an organization, or expanded to look outside to other competitors or industries, with the goal of leveraging the knowledge of others. |

Bias |
A measurement procedure or estimator is said to be biased if, on the average, it gives an answer that differs from the “truth”. The bias is the average (expected) difference between the measurement and the “truth”. For example, if you get on the scale with clothes on, that biases the measurement to be larger than your true weight (this would be a positive bias). The design of an experiment or of a survey can also lead to bias. Bias can be deliberate, but it is not necessarily so. See also non-response bias. |

Bottleneck |
A blocked point in the chain that prevents process/product flow. Usually temporary in nature. |

Cause and Effect Diagram |
Brainstorming tool that assists with identifying input variables (potential root causes) that impact an effect (output variable). Also known as a fish bone diagram. |

Central Limit Theorem (CLT) |
The theorem at the heart of inferential statistics, which states that if random samples are drawn from a population and the sample means (xbar) are calculated, then as the sample size increases, the distribution of those means progressively resembles a normal distribution. This theorem allows you to assume a normal distribution when working with the means of samples. If the sample means do not follow a normal distribution, increase the sample size. |
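
A quick simulation can illustrate the idea; the population and sample sizes below are arbitrary choices for the sketch. Even though the population here is exponential (strongly skewed), the means of repeated samples cluster tightly and symmetrically around the population mean:

```python
import random
import statistics

random.seed(42)

# Population: exponential, i.e. decidedly non-normal.
population = [random.expovariate(1.0) for _ in range(100_000)]

# Draw many samples of size 50 and record each sample mean (xbar).
sample_means = [
    statistics.mean(random.sample(population, 50))
    for _ in range(2_000)
]

# The sample means center on the population mean, with much less
# spread than the raw observations (roughly sigma / sqrt(n)).
gap = abs(statistics.mean(sample_means) - statistics.mean(population))
```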

Central Tendency Measures |
A measure for a given set of data that provides a single number that can summarize an entire distribution of measurements. The three most commonly used measures are mean, median and mode. |

Characteristic |
A trait, feature, or attribute that can be measured. |

Characteristic Selection Matrix (CSM) |
A brainstorming tool that allows a team to quantify and prioritize identified input variables in relationship to output variables. |

Constraint |
Describes the nature of the cause that is preventing the achievement of the targeted goal. |

Continuous Data (Variable Data) |
Data in the form of measurements, such as length of cylinders, tube fill weight, time to fill order, or growth rate of plants. |

Control Chart |
An SPC tool used to display both continuous and attribute data. It is used to track the process statistics over time. |

Control Limits |
Upper and lower limits determined by the process itself. They fall ±3 standard deviations from the center line. |
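
A minimal sketch of the ±3-sigma calculation, using hypothetical subgroup means and the plain sample standard deviation (real control charts typically estimate sigma from subgroup ranges, e.g. R-bar/d2, rather than this way):

```python
import statistics

# Hypothetical subgroup means from a stable process.
xbars = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7, 10.1, 9.9]

center = statistics.mean(xbars)   # center line
sigma = statistics.stdev(xbars)

ucl = center + 3 * sigma          # upper control limit
lcl = center - 3 * sigma          # lower control limit
```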

Control Plan |
The summary of all the control actions for a process. |

Correlation |
A measure of linear association between two variables. Two variables can be highly correlated without having any causal relationship, and two variables can have a causal relationship yet be uncorrelated. |
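
The usual measure is Pearson's r, which ranges from −1 (perfect negative linear association) through 0 to +1 (perfect positive). A self-contained sketch with made-up data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]   # perfectly linear in x, so r = 1.0
r = pearson_r(x, y)
```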

Cost of Poor Quality (COPQ, COQ) |
The costs incurred by producing products or services of poor quality. These costs usually include the cost of inspection, rework, duplicate work, scrapping rejects, replacements and refunds, complaints, and loss of customers and reputation. |

Data |
Information or measurements collected from investigations, experiments, surveys, or observational studies. |

Defect |
The presence of something that impairs the worth or utility of a product or service, or the absence of something necessary for completeness, adequacy, or perfection. |

Defective |
When something is imperfect in form or in function, or when it falls below the norm in physical function. |

Deployment plan |
Formalized execution strategy for Six Sigma/Lean implementation that is aligned with and linked to short and long-term organizational objectives. Ref: 0-1 Deployment Planning Workshop. |

Descriptive Statistics |
Statistics used to describe, display, graph, or depict your data. |

Design of Experiment |
A systematic investigation into the variables of a product or process. It allows you to determine the optimum settings for the variables included in the model. |

Deviation |
The difference between an observed value and some reference value, typically the mean of the data. |

Discrete Data (Attribute Data) |
You have discrete data when your data can take only one of two values such as pass/fail, go/no-go, or present/absent. Count data, such as the number of surface scratches, number of accidents, or number of typographical errors, also fall under discrete data. |

Dispersion |
The degree of scatter or concentration of observations around their center or middle. |

Distribution |
The set of frequencies or probabilities assigned to various outcomes of a particular event or trial. |

DPMO |
Defects per million opportunities. DPU/(opportunities per unit) x 1,000,000. |
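
The DPMO formula builds on DPU (defects per unit, defined below). A minimal sketch with a hypothetical run of 1,000 units, 8 defect opportunities per unit, and 27 observed defects:

```python
def dpu(defects, units):
    """Defects per unit: total defects observed / total units produced."""
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities: DPU / opportunities x 1,000,000."""
    return dpu(defects, units) / opportunities_per_unit * 1_000_000

result = dpmo(27, 1_000, 8)   # ≈ 3375 DPMO
```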

DPPM |
See PPM. |

DPU |
Defects per unit. Total defects observed/total units produced. |

Drift |
A gradual change in a process characteristic. |

Effect |
The variation in the output caused by a change in the input. |

Estimator |
An estimator is a rule for “guessing” the value of a population parameter based on a random sample from the population. An estimator is a random variable, because its value depends on which particular sample is obtained, which is random. |

Event |
An event is a subset of the outcome space. An event determined by a random variable X is an event of the form {X in A}. When X is observed, that determines whether or not the event occurs: if the value of X happens to be in A, the event occurs; if not, it does not occur. |

Failure Modes and Effects Analysis (FMEA) |
A tool for recognizing and evaluating the potential failures of a product or process and the effects of each failure, identifying actions that could eliminate or reduce the risk of failure, and documenting the process. |

Flow |
When referring to the value-stream, it is the concept of how value is/should be added in an uninterrupted flow, and can include the order to delivery process as well as the entire supply chain. |

Flow Chart (Flow Diagram) |
See process map (flow diagram). |

Frequency Distribution |
A table or graph that represents the values and frequency of occurrence of data within a given sample or population. Common tools for representing a frequency distribution are the bar graph, histogram, frequency polygon, and the frequency table. |

Hawthorne Effect |
Initial improvement in a process of production caused by the obtrusive observation of that process. The effect was first noticed at the Hawthorne plant of Western Electric: production increased not as a consequence of actual changes in working conditions introduced by the plant’s management, but because management demonstrated interest in such improvements. |

Hidden Factory |
The differences between the documented process and the actual process. |

Histogram |
A bar chart that displays frequencies. Histograms separate the data into appropriate class intervals. |

Historical DOE |
A method of analyzing data that already exists but exhibits a lot of noise, such as environmental variation. The resulting regression will have a limited R-square. |

Hypothesis Testing |
Hypothesis testing allows us to evaluate a theory (statement) using data gathered during experimentation, and to accept or reject the theory. |
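
As a sketch, here is a two-sided one-sample z-test, which assumes the population standard deviation is known; the fill-weight data, target, and sigma below are all hypothetical:

```python
import math
import statistics

def one_sample_z_test(sample, mu0, sigma):
    """Two-sided z-test of H0: population mean == mu0,
    assuming known population standard deviation sigma."""
    n = len(sample)
    z = (statistics.mean(sample) - mu0) / (sigma / math.sqrt(n))
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Hypothetical fill weights (g); H0: mean fill weight is 100 g.
sample = [101.2, 100.8, 101.5, 100.9, 101.1, 101.4, 100.7, 101.3]
z, p = one_sample_z_test(sample, mu0=100.0, sigma=0.5)
reject_h0 = p < 0.05   # small p-value: reject the theory
```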

Ishikawa diagram |
See cause and effect diagram. |

Just-In-Time (JIT) |
The ability to deliver the right product, in the right quantity, exactly when it is required. In lean, this is done without creating buffers of inventory. Demand is the trigger for upstream processes to deliver. |

Kaizen |
Team-focused efforts that provide incremental continuous improvement. Characterized by small work groups focusing on a specific activity (e.g., 5S) and rapidly making process improvement decisions. |

Lean Manufacturing |
The elimination of all waste (non-value added effort) from a manufacturing process with a focus on cycle time reduction. |

Mean |
Usually refers to the arithmetic mean, but can also denote the median, the mode, the geometric mean, and weighted means. |

Measurement System Analysis (MSA) |
A formal study undertaken to determine if the measurement used (for data collection and analysis) is accurate and reliable enough to base decisions on. |

Median |
The center (physical position) number after a set of numbers has been rank ordered. |

Muda |
Waste. The seven sources of waste are overproduction, waiting, transport, over-processing, inventory, movement, and defects. |

Non-Value Added |
Identified activities in a process that contribute no value in terms of customer requirements. Often identified and labeled during process mapping exercises; these are the activities a customer would typically not be willing to pay for. |

Poka-Yoke |
A mistake/defect-proofing procedure that is implemented in a process. |

PPM |
Defective parts per million. Defective units/total units x 1,000,000. |

Process Capability |
The description of a process in terms of its accuracy (ability to hit “the target”) and its precision (to be consistent on “where it hits the target”). See CP, CPK, PP, PPK. |

Process Flow Diagram |
A detailed map of every step in the process, including hidden factory steps (rework, repair, etc.) |

Project Charter |
Formal statement of (Six Sigma) project opportunity including benefits, stakeholder assessment, and other vital information. |

Quality Function Deployment (QFD) |
The visioning and decision-making process required to align all activities in the value-stream to that of the customer needs and expectations, often resulting in cross-functional teams focused on flow. |

Regression Analysis |
A technique that allows you to mathematically determine the relationship between your independent variables (predictors) and your dependent variable (response variable). |
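
A minimal sketch of simple linear regression via least squares, with made-up data chosen to lie exactly on a line; the residuals (defined below) are therefore all zero:

```python
def fit_line(x, y):
    """Least-squares fit of y = b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    b0 = my - b1 * mx
    return b0, b1

x = [1, 2, 3, 4]
y = [3, 5, 7, 9]           # exactly y = 1 + 2x
b0, b1 = fit_line(x, y)
residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
```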

Residual |
The difference between your observed response and the predicted response (based on the fitted model). |

Risk Priority Number (RPN) |
Used as a mechanism to rank failure modes in the Failure Modes and Effects Analysis (FMEA) by assigning numerical values to Severity x Occurrence x Detection. |

Rolled Throughput Yield |
First Time Yield focuses on the number of units produced (e.g., 1,000). Rolled Throughput Yield looks at the total number of defects observed out of the total opportunities (e.g., 8 per unit x 1,000 units = 8,000 opportunities). |
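
One common way to compute it, sketched here with hypothetical step yields, is as the product of the first-time yields of each process step (other formulations, such as e^(-DPU), also appear in practice):

```python
import math

def rolled_throughput_yield(step_yields):
    """RTY as the product of the first-time yields of each step."""
    return math.prod(step_yields)

# Hypothetical four-step process:
rty = rolled_throughput_yield([0.98, 0.95, 0.99, 0.97])
```

Note that RTY is always at or below the worst single-step yield, which is why multi-step processes need very high per-step yields.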

Sample Statistic |
A value calculated from a sample, used to estimate the value of a characteristic of the group (population) from which the sample was drawn. |

Scatter Plot |
A graph that plots paired observations of two variables to reveal a possible relationship between them. Also called a dispersion diagram. |

Shift and Drift |
Changes in a process characteristic (measurement) that occur over time. Six Sigma practices include an allowance (1.5 standard deviations) for this. |

Six Sigma |
A measure of quality that strives for near perfection leveraging a disciplined, data-driven approach and methodology for eliminating defects (3.4 defects per million opportunities – allowing for 1.5 sigma shift). |
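
The 3.4 DPMO figure follows from the one-sided normal tail beyond (sigma level − 1.5 shift) standard deviations, which can be checked with the standard library's complementary error function:

```python
import math

def dpmo_for_sigma(sigma_level, shift=1.5):
    """Long-term DPMO for a short-term sigma level,
    allowing the conventional 1.5-sigma shift."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z), one-sided
    return tail * 1_000_000

six_sigma_dpmo = dpmo_for_sigma(6)   # ≈ 3.4 defects per million
```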

Spaghetti Chart |
A map of the path of a product (deliverable) as it travels down (and across) the value stream. Not too surprisingly, this ends up in a tangle or web of hands and activities that may somewhat resemble a plate of spaghetti, hence the name. |

Special Cause Variation |
Intermittent variation attributed to assignable events. |

Specification Limits |
Requirements set by the customer for a product, service, or process. |

Standard Deviation |
A common measurement of variability. The square root of the variance. |
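
Both quantities are one call away in the standard library; the data set below is an arbitrary example:

```python
import statistics

data = [4.0, 5.0, 6.0, 7.0, 8.0]

var = statistics.variance(data)   # sample variance
sd = statistics.stdev(data)       # square root of the variance
```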

Survey |
A systematic and impartial means of collecting information that allows us to draw generalized conclusions or make statements about the larger population from which the survey sample was drawn. |