Six Sigma: the very term evokes notions of clinical, stringent quality standards. Much vaunted and in demand, Six Sigma techniques have yielded savings across business processes and product lifecycles for corporations around the world. But how did this come to be? Read on as we trace a brief history of Six Sigma, dig into its humble beginnings, and chart its evolution.

The set of principles that comprise Six Sigma has its origins in the quest for quality in mass production, beginning in the late 18th century, though the field of statistics itself, upon which many of Six Sigma's tools are based, has been around for much longer. The central pillar of statistical theory as utilized in Six Sigma is German mathematician Carl Friedrich Gauss's normal distribution curve (also called a "bell curve"). The outliers on the normal distribution lie multiples of one standard deviation, represented by the Greek letter σ ("sigma"), away from the mean. In the context of statistical quality control, processes and products are measured and evaluated to determine variation from acceptable standards, and the spread of the distribution signifies variability.

The Industrial Revolution – First Stirrings of Quality Management

In the pre-industrial world, quality inspection and management was an expensive affair. Quality workmanship was prized, and the only way to get a good-quality product was to have it made at a premium price. On the flip side, supervision was unnecessary, which made the entire process less tiresome.

All this changed with the Industrial Revolution. The introduction of machinery meant goods could now be produced en masse and more quickly than before. It was around this time that Eli Whitney, famed American inventor of the cotton gin, took up an idea first articulated by Honoré le Blanc: interchangeable parts. When handed a contract for the production of 10,000 muskets by the U.S. government, Whitney came up with designs for standardized musket parts so they could be produced with little variation off a single template for years, thus ushering in the age of mass production. Whitney's application, termed the Uniformity System, was adopted by the military-industrial complex and defense establishments across Europe and the Americas. This development is notable for the simple reason that it was the first instance of a concerted effort to address quality in production.

With the introduction of the assembly line and Ford's adaptation of it to the automobile industry, cost-effective mass manufacturing became a reality. This meant that the need for measurement of parts against pre-determined standards became more acute. The large number of parts involved had rendered manual measurement against go and no-go gauges, as was the prevalent practice, unfeasible. The onus thus shifted to measuring the consistency of the process in place to produce interchangeable parts, so that the end products were within acceptable tolerance limits for quality.

An allied development was the widespread adoption of rudimentary statistical techniques such as sampling. Deployment of statistical tools in the management of quality had become commonplace. It was in this fertile climate for statistics that Dr.
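The article's point about σ and variability can be made concrete with a small numerical sketch. This is our own illustration, not part of the original piece: it assumes process output follows a centered normal distribution (no mean shift), and the helper name `fraction_within` is ours. It computes, via the normal CDF, what fraction of output falls within ±k standard deviations of the mean, which is why a "six sigma" process leaves so few parts out of tolerance.

```python
import math

def fraction_within(k: float) -> float:
    """Fraction of a centered normal distribution falling within
    +/- k standard deviations (sigma) of the process mean."""
    return math.erf(k / math.sqrt(2))

# The wider the tolerance band in sigma units, the fewer
# nonconforming parts escape the process.
for k in (1, 2, 3, 6):
    outside_ppm = (1.0 - fraction_within(k)) * 1_000_000
    print(f"+/-{k} sigma: {fraction_within(k):.9f} in spec, "
          f"about {outside_ppm:.3g} nonconforming parts per million")
```

Note that the widely quoted Six Sigma figure of 3.4 defects per million corresponds to a 4.5σ tail, because the convention allows for a 1.5σ drift in the process mean over time; the centered calculation above gives a far smaller number at k = 6.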