Thursday, March 28, 2013

Cjsteele.com | ebook




If you work in a role where you need to increase quality or reduce risk, and you are finding that the current methods just aren't giving you the results you need, then you're about to learn an intuitive and simple way to combine robust design methods with what you already know, so you can design the highest-quality, lowest-risk systems possible.

So many of us work in areas where random variability is the major source of the issues we need to deal with: quality, risk, reliability. But how many of us really understand the key to analysing random variability: probabilistic design?
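To make that concrete, here is a minimal sketch of the core idea behind probabilistic design: instead of treating part dimensions as fixed numbers, treat them as random variables and propagate that variability through to the quantity you care about. The assembly, dimensions, and tolerance values below are hypothetical, chosen purely for illustration; they are not taken from the ebook.

```python
import random
import statistics

def assembly_gap(a, b, c):
    # Hypothetical tolerance stack: the gap left after three parts
    # (lengths a, b, c) are placed in a 30.0 mm housing.
    return 30.0 - (a + b + c)

random.seed(1)
N = 100_000

# Each part's length varies randomly around its nominal value
# (normal distribution, standard deviation 0.05 mm -- an assumption).
gaps = [
    assembly_gap(
        random.gauss(9.0, 0.05),
        random.gauss(10.0, 0.05),
        random.gauss(10.5, 0.05),
    )
    for _ in range(N)
]

mean_gap = statistics.mean(gaps)
sd_gap = statistics.stdev(gaps)

# Fraction of assemblies whose gap falls below a 0.35 mm minimum:
# this is the probabilistic statement of "quality" for this design.
fraction_too_tight = sum(g < 0.35 for g in gaps) / N

print(f"mean gap {mean_gap:.3f} mm, sd {sd_gap:.3f} mm")
print(f"estimated fraction too tight: {fraction_too_tight:.4f}")
```

The output is a probability of failure rather than a single pass/fail number, which is exactly the kind of answer deterministic calculations cannot give you.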

I am sure that you have heard of Six Sigma, value analysis, value engineering, quality circles, quality function deployment, design of experiments, failure mode and effects analysis, fault trees, SWOT, the facilitated risk analysis process, and many other similar systems that have come and gone.

But let me ask you this: do you really understand how these systems actually optimise a design so that the negative effects of random variability are reduced or eliminated?
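One mechanism these systems rely on, at least in the robust-design tradition, is exploiting nonlinearity: when a response curve is flat near one operating point and steep near another, the same input variability produces very different output variability. The response function and numbers below are hypothetical, chosen only to illustrate the effect.

```python
import math
import random
import statistics

def response(x):
    # Hypothetical nonlinear response curve (illustrative only).
    return 100.0 * math.sin(x)

def output_sd(nominal, input_sd=0.05, n=50_000, seed=2):
    # Estimate the output spread when the input varies randomly
    # around a chosen nominal setting.
    rng = random.Random(seed)
    ys = [response(rng.gauss(nominal, input_sd)) for _ in range(n)]
    return statistics.stdev(ys)

# Same input variability, two candidate nominal settings:
sd_steep = output_sd(0.3)         # on the steep part of the curve
sd_flat = output_sd(math.pi / 2)  # on the flat part of the curve

print(f"output sd at steep setting: {sd_steep:.2f}")
print(f"output sd at flat setting:  {sd_flat:.2f}")
```

Moving the nominal setting to the flat region shrinks the output spread by more than an order of magnitude without tightening a single tolerance; that is the kind of optimisation these methods are quietly performing.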

How can it be that we all spend so much time working in an area embedded in probabilistic issues, and yet we know next to nothing about probabilistic design? Well, I think it's because of the influence of Taguchi and the current domination of quality and risk by statisticians. My PhD supervisor told me about an argument that he had with Taguchi over the application of basic probabilistic...


