MSA Attribute Study

MSA FOR ATTRIBUTE DATA


  2. Attribute Measurement System Analysis
     Prepared by: VIPUL WADHWA
  3. Objectives
     ◦ Define MSA
     ◦ Requirement of MSA in IATF 16949
     ◦ Typical reasons for an MSA study
     ◦ What is an attribute study?
     ◦ When to use an attribute study
     ◦ Define the procedure for conducting an attribute MSA
     ◦ Demonstrate a trial of conducting an attribute MSA
     ◦ Types of errors in an attribute measurement system
     ◦ Analysis technique
     ◦ Attribute MSA study conclusion
  4. What is MSA?
     An experimental and mathematical method of determining the amount of variation that exists within a measurement process. A measurement systems analysis is an evaluation of the adequacy of a measurement system, and it is applicable to both continuous and attribute data. The sources of variation in a measurement process can include the gauge, the appraisers, the parts, the method, and the environment.
  5. Requirement of MSA in IATF 16949
     7.1.5.1.1 Measurement systems analysis: Statistical studies SHALL be conducted to analyze the variation present in the results of each type of inspection, measurement, and test equipment system identified in the control plan. The analytical methods and acceptance criteria used SHALL conform to those in reference manuals on measurement systems analysis. Other analytical methods and acceptance criteria may be used if approved by the customer.
     NOTE: Prioritization of MSA studies should focus on critical or special product or process characteristics.
  6. Typical Reasons for an MSA Study
     • There is a new manufacturing process.
     • There is a new product to manufacture.
     • There is new equipment.
     • There are customer concerns.
     • There are internal quality issues.
  7. What is an Attribute Study?
     Most problematic measurement system issues come from measuring attribute data in terms that rely on human judgment, such as good/bad or pass/fail. This is because it is very difficult for all testers to apply the same operational definition of what is "good" and what is "bad."
  8. When to Use an Attribute Study?
     • Used when the measurement value is one of a finite number of categories.
     • When no continuous measurement values are available, the tool used for this kind of analysis is called an attribute study.
     • The most commonly used attribute gauge is the Go/No-Go gauge.
  9. Procedure for conducting an Attribute MSA
     • Select at least 20 parts to be evaluated during the study.
     • Evaluate each part against the product specification and record the reference accept/reject decision (the "true" decision).
     • At least 5 of the parts should be defective in some way. If larger sample sizes are used, include at least 25% defective parts.
     • Take care when selecting defective parts: if possible, select parts that are only slightly beyond the specification limits. Label each part with proper identification.
     • Three inspectors evaluate each part three times (three trials).
     • A fourth person records the data. Record each observation as G (good part, OK) or B (bad part, not OK).
     Note: The order of inspection should be randomized after each group of inspections to minimize the risk that an inspector remembers previous accept/reject decisions (a minimal ordering sketch follows this slide). The inspectors must work independently and cannot discuss their decisions with each other.
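The randomized ordering mentioned in the note above can be generated with a few lines of code. A minimal sketch in Python, assuming the 20 parts are simply labelled 1 to 20; the variable names are illustrative, not part of any MSA reference manual:

```python
import random

# Build an independent, randomized presentation order for every appraiser
# and every trial, so inspectors cannot anticipate or remember parts.
parts = list(range(1, 21))             # 20 labelled parts
appraisers = ["A", "B", "C"]
n_trials = 3

presentation_order = {}
for appraiser in appraisers:
    for trial in range(1, n_trials + 1):
        order = parts.copy()
        random.shuffle(order)          # fresh random order for each round
        presentation_order[(appraiser, trial)] = order

print(presentation_order[("A", 1)])    # e.g. [7, 14, 2, ...] (varies per run)
```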
  10. Procedure for conducting an Attribute MSA
      The data recorder may use a table similar to the one below (G = good part / OK, B = bad part / not OK).
      Part no. & name: ___   Date: ___   Specifications: ___   Gauge name: ___   Gauge no.: ___   Performed by: ___
      Appraisers: A = Mr. A, B = Mr. B, C = Mr. C   No. of parts (n) = 20   No. of trials (r) = 3

      DATA COLLECTION
      PART NO.  TRUE DECISION  APPRAISER A  APPRAISER B  APPRAISER C
                               1   2   3    1   2   3    1   2   3
      1         B              B   B   B    B   B   B    B   B   B
      2         B              B   B   B    B   B   B    B   B   B
      3         G              G   G   G    G   B   G    G   G   B
      4         G              G   G   G    G   G   G    G   G   G
      5         G              G   G   B    G   G   B    G   G   B
      6         G              G   G   G    G   G   G    G   G   G
      7         G              G   G   G    G   G   G    G   B   G
      8         G              G   G   G    G   G   G    G   G   G
      9         B              B   B   B    B   B   B    B   B   G
      10        G              B   G   G    G   G   G    G   G   G
      11        G              G   G   G    G   G   G    G   G   G
      12        G              G   G   G    G   G   G    G   G   G
      13        G              G   G   G    G   G   G    G   G   G
      14        G              G   G   G    G   G   G    G   G   G
      15        B              B   B   B    B   B   B    B   B   B
      16        G              G   G   G    G   G   G    G   G   G
      17        G              G   G   B    G   G   G    G   G   G
      18        B              B   B   B    B   B   B    B   B   B
      19        G              G   G   G    G   G   G    G   G   G
      20        B              B   B   B    B   B   B    B   B   B
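For later analysis it helps to hold the recorded calls in machine-readable form. A minimal sketch, transcribing the G/B codes from the table above; the string-and-dictionary layout is just one possible encoding:

```python
# True (reference) decisions for parts 1-20, transcribed from the table above.
true_decisions = "BBGGGGGGBGGGGGBGGBGB"

# Each appraiser's three trial calls per part, as 3-character strings (parts 1-20).
calls = {
    "A": ["BBB", "BBB", "GGG", "GGG", "GGB", "GGG", "GGG", "GGG", "BBB", "BGG",
          "GGG", "GGG", "GGG", "GGG", "BBB", "GGG", "GGB", "BBB", "GGG", "BBB"],
    "B": ["BBB", "BBB", "GBG", "GGG", "GGB", "GGG", "GGG", "GGG", "BBB", "GGG",
          "GGG", "GGG", "GGG", "GGG", "BBB", "GGG", "GGG", "BBB", "GGG", "BBB"],
    "C": ["BBB", "BBB", "GGB", "GGG", "GGB", "GGG", "GBG", "GGG", "BBG", "GGG",
          "GGG", "GGG", "GGG", "GGG", "BBB", "GGG", "GGG", "BBB", "GGG", "BBB"],
}

# Sanity check on the sample mix: 14 good and 6 bad parts (30% defective),
# which satisfies the "at least 25% defective" guideline from the procedure.
print(true_decisions.count("G"), true_decisions.count("B"))   # 14 6
```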
  11. Types of Errors in an Attribute Measurement System
      Type 1 errors: a good part is rejected.
      • Type 1 errors increase manufacturing costs: incremental labor and material expenses are needed to re-inspect the suspect parts.
      • Type 1 errors are also called "producer's risk" or alpha errors.
      Type 2 errors: a bad part is accepted.
      • Type 2 errors may occur, for example, when the inspector was poorly trained or rushed through the inspection and inadvertently overlooked a small defect on the part.
      • When Type 2 errors occur, defects slip through the containment net and are shipped to the customer.
      • Because Type 2 errors put the customer at risk of receiving defective parts, the customer may raise a complaint.
      • Type 2 errors are also called "consumer's risk" or beta errors.
  12. What is Effectiveness?
      The effectiveness of an inspection process is measured by its correct calls.
      Correct call (Cc): the number of times the operator(s) identify a good sample as good and a bad sample as bad.
      Effectiveness (E) = Number of correct evaluations (GG + BB) / Total number of evaluations (TN)
      where E = effectiveness, GG = good part inspected as good, BB = bad part inspected as bad, and TN = total number of evaluations (here 20 parts x 3 trials = 60 per appraiser).
  13. What is a False Alarm?
      False alarm (Fa): the number of times the operator(s) identify a good sample as a bad one.
      The probability of a false alarm, also known as Type I error or producer's risk, is given by:
      False alarm rate (Pfa) = Number of false alarms (GBB) / Total number of good-part evaluations (TG)
      where Pfa = probability of a false alarm, GBB = good part inspected as bad, and TG = total number of good-part evaluations.
  14. What is Miss Rate?
      A miss is a defective item that is classified as non-defective.
      Miss: the number of times the operator identifies a bad sample as a good one.
      The probability of a miss, also known as Type II error or consumer's risk, is given by:
      Miss rate (Pmiss) = Number of misses (BBG) / Total number of bad-part evaluations (TB)
      where Pmiss = probability of a miss, BBG = bad part inspected as good, and TB = total number of bad-part evaluations.
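The three ratios defined above (effectiveness, false alarm rate, miss rate) translate directly into code. A minimal sketch; the function names are illustrative:

```python
def effectiveness(gg: int, bb: int, tn: int) -> float:
    """E: correct evaluations (GG + BB) divided by total evaluations (TN)."""
    return (gg + bb) / tn

def false_alarm_rate(gbb: int, tg: int) -> float:
    """Pfa: good parts called bad (GBB) divided by total good-part checks (TG)."""
    return gbb / tg

def miss_rate(bbg: int, tb: int) -> float:
    """Pmiss: bad parts called good (BBG) divided by total bad-part checks (TB)."""
    return bbg / tb
```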
  15. RESULTS
      APPRAISER  GG   BB   GG+BB  NF(GBB)  NM(BBG)  TN   E=(GG+BB)/TN  Pfa=NF/TG  Pmiss=NM/TB
      A          39   18   57     3        0        60   0.95          0.071      0
      B          40   18   58     2        0        60   0.9667        0.048      0
      C          39   17   56     3        1        60   0.9333        0.071      0.056

      GG = good part inspected as good   BB = bad part inspected as bad   GBB = good part inspected as bad   BBG = bad part inspected as good
      NF = number of false alarms   NM = number of misses   TN = total number of checks   TG = total good-part checks   TB = total bad-part checks
      E = effectiveness of the appraiser   Pfa = probability of a false alarm   Pmiss = probability of a miss
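As a cross-check, the values in the results table above can be reproduced from the counts alone. A minimal sketch, assuming 20 parts x 3 trials per appraiser, of which 14 parts are good (TG = 42 good-part checks) and 6 are bad (TB = 18 bad-part checks), as in the data-collection table:

```python
# Per-appraiser counts (GG, BB, NF, NM) taken from the results table above.
counts = {"A": (39, 18, 3, 0), "B": (40, 18, 2, 0), "C": (39, 17, 3, 1)}
TN = 60   # 20 parts x 3 trials
TG = 42   # 14 good parts x 3 trials
TB = 18   #  6 bad parts  x 3 trials

for name, (gg, bb, nf, nm) in counts.items():
    e = (gg + bb) / TN        # effectiveness
    pfa = nf / TG             # false alarm rate
    pmiss = nm / TB           # miss rate
    print(f"{name}: E={e:.4f}  Pfa={pfa:.3f}  Pmiss={pmiss:.3f}")

# A: E=0.9500  Pfa=0.071  Pmiss=0.000
# B: E=0.9667  Pfa=0.048  Pmiss=0.000
# C: E=0.9333  Pfa=0.071  Pmiss=0.056
```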
  16. Acceptability Criteria
      If all measurement results agree, the gauge is acceptable. If the measurement results do not agree, the gauge cannot be accepted; it must be improved and re-evaluated.

      PARAMETER                ACCEPTABLE    MARGINAL               NOT ACCEPTABLE
      Effectiveness (E)        E > 0.90      0.80 < E <= 0.90       E <= 0.80
      False alarm rate (Pfa)   Pfa < 0.05    0.05 <= Pfa < 0.10     Pfa >= 0.10
      Miss rate (Pmiss)        Pmiss < 0.02  0.02 <= Pmiss < 0.05   Pmiss >= 0.05
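The acceptability table can likewise be applied programmatically. A minimal sketch of the threshold logic; the function names are illustrative, and the example values are appraiser C's results from the previous slide:

```python
def classify_effectiveness(e: float) -> str:
    """Acceptable if E > 0.90, marginal if 0.80 < E <= 0.90, else not acceptable."""
    if e > 0.90:
        return "acceptable"
    return "marginal" if e > 0.80 else "not acceptable"

def classify_rate(rate: float, accept_below: float, marginal_below: float) -> str:
    """Shared rule for Pfa (0.05 / 0.10 cut-offs) and Pmiss (0.02 / 0.05 cut-offs)."""
    if rate < accept_below:
        return "acceptable"
    return "marginal" if rate < marginal_below else "not acceptable"

# Appraiser C from the results slide:
print(classify_effectiveness(0.9333))      # acceptable
print(classify_rate(0.071, 0.05, 0.10))    # Pfa   -> marginal
print(classify_rate(0.056, 0.02, 0.05))    # Pmiss -> not acceptable
```

By these criteria, appraiser C's miss rate falls in the "not acceptable" band, which is the kind of disagreement the improvement actions on the next slide are meant to address.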
  17. Attribute MSA Study Conclusion
      If any of the decisions disagree, the measurement system may need improvement. Improvement actions include:
      • reworking the gauge,
      • re-training the inspectors,
      • clarifying the accept/reject criteria,
      • adding more lighting.
      After implementing the improvement actions, repeat the study. If the error cannot be eliminated, take appropriate corrective actions, such as switching to a new measurement system, adding redundant inspections, or conducting a more extensive study.
  18. Thank you very much for your attention!