I feel like I am misunderstanding something. Defect rate seems like the inverse of yield: 90% defective = 10% yield, which, no matter what "context" is missing, sounds abysmal. With 10 chips, a 10% yield (i.e., a 90% defect rate) means only 1 chip is good. With 1,000 chips, 900 are defective. No matter how you slice it, this seems terrible. Regardless of wafer size or number of chips per wafer, if you are only getting 10% out of it, that's a lot of waste?
Suppose a wafer has 20 defects scattered across it, each landing on a different die. If we have a small chip design where we can fit 100 dies on this wafer, only 20 will be hit by defects and the remaining 80 will be fine. That's an 80% yield rate. If we build a larger chip and only 25 can fit on a wafer, the same 20 defects leave 20 defective dies and 5 good dies. That's a yield rate of 20%.
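The arithmetic above can be sketched in a few lines. This is a deliberately simplified model: it assumes a fixed number of defects per wafer and that each defect lands on a distinct die (real yield models use statistical distributions such as Poisson, and multiple defects can hit the same die).

```python
def yield_rate(dies_per_wafer: int, defects_per_wafer: int) -> float:
    """Fraction of good dies, assuming each defect ruins one distinct die.

    A simplification for illustration: real fabs model defect density
    per unit area and use statistical yield models.
    """
    good_dies = max(dies_per_wafer - defects_per_wafer, 0)
    return good_dies / dies_per_wafer

# Same 20 defects on the wafer, two die sizes:
print(yield_rate(100, 20))  # small die:  0.8  (80% yield)
print(yield_rate(25, 20))   # large die:  0.2  (20% yield)
```

The point the example makes is that the defect count is a property of the wafer, not the design, so shrinking the die spreads the same defects over more chips and the yield percentage goes up.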