At work all our products do not eat children by default and it would require legal review to add any such feature. Therefore we don't need a procedure to make sure that any child-eating components get disabled in regions that have some concerns about child-eating.
Except too bad you're launching a washing machine which needs to be checked for child-eatingness on the tiny island of Zogdog, so you just decided not to sell it there.
> At work all our products do not eat children by default and it would require legal review to add any such feature.
I bet the people at Peloton thought that too until they made that treadmill[1]. I know you meant your critique to be absurd, but it turns out creating a child-eating machine by accident is entirely possible. I also bet Peloton product development now includes a process to review child-eatingness despite that not being their primary market, just the usual twice-burnt reflex.
Accidentally logging PII can easily happen to a single engineer. I managed to do it on a product that was privacy-focused, and the error slipped through review. The odds of such inadvertent errors rise linearly with the number of products and engineers, and the fines are probably superlinear with the size of the offending organization. If your 3-person consultancy chomps on a GDPR baby or two, no one will ever know about it, but if Google does it, it's going to be news headlines and millions in fines.
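For what it's worth, this failure mode is cheap to guard against mechanically instead of relying on review. A minimal sketch (the regex and class name are illustrative assumptions, not any particular product's code) of a logging filter that scrubs email addresses before they reach the log:

```python
import logging
import re

# Illustrative pattern: catches common email shapes, not a full RFC 5322 parser.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

class RedactPII(logging.Filter):
    """Rewrite the log message so email addresses never hit disk."""
    def filter(self, record):
        record.msg = EMAIL_RE.sub("[redacted]", str(record.msg))
        return True  # keep the record, just with PII scrubbed

logger = logging.getLogger("app")
logger.addFilter(RedactPII())
```

Attaching it at the root logger (or a handler) means a stray `logger.info(f"login for {user.email}")` gets scrubbed even when the engineer forgets; it's a backstop, not a substitute for not passing PII to the logger in the first place.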
Logging some PII by accident is also an issue and can also lead to compromises, but I think intentionally collecting it in bulk is the primary concern here. With Google I assume bulk data collection is their default stance, and that's why they need to carefully trim it down to just what the lawyers say is justifiable, rather than the other way around. That's the problem I'm gesturing at.