supermatt 1 day ago

Not available in the EU - so I guess the question is: what are they collecting from you, under the guise of playing chess, that would require them to block EU users...

dbish 22 hours ago

It’s probably not nefarious and just generally not worth the headache of an EU release and going through the checks/requirements for an experiment or early beta.

The EU's many rules stifle speed and change the math for releasing anything, especially in a big company that has built up a variety of requirements over time to reduce litigation risk or the chance of landing on the wrong side of one of the overreaching government officials there.

the8472 20 hours ago

> going through the checks/requirements

The process should be the reverse of that. Don't collect data unless you have been through the process of checking that it has a legal basis.

sangnoir 20 hours ago

Typically the internal processes to verify what is/is not collected are mandated by launch region, not whether or not the initial version of the software does data collection.

the8472 20 hours ago

At work all our products do not eat children by default and it would require legal review to add any such feature. Therefore we don't need a procedure to make sure that any child-eating components get disabled in regions that have some concerns about child-eating.

stuartjohnson12 18 hours ago

Except too bad you're launching a washing machine that needs to be checked for child-eatingness on the tiny island of Zogdog, so you just decide not to sell it there.

sangnoir 17 hours ago

> At work all our products do not eat children by default and it would require legal review to add any such feature.

I bet the people at Peloton thought that too, until they made that treadmill[1]. I know you meant your critique to be absurd, but it turns out creating a child-eating machine by accident is entirely possible. I also bet Peloton product development now includes a process to review child-eatingness despite that not being their primary market - just the usual twice-burnt reflex.

Accidentally logging PII can easily happen to a single engineer. I managed to do it on a product that was privacy-focused, and the error slipped through review. The odds of such inadvertent errors rise linearly with the number of products and engineers, and the fines are probably superlinear in the size of the offending organization. If your 3-person consultancy chomps on a GDPR baby or 2, no one will ever know about it, but if Google does it, it's going to be news headlines and millions in fines.

1. https://www.bbc.com/news/business-56993894
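To make the accidental-PII-logging failure mode concrete, here's a toy Python sketch (all names and fields hypothetical): the tempting catch-all log line quietly captures the client IP, while the careful version logs only the fields it has a basis to keep.

```python
def log_line_naive(request: dict) -> str:
    # Tempting debug one-liner: dump the whole request.
    # This quietly captures client_ip, which is PII under GDPR.
    return f"served request: {request!r}"

def log_line_safe(request: dict) -> str:
    # Log only the fields there's a documented basis to retain.
    return f"served request: path={request['path']}"

req = {"path": "/genchess", "client_ip": "203.0.113.7"}
print(log_line_naive(req))  # leaks the IP into the logs
print(log_line_safe(req))   # path only
```

The naive version is exactly the kind of thing that sails through review, because the leak is a side effect of a generic `repr`, not an explicit reference to the sensitive field.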

the8472 15 hours ago

Logging some PII by accident is also an issue and can also lead to compromises, but I think intentionally collecting it in bulk is the primary concern here. With Google I assume bulk data collection is the default stance, and that's why they need to carefully trim it down to just what the lawyers say is justifiable, rather than the other way around. That's the problem I'm gesturing at.

wyldberry 20 hours ago

This mindset assumes that you know what question you want to ask before you have the data, as opposed to having the data and then being able to generate hypotheses from it.

The legal basis is ever-shifting with the regional locale. As the n+1th requirement pops up, it's only natural to release things like GenChess in the place with the least friction, especially when it is not a revenue-generating event.

dbish 16 hours ago

Or don't launch in the EU until you have to - which is what most do now, so it's your choice.

drewmate 22 hours ago

I suspect it’s a byproduct of Google’s internal launch process which may require more work or longer processes for launching something in different jurisdictions. So it’s probably not an active decision that they couldn’t legally do this in the EU, but a result of being extra careful around what they can “launch” in a jurisdiction with potentially high penalties.

(I am a Googler, but not on this team, or familiar with their launch policies)

mgoetzke 23 hours ago

Yes, always suspicious. I mean the GDPR is not that complicated really, unless you really want to do personalized tracking

sbuttgereit 21 hours ago

I think one of the least wise things a person (or company) can do when faced with any law is to assume that it's "not complicated really."

Much, much wiser to assume "there be dragons" and only engage once qualified legal counsel has helped you understand what compliance means to you.

And along these lines... The second least wise thing to do in this scenario is listen to randos in a forum like this tell you, "but all you have to do to comply is..."

WesolyKubeczek 21 hours ago

The problem with thinking like yours is that legislation like GDPR is _really_ made to be simple and straightforward, but since companies whose livelihood depends on them abusing your privacy will fight it tooth and claw, they will gladly make it look like it's more complicated and insurmountable than it really is. They will also devise ways to comply in such ways that's most cumbersome for the end user and will readily blame GDPR for it.

To devise such a way to comply, they definitely need a large and expensive legal department.

The privacy abusers are much like trolls on the internet who, upon seeing a code of conduct (previously known as "rules") consisting of only "don't be a dick", will spawn endless arguments about what a "dick" is and how it is or is not inappropriate word, what does it really mean to be one, or, indeed, to be, question the use of the indefinite article, and complain about "don't" being too assertive and arrogant.

sangnoir 20 hours ago

> ...they will gladly make it look like it's more complicated and insurmountable than it really is

There are non-malicious explanations for the same pattern of behavior at large organizations - which motivation (malice or not) seems correct is a Rorschach test.

If I accidentally logged IP addresses for EU users who opted out on some throw-away experimental page on my site, Brussels would never find out. If Google does it, it not only has to report the incident but will most likely be fined. To avoid this outcome, they have internal review processes which make it "complicated and insurmountable" - because how do you justify investing many hours of dozens of lawyers' and technical reviewers' time in a frivolous, niche AI demo?

sbuttgereit 20 hours ago

Ok... Let's assume this is true (which I'll reiterate, that I contend assuming so is foolish). What happens when courts have interpretations of this "simple law?" Do the courts make an effort to keep things simple and in plain language? Or do lawyers and bureaucrats do what they can to drive unintuitive interpretations, but favorable to their cause, of otherwise plain language? Are European laws such as this subject to the interpretive lens of case law? If so, the best intentions of legislators may only be secondary relative to the actual rulings and unintended consequence of their laws. The problem with thinking like yours is that it dismisses all of this messy reality in favor maintaining the idealism that might have motivated public support of the law.

Those that have to follow those laws need to care about the mess.

WesolyKubeczek 9 hours ago

You should probably do some trivial research. No, there is no case law in most of Europe - certainly not in the EU.

kmac_ 20 hours ago

Second that. GDPR actually made those aspects clear and never caused a headache in any implementation I've seen or participated in (more like a checkbox on a list). When I do see complaints, it's clear some iffy user-sniffing is going on.

dbish 22 hours ago

For an experimental project, every extra requirement is annoying and slows down release. It's completely reasonable that they don't want to add more work before releasing to a large customer base that doesn't require it.

michaelt 20 hours ago

> I mean the GDPR is not that complicated really

Doesn't the EU also have an 'AI Act' that imposes additional rules, even when you're not tracking anyone?

And a lot of employers have legal teams who are extremely risk-averse, so even if it's obvious to you and me that rules about "deepfakes" don't apply to a tool for generating pictures of chess pieces made of cheese, doesn't mean legal will sign it off.

dspillett 22 hours ago

Not sure it is a GDPR issue - it isn't filtering me out on either mobile or the office connection. I'm in the UK, and despite Brexit we are still (technically at least) covered by GDPR (there are some differences in UK-GDPR, and more will come, but IIRC they are not significantly substantive yet). Unless, perhaps, they are banking on our ICO being too toothless to enforce anything.

allenjhyang 20 hours ago

From my experience helping my company with GDPR, IMO it's true that the principles of GDPR are straightforward. But there can be a fair amount of ambiguity in how certain parts are interpreted, so in practice if you're taking it seriously (which every company should), you'll want to loop in your lawyers. Then there are more and more conversations to make sure everybody understands what the company is doing and what their stance is on GDPR.

Sadly, GDPR is not a black-and-white (pun intended with the chess project) checklist with black-and-white checklist items.

pantalaimon 18 hours ago

I think this is not about GDPR but about the AI Act.