imadabouzu

joined 5 months ago
[–] [email protected] 0 points 2 months ago (2 children)

I am not a lawyer. But you wouldn't be surprised to hear that

  1. I don't have the inside story of Bing in Germany. It could be that Microsoft either doesn't want to do it well, or hasn't yet done it well enough. I'm not promising either in particular, but it can be done.
  2. Generally, as an engineer you have a pile of options with trade-offs. You absolutely can build nuanced solutions, because the law and the lawyers often live in nuanced realities. That is the reality even at the best sorts of tech companies that are trying.

My contention is that maximalism and strict binary assumptions won't work on either end and don't satisfy what anyone truly wants or needs. If we're not careful about what it takes to move the needle, we agree with them by saying "it can't be done, so it won't be done."

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago) (4 children)

That's a good question, because there is nuance here! It's interesting because I ran into this issue while working on similar projects. First off, it's important to understand what your obligation is and how to think about data deletion. No one believes it is necessary to permanently remove all copies of anything, any more than it is necessary to prevent all forms of plagiarism. No one is complaining that it is possible to plagiarize at all; we're complaining that major institutions continue to do so with ongoing disregard for the law.

Only maximalists fall into the trap of thinking of the world in a binary sense: either all in or nothing at all.

For most of us, it's about economics and risk profiles. Open source models get trained continuously over time; there won't be one version. Saying that open source operators have some obligation to curate future training in good faith to comply has a long-tail impact on how the model evolves. Previous PII or plagiarized data might still exist, but its value, novelty, and relevance to economic life drop sharply over time. No artist or writer argues that copyright protections need to exist forever. They literally just need survivable working conditions and respect for attribution. The same goes for PII: no one claims they must be completely anonymous. They just want cyber crime taken seriously rather than abandoned in favor of one party taking the spoils of their personhood.

Also, yes, there are algorithms that can control how further learning promotes or demotes growth and connections relative to various policies. It's not that any one policy is perfect; what matters is a mere willingness to adopt policies in good faith (most such LLM filters are intentionally weak so that those with $$ paying for API access can outright ignore them, while the companies turn around and claim it can't be solved, too bad so sad).

Yes. It is possible to perturb and influence the evolution of a continuously trained neural network based on external policy, and they're carefully lying through omission when they say they can't 100% control it or 100% remove things. Fine. That's not necessary, neither in copyright nor privacy law. It never has been.
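To make the "promote or demote" idea concrete: here's a minimal sketch of one such policy, assuming a per-example loss-weighting scheme applied between continued-training passes. The function names and the hash-based blocklist are illustrative assumptions, not any company's actual pipeline; the point is only that honored removal requests can demote content's influence on future training without any claim of 100% erasure.

```python
import hashlib

def content_hash(text: str) -> str:
    """Fingerprint a training example so removal requests can match it."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def policy_weights(batch, blocklist, demote=0.0, promote=1.0):
    """Assign per-example loss weights for the next training pass.

    Examples whose fingerprint appears in the blocklist (e.g. honored
    deletion or plagiarism claims) get the demote weight; everything
    else keeps full weight. Over repeated continued-training passes,
    demoted content's influence on the evolving model decays.
    """
    return [demote if content_hash(x) in blocklist else promote
            for x in batch]
```

In a real trainer these weights would multiply each example's loss term; here the weighting policy itself is the whole point, and swapping in a softer demote value (say 0.1) is exactly the kind of nuanced, non-binary knob the comment above describes.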

[–] [email protected] 9 points 2 months ago

It even works the other way! What if the superintelligent, all-knowing supercomputer simulates everything, concludes you can get to the end by any means, that there is no meaning in rushing, ordering, or prioritizing anything more than is already the case, and, like the rest of nature, conserves by taking only the minimal action, replying, "nah, you can walk there yourselves" before resigning itself to an internal simulation of arbitrary rearrangements of noise?

This would be insufferable to the people who believed in shortcuts.

[–] [email protected] 9 points 2 months ago

In a sense, to me, it is the same thing. If your business is built upon repurposing everyone else's inputs indiscriminately, to your benefit and their detriment, it is too expensive to reveal that simple truth.

[–] [email protected] 9 points 2 months ago

Imagine a corporation finding its own voice as a genuine signal of awareness of its customers. Nope, gotta sell your soul to tech stocks.

[–] [email protected] 35 points 2 months ago (11 children)

Despite what the tech companies say, there are absolutely techniques for identifying the sources of their data, and there are absolutely techniques for good-faith data removal upon request. I know this because I've worked on such projects at some of the less major tech companies that make some effort to abide by European law.

The trick is, it costs money, and the economics shift such that one must eventually begin to do things like audit and curate. The shape and size of your business, plus how you address your markets, take on a nuance that doesn't exist when your entire business model is the smooth, mindless amortizing of other people's data.
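The "identify sources, honor removals" work above can be sketched very simply. This is a toy provenance index, assuming a fingerprint-keyed mapping from training text to its source; the class and method names are hypothetical, and real systems would use fuzzier matching, but the audit-and-curate loop is the same shape.

```python
import hashlib

def fingerprint(text: str) -> str:
    """Stable content fingerprint used as the index key."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

class ProvenanceIndex:
    """Toy source-attribution index for a training corpus."""

    def __init__(self):
        self._sources = {}  # fingerprint -> source identifier

    def register(self, text, source):
        """Record where a piece of training data came from."""
        self._sources[fingerprint(text)] = source

    def lookup(self, text):
        """Answer 'where did this come from?'; None if untracked."""
        return self._sources.get(fingerprint(text))

    def removal_request(self, text):
        """Honor a deletion request; report whether anything was on file."""
        return self._sources.pop(fingerprint(text), None) is not None
```

Nothing here is hard; it's bookkeeping. Which is the point: the barrier to doing this at scale is the cost of the auditing and curation around it, not some missing algorithm.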

But I don't envy these tech companies, or the increasingly absurd stories they must tell to hide the truth. A handsome sword hangs above their heads.

[–] [email protected] 7 points 2 months ago

Moravec's Paradox is actually more interesting than it appears. You don't have to take his reasoning or Pinker's seriously, but the observation is salient. The paradox also gets stated in other ways by other scientists; it's a common theme.

One way I often think about it: in order for you to survive, the intelligence of moving through unknown spaces and managing numerous fuzzy energy systems is way more important to prioritize and master than, like, the abstract conceptual spaces, which are both not full of calories and also cheaper to externalize anyway.

It's part of why I don't think there is a globally coherent hierarchy of intelligence, or perhaps even general intelligence at all. Just the distances and spaces that a thing occupies, and the competencies that define being in that space.

[–] [email protected] 4 points 2 months ago

Isn't the primary reason people are so powerfully persuaded by this technology that they're constantly assured that if they don't use its answers they will have their life's work and dignity taken from them? Like, how many are in the control group where you persuade people with a gun to their head?

[–] [email protected] 7 points 3 months ago (1 children)

He's so close to being depressed enough to maybe ask a vital question about meaning and his own relationship with technology. But he'll probably just buy more AI.

[–] [email protected] 6 points 3 months ago (1 children)

Credit is a funny thing. If you merely exist in proximity to a solution, you can, by some means, claim credit for it.

"AI solved the climate crisis, because look, the climate crisis was solved, and some people also used AI!"

[–] [email protected] 7 points 3 months ago

You don't have to agree with someone to recognize that they care.

[–] [email protected] 14 points 3 months ago

Is this what competing product releases look like now? Ilya runs off and promises to "never release any software until it's superintelligent," and I guess that forces Sam to compete for debt by promising to release software AND superintelligence?
