JTugger

joined 1 year ago
[–] JTugger 1 points 9 months ago

There are tools to manage major hallucinations, and more are coming: automated fact checking, pattern analysis, multi-layer analysis, etc.

Yes, there are functional mechanisms that drive hallucinations, especially in probabilistic models. But there are some powerful tools that automate analysis of the outputs and rework them for accuracy. Those are likely to improve until they eventually reach a level of trust that is sufficient for many business use cases.
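The rework idea above can be sketched as a simple verify-and-revise loop. This is a minimal illustration, not any specific product: `generate` and `fact_check` are hypothetical stand-ins for a real LLM call and a real checker (e.g., retrieval-backed verification), neither of which the comment names.

```python
# Minimal sketch of an automated output-verification loop.
# generate() and fact_check() are hypothetical placeholders for a real
# LLM call and a real fact checker; they only simulate the behavior.

def generate(prompt: str, feedback: str = "") -> str:
    # Placeholder for an LLM call; incorporating feedback marks the
    # output as a revised draft so the loop can terminate.
    suffix = f" [revised per: {feedback}]" if feedback else ""
    return f"draft answer to: {prompt}{suffix}"

def fact_check(answer: str) -> list[str]:
    # Placeholder checker: flags an issue on the first, unrevised draft
    # and passes any revised draft.
    return [] if "[revised per:" in answer else ["unsupported claim"]

def answer_with_verification(prompt: str, max_reworks: int = 3) -> str:
    """Generate an answer, check it, and rework until no issues remain."""
    answer = generate(prompt)
    for _ in range(max_reworks):
        issues = fact_check(answer)
        if not issues:
            break
        answer = generate(prompt, feedback="; ".join(issues))
    return answer
```

The loop structure (generate, check, feed issues back, regenerate) is the general pattern; real systems differ in how the checker works and when they give up.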

[–] JTugger 6 points 9 months ago (2 children)

Those pointing to hallucinations and such are focused on generative AI as it is today. However, it will be vastly different in 4-6 years, when someone who starts law school today would leave it. This technology is on a growth curve more rapid than most, if not all, we have seen in history.

A lot of the issues in AI today will be mitigated by the time the newly minted attorneys are ready to practice.

[–] JTugger 5 points 1 year ago

Thrift stores and garage sales are good places to find cheap TVs with composite inputs. That’s what I would do.

[–] JTugger 3 points 1 year ago

What am I intentionally ignoring?

What can I do in one hour or less to make my life better?

[–] JTugger 2 points 1 year ago

It’s absolutely amazing. Thanks for the hard work.