this post was submitted on 21 Jun 2024

Technology


A month after he left OpenAI amid disagreements over the safety of the company's products, Dr. Ilya Sutskever announced a new venture called Safe Superintelligence (SSI). “Building safe superintelligence (SSI) is the most important technical problem of our time,” read the new company's announcement, also signed by fellow co-founders Daniel Gross and Daniel Levy. “We have started the world’s first straight-shot SSI lab, with one goal and one product: a safe superintelligence. It’s called Safe Superintelligence. SSI is our mission, our name, and our entire product roadmap, because it is our sole focus. Our team, investors, and business model are all aligned to achieve SSI.”

The founders of SSI have deep ties to Israel. Sutskever (37) was born in the USSR and immigrated to Jerusalem at the age of five. He began his academic studies at the Open University but completed all of his degrees at the University of Toronto, where he earned a doctorate in machine learning under Prof. Geoffrey Hinton, one of the early pioneers of artificial intelligence (AI).

[email protected] | 7 points | 4 months ago

safe /for us/ super AI