Singularity
The technological singularity—or simply the singularity—is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable changes to human civilization. According to the most popular version of the singularity hypothesis, I. J. Good's intelligence explosion model, an upgradable intelligent agent will eventually enter a "runaway reaction" of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an "explosion" in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.
— Wikipedia
This is a community for discussing the theoretical and practical consequences of the singularity, or of any other innovation in machine learning with the potential to disrupt our society.
You can share news, research papers, discussions and opinions. This community is mainly meant for information and discussion, so entertainment (such as memes) should generally be avoided, unless the content is thought-provoking or has some other redeeming quality.
Rules:
- Be nice to everyone, even if you disagree.
- No spam. No ads.
- No NSFW.
- Self-promotion is acceptable if not excessive (i.e. no spam).
The paper is really interesting, even though it looks like it was run through Google Translate. I'm not sure about the credibility of this particular institution, but the concept of using BOLD data to initialize the simulation is a really interesting one. I wonder if it would work better with data from the new sub-micron fMRI tech.
The really fascinating point is that they were able to validate the simulation by giving it inputs similar to what a live brain would receive, and got very similar responses. That's amazing, and I'd like to see whether anyone else manages to replicate their results.
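To make that validation idea concrete, here's a toy sketch of how such a comparison could look. To be clear, this is my own illustration and not the paper's actual method: the rate model, the random connectivity, the synthetic "recorded" response, and every parameter name are assumptions. The idea is just: initialize a simple dynamical model from a BOLD-like activation snapshot, drive it with a stimulus, and correlate its response against a recorded one.

```python
# Toy sketch (not the paper's method): initialize a simple rate model from a
# BOLD-like activation snapshot, drive it with a stimulus, and compare the
# simulated response to a "recorded" response via Pearson correlation.
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_steps, dt = 50, 500, 0.01

# Stand-in for a BOLD-derived initial state (one activation value per region).
bold_init = rng.normal(0.0, 1.0, n_regions)

# Random connectivity; in a real study this would come from structural data.
W = rng.normal(0.0, 1.0 / np.sqrt(n_regions), (n_regions, n_regions))

def simulate(initial_state, stimulus):
    """Leaky rate model: dx/dt = -x + tanh(W @ x + stimulus)."""
    x = initial_state.copy()
    trace = np.empty((n_steps, n_regions))
    for t in range(n_steps):
        x += dt * (-x + np.tanh(W @ x + stimulus[t]))
        trace[t] = x
    return trace

# Shared stimulus: a brief pulse delivered to the first few regions.
stimulus = np.zeros((n_steps, n_regions))
stimulus[50:100, :5] = 1.0

sim_response = simulate(bold_init, stimulus)

# Fake "recorded" response standing in for live-brain data (here just a noisy
# copy of the simulation, so the correlation is high by construction).
recorded_response = sim_response + rng.normal(0.0, 0.1, sim_response.shape)

r = np.corrcoef(sim_response.ravel(), recorded_response.ravel())[0, 1]
print(f"simulated vs. recorded response correlation: r = {r:.3f}")
```

A real replication would obviously compare against held-out recordings from an actual brain rather than a noisy copy of the simulation, and would need a much richer model, but this is roughly the shape of the test they seem to be describing.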