The Singularity

Discussion in 'Science and Nature' started by nitrum, Mar 13, 2008.

  1. When you think about it, all our human traits are just byproducts of our evolution, and most are just the bare minimum that early humans needed to survive in Ethiopia thousands of years ago. It's amazing that we even have the ability to carry out science experiments, and that we are born with an inner desire to understand our own existence. That's pretty impressive for a bunch of primates from Africa.

    Human traits are still just byproducts, though. But we are fast approaching a time when we might be able to create an entity with a purpose beyond survival. Whether you feel that the greatest "good" in the universe is acquiring knowledge, individuality, wonder, being happy, or simply being a sentient conscious being, it may be possible to create a being that maximizes these traits.

    People call the point when an intelligent being starts exponentially adding to its own intelligence the Technological Singularity. This is how I personally believe it will happen (assuming we don't all die first): I think we will gradually interface our brains with computers. This will lead to an enormous increase in our intelligence, and also make it possible to understand how our minds work (consciousness and intelligence). Then we can build a hyper-intelligent being, which can then build increasingly smarter beings.

    This sounds scary, like robots taking over, or like this God-like "race" would make humans obsolete. But I believe humans would integrate themselves into the singularity (or singularities) and be a part of it if they chose. This would be an expansive network of intelligence that makes the greatest goals of science, art, philosophy, ethics, and happiness possible. Life would extend indefinitely, but some would eventually choose to die.

    Still, I don't believe this is possible through artificial intelligence alone. Regardless of how smart a computer becomes, it still isn't conscious. I'm not saying there is something magical about biological matter, but I think that in order to gain a "soul" a computer would need to interface with something biological.

    I admit, I have a somewhat romanticized view of the subject. I imagine a future where every available atom buzzes with awareness. If that is where this leads, humanity basically has a moral obligation to create the singularity. Essentially, this is the next step (and the greatest step) in the evolution of man.

    Perhaps this won't work, though. If you were smart enough to understand EXACTLY how your brain worked, it might ruin it for you. If you could observe the very electrical impulses that make consciousness possible, everything might become meaningless. (That's a depressing thought.) Maybe ignorance is necessary to be happy, and infinite knowledge is infinite emptiness. But I guess you could program ignorance in (this sounds a lot like Brave New World).

    So what do you say? Do you think a singularity is possible? Is it moral? Is it moral NOT to? Is ignorance of our own brains necessary?

    You can probably guess that I think about this a lot when I'm high.
     