Friday, December 5, 2014

Technological Singularity

Technological Singularity is the idea that computers will continue to evolve until they have great enough intelligence to "think," or be "awake," or in other words, until they are conscious. The most controversial topic in the study of Technological Singularity is whether we can create an Artificial Intelligence (AI) equivalent to human consciousness, one that would allow machines to be human. If a machine were to gain consciousness, it would most likely recognize humans as inferior beings and leave us behind while it explores the universe. However, unlike in most robot-apocalypse movies, in which the robots decide to destroy the human race, these machines might instead view us as their gods, or their creators, because we gave them life. This is most likely the reason that, if they were to leave us, they would do so peacefully.



Charles Platt, a Technological Singularity enthusiast, says he would not be surprised if human-like AI is created before 2030. The exact date at which this new AI is created depends on the continued advancement of technology. If humans keep advancing technology at the exponential rate it has sustained for the last several decades, this new AI shouldn't be too far in our future.
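To make the idea of exponential advancement concrete, here is a toy calculation of my own (not from the source), assuming computing power doubles roughly every two years, as Moore's law has historically described:

    # Toy illustration: exponential growth in computing power, assuming a
    # doubling period of two years (roughly Moore's law). The starting year
    # and doubling period are illustrative assumptions, not from the source.
    def growth_factor(years, doubling_period=2.0):
        """Return how many times more powerful computers would be after `years`."""
        return 2 ** (years / doubling_period)

    # From 2014 to 2030 is 16 years: about 2**8 = 256x more computing power.
    print(growth_factor(2030 - 2014))  # 256.0

Under that (very rough) assumption, the computers of 2030 would be a couple hundred times more powerful than today's, which is why predictions like Platt's don't sound so wild.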



One of the most popular suggestions for avoiding being left behind by this intelligence is to simply become one of them. Putting our brains in a mechanical exoskeleton is not as futuristic as it seems. People today are already turning themselves part robot by implanting computers and other machinery into their bodies, technically making them cyborgs. The next step in evolution could very well be becoming cyborgs, or eventually even robots.



Despite the preconceived notion that the Technological Singularity is far beyond our generation's time, the truth is that we really aren't very far from being inferior to machines. The age of machines is only just beginning, and the fact that we have advanced as far as putting machinery inside ourselves proves it. The days in which man is smarter than machine may soon be in the past.

Sources: http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html

Programming and Coding

Programming is the process of writing sets of instructions that a computer executes to carry out commands. In order to code, you need knowledge of the application domain, as well as formal logic and specialized algorithms for the different functions and commands you want to implement.
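As a minimal illustration of my own (the numbers and names are made up), here is what such a set of instructions looks like in Python, one popular programming language:

    # A tiny program: a set of instructions the computer executes in order.
    def average(numbers):
        """Return the arithmetic mean of a list of numbers."""
        return sum(numbers) / len(numbers)

    grades = [88, 92, 75, 100]
    print("Average grade:", average(grades))  # Average grade: 88.75

Even this small example shows the ingredients above: a bit of domain knowledge (what an average is), formal logic (divide the sum by the count), and an algorithm wrapped up as a reusable command.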

The story of coding arguably begins in the Fertile Crescent of Mesopotamia, more specifically in Sumer, with the earliest known computing device. This early Sumerian abacus was the first basic mechanism established for computing numbers. It was created around 2500 BC and was nothing more than a flat surface, usually wood, with sand spread evenly across it and small numbers carved into the grain of the wood. The abacus wasn't improved upon until roughly 200 BC, shortly after the death of the famous mathematician Archimedes. The only change was that the device now had tiny grooves for counters, and the sand was removed entirely. The next version of the abacus eliminated the grooves and instead attached the counters to thin metal rods, allowing them to move more freely. This is the most historically recognizable version of the abacus. Each wire corresponded to a digit in a positional number system, commonly base 10. Greek mathematicians made the most use of this version of the abacus during the period of this innovation.
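To illustrate the positional idea behind the abacus, here is a short sketch of my own (not from the source) that splits a number into the base-10 digits each rod would hold:

    # Sketch: represent a number the way an abacus would, one base-10
    # digit per rod (wire), least-significant rod first.
    def abacus_rods(n, base=10):
        """Return the digit shown on each rod for the number n."""
        digits = []
        while n > 0:
            digits.append(n % base)   # counters on the current rod
            n //= base                # move on to the next rod
        return digits or [0]

    print(abacus_rods(1492))  # [2, 9, 4, 1] -> ones, tens, hundreds, thousands

The positional trick, where a counter's meaning depends on which wire it sits on, is the same one our written numerals (and our computers) use today.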



Coding and programming continued to evolve slowly through the years, eventually becoming what they are today. The next step came as a bit of a surprise because, in truth, it wasn't really coding at all. Charles Babbage created his famous difference engine, which could only be made to execute tasks by shifting gears; the gears carried out the calculations, making this the earliest form of a computer "language," expressed in physical motion rather than in code on a computer.
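What the difference engine mechanized is the method of finite differences: once the leading differences of a polynomial are set on the gears, every new value comes out by additions alone. Here is a rough sketch of that idea in code (my own illustration, not Babbage's notation):

    # Sketch of the method of finite differences that Babbage's difference
    # engine mechanized: tabulating a polynomial using only additions.
    def tabulate(initial_differences, steps):
        """Given a polynomial's value and leading differences at x=0,
        produce successive values using repeated addition only."""
        diffs = list(initial_differences)
        values = []
        for _ in range(steps):
            values.append(diffs[0])
            # Each difference absorbs the one below it -- pure addition.
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return values

    # For f(x) = x**2: f(0)=0, first difference f(1)-f(0)=1, second difference=2.
    print(tabulate([0, 1, 2], 6))  # [0, 1, 4, 9, 16, 25]

No multiplication is ever needed, which is exactly why the scheme could be built out of adding gears.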



Programming and coding didn't really start to become advanced until the ENIAC, designed by John Mauchly and J. Presper Eckert, was built during the Second World War and completed in 1946. The ENIAC was a calculating device created for the armed forces, and its first major job was computing artillery firing tables, calculations that had previously been done slowly by hand. After this amazing innovation, the technology quickly spread to other parts of the world and continued to evolve into what it is today. The internet is a product of computer programming; without programming, we wouldn't have the internet today. That is hard to imagine, considering how much of what we do nowadays happens on a computer. In reality, if early computer programming had never come to be, I wouldn't be writing this right now. But fortunately, I am able to write this, so I guess I'll close it out.
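Firing-table work boils down to step-by-step numerical integration of a shell's motion. As a greatly simplified sketch of my own (ignoring air resistance, with made-up parameters; real firing tables were far more involved), the same kind of calculation looks like this:

    import math

    # Greatly simplified sketch of the trajectory calculations behind
    # artillery firing tables: step-by-step (Euler) integration of
    # projectile motion. Air resistance is ignored; values are illustrative.
    def range_of_shot(speed, angle_deg, dt=0.001, g=9.81):
        """Return the horizontal distance traveled before the shell lands."""
        vx = speed * math.cos(math.radians(angle_deg))
        vy = speed * math.sin(math.radians(angle_deg))
        x = y = 0.0
        while y >= 0.0:
            x += vx * dt
            y += vy * dt
            vy -= g * dt  # gravity slows, then reverses, the vertical speed
        return x

    print(round(range_of_shot(300.0, 45.0)))  # about 9174 m at 45 degrees

Repeating thousands of tiny time steps like this for every combination of angle and charge is exactly the kind of tedious arithmetic that made an electronic computer worth building.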

In conclusion, I've just reviewed the history of programming and how it has developed since its earliest roots around 2500 BC.

Sources: www.cut-the-knot.org/blue/abacus.shtml
         cs.brown.edu