Artificial Intelligence: The term “algorithm” is being used far more frequently than in the past. One reason is that scientists have discovered that, given a few simple instructions, computers can learn on their own. Algorithms are nothing more than mathematical instructions. An algorithm, according to Wikipedia, is “a step-by-step technique for calculations. Calculation, data processing, and automated reasoning are all done with algorithms.”
Algorithms are becoming a commonplace aspect of our lives, whether we realize it or not. This trend, according to some analysts, is dangerous. “The NSA revelations emphasize the importance sophisticated algorithms play in sorting through quantities of data,” writes Leo Hickman (@LeoHickman). “But what’s more startling is how common they are in our daily lives. Should we be more afraid of their power?” [“How Algorithms Rule the World,” The Guardian, 1 July 2013] Although declaring that algorithms rule the world is a bit dramatic, I agree that their use is growing increasingly ubiquitous, because computers are becoming ever more vital in so many facets of our lives. I appreciate this explanation from HowStuffWorks:
“To get a computer to perform anything, you need to write a computer program. To write a computer program, you must explain to the computer, step by step, exactly what you want it to accomplish. The computer then ‘executes’ the program, mechanically carrying out each step to reach the desired outcome. When you tell the computer what to do, you also get to pick how it does it. This is where computer algorithms come in. The algorithm is the fundamental method of accomplishing the task.”
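The “step-by-step” idea can be made concrete with one of the oldest algorithms on record. The sketch below (Python; my example, not from the article) implements Euclid’s algorithm for the greatest common divisor: a fixed recipe of steps that the computer carries out mechanically.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a fixed, step-by-step recipe.

    Step 1: if b is zero, the answer is a.
    Step 2: otherwise, replace (a, b) with (b, a mod b) and repeat.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

The same inputs always follow the same steps to the same answer; nothing is learned, which is exactly the contrast drawn in the next paragraph.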
The only part of that statement that isn’t quite right is that a computer must be told “exactly what you want it to accomplish” step by step. Some computer algorithms are designed to let computers learn on their own rather than simply follow explicitly set instructions (i.e., they facilitate machine learning). Data mining and pattern recognition are two applications of machine learning. “Today’s internet is dominated by algorithms,” writes Klint Finley.
“These mathematical constructs control what you see on your Facebook page, Netflix recommendations, and Gmail advertising.” [“Want to Make Your Own Google? Algorithms May Be Found on the App Store,” Wired, 11 August 2014]
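To illustrate “learning on their own”: instead of coding an explicit rule for every input, a program can derive its rule from examples. Below is a minimal sketch (Python; the data and function names are invented for illustration) of a nearest-centroid classifier, one of the simplest pattern-recognition schemes.

```python
# Minimal sketch of "learning from data": no per-input rule is written;
# the rule is derived from labeled examples. Illustrative only.

def train(examples):
    """examples: list of (features, label). Returns per-label centroids."""
    sums, counts = {}, {}
    for features, label in examples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            s[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lab: [x / counts[lab] for x in s] for lab, s in sums.items()}

def predict(centroids, features):
    """Return the label whose centroid is closest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

data = [([1.0, 1.0], "small"), ([1.2, 0.8], "small"),
        ([8.0, 9.0], "large"), ([9.0, 8.5], "large")]
model = train(data)
print(predict(model, [1.1, 0.9]))  # small
print(predict(model, [8.5, 9.1]))  # large
```

Real recommendation and ad-targeting systems use far richer models, but the principle is the same: the behavior comes from data, not from hand-written instructions.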
Algorithms, like mathematical equations, are neither good nor bad. People, however, have used algorithms with both good and harmful intentions. “[Algorithms] are now intertwined into our lives,” Dr. Panos Parpas, a lecturer in the Department of Computing at Imperial College London, told Hickman. “On the one hand, they are beneficial since they free up our time and perform routine tasks for us. The current debate about algorithms is not about algorithms in and of themselves, but rather about how society is structured in terms of data usage and privacy. It also has to do with how models are used to forecast the future. Data and algorithms are now oddly married. There will be blunders as technology advances, but it’s vital to remember that they’re simply tools. We shouldn’t hold mistakes against our tools.”
Algorithms are not a new concept. As noted above, they are essentially mathematical instructions. Their application in computers can be traced back to Alan Turing, one of the pioneers of computational theory. In 1952, Turing “presented a series of equations that aimed to describe the patterns we observe in nature, from the dappled stripes on a zebra’s back to the whorled leaves on a plant stem, or even the complicated tucking and folding that converts a ball of cells into an individual.” [“The Powerful Equations That Explain The Patterns We See In Nature,” by Kat Arney (@harpistkat), Gizmodo, 13 August 2014] Turing rose to prominence during WWII after helping decipher the Enigma code. Sadly, Turing took his own life two years after that paper was published. Fortunately, Turing’s influence on the world did not end with his death. According to Arney, scientists are still using Turing’s equations to identify patterns in nature. Arney sums up:
“Alan Turing’s mathematical ambition — a programmable electronic computer — flared into being from a cantankerous assemblage of wires and tubes in the latter years of his life.” It could only crunch a few digits at a snail’s pace back then. Today’s smartphones are jam-packed with computational capability that would have astounded him. It’s taken nearly a lifetime to put his biological vision into scientific reality, but it’s turning out to be more than a nice explanation and some fancy mathematics.”
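For reference, Turing’s 1952 pattern-formation work modeled two interacting chemical “morphogens” with a pair of coupled reaction-diffusion equations. A standard modern statement of that scheme (the symbols here are conventional notation, not the article’s) is:

```latex
\begin{aligned}
\frac{\partial u}{\partial t} &= f(u, v) + D_u \nabla^2 u \\
\frac{\partial v}{\partial t} &= g(u, v) + D_v \nabla^2 v
\end{aligned}
```

Here \(u\) and \(v\) are the two morphogen concentrations, \(f\) and \(g\) describe their chemical interaction, and \(D_u, D_v\) are diffusion coefficients. Turing’s insight was that when one substance diffuses much faster than the other (\(D_v \gg D_u\)), the uniform state can become unstable and stable spatial patterns such as spots and stripes emerge.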
“Data scientists need to validate whether their conclusions make sense,” Douglas Merrill, CEO of Zest Finance, told Gage. “People aren’t being replaced by machine learning.” Part of the problem is that most machine-learning systems do not combine reasoning with their computations. They simply spit out correlations, whether or not those correlations make sense. “Zest Finance eliminated another conclusion from its program,” Gage writes, “that taller individuals are better at repaying loans, a concept Mr. Merrill dismisses.” When reasoning is included in machine-learning systems, correlations and insights become significantly more valuable. “Part of the difficulty is that when we humans communicate, we rely on a broad background of unspoken assumptions,” says Catherine Havasi (@havasi), CEO and co-founder of Luminoso.
“We assume everyone we meet shares this knowledge. It forms the basis of how we interact and allows us to communicate quickly, efficiently, and with deep meaning.” [“Who’s Doing Common-Sense Reasoning And Why It Matters,” TechCrunch, 9 August 2014] She adds, “As advanced as technology is today, its main shortcoming as it becomes a large part of daily life in society is that it does not share these assumptions.”
Havasi goes on to say:
“Common-sense reasoning is a branch of artificial intelligence that seeks to develop ways to gather these assumptions and teach them to computers, so that computers can understand and interact with people more naturally. Though considerable work has been done in other areas, common-sense reasoning has proved most effective in the field of natural language processing (NLP). Despite its unusual name, this branch of machine learning is quietly making its way into a variety of applications, from text comprehension to processing and analyzing what’s in a photo. In an increasingly digitized and mobile environment, it will be difficult to construct flexible and unsupervised NLP systems without common sense. … NLP is a field in which common-sense reasoning shines, and the technology is beginning to make its way into commercial products. Even though there is still a long way to go, common-sense reasoning will continue to advance rapidly in the coming years, and the technology is solid enough to be used in business today. It has substantial advantages over conventional ontology- and rule-based systems, as well as over systems based purely on machine learning.”
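One concrete way to “gather and teach these assumptions to computers” is to store them as subject-relation-object triples that software can query, which is roughly the approach of common-sense knowledge bases such as ConceptNet. The sketch below (Python; the assertions and relation names are invented examples, not a real API) shows the idea.

```python
# Hypothetical mini knowledge base of common-sense assumptions,
# stored as (subject, relation, object) triples.

ASSERTIONS = {
    ("coffee", "is_a", "drink"),
    ("coffee", "used_for", "waking up"),
    ("cup", "capable_of", "holding liquid"),
    ("drink", "located_at", "kitchen"),
}

def related(concept, relation):
    """Everything the knowledge base asserts about concept via relation."""
    return sorted(obj for subj, rel, obj in ASSERTIONS
                  if subj == concept and rel == relation)

print(related("coffee", "is_a"))     # ['drink']
print(related("cup", "capable_of"))  # ['holding liquid']
```

A system with access to such triples can answer questions no statistical correlation alone would settle, e.g., that a cup is a sensible thing to put coffee in; real systems hold millions of such assertions and attach confidence weights to each.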
Algorithms may make systems smarter, but they can also yield some strange results if they aren’t combined with a little common sense.
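As a toy illustration of such strange results, the height-and-loans anecdote above is easy to reproduce: a statistics pipeline will happily report a correlation with no notion of whether it makes sense. The sketch below (Python; the sample data are fabricated so that the correlation appears by construction) computes a Pearson correlation by hand.

```python
# A pipeline that reports correlations has no notion of sense: in this
# fabricated sample, height "predicts" loan repayment by construction.

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

heights = [160, 165, 170, 175, 180, 185]  # cm (fabricated sample)
repaid  = [0,   0,   1,   1,   1,   1]    # 1 = repaid loan

r = pearson(heights, repaid)
print(round(r, 2))  # 0.83 -- a strong "correlation" with no causal sense
```

Nothing in the arithmetic flags the finding as absurd; that judgment, as Merrill notes, still has to come from a human or from a system that reasons about what the numbers mean.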
Stephen F. DeAngelis is the President and CEO of Enterra Solutions, a cognitive computing startup.