Artificial neural networks

Neural computing has emerged as a practical technology in recent years, with successful applications in fields as diverse as medicine, finance, geology, physics, engineering and biology. The excitement stems from the fact that these networks are attempts to model the capabilities of the human brain. From a statistical point of view neural networks are … Read more

A Review on Basic Features of Swarm and Swarm Intelligent Systems

Abstract: Nature exhibits many surprising behaviors which have always attracted researchers towards them. One such behavior, that of social insects, is known as ‘Swarm Intelligence’. The roots of swarm intelligence are deeply embedded in the study of self-organized behaviors in insects. These natural swarm intelligent systems have many features, for example labor division, foraging … Read more

E-Ball: Compact Computing

The E-Ball is an innovative, compact computer that redefines the boundaries of personal computing. Unlike traditional laptops, desktops, or other artificial intelligence devices, the E-Ball offers a unique blend of portability, functionality, and futuristic design. This groundbreaking technology was conceived by Macedonian product designer Apostol Tnokopsvki, born on July 15, 1982, who envisioned a device … Read more

Classification of Pattern Recognition and Image Clustering

Abstract Recent advancements in pattern recognition have accelerated significantly, driven by emerging applications that are not only challenging but also computationally demanding. Pattern recognition is the science of drawing inferences from perceptual data using various tools, including statistics, probability theory, computational geometry, machine learning, signal processing, and algorithm design. As a discipline, it holds central … Read more

New security primitive relying on unsolved hard AI problems

INTRODUCTION Security means protecting information perfectly. Security primitives are based on hard mathematical problems. Since security primitives are used as building blocks, they must be very reliable. Creating security routines is very hard, and it raises the following issues with security primitives: 1. Designing a new security primitive is very time-consuming and very error prone, even for experts in the field. 2. Security … Read more

Advancements in Deep Learning Architectures

Introduction and related work: Allowing computing agents to model our world well enough to exhibit what we call intelligence has been the focus of more than half a century of research. To accomplish this, it is clear that a vast amount of information about our world must somehow be … Read more

Neural Networks: From Biological Systems to Advanced Artificial Intelligence Models

Neural network and its classification Conventionally, the term Neural Network referred to a network or circuit of biological neurons. Currently, the term is frequently used to refer to ANNs (Artificial Neural Networks), which are composed of nodes or artificial neurons. Thus this term has two different usages: Figure: Biological neural networks … Read more

Optimal Placement of Distributed Generation (DG)

In this write-up, we will consider an effective method of optimally placing Distributed Generation (DG) to minimize power loss, voltage deviation and harmonics in electrical distribution systems. The proposed algorithm will determine the optimal DG location in a specified distribution network, as well as solving various problems in the distribution system. This new technique will … Read more

What is AI? A Brief History of Artificial Intelligence and Its Applications

A study into the design of track-based autonomous vehicles with implementation of high-level systems for optimal speed and safety. What is AI? The official idea and definition behind AI was coined by John McCarthy in 1955 at the Dartmouth Conference. McCarthy proposed, ‘Every aspect of learning or any other feature of intelligence … Read more

Should Artificial Intelligence take over the jobs of the tertiary sector?

Artificial Intelligence (A.I.) is the machine-based replication of human intelligence, which becomes more and more autonomous each year, requiring less human intervention as it faces new and greater tasks. It first came into prominence in 1939 following the heroics of the Bletchley Park code breakers, such as Alan Turing, who famously cracked the Nazi Enigma … Read more

Understanding Different Artificial Intelligence Approaches

The four different Artificial Intelligence programs that were used either functioned using rules or the connectionist model. Rule-based approaches are top-down approaches in which one identifies the function that needs to be performed and the rules/mechanisms that are needed to perform that function. On the other hand, the connectionist model represents a bottom-up approach in … Read more

Self-driving cars

Self-driving cars. Ten years ago, something like this would seem completely unbelievable. Now however, it’s becoming our reality. A lot of questions come to mind with this subject: how will this affect our lives? How will this affect our future? Is this good for us? Is this bad? These questions must be strongly considered and … Read more

Artificial intelligence applications, regulation and concerns

A blink into the future, and all crime is foreseen. The “precogs” within the Precrime Division use their predictive ability to arrest suspects prior to any harm. Although Philip K. Dick’s novel, “Minority Report,” may seem far-fetched, similar systems exist. One of these is Bruce Bueno de Mesquita’s Policon, a computer model that utilizes … Read more

Artificial Intelligence: Understand AI, Its Goals and Applications

Artificial Intelligence Abstract: Artificial intelligence is the intelligence exhibited by machines or software. It is a subfield of computer science. It involves two basic ideas. First, it involves studying the thought processes of human beings. Second, it deals with representing those processes via machines such as computers, robots, etc. The goal of AI research … Read more

Innovations in IT, IoT, Cloud Computing, and AI: How They Will Change Future Businesses

Technology is changing rapidly day by day and this could change our everyday lives, not only for individuals but for businesses too. Information technology (IT) is one of the factors that could lead to innovation, which helps businesses succeed. Based on Business 2 Community (2015), businesses could run more efficiently due to innovation … Read more

How Bank of America’s AI-Powered Virtual Assistant Erica Is Transforming Banking

ANALYTICAL REPORT DATE: Oct 15, 2018 PREPARED FOR: CEO and Board of Directors of Bank of America, Bank of America customers, and people interested in AI technology REPORT BY: Krystal Bui, customer service representative SUBJECT: Bank of America’s new virtual financial assistant – Erica Executive Summary Bank of America makes a huge investment in … Read more

How AI Could Transform the MedTech Sector in Singapore

What are your contemplations about the advancement of Singapore’s MedTech industry? I think that in recent years MedTech has been playing an important role in diagnosing and treating conditions or diseases which affect human health. I recently read the McKinsey report which says that the Asia Pacific MedTech sector is expected to grow … Read more

Explore AI: Should We Fear Its Potentials?

 What Is Artificial Intelligence and Should We Be Fearful Of It? Kester Griffiths Kestergiffiths@gmail.com Thomas Hardye School, Queen's Ave, Dorchester, DT1 2ET The aim of this essay is to discuss what the often misinterpreted field of study of artificial intelligence (abbreviated to AI) actually is, and to evaluate its dangerous potential. As opposed to … Read more

Find Organizational Efficiency w/ AI: Utilizing Artificial Neural Network for Construction Co. Performance Metrics

Organizational efficiency must be understood for the growth of a company. The artificial neural network is one of the methodologies widely used for this purpose. In this paper, this model has been utilized for finding organizational efficiency for construction companies. The basic requirement for the survival of a company is its … Read more

Explore Security Primitives and AI

Chapter-1 Introduction 1. INTRODUCTION Security means protecting information perfectly. Security primitives are based on hard mathematical problems. Since security primitives are used as building blocks, they must be very reliable. Creating security routines is very hard, and it raises the following issues with security primitives: 1. Designing a new security primitive is very time-consuming and very error prone, even … Read more

Exploring the Threats of Artificial Intelligence: How AI Could Put Humans at Risk

Introduction In the past 10 years artificial intelligence has become more prominent in the technology industry, with sophisticated voice recognition being implemented in now-common household products, like the Amazon Echo, and deep machine learning being used in a variety of tasks as complex as self-driving cars. Fundamentally artificial intelligence … Read more

Is Artificial Intelligence Conscious and Pass the Turing Test?

Currently, the rate of technological growth is exponential. Computer processing speeds double every 18 months, semi-autonomous cars are hitting roads, and most people have a built-in "intelligent assistant" that responds to voice commands in order to operate their cell phones. But even with all of this progress, is a robot still capable of achieving … Read more

Open the Software Engineering Gap – Research on Automation and AI in Software Eng.

Software Engineering Research Report Introduction “Software Engineering” is the application of engineering to all aspects of software production. The term was first used officially in a conference report in 1968 for the world’s first conference on Software Engineering in Garmisch, Germany. The Conference, sponsored by the NATO Science Committee, was meant … Read more

ALICE -TQ Bot: The Free Natural Language Artificial Intelligence Chat Robot

INTRODUCTION A.L.I.C.E. (Artificial Linguistic Internet Computer Entity) is a free natural language artificial intelligence chat robot. Intelligent Tutoring Systems are programs that aim at providing personal instruction to students. In recent decades, conversational robots, also known as chatterbots, have become very popular on the Internet, and they are based on ALICE. TQ-Bot is a chatterbot that helps students during their learning … Read more

AI in Pedagogy: Benefits, Applications,and Impacts on Special Education

 In the debate over whether Artificial Intelligence is beneficial for our society, how does AI influence our system of pedagogy? In Takamatsu, Japan at Kagawa University, special needs classrooms are using new-found technology to aid and provide the right support for students with varying disabilities and development levels. A special needs education specialist named … Read more

Inevitably We Will Make Computers Smarter Than Us: Benefits, Challenges and When Boundaries Are Crossed

Inevitably, we will build a computer that outsmarts us. As intelligent beings, this thought scares us, especially those who take pride in their intelligence. They think only their brains make them unique, and so they discount the idea of a computer that is smarter than them. However, that is wishful thinking because reality … Read more

The Effects of Technology Removing Human Aspect in Commercial Real Estate: How Will Investors Adapt?

 Commercial Real Estate: A Study of the Effects of Technology Removing the Human Aspect Reid Frazier University of South Alabama Author Note: This paper was prepared for IST-350 taught by Frank Ard The commercial real estate industry has seen exponential change in the previous decade due to technology and this change will continue to … Read more

Exploring the Impact of AI on Society: How AI Can Impact Jobs and Our Daily Lives

Technology is changing our lives: the way we live, study, communicate and work has changed. There are important points to take into consideration when we discuss technology: “Is this rapid increase and development of technology helping us and our future generations or not?” and “What is the importance of AI in our society?” … Read more

Unlocking Human Potential w/ AI: Exploring Benefits of Artificial Intelligence in Our Daily Lives

In 1930, a scientist called Alan Turing introduced the idea of machines that could think. After that, in the 1950s, a group of scientists from a variety of fields started discussing the idea of making a machine that could think. The field of Artificial Intelligence research was then established in 1956. What is Artificial … Read more

Regulating AI: How Governments Should Protect Our Privacy and Human Rights

I. (Gain Attention and Interest) Here are some scenarios for you to think about: imagine you are travelling to the EU. Before you enter, a detector system will automatically assess your official documents, social media activity and biometric data and analyze your face to see if you are lying; (a brief pause) How about this: … Read more

Tay The Chatbot

In this day and age, the desire for innovation sits at the helm of decision-making within the technology vertical. Innovation is known as making changes in something established, especially by introducing new methods, ideas, or products. However, in the twenty-first century this definition is tempered with the desire to create new technology … Read more

Minimizing of power losses for distribution system

Introduction Statement of the Problem With increasing load demand, power losses increase and the voltage profile of the system cannot be improved to the required level, so distributed generation (DG) units are used as an alternative energy solution to meet the required load demand. DG units are integrated in distribution … Read more

Artificial intelligence – history, machine learning, applications, advantages/disadvantages

INTRODUCTION Technology is one of the most important things in human civilization. In this modern era, the development of technology is growing rapidly. In everyday life, people increasingly need the assistance of technology in their activities. Therefore, in technological development, artificial intelligence is required as a supporting component to realize the desired technology. According to … Read more

Review of ‘Machine, Platform, Crowd: Harnessing Our Digital Future’

Machine, Platform, Crowd: Harnessing Our Digital Future, by Andrew McAfee and Erik Brynjolfsson, is a creative and captivating book, identifying and explaining three trends that are changing the way business is done. This book carefully walks you through examples of the business world and the changes that are shaping the industry. These three trends are … Read more

The future – technological unemployment

Introduction Over human history, technological innovation has consistently lessened the burden of work for all people. From the invention of the wheel that exponentially decreased the effort required to transport objects to modern-day supercomputers that compute 200 quadrillion calculations per second, human inventiveness results in less total work done by humans.1 Analyzing and predicting this … Read more

Biometric Security and Privacy

Introduction Over the past decade, the security research area has seen huge growth with regard to all aspects of data access and sharing. Guaranteeing safe and secure communication among users and, likewise, their online identities presents remarkable challenges to academics as well as to industry and the general public. Security breaches, credit card … Read more

Artificial intelligence – pros, cons, applications, impact

Artificial Intelligence (also known as AI) has been around for quite some time. With the development of the electronic computer in 1941 and the stored program computer in 1949, the conditions for research in artificial intelligence were given. The link between human intelligence and machines was not observed much until the late 1950s. The … Read more

Analysis of Artificial Intelligence in the skincare industry

Abstract This paper contains an analysis of Artificial Intelligence in the skincare industry that reveals how start-ups and current industry giants are revolutionizing the market by incorporating AI into their technologies and products. Discussed will be the current AI Key Players in Skincare, AI Research in Skincare, and AI Ideas for Skincare. These represent the … Read more

How AI could help with climate change

We’ve all heard it. Our earth is changing. The Intergovernmental Panel on Climate Change (2007, 2013) found in recent studies that “an increase of CO2 decreases the radiative cooling of the troposphere.” The use of fossil fuels worsens the natural greenhouse effect. This effect, also called global warming, warms the earth, makes ice glaciers melt, … Read more

Iterated Prisoners Dilemma Strategies w/ Artificial Intelligence Research: Prisoner's Dilemma AI Study

Prisoner's Dilemma Artificial Intelligence Research Project 60-371 Ananth Adhikarla 103462848 Abstract: The Prisoner's Dilemma is a game invented by Merrill Flood and Melvin Dresher during the 1950s; this project focuses mainly on the Iterated Prisoner's Dilemma experiments of Robert Axelrod. The Prisoner's Dilemma is a classic prototype which is responsive to evolutionary behaviours. Iterated Prisoner's Dilemma … Read more

The Chinese Room – John Searle

The Chinese Room is a response raised by John Searle in regard to functionalism and the Turing test for machine intelligence. Searle argues against the ability of purely computational processes creating some kind of mind (Searle, 1980). Searle centres his ideas of the mind around intentionality and understanding, something that he sees a solely syntactic … Read more

Build & train an AI model to classify sound and other senses

Introduction Overview With the advancements in the field of Artificial Intelligence researchers are able to achieve many breakthroughs in various fields. Although most of the AI methodologies were invented in the previous century, they have never been properly utilised before. Now with the numerous ways to collect data and with the amount of data accessible … Read more

Machine learning vs Deep learning

Artificial Intelligence is the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages. For AI to achieve this goal it needs to learn, and the two main ways of learning are machine learning and deep learning. Machine learning vs Deep … Read more

Artificial intelligence – benefits and ethical issues

In this fast-paced world, new innovations are constantly being released and used to provide solutions to the prevalent problems that people face, especially in the medical world. Without the assistance of technology and interdisciplinary fields, healthcare would not be as efficient or effective. Over the past decade, many new areas of study have … Read more

Artificial Intelligence: Friend or Foe?

Ever since the invention of the computer, mankind has always been determined to strive forward and innovate with gadgets, technology and lifestyle, including the creation of artificial intelligence (or colloquially, AI). Our renowned scientists all around the world specialising in software engineering have been improving AI and making it user friendly for the public or … Read more

Prevention of crime: using facial recognition technologies

Abstract – This paper studies the use of facial recognition technologies to prevent crime. The most common technologies that are being used for security and authentication purposes are analyzed. The Eigenface method is the most used facial recognition technology; it can be used for security and authentication purposes. This method focuses on the aspects of … Read more

Blade Runner and Ex Machina – perceptions of AI

Why are we afraid of talking robots? The advent of artificial intelligence has become a highly misconstrued subject. The quick progression of technology has been negatively portrayed to even convey an apocalyptic sense of fear. Although these depictions may oftentimes be fictional, they uncover the very real concerns and relationships that many people seem to … Read more

Can a weak form of AI consent? Attempts to strip weak AI of moral worth

The continuing advancement of artificial intelligence (AI) provides many unique and troubling ethical issues concerning the boundaries that demarcate a robot from a human being, and whether the former is worthy of any moral considerations. Notably, the potential roboticisation of the sex trade, and the introduction of nascent AI poses the question of whether a … Read more

The Turing test vs. Descartes’ two rules / Searle (artificial intelligence)

Descartes creates two rules to distinguish machine thinking from human thinking. The first rule is that machines ‘could never use words, or put together signs, as we do in order to declare our thoughts’ (Study Guide:105). Although he states in Discourse on a Method that machines can ‘[utter] words’ (Study Guide:105), they would never be … Read more

Autonomous mobile robotics

Chapter 1 Introduction to Project 1.1 Project Vision Robotics is the part of technological advancement that shapes the human lifestyle. The real definition of a robot is that it always senses, thinks and processes to get into action. Automation in robot control and processing is the key to progress in the field. Semi or fully artificial intelligence in robots is the pioneering area … Read more

Automated functionality for bandwidth management

Shortest Path Algorithms: Shortest path algorithms are fundamental to much of the work on routing. The number of shortest path algorithms which have been developed and published runs into the hundreds, and new variants are still appearing. The best known algorithms, such as Dijkstra’s and the Bellman-Ford algorithm, run in low-order polynomial time. … Read more

Could a computer match a human’s extensive background of knowledge and learning?

Once the computer was successfully developed in the 1950’s, it allowed cognitive psychology to become a dominant approach in the field of Psychology. Cognition needs to be modelled to aid cognitive scientists in understanding how the brain works, to predict human behaviour and to create machines that can perform human tasks (Gentner, D., & Forbus, … Read more

ANN-embedded expert system / Feed Forward Neural Network

ANN-embedded expert system: Expert systems (ES) are a branch of applied artificial intelligence (AI), and were developed by the AI community in the mid-1960s. The basic idea behind an expert system is simply that expertise, the vast body of task-specific knowledge, is transferred from a human to a computer. This knowledge … Read more

Can machines think? Measuring artificial intelligence

“Can machines think?” The notion of Artificial Intelligence was first seriously contemplated by Alan Turing, considered by many the ‘father of computer science’. At first glance, especially at the time this question was first asked, one might dismiss it quickly – how can it be possible for a machine to think, they simply do … Read more

Could Artificial Intelligence Ever Replace a Doctor?

Artificial Intelligence is increasing in its ability to do complicated tasks reliably. I am going to discuss the potential it has to replace a doctor – specifically a surgeon, researcher or consultant. Some tasks in a doctor’s routine are already being automated, similar to some tasks in the lives of the public. However, there are … Read more

John Searle’s Chinese room & Systems / Robot / Brain Simulator Reply

In this essay, I will be outlining John Searle’s Chinese room thought experiment. Further, I will address the three major objections raised to his argument labeled the Systems Reply, Robot Reply, and Brain Simulator Reply. After addressing and carefully discussing these, I will discuss Searle’s replies to these objections and state whether or not I … Read more

Efficiency of using digital technology to implement contact tracing & vaccine registration

INTRODUCTION Covid-19 is an unprecedented global pandemic that has affected and changed human living since March of 2020. After a year of adjusting to the ‘new normal’, almost everything has shifted to the digital world. Physical interaction has been limited because of the possibility of spreading the virus. This paves the way for … Read more

Detecting Lung Disease using X-ray – Machine Learning Fast R-CNN / YOLO

Detecting Lung Disease using X-ray – Machine Learning INTRODUCTION Kaggle is an online community of data scientists, machine learning engineers, and many other professionals. This online community is owned by Google, whose parent company is Alphabet Inc. The community provides a platform for hosting challenges, publishing datasets, providing an online workbench for data science and … Read more

Artificial Intelligence is becoming a major part in modern warfare

Over the last few decades, many countries have invested a significant portion of their budget in improving their militaries. This comes as an aftermath of World War II, during which countries strove to show their dominance over weaker countries. The end of World War II increased the amount of research into weapons, which improved the technological … Read more

Advantages and disadvantages of autonomous cars

It is 2021, and we are surrounded by technology. Keypad phones have given way to Android smartphones, and old mechanical cars to self-driving cars. These automated cars can analyze traffic and make decisions without human involvement. Is this transformation a boon or a bane? An autonomous car is capable of sensing traffic and the environment and operates … Read more

The analysis of sentiment

Abstract: The WWW, including forums, social networks, blogs, and review sites, produces vast quantities of data in the form of user thoughts, feelings, viewpoints, and arguments about various social events, goods, brands, and politics. The sentiments that users share on the web have a significant impact on readers, politicians and product vendors. The unstructured type … Read more

The Integration of AI, IoT, and Big Data in Modern Businesses

AI stands for Artificial Intelligence, IoT stands for Internet of Things, and Big Data refers to the vast amount of information collected and analyzed to reveal trends and patterns. AI systems are typically associated with human intelligence, working in a human-like way to perform tasks such as decision-making, problem-solving, and learning. IoT is a network … Read more

Artificial-neural-networks (ANNs) and their applications

Introduction: Over the past decades, many researchers have shown great interest in the artificial-neural-networks (ANNs) and their applications in industry, business, as well as private and government sectors. Artificial neural networks as a sub-discipline of Artificial intelligence have recently emerged in the engineering world. Further, several researchers investigated the efficiency of using the ANN models … Read more

Implementing AI into the battery manufacturing process

Many experts think fast-charging batteries will be critical for the adoption of electric vehicles. The main goal of the battery improvement process is to find the balance between the high charging speed and long battery lifetime. Artificial intelligence is accelerating this process. In order to know how a battery can be improved, current battery performance … Read more

Artificial intelligence machine learning

Difference between supervised learning and unsupervised learning Supervised Learning: When there is a person who tests and decides whether you have answered correctly while you are learning a specific task, that is seen as supervision. Similarly, when you train an algorithm, the idea of supervised learning deals with providing a full collection of … Read more

Artificial intelligence – definition, revenue potential, applications & benefits

Artificial Intelligence, as many consider, is a backbone of FinTech. Artificial Intelligence has a notable history in the areas of science and economics. Artificial intelligence at its core is all about dimension reduction, seeing patterns in data, efficiency, and information, structured or unstructured, that creates value in financial services. Many identify Alan Turing, a famous scientist, … Read more

The impact of Artificial Intelligence on youths (literature review)

Stephen Hawking once said, “Computers will overtake humans with AI within the next 100 years, when that happens we need to make sure the computers have goals aligned with ours”. Artificial Intelligence is a broad topic, consisting of different fields, from machine vision to expert systems. AI is one of the marvelous creations of … Read more

Elon Musk’s Neuralink: A Step Towards Curing Depression and Addiction

Elon Musk, CEO and owner of Tesla and SpaceX, is a billionaire. He has recently revealed more information regarding his upcoming project, the Neuralink brain chip. It is said that this chip will retrain brain cells, potentially resulting in curing depression and addiction. How to Manage Depression? Mental disorders are traditionally considered to be diseases … Read more

Data mining and text mining

1) Data mining is an essential technique to extract and analyse figures from homogeneous data. It is mainly focused on account-dependent activities such as accounting, purchasing, supply chain, CRM, etc. The solution can be quickly found once the algorithm is defined. Text mining is the technique to extract data from heterogeneous document formats … Read more

Use Case from the Requirement Written in Natural Language

INTRODUCTION The modern era of research has always focused on automating processes of every type. Following this path, automating the extraction of information from natural language text with the use of Natural Language Processing (NLP) is the next phase. This field is related to computational linguistics and artificial intelligence, which … Read more

Develop Automatically Aggregated Real-time Election News Website (research proposal)

Introduction 1.1 Background of the Study Fake news is defined as misleading information that masquerades as real news (Jennifer Allen, 2020). Its main purpose is to promote a specific cause (Erin May, 2017), and it is often used to influence a person’s perspective, usually for a political agenda. Experts now recommend avoiding the term “Fake … Read more

Machine learning for image recognition (autonomous vehicles)

Research Methodologies and Emerging Technologies Section 1. Introduction Autonomous vehicles are expected to play a fundamental role in the timeliness of metropolitan transport, as they offer the potential for additional safety, enhanced productivity, unique accessibility, excellent driving performance, and better decisions in various situations. The self-driving car has to develop the ability … Read more

Artificial Intelligence in a Simplified Driving World Environment

The following report details stage 1 of the research study “Artificial Intelligence in a Simplified Driving World Environment”. Artificial intelligence (AI) focuses on the development of machines to be able to perform intelligent tasks independently, replicating the actions that a human expert would take in a given situation. The AI market is growing annually, and … Read more

Characterization of the fibre in the pseudostem of banana cultivars using physico chemical and image analyses

Abstract: This research work presents the composition of the pseudostem of selected varieties of banana and detailed characteristics of the fibre in its intact form in the stem. Banana is one of the important fruit crops cultivated in tropical parts of the world. Non-food products like yarn, fabrics and quality papers are manufactured from … Read more

Artificial Intelligence offers the most dependable & effective path to pandemic preparedness

Technology adds a layer of protection against the COVID-19 pandemic. Individuals, organizations, and businesses are using it to develop their skills. Unlike previous periods of innovation in human history, the Fourth Industrial Revolution has continued without pause following a situational crisis, with COVID-19 refusing to allow a natural period of review. The Fourth Industrial Revolution … Read more

Solve Artificial Intelligence (AI) problems with data matching

Business systems can be improved and even automated with the help of data matching AI. Above all, this eliminates many of the common errors that occur when comparing human input to structured data input. For example, do you believe humans make sound decisions when confronted with uncontrolled information? Machine Learning refers to the automated decision-making … Read more

Recent Trends in R+ and Artificial Intelligence

Introduction R+ is the general name given to Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR). Augmented reality is an interactive experience of a real-world environment in which the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and … Read more

Control Lights & Home, Lock Doors w/ Voice Recognition Tech: AI for Doctors & Regulatory Measures

With just a simple voice command, we are able to control the brightness of the lights in our home, order more toilet paper, and even lock the front door. Artificial intelligence (AI) voice recognition technology used in devices such as Apple’s Siri, Google Home, and Amazon’s Alexa makes the fiction-like idea of … Read more

About Artificial Intelligence

Artificial Intelligence (AI), despite being prevalent in the everyday life of most individuals and encompassing almost every variation of modern industry in some capacity, curiously lacks a precise universally accepted definition.

AI was first named in the 1950s, when Minsky, McCarthy, and colleagues described artificial intelligence as “that of making a machine behave in ways that would be called intelligent if a human were so behaving” (source).

Artificial intelligence has been categorized as “algorithms enabled by constraints, exposed by representations that support models targeted at loops that tie thinking, perception and action together” (Winston, n.d.); as “a science and a set of computational technologies that are inspired by—but typically operate quite differently from—the ways people use their nervous systems and bodies to sense, learn, reason and take action” (Panel, 2016); as “the activity devoted to making machines intelligent, and intelligence is that quality that enables an entity to function appropriately and with foresight in its environment” (Nilsson, 2010); and as “AI can also be defined by what AI researchers do. AI is primarily a branch of computer science that studies the properties of intelligence by synthesizing intelligence” (Simon, 1995).

At its core, Artificial Intelligence is the ability of a machine to complete a task that, if it were done by a human, would require intelligence.

Types of AI

The general definition of AI is broad, as is the range of ways it can be classified. AI can currently be classified according to two separate systems: one classifies AI in relation to its similarity to the human mind; the other is a broader scheme, more commonly used in the technology industry, that puts AI into three separate categories.

AI classified based on its relation to the human mind falls into four separate categories:

Reactive: This is the original form of AI and operates with extremely limited capability. Reactive machines emulate the ability to respond to different stimuli, but this type of AI has no memory-based functionality and does not use previous experience to inform its current actions. In basic terms, reactive machines cannot learn; they can simply respond to a limited variety of inputs.

Limited memory: In addition to the capabilities of reactive machines, this type of AI also has the ability to learn from historical data to make decisions. These machines are trained by using data stored in their memory as a reference model for solving problems. Almost all types of current AI fit into this category.

Theory of mind: This type of AI currently only exists in theory, “ is the ability to attribute mental states — beliefs, intents, desires, emotions, knowledge, etc. — to oneself and to others.” (Wikipedia, n.d.)

Self-awareness: This AI also exists only hypothetically and is self-explanatory. It is an AI that has developed self-awareness.

These four types of AI can also be grouped under three broader classifications: Artificial Narrow Intelligence (ANI), Artificial General Intelligence (AGI) and Artificial Super Intelligence (ASI).

ANI: the form of AI that exists in our world today, often referred to as “weak AI”. It consists of intelligent systems that operate within a limited context – carrying out specific tasks without being specifically programmed to do so – driven by sets of self-learning algorithms, and is at best a basic simulation of human intelligence.

Narrow AI is generally focused on performing a single task extremely well, often at much faster speeds and with higher accuracy than humans. Whilst this form of AI seems intelligent, it operates under a larger set of constraints and limitations than even the most basic human intelligence; namely, such systems are only capable of performing the specific tasks they are programmed to do, which is where the name ‘narrow AI’ comes from. Reactive and limited memory AI fits into this category.

AGI: this type of AI has the same abilities as a human being: it can learn, perceive and understand independently and build connections and generalizations across multiple fields in the same manner that humans can. This form of AI currently exists only in theory.

ASI: a theoretical type of AI that surpasses human intelligence and ability in every facet. An example of this would be Skynet from the Terminator series.

How does AI work?

As stated, the field of AI is concerned with creating machines that are capable of executing tasks that would otherwise require human intelligence. Machine learning is a subset of that field, one that allows machines to “learn” independently, and deep learning is a further subset of machine learning, the area currently producing the greatest advancements in the field.

“Artificial intelligence is a set of algorithms and intelligence to try to mimic human intelligence. Machine learning is one of them, and deep learning is one of those machine learning techniques.” – Frank Chen (Source).

Machine learning

Machine learning is a subset of AI that allows a system to learn from data without the need to be specifically programmed to do so. It does this through sets of rules – or “algorithms” – that the system is able to follow.

This is achieved by training the system: it is fed data in which, using statistical techniques, it finds patterns, and from these it derives a rule or procedure that explains the data or can predict future data. Or, more simply put, the system learns.
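As a concrete illustration of fitting a rule to data and then using it to predict an unseen input, here is a minimal sketch in Python; it assumes the scikit-learn library is available, and the data points are invented purely for the example rather than taken from any source discussed above.

```python
# Minimal sketch of "learning a rule from data" (illustrative data only).
from sklearn.linear_model import LinearRegression

# Training data: hours of study (input) and exam score (output).
hours = [[1], [2], [3], [4], [5]]
scores = [52, 58, 65, 71, 78]

model = LinearRegression()
model.fit(hours, scores)        # the "training" step: fit a rule to the data

# The learned rule can now predict a future, unseen input.
print(model.predict([[6]]))     # predicted score for 6 hours of study
```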

“In essence, you could build an AI consisting of many different rules and it would also be able to be AI. But instead of programming all the rules, you feed the algorithm data and let the algorithm adjust itself to improve the accuracy of the algorithm. Traditional science algorithms mainly process, whereas machine learning is about applying an algorithm to fit a model to the data. Examples of machine-learning algorithms that are used a lot and that you might be familiar with are decision trees, random forest, Bayesian networks, K-mean clustering, neural networks, regression, artificial neural networks, deep learning and reinforcement learning. “ (IBM, 2018)

Machine learning methods are usually categorized broadly under two definitions: supervised and unsupervised.

Supervised learning is where algorithms are trained using labelled examples. It is similar to learning by example: the system is given a data set with labels that act as the “answers”, and eventually the system learns to tell the difference between the labels by comparing its outputs with the correct outputs – the answers – to find errors and adjust itself accordingly.

For example, a system might be shown pictures of cats and dogs and, given enough data, will learn to differentiate them by, perhaps, the structure of the ears or the shape of the face.

Once the system has been “trained” it is able to then be applied to new data and classify it using the rules it has learnt.

The problem with supervised learning is that it usually requires enormous amounts of labelled data to work effectively, with systems potentially needing millions of images to, say, carry out the task of identifying pictures of cats and dogs accurately.
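To make the idea of labelled “answers” concrete, the hedged sketch below trains a tiny classifier on invented feature values and then applies it to a new example; the ear-pointiness and snout-length features are hypothetical, and scikit-learn's decision tree is just one possible choice of algorithm.

```python
# Hedged sketch of supervised learning: the labels act as the "answers".
from sklearn.tree import DecisionTreeClassifier

# Each example: [ear_pointiness, snout_length]; labels are the answers.
X_train = [[0.9, 0.2], [0.8, 0.3], [0.2, 0.8], [0.3, 0.9]]
y_train = ["cat", "cat", "dog", "dog"]

clf = DecisionTreeClassifier()
clf.fit(X_train, y_train)            # learn from the labelled examples

# Once trained, the classifier is applied to new, unseen data.
print(clf.predict([[0.85, 0.25]]))   # expected output: ['cat']
```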

Unsupervised learning is where algorithms are trained using unlabelled data sets: the system is not given the correct “answer” for the data and instead must figure out what it is being shown. The aim of unsupervised learning is for the system to explore the data and try to identify patterns that can be used to classify and categorize the data.

For example, unsupervised learning might involve clustering together data that can be grouped by similarities, such as news websites grouping together stories on similar topics.
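A small sketch of that clustering idea, again assuming scikit-learn; the headlines are invented and the number of clusters is chosen by hand purely for illustration, so this is a toy demonstration rather than a real news-grouping system.

```python
# Hedged sketch of unsupervised learning: no labels, just grouping by similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

headlines = [
    "Stock markets rally as tech shares surge",
    "Markets close higher after strong tech earnings",
    "Home team wins the cup final after late goal",
    "Striker scores twice as team wins the final",
]

X = TfidfVectorizer().fit_transform(headlines)   # turn text into numeric features
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Typically groups the market headlines apart from the sport headlines, e.g. [0 0 1 1].
print(labels)
```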

Deep learning

Deep learning is a subset of machine learning that employs a system inspired by the human brain – neural networks. It operates by using progressive layers that each subsequently extract and composite information. As data is passed through the layers:

“each unit combines a set of input values to produce an output value, which in turn is passed on to other neurons downstream. For example, in an image recognition application, a first layer of units might combine the raw data of the image to recognize simple patterns in the image; a second layer of units might combine the results of the first layer to recognize patterns-of-patterns; a third layer might combine the results of the second layer; and so on.”

This allows systems to process large amounts of uncategorized and complex data efficiently by breaking it down into smaller, simpler parts and using those parts to recognize complex, precise patterns in data that would not be detectable using traditional machine learning techniques.
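The layer-by-layer structure described in the quotation above can be sketched in a few lines of Python. This example assumes the PyTorch library; the layer sizes, the 784-value input (a stand-in for a flattened 28x28 image) and the 10 output classes are arbitrary choices made only to illustrate how each layer feeds the next, not a model taken from the text.

```python
# Hedged sketch of a small "deep" network: each layer combines the previous layer's outputs.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 128),  # first layer: combines raw inputs into simple patterns
    nn.ReLU(),
    nn.Linear(128, 64),   # second layer: combines those into "patterns of patterns"
    nn.ReLU(),
    nn.Linear(64, 10),    # final layer: maps the extracted features to 10 class scores
)

fake_image = torch.rand(1, 784)   # random stand-in for one flattened 28x28 image
print(model(fake_image).shape)    # torch.Size([1, 10])
```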

The larger the neural network and the more data it has access to, the better the performance of the system. Deep learning, however, requires enormous amounts of processing power and specific hardware – GPUs have made recent advancements possible – as well as long training times and large amounts of data to work effectively.

In addition, one of the problems facing deep learning is known as the “black box” problem, in which it is often next to impossible to determine how the system came to a particular conclusion, which in turn makes it difficult to gain the insight required to refine and improve the system.

Development of AI

Despite AI having existed for more than half a century since the term was originally coined in the 1950s, developments in the field have only recently seen large breakthroughs and interest from modern industries. This is due to advancements in computing power – GPUs – and the exponential growth in the volume and variety of data, which in turn has increased the potential value of, and advancement in, algorithms.

As the necessity for the implementation of AI systems becomes more prevalent, due to the rise of big data, and AI providing a greater return on investment, more research and development has been put into the field.

Challenges for development

The main challenge in the development of increasingly advanced AI is computing power. Until recently there was a technical brick wall regarding development: there were plenty of theoretical ideas but not enough computing power to implement or develop them effectively.

Modern-day cloud computing and parallel processing systems have helped, but they are nothing more than a stopgap: as complex deep learning algorithms advance and data volumes continue to grow, more power is required.

Another problem in the development of AI is that current systems can only learn from the data they are given; knowledge cannot be integrated in any other way. This means, for example, that any inaccuracies in the data will be reflected in the results.

This is partly due to the fact that modern AI operates with a one-track mind: it is only capable of performing a specific task, and is thus unable to take into consideration learning and data from tasks other than the one it is performing.

A further challenge is the lack of professionals in the field: despite the increased demand for AI experts, machine and deep learning developers and data scientists, talent supply remains at a deficit – as of early 2019 there were estimated to be fewer than 40,000 AI specialists in the world (Source).


Writing an artificial intelligence essay

Artificial intelligence (AI) is a quickly growing field of computer science which has recently become a hot topic of discussion. AI has the potential to revolutionize the way we interact with the world and the way we do business. While the potential benefits of AI are vast, there are also many potential risks and drawbacks associated with it. Essays on this theme typically require a discussion on some of the main benefits and risks of artificial intelligence, and how AI can be used responsibly.

One of the main benefits of AI is the potential to increase efficiency and accuracy in many tasks. By using AI-powered systems and algorithms, businesses can automate many of their processes, freeing up more of their employees’ time to focus on more important tasks. AI systems are also capable of analyzing large amounts of data quickly and accurately, making it easier for businesses to make informed decisions. Additionally, AI can be used to create powerful tools that can help us better understand and interact with the world.

However, there are also some potential risks associated with AI. One of the most commonly cited risks is the potential for AI systems to be used maliciously. AI can be used to create powerful weapons or to manipulate and deceive people, so it is important to consider these potential uses of AI when developing AI-powered systems. Additionally, there is the potential for AI systems to be biased or to produce inaccurate results. As AI systems become more complex, it becomes increasingly difficult to ensure that the results are unbiased and accurate, so it is important to consider these issues when developing and deploying AI-powered systems.

Another important consideration is the ethical implications of AI. AI-powered systems can have a significant impact on our lives, and it is important to consider the ethical implications of using AI. For example, there are questions about the impact of AI on privacy, autonomy, and freedom of choice. Additionally, there are concerns about the potential for AI systems to be used to discriminate against certain groups of people. It is important to consider these ethical considerations when using AI-powered systems.

Finally, it is important to consider the potential impact of AI on employment. While AI-powered systems can help increase efficiency and accuracy in certain tasks, they also have the potential to replace human labor in some areas. This could lead to increased unemployment, which could have a major impact on the economy. As such, it is important to consider the potential impact of AI on employment when developing and deploying AI-powered systems.

Artificial Intelligence essay themes:

  1. The potential risks and benefits of AI
  2. The impact of AI on human employment opportunities
  3. The potential ethical implications of AI
  4. The ethical implications of creating sentient AI
  5. The implications of AI on data privacy and security
  6. The potential for AI to be used for nefarious purposes
  7. The potential for AI to be used for good
  8. The potential for AI to augment human capabilities
  9. The potential for AI to automate certain tedious tasks
  10. The potential for AI to lead to increased inequality in society