Computers Can Learn Like Humans

By Cerise Carey ’16


Scientists have created an algorithm that allows computers to recognize and draw simple visual concepts, such as handwritten characters. A research group led by Dr. Brenden Lake, a Moore-Sloan Data Science Fellow at New York University, devised an algorithm that shortens the time it takes for computers to “learn” new concepts and replicates the kinds of pattern-recognition tasks performed by humans. The approach allows a computer to learn by building models, basing its knowledge on the data provided to it through the algorithm. The algorithm also “learns to learn,” drawing on knowledge of familiar concepts while acquiring new ones; for example, the computer can use its knowledge of the Latin alphabet to help learn letters of the Greek alphabet. When computers were asked to reproduce characters, such as letters or numbers, after seeing only a single example, the characters they produced were nearly indistinguishable from those drawn by humans. The algorithm can also serve as a model of learning in young children, since the computer comes to recognize real-world concepts as it learns. Although this development is far from matching the intelligence of a human child, it is an early step toward machines that, like people, can learn a new concept from just one real-world example.
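The system described in the research is a Bayesian program-learning model that builds a small generative program for each character; the details are well beyond a short sketch. Purely as a loose, hypothetical illustration of the one-shot idea above (labeling a new drawing after seeing a single stored example per character, using a naive pixel-match score rather than anything from the actual paper), a minimal sketch might look like this:

```python
# Toy one-shot classification sketch. Each known character is stored as a
# single small "drawing" (a list of strings, '#' for ink, '.' for blank).
# This is NOT the paper's Bayesian program-learning method, only a
# simplified stand-in for the single-example matching idea.
ALPHABET = {
    "T": ["###",
          ".#.",
          ".#."],
    "L": ["#..",
          "#..",
          "###"],
}

def similarity(a, b):
    """Count matching pixels between two equally sized drawings."""
    return sum(pa == pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def classify(drawing):
    """Label a new drawing by its best single-example match."""
    return max(ALPHABET, key=lambda c: similarity(ALPHABET[c], drawing))

# A noisy "T" with one corrupted pixel is still recognized.
noisy_t = ["###",
           ".#.",
           "##."]
print(classify(noisy_t))  # prints T
```

A real one-shot learner must generalize far beyond pixel overlap, which is exactly why the researchers model how characters are drawn rather than what the finished image looks like.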



  1. Scientists teach machines to learn like humans. New York University. (2015).


