Unbelievable ‘Artificial Brain’ Developed by Former Apple Software Designer

Thanks to the work of former Apple software designer Mike Matas, the world may be a little closer to an approachable, software-driven take on artificial intelligence.

Matas, who during his time at Apple helped create the interactive user interfaces for the company's Maps, Photos, and Camera apps, recently showed off his latest creation: an 'artificial brain,' appropriately dubbed The Brain.


In essence, as you can see for yourself in the YouTube video posted below, The Brain is a neural network with a strikingly simple user interface that can be taught to render different emoticons based on hand-drawn shapes. You have to see it in action to believe it!

It's an inarguably simple, no-nonsense interface, but it still manages to show how The Brain can be taught, and how it actually learns over time to process and respond to various input gestures. Matas starts by showing The Brain which emoji it should generate for each shape, pairing example drawings with their intended outputs before spelling out exactly what each shape represents: an upward-curving, smile-like line stands for happy emoji like the smiley face, while a Z-shaped input stands for the sleep emoji.
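
To make the idea a bit more concrete, here is a minimal sketch of that teach-by-example loop written in Swift rather than Quartz Composer. It is not Matas's implementation; the TinyBrain type, the hand-picked stroke features, and the emoji pairings are all illustrative assumptions, showing how a single-layer classifier can pick up shape-to-emoji associations purely from the examples it is shown.

```swift
import Foundation

// A minimal sketch, not Matas's Quartz Composer patch: a single-layer
// softmax classifier that learns shape -> emoji associations purely from
// the examples it is taught. The feature values below are hypothetical
// hand-picked stroke descriptors (curvature, zigzag corners, closed loop).
struct TinyBrain {
    let labels: [String]      // the emoji the network can output
    var weights: [[Double]]   // one weight row per label
    var biases: [Double]

    init(labels: [String], featureCount: Int) {
        self.labels = labels
        self.weights = Array(repeating: Array(repeating: 0.0, count: featureCount),
                             count: labels.count)
        self.biases = Array(repeating: 0.0, count: labels.count)
    }

    // One linear layer followed by softmax: raw scores -> probabilities.
    func probabilities(for features: [Double]) -> [Double] {
        var scores = [Double]()
        for i in labels.indices {
            var s = biases[i]
            for j in features.indices { s += weights[i][j] * features[j] }
            scores.append(s)
        }
        let maxScore = scores.max() ?? 0
        let exps = scores.map { exp($0 - maxScore) }
        let sum = exps.reduce(0, +)
        return exps.map { $0 / sum }
    }

    // "Teaching mode": nudge the weights toward the correct emoji.
    mutating func teach(_ features: [Double], correct: String, rate: Double = 0.5) {
        guard let target = labels.firstIndex(of: correct) else { return }
        let probs = probabilities(for: features)
        for i in labels.indices {
            let error = (i == target ? 1.0 : 0.0) - probs[i]
            for j in features.indices {
                weights[i][j] += rate * error * features[j]
            }
            biases[i] += rate * error
        }
    }

    // "Best guess": the emoji with the highest probability.
    func guess(_ features: [Double]) -> String {
        let probs = probabilities(for: features)
        return labels[probs.firstIndex(of: probs.max()!)!]
    }
}

var brain = TinyBrain(labels: ["🙂", "😴"], featureCount: 3)
let smileStroke: [Double] = [1.0, 0.0, 0.0]   // upward, smile-like curve
let zStroke: [Double]     = [0.0, 1.0, 0.0]   // Z-shaped zigzag

for _ in 0..<50 {                              // repeat the lesson a few times
    brain.teach(smileStroke, correct: "🙂")
    brain.teach(zStroke, correct: "😴")
}
print(brain.guess([0.9, 0.1, 0.0]))            // a slightly wobbly smile -> 🙂
```

Nothing in the sketch hardcodes what a curve or a Z means; the associations come entirely from the taught examples, which is the point Matas makes next.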

“There’s no preprogrammed rules here that are telling it a curved line means this or that,” according to Matas. “It’s all being figured out automatically based on the examples we taught it.”

This concept is clearest when The Brain encounters a shape it doesn't already know. At around the 1:20 mark in the video demonstration, Matas draws a heart and a teardrop, neither of which The Brain has seen before. Acting solely on its existing knowledge, the application makes its best guesses, offering a smiley and a frowning face, respectively. Matas then switches back to The Brain's teaching mode and specifies the correct response for each shape, so The Brain no longer mistakes them for anything else.
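
That guess-then-teach loop can be sketched even more simply. The stand-in below swaps the neural network for plain nearest-neighbor matching over remembered examples, and every feature vector and pairing is hypothetical; it only illustrates how an unfamiliar shape gets a best guess from existing knowledge until the correct answer is taught.

```swift
import Foundation

// A stand-in for the guess-then-teach loop. For brevity this uses
// nearest-neighbor matching over remembered examples rather than a real
// neural network; the feature vectors and pairings are hypothetical.
struct ExampleMemory {
    var examples: [(features: [Double], emoji: String)] = []

    // "Teaching mode": remember a shape together with the correct emoji.
    mutating func teach(_ features: [Double], emoji: String) {
        examples.append((features: features, emoji: emoji))
    }

    // "Best guess": the emoji of the most similar shape taught so far.
    func guess(_ features: [Double]) -> String? {
        func distance(_ a: [Double], _ b: [Double]) -> Double {
            var d = 0.0
            for i in a.indices { d += (a[i] - b[i]) * (a[i] - b[i]) }
            return d
        }
        return examples.min {
            distance($0.features, features) < distance($1.features, features)
        }?.emoji
    }
}

// Hypothetical features: [upward curvature, zigzag corners, closed loop].
var memory = ExampleMemory()
memory.teach([1.0, 0.0, 0.0], emoji: "🙂")    // smile-like curve
memory.teach([0.0, 1.0, 0.0], emoji: "😴")    // Z shape

let heart: [Double] = [0.6, 0.0, 1.0]          // curved and closed: unfamiliar
print(memory.guess(heart) ?? "?")              // best guess from what it knows: 🙂

memory.teach(heart, emoji: "❤️")               // teach the correct response
print(memory.guess(heart) ?? "?")              // now it no longer mistakes the heart
```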

Not only does the interface look clean and, frankly, impressive, but it also exposes the underlying process that goes into associating each shape with its respective emoji.

The Brain was built using Quartz Composer, a node-based visual programming tool distributed with Apple's Xcode developer tools, the suite that Apple and registered software developers use to create apps for iOS and OS X.

What exactly The Brain can do beyond emoticon processing is still unknown, but Matas sees a great deal of potential for future applications.

“It’s sort of crazy on the inside, but on the outside it’s pretty simple,” Matas says. “The same process that allows it to learn and recognize what this shape means also allows it to learn a whole lot of other stuff. And so it would be interesting to think, if this kind of thing were made a little bit easier to use, what kind of ideas might come out of it.”
