Future AI, an artificial general intelligence (AGI) company developing “Technologies that Think,” today launched Sallie, a prototype software entity that learns in real time through vision, hearing, speech, and mobility. These senses give Sallie the ability to draw conclusions, a critical facet of genuine thinking and a necessary component of ushering in AGI, which the company calls the most exciting project on the planet.
“The first component of being able to understand like a person is learning about immediate surroundings. Sallie can recognize objects with vision, build an internal model, ask questions, and take direction without any initial information,” explains Charles Simon, founder and CEO of Future AI. “Our work advances new algorithms that combine simulated biological neuron circuits with high-level artificial intelligence techniques. Sallie can infer information about objects she doesn’t understand, demonstrating one-shot, real-world learning without tagged data sets or backpropagation.”
Future AI’s software creates connections on its own between different types of real-world sensory input in the same way that the human brain interprets everything it knows in the context of everything else it knows. Sallie emulates the processes of human thought, beginning with perception. Sallie consists of a centralized computer mind and uses mobile sensory pods with multiple senses and abilities that enable the system to learn from a real-world environment and gain a fundamental understanding of physical objects, cause and effect, and the passage of time.
“With the latest advancements in symbolic AI and neuromorphic computing, adding real-world understanding to AI and achieving human-like intelligence is gradually transitioning to the realm of possibility,” says Ritu Jyoti, group vice president of AI and Automation market research and advisory services, IDC. “This decade will play a crucial role in accelerating the development of AGI.”
Unlike AI technologies bound by the limitations of machine learning, Future AI uses unique graph algorithms and structures that are self-adaptive. Companies or individuals interested in evaluating Sallie can sign up now for priority beta testing in Q4 of this year.
Recently, Future AI raised $2 million in initial funding to accelerate the development of its technology and algorithms, including its Universal Knowledge Store (UKS), which aggregates different types of information and creates connections between them, similar to the cognitive processes of human intelligence. Modeled on neurons, the UKS has biological plausibility and the ability to learn and function unsupervised, the way children do.
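The UKS itself is proprietary, but the core idea it describes, a graph that stores heterogeneous pieces of information as nodes and learned associations as edges, can be sketched in a few lines. The sketch below is purely illustrative; the class and method names are hypothetical and are not part of Future AI's actual software:

```python
# Illustrative sketch of a knowledge-store graph (hypothetical API):
# nodes hold heterogeneous information (visual, auditory, symbolic)
# and edges record learned associations between them.
from collections import defaultdict

class KnowledgeStore:
    def __init__(self):
        self.nodes = {}                # label -> payload (any modality)
        self.edges = defaultdict(set)  # label -> labels it is linked to

    def add(self, label, payload):
        """Store one piece of sensory or symbolic information."""
        self.nodes[label] = payload

    def associate(self, a, b):
        """Create a bidirectional connection between two items."""
        self.edges[a].add(b)
        self.edges[b].add(a)

    def related(self, label):
        """Return everything directly connected to an item."""
        return sorted(self.edges[label])

store = KnowledgeStore()
store.add("ball", {"modality": "vision", "shape": "round"})
store.add("bounce", {"modality": "sound"})
store.associate("ball", "bounce")
print(store.related("ball"))  # a single observed pairing is retrievable at once
```

A structure like this needs no labeled training corpus: a single observed pairing becomes a queryable connection immediately, which is the sense in which such a store can learn "one-shot" from experience.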
Sallie’s technology and the knowledge she acquires will be incorporated into existing AI applications, radically improving personal assistants like Alexa and Siri, language translation, computer vision, automated customer service systems, and many other human-interactive systems. Additional enhancements already in development will give Sallie an even better understanding of the world around her.