Brain Mind Robotics BioComputing

"Silicon Valley" (Palo Alto - San Jose) California


We Are Building a Robotic Bio-Computer, Based on the Functional Neuroanatomy of the Human Brain, Which Can Speak, Reason, Read, Understand Language, Experience Self-Consciousness and Human Emotions, Think Creatively, and Physically Interact with the Environment.

We Are Hiring Engineers With Experience In Computer Science, Machine Learning, Robotics, Artificial Intelligence, and the Creation of Auditory and Visual Platforms For Speech, Object, and Face Recognition.

STAGE 1: The creation of a stand-alone unit programmed to "reflexively" respond to simple visual and auditory stimuli; to "reflexively" move its "eyes", turn its "head", and open and close its "mouth" and "hands"; to make sucking, chewing, swallowing, swimming, and leg-lifting stepping movements; and to raise its "arms" and touch its "mouth" and "face."
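As a rough illustration of the kind of "reflexive" stimulus-response behavior Stage 1 describes, the minimal Python sketch below maps detected stimuli to fixed motor commands. All class, stimulus, and command names here are hypothetical and do not refer to any existing platform or API; this is an illustrative sketch, not a specification of the Stage 1 system.

    # Minimal sketch of a reflexive stimulus-response loop (hypothetical names).
    # Detected stimuli are mapped to fixed motor "reflexes" such as turning the
    # head, opening the mouth, or making stepping movements. No learning involved.

    from dataclasses import dataclass
    from typing import Callable, Dict, Tuple


    @dataclass
    class Stimulus:
        kind: str         # e.g. "visual", "auditory", "tactile"
        label: str        # e.g. "bright_light", "loud_sound", "cheek_touch"
        intensity: float  # normalized 0.0 - 1.0


    class ReflexController:
        """Maps (kind, label) pairs to hard-wired motor commands."""

        def __init__(self) -> None:
            self._reflexes: Dict[Tuple[str, str], Callable[[Stimulus], str]] = {}

        def register(self, kind: str, label: str,
                     action: Callable[[Stimulus], str]) -> None:
            self._reflexes[(kind, label)] = action

        def respond(self, stimulus: Stimulus) -> str:
            action = self._reflexes.get((stimulus.kind, stimulus.label))
            if action is None or stimulus.intensity < 0.2:  # ignore weak stimuli
                return "no_response"
            return action(stimulus)


    if __name__ == "__main__":
        controller = ReflexController()
        # Hard-wired "reflexes": each returns the motor command it would issue.
        controller.register("visual", "bright_light", lambda s: "turn_eyes_toward_stimulus")
        controller.register("auditory", "loud_sound", lambda s: "turn_head_toward_stimulus")
        controller.register("tactile", "cheek_touch", lambda s: "open_mouth_and_suck")

        print(controller.respond(Stimulus("tactile", "cheek_touch", 0.8)))
        # -> open_mouth_and_suck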

Robotics and Motor/Movement Engineers--Qualifications

• Experience in robotics, and in the integration of machine vision and auditory detection with robotic motor functioning (simple head, hand, arm, leg, eye, and oral movements).
• Experience in designing, developing, and building advanced robotic subsystems and systems that use auditory and visual sensing to perform simple, reflexive, and rhythmic movements.
• Working experience with 2D and 3D CAD and CAE systems.
• Experience in prototyping algorithms and working with robots.
• Experience with deep learning tools (Caffe, TensorFlow, Theano).
• Programming experience in one or more of the following: C, C++, Scala, R, Python, MATLAB, Objective-C, Swift.
• Ability to develop and implement motion-planning software and algorithms, including designing interfaces between subsystems.
• Ability to implement inverse kinematics and control of robotic systems (see the sketch following this list).
• Proficiency in integrating sensor fusion techniques with machine vision, auditory perception, and robotic movement.
• Experience with applied machine vision technologies including lighting, optics, electronics, and imaging algorithms is required.
• Ability to design and implement validation and calibration methodologies.
• PLC and HMI programming experience using commercial HMI software, including developing custom display elements and functionality within that environment.
• Experience writing complex VB.NET and/or C# applications is a plus.
• Experience with automation systems, drives, sensors, and servo controls.
• Experience with PLCs, networking, device communications, integration, and design.
• Ability to assist in fashioning a simple robotic system which can "reflexively" (in response to simple visual, tactile, or auditory stimuli) turn its eyes and head, open its mouth, wave its hands, clench its fists, touch its face, and make stepping movements.
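For the inverse-kinematics item above, the following is a minimal sketch of closed-form inverse kinematics for a planar two-link arm, the standard textbook case. The function name, link lengths, and target coordinates are illustrative assumptions only and are not drawn from the role description.

    # Minimal sketch: closed-form inverse kinematics for a planar two-link arm.
    # Given a target (x, y) and link lengths l1, l2, solve for the joint angles
    # (theta1, theta2) using the law of cosines ("elbow-down" solution).

    import math


    def two_link_ik(x: float, y: float, l1: float, l2: float) -> tuple:
        r2 = x * x + y * y
        # Cosine of the elbow angle from the law of cosines.
        c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
        if c2 < -1.0 or c2 > 1.0:
            raise ValueError("target is out of reach")
        theta2 = math.acos(c2)                          # elbow angle
        k1 = l1 + l2 * math.cos(theta2)
        k2 = l2 * math.sin(theta2)
        theta1 = math.atan2(y, x) - math.atan2(k2, k1)  # shoulder angle
        return theta1, theta2


    if __name__ == "__main__":
        # Arm with two 0.3 m links reaching toward a point 0.4 m out, 0.2 m up.
        t1, t2 = two_link_ik(0.4, 0.2, 0.3, 0.3)
        print(f"shoulder = {math.degrees(t1):.1f} deg, elbow = {math.degrees(t2):.1f} deg")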


*****