Can we build an open source machine that can receive the dankest memes from the future?
Interesting. I'm working on a language specifically to control my robotics hardware (#AIMOS). It isn't just used by the programmer; in fact, the programmer never sees it. #Cardware has a sensory skin, so you can teach the robot motion directly by guiding its limbs with your hands. The feedback from this is converted into an AIMIL program, which can be transmitted to another robot and used to animate it. It does not, however, transmit the servo positions; it creates metaphors for them and transmits those instead.
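To give a rough idea of what "metaphors instead of servo positions" could mean, here is a minimal Python sketch. Everything in it is illustrative: the function names, the token vocabulary (RAISE/LOWER/HOLD), and the threshold are my assumptions, not the actual AIMIL representation. The point is only that the taught trajectory is abstracted into symbols, and the receiving robot reconstructs motion using its own hardware parameters.

```python
# Hypothetical sketch: encode a hand-guided servo trajectory as abstract
# motion tokens ("metaphors") instead of raw positions. All names and
# thresholds here are illustrative, not the real AIMIL format.

def encode_motion(angles, threshold=5.0):
    """Turn a recorded servo-angle trajectory into symbolic tokens."""
    tokens = []
    for prev, cur in zip(angles, angles[1:]):
        delta = cur - prev
        if delta > threshold:
            tokens.append("RAISE")
        elif delta < -threshold:
            tokens.append("LOWER")
        else:
            tokens.append("HOLD")
    return tokens

def animate(tokens, start, step=10.0):
    """A different robot replays the tokens with its own step size
    and starting pose, so no servo positions are ever transmitted."""
    pos = start
    trajectory = [pos]
    for t in tokens:
        if t == "RAISE":
            pos += step
        elif t == "LOWER":
            pos -= step
        trajectory.append(pos)
    return trajectory

# Teaching robot records its hip servo while a hand guides the limb:
taught = [30, 42, 55, 54, 40, 30]
program = encode_motion(taught)  # ['RAISE', 'RAISE', 'HOLD', 'LOWER', 'LOWER']
replay = animate(program, start=90)  # the second robot's own trajectory
```

The transmitted `program` carries the shape of the motion, while each robot supplies its own geometry when replaying it.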
The idea behind this is that the robots can learn from their users and from each other.
For example: each robot has an individual gait, formed by teaching it how to take a step. Walking involves taking many steps, so when learning to walk, the AIMIL program for WALK invokes steps the robot has already learned, giving it a new skill (walking) built on existing knowledge (taking a step).
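The skill-composition idea above can be sketched as a simple recursive expansion. Again, every name here (the skill table, WALK, LEFT_STEP, and the primitive motions) is an assumption made for illustration; the real AIMIL programs presumably look nothing like this dictionary, but the structure is the same: a new skill is defined purely in terms of skills already taught.

```python
# Hypothetical sketch of skill composition: WALK is defined in terms of
# already-learned skills, which bottom out in primitives that were taught
# by hand-guiding. All names are illustrative, not actual AIMIL syntax.

skills = {
    "LEFT_STEP":  ["SHIFT_RIGHT", "LIFT_LEFT", "SWING_LEFT", "PLANT_LEFT"],
    "RIGHT_STEP": ["SHIFT_LEFT", "LIFT_RIGHT", "SWING_RIGHT", "PLANT_RIGHT"],
}

def define(name, parts):
    """Register a new skill as a sequence of existing skills."""
    skills[name] = parts

def expand(name):
    """Recursively expand a skill into its primitive motions."""
    if name not in skills:
        return [name]  # primitive motion, taught directly by guiding a limb
    out = []
    for part in skills[name]:
        out.extend(expand(part))
    return out

# WALK is expressed purely in terms of steps the robot already knows:
define("WALK", ["LEFT_STEP", "RIGHT_STEP", "LEFT_STEP", "RIGHT_STEP"])
sequence = expand("WALK")  # 16 primitive motions, 4 per step
```

Because each robot taught its steps differently, the same WALK definition expands into each robot's own individual gait.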
Because all the robots are to be joined wirelessly as a compute network, the AIMIL programs running across it will be the equivalent of machine thoughts. Very basic, granted, but the first steps to a consciousness.
I just saw #AIMOS and #Cardware. I skimmed through them, and they are very interesting projects. They must have been incredibly tough to program. "equivalent of machine thoughts. Very basic, granted, but the first steps to a consciousness." That's exactly what may be needed for this project: something to interpret raw basic data and pop out thoughts, the way humans can pop out very soft thoughts about tomorrow's lunch.