-
Servo Control Issue
07/08/2020 at 08:14 • 0 comments
I've modified the larger servos by adding analog feedback wires to the potentiometers inside them, so now they are more accurate. There are tons of tutorials on how to do this, or you can buy servos that already have a feedback wire. This means I will have to update my circuit diagram to include this (which I will), but for now I'm just running these extra wires to A1 and A2. It also means I will update my GitHub to include this feedback in the code (another thing I will do). By the way, my GitHub account is Weef Teef (in my last log it was autocorrected and therefore misspelled).
-
Now, more follow-along-able!
06/30/2020 at 08:13 • 0 comments
I added a circuit diagram to the images associated with the project; hopefully with this and the code posted on GitHub it can be easily replicated. (Sorry I didn't do this sooner! I probably should have done it first thing.)
Second, I've gotten the arm to point to locations on the real-world canvas (RWC) by clicking inside a pseudo-canvas on the screen. The next step will be to tie this function to my image-recognition code. For now the RWC needs to be at a pre-determined distance to calculate the forward kinematics, but I do have the MaxSonar EZ1 readings being read in and displayed; these will be used to determine the distance in the future. (I was also thinking of writing a setup function that creates some kind of "heat map" of distances. The reasons I would do this are: 1. it's totally cool and would make this project more versatile, and 2. I don't think it will take much extra coding beyond the original intent to get there.) The "hand" is also auto-leveling now, so as it accesses any point on the RWC the paintbrush stays level to the ground. A next step will be to add brush strokes tied to the image content once the arm arrives at the correct position.
One note that seems important to mention: the servos aren't terribly accurate, specifically the MG996R's. I'm looking into some ways of reducing the margin of error (adding more power, and a handful of other ideas I've found; I will update on any failures/successes). The reason I'm not using stepper motors is... I didn't have any. If I have to break down and buy one, I'll of course mention that, but one of the tenets of this project is: make a robotic arm that anyone interested in robotics could make at as little cost as possible. Stepper motors tend to cost a bit more, especially at the size I would need. But! I also don't want a wildly inaccurate arm. To be continued!
Lastly, I will likely post two sets of code for this project: a simplified one for just manually moving the arm (a good thing to start with anyway, for calibration, testing, and fun) and the more fully developed code that includes the art-making processes. The former should be up within a matter of days, and the latter perhaps by the end of next month (if I have a really productive month).
-
(some) Code is now on GitHub
06/25/2020 at 09:02 • 1 comment
Just a quick update:
I've added the code I'm using to control this (as well as the calibration displays) to my GitHub: Week Teef. It still needs loads of work, as I'm still in the process of figuring things out, but it DOES control the arm with (reasonable) accuracy. I don't know how user-friendly the code is yet; I'm kind of flying by the seat of my pants. I will go back and make sure the code makes sense later, as in notes everywhere and full instructions. But! If anyone has any questions before then, I'd be more than happy to answer them.
Other info: it's written in openFrameworks, using StandardFirmata loaded onto my Arduino Uno.
Feel free to let me know if I'm forgetting or missing anything.
-
First update
06/20/2020 at 19:20 • 0 comments
Disclaimer: as I am new to open-source projects, GitHub, and coding, there is a chance that I will forget to mention critical things and go into too much detail about useless stuff. Hopefully you will bear with me while I make adjustments!
So! I've been working on this for a few weeks, and so far I've gotten the robotic arm to run on openFrameworks using mouseX and mouseY to manipulate the arm. I'm not sure if anyone would even be interested in that code (which is perhaps nothing special, but not entirely useless!), but I will try to post it to my GitHub for people to use. I'm not sure if it would be useful to show exactly how I made the arm out of cardboard, but of course I'm willing to do so if that's at all helpful.
Going on, I've also managed to get the OpenCV library to take an image from my webcam and translate it into one of six colors plus a default color. In the near future I'm going to use OSC messages to connect automatically to RunwayML for the GAN alterations (again, I'm using Runway because my laptop is not suited to that).
I had been using 4 AA batteries to run all 5 servos (KY66 x2 and MG996 x3), but I don't think that's going to work long-term, so now I've got a rechargeable 6-volt battery pack.