-
So many details, so little brain to make it fit...
05/13/2015 at 07:10 • 0 comments

Made some edits and additions to some of my pseudocode. In the meantime, I'm going to start investigating which of my development boards I should use to run the final simulation. I have a SAM4S Xplained Pro Starter Kit from Atmel and a Stellaris LaunchPad LM4F120 Evaluation Kit from TI. Although both have Cortex-M4 based MCUs, the SAM4S kit has a higher clock speed and more memory to work with. It even comes with a memory card reader built into the development board. I might try to take advantage of that somehow. We'll see. If I can make it so that the number of neurons and layers can be easily expanded, this would truly be something else. Most of what I could glance at seems to have trouble with memory issues in storing multiple links for each neuron. I don't know, maybe I can fix that. Again, we'll see.
Note: Thanks to HackADay for letting me "swipe" the SAM4S at the Disrupt Hackathon. ;) Wish I could contact the Microchip guy who was there. Might have to investigate that too.
-
And again, and again, and again with the loops...
05/12/2015 at 23:35 • 0 comments

So, I've come up with some pseudocode for determining the highest and lowest intervals. It will compare the ranges of the bounds and will also compare the respective upper and lower bounds to see which is higher or lower. To do this properly, it will be broken into two functions as shown:
intervalH(n)
    h = 1
    for t = 2 to 8
        if (b_(n,t) - a_(n,t) > b_(n,h) - a_(n,h)) and (a_(n,t) > a_(n,h)) then
            h = t
    next t
    return h

intervalL(n)
    l = 1
    for t = 2 to 8
        if (b_(n,t) - a_(n,t) < b_(n,l) - a_(n,l)) and (b_(n,t) < b_(n,l)) then
            l = t
    next t
    return l
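A rough C version of these two selectors might look like this. I'm reading the comparisons as tracking a running best index rather than only comparing adjacent intervals (the function names and 0-based indexing are mine):

```c
#define T 8  /* eight neurotransmitter intervals per neuron */

/* Pick the "highest" interval: one with a wider range b-a AND a higher
   lower bound than the current best. t=1..8 in the log maps to 0..7. */
int interval_high(const int a[T], const int b[T]) {
    int h = 0;
    for (int t = 1; t < T; t++)
        if (b[t] - a[t] > b[h] - a[h] && a[t] > a[h])
            h = t;
    return h;
}

/* Pick the "lowest" interval: a narrower range AND a lower upper bound. */
int interval_low(const int a[T], const int b[T]) {
    int l = 0;
    for (int t = 1; t < T; t++)
        if (b[t] - a[t] < b[l] - a[l] && b[t] < b[l])
            l = t;
    return l;
}
```

Note that with both conditions ANDed together, it's possible for no later interval to beat the first one, in which case the first interval wins by default.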
With so many loops, it's no wonder most designers of neural networks go for the single-weight system. As simple as this all seems, I can already tell that all of this will definitely eat up a lot of processing power just to do one iteration of forward propagation, not to mention what it will take for the many iterations needed for the network to learn or do anything. This will not discourage me though. I will still press on. As I said before, it is only an abstract model that can, with time, be optimized to overcome such challenges. But if the first steps are not taken, no one gets anywhere.
-
A picture is worth a thousand code lines...
05/12/2015 at 07:15 • 0 comments

Well, I found a picture to help give a visual. It was initially an .svg file, but I was able to convert it to a .png file for posting. And in keeping with proper manners, the relevant info on the picture is as follows:
"Colored neural network" by Glosser.ca - Own work, derivative of File:Artificial neural network.svg (http://commons.wikimedia.org/wiki/File:Colored_neural_network.svg). Licensed under CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0) via Wikimedia Commons.
-
I pseudo want to code, it's too hard...
05/11/2015 at 18:53 • 0 comments

This is the best I can come up with for now. My scattered brain and I can't seem to get it together to come up with something better. Oh well, this is only the beginning. I hope to sneak in a method to determine the highest and lowest intervals. This is just the main bulk of how the neurons will be activated by the weights of their connections. How this will all come together is another question, but I'm excited that I've made it this far.

A simplified pseudocode for the averaging and activation function will be something close to the following:
i = neurons being read (preceding column)
j = neurons reading
x = first neuron in the reading column
y = last neuron in the reading column
u = first neuron in the preceding column
v = last neuron in the preceding column

for j = x to y
    A_j = 0                          ' deactivate while reading
    for t = 1 to 8
        F_t = 0
        h1 = 0
        h2 = 0
        for i = u to v
            h1 = h1 + A_i * W_(i,t)
            h2 = h2 + A_i
        next i
        if h2 == 0 then ht = 0 else ht = h1 / h2
        if a_(j,t) <= ht and ht <= b_(j,t) then
            F_t = 1
            A_j = 1
    next t
    for t = 1 to 8
        if F_t = 1 then
            for p = 1 to 8
                if F_p = 1 then change(t, p, j)
            next p
    next t
next j

change(t, p, j)
    switch (t)
        case 1 : b_(j,p) = b_(j,p) + 1
        case 2 : b_(j,p) = b_(j,p) - 1
        case 3 : a_(j,p) = a_(j,p) + 1
        case 4 : a_(j,p) = a_(j,p) - 1
        case 5 : W_(j,intervalH(j)) = W_(j,intervalH(j)) + 1
        case 6 : W_(j,intervalH(j)) = W_(j,intervalH(j)) - 1
        case 7 : W_(j,intervalL(j)) = W_(j,intervalL(j)) + 1
        case 8 : W_(j,intervalL(j)) = W_(j,intervalL(j)) - 1
    break
return
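As a concrete sketch, the averaging/activation part of the loop could look like the following in C. The struct layout, NPREV, and the function name are my own assumptions, and the change() bookkeeping is left out:

```c
#define T 8       /* neurotransmitter types per neuron */
#define NPREV 3   /* neurons in the preceding column (illustrative size) */

typedef struct {
    int a[T];         /* lower interval bounds */
    int b[T];         /* upper interval bounds */
    int active;       /* A_j: 1 = active, 0 = inactive */
    int triggered[T]; /* F_t: which intervals fired on this pass */
} Neuron;

/* One neuron's read/activate step: for each neurotransmitter t, average
   w_prev[i][t] over the active neurons i of the preceding column, then
   fire if the average lands inside the interval [a_t, b_t]. */
void read_inputs(Neuron *j, const int active_prev[NPREV], int w_prev[NPREV][T]) {
    j->active = 0;  /* deactivate while reading, as in the log entry */
    for (int t = 0; t < T; t++) {
        int h1 = 0, h2 = 0;  /* reset per neurotransmitter */
        for (int i = 0; i < NPREV; i++) {
            h1 += active_prev[i] * w_prev[i][t];  /* only active neurons count */
            h2 += active_prev[i];
        }
        int ht = (h2 == 0) ? 0 : h1 / h2;  /* integer average */
        j->triggered[t] = (j->a[t] <= ht && ht <= j->b[t]);
        if (j->triggered[t])
            j->active = 1;
    }
}
```

One thing this makes obvious: the inner sums are the same for every reading neuron in a column, so they could be computed once per column instead of once per neuron, which would help with the processing-power worry.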
Note: I was pleasantly surprised that the code snippet function recognized my pseudocode as VBScript. I was just throwing things together, who knew?
Edit: Updated the code to reflect the use of functions that can determine the higher and lower intervals.
Edit: Updated the code to temporarily deactivate the neurons while they read their input weights. This makes sure that they will only activate when their intervals are triggered, rather than flipping their status for each individual weight check. Even if a neuron was active before and is switched off here, its intervals will still trigger it back to its active state, because they check the activation status of neurons in the preceding layer. So, as long as it was active before, it will return to its active state so long as nothing has changed that would cause it to deactivate (all of the intervals failing to trigger).
-
But I hate math...
05/11/2015 at 05:33 • 4 comments

As for the equation that will be used to determine whether a neurotransmitter triggers one of the intervals in a neuron, it will be as follows:
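In symbols (the rendered equation image didn't carry over, so this is reconstructed from the pseudocode above and the description below):

```latex
h_t = \frac{\sum_{i \in n} A_i \, W_{i,t}}{\sum_{i \in n} A_i}
```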
where the numerator represents the sum of a specific neurotransmitter (t) from the neurons that are active and connected, while the denominator represents the count of active neurons (where n is the set of neurons that are active). This value will be compared with the corresponding interval to see if it is within the range to trigger that particular action and render a particular neuron active. It is just an averaging function. It might need adjusting later but, for now, it should suffice for what I'm trying to accomplish. Hopefully, I'll have a flowchart and some pseudocode to post later.
Note: Special thanks goes to @M. Bindhammer for helping me correct the equation's presentation.
-
Cake? One layer or five?
05/11/2015 at 05:18 • 0 comments

Well, as mentioned before, this project is meant to combine different types of neural networks into one single framework. However, the overall structure will be based on the multilayer perceptron. This topology relies on layers of neurons, typically visualized as vertical columns, where each neuron in a column is connected to every neuron of the adjacent column. The first perceptron had only two layers: one for input, one for output. More layers were added later so the network could deal with larger, more complex problems (voice analysis and filtering, data mining, etc.), with the additional layers being named "hidden" layers.
I have settled on three "hidden" layers (with 10 neurons each), plus two layers that will serve as input and output. My project will not be using backpropagation. Any difference in actuation will be fed forward through the network from the receptors (input layer) until it reaches the actuators (output layer), creating a feedback loop. As such, and to keep it a bit simple, there will be twenty receptors and five actuators. The receptors will only have their activation status changed, allowing their weights to be read by other neurons; intervals are not necessary for them, since intervals are not what activates the receptors. Meanwhile, the actuators will only have intervals that can be triggered to determine whether the actuator is activated or not. They will not possess any weights themselves, as they are the end-point of the chain and, as with the receptors, only the activation status matters.
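For a sense of scale, that topology (20 receptors, three hidden columns of 10, five actuators, eight neurotransmitters each) could be laid out as fixed arrays. The struct names and layout here are just an assumption of mine, not settled design:

```c
#define T         8   /* neurotransmitter types */
#define N_INPUT  20   /* receptors: weights + activation status, no intervals */
#define N_HIDDEN 10   /* neurons per hidden column */
#define N_COLS    3   /* hidden columns */
#define N_OUTPUT  5   /* actuators: intervals + activation status, no weights */

typedef struct { int active; int w[T]; } Receptor;
typedef struct { int active; int w[T]; int a[T]; int b[T]; } HiddenNeuron;
typedef struct { int active; int a[T]; int b[T]; } Actuator;

typedef struct {
    Receptor     in[N_INPUT];
    HiddenNeuron hidden[N_COLS][N_HIDDEN];
    Actuator     out[N_OUTPUT];
} Network;
```

With int fields this whole network is on the order of a few kilobytes, which is one reason the extra memory on the SAM4S kit is attractive, and fixed arrays sidestep the per-neuron link storage that seems to give other designs memory trouble.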
-
When one neuron talks to another...
05/08/2015 at 20:44 • 0 comments

Well, I started on how to describe some of the behavior of the neurons on an individual level. Here's the current scoop.
Each neuron will have a list of variables for use in carrying out the propagation of an input signal.
where:
- W_(n,t) is the weight of a neurotransmitter (t) of a specific neuron (n)
- a_(n,t), b_(n,t) are the bounds of the interval that the evaluation value has to fall within for the neuron to trigger a specific function
- A_n is the status of neuron (n), where one means it's active and zero means it's inactive
List of neurotransmitters (weights), initial intervals, and functions:

Weight   Initial interval   Function taken by neuron
t=1      a=0, b=100         Increases b of triggered interval
t=2      a=0, b=100         Decreases b of triggered interval
t=3      a=0, b=100         Increases a of triggered interval
t=4      a=0, b=100         Decreases a of triggered interval
t=5      a=0, b=100         Increases weight of highest interval
t=6      a=0, b=100         Decreases weight of highest interval
t=7      a=0, b=100         Increases weight of lowest interval
t=8      a=0, b=100         Decreases weight of lowest interval
A neuron will be active only if one or more of its intervals are triggered. When a neuron is active, its weights can be read by the other neurons connected to it; the weights of inactive neurons are ignored. While the upper and lower bounds of an interval can take on negative values (e.g. -1, -100), the weights themselves cannot. This is to be more consistent with the notion that a neurotransmitter in a biological organism is either present in a certain amount or not present at all. It can also help model how certain neurons won't activate even though there's a high amount of a particular neurotransmitter (like medication that is ineffective in correcting a chemical imbalance).
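In code, that non-negative rule just means clamping at zero whenever a weight is decremented, while interval bounds get no such clamp. A tiny sketch (helper names are mine):

```c
#define T 8  /* neurotransmitter types per neuron */

typedef struct {
    int w[T];  /* weights: clamped at zero, never negative */
    int a[T];  /* lower interval bounds: may go negative */
    int b[T];  /* upper interval bounds: may go negative */
} Neuron;

/* Decrease weight t, but never below zero: the neurotransmitter is
   either present in some amount or absent. */
void dec_weight(Neuron *n, int t) {
    if (n->w[t] > 0)
        n->w[t] -= 1;
}

/* Bounds have no floor, so an interval can sit entirely below zero,
   leaving its neurotransmitter unable to trigger it no matter how
   much is present. */
void dec_lower(Neuron *n, int t) {
    n->a[t] -= 1;
}
```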
That's all I have for now. How the neuron determines the highest/lowest interval, and the evaluation function to match against the list of intervals, haven't been worked out yet, but hopefully they will be soon.