Step 1: This is going to take a very long time.

First, you are going to need some kind of computing hardware, ideally one supported by a C++ compiler. Then you are going to need a lot of patience: this is going to take a while.
Step 2: Obtain the source code for the various components.

You will probably want to install PyTorch, as well as Visual Studio 2022 for Windows, for most of the experiments. If you are attempting to build something for a Raspberry Pi, a Propeller, or an Arduino, then you will also need those build environments. Download the various tools you need, along with the materials for this project.
Step 3: Get some of the examples running, then begin experimenting with some of the suggested changes.

As discussed in the project logs, you might want to try adding competitive, tournament-style evolutionary selection of models to the genetic-algorithm part. Or, if you want to get Eliza up and running, try having conversations with it, then go back and edit the scripts that Eliza uses.
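The tournament-style selection mentioned above can be sketched in a few lines of plain Python. This is a minimal illustration of the technique, not the project's actual code; the bit-string genomes and the fitness function are made-up placeholders.

```python
import random

def tournament_select(population, fitness, k=3, rng=random):
    """Pick one parent: sample k candidates at random, keep the fittest."""
    contenders = rng.sample(population, k)
    return max(contenders, key=fitness)

# Hypothetical example: genomes are bit-strings, fitness counts the 1 bits.
def fitness(genome):
    return sum(genome)

rng = random.Random(42)
population = [[rng.randint(0, 1) for _ in range(8)] for _ in range(20)]
parent = tournament_select(population, fitness, k=3, rng=rng)
```

Larger `k` makes selection more greedy (the best individuals win more often), while `k=1` degenerates to random selection, so the tournament size is a convenient knob for selection pressure.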
Step 4: Figure out how to make DeepSeek run custom, home-grown models.

Study how the mixture-of-experts (MoE) architecture works, then figure out how to create new experts that can be added to a larger existing framework. Alternatively, find out whether the MoE approach is viable for collections of small AI-based applications that can be made to interact with each other in other useful ways, as discussed in the Atari genetics and braiding examples. Obviously, any neural network has a "topology", and that topology is somehow stored in the model files that DeepSeek or LLaMA use, for example. Thus, it should be possible to create new models from scratch that work in DeepSeek, or perhaps even to get something DeepSeek-like to solve real mazes, for example with a small physical robot, while chatting with it about what it is doing.
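To make the routing idea behind mixture-of-experts concrete, here is a toy sketch in plain Python: a gate scores each expert for a given input, and only the top-k experts are run, with their outputs blended by the renormalized gate weights. This shows the routing mechanism only; in a real MoE model such as DeepSeek's, the gate and the experts are learned neural networks, and the expert functions and gate weights below are invented placeholders.

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical "experts": each is just a scalar function here.
experts = [
    lambda x: x + 1.0,   # expert 0
    lambda x: x * 2.0,   # expert 1
    lambda x: x * x,     # expert 2
]

# Made-up gate weights; in a real MoE these come from training.
gate_weights = [[0.5, -1.0], [1.0, 0.2], [-0.3, 0.8]]

def moe_forward(x, top_k=2):
    # Gate: a tiny linear map over the input plus a bias feature.
    features = [x, 1.0]
    scores = [sum(w * f for w, f in zip(ws, features)) for ws in gate_weights]
    probs = softmax(scores)
    # Route through only the top_k experts, renormalizing their weights.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    return sum(probs[i] / norm * experts[i](x) for i in top)

y = moe_forward(3.0)
```

The key property is sparsity: because only `top_k` experts run per input, you can keep adding experts (new skills) without making every forward pass more expensive, which is exactly why MoE is attractive for a growing collection of small, specialized components.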
glgorman