Once the novelty of being able to write programs by feeding natural language prompts in one end & getting high-level Python out the other end wore off, lions started pondering the next evolution of language models. The ideal language model would convert a language more precise than English straight into assembly language.
Describing gen AI prompts in a medieval natural language is exhausting. They desperately need a more precise language with the programming features invented over the last 100 years. A plus would be a library of prompts invoked by function prototypes instead of complete paragraphs. Functions are essential for any reuse. Comments are essential for prompts to be readable. We would be so lucky if prompts had common data structures like a stack & some form of memory management.
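A minimal sketch of what that prompt library could look like in Python — the prompt names, templates, & parameters here are all invented for illustration, not any real API:

```python
# Hypothetical prompt "library": each prompt is defined once as a
# parameterized template & invoked through a function prototype
# instead of being retyped as a complete paragraph.
PROMPT_LIBRARY = {
    "summarize": "Summarize the following text in {n} sentences:\n{text}",
    "translate": "Translate the following text into {lang}:\n{text}",
}

def render_prompt(name: str, **kwargs) -> str:
    """Expand a named prompt template with keyword arguments."""
    return PROMPT_LIBRARY[name].format(**kwargs)

# Reuse the same prompt with different arguments, like any function call.
print(render_prompt("summarize", n=2, text="Lions pondered language models."))
```

The template dict is the "library" & `render_prompt` is the linker: the caller never sees the medieval paragraph, only the prototype.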
The easiest way might be a preprocessor that expands keywords into medieval English, strips out comments, unrolls loops, expands variables, & links libraries. It's not practical to train the model to ingest a modernized prompt language directly, because the cheapest training data all uses medieval labels.
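Such a preprocessor could be sketched in a few lines of Python. The keyword table, the `$TOPIC` variable, & the `REPEAT n:` loop syntax are all made up for this sketch — the point is only the pipeline of comment stripping, loop unrolling, variable expansion, & keyword-to-English substitution:

```python
import re

# Invented keyword table: terse tokens expand to medieval English.
KEYWORDS = {
    "SUMMARIZE": "Summarize the following text",
}
# Invented variable table, expanded like a macro preprocessor.
VARIABLES = {"$TOPIC": "lion grooming"}

def preprocess(src: str) -> str:
    out = []
    for line in src.splitlines():
        line = re.sub(r"#.*$", "", line)              # strip comments
        if not line.strip():
            continue
        m = re.match(r"\s*REPEAT (\d+):\s*(.*)", line)
        body = [m.group(2)] * int(m.group(1)) if m else [line]  # unroll loops
        for chunk in body:
            for var, val in VARIABLES.items():        # expand variables
                chunk = chunk.replace(var, val)
            for kw, english in KEYWORDS.items():      # tokenize keywords into English
                chunk = chunk.replace(kw, english)
            out.append(chunk.strip())
    return " ".join(out)

src = """
SUMMARIZE about $TOPIC  # inline comment
REPEAT 2: Be concise.
"""
print(preprocess(src))
# "Summarize the following text about lion grooming Be concise. Be concise."
```

The output is the plain English the model was actually trained on, so nothing about the model itself has to change.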
Gen AI is basically a compiler for an ancient, imprecise programming language, & it's just another step in the long history of creating new compilers for new languages.
lion mclionhead