If you have been around these parts long enough, you know how the classic chatbots like Eliza, Parry, and MegaHAL work. Then there were other important approaches to AI, like perceptrons, SHRDLU, Mathematica, Watson, and so on. Some of those things have open-source versions, and some do not. One thing they all had in common, and perhaps still have, is that there always seems to be a "great breakthrough" in AI, one that is almost always pitched as if it were as great a leap forward as, say, solving sustainable controlled fusion. And then, of course, no, not really. Better luck next time.
Then perhaps one of the biggest breakthroughs ever came with the landmark observation, "What if attention is all you really need?" Purely attention-based systems, however they actually work, have their own set of problems, and chief among them is efficiency. And yet there are some very interesting things that can be said about "attention" in and of itself, so maybe a digression into the history of all of this will prove worthwhile.
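For anyone who has never seen it spelled out, the core operation behind that "attention" observation is small enough to sketch in a few lines, and the efficiency problem mentioned above falls right out of it: the score matrix is n-by-n in the number of tokens, so cost grows quadratically with sequence length. This is a minimal NumPy sketch under my own naming, not any particular framework's implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # The (n, n) score matrix is where the quadratic cost comes from.
    scores = Q @ K.T / np.sqrt(d_k)
    # Subtract the row max before exponentiating, for numerical stability.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 4 tokens, each an 8-dimensional vector (illustrative shapes).
rng = np.random.default_rng(0)
n, d = 4, 8
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))
V = rng.standard_normal((n, d))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)
```

Each output row is a weighted mix of the value vectors, with weights decided by how well that token's query matches every other token's key, which is the whole trick, and also the whole bill.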

Most people reading this, of course, were not even born yet when Don Lancaster wrote the article for the very first build-it-yourself "TV Typewriter," as if all that mattered at the time was putting "your message on the screen." Ah, the TV Typewriter! Somewhere, somehow, back in the day, beyond the valley of the shadows, once upon a long time ago, long before I ever wanted to write a novel where every chapter began with "It was a dark and stormy night, and as the swamp thing staggered from the crypt, suddenly there was a need for words," there was, of course, this:

Yeah, that thing. Back when Bill Gates was still in high school. Then one thing happened, and then another, and another, and we all know what happened. Even though Eliza came out in the '60s.
glgorman