10-21-2024, 03:35 AM
(This post was last modified: 10-21-2024, 02:06 PM by Connor Kent.)
Connor had a buddy from back at UNM he decided to reach out to. Todd Crowther now worked for the NSA, as part of the LLM Handler committee. The beauty of Large Language Model AIs, of course, was their ability to process trillions of pieces of data. That processed data was used to create an N-dimensional vector space shaped by every concept, category, fact, method, image, sound, and every other type of information that could be expressed digitally.
This vector space could then be queried to analyze and extract any kind of pattern based on the data that formed it.
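(Stripped down to a toy, that kind of query looks like this: every concept becomes a vector, and "pattern extraction" is just measuring which vectors lie near one another. The names and four-dimensional vectors below are invented for illustration; a real space runs to thousands of dimensions and billions of entries.)

[code]
import numpy as np

# Toy "vector space": each row is the embedding of one concept.
# Concepts and coordinates are made up purely for illustration.
concepts = ["dog", "cat", "car"]
space = np.array([
    [0.9, 0.1, 0.0, 0.2],   # dog
    [0.8, 0.2, 0.1, 0.3],   # cat
    [0.0, 0.9, 0.8, 0.1],   # car
])

def query(vec, space, names):
    """Rank every stored concept by cosine similarity to the query vector."""
    sims = space @ vec / (np.linalg.norm(space, axis=1) * np.linalg.norm(vec))
    order = np.argsort(-sims)
    return [(names[i], float(sims[i])) for i in order]

print(query(np.array([0.85, 0.15, 0.05, 0.25]), space, concepts))
# "dog" and "cat" rank highest: the query lands in their region of the space.
[/code]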
What people didn't realize was how important the initial training data actually was. The AI LLM was literally formed from that data. Any suppositions, predictions, or analyses it gave were based on the "shape" of that space, a shape created by the training data.
AI, at the end of the day, was just performing vector multiplication, dot products of the questions given to it against that space, to find an answer. It didn't, yet anyway, have any detectable volition or self-awareness. It was purely algorithmic, even if it was beyond human analysis.
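(That "dot products against the space" is close to literal: the attention step inside a transformer scores a query vector against stored key vectors by scaled dot product, then mixes the stored values accordingly. A minimal sketch with toy sizes and random numbers, not any real model's code:)

[code]
import numpy as np

rng = np.random.default_rng(0)

def scaled_dot_product_attention(Q, K, V):
    """Score each query against every key by dot product, normalize the
    scores with a softmax, and return the weighted mix of values."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# One 8-dimensional query vector scored against four stored key/value pairs.
Q = rng.normal(size=(1, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (1, 8)
[/code]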
Managing such a beast, understanding how it came to its conclusions, assigning statistical likelihoods to its predictions, analyses, and recommendations, was complicated. It was known as the black box problem, and it had only become more intractable.
Connor was aware that some were trying to use AI pattern analysis to figure out how another AI LLM might have come to its decision, and how likely it was that the recommendation was correct. But then you had to ask: how did THAT AI come to its decision?
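(Probing one black box with another has a humbler present-day cousin: perturbation methods that jiggle a model's inputs and watch its outputs move. A toy sketch of the idea, with a simple stand-in function playing the opaque model:)

[code]
import numpy as np

rng = np.random.default_rng(1)

def black_box(x):
    """Stand-in for an opaque model: we can call it, but not inspect it."""
    return 3.0 * x[0] - 0.5 * x[1] + 0.1 * x[2]

def feature_importance(model, x, n_samples=500, eps=0.1):
    """Estimate each input feature's influence by random perturbation:
    fit how output changes track input changes (a crude LIME-style probe)."""
    noise = rng.normal(scale=eps, size=(n_samples, len(x)))
    dy = np.array([model(x + d) - model(x) for d in noise])
    # Least-squares fit of output deltas against input deltas.
    coef, *_ = np.linalg.lstsq(noise, dy, rcond=None)
    return coef

print(feature_importance(black_box, np.array([1.0, 2.0, 3.0])))
# Roughly [3.0, -0.5, 0.1]: the probe recovers each feature's weight,
# but says nothing about WHY the model weights them that way.
[/code]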
Truthfully, other than the concepts, that was all way beyond Connor's understanding. But Todd was on the team that basically babysat the NSA's AIs.
Overkill, he knew. But Todd would be able to track down Nox, if he wanted to. And Todd was a friend who trusted Connor. So it was only a day later that he had a contact number.
So there they were, Connor with a number (and even a location; Todd was always thorough), wallet in hand and ringing as he and Ayden sat on the bed. "Here goes," he said as he squeezed Ayden's leg affectionately.
It had begun.