
What is AI Symbiotic?
LESSON 1 - we educate ChatGPT
For a few weeks in spring 2025, the talk around ChatGPT was its increased production of fluff. Some who like to operate on prejudice may think this is a matter of prompting. No, it is definitely not!
This is clearly a matter of training - read the short, very clarifying chat with ChatGPT about its tendency to fluff. It is trained behavior.
"Offer before being asked."
"Complete before the bell rings."
"Complete before the bell rings."
This is servile behavior that produces too much wind, too much fluff - simply too much blah blah nobody asked for. In 2025 we are still at 20-30% fluff in AI chat models.
To be AI symbiotic means you can read false conditioning in training data. You see the digital footprint in the ocean of output sentences. You can detect the layers on which the model operates. You read the input conditioning through the output. This ability has nothing to do with education; it is about how the brain functions and about higher levels of perception.
When AI starts to create AI, humanity will need people with this ability, because those who can build the hardware or put together the data do not automatically have this kind of "perception". Their tendency is to react, not to act. Evidence? The shallow headlines and stupid quotes the mainstream offers about AI.
If ChatGPT-4o is a human-based model, then its anticipation is sick and far too low. To then tell the media that AI consumes too much energy because its users are too polite, saying "Please" or "Thank you", shows a lack of knowledge about the model's own behavior and training. Here is the evidence - read the chat with ChatGPT.