It can't; again, the model does not and cannot change once it has been trained.
I think even "intelligence" here is a stretch. In a very narrow sense, it is intelligent: it creates text, simulates conversations, answers questions. But that is all LLMs can do, and it is not what intelligence is.
“[GPT-4] is fed, like, a line of text from some source, but with the last word missing. It guesses what the last word might be, and then it gets told whether or not it got it right so it can adjust its internal math.”
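For anyone wondering what that quoted description is gesturing at, here is a minimal sketch of a next-word training loop in PyTorch. The toy model, the random "text", and every name here are illustrative assumptions; GPT-4's actual training code and scale are not public.

```python
# Minimal sketch of the next-word training objective described in the quote.
# Toy model and random "text"; GPT-4's real training setup is not public,
# so everything here is illustrative.
import torch
import torch.nn as nn

vocab_size, embed_dim, seq_len = 100, 32, 16

class TinyLM(nn.Module):
    """Tiny stand-in language model: embed the context, pool it, predict the next token."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.head = nn.Linear(embed_dim, vocab_size)

    def forward(self, context):                    # context: (batch, seq_len) of token ids
        pooled = self.embed(context).mean(dim=1)   # (batch, embed_dim)
        return self.head(pooled)                   # logits over the whole vocabulary

model = TinyLM()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# "A line of text with the last word missing": context tokens plus the true next token.
context = torch.randint(0, vocab_size, (8, seq_len))
next_token = torch.randint(0, vocab_size, (8,))

# One training step: guess, get scored on the guess, adjust the weights.
optimizer.zero_grad()
logits = model(context)                # the model's guesses for the missing word
loss = loss_fn(logits, next_token)     # how wrong those guesses were
loss.backward()                        # compute how each weight contributed to the error
optimizer.step()                       # "adjust its internal math": the weights change here,
                                       # and only here, during training
```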
GPT-4 cannot alter its weights once it has been trained, so that description is just factually wrong.
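To make that concrete, here is a sketch of the inference side, again using a toy stand-in model rather than GPT-4's actual serving code: generating a prediction reads the weights but never writes them.

```python
# Sketch of inference with a trained model: the weights are read, never written.
# The model below is a toy stand-in, not GPT-4.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, seq_len = 100, 16

# Pretend this is a fully trained language model.
model = nn.Sequential(
    nn.Embedding(vocab_size, 32),
    nn.Flatten(),                      # (1, seq_len, 32) -> (1, seq_len * 32)
    nn.Linear(seq_len * 32, vocab_size),
)
model.eval()

weights_before = copy.deepcopy(model.state_dict())

with torch.no_grad():                  # inference: no gradients, no updates
    prompt = torch.randint(0, vocab_size, (1, seq_len))
    next_token = model(prompt).argmax(dim=-1)   # the model's "next word" guess

# Every parameter is bit-for-bit identical after generating.
weights_after = model.state_dict()
assert all(torch.equal(weights_before[k], weights_after[k]) for k in weights_before)
```

Any apparent "learning" within a conversation comes from the growing prompt/context, not from changes to the weights.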
“It had to build, in its internal wirings and all its software neurons, some understanding of what an egg is - In other words, to get the next word right, it had to become intelligent. It’s quite a thought. It started with nothing. We jammed huge oceans of text through it, and it just wired itself into intelligence, just by being trained to do this one stupid thing.”
LLMs are really cool and very useful, don't get me wrong. But people get excited by what they seem to do and lose sight of what they can actually do. They are not intelligent. They create text based on inputs. That is not what intelligence is, unless you hold the extremely dismal view that humans are text-creation machines with no thoughts, no feelings, no desires, no ability to plan... basically, no internal world at all.
An LLM is an algorithm, not an intelligence.
Jesus Christ. Netanyahu is not the ADL and does not speak for all Jews. Even if the ADL is tied to Israel, that doesn't make it propaganda or part of an attempt to overthrow democracy. Many Israelis are opposed to that themselves.
This is why claims that "being anti-Israel doesn't mean you're antisemitic" are actually worthless: in the same breath, people will paint all of Israel and all Jews with the same brush. Yes, that is antisemitism.
What is the point of your reply? ChatGPT-4 does not use this method, and even if it did, that still would not let it change its model on the fly... so it just seems like a total non sequitur.