This was a year before the stuff with the Zizians happened too.
FredFig
https://github.com/TecharoHQ/anubis/issues/50 and of course we already have chatgptfriends on the case of stopping the mean programmer from doing something the Machine doesn't like. This person doesn't even seem to understand what Anubis does, but they certainly seem confident ChatGPT can tell them.
- My big robot is really expensive to build.
- If big robot parts become cheaper, I will declare that the big robot must be bigger, lest somebody poorer than me also build a big robot.
- My robot must be made, or else I won't be able to show off the biggest, most expensive big robot.
QED, I deserve more money to build the big robot.
P.S. And for the naysayers, just remember that that robot will be so big that your critiques won't apply to it, as it is too big.
If you're referring to genetic algorithms, those work by giving the computer some easy-to-measure target to gun for and then letting it loose randomly mutating bits of the original object. I guess in your mind, that'd be randomly evolving the codebase and then submitting the best version.
There are a lot of problems with the idea of a genetically evolved codebase that I'm sure you can figure out on your own, but I'll give you one for free: "better code" is a very hard thing for a computer to measure.
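To make that concrete, here's a minimal sketch of a genetic algorithm in Python, evolving a bit string toward a hard-coded target. The target, population size, and mutation rate are all made-up toy values; the only reason this works at all is that the fitness function ("how many bits match the target") is trivially measurable, which is exactly what you don't have for "better code."

```python
import random

# Toy genetic algorithm: evolve a bit string toward a fixed target.
# All parameters below are arbitrary illustration values.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
POP_SIZE = 50
MUTATION_RATE = 0.05
GENERATIONS = 200

def fitness(candidate):
    # The easy-to-measure objective: count of bits matching the target.
    return sum(1 for a, b in zip(candidate, TARGET) if a == b)

def mutate(candidate):
    # The "randomly changing bits" step: flip each bit with small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in candidate]

def crossover(a, b):
    # Splice two parents at a random point.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        print(f"matched target in generation {gen}")
        break
    # Keep the best half, refill the population with mutated children.
    survivors = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(survivors), random.choice(survivors)))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children

best = max(population, key=fitness)
print("best candidate:", best, "fitness:", fitness(best), "/", len(TARGET))
```

Swap `fitness` out for "is this codebase better?" and the whole scheme falls apart, because there's no cheap scalar score to sort the population by.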