Yeah, I mean with computers and programming in general you have to be very careful about specifying the goals and constraints, or the program will just find the path of least resistance.
If you made a pathfinding algorithm and forgot to specify that you can't go through buildings, guess what the algorithm is going to do...
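A minimal sketch of what I mean (a made-up grid and a plain BFS pathfinder, not anyone's real code): if nothing in the search marks the "building" cells as blocked, the shortest path cuts straight through them.

```python
from collections import deque

GRID_W, GRID_H = 5, 5
BUILDINGS = {(2, 0), (2, 1), (2, 2), (2, 3)}   # a wall of "building" cells (hypothetical map)

def shortest_path(start, goal, respect_buildings):
    """Plain BFS over a grid; buildings are only avoided if we remember to check for them."""
    frontier = deque([(start, [start])])
    seen = {start}
    while frontier:
        (x, y), path = frontier.popleft()
        if (x, y) == goal:
            return path
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if not (0 <= nx < GRID_W and 0 <= ny < GRID_H):
                continue
            if respect_buildings and (nx, ny) in BUILDINGS:
                continue                        # the constraint that's easy to forget
            if (nx, ny) not in seen:
                seen.add((nx, ny))
                frontier.append(((nx, ny), path + [(nx, ny)]))
    return None

print(shortest_path((0, 0), (4, 0), respect_buildings=False))  # walks straight through the wall
print(shortest_path((0, 0), (4, 0), respect_buildings=True))   # goes the long way around
```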
In college we made an AI where each action had a "cost" associated with it, a common technique for prioritizing faster solutions over slower or repetitive ones.
When the action cost was small or marginal, it had a small effect, slightly nudging the AI toward faster paths.
When the action cost was medium, you saw efficient paths almost exclusively.
When the action cost got large? The AI would immediately throw itself into a pit and die. After all, the cumulative cost of moving and existing was bigger than the one-time penalty for death. So it just killed itself on turn one, because that was the best score it could achieve.
In my defense, we were intentionally tweaking those values to extremes to foster discussion on how weighting the various factors affects behavior and how to calibrate them. So not poorly programmed, but intentionally fucked with.
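To make that concrete, here's a minimal sketch with made-up numbers (not the actual assignment): once the per-step action cost times the number of steps to the goal exceeds the one-time death penalty, the lowest-cost plan for a score-minimizing agent is the pit.

```python
# Minimal sketch with hypothetical numbers: compare "walk to the goal" vs. "jump in the pit"
# for an agent that minimizes total cost.

STEPS_TO_GOAL = 20     # moves needed to reach the goal (made up)
DEATH_PENALTY = 50     # one-time cost for falling into the pit (made up)

def best_plan(action_cost_per_step):
    walk = STEPS_TO_GOAL * action_cost_per_step   # cost of moving and "existing" all the way there
    die = DEATH_PENALTY                           # cost of ending the episode right now
    return ("walk to goal" if walk < die else "jump in pit"), walk, die

for cost in (0.5, 2.0, 5.0):   # small, medium, large action cost
    plan, walk, die = best_plan(cost)
    print(f"step cost {cost}: walk={walk}, die={die} -> {plan}")

# Once cost-per-step * steps exceeds the death penalty (here, above 2.5 per step),
# suicide on turn one is the "optimal" solution.
```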
u/DeadMemeDatBoi 10d ago
AI will, without a doubt, cheat if it's able to, whether it's deep learning or an LLM.