@fastfinge@interfree.ca
Just for the heck of it, I decided to try #AI to fix an addon to run in the latest #NVDA alpha. I knew exactly what was wrong with it: it bundled a module precompiled for #Python 3.11, and new alphas of NVDA are built with Python 3.13. I gave ChatGPT the addon file with this instruction: "In the latest alphas of NVDA, NVDA is now built with Python 3.13. That means this addon won't work. Please fix it." The simple solution would be to go get the right precompiled module from pip (there's a sketch of that route at the end of this post). But I didn't know if it would even be able to figure out what a .nvda-addon file was, let alone how to fix it. Turns out it did.

It spent six minutes, unpacked the addon, did things to the Python code, then gave me back a new .nvda-addon file. But instead of just getting the right version of the module, it... rewrote the entire addon to not use that module. And its rewrite added some new features while removing some existing ones.

I'm kind of both impressed that it was able to handle the entire task on its own, and baffled that it did it in the hardest possible way. It even updated the manifest file correctly (there's a note below on what that involves). But it also left commented-out sections of code, and created some Python files it never wound up using for anything. You could kind of see it start doing something, forget what it was doing, then go do the same thing a different way, without ever deleting the half-completed evidence of the first attempt it abandoned for some reason.

If more and more people use AI, code is going to get really, really weird and organic, y'all.
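
For reference, here's a minimal sketch of the "simple solution" route. The addon filename, the module name, and the folder layout are all hypothetical placeholders; the part that isn't an assumption is that a .nvda-addon file is an ordinary zip archive with a manifest.ini inside, which is why the unpack step works at all.

    import glob
    import shutil
    import subprocess
    import zipfile

    ADDON = "example.nvda-addon"   # hypothetical addon filename
    MODULE = "some_module"         # hypothetical bundled module

    # 1. A .nvda-addon file is just a zip archive: unpack it.
    with zipfile.ZipFile(ADDON) as zf:
        zf.extractall("addon_src")

    # 2. Download a wheel of the module prebuilt for Python 3.13.
    #    pip can fetch wheels for a different interpreter version as
    #    long as it's restricted to binaries and skips dependency
    #    resolution.
    subprocess.run(
        ["pip", "download", MODULE,
         "--python-version", "3.13",
         "--only-binary=:all:", "--no-deps",
         "--dest", "wheels"],
        check=True,
    )

    # 3. Wheels are zip archives too: pull the cp313-tagged binaries
    #    out and drop them where the old cp311 ones lived (the target
    #    folder here is an assumed layout).
    wheel = glob.glob("wheels/*.whl")[0]
    with zipfile.ZipFile(wheel) as wf:
        for name in wf.namelist():
            if name.endswith(".pyd"):
                wf.extract(name, "addon_src/globalPlugins")

    # 4. Repack and rename; the .nvda-addon extension is just a zip
    #    by another name.
    shutil.make_archive("fixed", "zip", "addon_src")
    shutil.move("fixed.zip", "example-fixed.nvda-addon")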
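
And on the manifest point: an addon's manifest.ini is a small key/value file, and the field that gates whether a new NVDA release will load the addon at all is lastTestedNVDAVersion, so that's what typically has to be bumped for a new alpha. A sketch with entirely made-up values:

    name = exampleAddon
    summary = "Example addon"
    author = "Jane Doe <jane@example.com>"
    version = 1.2.3
    minimumNVDAVersion = 2024.1
    lastTestedNVDAVersion = 2025.1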