

Amdahl’s isn’t the only scaling law on the books.
Gustafson’s scaling law looks at how the maximum amount of work a computer could perform scales with parallelism. The idea is that for certain tasks, like simulations (or, to your point, even consumer devices to some extent), the problem size can grow to fully utilize the added cores, so more parallelism is a real improvement.
Amdahl’s takes a fixed program, considers what portion of it is parallelizable, and tells you the speedup from additional parallelism in your hardware.
One tells you how much a processor might do, the other tells you how fast a program might run. Neither is wrong, but each gives an incomplete picture of the colloquial “performance” of a modern device.
Amdahl’s is the one you find emphasized in a Comp Arch 101 course, because it corrects the intuitive error of assuming you can double the cores and halve the runtime. I only encountered Gustafson’s law in a high-performance architecture course, and it really only holds for certain types of workloads.
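A quick sketch of the contrast, using the standard textbook forms of both laws (p is the parallel fraction, N the core count; the values here are purely illustrative):

```python
# Amdahl:    fixed problem size;  speedup = 1 / ((1 - p) + p / N)
# Gustafson: problem grows with N; scaled speedup = (1 - p) + p * N

def amdahl(p: float, n: int) -> float:
    """Speedup of a fixed program whose parallel fraction is p on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

def gustafson(p: float, n: int) -> float:
    """Scaled speedup when the parallel work grows to fill n cores."""
    return (1.0 - p) + p * n

if __name__ == "__main__":
    p = 0.95  # illustrative: 95% of the work parallelizes
    for n in (2, 8, 64, 1024):
        print(f"N={n:5d}  Amdahl={amdahl(p, n):6.2f}x  Gustafson={gustafson(p, n):8.2f}x")
```

With p = 0.95, Amdahl’s speedup asymptotes at 1/(1 − p) = 20x no matter how many cores you add, while Gustafson’s keeps climbing because the workload itself scales. That divergence is the whole disagreement between the two laws.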
The script doesn’t go away when you replace a helpdesk operator with ChatGPT. You just get a script-reading interface without empathy and a severely hindered ability to process novel issues outside its protocol.
The humans you speak to could do exactly what you’re asking for, if the business did not handcuff them to a script.