I remember talking about this at the time. It did get some attention, although the industry was quick to blame other things instead, like AI.
Meanwhile, I would argue that most efficiency gains from AI are outweighed by the overhead: false positives wasting time and energy, learning prompt engineering, and all the other testing and fiddling.
In my experience, developers' actual productivity is inversely correlated with how loudly they espouse AI usage. I straight up had an extremely mid developer tell our CFO (who has no business scheduling meetings about AI with devs in the first place) that AI made them 10x as productive. This is of course complete bullshit, and of course it got the CFO salivating. All of this is madness.
LLMs work in some limited use cases with good data to draw from. Most companies, especially large corporations, have shit data. Current AI cannot fix bad data. LLMs will never do what these guys are promising they will do.
You’re not arguing against anything I’ve said, given that I said "blame", as in scapegoating AI for mismanagement and fundamental changes.