Discussion about this post

The Bull and The Bot:

Your Leadership-Lab-Crowd triangle perfectly describes the gap I'm seeing on Wall Street. Only some large banks and PE funds have rolled out internal chatbots or partnered with firms creating financial AI tools - but even there, uptake is tiny because the tools are presented to employees as "optional sidekicks." When the message is "play with it on your own time," no one pulling 80-hour weeks is going to bother.

The deeper blocker is psychological. Junior staff worry that using AI to do grunt work will short-circuit the skills they're supposed to master. But not all grunt work is equally valuable for skill-building, and the skill that will matter most in the coming years is knowing how to direct, audit, and iterate on AI outputs. Leadership has to make that explicit - shift the truly mind-numbing grunt work to AI, keep the judgment-building parts in human hands, and treat "managing the machine" as the new apprenticeship. That kind of sorting won't happen if the message from the top is "try using AI if you want, when you want." It needs an org-wide mandate and protected forums - AI discussion committees and innovation sessions where teams regularly test, map, and share what works.

I make the same case in my Substack post "Grunt Work & Growth" and would love your take if you have a minute. Thanks for pushing the conversation forward!

J Young:

Great article. We are in the "faster horses" phase of AI, in reference to Henry Ford's comment, "If I asked my customers what they wanted, they would say a faster horse."

Everyone is just imagining how AI will make their old work faster, not really understanding the impact of the changes. If you use AI to fill out a document, and the other person uses AI to read the document, then why do we need the document at all? Some documents will still be needed, of course, but maybe not the ones that could be easily automated.

This is the time to rethink the entire process. There is still too much emphasis on reverse-engineering AI to do pointless work faster and in higher volume.

Looking back at previous waves of automation, think of the intense precision and careful thought required to get an automated packaging line to operate properly. The WORK is now in the design of the task, not in the doing of the task.

The best humans could never match the output of an automated filling line. But a poorly thought-out design can lead to choke points and piles of broken bottles that take longer to fix, and cost more, than just filling the bottles by hand.

Doing bad or pointless work faster or more frequently is not the goal.

