Punchline: AI will make it harder for leaders to build great cultures.

Leaders looking to build stand-out organizations can expect AI to put a significant and hard-to-combat drag on their efforts to build a great place to work. That's because AI learns over time from all the people using it—not just from your company, but from all the other companies whose employees are using that AI, too. So when your AI bot suggests a “best” practice, it’s not really a “best” practice; it’s more like a “most common” practice. And while you’re working to infuse your company with practices that are unique, your software will be subtly guiding employees toward practices that are standard.
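
To make that concrete, here’s a toy sketch of the dynamic—not any real vendor’s system, and the data and names are entirely made up. It just shows how a recommendation built from aggregate usage ends up echoing whatever most users do:

```python
from collections import Counter

# Hypothetical aggregate data: which email style each user across all
# companies favors. Your company is the lone grainy-GIF shop in the pool.
observed_practices = (
    ["professional stock photo"] * 9_000
    + ["grainy homemade GIF"] * 40        # that's you
    + ["plain text, no images"] * 1_000
)

def suggest_best_practice(observations):
    """Recommend whatever most users do -- here, 'best' really means 'most common'."""
    practice, _count = Counter(observations).most_common(1)[0]
    return practice

print(suggest_best_practice(observed_practices))
# -> "professional stock photo", regardless of what your culture values
```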

For example, if your company is known for putting grainy, homemade GIFs in customer emails, your AI editor might tell you that those kinds of images reduce readership relative to high-res, professional-grade photos—because that’s the pattern it sees across all users of the editor. And that guidance will pop up every time a GIF gets put into an email. So now, Leader, you have a problem: not only do you have managers who may already think your GIF culture is silly, but they’re now armed with an authoritative voice telling them they’re right and you’re wrong.

Your job just got a lot harder.

The solution to this will include much of what culture-building has always required: active leadership, 100% adherence from executives, lots of face-to-face time with employees, near-incessant repetition, and ruthless hiring and performance practices that reject people who refuse to get with the program.

Moving forward, the required dosage of all of these activities will increase. Leaders will also need more intentionality than ever before, as they’ll have to clearly explain why their AI bot is wrong about some things even while they try to get people to trust the bot with other decisions.

It may also require hiring software developers into HR, where organizational behavior experts tend to dwell; these developers would be responsible for tweaking the AI bot’s algorithms and protecting the company culture in areas where “best practices” would water it down.
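
As a purely hypothetical illustration of what that work might look like, the sketch below layers a company-specific override on top of a generic suggestion engine. Every name here is invented; real vendor tooling may or may not expose anything like this:

```python
# Hypothetical "culture guardrail" layer an HR-embedded developer might maintain.
# Nothing here refers to a real product; it's a sketch of the idea only.

CULTURE_PROTECTED = {
    # topic -> the practice leadership wants defended
    "customer_email_images": "Keep the grainy homemade GIFs; they're part of our voice.",
}

def generic_suggestion(topic: str) -> str:
    """Stand-in for whatever the bot would say based on aggregate 'best practices'."""
    return "Replace GIFs with high-res professional photos to improve readership."

def culture_aware_suggestion(topic: str) -> str:
    """Return the company's own guidance in protected areas, the bot's otherwise."""
    if topic in CULTURE_PROTECTED:
        return CULTURE_PROTECTED[topic]
    return generic_suggestion(topic)

print(culture_aware_suggestion("customer_email_images"))
```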

These are going to be very interesting times, indeed…

Categories: AI, leadership

1 Comment

John Sumser · November 16, 2017 at 4:28 pm

Umm. It’s more like, “it depends.” Not all intelligent software harvests the entire customer catalog to generate guidelines for mediocrity. This is a good thing to discuss with your provider but not an inherent part of the deal.

The important point that you raise is that buyers need to be clear about what they are buying. The sort of intelligent software you are describing should put an end to the other big source of mediocrity – benchmarking. The warning is important. Alternatives exist.
