Can ChatGPT Write Your Employee Handbook? Why It’s Risky with Bryan Driscoll

In this episode of World’s Greatest Boss, Jackie Koch dives into the complexities of using AI in HR practices with Bryan Driscoll, a former employment attorney turned HR consultant. With AI tools like ChatGPT becoming increasingly popular for tasks like writing employee handbooks and screening job applicants, it’s crucial to understand where AI can be helpful and where it poses significant risks—especially from a legal perspective.

Bryan shares insights on the legal ramifications of using AI in hiring, the difference between AI and automation, and why relying on AI for employment decisions can lead to compliance issues. He explains how the EEOC holds companies accountable for discrimination caused by AI and why small businesses need to be extra cautious when using AI-driven hiring tools. Bryan also touches on the importance of human judgment in screening resumes and navigating complex employee issues.

From hiring challenges to managing workforce behavior, Jackie and Bryan discuss practical advice for using AI responsibly and avoiding common pitfalls. Bryan also shares tips on keeping employee handbooks compliant, especially when dealing with multi-state workforces.

What you’ll hear in this episode:

[00:00] Intro: Risks of using AI in HR practices

[01:50] Bryan Driscoll’s perspective on AI in the workplace

[02:56] Legal risks of using ChatGPT for compliance and HR issues

[05:28] Defining AI vs. automation in HR

[07:18] Examples of AI-driven hiring and potential discrimination risks

[08:41] Jackie’s real-world experience with AI tools in recruiting

[10:34] Challenges of using AI to hire generalists in small businesses

[12:40] The importance of auditing AI-driven hiring processes

[13:19] Risks of using AI to manage workforce issues and employee behavior

[14:22] Why creating employee handbooks with AI can lead to compliance failures

[17:22] Why employees are more knowledgeable about their rights today

[18:32] Handling compliance mistakes proactively

[19:38] Legal trends and upcoming regulations to watch

[20:15] Bryan’s background and advice for business owners

[21:43] Closing thoughts: Being proactive with HR policies

Resources & Links:

Connect with Bryan Driscoll on LinkedIn for more HR insights and guidance.

Visit PeoplePrinciples.co for more resources on building a compliant and effective team.

Follow Jackie on LinkedIn: Jackie Koch

READ IT INSTEAD:

In today’s fast-paced business world, you are always (or should be) looking for ways to streamline processes, save time, and make better hiring decisions. You’re probably using, or at least thinking about using, tools like ChatGPT to become more efficient. Whether it’s writing employee handbooks or screening job applicants, AI seems like the perfect solution for busy entrepreneurs.

But here’s the thing… using it to handle all your important HR tasks can actually put your business at serious legal risk. If you’re using ChatGPT to draft your handbook or relying on automated resume screening tools, you might be setting yourself up for a compliance nightmare.

The Hype: Are You Putting Too Much Trust in Technology?

It’s easy to see why AI is so appealing. Who wouldn’t want a tool that screens resumes in seconds or drafts employment policies with the click of a button? The problem is, AI doesn’t think like a human, and that’s a big deal when it comes to employment decisions.

For example, some AI tools used in hiring can inadvertently introduce bias. Imagine using a technology that screens resumes based on keywords. Sounds like a dream, right? But if those keywords reflect historical biases (like favoring certain schools or backgrounds), you could end up with a candidate pool that lacks diversity. Even if it’s unintentional, you could still be legally liable for discrimination.

Real Risks: What the Law Says About AI and Hiring

The Equal Employment Opportunity Commission (EEOC) has made it clear: If your AI software makes discriminatory hiring decisions, your business, not the software developer, is liable. That’s right. Even if you’re just using a basic AI tool to sort through resumes, you’re still responsible for any bias or unfair practices that result.

Why ChatGPT and Employee Handbooks Don’t Mix

ChatGPT and similar tools can generate text quickly, but when it comes to legal documents like employee handbooks, it’s not that simple. Different states have different employment laws, and ChatGPT doesn’t always know the specifics. If your handbook doesn’t comply with local regulations, you could face fines or lawsuits, especially if you have remote workers in states like California, where employee protections are much more stringent.

So, What Should You Do?

  1. Audit Your AI Tools: If you’re using any AI in hiring, regularly audit the results. Make sure the AI isn’t filtering out qualified candidates based on non-relevant factors like school names or keywords that reflect bias.

  2. Human Oversight is Key: AI should support your hiring process—not replace it. Always have a person review resumes and final decisions, especially if the role requires a nuanced skill set.

  3. Be Cautious with Handbooks: Never rely solely on AI to draft or update your employee handbook. Employment laws vary greatly by state, and an AI-generated template could land you in legal hot water.

  4. Stay Informed: Keep an eye on legal updates regarding AI in the workplace. The rules are constantly evolving, and staying informed will help you make smarter decisions.

AI can be a game-changer for small businesses, but only when used thoughtfully. It’s tempting to automate every step, but when it comes to HR, human insight still wins. Use AI as a tool, not a crutch, and always do your due diligence to ensure compliance.
