At the start of 2023, everyone was talking about chatbots – including us, when we considered whether you could use ChatGPT for EHS. Toolbox talks, understanding regulations, and benchmarking suddenly seemed a lot easier, but we found that the answers often didn’t hit the mark.

Now that a year has passed since we last tested ChatGPT for EHS, a lot of new chatbots have come onto the market: Claude, Google Bard, Meta Llama, Pi, and Poe, to name just a few.

Since we’re all about Microsoft integration at Pro-Sapien, we had a look at Microsoft’s AI assistant “Copilot”.

Is it better than ChatGPT, and can you use it for EHS?

To find out, we tested both tools with health and safety prompts and compared their responses. We also did a bit of research into privacy and whether the bots are safe to use. Keep reading to discover our findings:

  • What is Microsoft’s Copilot?
  • ChatGPT vs. Copilot – which AI tool…
    • Understands prompts better?
    • Shares accurate information?
    • Has the better privacy policy?
  • The winner is…

Let’s get started by introducing Copilot.

What is Microsoft’s Copilot?

Similar to ChatGPT, Microsoft’s Copilot is an AI assistant that initially launched as “Bing Chat” in February 2023. Back then, it made headlines for being a rather feisty chatbot that didn’t hold back its emotions.

Microsoft soon recognized that users wanted a chatbot that was not just entertaining but also reliable and accurate. (Who would have thought!)

As a result, Bing Chat moved away from its initial playful nature towards a more professional tone and was eventually rebranded as Copilot. In Copilot, you can choose between detailed, adjustable prompts through Notebook and a conversational chatbot that lets you decide whether you want “creative”, “balanced”, or “precise” answers.

According to Microsoft, the tool can handle complex queries, summarize content, and provide real-time access to information through its integration into Microsoft Edge.

We put Copilot to the test with EHS-specific prompts – and compared it to ChatGPT.

First of all, we looked at how well it understands instructions.

Understanding Prompts

Last year, we tested the prompt “Write me a toolbox talk about hand protection”. Back then, ChatGPT kicked off with “Ladies and Gentlemen” and came up with an overly formal speech that wasn’t very engaging for frontline workers.

Now, we’ve tested the same prompt again with both ChatGPT and Copilot:

ChatGPT’s toolbox talk
Copilot’s toolbox talk

ChatGPT’s toolbox talk was certainly better than the one from a year ago. Its introduction now starts with “Hello everyone”, followed by “Let’s discuss something vital to our safety – hand protection”.

The talk is well-structured, moving from identifying hazards to selecting the correct hand protection to proper use and maintenance.

Copilot, in contrast, doesn’t address its audience and dives straight into hand injuries, such as “Hand Getting Crushed”.

While every hand safety toolbox talk will inevitably cover potential dangers, you engage workers better by acknowledging the hazards of their workplace and giving them clear instructions on how to protect themselves.

Therefore, we’d say that ChatGPT offers a more complete response by incorporating preventative measures and focusing on “what to do” rather than on “what not to do”.

So, there is room for improvement for Copilot when it comes to understanding prompts and writing targeted, creative content.

Sharing Accurate Information

Next, we tested how accurate and relevant the information from both chatbots is. For that, we asked them “What is ISO 45001 and what are developments in 2024?”:

ChatGPT’s response on ISO 45001
Copilot’s response on ISO 45001

Both tools gave us a similar definition of ISO 45001 but showed key differences in their responses on current developments.

ChatGPT’s knowledge was last updated in January 2022, meaning there is a knowledge cut-off. For questions about anything that happened after that date, the chatbot can only speculate. More recently, it has also been in the news for providing false information about individuals.

In contrast, Copilot is powered by Microsoft Bing and therefore has access to real-time information from the internet. That also means Copilot can share the sources for its responses, both as footnotes and as links within the text.

Verifying any EHS research you do with chatbots is crucial, since AI reflects the biases of the people and data it was trained on, which can lead to inaccurate or one-sided information. Copilot’s ability to show its sources therefore makes it much easier to double-check its responses and find further information on the topic.

Other Useful EHS Prompts

During our chatbot research, we experimented with various prompts to find more ways you can use them for EHS task management. We also asked the chatbots themselves how they could help. As a result, we discovered additional areas where they could assist in EHS:

EHS tasks and example prompts:

  • Job hazard analysis: “Create a job hazard analysis of a slurry operator”
  • Regulations and compliance requirements: “Give me an update on recent changes to reporting obligations in the EHS industry in the US, referencing relevant sites such as EHS Today and OSHA”
  • Root cause analysis: “Create a root cause analysis for the following incident” (see our last section on chatbot privacy)
  • Safety training: “Create a safety training schedule for the next month, including topics such as fire safety, chemical handling, and emergency response”

We found that all of the above worked best with Copilot, especially the prompts referring to current news and events.

In fact, independent research firm Verdantix found that ‘summarizing regulatory requirements’ was a priority for 77% of EHS professionals, second only to ‘analyzing leading indicators to predict safety risks’.

If you’re interested in more uses of AI that go beyond chatbots, we recommend reading the Verdantix report on “AI and the revolution of EHS compliance”.

Tip: When you come across a useful prompt, consider documenting it in a dedicated file. This way, you can easily reference it later and make your work more efficient.

Maintaining Privacy

While using a chatbot for toolbox talk inspiration is fairly harmless, it becomes more of a problem if you start giving it information about your company, for example for EHS benchmarking.

Unless you explicitly opt out in ChatGPT’s privacy settings, it will use any of your prompts to train its AI model. That means your data can eventually end up as part of a response for someone else.

For instance, Samsung developers pasted lines of confidential code into ChatGPT to fix a bug, handing it over as potential training data for future public responses.

The newer version of ChatGPT also has a memory feature that can recall previous prompts, which adds to the privacy concerns.

With EHS data being particularly sensitive, you need to be careful what you feed into the chatbot.

According to Microsoft’s data privacy statement, Copilot doesn’t use your prompts, responses, or data to train the chatbot, as long as you’re signed in with a work account. Moreover, it uses Azure OpenAI for processing, not OpenAI’s publicly available services.

However, if you use plugins with Copilot, whether your information is protected depends on the plugins’ terms of use.

Therefore, as a rule of thumb, we’d recommend only using AI for company data if you are 100% sure that it will never leave that space, or avoiding sharing sensitive data altogether. That way, you’ll always be on the safe side.

The winner is…

So, which chatbot is better for EHS? ChatGPT or Copilot?

Let’s summarize the key strengths and weaknesses that we identified throughout this article:

Copilot:

  • Great for finding relevant, up-to-date information with the related sources
  • Clear privacy policies
  • Creative content could be better and more targeted
  • Sometimes takes a few prompts to understand the brief

ChatGPT:

  • Good at understanding prompts
  • Became better at writing audience-targeted talks
  • Knowledge limited to January 2022, with no access to sources
  • Tendency toward bias and sharing speculative or incorrect information about individuals
  • Prompts are monitored and used for AI training unless you explicitly opt out

Overall, Microsoft’s Copilot might take a few more prompts to nail creative content. However, its ability to share current information along with footnotes makes it a much more reliable AI tool than ChatGPT.

On top of that, Copilot’s privacy features are probably the biggest reason why we’d recommend using it over ChatGPT. While you should always be careful about sharing company information on any sort of external platform, Copilot won’t utilize it for training purposes or share your data externally (unless you use it in combination with a plugin or without a work account).

If you find Copilot is a great fit for you, you can integrate it into your day-to-day Microsoft apps. For example, you can add it to Microsoft Teams and ask Copilot to produce conversation summaries.

On that note, Pro-Sapien’s EHS software also integrates with Microsoft 365 and saves you time, without needing any AI prompts. Discover how it boosts EHS engagement in enterprises:
