3 things knowledge managers must do before using AI

A common thread I’ve noticed cropping up in recent conversations with knowledge managers is that they’ve received a blanket mandate from management to “use AI” in their teams. Often, that message is passed on without any specific direction on scope or business objectives to support.

Reading between the lines, the gist of the message seems to be “we need to prove that we’re on board the AI train and that we’re not getting left behind.”

If this sounds familiar, read on for the first 3 things to do before you “use AI” in your team.

For the purposes of this blog, we’ve used examples that reflect the most common scenario we encounter: organisations that use SharePoint to collaborate and share knowledge and are rolling out Microsoft Copilot.

Note: There are differences in the content optimisation required for Microsoft 365 Copilot, Copilot Chat, and Copilot Studio agents.

Note: This blog does not aim to provide an exhaustive guide to getting your content or your SharePoint AI-ready. Instead, it offers knowledge managers a starting point before using Copilot on existing content. The focus is on content optimisation, and we outline three broad areas to explore that can help shape your thinking and approach. This blog assumes that all security preparation and regulatory restrictions have been implemented.

3 things knowledge managers must do before using AI in their team

1. Ensure the AI can only access the right information

When someone in your team uses AI to generate content or derive insights from existing content, it’s important that the AI only accesses information that the user can ordinarily access.

Example

Both the CEO and a member of the Customer Support team use Microsoft 365 Copilot to generate an overview of a product manufactured in-house.

Because the CEO has access to more information than the Customer Support team member, the overviews Copilot generates for each of them would include very different levels of detail.


Review SharePoint permissions

Microsoft 365 Copilot respects the permissions applied to content in SharePoint. So, the first thing to do before using Microsoft 365 Copilot in your team is to review SharePoint permissions and the security model in place.
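A permissions review can start with something as simple as scanning an export of site permissions for catch-all groups. The sketch below is illustrative only: the column names (`Site`, `GrantedTo`, `Permission`) and the group list are assumptions, not a real SharePoint export format, so adapt both to whatever report your tenant produces.

```python
import csv
import io

# Broad, catch-all groups that typically warrant review before
# enabling Copilot. (Hypothetical list; adjust to your tenant.)
BROAD_GROUPS = {"Everyone", "Everyone except external users", "All Users"}

def flag_broad_permissions(csv_text):
    """Return rows from a permissions export that grant access to
    broad groups. Assumes columns 'Site', 'GrantedTo', and
    'Permission' (an illustrative export format, not Microsoft's)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["GrantedTo"] in BROAD_GROUPS]

if __name__ == "__main__":
    sample = (
        "Site,GrantedTo,Permission\n"
        "HR Policies,Everyone except external users,Read\n"
        "Finance Reports,Finance Team,Read\n"
    )
    for row in flag_broad_permissions(sample):
        print(f"Review: {row['Site']} is open to {row['GrantedTo']}")
```

A flagged site isn’t necessarily misconfigured, but it is exactly the kind of place where Copilot can surface content to people you didn’t expect.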

Use a Copilot Studio agent with a SharePoint library as knowledge source

Another way of putting guardrails around the content that Copilot uses to generate content and insights is to create a SharePoint library for a subset of documents and define this as the knowledge source for a Copilot Studio agent.

When a user queries the agent in a Teams chat, the agent accesses only that specific subset of documents. If asked about something outside that subset, you can configure the agent’s system prompt so it responds with something along the lines of “I couldn’t find any relevant documents.”
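As an illustration, the agent’s instructions might include something along these lines (the wording and the library name are examples, not a prescribed template):

```
You answer questions using only the documents in the Product Guides
library. If the answer cannot be found in those documents, reply:
"I couldn't find any relevant documents."
Do not answer from general knowledge.
```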

2. Ensure the AI can read all the information you want it to access

If crucial knowledge that your team needs to access is stored in scanned PDFs, zipped files, or media files, you may need to take action to enable Copilot to access that knowledge. 

Check your file types

To enable Copilot to read and surface the knowledge contained in these files, take the following actions:

  • For scanned PDFs, enable or apply Optical Character Recognition (OCR).
  • Extract files from .zip files.
  • Add transcripts and metadata to media files.

Consider document length

The length of documents that contain crucial information may also need attention. The number of pages that Copilot can summarise varies across Microsoft 365 Copilot, Copilot Chat, and Copilot agents, and is improving over time. However, information at the start of a document is still prioritised.

A good rule of thumb is to keep documents as short as possible and to consider breaking up long documents.
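One practical way to break up a long document is to split it at its headings, so each piece is a self-contained section. The sketch below assumes markdown-style headings (lines starting with `#`); documents with a different structure would need a different split rule.

```python
def split_on_headings(text):
    """Split a document into sections at markdown-style headings
    (lines starting with '#'), so each chunk stays short enough
    to be summarised well. The heading convention is an assumption;
    adjust the test for your document format."""
    sections, current = [], []
    for line in text.splitlines():
        if line.startswith("#") and current:
            sections.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        sections.append("\n".join(current))
    return sections

if __name__ == "__main__":
    doc = "# Intro\nWelcome.\n# Setup\nSteps here.\n# FAQ\nAnswers."
    parts = split_on_headings(doc)
    print(len(parts))  # 3
```

Each resulting section keeps its heading, which also helps Copilot anchor its summaries to the right topic.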

Optimise standard document elements

There are also some everyday elements of documents that Copilot doesn’t yet use to generate responses. The most surprising is a standard element for any content professional: alt text.

To get around this, add a descriptive introductory sentence and a figure caption to images.

3. Ensure the AI can understand the information you want it to access

Copilot is remarkably good at understanding the intent, tone, and context of language. This is because natural language processing (NLP) is at the core of how it interprets and responds to queries.

It can handle natural language variation

It’s also remarkably forgiving when it comes to variation in phrasing and even in the typos that often slip in as we type. No need to pause to correct a typo when entering a query – Copilot will probably figure out what you meant.

But ambiguity is something else

What it’s not good with is ambiguity.

Clarity has always been a core quality of good content, but when content is read by humans with subject matter knowledge and lived experience, a degree of ambiguity can be manageable. And the human can always turn to a colleague for clarification when needed.

But when Copilot is used on ambiguous content, no such clarification is possible. Copilot generates its response based on the content provided. It predicts the most likely correct response based on the text as it has been written.

When faced with ambiguity in the underlying content, AI can incorrectly conflate information and misunderstand the original intent. Or it may generate a response that is too vague and generic to be helpful.

The old adage ‘rubbish in, rubbish out’ is magnified when AI is involved. Identifying ambiguity in existing knowledge assets and replacing it with clear, concise text is an important step in getting your content AI-ready.

Conclusion: Powerful tools that need some setup

Copilot and other LLMs are powerful tools for knowledge teams, but if a blanket mandate to “use AI” is implemented without first optimising the underlying content, results can be unreliable and disappointing.

By ensuring that Copilot can access only the right information, that it can read all the information you want it to access, and that it can clearly understand that information, you lay the foundations for a more successful, reliable experience for your team.