Loading Knowledge into Copilot AI Agents with Power Automate

Introduction

Sometimes, the best ideas emerge when you’re not looking for them. What began as idle curiosity about offline internet led me to an unexpected solution – one that transformed how I provide knowledge to Copilot AI. Along the way, I unearthed some fascinating tools, unexpected connections, and a surprisingly elegant way to automate a manual process.

I went on a bit of a journey here, so bear with me…

Microsoft Copilots

When setting up a Copilot AI agent, there are several options for feeding it data to generate informed responses. These options include connecting to Dataverse, public websites, SharePoint sites, or uploading files directly as knowledge artifacts. Here’s a great article from my colleague, Siva Saripilli, that explains the process in more detail.

Knowledge Files

Adding files manually as knowledge artifacts in Copilot Studio typically involves a drag-and-drop or file browsing step — at least, that’s what I initially thought. However, a YouTube video on electromagnetic pulses (EMPs) from solar storms unexpectedly led me to discover a way to automate this process.

Copilot Studio – File Upload Screen

Offline Knowledge

Watching that YouTube video about EMP risks made me think about the vulnerabilities in our interconnected systems. If online access were disrupted, how would we maintain access to essential information? This led me to consider “offline internet” options, as having access to key knowledge without a live connection would be valuable.

Kiwix Browser

My search for offline internet led me to Kiwix, an open-source offline browser that allows users to download entire wiki websites for access without an internet connection.

I downloaded the Kiwix application and set up a local copy of www.ifixit.com — a goldmine of repair guides and tutorials. Now, with or without the internet, I’d still be able to search the wiki and get my hands dirty with repairs.

Kiwix Browser with list of wikis for download
Browser running the offline iFixit wiki

The ZIM File Format

My curiosity about how Kiwix stores entire wiki websites led me to explore the ZIM file format, an open standard used for this purpose. ZIM files store text, images, and other web data in a highly compressed, efficient form. With this knowledge, I wondered whether I could work with ZIM files in .NET, so I wrote a simple console app that reads and extracts each HTML file from the www.ifixit.com ZIM archive. The task turned out to be easy, thanks to a ZIM file reader package I found on NuGet.org.

.NET code for ZIM file data extraction
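If you’d like a feel for the format itself without pulling in a package, here’s a short, self-contained sketch (in Python rather than .NET, purely for illustration) that parses the fixed 80-byte ZIM header defined in the openZIM specification. The field layout comes from that spec; the header bytes are synthesised in memory for the demo rather than read from a real archive.

```python
import struct
import uuid

# ZIM header layout (openZIM spec, header major version 5/6), little-endian:
ZIM_HEADER = struct.Struct(
    "<I"    # magic number (72173914, i.e. 0x044D495A)
    "HH"    # major version, minor version
    "16s"   # archive UUID
    "II"    # entry count, cluster count
    "QQQQ"  # URL / title / cluster pointer positions, MIME list position
    "II"    # main page, layout page
    "Q"     # checksum position
)
ZIM_MAGIC = 72_173_914

def parse_zim_header(data: bytes) -> dict:
    """Unpack the first 80 bytes of a ZIM archive into a dict of key fields."""
    (magic, major, minor, raw_uuid, entries, clusters,
     _url_ptr, _title_ptr, _cluster_ptr, _mime_ptr,
     _main_page, _layout_page, _checksum_pos) = ZIM_HEADER.unpack(
        data[:ZIM_HEADER.size])
    if magic != ZIM_MAGIC:
        raise ValueError("not a ZIM file")
    return {
        "version": (major, minor),
        "uuid": uuid.UUID(bytes=raw_uuid),
        "entry_count": entries,
        "cluster_count": clusters,
    }

# Demo: build a synthetic header in memory instead of shipping a real archive.
demo = ZIM_HEADER.pack(ZIM_MAGIC, 6, 1, uuid.uuid4().bytes,
                       1234, 56, 80, 0, 0, 0, 0, 0, 0)
print(parse_zim_header(demo))  # shows entry_count 1234, cluster_count 56
```

A real reader would follow the pointer positions in the header to walk the URL index and decompress clusters — exactly the heavy lifting the NuGet package handled for me.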

ZIM File Data & The Dataverse

Knowing I could extract all this wiki data sparked an idea: what if I could add these HTML files to a Microsoft Copilot AI agent, allowing it to analyse and draw from this vast knowledge base? However, going back to my initial description of Copilot agents, adding individual files is a manual and time-consuming process. I needed to understand how Copilot stores knowledge file artifacts when you upload them manually. So obviously I asked Copilot — answer, Dataverse of course! 😊

With a little more digging around, I discovered the specific Dataverse table being used is called “Copilot component” — see screenshot below. It was time to write an automated upload process!

Enter Power Automate

I set up a Power Automate flow that triggers when my .NET console app creates output in a designated OneDrive folder. When the flow detects new files, it uploads the content into Dataverse, making it available for my Copilot agent to analyse. The agent can then use this information to answer user questions based on the content within the www.ifixit.com knowledge base.

Here’s an overview of the Power Automate flow — after obtaining the file content and establishing some metadata (name, schema, botid), you need to create a new row in the “Copilot component” table and then attach the file content to that row.
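Under the hood, those two flow actions map onto a pair of Dataverse Web API calls: a POST that creates the “Copilot component” row (logical table name `botcomponent`), then a PATCH that streams the file content into that row’s file column. The sketch below builds the two requests as plain dicts so you can inspect their shape — nothing is sent. The environment URL is a placeholder, and the file column name (`filedata`) and lookup name (`parentbotid`) are assumptions to verify in your own environment.

```python
import uuid

ORG_URL = "https://yourorg.crm.dynamics.com"  # placeholder environment URL
API = f"{ORG_URL}/api/data/v9.2"

def build_upload_steps(file_name: str, html: bytes, bot_id: str) -> list:
    """Mirror the two flow actions: create the Copilot component row,
    then attach the file content to it. Returns request descriptions
    as dicts (nothing is sent) so the shape of each call is visible."""
    row_id = str(uuid.uuid4())
    create_row = {
        "method": "POST",
        "url": f"{API}/botcomponents",
        "body": {
            "botcomponentid": row_id,
            "name": file_name,
            # SchemaName must be unique per file
            "schemaname": f"new_{uuid.uuid4().hex}",
            # 'parentbotid' lookup name is assumed -- check your environment
            "parentbotid@odata.bind": f"/bots({bot_id})",
        },
    }
    attach_file = {
        "method": "PATCH",
        # 'filedata' file-column name is assumed -- check your environment
        "url": f"{API}/botcomponents({row_id})/filedata",
        "headers": {"Content-Type": "application/octet-stream",
                    "x-ms-file-name": file_name},
        "content_length": len(html),
    }
    return [create_row, attach_file]
```

The Dataverse connector in Power Automate issues equivalent calls for you; the sketch just makes the row-then-file ordering explicit.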

The two important takeaways from this flow —

  1. You must first determine your Copilot AI agent’s GUID from the Dataverse
    • Check the “Copilot” table
  2. You must use a unique “SchemaName” for each file added
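For the second takeaway, deriving the SchemaName from the file name (plus a short hash for uniqueness) keeps it deterministic, so re-running the flow on the same file doesn’t mint a new name each time. A small illustrative helper — the `new_` publisher prefix and the naming scheme are my own choices, not a Copilot Studio requirement:

```python
import hashlib
import re

def make_schema_name(file_name: str, prefix: str = "new") -> str:
    """Build a deterministic, unique SchemaName for a knowledge file:
    publisher prefix + sanitised file stem + short hash of the full name."""
    stem = re.sub(r"[^A-Za-z0-9]", "", file_name.rsplit(".", 1)[0])[:30]
    digest = hashlib.sha1(file_name.encode()).hexdigest()[:8]
    return f"{prefix}_{stem}{digest}"

print(make_schema_name("Makita Chainsaw Repair.html"))
```

The agent GUID itself comes from a one-off query against the “Copilot” table (Dataverse entity set `bots`), filtering on your agent’s display name.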

Results

Below are some screenshots of the process in action.

1. You can see here that as I run the console application, the HTML files are written to my OneDrive folder.

2. This then kicks off my Power Automate flow called “SaveFile”, which creates the new knowledge files in Dataverse.

3. The instant those files are added to the Dataverse, my Copilot AI agent automatically refreshes, and the analysis begins. I enjoyed seeing this automatic processing action start up in real time 👍

4. It took a minute or two for an individual file to be processed.

5. I could now start asking my Copilot AI agent about how to repair my recently broken Makita Chainsaw!

License Costs

It’s worth mentioning that there are some licensing considerations with this approach. A Power Automate Premium license is needed for the flow to use the Dataverse connector, and a Copilot Studio (tenant) subscription is required to create and publish your own standalone copilot.

Conclusion

For organisations managing internal Copilot AI agents, this setup offers a valuable solution. While public domain wikis like those on Kiwix are already accessible to common AI systems, the ability to seamlessly integrate private documents is game-changing for internal use — especially if you don’t want to use SharePoint.

With Power Automate handling uploads and integrating files into Dataverse, thousands of documents can be processed quickly and efficiently, ensuring that the AI’s knowledge base grows automatically with each new file — without manual intervention.
