Introducing Microsoft 365 Copilot!
Microsoft 365 Copilot is more than just a few new features in your favorite office productivity applications. Microsoft Copilot is a whole new way to work.
EXP Technical recently invited Austin Hampton, Productivity Solutions Consultant VII at Pax8, to join us for a preview: Introducing Microsoft 365 Copilot.
This session included a live demonstration and a vigorous round of audience questions. Video and transcript appear below.
Kelly Paletta: It looks like we have a high degree of familiarity with artificial intelligence, which is a good starting point.
So I'm gonna jump in with a question for you, Austin. And that is a big umbrella question, but one of the things that as a Microsoft end user is kind of frustrating is that they tend to give products one big name for everything.
And I see “Copilot” everywhere.
You know, I see it associated with many different… Practically every marketing release Microsoft comes out with today has the word “Copilot” in it.
And so I'm wondering, can you explain to me and to our audience: what is Microsoft Copilot? What is Microsoft 365 Copilot? When they say “Bing Chat Enterprise is your Copilot,” what do these terms mean?
Can we start there with general definitions?
Austin Hampton: Yeah, so in true Microsoft fashion, the naming is always a little hard to wrap our heads around, but that's okay. The way that I think about it is Copilot is really at its core a brand name. It is Microsoft's brand name for their AI products.
And depending on where we are interacting with it, what the data set is and what the use case is, we're gonna have a little bit of a different name.
And so we've got Windows Copilot, which is now native to Windows 11 Pro and Windows 11 Enterprise. It's meant to use the large language model to make things like searching your internal file repository locally easier. That's what that's for.
We've got Security Copilot, which is much more on the security administrative side and analyzing your telemetry data and making actionable insights.
We've got Bing Chat, which was recently rebranded to “Microsoft Copilot.” But that is the large language model where the internet is the data set and that is the consumer version.
So anybody with an Edge browser on a PC can open it up and start interacting with Microsoft Copilot, what was "Bing Chat."
If you have a work or school account and you're signed in and you have the right licensing, you can use what was Bing Chat Enterprise, which is now "Microsoft Copilot Enterprise," which is the same large language model using the internet as the data set. The only difference is there are some additional protections around the inputs a user provides to the system and the outputs the system provides to a user.
And then we have M365 Copilot, which is the large language model that leverages the data within your tenancy, your Microsoft environment, or the data that has been actively integrated into that Microsoft environment, to be able to search, ask questions of, pull insights from, and summarize: all the things that the 77% of you who already know ChatGPT know it can do.
I'm glad we have this audience. This messaging is gonna resonate really well with you. Especially when we're not starting from a place of “What's ChatGPT???”
Kelly Paletta: Yeah! Well and they're ahead of the curve and that's great. And that's the point of this event.
You know, one of our values at EXP is "Share knowledge. Share success." And we're doing this to help, for one, our clients, but really the region. There are a lot of folks attending that are not our clients, but we want to help them get ahead of this. Because these are tools that I believe very soon we'll all be using every day.
And so it's great to see, and it's not surprising, I guess, that the people in attendance are the type that are early adopters, or at least early investigators.
And so to circle back and summarize, so it sounds to me like whenever I see Copilot associated with Microsoft, it's an umbrella term that means artificial intelligence. It's one flavor or another. It's a large language model. It's generative AI for creating images or some other artificial intelligence tool that's been integrated into a Microsoft product.
Austin Hampton: You nailed it. You nailed it.
Kelly Paletta: You know, we both laughed about this little tangent, but Bing Chat Enterprise… It frustrated me that it didn't have “Copilot” in the name because everything else had Copilot in the name. And then they'd say: “Bing Chat Enterprise—your Copilot for searching and working.”
It's like, wait!!! You're confusing me! (Laughter.)
Okay, well moving on…
So, and this is where we get to the good stuff, can you talk a little bit about how it works?
Austin Hampton: Yeah, so easiest way to talk about how the tool works is to go over a little diagram that I find incredibly helpful.
So for those of you that are familiar with ChatGPT, you will know that ChatGPT is a large language model that allows…
Kelly Paletta: I'm gonna interrupt you because we have a question in the chat that asked specifically, “What is the meaning of a ‘large language model’?”
So let's be sure that we define that too before you get further into it.
Austin Hampton: Yeah, so taking it even a step back: what is a large language model? A large language model is an AI system that has been fed, I believe ChatGPT was given 150 billion data points, and trained on human language, written human language specifically.
And what they have done with that is given this artificial intelligence the ability to understand written human language—person/place/object/thing—all of the little nuances that we really take for granted as humans that are learning to speak as infants.
So that is what a large language model is. It is an artificial intelligence system that has been trained on written language.
And then taking it a step further beyond just understanding it…
Once it can understand the written language and understand meaning, tone, inflection… it can start to do some really cool things for us.
We can ask it questions in a human-language prompt: looking for information, analyzing information, aggregating information, pulling insights, making predictions. And it gives that back to us in a human-language answer, so the layperson can interact with it without having any requisite knowledge of coding or technical skill whatsoever.
Kelly Paletta: Right, right. So if I can interrupt and summarize a little bit…
So the “large” means that it's trained on the entire internet or you know just huge huge volumes of data.
The “language” part is that it understands not only how sentences are structured, but it also works a lot like predictive text on your phone in that it knows what word tends to come after another one.
And it's so good at that that it seems like you're talking to a person. It seems like a native English speaker when you are interacting with it.
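Kelly's "predictive text" description can be sketched as a toy bigram model, which simply counts which word tends to follow another in some training text. This is purely illustrative; real large language models are vastly larger and more sophisticated, but the core "what word comes next" idea is similar:

```python
# Toy bigram next-word predictor (illustrative only; not how Copilot
# or any production LLM is actually implemented).
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept near the cat".split()

# Count, for each word, which words follow it in the training text.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict(word: str) -> str:
    # Return the word most frequently seen after `word`.
    return following[word].most_common(1)[0][0]

print(predict("the"))  # "cat" (follows "the" three times in the corpus)
```

A real model predicts over many preceding tokens, not just one, which is what makes its output read like a fluent speaker.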
Austin Hampton: Yeah, that seems like a very accurate summary to me.
Kelly Paletta: Sure.
Austin Hampton: And so now that we've defined what a large language model is: Copilot is a large language model. OpenAI really came out with the first marketable large language model solution, and for the 70-plus percent of us that have experienced it…
Kelly Paletta: …you know what we're talking about. For those of you that might not…
Kelly Paletta: It's uncanny!
Austin Hampton: It is. It's absolutely uncanny what you can do there.
And so Microsoft really has been waiting to figure out what the next direction of technology was going to be. And they got wind of OpenAI back in 2019.
They formed a partnership and they pledged a whole bunch of money.
And once OpenAI's large language model got to the point where it became a viable, usable solution, Microsoft, through their investment, was able to copy OpenAI's ChatGPT large language model code, take that code, and run it in the Microsoft data centers, which we know as their cloud, Azure. And they've created Azure OpenAI.
And so we have this large language model that's running in Azure.
And when we purchase the $30 add-on license and assign it to a user, the very first thing that happens is that all of the data within the Microsoft environment is cataloged and indexed in place.
And I want to reiterate that it is cataloged and it is indexed, but it is done so in place, in the environment, on the server it already lives on.
Which turns it from data into a functional data set.
Kelly Paletta: And I'm gonna interrupt again there. So when you say the environment, you mean SharePoint sites, OneDrive, a file server as well, the organization's computing environment. Not the outside world in the cloud, but the organization, that "ABC Company," that's buying a license for Microsoft 365 Copilot, correct?
Austin Hampton: Yes, and specifically the data that resides within their Microsoft tenant, which is just the definition of the segregation of digital space that houses your data and keeps it separate from all other Microsoft users' data.
Kelly Paletta: Right.
Austin Hampton: So everything that's in that Microsoft environment, all of the user information, SharePoint information, OneDrive information, Teams information, all of that data is just sitting there, and it's sitting in the Microsoft Graph.
And Copilot catalogs and indexes that data in place so that it becomes searchable.
It becomes queryable. It becomes a data set that can be analyzed and pulled insights out of.
And so once we've got the license assigned to a user, the data is then converted into this functional data set.
Kelly Paletta: A user can start interacting with that data through the M365 Copilot chat bot.
Austin Hampton: And so the Copilot chat bot is going to live in all of your different Microsoft applications: Word, Excel, PowerPoint, Outlook, Whiteboard, Power BI, Dynamics, Power Automate, Teams… It's basically everything.
Everything is getting a variant of Copilot injected into it if you have the license.
And then, depending on which application you are interacting with the chat bot in, your user experience, and what the end result is going to look like, will be different.
And so with Word, we have a writing coach. (Screen freezes.)
Kelly Paletta: Are you there Austin? It looks like you froze. I'll fill while he's reconnecting.
Copilot becomes your artificial intelligence assistant within each of the tools that you use.
Within Word it can summarize long documents. It can write a first draft of documents that you want.
And when Austin mentions that it "catalogs your environment," what that means is that it's trained on your SharePoint data so that it can create documents, like PowerPoint presentations or Word documents or sales proposals, based on a library of documents that you have available to you.
I'll give you an example of what we did in preparation.
And so, what we did, with Austin's help, was I gave him a transcript of all of the EXP Academy courses and asked him to input that into Word, summarize it, and then create a PowerPoint presentation that we could use… with the appropriate bullet points and the appropriate information.
So we can work from that for a live in-person presentation.
And what ended up happening was… It took about 5 minutes for us to create that. Whereas if I were doing that manually, it may take 3 or 4 hours for me to summarize a bunch of video content into a document that I can use, that I can share with another presenter, and that I can then use as a foundation for a PowerPoint presentation.
And I have a question coming in too. “Do the documents have to be saved to SharePoint in order to be used in Word or PowerPoint?”
I believe that if you're training it on your data, that is correct.
So if you're using your SharePoint or OneDrive, it needs to be in your computing environment for you to use Copilot to summarize it.
You could also copy and paste, and this is where it's a little bit different.
For those of you that use ChatGPT right now, you know that if you want to summarize a document in ChatGPT, you copy and paste it, and then you may be able to manipulate it within ChatGPT.
But one of the dangers is that it's saving that data, and it's also using that data to train for future conversations.
And one of the benefits of working within your own computing environment and within your own tenant is that your data is not exposed. So there are not the same security concerns.
I'm gonna pause for a second because I want to be sure that we're able to get Austin connected back in here. Bear with me for a second.
There he is. Austin, we lost you for about 3 minutes. Can you hear us?
Austin Hampton: I can, so sorry about that everybody.
Kelly Paletta: So let's, let's jump to the illustration. So let's, I think what people wanted to see is how this works in real life.
And I described the situation, one of the examples, where I gave you a transcript of the content at EXP Academy and we were able to very quickly create a PowerPoint presentation.
Can we use it? I mean, I know you had some case studies available. Is that a good one?
Austin Hampton: Perfect. Alright, so yeah, in preparation for our presentation today, Kelly went ahead and sent me over a transcript for their Security Awareness Training course.
And as we can see, we've got a good table of contents, 16 pages, and this is pretty dense.
Kelly Paletta: Right, it's a talking-head video transcript, so it's a lot of words.
Austin Hampton: It's a lot of words. We got a little bit of formatting, but a lot of words.
And so what we were able to do with Copilot is ask Copilot to take this transcript and generate a summary. And so we went from 16 pages to 7 pages, still a little on the dense side…
But we asked it to be a little bit more literal and stay true to the transcript.
The cool thing about Copilot is when you're having it generate information for you, especially if you're having it create written content, you've always got this Regenerate button. And you can also give it feedback as to why you would like it regenerated, or how you would like it to be different.
If you don't provide that, Microsoft will just guess. And it's actually a little fun sometimes just to sit there and hit that regenerate and see what comes out.
And so once we've made this 7-page summary—
We were able to open up PowerPoint and ask Copilot, using my file path from SharePoint, to create a presentation from this file.
We went from a 16-page written transcript to a 20-slide deck where it gives us the agenda, the introduction, module one. We don't have to go through every single slide, but—
When Kelly and I were talking earlier today we were running a thought experiment:
If we went through this on our own… to go from a 16-page transcript to a detailed summary that is shareable, I think we landed on 90 minutes to maybe two hours.
Kelly Paletta: Yeah, depending on the familiarity with the content.
Austin Hampton: And then to go from this summary document to anything in a slide deck, what did we land on there?
Kelly Paletta: Oh, yeah, at least another 90. Well, probably more… closer to 2 hours to create 20 slides and to obsess over fonts and titles and all of those things that you get sucked into—
Austin Hampton: And finding this digital Trojan horse image, resizing it appropriately, setting the gradient of opacity so that it flows. This slide alone could be 15-20 minutes for a skilled PowerPoint user.
And so this took us, even with a couple of iterations, at most 10 minutes.
Kelly Paletta: Yeah. Closer to 5, probably. Yeah, it took practically no time.
And of course, this isn't the end result. I'm gonna have to get in and edit. I'm gonna have to customize so that it fits more of the tone that I want to present to our audience, but this got me 90% of the way there in minutes instead of spending hours working on this.
Austin Hampton: Something else we should call out: we can ask it to reorganize it. We can ask for a summary. We can go from raw data to summarized data to presentation, back to summarized data, or reorganize it. All within minutes.
Kelly Paletta: And the important thing for me as a sales rep is: it gives me time to focus on sales-related activity rather than administrative activity, which is where the real value is as well.
And you know, there was a question that came in that speaks to this and I'm not sure if I answered it correctly, but the question came in that is, “Do documents have to be saved in SharePoint in order to be used in Word or PowerPoint?”
Austin Hampton: The source material does need to live in the cloud, whether that is your personal OneDrive or the company's organizational SharePoint architecture.
And in order for you to pull the information and use the data, you need to have access to it. So that's a really good thing to just kind of call out here.
Copilot will pull whatever information you ask of it, provided you have access.
Kelly Paletta: Right. And there are 2 questions. You're one step ahead of the chat. What sort of access controls need to be in place? How can you protect sensitive data?
Austin Hampton: You 100% can protect the sensitive data and really lock down the information within your environment to limit the exposure that Copilot will give to other users.
I don’t want you to think about it as if Copilot is this omniscient being that we’re trying to hide information from. It’s really not.
It’s more just an agent that will catalog and index the data--and provided you have access—it will return it.
It doesn't actually do anything with your data, though, from session to session, prompt to prompt. It just sits there in a functional data set, and it is only pulled into Copilot when a user prompts for it.
So I would think about it less as “How do I protect my data so it won’t be a part of the Copilot pool that people can access?” and think about it more like “Who should be able to access that HR data? That PII?”
I'm sure there are some people whose job it is to have access to that data. Copilot for them, since they are authorized users granted access to that sensitive information, could be a tool to help them be better at their jobs.
The question here is, "How do I make it so my HR people can see the HR data and nobody else can?"
It's a change in thought process: "How do I make sure that these specific users only have access to what they need, and explicitly nothing else?"
And that's how we protect the data in an AI world.
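The least-privilege idea Austin describes can be sketched in a few lines of Python. This is a conceptual illustration, not Microsoft's implementation: every name here (Document, User, can_access, search) is invented for the example. The point is that the index contains everything, but results are trimmed to what the prompting user could already open:

```python
# Conceptual sketch of "security-trimmed" search: Copilot-style tools
# only surface documents the prompting user already has access to.
# All classes and functions here are illustrative, not a real API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Document:
    title: str
    content: str
    allowed_groups: frozenset  # groups permitted to read this document

@dataclass
class User:
    name: str
    groups: set

def can_access(user: User, doc: Document) -> bool:
    # Least privilege: access requires explicit group membership.
    return bool(user.groups & doc.allowed_groups)

def search(index: list, user: User, query: str) -> list:
    # The index holds everything, but results are trimmed per user.
    return [d.title for d in index
            if can_access(user, d) and query.lower() in d.content.lower()]

index = [
    Document("Salary bands", "confidential salary data", frozenset({"HR"})),
    Document("Holiday calendar", "salary review dates and holidays",
             frozenset({"AllStaff"})),
]

hr_user = User("Dana", {"HR", "AllStaff"})
sales_user = User("Kelly", {"Sales", "AllStaff"})

print(search(index, hr_user, "salary"))     # both documents
print(search(index, sales_user, "salary"))  # only the calendar
```

The design question is exactly the one Austin raises: the protection lives in the group memberships, not in hiding data from the AI itself.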
Kelly Paletta: That's, frankly, the same concern that businesses have already, even before Copilot. We implement certain controls around sensitive data. We follow a principle of least privilege so that people have access to only the information that they need access to. There are lots of tools on the back end that EXP Technical can help you out with to ensure that this is the case.
The one thing that is a risk, though, is that, provided they have access to it, they can query this information and pull it into a document they are working on.
One thing to be concerned about is that when you give people access to this tool, it's a much more powerful way for them to root around on the network. It's very important that you do an audit beforehand to ensure that the appropriate controls are in place, and that people don't already have inappropriate access. Because they can say, "Hey, what about this sensitive HR data?" and Copilot, because they already had this access but didn't know how to use it, may actually surface it for them.
And again, EXP can help you with that to be sure that the appropriate controls are in place and the appropriate rights, permissions, and privileges are assigned to individual users.
Another security question, and then I want to jump back, because I think you have a couple of other illustrations, and I want to show people more than just PowerPoint; that's just kind of a sidetrack.
Another question was, "Are you able to see the types of queries your employees are making to Copilot?" What sort of audit controls are there within Copilot, or are there any right now?
Austin Hampton: So this is a good question, and the short answer is: you can't. There are no audit controls. There is no visibility into how your users are using Copilot, at least not today.
And that—while it seems like it might be concerning—is actually a massive benefit.
The way that co-pilot works is the data is cataloged and indexed in place so it's functional.
The user asks for the data. “Find me this.” “What is this?”
Copilot processes that human language into math via a process called grounding. It uses the semantic index to find the actual information they're looking for.
It creates an alphanumeric representation of that data, not the actual data itself.
So: "Analyze this Excel file and tell me the financial trends of my top 5 companies over the last 5 years."
Provided all that information is in that Excel file and the user has access to it, the alphanumeric representations of the individual data points will be encrypted and sent to the large language model. But again, the actual file, the actual data, does not leave.
The large language model reorganizes that data and sends it back. There's post-processing, and then it goes back to the user in pretty much real time.
Once that prompt is completed, Microsoft wipes the memory. Deletes everything. It does not save any history of the search; a cache of the information doesn't exist. It doesn't use that information to train additional searches later on.
It is like it never happened. It is a one-time blip in the lifespan of Copilot in the environment, and from prompt to prompt, Microsoft has no memory of any of the previous prompts.
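The stateless, per-prompt lifecycle Austin walks through (ground the prompt against an in-place index, send only a representation to the model, return the answer, wipe everything) can be sketched conceptually. This is not Microsoft's code; the index contents, function names, and the hash standing in for the encrypted representation are all invented for illustration:

```python
# Conceptual sketch of a stateless prompt lifecycle: nothing is
# cached between prompts, and only a representation of the data,
# never the raw file, crosses the boundary. Purely illustrative.
import hashlib

# Stand-in for data indexed "in place" in the tenant (hypothetical).
INDEX = {"q3 revenue": "Q3 revenue grew 12% year over year."}

def ground(prompt: str) -> str:
    # "Grounding": map the natural-language prompt to indexed data.
    for key, value in INDEX.items():
        if key in prompt.lower():
            return value
    return ""

def to_representation(data: str) -> str:
    # Stand-in for the encrypted alphanumeric representation; a hash
    # here, just to show the raw data itself is not what gets sent.
    return hashlib.sha256(data.encode()).hexdigest()

def handle_prompt(prompt: str) -> str:
    grounding = ground(prompt)
    representation = to_representation(grounding)  # what would be "sent"
    answer = f"Based on your data: {grounding}" if grounding else "No match."
    # Wipe per-prompt state: no history, no cache, nothing reused.
    del grounding, representation
    return answer

print(handle_prompt("Summarize Q3 revenue"))
```

Because `handle_prompt` keeps no state between calls, there is nothing to audit afterward, which is exactly the double-edged sword discussed next.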
Which makes auditing very hard. But the benefit of it is: no one can access it.
If you could access a list of the queries your users have been making, to see what they've been using Copilot to do, that means somebody else theoretically could too.
Whether they broke into your environment through a breach, or whether they broke in through a Microsoft back door: if the information were saved and auditable, it could be taken and misused against us. Which is just not possible.
So while you can't audit it, you can't know how and what they're using it for, no one else can either.
It's kind of a double-edged sword there. While it might be annoying on the administrative and management side, on the security and compliance side it is a godsend.
Kelly Paletta: Right, that's a key distinction from ChatGPT. Right now I can go into ChatGPT and see my entire history of every query I've ever— every prompt that I've given ChatGPT, and it exists somewhere outside of my control. And not only can ChatGPT be training on my conversations, but theoretically, because it exists somewhere outside of my control, someone else could potentially get access to that.
And to summarize the current round of security questions: if somebody has inappropriate access now in your computing environment, it's a problem now, and if they have Copilot, they will also have inappropriate access in Copilot.
So the important thing is to ensure that the appropriate layers of control are in place to protect sensitive data.
Can we see a demo?
Austin Hampton: Yeah, let's go ahead and jump into the live demo. So as I mentioned, Microsoft 365 Copilot is going to live in all of your applications. One application that we tend to forget about that is a Microsoft application is the Edge browser. It's one of the applications that's included with your licensing. It's also free. But the Edge browser is going to be the most global-reaching version of M365 Copilot.
Meaning that when I'm in Word, Excel, PowerPoint, Outlook.
I'm a little bit more limited in my use case and what I can do with it, while M365 Copilot accessed from the Edge browser is going to be able to go anywhere in my environment that I have access to.
So it's like it has the same permissions that I do. So if I have access to a data set, I can query it.
And so if you guys wanna follow along, go ahead and open up the Edge browser, and you're gonna click this little Copilot logo in the upper right. If you didn't know what the Copilot logo is, there it is.
And so when you click that little logo, unless you also have an early access license like myself, you're not gonna see this M365 Chat.
This right here is M365 Copilot. Bing Chat is the Bing Chat/Bing Chat Enterprise version of the Copilot large language model that uses the internet as the data set.
And so with Bing Chat Enterprise, we know it's Enterprise because we see this green shield with the status "Protected: your personal and company data are protected in this chat."
With the consumer version, we just have an Edge browser; you might be on a home or personal laptop. We're not going to see that "Protected" notice. It doesn't mean that Microsoft is necessarily misusing our data on the consumer side, but on the enterprise side, paid for and provided with a license…
Kelly Paletta: It's explicitly stated.
Austin Hampton: And similarly to M365 Copilot, where once the input is given, the information is found, and the output is returned to you, the data in memory is wiped. Same thing with Bing Chat Enterprise.
And so while you can, in a single session, i.e. you open up a fresh Edge browser and you click in, see a list of your history in that single session, the moment you close this X and relaunch, it's back to scratch. It's deleted, it's not cached, it's not saved, meaning it cannot be taken, stolen, or used against you in a court of law.
Kelly Paletta: Or by the bad guys. Yeah.
Austin Hampton: Or by the bad guys.
And so when we're in Bing Chat, we have the ability to have it help us write, compare, analyze.
We can have it choose a conversation style. Do we want it to be precise? Do we want it to be creative, or somewhere in the middle?
And additionally, we have the ability to be in the chat versus the compose.
Write me something about…
Write me something in a professional, casual, enthusiastic, informational, or funny tone. What is the format, slash, where am I utilizing this content? What is the length? And generate the draft. Whereas the chat side is gonna be more of: I'm looking for information that is not available within my tenancy.
So I do always like to do the Bing Chat demo on demand. So what should we ask it, Kelly?
What should we ask it for?
Kelly Paletta: Oh gosh, now you're putting me on the spot. Let's ask, can you summarize best practices regarding data backup and disaster recovery?
Austin Hampton: And so right now it's searching the known internet for best practices for data backup and disaster recovery and generating our response.
One thing that I do wanna call out, that I think is huge, that Microsoft does with Copilot and Bing Chat, is they very elegantly addressed the hallucination problem.
Quick, really quick. The hallucination problem is when a generative AI model generates false information that is then used to support an analysis or an argument or an opinion that it formed.
We've seen ChatGPT do this. We've seen the AI solution that was built for legal firms to read through legal precedents and analyze and help.
Where it just created fake cases, and those fake cases were referenced in real cases, and the AI was virtually disbarred, and there was a lot of punitive action taken against that law firm.
So, Microsoft addressed this with the most elegant solution possible. “What if we just cited references?”
So take each one of these statements: "develop a comprehensive backup plan; this involves periodically creating or updating one or more copies…" Notice that when I hover over it, it's hyperlinked. And so I can see that this is an article from IBM, a relatively reliable source of information around cloud and computing. And if I don't think that this is exactly right, or maybe it was a misaligned analysis, I have the opportunity to go look at the source material.
OpenAI and ChatGPT just don't do that, at least not yet. And so because of that, when we query ChatGPT or any other generative large language model, we don't know how reputable the reference material it found to give us the answer is.
And as we know, the internet is an amazing, amazing place. Well, it's full of a ton of great information and a ton of bad information.
And ChatGPT doesn't know which one is better. Copilot doesn't really either, but at least with Copilot, they are giving you the sources so you can make that judgment call yourself.
So that's Bing Chat and then with M365 Copilot…
Kelly Paletta: Here comes the good stuff, right? So this is the step that people don't have access to right now.
Austin Hampton: Exactly. And so while Bing Chat uses the internet as our data set, Copilot uses the data within my Microsoft environment that I have access to as the data set.
I have the ability to see what's new from different people in my organization to get caught up on Teams messages or emails if I've been out of office for a few days.
I can quickly come in here, select a person and get key updates. I can ask for key info from different files that I have in OneDrive.
I can ask it to draft an FAQ based off of a document.
I'll use this one as an example, since we already showed the Security Awareness Training course summary.
Kelly Paletta: Wow!
Austin Hampton: While Bing Chat and Bing Chat Enterprise give us access to information that we don't have, Microsoft Copilot gives us access to information that we already have, and it helps us analyze it.
It helps us aggregate it, helps us make better decisions faster.
Kelly Paletta: Right.
I'm gonna address a question that came up, I think when you were demonstrating Bing Chat Enterprise, and the question was, "Is this essentially the same as ChatGPT, except it doesn't save prompts?"
Bing Chat Enterprise kind of is. I mean, that's the protection. One is that it doesn't save your prompts. The other is that it gives you citation information. But Microsoft 365 Copilot uses that same artificial intelligence assistant, trained on your data, and that's a significant difference from ChatGPT.
And it's secured the same way that your computing environment is secured.
So for example, for me as a sales rep, I often get requests for proposals, and they are extremely laborious to complete.
They all seem to ask the same 40 questions 40 different ways. If I could train an AI assistant to look at all the proposals that I've written before and answer each of those questions, either individually or collectively, based on how I've answered them in the past, it would save me a tremendous amount of time.
And it would not be using just general information that's out on the web; it would be using information that is specific to my organization, that lives within my tenant, that is otherwise protected and secure and of concern to my organization.
Kind of speaking over you there, Austin, but…
Austin Hampton: Oh no, you're all good. You made me just think of a really good thing I wanna demo here.
Kelly Paletta: And the last question: Dave asked, "Can we just talk about the capabilities of Copilot instead of trying to compare it to ChatGPT?"
Sure. But at the root it is ChatGPT, because Microsoft has invested in OpenAI and integrated that tool into their products, but in a way that's protected and trained on your data.
But anyway, go ahead with what you were going to say.
Austin Hampton: Oh, no, that was it. I was gonna say that it's fair to compare them, but if we were to peel back all the layers and just look at the ones and zeros, they are identical pieces of software.
Microsoft gave OpenAI 13 billion dollars, and OpenAI let them copy their code.
So they are the same thing, just with different parameters around the technology that make them different products. Also, Microsoft Copilot and everything else Microsoft runs lives in Microsoft's data centers, while OpenAI does all of its compute in its own data centers.
Getting back to that demo, Kelly…
I don't know if anybody on the call has ever found themselves sending the exact same email 30, 40, 50 times. You really have two options there. You can hard-key it out each time, or you can write a template that you can repurpose and send.
With Copilot, you can ask it to write a template based on emails that you've already sent, written in your voice.
So finishing up this little prompt: "Find emails I have sent in response to Copilot not being available yet and what users can do now, and create an email template for me to use, written in my voice."
'Cause isn't the worst thing about automatic, canned, AI-generated emails and templates that you can just tell? You can just tell that this is a canned thing, this is a marketing thing. The person that I work with every day, this isn't how he writes or how he speaks.
This is the uncanny valley thing, right? We can tell that something isn't quite right. We can't put our finger on it, but it's not quite right.
And so this is an example that I brought up specifically because Microsoft announced that Copilot was gonna be available November 11th. They did not include the fine print, which was: on November 11th it became generally available to enterprise customers on an active enterprise agreement willing to purchase a minimum of 300 seats and ride those out until the end of their commitment term.
I knew that on November 11th I was going to get slammed!
I knew that my inbox was just going to be: "Where's Copilot? How do I get it? Why can't I see it? How do I use it? Microsoft says it's available!"
And so I had Copilot write me a template and I made some minor changes to it. I was able to just respond to like, 47 emails almost instantaneously as opposed to sitting down and putting together a thoughtful response for each email saying almost the exact same thing.
Kelly Paletta: And it's much more sophisticated than just a mail merge. It's generating content without you having to do it.
Austin Hampton: Exactly. And the cool thing about it is that it used the content I had already written by hand to generate that new content. And that's really the whole thing about what large language models are: leveraging existing content, existing words, existing statements, existing data.
Aggregating it, analyzing it, and creating something new from it.
Kelly Paletta: So here's another example then or another illustration. Suppose I'm somebody that attends a lot of Teams meetings.
And you know, they all seem to run together and I can't remember what was discussed. Can you talk a little bit about the integration of Copilot within Teams?
Austin Hampton: Most certainly!
With Copilot within Teams, we have the ability to record and transcribe a meeting to get automatically generated AI notes, and the ability to come back and ask questions.
This was just a little example call that Kelly and I got on a little bit earlier today.
So we could show the example here.
Recap the meeting.
Evaluate proposition for M365 Copilot
Next steps for the demo.
It was a short one, but if you're on a very long meeting with a lot of different people talking, and you're struggling to take notes, stay on top of things, and stay engaged, you can just let Copilot run the transcription, get the automatic meeting notes generated for you, and then later come back and ask questions.
Kelly Paletta: And it looks like it's trained to look for accountability, right? I mean, it says, “Next steps.”
It looks like it's intelligent enough to know that when somebody says, “Hey, I'll do this…” it captures that and highlights that in a summary of the meeting. This speaks to accountability within your organization and gives you a third-party source of truth for those sorts of things so you can keep track and stay on top of timelines.
Austin Hampton: Perfect. Yeah, it looks like we didn't run it as long as we could have to make that a little bit richer. This is our transcript… and also here's the one that we just generated a little bit earlier.
Kelly Paletta: Right. This was just that you had it running when you were generating the PowerPoint presentation for me.
Oh, let's see what it says, if it'll recognize it.
So I was enthusiastic. And I'll tell you why I'm enthusiastic.
I think, frankly, there are problems, so I'm pessimistic too, and I'm not generally an enthusiastic person.
So there are definitely security concerns. You do want to be sure that you have the appropriate controls in place before you grant your end users access to Copilot.
That's one concern.
Another is it doesn't work perfectly yet, but this is version one.
And what I get enthusiastic about is being able to delegate all of this grunt work. Because in sales, a lot of work is repetitive and a lot of it is administrative.
I look forward to the day where I can delegate a lot of that work to an AI assistant, because I don't have an assistant and can't afford one. What gets me excited is maybe not the first version of Copilot, but what it will be when I have access, or three or five years from now.
I dream of a day where I can throw a bunch of receipts at an AI assistant and it submits my expense reports and we're not quite there but you know, that may be in the not too distant future.
Austin Hampton: Most certainly. Just to reiterate that… This is the dumbest version it's ever going to be.
Kelly Paletta: Yep.
Austin Hampton: It will never get worse than this. It will only get better. It will only iterate.
Kelly Paletta: Yeah. That's hardly a resounding endorsement, but I hear you. I'm saying the same thing.
Austin Hampton: It's a system that's going to learn, it's going to evolve, it's going to get better.
And so I'm personally incredibly excited about it. Thinking about this like very long term…
I am of the generation that experienced a radical amount of change around technology in my short lifetime so far. I'm 31 years old. I remember cassette tapes, VHS, DVD, HD DVD, Blu-ray, and now streaming.
Pre-internet and post-internet. Pre-smartphone and post-smartphone. All the generations that come after me, Gen Z, don't know a world without a smartphone. My children, if I'm lucky enough to have them, won't know a world without AI. And so this is just the very, very beginning of a long journey.
Interacting with, integrating with, and leveraging AI to be the tool that it's supposed to be, which is helping people make better decisions faster.
And I did wanna get to one question: "Microsoft is notorious for having open access and then leaving it to the admins or users to lock things down and protect it. Are they following a similar model here?"
You are correct. And this is actually by design.
Microsoft has what is called the model of shared responsibility, meaning that “We are going to be responsible for making sure that the infrastructure that houses your data is always up. We are going to be responsible for making sure that your applications always work. The technology itself. We are responsible for creating it, maintaining it, updating it, and keeping it going.”
The data that you put into the system is your organization's responsibility. Is it secure? Is it safe? Is your environment locked down? How are we mitigating these risks? Because Microsoft can't make one solution that is perfect for everybody. That's impossible. Every customer, every organization, every business is different in enough nuanced ways that it would be impossible to have a one-size-fits-all.
And so they make things configurable, and they are configured to meet your needs. While it does seem frustrating that Microsoft creates all these things and then puts the responsibility on us to keep our data and ourselves secure, it's actually a good thing. Because again, they are responsible for making sure that your data is accessible and that nobody is stealing it from the Microsoft side. Nobody's walking into a data center, finding your server rack, and downloading all of your data onto a USB drive.
They have a ton of security in place to prevent that. No one's hacking into Microsoft servers and stealing your source code and your data.
The way that attackers get in is they get your username and your password. And if you don't have MFA, they're in. They're off to the races.
If they can bypass your MFA, if they can send you a link that you click, or if a malicious piece of code ends up on your machine, that's how people get in. They're not breaking in through Microsoft.
And so, taking that to Copilot: because we know that Microsoft isn't saving your information, storing your information, or training the model on it, there's only one way for your information to be exposed externally in a bad way.
And that is somebody breaking in and stealing it from the access-point side.
Kelly Paletta: So we're down to two minutes, so I'm gonna speed through the last couple of things here.
One is getting access to this tool. We've kind of mentioned it in passing, but it is currently only available to enterprise users. Do you know when it will be available to others?
Austin Hampton: Not yet. We are hopeful that it will be early 2024, ideally Q1. But as soon as I know, everybody that I work with will be getting phone calls, emails, and blasts. It'll be Christmas morning for everybody.
Kelly Paletta: But, Bing Chat Enterprise is available today if you have Microsoft 365 Business Standard, Business Premium, E3, or E5.
Austin Hampton: Yep.
Kelly Paletta: So, one thing that people attending can do if they want to be prepared for Copilot in the future is start interacting with Bing Chat Enterprise. It gives you a protected version of an AI assistant that you can work with.
Another question was, 'What does it cost?'
It's $30 per user.
Do I have to roll it out to the entire environment? You do not. You can give this tool only to select individuals within your organization, which is another control that you can place on sensitive data.
We're nearly out of time. Austin, do you have any last words? I'll let you take it.
Austin Hampton: Yeah, I also wanna try to rapid fire some of these questions.
It is available only via the web; only data that's stored in the cloud can be used by it.
Will it be available for small businesses? We don't know exactly when. It will be available for Microsoft 365 Business Standard subscribers. However, due to the need to lock down identities and information to a greater extent, you're probably not going to want to use Business Standard; you're going to look at Business Premium, E3, or E5, because those provide the security and compliance features that allow for safe use of it.
I see a question in here about trying to buy it through the purchase services.
It's just not available through any purchasing channel at this time. The demo that I'm showing you is an early access license, because I work for Pax8, which is a global distributor for Microsoft.
They gave us a few licenses for purposes like this, to be able to make people aware and to educate.
Kelly Paletta: Thank you, everyone, for attending. Next up, in early 2024, we're in discussions with Eva Benn, a security specialist at Microsoft whose focus is security controls around artificial intelligence. So that should be a fascinating conversation.
We've also invited Eric Huffman for a presentation in February about the cyber psychology of artificial intelligence. So that should be fascinating too. That will be on leap day.
And I will end with just a mention of next steps. These are some things that you can do right now. Again, as we mentioned a minute ago, use Bing Chat Enterprise; it's a way for you to get familiar with these tools.
You could consider Teams Premium now if you want that ability to summarize Teams conversations, Teams meetings.
Connect with EXP Technical if you have questions about security, because as has been highlighted in this meeting, you need to be sure that people have appropriate access, and not inappropriate access, before you give them this tool. It can be powerful, and they might inadvertently use it the wrong way.
Austin, thank you so much for joining us for this preview. I really appreciate you taking the time. And for everyone in attendance, I appreciate you taking the time too.
We're a couple of minutes over. I think I will end there. Austin, any final words, one last goodbye, or anything else to say before we close?
Austin Hampton: I just appreciate everybody's attendance and participation. This is the most questions I've ever seen in a Q&A; it was almost impossible to keep up with you guys. We might want to do another one of these in the new year that's more of a straight Q&A with less formal content, just so that we can give each question the time to answer it as thoroughly as possible. But thank you guys for having me. I'm excited to be a part of this with you.
Kelly Paletta: Right. And I love that idea. In fact, I will make it happen. That sounds like a great idea, something a little more interactive and direct in the future. So on that, I will thank everyone for attending and, Austin, once again thank you for your time, and we'll call this meeting closed.