Introduction

Virtual meeting assistants are Artificial Intelligence (AI)-powered tools that may be invited to or automatically join online meetings. They are designed to enhance productivity during online meetings by automating tasks such as note-taking and transcription and by generating actionable insights.

Categories of Virtual Meeting Assistants

There are two categories of these AI tools:

  1. Embedded Assistants: This category encompasses AI features that are built into and operate within University-managed collaboration platforms (e.g., Copilot in Microsoft Teams, Zoom AI Companion). These assistants function as part of the collaboration platform itself, rather than as separate tools.
  2. Stand-Alone Third-Party Assistants: This category includes separate, third-party AI tools (e.g., Otter.ai, Read.ai, tl;dv, Fathom.ai) that can connect to or join virtual meetings across multiple collaboration platforms.

Risks Associated with Stand-Alone Third-Party Assistants

While virtual meeting assistants can be useful, they often rely on large datasets and may use or retain information in ways that are neither transparent nor predictable. Using stand-alone third-party assistants may introduce significant risks to the University, including:

Data Exposure and Unauthorized Data Sharing:

  • Meeting content, including confidential information or data shared or discussed during a meeting, may be accessed and stored by stand-alone third-party assistants without the contractual protections that are in place for University-managed systems.
  • Meeting content and Penn’s confidential information or data may be used to train third-party AI models.
  • Meeting content may be exposed to third parties.

Security Risks:

  • Stand-alone third-party assistants are not subject to Penn security assessments and may introduce vulnerabilities to our computing environment.
  • Using stand-alone third-party assistants may conflict with various compliance requirements for secure computing.
  • Data integrity may be at risk due to “hallucination” issues with AI models used by stand-alone third-party assistants.

Guidance for Use of Virtual Meeting Assistants

Due to the privacy and security risks associated with stand-alone third-party virtual meeting assistants, these tools should not be used in any manner and should not be permitted to join any Penn virtual meetings, regardless of whether the platform is managed by the University or by a third party (e.g., a vendor platform or research collaborator). Faculty and staff may only use virtual meeting assistants that are embedded in University-managed collaboration platforms.

Questions

For questions regarding this guidance, please contact the Office of Information Security (security@isc.upenn.edu) or the Office of Privacy (privacy@upenn.edu).