Generative AI
Public generative AI tools offer exciting possibilities for our learning, teaching, work, and personal lives. Common tools include Google Gemini, OpenAI ChatGPT, Microsoft Copilot, DALL-E, and Midjourney. These systems can generate content, provide insights, summarize information, and help streamline our tasks.
Microsoft Copilot
VCU has licensed Microsoft Copilot, a generative artificial intelligence chatbot, for all faculty, staff, and students to use for free. Navigate to go.vcu.edu/copilot and log in with your VCU email address and password. Copilot is built with the same security, privacy, and compliance standards as other Microsoft products. Your prompts and Copilot’s responses are:
- Not available to other customers.
- Not used to train or improve any third-party products or services (such as OpenAI models).
- Not used to train or improve Microsoft AI models.
Guidelines for the use of generative AI tools
It is crucial to exercise caution when using generative AI. The following guidelines cover the use and purchase of generative artificial intelligence (AI) tools at VCU. They do not create new university policy; instead, they apply existing university policies and standards. We welcome your feedback and will continue to update these guidelines as the technology and its uses evolve.
Do not enter confidential (Category 1) or sensitive (Category 2) data into any AI tool. Information given to generative AI tools is not private, and humans may review content entered into public models. Do not enter anything you would not want another person to read or that you would not want available to anyone on the internet.
Confidential and sensitive data includes non-public research data, finance and credit card information, HR and employee data, student names, IDs, email addresses, records, and grades, as well as medical information protected by HIPAA. Refer to VCU’s Data Classification Standard for more information, or use VCU’s Data Classification Tool to classify your data.
AI-generated content can be wrong or misleading, may include fabrications, and may infringe copyright. Be aware that AI tools can be unreliable and biased. They are excellent research assistants, but they can "hallucinate," suggesting false facts and plausible-sounding sources that do not exist.
Check generated content against reliable sources, and do not rely on AI for medical, legal, financial, or other professional advice.
Adhere to current university policies on academic integrity when using AI tools. Uploading or using content that is copyrighted or that infringes on the rights of any third party is prohibited. Review the "Plagiarism and Copyright" section in the "Generative Artificial Intelligence (Gen AI) and Teaching & Learning Tool" provided by VCU Faculty Affairs. As the landscape of AI technology evolves, the university will continue to update its guidance and policies to reflect our growing understanding of the effects of generative AI tools in academic settings.
Meetings may contain sensitive information that must remain confidential and secure. Third-party bots can scrape your calendar for information and keep written or recorded minutes of meetings. They may store recordings in unknown locations and join meetings when you are not there. VCU has blocked various bots from accessing Zoom and Google Meet.
VCU has no current contracts with third-party live transcription services that would protect our data, so we recommend avoiding third-party bots in virtual meetings. Please use the built-in transcription in Zoom and Google Meet; it is a secure alternative for your meetings.
Review the terms of service and privacy policy of any public generative AI system before you use it, and make sure you are comfortable with how it handles and stores your data. Be mindful of any rights the platform may claim over content you create with its system, and check its policies governing data use and sharing.
Generative AI has made it easier for bad actors to create convincing scams at scale. Continue to follow security best practices, and send suspicious messages to infosec@vcu.edu.
Any technology that stores, processes, or transmits VCU data must have the required privacy, security, and accessibility protections and must be reviewed. Only use third-party tools that have been vetted and approved by IT Governance.
Microsoft Copilot is currently the only generative AI tool available to VCU. Other tools lack university-wide contracts and require VCU IT Governance review and approval before they can be used with university data.