Frequently Asked Questions
About the Gen AI Lab
-
We envision the lab as a collaborative space where Canadian nonprofits learn about, use, share and co-design AI solutions. For now, we offer a user-friendly Nonprofit AI workspace with dedicated assistants, a library with prompts relevant for nonprofits, and an AI evaluation planner. In the next phase of the lab, we will roll out opportunities for co-design and a curated resource library.
-
Use the interest form on our webpage to sign up for the workspace. If you meet our eligibility criteria, you will receive a confirmation email containing instructions to create a Nonprofit AI workspace account. Note that it may take up to 30 minutes for the confirmation email to arrive.
-
To use the AI workspace, you need to work or volunteer for a Canadian registered nonprofit that provides community-based services in Ontario.
We recommend that nonprofit staff and volunteers discuss their use of the platform with their organization’s management. Some agencies may restrict staff use of AI. It’s the responsibility of individual users to ensure that their activity on the platform is in keeping with their organization’s policies.
-
For now, the Nonprofit AI workspace is available free of charge to staff and volunteers of Canadian registered nonprofits that offer community-based programs and services in Ontario.
Platform privacy and security
-
We use Open WebUI, an open-source platform that offers a user-friendly interface for AI interactions while maintaining high security and performance standards. We host it on Elest.io, with servers based in Germany.
-
No. We use 'Enterprise' licenses for Large Language Models (LLMs) and have opted out of services that use data for training purposes. This means your conversations and uploaded files won't be used to improve the underlying AI models.
-
No. Your chats and uploaded files are private and cannot be seen by other users unless you deliberately share them by sending a link.
-
Our Open WebUI administrators at LogicalOutcomes cannot see or download your chats through the platform; we have opted out of the settings that would allow us to see communications sent to the LLMs. Only administrators with backend access to the Elest.io database could technically access chats.
-
Administrators at the following levels might access data for troubleshooting purposes:
Managed service level (e.g., Elest.io administrators)
LLM provider companies (e.g., OpenAI administrators)
Hosting providers
Each company has its own privacy policies that may change. Additionally, chats and files may be subject to government warrants or subpoenas.
-
Follow these practices:
Never upload electronic health records or detailed client records with personal identifiers.
If you’re working with a dataset that contains personally identifiable information, strip identifiers before uploading for analysis.
When discussing sensitive or confidential data with an LLM, treat your AI conversations with the same level of caution that you’d apply to an email to someone outside your organization.
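As a concrete illustration of stripping identifiers before upload, here is a minimal Python sketch. The column names (`name`, `email`, and so on) are hypothetical — adjust the set to match the identifiers in your own dataset.

```python
import csv
import io

# Hypothetical identifier columns -- replace with the actual
# personally identifying fields in your dataset.
IDENTIFIER_COLUMNS = {"name", "email", "phone", "address", "health_card_number"}

def strip_identifiers(rows):
    """Return a copy of each row with identifier columns removed."""
    return [
        {k: v for k, v in row.items() if k.lower() not in IDENTIFIER_COLUMNS}
        for row in rows
    ]

# Example: de-identify a small in-memory CSV before uploading it for analysis.
raw = io.StringIO(
    "name,email,program,visits\n"
    "Jane Doe,jane@example.com,Food Bank,4\n"
)
cleaned = strip_identifiers(list(csv.DictReader(raw)))
print(cleaned)  # [{'program': 'Food Bank', 'visits': '4'}]
```

Note that removing obvious identifier columns is only a first step; combinations of remaining fields (such as postal code plus date of birth) can still identify individuals, so review the de-identified file before uploading.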
-
We employ several strategies to protect your privacy:
European hosting: Our AI platforms are hosted in Germany by a German company, managed by an Irish provider, and subject to European privacy regulations (GDPR), offering stronger privacy protections than many other jurisdictions.
Privacy-focused configuration:
Opting out of data sharing for training purposes
Disabling administrator access to user chats
Custom deployment options: We can help agencies set up their own cloud-hosted AI instances that their IT staff can manage, either in Canada or Europe.
-
Our data is hosted in Germany and managed by an Irish provider, subject to European privacy regulations (GDPR). We chose European hosting over Canadian hosting because:
It provides protection from US government warrants
It's subject to stronger privacy laws under GDPR
The available Canadian-located services for our platforms are owned by US companies
-
The privacy risks of AI chatbots are similar to those of email services. Information shared with AI chatbots could potentially be accessed by system administrators, service providers, or government entities under certain circumstances. Just as with email, you should be mindful of the type of information you share in these systems.
-
Both AI chatbots and email services have similar privacy considerations at different levels:
User level: Just as emails can be forwarded, chats can be shared.
Provider level: Technical staff may access content for troubleshooting.
System level: Administrators have technical capacity to access data.
Legal level: Both may respond to government requests or subpoenas.
Security level: Both have risks of unauthorized access or accidental exposure.
The main difference is that AI services process your information through machine learning systems, which may have additional data handling requirements.
-
We will be posting our configurations in a public wiki. We are also developing detailed descriptions of privacy risks and options as part of our ongoing work over the next few months.
Integrations
-
Many nonprofits use Microsoft 365 and rely on its security systems. If it's important for your organization to stay in the Microsoft environment, we suggest:
Using Microsoft 365 Copilot or Copilot Studio for sensitive information
Using our application for general purposes where Microsoft integration isn't required
Microsoft 365 Copilot has the advantage of integration with Word, Excel, OneDrive and other Microsoft services. You can still use our prompt library within Copilot.
-
Yes, this is possible. If your agency is interested, we can help your IT staff set up a private AI workspace on your own computer network using open-source Large Language Models. This would keep all information within your agency's infrastructure.
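As an illustration only, a common way to run such a setup is Open WebUI paired with Ollama (a local server for open-source models) via Docker Compose. The sketch below is a hypothetical configuration, not our supported deployment; your IT staff would adapt ports, volumes, and model choices to your environment.

```yaml
# Hypothetical sketch: self-hosted Open WebUI with local open-source
# models served by Ollama. All traffic stays on your own network.
services:
  ollama:
    image: ollama/ollama                  # serves open-source LLMs locally
    volumes:
      - ollama:/root/.ollama              # downloaded model files
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                       # browse to http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - open-webui:/app/backend/data      # chats and settings
volumes:
  ollama:
  open-webui:
```

Because the models run on your own hardware, no chat content leaves your network, though your own administrators then take on responsibility for backups, updates, and access control.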