Privacy and data

The technical companion to the legal Privacy Policy: where your data is stored, what's encrypted, what touches the network, and how to verify the claims.

The legal page tells you what we promise; this page gives you the file paths, the network calls, and the off-the-shelf tools you can use to verify any of it.

The principle is one sentence: your data stays on your device unless you explicitly send it somewhere else.

What stays on your device

By default, all of this is local-only:

  • Audio recordings (WAV files in your app data folder).
  • Transcripts (SQLite database).
  • Summaries (same database).
  • Action items, flashcards (Pro features — generated and consumed in-app).
  • Study Sage chat history (browser-style local storage on your device, with a size cap).
  • Settings, custom templates, reading preferences.
  • AI models — Whisper for transcription, GGUF models (Ministral, EuroLLM, …) for summaries.
  • Analytics — computed locally from your database, never reported.

There's no NeuroBridgeEDU server holding your data, no account, no sign-in.

What touches the network — and only when you ask

Three things can reach the internet:

  1. Model downloads. When you click Download on a Whisper or LLM model, the file is fetched from Hugging Face. After that, the model lives on your disk and runs offline.
  2. Cloud AI providers (opt-in). If you add an API key for OpenAI, Anthropic, Mistral, Gemini, or an OpenAI-compatible endpoint, only transcript text is sent to that provider when you generate a summary or chat with Study Sage. Audio never leaves your device. All API calls use HTTPS with TLS.
  3. External links. Links on the About page open in your browser. The app itself doesn't phone home.

A fourth, strictly opt-in case: anonymous error reporting, which you can enable from Settings → Diagnostics. It's off by default.

How API keys are stored

API keys live in your operating system's keychain via the keyring crate, not in the SQLite database:

  • macOS — Keychain Access
  • Windows — Credential Manager
  • Linux — Secret Service (GNOME Keyring / KWallet)

This means the keys are protected by your OS user account and aren't present in any file you might back up or accidentally share.
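If you'd rather check this claim than take it on faith, scan the database file for a distinctive prefix of one of your keys. The helper below is a minimal sketch, not part of the app, and the `sk-` prefix is just an illustration of an OpenAI-style key; the demo file stands in for your real neurobridge_edu.sqlite:

```python
from pathlib import Path
import tempfile

def file_contains(path, needle: bytes) -> bool:
    """Stream the file in chunks and report whether `needle` occurs,
    including matches that straddle a chunk boundary."""
    carry = b""
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):
            if needle in carry + chunk:
                return True
            carry = chunk[-(len(needle) - 1):] if len(needle) > 1 else b""
    return False

# Demo on a throwaway file; point `db` at your own neurobridge_edu.sqlite.
db = Path(tempfile.mkdtemp()) / "demo.sqlite"
db.write_bytes(b"SQLite format 3\x00 transcripts, summaries, no secrets here")
print(file_contains(db, b"sk-"))  # False: no key material in the file
```

A plain `grep` over the app data directory works just as well; the point is that the key bytes should never appear in any file there.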

Where your files live

  • Database (neurobridge_edu.sqlite) — OS app data directory for com.neurobridge.edu.v2.
  • Recordings (recordings/) — same directory.
  • Models (models/) — same directory.
  • Logs (logs/app.log.<date>) — dirs::data_dir()/NeuroBridgeEDU/logs/.

The exact OS app data directory is platform-dependent. The simplest way to find any of these is Settings → Diagnostics → Open Log Folder for logs, or Copy Diagnostic Report for the database path.
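If you'd rather compute the location than click through the UI, the conventional per-platform app-data directories can be sketched as follows. These paths are assumptions based on standard OS conventions, not guarantees; treat Copy Diagnostic Report as the authoritative answer:

```python
import os
import platform
from pathlib import Path

APP_ID = "com.neurobridge.edu.v2"  # bundle identifier from the docs above

def app_data_dir() -> Path:
    """Best-guess app data directory per platform; verify the real one
    via Settings -> Diagnostics -> Copy Diagnostic Report."""
    system = platform.system()
    home = Path.home()
    if system == "Darwin":
        return home / "Library" / "Application Support" / APP_ID
    if system == "Windows":
        return Path(os.environ.get("APPDATA", home / "AppData" / "Roaming")) / APP_ID
    # Linux: XDG base directory convention, with the usual fallback
    xdg = os.environ.get("XDG_DATA_HOME")
    base = Path(xdg) if xdg else home / ".local" / "share"
    return base / APP_ID

print(app_data_dir() / "neurobridge_edu.sqlite")
```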

How to verify the privacy claims

  1. Network monitor. Open Little Snitch (macOS), netstat (any platform), or your firewall's connection log while you record. With cloud AI off, NeuroBridgeEDU only opens a network connection when you explicitly download a model.
  2. File system inspection. Your audio is in recordings/; your transcripts are in neurobridge_edu.sqlite. Both are inspectable with off-the-shelf tools (any SQLite viewer, any audio player).
  3. Source code. The project is open source. The Rust backend is in src-tauri/src/; network calls are concentrated in the cloud-provider files (summary/providers/anthropic.rs, openai.rs, etc.) and the model-download command.
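For step 2, Python's built-in sqlite3 module is viewer enough. A minimal sketch; the demo schema is invented for illustration, so point `db_path` at a copy of your real neurobridge_edu.sqlite to see the actual tables:

```python
import sqlite3
import tempfile
from pathlib import Path

def list_tables(db_path) -> list[str]:
    """Return the names of all tables in a SQLite database file."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
        ).fetchall()
    return [name for (name,) in rows]

# Self-contained demo; the `meetings` table is made up for this example.
db_path = Path(tempfile.mkdtemp()) / "demo.sqlite"
with sqlite3.connect(db_path) as conn:
    conn.execute("CREATE TABLE meetings (id INTEGER PRIMARY KEY, transcript TEXT)")

print(list_tables(db_path))  # ['meetings']
```

Run it against a copy rather than the live database while the app is open, to avoid contending with the app's own connection.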

Your GDPR rights, in practical terms

Because all your data lives on your own machine by default, you already have full GDPR-style control over it:

  • Access — open neurobridge_edu.sqlite with any SQLite viewer; play your WAV files.
  • Rectification — edit transcripts and summaries directly in the meeting view.
  • Erasure — delete a single meeting, or Settings → Storage → Delete All Data.
  • Restriction — stop using a feature or remove an API key.
  • Portability — export to PDF (Pro); back up the database file and recordings/.
  • Objection — opt out of telemetry; don't use cloud providers.
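Portability, in particular, is just a file copy. A minimal backup sketch, assuming the directory layout described earlier on this page (confirm your actual app-data path via Copy Diagnostic Report):

```python
import shutil
import tempfile
from pathlib import Path

def backup(app_dir: Path, dest: Path) -> None:
    """Copy the database and the recordings folder to `dest`."""
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy2(app_dir / "neurobridge_edu.sqlite", dest / "neurobridge_edu.sqlite")
    shutil.copytree(app_dir / "recordings", dest / "recordings", dirs_exist_ok=True)

# Demo with throwaway directories standing in for the real app data folder.
app_dir = Path(tempfile.mkdtemp())
(app_dir / "recordings").mkdir()
(app_dir / "neurobridge_edu.sqlite").write_bytes(b"SQLite format 3\x00")
(app_dir / "recordings" / "lecture.wav").write_bytes(b"RIFF")

dest = Path(tempfile.mkdtemp()) / "backup"
backup(app_dir, dest)
print(sorted(p.name for p in dest.rglob("*")))
# ['lecture.wav', 'neurobridge_edu.sqlite', 'recordings']
```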

NeuroBridgeEDU itself holds no personal data about you. There's no account to delete and no profile to download.

If you've used a cloud AI provider, that provider holds your transcript text in its logs according to its own retention policy. Contact the provider directly to exercise your GDPR rights over that data.

Deleting your data

A single meeting

Open the meeting, click the menu, and choose Delete. This removes the database row, the WAV file, and any associated chat history.

Everything

Settings → Storage → Delete All Data. You'll be asked to confirm before anything runs. This wipes the SQLite database, the recordings folder, the chat history, the API keys, and the local templates. The only thing left behind is the downloaded AI models: they're large, so we leave them in place; remove them manually if you want them gone too.

Uninstalling the app

Removing the app via your OS's normal uninstall flow does not automatically delete your data folder. Use Delete All Data first if you want a clean exit.