GDPR-compliant AI.
Why most tools are not, and how Namulai handles it.
GDPR compliance is not a badge an AI vendor awards itself. It is a set of obligations under Regulation (EU) 2016/679 covering lawful basis, data minimisation, retention, transparency, and the rights of the data subject. Most general-purpose AI tools were built before those obligations were treated seriously. Namulai was built after. This page sets out, plainly, what we do, what we delegate to subprocessors, and where the boundaries actually sit.
What GDPR actually requires from an AI tool
Regulation (EU) 2016/679 is not a checklist of buzzwords. It requires a lawful basis for processing (Art. 6), purpose limitation (Art. 5.1.b), data minimisation (Art. 5.1.c), storage limitation (Art. 5.1.e), and a clear chain of accountability through controllers and processors (Art. 24 to 28).
For an AI assistant specifically, that translates into concrete questions. Why is the prompt being stored? For how long? Is it used to retrain a model, and on what basis? Who are the subprocessors, where are they located, and is there a transfer mechanism for any traffic that leaves the EEA? These are the questions a competent DPO will ask before signing off on a tool, and they are the right ones.
Where most major AI tools fall short
The shortcomings tend to cluster. Free consumer tiers often default to using prompts for model improvement, with opt-out buried in settings. Standard data processing addenda are missing or only available on enterprise plans. Subprocessor lists are vague or unpublished. Retention windows are described in marketing copy rather than contract.
None of this makes the tools inherently unlawful, but it pushes the compliance burden onto the customer. A solo consultant or a five-person studio will rarely have the time or leverage to negotiate that out. The result is a quiet drift of client data into systems whose legal basis has never been properly established.
How Namulai approaches it
Namulai operates as a controller for account data and as a processor for the content of your conversations. Account data, message history, and billing metadata sit in MongoDB on a single Hetzner VM in Falkenstein, Germany. There is no analytics warehouse, no second copy in another region, no shadow database.
Lawful basis is contractual necessity under Art. 6.1.b for the service itself, and consent under Art. 6.1.a for optional analytics. We do not train models on your prompts. Inference is routed through OpenRouter, which has provider-level retention disabled by default. You can export your data and delete your account from settings, and the underlying records are purged within thirty days through MongoDB TTL indexes.
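For readers who want to see what "retention disabled" looks like in practice, here is a minimal sketch of an OpenRouter chat-completion payload that asks the router to exclude any provider that stores or trains on prompts, using OpenRouter's documented `data_collection` provider-routing preference. The model name and message are placeholders, not Namulai's actual configuration.

```python
import json

# Sketch of an OpenRouter request body. The "data_collection": "deny"
# routing preference tells OpenRouter to skip providers that retain
# prompt data. Model and message below are illustrative placeholders.
payload = {
    "model": "mistralai/mistral-small",  # hypothetical model choice
    "messages": [
        {"role": "user", "content": "Summarise this contract clause."}
    ],
    "provider": {
        # Exclude providers that collect or store prompt data.
        "data_collection": "deny",
    },
}

body = json.dumps(payload)
print(body)
```

The payload would then be POSTed to the chat completions endpoint with the account's API key; the routing preference travels with each individual request, so it applies per call rather than per account.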
Where EU residency and GDPR-likeness diverge
Namulai is not a pure EU-only stack, and pretending otherwise would be misleading. OpenRouter routes many models through EU-region endpoints when the underlying provider supports it, but some models are only served from US infrastructure. When you choose one of those models, the prompt leaves the EEA for the duration of that single request.
That transfer is covered by Standard Contractual Clauses and provider-level commitments, which is the same legal footing most enterprise SaaS relies on after Schrems II. It is lawful, but it is not the same thing as never leaving the EEA. We surface the distinction in the model picker rather than hide it, so you can choose accordingly.
What this means for EU businesses and freelancers
If you handle client data subject to professional secrecy, health data, legal files, or anything else where Art. 9 special categories are in play, you should not be pasting that material into a free consumer chatbot. The lawful basis is shaky and the audit trail is non-existent.
Namulai gives you a more defensible posture: a single EU-hosted controller, a published subprocessor list, no training on prompts, and the ability to scope your sensitive work to EU-routed models. It is not a substitute for your own DPIA, and it does not turn an Art. 9 use case into a casual one. It does mean that, for the routine work that fills most of a working week, the compliance story is one you can actually tell.
GDPR and AI: common questions
Is Namulai a controller or a processor?
Both, depending on the data. We act as a controller for your account information, billing records, and authentication metadata, which we need to operate the service. For the content of your conversations with the models, we act as a processor on your behalf and route the request through our subprocessor, OpenRouter.
Do you sign a Data Processing Addendum?
Yes, on request for paid customers. The DPA mirrors the structure expected under Art. 28, lists the subprocessors involved (OpenRouter, Stripe Payments Europe Ltd., the email provider), specifies the categories of data processed, and includes the Standard Contractual Clauses for any transfer of conversation content to non-EEA model endpoints.
What is the legal basis for processing my prompts?
Contractual necessity under Art. 6.1.b of Regulation (EU) 2016/679. Without processing the prompt we cannot deliver the service you subscribed to. Optional analytics, when enabled, rely on consent under Art. 6.1.a, and you can withdraw that consent at any time without affecting the contract itself.
How long do you keep my conversation history?
For as long as your account is active, so you can come back to it. When you delete your account from settings, the conversation history and account record are purged from MongoDB through TTL indexes within thirty days. Encrypted database backups are retained on a rolling 24-hour window and overwritten as part of normal rotation.
A defensible AI workflow for EU teams, without the enterprise overhead.
Try Namulai free
30-day free trial · €19.80/month after · cancel anytime