A blockchain-based patient data system, and what it taught me about building against infrastructure
The idea began at a Christmas dinner. My in-laws had invited W. — gynecologist, former chief of a large private clinic in Mainz, the kind of doctor who has seen enough of the system to have opinions about it. His daughter, a dermatologist running her own practice, started talking about KIM.
KIM — Kommunikation im Medizinwesen — is the encrypted communication system between doctors and insurers in Germany. It is, in theory, the backbone of digital healthcare communication. In practice, it is a box that sits in your office, maintained by technicians who show up unannounced, occupy treatment rooms during business hours, offer no transparency about what they're doing, and leave behind a system that works the way old-school IT has always worked: opaquely, expensively, and on someone else's terms.
She was not complaining about a detail. She was describing a structure. The doctor — the person responsible for the patient — has no meaningful control over the communication channel through which patient data travels. The patient has even less. The data moves from practice software through KIM to the insurer, and at no point does anyone with a medical relationship to that data hold the keys.
W. listened, then said something I have not forgotten: "Someone just needs to build it."
He meant: a system where the patient holds the keys. Where consent is not a form you sign in the waiting room and forget, but a cryptographic event — verifiable, revocable, logged. Where the doctor sends data through the patient's permission, not around it.
I thought I could build it. I had the background — data science, blockchain development at Pagoda (the company behind NEAR Protocol), years of working at the intersection of systems architecture and real-world data flows. And I had a prototype from years earlier: a hackathon project at a Sensors & Data event in Heidelberg, where my team had built a blockchain-based medical data system.
That earlier project had won first prize for best business idea, awarded by the Frankfurt School of Finance. It was, in retrospect, slightly absurd. The concept was the inverse of what sicherlich, the project this essay describes, would become: instead of giving patients sovereignty over their data, we had built a system that let patients auction their anonymized medical records — to pharmaceutical companies, research institutions, anyone willing to pay. Data as commodity. The judges loved it.
The distance between that hackathon project and sicherlich is the distance between a clever idea and an honest one. Auctioning your own data sounds empowering until you think about it for five minutes. Who sets the price? Who verifies anonymization? Who benefits when a thousand diabetics sell their records to the same buyer? The patient becomes a micro-entrepreneur of their own illness. It is Silicon Valley logic applied to the body, and it is exactly the kind of thing that wins prizes from finance schools.
Sicherlich started from the opposite end. Not: how can the patient monetize their data? But: how can the patient control it?
The Architecture
The system I designed has five layers. None of them is exotic. All of them exist as proven technology. The difficulty was never the components — it was making them work together inside a healthcare system that was not built to accommodate them.
The NFC keycard. A physical card the patient carries — like a bank card, but holding cryptographic keys. When the patient presents the card at a doctor's office (via NFC reader or smartphone), it generates a ConsentToken: a time-limited, scope-limited, cryptographically signed permission. "This doctor may send my diagnosis to this insurer, for this purpose, until this date." Not a blanket release. A specific, auditable act of consent.
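The essay does not publish the token format, so this is a minimal sketch under stated assumptions: the field names are mine, and an HMAC stands in for the asymmetric signature the card's chip would actually produce.

```python
import hashlib
import hmac
import json
from dataclasses import dataclass, asdict

# Hypothetical demo key; on a real card this would be a private key
# that never leaves the chip's secure element.
CARD_KEY = b"demo-secret-held-on-the-nfc-card"

@dataclass
class ConsentToken:
    patient_id: str
    doctor_id: str
    recipient: str     # who may receive the data, e.g. one specific insurer
    scope: str         # what may be sent, e.g. "diagnosis" -- not a blanket release
    valid_until: str   # ISO date after which the consent expires

def sign_token(token: ConsentToken) -> str:
    # Canonical serialization so the same token always yields the same signature.
    payload = json.dumps(asdict(token), sort_keys=True).encode()
    return hmac.new(CARD_KEY, payload, hashlib.sha256).hexdigest()

token = ConsentToken("patient-001", "doctor-mainz-07", "insurer-x",
                     "diagnosis", "2026-06-30")
signature = sign_token(token)
```

Changing any field, including the expiry date, invalidates the signature, which is what makes the consent specific and auditable rather than a blanket release.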
The KIM Interceptor. A middleware layer that sits between the doctor's practice software and the KIM communication channel. When the doctor sends medical data — a diagnosis, a referral, a billing record — the interceptor checks: does a valid ConsentToken exist for this transmission? If yes, the data passes. If no, it doesn't. The doctor's workflow doesn't change. The infrastructure does.
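The gate logic can be sketched in a few lines, assuming (my assumption, not the project's published design) that tokens and outgoing messages are plain records:

```python
from datetime import date

def intercept(transmission: dict, tokens: list[dict], today: date) -> bool:
    """Forward to KIM only if a valid ConsentToken covers this transmission."""
    for t in tokens:
        if (t["patient_id"] == transmission["patient_id"]
                and t["recipient"] == transmission["recipient"]
                and t["scope"] == transmission["kind"]
                and today <= date.fromisoformat(t["valid_until"])):
            return True   # consent exists: data passes, the doctor's workflow is unchanged
    return False          # no valid consent: the transmission stops at the interceptor

tokens = [{"patient_id": "p1", "recipient": "insurer-x",
           "scope": "diagnosis", "valid_until": "2026-06-30"}]
msg = {"patient_id": "p1", "recipient": "insurer-x", "kind": "diagnosis"}
allowed = intercept(msg, tokens, date(2026, 1, 15))
```

The same message sent after the expiry date, or with a scope the token does not cover, is blocked without any change on the doctor's side.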
The Zero-Knowledge verification layer. This is where it gets elegant. The ZK circuit validates that the ConsentToken is authentic, correctly signed, and within scope — without revealing the token's contents. The insurer learns: "this transmission is authorized." They do not learn how it is authorized, or what other authorizations exist. The proof is boolean. Yes or no. Nothing more.
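Sicherlich's actual circuits are not shown here. As an illustration of the underlying idea, proving you hold a secret without revealing it, here is the classic Schnorr protocol made non-interactive with the Fiat-Shamir heuristic, over a deliberately tiny demo group:

```python
import hashlib
import secrets

# Tiny demo group: P = 2*Q + 1, both prime; G generates the subgroup of order Q.
# Real deployments use ZK circuits over far larger groups; these numbers are toys.
P, Q, G = 2039, 1019, 4

def fiat_shamir(y: int, t: int) -> int:
    # Non-interactive challenge derived from the public values only.
    return int.from_bytes(hashlib.sha256(f"{G}|{y}|{t}".encode()).digest(), "big") % Q

def prove(x: int, y: int) -> tuple[int, int]:
    r = secrets.randbelow(Q)          # one-time blinding value
    t = pow(G, r, P)                  # commitment
    c = fiat_shamir(y, t)             # challenge
    s = (r + c * x) % Q               # response; the secret x is never transmitted
    return t, s

def verify(y: int, t: int, s: int) -> bool:
    c = fiat_shamir(y, t)
    # Boolean outcome: g^s == t * y^c. Yes or no. Nothing more is learned.
    return pow(G, s, P) == (t * pow(y, c, P)) % P

x = secrets.randbelow(Q)   # the secret bound to the ConsentToken
y = pow(G, x, P)           # the public value the verifier sees
t, s = prove(x, y)
```

The verifier learns that the prover knows x such that y = g^x, and nothing about x itself, which is the same contract the ConsentToken verification layer relies on.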
The private chain. Every consent event, every data transmission, every access attempt is logged on a private blockchain — Hyperledger Fabric in the test environment, with Calimero (a NEAR-based private layer) considered and ultimately set aside. The chain is not public. It is an audit trail. The patient can review it. A regulator can audit it. No one else sees it.
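Hyperledger Fabric supplies the audit trail out of the box; as a conceptual sketch of why an append-only, hash-linked log makes tampering detectable (a toy, not Fabric's actual ledger format):

```python
import hashlib
import json

class AuditChain:
    """Append-only log where each entry commits to its predecessor's hash."""

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = json.dumps(event, sort_keys=True)
        h = hashlib.sha256((prev + body).encode()).hexdigest()
        self.entries.append({"event": event, "prev": prev, "hash": h})

    def verify(self) -> bool:
        # Recompute every link; any edited entry breaks the chain from that point on.
        prev = "0" * 64
        for e in self.entries:
            body = json.dumps(e["event"], sort_keys=True)
            if e["prev"] != prev or e["hash"] != hashlib.sha256((prev + body).encode()).hexdigest():
                return False
            prev = e["hash"]
        return True

chain = AuditChain()
chain.append({"event": "consent-granted", "scope": "diagnosis"})
chain.append({"event": "transmission", "recipient": "insurer-x"})
```

Rewriting any logged consent event after the fact invalidates every subsequent link, which is what lets a patient or regulator trust the trail without trusting its operator.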
The consent interface. A web application where the patient manages everything: which doctors have access, which transmissions are pending, which consents are active, which have expired. QR code fallback for situations where the NFC card isn't available. The interface is privacy-first — designed so that the default state is: no one sees anything, unless you explicitly allow it.
I built working modules for most of these layers. The test wallet generated keys and signed tokens. The ZK circuits compiled and verified. The chain logged events. But the pieces never fully came together into an integrated system. The KIM Interceptor, in particular, never ran against real KIM traffic — only simulated payloads. This matters, and I will not pretend otherwise.
The Wall
The technical architecture was solvable. I know this not because I solved it completely, but because nothing I encountered was architecturally novel. ConsentTokens, ZK proofs, private chains, NFC authentication — these are established patterns. A well-funded team of five could build this in a year.
What stopped me was not technology. It was Germany.
The first problem was KIM itself. KIM is not just a communication protocol. It is a regulated infrastructure operated under the authority of gematik, the national agency for digital health. Intercepting KIM traffic — even to add a consent layer — is not a feature you bolt on. It is, depending on interpretation, potentially illegal. You cannot simply insert a proxy between a doctor's practice software and the KIM endpoint without regulatory clearance. And regulatory clearance for modifying healthcare communication infrastructure in Germany is not a form you fill out. It is a process that assumes you are an institution, not a person.
The second problem was BaFin — the Federal Financial Supervisory Authority. As the architecture crystallized, it became clear that sicherlich, if it were to operate independently, would need to function as something resembling an insurance product. The patient pays for sovereignty over their data. The system guarantees certain protections. This is, from a regulatory perspective, dangerously close to offering insurance — which requires a BaFin license, which requires capital reserves, compliance infrastructure, and a legal entity that looks nothing like a solo developer with a prototype.
I discovered this not through a dramatic confrontation with regulators, but through research. Online, at my desk, reading legal frameworks. The realization was quiet and total.
The third problem was structural and, in some ways, the most interesting. The German health insurance system — the interplay between gesetzliche and private Krankenversicherung — creates a set of incentives that actively resist patient data sovereignty. If the patient controls what the insurer sees, the insurer cannot perform risk assessment. If the insurer cannot perform risk assessment, the pricing model collapses. The entire architecture of private health insurance in Germany depends on the insurer having access to the patient's medical history. A system that encrypts this access — that gives the patient a kill switch — is not a technical problem for insurers. It is an existential one.
And then, during my Heilpraktiker training, I learned something that added a final constraint: in Germany, patients with certain psychiatric diagnoses — psychotic disorders, schizophrenia — are legally restricted from viewing their own medical records. The logic is protective: the concern is that unmediated access to certain diagnoses could be harmful. Whether or not you agree with this — and I have complicated feelings about it — it means that a universal "patients own their data" system immediately runs into a legal exception. Not every patient can own their data, under current law. The system would need to accommodate this, which means building access restrictions into a platform whose entire premise is removing them.
Each of these problems is solvable in isolation. Together, they form a constraint field that multiplies faster than a single person can address it. It is not that sicherlich is impossible. It is that sicherlich, built properly, requires simultaneous engagement with cryptographic engineering, healthcare regulation, insurance law, psychiatric ethics, and institutional politics — and it requires this engagement not as research but as operational reality, with legal opinions, compliance documentation, and institutional partnerships.
I am one person. I have half a day. The math does not work.
What I Learned
Sicherlich is not dead. It is paused. The architecture is documented. The technical components exist as working modules. The insight that drives it — that patient data sovereignty is not a feature but a right, and that the infrastructure to enforce it is buildable — has not changed.
What has changed is my understanding of what "buildable" means in a regulated system. In software, "buildable" usually means: the technology exists, the architecture is sound, the implementation is a matter of time and skill. In German healthcare, "buildable" means: the technology exists, the architecture is sound, and between you and implementation stand seventeen regulatory bodies, three incompatible legal frameworks, and an insurance industry whose business model depends on the problem you are trying to solve.
This is not a complaint. It is a structural observation. The German healthcare system is not broken because bad people run it. It is resistant to patient data sovereignty because it was designed — over decades, through incremental legislation, through the interplay of public and private insurance, through the consolidation of digital infrastructure under gematik — to function without it. Adding sovereignty after the fact is not an upgrade. It is a redesign. And the system, reasonably, resists being redesigned by someone who is not the system.
But there is a question I have been avoiding, and it is the uncomfortable one.
Signal built end-to-end encryption for messaging and shipped it to the world without asking anyone's permission. WhatsApp adopted the Signal protocol for two billion users — overnight, as a software update. Prosecutors, intelligence agencies, entire governments found themselves staring at encrypted channels they could not open. No one applied for a license. No one waited for regulatory clarity. They built the infrastructure, released it, and let the legal system catch up.
I stopped when I read that intercepting KIM might be illegal. I stopped when I realized BaFin might classify the system as an insurance product. I stopped when the constraints multiplied. Was I being responsible — recognizing that healthcare is not messaging, that routing errors here mean treatment errors, that regulation in this field exists for reasons that matter? Or was I being too cautious — mistaking the system's resistance for impossibility, when what was needed was the Signal approach: build it, ship it, let them react?
I do not know. Both might be true. But I notice that the GDPR already gives every patient in Germany the right to a full copy of their medical records, the right to have data deleted, the right to portability in machine-readable formats. These are not theoretical rights. They are enforceable, with deadlines. No doctor can legally refuse. A tool that does nothing more than automate the exercise of these existing rights — generate the request, encrypt the response, enable selective sharing — needs no KIM interceptor, no blockchain, no BaFin license. It needs a patient who installs it and a law that already exists.
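That smaller tool can be sketched in a few lines. The template wording below is mine and is illustrative, not legal advice; the rights it invokes are Art. 15 (access), Art. 20 (portability), and Art. 12(3) (one-month deadline) GDPR.

```python
def gdpr_access_request(patient_name: str, practice: str) -> str:
    """Draft a records request under rights that already exist in the GDPR."""
    return (
        f"Dear {practice},\n\n"
        "under Art. 15 GDPR I request a complete copy of all medical records "
        "you hold about me, and under Art. 20 GDPR I request them in a "
        "structured, commonly used, machine-readable format.\n\n"
        "Art. 12(3) GDPR requires a response within one month of receipt.\n\n"
        f"Sincerely,\n{patient_name}"
    )

letter = gdpr_access_request("Jane Doe", "Praxis Dr. Example")
```

A real tool would add encryption of the returned records and selective sharing on top, but the request itself needs no interceptor, no chain, and no license.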
That is a smaller thing than sicherlich was designed to be. Less elegant, less architecturally ambitious. But it is buildable now, by one person, without permission. And perhaps that is the lesson: not that the full vision was wrong, but that I was looking for a way to build the cathedral when what was needed first was a door.
The hackathon version of sicherlich — the one where patients auction their data — would have been easier to build. Not because the technology is simpler, but because it aligns with existing incentives. The market wants medical data. Building a pipeline to sell it is building with the current, not against it. Sicherlich, the real version, builds against it. It says: this data is not for sale, not for silent extraction, not for risk assessment without consent. It says: the patient decides. And the system, as currently constructed, has no slot for that sentence.
I still believe someone needs to build it. W. was right. The question is not whether it is possible — it is. The question is whether the builder needs to be an institution, a coalition, a political movement — or whether it can still be, as W. put it at that Christmas dinner, just someone who decides to do it.
I have not answered that question. But I have the blueprints. And "sicherlich" — the word means "certainly" in German, but with a softness, almost an irony — remains the right name for a project that is certain about its direction and uncertain about its timeline.
The problem has not gone away.
It never does.