HIPAA is one of those regulations that's simultaneously over-discussed and under-understood. Teams building healthcare software treat it as a binary — “are we HIPAA compliant?” — when it's really a continuous practice. There's no certification body that stamps your product “HIPAA compliant.” There are only the decisions you make and the audit trail that shows you made them thoughtfully.
This is what's actually relevant to engineers building products that handle Protected Health Information (PHI).
PHI is health information tied to an individual. The regulation lists 18 identifiers — name, date of birth, email address, IP address, and others — that, combined with health information, constitute PHI. Note that IP address is on that list. If your analytics pipeline captures IP addresses alongside any health-related event (a page view on a diagnosis page, a search for a medication), you may be processing PHI without realizing it.
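As a concrete sketch, here is what stripping direct identifiers from a health-related analytics event might look like. The field names (`ip`, `email`, `user_name`, `date_of_birth`) are hypothetical; adapt the set to your own event schema.

```python
# Identifier fields we never forward to the analytics pipeline when the
# event touches health-related content. These names are illustrative.
PHI_IDENTIFIER_FIELDS = {"ip", "email", "user_name", "date_of_birth"}

def scrub_event(event: dict) -> dict:
    """Return a copy of the event with direct identifiers removed."""
    return {k: v for k, v in event.items() if k not in PHI_IDENTIFIER_FIELDS}

event = {
    "type": "page_view",
    "path": "/conditions/type-2-diabetes",
    "ip": "203.0.113.7",
    "email": "pat@example.com",
}
clean = scrub_event(event)
```

Dropping the identifier is the blunt option; hashing or coarsening (e.g. truncating the IP to a /24) may preserve more analytical value, but then you have to argue that the result is no longer identifying.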
The most common mistake we see is teams scoping PHI too narrowly. “We don't store medical records, so we don't handle PHI” — but they do store appointment scheduling data with names and dates of service, which is PHI. Getting the scope right upfront determines your entire compliance architecture. Get it wrong and you'll be retrofitting encryption, access controls, and audit logging into systems that weren't built for them.
The HIPAA Security Rule covers electronic PHI (ePHI). It requires three categories of safeguards: administrative (risk analysis, workforce training, access management policies), physical (facility and workstation security), and technical (access controls, encryption, audit controls).
The regulation is deliberately non-prescriptive about how you implement these. It doesn't say “use AES-256” — it says you must protect the integrity and confidentiality of ePHI. This gives flexibility, but it also means you have to make and document your own security decisions. That documentation is what you hand to an auditor.
Encryption in transit is the easy part: TLS 1.2 or higher on every connection. Don't terminate TLS early inside your network unless you re-encrypt before PHI leaves the boundary. Check your internal service-to-service calls — this is where teams often have gaps.
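In Python, enforcing that floor takes one line on an `ssl` context; the same idea applies in whatever TLS library your stack uses:

```python
import ssl

# Build a client-side TLS context that refuses anything below TLS 1.2.
# create_default_context() already enables certificate and hostname checks.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

# Use the context for any outbound connection that might carry PHI, e.g.:
#   with socket.create_connection((host, 443)) as sock:
#       with context.wrap_socket(sock, server_hostname=host) as tls:
#           ...
```

The equivalent setting exists on most HTTP clients and reverse proxies; the point is to set it explicitly rather than trust library defaults, which vary by version.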
Encryption at rest is more nuanced. Database-level encryption (enabled by default on RDS and most managed databases) satisfies the letter of the requirement. But think carefully about where PHI surfaces outside the database: in log files, in backups, in analytics pipelines, in error tracking tools. Each of those surfaces needs to be evaluated. A Sentry error that includes a patient's name in the stack trace is a potential breach.
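One mitigation for the error-tracking case is a scrubbing hook that redacts likely-PHI fields before events leave your process. This is a minimal sketch; the key names are hypothetical, and Sentry's Python SDK accepts a hook of this shape via `sentry_sdk.init(before_send=...)`.

```python
# Keys whose values we redact before an event reaches the error tracker.
# The exact set depends on your domain model; these are examples.
SENSITIVE_KEYS = {"patient_name", "dob", "mrn", "email", "address"}

def scrub_phi(event, hint=None):
    """Recursively replace values of sensitive keys with a redaction marker."""
    if isinstance(event, dict):
        return {
            k: "[REDACTED]" if k in SENSITIVE_KEYS else scrub_phi(v, hint)
            for k, v in event.items()
        }
    if isinstance(event, list):
        return [scrub_phi(item, hint) for item in event]
    return event

event = {
    "message": "patient lookup failed",
    "extra": {"patient_name": "Jane Doe", "mrn": "12345"},
}
clean = scrub_phi(event)
```

Key-based redaction only catches PHI that lives in named fields; free-text values (a name interpolated into an exception message) need pattern-based scrubbing on top, which is harder and worth treating as best-effort defense, not a guarantee.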
We recommend a “follow the PHI” exercise early in every healthcare project: trace every path that PHI takes through your system, from ingestion to storage to retrieval to deletion. Every point on that path needs to be in your security model.
HIPAA requires audit controls — mechanisms to record and examine activity in systems that contain ePHI. At minimum, you need to log: who accessed PHI, what record they accessed, when, and from where (IP address). Failed access attempts matter too.
Practically, this means your PHI data layer needs to emit audit events, not just application logs. These are different things. Application logs tell you what your system did. Audit logs tell you who did what to which records. Build them into your data access layer from the start — retrofitting audit logging is painful and often incomplete.
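A minimal sketch of the distinction, using illustrative names (`AuditEvent`, `PatientRepo`): the repository emits a structured audit event on every read, successful or not, separately from whatever the application logger does.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AuditEvent:
    actor_id: str    # who accessed PHI
    action: str      # e.g. "read", "update", "delete"
    record_id: str   # which record
    source_ip: str   # from where
    timestamp: str
    success: bool    # failed attempts are recorded too

class PatientRepo:
    """Data access layer that emits an audit event on every access."""

    def __init__(self, db):
        self.db = db
        self.audit_log = []  # stand-in for an append-only audit store

    def get(self, actor_id: str, source_ip: str, record_id: str):
        record = self.db.get(record_id)
        event = AuditEvent(
            actor_id=actor_id,
            action="read",
            record_id=record_id,
            source_ip=source_ip,
            timestamp=datetime.now(timezone.utc).isoformat(),
            success=record is not None,
        )
        self.audit_log.append(json.dumps(asdict(event)))
        return record
```

In production the audit sink should be append-only and separate from application logs, so that compromising the app does not let an attacker rewrite the record of what was accessed.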
Retention: HIPAA doesn't specify a minimum retention period for audit logs specifically, but state laws often do (California requires 7 years for medical records). We recommend a default of 6 years for audit logs to be safe across jurisdictions.
Any vendor who handles PHI on your behalf needs a signed Business Associate Agreement (BAA). This is broader than most teams think. Your database provider, your blob storage provider, your email service (if you send PHI in emails), your error tracking tool (if it captures PHI), your analytics platform — all of them.
AWS, GCP, and Azure all offer BAAs on qualifying service tiers. Twilio, SendGrid, and Mailgun offer BAAs. Most observability platforms (Datadog, New Relic) will sign BAAs. Tools that won't sign a BAA cannot be used to process PHI, full stop, no matter how strong their security controls are.
Keep a vendor inventory with BAA status for every tool in your stack. This is one of the first things an auditor will ask for.
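The inventory can be as simple as a structured list you can query mechanically, so a missing BAA surfaces as a finding rather than a surprise. All vendor names below are placeholders:

```python
# Placeholder inventory; track the real BAA status for every tool you use.
vendors = [
    {"name": "ExampleCloud",        "handles_phi": True,  "baa_signed": True},
    {"name": "ExampleErrorTracker", "handles_phi": True,  "baa_signed": False},
    {"name": "ExampleCI",           "handles_phi": False, "baa_signed": False},
]

def baa_gaps(inventory):
    """Vendors that touch PHI without a signed BAA. Each one is a finding."""
    return [v["name"] for v in inventory if v["handles_phi"] and not v["baa_signed"]]

gaps = baa_gaps(vendors)
```

Running a check like this in CI, against a version-controlled inventory file, keeps the list current as a side effect of normal development rather than a quarterly scramble.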
If you have a breach — unauthorized access to or disclosure of PHI — you must notify affected individuals within 60 days of discovering it. Breaches affecting 500 or more individuals must also be reported to HHS within that same 60-day window, and if 500 or more residents of a single state are affected, you must notify prominent media in that state as well. Breaches affecting fewer than 500 individuals can be logged and reported to HHS annually.
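The deadline arithmetic is trivial, but encoding it removes ambiguity about when the clock starts (at discovery, not at the breach itself). A sketch:

```python
from datetime import date, timedelta

# The 60-day clock for individual notification starts at discovery.
INDIVIDUAL_NOTICE_DAYS = 60

def notification_deadline(discovered: date) -> date:
    """Latest date by which affected individuals must be notified."""
    return discovered + timedelta(days=INDIVIDUAL_NOTICE_DAYS)

deadline = notification_deadline(date(2024, 3, 1))
```

Note that 60 days is the outer limit; the rule expects notification "without unreasonable delay," so treat the deadline as a ceiling, not a target.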
Have your incident response playbook written before you need it. What constitutes a breach? Who makes that determination? Who gets notified internally? Who drafts the customer notification? How do you contact affected individuals if the breach involves your contact database? 2am, mid-incident, is not the time to figure this out.
HIPAA compliance is an operational practice, not a checkbox. The teams that do it well treat it as part of their engineering culture — security reviews are routine, the audit log is checked, the BAA inventory is current, and the incident response plan is rehearsed. The teams that struggle are the ones who think compliance is someone else's job until it suddenly isn't.
If you're building healthcare software and want to talk through your compliance architecture, we've been through this process enough times that we can usually spot the gaps quickly. Reach out.