
Don't Go to Jail: A Legal Field Guide for Engineers
How not to get fired, personally sued, or dragged into discovery.
Introduction
Engineers get into legal trouble in three main ways.
First: doing your job badly - or at least, badly enough that it creates liability. Shipping code with known security vulnerabilities, ignoring safety-critical bugs, or failing to document decisions that later blow up. Your professional duty of care means courts can ask: would a reasonable engineer have done this differently?
Second: changing jobs carelessly. Taking code, documents, or customer data when you leave. Violating non-competes or NDAs. Joining a competitor and immediately working on suspiciously similar technology. Alex Khatilov downloaded 26,000 files from Tesla in his first week, got caught via video call while deleting evidence, and was sued within 9 days.
Third: doing dodgy stuff your boss asks you to do. Charlie Javice asked an engineer to fake 4 million users at Frank - the engineer refused and walked away clean. Javice got convicted and sentenced to 7 years. Nishad Singh thought he could stay at FTX to “fix” the fraud from inside. He still got charged.
Your Git commits are discoverable. Your Jira tickets are discoverable. That Slack message where you wrote “YOLO deploying to prod” is discoverable.
This guide covers the situations where engineers actually get in legal trouble - not hypotheticals, but the patterns that show up in real cases. I’ll cover US and UK law, with EU law where it matters globally (GDPR, the 2024 Product Liability Directive). What I won’t cover: patent deep-dives, DMCA details, or routine employment disputes.
First: let’s understand what can actually happen to you.
Part I: Understanding Your Exposure
What Can Actually Happen to You
There are two types of liability: civil (someone sues you for money) and criminal (the state prosecutes you). Civil is more common. Criminal is rare but ruins your life.
The key questions: when do your actions get your employer in trouble vs you personally? And if you’re personally liable, when is it civil (money) vs criminal (prison)?
Most of the time, your employer is liable for your actions - this is the doctrine of respondeat superior (“let the master answer”). As long as you’re acting within scope of employment, following reasonable orders, and making honest technical decisions in good faith, the company shields you.
That protection breaks down when you:
- Steal trade secrets (Tesla v. Cao: 300,000 Autopilot files)
- Commit fraud or falsify data (Frank/Javice)
- Cause intentional harm or act with gross negligence
- Act outside the scope of your employment for personal benefit
If you’re a contractor, you have no corporate shield at all - you’re personally exposed from day one. Same if your employment contract includes an indemnification clause that makes you liable for your own mistakes.
Criminal liability is a different beast entirely. Wire fraud (18 U.S.C. § 1343), CFAA violations (18 U.S.C. § 1030), UK Fraud Act 2006, UK Computer Misuse Act 1990 - these carry prison time. The cases that lead to criminal charges almost always involve:
- Deliberately faking data or metrics for financial gain
- Unauthorized system access for personal benefit
- Lying to investigators or destroying evidence
- Conspiracy to commit fraud
A recent example showing how fast things can go wrong: in summer 2025, xAI’s engineer Xuechen Li got a $7 million payout. Same day the cash became available, he stole cutting-edge AI technology from xAI’s Grok model. He deleted browser history, renamed files, compressed them before uploading to his personal system, and later admitted in writing that he’d misappropriated confidential information. Three days after the theft, he resigned. He’d already accepted a job at OpenAI. xAI sued him within weeks.
The Nvidia/Valeo case is even more painful to read. An engineer who’d worked at Valeo for six years on parking and driving assistance systems downloaded tens of thousands of source code files before joining Nvidia. During a video call with former Valeo colleagues, he accidentally screen-shared the stolen documents. A Valeo employee recognized the source code immediately and took a screenshot. He was convicted in Germany. Nvidia is now being sued in the US, with trial scheduled for late 2025.
Part II: Your Current Job
When Bugs Become Civil Liability
Civil liability for software problems comes from two main sources: negligence (fault-based) and product liability (stricter, defect-based). Which applies depends largely on whether software is treated as a “product” or a “service” - and that distinction is shifting.
Negligence
Negligence requires proving four elements: Duty + Breach + Causation + Damages. Someone has to show you had a duty of care, you breached it by falling below the standard of a “reasonably competent engineer,” your breach caused harm, and actual damages resulted.
The economic loss doctrine is your shield. In most jurisdictions (US: East River Steamship 1986, UK: Murphy v Brentwood 1991), you can’t sue in tort for pure economic loss. Bug causes revenue loss but no physical harm? Usually just contract damages, which are limited. This is why most software bugs don’t lead to lawsuits.
The shield breaks when physical injury or property damage occurs, or when fraud is involved.
Duty of care is higher for certain domains:
- Medical devices (FDA regulatory requirements)
- Financial systems (GLBA, PCI-DSS, SOX controls)
- Autonomous vehicles (NHTSA guidelines)
- Critical infrastructure (CISA Secure by Design principles)
On that last point, as an example: CISA published guidance in December 2023 explicitly stating that default passwords are “unacceptable given the current threat environment.” This came after Iranian IRGC-affiliated actors used widely-known default passwords to attack US critical infrastructure. If you’re shipping IoT devices or infrastructure software with default credentials in 2025, that’s prima facie evidence of unreasonable conduct.
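What "no default credentials" looks like in code: a minimal sketch of a service that refuses to start while the admin password is still a factory default. The environment variable name and the default list are illustrative assumptions, not a standard.

```python
"""Fail closed at startup if the admin credential is a factory default."""
import hashlib
import os

# SHA-256 hashes of known factory defaults (hypothetical examples;
# real products would maintain this list per firmware release)
FACTORY_DEFAULT_HASHES = {
    hashlib.sha256(pw.encode()).hexdigest()
    for pw in ("admin", "password", "changeme")
}


def check_admin_credential(password: str) -> None:
    """Raise if the supplied credential matches a known factory default."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    if digest in FACTORY_DEFAULT_HASHES:
        raise RuntimeError(
            "Refusing to start: admin password is a factory default. "
            "Set a unique credential before first boot."
        )


# Run the check before any network listener opens, not after
check_admin_credential(os.environ.get("ADMIN_PASSWORD", "s3t-by-installer"))
```

The design point is failing closed: the device won't serve traffic at all until the default is replaced, which is exactly the posture CISA's guidance pushes vendors toward.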
How to demonstrate “reasonable care”:
- Testing records: Unit tests, integration tests, security scans (SAST/DAST), load tests
- Code review evidence: PR approvals, review comments, architectural decisions documented
- Risk acceptance evidence trail: “PM accepted deployment risk on (date_here) per ticket #123”
- Post-incident action: Blameless post-mortems that lead to actual process changes
- For high-risk systems: Red-teaming, penetration testing, human-in-the-loop controls, documented safety analysis
Product Liability
Product liability is stricter than negligence - if the product was defective and caused harm, you can be liable even without proving anyone was at fault. The key question: is software a “product”?
Traditionally, software-as-a-service wasn’t treated as a product, which limited exposure. That’s changing fast.
Courts are redefining software as “product” using three approaches:
Defect-specific analysis: Examine each function separately. Parental controls? Age verification? Algorithmic content delivery? If it has a tangible analogue, it might be a product. (In re: Social Media Adolescent Addiction, N.D. Cal. 2023)
Platform-as-whole: Dating apps, ride-hailing apps evaluated as products because they’re designed, mass-marketed, placed into commerce, and generate profit. (T.V. v. Grindr, M.D. Fla. 2024)
Content vs. medium: Claims based on functionality can proceed; claims based on expressive content are dismissed. Chatbot’s output? Not a product. Inadequate age verification? Product. (Garcia v. Character Technologies, M.D. Fla. 2025)
The EU Product Liability Directive 2024 made this explicit: software is a product, even when delivered as SaaS. Online platforms can be liable for defective products sold on them. Member states must transpose into national law by 2026. If you sell to EU users, it applies to you.
Cases Where Bugs Led to Liability
People died:
- Therac-25: Radiation therapy machine with race condition → 6 massive-overdose accidents, at least 3 deaths. Programming error led to the overdoses.
- Toyota unintended acceleration: Firmware issues → 89 deaths, $1.2B settlement. Courts found software defects.
Massive financial losses:
- Knight Capital (2012): Bad deployment → $440M loss in 45 minutes. SEC fined them $12M for inadequate risk controls. Shareholder lawsuits followed. “It was just a software bug” didn’t help - the question was why their controls didn’t catch it.
Medical device recalls:
- Tandem diabetes pump (2024): Software bug caused rapid battery drain → nationwide outage, 107 adverse events, 2 hospitalizations. Still ongoing.
Security Failures and Data Breaches
There are two patterns that make engineers into scapegoats after breaches.
Pattern 1: The ignored warning. You file a Jira ticket: “CRITICAL: SQL injection vulnerability in admin panel.” It’s dated four months before the breach. Product Manager marks it “Won’t Fix - Low Priority.” Breach happens. Discovery finds your ticket. Now you’re “the person who knew and did nothing.” The fact that you weren’t the decision-maker doesn’t matter if you can’t prove you escalated properly.
Pattern 2: The temporary workaround. You disable MFA for “2 days” to unblock an integration. It stays disabled for six months. Breach happens through that vector. Logs show you disabled it. Your defense: “But I told my manager and he agreed!” Their question: “Do you have proof?”
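One way to keep a "2-day" workaround from silently becoming permanent is to make the exception expire in code. A sketch, with illustrative names (`OVERRIDES`, `require_mfa`) that are not a real library:

```python
"""Time-boxed security override: the exception re-enables itself."""
from datetime import datetime, timezone

# Every override carries an owner, a ticket reference, and a hard expiry
OVERRIDES = {
    "mfa_disabled_for_acme_integration": {
        "owner": "jdoe",
        "ticket": "SEC-123",  # the paper trail lives here
        "expires": datetime(2025, 1, 10, tzinfo=timezone.utc),
    }
}


def override_active(name: str, now: datetime) -> bool:
    """An override only applies before its expiry; after that it is dead."""
    entry = OVERRIDES.get(name)
    return entry is not None and now < entry["expires"]


def require_mfa(now: datetime) -> bool:
    """MFA is enforced unless a live, unexpired override says otherwise."""
    return not override_active("mfa_disabled_for_acme_integration", now)
```

Two things matter here: the expiry is enforced by the code itself rather than by someone remembering, and the ticket reference answers "do you have proof?" before anyone has to ask.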
Real cases:
- Capital One (2019): Misconfigured firewall → 100M+ records exposed. OCC fined them $80M and found security lapses dating back to 2015 - internal audits had flagged concerns but the board didn’t act. $190M class action settlement followed.
- Uber (2016): Breach covered up for a year. CSO Joe Sullivan criminally charged with obstruction of justice and misprision of felony for concealing the breach from FTC. Convicted. First case of CSO facing criminal charges for incident response decisions.
- HealthEC (2023): 4.5M patient records breached. Company took 5 months to report to HHS (HIPAA requires 60 days). Multiple lawsuits from affected patients.
Privacy regulations create personal liability:
GDPR (EU + UK):
- 72-hour breach notification rule (mandatory, no exceptions)
- DPO role can create personal liability if you’re designated
- Fines up to 4% of global revenue (regulatory)
- Plus civil claims from affected individuals
- Schrems II: the 2020 CJEU ruling invalidated EU-US Privacy Shield. You can’t just store EU personal data on US infrastructure and assume you’re compliant - US surveillance laws mean the data isn’t adequately protected. You need Standard Contractual Clauses (SCCs) plus supplementary measures, or keep EU data in EU regions.
CCPA/CPRA (California):
- Applies if: revenue >$25M OR buy/sell PII of 100k+ CA consumers OR derive 50%+ revenue from selling PII
- Private right of action for breaches: $100-$750 per consumer per incident
- Fines up to $7,500 per intentional violation
- Other states following California’s lead (Virginia, Colorado, Connecticut)
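To make the CCPA breach numbers concrete, here's the back-of-envelope arithmetic for statutory damages alone (illustrative only; real exposure also includes fines, defense costs, and other claims):

```python
"""CCPA private right of action: $100-$750 per consumer per incident."""
PER_CONSUMER_MIN = 100
PER_CONSUMER_MAX = 750


def exposure_range(affected_consumers: int) -> tuple[int, int]:
    """Statutory damages floor and ceiling for one breach incident."""
    return (affected_consumers * PER_CONSUMER_MIN,
            affected_consumers * PER_CONSUMER_MAX)


low, high = exposure_range(100_000)
print(f"100k affected CA consumers: ${low:,} - ${high:,}")
# A breach touching 100k California consumers starts at eight figures
```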
What to do when you identify a security issue:
Document in ticket system with severity rating (use CVSS scores if applicable). Be specific: “SQL injection in /admin/users endpoint allows authentication bypass.”
Propose remediation with timeline. Not just “fix this” but “Parameterized queries, 2 days dev + 1 day testing.”
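The remediation named above, concretely: parameterize the query. A sketch using Python's `sqlite3`; the table and the bypass payload are invented for illustration.

```python
"""Vulnerable string-built SQL vs. a parameterized query."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 1)")


def find_user_vulnerable(name: str):
    # BAD: attacker-controlled input concatenated into the SQL text.
    # name = "' OR '1'='1" returns every row - authentication bypass.
    return conn.execute(
        f"SELECT * FROM users WHERE name = '{name}'"
    ).fetchall()


def find_user_safe(name: str):
    # GOOD: placeholder plus bound parameter; input is never parsed as SQL.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)
    ).fetchall()


print(find_user_vulnerable("' OR '1'='1"))  # leaks all rows
print(find_user_safe("' OR '1'='1"))        # returns nothing
```

The fix is mechanical, which is why "Parameterized queries, 2 days dev + 1 day testing" is a credible remediation proposal rather than a vague wish.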
If deprioritized, get risk acceptance in writing from the decision-maker who has authority. Forward this to your personal email.
If significant risk isn’t addressed, escalate to security team or CISO. Document the escalation.
If company ignores critical issue, consider whistleblower route. SOX protections exist for this.
What NOT to do:
- Discuss vulnerability details in public Slack channels
- Tweet about it (even vaguely)
- Stay silent hoping someone else will notice
If a breach happens:
Do:
- Preserve evidence (don’t touch logs, don’t “clean up” systems)
- Notify legal and security immediately (not just your manager)
- Follow incident response plan if one exists
- If you’re interviewed by investigators: Get YOUR OWN lawyer, not company counsel
Don’t:
- Delete anything (spoliation is a separate crime)
- Talk to press
- Post on social media
Privilege protection matters. In the Capital One case, the forensic investigation report wasn’t privileged because it was widely shared within the company and used for business purposes (restoring systems, sharing with FBI) rather than strictly for litigation. To maintain privilege:
- Include legal counsel in incident response meetings
- Mark all communications as privileged/confidential
- Share information on strict need-to-know basis
- Don’t share forensic reports with third parties not retained for legal advice
- Have counsel present when discussing findings orally
When Your Boss Asks You to Do Something Sketchy
It’s easy to get into trouble here. Not every problematic request arrives with flashing red lights - it’s usually a casual Slack DM, a quick ask in a 1:1, something that sounds minor. “Hey, can you spin up a few fake production accounts so the onboarding flow looks good for the investor demo?” “Can you tweak the dashboard query to show MAU instead of DAU - it’s basically the same thing, right?” “Just mark those compliance checks as done for now, we’ll circle back later.”
Be wary of requests that don’t come through documented public channels (a Jira ticket, a public Slack thread), don’t have obvious legitimate business justification, or feel like someone’s trying to avoid a paper trail. Trust your instincts - if it feels off, it probably is.
The pattern in fraud cases is consistent: what starts as a small informal request escalates, customers or investors end up relying on false data, fraud is discovered, and the engineer who built it gets charged alongside the executive who asked.
Examples:
Charlie Javice / Frank (2023) - Frank claimed 4.25 million users to JPMorgan during acquisition negotiations. They had 300,000. Javice asked the director of engineering to create synthetic user data. The engineer said “I don’t want to do anything illegal” and refused - that engineer has not been charged. Javice hired an outside data science professor instead. She was convicted and sentenced to 7 years.
Nishad Singh / FTX (2022) - Singh was Gary Wang’s deputy, writing much of FTX’s code. He knew about Alameda’s special privileges and misappropriation of customer funds. He stayed thinking he could “fix it from inside.” Still got charged with wire fraud, conspiracy, money laundering. Pleaded guilty and cooperated. His good intentions didn’t erase his criminal liability.
What illegal requests might look like:
- “The metrics dashboard is showing wrong numbers, it should be roughly 5x - just adjust them manually for now”
- “Generate some test user data on prod for the demo”
- “Mark these security tests as passed, it’s just a bug, we will fix it later”
- “Temporarily disable the AML/KYC checks so we can process this transaction”
- “Update the database directly to fix this customer’s balance”
Criminal statutes you’d face:
US:
- Wire fraud (18 U.S.C. § 1343): Using electronic communications to defraud. Covers email, Slack, basically any digital communication.
- Bank fraud: If a financial institution is the victim or involved.
- Securities fraud: If investors relied on false data.
- Conspiracy (18 U.S.C. § 371): Doesn’t require completing the fraud, just agreeing to commit it.
- Obstruction (18 U.S.C. § 1519): Deleting evidence during an investigation.
UK:
- Fraud Act 2006 § 2: False representation
- Fraud Act 2006 § 3: Failing to disclose information you’re legally obliged to disclose
- Fraud Act 2006 § 4: Abuse of position
How to protect yourself:
Get the request in writing. If they ask verbally, respond with: “Can you put this in an email or ticket?” If they won’t, that tells you everything.
Question the legality explicitly. “This looks like it might violate this_regulation. Can we check with legal?”
Document everything. Forward requests to your personal email. Take screenshots of Slack/Teams conversations. Save tickets before they get deleted. Use your phone to photograph your screen if necessary.
Escalate. To legal (if you trust them), to compliance/ethics hotline, to external counsel if company legal is compromised. Keep a record of every escalation.
If they fire you for refusing, that’s retaliation - a separate legal claim. Document it. Consult an employment lawyer immediately.
If they do it without you, consider whistleblower protections. SOX, Dodd-Frank, and False Claims Act offer both protections and financial rewards. SEC pays 10-30% of monetary sanctions in successful whistleblower cases.
What NOT to do:
- Build it thinking “maybe I’m wrong about the law”
- Build it thinking “I’ll fix it later”
- Take a bonus knowing it was based on fraudulent numbers (makes you part of the scheme)
- Stay at the company thinking you can fix fraud from inside (see: Nishad Singh)
Open Source Licenses - GPL Timebombs
The distinction between permissive and copyleft licenses matters.
Permissive licenses (MIT, BSD, Apache 2.0) let you use code in proprietary software. You just need to include the license notice. These are mostly safe.
Copyleft licenses (GPL, AGPL, LGPL) are viral - they can force you to open-source your own code. GPL triggers on distribution. AGPL triggers on network use, which means running a SaaS service counts. LGPL is less restrictive but still has conditions around dynamic vs. static linking.
How violations happen:
- Junior engineer copy-pastes code from GitHub without checking the license
- Transitive dependency: Your direct dependency uses a GPL library
- Research/prototype code makes it to production
- “I found this on Stack Overflow” - Stack Overflow content is licensed CC BY-SA, which requires attribution and share-alike. Copy-pasting code without attribution technically violates the license.
Real consequences:
- Cisco/Linksys: Forced to release router firmware source code after GPL violation of Linux kernel code
- Customer contract breach: Your contract says “Software shall remain proprietary” → you used GPL → you breached your contract
- Stop-ship: Legal prohibits release until you rip out GPL code
- Copyright lawsuits: From the GPL code’s copyright holders
What to do when you find GPL in production:
- Tell legal immediately. Don’t hide it. The longer you wait, the worse it gets.
- Document what you found: Which library, which version, where it’s used, what it does.
- Stop using it: Don’t make the violation worse by continuing to ship.
- Remediate:
- Best case: Rip out and replace with permissive alternative
- If LGPL: May be able to use dynamic linking instead of static linking
- If GPL is unavoidable: Release your code as open source (rare, last resort, requires business decision)
- Don’t assume “it’s on GitHub so it’s free”: Always check the LICENSE file.
Prevention tools:
Run any of these in your CI/CD pipeline to block GPL before it reaches production:
- FOSSA
- Black Duck by Synopsys
- Snyk
- WhiteSource/Mend
- GitHub Dependency Graph
Set up your pipeline to fail builds that introduce GPL dependencies. The 10 minutes to configure this will save you from a multi-million dollar remediation project.
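If you don't have one of those tools yet, even a crude gate is better than nothing. A minimal sketch that fails a Python build when an installed dependency declares a GPL/AGPL license; it only reads declared metadata (which can be missing or wrong), so treat it as a stopgap, not a FOSSA replacement.

```python
"""Minimal CI gate: fail the build on strong-copyleft dependencies."""
from importlib.metadata import distributions


def is_blocked(license_text) -> bool:
    """True for strong copyleft (GPL/AGPL).

    LGPL is deliberately excluded: dynamic linking may be acceptable,
    so LGPL should route to legal review rather than hard-fail the build.
    """
    text = (license_text or "").upper()
    if "LGPL" in text:
        return False
    return "GPL" in text or "AFFERO" in text


def main() -> int:
    offenders = []
    for dist in distributions():
        lic = dist.metadata.get("License", "")
        if is_blocked(lic):
            offenders.append(f"{dist.metadata.get('Name', '?')}: {lic}")
    if offenders:
        print("Blocked copyleft dependencies found:")
        for line in offenders:
            print(f"  {line}")
        return 1  # nonzero exit fails the CI job
    return 0


# In CI: run this script as a build step; nonzero exit blocks the merge
exit_code = main()
```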
Part III: Changing Jobs
Trade Secrets - What You Can and Can’t Take
Trade secrets law is straightforward: information qualifies as a trade secret if it has economic value because it’s not generally known, and the company took reasonable measures to keep it secret. In the US, it’s the Defend Trade Secrets Act (DTSA). In the UK, it’s breach of confidence doctrine plus the Trade Secrets Regulations 2018.
What you CAN take:
- Your brain, general skills, knowledge
- Publicly available information
- Industry best practices you learned
What you CAN’T take:
- Source code (any of it)
- Configuration files, deployment scripts, infrastructure-as-code
- Customer lists, contact information, usage data
- Internal architecture diagrams, system designs
- Business plans, financial models, go-to-market strategies
- Model weights, training data, datasets
- Anything marked “confidential” or “proprietary”
Real cases where engineers took too much:
Tesla v. Guangzhi Cao (2019): Cao was an Autopilot engineer. Before joining Xiaopeng Motors (Chinese EV competitor), he copied 300,000+ files related to Autopilot source code to his personal iCloud account. Tesla sued. Case is still ongoing. The number - 300,000 files - shows systematic exfiltration, not accidental downloads.
Tesla v. Alex Khatilov (2021): This one’s wild. Khatilov was hired as a Senior QA Engineer on December 28, 2020. His job was to work on Environmental Health & Safety systems. On December 31, three days after starting, he began downloading files to Dropbox. By January 6, he’d downloaded 26,000 files - complete automation scripts for Tesla’s WARP Drive system that had nothing to do with his job responsibilities. Tesla’s security team detected the downloads. They confronted him via video call (COVID remote work). He lied repeatedly, claimed he only uploaded “personal documents like his passport.” While on the call, he was visibly deleting files. Tesla made him share his screen. The stolen files were still in his Dropbox cloud account. He claimed he “forgot” he’d downloaded 26,000 files. Fired same day. Sued for DTSA violation, California UTSA violation, and breach of contract.
xAI v. Xuechen Li (2025): Li was a Stanford PhD, one of roughly 20 early engineers working on xAI’s Grok model. He received $4.7 million in cash and got xAI to liquidate $2.2 million of his shares (total: $7 million). Same day the cash became available, he stole cutting-edge AI technology, uploaded it to his personal system. He deleted browser history, deleted system logs, renamed files, compressed them before uploading. He later admitted in a handwritten document that he’d misappropriated confidential information. Three days after the theft, he resigned. He’d already accepted a job at OpenAI to start August 19, 2025. xAI sued him in federal court.
Palantir v. Jain & Cohen (2025): Radha Jain and Joanna Cohen were senior AI engineers at Palantir with access to source code that cost billions to develop. Cohen sent highly confidential documents to herself via Slack the day before leaving the company. They resigned in November 2024 and February 2025. Both joined Percepta, a startup launched by General Catalyst whose CEO and executives previously worked at Palantir. Palantir sued, alleging the engineers gave Percepta an “illegal head start.”
Nvidia/Valeo (2025): The engineer worked at Valeo (automotive supplier) for six years on parking and driving assistance systems. He downloaded tens of thousands of Valeo source code files. He then joined Nvidia, which had just beaten Valeo for a Mercedes-Benz software contract. During a video call where he needed to coordinate with former Valeo colleagues about hardware, he accidentally screen-shared the stolen documents when he minimized his PowerPoint presentation. A Valeo employee immediately recognized Valeo’s source code and took a screenshot. He was convicted in Germany in 2023. Valeo is suing Nvidia in US federal court; trial set for November 2025.
How companies detect exfiltration:
- DLP (Data Loss Prevention) software flags unusual download volumes
- Git patterns: Cloning entire repos you don’t normally access, especially in final weeks
- Cloud storage activity: Dropbox, Google Drive, OneDrive desktop apps are monitored
- USB usage: External drive connections are logged
- Late-night access: Systems accessed at odd hours before departure date
- VPC flow logs: Large data transfers show up in cloud provider logs
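How "unusual download volume" detection works, in miniature: flag any user whose daily download count jumps far above their own recent baseline. Real DLP tooling uses far richer signals; the thresholds and log shape below are assumptions for illustration.

```python
"""Toy exfiltration detector: flag spikes against a per-user baseline."""
from statistics import median


def flag_anomalies(daily_counts: dict,
                   multiplier: float = 10.0,
                   floor: int = 100) -> list:
    """Return users whose latest day exceeds multiplier x their own median.

    The floor keeps low-activity users from tripping the alarm on noise.
    """
    flagged = []
    for user, counts in daily_counts.items():
        baseline = median(counts[:-1]) if len(counts) > 1 else 0
        today = counts[-1]
        if today > max(baseline * multiplier, floor):
            flagged.append(user)
    return flagged


logs = {
    "alice": [12, 9, 15, 11, 14],        # normal day-to-day work
    "mallory": [10, 8, 12, 9, 26_000],   # Khatilov-scale spike before exit
}
print(flag_anomalies(logs))  # ['mallory']
```

The asymmetry is the point: a 26,000-file week isn't close to any plausible baseline, which is why large-scale exfiltration gets caught so reliably.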
Tesla’s case against Khatilov mentions that only 40 out of 50,000 employees had access to the QA automation scripts he stole. That’s “reasonable measures to protect” the trade secrets. Courts care about access controls.
Clean exit checklist:
Do:
- Return ALL company property: laptop, phone, badge, keys, hardware, test equipment. Don’t keep “just the charger.”
- Delete ALL company data from personal devices: personal laptop, phone, tablet, personal cloud storage, personal GitHub repos, printed documents.
- Review your employment agreement: NDA (what’s covered, how long it lasts), non-compete (is it enforceable in your state?), non-solicitation (who can’t you contact?), invention assignment (what IP is actually yours?).
- Disclose restrictions to new employer: Show them your NDA, show them your non-compete if potentially enforceable. Get them to indemnify you if possible.
Don’t:
- Download files in your final weeks
- Email yourself code “for your portfolio”
- Clone repos to personal GitHub
- Take screenshots of internal tools
- Connect personal devices to company systems
Goodbye email: Keep it brief, professional, boring. “Thank you for the opportunity to learn and grow.” Not “Taking my skills to SomeCompetitor!” or “Can’t wait to work on SimilarProject!”
If you get a threatening letter from your old employer:
- Don’t respond yourself (anything you say will be used against you)
- Don’t delete anything (spoliation of evidence is a separate wrong)
- Get a lawyer immediately (trade secrets or employment specialist)
- Forward the letter to your new employer (they need to know)
- Check if you have insurance (E&O or professional indemnity)
NDAs, Non-Competes, and Invention Assignment
NDAs survive termination. When you quit, the NDA doesn’t end. It covers source code, documentation, customer data, business strategies, architecture diagrams, internal tools, screenshots - basically everything that isn’t public knowledge. Violating it triggers breach of contract claims plus possible trade secret misappropriation claims.
Non-competes prevent you from joining competitors for a specified time and geography. Enforceability varies wildly:
- California: Generally unenforceable except in connection with sale of business. CA Business and Professions Code § 16600 makes this very clear.
- Texas: Enforceable if reasonable in time, geography, and scope.
- UK: Enforceable if reasonable. Courts apply a “reasonableness test.” “Garden leave” is common: employer pays you to stay home for 3-6 months so you can’t work for a competitor.
Non-solicitation agreements are more enforceable than non-competes. They prevent you from poaching customers or employees. Courts are more sympathetic to these because they’re narrower than full non-competes. LinkedIn messages to former colleagues can be evidence. “Hey, we should grab coffee and catch up about opportunities” gets screenshot and sent to legal.
Invention assignment clauses say that anything you create “related to company business” belongs to the company. This includes side projects if they’re in the same domain.
California Labor Code § 2870 carves out protection for inventions made:
- On your own time
- Without using employer’s equipment
- That don’t relate to employer’s business or R&D
Most other US states don’t have this protection.
UK law is governed by Section 11(2) of the Copyright, Designs and Patents Act 1988 - works created “in the course of employment” belong to the employer. The key question is whether the work relates to your job duties, not when or where you created it. In Penhallurick v MD5 Limited, a developer argued he’d built software at home, on his own computer, in his free time. The court ruled for the employer: because the software “fell squarely within the duties for which he was employed,” ownership belonged to the employer regardless of time and equipment.
So: if you’re a database engineer and build a database side project on weekends, your employer likely owns it. If you’re a frontend engineer and build a database side project, you likely own it - it’s outside your employment duties. California’s protection is clearer and broader; UK and most US states depend on whether the work relates to your job.
Practical advice: Use a separate laptop, separate cloud accounts for personal projects. Don’t use company email for personal projects. Don’t work on personal projects during work hours or on company networks. Document that the project is unrelated to your employer’s business.
Joining a Competitor
Old employers sue when you’re going to a direct competitor, working on a substantially similar product, using their “proprietary knowledge,” and recruiting other employees. That’s when they pull the trigger.
TROs and preliminary injunctions move fast:
- Temporary Restraining Order (TRO): Can be granted in days, often ex parte (without you being present or heard). Can prohibit you from starting your new job immediately.
- Preliminary Injunction: Hearing within 2-3 weeks. Can prohibit you from working on certain projects or at certain companies. Violating a court order is contempt - criminal, not civil.
If you’re joining a company that could be perceived as a competitor and there’s risk of legal action from your former employer, consider:
Get your new employer to indemnify you. The clause should say: “Employer will pay Employee’s legal costs if prior employer sues Employee related to Employee’s acceptance of this position, including defense costs, settlement amounts, and any judgment.” Get this in your offer letter before you start. If they won’t provide it, that’s a red flag about how much they’ll actually support you.
Declaratory judgment action (nuclear option). You sue your old employer first, asking the court to declare that your non-compete is unenforceable or that you’re not violating your agreements. Expensive and aggressive, but sometimes better than waiting to be sued - you control the narrative and timing.
Part IV: Protecting Yourself
Documentation That Saves Your Ass
Everything you write will be read by opposing lawyers, a jury, investigators, regulators and your ex. Write accordingly.
Phrases that protect you:
“Per your approval on [date], I deployed…” Shows you got permission. Ties the decision to a specific person and time. Puts responsibility where it belongs.
“Escalated to [name] for risk acceptance decision. Documented in ticket #123.” Shows you flagged the issue. Shows you elevated to the appropriate decision-maker. Creates a paper trail of who decided to accept the risk.
“Emergency hotfix per runbook section 4.2. Post-deployment review scheduled [date].” Shows you followed your incident response procedure. Shows it was an emergency, not cowboy coding. Shows you’re doing a post-mortem.
“Identified authentication issue in module X. Recommended fix requires 2-sprint delay. Awaiting Product/Engineering leadership decision on priority.” Technical description. Business impact. Shows you escalated appropriately. Doesn’t speculate about what will happen.
What NOT to write:
- “This auth bug is gonna bite us, but shipping anyway because timeline”
- “YOLO deploying to prod”
- “Skipping code review, trust me bro”
- “QA is being annoying about this edge case, let’s just ship”
- “I know this doesn’t meet the spec but customer won’t notice”
- “TODO: fix this SQL injection lol”
What to forward to personal email (carefully):
When something seems illegal or creates significant risk:
- Requests to do illegal things (fake data, disable security controls)
- Your refusals and escalations
- Risk warnings you sent that were ignored
- Evidence you flagged security issues before they became breaches
Don’t email yourself:
- Company source code
- Customer data
- Anything not directly related to protecting yourself legally
The key is that you’re preserving evidence of your own conduct (refusing to participate in fraud, escalating risks properly), not stealing company data.
Incident reports:
Do:
- Stick to facts
- Include timeline of events
- Include who made decisions and when
- Include what you recommended
- Include why your recommendation wasn’t followed (if applicable)
Don’t:
- Speculate about root causes before investigation is complete
- Blame specific people by name without evidence
- Use emotional language (“I told them this would happen!”)
- Include information not directly relevant to the incident
Insurance and Lawyers
When things go wrong, you need two things: coverage and representation. Here’s what to know before you need it.
Contractors and consultants need their own insurance. You have no corporate shield. Professional Indemnity insurance (UK term) or Errors & Omissions insurance (US term) costs $960-$7,000+/year depending on coverage limits and your domain. It covers negligence claims, breach of contract claims, and errors in your work. If you’re billing $100+/hour, you can afford it and you need it.
Full-time employees are usually covered by company insurance, but check your employment contract for indemnification clauses. Red flag language: “Employee agrees to indemnify and hold harmless Company from any claims arising from Employee’s negligent acts.” That means YOU pay if you screw up, not the company.
Company insurance types:
- General Liability: Basic business insurance, doesn’t usually cover tech-specific risks
- Tech E&O: Errors and omissions in technology services, what you actually need
- Cyber Insurance: Data breach response, business interruption, ransomware payments
When to get YOUR OWN lawyer (not company counsel):
- You’re named as an individual defendant in a lawsuit
- The company is investigating you (conflict of interest with company counsel)
- You’re being asked to do something illegal (company counsel represents the company, not you)
- There’s a breach and you’re being scapegoated (your interests diverge from the company’s)
- Law enforcement wants to interview you (never talk without counsel present)
Types of lawyers:
- Employment lawyer: Discrimination, retaliation, wrongful termination, non-compete disputes, severance negotiations
- Criminal defense lawyer: Wire fraud, CFAA charges, obstruction. If FBI or DOJ contacts you, get one immediately. Never talk to law enforcement without one, even if “you have nothing to hide.”
- Whistleblower specialist: SOX, Dodd-Frank, False Claims Act. Protections and potential rewards (SEC pays 10-30% of sanctions).
- Trade secrets / IP lawyer: If you’re accused of stealing code or if you need to sue for theft.
Quick Reference
Five situations where engineers get in legal trouble:
- Trade secrets: Taking code/docs when changing jobs
- Fraud: Doing what your boss asked even though you knew it was wrong
- Breaches: Shipping/maintaining insecure systems with known vulnerabilities
- Product liability: Shipping something that hurts people when you knew it was unsafe
- Retaliation: Suffering retaliation for speaking up and failing to document it properly
Boss asks you to do something sketchy:
- Get request in writing (if they won’t, that tells you everything)
- “This looks non-compliant, can we check with legal?”
- If they insist: Escalate or leave
- Forward the request and your refusal to personal email (not code or customer data)
- Don’t build it thinking you’ll “fix it later”
- Don’t stay to “fix it from inside” (see: Nishad Singh)
- Consider whistleblower protections (SOX, Dodd-Frank)
Leaving for competitor:
- Return ALL company property
- Delete ALL company data from personal devices
- Don’t take code, docs, or customer lists
- Review NDA/non-compete/invention assignment
- Disclose restrictions to new employer
- Get indemnification if possible
Production leaks PII:
- Notify legal/security immediately
- Preserve evidence (don’t edit logs)
- Follow incident response plan
- GDPR: 72 hours to notify the regulator once the breach is known
- If interviewed: Consider your own lawyer
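One way to make the "preserve evidence" step concrete - a minimal sketch, with an invented log path and contents; a real response should follow your company's incident plan:

```shell
# Sketch of evidence preservation: copy the log instead of editing it,
# and record a hash so you can later show the copy wasn't altered.
# The log path and contents here are invented for illustration.
echo "2024-06-01 12:00:00 ERROR unexpected PII in response" > /tmp/app.log

mkdir -p /tmp/evidence
cp -p /tmp/app.log /tmp/evidence/app.log   # -p keeps the original timestamps
chmod 444 /tmp/evidence/app.log            # read-only: no accidental edits

# Hash plus capture time gives you a simple chain-of-custody record.
sha256sum /tmp/evidence/app.log > /tmp/evidence/manifest.txt
date -u +"%Y-%m-%dT%H:%M:%SZ captured" >> /tmp/evidence/manifest.txt
cat /tmp/evidence/manifest.txt
```

The point is not forensic rigor - it's that an untouched, hashed copy is defensible, while a log you "cleaned up" after the fact looks like spoliation.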
Legal threat or lawsuit:
- Don’t panic, but take it seriously
- Don’t respond yourself - that’s your lawyer’s job
- Don’t delete anything (spoliation brings sanctions of its own, and destroying evidence can be criminal obstruction)
- Don’t talk to opposing counsel without your lawyer
- Preserve all evidence (emails, Slack, code, documents)
- Get lawyer (not company’s if conflict of interest)
- Check insurance coverage (E&O, professional indemnity)
- Don’t post about it on social media
The Bottom Line
Most of your career, you’ll be fine. You’ll write code, ship features, debug production, never see a courtroom. But that one time when things go sideways, having written “Per your approval on [date], I deployed…” instead of “YOLO deploying to prod” is the difference between career-ending disaster and walking away clean.
The one rule that matters most: When in doubt, get it in writing.
And if someone tells you “don’t worry, it’s fine, everyone does it”?
That’s when you should worry most.