Three public programs
Coordinated disclosure · names withheld per program rules
Coordinated-disclosure findings against three public programs
Found, triaged, and disclosed a small set of vulnerabilities against public bug-bounty programs.
Anonymized real bug-bounty work: a small set of coordinated-disclosure findings filed against three public programs over two weeks. Findings ranged from auth-flow IDORs to misconfigured S3 buckets. All resolved through proper disclosure channels with researcher credit (under handles, by request).
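A misconfigured S3 bucket of the kind mentioned above is confirmed by the cheapest possible test: an unauthenticated GET against the bucket's virtual-hosted URL. The sketch below shows the shape of that check; the bucket name is hypothetical and the `isAnonymouslyListable` helper is illustrative, not a tool from the actual engagement.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

// isAnonymouslyListable reports whether an unauthenticated GET on the
// bucket URL returned an S3 listing document. A 200 response containing
// a <ListBucketResult> element is the minimal proof that the bucket
// allows anonymous ListBucket.
func isAnonymouslyListable(status int, body string) bool {
	return status == http.StatusOK && strings.Contains(body, "<ListBucketResult")
}

func main() {
	// Hypothetical bucket name for illustration only.
	resp, err := http.Get("https://example-assets.s3.amazonaws.com/")
	if err != nil {
		fmt.Println("request failed:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println("anonymous listing:", isAnonymouslyListable(resp.StatusCode, string(body)))
}
```

A properly locked-down bucket returns 403 with an `AccessDenied` error body, which the helper correctly treats as not listable.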
CRIT
4
HIGH
6
MED
16
LOW
8
- Industry
- Cybersecurity / Bug Bounty
- Timeline
- 2 weeks
- Team
- 1
- Service
- Cybersecurity
- Project tier
- Bug bounty — anonymized findings
The Problem
What was broken.
Small bug-bounty research session targeting three public programs. The aim was to surface real, reproducible findings with clean writeups — not to inflate any reputation metric.
Our Approach
How we framed it.
Standard recon → triage → reproduction → writeup loop. Every finding documented with the smallest reproducer that demonstrates impact, with a suggested fix.
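The "smallest reproducer" standard can be made concrete. For an auth-flow IDOR, the whole proof is two requests: fetch a resource with the owning account's token, then with a second account's token. The endpoint, IDs, and tokens below are hypothetical placeholders; the point is the shape of the check, not any real target.

```go
package main

import (
	"fmt"
	"net/http"
)

// fetchStatus requests a resource as a given account and returns the
// HTTP status code. URL and bearer token are placeholders.
func fetchStatus(client *http.Client, url, token string) (int, error) {
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return 0, err
	}
	req.Header.Set("Authorization", "Bearer "+token)
	resp, err := client.Do(req)
	if err != nil {
		return 0, err
	}
	resp.Body.Close()
	return resp.StatusCode, nil
}

// isIDOR interprets the pair of status codes: the owner must be able to
// read the object, and the second, unrelated account must also get it
// back — the broken-access-control signal.
func isIDOR(ownerStatus, otherStatus int) bool {
	return ownerStatus == http.StatusOK && otherStatus == http.StatusOK
}

func main() {
	// Hypothetical endpoint and tokens for illustration.
	client := &http.Client{}
	url := "https://api.example.com/v1/orders/1042"
	owner, _ := fetchStatus(client, url, "TOKEN_ACCOUNT_A")
	other, _ := fetchStatus(client, url, "TOKEN_ACCOUNT_B")
	fmt.Println("IDOR reproduced:", isIDOR(owner, other))
}
```

Two requests and a boolean is all a triager needs to confirm impact, which is why reproducers this small move through queues faster than long narrative reports.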
Capability proof
What this case demonstrates.
This case makes the hidden work visible: strategy, architecture, delivery control, quality evidence, and handoff.
01 / Product judgment
Problem framed before UI
Small bug-bounty research session targeting three public programs. The aim was to surface real, reproducible findings with clean writeups — not to inflate any reputation metric.
02 / Technical depth
5 stack decisions
Burp Suite Pro, ffuf, custom recon scripts (Go), HackerOne, Markdown (writeups)
03 / Delivery discipline
3 delivery checkpoints
Recon + scoping / Triage + reproduction / Writeups + disclosure
04 / Handoff quality
4 shipped artifacts
Multiple accepted findings across three public programs / Reproducible writeups with minimal proof-of-concept / Suggested fixes shipped alongside each finding / Recon scaffold reusable for future research
Production artifacts
Inspect the work behind the visible result.
Each case exposes the surfaces, systems, evidence, and handoff package that make the shipped product usable after launch.
Experience layer
Buyer or user surface
Disclosure dashboard with CVSS scoring, PoC code blocks, fix timeline, automated retest scheduling.
Proof 01
Found, triaged, and disclosed a small set of vulnerabilities against public bug-bounty programs.
Proof 02
Read every program's scope page carefully. Built a small recon scaffold for the in-scope assets.
Proof 03
Multiple accepted findings across three public programs
Before / after · product UI mockup
Industry · Cybersecurity / Bug Bounty
Before: Triage queue ran in Linear; reproductions copy-pasted into reports — fix-tracking via Slack DMs.
After: Disclosure dashboard with CVSS scoring, PoC code blocks, fix timeline, automated retest scheduling.
How the engagement ran.
- 01 · Day 1-3
Recon + scoping
Read every program's scope page carefully. Built a small recon scaffold for the in-scope assets.
- 02 · Day 4-7
Triage + reproduction
Investigated promising leads, discarded most, kept the ones with clear reproducible impact.
- 03 · Day 8-14
Writeups + disclosure
Filed findings with minimal reproducers, suggested fixes, and clean repro steps. Stayed available for triage questions.
Deliverables
What we shipped.
- ✓Multiple accepted findings across three public programs
- ✓Reproducible writeups with minimal proof-of-concept
- ✓Suggested fixes shipped alongside each finding
- ✓Recon scaffold reusable for future research
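The reusable recon scaffold listed above starts from the program's scope page, and the first thing it must get right is never probing an out-of-scope host. A minimal sketch of that gate, assuming hypothetical scope patterns of the two forms most program pages use (exact hostnames and leading wildcards):

```go
package main

import (
	"fmt"
	"strings"
)

// inScope reports whether host matches any scope pattern. Patterns are
// either exact hostnames or leading-wildcard entries like
// "*.example.com", covering subdomains but not the bare apex.
func inScope(host string, patterns []string) bool {
	host = strings.ToLower(strings.TrimSuffix(host, "."))
	for _, p := range patterns {
		p = strings.ToLower(p)
		if strings.HasPrefix(p, "*.") {
			// p[1:] keeps the leading dot, so only true subdomains match.
			if strings.HasSuffix(host, p[1:]) {
				return true
			}
		} else if host == p {
			return true
		}
	}
	return false
}

func main() {
	// Hypothetical scope, as copied from a program page.
	scope := []string{"*.example.com", "portal.example.net"}
	for _, h := range []string{"api.example.com", "example.org", "portal.example.net"} {
		fmt.Printf("%-22s in scope: %v\n", h, inScope(h, scope))
	}
}
```

Every discovery and probing stage in the scaffold filters through a gate like this before any request is sent, which is what keeps automated recon inside program rules.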
Outcomes.
Delivered outcomes
- Multiple findings accepted across three programs
- Severity range: low → high
- Each writeup ends with a one-paragraph suggested fix
- Coordinated-disclosure timelines respected throughout
- Findings can be discussed in detail under NDA
Honest challenges
What we got wrong (or almost wrong).
The pretty version of any case study skips this part. We don't.
- 01
Most leads aren't real; discipline is in dropping them quickly and not getting attached.
- 02
Coordinated disclosure means waiting; respect the program's timeline even when a finding is exciting.
In our own words
The discipline of coordinated disclosure is the work — finding the bug is the easy part. A clean writeup with a minimal reproducer and a suggested fix is what earns triage time and program trust over months, not the severity score on day one.
From the Hayaiti team
Engineering · design · security
Technical blueprint
How the work holds together.
Buyers should see that the visual layer is backed by architecture, quality gates, and operational ownership.
Experience
1 · Application
2 · Data
3 · Operations
4 · Security
5 · Stack used
5 technologies
Related
Other cases like this.
Helix Health
HIPAA-ready intake portal
Replace clipboard intake with a 7-minute, accessible, HIPAA-ready web flow.
GovTech · City of Greendale
Municipal procurement portal
Replace a paper-and-fax vendor onboarding flow with a proper portal.
Cybersecurity / Fintech · Northwind Pay
Pre-SOC 2 pentest + playbook
Surface critical findings ahead of the SOC 2 audit window with a clear remediation plan.
Want a case study like this?
Want this level of production quality on your project?
Send a short brief and we'll reply with scope, timeline, price direction, and the first technical recommendation.