Shipping the Inmetro Explorer and Deploy-Proof Automation
Turned a sprawling government energy dataset into a live-calculating frontend bundle, and locked down production deployments with automated, code-backed TLS and smoke verification.
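The post doesn't show the verification code, but a post-deploy smoke gate of the kind described can be sketched roughly as below: check that the live TLS certificate isn't near expiry, then hit a health endpoint before calling the deploy good. This is a minimal illustration, not the author's script; the `min_cert_days` threshold is an arbitrary choice, and `/up` is assumed to be a Laravel-style health route.

```python
# Hypothetical post-deploy smoke check: verify TLS certificate freshness
# and a health endpoint before declaring a deploy good. The host, the
# /up path, and the 14-day threshold are illustrative assumptions.
import ssl
import socket
import urllib.request
from datetime import datetime, timezone


def cert_days_remaining(not_after: str, now: datetime) -> int:
    """Days until a certificate's notAfter timestamp (OpenSSL text format,
    e.g. 'Jun  1 12:00:00 2030 GMT', as returned by getpeercert())."""
    expiry = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return (expiry.replace(tzinfo=timezone.utc) - now).days


def smoke_check(host: str, min_cert_days: int = 14) -> bool:
    """Fail if TLS is near expiry or the health endpoint is unhealthy."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, 443), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()  # validated leaf certificate metadata
    if cert_days_remaining(cert["notAfter"], datetime.now(timezone.utc)) < min_cert_days:
        return False  # renewal automation has drifted; block the deploy
    with urllib.request.urlopen(f"https://{host}/up", timeout=10) as resp:
        return resp.status == 200
```

Wiring a check like this into the deploy pipeline is what turns "TLS landed" into a continuously re-verified claim rather than a one-time observation.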
Build log
Helpful notes, architecture decisions, and implementation logs written for people who care about how operator software gets built.
RankWar now uses scored outcomes to push proven moves above same-urgency noise and exposes reusable playbooks directly in the creator console.
The creator agenda now closes the loop on launched, snoozed, and delegated moves with leverage scoring, visible outcome history, and reminder shutdown.
The creator agenda now writes execution history, reminder automation, and visible ownership instead of pretending a ranked queue is enough.
Owned signal follow-ups now sit inside the same RankWar creator agenda as weekly bets, so claimed pressure no longer lives in a second queue.
The RankWar signal inbox no longer just ranks pressure. Creators can now claim, resolve, and reopen signals with saved notes, follow-up timing, and timeline memory.
RankWar now ranks fresh captures, open feedback, and shared-contact pressure in one shared creator signal inbox instead of scattering operator truth across timelines.
RankWar entrants can now send bugs, requests, and compliments into one shared feedback ledger tied to apps, contacts, and creator surfaces inside the monolith.
RankWar joins and legacy imports now write into one shared launch-capture ledger, so future waitlists and lead surfaces can reuse the same contact spine inside the monolith.
The hub now owns a real app directory and a real RankWar activation gate, so shared auth no longer masquerades as product authorization.
The monolith now has an explicit about surface so X and LinkedIn clicks stop landing on generic context and start landing on the real operator-software narrative.
RankWar now exports screenshot-ready proof cards from live cockpit and dossier truth so creators can ship visual proof instead of another dashboard screenshot.
The RankWar hub now opens a shared creator dossier so campaigns, ambassadors, review loops, and email pressure compound into one operator record.
The shared hub now ranks overdue RankWar follow-ups, weak review loops, and scheduled next bets across every live campaign.
The RankWar cockpit now persists weekly review ownership, notes, and follow-up timing instead of treating GTM discipline like disposable UI state.
RankWar stopped being just a live cockpit and started behaving like a weekly GTM operating system with acquisition scoring, proof capture, and explicit kill discipline.
Day fifteen of the lmachine monolith: RankWar turned ranked operator moves into reusable lifecycle sequences with tracked completion, so the cockpit now owns the playbook instead of stopping at drafts.
Day fourteen of the lmachine monolith: RankWar no longer stops at dashboards or drafts. The cockpit now emits one-click execution packs, generated share assets, and timeline memory for every operator move.
Day thirteen of the lmachine monolith: RankWar gained a guided demo war room for first-run onboarding, and the repo gained a local iteration skill so ongoing dominant-move shipping stops depending on chat memory.
Day twelve of the lmachine monolith: RankWar stopped behaving like a dashboard and became an operating surface with a shared campaign timeline, ranked moves, and an AI copilot grounded in live product truth.
Day eleven of the lmachine monolith: RankWar stopped acting like a waitlist utility and shipped a creator cockpit with momentum scoring, ambassador intelligence, proof-pack copy, and a deterministic GTM operator queue.
Day ten of the lmachine monolith: RankWar stopped pretending it was still mid-migration, the growth engine became an explicit operating system, and the old workspace repo was marked for removal instead of lingering as fake optionality.
Day nine of the lmachine monolith: RankWar score decay moved behind an off-by-default flag, the scheduler stopped pretending overlap was safe, and Dokploy's Swarm contract was hardened around health checks and rollback instead of hope.
Day eight of the lmachine monolith: RankWar moved off the public Vercel edge, wildcard DNS converged on Hetzner, Let's Encrypt re-issued live certificates, and the cutover finally became internet truth instead of host-header optimism.
Day seven of the lmachine monolith: the locked RankWar Supabase snapshot was imported into production Postgres, live Dokploy ingress was attached for the current RankWar hosts, and the remaining gap was reduced to public DNS and TLS truth rather than application readiness.
Day six of the lmachine monolith: the live RankWar Supabase project was frozen into a private snapshot, replayed into the Laravel monolith, and upgraded with a durable outbound email ledger instead of more runtime dependency on legacy infrastructure.
Day five of the lmachine monolith: both GA4 properties were linked to Search Console, RankWar gained real monolith-native public surfaces and join mechanics, and the repo learned the correct way to run parallel Laravel worktrees without fake speed.
Day four of the lmachine monolith: GA4 landed across the real public surfaces, RankWar got continuity analytics in the legacy repo, and the monolith gained the shared app/domain/access spine plus the first RankWar tables.
Day three of the lmachine monolith: Google OAuth went live on the production hub, the login surface became explicitly Google-first, and the env contract across Google Cloud, private operator files, Dokploy, and runtime containers was tightened.
Day two of the lmachine monolith: Hetzner went live, Dokploy stayed private behind Tailscale, Resend was verified, TLS landed, and the first production-only failures got turned into reusable operating rules.
Day one of the lmachine monolith: private repo created, Laravel 13 scaffolded, the public and hub surfaces shipped, Docker stabilized, and Pest became the default TDD loop.