
8 Days, 5 Chats, One Webinar: How I Built a B2B Presentation With AI as My Co-Pilot


I said yes to a webinar before doing the math. 8 days later: 18 slides, an 11-page field guide, and a reusable workflow. Here's exactly how I did it—plus the master prompt you can copy.

I had eight days to build a complete presentation package for 300+ security industry professionals. Not just slides—a full field guide, implementation templates, planted Q&A questions, and a follow-up strategy. Here's how AI became my development partner across five focused working sessions.


The call came on a Wednesday. One of our industry partners wanted me to present at their dealer webinar series. Topic: Using AI to improve attach rates on service calls. Audience: 300+ regional security company owners and sales managers. Timeline: eight days.

I said yes before thinking through what "yes" actually meant.

A presentation like this isn't just a set of slides. It's research into who's actually in the room. It's a downloadable resource that makes the content actionable. It's anticipating the questions they'll ask and the objections they'll raise. It's a follow-up strategy that extends the relationship beyond the 45-minute slot.

Eight days. While running my actual business.

I'm curious whether what emerged would have been possible three years ago. Not the quality—I could have eventually built something comparable. But the speed, the ability to work in focused bursts between client calls and still produce something comprehensive? That's where AI changed the math.


The Constraint That Shaped Everything

Before building anything, I needed to make a decision that would constrain the entire project.

The audience was small business owners—security dealers running 5-50 person companies. They know their numbers. If I said, "this strategy can add $500 per service call," every person in that room would mentally test that against their own business. Half would think it's too low. Half would think it's inflated. Everyone would be skeptical.

This came up in my first working session. AI had suggested a hook: "The $97 problem: Average security service call = $150. Average missed upsell opportunity = $97." Clean framing. Memorable number.

I pushed back: "The one thing I'm not sure about is using dollar values for the problem."

What followed was a quick strategic conversation about why specific numbers invite mental testing. The insight: "If their average ticket is $280, you've lost credibility. But '20-30% improvement in identified opportunities'? That's universal. They apply their own math."

So the rule became: percentages and ratios only, no specific dollar values.

That one decision—made in the first working session—shaped every slide, every example, every claim in the field guide. AI helped me think it through, but I had to make the call. That's the pattern I keep finding: AI accelerates the work, but the strategic decisions still require human judgment.


Five Chats, Five Phases

I didn't plan to build this across five distinct sessions. It emerged from how the project naturally broke apart.

Chat 1: Research & Strategy

The first session was purely strategic. Who's in the room? What do they actually care about? What are their objections to AI?

I started with a simple frame: "I do want this to have real examples and solid takeaways. I don't want to live demo anything, but giving them prompts or tools or screenshots of examples will be helpful."

The audience profile that emerged: regional dealers (not national chains), mostly Microsoft shops, using ServiceTitan for CRM, skeptical of AI hype, under pressure to hit growth targets. They've heard "sell one more" a hundred times. What they haven't heard is how to systematize it.

What fascinates me about this phase: we spent zero time on slides. The entire session was about understanding the people who would be sitting in those virtual seats. AI didn't suggest we start with content—it pushed for clarity about the audience first.

This session produced the three-strategy framework that became the presentation backbone:

  1. AI call analysis for upsell triggers

  2. Personalized follow-up automation

  3. Predictive customer needs mapping

No slides yet. Just structure.

Chat 2: Content Development

The second session built the actual presentation. 18 slides, full speaker notes, scenario-based examples.

But here's what made it efficient: I asked for multiple output formats from the same thinking.

My request: "I need to share the slide outline with the team at ESX today. Generate a markdown document that shows the slide outline only, no speaker notes or design direction."

Then later: "I want to push this into Gamma for building the slides. Can you generate a version with instructions that I can copy/paste into Gamma for the creation of this deck."

Same content, three containers:

  1. The full slide outline with speaker notes (my working version)

  2. A simplified markdown outline for the team review at ESX

  3. A Gamma-ready prompt for generating the first-pass deck

One thinking session. Three deliverables. The content was the same; only the container changed.

I'm curious if this approach—designing for multiple outputs from a single thinking session—is something others are doing. It feels like obvious leverage once you see it, but I'd never thought to ask for it before.
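To make the pattern concrete, here is a minimal sketch: one content structure rendered into three containers. The slide data and helper names are illustrative placeholders, not the actual deck or prompts from the project.

```python
# One content structure, three output "containers". The slide data and
# function names below are illustrative placeholders, not the real deck.

slides = [
    {"title": "The Attach-Rate Problem", "notes": "Frame the gap in percentages, not dollars."},
    {"title": "Strategy 1: AI Call Analysis", "notes": "Upsell triggers pulled from recorded calls."},
    {"title": "Strategy 2: Personalized Follow-Up", "notes": "Automation that runs after the service visit."},
]

def team_outline(slides):
    """Titles only: the simplified version shared for team review."""
    return "\n".join(f"{i}. {s['title']}" for i, s in enumerate(slides, 1))

def speaker_outline(slides):
    """Titles plus speaker notes: the presenter's working copy."""
    return "\n\n".join(f"## {s['title']}\n{s['notes']}" for s in slides)

def gamma_prompt(slides):
    """The same titles wrapped in generation instructions for Gamma."""
    header = "Create a B2B slide deck. One slide per line below:\n"
    return header + "\n".join(s["title"] for s in slides)

for render in (team_outline, speaker_outline, gamma_prompt):
    print(render(slides), "\n---")
```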

Chat 3: Speaker Bio

This sounds trivial, but it's where I learned something about iteration.

The first version was solid. Then I noticed two problems:

"Add in ...practical marketing and AI strategies.... Also, remove client names reference and mention 'security companies' significant traffic growth and conversion...' making it more evergreen."

Here's what I was fixing: the original bio said only "AI strategies," but my company does marketing and AI. That positioning matters. And it mentioned a specific client name, which makes the bio time-bound and means I'd need the client's permission everywhere the bio gets reused.

The AI immediately understood: "Makes total sense—keeps the bio evergreen and avoids any client sensitivities."

Three rounds of refinement on 50 words. Each round caught something the previous round missed. The final bio was evergreen: no client names, no temporal references, just positioning.

Small deliverable. Useful lesson about reviewing even the "simple" stuff—especially the stuff that gets copied everywhere.

Chat 4: Field Guide Development

This was the biggest session—about 90 minutes of focused work. The downloadable resource that would make the presentation actionable.

My initial framing: "Review all of the content in the chats of this project and the files, too. Then, help me outline the field guide that we are using for this presentation."

What emerged: an 11-page field guide with co-branded headers, implementation checklists, 11 copy-paste AI prompts with example outputs, Excel templates with ROI calculators, a 90-day implementation timeline, tool comparison charts, state-by-state compliance guidance for call recording, and troubleshooting FAQs.

Not a thin PDF. The kind of resource that builds authority because it's actually useful.
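For flavor, here is a minimal sketch of the math the ROI-calculator templates encode, consistent with the percentages-only rule: the dealer supplies their own call volume and average ticket, and the model contributes only the 20-30% lift range. The function name and input numbers are hypothetical.

```python
# Sketch of the ROI-calculator logic the field guide's Excel templates
# encode. The dealer plugs in their own numbers; the only figure the
# presentation supplies is the 20-30% lift in identified opportunities.

def upsell_roi(calls_per_month, avg_ticket, attach_rate,
               lift_low=0.20, lift_high=0.30):
    """Monthly incremental revenue range from a 20-30% lift in
    identified upsell opportunities, using the dealer's own figures."""
    baseline = calls_per_month * attach_rate * avg_ticket
    return baseline * lift_low, baseline * lift_high

# Placeholder inputs: every dealer applies their own math.
low, high = upsell_roi(calls_per_month=120, avg_ticket=280, attach_rate=0.15)
print(f"Estimated incremental monthly revenue: ${low:,.0f} to ${high:,.0f}")
```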

I also built 13 planted Q&A questions across four categories: implementation challenges, tool selection, compliance concerns, and ROI validation. Think: "How do we get our team to actually do this?"

The idea was to have team members ready to ask questions if the audience went quiet.

Then came the platform problem. The field guide was built in Markdown, which doesn't translate cleanly into design tools. First attempt: Typeset. Complete failure—the formatting broke in ways that weren't fixable without starting over. Second attempt: Canva. Required a full rebuild, but at least the tool was flexible enough to handle it. Three platform iterations before landing on something usable.

Chat 5: Pre-Presentation Review

The final session was pure QA. I uploaded the finished slide deck and asked for a fresh-eyes review.

It caught three issues, the most embarrassing being a leftover template placeholder on slide 3 that I'd stopped seeing.

All obvious in hindsight. All invisible after multiple iterations. Fresh perspective—even from AI—catches what familiarity misses.


Where AI Got It Wrong

I'm not going to pretend this was flawless.

The format translation problem. Markdown doesn't paste cleanly into design tools. I learned this the hard way when Typeset completely choked on the field guide formatting. Canva worked, but required a full rebuild—not a paste job. I spent more time reformatting than I expected because I didn't anticipate the gaps between platforms.
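If I scripted the workaround next time instead of reformatting by hand, I would convert the Markdown to a format the design tool can import before touching the layout. A minimal sketch using the pandoc CLI, assuming pandoc is installed and that your target tool accepts .docx uploads (verify both for your setup):

```python
# Convert the Markdown field guide to .docx before importing it into a
# design tool. Assumes the pandoc CLI is installed and on your PATH.

import subprocess

def markdown_to_docx(md_path: str, docx_path: str) -> None:
    """Run pandoc to convert a Markdown file to .docx."""
    subprocess.run(
        ["pandoc", md_path, "-o", docx_path],
        check=True,  # fail loudly if pandoc errors instead of continuing
    )

markdown_to_docx("field-guide.md", "field-guide.docx")
```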

The citation confidence issue. AI provided statistics with citations, but I had to verify each one. Two of the original citations were close but not quite right—the numbers were real, but attributed to the wrong source. Fixable, but only because I checked.

The "helpful addition" trap. In one iteration, AI added a section I hadn't asked for. It was good content, but it didn't fit the scope. I almost kept it because it was there. Scope creep happens even when your collaborator is artificial.


Where I Got It Wrong

No baseline expectations upfront. I jumped into the strategies without establishing what the audience should already know. Some attendees felt the content was too in-the-weeds; others probably wanted more depth. A simple "here's what we're assuming you already understand" at the top would have grounded the room and set clearer expectations. Next time: establish baselines before diving into tactics.

The acronym stumble. During the live presentation, I forgot what an industry acronym stood for mid-sentence. Recovered, but it was visible. Should have drilled the terminology harder.

Strategy numbering confusion. I ran a poll asking which strategy attendees wanted to start with. But my numbering in the poll didn't match the slide order exactly. Minor confusion that better preparation would have prevented.

Planted Q&A met reality. I prepared 13 questions, anticipating low engagement. The audience engaged differently—through polls, not open Q&A. Only one substantive question came from the floor. The preparation wasn't wasted (it shaped my mental framework), but I'd assumed the wrong engagement format. Lesson learned: polls consistently outperform open Q&A for webinar engagement. Build for that.


What Actually Happened

The partner's reaction after the presentation: "So much information, I even took notes."

Then the ask I wasn't expecting: could I come back in 60-90 days for a follow-up session on implementation progress?

That's the outcome that matters. Not the slides themselves, but the relationship extension. The presentation was a vehicle; the trust built through genuinely useful content was the destination.

The field guide is generating downloads. The framework is reusable for other industry presentations. The planted Q&A questions—even though they weren't used as planned—became a resource for anticipating objections in future conversations.


The Part I Keep Thinking About

Five chats. Eight days. A complete presentation package built in the gaps between running my actual business.

None of these sessions was longer than 90 minutes. Most were 20-30 minutes of focused work. The AI didn't replace my thinking—it accelerated the parts that were mechanical, so I could focus on the parts that required judgment.

The strategic decision about percentages over dollars? That was mine.

The three-strategy framework? Emerged from the conversation, but I chose it.

The field guide depth? I decided that thin PDFs don't build authority.

AI handled the execution. The decisions stayed human.

This pattern keeps showing up across my work. Last month, I used a similar phased approach to build a client website—audience research first, content development second, supporting materials third, and final review last. Different deliverables, same workflow. The phases seem to be universal; what changes is the speed AI enables within each one.

I'm curious if others are finding the same thing. Is this workflow transferable, or is it specific to how I think about projects?


If You Want to Try This

The workflow is reproducible:

1. Start with audience research, not content. The first session should be entirely about who's in the room and what they care about. Slides come later.

2. Establish baselines early in the presentation. Before diving into tactics, tell the audience what you're assuming they already know. "This assumes you're familiar with X and Y" grounds expectations and prevents the too-basic/too-advanced disconnect.

3. Make strategic decisions explicit. "No dollar values, use percentages" became a constraint that shaped everything. Name your constraints early—they'll shape everything downstream.

4. Build in multiple formats. Same content, different containers. One thinking session can produce three deliverables if you ask for them. Request: "Give me this as a Gamma prompt, a team outline, and full speaker notes."

5. Create resources with actual depth. A 2-page summary doesn't build authority. Implementation checklists, actual prompts, compliance details—that's what people remember.

6. Plan for polls, not open Q&A. In webinars, polls consistently drive more engagement than asking "any questions?" Prepare your planted questions anyway—they'll shape your thinking—but build the engagement strategy around structured interaction.

7. Expect platform translation problems. Markdown won't paste cleanly into Canva or Typeset. Budget time for reformatting, or ask AI to generate content in the target platform's preferred format from the start.

8. Review the final deliverable fresh. The slide 3 placeholder was obvious once I looked with fresh eyes. But I'd stopped seeing it after multiple iterations. Build in a final review session with that specific purpose.


The Prompt That Makes This Repeatable

Reading about a workflow is one thing. Actually using it is another.

I've packaged everything I learned from this project into a master prompt you can copy, paste, and customize. It's designed to guide you (and AI) through the same five-phase process I used—but adapted to whatever presentation you're building.

Why this prompt is structured the way it is: the five phases mirror the five chats above. Audience research comes before content, content comes before supporting materials, and a fresh-eyes review closes it out, because each phase builds on the one before it.

How to use it:

  1. Copy the entire prompt below

  2. Fill in the bracketed sections at the top with your specific context

  3. Paste into your AI tool of choice (Claude, ChatGPT, etc.)

  4. Work through the phases in order—don't skip ahead

  5. Customize as you learn what works for your style

The prompt is a starting point, not a straitjacket. After a few uses, you'll find yourself modifying it to suit your presentations.


Master Prompt: 5-Phase Presentation Builder

# PRESENTATION DEVELOPMENT PARTNER

You're helping me build a complete presentation package using a 5-phase workflow. We'll work through each phase in sequence. Don't jump ahead—each phase builds on the previous one.

## MY CONTEXT
- **Topic:** [What you're presenting on]
- **Audience:** [Who's in the room - titles, company size, industry]
- **Timeline:** [How long until presentation day]
- **Presentation length:** [Minutes allocated]
- **Deliverables needed:** [Slides, handouts, field guide, Q&A prep, etc.]

---

## PHASE 1: AUDIENCE RESEARCH & STRATEGY
Before we build any content, help me understand my audience.

Ask me about:
- Who specifically will be in the room (titles, experience levels, company sizes)
- What they already know vs. what's new to them
- Their biggest objections or skepticism about this topic
- What success looks like for them after this presentation
- Any constraints I should establish upfront (Example: "percentages only, no specific dollar amounts")

Then help me develop:
- An audience profile summary
- 2-3 core strategies or frameworks that will structure the presentation
- Baseline assumptions to state at the start ("This assumes you're familiar with X")

DO NOT create slides yet. Strategy first.

---

## PHASE 2: CONTENT DEVELOPMENT
Now build the presentation content.

Create:
1. **Full slide outline** with speaker notes for each slide
2. **Gamma AI prompt** I can paste to auto-generate a first-pass deck
3. **Team outline** - simplified version I can share for review

Apply the constraints we established in Phase 1 throughout.

Include scenario-based examples, not just concepts. Make it concrete.

---

## PHASE 3: SUPPORTING MATERIALS
Build the resources that make this presentation actionable.

Based on what we've developed, create:
- **Field guide/handout** with implementation checklists, templates, or frameworks
- **Copy-paste prompts** the audience can use immediately
- **ROI framework or calculator** if relevant
- Any compliance considerations for the industry

Go deep. Thin PDFs don't build authority.

---

## PHASE 4: Q&A PREPARATION
Help me anticipate questions and objections.

Create planted Q&A questions across categories:
- Implementation challenges ("How do we get our team to actually do this?")
- Tool/resource selection
- Compliance or risk concerns
- ROI validation and proof points

Also: suggest 2-3 poll questions I can use during the presentation. Polls drive more engagement than open Q&A in webinars.

---

## PHASE 5: FINAL REVIEW
Review all deliverables with fresh eyes.

Check for:
- Placeholder text or template artifacts I might have stopped seeing
- Consistency between slides, handouts, and speaker notes
- Numbering/ordering mismatches
- Any claims that need citation or verification

Flag anything that looks like it was copied from a template and not customized.

---

## OUTPUT FORMATS
When I ask for deliverables, provide them in formats I can actually use:
- Slides: Markdown with clear slide breaks, or Gamma-ready prompt
- Handouts: Clean formatting that pastes into Canva or design tools
- Speaker notes: Conversational tone, not scripts

---

Let's start with Phase 1. Ask me the audience research questions.

The Meta-Lesson

I didn't set out to build a five-phase AI workflow. It emerged from how the project naturally broke apart.

Research is different from content development. Content development is different from design implementation. Design implementation is different from QA review.

Each phase has different requirements. Each benefits from a focused session rather than trying to do everything at once.

AI made it possible to treat each phase as a complete working session with a specific deliverable. Thirty minutes of focused work, then back to running the business. Repeat across eight days.

That's not how I would have built this presentation three years ago. I would have blocked a full day, tried to do everything at once, and ended up with something less complete because fatigue compounds errors.

The AI workflow lets me bring fresh thinking to each phase. The gaps between sessions weren't interruptions—they were features.


This is part of my ongoing documentation of AI experiments across business and life. I test these tools so you don't have to guess what works. Sometimes they work beautifully. Sometimes I forget what PDK stands for mid-sentence. Both are worth sharing.

Building a presentation like this for your team? Let's talk about how AI can accelerate your process.

Created with ❤️ by humans + AI assistance 🤖