Chris Pedregal + Sam Stephenson: Making Meetings More Effective with Granola

How can AI make meetings better? That’s the simple question that inspired Granola, a productivity tool that can tell you what was actually discussed in that meeting last week and what the real next steps are. In this episode of Generative Now, host Michael Mignano, partner at Lightspeed, sits down with Granola co-founders Chris Pedregal and Sam Stephenson at their headquarters in London. They talk about how they first met, their early product bets, and how they decided to focus on solving one pa...

May 15, 2025 • 38:05

Table of Contents

00:02-07:44 - Welcome, founding Granola, and the rise of the app layer
07:49-17:48 - Strategy, business model, and launch
17:54-28:01 - Design, scaling with AI, and building in London
28:06-38:01 - The future of Granola and audience Q&A

👋 Welcome and Introduction

Michael Mignano welcomes viewers to Generative Now, a show where he speaks with founders building AI-powered tools. In this episode, he interviews Chris Pedregal and Sam Stephenson, co-founders of Granola, a note-taking app that uses AI to compile and summarize meeting notes.

The conversation takes place at Granola's headquarters in London, where the founders have recently set up their new office space. There's some lighthearted discussion about Sam personally carrying plants into the office after a 4 a.m. visit to a plant market.

Timestamp: [00:02-01:15]

🤝 Finding the Right Co-Founder

Chris shares how he began his entrepreneurial journey three years ago after leaving Google. Within a week of quitting, he started experimenting with GPT-3's instruct version and was immediately impressed by its capabilities.

Chris realized he needed a co-founder with specific skills - initially thinking he needed someone who could train models (a view he later changed) and someone who could design AI-native interfaces. This search led him to explore "tools for thought" forums online.

Timestamp: [01:15-01:52]

🧠 Tools for Thought and AI

When asked to explain what "tools for thought" means, Chris elaborates on how humans are fundamentally toolmakers and how our capabilities are limited by the tools available to us.

He traces the evolution of these cognitive tools from written language to mathematical notation, explaining how each advancement has expanded human cognitive capabilities. For instance, modern numerical systems allow for more complex calculations than Roman numerals did.

Chris sees AI as "the ultimate turbocharger of tools for thinking," which led him to connect with Sam through an online tools for thinking meetup group - without even meeting in person first.

Timestamp: [01:52-03:24]

🔍 Identifying the Problem

Sam explains that from the beginning, both founders shared a strong belief that AI would transform the landscape of productivity tools. They recognized that either existing tools would need to completely reinvent themselves or new players would disrupt the market.

The founders were determined to focus on a specific, painful user problem rather than building technology for its own sake. They spent time "wandering around" and talking to people about their work challenges to find a genuine pain point.

Through these conversations, they identified a universal frustration that emerged repeatedly: the administrative burden that follows meetings.

Timestamp: [03:24-04:38]

🛠️ Building the Solution

Sam describes how they discovered that people whose jobs revolve around meetings consistently struggle with the "pile of follow-up work" generated by each conversation.

This work ranges from simple tasks like writing up notes and sending follow-up emails to more complex actions such as updating multiple fields in a CRM or triggering workflows. The founders recognized these administrative tasks as universally disliked busy work.

Sam explains that this pain point seemed perfectly suited for AI intervention, even if the technology wasn't fully ready when they started. This insight led them to begin developing a tool that could be present during meetings and eventually handle much of this administrative burden.

Timestamp: [04:38-05:07]

📱 The Evolution of AI and App Layer

Michael notes that while building great product experiences around AI seems obvious now, this wasn't always the conventional wisdom. He recalls that just 12-18 months ago, there was skepticism about companies building at the "app layer" of AI, with critics dismissing them as mere "wrappers" on large language models like GPT-4 or Claude.

Michael observes that sentiment has completely reversed, with substantial excitement now surrounding the app layer. He asks the founders why they think this shift has occurred.

Chris identifies three key factors that changed industry perspectives:

  1. The rapid improvement of large language models made it more practical to use existing frontier models rather than train custom ones.

  2. The prohibitive cost and difficulty of training custom models meant only a few large companies could realistically do so.

  3. The recognition of where specialized applications provide value versus general-purpose AI.

Chris points to companies like Cursor (with its recent high valuation) and Windsurf (recently acquired) as examples of successful application-layer companies that are "just wrappers" on frontier models yet deliver tremendous value through their specialized implementations.

Timestamp: [05:07-07:44]

💎 Key Insights

  • Early adoption of frontier AI models like GPT-3 can provide a founding advantage in building new tools
  • The "tools for thought" concept frames AI as the next major evolution in human cognitive enhancement
  • Successful AI products focus on specific, painful user problems rather than technology for technology's sake
  • Meeting follow-up work is universally disliked yet necessary, making it an ideal target for AI assistance
  • The industry has shifted from skepticism about AI application layers to recognizing their value
  • Specialized AI tools provide more value than general-purpose AI for professional, high-frequency use cases
  • Building great software experiences around AI models is challenging and valuable, contrary to early dismissals of "wrapper" apps

Timestamp: [00:02-07:44]

📚 References

Companies:

  • Google - Chris's previous employer before founding Granola
  • Granola - The AI-powered meeting notes app founded by Chris Pedregal and Sam Stephenson
  • Lightspeed - Venture capital firm where host Michael Mignano is a partner
  • Cursor - AI-powered development tool mentioned as having received high valuation
  • Windsurf - AI application recently acquired, mentioned as an example of a successful "wrapper" app

People:

  • Michael Mignano - Host of Generative Now and partner at Lightspeed
  • Chris Pedregal - Co-founder of Granola, former Google employee
  • Sam Stephenson - Co-founder of Granola, met Chris through tools for thought forum
  • Paul - Friend of Chris mentioned as teaching him about humans as toolmakers

AI Concepts:

  • GPT-3 - Early large language model that Chris began experimenting with after leaving Google
  • GPT-4 - More advanced language model mentioned in context of the "wrapper" debate
  • Claude - Anthropic's large language model mentioned alongside GPT-4
  • App Layer - Level of AI implementation that builds applications on top of foundation models
  • Tools for Thought - Concept describing technologies that enhance human cognitive capabilities

Timestamp: [00:02-07:44]

🧩 Challenges and Innovations

Chris discusses how the AI tooling market operates like a pendulum, swinging between different trends, but believes that professional tools that significantly improve productivity will always have value.

Sam explains that when they started Granola during the early GPT-3 era, real-time transcription had only just become available via APIs, and the quality wasn't great. This created a strategic challenge for the team.

The founders developed a framework for deciding what to work on: distinguishing between areas that would improve naturally with advancing AI capabilities versus problems that required their specific innovation. This approach helped them focus their limited resources on areas where they could create unique value.

Timestamp: [07:49-09:18]

🎯 Strategic Focus

When asked about specific features they intentionally delayed, Chris shares several examples of their strategic prioritization.

After spending a week investigating language support, they realized it would require a month-long project to create a good interface for selecting languages. Instead of making this substantial investment, they decided to wait for the underlying transcription technology to improve naturally, as many companies were already incentivized to solve this problem.

Similarly, they initially faced context window limitations that restricted Granola to 30-minute meetings. Rather than building complex chunking solutions, they waited for language models to improve their context windows naturally.

Chris also explains their approach to retrieval augmented generation (RAG), noting that as context windows expand, they can sometimes get better results by simply including more information rather than building sophisticated retrieval systems.
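
To make this concrete, here is a minimal, hypothetical sketch of the "just include more" approach Chris describes: send the whole transcript when it fits in the model's context window, and fall back to a crude retrieval step only when it doesn't. This is not Granola's code; the tiktoken token counter and the token budget are assumptions for illustration.

```python
# Hypothetical sketch of "long context first, retrieval as fallback".
# Assumes the tiktoken library; the budget below is an arbitrary placeholder.
import tiktoken

CONTEXT_BUDGET_TOKENS = 100_000  # assumed budget for whichever model is in use
enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(text: str) -> int:
    return len(enc.encode(text))

def select_context(transcript_chunks: list[str], query: str) -> str:
    """Return the text to place in the prompt for a given question."""
    full = "\n".join(transcript_chunks)
    if count_tokens(full) <= CONTEXT_BUDGET_TOKENS:
        # Everything fits: skip retrieval machinery and send the whole transcript.
        return full
    # Otherwise fall back to a naive keyword-overlap ranking (a stand-in for RAG).
    query_terms = set(query.lower().split())
    ranked = sorted(
        transcript_chunks,
        key=lambda chunk: len(query_terms & set(chunk.lower().split())),
        reverse=True,
    )
    selected, used = [], 0
    for chunk in ranked:
        cost = count_tokens(chunk)
        if used + cost > CONTEXT_BUDGET_TOKENS:
            break
        selected.append(chunk)
        used += cost
    return "\n".join(selected)
```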

Timestamp: [09:18-11:25]

💰 Business Model and Future Outlook

Michael asks about business models for AI app companies, noting many simply charge for "credits" that primarily cover API costs to language models. Sam explains their approach is more traditional than it might appear.

Rather than just monetizing AI access, Sam emphasizes creating value through network effects - making Granola more valuable as more people in a team use it, creating a repository of organizational knowledge that becomes increasingly valuable over time.

Chris acknowledges they're in an interesting moment in history - a "land grab" where new AI-enabled products are emerging while the costs of running these products are expected to decrease dramatically in the coming years.

The founders believe they must build for the future rather than optimizing for today's constraints, which makes their approach capital intensive in the short term. Their financial forecasts assume AI costs will decrease over time, with some capabilities (like transcription) eventually becoming commoditized while others (like powerful document creation and chat) might require staying on the frontier of AI capabilities.

Timestamp: [11:25-15:01]

🚀 The Launch and Early Success

Michael notes that Lightspeed was an early investor in Granola and reflects on watching the company grow from zero lines of code to their current product. He mentions their launch on May 22nd (nearly a year ago) and how it seemed to achieve "instant product market fit," which he describes as extremely rare.

Sam explains their deliberate approach to product development in the months leading up to launch. The team recognized early that Granola's success would depend on creating natural, effortless user interactions that help people extract what they care about from meetings.

This period involved gradually adding complexity as they tested ideas, then cutting back and streamlining once they found a core interaction that worked: users jot brief notes during the meeting, and Granola expands on them once the meeting ends.

The team maintained a strong focus on building a daily habit for users. They created a visualization called the "dot plot" that showed individual user engagement day by day, helping them honestly assess whether people were consistently using the product or just occasionally trying it.
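
As an illustration of that kind of visualization (a hypothetical recreation with synthetic data, not Granola's internal tool), a dot plot can be drawn with one row per user and one column per day, placing a dot wherever that user was active:

```python
# Hypothetical "dot plot" of daily engagement: one row per user, one dot per active day.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_users, n_days = 20, 30
active = rng.random((n_users, n_days)) < 0.4  # synthetic activity data

user_idx, day_idx = np.nonzero(active)
plt.figure(figsize=(8, 4))
plt.scatter(day_idx, user_idx, s=12)
plt.xlabel("Day")
plt.ylabel("User")
plt.title("Daily active usage per user")
plt.tight_layout()
plt.show()
```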

Sam reveals that despite the successful launch, he personally didn't feel the product was ready, and credits Michael with pushing them to launch.

Timestamp: [15:01-17:48]

💎 Key Insights

  • Professional tools that significantly improve productivity (10-50%) will maintain economic value regardless of AI hype cycles
  • Strategic product development requires distinguishing between what will improve naturally with advancing AI and what requires dedicated innovation
  • Sometimes it's better to wait for underlying AI capabilities to improve rather than building complex workarounds
  • Language models can produce surprisingly intuitive results when given unstructured information, challenging traditional software engineering approaches
  • Business models for AI apps should focus on creating unique value beyond just reselling API access to foundation models
  • Network effects and becoming a valuable knowledge repository represent sustainable competitive advantages beyond AI features
  • AI startup economics involve a "land grab" where companies must build for future cost structures rather than optimizing for today's constraints
  • Successful product development requires continuous experimentation followed by ruthless simplification around the core value proposition
  • Building products that become daily habits is crucial for sustainable growth
  • Launching earlier than feels comfortable can accelerate learning and growth

Timestamp: [07:49-17:48]

📚 References

Technical Concepts:

  • GPT-3 - Early large language model mentioned in context of when Granola started development
  • Real-time transcription - Core technology that had just become available via API when they started Granola
  • Context window - AI model limitation that initially restricted Granola to 30-minute meetings
  • Retrieval Augmented Generation (RAG) - Technique for selectively feeding relevant information to AI models
  • Dot plot - Internal visualization tool used to track daily user engagement with Granola

Business Concepts:

  • Network effects - Strategic advantage where Granola becomes more valuable as more people in a team use it
  • Product market fit - Business milestone that Granola seemingly achieved quickly after launch
  • Beta testing - Granola spent a year in closed beta with 150 manually onboarded users

Companies/Products:

  • OpenAI - Mentioned in context of AI app companies essentially reselling their API access
  • Lightspeed - Venture capital firm that was an early investor in Granola

Events:

  • Granola Launch - Occurred on May 22nd (approaching one-year anniversary at time of recording)

Timestamp: [07:49-17:48]

🧠 Theoretical vs. Practical User Needs

Sam continues discussing their product launch, revealing that their venture capital investors had been pushing them to launch for nine months before they finally did.

The founders struggled to see past the product's flaws, focusing on what was wrong rather than what was working. This created an interesting tension between perfection and shipping.

This highlights the classic entrepreneur's dilemma - recognizing when a product is "good enough" to launch versus waiting for perfection, and how actual market reception can differ dramatically from founder expectations.

Timestamp: [17:54-18:17]

😰 Stress and Software Design

Michael asks Sam about the team's design philosophy, particularly how they design for what people actually need versus what they think they need. Sam explains their approach to building software that works in high-stress situations.

Sam says they were "paranoid" about designing for the actual context in which Granola would be used: between back-to-back meetings, when users are stressed, rushed, and have almost no cognitive bandwidth left for learning new software.

That constraint became a powerful design principle, keeping the team focused on simplicity and minimal friction.

Timestamp: [18:17-20:05]

🤖 Scaling with AI

Michael asks Chris to compare his experience building Granola, an AI-powered company, with his previous experience building and selling Socratic, a company without AI. Chris notes that it's too early to give a definitive answer but shares several observations.

First, Chris highlights their CTO's emphasis on using AI throughout their engineering process to reduce the amount of code their team writes.

This requires intentional effort because established habits can be hard to break, even as the world changes rapidly.

Chris also speculates on how AI might change company structure and team composition. While product development will still require "best-in-class people," other functions might look different.

He suggests these teams might function more like engineers, "building systems even though they might not be writing code."

Finally, Chris notes how the level of interest in AI has dramatically changed the startup experience.

Timestamp: [20:05-22:33]

👨‍🎨 Maintaining Quality and Taste

Michael observes that AI companies are forced to move extremely fast, which creates challenges for maintaining quality and "taste" - a term he notes has become somewhat overused in tech. He asks how Granola maintains its reputation for beautiful design and taste while moving at such a rapid pace.

Chris admits there's room for improvement, but shares several strategies the team relies on. A focus on user-centric thinking helps them make appropriate trade-offs, and they distinguish between core and peripheral features so that each gets the right level of scrutiny.

They are also trying to get better at separating "one-way door" decisions (difficult to reverse) from "two-way door" decisions (easily reversible), moving faster on the latter. Despite the pressure to move quickly and add features, Chris acknowledges the need to preserve the simplicity that makes Granola special.

Timestamp: [22:33-24:51]

🌍 Building a Silicon Valley Startup in London

Michael observes that while Granola is based in London, the product has achieved significant popularity in Silicon Valley and the U.S. tech scene. He asks if this positioning is intentional and what it's like to build a team in London.

Chris confirms this was intentional and explains their hybrid approach: they adopt the ambition, methodology, and startup culture of Silicon Valley while leveraging London's unique advantages.

Sam adds that being one of the few prominent AI application companies in London is a strategic advantage for recruiting; the kind of talent that would be spread across many companies in Silicon Valley is far less contested there.

Both founders acknowledge the trade-offs of this approach. While they get access to incredible talent in London with less competition, they must work harder to stay current with developments in Silicon Valley.

When asked about other London-based companies they admire, they mention AI-focused firms like ElevenLabs and Plaine, as well as successful fintech companies like Monzo and Wise. Chris emphasizes that despite being in London, they maintain a global outlook, with a particular focus on winning in the U.S.

Timestamp: [24:51-28:01]

💎 Key Insights

  • Founders often struggle to launch products they perceive as imperfect, even when investors see market readiness
  • Designing for high-stress contexts (like transitions between meetings) requires extreme simplicity and minimal cognitive load
  • AI startups should actively encourage using AI within their own engineering processes to stay at the forefront
  • Company structures in the AI era may shift, with traditionally large departments becoming more systems-focused
  • AI startups can experience unexpectedly rapid growth compared to traditional startups, creating operational challenges
  • Maintaining product quality while moving quickly requires clear prioritization between core and peripheral features
  • Distinguishing between reversible decisions (two-way doors) and irreversible ones (one-way doors) helps teams move faster safely
  • Building in London while targeting Silicon Valley-style ambition creates a unique advantage for talent acquisition
  • Being a prominent AI app company outside of Silicon Valley can make you a "bigger fish in a smaller pond" for recruitment
  • Global competitiveness requires focusing beyond local markets, especially winning in the U.S.

Timestamp: [17:54-28:01]

📚 References

Companies:

  • Granola - AI-powered meeting notes app built by Chris and Sam
  • Socratic - Chris's previous company that he built and sold before Granola
  • ElevenLabs - London-based AI voice technology company mentioned as impressive
  • DeepMind - London-based AI research lab mentioned as part of London's foundation model expertise
  • Plaine - London company noted for building great user experiences
  • Monzo - UK fintech company mentioned as a London success story
  • Wise - UK fintech company (formerly TransferWise) mentioned as a London success story

People:

  • Boss - Granola's CTO mentioned for pushing the team to use AI extensively in their engineering process

Business Concepts:

  • One-way vs. Two-way doors - Decision-making framework for determining when to move quickly (reversible decisions) vs. carefully (irreversible decisions)
  • Product thinking - Skills that Granola screens for in engineering interviews, focusing on user perspective
  • Lizard brain - Term mentioned by Michael referring to instinctive user needs vs. stated preferences
  • Golden goose - Metaphor used to describe Granola's core value of simplicity that could be killed by feature bloat

Locations:

  • London - Where Granola is headquartered
  • Silicon Valley - Region whose startup methodology and culture influences Granola despite being based in London
  • New York - Mentioned as where Chris previously built Socratic

Timestamp: [17:54-28:01]

🚀 The Future of Granola

Michael asks the founders about their ultimate ambition for Granola, beyond being just a note-taking app for people in back-to-back meetings.

Sam explains that other professional categories already have powerful tools that amplify their productivity, but people who primarily work through conversations have been left behind. Professionals whose work revolves around "people stuff" - sales, customer-facing roles, management, investing - haven't had similar tools, because the fundamental unit of their work is natural language and conversation.

Now that AI can understand natural language, Sam sees a historic opportunity to build powerful workspaces for these professionals.

Chris expands on this vision, placing it in a broader historical context and referencing early computing pioneers like Douglas Engelbart and their vision of technology that enhances human capabilities.

Timestamp: [28:06-31:01]

🔄 Early Feedback and Iteration

An audience member named Emily asks about the founders' approach to feedback loops in the early stages of building Granola, particularly how they determined when they had enough data to move forward with decisions.

Chris offers a philosophical perspective that emphasizes intuition over quantitative data in the earliest stages. He clarifies that this doesn't mean working in isolation; instead, he advocates constant user interaction, because watching users struggle with real tasks is what trains a founder's intuition to make better product decisions.

He concludes that looking for quantitative signals is "almost impossible in the early days," and that qualitative insights and refined intuition are more valuable for early-stage products.

Timestamp: [31:01-32:28]

🧩 Maintaining Quality and Taste

Another audience member asks about Granola's design philosophy for creating the "jetpack for the mind" while avoiding becoming a complex CRM system.

Sam emphasizes their user-centric approach: while companies pay for the product, the individual user experience drives product decisions.

He describes two directions they're exploring, starting with creating shared context for teams. This opens up new possibilities, like analyzing patterns across all of a team's sales calls to identify what's working and what isn't.

Chris builds on this, explaining how meeting data provides extraordinary context for AI. He sees meetings as just the beginning, with plans to incorporate emails, Slack, and other data sources to enable more powerful use cases.

Chris shares an example of the potential: a team member built a demo of a "self-writing wiki" for Granola that automatically creates and updates documentation based on meeting content.
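
As a rough sketch of how such a demo could work (an assumption for illustration, not Jim's actual implementation), an LLM can be asked to fold each meeting's notes into the existing page. The example below assumes the OpenAI Python SDK, an API key in the environment, and an arbitrary model name.

```python
# Hypothetical "self-writing wiki" step: merge new meeting notes into an existing page.
# Assumes the OpenAI Python SDK (openai>=1.0) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def update_wiki_page(current_page: str, meeting_notes: str) -> str:
    """Return a revised wiki page that reflects what was just discussed."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You maintain an internal wiki page. Update it to reflect new "
                    "meeting notes: revise outdated facts, record decisions and owners, "
                    "and keep the existing structure and tone."
                ),
            },
            {
                "role": "user",
                "content": f"CURRENT PAGE:\n{current_page}\n\nNEW MEETING NOTES:\n{meeting_notes}",
            },
        ],
    )
    return response.choices[0].message.content
```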

Timestamp: [32:28-35:45]

🔒 Privacy and Data Handling

An audience member named Sundeep asks about user preferences regarding having meeting information stored and transcribed, particularly in sensitive contexts like financial services.

Chris responds by framing AI meeting tools as becoming essential in professional settings, while acknowledging the importance of establishing appropriate boundaries.

He explains that Granola took a less invasive approach from the beginning, and predicts that the privacy conversation will evolve beyond whether meetings are transcribed to focus on who has access to the resulting information.

Chris concludes with a metaphor comparing AI meeting tools to the discovery of fire - too useful to abandon, but requiring thoughtful norms.

The episode concludes with Michael thanking the guests and asking listeners to rate and review the show.

Timestamp: [35:45-38:01]

💎 Key Insights

  • While designers and engineers have powerful productivity tools, professionals whose work centers on conversations have lacked similar tools until AI made natural language processing possible
  • Granola aims to be a "jetpack for the mind" by creating a workspace that amplifies people who work primarily through conversations and meetings
  • In early-stage product development, founder intuition and qualitative user feedback are more valuable than quantitative data
  • Watching users struggle with tasks provides the context needed to develop better product intuition
  • Meeting transcripts contain extraordinarily rich data that can power a wide range of AI applications beyond just note-taking
  • The future of Granola involves expanding beyond meeting notes to become the central workspace where people write documents, emails, and other content that benefits from organizational context
  • A self-writing company wiki that automatically stays up-to-date based on meeting content represents the type of revolutionary applications possible with meeting data
  • AI meeting tools will likely become expected in professional settings, shifting privacy concerns from whether meetings are transcribed to who has access to the transcripts
  • Like fire, AI meeting tools are too useful to abandon, but require thoughtful norms and boundaries to maximize benefits while minimizing risks
  • Building global products requires thinking beyond local markets from the beginning, particularly focusing on competitiveness in the U.S. market

Timestamp: [28:06-38:01]

📚 References

Companies/Products:

  • Granola - AI-powered meeting notes app built by Chris and Sam
  • Figma - Design tool used by designers, mentioned as an example of a professional power tool
  • Photoshop - Design tool mentioned as what designers used "back in the day"
  • Cursor - IDE (Integrated Development Environment) used by engineers
  • VS Code - Microsoft's code editor used by engineers
  • Automation Anywhere - Company of audience member Sundeep who asked about privacy
  • Lightspeed - Venture capital firm that produces the Generative Now podcast
  • Pod People - Production partner for the Generative Now podcast

People:

  • Jim - Granola team member who built the "self-writing wiki" demo
  • Emily - Audience member who asked about early feedback loops
  • Sundeep - Audience member from Automation Anywhere who asked about privacy
  • Douglas Engelbart - Early computing pioneer referenced by Chris, known for his work on human-computer interaction

Concepts:

  • Jetpack for the mind - Chris's description of AI's potential, contrasted with Steve Jobs' description of computers as a "bicycle for the mind"
  • Self-writing wiki - Demo showing how meeting data can automatically generate and update documentation
  • Faster horses - Reference to the apocryphal Henry Ford quote about not asking customers what they want
  • Context window - AI term referring to how much information a model can process at once

Technical Terms:

  • Websim/Webui - Referenced tool that generates HTML pages using large language models
  • LLM - Large Language Model, the type of AI that powers Granola and similar tools

Timestamp: [28:06-38:01]

📒 Promotional Content & Announcements

Podcast Information:

  • Show name: Generative Now
  • Host: Michael Mignano, partner at Lightspeed
  • Production: Produced by Lightspeed in partnership with Pod People

Call to Action:

  • "If you like this episode please do us a favor and rate and review the show on Spotify and Apple Podcasts"
  • "Follow Lightseed at LightseedVP on YouTube, X, LinkedIn and everywhere else"

Future Episodes:

  • "We will be back next week with another conversation"

Timestamp: [37:42-38:01]