Build an Interactive Q&A Bot in Slack That Talks to NotebookLM Sources

Overview

This article walks you through creating an interactive Slack bot that answers questions from the same documents and notes you use in NotebookLM. NotebookLM doesn't yet expose a public API, so the bot reads your Google Drive documents and calls the Gemini API to replicate NotebookLM-style, source-grounded answers.

This gives you a powerful workflow:
Ask in Slack → Bot reads your Drive docs → Gemini answers using those sources → Answer posted back in Slack

What You Will Build

  • A Slack bot you can message or @mention.

  • It reads the question.

  • It retrieves relevant content from Google Docs (the same sources you use in NotebookLM).

  • It uses Google Gemini (via AI Studio or Vertex AI) to generate a grounded answer.

  • It replies directly in Slack threads.

Prerequisites

  • Slack workspace admin access

  • Google Cloud project with:

    • Gemini API enabled

    • OAuth credentials

  • Google Drive access

  • Node.js or Python environment

  • Your NotebookLM "source documents" stored in Google Drive

Step-by-Step Instructions

Step 1 — Prepare your Knowledge Sources

NotebookLM uses Drive documents as sources.
Your bot will too.

  1. Place your research notes, meeting notes, PDFs, docs, etc. in a folder like:
    NotebookLM Sources

  2. Make sure your service account or OAuth key can read these files (a quick access check is sketched below).
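
A minimal sketch of that access check, assuming a service-account JSON key and a placeholder folder ID (fill in your own values, and share the folder with the service account's email address first):

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Read-only Drive access is enough for a Q&A bot.
    SCOPES = ["https://www.googleapis.com/auth/drive.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES  # assumed key file path
    )
    drive = build("drive", "v3", credentials=creds)

    FOLDER_ID = "your-notebooklm-sources-folder-id"  # assumed placeholder
    resp = drive.files().list(
        q=f"'{FOLDER_ID}' in parents and trashed = false",
        fields="files(id, name, mimeType)",
    ).execute()

    # If this prints your documents, the bot can read them.
    for f in resp.get("files", []):
        print(f["name"], f["mimeType"])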

Step 2 — Create the Slack Bot

  1. Go to api.slack.com/apps and click Create New App.

  2. Choose From scratch.

  3. Add Bot User.

  4. Add OAuth scopes:

    • chat:write

    • app_mentions:read

    • channels:history or im:history (depending on where you want questions asked)

  5. Install the app to your workspace.

  6. Note your:

    • Bot User OAuth Token

    • Signing Secret
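
With those two values in hand, a minimal Bolt-for-Python app can be wired up as a sketch (the environment-variable names below are a common convention, not something Slack requires):

    import os
    from slack_bolt import App

    app = App(
        token=os.environ["SLACK_BOT_TOKEN"],                 # Bot User OAuth Token (xoxb-...)
        signing_secret=os.environ["SLACK_SIGNING_SECRET"],   # Signing Secret
    )

    if __name__ == "__main__":
        # Starts a local HTTP server that Slack's Request URL (Step 3) can point at.
        app.start(port=3000)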

Step 3 — Connect your bot to Slack events

Enable Event Subscriptions:

  1. Turn Event Subscriptions ON.

  2. Add Request URL (your server endpoint).

  3. Subscribe to:

    • app_mention

    • optionally message.channels

  4. Save and reinstall the app if prompted.
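
If you serve the Bolt app behind Flask, the Request URL can point at an endpoint like the one sketched below (the /slack/events path and the bot module name are assumptions, not requirements). Bolt answers Slack's url_verification challenge for you, so the Request URL check should pass once this server is publicly reachable:

    from flask import Flask, request
    from slack_bolt.adapter.flask import SlackRequestHandler

    from bot import app  # hypothetical module holding the Bolt App from Step 2

    flask_app = Flask(__name__)
    handler = SlackRequestHandler(app)

    @flask_app.route("/slack/events", methods=["POST"])
    def slack_events():
        # Bolt verifies the signing secret and dispatches to your event handlers.
        return handler.handle(request)

    if __name__ == "__main__":
        flask_app.run(port=3000)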

Step 4 — Write your bot logic (high-level)

Bot Flow

  1. User sends a message like:
    "@notebookbot summarize the latest notes"

  2. Your backend receives the app_mention event.

  3. It extracts the text after the bot name—this becomes the query.

  4. Backend fetches relevant documents from your Drive folder:

    • Use the Drive API files.list call, filtering on the sources folder as parent (see the sketch after this list)

  5. Convert docs to text (Docs API or PDF-to-text extraction).

  6. Call Gemini model:

    • Provide the query + extracted text

    • Prompt Gemini to cite sources and stay grounded

  7. Send the answer back to Slack using chat.postMessage.
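
Here is one way the whole flow could look in Python. It is a sketch that assumes the app, drive, and FOLDER_ID objects from the earlier steps, a GEMINI_API_KEY environment variable, and the google-generativeai client; the model name is just one example, and only Google Docs are exported here (PDFs would need their own text extraction):

    import os
    import re
    import google.generativeai as genai

    genai.configure(api_key=os.environ["GEMINI_API_KEY"])
    model = genai.GenerativeModel("gemini-1.5-flash")  # example model name

    def fetch_source_texts(drive, folder_id, limit=5):
        """Return (name, text) pairs for Google Docs in the sources folder."""
        resp = drive.files().list(
            q=(
                f"'{folder_id}' in parents and trashed = false "
                "and mimeType = 'application/vnd.google-apps.document'"
            ),
            fields="files(id, name)",
            pageSize=limit,
        ).execute()
        docs = []
        for f in resp.get("files", []):
            text = drive.files().export(
                fileId=f["id"], mimeType="text/plain"
            ).execute().decode("utf-8")
            docs.append((f["name"], text))
        return docs

    @app.event("app_mention")
    def handle_mention(event, say):
        # Steps 1-3: strip the <@BOTID> mention; what remains is the query.
        query = re.sub(r"<@[^>]+>", "", event["text"]).strip()

        # Steps 4-5: pull source text from Drive (the same docs NotebookLM uses).
        sources = fetch_source_texts(drive, FOLDER_ID)
        context = "\n\n".join(f"SOURCE: {name}\n{text[:8000]}" for name, text in sources)

        # Step 6: ask Gemini to answer using only the provided sources.
        prompt = (
            "Answer the question using ONLY the sources below. "
            "Cite the source names you relied on.\n\n"
            f"{context}\n\nQUESTION: {query}"
        )
        answer = model.generate_content(prompt).text

        # Step 7: reply in a thread under the original message
        # (say() calls chat.postMessage under the hood).
        say(text=answer, thread_ts=event["ts"])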

Step 5 — Optional Enhancements

  • Threaded replies so the bot answers under the original question

  • Keyword routing (e.g., “summarize”, “search”, “compare”; a sketch follows this list)

  • Caching of Drive files for speed

  • Summaries written back into the NotebookLM sources folder
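
Keyword routing can be as simple as choosing a different instruction for Gemini based on how the question starts; the keywords and wording below are illustrative, not part of any API:

    # Illustrative keyword routing: the first word of the query picks the
    # instruction that gets prepended to the Gemini prompt.
    INSTRUCTIONS = {
        "summarize": "Write a concise summary of the relevant sources.",
        "compare": "Compare the items mentioned in the question, using only the sources.",
        "find": "Quote the passages most relevant to the question, with source names.",
    }

    def instruction_for(query: str) -> str:
        first_word = query.split()[0].lower() if query.split() else ""
        return INSTRUCTIONS.get(first_word, "Answer the question using only the sources.")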

Step 6 — Deploy your bot

You can deploy to:

  • Google Cloud Run

  • Heroku

  • Railway

  • AWS Lambda

  • Vercel

Make sure the Request URL under Slack’s Event Subscriptions points to your final deployed endpoint.

You Now Have an Interactive Q&A Interface

Your team can now ask:

  • @notebookbot summarize last week’s meeting

  • @notebookbot find all insights about customer pain points

  • @notebookbot compare the three product proposals

The bot will use the same sources that power NotebookLM—bringing research and analysis directly into Slack.