Why Notion is rarely the right central system for automation
A lot of teams pick Notion as their single source of truth. In an automated stack, that often becomes a dead end. Here is what to do instead.
Founders love Notion. So do I, for the right things. But every quarter I sit across from a team that has spent eighteen months building their entire operation on top of it, and we have the same conversation: their automations are flaky, their data is split across nine half-synced databases, and nobody is quite sure which one is the real one anymore.
If you are considering Notion as the central data source for your automation stack, this is the article I wish someone had handed me three years ago.
What a central system actually has to do
When you are serious about automation, you need a data source your workflows can trust. Specifically, it has to do three things well.
It has to push notifications in real time when something changes. A record is created, updated, or deleted, and your other systems hear about it without anyone polling.
It has to support clean queries. "All orders without an invoice in the last 14 days" should be a single query, not a script that paginates through pages.
It has to handle high read and write throughput without throttling. Once several workflows run in parallel, the API calls add up faster than you would think.
Notion is weak on all three.
Webhooks are a half-built road
Notion only added webhooks at the end of 2024, and coverage is still incomplete. Block-level changes inside a page are only partially captured, and subscriptions cannot be configured with anywhere near the granularity you would expect coming from HubSpot, Stripe, or any Postgres setup.
So in practice, most teams end up polling. Make or n8n hits Notion every five minutes to detect changes. That works for a while. Until you have 25 databases and 30 workflows, at which point you are firing 8,000 polling requests a day to detect changes that other systems would have pushed via webhooks for free.
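The arithmetic behind that number is easy to check. A minimal sketch, where the poll interval and workflow count are illustrative assumptions, not figures from any real account:

```python
# Back-of-the-envelope cost of polling Notion for changes.
# Assumptions (illustrative): each workflow polls its database every
# 5 minutes, and every poll costs at least one API request (pagination
# makes the real number higher).

POLL_INTERVAL_MIN = 5
WORKFLOWS = 30

polls_per_day_per_workflow = 24 * 60 // POLL_INTERVAL_MIN
total_requests_per_day = WORKFLOWS * polls_per_day_per_workflow

print(polls_per_day_per_workflow)  # 288
print(total_requests_per_day)      # 8640
```

That is over 8,000 requests a day spent asking "did anything change?", almost all of which come back empty.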
The API is not built for automation
Notion's API is primarily a content API. It is excellent at reading and writing pages as block trees. It is mediocre at the kind of database querying you actually need in an automation.
Three concrete examples.
Filter expressions are limited. Nest a few AND/OR conditions across different property types and the API either rejects them or, worse, silently interprets them differently than you would expect. I have debugged workflows where a filter for "Status = done AND created over 7 days ago" was quietly returning records still marked "in progress," because the Notion engine resolved an edge case in a way I had not predicted.
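For reference, here is roughly what that compound filter looks like as a query payload, sketched as a Python dict. The property name "Status" and the exact filter shape are assumptions based on Notion's documented filter object for the database query endpoint, not a tested integration; note also that Notion only permits limited nesting of and/or groups, which is exactly where real-world filters start to break.

```python
from datetime import datetime, timedelta, timezone

# Cutoff for "created over 7 days ago", as an ISO 8601 timestamp.
cutoff = (datetime.now(timezone.utc) - timedelta(days=7)).isoformat()

# Compound filter for "Status = Done AND created over 7 days ago".
# This payload would be sent to POST /v1/databases/{id}/query.
query_filter = {
    "and": [
        {"property": "Status", "status": {"equals": "Done"}},
        {"timestamp": "created_time", "created_time": {"on_or_before": cutoff}},
    ]
}
```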
Property types are brittle. Formula columns cannot reliably be written via the API. Neither can rollup columns. Relation columns only work if you already know the target IDs. Anyone who automates with Notion learns quickly which property types to avoid, because the API ignores them or throws errors.
The rate limit is low. Notion allows roughly three API calls per second per integration. That sounds like a lot until you run a loop over a few hundred records. A simple sync job that updates 500 entries takes about three and a half minutes in practice, two and a half of which is waiting. For live synchronization, that is a non-starter.
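You can see the floor that throttling imposes in a few lines. The rate-limit constant reflects Notion's documented average of about three requests per second; the update callback is a placeholder for whatever your sync job does per record.

```python
import time

RATE_LIMIT_PER_SEC = 3  # Notion's documented average per integration
RECORDS = 500

def update_all(records, do_update):
    """Update records one by one, sleeping to stay under the rate limit."""
    for record in records:
        do_update(record)
        time.sleep(1 / RATE_LIMIT_PER_SEC)  # ~0.33 s of forced waiting per call

# Throttling alone imposes a hard floor on runtime, before any network
# latency or retries:
floor_seconds = RECORDS / RATE_LIMIT_PER_SEC
print(round(floor_seconds))  # 167 -- nearly three minutes of pure waiting
```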
Notion is a view, not a source
This is the most important point, and it is conceptual.
Most Notion setups I see conflate two things that should stay separate: the system of record, where the truth lives, and the system of engagement, where humans interact with it.
Notion is good as a system of engagement. A salesperson appreciates a well-built Notion board for their pipeline. A project manager finds their roadmap clearer in Notion than in Jira.
But as a system of record, the source your workflows, reports, and integrations trust, Notion offers no real strengths. Only convenience. And convenience is the most expensive property a central data source can have, because it papers over every other problem until those problems are too big to ignore.
The familiar dead end
Here is the arc I have watched play out repeatedly.
Phase 1, enthusiasm. A team builds two or three Notion databases, links them with relations, automates a few small things with Make. It works. Everyone is happy.
Phase 2, growth. Three databases turn into twelve. Three workflows turn into twenty. Databases start depending on each other, and workflows weave across them.
Phase 3, friction. Filters need workarounds. Individual workflows fail silently because someone changed a property type. Make starts throwing rate-limit errors. Nobody is sure which customer database is the real one, because there are three of them, all supposedly syncing.
Phase 4, fragmentation. Employees start keeping spreadsheets on the side because the Notion view is too slow. Data drifts. Reports contradict each other. Eventually someone asks the killer question: "What is our actual data source?"
That is where Notion stops helping.
What works instead
What I usually tell clients: keep Notion for what it is good at. But build the layer underneath properly.
What that looks like depends on your context:
A dedicated CRM for customer and sales data. HubSpot, Pipedrive, Attio, or if you prefer open source, EspoCRM. These have real webhooks, sane filtering, higher rate limits, and a data model built for sales processes. Notion gets a read-only view via a synced database or an embed, if your team still wants to work there.
A real database for structured business data. Postgres or Supabase, paired with a lightweight frontend like NocoDB or Baserow for staff who need a spreadsheet-style UI. Sounds heavy, is not. A Supabase project is two hours of setup, and the API is in a different league from Notion's.
Airtable for the middle ground. If you have a fundamentally tabular data world where a real database would be overkill, Airtable is a better fit. Better API, better filters, higher rate limits, reliable webhooks.
Notion stays as a wiki, internal handbook, meeting notes, project overview. There it is hard to beat.
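To make the contrast with Notion's filter workarounds concrete, here is the earlier "orders without an invoice in the last 14 days" question as a single SQL query. This is a sketch: sqlite3 stands in for Postgres/Supabase so it runs anywhere, and the schema and table names are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders   (id INTEGER PRIMARY KEY, created_at TEXT);
    CREATE TABLE invoices (id INTEGER PRIMARY KEY, order_id INTEGER);
    INSERT INTO orders VALUES (1, datetime('now', '-3 days')),
                              (2, datetime('now', '-20 days')),
                              (3, datetime('now', '-1 day'));
    INSERT INTO invoices VALUES (1, 1);  -- order 1 already has an invoice
""")

# Recent orders that have no matching invoice row -- one query, no
# pagination loop, no client-side filtering.
rows = conn.execute("""
    SELECT o.id
    FROM orders o
    LEFT JOIN invoices i ON i.order_id = o.id
    WHERE i.id IS NULL
      AND o.created_at >= datetime('now', '-14 days')
    ORDER BY o.id
""").fetchall()

print(rows)  # [(3,)] -- order 2 is too old, order 1 is invoiced
```

Against Notion, the same question means paginating through the orders database, paginating through the invoices database, and joining the two in your workflow tool.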
When Notion does fit
To keep this honest: there are setups where Notion as a central source works fine.
- Teams under ten people with fewer than five parallel workflows.
- Use cases where latency does not matter. A workflow that runs once a night sits comfortably inside the rate limit.
- Pure content or knowledge processes, where Notion is the right home anyway: blog drafts, newsletter pipelines, help center articles.
In those cases, it is a sensible default. The moment workflows start bumping into Notion's edges, that is the signal to rethink the architecture before you add the eleventh database.
The honest recommendation
If you are early and considering whether Notion will be your central data source for automation, think two steps ahead.
Ask yourself: what happens when volume triples? When three people edit the same database at once? When a second system needs live updates from this data? When a workflow halts mid-loop with a rate-limit error?
If you do not have a satisfying answer to those, Notion is not the right home for that data. Not today, and not a year from now.
This is not a smear. Notion is genuinely excellent at what it does. It is just not a database product, and treating it like one ends in tears.
If you want to know whether your current data architecture can carry the next stage of growth, the free Automations Check gives you a clear picture in about 30 minutes.