Somewhere on the internet, right now, a senior developer is telling a junior:
"Never store money as decimals. Always use integer cents."
The junior, who will one day repeat that sentence to another junior, nods solemnly and goes off to refactor price to price_cents across 47 files.
The advice isn't wrong. It's just usually unnecessary. And when you cargo-cult it into a project that doesn't actually need it, you end up paying for it every day in small, stupid ways.
The rule makes sense, if you know where it came from
The integer-cents rule exists because floating-point numbers are a disaster for money.
```javascript
0.1 + 0.2
// 0.30000000000000004
```
Not a typo. That's IEEE 754 doing exactly what it was designed to do, which unfortunately isn't to represent decimal fractions exactly. Multiply that tiny error across millions of transactions and you've got yourself an audit problem.
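If the per-operation error looks too small to care about, watch it compound. A quick Python sketch (the drift digits shown are approximate; the drift itself is not):

```python
from decimal import Decimal

# Add ten cents a million times: binary floats drift, exact decimals don't.
print(sum([0.1] * 1_000_000))             # ~100000.0000013, not 100000.0
print(sum([Decimal("0.1")] * 1_000_000))  # 100000.0, exactly
```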
So the old wisdom said:
"Don't use floats. Treat $19.99 as 1999, do integer math, sleep well at night."
Solid advice. In 2004.
But DECIMAL has been standing right there the whole time
Postgres, MySQL, SQL Server, Oracle. Every serious database has had a DECIMAL (or NUMERIC) type for decades. It is not a float. It's fixed-point, exact, base-10 arithmetic. The database is not guessing.
```sql
amount DECIMAL(10, 2)
```
You store 19.99, you get back 19.99. You add 0.10 + 0.20 and the answer is 0.30, not 0.30000000000000004. It's the thing you always wished floats were.
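Here's a minimal round-trip sketch, assuming a local Postgres reachable via psycopg2 (the database and table names are made up; psycopg2 really does hand NUMERIC/DECIMAL columns back as Python's decimal.Decimal, not as floats):

```python
import psycopg2
from decimal import Decimal

conn = psycopg2.connect("dbname=shop")  # hypothetical database
cur = conn.cursor()

cur.execute("CREATE TABLE IF NOT EXISTS orders (amount DECIMAL(10, 2))")
cur.execute("INSERT INTO orders (amount) VALUES (%s)", (Decimal("19.99"),))
conn.commit()

cur.execute("SELECT amount FROM orders")
print(repr(cur.fetchone()[0]))  # Decimal('19.99') -- exactly what went in
```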
For a SaaS billing app, an e-commerce checkout, an invoicing tool, or pretty much any product dealing with normal currency amounts, this is already the end of the conversation. DECIMAL solved the float problem before most of us were born. And yet the rule keeps getting repeated like we're all still living in the PHP 4 era.
The hidden tax of storing cents
When you store 1999 instead of 19.99, you don't actually solve a problem. You move it out of the database and into every other layer of the app.
Now every piece of code that touches money has to remember which unit it's in. Humans are bad at remembering. Forget to divide by 100 in one obscure email template and your customer gets billed $1,999 for their $19.99 subscription. Nobody recovers from that Monday gracefully.
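A contrived Python sketch of that exact failure (the names are hypothetical, the bug is not):

```python
price_cents = 1999  # the column stores integer cents

# The checkout path remembers the convention...
checkout_total = f"${price_cents / 100:.2f}"  # "$19.99"

# ...and the obscure email template doesn't.
email_total = f"${price_cents:.2f}"           # "$1999.00" -- off by 100x
```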
SQL queries turn into SELECT amount / 100.0 FROM .... CSV exports need a post-processing step. Debugging sessions start with "wait, is this column in cents or dollars?" Your analysts learn to hate you. I've been those analysts.
Then there's the sub-cent trap. Gas costs $3.459 a gallon. Interest accrues in fractions. APIs bill in micro-amounts. The moment you need anything smaller than a cent, your tidy integer-cents abstraction collapses and you're refactoring to tenths-of-a-cent or milli-cents or whatever the accountants are calling it this quarter. DECIMAL(19, 4) doesn't care. It just works.
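With an exact decimal type, the third decimal place is a non-event. A sketch (the prices are illustrative):

```python
from decimal import Decimal

price_per_gallon = Decimal("3.459")  # sub-cent pricing, no schema rework
gallons = Decimal("12.5")

total = (price_per_gallon * gallons).quantize(Decimal("0.01"))
print(total)  # 43.24 -- rounded once, at the end, on purpose
```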
Where integers genuinely earn their keep
I'm not saying integer cents are a mistake everywhere. There are real cases for them; they're just narrower than the rule suggests.
- Crypto is the obvious one. Bitcoin lives in satoshis (10⁻⁸ BTC), and once you're working at 8 or 18 decimal places, DECIMAL starts to feel cramped and integers start to feel honest.
- Payment APIs (like Stripe) are another. Stripe speaks in integer minor units because JSON has no decimal type, only numbers that JavaScript will gleefully mangle into floats in transit. That's a serialization choice, not a storage choice. You multiply by 100 at the boundary and move on with your life (there's a sketch of that after this list).
- Core ledger infrastructure, payment rails, high-frequency trading engines where every nanosecond of integer arithmetic matters? Sure, go integers. You probably already know if that's you.
- Sometimes the domain genuinely is a count of the smallest unit rather than money with decimals (think loyalty points or API credits). That's a different model, and integers fit it better.
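That boundary conversion, sketched in Python (the payload shape is illustrative, not Stripe's actual API):

```python
from decimal import Decimal

def to_minor_units(amount: Decimal) -> int:
    """Dollars to integer cents, refusing anything that doesn't fit."""
    cents = amount * 100
    if cents != cents.to_integral_value():
        raise ValueError(f"{amount} is not a whole number of cents")
    return int(cents)

def from_minor_units(cents: int) -> Decimal:
    """Integer cents from the API back to an exact decimal amount."""
    return Decimal(cents) / 100

payload = {"amount": to_minor_units(Decimal("19.99")), "currency": "usd"}
# {'amount': 1999, 'currency': 'usd'} -- cents exist here and nowhere else
```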
If none of that sounds like your app, it isn't your app. Use decimals.
What I actually do
In the database, DECIMAL(10, 2) for normal money, DECIMAL(19, 4) when I want headroom for tax or interest accruals, DECIMAL(18, 8) for anything crypto-adjacent.
In application code, a real decimal type: BCMath or a Decimal class in PHP, decimal.Decimal in Python, and so on. Never float or double. Storing decimals correctly in the database and then doing float math on them in application code is like buying a fireproof safe and keeping matches inside.
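In Python that looks something like this (a sketch; the tax rate is made up):

```python
from decimal import Decimal, ROUND_HALF_UP

# Construct from strings, never from floats: Decimal(0.1) would
# faithfully preserve the float's error, which defeats the point.
price = Decimal("19.99")
tax_rate = Decimal("0.0825")  # hypothetical rate

tax = (price * tax_rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
total = price + tax
print(tax, total)  # 1.65 21.64

print(0.1 + 0.2)                        # 0.30000000000000004
print(Decimal("0.1") + Decimal("0.2"))  # 0.3
```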
At API boundaries, I'm obnoxiously explicit. Document whether a field is 19.99, "19.99", or 1999. Convert at the edge. Don't let conversions drift around the codebase, because that's where most money bugs actually live, not in storage.
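One convention that makes the edge explicit, sketched in Python: money crosses the wire as a string, and the conversion happens exactly once.

```python
import json
from decimal import Decimal

# Serialize: stringify the amount so no JSON parser can float-ify it.
payload = json.dumps({"amount": str(Decimal("19.99")), "currency": "USD"})
# '{"amount": "19.99", "currency": "USD"}'

# Deserialize: back to an exact decimal at the edge, once.
amount = Decimal(json.loads(payload)["amount"])
```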
The actual rule
The grown-up version of the rule isn't "always use integer cents." It's closer to this:
"Don't use floating-point for money, use exact decimal types by default, and reach for integer minor units only when your problem actually benefits from it."
That protects you from the bug that motivated the rule in the first place, without sticking a permanent conversion tax on a codebase that never needed one.
Your database has a type that does fixed-point decimal arithmetic correctly, at a performance cost you will never notice in a typical application. It has been sitting there the whole time. Use it.
And the next time somebody hands you the integer-cents rule like it's scripture, you can nod politely and keep writing DECIMAL(10, 2) like the adult you are.
Otar