Ask HN: Did anyone learn basic arithmetic as "snapshots" instead of procedures?

I’ve been thinking about how people actually learn basic arithmetic — whether their first intuition comes from procedures or from “snapshots”.

I’ve noticed adults with surprisingly different gut-level approaches, and I’m curious how early those patterns form.

This isn’t about correctness — just how your earliest intuition took shape.

Some people learn through procedures (e.g., understanding 2+3 as (1+1)+(1+1+1)). Others seem to learn what I’d call “snapshot learning”:

they simply internalize that “2 + 3 = 5” without seeing the intermediate structure.

So I’m curious:

• Did you learn arithmetic as procedures or as memorized snapshots?

• If you learned snapshots first, when did procedural understanding show up?

• Do you think math education should make the inner steps more visible?

I’ve been exploring a notation (Δⁿ sort) that exposes intermediate steps, but the real question is how people learned arithmetic, not the notation itself.

Curious to hear your experiences.

(Background below — feel free to skip.)

─── ─── ─── ─── ─── ─── ─── ─── ─── ───

Thinking back, I don’t think I was particularly bad at basic arithmetic as a child (though I may be idealizing my memory a bit).

What I do remember clearly is getting stuck at division. In hindsight, this might have been a side effect of learning mainly through snapshots.

Terms like “numerator” and “denominator” didn’t connect to anything. Decimals felt vaguely unstable. π was just a symbol that produced correct answers but didn’t relate to any internal structure.

Even “2 × 2 = 4” made sense to me only as a kind of visual pairing, not as the procedure (1+1)+(1+1). It simply became “4”. For me, it was a result, not a process.

It’s also possible that this partial procedural understanding is the reason I drifted into pure snapshot learning later on. I honestly don’t remember.

To illustrate what I mean by “making the steps visible”, here’s the same idea using Δ-sort-style notation:

2 × 2

=³ (1+1) + (1+1)

=² 2 + 2

= 4
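The expansion above can be sketched as a tiny program. This is a toy trace, not a serious tool — the function names are mine, and it assumes small whole numbers — but it shows the same idea of keeping every intermediate step visible instead of jumping straight to the snapshot:

```python
def as_ones(n):
    # Write n as an explicit sum of 1s, e.g. 2 -> "(1+1)"
    return "(" + "+".join(["1"] * n) + ")"

def visible_multiply(a, b):
    # Expand a x b as b copies of a, each broken down into 1s,
    # returning every intermediate line rather than just the result.
    steps = [f"{a} x {b}"]
    steps.append(" + ".join(as_ones(a) for _ in range(b)))   # (1+1) + (1+1)
    steps.append(" + ".join(str(a) for _ in range(b)))       # 2 + 2
    steps.append(str(a * b))                                 # 4
    return steps

for line in visible_multiply(2, 2):
    print(line)
```

The point is just that each line of output corresponds to one “=ⁿ” step in the notation above.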

─── ─── ───

Division, however, still felt opaque.

“2 ÷ 2 = 1” — I knew the answer, but even now it doesn’t fully click. I understand it intellectually as “splitting 2 into two equal parts”, but I don’t recall anyone teaching it that way.

Without first internalizing the idea that “half of 2 is 1”, the operation never felt grounded.

To illustrate what I mean, here’s that same idea using a Δ-style breakdown:

2 ÷ 2

=³ (1+1) ÷ 2

=² 1 and 1

= 1

* “1 and 1” could just as well be written as (1, 1) or (1 | 1).
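The division breakdown can be sketched the same way — again a toy, with my own function names, and restricted to exact whole-number division so the “split into equal parts” reading actually holds:

```python
def as_ones(n):
    # Write n as an explicit sum of 1s, e.g. 2 -> "(1+1)"
    return "(" + "+".join(["1"] * n) + ")"

def visible_divide(a, b):
    # Expand a / b as splitting a's units into b equal groups,
    # returning every intermediate line rather than just the result.
    assert a % b == 0, "toy sketch: exact division only"
    group = a // b
    steps = [f"{a} / {b}"]
    steps.append(f"{as_ones(a)} / {b}")                      # (1+1) / 2
    steps.append(" and ".join(str(group) for _ in range(b))) # 1 and 1
    steps.append(str(group))                                 # 1
    return steps

for line in visible_divide(2, 2):
    print(line)
```

As with multiplication, the intent is that a learner sees the “splitting into equal parts” step itself, not only the memorized answer.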

I’m not a mathematician, so I’m not claiming the notation is rigorous. The point isn’t precision — it’s visibility.

This is meant as a lightweight way to make intermediate steps visible for early learners.

For people who won’t go on to study higher mathematics, something like this might be sufficient — or at least less alienating.

─── ─── ───

Thanks for reading.

None of this is meant as a strong claim — I’m mainly curious how others learned.

If this is off-topic for Ask HN, feel free to ignore.

2 points | by ursAxZA 7 hours ago
