lesson_05: how computation actually works
the tool shows a simple operation
a + b = c
but internally, this is not abstract arithmetic
it is state transformation
what the system really does
a computer does not understand numbers
it operates on:
binary values
registers
logic operations
transitions between states
in your tool, this is visible step by step
breaking down the process
when you press “run”:
values are loaded into registers
data is moved into the alu
the operation is executed
the result is written back
execution stops
this is not a single action
it is a sequence of micro-steps
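a minimal c sketch of that sequence (machine, run_add, and the field names are invented for illustration, not any real processor's internals):

#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t r0, r1;   /* registers */
    uint8_t alu_out;  /* alu result latch */
    int carry;        /* carry flag */
    int halted;
} machine;

void run_add(machine *m, uint8_t a, uint8_t b) {
    m->r0 = a;                     /* values are loaded into registers */
    m->r1 = b;
    unsigned sum = m->r0 + m->r1;  /* data moves into the alu, operation executes */
    m->carry = sum > 0xFF;         /* overflow raises the carry flag */
    m->alu_out = (uint8_t)sum;     /* the result is written back */
    m->halted = 1;                 /* execution stops */
}

int main(void) {
    machine m = {0};
    run_add(&m, 5, 3);
    printf("result=%u carry=%d\n", (unsigned)m.alu_out, m.carry);
    return 0;
}

each assignment is one state change: registers, then the alu latch, then the flag, then the halt bit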
binary representation
every number becomes a fixed-size binary value
example:
5 = 00000101
3 = 00000011
the system never sees “5” or “3”
only bits
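a small c sketch that prints the bit pattern actually stored (print_bits is a helper name invented here):

#include <stdint.h>
#include <stdio.h>

void print_bits(uint8_t v) {
    for (int i = 7; i >= 0; i--)  /* most significant bit first */
        putchar(((v >> i) & 1) ? '1' : '0');
    putchar('\n');
}

int main(void) {
    print_bits(5);  /* 00000101 */
    print_bits(3);  /* 00000011 */
    return 0;
}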
addition is bit logic
the operation is performed bit by bit
with carry propagation:
1 + 1 = 0 with carry 1
carry moves to the next bit
this continues until all bits are processed
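a ripple-carry sketch in c, using only xor and and the way the hardware does (add8 is an invented name, assuming 8-bit values):

#include <stdint.h>
#include <stdio.h>

uint8_t add8(uint8_t a, uint8_t b, int *carry_out) {
    uint8_t result = 0;
    int carry = 0;
    for (int i = 0; i < 8; i++) {
        int abit = (a >> i) & 1;
        int bbit = (b >> i) & 1;
        result |= (uint8_t)((abit ^ bbit ^ carry) << i);  /* sum bit */
        carry = (abit & bbit) | (carry & (abit ^ bbit));  /* carry to next bit */
    }
    *carry_out = carry;
    return result;
}

int main(void) {
    int carry;
    uint8_t r = add8(5, 3, &carry);
    printf("%u (carry %d)\n", (unsigned)r, carry);  /* 8 (carry 0) */
    return 0;
}

the carry line is the whole story: a new carry appears when both input bits are 1, or when an incoming carry meets a 1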
limits and overflow
in an 8-bit system:
max value = 255
if the result exceeds this:
the carry flag is set to 1
the result wraps (modulo 256)
example:
170 + 100 = 270
270 wraps to 14 (carry flag set)
this is not an error
this is expected behavior
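the same wrap is observable in c; the uint8_t operands are promoted, the full 270 is computed, and the assignment truncates it modulo 256:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t a = 170, b = 100;
    uint8_t r = a + b;            /* 270 does not fit in 8 bits */
    printf("%u\n", (unsigned)r);  /* prints 14: 270 % 256 */
    return 0;
}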
what this teaches you
everything in computing is:
deterministic
constrained
mechanical
there is no “intelligence” at this level
why this matters
modern systems hide this layer
but it is still there
every high-level system
every ai model
every distributed platform
is built on top of this
the key shift
if you don’t understand this layer
everything above it looks like magic
if you do
everything becomes explainable
practical mindset
when working with systems, ask:
what are the states
what are the transitions
where are the limits
modern systems
today’s systems did not change the underlying model
they scaled it and optimized its execution
modern processors use 32-bit and 64-bit registers
and every operation is still
bit-level
bounded
deterministic
when a result exceeds the limit
it does not expand
it wraps
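the same rule at modern widths, in c (the standard defines unsigned 32-bit arithmetic to wrap modulo 2^32):

#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

int main(void) {
    uint32_t x = UINT32_MAX;     /* 4294967295, the 32-bit limit */
    x = x + 1;                   /* does not expand: wraps to 0 */
    printf("%" PRIu32 "\n", x);
    return 0;
}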
interpretation
the same bits can represent different meanings
unsigned → pure magnitude
signed → two's complement, where the top bit carries the sign
the underlying operations are the same
but interpretation is guided by flags and context
the hardware processes bits
meaning is applied through instruction sets and software
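one byte, two readings, sketched in c (the int8_t cast assumes two's complement, which every mainstream machine uses):

#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint8_t bits = 0xFF;             /* 11111111 */
    printf("%u\n", (unsigned)bits);  /* unsigned reading: 255 */
    printf("%d\n", (int8_t)bits);    /* signed reading: -1 */
    return 0;
}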
what actually changed
modern systems introduced layers of optimization
but not a new computational model
everything today
from web systems to ai
still relies on
fixed-width registers
bit-level operations
deterministic transitions
the fundamentals did not change
only the scale and complexity increased