I think he just meant that you can add 3 and 2 instantly in your head, whereas it will take you longer to add 334212132454779 + 675456421213132457964, since most people can't do that without resorting to pencil and paper.
Humans can't add large numbers in their heads because of limited working memory. You can't hold the addition of 8173 and 2509 in your head, let alone larger numbers. That's why you write the numbers down and then mechanically run the step-by-step column-addition algorithm you learned in elementary school (starting from the rightmost column), which has a time complexity of O(n) in the number of digits.
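Here is a minimal Python sketch of that pencil-and-paper routine (my own toy version for illustration; it takes the numbers as decimal digit strings). The point is that it walks one column per digit, so the work grows linearly with the length of the numbers:

```python
def column_add(a: str, b: str) -> str:
    """Add two non-negative integers given as decimal digit strings,
    the way you'd do it on paper: right to left, carrying 1s."""
    result = []
    carry = 0
    i, j = len(a) - 1, len(b) - 1
    while i >= 0 or j >= 0 or carry:
        digit_a = int(a[i]) if i >= 0 else 0
        digit_b = int(b[j]) if j >= 0 else 0
        total = digit_a + digit_b + carry
        result.append(str(total % 10))   # digit written in this column
        carry = total // 10              # carry passed to the next column
        i, j = i - 1, j - 1
    return "".join(reversed(result))

print(column_add("8173", "2509"))  # 10682
```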
Computers, on the other hand, handle addition differently. When you give a computer a number, it stores it in a fixed-size set of bits. For example, 8173 and 2509 as 32-bit integers are represented as:
00000000000000000001111111101101
00000000000000000000100111001101
The addition is done bit by bit, starting from the rightmost bit, with the following rules (a small code sketch follows the example below):
- If both bits are '0', the result bit is '0'.
- If one bit is '1' and the other is '0', the result bit is '1'.
- If both bits are '1', the result bit is '0' and we carry a '1' into the next column (an incoming carry from the previous column is simply added to the two bits in the same way).
00000000000000000001111111101101
+
00000000000000000000100111001101
=
00000000000000000010100110111010
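Here is a minimal Python sketch of that bit-by-bit (ripple-carry) addition; the function name add_32bit and the explicit loop are just for illustration, since real hardware handles all 32 bit positions in a fixed adder circuit rather than a software loop:

```python
def add_32bit(x: int, y: int) -> int:
    """Ripple-carry addition of two 32-bit unsigned integers,
    one bit position at a time, in exactly 32 steps."""
    result = 0
    carry = 0
    for bit in range(32):                 # always 32 iterations, whatever the values are
        a = (x >> bit) & 1                # bit of the first number at this position
        b = (y >> bit) & 1                # bit of the second number at this position
        total = a + b + carry
        result |= (total & 1) << bit      # result bit for this column
        carry = total >> 1                # carry into the next column
    return result

print(add_32bit(8173, 2509))               # 10682
print(f"{add_32bit(8173, 2509):032b}")     # 00000000000000000010100110111010
```

Note that the loop runs 32 times whether you add 3 + 2 or two numbers near the 32-bit limit; that fixed amount of work is what the O(1) claim below refers to.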
This way, regardless of how many digits your numbers have, as long as they fit in 32 or 64 bits, addition finishes in O(1) time, in contrast with the human algorithm, whose running time grows linearly with the number of digits.