To truly understand how a signature is encoded you have to understand how DER encoding works. It is a very simple format: a stream of bytes using the Tag-Length-Value (TLV for short) encoding scheme, meaning:
* everything always starts with a "tag" telling you the type of the "value" to expect, for example a sequence (0x30), an integer (0x02), a UTF-8 string (0x0C), a boolean (0x01), a null (0x05),... Bitcoin signatures only use the sequence and integer tags.
* it is followed by the "length" of the value; for lengths under 0x80 bytes (which covers everything a Bitcoin signature needs) this is just a single byte, longer values use a multi-byte form we can ignore here
* and finally the value itself (see the sketch right after this list)
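As a rough sketch of what reading one TLV element looks like, here is a hand-rolled Python snippet (`read_tlv` is my own name, not a library function, and it assumes the single-byte short-form length described above):

```python
def read_tlv(data: bytes, offset: int = 0):
    """Read one tag-length-value element starting at `offset`.

    Assumes the single-byte "short form" length (< 0x80), which is
    all a Bitcoin DER signature ever needs.
    """
    tag = data[offset]
    length = data[offset + 1]
    value = data[offset + 2 : offset + 2 + length]
    return tag, value, offset + 2 + length  # tag, value, next offset

# a SEQUENCE (0x30) of 4 bytes wrapping one 2-byte INTEGER (0x02)
tag, value, _ = read_tlv(bytes.fromhex("30040202abcd"))
assert tag == 0x30 and value == bytes.fromhex("0202abcd")
```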
So when you see 0x3044022100b55d62280cad71235b7b9ff771e3f6839ae7f02ff117cc620511d027068063b3021f63c7c0b9561e8b59dfde896d2203d4e990df42bd0062ec12cce615b91ce1c501 it translates into:
[sequence][0x44 byte length]
[value=
[integer][0x21 byte length][value=00b55d62280cad71235b7b9ff771e3f6839ae7f02ff117cc620511d027068063b3]
[integer][0x1f byte length][value=63c7c0b9561e8b59dfde896d2203d4e990df42bd0062ec12cce615b91ce1c5]
]
(the trailing 0x01 is not part of the DER structure; it is the sighash flag, SIGHASH_ALL, that Bitcoin appends after the signature)
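To convince yourself of that breakdown, here is a minimal sketch that walks the exact bytes above, with every tag and length asserted (illustrative hand-rolled code, not taken from any Bitcoin library):

```python
sig = bytes.fromhex(
    "3044022100b55d62280cad71235b7b9ff771e3f6839ae7f02ff117cc620511d0"
    "27068063b3021f63c7c0b9561e8b59dfde896d2203d4e990df42bd0062ec12cc"
    "e615b91ce1c501"
)

assert sig[0] == 0x30                    # SEQUENCE tag
assert sig[1] == 0x44 == len(sig) - 3    # 68 payload bytes (+2 header, +1 sighash)

assert sig[2] == 0x02                    # INTEGER tag for r
r_len = sig[3]                           # 0x21 = 33: a 0x00 pad plus 32 bytes
r = int.from_bytes(sig[4:4 + r_len], "big")

s_off = 4 + r_len
assert sig[s_off] == 0x02                # INTEGER tag for s
s_len = sig[s_off + 1]                   # 0x1f = 31 bytes, no pad needed
s = int.from_bytes(sig[s_off + 2:s_off + 2 + s_len], "big")

sighash = sig[-1]                        # trailing 0x01 = SIGHASH_ALL
print(hex(r), hex(s), hex(sighash))
```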
When we encode integers as arbitrary-length byte arrays, the parser needs a way to tell whether the number is positive or negative. DER treats the bytes as a two's-complement number: if the highest bit of the first byte is set, the value is read as negative. Since r and s are always positive, we prepend an extra 0x00 byte whenever that bit is set (which is what happened to the first integer above).
0x63=0b01100011 -> highest bit not set -> no 0x00 needed
0xb5=0b10110101 -> highest bit set -> we need a leading 0x00
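Here is that rule as a small encoding sketch (`encode_der_integer` is a hypothetical helper for illustration, not a standard API):

```python
def encode_der_integer(n: int) -> bytes:
    """Encode a non-negative integer as a DER INTEGER (tag 0x02).

    DER integers are two's-complement, so if the high bit of the first
    content byte is set we must prepend 0x00 or the number would be
    read back as negative.
    """
    body = n.to_bytes((n.bit_length() + 7) // 8 or 1, "big")
    if body[0] & 0x80:          # high bit set -> would look negative
        body = b"\x00" + body
    return bytes([0x02, len(body)]) + body

assert encode_der_integer(0x63).hex() == "020163"    # high bit not set
assert encode_der_integer(0xb5).hex() == "020200b5"  # high bit set, padded
```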