Topic: Trying to understand how numbers are interpreted in scripts

legendary
Activity: 4522
Merit: 3426
Your values look correct to me. Numbers are little-endian, with the sign carried in the most significant bit.

Range                   Representation
==========================================
0 - 127                 00 - 7f
0 - -127                80 - ff
128 - 32767             80 00 - ff 7f
-128 - -32767           80 80 - ff ff
32768 - 2^23-1          00 80 00 - ff ff 7f
-32768 - -2^23+1        00 80 80 - ff ff ff
2^23 - 2^31-1           00 00 80 00 - ff ff ff 7f
-2^23 - -2^31+1         00 00 80 80 - ff ff ff ff
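
For reference, here is a minimal sketch of that encoding in C++, modeled on the logic of Bitcoin Core's CScriptNum::serialize (the function name and exact code are my own, not a verbatim copy):
Code:
#include <cstdint>
#include <vector>

std::vector<uint8_t> EncodeScriptNum(int64_t value)
{
    std::vector<uint8_t> result;
    if (value == 0) return result;              // zero is the empty array

    const bool neg = value < 0;
    uint64_t abs_value = neg ? 0 - static_cast<uint64_t>(value)
                             : static_cast<uint64_t>(value);

    while (abs_value) {
        result.push_back(abs_value & 0xff);     // little-endian: low byte first
        abs_value >>= 8;
    }

    // If the top byte already has its high bit set, append an extra byte so
    // that bit can serve purely as the sign flag; otherwise a negative number
    // gets its sign bit set directly in the top byte.
    if (result.back() & 0x80)
        result.push_back(neg ? 0x80 : 0x00);
    else if (neg)
        result.back() |= 0x80;

    return result;
}
For example, EncodeScriptNum(-129) returns { 0x81, 0x80 }, i.e. { 129, 128 }, which falls in the "-128 - -32767" row of the table.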
legendary
Activity: 1042
Merit: 2805
Bitcoin and C♯ Enthusiast
I'm trying to figure out how numbers are interpreted inside scripts. Could someone check whether the following test cases are correct?
"bytes" is only the data part; I'm skipping the size (push) byte, which would be 0x01, 0x02, or 0x03 for 1-, 2-, and 3-byte data respectively.
Code:
Integer --- bytes
17       { 17 }
75       { 75 }
76       { 76 }
127      { 127 }
128      { 128, 0 }
129      { 129, 0 }
255      { 255, 0 }
256      { 0, 1 }
32767    { 255, 127 }    // 32767 = 0x7fff -> LE bytes ff 7f = 0b11111111 01111111
32768    { 0, 128, 0 }   // 32768 = 0x8000 -> LE bytes 00 80; high bit set, so append 00
32769    { 1, 128, 0 }   // 32769 = 0x8001 -> LE bytes 01 80; high bit set, so append 00
65535    { 255, 255, 0 } // 65535 = 0xffff -> LE bytes ff ff; high bit set, so append 00
-2       { 0b10000010 }  // 0b10000010 = 0x82 = 130
-127     { 0b11111111 }  // 0b11111111 = 0xff = 255
-128     { 128, 128 }    // magnitude 128 = 0b10000000; high bit taken, so append sign byte 0x80
-129     { 129, 128 }    // magnitude 129 = 0b10000001; high bit taken, so append sign byte 0x80
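For anyone who wants to round-trip these vectors, here is a hedged sketch of the decode direction in C++, mirroring the logic of Bitcoin Core's CScriptNum constructor (names here are mine):
Code:
#include <cstdint>
#include <vector>

int64_t DecodeScriptNum(const std::vector<uint8_t>& bytes)
{
    if (bytes.empty()) return 0;                // empty array decodes to zero

    int64_t result = 0;
    for (std::size_t i = 0; i != bytes.size(); ++i)
        result |= static_cast<int64_t>(bytes[i]) << (8 * i);  // little-endian

    // If the sign bit of the most significant byte is set, clear it and
    // negate the remaining magnitude.
    if (bytes.back() & 0x80)
        return -(result & ~(INT64_C(0x80) << (8 * (bytes.size() - 1))));

    return result;
}
Decoding each byte array above and re-encoding it should reproduce the same array, which is a quick way to confirm the vectors.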