For those playing along at home, you'll no doubt note that (should my understanding be correct) BW consumption scales as O(n^2).
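To make the back-of-envelope concrete: this is just my assumption of a naive flood-everything gossip scheme, not a claim about what LN actually ships. If each of n nodes broadcasts its state update to every other node, one update round costs n*(n-1) messages:

```python
# Toy sketch (my assumption of naive flooding, not LN's actual gossip protocol):
# every one of n nodes sends its channel state to all n-1 peers each round.

def flood_messages(n: int) -> int:
    """Total messages per update round under all-to-all flooding: n * (n - 1)."""
    return n * (n - 1)

for n in (10, 100, 1000):
    print(n, flood_messages(n))
```

Hence the quadratic blow-up: 10x the nodes means roughly 100x the traffic.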
I'm sure it will improve when the routing invention breakthrough occurs.
Of course - the clueless developers! D'oh, what an unforgivable oversight! The system they implemented is doomed. They chose the SAN (Spam All Network) algorithm for both route discovery and node state updates. They slept through their networking lessons, or was it calculus? So they failed to notice that bandwidth grows quadratically with the number of nodes. That way, it's almost as bad as if one increased the block size. But who would ever think of that?
Hyperbole duly noted. Be that as it may, do you have any evidence that suggests that my understanding is incorrect?
Honestly, I don't know which solution was eventually adopted in this first version of the LN, but already in 2016 the proposed method was significantly better than the naive SAN technique you hint at.
Great. What does that have to do with the shipping implementation? Anything?
Let's summarize our path so far:
you: I'll try it out
me: cool - please let me know about your BW use. My understanding is it broadcasts all state to everyone
you: oh posh! how dare you assume the devs did not do better
me: well, that's what I'm led to believe - you have better info?
you: no, but here's a possibly relevant scrap of a possibly workable design that would do better
I mean, I don't mean to misrepresent anything here, but that's the way it looks from my vantage point.
https://steemit.com/bitcoin/@emabfuri/bitfury-bitcoin-routing-lightning-network-solution

Here's a relevant excerpt.
Using a fog of war like design, the collected information by the routing algorithm “includes channels within a low hop-distance and paths to randomly selected nodes further away… As a result, a node will have a well-illuminated map of its local neighborhood within the network, with random patches of visibility further away enabled by the selection of beacon nodes.”
I often write with rhetorical, metaphorical flourish as well. But not within a technical specification that I expect to be taken seriously. So in addition to questioning this excerpt's relevance to the current code, I also question its author's capability.
So, it's not an initially low TTL that gets conservatively increased as I had wildly imagined, but not so far off either: closer nodes are exhaustively enumerated, while a small set of distant nodes is selected pseudorandomly.
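If I've read that excerpt right, the node's "map" could be sketched roughly like this. To be clear, everything below is my own illustration (the function names, parameters, and graph representation are mine, not from the Flare paper or any LN implementation): exhaustive BFS out to a hop radius, plus a few pseudorandomly chosen distant nodes as "beacons".

```python
import random
from collections import deque

def local_view(graph, start, radius, n_beacons, seed=0):
    """Hypothetical 'fog of war' view: full knowledge within `radius` hops,
    plus `n_beacons` pseudorandomly selected nodes further away.
    `graph` is an adjacency dict: node -> list of neighbor nodes."""
    # BFS to exhaustively enumerate everything within `radius` hops.
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        if dist[u] == radius:
            continue  # don't expand past the visibility horizon
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    neighborhood = set(dist)
    # Pseudorandom "beacons" drawn from nodes outside the neighborhood.
    far = sorted(set(graph) - neighborhood)
    rng = random.Random(seed)
    beacons = set(rng.sample(far, min(n_beacons, len(far))))
    return neighborhood, beacons

# Demo on a 6-node line graph 0-1-2-3-4-5:
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
nbhd, beacons = local_view(graph, start=0, radius=2, n_beacons=2, seed=1)
print(nbhd)     # everything within 2 hops of node 0
print(beacons)  # two pseudorandomly chosen distant nodes
```

Note the trade-off this implies: per-node state is far smaller than a full network map, but anything that is neither nearby nor a beacon is simply invisible, which is exactly the elision I'm complaining about below.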
Which still leaves quite the bulk elided from view. In what way is this anonymous? How does it handle nodes dropping off the network? And the ensuing rejoins? 'Beacons'? Is that a euphemism for trusted intermediary?
It just sounds like we're being asked to buy a pig in a poke. Again, my assumptions could be wrong. But life has taught me well that SW projects are rarely underpromised and overdelivered.