
Topic: Ogg Opus

sr. member
Activity: 420
Merit: 262
February 27, 2016, 05:30:07 AM
#3
The container is very efficiently seek-able over the network, in fact. But your implementation must be sufficiently intelligent.

Here are some benchmarks from the opusfile library, on a 25-hour-long variable-bitrate audio recording, performing 1000 jumps to exact sample positions with no caching:

Total seek operations: 1873 (1.873 per exact seek, 4 maximum).

So an average of less than two operations, and in the worst case 4-- meaning that even with a 100 ms RTT a worst-case seek will not cause a noticeable delay (and obviously, if it cached, it would be much better).

I am not an expert on streaming media and have just begun my research, but it seems to me that your quoted benchmark assumes many seeks will be done on the stream. There are cases where the user only wants to skip once or twice into a song while sampling the music for the first time, which is my use case. For that case, the lack of an index is afaics horrific, because there is the latency of the roundtrips required to locate the seek position optimally (by a bisection sampling method), as well as wasted bandwidth.
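
To make that cost concrete, here is a rough, illustrative model using only the figures quoted above (1.873 probes on average, 4 worst case, 100 ms RTT); the per-probe read size is my own assumption, not a measured number:

Code:
fn main() {
    let rtt_ms = 100.0;       // round-trip time from the quote above
    let avg_probes = 1.873;   // measured average from the opusfile benchmark
    let max_probes = 4.0;     // measured worst case from the same benchmark
    let kib_per_probe = 64.0; // assumed generous read per range request (not measured)

    println!("average seek latency:    ~{:.0} ms", avg_probes * rtt_ms);
    println!("worst-case seek latency: ~{:.0} ms", max_probes * rtt_ms);
    println!("wasted data per seek:    ~{:.0} KiB on average", avg_probes * kib_per_probe);
}

Whether a couple hundred milliseconds and roughly a hundred kilobytes per one-off seek is acceptable is exactly the tradeoff I am questioning here.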

There are many complaints about the lack of an index to be found via a Google search. In particular I note the "Random Access" section of a list of complaints about the Ogg container design.

And this requires no indexes, which would require the file to be written in multiple passes or make it seekable only once it's "finished"-- a live stream is seekable while it's being recorded.

You apparently assume your container is only going to be used for live streams.

So you are saying Ogg is not designed as an optimal archive format. You could have instead made the index optional.

I think the seeking performance, given a good seeking implementation, is pretty good, and it's often more efficient than other, less flexible containers even when they have indexes-- because to keep their indexes reasonably sized they're not high resolution-- and getting them requires seeking to the end or a rewrite of the file. To the extent that an average near 2 is less good than the 1 you might get with a perfect index, that is a cost of the streaming functionality, and I think a reasonable one.

Hey, the resolution could be a configurable parameter so programmers can decide the tradeoff that is ideal for their application. Why should you decide for them?

I didn't design the container, but if I had-- I might only have added an additional back-reference or two at each page; I wouldn't change how seeking works.

Seems you are excluding use cases.

I am getting the idea now, after interacting with you several times, that you are clearly better at math than I am, but I am a better designer than you. You seem to pigeon-hole often. Perhaps it is the heads-down quality that is required to have the patience to learn all that math?

I recognize your intellect and attention to detail, but you also seem to be inflexible, which is not a trait of the best software designers I've known in my life. I am guessing that maybe you are strongly German-cultured and need everything neatly ordered in your own space. Note I have some German ancestry (and it sometimes shows in my perfectionism), but I am also a mixed breed of French, Celtic, and Cherokee native. I think this makes me more creative/flexible than you. Not that I often think about comparing ourselves, just at times like this when you disrespect others (and then somehow expect they would respect you).

Perhaps you were not intending to disrespect me in this case, but I think the past track record (see link above) is what leads to this tension, which is ready to blow at any time. Also, I am holding you to a higher standard w.r.t. this crap that you seem to be foisting with Segregated Witness, because you hold everyone else to such a high standard and even threaten to call them scammers without sufficient proof (again see my link above). Perhaps I should realize this is just the German trait and brush it off my shoulder. I think the less we interact the better. Thanks for the reply, and I wish your orderliness didn't rub me the wrong way. Sorry, I am about as compatible with a German as I am with shooting myself in the head. I love freedom, which means stop stomping on others. I am a sheepdog, which means I will fight those who I perceive are oppressing freedom of expression. Hey, I understand what it is like to deal with trolls, and in that case I would support swift action since trolls only aim to disrupt, but I wasn't trolling you.

Tim hasn't really had anything to do with Rust-- I had some infinitesimally small influence on the language (lobbied successfully for overflow of ordinary signed types to be defined as an error). Rust has some nice properties for cryptocurrency infrastructure: in particular it has no overhead and deterministic operation, with high levels of safety enforced at compile time. These things matter for decentralized cryptocurrency, since speed is also a security consideration.

I know I read something about Rust that mentioned one of your two names, but I forgot the specifics and couldn't readily locate it again with a Google search (at least not under Tim's name).

Okay, I can understand that Rust's less complete typing (as compared to, say, Haskell or Scala) would not be the priority when the goal is to get as close as possible to the metal while still having some higher-level functionality not provided by C.

Anyway, as I said, I have mostly forgotten the specifics of my former criticisms; I would need to refresh my memory. I think the flaw was w.r.t. declaring the invariants for class members in the class methods, if I am not mistaken (but that is very vague, so I might be recollecting incorrectly).
staff
Activity: 4284
Merit: 8808
February 26, 2016, 11:31:33 PM
#2
The container is very efficiently seek-able over the network, in fact. But your implementation must be sufficiently intelligent.

Here are some benchmarks from the opusfile library, on a 25-hour-long variable-bitrate audio recording, performing 1000 jumps to exact sample positions with no caching:

Total seek operations: 1873 (1.873 per exact seek, 4 maximum).

So an average of less than two operations, and in the worst case 4-- meaning that even with a 100 ms RTT a worst-case seek will not cause a noticeable delay (and obviously, if it cached, it would be much better). In other words, these figures involve absolutely no caching or shared state between the 1000 trials; they're completely independent. If your application only ever did a single seeking operation, it would perform an expected 1.873 random reads to do it.

This test is part of the opusfile tests directory; you can run it yourself on your own files. Even if you constructed a malicious file that looked nothing like actual VBR audio, the worst it can do is bisection-- so logarithmically many operations (I believe the method we use can formally be proven to never perform more than 2x the number of probes of plain bisection on any input, no matter how crazy). It turns out that over large timescales VBR isn't very variable; the variations average out, and once your search space is small enough that they don't, you've already pretty much landed at the solution.
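
To illustrate the principle, here is a sketch of plain bisection over byte offsets in Rust-- not opusfile's actual code, and the real search also refines its guesses from the granule positions it has already seen instead of always splitting at the midpoint:

Code:
// `probe` stands in for "issue a range request at this offset, scan forward
// to the next page boundary, and read that page's granule position".
const STOP_WINDOW: u64 = 64 * 1024; // stop bisecting once only a few pages remain

fn bisect_seek<F: FnMut(u64) -> u64>(
    mut lo: u64,
    mut hi: u64,
    target: u64,
    mut probe: F,
) -> (u64, u32) {
    let mut probes = 0;
    while hi - lo > STOP_WINDOW {
        let mid = lo + (hi - lo) / 2;
        probes += 1;
        if probe(mid) < target {
            lo = mid; // the target sample lies after this page
        } else {
            hi = mid; // the target sample lies at or before this page
        }
    }
    (lo, probes) // finish by reading linearly forward from `lo`
}

fn main() {
    // Fake "25 hour" stream: granule position grows roughly linearly with byte
    // offset, plus a little wobble so it isn't perfectly predictable.
    let file_len: u64 = 1_500_000_000;
    let total_samples: u64 = 25 * 3600 * 48_000;
    let probe =
        |off: u64| (off as u128 * total_samples as u128 / file_len as u128) as u64 + off % 7919;
    let (offset, probes) = bisect_seek(0, file_len, total_samples / 2, probe);
    println!("landed near byte {} after {} probes", offset, probes);
}

Plain bisection like this needs on the order of log2 of the file size in probes (about 15 for the file above); the smarter probe placement is what gets that down to the 1.873 average measured above.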

And this requires no indexes, which would require the file to be written in multiple passes or make it seekable only once it's "finished"-- a live stream is seekable while it's being recorded. You can truncate or even extract a chunk from the middle of a stream with basically no more complexity than range requests (plus putting the header at the front, if you're extracting from the middle), and the resulting output is perfectly seek-able too. Corruption of the file will not break its seek-ability (except, perhaps, for specifically malicious corruption that also fixes up a checksum). And it does this with <1% overhead.
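
For reference, the reason no index is needed (and the reason corruption stays local) is that every page is self-describing. Roughly, each page header carries the fields below-- layout per the Ogg spec (RFC 3533), Rust field names mine:

Code:
#[allow(dead_code)]
struct OggPageHeader {
    capture_pattern: [u8; 4], // always b"OggS": lets a reader resynchronize from any byte offset
    version: u8,              // stream structure version (0)
    header_type: u8,          // flags: continued packet, beginning of stream, end of stream
    granule_position: u64,    // for Opus, a 48 kHz sample count: what seeking compares against
    serial_number: u32,       // identifies the logical stream inside the container
    page_sequence_no: u32,    // lets a reader notice lost pages
    crc_checksum: u32,        // covers the whole page, so corruption is detected page by page
    segment_count: u8,        // number of lacing values in the segment table that follows
}

fn main() {
    println!("fixed header size before the segment table: 27 bytes");
}

Because the capture pattern and CRC let a reader find and validate a page starting from an arbitrary byte offset, any range of the file works as a probe point, with no side index to fetch or keep in sync.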

I think the seeking performance, given a good seeking implementation, is pretty good, and it's often more efficient than other, less flexible containers even when they have indexes-- because to keep their indexes reasonably sized they're not high resolution-- and getting them requires seeking to the end or a rewrite of the file. To the extent that an average near 2 is less good than the 1 you might get with a perfect index, that is a cost of the streaming functionality, and I think a reasonable one.

I didn't design the container, but if I had-- I might only have added an additional back-reference or two at each page; I wouldn't change how seeking works.

Tim hasn't really had anything to do with Rust-- I had some infinitesimally small influence on the language (lobbied successfully for overflow of ordinary signed types to be defined as an error). Rust has some nice properties for cryptocurrency infrastructure: in particular it has no overhead and deterministic operation, with high levels of safety enforced at compile time. These things matter for decentralized cryptocurrency, since speed is also a security consideration.
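
As a concrete example of the overflow point, here is a minimal sketch of how it shows up in practice (nothing project-specific, just stock Rust behavior):

Code:
fn main() {
    let x: i32 = i32::MAX;

    // In debug builds, `x + 1` panics with "attempt to add with overflow":
    // overflow of ordinary signed types is defined as a program error rather
    // than something that silently passes.
    // Checked arithmetic makes the error explicit and recoverable:
    match x.checked_add(1) {
        Some(v) => println!("sum = {}", v),
        None => println!("overflow detected"),
    }

    // Wrapping arithmetic is still available, but you have to ask for it by name:
    println!("wrapping sum = {}", x.wrapping_add(1));
}

That kind of enforced error, rather than a silent wrap, is part of the safety being referred to above.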
sr. member
Activity: 420
Merit: 262
February 26, 2016, 11:16:24 PM
#1
The client is a mixture of rust

Btw, I have been looking at Ogg Opus, which you were intimately involved in, and kudos on Opus! But it seems one major error was made in the container format, Ogg, in that it doesn't include markers that would let one jump forward in a stream efficiently over a bounded-bandwidth source such as the internet. I presume the container format was not your responsibility.

Anyway, I mention that because I notice Timothy B. Terriberry was your colleague on Opus, and I believe he also has a connection to Mozilla and the development of the Rust language. This reminded me that I had been critical of Rust several years ago when it was conceived, because I had some complaint about the way it types invariants. I will try to locate my former criticism and add it to this post.

So I guess it makes sense that you would want to experiment with Rust. I am not sure if I would still agree with my former criticism, so I will try to locate it and see if it still makes sense now.

Btw, I applaud the effort to develop zk-snarks for smart contracts. I believe it is absolutely necessary per the revelation in the "cut & choose" thread, which, if you think about it, actually has implications for any scripting on a block chain.
