legendary
Activity: 966
Merit: 1000
Really beautiful technical explanation; many thanks for this beautiful and interesting explanation. A thread really worth reading.
legendary
Activity: 996
Merit: 1013
All good and waiting for the next installment
sr. member
Activity: 420
Merit: 262
Well, my layman's journey did not get very far. (I remembered I left the oven on, so I had to go back home.)
Apologies, I will try to bring it back to a layman's level of explanation in the next post. Arguing with mathtards over at cs.stackexchange can certainly fly over the heads of laymen. Some of the first-principles material is difficult to explain to someone who has no programming or math experience at all. But eventually I will also write summaries that can make sense to laymen who don't even have those rudimentary skills. I view myself as a layman relative to those who are (more, and probably formally) trained in the field of cryptography, such as Adam Back and Gregory Maxwell of Blockstream and the academics writing the white papers I will be trying to understand and explain in this thread.
legendary
Activity: 996
Merit: 1013
Well, my layman's journey did not get very far. (I remembered I left the oven on, so I had to go back home.)
sr. member
Activity: 420
Merit: 262
Edit: it is pitiful that these self-anointed "compsci" idiots have such high reputations (thus fooling readers and spreading incorrect information as "authorities"). This is really fucking stoopid. The mods over at cs.stackexchange deleted the entire comment trail wherein I proved that the given answer was incorrect. This must have embarrassed them badly, because the answer had 75 upvotes from professors, PhD students, etc. The sparkling academic-cathedral portion of our world is apparently getting dumber and more corrupt. Here follows the comment trail, which I saved before it was deleted.

1. Take a random Turing machine and a random input.
2. Either the computation will go on forever, or it will stop at some point, and there is a (constant) computable function describing each one of these behaviors.
3.
4. Profit!

I feel like, following your proof scheme, one could prove that termination is decidable. Did I miss something?

Watch out what the Halting theorem states: it says that there exists no single program that can decide whether a given program halts. You can easily make two programs such that one of them computes whether a given program halts: the first always says 'it halts', the second 'it doesn't halt'. One program is always right; we just can't compute which one of them it is!

That is exactly my point: how can you claim that a problem is decidable (cf. this thread's title) when you're not able to decide which one of the two algorithms is the good one?

That's not the same thing, though. In the case of Alex's example, neither of the algorithms will return the right result for all inputs. In the case of this question, one of them will. You can claim that the problem is decidable because you know that there is an algorithm that produces the right result for all inputs. It doesn't matter whether you know which one that algorithm is.
The definition of "decidable" is that an algorithm exists to perform the task, not that we know which algorithm it is. As long as we can prove there is at least one suitable algorithm in existence, then we know the problem is decidable.

@gallais Good that you raise this point; it is the pitfall most fall into. Think of it this way: the halting problem function wants to decide something for all Turing machines. Here, we only talk about one number. If you extend $f$ here to take a real number as parameter and ask the same question as for $\pi$, now that will be undecidable! The other way round, for any fixed Turing machine the halting problem may very well be decidable (and it is for many).

I am also having a bit of a problem accepting this answer. But I think my issue is that this answer is non-constructive, and I am more inclined toward constructive mathematics. I guess the normal definition of computable is computable in the classical sense, and personally such a definition feels like nonsense.

I agree with @danr: this proof is non-constructive, and it's always unsettling to have non-constructive proofs of statements about computability...

@carl: you pointed out that the definition of "decidable" is that an algorithm exists to perform the task, not that we know which algorithm it is, and that as long as we can prove there is at least one suitable algorithm in existence, we know the problem is decidable. But can we say that the second algorithm exists without knowing the value of N? It's one thing to say that one of the two algorithms is always correct, and another thing to say that an algorithm uses a constant which is not computable itself.

If $N$ exists, it is just a natural number, and every natural number is computable (because we can just hard-code it).

But doesn't our inability to enumerate and hence test every natural number matter?

No. It doesn't.

@Raphael et al., the answer is incorrect.
It is undecidable whether any algorithm will terminate when searching for a string of 0 values in the digits of π, because the input entropy is unbounded (because π is irrational, so the digit patterns never repeat); thus the problem is Turing-complete. Any algorithm you invent will, for each n, either find the string, or never find it and never terminate, and it is undecidable a priori which. Perhaps you are conflating NP with this problem. NP is computable because we only run a deterministic verification (i.e. one that always terminates) on a non-deterministic algorithm, e.g. meaning that someone has told us to verify that a particular input n terminates within M algorithmic steps with a "yes" result, so verifying that is deterministic (we can terminate if we reach M steps). But this problem is asking to prove that (it is decidable that) n is always computable, which is not the same as asking to verify that a particular value is computable in a defined number of steps.

That is not what "Turing-complete" means. Your argument about algorithms that search for strings in the binary expansion of π does not apply to all algorithms. In particular, none of the algorithms in my answer search for strings in the binary expansion of π. Nevertheless, exhaustive case analysis implies that exactly one of them is always correct. Nobody knows which one, and perhaps nobody can ever know which one, but that doesn't matter. A correct algorithm exists, and therefore the problem is decidable. No, I am not confusing anything with NP. This problem is not asking us to decide "whether n is always computable". The problem is asking us to prove that the problem "Given an integer n, is there a string of n consecutive zeros in the binary expansion of π?" is computable.

The question does not ask "whether n is always computable". Rather it states "Prove f(n) is computable" (not n but f(n)). Thus the question demands that our algorithm for f(n) be computable.
Even the title of the question reaffirms the requirement that the algorithm be computable: "How can it be decidable whether π has some sequence of digits?". Turing-completeness is the requirement for unbounded input, memory, time, etc., so as to be general enough to compute any computation. Since the expansion of the series for π is infinite and the patterns don't repeat, there is no algorithm that doesn't require unbounded steps in the general case of any value for input n. Thus f(n) is Turing-complete and thus is undecidable. QED. You shouldn't just foist nonsense on what is supposed to be an accurate site for learning about compsci.

Again, it does not seem as though you know what you are talking about, at all. "Thus f(n) is Turing-complete and thus is undecidable" -- that's ... not even wrong.

The implication is that f(n) requires a Turing-complete programming language because it requires unbounded recursion to expand the digits of π. Here is a canonical quote for you: "When applied to a programming language, Turing-complete means that it can fully exploit the capabilities of a Turing complete computer". You do a disservice to this site with your abundant nonsense. Although the digits of π are computable (and only if we ask for infinite digits do we get non-termination), the question asks to compute a search for the existence of a string of 0 digits, which means it is undecidable whether the computation halts or not. Thus the computation is not computable, because we don't know if it is still busy or looping forever. This is just fundamental computable function theory. If you mess this up, you need to go back to Compsci 101.

"f(n) requires a Turing-complete programming language because it requires unbounded recursion to expand the digits of π" -- that's a non-sequitur, and JeffE has proven that $f$ does not have to expand any digits. "If you mess this up, you need to go back to Compsci 101" -- agreed. See you there, I'll be your TA.
The only way that the 1st case of JeffE's answer is correct is if we can assume that every finite string of 0s is contained within the expanded digits of π, without needing to actually expand the digits. On further thought (not sleepless now), I suppose that is true because the expansion of π is unbounded, thus eventually every pattern is produced in the expansion. I don't recall whether that has been formally proven, and n isn't finite? JeffE hedges against that with the 2nd case, which says that if there is a maximum N, then we enumerate to it. This hedge doesn't work because N is undecidable. Thus either the first case is always true, or his idea of avoiding actually expanding the digits is undecidable per the logic I presented. When I wrote my comments (for me, after being awake 20+ hours nonstop working) I wasn't really paying attention to the point that JeffE was proposing not to do any computation (expansion) and to always return true. So in that case, can we formally prove the first case? If yes, then JeffE's answer is correct. But I think the problem is that n isn't bounded. So I don't think that can be proven.

"N is undecidable" -- that statement does not make sense, and if it did mean what you think it does, it would still not make your case -- because we don't have to determine $N$ algorithmically.

You've almost got it. I am indeed hedging my bets by presenting an infinite sequence of algorithms. I don't know which one is correct; maybe the first one, maybe the second one with N=1234567890 hardcoded in, maybe the second one with N=104378942378493027478326457832 hardcoded in, or maybe the second one with some other value of N hardcoded in. I'm quite willing to believe that no one will ever know which of these algorithms is correct. But I don't have to know which algorithm is correct. All I have to know is that my infinite collection contains a correct algorithm.

You can't know N without running an algorithm to confirm it. Sorry guys.
And you can't prove the first case either; cf. Yuval's comment that not all sequences of 0s are proven to occur in irrational numbers. Besides, n is also unbounded (in addition to the sequence being unbounded), which afaik mucks up any attempt to prove the assumption of the first case. Thus afaics this answer is wrong. Raphael, ever faithful to your endearing personality defect, you continue to ASS-U-ME you know what I am thinking.

You can't know N. That's correct. But I don't have to know N.

You need to confirm N because, as Yuval pointed out, you can't assume any finite sequence of 0s will exist in a finite sequence of the expansion.

No, I don't need to confirm N. I don't need to confirm that any particular finite sequence occurs in π. I need to know that either all strings of 0's appear in π or there is some longest string of 0's in π. If the first alternative is true, the first algorithm is correct. If the second alternative is true, then for some value of N the second algorithm is correct. Either way, at least one of the algorithms is correct.

But you can't compute the second case without knowing the value of N for which n > N should return 0. f(n) must always return a correct value for any n. Without knowing N, f(n) in the second case can't return the correct value. I can't believe you are making me write this. What the heck could I possibly not be seeing? It seems so obvious to me. You can algorithmically compute the correct return value for the second case and be sure to terminate for any finite value of n. But for the unbounded case of n, the second case is undecidable (maximum N can't be known without confirming it, so you don't know whether to return 0 or 1), and the first case is unproven. What am I missing? Your logic is based on not needing to use an algorithm in the unbounded case, but it seems the error in your logic was forgetting that the return value of the second case matters in the unbounded case. 75 wrong upvotes. Oh my.
I have moved over to chat, but please note I am recording this discussion offsite in a popular forum, because I view chat as closed source and I believe in open source. Also I will post any final summary back to the comments on the page from whence we originated.

The error is precisely that you assume your proposed second case of f(n) will only be necessary for some finite N, and that it can be computed because N is a maximum finite value. But if it is true that N is finite, then your first proposed case for f(n) must always be invoked in the unbounded case. So the precise error is that it is undecidable which case should be invoked, because your logic requires that the 2nd case be invoked when the first case would fail. That proves your answer is flawed.

Let me rephrase to make it more concrete that the answer is broken. We must know N, because above values of N we need to return either 0 from the second case of the proposed f(n) or 1 from the first case of the proposed f(n). Your logic assumes the second case of f(n) will be invoked when the first case is inapplicable. But this assumes the first case will always be applicable in the unbounded case; otherwise we must assume that the second case will be invoked with an unbounded value of n. Since we can't prove that the first case is always applicable, the second case is undecidable. QED. I could probably improve the proof if I wanted to spend more time on it, but this was an enormous waste of my precious time already. Adios to this site!!

Comments are not for extended discussion; this conversation has been moved to chat.

You deleted the entire comment trail because I showed that everyone here was wrong! I will be publishing the entire comment trail and pointing out how corrupt this cs.stackexchange is!
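For reference, the two-case construction being argued over can be sketched in a few lines of Python. This is my own hedged sketch of the idea (all names are mine, not anyone's posted code): an infinite family of trivially computable functions, exactly one of which equals f(n) = "does a run of n consecutive zeros occur in the binary expansion of π?".

```python
# Hedged sketch (my own naming) of the construction under discussion:
# a family of trivially computable functions, exactly one of which
# is the function f being debated.

def f_all(n):
    # Correct if every finite run of zeros occurs somewhere in the
    # binary expansion of pi.
    return 1

def make_f_bounded(N):
    # Correct (for the right hard-coded N) if the longest run of zeros
    # occurring in pi has length exactly N.
    def f_bounded(n):
        return 1 if n <= N else 0
    return f_bounded
```

Exhaustive case analysis says that one of `f_all`, `make_f_bounded(0)`, `make_f_bounded(1)`, ... is correct. Every member of the family terminates on every input, so f is computable even though nobody knows which member of the family it is; the sketch illustrates the shape of the argument, not a proof of either case.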
sr. member
Activity: 420
Merit: 262
Pertaining to the Computational Complexity Classes post: these are the negative votes one has to endure when posting insight that is too conceptually unifying, and the readers are too thick to grasp that reality. I've had other examples, but at least this one has come up from a -9 vote to a -2 lately (as more readers start to grasp I was correct all along). Yet I have another one at -6 where I am very confident I am correct and the voters are idiots. Edit: it is pitiful that these self-anointed "compsci" idiots have such high reputations (thus fooling the readers and spreading incorrect information as "authorities"). This is really fucking stoopid.
Edit#2: I need to start copying all the comments over here because the one berserk moderator at cs.stackexchange is deleting my information/work again!

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104814_50623

That isn't what nondeterminism means in computer science. Nondeterministic algorithms are not "unpredictable."

Nondeterminism is precisely unbounded entropy in computer science. Go review the Halting problem, undecidability, and Turing-completeness. The unpredictability in nondeterministic algorithms is due to not being able to predict whether the algorithm will terminate. Period. All the downvoters are ignorant. Whereas randomization comes from entropy that is orthogonal to the input variables, i.e. under control of an orthogonal source of entropy. I do hope people realize that entropy is disorder and all randomness is due to entropy. Geez. Even /dev/urandom is seeded with entropy.

You have completely misunderstood what nondeterminism means in computer science. It has nothing whatsoever to do with entropy. It does not mean what you think it means: that is why you think all the other answers are wrong. Perhaps the name was badly chosen, but that's beside the point. It has a particular meaning in computer science, which is different from the meaning it has in other sciences. You are trying to use the wrong definition, and that is why everything seems completely wrong to you.

You can't seem to grasp that I am unifying the definition compatibly with the way it is used in computer science. The use of the term nondeterminism when talking about computational complexity theory (e.g. NP vs. RP vs. P) is clearly all about the entropy, as I have explained in my answer. That finite automata theory decided to abuse the one term "nondeterministic" is just an aberration. The existence of the three terms NP, RP, and P is strongly canonical.
Also canonical because it is consistent with the universe of definitions for nondeterminism. Sorry dude, you are wrong.
Clearly NFA is only congruent with the rest of the universe of definitions for nondeterminism (including the more canonical computer science definition in complexity theory for NP, RP, P) if we realize that the N and F go together. Thus NFA is not referring to the subject of this question, which is nondeterminism, but rather to nondeterminism in the presence of finite entropy, which is another conceptual entity entirely different from nondeterminism. The mistake in your logic is assuming the N in NFA is distinct from the F, which it obviously isn't! Cripes.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104790_50623

I am appalled because, other than Alex ten Brink's answer, all the other answers are highly incorrect. And yet I observe readers upvoting those highly incorrect answers and downvoting mine. It is bewildering and flabbergasting. It shows that this topic is highly misunderstood. Computational complexity issues are a very high IQ abstraction, apparently. Thus apparently only a small percentage of readers really understand these issues. Otherwise, explain to me how you think I am incorrect.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104857_50623
http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104859_6061

[Your] answer has nothing to do with how non-determinism is defined in automata resp. computability theory. You should stop flaming around and get a textbook. If you don't understand the other answers, I don't think we can help you in a comment.
Finite automata are not relevant to the question of non-determinism, because finite automata have finite entropy! That is fundamental. The use of the word "non-determinism" in finite automata is a misnomer and a mirage. You can't just take a few researchers' error in naming and pretend to foist that on correct logic. Also, I understand the other answers to the extent I have commented on them, e.g. I understand that finite state machines can't represent unbounded entropy nor generally run Turing-complete programs.

"The use of the term nondeterminism when talking about computational complexity theory [...] is clearly all about the entropy" -- no, it's not. Appeal to authority is no argument. Please provide your reasoning or give up.

Perhaps these quotes from Minsky can help you gain clarity.

I don't need any reasoning. I am only appealing to the formal definition of non-determinism in automata resp. computability theory. Which, I daresay, you have either never read or not understood.

The formal definition of a non-deterministic finite automaton (NFA) is finite input entropy (data: the 5-tuple). Thus every NFA can run on a deterministic Turing machine, i.e. it doesn't require a nondeterministic Turing-complete machine. Thus NFAs are not in the class of nondeterministic problems. The notion of "nondeterminism" in NFAs is that their determinism (while clearly present, since every NFA can be converted to a DFA) is not explicitly expanded -- not the same as nondeterminism of computation.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104860_50623
http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104861_6061

As I explained already, the notion of non-determinism in NFAs couples the non-determinism with the finite entropy.
Thus the non-determinism is a local concept of not expanding the determinism, as a form of compression or convenience; thus we don't say NFAs are non-deterministic, rather they possess the appearance of randomness to an oracle unwilling to compute the deterministic expansion. But it is all a mirage, because it can all be expanded deterministically, since the entropy is not unbounded, i.e. it is finite.

If you don't have an argument, I have no more time to waste on your nonsense.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104864_50623
http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104865_6061

The claimed "non-determinism" in NFAs is really randomness, in the sense of my definition of the distinction between randomness and nondeterminism. My definition is that randomness is where some of the entropy is not under the control or knowledge (or, in the case of an NFA, the desired non-explicit expansion) of the input to the program or function. Whereas true nondeterminism is the inability to know the entropy in any case, because it is unbounded. This is precisely what distinguishes randomized from nondeterministic. So NFA should be an example of the former, not the latter as you claimed.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104858_6061

Why are my comments disappearing from your answer? Are you appealing to moderators to delete information?

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104868_50623

You all will eventually realize I am correct. There is no other formulation of the information set which is harmonious and congruent. My answer is correct, as are my comments.
I have no idea how long it will take you to realize the truth. That isn't my job. It is rather bewildering that cliques of grown men can go off on political tirades of upvoting each other's comments, downvoting correct answers, and flagging my comments for deletion by moderators. I am ashamed of you. I am now duplicating all these comments on a public forum to document this abuse.

http://cs.stackexchange.com/questions/367/how-can-it-be-decidable-whether-pi-has-some-sequence-of-digits?noredirect=1#comment9096_367
http://cs.stackexchange.com/questions/367/how-can-it-be-decidable-whether-pi-has-some-sequence-of-digits#comment104869_367

@Raphael, for the third time, since you've had the moderator delete this comment on the prior two occasions: the notation I have normally seen in the cryptography papers I read is {0}^n, which is less ambiguous because {} is the notation for a set. I am now duplicating these comments where I have 1000s of readers, in order to document moderators abusing their authority and deleting information. Ah, you are German; that entirely explains the control-freak attitude. The link is from Eric S. Raymond, the 150+ IQ genius who coined the moniker "open source".

I am a moderator. I removed your comments because they are redundant/obsolete, which is completely in line with our policy. Stop reposting them. (Did you not tell me that argument by authority is no argument at all?)

How is a comment about the notation you are using redundant? I am pointing out that you used a more ambiguous notation than what I've seen used. Why do you feel not only compelled but entitled to delete that comment? And you've removed it again for the 3rd time.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/6061?noredirect=1#comment104883_6061

I'm removing your comments because they are obsolete (literal copies from the comments below your own answer). They are also not information but misleading tirades.
If you want to tell the world how wrong every book and article ever written about this is, start a blog. This is not the place.

So you are the mod here (oh lord help us) and thus abusing your position by being the biased judge in your own case? A highly ethical person would recuse themselves from a conflict of interest. Your characterization of the quality of my comments is laughable. You don't even seem to comprehend what nondeterminism is, and yet you are a PhD student in the Algorithms & Complexity group at the University of Kaiserslautern, Germany. An embarrassment to your university. Keep up the ad hominem against me and I will return it in kind.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104882_50623

We can agree on that: please stop trying to "teach" us, it's not helping anybody.
And one piece of advice: if everybody else -- including professors, doctors, and graduates of (theoretical) computer science -- tells you that you are wrong, you may want to stop and consider whether they are right. As a help for this reflection process, have a look at the learner classification by Wolcott/Lynch (p36). In particular, note the common weaknesses of the "Biased Jumper".

You continue to appeal to authority. Also, you don't know my qualifications. I am still waiting for you to present a coherent logic. I have presented all the coherent logic. Allow the open-source peer review from many academics over a sufficient sample, and then you will see this initial barrage of nonsense from your private clique here overturned by reason. P.S. I am very familiar with the concept of the Dunning-Kruger syndrome.

I do not like your "I know it all" and very inflexible attitude to learning new concepts. You've been condescending to others who asked reasonable questions.

http://cs.stackexchange.com/questions/5008/differences-and-relationships-between-randomized-and-nondeterministic-algorithms/50623?noredirect=1#comment104872_6079

-1. Randomness can be deterministic, if it inputs a seed with a bounded number of bits of possibilities (bounded entropy) instead of accessing new seeds an unbounded number of times. Nondeterminism is unbounded possibilities (entropy). Wikipedia defines nondeterministic algorithms to be those which can't be proven to terminate. Thus Wikipedia is not presenting two different definitions. Thus I am downvoting.

There is nothing 'unreal' about running a nondeterministic algorithm; it just means that while running, choices need to be made for which the outcome isn't determined by the algorithm.
The only difference between 1 and 2 is that in 1 you haven't said when the algorithm is considered to "succeed", whereas in 2 the algorithm must be of the kind that always says "yes" or "no" at the end of a run; you have a decision problem, and you define the algorithm's answer to the problem as "yes" if and only if any one of its possible runs answers "yes". So 2 is just a special case of 1.

Everyone is conflating the difference between randomized and nondeterministic. This makes your comment muddled. The algorithm responds to the interaction of the input (variable) entropy and its source code's (invariant) internal entropy. Nondeterminism is unbounded entropy. Invariant entropy can even be internally unbounded, such as expanding the digits of π. Randomized means some of the entropy is not coupled to the input as defined (i.e. it may be coming from a system call to /dev/random, or simulated randomness, e.g. an NFA or a PRNG).
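Since the dispute above repeatedly invokes the fact that every NFA can be converted to a DFA, here is a minimal subset-construction sketch (toy encoding; all names are my own assumptions, not from any textbook API): the NFA's "nondeterminism" is finitely expanded into deterministic transitions over sets of states.

```python
# Minimal subset-construction sketch: the "nondeterminism" of an NFA
# expands into a finite DFA whose states are sets of NFA states,
# illustrating that every NFA is equivalent to some DFA.

def nfa_to_dfa(alphabet, delta, start, accept):
    """delta maps (state, symbol) -> set of successor NFA states."""
    start_set = frozenset([start])
    dfa_delta = {}
    seen = {start_set}
    work = [start_set]
    while work:
        S = work.pop()
        for a in alphabet:
            # The successor DFA state is the set of every NFA state
            # reachable from any member of S on symbol a.
            T = frozenset(t for s in S for t in delta.get((s, a), ()))
            dfa_delta[(S, a)] = T
            if T not in seen:
                seen.add(T)
                work.append(T)
    dfa_accept = {S for S in seen if S & accept}
    return dfa_delta, start_set, dfa_accept

def dfa_run(dfa_delta, start_set, dfa_accept, word):
    # Fully deterministic simulation: one transition per input symbol.
    S = start_set
    for ch in word:
        S = dfa_delta[(S, ch)]
    return S in dfa_accept
```

For example, a toy NFA over {0, 1} accepting strings that end in "01" (states 0, 1, 2; accepting state 2) determinizes into a three-state DFA, and the deterministic simulation accepts "01" and "0001" while rejecting "11".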
sr. member
Activity: 420
Merit: 262
Computational Complexity Classes: Overview

An algorithm (or program) is a set of logical steps that transforms some state (i.e. variables). If the algorithm is a function (i.e. not a procedure), then it reads the input state and writes the output state, thus being a transformation from the input state to the output state. (I'm referring to a pure function, which neither reads nor writes any global variables.) A computational complexity class defines boundaries on the computation allowed in an algorithm. Some examples of boundaries are:

1. Result type:
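The pure-function framing above can be made concrete with a tiny sketch (the function name is my own illustration): a pure function reads only its input and produces a new output state, touching no globals.

```python
# Illustrative pure function: its entire effect is the mapping from
# input state to output state; no global variable is read or written.
def sort_copy(xs):
    # Returns a new sorted list; the input list is left unmodified.
    return sorted(xs)
```

Because the input is never mutated and no external state is consulted, calling it twice with the same input always yields the same output, which is what makes complexity bounds on such functions well defined.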
sr. member
Activity: 420
Merit: 262
Join me on a technical journey to try to understand Zerocash (not Zerocoin, which is a separate project, although Zerocash does name its anonymous units zerocoins), starting from first principles. I am curious and want to broaden and deepen my understanding of the technologies in Zerocash. I want to share my learning experience and perhaps also obtain collaboration with the community in learning and elucidating in layman's terms. Here is a high-level enumeration of the technologies we need to understand, in reverse order of what we need to understand first, i.e. where the former requires understanding of the latter: - Zerocash
- zk-SNARKS
- quadratic arithmetic programs (QAPs)[1]
- quadratic span programs (QSPs)[1]
- probabilistically checkable proofs (PCPs), as an alternative to QAPs or QSPs
- span programs
- computational complexity classes, e.g. P, NC, NP, E, #P, ⊕P
I recently realized that Zerocash is the only way to achieve anonymity without the (virtually impossible to guarantee) requirement of obscuring our IP address. This is a thread for technical education and discussion of technology, so moderation is enabled to keep it focused as such (because I understand there are political vested interests regarding anonymity technologies in the Altcoin arena). P.S. Sometimes I need a distraction from pure coding, and learning about theories and ideas is an outlet/diversion for me to relieve the stress or mental monotony of the coding process as compared to the "grand ideas" mental process.