
Topic: [neㄘcash, ᨇcash, net⚷eys, or viᖚes?] Name AnonyMint's vapor coin? - page 14. (Read 95279 times)

legendary
Activity: 2968
Merit: 1198
Compilers (including C) do autovectorize and use SSE, etc. instructions. It's not always at the same level you could get by doing it explicitly but it does happen.
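As a hedged illustration of that point (Rust shown here, since it shares LLVM's optimizer with Clang; the function name `saxpy` is just an illustrative choice): a plain element-wise loop like this is typically autovectorized with SSE/AVX instructions at the highest optimization level, with no intrinsics written by hand.

```rust
// A plain loop like this is usually autovectorized by LLVM at
// opt-level 3 (rustc -C opt-level=3); no explicit SIMD intrinsics needed.
fn saxpy(a: f32, xs: &[f32], ys: &mut [f32]) {
    for (y, x) in ys.iter_mut().zip(xs.iter()) {
        *y += a * *x;
    }
}

fn main() {
    let xs = vec![1.0f32; 8];
    let mut ys = vec![2.0f32; 8];
    saxpy(3.0, &xs, &mut ys);
    // Each element becomes 2.0 + 3.0 * 1.0 = 5.0.
    assert!(ys.iter().all(|&y| (y - 5.0).abs() < 1e-6));
    println!("{:?}", ys);
}
```

Whether the vectorized form matches hand-tuned intrinsics is workload-dependent, which is the caveat made above.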

As far as the rest, programming languages are a nasty thicket. Every decision seems like some sort of tradeoff with few pure wins. Make it better in some way and it gets worse in others.

Yes most OS apps on Linux are C but that is largely legacy. They were written 20 years ago, and rewriting doesn't have major advantages (if any).

I guess that's another source of C's current popularity: maintenance of legacy applications, of which there are a lot.

If you look around at software being developed today by Google, Amazon, etc., very little of it is written in C (other than maybe low-level hardware stuff, as I mentioned earlier).
legendary
Activity: 1708
Merit: 1049
Personally I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is still being used to this day to write ...applications.

Are you a programmer AlexGR? I understood from the other thread discussion we had that you mostly aren't, but not sure.

It depends on the definition. I do not consider myself one but I've written some simple programs, mainly for my own use, in the past.

I still occasionally code things I need. For example, I recently started (an attempt) to implement some neural-network training to find patterns in SHA256, to see if it might be feasible to create a counterbalance to ASIC mining by pooling CPU and GPU resources in a distributed-processing neural network with tens of thousands of nodes (the equivalent of a mining pool). While I think the idea has merit, I didn't get very far. Now I'm contemplating whether neural networks can find weaknesses in hashing functions like CryptoNight, X11, etc., and leverage them to shortcut the hashing. In theory all hashing should be fairly (pseudo)random, but since there is no true randomness it's just a question of finding the pattern... and where "bad crypto" is involved, that could be far easier.

Anyway, these are a mess in terms of complexity (for my skills) and I'm in way over my head due to my multifaceted ignorance. My background wasn't much to begin with: I started with ZX Spectrum BASIC back in the 80s, then did some BASIC and Pascal for the PC, then some assembly... and then the Internet came and I kind of lost interest in "singular" programming. In the "offline" days it was easy to just sit at a screen, with not much distraction, and "burn" yourself to exhaustion, staring at the screen for 15 hours to write a program. In the post-Internet era that was impossible... ICQ, MSN, forums: all these gradually reduced my attention span and my intention to focus.

I always hated C with a passion, while Pascal was much better aligned with how I think. It helped enormously that Turbo Pascal had a great IDE for DOS, a nice F1 help reference, etc. Now I'm using Free Pascal for Linux, which is similar, but not for anything serious. As for C, I try to understand what programs do so that I can change a few lines to make them work differently.

In theory, both languages (or most languages?) do the same job... you call the library you want, use the part you want as the "command", and voila. Where they may differ is code-execution speed and other more subtle things. Library support is obviously not the same, although I think FPC supports some C libraries as well.

Still, despite my preference for the way Pascal was designed as a more human-friendly language, it falls waaaaaaaaaay short of what I think a language should actually be.

Syntax should be very simple and understandable, kind of like pseudocode. Stuff like integers overflowing, variables not having the precision they need, programs crashing over idiotic things like division by zero, buffer overflows, etc.: things like that are ridiculous and shouldn't even exist.
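For what it's worth, some newer languages already treat a couple of these failure modes as ordinary values rather than crashes. A hedged sketch in Rust (not a claim that it satisfies the whole wish list): the `checked_*` arithmetic operations return an `Option` instead of silently overflowing or faulting.

```rust
fn main() {
    // Overflow is a recoverable value, not a silent wraparound or crash:
    let big: u8 = 200;
    assert_eq!(big.checked_add(100), None);      // 300 would overflow u8
    assert_eq!(big.checked_add(55), Some(255));  // still fits

    // Division by zero likewise yields None instead of aborting:
    let n: i32 = 10;
    assert_eq!(n.checked_div(0), None);
    assert_eq!(n.checked_div(2), Some(5));

    println!("checked arithmetic ok");
}
```

The caller then decides how to handle the `None` case, rather than the program dying.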

By default a language should allow the maximum sizes a user could fit in vars and consts, but also let the user fine-tune them, if he likes, to increase speed (although a good compiler should be able to tell which sizes can be safely downsized and do so by itself anyway). Compilers should be infinitely better at taking PROPER advantage of our hardware. How can we be more than 15 years past SSE2 and still not have it properly used? You make a password cracker and you have to manually insert ...SSE intrinsics. I mean, wtf? Is this for real? It's not an instruction set that was deployed last month; it has been there for ages. Same for SSE3/SSSE3/SSE4.1/4.2, which are eight or more years old and go unused. How a language can pretend to be "fast" while throwing away SIMD acceleration is beyond me. But that's not the fault of the language (the compiler is at fault). Taking advantage of GPU resources should also be a more automated process by now.

Quote
I think, though I'm not sure, that most C coding these days (which still seems to be quite popular) is indeed system programming, firmware for devices, performance-critical callouts from other languages (Python for example), etc. But I have few if any sources for that, and it may be completely wrong.

The OS is definitely C, and the majority of OS applications are also C, especially on Linux.
sr. member
Activity: 420
Merit: 262
You will find that if you look at applications written in C they are built up using layers of libraries. By the time you get to what might be called the business logic (although a "business" might not necessarily be involved -- this is a term of art) you are barely using C any more, just a bunch of library calls where almost every language ends up being almost identical.

I'm not a fan of C for larger team projects because the process of building up your own libraries on top of libraries is difficult to express in a consistent and reliable way. Not impossible, but difficult. It tends to work better for a project written and maintained by one person, or perhaps a very small group.

Good point: due to the lack of a modern type system, C is really limited on the orthogonal-coding, modules aspect. Intermixing/interoperating with Javascript via Emscripten makes this even more brittle.

I think, though I'm not sure, that most C coding these days (which still seems to be quite popular) is indeed system programming, firmware for devices, performance-critical callouts from other languages (Python for example), etc. But I have few if any sources for that, and it may be completely wrong.

Well, that seems to be true, and my Github exemplifies me coding in C for those cases, but it would be better if I didn't have to add the thorn of C and an FFI to interoperate C with my other code.
sr. member
Activity: 420
Merit: 262
What about targeting Chrome NaCl? Maybe the languages you want to use don't support it as a target, which would mean an ugly intermediate step. It is not clear to me how you do anything else in a browser other than JavaScript, although maybe your target is your own client and not existing web browsers?

I want to replace the browser. It's time for the browser to die as an App platform.

If you want a combination (i.e. compromise) of modern language features, maturity, and platform support, it is probably hard to beat Scala overall, though you are stuck with JVM+JavaScript as targets, I guess. You basically have to give up one or the other to move beyond that. Erlang is another one that seems to have a bit of maturity, but I haven't used it. Haskell seems very interesting but unusable in practice given the lack of platform maturity.

I've become convinced Martin Odersky doesn't understand that subclassing is an anti-pattern. I'm abandoning Scala.

Edit: obviously open source because I can't (even if 100% healthy) code all that by myself.

Edit#2: Last year Martin asked me to stop discussing, on the Scala language google group, my ideas about eliminating subclassing.

It appears he doesn't want to simplify by throwing away unnecessary anti-patterns. The coming Scala 3 Dependent Object Types calculus is about unifying subclassing with abstract types. Scala will forever be a complexity PITA.
legendary
Activity: 2968
Merit: 1198
Personally I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is still being used to this day to write ...applications.

Are you a programmer AlexGR? I understood from the other thread discussion we had that you mostly aren't, but not sure.

You will find that if you look at applications written in C they are built up using layers of libraries. By the time you get to what might be called the business logic (although a "business" might not necessarily be involved -- this is a term of art) you are barely using C any more, just a bunch of library calls where almost every language ends up being almost identical.

I'm not a fan of C for larger team projects because the process of building up your own libraries on top of libraries is difficult to express in a consistent and reliable way. Not impossible, but difficult. It tends to work better for a project written and maintained by one person, or perhaps a very small group.

I think, though I'm not sure, that most C coding these days (which still seems to be quite popular) is indeed system programming, firmware for devices, performance-critical callouts from other languages (Python for example), etc. But I have few if any sources for that, and it may be completely wrong.
legendary
Activity: 1708
Merit: 1049
Personally I'm absolutely amazed that such a weird language as C, which was intended for writing ...OPERATING SYSTEMS, is still being used to this day to write ...applications.

There's too much code that needs to be written, but there won't be enough programmers to write it with all those piece-of-shit languages that turn people away from coding. There needs to be a breakthrough in what a language is, not the same old, same old with slightly different syntax and features.
sr. member
Activity: 420
Merit: 262
Private discussion about my programming language post above:

Well, you mentioned 'apps' there; I'm not sure what is meant by that, but if it means something created by a large community of not-necessarily experts, much of it relatively simple, then JavaScript or something JavaScript-like is probably the way to go.

There is not really anything else that is popular enough that so many are already comfortable with it, and accessible enough to a wide audience. Apple only got away with this for iOS apps (with Objective-C) because iOS was an extreme hit product. Even then, Android's use of the better-known Java was probably a factor that helped it catch up in the app race.

But I don't know the context really.

Availability of libraries is critically important in current development. It is impossible to achieve good productivity in many current application spaces without being able to draw on libraries. Many of the minority languages just fail there in actual practice, whatever their design merits (leaving them largely at the hobby stage, where people can continue to build libraries and hopefully make them useful). This applies especially to languages that aren't JVM-based, since native FFIs tend to be a huge PITA.

Regarding the JVM and unsigned types: I thought Scala had unsigned, but I never looked that carefully. It is indeed a huge PITA to do low-level work in Java because of that.

I was thinking more about technical feedback, than market adoption feedback. If you want to add that, that is welcome.

Nevertheless, this is a good point obviously that I should also analyze.

One of the very significant issues is that I am very tired of coding in languages which are shit. For example, I have become (at least initially) convinced that the asynchronous concurrency model is superior to multithreading for most scenarios that I need to code, but Javascript is such a PITA to code low-level with. I ran into this issue on the first attempt to start coding my crypto-currency in Node.js. I can code in a mixture of C (or C++) employing Emscripten and Javascript, but there are still significant pain points and a lack of unification. I also still need to use transcoders for some features in some browser versions on the client (for wallet code). If I switched to coding the entire thing in C++ to gain, for example, generics via templates, I'd lose elegant first-class functions, gain C++'s fugly complexity and corner cases, have to transcode with Emscripten, and then not have good debugging support, because I can't test the C++ with Node.js without transcoding.

I am also confirming from many cases that subclassing appears to be an anti-pattern, and that there is a lack of extensibility due to the inability to emulate Haskell's orthogonal interface and data (i.e. typeclasses and algebraic data types). Yet we also need first-class unions and intersections to retain subtyping of them, and the compiler needs to automagically code the mashups of the interface implementations for the intersection for each function. No language on earth appears to do this, and this afaics would radically improve modular, no-refactor extensible code.
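As a point of contrast, a hedged sketch of how close an existing language gets (Rust used here; `HasArea` and `Shape` are illustrative names, not from any library): an "intersection" can be approximated with multiple trait bounds, and a "union" with a tagged enum, but neither is first-class or structural, which is exactly the gap being described.

```rust
use std::fmt::Debug;

trait HasArea {
    fn area(&self) -> f64;
}

#[derive(Debug)]
struct Square {
    side: f64,
}

impl HasArea for Square {
    fn area(&self) -> f64 {
        self.side * self.side
    }
}

// "Intersection" approximated by multiple trait bounds:
// T must satisfy both HasArea and Debug.
fn describe<T: HasArea + Debug>(t: &T) -> String {
    format!("{:?} has area {}", t, t.area())
}

// "Union" approximated by a tagged enum; the set of variants is
// fixed at the definition site, so it is not first-class.
enum Shape {
    Sq(Square),
    Unit,
}

fn main() {
    let s = Square { side: 3.0 };
    assert_eq!(s.area(), 9.0);
    println!("{}", describe(&s));
    let _u = Shape::Sq(Square { side: 1.0 });
}
```

A first-class union/intersection system would instead let arbitrary existing types be combined at use sites without pre-declared enums or bounds.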

Garbage collection is another beast that impacts performance, and Rust's new paradigm for smart pointers looks like it could be a huge winner. No other language has that afaik.
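A minimal sketch of what that paradigm buys (standard-library types only): `Box` gives single ownership freed deterministically at scope exit with no GC and no runtime count, while `Rc` is the opt-in fallback with runtime reference counting for the cases that genuinely need shared ownership.

```rust
use std::rc::Rc;

fn main() {
    // Box: compile-time ownership; the heap allocation is freed
    // deterministically when `b` goes out of scope. No GC involved.
    let b = Box::new([0u8; 16]);
    assert_eq!(b.len(), 16);

    // Rc: opt-in runtime reference counting for shared ownership.
    let a = Rc::new(42);
    let c = Rc::clone(&a);
    assert_eq!(Rc::strong_count(&a), 2);
    drop(c); // count decremented immediately, again deterministically
    assert_eq!(Rc::strong_count(&a), 1);

    println!("Box and Rc demo done");
}
```

The design choice is that the expensive counted pointer is the exception rather than the default, which is the performance argument being made above.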

Regarding Apps, I need the compiler to be super fast and eventually amenable to JIT compilation, because I want these Apps to run instantly as downloaded in real-time over the Internet (as opposed to installed). Yet these Apps (such as games) will still need low-level capabilities, so the issues with Javascript apply. Although many games are written in C++, C++ is a fugly PITA of a language with a complex compiler that afaik isn't that fast. We can also forget JVM languages for this reason; and although Google has apparently put a lot of effort into making Dalvik fast with incremental compilation, it still starts up worse than HotSpot, which itself was slow to start compared to Javascript.

Rust appears to have everything except the first-class intersections, and the compilation targets (and their modularity) may be badly conflated into the rest of the compiler (I haven't looked, but am just wondering why the fuck they didn't leverage LLVM?).

Technical thoughts in that context?

As for adoption, as in all things if you create a significantly better mousetrap that is open source, then the resources come to you if it is needed by the market. If it is not very much needed, then creating a new language will fail. Rust appears to have at least two features which are significantly needed:

1. Higher-performance, compile-time "smart" pointers that avoid GC without the overhead and corner cases of runtime-incremented/decremented smart pointers.

2. Eliminated subclassing, though unfortunately without enabling subtyping via first-class intersections.

I guess investigating the source code of Rust might be a next step, to see if it is a good base to improve upon. Perhaps, since they didn't plan to support first-class intersections and given they removed Typestate, their typing model may be a mess. There is also the potential concern about the modularity of the compilation targets (and remember I am targeting quick-start JIT eventually).
sr. member
Activity: 420
Merit: 262
I've been taking more oregano oil the past 2 days or so, up to 8 times per day. Today I was able to run 4.8 km (2 × 2.4 km, first in noon tropical sweltering heat, then in the evening tropical swamp heat) and did 200 pushups. Yesterday evening I did a barbell workout for my shoulders. I was able to sleep 6 hours last night, then 3 hours this afternoon. The problem lately had been the inability to sleep more than 5 hours per night.

I feel I may be making some progress on my health. Still not where I need it to be (still have limitations and downtime due to this illness), but perhaps nearing a breakthrough. I hope!
sr. member
Activity: 420
Merit: 262
Sharing a private message:

Quote from: myself
Subject: April 10: programming language research update

First read this (and follow the links) to get an inkling that I know about programming-language design. I was doing a lot of research on PLD before I got into researching crypto-currencies in 2013.

https://bitcointalksearch.org/topic/m.14488687

The main reason I am considering a different language for Apps is that Javascript is a really horrible low-level language, i.e. when manipulating data structures that are to be stored efficiently in memory and/or on disk, and there are huge performance penalties from the garbage collection, lack of pointers, etc. This is really important for games or any low-level code, such as the crypto-currency wallet and server code. Although C/C++ can be compiled to ASM.js Javascript with Emscripten (which is very near the speed of natively compiled code), C++ is a fugly, complex legacy monstrosity, nor does it have the features I'd like to see below. First-class lambdas in C++? Mixing C and Javascript code achieves the best of no worlds and is an inelegant, non-unified mashup.

One thing I really like about the latest ECMAScript is the coming support for generators and Promises, which enable writing really elegant asynchronous concurrency code. I frankly think this is superior for scaling concurrency (e.g. thousands of requests to a web server in Node.js, or even multiple concurrent operations on a client fetching data over the Internet) than multithreading, because it avoids both the overhead of threads and the race conditions of re-entrant code. Reusable thread pools still have their role, even in Node.js. Unfortunately I know of no modern statically typed language that supports generators and Promises as language constructs, although I think the Akka library for Scala accomplishes some of the same goals, perhaps less elegantly (I will need to study it more).
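For the generator half of that wish (the Promise half is a separate mechanism), the closest statically typed analogue in the language used for sketches here is a lazy `Iterator`: values are produced on demand, one `next()` at a time, much as a generator yields. A hedged sketch with an illustrative `Counter` type:

```rust
// A hand-written lazy sequence, roughly what a generator would yield.
struct Counter {
    n: u32,
}

impl Iterator for Counter {
    type Item = u32;
    fn next(&mut self) -> Option<u32> {
        // Each call resumes where the last one left off, like a
        // generator resuming at its `yield` point.
        self.n += 1;
        if self.n <= 5 { Some(self.n) } else { None }
    }
}

fn main() {
    let v: Vec<u32> = Counter { n: 0 }.collect();
    assert_eq!(v, vec![1, 2, 3, 4, 5]);
    println!("{:?}", v);
}
```

This covers lazy production of values but not suspension across asynchronous I/O, which is the part generators-plus-Promises add.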

I really want to drive towards my idea of maximum modularity and extensibility without refactoring (as I explained last year on the Scala google group): remove the anti-pattern of subclassing (or at least virtual inheritance) entirely and keep only subtyping of unions and intersections, by employing separately implementable interfaces:

http://doc.rust-lang.org/book/traits.html  (Notice how HasArea can be implemented separately from Circle and Square)

And automating in the compiler the union and intersection of trait implementations (something Rust doesn't do, because it doesn't have first-class unions and intersections).
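The pattern from the linked Rust page can be sketched concretely: `HasArea` is implemented separately from `Circle` and `Square`, so the interface can be retrofitted onto existing types without editing their definitions.

```rust
trait HasArea {
    fn area(&self) -> f64;
}

struct Circle {
    radius: f64,
}
struct Square {
    side: f64,
}

// The interface is implemented apart from the data definitions;
// a new trait can be added for these types later, in other code.
impl HasArea for Circle {
    fn area(&self) -> f64 {
        std::f64::consts::PI * self.radius * self.radius
    }
}
impl HasArea for Square {
    fn area(&self) -> f64 {
        self.side * self.side
    }
}

fn main() {
    let shapes: Vec<Box<dyn HasArea>> =
        vec![Box::new(Circle { radius: 1.0 }), Box::new(Square { side: 2.0 })];
    let total: f64 = shapes.iter().map(|s| s.area()).sum();
    // pi * 1^2 + 2^2
    assert!((total - (std::f64::consts::PI + 4.0)).abs() < 1e-9);
    println!("total area = {}", total);
}
```

What Rust does not do, as noted above, is automatically derive the combined implementation for a union or intersection of such types.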

Ceylon has first-class unions and intersections (even Haskell does not, because they mess up global type inference), but apparently doesn't allow interfaces to be implemented separately from the class they operate on:

http://ceylon-lang.org/documentation/1.2/tour/inheritance/ (notice the "satisfies interfacename" on the class declarations)

Conceptually I like Rust's memory-deallocation model, which avoids the need for garbage collection in many scenarios at just slightly more cognitive load on the programmer (although I have no experience with it in practice yet):

http://doc.rust-lang.org/book/ownership.html
http://doc.rust-lang.org/book/references-and-borrowing.html
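A minimal sketch of the model the two linked chapters describe (illustrative function names): a borrow leaves the caller's value usable, while a move transfers ownership, after which the value is freed deterministically with no garbage collector.

```rust
fn len(s: &String) -> usize {
    // Borrow: we only get a reference; the caller keeps ownership.
    s.len()
}

fn consume(_s: String) {
    // Move: `_s` is owned here and is dropped (freed) when this
    // function returns. No GC is involved.
}

fn main() {
    let s = String::from("hello");

    // Borrowing: `s` remains usable after the call.
    let n = len(&s);
    assert_eq!(n, 5);
    assert_eq!(s, "hello");

    // Moving: ownership transfers into `consume`.
    consume(s);
    // println!("{}", s); // would not compile: value moved

    println!("ownership demo done");
}
```

The "slightly more cognitive load" mentioned above is exactly the discipline of tracking which calls borrow and which move.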

Note the comments on Apple's new Swift language by the creator of Rust:

http://graydon2.dreamwidth.org/5785.html

Btw, Go sucks as a replacement for C++ templates, and anyone using it doesn't understand modern generics in programming-language design well; that includes Ethereum:

https://en.wikipedia.org/wiki/Go_%28programming_language%29#Notable_users
http://yager.io/programming/go.html
http://jozefg.bitbucket.org/posts/2013-08-23-leaving-go.html

Also the marriage of a GC with what is intended to be a low-level systems programming language seems to be a mismatch:

http://jaredforsyth.com/2014/03/22/rust-vs-go/

So the conclusion thus far is that there is no language on earth that meets my requirements. If I wanted to take a baby step with the least effort, I'd perhaps adopt Rust, or write a lexer-level translator (no AST parser, if possible) from my designed minimal syntax to the syntax supported by Rust, then later add first-class unions and intersections either to Rust or to a new dedicated compiler.

One problem is that Rust doesn't output Javascript, so there is no way to interoperate with Node.js on the server side (for the server code we need, not for Apps). So it would require writing new libraries to provide similar functionality (which may already exist). Also, ARM support for Rust is not fully mature:

http://doc.rust-lang.org/book/getting-started.html

Forget JVM languages, because the numerical stack and the lack of precisely sized (and unsigned!) data types are a PITA!

Edit: it appears that Rust doesn't offer variance annotations on type-polymorphism parameters (aka generics), unless it is detecting the variance from usage and enforcing that a parameter can't simultaneously be more than one of invariant, covariant, and contravariant; and such detection would be impossible for traits, since they are implemented multiple times.

Edit#2: Rust doesn't need variance annotations because it has virtually no subtyping nor apparently subclassing. The absence of first-class unions and intersections means we can't construct subtypes orthogonal to subclasses.
sr. member
Activity: 420
Merit: 262
Brendan Eich better stick with his core competency of the programming language Javascript.

I see Rust ended up removing the Typestate system that Brendan was raving about, for the reasons I had stated and thus predicted.
sr. member
Activity: 420
Merit: 262
Nobody wants to use any of this shit, Bitcoin included.

Not entirely true. I used Bitcoin to receive funding/donations/loans, mostly not for HODLing. This was cross-border, with hardly any fees and no need to know the identity of whom I was dealing with. We bypassed the bank-wire tsuris.

Many have used Bitcoin to do "anonymous" activities.

Sorry I think you are wrong here smooth. Bitcoin has a use case for large value transfers. This is why I think Monero may have more value than people realize now.

Stores that accept Bitcoin usually do so via payment processors and auto-dump it. Accepting Bitcoin via payment processors is like an affiliate program for them; they don't care about the currency at all, it is just about traffic.

Agreed, but note they don't accept the other altcoins because the other altcoins don't provide that traffic (free advertising) boost and the altcoins aren't liquid enough to hedge to fiat they want to be paid in.

It just all doesn't serve any purpose that any sane mainstream person would care about, outside of a SHTF scenario.

Disagree. It is just that the use case is a fairly small % of the population. No one has yet focused on the masses use case. As you know, I am focused on that with my non-existent vaporware.

Trying design after design in a futile effort to chase "adoption" is churning investors to pay developer salaries. Bitshares investors are among the biggest suckers, since they paid (and I guess continue to pay, although I don't follow it closely) the Larimers to develop something which serves little to no purpose as decentralized crypto but can now be used as a vehicle to be paid again by banks for "blockchain".

Well I agree when they have no adoption and no viable plan to attain it. Bitcoin investors shouldn't pay for adoption until the adoption is already there.

That is my plan.
legendary
Activity: 1708
Merit: 1049
Right, 5 -> 5k -> 5mn.
legendary
Activity: 2968
Merit: 1198
If we go another 1000x to 2035, and yet another 1000x to 2055, then the current 5 tx/s would be 50,000 and 50,000,000 in 2035 and 2055 respectively.

Off by a factor of 10 here, but that doesn't change your core argument.
legendary
Activity: 1708
Merit: 1049
58:00 - Blockchains can't outperform the tx/sec of Visa without losing the trustless attribute. I will challenge him on that with my white paper and argue that we can't have the trustless attribute without scaling!

They don't need to outperform them. Blockchains only need to get the job done.

Human transaction needs are finite. Meaning that even if technology, whether centralized or decentralized, can process 10 trillion tx/sec, it's useless to us because we have, say, 10,000 tx/sec of actual needs.

Now, technology went 1000x from 1995 to 2015 in terms of CPU, storage, RAM, and network bandwidth.

A 486 or Pentium PC with 4-16MB RAM, a 1GB disk, and a 28kbps modem has been replaced with a 1000x+ more powerful processor, gigabytes of RAM, terabytes of disk, and Mbps of network connection.

If we go another 1000x by 2035, and yet another 1000x by 2055, then the current 5 tx/s would be 5,000 and 5,000,000 in 2035 and 2055 respectively. And that's not accounting for

a) software optimizations
b) software exploiting the hardware better (SIMDs, GPUs etc)

So what we think of blockchains changes depending on the time frame. If blockchains had existed in their current form back in 1995, they would have been a nice but useless thought experiment. In 2015 they are just starting to work in terms of hardware, and by 2035 they'll probably be pulling more throughput than (current) Visa. Even if Visa can do 1000x by then, it won't matter, because there won't be 1000x more txs for them to handle, due to finite consumer needs.
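The projection above is simple compounding, which can be sketched in a few lines (the 1000x-per-20-years growth rate and the 5 tx/s baseline are the thread's assumptions, not established facts):

```rust
fn main() {
    let mut tps: u64 = 5; // assumed ~5 tx/s baseline in 2015
    for year in [2035u32, 2055] {
        tps *= 1000; // assumed 1000x hardware gain per 20-year step
        println!("{}: {} tx/s", year, tps);
    }
    // 5 * 1000 = 5,000 by 2035; * 1000 again = 5,000,000 by 2055.
    assert_eq!(tps, 5_000_000);
}
```

This is also where the earlier off-by-ten correction came from: starting from 5 tx/s, one 1000x step gives 5,000, not 50,000.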

If the capabilities of the decentralized network scale faster than performance increases in hardware or software, then centralization occurs, as a result of more specialized equipment and data centers.

If the capabilities of the decentralized network scale in sync with increases in hardware and software speed, then there is not much loss of decentralization. To give an example: if a 2035 PC and home connection can sustain a 5k tx/sec network with its storage, processing, and bandwidth requirements, just as a 2015 PC can sustain a 5 tx/sec network with its respective requirements, then there is no issue in terms of decentralization.

The above doesn't mean we must wait until 2035, or wait for hardware. There is work to be done in the software optimization level.

edit: fixed / thanks Smooth.
sr. member
Activity: 420
Merit: 262
Gresham's law - all alts are dumped for BTC.  No stores will accept them until they get their own dollar pegs and then they might climb the ladder of store of value.

Good point. Note r0ach I think I will do something about this, but we will see...




58:00 - Blockchains can't outperform the tx/sec of Visa without losing the trustless attribute. I will challenge him on that with my white paper and argue that we can't have the trustless attribute without scaling!
sr. member
Activity: 420
Merit: 262
Copy of a private message:

Quote from: myself
Subject: Oregano oil & sleeping

Back over here at 11:30am, after sleeping 10 hours, because of a brownout again at the other main house. That was four brownouts yesterday. The drought has disabled the main power source here in Mindanao, the Pulangi Dam, and an emergency shutdown of a diesel generation plant has led to this sudden 4-hour rotating brownout schedule:

http://www.davaolight.com/

I am going to need to do something about this. So far, I am able to juggle between my two rental houses, since so far there is always power at one of them. But I lose 45 minutes in travel time back and forth. This is becoming a twice daily travel now.

The main impediment to even faster progress remains my health. My health is more stable, and thus I am making more progress. When I say stable, I mean that most of the time now I am not entirely useless with a foggy brain and my head on the keyboard. Most days I can function at the computer for many hours. But it is not yet at the point where I always have full energy. I only get that full-blast energy on some days (maybe 1 a week, or sometimes 2) and usually only for a few hours of those days. That full-blast energy is where I can achieve amazing productivity, because my thought process is so much sharper and clearer. With only low or moderate energy and slight aches here and there, my mind is muddled (not foggy, but not sharp and able to solve deep puzzles with ease). I had that full energy nearly always, my entire life, up until this illness. If I can get it back, I will be able to really impress with my coding productivity.

That is what I am working on now, experimenting with the oregano oil. I had only been ingesting it (because I can't find the drops here, only capsules) for the past couple of weeks. That had seemed to make an improvement. Then yesterday I tried letting a capsule dissolve under my tongue, because I've read that sublingual administration is by far the most effective means of getting the key ingredients into the bloodstream (whereas in stomach acid they are mostly lost). I did that twice yesterday and again this morning at 8am. It has made me incredibly sleepy. I had only been able to sleep about 4-5 hours daily lately, which was wearing me down and also indicating that I am not cured. The lack of sleep was a negative sign among otherwise mostly positive signs lately. So I am very encouraged that I slept 10 hours last night. I slept again from 8am to 11am after waking to eat. I would have slept more (I'm still groggy now) except I was sweating so much without airconditioning, due to the brownout which started at 8am at the other house.

The one time I thought I was totally cured of this illness was in Sept/Oct 2012, when I slept a lot for about a week. All my symptoms were entirely gone. I was in Subic, taking a lot of vitamin D3, a gym rat every day, and eating the food for foreigners there. Upon returning to Davao, my bad symptoms started again. During 2013, when I lived on a very cold mountain at very high elevation, I slept a lot, and my symptoms were much more under control (except when I would exercise; then they would go berserk), but I was slowly getting worse. When I returned to Davao City to live and started to work on crypto-currency in Nov. 2013 and into 2014, I worsened. Then in 2015 my health fell off a cliff after June. Basically I've been a zombie since Nov. 2012. I was already starting to experience this very low energy and chronic fatigue in 2010 and especially 2011. I had issues with fatigue ever since the fateful event in 2006, but they kicked into higher gear around 2010 and became acute after the May 2012 hospitalization for an acute peptic ulcer.

So anyway, back to the oregano oil. If it is enabling my body to eradicate whatever infection is inside me, which I believe is the cause of my ailments, then perhaps that is why my body wants to sleep so much now. That could be a very positive sign. If I can get the breakthrough to my full energy again, then we can really rock and roll even without a co-developer. If I can't get the full energy back, I can still be an effective leader and get some code done, but we will really depend on a superstar co-developer to make the team most productive.

Again, remember I've recently tested negative for HIV, Hepatitis B & C, syphilis, pancreatic & prostate cancer markers, and a few others.
sr. member
Activity: 420
Merit: 262
While you all bicker and waste time & capital on useless altcoins, I have a bigger problem on my mind that I am trying to help the world avoid:

I chose to agree. This would be the single biggest issue that would cause an enslaving of nations.
that's why we need country fiat and bitcoin.

I thought we are already in this situation.

*Ahem the US dollar? last time I checked everyone loves it, despite its covered bad value.

So why would it recourse into another world reserve currency.

The difference will be that the new one-world reserve coming approximately 2020, will not be controlled by any nation, but rather by a world government body.

This will be viewed by the world as more fair. But in reality it will be much less fair, because the world government will act basically the way the Troika does in the EU now, lending to the nations and never letting them default. They will lend in the world currency, but the people will be paid in their nation's shit currency which is debased like hell by the national politics. So then when the national currency loses value, the people are stuck paying back loans in the relatively more expensive world currency.

This is precisely what the Troika did to the PIIGS to destroy them. They will then do this on a global scale to enslave us all.
sr. member
Activity: 420
Merit: 262
Synereo pumper of presold AMP tokens for vaporware, wants to attack me already:

All Shelby has released is a drawing that puts him on track to be sued by Nintendo for trademark infringement.



A copyright owner can’t prove infringement by pointing to features of his work that are found in the defendant’s work as well but that are so rudimentary, commonplace, standard, or unavoidable that they do not serve to distinguish one work within a class of works from another.

The "stock" idea of working class hero, gaining abilities from powerups, and South European origin are however all usable.

Features of above logo:

✓ Wavy moustache
✓ Pointed eyebrows
✓ Eyes with misaligned pupils
✓ Semi-long nose
✓ Hair above a music headphone

I presume DecentralizeEconomics is alleging infringement on Wario?



✓ Jagged moustache
✓ Curved eyebrows
✓ Eyes with aligned pupils
✓ Stubbed nose
✓ Hair on the sides under a hat



The plaintiff rests.

Are you seriously sticking with the claim that no one can use a nose, eyeballs, eyebrows, and a moustache in a logo, because Nintendo used those features in one of their game characters?

My logo sketch does not resemble Wario, which has a hat. The features are all different.

By your logic, Nintendo is violating Paramount Pictures' intellectual property and Saddam Hussein's likeness:

https://en.wikipedia.org/wiki/Bluto




sr. member
Activity: 420
Merit: 262
This is How We Do.

https://bitcointalksearch.org/topic/m.14427015



rangedriver, my health is improving significantly. Thank you! Hope it sustains.
sr. member
Activity: 420
Merit: 262
Not updating that website has no relationship to my coding skills. For you to make such a statement indicates you jump to irrational assessment based on insufficient data.

And of course I never said it has. This is just your usual bullshitting. I said it indicates how tragicomical it is, and how delusional you are, to have such a website while lecturing other software engineers 24/7 about UI design.

I can't erase the website from the Webarchive. Why would I have to erase coolpage.com in order to not be tragicomical? That makes no sense. I've moved on to other things, as I explained to you.

Of course you deleted my post in order to keep bullshitting.

I quoted your venom. No need for two copies of it.

The rest about your skills and accomplishments are irrelevant.

Why don't you tell that to the 1 million users who downloaded CoolPage, and to the 335,000 websites that Altavista confirmed had been created with it as of 2001. A one-man company! Do you have any idea how much work that was?

Why don't you tell that to the Painter 3.1-J version, which shipped $1 million in its first month in Japan in the mid-1990s.

Why don't you tell that to EOS Systems and claim to them that the 3D viewer work I did in 1996 was "useless and irrelevant".

Why don't you tell that to the 30,000+ users of my WordUp software in the 1980s, which, scaled to the internet distribution we have now, would be on the order of a million users.

And of course I saw your few Github files - that indicates nothing special at all.

You didn't look closely then, or you probably don't even understand what is clever in that C code I wrote. And that is just local cleverness. You'd need to look at my larger code bases to see how I apply holistic design cleverness.

When a 51-year-old software professional quotes his minuscule CSS standard contribution as a major accomplishment

Back when you were recruiting me, you were saying my reputation would be a boon to GadgetCoin. Now you are cutting me down to size, eh?

Well, I never claimed those were monumental contributions I made to the W3C. I do claim that they conflated the framing and data layers in WebSockets and didn't adhere to the correct design principles, which I argued for and made a technical proposal about.
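To illustrate the design principle being argued for here, a minimal TypeScript sketch of keeping a framing layer separate from the data (message) layer. This is not the actual RFC 6455 wire format, and all names here (Frame, encodeFrame, decodeFrame, reassemble) are hypothetical, invented for illustration:

```typescript
interface Frame {
  fin: boolean;        // is this the final frame of a message?
  opcode: number;      // 1 = text, 2 = binary, 0 = continuation
  payload: Uint8Array; // raw frame payload
}

// Framing layer: concerns itself only with the wire representation.
// Simplified 2-byte header [fin|opcode][length]; payloads under 126 bytes.
function encodeFrame(f: Frame): Uint8Array {
  if (f.payload.length >= 126) throw new Error("sketch supports short payloads only");
  const out = new Uint8Array(2 + f.payload.length);
  out[0] = (f.fin ? 0x80 : 0x00) | (f.opcode & 0x0f);
  out[1] = f.payload.length;
  out.set(f.payload, 2);
  return out;
}

function decodeFrame(bytes: Uint8Array): Frame {
  return {
    fin: (bytes[0] & 0x80) !== 0,
    opcode: bytes[0] & 0x0f,
    payload: bytes.slice(2, 2 + bytes[1]),
  };
}

// Data layer: reassembles a sequence of frames into one message.
// It never touches header bytes; it only sees decoded Frame objects.
function reassemble(frames: Frame[]): Uint8Array {
  const total = frames.reduce((n, f) => n + f.payload.length, 0);
  const msg = new Uint8Array(total);
  let off = 0;
  for (const f of frames) {
    msg.set(f.payload, off);
    off += f.payload.length;
  }
  return msg;
}
```

The point of the separation is that either layer can change independently: the wire format can evolve without the message-reassembly logic knowing, and vice versa. Conflating them couples both concerns into one protocol layer.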

then that says all about you.

Yep, appears it does. A repeat offender of real-world accomplishments.

Now it is clear: all your talk and contribution is just to build up and market a professional profile for the next P&D. Your P&D is coming; after all, you promised a P&D to your "angel investors", didn't you?

I have not done that. I will not do that. And this will be the last time you will post in my thread, because you are not writing rationally.