Author

Topic: more tricky analysis (Read 1623 times)

donator
Activity: 2772
Merit: 1019
January 08, 2012, 06:42:36 PM
#19
I'm still generally interested, although I must say my hopes of getting good predictions have been lowered by some idle thinking I did when this idea popped into my mind while I was occupied with something else.

I still haven't found any reason to believe the predictions we'd get would be any worse than those of "classical" technical analysis, and my hope that they would be better still exists.
hero member
Activity: 1778
Merit: 504
WorkAsPro
January 08, 2012, 05:45:42 PM
#18
bump
donator
Activity: 2772
Merit: 1019
December 21, 2011, 03:48:22 AM
#17
I have a graduate course in optimization this spring. Maybe I could use this as my project.

I'm up for getting involved with anyone who's implementing this coding wise.

I want in, too.

Do you guys hang out on irc? There are many questions now, most importantly: what language could we agree on? I'd vote for java or python.
donator
Activity: 2772
Merit: 1019
December 21, 2011, 03:08:34 AM
#16

How can you possibly determine
1. How many trading models there are and which ones
2. Proportion for each model
3. Whether you missed any models


1 and 2: by "teaching". We'd "guess" these values, then evaluate them against historical data, make another, hopefully better guess, try again... rinse and repeat.

3: as Dan argues, the set of models the "optimizer" has at its disposal does not have to be complete, just "good enough".
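The "guess, evaluate, rinse and repeat" loop described above can be sketched as a crude hill climb over the model proportions. Everything here is illustrative: `evaluate` is a stand-in for replaying historical data through a weighted mix of trading models, and the fake target optimum exists only so the sketch runs:

```python
import random

def evaluate(proportions, history):
    # Stand-in scoring function: a real system would replay `history`
    # through the weighted mix of trading models and return prediction
    # error. Here we just penalise distance from a made-up optimum.
    target = [0.6, 0.3, 0.1]
    return sum((p - t) ** 2 for p, t in zip(proportions, target))

def normalise(p):
    total = sum(p)
    return [x / total for x in p]

def hill_climb(history, n_models=3, steps=200, step_size=0.05):
    """Guess proportions, evaluate, keep improvements; rinse and repeat."""
    current = normalise([random.random() for _ in range(n_models)])
    best_err = evaluate(current, history)
    for _ in range(steps):
        candidate = normalise(
            [max(1e-9, p + random.uniform(-step_size, step_size))
             for p in current])
        err = evaluate(candidate, history)
        if err < best_err:
            current, best_err = candidate, err
    return current, best_err

random.seed(42)
props, err = hill_climb(history=None)
print(props, err)
```

Any "continuous optimizer" from the earlier list (an evolutionary algorithm, simulated annealing, ...) could replace the mutation loop; the guess-evaluate-reguess structure stays the same.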
hero member
Activity: 1778
Merit: 504
WorkAsPro
December 20, 2011, 08:59:32 PM
#15
I'm up for getting involved with anyone who's implementing this coding wise.
hero member
Activity: 672
Merit: 500
December 20, 2011, 07:47:14 PM
#14
How can you possibly determine
1. How many trading models there are and which ones
2. Proportion for each model
3. Whether you missed any models


Well that would be the catch. Of course you probably won't cover every trading model. But with the basic ones, I think you can capture a lot, and represent the rest with random functions. The proportion for each model would have to be seeded to fit the historical data and then adjusted based on which models are gaining or losing money. If some really important model isn't included, it will end up being covered by a random noise model. This would of course be a bad predictor of that model and probably throw off your results.

Maybe with some work, you could come up with a comprehensive framework to describe every possible trading model using a finite set of parameters. What kind of variables describe a trading model? Basically there is a forecasted value in time. This is the main thing. Modelling that would require a whole subset of parameters. There is a certainty or confidence on that forecast. There is some sort of risk aversion/acceptance... what else do you need to consider? What else goes into deciding how to trade?
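The "finite set of parameters" idea above could start as small as a record holding the three variables just listed: a forecast, a confidence in it, and a risk-aversion knob, plus a trivial decision rule built from them. All names and the sizing formula here are illustrative, not a claim about how a real model would trade:

```python
from dataclasses import dataclass

@dataclass
class TradingModel:
    forecast: float        # predicted future price
    confidence: float      # 0..1 certainty in the forecast
    risk_aversion: float   # 0..1; higher means smaller positions

    def decide(self, price: float) -> float:
        """Return fraction of capital to move: >0 buy, <0 sell, 0 hold."""
        edge = (self.forecast - price) / price   # expected relative move
        size = edge * self.confidence * (1.0 - self.risk_aversion)
        return max(-1.0, min(1.0, size))

bullish = TradingModel(forecast=6.0, confidence=0.8, risk_aversion=0.5)
print(bullish.decide(5.0))   # positive: forecast is above current price
```

The "whole subset of parameters" for modelling the forecast itself would then slot in wherever `forecast` comes from.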
hero member
Activity: 784
Merit: 1000
bitcoin hundred-aire
December 20, 2011, 07:24:07 PM
#13
Also individuals could be modelled to run out of money, for example, and are then forced to switch trading model. Should that be modelled down to the individual level or can that be modelled by flux to different bots? Will we end up with too many possibilities of "optimal distributions" and no predictive power in the end?

I think these (and more) problems cannot be easily answered without getting our hands dirty and trying the idea.

On the other hand: this just has to be better and more (automatically) flexible than technical analysis ;)

Actually this has to have been tried already or be used in the wild (non-bitcoin trading world), no?

Individuals shouldn't matter at all, just the amount of money following each model. The nice thing about this is that any completely inferior trading model will naturally bleed money to the better models (like a bad trader running out of money). I have a graduate course in optimization this spring. Maybe I could use this as my project.

How can you possibly determine
1. How many trading models there are and which ones
2. Proportion for each model
3. Whether you missed any models
hero member
Activity: 672
Merit: 500
December 20, 2011, 07:21:09 PM
#12
Also individuals could be modelled to run out of money, for example, and are then forced to switch trading model. Should that be modelled down to the individual level or can that be modelled by flux to different bots? Will we end up with too many possibilities of "optimal distributions" and no predictive power in the end?

I think these (and more) problems cannot be easily answered without getting our hands dirty and trying the idea.

On the other hand: this just has to be better and more (automatically) flexible than technical analysis ;)

Actually this has to have been tried already or be used in the wild (non-bitcoin trading world), no?

Individuals shouldn't matter at all, just the amount of money following each model. The nice thing about this is that any completely inferior trading model will naturally bleed money to the better models (like a bad trader running out of money). I have a graduate course in optimization this spring. Maybe I could use this as my project.
donator
Activity: 2772
Merit: 1019
December 20, 2011, 06:05:35 PM
#11
And yes, I completely agree regarding how a beginner can see things others can't; really, the best of both is the best. As a programmer I am very aware that many in my industry seem to have kept things hard for beginners, almost seemingly on purpose. Keeping things accessible is close to my heart; ironically, working out the simplest way of doing something so that it still all works well can be very complex.

Sounds like you've been programming for a while.

In the end you sometimes (if you're lucky) look at your egg of Columbus, asking: "why didn't I just do it like that in the first place?"

That's why "stealing ideas" from others in this realm is to be considered smart behaviour. It's also why "patterns" have been so successful.

"To develop" sometimes means to start with something, then keep "developing" (changing, rearranging, cleaning out, morphing around, bending, hacking, adding, removing, abstracting, ...) until you have something nice and simple that does the job, often smaller, simpler and more usable than the "first working version" you had when you started developing. For me, at least, it's very hard to take a problem, think about it for however long it takes, and then finally implement a good solution on the first try. This might also have to do with the fact that the problem is usually not well-defined in the first place.

ok, I'm rambling now... better stop.
donator
Activity: 2772
Merit: 1019
December 20, 2011, 05:57:43 PM
#10
Hmm.  If only someone 200 years ago had devised a method for analyzing signals in terms of simpler signals...

Honestly, financial signals don't really work well under spectral analysis.  The feedback mechanism is too noisy.

Doesn't the Fourier transform assume periodicity of the time-domain function? If so, it'd be unusable for making meaningful predictions, no?

Sometimes a naive approach by someone unencumbered can yield better results than using proven tools. I'm not saying this is the case here, just suggesting to not dismiss the approach prematurely.

Surprisingly, Fourier works perfectly fine on non-periodic signals, sorta.  You can decompose any signal into periodic components, but in lots of cases those components aren't very meaningful.  A random signal will decompose into random components, a square wave will decompose into odd harmonics, etc.

If you model "the past" as being periodic, the future will just be predicted as "the past repeated", right?
donator
Activity: 2772
Merit: 1019
December 20, 2011, 05:56:09 PM
#9
The Fourier transform doesn't "assume" anything about your signal. It is just a different way of looking at the same data. What it does is make clear any frequency peaks or cutoffs in your data that you probably can't see in the time domain.

My best idea for "tricky analysis" would be to numerically model the trading models of others. In other words, try to predict the models that everyone else is using to trade, and then model that. So take all of the bitcoin in existence, and all of the money, and divide it up by what model you think that money is following. For most of it, the model will be "no trading", which is really simple to model. Add to that a random model: a certain amount of money and bitcoin will just act without any logic or reason. Add to that a buy-and-hold model for people wanting to hoard bitcoin. After that, maybe something simple like a trailing-average model: if the price is above the average, sell; if it's below, buy. The size of the trailing average would be a spectrum along which money would have to be allocated. Then a linear-prediction model, where a given time history is used to build a linear trend, used like the trailing average again on a time spectrum. On top of this, add a panic-sell model for buying/selling to minimize loss. Calculate, for each timestep, how money moves between these models based on gains/losses, and also how untraded money moves into the market, perhaps based on external events. Optimize the initial distribution on historical data and see if you can get any usable results. Of course this would in practice be a lot harder than I've made it sound, which is why I haven't actually done it. But since there are very few "fundamentals" governing the value of bitcoin, its price should be mostly dictated by the psychology of those trading it.

Excellent idea; it's nice to see someone is on this page. I've thought about this too, but I've been reluctant to try it because of the amount of work involved.

Actually, to have some benefits along the way, I thought about going roughly this route:

  • implement a trading-bot abstraction layer and a market abstraction layer (benefit: be able to write "general" trading bots for multiple markets)
  • implement some bots (benefit: have bot implementations for actual trading use)
  • implement some real exchange APIs on the market abstraction (side benefit: you have usable trading bots)
  • implement a "simulated market" replaying historical data to test/optimize bots and swarm models
  • implement some (continuous) optimizer (search algorithm) to find locally optimal values for the initial state and flux model
  • to close the loop, implement a bot that uses predictions made by our optimal model instance
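The first and fourth items above might look something like this: a market interface that both real exchange APIs and a historical-replay market implement, so bots never know which one they're talking to. Every class name and the toy threshold bot here are made up for illustration:

```python
from abc import ABC, abstractmethod

class Market(ABC):
    """Market abstraction layer: real exchange APIs and the historical
    replay 'simulated market' would both implement this interface."""

    @abstractmethod
    def price(self) -> float: ...

    @abstractmethod
    def order(self, amount: float) -> None:
        """Positive amount buys, negative sells."""

class Bot(ABC):
    """Trading-bot abstraction layer: bots only ever talk to a Market."""

    @abstractmethod
    def step(self, market: Market) -> None: ...

class ReplayMarket(Market):
    """Simulated market replaying historical prices, for testing bots."""

    def __init__(self, prices):
        self.prices = list(prices)
        self.t = 0
        self.position = 0.0

    def price(self):
        return self.prices[self.t]

    def order(self, amount):
        self.position += amount

    def tick(self):
        self.t += 1

class ThresholdBot(Bot):
    """Toy bot: buy below a fixed threshold, sell above it."""

    def __init__(self, threshold):
        self.threshold = threshold

    def step(self, market):
        market.order(1.0 if market.price() < self.threshold else -1.0)

market = ReplayMarket([4.0, 6.0, 5.5])
bot = ThresholdBot(threshold=5.0)
for _ in range(len(market.prices) - 1):
    bot.step(market)
    market.tick()
print(market.position)
```

The payoff of the layering is exactly the listed benefit: the same `ThresholdBot` would run unchanged against a real exchange wrapper implementing `Market`.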

I'm stuck working (very slowly) on the first 3 items, and there are a lot of "implements" in the list that are either non-trivial or a lot of work, or both.

I am sure this idea carries some problems, too: for example the flux between the different trading models (or in-/outflux from outside) might be high and unpredictable enough to render the predictions unusable. I'm not sure whether introducing yet another layer on top here would be sensible. Maybe people (money) switching trading models can be modelled to be triggered by certain market conditions (e.g. your "panic mode"). Also individuals could be modelled to run out of money, for example, and are then forced to switch trading model. Should that be modelled down to the individual level or can that be modelled by flux to different bots? Will we end up with too many possibilities of "optimal distributions" and no predictive power in the end?

I think these (and more) problems cannot be easily answered without getting our hands dirty and trying the idea.

On the other hand: this just has to be better and more (automatically) flexible than technical analysis ;)

Actually this has to have been tried already or be used in the wild (non-bitcoin trading world), no?
kjj
legendary
Activity: 1302
Merit: 1026
December 20, 2011, 04:35:08 PM
#8
Dan, that is brilliant :¬) I'm in.

I imagine it could be like modeling the weather, in that your model would lose accuracy increasingly rapidly into the future.

Worse.  The weather, last I checked, doesn't actively try to defy the forecasts.
hero member
Activity: 1778
Merit: 504
WorkAsPro
December 20, 2011, 04:26:06 PM
#7
Dan, that is brilliant :¬) I'm in.

I imagine it could be like modeling the weather, in that your model would lose accuracy increasingly rapidly into the future.
hero member
Activity: 672
Merit: 500
December 20, 2011, 04:16:38 PM
#6
The Fourier transform doesn't "assume" anything about your signal. It is just a different way of looking at the same data. What it does is make clear any frequency peaks or cutoffs in your data that you probably can't see in the time domain.

My best idea for "tricky analysis" would be to numerically model the trading models of others. In other words, try to predict the models that everyone else is using to trade, and then model that. So take all of the bitcoin in existence, and all of the money, and divide it up by what model you think that money is following. For most of it, the model will be "no trading", which is really simple to model. Add to that a random model: a certain amount of money and bitcoin will just act without any logic or reason. Add to that a buy-and-hold model for people wanting to hoard bitcoin. After that, maybe something simple like a trailing-average model: if the price is above the average, sell; if it's below, buy. The size of the trailing average would be a spectrum along which money would have to be allocated. Then a linear-prediction model, where a given time history is used to build a linear trend, used like the trailing average again on a time spectrum. On top of this, add a panic-sell model for buying/selling to minimize loss. Calculate, for each timestep, how money moves between these models based on gains/losses, and also how untraded money moves into the market, perhaps based on external events. Optimize the initial distribution on historical data and see if you can get any usable results. Of course this would in practice be a lot harder than I've made it sound, which is why I haven't actually done it. But since there are very few "fundamentals" governing the value of bitcoin, its price should be mostly dictated by the psychology of those trading it.
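The allocate-then-flux loop described above can be sketched as a toy with three model buckets: "no trading", the trailing-average rule from the post, and a random model. Each timestep the buckets gain or lose with their position times the price move, and losing buckets bleed a fraction of money toward the winner. Every constant, the flux rule, and the bucket list here are illustrative placeholders, not a fitted model:

```python
import random

def trailing_avg_signal(history, window=3):
    """The post's rule: sell above the trailing average, buy below it."""
    if len(history) < window:
        return 0.0
    avg = sum(history[-window:]) / window
    return -1.0 if history[-1] > avg else 1.0

def simulate(prices, flux_rate=0.1, seed=0):
    rng = random.Random(seed)
    # initial money allocated to each model bucket (would be optimized
    # against historical data in the real scheme)
    money = {"hold": 50.0, "trailing": 30.0, "random": 20.0}
    history = []
    for t in range(1, len(prices)):
        history.append(prices[t - 1])
        ret = (prices[t] - prices[t - 1]) / prices[t - 1]
        signals = {
            "hold": 0.0,
            "trailing": trailing_avg_signal(history),
            "random": rng.choice([-1.0, 0.0, 1.0]),
        }
        # each bucket gains or loses with its position times the move
        gains = {m: money[m] * signals[m] * ret for m in money}
        for m in money:
            money[m] += gains[m]
        # flux: losing buckets bleed a fraction of money to the winner,
        # mimicking traders abandoning models that lose them money
        best = max(money, key=lambda m: gains[m])
        for m in money:
            if m != best:
                moved = flux_rate * max(0.0, -gains[m])
                money[m] -= moved
                money[best] += moved
    return money

final = simulate([5.0, 5.2, 5.1, 5.4, 5.3, 5.6])
print(final)
```

The "optimize the initial distribution on historical data" step would then wrap `simulate` in a search over the starting `money` allocation and `flux_rate`.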
hero member
Activity: 1778
Merit: 504
WorkAsPro
December 20, 2011, 04:03:28 PM
#5
It presumes some oscillation is involved, but the oscillation doesn't have to be regular; it can start and stop in patterns (for example, it's very useful in music), and it can even be non-sinusoidal, etc. I wonder if the concept can even be taken further, to simplify complex systems into the most efficient groupings of components; however, I digress.

Fourier compares everything to a sine wave. In the same way that an evolutionary algorithm, from another perspective, can be seen as only one of many search methods, spectrum analysis could be seen as just one configuration of the core idea behind it; there will be many other permutations, for example adding abstraction within it. In music, for instance, it could probably be done with a piano note instead, so middle C shows up as 261.626 Hz (instead of the more conventional many different frequencies, with only the sine wave at middle C showing as 261.626 Hz). This example variant might not be that useful; it's just an example.

I give phase, frequencies and search as merely 3 things of many; everything has probably been tried on the stock market, but this is not the stock market.

And yes, I completely agree regarding how a beginner can see things others can't; really, the best of both is the best. As a programmer I am very aware that many in my industry seem to have kept things hard for beginners, almost seemingly on purpose. Keeping things accessible is close to my heart; ironically, working out the simplest way of doing something so that it still all works well can be very complex.
kjj
legendary
Activity: 1302
Merit: 1026
December 20, 2011, 03:45:40 PM
#4
Hmm.  If only someone 200 years ago had devised a method for analyzing signals in terms of simpler signals...

Honestly, financial signals don't really work well under spectral analysis.  The feedback mechanism is too noisy.

Doesn't the Fourier transform assume periodicity of the time-domain function? If so, it'd be unusable for making meaningful predictions, no?

Sometimes a naive approach by someone unencumbered can yield better results than using proven tools. I'm not saying this is the case here, just suggesting to not dismiss the approach prematurely.

Surprisingly, Fourier works perfectly fine on non-periodic signals, sorta.  You can decompose any signal into periodic components, but in lots of cases those components aren't very meaningful.  A random signal will decompose into random components, a square wave will decompose into odd harmonics, etc.
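The point can be seen numerically: a sine with a whole number of cycles puts essentially all its spectral energy in one bin, while an aperiodic ramp still decomposes, but its energy smears across many bins with the biggest one at DC, which tells you little. A small sketch, assuming numpy is available:

```python
import numpy as np

n = 256
t = np.arange(n)

signals = {
    "sine": np.sin(2 * np.pi * 8 * t / n),  # exactly 8 cycles
    "ramp": t / n,                          # aperiodic trend
}

peaks = {}
for name, sig in signals.items():
    spectrum = np.abs(np.fft.rfft(sig))
    peaks[name] = int(spectrum.argmax())
    # fraction of total spectral energy sitting in the biggest bin
    print(name, peaks[name], round(float(spectrum.max() / spectrum.sum()), 3))
```

The sine peaks at bin 8 with near-total concentration; the ramp peaks at bin 0 (the DC term) with its remaining energy spread over the harmonics, which is the "decomposes, but not meaningfully" case.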
donator
Activity: 2772
Merit: 1019
December 20, 2011, 01:40:53 PM
#3
Hmm.  If only someone 200 years ago had devised a method for analyzing signals in terms of simpler signals...

Honestly, financial signals don't really work well under spectral analysis.  The feedback mechanism is too noisy.

Doesn't the Fourier transform assume periodicity of the time-domain function? If so, it'd be unusable for making meaningful predictions, no?

Sometimes a naive approach by someone unencumbered can yield better results than using proven tools. I'm not saying this is the case here, just suggesting to not dismiss the approach prematurely.
kjj
legendary
Activity: 1302
Merit: 1026
December 20, 2011, 12:35:45 PM
#2
Hmm.  If only someone 200 years ago had devised a method for analyzing signals in terms of simpler signals...

Honestly, financial signals don't really work well under spectral analysis.  The feedback mechanism is too noisy.
hero member
Activity: 1778
Merit: 504
WorkAsPro
December 20, 2011, 12:08:34 PM
#1
Like an endless mountain it rolls on, the price over time of BTC over USD. I've seen various methods of estimating the future price, but I'd like to consider a few more advanced ones, ones that are not specifically financial.

I tried what I think is called phase analysis myself: taking the price at each moment in time as X and the price at the previous moment as Y, for each moment, producing a scatter graph. While this doesn't seem helpful, it's surprisingly good at showing patterns in erratically varying values. I found nothing but static; perhaps this is a good indication that we are looking for patterns in clouds.
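The construction described above (price at time t against price at t-1, often called a lag plot) takes a few lines; with pure noise the points form the structureless cloud the author saw, while a random-walk-like series hugs the diagonal. A sketch assuming numpy, using synthetic series rather than real BTC/USD data:

```python
import numpy as np

def lag_pairs(prices, lag=1):
    """Return (x, y) arrays: price at t-lag paired with price at t."""
    p = np.asarray(prices, dtype=float)
    return p[:-lag], p[lag:]

rng = np.random.default_rng(0)
noise = rng.normal(5.0, 0.5, size=500)                   # pure static
walk = 5.0 + np.cumsum(rng.normal(0.0, 0.05, size=500))  # random walk

corrs = {}
for name, series in [("noise", noise), ("walk", walk)]:
    x, y = lag_pairs(series)
    corrs[name] = float(np.corrcoef(x, y)[0, 1])
    print(name, round(corrs[name], 3))
```

The lag-1 correlation is one way to quantify what the scatter shows: near zero for the static, near one for the walk. Scattering `x` against `y` with any plotting library gives the picture itself.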

Another possibility is that by using spectrum analysis a graph can be made where X is time, Y is frequency, and the shade at that position shows the amplitude at that time and frequency.
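That time-frequency picture is a spectrogram: slide a window along the series, take the FFT of each slice, and record the magnitudes. A minimal sketch assuming numpy (real tools such as scipy.signal's spectrogram do this with proper windowing and overlap handling); the test signal switches frequency halfway so the peak bin visibly moves:

```python
import numpy as np

def spectrogram(signal, window=64, hop=32):
    """Rows = time slices, columns = frequency bins, values = amplitude."""
    signal = np.asarray(signal, dtype=float)
    slices = []
    for start in range(0, len(signal) - window + 1, hop):
        # taper each slice with a Hann window to reduce edge artifacts
        chunk = signal[start:start + window] * np.hanning(window)
        slices.append(np.abs(np.fft.rfft(chunk)))
    return np.array(slices)

t = np.arange(512)
# frequency doubles halfway through the series
sig = np.where(t < 256,
               np.sin(2 * np.pi * 4 * t / 64),
               np.sin(2 * np.pi * 8 * t / 64))
spec = spectrogram(sig)
print(spec.shape, int(spec[0].argmax()), int(spec[-1].argmax()))
```

Plotting `spec` as an image (time on X, bin index on Y, amplitude as shade) gives exactly the graph described in the post.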

A third and more complicated method could be to use search methods.

"How much is the graph the same as Y = 2X + 4?"
"How much is the graph the same as Y = sin(x * 5) + cos(x + 3) ^ 2"

etc...

While that's the basis of searching, it can be more practical than that. For example, the description of the graph shape could be more algorithmic, looking and acting much less like an equation and more like a program, or not. The search would probably be directed, so the most promising directions get searched the most. Examples are evolutionary algorithms, Monte Carlo tree search, and things that can be done with neural networks.
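The "how much is the graph the same as Y = 2X + 4?" question amounts to scoring candidate functions against the series and searching for the best fit. A tiny directed search over one function family, Y = a*X + b, using mean-squared error as the "sameness" measure; the evolutionary algorithms or tree searches mentioned above would replace this crude mutate-and-keep loop:

```python
import random

def mse(series, fn):
    """How much is the graph 'the same as' fn? Lower error = more alike."""
    return sum((y - fn(x)) ** 2 for x, y in enumerate(series)) / len(series)

def search(series, steps=500, seed=1):
    """Crude directed search over Y = a*X + b: mutate, keep improvements."""
    rng = random.Random(seed)
    a, b = 0.0, 0.0
    best = mse(series, lambda x: a * x + b)
    for _ in range(steps):
        na = a + rng.uniform(-0.5, 0.5)
        nb = b + rng.uniform(-0.5, 0.5)
        err = mse(series, lambda x: na * x + nb)
        if err < best:
            a, b, best = na, nb, err
    return a, b, best

series = [2 * x + 4 for x in range(20)]   # the post's example line Y = 2X + 4
a, b, err = search(series)
print(round(a, 2), round(b, 2), err)
```

Swapping in richer candidate descriptions (the "more like a program" idea) only changes what gets mutated and evaluated; the directed-search skeleton stays the same.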