For years I've been reading, here and there, about people using models to predict the BTC price, something done routinely in the stock market. The common approach is to ingest historical data at a very granular level and predict from patterns found within it. That approach, of course, lacks the input of all the fundamentals that make the market sway to the tune of real-world events and tweets, which more often than not seem far more relevant than mere historical data.
There are models that try to combine price data with information derived from social networks, incorporating sentiment analysis at a minimum. Point-in-time events that sway the market are far harder to analyse and incorporate, though as far as I recall some models try to derive even that sort of signal from social networks.
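A minimal sketch of what that combination step might look like: aligning a price series with aggregated sentiment scores by timestamp so both can feed one model. The data, the hourly granularity, and the join key are invented purely for illustration, not taken from any of the models mentioned.

```python
# Hypothetical feature-joining step: merge hourly prices with hourly
# aggregated sentiment scores (e.g. mean tweet polarity) into one record
# per timestamp, defaulting to neutral sentiment (0.0) where data is missing.
prices = {"2024-01-01T00": 42000.0, "2024-01-01T01": 42150.0}
sentiment = {"2024-01-01T00": 0.31, "2024-01-01T01": -0.12}

features = [
    {"ts": ts, "price": p, "sentiment": sentiment.get(ts, 0.0)}
    for ts, p in sorted(prices.items())
]
print(features[0])  # {'ts': '2024-01-01T00', 'price': 42000.0, 'sentiment': 0.31}
```

In practice the hard part is the aggregation itself (what counts as sentiment, over which window), not the join.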
Of course, models are bound to be useful, but they predict, they don't envision. The chosen goal is also key in all this: one could set out to predict BTC's maximum price within a 24-hour window, BTC's price one minute from now, or BTC's price one week from now. Each goal calls for different algorithms and different data curation, and leads to different potential uses. The more granular the goal, the greater the chance that something throws the prediction off track.
Everything is now buzzed as AI, though modelling has been going on for years. The difference today is that you can create hundreds of models (variants with different parameters) and evaluate their soundness with far less effort (but more computing power) than before. Modelling was the term until not long ago; now, it seems, everything needs to be rebranded as AI.
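That "hundreds of variants" loop is, at its core, just a parameter sweep with a scoring function. A minimal sketch, where the moving-average predictor, the error metric, and the parameter grid are all illustrative choices of mine, not anything from the articles discussed:

```python
from itertools import product

def moving_average_forecast(prices, window):
    """Predict the next price as the mean of the last `window` prices."""
    return sum(prices[-window:]) / window

def evaluate(prices, window, warmup):
    """Mean absolute error of one-step-ahead forecasts after a warm-up period."""
    errors = [abs(moving_average_forecast(prices[:i], window) - prices[i])
              for i in range(warmup, len(prices))]
    return sum(errors) / len(errors)

prices = [100, 102, 101, 105, 103, 108, 107, 110, 109, 112]

# Enumerate the variants (here: window sizes) and keep the best-scoring one.
grid = product([2, 3, 4], [5])  # (window, warmup) combinations
best = min((evaluate(prices, w, warm), w) for w, warm in grid)
print(best)  # (error of best variant, its window size)
```

Scale the grid up to hundreds of combinations and run it on a cluster, and you have the workflow the paragraph describes; the novelty is the cheap compute, not the idea.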
A few recent articles I’ve seen:
These guys claim that their models can yield returns three times higher than simply holding, whilst
these others try to incorporate past data and Google Trends into their model (which I don't believe is a great predictor).