There is one thing that has never really made sense to me when we talk about AI being used in trading, for example. If the goal of an AI is to generate alpha for a trader, there is only so much alpha to be captured in any given market situation. If the AI does a good job, it captures the maximum alpha for user A. But if user B now asks the same AI to capture maximum alpha as well, wouldn't a growing number of users necessarily drive that alpha toward zero?
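To make that intuition concrete, here is a toy back-of-the-envelope sketch (entirely my own assumption: the fixed alpha pool, the crowding factor, and all numbers are made up for illustration). It just splits a fixed amount of exploitable alpha among N identical bots and lets crowding erode the pool, since everyone trading the same signal moves the price against it:

```python
# Toy model (hypothetical, not any real trading system): a fixed pool of
# exploitable alpha is split among N identical bots, and crowding also
# erodes the pool itself because every participant trades the same signal.

def per_user_alpha(total_alpha: float, n_users: int, crowding: float = 0.1) -> float:
    """Alpha each user captures when n_users run the same strategy.

    total_alpha: mispricing available to a single, lone trader (in %)
    n_users:     number of traders running the identical bot
    crowding:    assumed fraction of the pool destroyed per extra participant
    """
    remaining_pool = total_alpha * (1 - crowding) ** (n_users - 1)
    return remaining_pool / n_users

if __name__ == "__main__":
    for n in (1, 2, 10, 100, 1000):
        print(f"{n:>5} users -> {per_user_alpha(5.0, n):.4f}% alpha each")
```

Under these made-up assumptions, the per-user alpha collapses quickly: one user keeps the full 5%, ten users get a fraction of a percent each, and by a thousand users it is effectively zero. That is the dilution I mean.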
To me this looks like a contradiction, because everyone could simply start using the same bot. In the end it would come down to who has access to the best software and the fastest execution, which makes it little more than a programming and latency competition. And even then I would be cautious, because such systems seem prone to failure when external events hit. How would the AI anticipate something like a pandemic? Or a war? Or any other event with a significant impact on the global economy?