If we're talking about chatbots, as others here are:
AI draws all its information from currently known facts and sources. There is nothing magical about AI "analyzing" something; its output is just the product of the model it was built on, and if that model is biased, the results it generates will be just as biased. Now, would you let something like that try to stop an addiction that might become life-threatening if the advice goes wrong? Who's going to be in charge of patients who need help?
Furthermore, treating addiction involves a few things:
- avoiding isolation and communicating with family
- interacting with people who don't have this problem and can easily say no, so they can serve as role models
A chatbot can simply be ignored: an addict can just turn it off and that's it. A support group or a therapist can press you to attend meetings, and family and friends can offer far more help and influence.
It looks good in theory, but the results will be much like those guys going through a divorce and building a relationship with an AI girlfriend.
BUT!!! The article isn't about chatbots!
What's worse, it goes even further:
Put simply, it proposes a dynamic AI that displays probabilities for states of mind, per situation, around gambling. Those probabilities could be adjusted by amplifying one "grader" or attenuating another, and the whole thing could be extended with "expanding relays" and so forth.
All of this is a fancy way of saying the solution is to turn you into a lab rat, where you consent to having your own senses manipulated. "Dystopian" is a mild word for it: essentially, an AI analyzing your brain and deriving patterns for your actions based on how your brain responds to different interactions.
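To show how thin (and how steerable) that proposal really is, here's a minimal toy sketch of what I take "probabilities for states of mind, adjustable by amplifying or attenuating a grader" to mean. Everything in it, the state names, the scores, the gain mechanism, is my own hypothetical illustration, not anything from the article:

```python
# Toy sketch (my own assumption) of a "dynamic display of state-of-mind
# probabilities" where each per-state score ("grader") can be amplified
# or attenuated. All names and numbers are hypothetical illustrations.
import math

# Hypothetical raw grader scores for each state of mind in a gambling
# situation, e.g. produced by some model from whatever data it watches.
grader_scores = {
    "craving": 2.0,
    "calm": 0.5,
    "stressed": 1.2,
}

# Per-grader gain: >1.0 amplifies that grader, <1.0 attenuates it.
gains = {
    "craving": 1.0,
    "calm": 1.0,
    "stressed": 1.0,
}

def state_probabilities(scores, gains):
    """Softmax over gain-adjusted scores -> one probability per state."""
    adjusted = {state: scores[state] * gains[state] for state in scores}
    total = sum(math.exp(value) for value in adjusted.values())
    return {state: math.exp(value) / total for state, value in adjusted.items()}

print(state_probabilities(grader_scores, gains))

# "Adjusting" the display: attenuate the craving grader, amplify calm.
gains["craving"] = 0.5
gains["calm"] = 2.0
print(state_probabilities(grader_scores, gains))
```

The point of the sketch: whoever sets the gains decides which "state of mind" the system reports, and therefore what the user gets nudged toward. That's the manipulation problem in a nutshell.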