
Topic: Tokenization is a new economic model, but what is the best application?

member
Activity: 163
Merit: 10
Blockchain technology and the idea of tokenization itself are not about raising money for non-existent factories and enterprises, not about government registries, and not about creating yet another crowdfunding or p2p lending platform. Public blockchains are about creating new business models that did not exist before and definitely could not be implemented in a centralized paradigm.

What is the most promising area for tokenization in your opinion? Thanks 🙏

I think the gaming industry is one of them. It just suits it perfectly!

You are probably right. The gaming industry is certainly one of the fastest growing. But what would the new business model there actually be? Every blockchain-based project for the gaming industry I have seen could just as well be implemented without decentralization, and already has a big centralized competitor...
newbie
Activity: 176
Merit: 0
Blockchain technology and the idea of tokenization itself are not about raising money for non-existent factories and enterprises, not about government registries, and not about creating yet another crowdfunding or p2p lending platform. Public blockchains are about creating new business models that did not exist before and definitely could not be implemented in a centralized paradigm.

What is the most promising area for tokenization in your opinion? Thanks 🙏

I think the gaming industry is one of them. It just suits it perfectly!
member
Activity: 163
Merit: 10
Blockchain technology and the idea of tokenization itself are not about raising money for non-existent factories and enterprises, not about government registries, and not about creating yet another crowdfunding or p2p lending platform. Public blockchains are about creating new business models that did not exist before and definitely could not be implemented in a centralized paradigm.

What is the most promising area for tokenization in your opinion? Thanks 🙏