and it can only be used to verify whether the content was modified or not — it won't be able to show what the modification was. for example, the change could simply be an edit fixing a mistake (a typo in the article), yet any change in the html of the site, however trivial, would result in an entirely different hash. so it won't be an "unalterable historical record".
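to make the point concrete: a cryptographic hash changes completely on any edit, so it can flag *that* something changed but carries no information about *what* changed. a minimal sketch with Python's `hashlib` (the snippet contents are made up):

```python
import hashlib

def page_hash(html: str) -> str:
    """SHA-256 digest of a page snapshot."""
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

# a one-character typo fix...
original = "<p>The event occured on Tuesday.</p>"
typo_fix = "<p>The event occurred on Tuesday.</p>"

h1 = page_hash(original)
h2 = page_hash(typo_fix)

print(h1)
print(h2)
print(h1 != h2)  # True: the digests share nothing, despite a trivial edit
```

the two digests are unrelated, so from the hashes alone you can't tell a typo fix from a full rewrite.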
ah. right. hmm. good point.
i'm not exactly sure HOW this would work, mind you - is it some sort of bot scraping articles for dates, events, pictures and video? if so, how do we know the articles are reporting real things and not, say, The Onion and its satire?
You can't, you'd need trusted human input for that. At best your project could be a decentralized alternative to archiving sites, but I'm sure someone has already tried it, so you may want to do some research and check whether they succeeded, and what the current state of such projects is.
As for fakes, I believe they can be combated with the same technology that creates them - machine learning. If you can teach a neural network to imitate something, you can also teach one to spot the artifacts and other patterns present in fake photos and videos. As for articles, there are also neural networks that do sentiment analysis and can spot manipulative articles with some accuracy.
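To illustrate the scoring idea (not the neural part): real systems train models on labeled text, but the output is essentially a score for loaded or manipulative language. A toy lexicon-based sketch, with a made-up word list:

```python
# Hypothetical mini-lexicon of emotionally loaded words; a real system would
# use a trained neural classifier, not a hand-written list like this.
LOADED_WORDS = {"shocking", "outrageous", "destroyed", "slams", "disaster"}

def manipulation_score(text: str) -> float:
    """Fraction of words drawn from the loaded-language lexicon (0.0 to 1.0)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in LOADED_WORDS)
    return hits / len(words)

print(manipulation_score("The council met and approved the budget"))
print(manipulation_score("Shocking disaster as senator slams outrageous plan"))
```

A neural model does the same job with far more nuance (context, sarcasm, framing), which is why the accuracy is only "some" rather than reliable.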
ugh... another good point.
as should be no shock to anyone here, i need to do a lot more research.
i kinda figured someone here would have already heard of something like this and would either point me to it or blow it full of holes right away. guess this was kinda a middle ground.
yea, the already existing archiving sites largely do this... maybe the route to take is to speak with them about implementing blockchain to prove their archives are genuine or something? i don't know.
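for what it's worth, the "prove the archive is genuine" part doesn't strictly need a full blockchain — a simple hash chain over snapshots already makes silent tampering detectable, since each entry's hash depends on the previous one. a minimal sketch (function names and snapshot contents are made up):

```python
import hashlib
import json

def entry_hash(prev_hash: str, content: str) -> str:
    """Hash an archive entry together with the previous entry's hash."""
    payload = json.dumps({"prev": prev_hash, "content": content})
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def build_chain(snapshots):
    """Build a hash chain over snapshots; returns one digest per entry."""
    hashes = []
    prev = "0" * 64  # genesis value
    for s in snapshots:
        prev = entry_hash(prev, s)
        hashes.append(prev)
    return hashes

def verify_chain(snapshots, hashes):
    """Recompute the chain and check every stored digest still matches."""
    return build_chain(snapshots) == hashes

archive = ["<html>v1</html>", "<html>v2</html>", "<html>v3</html>"]
chain = build_chain(archive)
print(verify_chain(archive, chain))   # True

archive[0] = "<html>tampered</html>"
print(verify_chain(archive, chain))   # False: every hash after the edit breaks
```

publishing the latest digest somewhere hard to rewrite (a blockchain, a newspaper, other archives) is what turns "we can detect tampering" into "outsiders can verify we didn't tamper".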
bleh.