After my last project (Hackaday.io Global Feed Scraper) was made obsolete when the API came out, I decided I should re-do the project - making use of the API this time.
So, I put together a basic PHP script that pulls every past entry from the global feed and dumps them into a database. Because the API has an hourly rate limit (it seems to be somewhere around 1000 hits per key per hour), I added the ability for the script to pick up where it left off, building up the database incrementally each time it runs. I set it up to be called once an hour on my server, and after a few hours I had every feed update in my database.
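The resume-and-rate-limit idea can be sketched roughly like this (a Python sketch, not the actual PHP script; the state file name, page size, and request budget are all assumptions on my part):

```python
# Sketch: persist the last fetched offset so each hourly run continues
# where the previous one stopped, staying under the hourly rate limit.
import json
import os

STATE_FILE = "feed_state.json"   # hypothetical state file
HOURLY_BUDGET = 900              # stay safely under the ~1000/hour limit

def load_offset(path=STATE_FILE):
    """Return the offset saved by the previous run, or 0 on first run."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f).get("offset", 0)
    return 0

def save_offset(offset, path=STATE_FILE):
    """Persist the offset so the next hourly run resumes from here."""
    with open(path, "w") as f:
        json.dump({"offset": offset}, f)

def run_once(fetch_page, store_rows, offset):
    """Fetch pages until the request budget is spent or the feed is exhausted.

    fetch_page(offset) -> list of feed entries (empty list when done);
    store_rows(rows)   -> writes the entries to the database.
    """
    for _ in range(HOURLY_BUDGET):
        rows = fetch_page(offset)
        if not rows:
            break
        store_rows(rows)
        offset += len(rows)
    return offset
```

Each cron invocation would then be `save_offset(run_once(fetch, store, load_offset()))` — the database grows a budget's worth of entries per hour until it catches up with the feed.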
Code is available on GitHub if anyone wants to try it out; otherwise, stay tuned for some data analysis in the future.
I'd be interested in seeing some project quality metrics. For example, it'd be reasonable to expect that a well-documented project would receive more followers per project view than a poorly documented one. Skulls per follower could be another quality metric.
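Those ratios are simple to compute once the counts are in the database. A minimal sketch (the function name and the idea of reading views/followers/skulls as plain counts are my own assumptions, not part of the scraper):

```python
def quality_metrics(views, followers, skulls):
    """Hypothetical per-project quality ratios from raw feed counts.

    followers_per_view:  how often a viewer is converted into a follower.
    skulls_per_follower: how often a follower goes on to skull the project.
    """
    return {
        "followers_per_view": followers / views if views else 0.0,
        "skulls_per_follower": skulls / followers if followers else 0.0,
    }
```

Ranking projects by these ratios rather than raw counts would reduce the advantage of projects that simply got more exposure.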