I definitely did not write this software myself; it normally runs on the very big computers of a very big facility. It needed some work to get running on a personal computer: a missing file, some corrections, and a few replacements of deprecated functions.
My floor-top Linux PC from 2007 has a 3 GHz quad-core i7 CPU and 6 Gbytes of RAM, which is fine for everything I do at home. Amazing to think I haven't needed to buy a new computer since then (apart from bigger disk drives).
The four data sets are very big, so the analyser software scrapes only the data it wants from the web files. The software takes about 72 minutes to process the first and smallest file, but Linux kills it when it tries the larger files. I think this is because it exhausts memory: the 2 Gbytes of swap space hits 97% utilised.
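I don't know exactly how the analyser buffers its input, but the usual fix for this kind of memory exhaustion is to stream the file rather than load it whole. A minimal sketch in Python (the function name and keyword are hypothetical, not the analyser's actual code) of processing a big file one line at a time, so only a single line is ever held in RAM:

```python
# Hypothetical sketch: stream a large file line by line instead of
# reading it all into memory, keeping only a running aggregate.
import os
import tempfile


def count_matching_lines(path, keyword):
    """Count lines containing `keyword`; iterating the file object
    reads lazily, so memory use stays roughly constant."""
    count = 0
    with open(path, "r") as fh:
        for line in fh:  # one line in memory at a time
            if keyword in line:
                count += 1
    return count


# Demo with a small stand-in file.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("event A\nevent B\nevent A\n")
    demo_path = tmp.name

print(count_matching_lines(demo_path, "event A"))  # -> 2
os.remove(demo_path)
```

Whether the real analyser can be restructured this way depends on whether its algorithm needs the whole data set in memory at once, which I haven't checked.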
I shall try it on my workplace laptop (dated 2022), which has 32 Gbytes of RAM. The CPU is a quad-core i7, with about the same processing power as my floor-top PC.