I started doing this with SETI@Home, and I've continued to run these sorts of programs on my computers ever since. SETI@Home used BOINC, which is still used by other projects. I also use World Community Grid. Highly recommend!!
I currently have tasks queued or being worked on from World Community Grid, yoyo@home, and Rosetta@home.
I just started running tasks for yoyo@home, and by golly, these tasks are freaking HUGE! The shortest task estimate is 2 days and 18 hours.
Most of the tasks Rosetta@home offered took me more than a day to complete.
World Community Grid has been the best at finishing tasks without taking more than a day. The longest estimated time to completion has been under 9 hours.
This time of year I take the computer running my home NAS and move it to my bedroom and set up BOINC. Literally keeps the room 7-10 degrees (F) warmer.
Unless of course you have a heat pump (or an A/C with a reversing valve, which is the same thing), in which case by all means use that instead. You're not going to top a heat pump in terms of efficiency: we're talking 300% efficient heat production minimum, while the best you can hope for with resistive electric heating is 100%.
Pretty sure it's 100% efficient (100% of the energy consumed is emitted as heat), meaning it's exactly as efficient as a space heater. The only way to get more efficient is a heat pump.
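A quick back-of-the-envelope sketch of the comparison above. The COP (coefficient of performance) of 3.0 for the heat pump is an assumed typical value, not a measurement; real heat pumps vary with outdoor temperature:

```python
def heat_delivered_kwh(electric_kwh: float, cop: float) -> float:
    """Heat output = electrical input * coefficient of performance."""
    return electric_kwh * cop

electric_kwh = 10.0  # electricity consumed

# Resistive heating and a computer both convert ~100% of input to heat (COP = 1).
space_heater = heat_delivered_kwh(electric_kwh, cop=1.0)
computer = heat_delivered_kwh(electric_kwh, cop=1.0)

# A heat pump moves additional heat from outside; COP of 3 is an assumption.
heat_pump = heat_delivered_kwh(electric_kwh, cop=3.0)

print(space_heater)  # 10.0 (kWh of heat)
print(computer)      # 10.0 (kWh of heat)
print(heat_pump)     # 30.0 (kWh of heat)
```

So for the same electricity bill, the heat pump delivers three times the heat, which is why "300% efficient" is the usual shorthand even though no energy is being created.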
They're all kind of old, though. Most of the active ones seem to be 5-10 years old. Are there any recent new projects?
And are the projects from like 2009 still feasible? I mean, both the algorithms and the compute hardware in those universities' datacenters may have made leaps forward since then.
Wow, I didn't know. The first exaFLOP computing system... I tried looking up more, but that seems complicated. I'm missing some graph of TFLOPS over time; the only thing I found is an old one from 2012. Do you happen to know if participants get any list of what their contribution achieved in return? It'd be nice to know what kinds of scientific papers about Covid were written with the help of that massive compute capacity.
From what I've gathered from my recent experience running tasks, a project might have started years ago, but it's still offering tasks to be completed.
Einstein@Home does pulsar and (continuous) gravitational wave research. They have some long-running pulsar projects whose newly found pulsars are still being published, and the continuous gravitational wave research usually gets a new project every 6-12 months.
The algorithms are improving all the time, and so are the volunteer computers.
Which of these make their data publicly available?
Because the greatest scientific contribution would not be hoarding the data so you can publish your paper, but making it freely available, so that any group of researchers can look through it and contribute to scientific knowledge by analysing the findings in different ways.
I was going to mention ArchiveTeam's warrior because I thought it wouldn't be listed, since computing isn't really the important thing you're donating, more your virgin IP address and internet connection... but it's third on the list!
The URLs project plays fast and loose and archives an assortment of random URLs. This one has an IP-block warning.
Some have NSFW warnings.
Other projects aim to archive a single site as accurately as possible (possibly with a deadline when the site is shutting down), so they can't afford to have their warriors blocked or rate limited. If you do get blocked, that's because of an issue, and since the block lands on your IP, you can choose to archive only sites you don't want to visit yourself to avoid issues.
OONI monitors internet censorship and other forms of network interference, especially by state actors, worldwide. It's an important contributor to digital rights and freedoms IMO, and you can run their client in the background to contribute non-personal data on pretty much any device.
I used to run this Docker image for ArchiveTeam. I've helped archive a hundred gigabytes or so of reddit data. After a couple of resets of my home server, I don't have it running right now.
Yep. There's a lot of misinformation due to the banking lobby's propaganda. Bitcoin was designed to be more efficient.
It doesn't need humans to deal with transaction reversals (this was a key part of the design, per the white paper), so you cut out all the energy that goes into constructing and maintaining office buildings. It also scales up without using additional energy, so if you actually look at the numbers, it uses orders of magnitude less energy.