- climateprediction.net: Weather@Home Mexico: New Climate Modelling Experiment Launching Soon
25.08.2015 14:24 Uhr
We’re pleased to announce that we will be adding a new regional model to our weather@home regional modelling experiments, covering Mexico and parts of North and South America.
The experiment that will be run with this model will initially look at the influence of human-caused climate change on two unusual weather events in 2004/5: the very wet winter season over the northwest of Mexico and the anomalously wet summer over the southeast of Mexico, which coincided with the most active Atlantic hurricane season in recorded history.
This experiment is part of the RECLIM-UK (Regional Climate Projections Initiative Mexico – UK) project, sponsored by the British Council, and will be led by Dr Ruth Cerezo-Mota at the Universidad Nacional Autónoma de México (UNAM).
You can read more about this new region and experiment on the weather@home Mexico page.
Our volunteers can expect to be running Mexico region models in the next couple of months.
Tropical system’s trajectories during the 2005 hurricane season over the Atlantic (data from the National Hurricane Center, image from Wikipedia)
- climateprediction.net: Change needed to avoid ‘paralysis’ in climate policies
04.08.2015 13:54 Uhr
Climate scientists, including climateprediction.net’s Dr Friederike Otto and Professor Myles Allen, are calling for an overhaul of the way climate change pledges are assessed, in order to avoid ‘indefinite procrastination’ on designing efficient mitigation policies.
Writing in Nature Climate Change, they say that the ‘pledge and review’ approach that will form the basis of commitments made at the UN climate change negotiations in December presents an opportunity to link mitigation goals explicitly to the evolving climate response.
“Scientific uncertainty about the long-term impacts of climate change is often used as an excuse for inaction, or as a basis for recommending highly precautionary worst-case-scenario strategies, which may be unpalatable to policy makers juggling economic and political interests,” said Dr Friederike Otto of Oxford’s Environmental Change Institute, lead author of the paper.
“Human-induced warming has brought us 10% closer to 2°C since 2009. So any country whose government acknowledged in 2009 that CO2 emissions must reach net zero by the time temperatures reach the target stabilisation level of 2°C should be 10% of the way there now. However there still is no overall strategy to achieve this.”
The authors, who include the Oxford Martin School’s Professor Myles Allen, argue that strategies should be ‘anti-fragile’, meaning they are not just robust under uncertainty but more successful under a wide range of uncertainties, including scientific, economic and political risks. Learning from trial and error is an integral part of such an ‘anti-fragile’ strategy, allowing for evolving knowledge to be incorporated at low costs.
They looked at what climate policy makers could learn from adaptive management techniques, to create an approach to mitigation that more fully accounts for the set of risks that governments care about, is less dependent on a globally binding mandate, and which could, therefore, be a better way to preserve flexibility in climate mitigation.
They recommend an adaptive strategy grounded on an index of the warming attributable to human influence, which is itself based on observed temperatures. Calculated in 2014, the rise in global mean temperature attributable to anthropogenic influences was 0.91°C.
In contrast to global mean temperature, the ‘attributable anthropogenic warming’ index is not subject to high year-to-year and decadal variability. It also requires no complex modelling and could be updated on an annual basis, allowing governments to review their pledges to reduce greenhouse gas emissions.
The paper outlines three policy options using the index and concludes that indexing to attributable anthropogenic warming allows a transparent link between the policy instrument and the policy goal. It is a simple way to monitor the overall consistency between the evolving climate change signal, individual countries’ pledges and the overall goal of achieving net zero CO2 emissions by the time we reach 2°C of human-related warming.
Dr Otto concludes: “At a crucial time for climate negotiations, the proposed index offers a transparent and accountable method of evaluating climate policies that deals with the remaining uncertainty of the climate response, which has so far had a paralysing effect on climate change policies.”
- View the full text of ‘Embracing uncertainty in climate change policy’
- More information on the index can be found at www.safecarbon.org, an up-to-the-second index of human-induced warming relative to the mid-19th century (1861-80) and cumulative carbon dioxide emissions from fossil fuel use, cement manufacturing and land-use-change.
- Human-induced warming is currently increasing at 1.8 hundredths of a degree per year, or 0.6 billionths of a degree per second.
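The two rates quoted above are the same number in different units. A quick sanity check, using only the figures given in the article:

```python
# Sanity check of the quoted warming rate: 1.8 hundredths of a degree
# per year, expressed in degrees per second (values from the article).
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # average Julian year, ~3.156e7 s

rate_per_year = 0.018                      # degrees C per year
rate_per_second = rate_per_year / SECONDS_PER_YEAR

print(f"{rate_per_second:.2e} degC/s")     # ~5.7e-10, i.e. ~0.6 billionths
```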
This article originally appeared on the Oxford Martin School website – read the original here.
- climateprediction.net: Weather@home 2015 Western US Drought: All models have been sent out, about half have come back
23.07.2015 11:01 Uhr
So, thanks to the fantastic efforts of our volunteers, all 22,000 models for this experiment have now been downloaded to run on people’s home computers, and nearly half have finished running and have been uploaded back to our servers.
The plots for the results so far have been updated to include all the models that have been returned to us – currently sitting at nearly 8,000 models!
Thanks again to everyone who has helped us with this experiment by running our models on your computer, we couldn’t do this without you.
There are 3 plots for each state; here’s one of them, showing temperature in California:
- climateprediction.net: New Climatology Results for Western US Drought Experiment
15.07.2015 09:49 Uhr
We now have the first results for our Climatological simulations, investigating the influence of removing the ‘blob’ of warm sea surface temperatures off the western US coast.
The ‘blob’ has a strong influence on the temperature; for example, the climatological simulations without the ‘blob’ are colder than the actual or natural simulations.
In the climatological simulations, it is interesting to see a different response in the precipitation between the different states. This is something our scientists will be investigating in more detail in the upcoming weeks.
There are 3 plots for each state, here’s one of them showing temperature in California. The experiment is looking at two possible influences on the drought in the Western US – climate change and the “blob”. In the plot below, there are 3 sets of data:
- “Actual” – these are models that use observed data to simulate the climate
- “Natural” – these are models that show a “world that might have been without climate change”
- “Climatology” – these are models that include climate change, as observed, but have removed the “blob”
There are still a few thousand models left to run, so please do sign up if you haven’t already, and help us answer this fascinating and important question!
Read more about the experiment setup.
See all the results so far for individual states here:
- climateprediction.net: Update: Heatwave Twice as Likely Due to Climate Change
09.07.2015 14:25 Uhr
A team of international scientists says that it is virtually certain that the heat wave that stretched across much of Europe in early July was more likely to happen now than in the past due to climate change.
Based on the synthesis of two independent peer-reviewed approaches, they conclude that heat waves like this now occur twice as often over a large part of Europe, and four times more often in some of the hottest cities. The results are a part of the developing field of “weather attribution” that uses observational weather and climate data, weather forecasts and climate models.
In recent years, more and more “event attribution analyses” have appeared in the scientific literature (see the 2014 BAMS Special Report). This analysis, however, was conducted in real time, providing results as the heat wave unfolded, while still being based on peer-reviewed scientific methods.
As we reported last week, for this ongoing heat wave in Europe, Climate Central convened an international team of scientists from Oxford University, KNMI, Red Cross Red Crescent Climate Centre, along with regional partners from CNRS (France), DWD (Germany), and MeteoSwiss. They produced a preliminary analysis of the annual maximum of 3-day maximum temperature based on observations up to July 1 and forecasts up to July 5.
It is important to note the difference in results between the first phase of this analysis and the updated version using observations. The increases in likelihood are in good overall agreement for each of the 5 cities. Return time agreement was also good for the stations in the west, where the majority of the heat wave had already occurred (e.g., Madrid) or was occurring (e.g., De Bilt). However, the return times changed substantially at the eastern stations, where the peak of the heat wave was up to five days in the future at the time the first analysis was done. The following table shows how the forecast temperatures for each city verified against observations:
3-Day Observed/Forecast and 3-Day Observed Maximum Temperatures

City | 3-day max obs/fx (July 2) | Dates | 3-day max obs (July 7) | Dates
De Bilt | 31.9°C | July 2-4 | 31.8°C | July 1-3
Madrid (diff. station) | n/a | n/a | 39.6°C | June 28-30
Mannheim | 37.0°C | July 2-4 | 38.0°C | July 3-5
Beauvais-Tille | n/a | n/a | 33.2°C | July 1-3
Zurich | 35.4°C | July 4-6 | 34.0°C | July 3-5
Now, that analysis has been redone using observations up to July 6, by which time the heat wave had subsided over most of Europe. The scientists used two independent methods in this analysis, allowing for high confidence in the results. In one method, a statistical analysis of observational records was performed (using the KNMI Climate Explorer) to compare this summer’s heat with summers during the early part of the century, before global warming played a significant role in our climate. This detects trends but cannot attribute the causes.
The second method, conducted by our researchers in Oxford, uses a large computing network (weather@home) to simulate the likelihood of seeing days as hot as those Europe has been experiencing over the past week. At the same time, we also simulated a summer without human-influenced climate change. Comparing those two “worlds,” we found that in the 5 cities analysed, the current conditions are now at least twice as likely due to climate change. The model does not include the urban effects that are accounted for in the methodology based on observations of urban stations.
“The regional weather@home model serves as a nice way to do an independent check on the observational analysis,” said Oxford’s team lead Friederike Otto. “Think of the combined results as a good first step towards answering the climate question.” In this case the Oxford team was a bit limited because the observed sea surface temperatures that drive the model are not yet available. Instead, the summer of 2014 was used as a proxy. The team felt the choice was solid because the influence of the exact sea surface temperatures on summer temperatures in Europe is small compared to the overall effect of global warming.
The results were:
- In De Bilt, the trend analysis of the observational data shows that a 3-day period as hot as experienced over this past week is now more than 4 times more likely to occur than it was around 1900. Using the weather@home model, scientists estimate that climate change has made the observed heat wave almost two times more likely to occur. This means that what would be a 1-in-7 year event in the world without climate change is now a 1-in-4 year event.
- In Madrid, using the weather@home model, we estimate that climate change has made the observed heat wave 5 times more likely to occur. Said differently, what was once a 1-in-100-year event in the world without climate change is now a 1-in-20-year event.
- In Mannheim, the trend analysis of the observational data shows that the heat wave was a rare event even in the current climate. A 3-day period as hot as experienced over the past week should occur roughly every 30 years now, but, using the weather@home model, we estimate that climate change has made it almost 4 times more likely to occur.
- In Beauvais-Tille (a town 80 km north of Paris, far beyond the suburbs with a good series of observations without urban effects), the trend analysis of the observational data shows that an event like the 2015 heat wave is now expected every three years. Using the weather@home model, we estimate that climate change has made the observed heat wave 25% more likely to occur.
- In Zürich, the trend analysis of the observational data shows that a 3-day period as hot as experienced over the past week is expected nowadays every 15 years or so. This is more than 2.5 times more likely than it was around 1900. Using the weather@home model, we estimate that climate change has made the observed heat wave about 3 times more likely to occur. What would have been a 1-in-40-year event in a world without climate change is now a 1-in-15-year event.
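The “times more likely” figures in the results above come from comparing return periods between the two simulated worlds: the annual probability of an event is roughly 1 divided by its return period in years. A minimal sketch (illustrative only, not the study’s code):

```python
# Sketch of the return-period arithmetic used in the results above:
# annual probability ~ 1 / (return period in years), and the likelihood
# change is the ratio of those probabilities.
def risk_ratio(return_period_natural: float, return_period_actual: float) -> float:
    """How much more likely the event is in the actual world
    than in the counterfactual 'natural' world."""
    p_natural = 1.0 / return_period_natural
    p_actual = 1.0 / return_period_actual
    return p_actual / p_natural

# De Bilt: a 1-in-7-year event became a 1-in-4-year event
print(risk_ratio(7, 4))     # 1.75 -> "almost two times more likely"
# Madrid: a 1-in-100-year event became a 1-in-20-year event
print(risk_ratio(100, 20))  # 5.0 -> "5 times more likely"
```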
Read more on the Climate Central website:
Read Nature’s coverage of this story:
- climateprediction.net: Preliminary Results for Western US Drought – 607 Models
07.07.2015 15:25 Uhr
We sent out the first models to our volunteers last week and so far we’ve had 607 returned to us – here are some plots using those initial models.
Eventually, we will be running over 20,000 models, so we will add these to the plots as they come in – keep an eye on our results page where we’ll be posting updated plots regularly.
We still have many simulations ready for people to run, so if you are not running our models already, please consider signing up!
The “climatology” results, which look at the possible effect of the “blob” on the drought, are not ready yet, so we’re just showing you the “actual” (world with climate change) and “natural” (world that might have been without climate change) experiments.
For an explanation of the 3 different sets of models we’re running, read the Experimental Setup page.
For an explanation of what these plots are showing, read the Return Time Plots page.
We have 3 plots for each of the 3 states we’re looking at, but as an example, here’s temperature for California:
You can see the full plots (so far) for precipitation, snow and temperature on the individual results pages for each State:
- climateprediction.net: Climate Change Plays Significant Role in European Heat Wave
02.07.2015 13:27 Uhr
The ongoing heat wave dominating a large swath of Europe is being exacerbated by climate change, according to a new analysis by a team of international scientists using both observational data and climate models.
It is widely accepted that climate change will increase the frequency, intensity and duration of heat waves (Meehl and Tebaldi, 2004). In 2004, a paper published in Nature showed that climate change had at least doubled the risk of the record-breaking 2003 European summer heat wave that is believed to have killed more than 70,000 people.
In this case, as part of the world weather attribution project, our partners at Climate Central convened an international team of scientists from Oxford University, KNMI, Red Cross Red Crescent Climate Centre, along with regional partners from CNRS and MeteoSwiss in order to assess the potential role of global warming on a specific heat event – not just a record hot summer. The team defined the specific heat event for selected European cities (De Bilt, Madrid, Mannheim, Paris, and Zurich) using a reference of 3-day maximum temperature.
Using our large weather@home computing network, we simulated the likelihood of seeing days as hot as those Europe is experiencing now. At the same time, we also simulated a summer where there is no human-influenced climate change. Comparing those two “worlds”, we found that in the 5 cities analyzed, average 3-day high temperatures like those currently being observed and forecast have been made at least twice as likely in the world with climate change.
“The results we currently have should be viewed as preliminary,” said Friederike Otto, “because the observed sea surface temperatures are not yet available for this summer, we have simulated the summer of 2014 as a proxy. However, because it is well understood that the influence of the exact sea surface temperatures in Europe is small compared to the overall effect of global warming, these numbers provide a good first step towards answering the climate question.”
Using a combination of observed and forecast data, scientists from the team at KNMI computed the annual maximum of 3-day maximum temperature (observations up to July 1, forecasts up to July 5).
“There is a strong upward trend in 3-day maximum temperatures over the area affected by this heat wave,” said Geert Jan van Oldenborgh, a climate scientist at KNMI. “The trend is clear both in station data and in reanalysis data.” Because the heat wave is ongoing, the analysis partly relies on forecasts for the next few days. A statistical analysis of the observations shows that the probability of observing such a heat wave has more than doubled over the past 37 years in most of the affected region. In the selected cities the increase is even stronger.
The methodology used in these two approaches is drawn from peer reviewed literature. For more details on each approach please refer to the Methodology outlined for a previous analysis on the record heat in Europe in 2014.
- Meehl, G.A. and C. Tebaldi. 2004. More intense, more frequent, and longer lasting heatwaves in the 21st century. Science 305:994-997.
- Stott, P.A., Stone, D.A., and Allen, M.R. (2004) Human contribution to the European heatwave of 2003. Nature, 432(7017): 610–614.
- climateprediction.net: climateprediction.net at Our Common Future Under Climate Change, Paris
02.07.2015 10:28 Uhr
Most of our science team will be in Paris next week for the international science conference, Our Common Future Under Climate Change.
If you’re going, here are the talks that climateprediction.net researchers are involved in:
Tue 07 July
1118 (a) – Attribution of extreme events: How are high impact extreme events changing and why, UNESCO Fontenoy
- 17:30 Attribution of record high daily temperatures in Australia in 2013, D. Karoly (University of Melbourne, University of Melbourne, VIC, Australia), M. Black (University of Melbourne, University of Melbourne, VIC, Australia), S. Lewis, (Australian National University,Canberra, Australia), A. King (University of Melbourne, Melbourne, Victoria, Australia), M. Allen (University of Oxford, Oxford, United Kingdom), F. Otto (University of Oxford, Oxford, United Kingdom), S. Rosier, (NIWA, Wellington, New Zealand)
- 17:40 Human influence on climate in the 2014 Southern England winter floods and their impacts, N. Schaller (University of Oxford, Oxford, United Kingdom), A. Kay (CEH, Wallingford, United Kingdom), R. Lamb (JBA, Skipton, United Kingdom), N. Massey (University of Oxford, Oxford, United Kingdom), G. J. Van Oldenborgh (KMNI, De Bilt, Netherlands), F. Otto (University of Oxford, Oxford, United Kingdom), S. Sparrow (University of Oxford, Oxford, United Kingdom), R. Vautard (Laboratoire des Sciences du Climat et de l’Environnement, Saclay, France), P. Yiou (Laboratoire des Sciences du Climat et de l’Environnement, Saclay, France), A. Bowery (University of Oxford, Oxford, United Kingdom), S. Crooks (CEH, Wallingford, United Kingdom), C. Huntingford (CEH, Wallingford, United Kingdom), W. Ingram (University of Oxford, Oxford, United Kingdom), R. Jones (Met Office, Exeter, United Kingdom), T. Legg (Met Office, Exeter, United Kingdom), J. Miller (University of Oxford, Oxford, United Kingdom), D. Wallom (University of Oxford, Oxford, United Kingdom), A. Weisheimer (University of Oxford, Oxford,United Kingdom), P. Stott (UK Meteorological Office, Exeter, United Kingdom), M. Allen (University of Oxford, Oxford, United Kingdom)
Wed 08 July
1114 – Global emissions and their implications for climate targets, UPMC Jussieu – Amphi 24
- 17:30 Short Lived Promise? Short-lived climate pollutants, cumulative carbon and emission metrics, M. Allen (University of Oxford, Oxford, United Kingdom)
1119 (b) – Extreme hydrological events: deciphering changes in hazard and risk at different time-scales, UPMC Jussieu – ROOM 207
- 17:30 Attribution of Hydrological Extreme Events and the Need for National Inventories of Loss and Damage, M. Allen (University of Oxford, Oxford, United Kingdom)
2240 – Perceptions of climate change, UPMC Jussieu – ROOM 101
- 18:30 Panel discussion, N. Pidgeon (Cardiff University, Cardiff, United Kingdom), F. Otto (University of Oxford, Oxford, United Kingdom)
1118 (b) – Attribution of extreme events: How are High impact extreme events changing and why? UPMC Jussieu – ROOM 307
- 18:25 Attributing regional effects of the 2014 Jordanian drought to external climate drivers, R. Zaaboul, (ICBA, Dubai, United Arab Emirates), D. Mitchell (University of Oxford, Oxford, United Kingdom), K. Bergaoui, (ICBA, Dubai, United Arab Emirates), F. Otto (University of Oxford, Oxford,United Kingdom), R. Mcdonnell, (ICBA, Dubai, United Arab Emirates), S. Dadson, (University of Oxford, Oxford, United Kingdom), M. Allen (University of Oxford, Oxford, United Kingdom)
- 18:35 Distinguishing natural and anthropogenic influences on extreme fire danger in Australia, M. Black (University of Melbourne, Melbourne, Australia), D. Karoly (University of Melbourne, Melbourne, Australia), T. Lane, (University of Melbourne, Melbourne, Australia), L. Alexander (University of New South Wales, Sydney, Australia)
- Collatz Conjecture: SSL Certificate Replaced
28.10.2015 13:29 Uhr
After finally getting the site cleared by Google’s security scan, I was able to apply for and install a new SSL certificate. Thanks for your patience.
- Collatz Conjecture: Google's Anti-Malware Detection Gives False Positives
22.10.2015 14:22 Uhr
For the last several months, Google has identified the Collatz apps as “having an unknown virus”. I have asked Google for specifics, since their scan states that Collatz has “Undetermined malware” while all virus scans I have done show no issues. Because other sites rely on Google’s security tools, I have not been able to update the expired SSL certificate. Since the BOINC client will not fall back to HTTP if there is an issue with the SSL certificate (e.g. it is expired), I had to remove the certificate. That means that logins are no longer encrypted. Thanks, Google, you’ve accomplished the exact opposite of what you intended, and all attempts to get you to rectify the situation have fallen on deaf ears.

Proof? The applications that Google complains about are exactly the same as the ones created several years ago (before the suspected trojan even existed). I can compile the application on a clean machine (a fresh install of everything from DVDs direct from Microsoft), and the applications match the supposedly infected ones byte for byte.

The issue is that, in order to scan files quickly, Google uses a “thumbprint” of each virus, and if an application happens to contain any code that matches the small snippet in the thumbprint, Google deems it to be infected. Google __should__ check for the real virus instead of a thumbprint when it finds an initial match. That would avoid false positives. Instead, they assume they are correct and ban the site. Hmmmm… I thought we were supposed to be innocent until proven guilty, not just suspected of being guilty. But I guess that’s what happens when you become the 800 lb. gorilla.
- Collatz Conjecture: Sieve in Production; Large and Solo deprecated
23.09.2015 18:48 Uhr
The collatz_sieve applications have been changed from beta to production status, so all volunteers will now be getting them. I also increased the sieve workunit size by 4 times. I expect it will take BOINC a while to re-adjust its runtime estimates, as they are based on floating point calculations, which aren’t used in Collatz except to show the percent complete. I deprecated the solo_collatz and large_collatz applications and disabled the associated work generators. Any that are in progress will still be validated; you just won’t get any new solo or large WUs, since you will receive sieve WUs instead. If the sieve application fails immediately after it starts, you probably need to install the latest Microsoft Visual C++ runtime, and, since BOINC will send your computer both 32 and 64 bit applications, you will need to install both the 32 and 64 bit C++ runtimes (see the main page for a link or just Google it).
- Collatz Conjecture: Collatz Sieve Applications released for Windows, Linux, and OS X CPUs
10.09.2015 05:20 Uhr
Sieve applications for CPUs have been released as beta/test applications. Most of the config settings do not apply to CPUs, but you will find that adjusting the lut_size and especially the sieve_size will affect both the processing speed and the RAM required (e.g. lut_size=20 and sieve_size=31 uses 200MB per app, whereas the stock settings of lut_size=12 and sieve_size=26 use 9MB per app but run about 30% slower).
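For readers unfamiliar with what these apps compute: the underlying iteration being searched is the Collatz map itself. A minimal sketch (illustrative only; the project’s actual apps use the lookup table and sieve settings above to skip numbers whose trajectories merge with smaller ones, trading RAM for speed):

```python
# Illustrative only: the basic Collatz iteration the project explores.
# The real collatz_sieve apps process huge ranges with lookup tables
# (lut_size) and a sieve (sieve_size) rather than stepping one by one.
def collatz_steps(n: int) -> int:
    """Count steps for n to reach 1 under n -> n/2 (even) or 3n+1 (odd)."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

print(collatz_steps(27))  # 111, a famously long trajectory for a small start
```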
- Collatz Conjecture: Collatz Sieve v1.21 Released for Linux and OS X
06.09.2015 13:37 Uhr
32 and 64 bit Linux versions and 64 bit OS X versions of the v1.21 sieve application have been released as test/beta applications.
- Collatz Conjecture: Collatz Sieve v1.21 released for Windows
05.09.2015 20:55 Uhr
The validation bug is now fixed, but some have reported that the application errors out immediately. If you run into error -1073741515, please try installing the latest VS2012 C++ runtime, available from Microsoft at https://www.microsoft.com/en-us/download/details.aspx?id=30679. VS2012 was recently updated to SP4 on my laptop, and that is likely the cause of the missing DLL error. New versions for Linux and OS X will be forthcoming as soon as I figure out how to get the VMs used to build the Linux versions to route properly through the VPN pipe on my laptop. And since it is Labor Day weekend, I’m trying not to work too hard.
- Collatz Conjecture: v1.20 sieve apps have been deprecated due to validation failures
05.09.2015 16:54 Uhr
While the app runs great (except on OS X with AMD GPUs — as usual), the WUs are failing to validate, so I have deprecated the 1.20 version. I am in the process of debugging the issue and hope to have a 1.21 version ready soon.
- Collatz Conjecture: Collatz Sieve 1.20 released for Windows, Linux, and OS X
04.09.2015 20:48 Uhr
Version 1.20 of the Collatz Sieve has been released for testing. (You have to enable beta/test apps in your preferences in order to get collatz_sieve workunits.) The code has been cleaned up since 1.10 and a couple of bugs fixed. Both 32 and 64 bit versions for Linux and Windows are now available; OS X is limited to 64 bit only. The Linux applications were compiled on Oracle’s version of RHEL 6.5, which is from the CUDA 3.0 era, so I’m hoping the binaries will work on other, newer Linux distributions. As usual, if they don’t, let me know. The OS X applications were compiled on Mavericks, so I have no idea whether older Macs will be able to run them or not. Supposedly, it should work on 10.5 or later, but I’m not holding my breath on that one. I’m hoping the kernel will build properly on the AMD machines, as I’ve only tested on Mavericks with an nVidia GPU.
- Constellation: Server down for reinstall & relocation
26.07.2015 13:51 Uhr
The server will go down for reinstallation on new hardware and relocation to a new datacenter. We expect to be back in about two weeks.
- Constellation: The NASA Space Apps Stuttgart 2015 Hackathon winner BigWhoop is a finalist for the NASA Global People’s Choice Award!
28.04.2015 08:38 Uhr
The NASA Space Apps Challenge 2015: one hackathon weekend, 133 cities around the world, 949 projects – and our BigWhoop project is among the final 15 for the global People’s Choice Award. Read on to learn what happened during the event at the Stuttgart hackerspace shackspace, then go and vote for BigWhoop to make it win the award!

NASA’s Space Apps Challenge is an international mass collaboration/hackathon organized on a global scale and held in major cities worldwide, with a focus on finding solutions to challenges faced in space exploration and in life on Earth. The winner of the Stuttgart chapter of the global NASA Space Apps Challenge is BigWhoop, which addresses the “Open Source Air Traffic Tracking” challenge. This challenge called for a tool that lets users select a particular flight and see out-the-window or other views of the aircraft and airspace. We decided to extend the scope of the project a bit: our app was designed as a global sensor grid to measure the whole radio spectrum, making air traffic monitoring a subset of the solution. This free, community-driven approach, based on small and low-cost software-defined radio devices, earned the local team a nomination for NASA’s global People’s Choice Award, competing among the final 15 out of 949 projects.

You can vote for the Stuttgart team as of 27 April: just visit this page and vote for our #BigWhoop app (once a day). Watch the BigWhoop video: https://www.youtube.com/watch?v=GnamC7pU1Ok

All this was possible because we started early. We held a kick-off event in February to give people the opportunity to meet, find a challenge to tackle and prepare for it before the official Space Apps hackathon weekend on 10–12 April. Having an early launch event was one of the lessons from our first Space Apps in 2012, and it worked out beautifully.
A two-day hackathon is great for pitching an idea, but we wanted to show a working project – from sensor nodes that can be deployed at hundreds of locations worldwide to processed data ready to be analyzed on a map. Starting early also made it possible to include interested people as virtual participants: three BigWhoop team members come from Bolivia, Poland and the UK, and they already knew the goals of the project and provided essential parts via Skype and GitHub. This made our project feel global and important for everybody. You can still join BigWhoop – just get our code and start using it!

The hackathon itself was great for knitting the team even tighter and adding last-minute features. The support of random people at shackspace was also a big factor: people not even on the team overheard us discussing project details, joined ad hoc and contributed their solutions. This led to the music arrangement you hear during our video presentation, and to “node zero”, the very first node of our sensor grid, now running on a virtual machine in shackspace’s experimental datacenter – not just producing data but also responsible for collecting data from everyone contributing to the grid.

Ultimately, BigWhoop is intended to run on the Constellation computation grid with its 60,000 computers. For now, we started a pre-alpha test, asking for your help during the hackathon weekend to plug in your software-defined radio devices and start a sensor node for us. Our BigWhoop software was already able to send data to our server at shackspace, and we received data from nice people in Virginia, US and Bremen, Germany. With this help, we were able to show a first live demo at the end of the hackathon. Since then, we have received further data; we are really overwhelmed by everyone’s support and want to say a big THANK YOU! As a team, we believe in BigWhoop and we will keep on working on it.

One team member took BigWhoop to Observatorio del Teide
One team member took a sensor device along on a measurement campaign for his job and monitored the spectrum in his free time next to the Teide volcano on Tenerife. The scientific paper about the project was selected for the International Astronautical Congress 2015 in Jerusalem, where we will present the Space Apps Challenge work as “Opportunities of an open-source global sensor network monitoring the radio spectrum for the (new) space community” during the Space Communications and Navigation Symposium. And lastly, BigWhoop will serve as the ground station application for the Distributed Ground Station Network project, also related to shackspace. There it will help us track and communicate with small satellites globally and continuously, helping satellite operators get their position and payload data faster than with other systems.

In conclusion, BigWhoop and the NASA Space Apps Challenge were fun and a great experience. We learned a lot during the event, felt part of a global community, worked on solving an important problem, and started a project that can become even more than we can currently imagine. As the local organizers and team, we will definitely apply for the Space Apps Challenge 2016 and bring it back to Stuttgart for you to tackle new challenges. We hope that you like our BigWhoop project as much as we do and give us your vote during the People’s Choice Award to make it win the space challenge!

Links:
- bigwhoop.aerospaceresearch.net
- vote for #BigWhoop
- code on GitHub.com/aerospaceresearch/dgsn_bigwhoop
- follow twitter.com/dgsn_bigwhoop
- DistributedDataMining: Recent problems
10.12.2014 22:19 Uhr
Hi there, recently we experienced some unfortunate problems:
1. The server hardware is old and can barely deal with the increasing load, which frequently leads to an unresponsive server. Website visitors are mainly affected, since the BOINC clients deal with the problem by reconnecting later on; the reconnects compound the problem further.
2. After getting back from two weeks of vacation, I realized that parts of the dDM database had crashed and had to be restored. As far as I can see, we suffered no loss of important data; at most, some changes to user profiles were not stored.
3. The amount of dDM data increases steadily, and the corresponding backup takes much more time than in the early stages of the project. As a result, the announced maintenance periods are no longer sufficient. I will think about a new backup strategy.
4. Some users reported validation problems. I suspect this was due to the issues mentioned above. Right now, the validation of WUs is almost finished.
I apologize for any inconvenience caused by these problems. Cheers, Nico
- DistributedDataMining: Forum under spam attack
22.04.2014 16:01 Uhr
Hi there, the dDM forums are currently under a heavy spam attack. I’ve just deleted the relevant posts. To prevent further spam, from now on users can only post new messages if they have at least one credit point. Cheers, Nico
- DistributedDataMining: dDM went offline
21.04.2013 12:46 Uhr
Hi there, this week we had some serious hardware problems which forced us to switch over to a new server. Due to hard disk failures, some data was lost, but nothing important. dDM is now running on new hardware. We are still recovering data by replaying our backups, which is why some WUs still finish with download errors. We are confident that we will be fully operational by Tuesday. We apologize for any inconvenience. Cheers, Nico
- DistributedDataMining: Usage of 'ARM' technology for the dDM project
12.03.2013 18:12 Uhr
Hello, Mr. Schlitter, I really enjoy this project and would like to contribute using the CPUs on my mobile device(s). Projects such as PrimeGrid, Enigma, SubsetSum@home, OProject@home, WUProp@home, and others have successfully used ‘ARM’ technology, which I understand is open-source, and I have been crunching for these projects for about a month successfully. Is DDM currently compatible with this technology, or are there plans to allow crunching on such devices? Many of the projects listed above have noted a sharp increase in the number of users as well as amount of research that is completed. Are you able to make this work for your project as well? I would be honoured to test and/or participate in any way. Thanks again, StandardbredHorse
- DistributedDataMining: Team challenges in September
28.08.2012 08:04 Uhr
Hi there, I recently got word that two teams are going to focus on the dDM project in September: the first is team BOINC@Poland, the second is team The Knights Who Say Ni. I am going to create some more WUs in order to satisfy the upcoming requests. Many thanks go to both teams for their support of our research. Best regards, Nico
- DistributedDataMining: New application version 1.36 for Medical Data Analysis
09.07.2012 19:57 Uhr
Recently, I’ve released version 1.36 of our application for medical data analysis. The new version takes care of finding the location of Java on Windows clients; as a consequence, it should no longer be necessary to set the PATH variable manually. Best regards, Nico
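For those curious how such an automatic lookup can work: the usual strategy is to check the JAVA_HOME environment variable first and fall back to searching the PATH. A minimal sketch of that strategy in Python (purely illustrative, not the actual dDM code, which is Java-based):

```python
import os
import shutil

def find_java():
    """Return a path to the Java executable, or None if none is found.

    Checks JAVA_HOME/bin first (covering both Windows and Unix executable
    names), then falls back to searching the directories on the PATH.
    """
    java_home = os.environ.get("JAVA_HOME")
    if java_home:
        for name in ("java.exe", "java"):
            candidate = os.path.join(java_home, "bin", name)
            if os.path.isfile(candidate):
                return candidate
    return shutil.which("java")  # None if Java is not on the PATH either

java_path = find_java()
```

An application can then launch its JVM with `java_path` instead of relying on the user having configured PATH by hand.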
- DistributedDataMining: New application version 1.04 for our Multi-Agent Simulation of Evolution
21.06.2012 14:27 Uhr
Today, I’ve released version 1.04 of our Multi-Agent Simulation of Evolution. As planned, we finished the tests of this new application. Consequently, the latest version is now available for all dDM members. The new version deals with some problems that were reported by our beta testers:
- The first version caused frequent hard-drive activity, due to writing checkpoints and detailed log files during the test period. We have now reduced the checkpoint-writing frequency and the level of detail in the log files.
- From time to time, WUs finished with error code -148, which indicates that something went wrong while starting the WU. We investigated this problem and made some code changes. We are not yet sure whether this solves the problem, but we are releasing the new version anyway in order to gather more log information and test data.
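The checkpoint fix in the first point boils down to a standard throttling pattern: write a checkpoint only when enough wall-clock time has passed since the last one. A minimal sketch, assuming a 60-second interval (the class and names are illustrative, not dDM’s actual code):

```python
import time

class CheckpointThrottle:
    """Allow a checkpoint at most once per `interval` seconds of wall time."""

    def __init__(self, interval=60.0):
        self.interval = interval
        self.last = float("-inf")  # so the very first checkpoint is allowed

    def due(self, now=None):
        """Return True (and record the time) if a checkpoint should be written."""
        now = time.monotonic() if now is None else now
        if now - self.last >= self.interval:
            self.last = now
            return True
        return False

throttle = CheckpointThrottle(interval=60.0)
# In the main computation loop the application would call:
#     if throttle.due():
#         write_checkpoint()
```

The same idea applies to log verbosity: buffer detail and flush it on the same schedule, rather than touching the disk on every simulation step.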
I also reorganized the structure of our forum:
- I added a new section Multi-Agent Simulation of Evolution dedicated to this new research topic and moved some recent postings.
- I removed the Café section because it was hardly used in the past and didn’t provide any additional benefit.
Best regards, Nico
- DistributedDataMining: New Research Topic
23.03.2012 16:05 Uhr
Hi there, we are currently in the final stage of releasing a new application which deals with the simulation of evolutionary processes. Using the Multi-Agent Simulation of Evolution, we investigate the biological phenomenon of aposematism (also referred to as warning coloration). This term describes the evolutionary strategy of certain animal species to indicate their unpalatability or toxicity to potential predators by developing skin colors and patterns that predators can easily perceive. Prominent examples of toxic animals with distinct warning coloration are poison dart frogs, coral snakes and fire salamanders. To tackle this interesting research challenge, we developed a distributed multi-agent model that simulates the dynamic interactions of predator and prey populations over time. By systematically testing different adaptation and learning strategies for the agents and exploring the parameter space of our simulation model using the computational power of the dDM project, we might be able to deepen the understanding of the aposematism phenomenon and the evolutionary paths leading to it. For now, dDM members won’t receive WUs of this new application; we are finishing our final tests and distributing these WUs to preselected hosts only. Soon I’ll send out WUs to beta test members, and after the beta test the new application will be available for all members. Cheers, Nico
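To give a flavor of what such a predator–prey model might look like, here is a toy sketch with made-up parameters (it is not the actual dDM simulation): toxic prey carry a heritable warning signal, predators gradually build up an aversion to the signal after unpleasant encounters, and over the generations the signal can spread through the prey population despite the initial cost of being conspicuous:

```python
import random

def simulate(generations=50, pop_size=200, seed=1):
    """Toy aposematism model: all prey are toxic; a heritable warning signal
    (True/False) makes them conspicuous. Predators share a learned aversion
    that grows every time they attack a signaling (toxic) prey."""
    rng = random.Random(seed)
    prey = [rng.random() < 0.1 for _ in range(pop_size)]  # 10% start signaling
    aversion = 0.0  # predators' learned avoidance of the warning signal
    for _ in range(generations):
        survivors = []
        for signal in prey:
            if signal:
                attacked = rng.random() > aversion    # avoided with prob. aversion
                if attacked:
                    aversion = min(1.0, aversion + 0.01)  # predator learns
            else:
                attacked = rng.random() < 0.5         # cryptic prey: random predation
            if not attacked or rng.random() < 0.3:    # toxicity: some attacks survived
                survivors.append(signal)
        prey = []
        for _ in range(pop_size):                     # survivors repopulate
            child = rng.choice(survivors)
            if rng.random() < 0.02:                   # rare mutation flips the gene
                child = not child
            prey.append(child)
    return sum(prey) / len(prey), aversion

signal_freq, learned_aversion = simulate()
```

The real model explores questions this toy version glosses over, e.g. individual rather than shared predator memory and the parameter ranges under which the signal can invade at all.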
- DistributedDataMining: Website Restructuring
22.03.2012 18:58 Uhr
Hi there, as you may have seen, I’ve restructured the dDM website in order to gain clarity. The main changes are:
- I reworked the start page, which now provides a brief overview of the dDM project, its history and its objectives.
- The research challenges moved to a separate page.
- I also created a new page which briefly reports on the dDM achievements.
- The menu structure was revised; all items related to the member account were moved to the right.
- I also created four different color schemes, which can be freely chosen by the members (left bottom corner).
Your comments, suggestions and wishes are much appreciated. I am going to add some more minor website features soon and would be happy to include some of your ideas. Furthermore, I’ll update and add some research-related information. That’s it for now. Cheers, Nico
- DistributedDataMining: Download problems caused by hardware issues
26.02.2012 19:17 Uhr
Recently we had some hard-disk issues, and as a result the data files of our medical application are temporarily unavailable. We are going to reconstruct the data from the latest backup. Since this will take some hours, only a few workunits will be available in the meantime. We do not expect any problems regarding the user data (e.g. credits), because our dDM database is not affected. Best regards, Nico
- DistributedDataMining: Security Update to Drupal 6.24
08.02.2012 15:07 Uhr
Due to some reported security vulnerabilities, I’ve updated the content management system of our dDM website to Drupal version 6.24. Please report any problems related to the dDM website in our forum. Best regards, Nico
- DistributedDataMining: Server Maintenance
20.01.2012 11:18 Uhr
Since we are moving the dDM server to new hardware, the project website and all BOINC functions won’t be available on Sunday (2012/01/22). So far we don’t expect any problems, and there should be no need for any changes in the client configuration. Best regards, Nico