All BOINC project news

SETI.Germany also offers an RSS feed of all BOINC news and a Twitter account on which the BOINC project news are published. More information is available here.

  • climateprediction.net: Climate change found to increase heavy rains like those of UK’s Storm Desmond
    14.12.2015 01:31 Uhr

    In the second real-time extreme weather attribution study within the World Weather Attribution project, the team found a 5-80% increase in the likelihood of heavy precipitation like that associated with Storm Desmond occurring due to anthropogenic climate change.

    The Atlantic Storm Desmond brought torrential rain and gale-force winds to parts of England, Scotland, and Ireland from early Friday, Dec. 4 to early Sunday, Dec. 6. The fourth named storm of the Autumn/Winter 2015 season, Desmond dumped so much rain in such a short period of time that the U.K. provisionally set a new all-time national record for the greatest 24-hour rainfall when 13.44 inches (341.1 mm) of rain fell on Honister, Cumbria between 6:30 p.m. Dec. 4 and 6:30 p.m. Dec. 5. The U.K. Met Office issued a rare red “take action” warning – the first since 12 February 2014 – for parts of Cumbria and the Scottish Borders as a result of this powerful storm. The excessive nature of this record rainfall event, which led to flooding of more than 5,000 homes and businesses and left over 60,000 people without power, has led many to question whether climate change played a role, especially since there have been several large floods in recent decades.

    To assess the potential link between the U.K.’s record rainfall and man-made greenhouse gases in the atmosphere, the CPDN team, together with the Royal Netherlands Meteorological Institute (KNMI) and as part of the World Weather Attribution (WWA) partnership led by Climate Central, conducted independent assessments using three peer-reviewed approaches. These approaches involve statistical analyses of the historical temperature record, the trend in a global coupled climate model, and the results of thousands of weather@home simulations. Applying multiple methods gives scientists a means to assess confidence in the results.

    Based on these three approaches – all of which are in agreement – we found that global warming increased the likelihood of the heavy precipitation associated with a storm like Desmond. The increase is small but robust. It should be noted that a positive attribution for an extreme rainfall event like Desmond is still somewhat rare. Evidence of this can be found in a summary of the events analysed as part of the annual BAMS Special Issue on Explaining Extreme Events from a Climate Perspective: whereas the vast majority of studies of heat events found a climate change signal, less than half of the papers looking at extreme rainfall events found a human influence.

    By comparing recent extreme events with the historical record and climate model simulations, we found that an event like this is now roughly 40% more likely due to climate change than it was in the past, with an uncertainty range of 5% to 80%. It is important to note that this analysis only considers externally driven changes in precipitation. It does not take into account other factors that influenced the flooding. While our analysis provides evidence that climate change has aggravated the flood hazard in this part of the world, risk is also determined by trends in exposure and vulnerability. As events like this become more common in the U.K., it will be important to discuss both the changing risks associated with global warming and the overall adequacy of flood defences.
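
    The core of such an attribution statement is a probability ratio between a factual (“actual”) and a counterfactual (“natural”) ensemble. A minimal sketch in Python – the ensemble sizes and event counts below are invented for illustration and are not the study’s data:

    ```python
    # Minimal sketch of the probability-ratio calculation used in extreme
    # event attribution. Ensemble sizes and event counts are invented for
    # illustration; they are NOT the numbers from the Storm Desmond study.

    def probability_ratio(hits_actual, n_actual, hits_natural, n_natural):
        """Ratio of the event probability in the factual ('actual') ensemble
        to that in the counterfactual ('natural') ensemble."""
        return (hits_actual / n_actual) / (hits_natural / n_natural)

    # Hypothetical example: 140 of 10,000 'actual' simulations exceed the
    # observed rainfall, versus 100 of 10,000 'natural' simulations.
    pr = probability_ratio(140, 10_000, 100, 10_000)
    print(f"probability ratio: {pr:.2f}")  # 1.40 -> event ~40% more likely
    ```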

    Following a press briefing on these findings by CPDN’s Friederike Otto and WWA colleague Maarten van Aalst during the climate conference in Paris, the research findings were widely covered in the British press, with the first article below being the title story of FT UK online on Friday 11 December.

    http://www.ft.com/intl/cms/s/0/e1466920-9f81-11e5-8613-08e211ea5317.html#slide0

    http://www.mirror.co.uk/news/uk-news/climate-change-blame-devastating-storm-6992177

    http://www.theguardian.com/environment/2015/dec/11/storm-desmond-rainfall-flooding-partly-due-to-climate-change-scientists-conclude

    http://www.oxfordmail.co.uk/news/14139502.Oxford_scientists_say_climate_change_may_have_caused_recent_heavy_rainfall_and_flooding_in_UK/?ref=rss

    At the same time as the press briefing, a scientific paper detailing the research was submitted to the open-review journal Hydrology and Earth System Sciences (HESS).

  • climateprediction.net: 8 publications in special report rely on weather@home simulations to explain extreme weather events of 2014 in Australia, Africa and South America
    05.11.2015 17:00 Uhr

    Human-induced climate change plays a clear and significant role in some extreme weather events, but understanding the other risks at a local level is also important, highlights the Bulletin of the American Meteorological Society’s annual special report, Explaining Extreme Events of 2014 from a Climate Perspective. For the fourth year in a row, it investigates the causes of a wide variety of extreme weather and climate events from around the world, this time including eight studies using weather@home simulations. Researchers from the core CPDN teams in Oxford and Melbourne teamed up with collaborators around the globe. They examined serious droughts in Brazil, East Africa and the eastern Mediterranean, heat and cold waves in Argentina and Australia, and extreme rainfall in New Zealand.

    [Cover image: BAMS special report “Explaining Extreme Events of 2014 from a Climate Perspective”]

    There was a clear influence of human-induced climate change on the temperature-driven extreme events, on the heavy rainfall in New Zealand and on the failing rains in the Levant region. However, a fingerprint of human activity was not detected in the other two droughts. In those cases, other causes of water shortages came into play through local factors such as increased water demand, population growth or the methods used for irrigating crops.

    These eight papers looking at extreme events in 2014 show just how much global warming has become a part of today’s climate. They also highlight that the field of extreme event attribution, which looks for the fingerprints of human-caused warming in extreme weather events, has made considerable advances over the past years. The goal of extreme event attribution science is to provide this evidence, and thanks to our dedicated volunteers we are in a unique position to provide the modelling framework needed to look into the changing statistics of rare and unprecedented events.
    Research findings:

    An increase in extreme heat events around the globe is observed and expected to be one of the first apparent symptoms of global warming. This increase can be quantified thanks to the very large ensembles created by CPDN volunteers. Research led by Dr Andrew King from the University of Melbourne found that, as a result of global warming, the 34°C temperature on the first day of the Brisbane G20 World Leaders Forum was 25% more likely, and that 38°C November days are now 44% more likely than in a world without human-induced climate change.

    Investigations into the January heatwave that struck the Australian Open and Adelaide were less clear. While climate change likely played a role in the four days above 41°C that plagued the hard courts of Melbourne Park, there is a small chance that the heatwave was due to natural variability alone. For the heatwave with five consecutive days above 42°C at Adelaide, lead author Mitchell Black, also from the University of Melbourne, found climate change had increased the odds of that event by 16%.

    Land surface temperatures (LSTs) in January 2014 over Australia monitored by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite. Credit: NASA

    Dr Alexis Hannart from Buenos Aires, in collaboration with CPDN’s Friederike Otto, found a clearer signal when looking at the severe Argentinian heatwave of December 2013: a fivefold increase in the likelihood of such an event occurring in a warming world.

    Climate change can also turn down the thermostat, as a team led by Dr Michael Grose from CSIRO in Tasmania, in collaboration with CPDN’s David Karoly and Mitchell Black, discovered when they examined an unusually high surface pressure anomaly off the south coast of Australia in August 2014. This pressure system drove cold air into southeast Australia, leading to severe late-season frosts across the area, with snow falling down to 200 m altitude in Tasmania. The research team found climate change doubled the likelihood of such a high pressure system forming.

    Staying in the Southern Hemisphere, a research team led by Sue Rosier looked at extreme precipitation over the North Island of New Zealand that led to severe flooding in July 2014. They found that, although still a rare event, such heavy rainfall is now expected roughly once in 200 years, whereas it would have been a 1-in-350-year event in a world without global warming.
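
    Return periods like these are simply inverted exceedance probabilities, so the two framings are interchangeable. A small sketch, with hypothetical ensemble counts chosen to reproduce the 1-in-200 and 1-in-350 figures quoted above:

    ```python
    # Sketch: turning ensemble exceedance counts into return periods.
    # Counts are hypothetical, chosen to reproduce the figures in the text.

    def return_period(exceedances, simulated_years):
        """Return period in years = 1 / annual exceedance probability."""
        return simulated_years / exceedances

    rp_actual = return_period(35, 7_000)   # with climate change: 200-year event
    rp_natural = return_period(20, 7_000)  # without warming: 350-year event
    print(rp_actual, rp_natural)                        # 200.0 350.0
    print(f"risk ratio: {rp_natural / rp_actual:.2f}")  # 1.75
    ```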

    The second study focusing on rainfall was led by CPDN partner Karim Bergaoui from the International Center for Biosaline Agriculture (ICBA) in Dubai. It focused on the lack of rainfall in the Levant region in 2014 during what would normally have been the rainy season, and found that human-induced climate change did increase the risk of such a severe and unprecedented drought, by around 45%.

    Studying the severe drought in São Paulo, the largest city in South America with a population of about 20 million, a team led by Friederike Otto found that human-induced climate change was not a major influence. The researchers examined the drought in terms of lack of rainfall, water availability, and water demand. They found that the consequences of the drought – which included temporary water shut-offs, a spike in dengue fever cases, and higher electricity prices – were a result of low water availability combined with the number of people affected and damage to the infrastructure system. They also concluded that the lack of rainfall in southeast Brazil in 2014 and 2015, while unusual, was not unprecedented, with similar dry periods having occurred before, most recently in 2001.

    The Operational Land Imager on the Landsat 8 satellite acquired this natural-color view of the Jaguari Reservoir in Brazil on August 3, 2014. Jaguari is one of five reservoirs in the Cantareira System, which used to supply water to roughly half of the people in the São Paulo metropolitan area before the water crisis. Credit: NASA

    The second Oxford-led drought study, by Dr Toby Marthews with CPDN’s Dann Mitchell, focused on the Horn of Africa. They found no influence of human-induced climate change on the lack of rain that year. The researchers did find, however, that human-induced climate change led to higher temperatures and incoming radiation, which made the population more vulnerable during drought events.

  • climateprediction.net: Record hot October in Australia at least 6 times more likely due to global warming
    05.11.2015 12:58 Uhr

    Writing in The Conversation, CPDN partners David Karoly and Mitchell Black provide a real-time assessment of the role human-induced climate change and the ongoing El Niño are playing in the record-breaking October temperatures in Australia. The magnitude of the monthly mean anomalies is huge, at 1°C above the previous October record for Melbourne and much of southeast Australia. And this is no coincidence.

    To understand the role of human-induced climate change in these new records, they compare simulations of the Earth’s climate from nine different state-of-the-art climate models and the very large ensemble of climate simulations provided by CPDN volunteers for the weather@home ANZ experiments, for worlds with and without human-induced climate change. In total, using 10 different climate models and over 10,000 simulations for the weather@home experiments alone, they find that breaking the previous record for maximum mean October temperatures in Australia is at least six times more likely due to global warming.

    Taking the current El Niño conditions into account as well, the increase in the likelihood of setting a new record is even higher, estimated to be at least a factor of ten.
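
    Statements of the form “at least N times more likely” are often re-expressed as the fraction of attributable risk, FAR = 1 − 1/PR, where PR is the probability ratio between the worlds with and without the examined influence. A quick sketch using the factors quoted above:

    ```python
    # Sketch: fraction of attributable risk (FAR) from a probability
    # ratio (PR). Uses the factors quoted in the text above.

    def far(pr):
        return 1.0 - 1.0 / pr

    print(f"PR >= 6 (global warming alone):  FAR >= {far(6):.2f}")   # 0.83
    print(f"PR >= 10 (warming plus El Nino): FAR >= {far(10):.2f}")  # 0.90
    ```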

    These results demonstrate again that climate change is having a major influence in setting new heat records. They furthermore show that it is now possible to quantify this influence not months after the event but while the event is still unfolding. Quoted in The Age, David Karoly says: “What we’ve done that is really new and exciting is we can now do this analysis as the event is happening. We don’t have to wait.”

  • climateprediction.net: Weather@Home Mexico: New Climate Modelling Experiment Launching Soon
    25.08.2015 14:24 Uhr

    We’re pleased to announce that we will be adding a new regional model to our weather@home regional modelling experiments, covering Mexico and parts of North and South America.

    The experiment that will be run with this model will initially look at the influence of human-caused climate change on two unusual weather events in 2004/05: the very wet winter season over northwest Mexico and the anomalously wet summer over southeast Mexico, which coincided with the most active Atlantic hurricane season in recorded history.

    This experiment is part of the RECLIM-UK (Regional Climate Projections Initiative Mexico – UK) project, sponsored by the British Council, and will be led by Dr Ruth Cerezo-Mota at the Universidad Nacional Autónoma de México (UNAM).

    You can read more about this new region and experiment on the weather@home Mexico page.

    Our volunteers can expect to be running Mexico region models in the next couple of months.

    Tracks of tropical systems during the 2005 hurricane season over the Atlantic (data from the National Hurricane Center, image from Wikipedia)

  • climateprediction.net: Change needed to avoid ‘paralysis’ in climate policies
    04.08.2015 13:54 Uhr

    Climate scientists, including climateprediction.net’s Dr Friederike Otto and Professor Myles Allen, are calling for an overhaul of the way climate change pledges are assessed, in order to avoid ‘indefinite procrastination’ on designing efficient mitigation policies.

    Writing in Nature Climate Change, they say that the ‘pledge and review’ approach that will form the basis of commitments made at the UN climate change negotiations in December presents an opportunity to link mitigation goals explicitly to the evolving climate response.

    “Scientific uncertainty about the long-term impacts of climate change is often used as an excuse for inaction, or as a basis for recommending highly precautionary worst-case-scenario strategies, which may be unpalatable to policy makers juggling economic and political interests,” said Dr Friederike Otto of Oxford’s Environmental Change Institute, lead author of the paper.

    “Human-induced warming has brought us 10% closer to 2°C since 2009. So any country whose government acknowledged in 2009 that CO2 emissions must reach net zero by the time temperatures reach the target stabilisation level of 2°C should be 10% of the way there now. However there still is no overall strategy to achieve this.”

    The authors, who include the Oxford Martin School’s Professor Myles Allen, argue that strategies should be ‘anti-fragile’, meaning they are not just robust under uncertainty but more successful under a wide range of uncertainties, including scientific, economic and political risks. Learning from trial and error is an integral part of such an ‘anti-fragile’ strategy, allowing for evolving knowledge to be incorporated at low costs.

    They looked at what climate policy makers could learn from adaptive management techniques, to create an approach to mitigation that more fully accounts for the set of risks that governments care about, is less dependent on a globally binding mandate, and which could, therefore, be a better way to preserve flexibility in climate mitigation.

    They recommend an adaptive strategy grounded in an index of the warming attributable to human influence, which is itself based on observed temperatures. Calculated in 2014, the rise in global mean temperature attributable to anthropogenic influences was 0.91°C.

    In contrast to global mean temperature, the ‘attributable anthropogenic warming’ index is not subject to high year-to-year and decadal variability. It also requires no complex modelling and could be updated on an annual basis, allowing governments to review their pledges to reduce greenhouse gas emissions.

    The paper outlines three policy options using the index and concludes that indexing to attributable anthropogenic warming allows a transparent link between the policy instrument and the policy goal. It is a simple way to monitor the overall consistency between the evolving climate change signal, individual countries’ pledges and the overall goal of achieving net zero CO2 emissions by the time we reach 2°C of human-related warming.

    Dr Otto concludes: “At a crucial time for climate negotiations, the proposed index offers a transparent and accountable method of evaluating climate policies that deals with the remaining uncertainty of the climate response, which has so far had a paralysing effect on climate change policies.”

    Further information:

    • View the full text of ‘Embracing uncertainty in climate change policy’
    • More information on the index can be found at www.safecarbon.org, an up-to-the-second index of human-induced warming relative to the mid-19th century (1861-80) and of cumulative carbon dioxide emissions from fossil fuel use, cement manufacturing and land-use change.
    • Human-induced warming is currently increasing at 1.8 hundredths of a degree per year, or 0.6 billionths of a degree per second (see the back-of-envelope sketch after this list).
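
    As a back-of-envelope check, the numbers quoted in this article (0.91°C of attributable warming in 2014, rising at 0.018°C per year) imply roughly six decades until the 2°C level at the current rate. This is a simple linear extrapolation, not the safecarbon.org methodology:

    ```python
    # Back-of-envelope sketch using only numbers quoted in this article:
    # 0.91 degC attributable warming in 2014, rising at 0.018 degC per year.
    # Simple linear extrapolation, NOT the safecarbon.org methodology.

    index_2014 = 0.91   # attributable anthropogenic warming in 2014 (degC)
    rate = 0.018        # current rate of increase (degC per year)
    target = 2.0        # stabilisation level discussed in the paper (degC)

    years_left = (target - index_2014) / rate
    print(f"~{years_left:.0f} years to 2 degC at the current rate "
          f"(around {2014 + years_left:.0f})")  # ~61 years, around 2075
    ```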

    This article originally appeared on the Oxford Martin School website – read the original here.

  • climateprediction.net: Weather@home 2015 Western US Drought: All models have been sent out, about half have come back
    23.07.2015 11:01 Uhr

    So, thanks to the fantastic efforts of our volunteers, all 22,000 models for this experiment have now been downloaded to run on people’s home computers, and nearly half have finished running and have been uploaded back to our servers.

    The plots for the results so far have been updated to include all the models that have been returned to us – currently sitting at nearly 8,000 models!

    Thanks again to everyone who has helped us with this experiment by running our models on your computer, we couldn’t do this without you.

    There are 3 plots for each state; one of them shows temperature in California.

  • climateprediction.net: New Climatology Results for Western US Drought Experiment
    15.07.2015 09:49 Uhr

    We now have the first results for our Climatological simulations, investigating the influence of removing the ‘blob’ of warm sea surface temperatures off the western US coast.

    The ‘blob’ has a strong influence on the temperature: for example, the climatological simulations without the ‘blob’ are colder than either the actual or the natural simulations.

    In the climatological simulations, it is interesting to see a different response in the precipitation between the different states. This is something our scientists will be investigating in more detail in the upcoming weeks.

    There are 3 plots for each state; one of them shows temperature in California. The experiment is looking at two possible influences on the drought in the Western US – climate change and the “blob”. Each plot compares 3 sets of data (sketched conceptually after this list):

    1. “Actual” – these are models that use observed data to simulate the climate
    2. “Natural” – these are models that show a “world that might have been without climate change”
    3. “Climatology” – these are models that include climate change, as observed, but have removed the “blob”
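
    Conceptually, the comparison behind these plots looks like the sketch below. All numbers are synthetic placeholders rather than weather@home results; per the post above, the no-“blob” climatology runs come out colder than both the actual and the natural ensembles:

    ```python
    # Conceptual sketch of the three-way ensemble comparison behind the plots.
    # All numbers are synthetic placeholders, NOT weather@home results.
    import random

    random.seed(1)

    def ensemble(mean, spread, n=1000):
        """Toy ensemble of simulated seasonal-mean temperatures (degC)."""
        return [random.gauss(mean, spread) for _ in range(n)]

    actual = ensemble(17.0, 0.8)       # observed SSTs: climate change + 'blob'
    natural = ensemble(16.4, 0.8)      # world without human-induced change
    climatology = ensemble(16.0, 0.8)  # climate change kept, 'blob' removed

    for name, ens in (("actual", actual), ("natural", natural),
                      ("climatology", climatology)):
        print(f"{name:12s} mean: {sum(ens) / len(ens):.2f} degC")
    ```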

    There are still a few thousand models left to run, so please do sign up if you haven’t already, and help us answer this fascinating and important question!

    Read more about the experiment setup.

    See all the results so far for individual states here:

  • Constellation: Server down for reinstall & relocation
    26.07.2015 13:51 Uhr
    The server will go down for reinstallation on new hardware and relocation to a new datacenter. We expect to be back in about two weeks.
  • Constellation: The NASA Space Apps Stuttgart 2015 Hackathon winner BigWhoop is a finalist for the NASA Global People’s Choice Award!
    28.04.2015 08:38 Uhr
    The NASA Space Apps Challenge 2015: one hackathon weekend, 133 cities around the world, 949 projects, and our BigWhoop project is among the final 15 for the global People’s Choice Award. Read on to learn what happened during the event at the Stuttgart hackerspace shackspace, then go and vote for BigWhoop to make it win the award!

    NASA’s Space Apps Challenge is an international mass collaboration/hackathon organized on a global scale and held in major cities worldwide, with a focus on finding solutions to challenges faced in space exploration and in life on Earth. The winner of the Stuttgart chapter of the global NASA Space Apps Challenge is BigWhoop, which addresses the “Open Source Air Traffic Tracking” challenge. This challenge required building a tool that allows users to select a particular flight and see out-the-window or other views of the aircraft and airspace. We decided to extend the scope of the project a bit: our app was designed as a global sensor grid to measure the whole radio spectrum, making air traffic monitoring a subset of the solution. This free and community-driven approach, based on small and low-cost software-defined radio devices, earned the local team a nomination for NASA’s global People’s Choice Award, competing among the final 15 out of 949 projects.

    You can vote for the Stuttgart team as of 27 April: just visit this page and vote for our #BigWhoop app (once a day). Watch the BigWhoop video: https://www.youtube.com/watch?v=GnamC7pU1Ok

    All this was possible because we started early. We held a kick-off event in February to give people the opportunity to meet, find a challenge to tackle and prepare it before the official Space Apps hackathon weekend of 10-12 April. Having an early launch event was one of the lessons of our first Space Apps in 2012, and it worked out beautifully. A two-day hackathon is great for pitching an idea, but we wanted to show a working project: from sensor nodes that can be deployed at hundreds of locations worldwide to processed data ready to be analyzed on a map. Starting early also made it possible to include interested people as virtual participants. Three BigWhoop team members come from Bolivia, Poland and the UK; they already knew the goals of the project and provided essential parts of it via Skype and GitHub. This made our project feel global and important to everybody. You can still join BigWhoop: just get our code and start using it!

    The hackathon itself was great for knitting the team even tighter and adding last-minute features. The support of random people at shackspace was also a big factor: people not even on the team overheard us discussing project details, joined ad hoc and contributed their solutions. This led to the music arrangement you hear during our video presentation, and to “node zero”, the very first node of our sensor grid, which now runs on a virtual machine in shackspace’s experimental datacenter and not only produces data but also collects the data from everyone contributing to the grid.

    Ultimately, BigWhoop is intended to run on the Constellation computation grid with its 60,000 computers. For now we started a pre-alpha test, so we asked for your help during the hackathon weekend to plug in your software-defined radio devices and start a sensor node for us. Our BigWhoop software was already able to send the data to our server at shackspace, and we received data from nice people in Virginia, US and Bremen, Germany. With this help we were able to show a first live demo at the end of the hackathon. Since then we have received further data; we are really overwhelmed by everyone’s support and want to say a big THANK YOU!

    [Photo: One team member took BigWhoop to Observatorio del Teide]

    As a team, we believe in BigWhoop and will keep working on it. One team member took a sensor device with him on a measurement campaign for his work and monitored the spectrum in his free time next to the Teide volcano on Tenerife. The scientific paper about the project was selected for the International Astronautical Congress 2015 in Jerusalem, where we will present the Space Apps Challenge with “Opportunities of an open-source global sensor network monitoring the radio spectrum for the (new) space community” during the Space Communications and Navigation Symposium. And lastly, BigWhoop will serve as the ground station application for the Distributed Ground Station Network project, also related to shackspace. There it will help us track and communicate with small satellites globally and continuously, helping satellite operators get their position and payload data faster than with other systems.

    In conclusion, BigWhoop and the NASA Space Apps Challenge were fun and a great experience. We learned a lot during the event, felt part of a global community, worked on solving an important problem and started a project that can become even more than we can currently imagine. As the local organizers and team we will definitely apply for the Space Apps Challenge 2016 and bring it back to Stuttgart for you to tackle new challenges. We hope that you like our BigWhoop project as much as we do; give us your vote during the People’s Choice Award and help it win the space challenge!

    Links:
    • bigwhoop.aerospaceresearch.net
    • vote for #BigWhoop
    • code on GitHub.com/aerospaceresearch/dgsn_bigwhoop
    • follow twitter.com/dgsn_bigwhoop
  • DistributedDataMining: Recent problems
    10.12.2014 22:19 Uhr
    Hi there, recently we experienced some unfortunate problems:
    1. The server hardware is old and can barely deal with the increasing load. This frequently leads to an unresponsive server. Website visitors are mainly affected, since the BOINC clients deal with the problem by trying to reconnect later on; the reconnects additionally increase the load.
    2. After getting back from two weeks of vacation, I realized that parts of the dDM database had crashed and had to be restored. As far as I can see, we suffered no loss of important data. Possibly some changes to user profiles were not stored.
    3. The amount of dDM data increases steadily, and the corresponding backup takes much more time than in the early stages of the project. As a result, the announced maintenance periods are no longer sufficient. I will think about a new backup strategy.
    4. Some users reported validation problems. I guess this was due to the issues mentioned above. Right now, the validation of WUs is almost finished.
    I apologize for any inconvenience caused by these problems. Cheers, Nico
  • DistributedDataMining: Forum under spam attack
    22.04.2014 16:01 Uhr
    Hi there, the dDM forums are currently under a heavy spam attack. I’ve just deleted the relevant posts. To prevent further spam, from now on users can only post new messages if they have at least one credit point. Cheers, Nico
  • DistributedDataMining: dDM went offline
    21.04.2013 12:46 Uhr
    Hi there, this week we had some serious hardware problems which forced us to switch over to a new server. Due to hard-disc failures some data was lost – but nothing important. dDM is now running on new hardware. We are still recovering data by replaying our backups; that’s the reason why some WUs still finish with download errors. We are confident that we will be fully operational by Tuesday. We apologize for any inconvenience. Cheers, Nico
  • DistributedDataMining: Usage of 'ARM' technology for the dDM project
    12.03.2013 18:12 Uhr
    Hello, Mr. Schlitter, I really enjoy this project and would like to contribute using the CPUs on my mobile device(s). Projects such as PrimeGrid, Enigma, SubsetSum@home, OProject@home, WUProp@home, and others have successfully used ‘ARM’ technology, which I understand is open-source, and I have been crunching for these projects for about a month successfully. Is DDM currently compatible with this technology, or are there plans to allow crunching on such devices? Many of the projects listed above have noted a sharp increase in the number of users as well as amount of research that is completed. Are you able to make this work for your project as well? I would be honoured to test and/or participate in any way. Thanks again, StandardbredHorse
  • DistributedDataMining: Team challenges in September
    28.08.2012 08:04 Uhr
    Hi there, recently I got word that two teams are going to focus on the dDM project in September: the first is team BOINC@Poland, the second team The Knights Who Say Ni. I am going to create some more WUs in order to satisfy the upcoming requests. Many thanks go to both teams for their support of our research. Best regards, Nico
  • DistributedDataMining: New application version 1.36 for Medical Data Analysis
    09.07.2012 19:57 Uhr
    Recently, I’ve released version 1.36 of our application for medical data analysis. The new version takes care of finding the location of Java on Windows clients. As a consequence, it shouldn’t be necessary to set the PATH variable manually. Best regards, Nico
  • DistributedDataMining: New application version 1.04 for our Multi-Agent Simulation of Evolution
    21.06.2012 14:27 Uhr
    Today, I’ve released version 1.04 of our Multi-Agent Simulation of Evolution. As planned, we finished the tests of this new application. Consequently, the latest version is now available for all dDM members. The new version deals with some problems that were reported by our beta testers:
    • The first version caused frequent hard-drive activity, because checkpoints and detailed log files were written very often during the test period. We have now reduced the frequency of checkpoint writing and the level of detail in the log files (see the sketch after this list).
    • From time to time, WUs finished with error code -148, which indicates that something went wrong while starting the WU. We investigated this problem and made some code changes. We are not yet sure whether this solves the problem, but we are releasing the new version anyway in order to gather more log information and test data.
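
    A common way to implement the reduced checkpoint frequency mentioned in the first point is simple time-based throttling. A minimal sketch, illustrative only and not the actual dDM code:

    ```python
    # Minimal sketch of time-based checkpoint throttling (illustrative only,
    # not the actual dDM application code).
    import time

    CHECKPOINT_INTERVAL = 300.0  # seconds between checkpoint writes
    _last_checkpoint = 0.0

    def maybe_checkpoint(state, path="checkpoint.dat"):
        """Write a checkpoint only if enough time has passed since the last
        one, keeping hard-drive activity low."""
        global _last_checkpoint
        now = time.monotonic()
        if now - _last_checkpoint < CHECKPOINT_INTERVAL:
            return False
        with open(path, "w") as f:
            f.write(repr(state))  # placeholder serialisation
        _last_checkpoint = now
        return True
    ```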

    I also reorganized the structure of our forum:

    • I added a new section Multi-Agent Simulation of Evolution dedicated to this new research topic and moved some recent postings.
    • I removed the Café section because it was hardly used in the past and didn’t provide any additional benefit.

    Best regards, Nico

  • DistributedDataMining: New Research Topic
    23.03.2012 16:05 Uhr
    Hi there, currently we are in the final stage of releasing a new application which deals with the simulation of evolutionary processes. Using the Multi-Agent Simulation of Evolution we investigate the biological phenomenon of aposematism (also referred to as warning coloration). This term describes the evolutionary strategy of certain animal species to indicate their unpalatability or toxicity to potential predators by developing skin colors and patterns that predators can easily perceive. Prominent examples of toxic animals with distinct warning coloration are poison dart frogs, coral snakes and fire salamanders.

    For tackling this research challenge, we developed a distributed multi-agent model that simulates the dynamic interactions of predator and prey populations over time. By systematically testing different adaptation and learning strategies for the agents and exploring the parameter space of our simulation model using the computational power of the dDM project, we hope to deepen the understanding of the aposematism phenomenon and the evolutionary paths leading to it.

    So far, dDM members won’t get WUs of this new application. Currently we are finishing our final tests and distributing these WUs to preselected hosts only. Soon I’ll send out these WUs to beta test members. After the beta test finishes, the new application will be available for all members. Cheers, Nico
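
    To make the model concrete, here is a deliberately tiny toy version of the dynamic described above: conspicuous, toxic prey gain an advantage once predators learn to avoid the warning coloration. It is an illustrative stand-in, not the dDM multi-agent model:

    ```python
    # Toy sketch: conspicuous, toxic prey survive better once predators learn
    # to avoid the warning coloration. Deliberately simplified; NOT the dDM
    # multi-agent simulation.
    import random

    random.seed(0)

    N_PREY, GENERATIONS, ATTACKS = 1000, 50, 500

    # Each prey is a (conspicuous, toxic) pair; start mostly cryptic/non-toxic.
    prey = [(random.random() < 0.1, random.random() < 0.1)
            for _ in range(N_PREY)]
    avoidance = 0.0  # predators' learned tendency to avoid conspicuous prey

    for _ in range(GENERATIONS):
        survivors = list(prey)
        for _ in range(ATTACKS):
            i = random.randrange(len(survivors))
            conspicuous, toxic = survivors[i]
            if conspicuous and random.random() < avoidance:
                continue  # predator has learned to avoid warning colours
            if conspicuous and toxic:
                avoidance = min(1.0, avoidance + 0.01)  # aversive learning
            del survivors[i]  # attacked prey dies either way
        # Survivors reproduce back to population size, with a little mutation.
        prey = [(c != (random.random() < 0.01), t != (random.random() < 0.01))
                for c, t in (random.choice(survivors) for _ in range(N_PREY))]

    frac = sum(1 for c, t in prey if c and t) / N_PREY
    print(f"conspicuous+toxic fraction after {GENERATIONS} generations: {frac:.2f}")
    ```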
  • DistributedDataMining: Website Restructuring
    22.03.2012 18:58 Uhr
    Hi there, as you may have seen, I’ve restructured the dDM website in order to gain clarity. The main changes are:
    • I reworked the start page, which now provides a brief overview of the dDM project, its history and its objectives.
    • The research challenges went to a separate page.
    • I also created a new site which briefly reports about the dDM achievements.
    • The menu structure was revised. All items related to the member account were moved to the right.
    • I also created four different color schemes, which can be freely chosen by the members (left bottom corner).

    Your comments, suggestions and wishes are much appreciated. I am going to add some more minor website features soon and would be happy to include some of your ideas. Furthermore, I’ll update and add some research-related information. That’s it for now. Cheers, Nico

  • DistributedDataMining: Download problems caused by hardware issues
    26.02.2012 19:17 Uhr
    Recently we had some hard-disc issues, and as a result the data files of our medical application are temporarily not available. We are going to reconstruct the data using the latest backup. Since this will take some hours, only a few workunits will be available in the meantime. We do not expect any problems regarding the user data (e.g. credits) because our dDM database is not affected. Best regards Nico
  • DistributedDataMining: Security Update to Drupal 6.24
    08.02.2012 15:07 Uhr
    Due to some reported security vulnerabilities, I’ve updated the content management system of our dDM website to Drupal version 6.24. Please report any problems related to the dDM website in our forum. Best regards Nico
  • DistributedDataMining: Server Maintenance
    20.01.2012 11:18 Uhr
    Since we are moving the dDM server to new hardware, the project website and all BOINC functions won’t be available on Sunday (2012/01/22). So far, we don’t expect any problems; there should be no need for any changes to the client configuration. Best regards Nico
  • DistributedDataMining: Scientific contribution
    09.01.2012 11:56 Uhr
    As already announced in an earlier post, some new research results from our Medical Data Analysis application were presented at the 162nd Meeting of the Acoustical Society of America which was held in San Diego, CA from 31 Oct – 4 Nov 2011. Our poster contribution was titled “Identifying relevant analysis parameters for the classification of vocal fold dynamics” and received lots of positive feedback from conference attendees and initiated interesting and inspiring scientific discussions. In the presented work, we systematically investigated the influence of a set of control parameters on the classification accuracy of the automatic diagnosis system for vocal fold dynamics based on high-speed videos. The particular suitability of certain parameter combinations was revealed in this study, helping to further improve the practical application of our diagnostic framework. The poster was heavily based on the results that we got from the extensive experiments conducted with the massive help of the DistributedDataMining community. An abstract of this work can be found in the accompanying conference proceedings: J. Acoust. Soc. Am. Volume 130, Issue 4, pp. 2550-2550 (2011). Thanks a lot for your support! Best, Daniel
  • DistributedDataMining: New WUs and MD5 download errors
    11.12.2011 09:44 Uhr
    Currently I am generating new workunits for our medical application. It turned out I was acting a bit carelessly, and as a consequence many members are now getting MD5 download errors. The reason is quite simple: during WU generation, files are copied to the server. Sometimes these files already exist, because we use the same data files for multiple workunits and only change the learning parameters or the experiment setup. Overwriting these existing files leads to problems, because the old and the new file have different MD5 checksums, and hence all WUs that refer to the old files error out with an MD5 download error. About 30,000 WUs are affected. I’ll take care of it, but it might take a while to identify the failing WUs. There is a chance that dDM members will error out the affected old WUs in the meantime; in that case, the new WUs should work fine. Lesson learned: never generate new WUs if you are in a hurry – in the end it takes much more time and causes preventable trouble. I am sorry for the inconvenience! Best regards, Nico
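
    The failure mode is easy to reproduce: once a file that existing workunits reference is overwritten with different content, its checksum no longer matches the one recorded with those workunits. A small sketch (the file name is hypothetical):

    ```python
    # Sketch of the failure mode described above: overwriting a data file
    # that existing workunits reference changes its MD5, so clients comparing
    # against the recorded checksum report a download error.
    # The file name is hypothetical.
    import hashlib

    def md5_of(path):
        with open(path, "rb") as f:
            return hashlib.md5(f.read()).hexdigest()

    with open("dataset_042.arff", "w") as f:   # original file, WU batch #1
        f.write("learning_rate=0.1\n")
    recorded_md5 = md5_of("dataset_042.arff")  # stored with the old workunits

    with open("dataset_042.arff", "w") as f:   # overwritten for WU batch #2
        f.write("learning_rate=0.2\n")         # same name, new parameters

    assert md5_of("dataset_042.arff") != recorded_md5  # old WUs now fail
    ```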
  • DistributedDataMining: Let dDM benefit from your Christmas shopping
    07.12.2011 16:16 Uhr
    Hi there, as you know, the dDM project does not get financial support from universities, research institutes or private commercial organizations. Thus, we depend on private help to keep the project and our research running. Since the dDM website became part of the Amazon Associates Program, the dDM project can benefit from your Christmas shopping at Amazon’s online shop. Amazon rewards dDM with up to 6% of any issued gift certificate and up to 10% of each sold store item. If you intend to buy some Christmas gifts at Amazon, please remember dDM and follow the URL provided on this page. The provided URL links to the Amazon store closest to your region and contains our dDM partner ID. In case the wrong store is chosen, please select the right one from the given country list. The dDM project gets a reward for each item bought after using these links. As always, all Amazon rewards and PayPal donations are used to pay the server rent, maintenance and internet traffic. Any remainder is used to finance our scientific publications and conference presentations. Perhaps we can even buy some new hardware in order to replace the aged project server. Thank you in advance for all of your generous support! Best regards and have a nice Christmas season! Nico
  • DistributedDataMining: New application version 1.35 for Medical Data Analysis
    05.12.2011 13:22 Uhr
    Recently, I’ve released version 1.35 of our medical data analysis application. I am happy to announce that this version solves the problem of never-ending workunits. The new version has been out for almost a week, and so far there haven’t been any workunits that had to be killed manually by our dDM members. Best regards, Nico
  • DistributedDataMining: New application version 1.34 for Medical Data Analysis
    29.11.2011 12:00 Uhr
    Today I’ve released a new version of our medical application. Version 1.34 should be able to detect the Java location automatically, so it shouldn’t be necessary to adapt the PATH variable manually. This change was necessary because the PATH entry we relied on until now was apparently changed by an automatic Windows update. As a consequence, our wrapper couldn’t find Java and all workunits failed. Some users might have received an email notification suggesting they install Java even though Java was already installed on their computers. I am sorry for the confusion and the inconvenience caused by these circumstances. The new version uses the variables JAVA_HOME, JRE_HOME and JDK_HOME, which are usually set during Java installation (sketched below). I did some tests and it worked fine here; let’s see how it works out there. Any feedback and error reports are welcome in the forum related to the medical application. Regards, Nico
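
    The lookup order described above can be sketched as follows – illustrative only, since the actual dDM wrapper is a native application:

    ```python
    # Sketch of the detection strategy described above: check the standard
    # environment variables set by the Java installer instead of relying on
    # PATH. Illustrative only.
    import os
    import shutil

    def find_java():
        exe = "java.exe" if os.name == "nt" else "java"
        for var in ("JAVA_HOME", "JRE_HOME", "JDK_HOME"):
            home = os.environ.get(var)
            if home and os.path.isfile(os.path.join(home, "bin", exe)):
                return os.path.join(home, "bin", exe)
        return shutil.which("java")  # last resort: PATH lookup

    print(find_java())
    ```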
  • DistributedDataMining: Update on Medical Application
    29.08.2011 12:38 Uhr
    Hi there, Here’s a quick update on our medical application: the latest results look really good! Lots of interesting relationships between the analyzed input variables were revealed and relevant research questions could be answered. Parts of these results will be presented at a scientific conference in November. Expect more details on that soon! We’ve learned a lot from the results regarding the use of parameter optimization and feature selection – which now allows us to create much more efficient (and hopefully even more reliable) experiments. For example, we’re planning to apply sophisticated search methods from the field of evolutionary computation combined with a more statistically sound validation approach. For that purpose, we developed a new version of our application, which will run the new experiments much faster and will take over quite a lot of the data analysis steps that were done manually before. This new version is being tested extensively at the moment and will be available soon. But we’re also thinking about implementing completely new ideas and exploring different research directions. For example, one potential project will be focusing on agent-based modeling in the field of simulating the emergence of biological and social phenomena. Well, that’s it for now. Thanks again for your support! Best, Daniel
  • DistributedDataMining: Pending credit
    17.08.2011 10:32 Uhr
    As you might have noticed, we had some minor problems regarding pending credits in the past. Today I’ve fixed this issue. The problem was caused by a safety mechanism that is used to avoid benchmark cheating: the algorithm marks suspicious workunits for further inspection, and the affected workunits stay pending until I’ve checked them manually. Unfortunately, there was a slight error in the cheating detection heuristic, and as a result many more workunits than necessary were marked. Today I found the problem and corrected the code. At first glance, there shouldn’t be any more pending workunits.
  • DistributedDataMining: New application version 5.01 for Time Series Analysis Application
    16.08.2011 14:39 Uhr
    Today, I’ve released version 5.01 of the Time Series Analysis Application. The new version uses the same wrapper technology that is already in use for the medical application. It overcomes several problems and decreases the fraction of failing workunits. So far, the new version supports 32- and 64-bit Linux systems only; versions for Windows will be published soon. As always, comments and error reports are welcome in the forum.
  • DistributedDataMining: Website translations powered by Google
    11.08.2011 18:22 Uhr
    Hi there, recently some users suggested providing this website in different languages. Unfortunately, a translation into other languages would be a huge effort and can’t be done by myself. Therefore, I was looking for a simple solution and found Google Translate. Today I’ve added this feature to our dDM website in order to translate the content into different languages. From now on, you will find a Translate section in the top right corner. I hope this motivates new users to join the dDM community and helps people understand what we are trying to achieve. Any comments are welcome in the website issues forum. Best regards Nico

Author: crille
Date: Saturday, 7 May 2011, 11:11