Coe aide steps down from IAAF – French magistrates file tougher charges against Diack

LONDON (AP): The director of IAAF President Sebastian Coe’s office temporarily stepped down yesterday pending an ethics investigation into emails linked to Russian doping cases.

Emails sent by Nick Davies, leaked to a French newspaper and published this week, indicated that he tried to delay public identification of alleged Russian drug cheats before the 2013 World Championships in Moscow.

“I have decided to step aside from my role with the IAAF until such time as the ethics board is able to review the matter properly and decide if I am responsible for any breach of the IAAF code of ethics,” Davies said in a statement.

Davies was IAAF communications director at the time under Lamine Diack, who was succeeded by Coe as president in August.

The French criminal case against Diack deepened this week, with magistrates filing new, tougher corruption charges against him in connection with cover-ups of Russian doping. French prosecutors also suspect Diack’s son, Papa Massata Diack, a former IAAF marketing consultant, played an active role in an alleged “system of corruption” that sought to blackmail athletes, with demands of money to hush up suspected doping.

Davies is reported to have sent an email to Papa Massata Diack in 2013 asking what “Russian ‘skeleton’ we have still in the cupboard regarding doping” and suggesting using the marketing company chaired by Coe – then IAAF vice-president – to lead an “unofficial PR campaign” to “avoid international media scandals” related to the Moscow championships. If Russian athletes guilty of doping were not competing in Moscow, “then we might as well wait until the event is over to announce them”, Davies wrote in the email, which was published on Monday by Le Monde.

After promoting Davies to run his office at the IAAF in September, Coe is now losing one of his closest aides just as he tries to clean up the scandal-tainted governing body.

EMAILS REFERRED TO BOARD

“In order to demonstrate that I am willing to have all allegations of unethical behaviour on my part in 2013 properly and fairly investigated, I have referred my emails to Papa Massata Diack in 2013, my statements and the circumstances of the emails to the IAAF ethics board,” Davies said in yesterday’s statement. Davies has suggested that the emails reflected his job “to manage and promote the reputation of the IAAF”.

Russia’s athletics federation was suspended by the IAAF after a World Anti-Doping Agency independent commission found evidence of systemic doping and cover-ups.

Lamine Diack is now accused of “active corruption”, which generally involves offering money or other promises in exchange for violating a rule, an official with the Paris financial prosecutor’s office said yesterday. Diack was previously accused of “passive corruption”, on suspicion he took around €1 million (US$1.1 million) to cover up positive drug tests by Russian athletes.

The official told the Associated Press the new preliminary charges center on suspicions that Diack bribed Gabriel Dolle, the former IAAF anti-doping chief who was also under investigation, to delay reporting of violations by Russian athletes. The preliminary charges allow magistrates more time to investigate before deciding whether to file formal charges and whether to send a case to trial.

Diack, an 82-year-old former long jumper, is free on bail pending further investigation, but barred from leaving France.

Raw talent Navdeep Saini can definitely build a name for himself: Virat Kohli

India managed to scrape past West Indies by four wickets in the 1st T20I of the three-match series in Lauderhill on Saturday. Set a target of 96 runs from their 20 overs, India lost six wickets on the way to victory, and West Indies were left ruing their batting mistakes, knowing that a few more runs on the board could have given them a realistic chance of a win.

Indian captain Virat Kohli, during the post-match presentation ceremony, praised debutant Navdeep Saini, who was adjudged Man of the Match for a fiery spell of 3/17 that included a wicket maiden in the 20th over of the West Indian innings.

Kohli believes the young bowler “can build a name for himself.”

“The bowling and fielding was right up there. The pitch wasn’t great to be honest, but they couldn’t have done much with all the rain around and the covers being on for three odd days. They did a good job to get the game started. The bowlers were on top throughout. He (Saini) is from Delhi. He is a raw talent, has raw pace. He is one of them who can bowl 150 clicks and there are hardly any who bowl at that pace and he is fit. He is someone who can build a name for himself and he is hungry as well. Hopefully he builds on from here. We would have liked to chase this four down maximum, that’s how this pitch was. It was tough to get your shots away with the old ball. We just want to contribute as a team in the second game,” said Kohli.

India and West Indies will clash again on Sunday on the same ground in the 2nd T20I before shifting base to the Caribbean for the later part of the tour.

Fentanyl surpasses heroin as leading cause of US drug overdose deaths

By Dr. Ananya Mandal, MD
Dec 13 2018

According to a new report from health officials, fentanyl – a synthetic opioid drug – is surpassing heroin as a cause of drug overdose deaths in the United States. The report was released on 12 December 2018 in the CDC’s National Vital Statistics Report.

Epidemiologists and researchers at the U.S. Centers for Disease Control and Prevention’s National Center for Health Statistics, led by Dr. Holly Hedegaard, write that cocaine and heroin still remain the street drugs of choice for abuse. However, fentanyl, alone or mixed with other narcotics, is taking over as the leading cause of deaths due to drug overdose. The researchers noted that between 2013 and 2016 there was a 113 percent rise in fentanyl-induced overdose deaths in the country.

Dr. Hedegaard said in a statement, “The drugs most frequently involved in overdose deaths change quickly from one year to the next.” She added that many of these deaths are due to fentanyl or its combination with other drugs; fentanyl is being combined with cocaine as well as heroin, she said.

Fentanyl is 80 to 100 times more potent than heroin and cocaine, says the U.S. Drug Enforcement Administration (DEA). It is used for pain relief in patients with terminal cancer pain and as an adjunct to anesthetic agents. The DEA explains that when fentanyl is added to heroin, its potency is increased, and drug users are being sold the combination of heroin and fentanyl as “highly potent heroin”. This means that many users are unaware that they are using a combination rather than heroin alone. This increased potency could also be a reason for deaths due to overdose, the DEA states.

The CDC report states that the common causes of deaths due to overdose include fentanyl, heroin, hydrocodone (Vicodin), methadone, morphine, oxycodone, alprazolam, diazepam, cocaine and methamphetamine. Oxycodone was a top killer among drug users in 2011. Between 2012 and 2015 the leading killer was heroin, and since then fentanyl has held the top position. Cocaine remained the second or third most common drug of abuse between 2011 and 2017. Overdose deaths due to heroin or methamphetamine tripled between 2011 and 2016, the researchers write. Deaths due to fentanyl overdose were less than 1 per 100,000 in 2013 and rose to around 6 per 100,000 in 2016. Over the same period, deaths due to methadone overdose dropped, the authors of the report add.

The researchers note that deaths due to suicide mostly involve prescription or over-the-counter drugs (commonly OxyContin, Benadryl, Vicodin and Xanax), whereas unintentional overdose deaths are more commonly seen with illegal drugs. The team notes that death certificates do not always clearly state the exact drugs that led to overdose and death.

Experts have stated that other forms of fentanyl, such as carfentanil and butyrfentanyl, are also in circulation; these are even more potent than fentanyl. CDC statistics report that around 50,000 Americans die each year from drug overdoses.

Source: https://www.cdc.gov/nchs/data/nvsr/nvsr67/nvsr67_09-508.pdf

Study reveals link between air pollution and employee productivity

Reviewed by Alina Shrourou, B.Sc. (Editor)
Jan 3 2019

Prolonged exposure to pollutant particles was shown to reduce the output of workers in China.

Economists from the National University of Singapore (NUS) have completed an extensive study which reveals that exposure to air pollution over several weeks is not just unhealthy; it can also reduce employee productivity.

Associate Professor Alberto Salvo from the Department of Economics at the NUS Faculty of Arts and Social Sciences, an author of the study, explained, “Most of us are familiar with the negative impact air pollution can have on health, but as economists, we wanted to look for other socioeconomic outcomes. Our aim with this research was to broaden the understanding of air pollution in ways that have not been explored. We typically think that firms benefit from lax pollution regulations, by saving on emission control equipment and the like; here we document an adverse effect on the productivity of their work force.”

The results of this study were published in the American Economic Journal: Applied Economics on 3 January 2019.

The link between air pollution and productivity

The NUS team, including Associate Professor Haoming Liu and Dr Jiaxiu He, spent over a year gathering information from factories in China. This involved interviewing managers at one dozen firms in four separate provinces, before obtaining access to data for two factories, one in Henan and the other in Jiangsu.

The factories were textile mills, and workers were paid according to each piece of fabric they made. This meant that daily records of productivity for specific workers on particular shifts could be examined. Hence, the researchers compared how many pieces each worker produced each day to measures of the concentration of particulate matter that the worker was exposed to over time.

A standard way of determining the severity of pollution is to measure how many fine particles less than 2.5 micrometres in diameter (PM2.5) are in the air. The majority of people living in developing countries are exposed to particle concentrations that health authorities deem harmful. At the two factory locations, pollution levels varied significantly from day to day, and overall they were consistently high. At one location, PM2.5 levels averaged 85 micrograms per cubic metre, about seven times the safe limit set by the US Environmental Protection Agency.

Interestingly, unlike previous literature, the team found that daily fluctuations in pollution did not immediately affect the productivity of workers. However, when they measured more prolonged exposures of up to 30 days, a definite drop in output emerged. The study was careful to control for confounding factors such as regional economic activity.

“We found that an increase in PM2.5, by 10 micrograms per cubic metre sustained over 25 days, reduces daily output by 1 per cent, harming firms and workers,” says Associate Professor Liu. “The effects are subtle but highly significant.”

The researchers remain agnostic about the reasons why productivity goes down when pollution goes up.
“High levels of particles are visible and might affect an individual’s well-being in a multitude of ways,” explained Assoc Prof Liu. “Besides entering via the lungs and into the bloodstream, there could also be a psychological element. Working in a highly polluted setting for long periods of time could affect your mood or disposition to work.”

First-of-its-kind study examining prolonged exposure to air pollution

Research on how living and working in such a polluted atmosphere affects productivity is very limited, partly due to worker output being difficult to quantify. One previous study that focused on workers packing fruit in California found a large and immediate effect from exposure to ambient PM2.5, namely that when levels rise by 10 micrograms per cubic metre, workers become 6 per cent less productive on the same day.

That study’s estimate appears large for a developing country. “Labourers in China can be working under far worse daily conditions while maintaining levels of productivity that look comparable to clean air days. If the effect were this pronounced and this immediate, we think that factory and office managers would take more notice of pollution than transpired in our field interviews. Therefore, our finding that pollution has a subtle influence on productivity seems realistic,” Assoc Prof Liu added.

All the data collected in the NUS study are being made open access to serve as a resource for other researchers to accelerate progress in this topic. “This was a key criterion for inclusion in our study,” Assoc Prof Salvo added. “We wanted to share all the information we gathered so that other researchers may use it as well, hopefully adding to this literature’s long-run credibility. We saw no reason why data on anonymous workers at a fragmented industry could not be shared.”

Source: https://news.nus.edu.sg/press-releases/air-pollution-affects-productivity
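As a rough, back-of-envelope illustration of the effect size quoted above (this is not code from the study), the Python sketch below applies the reported figure of 1 per cent lower output per 10 micrograms per cubic metre of sustained PM2.5. The function name, the linear extrapolation across larger concentration gaps, and the use of the EPA limit as a baseline are all assumptions made for illustration.

```python
# Back-of-envelope illustration of the study's reported effect size:
# a sustained 10 ug/m3 increase in PM2.5 over ~25 days is associated
# with a ~1% drop in daily worker output. Linear scaling is assumed
# here purely for illustration.

EFFECT_PER_10UG = 0.01  # 1% output loss per 10 ug/m3 sustained increase

def estimated_output_loss(sustained_pm25: float, baseline_pm25: float) -> float:
    """Fractional output loss implied by a sustained PM2.5 level relative
    to a baseline, assuming the reported effect scales linearly."""
    delta = max(sustained_pm25 - baseline_pm25, 0.0)
    return (delta / 10.0) * EFFECT_PER_10UG

# The article mentions one site averaging 85 ug/m3, about seven times
# the US EPA safe limit (~12 ug/m3).
loss = estimated_output_loss(sustained_pm25=85.0, baseline_pm25=12.0)
print(f"Implied output loss: {loss:.1%}")  # ~7.3% under the linear assumption
```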

Sleep disturbances linked to mental health problems among survivors of natural disaster

Reviewed by Kate Anderton, B.Sc. (Editor)
Jun 8 2019

Preliminary results from a new study suggest that sleep disturbances are associated with mental health problems among survivors of a natural disaster even two years after the event.

The researchers surveyed survivors two years after the 2010 earthquake in Haiti. Results show that 94% reported experiencing subsequent insomnia symptoms after the disaster. Two years later, 42% showed clinically significant levels of post-traumatic stress disorder (PTSD), and nearly 22% had symptoms of depression. There were significant positive correlations between sleep disturbances and peritraumatic distress (i.e., emotional reaction during and immediately after the event), PTSD, and symptoms of depression. Resilience did not appear to be a buffer against sleep disturbance.

“This is one of the first epidemiological studies to investigate the prevalence of sleep disturbances among survivors of the 2010 Haiti earthquake. Our study underscores the strong association between common trauma-related disorders and comorbid sleep conditions among a group of survivors,” said lead author and principal investigator Judite Blanc, Ph.D., an NIH T32 postdoctoral fellow in the Center for Healthful Behavior Change within the Division of Health and Behavior in the Department of Population Health at NYU School of Medicine.

The study involved 165 participants with a mean age of about 31 years; 52% were men. Participants were living in Port-au-Prince, Haiti, one of the areas affected by the 2010 earthquake. According to the authors, it was the most devastating earthquake in the country’s history, killing more than 200,000 people and displacing more than 1 million residents.

Measures included demographic factors, the Peritraumatic Distress Inventory, the PTSD Checklist Specific, the Beck Depression Inventory, and the Connor-Davidson Resilience Scale. Spearman correlations and multilinear regressions were used to explore associations among resilience, PTSD, depression, and sleep disturbances.

“Findings from our study highlight the need to assess and treat sleep issues among disaster survivors, as they are highly prevalent after a natural disaster and are related to mental health conditions,” said Blanc. “Our work supports the importance of sleep in disaster preparedness programs globally.”

Blanc added that sleep is often neglected in the aftermath of traumatic events, but in these situations, sleep should be considered an important target of mental and physical health interventions.

“Our results make the case that sleep health should be a major component of all public and global health programs and specifically in humanitarian crises,” said Blanc.

Source: American Academy of Sleep Medicine

Journal reference: Blanc, J. et al. (2019) Is Resilience A Protective Factor For Sleep Disturbances Among Earthquake Survivors? Sleep. doi.org/10.1093/sleep/zsz067.883
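The methods paragraph above mentions Spearman correlations. For readers unfamiliar with the technique, the minimal sketch below shows what such a computation looks like in Python with SciPy; the scores are made up for illustration, since the study’s actual dataset and variable names are not given in this article.

```python
# Minimal sketch of a Spearman rank correlation, the kind of analysis
# described above: association between a sleep-disturbance score and a
# PTSD symptom score. The data below are invented for illustration.
from scipy.stats import spearmanr

sleep_disturbance = [3, 7, 5, 9, 2, 8, 6, 4]   # hypothetical survey scores
ptsd_symptoms     = [2, 8, 4, 9, 1, 7, 5, 3]   # hypothetical checklist scores

rho, p_value = spearmanr(sleep_disturbance, ptsd_symptoms)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```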

Human factors research accelerates mission planning

The key to a successful flight mission is planning – sometimes several hours of it. Georgia Tech Research Institute (GTRI) specialists in human factors and human computer interfaces are working with PMA-281, Strike Planning and Execution Systems in Patuxent River, Maryland, to streamline the current mission planning process and identify user interface requirements supporting multi-domain mission management in next-generation naval planning capabilities.

With guidance from the GTRI researchers, the project will improve usability of the mission planning software tools, creating a more consistent and intuitive screen design that’s easier to learn and more logical to follow. This effort could benefit all Department of Defense (DoD) agencies for collaborative mission planning.

“We are working with Navy and Marine Corps aviators to identify areas in mission planning where workflow can be streamlined, reducing the time required to mission plan,” said Marcia Crosland, project director for GTRI’s Joint Mission Planning System (JMPS) User Interface Design and Usability efforts. “Our task has been to define the user interface concepts and decision-making tools to help reduce the time required for mission planning. We’ve created detailed designs and specifications to direct current and future development of mission planning systems.”

She added: “We tried to understand the tasks of the user and therefore how the workflow could be streamlined. From that, we designed user interfaces that better implement the tasks, and we developed a style guide to help the DoD software programmers who were implementing it.”

Mission planning needs to support the ability to collaboratively plan missions involving multiple aircraft but currently does not have that capability. The planning challenge can be quite complex, involving multiple targets, ground-based threats, different aircraft types and a variety of weapons systems. The most complex part of the process is often done by multiple pilots using whiteboards, paper, and spreadsheets to combine relevant information, consider alternatives, and reveal complicated issues.

Information from the white-boarding process is then entered into the software system, which produces the mission plans that go on board the aircraft. The GTRI human factors team realized that supporting these white-boarding activities in the mission planning system could accelerate the mission planning process, and they created new designs to support this functionality.

“We are making recommendations for how the Navy can streamline the process and move it all into the digital world to eliminate the paper and whiteboard processes,” said Crosland. “That will allow aircrews to plan a mission more efficiently, reducing the time required and potentially highlighting places where automated decision-making tools could be brought into the process.”

At each iteration of the process, prototype interface designs were evaluated with experts. In some cases, those experts visited the GTRI team in Atlanta to review and discuss the designs.

“We took them through each of the screens to find out what is intuitive to them and what is not,” Crosland said. “We did this multiple times with different user groups to make sure we had a good set of interface concepts. In this work, it’s critical to involve the intended users of the system.”

The GTRI team has applied lessons learned from a variety of domains – desktop and web design, and commercial and military applications. For instance, shortening the distance between buttons on a screen, reducing the number of clicks necessary for a task, consolidating screens, and providing a consistent workflow direction make a digital system easier and faster to use – whether it’s a website or a mission planning system. In one case, the researchers were able to consolidate nine separate screens, each with different tabs, into a single screen.

“We want to make the system a companion for the aircrews so they consider it a partner in these critical processes,” she added.

“At the root of all user interface design, whether it’s web or something else, it’s all about creating a time-efficient task that is intuitive, so using it takes less time and less training and creates fewer errors,” Crosland said. “If you can cut down on errors because users understand the system, it will make the system more efficient.”

GTRI’s Human Systems Engineering Branch (HSEB) has been in operation for more than 30 years to help improve the interaction between warfighters and the technologies they use. “We have significant experience in understanding the domains of mission planning and mission execution, and the components that make technology easier to use,” Crosland said. “We use established design standards customized for a particular format, whether it’s a mobile tablet or standard computer.”
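The point above about shortening the distance between buttons is often quantified with Fitts’s law, a standard model of pointing time in human factors work. The article itself does not name this model, so the sketch below is an illustration under that assumption, with made-up coefficients.

```python
# Illustration of why shorter distances between controls speed up a task,
# using the Shannon form of Fitts's law: MT = a + b * log2(D/W + 1).
# The article does not name this model; coefficients a and b are invented
# example values (real ones are fit from user trials).
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predicted pointing time in seconds for a target of a given width
    at a given distance, under Fitts's law with coefficients a and b."""
    return a + b * math.log2(distance / width + 1)

# Halving the distance to a 20-pixel-wide button shortens the predicted time:
print(f"{fitts_movement_time(distance=400, width=20):.2f} s")  # farther target
print(f"{fitts_movement_time(distance=200, width=20):.2f} s")  # closer, faster
```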

The high-tech evolution of scientific computing

first_imgScience has always relied on a combination of approaches to derive an answer or develop a theory. The seeds for Darwin’s theory of natural selection grew under a Herculean aggregation of observation, data, and experiment. The more recent confirmation of gravitational waves by the Laser Interferometer Gravitational-Wave Observatory (LIGO) was a decades-long interplay of theory, experiment, and computation.

Certainly, this idea was not lost on the U.S. Department of Energy’s (DOE) Argonne National Laboratory, which has helped advance the boundaries of high-performance computing technologies through the Argonne Leadership Computing Facility (ALCF).

Realizing the promise of exascale computing, the ALCF is developing the framework by which to harness this immense computing power to an advanced combination of simulation, data analysis, and machine learning. This effort will undoubtedly reframe the way science is conducted, and do so on a global scale.

Since the ALCF was established in 2006, the methods used to collect, analyze and employ data have changed dramatically. Where data was once the product of and limited by physical observation and experiment, advances in feeds from scientific instrumentation such as beamlines, colliders, and space telescopes – just to name a few – have increased data output substantially, giving way to new terminologies, like “big data.”

While the scientific method remains intact and the human instinct to ask big questions still drives research, the way we respond to this new windfall of information requires a fundamental shift in how we use emerging computing technologies for analysis and discovery.

This convergence of simulation, data, and learning is driving an ever-more complex but logical feedback loop: increased computational capability supports larger scientific simulations that generate massive datasets used to feed a machine learning process, the output of which informs further and more precise simulation. This, too, is further augmented by data from observations, experiments, and other sources to refine the process using data-driven approaches.

“While we have always had this tradition of running simulations, we’ve been working incrementally for more than a few years now to robustly integrate data and learning,” says Michael Papka, ALCF director and deputy associate laboratory director for Computing, Environment and Life Sciences (CELS).

To advance that objective, the facility launched its ALCF Data Science Program in 2016 to explore and improve computational methods that could better enable data-driven discoveries across scientific disciplines. The ALCF also recently expanded its Aurora Early Science Program with the addition of 10 new projects that will help prepare the facility’s future exascale supercomputer for data and learning approaches.
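The simulation-data-learning feedback loop described above can be summarized schematically. The Python sketch below is purely illustrative: every function is a toy stand-in invented for this example, not an ALCF workflow or API.

```python
# Schematic of the feedback loop the article describes: simulations
# generate data, a model is trained on that data, and the model guides
# the next round of simulations. All functions are toy stand-ins.

def run_simulation(params):
    # Stand-in for a large-scale scientific simulation.
    return [(p, p ** 2) for p in params]          # fake (input, output) pairs

def train_surrogate(dataset):
    # Stand-in for fitting a machine-learning model to simulation output.
    return lambda p: p ** 2                        # fake learned model

def propose_next_params(model, candidates):
    # Stand-in for choosing which simulations to run next; this toy rule
    # simply picks the three candidates with the smallest predicted output.
    return sorted(candidates, key=model)[:3]

params = [1.0, 2.0, 3.0]
for iteration in range(3):                         # the loop the article describes
    data = run_simulation(params)
    model = train_surrogate(data)
    params = propose_next_params(model, [0.5, 1.5, 2.5, 3.5, 4.5])
    print(f"iteration {iteration}: next params = {params}")
```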
And earlier this year, the CELS directorate announced the creation of the Computational Science (CPS) and Data Science and Learning (DSL) divisions to explore challenging scientific problems through advanced modeling and simulation, and data analysis and other artificial intelligence methods, respectively.

“These combined efforts will focus on domain sciences and identify important problems that can be addressed through a combination of simulation, data science, and machine learning approaches. In many cases, we’ll be drawing on people with relevant expertise across multiple divisions,” says CPS director Paul Messina.

Already, this combination of programs and entities is being tested and proved through studies that cross the scientific spectrum, from understanding the origins of the universe to deciphering the neural connectivity of the brain.

Convergence for a brighter future

Data has always been a key driver in science and, yes, it’s true that there is an exponentially larger amount than there was, say, ten years ago. But while the size and complexity of the data now available poses challenges, it is also providing opportunities for new insights.

No doubt Darwin’s research was big data for its time, but it was the culmination of nearly 30 years of painstaking collection and analysis. He might have whittled the process considerably had he had access to high-performance computers, and data analysis and machine learning techniques, such as data mining.

“These techniques don’t fundamentally change the scientific method, but they do change the scale or the velocity or the kind of complexity you can deal with,” notes Rick Stevens, CELS associate laboratory director and University of Chicago professor.

Take, for example, research into new materials designed to generate solar power as sunlight passes through windows. The technology has been hampered for lack of the right dye molecule, the discovery of which requires the time-consuming task of searching through reams of chemistry literature to find molecules with the right parameters.

Chemist and physicist Jacqueline Cole leads a collaborative effort between Argonne and the University of Cambridge to bring such molecules to light. Cole has developed a multi-stage process that cycles through simulation; data extraction, enrichment, and mining; materials prediction; and experimental validation.

The team runs large-scale simulations on targeted molecules to predict chemical dyes with key optical properties. From these data, molecules are selected for synthesis, and the resulting chemicals are fabricated into devices for validating their prospects in solar-powered windows. The results determine whether further investigation is required.

“There’s a positive feedback loop inherent in this,” she says. “Even if the validation process doesn’t go well, it can still provide some helpful insights. We might learn, for example, that we need to refine the structure-function relationships of the molecules for a particular application or add a new type of data to the existing data.”

A large part of the effort was focused on constructing a database of desirable organic molecules, much of which was compiled by data mining some 300,000 published research articles.
The research was spurred by the Materials Genome Initiative, a government initiative to take functional materials to market much faster than the decades it once took.

“The advantage of this process is to really take away the old manual curation of databases, which is lifetimes of work, and reduce it to a matter of a few months. Ultimately, a few days,” says Cole.

One machine to bind them all

Whether it’s the search for very specific dye molecules or understanding key flow physics to develop more efficient wind turbine blades, the merging and flourishing of simulation, data, and learning is only possible because of the exponential and deliberate development of high-performance computing and data delivery systems.

“Supercomputer architectures are being structured to make them more amenable to dealing with large amounts of data and facilitate learning, in addition to traditional simulations,” says Venkat Vishwanath, ALCF data sciences lead. “And we are fitting these machines with massive conduits that allow us to stream large amounts of data from the outside world, like the Large Hadron Collider at CERN and our own Advanced Photon Source (APS), and enable data-driven models.”

Many current architectures still require the transfer of data from computer to computer, from one machine, the sole function of which is simulation, to another that excels in data analysis and/or machine learning.

Within the last few years, Argonne and the ALCF have made a solid investment in high-performance computing that gets them closer to a fully integrated machine. The process accelerated in 2017, with the introduction of the Intel-Cray system, Theta, which is capable of combining traditional simulation runs and machine learning techniques.

The ALCF will help drive simulation, data, and learning to a new level in 2021, when they unveil the nation’s first exascale machine, Aurora. While it can perform a billion billion calculations per second, its main advantage may be its ability to conduct and converge simulation, data analysis, and machine learning under one hood. The end result will allow researchers to approach new types as well as much larger problems and reduce time to solution.

“Aurora will change the game,” says the ALCF’s Papka. “We’re working with vendors Intel and Cray to assure that we can support science through this confluence of simulation, data, and learning all on day one of Aurora’s deployment.”

Whether by Darwin or Turing, whether with chalkboard or graph paper, some of the world’s great scientific innovations were the product of one or several determined individuals who well understood the weight of applying balanced and varied approaches to support – or refute – a hypothesis.

Because current innovation is driven by collaboration among colleagues and between disciplines, the potential for discovery through the pragmatic application of new computational resources, coupled with unrestrained data flow, staggers the imagination.

Pumped storage hydropower a ‘game-changer’

A series of pumped storage hydropower (PSH) projects planned across five states could triple Australia’s electricity storage capacity, according to a new study by a researcher at The Australian National University (ANU).

Professor Jamie Pittock says if the projects go ahead, they will accelerate the transition to renewable energy.

“We’re talking about more than 20 projects being assessed or built. This would put us well on the way to having a national grid that could rely almost entirely on renewables,” Professor Pittock says.

“It’s really a game changer. It destroys any argument that solar and wind can’t provide the baseload power needed to keep the lights on in eastern Australia.”

PSH works by having two connected reservoirs. When there is excess power (for example, on especially sunny or windy days), it’s used to pump water uphill. During times of greater demand, power can then be generated by releasing water back down through a generator.

Professor Pittock’s paper outlines the environmental implications of this system. He says it does throw up some unusual challenges.

“A lot of people live in rural areas because they don’t want to live next to a big industrial project; it might be a shock if somebody suddenly turns around and says they want to build a reservoir on top of the nearest mountain.”

A lot of high-elevation areas that would otherwise be suitable have to be ruled out because of national parks or cultural sites. Other sites are too far from water or existing electricity transmission lines.

Professor Pittock says the sites which could soon be home to PSH projects include everything from old quarries to doubling existing pumped hydro schemes and a “green” steel mill.

“One example is the old gold mining tunnels under Bendigo in Victoria – sucking the contaminated water up to the surface and feeding it back down the mine shafts,” Professor Pittock said.

In South Australia, another project proposes the use of sea water to generate power. This means there’s no blanket rule when it comes to sourcing the water needed for PSH.

“In South Australia, for example, one project will buy water entitlements out of the Murray-Darling Basin system. Then you’ve got Snowy Hydro, whose operators say legally nothing changes – we’re using the same water, we’re just recycling it.”

Professor Pittock says despite the complexity, a number of the proposed sites are really promising, and more than enough to back up the grid.

“Estimates are that we would need about 20 big PSH facilities to back up the entire national grid. It’s partly a judgement call about how much risk you want to take in terms of the reliability of the electricity supply.”

The research has been published in Australian Environment Review.

More information: Jamie Pittock, “Are we there yet? The Murray-Darling Basin and sustainable water management,” Thesis Eleven (2019). DOI: 10.1177/0725513618821970
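For readers wanting the physics behind the two-reservoir mechanism described above, the sketch below computes the energy recoverable from releasing water through a height difference. The reservoir volume, head, and turbine efficiency are made-up example values, not figures for any of the Australian projects.

```python
# Back-of-envelope energy calculation for a pumped storage scheme, based
# on the two-reservoir mechanism described in the article. Stored energy
# is the gravitational potential energy of the water, discounted by the
# turbine's conversion efficiency: E = rho * V * g * h * eta.

RHO = 1000.0   # density of water, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def stored_energy_mwh(volume_m3: float, head_m: float,
                      turbine_efficiency: float = 0.9) -> float:
    """Recoverable energy (MWh) from releasing a volume of water through a
    height difference ("head"), assuming a fixed turbine efficiency."""
    joules = RHO * volume_m3 * G * head_m * turbine_efficiency
    return joules / 3.6e9  # joules -> megawatt-hours

# Hypothetical example: a 2 million m^3 upper reservoir with 300 m of head.
print(f"{stored_energy_mwh(2e6, 300):.0f} MWh")  # ~1472 MWh
```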

Huawei warns US patent curbs would hurt global tech

Chinese tech giant Huawei warned Thursday that a U.S. senator’s proposal to block the company from pursuing damages in patent courts would be a “catastrophe for global innovation.”

The proposal comes amid mounting U.S. action against Huawei, the biggest maker of switching gear for phone carriers. The company has been devastated by the Trump administration’s decision to impose restrictions on its access to American chips for smartphones and other components and technology.

Disrupting Huawei’s access to U.S. patent courts would threaten the intellectual property system that supports technology development, said Song Liuping, the company’s chief legal officer.

The proposal by Sen. Marco Rubio, a Republican from Florida, followed reports Huawei Technologies Ltd. is asking for $1 billion from American phone carrier Verizon for use of the Chinese company’s patents.

“If such a legislative proposal were to be passed, it would be a catastrophe for global innovation. It would have terrible consequences,” Song said at a news conference. He said it would “break the foundation of IP protection.”

American officials accuse Huawei of facilitating Chinese spying, a charge the company denies, and see it as a growing competitive threat to U.S. technology industries.

On Wednesday, a U.S. federal court jury in Texas ruled Huawei stole trade secrets from a Silicon Valley company but awarded no damages, saying the Chinese company didn’t benefit. The jury rejected Huawei’s claims that Cnex Labs Inc. co-founder Yiren Huang stole its technology while he worked at a Huawei subsidiary.

Huawei’s head of intellectual property, Jason Ding, said the company was studying the verdict and deciding what to do next.

Asked about a report by Bloomberg News that some Huawei researchers had published papers with Chinese military personnel over the past decade, Song said the company wasn’t aware of its employees publishing research as private individuals.

“We don’t customize products or do research for the military,” said Song. “We are not aware of employees publishing papers. We don’t have projects of that kind.”

Huawei’s founder, Ren Zhengfei, said this month the company has cut its projected sales by $30 billion over the next two years due to curbs on access to American chips and other components. He said smartphone sales outside China will fall 40%.

Huawei’s U.S. sales of network gear evaporated after a congressional panel labeled the company a security threat in 2012 and told phone carriers to avoid it. But the Chinese company has a patent portfolio it licenses to manufacturers and carriers.

Song gave no confirmation of how much Huawei wants from Verizon or the basis of its claims.

“We aren’t taking an aggressive approach to intellectual property rights,” Song said. “We aim to protect our IP in order to safeguard our global business and we have no intention of weaponizing IP. We are against charging exorbitant royalties, and we think that the fees should be within reasonable realms.”

Huawei, founded in 1987, has China’s biggest corporate research and development budget at $15 billion in 2018. The company is a leader in developing next-generation telecoms technology.

© 2019 The Associated Press. All rights reserved.