Everything posted by Skylights

  1. Neuroscientists at the University of Sussex have shown, using brain scans of drug users, that heroin stimulates a more pleasurable response when taken at home, while cocaine is more pleasurable outside the home, such as in a club. The findings demonstrate for the first time that the setting of drug-taking is an important factor in how the brain processes the experience, and could have important implications for the treatment of drug addiction. The study is published today, Monday 14 May, in the Journal of Neuroscience, the official journal of the Society for Neuroscience.

Dr Silvana De Pirro and Professor Aldo Badiani at the Sussex Addiction Research and Intervention Centre (SARIC) at the University of Sussex, in collaboration with colleagues at Sapienza University of Rome, Italy, recruited people addicted to both cocaine and heroin who were receiving treatment at a medical centre in Rome. The researchers tested the 'mismatch hypothesis': that taking heroin -- which induces a sedative state -- in a stimulating context like a club, or taking cocaine -- which is a stimulant -- in a private context, creates an emotional state that is at odds with the environment. This mismatch keeps the drug from feeling pleasurable.

The neuroscientists asked the first group of 53 people to recall a typical drug episode and indicate how arousing and pleasant their experience was with each drug (heroin or cocaine) in two different settings (at home or outside the home). With guidance, the second group of 20 people imagined using the drugs in each setting while their brain activity was measured with functional magnetic resonance imaging. The results show:

Nine in ten (89.1%) of the drug users reported a pleasurable experience when using heroin at home.
Fewer than four in ten (39.1%) reported a positive state when taking heroin outside the home.
A little over a quarter (26.9%) reported a positive state when taking cocaine at home.
Half (50%) reported a pleasant state, and a further 17.3% a mixed state, when taking cocaine outside the home.

The brain scans showed that during drug imagery the same setting produced opposite neural responses for each of the two drugs in the brain regions involved in processing reward and context: the prefrontal cortex, caudate and cerebellum. The researchers conclude that the emotional and neural response to addictive drugs changes as a function both of the substance and of the setting of use. The neuroscientists are calling on governments and therapy providers to take into account the impact of different environmental factors on different classes of addictive drugs. They hope this will lead to more effective treatment and fewer people suffering relapses.

Dr De Pirro, who undertook this study for her PhD at the University of Sussex, said: "The findings related to the cerebellum are particularly interesting because that part of the brain helps us understand the context of our emotional experiences, so it may explain why the effects of drug taking vary by setting. "This also has important implications for the therapeutic treatment of drug abusers. Considering the interaction between drug type and location could help to prevent relapse. Governments should adapt policies to ensure that therapies take into account the impact of environmental factors on the risk of relapsing, and on their role in supporting recovery from addiction."
Professor Badiani, Director of SARIC at the University of Sussex, says: "These findings challenge the classic view that all drugs produce identical changes in the reward regions of the brain and that they are addictive because of their ability to induce an extremely pleasurable state. "This study shows that the provision of methadone alone is not sufficient for treating heroin addiction. Treatments should also tackle important social and environmental factors. For example, evidence-based interventions such as cognitive behavioural therapy and 'ecological momentary interventions' (such as smartphone applications that people can access anywhere and at any time in their real life when they feel an urge to abuse drugs) should be a critical part of the treatment process."
  2. Radar satellites supply the data used to map sea level and ocean currents. Until now, however, the radar's "eyes" have been blind where the oceans are covered by ice. Researchers at the Technical University of Munich (TUM) have now developed a new analysis method to solve this problem. The melting of the polar ice cap would have a drastic effect: sea level would rise by several meters around the world, impacting hundreds of millions of people who live close to coasts. "This means one of the most important questions of our time is how climate change is affecting the polar regions," explains Dr. Marcello Passaro of the TUM German Geodetic Research Institute.

The blind spot of the radar "eye"

Changes in sea level and ocean currents in the ice-covered regions of the Arctic and Antarctic in particular are very difficult to detect. The reason: the radar signals of the altimeter satellites that have been surveying the surfaces of the earth and oceans for more than two decades are reflected by the ice at the poles. This renders the water underneath the ice invisible. But ocean water also passes through cracks and openings in the permanent ice, reaching the surface. "These patches of water are however very small, and the signals are highly distorted by the surrounding ice. Here, standard evaluation methods like those used for measurements made on the open seas are incapable of returning reliable results," Passaro points out. Together with an international team, he has now developed a data analysis method which sharpens the focus of the radar's eyes.

An algorithm for all occasions

The core of this virtual "contact lens" is the adaptive algorithm ALES+ (Adaptive Leading Edge Subwaveform). ALES+ automatically identifies the portion of the radar signal which is reflected by water and derives sea level values from this information only. This makes it possible to precisely measure the height of the ocean water which reaches the surface through ice cracks and openings. By comparing several years of measurements, climate researchers and oceanographers can now draw conclusions about changes in sea level and ocean currents. "The special thing about our method is that it is adaptive," Passaro notes. "We can use one and the same algorithm to measure sea level in both open and ice-covered ocean areas. ALES+ can also be used for coastal waters, lakes and rivers. Here the signals are highly varied, but always exhibit certain characteristic properties which the system then learns." The scientists used a test scenario in the Greenland Sea to demonstrate that ALES+ returns water levels for ice-covered and open ocean regions which are significantly more precise than those of previous evaluation methods.
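The article does not spell out how a subwaveform retracker works, so here is a minimal toy sketch of the general idea in Python: restrict the fit to the leading edge of the return waveform, where the water reflection lives, and ignore the ice-distorted trailing edge. This is an illustration of the concept only, not the ALES+ implementation; the synthetic waveform, threshold and gate counts are invented.

    import numpy as np

    def leading_edge_subwaveform_retrack(waveform, noise_gates=10, threshold=0.5):
        """Toy subwaveform retracker (concept illustration, not ALES+).

        Restricts retracking to the leading edge of the waveform, the rising
        flank between the noise floor and the peak, so that trailing-edge
        distortion (e.g. from surrounding sea ice) is ignored. Returns the
        fractional range gate where power crosses `threshold` of the peak
        amplitude above the noise floor."""
        wf = np.asarray(waveform, dtype=float)
        noise = wf[:noise_gates].mean()            # noise floor from early gates
        peak = int(np.argmax(wf))
        level = noise + threshold * (wf[peak] - noise)
        for i in range(1, peak + 1):               # search only the leading edge
            if wf[i] >= level:
                frac = (level - wf[i - 1]) / (wf[i] - wf[i - 1])
                return (i - 1) + frac              # interpolated retracking gate
        return float(peak)

    # Synthetic waveform: sharp leading edge at gate ~60, damped trailing edge
    gates = np.arange(128)
    wf = 1.0 + 40.0 / (1.0 + np.exp(-(gates - 60) / 2.0))
    wf[70:] *= 0.6   # crude stand-in for an ice-distorted tail
    print(f"retracking gate: {leading_edge_subwaveform_retrack(wf):.2f}")  # ~60.0

The design point mirrors the article's description: only the rising flank carries the water signal, so fitting that subwaveform alone keeps ice-contaminated samples out of the sea level estimate.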
  3. A massive decade-long study of Western Equatorial Africa's gorillas and chimpanzees has uncovered both good news and bad about our nearest relatives. The good news: there are one third more western lowland gorillas and one tenth more central chimpanzees than previously thought. The bad news: the vast majority of these great apes (80 percent) exist outside of protected areas, and gorilla populations are declining by 2.7 percent annually.

The WCS-led study titled "Guns, germs and trees determine density and distribution of gorillas and chimpanzees in Western Equatorial Africa" appears in the latest edition of the journal Science Advances. The newly published paper was written by 54 co-authors from several organizations and government agencies, including WCS (Wildlife Conservation Society), WWF (World Wide Fund for Nature), the Max Planck Institute for Evolutionary Anthropology, the Jane Goodall Institute, the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) -- Monitoring the Illegal Killing of Elephants (MIKE), Lincoln Park Zoo, and the Universities of Stirling and Washington, and involved the protected area authorities of five countries.

Researchers collected field data during foot surveys carried out over a 10-year period across the range of both western lowland gorillas (Gorilla gorilla gorilla) and central chimpanzees (Pan troglodytes troglodytes) -- surveying an area of 192,000 square kilometers (72,000 square miles, roughly the size of the state of Washington) and including some of the most remote forests on the African continent. The authors of the study report an estimated abundance of over 360,000 gorillas and nearly 130,000 chimpanzees across the combined ranges of both subspecies, both figures higher than previously believed: the gorilla estimate is approximately one-third higher and the chimpanzee estimate about one-tenth higher. These revised numbers come largely from refinements to the survey methodology, new data from areas not previously included in range-wide estimates, and predictions of numbers in the areas between survey sites.

"It's great news that the forests of Western Equatorial Africa still contain hundreds of thousands of gorillas and chimpanzees, but we're also concerned that so many of these primates are outside of protected areas and vulnerable to poachers, disease, and habitat degradation and loss," said lead author Samantha Strindberg of WCS. "These findings can help inform national and regional management strategies that safeguard the remaining habitat, increase anti-poaching efforts, and curtail the effects of development on great apes and other wildlife."

Although the majority of great apes were found outside of protected areas, they were still in large forested landscapes close to or bordering existing national parks and reserves and away from centers of human activity. This suggests that protecting large and intact forested areas, with protected areas at their core, is critical to conserving gorillas and chimpanzees in this region. The data analysis also revealed a 2.7 percent annual decline in gorilla numbers, a finding that supports the continued status of the species as "Critically Endangered" on the IUCN Red List of Threatened Species (a short sketch of what that rate compounds to follows this item). Chimpanzees are listed as "Endangered." The combined field time spent by researchers collecting data for the study totaled approximately 61,000 days (roughly 167 person-years).
Researchers walked more than 8,700 kilometers (5,400 miles) -- a distance longer than the north-south axis of the African continent, or from New York to London -- while collecting data on great ape nests that were used to generate population estimates and trends. Said co-author Dave Morgan of the Lincoln Park Zoo and Goualougo Triangle Ape Project: "The boots-on-the-ground research teams and partnerships are crucial to the success of these programs and the conservation of gorillas and chimpanzees. These long-term studies enable us to make informed recommendations regarding protected lands and management to help great apes."

The main factors responsible for the decline of gorillas and chimpanzees are illegal hunting, habitat degradation, and disease. At the same time, it was clear that where wildlife guards were present, above all in protected areas with intact forests, both gorillas and chimpanzees can thrive. David Greer of WWF said: "All great apes, whether in Africa or Asia, are threatened by poaching, especially for the bushmeat trade. Our study found that apes could live in safety, and thus in higher numbers, at guarded sites than if there was no protection." Said Fiona Maisels of WCS: "Our study underscores the huge importance of intact forests to gorillas and chimpanzees, and of preventing illegal felling of good quality forests."

Other conservation recommendations made by the authors include land-use planning at national scales to keep ecologically harmful activities, such as agriculture and new road construction, away from intact forests and the protected areas that serve as important gorilla and chimpanzee refuges. Another priority is the implementation of careful logging practices in existing logging concessions that follow Forest Stewardship Council (FSC) standards for reducing impacts on wildlife and habitats. These standards require that access to forests is controlled, old logging roads are effectively decommissioned and effective patrol systems are put in place to prevent illegal hunting. Ensuring strong implementation is critical.

An additional threat to great apes -- as well as human health -- is the Ebola virus disease. Continued research into developing a vaccine and the means to deliver it are priorities, as are educational efforts on how to avoid spreading the disease and transmission between humans and great apes. Of the 14 living great ape taxa, western lowland gorillas and central chimpanzees have the largest remaining populations. This is certainly good news. However, their future preservation cannot be taken for granted, given that their dependence on suitable habitat collides with local-to-global demand for natural resources from their habitat, particularly outside of protected areas, where most of them occur. Said Hjalmar Kühl of the Max Planck Institute for Evolutionary Anthropology: "Protecting our gorillas and chimpanzees will therefore require a major increase in political will at all levels -- national, regional, and global. Financial commitments from governments, international agencies for endangered species conservation and the private sector are also critical for conserving our closest relatives and their habitats."
Liz Williamson from the University of Stirling and the IUCN Red List Authority Coordinator for great apes said: "A combination of responsible industrial practices, conservation policies, and a network of well-managed parks and corridors would provide wildlife managers with a winning formula for conserving great apes in Central Africa. Our study has revealed that it is not too late to secure a future for gorillas and chimpanzees."
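To put the study's 2.7 percent annual decline in perspective, here is a minimal bit of compounding arithmetic in Python. This is an illustration only, not the survey's statistical model; it assumes the rate simply stays constant.

    import math

    # Constant 2.7% annual decline (illustrative assumption).
    retained = 1.0 - 0.027

    # Years for a population to halve: solve retained**t = 0.5.
    halving_time = math.log(0.5) / math.log(retained)
    print(f"halving time: {halving_time:.1f} years")         # ~25.3 years

    # Share of the reported ~360,000 gorillas left after a decade at this rate.
    print(f"after 10 years: {360_000 * retained**10:,.0f}")  # ~273,800

At a steady 2.7 percent per year, a population loses half its members roughly every 25 years, which is why the authors treat the trend as supporting the "Critically Endangered" listing.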
  4. Experts from the Alfred Wegener Institute and the Universities of Oldenburg and Potsdam, Germany have confirmed the existence of a new cryptic amphipod species in the North Sea. For the first time in the description of a new species, they used a depth of mitogenomic information that was previously applied only in other areas of genetics. The discovery of Epimeria frankei has now been published in the journal Scientific Reports. In the future, this level of molecular information could revolutionise biodiversity research.

Reports of "new species" in the North Sea usually relate to animals or algae that were newly introduced by human activities. The discovery of a new amphipod species is proof that there are still unknown organisms lurking in the German Bight. A team of scientists led by Dr Jan Beermann from the Alfred Wegener Institute, Helmholtz Centre for Polar and Marine Research (AWI) and Dr Michael J. Raupach from the University of Oldenburg, together with colleagues from the University of Potsdam, has now discovered and described a previously overlooked species in the North Sea -- a rare event, considering that the region is one of the best-studied seas in the world.

First, AWI ecologist Beermann and Michael Raupach from the University of Oldenburg analysed so-called "DNA barcodes" of North Sea crustaceans: small genetic sequences that are a common tool in modern biodiversity research. Scientists create molecular libraries with these barcodes in order to simplify the identification of species (a toy barcode comparison follows this item). When Beermann and Raupach analysed their data, they began to suspect that they were dealing not with one, but with two different species. "Once we had a closer look, we noticed that, for instance, some animals exhibited more pointed plates on their legs than others, but these subtle differences aren't always easy to detect," reports Beermann. "The moment you realise that you've probably discovered a new species is fascinating and incredibly exciting. The North Sea isn't the first place you'd expect to stumble across an unknown species -- especially in a genus whose North Sea members are comparatively large, with body lengths of up to three centimetres, and whose eye-catching colourations have also attracted the attention of earlier generations of researchers," says Jan Beermann. The new Epimeria species was named Epimeria frankei, after Prof. Heinz-Dieter Franke, an ecologist who worked for many years at the AWI marine station at Helgoland, and who was Jan Beermann's PhD mentor.

With the discovery and the availability of extensive information on the two species, both species had to be newly described. "In this regard we wanted to prepare species descriptions that weren't restricted to the physical appearance but also include detailed genomic information," explains Michael Raupach from the University of Oldenburg. "A few years ago, this would have been extremely time-consuming. But nowadays, modern technologies make the analyses much faster and easier." For the descriptions, the scientists made use of the entire mitochondrial genome, using state-of-the-art decoding methodologies. In collaboration with the genomics team of Prof Michael Hofreiter from the University of Potsdam, they sequenced the genome with cutting-edge technologies. Classifying the importance of their work, Hofreiter and Raupach conclude: "We are the first team in the world to analyse the complete genetic material of the mitochondria, base pair by base pair, in the context of a species description."
From the first indication to the confirmation that they had truly detected a previously undiscovered amphipod species, it took the researchers more than six years. They were originally investigating the species Epimeria cornigera when they took notice of its sister species. Until then, Epimeria cornigera was commonly assumed to occur from the Mediterranean Sea to Iceland -- a broad but plausible distribution. Nevertheless, reliable information on the species' biology was still scarce. As Jan Beermann explains, "We now know that the new species, Epimeria frankei, ranges from the Mediterranean to the North Sea, whereas the old species, Epimeria cornigera, is more restricted to the northern North Atlantic. There is a small area of overlap in the North Sea where both species can be found." With the addition of E. frankei, the number of known Epimeria species in the north-eastern Atlantic increased to a total of five. This new discovery underlines that, even today, marine biodiversity can still be underestimated and that molecular methods have become an indispensable tool for modern biodiversity research. For their publication, the researchers combined various molecular genetic and morphological methods into a so-called integrated taxonomic approach ("taxonomics"). The authors are convinced: "The successful validation of this approach confirms that, for future biodiversity research, taxonomics could also prove to be extremely important for further considerations such as marine conservation."
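Since the item above leans on DNA barcoding, a minimal sketch of the core comparison may help: barcoding flags candidate new species when short, aligned marker sequences (commonly the COI gene in animals) diverge more than typical within-species variation. The sequences and threshold below are invented for illustration; real animal barcodes run to roughly 650 base pairs.

    def p_distance(seq_a: str, seq_b: str) -> float:
        """Uncorrected pairwise distance: the fraction of differing sites
        between two aligned, equal-length sequences."""
        if len(seq_a) != len(seq_b):
            raise ValueError("sequences must be aligned to equal length")
        return sum(a != b for a, b in zip(seq_a, seq_b)) / len(seq_a)

    # Hypothetical, made-up barcode fragments.
    barcode_1 = "ACTGGCATTAGTAGGAACAGCA"
    barcode_2 = "ACTGGTATTAGTGGGAACTGCA"

    print(f"p-distance: {p_distance(barcode_1, barcode_2):.3f}")  # 0.136
    # Divergence well beyond typical within-species variation (a common
    # rule of thumb for animal COI is around 2-3%) signals that two
    # lineages may be distinct species, the kind of first clue that
    # separated E. frankei from E. cornigera.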
  5. What difference does it make to the Earth's water resources if we can limit global warming to 1.5°C instead of 2°C? A research group led by Goethe University Frankfurt has simulated these scenarios with global hydrological models. An important result: high flows, and thus flood hazards, will increase significantly over an average of 21 percent of the global land area if the temperature rises by 2°C. If we manage to limit global warming to 1.5°C, only 11 percent of the global land area would be affected.

According to the Paris Agreement on climate change of December 2015, the increase in global average temperature should be kept well below 2°C compared to pre-industrial levels, if possible even below 1.5°C. To find out what the two scenarios mean specifically in terms of reducing risks for the global freshwater system, the Federal Ministry of Education and Research commissioned a study which has now been published and is intended for inclusion in the forthcoming special report by the Intergovernmental Panel on Climate Change (IPCC) on global warming of 1.5°C. As the research group led by Professor Petra Döll from the Department of Physical Geography at Goethe University Frankfurt reports in the current issue of Environmental Research Letters, it used two global hydrological models for the analysis, which were "fed" with a new type of climate simulation known as HAPPI simulations. These are more suitable than previous types of simulations for quantifying the risks of the two long-term climate goals. By calculating seven indicators, the team characterized risks for humans, freshwater organisms and vegetation.

"If we compare four groups of countries with different per-capita incomes, those countries with a low or lower-middle income would profit most from a limitation of global warming to 1.5°C, in the sense that the increase in flood risk in those countries would remain far lower than at 2°C," explains Petra Döll, first author of the study. Countries with a high income would profit most of all from the fact that rivers and land would dry out far less in the dry months of the year.
  6. For Emperor penguins waddling around a warming Antarctic, diminishing sea ice means less fish to eat. How the diets of these tuxedoed birds will hold up in the face of climate change is a big question scientists are grappling with. Researchers at the Woods Hole Oceanographic Institution (WHOI) have developed a way to help determine the foraging success of Emperor penguins by using time-lapse video observations relayed to scientists thousands of miles away. The new remote sensing method is described in the May 2, 2018, issue of the Journal of Applied Physics.

"Global warming may be cutting in on food availability for Emperor penguins," said Dan Zitterbart, a scientist at WHOI and co-author of the study. "And if their diets change significantly, it could have implications on the health and longevity of these animals -- which are already expected to be highly threatened or close to extinct by the end of this century. With this new approach, we now have a logistically viable way to determine the foraging success of these animals by taking images of their behavior once they return back to the colony from their foraging trips."

Of all the penguin species, Emperor penguins tend to be the biggest eaters. And for good reason: they make exceptionally long treks on sea ice to reach their foraging grounds -- sometimes up to 75 miles during the winter -- and feed their large chicks when they return. But as sea ice diminishes, so does the microscopic plankton living underneath it, which serves as the primary food source for the fish that penguins eat. Sea ice also provides an important resting platform for the penguins in between foraging dives, so melting can make foraging that much harder.

Determining the species' foraging success involves a two-step process. First, digital photos of the birds are taken every minute throughout the day using an inexpensive time-lapse camera perched above the colony 100 feet away. The camera is rugged enough to withstand temperatures down to −50° Celsius and wind speeds above 150 kilometers per hour. Céline Le Bohec, a research scientist in ecology from the Centre national de la recherche scientifique (CNRS) and the Centre Scientifique de Monaco, and co-author of the study, says this spying capability overcomes a major limitation in Antarctic field research: the ability to monitor conditions remotely. "It's really important to be able to understand how changing environmental conditions will impact penguin populations, but the harsh weather conditions and logistic difficulties linked to the remoteness of the white continent have made it very challenging to get information from over there," she said. "Now, with our observatories, especially remotely-controlled ones, we can go online anytime and instantly see what is happening in the colony. Moreover, due to their position at the upper level of the food web, working on top predators such as Emperor penguins is very useful for understanding and predicting the impact of global changes on the polar marine biome: it's like having an alarm system on the health of these ecosystems."

Images are recorded and stored in an image database and later correlated with sensor-based measurements of air temperature, relative humidity, solar radiation, and wind. The combined data sets enable Zitterbart and his team to calculate a "perceived penguin temperature" -- the temperature that penguins are feeling. It is much like the wind chill factor for humans: the air temperature may be −12° Celsius, but other factors can make it feel colder.
"Early in the project, we thought if, for example, the wind was blowing faster than 15 meters per second, the penguins would always be huddling, regardless of the other environmental conditions," said Sebastian Richter, a Ph.D. student in Zitterbart's group and lead author of the study. "However, we did not find this to be true, and soon realized that we needed to account for the other weather conditions when assessing huddling behavior." By correlating the penguin's "wind chill" temperature with video observations of when the penguins begin huddling, they're able to come up with a "transition temperature" -- the temperature at which colonies shift from a scattered, liquid-like state to a huddled, solid-like state. If the transition occurs at warmer temperatures, it means the penguins are feeling cold earlier and begin huddling to stay warm and conserve energy. And that indicates that the penguins had less body fat upon their return from foraging and were probably undernourished because they did not find enough food to eat within a reasonable distance from their breeding colony. If the transition temperature is lower later in the season, it suggests that the foraging season was a success and the animals returned well-fed and with higher amounts of body fat. Zitterbart says the information may ultimately be used to derive conservation measures to protect Emperor penguins. According to a previous WHOI study, the species is critically endangered, and it's projected that by 2100, the global population will have declined by 20% and some colonies might reduce by as much as 70% of the current number of breeding pairs of Emperor penguins if heat-trapping gas emissions continue to rise and Antarctic sea ice continues to retreat. "With the information produced by our observatories, population modelling will help us to better project the fate of the different colonies that are left," he said. "It's important to know which colonies are going to be the first most affected by climate change, so if it appears that a certain colony will remain strong over the next century, conservation measures like marine protected areas can be established to better protect them."
  7. The amount of mercury extracted from the sea by industrial fishing has grown steadily since the 1950s, potentially increasing mercury exposure among the populations of several coastal and island nations to levels that are unsafe for fetal development. These are the findings of a study carried out by researchers from Université de Montréal's Department of Biological Sciences and published this week in Scientific Reports.

The study combined data on the amount of mercury fished out of oceans and seas from 1950 to 2014 with the weekly consumption of fish and seafood by the populations of 175 countries between 1961 and 2011. By comparing these data, which were published by the Food and Agriculture Organization of the United Nations (FAO), postdoctoral fellow Raphaël Lavoie was able to estimate these populations' per capita intake of methylmercury (MeHg), a highly toxic form of mercury. Working under the direction of Professor Marc Amyot, Lavoie estimated that the people of 38 per cent (66 of 175) of the countries examined by the study might be exposed to methylmercury levels higher than the maximum deemed safe for fetal development. The highest-risk countries include the Maldives, Iceland, Malaysia, Lithuania, Japan, Barbados and South Korea. When humans ingest excessively high levels of methylmercury, the toxin's molecules can penetrate the blood-brain barrier and impair cerebral development, especially in children and fetuses.

Industrialization has released vast quantities of mercury into the atmosphere, which have settled in oceans and waterways. This mercury is absorbed by sea creatures, many of which are consumed by humans. Since 1950, demand for seafood has skyrocketed while technological breakthroughs have enabled more intensive forms of industrial fishing. Since the 1990s, when overfishing drastically reduced stocks, industrial fishing has gradually migrated to deep-sea and international waters. "The global marine catch totals 80 million tonnes of fish per year, which means that we are also pulling out increasingly large amounts of mercury," said Amyot. Of the industrial fishing areas listed by the FAO, the Northwest Pacific currently exports the most fish -- and the most methylmercury. The Western Central Pacific holds second place, while the Indian Ocean ranks third. "Together, these three fishing areas exported 60 per cent of the mercury resulting from global seafood production in 2014," said Lavoie. The people in these regions are some of the world's top seafood consumers. Species high up on the food chain contain the highest concentrations of mercury. From 1950 to 2014, large fish represented approximately 60 per cent of the global catch (by weight) and nearly 90 per cent of the mercury ingested by consumers from fish.

To be safe for fetal development, the threshold for methylmercury consumption is 1.6 micrograms per kilogram of a person's body weight per week (1.6 µg/kg/week); the arithmetic behind such estimates is sketched after this item. "By comparing FAO data on global seafood consumption, we observed that from 2001 to 2011 the populations of 38 per cent of the 175 countries we analyzed would have been exposed to weekly doses of methylmercury far above the maximum safe level of consumption for fetal development," said Lavoie. "Many of these populations are in coastal and island nations, especially developing countries." For instance, during that 10-year period, people in the Maldives would have consumed an average of 23 micrograms of methylmercury per kilogram of body weight each week, or more than 14 times what's deemed safe.
The next highest-ranking were people in Kiribati (8 µg/kg/week), Iceland (7.5 µg/kg/week), Malaysia and Samoa (6.4 µg/kg/week), French Polynesia (5 µg/kg/week), Lithuania, Japan and Barbados (4.8 µg/kg/week) and South Korea (4.7 µg/kg/week). By contrast, the global average for mercury exposure over the same 2001-2011 period was estimated at 1.7 µg/kg/week. In Canada, exposure totalled 1 µg/kg/week. Lavoie and Amyot said their estimates are conservative: the global catch by the fishing industry, including artisanal and illegal fishing, is probably 50 per cent higher than the FAO data indicate. Both researchers believe these estimates could help authorities find ways to reduce the risk of mercury exposure, especially among high-risk populations such as children and pregnant women. Some methods of preparing and consuming fish seem to reduce the risk of methylmercury contamination, they pointed out. In a recent study, they found that cooking fish or consuming it in combination with certain polyphenols contained in foodstuffs like tea could reduce the bioavailability of methylmercury in the human body. That is good news, because contrary to prevailing opinion, the methylmercury we consume may not be fully absorbed by the body.
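The threshold comparison in the item above is simple arithmetic, sketched below. The consumption and concentration figures are hypothetical; only the 1.6 µg/kg/week safety threshold comes from the article.

    SAFE_LIMIT = 1.6  # ug of MeHg per kg of body weight per week (from the article)

    def weekly_exposure(fish_g_per_week: float, mehg_ug_per_g: float,
                        body_weight_kg: float) -> float:
        """Weekly methylmercury intake normalized by body weight (ug/kg/week)."""
        return fish_g_per_week * mehg_ug_per_g / body_weight_kg

    # Hypothetical: 700 g of fish a week at 0.5 ug MeHg per gram, 60 kg adult.
    dose = weekly_exposure(700, 0.5, 60)
    print(f"{dose:.1f} ug/kg/week = {dose / SAFE_LIMIT:.1f}x the safe limit")

    # The article's Maldives figure works the same way: 23 / 1.6 ~ 14.4x.

This arithmetic also shows why large predatory fish dominate the ingested-mercury budget: their per-gram MeHg concentration is far higher, so the same weekly catch weight delivers a much larger dose.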
  8. Human activities have contributed to global warming, which in turn is increasing the erosion of land. As a result, conductive minerals are being washed into water streams in growing amounts. The inflow of conductive particles can enable unusual electric partnerships between microbes, leading to additional emissions of methane, a potent greenhouse gas.

Microbes work in mysterious ways. Some thrive in extreme cold, under extreme pressure, in extreme heat, in extreme salinity or in extreme acidity. Some feed on organic material, while others prefer rock, chemical compounds or heavy metals. In the journal mBio, researchers report their encounter with microbes that go for something completely different when looking for attractive living conditions. "In the Baltic Sea we discovered microbes that share a meal which neither could efficiently eat alone. They establish their odd interaction by transferring electricity from one species to another via conductive particles. So they use the conductive particles to interact electrically with each other and outcompete anyone else that might be enticed by the same food source," explains lead author Amelia-Elena Rotaru from the University of Southern Denmark.

The bad side of 'electric' microbes

This rather bizarre electric relationship via conductive particles is not only a biological curiosity -- it also prompts the production of methane, a potent greenhouse gas. The SDU-led team showed that the microbes required the conductive particles in order to thrive and release methane. Such particles can be of many different origins, as many materials are conductive. One example is magnetite, an abundant iron-oxide mineral in Baltic Sea sediments. Magnetite is a type of particle which can be supplied to the seawater by land erosion.

Microbe alliances to keep away the foes

Two microbes established a conductive-particle-driven alliance. These microorganisms are named Geobacter and Methanosarcina. The study shows that as long as conductive particles were available, both persisted, but when the conductive particles were taken away, Geobacter went extinct and Methanosarcina dramatically decreased its activity. Not only do the two studied species benefit from this collaboration -- they also manage to keep away other microbes (Methanothrix) attempting to compete for the same resources. In the Bothnian Bay, conductive particles can be supplied by runoff from the eight rivers entering the bay from Sweden and Finland, and also by runoff from the forestry industry and various coastal industries. "This was the first time we could scientifically document conductive-particle-based associations between bacteria and methanogens from environmental samples. We are now finding similar trends in other aquatic environments, such as oceanic oxygen minimum zones and lake sediments," said the lead author.

The good side of 'electric' microbes

Amelia-Elena Rotaru and her team are presently focused on finding a way to make microorganisms work for us to produce the chemicals of the future. "If we can find a way to use the electric properties of microbes to store carbon and electricity while producing biodegradable and safe materials similar to those made today from fossil fuels, we would have achieved our ultimate goal: to foster a sustainable society," said Amelia-Elena Rotaru.
  9. Temperature fluctuations that are amplified by climate change will hit the world's poorest countries hardest, new research suggests. For every degree of global warming, the study suggests temperature variability will increase by up to 15% in southern Africa and Amazonia, and up to 10% in the Sahel, India and South East Asia. Meanwhile, countries outside the tropics -- many of which are richer countries that have contributed most to climate change -- should see a decrease in temperature variability. The researchers, from the universities of Exeter, Wageningen and Montpellier, discovered this "unfair pattern" as they addressed the difficult problem of predicting how weather extremes such as heat waves and cold snaps might change in a future climate. "The countries that have contributed least to climate change, and have the least economic potential to cope with the impacts are facing the largest increases in temperature variability," said lead author Dr Sebastian Bathiany, of Wageningen University. Co-author Professor Tim Lenton, from the University of Exeter, added: "The countries affected by this dual challenge of poverty and increasing temperature variability already share half of the world's population, and population growth rates are particularly large in these countries." "These increases are bad news for tropical societies and ecosystems that are not adapted to fluctuations outside of the typical range." The study also reveals that most of the increased temperature fluctuations in the tropics are associated with droughts -- an extra threat to food and water supplies. For their investigation, the team analysed 37 different climate models that have been used for the last report of the Intergovernmental Panel on Climate Change (IPCC). Although climate variability has been studied extensively by climate scientists, the fact that climate variability is going to change has received little attention in fields investigating the impacts of climate change.
  10. A new study of chemical reactions that occur when organic matter decomposes in freshwater lakes has revealed that debris from trees suppresses the production of methane, while debris from plants found in reed beds actually promotes this harmful greenhouse gas. As vegetation in and around bodies of water continues to change, with forest cover being lost while global warming causes wetland plants to thrive, the many lakes of the northern hemisphere -- already a major source of methane -- could almost double their emissions in the next fifty years.

The researchers say the findings suggest the discovery of yet another "feedback loop" in which environmental disruption and climate change trigger the release of ever more greenhouse gas that further warms the planet, similar to the concerns over the methane released by fast-melting arctic permafrost. "Methane is a greenhouse gas at least twenty-five times more potent than carbon dioxide. Freshwater ecosystems already contribute as much as 16% of the Earth's natural methane emissions, compared to just 1% from all the world's oceans," said study senior author Dr Andrew Tanentzap from the University of Cambridge's Department of Plant Sciences. "We believe we have discovered a new mechanism that has the potential to cause increasingly more greenhouse gases to be produced by freshwater lakes. The warming climates that promote the growth of aquatic plants have the potential to trigger a damaging feedback loop in natural ecosystems."

The researchers point out that the current methane emissions of freshwater ecosystems alone offset around a quarter of all the carbon soaked up by land plants and soil: the natural 'carbon sink' that drains and stores CO2 from the atmosphere. Up to 77% of the methane emissions from an individual lake are the result of the organic matter shed primarily by plants that grow in or near the water. This matter gets buried in the sediment found toward the edge of lakes, where it is consumed by communities of microbes. Methane is generated as a byproduct, which then bubbles up to the surface. Working with colleagues from Canada and Germany, Tanentzap's group found that the levels of methane produced in lakes vary enormously depending on the type of plants contributing their organic matter to the lake sediment. The study, funded by the UK's Natural Environment Research Council, is published today in the journal Nature Communications.

To test how organic matter affects methane emissions, the scientists took lake sediment and added three common types of plant debris: from deciduous trees that shed leaves annually, from evergreen coniferous trees that shed pine needles, and from cattails (often known in the UK as 'bulrushes') -- a common aquatic plant that grows in the shallows of freshwater lakes. These sediments were incubated in the lab for 150 days, during which time the scientists siphoned off and measured the methane produced. They found that the bulrush sediment produced over 400 times as much methane as the coniferous sediment, and almost 2,800 times as much as the deciduous sediment. Unlike the cattail debris, the chemical makeup of the organic matter from trees appears to trap large quantities of carbon within the lake sediment -- carbon that would otherwise combine with hydrogen and be released as methane into the atmosphere. To confirm their findings, the researchers also "spiked" the three samples with the microbes that produce methane to gauge the chemical reaction.
While the forest-derived sediment remained unchanged, the sample containing the bulrush organic matter doubled its methane production. "The organic matter that runs into lakes from the forest trees acts as a latch that suppresses the production of methane within lake sediment. These forests have long surrounded the millions of lakes in the northern hemisphere, but are now under threat," said Dr Erik Emilson, first author of the study, who has since left Cambridge to work at Natural Resources Canada. "At the same time, changing climates are providing favourable conditions for the growth and spread of aquatic plants such as cattails, and the organic matter from these plants promotes the release of even more methane from the freshwater ecosystems of the global north." Using species distribution models for the Boreal Shield, an area that covers central and eastern Canada and "houses more forests and lakes than just about anywhere on Earth," the researchers calculated that the number of lakes colonised by just the common cattail (Typha latifolia) could double in the next fifty years -- causing current levels of lake-produced methane to increase by at least 73% in this part of the world alone. Added Tanentzap: "Accurately predicting methane emissions is vital to the scientific calculations used to try and understand the pace of climate change and the effects of a warmer world. We still have limited understanding of the fluctuations in methane production from plants and freshwater lakes."
  11. A first-of-its-kind laser instrument designed to map the world's forests in 3-D is moving toward an earlier launch to the International Space Station than previously expected. The Global Ecosystem Dynamics Investigation -- or GEDI, pronounced like "Jedi," of Star Wars fame -- instrument is undergoing final integration and testing this spring and summer at NASA's Goddard Space Flight Center in Greenbelt, Maryland. The instrument is expected to launch aboard SpaceX's 16th commercial resupply services mission, targeted for late 2018. GEDI is led by the University of Maryland, College Park; the instrument is being built at NASA Goddard.

"Scientists have been planning for decades to get comprehensive information about the structure of forests from space to deepen our understanding of how this structure impacts carbon resources and biodiversity across large regions and even globally, as well as a host of other science issues," said Ralph Dubayah, GEDI principal investigator and a professor of geographical sciences at the University of Maryland. "This is why seeing the instrument built and racing toward launch is so exciting."

From its perch on the exterior of the orbiting laboratory, GEDI will be the first space-borne laser instrument to measure the structure of Earth's tropical and temperate forests in high resolution and three dimensions. These measurements will help fill critical gaps in scientists' understanding of how much carbon is stored in the world's forests, the potential for ecosystems to absorb rising concentrations of carbon dioxide in Earth's atmosphere, and the impact of forest changes on biodiversity.

GEDI will accomplish its science goals through an ingenious use of light. The instrument is a lidar, which stands for light detection and ranging. It captures information by sending out laser pulses and then precisely measuring the light that is reflected back (the basic ranging arithmetic is sketched after this item). GEDI's three lasers will produce eight ground tracks -- two of the lasers will generate two ground tracks each, and the third will generate four. As the space station and GEDI orbit Earth, laser pulses will reflect off clouds, trees and the planet's surface. While the instrument will gather height information about everything in its path, it is specifically designed to measure forests. The amount and intensity of the light that bounces back to GEDI's telescope will reveal details about the height and density of trees and vegetation, and even the structure of leaves and branches within a forest's canopy.

NASA has flown multiple Earth-observing lidars in space, notably the ICESat (Ice, Cloud and land Elevation Satellite) and CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation) missions. But GEDI will be the first to provide high-resolution laser ranging of Earth's forests. "GEDI originally was scheduled to launch aboard a resupply mission in mid-2019, but the team at Goddard that is building and testing GEDI was always on track to deliver a finished instrument by the fall of this year," said Project Manager Jim Pontius, which made the move to an earlier resupply mission feasible. The team is now preparing to put GEDI through a battery of pre-launch tests to ensure it is ready to withstand the rigors of launch and of operating in space. NASA selected the proposal for GEDI in 2014 through the Earth Venture Instrument program, which is run by NASA's Earth System Science Pathfinder (ESSP) office.
ESSP oversees a portfolio of projects -- satellites, instruments on the space station, and suborbital field campaigns on Earth -- that are designed to be lower-cost and more focused in scope than larger, free-flying satellite missions.
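As a back-of-the-envelope illustration of the lidar principle described above (not GEDI's actual waveform processing), ranging converts a pulse's round-trip travel time into distance, and canopy height falls out as the difference between the canopy-top and ground returns. The timing numbers below are invented for a roughly 400 km station altitude.

    C = 299_792_458.0  # speed of light, m/s

    def one_way_range_m(round_trip_s: float) -> float:
        """Distance to the reflecting surface from round-trip pulse time."""
        return C * round_trip_s / 2.0

    # Hypothetical echo times for one laser footprint.
    t_canopy_top = 0.00266706  # s, return from the top of the canopy
    t_ground     = 0.00266726  # s, return from the ground beneath it

    height = one_way_range_m(t_ground) - one_way_range_m(t_canopy_top)
    print(f"canopy height: {height:.1f} m")  # ~30 m

A 0.2-microsecond gap between the two returns corresponds to about 30 m of canopy, which is why precise timing of the reflected light reveals vegetation structure.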
  12. In the 1890s, settlers crossed the Rocky Mountains seeking new opportunities -- and bearing frogs. A new study coauthored by a San Francisco State University biology professor draws a link between that introduction of American bullfrogs (Rana catesbeiana) to the western half of the United States and the spread of a fungus deadly to amphibians. The work highlights the catastrophic results of moving animals and plants to new regions.

The fungus Batrachochytrium dendrobatidis (Bd) has rapidly spread around the world since the 1970s, causing a skin disease called chytridiomycosis and wiping out more than 200 species of amphibians globally. In the United States, these declines have followed a curious pattern. "In the whole region east of the Rockies, there hasn't been a single outbreak of Bd," said study author Vance Vredenburg, a professor of biology at San Francisco State. "But in the West there's hundreds, if not thousands." American bullfrogs, a species introduced to the West by settlers who wished to populate ponds with an abundant source of frog legs, have for over a decade been a main suspect. Bullfrogs can carry Bd without falling victim to it themselves, making them a potential vehicle for the fungus to colonize new habitats that harbor vulnerable amphibians.

Firmly placing the blame on bullfrogs, however, has been difficult. "The problem is we need a time machine to see what happened," Vredenburg explained. So he and a team of colleagues sought out historical data from around the American West to pinpoint when bullfrogs arrived in each region and how those dates line up with the first local records of Bd. In 83 of the 100 watersheds where the team could dig up data on both bullfrog and Bd occurrence, the frogs were spotted first or in the same year. And in 13 of the remaining 17 cases, bullfrogs had previously been found in a neighboring region. The team reported their results on April 16 in the journal PLoS ONE. "Even when Bd got there before bullfrogs, the frogs were usually close by," Vredenburg explained. So these new findings add to the evidence in the case scientists have built against American bullfrogs -- if their presence is a prerequisite for an outbreak, it appears even more likely that they've contributed to Bd's spread.

That link between frog and fungus explains patterns in the U.S., but it's also relevant far beyond the country's borders. Thanks in part to a U.S. Agency for International Development program that shipped bullfrogs to developing countries to start frog farms, the invasive amphibians have taken hold in parts of Europe, Asia and South America. "I hope researchers will take this study and try it in other parts of the world," Vredenburg said. As for the rest of us, the study holds a simple lesson: keep plants and animals where they are. "We need to stop introducing non-native species to areas where they don't belong," Vredenburg said. "Not just in our own backyard but globally."
  13. With warming climates around the world, many regions are experiencing changes in snow accumulation and persistence. Historically, researchers and water managers have used snow accumulation amounts to predict streamflow, but these can be challenging to measure across mountain environments. In a new study, a team of researchers at Colorado State University found that snow persistence -- the amount of time snow remains on the ground -- can be used to map patterns of annual streamflow in dry parts of the western United States. The ultimate goal of this research is to determine how melting snow affects the flow of rivers and streams, which has an impact on agriculture, recreation and people's everyday lives. The scientists said the findings may be useful for predicting streamflow in drier regions around the world, including the Andes mountains in South America and the Himalayas in Asia. The study was published in Water Resources Research, a journal of the American Geophysical Union.

John Hammond, a doctoral student in the Department of Geosciences at CSU and lead author of the study, said the research is the first of its kind to explicitly link snow persistence and water resources using hard data; similar research had only been conducted using computer-generated models. The research team examined how snow and changes in climate relate to streamflow measurements for small watersheds across the western United States, using data from MODIS, a satellite sensor, and from stream gauging stations operated by the U.S. Geological Survey. They studied mountainous regions with varying climates across the western United States: the Cascades in the northwest, the Sierras, and the northern and southern Rockies.

Stephanie Kampf, associate professor in the Department of Ecosystem Science and Sustainability and study co-author, said the snow persistence data are particularly useful in dry mountain regions. "If we look at how increases in snow relate to annual streamflow, we see basically no pattern in wet watersheds," she said. "But we see a really strong increase in streamflow with increasing snow persistence in dry areas, like Colorado." (A toy version of such a fit follows this item.)

CSU researchers also explored snow persistence in middle to lower elevations, which are often ignored in snow research, said Hammond. "Half of the streamflow for the Upper Colorado River Basin came from a persistent snowpack above 10,000 feet," he said. "The snow-packed areas above 10,000 feet are really small and are also very isolated across the West. The middle to lower elevations don't accumulate as much snow, but they cover much more area." Streamflow in the Upper Colorado River Basin showed a reliance on snow persistence in these lower-elevation areas, according to the study. The researchers said this highlights the need to broaden research beyond the snow at high elevations, so as not to miss important changes in lower-elevation snowpack that also affect streamflow.
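A minimal sketch of the kind of relationship the CSU team examined: regress annual streamflow on snow persistence and inspect the slope. All numbers below are invented for a hypothetical dry watershed; the study's actual analysis of MODIS and USGS gauge data is far more involved.

    import numpy as np

    # Snow persistence (fraction of the year with snow cover) vs. annual flow.
    snow_persistence = np.array([0.15, 0.25, 0.35, 0.45, 0.55, 0.65])
    annual_flow_mm   = np.array([  40,   90,  150,  220,  300,  390])

    slope, intercept = np.polyfit(snow_persistence, annual_flow_mm, 1)
    print(f"flow ~ {slope:.0f} * persistence + {intercept:.0f} mm")

    # In a wet watershed the study found essentially no such pattern,
    # so a fit like this would show a slope near zero.

A strongly positive slope in dry watersheds, and a near-flat one in wet watersheds, is the contrast Kampf describes in the quote above.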
  14. Alligators on the beach. Killer whales in rivers. Mountain lions miles from the nearest mountain. In recent years, sightings of large predators in places where conventional wisdom says they "shouldn't be" have increased, in large part because local populations, once hunted to near-extinction, are rebounding -- thanks to conservation. Many observers have hypothesized that as these populations recover, the predators are expanding their ranges and colonizing new habitats in search of food. A Duke University-led paper published today in the journal Current Biology suggests otherwise. It finds that, rather than venturing into new and alien habitats for the first time, alligators, sea otters and many other large predators -- marine and terrestrial species alike -- are re-colonizing ecosystems that used to be prime hunting grounds for them before humans decimated their populations, and well before scientists started studying them.

"We can no longer chalk up a large alligator on a beach or coral reef as an aberrant sighting," said Brian Silliman, Rachel Carson Associate Professor of Marine Conservation Biology at Duke's Nicholas School of the Environment. "It's not an outlier or short-term blip. It's the old norm, the way it used to be before we pushed these species onto their last legs in hard-to-reach refuges. Now, they are returning."

By synthesizing data from recent scientific studies and government reports, Silliman and his colleagues found that alligators, sea otters, river otters, gray whales, gray wolves, mountain lions, orangutans and bald eagles, among other large predators, may now be as abundant or more abundant in "novel" habitats than in traditional ones. Their successful return to ecosystems and climatic zones long considered off-limits or too stressful for them upends one of the most widely held paradigms of large animal ecology, Silliman said. "The assumption, widely reinforced in both the scientific and popular media, is that these animals live where they live because they are habitat specialists. Alligators love swamps; sea otters do best in saltwater kelp forests; orangutans need undisturbed forests; marine mammals prefer polar waters. But this is based on studies and observations made while these populations were in sharp decline. Now that they are rebounding, they're surprising us by demonstrating how adaptable and cosmopolitan they really are," Silliman said.

For instance, marine species such as stingrays, sharks, shrimps, horseshoe crabs and manatees now make up 90 percent of alligators' diet when they're in seagrass or mangrove ecosystems, showing that gators adapt very well to life in a saltwater habitat. The unanticipated adaptability of these returning species presents exciting new conservation opportunities, Silliman stressed. "It tells us these species can thrive in a much greater variety of habitats. Sea otters, for instance, can adapt and thrive if we introduce them into estuaries that don't have kelp forests. So even if kelp forests disappear because of climate change, the otters won't," he said. "Maybe they can even live in rivers. We will find out soon enough." As top predators return, the habitats they re-occupy also see benefits, he said. For instance, introducing sea otters to estuarine seagrass beds helps protect the beds from being smothered by epiphytic algae that feed on excess nutrient runoff from inland farms and cities.
The otters do this by eating Dungeness crabs, which otherwise eat too many algae-grazing sea slugs that form the bed's front line of defense. "It would cost tens of millions of dollars to protect these beds by re-constructing upstream watersheds with proper nutrient buffers," Silliman said, "but sea otters are achieving a similar result on their own, at little or no cost to taxpayers."
  15. Every 405,000 years, gravitational tugs from Jupiter and Venus slightly elongate Earth's orbit, an amazingly consistent pattern that has influenced our planet's climate for at least 215 million years and allows scientists to more precisely date geological events like the spread of dinosaurs, according to a Rutgers-led study. The findings are published online today in the Proceedings of the National Academy of Sciences. "It's an astonishing result because this long cycle, which had been predicted from planetary motions through about 50 million years ago, has been confirmed through at least 215 million years ago," said lead author Dennis V. Kent, a Board of Governors professor in the Department of Earth and Planetary Sciences at Rutgers University-New Brunswick. "Scientists can now link changes in the climate, environment, dinosaurs, mammals and fossils around the world to this 405,000-year cycle in a very precise way."

The scientists linked reversals in the Earth's magnetic field -- when compasses point south instead of north and vice versa -- to sediments with and without zircons (minerals with uranium that allow radioactive dating) as well as to climate cycles. "The climate cycles are directly related to how the Earth orbits the sun, and slight variations in sunlight reaching Earth lead to climate and ecological changes," said Kent, who studies Earth's magnetic field. "The Earth's orbit changes from close to perfectly circular to about 5 percent elongated every 405,000 years." (A small inverse-square illustration of what that elongation does to sunlight follows this item.)

The scientists studied the long-term record of reversals in the Earth's magnetic field in sediments in the Newark basin, a prehistoric lake that spanned most of New Jersey, and in sediments with volcanic detritus including zircons in the Chinle Formation in Petrified Forest National Park in Arizona. They collected a core of rock from the Triassic Period, some 202 million to 253 million years ago. The core is 2.5 inches in diameter and about 1,700 feet long, Kent said. The results showed that the 405,000-year cycle is the most regular astronomical pattern linked to the Earth's annual turn around the sun, he said.

Prior to this study, dates to accurately time when magnetic fields reversed were unavailable for 30 million years of the Late Triassic. That's when dinosaurs and mammals appeared and the Pangea supercontinent broke up. The break-up led to the Atlantic Ocean forming, with the sea floor spreading as the continents drifted apart, and a mass extinction event that affected dinosaurs at the end of that period, Kent said. "Developing a very precise time-scale allows us to say something new about the fossils, including their differences and similarities in wide-ranging areas," he said.
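The quoted "about 5 percent elongated" invites a quick inverse-square illustration. Reading that phrase as an orbital eccentricity near 0.05 is an assumption on my part, and this is not the study's climate analysis; the point is simply that a more elongated orbit widens the gap between sunlight received at closest and farthest approach.

    def perihelion_aphelion_flux_ratio(e: float) -> float:
        """Ratio of solar flux at perihelion to aphelion for eccentricity e.
        Flux scales as 1/r**2, with r_min = a*(1-e) and r_max = a*(1+e)."""
        return ((1 + e) / (1 - e)) ** 2

    # Today's eccentricity vs. the assumed elongated end of the 405,000-year cycle.
    for e in (0.0167, 0.05):
        print(f"e = {e}: perihelion flux is {perihelion_aphelion_flux_ratio(e):.2f}x aphelion")

At today's eccentricity the perihelion-aphelion contrast is about 7 percent; near the assumed elongated extreme it grows to roughly 22 percent, the sort of modulation that can leave climate fingerprints in sediment records.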
  16. About 700 million years ago, the Earth experienced unusual episodes of global cooling that geologists refer to as "Snowball Earth." Several theories have been proposed to explain what triggered this dramatic cool down, which occurred during a geological era called the Neoproterozoic. Now two geologists at The University of Texas at Dallas and UT Austin suggest that those major climate changes can be linked to one thing: the advent of plate tectonics. The research was published online in December 2017 and in the April print edition of the journal Terra Nova. Plate tectonics is a theory formulated in the late 1960s that states that the Earth's crust and upper mantle -- a layer called the lithosphere -- is broken into moving pieces, or plates. These plates move very slowly -- about as fast as your fingernails and hair grow -- causing earthquakes, mountain ranges and volcanoes. "Earth is the only body in our solar system known to currently have plate tectonics, where the lithosphere is fragmented like puzzle pieces that move independently," said Dr. Robert Stern, professor of geosciences in UT Dallas' School of Natural Sciences and Mathematics, and co-author of the study, along with Dr. Nathaniel Miller, a research scientist in UT Austin's Jackson School of Geosciences who earned his PhD in geosciences from UT Dallas in 1995. "It is much more common for planets to have an outer solid shell that is not fragmented, which is known as 'single lid tectonics'," Stern said. Geoscientists disagree about when the Earth changed from single lid to plate tectonics, with the lithosphere fragmenting from one plate into two plates, and so on, to the present global system of seven major and many smaller plates. But Stern highlights geological and theoretical evidence that plate tectonics began between 800 million and 600 million years ago, and has published several articles arguing for this timing. In the new study, Stern and Miller provide new insights by suggesting that the onset of plate tectonics likely initiated the changes on Earth's surface that led to Snowball Earth. They argue that plate tectonics is the single event that can account for all 22 mechanisms that other scientists have advanced as triggers of the Neoproterozoic Snowball Earth. "We went through the literature and examined all the mechanisms that have been put forward for Snowball Earth," Stern said. "The start of plate tectonics could be responsible for each of these explanations." The onset of plate tectonics should have disturbed the oceans and the atmosphere by redistributing continents, increasing explosive arc volcanism and stimulating mantle plumes, Stern said. "The fact that strong climate and oceanographic effects are observed in the Neoproterozoic time is a powerful supporting argument that this is indeed the time of the transition from single lid to plate tectonics," Stern said. "It's an argument that, to our knowledge, hasn't yet been considered." "In the present day, climate is in the news because we're changing it by putting more carbon dioxide into the atmosphere," Stern said. "But imagine a time when Earth didn't have plate tectonics, and it then evolved to have plate tectonics -- that would have been a major shift in the Earth's operating system, and it would have had a huge effect on climate, too."
  17. In the next-generation OPIR program, the Air Force wants to show it can take years off the typical timeline for launching a new constellation. WASHINGTON — The Air Force on Friday released a “notice of intent” to sole-source contracts to Lockheed Martin and Northrop Grumman for the next-generation overhead persistent infrared program. The five-satellite constellation known as the next-generation OPIR will succeed the current Space Based Infrared System. The Air Force wants a new system that is more survivable against emerging threats. Lockheed Martin will be responsible for three geosynchronous orbit satellites and Northrop Grumman for two polar orbit satellites. In a news release, the Air Force said it is implementing “rapid procurement authorities” and is targeting the first next-generation OPIR launch in 2023. The Air Force did not provide any estimates on the value of the contracts. That may not be known for months, as the notice of intent only marks the beginning of the negotiations with both contractors. Neither company would comment on the Air Force announcement. It is not unusual for the Pentagon to publicly notify an intent to sole-source a contract, but this is an especially sensitive program where the Air Force wants to show it can take years off the typical timeline for launching a new constellation. Negotiating deals for major military systems can take several months, so revealing the intent to sole-source now can help speed up the contracting process. The Air Force’s goal of putting up a new constellation in five years, however, seems ambitious. A 2023 launch is the target for the first geosynchronous satellite. The first polar orbit satellite would launch in 2027. And the entire system, known as the “block 0 architecture,” would be on orbit by 2029, according to documents. The Air Force late Friday published two “presolicitation” notices of intent to sole-source OPIR satellite development and production. Lockheed Martin’s contract will be for geosynchronous orbit space vehicles 1 through 3. Northrop Grumman’s contract will be for polar orbit space vehicles 1 and 2. Both deals will be predominantly “cost plus incentive fee.” The Air Force informed Congress in February it wanted to end the procurement of Lockheed-made SBIRS satellites after vehicle 6 and shift the funds previously allocated for SBIRS 7 and 8 to develop a new system. This was viewed by some analysts as a game changer and a sign of a potential shakeup in the military satellite market. But the Air Force’s decision to sole-source the next-generation OPIR further solidifies Lockheed Martin’s dominance. Northrop Grumman provides the SBIRS payloads and is Lockheed’s primary subcontractor. Giving Northrop Grumman a share of the satellites also strengthens the company’s foothold in the program even if there is a future payload competition. The Air Force Space and Missile Systems Center’s Remote Sensing Systems Directorate said opening up the program to new entrants was not a realistic option given the urgency of the program. “Based on market research, an award to any other source would result in an unacceptable delay in fulfilling the Air Force’s critical and urgent requirements and substantial duplication of costs to the government that is not expected to be recovered through competition,” said the presolicitation. The sole-source decision announced on Friday should be no surprise to anyone who has followed the Air Force’s efforts to replace SBIRS.
In November, the Remote Sensing Directorate informed contractors of a need to address an “unusual and compelling urgency” for a new missile-warning system and stated Lockheed was the only company qualified to do the job within the required timeline. Lockheed is the sole producer of Air Force-validated nuclear hardened spacecraft that can meet government requirements and “urgent need dates,” the directorate said. The November request for information said the government was “considering soliciting and negotiating a sole source contract” with Lockheed Martin for the entire block 0 system, including all five satellites. The RFI indicated that the full architecture would be operational by Fiscal Year 2029, with an initial launch capability in Fiscal Year 2025. It was known from the November solicitation that the new system would have five geosynchronous and two polar orbit satellites “to counter emerging threats while operating in a contested environment.” What has changed since November? The initial launch of the first GEO satellite is being moved up by two years to 2023, and the satellite contract was split to give Northrop Grumman the polar spacecraft. Regardless of how quickly the Air Force develops the future missile warning satellites, officials caution that this program is only the first step toward long-term changes in how space systems are acquired. The next-generation missile warning constellation will be a “pacesetter” for learning to speed up traditional acquisitions, said Assistant Secretary of the Air Force for Acquisition Will Roper. He hopes to see a “switch in the mindset” of procurement managers as they try to balance the need to deliver on time with a “reasonable amount of experimentation and prototyping.” Developing, producing and launching into orbit a new constellation in five years is “aggressive,” said Roper. “Five to six years is a gold medal.” Whether it’s five or six years, the idea is to start changing the thinking “so program managers can take advantage of discovery in getting things right but can hedge their bets in case something goes wrong.” Deputy Assistant Secretary of Defense for Space Policy Stephen Kitay gave the Air Force props for “working hard” to make space systems more resilient. “The next-generation strategic missile warning system is part of a transition to a future overhead persistent infrared architecture that implements new resiliency features,” Kitay said on Friday at a Mitchell Institute event on Capitol Hill. “There is not a ‘one size fits all’ solution to mission assurance,” he said. “Just as there are a variety of threats and missions, we’ll need a variety of capabilities,” he said. “We’re going to have to bring creativity and innovation to this problem. And we’re working on going faster.”
  18. DoD has deep expertise in “space situational awareness,” or SSA, whereas Commerce faces a steep learning curve. WASHINGTON — A new policy said to be on President Trump’s desk for final approval would designate the Department of Commerce as the public face of space traffic management. The job of policing space currently is performed by the Department of Defense. It involves answering queries from private citizens, corporations and foreign governments about the position of satellites on orbit, and warning agencies and commercial satellite operators of potential orbital collisions. For the past several years the Pentagon had prepared to turn these responsibilities over to the Federal Aviation Administration, but the Trump administration decided Commerce was a better fit in light of the booming private space economy. Although there is overall agreement on the transfer of responsibilities, the specifics of who will do what may take years to sort out. DoD has deep expertise in “space situational awareness,” or SSA, whereas Commerce faces a steep learning curve. DoD holds the SSA authority and public service responsibility under section 2274 of Title 10 of the U.S. Code. In advance of the president signing off on the new policy, the House Armed Services Committee is weighing in. The chairman’s mark of the Fiscal Year 2019 National Defense Authorization Act unveiled on Monday includes language that lays a path for the implementation. Section 1603 of the HASC chairman’s bill — titled “Space Situational Awareness Services and Information” — would amend section 2274 by “terminating the authority of the Department of Defense to provide space situational awareness data to commercial and foreign entities on January 1, 2024.” This section also would require the secretary of defense to hire a federally funded research and development center to assess which department or departments should assume the authorities of section 2274 of title 10. The secretary of defense would have to develop a plan to “ensure that one or more departments may provide space situational awareness services to foreign governments.” Doug Loverro, former deputy assistant secretary of defense for space policy during the Obama administration, said this language “suggests that Congress is looking for a smooth transition without a break in service.” This provision is not inconsistent with the administration’s plan to move SSA services to Commerce, Loverro told SpaceNews. “The good news is that several years ago this same committee was clearly against DoD ever losing control of this vital function. So I would say that this represents true progress in moving this inherently civil function to a civil agency.” The decision to assign space traffic management to the Commerce Department was revealed by Vice President Mike Pence last month at the Space Symposium in Colorado Springs. He said Commerce will be instructed to “provide a basic level of space situational awareness for public and private use, based on the space catalog compiled by the Department of Defense, so that our military leaders can focus on protecting and defending our national security assets in space.” The space catalog is maintained by the Joint Space Operations Center at Vandenberg Air Force Base, which is part of U.S. Strategic Command. The center operates the Space Surveillance Network, a worldwide system of ground-based radars along with ground-based and orbital telescopes.
SSA mission is complex
Deputy Assistant Secretary of Defense for Space Policy Stephen Kitay said the White House directive is an “important step forward in light of the increased commercial activity in space and the need for DoD to focus on its warfighting missions.” “We’re excited about the new partnership with Commerce on space traffic management,” Kitay said last week at a Mitchell Institute event on Capitol Hill. Kitay cautioned he did not want to get ahead of the White House and would not discuss specifics of the transition. He noted that SSA is a complex mission. There are 1,500 active satellites on orbit today. DoD monitors about 20,000 objects in space that are 10 cm or larger. Plus there are hundreds of thousands of objects that are smaller. The clutter could soon reach alarming proportions. “You’ve seen the business plans coming forward. If they come to fruition, people are discussing constellations of thousands of satellites. These objects are moving at thousands of miles per hour. Some of these small satellites may not have active propulsion on them,” said Kitay. “How do we think about the long term safety, sustainability of this domain so we can all benefit from it? This is an area that is going to require a lot of attention.” The Pentagon will remain closely involved in this mission, even with Commerce as the public face of SSA, Kitay said. “Our partnership with Commerce is not going to be us saying, ‘Commerce, this is now yours, and SSA is your problem.’ SSA is a critical mission for us, and we will continue to maintain those resources and sensors.” A number of details will have to be worked out over time, Kitay noted, including the role of the private sector in SSA. “A lot is happening out there with companies,” he said. “We want to make sure we take advantage of advances in commercial technology.” Private space surveillance companies are enthused by the transition of SSA to Commerce, said Travis Langster, vice president of business development at AGI Inc., a provider of space data and analytical tools. Companies in this sector for years have been frustrated by DoD’s resistance to opening up the market to the private sector. “Moving to Commerce is a good thing,” Langster said in an interview. “The FAA took it as far as they could.” Commerce has said it wants to promote innovation, he said. “That is a big mantra for companies like us that provide commercial SSA services.” DoD today offers a “basic level of service, which needs to be enhanced,” said Langster. “The space traffic management is not adequate for commercial use.” The industry would like to see this policy executed in a way that advances, not stifles, the progress of technology, Langster said. “The current model pits the commercial SSA marketplace against the DoD, and does not allow the commercial SSA market to fully form.” The DoD catalog is “critical for national security,” said Langster. “But there is an opportunity to enhance that catalog with other types of information.” DoD and Commerce will have different SSA priorities, said John Monahan, senior vice president of Kratos Defense. The company provides radio-frequency monitoring services for military SSA. “DoD is going to have to continue being involved in space traffic management,” he said. “Commerce is going to have to have an SSA tool but they will not completely own the SSA mission. SSA is too critical and too deeply integrated. It’s a warfighting mission.
Both organizations are going to need it, but each has different needs.” When Commerce takes over SSA duties, there will be much greater appetite for use of commercial sensors, said Greg Caicedo, Kratos’ vice president of data and network solutions. The population of data and analysis vendors is starting to expand in the United States and internationally, he said. And many countries increasingly worry about how they will be able to detect nefarious acts in space. “What is the intent? How do we tell commercial activity from aggression?” Those are some of the questions being asked, said Caicedo. “By having a civilian agency take over space traffic management we can have greater international cooperation and transparent communications.”
Debate goes back years
Loverro recalled that the debate over space traffic management started to pick up steam about five years ago. “The plan had been to go ahead and eventually transfer the responsibility to the FAA.” Pilot programs were kicked off to figure out a transition. But at one point the FAA decided it didn’t want this job and Trump’s Commerce Secretary Wilbur Ross thought it made sense for his agency to take it. There are still questions on who will perform specific duties such as operating sensors, Loverro noted. When there are two satellites coming close to each other, somebody has to “task” a sensor to go take measurements and calculate distances between objects. “That kind of back and forth between the calculations and the tasking has to happen,” he said. “How that’s mechanized we don’t know yet. Commerce will have access to the data but will they be able to task DoD sensors around the world, or will they hire commercial firms that have sensors?” Loverro said Commerce could outsource these duties, as companies do this type of work routinely. “Whether or not that’s the direction Commerce will go remains to be seen.” Earl Comstock, director of the Commerce Department’s Office of Policy and Strategic Planning, said DoD has “done a fantastic job providing a public service tracking objects in space.” Now there are more actors, and they would like to have a civilian agency undertake that responsibility, he said at a recent Hudson Institute conference on space policy. “They’ll continue to be involved,” Comstock said of DoD. “They have the resources to track these capabilities. But the public face of this should be a civilian agency.” The National Space Council will work on many of the transition details, said Comstock. The challenge: “How do we maintain the DoD’s resources but relieve them of the burden of having to interface so much with the public?” The reality is that Commerce or any other civilian department the White House might pick doesn’t have the resources to do SSA, said Comstock. “DoD will continue to do that, but Commerce believes it should interface with the public and the industry as a means of attracting people to the United States as a place to come and do business,” he said. “We also want to set a standard for the rest of the world … and unite the club of ‘responsible actors.’”
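Loverro’s point about tasking sensors to “calculate distances between objects” is, at bottom, a closest-approach problem. Below is a minimal sketch of that geometry (my illustration, not DoD’s screening software), assuming straight-line relative motion over a short window; real conjunction assessment propagates full orbits with uncertainty.

```python
import numpy as np

def closest_approach(r1, v1, r2, v2):
    """Minimum separation of two objects under straight-line motion.
    r*, v*: position (km) and velocity (km/s) vectors.
    Returns (miss_distance_km, time_of_closest_approach_s).
    A toy model: real screening propagates full orbits with covariance."""
    dr = r2 - r1                                      # relative position
    dv = v2 - v1                                      # relative velocity
    if np.allclose(dv, 0.0):
        return float(np.linalg.norm(dr)), 0.0
    t_min = -float(np.dot(dr, dv) / np.dot(dv, dv))   # minimizes |dr + t*dv|
    t_min = max(t_min, 0.0)                           # only look forward in time
    miss = float(np.linalg.norm(dr + dv * t_min))
    return miss, t_min

# Hypothetical encounter; all values invented for illustration.
r1 = np.array([7000.0,   0.0, 0.0]); v1 = np.array([0.0, 7.5, 0.0])
r2 = np.array([7005.0, -30.0, 0.0]); v2 = np.array([0.0, 7.6, 0.0])
miss_km, tca_s = closest_approach(r1, v1, r2, v2)
print(f"Miss distance: {miss_km:.2f} km at t+{tca_s:.0f} s")  # 5.00 km at t+300 s
```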
  19. WASHINGTON — A House appropriations bill released May 8 offers more than $21.5 billion for NASA in fiscal year 2019, a significant increase over both what the agency received in 2018 and what the White House proposed for 2019. The bill, released by the House Appropriations Committee on the eve of its markup by the commerce, justice and science (CJS) subcommittee, includes $21.546 billion for NASA. That is an increase of more than $1.65 billion over the administration’s request and $810 million more than what NASA received in the fiscal year 2018 omnibus spending bill passed in March. The bill “continues NASA’s record-level funding, setting the agency on the trajectory to rise above and beyond the glory days of Apollo,” Rep. John Culberson (R-Texas), chairman of the CJS appropriations subcommittee, said in a statement about the bill. Spending for the various accounts within the NASA budget, such as science, exploration and aeronautics, is specified in the bill, but it includes few details about how funding should be apportioned within those accounts. Those details will likely wait until the report accompanying the bill is released when the full committee takes up the bill. That means the bill is silent on several major programs proposed for cancellation by the administration in its budget request, including the Wide-Field Infrared Survey Telescope and four Earth science missions, none of which are explicitly mentioned in the bill. The bill also does not discuss potential cost overruns for the James Webb Space Telescope, whose launch was delayed in March by about a year to May 2020. The bill instead leaves in place language about the mission’s $8 billion cost cap also found in previous years’ bills. The bill, though, does specify funding for some programs. It calls for spending $545 million on the Europa Clipper mission and $195 million for a follow-on lander. NASA requested only $264.7 million for Europa Clipper and nothing for the lander. NASA said in the budget proposal it was seeking to launch Europa Clipper in 2025 on a commercial vehicle, while the bill calls for the use of the Space Launch System and a launch by 2022. In its budget proposal, NASA estimated needing $565 million in 2019 to keep Europa Clipper on track for a 2022 launch but warned of “potential impacts to the rest of the Science portfolio” if funded at that level. The bill includes $1.35 billion for Orion and $2.15 billion for SLS, the same funding those exploration programs received in 2018. NASA requested slightly less for each: $1.164 billion for Orion and $2.078 billion for SLS. The bill fully funds the administration’s request for the Lunar Orbital Platform-Gateway, at $504 million in 2019. In the committee’s statement about the bill, it said other elements of NASA’s lunar exploration programs were also fully funded, including $218 million in science for lunar missions, $116.5 million in advanced exploration systems for cislunar and lunar surface capabilities, and $150 million in the LEO and spaceflight operations account to begin the transition of the ISS to commercial alternatives. Those other spending levels were not included in the bill. The same bill also includes funding for the National Oceanic and Atmospheric Administration, part of the Department of Commerce. The bill did not go into details about NOAA satellite programs, but the statement about the bill says it fully funds both the Geostationary Operational Environmental Satellite (GOES) and Joint Polar Satellite System (JPSS) programs.
NOAA requested $408.4 million for GOES and $878 million for JPSS; the JPSS request now includes the Polar Follow-On program, funding for the third and fourth JPSS satellites that was previously a separate line item.
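As a cross-check on the bill’s headline numbers, the quoted deltas imply the baselines the committee is measuring against (my arithmetic, treating the “more than $1.65 billion” figure as exact):

```python
# Recovering the comparison baselines from the deltas quoted above.
bill_total_b = 21.546                       # House CJS bill total for NASA, $B
implied_request_b = bill_total_b - 1.65     # FY2019 administration request
implied_fy2018_b = bill_total_b - 0.81      # FY2018 omnibus level
print(f"Implied FY2019 request: ${implied_request_b:.2f}B")  # ~$19.90B
print(f"Implied FY2018 enacted: ${implied_fy2018_b:.2f}B")   # ~$20.74B
```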
  20. WASHINGTON — When it comes to SpaceX’s Falcon 9 rocket, the company’s most daring customers have been NASA and satellite fleet operator SES. Now add Bangladesh to that mix. NASA was SpaceX’s first customer after Falcon 9’s debut flight in 2010, taking its next four consecutive launches for International Space Station resupply missions. SES of Luxembourg was the first satellite operator to trust SpaceX with the launch of a multimillion-dollar geostationary communications satellite. Following the success of that 2013 mission (and many others), SES backed SpaceX last year by launching on the first Falcon 9 to use a previously flown first stage. On Thursday, the Bangladesh Telecommunication Regulatory Commission (BTRC) will be the first customer for the Falcon 9 Block 5 — SpaceX’s final and most powerful version of the rocket. The Falcon 9 Block 5 includes upgrades to meet NASA commercial crew requirements and U.S. Defense Department criteria. The Block 5 is designed for 10 or more flights using the same first stage booster; previous versions were only designed to handle two or three flights. NASA and SES both had, and still have, motivation to take bold bets on SpaceX — NASA to cultivate private sector launch options for near-Earth activities, and SES to drive down prices in the global launch sector with new competition. Tomorrow’s launch of Bangladesh’s 3,500-kilogram Bangabandhu-1 satellite will be SpaceX’s 54th Falcon 9 mission, and will mark what the company hopes is the beginning of a new chapter of accelerated reusability. So how did Bangladesh become the customer of SpaceX’s next big milestone? “It honestly just happened,” Sajeeb Wazed, Bangladesh’s honorary adviser to the prime minister for information and communications technology, told SpaceNews. “That was basically SpaceX’s choice and we were fine with that.” Wazed, who is the son of Bangladesh’s prime minister, Sheikh Hasina, said BTRC required each bidder angling to build Bangabandhu-1 to also include a first-choice launch vehicle and a backup. The winning bid from Thales Alenia Space listed Arianespace’s Ariane 5 as the default launcher, he said. SpaceX’s Falcon 9 was the runner-up. Bangladesh deemed schedule certainty one of its biggest criteria — a factor that would typically play to the advantage of SpaceX’s rivals. Arianespace, when prevented from launching for five weeks last spring by local protests that blockaded Europe’s spaceport in French Guiana, caught up on three delayed missions in two months, preventing cascading delays on its manifest. But Arianespace couldn’t guarantee Bangladesh that its satellite would launch Dec. 16, 2017 — Bangladesh’s “National Victory Day” commemorating the surrender of Pakistani forces during the Bangladesh Liberation War in 1971. Ariane 5 rockets typically carry two satellites at a time, a larger satellite in the upper berth and a smaller satellite in the lower berth. Sharing launch vehicles lowers the cost for satellite operators, but requires their schedules to be in sync. Without that synchronization, delays can ensue. “Because the size of our satellite only fits in the lower berth of Ariane and they couldn’t guarantee us a launch slot by December, we had them switch to the backup,” Wazed said. “SpaceX wanted us to go on the Block 5 and we were OK with that.” SpaceX obviously did not meet BTRC’s desired launch date, either.
While 2017 was the launch provider’s most successful year with 18 missions, much of that was playing catch-up on launches delayed by Falcon 9 production strains and failures in 2015 and 2016. Wazed said he strove to temper expectations about a Victory Day launch, but to no avail. “I told everybody that wasn’t realistic, but it’s OK, we will try, you know?” Bangabandhu-1 is named after Bangabandhu Sheikh Mujibur Rahman, the assassinated founder of Bangladesh and Wazed’s grandfather. The satellite carries 26 Ku-band transponders and 14 C-band transponders for television and broadband communications services for the nation and surrounding regions.
  21. WASHINGTON — With uncertainty about the future of two large space telescopes, NASA is continuing to suggest that the next decadal survey for astrophysics be postponed, a move opposed by many astronomers. Recently, the Cosmic Origins Program Analysis Group, one of three advisory groups chartered by NASA to support the agency’s astrophysics program, sent out a questionnaire to astronomers asking for their thoughts about delaying the next survey, currently scheduled for release in late 2020. The questionnaire states that NASA officials, including Thomas Zurbuchen, associate administrator for science, “are concerned that the next decadal survey committee may not be able to effectively prioritize missions in the next decade due to uncertainties in the status of JWST and WFIRST.” A two-year delay of that study, they argued, would mitigate those concerns, but they added they are open to alternatives. The decadal survey identifies priorities in astrophysics research and prioritizes both ground-based and space-based observatories to carry out that work. Of particular interest is what the survey identifies as the top-priority flagship, or large, mission for the next decade; missions that in the past have included the James Webb Space Telescope and Wide-Field Infrared Survey Telescope (WFIRST). The current schedule calls for starting the next decadal survey, known as Astro2020, around the end of this year with the selection of a chair of the committee that leads the study. That schedule would lead to the completion and release of the report by late 2020. That schedule was intended to accommodate earlier plans for JWST, which was to launch in October 2018. The telescope’s early science could then have been incorporated in the Astro2020 committee’s work to refine its priorities. With JWST now scheduled to launch around May of 2020, that is no longer possible under the current schedule. WFIRST, the top priority in the previous decadal survey released in 2010, is also shrouded in uncertainty. The administration proposed cancelling the mission in its fiscal year 2019 budget request, even as the project was incorporating revisions to lower its estimated cost to $3.2 billion. Immediately after NASA announced the latest delay in JWST, Zurbuchen proposed delaying the Astro2020 study by two years. At a May 2 meeting of the National Academies’ Space Studies Board, he reiterated his desire to delay the decadal to avoid a “missed decade” in astrophysics. “I think it will be easier to do after Webb flies and is successful,” he said. “It’s very hard to do a visionary and a great decadal while half the decade is already allocated for, and some of the big strategic missions have not cleared the queue.” Paul Hertz, director of NASA’s astrophysics division, offered a similar assessment at the meeting. “NASA’s highest priority isn’t to delay the decadal survey,” he said. Instead, he said the agency’s priority was an “ambitious” survey “that provides the government with the priorities we need to lead the world in doing astrophysics in space.” Many astronomers have no desire to delay Astro2020. “We have very good knowledge of what to expect when we get on orbit” with JWST, said Marcie Rieke, co-chair of the Committee on Astronomy and Astrophysics, at the Space Studies Board meeting May 2. On JWST’s current launch schedule of May 2020, she said, the first “early release science” results from the telescope would be published in late 2020, around the time Astro2020 is released.
“When you think about that, that’s very nice timing,” she said. “If you have a decadal survey that projects doing more big stuff, but you can show gorgeous results from the one you just finished, that’s a nice sales tactic.” Any decision to delay Astro2020 would have to be coordinated with other agencies that support the study, such as the National Science Foundation, which uses it for planning ground-based observatories. However, shortly after NASA announced the latest JWST delay, an NSF official said the decadal should remain on schedule. “We’re going to have to have some more discussion among the three funding agencies,” which include the Department of Energy, Rieke said May 2. “The NSF has made clear that they would like to start on time. Anecdotal discussions among various astronomers imply that astronomers would like to move ahead and start on roughly the current schedule.” Zurbuchen said he would follow the desires of the astronomical community regarding the schedule for Astro2020. “At the end, the decadal is the community’s decadal,” he said. “I will follow what the community says.”
  22. WASHINGTON — SpaceX has set an ambitious goal for 2019: using the same Falcon 9 booster to conduct two launches in 24 hours. Such a feat would require more than just the rapid turnaround of Falcon 9’s reusable first-stage booster. It would also require a rapid turnaround of Air Force range support and some speedy payload integration — assuming SpaceX doesn’t want to launch an empty fairing the second time around. But Elon Musk, SpaceX’s founder and chief executive, has never been shy about setting bold goals. “We intend to demonstrate two orbital launches of the same Falcon 9 vehicle within 24 hours no later than next year,” Musk said May 10 during a call with reporters. “That will be, I think, truly remarkable to launch the same orbit-class rocket twice in one day.” Musk mentioned the goal in the hours leading up to the first launch attempt of the Block 5 Falcon 9, which is designed with a first stage that can launch 10 times without refurbishment. That launch, carrying Bangladesh’s first telecom satellite, Bangabandhu-1, was rescheduled for today after a last-minute glitch scrubbed the countdown with 58 seconds left on the clock. “Next year is when we intend … [to do] the same-day reflight of the same rocket,” Musk said. “I think that’s really a key milestone.” The ability to relaunch the same first stage in a single day would help SpaceX bolster its case that a used rocket is more reliable than a new one. SpaceX executives often reference air travel as a model for future launch activity. Musk reiterated that point. “Would you rather be flying in an aircraft that’s never had a test flight before, or would you rather fly in an aircraft that has flown many times successfully?” he said. Musk said he thinks customers eventually “will actually prefer to fly on a flight-proven rocket than one that has never flown.” SpaceX has given discounts to some early customers of Falcon 9 rockets with used first stages to ease their acceptance, particularly among risk-averse satellite operators who might otherwise be reluctant to launch a spacecraft costing $100 million or more on a rocket booster already subjected to the rigors of launch and landing. Musk said SpaceX lowered prices from “about $60 million to about $50 million for a reflown booster,” and expects “to see a steady reduction in prices” going forward. He cautioned, though, that SpaceX has lots of fixed costs, including its future Starlink satellite internet constellation and development of the Big Falcon Rocket (BFR), that require revenue from launches, meaning prices can only go so low. Ocean recoveries, which require sending drone ships out to sea for landing Falcon 9 first stages, also cost “a few million dollars,” he said. Given the extensive modifications made to Block 5, SpaceX will take extra time after the Bangabandhu-1 launch to disassemble and inspect the rocket. “Ironically, we need to take it apart to confirm that it does not need to be taken apart,” Musk said. “This rocket probably won’t refly for probably a couple of months. But by late this year we should be seeing substantial reflight of Block 5 vehicles, probably with Block 5 boosters seeing their third, maybe their fourth reflight.” Musk estimated the Falcon 9 Block 5 will make “something on the order of 300 flights” before retiring. SpaceX plans to succeed the Falcon 9 and Falcon Heavy with the BFR, and is targeting a cargo mission to Mars with the larger rocket in 2022.
Full Reusability
SpaceX has attempted, so far unsuccessfully, to recover the Falcon 9 payload fairings used to protect satellites on their ascent through the atmosphere. The company has also talked about retrieving the upper stage instead of letting it burn up over the Pacific Ocean. Musk said SpaceX won’t attempt fairing recovery on the Bangabandhu-1 mission, but is intent on saving the $6 million protective shrouds in the future. Upper stage recovery is a longer-term goal. “I’m certain we can achieve reusability of the upper stage, the question is simply what the mass penalty is,” he said. “We don’t want to put too much engineering effort into that relative to BFR, and we obviously will not take any action that creates risk for the ascent phase of the rocket.” Over the course of this year, SpaceX will gradually add thermal protection to the upper stage to optimize the stage for the return journey to Earth, Musk said. For near-term flights, Musk said the goal will be mainly to gather data such as reentry temperature, altitude and health, likely using Iridium Communications’ satellite constellation to relay the data. If SpaceX can reuse every part of the Falcon 9, “we would be able to reduce the cost for launch by an order of magnitude,” Musk said. “And then as our launch rate increases, we can further optimize the per-launch costs.” Musk estimated 60 percent of the Falcon 9’s marginal cost comes from the first stage, 20 percent from the second stage, 10 percent from the fairing, and 10 percent from everything else associated with the launch. Propellant costs a negligible $300,000 to $400,000, he said. Musk said it is possible to reduce the marginal cost of a Falcon 9 launch to “down under $5 or $6 million” in around three years.
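Musk’s percentages imply a simple marginal-cost picture. The sketch below is my arithmetic from the figures quoted above, not SpaceX’s accounting: it applies the 60/20/10/10 split to a roughly $50 million mission and zeroes out whichever elements are recovered (treating refurbishment as free, an idealization), which reproduces the quoted sub-$6 million floor before the $300,000 to $400,000 of propellant.

```python
# Toy marginal-cost model from the quoted figures: 60% first stage,
# 20% second stage, 10% fairing, 10% other launch costs, on ~$50M.
PRICE_M = 50.0
SHARES = {"first_stage": 0.60, "second_stage": 0.20,
          "fairing": 0.10, "launch_ops": 0.10}

def marginal_cost(reused):
    """Recurring cost if the named elements are recovered and reused
    at negligible refurbishment cost (an idealization)."""
    return sum(PRICE_M * share for name, share in SHARES.items()
               if name not in reused)

print(f"All-new vehicle:   ${marginal_cost(set()):.0f}M")                       # $50M
print(f"Booster reused:    ${marginal_cost({'first_stage'}):.0f}M")             # $20M
print(f"Booster + fairing: ${marginal_cost({'first_stage', 'fairing'}):.0f}M")  # $15M
print(f"Full reuse:        ${marginal_cost({'first_stage', 'second_stage', 'fairing'}):.0f}M")  # $5M
```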
  23. WASHINGTON — United Launch Alliance has picked Aerojet Rocketdyne’s RL10 engine to power the upper stage of its next-generation Vulcan rocket, the second such contract Aerojet has secured in as many months. In a May 11 statement, ULA said it will use a new variant of the RL10, known as the RL10C-X, on the upper stage of the Vulcan. That version incorporates changes, such as additive manufacturing of engine components, to improve its quality and affordability. The agreement continues a long relationship with Aerojet Rocketdyne, which also provides versions of the RL10 for the upper stages of ULA’s Atlas and Delta rockets. “We could not be more pleased to have selected the proven and reliable RL10 to power our Vulcan Centaur upper stage,” said Tory Bruno, president and chief executive of ULA, in a statement. The agreement covers the delivery of engines over the next decade. Terms of the agreement weren’t disclosed, but ULA described the deal as a “competitive procurement.” “Key determining factors to our selection included price and delivery schedule,” Bruno said in the statement. The agreement, Aerojet said in a separate statement, includes “a joint commitment to invest in next-generation engine development” by the companies, a reference to the RL10C-X. “The agreement also defines a path forward that will enable us to develop the next generation of RL10 engines that will incorporate additive manufacturing and other advanced technologies to make the engine more affordable while retaining its proven performance and reliability,” Eileen Drake, president and chief executive of Aerojet Rocketdyne, said. In a briefing with reporters last month at the 34th Space Symposium in Colorado Springs, Drake highlighted the use of additive manufacturing as a means to reduce the costs of the new RL10 variant. She declined to quantify that cost reduction, but said that in one case, the company was able to reduce the part count of a copper thrust chamber for the RL10 by 70 percent through the use of additive manufacturing. “You reduce the time to produce that part well in excess of 50 percent,” she said. “There’s less labor content, less supplier content, less parts that you have to put together that could cause an issue.” ULA did not disclose who Aerojet beat out for the Vulcan upper stage contract. However, the leading competitor is widely thought to be Blue Origin, which offered the BE-3U, an upper-stage version of the BE-3 engine currently used on its New Shepard suborbital vehicle. Blue Origin plans to use the BE-3U on the second stage of its New Glenn orbital launch vehicle, a recent design change. ULA also for a time worked on an engine project with XCOR Aerospace, a small company best known for its effort to develop the Lynx suborbital spaceplane. That engine, the 8H21, was at one point considered for use on the Vulcan’s upper stage, but XCOR suffered financial problems and went out of business last November. The Vulcan agreement is the second deal Aerojet has won for the RL10 in as many months. Orbital ATK announced April 16 that it will use the RL10C on the upper stage of OmegA, the vehicle formerly known as Next Generation Launch that the company is seeking an Air Force contract to build. Orbital had previously considered the BE-3U before opting for the RL10.
“It has an extensive flight history and provides a low-risk affordable engine,” said Mike Pinkston, deputy general manager of Orbital ATK’s launch vehicles division, in an interview last month about the selection of the RL10 for OmegA. While ULA has picked the engine that will go in Vulcan’s upper stage, it has yet to select the engine for the rocket’s first stage. That competition again pits Aerojet Rocketdyne, with its AR1 engine, against Blue Origin and its BE-4. Bruno, on multiple occasions, has said a decision will come “soon,” but has declined to offer a more specific schedule. “ULA continues its competitive procurement process for the booster engine and plans to make a down select soon,” the company said in its statement about choosing the RL10.
  24. WASHINGTON — OneWeb has shifted the debut launch for its satellite megaconstellation to the fourth quarter of the year. The startup’s first launch of 10 satellites aboard an Arianespace Soyuz rocket was scheduled for this month, but was pushed out toward the end of the year to allow for more testing and to incorporate improved components in the final spacecraft design. “Our production launches will start in Q4,” Greg Wyler, OneWeb’s founder, told SpaceNews. “We decided to continue with more ground testing and then go right into production because we can test virtually everything we need on the ground.” OneWeb is building the first 900 satellites of its constellation through a joint venture with Airbus called OneWeb Satellites. Wyler said the decision to delay was “based on which component revisions were available.” Backed by SoftBank, Intelsat, Coca-Cola and other investors, OneWeb is creating a constellation of small telecom satellites with the goal of making the internet accessible to everyone on Earth by 2027. Exactly how many satellites is still to be determined — OneWeb in March asked the U.S. Federal Communications Commission to expand its authorization from 720 to 1,980 Ku-band satellites. The company still expects to begin service in 2019, starting with the first few hundred spacecraft. “As long as we begin our production launches this year we are still on schedule,” Wyler said. Arianespace is launching the bulk of OneWeb’s first-generation constellation, and has 21 Soyuz launches contracted, along with options for more with Soyuz and the next-generation Ariane 6. In a January interview, Arianespace CEO Stephane Israel was noncommittal on how many OneWeb Soyuz launches the company would do this year, saying it would “launch at least once for OneWeb this year and maybe more.” “There is a saying commonly used in engineering: ‘the perfect is the enemy of the good,’” Wyler said. “There are always margins that you could increase with more testing and design modifications. We are using this time to increase our margins and also implement some improvements. We didn’t absolutely need to do everything we are doing, but after internal discussion, we are taking advantage of the timing opportunity to iterate.” Since OneWeb’s first launch will only have 10 satellites onboard, the rocket will travel straight to their 1,200-kilometer low Earth orbit instead of the 500-kilometer drop-off planned for subsequent flights, Wyler said. The near-direct insertion cuts a few months of orbit-raising time that would have relied on each satellite’s internal propulsion. Wyler estimated there might be a two-month gap between the first launch and the rest of the launch campaign, which consists of a Soyuz launch every 21 days. After the first launch, each Soyuz will carry 36 satellites, with some occasionally carrying 34, he said. OneWeb also has a contract with Virgin Orbit for 39 LauncherOne missions and a memorandum of understanding with Blue Origin for five New Glenn launches to supplement its primary Arianespace campaign. Neither of those vehicles has flown yet, however. Wyler said OneWeb’s first-generation satellites have “actually beat our plans” on mass, weighing in at around 145 kilograms each instead of the projected 150 kilograms. “We are at about 14.5 kilograms per Gbps. Each satellite is about the same performance as the satellites I designed for O3b, yet we are putting nine times as many on a rocket,” he said.
Prior to OneWeb, Wyler founded O3b Networks, which provides connectivity services through a constellation of medium-Earth-orbit satellites and has since been bought by Luxembourg-based fleet operator SES. Wyler said each OneWeb satellite provides nine times as much throughput per kilogram as an original O3b satellite. “Our next generation, which we are working on now, will see at least a 15X increase in performance,” he said.
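Wyler’s figures are internally consistent; a quick cross-check (my arithmetic, not OneWeb’s published specifications) shows each satellite works out to roughly 10 Gbps, or about 360 Gbps per fully loaded Soyuz.

```python
# Cross-checking the quoted OneWeb figures (illustrative arithmetic only).
sat_mass_kg = 145       # actual mass, vs. the projected 150 kg
kg_per_gbps = 14.5      # quoted mass efficiency

gbps_per_sat = sat_mass_kg / kg_per_gbps
print(f"Implied throughput per satellite: {gbps_per_sat:.0f} Gbps")   # ~10

sats_per_soyuz = 36     # planned for most production launches
print(f"Capacity per Soyuz launch: {gbps_per_sat * sats_per_soyuz:.0f} Gbps")  # ~360
```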
  25. Cardi B found herself in the middle of a mall ruckus after refusing to take a picture with fans last month (April 2018). Video has emerged of the Bodak Yellow star leaving the Fendi store at City Center in Las Vegas on 27 April when a scuffle broke out. Now, Lolita Beckford-Dawkins has claimed that she and a group of her friends approached Cardi and asked for a picture, but were allegedly ignored by the star. Upon being blanked, Lolita shouted "F**k Cardi, you ain’t s**t!" - which led to Cardi escalating things on her side. According to sources, Cardi, who is expecting her first child with fiance Offset, was attempting to explain to the women that she didn't like the way she looked, so she didn't want a picture taken. However, Lolita told The Blast that she and her pals felt as though Cardi was being "fake" - especially as she only responded to them when things got heated. In the video, mall security and several store employees are seen with Cardi inside the shop as the chaos broke out outside, with Lolita's husband stepping in to try to keep his wife from getting to the rapper. However, Lolita, a mother-of-five, insisted she would never hit a pregnant woman. Just weeks after the Vegas incident, Cardi took to Twitter to urge fans to respect her privacy and space after she was approached by autograph seeker Giovanni Arnold following a Met Gala party on 8 May. Arnold later told The Blast he approached the couple as they waited in the car outside The Mark Hotel, and Offset didn't like the way he approached Cardi and allegedly told three members of his entourage to "go after" him. He claimed he was jumped by three men who pushed him to the ground and beat him up before running away - with the alleged attack leading to him seeking medical help at a nearby hospital. Cardi later addressed the situation on her Twitter page, writing: "If you check my tag pics i take a lot of pics with fans .Some people are not fans &sometimes i don’t want no pics and i simply don’t want people too close cause of (my pregnancy) (sic). "I don’t know what are people’s intentions sooo i Be careful .Why can’t people respect that?”