Tipup's Content - Page 13 - InviteHawk - Your Only Source for Free Torrent Invites

Tipup

Advanced Members
  • Posts

    911
  • Joined

  • Last visited

  • Feedback

    0%
  • Points

    34,245

Everything posted by Tipup

  1. Researchers at Kanazawa University report in Communications Chemistry that certain pentagonal and hexagonal organic molecules exhibit self-sorting. The effect can be used to grow multilayered tubular structures that preserve the geometry of the initial cavities. Supramolecular assemblies are nanostructures resulting from molecules binding together, through intermolecular interactions, into larger units. One approach for controlling supramolecular assembly involves self-sorting: molecules recognizing copies of themselves, and binding with them. Now, the findings of an interdisciplinary collaboration between the Supramolecular group (Tomoki Ogoshi and coworkers) and the Atomic Force Microscopy (AFM) group (Hitoshi Asakawa, Takeshi Fukuma, and coworkers) of the Nano Life Science Institute (WPI-NanoLSI), Kanazawa University, showed that self-sorting behavior can arise from the principle of geometrical complementarity by shape: in a mixture of specific pentagonal and hexagonal molecular building blocks, pentagons bind to pentagons and hexagons to hexagons, and no mixing occurs. Asakawa and members of the AFM group conducted experiments with molecules called pillar[n]arenes, with n = 5 and n = 6, corresponding to pentagonal and hexagonal shapes, respectively. Both molecules come in two 'flavors': positively charged (cationic) or negatively charged (anionic). The polygonal molecules are essentially rings of 5 or 6 identical organic units, each featuring a benzene ring, but the composition of the units is different for the cationic and the anionic variants. Ogoshi and his colleagues of the Supramolecular group let cationic pillar[5]arenes (P[5]+ in shorthand notation) adsorb on a quartz substrate. From this structure, they were able to grow P[5]+/P[5]–/P[5]+/… multilayers by immersing it alternatingly in anionic and cationic pillar[5]arene solutions. The addition of a layer was verified each time by ultraviolet-visible spectroscopy measurements. The resulting overall structure is a 'nanomat' of tubular structures with pentagonal pores. Similar results were obtained for the pillar[6]arenes: stacks of alternating cationic and anionic layers of the hexagonal molecules could be easily fabricated. The arrangement of pillar[n]arenes on a surface was investigated in collaboration with Prof. Takanori Fukushima, Prof. Tomofumi Tada and co-workers from Tokyo Institute of Technology. What the scientists found surprising was that it was not possible to stack pentagonal and hexagonal building blocks when trying to build an anionic layer on a cationic one (and vice versa). This is a manifestation of self-sorting: only like polygons can self-assemble, even if ionic interactions drive the formation of cation–anion layered structures. The researchers also examined the structure of the first layer of P[5]+ or P[6]+ molecules on the quartz substrate. For the hexagonal molecules, the two-dimensional packing structure did not exhibit long-range structural order, whereas for the pentagonal molecules, it did. This is partly attributed to a lower density for the latter. For the multilayer 'nanomats', the same trend was observed: long-range order for the pentagonal stacks. The ring shape-dependent packing structures were simulated using Monte Carlo methods in collaboration with Prof. Tomonori Dotera from Kindai University. The self-sorting effect discovered by Ogoshi and colleagues has promising potential applications.
Quoting the scientists: "The ultimate challenge will be to propagate cavity-shape information on the surface to provide shape-recognisable adsorption and adhesive materials."

Pillar[n]arenes

Pillar[n]arenes, collectively named pillararenes (and sometimes pillarenes), are cyclic organic molecules consisting of n so-called hydroquinone units, which can be substituted. Hydroquinone, also known as quinol, has the chemical formula C6H4(OH)2. It consists of a benzene ring with two hydroxyl (OH) groups bound to it at opposite sides of the benzene hexagon. The first pillararene was synthesized in 2008 by Tomoki Ogoshi and colleagues from Kanazawa University. The name pillararene was chosen since the molecules are cylindrical (pillar-like) in shape and composed of aromatic moieties (arenes). Furthermore, Ogoshi and colleagues have shown that n = 5 and n = 6 pillararenes exhibit self-sorting capabilities. Cationic and anionic versions of the molecules form tubular structures preserving the original pentagonal or hexagonal geometry of the pillararene cavity.
  2. Researchers at the Tokyo Institute of Technology (Tokyo Tech) have developed a hydraulic actuator that will allow tough robots to operate in disaster sites and other harsh environments. The Tokyo Tech Venture H-MUSCLE Corporation was established to pursue applications for the actuator, and shipping of product samples will begin in February 2019. The majority of today's robots are driven by electric motors, but hydraulic actuators, with their high output and impact resistance, would be well-suited to robots operating in harsh environments. However, typical hydraulic actuators are developed for industrial machinery, like power shovels, and are too large and heavy to be used in robots; nor can they provide smooth movement or force control. Tokyo Tech School of Engineering professor Koichi Suzumori and colleagues developed a hydraulic actuator to solve these issues. The actuator offers greatly increased power and shock resistance compared with conventional electric motors. It also grants a smaller size, higher output (force-to-mass ratio), and smoother control compared with conventional hydraulic actuators. Providing high power, durability, and excellent control, the actuator will allow robots to operate in the harshest of environments and perform tough work with a gentle touch. This hydraulic actuator is the result of the Tough Robotics Challenge, organized by the Impulsing Paradigm Change through Disruptive Technologies Program (ImPACT) of the Cabinet Office of Japan. Tokyo Tech; JPN Co., Ltd.; Bridgestone Corporation; and KYB Corporation; along with other universities and enterprises concerned with hydraulic equipment, have participated in development of the hydraulic actuator for tough robots since 2014. To promote adoption of the technology, H-MUSCLE will ship samples of its hydraulic cylinders and hydraulic motors to domestic manufacturers, expand its lineup of actuators for future sale, and explore further applications.

[Image: A small, lightweight, smooth-sliding motor. Credit: Suzumori laboratory]

Background

The ImPACT Tough Robotics Challenge (2014-2018, Program Manager: Satoshi Tadokoro, Professor, Tohoku University) has conducted research with the aim of creating robots for tough operation, even in extreme disaster sites. As a part of this project, the development of hydraulic actuators specifically for robots, and their robotic applications, was set as a research theme. Koichi Suzumori (robotics, actuator engineering) at Tokyo Tech was the leader of the group, with participants Tokyo Tech; Okayama University; Ritsumeikan University; JPN Co., Ltd.; Bridgestone Corporation; and KYB Corporation. With the cooperation of many other enterprises with highly specialized technologies, they succeeded in developing an actuator for hydraulic robots which is small, lightweight, high-output, and smooth-sliding, something not available in existing products.

Merits of small, lightweight, smooth-sliding actuators

The new hydraulic actuator offers the following advances over existing hydraulic actuators. Small size. Japanese Industrial Standards (JIS) specifies only cylinders with an internal diameter of 35 mm or greater. However, robots require smaller cylinders. H-MUSCLE has developed cylinders with an internal diameter of 20 to 30 mm in collaboration with JPN Co., Ltd. High force-to-mass ratio. "Force" is the generated axial force, and "mass" is the weight of the cylinder itself. Robots require a higher force-to-mass ratio than general stationary industrial machinery.
Though the figure is only of a representative sample, H-MUSCLE cylinders can output an overwhelmingly higher value. This was made possible by (1) a drive pressure of 35 MPa, (2) titanium and magnesium alloys, and (3) inventive design. Smooth sliding. This cylinder operates at remarkably lower pressure than that of normal JIS cylinders. Conventional hydraulic cylinders and motors have stiff seals between the piston and the cylinder to seal in the fluid, and the great friction from these seals prevents smooth movement and force control. With low-friction seals and inventive design, this research realized low friction, about one-tenth that of conventional products. This addresses the difficulty in precise movement and force control found with conventional hydraulic robots. ImPACT has built several tough robot prototypes to test potential applications for the hydraulic actuator.

[Image: A construction robot being developed by Komatsu, Osaka University, and others. The smaller of the two arms is driven by smooth-sliding cylinders developed in this program, contributing to its ability to do fine manipulations.]
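To get a rough sense of the scale those numbers imply (a back-of-envelope calculation of my own, using the drive pressure and smallest bore quoted above, not a figure from an H-MUSCLE datasheet), the axial force of a hydraulic cylinder is simply drive pressure times piston area:

    F = p \,\pi r^{2} = 35\times10^{6}\,\mathrm{Pa} \times \pi \times (0.010\,\mathrm{m})^{2} \approx 1.1\times10^{4}\,\mathrm{N} \approx 11\,\mathrm{kN}

So even a 20 mm bore cylinder at 35 MPa can push with roughly a tonne of force, which is why the force-to-mass ratio can be far higher than that of an electric motor of comparable weight.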
  3. The World Institute of Kimchi (WiKim) has reported that the white colonies on the surface of kimchi are not formed by molds but by yeasts. The researchers also acquired genomic data regarding the hygienic safety of the yeast strains. This report is based on a study conducted by Dr. Tae-Woon Kim and Dr. Seong Woon Roh's team at the Microbiology and Functionality Research Group of WiKim. The study applied a next-generation sequencing (NGS) approach to white colonies collected from the surface of kimchi samples such as cabbage kimchi, mustard leaf kimchi, young radish kimchi, and watery kimchi. The findings were published in the latest online edition (Oct. 2018) of the Journal of Microbiology, an international academic journal. In general, yeasts produce alcoholic and aromatic compounds that generate the flavor of fermented foods; hence, they are frequently used in making bread or rice wine. Kimchi is primarily fermented by lactic acid bacteria rather than yeasts. However, during the later phase of fermentation, when the activity of lactic acid bacteria is decreased, a white colony on the kimchi surface is formed by yeasts. The white colony is often observed on the surface of moist fermented food products including soy sauce, soy bean paste, rice wine and kimchi.

[Image: Isolation and Whole-Genome Analysis of White Colony-Forming Yeasts on Kimchi Surface. Credit: WiKim]

The research group performed microbial community structure analysis to identify five representative yeast strains responsible for white colony on kimchi surface: Hanseniaspora uvarum, Pichia kluyveri, Yarrowia lipolytica, Kazachstania servazzii, and Candida sake. Furthermore, whole-genome sequencing of the five yeast strains confirmed that they do not have known toxin-related genes. This study is the first to analyze the diversity of microbial community structures and perform whole-genome sequencing of white colony-forming yeasts on kimchi surface using NGS technology. In the future, WiKim intends to disseminate this genetic information regarding white colony-forming yeasts on kimchi surfaces in the Genome Database of Kimchi-associated Microbes (GDKM) and to perform additional studies such as toxicity tests based on animal experiments to verify the safety of the identified yeasts and to develop methods to prevent their formation. In order to prevent white colony formation, the surface of kimchi should be covered with a sanitized cover or be immersed in the kimchi liquid so that the surface is not exposed to the air. Furthermore, it is advised to maintain kimchi at a storage temperature below 4°C. White colonies on kimchi surfaces should be skimmed off and the kimchi should be washed and heated before eating. General Director Dr. Jaeho Ha at WiKim said, "This study is significant in that it has scientifically identified white colony-forming yeasts for which the people used to have vague anxiety and it is a step forward toward the alleviation of the anxiety for hygienic safety of kimchi."
  4. Measuring the knowledge of students in online courses poses a number of challenges. Researchers from the Higher School of Economics and the University of Leuven made improvements to the model for assessing academic achievements and published their results in the journal Heliyon. Several systemic factors make it difficult for the developers of online courses to assess student proficiency accurately. First, the 10 to 15 test questions typical of such courses are too few to produce an accurate and reliable measure of knowledge. Second, the use of multiple-choice questions leads to guessing and a distortion of the results. Third, frequent use of the same set of correct answers as a measure of proficiency makes it difficult to compare students when the test is updated even slightly. Researchers of the Higher School of Economics and the University of Leuven managed to solve these problems by expanding the classic Rasch model with additional parameters. "First, our expanded approach includes the effect of multiple attempts, making it possible to distinguish between students who guess and those who know the answers," said HSE Centre for Psychometrics in eLearning Head Dmitry Abbakumov. "Second, because the knowledge metrics obtained with this expanded approach are expressed on a single scale, they can be compared, even when the test questions are changed significantly. And finally, we calculate metrics based not only on test results, but also by taking into account the student's experience—their activity when watching videos and performance in hands-on sessions—providing a more comprehensive understanding of the student's competence." In the future, the approach proposed by the researchers could be used in assessment engines on educational platforms to obtain more accurate measurements of students' knowledge. And the metrics could be built into the navigation and recommendation solutions in digital education.
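For reference (this is the standard Rasch model from psychometrics, not a formula quoted from the Heliyon paper), the baseline model the researchers extend gives the probability that a student with ability \theta_i answers an item of difficulty b_j correctly as

    P(X_{ij} = 1 \mid \theta_i, b_j) = \frac{\exp(\theta_i - b_j)}{1 + \exp(\theta_i - b_j)}

Their expanded version adds parameters on top of this, for example for the attempt number and for evidence from video watching and hands-on sessions, so that \theta_i stays on a single scale even when the item set changes; the exact parameterisation is given in the paper.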
  5. Adaptations to environmental change are the most important asset for the persistence of any plant or animal species. This is usually achieved through genetic mutation and selection, a slow process driven by chance. Faster and more targeted are so-called epigenetic modifications, which do not alter the genetic code but promote specialisations during cell maturation. A new study carried out by scientists from the Leibniz-IZW in Germany shows that in wild guinea pigs, epigenetic modifications specific to individual environmental factors are passed on to the next generation. The study is published in the scientific journal Genes. The team of researchers led by Alexandra Weyrich from the Leibniz Institute for Zoo and Wildlife Research (Leibniz-IZW) in Berlin, Germany, studied two groups of male wild guinea pigs. One group was fed a protein-reduced diet for two months, the other group was exposed to an increase in ambient temperature of ten degrees (Celsius) for the same period. The animals responded to these changes through epigenetic modifications at the cellular level. "Epigenetic modifications have been studied for some time. What we were after was to determine, whether these modifications are passed on to the next generation of guinea pigs and whether fathers played a role in this," says Weyrich. The team studied offspring sired by males prior to their exposure to the environmental change and those sired by these males after the two-month experimental period – each time sired with the same females who were not exposed to these changes in conditions. The comparison revealed significant differences in the methylation pattern of the offsprings' DNA – for the scientists, documentation that "inheriting" parental epigenetic responses to environmental changes is possible and that males can play an important role in these processes. "We were most interested in comparing the two different groups," Weyrich adds. "Our results show for the first time that the epigenetic response to environmental changes comprises two parts: A general part, which reflects the fact that there was some environmental change – independent of the specific factor of change. And a very specific part that is the specific response to a particular environmental change." Rapid environmental change in the context of man-made global change, including climate change, for example rising temperatures or changes in resource availability and food supply, poses significant challenges to plants and animals. For some species these challenges can become existential threats. Corals for instance are highly temperature-sensitive and the reproduction of some frog and crocodile species is closely linked to specific temperature constellations. With the radical environmental changes currently underway, species that show a high adaptability have an advantage. The well-known mechanism of mutation and selection, however, may be too slow to cope with rapid changes. It relies on accidental changes to the genetic code which may or may not provide an advantage to survival and reproduction (natural selection). So-called epigenetic modifications can translate environmental changes much faster provided the genome already contains the necessary flexibility for an adequate response. During epigenetic modifications, the genetic code is not altered but specific genes are activated and strengthened or shut down through several chemical processes.
These processes are also common during cell maturation, when cells specialise to differentiate into skin, bone or liver cells. "One of the most important epigenetic modifications is the so-called DNA methylation," Weyrich explains. The scientists compared methylation patterns of the offspring sired before and after the environmental changes experienced by the fathers, focusing on sections of the genome that showed differential methylation (differentially methylated regions, DMRs). Specific responses both to rising temperatures and to the altered diet could be traced to the methylation patterns in the genomes of the offspring. "Previously, most epigenetic studies were carried out using populations of laboratory animals that have been living under artificial conditions for generations. Studies on wild species are still rare," Weyrich says. "Our comparative study design fostered these new insights." In order to understand in more detail how epigenetic modifications in the context of environmental changes are passed on to future generations, further studies in this field are required.
  6. Hurricane Florence's slow trot over North and South Carolina in September led to inundating rain, record storm surges, and another major disaster for the Federal Emergency Management Agency (FEMA) to contend with. Facing damage over hundreds of square miles, FEMA again called upon MIT Lincoln Laboratory to use their state-of-the-art lidar system to image the destruction in the region. Installed onto an airplane and flown nightly over a disaster site, lidar (which stands for light detection and ranging) sends out pulses of laser light that bounce off the land and structures below and are collected again by the instrument. The timing of each light pulse's return to the instrument is used to build what researchers call a "point-cloud map," a high-resolution 3-D model of the scanned area that depicts the heights of structures and landscape features. Laboratory analysts can then process this point-cloud data to glean information that helps FEMA focus their recovery efforts—for example, by estimating the number of collapsed houses in an area, the volume of debris piles, and the reach of flood waters. Yet quickly sending the nearly two terabytes of data from a single night's scan, or sortie, to the Laboratory for processing is a challenge. After a storm, local internet connections may be gone or spotty. When the Laboratory used this same lidar platform after Hurricane Maria in Puerto Rico, downed networks meant having to physically ship a hard drive back to Massachusetts—a more than two-day delay in getting the data into analysts' hands. When the team started the campaign in the Carolinas in mid-September, they faced the same obstacle. This time, the obstacle was hurdled thanks to MCNC. The nonprofit organization formerly known as the Microelectronics Center of North Carolina is based out of Research Triangle Park near Durham, North Carolina, which was not directly affected by Hurricane Florence. MCNC gave the Laboratory free access to their North Carolina Research and Education Network (NCREN). "Our state was hit hard by Hurricane Florence," says Tommy Jacobson, MCNC's chief operating officer and vice president. "For MCNC's leadership, it was a quick and easy decision to enable MIT, who was in the state to assist FEMA, with access to our network resources to help however we could in making sure relief got to those that needed it." NCREN is North Carolina's broadband backbone, connecting more than 750 institutions including all public school districts in the state, universities and colleges, public safety locations, and other community anchor institutions. Access for the Laboratory meant rack space for equipment inside the MCNC data center. From there, MCNC provisioned a 10-gigabit IP connection from the NCREN to Internet2, an ultrafast, advanced network that connects research centers around the world. This connection gave the team the ability to upload large volumes of data daily from their equipment inside the data center back to a computing center on MIT campus that is also connected to Internet2. From there, another 10-gigabit connection bounced the data from campus to the Lincoln Laboratory Supercomputing Center in Holyoke, where the data were processed. "The 10-gig uplink from MCNC allowed us to transmit the data at such a higher speed that some of our uploads were done in about six to seven hours," says Daniel Ribeirinha-Braga, a member of the Laboratory's data management team in this hurricane effort. 
"Keep in mind that this is lidar data, which we get about 1.5 to 1.9 terabytes a night of, that needs to be copied to multiple places, such as other hard drives, organized to a single SSD [solid-state drive], and then uploaded to the Laboratory from MCNC." The collaboration between the Laboratory and MCNC came about through Matt Daggett, a staff member in the Humanitarian Assistance and Disaster Relief (HADR) Systems Group. He had worked at MCNC more than a decade ago. "I was aware of the NCREN backbone and the data center on the MCNC campus," Daggett says. "When it became clear that our flight operations would be based out of the RDU [Raleigh–Durham International] airport, I knew MCNC would be the perfect place to get our data onto the Internet2." Adds Jacobson: "We were grateful that MIT sought us out to provide that help," With the data processing underway, the Laboratory has begun delivering reports to FEMA. The lidar imagery reveals things that would be impossible for FEMA to know from looking only at satellite images. "The most important difference between a satellite image and the lidar image is that you can do 3-D measurements on it," says Anthony Lapadula of the HADR Systems Group. "So, because it's 3-D data, we can do things like tell you how big a hole in a road is, or tell you how big an elevation drop is as a result of a landslide." One of the greatest advantages of the lidar work has been the time saved for FEMA. When someone reported damage at a specific location, FEMA could assess the damage quickly by asking Laboratory analysts to virtually visit the location in the point-cloud map and report what they found. For instance, FEMA asked them to zoom in on a small town on the Lumber River in North Carolina that had been inundated with flood waters. Analysis of the data told FEMA the extent of the flood inundation, the volume of debris piles in the town, and changes in the river's path. There were also environment questions to be answered, such as what the volume was of a coal ash pile to determine how much, if any, washed away with flood waters. They could also check in on public infrastructure, like a lock and dam along the Cape Fear River that the data showed to be completely flooded. Completing the 40 sorties requested over the Carolinas took the team several weeks to complete, right up until Thanksgiving. The sorties covered areas down the coastline from the Cape Lookout National Seashore to Myrtle Beach and through bands stretching inland. These hundreds of miles of lidar data were processed to a resolution of approximately 25 centimeters. To put that resolution into perspective, Lapadula says that if the scanned areas were covered completely with basketballs, they would be able to precisely measure the location of each ball. But with only one of these extremely advanced systems available for use, Lincoln Laboratory staff are limited in how much area they can cover and how many disasters they can respond to with the technology. The lidar system was originally developed by the Active Optical Systems Group, who has been assisting the HADR Systems Group with data collection, processing, and algorithm development. Several industry collaborators also participated in this effort. Employees from the small business 3DEO, which specializes in Geiger-mode lidar technology, assisted with the data collection. The small business LEEP has been helping with data analytics and providing training to FEMA analysts to facilitate the use of the data. 
Another partner, Basler Turbo Conversions, supported engineering aspects of installing the lidar on the BT-67 aircraft, which is being leased from the company AIRtec. While the Laboratory has been involved in disaster recovery since the 2010 Haiti earthquake, it has never been so active in these efforts as in the past year since Hurricanes Harvey, Irma, and Maria in 2017. "We went a decade without a major hurricane hitting the continental United States," Lapadula reflected. "Now, it's like they just keep coming."
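As a rough check on the transfer figures quoted above (my own arithmetic; the only inputs are the nightly data volume and upload times reported in the article), the effective end-to-end throughput works out to well under the 10-gigabit line rate, which is consistent with the data also being copied to multiple drives and organized before upload:

    # Effective throughput implied by the article's figures:
    # ~1.5-1.9 TB of lidar data per night, uploaded in roughly 6-7 hours
    # over the 10 Gb/s NCREN/Internet2 path.
    def effective_gbps(data_tb: float, hours: float) -> float:
        bits = data_tb * 1e12 * 8              # decimal terabytes -> bits
        return bits / (hours * 3600) / 1e9     # gigabits per second

    print(round(effective_gbps(1.9, 6.0), 2))  # ~0.7 Gb/s
    print(round(effective_gbps(1.5, 7.0), 2))  # ~0.48 Gb/s

Even at that fraction of the link's capacity, the turnaround is a night's work rather than the two-plus days it took to ship a hard drive after Hurricane Maria.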
  7. The processes that led to glaciation at the cratered poles of Mercury, the planet closest to the sun, have been modeled by a University of Maine-led research team. James Fastook, a UMaine professor of computer science and Climate Change Institute researcher, and James Head and Ariel Deutsch of Brown University, studied the accumulation and flow of ice on Mercury, and how the glacial deposits on the smallest planet in our solar system compare to those on Earth and Mars. Their findings, published in the journal Icarus, add to our understanding of how Mercury's ice accumulations—estimated to be less than 50 million years old and up to 50 meters thick in places—may have changed over time. Changes in ice sheets serve as climatic indicators. Analysis of Mercury's cold-based glaciers, located in the permanently shadowed craters near the poles and visible to Earth-based radar, was funded by a NASA Solar System Exploration Research Virtual Institute grant for Evolution and Environment of Exploration Destinations, and is part of a study of volatile deposits on the moon. Like the moon, Mercury does not have an atmosphere that produces snow or ice that could account for glaciers at the poles. Simulations by Fastook's team suggest that the planet's ice was deposited—likely the result of a water-rich comet or other impact event—and has remained stable, with little or no flow velocity. That's despite the extreme temperature difference between the permanently shadowed locations of the glaciers on Mercury and the adjacent regions illuminated by the sun. One of the team's primary scientific tools was the University of Maine Ice Sheet Model (UMISM), developed by Fastook with National Science Foundation funding. Fastook has used UMISM to reconstruct the shape and outline of past and present ice sheets on Earth and Mars, with findings published in 2002 and 2008, respectively. "We expect the deposits (on Mercury) are supply limited, and that they are basically stagnant unmoving deposits, reflecting the extreme efficiency of the cold-trapping mechanism" of the polar terrain, according to the researchers.
  8. Researchers at the Telecommunications and Multimedia Applications Institute (iTEAM) of Valencia's Polytechnic University (UPV) have taken a step toward creating an infallible chip. They have developed an advanced method for the analysis and à la carte configuration of photonic circuits, which makes it possible to pre-emptively deal with the possible faults that a chip may suffer and reduce their impact in the design phase, before the chips become operational. The work of the UPV researchers is centred on general-purpose photonic circuits, which provide multiple functionalities while using a single architecture, in an analogue way to how microprocessors work in electronics. "With the tools we have developed, we will simplify and optimise the manufacturing and performance of these chips," says José Capmany, researcher at the Photonics Research Labs (PRL) of the iTEAM UPV. According to professor Capmany, faults often take place within the components of the circuits, which end up affecting their final performance. "The technique makes it possible to predict where the circuit will fail and configure the other components to make up for these deficiencies, thus guaranteeing their maximum performance," he says. All this is invisible to the user. "The analysis method is relatively simple: Each one of the units of the circuit is configured and, by applying mathematical induction techniques, it offers a diagnosis of how the circuit would behave in each of the ports. Based on this diagnosis, we can conduct the modifications we see necessary in the configuration," explains Daniel Pérez, fellow researcher at the PRL-iTEAM of the UPV. "Furthermore, the method enables us to simulate larger circuits and validate their capabilities with current manufacturing techniques." Another benefit of the work is a decrease in chip cost. "If you are able to optimise the circuit with software, the manufacturing phase is not as demanding, which makes it possible to increase the performance when producing these devices," adds Capmany.

Chips with Artificial Intelligence

The work developed by the iTEAM researchers also entails a first step for the design and manufacturing of photonic circuits with artificial intelligence techniques. "With this method, we can use machine learning algorithms to synthesise and design circuits. Current day work is the seed that an automated learning method needs," adds Daniel Pérez. The next challenge for the UPV iTEAM researchers is to merge their most recent work on the design of the circuits' hardware with advanced algorithms that make it possible to squeeze all the potential out of integrated optics.
  9. Our senses are stuck in the past. There's a flash of lightning, and then seconds pass until we hear the rumble of distant thunder. We hear the past. We are seeing into the past too. While sound travels about a kilometre every three seconds, light travels 300,000 kilometres every second. When we see a flash of lightning three kilometres away, we are seeing something that happened a hundredth of a millisecond ago. That's not exactly the distant past. But as we look further afield, we can peer further back. We can see seconds, minutes, hours and years into the past with our own eyes. Looking through a telescope, we can look even further into the past.

A second back in time

If you really want to look back in time, you need to look up. The moon is our nearest celestial neighbour—a world with valleys, mountains and craters. It's also about 380,000km away, so it takes 1.3 seconds for light to travel from the moon to us. We see the moon not as it is, but as it was 1.3 seconds ago. The moon doesn't change much from instant to instant, but this 1.3-second delay is perceptible when mission control talks to astronauts on the moon. Radio waves travel at the speed of light, so a message from mission control takes 1.3 seconds to get to the moon, and even the quickest of replies takes another 1.3 seconds to come back. Radio communications to the moon have a perceptible time delay.

Minutes and hours

It's not hard to look beyond the moon and further back in time. The Sun is about 150 million km away, so we see it as it was about 8 minutes ago. Even our nearest planetary neighbours, Venus and Mars, are tens of millions of kilometres away, so we see them as they were minutes ago. When Mars is very close to Earth, we are seeing it as it was about three minutes ago, but at other times light takes more than 20 minutes to travel from Mars to Earth. This presents some problems if you're on Earth controlling a rover on Mars. If you're driving the rover at 1km per hour then the lag, due to the finite speed of light, means the rover could be 200 metres ahead of where you see it, and it could travel another 200 metres after you command it to hit the brakes. Not surprisingly, Martian rovers aren't breaking any speed records, travelling at 5cm per second (0.18kph or 0.11mph), following carefully programmed sequences and using on-board computers to avoid hazards, prevent punctures and prevent rover wrecks.

[Image: The light travel time from Mars to Earth changes as the distance to Mars changes. Credit: NASA, ESA, and Z. Levay (STScI), CC BY]

Let's go a bit further out in space. At its closest to Earth, Saturn is still more than a billion kilometres away, so we see it as it was more than an hour ago. When the world tuned into the Cassini spacecraft's plunge into Saturn's atmosphere in 2017, we were hearing echoes from a spacecraft that had already been destroyed more than an hour before.

Years

The night sky is full of stars, and those stars are incredibly distant. The distances are measured in light years; one light year is the distance travelled by light in one year. That's about 9 trillion km. As light moves at finite speed, we can see bursts of light echo off interstellar dust.
Alpha Centauri, the nearest star visible to the unaided eye, is at a distance 270,000 times the distance between Earth and the Sun. That's 4 light years, so we see Alpha Centauri as it was 4 years ago. Some bright stars are much more distant still. Betelgeuse, in the constellation Orion, is about 640 light years away. If Betelgeuse exploded tomorrow (and it will explode one day), we wouldn't know about it for centuries. Even without a telescope we can see much, much further. The Andromeda galaxy and the Magellanic Clouds are relatively nearby galaxies that are bright enough to be seen with the unaided eye. The Large Magellanic Cloud is a mere 160,000 light years away, while Andromeda is 2.5 million light years away. For comparison, modern humans have only walked the Earth for about 300,000 years.

[Image: 3C 273 can be seen with a small telescope despite being billions of light years away. Credit: ESA/Hubble & NASA, CC BY]

Billions

With the unaided eye you can look millions of years into the past, but how about billions? Well, you can do that at the eyepiece of an amateur telescope. Quasar 3C 273 is an incredibly luminous object, which is brighter than individual galaxies, and powered by a huge black hole. But it's 1,000 times fainter than what the unaided eye can see because it's 2.5 billion light years away. That said, you can spot it with a 20cm aperture telescope. A bigger telescope allows you to peer even further into space, and I once had the pleasure of using an eyepiece on a 1.5-metre diameter telescope. Quasar APM 08279+5255 was just a faint dot, which isn't surprising as it's 12 billion light years away.

[Image: With a big enough telescope you can see quasar APM 08279+5255 and look 12 billion years back in time. Credit: Sloan Digital Sky Survey, CC BY]

Earth is just 4.5 billion years old, and even the universe itself is 13.8 billion years old. Relatively few people have seen APM 08279+5255 with their own eyes, and in doing so they (and I) have looked back across almost the entire history of our universe. So when you look up, remember you aren't seeing things as they are now; you're seeing things as they were. Without really trying, you can see years into the past. And with the aid of a telescope you can see millions or even billions of years into the past with your very own eyes.
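All of these look-back times are just distance divided by the speed of light; a minimal sketch using the figures quoted in the article (values are rounded exactly as the article rounds them):

    # Look-back time = distance / speed of light, for examples given in the article.
    C_KM_PER_S = 300_000  # speed of light, km/s (rounded, as the article uses)

    examples_km = {
        "lightning 3 km away": 3,
        "the Moon (~380,000 km)": 380_000,
        "the Sun (~150 million km)": 150_000_000,
    }

    for name, km in examples_km.items():
        print(f"{name}: {km / C_KM_PER_S:g} seconds ago")

Running it reproduces the article's numbers: a hundredth of a millisecond for the lightning flash, about 1.3 seconds for the Moon, and roughly 500 seconds (a bit over 8 minutes) for the Sun.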
  10. Finding defects in electron microscopy images takes months. Now, there's a faster way. It's called MENNDL, the Multinode Evolutionary Neural Networks for Deep Learning. It creates artificial neural networks—computational systems that loosely mimic the human brain—that tease defects out of dynamic data. It runs on all available nodes of the Summit supercomputer, performing 152 thousand million million calculations a second (about 152 petaflops). In mere hours, scientists using MENNDL created a neural network that performed as well as a human expert. It reduces the time to analyze electron microscopy images from months to hours. MENNDL is the first known approach to automatically identify atomic-level structural information in scanning transmission electron microscopy data. In 2018, MENNDL received an R&D 100 award, considered the Oscars of innovation. It's also a finalist for the Gordon Bell Prize. MENNDL, an artificial intelligence system, automatically designed an optimal deep learning network to extract structural information from raw atomic-resolution microscopy data. To design the network, MENNDL used 18,000 GPUs on all of the available 3,000 nodes of the Summit supercomputer. In a few hours, MENNDL creates and evaluates millions of networks using a scalable, parallel, asynchronous genetic algorithm augmented with a support vector machine to automatically find a superior deep learning network topology and hyper-parameter set. This work is far faster than could be done by a human expert. For the application of electron microscopy, the system furthers the goal of better understanding the electron-beam-matter interactions and real-time image-based feedback, which enables a huge step beyond human capacity toward nanofabricating materials automatically.
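MENNDL's own code is not shown here, but the core idea of an evolutionary (genetic-algorithm) search over network designs can be sketched in a few lines. This toy version, with a made-up fitness function and just two hyperparameters, is purely illustrative and is not MENNDL's actual algorithm, which scores real networks on microscopy data and adds a support vector machine on top:

    import math
    import random

    # Toy fitness: pretend the best network has 4 layers and a learning rate of 0.01.
    # In a real system this would be the validation score of a trained network.
    def fitness(layers: int, lr: float) -> float:
        return -((layers - 4) ** 2) - (math.log10(lr) + 2) ** 2

    def random_individual():
        return (random.randint(1, 10), 10 ** random.uniform(-5, 0))

    def mutate(ind):
        layers, lr = ind
        layers = min(10, max(1, layers + random.choice((-1, 0, 1))))
        lr = min(1.0, max(1e-5, lr * 10 ** random.uniform(-0.3, 0.3)))
        return (layers, lr)

    def evolve(pop_size=30, generations=25):
        pop = [random_individual() for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda ind: fitness(*ind), reverse=True)
            parents = pop[: pop_size // 2]                  # keep the fitter half
            children = [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
            pop = parents + children
        return max(pop, key=lambda ind: fitness(*ind))

    print(evolve())  # tends towards (4, ~0.01)

The point of running a search like this on a machine such as Summit is scale: with thousands of GPUs, millions of candidate networks can be trained and scored in the hours the article mentions, instead of a human expert hand-tuning one design at a time.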
  11. Late in 2018, the gravitational wave observatory, LIGO, announced that they had detected the most distant and massive source of ripples of spacetime ever monitored: waves triggered by pairs of black holes colliding in deep space. Only since 2015 have we been able to observe these invisible astronomical bodies, which can be detected only by their gravitational attraction. The history of our hunt for these enigmatic objects traces back to the 18th century, but the crucial phase took place in a suitably dark period of human history – World War II. The concept of a body that would trap light, thereby becoming invisible to the rest of the universe, had first been considered by the natural philosophers John Michell and, later, Pierre-Simon Laplace in the 18th century. They used Newton's gravitational laws to calculate the escape velocity of a light particle from a body, predicting the existence of stars so dense that light could not escape from them. Michell called them "dark stars". But after the discovery that light took the form of a wave in 1801, it became unclear how light would be affected by the Newtonian gravitational field, so the idea of dark stars was dropped. It took roughly 115 years to understand how light in the form of a wave would behave under the influence of a gravitational field, with Albert Einstein's General Relativity Theory in 1915, and Karl Schwarzschild's solution to this problem a year later. Schwarzschild also predicted the existence of a critical circumference of a body, which light would be unable to cross: the Schwarzschild radius. This idea was similar to that of Michell, but now this critical circumference was understood as an impenetrable barrier. It was only in 1933 that Georges Lemaître showed that this impenetrability was only an illusion that a distant observer would have. Using the now famous Alice and Bob illustration, the physicist hypothesised that if Bob stood still while Alice jumped into the black hole, Bob would see Alice's image slowing down until freezing just before reaching the Schwarzschild radius. Lemaître also showed that in reality, Alice crosses that barrier: Bob and Alice just experience the event differently. Despite this theory, at the time there was no known object of such a size, nothing even close to a black hole. So nobody believed that something similar to the dark stars as hypothesised by Michell would exist. In fact, no one even dared to treat the possibility with seriousness. Not until World War II.

From dark stars to black holes

On September 1 1939, the Nazi German army invaded Poland, triggering the beginning of the war that changed the world's history forever. Remarkably, it was on this very same day that the first academic paper on black holes was published. The now acclaimed article, On Continued Gravitational Contraction, by J Robert Oppenheimer and Hartland Snyder, two American physicists, was a crucial point in the history of black holes. This timing seems particularly odd when you consider the centrality of the rest of World War II in the development of the theory of black holes.

[Image: The Schwarzschild radius. Credit: Tetra Quark/Wikimedia Commons, CC BY-SA]

This was Oppenheimer's third and final paper in astrophysics. In it, he and Snyder predict the continued contraction of a star under the influence of its own gravitational field, creating a body with an attraction so intense that not even light could escape from it.
This was the first version of the modern concept of a black hole, an astronomical body so massive that it can only be detected by its gravitational attraction. In 1939, this was still an idea that was too strange to be believed. It would take two decades until the concept was developed enough that physicists would start to accept the consequences of the continued contraction described by Oppenheimer. And World War II itself had a crucial role in its development, because of the US government's investment in researching atomic bombs.

Reborn from the ashes

Oppenheimer, of course, was not only an important character in the history of black holes. He would later become the head of the Manhattan Project's Los Alamos laboratory, the research centre that led to the development of atomic weapons. Politicians understood the importance of investing in science in order to bring military advantage. Consequently, across the board, there was wide investment in war-related revolutionary physics research, nuclear physics and the development of new technologies. All sorts of physicists dedicated themselves to this kind of research, and as an immediate consequence, the fields of cosmology and astrophysics were mostly forgotten, including Oppenheimer's paper. In spite of the decade lost to large-scale astronomical research, the discipline of physics thrived as a whole as a result of the war – in fact, military physics ended up augmenting astronomy. The US left the war as the centre of modern physics. The number of Ph.D.s skyrocketed, and a new tradition of postdoctoral education was set up. By the end of the war, the study of the universe was rekindled. There was a renaissance in the once underestimated theory of general relativity. The war changed the way we do physics: and eventually, this led to the fields of cosmology and general relativity getting the recognition they deserve. And this was fundamental to the acceptance and understanding of black holes. Princeton University then became the centre of a new generation of relativists. It was there that the nuclear physicist, John A Wheeler, who later popularised the name "black hole", had his first contact with general relativity, and reanalysed Oppenheimer's work. Wheeler was sceptical at first, but the influence of close relativists, along with new advances in computational simulation and radio technology – developed during the war – turned him into the greatest enthusiast for the prediction Oppenheimer had published on the day that war broke out, September 1 1939. Since then, new properties and types of black holes have been theorised and discovered, but all this only culminated in 2015. The measurement of the gravitational waves created in a black hole binary system was the first concrete proof that black holes exist.
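The "critical circumference" Schwarzschild identified corresponds to what is now called the Schwarzschild radius. As a standard textbook illustration (the formula and the solar value are well-known results, not figures taken from this article), for a body of mass M:

    r_s = \frac{2GM}{c^{2}}, \qquad r_s(\text{Sun}) = \frac{2 \times 6.67\times10^{-11} \times 2.0\times10^{30}}{(3.0\times10^{8})^{2}} \approx 3\,\mathrm{km}

Squeeze the Sun inside a sphere of that radius and light could no longer escape it; in Michell's 18th-century language, it would become a dark star.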
  12. The recent tumult in financial markets has shined a light on the rising role of automated trading on Wall Street and whether it is exacerbating volatility. Since the 2008 financial crisis, investors have increasingly turned to computerized trading systems that have been programmed to render quickfire "buy" and "sell" orders based on economic data, utterances of central bankers or complex artificial intelligence software that employs algorithms. Though set up by humans, these trades are based on a snap assessment that lacks the subtle discernment of the human eye. Whenever an unexpected lurch on Wall Street slams investors, fingers are pointed at such systems that increasingly dominate trading. Critics have questioned whether the market's recent swoon—which could result in the worst December since the Great Depression—is due to a liquidity drain and other unanticipated effects of the computerization of trading, rather than fundamental economic factors at a time when US unemployment is low and economic growth is solid. Treasury Secretary Steven Mnuchin, in a recent interview with Bloomberg, blamed the uptick in volatility on the surge in high-frequency trading, a type of automated trading. Trading from quantitative hedge funds relying on computer models now accounts for 28.7 percent of overall volumes in the United States, according to the Tabb Group consultancy. That is more than twice the share from five years ago and, since 2017, above the percentage held by individual investors. JPMorgan Chase analyst Marko Kolanovic has estimated that only about one-third of the assets in the stock market are actively managed and that only 10 percent of the daily trading volume is the result of specific deliberation. But while the rise of automated trading is undeniable, it is less clear that it is responsible for increased market turmoil.

[Image: Traders have had a nervous December, which could be Wall Street's worst since the Great Depression]

Tabb Group founder Larry Tabb said most electronic trading firms employ algorithms that identify and take advantage of price discrepancies between the price of a given security and what it fetches elsewhere. "They are looking to buy the cheap ones," Tabb said, adding, "most models actually dampen volatility rather than enhance volatility."

'Flash Crash'

At the same time, Tabb concedes that the proliferation of exchanges where stocks are bought and sold can result in limited liquidity on platforms. That can make markets vulnerable to a "flash crash," although this possibility was mitigated with circuit breakers instituted after 2010. The system of automated trading is "all about supply and demand like it's always been," Tabb said. "It's just a supply and demand at a quicker pace." Another oft-cited risk is the tendency for computers to behave with "herd"-like behavior because they are engineered in a similar fashion. "Because of the design similarities, they tend to buy and sell futures at similar price levels," said Peter Hahn, co-founder of Bridgeton Research Group. "When they are hitting 'sell' stop-loss levels at similar times they can add significant price pressure at the beginning of down-trends," said Hahn, adding that the impact is more muted when trades are triggered by fundamental factors, such as an economic indicator. Kolanovic warned that the shift away from active investment could pinch the market's ability to "prevent and recover from large drawdowns."
"The $2 trillion rotation from active and value to passive and momentum strategies since the last crisis eliminated a large pool of assets that would be standing ready to buy cheap public securities and backstop a market disruption," Kolanovic said.
  13. At least three deaths were attributed to severe weather in the US as heavy snow and high winds snarled air and ground transportation during a busy holiday travel period. More than 500 flight cancellations and 5,700 delays were reported Friday as the winter storm blanketed areas from the north central plains and the Midwest with eight to 12 inches (20-30 centimeters) of snow. As much snow, if not more, was forecast to fall in the coming days in the southwestern state of New Mexico, along with a deluge of rain in some southern and eastern states—ruining New Year travel plans for thousands of Americans. Millions more in the South were warned of potential flooding from heavy rains. A 58-year-old woman in Louisiana was killed Wednesday evening when lightning struck a tree, which then fell on her home, according to TV station WDSU. In Kansas, police said icy roads caused a fatal car crash Thursday on an interstate highway. Another crash involving a snow plow and a car in North Dakota claimed one life. More than 6,500 flights were delayed and some 800 more were canceled on Thursday, according to the flight tracking website FlightAware. Some airline passengers reported being stranded for days.

[Image: Kansas, pictured here in 2013, was struck again by severe weather that police say caused a fatal car crash]

"I didn't want to spend three days in the airport, missing out on the holidays—New Year's and all that," Anthony Scott told Texas television station KDFW at Dallas-Fort Worth International Airport. "I have to go back to work the first of the year. So this is my time," he said. "This was my little vacation. I'm not trying to spend it in the airport."

Road travel treacherous

Numerous roads were closed Thursday in the Dakotas, Minnesota, Kansas and Iowa. Ground crews worked to clear affected areas, but many remained packed with snow and ice Friday. The South Dakota Department of Transportation said advisories warning against travel remained in effect. "Roads are icy, blowing snow is still limiting visibility," the agency said. "Crews are working but mother nature is making safe travel tough." North Dakota on Friday lifted a no-travel advisory that had been issued for the entire east side of the state, even as drifting snow continued to frustrate drivers. National Weather Service (NWS) officials in Minnesota cautioned that roads in the upper Midwestern state were cloaked in snow.

[Image: Police cars covered in snow in the 2016 "Snowzilla" storm, which dumped some 25 inches on New York]

The weather service predicted the treacherous weather would continue through the weekend across the country. Heavy snow was expected in the southwestern state of New Mexico from a new storm, with as much as 18 inches possible, NWS said. To the south, heavy rains were forecast in the central Gulf Coast, in the Florida Panhandle, and stretching east to the mid-Atlantic. Flash flooding was possible in a few areas. The nasty weather was still no match for a colossal blizzard that smothered the eastern United States in January 2016. That storm shut down New York and Washington, leaving 15 people dead and impacting some 85 million residents.
Forecasters said the storm—dubbed "Snowzilla"—dumped 22.2 inches in Washington and 25.1 inches in New York's Central Park, the third highest accumulation since records began in 1869.
  14. Experts from the National Research Nuclear University MEPhI (Russia), the University of Oulu (Finland), and the St. Petersburg-based Ioffe Physical-Technical Institute of the Russian Academy of Sciences (Russia) have compared the effect of cosmic ray solar modulation as recorded by neutron monitors and the PAMELA (Payload for Antimatter Matter Exploration and Light-Nuclei Astrophysics) satellite experiment. According to the scientists, this will make it possible to predict radiation levels in near-Earth space more accurately, an important aspect of planning space missions. The results of this project were published in the Journal of Geophysical Research: Space Physics. Launched in 2006, the PAMELA satellite experiment aims to locate and record antimatter and to measure the energy spectra of cosmic radiation components, as well as near-Earth radiation conditions, and to establish the origin of dark matter. The research paper's authors compare the effects of the solar modulation of cosmic rays, recorded by the PAMELA international experiment and neutron monitors. These neutron monitors are a chain of ground-based units that have been operating since the 1950s and which record secondary particles generated during interaction between cosmic rays and atmospheric nuclei. Russian scientists used data recorded in real time by a neutron monitor in Oulu, Finland. These results will help gauge the neutron monitors' correct response function during solar activity. This was only made possible after launching the PAMELA experiment, said Sergei Koldobsky, a senior lecturer with MEPhI's Institute of Nuclear Physics and Engineering. "The correct responses of neutron monitors, as well as huge statistical records of uninterrupted operation over the past 70 years, allow us to predict radiation levels in near-Earth space, and this has tremendous significance for planning space missions," Sergei Koldobsky said. Direct measurements conducted during the PAMELA experiment made it possible to check the accuracy of the neutron monitors' response function, which links the cosmic ray spectral band that reaches the top layers of the terrestrial atmosphere with the number of neutrons being recorded by a given monitor. The research paper also mentions the calibration of ground-based neutron monitors using PAMELA experiment data.
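The response function being checked here has a standard form in the neutron monitor literature (the notation below is the conventional one, not an equation reproduced from the paper): the count rate N of a monitor with geomagnetic cut-off rigidity P_c at time t is

    N(P_c, t) = \sum_{i} \int_{P_c}^{\infty} Y_i(P)\, J_i(P, t)\, \mathrm{d}P

where J_i(P, t) is the spectrum of cosmic ray species i (protons, helium, and so on) arriving at the top of the atmosphere and Y_i(P) is the monitor's yield function. PAMELA measures J_i directly in orbit, which is what lets the authors test whether the assumed yield function reproduces the counts the Oulu monitor actually records.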
  15. NASA's unmanned New Horizons spacecraft is closing in on its historic New Year's flyby target, the most distant world ever studied, a frozen relic of the solar system some four billion miles (6.4 billion kilometers) away. The cosmic object, known as Ultima Thule, is about the size of the US capital, Washington, and orbits in the dark and frigid Kuiper Belt about a billion miles beyond the dwarf planet, Pluto. The spacecraft's closest approach to this primitive space rock comes January 1 at 12:33 am ET (0533 GMT). Until then, what it looks like, and what it is made of, remain a mystery. "This is a time capsule that is going to take us back four and a half billion years to the birth of the solar system," said Alan Stern, the principal investigator on the project at the Southwest Research Institute, during a press briefing Friday. A camera on board the New Horizons spacecraft is currently zooming in on Ultima Thule, so scientists can get a better sense of its shape and configuration—whether it is one object or several. "We've never been to a type of object like this before," said Kelsi Singer, New Horizons co-investigator at the Southwest Research Institute. About a day prior, "we will start to see what the actual shape of the object is," she said. The spacecraft entered "encounter mode" on December 26, and is "very healthy," added Stern. Communicating with a spacecraft that is so far away takes six hours and eight minutes each way—or about 12 hours and 15 minutes round trip. New Horizons' eagerly awaited "phone home" command, indicating if it survived the close pass—at a distance of just 2,200 miles (3,500 kilometers)—is expected January 1 at 10:29 am (1529 GMT). Until then, the New Horizons spacecraft continues speeding through space at 32,000 miles (51,500 kilometers) per hour, traveling almost a million miles per day. And NASA scientists are eagerly awaiting the first images. "Because this is a flyby mission, we only have one chance to get it right," said Alice Bowman, missions operations manager for New Horizons. The spacecraft, which launched in 2006, captured stunning images of Pluto when it flew by the dwarf planet in 2015.
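Both of the quoted figures follow from simple arithmetic on numbers given in the article (rounded values; the article's more precise 6 hours 8 minutes reflects the exact distance at the time rather than a flat four billion miles):

    # One-way signal time to Ultima Thule and distance covered per day by New Horizons,
    # using the rounded figures quoted in the article.
    C_MILES_PER_S = 186_000          # speed of light in miles per second (rounded)
    DISTANCE_MILES = 4_000_000_000   # ~4 billion miles to Ultima Thule
    SPEED_MPH = 32_000               # New Horizons' speed quoted in the article

    light_time_hours = DISTANCE_MILES / C_MILES_PER_S / 3600
    miles_per_day = SPEED_MPH * 24

    print(f"one-way light time: ~{light_time_hours:.1f} hours")   # ~6.0 hours
    print(f"distance per day: {miles_per_day:,} miles")           # 768,000 ('almost a million')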
  16. Trash, particularly plastic, in the ocean and along the shoreline is an economic, environmental, human health, and aesthetic problem that poses serious challenges to coastal communities around the world, including those along the Gulf of Mexico. Researchers from the Dauphin Island Sea Lab and the Mission-Aransas National Estuarine Research Reserve teamed up for a two-year study to document the problem along Gulf of Mexico shorelines. Their findings appear in the paper "Accumulation and distribution of marine debris on barrier islands across the northern Gulf of Mexico," published in the journal Marine Pollution Bulletin on ScienceDirect. From February 2015 to August 2017, the researchers surveyed the marine debris that washed up on the shoreline every month at 12 sites on nine barrier islands, from North Padre Island, Texas, to Santa Rosa, Florida. The trash was sorted by type, frequency, and location. The most shocking discovery was that ten times more trash washes up on the coast of Texas than on the coast of any other Gulf state throughout the year. Most of the trash, 69 to 95 percent, was plastic; the plastic items included bottles and bottle caps, straws, and broken pieces of plastic. The researchers also noted that more trash washed ashore during the spring and summer, possibly because more people are outside and on the water during this time.
  17. Unions for Ryanair's 1,800 cabin crew in Spain threatened Friday to strike in January unless the Irish low-cost airline agrees to improve pay and working conditions. It was just the latest setback for the airline, which has faced a wave of strikes in several European countries in recent months. The two unions representing the staff, USO and Sitcpla, called for 24-hour strikes on January 8, 10 and 13 because Ryanair had failed to reach an agreement with them during mediation. The unions are demanding local contracts under local law rather than the Irish contracts Ryanair uses widely. It was "disgusting" that Ryanair "continues to refuse to accept national law with all its consequences", USO representative Jairo Gonzalo said in a statement. Europe's biggest low-cost airline recognised unions for the first time in its 30-year history in December last year, in order to avert mass strikes during the busy Christmas period. In July, strikes by cockpit and cabin crew disrupted 600 flights in Belgium, Ireland, Italy, Portugal and Spain, affecting 100,000 travellers. Then on September 28, cabin crew walked out again in Germany, Belgium, Italy, the Netherlands, Portugal and Spain, and in some countries pilots' unions also took action. The budget carrier has so far managed to clinch labour agreements with staff in several countries including Britain, Germany, Portugal and Italy. Spain is Ryanair's third-biggest market, and the airline has 13 of its 89 bases in the country.
  18. Facebook chief Mark Zuckerberg said Friday the world's biggest social network has "fundamentally" changed to focus on securing its systems against manipulation and misinformation. Capping a tumultuous year marked by data protection scandals and government probes, Zuckerberg said he was "proud of the progress we've made" in addressing Facebook's problems. "For 2018, my personal challenge has been to focus on addressing some of the most important issues facing our community—whether that's preventing election interference, stopping the spread of hate speech and misinformation, making sure people have control of their information, and ensuring our services improve people's well-being," he wrote on his Facebook page. "We're a very different company today than we were in 2016, or even a year ago. We've fundamentally altered our DNA to focus more on preventing harm in all our services, and we've systematically shifted a large portion of our company to work on preventing harm." He said Facebook now has more than 30,000 people "working on safety" and invests billions of dollars in security. Zuckerberg's comments come at the close of a year in which Facebook was roiled by revelations about the misuse of personal data by the political consultancy Cambridge Analytica in the 2016 US election and about its data sharing with business partners. But he said the questions around Facebook are "more than a one-year challenge" and that the California giant was in the process of "multi-year plans to overhaul our systems." "In the past we didn't focus as much on these issues as we needed to, but we're now much more proactive," he said. The comments follow a message from Zuckerberg in January, before many of Facebook's troubles emerged, in which he outlined his goals of stemming abuse, hate and foreign interference, among other things, on the network used by more than two billion people. "My personal challenge for 2018 is to focus on fixing these important issues," Zuckerberg said in January. In Friday's message, Zuckerberg enumerated a series of steps taken over the past year, including fact-checking partnerships, advertising transparency and artificial intelligence to remove harmful content. He added that Facebook's systems were also being retooled with the aim of helping "improve people's well-being," based on research it conducted. The research, he said, "found that when people use the internet to interact with others, that's associated with all the positive aspects of well-being... But when you just use the internet to consume content passively, that's not associated with those same positive effects." One of the changes aims to reduce "viral videos" that are shared across the Facebook platform. "These changes intentionally reduced engagement and revenue in the near term, although we believe they'll help us build a stronger community and business over the long term," Zuckerberg said.
  19. Fireworks have been banned on the Galapagos Islands to protect the archipelago's unique fauna, the local government said on Friday. The local council said in a statement that it had agreed "unanimously a resolution that prohibits the importation, sale, distribution and use of fireworks or pyrotechnics in the Galapagos province." Fireworks that produce light but no noise are excluded from the ban. The islands are home to thousands of residents as well as being a tourist destination, and the measure comes just days before New Year celebrations in which many people traditionally set off fireworks. "Ecosystems as sensitive as the Galapagos Islands are affected (by fireworks), principally its fauna that is unique," said the council. It also wants to avoid any potential deterioration in air quality or pollution of water sources. Animals have suffered from elevated heart rates, nervous stress and anxiety, which have "notably" changed their behavior and affected the survival of species inhabiting the World Heritage Site, which belongs to Ecuador. "This is a gift to conservation for Ecuador and the world," Lorena Tapia, president of the local council, said on her Twitter account. A campaign to limit the use of fireworks on the Galapagos Islands was launched in 2017. Single-use plastics have also been banned on the archipelago, which lies about 1,000 kilometers (600 miles) off the coast of Ecuador. Known for its endemic species, the volcanic Galapagos Islands played a crucial role in British naturalist Charles Darwin's studies before he came up with his theory of evolution.
  20. The Trump administration on Friday targeted an Obama-era regulation credited with helping dramatically reduce toxic mercury pollution from coal-fired power plants, saying the benefits to human health and the environment may not be worth the cost of the regulation. The 2011 Obama administration rule, called the Mercury and Air Toxics Standards, led to what electric utilities say was an $18 billion clean-up of mercury and other toxins from the smokestacks of coal-fired power plants. Overall, environmental groups say, federal and state efforts have cut mercury emissions from coal-fired power plants by 85 percent in roughly the last decade. Mercury causes brain damage, learning disabilities and other birth defects in children, among other harm. Coal power plants in the United States are the largest single manmade source of mercury pollutants, which enter the food chain through fish and other items that people consume. A proposal Friday from the Environmental Protection Agency challenges the basis for the Obama regulation. It calculates that the crackdown on mercury and other toxins from coal plants produced only a few million dollars a year in measurable health benefits and was not "appropriate and necessary"—a legal benchmark under the country's landmark Clean Air Act. The proposal, which now goes up for public comment before any final administration approval, would leave the current mercury regulation in place. However, the EPA said it will seek comment during a 60-day public-review period on whether "we would be obligated to rescind" the Obama-era rule if the agency adopts Friday's finding that the regulation was not appropriate and necessary. Any such change would trigger new rounds in what have already been years of court battles over regulating mercury pollution from coal plants. Friday's move is the latest by the Trump administration to change estimates of the costs and payoffs of regulations as part of an overhaul of Obama-era environmental protections. It's also the administration's latest proposed move on behalf of the U.S. coal industry, which has been struggling in the face of competition from natural gas and other cheaper, cleaner forms of energy. The Trump administration in August proposed an overhaul of another Obama-era regulation that would have prodded electricity providers to get less of their energy from dirtier-burning coal plants. In a statement Friday, the EPA said the administration was "providing regulatory certainty" by more accurately estimating the costs and benefits of the Obama administration's crackdown on mercury and other toxic emissions from smokestacks. Hal Quinn, head of the National Mining Association, charged in a statement Friday that the Obama administration had carried out "perhaps the largest regulatory accounting fraud perpetrated on American consumers" when it calculated that the broad health benefits to Americans would outweigh the cost of equipment upgrades by power providers. Sen. Tom Carper of Delaware, the top Democrat on the Senate's Environment and Public Works Committee, condemned the Trump administration's move. The EPA has "decided to snatch defeat from the jaws of victory" after the successful clean-up of toxins from the country's coal-plant smokestacks, Carper said. He and other opponents of the move said the Trump administration was playing with numbers, ignoring what Carper said were clear health, environmental and economic benefits to come up with a bottom line that suited the administration's deregulatory aims.
Janet McCabe, a former air-quality official in the Obama administration's EPA, called the proposal part of "the quiet dismantling of the regulatory framework" for the federal government's environmental protections. Coming one week into a government shutdown, and in the lull between Christmas and New Year, "this low-key announcement shouldn't fool anyone—it is a big deal, with significant implications," McCabe said.
  21. There was a boom; then a hum. The lights flickered. A giant plume of smoke filled the New York City sky and turned it blue. From a report: "A sort of unnatural, fluorescent shade of blue," said Bill San Antonio, 28, who was watching Thursday night from inside a terminal at La Guardia Airport. "We thought it was a U.F.O.," said Yiota Androtsakis, a longtime Astoria resident. Ms. Androtsakis was not the only one. In the earliest moments, hundreds of Twitter users from across the city posted videos of the eerie lights, leading many on social media to fear an alien invasion. By late Thursday night, officials said the event was caused by nothing more than a transformer explosion. "No injuries, no fire, no evidence of extraterrestrial activity," the New York Police Department tweeted, adding later that the explosion was not suspicious. One Con Edison employee was nearby when the fire started, and the authorities said he was unharmed. Still, Deputy Inspector Osvaldo Nunez, the commanding officer of the 114th Precinct, conceded that the episode "was spectacular." "You could see it from the precinct, and the precinct is about a half-mile away," he said. "You felt it in your chest, the explosions, and the night sky turned an electric blue."
  22. schwit1 shares a National Review report: After three years, there is no proof that Apple's, Google's, and Microsoft's infiltration of the classroom is producing actual academic improvement and results. Take Facebook's efforts as an example. The company -- under fire for privacy breaches worldwide -- is peddling something called "Summit Learning," a web-based curriculum bankrolled by CEO Mark Zuckerberg and his wife, Priscilla Chan. Last month, students in New York City schools walked out in protest of the program. "It's annoying to just sit there staring at one screen for so long," freshman Mitchel Storman, 14, told the New York Post. He spends close to five hours a day on Summit classes in algebra, biology, English, world history, and physics. Teacher interaction is minimal. "You have to teach yourself," Storman rightly complained. No outside research supports any claim that Summit Learning actually enhances, um, learning. What more studies are showing, however, is that endless hours of screen time are turning kids into zombies who are more easily distracted, less happy, less socially adept, and less physically fit. Standing up to the Silicon Valley Santas and asserting your family's "right to no" may well be the best long-term gift you can give your school-age children.
  23. Samsung's 2019 smart TVs will allow consumers to browse the web, access their PCs and even edit work documents from the comfort of their living room couch. From a report: The company previewed a new feature dubbed Remote Access this week, which integrates both Samsung's own Knox security framework and remote access software from VMware. Samsung stopped short of revealing key details about Remote Access. It did disclose that Remote Access will make it possible to remotely access a PC from a TV, which then seems to function as a gateway to the web, as well as a way to play PC-based games. To use Remote Access, consumers won't have to rely solely on their TV remote controls; it will also work with a keyboard, mouse, and other input devices. These may come in handy when consumers access what Samsung vaguely described as a "web browser-based cloud office service" to "access files and work on documents."
  24. Crippling drought this year has caused more than $1 billion in damage. As it has played out, anyone affected by the drought or trying to manage it has turned to a once obscure map that has become key to understanding what's happening: the U.S. Drought Monitor. From a report: That includes water planners who decide resource allotments, farmers who need water for their livelihood, and federal bureaucrats who use the map to calculate aid for the Livestock Forage Disaster Program. And then there are citizen scientists like Dave Kitts outside of Santa Fe, N.M. "I think it's a little obsessive, but I check it every Thursday," says Kitts, who has lived on the same 2-acre spread in New Mexico for decades. Dry years like this past one can crust the soil and kill his pinyon trees. "It's just upsetting and depressing to me," he says. "And when it moves the other direction, it definitely lifts my spirits." Scientist Mark Svoboda started the drought map 20 years ago, when Congress took an interest after drought struck Washington, D.C. He directs the National Drought Mitigation Center at the University of Nebraska-Lincoln. "We're covering everything," he says, "from groundwater, stream flow, temperature." In bad drought years like this one, the map has patches of crayon yellow, orange and red that show the levels of drought. Right now, there's a deep crimson bull's-eye in the hardest-hit area of the Southwest, where Colorado borders Utah, Arizona and New Mexico. The Drought Monitor map is updated weekly, often taking into account input from hundreds of people in addition to scientists. Ranchers and farmers from across the country also send missives to state and national offices, making the map a mix of art, science and farmer wisdom. But it starts with recommendations from state climatologists on any potential changes.
  25. Are you having trouble accessing your CenturyLink internet today? Well, you're not alone. From a report: Several Treasure Valley residents reached out to KTVB Thursday morning to report they had no access to their CenturyLink internet or, in some cases, their phone services. Some KTVB employees experienced the outage as well. A KTVB employee who uses the internet service provider called CenturyLink customer service. The customer service associate said they are aware of the outage in the 208 area code and that technicians are working on a fix. He added there were similar outages in other states across the country. Downdetector.com indicated issues began being reported a little before 2 a.m. MST. As for a cause, the CenturyLink representative could not comment. The associate said the outage could last anywhere from 24 to 48 hours. CenturyLink released a statement to Newsweek explaining that its network "is experiencing a disruption affecting customer services." "We know how important these services are to our customers and we are working to restore services as quickly as possible," CenturyLink's statement continued.