Welcome to InviteHawk - Your Open Source for Sharing Torrent Invites -

  • We're one of the best invite forums on the internet! Here you will find free invites, free seedboxes, free bonuses, and you can even buy or sell your torrent invites or accounts
  • InviteHawk gives you the opportunity to get into the best private trackers out there, either by buying your way in or by grabbing free invites given by our members
  • InviteHawk gives you a platform to earn money by selling your extra invites and accounts for torrent sites
  • Get the best deals and discounts for various torrent sites, only on InviteHawk
  • Never miss a chance to sign up on a tracker with open registrations. InviteHawk sends you regular updates about sites with open signups. Just subscribe to our Open Signup Section
  • Get to know everything about a tracker, with all the latest information, by checking out the tracker reviews



MrShooter

Junior Moderator
  • Content Count: 9,807
  • Donations: $0.00
  • Joined
  • Last visited
  • Days Won: 20
  • Feedback: 100%
  • Points: 155,094

MrShooter last won the day on September 18

MrShooter had the most liked content!

Community Reputation

997 Prestiged User

User Groups

About MrShooter

  • User Group: Junior Moderator
  • Member ID: 37968
  • Rank: Invite King
  • Post Count: 9,807
  • Post Ratio: 37.15
  • Total Rep: 997
  • Days Won: 20
  • Joined: 01/04/2018
  • Been With Us For: 264 Days
  • Last Activity:
  • Currently: Viewing Forums Index


Profile Information

  • Gender
    Male
  • Interests
    Movies, Anime & Cartoon

Recent Profile Visitors

1,343 profile views
  1. File-sharing traffic, BitTorrent in particular, is making a comeback. New data from Sandvine, shared exclusively with TorrentFreak, reveals that BitTorrent is still a dominant source of upstream traffic worldwide. According to Sandvine, increased fragmentation in the legal streaming market may play a role in this resurgence. Many Internet traffic reports have been published over the years, documenting how traffic patterns change over time. One of the trends that emerged in recent years is that BitTorrent’s share of total Internet traffic decreased. With the growth of services such as YouTube and Netflix, streaming started to generate massive amounts of bandwidth. As a result, BitTorrent lost a significant chunk of its ‘market share.’ This trend continued gradually until recently. In some parts of the world, file-sharing traffic, BitTorrent in particular, is growing. That’s what’s suggested by Canadian broadband management company Sandvine, which has kept a close eye on these developments for over a decade. The company will release its latest Global Internet Phenomena report next month but gave us an exclusive sneak peek. Globally, across both mobile and fixed access networks, file-sharing accounts for 3% of downstream and 22% of upstream traffic. More than 97% of this upstream is BitTorrent, which makes it the dominant P2P force. In the EMEA region, which covers Europe, the Middle East, and Africa, there’s a clear upward trend. BitTorrent traffic now accounts for 32% of all upstream traffic. This means that roughly a third of all uploads are torrent-related. Keep in mind that overall bandwidth usage per household also increased during this period, which means that the volume of BitTorrent traffic grew even more aggressively. BitTorrent traffic also remains the top upstream source in the Asia Pacific region with 19% of total traffic. Percentage-wise this is down compared to two years ago, but in volume, it’s relatively stable according to Sandvine. Other popular file-sharing upload sources in the Asia Pacific region are the Korean P2P app “K grid” (7%) and “Afreeca TV” (2%). In the Americas, BitTorrent is the second largest source of upstream traffic. It has a market share of a little over 9% and is most popular in Latin America. BitTorrent is only a fraction behind MPEG-TS, which is used for backhauling data from video cameras and security systems. [Image: BitTorrent dead? https://torrentfreak.com/images/bittorrentnotdead.png] TorrentFreak spoke to Sandvine’s Vice President of Solutions Marketing Cam Cullen, who notes that more details will be released in the upcoming report. However, it’s clear that BitTorrent is not dead yet. The next question is why BitTorrent traffic is on the rise again. According to Cullen, increased fragmentation in the streaming service market may play an important role. “More sources than ever are producing ‘exclusive’ content available on a single streaming or broadcast service – think Game of Thrones for HBO, House of Cards for Netflix, The Handmaid’s Tale for Hulu, or Jack Ryan for Amazon. To get access to all of these services, it gets very expensive for a consumer, so they subscribe to one or two and pirate the rest. “Since these numbers were taken in June for this edition, there were no Game of Thrones episodes coming out, so consider these numbers depressed from peak!” Cullen notes. And we haven’t even mentioned non-filesharing traffic sources such as cyberlockers and streaming sites, which are even more popular than BitTorrent… Source: Torrentfreak.com
  2. Nostalgic Torrents News - STAFF PICK: here (Way better than expected, high budget)
  3. The results of a large-scale international study involving the treatment of Alzheimer's disease, which was led and coordinated by researchers at Trinity College Dublin, have just been published this week in a major medical journal, PLOS Medicine. Alzheimer's disease, the commonest form of dementia, currently affects approximately 34 million people worldwide, with 10.5 million people living with dementia in Europe. In Ireland, 55,000 people have dementia and approximately 40,000 of these will have Alzheimer's disease. Currently there is no medication that can delay the onset or slow the progression of this disease. The study involved a clinical trial testing a new treatment against a placebo. The trial, called NILVAD, tested 511 people with mild and moderate Alzheimer's disease in Ireland and across Europe using a single dose of a medication called Nilvadipine to see if it could slow progression in this condition. Nilvadipine is licensed to treat high blood pressure, and studies in animals showed that it lowered brain amyloid, the toxic protein associated with Alzheimer's disease. It was therefore considered to be a possible effective treatment for Alzheimer's disease in patients. The results of this large-scale clinical trial showed that while Nilvadipine was well tolerated in people with Alzheimer's disease, there was no benefit from the drug treatment on memory or functioning in the overall group of patients with mild and moderate Alzheimer's disease. When looked at separately, a smaller sub-group of people at the milder stage of the disease appeared to benefit from Nilvadipine treatment, but these results need further study and exploration. Commenting on the research findings, lead author and coordinator of the study, Professor of Old Age Psychiatry at Trinity College Dublin and St James's Hospital, Dublin, Brian Lawlor said: "The outcome of the trial for the overall combined group of people with mild and moderate Alzheimer's disease was negative; however, when we broke it down according to severity, those with mild disease appeared to benefit from Nilvadipine whereas those with moderate disease seemed to do worse on the medication. These findings will need to be followed up and teased apart in future studies. It may be that to be successful, we would need to target people at the earliest phase of the disease process." A series of additional studies on blood flow effects using MRI scanning was also carried out. Professor Marcel Olde Rikkert, NILVAD's Principal Investigator in the Netherlands, commented: "The possible dual action of Nilvadipine both on amyloid and blood flow to the brain paves the way for further important studies in this area."
  4. Exercise and physical activity are of vast global importance to prevent and control the increasing problem of heart disease and stroke, according to a review paper published today in the Journal of the American College of Cardiology. This paper is part of an eight-part health promotion series where each paper will focus on a different risk factor for cardiovascular disease. Physical inactivity is considered one of the leading modifiable risk factors for heart disease, along with smoking status and high low-density lipoprotein (LDL) cholesterol levels. A 2012 study found physical inactivity accounted for 9 percent of premature deaths worldwide and was shown to be the reason behind 6 percent of coronary heart disease, 7 percent of Type 2 diabetes and 10 percent of both breast and colon cancer diagnoses. In this systematic review, the authors compiled the results of 25 published reviews that addressed both personal and environmental variables related to physical activity to determine how health care professionals can empower patients to adhere to a heart-healthy lifestyle. "Proper physical activity should be a lifelong commitment," said Gerald Fletcher, MD, professor of medicine and cardiovascular disease at Mayo Clinic Florida and the review's lead author. "The benefits of being physically active exist regardless of sex, ethnicity or age. The most active individuals have an approximate 40 percent lower risk of developing heart disease than those who do not exercise at all." To benefit overall heart health, current guidelines recommend at least 150 minutes of moderate-intensity or 75 minutes of vigorous-intensity aerobic exercise per week. Aerobic forms of exercise have been shown to lower systolic and diastolic blood pressure as much as 15 and 9 mmHg, respectively, among hypertensive patients, as well as reduce ischemic stroke risk and decrease LDL levels with the aid of a proper diet. Sedentary behaviors (e.g. sitting in front of a computer or watching television) occupy almost eight hours of the average person's day, but replacing one hour of sitting time with an equal amount of activity has been shown to effectively lower all-cause mortality. The researchers recommend incorporating more daily lifestyle activities into the day, such as yard work, household chores, or walking/biking to and from work. The authors list stand-up desks, stand-up conference rooms with no chairs and using the stairs instead of an elevator as a few of the ways a work environment can promote physical activity for its employees. According to the review, both in-patient and out-patient cardiac rehabilitation have also been shown to successfully reduce all-cause mortality and empower heart disease patients to combat modifiable cardiac risk factors. The success of these preventive programs heavily relies on the patient's commitment to changing sedentary behaviors and consistent follow-up from the patient's health care provider. "Just like medication, the right form of physical activity has to be specialized for each patient. Physical activity is no different from smoking cessation or eating a heart-healthy diet," said Fletcher. "It is up to health care professionals to set an example for their patients in all aspects of life."
  5. Walter and Eliza Hall Institute researchers have discovered a way to halt the invasion of the Toxoplasma gondii parasite into cells. Melbourne researchers have discovered a way to halt the invasion of the toxoplasmosis-causing parasite into cells, depriving the parasite of a key factor necessary for its growth. The findings are a key step in getting closer to a vaccine to protect pregnant women from the parasite Toxoplasma gondii, which carries a serious risk of miscarriage or birth defects. The parasite is common in Australia, being carried by 30 per cent of the Australian population, and is transmitted by cat faeces and can also be acquired from raw meat. Toxoplasma infection may also have a link to neurological disorders such as schizophrenia. The research, published today in the journal PLOS Biology, was led by Associate Professor Chris Tonkin, Dr Alex Uboldi and Ms Mary-Louise Wilde from the Walter and Eliza Hall Institute.
At a glance:
  • The parasite Toxoplasma gondii is carried by 30 per cent of the Australian population, and is a significant cause of miscarriage and birth defects.
  • Walter and Eliza Hall Institute researchers have discovered that a factor called protein kinase A (PKA) is required for Toxoplasma gondii to invade a host cell.
  • The finding could lead to a vaccine or treatment for toxoplasmosis, and sheds light on more general processes involved in other diseases caused by related parasites such as malaria.
  • The breakthrough was possible through use of high-quality imaging facilities at the Institute's Centre for Dynamic Imaging.
Cut the brakes: There are two important steps allowing Toxoplasma gondii to take hold within our body: the parasite needs to enter a host cell, and from there it replicates and spreads. "After Toxoplasma infects humans it needs to switch off the infection machinery and switch on replication," Associate Professor Tonkin said. "Without the ability to do this, Toxoplasma will die and be unable to cause disease. We discovered that the gene protein kinase A (PKA) is required for this switch. Without PKA, Toxoplasma can't hold steady."
Importance of new technology: According to Dr Uboldi, the discovery was made using advanced microscopy technology available at the Institute's Centre for Dynamic Imaging. "It wasn't until we observed the parasite down the microscope and studied its behaviour that we noticed something unexpected," Dr Uboldi said. "This was fully dependent on our access to the sophisticated equipment at the Institute's Centre for Dynamic Imaging. "By actually watching the process take place in real time, we had that rare Eureka moment."
Wide-ranging implications: Ms Wilde, a PhD student at the Institute, noted that Toxoplasma gondii is closely related to parasites that cause a range of globally significant diseases, such as malaria. "Central to all these parasites is that they need to invade host cells in order to survive," Ms Wilde said. "Understanding the role that PKA plays in the Toxoplasma lifecycle could provide insights into the biology of other disease-causing parasites, such as Plasmodium, which causes malaria. "PKA belongs to a class of molecules that are a really important drug target for other diseases such as cancer and diabetes. We now might be able to take our understanding of how PKA functions in these diseases to design new therapies for toxoplasmosis and infections caused by related parasites," she said.
The research findings were supported by the National Health and Medical Research Council and an Australian Research Council fellowship.
  6. People who include a little yoga or tai chi in their day may be more likely to remember where they put their keys. Researchers at the University of California, Irvine and Japan's University of Tsukuba found that even very light workouts can increase the connectivity between parts of the brain responsible for memory formation and storage. In a study of 36 healthy young adults, the researchers discovered that a single 10-minute period of mild exertion can yield considerable cognitive benefits. Using high-resolution functional magnetic resonance imaging, the team examined subjects' brains shortly after exercise sessions and saw better connectivity between the hippocampal dentate gyrus and cortical areas linked to detailed memory processing. Their results were published today in Proceedings of the National Academy of Sciences. "The hippocampus is critical for the creation of new memories; it's one of the first regions of the brain to deteriorate as we get older -- and much more severely in Alzheimer's disease," said project co-leader Michael Yassa, UCI professor and Chancellor's Fellow of neurobiology & behavior. "Improving the function of the hippocampus holds much promise for improving memory in everyday settings." The neuroscientists found that the level of heightened connectivity predicted the degree of recall enhancement. Yassa, director of UCI's Center for the Neurobiology of Learning and Memory and the recently launched UCI Brain Initiative, said that while prior research has centered on the way exercise promotes the generation of new brain cells in memory regions, this new study demonstrates a more immediate impact: strengthened communication between memory-focused parts of the brain. "We don't discount the possibility that new cells are being born, but that's a process that takes a bit longer to unfold," he said. "What we observed is that these 10-minute periods of exercise showed results immediately afterward." A little bit of physical activity can go a long way, Yassa stressed. "It's encouraging to see more people keeping track of their exercise habits -- by monitoring the number of steps they're taking, for example," he said. "Even short walking breaks throughout the day may have considerable effects on improving memory and cognition." Yassa and his colleagues at UCI and at the University of Tsukuba are extending this avenue of research by testing older adults who are at greater risk of age-related mental impairment and by conducting long-term interventions to see if regular, brief, light exercise done daily for several weeks or months can have a positive impact on the brain's structure and function in these subjects. "Clearly, there is tremendous value to understanding the exercise prescription that best works in the elderly so that we can make recommendations for staving off cognitive decline," he said.
  7. USC scientists say Alzheimer's could be diagnosed earlier if scientists focus on an early warning within the brain's circulation system. That's important because researchers believe that the earlier Alzheimer's is spotted, the better chance there is to stop or slow the disease. "Cognitive impairment, and accumulation in the brain of the abnormal proteins amyloid and tau, are what we currently rely upon to diagnose Alzheimer's disease, but blood-brain barrier breakdown and cerebral blood flow changes can be seen much earlier," said Berislav Zlokovic, the Mary Hayley and Selim Zilkha Chair in Alzheimer's Disease Research at the Keck School of Medicine of USC. "This shows why healthy blood vessels are so important for normal brain functioning." In a new review article in the Sept. 24 issue of Nature Neuroscience, Zlokovic and his colleagues recommend that the blood-brain barrier, or BBB, be considered an important biomarker -- and potential drug target -- for Alzheimer's disease. Because Alzheimer's is irreversible, and not fully understood, understanding the first step in the disease process is a critical step in fighting it. Alzheimer's afflicts 5.7 million Americans and is expected to impair about 14 million by 2050, according to the U.S. Centers for Disease Control and Prevention. Treatment costs total hundreds of billions of dollars annually in the United States. Alzheimer's kills more people than breast cancer and prostate cancer combined. The blood-brain barrier is a filtration system, letting in good things (glucose, amino acids) and keeping out bad things (viruses, bacteria, blood). It's mostly comprised of endothelial cells lining the 400 miles of arteries, veins and capillaries that feed our brains. Some evidence indicates that leaks in the blood-brain barrier may allow a protein called amyloid into the brain where it sticks to neurons. This triggers the accumulation of more amyloid, which eventually overwhelms and kills brain cells. "Something is off with the system when that happens," said Arthur Toga, director of the Laboratory of Neuro Imaging (LONI) and the USC Mark and Mary Stevens Neuroimaging and Informatics Institute at the Keck School of Medicine. "Healthy people have amyloid in their bodies. When the system is dysregulated, amyloid can build up and cells die off." Blood-to-brain leaks are seen in other neurodegenerative diseases, such as Huntington's disease, Parkinson's and multiple sclerosis. BBB leaks can be detected with an intravenously administered contrast substance in concert with magnetic resonance imaging. Brain microbleeds, another sign of leakage, also can be picked up with MRI. A slowdown in the brain's uptake of glucose, visible via PET scan, can be another result of BBB breakdown. Zlokovic notes that these aren't tests routinely offered at a doctor's office. In addition to Toga and Zlokovic, the paper's senior and corresponding author, the paper's other authors are Melanie Sweeney, Kassandra Kisler and Axel Montagne, all of USC's Zilkha Neurogenetic Institute.
  8. Organs affected by autoimmune disease could be fighting back by "exhausting" immune cells that cause damage using methods similar to those used by cancer cells to escape detection, according to a study by researchers at the University of Pittsburgh School of Medicine published today in the Journal of Clinical Investigation. The conclusions, based on studies in mouse models of systemic lupus erythematosus (SLE) -- referred to as lupus -- could explain why autoimmune diseases may take a long time to cause significant organ damage. They could also explain how widely used cancer immunotherapy drugs can have deleterious autoimmune side effects on normal organs. "These findings really turn our current understanding of autoimmune tissue damage on its head and suggest that we could more effectively treat these diseases if we can develop targeted methods to enhance the body's natural ability to tune down the immune system," said senior author Mark Shlomchik, M.D., Ph.D., UPMC endowed professor and chair, Department of Immunology, Pitt School of Medicine, and an investigator at the UPMC Immune Transplant and Therapy Center. In autoimmune diseases like lupus, immune cells that normally protect against invaders, such as bacteria or cancer cells, instead begin to recognize the body's own cells as foreign and attack them. In lupus nephritis, a kidney disease associated with SLE, a large number of these autoreactive cells -- called kidney infiltrating T cells (KITs) -- were thought to be activated, causing damage over time. Wondering how exactly these cells cause kidney damage, Jeremy Tilstra, M.D., Ph.D., an assistant professor of medicine at Pitt and a researcher in Shlomchik's lab, began to study them in three different mouse models of lupus nephritis. As the researchers expected, there were millions of KITs in the kidney, but surprisingly, they were not highly active as had previously been thought. "The T cells were there, but they weren't aggressively active, in fact, it was the exact opposite," said Tilstra. "They were sluggish, ineffective killers and didn't divide very well, which was completely unexpected." Experiments showed that these KITs did not respond to stimulation like normal T cells -- they neither released characteristic inflammatory proteins, nor did they reproduce very well. The cells also took up and used much less energy, displaying signs of metabolic exhaustion. Interestingly, the exhausted KITs were quite similar to T cells found inside tumors. The affected kidney cells also resembled tumor cells in certain ways, as they expressed higher levels of a protein called PD-L1, which cancer cells use to suppress T cells that enter the tumor. "Our findings suggest that the body is capable of actively fighting back against autoimmune diseases, not sitting idly by. The similarity between T cells in lupus-affected kidneys and in tumors has important implications," noted Shlomchik. "It suggests that the ability to suppress T cells is not an abnormal mechanism that cancer cells have somehow developed to defeat the immune system, rather it's an existing natural mechanism against autoimmune disease that tumors have adopted to their advantage." In the future, the researchers plan to expand the study to patients with lupus to see if they can find similar exhausted T cells in urine or tissue samples.
  9. A new study out today in the Journal of Neurology finds that pregabalin is not effective in controlling the chronic pain that sometimes develops following traumatic nerve injury. The results of the international study, which was driven by an effort to identify effective non-opioid pain medications, did show potential in relieving pain that sometimes lingers after surgery. "The unrelenting burning or stabbing symptoms due to nerve trauma are a leading reason why people seek treatment for chronic pain after a fall, car accident, or surgery," said John Markman, M.D., director of the Translational Pain Research Program in the University of Rochester Department of Neurosurgery and lead author of the study. "While these findings show that pregabalin is not effective in controlling the long-term pain for traumatic injury, it may provide relief for patients experiencing post-surgical pain." Pregabalin, which is marketed by Pfizer under the name Lyrica, is approved to treat chronic pain associated with shingles, spinal cord injury, fibromyalgia, and diabetic peripheral neuropathy. However, it is also commonly prescribed as an "off label" treatment for chronic nerve injury syndromes that occur after motor vehicle accidents, falls, sports injuries, knee or hip replacement and surgeries such as hernia repair or mastectomy. A previous eight-week study had shown that pregabalin reduced pain intensity better than placebo in these chronic, post-traumatic pain syndromes. These results led many doctors to prescribe this medication for long-term pain that does not resolve as expected. Chronic postsurgical pain syndromes occur in approximately one or two out of every 10 surgical patients, and pain levels are rated as intolerable after roughly one or two in every 100 operations. With 55 million surgeries performed in the U.S. every year, severe chronic pain impacts more than a million new people annually (a quick arithmetic check of this figure follows this item). Roughly one third of these patients are believed to have neuropathic pain or ongoing pain related to nerve injury. These rates vary widely by type of surgery. The risk factors and underlying mechanisms of this type of chronic pain are not well understood, but the types of symptoms patients describe, like "burning," "unpleasant tingling," or "numbness," resemble other nerve pain syndromes such as shingles pain. As a result, physicians trying to find useful non-opioid pain relievers have often turned to prescribing gabapentin or pregabalin. The current study was conducted in 101 centers in North America, Europe, Africa, and Asia and followed 539 individuals for three months. Study participants were randomized into two groups and prescribed either pregabalin or a placebo. The study found that pregabalin was not effective in controlling pain for individuals with traumatic nerve injury. A retrospective analysis of a subgroup of study participants, whose nerve pain was attributed to surgery, showed that the drug did provide better pain relief than placebo at 3 months. "The possibility that there was pain relief for those patients who had a hernia repair, or breast surgery for cancer, or a joint replacement lays the groundwork for future studies in these post-surgical syndromes where there is so much need for non-opioid treatments," said Markman. One major challenge is that different biological changes in the nerves and other tissues that cause pain to persist after healing from trauma vary from one patient to the next.
Currently, there is no diagnostic method that allows doctors to readily identify the patients whose pain will respond to a particular type of pain treatment. Despite employing new strategies to reduce placebo effects, the patients receiving placebo also had a steady lowering of their pain over the course of the study. The pattern of these placebo effects in longer studies has proved to be a major challenge to the development of new pain medications. "Given the rising rates of surgery and shrinking reliance on opioids, it is critical that we understand how to study new drugs that work differently in patients like the ones included in this study," Markman added.
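The incidence figure quoted above follows from straightforward arithmetic on the numbers given in the article (taking the upper end of roughly two severe cases per 100 operations):

\[
55{,}000{,}000 \times \tfrac{2}{100} = 1{,}100{,}000 \;\; \text{new cases of severe chronic postsurgical pain per year,}
\]
\[
\tfrac{1}{3} \times 1{,}100{,}000 \approx 367{,}000 \;\; \text{of which are thought to be neuropathic.}
\]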
  10. The team from Imperial College London were able to crash caged populations of the malaria vector mosquito Anopheles gambiae in only 7-11 generations. This is the first time experiments have been able to completely block the reproductive capacity of a complex organism in the laboratory using a designer molecular approach. The technique, called gene drive, was used to selectively target the specific mosquito species An. gambiae that is responsible for malaria transmission in sub-Saharan Africa. There are around 3500 species of mosquito worldwide, of which only 40 related species can carry malaria. The hope is that mosquitoes carrying a gene drive would be released in the future, spreading female infertility within local malaria-carrying mosquito populations and causing them to collapse. In 2016, there were around 216 million malaria cases and an estimated 445,000 deaths worldwide, mostly of children under five years old. Lead researcher Professor Andrea Crisanti, from the Department of Life Sciences at Imperial, said: "2016 marked the first time in over two decades that malaria cases did not fall year-on-year despite huge efforts and resources, suggesting we need more tools in the fight." The team's results, published today in Nature Biotechnology, represent the first time gene drive has been able to completely suppress a population, overcoming resistance issues previous approaches have faced. Professor Crisanti added: "This breakthrough shows that gene drive can work, providing hope in the fight against a disease that has plagued humankind for centuries. There is still more work to be done, both in terms of testing the technology in larger lab-based studies and working with affected countries to assess the feasibility of such an intervention. "It will still be at least 5-10 years before we consider testing any mosquitoes with gene drive in the wild, but now we have some encouraging proof that we're on the right path. Gene drive solutions have the potential one day to expedite malaria eradication by overcoming the barriers of logistics in resource-poor countries." The team targeted a gene in An. gambiae called doublesex, which determines whether an individual mosquito develops as a male or as a female. The team engineered a gene drive solution designed to selectively alter a region of the doublesex gene that is responsible for female development. Males who carried this modified gene showed no changes, and neither did females with only one copy of the modified gene. However, females with two copies of the modified gene showed both male and female characteristics, failed to bite and did not lay eggs. Their experiments showed that the gene drive transmitted the genetic modification nearly 100% of the time. After eight generations no females were produced and the populations collapsed because of lack of offspring (a toy numerical illustration of this dynamic follows this item). Previous attempts to develop gene drive for population suppression have encountered 'resistance', where targeted genes developed mutations that allowed the gene to carry out its function, but that were resistant to the drive. These changes would then be passed down to the offspring, halting the gene drive in its tracks. One of the reasons doublesex was picked for the gene drive target was that it was thought not to tolerate any mutations, overcoming this potential source of resistance. Indeed, in the study no functional mutated copy of the doublesex gene arose and spread in the population.
While this is the first time resistance has been overcome, the team say additional experiments are needed to investigate the efficacy and the stability of the gene drive under confined laboratory settings that mimic tropical environments. This involves testing the technology on larger populations of mosquitoes confined in more realistic settings, where competition for food and other ecological factors may change the fate of the gene drive. The doublesex gene targeted in the study is similar across the insect world, although different insects have different exact genetic sequences. This suggests the technology could be used in the future to specifically target other disease-carrying insects. Recent work from Imperial showed that suppressing An. gambiae populations in local areas is unlikely to affect the local ecosystem.
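As a rough illustration of why near-perfect transmission of a female-sterilising allele can collapse a population within a handful of generations, here is a minimal, deterministic toy model. It is a sketch under assumed parameters (homing efficiency, growth rate, release fraction are invented round numbers), not the study's model:

```python
# Toy, deterministic model of a homing gene drive that sterilises
# drive-homozygous females, loosely in the spirit of the doublesex drive
# described above. NOT the study's model: every parameter below is an
# assumed round number chosen only for illustration.

HOMING_EFFICIENCY = 0.99  # assumed fraction of wild-type alleles converted in heterozygotes
GROWTH_RATE = 6.0         # assumed surviving offspring per fertile female per generation
RELEASE_FREQ = 0.10       # assumed initial fraction of adults that are drive heterozygotes

def gamete_drive_freq(f_ww, f_wd, f_dd):
    """Drive-allele frequency among gametes produced by the given genotype mix."""
    p_from_het = (1 + HOMING_EFFICIENCY) / 2  # super-Mendelian transmission from heterozygotes
    total = f_ww + f_wd + f_dd
    return 0.0 if total == 0 else (f_wd * p_from_het + f_dd) / total

def step(f_ww, f_wd, f_dd, pop):
    d_male = gamete_drive_freq(f_ww, f_wd, f_dd)    # all male genotypes reproduce
    d_female = gamete_drive_freq(f_ww, f_wd, 0.0)   # drive-homozygous females are sterile
    fertile_females = f_ww + f_wd                   # fraction of females able to reproduce
    new_ww = (1 - d_female) * (1 - d_male)          # offspring genotypes under random mating
    new_wd = (1 - d_female) * d_male + d_female * (1 - d_male)
    new_dd = d_female * d_male
    new_pop = min(1.0, pop * GROWTH_RATE * 0.5 * fertile_females)  # capped at carrying capacity
    return new_ww, new_wd, new_dd, new_pop

f_ww, f_wd, f_dd, pop = 1 - RELEASE_FREQ, RELEASE_FREQ, 0.0, 1.0
for gen in range(1, 16):
    f_ww, f_wd, f_dd, pop = step(f_ww, f_wd, f_dd, pop)
    print(f"gen {gen:2d}: drive allele {f_wd / 2 + f_dd:.2f}, "
          f"sterile females {f_dd:.2f}, relative population {pop:.3f}")
    if pop < 1e-3:
        print("population collapsed")
        break
```

With these assumed numbers the relative population falls below a thousandth of its starting size in roughly ten generations, the same ballpark as the 7-11 generations reported for the caged populations.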
  11. Rice University researchers modeled two-dimensional materials to quantify how they react to light. They calculated how the atom-thick materials in single or stacked layers would transmit, absorb and reflect light. The graphs above measure the maximum absorbance of several of the 55 materials tested. The ability of metallic or semiconducting materials to absorb, reflect and act upon light is of primary importance to scientists developing optoelectronics -- electronic devices that interact with light to perform tasks. Rice University scientists have now produced a method to determine the properties of atom-thin materials that promise to refine the modulation and manipulation of light. Two-dimensional materials have been a hot research topic since graphene, a flat lattice of carbon atoms, was identified in 2004. Since then, scientists have raced to develop, either in theory or in the lab, novel 2D materials with a range of optical, electronic and physical properties. Until now, they have lacked a comprehensive guide to the optical properties those materials offer as ultrathin reflectors, transmitters or absorbers. The Rice lab of materials theorist Boris Yakobson took up the challenge. Yakobson and his co-authors, graduate student and lead author Sunny Gupta, postdoctoral researcher Sharmila Shirodkar and research scientist Alex Kutana, used state-of-the-art theoretical methods to compute the maximum optical properties of 55 2D materials. "The important thing now that we understand the protocol is that we can use it to analyze any 2D material," Gupta said. "This is a big computational effort, but now it's possible to evaluate any material at a deeper quantitative level." Their work, which appears this month in the American Chemical Society journal ACS Nano, details the monolayers' transmittance, absorbance and reflectance, properties they collectively dubbed TAR (a simple conservation relation linking the three is noted after this item). At the nanoscale, light can interact with materials in unique ways, prompting electron-photon interactions or triggering plasmons that absorb light at one frequency and emit it in another. Manipulating 2D materials lets researchers design ever smaller devices like sensors or light-driven circuits. But first it helps to know how sensitive a material is to a particular wavelength of light, from infrared to visible colors to ultraviolet. "Generally, the common wisdom is that 2D materials are so thin that they should appear to be essentially transparent, with negligible reflection and absorption," Yakobson said. "Surprisingly, we found that each material has an expressive optical signature, with a large portion of light of a particular color (wavelength) being absorbed or reflected." The co-authors anticipate photodetecting and modulating devices and polarizing filters are possible applications for 2D materials that have directionally dependent optical properties. "Multilayer coatings could provide good protection from radiation or light, like from lasers," Shirodkar said. "In the latter case, heterostructured (multilayered) films -- coatings of complementary materials -- may be needed. Greater intensities of light could produce nonlinear effects, and accounting for those will certainly require further research." The researchers modeled 2D stacks as well as single layers. "Stacks can broaden the spectral range or bring about new functionality, like polarizers," Kutana said. "We can think about using stacked heterostructure patterns to store information or even for cryptography."
Among their results, the researchers verified that stacks of graphene and borophene are highly reflective of mid-infrared light. Their most striking discovery was that a material made of more than 100 single-atom layers of boron -- which would still be only about 40 nanometers thick -- would reflect more than 99 percent of light from the infrared to ultraviolet, outperforming doped graphene and bulk silver. There's a side benefit that fits with Yakobson's artistic sensibility as well. "Now that we know the optical properties of all these materials -- the colors they reflect and transmit when hit with light -- we can think about making Tiffany-style stained-glass windows on the nanoscale," he said. "That would be fantastic!"
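One point worth keeping in mind when reading the TAR figures: for a free-standing layer with no other loss channels, the three quantities are tied together by energy conservation at each wavelength, which is why a reflectance above 99 percent (as reported for the roughly 40-nanometre boron stack) leaves less than 1 percent of the light to be split between transmission and absorption:

\[
T(\lambda) + A(\lambda) + R(\lambda) = 1 .
\]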
  12. A new study suggests the power industry is underestimating how climate change could affect the long-term demand for electricity in the United States. The research, published today in the journal Risk Analysis, was led by the University at Buffalo and Purdue University. It describes the limitations of prediction models used by electricity providers and regulators for medium- and long-term energy forecasting. And it outlines a new model that includes key climate predictors -- mean dew point temperature and extreme maximum temperature -- that researchers say present a more accurate view of how climate change will alter future electricity demands. "Existing energy demand models haven't kept pace with our increasing knowledge of how the climate is changing," says the study's lead author Sayanti Mukherjee, PhD, assistant professor of industrial and systems engineering in UB's School of Engineering and Applied Sciences. "This is troublesome because it could lead to supply inadequacy risks that cause more power outages, which can affect everything from national security and the digital economy to public health and the environment." "The availability of public data in the energy sector, combined with advances in algorithmic modeling, has enabled us to go beyond existing approaches that often exhibit poor predictive performance. As a result, we're able to better characterize the nexus between energy demand and climate change, and assess future supply inadequacy risks," says co-author Roshanak Nateghi, PhD, assistant professor of industrial engineering and environmental and ecological engineering at Purdue. The limitations of existing models: The overwhelming majority of climate scientists predict global temperatures will rise throughout the 21st century. This is expected to increase the demand for electricity as more people turn to air conditioners to keep cool. One of the most common energy modeling platforms used to predict future electricity demand -- MARKAL, named after MARKet and ALlocation -- does not consider climate variability. Another common energy-economic model, the National Energy Modeling System, or NEMS, does consider the climate. However, it's limited to heating and cooling degree days. A heating degree day is defined as a day when the average temperature is below 65 degrees Fahrenheit (18 degrees Celsius). A cooling degree day is when the average temperature is above 65 degrees. While there are different ways to measure heating and cooling degree days, they are most often calculated by adding the day's high temperature to the day's low temperature, and then dividing the sum by two. For example, a high of 76 degrees and a low of 60 degrees results in an average temperature of 68 degrees. The trouble with this approach, Mukherjee says, is that it doesn't consider time. For example, it could be 76 degrees for 23 hours and 60 degrees for one hour -- yet the average temperature that day would still be recorded as 68 degrees (a short worked sketch of this point follows this item). "Moreover, choice of the accurate balance point temperature is highly contentious, and there is no consensus from the research community of how to best select it," says Mukherjee. Dew point temperature is the key: To address these limitations, she and Nateghi studied more than a dozen weather measurements. They found that the mean dew point temperature -- the temperature at which air is saturated with water vapor -- is the best predictor of increased energy demand. The next best predictor is the extreme maximum temperature for a month, they say.
The researchers combined these climate predictors with three other categories -- the sector (residential, commercial and industrial) consuming the energy, weather data and socioeconomic data -- to create their model. They applied the model to the state of Ohio and found that the residential sector is most sensitive to climate variability. With a moderate rise in dew point temperature, electricity demand could increase up to 20 percent. The prediction jumps to 40 percent with a severe rise. By comparison, the Public Utility Commission of Ohio (PUCO), which does not consider climate change in its models, predicts residential demand increases of less than 4 percent up to 2033. It's similar in the commercial sector, where the researchers say demand could increase by up to 14 percent. Again, PUCO's projections are lower, at 3.2 percent. The industrial sector is less sensitive to temperature variability; however, researchers say demand could still exceed projections. During the winter months, variations between the models are less significant. That is due, in part, to the relatively low percentage (22.6 percent) of Ohio residents who heat their homes via electricity. While the study is limited to Ohio, researchers say the model can be applied to other states. To communicate results, the researchers used heat maps, which provide an immediate visual summary of the data represented by colors. The idea, they say, is to better inform decision makers with accurate and easy-to-understand information.
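Below is a small sketch of the degree-day bookkeeping and of the time-blindness the researchers criticise. The 65 °F balance point and the 76/60-degree example come from the article; the function names and the hourly variant are illustrative assumptions, not the MARKAL or NEMS implementations:

```python
# Minimal sketch of degree-day accounting, and of why one daily average can
# hide how long a day actually spent hot. Function names and the hourly
# variant are illustrative assumptions.

BALANCE_POINT_F = 65.0

def degree_days_from_high_low(high_f, low_f):
    """Conventional approach: average the daily high and low, compare to 65 degrees F."""
    avg = (high_f + low_f) / 2
    heating = max(0.0, BALANCE_POINT_F - avg)   # heating degree days for the day
    cooling = max(0.0, avg - BALANCE_POINT_F)   # cooling degree days for the day
    return heating, cooling

def degree_days_from_hourly(hourly_temps_f):
    """Time-aware alternative: integrate hour by hour instead of using one average."""
    heating = sum(max(0.0, BALANCE_POINT_F - t) for t in hourly_temps_f) / 24
    cooling = sum(max(0.0, t - BALANCE_POINT_F) for t in hourly_temps_f) / 24
    return heating, cooling

# The article's example: a high of 76 degrees and a low of 60 degrees averages to 68 degrees.
print(degree_days_from_high_low(76, 60))      # -> (0.0, 3.0)

# A day that sits at 76 degrees for 23 hours and 60 degrees for 1 hour has the
# same high/low average, but a much larger cooling load than that average implies.
skewed_day = [76.0] * 23 + [60.0]
print(degree_days_from_hourly(skewed_day))    # -> (approx 0.21, approx 10.54)
```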
  13. [Figure caption: Proposed Wigner crystals for magic-angle bilayer graphene. In Figure A, the criterion for observing this lattice structure is not satisfied experimentally, resulting in metallic transport when a single electron occupies a moiré cell. Figures B and C show the insulating state, explaining the experimental observation when 2 or 3 electrons are in a moiré cell.] Recently, a team of scientists led by Pablo Jarillo-Herrero at the Massachusetts Institute of Technology (MIT) created a huge stir in the field of condensed matter physics when they showed that two sheets of graphene twisted at specific angles -- dubbed "magic-angle" graphene -- display two emergent phases of matter not observed in single sheets of graphene. Graphene is a honeycomb lattice of carbon atoms -- it's essentially a one-atom-thick layer of graphite, the dark, flaky material in pencils. In two articles published online in March 2018 and appearing in the April 5, 2018 issue of the journal Nature, the team reported that twisted bilayer graphene (tBLG) exhibits an unconventional superconducting phase, akin to what is seen in high-temperature superconducting cuprates. This phase is obtained by doping (injecting electrons into) an insulating state, which the MIT group interpreted as an example of Mott insulation. A joint team of scientists at UCSB and Columbia University has reproduced the remarkable MIT results. The discovery holds promise for the eventual development of room-temperature superconductors and a host of other equally groundbreaking applications. Researchers at the University of Illinois at Urbana-Champaign have recently shown that the insulating behavior reported by the MIT team has been misattributed. Professor Philip Phillips, a noted expert in the physics of Mott insulators, says a careful review of the MIT experimental data by his team revealed that the insulating behavior of the "magic-angle" graphene is not Mott insulation, but something even more profound -- a Wigner crystal. "People have been looking for clear examples of Wigner crystals since Wigner first predicted them in the 1930s," Phillips asserts. "I think this is even more exciting than if it were a Mott insulator." Lead author of the U of I study, graduate student Bikash Padhi, explains, "When one sheet of graphene is twisted on top of another, moiré patterns emerge as a result of the offset in the honeycomb structure. By artificially injecting electrons into these sheets, the MIT group obtained novel phases of matter which can be understood by studying these extra electrons on the bed of this moiré pattern. By increasing the electron density, the MIT group observed an insulating state when 2 and 3 electrons reside in a moiré unit cell. They argued this behavior is an example of Mott physics." Why can't it be Mott physics? Phillips explains, "Mott insulators are a class of materials that should be conductive if electronic interactions are not taken into account, but once that's taken into account, are insulating instead. There are two primary reasons why we suspect the tBLG does not form a Mott insulator -- the observed metal-insulator transition offers only one characteristic energy scale, whereas conventional Mott insulators are described by two scales. Next, in the MIT report, in contrast to what one expects for a Mott system, there was no insulator when there was only 1 electron per unit cell. This is fundamentally inconsistent with Mottness." The accompanying figure displays the crystalline states that explain this data.
What is a Wigner crystal? To understand Wigner crystals, Padhi offers this analogy: "Imagine a group of people each inside a large orb and running around in a closed room. If this orb is small they can move freely but as it grows bigger one may collide more frequently than before and eventually there might be a point when all of them are stuck at their positions since any small movement will be immediately prevented by the next person. This is basically what a crystal is. The people here are electrons, and the orb is a measure of their repulsion." Phillips credits Padhi with providing the impetus for the study. These results were pre-published online in the journal Nano Letters in the article, "Doped Twisted Bilayer Graphene near Magic Angles: Proximity to Wigner Crystallization not Mott Insulation," on September 5, 2018, with the final official version to be included in the journal's October 2018 issue. This research was funded by the Center for Emergent Superconductivity, a Department of Energy-funded Energy Frontier Research Center, and by the National Science Foundation. The conclusions presented are those of the researchers and not necessarily those of the funding agencies.
  14. Researchers have developed a new way to model seismic risk, which they hope will better inform disaster risk reduction planning in earthquake-prone areas. The study, which is published in Proceedings of the National Academy of Sciences today (Monday 24 September 2018) and was led by academics from Durham University's Department of Geography, develops a methodology that assesses seismic risk by looking at multiple earthquake scenarios and identifying impacts that are common to multiple scenarios. This approach, which the team calls 'ensemble modelling', allows the researchers to estimate whether particular impacts are specific to certain earthquakes, or occur irrespective of the location or magnitude of an earthquake. The team hopes that this method will provide contingency planners with a more complete picture of earthquake risk and potentially help guide the use of limited resources available for earthquake risk reduction. The ensemble modelling method is novel as it goes beyond the standard probabilistic (identifying all possible earthquake scenarios at a given site) and deterministic (worst-case-event) approaches, focusing instead on the impacts of multiple possible earthquake scenarios. Dr Tom Robinson, of Durham University's Department of Geography, said: "Earthquakes remain one of the deadliest natural hazards in the world and are a significant planning challenge for governments and aid agencies. "Traditional assessments of seismic risk focus primarily on improving understanding of earthquake hazard, in terms of potential ground shaking, but for contingency planning it is the potential impacts of an earthquake that are of more importance. "Our method provides critical information on the likelihood, and probable scale, of impacts in future earthquakes. We hope this can help better inform how governments and aid agencies direct limited disaster mitigation resources, for example how they distribute resources geographically." The research team hope that the ensemble modelling method will help planners to better understand where risks are greater, for example because of the relative vulnerability of communities, or their location in relation to identified likely earthquake impacts, and direct resources in a more targeted, informed way. As part of their study, the research team worked with colleagues at Nepal's National Society of Earthquake Technology to use Nepal as a case study for their modelling approach. Together the team modelled fatalities from 90 different scenario earthquakes and established whether or not the impacts were specific to a certain scenario (a minimal sketch of this kind of ensemble aggregation follows this item). Dr Robinson said: "The results showed that for most districts in Nepal similar impacts occurred irrespective of the scenario earthquake and that impacts were typically closer to the minimum rather than the worst-case scenario. "This suggests that planning for the worst-case scenario in Nepal may place an unnecessarily large burden on the limited resources available. "Our results also showed that the most at-risk districts are predominantly in rural western Nepal and that there are around 9.5 million Nepalese people who live in districts that are at a higher seismic risk than the capital, Kathmandu. "Disaster risk reduction planning therefore needs to focus on rural, as well as urban, communities, as our modelling shows they are at higher risk."
The results of the case study allow the team to demonstrate that a sole planning focus on urban earthquake risk in Kathmandu could be inappropriate, as many rural populations within Nepal are at greater relative risk. However, the new modelling approach is not only relevant to Nepal and can be applied anywhere, to help inform earthquake disaster risk reduction planning.
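For readers curious what the aggregation step of 'ensemble modelling' of impacts looks like in practice, here is a minimal sketch. The district names and fatality figures are invented for demonstration, and this is not the Durham team's code:

```python
# Illustrative sketch of the 'ensemble modelling' idea described above: model the
# impact of many scenario earthquakes, then ask, district by district, which
# impacts recur across scenarios instead of planning around one worst case.
# District names and fatality figures are invented for demonstration.

from statistics import median

# Impact per scenario: {district: modelled fatalities} (hypothetical numbers).
scenario_impacts = [
    {"District A": 120, "District B": 900,  "District C": 40},
    {"District A": 150, "District B": 1100, "District C": 35},
    {"District A": 130, "District B": 300,  "District C": 4000},  # one extreme scenario
]

for district in scenario_impacts[0]:
    values = [scenario[district] for scenario in scenario_impacts]
    # Comparing the minimum and median of the ensemble with the worst case mirrors
    # the contrast drawn in the Nepal case study, where impacts were typically
    # closer to the minimum than to the worst-case scenario.
    print(f"{district}: min={min(values)}, median={median(values)}, worst={max(values)}")
```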