Skylights's Content - Page 11 - InviteHawk - Your Only Source for Free Torrent Invites

Everything posted by Skylights

  1. What makes Kevlar stop a bullet, at the atomic level? The properties of materials emerge from their molecular or atomic structure, yet many details between the micro and the macro remain a mystery to science. Scientists are actively researching the rational design of targeted supramolecular architectures, with the goal of engineering their structural dynamics and their response to environmental cues. A team of chemists at the University of California, San Diego (UCSD) has now designed a two-dimensional protein crystal that toggles between states of varying porosity and density. This is a first in biomolecular design that combined experimental studies with computation done on supercomputers. The research, published in April 2018 in Nature Chemistry, could help create new materials for renewable energy, medicine, water purification, and more. "We did an extensive set of molecular dynamics simulations and experiments, which explained the basis of the unusual structural dynamics of these artificial proteins, based on which we were able to make rational decisions and alter the structural dynamics of the assembly," said study co-author Akif Tezcan, a professor of chemistry and biochemistry at UCSD. Tezcan's team worked with the protein L-rhamnulose-1-phosphate aldolase (RhuA), which was modified with cysteine amino acids in its four corners at position 98 (C98RhuA). He and his group had previously published work on the self-assembly of this artificial, two-dimensional protein architecture, which he said showed an interesting behavior called auxeticity. "These crystalline assemblies can actually open and close in coherence," Tezcan said. "As they do, they shrink or expand equally in X and Y directions, which is the opposite of what normal materials do. We wanted to investigate what these motions are due to and what governs them."
An example of auxeticity can be seen in the Hoberman Sphere, a toy ball that expands through its scissor-like hinges when you pull the ends apart. "Our goal was to be able to do the same thing, using proteins as building blocks, to create new types of materials with advanced properties," Tezcan said. "The example that we're studying here was essentially the fruit of those efforts, where we used this particular protein that has a square-like shape, which we attached to one another through chemical linkages that were reversible and acted like hinges. This allowed these materials to form very well-ordered crystals that were also dynamic due to the flexibility of these chemical bonds, which ended up giving us these new, emergent properties." With more research, control over the opening and closing of the pores in the C98RhuA protein 2-D lattices could be used to capture or release specific molecular targets, useful for drug delivery or for creating better batteries, Tezcan said. Or the pores could selectively pass or block biological molecules to filter water. "Our idea was to be able to build complex materials, like evolution has done, using proteins as building blocks," Tezcan said. The way Tezcan's team did so was to first express the proteins in E. coli bacteria cells and purify them, after which they induced the formation of the chemical linkages that actually create the crystals of C98RhuA, which vary as a function of their oxidation state, through the addition of redox-active chemicals. "Once the crystals are formed, the big characterization becomes the openness or closeness of the crystals themselves," explained Tezcan. This was determined through statistical analysis of hundreds of images captured using electron microscopy. The experiments worked hand-in-hand with computation, primarily all-atom simulations using the NAMD software developed at the University of Illinois at Urbana-Champaign by the group of the late biophysicist Klaus Schulten.
Tezcan's team used a reduced system of just four proteins linked together, which can be tiled infinitely to get to the bottom of how the crystal opens and closes. "The reduced system allowed us to make these calculations feasible for us, because there are still hundreds of thousands of atoms, even in this reduced system," Tezcan said. His team took advantage of features specific to C98RhuA, such as using a single reaction coordinate corresponding to its openness. "We were really able to validate this model as being representative of what we observed in the experiment," Tezcan said. The all-atom molecular simulations of the C98RhuA crystal lattices were used to map the free-energy landscape. This energy landscape looks like a natural landscape, with valleys, mountains, and mountain passes, explained study co-author Francesco Paesani, a professor of chemistry and biochemistry at UCSD. "The valleys become the most stable configurations of your protein assemblies," Paesani said, which the molecular system prefers over having to spend energy to go over a mountain. And the mountain passes show the way from one stable structure to another. "Typically, free energy calculations are very expensive and challenging because essentially what you're trying to do is sample all possible configurations of a molecular system that contains thousands of atoms. And you want to know how many positions these atoms can acquire during a simulation. It takes a lot of time and a lot of computer resources," Paesani said. To meet these and other computational challenges, Paesani has been awarded supercomputer allocations through XSEDE, the Extreme Science and Engineering Discovery Environment, funded by the National Science Foundation. "Fortunately, XSEDE has provided us with an allocation on Maverick, the GPU computing clusters at the Texas Advanced Computing Center (TACC)," Paesani said. 
Maverick is a dedicated visualization and data analysis resource architected with 132 NVIDIA Tesla K40 "Atlas" graphics processing units (GPUs) for remote visualization and GPU computing, offered to the national community. "That was very useful to us, because the NAMD software that we use runs very well on GPUs. That allows us to speed up the calculations by orders of magnitude," Paesani said. "Nowadays, we can afford calculations that ten years ago we couldn't even dream about because of these developments, both on the NAMD software and on the hardware. All of these computing clusters that XSEDE provides are actually quite useful for all molecular dynamic simulations." Through XSEDE, Paesani used several supercomputing systems, including Gordon, Comet, and Trestles at the San Diego Supercomputer Center; Kraken at the National Institute for Computational Sciences; and Ranger, Stampede, and Stampede2 at TACC. "Because all the simulations were run on GPUs, Maverick was the perfect choice for this type of application," Paesani said. Computation and experiment worked together to produce results. "I think this is a beautiful example of the synergy between theory and experiment," Paesani said. "Experiment posed the first question. Theory and computer simulation addressed that question, providing some understanding of the mechanism. And then we used computer simulation to make predictions and ask the experiments to test the validity of these hypotheses. Everything worked out very nicely because the simulations explained the experiments at the beginning. The predictions that were made were confirmed by the experiments at the end. It is an example of the perfect synergy between experiments and theoretical modeling." Tezcan added that "chemists traditionally like to build complex molecules from simpler building blocks, and one can envision doing such a combination of design, experiment and computation for smaller molecules to predict their behavior.
But the fact that we can do it on molecules that are composed of hundreds of thousands of atoms is quite unprecedented." The science team also used molecular dynamics simulations to rigorously investigate the role of water in directing the lattice motion of C98RhuA. "This study showed us how important the active role of water is in controlling the structural dynamics of complex macromolecules, which in biochemistry can get overlooked," Tezcan said. "But this study showed, very clearly, that the dynamics of these proteins are driven actively by water dynamics, which I think brings the importance of water to the fore." Rob Alberstein, a graduate student in the Tezcan group and first author of the Nature Chemistry article, added: "At the heart of this research is understanding how the properties of materials arise from the underlying molecular or atomic structure. It's very difficult to describe. In this case we really sought to draw that connection as clearly as we could understand it ourselves and really show not only as from the experiment, where we can look at the macroscale behavior of these materials, but then with the computation relate that behavior back to what is actually going on at the scale of molecules. As we continue to develop as a society, we need to develop new materials for new sorts of global issues (water purification, etc.), so understanding this relationship between atomic structure and the material property itself and the ability to predict those is going to become increasingly important." The study, "Engineering the entropy-driven free-energy landscape of a dynamic nanoporous protein assembly," was published in April 2018 in the journal Nature Chemistry. The authors are Robert Alberstein, Yuta Suzuki, Francesco Paesani, and F. Akif Tezcan of the University of California, San Diego. Funding was provided by the US Department of Energy Award DE-SC0003844 and by the National Science Foundation through grant CHE-1453204.
All computer simulations were performed on the NSF-funded Extreme Science and Engineering Discovery Environment through grant ACI-1053575.
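The "valleys and mountain passes" picture Paesani describes can be estimated from simulation output by Boltzmann inversion: histogram the sampled values of the openness reaction coordinate and take the negative log of the resulting density. The sketch below is a deliberately minimal illustration of that idea, not the paper's actual NAMD enhanced-sampling workflow; the toy bimodal data stands in for sampled "closed" and "open" lattice states.

```python
import numpy as np

def free_energy_profile(samples, kT=0.596, bins=50):
    """Estimate a 1-D free-energy profile F(q) = -kT * ln p(q) by
    Boltzmann inversion of a histogram of reaction-coordinate samples.
    kT defaults to ~0.596 kcal/mol (room temperature)."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0                   # empty bins carry no information
    F = -kT * np.log(hist[mask])
    F -= F.min()                      # put the deepest valley at zero
    return centers[mask], F

# Toy bimodal data standing in for "closed" (q ~ 0.2) and "open" (q ~ 0.8)
# lattice states; a real profile would come from MD trajectories.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(0.2, 0.05, 5000),
                          rng.normal(0.8, 0.05, 5000)])
q, F = free_energy_profile(samples)
```

The two valleys in F correspond to the stable open and closed configurations, and the height of the ridge between them reflects how readily the lattice toggles, which is the quantity the expensive all-atom free-energy calculations pin down.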
  2. Researchers at the Indiana University Observatory on Social Media have launched upgrades to two tools playing a major role in countering the spread of misinformation online. The improvements to Hoaxy (hoaxy.iuni.iu.edu) and Botometer (botometer.iuni.iu.edu/) are supported by the Knight Prototype Fund on Misinformation, a joint venture of the John S. and James L. Knight Foundation, the Rita Allen Foundation and the Democracy Fund to address concerns about the spread of misinformation and to build trust in quality journalism. A third tool -- an educational game designed to make people smarter news consumers -- also launches with the upgrades. "The majority of the changes to Hoaxy and Botometer are specifically designed to make the tools more usable by journalists and average citizens," said Filippo Menczer, a professor in the IU School of Informatics, Computing and Engineering and a member of the IU Network Science Institute. "You can now easily detect when information is spreading virally, and who is responsible for its spread." Hoaxy is a search engine that shows users how stories from low-credibility sources spread on Twitter. Botometer is an app that assigns a score to Twitter users based on the likelihood that the account is automated. Hoaxy's new functions show users which stories are trending on Twitter, including those from low-credibility sources. It also indicates what proportion of the users who are spreading the stories are likely to be "bots." These new features were previewed April 12 at the International Symposium on Online Journalism in Austin, Texas, by Giovanni Luca Ciampaglia, a research scientist at the IU Network Science Institute who is part of the team that developed the tools. The new version of Botometer employs updated machine learning algorithms to identify "bots" with greater accuracy and is strongly integrated with Hoaxy. 
Users can observe not only how information spreads across Twitter, but also whether these messages are mostly shared by real people or pushed by a computer program potentially designed to sway public opinion. Automated accounts are commonly used to give the false impression that a large number of people are speaking about a specific topic online, Menczer said. Political campaigns, celebrities and advertisers are known to use bots to push specific agendas or products. The updated Hoaxy also has a "trending stories" section that displays popular news stories along with claims from low-credibility sources. This is possible because Hoaxy can now trace the spread of any online news story or hashtag over time across Twitter. Previously, users could only analyze headlines from specific websites identified by nonpartisan groups as likely to post false or misleading information. Ciampaglia said Hoaxy and Botometer currently process hundreds of thousands of daily online queries. The technology has enabled researchers, including a team at IU, to study how information flows online in the presence of bots. Examples are a study on the cover of the March issue of Science that analyzed the spread of false news on Twitter and an analysis from the Pew Research Center in April that found that nearly two-thirds of the links to popular websites on Twitter are shared by automated accounts. The newly launched project is Fakey (fakey.iuni.iu.edu), a web and mobile news literacy game that mixes news stories with false reports, clickbait headlines, conspiracy theories and "junk science." Players earn points by "fact-checking" false information and liking or sharing accurate stories. The project, led by IU graduate student Mihai Avram, was created to help people develop responsible social media consumption habits. An Android app is available, and an iOS version will launch shortly.
All three tools are united through their creators' goal to help individuals understand the role of misinformation online, Menczer said. "By partnering with other groups," he added, "we're able to significantly amplify the power of our work in the fight against online disinformation."
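As a caricature of the kind of signal a bot detector weighs, the sketch below scores an account from a few simple features. Everything here is invented for illustration: Botometer's actual classifier is a trained machine-learning ensemble over more than a thousand account features, not a handful of hand-set logistic weights.

```python
import math

def bot_score(tweets_per_day, followers, following, account_age_days):
    """Toy bot-likelihood score in [0, 1] from a few account features.
    Purely illustrative stand-in for a trained classifier."""
    # Hypothetical hand-set weights: a high posting rate and a lopsided
    # following-to-followers ratio push the score up; account age pulls
    # it down.
    ratio = following / max(followers, 1)
    z = (0.05 * tweets_per_day
         + 0.8 * math.log1p(ratio)
         - 0.002 * account_age_days)
    return 1 / (1 + math.exp(-z))  # squash to a 0-1 "bot probability"

# A hyperactive month-old account vs. an old account with balanced follows
suspicious = bot_score(tweets_per_day=500, followers=50,
                       following=5000, account_age_days=30)
ordinary = bot_score(tweets_per_day=5, followers=300,
                     following=200, account_age_days=2000)
```

The real system learns such weights from labeled bot and human accounts; the point of the sketch is only the shape of the pipeline: features in, a single probability-like score out, which Hoaxy can then aggregate over all accounts sharing a story.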
  3. In the field of self-driving cars, algorithms for controlling lane changes are an important topic of study. But most existing lane-change algorithms have one of two drawbacks: Either they rely on detailed statistical models of the driving environment, which are difficult to assemble and too complex to analyze on the fly; or they're so simple that they can lead to impractically conservative decisions, such as never changing lanes at all. At the International Conference on Robotics and Automation tomorrow, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new lane-change algorithm that splits the difference. It allows for more aggressive lane changes than the simple models do but relies only on immediate information about other vehicles' directions and velocities to make decisions. "The motivation is, 'What can we do with as little information as possible?'" says Alyssa Pierson, a postdoc at CSAIL and first author on the new paper. "How can we have an autonomous vehicle behave as a human driver might behave? What is the minimum amount of information the car needs to elicit that human-like behavior?" Pierson is joined on the paper by Daniela Rus, the Viterbi Professor of Electrical Engineering and Computer Science; Sertac Karaman, associate professor of aeronautics and astronautics; and Wilko Schwarting, a graduate student in electrical engineering and computer science. "The optimization solution will ensure navigation with lane changes that can model an entire range of driving styles, from conservative to aggressive, with safety guarantees," says Rus, who is the director of CSAIL. One standard way for autonomous vehicles to avoid collisions is to calculate buffer zones around the other vehicles in the environment. The buffer zones describe not only the vehicles' current positions but their likely future positions within some time frame. 
Planning lane changes then becomes a matter of simply staying out of other vehicles' buffer zones. For any given method of computing buffer zones, algorithm designers must prove that it guarantees collision avoidance, within the context of the mathematical model used to describe traffic patterns. That proof can be complex, so the optimal buffer zones are usually computed in advance. During operation, the autonomous vehicle then calls up the precomputed buffer zones that correspond to its situation. The problem is that if traffic is fast enough and dense enough, precomputed buffer zones may be too restrictive. An autonomous vehicle will fail to change lanes at all, whereas a human driver would cheerfully zip around the roadway. With the MIT researchers' system, if the default buffer zones are leading to performance that's far worse than a human driver's, the system will compute new buffer zones on the fly -- complete with proof of collision avoidance. That approach depends on a mathematically efficient method of describing buffer zones, so that the collision-avoidance proof can be executed quickly. And that's what the MIT researchers developed. They begin with a so-called Gaussian distribution -- the familiar bell-curve probability distribution. That distribution represents the current position of the car, factoring in both its length and the uncertainty of its location estimation. Then, based on estimates of the car's direction and velocity, the researchers' system constructs a so-called logistic function. Multiplying the logistic function by the Gaussian distribution skews the distribution in the direction of the car's movement, with higher speeds increasing the skew. The skewed distribution defines the vehicle's new buffer zone. But its mathematical description is so simple -- using only a few equation variables -- that the system can evaluate it on the fly. 
The researchers tested their algorithm in a simulation including up to 16 autonomous cars driving in an environment with several hundred other vehicles. "The autonomous vehicles were not in direct communication but ran the proposed algorithm in parallel without conflict or collisions," explains Pierson. "Each car used a different risk threshold that produced a different driving style, allowing us to create conservative and aggressive drivers. Using the static, precomputed buffer zones would only allow for conservative driving, whereas our dynamic algorithm allows for a broader range of driving styles."
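The Gaussian-times-logistic construction is simple enough to sketch directly. Below is a hypothetical one-dimensional version along the lane axis; the parameter names and scaling are mine, not the authors', but it shows the key behavior: multiplying the position Gaussian by a speed-dependent logistic function skews the buffer zone ahead of the vehicle, more strongly at higher speed.

```python
import numpy as np

def skewed_buffer(x, car_pos, sigma, speed, k=1.0):
    """Buffer-zone density along the lane axis: a Gaussian for the car's
    position (and position uncertainty), multiplied by a logistic
    function that skews mass in the direction of travel. The steepness
    parameter k and the speed scaling are illustrative choices."""
    gauss = np.exp(-0.5 * ((x - car_pos) / sigma) ** 2)
    logistic = 1 / (1 + np.exp(-k * speed * (x - car_pos)))
    buf = gauss * logistic
    return buf / buf.max()  # normalize peak height for comparison

x = np.linspace(-10, 10, 2001)
slow = skewed_buffer(x, car_pos=0.0, sigma=2.0, speed=0.2)
fast = skewed_buffer(x, car_pos=0.0, sigma=2.0, speed=2.0)
```

Because the skewed distribution is described by just a few parameters (position, spread, speed, steepness), evaluating it on the fly is cheap, which is what makes recomputing buffer zones during operation feasible.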
  4. For all that seismologists have learned about earthquakes, new technologies show how much remains to be discovered. In a new study in Science Advances, researchers at Columbia University show that machine learning algorithms could pick out different types of earthquakes from three years of earthquake recordings at The Geysers in California, one of the world's oldest and largest geothermal reservoirs. The repeating patterns of earthquakes appear to match the seasonal rise and fall of water-injection flows into the hot rocks below, suggesting a link to the mechanical processes that cause rocks to slip or crack, triggering an earthquake. "It's a totally new way of studying earthquakes," said study coauthor Benjamin Holtzman, a geophysicist at Columbia's Lamont-Doherty Earth Observatory. "These machine learning methods pick out very subtle differences in the raw data that we're just learning to interpret." The approach is novel in several ways. The researchers assembled a catalog of 46,000 earthquake recordings, each represented as energy waves in a seismogram. They then mapped changes in the waves' frequency through time, which they plotted as a spectrogram -- a kind of musical roadmap of the waves' changing pitches, were they to be converted to sound. Seismologists typically analyze seismograms to estimate an earthquake's magnitude and where it originated. But looking at an earthquake's frequency information instead allowed the researchers to apply machine-learning tools that can pick out patterns in music and human speech with minimal human input. With these tools, the researchers reduced each earthquake to a spectral "fingerprint" reflecting its subtle differences from the other quakes, and then used a clustering algorithm to sort the fingerprints into groups. 
The machine-learning assist helped researchers make the link to the fluctuating amounts of water injected belowground during the energy-extraction process, giving the researchers a possible explanation for why the computer clustered the signals as it did. "The work now is to examine these clusters with traditional methods and see if we can understand the physics behind them," said study coauthor Felix Waldhauser, a seismologist at Lamont-Doherty. "Usually you have a hypothesis and test it. Here you're building a hypothesis from a pattern the machine has found." If the earthquakes in different clusters can be linked to the three mechanisms that typically generate earthquakes in a geothermal reservoir -- shear fracture, thermal fracture and hydraulic cracking -- it could be possible, the researchers say, to boost power output in geothermal reservoirs. If engineers can understand what's happening in the reservoir in near real-time, they can experiment with controlling water flows to create more small cracks, and thus, heated water to generate steam and eventually electricity. These methods could also help reduce the likelihood of triggering larger earthquakes -- at The Geysers, and anywhere else fluid is pumped underground, including at fracking-fluid disposal sites. Finally, the tools could help identify the warning signs of a big one on its way -- one of the holy grails of seismology. The research grew out of an unusual artistic collaboration. As a musician, Holtzman had long been attuned to the strange sounds of earthquakes. With sound designer Jason Candler, Holtzman had converted the seismic waves of recordings of notable earthquakes into sounds, and then speeded them up to make them intelligible to the human ear. Their collaboration, with study coauthor Douglas Repetto, became the basis for Seismodome, a recurring show at the American Museum of Natural History's Hayden Planetarium that puts people inside the earth to experience the living planet. 
As the exhibit evolved, Holtzman began to wonder if the human ear might have an intuitive grasp of earthquake physics. In a series of experiments, he and study coauthor Arthur Paté, then a postdoctoral researcher at Lamont-Doherty, confirmed that humans could distinguish between temblors propagating through the seafloor or more rigid continental crust, and originating from a thrust or strike-slip fault. Encouraged, and looking to expand the research, Holtzman reached out to study coauthor John Paisley, an electrical engineering professor at Columbia Engineering and Columbia's Data Science Institute. Holtzman wanted to know if machine-learning tools might detect something new in a gigantic dataset of earthquakes. He decided to start with data from The Geysers because of a longstanding interest in geothermal energy. "It was a typical clustering problem," says Paisley. "But with 46,000 earthquakes it was not a straightforward task." Paisley came up with a three-step solution. First, a type of topic modeling algorithm picked out the most common frequencies in the dataset. Next, another algorithm identified the most common frequency combinations in each 10-second spectrogram to calculate its unique acoustic fingerprint. Finally, a clustering algorithm, without being told how to organize the data, grouped the 46,000 fingerprints by similarity. Number crunching that might have taken a computer cluster several weeks was done in a few hours on a laptop thanks to stochastic variational inference, another tool Paisley had earlier helped develop. When the researchers matched the clusters against average monthly water-injection volumes across The Geysers, a pattern jumped out: A high injection rate in winter, as cities send more run-off water to the area, was associated with more earthquakes and one type of signal. A low summertime injection rate corresponded to fewer earthquakes, and a separate signal, with transitional signals in spring and fall.
The researchers plan to next apply these methods to recordings of other naturally occurring earthquakes as well as those simulated in the lab to see if they can link signal types with different faulting processes. Another study published last year in Geophysical Research Letters suggests they are on a promising track. A team led by Los Alamos researcher Paul Johnson showed that machine learning tools could pick out a subtle acoustic signal in data from laboratory experiments and predict when the next microscopic earthquake would occur. Though natural faults are more complex, the research suggests that machine learning could lead to insights for identifying precursors to big earthquakes. The current research was funded with a 2016 RISE grant from Columbia's Office of the Executive Vice President. It even inspired a new course, "Sonic and Visual Representation of Data," which Holtzman and Paisley taught last spring in Columbia's Music Department and developed with a Columbia Collaboratory grant: "The Search for Meaning in Big Data."
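The pipeline the article describes — reduce each quake to a spectral fingerprint, then group fingerprints with no labels — can be sketched end to end on synthetic data. The band-energy fingerprint and the tiny k-means below are simplified stand-ins of my own; the study's actual pipeline used spectrograms, topic modeling, and stochastic variational inference.

```python
import numpy as np

def fingerprint(signal, n_bands=16):
    """Reduce a waveform to a coarse spectral fingerprint: the energy in
    a handful of frequency bands, normalized to unit sum. A toy stand-in
    for the spectrogram/topic-model fingerprints in the study."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    energy = np.array([b.sum() for b in np.array_split(spec, n_bands)])
    return energy / energy.sum()

def kmeans(X, k=2, iters=25, seed=0):
    """Minimal k-means: groups fingerprints by similarity, unsupervised."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        # Update each centroid; keep the old one if its cluster emptied
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

# Synthetic "quakes": 40 low-frequency and 40 high-frequency noisy bursts
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512)
low = [np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=512) for _ in range(40)]
high = [np.sin(2 * np.pi * 60 * t) + 0.3 * rng.normal(size=512) for _ in range(40)]
X = np.array([fingerprint(s) for s in low + high])
labels = kmeans(X)
```

On this toy data the unlabeled clustering separates the low-frequency from the high-frequency events, mirroring (at cartoon scale) how the real system grouped 46,000 quakes into signal types that were later matched against injection volumes.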
  5. UBC computer scientists have turned Amazon Alexa into a tool for software engineers, tasking the virtual assistant to take care of mundane programming tasks, helping increase productivity and speed up workflow. Software engineers use many different tools for any one project. They work with millions of lines of computer code and run their code through various independent tools for editing, building, testing and project management to keep their programs running smoothly. "It can be quite complicated to switch between the different tools because they each use a unique syntax and you have to understand how to put them together," said Nick Bradley, who led this work during his master's research in computer science at UBC. "The idea to use Alexa came out of my frustration from using these different tools and having to spend so much time looking up how to do it and use those tools together." Bradley and computer science professors Reid Holmes and Thomas Fritz decided to test whether Amazon's virtual assistant could help with this process. They wanted software engineers to use simple, conversational language to ask Alexa to complete some of their tasks, the same way we ask it to give us the weather forecast or play our favourite songs. Researchers said it was more than just a matter of teaching Alexa some key phrases and mapping different commands to the work; they also had to figure out common multi-step tasks engineers were performing and build a system that could automate those tasks. They then asked 21 engineers from local Vancouver software companies to test out their system and evaluate it. While the engineers found the tool useful and provided lots of positive feedback, there was one challenge. "The biggest problem was using voice commands in an office environment -- they found it distracting to their neighbours," said Bradley.
The computer scientists' next development will be to create a chat bot to fulfill a similar function so engineers can type minimal requests and have the system perform their multi-step tasks so they can focus on the more important parts of their jobs. Holmes says this research is part of a larger effort to understand how software engineers do their jobs. "The pace of change in the software field is so fast that engineers don't have time to be introspective and think about the way they work," he said. "Our job in academia is to step back and really think about how we can better support engineers to quickly and correctly build the kinds of software we depend upon in our modern society. Systems keep getting larger and more complex and using personal assistants could be one way to help developers be more effective within this fast-paced environment." The researchers also recognize that these virtual assistants could be programmed for a variety of occupations including medicine, law, or accounting. "You can imagine a situation where a lawyer is reading a legal brief and asks Alexa to find relevant cases on similar topics to help with research," said Holmes. The study will be presented next week at the International Conference on Software Engineering (ICSE) in Gothenburg, Sweden: https://www.cs.ubc.ca/~rtholmes/papers/icse_2018_bradley.pdf Watch a video of the tool in action: https://youtu.be/Y-LqJaYEDSA
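A stripped-down version of the idea — mapping a conversational request onto a canned multi-step task — fits in a few lines. The phrases, task names, and build steps below are invented for illustration; the real UBC tool plugs into Alexa's intent model and drives actual development tools rather than returning a list of commands.

```python
# Hypothetical mapping from spoken phrases to multi-step developer tasks.
TASKS = {
    "run the tests": ["git pull", "npm install", "npm test"],
    "ship it": ["npm run build", "npm test", "git push"],
}

def handle_request(utterance):
    """Match a conversational request against known phrases and return
    the multi-step task to execute; empty list means unrecognized."""
    key = utterance.lower().strip()
    for phrase, steps in TASKS.items():
        if phrase in key:
            return steps
    return []  # unrecognized: a real assistant would ask a follow-up

print(handle_request("Hey, run the tests please"))
# → ['git pull', 'npm install', 'npm test']
```

The hard part the researchers describe is upstream of this dispatch: discovering which multi-step tasks engineers actually perform, so that one short utterance can stand in for a whole command sequence.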
  6. Angelina Jolie just started work on the sequel to Maleficent, but it now appears that her next movie could be a slightly different reimagining of a pair of very different fairy tales. Jolie is in talks for a role alongside David Oyelowo in Come Away, a film that will act as a sort of prequel to both Peter Pan and Alice in Wonderland. The film was written by Marissa Kate Goodhill, in her feature debut, and, according to Deadline, will be directed by Brave's Brenda Chapman, marking her debut directing live-action. The story focuses on two children, who lose another sibling in a tragic accident. This causes their parents, played by Jolie and Oyelowo, to fall into despair. The children attempt to rescue their parents but they are eventually forced to choose between their difficult home life and their imagination. You see, one of these children becomes Alice in Wonderland, while the other becomes Peter Pan. Clearly, Come Away is going to be a much more serious drama than it might at first appear. It seems to be more about children coping with loss by escaping into fantasy than about the fantasy itself. Still, the idea of Peter Pan and Alice being brother and sister is certainly an interesting connection to create between those two fantasy stories. Both tales follow children who go to other worlds. One refuses to grow up while the other goes to a place that refuses to make sense. The idea that they are both means of escape for children having trouble with reality isn't a large leap to make. This is a far cry from Angelina Jolie's current project, a straight-up fairy tale movie that will see her return as Maleficent in the sequel to the movie that kickstarted Disney's love affair with reimaginings and recreations of their animated classics. The first Maleficent retold the story of Sleeping Beauty from the point of view of the iconic villain.
Now, the new film will take a step into uncharted territory: Disney's animated film never had a sequel, so the new movie will be entirely original. In addition to starring in the new film, David Oyelowo will also produce the feature via his Yoruba Saxon Productions banner. Brenda Chapman hasn't directed a movie since 2012's Brave, though she's credited as a writer on Disney's new adaptation of The Lion King, which is currently in production.
  7. Good news, Star Wars fans: Whether you've already seen Solo: A Star Wars Story or are boycotting the franchise, if you still need a little Star Wars love in your life, it's coming. Burlesque show Empire Strips Back, which originated in Australia way back in 2011, is heading to the United States. Its first show is tonight. The upcoming shows mark the first time the parody production has headed to the United States, and the show looks to be pulling out all the stops. The U.S. show will have a Tauntaun that will be ridden by cast members, and Jabba the Hutt will take part in the show. In addition to a female Luke Skywalker (who twerks to Nicki Minaj), a line of burlesque stormtroopers, and a dominatrix Darth Vader, we've also been given a first look at Empire Strips Back's Boba Fett in all her glory. (It's an especially timely post, considering the recent rumors about a Boba Fett movie.) I never thought I'd live to see the day where nipple tassels and lightsabers co-existed hand-in-hand (especially now that Disney owns Lucasfilm), but this parody has made it happen. The Guardian mentions the show has been particularly special in Australia because it has introduced thousands of people to the world of burlesque through a fandom for the Star Wars property. There was reportedly some "red tape" before bringing the show to the U.S., but Empire Strips Back was able to pull it off thanks to similarities in parody law between the U.S. and Australia. A representative for the show, however, says definitively that New Zealand will be a no-go. Around these parts, porn parodies are a lot more common, and we've seen porn parodies for popular properties, including but not limited to Back to the Future, Guardians of the Galaxy and Justice League before. Maybe burlesque is the wave of the future?
Empire Strips Back is touring California through part of June. The show will kick off in Riverside tonight and will hit The Theatre at Ace Hotel on June 1. Following that will be dates in San Diego, San Jose, Sacramento and San Francisco. The final show -- so far -- will happen on June 9. If you live in any one of those locations, tickets are available here. If not, be sure to follow the show on social media.
  8. Disney World is a little over a year away from the much-touted Star Wars Galaxy's Edge, which will certainly be a major boon for the park. Now, some new rumors indicate Universal Studios might be looking to respond with another major sci-fi franchise. Rumors have been building for some time that Universal Studios Orlando is looking to build a fourth park location, and the newest rumor about that space is that it may include a land dedicated to Star Trek. The fact that Universal Studios is working on a fourth park is still a rumor itself, as the company has yet to confirm such a thing, but that part at least is as confirmed as possible, as we know that Universal has been buying up land in Orlando, specifically land the theme park once owned but had sold off when Universal's former parent company was having financial issues. The only reason to own the land is to develop something, so a new theme park seems likely. Areas dedicated to Nintendo, Lord of the Rings and Fantastic Beasts and Where to Find Them have been previously rumored for this new park, and now, thanks to the Disney and More blog, we can add Star Trek to the list. The sci-fi brand previously came up as the potential basis for a show that would take up the location previously occupied by the Terminator 2 3D show, but now a source for the blog says that Star Trek could get much more than a show, as it's being considered for an entire land in the new park. As with all unnamed sources, we need to take this idea with a grain of salt, but there are many reasons why the rumor is so attractive. As mentioned, Star Wars Galaxy's Edge will be at Walt Disney World before you know it, and Universal is sure to want to combat that with something. Star Wars and Star Trek are the two biggest sci-fi properties in history, so it makes sense to take on one with the other. 
Also, while Star Trek has a massive history to draw from that would allow for many potential attractions, it's also received new life in recent years thanks to new movies and an even newer TV series, so it's a current name that is exciting again. Certainly, the idea of an attraction that puts you on board the USS Enterprise is an exciting one. The only question would be which version of the iconic ship would you build? There's so much Star Trek material to draw from that the biggest difficulty would be finding a way to get everybody's favorite Star Trek show or movie in the land, while still making it all fit together seamlessly. It doesn't seem that a Star Trek land is a sure thing at this point, just that it's being considered, and we likely won't get anything official on the new park until at least the land decisions have been made. We'll have to wait and see what's in store.
  9. There is no sell-by date for sequels. Pixar will try to revive The Incredibles after a 14-year gap. Over on YouTube Red, the original Karate Kid actors are back in character after a 34-year hiatus for Cobra Kai. So long as the story works, audiences gladly will embrace nostalgia and revisit classic narratives, just to spend a few more hours alongside a beloved creation. So, when Tom Cruise shares this photo to announce the start of production on Top Gun 2, tell me your pulse doesn't pick up just a tad as you start to hum Kenny Loggins' "Danger Zone?" If you didn't notice, the call sign on the back of the flight helmet that Tom Cruise is holding says "Maverick," which means that the ace fighter pilot with a hotshot attitude and courage to spare will be back in the air for this hotly anticipated sequel that truly has been years in the making. Literally, Top Gun 2 has been on Tom Cruise's mind since as far back as 2012, with different screenwriters and directors circling the gig. Now that Cruise has taken to Twitter to confirm the start of production, we know that nothing's going to ground this sequel as it prepares to take flight. The official title of this Top Gun sequel actually will be Top Gun: Maverick, placing added emphasis on the role that Tom Cruise will play in the follow-up story. There was a time when rumors swirled that the sequel would shift to a new generation of pilots (maybe even Goose's son), with Cruise doing a reduced walk-on role. While plot details on the latest shooting scripts are being kept under wraps, we are all hoping that Cruise's part in the story is enhanced, and that the script goes the distance to make the A-list action icon a major part of Maverick. After all, this is the character we want to catch up with, and relegating him to instructor status for a gaggle of fresh recruits we don't know would be a tad disappointing. It's strange how little we know about Top Gun: Maverick. We have a release date -- July 12, 2019. 
And we know that Tom Cruise will reunite with his Oblivion director Joseph Kosinski, as original Top Gun director Tony Scott died in 2012. But as for co-stars and official plot details, they are scarce. We know Top Gun: Maverick will be set in a modern age, and will have our pilots taking on drone tech. Beyond that, it's a mystery... for now. We will continue to track every step of Top Gun: Maverick's progress, now that production is underway. Need your Tom Cruise fix while you wait? You are in luck. The actor has Mission: Impossible -- Fallout reaching theaters in July. And to keep track of all movies opening in theaters in 2018, bookmark our handy Release Date schedule.
  10. SPOILER WARNING: The following article contains massive spoilers for Annihilation. If you have not yet seen the film, and don't wish to know any important details about the end, please bookmark this page and save it until after your screening! While writer/director Alex Garland is best known for his contributions behind the camera, it may surprise fans to learn how he directly contributed to the most terrifying scene in his latest film, Annihilation. During production, Garland took it upon himself to operate the creature in the mutant bear sequence, and he did what he could to try and shock his actresses while slowly crawling around the small house. As seen on screen, this was very effective, but the filmmaker recently told me that it was actually Tessa Thompson who really scared him during the shoot, basically because she is just a crazy talented actor: Actually, [Tessa Thompson] freaked me out. She looked so frightened and worried at one point that I was afraid. I somehow thought, 'Oh my God, she's not acting. Something terrible has happened. Oh, Christ.' But actually she was acting. I remember it really unsettled me, actually. But she's an incredible actor. I first learned about Alex Garland operating the mutant bear from Tessa Thompson when I spoke to her earlier this year during Annihilation's pre-release Los Angeles press day, and I followed up with the writer/director about it when I spoke with him on the phone late last week. I brought up the sequence and mentioned to him that Thompson had sincerely complimented his work with the bear, at which point he countered with a compliment of his own. It turns out she is such a talented individual that she fully convinced Garland she was legitimately terrified of being eaten by a horrific creature. And to his credit, that's how you feel watching her in the scene as well. Thankfully it sounds like Alex Garland didn't call "cut" mid-take, which allowed him to capture the full power of the scene. 
Of course, one big question still remains: why exactly was it that Alex Garland wound up operating the mutant bear in the first place? Thankfully, the filmmaker was very forthright in explaining. I asked him how he wound up with that gig, and he noted that it was really a matter of convenience, detailing the work that went into making that Annihilation scene as powerful as possible. Said Garland, We had different incarnations of the bear that we were using for different shots. So sometimes there's this huge stuntman who was knocking people out of the way and turning around, and then there was a really, really beautiful animatronic head that was used for some of the shots. And then the other times, for reasons to do with timing, it was just easier if I picked up this big foam bear head and used that, because rather than explaining exactly what it had to do, it was easier just to do it. Fans can now relive this terrifying sequence over and over again, with Annihilation now available on home video. The film has been released not only on DigitalHD, but as of Tuesday is now available on Blu-ray and DVD. Be sure to pick up a copy, and stay tuned for more about the amazing film here on CinemaBlend!
  11. SPOILER WARNING: The following article contains massive spoilers for Solo: A Star Wars Story. If you have not yet seen the film, and don't wish to know any important details about the end, please bookmark this page and save it until after your screening! By the time Ron Howard's Solo: A Star Wars Story starts to wrap up, it begins making what are clearly steps towards setting up a sequel. Not only does Alden Ehrenreich's Han learn about the Jabba the Hutt job on Tatooine that was first mentioned all the way back in 1977, but it's also revealed that Emilia Clarke's Qi'ra is serving the insidious Darth Maul (Ray Park/Sam Witwer) in his evil machinations. From a narrative perspective alone, these threads are more than enough to justify making a direct follow-up, and it's easy to understand why a franchise as cocky as Star Wars would want those hints dropped in. However, those decisions were all made before the film was completed and released to mass audiences. Now fans have had a chance to see the finished movie... and the response hasn't exactly been overwhelming. Its opening weekend results are far and away the weakest in the modern era of the franchise, with a number of contributing factors, including intense competition, brand fatigue, and mediocre buzz. As a result, an important question is left looming: how does Star Wars deal with the dangling plot threads? The typical Hollywood thing to do would be to craft a Solo 2 and hope for better results, but that really isn't the best option on the table. Instead, when it comes to following up, Lucasfilm might be much better off creatively utilizing some of their rumored upcoming projects - specifically the developing Boba Fett and the rumored Lando. To be frank, making a Han Solo origin movie was a troubling concept from the beginning. 
Star Wars is a franchise with an unfortunate legacy when it comes to prequels about beloved central characters, and even with Phil Lord and Chris Miller originally at the helm (the men known specifically for turning bad ideas into good ones), it seemed like a troubling proposition. There's a problematic urge in these movies to answer questions that nobody ever asked, and Solo did exactly that. Instead, the drive should be to dive deeper into the characters we know less about, and that's exactly what Boba Fett and Lando can do. At the same time, however, those movies can also be utilized unlike any other franchise sequels in history: picking up a story from a previous title and continuing it with a different protagonist. Shortly before Solo: A Star Wars Story hit theaters last weekend, it was announced that director James Mangold would be taking the helm of the Boba Fett movie, and one can hope that timing has an extra bit of coincidence wrapped into it. After all, the famed Mandalorian bounty hunter is known to have a past relationship with both Han Solo and Jabba the Hutt -- so what if they wound up being supporting characters in his perspective-driven story? Perhaps Jabba gives the smuggling job to Han, but doesn't trust him, so he hires Boba to track him and make sure he is doing his job. And given Boba's connection with the Empire (as established in The Empire Strikes Back), perhaps you even set up a twist that reveals he had something to do with Han getting boarded and forced to dump his cargo, as described by Han during his Mos Eisley Cantina conversation with Greedo in A New Hope. From there, Lando could round out a trilogy. There remains an unclear path that takes the smooth operator from his losing game of Sabacc with Han to his position as Baron-Administrator of Cloud City, and this film could fill the gaps while also continuing threads from Boba Fett -- which would also have Donald Glover's Lando Calrissian as a key supporting role. 
While not exactly an origin story, the movie could serve to contextualize a character who has never been given a ton of context, and has some built-in emotional conflict given the deal he winds up making with the Empire to give up Han, Leia, Chewie and C-3PO. As of right now, it doesn't really look like the Star Wars franchise has much of a plan. With the exception of J.J. Abrams' untitled Star Wars: Episode IX, none of the other developing projects have release dates, including not only James Mangold's Boba Fett: A Star Wars Story, but also the separate trilogies being developed by Rian Johnson and Game of Thrones showrunners D.B. Weiss and David Benioff, respectively. There doesn't seem to be a ton of order to all of the goings-on -- but the plan proposed here could help to streamline some of the ideas. Right now we can't say if it is even a possibility that this could play out, but it's an idea worth considering.
  12. Last decade, George Clooney's Danny Ocean led a gang of talented thieves and con men on three different heists across the Ocean's movies, but for 2018, it's a different member of the Ocean family leading the charge. For the upcoming Ocean's 8, Danny's sister Debbie, played by Sandra Bullock, will recruit her own team of criminals to pull off a daring jewel heist at the Met Gala, bringing the same comedic tone as its predecessors, but with an all-new cast. However, as Bullock and co-star Cate Blanchett acknowledged, early on they had doubts that this movie would ever actually get made. As Bullock and Blanchett explained: Sandra Bullock: I honestly didn't think it would happen. I thought it was a fun idea ... I didn't at the time think the movie would get made. Cate Blanchett: Two or three years ago this seemed like an impossibility, like how could you possibly get this made? And it's so great that it's being released now. We go, 'Well, of course.' A lot has shifted I think. Following the release of Ocean's Thirteen in 2007, George Clooney and director Steven Soderbergh made it clear that an Ocean's Fourteen would not be made, mainly because they wanted to go out on a high note, although Bernie Mac, one of the main cast members, passed away in 2008. So for a while, it seemed as if the Ocean's series would just be a trilogy, but then came word in 2015 that an all-female spinoff was in the works, and the year after, what we now know as Ocean's 8 assembled its main cast. But as Sandra Bullock and Cate Blanchett said at the recent Ocean's 8 press conference in New York that CinemaBlend attended, even they were skeptical that the spinoff would see the light of day. Director and co-writer Gary Ross also acknowledged that not only had there never been an all women-led ensemble of this scale before, but it also took time hooking these actresses in and convincing Warner Bros to put its weight behind the movie. 
As Ross put it: I realized that there'd never been this kind of ensemble (there had been a lot of male versions of this), this kind of kick ass ensemble of women coming together like this before and I thought that was easy. And then we went to Sandy [Bullock], and Sandy said, 'Well, if the script doesn't suck and you actually get these people that you hope to get, then I might be interested.' Which we took as an absolute yes. Then we went through the very long process of trying to get the movie made that really took three or four years. Along with Sandra Bullock and Cate Blanchett, Ocean's 8 main lineup includes Anne Hathaway, Mindy Kaling, Sarah Paulson, Helena Bonham Carter, Rihanna and Awkwafina, while the rest of the cast will feature Richard Armitage, James Corden, Dakota Fanning and Damian Young, as well as plenty of celebrity cameos. Assuming the spinoff does well critically and commercially, then perhaps like with the previous Ocean's movies, we'll see more unique heists being pulled off by these ladies on the big screen. Ocean's 8 will be released in theaters on June 8. If you're interested in finding out what other movies are coming out this year, browse through our 2018 release schedule.
  13. Fantastic Beasts and Where To Find Them: The Crimes of Grindelwald hits theaters this November, and with it the screenplay will also be published. The cover for the Fantastic Beasts 2 screenplay book has been revealed, and if you take a closer look at it, you'll likely spot some clues about the story for the second film in the Fantastic Beasts franchise. The image was posted over at the official Harry Potter fan-site, Pottermore, which credits graphic design team MinaLima for the artwork. As you can see, there's a skull up at the top and below the title text, and you'll surely notice the Eiffel Tower, likely a nod to the Parisian setting of the Fantastic Beasts sequel. There are also two animals that appear to be black cats facing one another with the Deathly Hallows symbol between the two of them. But if you look even closer at the book's cover, you might see some other things, like the woman's face (with a feathery looking headdress) next to two creatures familiar to those who've seen the first Fantastic Beasts movie, a niffler and a bowtruckle. And above the words "The Original Screenplay" are three objects. The one to the left seems like some kind of pendant or ornament hanging from a chain. In the center (above "The") is a stone in a case. I'm going to guess that's the philosopher's/sorcerer's stone, which would tie in with the last item: what appears to be a padlock with the initials "NF" on it. Could that be Nicolas Flamel, the famous alchemist? We know Flamel is expected to appear in the film, played by Brontis Jodorowsky, so that seems like a fair guess to make. There are also what appear to be two birds set around the Eiffel Tower. Are those phoenixes and will Dumbledore's phoenix, Fawkes, be making an appearance in The Crimes of Grindelwald? Here's hoping! Last among the things we spotted on this cover is this little guy (above), seen in both bottom corners of the book. I'm wondering if he's a grindylow. 
The lines and bubbles around him have a sort of aquatic vibe to them, which makes me think that little guy with the horns is some kind of water creature. To theorize even further, grindylows were known to hang out in the Great Lake on the Hogwarts grounds, and since we know Hogwarts will be featured in Fantastic Beasts 2, perhaps that's a clue that the Lake -- well known for being inhabited by a number of magical beasts, including a giant squid -- could be featured in this movie. It seems like a prime location for Newt Scamander to visit. Considering he attended Hogwarts, we have to imagine the magizoologist knows a thing or two about what's going on down there, but we'll have to wait and see if it comes into play when Fantastic Beasts And Where To Find Them 2 hits theaters November 16, 2018. The Fantastic Beasts 2 screenplay book is set to drop in the U.S. and U.K. the same day the movie arrives.
  14. USC scientists have unlocked a new, more efficient pathway for converting methane -- a potent gas contributing to climate change -- directly into basic chemicals for manufacturing plastics, agrochemicals and pharmaceuticals. In research published on Dec. 4 in the Journal of the American Chemical Society, chemists at USC Loker Hydrocarbon Research Institute say they have found a way to utilize this abundant and dangerous greenhouse gas, which is generally burnt or flared to produce energy. Among common greenhouse gases, carbon dioxide is often cited as the largest culprit for trapping heat on earth, contributing to climate change. However, it is not the most potent. That distinction belongs to methane. According to the Intergovernmental Panel on Climate Change, methane traps heat and warms the planet 86 times more than carbon dioxide over a 20-year horizon.

More fuel, fewer emissions, reduced energy use

Lead author Patrice T. D. Batamack, senior author G. K. Surya Prakash and Thomas Mathew of the USC Loker Hydrocarbon Research Institute used a catalyst called H-SAPO-34, derived from a class of nanoporous crystals called zeolites. This simple method of converting methane directly to ethylene and propylene, or olefins, would replace what are traditionally difficult, expensive, and inefficient processes that add greenhouse gases to the atmosphere. The majority of ethylene and propylene is produced from petroleum oil and shale liquid cracking, which consumes enormous amounts of energy. When USC's first Nobel Prize winner, George Olah, converted methane to olefins in 1985, the process required three steps. Since then, researchers have reduced it to two steps, but the Loker team is the first to realize the conversion with a single catalyst based on zeolites. "Contact time is the key for this effective and simple catalyst to produce usable fuel from methane. In real estate, they say, location, location, location. 
In chemistry, it is all about condition, condition, condition," said Prakash. Global methane emissions have surged since 2007 and output is particularly bad in the United States. According to a recent Harvard University study, the United States could be solely responsible for as much as 60 percent of the global growth in human-caused atmospheric methane emissions during this century. Contributing to the global surge is the increased supply of livestock and rice fields in countries like India and China, the two leaders in total methane output, according to the World Bank.

'If carbon is the problem, carbon has to be the solution'

Though methane is the most potent of the common greenhouse gases, and even after the largest methane leak in U.S. history at the Aliso Canyon natural gas storage facility a few years ago, there are no signs that its abundant production will slow down anytime soon. Shale fracking and other resource extraction techniques are increasing natural gas reserves, and the Loker scientists believe methane may soon become the most popular of all raw materials for producing petrochemical products. About 30 years ago, Prakash and his mentor Olah first began refining the concept of "The Methanol Economy," a host of methanol-based solutions mitigating the production cycle of the greenhouse gases that are accelerating climate change. While similar in structure and name, methane is not directly interchangeable with methanol, although most methanol is synthetically produced from methane. Methane is a naturally occurring gas and the simplest hydrocarbon, containing a single carbon atom. By further reducing the steps necessary to efficiently convert methane to olefins, the scientists at Loker may have brought us that much closer to realizing one of the original steps laid out in "The Methanol Economy." "If carbon is the problem, carbon has to be the solution. 
There is plenty of methane to go around in the world and it is becoming easier and safer to turn it into products that we can actually use," said Prakash. This research was made possible with the support of the USC Loker Hydrocarbon Research Institute and the U.S. Department of Energy.
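The 86-times figure cited above lends itself to a quick back-of-the-envelope conversion. As a minimal sketch (the function name and the example quantity are invented for illustration; only the GWP-20 factor of 86 comes from the article):

```python
# Illustrative only: express a methane emission in CO2-equivalent terms
# using the 20-year global warming potential (GWP-20) of 86 cited above.

GWP20_METHANE = 86  # kg CO2-equivalent per kg of CH4, 20-year horizon

def co2_equivalent(methane_kg: float, gwp: float = GWP20_METHANE) -> float:
    """Return the CO2-equivalent mass (kg) of a methane emission."""
    return methane_kg * gwp

# Example: one metric ton (1,000 kg) of leaked methane
print(co2_equivalent(1000))  # 86000 kg CO2-equivalent over 20 years
```

On this basis, a one-ton methane leak carries the 20-year warming impact of roughly 86 tons of carbon dioxide, which is why capturing or converting methane, rather than flaring it, matters so much.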
  15. Caltech researchers have made a discovery that they say could lead to the economically viable production of solar fuels in the next few years. For years, solar-fuel research has focused on developing catalysts that can split water into hydrogen and oxygen using only sunlight. The resulting hydrogen fuel could be used to power motor vehicles, electrical plants, and fuel cells. Since the only thing produced by burning hydrogen is water, no carbon pollution is added to the atmosphere. In 2014, researchers in the lab of Harry Gray, Caltech's Arnold O. Beckman Professor of Chemistry, developed a water-splitting catalyst made of layers of nickel and iron. However, no one was entirely sure how it worked. Many researchers hypothesized that the nickel layers, and not the iron atoms, were responsible for the water-splitting ability of the catalyst (and others like it). To find out for sure, Bryan Hunter (PhD '17), a former fellow at the Resnick Institute, and his colleagues in Gray's lab created an experimental setup that starved the catalyst of water. "When you take away some of the water, the reaction slows down, and you are able to take a picture of what's happening during the reaction," he says. Those pictures revealed the active site of the catalyst -- the specific location where water is broken down into oxygen -- and showed that iron was performing the water-splitting reaction, not nickel. "Our experimentally supported mechanism is very different than what was proposed," says Hunter, first author of a paper published February 6 in Joule, a journal of sustainable-energy research, describing the discovery. "Now we can start making changes to this material to improve it." Gray, whose work has focused on solar fuels for decades, says the discovery could be a "game changer" for the field. "This will alert people worldwide that iron is particularly good for this kind of catalysis," he says. 
"I wouldn't be at all shocked if people start using these catalysts in commercial applications in four or five years."
  16. Researchers at Kaunas University of Technology (KTU), Lithuania, are working on improving the efficiency of microbial fuel cells (MFC) by using modified graphite felt. Primary results show that the new MFC can generate 20 percent higher voltage than usual cells. Over the past 20 years, nearly three-fourths of human-caused emissions came from the burning of fossil fuels. Increasing pollution and decreasing fossil energy resources encourage scientists to look for new clean and sustainable alternative energy resources. Microbial fuel cells, which are also being researched at KTU laboratories, have broad usage possibilities and are one of the cleanest known energy sources. MFCs are powered by living microorganisms with clean and sustainable features; they can generate electricity from a broad range of organic substrates under natural conditions. "Microbial fuel production is probably the only technology in which electricity is generated from the oxidation of organic compounds at room temperature. In other words, there is no need to burn anything, and the process does not depend on sunlight," says Dr Kristina Kantminienė, researcher at KTU Faculty of Chemical Technology. According to KTU researchers, MFC technology is unique because of its multifunctional application: for example, wastewater and sludge collected in wastewater treatment plants can also be used as food for bacteria. Integration of MFC into a wastewater treatment system would significantly reduce the electrical energy needed for its operation and would turn the plant into a closed ecosystem. The energy surplus produced by MFC might be integrated into the electricity grid and used elsewhere. Although the idea that microorganisms can generate electricity was introduced in 1911, it became more actively investigated in the 2000s. Groups of researchers around the world are working with the MFC technology, attempting to improve the efficiency of the cells. 
KTU researchers are testing the qualities and biocompatibility of MFC anodes, as the efficiency of microbial fuel cells in large part depends on them. In the framework of the interdisciplinary research project "Innovative microbial fuel cells for sustainable production of bioelectricity" (MicrobElas), KTU researchers have developed an MFC prototype which uses modified graphite felt as an anode. "The modification of the anode allowed us to increase the cell voltage; it is 20 percent higher than that of the control MFC with the usual anode. Although we have been researching this technology for only a year, the first results are really inspiring," says Kantminienė. The researchers predict that although MFC technology will not displace other sources of renewable energy, it could be beneficial in small wastewater treatment plants or in remote regions where electrical energy supply is limited. In May, the results of the research will be introduced at the Topical Meeting of the International Society of Electrochemistry in Vilnius.
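For a rough sense of what the 20 percent voltage gain could mean in practice: under the simplifying, purely illustrative assumption of a fixed resistive load (where power follows P = V²/R), a 20 percent voltage increase implies roughly 44 percent more delivered power. A minimal sketch of that arithmetic:

```python
# Back-of-the-envelope sketch: power gain from a voltage gain into a fixed
# resistive load, where P = V^2 / R. Only the 20 percent voltage figure
# comes from the article; the fixed-load assumption is ours.

def power_gain(voltage_gain: float) -> float:
    """Fractional power increase for a fractional voltage increase, fixed R."""
    return (1 + voltage_gain) ** 2 - 1

print(f"{power_gain(0.20):.0%}")  # 44% more power at the same load
```

Real MFCs are not ideal voltage sources driving fixed resistors, so treat this as an upper-bound intuition rather than a performance claim.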
  17. Eggs may soon fuel more than people in the morning. Researchers from Osaka City University in Japan have developed a way to potentially use egg whites as a substrate to produce a carbon-free fuel. They published their results on February 2nd in Applied Catalysis B. "Hydrogen is a promising fuel and energy storage medium because hydrogen emits no global warming gas when used. Nevertheless, hydrogen generation reactions usually require fossil fuels and emit carbon dioxide," said Hiroyasu Tabe, a special appointment research associate at the Graduate School of Engineering at Osaka City University in Japan. According to Tabe, it would be extremely efficient to use a photocatalyst to speed the reaction of hydrogen generation from a renewable source, such as solar power. The reaction, called hydrogen evolution, produces gas that must be stored and kept from recombining into more common molecules that aren't useful for producing clean fuel. "Precise accumulation of molecules acting as catalytic components is important to construct a photocatalytic system," Tabe said. "When the molecular components are randomly distributed in the solution or in formless compounds, the catalytic reactions cannot proceed." One promising way to precisely accumulate these catalytic molecules is through the production of pure proteins by cultivated bacteria, but they require special lab equipment. Chicken eggs, however, are well-known vessels of protein-based chemicals, according to Tabe. The whites of chicken eggs, which are inexpensive and inexhaustible, contain lysozyme, a protein that forms porous crystals. "Lysozyme crystals have a highly ordered nanostructure and, thus, we can manipulate the molecular components when they accumulate in the crystals," Tabe said, noting that the crystal structure can be easily analyzed with X-ray technology. 
This analysis is of particular importance, according to Tabe, because the molecular components within the crystals must be manipulated precisely through what is called cooperative immobilization. This is achieved by the application of rose bengal, which is commonly used as a dye in eye drops to identify damage. In this case, it entered the solvent channels in the lysozyme crystals and accelerated the hydrogen evolution reaction, since the functional molecules and nanoparticles can be accumulated within the crystals' inner spaces. "These results suggest that porous protein crystals are promising platforms to periodically and rationally accumulate catalytic components by using molecular interactions," Tabe said.
  18. Using advanced computational methods, University of Wisconsin-Madison materials scientists have discovered new materials that could bring widespread commercial use of solid oxide fuel cells closer to reality. A solid oxide fuel cell is essentially an engine that provides an alternative way to burn fossil fuels or hydrogen to generate power. These fuel cells burn their fuel electrochemically instead of by combustion, and are more efficient than any practical combustion engine. As an alternative energy technology, solid oxide fuel cells are a versatile, highly efficient power source that could play a vital role in the future of energy. Solid oxide fuel cells could be used in a variety of applications, from serving as a power supply for buildings to increasing fuel efficiency in vehicles. However, solid oxide fuel cells are more costly than conventional energy technologies, and that has limited their adoption. "Better cathode catalysts can allow lower-temperature operation, which can increase stability and reduce costs, potentially allowing you to take your building off the electrical grid and instead power it with a solid oxide fuel cell running on natural gas," says Dane Morgan, a materials science and engineering professor at UW-Madison. "If we can get to that point with solid oxide fuel cells, the infrastructure of power to many buildings in the country could change, and it would be a very big transformation to a more decentralized power infrastructure." Led by Morgan and Ryan Jacobs, a staff scientist in Morgan's research group, a team of UW-Madison engineers has harnessed quantum mechanics-based computational techniques to search for promising new candidate materials that could enable solid oxide fuel cells to operate at lower temperatures, with higher efficiency and longer lifetimes. 
Their computational screening of more than 2,000 candidate materials from a broad class of compounds called perovskites yielded a list of 52 potential new cathode materials for solid oxide fuel cells. The researchers published details of their advance recently in the journal Advanced Energy Materials. "With this research, we've provided specific recommendations of promising compounds that should be explored further," says Morgan, whose work is supported by the U.S. Air Force and the National Science Foundation. "Some of the new candidate cathode materials we identified could be transformative for solid oxide fuel cells for reducing costs." In addition to identifying new materials, the researchers' approach allowed them to codify material design principles that had previously been based on intuition and to offer suggestions for improving existing materials. Typically, solid oxide fuel cells must operate at temperatures around 800 degrees Celsius. But operating at these high temperatures means materials in the fuel cell degrade quickly and limit the device's working life. The goal, says Jacobs, is to enable solid oxide fuel cells to operate at a lower temperature, and slow that degradation. Fuel cells with long lifetimes wouldn't need frequent replacements, making them more cost-effective. To achieve this goal, the researchers set out to find stable compounds with high activity to catalyze the oxygen reduction reaction, a chemical process key to solid oxide fuel cell energy applications. "If you can find new compounds that are both stable under the operating conditions of the fuel cell and highly catalytically active, you can take that stable, highly active material and use it at a reduced temperature while still achieving the desired performance from the fuel cell," explains Jacobs, who was the lead author of the study. 
However, using computational modeling to quantitatively calculate the catalytic activity of a perovskite compound is prohibitively difficult because of the high complexity of the oxygen reduction reaction. To overcome this challenge, the researchers used an approach where they selected a physical parameter that was more straightforward to calculate, and then showed empirically that it correlated with the catalytic activity, thus serving as an effective proxy for the catalytic activity. Once they established these correlations with data from experiments, the researchers were able to use high-throughput computational tools to effectively screen a large group of materials for high catalytic activity. The UW-Madison researchers are collaborating with a group at the National Energy Technology Laboratory (NETL), which conducted initial testing on one of the team's candidate cathode materials. "This research is ongoing, but the early tests by our NETL collaborators found the material to be quite promising," Morgan says. Morgan says this project is an example of the kind of advances that are aided by the Materials Genome Initiative, an ongoing national effort that aims to double the speed with which the country discovers, develops and manufactures new materials. "This project integrated correlations from experiments with online digital databases and high-throughput computational tools in order to design new solid oxide fuel cell materials, so it's exactly the kind of thing that gets enabled by the infrastructure and approaches that have been developed and put in place by the Materials Genome Initiative," Morgan says. This research was supported by grants from the U.S. Air Force (FA9550-08-0052 and FA9550-11-0299) and the National Science Foundation (SI2-1148011 and OCI-1053575).
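The screening strategy described above — rank candidates by a cheap proxy descriptor rather than by the full oxygen reduction reaction — can be sketched in a few lines. This is an illustrative sketch, not the study's actual code: the compound names, descriptor values, and the stability cutoff below are all invented for demonstration.

```python
# Hypothetical descriptor-based screening sketch. A proxy quantity that is
# cheap to compute (and empirically correlated with catalytic activity) is
# used to rank candidates, after first filtering out unstable compounds.
# All names and numbers here are illustrative, not data from the study.

def screen_candidates(candidates, stability_cutoff, descriptor_target):
    """Keep compounds below a stability (energy) cutoff, then rank them by
    how close their proxy descriptor lies to a target value."""
    stable = [c for c in candidates if c["energy_above_hull"] <= stability_cutoff]
    return sorted(stable, key=lambda c: abs(c["descriptor"] - descriptor_target))

candidates = [
    {"formula": "ABO3-x", "energy_above_hull": 0.02, "descriptor": -2.1},
    {"formula": "ABO3-y", "energy_above_hull": 0.15, "descriptor": -1.9},  # too unstable
    {"formula": "ABO3-z", "energy_above_hull": 0.01, "descriptor": -2.4},
]

ranked = screen_candidates(candidates, stability_cutoff=0.05, descriptor_target=-2.2)
print([c["formula"] for c in ranked])  # most promising first
```

The same two-step filter-then-rank pattern scales directly to the thousands of perovskite compositions the UW-Madison team screened.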
  19. Scientists, companies and government agencies are hard at work on decreasing greenhouse gas emissions that cause climate change. In recent years, biofuels produced from corn have emerged as a fuel source to power motor vehicles and, perhaps, airplanes. But corn is problematic as a biofuel source material. It's resource-intensive to grow, creates many environmental impacts, and is more useful as food. A study from Colorado State University finds new promise for biofuels produced from switchgrass, a non-edible native grass that grows in many parts of North America. Scientists used modeling to simulate various growing scenarios, and found a climate footprint ranging from -11 to 10 grams of carbon dioxide per megajoule -- the standard way of measuring greenhouse gas emissions. For comparison, gasoline results in 94 grams of carbon dioxide per megajoule. The study, "High resolution techno-ecological modeling of a bioenergy landscape to identify climate mitigation opportunities in cellulosic ethanol production," was published online Feb. 19 in Nature Energy. John Field, research scientist at the Natural Resource Ecology Lab at CSU, said what the team found is significant. "What we saw with switchgrass is that you're actually storing carbon in the soil," he said. "You're building up organic matter and sequestering carbon." His CSU research team works on second-generation cellulosic biofuels made from non-edible plant material such as grasses. Cellulose is the stringy fiber of a plant. These grasses, including switchgrass, are potentially more productive as crops and can be grown with less of an environmental footprint than corn. "They don't require a lot of fertilizer or irrigation," Field said. "Farmers don't have to plow up the field every year to plant new crops, and they're good for a decade or longer." 
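The figures quoted above translate into a large emissions cut relative to gasoline, and in the best case a net-negative footprint. A quick back-of-envelope check, using only the numbers from the article:

```python
# Fuel carbon intensities from the article, in grams of CO2 per megajoule
# of fuel energy. Negative values mean net carbon is stored in the soil.
GASOLINE_G_PER_MJ = 94.0
SWITCHGRASS_RANGE = (-11.0, 10.0)  # best to worst growing scenario

def pct_reduction(candidate, baseline):
    """Percent reduction in emissions relative to a baseline fuel."""
    return 100.0 * (baseline - candidate) / baseline

best = pct_reduction(SWITCHGRASS_RANGE[0], GASOLINE_G_PER_MJ)
worst = pct_reduction(SWITCHGRASS_RANGE[1], GASOLINE_G_PER_MJ)
print(f"Switchgrass ethanol cuts emissions by {worst:.0f}% to {best:.0f}% vs gasoline")
```

A reduction above 100% corresponds to the carbon-negative scenarios Field describes, where the crop sequesters more carbon in the soil than the fuel's production and use emit.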
Researchers chose a study site in Kansas because it has a cellulosic biofuel production plant, one of only three in the United States. The team used DayCent, an ecosystem modeling tool that tracks the carbon cycle, plant growth, and how growth responds to weather, climate and other factors at a local scale. It was developed at CSU in the mid-1990s. The tool allows scientists to predict whether crop production contributes to or helps combat climate change, and how feasible it is to produce certain crops in a given area. Previous studies on cellulosic biofuels have focused on the engineering details of the supply chain, such as the distance between the farms where the plant material is produced and the biofuel production plant to which it must be transported. However, the CSU analysis finds that the details of where and how the plant material is grown are just as significant, or even more so, for the greenhouse gas footprint of the biofuel, said Field. The biofuel industry is experiencing challenges due to low oil prices. The production plant referenced above has new owners and is undergoing a reorganization. But the future looks bright for biofuels and bioenergy, said Field. "Biofuels have some capabilities that other renewable energy sources like wind and solar power just don't have," said Field. "If and when the price of oil gets higher, we'll see continued interest and research in biofuels, including the construction of new facilities." Study co-authors include Samuel Evans (University of California-Berkeley), Ernie Marx and Mark Easter (Natural Resource Ecology Laboratory at CSU), Paul Adler (United States Department of Agriculture), Thai Dinh (University of Oklahoma), Bryan Willson (Energy Institute and Department of Mechanical Engineering, CSU) and Keith Paustian (Department of Soil and Crop Science, CSU).
  20. Researchers at KTH Royal Institute of Technology have successfully tested a new material that can be used for cheap and large-scale production of hydrogen -- a promising alternative to fossil fuel. Precious metals are the standard catalyst materials used for extracting hydrogen from water. The problem is that these materials -- such as platinum, ruthenium and iridium -- are too costly. A team from KTH Royal Institute of Technology recently announced a breakthrough that could change the economics of a hydrogen economy. Led by Licheng Sun, professor of molecular electronics at KTH, the researchers concluded that precious metals can be replaced by a much cheaper combination of nickel, iron and copper (NiFeCu). "The new alloy can be used to split water into hydrogen," says researcher Peili Zhang. "This catalyst becomes more efficient than the technologies available today, and significantly cheaper. This technology could enable a large-scale hydrogen production economy," he says. Hydrogen can be used, for example, to reduce carbon dioxide emissions from steel production or to produce diesel and aircraft fuel. It's not the first time a cheaper material has been proposed for water splitting, but the researchers argue that their solution is more effective than others. They published their results recently in the scientific journal Nature Communications. "The high catalytic performance of core-shell NiFeCu for water oxidation is attributed to the synergistic effect of Ni, Fe and Cu," Zhang says. Zhang says that copper plays an interesting role in the preparation of the electrode. In an aqueous solution, surface copper dissolves, leaving a very porous structure that enhances the electrochemically active surface area. "The porous oxide shell with its high electrochemically active surface area is responsible for the catalytic activity, while the metallic cores work as facile electron transport highways," Zhang says. 
Sun has previously made progress in this field of research, including the construction of artificial photosynthesis (Nature Chem. 4/2012) and a catalyst based on nickel and vanadium (Nature Com. 7/2016). His and his colleagues' research was one of the reasons that US President Barack Obama visited KTH in 2013, during the second-ever state visit to Sweden by an American president.
  21. Researchers from RMIT University in Melbourne, Australia have demonstrated for the first time a working rechargeable "proton battery" that could re-wire how we power our homes, vehicles and devices. The rechargeable battery is environmentally friendly, and has the potential, with further development, to store more energy than currently-available lithium ion batteries. Potential applications for the proton battery include household storage of electricity from solar photovoltaic panels, as done currently by the Tesla 'Powerwall' using lithium ion batteries. With some modifications and scaling up, proton battery technology may also be used for medium-scale storage on electricity grids -- like the giant lithium battery in South Australia -- as well as powering electric vehicles. The working prototype proton battery uses a carbon electrode as a hydrogen store, coupled with a reversible fuel cell to produce electricity. It's the carbon electrode plus protons from water that give the proton battery its environmental, energy and potential economic edge, says lead researcher Professor John Andrews. "Our latest advance is a crucial step towards cheap, sustainable proton batteries that can help meet our future energy needs without further damaging our already fragile environment," Andrews said. "As the world moves towards inherently-variable renewable energy to reduce greenhouse emissions and tackle climate change, requirements for electrical energy storage will be gargantuan. "The proton battery is one among many potential contributors towards meeting this enormous demand for energy storage. Powering batteries with protons has the potential to be more economical than using lithium ions, which are made from scarce resources. "Carbon, which is the primary resource used in our proton battery, is abundant and cheap compared to both metal hydrogen-storage alloys, and the lithium needed for rechargeable lithium ion batteries." 
During charging, the carbon in the electrode bonds with protons generated by splitting water with the help of electrons from the power supply. The protons are released again and pass back through the reversible fuel cell to form water with oxygen from air to generate power. Unlike fossil fuels, the carbon does not burn or cause emissions in the process. The researchers' experiments showed that their small proton battery, with an active inside surface area of only 5.5 square centimetres (smaller than a 20 cent coin), was already able to store as much energy per unit mass as commercially-available lithium ion batteries. This was before the battery had been optimised. "Future work will now focus on further improving performance and energy density through use of atomically-thin layered carbon-based materials such as graphene, with the target of a proton battery that is truly competitive with lithium ion batteries firmly in sight," Andrews said. RMIT's research on the proton battery has been partly funded by the Australian Defence Science and Technology Group and the US Office of Naval Research Global. How the proton battery works The working prototype proton battery combines the best aspects of hydrogen fuel cells and battery-based electrical power. The latest version combines a carbon electrode for solid-state storage of hydrogen with a reversible fuel cell to provide an integrated rechargeable unit. The successful use of an electrode made from activated carbon in a proton battery is a significant step forward and is reported in the International Journal of Hydrogen Energy. During charging, protons produced by water splitting in a reversible fuel cell are conducted through the cell membrane and directly bond with the storage material with the aid of electrons supplied by the applied voltage, without forming hydrogen gas. In electricity supply mode this process is reversed; hydrogen atoms are released from the storage and lose an electron to become protons once again. 
These protons then pass back through the cell membrane where they combine with oxygen and electrons from the external circuit to re-form water. A major potential advantage of the proton battery is much higher energy efficiency than conventional hydrogen systems, making it comparable to lithium ion batteries. The losses associated with hydrogen gas evolution and splitting back into protons are eliminated. Several years ago the RMIT team showed that a proton battery with a metal alloy electrode for storing hydrogen could work, but its reversibility and rechargeability were too low. Also, the alloy employed contained rare-earth elements, and was thus heavy and costly. The latest experimental results showed that a porous activated-carbon electrode made from phenolic resin was able to store around 1 wt% hydrogen in the electrode. This is an energy per unit mass already comparable with commercially-available lithium ion batteries, even though the proton battery is far from being optimised. The maximum cell voltage was 1.2 volts.
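The claim that 1 wt% hydrogen storage is already comparable to lithium ion batteries can be sanity-checked with a rough calculation. This is a back-of-envelope sketch, not a figure from the paper: it assumes one electron per stored hydrogen atom and uses the reported 1.2 V maximum cell voltage (the average discharge voltage would be lower), and it counts electrode mass only, not the full cell.

```python
# Rough specific-energy estimate for a carbon electrode storing 1 wt%
# hydrogen, discharged at the reported maximum cell voltage of 1.2 V.
# Assumption: each stored H atom contributes one electron on discharge.
FARADAY = 96485.0   # C per mol of electrons
M_H = 1.008         # g/mol, atomic hydrogen
wt_frac_H = 0.01    # 1 wt% hydrogen in the electrode
cell_voltage = 1.2  # V (maximum, so this is an upper-bound estimate)

mol_H_per_kg = 1000.0 * wt_frac_H / M_H       # mol of H (and electrons) per kg electrode
charge_per_kg = mol_H_per_kg * FARADAY        # coulombs per kg electrode
energy_Wh_per_kg = charge_per_kg * cell_voltage / 3600.0

print(f"~{energy_Wh_per_kg:.0f} Wh per kg of electrode material")
```

The result, on the order of a few hundred watt-hours per kilogram of electrode material, is indeed in the same range as the electrode-level specific energy of commercial lithium ion cells, consistent with the article's comparison.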
  22. Farm manure could be a viable source of renewable energy to help reduce greenhouse gas emissions that cause global warming. Researchers at the University of Waterloo are developing technology to produce renewable natural gas from manure so it can be added to the existing energy supply system for heating homes and powering industries. That would eliminate particularly harmful gases released by naturally decomposing manure when it is spread on farm fields as fertilizer and partially replace fossil natural gas, a significant contributor to global warming. "There are multiple ways we can benefit from this single approach," said David Simakov, a professor of chemical engineering at Waterloo. "The potential is huge." Simakov said the technology could be viable with several kinds of manure, particularly cow and pig manure, as well as at landfill sites. In addition to being used by industries and in homes, renewable natural gas could replace diesel fuel for trucks in the transportation sector, a major source of greenhouse gas emissions. To test the concept, researchers built a computer model of an actual 2,000-head dairy farm in Ontario that collects manure and converts it into biogas in anaerobic digesters. Some of that biogas is already used to produce electricity by burning it in generators, reducing the environmental impact of manure while also yielding about 30 to 40 percent of its energy potential. Researchers want to take those benefits a significant step further by upgrading, or converting, biogas from manure into renewable natural gas. That would involve mixing it with hydrogen, then running it through a catalytic converter. A chemical reaction in the converter would produce methane from carbon dioxide in the biogas. Known as methanation, the process would require electricity to produce hydrogen, but that power could be generated on-site by renewable wind or solar systems, or taken from the electrical grid at times of low demand. 
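The methanation step described above is the Sabatier reaction, CO2 + 4 H2 -> CH4 + 2 H2O: four moles of hydrogen are consumed per mole of carbon dioxide upgraded. A minimal stoichiometry sketch shows how to size the hydrogen demand for a biogas stream; the 40% CO2 content used in the example is a typical assumed value, not a figure from the Waterloo study.

```python
# Sabatier methanation stoichiometry: CO2 + 4 H2 -> CH4 + 2 H2O.
# Estimate the hydrogen needed to upgrade the CO2 fraction of a biogas
# stream into additional methane. Example numbers are illustrative.

def h2_required(biogas_kmol, co2_fraction):
    """kmol of H2 needed to methanate all the CO2 in a biogas stream."""
    co2 = biogas_kmol * co2_fraction
    return 4.0 * co2  # 4 mol H2 per mol CO2

# 100 kmol of raw biogas at an assumed ~40% CO2 content:
h2 = h2_required(100.0, 0.40)
print(h2)  # 160.0 kmol H2, producing 40 kmol of additional CH4
```

This is why the scheme needs a hydrogen source: the electricity (renewable or off-peak grid power) mentioned in the article goes into electrolysis to supply those four equivalents of H2.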
The net result would be renewable natural gas that yields almost all of manure's energy potential and also efficiently stores electricity, but has only a fraction of the greenhouse gas impact of manure used as fertilizer. "This is how we can make the transition from fossil-based energy to renewable energy using existing infrastructure, which is a tremendous advantage," said Simakov, who collaborates with fellow chemical engineering professor Michael Fowler. The modelling study showed that a $5-million investment in a methanation system at the Ontario farm would, with government price subsidies for renewable natural gas, have about a five-year payback period. A paper on modelling of a renewable natural gas generation facility at the Ontario farm, which also involved a post-doctoral researcher and several Waterloo students, was recently published in the International Journal of Energy Research.
  23. Any resident of the Great Plains can attest to the massive scale of wind farms that increasingly dot the countryside. In the Midwest and elsewhere, wind energy accounts for an ever-bigger slice of U.S. energy production: In the past decade, $143 billion was invested into new wind projects, according to the American Wind Energy Association. However, the boom in wind energy faces a hurdle -- how to effectively and cheaply store energy generated by turbines when the wind is blowing, but energy requirements are low. "We get a lot of wind at night, more than at daytime, but demand for electricity is lower at night, so, they're dumping it or they lock up turbines -- we're wasting electricity," said Trung Van Nguyen, professor of petroleum & chemical engineering at the University of Kansas. "If we could store this excess at night and sell or deliver it during daytime at peak demand, this would allow wind farm owners to make more money and leverage their investment. At the same time, you deploy more wind energy and reduce demand for fossil fuels." Since 2010, Nguyen has headed research to develop an advanced hydrogen-bromine flow battery, an advanced industrial-scale battery design -- it would be roughly the size of a semi-truck -- that engineers have strived to develop since the 1960s. It could work just as well to store electricity from solar farms, to be discharged overnight when there's no sun. Funded first by the National Science Foundation and later by the Advanced Research Projects Agency-Energy, Nguyen has worked with researchers from the University of California at Santa Barbara, Vanderbilt University, the University of Texas at Arlington and Case Western Reserve University. Along the way, Nguyen has overseen breakthrough work on key components of hydrogen-bromine battery design. For one, there's the electrode Nguyen developed at KU. A battery's electrode is where the electrical current enters or leaves the battery when it's discharged. 
To be maximally efficient, an electrode needs a lot of surface area. Nguyen's team has developed a higher-surface-area carbon electrode by growing carbon nanotubes directly on the carbon fibers of a porous electrode. "Before our work, people used paper-carbon electrodes and had to stack electrodes together to generate high-power output," he said. "The electrodes had to be a lot thicker and more expensive because you had to use multiple layers -- they were bulkier and more resistive. We came up with a simple but novel idea to grow tiny carbon nanotubes directly on top of carbon fibers inside of electrodes -- like tiny hairs -- and we boosted the surface area by 50-70 times. We solved the high-surface-area requirement for hydrogen-bromine battery electrodes." A key issue remaining before a hydrogen-bromine battery can be marketed successfully is the development of an effective catalyst to accelerate the reactions on the hydrogen side of the battery and provide higher output while surviving the extreme corrosiveness in the system. Now, with funding from an NSF sub-award through a private company called Proton OnSite, Nguyen is verging on solving this last barrier. "I think we're on the verge of a real breakthrough," he said. "We need a durable catalyst, something that has the same activity as the best catalyst out there, but that can survive this environment. Our previous material didn't have sufficient surface area to give enough power output. But I've been able to continue to work on this rhodium sulfide catalyst. I think we've figured out a way to increase surface area. We now have a better way, and we may publish that in three to six months -- we have some minor issues to resolve, but I think we'll have a suitable material for the hydrogen reaction in this system." The new results on developing an industrial-scale advanced hydrogen-bromine flow battery will be presented at the meeting of the Electrochemical Society in Seattle this May. 
Indeed, Nguyen -- who has founded several startup companies over his research career -- noted the new hydrogen-bromine battery could soon be commercialized and easily scaled to megawatt (power) and megawatt-hour (energy) levels, coming in modular container form with about 1 MWh per full-size container. But he cautioned it could only be used in remote, industrial sites -- places like wind and solar farms, where the huge batteries likely would be buried underground. "This energy storage system, because of its corrosiveness, isn't suitable for residential or commercial systems," he said. "Bromine is like chlorine gas. Dig a hole, line it with cement or plastic, drop this battery down and cover it up -- it should be in an enclosed or sealed system to prevent leakage or emission of bromine gas. This will be suitable only for large-scale remote energy storage like solar farms and wind farms." The KU researcher said the rise of renewable energy would depend on technology breakthroughs that make the economics attractive to energy producers and investors, and he hoped his new battery design could play a part. "The way we use fossil fuel for energy is very inefficient, wasteful and generates greenhouse gases," Nguyen said. "For fossil fuels, you make the initial investment, and you also pay for operation every day -- you pay for coal or natural gas for the rest of the life of the power plant. Once you make the initial investment in renewables, the electricity you make is free."
  24. Even traces of oxygen can deactivate molecular catalysts incorporated in fuel cells. This drawback has hampered the use of such catalysts -- based on abundant metals and mimicking the active centers of natural biocatalysts -- in technologically relevant applications. Now, a team of researchers from Ruhr-Universität Bochum (RUB), the Max Planck Institute for Chemical Energy Conversion in Mülheim and the Pacific Northwest National Laboratory in Washington, USA, has equipped such a catalyst with a self-defense mechanism against molecular oxygen. An alternative to scarce noble-metal catalysts: Hydrogen is believed to be one of the most promising energy vectors of the future. Typically, catalysts based on noble and scarce materials like platinum are used in highly efficient H2/O2-driven fuel cells. A promising alternative to these expensive and limited catalyst materials are molecular catalysts based on abundant metals like nickel and/or iron, which mimic the active center of nature's highly active hydrogenases. Oxygen damage: An interesting class of such molecular catalysts are the DuBois-type complexes. Their active center comprises a central nickel atom coordinated by pendant bases. These catalysts exhibit an activity similar to that of the hydrogenases, and their ligand structure can be altered to enable catalysis in aqueous systems and to allow covalent attachment to electrode surfaces. "The latter is of particular importance for technological applications, since the immobilization enhances the performance of such fuel cell systems," explains Prof Dr Wolfgang Schuhmann, Analytical Chemistry, RUB. A drawback of such catalysts is their high oxygen sensitivity, which has hampered their use in current fuel cell systems. 
However, in analogy to the hydrogenases, which can be protected by incorporating the biocatalyst into an oxygen-reducing polymer matrix, the researchers were now able to transfer this protection system to a DuBois-type catalyst. A polymer induces self-protection: For protection against oxygen, the researchers introduced a hydrophobic and redox-inactive polymer as an immobilization matrix for the nickel-complex-based catalyst. Embedding the catalyst into the polymer matrix ensures the formation of two separated reaction layers: a catalytically active layer close to the electrode surface and a protection layer at the polymer/electrolyte interface. The first layer allows for an efficient conversion of hydrogen at the electrode surface, and the second layer removes incoming oxygen at the interface, thus protecting the active layer from oxygen damage. Electrically disconnected layers: According to Wolfgang Schuhmann, "the catalyst itself provides the protection against oxygen. For this, the catalyst uses electrons from the hydrogen oxidation in the outer polymer layer, which are then used to reduce oxygen at the catalyst centers." This becomes possible because the developed polymer matrix electrically disconnects the nickel catalyst located in the outer polymer layer from the electrode surface. Hence, all electrons extracted from the hydrogen oxidation in the outer layer can be used for the reduction of harmful oxygen at the polymer/electrolyte interface. Concomitantly, the polymer prevents the transfer of electrons from the active hydrogen oxidation layer at the electrode surface to the protection layer. Thus, all electrons from the active layer are transferred to the electrode and are not used for protection. The polymer/catalyst-modified electrodes showed excellent long-term stability and remarkable current densities, which are both prerequisites for powerful fuel cells. 
The proposed hydrogen oxidation electrodes are thus a promising alternative for the development of sustainable and cost-efficient energy conversion systems.
  25. Mitsubishi Hitachi Power Systems (MHPS) and Carnegie Mellon University (CMU) today announced the release of the 2018 Carnegie Mellon Power Sector Carbon Index, at CMU Energy Week, hosted by the Wilton E. Scott Institute for Energy Innovation. The Index tracks the environmental performance of U.S. power producers and compares current emissions to more than two decades of historical data collected nationwide. This release marks the one-year anniversary of the Index, developed as a new metric to track power sector carbon emissions performance trends. "The Carnegie Mellon Power Sector Carbon Index provides a snapshot of critical data regarding energy production and environmental performance," said Costa Samaras, Assistant Professor of Civil and Environmental Engineering at CMU. "We've found this index to provide significant insight into trends in power generation and emissions. In particular, the data have shown that emissions intensity has fallen to the lowest level on record, as a combination of natural gas and renewable power have displaced more carbon intensive coal-fired power generation." The latest data revealed the following findings: U.S. power plant emissions averaged 967 lb. CO2 per megawatt-hour (MWh) in 2017, which was down 3.1 percent from the prior year and down 26.8 percent from the annual value of 1,321 lb CO2 per MWh in 2005. The result for 2016 was initially reported as 1,001 lb/MWh, but was later revised downward to 998 lb/MWh. "The power industry has made significant progress in reducing emissions for over a decade, as new technology, state and federal policies and market forces have increased power generation from natural gas and renewables, and decreased power generation from coal. As this Change in Power continues, the Carnegie Mellon Power Sector Carbon Index will not only report the results but also provide analysis of the underlying reasons for the changes we're seeing," said Paul Browning, President and CEO of MHPS Americas. 
"Our team at MHPS is proud to support this important work by Carnegie Mellon researchers." This year, CMU announced enhancements to the Index. New regional information from within the US will be reported, allowing for greater insight into the impact of regional trends on fuel types, usage, and emissions. In addition, the index will begin to incorporate emissions data from other countries across North and South America. As the Index continues to expand, it will serve as a source of objective insight regarding emissions trends across the Americas for policy makers, regulators, utilities, industry analysts and the public. "We are excited to regionalize and internationalize our work on the Power Sector Carbon Index," said Ines Azevedo, Professor of Engineering and Public Policy, and co-Director of the Climate and Energy Decision Making Center, "and to help educate and support the decision making process regarding emissions reductions." For the complete findings of the Index, please visit emissionsindex.org.
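The percentage changes quoted for the Index follow directly from the reported intensity values, using the revised 2016 figure of 998 lb/MWh as the prior-year baseline:

```python
# Reproducing the year-over-year and 2005-baseline percentage changes
# quoted for the Power Sector Carbon Index (lb CO2 per MWh).
intensity = {2005: 1321.0, 2016: 998.0, 2017: 967.0}  # 2016 value as revised

def pct_change(new, old):
    """Signed percent change from an old value to a new value."""
    return 100.0 * (new - old) / old

yoy = pct_change(intensity[2017], intensity[2016])
since_2005 = pct_change(intensity[2017], intensity[2005])
print(f"{yoy:.1f}% vs 2016, {since_2005:.1f}% vs 2005")
```

Both results round to the figures in the article: a 3.1 percent drop from the prior year and a 26.8 percent drop from 2005.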