European Commissioner Tibor Navracsics met with Chinese Vice-Premier Liu Yandong on 13-14 November 2017 to discuss education, culture, youth and sport, on the occasion of the 4th EU-China High Level People-to-People Dialogue in Shanghai. The dialogue was launched in 2012 to build trust and understanding between the peoples of the EU and China. This year’s exchanges focused on culture, but education, gender equality, youth and, for the first time, sport were also discussed. Following the meeting, Navracsics said: “The EU and China increasingly share global responsibilities. We work together on complex issues, from fighting poverty and tackling climate change to boosting trade and security. We build on shared views but sometimes we need to bridge differences. Promoting mutual understanding and respect between our people and cultures is therefore today more important than ever if we want to succeed.” Over the past decade the EU and China have closely co-operated in the areas of education, training, culture, multilingualism and youth through sector-focused policy dialogues. The two parties took stock of progress achieved under Erasmus+ mobility actions: since 2015, more than 4,000 students and staff have benefitted from the programme. Additionally, with over 70 universities participating in the action, China remains the top beneficiary of capacity-building projects among partner countries, contributing to the modernisation and internationalisation of China’s higher education system. In research and innovation, following the outcome of the 3rd China-EU High Level Innovation Cooperation Dialogue held on 2 June 2017, both parties agreed to boost researchers’ mobility through the Marie Skłodowska-Curie Actions. In the framework of the dialogue on gender equality, both sides discussed how to improve women’s economic empowerment and work-life balance. The post EU-China strengthen co-operation appeared first on Horizon 2020 Projects.
Asahi Kasei’s electrolyser technology will be one of the cornerstones of CO2 reuse and therefore of the reduction of CO2 emissions. The ALIGN-CCUS Project Consortium announced the launch of the ALIGN-CCUS (carbon capture, utilisation and storage) Project, a partnership which runs from 2017 to 2020 and consists of 31 research institutes and industrial companies from five European countries. The project received €15m in funding from the European ERA-NET ACT (Accelerating CCS Technologies) fund and aims to transform six European industrial regions into low-carbon centres by 2025. ACT is a European Union initiative to accelerate the deployment of safe and cost-effective carbon capture and storage (CCS) technologies; it receives funding support from the European Commission’s Horizon 2020 instrument, the ERA-NET Cofund. Hideki Tsutsumi, managing director, said: “The ALIGN-CCUS project contributes to reducing CO2 emissions in the fields of transportation and power generation. We are very happy to participate in this project by using our alkaline water electrolysis system to produce green hydrogen. By further improving our technology we will be a leading company for the realisation of a hydrogen society.” Europe, with its ambitious goals for CO2 reduction, its phase-out of nuclear energy by 2022 and its high share of electric power supplied by renewable energy sources, has a pressing need for reliable CCUS and power storage technologies. In recent years hydrogen has been a focus in the field of energy storage (power-to-gas) and in producing fuel for automobiles (power-to-fuel). Hydrogen produced with Asahi Kasei’s alkaline water electrolysis system and CO2 captured at power plants will be transformed into fuels such as green methanol and green dimethyl ether (DME). Together with European partner institutions and companies, Asahi Kasei Europe will be a member of Work Package 4 of the ALIGN-CCUS Project.
PicoQuant and the Humboldt University Berlin, Germany, are co-hosting an Early Stage Researcher (ESR) within the framework of the Marie Skłodowska-Curie Actions Innovative Training Network BE-OPTICAL, funded under Horizon 2020. During his stay at PicoQuant, the ESR is working on a thesis entitled ‘Advanced Nanoscale Microscopy: Time-Resolved Super-Resolution Fluorescence Studies of Biological Structures’. The project has received funding from the European Union’s Horizon 2020 Programme for research, technological development and demonstration. The aims of the project lie in exploring innovative strategies based on pulsed excitation coupled with nanosecond time-resolved detection to carry out multiplexed studies of complex dynamics in biological structures under optical super-resolution conditions. Within this project, different pulsed interleaved excitation schemes will be combined with pattern-based fluorescence decay recognition to optimise the separation of multiple labels. The project also includes the screening and full characterisation of promising dye labels, as well as the identification of optimal sample preparation and experimental conditions, progressing from artificial model systems up to studies in real cells. This latter part will be performed in collaboration with our partners. A key aspect of this project’s success is PicoQuant’s expertise in time-resolved and super-resolution microscopy via stimulated emission depletion (STED).
Graphene Flagship researchers are preparing to collaborate with the European Space Agency (ESA) to test graphene technologies for space applications. Two teams of researchers will explore the benefits of graphene as a light-propulsion material in solar sails, and as a smart coating in loop heat pipes for satellites. Both experiments will be performed in microgravity conditions to simulate the extreme conditions of space. The solar sails will float in microgravity in a drop-tower experiment, while the team investigating heat pipes will experience weightlessness on board a parabolic flight. The Graphene Flagship, funded by the Horizon 2020 Programme, is a pan-European research consortium committed to bringing graphene technologies from research laboratories to mature applications. Graphene, the single-atom-thick carbon sheet, is promising for a range of applications thanks to its excellent electrical, mechanical and thermal properties. To test the graphene-coated wicks in microgravity conditions, the researchers will take part in low-gravity parabolic flights operated by ESA in partnership with Novespace. Dr Meganne Christian, a researcher at the National Research Council of Italy (CNR), said: “Getting to see these materials that we’ve been working on for so long finally work in the conditions that we want them to is really exciting.” The experiment is a collaboration between Graphene Flagship partners at the Microgravity Research Centre, Université libre de Bruxelles, Belgium; the Cambridge Graphene Centre, University of Cambridge, UK; the Institute for Organic Synthesis and Photoreactivity and the Institute for Microelectronics and Microsystems, CNR, Italy; and Leonardo Spa, Italy, a global leader in aerospace operating in space systems, high-tech instrument manufacturing, and the management of launch, in-orbit and satellite services.
These two ambitious experiments are a demonstration of graphene’s diverse potential, and will lay the groundwork to expand the frontiers of graphene research.
Global climate change and the human impact on marine ecosystems have led to dramatic decreases in the number of fish in the ocean, but also to an increase in jellyfish. The GoJelly project, co-ordinated by the GEOMAR Helmholtz Centre for Ocean Research in Kiel, Germany, aims to transform problematic jellyfish into a resource that can be used to produce microplastic filters, fertilisers or fish feed. GoJelly is a consortium of 15 scientific institutions from eight countries led by GEOMAR. The EU has approved funding of €6m over four years to support the project through its Horizon 2020 programme. Jellyfish are appearing in huge numbers and have already destroyed fish farms on European coasts and blocked the cooling systems of nearby power stations. Jamileh Javidpour of GEOMAR said: “In Europe alone, the imported American comb jelly has a biomass of one billion tonnes. While we tend to ignore the jellyfish, there must be other solutions.” The project will first explore the lifecycle of several jellyfish species, as a lack of knowledge about lifecycles makes it almost impossible to predict when and why a large jellyfish bloom will occur. Jellyfish would be much more sustainable, and would protect natural fish stocks, if used as fertiliser for agriculture or as aquaculture feed, according to the GoJelly team. Another option is using jellyfish as food for humans. Javidpour added: “In some cultures, jellyfish are already on the menu. As long as the end product is no longer slimy, it could also gain greater general acceptance.” Jellyfish also contain collagen, a substance much sought after in the cosmetics industry.
European battery manufacturers’ association Eurobat has welcomed a proposal by the European Commission to decarbonise the union’s transport sector, describing transport as “one of a number of sectors its members will help to decarbonise”. The group predicted that vehicles from hybrids through to full electric “will co-exist for the foreseeable future” and, as such, “continuous efforts on the development of all battery technologies will be a fundamental cornerstone of the transition to a decarbonised economy”. According to the group, the decarbonisation of sectors from energy storage and grid stability to warehouse and port logistics and telecommunications will be underpinned by battery technology. Given this, Eurobat said it also welcomed the commission’s launch of a Battery Alliance earlier this month. As part of the alliance, the commission will deploy more than €2bn from the Horizon 2020 work programme for 2018-2020 to support research and innovation projects in four priority areas, all relevant for batteries: the decarbonisation of the EU building stock, EU leadership on renewables, energy storage solutions and electro-mobility. Eurobat said the programme was in line with its proposed 2030 EU Battery Strategy, launched in February.
An EU-funded project to harness the Sun’s radiation to rid the oceans of plastic begins with a system developed at the KTH Royal Institute of Technology in Sweden. The new technology will be used to break down micro-plastics from personal care products and tested for implementation in homes and wastewater treatment plants. While exposure to sunlight can degrade plastics into harmless elements, it is a slow process; in some cases, plastics can take several years to decompose. Joydeep Dutta, chair of the Functional Materials division at KTH, says this system will speed up that process by making more efficient use of available visible light and ultraviolet rays from the Sun. The system involves coatings made of nano-sized semiconductors that initiate and accelerate a natural process called photocatalytic oxidation, Dutta adds. In a test household, these nanomaterial-coated filter systems will be placed at the wastewater outlet of homes. Similarly, in wastewater treatment plants, these devices will be used to initiate micro-plastics degradation after classical treatments are completed. Nearly every beach worldwide is reported to be contaminated by micro-plastics, according to the Norwegian Institute for Water Research. Along with contaminating beaches, these plastics can be ingested by marine life, and they also adsorb pollutants such as DDT and PCB. Dutta says: “These plastics will start accumulating in the food chain, transferring from species to species, with direct adverse consequences to human population.” He adds: “Tackling plastic pollution at its source is the most effective way to reduce marine litter.” The project, titled Cleaning Litter by Developing and Applying Innovative Methods in European Seas (CLAIM), will also deploy floating booms to collect visible plastic waste at river mouths in Europe, along ferry routes in Denmark, and in the Gulf of Lion and the Ligurian Sea.
The European Commission has awarded the 2017 European Capital of Innovation (iCapital) prize of €1m to the French capital Paris. The iCapital award, granted under the EU’s research and innovation programme Horizon 2020, recognises Paris for its inclusive innovation strategy. Tallinn, Estonia, and Tel Aviv, Israel, were selected as runners-up and each received €100,000. The prize money will be used to scale up and further expand the cities’ innovation efforts. Commissioner for Research, Science and Innovation Carlos Moedas announced the results, saying: “Cities are not defined by their size and population, but by the breadth of their vision and the power bestowed upon their citizens. Some cities are not afraid to experiment. They are not afraid to involve their citizens in developing and testing out new ideas. These are the cities that empower their citizens. Today we are here to acknowledge these cities.” Over the last decade, Paris has built more than 100,000 square metres of incubators and now hosts the world’s largest start-up campus. In addition, the city spends 5% of its budget on projects proposed and implemented by citizens. Thanks to this strategy, citizens and innovators from the private, non-profit and academic sectors have made Paris a true ‘FabCity’. Tallinn was recognised for its initiative to act as a testing ground for potential breakthrough technologies. The Estonian capital has fostered the use of self-driving cars, parcel delivery robots and ride-sharing, and has also implemented an innovative e-Residency system, which enables local citizens and businesses to work closely together with foreign entrepreneurs. Tel Aviv has set up a Smart City Urban Lab that links up innovative start-ups with leading technology companies to facilitate breakthrough innovations for solving urban challenges.
Education being among Tel Aviv’s priorities, part of the prize will be dedicated to strengthening the Smart Education Initiative, developed by the municipality in collaboration with teachers, parents, students and local tech start-ups.
2016 will remain a landmark in the history of science and, even more so, in the physical sciences.

[Image caption: A colour-coded density map (red for highest density, blue for lowest) from a supercomputer simulation of a massive black hole binary in a gas-rich galactic nucleus (Mayer 2013). The positions of the two black holes are indicated with white dots, and the bar indicates the distance scale (50 pc = about 170 light years). The lighter black hole, M_2, still more than 100,000 times heavier than the Sun, is sinking towards the centre. The red clumps are dense giant molecular clouds, which can weigh millions of solar masses and can deflect the black hole’s trajectory.]

It is the year in which the first direct detection of gravitational waves was made, by a vast international consortium employing the advanced LIGO ground-based interferometer. The LISA Pathfinder space probe, designed to test the drag-free technology necessary for LISA, the future space-borne gravitational wave detector planned by ESA with international partners, has not only flown successfully after being launched at the end of 2015, but has reported a performance greatly superior to expectations, with a detection noise level already of the order of what will be needed for LISA. The ability to detect gravitational waves pushes our current technology to the limits in a number of areas, but opens the window to a completely new way of looking at the Universe. Until now astronomy, astrophysics and cosmology have been based on some form of electromagnetic information, coming from any of the known emitting sources in our cosmos, from individual stars to entire galaxies.
Astronomers have created communities specialised in the detection, analysis and interpretation of photons received from such sources in diverse regions of the electromagnetic spectrum, from visible optical frequencies to radio, and to the X-rays or gamma rays associated with the most violent phenomena of the Universe, such as quasars or supernova explosions. But, from the early astronomers of the ancient Egyptian and Sumerian civilisations to the modern astronomers using the Hubble Space Telescope, the Chandra X-ray space telescope or the Very Large Telescope, our Universe has always been studied by means of electromagnetic signals. With gravitational waves we are truly facing a transformative step in the way we look at the sky. One of the many testable, and tested, predictions of our theory of gravity, General Relativity, gravitational waves will become our new tool to unveil the nature of the Universe, probing for the first time the fabric of spacetime.

The ultimate fate of stars

What are the prime sources in this new era, those that replace, in importance, the stars of conventional astronomy? The answer is binaries of compact objects resulting from the ultimate fate of stars. Among these, binaries of massive black holes living at the centres of galaxies are the loudest sources, giving the strongest and most easily detectable signals once LISA is operative. The black holes in these binaries can weigh from just shy of a million solar masses to more than ten billion solar masses. Our Milky Way, for example, hosts a (single) black hole weighing a few million solar masses (Schodel et al. 2010). LISA will preferentially detect massive black hole mergers happening in the early stages of the Universe, ten or more billion years before our time, when galaxies were still young and often collided with each other, as the Universe was much denser than it is today.
As a scientist I like to think I should understand, as thoroughly as possible, the tools I need to carry out my research and go after the most challenging problems. Modern astronomy has come about because we first elaborated a beautiful, coherent theory of stellar structure and evolution. Stars have been astronomy’s prime tool; without that theory, most of what we now know would not have been possible. Cosmology itself started as a quantitative, verifiable science in the 1920s because it became possible to measure the distances of objects such as galaxies, and this too is done using stars. Now the question is, in our time, do we understand the nature of our new sources, massive black hole binaries, in the same way as we understood stars in the late 1800s? The answer is no. But this is an exciting time to bring the knowledge of such objects to a new level.

When galaxies collide

It all begins when two galaxies, each with their own massive black hole sitting at their centre, collide and then gradually merge into one single galaxy as their large halos of dark matter create an irresistible mutual gravitational pull. Massive black hole binaries are then thought to evolve across an enormous range of spatial scales. This was already clear at the time of the first major theoretical work on massive black hole binaries (Begelman, Blandford and Rees 1980, Nature). For typical massive black holes, weighing 10-100 million solar masses, the stage at which gravitational wave emission becomes the dominant mechanism draining the orbital energy of the binary and bringing it to coalescence is reached only when the two black holes are separated by about a milliparsec. But they start their journey tens of thousands of light years apart, when they are still in the nuclei of their merging host galaxies.
Ideas about the physical processes governing the evolution of the orbit of the pair of massive black holes have been around for a while, but modelling them correctly requires complex computer models that solve the set of coupled partial differential equations for gravity, pressure forces and radiation, at the very least. The early part of their journey, while they are still well above the milliparsec scale, can be described by Newtonian equations, while the latter part needs general relativistic calculations solving Einstein’s equations, or at least some approximation of them in the form of the so-called post-Newtonian expansion (Prieto et al. 2008). Calculations of this type require supercomputers: solving even the simplest of these models involves so many operations that it would take a thousand years on a conventional notebook or workstation. A critical stage is when the two black holes become close enough to be mutually bound by gravity. At this point we can say that the binary has formed. Supercomputer calculations through the years have shown that in this phase the drag by the dense, cold interstellar gas in galactic nuclei is the dominant process (Mayer et al. 2007; Chapon, Mayer et al. 2013). After the binary has formed, the jury is still out on the main source of the drag, and it may well depend on the type of galaxy in which the binary is evolving. If plenty of cold gas is drawn down to the heart of the nucleus, it will torque the binary as long as there are asymmetries in its distribution, in a process similar to planet migration (Mayer 2013). Alternatively, stars can ‘rob’ kinetic energy and angular momentum from the binary through their gravitational pull when they fly close to it, and bring it to the gravitational wave regime (Milosavljevic and Merritt 2001; Khan et al. 2012; Vasiliev and Merritt 2014).
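The milliparsec figure can be checked with a short back-of-the-envelope calculation. Below is a minimal sketch using the standard Peters (1964) formula for the gravitational-wave coalescence time of a circular binary; the masses and separation are illustrative round numbers chosen by way of example, not values from the works cited here.

```python
# Rough estimate of the gravitational-wave coalescence time for a circular
# massive black hole binary, from the Peters (1964) formula:
#   t_gw = 5 c^5 a^4 / (256 G^3 m1 m2 (m1 + m2))

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PARSEC = 3.086e16    # metres per parsec
YEAR = 3.156e7       # seconds per year

def coalescence_time_years(m1_msun, m2_msun, a_parsec):
    """Coalescence time (in years) of a circular binary driven purely by
    gravitational wave emission, for masses in solar units and separation
    in parsecs."""
    m1, m2 = m1_msun * M_SUN, m2_msun * M_SUN
    a = a_parsec * PARSEC
    t_seconds = 5 * C**5 * a**4 / (256 * G**3 * m1 * m2 * (m1 + m2))
    return t_seconds / YEAR

# Two 5e7 solar-mass black holes separated by one milliparsec coalesce in
# a few thousand years -- far shorter than a Hubble time, which is why
# gravitational wave emission dominates at this separation.
print(coalescence_time_years(5e7, 5e7, 1e-3))
```

Because the timescale scales as the fourth power of the separation, moving the pair only ten times further apart lengthens the coalescence time by a factor of ten thousand, which is why some other process (gas drag or stellar encounters) must do the shrinking down to the milliparsec scale.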
Galaxy merger

Traditionally, computer models able to describe the effect of encounters with stars could not also model the friction and torques exerted by gas, nor was it possible to follow the whole binary-shrinking process from the galaxy-merger stage to the point where relativistic effects begin. Recently we used some of the fastest supercomputers in the world, located in Switzerland, China and Germany, to carry out the first simulation that follows all the phases of the evolution of the binary, up to the point when gravitational wave radiation begins (Khan, Fiacconi, Mayer et al. 2016). We started from a galaxy merger extracted from a state-of-the-art simulation of galaxy formation, called ARGO (Feldmann & Mayer 2015), which was previously run on the Piz Daint supercomputer in Switzerland. The result is unexpected: the two black holes, which weigh more than 100 million solar masses, fuse into one with a gravitational wave burst less than 10 million years after the galaxy collision. We also demonstrate that the emitted waves fall into the LISA band before the black holes coalesce. The timescale of the process is almost 100 times shorter than usually assumed when making forecasts of how many black hole merger events LISA should detect. This is exciting news, and it is also well understood: it is simply a consequence of the fact that, several billion years ago, galaxies were about 100 times denser than they are today, and the key processes determining the shrinking of the binary all depend on density.

The power of machines

Now the challenge ahead of us is mostly computational. This simulation is the first of its kind, and required more than a year of nearly continuous computing even as we harnessed the power of such big machines. But there is a catch: even the best simulation programs we currently have cannot use even 10% of the total computing power of these supercomputers at once.
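The scalability bottleneck described above is usually framed in terms of Amdahl’s law: if a fraction s of a code’s work is inherently serial, the speedup on n processors is capped at 1/(s + (1 − s)/n). The sketch below illustrates this; the 1% serial fraction is an assumption chosen for illustration, not a measurement of any real simulation code.

```python
# Amdahl's law: speedup on n processors when a fraction `serial` of the
# work cannot be parallelised. Parallel efficiency is speedup divided by n.

def speedup(serial, n):
    """Ideal Amdahl speedup on n processors for a given serial fraction."""
    return 1.0 / (serial + (1.0 - serial) / n)

def efficiency(serial, n):
    """Fraction of the machine's peak that the code actually exploits."""
    return speedup(serial, n) / n

# Even a 1% serial fraction ruins efficiency at supercomputer scale:
for n in (100, 10_000, 1_000_000):
    print(n, round(efficiency(0.01, n), 4))
```

At a hundred cores the hypothetical code still runs at about 50% efficiency, but at ten thousand cores it drops to roughly 1%, which is why approaching full efficiency on exascale machines demands restructuring codes to shrink their serial and communication-bound portions.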
This inefficiency will become even more evident when the bigger and more powerful exascale supercomputers appear in a couple of years. Yet computer science offers new techniques to improve the so-called ‘scalability’ of simulation codes, namely their ability to run in parallel on a large number of processing units, from traditional CPUs to graphics processing units (GPUs). If we can advance our codes to approach 100% efficiency on the new supercomputers, we could run tens of simulations in the time it currently takes to run one. This will be the way to provide the theoretical support needed to produce realistic forecasts for LISA, and to help with the interpretation of the data afterwards. We can envision a supercomputer entirely dedicated to black hole merger simulations, including those focusing on the final phase of coalescence in full general relativity. This may seem ambitious, but it may be the only way to go: the parameter space is huge and has to be explored with an ambitious simulation campaign. Supercomputers dedicated to very important tasks, such as weather forecasting, already exist.

A point in history

The endeavour of looking at the Universe through the new window of gravitational waves might be a revolutionary step in mankind’s knowledge; it might mark history as the first astronomical observations of Galileo, Kepler and Copernicus did five centuries before us. It definitely deserves an unprecedented effort in dedicating computational resources, and any other necessary resources, to it.