The ITER project aims to create the world's largest nuclear fusion reactor. Credit: Wikimedia.

  • Tungsteno

The most ambitious scientific megaproject of the 21st century moves into high gear

Eliminating fossil fuels to combat climate change is one of the great challenges of the 21st century. What if the solution were to recreate the Sun on Earth? The ITER project seeks to construct the world's largest nuclear fusion reactor to achieve a clean, inexpensive and inexhaustible source of energy.

PABLO GARCÍA-RUBIO | Tungsteno

 

We are now entering the five decisive years leading up to the powering on of the most ambitious experiment in the history of science: ITER, the world's largest reactor designed to demonstrate nuclear fusion. With the initial construction phase completed in Cadarache (France), the assembly of this colossal engineering infrastructure, in which all the world's major powers are collaborating, has begun. The magnitude of this challenge is rivalled only by that of the three scientific megaprojects of the 20th century: the development of atomic energy in the 1940s (Manhattan Project), man's landing on the moon in the 1960s (Apollo Program) and the launch of the International Space Station in the 1990s.

While the 21st century began with the Human Genome Project and the Large Hadron Collider, the two most important international scientific cooperation projects to date, it is likely that neither of these will have as much impact on history as ITER. This experiment, due to start in 2025, is intended to provide the definitive solution that will eliminate our dependence on oil after 2050. Beyond the plan to reduce emissions by 2030 and the energy transitions planned for the next two decades, humanity needs a medium- to long-term plan to halt climate change.

 

A large nuclear reactor to save the planet

 

ITER (International Thermonuclear Experimental Reactor) is an international project aimed at creating the world's largest nuclear fusion reactor and proving that this method is capable of producing energy in a viable way. This nuclear reactor, which is currently under construction in the south of France, is a pioneer in attempting to reproduce nuclear fusion in a stable and long-term way.

In a way, it is trying to recreate on Earth and on a small scale the same process that powers stars in the universe, like our Sun. It is a natural process involving enormous amounts of energy, but it has not yet been possible to reproduce it artificially. If this could be achieved, it would be a clean, safe, cheap and virtually inexhaustible source of energy.

The reactor consists of a large 23,000-tonne doughnut-shaped Tokamak-type magnetic confinement chamber. This design was first conceptualised by Soviet scientists in the 1950s, and an international committee to oversee its development was formed in 1986. The plan is that, once testing is completed, ITER could lead to a fusion power plant capable of producing electricity in the same way as existing thermal power plants: by generating heat (today obtained by burning fossil fuels or splitting radioactive elements) to produce steam that spins a turbine.

 

The reactor consists of a large magnetic confinement chamber weighing 23,000 tonnes. Credit: ITER.

 

The energy produced in a fusion power plant would also be highly efficient: with just one gram of fuel, the energy equivalent to burning 8 tonnes of petroleum could be obtained. Furthermore, it is estimated that fusion could generate three to four times more energy than the fission process—the only process currently used in nuclear power plants.
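
As a rough back-of-the-envelope check of that equivalence (a simplified estimate, assuming deuterium-tritium fuel releasing about 17.6 MeV per reaction and a typical oil energy content of roughly 42 GJ per tonne; neither figure comes from ITER itself):

17.6 MeV released per D-T pair (about 5 atomic mass units of fuel) ≈ 340 GJ per gram of fuel
340 GJ ÷ 42 GJ per tonne of oil ≈ 8 tonnes of oil

In other words, the often-quoted "one gram equals eight tonnes of petroleum" figure follows directly from the energy released per fusion reaction.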

 

The great energy hope

 

Fusion is a nuclear reaction in which two light isotopes of hydrogen (hydrogen atoms with different numbers of neutrons) merge. For their nuclei to collide and fuse, the fuel must be heated into a plasma; when fusion occurs, large amounts of heat and energy are released. It is precisely this high temperature—around 150 million degrees—and the instability of the plasma that make fusion very difficult to control and maintain.
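
The specific reaction planned for ITER is the fusion of deuterium and tritium, the two heavier isotopes of hydrogen. Its energy balance, written in the plain notation used elsewhere in this article, is:

D + T → He-4 (3.5 MeV) + neutron (14.1 MeV), about 17.6 MeV in total

Most of that energy is carried away by the neutron, which is what a future power plant would capture as heat.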

To date, all mass-produced nuclear energy in the world is generated by the fission process. That method follows the reverse process: it splits the nucleus of a heavy element, mainly uranium, into fragments in order to release energy. The main problem with the fission process is that the waste generated emits highly unstable and radioactive particles and must be buried in so-called "nuclear graveyards." In addition, a malfunction or leak in a nuclear power plant using fission can lead to human and natural disasters, such as Chernobyl or Fukushima.

In contrast, fusion only produces helium as a waste product, an inert gas that does not interact with the environment or humans. A fusion power plant could not cause any disaster because, if any of the protocols failed, the plasma would cool down and the process would be paused without causing any major disruption.

 

Fusion is a nuclear process that involves merging two light isotopes of hydrogen to release energy. Credit: IAEA.

 

When will it become a reality?

 

Calculating the timeline of the project is the most complicated part, mainly because it is an international initiative of unparalleled magnitude, whose members are China, the USA, Japan, South Korea, India, Russia and the European Union. Initially, the intention was to achieve the first plasma by 2020, but that timetable was scrapped and the target reset for December 2025. This year, the COVID-19 pandemic has delayed the assembly of the machine and the arrival of some of the components, so the organisation has already warned that there could be further delays.

However, these minor delays are not very significant, especially if one takes into account that it took 20 years from the time that Presidents Ronald Reagan (USA) and Mikhail Gorbachev (USSR) laid the foundations for the project until construction actually began. And we will likely still have to wait another two decades for the first stable plasma.

What seems clear is that the development of nuclear fusion as an energy alternative is not only a project for the future, but will probably be the great energy solution of the next generation. If the predictions are correct, it will be viable in the second half of this century. The project's Director-General, Bernard Bigot, believes that the current timetable, although stretching into the distant future, is realistic. After all, says Bigot, "once oil was first extracted from the ground, it took 150 years for it to become a global industry."

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Energy
  • Nuclear fusion

Various companies have begun adopting new technologies to reduce the environmental impact of concrete. Credit: Unsplash.

  • Tungsteno

Reducing the carbon footprint of concrete

The manufacturing of cement accounts for 8% of the world's CO2 emissions. As achieving carbon neutrality is becoming increasingly urgent, what technologies can reduce the ecological footprint of this material and, therefore, of concrete?

ISABEL RUBIO ARROYO | Tungsteno

 

Concrete, the world's most widely used building material, is crucial to meeting the challenges of the Paris Agreement on climate change. Some studies indicate that for the cement sector to meet the Paris goals, its annual emissions must be reduced by at least 16% by 2030. In a context where achieving carbon neutrality has become a priority, the World Cement Association has urged industry members to ramp up efforts to "adopt new technologies quickly, and at scale, to reduce their CO2 emissions."

 

Reducing emissions

 

In recent years, a number of companies have taken action. The American company Solidia claims to have created a more sustainable cement that "can be produced in traditional cement kilns using less energy and reducing greenhouse gas emissions during manufacture by 30-40%."

Concrete, which is generally a mixture of cement, sand, water and gravel or crushed stone, has a huge environmental impact. While the ecological and socio-economic impacts of the billions of tonnes of sand and gravel extracted annually to feed the global concrete industry have received some attention, there is another major drawback of this useful building material: it contributes to climate change. The processes used to produce the cement from which concrete is then created generate large amounts of carbon dioxide emissions. More than four billion tonnes of cement are produced each year, accounting for around 8% of global CO2 emissions, according to the Chatham House report Making Concrete Change.

The kilns in which cement is made are heated by the combustion of different types of fuels, thus emitting greenhouse gases. However, 60% of these emissions are not due to the use of fossil fuels, but to the chemical reactions in the process, as the European Commission points out. Cement is obtained by grinding its main component, clinker, together with gypsum and other compounds. To produce clinker, limestone, composed essentially of calcium carbonate, is calcined at around 900°C to generate calcium oxide, or lime. In doing so, a great deal of carbon dioxide is released. The energy optimisation of this process is the way Solidia has chosen to make more environmentally friendly concrete.
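
The underlying chemistry makes clear why most of these emissions are process emissions rather than fuel emissions. Written in the plain notation used elsewhere in this article (molar masses rounded):

CaCO3 (100 g) → CaO (56 g) + CO2 (44 g)

Roughly 0.44 tonnes of CO2 are therefore released for every tonne of pure limestone calcined, before a single unit of fuel is burned in the kiln.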

 

The processes used to produce the cement from which concrete is then created generate significant carbon dioxide emissions. Credit: Pexels.

 

Capturing emissions

 

Other initiatives include the EU-funded LEILAC project, which aims to drastically reduce emissions from the cement industry in Europe. The research and innovation project aims to create a technology able to capture and store the carbon dioxide produced in cement manufacturing, rather than releasing it into the atmosphere. Preliminary trials at a cement factory in Belgium have shown promise, according to the European Commission. Researchers in another EU-funded project called CLEANKER have developed a new CO2 capture technology for cement plants that can, in theory, cut emissions by 90%.

Investing in such initiatives is important given that concrete is a ubiquitous, critical and indispensable material, especially in a context in which the world's population is expected to increase by two billion people over the next 30 years. This growth may spur a dramatic surge in demand for all types of infrastructure in the coming decades. Concrete is not only used to support the structure of buildings, it is also the key material in bridges, ports and dams around the world. And it is even visible in emblematic buildings such as the Saint-Jean de Montmartre Church, in France, or the Stock Exchange Tower, in Canada.

 

Concrete, the world's most widely used building material, has been used for decades to construct all kinds of infrastructure. Credit: Unsplash.

 

Going further: concrete that stores CO2

 

While some companies are looking to make the process of producing concrete more sustainable, others are going one step further and trying to make the material itself store carbon dioxide. Canadian start-up CarbonCure Technologies has found a way to use less cement when making concrete. In order to reduce its carbon footprint, the company introduces recycled CO2 into fresh concrete. In doing so, the carbon dioxide reacts with the calcium ions in the cement to form a nano-scale mineral: calcium carbonate. In this way, according to the company, the concrete retains its strength.
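
A simplified way to picture that mineralisation (the actual chemistry of hydrating cement is more complex, so this is only an illustrative net reaction) is:

Ca2+ + CO2 + 2 OH- → CaCO3 + H2O

The injected CO2 is thus locked away permanently as a solid mineral inside the concrete rather than returning to the atmosphere.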

"Because the carbon dioxide actually helps to make the concrete stronger, concrete producers can still make concrete as strong as they need to, but use less cement in the process," Christie Gamble, the director of sustainability at CarbonCure, told CNNThis concrete not only stores excess CO2 from the atmosphere, but by requiring less cement, it also reduces emissions in its manufacture. According to its creators, this is a win-win: it is much more environmentally friendly "and does not compromise on performance."

 

CarbonCure introduces captured carbon dioxide into fresh concrete to make it stronger. Credit: CarbonCure.

 

Can these new concretes help to reduce emissions?

 

The reality is that, for the moment, most of these new generation concretes cannot compete on cost and performance with conventional concrete, according to the website Carbon Brief, which specialises in the science and policy of climate change: "None have achieved large-scale commercial use and are currently used only in niche applications." Among the reasons why these alternatives have not achieved widespread use is that their effectiveness is less proven than that of conventional cement, which has been in use for decades and decades.

In addition, green concretes suffer from carbonation faster than traditional concretes, according to the European Commission. Carbonation is a natural chemical process that occurs in concrete and can reduce its durability and strength. "When using green concrete, you reduce the carbon dioxide emissions, but the rate of carbonation is higher and corrosion can start earlier," says Dimitri Val, Professor of Infrastructure, Safety and Reliability at Heriot-Watt University. Therefore, "money would have to be spent on repairs, which increase costs and generate more emissions."

Still, the goals of the Global Cement and Concrete Association (GCCA) are ambitious: "Our climate ambition is the commitment of our member companies to drive down the CO₂ footprint of their operations and products, and aspire to deliver society with carbon-neutral concrete by 2050." While we wait for these solutions to become viable and for the construction sector to adopt them, we know that by 2050 there will be some 9.7 billion people on the planet, 68% of whom will live in cities. Carbon-neutral concrete, more than a wish, is an urgent necessity.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Sustainability
  • Concrete
  • Emissions
  • Carbon footprint
  • Concessions

We preserve Colombia’s archaeological heritage 

Thanks to our preventative archaeology program we preserve the cultural and archaeological heritage of the communities surrounding our P3 projects. 


Our four P3 projects in Colombia have applied this preventative archaeology program before starting construction works, collecting and classifying the archaeological remains found at the project sites. Thanks to these activities, which we have been carrying out for the past five years, we have been able to find, analyze and preserve more than 515,000 archaeological items. 

These elements comprise more than 505,000 ceramic fragments; more than 9,500 stone items (materials and stone tools) and close to 500 complete ceramics. 

 


The oldest pieces date back to approximately 1961 BC and the most recent to the 7th century AD. The most relevant are precious metal works, ceramic containers used to prepare food, musical instruments, and human and animal bones.


These items have been found in 92 archaeological sites in six different departments within our projects’ influence area. These places are related to former human settlements and cemeteries from pre-Hispanic, Colonial and Modern times. 

 

 

Some findings are connected to pre-Hispanic indigenous communities, like the Zenu or the Malibu in Montes de María or the Chitareros in Norte de Santander. 

 

Analysis of the remains 

 

After excavation, the remains are classified and catalogued following the parameters of the Instituto Colombiano de Antropología e Historia. The pieces are analyzed in a laboratory using techniques that reveal their age, the cultures and customs they come from, and the way of life of the people who inhabited the different regions of what is today Colombia, as well as the first encounters and interactions between the indigenous tribes and the Spaniards during the conquest and colonial periods.

 
The concession companies hope to share these findings from the excavations, so the items will be donated to local universities, museums or conservation groups, along with pictures, drawings, maps, and technical data sheets. 

Voyager Station aims to be the first hotel in space. Credit: Orbital Assembly Corporation.

  • Tungsteno

Are space hotels viable?

Going on holiday to space sounds like a dream more typical of science fiction than real life. Now that private space flights are taking off, thanks to tycoons like Jeff Bezos and Richard Branson, will tourist accommodation in orbit be the next step?

ISABEL RUBIO ARROYO | Tungsteno

 

The facilities—more than 300 rooms, an open-air gym, a Japanese garden, a casino and a concert venue—could be part of any luxury hotel, but will belong to a rather special type of lodging. Voyager Station aims to be the first space hotel in orbit around the Earth. Its creators hope to have it up and running by 2027, but taking such a project into space is far from easy. In fact, several similar initiatives have failed in the past. How viable are space hotels?

On 12 April 1961, Yuri Gagarin became the first human being to travel into space. This historic event kick-started the space race, and in the 60 years since then more than 570 astronauts have made it to space. While government agencies such as NASA and the European Space Agency (ESA) have driven much of the space travel, in recent decades various private companies have taken the first steps to build their own launch vehicles and tourist accommodation.

 

A new era of space tourism

 

In recent months, some billionaires have ushered in a new era of space tourism. Virgin Galactic successfully launched its founder, Richard Branson, and three other crewmembers into space in July. A few days later, Amazon founder Jeff Bezos also travelled into space on the first manned trip of his company, Blue Origin. "We're trying to make the public realise that this golden age of space travel is just around the corner. It's coming. It's coming fast," says ex-pilot John Blincow, who runs the Orbital Assembly Corporation.

The company’s flagship project is the Voyager Station hotel, a futuristic rotating wheel orbiting the Earth. The website of this initiative is full of enticing messages such as: "Reserve your place to be one of the first humans to vacation on a luxury space station," or "Make history as one of the first humans in history to own real estate in orbit."

 

Voyager Station will have more than 300 rooms, an open-air gym, a Japanese garden, a casino and a concert venue. Credit: Orbital Assembly Corporation.

 

Although the marketing slogans take its success for granted, it is still too early to tell whether this ambitious initiative will ever become a reality. In fact, most projects of this kind have either failed or have not yet come to fruition. In 2008, a total of 38 people had already made reservations to stay at the Galactic Suite space resort, which at the time aspired to be the first space hotel. The trip itself was to cost each of them around three million euros and, in addition to the four-day stay, would include 18 weeks of preparation on a Caribbean island.

The Galactic Suite was supposed to be ready in 2012. But it wasn't. The hotel, which was to have a spa and orbit 450 kilometres above the surface, never made it into space. Today the company's website is practically inoperative and its social networks have not been updated for several years. On its YouTube channel the last videos were shared nine years ago and on Facebook the most recent post dates back to 2017.

 

Maintaining a hotel in space is a real financial challenge

 

There are a number of significant hurdles to be overcome in setting up such a hotel. Getting a hotel into space can involve exorbitant amounts of money. For example, the development, assembly and 10-year operation of the International Space Station cost 100 billion euros, according to the European Space Agency. Major world powers such as the United States, Russia, Canada, Japan and 10 European nations that are part of ESA were involved in funding this project.

Some proponents of such projects say the growing success of commercial aerospace companies like SpaceX has made launch options more affordable. But Gary Kitmacher, who works for NASA, stresses that it is not just the cost of designing and maintaining the hotel in orbit that needs to be covered. There would also be "the cost associated with taking the paying passengers, the tourists, up and back."

 

In addition to the cost of maintaining a hotel in orbit, there is the cost of getting tourists to space and back to Earth. Credit: Blue Origin.

 

The health risks of flying into space

 

Adding to the cost of space travel are other challenges. "The real concern is to design the habitat—the pressurised module that you're going to be living in—so that it can handle the temperature changes," Kitmacher told space news website Space.com. Temperatures in space vary from extreme heat to extreme cold, depending on whether astronauts are in direct sunlight or darkness.

In addition, if an Earth-orbiting hotel were to be launched, it would have to have its own employees. But it should be noted that spending long periods of time in space can have detrimental effects on health, from circadian rhythm disorders (the physical, mental and behavioural changes throughout the day) to loss of muscle mass or calcium.

A person in space can lose between 1% and 1.5% of their bone mass in just one month, according to NASA. In addition, two-thirds of astronauts who spend a long time in space return with vision problems. A study presented in 2016 at the annual meeting of the American Society of Radiology indicates that the problem is due to changes in cerebrospinal fluid due to the lack of gravity.

Timothy Alatorre, one of the architects of Voyager Station, believes the inspiration behind some of these projects comes from "watching science fiction over the last 50 years and seeing how mankind has had this dream of starship culture." "I think it really started with Star Trek and then Star Wars and this concept of large groups of people living in space and having their own commerce, their own industry and their own culture," he says. For the time being, it looks like we will have to wait before we are able to live or vacation in space. To achieve this, space hotels will have to overcome all the hurdles to get them up and running.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Innovation and technology
  • Sustainability

We help reduce roadside litter

At Sacyr, we not only build roads, but we also want the natural environment that surrounds them to be kept in perfect condition. 

Our Sacyr Maintenance delegation in Andalusia, particularly in the provinces of Jaén, Granada, and Huelva, made a commitment long ago to carry out circular economy initiatives. Hence, our collaboration with the Libera Cunetas project.

The project, launched by SEO/BirdLife and Ecoembes, which we have joined, contributes to science and expertise by sampling and categorizing litter and waste from different sections of the roadside. The goal is to obtain information about consumer habits, seasonality, possible trends in behavioral patterns, etc. 

“Ecoembes has an analysis and sampling program that identifies consumer habits, trends, evolution, and behavior. The idea is to raise awareness among the population: prevention, awareness, and education,” explains Jorge Zarzuelo, head of Sacyr Maintenance’s Innovation, Procurement, and Machinery Park. 
 

 

“The sum of small, individual actions can add up to tons of garbage polluting the highways and their surroundings,” Zarzuelo asserts.

The waste that is regularly collected from street gutters by Sacyr’s Maintenance staff will be identified and classified to help raise awareness among society. Several volunteer groups operating in other natural environments carry out similar operations.  

 

 

Southern area delegate, Simón Maestra, coordinates the training of our workers with Ecoembes and the COEX centers to help reduce roadside waste. The project requires the involvement of all Maintenance personnel.  

 

 

 

The campaign is called #BASURALEZA (a blend of the Spanish words for "litter" and "nature") and uses the hashtags #YouDroppedSomething, #WeHaveAProblem and #WhereLitterEndsUp. Its launch is based on some chilling data: 


-Plastic. The most used, the most discarded: 8 million tons of plastic reach the world’s oceans every year.  
-Cigarette butts. 6 trillion cigarettes smoked: 4.5 trillion discarded butts.
-Street gutters. Minimal data; waste smaller than 10 cm. (butts, paper, and plastic).
-W.C. The gateway to nature for small debris; or nature used as a W.C.

  • Roads
  • Circular economy
  • Sacyr conservation

Brine channel of the Muchamiel Desalination Plant (Alicante)

  • With Sacyrean accent

Brine mining to make the most of desalination processes

Brine mining consists of obtaining salts and chemical products from brine, the salt concentrate left over from desalination, that can bring economic value to other industries.

In the past few years, desalination has become a new water resource that makes it possible to supply populations, industries and irrigation in times of drought, and helps mitigate the effects of climate change on water scarcity. Considering that one in every nine people in the world lacks access to drinking water and that 97.5% of the water on the planet is in the oceans, its necessity and opportunity are more than evident. 

Currently, there are about 18,000 desalination plants in the world, with the capacity to produce almost 100 million m3 of water daily. Spain, with a leading industry in the sector, has the fifth-greatest installed capacity in the world.

One of the most worrying environmental aspects of desalination is the practice of discharging the concentrated salt and residues, also known as brine. When the discharge is done correctly, by previously diluting it and using diffusers (as in developed countries), the impact is practically undetectable, as the salt concentration returns to normal seawater levels just a few metres from the discharge point.

 

Perth Desalination Plant

 

Even in the Persian Gulf, the region with the highest concentration of desalination plants and the largest discharge volume, the increase in salinity levels due to desalination concentrate is estimated to be 5% lower than the increase caused by water evaporation.

In any case, discharge is not the only possible solution for managing brine residues. In recent years, the field of brine mining has developed considerably. Brine mining consists of obtaining salts and chemical products from this salt concentrate.

Perhaps the most evident application could be using concentrate in salt evaporation ponds to produce table salt. However, except for some facilities in Greece and Israel, there are not many known large-scale examples of this use.

Other salts and chemical products that can be obtained from the concentrate, whether it comes from seawater or brackish water desalination, arouse greater interest for their economic value.

On the one hand, we can generate sodium hypochlorite in situ through technologies like electrochlorination (a common technology already used to add chlorine to pools), or use new technologies based on different variants of electrodialysis to produce chemical products, like hydrochloric acid or sodium hydroxide.
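
As an illustration, the overall net reaction in an electrochlorination cell, written in the same plain notation as the formulas below, is roughly:

NaCl + H2O → NaOCl + H2 (driven by an electric current)

The sodium chloride already present in the brine is converted directly into the disinfectant, with hydrogen gas as the by-product.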

Through processes like evaporation-crystallization and other more or less conventional technologies, there have also been reports of obtaining salts such as Anhydrite (CaSO4), Bischofite (MgCl2·6H2O), Calcite (CaCO3), Carnallite (MgCl2·KCl·6H2O), Dolomite (CaMg(CO3)2), Epsomite (MgSO4·7H2O), Gypsum (CaSO4·2H2O), Halite (NaCl), Hexahydrite (MgSO4·6H2O), Kieserite (MgSO4·H2O), Langbeinite (K2SO4·2MgSO4), Mirabilite (Na2SO4·10H2O), Sylvinite (KCl+NaCl), Sylvite (KCl) and Thenardite (Na2SO4). 

 

Crystals from different salts obtained from brines

 

Likewise, there are eight substances of special economic interest in brine: phosphorus, cesium, indium, rubidium, germanium, magnesium, sodium chloride and potassium chloride. Their extraction could be technically and economically viable.

Regarding the use of brines, it is worth mentioning that work is currently being done on processes such as forward osmosis or bipolar electrodialysis to produce energy by mixing salt streams with freshwater; that is, generating energy from the salinity gradient created when these waters mix.

In 1903, Svante Arrhenius, the first director of the Nobel Institute, calculated the amount of gold in the sea, estimating 6 mg per tonne of water. Extracting it may not be viable, but studies are under way to obtain the new "white gold" of the 21st century from the sea: lithium. This material, used in batteries, is more precious by the day; it is present in seawater and could be extracted, hence meriting the name "new white gold".

 

Sacyr Agua/Sacyr Water pilot brine evaporation-crystallization plant

 

Did you know…? 

 

  • The first reference to desalination was in the Bible when Moses turned saltwater into freshwater with a touch of his staff.
  • Unfortunately, this low-consumption energy technology has been impossible to reproduce to this date. 
  • Aristotle wrote several works on seawater and desalination.
  • Pliny the Elder described desalination methods in his encyclopedia Natural History.
  • Roman legions used solar desalination in their African campaigns. 
  • Vikings used the sails of their ships as fog nets to collect freshwater (this process is still used today in some parts of the Andes).
     

  • Desalination

U.S. Navy F/A-18 breaks the sound barrier: the white cloud forms as a result of the sonic boom. Credit: Ensign John Gay, U.S. Navy

  • Tungsteno

How we broke the sound barrier to fly in record time

The first supersonic aircraft, which mimicked the shape of a machine gun bullet, proved that it was possible to break the sound barrier. But as that revolutionary innovation was a commercial failure, will supersonic travel ever be economically viable?

ISABEL RUBIO ARROYO | Tungsteno

When aviation was still in its infancy, people imagined travelling from London to New York, Los Angeles to Tokyo or Madrid to Boston in just three hours, something that would be possible with aircraft capable of flying faster than sound. The first supersonic flight in history took place on 14 October 1947, a feat that took us into territory almost unexplored by science, beyond the theoretical. But that technological dream come true fizzled out within a few decades. And despite the many advances made in the aeronautical industry, there are still major challenges ahead for supersonic passenger flights to become a reality once again.

In 1903, the Wright brothers successfully flew an aircraft for the first time. Their flying machine remained in the air for 12 seconds and landed after a distance of 37 metres. In the following years, aircraft capable of flying longer distances were built. At that time, the speed of planes was much too slow to even consider the possibility of travelling faster than sound.

 

Orville Wright makes the first plane flight in history, as his brother Wilbur looks on. Credit: Library of Congress

 

From the Wright brothers to the challenge of breaking the sound barrier

 

Under normal conditions—at 20 degrees Celsius, 50% humidity and at sea level—sound travels through the air at about 340 metres per second. In other words, it covers one kilometre every three seconds (or about 1,235 kilometres per hour). The sound barrier is broken when this speed is exceeded, which produces a phenomenon known as a sonic boom, like the crack of a bullwhip, considered the first human device designed to exceed the speed of sound.
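
As a quick sanity check of those figures (assuming a speed of sound of about 343 m/s at 20°C, a standard textbook value):

343 m/s × 3.6 ≈ 1,235 km/h, and 1,000 m ÷ 343 m/s ≈ 3 seconds per kilometre

Aerodynamicists express this as the Mach number, M = v / a, the ratio of an aircraft's speed v to the local speed of sound a; the sound barrier is crossed when M exceeds 1.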

Technological advances and new aeronautical materials made it possible for aircraft to fly faster and faster. While aircraft began to be used on a large scale in the First World War, it was not until the Second World War that jet-powered planes appeared on the scene.

The first jet flight was made in 1939 in the German aircraft Heinkel He 178. The Germans also produced the first functional jet-powered fighter plane—the Messerschmitt Me 262— which came into use in 1944 and was capable of flying at a maximum speed of 900 kilometres per hour. Aerodynamics experts then began to consider how to break the sound barrier, but they were not sure what would happen if they did so. They feared that high temperatures, structural stress and instability would destroy or crash the aircraft.

 

The Bell X-1 completed the first supersonic flight in history on October 14th, 1947. Credit: U.S. Air Force

 

The first supersonic flight

 

American Chuck Yeager, who was a fighter pilot in World War II, tested different aircraft until he managed to break the sound barrier on 14 October 1947. He did so in the Mojave Desert (in California, USA) aboard the Bell X-1. The X-1 aircraft were the result of the technological changes confronting aircraft designers in the late 1930s and early 1940s, as NASA explains. These aircraft were built solely for experimental purposes and without the restrictions imposed by commercial or military requirements.

The Bell X-1 was about nine metres long and three metres high, and was designed after the shape of a .50-calibre machine gun bullet, according to NASA. Such bullets could exceed the speed of sound and were considered "stable at supersonic speeds." To take full advantage of the flight capability, the Bell X-1 was carried by a B-29 bomber to the appropriate altitude, where it was then drop launched from the bomb bay. Powered by a rocket engine, the plane reached 1,299 kilometres per hour, surpassing Mach 1—the speed of sound according to the scale proposed by the physicist Ernst Mach. Yeager had achieved one of the milestones in the history of aviation: the first supersonic flight.

 

The first supersonic passenger aircraft

 

That feat brought with it scientific and technological advances that made it easier, among other things, to reach space. In the years to come, more sophisticated aircraft would be produced. The Tupolev Tu-144 was the world's first supersonic airliner. Its debut in 1968 was a major Soviet milestone in aviation history. Manufacturers in France and the UK were also successful with the Concorde, a supersonic airliner that could exceed 2,000 kilometres per hour.

 

NASA is designing the X-59 to fly faster than the speed of sound without producing a loud, disruptive sonic boom as it breaks the sound barrier. Credit: NASA

 

When these aircraft broke the sound barrier, there was usually a sonic boom that caused a thunderous noise. In addition to the challenge of reducing the noise pollution, there was the additional hurdle of making such flights affordable, as flying in a supersonic aircraft was not economically feasible for many travellers. The Concorde, which made its first passenger flight on 21 January 1976, finally ceased operations in 2003 due to high maintenance costs and low occupancy. With the abandonment of this type of commercial aircraft, supersonic flight is now reduced to military aircraft, such as the F/A-18 fighter jets.

NASA and some companies aim to resurrect supersonic aircraft that can reach record speeds silently. "We want to build a future where mankind can travel between any two points on our planet in three hours," says Tom Vice, president of Aerion. Within the next decade, his company hopes to develop an aircraft capable of carrying 50 passengers from Los Angeles to Tokyo in less than three hours, a journey that currently takes more than 10 hours. Meanwhile, the company Boom is working on a plane for about 90 passengers that could theoretically fly from Paris to Montreal in about four hours (instead of seven) and from Los Angeles to Sydney in eight hours (instead of fourteen).

 

  • Planes

A system developed by the Biomechanics Institute of Valencia detects the emotional state of the passenger during the trip based on the analysis of body factors and the context. Credit: SUaaVE.

  • Tungsteno

Autonomous cars guided by their drivers' emotions

Artificial intelligence and prediction technologies have joined forces so that people and authorities can learn to trust self-driving cars. This is an even greater challenge for the latest generation of these vehicles, which are now one step closer to becoming fully autonomous.

ISABEL RUBIO ARROYO | Tungsteno

 

Human error is responsible for 90% of road accidents. While self-driving vehicles hold the promise of eliminating such incidents, they have also been involved in fatal accidents. As these vehicles become a reality, ensuring safety is key if they are to be embraced by the public. Here we look at two European projects that bring together predictive technology and artificial intelligence, focusing on perfecting the predictive capabilities of these vehicles and on modifying their behaviour based on the reactions of their drivers.

One of the projects that hopes to inspire mainstream society to give the thumbs up to automated vehicles is SUaaVE, a European initiative that began in 2019 and will run until 2023. Led by the Biomechanics Institute of Valencia (IBV), it aims to analyse the needs of drivers, passengers and pedestrians, a particularly important challenge given that the car of the future is getting closer and closer.

The Society of Automotive Engineers establishes six driving levels for automated vehicles, ranging from 0 to 5. While Level 0 would be one in which the driver performs all tasks, Level 5 is characterised by full automation, that is, the driver is not needed at all and the vehicle functions properly under all possible conditions.

Taking your hands off the wheel to enjoy a video when in congested traffic

For years now, taking your hands off the steering wheel has been one of the most repeated and longed-for promises of manufacturers in the industry. At last it has become a reality. Honda is now selling the Honda Legend in Japan, becoming the first company to achieve Level 3 certification for autonomous driving, according to SAE standards. At this level, to which other manufacturers such as Audi, Ford and Tesla aspire, the driver is sometimes in control and sometimes not.

In other words, the driver has to be prepared to intervene if the system requests it or if a malfunction occurs. Once the system is activated, the driver can watch videos or operate the on-screen navigation system, which the company says helps mitigate fatigue and stress when driving in congested traffic. To drive autonomously, the Honda Legend uses high-definition three-dimensional map data and the global navigation satellite system. It also detects what's going on around the vehicle through an array of sensors.

 

The Honda Legend is the first commercial car with Level 3 autonomy. Credit: Honda.

 

Artificial intelligence knows if the driver is tired or motion sick

 

But the deployment of this type of vehicle will only be possible if public acceptance is achieved, according to SUaaVE researchers. There are multiple factors that influence how users perceive these vehicles. A survey of nearly 3,800 people in six different countries conducted by the project's experts reveals which features would make them more likely to accept these cars. Users are most concerned that this mode of transport should be safe, useful and environmentally friendly.

Automated cars can play an important role in increasing traveller safety, reducing congestion, improving mobility for those who cannot drive and reducing carbon dioxide emissions. Companies such as Waymo, owned by Google, are already testing Level 4 and 5 vehicles in restricted areas, and their achievements are promising. After simulating real-world fatal accidents, Waymo has concluded that its automated cars would have prevented almost all fatalities. But it is still too early for cars with such advanced systems to reach the market.

 

Ensuring that autonomous vehicles are safe and environmentally friendly is key to getting the go-ahead from society. Credit: SUaaVE.

 

To improve safety and refine the car's behaviour in response to the occupants' reactions, the IBV uses a simulator. The Human Autonomous Vehicle (HAV) makes it possible to emulate the driving of vehicles with different degrees of autonomy and to detect, in real time, the emotions and vital signs of the occupants. Using advanced artificial intelligence techniques, this data would make it possible, for example, to detect whether a motorist is very tired or experiencing motion sickness. Indeed, in the future, a person's physical and emotional state will likely be able to alter the decisions made by an autonomous vehicle.

 

Improving predictive skills to boost acceptance of these vehicles

 

Researchers at the Brave project, financed by the European Commission, are also trying to analyse the difficulties involved in making these types of cars acceptable to the public and thus create guidelines for manufacturers. Javier Alonso Ruíz, a professor at the Department of Automatics at the University of Alcalá (UAH) and a participant in the project, explains that the aim is to answer multiple questions: "How should the transition be between manual and automated driving? How much should the driver be disturbed if he is already aware of the road? How will the vehicle communicate with pedestrians? For example, how will it tell them, 'I am an autonomous vehicle,' 'I am braking because I have seen you,' or 'I am accelerating, be careful'?"

 

Companies like Waymo are trying to enable their autonomous vehicles to anticipate the behaviour of different road users. Credit: Waymo.

 

Researchers have created a system for automated vehicles that is able to anticipate the intentions of different road users. This technology makes it possible, for example, to detect the lane-change intentions of other drivers some 400 milliseconds ahead of a human driver's ability to do so.

But we will still have to wait to see these advances on the market. Rubén Izquierdo, a researcher at the UAH who has also participated in the project, recalls that, "it takes many years for something that is in a state of research to go to industry and become a product." "We are still at least five years away from seeing these products in real vehicles," he says.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Autonomous cars
  • Artificial intelligence

Tech giants are looking for cold or remote areas to locate their data centres. Credit: Microsoft.

  • Tungsteno

When our data resides with the fish

Tech giants have located their data centres in chilly and sometimes unexpected locations—from an old mine in Norway to a Cold War nuclear bunker. Microsoft's strategy is to store its data on the seafloor. But are underwater data centres more efficient than those on the surface?

ISABEL RUBIO ARROYO | Tungsteno

 

In 2018, Microsoft submerged a data centre with 864 servers to a depth of 35 metres. Its aim was to test whether it was easier to maintain at the bottom of the sea than on the surface. Two years later it was pulled from the water and the results were promising: the data centre proved to be more stable and energy-efficient than an identical one on the surface. Where is our data usually stored? Does it make sense to submerge it beneath the waves?

Almost all our actions on the Internet generate data. This information is stored on physical servers maintained by large companies such as Google, Apple or Facebook. When choosing the location of a data centre, factors such as temperature are taken into account, as DE-CIX, a leading operator of Internet exchange points, notes. "The cooling of data centres is a major issue, which is a challenge in terms of the efficiency and energy consumption of these buildings," they say.

 

Mines, churches, bunkers or the Arctic: where our data resides

 

Servers need to be operational 24 hours a day, a requirement that "generates a lot of heat" and can pose a problem since high temperatures impair their proper functioning. One of the great challenges for tech giants is to control overheating in data centres. "It is common for some companies to locate data centres in cold or remote areas, both for energy efficiency reasons and because of the need for large square metres of space to house their servers," says Theresa Bobis, DE-CIX's southern Europe director.

Tech companies have managed to find chilly—and sometimes quirky—places to store data. Some examples include the Salem Chapel in Leeds (UK), a World War II bomb shelter under the famous Uspenski cathedral in Helsinki (Finland), a nuclear bunker from the Cold War era (Sweden) or a former mine in Norway, where the 120,000 square metre Lefdal Mine Datacenter is cooled by water from the Norwegian fjords and all the energy used comes from renewable sources, according to DE-CIX. To address the cooling issue, some have even taken data centres to the Arctic Circle. Facebook has installed a data centre in the Swedish city of Luleå that has large fans to gather cold air from outside and cool the equipment and servers inside.

 

Facebook has built a data centre in the small town of Luleå, about 100 kilometres from the Arctic Circle. Credit: Mark Zuckerberg.

 

Data centres submerged in the sea

 

This relentless quest to find the perfect place to store our data has taken Microsoft to the ocean floor. Taking advantage of submarine technology and with the help of marine energy experts, the company submerged a data centre inside a cylindrical tank 12 metres long and three metres in diameter in the Orkney Islands off the Scottish coast. The operation involved ten winches, a crane, a barge and a remotely operated vehicle. The site was chosen because the grid there is powered entirely by solar and wind energy.

More than half the world's population lives within about 190 kilometres of the coast, according to Microsoft. By putting data centres underwater near coastal cities, "data would have a short distance to travel to reach coastal communities, leading to fast and smooth web surfing, video streaming and game playing." In addition, Microsoft claims that cold seas make it possible to design energy-efficient data centres.
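
To get a rough sense of what that "short distance" means for latency (a simplified estimate, assuming light travels through optical fibre at about 200,000 km/s, roughly two-thirds of its speed in a vacuum):

190 km ÷ 200,000 km/s ≈ 1 millisecond one way, or about 2 milliseconds round trip

That is a negligible delay compared with typical server processing and last-mile network times, which is the basis of the argument for placing data centres close to coastal populations.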

Does an underwater data centre fail less than one on the surface?

A submerged data centre does not face some stresses that would affect it on the surface and could lead to system failures: for example, corrosion from oxygen, humidity, temperature fluctuations and the jostling of components by technicians. Microsoft, while submerging this data centre, set up an identical data centre on dry land for comparison. Over the two years that the experiment lasted, the underwater data centre suffered only one-eighth as many failures as the one on land, according to the company.

 

Microsoft in 2018 submerged a data centre with 864 servers at a depth of about 35 metres. Credit: Microsoft.

 

It is not clear whether, in addition to temperature, other factors may have influenced the underwater data centre to fail significantly less. "We have to figure out what exactly gives us this benefit," says Spencer Fowers of Microsoft's special projects research group. His team is going to study whether the good results are due to the fact that there was no human intervention inside the tank and that the capsule was filled exclusively with dry nitrogen, removing all the oxygen inside.

But what happens when a data centre submerged at a depth of 35 metres fails? Repairing it can be a real challenge. But Ben Cutler, head of special projects at Microsoft, says the model is self-sustaining—servers that fail are simply switched off. "It's designed to be so reliable that we can operate for several years without maintenance," he says.

While the results of this experiment are promising, it is still too early to tell to what extent underwater data centres will play an important role in safeguarding our data. Information is still lacking, for example, on how easy it will be to recover, recycle or replace these data centres once they reach the end of their useful life. Cutler does not believe they will replace the aboveground ones, but he does suggest that they will be an alternative for customers. Who knows if all that data we now say is "in the cloud" will one day end up "at the bottom of the sea."

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Data
  • New technologies
  • Services

Smart Street cleaning system: 3D Smart Cleaning

Sacyr has put in motion the 3D Smart Cleaning project, created to respond to a need in street cleaning services. 

Mechanical sweeping machines currently available in the market have technological limitations that complicate the driver’s work and cause machinery defects. 

Drivers must keep their eyes on the waste on the ground most of the time, and therefore cannot pay enough attention to the road. In addition, the operator controls the movement of the vacuum with a joystick, and collisions with unavoidable obstacles can damage the machinery. 

For that reason, Valoriza has developed a smart street cleaning service, 3D Smart Cleaning, funded by the CDTI. It consists of a functional sweeping machine prototype, installed on a model from the manufacturer Schmidt and based on an innovative solution pack made up of three main elements: hardware, a mechanical system and specific, transversal control software, with two 3D laser cameras as its key component. 

This prototype, adapted to the sweeping machines, automates vacuuming and improves the performance and effectiveness of mechanical sweeping, among other things. 

The automation consists of: 
-Automatic regulation of the vacuum that adapts to the terrain.
-Detection of people, cyclists and cars, anti-collision with objects on the road, and alerts with sound and light signals in the cabin. 
-A sweeping guidance system for the driver thanks to curb detection. 

 

 

“With this project, we have achieved a greater sweeping efficiency thanks to the automatic regulation of the vacuum, which allows automating the mechanical street sweeping process, avoiding the worker controlling the vacuum manually”, explains Jesús Hernández, Head of Machinery of Valoriza Medioambiente. 

Within the framework of this project, we have installed several cameras in blind spots and in the vacuuming system to provide complete vision and information to the worker. 

This entails a 10% increase in productivity and better service quality, as wear on the vacuum is reduced and unnecessary machinery stops are avoided. We also achieve a higher rate of waste collection, better safety for workers, and lower carbon emissions. 

 

 

Ultimately, this is effective, functional equipment that improves the performance, safety and quality of the work.

“After all these years of testing in different municipalities all over Spain and confirming its efficacy, we offer this option in current and future tenders”.

  • 3D
  • Sustainable
  • Machinery

Most modern pipelines use a complex computer network of pressure sensors, thermostats, valves and pumps to optimise their operation. Credit: Pxhere.

  • Tungsteno

Pipeline cyberattack and the risks of the hyper-connected workplace

Nine out of 10 gas stations in Washington D.C. ran out of fuel on 14 May after hackers crippled a pipeline. The cyberattack put the spotlight on a critical challenge: ensuring security in the age of teleworking.

ANTONIO LÓPEZ | Tungsteno

 

The rise of teleworking has been one of the most tangible consequences of the pandemic. We first noted the emergence of two models, virtual and mixed, and analysed the challenges of maintaining teleworking when the current global health emergency is over. Then we took a look at the most important digital tools for adapting to this new era of work. And now we examine one of its main, more intangible, weaknesses: ensuring security.

On 7 May, a group of hackers achieved something unprecedented in history: paralysing an oil pipeline that distributes 45% of the fuel consumed on the east coast of the US. The "hijacking" of the digital infrastructure that controls the Colonial Pipeline led those in charge of this critical facility to completely shut down its operation for days, resulting in disruptions to flight schedules and supply shortages at gas stations in the southeastern US. Many people rushed to hoard gasoline. 87% of filling stations in Washington D.C. ran out of fuel on 14 May and prices soared to a seven-year high.

The Darkside hacker group, which claimed responsibility for the ransomware attack, demanded a ransom of $5 million in cryptocurrencies in exchange for unlocking the pipeline. This event, along with other recent ones such as the latest massive theft of Facebook data or the cyberattack suffered by the Spanish Public Employment Service (SEPE), has once again brought the risks of hyperconnectivity back to the table.

 

On May 7, a group of hackers paralyzed an oil pipeline that distributes 45% of the fuel consumed on the east coast of the United States. Credit: U.S. Energy Information Administration.

 

The risks of connected infrastructures

 

The digital transformation that is affecting all types of sectors has reached industries as apparently anchored in the 20th century as the oil industry. Despite these appearances, there are already modern pipelines such as Colonial, which use a complex computer network of pressure sensors, thermostats, valves and pumps to remotely operate this large piece of infrastructure, detect faults and collect data in order to optimise its operation. While savings and efficiency are maximised, so too are opportunities to hack the system.

Such cyberattacks are not common since the level of protection is high. But where there is connectivity, there is vulnerability, as Jon Niccolls of Check Point Software reminded the BBC: "Some of the biggest attacks start with an email. A Colonial employee could have been tricked into downloading a piece of malware," said the cybersecurity expert, who believes it is more likely that hackers gained access to the pipeline's control system through the administrative side of the company.

Threats of this kind have multiplied in recent months. At the end of the first lockdown, Interpol warned about the alarming increase in cyberattacks brought about by the COVID-19 pandemic and teleworking. "The increasing reliance on the Internet by people around the world also provides new opportunities for criminals, as many businesses and individuals are not ensuring that their cyber-defences are up to date," said Interpol Secretary General Jürgen Stock, who urged closer cooperation between the public and private sectors to contain this risk.

 

Interpol has warned of an increase in cyber attacks during the coronavirus pandemic. Credit: Interpol.

 

The wave of cyberattacks brought by the pandemic

 

Cybercriminals then took advantage of "the fear and uncertainty caused by the resulting socio-economic instability" to intensify malware and ransomware attacks. They also carried out frauds, promoted malicious domains ostensibly related to information about the pandemic, and spread fake news. Phishing attempts disguised as supposedly official Netflix password reset emails, cyberattacks on hospitals or fake job offers have also been detected.

Taken to the world of work, this general threat made companies aware of the sudden increase in their cyber vulnerability: "We have gone from having one company with 50 computers to having 50 offices, one in each house, which increases the risk because we also have more devices connected at home. In fact, we detected a television that was capable of crashing the entire server due to a conflict with updates," Fernando González, CEO of the Megastar consultancy, told the newspaper Heraldo de Aragón.

 

During the pandemic, malware and ransomware attacks, scams or phishing attempts have intensified. Credit: Unsplash.

 

Javier Serrada, head of projects and pre-sales at the consultancy Gotor Comunicaciones, calls for computer security awareness to be broadly incorporated into occupational risk prevention and recalls the three pillars of cybersecurity: confidentiality, integrity and availability. "We must ensure that not everyone can access certain types of information, and that when they do, it is truthful, it has not been manipulated by a third party or has disappeared," he says.

 

Business security collides with home networks

 

Passwords become a crucial line of defence. Forgetting them or failing to update them regularly can have serious consequences for cybersecurity, which is why password management is one of the most sensitive issues for companies, and one of the most tedious for remote workers. Services such as Bitwarden or LastPass make it possible to store all of a user's passwords in a single secure database. They also speed up the process of signing up for new applications and synchronise password updates across multiple devices.
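
These services typically also generate strong random passwords on demand. As a generic illustration (not tied to Bitwarden or LastPass), Python's standard secrets module can do the same:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Return a random password built from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets draws from a cryptographically secure source, unlike the random
    # module, which should not be used for passwords or tokens.
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # prints a different 16-character password each run
```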

The report Cybersecurity Trends 2021, prepared by ESET, points to other solutions: "Some of the baseline security measures taken for granted in the office must be compensated for at home, such as requiring home workers to use multi-factor authentication or a VPN to access internal networks."

Still, it is difficult to guarantee the absence of security breaches. Even in the ideal case where remote workers always use company-delivered devices and follow these protocols, the security of their own home Wi-Fi networks—an increasingly open avenue for cybercriminals due to the rise of home automation devices—should also be verified.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Security
  • Pandemic
  • Teleworking

A team of researchers has developed a new biorefining process that harnesses waste to produce sustainable aviation fuel to reduce the carbon footprint of the aviation sector. Credit: Pexels.

  • Tungsteno

Turning organic waste into aviation fuel

If global aviation were a country, it would rank among the top 10 greenhouse gas emitters. A team of researchers claim to have found a way to use food waste, manure and sewage to create more sustainable fuel.

ISABEL RUBIO ARROYO | Tungsteno

The mythical DeLorean from the movie Back to the Future achieved one of the most ambitious goals in the history of mankind: time travel. While the vehicle’s need for plutonium fuel provided exciting plot points in the story, at the end of the first part of the saga the time machine has been upgraded with a futuristic Mr. Fusion generator able to use banana peels and other garbage as fuel… and away they fly! This science fiction idea, converting organic waste into energy for sustainable transport, is now closer to becoming a reality, and more and more sectors are attempting to capitalise on the concept.

A team of researchers has developed a new biorefining process that harnesses waste to produce sustainable aviation fuel (SAF) to reduce the carbon footprint of the aviation sector. The finding has been published in the prestigious Proceedings of the National Academy of Sciences, one of the highest impact scientific journals. In theory, the secret to producing SAF lies in using the untapped energy from food waste and other wet waste such as animal manure and sewage. Researchers at the US National Renewable Energy Laboratory, the universities of Dayton and Yale, and the Oak Ridge National Laboratory claim that SAF is identical to fossil jet fuel, but with "a carbon-negative twist". In other words, using it would supposedly remove more CO₂ from the atmosphere than is emitted.

This fuel would reduce net greenhouse gas emissions by up to 165%, according to the authors of the study. The figure comes from the reduction in carbon emitted by aircraft plus the emissions saved when food waste is diverted from landfill. 17% of the world's total food available to consumers in 2019 went to waste, according to the United Nations (UN).
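
A back-of-the-envelope sketch, using invented round numbers rather than the study's own figures, shows how counting the avoided landfill methane as a credit can push the net reduction above 100%:

```python
# Illustrative arithmetic only: these numbers are placeholders, not figures
# from the PNAS study, chosen to show how a reduction above 100% can arise.
fossil_jet = 89.0       # lifecycle emissions of fossil jet fuel, gCO2e per MJ (illustrative)
saf_direct = 30.0       # emissions from producing and burning the wet-waste SAF (illustrative)
landfill_credit = 88.0  # methane emissions avoided by diverting waste from landfill (illustrative)

net_saf = saf_direct - landfill_credit             # can be negative, i.e. carbon-negative
reduction = (fossil_jet - net_saf) / fossil_jet * 100

print(f"Net SAF footprint: {net_saf:.0f} gCO2e/MJ")       # -58 gCO2e/MJ
print(f"Reduction vs fossil jet fuel: {reduction:.0f}%")  # about 165%
```

The point of the sketch is only the accounting: when the credit for avoided emissions exceeds the fuel's own footprint, the net value turns negative and the reduction relative to fossil fuel exceeds 100%.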

Between 8% and 10% of global greenhouse gas emissions are associated with food that is not consumed, according to the UN. Credit: Wikimedia.

 

Food waste carries social and environmental implications. "Every year, millions of tonnes of food waste is hauled to landfills across the country. Once there, it rots and produces methane, a greenhouse gas over 20 times more potent than carbon dioxide," explains the US National Renewable Energy Laboratory. Between 8% and 10% of global greenhouse gas emissions are associated with food that is not consumed, both through losses in the production and transport processes and at the final consumption stage, according to the UN. The authors of the SAF study say they have found a way to disrupt this process and create a more sustainable fuel that could be blended in higher concentrations—up to 70%—with conventional aviation fuel.

 

The huge global impact of aviation emissions

 

Achieving environmentally-friendly aviation fuel is one of the challenges of the 21st century. Although the aviation industry and some official bodies assert that the aviation sector is responsible for around 2% of global greenhouse gas emissions, a report by the Stay Grounded network puts the figure at between 5% and 8%. "Apart from carbon dioxide, planes produce other harmful elements such as methane, ozone, soot, contrails and induced cloudiness, with a greater climate impact than CO₂," explains Ecologistas en Acción, an organisation that is part of the Stay Grounded network.

Someone flying from Lisbon to New York and back generates approximately the same level of emissions as the average person in the European Union when heating their home for a whole year, according to the European Commission. It points out that if global aviation were a country, it would be among the world’s top ten emitters. And the situation could get even worse. Major companies in the international aviation industry such as Airbus forecast annual growth rates of 4.3% in the coming decades.

If the predictions come true, by 2050 greenhouse gas emissions from aviation could be four to eight times the current level, according to Stay Grounded. Before the coronavirus pandemic, the International Civil Aviation Organization also predicted that by 2050 emissions from international aviation could triple compared to 2015.
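
As a rough sanity check (traffic growth does not translate one-for-one into emissions, so this is only indicative), compounding the 4.3% annual growth rate quoted above over three decades multiplies traffic by about three and a half:

```latex
1.043^{30} \approx 3.5
```

That sits close to ICAO's pre-pandemic projection of a roughly threefold increase and towards the lower end of Stay Grounded's range.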

 

A report by the Stay Grounded network estimates that the aviation sector causes between 5% and 8% of global greenhouse gas emissions. Credit: Pixabay.

 

The dream of flying more sustainably

 

In a hyper-connected world, millions of people travel by air every day. The researchers who produced the report The Degrowth of Aviation advocate significant reductions in flying to mitigate the environmental impact of this mode of transport. "There is no alternative," they say. The report analyses a number of proposals, such as banning close proximity flights, reducing short- and medium-haul flights that can be made by train, and putting a moratorium on the expansion or construction of airport infrastructure.

In the meantime, researchers around the world are searching for a magic formula that will allow us to fly more sustainably. Whether or not the ultimate solution lies in food waste, we do not yet know. For now, major industry players such as Southwest Airlines are already collaborating with researchers at the US National Renewable Energy Laboratory and the other organisations involved in the SAF project.

"If our refining pathway is scaled up, it could take as little as a year or two for airlines like Southwest to get the fuel regulatory approvals they need to start using wet waste SAF in commercial flights," says Derek Vardon, a scientist at the lab and one of the paper's authors. While this fuel is "not a silver bullet", for the researcher it is "a piece of the puzzle that could make a significant dent in an industry notoriously hard to decarbonize."

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Planes
  • Biofuel

The Snake House, a two-storey, five-bedroom house built in southern England, is inspired by the Fibonacci spiral. Credit: Sadler Brown.

  • Tungsteno

Nature’s secret code, the key to sustainable architecture?

The Fibonacci spiral has been used for centuries as a means of connecting with the divine and for aesthetic reasons. But is the so-called golden ratio just a fascinating way to achieve pleasing designs or does it actually contribute to making buildings more sustainable?

PABLO GARCÍA-RUBIO | Tungsteno

 

In 2015, Steven and Elizabeth Tetlow set out to build their new home in the Devon region of southern England. They were inspired by nature to develop a structure that would coexist with its surroundings, making use of its contours and respecting the environment. The curved design of the Snake House is based on the shape of ammonite fossils—a subclass of extinct cephalopod molluscs—found in the area and the Fibonacci spiral, known as nature's secret code or the divine sequence. Is the so-called golden ratio just an intriguing way to achieve aesthetically pleasing designs, or does it also serve to make structures more environmentally friendly?

From the earliest historical structures to the present day, proportion has been a fundamental part of architectural design and engineering. The Fibonacci spiral originates from the growth pattern observed in nature and is a visual expression of the golden ratio. This proportion is obtained by dividing a line into two parts so that the ratio of the longer part to the shorter part equals the ratio of the whole line to the longer part. It is expressed by the number phi, approximately 1.618.
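
For reference, the value of phi follows directly from that definition: if the longer part is a and the shorter part is b, then

```latex
\frac{a+b}{a} = \frac{a}{b} = \varphi
\;\Rightarrow\; \varphi = 1 + \frac{1}{\varphi}
\;\Rightarrow\; \varphi^{2} - \varphi - 1 = 0
\;\Rightarrow\; \varphi = \frac{1+\sqrt{5}}{2} \approx 1.618
```

The ratios of consecutive Fibonacci numbers (1, 1, 2, 3, 5, 8, 13…) converge to this same value, which is why the spiral and the ratio are so closely linked.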

The Fibonacci spiral originates from the growth pattern observed in nature. Credit: BBC.

There are many manifestations of this ratio in nature including the leaves of some plants, the cones of conifers, and the seeds in the centre of a sunflower. Fibonacci spiral patterns could be useful for optimising the arrangement of leaves or for capturing sunlight more efficiently. For centuries, the golden ratio has also inspired the design of all kinds of buildings. Here we review some historical examples to understand how this ratio has been applied in different periods.

 

Pazzi Chapel, Filippo Brunelleschi (1443)

 

Some studies have tried to prove that the golden ratio was used in the structures of past civilisations such as those of ancient Greece or Egypt, for example in the iconic Great Pyramid of Giza complex. But everything seems to indicate that in reality these types of constructions pursued other geometric goals whose aim was to achieve order and balance.

It was in the Middle Ages, from the 13th century onwards, that, following the introduction of Arabic numerals into Europe by popularisers including Fibonacci himself, these measurements began to be taken into account when designing structures. The intention of Gothic and Renaissance architects was to imitate nature in order to endow structures with a spiritual dimension, thereby achieving a closer connection with God.

One example is the Pazzi Chapel, built in Florence by the Italian Renaissance architect Brunelleschi. Several studies have concluded that, both in its layout and its facade, he used proportions similar to the golden ratio, and combined rectangular and circular elements to symbolise the union between the human and the divine.

Superimposition of the golden ratio on the floor plan and facade of the Pazzi Chapel. Credit: Wikimedia / Own work.

 

Villa Savoye, Le Corbusier (1929)

 

Although modern architecture largely rejected classical standards, one of the most influential architects of the 20th century, Le Corbusier, inherited an interest in the golden ratio and the relationship between mathematics and nature. While curves and spirals are rarely found in his work, an obsession with proportion pervades his designs.

 

Le Corbusier inherited an interest in the golden ratio and the relationship between mathematics and nature. Credit: Archiproportion.

 

In fact, he developed a scale of visual measures combining human proportions and the Fibonacci sequence, which he called the Modulor. Taking as its reference the measurements of a man with his arm raised, the Modulor was intended to bring harmonious proportions to structures. In the Villa Savoye building on the outskirts of Paris, he used these proportions to scale all the spaces and measurements.

 

The Core, Eden Project, Jolyon Brewis (2005)

 

Contemporary architecture no longer considers proportion as a central element of design. Without abandoning it, other elements such as energy efficiency, sustainability and the responsible use of materials are now considered a priority. This approach to sustainable design looks to nature to find optimal forms of adaptation that have evolved in the natural environment and then uses technology to apply them to the design of buildings.

Biomimicry was also the inspiration for the central structure of the Eden Project, an environmental education project in Cornwall, in the south-west of England. This building, known as The Core, is inspired by the sequence described by Fibonacci, specifically its application to explaining mathematically how plants grow. It was also constructed in an environmentally responsible manner: underground pipes heat the air before it enters the building, photovoltaic panels on the roof provide electricity, and the insulated walls are made of recycled newspaper.

The Core building is inspired by the sequence described by Fibonacci, and other shapes found in nature. Credit: Eden Project.

 

From the divine to the sustainable

 

While in antiquity the Fibonacci spiral was a way of connecting with God, in modern architecture it has served as an aesthetic reference for the application of more balanced proportions. Today, however, there is a trend towards sustainable construction, which places the functional above the aesthetic. As a result, elements such as proportion have taken a back seat.

The quest for environmental efficiency in buildings has led to the use of curved shapes, such as those inspired by the Fibonacci spiral, to create structures that are better integrated into the landscape. Curves, together with the use of new materials, reduced emissions and better orientation, contribute to building more sustainable and environmentally responsible structures. However, the use of the golden ratio has not proven to have an advantage over other curved shapes in the quest for greater energy efficiency or cost savings in materials.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Architecture
  • Sustainable
  • Nature

Technology can help achieve savings of more than 60,000 euros per year per project, according to a report by the consultancy EY. Credit: Built Robotics.

  • Tungsteno

Robots that can erect a home in two days

Artificial intelligence promises to optimise construction processes, reduce costs and improve safety. But how far can robots, exoskeletons, autonomous excavators and sensors actually go in achieving this?

ISABEL RUBIO ARROYO | Tungsteno

Autonomous robots that lay 1,000 bricks an hour and erect homes in two days. Exoskeletons that boost the strength of the wearer by a factor of 20. Sensors that monitor the heart rate or posture of workers. Systems that take the winter cold into account when planning a project. Artificial intelligence promises to completely revolutionise the construction industry. More and more companies are using this type of technology to improve project management and implementation, save costs and ensure worker safety.

The construction of a building involves all types of machines and workers, from draughtsmen, architects and engineers to bricklayers, electricians, truck drivers and crane operators. Getting them all to work efficiently while minimising any risks is a key challenge. Artificial intelligence can help achieve this, as well as delivering significant savings in time and resources. A report by the consultancy EY shows that this technology can reduce the time spent on meetings related to the job site by up to 70% and save more than 60,000 euros per year per project.

Evaluating millions of possible scenarios to predict the evolution of a project

"Around 200 years ago, the industrial revolution changed society in unimaginable ways for the time. Today, another revolution is under way," says Marc Lahmann, project management expert at PwC. He is referring to the artificial intelligence revolution and how this technology will transform project management across the board, for example by automating tasks or avoiding costly delays in construction processes.

ALICE is a platform that uses artificial intelligence to improve project management in construction and mitigate risks. In theory, it can reduce the duration of each project by 17% and save 14% in labour costs. The software does in just a few minutes what would take a human being years to complete. It calculates millions of hypothetical scenarios taking into account all the variables that influence the construction of a building and selects the most efficient solutions.
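
ALICE's internal methods are proprietary, but the general idea of scenario search can be illustrated with a toy sketch (all task names, crew sizes and costs below are invented): enumerate candidate schedules, discard those that miss the deadline and keep the cheapest of the rest.

```python
from itertools import product

# A toy illustration of scenario search: every figure below is invented,
# and this is not how ALICE itself works internally.
tasks = [("excavation", 120), ("foundations", 200), ("structure", 300)]  # (name, crew-days of work)
crew_options = (2, 4, 6)        # candidate crew sizes for each task
deadline_days = 150
day_rate_per_crew = 1_000       # daily cost of one crew (hypothetical)
site_overhead_per_day = 3_000   # fixed daily cost of keeping the site open (hypothetical)

def evaluate(crews):
    """Return (duration, total cost) for one crew assignment; tasks run sequentially."""
    duration = labour = 0.0
    for (_, work), c in zip(tasks, crews):
        efficiency = 1 - 0.05 * (c - 1)        # crude crowding penalty
        task_days = work / (c * efficiency)
        duration += task_days
        labour += task_days * c * day_rate_per_crew
    return duration, labour + duration * site_overhead_per_day

scenarios = [(crews, *evaluate(crews)) for crews in product(crew_options, repeat=len(tasks))]
feasible = [s for s in scenarios if s[1] <= deadline_days]
best = min(feasible, key=lambda s: s[2])
print(f"Best crew sizes {best[0]}: {best[1]:.0f} days, total cost {best[2]:,.0f}")
```

A real planner explores vastly larger spaces with constraints on crews, equipment and working space, but the select-the-best-feasible-scenario logic is the same.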

This platform has been used to plan the construction of a luxury building with an outdoor pool, gym and garden in Bangkok and the extension of the City of Edmonton light rail in Alberta, Canada. In the latter project, ALICE took into account factors such as the difficulties in constructing in winter weather. In addition, the AI-powered platform also created schedules to optimise the work of construction crews and to minimise the number of concrete pumps needed.

 

Artificial intelligence can improve project management in construction and mitigate risks. Credit: Pexels.

Robots and autonomous excavators to speed up the construction process

Beyond project management, artificial intelligence can be particularly useful in on-site construction. While we wait for autonomous vehicles to become a reality on the streets, autonomous driving technology could appear first in the construction industry. Companies such as Caterpillar, Komatsu, SafeAI and Built Robotics are working to make this possible. This last company is developing machines that operate fully autonomously, for example excavators that can dig trenches or level land 24 hours a day. These machines have been used for example by Mortenson, a large US construction company, to build solar and wind farms.

Sarcos Robotics is another company that has been designing robots to increase human productivity and safety in hazardous jobs for 25 years. Among its creations is Guardian XO, an exoskeleton designed to manipulate heavy loads using the body's natural movements. The company claims that this robotic suit can boost the wearer's strength by a factor of 20. In theory, an operator could safely lift and manoeuvre up to 90 kilos for eight hours at a time with little effort.

Some robots can be used to erect structures in record time. Guardian GT is a gigantic, remote-controlled double-armed robot capable of lifting more than 450 kilos and also carrying out more complex welding and assembly processes. Also of note is Hadrian X, a robot developed by Australian company Fastbrick Robotics that is able to lay 1,000 bricks in just one hour and construct the walls of a home in less than two days. "In the right environment and working continuously, each unit could build between 100 and 300 homes per year," says the company, which claims the process generates less waste than traditional construction methods.
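
Those figures are roughly self-consistent: at one home every two days of continuous operation, a single unit would complete on the order of 180 homes a year, within the 100 to 300 range the company quotes:

```latex
\frac{365\ \text{days/year}}{2\ \text{days per home}} \approx 180\ \text{homes per year}
```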

 

Some autonomous excavators can trench or level terrain 24 hours a day. Credit: Built Robotics.

 

Machine learning and sensors to improve safety

 

An article by the consulting firm McKinsey & Company indicates that machine learning can be applied to drone imagery and 3D-generated models to assess defects in the execution of a project, "both structural and aesthetic." It can also be used for the "early detection of critical events (e.g., bridge failure)." "These techniques could help engineers compare developing and final products against initial designs, or train an unsafe-behaviours detection algorithm to identify safety risks in project sites based on millions of drone-collected images," the firm says.

Increasingly, construction sites are equipped with cameras, IoT devices and sensors that constantly monitor the scene. Facial and object recognition technology can detect unsafe behaviour and warn of potential hazards. Sensors, combined with artificial intelligence systems, monitor conditions relevant to construction and materials, such as ambient temperature or humidity. These devices can also be built into wearables worn by the workers themselves to monitor heart rate, body temperature, repetitive movements or posture. For example, Wearlumb, a smart T-shirt designed for the workplace, measures workers' posture and suggests improvements to minimise the risks and costs of lower back disorders.
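
As a purely illustrative sketch of the simplest kind of rule such systems might apply (the field names and thresholds below are invented, not taken from any real product), continuous readings can be screened against limits before more sophisticated models come into play:

```python
# Hypothetical readings from workers' wearables; names and limits are invented.
READINGS = [
    {"worker": "A12", "heart_rate": 88,  "trunk_flexion_deg": 15},
    {"worker": "A12", "heart_rate": 164, "trunk_flexion_deg": 22},
    {"worker": "B07", "heart_rate": 95,  "trunk_flexion_deg": 65},
]

LIMITS = {"heart_rate": 150, "trunk_flexion_deg": 60}  # alert thresholds (illustrative)

def alerts(readings, limits):
    """Yield a message for every reading that exceeds a configured limit."""
    for reading in readings:
        for metric, limit in limits.items():
            if reading[metric] > limit:
                yield f"Worker {reading['worker']}: {metric} = {reading[metric]} exceeds {limit}"

for message in alerts(READINGS, LIMITS):
    print(message)
```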

Artificial intelligence can play an important role in the prevention of occupational hazards. In the United States, more than 5,000 workers died on the job in 2019, according to the country's Occupational Safety and Health Administration. That's more than 100 employees per week, or about 15 every day, and one in five deaths occurred in the construction field. Suffolk, a Boston-based general contractor, is developing an algorithm that analyses photos from its construction sites and scans them for safety hazards, such as workers not wearing protective equipment. The images are then correlated with its accident records so that safety briefings can be conducted when a potential threat is detected.

All these technologies can be useful during the construction process, but once the project is completed they still have great potential. For example, sensors attached to doors, windows, tables or chairs can constantly monitor everything from temperature and air quality to humidity, atmospheric pressure and even which lights are on or which doors or windows are open. The processing of millions of pieces of data with artificial intelligence systems serves to ensure the efficient use of resources and to keep buildings in the best possible condition.

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • AI
  • Robots
  • Machinery
  • We talk to science

“We are working to obtain biofuels and bioproducts from waste biomass” 

The principal objective of Dr. Raquel Iglesias Esteban’s research group is to contribute to the decarbonization of energy production systems and the development of the circular economy. 

Dr. Raquel Iglesias leads a research group of about 20 members, and she is particularly proud to see how their joint work evolves. And no wonder: it is no small feat.

Since March 2019 she has been Head of the Advanced Biofuels and Bioproducts Unit at CIEMAT, which reports to the Ministerio de Ciencia e Innovación (Ministry of Science and Innovation). The group works on obtaining biofuels such as ethanol, biogas and biomethane as alternatives to natural gas, as well as bioproducts such as lactic acid, a base molecule for producing bioplastics, all of which contribute to a more sustainable bioeconomy.

Raquel Iglesias holds a PhD in Agronomy from the Universidad Politécnica de Madrid (UPM). Before joining CIEMAT she worked at CEH-CEDEX, first as Head of Water Technology Projects and later as Program Director for the planning and treatment of urban wastewater for multipurpose reuse.

“We started this line of research on biogas and biomethane two years ago to recover agricultural waste, urban waste, sewage sludge and dedicated biomass. We believe the project will develop promisingly, given the high demand for projects related to biogas as an energy carrier. We want to undertake projects that integrate the biogas obtained from waste treatment into hybrid energy systems alongside other renewable energy sources”, explains Iglesias.

The biogas roadmap published by MITECO on 15 July 2021 sets out clearly defined objectives, closely aligned with the group's, such as using methane as a fuel for heavy vehicles and as a substitute for fossil gas, as well as strengthening the circular economy and helping to keep people in rural areas.

 

 

“We aim to generate methane by biological upgrading with renewable hydrogen, and to bring new ideas for using biogas as a platform to obtain bioproducts”, explains Iglesias. “At present, any technology for obtaining fuel sources that are an alternative to oil is welcome, in order to achieve the goals set by the EU to mitigate the effects of climate change”, the researcher affirms.

Raquel Iglesias intends to help integrate biomethane as an energy source in local systems. “In my opinion, the energy model requires generating energy as close as possible to where it is needed, helping industry to grow and creating new value chains”, she comments.

 

Sewage treatment plants turned biorefineries


Iglesias explains that her team is a benchmark in the characterization and pre-treatment of lignocellulosic biomass for its transformation into bioproducts using enzymes and fermentation. For instance, she notes, they are working on a project to transform the wet wipes that reach sewage treatment plants into ethanol for use as a biofuel.

“Sewage treatment plants are going to be much more than facilities that treat wastewater for discharge back into the environment or for reuse in agriculture; they will become biorefineries to obtain energy, nutrients and other value-added bioproducts”. 

 

 

Clearly, Iglesias is drawn to and motivated by the concept of the circular bioeconomy. “Working in the circular economy, giving waste products a second life and trying to extend their lifecycle as much as possible… these are not only a fascinating challenge, they are a necessity from which there is no turning back”.

 

Use of evolutionary engineering 


Another notable project of the Advanced Biofuels and Bioproducts Unit consists of increasing the performance of the microorganisms, such as fungi or bacteria, that transform biomass into bioproducts, using evolutionary engineering processes.

“Our work is sugar-based. That is, we extract sugar from biomass, and microorganisms transform this sugar into the desired bioproducts. We are good at this sugar extraction process and at preparing the microorganisms to work in these sugar-rich culture media. Biotechnological processes are becoming more and more efficient because we are able to adapt these microorganisms and genetically modify them to use these sugars under certain conditions”, explains Raquel Iglesias.

In her previous position, Raquel worked on risk management plans for wastewater use. 

“Whenever you use wastewater containing pathogens and harmful substances, you need to treat and manage it to avoid environmental and health problems. European law on the use of wastewater in agriculture makes risk management plans mandatory. For years I worked on recycled water use, both on treatment and on regulatory development, and I am still connected to these topics through document reviews and requests from MITECO, and through EU projects”, comments Dr. Iglesias.

 

An advocate for public research

 

Raquel Iglesias finds her job in the public research system demanding and challenging, but she considers that working for the common good and obtaining results that improve the environment and, with it, people's quality of life is definitely worthwhile. “Producing data and taking part in developing a regulatory framework so that everyone can keep operating while respecting the environment is very satisfying”, she confesses.

Iglesias has worked mainly on the implementation and development of a framework for wastewater purification and reuse, both nationally and internationally, and on the assessment of treatment technologies for MITECO, two activities that she now combines in her current work developing sewage treatment plants as biorefineries.

She remains connected to organizations related to wastewater treatment, energy efficiency and reuse, such as AEAS, and to biogas associations such as AEBIG. She also collaborates on plans and strategies to implement the circular economy, such as the Andalusian Circular Bioeconomy strategy, and is highly active in platforms related to biomass use, such as BIOPLAT, and to bioethanol, such as Bio-e and APPA Biofuels.

 

  • Waste
  • Circular economy
  • Biofuel

The undersea cables deployed around the world make it possible to share, search, send and receive information at the speed of light. Credit: Seatools.

  • Tungsteno

How the Internet extends across the sea floor

98% of all international Internet traffic flows through an immense network of undersea cables. This infrastructure crosses seas and oceans all over the globe to connect countries such as China and the United States, Portugal and India or South Africa and Malaysia.

ISABEL RUBIO ARROYO | Tungsteno

 

Nowadays, we live connected to the Internet 24 hours a day. Although everything seems to indicate that we are moving towards an increasingly wireless world, our Internet connections actually depend on thousands of kilometres of undersea cables that crisscross the oceans of the entire planet. How does this infrastructure work and how important is it for us to be able to use Google, Facebook or WhatsApp almost anywhere in the world?

Currently, 98% of international internet traffic flows through undersea cables, according to Google data: "A vast underwater network of cables crisscrossing the ocean makes it possible to share, search, send and receive information around the world at the speed of light." These cables are made of optical fibre. Theresa Bobis, Regional Director Southern Europe at DE-CIX, says: "They are not too thick, but inside there are hundreds of filaments the thickness of a human hair and, of course, they are protected with layers of different materials."

Thousands of kilometres of undersea cables

Transatlantic cables began to be installed more than 150 years ago for the telegraph network. At the end of the 20th century, the deployment of fibre optics brought about a real revolution in the implementation of this type of cable. Today they connect distant continents along the ocean floor and are "all around the globe," according to Bobis.

In total, "there are more than 420 underwater cables around the world totalling about 1.3 million kilometres." Some interactive maps show all the undersea cables that are deployed around the world. One example is the Submarine Cable Map, a website that provides data on when each cable began transmitting data, its length and the companies that own it.

 

There are more than 420 underwater cables around the world totalling about 1.3 million kilometers. Credit: Submarine Cable Map.

 

Southern Europe alone has "connections with 45 underwater cables, ten of which connect to Spain and nine to Portugal, with another six in the process of being deployed," according to Bobis. Half of these new cables will also reach the peninsula, as the expert points out. For example, one being installed by Google and called Grace Hopper will connect the United States with the United Kingdom and Spain.

From anchors to earthquakes: how undersea cables break

Despite the fact that these cables use increasingly modern technology and are more resistant, they do occasionally break. Most damage to undersea cables comes from human activities such as fishing and anchoring, according to TeleGeography, a telecommunications market research and consultancy company. "You might have heard that sharks are known to bite cables, but bites like this haven’t accounted for a single cable fault since 2007," the company says.

In recent years, various researchers have tried to use these underwater cables to detect and predict earthquakes, but in fact it is seismic movements themselves that are one of the main enemies of these cables. A major earthquake in 2006 in southwest Taiwan ruptured eight such cables, affecting several Asian countries. Repairing undersea cables is costly and time-consuming. When they break, a telecommunications operator has to find the location of the failure point, bring the damaged part to the surface and replace it with a new length of cable.

 

Google's new undersea cable will connect the United States with the United Kingdom and Spain. Credit: Google.

 

Avoiding breakdowns is important for a robust and reliable connection virtually anywhere in the world. These cables are, in Bobis' words, "indispensable" for transmitting large amounts of data with very low latency. "Today, having a reliable, resilient, high-capacity network is more important than ever, especially as we face a new digital normal with the COVID-19 crisis," said Google Cloud.
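
The physics gives a rough sense of that latency. Light in optical fibre travels at roughly two thirds of its speed in a vacuum (the refractive index of the glass is about 1.5), so for a transatlantic cable on the order of 6,000 km the one-way propagation delay is around 30 milliseconds:

```latex
t \;\approx\; \frac{d}{c/n} \;=\; \frac{6\,000\ \text{km}}{300\,000\ \text{km/s} \,/\, 1.5} \;\approx\; 30\ \text{ms (one way)}
```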

Although undersea cables span the oceans, in reality all these connections do not end at the coast, as Bobis explains. They continue over land to link to "data centres, which form the physical space for hosting data, or interconnection points, which are convergence points for content networks, internet service providers and other companies." In other words, while undersea cables "are a fundamental part, they are only one of the links in the Internet ecosystem."

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • IT
  • Google
  • Digitization

Batuco Wetland (Chile)

  • Water

Water Week: Committed to responsible use and consumption

Sacyr celebrates Water Week, re-affirming our commitment to the responsible use of natural resources and compliance with SDG 6 – Clean Water and Sanitation. 

At Sacyr we have a water management strategy that promotes sustainable practices, taking into account the availability of this resource, its quality, and the balance of the corresponding ecosystems. Our Sustainable Sacyr Plan entails reducing our water footprint by at least 10% by 2025, through decreasing consumption and supporting efficient processes.  

In addition, all activities must comply with the values detailed in our Quality, Environment and Energy Policy, which establishes the need for the responsible use and management of natural resources, as well as the protection of water resources as one of our values.

Our professionals are also involved in our efforts to continually improve and develop internal initiatives that help us—for example—find ways to streamline the reuse of water. As a result, we managed to increase the use of recycled water within the company by 21.74% in 2020. 

Innovative solutions that respect the environment

Not far from La Cadellada wastewater treatment plant (operated by Sacyr Water Concessions since 2020) is the Batuco Wetland, one of the most important in Chile. In an effort to contribute to the wetland’s water balance and the sector’s biodiversity, we began provisionally discharging treated water into the Estero Sin Nombre (Unnamed Estuary), which feeds into the wetland, in accordance with the project’s Environmental Quality Resolution. 

At several of our desalination plants, including Santa Cruz de Tenerife, Alicante I and II, and Cuevas de Almazora (Spain), we have implemented numerous technological and reuse upgrades that have not only increased energy efficiency, but have also boosted the efficiency of the entire plant. In many cases, this translates into a significant reduction in the seawater required to obtain the same amount of drinking water.  
Moreover, many facilities are powered by renewable energies—such as the Águilas and Binningup desalination plants, in Spain and in Australia, respectively—in an exemplary display of sustainability. 

Southern Seawater Desalination Plant (Binningup, Australia)

 

Sacyr Water Concessions has carried out several projects aimed at improving the sustainability of water treatment processes, including the reuse of reverse osmosis membranes (LIFE Transfomem), nitrogen recovery in wastewater (Denitox project), agricultural runoff recovery and sustainable water desalination for farming (LIFE Deseacrop), water treatment for crop irrigation (LIFE ETAD), etc.

 

  • Sustainability
  • Water

Amazon's goal is to send packages of up to two kilograms "safely" to customers in less than 30 minutes. Credit: Amazon.

  • Tungsteno

Drones get the green light to deliver goods

Fly an order through the air and deliver it in less than 30 minutes. This is the goal of Amazon, which after eight years has received the necessary authorization to deliver packages by drone in the U.S. Will these technologies really change the future of smart delivery?

ISABEL RUBIO ARROYO | Tungsteno

 

Back in 2013, Amazon was already dreaming of delivering millions of products with drones zipping through the skies, but it wasn't until late last year that the tech giant finally got approval from the U.S. Federal Aviation Administration (FAA) for its drone delivery service. Given the unstoppable rise of e-commerce, drones promise to revolutionise the parcel delivery sector. But if many companies have been investing in this type of technology for years, why hasn’t package delivery by drones taken off yet?

What exactly is Amazon's goal and what other projects are underway?

When you receive a package in the future, you likely won't open the door to a delivery person, but to a machine. Amazon's goal is to send packages of up to two kilograms "safely" to customers in less than 30 minutes. The company claims to be testing "many different vehicle designs and delivery mechanisms to discover how best to deliver packages in a variety of operating environments." Testing is being conducted at development centres in the United States, the United Kingdom, Austria, France and Israel.

In 2019, it unveiled a hybrid between a helicopter and a drone at its global re:MARS event in Las Vegas. "It can do vertical take-offs and landings—like a helicopter. And it’s efficient and aerodynamic—like an airplane. It also easily transitions between these two modes—from vertical mode to airplane mode and back to vertical mode," explains Jeff Wilke, former CEO of Amazon's consumer division. Thanks to artificial intelligence systems, the device uses sensors and algorithms to try to identify and avoid static and moving objects—from chimneys to a paraglider, a helicopter or animals.

Some drones use sensors and algorithms to try to identify and avoid static and moving objects. Credit: Amazon.

 

Amazon isn’t the only company experimenting with these devices. Alphabet—Google's parent company—UPS and the U.S. retail giant Walmart are some of the big companies that are introducing delivery drones in their businesses. In Spain, the national postal service Correos has carried out tests with drones capable of reaching 100 kilometres per hour and withstanding winds of up to 40 kilometres per hour.

 

Why is it so difficult to turn these small projects into mass trials?

 

In most cases, the answer lies in the regulations, whose main objective is to guarantee safety. In Spain, although experimental flights supervised by the Spanish Aviation Safety Agency (AESA) have been carried out, until a few months ago legislation did not permit the use of unmanned aircraft to deliver packages. This is now allowed, but only if "very demanding" requirements are met, according to Isabel Maestre, director of AESA. Whether and where a drone may fly will depend on its weight, the presence of people nearby and its proximity to buildings.

 

Given the unstoppable rise of e-commerce, drones promise to revolutionise the parcel delivery sector. Credit: Walmart.

 

What other obstacles do companies face?

 

In addition to the legislative barriers, there are the initial implementation costs and the possibility that this type of technology is not mature enough. A drone accident could result in fatalities. Amazon indicates that the appearance and features of its drones will continue to evolve over time. "We know customers will only feel comfortable receiving drone deliveries if they know the system is incredibly safe," Wilke says. Once these kinds of hurdles are overcome, long-term challenges include increasing cargo capacity. Even if legislation allowed it, some heavy shipments could not be handled by many of today's drones.

 

Why is the idea of using drones to deliver packages so appealing?

 

The arrival of drones in the skies could bring multiple benefits. The potential of these devices for delivery in emergency situations and isolated areas stands out, as highlighted by Correos: "The intention is to apply this technology to improve the distribution capabilities of Correos and as a complementary tool available to letter carriers to facilitate their delivery tasks in rural areas that are difficult to access or are isolated due to bad weather, without jeopardising their safety."

In addition to accessing hard-to-reach areas, autonomous drones could save companies money and reduce emissions of carbon dioxide and other greenhouse gases. This is particularly important at a time when online commerce is on the rise, boosted by the coronavirus pandemic. Half of Spaniards already shop online, according to the Survey on Equipment and Use of Information and Communication Technologies in Households carried out by the National Statistics Institute.

Autonomous drones could save companies money and reduce emissions of carbon dioxide and other greenhouse gases. Credit: HadasBandel / Wikimedia Commons.

 

When are drones expected to take to the skies?

 

Maestre indicates that it is still early for drones to occupy the skies on a massive scale, at least in Spain. "Our vision is that we will start to see initiatives in very limited conditions, such as experimental projects or emergency delivery operations, in the coming months, but the mass transport of consumer goods will have a time horizon of at least two or three years," she explains.

In the meantime, companies like Amazon say they will start using these drones when and where they have the regulatory support to do so safely. The challenges are complex, but the authorization that the e-commerce giant has received to deliver packages with drones in the United States brings us a little closer to that utopian future in which unmanned vehicles zip across the skies to deliver packages in a matter of minutes anywhere in the world.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Drones

The energy and fatty acids created by microalgae through photosynthesis can be harnessed to generate electricity or make biofuel. Credit: CSIRO.

  • Tungsteno

Microalgae, a new alternative energy source

Some microscopic algae can produce up to 60 times more biofuel than terrestrial plants using the same surface area and sunlight. Could these organisms revolutionise the renewable energy sector?

PABLO GARCÍA-RUBIO | Tungsteno

 

Fossil fuels account for 84% of energy consumption worldwide and, although progress is made year after year in scaling back the use of this type of resource, dependence is still very high in some sectors. The search for alternative sources that are efficient, competitive and renewable has become the new holy grail of energy. Could the key lie in microalgae? These organisms have been studied intensively for decades, are highly efficient and represent a great hope as a renewable energy source of the future.

Microalgae are single-celled organisms capable of living in a wide variety of ecosystems and with an immense variety of forms. While estimates vary, it is thought that there are somewhere between 50,000 and more than 200,000 species of algae, including those classified and those yet to be discovered. The energy and fatty acids that these algae create through photosynthesis can be harnessed to generate both electricity and feedstock in the form of biomass to produce agrofuels.

 

Microalgae are single-celled organisms capable of living in highly varied ecosystems and with an immense variety of forms. Credit: Frank Fox.

 

Cultivation of microalgae for biofuel production

 

In recent years, there have been multiple advances in the cultivation of microalgae with the aim of generating biofuel. If these organisms excel at anything, it is their ability to produce up to 60 times more fuel than terrestrial plants using the same surface area and sunlight. What’s more, they have a reproductive capacity five to ten times greater than the crops used until now.

But productivity is not the only thing in their favour. The environmental impact of algae also stands out compared to other biomass crops (such as corn, palm or sugar cane). Algae don’t need resources such as soil or irrigation water, they don’t require fertilisers or pesticides, and they don’t even need clean or fresh water, since they can be grown in brackish water or even wastewater.

In addition, to assist them in carrying out photosynthesis, microalgae can be fed CO2 from other industries. In this way, while generating energy, they can contribute to the decarbonisation of the atmosphere. In addition, according to a recent study, combining them with bacteria can increase the production of hydrogen—another fuel of the future—by 60%.

 

Photobioreactors are devices for the mass cultivation of microalgae. Credit: U.S. Department of Energy.

 

Despite all these benefits, researchers have not yet succeeded in producing microalgae-based fuel at a price competitive enough to enable its mass production and commercialisation. Some large energy companies that had invested in microalgae, such as the oil giant Exxon, have been lowering their expectations because they cannot guarantee its commercial viability in the short term. One of the major problems lies in their tiny size: the handling and processing of microscopic organisms is expensive and laborious. The need for new infrastructure or a drop in oil prices—today at levels similar to those of the early 2000s—also pose a challenge.

 

Microalgae as a material for solar panels

 

Another way of obtaining energy from microalgae is to capture the exchange of electrons that occurs when they photosynthesise in order to generate electricity. While some important advances have been made using this process, the technology to do so is not yet sufficiently developed.

However, in the city of Hamburg, Germany, the first building in the world to supply part of its energy consumption through panels containing microalgae was built in 2013. These glass panels, located on the facade and with a rotation capacity to face the Sun, are supplied with CO2 and nutrients that encourage the microalgae to reproduce. The spectrum of light not absorbed by the algae during photosynthesis is converted into heat, which is used to generate hot water for the building or to heat the interior. Periodically, the excess microalgae is collected and sent to a facility offsite where it is fermented to produce methane gas, which is burned to produce electricity.

 

The BIQ Building in Hamburg supplies part of its energy consumption through panels containing microalgae. Credit: NordNordWest / Wikimedia.

 

The aim would thus be to create panels composed of algae that harness sunlight, albeit using a technology very different from that of conventional solar panels. Although such panels are already being used in experimental projects like this one, the idea is still a long way from replacing current silicon panels, which provide high productivity at relatively low cost. The facade of microalgae panels on the Hamburg building, for example, cost around five million euros.

So far, none of the processes for obtaining energy from microalgae have managed to be competitive in a market as complex as the energy market. It will still take years before they become competitive, according to many experts, as a number of economic, production and infrastructure challenges have to be overcome. Even so, research is ongoing. Perseverance could eventually see algae become a key crop for the energy sector in a few decades, either in the form of biomass or as an essential component of solar energy production.

 

· — —
Tungsteno is a journalism laboratory to scan the essence of innovation. Devised by Materia Publicaciones Científicas for Sacyr’s blog.

  • Panels
  • Solar
  • CO2
  • Renewable energy
  • Engineering

Concrete with steel fibers: stronger and more sustainable

We are working on a project, called eFIB, whose overall objective is to develop a new slab system using high-performance steel fiber reinforced concrete (SFRC) without traditional reinforcement.

Sacyr’s commitment to sustainability, improving energy efficiency, reducing carbon footprint and increasing productivity leads us to create innovative solutions, such as fiber reinforced concrete.


Use of SFRC has increased significantly in recent years, due to its acceptance in several countries.

It represents one of the most important innovations in the field of special concretes. The change in the material's composition changes its properties: it increases residual tensile strength (under certain loads), durability and resistance to cracking, and improves thermal properties, fire behavior and more.  

 

It handles large loads

 

“This acceptance and the need for innovation in new construction techniques and materials has motivated research into potential applications of SFRC in structures that bear substantial loads,” explains Ángel Sanchez de Dios, Structural Engineer in the Building Department of Sacyr Engineering and Infrastructure.

In addition, the interest in using SFRC instead of traditional reinforced concrete for these types of structural elements stems from advantages such as resource optimization, reduced execution times and reduced environmental impact, among other aspects. 

“The concrete is manufactured at the plant and arrives on site, where it is poured into the formwork. Labor requirements are reduced across all the processes: steel assembly work on site is minimal, and fewer operators are needed for casting because self-compacting concrete is used. All these semi-industrialized processes mean an improvement in energy efficiency,” explains Ángel Sánchez.

However, consolidating the use of SFRC in slabs requires a better understanding of fiber behavior in aspects such as failure modes and deformation under loads sustained over time. This understanding will improve with long-term use and with regulations developed to cover these aspects. “With the current cost of fibers, it is also very important to ensure that it can be done with low fiber contents to make it cost-effective,” the expert stresses.

 

Verified viability

 

“We think this is a good solution with advantages of all kinds, so we are keeping a close eye on opportunities to apply it in some of Sacyr’s new projects, as we have verified the viability of its use in building slabs,” says Ángel Sánchez. 

With the aim of taking the structure to its limit, a load test equivalent to an additional load of 20 people per m2 was carried out, i.e. well above code requirements and the design calculations. Its cracking and deflection behavior remained within low, permissible ranges.
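
To put that test in context (assuming a nominal 75 kg per person, a figure not given in the text), 20 people per square metre corresponds to roughly 15 kN/m2, many times the live loads typically assumed for building floors, which are on the order of 2 to 5 kN/m2 depending on use:

```latex
20\ \text{persons/m}^{2} \times 75\ \text{kg} \times 9.81\ \text{m/s}^{2} \approx 14.7\ \text{kN/m}^{2}
```
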

  • Concrete
  • Engineering and Infrastructure Sacyr
