How AI and drones are trying to save the Great Barrier Reef

Time is running out, with half of the reef’s coral already dead.

Image: Queensland University of Technology

The Great Barrier Reef is home to one of the world’s most diverse ecosystems. Across its 2,300 kilometre reach, there are 2,500 individual reefs and more than 9,000 species, including fish, whales, and turtles. A UNESCO World Heritage Site and one of the world’s seven natural wonders, the Great Barrier Reef attracts over two million visitors every year to explore its colourful corals and unique marine life.

Yet for all its wonder, the reef’s coral has rapidly deteriorated as ocean temperatures continue to rise. When water temperatures rise, corals expel the algae living in their tissue, leaving them stressed and stripped of their colour, a process known as coral bleaching. The bleaching events inflicted on the reef are well documented, with half of its coral already dead.

SEE: Rebooting the reef (CNET)

While coral populations can recover from a bleaching event, the road to recovery is often a decade-long process. If carbon emissions remain steady, UNESCO climate models predict coral bleaching will occur twice every decade from 2035 onward, and annually after 2044. 

Finding ways to solve this environmental catastrophe has been difficult, but with concerted efforts, there has been gradual progress towards better understanding how to protect the reef — particularly through the development of drone technologies.

Using drones to see beyond the human eye

With so much coral to regrow, it can be difficult to determine which parts of the reef should be prioritised in the regrowth process. Traditionally, in-water surveys and NASA satellite imagery have been the primary methods for collecting data about the state of the reef. But both methods have drawbacks.

In-water surveys are generally inefficient due to the limited number of data points they can provide. Satellite images, meanwhile, can provide thousands of data points, but are often hard to decipher due to their low resolution or cloud cover.

A team led by Felipe Gonzalez, an aeronautical engineer at the Queensland University of Technology (QUT), has attempted to find a happy medium between these two methods by designing a drone that captures data through hyperspectral cameras.

A hyperspectral camera collects and processes information from across the electromagnetic spectrum, including wavelengths beyond what the human eye can see. Human vision covers roughly 450-700nm, while a hyperspectral camera can capture information between 300-1000nm.
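To make that difference concrete, here is a minimal Python sketch contrasting the three values an RGB image stores per pixel with the full spectrum a hyperspectral cube stores per pixel. The image dimensions and band count are illustrative assumptions, not the specifications of the QUT camera.

import numpy as np

# An RGB image stores 3 values per pixel (red, green, blue).
rgb_image = np.zeros((1024, 1024, 3), dtype=np.uint8)

# A hyperspectral cube stores one value per spectral band per pixel.
# Assume 270 bands sampled evenly across 300-1000nm (hypothetical figures).
n_bands = 270
wavelengths = np.linspace(300, 1000, n_bands)  # nm
hyperspectral_cube = np.zeros((1024, 1024, n_bands), dtype=np.uint16)

# Each pixel is a full spectrum, so we can inspect reflectance at a
# specific wavelength, e.g. the band closest to 550nm (green light).
band_550 = int(np.argmin(np.abs(wavelengths - 550)))
green_slice = hyperspectral_cube[:, :, band_550]
print(f"Band {band_550} is closest to 550nm ({wavelengths[band_550]:.0f}nm)")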

SEE: How to implement AI and machine learning (ZDNet/TechRepublic special feature) | Download the free PDF version (TechRepublic)  
 
Compared to the 30-40 data points collected by in-water survey images, a single hyperspectral image can cover thousands of data points about a particular area of coral. Drone images also avoid the cloud cover and resolution issues that plague satellite images. According to Gonzalez, the captured data can differentiate between coral, sand, and algae, as well as determine the type of coral and the precise level of coral bleaching.
 
“It gives us a better understanding of why a coral is a certain colour. Is it because [the coral] is its natural colour, or is there a problem with the coral? Is it dying or bleaching?” Gonzalez said.
 
Working alongside the Australian Institute of Marine Science (AIMS), the country’s tropical marine research agency, the team validates the data captured by the drone against AIMS’ previously collected in-water survey data.
 
“We can look for signatures that we have from in-water surveys and go, ‘okay, create an algorithm that looks for [a specific] signature’. When it picks up that same signature, it can then be extrapolated and applied across the same imagery. This then allows the camera to capture the different types of corals,” Gonzalez said.
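Gonzalez does not name the algorithm, but one standard way to match a pixel’s spectrum against known reference signatures is the spectral angle mapper, which scores similarity by the angle between two spectra. The Python sketch below is a minimal illustration under that assumption; the four-band signatures are invented for readability, where real signatures would span hundreds of bands.

import numpy as np

def spectral_angle(pixel: np.ndarray, reference: np.ndarray) -> float:
    """Angle (radians) between two spectra; smaller means more similar."""
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference)
    )
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

def classify_pixel(pixel, signatures):
    """Assign the pixel to the reference signature with the smallest angle."""
    return min(signatures, key=lambda name: spectral_angle(pixel, signatures[name]))

# Hypothetical in-water-survey signatures for a 4-band toy example.
signatures = {
    "healthy_coral": np.array([0.12, 0.18, 0.25, 0.30]),
    "bleached_coral": np.array([0.40, 0.45, 0.50, 0.52]),
    "sand": np.array([0.55, 0.60, 0.62, 0.65]),
    "algae": np.array([0.08, 0.30, 0.15, 0.10]),
}

pixel = np.array([0.38, 0.44, 0.49, 0.51])
print(classify_pixel(pixel, signatures))  # -> "bleached_coral"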


The same approach extends beyond the reef, Gonzalez noted. “Rather than just taking in RGB information, which is important and useful, the application of AI can tell biosecurity people that there is an invasive plant in a particular environment.”
 
But improved coverage brings a new challenge for reef conservationists: interpreting where the Great Barrier Reef needs the most help. With one hyperspectral image providing more than 4,000 data points, a single drone flight can amass thousands of gigabytes of raw data that need to be processed and analysed.

To harness the drone’s full potential, Gonzalez said, data processing and computer vision development are needed to interpret the data, which is something he expects will be the next frontier for reef-saving technologies.
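The scale of that data problem is easy to sanity-check with back-of-the-envelope arithmetic. All of the figures below are assumptions for illustration, not the QUT system’s actual capture parameters.

# Rough, hypothetical figures showing how hyperspectral surveys balloon
# into terabyte-scale datasets; none of these values come from the QUT
# project itself.
frames_per_flight = 10_000       # images captured in one flight
pixels_per_frame = 1024 * 1024   # spatial resolution per frame
bands_per_pixel = 270            # spectral bands (vs 3 for RGB)
bytes_per_sample = 2             # 16-bit sensor readings

raw_bytes = frames_per_flight * pixels_per_frame * bands_per_pixel * bytes_per_sample
print(f"{raw_bytes / 1e12:.1f} TB of raw data per flight")  # ~5.7 TB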

Developing ‘smart’ reef-saving tech is an iterative process

Developing new technology for the environment is important, but the process of creating these reef-saving technologies can be an iterative one: new technologies are often a refinement of something that already exists, or a re-engineering of existing tech so it can be applied to a different use case. Unveiled in September last year, the RangerBot is a multi-purpose underwater drone that eliminates coral-eating crown-of-thorns starfish, monitors the reef for health indicators, and maps underwater areas. Since deploying the RangerBot across the Great Barrier Reef, the research team behind the drone has also developed larval bots, which are derivatives of the RangerBot.

The larval bots are capable of delivering coral larvae to reefs for reseeding, and a pilot trial of the bots was conducted at the Great Barrier Reef in November last year. Operators control the larval bot via an iPad, instructing it to release millions of coral larvae onto the reef.

SEE: The Great Barrier Reef could be saved by these lab-grown coral babies (CNET)

Last month, the QUT research team behind the larval bots also conducted a coral spawning trial in the Philippines, where coral only spawns on one night of the year. The process entails capturing the spawn, rearing it in floating cages for five to seven days, and, once it is ready to settle, distributing it via the larval bots or divers. In places like the Philippines, where dynamite fishing has destroyed reefs to the point that little is left to recover naturally, regrowing coral is the only option.
 
Prior to the development of the larval bots, coral reseeding trials were primarily diver-based and conducted in 20m x 20m plots. The team is now looking to scale up to a hectare, or even a square kilometre. Achieving this increase in scale, however, presents conservationists with various challenges.

SEE: Green Tech Tips & Tricks (TechRepublic on Flipboard)  
 
Matthew Dunbabin, QUT professor and lead developer of the RangerBot and larval bots, told TechRepublic that while underwater conservation technology has existed since the 1970s, the majority of existing platforms do not lend themselves to large-scale management activities.
 
The practical challenges conservationists face are how to distribute a limited supply of coral spawn efficiently, and how to identify the areas where spawn is most likely to resettle and grow, challenges that require AI, real-time computer vision, and on-board decision making.
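Neither researcher details the decision logic, but the underlying allocation problem can be sketched simply: given a limited amount of spawn and per-site estimates of settlement probability, release spawn where expected resettlement is highest. The Python sketch below is purely hypothetical; the site names, probabilities, and greedy strategy are illustrative assumptions, not the larval bots’ actual behaviour.

# Hypothetical greedy allocation: release limited coral spawn at the
# sites with the highest estimated settlement probability. A real system
# would derive these estimates from computer vision in real time.
def allocate_spawn(sites, total_spawn, per_site):
    """sites maps site name -> estimated settlement probability (0-1)."""
    plan = {}
    remaining = total_spawn
    for name in sorted(sites, key=sites.get, reverse=True):
        if remaining <= 0:
            break
        dose = min(per_site, remaining)
        plan[name] = dose
        remaining -= dose
    return plan

survey = {"site_a": 0.62, "site_b": 0.35, "site_c": 0.81, "site_d": 0.48}
print(allocate_spawn(survey, total_spawn=2_000_000, per_site=750_000))
# -> {'site_c': 750000, 'site_a': 750000, 'site_d': 500000}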
 
The technological challenge for the larval bots, Dunbabin said, is much like the one facing Gonzalez’s hyperspectral drone: figuring out how to apply AI, machine vision, and robotics capabilities to reef-saving technologies.
 
“How do we package it into a system where they can do things by themselves in the wild? We’re so advanced in our AI, machine vision, and robotics capabilities that we lack the platforms to get that into the environment safely,” Dunbabin said.
 
“Fundamentally, climate change is impacting the reefs, so as technologists, we play a role in reducing our carbon footprint: not only the amount of energy used to drive the device, but also making it think smarter. In terms of the management of reefs, we’re only just starting to apply AI, machine vision, and robotics to management activities, and we expect to see a dramatic upscale of that.”

Funding is the backbone for saving the reef

The information collected by the hyperspectral drone and the RangerBot derivatives has provided the Australian government with more data about the Great Barrier Reef than ever before, as well as an understanding that there is a great need for AI development in environmental conservation tech. But collecting and analysing that data, and developing the technology to act on it, requires considerable time and resources.
 
Developing technology to locate, rehabilitate, and protect coral reefs is an expensive and time-intensive process. Given the reef’s sheer size, environmentalists have labelled the Australian government’s pledge of AU$500 million as “nowhere near enough”.

Bill McKibben, founder of the grassroots climate movement 350.org, has criticised Australia’s continued use of coal, which he argues undermines any work done to save the reef.

“Science is well aware of what is killing coral on the Great Barrier Reef — it’s the excess heat that comes from burning fossil fuels,” he told The New York Times last year. “If the Turnbull government was serious about saving the reef, they would be willing to take on the industry responsible for the damage.”

SEE: How self-driving tractors, AI, and precision agriculture will save us from the impending food crisis (cover story PDF) (TechRepublic)  

The hyperspectral cameras used in Gonzalez’s drone systems cost around AU$100,000; only four years ago, they were priced at around AU$500,000. To fund the development of the drone, Gonzalez and his team applied for and received a Microsoft AI for Earth grant earlier this year, which makes software tools, cloud computing services, and AI deep learning resources available to researchers working on global environmental challenges. The hyperspectral drone system was one of six projects that received a slice of Microsoft’s AU$50 million grant.
 
“Now we can use Microsoft’s AI tools in the cloud to supplement our own tools and quickly label the different spectral signatures,” Gonzalez said. “So, where processing previous drone sweeps used to take three or four weeks, depending on the data, it now takes two or three days.”

As it stands, the Great Barrier Reef is the world’s largest coral reef system, but the implications of rising ocean temperatures extend far beyond this one reef. Coral reefs are home to a quarter of the world’s marine species, and they provide 17% of the animal protein consumed by humans, according to the United Nations. 
 
“Time is too short, and current human resources are too few to solve urgent climate-related challenges without the exponential power of AI,” Microsoft chief environmental officer Lucas Joppa said. “By putting AI in the hands of researchers and organisations, we can use important data insights to help solve issues related to water, agriculture, biodiversity and climate change.” 
 
One cause for optimism from a technological perspective, Dunbabin said, is that trust and privacy are not major concerns in environmental management. He told TechRepublic he hopes this translates into global tech giants being more open to providing funding and resources for the development of AI reef-saving technologies.


Dunbabin and his research team plan to once again deploy their larval bots at the Great Barrier Reef in December. This time around, they will attempt to massively scale up the area of damaged reef that can be reseeded.

Meanwhile, for the past two months, Gonzalez has been processing the spectral data collected from the reef so far with Microsoft’s AI tools. In September, his team will start a second round of drone flights. 

“We aim to return to the four reefs AIMS has already studied to monitor any changes,” he said, “then extend the monitoring to new reefs.”

Also see

How AI could save the environment 
Artificial intelligence techniques are now used to monitor endangered species, track diseases, and optimize crops. But there’s much more work to be done.

How the merger of two data giants will benefit the social sector
GuideStar and the Foundation Center recently combined forces to create Candid. Find out how the nonprofit is using machine learning, data science and other tech.

How IoT tech is helping African rangers protect endangered elephants from poachers 
Tanzanian rangers use EarthRanger, a system that relies on data from remote sensors, to keep endangered elephants safe in their wildlife reserve.

Farming for the future: How one company uses big data to maximize yields and minimize impact
Foris.io has a mission: Make farms more productive and protect the environment while doing it. With a combination of hardware and machine learning from IBM, Foris.io aims to change the way we farm.

Australian universities use artificial intelligence to map global foreshore (ZDNet)
Two Australian universities teamed up to apply artificial intelligence to map the loss of crucial coastal environments around the world.