California firefighters are using artificial intelligence to help spot wildfires, feeding video from more than 1,000 cameras strategically placed across the state into a machine that alerts first responders when to mobilize.
In an example of the potential of the ALERTCalifornia AI program launched last month, a camera spotted a fire that broke out at 3 a.m. local time in the remote, scrubby Cleveland National Forest about 50 miles (80 km) east of San Diego.
With people asleep and darkness concealing the smoke, it could have spread into a raging wildfire. But the AI alerted a fire captain, who called in about 60 firefighters along with seven engines, two bulldozers, two water tankers and two hand crews. Within 45 minutes the fire was out, Cal Fire said.
Developed by engineers at the University of California San Diego using AI from DigitalPath, a company based in Chico, California, the platform relies on 1,038 cameras put up by various public agencies and power utilities throughout the state, each one capable of rotating 360 degrees at the command of remote operators.
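The article does not describe how the detection pipeline is wired together. The sketch below is a minimal illustration, assuming a hypothetical polling loop in which each camera's latest frame is scored by a smoke-detection model and dispatch is notified when confidence crosses a threshold. Every name here (score_frame, notify_dispatch, ALERT_THRESHOLD) is an assumption made for illustration, not a detail of the ALERTCalifornia or DigitalPath system.

```python
from dataclasses import dataclass
import random
import time


@dataclass
class CameraFrame:
    camera_id: str
    timestamp: float
    pixels: bytes  # stand-in for real image data


def score_frame(frame: CameraFrame) -> float:
    """Hypothetical smoke-detection model: returns a confidence in [0, 1].
    Placeholder for whatever model the real platform uses."""
    return random.random()


def notify_dispatch(camera_id: str, confidence: float) -> None:
    """Placeholder for alerting a fire captain or dispatch center."""
    print(f"ALERT: camera {camera_id} possible smoke (confidence {confidence:.2f})")


ALERT_THRESHOLD = 0.9  # assumed value, not taken from the article


def monitor(cameras, get_frame, interval_s: float = 60.0, cycles: int = 1) -> None:
    """Poll each camera on a fixed interval and alert on high-confidence detections."""
    for _ in range(cycles):
        for cam in cameras:
            frame = get_frame(cam)
            confidence = score_frame(frame)
            if confidence >= ALERT_THRESHOLD:
                notify_dispatch(cam, confidence)
        time.sleep(interval_s)


if __name__ == "__main__":
    demo_cams = ["cleveland-nf-07", "san-diego-east-12"]  # invented camera IDs
    fake_get_frame = lambda cam: CameraFrame(cam, time.time(), b"")
    monitor(demo_cams, fake_get_frame, interval_s=0.1, cycles=3)
```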
Since the AI program began July 10, Cal Fire has provided other examples of the AI alerting fire captains to fires before any 911 call was made, though it does not yet have a comprehensive report.
Neal Driscoll, a professor of geology and geophysics at UCSD and the principal investigator of ALERTCalifornia, said the sample size so far was too small to draw conclusions.
Cal Fire hopes the technology can one day serve as a model for other states and countries around the world, a need underscored by unusually devastating wildfires in Hawaii, Canada and the Mediterranean this season.
“It’s 100% applicable throughout anywhere in the world, especially now that we’re experiencing a lot larger and more frequent fire regimes and with climate change,” said Suzann Leininger, a Cal Fire intelligence specialist in El Cajon, just east of San Diego.
Part of Leininger’s job is to help the machine learn. She reviews previously recorded video from the camera network of what the AI flagged as a fire, then tells the machine whether it was right with a binary yes or no answer. Any number of phenomena can trigger a false positive: clouds, dust, even a truck with smoky exhaust.
With hundreds of specialists repeating the exercise up and down the state, the AI has already become more accurate in just a few weeks, Driscoll said.
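The article does not say how those yes/no reviews are stored or fed back to the model. The following is one plausible sketch of capturing binary feedback as labeled examples for later retraining; all names (Detection, record_review, the CSV layout) are invented for illustration.

```python
import csv
from dataclasses import dataclass


@dataclass
class Detection:
    clip_id: str            # recorded video clip the AI flagged
    camera_id: str
    model_confidence: float


def record_review(detection: Detection, is_fire: bool, path: str = "reviews.csv") -> None:
    """Append a specialist's binary yes/no verdict so flagged clips
    can later be used as labeled training examples."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [detection.clip_id, detection.camera_id, detection.model_confidence, int(is_fire)]
        )


# Example: a flagged clip turns out to be dust, not smoke, so it is labeled a false positive.
record_review(Detection("clip-0012", "el-cajon-03", 0.82), is_fire=False)
```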
Beyond the camera network, the platform is collecting vast amounts of additional information, including an aerial survey to quantify the vegetation that would fuel future fires and map the Earth’s surface beneath the canopy, Driscoll said.
Airplanes and drones are also collecting infrared and other wavelength data beyond the capabilities of human vision.
During the winter, the platform is able to measure atmospheric rivers and snowpack. The UCSD team is also capturing data on burn scars and their impact on erosion, sediment dispersal, water quality and soil quality, Driscoll said.
The data, which is available to any private company or academic researcher, could eventually be used to model fire behavior and support as-yet-unforeseen AI applications for studying the environment.
“We’re in extreme climate right now. So we give them the data, because this problem is bigger than all of us,” Driscoll said. “We need to use technology to help move the needle, even if it’s a little bit.”