By David Pendered
Georgia Tech researchers are devising a way to track smartphones and other devices so that rescue workers could locate people in harm's way even when a power outage knocks out the internet.
The short explanation is that rescue workers may be able to ping devices in a disaster area to determine their precise location.
This information could be used to create density maps of people in the affected region.
These maps can inform response teams and help set search and rescue priorities, according to a statement from Tech.
“We believe fog computing can become a potent enabler of decentralized, local social sensing services that can operate when internet connectivity is constrained,” Kishore Ramachandran, a Tech computer science professor, said in the statement.
“This capability will provide first responders and others with the level of situational awareness they need to make effective decisions in emergency situations,” Ramachandran said.
The innovation Tech’s researchers bring to the table is what Tech describes as:
- “[A] generic software architecture for social sensing applications that is capable of exploiting the fog-enabled devices. The design has three components – a central management function that resides in the cloud, a data processing element placed in the fog infrastructure, and a sensing component on the user’s device.”
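The three components named above — a central management function in the cloud, a data processing element in the fog, and a sensing component on the device — can be sketched in miniature. This is purely illustrative; every class and method name here is invented for the example and is not from the Tech project.

```python
# Illustrative sketch of the three-component architecture described above.
# All names are invented for this example; none come from the Tech project.

class DeviceSensor:
    """Sensing component on the user's device: answers a location ping."""
    def __init__(self, device_id, lat, lon):
        self.device_id = device_id
        self.lat = lat
        self.lon = lon

    def ping(self):
        return {"id": self.device_id, "lat": self.lat, "lon": self.lon}


class FogNode:
    """Data processing element in the fog: aggregates nearby device pings
    without needing a route to the internet."""
    def __init__(self):
        self.readings = []

    def collect(self, sensor):
        self.readings.append(sensor.ping())

    def density_by_cell(self, cell_size=0.01):
        # Bucket device locations into a coarse grid to form a density map.
        counts = {}
        for r in self.readings:
            cell = (round(r["lat"] / cell_size), round(r["lon"] / cell_size))
            counts[cell] = counts.get(cell, 0) + 1
        return counts


class CloudManager:
    """Central management function in the cloud: merges fog-node summaries
    when (and if) connectivity is available."""
    def merge(self, summaries):
        merged = {}
        for summary in summaries:
            for cell, n in summary.items():
                merged[cell] = merged.get(cell, 0) + n
        return merged


# Two devices a few hundred meters apart fall into the same grid cell,
# so the density map shows one cell containing two people.
fog = FogNode()
fog.collect(DeviceSensor("a", 33.7756, -84.3963))
fog.collect(DeviceSensor("b", 33.7760, -84.3968))
density = CloudManager().merge([fog.density_by_cell()])
```

The point of the split is that the `FogNode` can build its density map entirely locally; the `CloudManager` step is optional and runs only when connectivity returns.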
Tech researchers are looking at a cousin of cloud computing.
This cousin relies on the processing capacity of the devices people use, as opposed to the processing capacity of the cloud, where so many of our photos, videos and files are stored. This cousin is named fog computing.
The technology came to widespread attention in April 2016. As with so much of the Internet of Things, interest was spurred by the formation of the OpenFog Consortium, due in part to the weight of its backers, including Cisco, Microsoft, Dell and Intel, according to a report in businessinsider.com.
The notion involves the processing of data at the local level, rather than pushing so much of it into the cloud.
Rather than relying on the internet to move data to the cloud, where it would be analyzed and responses sent back to devices, fog computing keeps much of that work close to the devices themselves.
It is based on the concept of distributed services, the approach corporations used in the days when a smaller number of computers shared a network. That era preceded the cloud, which was devised as a way to have large numbers of users send data to a centralized system.
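The contrast between the two models can be shown in a few lines. This is a toy sketch, not real fog middleware; the function names and the stand-in `analyze` step are invented for illustration.

```python
# Toy contrast between cloud-style and fog-style processing.
# All function names here are invented for this sketch.

def analyze(data):
    # Stand-in for whatever analysis the application needs (here, a mean).
    return sum(data) / len(data)

def cloud_process(data, internet_up):
    # Cloud model: data must cross the internet before it can be analyzed.
    if not internet_up:
        raise ConnectionError("no route to the cloud")
    return analyze(data)

def fog_process(data, internet_up):
    # Fog model: analysis runs on a nearby fog node or on the device itself,
    # so losing the internet does not block the answer. A summary could be
    # synced upstream later when connectivity returns.
    return analyze(data)

# With the internet down, only the fog path still produces a result.
reading = [3, 4, 5]
local = fog_process(reading, internet_up=False)
```

In a disaster scenario like the ones described below, that difference is the whole point: the local path keeps working when the route to the cloud does not.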
Just as Cisco and its partners weren’t the first to look into the role fog computing could play in disaster response, Tech isn’t the first to examine ways to tap the data spilling out of smartphones, routers and other devices.
In 2014, a doctoral candidate at Texas A&M University wrote his dissertation on the creation of a fog architecture for disaster response. The student described the type of scenario he intended to address, one similar to that facing the Tech researchers:
- “When a disaster hits an urban metropolitan area that spans tens of square miles (2011 Joplin tornado) or hundreds of square miles (2011 Japan earthquake), power and communication infrastructure are rendered unusable. … [E]lectrical services are still being restored, a few cell sites have been restored, cell phones are being distributed and satellite telephone has been set up. In this kind of environment, presence of broadband internet access cannot be assumed.”
Such was the case in Houston, following Hurricane Harvey. Such was the case across parts of Georgia and other parts of the Southeast, following Hurricane Irma.