AI enters its next act with inspections
A perennial controversy surrounds artificial intelligence (AI) and jobs. Will AI take away our jobs, leaving us unemployed? While some jobs will indeed change or disappear—as they have always done and will continue to do—AI can also be a human force multiplier in business applications where we humans are still under-delivering.
One set of jobs recently added to the endangered species list (so to speak) consists of collecting, storing and analyzing terabytes of drone footage of telecommunication towers, pipelines, solar panels, bridges and other structures where gathering data can be treacherous for humans.
The repetitive operation of flying a drone—and analyzing the copious data collected by it—is a job prone to fatigue and errors. Pilots aside, analysts tasked with finding a crack in the 351st insulator inspected during the 25th drone mission of the day might be expected to tire. If cracks on insulators are not their life’s passion, chances are they will eventually make a fatal or money-losing mistake.
That is where AI comes in—software emulations of human nervous systems that can be trained to take over part of the dull, repetitive work that human analysts must otherwise perform relentlessly on huge datasets, day after day. In effect, the drone “borrows” the human brain to deliver its final value to the enterprise.
Edge AI and biased analytics
Think about how humans inspect a structure: they go from cursory looks to more thoughtful and prolonged analysis of critical areas, which they recognize from experience or visual inspection. In a sense, humans are biased. These are good biases, driving quick decisions on how to use precious compute cycles.
On the other hand, think about how drones collect data: they are flown over a structure in a way that is almost always independent of the actual data collected. A drone may spend the same three seconds collecting data from an unremarkable area of a structure as from a critical one. Unless the operator has immediate visual feedback on what the drone “perceives,” the drone will collect its data in an unbiased way. The result may be the need to collect additional data post-flight.
AI at the compute edge, ideally on the drone itself, can bias data collection by directing the drone to the pertinent structure, then to sub-areas in the structure, and then to specific, probable anomalies. This can range from increasing the time spent collecting data from interesting areas to engaging other sensors and alerting operators to take additional actions.
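The idea of “biased” collection can be made concrete with a small sketch. The function below is purely illustrative, not any vendor’s flight-planning API: it assumes a hypothetical onboard anomaly detector that scores each area of a structure between 0 and 1, and allocates dwell time accordingly instead of spending a fixed three seconds everywhere.

```python
# Hypothetical sketch of edge-AI biased data collection: an onboard
# anomaly detector scores each area, and the planner allocates dwell
# time proportionally. All names and thresholds are illustrative.

def plan_dwell_times(area_scores, base_seconds=3.0, max_seconds=12.0):
    """Map each area's anomaly score (0..1) to a dwell time in seconds.

    Instead of spending the same fixed time everywhere, the drone spends
    more time over areas the onboard model flags as probable anomalies.
    """
    plan = {}
    for area_id, score in area_scores.items():
        # Scale dwell time linearly between the baseline and a capped max.
        plan[area_id] = round(base_seconds + score * (max_seconds - base_seconds), 1)
    return plan

# Example: scores from a hypothetical onboard detector.
scores = {"insulator_351": 0.9, "tower_leg_2": 0.1, "antenna_mount": 0.4}
print(plan_dwell_times(scores))
```

A real system would also cap total flight time and feed the plan back into the autopilot; this sketch only shows the prioritization step.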
Today’s drones are built with enough compute power to accommodate edge intelligence, from powerful Snapdragon processors to even more powerful Nvidia GPUs. These processors can all serve as substrates for AI running in real time to power this mission-critical ability.
Cloud AI and analyzing data for insights
Once data is collected and uploaded, the hard analytics task begins. Analytics tasks range from taking inventory of structures and mapping the visual appearance of an object, to extracting more detailed information such as model and year of production, to finding defects on that object, including corrosion and damage.
Each mission collects thousands of still images or hours of high-definition video. Every frame a human analyzes carries not only an economic cost but also a certain probability of error.
Given the large number of frames and the multiple layers of information, the chance increases that the analyst will miss a key frame, the one where the specific item appears or the defect is most visible.
One of the leaders in the use of AI for mechanical inspections is Neurala, a well-known deep-learning software firm. Similar services include Intel Insights and General Electric’s Avitas. As Max Versace, Neurala co-founder and CEO, explains, “Unlike humans, AI does not experience fatigue. AI systems can be trained to perform multiple tasks, from inventory to classifying a 3D image and finding tiny defects in it.” Of course, performing these tasks successfully is predicated on good AI, trained with good data.
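The post-flight analytics step described above can be sketched as a simple triage loop. This is not Neurala’s, Intel’s, or GE’s actual API—just a minimal illustration of the pattern: a trained model scores every frame, and only frames above a defect threshold are queued for human review, so the analyst inspects dozens of candidates instead of thousands of raw frames.

```python
# Illustrative post-flight triage loop (not any vendor's real API):
# score every frame with a trained model and surface only the frames
# most likely to contain a defect, highest confidence first.

def triage_frames(frames, score_fn, threshold=0.8):
    """Return (frame_id, score) pairs the model flags for human review.

    frames: iterable of (frame_id, frame_data) pairs.
    score_fn: stand-in for a trained defect classifier returning a
              probability of defect (crack, corrosion, ...) in 0..1.
    """
    flagged = []
    for frame_id, frame in frames:
        score = score_fn(frame)
        if score >= threshold:
            flagged.append((frame_id, round(score, 2)))
    # Highest-confidence defects first, so the analyst sees them early.
    return sorted(flagged, key=lambda pair: pair[1], reverse=True)

# Toy example: "frame data" here is just a precomputed score, to keep
# the sketch self-contained without a real model.
frames = list(enumerate([0.05, 0.92, 0.30, 0.85]))
print(triage_frames(frames, score_fn=lambda s: s))
```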
Predictive AI and knowing when to act
One of the reasons structures are inspected is maintenance. Detecting a crack or corrosion tells the analyst that the component may fail within the next 6 to 12 months, so maintenance should be planned to avoid downtime.
This requires additional, purpose-driven AI that can take historical data into account and relate it to specific outcomes. Similar to a medical doctor who remarks, “If you keep eating boxes of french fries, you may gain a few pounds,” predictive AI is able to diagnose future states of a system based on current and past data. Applications of AI to temporal predictions abound, from medical data to financial time series and network attacks.
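The idea of predicting a future state from past inspections can be shown with a deliberately simple model. The sketch below fits a linear trend to corrosion-severity readings from past inspections and extrapolates when the trend would cross a failure threshold. Real predictive-maintenance systems use far richer models; all numbers and names here are made up for illustration.

```python
# Minimal sketch of predictive maintenance: fit a least-squares linear
# trend to historical severity readings and extrapolate the time until
# the trend crosses a failure threshold. Illustrative only.

def months_to_threshold(history, threshold=0.8):
    """history: list of (month, severity) pairs from past inspections.

    Returns the number of months after the last inspection at which the
    fitted trend reaches the threshold, or None if severity is not
    increasing (no failure predicted by this simple model).
    """
    n = len(history)
    mean_x = sum(m for m, _ in history) / n
    mean_y = sum(s for _, s in history) / n
    slope = (sum((m - mean_x) * (s - mean_y) for m, s in history)
             / sum((m - mean_x) ** 2 for m, _ in history))
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_x
    last_month = history[-1][0]
    return (threshold - intercept) / slope - last_month

# Severity grew from 0.2 to 0.5 over 12 months of inspections:
# the trend reaches 0.8 about 12 months after the last inspection.
print(round(months_to_threshold([(0, 0.2), (6, 0.35), (12, 0.5)]), 1))
```

In this toy example, the model turns raw inspection history into the kind of "plan maintenance within N months" statement the analyst needs.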
AI will find a stable job in this department as well.
What to expect from AI in inspections
We would all want AI to be available on drones and in the cloud tomorrow, but a more realistic look at the inspection ecosystem suggests that some application areas will come sooner than others.
While edge AI is appealing in its rationale, the first application domain of AI will be post-processing, because this AI does not require any special hardware on the drone. It will be an “AI plugin” running on the enterprise software infrastructure.
Edge AI will be implemented after 1) the realization that post-processing AI is only as good as the data initially ingested and 2) more powerful yet lightweight processors have been developed for edge devices such as drones.
Once the two AI applications above are fielded, predictive AI will come into play, delivering a full, AI-powered software pipeline that maximizes data collection, insights and actionable intelligence for the enterprise.
Drone inspections are finally here, and, with AI under the hood, they are here to stay.
This article is published as part of the IDG Contributor Network.