Texas entrepreneur Dave Copps has launched Hypergiant Sensory Sciences, which uses AI to help companies understand their physical environments. Copps told VentureBeat his new startup is backed with “more than $5 million” in a first round of funding.
The Dallas-based company will use sensor networks and deep learning to help companies track what is going on in their environments, offering various applications, but starting with security for critical infrastructure. For example, an oil or gas company will be able to use the company’s software to automatically track sand trucks driving in and out of an oil well property, perceive how full they are, observe other patterns, and proactively alert operators of anything unusual.
Copps, who sold his previous company Brainspace last year to cybersecurity company Cyxtera as part of a $2.8 billion transaction, said his new company wouldn’t be limited to the natural language processing focus area of Brainspace. It would extend to visual analysis, in particular. “We’re building a company to augment human perception,” he said in an interview.
Most corporate observation systems rely on humans, Copps said, but humans can only attend to so many things at once. That limits their ability to take appropriate action in many cases. For example, companies seeking to visually track their environments might put up 50 cameras and have operators monitor the feeds on a single screen. If something bad happens, operators might have to go back and review the tape to see what went wrong, after the fact. “We want to replace that with a single model,” Copps said. “If something’s about to happen, there’s an alert, and you can pop over and investigate.” To allow predictive alerts, the model will include intelligence from patterns learned over time. Moreover, if a company has 100 oil wells, an AI-driven system could conceivably track all of them automatically, with learnings transferred between them.
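The alerting Copps describes — learn what “normal” looks like from historical patterns, then flag deviations — resembles classic time-series anomaly detection. A minimal, purely illustrative sketch (not Hypergiant’s actual model, and the truck-count metric is a hypothetical example):

```python
# Hypothetical sketch of pattern-based alerting: learn a baseline
# from historical observations, then flag readings that deviate
# sharply from it. Illustrative only -- not Hypergiant's system.
from statistics import mean, stdev

def build_baseline(history):
    """Summarize historical readings (e.g., sand trucks per hour)."""
    return mean(history), stdev(history)

def is_anomalous(reading, baseline, threshold=3.0):
    """Alert when a reading sits more than `threshold` standard
    deviations away from the learned mean."""
    mu, sigma = baseline
    return abs(reading - mu) > threshold * sigma

history = [12, 14, 11, 13, 12, 15, 13, 14, 12, 13]  # trucks per hour
baseline = build_baseline(history)
print(is_anomalous(13, baseline))  # typical traffic -> False
print(is_anomalous(40, baseline))  # unusual surge  -> True
```

A production system would learn far richer patterns (seasonality, object interactions, visual features), but the shape is the same: model the normal, alert on the abnormal.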
Hypergiant Sensory Sciences’ first round was led by Align Capital of Austin, Texas and includes Capital Factory and GPG Ventures, among others. Besides Copps, cofounders include Chris Rohde and Ben Lamm.
Hypergiant Sensory Sciences is launching within a wider syndicate of companies called Hypergiant Industries, also cofounded by Ben Lamm. That syndicate, founded earlier this year, aims to serve companies with artificial intelligence solutions, as well as invest in other AI companies.
Hypergiant Space Age Solutions, which launched earlier this year as the syndicate’s commerce services division, is now doing “significantly more than $10 million” in revenue, and will be at 100 employees by the end of the year, Lamm told VentureBeat. Customers include GE Power, Shell, and Apollo Aviation.
Snagging Copps is a coup for Lamm, given Copps’ early track record in AI. Brainspace built its product on a natural language processing approach called latent semantic analysis (LSA), which gave companies an easier way to sift through millions of documents and make meaning of them. In lawsuits, investigators sometimes need to sort through thousands of email threads, and tracking crimes can be difficult when code words and obfuscation are used. That’s what Brainspace helped with, by understanding language correlation and semantics.
While LSA had been under development for at least a couple of decades, it had some limitations. Brainspace reworked the approach so that it could work at scale, applying its algorithms to terabytes of data.
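At its core, LSA factors a term-document matrix with a truncated singular value decomposition, so that documents can be compared in a low-dimensional “concept” space rather than by exact word overlap. A toy sketch of the idea (illustrative only; the terms and documents are invented, and Brainspace’s production system operated at a vastly larger scale):

```python
# Minimal sketch of latent semantic analysis (LSA): factor a small
# term-document matrix with SVD, keep the strongest latent "topics",
# and compare documents in that reduced concept space.
import numpy as np

# Rows = terms, columns = documents (raw term counts, invented data).
docs = np.array([
    [2, 1, 0],   # "oil"
    [1, 2, 0],   # "well"
    [1, 1, 0],   # "drill"
    [0, 0, 2],   # "lawsuit"
    [0, 1, 1],   # "email"
    [0, 0, 1],   # "court"
], dtype=float)

# Truncated SVD: keep only the k strongest latent dimensions.
U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T  # each row: a doc in concept space

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Docs 0 and 1 share drilling vocabulary; doc 2 is legal language.
print(cosine(doc_vectors[0], doc_vectors[1]))  # high similarity
print(cosine(doc_vectors[0], doc_vectors[2]))  # much lower similarity
```

Because similarity is computed in the latent space, documents can match even when they use different but related vocabulary — the property that made the technique useful for e-discovery.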
Brainspace picked up its first customer, LexisNexis, around 2008, when the company had only four employees. Later, after the infamous BP oil spill in 2010, 17 different law firms used the company’s software for e-discovery in lawsuits, to find out who at the company said what, and when. Today, Brainspace’s software is used by most major consulting firms.
After leaving Brainspace earlier this year, Copps said his next move would have to be the right thing. “I have one more move, then that’s it for me.”
For this new venture, Copps says he sees few competitors. Other companies are doing different aspects of what he plans to do, but none offer a unified, learning approach, he says. Some are doing object recognition, and others do AI modeling. “We’re doing the magic of pulling it all together, and innovating on the deep learning side. We’re applying AI to understand how objects are interacting with each other, and extracting meaning.”