3 common machine learning mistakes to avoid | Tech News
I’m a big fan of cloud-based machine learning and deep learning, and AI in general. After all, you can’t be a geek without imagining having a conversation with an artificially intelligent being that can answer questions and carry out your bidding!
That said, I’m also seeing cloud-based machine learning and deep learning misapplied over and over again. Most of these mistakes have easy fixes, and cloud-based machine learning is certainly here to stay. But use it wisely and appropriately.
Here are the top three recurring mistakes that I’m seeing.
1. Using machine learning without enough data
Machine learning without any learning is worthless. The true use case for machine learning is applying algorithms to massive amounts of data so that patterns emerge, and those patterns become the training for machine-learning-based applications.
So, no data, no learning. Although a machine learning application can gather data over time and become smarter, it needs a jumping-off point where there is enough data to teach the system how to think in the first place.
For example, there are machine learning systems operating in hospitals that perform the dark art of predicting your likelihood of dying during your hospital stay. Without at least 100,000 data points, you can count on that likelihood being either 0 or 100 percent, which is not helpful.
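To see why too few data points collapse to those useless extremes, here is a minimal sketch of a naive frequency-based estimator. Everything here is hypothetical illustration (the function name, the record fields, the tiny dataset); it is not a real clinical model, just a demonstration of how a handful of records forces every estimate to 0 or 1.

```python
# Hypothetical sketch: a naive frequency-based outcome estimator.
# With only a handful of records, every estimate collapses to 0.0 or 1.0,
# which is the "0 or 100 percent" problem described above.

def estimate_mortality_risk(records, patient_age_group):
    """Share of past patients in the same age group who did not survive.
    Pure illustration, not a real predictive model."""
    matching = [r for r in records if r["age_group"] == patient_age_group]
    if not matching:
        return None  # no data at all: no estimate is possible
    deaths = sum(1 for r in matching if r["died"])
    return deaths / len(matching)

# Only three records: every answer is an extreme.
tiny_dataset = [
    {"age_group": "70+", "died": True},
    {"age_group": "70+", "died": True},
    {"age_group": "30-50", "died": False},
]

print(estimate_mortality_risk(tiny_dataset, "70+"))    # 1.0, i.e. "100 percent"
print(estimate_mortality_risk(tiny_dataset, "30-50"))  # 0.0, i.e. "0 percent"
```

With hundreds of thousands of records per group, the same frequency estimate would start producing graded, usable probabilities; with three, it can only say "certain death" or "certain survival."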
2. Using machine learning where it’s not needed
This is the most common failure I’m seeing, resulting in companies spending three or four times the development cost to use machine learning in an application, for absolutely no reason. Machine learning systems simply offer no real advantage in many use cases.
Procedural logic works well most of the time, so building a knowledge base for, say, an accounting system or a scheduling system is just over the top. Worse, the resulting applications are much less efficient.
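For a sense of what "procedural logic works well" means in practice, here is a hypothetical scheduling rule: finding the next open appointment slot. The function name and the whole-hour slot model are made up for illustration; the point is that a deterministic loop does the entire job, with no training data or model required.

```python
# Hypothetical sketch: a scheduling decision that needs no machine learning.
# Plain procedural logic covers the requirement deterministically.

def next_available_slot(booked_slots, opening_hour=9, closing_hour=17):
    """Return the first free whole-hour slot in the business day,
    or None if the day is fully booked. Illustrative only."""
    for hour in range(opening_hour, closing_hour):
        if hour not in booked_slots:
            return hour
    return None

print(next_available_slot({9, 10, 11}))  # 12
```

A machine learning model trained on past bookings could "learn" the same answer at several times the cost, and without the guarantee that the rule is applied correctly every time.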
3. Not understanding the performance impact
Embedding machine learning systems in applications can sometimes make them much more valuable to the business. However, it can also kill the application performance.
Think about it: An embedded machine learning service could add several seconds of latency as it runs algorithms across the data. If the application needs to respond in near real time, any value from machine learning quickly evaporates, considering the productivity lost to delayed responses.