The Machine Learning Transformation
At Textron Systems, digital transformation is core to our strategy, not only in how we execute our business, but also in how our products and services are differentiated. A key enabler of digital transformation is leveraging an ecosystem of "as-a-service" offerings that bring the necessary capabilities and operational performance to our solutions. Beyond the elasticity these architectures provide, they also enable access to a marketplace of services such as machine learning, advanced analytics and proven algorithms. Machine learning represents an opportunity for commercial, civil and government organizations alike to drive powerful benefits such as cost reduction, productivity optimization and product differentiation. To achieve these benefits, organizations must implement machine learning with an eye toward avoiding its most common failure modes.
First and foremost, data is key when it comes to machine learning. The more data the advanced algorithms have to learn from, the higher the probability of producing desired and meaningful insights. There are inherent challenges to perfecting machine learning insights, starting with the cleanliness, or quality, and availability of good data sets to mine, or "learn," from. There are many applied examples where biases in the data have skewed the results and accuracy of machine learning insights. Training an algorithm is like training a body: the healthier the inputs, the better the chances of improved performance.
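To make the point about data quality concrete, the short sketch below (a hypothetical illustration on an invented toy data set, not code from our systems) trains a simple one-nearest-neighbour classifier while flipping a growing fraction of the training labels. All function names here are our own for the example; the pattern it shows is general: a model that memorizes dirty labels loses accuracy on clean held-out data.

```python
import random

random.seed(0)

def make_data(n, label_noise=0.0):
    """Toy 1-D dataset: feature x in [0, 1), true label = (x > 0.5).
    A fraction `label_noise` of labels is flipped to mimic dirty data."""
    data = []
    for _ in range(n):
        x = random.random()
        y = int(x > 0.5)
        if random.random() < label_noise:
            y = 1 - y  # corrupted label
        data.append((x, y))
    return data

def predict_1nn(train, x):
    """1-nearest-neighbour: memorises the training set, noise included."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

test_set = make_data(1000)  # clean held-out data
results = {}
for noise in (0.0, 0.2, 0.4):
    train = make_data(500, label_noise=noise)
    acc = sum(predict_1nn(train, x) == y for x, y in test_set) / len(test_set)
    results[noise] = acc
    print(f"label noise {noise:.0%}: accuracy on clean test data = {acc:.2f}")
```

As the noise rate rises, test accuracy falls sharply, which is why cleaning and curating the training set often pays off more than tuning the model itself.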
Equally important to developing successful machine learning insights is choosing the right algorithm for the right problem statement. While there is both an art and a science to picking the right algorithm, the services available across modern SaaS architectures make it possible to run models and validate confidence levels without needing a PhD in data science. By focusing on outcomes and developing the skills to know which model works best for which problem, organizations can apply the right algorithm to meet the business need.
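One common, service-agnostic way to choose among candidate algorithms is plain k-fold cross-validation: fit each candidate on part of the data and score it on the held-out remainder. The sketch below illustrates that idea on an invented toy data set with three deliberately simple, hypothetical "algorithms"; in practice the candidates would be models offered by the cloud service, but the selection logic is the same.

```python
import random

random.seed(1)

# Toy dataset (stand-in for real business data): x in [0, 1), label = x > 0.5
data = [(x, int(x > 0.5)) for x in (random.random() for _ in range(600))]

# Three hypothetical candidate algorithms; each returns a predict function.
def majority_class(train):
    vote = round(sum(y for _, y in train) / len(train))
    return lambda x: vote

def threshold_rule(train):
    # Pick the cut-off that best separates the training labels.
    best = max((sum(int(x > t) == y for x, y in train), t)
               for t in [i / 50 for i in range(51)])
    t = best[1]
    return lambda x: int(x > t)

def one_nn(train):
    return lambda x: min(train, key=lambda p: abs(p[0] - x))[1]

def cross_val_score(fit, data, k=5):
    """Plain k-fold cross-validation: average held-out accuracy."""
    fold = len(data) // k
    scores = []
    for i in range(k):
        val = data[i * fold:(i + 1) * fold]
        train = data[:i * fold] + data[(i + 1) * fold:]
        predict = fit(train)
        scores.append(sum(predict(x) == y for x, y in val) / len(val))
    return sum(scores) / k

scores = {name: cross_val_score(fit, data)
          for name, fit in [("majority", majority_class),
                            ("threshold", threshold_rule),
                            ("1-NN", one_nn)]}
best = max(scores, key=scores.get)
for name, s in scores.items():
    print(f"{name:9s} cross-validated accuracy = {s:.2f}")
print("selected:", best)
```

The held-out scores act as the "confidence levels" mentioned above: the weak baseline is exposed, and the better-suited models rise to the top without any deep theory being required of the user.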
As we look to increase machine learning model performance in applied areas like sensor and imagery analysis, we are leveraging large catalogs of data to train algorithms. Automating workflows or improving confidence in insights requires a level of accuracy and dependability in data labeling. Where scale is a challenge and access to quality labeled data is required to further mature models, we can leverage the ecosystem for synthetic models. Synthetic models are information objects manufactured, or engineered, by computers rather than captured from real-world events. They provide access to large amounts of accurately labeled data that could otherwise take years to catalog. Synthetics can also improve machine learning where the data varies across operating conditions, as in the case of differing pixel resolutions or in-motion changes.
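The appeal of synthetic data is that labels come for free: because the generator engineers each sample, it knows the ground truth by construction. The sketch below (a hypothetical illustration, not our sensor pipeline) manufactures labeled one-dimensional "signals" at several resolutions to mimic varying pixel densities, then checks that a simple rule can still separate the classes; every name and signal shape here is invented for the example.

```python
import random

random.seed(2)

def synth_sample(label, resolution):
    """Engineer a labelled 'signal' rather than collecting it from real events.
    label 1 -> rising ramp, label 0 -> flat line; `resolution` varies the
    number of sample points to mimic differing pixel/sample densities."""
    if label:
        base = [i / resolution for i in range(resolution)]
    else:
        base = [0.5 for _ in range(resolution)]
    noisy = [v + random.gauss(0, 0.05) for v in base]  # sensor-style noise
    return noisy, label

def synth_dataset(n, resolutions=(8, 16, 32)):
    # Labels are known by construction: no manual annotation needed.
    return [synth_sample(random.randint(0, 1), random.choice(resolutions))
            for _ in range(n)]

def slope(signal):
    """Crude resolution-independent feature: second-half mean minus first."""
    half = len(signal) // 2
    first = sum(signal[:half]) / half
    second = sum(signal[half:]) / (len(signal) - half)
    return second - first

data = synth_dataset(1000)
acc = sum((slope(sig) > 0.25) == bool(y) for sig, y in data) / len(data)
print(f"{len(data)} perfectly-labelled synthetic samples; "
      f"simple slope rule accuracy = {acc:.2f}")
```

Generating a thousand accurately labeled samples takes milliseconds here, versus the months or years a comparable hand-labeling effort could require, and varying the resolution parameter bakes the "interactive conditions" problem directly into the training set.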
For our organization, the benefits of leveraging machine learning and its improved insights will include automated workflows that shift human effort from discerning raw data inputs to more value-added, decision-oriented steps. Beyond improved automation, machine learning also offers the ability to learn at scale for predictive insight.
At Textron Systems, we are "All in for Autonomy," and that holds true not only for the tremendous products and services we deliver to our customers, but also for the solutions Information Technology brings to enable our users and program teams; solutions that are core to enabling our strategy.