We are in a golden age of Artificial Intelligence discoveries. Researchers at Google Brain, DeepMind, OpenAI, and other companies of all sizes are finding new ways to build, train, and apply AI models across a variety of domains.
However, the majority of AI projects applied to business problems either fall short of expectations or fail outright. What can your company do to improve the odds of success?
To find out, we recently sat down with executives from three leading St. Louis-based companies (Charter Spectrum, Mastercard, and Spire), as well as our own Kit Menke, at the prepare.ai conference (the Midwest’s premier AI & technology conference) for a panel discussion, “The One Thing Every Company Must Do Before Leveraging AI”.
Our Speakers
Julius Ming'Ala, an Agile Engineering Lead at 1904labs, led the discussion with local AI leaders.
- Dan Carmody - Director of Data Analytics and Intelligence at Mastercard
- Henry Campbell - Director of Advanced Analytics at Spire
- Jonathan Andrews - Senior Director, IT Information Governance at Charter Spectrum
- Kit Menke - Data Engineering Practice Lead at 1904labs
Setting the Stage
Building models is not the first step in adopting AI technology. Yet, many companies launch AI initiatives without having the right data infrastructure in place to effectively support their efforts.
If you don’t take key steps with your data infrastructure upfront, your AI projects have a strong propensity to fail.
During the discussion, we learned how leading companies lay the foundation for success in their large-scale, business-critical data projects. Some key takeaways follow, and you can also watch the full video of the discussion below.
Key Takeaways
Know the data users and understand their needs
Whether the users are data scientists, business analysts, fraud analysts, or business users, the data needs to be built for its intended purpose. The way to ensure this is to work with the data users directly: build a relationship with them and revisit their needs on a regular basis.
“Not all data is created equal. It must be fit for purpose, because one piece of data is valid for one type of analytics, but is completely unusable for another.” - Jonathan Andrews, Charter Spectrum
Be scalable from the start
Telecommunications/internet, finance/tech, energy/IoT, and many other industries generate huge amounts of data - hundreds of millions or billions of events a day. Plan for big data from the start by building everything to be scalable, even if you are starting with small projects.
“Everything that we built, we didn’t have the luxury to build something small." - Dan Carmody, Mastercard
Make data available as fast as possible, while staying confident in it
Getting data to users as fast as possible is important, but not important enough to sacrifice quality. Strike a balance between delivery speed and the quality checks and enrichment your users need. As you continue to develop the solution, you can improve the speed.
Be willing to pivot as technologies change
Technologies change, so what was the best choice for your solution last year might not be the best choice this year. You need to be able to pivot quickly in order to keep your solution optimized and continue serving the data users as well as you can.
“We are in the process of upgrading our infrastructure across the board... We want to be more responsive to our customers, so we’re installing newer technologies where we’re bringing in more data." - Henry Campbell, Spire
Final Insights: What is One Thing Every Company Must Do Before Leveraging AI?
Each of our panelists provided a different perspective on this question based on the unique needs of their company.
- Jonathan Andrews - Make sure your data is fit for the purpose it will be used for, accounting for the speed, volume, and variability of the data.
- Dan Carmody - Make sure you have a working relationship with the AI team so you can meet their data needs.
- Henry Campbell - Focus on operationalizing the data and analysis by connecting people to where it is operationalized. Keeping those connections in place keeps the real intent of the model in view.
Finally, Kit Menke wrapped up the panel with the perspective from a technology partner:
“Trust your data by leveraging good DataOps principles: treat your SQL like the code it is, have good automation and testing built in from the beginning, and implement automated data quality checks at every step of your pipeline, so that you and your users have confidence in the data you are delivering to production.” - Kit Menke, 1904labs
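To make those DataOps principles concrete, here is a minimal sketch of the kind of automated data quality check that could run after each pipeline step. The database, table, and column names (pipeline.db, events, event_id) are hypothetical; a production pipeline would likely wrap checks like these in a testing or orchestration framework.

```python
import sqlite3

# A minimal sketch of an automated data quality check, run after a pipeline
# step. The database, table, and column names below are hypothetical.

def check_table(conn: sqlite3.Connection, table: str, key_column: str) -> None:
    """Fail fast if the table is empty or the key column contains NULLs."""
    cur = conn.cursor()

    # Check 1: the pipeline step actually produced rows.
    row_count = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    assert row_count > 0, f"{table} is empty"

    # Check 2: the key column has no NULLs (needed for downstream joins).
    null_count = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL"
    ).fetchone()[0]
    assert null_count == 0, f"{table}.{key_column} has {null_count} NULL values"

if __name__ == "__main__":
    conn = sqlite3.connect("pipeline.db")  # hypothetical pipeline database
    check_table(conn, table="events", key_column="event_id")
    print("Data quality checks passed.")
```

Checks like these, versioned alongside the SQL they validate and run automatically at every step, are what let both your team and your data users trust what lands in production.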