
The AI struggle is real: Thoughts from Web Summit 2019

November 11, 2019 / 5 min read

Described as “Glastonbury for nerds” by The Guardian, Web Summit has been regarded as the must-attend tech event of the last few years. Rarely do you get the opportunity to see so many influential individuals and companies gathered in one place. Representing Peltarion, I had the opportunity to attend this year’s Web Summit in Lisbon together with the team at Business Sweden / Sweden Co-Lab, a cross-sectoral innovation and collaboration platform. Below are some thoughts from these fast-paced and inspiring days.

I find myself sitting in one of the chill-out areas at Web Summit 2019, reflecting on my experience so far. The first thing that comes to mind is that this has got to be one of the best conferences I’ve ever attended.

The summit, taking place from the 4th to the 7th of November in the beautiful city of Lisbon, Portugal, is the biggest tech event in Europe, bringing together over 70,000 attendees from 170 countries. Web Summit gathers founders and CEOs of some of the most renowned technology companies, fast-growing startups, policymakers and heads of state to ask one simple question: Where to next?

As expected, many of the well-established tech giants are present, talking about all the incredible work they have done and are planning to do. And there is nothing wrong with that. However, I’m more interested in hearing from the many buzzing startups, and over the last few days I’ve spoken to startups of all sizes. Web Summit categorizes these into three groups: Alpha, Beta and Growth.

Having worked for companies both small and large, I appreciate the grind of trying to convince someone that what you’ve got could be the next big thing.

At the inauguration of the Sweden Co-Lab booth by Ambassador Helena Pilsas Ahlin and Helen Rönnholm of Business Sweden

Challenges companies face when pushing the limits of AI

I’m here for a couple of reasons. The first, and most obvious, is to spread the word about the work we are doing at Peltarion to operationalize AI. The second is to get insight into the challenges companies face when pushing the limits of AI, specifically deep learning. I’m curious to hear what the pain points are when building and deploying deep learning models into production.

This is what I found out:

  • Data preparation. It quickly became obvious that data preparation is one of the main challenges in AI adoption. Almost every company I speak to begins by mentioning the dreaded data preparation. “Data preparation is the biggest nightmare and hurdle. A majority of our time is spent annotating a vast amount of data,” said Lorenzo De Matteri, Co-Founder and CTO of Aptus.AI. Simplifying data preparation is the biggest challenge, and opportunity, we currently face. At the moment, it tends to be a lengthy process of 1) sourcing a quality dataset and 2) processing that vast amount of data, which usually involves tedious coding. There are, no doubt, solutions on the market that make this process a little easier, but it will always require a touch of manual feature engineering.
  • Model building. Following data preparation comes the incredibly complex and critical phase of building a model. A common approach is to dive into the depths of TensorFlow. A reasonably advanced model typically requires a sufficiently technical mind to orchestrate several thousand lines of code, which in turn demands significant time, resources and collaboration. This was exactly the impression given by Alex Radovichenko, Co-Founder of Raccoon.World, when he told me, “It took us about eight months to create our initial model in addition to four months of fine-tuning.”
  • Deployment. Finally come the challenges of deployment. Even if a data science team develops a model that works, it will need to be reworked and adjusted to run in a live production environment. Moving from an initial prototype into production can demand a significant amount of work and investment in infrastructure. The code needed to support these systems is sometimes referred to as “plumbing” and may be significantly larger than the core model code itself. The process typically involves the data scientists handing the prototype over to the central IT organization to re-implement in a way that fits the requirements of the production environment. As you’d expect, this introduces new effort, as both the data and the model need to fit into the more complex live production environment.
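To make the first pain point concrete, here is a minimal, purely illustrative Python sketch of the kind of tedious hand-coding that data preparation tends to involve: normalizing raw text records, dropping incomplete rows and encoding labels before a model ever enters the picture. The function name and the sample data are hypothetical, not from any of the companies quoted above.

```python
def prepare(records):
    """Normalize text, drop incomplete rows, and integer-encode labels.

    `records` is a list of (text, label) pairs as they might arrive
    from a raw export; returns the cleaned pairs plus the label map.
    """
    label_map = {}
    cleaned = []
    for text, label in records:
        if not text or not label:                # drop incomplete rows
            continue
        text = " ".join(text.lower().split())    # normalize case/whitespace
        idx = label_map.setdefault(label, len(label_map))  # encode label
        cleaned.append((text, idx))
    return cleaned, label_map


raw = [
    ("  Invoice overdue ", "finance"),
    ("Server DOWN again", "it"),
    ("", "finance"),                             # incomplete: dropped
    ("reset my password", "it"),
]
data, label_map = prepare(raw)
```

Even this toy version hints at why the step balloons in practice: every new data source brings its own quirks, and each quirk means more bespoke cleaning code before training can even begin.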

One of the 4 Pavilions showcasing a mixture of vendors in addition to pitching stages

So is it all doomed to fail?

Not exactly.

Productivity improvements will come when you free up both individual data scientists and whole teams to focus on high-value activities, like modeling. The overall message I’m trying to get across is that it is paramount to spend the data science team’s time where it matters most.

Understanding the business and its problems

Without a doubt, there is a range of platform and product offerings that could positively impact productivity. For example, working with prepared datasets (nirvana) and an intuitive graphical interface for building and quickly iterating on models, training them, and evaluating and comparing results would have a huge impact.

Businesses looking to the future know that artificial intelligence will become central to achieving a competitive edge. While there are many barriers to achieving success, AI can be accessible to all.

The Peltarion Platform is continually evolving. Curious to learn more? Learn all about our revolutionary deep learning platform.

One of the many Web Summit installations scattered around beautiful Lisbon

  • Max Vassiades

    Enterprise Strategy

    Max Vassiades is part of the Enterprise Strategy team at Peltarion. With extensive experience in sales and marketing, he provides a unique approach to enterprises looking to solve complex business problems. Apart from deep learning, he also has experience with distributed computing, middleware, big data and databases.
