
Carlos Escapa: “AI is not a strategy”

By Pablo Yannone Sancho, Journalist at GLTH



“A technologist at heart.” This is how Carlos Escapa defines himself, as he has dedicated most of his career to building global markets for technology companies. Four years ago, he “came across legal tech” through a good friend, Albert Ferré, who motivated him to join the GLTH.


“Something that motivates me is that I like to see how science and technology are applied in the world,” says Carlos. “And working with lawyers is particularly interesting because they tend to be extremely inquisitive people and up to speed on what happens in the world from social, legislative, and other perspectives.”


He is very focused on data technologies because, after all, data “underpins social networks, artificial intelligence, logistics, and all manner of new developments in socio-economic trends.”


The right question is data, not AI

In terms of improving operational efficiency and decision-making, Carlos explains that AI provides “statistical techniques” that allow people “to learn from existing data, or rather, exclusively from data related to phenomena that have taken place in the past.” This, in turn, allows you to make “predictions.”


This is why, after all, AI strategies do not make sense, whereas data strategies do: “Companies should have a data strategy, and then based on the data strategy, they can decide how and when to apply AI. But AI is not a strategy.” What would be, then, a good example of a data strategy?


Well, firstly, you have to treat your own data as a “competitive asset.” The distinction becomes clear when we talk about “foundation or customized models”: foundation models are available to millions of other people around the world and are largely trained on publicly available data that many other companies can also access. Foundation models do not provide competitive differentiation.


On the other hand, customized AI allows companies to “enrich foundation models with your own data,” which, according to Carlos, “embodies competitive differentiation.” After all, it's not a matter of leveraging AI. The right question is “How do you leverage your proprietary data that nobody else has access to?” “How do you use data that tell you what customers need, how they're using your products, what they're complaining about, what new things they're asking for, why do they contact you in the first place, how long they invest learning about your product or using it, and so on?”


“All of this metrology inherent in the customer-provider relationship goes mostly unutilized. If you can apply AI to this data, then that allows you to derive insights, and you can come up with new products and services based on that analysis.”


To sum up, building a model from scratch is not necessary. As Carlos says, there are large open-source communities where “people come up with base constructs that other people can leverage.” That is why, instead of building your own model, it is much easier to use techniques like “transfer learning,” “continued training,” or “fine-tuning” in order to leverage what the community has already done.
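Carlos's point, enriching a shared base with your own data rather than building from scratch, can be sketched in miniature. The example below is a toy stand-in, not a real foundation model: a frozen feature extractor plays the role of the pretrained base, and "fine-tuning" trains only a small classification head on hypothetical proprietary data (the support-ticket texts and labels are invented for illustration).

```python
import math

DIM = 16

def base_features(text):
    """Frozen 'foundation' encoder (toy stand-in): hashed character counts, L2-normalized."""
    vec = [0.0] * DIM
    for ch in text.lower():
        vec[ord(ch) % DIM] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def train_head(examples, epochs=200, lr=0.5):
    """'Fine-tuning': train only a small logistic-regression head; the base stays frozen."""
    w, b = [0.0] * DIM, 0.0
    for _ in range(epochs):
        for text, label in examples:
            x = base_features(text)
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - label  # gradient of the log-loss with respect to z
            for i in range(DIM):
                w[i] -= lr * g * x[i]
            b -= lr * g
    return w, b

def predict(w, b, text):
    x = base_features(text)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical proprietary data: support tickets labeled complaint (1) vs. request (0)
tickets = [
    ("my order arrived broken", 1),
    ("the app keeps crashing", 1),
    ("please add dark mode", 0),
    ("can you support csv export", 0),
]
w, b = train_head(tickets)
```

The design choice mirrors the interview: the base encoder is shared and undifferentiated, while the trained head, built from data only this company holds, is where the competitive differentiation lives.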


Some tips before using a model

Firstly, we have to know the problem and whether the model we want to apply is suitable: “I always invite my customers to work backwards from a business problem and then we will find out whether we need to apply models and what kind of models in order to solve the problem.”


Secondly, having a good advisor is necessary: “You must have a good team of people advising on whether the application of the technology is appropriate for each use case, and take into account legal and ethical considerations. You definitely do not want the machine learning engineers to make those decisions. They lack perspective.”


And finally, do not underestimate AI: “We have to understand that this technology can be extremely powerful. Therefore, it can easily go the wrong way.” On this last point, Carlos notes that legislation such as Europe's AI Act plays an important role: “This technology can easily be applied in ways that can make citizens feel vulnerable and impinge on their civil rights. So you definitely want a committee, a group of people, a small village with a different perspective.” Such a team would evaluate the risks, the specific uses, and the effects of the model.


Know more about Carlos

When he was growing up, Carlos did not have “grandiose ideas” like “building billion-dollar businesses or making massive amounts of money.” His dream was to become a computer programmer, which was not easy in his environment, where unemployment was “extremely high.”


“If there's one thing that the world is a lot better at today, it's about allowing people to discover their talent and to develop it,” says Carlos. “We have much more capacity to acquire knowledge than we did when I was growing up.”


Stepping further back in time, those opportunities were even scarcer. People then had no access to today's technological developments, and yet, as Carlos puts it, “from an emotional and organizational point of view, they were very strong and there's a lot that we can learn from them.”


Indeed, for Carlos, history can teach us a lot, though “we don't teach history accurately”: too much war and conflict, but what about peaceful periods? “We don't talk about it, which I think is a shame,” says Carlos. He would love to be able to visit Baghdad in the year 1000, home to the “House of Wisdom,” where, Carlos explains, the Abbasid Caliphate “pulled together people from many different cultures, many different countries, and they were valued for their knowledge.” “That exchange of knowledge must have been unlike anything that we have available today in the year 2024,” he concludes.


Well, nowadays, instead of a “House of Wisdom” we have social networks, an example of how technology and AI can harm our “mental health” and the people we love.


“Children do not play like they used to. They're glued to a screen,” Carlos says. “And people get more lonely because they spend a lot more time in front of the screens, they do not communicate enough and they don't enjoy each other's company as much as they used to.”


Following this line, Carlos has something to tell his younger self: “When I was a teenager, I thought that the only important thing was to get an education. I didn't pay anywhere near enough attention to emotional intelligence and the ability to have and to manifest empathy. I could listen to people, no doubt, I could understand what they were saying. But if there is something about myself that has always been an improvement point, it is that I take things too literally. And being able to read more into a context and to absorb how other people feel about a given subject and other emotions is something that I didn't really learn.”


Without a doubt, this is good advice for everyone, especially when technology tends to make us take things for granted: look after your mental health and do not idealize AI, because, like everything around us, it has two faces.


Data AI/ML Global Practice Lead, Accenture-AWS Business Group, Amazon Web Services (AWS)




