
The tech stack is changing so fast that what is right today may be suboptimal tomorrow

Fri 1 Mar 2019 | Bruno Kurtic

With so much change taking place in IT today, it can be hard to keep track of what is happening within your own IT systems. Bruno Kurtic, co-founder at Sumo Logic, argues that it is time to look at machine data analytics to help keep up with these changes

Digital transformation has gone from a new approach to IT to a cliché over the past six months. However, more companies are moving to this approach. What do you think the consequences will be?

Whatever the phrase may mean within the word soup that we like to play with, it does not change the fact that digital transformation actually began in the early 90s when banks disrupted their own local branch business models with online banking. It is continuing today. Every week, I meet executives across industries like manufacturing, government, retail and others where their strategies revolve around writing better software to deliver better coupling of often traditional products with new digital services around those products.

In fact, I see digital transformation as orthogonal to the new approach to IT. IT is evolving to facilitate all the changing requirements brought on by digital transformation, but digital transformation still occurs within the old context of IT. The best at digital transformation are both pushing IT to adapt and adopting new IT services and models such as IaaS/PaaS, CI/CD, agile, DevOps, big data and SaaS in order to outmanoeuvre the competition.

What does this mean in practice? Companies that do this better will win, because they will be able to focus on their core business, innovate faster, and reduce cost.  That ability to innovate and improve continuously will depend on their ability to capture data generated by their digital properties, understand what that data is telling them about their customers, products and services, and use those insights to quickly improve how their products and services serve their customers.

You mention new technology adoption, and that you are seeing more of your customers adopting containers and serverless – what is driving this?

What’s driving this change is the adaptation of IT strategy to fit new digital business model strategies; it is a continuation of digital companies adopting modern IT to innovate faster. Containers, orchestration, serverless and similar technologies further facilitate agility and flexibility, and they can be used to meet specific goals.

Containers and orchestration technologies let enterprises build applications on a microservices architecture more efficiently. Together, these technologies and that architecture support autoscaling, fault isolation, easier maintenance and portability.
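The autoscaling mentioned above typically follows a simple control loop: compare observed load with a target and scale the replica count in proportion. As a rough sketch, the function below mirrors the proportional-scaling formula that Kubernetes documents for its Horizontal Pod Autoscaler; the code itself is illustrative, not actual orchestrator source.

```python
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float) -> int:
    """Proportional autoscaling rule: if pods are running at 90% CPU
    against a 60% target, we need roughly 1.5x as many pods."""
    return max(1, math.ceil(current_replicas * current_metric / target_metric))

# 4 pods at 90% CPU against a 60% target -> scale up to 6
print(desired_replicas(4, 90, 60))
```

The same loop scales down when load drops below target, which is what makes microservices on an orchestrator more efficient than statically provisioned monoliths.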

No technology executive wants to repeat the vendor lock-in problems of the 1990s. Containers, coupled with Kubernetes, are widely seen today as a way to insulate applications from the underlying infrastructure and facilitate a multi-cloud IT strategy.

Serverless functions are a cost-effective way of running applications or microservices that are intermittent and need bursts of computation, without the need to spin up even a container.
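To make the serverless model concrete: a function is just an entry point the platform invokes when an event arrives, so nothing runs (or costs anything) between bursts. A minimal sketch, following the AWS Lambda Python handler convention; the event shape and the `records` field are assumptions for illustration, not any specific vendor's schema.

```python
import json

def handler(event, context=None):
    """Hypothetical serverless entry point: invoked per event,
    with no server or container kept warm between calls."""
    records = event.get("records", [])
    processed = [r.upper() for r in records]  # stand-in for real work
    return {"statusCode": 200, "body": json.dumps({"processed": processed})}

# The platform calls this on demand; locally we can invoke it directly
print(handler({"records": ["a", "b"]}))
```

Because the unit of deployment is a function rather than a running service, intermittent workloads pay only for the bursts of computation they actually use.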

“With new cloud, open source and SaaS services becoming available, what is right today may be suboptimal tomorrow”

These technologies evolved to satisfy the requirements of digital business models and the software built to support them, not the other way around.

Do you think companies are getting the basics right here? Or are they seeing problems that they shouldn’t?

Broad-based evolution has been spurred by localised coups and revolutions within companies, so I’m not sure that anyone really has it “right” yet. The tech stack is changing so fast, with new cloud services, open source and SaaS services becoming available, that what is right today may be suboptimal tomorrow. I asked a tech executive at a manufacturing company whether he felt their new application was modern in terms of tech and architecture. He answered that nine months ago it was bleeding edge, but that he fears it is ‘antiquated’ now.

The new “right” here is fast adaptation to change. For enterprises, that means not trying to beat AWS, Azure or GCP at running data centres, but focusing on adopting what is on offer and on continuously improving their business model and supporting technology strategy together. That also means capturing data from digital properties, apps and infrastructure – data like logs, metrics, events and transactions – that can tell you how well you are serving your customers and what your customers are doing. Using this data to adapt over time is a must.

How do you think companies can learn from each other about all these new approaches?

“Best practices” are evolving as fast as the underlying technology. Two years ago, Kubernetes, containers and serverless were in single-digit adoption across enterprises; now a third of enterprises are using them in production.

Another big challenge in getting it “right” today is the lack of skills to adopt and apply all this new tech goodness. It is important that enterprises get insight into the broader community’s best practices and benchmarks in aggregate, so they can better judge which technologies to adopt, how to run them, how to tune them and what to expect of their behaviour.

How do you view cloud and security now – what is the biggest challenge?

People fear the unknown; that is in our nature. However, as we enter the attic and turn on the lights, we realise that it is not so bad in there. The increase in automation, the separation of responsibility between IT teams and cloud providers, and the homogeneous tech stack all come together to make cloud more observable, faster to update and ultimately more defensible.

On an industry level, multi-cloud is still new but about 15 percent of our customers are running across at least two clouds. There are currently still only a small number of security technology vendors adapting to this new world and that can make it difficult for enterprises. We spend a lot of time and money integrating with all three leading providers to help this future wave of multi-cloud workloads. We also take a consultative approach and share our internal best practices with our customers as we’ve been running securely across clouds and regions for almost a decade now.

Running inside a porous infrastructure, where outside companies control some layers and you are responsible for others, requires a different mindset and architectural approach. One must assume that bad actors could be on the same physical machine. Once you get comfortable with that, you will do everything more securely than you have ever done previously. This is why lifting on-prem architectures into the cloud is not ideal. Something designed on the assumption that it sits behind a thick brick wall is unlikely to fare well when its soft, gooey centre is exposed.

Join me at DevOps: Live

Sumo Logic is exhibiting at DevOps: Live, ExCeL London 12-13 March. DevOps Live and its colocated events attract over 20,000 IT business leaders, decision makers and professionals.

Experts featured:

Bruno Kurtic

Co-Founder and Vice President of Products and Strategy
Sumo Logic

