
DevOps Decrypted: Ep.23 - The Evolution and Devolution of AI

In this episode, we're still riding high after our last episode with Gene Kim, whose words are still echoing in our minds as we talk about our upcoming business transformation ebook, hackathons, and, of course… AI.
Once again, artificial intelligence dominates the news, as OpenAI launches Sora – and deepfakes make for headlines we never thought we’d read. We discuss the ethical ramifications of AI leading to job cuts, and whether DevOps roles are on the line.
We also cover cloud repatriation, ongoing big tech layoffs, and the latest move around IPs from AWS. And we all learn a new favourite word – it was 2023’s word of the year, no less!
This was a really fascinating discussion, and it's already spawned a whole bunch of new launchpads for future episodes. Stay tuned for more, and please give us your feedback on the show at devopsdecrypted@adaptavist.com.
Laura Larramore:
Hi, everyone, and welcome to DevOps Decrypted, where we talk about all things DevOps. I'm your host, Laura Larramore, here with our Adaptavist panel: Jobin, Rasmus and Matt.
Hey, guys, how are you doing?
Rasmus Praestholm:
Hey hey!
Jobin Kuruvilla:
Hey, Laura, how are you?
Laura Larramore:
Today, we're gonna talk a little bit later about an analogy that compares Giphy to AI – and what's snowballing there. We're gonna talk a little bit about what's in the news, and we're gonna talk a little bit about what we've been up to.
So, what have you guys been up to?
Jobin Kuruvilla:
One of the things that I have been working on is the Gitlab hackathon.
I don't know if you guys have heard, but Gitlab is actually organising a hackathon themselves.
And, being a strategic partner for Gitlab, we do have a specific interest in it. So we are also taking an interest in what's going on there and how we can actually pitch in with some ideas. You know, I've been part of a lot of hackathons in the past – they're a good way to contribute ideas and thoughts.
So, I think we as a team are putting some things together for this hackathon, but again, Adaptavist is a big group. It's not just my team. So… Matt and others. I don't know if you are aware of this hackathon or working on anything?
Rasmus Praestholm:
I think I did throw some ideas on Slack about it, in that I wish that Gitlab had something like a Backstage…
And, you know, easier webhooks and integrations – like, somebody should have a ScriptRunner-type thing that works on Gitlab. I think that could be cool... Right?
Jobin Kuruvilla:
I think you're giving away your ideas too soon. Hmm.
Matt Saunders:
That's okay, though, I think? So I'm a bit 50/50 on hackathons sometimes.
Sometimes you feel like you can spend too long planning and working on what you're gonna do on these hackathons. And then you get to the actual time, and you've kind of already done all the work – and you've realised how your organisation doesn't necessarily support the work that hackathon's doing… but I like the idea of this Gitlab one, because Gitlab, famously – I can't remember the exact term they used to describe themselves – but they're more of an open company.
You know. They do almost all of their business using merge requests and open documentation.
There's very little that's actually kept secret within Gitlab.
Disclaimer, obviously, we know a few people in Gitlab due to our partnership with them. So I think this is a great idea to get engagement across the community.
But it wouldn't work in all organisations.
I think you have to look culturally at how you build an organisation to support something like this. And the reality is that you've gotta have the openness, you've got to have that collaboration, you know – no impediments to being able to fail on things – to support this.
So yeah, I'm really excited about it. And I especially like the fact that they've gone into some detail on what areas they want to do hackathon work in. So maybe some of it is getting other people to do their work for them…
But to be honest, seeing all this stuff out in the open, I think, is a really, really good thing.
So yeah, that's how we can contribute to it.
Rasmus Praestholm:
I definitely agree because I have looked at old issues and merge requests on Gitlab that are about Gitlab.
And it's both awesome because of how open it is. And it's also like the ultimate dogfooding.
And even though something like Github tends to get more of the open source projects, they themselves are not open source in the way that Gitlab is. So you know, kudos to Gitlab for that.
Speaking of.
I also just remembered that they had an invite out for another event – like an AI thing – coming up this Thursday. So that might be something to sign up for and attend.
Jobin Kuruvilla:
And there is an emphasis on AI even on the hackathon. I think that's becoming the trend now. You know. Put everybody's minds together and see what we cannot see in the AI space, and, you know, go for it.
Matt Saunders:
I'm going to be cynical again.
So yeah, AI – we're all talking about it, everybody's talking about it. Hey, what about all these things we could do with AI, fantastic – but can we actually do something? Maybe we can here.
Because, you know, there are clear and detailed descriptions on this hackathon page – about.gitlab.com/community/hackathon – you know, real tangible things you can do with it. So yeah, bring it on.
It's another way where the advantage of this stuff being open is going to play into people's hands, because I think we're all struggling for, you know, real, proper, game-changing implementations of AI without the hype.
And yeah, collaborating, like many people across the world, on it, on a product that everyone loves, or that lots of people love, sounds like a great way of doing it to me.
Rasmus Praestholm:
Yeah, yeah. Now, tell us something about this ebook we have that's coming out.
Matt Saunders:
So it's about mastering digital transformation – the power of DevOps practices and culture in supercharging software delivery…
I had a review of this. It's been put together by a lot of very good people – Jobin, I think you're one of them within Adaptavist – and I was reading it and thinking, yeah, that kind of makes sense.
Yeah, this is good. This is… nothing controversial in this. And I think it's a really, really good distillation of the best practices that we see across the industry for how to do things and do them really, really well. And yeah, there are a few things in there that caught my eye as well.
This is where the Adaptavist angle adds more value than just whatever else you can read out on the internet.
But yes, a really, really good read, particularly about the importance of collaborating, communicating, and letting people work with autonomy and empowerment.
And so, yeah.
Jobin Kuruvilla:
Yeah, and quite a few things, you know, coming from our own experience working with customers. For example, how do you scale DevOps in a big enterprise? Right?
And what are the challenges that come along with it? Because, you know, there's always a challenge when you are doing transformation in a big business. I mean, if you look at the numbers, a lot of transformations actually fail.
Part of the reason is, you know, people are probably coming across the same challenges. So we are talking about challenges in digital transformation, the common roadblocks that we see – from talent acquisition all the way to, you know, regulatory compliance and stuff like that.
So yeah, interesting read. Just let us know what you think about the ebook.
Rasmus Praestholm:
Still though, what about the most important part? How many times does it mention AI?
Jobin Kuruvilla:
Ha!
Well, not in that ebook. But I'm pretty sure people will be coming back with, you know, questions about AI, even in the DevOps space.
Can we even go on one podcast without mentioning AI?
Rasmus Praestholm:
I don't think so. And it's all over the news, too. Of course.
Laura Larramore:
AI made the Super Bowl ads. Did you guys see that?
You guys might not be the Super Bowl people that I am! It's everywhere!
Rasmus Praestholm:
Well, other than that, I know that there are still a bunch of events that Gene Kim is running, since we had him on the podcast last episode – that was great.
Matt Saunders:
If you're a regular listener to the podcast, our last episode was entirely Gene Kim.
We had a great time. If you haven't listened to that, then please go and listen to the previous episode.
So, Gene did something quite interesting with his event, DevOps Enterprise Summit, which has been running since… Well, probably about a decade or so.
And he's now renamed it, taking the word DevOps out of it.
And it's a lot more about enterprise transformation. I can't. Sorry. I can't remember the exact new name of the event.
But that is still going on, and I think it's great that it's evolving to take into account the way that the world is changing. And I just want to pick up on something that you said a few minutes ago, Jobin, which is that so many of these DevOps and digital transformations that we look at actually fail.
And people are still failing, and if we carry on doing the same things, and if Gene Kim carries on running the same conference with the same people, then those businesses are still gonna fail. Well, until they all go out of business, of course.
So yeah. And it's a great evolution into something that's closely related to what was going on before. But takes it into a new age.
And yes, with bucket loads more AI.
Rasmus Praestholm:
So that's one of the good parts – change, evolution, positive and all that. And one of the less good parts in the news is this big story about a giant deepfake phishing attack that got millions of dollars by deepfaking a whole meeting with executives to an employee of a bank, or something like that…
That was crazy. Have you all heard?
Jobin Kuruvilla:
I did. Yeah. It came on CNN the other day. It's funny, right? I mean, if it was not for the wisdom shared by Gene Kim in the last podcast, I would have thought, you know, it's too good to be true. Is it really Gene on the call? Or is it like a deepfake AI imitation of Gene!?
But yeah, it's a funny world now, and it's hard to tell if the person is real or not.
Rasmus Praestholm:
Yep.
And it's just kind of getting worse, and part of that runs into the main theme of this podcast that we'll get to here in just a little bit.
But another way that it's happening – and there's a term coming later – but right now, an example of this is that I noticed AWS is set to make billions of dollars just off IPs, like a new charge on IPs, and so on.
Which is like – yeah, they already have all the customers, so why not just start ratcheting up the fees? It's almost like, you know, a different vendor that does that – not naming any names. But you know…
Can we expect to see more of that?
Matt Saunders:
There's a scarcity, isn't there?
I mean, we've been hearing about running out of IPv4 for decades now. And I think you almost sound like some sort of Cassandra saying, Oh, my goodness, we're gonna run out of IPs! We've been saying the same thing for 20-odd years. Maybe it's finally going to bite. I mean, I remember looking at some of the IP allocations that are going around the planet.
There are only a finite number of them – sorry, /8 networks, as they call them now.
And… it's hard to tell whether it is a cynical thing – like, well, AWS are gonna start charging for this – or if we've got to the point where the shortage is actually real, and now it's time to change the world.
I think it's encouraging some good behaviours. I mean, we've done some proactive things in Adaptavist since this announcement came out. But also things like our CloudZero report that says you're now spending this extra money on AWS, where you weren't last month – that's surfacing some examples of what, frankly, is not entirely brilliant engineering practice: things like public IPs being assigned to interfaces on EC2 instances where they don't really need to be.
So yeah, it's encouraging some better behaviour.
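To make that concrete, here's a minimal sketch of the sort of audit Matt describes – it assumes the boto3 library is installed and AWS credentials plus a default region are already configured, and it simply lists the EC2 instances that currently carry a public IPv4 address:

```python
# List EC2 instances that have a public IPv4 address attached.
# Anything printed here is now a chargeable address, whether or not the
# instance actually needs to be reachable from the internet.
import boto3

ec2 = boto3.client("ec2")

for page in ec2.get_paginator("describe_instances").paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            public_ip = instance.get("PublicIpAddress")
            if public_ip:
                print(f'{instance["InstanceId"]} ({instance["State"]["Name"]}): {public_ip}')
```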
Jobin Kuruvilla:
It is also, in some way, incentivising people to move from IPv4 to IPv6, right? I mean, that's been slowly but steadily happening.
Maybe this is another reason to do that?
Rasmus Praestholm:
Yep.
Matt Saunders:
Yeah, yeah. It's almost as if the increased use of things like NAT – and even in carrier networks, CGNAT – has meant that the whole IP exhaustion thing has slowed down.
Because it is fairly easy to NAT things and use RFC 1918 address space inside your network.
But I feel like it's only really early adopters who are doing that as a default. And now, yeah, they start to charge for those IPs, and suddenly, you know, they are actually a chargeable thing.
I think it's making people think about network design a lot more, and yeah, tidy things up. So does that in itself then slow down IPv6 as an option again? Well, I mean…
I talked about IPv6 and IPv4 exhaustion, as I'm sure a number of you have, in the previous century.
It just feels like it's taking forever. It's disappointing.
But yeah, AWS are doing it because they can. Maybe?
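For anyone rusty on the RFC 1918 point Matt mentions, the private space is just three reserved blocks you can NAT behind; here's a small sketch using Python's standard ipaddress module, with the example addresses chosen purely for illustration:

```python
# The three RFC 1918 private ranges, and a helper to check whether an address
# falls inside them. Addresses in these ranges aren't routable on the public
# internet, which is why they sit behind NAT and don't attract the new charge.
import ipaddress

RFC1918 = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918)

print(is_rfc1918("192.168.1.10"))  # True – private, fine behind NAT
print(is_rfc1918("8.8.8.8"))       # False – a routable public address
```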
Rasmus Praestholm:
There was one bit in the article that stuck out to me – it's like, hmm, if it's because of IP scarcity, we don't have enough of these… but I think the article I read said something about AWS having a hundred-million-ish reserved IPs, and they were only really using half of them on a regular basis.
Hey, hang on!
Where's the scarcity, then? Are you just charging so you can charge more, and make a million – a billion – dollars a year? Hmm, that seemed kind of like…
Jobin Kuruvilla:
Not a million dollars… It's going to make AWS 400 million to 1 billion extra every year. That's what it was saying in the article. That's a lot of money…
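That estimate roughly squares with the price AWS announced – about $0.005 per public IPv4 address per hour – so a quick back-of-envelope lands in the same ballpark; the billable address counts below are purely illustrative:

```python
# Back-of-envelope check on the "$400M to $1B a year" figure, assuming the
# announced charge of $0.005 per public IPv4 address per hour.
hourly_rate = 0.005
per_ip_per_year = hourly_rate * 24 * 365          # ~$43.80 per address per year

for billable_ips in (10_000_000, 23_000_000):     # hypothetical counts of chargeable addresses
    revenue_m = billable_ips * per_ip_per_year / 1e6
    print(f"{billable_ips:,} addresses -> about ${revenue_m:,.0f}M per year")
```

So somewhere around ten to twenty-odd million chargeable addresses would produce exactly that range.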
Rasmus Praestholm:
For essentially no work.
So that brings me to another thing that has been recurring on and off here over the last many months now: these tech layoffs. They just keep popping up, making news and all that, even while the economy stays good.
And it seems like, hey, a lot of these companies are profitable anyway. So why are they laying off people still?
Thoughts?
Jobin Kuruvilla:
Well, I mean, you need to show profit.
You know, a lot of these public companies, they have to cater to the needs of the stakeholders. Right?
And how do you show sustained revenue in your books? I mean, remember, at one time 50% growth used to be good. But now 50% growth is not good enough.
So how do you keep showing that, you know, you're growing at a hundred per cent, 200% rate?
That's probably one reason.
Rasmus Praestholm:
Hmm.
Matt Saunders:
I think it's like a second wave coming on.
I mean, I can postulate on this that maybe, like, the first wave is where, you know, companies have got away with… actually, now, I'm gonna rephrase what I'm saying. I was gonna say that companies got away with having lots of spare people.
Clearly, that's going to have been the case in some companies before the first wave came along.
But I also just want to make a note of how trimming your company to the bone and only running with the essential resources is a really good anti-DevOps way of ensuring you don't innovate, and you don't do things really, really well.
So it was an absolute tragedy that so many people got laid off the first time around.
But even more so because it just flies in the face of how I personally think you need to run an organisation – with some slack.
And I don't mean these messaging companies.
You know, that freedom to actually go off and use the diverse resources that you've got in your people. And so yeah, I think, after the first wave, you've ended up with these leaner, meaner companies chasing after possibly the same goals, and then it becomes a bit of a survival-of-the-fittest race, because if everyone is leaner and meaner, then your opposition – your competitors – are too.
So you've gotta be even leaner, and maybe it means that the first round was just too arbitrary.
And again, this is a generalisation. It's like you get the big companies. Forgive me. I don't have the numbers, but they say we're gonna cut hundreds of people.
You can't do that in a particularly nuanced way.
And so what I think is also happening is there's a bit of a bounce-back, where people have been laid off, and then companies have realised they've laid off some of the wrong people.
And they've had some of them back. And now they're going again, being more nuanced, because the costs are creeping up again.
Maybe that's the reason for it. I don't know. All I know is, it is sad. There are so many good people out there in the industry who I see now looking for work, and who are capable of actually transforming the companies that do get this sort of restructuring right.
Rasmus Praestholm:
Yeah, I think you kind of hit on a fourth cause. I had three in mind already, but the fourth one is kind of that, well, everybody's doing it, so we can do it too.
It's kind of like how, during inflation, lots of companies raised prices and things. Then inflation got under control, at least in the US.
And it's kind of like – but wait, prices aren't really dropping, and sometimes they're going up more, with the same justification: no, no, no, our supply chain costs went up, we still have to keep raising prices like everybody else is doing, so we can't be singled out and blamed.
It's almost like you can do that by laying off people like, oh, no, no, it's an industry standard.
We gotta. I'm sorry, guys, but we gotta!
Jobin Kuruvilla:
There are always, you know, excuses for doing this, right? I mean, UPS just recently… so they're gonna cut 12,000 jobs, and this is the biggest in their 116 years of history. I was just reading about it, and, you know, Chief Executive Carol Tomé said they're doing it in part due to new technologies, including artificial intelligence, coming on board. So, obviously, that is forcing a few layoffs as well.
Can you? Can you really blame the executives for doing that? I mean, if there is AI or new technologies helping do things faster and quicker, and if that is resulting in, you know, laying off a few people – you know, as bad as it sounds…
Tough luck.
Rasmus Praestholm:
Yeah, yeah. I mean, I can see it in some cases. Sometimes it probably makes sense.
But every time I really look into it, it's either, like, greed – which is the typical one: yeah, we've got to maximise stockholder value, you know, forget the employees and all that.
Some companies are like that. Some are not – like Adaptavist is great, you know. No bias here or anything!
And then perhaps there's also, like, some caution, like we keep hearing: oh, the recession will be here any moment now. It's right around the corner, guys, seriously this time!
And then there's one more.
That's fear, which is probably the least likely. But I'm kind of curious about this because there are all these talks about regulations around AI destroying too many jobs.
Or even like more and more workers unionising. And I kind of wonder if some of these big companies are trying to get ahead of that and shrinking down and downsizing before there are more constraints on how they can actually do that.
But I will admit that's probably unlikely. I don't know – it probably is for some companies; other companies, not so much.
Laura Larramore:
One thing that Matt said that kinda hits on it a little bit is the innovation thing: like, if you do a wave of layoffs because you kind of overhired, or for whatever reason, then you wind up in a place where you're not innovating enough.
Then you're contracting some more.
Then you'll have to do more layoffs. And it's kinda like a cycle. And to me, like folks who say, Well, AI is gonna cause these layoffs.
Well, maybe.
But I think that at some point you've got to see: oh, AI is not that innovative, in the sense that it's not a person. AI isn't able to think like a developer could think and come up with, you know, some innovative ideas that could drive a company forward.
So I think that's kind of something interesting that Matt hit on there.
Rasmus Praestholm:
Yeah, how do we think this is affecting DevOps and IT in general? Are we affected more than other companies? I mean, it is big tech that's laying off people. But is it from the DevOps department? Is it marketing? Is it sales? Is it something else?
Matt Saunders:
I can only really guess at that.
And, of course, with a DevOps halo on, I would suggest that it's not really DevOps people, because I think there's a personality type that is heavily involved in DevOps.
Which is, you know, there's almost a hustle about it, where, if you look at DevOps as being something where you're doing everything you can to ensure fast flow, remove friction.
Yeah, always hustling around stuff. I get the impression that a lot more of these layoffs are in more siloed roles. I mean, I do see some of my friends – DevOps engineers – coming up when I look on LinkedIn; you get the "open to work" things showing up.
But obviously, I'm connected to more DevOps engineers than anybody else.
But I get the impression that we've got companies that have bulked up, and we've got lots of people doing roles that are possibly very niche – no, that's the wrong word – but more kind of siloed in terms of what the organisation needs. And as we all know, the danger behind silos is in the communication structures around them and the delays you get. You know, any value stream map will show you that.
And so you get this in organisations. So yeah, I mean, as I said before, some people are getting laid off. But I think the people who are in the best positions are the ones who are thinking around, you know, that kind of transformative DevOps mindset.
So I'd say, Rasmus, I think we're seeing a little bit of bias based on what we observe being heavily involved in the community. And the reality is possibly somewhat different.
Rasmus Praestholm:
And that could also help explain the rehiring like oops, I didn't realise we needed those guys over there. Apparently, we do!
Matt Saunders:
Yeah, yeah, I just want to expand on something that Laura said a few minutes ago, which was around – I can't remember the exact word you used for it – but almost like a bit of a death spiral, where you cut people, and so you end up with a smaller group. But those smaller groups, then, are almost acting without enough psychological safety, and so they achieve less.
And then they cut more and more, and then you really see what you've cut that you shouldn't have.
Rasmus Praestholm:
And that might sort of get me to my main point here. Because, as you say, Matt, it's a tragedy that there's all this impact on people's lives. And it feels like we are getting to a timeline where this is getting more and more in focus.
And I came up with this ridiculous analogy here, that the dark timeline that AI could be heading down is kind of like what happened with Giphy.
Matt Saunders:
The image hosting service, right?
Rasmus Praestholm:
Yup, Yup image service.
So I'll explain it real quick. So I'm one of those nerds who collected like, suitable reaction gifs and images for years and years and years.
And, like, curated this huge collection of – let me find just the right image for you, I am an "expert" – you know, all kinds of air quotes and self-deprecation around that.
And then suddenly came Giphy. This thing that kind of made it really easy to do that because it's a database for a bunch of images, and you just do a keyword search, and you get an image and, like, ta-da! Suddenly, everybody is an expert.
But then, a little bit like AI – like how you used to have this hugely diverse conversation on Stackoverflow and sites like it – it suddenly gets ingested and automated. And now you have…
You have more people able to use it.
But with kind of lower-quality hits, because it's the same things that keep coming up over and over again – like how you would have ChatGPT kind of regurgitate the same stuff over and over again rather than ever come up with anything truly, you know, unique.
For example, you have fewer people actually posting new stuff – you know, their own content – on Stackoverflow, where they're thinking, and they're like, here's a new thing that's never been seen before. You never get that with AI. Or not really.
And there's a new movement that has actually latched on to this, which…
I will happily encourage anyone to read this. There's an article called The New Luddites in The Atlantic. We'll stick a link in there – assuming it's not behind a subscription, I don't know…
But it has a neat callback to the original Luddites, which was a movement 200 years ago that has been essentially equated with anti-technology people that are afraid of the, you know, the future, and like, try to regress and be reactionary and destroy the machines and all that kind of stuff.
When really, if you go back and look at it, it was a group of textile artisans – like craftsmen and craftspeople – that had been working for years or decades to perfect their ability to make textiles.
And then factories and factory owners, as part of one of the, you know, industrial revolutions, came up with these cool automated machines that could now make textiles – albeit at a lower quality rating – really easily with unskilled labour, sometimes with literal children, like child labour, being put in place of a multi-decade, you know, career craftsperson.
And it's easy to look back thinking, oh, they were afraid for their jobs, and they were just destroying everything because, like, no, we want to keep our jobs, save our jobs.
But then you really think about some of the writings that came out of it, and it highlights that it's not so much that they were afraid of technology – they were afraid of the few benefiting on the backs of the many as new technology comes out that only a few people can access. Much like today with AI: everybody can log onto ChatGPT, or talk to Bard – now Gemini – and all the others out there, and, you know, use it.
But very, very few hold the keys to what is actually going on, and they are controlling. You know, the AI race.
We had the whole thing with OpenAI dumping their CEO in favour of "we move too fast". Giant uproar. Everything got back to normal again. And it's just… who's winning?
Jobin Kuruvilla:
You mentioned 2 different things, right?
One is only a few holding the key to this whole AI thing, right? Yes, companies like, you know, Google, Meta – all these companies hold the key to AI. That makes sense. But you also mentioned only a few benefiting from it.
Now, I would argue that point, though, because there are a lot of people benefiting from this new boom. Take the textile example you mentioned. You're right – the new technology that helped create these textiles faster made it difficult for all those people who were experts in it, who'd been in that career for so long. But eventually, it also made it possible to create so many new designs so fast.
So everybody else in this world could then go for different designs they liked. And, you know, we could just upload a design to Amazon or wherever and get it printed on a T-shirt. So, there are so many options out there.
So a lot of common people are actually benefiting due to these technological advances.
And I would say the same thing with computers. Right? I remember back in the day when I was in India when computers came out, there was a big uproar about computers replacing everybody's jobs.
Now, can you imagine a day without touching a computer? Or without, you know, computers helping make our lives better? So in the end, I think it's not just a few who are benefiting – a lot of people are benefiting because of these technological advances.
Don't you think?
Rasmus Praestholm:
Absolutely agree. There are, you know, layers to the benefits in that.
Yes, it may concentrate more capital in fewer hands. That's a problem, like society-wise. But yes, it also broadens what's available, makes more goods available and all that. And that is good.
Yet there's a balance to be struck because, as with the Luddites, it has a tendency to degrade quality, even if it becomes more widely available. So I think what the original Luddites would have been happy with is if we'd found a balancing point. Rather than switch directly from skilled artisans to child labour, we'd have gone somewhere in the middle – like, okay, we can rehire and retrain the artisans to be the supervisors and the machine understanders and fixers, and then hire regular peasants, not just children, to work the machines and strike a better balance.
Sure, the machines can mass produce cloth that's just like gonna break in a year or 2, while the good quality stuff would last for decades and would be passed down. Let's find, like, a middle point again.
So I think that's what's happening right now, and it's up to us to help balance the scales, because very easily, with fewer and fewer hands controlling it, you could get to the point where – well, you know, not only is your job gone, but your whole industry is pretty much ruined, and maybe some of these good things never happen again.
Matt Saunders:
I have to kind of sit on the fence a little bit here for my own sanity, to be honest. Otherwise, you get into capitalism. You get into all these things.
And the reality is that this, as Jobin said, is very nuanced.
If you're a developer and you want to solve your problem, you can either spend half an hour Googling through Stackoverflow answers and Stackoverflow posts, or you can put it into ChatGPT and probably get the answer you need in 5 seconds.
That then enables the developer to spend his or her time on other things. Great, fantastic.
But there are limits to it.
And you can see, projecting ahead: well, you know, today a developer is using ChatGPT to write a function. Tomorrow, or in 6 months, or in a year, they'll be using ChatGPT to write an entire application.
And yeah, the ethical effects of all this stuff start to collide with the actual practicalities.
Rasmus Praestholm:
Yeah.
Matt Saunders:
I've overused this quote, but you end up with Jeff Goldblum in Jurassic Park.
He's saying how they spent so long working out if they could, they never stopped to think if they should.
And it's a global problem, and it's a social problem.
Should we stop using AI to advance ourselves? Well, we can't really because if we don't do it, then somebody else will.
And now I'm getting political. So I'll stop.
Jobin Kuruvilla:
I completely agree. I mean, we keep saying this, you know: AI is not going to replace you.
But people who use AI are going to replace you, right?
So I think there's definitely a balance to be struck. But at the same time, it is enabling us to do things faster and quicker. Actually, it's just like the DevOps side. I mean, we keep saying: don't waste your time, you know, building infrastructure or making sure it keeps running.
But instead, you know, use your time to innovate. Do other stuff.
Because DevOps can take care of that basic stuff that you used to worry about for so long. Right?
It's just like that, you know, AI is actually enabling you to do more innovative stuff. So you don't have to worry about, you know, thinking about, okay, what is the RegEx for, you know, this particular thing.
Matt Saunders:
So that's an interesting one, because when you say that, Jobin, it makes me think of a few years ago when there was a movement called hashtag NoOps – where you get to the extent where, like, with DevOps, automation is working so well that you don't actually need Ops. Well, that's what people believed. And, you know, TL;DR, long story short: we still need Ops.
Do we still need the same Ops people that we needed before all this DevOps and NoOps stuff? No, it's different. And it's gonna be the same with AI.
It's like, Yeah, it's here to stay. It's going to replace you to some degree.
But also, it's not going to replace you.
Rasmus Praestholm:
Yeah, I love the idea of building, you know, upon the shoulders of others, as over time, we get higher and higher.
I just hope that those shoulders aren't, like, skeletal bones, because we've ground our predecessors into the dirt because of, like, AI.
But maybe I'm a little bit of a new age Hippie there.
Jobin Kuruvilla:
I would actually go back to the analogy that you were making, Rasmus, with Giphy there.
I go to Giphy and, you know, search for certain keywords, and I tend to get the same gifs over and over again, which makes it a little bit, you know, mundane. I mean, it's not creative anymore. You get that kind of feeling.
But at the same time, I was reading an article about why ChatGPT works, right – part of the reason why it works so well in many cases is because, you know, it is only predicting the next token, and based on a lot of calculations and algorithms, it's figuring out, okay, what is the next possible set of tokens, you know, available.
And then it assigns a priority to them. But, interestingly enough, ChatGPT doesn't always pick the highest-priority token every time.
There is a randomness associated with it. So it actually picks, you know, random tokens, which makes it interesting, because if you ask the same question to ChatGPT multiple times, you will see different answers – it is actually adding the randomness, which then becomes kind of the creative element in giving answers, you know.
So I think there is some creativity there which will only improve over time.
It's not always, you know, going to give you the same stupid answer…
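A toy sketch of the sampling behaviour Jobin is describing – the candidate tokens and their scores here are invented, and the "temperature" knob is the usual way that randomness gets dialled up or down:

```python
# The model scores candidate next tokens; instead of always taking the top one,
# it samples from the distribution, so the same prompt can produce different answers.
import math
import random

candidates = {"the": 2.1, "a": 1.7, "an": 0.9, "this": 0.4}  # hypothetical logits

def sample_next_token(logits, temperature=0.8):
    scaled = {tok: score / temperature for tok, score in logits.items()}
    total = sum(math.exp(s) for s in scaled.values())
    probs = {tok: math.exp(s) / total for tok, s in scaled.items()}  # softmax
    return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

# "Ask the same question" a few times and the continuation can differ each time.
print([sample_next_token(candidates) for _ in range(5)])
```

Lower the temperature and the top-scoring token wins almost every time; raise it and the answers get more varied.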
Rasmus Praestholm:
And then you can, like, thumbs-up or thumbs-down the responses, which helps train the model over time, and I think that is the way to go. But there's gotta be a way to, like, tie the feedback loop back to where it came from.
So, like, "this answer was provided by so-and-so over on Stackoverflow on this date" – but that's difficult with the way, you know, generative AI works right now. But, you know, that's what somewhat motivated people to go and write things on Stackoverflow in the first place – they got, you know, karma, or points, or whatever it is. And if there was a way to somehow tie what comes out of generative AI back to where it came from, or to further the thing beyond just training a model that's controlled by very, very few people…
That would be something of interest.
But if I may switch over to a newish term that I really love now – and a mild strong-language warning here… The term is enshittification.
Which is a real word. That was actually the word of the year in 2023, according to some dictionary somewhere – I forget which one it was. And it was only thought up in, like, 2021 or 2022. So it's super new, and it just perfectly captures, I think, the zeitgeist of what's going on in this space, because it has a Wikipedia page, you guys! It's crazy.
And it has these amazing examples of how it works. An example of it is when a new piece of software comes out – like a new platform, a new portal – and it starts out just being awesome.
And then suddenly, over time, it incrementally gets worse, as maybe the company starts monetising it more and more. Hey AWS! IPs, I'm looking at you!
And then eventually, it either just leads to an acceptance that things are actually worse now than they were before, or it may lead to a backlash, kind of like the Luddites. And there is another neat article called Dare to Connect a Server to the Internet, that kind of encapsulated some of this cloud backlash – with AWS and others, you know, continuously having an incentive to tell people that you shouldn't set up your own server, it's too hard. No, no, just pay us to do it. It's great, it'll be great.
And then you really go back and look at it: you know, I have a server on my desk, and I ran three commands, and now it's online. And it doesn't cost me anything but electricity.
So… I just. I love that term. I love what it says about the world today.
And I just really hope that we can veer off onto kind of, like, the good timeline of AI rather than the bad one that just turns more and more into the Giphy-fication of modern life.
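Rasmus doesn't say which three commands he ran, so as a purely hypothetical stand-in, this is roughly the smallest "server on the network" you can get from Python's standard library – actually reaching it from the wider internet would still need a routable address or port forwarding, plus TLS and some hardening for anything real:

```python
# Hypothetical illustration only: serve the current directory over HTTP on
# port 8080 using nothing but the standard library.
from http.server import HTTPServer, SimpleHTTPRequestHandler

HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()
```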
Laura Larramore:
It's interesting to me that AI is both evolving and devolving simultaneously. I think you could evolve into using it for these things that are productive. You could devolve into what we were talking about earlier – the deepfakes, controlling politics; all kinds of stuff is possible with it. There's a flip there, and it's kind of a little bit social in terms of what's going to happen.
Jobin Kuruvilla:
It is sort of like nuclear power, right? I mean, you can use it for the good, but you can also use it to kill others. Right?
When we talk about AI and evolution, people are only thinking about self-driving cars, you know, all the fun stuff – text to video, all of that. But you have to also think there are a lot of other things happening with AI – take the DevOps tools, for example. You know, Gitlab…
Gitlab Duo, or Atlassian, you know, coming up with AI features in Jira, Confluence, all of those different tools. So there are these little improvements also happening within the DevOps tools, specifically AI-driven ones, that will, you know, make our lives a lot easier. Like Gitlab Duo, for example – yes, code generation, that is part of it.
But at the same time, you know, automatically finding reviewers, helping to summarise epics or issues. The same thing on Confluence – okay, getting the summary of a page, you know, searching when you click on a particular bit of text inside Confluence. So, there are all these small features that are coming as part of the AI evolution.
Which, again, makes these products much better. So even though we say, yeah, you know, Meta or, you know, Amazon are holding the key to a lot of these things, there are all these different companies coming together and improving our lives little by little by adding new features that help us, you know, do things faster.
I do like the word. And what is it? Enshittification? You said there's a Wikipedia page for it. I'll go read about it.
Matt Saunders:
So it's come about through Cory Doctorow, who you may know from the blog BoingBoing back in the 2010s, very much following the Zeitgeist of new platforms that were coming out back when Facebook, Twitter, etc., were coming to the fore.
And yeah, when it describes enshittification – yes – I'm literally reading off the Wikipedia page now! He talks about it being how platforms die.
How, first, they're good to their users. Then, they abuse their users to make things better for their business customers. And then they abuse those business customers to claw back all the value for themselves.
And I'm interested here because of the platform link here. We haven't talked about platform engineering for a little while here. And I'm really looking forward to, well, I am, or I'm not. I'm quite looking forward to the enshittification of platform engineering.
By that stage, it would mean that we had gone through the stage where we are building successful platforms – using IDPs, using good source code management, good deployment systems, etc., and solving our users' problems – to the extent that the platform actually becomes so powerful that we start enshittifying it.
So yeah, I hope that doesn't happen. But also, it's kind of like a sign of success, perhaps?!
Just on the… what was the other article we were talking about?
Rasmus Praestholm:
Dare to connect a server to the internet?
Matt Saunders:
Yeah. Cloud repatriation. This is an interesting one. So this is DHH, the inventor of Ruby on Rails, who is leading a push to get everyone to leave AWS, GCP, etc., based on the proviso that, well, you can run this stuff on a server yourself.
DHH can do that.
I can do that. Rasmus, you can do that. Jobin, you can do that, Laura. I'm sure you could do that as well.
Should you?
I don't know, but it strikes me there are these 2 extremes where we hear about people spending millions on Amazon, and then DHH telling us all to run a private CoLo server for $50 a month.
I think the reality is somewhere in the middle. And it all comes down to working out where you spend your time.
I would love it if somebody told me to go off and run some big infrastructure on a single server.
I would love to do that. Is it a good use of my time? Probably not.
Rasmus Praestholm:
I'm hoping that we can find a balanced future; that's what I would like something like post-AI DevOps to look like, because you can tell there have been industries that have been disrupted, just like textiles and the Luddites and all that. For instance, Uber is an example.
Uber really kicked open what's called the gig economy.
And it's kind of like, in some ways, it's good. Some people really enjoy having the flexibility to be like their own boss and only sign in when they want to, and that's great.
And then Uber, apparently, in some cases, can probably be a not-quite-perfect master, and working in an Amazon warehouse is probably not exactly great either. But what it enables is amazing – this logistical system.
But having these big corporate titans controlling these things is what makes me worry. And I wish we could find, like, the middle point where we figure out, like, okay – hey, capitalism! Do your innovation thing.
Great. Now we have a beautiful new thing. It's great. Everybody should do it this way.
Maybe there should be kind of like, almost like a public framework of – this is how to do it in a really effective way, anybody can do it. Some like half cloud, half home. You plug in the thing to it like this. It's still yours, but it now benefits from the economies of scale that have come out from the innovative engine of capitalism and all that.
Jobin Kuruvilla:
The solution is more political, not technical, isn't it?
Rasmus Praestholm:
Yes, it is. It is. It's a society thing, and I think it's probably some of that, like, regulation that I thought might be behind some of the layoffs – that they're worried that unions and things are going to get more and more into the whole, like, hey, wait a minute… I'm happy you're benefiting from all this, but how about, you know, sharing it with all of society, so we can actually be happy and not devolve into real new Luddites and go smash up data centres.
Matt Saunders:
Well, here's a can of worms which we should not open right now because I think we're about ready to wrap up the podcast.
I would posit that perhaps the answer to that is open source.
The end!
Rasmus Praestholm:
Yep. But open source needs something more – that's been clear. But that's a whole other podcast.
Laura Larramore:
Indeed it is!
This was a fascinating podcast – I think it's interesting how I can relate almost all of these discussions back to this being a social and people problem.
People matter. So, I like that.
That's it for DevOps Decrypted today.
We want to point out that we have a new feedback email address. It's devopsdecrypted@adaptavist.com.
So leave us some feedback there, and let us know how you like the podcast and your thoughts.
For the team and myself – have a great day!
Why not leave us a review on your podcast platform of choice? Please let us know how we're doing, highlight topics you'd like us to discuss in our upcoming episodes, or email us at devopsdecrypted@adaptavist.com.
We genuinely love to hear your feedback, and as a thank you for your ongoing support, we will be giving out some free Adaptavist swag bags!