20 Years in Tech, What to Consider Today
Burton White (00:00):
Hey there, thanks for dropping in. I’m Burton White, Co-Founder and CEO of Excella. This year we’re celebrating Excella’s 20th anniversary, and in doing so we’ve been reflecting a lot on the last 20 years, not just on what’s going on at Excella, but also on technology, our industry generally, and all our clients. So we thought it would be fun to record some of these thoughts and conversations and share them with all of you. Here’s our first one. Today I’m joined by my Co-Founder, Jeff Gallimore, who’s also our Chief Technology and Innovation Officer. We’re going to talk a little bit about his favorite topic, which is technology, and how it intersects with and enables people. So that’s our conversation today. Jeff, thanks for being part of this.
Jeff Gallimore (00:53):
My pleasure.
Burton White (00:55):
All right. Well, we’ve been doing this for 20 years, so thanks, partner. It’s been a good 20 years, hasn’t it?
Jeff Gallimore (01:01):
[Laughs] It’s been my great pleasure, partner.
Burton White (01:05):
[Laughs] So I was kind of amusing myself thinking back to some of the early projects that we worked on, some of the technologies we were working with early on that seemed so daunting at the time, the technical challenges we were trying to overcome. But you probably have some of your own favorite memories. What are some of the early projects that come to mind for you?
Jeff Gallimore (01:27):
Yeah, one of the first projects that I was on, just after we started Excella in fact (I think it was the first project I had after we started Excella), was building a custom content management system for a federal agency, which shall remain nameless. We were using a product called WebLogic at the time, which was total overkill for this thing. Like I said, we were custom building a content management system to handle rules and regulations, the management of those, and the workflows. So it was basically taking a bunch of unstructured content and documents and turning it into structured content in an object-oriented application and a relational database so the agency could manage it with their own business rules.
Jeff Gallimore (02:22):
And I remember doing the design, standing up the application, and helping the team with the coding and the implementation of this. Thinking back on it, we would never, ever do that today because we’ve had the rise of the search engines. I mean, I was thinking about this: Google was founded in 1998, so when we started Excella 20 years ago, Google was only about four years old. It wasn’t even really a thing back then. And then, fast forward even further, and now we’re using AI and machine learning and things like natural language processing to suck content and insights out of these unstructured documents in a way that we never would have before, when we didn’t really have the technology and the compute and the storage accessible. It gives us a lot of power to process things in a way that we certainly couldn’t 20 years ago.
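As a rough illustration of the shift Jeff describes, here is a minimal sketch of pulling structured records out of unstructured regulatory text with a general-purpose NLP library. It is a hypothetical example, not from any Excella project: spaCy and its small English model are assumed to be installed, and the sample text is made up.

```python
# A minimal, hypothetical sketch: turning unstructured regulatory text into
# structured records with off-the-shelf NLP. Assumes spaCy is installed
# (pip install spacy) along with its small English model
# (python -m spacy download en_core_web_sm). The sample text is made up.
import spacy

nlp = spacy.load("en_core_web_sm")

document = (
    "Effective January 1, 2023, the Department requires licensees to renew "
    "permits within 90 days of expiration."
)

doc = nlp(document)

# Named entities (dates, durations, organizations, quantities) become
# structured records we could load into a database for rules management.
records = [
    {"text": ent.text, "label": ent.label_, "start": ent.start_char}
    for ent in doc.ents
]
for record in records:
    print(record)
```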
Burton White (03:33):
It’s unbelievable. I haven’t thought about WebLogic in forever.
Jeff Gallimore (03:37):
I know!
Burton White (03:38):
It’s so great. And you’re right, we would never build a custom content management system today, but at the time, that made good sense.
Jeff Gallimore (03:48):
That was what was needed to solve the problem.
Burton White (03:51):
Yes. So that’s kind of the enterprise tech stuff. I was thinking about our own personal tech, and that’s obviously transformed, you know, a thousandfold. Think about 20 years ago: what were the personal technologies that you were dabbling in?
Jeff Gallimore (04:12):
Yeah, I don’t know about dabbling, but I’ll tell you one technology that’s still a thing today and was a big thing 20 years ago: Bluetooth. I remember the coolness of being able to take my mobile phone and pair it with an earpiece.
Burton White (04:32):
A Jawbone?
Jeff Gallimore (04:32):
I had a Jawbone, yes, I did. I remember all those people who used to walk around on the sidewalks, or be in a meeting in the office, and they’d always have the earpiece connected. I was not one of those people, but Bluetooth enabled that. Yeah.
Burton White (04:50):
Yeah, the first time you saw it, you thought somebody was talking to themselves walking down the street.
Jeff Gallimore (04:55):
That’s right. That’s right. Yeah. And it was just such a neat technology to be able to pair all of these disparate devices like phones and earpieces, and then later cars and speakers and all kinds of stuff. And I still love Bluetooth today because we still use it for all the things. But now the technology that I have become profoundly appreciative of is voice assistants, like Alexa and Google Assistant, and how they can support things like home automation. So I’ve gone through an effort over the last few years of automating all the things in my house: lights, fans, all that stuff. And I can interact, and my wife and my family can interact, with those devices and with our house through voice.
Burton White (05:54):
Well, as you’re saying it, the device over here on my desk, whose name is “A L E X A,” is kind of going bananas because she’s hearing her name and not getting any instruction. So, I know, it’s unbelievable. We have all kinds of fun in our house with the devices, and the accuracy is unbelievable, how often the device gets it right. You ask it some strange question and you get a pretty good answer back.
Jeff Gallimore (06:25):
Right, and being a technologist, I, of course, can’t just appreciate the technology on its face. I have to understand the guts behind it. And understanding all of the complex technology and logic and processing and intelligence, if you want to call it that, that lives in those environments and behind those systems is totally fascinating to me.
Burton White (06:50):
Yeah. Well, of course, being the technology professional in the house, everybody asks me how it works. I have no idea, but I just say a lot of big words and hope that it persuades them.
Jeff Gallimore (07:02):
Does that work for you?
Burton White (07:03):
It does. [Laughs] It usually works, except the kids now know more than I do. All right, well, over the years you have frequently, and fondly, cited that very famous operational framework of “people, process, and technology.” And you always say that’s in order of importance. What goes into that framework has changed dramatically over the last 20 years. What do you point to as the biggest transformation across those categories?
Jeff Gallimore (07:35):
Wow, the biggest one. I’ll start with probably the most recent and then kind of go back. I think DevOps and DevSecOps, you know, “dev whatever ops,” is the biggest one, because the more I’ve learned about DevOps and the movement and what’s behind all of that, the more I understand how much that term ties things together, how broad a reach that movement has into such disparate things that actually hit at the people and the process and the technology, all of that stuff. It just brings it all together. There’s a big myth there, which I think is becoming less prevalent than it used to be, and that’s that DevOps is really all about technology. It’s not. The technology is certainly a part of it, but really it’s about ways of working and the mindsets and the practices that we use to do work: humans doing stuff with other humans.
Jeff Gallimore (08:56):
So that’s probably the biggest one, but I could unpack each of these. In fact, when I think about the people, the process, and the technology, I can think of examples in each of those areas of things that really weren’t a thing 20 years ago but are now. So, for example, with people, particularly over the last five or six years, there has been this understanding of the importance of culture and its impact on teams and organizations and the ability to influence performance. We hear this term a lot called psychological safety. That’s always been a thing, but it has really entered the common vocabulary only in the last few years. And that’s a people thing. Now, in terms of process, here’s something I realized: the Agile Manifesto was created in early 2001, so just over 20 years ago. At the time we started Excella, that wasn’t even really a thing.
Burton White (10:10):
No one, certainly when we were starting, no one had heard of it.
Jeff Gallimore (10:13):
No, they hadn’t. And now it’s sort of the de facto way of doing things. We’ve got opinions about Agile and Agility and doing versus being and all that stuff. But now we’re also seeing lean practices applied to technology, bringing in practices from the manufacturing world. Here’s a little-known fact for you (actually, it’s not that little known, at least I know it): The Phoenix Project, the novel about technology and DevOps by Gene Kim [and others], was actually patterned after a book called The Goal, written back in the eighties by Eliyahu M. Goldratt about the lean manufacturing movement. So what works for manufacturing apparently also works for technology. And then, speaking of technology, something that wasn’t a thing 20 years ago was the cloud. Remember what it took to provision new environments in our technology organizations? You’d have to fill out a form—
Burton White (11:32):
Yeah, the form. You’d have to work it out months and months ahead.
Jeff Gallimore (11:37):
Right. And then the equipment would show up, and you’d have to rack it and provision it and get it all connected up and all that stuff. And now we’ve got the cloud, where, for better or for worse, you can do things in minutes with a credit card if you wanted to. It’s enabling scale that we hadn’t thought of or dreamed of before, and it’s enabling speed in a way that we’ve never dreamed of before. And one of the other interesting things I think about with respect to the cloud is that it also enables managing business risk in a way that we’ve really never thought of before. It doesn’t require companies to make these massive bets way in advance of something, so they can decrease their risk and create options and flexibility for what it is they’re doing. And I think that’s pretty cool too. None of that existed 20 years ago.
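For contrast with the old rack-and-provision process, here is a minimal, hypothetical sketch of provisioning a cloud environment programmatically with boto3, the AWS SDK for Python. The AMI ID and tag values are placeholders, AWS credentials are assumed to already be configured, and running it would create a real, billable instance.

```python
# A minimal, hypothetical sketch of "minutes with a credit card" provisioning
# using boto3. The AMI ID and tag values are placeholders; AWS credentials
# are assumed to be configured, and this would launch a real, billable VM.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[
        {
            "ResourceType": "instance",
            "Tags": [{"Key": "Name", "Value": "demo-environment"}],
        }
    ],
)

# Block until the instance is up, then report its ID.
instances[0].wait_until_running()
print("instance running:", instances[0].id)
```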
Burton White (12:34):
None of it. Okay, as I was asking the question, I thought for sure we were going to land on Agile as the top change. But I have to say, now that you’re talking about DevSecOps and DevOps, I might have longed for that more than anything, not knowing it 20 years ago. How many times was I on a call at two in the morning deploying code (and you had to do it at two o’clock in the morning because that’s when you could shut down the system), waiting for everybody to come online, and then somebody wasn’t there, or somebody fell asleep halfway through when it was their turn to take some step? So maybe quality of life has improved more because of DevSecOps than anything else.
Jeff Gallimore (13:14):
That is a whole separate discussion, and I one thousand percent agree with you. The days of the weekend release window should be gone.
Burton White (13:27):
Mhmm. It’s unbelievable, and it should be gone. Big changes. All right. So, I think another big area of change is how we think about equity. A cousin to that is ethics around technology, as technology becomes so powerful. We’re thinking about all of that much more than we used to. So how were we thinking about that stuff 20 years ago compared to today?
Jeff Gallimore (14:02):
So, it is awesome that we are thinking about equity in technology and ethics and things like that. And to be honest, this is also a tough topic for me, because this is when it becomes less about the technology and more about the people, about human beings. That’s when it starts to bring out some feels. So, you talked about the technology being powerful, and it absolutely is. And with any power, that power can be used constructively, for good and in helpful ways, and it can also be used in harmful ways, either intentionally or unintentionally. So I’ve come to appreciate and understand that. And one of the things I also know about myself, as you could probably tell, and you certainly have known me long enough to know this, is that I get really excited about using technology to create opportunities for other people that they might not have had before.
Jeff Gallimore (15:16):
So that’s something that gets me really fired up, and I know I’m not alone. There are many others out there who are excited about using technology for good too. But unlike 20 years ago, today we’re much more cognizant of the potential for unintended negative consequences, so we need to be much more thoughtful about the risks associated with the systems and the technology that we create. It seems like most of the examples we hear about these days have to do with AI-based technologies, especially as it relates to bias. It’s also why organizations need to spend more time thinking about AI ethics and responsible AI and explainable AI (it goes by a bunch of different names). And this is one reason among many why inclusion, diversity, and equity are so important: we’ll be able to spot those risks better and address them more effectively.
Jeff Gallimore (16:21):
And also unlike 20 years ago, we appreciate much more that different people experience the world differently, and we need to take that into account as we build and deploy technology. I think that’s one of the reasons the field of customer experience has evolved as much as it has and become such a high priority for organizations. We want to delight our customers and the people who use our technology, and that takes a lot more effort and expertise than it did 20 years ago. User expectations have evolved quite a bit in two decades. This is another of the many reasons for inclusion, diversity, and equity: to better understand, value, and design for different individuals and communities.
Burton White (17:13):
Yeah, totally. Okay. So I keep telling my kids, who aren’t that interested in, say, coding and that part of technology: if you really want to be part of the next big thing, study philosophy and the humanities, because it’s the ethics that are really going to be the challenge. The technology is tough with AI and so forth, but making ethical decisions is going to be even bigger and tougher. What do you think about that?
Jeff Gallimore (17:44):
So, you’re right. The ethics, generally with respect to technology but especially with artificial intelligence, are really taking center stage in a lot of organizations. It’s because of some of the, we’ll call them interesting, dynamics within AI implementations. So what do I mean by that? Well, first of all, here’s the thing: if you train AI algorithms on biased data, you get biased outputs. That stands to reason and makes logical sense, but I don’t think we fully appreciated it coming into all of this. We didn’t really have an appreciation of the bias inherent in a lot of our data, which was a product of the environment and the decisions that we had made throughout history.
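To make that point concrete, here is a minimal, hypothetical sketch (synthetic data, made-up features, scikit-learn assumed to be available) of how bias baked into historical decisions flows straight through to a trained model’s predictions, even for otherwise identical inputs.

```python
# A minimal, hypothetical sketch: a model trained on biased historical
# decisions reproduces that bias in its predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 10_000

# Synthetic "historical decisions": a group attribute (0 or 1) and a skill
# score. Suppose past approvals penalized group 1 regardless of skill.
group = rng.integers(0, 2, size=n)
skill = rng.normal(size=n)
logits = 1.5 * skill - 2.0 * group
past_approved = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

# Train on the biased history, with the group attribute as a feature.
X = np.column_stack([group, skill])
model = LogisticRegression().fit(X, past_approved)

# Identical skill, very different predicted approval rates by group.
for g in (0, 1):
    prob = model.predict_proba([[g, 0.0]])[0, 1]
    print(f"group {g}, skill 0.0 -> predicted approval probability {prob:.2f}")
```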
Burton White (19:02):
All right. So yeah, the bias that can be in the data is such a big thing. It’s a huge potential pitfall. And virtually every organization out there is now being called on to bring AI into their enterprise and make use of AI and all its power, and yet there are pitfalls like the one you just described. So what do you think are the top few things we’ve got to help our clients navigate as they go into this new grey frontier of AI?
Jeff Gallimore (19:39):
I’m limited to just a few? I have to pick just a few?
Burton White (19:43):
Yes, yes, it’s a brief podcast.
Jeff Gallimore (19:46):
Okay. So here are the things that we’re seeing and that are coming up in the conversations I’m having; here’s what I’m hearing. First of all, we’ve already talked about the AI ethics part. Whether you call that transparency, responsible AI, explainable AI, or ethical AI, organizations are really grappling with how to use this technology in an ethical, responsible way, and that’s where those pitfalls come in. We actually spent some time internally coming up with our own AI ethics guidelines for when we engage in projects like this: how we’re going to show up, and the decisions and the frameworks and the principles that we’re going to use to help our clients navigate this. So that’s one. I think another one, when you look at AI, is data. AI systems are fed by data, and organizations really have to look at the quality of the data that they have and that they’re feeding these systems.
Jeff Gallimore (20:51):
They have to look at how they’re maintaining that quality and how they’re governing all of the data assets they have within their enterprise. So you really have to scrutinize that data quality and make sure you’re governing it correctly and governing it well. I think another one that we have started to hear about is, we’ll call it, the friction between normal, quote-unquote Agile ways of working that we’ve seen in typical software development and how those don’t necessarily match up well with AI implementations and how we build those [models], because of the really experimental, highly hypothesis-driven nature of AI implementations. So Agile is great. Agile methodologies and approaches are really great at getting work done with speed and quality and all of that.
Jeff Gallimore (22:02):
But we’re now bringing some new dimensions into our teams, which is, “I’m really not sure what the right answer is at the beginning. I have to experiment my way into the right answer.” So how do you bring Agility into those kinds of efforts? It looks a little bit different than what we’ve been used to. Then probably the last one that I’ll pick (there are a million others I could talk about too) is the one that may be the scariest: security. We know how important security is and how scary a proposition it is to protect our organizations and enterprises from the bad actors out there. Well, we’ve now got new threat vectors with AI implementations. So there’s this term that I learned about; it’s called model poisoning.
Jeff Gallimore (23:04):
And so bad actors can use certain tactics to actually change how AI implementations operate by feeding them different data, and it starts to make the thing, here’s a technical term for you, go wonky. So you now have to protect against new threat vectors to make sure that your system is doing what you intended it to do. There are a lot of challenges out there. There’s a lot of new, for sure.
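As a rough sketch of the idea behind model poisoning, here is a hypothetical toy example (synthetic data, scikit-learn assumed) in which an attacker who can tamper with training labels degrades a model’s behavior relative to the same model trained on clean data.

```python
# A minimal, hypothetical sketch of data poisoning via targeted label
# flipping: the attacker alters part of the training data, and the model's
# accuracy on clean test data drops compared to the untampered baseline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(4_000, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # clean, linearly separable labels
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: train on clean labels.
clean_model = LogisticRegression().fit(X_train, y_train)

# Poisoned copy: the attacker flips the labels of one slice of the
# training data (class-1 points with a large first feature).
y_poisoned = y_train.copy()
targeted = (y_train == 1) & (X_train[:, 0] > 0.5)
y_poisoned[targeted] = 0
poisoned_model = LogisticRegression().fit(X_train, y_poisoned)

# Accuracy on the same clean test set drops for the poisoned model.
print("clean model accuracy:   ", round(clean_model.score(X_test, y_test), 3))
print("poisoned model accuracy:", round(poisoned_model.score(X_test, y_test), 3))
```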
Burton White (23:39):
No shortage of new. And with every one of these new things, I love your point about data: almost as a starting point, that’s always been the issue. Even in some of our early projects, 15 years ago, we would be doing projects to help with analytics, and the first thing we realized was that we had to clean up the data before we could put analytics on top of it. So, lots to chew on. The great thing about technology is that it’s always changing, so we always have new challenges and new ways of applying what we’ve learned from the past to these new frontiers. So what do you say we do this for another 20 years?
Jeff Gallimore (24:17):
I’m in if you are.
Burton White (24:18):
All right, we’ve got lots of cool stuff to chew on, so let’s do it. All right, Jeff. Thanks. This has been fun.
Jeff Gallimore (24:26):
It’s been my pleasure. It has been fun. We’ll do it again in 20 years.
Burton White (24:30):
All right. And for those of you listening, tune in to our upcoming episodes, where we’ll explore other big areas of transformation over the last 20 years. Thanks for tuning in.
As Excella celebrates its 20th anniversary, Co-Founder and CEO Burton White sits down with fellow Co-Founder and Chief Technology and Innovation Officer Jeff Gallimore to discuss technological change over the past 20 years and the biggest considerations for IT leaders today, including:
- The biggest impacts in enterprise and consumer technology
- Technology equity, ethics, and other considerations for leaders
- What leaders need to consider when adopting and applying AI in their organizations