EP 77: Justin McBroom from Marathon Oil Corporation
#77


0:00 All right. Well, welcome to another episode of Energy Bytes. I am Bobby Neelon and I've got my trusty co-host, John Kalfayan. How are you doing? How are you doing? Good to see you, Bob.

Likewise, buddy. And the Razorbacks have just continued, it's so wonderful, to be terrible. My Saturdays are totally free now. It's so great. I told my wife after, I think it was like the

0:22 Notre Dame game, I was like, there's a very strong chance we don't win another game, and we haven't. And it's been, it's been, you know, it could be. Yeah. Well, now the rumors are they're

0:30 hiring Barry Odom as the DC and Bobby as the OC, with no specific head coach, and it's like, oh, that'll work out great. Yeah, I'm sure all the recruits are going to be all over that. Yeah. So I'm

0:42 sure that's, that's going to change tomorrow. So who knows. But luckily for our guest, he's into Texas Tech, and they're actually having, yeah, y'all are having a hell of a season.

0:50 Yeah, I was shocked. What's it like to, well, thank you, oil money. That's right. Hey, they just understand the game. It's totally changed. And it is what it is, like they're going to keep

1:01 massaging it or whatever. But, you know, that's why Dabo's not doing well, is because the old style of coaching and recruiting is not working, and he is just fighting it tooth and nail. But, but yeah, so if

1:12 in case you're wondering, though, our guest is Justin McBroom, you know, formerly of Marathon Oil Corporation, and I would butcher his title, and apparently they're all fake anyways. But I

1:25 mean, what kind of operations, digital, data analytics, you know, kind of all those kinds of things. Yeah. Yeah, it's, we were part of our integrated performance organization. It was

1:38 central support for operations, and our team didn't belong in the asset, but supported it. So I got the island of misfit toys. Yeah. And so I made up a unique, very long, confusing title to keep

1:54 people guessing what I did. I think that's one of the things that a lot of the more successful operators have started doing, that model. Like, I know Devon does something similar to that. I

2:05 think EQT kind of does something similar to that. And so it's like, okay, well, we want to try out this new thing, but it's going on the AFE. If you don't have

2:15 that department with a budget to pay for the new things, then it's going on the AFE, and everyone's raising hell because all their bonuses are tied to AFEs or operations. And so I think that's

2:25 like one of the most useful ways for oil and gas companies to actually test and implement technology and new tools, because it doesn't have to come directly from the completion engineer's AFE, and he's

2:38 not going to get mad about it. And I think the unique thing about our team was

2:45 on the data side specifically, like, I had the measurement team, production data, and reliability, and some of the normal operations stuff. But the data team that we had was actually made up of

2:56 petrotechs, as opposed to being centralized through IT,

3:02 we actually had the experience of the people that knew what was going on, knew the workflows, understood what needed to happen, what data would help. So as opposed to, you know, the old model

3:17 was just go to IT, tell them your requirements, and then iterate indefinitely until - Wait two months. Until no one wants to do it anymore, right? Yeah. Yeah, I mean, I think that that was a big thing.

3:29 Even seeing that difference between, say, what we did at Grayson Mill and how they did it at Devon. I don't think any one way is right or wrong, but, say, for us, it was like, I was running

3:36 the data team, but we put like a tech or an analyst in each kind of functional group, and they were kind of dotted-line back to us. We were making sure we were keeping those

3:45 lines of communication open, versus, you know, seeing different organizations where it's, you know, a solid line back to, say, IT or a data organization. And they kind of, they sit in the

3:54 group, but, you know, they have two masters, and, you know, it's... I think we made that decision not super intentionally, but I was big on, like, I think the first one we did was

4:03 putting a gentleman in the production team, and it was like, we could keep him on our team and it could be fine, but, you know, our VP of ops, Ken, he's gonna want his guy. Like, he's always

4:16 gonna want his guys, so just put him there. And so he's at Ken's beck and call, but he's gonna work with us and communicate with us, and they're gonna get it from the

4:23 centralized source. But like, yeah, I think it's, for me, I think it's important to have that expertise, like, in the group, working directly with the production engineers or whoever

4:32 the people are, even the foremen and all that kind of stuff. Yeah, for me it was important that we were under the operations executive vice president because, you know, culturally a lot of things

4:48 end up being top-down when it's transformational change. And so when you're in their org and you can go sit with them and get their buy-in, then you don't have as much opt-out, right? So if I'm

5:00 going to a drilling manager and saying, Hey, your WellView isn't complete, and we can't use this for the analytics that we're trying to build out, then if I'm in IT, he says, Who are you? I don't

5:11 care, right? But now he's like, Oh, the EVP said so? Okay, you know, I'll get on board. I might not like it, but I'll get on board. So it was better to be in that organization to effect the

5:25 change that we needed to make, so that was a positive. Yeah, what are some of the things that you think, and this doesn't have to be specific to any company, but what are some things that

5:36 you think operators should do, or lessons learned that you've experienced that are really good and valuable from that perspective, and what are some things that you see commonly that are, you're

5:47 just like, why are we still doing things this way? Oh, yeah, this will be the entire podcast. Yeah,

5:54 it's interesting, the lessons. I think the first thing is just around data and workflow, you

6:05 know, with consolidation and reduction and everything else. We've seen that.

6:12 And one of the first things to drop off somebody's plate is like scrubbing the data. Yeah. So when you lose staff and all of a sudden you've got to do more, well, you probably have to stop doing

6:26 something else. And what they usually stop doing is looking at the data. You might lose a tech in a group and somebody's doing double duty. And you can see that through time, where data quality goes

6:38 down, completeness goes down.

6:42 And I don't think they cared as much.

6:48 So we tried to somewhat incentivize people around the data as an asset. Yeah, treat it like an asset. So I think that's critical for people to continue to do in basically all environments. And

7:06 are you thinking, like, is it field data entry, you know, that kind of stuff, all the way through, like, even just having different checks or exception reporting? Or like, yeah, I mean, I

7:14 think it's all of it, right? Now we've gotten a lot smarter at how we review data. Yeah. Right. So exception-based surveillance on your data is the next level, where you can actually look and say

7:28 that lateral length doesn't make sense, or whatever, instead of having to go in and read every daily report.
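Exception-based surveillance like this can be as simple as range checks run across the well table. Here's a minimal sketch in Python; the field names and plausibility bounds are invented for illustration, not taken from any particular system:

```python
# Minimal sketch of exception-based surveillance on well header data.
# Field names and plausibility bounds are hypothetical, for illustration only.

def find_exceptions(wells, bounds):
    """Return (well_id, field, value) triples that fall outside plausible bounds."""
    flagged = []
    for well in wells:
        for field, (lo, hi) in bounds.items():
            value = well.get(field)
            if value is None or not (lo <= value <= hi):
                flagged.append((well["well_id"], field, value))
    return flagged

bounds = {
    "lateral_length_ft": (1_000, 20_000),  # a 95 ft "lateral" is a data entry error
    "td_ft": (2_000, 30_000),
}

wells = [
    {"well_id": "A-1", "lateral_length_ft": 9_500, "td_ft": 18_000},
    {"well_id": "B-2", "lateral_length_ft": 95, "td_ft": 17_500},    # fat-fingered entry
    {"well_id": "C-3", "lateral_length_ft": 10_200, "td_ft": None},  # missing value
]

for well_id, field, value in find_exceptions(wells, bounds):
    print(f"check {well_id}: {field} = {value}")
```

The point is that only the exceptions surface for review, instead of someone reading every daily report.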

7:37 So it's data capture, and I think the interesting thing is, get more than you think you're going to need. Yeah. You can't go get it later, right? Yeah. Yeah. And so you never know what's going to

7:51 come in the future where you could actually use that data, and it would be powerful. So, um, in all environments, keeping the data is going to make sense. Because you're going to need it at some point. You can't

8:01 improve what you don't measure, and you don't know what you don't know yet. Yeah, Bobby and I have very strong feelings on that. In our past lives, we worked at a gauge company, and so doing DFITs

8:13 or frac monitoring or interference tests or PTA and stuff like that. And it's one of those things. It's the same, at least with a DFIT, right? It's like, there's only one time to get that data and it really

8:24 doesn't cost much, and it really shouldn't mess up any of your operations if you do it right. But it's another thing, right? And so it's just crazy 'cause it's such a simple thing. But I think

8:38 even on the other side, you get into it, and then people wanted one-second data when they didn't even need it at that point. So then you're capturing unnecessary data, and I think

8:46 we've got a lot of that too. But again, it's like, you can always filter it out. You can't recreate it, right? Yep, but the thing is, in that situation, pressure is literally not

8:58 going to change enough in one second versus five seconds for it to matter, or a minute, for that matter. But yeah, I think the other thing, what success looks like, what good

9:13 looks like,

9:17 is giving the actual business users the right tools and access

9:24 And so I post a lot about, you know, IT locking things down.

9:31 Spotfire Information Links are your only access to data. Yeah, because it's frustrating. I would have people call me, how do I get access to the data? Okay, well, what are you comfortable with? What level

9:45 are you at, right? Because some people want Python, some people want SQL, other people have no idea, they just want to see this. Yeah, and so,

10:00 but I remember when we were locked down to accessing data through either the application and downloading reports, or going to Spotfire Information Links. And we didn't govern those very well, right? So if I

10:14 needed data from SAP, I would call up somebody, they would create a view, they would write an info link for that view, and it would be a one and done. I would never use it again, but it

10:26 perpetually stayed in the library. Yeah, so we could go grab that with no context. Exactly, it was never updated, it was stale, it didn't fit their need, but it's like, oh, here's SAP data,

10:37 this must be what I need. Hold on, that could be dangerous in the wrong hands, especially at a public company, if it's not, like you said, if there's not super good governance on the access side,

10:46 but no, I agree 100 percent. And hey, I've seen some bizarre shadow IT stuff go on with Spotfire information links, and I've done some of it too, like writing the SQL inside of Information Designer.

10:60 And then you just look at these data canvases and, like, why is my Spotfire slow? Well, you're pulling, you know, 5 million rows into RAM, and then you're doing, you know, an unpivot and a

11:10 case-when and an OVER statement, and now it's looping over every row to do that. Is that technically like a SQL injection, doing SQL in Spotfire? It's pretty crazy. But then at that point it's

11:21 running almost like a virtual, yeah, it's like a

11:27 VM on that server environment. What do they call it, a data virtualization layer, almost, where it's connected to a thing, but then the processing I think is happening on the Spotfire server

11:34 and then it's coming over. But then, that could be happening, and then it gets into Spotfire, and then someone does their manipulations and pivots and transformations on it. Yeah, I had a

11:44 chemist

11:46 and he was basically uploading daily production for all like US onshore.

11:56 maybe not for all time, but for a lot of time, right? Because these things are based on whoever originally built them, and if they didn't add the right filters, he was loading all that data to

12:09 summarize by asset, by month. And I'm like, I'm just gonna go into Snowflake, I'm gonna write a view for you. And he didn't understand anything I said, but I got him to use a data connection,

12:25 connect to the view. And he goes, wow, this loads in, like, seconds instead of taking all day. And he had no idea, right? So, but those are the opportunities where

12:39 educating people, understanding their workflows, what are you really trying to do, all that stuff matters. And the more people that can do it, it doesn't have to be me or my team. If I can train

12:52 other people to do it, they can support their peers. So there's a ton of examples where people are pulling everything and they don't need to. Oh yeah, big time. Yeah, I mean, there's a

13:07 huge, like, I mean, we got to meet people where they're at. And like, yeah, I mean, you've got so many different layers, you know. You've got some nerdy reservoir engineer who, you

13:16 know, taught himself, you know, Rust and wants to write everything in Python, and, you know, just give me access and let me go, give me an ODBC connection and I'm good, all the way down to, no, I need

13:26 you to build me this pivot table in Excel 'cause I literally don't know how to do it. You know, and there's everywhere in between, but

13:33 to your point, like your posts and what we're talking about, like, a Spotfire information link is a point of access, but it should not be the point of access, you know.
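The chemist story above is really about pushing the aggregation into the database, so users pull a handful of summarized rows instead of every daily record. Here's a toy sketch of that pattern, with SQLite standing in for Snowflake and invented table and column names:

```python
import sqlite3

# SQLite stands in for Snowflake here; table/column names are invented.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE daily_prod (asset TEXT, prod_date TEXT, oil_bbl REAL)")
con.executemany(
    "INSERT INTO daily_prod VALUES (?, ?, ?)",
    [("Bakken", "2024-01-01", 1200.0), ("Bakken", "2024-01-02", 1150.0),
     ("Eagle Ford", "2024-01-01", 900.0)],
)

# Instead of SELECT * and summarizing client-side, publish a view that does
# the monthly rollup server-side, so the user pulls a few rows, not millions.
con.execute("""
    CREATE VIEW monthly_by_asset AS
    SELECT asset, substr(prod_date, 1, 7) AS month, SUM(oil_bbl) AS oil_bbl
    FROM daily_prod
    GROUP BY asset, month
""")

for row in con.execute("SELECT * FROM monthly_by_asset ORDER BY asset"):
    print(row)
```

The user's data connection points at the view, and the heavy lifting happens in the warehouse rather than on their laptop.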

13:42 I can speak to what we did at GME, and actually, I was pleasantly surprised when I saw it at Devon, for a big, you know, public company. But like, you know, we made sure everyone had an ODBC

13:53 driver installed, like, when their laptops were created, and the DSN was created, I think, programmatically by IT. And then I was able to help them get the

14:01 connections, but you know, they could connect Excel directly to, you know, their, say, accounting data in, you know, Snowflake, via an ODBC connection, and they could pull it down. And what

14:10 people don't realize a lot of times is you can pull way more than a million rows into Excel, as long as you pull it into memory, you know, into the data model, and you can run pivot tables on a

14:18 hundred million rows if you want. Yeah. And you can fold that into the SQL on the backend, especially for direct query stuff. But

14:27 yeah, there was, I remember I was working a planning project. That's really what got me into data is KPIs and metrics. And so we came up with the KPIs and the reports and then it's like, okay,

14:42 now you've got to go build these and maintain them. And we were doing the worst thing. So if you're doing this, stop doing this. We were opening every application. So you would go into WellView and

14:56 you would pull your reports, or you would go into Spotfire and you would generate a report, and you would export it, put it in Excel. And so you had all of these systems in Excel workbooks, and then

15:09 you're doing VLOOKUPs across them, something

15:13 would change in the schema, your VLOOKUP would break. And I'm like, we've got to find a better way to do this. And that's really what got me passionate about data. It's like, my ass is on the

15:26 line to get this done, how do I do it better? And we went from, there's probably eight of us killing ourselves on workday six, you know, I mean, two, three o'clock in the morning, finishing up, to

15:43 making direct connections, using the right tools, upskilling people, and two people are knocking it out by noon. And so it was just a massive change that we saw.
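The workflow change described here, replacing exported workbooks and VLOOKUPs with direct connections, comes down to letting the database do the join. A small sketch under the same stand-in assumptions (SQLite in place of the real warehouse, invented schemas):

```python
import sqlite3

# Two source systems that used to be exported side by side into workbooks;
# the table names and columns here are made up for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE wellview_ops (api TEXT PRIMARY KEY, days_on_well REAL)")
con.execute("CREATE TABLE prod_actuals (api TEXT PRIMARY KEY, oil_bbl REAL)")
con.execute("INSERT INTO wellview_ops VALUES ('42-001', 12.5), ('42-002', 9.0)")
con.execute("INSERT INTO prod_actuals VALUES ('42-001', 5400.0)")

# One join replaces the cross-workbook VLOOKUPs, and a renamed column upstream
# fails loudly here instead of silently returning #N/A in a spreadsheet.
rows = con.execute("""
    SELECT w.api, w.days_on_well, p.oil_bbl
    FROM wellview_ops w
    LEFT JOIN prod_actuals p ON p.api = w.api
    ORDER BY w.api
""").fetchall()
print(rows)
```

The KPI report then reads from this query (or a view over it) on a schedule, instead of eight people stitching exports together at two in the morning.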

15:57 And then we ended up migrating to Snowflake as our cloud data warehouse. And I remember being excited about Snowflake until I saw it. And I go, this is just SQL. And like, I'm not a coder.

16:13 And I'm like, the people that need to get to this don't really know SQL, right? I'm thinking of all the technicians and support. They're not SQL users.

16:29 So I worked with IT and was, like, explaining, we need another layer in here, a usable layer.

16:36 Alteryx is my go-to secret sauce, but they didn't want to give everyone access. So we ended up just putting Sigma Computing on top. And once you taught people how to use it, basically Excel on top

16:52 of Snowflake, incredibly powerful. And the things that people were able to create and do on their own are just amazing. Yeah. And so, all SSO, you didn't have to worry, whatever roles you had

17:04 ported over. And it's not writing back to the database either. It can, I mean, you have control to allow them to do that. Exactly, yeah. Just lifting it up. Yeah, we did a POC, but like, for us,

17:14 what we were doing a lot with Power Query and Excel directly tied in, just having another tool to run up compute didn't make sense for our company at our size. But I know, like, I'm pretty

17:24 sure Devon has a case study with them, and y'all did as well. I think y'all went to the Snowflake conference and talked

17:31 about it. But I've heard really good things, and it's like an Excel-like spreadsheet interface that sits on top of it. And I think you can choose it, it'll run in the same

17:40 cloud even as your Snowflake instance, I believe, and yeah, it's pretty powerful. Yeah, I think the big thing there for me is, especially now with all the LLM stuff, how much more

17:53 powerful and better can our companies be if, one, we give people access to the data that they need, and two, we empower them to do something with that data to solve their own problems. And

18:02 it's like, whoa, yeah, like that's, we've started hearing more and more companies where they're starting to let the drilling and completions departments kind of have access to some of these tools,

18:13 either coding agents or whatever, but they're able to quickly solve their problems in stuff like Databricks. And it's like, why not give them a sandbox environment like this? Like, and

18:22 then you have formal ways to promote things to production, but like, you know, and then you definitely need to put bounds on how much compute and the things will time out and whatever, but like,

18:31 with the proper governance, like giving them a spot to play around with things, you just might be really shocked. I mean, and I've actually used it a little bit at one of my customers, but I mean,

18:40 like, I know within Databricks, their AI thing, they're really good as far as like helping write code and do stuff within there. I don't know, so Snowflake's got their answer to that. And

18:49 you've been using MotherDuck, like, there's a lot of things out there now that can upskill a noob to dangerous really quick. Yeah, I got my MotherDuck MCP stood up on my Cursor instance, and

19:01 it's awesome. I can query all those tables that we've been working on and be like, oh, this one sucks, I need to fix it. I don't have to open MotherDuck at all, I've never seen it. Yeah, no,

19:11 it's, like I said, I was hesitant to pick up Alteryx when it started, but the SQL I know is from Alteryx. Okay, yeah. And they have a visual query builder. And so you literally just

19:31 see your schema, and then you click on a table, and then you can select the columns you want, you can write the filters, and you just tab over and it shows you the SQL. So just being curious, I'm

19:45 like, what happens when I do this? Oh, that's a WHERE clause. That's what that means, a SELECT statement, right? And so I was able to get good enough at SQL.
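Mechanically, a visual query builder like the one described is just assembling a SELECT statement from clicked columns and filters, then showing you the result on the SQL tab. A hypothetical miniature of that translation:

```python
# A toy version of what a visual query builder does: turn picked columns and
# filters into the SQL it would show you on the "SQL" tab. Names are invented.

def build_select(table, columns, filters=None):
    """Assemble a SELECT statement from clicked columns and filter predicates."""
    sql = f"SELECT {', '.join(columns)}\nFROM {table}"
    if filters:
        # Each clicked filter becomes one predicate in the WHERE clause.
        sql += "\nWHERE " + " AND ".join(filters)
    return sql

sql = build_select(
    "daily_prod",
    ["asset", "prod_date", "oil_bbl"],
    filters=["asset = 'Bakken'", "oil_bbl > 0"],
)
print(sql)
```

Tabbing between the clicks and the generated text is exactly how you learn "oh, that's a WHERE clause."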

19:58 The only problem that I had is I had,

20:01 I don't wanna say too much access, but for our Snowflake instance to load, it was basically caching all the metadata. That's what Alteryx was pulling onto its servers. I mean, it wasn't like it was

20:13 running on top necessarily. Right, so it was sitting beside it, right? And so every time I wanted to open that connection, I had to wait. Okay. And so what I started doing is either going

20:24 directly to Snowsight and doing an investigation, or DBeaver or

20:31 something. And then I would take the query and go in and - So does Alteryx have almost a reverse, where you could provide the SQL and it generates the - Yeah. Oh, that's pretty sweet. So you can do

20:42 SQL, you could do the visual query builder

20:50 The trick was, if I was connecting to a database, remembering what database it was. So is it Oracle, or is it SQL Server, and, you know, the SELECT star, or SELECT TOP versus WHERE ROWNUM equals,

21:02 and so, yeah. But, you know, every person that came on my team got a license and picked it up, quick learning curve. Nice. And so, I still fought till the very end to have it, until I

21:20 was done, so. What were some of the, like, how much pushback did you get when y'all kind of proposed that idea? 'Cause normally, at least traditionally, in oil and

21:29 gas, IT does not like to give people access to anything, much less lots of data. Oh, I was fortunate. So, it really started

21:42 We ran this planning project, and we were transforming the way we did corporate planning. And so the planning variables database was

21:53 part of it. Okay. Yeah. So it was multifaceted, like our PV inputs, um, trying to normalize Enersight, even when we did type curve updates. Like, we just redid the whole way we did planning.

22:06 And so I was reporting directly to, uh, the guy who essentially became the vice president of planning, and I would go, hey, I need this access and they're not giving it to me. And he would go talk to the

22:22 CIO, and they'd be like, all right, give him the access he needs. And after about three iterations of that, they just go, okay, you're kind of like a made man. We'll let you have access. And, um, I

22:38 think we only really made an impact one time to a production system. We hit BW while they were trying to run consolidations, and it's all on HANA, which is in-memory, and we crashed it, and so

22:55 we're like, All right, we need to put some guardrails around this one. But - Most of the time, too, if you're centralizing to a Snowflake, then a lot of times, once they've got the data coming over, it shouldn't even hit the other side. When we went to Snowflake,

23:09 it was no issue at all

23:13 One of the other things that we were able to get done, and I'm surprised that more people aren't doing this, is everybody thinks data is sensitive. Oh, you're in that asset, you can't see this asset.

23:26 Or you're in operations, you can't see this, and it's like, I'm a custodian of the company. Why would you try to keep this from me, right? So what we did was flip it on its head, and we said

23:39 everything is gonna be general access, everyone can have it, unless you deem it sensitive. So instead of just assuming it's sensitive and fighting to get it open, really the only thing that

23:51 ended up being quote-unquote sensitive was the financials and the forecast. And all they were trying to do was make sure you didn't have a complete view of the company. Yeah, so you can't go trade

24:05 stock on it, right? Yeah, it's like, all right, you can have forecast, but just this segment. Right. You can have actuals, but just this segment. And so that really helped,

24:18 just accessing in general, right? Yeah, it makes, I mean, it makes a ton of sense, right? Like, let's not lock everything down, let's only lock the important things down, right? Like, we

24:28 don't want people accessing, you know, HR and people's salaries, but if they need production data, why wouldn't we give them production data? Yeah, and I mean, I guess we kind of did

24:48 that by default at GME also, but then, when I went to Devon, they had a similar kind of secure-by-exception model, essentially, where it's all open unless we say it needs to be

24:56 locked down. And I think, especially with the public companies, a lot of times they'll even have like a separate instance, you know, for the financial stuff, so it's totally

25:01 partitioned. And I think it's a lot easier for audit purposes and everything too. Yeah.

25:17 The only, like, negative of that I've heard was apparently Chesapeake back in the day, and this is like early teens, maybe even before that. But their, like, exploration folder was public. And so one of

25:25 the land guys would just feed all the data to his brother, and he'd go lease up everything. And they didn't know. Well, I mean, that's data governance. I mean, that's part of this. Like, you

25:33 arrive at this, but then, like, you actually still need to govern it. You can't just be like, well, it's all open, you know, here. I mean, like, if you choose to do that, then, I mean, on

25:45 the same note, right, it only takes a few times of buying your new exploration acreage from the same person to be like, What's happening here? How is he always ahead of us? Yeah.

25:45 That only goes so long. You know, I think at the end of the day, people that are going to do that are going to find a way to do it. They shouldn't be working for you anyway. Yeah, exactly. Yeah.

25:53 And again, it's like

26:09 trust but verify. But until somebody proves that they're not trustworthy, just trust them. Yeah. Most people, hopefully most people that you've hired, you've done a good job

26:17 of vetting them. Yeah. You know, they have the company's best interest at heart, and you've aligned their incentives well enough with the performance and everything. Yeah. I used to joke

26:32 around when I was on the planning project, I had unfettered access and it was all segments. I could see forecast, I could see actuals. So I was on the blacklist, I couldn't trade during certain

26:45 periods of time. And I used to joke with my boss, what am I missing? And he's like, what are you talking about? I go, I have access to all this data, and either I'm too busy to spend enough time to look

26:56 at it or I don't know what I'm seeing, because I can't tell if this is gonna push us higher or lower. And then you never know how the forecast is gonna be interpreted

27:05 by the street. Like, are you sandbagging? And they think that you're gonna beat it anyways. You know, it's like. Yeah, you never know, right? I always think it's fascinating that all of them

27:05 come up with their estimates and it's like, how the hell are you estimating anything? But then also, everything is based off the estimates. So it's like, if you just come out and are like,

27:25 they're gonna do good, and they did good, but they didn't meet your estimate. Yeah. Now, is that still a negative? Like, how is that possible? Well, I know that when GME got acquired,

27:28 they were worried about

going too far over forecast. Yeah. Because the street might not like that.

27:35 And like, isn't that crazy? Like, why is that a bad thing?

27:39 So let's pivot a little bit, because I think we could go all day on some of this stuff. Anyways, but so, Justin, you've been doing a lot of, that's kind of how we came to this to begin with, but

27:47 posting a lot on LinkedIn.

27:50 You know, trying to find his next job, if anyone's hiring. Yeah, don't stop. Because that's what he's saying. Accepting offers. So, you know, he's a free agent. But I mean, how many days in are you

28:03 now? I mean, over a hundred. Yeah, over a hundred days consistently, straight up. But I mean, you don't realize until you try to do it, like, that takes a lot, especially at the quality

28:13 that you're doing as well. Like, it takes a lot of diligence and stick-to-itiveness. And to do it even, like, on weekends and holidays. And I'm sure you're scheduling some of this out,

28:22 I assume. Maybe, maybe not. I have these bursts of thought

28:29 for content. And I'll put that in notes.

28:32 But it's usually kind of day-of or day-before.

28:37 I was joking with some folks the other day, so I usually use ChatGPT or Gemini to generate an image for whatever the post is. And I was fighting so much to get the image I wanted, I just changed

28:53 the topic for the day. It was like, I wanted a woodworker, like, working behind his back, right? So working on something blindly. And it would not do it. It was like, this is not normal, and

29:07 I've never seen this, and the body doesn't contort that way, so I'm not going to give it to you. So

29:13 yeah, they're still predictive models at the end of the day. If it doesn't have something to base it off, it doesn't do very well. My mother-in-law texted me today a recipe for pumpkin moose.

29:26 So I went and typed pumpkin moose into Gemini and it created a pretty awesome picture.

29:32 It's like this moose with a bunch of jack-o'-lantern decorations that make it up. That's incredible. That's pretty awesome. Um, so, I mean, like, again, we're well over 100 days in now. So, you posted,

29:42 you actually did some analytics on your posts here recently. But, um, can you go into a little bit, like, what are some of your most popular ones? And I'd be interested because

29:52 one thing that can be infuriating about LinkedIn, or the algorithm, is how much the algorithm changes, and you may have been doing it long enough to see it change over two or three, four months. Um,

30:01 but like, what's one that at least you were, like, super proud of, and maybe it got numbers, maybe it didn't? But yeah, there's, uh, it's interesting. I actually ran an experiment, because one of the

30:14 ones that got probably, uh, outsized attention was, it was one of those days, like, I was busy, I didn't have enough time, and I go, what can I post? And I remember seeing somebody else had done

30:29 this. So I gave my LinkedIn profile picture to ChatGPT and said, roast me. And that was my post for the day, the roast. But my LinkedIn image is me standing in front of the building, which is

30:47 the proxy for it's your last day. Yeah, like, so, but now I call it the Exodus photo. Yeah, so I think with the majority, I could tell by people's reaction if they actually read the post,

31:00 versus just, like, liked it or thumbs-upped it, or, hey, you know, thinking about you, and it's like, okay, you didn't read the post, you just saw the photo and you assumed I'm saying it's my last day.

31:12 And what was funny is people that were, like, emailing me, they had done it themselves, you know, and shared the results. And I think the more you use the model, if you were using a model that you had

31:27 been using for a while, yeah, it knows more about you. Exactly, it took some shots it normally wouldn't take.

31:37 I think the others are kind of cross-industry. So I had one, and it was the CEO looking out across his labor force, and it was

31:52 like, the majority were employees and then a small slice of contractors, and then he says, I think we need to make headcount cuts. So the next image is the exact same amount of people, just now more

32:03 contractors and fewer employees. And

32:09 the point was around metrics. If your metric is headcount, that's not a good metric. It's about labor costs, no matter what bucket they come in. But that really caught fire because people could

32:24 see it across any industry

32:28 And there was one about

32:30 consultants recently: an employee pitches an idea, the CEO says, I hate that. The consultant pitches the exact same idea, and he says, this is wonderful, right? Yeah, and it's gonna be 10 times the

32:42 price, too. Yeah, so why do you think I'm doing

32:47 it now? But I think you get better overall engagement when it's more agnostic, just those general pieces for the industry and the pain points that

33:00 I think we have. It's somewhat liberating, not having to worry that like your boss is gonna come in and be like, Hey, is that post about me? Yeah, yeah. And so, yeah, you can kind of say the

33:13 stuff that people are thinking. Yeah, sure. Yeah, it's funny. I've found, so the most recent one that we had from our podcast was one where we were talking about how people still post their API

33:24 keys on GitHub. It got like 600,000. You put it on TikTok, didn't you? I didn't - well, I don't have a TikTok. Oh, yeah, I did, yeah. Yeah, it hit 600,000 impressions on TikTok on the

33:35 Kaleid account, which is crazy, but yeah, it's more general, right? Like you had all the AI people talking. It's amazing when your TAM is wider. But

33:45 it was mostly all the software people being like, This has been happening for a decade, and people are stupid, and they shouldn't do that, just arguing in the comments. But that's another piece

33:55 of this that I've learned from watching Colin and stuff, is it's like there's a game to it, right, where Colin will post frack with a K, and he knows that's going to ignite an entire subset of

34:07 people to come in the comments, which just drives traffic, right? And so it's interesting that, but then the flip side of it is it's like, I found that some of my spur of the moment of like, Oh,

34:18 this is funny, no one will - Oh, yeah - I don't think anybody will care. And it's like, I take a picture of the cars parked out front, and Chuck's is the only one pulled in head first, and I'm like, point

34:29 out which finance guy drives which vehicle. And it's like, yeah, that one hit with the religious people too. But like that literally took 30 seconds of thought and

34:40 I didn't think it was going to do well. And it was a great one. It's such a weird thing. And it's like, shit posting - Yes - always, always wins. Yeah. I mean, it could be something

34:49 you didn't even run through ChatGPT, but you put your heart and soul into it. Like spend an hour and a half curating it, post it, crickets. Like, you know, that was a funny joke, I thought, out there.

34:58 Yeah. Yeah. My OGGN one also popped off pretty, pretty well, even though I wasn't attacking anyone for the record. I will state that just just looking at data. Yeah. But it's, yeah, it's such

35:11 a weird world as far as like figuring. Have you seen any changes since you've been doing it consistently or not really? I can't really, I haven't really done the analytics. It's a whole like, I

35:23 think there's time of day. Yeah. like if you're posting in the morning on a Monday versus mid day on a weekend.

35:33 And what I found is you can usually tell within the first two hours. Oh, yeah, because after that, you're somewhat falling off. Um, I think, I think, but even the algorithm itself,

35:45 like if you get some interaction, and then if someone interacts and then you interact quickly back, like it builds momentum, and in that hour it'll start cooking. But like if it just

35:53 sits there dead for 30 minutes, like you're, you're cooked. Like it needs to be true engagement too, not just people liking it, because that'll only get it so far, right? Like, that's a,

36:02 another big one is getting a hook to get people into the comments to say something so that other people will then get into the comments of the same thing. Reposts comments, um, reposts have been

36:16 terrible for me. Like, yeah, I mean, and they used to, like, hawk it, like, this comment should maybe be a repost, and I did that and it was good. No, they don't, they don't do

36:24 shit. They're pretty - like, I don't understand the why, because, like, I'm all right. I tried to repost the,

36:32 um, the posts I had of, like, when I did the podcast with Funk, because it was the first podcast I ever did, and dude, I got like 60 impressions on that thing. I was like, this is depressing,

36:43 like, because it's actually a thoughtful, like, post, and it got nothing. Like, yep, no, they don't, they don't like reposts. They don't like it if you include external

36:52 links in the post - there's all kinds of stuff. I noticed that same TikTok video that got 600,000 impressions, uh, when I posted it, I posted it with the TikTok watermark on it. Oh, yeah. I mean,

37:03 it did awful. And then I posted it without, and it did way better. And it's like, the amount of tech that they're using to understand what is in a post is pretty wild. Because you mentioned

37:13 that you're using AI generated photos and I think, weren't they trying to do some stuff to kind of suppress that a little bit too? I don't know if they're suppressing it as much as just making sure

37:21 that people know that, like trying to tag it as AI or something. But to that point, did you see that Vine is coming back? No way. Yeah, I think they're calling it diVine

37:32 and they're banning AI content. Like that's their niche, which I think there's a huge market for. I think it's gonna be really interesting. Yeah. Apparently, they're also starting with all

37:42 the old Vine videos, which makes me really happy, nostalgic as that is. Yeah, no, it's,

37:49 you see people that blindly believe, like I've got small kids. Yeah. And I'll show them a video and they're like, they're convinced that the grizzly bear is jumping on the trampoline. Right, doing

38:01 back flips. Yeah.

38:03 So it is, I can see it starting to go the other way, just from a reputation perspective, you know. True. There's obviously the cartoon elements that, yeah, who cares, right? But the more

38:21 realistic stuff,

38:24 they're starting to suppress that, and

38:27 videos have not done well. I'll say that. I'll also agree with that. I don't like that they've gone to that more reel style. Yeah, it makes no sense. Like, it's supposed to

38:35 be on the phone, like it looks like a Facebook Reel, but on LinkedIn, I think it makes it hard to interact with. Like maybe you might get a thumbs up or something, but like I

38:43 liked it when it was more just embedded into the LinkedIn feed - Yeah, yeah. Or whatever it is. Yeah, no, I didn't, I never understood that. Like, well, 'cause, I mean, at least the way I

38:54 would think they intend people to use it is from the phone. But I'm not going on LinkedIn to look at reels. No. Like I'm going on LinkedIn to look at posts. Yeah. It's a, yeah, it's a weird one.

39:07 The post part that you may put as context for it kind of gets just like - It's lost, yeah, lost in the - For sure. And then, yeah, length of post is another, at least for videos is another big

39:18 one, like we've A/B tested that a lot. And so some platforms are better: Twitter is really good for longer-form content, and YouTube, obviously, but then Instagram and LinkedIn are normally

39:28 better for shorter under a minute or two. And it's like, okay, oh, that's fun. Just another thing. But that's the point of all of this is like that data and analytics rabbit hole around social

39:40 media is so deep, like there's so much data that they're collecting. And you're just like, Oh, okay, this is a lot more than I was anticipating and all the things you're trying, like all the

39:50 KPIs you're trying to monitor and measure. You're like, Oh, man, time of day, type of post. Does it have an image? Does it have a video? Does it have a link? Like all these variables. And so

39:59 it's, it's a fun thing from a data perspective, because you're like, Oh, okay, let's play around with some of this stuff. So maybe to get us back to where we can dive in more: what's

40:08 a post that you've made that, again, is maybe near and dear to your heart, or that you're super passionate about, that we haven't really gotten into?

40:19 Yeah, I'm trying to think - it's a lot of them - trying to remember.

40:24 I can't remember what I ate yesterday.

40:29 I think it's really about the enablement.

40:37 Trying to give people the right tools, meet them where they are. I've got some

40:43 fundamental principles around data like

40:47 accessible, clean, complete

40:50 Making sure people have the right tools and kind of that empowerment perspective. There are

40:58 not enough people in the industry talking about that stuff, in my opinion, at least publicly. You might go to a conference or something. Well, we've talked about it a lot. I

41:06 mean, that's kind of where I focus my energy, too. Even on the consulting side, everyone wants to talk about AI and all this, and I'll let them talk about it I

41:16 still walk in places and people don't have the fundamentals down, and it's just, you know, you've got to have that first before you can do it. You know, we haven't been able to do data

41:25 science without it. You haven't been able to do machine learning. So why do you think you can do AI without it now? Like we'll just give it to the language model. It'll figure it out, Bobby. That's,

41:32 that's all right. I went to a conference. I think it was last week and future of AI, I think. Yeah. Yeah. It was interesting because they had a panel and I think the majority of the panelists

41:45 were more IT and they asked them what, you know, the, the hurdle to AI was and their perspective was so disconnected. That's so spot on. The IT perspective is security. Yep. Yeah. And everyone

42:03 in the audience, in a poll, is like, data. I think, well, data quality - we've had the problems. It's like, yeah, but you don't understand, this is going to exponentially worsen the problem.

42:17 Like when you have casing in WellView that is sticking out of the ground on the diagram, it's like, that model doesn't know that you can't be, you know, 10,000 feet in the air, and it's like,

42:34 oh, okay, yeah. So

42:38 again, when you're trying to use your data to add context, to answer questions, and it's bad data, you're going to get bad training and you're gonna get bad results. And I had the advanced analytics

42:53 team for a little bit, and when they would create their ML models, we would talk about public data, but they were literally bringing in entire basins' worth of data. And so a couple of bad data

43:07 points really didn't affect the model, right? These are not outliers. Yeah, yeah. They didn't do as much scrubbing. Yeah, it wasn't enough to manipulate the model one way or the other, right?
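The casing-sticking-out-of-the-ground example from a moment ago is exactly the kind of bad data a few lines of validation catch before it ever reaches a model. A minimal sketch in Python, with made-up column names and values (not any specific system's schema):

```python
import pandas as pd

# Hypothetical well-header records; columns and values are illustrative only.
wells = pd.DataFrame({
    "well_id": ["W-1", "W-2", "W-3"],
    "casing_top_ft": [0.0, -10000.0, 25.0],      # depth below surface; negative = above ground
    "casing_bottom_ft": [5000.0, 4000.0, 20.0],
})

# Simple physical sanity checks: casing can't sit above the surface,
# and the bottom of the string must be deeper than the top.
bad = wells[
    (wells["casing_top_ft"] < 0)
    | (wells["casing_bottom_ft"] <= wells["casing_top_ft"])
]

# Flag (or blank out) the offenders instead of feeding them to a model.
print(bad["well_id"].tolist())  # → ['W-2', 'W-3']
```

The point being made in the conversation: checks this cheap are what keep a "10,000 feet in the air" record from quietly poisoning training or context data.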

43:19 Um, where I think now the more pointed, especially LLM and other elements, you have to be way more careful. And the approach I took, so one of the things that we did at the company was, we

43:36 created a master attribute table. So golden record, whatever you want to call it. We supposedly had one,

43:46 but it was more data blending, which I disagree with, because essentially you say this should be the source, but if the data's not there, go to the next source. And if it's not there, go to the next

44:00 one. I would rather have no data than bad data, or data that's not verified. So we basically joined up eight or nine different systems, said every well is a record, every attribute is a column. We

44:18 put a life cycle trigger to say, like, a plan versus actual. So it would change

44:29 the source. And when we were done, I took it to the executives and I go, your ugly data is now available. Here we go. I'm not changing it in the pipeline. I'm not changing it anywhere in the

44:43 system. If it's important enough to change, we're gonna go change it in your actual database, or in ProCount. And then you would have blanks, and I'm okay with blanks. I'll serve those blanks.
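For what it's worth, the no-blending approach described here can be sketched in a few lines of pandas: one designated source per lifecycle stage, and a blank stays a blank instead of falling through to the next system. Every system, column, and value below is invented for illustration, not the actual schema:

```python
import pandas as pd
import numpy as np

# Two hypothetical source systems keyed by well, plus a lifecycle flag.
planned = pd.DataFrame({"well_id": ["W-1", "W-2"], "lateral_ft": [10000.0, 9500.0]})
actual  = pd.DataFrame({"well_id": ["W-1", "W-2"], "lateral_ft": [10250.0, np.nan]})
lifecycle = pd.Series({"W-1": "actual", "W-2": "actual"})  # the "trigger": plan vs. actual

master = planned.merge(actual, on="well_id", suffixes=("_plan", "_actual"))
master["lifecycle"] = master["well_id"].map(lifecycle)

# One system of record per lifecycle stage -- no fallback to the next source.
# If the designated source is blank, the master stays blank on purpose.
master["lateral_ft"] = np.where(
    master["lifecycle"].eq("actual"),
    master["lateral_ft_actual"],
    master["lateral_ft_plan"],
)
print(master[["well_id", "lateral_ft"]])
# W-1 -> 10250.0 (from actuals); W-2 -> NaN, surfaced as a blank to go fix at the source
```

The design choice is the one argued for in the conversation: a visible blank tells you to go fix the source system, while silent blending would have served W-2 a plan number as if it were an actual.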

44:55 Yeah, because now we know, like now we know, yeah. And again, I think that view of something is better than nothing isn't the case when it comes to data and you're really trying to do something

45:05 with it. Yeah, it's such a dichotomy, right? Because I'm the same way in your earlier comment of like, I would much rather get too much data than have not enough. But the flip side of that is

45:16 the data that you use, you need to validate and verify and make sure it's not just complete garbage. And so that's the nuance of a lot of that. I also want to go on a diatribe for a minute and tell

45:28 all of these people putting on AI workshops and conferences to quit having these C-suite IT people as panelists on these panels, because they're so disconnected from what is

45:44 happening. Like, oh, we use it, you know, to help write emails and stuff. And it's like, that's not why people paid money to come to this conference, to hear about your

45:54 open enrollment bot. Right. Yeah. Yeah, we made a RAG on our HR policies. It's like, okay, cool, is that moving anyone's needle? So I mean, to that end, I mean, like you've

46:04 been at, you know, at least one major now, and obviously they had to be testing out some AI. I mean, have you seen any legitimate wins from

46:14 using AI, especially on the operations side? I mean, so this is a future post, but. It was brought up at the conference.

46:24 I don't remember who published the report. It might have been multiple, that like 95% - MIT. Yeah - of pilots or POCs have failed, and they were trying to discuss why. And to me, it's obvious.

46:43 They were never designed to succeed. When AI became a buzzword and people started going out in earnings calls saying, We're leveraging AI. Then boards of companies started to say, How are we using

46:59 it? We have to be able to say the same thing to compete. So then executives go, We need to be able to say this. Put it in AI, yeah. And we did it. We had people more than willing to give

47:12 us free proof of concepts. And there was never the intent to try to scale these.

47:20 It's a check the box. We did it. And this was a couple of years ago. And I remember that was kind of my first introduction to like behind the curtain. And I won't say any names, but they

47:35 essentially said, what are like 10 questions you would ask the model? And then what are the queries you would write to answer those questions? And when we got the proof of concept back, it could

47:48 answer those 10 questions flawlessly. But as soon as we ask another question, it literally broke. And it's like, okay, we're not far enough along. And we don't have the resources or want to pay

48:02 for somebody to just sit there and basically code in a model to answer every specific question. So yeah, I viewed it as,

48:13 here's the way the technology works Here's what it would take to implement. And we weren't ready to really take that on.

48:21 I haven't seen anything

48:25 beyond the just

48:28 general. You ask a question, how do I get this software, and it tells you to fill out a ticket, right? You know, it's more like a general company copilot is what I've seen so far. Yeah, I

48:45 think we talked about this at the Whiskey and Data night - I think it's kind of where we arrived. It's like, I think more companies need to be giving access to things like Cursor or whatever else that are

48:54 amplifying what they do, but I think that's the best use of it right now. I would like to amplify, make a person twice as efficient at what they do, if they wanna code or certain things, and make

49:05 them better at it. But we're definitely not at just put it on top of everything and it solves all your problems. No, well, and that's the thing too, right, is it's

49:14 like, even in our experience, deploying some of our search products into some of these enterprises,

49:21 You know, it's a chicken and egg thing, right? Like I can train this model if I know all the questions you're gonna ask it, but you don't even know all the questions you're gonna ask it. So how

49:28 am I supposed to train the model? And so in my experience with that stuff, especially on just like RAG or search, whoever you're working with should be very clear that it's an iterative

49:40 process, right? Like one of our first clients was Kraken, and we were doing the search on their midstream contracts, right? And the first couple - like we have now built in a

49:52 two-week kind of beta period where the users just go in and test and ask their questions and all of that. And so we had that period, they were getting a bunch of bad results. We went through,

50:04 there was some stuff on the chunking side that we changed and we immediately got so much better results. But that's just for that one use case. And so for the next use case, the chunk size might

50:06 be different,

50:16 the temperature might be different, the model we use might be different. That's another piece of this that I don't think people appreciate: it's not going to be perfect out of the box ever,

50:26 like we're nowhere near that. But also because it's not perfect doesn't mean that it can't get there as well.
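Since the chunk size, overlap, temperature, and model all vary per deployment, the iteration described here amounts to re-running retrieval under a different config for each use case. A toy sketch of just the chunking knob, with invented parameter values (not the ones used on the contracts project):

```python
# Split text into overlapping character windows for embedding.
def chunk_text(text: str, chunk_size: int, overlap: int) -> list[str]:
    step = chunk_size - overlap
    return [text[i:i + chunk_size] for i in range(0, len(text), step)]

# Each deployment gets its own config, re-tuned during a beta period.
# Names and numbers below are purely illustrative.
configs = {
    "midstream_contracts": {"chunk_size": 1200, "overlap": 200, "temperature": 0.0},
    "drilling_reports":    {"chunk_size": 400,  "overlap": 50,  "temperature": 0.2},
}

doc = "x" * 3000
cfg = configs["midstream_contracts"]
chunks = chunk_text(doc, cfg["chunk_size"], cfg["overlap"])
print(len(chunks))  # → 3
```

Swapping in the "drilling_reports" config re-chunks the same corpus differently, which is why the two-week beta period exists: no single setting survives contact with a new use case.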

50:35 And so that's an interesting point. The other thing about that paper that you mentioned that no one talks about, because it wasn't in the headline - 95% was the headline - is there was a section

50:45 in there about 60% of industry-specific deployments having been successful, or like vertically integrated things have been successful. And I think that ultimately ends up being the reality because, you

50:59 know, like I thought about this the other day and it just like blew my mind, you know, all the foundational model companies, like thinking about trying to train one model to be good at everything

51:10 and to know everything, like we're trying to do it just for oil and gas and it takes a long-ass time to get them right. And so like these general foundational models, trying to be good

51:21 at, you know, have a PhD, pass the bar, pass like all these exams and stuff. It's like, and even then, it's like, okay, well, you just took the questions from the exam and you trained it to

51:31 answer those questions. All those benchmarks that are out there, it's the same thing. Those models are just trained on the benchmarks. That doesn't mean they're a better model. It just means they

51:38 beat the benchmarks because they did a better job training them or whatever it is. And so a lot of it is there's just so much nuance with all of that stuff But one of the things you brought up

51:49 earlier, I was gonna chime in, but I didn't. But I feel like now's a good time is there's so much, I don't know what I don't know in our industry. And that ends up being like almost a gatekeeper

52:04 for technology or for automations or for things like that, where it's like, Bob in regulatory knows how to use Excel. That's all Bob knows how to do. So everything that Bob's doing for his

52:14 regulatory stuff is in Excel. He doesn't realize that I can pull the data directly into Python and I can connect to the database and I can do all this stuff in Python. And I don't have to worry

52:24 about, you know, the well name starting with a dash, turning it into a formula in the Excel sheet. And it's like, there's just, and so there's this knowledge gap where because we've had that for

52:34 so long, I think, you know, AI is going to improve those types of workflows well before it's doing your job for you. Yeah, so to that point, we,

52:47 we were quick to lock down GPT when it came out. And then

52:55 we had some conversations and said, okay, as long as we understand it and we can control it. So we made the enterprise agreement, you know, kind of walled it off in Azure, and everything worked. So we

53:09 were on a data council with IT directors and myself, and it kind of became the like AI council. So we would start to talk about what are the use cases, and we would survey people. And we got a

53:24 lot of responses, but when you actually read what they were trying to do, it's like, this is not AI. Like, this is a VBA macro, right? This is a bot - Right.

53:37 But again, they didn't know that this has already been available for a decade. They just don't know how to do it. Yeah, I remember telling our interim CEO at University Lands, like, it pains me

53:48 to say this, but if we just taught half the people here how to record a macro, they'd be way more efficient. Yeah. Oh, it's true though. Right, like they don't even know that

53:57 that exists. Like they don't know that they can just go hit a record button, you know, delete these columns and, you know, run this filter and then bam, it works and you can do that every time.
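For what it's worth, that recorded-macro workflow (delete columns, run a filter) is also just a few lines of pandas, and reading everything in as text sidesteps Excel's habit of treating a leading dash as the start of a formula, the gotcha mentioned earlier. File contents and column names here are hypothetical:

```python
import io
import pandas as pd

# Stand-in for a regulatory export; the data is made up.
csv = io.StringIO(
    "well_name,status,scratch_col\n"
    "-BRISCOE 1H,ACTIVE,x\n"
    "SMITH 2H,PLUGGED,y\n"
)

# dtype=str keeps names like "-BRISCOE 1H" as plain text; Excel would
# have tried to evaluate the leading "-" as the start of a formula.
df = pd.read_csv(csv, dtype=str)
df = df.drop(columns=["scratch_col"])   # "delete these columns"
df = df[df["status"] == "ACTIVE"]       # "run this filter"
print(df["well_name"].tolist())  # → ['-BRISCOE 1H']
```

This is the "better tool" version of the same recorded steps: repeatable every time, and connectable straight to a database instead of a saved spreadsheet.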

54:07 But like, I don't want people to learn that rather than go another step and use a better tool for it. Yeah. And yeah,

54:13 the problem is everyone pitches LLMs like they'll do everything for you. You know, every foundational model company wants you to think it's the best thing since sliced bread,

54:26 but it's a tool. Like that's how we have to think about this. It's another tool in the toolbox. But like, I can't tell you how many times I've been in a client's office, first meeting, and

54:35 there's always, there's always a production or like an IoT guy in the room. And they're like, I want it to optimize all my artificial lift. And I'm like, well, it's a language model. And so all

54:45 your artificial lift data is numbers, and language models are not built for numbers. I was like, you need a machine learning model that was probably built 10 years ago. Yeah. Didn't someone say at

54:54 that Whiskey and Data night that they had it take the table of data, but they converted it into sentences or something like that? Like, on this day - Yeah, you can do that - it made 30 barrels of oil,

55:03 blah, blah, blah. And like it turned a table into language, and then it was able to do language things on it. - Because that's what it is, again, it's not magic. A language model is predictive

55:15 language. And if it doesn't have language, it doesn't know what the hell to do. Yeah, but I think

55:21 maybe it was a post yesterday.
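The table-to-sentences trick described above is straightforward to sketch: serialize each row into a sentence so a language model has language to predict over. Field names and production numbers here are made up for illustration:

```python
# Hypothetical daily production rows.
rows = [
    {"date": "2024-03-01", "well": "SMITH 2H", "oil_bbl": 30, "gas_mcf": 120},
    {"date": "2024-03-02", "well": "SMITH 2H", "oil_bbl": 28, "gas_mcf": 115},
]

def row_to_sentence(r: dict) -> str:
    # Turn one table row into one natural-language sentence.
    return (f"On {r['date']}, well {r['well']} produced "
            f"{r['oil_bbl']} barrels of oil and {r['gas_mcf']} mcf of gas.")

# The serialized text can then be embedded or dropped into an LLM prompt.
context = " ".join(row_to_sentence(r) for r in rows)
print(context)
```

As the conversation notes, this doesn't make the model good at math; it just gives a predictive-language model something in its native format to work with.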

55:26 Circling back to what do I think like companies need to do, what they should do is let people play with these tools. And what I've seen and continue to see is like I'm behind personally on playing

55:46 around with the new stuff, right? Because we don't have access to it. It was like, no, you're not getting

55:54 Cursor. We had OpenAI and you could use that. But all of the other stuff going on, you don't have access to. Even if you could put it in an air-gapped

56:10 environment and say, okay, this is public data, or we're going to create synthetic data and we're going to populate it, and go out and see what you can do, test things. No one gets the

56:23 opportunity. So

56:26 with two small kids, I don't have a ton of free time at home, right, to be keeping up. But enable the people that are curious - it's probably 10% of the employees that will take the time

56:42 to go do it.

56:44 There's another percentage: if you tell them that's what they should do, they'll, they'll figure it out. And the bottom percentage, you'll have to force them to do it. But let them have an

56:54 opportunity to play with it. Where we're at now is we wait for a third-party company that's been focused on that to come in and say, this is how it works. And they're like, Oh, okay, you know

57:08 how to do it. So you come in, you're taking on the same technology risk - if it's going to break, now it's on someone else. Yeah. So I just, there has to be a paradigm shift in the way that

57:20 they set that up and allow people to play,

57:25 pushing it down to those that are curious. And

57:29 Google - like before LLMs, we all Googled. Yeah. Could you imagine how screwed companies would be if we cut off Google access? That was part of the IT pushback when we were going to open up

57:43 ChatGPT: can you imagine if somebody blindly believes what they get as an answer? And I said, they're responsible as an employee for their product. We're paying them for their subject matter.

57:57 They need to use their discretion. You can go Google and get a Stack Overflow answer that's incorrect. Right. It's on you if you present that as your work. This is no different. We shouldn't have

58:09 people in critical positions blindly trusting the internet. Yes. And how do I get access to this AWS database?

58:16 I would just set the firewall to 0.0.0.0. And then like, yeah, it'll work, but it'll also totally screw you. Yeah. No, but that's what I told everyone. We're still in the early

58:30 innings of all of this stuff, and it is very much so trust but verify. Like you're the human, that's a machine. It's a predictive model at the end of the day, and it's up to you to validate it.

58:40 I know y'all probably run into it a little bit too, where certain providers are like, oh, you can't use that on top of my data or whatever. It's like, this is just another

58:47 interface to the data. Just like Spotfire is an interface to the data, or Excel, or whatever. This is just another way to interact with it - it's a chat bar instead of having to write code

58:58 or create some other kind of connection to it. I completely agree with you though. There are workarounds if you're worried about the data security piece. Synthetic data is fantastic. It's a great

59:08 go-to: whatever model you can use, have it generate a synthetic data set from a real data set, and then use the synthetic data set, and it will work, and it works really well. Or yeah, like,

59:22 you know, whether you're on Azure or AWS, they all have AI studios that make it incredibly easy to go and deploy a model, and it's absolutely within your servers or your resource groups or whatever

59:35 flavor you're in. But yeah, even then, like, people still don't know that going to OpenAI directly is different than using the Azure OpenAI instance. And it's like, those are

59:46 very different things. The data that's being moved is going to very different places and like all kinds of things around that. And so I think that's another piece is just like, IT and legal are

59:56 catching up to, okay, well, this one does this and it, you know, it sandboxes it or doesn't store it, and all of this stuff. But then you also have the legal side of it. I

1:00:07 saw something a while back where they're talking about, like, you know, forcing the foundational model companies to keep logs for X amount of time because of the Feds, like the FBI and stuff. If they

1:00:20 needed to subpoena somebody, they wanted that data, and it's like, Oh, well, that's fun. That makes this very different. Yeah. It's true, 1984. We're just telling it all of our secrets

1:00:32 voluntarily, at that. So it's, but I agree with you, like the people that adopt it and understand how to integrate it and empower people with it will do laps around people who are stonewalling it.

1:00:46 At the end of the day, yeah, it's like, would you ever shut Google off? No. Well, then why are we shutting this off? Because that's literally making people use Bing and, you know, putting

1:00:55 Ask Jeeves as your default search. Yeah, it's, you've got to keep up. It's the same thing. Why?

1:01:08 on the technology side and operations, right? Yeah. If you were still doing sliding sleeve jobs when everybody else is doing plug-and-perf slickwater, or even more dramatic, it's like, we're not all

1:01:21 still using cable rigs. We're sliding self-walking drilling rigs around. Like when you look at it from that perspective, everybody will say, I want to be a fast follower. Like there's people

1:01:36 that are going to be out there on the bleeding edge, but this should be no different. Like you don't want to sit on your hands too long and lose competitive advantage from execution in the field.

1:01:47 It should be the exact same thing with the people in the office. If they're losing competitive advantage in figuring out where to drill the next well or how to maximize spacing, depletion and

1:01:58 completion, why would you not want that to happen? Yeah, well, and that's the thing that I see: it's like, right now AI's not going to find oil or optimize any of your stuff. But what it is

1:02:10 going to do is it's going to take the 30 to 40% of your time you spend not doing those things, like regulatory or accounting or just copying and pasting data from one place to another. And it's going

1:02:21 to allow you to have more time to actually use your brain on what you're hired to do. Because we're so behind from that perspective that it's like, we're going to skip - I feel like we're going to

1:02:34 take a jump because we're going to automate all this back office. And then, oh, well, the benefit of that that I see, as someone who's been laid off a million times, is like, hey, maybe we

1:02:43 don't have to staff up as quickly as we would have in the past just because of operational activity, because more people are actually focused on the real work and not the stuff that we can automate

1:02:54 and not filing regulatory permits for every well that they've got.

1:02:59 And so hopefully that means on the downside, there's not all these mass layoffs and stuff like we've all experienced before And so I'm cautiously optimistic about that. We'll see, you know, it all

1:03:09 depends on the leadership of all these companies at the end of the day, and you're spot on where it's like, okay, well, we've got a, we've got to be AI enabled, and then it's like, well, no,

1:03:18 you need to like actually understand what that means, right? Like, is that giving people Cursor? Is that automating a workflow? Like those are completely different scenarios that exist, but both

1:03:29 potentially equally as valuable. And again, I think at the end of the day, we like to talk AI and technology, but it really goes back to workflows, and automating bad workflows isn't any better. And

1:03:45 so, again, what we saw during, you know, the layoffs starting in kind of

1:03:53 2015,

1:03:56 no one is sharing their work and their workflows. This is their job, they do it, and then now they're not there. So somebody else recreates it or takes it over, and again, why were they

1:04:07 doing it this way, right? I could do this in 10 minutes every morning. And so,

1:04:15 partly by necessity, a lot of work got better when there were fewer people to get the same amount done. And

1:04:26 so it doesn't have to be AI. It's just like driving that culture of constant improvement. I rebuild my own stuff all the time, 'cause I didn't know better. Yeah, you go back to a script and you're like,

1:04:37 well, a year or two down the road, you're like, what is this even doing?

1:04:41 I don't need that table, you know? Listen, that is the best part about using some of these coding tools is like, you go through, you write, you're writing all these scripts and then you get it

1:04:52 to write validation scripts and test scripts and all this stuff. And then at the end, once you get it right, you're like, Okay, now go clean this up and remove everything I don't need. It's like,

1:04:59 Oh my gosh, this makes it so much easier to revisit. Oh, and then also write me a README markdown file that describes exactly what we're doing. Yeah. It's, yeah. But I think to your point,

1:05:11 like the enablement part, you know, just like, people in my opinion are inherently lazy, and I mean that in the best way possible, right? Like, I don't mean that in a

1:05:23 derogatory way at all. I am at the top of that list. Like, I can do the same thing in less time than this? Yes. If I can automate something so that I don't forget that I'm supposed to do it

1:05:32 tomorrow or every Wednesday or whatever the hell it is, there's a very strong chance that I inherently will try and do that. Like everybody wants to make their job as pleasant and as painless as

1:05:43 possible. And there's a lot of pain and uncertainty in most people's jobs these days. And so it's like, let that get automated away and let them solve those problems and then start worrying about

1:05:54 bigger things. But

1:05:57 man, we're already,

you know, we blew through that time, we're already past it. That's always a sign that it was a good conversation.

1:06:08 Do you wanna do a news or speed round? Go ahead and do a speed round. Okay. Where are you from originally? Snyder, Texas. Snyder, Texas, okay. That's west Texas. Yeah. It kind of sits in between

1:06:23 Abilene, San Angelo, Lubbock, Midland. Okay. What's your favorite place, if you're going out that way, to stop and grab some food? Oh, in Snyder, Abilene, anywhere over there? So

1:06:34 I went to school at Tech in Lubbock.

1:06:39 Thai Pepper, across the street from the university. I probably ate the special a hundred times when I was in college. That's the one thing on this: what is your stance on the tortillas? We

1:06:53 could not throw tortillas when we were there. That's why I'm asking. What is that? Yeah. I don't understand. So I - Let's continue to ruin college football. Yeah, it's -

1:07:03 you know, I actually had to look up why they did it, and it was way back in the Southwest Conference. Somebody said, Oh, you're going to the bowl. What, the Tortilla Bowl? And so for that game, they

1:07:15 threw tortillas and they had the penalty where if anything came on the field, it was like a 15 yard penalty. So we got patted down to make sure we didn't have them. And then one day I turned it on.

1:07:26 I'm like, the tortillas are back. Like, I feel like I got cheated, you know? Yeah. And then Lubbock was a dry county when I was there, and now it's wet. So it's, you know, all around you. Yes.

1:07:38 That's interesting. I didn't know that. Because you're probably almost exactly my age. You started college in 2004? So you were there like when they had Crabtree and all that, like when they beat

1:07:48 UT and all that. Yeah. So that was the first year after I was out, when they did that. I was watching in Gillette, Wyoming. But we had Mike Leach. We were still

1:08:04 there. Oh my god. And that's Mike Leach, the crazy pirate, so. We actually had Bobby Knight as the - Oh, that's right. The basketball coach. That's a really cool coach. Yeah, yeah.

1:08:13 So I still don't know if this is, like, AI-generated or not, 'cause it came out around the same week that all the AI-generated videos came out, like a month ago. But there's a video on Twitter of

1:08:21 Bobby Knight doing a golf lesson out of the trap. Like, he has a pro teaching him to hit out of a trap, and it is, I don't know. I think that's real. I think I saw it years ago. Like, how to throw a

1:08:32 club? Well, yeah, well, this was like him hitting out of the bunker and he's like, look at that shit. Yeah, like it just went down. But it came out the same week that people were posting

1:08:43 videos of, like, Stephen Hawking doing the Olympics and stuff. And so I was like, man, is this AI-generated at that point? 'Cause I mean, I

1:08:51 look like, you know, I'm sure you could tell Sora - I think it looked like it's from the '80s or whatever. But whether it was real or not, what a world we live in that we

1:09:01 have to ask that question. Yeah. I've got a couple more, so come back over to Houston. You've been here at least, what, 15, 20 years now? Oh, no, just nine. Oh, okay. Yeah. But

1:09:12 long enough to enjoy it. What's your favorite place to eat in the Houston area? Oh, Taste of Texas, any time I can go. Yeah. Still great, pick your own steak. You had to love that, being right there at

1:09:25 the Marathon building. Yeah, yeah, any opportunity we had, we would try to go there. Sweet. Yeah. What is one tool or library or program that

1:09:38 you think more people should be using, either on the data side or the operations side? It's gotta be Alteryx. I'm forever a superfan. So if anybody from Alteryx is listening, I need a

1:09:51 license to play around with while I don't have a job. So I'm not gonna be mad, send one over. Yeah, I haven't touched it, I stick to actual coding, but I know it's got a really strong following and people

1:10:03 that use it really love it. So it's basically no code, low code, visual, and it means I can give people tools that they can put things into. So perfect, man. Well, no, I mean, thanks for,

1:10:16 you know, finally getting the time for it. And I know, you know, that can be tough, you know, negotiating the corporate hierarchy and everything. Yeah, no, I appreciate you guys reaching out.

1:10:25 It's been a blast. We appreciate you coming. Yeah. Thanks for your work. People can find you on LinkedIn, Justin McBroom. I'll inundate your page if the algorithm hits right. So yeah, what

1:10:38 day are we on right now? Well, now you're counting down. So I've got 30 days left in the transition. So that'll be the post today, 30 days. Yeah, it's great stuff. Please keep posting. Yeah,

not enough people in our industry talk about anything, much less in public. So yeah.

1:10:59 Um, thanks guys. If y'all need anything from us, you know where to find us. We are running a Black Friday thing on Collide next week. It's some kind of limited

1:11:11 drop. I don't know exactly what, I was only told a few minutes ago. Yeah. So you've got some, you know - Oh, yeah, I do, please plug away. I'm trying to start a little t-shirt

1:11:22 run. So, um, I think one of the posts shows it. It's "day-ta," not "dah-ta." So I have a t-shirt, and there's "sequel," not "S-Q-L." So yeah, yeah. Scroll through and find them. I think I've linked the

1:11:38 store in the comments. So yeah, maybe we do some kind of joint merch thing there. We can get that on the Doghouse. Yeah. Cool guys. We appreciate it. We'll see y'all next time.

1:11:49 Thanks again, man. Yeah, thanks.