Stack Overflow vs ChatGPT & AI's Impact | Rubber Duck Dev Show 106

Episode 106 October 27, 2023 00:40:08

Hosted By

Creston Jamison

Show Notes

In this episode of the Rubber Duck Dev Show, we discuss the differences between using Stack Overflow and ChatGPT for finding software development answers. We also discuss what the overall impact of AI might be in the future.

To get the show notes for this episode, visit:

https://www.rubberduckdevshow.com/episodes/106-stack-overflow-vs-chatgpt-ai-impact/

 

Episode Transcript

[00:00:00] Speaker A: Hello and welcome to the Rubber Duck Dev Show. I'm Chris. [00:00:03] Speaker B: And I'm Creston. [00:00:05] Speaker A: And tonight we are going to talk about Stack Overflow and the impact that ChatGPT has had on that environment, that ecosystem. It's kind of interesting, and it's not something I had thought about before, but we're going to explore that tonight. But before we do, we can review: how was your week? [00:00:30] Speaker B: You know how sometimes I talk about I'm constantly spinning plates? [00:00:35] Speaker A: Yeah. [00:00:36] Speaker B: And I'm like, spin plate here, spin plate there. I'm doing that to the nth degree, and they're starting to drop. I'm like, I've got way too much going on. Last week I was talking about an issue I had where I started using Docker and Kubernetes and trying to figure out some stuff, and I started the Kubernetes cluster or pod or whatever, and it basically locked up my whole machine. Well, the other side effect of that is that I had been using the built-in container infrastructure in Ubuntu, which is LXD. I can't remember if that's an acronym for something, whether they're LXD containers or LXC containers, but I had been using those and their networking stopped. So now none of my containers worked. Now, I didn't have that many, but still it was like, what the heck is going on? So here's what happened. I had created the Docker repository, where it stores the containers, under my Dropbox. Now, I normally put all my stuff under Dropbox because it all gets synced across my laptop, my desktop, and everything. Well, apparently Docker seemed to be okay, but Kubernetes, bringing up a pod, for whatever reason just locks up the whole machine. So I'm like, all right, I've got to move that Docker repository out of my Dropbox folder. Which is unfortunate, but it is what it is. The LXD container thing is a known issue, and it's been happening with the networking. After people install Docker, or Docker Engine or whatever the Docker software is, it manipulates the firewall, the iptables rules for Linux. It puts its own things in the firewall, and that breaks all the LXC containers. And I looked at a GitHub issue that's six years old, and people are still talking about it. It's still an existing issue. [00:02:55] Speaker A: Oh, wow. [00:02:56] Speaker B: I'm like, well, I guess that's a way to beat down your competitors: just break their stuff. [00:03:01] Speaker A: Yeah, that'll do it. Good lord, that's not cool. [00:03:07] Speaker B: Now, of course, with this, I don't even know if I could use LXC containers with Kubernetes. Presumably it's agnostic about the type of container, but I'm like, well, if I'm using Docker, I might just move my LXC containers to Docker, because there were workarounds. But the amount of work to get that working... because you look at it and say, oh, this worked for me, and then, oh, well, that didn't work, but this worked for me. Oh, that didn't work. Everybody has their own solution. And when nothing has been narrowed down to say, this is how to solve it, when I see 15 different solutions, I'm like, none of these are going to be a long-term thing. So I'm kind of like, I may just stop using the LXC containers, or... sorry, go ahead. [00:03:58] Speaker A: The LXC, that's Linux only? [00:04:00] Speaker B: I mean, that's Ubuntu only. Ubuntu created this standard. I don't know if they passed it down to others, but yeah. So anyway, that's the report of what was going on. [00:04:14] Speaker A: Yuck.
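For anyone hitting the same Docker-versus-LXD conflict described above: the usual explanation is that Docker, when it manages iptables, sets the FORWARD chain's default policy to DROP and only allows its own bridges, which cuts off LXD's bridge traffic. A commonly suggested workaround is to add ACCEPT rules for the LXD bridge to Docker's DOCKER-USER chain, which Docker reserves for user-defined rules. The sketch below just wraps those two rules; the bridge name lxdbr0 and external interface eth0 are assumptions, so adjust them for your machine, and treat this as an illustration rather than the definitive fix for the GitHub issue mentioned in the episode.

```python
import subprocess

# Assumptions: LXD's bridge is named "lxdbr0" and the host's external
# interface is "eth0" -- change both to match your system.
LXD_BRIDGE = "lxdbr0"
EXTERNAL_IF = "eth0"

# Docker reserves the DOCKER-USER chain for user-defined rules, so rules
# inserted here are not clobbered the way direct edits to FORWARD can be.
rules = [
    # Let traffic from the LXD bridge out through the external interface.
    ["iptables", "-I", "DOCKER-USER", "-i", LXD_BRIDGE, "-o", EXTERNAL_IF, "-j", "ACCEPT"],
    # Let replies back in for connections the LXD containers initiated.
    ["iptables", "-I", "DOCKER-USER", "-o", LXD_BRIDGE, "-i", EXTERNAL_IF,
     "-m", "conntrack", "--ctstate", "ESTABLISHED,RELATED", "-j", "ACCEPT"],
]

for rule in rules:
    # Needs root; raises CalledProcessError if iptables rejects a rule.
    subprocess.run(rule, check=True)
```

These rules disappear on reboot unless you persist them (for example with a package like iptables-persistent), which is part of why that six-year-old issue keeps collecting new workarounds.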
[00:04:15] Speaker B: So a bunch of conflicts between different pieces of software. But how was your week? [00:04:21] Speaker A: Well, it's busy. So you know how we did this acquisition and basically doubled the size of our engineering team overnight. So there's all the stuff that goes into the change management of that whole process and converting over their customers and all this stuff. Well, the timing of that was a bit hectic, let's say, because we have a code freeze on December 15 every year, and so we were already scrambling to try to get some high-priority stuff done and into production by the code freeze, and then this happened. So while this is a good thing ultimately, it has really compressed the timeline of the stuff we had to get done before the code freeze. So while we're bringing more hands on board, they're not going to be ramped up on our stuff before the code freeze. So they're actually a time sink right now, because we have to train them and stuff. That's not a bad thing, it's just what happens when new people come on board. [00:05:34] Speaker B: It's just non-ideal timing. [00:05:36] Speaker A: Yeah, so it's going to be a busy month. November is going to be... I'm going to be run ragged, I think, but we'll get there one way or another. Sometimes you just got to pull the 120-hour weeks. It just happens. Out of where? Well, out of your sleep hours, I suppose, but whatever works. No, we'll get it done. Mostly it's a matter of making sure we're communicating effectively so that we don't get confused with stuff. Because now there are so many moving parts that we have to make sure all these different projects stay on the proper paths with the proper people. So there's a lot of communication infrastructure that we're trying to build up really quickly to make sure we keep all that stuff in line. [00:06:27] Speaker B: But what you just expressed is exactly how I'm feeling. I was already very busy, and now I have two full-time contractors and two part-time employees, and I'm just juggling that. [00:06:50] Speaker A: And when you don't have quite the right communication pipelines, centralized communication pipelines, in place for that level of stuff, if you expand beyond your infrastructure, it gets really hectic, especially when you've got timetables involved that really can't be shifted. This December 15 window, that's a hard stop. Anyway, yeah, we're actually scoping back a few things because we were like, no, that ain't going to happen now, so you better plan that for first quarter. Anyway. So, busy week. Yeah, busy, busy, but also fun. So Stack Overflow used to be like the go-to place for a lot of developers who had questions about things. [00:07:45] Speaker B: And you had found some articles. It still is mine. [00:07:48] Speaker A: Yeah, I use it quite a bit too. But you found some articles where they were kind of examining this, and I didn't get a chance, because I'm so busy, to read through all of them, but I was able to skim through some stuff. There were graphs in there that were showing that Stack Overflow's traffic has severely declined since ChatGPT came on the scene. They had originally said something like 50%, but then they backed it down to 35% because some of that was an adjustment in the way the AdSense or Google Analytics was working.
[00:08:23] Speaker B: Yeah, and I think they had reported something not as significant as that. Meaning, I think Stack Overflow themselves reported it wasn't as severe as that, and we'll have links to all this in the show notes, but yeah, it seemed relatively stable. But then in January 2023 it started going down. Now, not that significant, but what was, I think, even more dramatic was the vote decline. So this indicates activity on the platform, not just views. It had been dropping anyway through 2021 and through 2022, but again, it was more of a steeper curve in 2023. And the same thing with posts. So two other metrics of activity on the platform, where both have been declining over the years, but the slope took a steeper turn in 2023, which, you've got to think, is ChatGPT, because I actually found people talking about this on Twitter, and people were saying, oh yeah, I use ChatGPT now. [00:10:00] Speaker A: Right. So I have some thoughts about this whole paradigm and something that this is pointing out, not necessarily that Stack Overflow itself is losing traffic, but just the attitudes that this is kind of pointing to. And I have some thoughts about that. But what are your thoughts about this happening? [00:10:25] Speaker B: I've got a ton of thoughts. The irony, and I think someone had pointed it out, I can't remember who, is that ChatGPT was trained, for programming, probably a lot on Stack Overflow. And it's killing the... I don't know, killing, but it's basically hurting its own long-term prospects by essentially putting out of business what helped build it to begin with. And this is probably what's happening all around. It was artists that have posted stuff to the internet for years that ChatGPT or other AI tools are using. And now that they've been trained, they're on their own in producing things that now prevent these other sites from receiving eyeballs and traffic, like other artists. So the irony there is very thick. [00:11:34] Speaker A: Yeah, it is a little bit. And I think I have significant concerns. What this points out to me is that people, developers, let's specify developers here, because it's not people generally I'm concerned about, it's developers, developers are putting more trust into ChatGPT than Stack Overflow. Now, whether that's conscious or not, I doubt it, but my concern is that ChatGPT, when you ask it a question that you would normally ask Stack Overflow, says, here's the answer. [00:12:18] Speaker B: That's what I hate. You're probably thinking exactly the thing I'm thinking, yeah. [00:12:23] Speaker A: Whereas with Stack Overflow, it's: here's my answer. And somebody says, well, I disagree with that because of this, and then you have this whole conversation, and so you can get all the information from all the sides and say, okay, well, this is what makes sense to me, this is the answer I'm going to implement. [00:12:40] Speaker B: And I may find the answer in a comment that is not the top rated. Someone made a comment on someone else's proposal, right? One that is not even the top rated. I'm like, oh, that's it. Yeah, because... [00:12:55] Speaker A: This question isn't exactly my situation, but this comment had exactly my situation. This one bit was flipped the same for him as it was for me, and that's what my problem was. You're not going to get that with ChatGPT. You might get lucky and happen to get the right answer.
But my fear is that with developers leaning so much on AI and ChatGPT to get answers to questions that you would normally pose to places like Stack Overflow, we're going to lose a lot of critical thinking and analysis capability in the developer community, because they're just going to take the answers that they get there and plop them into their code. And I think that's a very dangerous and bad precedent to work with. [00:13:48] Speaker B: Here's the thing, you're going to be losing a skill set, too, because like you have said previously, and I firmly believe it, what makes me a good programmer is how good my Google-fu is, to a large extent, right? So that ability to search and find things, I think losing that could hurt us long term, right? [00:14:18] Speaker A: And I think when you get easy answers to stuff... look, I'm not trying to crap on ChatGPT here, I'm just trying to say use it responsibly, because there are things that it's not good for. And this is one of those things. I think if you start getting the simple answers, you're going to stop thinking about problems deeply, you're going to stop analyzing: well, there's 15 different answers here, I need to really understand the problem so that I can pick the best one, and I need to understand all the solutions. And down the line that makes me a better programmer. Because when other problems come along, I can start making these connections in my brain about, well, this looks similar to this, but it's got this aspect of this other thing. And so I can craft a solution based on the experience I've had and put in searching for solutions and struggling to find the right answers. If I'm not struggling to find the right answers, that stuff is not going to stick in my brain as well. And so if I'm just copy-pasting answers from ChatGPT, anybody with access to ChatGPT can do that. They don't need to be a developer or a software engineer. [00:15:35] Speaker B: And really, yeah, as I think about it, ChatGPT is great when, let's say, the probability of receiving a correct answer is super high. But there again, you're right, so, yeah, my thinking is going in all sorts of different places. So, for example, I have a very simple question, like, I think I was using it for D&D when I was checking something: hey, what are some names for elves, for example? And it says, all right, here, I'll give you some names of elves. So it's just throwing out some stuff. [00:16:30] Speaker A: Right. [00:16:34] Speaker B: Or if there are answers where essentially everyone on the planet agrees, this is the answer to that question, like how many oceans there are, it should give you the accurate answer for that. [00:16:47] Speaker A: Or how many planets there are. Oops. [00:16:50] Speaker B: Well, exactly, because I was actually thinking about this. Oh, well, you could say, what color are tree leaves? It's like, you mean, say, green? But wait a minute, is it autumn? Or is it winter? Or, what color is the sky? Blue. Oh, wait, unless it's dawn or it's dusk or there's fires. [00:17:11] Speaker A: Or you live in certain large cities where it's brown. [00:17:16] Speaker B: Yeah. [00:17:17] Speaker A: Who knows? [00:17:19] Speaker B: It's kind of hard. Again, I go back to kind of what I mentioned when we were talking briefly about AI, when what's-his-name was here. I can't remember. I'll think of it in a minute.
But it's like ChatGPT is in its teenage years, and it just says, I have this knowledge, and you're asking for the answer, here it is. And there's no... I hate to say it, almost every response from ChatGPT should first be, well, it depends, and then it should ask you questions to help clarify and give you the answer. [00:17:55] Speaker A: Right. And again, I don't think either of us wants to crap on ChatGPT. Both of us use it. Both of us find use for it in certain situations. It's great for a lot of things, like, hey, give me an idea for an opening paragraph for an essay on this, or list out some of the top five reasons that people do this thing. There's all kinds of good applications to get information from ChatGPT, but I think development is one of those where you have to be very careful, because ChatGPT doesn't think about things like security issues. It doesn't think about things like, what's actually the most optimal way to implement this thing in this particular language. [00:18:46] Speaker B: Whereas... [00:18:46] Speaker A: You do start getting those discussions if you post these things on Stack Overflow or just in your Ruby group or whatever. Interacting with other human beings gets you much better discussion than ChatGPT does for these kinds of things. [00:19:03] Speaker B: I think it'll eventually be there, it's just definitely not there today. I think the security will be better, the performance will be better. I think that'll just come with time. But I know my frustration, at least, because I've used it for some code, trying out some things, and I'm like, okay, one answer I got was, all right, this is okay. And then I asked another question another time, and I'm just like, no, this doesn't make sense, or this can't be it. So then I turned to Google and I found what I was looking for that way. But a little bit in prep for this episode, I said, let me try asking some coding questions. And it was very hit or miss. [00:19:52] Speaker A: Yeah, and I've kind of had that same experience. I will use it, I'll ask it, and it's good for, I'm not quite sure exactly what I'm searching for, so here's a word problem, give me some code that might solve this. Then I get some ideas of, okay, this is what I'm actually trying to do. [00:20:09] Speaker B: Yeah, and because it's hit or miss, the disadvantage is it having one answer and saying, okay, this is my answer to you. And I almost want it to say, hey, here's an answer, and, like C-3PO, tell me the probability that this answer is correct is 73.56%. Or, hey, this is what I think is the best answer, but there's a few other possibilities. Because as I've looked into it, I'm covering more artificial intelligence topics on Scaling Postgres, my other weekly show, and every startup and their brother is trying to build, or is building, AI into their tool, and not even just startups, every company. And essentially it's just a nearest neighbor search. So it's just saying, compare what was entered to what we know in terms of this knowledge base. And there are cases where, given your data set, you're going to be wrong, or you hit a local maximum, or the algorithm found something that's close but it didn't quite hit the filter to be able to present it to you. So it's kind of like, I almost want it to say, again, this, we think, with this level of confidence, is the correct answer, but it could also be these three other things, for example. And I think...
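A quick aside on the "nearest neighbor search" framing above: the retrieval piece of many of these bolt-on AI features really does boil down to comparing an embedding of your question against pre-computed embeddings of a knowledge base and returning the closest matches above some cutoff. The sketch below uses tiny made-up vectors and plain cosine similarity purely to illustrate the idea, including the cutoff "filter" that can hide a near miss; a real system would use an actual embedding model and a vector index rather than hand-written numbers.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical "knowledge base": pre-computed embeddings for stored answers.
# In a real system these vectors come from an embedding model, not by hand.
knowledge_base = {
    "restart the web server": [0.9, 0.1, 0.0],
    "rotate the TLS certificate": [0.1, 0.8, 0.3],
    "vacuum the Postgres table": [0.0, 0.2, 0.9],
}

def nearest_neighbors(query_vector, top_k=2, min_score=0.5):
    """Return up to top_k closest entries whose similarity clears min_score.

    The min_score cutoff is the 'filter' from the episode: an answer that is
    close but lands just under it never gets presented to the user.
    """
    scored = sorted(
        ((cosine_similarity(query_vector, vec), text)
         for text, vec in knowledge_base.items()),
        reverse=True,
    )
    return [(round(score, 3), text) for score, text in scored[:top_k] if score >= min_score]

# A query embedding that is "about" databases more than web servers.
print(nearest_neighbors([0.1, 0.3, 0.85]))
```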
[00:21:46] Speaker A: That's a good point, because that's something I did early on when I was starting to experiment with ChatGPT and what it could do for coding and stuff. I wanted to find out how sure it was of things. It has a function where you can just basically ask the same question again. So I would say things like, give me an algorithm that does this in Ruby, and it would spit a thing out, and I would say, okay, that looks, I suppose, reasonable, ask the same question again, and it would give me something completely different. It was still Ruby, but it was completely different. And every one of those things was like, okay, this part of this algorithm is really good, this is implemented well, but the rest of this is trash. Or, 50% of this algorithm is good and I can use this and I can modify this, but I've got to find something else for the rest of this because it's garbage, it's badly written, right? So I think if you're going to use it to investigate possible solutions to dev problems, you should ask it the same question several times, get all those answers, and then start doing comparisons on your own to say, okay, well, why would I do it this way instead of this way? [00:23:01] Speaker B: Yeah, I don't know if you can manipulate this in ChatGPT itself, like the actual application that people use, but I think when you're querying it, so from the low-down database perspective, if you're querying the actual ChatGPT information, or I can't remember what the term is, but I think you can adjust, for lack of a better term, a jitter. So like how random it is. Like the fact that you said you asked it three different times and it gave different answers, I think that is a parameter you can control to a certain extent. [00:23:41] Speaker A: Right? Well, in this particular case, that's a parameter I would want as wide as it can go, because I was looking for it to give me different answers so that I could make good decisions based on multiple pieces of information. And I was just doing that for an experiment. I didn't actually use that, because I didn't like any of the answers I got. I asked it a question I already knew how to do pretty well, so that I could evaluate, okay, what's it giving me. And it was decent, but one of the answers had severe security holes in it, and I was like, I would never implement this because it would open the company up to massive lawsuit exposure, security problems. But also there was another one that had some pretty significant optimization issues and was doing things not very elegantly. And then there was one that it gave me for one of my experiments that, while it was technically correct, was one of those things that was so over-clever that I couldn't really read the code. It was like looking at what happens when somebody falls asleep on their keyboard, because there were... [00:25:03] Speaker B: Just so many methods looped together. That sounds like an example of those Rubyists that write what would probably be 50 lines of code in one line. [00:25:12] Speaker A: Yeah, exactly. And I'm like, that's neat and all, but I can't support that or maintain it or even read it very well. So that's no good. But it's just one of those things where I'm majorly concerned about the fact that if this trend continues, I'm worried that the developer community starts losing its ability to critically think about problems and find unique and inspired solutions to those problems, which is what software engineers do. That's what we do.
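On the "ask the same question several times" experiment and the "jitter" parameter Creston is reaching for: in the API, as opposed to the chat UI, that knob is exposed as temperature, and you can also request several independent candidates in a single call with n. Here is a minimal sketch using OpenAI's Python client, with the model name and prompt as placeholders, purely to illustrate the compare-multiple-answers workflow described above.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

question = "Give me a Ruby method that deduplicates an array while preserving order."

response = client.chat.completions.create(
    model="gpt-4o-mini",   # placeholder; substitute whatever model you have access to
    messages=[{"role": "user", "content": question}],
    n=3,                   # three independent answers in one call
    temperature=1.0,       # higher values give more variation between the answers
)

# Print every candidate so you can compare them yourself, rather than
# trusting whichever single answer a chat window would have shown first.
for i, choice in enumerate(response.choices, start=1):
    print(f"--- Candidate {i} ---")
    print(choice.message.content)
```

The point of the exercise is the same one made in the conversation: seeing several divergent answers to the same prompt puts the evaluation and critical-thinking step back on the developer.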
I mean, yeah, we use Google to think through things and to bounce ideas, but we also use other people, other developers. And over my time of doing this, the most important resource to help me think better has been other people. [00:26:10] Speaker B: Yeah, well, I mean, I think in order for it to work better, I would want it to more closely emulate the experience of doing a Google search for something, meaning that you get multiple results. Because what you're describing, if people aren't thinking critically, well, anyone can do that without using ChatGPT. Say they want to figure out how to do something. They do a single Google search, they pick the first option, let's say it happens to be Stack Overflow, and they pick the highest-rated answer, and that's what they implement. So someone could do that, and that's not an intelligent thing to do, sure, but that's kind of what ChatGPT is giving you. [00:27:05] Speaker A: Yeah, but the difference with the Google search, though, is if I go to that first Stack Overflow result, what usually ends up happening is I start reading through comments and then I have the discussion. [00:27:15] Speaker B: No, I'm talking about a developer that is focused on doing the least amount of work. I'm just going to pick this first one, here's the highest-rated answer, I'm going to take the code, I'm going to put it in. [00:27:29] Speaker A: Well, I mean, I'm focused on doing the least amount of work, I just like to do it correctly. [00:27:34] Speaker B: Well, maybe they don't care about that as much. Yeah, I don't know. [00:27:40] Speaker A: Well, and that's of course what I'm... [00:27:41] Speaker B: Worried about. And we are two relatively old fogeys, and we would like to hear your opinion if we are totally wrong. So put it in the comments if you think we need to get over ourselves and get with the program. If you think ChatGPT is rainbows and unicorns, let us know. And if you know how to use it better. [00:28:07] Speaker A: Yeah. And what has been your experience with that? I mean, that's been my experience with it, and like I said, I think it's a fine tool for a lot of things. I just urge caution with it for this thing, for finding development solutions to programming problems. Not that you shouldn't use it at all, but you should use it in conjunction with other things to make sure you're thinking through problems. But yeah, let us hear your experiences with it, how you use it, how you think it does help developers, or if you think it does help developers think critically about problems, or if you agree that it's concerning that Stack Overflow seems to be sliding downward as ChatGPT goes up. Does that point to a problem in the development community? [00:29:00] Speaker B: Yeah, well, I mean, here's the big problem: I don't know if it's as common today, but at least a few years ago there was a new JavaScript framework coming out like every week. So the fact that ChatGPT only knows stuff up to, like, 2021 means it's not going to know anything new. And if we hypothetically destroy Stack Overflow, what are we going to do to learn the new stuff? [00:29:29] Speaker A: Exactly. Yeah, or to think up new stuff. I mean, somebody has to be critically thinking to come up with these new JavaScript frameworks in the first place for us to learn. Right.
And if we lose that critical thinking mechanism, then we lose the ability to innovate, we lose the ability to create new things, and we might as well be living in the dark ages. I think the best part about software development is the investigation towards a solution. Not just copying code, but the act of investigating a solution: engaging your critical thinking skills, engaging your logic skills, and coming up with creative solutions that nobody has come up with before. Because that's what we do all the time, day in, day out, is come up with solutions to problems. Anybody can copy code. My daughters, when they were ten years old, could copy code. They had no idea what the code did, but they could cut and paste code. You're not going to pay them for that. So I don't know, maybe it's just old fart concerns. [00:30:52] Speaker B: So have you actually used Microsoft's Copilot, or integrated it at all into your... [00:31:01] Speaker A: I have. Well, I've used it a little bit, I've played with it, but I don't use it on the regular. It's not integrated into the IDE that I use on a daily basis. My boss uses it, or has used it. I don't think he uses it much anymore, because he found all kinds of issues similar to what we're talking about: it would give me crap answers and I'd have to change it anyway. [00:31:33] Speaker B: So just to go down a slightly different path as we're talking about this, there was another article, the title is "Microsoft Is Reportedly Losing Huge Amounts of Money on GitHub Copilot," and there's a report saying that Microsoft is spending up to $80 per user per month in some cases to run the AI assistant, where they're essentially charging $20 a month for the service. [00:32:07] Speaker A: Yeah, I don't get it. How is that... [00:32:10] Speaker B: I'm just saying they get $20, but they're spending $80 to deliver the service. So at some point something's going to break, if that's truly the case. [00:32:20] Speaker A: Yeah, I mean, maybe that's a loss leader type thing, or it's magnanimous. Definitely a loss leader. [00:32:29] Speaker B: But here's the thing: okay, what does this mean for the future? Now, how has Google been monetized? Primarily through where they make all their money: ads. So what do I see coming to ChatGPT in the future? Ads. Now, I don't know in what form. This is just a prediction, I don't know. But if it truly costs all this money to run these AI systems, they've got to get more money from somewhere. And I'm sure that there's going to be some company that says, all right, free ChatGPT for everyone, or whatever the next competitor is, but you've got to deal with ads. And now you're going to have an avatar called Sally, and Sally is going to say, hey, I know you were googling for an answer to something, like how to tie a bow tie. Would you be interested in a 5% discount on bow ties from Bow Ties R Us? I can imagine having that interaction at some point, and you having to say, no, Sally, not today. [00:33:44] Speaker A: Right. And the other thing I can see coming is microtransactions. Hey, for only $1.99, I'll give you two more answers. Oh, God. But yeah, you're right. I mean, at some point somebody will want to monetize that, because anything that loses money isn't going to last very long. At some point it has to turn a profit, or companies will eventually get rid of it or it'll die on the vine.
Because, yes, loss leaders are fine for a while, but eventually they need to turn into something or they need to go away. That's just the way business is. [00:34:24] Speaker B: And Microsoft, actually, we were talking about Copilot, but Microsoft recently added, is it Windows Copilot? Yeah, it's Windows Copilot, to all of Windows 11, I think. [00:34:37] Speaker A: Oh, boy. [00:34:42] Speaker B: So if you thought the ads or the intrusion of Microsoft Windows into your daily life had hit its peak, hold on to your cap. [00:34:55] Speaker A: Yeah, I still haven't upgraded to Windows 11 because I'm not jazzed about where it's going, but oh, God, now that I know that's in there, I'm going back to pen and paper, I'm telling you. I'm just so over this crap. Anyway. Yeah, I don't know where it's going. I like the idea of AI. I've always been fascinated with AI and the things that it can do, and computer-based learning and stuff like that. It's fascinating to me, and I think tools like ChatGPT can be useful for certain things. I'm just afraid that we become too reliant on it and we stop thinking for ourselves. I've seen it happen in a lot of other places in my time on this planet, where people become reliant on external information and don't think for themselves anymore. And I'm concerned that's what happens here, and with developers, that's a profession where if you don't have critical and logical thinking skills, you're kind of useless. That's all that profession is. Anyway, hopefully this is just rainy day talk and it doesn't turn into that kind of thing. And maybe developers aren't using this the way I think they are. [00:36:41] Speaker B: Well, I mean, it's easy to fall into the, not rabbit hole, but to become too lazy by just relying upon what it gives you. Because I look at the answer and I compare and contrast it with what I know of the area I'm asking about, to say, how much do I trust this answer? Right? Because I've looked at so much other stuff before. So again, I'm doing my own pattern matching with my own experience and comparing it against that. And sometimes I say, all right, this looks good to me, and sometimes I say no. [00:37:20] Speaker A: Yeah. So please don't just blindly trust ChatGPT's first answer to a coding problem. Take it for hints and direction. [00:37:31] Speaker B: Yeah. And I think in time it will mature, at a scary rate, to the point where, yeah, I think more jobs are going to be in jeopardy, but I think it's just going to happen as it matures and gets older. Right now, like I said, it's probably in its teenage years, but as these models become more accurate, it's going to get scarily good. [00:38:00] Speaker A: Yeah, right now it's a T-1 and soon it'll be a T-1000, and then we've got a movie on our hands. Anyway. Yeah. Let us know in the comments below what you think about this, because I really am interested to know how people are using it, and to find out more about this decline of Stack Overflow. Is it pointing to what we think it's pointing at, or are you seeing something different? Let us know. I'd really be interested to hear from you about that. Anyway, we are up on time, so... I really enjoyed that. That was a good conversation. Thanks for bringing that topic up, Creston. That was fun. [00:38:48] Speaker B: No problem. [00:38:49] Speaker A: And we will be back next week with something, but we don't know what yet.
We're in talks with several people about bringing some more guests on and some other topics, but we haven't gotten anything finalized yet, but we will let you know. If you like this show, please do like and subscribe so that you know when our shows drop. They will be coming out on Fridays now, so we will get them uploaded on Fridays, so you can start looking forward to that. They used to be Thursdays, but we're moving to Fridays because life, so stick with us. We will be back next Friday with another one of these. And if you have suggestions for topics that you'd like to hear, please put them in the comments, or you can join our Discord and put them in the topics channel. The Discord link should be in the description below. You can also talk to us through Twitter at duckydev show, or you can join our [email protected]. Come and join us there. If you prefer to listen to these things in your car, on your walk, or on your jog, you can find them as audio podcasts anywhere your podcasts live, so pick your favorite one. We will be there. So until next week, happy programming. [00:40:07] Speaker B: Happy programming.

Other Episodes

Episode 61

October 13, 2022 01:01:18

Live Streaming Tools & Toys With Aaron Francis | Rubber Duck Dev Show 61

In this episode of the Rubber Duck Dev Show, we discuss different tools & toys you can use to live stream or just record...

Episode 14

September 23, 2021 00:50:40

Background Job Processing | Rubber Duck Dev Show 14

In this episode, we discuss how to handle background job processing. Delayed Job (Ruby) Resque (Ruby) Good Job (Ruby) Sidekiq (Ruby) Shoryuken (Ruby) OTP...

Episode 99

September 07, 2023 00:34:29

When Should You Use Background Jobs? | Rubber Duck Dev Show 99

In this episode of the Rubber Duck Dev Show, we discuss when you should use background jobs. To get the show notes for this...
