
Data Science Hangout | Frank Corrigan, Target | Understanding the Impact of Data Science
We want to help data science leaders become better. The Data Science Hangout is a weekly, free-to-join open conversation for current and aspiring data science leaders. An accomplished leader in the space joins us each week and answers whatever questions the audience may have. We were recently joined by Frank Corrigan, Director of Decision Intelligence at Target.

6:38 - Problem formation: trying to find the unknown unknowns and bringing them to the business
8:20 - Integrating your data science team into your company's business objectives (ex. newsletter)
10:50 - What is the divide between a business analyst and a data scientist, in your eyes?
15:32 - What is the biggest mistake you’ve made in your role and what did you learn from this mistake?
19:15 - Onboarding new team members effectively
25:00 - The importance of motivating non-data scientists
26:42 - Resources for data scientists
32:30 - Challenges when using different tools across a data science team
49:28 - Analytical thinking vs. critical thinking skills
57:30 - Embracing the 80/20 Rule & the importance of Focus Time
1:02:48 - Two frameworks to be more effective with stakeholders
1:06:47 - Rebranding to "decision intelligence"
1:08:35 - Quantitatively measuring impact from data science insights

► Subscribe to Our Channel Here: https://bit.ly/2TzgcOu
► Join the Data Science Hangout Live every Thursday from 12-1 ET: https://www.addevent.com/event/Qv9211919

Follow Us Here:
Website: https://www.rstudio.com
LinkedIn: https://www.linkedin.com/company/rstudio-pbc
Twitter: https://twitter.com/rstudio
Transcript
This transcript was generated automatically and may contain errors.
Welcome to the Data Science Hangout, and welcome back to everyone who's been here before and to those joining for the first time. I'm Rachel Dempsey, and I think we all know the drill by now, but we want to use this time to focus on the questions that are most important to you all. So there's really no agenda for these meetings, and everyone's welcome to join in live or put any questions that you have in the chat. This week we're also going to add a Slido link into the chat, so if you want to ask anything anonymously, you have the opportunity to do so as well.
Is there a password that people have to use?
Nope, there shouldn't be a password. You can just go there and put your name, or you can choose anonymous. I also just want to share with everyone at the beginning here that we will share the recording of each Hangout on YouTube so that other data science leaders and aspiring leaders can benefit from the discussion. But without further ado, I'm joined by my co-host for today, Frances Corrigan, who goes by Frank, Director of Decision Intelligence at Target. Frank joined a few weeks ago for the first one and is here today as our featured data science leader. Talking with Frank, he's very passionate about discovering new ways to think about the world and about communicating the work that his team does.
Yeah, absolutely. And I'll start off by saying I am happy to take the lead on the questions that come through the chat this week, but I've found it over the past few weeks to be incredibly helpful and useful hearing other people's perspectives about the work they're doing, how they're communicating their work, and some of the challenges they're facing. So by all means, I'm as much willing to hear other perspectives as give you mine. I work at Target currently. I've been at Target for almost four years now. I build and lead data science teams. We support transportation within the supply chain. And I typically default to calling our data analysts technical problem solvers, because effectively that's what they are, right? We're here to solve problems for the supply chain: make things go faster, spend less money doing it, make sure that the shelves at your local Target store are full of product.
And we just do it a little bit differently than the operators or maybe the other business analysts, right? We have a different toolkit and it is a bit more technical, right? We write code. We all know R and we know Python. We're not software engineers. We try to be diligent in our coding practices, but it is definitely not software engineering that we're doing. And then the other thing I'll say is we don't shun Excel or spreadsheets. I think there is kind of a stigma going around that Excel is old and it's over there, and people that use Excel are just business analysts. I sometimes say that Excel or CSVs are the whiteboard of data analysis, right? How quickly can you prototype an idea and communicate it with your partners and your stakeholders in Excel, versus trying to mock something up?
What I love most about it, and why I keep doing it (I'd say I've been doing this stuff for about a decade, the past four years at Target), is that moment when you are analyzing data or you're finding an insight and your perspective just kind of changes, from you thought the world was a certain way to wait, that doesn't make sense, or holy cow, I never really thought about it that way. I truly believe that if you can take in multiple perspectives and really shape an opinion or a view about how the world is, and we're trying to get as close to truth as possible, you end up making better decisions, right? You make better decisions as an individual, your team makes better decisions, your stakeholders do, it's good for your company, it's good for the individuals. And then those good decisions compound on each other. And you ultimately have a profitable company, or I think a better life.
I truly believe that if you can take in multiple perspectives and really shape an opinion or a view about how the world is, and we're trying to get as close to truth as possible, you end up making better decisions, right?
Problem formation and finding unknown unknowns
Yeah, so as with everything, there are two hands. On one hand, if you're a data or data science team, you can sit there, right, you can sit at your desk, look at the backlog, and handle requests as they come in: you need this data pulled, or you need this analysis done, or you need this question answered. On the other hand, there is a state where you want to be thinking creatively and really trying as best as possible to find the unknown unknowns, and to tackle those, because if you can find some of those and bring them to the business, that becomes, in my opinion, the true unlock. And in order to do that, you need a little bit of credibility, or maybe a lot of credibility.
So two of the ways that I try to do that in my daily life, in my line of work: my stakeholders in operations have weekly business reviews, or WBRs. I make sure that I attend those WBRs so I can listen to the supply chain operators or the business analysts talk about the problems they are seeing in their business, because that can give me a spark of, oh man, I know I'm reading this book on regression, maybe I can connect some of the methods I'm learning over there with some of the problems I'm hearing about on this call. And then if I'm able to ask questions within those WBRs, people say, oh man, Frank shows up, or Frank's team shows up, and they always ask really good questions. And if people think, right, it doesn't have to be an answer, if they think that you can ask good questions and make them think about something differently, they're going to invite you back.
And then the second thing that I do is I write a biweekly newsletter. I do a few things with it, and I've experimented in a few ways. But ultimately, what I'm trying to do is write about the industry and the state of the supply chain industry, connect that with data analysis methods, and do it in a way that is educational. So anyone can read it, whether you're a supply chain operator or whether you're a data analyst, and get value out of it, because you're kind of creating that bridge between the technical methods and how they apply to the problems you are solving. And I do those two things because at the end of the day, when I bring a solution, or my team brings a solution, to our stakeholders, I want them to listen, right? Because if they don't listen, the solutions aren't going to get implemented. And by doing this, I basically integrate me and my team into the broader objectives that we're trying to work on.
When I originally started doing it at Target, it was totally grassroots. I put everybody that I had on the email distribution; it wasn't BCC at first, right, I just put everybody out there. And at the beginning of the letter, I would say, hey, this is why I'm doing this. If anybody wants to be included on this newsletter whose name you don't see out there, pass it along. And if they want to be on it, they need to ask me. I actually had a sort of strict policy, because I really wanted people that were interested in it. So unless you explicitly asked me via email or called me and said, hey, can I get on your newsletter distribution, I wouldn't add people. I never added people just because I thought they would be interested. I wanted it to be a total pull system. And it worked really well. I created a large distribution, and now it's like distributions within distributions of teams. So I think there's somewhere close to 200 people on my distribution, and it's really word of mouth that keeps it growing.
Business analyst vs. data scientist
So there's one question there that says, What is the divide between a business analyst and a data scientist in your eyes?
Yeah, awesome question. So concentric circles, of course, but I'll explain it one way. In my eyes, the business analyst is there every day, trying to respond to the questions at hand for the day or the week, or maybe the two weeks. Most of the stuff that they are working on is going to be relatively one-off, right? Hey, can you help me figure out this carrier problem this week? Can you help me figure out why a vendor isn't shipping on time this week? And then with those insights or those discoveries, ultimately, we want to be able to repeat them. So that's where I see the technical problem solvers bringing their more technical skills in and saying, oh, that's great that we found that, let's do that over and over again. So, right: business analysts, more ad hoc one-offs, and then the technical problem solvers, we're trying to scale it up and make it repeatable.
Hiring for behaviors
The most important thing that I have to do when I'm running a team is identifying the right talent and then trying to convince them to come onto the team.
What I'm really getting to is, when I think about the right talent, what does that mean? I heavily index on behaviors. And I always think that if you can find individuals that have the right behaviors, then the outcomes are going to follow from that. So when I think about behaviors: folks that are bold and intellectually curious, who have a hard time sleeping if they can't go find an answer or come up with a process to go find an answer. That's really huge. I find extreme ownership to be another winning behavior. I want people on the team who are going to show up and say, oh man, that problem happened, that's on me, I should have done something differently. Even when it's not on them, right? Everyone else in the room looks at them and says, that's definitely not on you, there's nothing you could have done differently. But just having that attitude of saying, I could change something I do next time, and that sense of control, is really big. And then also I always talk about learning frameworks. I think that the people on my team should be learning constantly, and not just, hey, I have this project, I have to learn a new forecasting skill. I think it is really important for all people that work in the data industry to just be constantly learning and reading, because, right, I referenced the unknown unknowns before, and I feel like that's how you can ultimately find those. You'll be pummeled with business problems all day long, but it's the people who are going outside of the normal day and trying to get other information and bring that back to those problems who really thrive.
Onboarding new team members
Yeah, for sure. So there are, I'd say, two components that I get new team members to focus on most heavily. The first is developing relationships. There are a lot of get-to-know-yous at Target; you come work at Target, you see those all over your calendar. But because I do index heavily on the relationship aspect, because I think the relationship will help facilitate communication, I say, let's make sure that you are getting to know our stakeholders and our partners better. So I'll set up a lot of those and make sure those coffees are happening. And then on the technical side, we have the technical tools and the processes that we use. It's not incredibly diverse. We do use R and Python; I know some shops who are relatively small will just be an R shop or a Python shop, but we have a relatively small set of tools that we use. So I do have team members, and I think I always have on any team that I've built, who are very inclined to teach other people what they know. You know who those people are on your team, and you say to a new team member, hey, this is your partner for the first month. They're going to show you the tools that you need to know, and where the data is that you need to know. And then they'll also show you what we don't have today, because we do have limitations in our capabilities.
And then there's how I check in. Especially if it's someone relatively new, say someone coming out of school where it is their first or second job and they've only been in the workforce for a short time, then even though I have senior managers under me, I'll personally make sure to have a 15-minute check-in with them every day. Especially in a virtual world, they might feel a little bit lost. So I'll have a 15-minute check-in every day and kind of say, hey, who are you meeting? What do you think? Did you like them? Would you have another coffee with them? But then also, hey, what technical skills or what data set did you learn about today? Just to keep us aligned.
Interviewing for behaviors
So personally, I have not used assignments. The first thing is, I spent a year working at Wayfair, and I think this is a habit I picked up there, because if you've ever gone on an interview at Wayfair, you've probably been through a case question or a few case questions, because they love those there. I picked up the habit, and I feel that working with a potential candidate, no matter what level, through a case question, trying to solve a problem together, either on the whiteboard or with a piece of paper so you can show each other what you're writing down, lets you identify a considerable amount of behavior and attitude through doing that exercise with each other. And then also, follow-up questions are huge in the interview. So there's always the, let me ask you the first question, and then if the candidate gives you an answer, always have a follow-up question, because I find if you can go a level deeper, that's the more revealing segment.
Number two, I had a partner who had a linear regression interview question. It would be, hey, do you know R, do you know Python? People that could do it in Excel, great; I don't think anyone ever tried to do it in Excel. But it was a relatively basic build-a-linear-regression question, and then you'd go through it and talk through the ahas: oh, there's heteroscedasticity in the model, that's going to throw us off. And you'd have candidates run through that, and you can kind of see how they think. It really wasn't about can you do the code, right, can you code the problem. It was more about how they think and how they try to navigate around the problem at hand.
Managing tools and reproducibility
So personally, I'll start off with a no answer, and then I'll back out from there. No. But we're almost being forced to be better about it, because Target is developing better tools. We have a team called Dataminr, and Dataminr is now producing hosted versions of RStudio and hosted notebooks, both in Python and R. So that is helping us to align our versions, right? At any given time, people are creating new notebooks and new scripts, and it's kind of forcing us, and that is looped in with our GitHub repository. So that is forcing us to align, which I think is the purpose of having great data tools, right? If you have great data tools, they don't allow you to have bad habits.
Well, I – I wrestle with this constantly. I work in pharma. We're regulated, and it worries the heck out of me what versions of things people have on their desktops, right? And we have kind of like a separate instance of R, which is the kind of fully qualified, you know, documented SDLC stamped controlled version. But still, I know that colleagues are going to develop code on their desktops, and then when it breaks, they're going to come to me as resident R geek and go, hey, Mike, why is this broken? And, you know, my kind of gut reaction is burn it all down, get the standard version, retry, and then if it's still broken, I'll help you.
But that kind of battle of reproducibility, you know, managing versions and, you know, even just trying to help colleagues is super hard if you can't guarantee what they're using.
I used to say to colleagues, okay, there's a kind of shopping list of packages that we roll out. So we said, here's an officially supported version of R, and with that comes this set, right? Because that set is also going to be aligned to that version of R on our regulated, qualified instance. So if you develop with that set, we know you'll get it to run on our production system. But that was horrible to maintain, because it was about a thousand packages once you count the packages plus all their dependencies. And if you're trying to push a thousand packages to someone on their first install, A, it takes a while, and B, if it breaks, then you've got a lot of mopping up to do. So now we're switching over to supporting RStudio Server Pro, and we're saying, okay, this is where you do your development work. If you want to work on your desktop, here's the version we're supporting and here's a get-you-started set. And, you know, we get that from RStudio Package Manager. But we're then saying, look, your desktop is just a sandbox. Try stuff out. If it works, great. If it doesn't work, whatever. And if you get into a complete mess, then just blow it all away and start again.
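A toy version of that "officially supported set" check can be sketched in Python. This is a hand-rolled illustration with made-up package pins, not how either speaker's environment actually works; real setups would lean on lockfile tooling (renv for R, pip-tools or conda for Python) rather than a manual audit:

```python
from importlib.metadata import version, PackageNotFoundError

# Hypothetical pinned "supported set"; a real one would be generated
# from a lockfile, not typed by hand
SUPPORTED = {
    "numpy": "1.26.4",
    "pandas": "2.1.4",
}

def audit(manifest):
    """Return (package, expected, found) for every pin the environment violates."""
    problems = []
    for name, expected in manifest.items():
        try:
            found = version(name)
        except PackageNotFoundError:
            found = None  # not installed at all
        if found != expected:
            problems.append((name, expected, found))
    return problems

# An empty list means the desktop matches the supported set; anything
# else is a candidate for "blow it all away and start again"
print(audit(SUPPORTED))
```

The design point is the one from the conversation: pinning a thousand packages by hand doesn't scale, so the check belongs in tooling (a managed server or a package manager), with the desktop treated as a disposable sandbox.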
BI tools and data science
So personally, I would say yes. I would say it's a step in the data science process. I haven't been able to understand why data science gets framed as just building predictive models, with BI as something separate that comes before it. Because how do you build good predictive models if you can't do feature engineering, and where does good feature engineering come from? To me, that comes from the BI stuff, right? Exploring your data visually, trying to understand how, in the real world, that data got created. Someone could probably disagree with me and say, well, with some of this deep learning stuff and the neural networks being created, you can just feed in a bunch of data and get pretty good predictions. But I don't know; at least in my world, a lot of it is you've got to know your data. You've got to know the process through which it's created. You've got to know what it's telling you and what it can't tell you before you build a good model.
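That exploration-then-features flow can be sketched in a few lines. The shipments table and column names below are entirely hypothetical, invented for illustration, not Target data:

```python
import pandas as pd

# Hypothetical on-time shipping data
shipments = pd.DataFrame({
    "carrier": ["A", "A", "B", "B", "B", "A"],
    "lane":    ["X", "Y", "X", "X", "Y", "X"],
    "late":    [0, 1, 1, 0, 1, 0],
})

# BI-style exploration first: where is lateness actually concentrated?
late_by_group = shipments.groupby(["carrier", "lane"])["late"].mean()
print(late_by_group)

# Only once that picture is clear does it make sense to encode it as a
# model feature, e.g. each carrier's historical late rate
shipments["carrier_late_rate"] = (
    shipments.groupby("carrier")["late"].transform("mean")
)
```

The grouped view is the "BI step"; the derived column is the feature engineering that falls out of it once you understand how the data behaves.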
From my perspective, the only time that I run into that is in the delivery or the output, right? They want output to be standardized to one platform or one view, and standardization has its value, right? Being familiar with it: you open your phone or you open your laptop, and you take a look at the reports of the month or the week. It is helpful for leaders across different teams and different parts of the company to have a common view. So to me, I can understand that component to a degree.
Hey, so this is Hugh again. This is kind of what we're actively living right now. We've got certain officially blessed things that don't quite meet all of our needs, and we're having to justify why we want to go to alternatives. One of the things that's really helped us out with our leadership is showing value early: getting something done that can't be done another way, or is not easily done with the resources we're given, and then taking that and saying, hey, here's how we can add value using these open source tools that don't cost us anything. And that's usually the big hook right there: this cost us nothing other than our time to build. But I will say it's still an uphill battle. And something that I've seen, I'm new to my company, is that we're using things like Power BI to do a lot of the more complex data cleaning and data manipulation. And to me, that's not what Power BI is built for. You need to pull that data out and run it through R, or run it through Python, whatever it may be, it doesn't matter, just not in Power BI, because you're going to slow your whole process down.
Analytical vs. critical thinking
Man. Okay. So when I think about critical thinking skills, in my mind I layer in a component of curiosity and drive. Because I think about: my bicycle is broken, right? I'm going to figure out what's wrong with my bicycle. You try something; that didn't work. Then it takes you a second to think, man, what else could it be? It could even be a flat tire, right? What are the reasons my tire went flat? And I look at the tube, and this is the reason why the tube went flat: it's broken. And certainly the amount of critical thinking that goes into that will hopefully lead you to not inflate your tire as much next time you go riding. To me, going through all those steps is critical thinking, right? So it applies to a ton of problems that we face in our daily lives and in work. I will contrast that with analytical thinking, which is the ability, or the tendency, to apply quantitative methods to try to help answer your question or solve your problem.
Yeah, I think the way that I started to put it into a mental model for myself was that critical thinking is kind of like what you described. You think about all the ways things can go wrong, and you're really thinking about the interactions, how all the different components fit together, and when you pull this lever, how it affects the rest, all the different inputs and all the different pieces working together. So it's really systems thinking. It's a big-picture approach. And for the analytical, I would agree with what you said. Aside from the creativity part, the analytical is reductionist thinking, which is how many, many people are taught to think, in universities and when they learn programming or data science. You break something down to its component parts and then you just focus on understanding this one little tiny thing. I'm a biologist: you study this one gene and that's it. Back in the good old days, you would have somebody study a single protein for 40 years, so they would dissect that protein for everything that it could possibly do, and then come back around and say, well, we want to understand how that protein works in the whole system. So I map critical thinking onto the big-picture systems thinking, and analytical thinking onto the reductionist component pieces. And the issue that I realized with critical thinking is that we learn, and we're taught, and we live in a society that's very much focused on analytical thinking, right? And I think that's why critical thinking is really hard for technical people: because we learn by breaking things down, reducing them to their component pieces, but not putting them back together.
I like it. I like the micro versus macro distinction. And as you were describing that, it got me thinking about Steve Levitt, the economist, someone I mostly admire. He talks about his experience: in the past, he tried for about three months to be a macroeconomist, but he just couldn't get his brain working that way. And I wonder if that is the critical thinking, whereas he's really, really good at the analytical thinking, right? The more micro stuff.
Scheduling time for critical thinking
I mean, to me, that is something you have to practice, right? I would assume no matter what size company you work at, or if you work for yourself, in our society we have a lot of back-to-back meetings, right? We're always scheduling time for ourselves to meet with people to work on a certain project. For me, over the past few years, I've been listening to these thought leaders who make it very apparent that if you're not scheduling time for yourself to think about how the components of your team or the components of your stakeholders' problems fit together, you're always going to be reacting, right? You're never going to produce things that are truly game-changing. So for me, and it's been a few years in practice, I've been very adamant about scheduling time for myself and making sure that I disconnect and turn the laptop off. I also do a ton of running. I've been a runner my whole life, and that hour of my day is super, super important for that critical thinking. I go out, I don't listen to anything, I don't have headphones on. So that's an hour, typically running through the woods on a trail, where I'm not distracted. And whether that's at lunchtime, or before work, or at three o'clock in the afternoon, people that I work with now know that that is a critical part of my day. But it's hard, right? A lot of people don't have that luxury. I think I've created it over the past few years.
Mental models and the Pareto rule
What do you think is the most underused mental model in the analytical field?
I'm going to preface this by saying there's this philosopher, Derek Sivers. If anyone knows him, you know why I call him a philosopher. But a big thing for him is, if he's going to a meetup, or an interview, or anything like that, he always tries to get the questions in advance, because he knows that his first answer is going to be the most important one. So, with that said, I'm going to give you the first thing that comes to my mind, not necessarily my best answer: the Pareto rule, the 80/20 rule. I think if you are coming out of college, day one, joining really any team, let's say a data science team, continuously considering the 80/20 rule, where 80% of your results, or your outcomes, are driven by 20% of your effort, is really, really powerful. And if you're a team and you can embrace that, you can really start to understand, to tie this back to our last conversation, that focus time: I need that time off so I can decompress, so that when I come back for the hour of work, my mind is there and I'm in it, because it really boils down to quality over quantity. You can work eight hours straight in a row, and after the first two, two and a half hours, the quality of your work is not going to be great. So if you can figure out a way to really rest your brain and identify that 20% of your time, and make it really, really good and focused, it's going to drive a really quality 80% of your results and outcomes.
You can work eight hours straight in a row, and after the first two, two and a half hours, the quality of your work is not going to be great. And so, if you can figure out the way to really rest your brain and identify that 20% of time, that 20% of your time, if you can make that really, really good and focused, is going to drive a really quality 80% of your results and outcomes.
Communicating with stakeholders and decision intelligence
We, as in Greater Target, not just me and my team, have built an in-house visualization tool. So, similar to Domo, but we built it in-house. We use that to run all of our weekly business reviews. And we build those in such a way that the experience is consistent. It's still a work in progress; my peer that's working on it hits a huge new milestone every single month. But the functionality is repeatable across all of our weekly business reviews. So I'd say that's big, right? Because one leader could be in one business review, walk into another one, and it's the same experience. And then the other is that when we do more technical analysis, and in full transparency, my team is still catching up here, I have a partner team that is always delivering their analysis in R Markdown. And it is Target-branded, and it always looks the same, right? You have your document, you have your table of contents, you have your problem statement at the top, and then you go from there. And the visualizations are really crisp and clear, and sometimes they're Plotly, so they're interactive. At first, the stakeholders were like, why not Microsoft Word? What's going on here? But over time, they kept consistently going back to that R Markdown delivered in the same way, and now the stakeholders expect their analysis to look like that. So it's easy for the people doing the analysis, right? They do their analysis in R, they use R Markdown, they deliver it, and their stakeholders are expecting that. So I'd say those are the two patterns that I've seen work really well in my experience.
So, historically, and I say historically lightly, because I've only been with Target for four years, but it was similar in prior roles, there's this perception that data teams, whether you're a data science team or a data analytics team, do reporting. You do metrics and reporting; you deliver me my reports. My decision to rebrand to decision intelligence was really to help my stakeholders change their perspective on what we do. This was my decision, and I think, like maybe a lot of people, I fell in love with Cassie Kozyrkov and her posts on LinkedIn and her Making Friends with Machine Learning videos, and I thought, oh, this really gets the point across. So I made the branding change to decision intelligence, and it has really helped. When I did that, I started off with all my stakeholders and said, hey, this is what we're doing. We're starting from decisions. Once we start from decisions, we'll understand default actions, and then we can really start to back into the numbers and metrics and models. But really, me and my data analysts are coming to the table thinking about the decisions that you need to make, because we know there are problems that happen every day, and we can't really do anything about them if we don't understand what the decisions are. It was really a reframing of stuff that we always wanted to do, but what it has allowed us to do is have stakeholders and partners working with us who are now thinking a little more broadly about the role of data teams, particularly my data team.
My decision to rebrand to decision intelligence was really to help my stakeholders change their perspective on what we do. We're starting from decisions. Once we start from decisions, we'll understand default actions, and then we can really start to back into the numbers and metrics and models.
So I think I'm fortunate in that, of all my stakeholders today, half of them I would say I'm friends with. So we're very locked in, and I understand, as they make a decision, what's happening. So I do understand when they actually take an action, but I'm not really satisfied with just knowing that they took an action. What I'm trying to figure out right now, and if anyone has thoughts on this, I'd love to have a conversation later, is this: if you make a prediction that a problem is going to happen, or that service level performance is going to deteriorate, that prediction is out there. You have stakeholders or operators or whoever with access to that information. You don't know specifically what they're doing, right? They could be having calls with people. They could be turning dials that you don't even know about in the background. Is there a way that over time you can log your historical predictions, and then look at what actually materialized? Is there a way, and I really think it's regression modeling, to say: based upon what we see in the groups where we are predicting performance deterioration, compared to the whole, we can estimate whether the predictions are having the intended effect?
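One naive sketch of that idea: log the risk scores and the warnings that were issued, then regress the realized outcome on both. Everything below is synthetic, invented for illustration, and the warning coefficient only identifies a mitigation effect under strong assumptions, since warnings go to the riskiest groups by design:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Logged historical data: a risk score per lane-week, and whether a
# deterioration warning was issued (here, a simple threshold rule)
risk = rng.uniform(0.0, 1.0, n)
flagged = (risk > 0.7).astype(float)

# Simulated realized deterioration: rises with risk, but a warning
# triggers mitigation that lowers it by 0.15 on average
outcome = 0.2 + 0.5 * risk - 0.15 * flagged + rng.normal(0.0, 0.05, n)

# Regress outcome on risk and the flag; the flag's coefficient is the
# estimated effect of issuing a warning, conditional on the logged risk
X = np.column_stack([np.ones(n), risk, flagged])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"estimated warning effect: {beta[2]:.3f}")  # negative means it helped
```

This only works because the risk score that drove the warning is logged alongside the outcome, which is exactly the "log your historical predictions" discipline Frank is describing; without it, the comparison between flagged and unflagged groups is hopelessly confounded.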
I know that some of the stuff we're doing is not going to get used. I hope that it's not going to get used in the wrong way, and I hope we're being proactive and educational enough that it won't be used in the wrong way. I know some stuff is going to be a waste. I don't have a great response for it. But I think if people keep asking us to try, if you keep getting invited back, there's got to be something there.
I love writing. I don't know if I emphasized that enough on this call. I think it is really important for data people to write. When you're forced to write something, it really makes you rethink how your mind is working. I would love it if people discovered my blog, and LinkedIn is a pretty easy way to find me, as long as you're looking for Frank Corrigan rather than Francis. Awesome. Thank you so much, Frank. Really appreciate you joining today, and thank you so much, everyone else, for joining. We'll be here again same time next week. I'd love to see you there, too. Have a great rest of the day, everyone.
