Evan Wimpey: Hello and welcome to the Mining Your Own Business podcast. I’m your host, Evan Wimpey. And today I’m excited to introduce a unique guest. Mike Arney is with us today. He is the founder of Pickleball Vision. That’s right. Pickleball Vision, a super interesting use case for data and for analytics. Really happy to chat with Mike today. Mike, thanks so much for coming on the show.
Mike Arney: Yeah. Thanks for having me. I hope I can provide some insight to your audience here. It’s—I know it’s maybe a little bit outside of your normal talking points but should be maybe a really interesting real world use case to chat about a bit.
Evan Wimpey: Certainly. I hope so. And not to draw too much inference from a small sample size, but my small data science team occasionally plays some pickleball. And I suspect it’s sweeping data science teams across the country. So a little bit of fun and data mixed together is great, Mike, to get started.
Maybe can you just let us know what is Pickleball Vision and how you got it started?
Mike Arney: Sure, sure. So, I think a lot of folks might be familiar now with the idea of analytics in sports. Of course, we are really doing that for pickleball. We’re one of a couple of, I would say, first companies to this market, because the game is still small. It’s growing exponentially, as most people are aware (maybe not exponentially, but pretty quickly). And yeah, so we use computer vision to track the ball, the players, the court. We track all shots, we compile that data into a database, and we pull that out to create analytics reports for individual players.
And that can be used for many different things. Mostly, I would say, player self-improvement is a big one. Pickleball is really easy to learn, but once you get to the competitive level, the learning curve goes up pretty quickly. There is a competitive level; it’s not just kind of a casual sport, and that’s really our audience.
Players that want to, I would say, move past an intermediate to advanced skill set for competitive play, and have the ability to go back and not only re-watch their film, but have some data points to, let’s say, digest during the downtime, or after-actions they can use to make improvements for the next time.
Evan Wimpey: Awesome. Very cool. Mike, that’s a really interesting use case. Can you talk a little bit about your background and what prompted you to start in the first place?
Mike Arney: Sure, sure. So my background is in UX/UI design. I’ve been a—let’s say graphic designer for 15 years now. I’ve always been really interested in data visualization.
It’s always kind of been, not a passion of mine, but something that I’m interested in. So, you know, last year a couple of things kind of all came together. One was, you know, AI, which everybody has heard about as a thing. Like, maybe I should be versed in this field, because I see the amazing things that it’s doing; how can I actually get my toes in the water and understand how this works from a functional perspective?
So, a desire to have that knowledge base, because it may be important in the future. That was one. And two, I got addicted to pickleball. It’s something that really took hold of me, and when I was having some downtime, I kind of started to think about, well, okay, how can I combine these skill sets into something that allows me to learn about AI, computer vision, machine learning, whatever term we want to use?
But mostly, on a realistic perspective, it’s computer vision and machine learning really, more so than AI. Anyway, it’s a combination of those three things. And I had a bit of time where I was able to, you know, experiment with the solution, make sure that it was something that could actually work, first of all. And then once we were able to vet it, it started to snowball a little bit; I shared it around with some people.
We had a proof of concept. The demand was really high. People were really interested in the output we were getting, even from, like, the proof-of-concept stage. And then I discovered it had some legs and decided to just kind of continue with this experiment and turn it into something a little bit more real. And we are still making it real. That’s our goal right now.
Evan Wimpey: That’s very cool. Mike, you mentioned your background is in the tech world, but maybe not necessarily in computer vision or machine learning, right? How did you go about vetting the solution? There are a lot of learning resources.
Yeah, you know, folks study this at the PhD level. Folks watch 30-minute YouTube primers. There are almost too many options to explore. Like, how do you learn the basics of computer vision? Can you talk about that process a little bit?
Mike Arney: Yeah. So I skipped all the linear algebra portion and the computer science portion, because I don’t write code, you know; I’m not writing functions and variables and loops.
And I know all those things, but I don’t write them. So my experimentation and vetting started with, first of all, researching what else is out there, what other people have done in other sports. You know, tennis is a big one. Tennis is a big sport, and in some cursory research, I found that people were doing something very similar with tennis, right?
And I found some CV models that were tracking balls, players, courts. So because I found those products on GitHub, I at least knew that they were possible. But then for my actual experimentation, I stumbled across Google’s Vertex AI, which has a pretty clunky interface, but an interface nonetheless, which is what I really needed in order to make this proof of concept.
So that process started with, you know, the bunch of free credits they give you to jump in and try Vertex. I had to go through and gather a bunch of videos, upload the videos, and then I started learning how to annotate. So basically I learned how to draw a box around a ball to say, this is a pickleball. But also, you know, I had to do that thousands of times, so I kind of understood the process. And then we’re also detecting rally start and end points.
So, like a game of tennis, you know: you hit the ball, then the point is over, there’s a downtime, then you go back into the next rally and hit the ball. So another training process that I did via Vertex was rally detection, so that we could detect the start and end point of a rally. I basically got those two things working in Vertex, had something that I could kind of show around, and had confidence in myself that this was worth maybe employing a real developer to work on a proof of concept; that it wasn’t just going to be a waste of money, basically. And then went from there. So Vertex was great. It had, like I said, a clunky interface, but it was enough to at least give me the confidence to put a little bit more momentum towards the project.
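For listeners curious what that annotation step looks like in practice, here is a minimal Python sketch of one bounding-box label, the kind Mike describes drawing thousands of times. The field names and the normalized-coordinate convention are illustrative assumptions, not Vertex AI’s actual export schema:

```python
# One hand-drawn annotation: "this is a pickleball" at this spot in this
# frame. Boxes are stored as normalized corners in [0, 1] so the same
# label works at any video resolution. (Illustrative schema, not Vertex's.)
annotation = {
    "label": "pickleball",
    "frame_index": 1042,
    "box": (0.48, 0.31, 0.52, 0.36),  # (x_min, y_min, x_max, y_max)
}

def box_to_pixels(box, frame_width, frame_height):
    """Convert a normalized box to pixel coordinates for drawing or training."""
    x_min, y_min, x_max, y_max = box
    return (
        round(x_min * frame_width),
        round(y_min * frame_height),
        round(x_max * frame_width),
        round(y_max * frame_height),
    )

# The same label, mapped onto a 1080p frame.
pixels = box_to_pixels(annotation["box"], frame_width=1920, frame_height=1080)
```

The point of normalized coordinates is that the hand-drawn labels survive any later change of video resolution.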
Evan Wimpey: Alright. And I want to ask what the relationship is with Vertex now, if you’re still using it. Yeah, sort of what I’m mapping this to is that I’m thinking about folks in industry that maybe have an idea for an analytics project. Hey, let’s jump into this. And you know, there are sort of two options there.
There’s the at-scale, in-production version, where it needs to look like this. But maybe there’s sort of this Vertex equivalent, where we can see if this has legs at all, where we can vet whether this is a potential solution. So I’m curious if you ended up being tied to Vertex, or if that was just sort of a “we want to see if this is possible at all” thing.
Mike Arney: Well, I think Google would have loved for me to be completely tied to the platform after the proof of concept was done. We were able to get away from it, and it wasn’t like we needed to migrate away at some point, but I would have stayed with it if it had been the right choice at the time.
But to answer the question more specifically, it was a great platform for me as somebody who can’t write code and can’t go download a machine learning or computer vision library from GitHub, put it on my local machine, and install pip and all those dependencies. And, you know, that would have taken me way too long, and I don’t have enough hair left to pull it all out. So it was really helpful for, I think, those of us that are not technical but can at least understand some of the logic that’s happening and can understand capabilities. So we’re not tied to it anymore. It gave me some pretty good results at the beginning.
We tried to tie it into the initial system for rally detection, so we were using the Vertex AI model to detect the rallies, and then we were using our bespoke computer vision model for ball detection and player detection, but at the end of the day, going through two different systems didn’t make sense.
So having one model on Vertex and three models in our kind of bespoke system didn’t make sense. So we retrained the action recognition model, and then we had all four systems in one repo, one infrastructure bucket. So it was kind of an efficiency thing at that point, because we had done so much custom work that it wasn’t making sense to go back.
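Whatever model does the rally detection, the output ultimately has to become rally start and end times. A minimal sketch of that post-processing step, assuming the action-recognition model emits a simple in-rally flag per frame (the function and its threshold are illustrative, not PB Vision’s actual pipeline):

```python
def rally_segments(in_rally_flags, fps=30.0, min_len_s=1.0):
    """Merge per-frame in-rally flags into (start_s, end_s) segments,
    dropping blips shorter than min_len_s (illustrative threshold)."""
    segments = []
    start = None
    for i, active in enumerate(in_rally_flags):
        if active and start is None:
            start = i                    # rally begins
        elif not active and start is not None:
            segments.append((start, i))  # rally ends, downtime begins
            start = None
    if start is not None:                # video ended mid-rally
        segments.append((start, len(in_rally_flags)))
    return [(s / fps, e / fps) for s, e in segments
            if (e - s) / fps >= min_len_s]
```

With `fps=1.0` for readability, flags like `[0, 1, 1, 1, 0, 1, 0]` would collapse into two rally segments, and the gaps between them are the downtime the report can skip over.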
But yeah, I mean, I would recommend it. It’s definitely great as a system in and of itself, and as for using other systems like that to vet an idea, I don’t see why not. And plus, you know, Google gives you lots of credits, so you can experiment for, you know, 30 hours or so before you have to actually start paying.
So that was a plus as well.
Evan Wimpey: That’s great. Google and a lot of the other big players that offer these tools want to get you locked into that ecosystem, for sure. Yeah.
Mike Arney: And, you know, we actually are still locked into Google Cloud Platform, though, because that never went away.
We have all of our videos hosted there. So, you know, from a business perspective, we are still in that ecosystem in a more limited sense, but that’s not one we can easily migrate away from; it’s not like we can just move to AWS now. It’s like, all right, well, you got us there, which is fine. The other one to mention really quickly is Roboflow.
For me, it was a really great computer vision software platform to start learning with. Again, it was very user-facing, and it allowed me to visualize the things that I needed to learn. So that was another great experience. Not something that we ultimately used, but it gave me the education that I needed to move the project forward.
Evan Wimpey: Awesome. Yeah. And that’s, I mean, sometimes that’s really what you need: to get your background, get your technical learnings up to the point where you can try to make the right decision. And we’ll certainly leave Roboflow and Vertex in the show notes here for folks that are interested and want to learn more. Mike,
I want to speak—I don’t want to cast my own shortcomings on the entire data science community, but oftentimes UX/UI is sort of the afterthought: let’s do the analytics, let’s get the data, let’s build the tool. Oh, great, now we have to make this usable for somebody who’s actually going to look at it, to make a decision, to try to get better.
You’re coming from a UI/UX background and sort of learning the analytics piece as you go. Is this something where you’ve already got a user interface, or you’re thinking about the user interface as you’re going through and annotating and deciding, you know, we need to detect rallies, we need to detect downtime? Do you already have a user interface, either visualized in your head or actually framed out the way you want things to look?
Mike Arney: Yeah, so we do. So I approached it from, you know, one day sitting on the bench playing pickleball, just kind of waiting for the next game to start and thinking about, like, wouldn’t it be cool if I had this heat map that could show trajectories, that could show my position? Kind of just visualizing an analytics report in my brain, a wouldn’t-it-be-cool sci-fi scenario.
So I definitely do approach things, and always have, from that end point, and then try to work backwards. It’s just also something that I find enjoyable, and I’ve had, I think, a lot of success in improving skill sets based on visual references or analytics or reporting. You know, graphs and bars and donuts and all the interesting things that can give you a perspective that you can’t get from first person.
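A position heat map of the kind Mike imagined can be sketched as a simple occupancy grid over the court. This assumes player positions have already been mapped into court coordinates in meters (a standard pickleball court is 20 ft by 44 ft, roughly 6.1 m by 13.41 m); the grid resolution here is arbitrary:

```python
def position_heatmap(positions, court_w=6.1, court_l=13.41, grid=(4, 8)):
    """Count player positions (x, y in meters, origin at one corner)
    into a rows-by-cols occupancy grid over the court. Defaults use
    standard pickleball court dimensions (20 ft x 44 ft)."""
    cols, rows = grid
    counts = [[0] * cols for _ in range(rows)]
    for x, y in positions:
        # Clamp so points exactly on the far lines land in the last cell.
        c = min(int(x / court_w * cols), cols - 1)
        r = min(int(y / court_l * rows), rows - 1)
        counts[r][c] += 1
    return counts

# Two visits to the far corner, one near the origin.
heat = position_heatmap([(0.1, 0.1), (6.0, 13.4), (6.0, 13.4)])
```

A rendering layer (color scale, court outline) would sit on top of these counts; the grid itself is all the data the visualization needs.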
So yeah, we have a full interface going. The product vision is really the same as it was, which is a player profile that can show your trends over time. That player profile consists of multiple games, and each game itself has a report.
So that’s the big idea, right? And that’s still there. Obviously lots of details change, but the overall big idea around trends and individual game reports is still really core to what we’re doing. So yeah, that’s how I approach this, because it’s more of, I would say: here’s a problem that we can solve.
What would a possible solution look like? Ooh, that sounds really cool. Or that looks really cool in my brain. Is it possible? Let’s find out, right? Let’s do a little bit of research. Let’s experiment.
Evan Wimpey: Yeah. Very cool. So the overall vision has remained there. Have you been able to collect any user feedback?
Either yourself as a player, or have other users of PB Vision been able to say, well, I wish I could see this, or this is useless, or I don’t understand what this means?
Mike Arney: Yeah, tons. So I engaged with the community right away, as soon as I had a proof of concept that was visually interesting. I put it up on Reddit, partially to tell the world, like, hey, we made some really good progress here. Check it out. Tell me what you think. And, you know, people like to watch videos of their favorite sports with, like, little markers and graphs and circles flying around, you know, data points, right? It was interesting for people, and it really is interesting to watch. So we started to really get some community traction that way.
I got a bunch of great feedback, great comments, discussions on Reddit. We got a Discord server up and running. So we kind of started this laddering process, I would say, for folks in the community. So, you know, the real broad net was Reddit. If somebody in there seemed very interested, we could say, hey, join a smaller community on Discord.
So I think we’ve got like 40 people on there now. It’s nothing crazy, but they’re active folks, which is wonderful. And then from there, we even have a subset where you can move up into our external Slack channel, an even smaller group of collaborators that are really passionate about the project. Oh, and then as an aside, we also have a roadmap. It’s at roadmap.pb.vision, where, you know, you’ve all seen these things before: you can suggest a feature, people can vote on it, comment on it, and then we’ve got our kind of month-by-month roadmap on the left side of the page.
So yeah, tons of feedback. We’re trying to act on as much of it as we can, and I think we’ve got a really good loop going. That’s always been my experience as a user experience designer and an interface designer, you know, usertesting.com, testing the minutiae of all these digital products that I work on in my day job. I know the value now, and I kind of wanted to incorporate it from the get-go. So yeah, lots of opinions, lots of great ideas.
Evan Wimpey: Yeah, very nice. Very focused, very community-driven effort. And then, you know, I think with a sport that has gained popularity very quickly, the community support is there for a tool like this. Well, and to folks listening... go ahead.
Mike Arney: Well, I was just going to say, the other really cool part is we attract pickleball enthusiasts, yes, absolutely. But of those pickleball enthusiasts, there’s a subset of super talented developers that would love to see this succeed. You know, folks that are great with, let’s say, backend infrastructure, who say, you know, I’ll give you some time if you want it.
I can help clean these things up. It’s like, yes, please. Thank you. Amazing. And then we’ve got kind of like a Patreon thing too, where we’re sending out swag and whatnot. Sorry to interrupt, but it’s been—the community aspect has been really wonderful. And it’s people with shared passions, you know, which is great.
Evan Wimpey: Awesome. Yeah. And you probably have some developers listening right now. So hopefully folks that are interested can certainly reach out, get in touch, and be a part of that.
But the thing I wanted to comment on is this: maybe people are more passionate about pickleball than they are about their day jobs in supply chain or e-commerce or whatever, but there’s still the idea of soliciting community feedback, getting that feedback early, not just when you have some polished analytics tool, but early on, when your end user, the person who really is the customer, is the one who’s going to be consuming these analytics. And if they’re not bought in, it’s going to be a really tough climb for even the best analytics, whether it’s computer vision or forecasting or predictive or prescriptive analytics. Whatever it is, the end user needs to have some buy-in before analytics is thrust upon them.
Mike Arney: Yep, yep. And in our case, with, I would say, a larger audience, it’s important that we’re in some way able to filter the feedback to find majority use cases, right? Like, if there is a drawback to, you know, getting passionate folks involved: as we all know, some people are super passionate about certain things, and their voice can be pretty loud about those things. And it’s great, you know, have that passion, but you have to be very careful that it does not drown out the silent, or quiet, majority, right? I’ve seen that happen a lot personally.
And it’s hard to differentiate between them sometimes, but, you know, usually you use a little bit of community analytics on your community itself and just say, okay, there are some real numbers here, or votes. So yeah, filtering through that can be tough. I think that’s one of the hardest things we have right now, prioritization of features, because we’ve just got so many ideas and we want to do them all.
Of course, we’ve all been there, but we can’t, so we have to figure out which ones are really the most important, and the most important from a user-base point of view, right? I mean, we’re product-led, product-focused, but this is not at all about features towards monetization. It’s about getting the word out there and creating word of mouth as the first and foremost priority. You know, we want people’s jaws to drop, and in the largest sense, word of mouth is the first product priority. And I think that’s true for all use cases of analytics and reporting, right?
If somebody can say, like, oh my God, this insight that I have now is just amazing? That’s success.
Evan Wimpey: So awesome. Yeah, absolutely. Mike, so you’ve got the backlog of features to add there. There probably isn’t an end state; I would imagine there’s always more to do and new things to do. But can you look to the future a little bit and maybe see where you hope PB Vision is, you know, in the next year or so? And then also maybe address: what are your biggest challenges, technically or otherwise, to implement the features and get where you want to be?
Mike Arney: Yeah, great. So I think a year from now, our hope is to have a monthly subscription model.
You know, one that is very reasonable, that allows folks to upload enough games of themselves playing, let’s say five or six a month, perhaps, and really shows continuing value within that subscription model. So, you know, it’s on me as a user to upload the games, but if I do that, I’m really incentivized to continue to upload games so that I can see trends.
I can see where I’m improving. I can see where I’m stagnating. I can see where my strengths are, so that I can capitalize on those strengths within the game. So that would be my hope: a year from now, people are like, all right, this is a routine that I’m in. I’m either getting a recording from the facility that I play at, because a lot of facilities are recording games for people now, or I’m setting up my tripod and uploading my own game.
The tech challenges are tough right now from a processing perspective. So if a user uploads a video, it’s about a one-to-five ratio: for every one minute of video, it’s about five minutes of processing time. We haven’t done any sort of technical optimization, and it’s burning these, you know, these A100 GPUs in the cloud pretty hot.
And, you know, there’s definitely optimization that needs to happen, and it’s the balance between optimization and accuracy, right? Like, we want the accuracy to go up, but sometimes that increases the GPU load. So my hope a year from now is that we can have the processing time cut in half and the accuracy increased for, let’s say, shot detection and ball detection. The time is really not an issue, right?
Like, it’s an issue for us; it takes a while and it’s intensive. But for the user, it’s kind of like shoot it off and forget about it, then get the report and consume the data at your convenience. So I don’t think users are going to care; they’re not going to just be sitting there hitting refresh on their email, waiting for this report to come in.
I think my hope is that it’s going to come in and it’s going to be like, oh, cool, it’s here. Like a letter arrived in the mailbox, and now I can read it. So I’m not super concerned about that from the user perspective. But, you know, the faster the better, of course, right? If you can have it instantly, it would be amazing.
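The one-to-five ratio Mike mentions makes for a simple back-of-the-envelope model of turnaround time and GPU cost. The hourly GPU rate below is a placeholder assumption for illustration, not PB Vision’s actual cloud bill:

```python
def processing_estimate(video_minutes, ratio=5.0, gpu_rate_per_hr=3.0):
    """Estimate processing time and GPU cost for one uploaded game.
    ratio: minutes of processing per minute of video (the 1:5 mentioned).
    gpu_rate_per_hr: placeholder cloud GPU rate in dollars, not a real quote."""
    processing_minutes = video_minutes * ratio
    cost = processing_minutes / 60.0 * gpu_rate_per_hr
    return processing_minutes, round(cost, 2)

# A 12-minute game would take about an hour of GPU time at this ratio.
est = processing_estimate(12)
```

Under these assumptions, cutting the ratio in half cuts both the user’s wait and the per-game cost in half, which is why the optimization-versus-accuracy trade-off matters on both sides.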
So that’s a challenge. And yeah, I think the other really big challenge from a user perspective is getting folks to record the game, right? There is legwork that needs to be done to record your game. You have to get your phone, you have to get a tripod, you have to set it up and frame it, you have to hit record, you have to tell everybody, like, hey, I’m the guy over here recording this. So there’s that social thing you have to get over, right?
Like, anytime you’re taking photos in public, there can be a little bit of a social barrier to get over, I would say. But then there’s the actual work of lugging a tripod and setting up your camera. So the value of the reports obviously needs to outweigh the difficulty of setting it up and uploading.
So we’re trying to make that as easy as possible. If a user uploads or records a video on their phone, we’re compressing it down to the absolute minimum bit-rate file size so that they can start to upload it from a 3G or 5G connection before they get home, if their data plan supports it, of course. We’ve made it somewhat easy to say, okay, you’ve recorded this video.
You no longer have to go home, take it off your phone, put it on your desktop, and then upload it through a web interface. We can use our uploader app and just, you know, start the process there. But I don’t think we can make it any easier than that, right? Like, we can’t set up the tripod for you, we can’t hit record for you. So that’s a big barrier on the user side. But as I mentioned, a lot of facilities are starting to offer game recordings as an upsell to their attendees. So, assuming you play at an indoor court that has cameras set up, a lot of times you can get that footage provided to you.
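The compress-before-upload idea can be sketched with ffmpeg, the usual tool for this kind of downscaling and bitrate capping. The specific resolution and bitrate values are illustrative guesses, not PB Vision’s actual settings; this function only builds the command, so you would hand it to a process runner to execute:

```python
def compress_cmd(src, dst, height=720, video_kbps=1200):
    """Build an ffmpeg command that downscales a game video and caps its
    bitrate so it can upload over a mobile connection. Values illustrative."""
    return [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",   # downscale, keep aspect ratio
        "-c:v", "libx264", "-preset", "fast",
        "-b:v", f"{video_kbps}k",      # target bitrate
        "-maxrate", f"{video_kbps}k",  # cap peaks for a mobile upload
        "-bufsize", f"{2 * video_kbps}k",
        "-an",                         # drop audio; the CV pipeline doesn't need it
        dst,
    ]

cmd = compress_cmd("game.mp4", "game_small.mp4")
```

Dropping the audio track and capping `maxrate` are the two biggest wins for upload size when the only consumer of the file is a vision model.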
But obviously if you’re in the city park, there’s no hardware and no cameras recording for you. So those are the big challenges. And then if you want, I can talk about this kind of future sci-fi vision that is actually somewhat practical.
Evan Wimpey: Yeah, please.
Mike Arney: So, future thinking, you know, there are lots of ideas.
People have all sorts of ideas around what this could be like, you know, five years from now, but one that’s somewhat practical and really cool is integration with ball machines. We’ve got a partnership that’s in the works around a ball machine. In case people are not familiar with ball machines:
It’s just, you know, a big hopper of pickleballs that sits on the other side of the court and feeds you balls to hit back, right? And it’s a static, pretty dumb machine right now. You set the spin rate, the trajectory, the speed, you hit go, and it starts spitting out balls. But there’s a lot of innovation happening here too, where, okay, what if this machine is smarter? You know, if you’ve got your phone, you’ve got some pre-programmed drills, I can start it from my phone.
But what if, you know, it’s like—what if you put some wheels on the bottom of it, and all of a sudden it’s a Roomba ball machine, and it can kind of move around the court, and then what if there’s a camera on it, and it can see me on the other side of the court, and it can hit balls based on my position, provide some feedback? And then even, you know, a step further, what if it can see—what if it can assess me, you know, to figure out my skill level with the first 10 shots to say like, okay, he’s kind of like a low to beginner, we’re going to send him some easy balls or he’s a more advanced player?
Like, you know, we can assess the skill level and start shooting balls in that direction, in a way that really challenges this person as a player. It’s a little sci-fi, it’s a little out there, but it’s technically super possible, right? So the partnership that we’re working on is to get our computer vision system on this machine so that it can detect the players and the shots they’re hitting on the other side of the court, and can either change the direction or spin of the shot, or assess the quality of the player. That would just be so cool.
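The assess-the-player-from-their-first-shots idea could start as something as simple as a rule over a couple of aggregate stats, before any machine learning is involved. Everything here, the feature choices and the thresholds, is invented for illustration:

```python
def assess_skill(shot_speeds_mph, returned_flags):
    """Toy heuristic: rate a player from their first handful of shots using
    average shot speed and how often the shots were successful (landed in /
    were playable). Features and thresholds are invented for illustration."""
    if not shot_speeds_mph or not returned_flags:
        return "unknown"
    avg_speed = sum(shot_speeds_mph) / len(shot_speeds_mph)
    success_rate = sum(returned_flags) / len(returned_flags)
    if avg_speed > 35 and success_rate > 0.8:
        return "advanced"
    if avg_speed > 25 and success_rate > 0.5:
        return "intermediate"
    return "beginner"
```

A real machine would presumably learn these boundaries from data, but a rule like this is enough to pick an opening difficulty after ten shots, exactly the warm-up assessment Mike describes.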
Evan Wimpey: Awesome. Wow. Yeah, it is very cool. It’s great to see how quickly pickleball is growing, and initiatives like this coming into place. You would think, oh, well, it’s just sort of a recreational sport that a handful of people play; the tech isn’t going to be there. But with the growth, with the tech capabilities, and with folks like yourself pushing this, it’s super exciting. Yeah, Mike, where can folks follow along, learn more, use your tool, contribute possibly?
Mike Arney: Yeah, totally. So we are live. There’s no cost right now. It’s at app.pb.vision, or the website is just pb.vision. Right now you can record your game using our framing suggestions; the game needs to be recorded in a certain way so we can get good data back.
And you can upload your game and get a report on your performance. You know, I would say it’s still in an adolescent phase at best, but we’re continuing to iterate on it every week. You can see things like your serve depth and return depth, highlights from the game based on, you know, the length of the rally; you can see how fast your best serve was. All those things are available now.
I will say it’s a little rough; it’s a beta product still. But it’s really taken shape, and we’ll just continue to polish it. I think, you know, four or five months from now, it will be a well-oiled machine, I hope.
Evan Wimpey: Awesome. Very exciting. So, for all you tech teams out there, analytics teams that are hitting the pickleball court once a month or so, grab your phone, grab your tripod, check out pb.vision, and combine your analytics with your enjoyment of pickleball. Mike, thanks so much for coming on the show today.
Super interesting to chat with you. I hope folks follow along; excited to see where PB Vision goes.
Mike Arney: Yeah. Appreciate it, Evan. Thanks everybody.