OrthoEvidence, Paisley Park, and Creativity

OrthoJOE

Episode | Feb 25 2026 | 00:22:30

Hosted By

Mohit Bhandari, MD
Marc Swiontkowski, MD

Show Notes

In this episode, Mo and Marc discuss the development of OrthoEvidence, an online platform that delivers evidence-based summaries of RCTs to help busy orthopaedists stay current with and confident in the medical literature. Mo also reflects on a recent visit to Paisley Park, using Prince’s creative workspace as a springboard for a broader conversation about creativity, discipline, and work ethic. 

 

Subspecialties: 

  • Orthopaedic Essentials 

 


Chapters

  • (00:00:03) - Ortho Joe Podcast
  • (00:01:49) - In the Elevator: Ortho Evidence
  • (00:04:22) - OrthoEvidence: The importance of randomized trials
  • (00:08:40) - The Innovative Ortho Evidence Project
  • (00:12:14) - In the Elevator With Orthopedic Surgeons
  • (00:12:53) - Machine Learning in Orthopedic Surgery
  • (00:17:36) - Prince's Love Letter to Paisley Park
  • (00:21:52) - Egg Hunt with Ortho Joe

Episode Transcript

[00:00:03] Speaker A: Welcome to the OrthoJOE Podcast, a joint production of the Journal of Bone and Joint Surgery and OrthoEvidence. Join hosts Mohit Bhandari and Marc Swiontkowski as they discuss current topics and publications in the world of orthopedics and beyond. All right. [00:00:19] Speaker B: Well, hello. And if you're watching on video, you're noticing something. We're actually in the same room. I think this is the first time. [00:00:27] Speaker A: Totally the first time in OrthoJOE. [00:00:29] Speaker B: History that we have been in the same room at the same time. [00:00:32] Speaker A: First time in five years, and it's a glorious occasion. We had the eminent Professor Dr. Mohit Bhandari as the grand rounds presenter at the University of Minnesota Department of Orthopedic Surgery. Last night he gave a scholarly address on the whole issue surrounding creativity, which is a big interest of Mo's, as many of you know. And we'll get into what his afternoon activity was a little bit later. But to the best of my knowledge, we've never really discussed OrthoEvidence very much on this podcast. And for the audience, some of you haven't listened from the beginning. So we decided, because Mo was presenting to our board of trustees on a fairly regular basis on his progress with OrthoEvidence, that five years ago we would do this podcast. And so it was the Journal, so that's the J, and OE is OrthoEvidence. So it was JOE. And we thought Morning Joe, so OrthoJOE. So we got coffee. We don't have Tim Hortons. [00:01:38] Speaker B: No. [00:01:39] Speaker A: And we don't have the famous Boston brew, Dunkin'. But we got Starbucks. We're a long way from Seattle, but it's all over the world. So I thought it would be a good idea to just catch the audience up about: why did you start OrthoEvidence? What was the motivation? Who were the people that encouraged you to do it? [00:02:00] Speaker B: Sure. [00:02:01] Speaker A: And let me just. I'll let you talk on that. But. 
So the Journal has always been interested in the evidence pyramid, which, again, was brought to us by our McMaster colleagues, Gordon Guyatt and others. And in 2002, we introduced the orthopedic world to the levels of evidence concept. And Jim Heckman was the editor at the time. And we began to work with Mo's colleagues at McMaster. Brian Haynes. Right. Who was filtering the entire medical literature for the highest levels of evidence. So we've been getting downloads from Brian's group at McMaster for a decade, maybe 12, 13 years. [00:02:43] Speaker B: Right. [00:02:44] Speaker A: And using that as the fodder for the section that we called OrthoEvidence. [00:02:49] Speaker B: Right. Or evidence-based evidence. Right. [00:02:52] Speaker A: So we've had this long, long relationship with McMaster at the Journal. And so, Mo, why did you start it? [00:03:00] Speaker B: Yeah, so this would have been well over a decade ago, easily over a decade ago. And you know, what was happening was, we were doing research, we were running trials, as you know, and to promote the trials we were doing, you'd travel, you'd go to lots of places, and you'd be promoting evidence, you'd be going and talking about and teaching research courses. And the question I would get asked the most was, how do you keep up with the evidence? That became a very fundamentally critical question for most people: I just can't keep up with the information. And you know, I would say things at that time like, well, there's the Cochrane Database and then there's PubMed. Most of them would look at me like, okay, I'm not doing PubMed searches just to keep up with the evidence. [00:03:41] Speaker A: Too hard to find it. [00:03:42] Speaker B: And I'm certainly not looking at journal articles. 
You know, it used to be that you could look at a particular journal. Like we talked very fondly about, you know, the Journal of Bone and Joint Surgery historically; the table of contents, in many ways, when there weren't like 400 other journals, was the journal that you would be looking at. But now with the subspecialties it's just impossible to keep in touch with everything. Right. And I think that was one issue. One is, how do you keep up? And the second one was, how do we know what evidence to trust? So those were the two big fundamental questions, and I got somewhat reflective on, you know, really, I don't have a good plan for how to keep up with the evidence. So on a personal note, I just started asking people around me, some of the team members, saying, can we just start pulling randomized trials? Let's just pull randomized trials. And you could say, well, why randomized trials? Well, I guess my thought would have been, well, why not randomized trials? Let's start with something where, if you're going to read something, why not read something on a therapy in which, you know, there's a reasonable chance that if it's well done you could be able to apply it to practice. Right. So that was part of the thinking there. Initially people said, well, you won't have much data, quite frankly. Well, I will tell you, fast forward to 2026: we are putting online on OrthoEvidence about 100 randomized trials relevant to the practice of orthopedics every month. 100 a month. We have about 14,000 randomized trials that are in the database. But I'll take a step back to that point: it takes a long time to read a paper. And so how do you help people stay up with the evidence? Well, one is you summarize it for them in a way that makes it, you know, usable. Now you could say, well, I can do that now with AI. Yeah, you can. 
But a summary that just takes what they've done and throws it into a shorter paragraph that reads nicely is not what we're talking about. We're talking about a critical appraisal. So critical appraisal is different in that, okay, let's go through the checklist. Let's make sure it's quality. What does this add? What are the relevant data points? That kind of synopsis is what we've been trying to do. We started off with like 800 or 900 words and have gotten it down to 600 or 700 words. And at 600 or 700 words, you think, oh, that's pretty quick. But it still takes time to read. So we've even created a 100-word summary. But the whole goal of it was that if you can set it up as a brand, and for myself, like, if I can do something for myself that I believe is valuable and I'm finding it useful, then I'm hopeful other people will find it useful. And then we kind of said, okay, well, why don't we formalize it as OrthoEvidence and put it on a site so people can get access to it. And that's kind of led to where we are today. And then we entered into a really great, great, I think, partnership with the Journal of Bone and Joint Surgery under your leadership as the editor. And I think in many ways we'll continue to try to now do more with that, you know, and continue to grow it and do more to try to get that information out. It's probably harder now than ever, I think, to stay abreast of information, even though there's the perception that, oh, I have information at my fingertips, right? It's a bit tricky, right? And, you know, we're struggling like everybody else to try to figure out, okay, how best do we serve our membership? You know, those out there in the world who are interested in musculoskeletal evidence and want to ensure that what they're reading is something they can believe. Right. 
[00:07:22] Speaker A: I will remind our audience that the pyramid is inverted, because as you go up the pyramid, the risk of bias gets better, or narrower. [00:07:34] Speaker B: Yes. [00:07:35] Speaker A: And one of the things that you do at OE is, for every article that you analyze, you still have a metric, a quantitative risk of bias score. [00:07:45] Speaker B: That's correct. Yeah. Yes. Yes, right. That's right. So I mean, it's really important. [00:07:49] Speaker A: Right. [00:07:50] Speaker B: Like, even randomized trials are going to have limitations. And, you know, for those who are listening in or watching, I'm sure you've had lots of discussions at your own journal clubs, or when you've read a paper, particularly a randomized trial, where you say, I don't believe this. Right. And you may actually be right. So what we're trying to do is go back and say, well, what are the features of a trial that would make it more trustworthy than not? But we still believe that a randomized trial should be read. And so the argument has been, I used to always say, and I still say: if it's on OrthoEvidence, you should read it. Because we've already taken out, you know, as much noise as we can. We've done the heavy lifting, in a way, to get at least to some sort of signal. And then we've told you, by the way, here's everything we know about it. Now you can make a decision on your own whether you want to use it or not. [00:08:39] Speaker A: Right, right. So you got this idea, and obviously to do that level of work, you have to have people. [00:08:45] Speaker B: Yeah. [00:08:46] Speaker A: So how did you go about getting the funding to really start this enterprise? [00:08:50] Speaker B: Yeah. So there are two ways to do it. Right. 
So trying to get grants and things, it's doable, but, you know, that's three or four years of trying to get grants, and then you're always in that cycle. So it wasn't really something that I was doing because of grants. A lot of it was, I'm just going to invest personally in doing it. I'm going to invest my personal time, I'm going to take on personal cost, and so I'm going to expect some of those costs. And then I'm going to go to other groups and say, you know, functioning with a startup mindset, right: are there individuals who also believe strongly in this? And I was really, really happy to see that there are lots of people who are interested in this type of vision, which is, how do you get knowledge out to people, and how do you get the best knowledge out to the people who need it? So it went through a series of, I would say, friendlies who were interested in investing. Right. [00:09:41] Speaker A: Including angel investors. [00:09:42] Speaker B: Angel investors, yeah. And then, you know, the goal here, the mission for me personally, has been you want to make sure lots of people get it. So how do you expand a network? How do you get people to use it? Well, first of all, it has to be valuable, it has to have some quality. But then the targeting approach we have taken is we said, let's partner with other reputable groups, like yours, and hope that we can then build networks based on that. And that's how it's kind of gone. [00:10:11] Speaker A: Right. And you've also, in your dissemination plan, partnered with a lot of orthopedic national and international associations to get this as a benefit for the members. [00:10:22] Speaker B: That's exactly right. I mean, you have been president of numerous organizations, but also, you know, you understand how organizations work locally and globally. 
And the challenge that we faced, like the challenge that I always faced being a member of an organization, was always asking, okay, what is the member benefit for me to spend my time, to pay my dues? And, you know, most organizations are there for a host of reasons, right, and they do very good things for their membership. But it was very clear to us that OrthoEvidence, or just basically finding a member benefit, like an evidence-based tool that gives your members a chance to get access to information rapidly, was very valuable. The one thing that we did do that I thought was very helpful for us is that we would say, let's look at, I'll take our national organization, the Canadian Orthopedic Association. Great partnership. We've had many, many years of partnership with that group, and they've been amazing with OrthoEvidence. And the surgeons have responded and been really supportive of OrthoEvidence around Canada. What we do is, you know, we feed back to our Canadian Orthopedic Association: by the way, here's what your peers in the organization are reading. So you'll get an update on your organization's readership, not just the overall readership, but your organization. Now you can always go back and ask what anyone on OrthoEvidence is reading. But it's kind of nice, you know, when you're in an organization, to say, okay, what are my peers, the peers that I work with in my community, what are they reading? And then, you know, why is the next question. But you need to know what they're reading so you can kind of get a sense of where the why is. So that's been really helpful for us to help people feel a bit more engaged and a bit more connected. Although we haven't set up, you know, formal chats and things, it's been much more a "here's what people are doing and you might be interested." 
[00:12:14] Speaker A: And approximately how many organizations have you partnered with at OE? [00:12:18] Speaker B: Oh, it would be double digits. Off the top of my head, I don't know the exact number. And also a lot of hospital systems too, Marc. So, you know, hospital systems, they're not associated with a particular organization, but they are running hospitals. And the hospital leadership wants to say, we want orthopedic surgeons to have access to evidence. [00:12:41] Speaker A: In the US and Canada. [00:12:42] Speaker B: In the US and globally. And Canada, yes, there are some hospital systems, some hospitals, so to say. [00:12:51] Speaker A: Yeah, yeah. So I think recently you had a big announcement at OE regarding a search tool. Yeah. With I don't know how many million data points you have. I know you have that number on you. [00:13:04] Speaker B: Yeah, yeah. [00:13:05] Speaker A: Tip of your tongue. [00:13:05] Speaker B: Yeah, yeah. We're just shy of 200 million data points. [00:13:09] Speaker A: 200 million, yeah. [00:13:10] Speaker B: But think about this, right? Like, in the world of AI, when you look at it, I put "chatbot" in quotes because, you know, it's moving so quickly. You know, you can go ahead and search the World Wide Web, you can search Google Scholar, you can search PubMed very quickly, pull in information, and you get all this information back very quickly. And it does a pretty good job. But we realized very early on that we were creating a database. Like, you know, if we're going to identify, critically appraise, and review 14,000 orthopedic articles that are randomized trials, we thought, you know, I still believe it's a very valuable data set. So we've kept it kind of firewalled, so it's not accessible to any bots that are, you know, looking for information. 
So we've created a small language model that, you know, uses the big large language models. It says, hey, first, if you put in a question, we're going to search our database, then come up with a bunch of information, and then we're going to send it to a large language model, pick one, and they're going to make sense of it. But the data that they're making sense of is our data set, not data that's coming out of, let's say, what many others might do if they don't have their own proprietary data set. So the good and bad in that is: if we come up with an answer from our site, it's pretty good, because it's based on randomized controlled data. But randomized trials don't answer every possible question, or every possible permutation of questions. So for that we are going to build out a second version that would push it out to the World Wide Web and say, okay, listen, you know, we actually have high-quality data; here's what our answer is. This question you've asked doesn't have particularly high-quality data, but here are some other answers based on, you know, lesser evidence. And that way at least we can be very clear to people about where the information is coming from and where to go from it. But I also still think we're going to see massive improvements, whether it's us doing it or, you know, the bunch of other groups doing the same thing. [00:15:12] Speaker A: And so, but they don't have the data that you have. [00:15:15] Speaker B: Right. So I think that's where the opportunity is: what can we do with that information? And how can we also make things easier? So, you know, one thing we know is you've got to have a really reasonable automated process. You've got to be customized. 
We've just launched a completely new-looking site, just in January. So that's kind of nice. It becomes more of a news forum as well. So we've got our information, we pick stuff from it that we think is important, but we're going to try, very similar to what's happening at the Journal of Bone and Joint Surgery, to make it a place where you can get information beyond just the core information from, you know, a handful of randomized trials, although in our case it'll be 100 randomized trials. So there's a lot there. Right. And also customization. So if I'm a spine surgeon and I click that I only want to get spine evidence, well, spine evidence will be my weekly mailer. It won't be coming in from sports or something else. [00:16:11] Speaker A: Got it. Yeah. Just a clarification question. Is the data pull from an individual RCT done technologically, or is there a human? [00:16:20] Speaker B: Human. Human. And I know for sure that we're going to be entering into a phase, not too long from now, it's probably happening in some areas already, where being able to pull information from just PDF papers is getting better and better and better. And with agentic AI, various agents can do all that, as I'm learning. But we have gone through, for many tens of thousands, in this case 14,000 reports, duplicate extraction. And even with that, with humans doing it and duplicate checking, we're always finding, you know, tagging issues and things. So yes, you can say a machine can possibly do it better. I still think right now having that human touch is really important for us. [00:17:04] Speaker A: Right. [00:17:05] Speaker B: And we'll see how long we can keep it up. But that's the plan. And so we have a team that has been working on that, and we have a process to do it. And that process still allows us to have 100 reports a month. So that's pretty good. 
[00:17:17] Speaker A: That's quite impressive. Yeah. Well, I think our audience will appreciate getting that. Long overdue. [00:17:25] Speaker B: Yeah, I know. [00:17:27] Speaker A: And we never talk about it. [00:17:29] Speaker B: Yeah. So. [00:17:30] Speaker A: But we have lots of interesting folks to talk about. And now a slight digression. So here in Minnesota, my partner, Dr. Cole, kidnapped Dr. Bhandari, who is a real audiophile, very interested in all forms of music, particularly, I would say, rap and other forms of music, and took him to Paisley Park, which is the home and recording studio of Prince. And maybe, Mo, you could just give our audience a brief update on what you learned at Paisley Park. [00:18:05] Speaker B: So clearly, I suspect many listening in felt. I mean, Prince is known by almost everybody. And, you know, you've always heard about him being this creative genius. And, you know, while I maybe didn't follow all his albums like an absolute fanatic. [00:18:24] Speaker A: But you saw Purple Rain multiple times, like most people. [00:18:27] Speaker B: Yeah, yeah, yeah. For sure, for sure. Oh, no. And I certainly knew about it. Like, I mean, he's an incredible artist. But what I didn't really understand, particularly now as I've been trying to think through, you know, where does this creativity come from and how do you foster it? There was this really lovely two hours, you know, going through Paisley Park. But what struck me, and I've mentioned it a few times already to you, is we got a quick look at this little office area of his, and his desk, to me, spoke volumes, right? He had handwritten notes, and there were lots of them. And, you know, it was a darker room with candles. So I started thinking about that. And he had a magn. 
[00:19:12] Speaker A: And just to be clear for the audience, it was left as it was. [00:19:15] Speaker B: Exactly. [00:19:16] Speaker A: After he passed away. [00:19:16] Speaker B: Exactly. [00:19:17] Speaker A: Okay. [00:19:17] Speaker B: They haven't messed with that at all. And so, you know, you're kind of getting a glimpse into this genius and how he worked. And it wasn't so much the content of what was there. It was the fact that he was still writing by hand. It was the fact that he seemed to need, like, there were candles everywhere; that just was the ambiance that he would create. And the fact that, you know, the majority of his recordings were him playing every instrument. Right. And, you know, the rationale behind that, and he was using analog recording tape rather than going to the, you know, fancy digital stuff. There was a lot of that you learn about him in that process. And I remember asking our guide, you know, where does this creativity come from? You know, when we're thinking about these sorts of highly creative individuals, where does it come from? And her statement was, well, you know, he would record everything himself because it was in his head, and he visualized it, and he didn't think he could relay that to a session musician, someone coming in. So he said, if he just put it on tape, then they could understand what he was trying to get at. And I paused her and I said, no, no, I get that part, but where did the idea in his head come from? And then there was a pause. [00:20:29] Speaker A: Yeah, right. [00:20:30] Speaker B: And so I put a call out to anybody who understands Prince's creative process. I can tell you it was very clear to me leaving that, though. I actually left and had about an hour to myself after I got dropped off. And I got a chance to sit, and I really wasn't able to do much except just reflect on just how impressive it was. 
When you see something truly impressive, awe-inspiring creativity, I think I saw that today. I think I saw that today. So I'm going to be, of course, looking at every album again. I'm going to go through every discussion, and I'm probably going to reintroduce myself to this creative genius. Yeah. But. Yeah, great day. [00:21:05] Speaker A: Sounds like the same experience I've had when looking at the David from Michelangelo. [00:21:10] Speaker B: Yes. [00:21:11] Speaker A: The statue. [00:21:11] Speaker B: Yes. Yeah. [00:21:13] Speaker A: I don't understand how that's possible, but. [00:21:16] Speaker B: Right. And then you say to yourself, like, yes, I get it. Intellectually, you know, you can get to the point of, okay, there is creative genius, and there are things that some people have that others don't, and there's talent, but you have to work at it. It just doesn't happen. Right. And so he must have had an incredible work ethic, an incredible work ethic, to be able to do what he did and to accomplish what he did. But anyways, it was a very, very reflective experience. [00:21:45] Speaker A: Yeah. Genius is not just a simple gift. [00:21:48] Speaker B: It's. [00:21:48] Speaker A: It has to be enhanced by hard work. [00:21:50] Speaker B: Absolutely. [00:21:50] Speaker A: That's the message. [00:21:51] Speaker B: Absolutely. [00:21:52] Speaker A: Well, it's been great, Mo. It's so nice to have you in the same town. Unfortunately, our golf courses are not open; otherwise we'd be spraying the ball all over the fairway. [00:22:02] Speaker B: I'd be in the forest looking for those balls, though, and I'd probably come up with four or five more, that's for sure. [00:22:06] Speaker A: Guaranteed. Easter egg hunt. That's exactly what it is. [00:22:10] Speaker B: Yeah. [00:22:11] Speaker A: Well, we'll be apart next time. And until then, cheers. Cheers. 
[00:22:15] Speaker B: All righty. Ortho Joe it is. Thanks again. Yeah. Sam.
