
With ChatGPT now released in a usable format, the world’s been abuzz with the potentials – and pitfalls – of artificial intelligence.

Our guest today says that the term artificial intelligence is misleading. Artificial anything, artificial turf, artificial sweeteners: they all try to get as close to the real thing as possible.

Chris Duffey, strategic development manager for Adobe, says that isn’t the goal. This tech revolution isn’t so much about replacing us, as amplifying the magical powers we already have.

Through art school and the fast world of advertising, Chris has spent his professional life weaving that magic. So what does AI really mean for us?





Since ChatGPT was released in a usable format, the world has been abuzz with the potentials and pitfalls of artificial intelligence.

Give it a name, a couple memories and a couple emotions, and AI will spit out a sonnet for your loved one much, much faster than it takes us to write one. Our guest today says that the term artificial intelligence is misleading. Artificial anything, artificial turf, sweeteners: they all try to get as close to the real thing as possible. Artificial intelligence, by extension, sounds like the Terminator.

Chris Duffey, strategic development manager for Adobe, says that that's not the goal. Human brains are the most powerful computer we know of, built with a code we may never crack. So this tech revolution isn't so much about replacing us as amplifying the magical powers we already have. On his path through art school and the fast world of advertising to selling the message of Adobe's Creative Cloud, Chris has spent his professional life weaving that magic. So what does AI really mean for us? I hope you enjoy.

Whitney Johnson: So, Chris, your parents both worked in educational spaces. One was a teacher, the other a reading specialist. What was your childhood like in a home with two educators?

Chris Duffey: Yeah, a fascinating question. Brings back a lot of great, great memories. In many ways, I have to give my parents a lot of credit, because I was somewhat of an anomaly from that perspective. From a very early age, I tended to gravitate towards the more creative route in life. In some ways, education and learning is a linear thought process, and I was somewhat fascinated by the non-linear path or approach to solving things. So I give my father specifically a lot of credit for recognizing that and not trying to force me into a natural path for my learning experience. I think early on we came to kind of an informal agreement: you do have to have a formal education, but you can supplement that with some nontraditional approaches. So in college specifically, I went to a liberal arts school during the day, and my joke is I would wear a polo shirt during the day, and then at night I would put on my black T-shirt and go to art school. And so that was kind of an interesting dichotomy throughout my life, where I've always tried to balance the right-brain and the left-brain approach to business, my life and so on. So yeah.

Whitney Johnson: You're reminding me of a conversation that I had with my son when he was in high school. You know, why do I have to do math? Why do I need to do history? I'm not going to use this anyway. And it sounds like the conversation you had with your dad is he said, you're right, you're not going to use math, but you need to graduate from high school so that you can go to college or go to art school. So what did you actually study after high school?

Chris Duffey: Yeah. So I had a neighbor, and I grew up in Milwaukee, Wisconsin, and he was always quite a cool-looking character, and he had an agency in Milwaukee. I believe it was my senior year when I had an internship at the advertising agency, and that just really sparked my interest in terms of the professional application of art. You know, I have a fine arts degree, an art history degree, and as much as that craft appealed to me, I always had a greater interest in applying creativity, which I think is more encompassing, to business cases. When I saw how that was being used in the agency, that just sent me off. It was pretty inspiring. At that time, a lot of agencies were divided by the creative floor and the account business floor, and I think over the years, 25 years later, it's more integrated, rightfully so. But back in the day, to see kind of the fun they were having at work was quite interesting for me. So I first got into marketing, then really honed in on advertising, and then supplemented that with the craft of creating things in art school. I was jumping around in art school, where I started out in painting, then sculpture, then photography. About the third year in, one of the teachers introduced me to computer graphics, and that was probably my second aha moment. Throughout the years in fine arts, I would always get a similar pattern of comments: that's nice, Chris, but your art has a sense of humor to it. And it wasn't intentional. I just didn't have the detail of the craft, so it came off a little humorous. But when I got introduced to computer graphics, that was a breakthrough for me. There was undo. I could almost create at the speed of thought in early Photoshop and so on. And so it was quite profound, and that stayed with me throughout the years as well.

Whitney Johnson: So when you discovered computer graphics, you said, “Oh, this is it.” That’s it.

Chris Duffey: Yeah, that's it. So now I could create cross-medium outputs: photography I could combine and scan, you know, with human creation of drawings and painting. And that was quite a breakthrough. It took away a lot of the constraints of traditional mediums for me. And so then I went to a postgrad school in Miami called the Miami Ad School. It was a very select, small group, maybe four cohorts at a time, each comprised of 10 to 15 people. Many call it the Harvard of advertising. And so that was the true boot camp for me that really helped propel my career forward pretty quickly.

Whitney Johnson: All right. So speaking of the business world, many of our listeners have probably used an Adobe product like Photoshop, and all these separate softwares have now been bundled up into the Adobe Creative Cloud. What's your role been in that product?

Chris Duffey: Yeah, I share an equal passion for the product, one that hopefully your listeners have as well. It really is a privilege to be here at Adobe. Quite often when I'm asked what I do, I mention Adobe and literally a smile pops up on people's faces, everything from Creative Cloud to the use cases with PDF and, more recently, Experience Cloud. I was brought in to help spearhead our go-to-market strategic development for Creative Cloud in the enterprise space. A bit of a mouthful, but essentially to accelerate the adoption of high-priority products within Creative Cloud. We've launched a number of new ones over the last few years, which is really a reflection of how, again, creativity is starting to become pervasive across the enterprise business world. It's no longer about creative professionals; anyone can and should be able to create and express themselves, whether it be a presentation, a sales pitch and so on. And so my focus is on some of these newer products: to showcase the art of the possible and ultimately get enterprises to unlock the full potential and value of the products.

Whitney Johnson: Yeah, interesting. Because mostly my interaction with Adobe is for PDFs, and what you're saying is, oh, there's so much more.

Chris Duffey: Yeah. And PDF, you know, it holds its own. It's the world's most-used software of all time, used trillions of times. So it's quite profound when you think about it.

Whitney Johnson: So what you’re saying is don’t disdain or dismiss the PDF.

Chris Duffey: It is.

Whitney Johnson: Interesting.

Chris Duffey: And there have been some great innovations on PDF over the last number of years. Liquid Mode is one example, where artificial intelligence has been infused into PDF and it just feels natural now. When you open up a PDF on a mobile phone, it can reflow to fit the form factor. Technically that's quite complex, but from a user standpoint it just seems second nature, quotidian even.

Whitney Johnson: So here today, you’re very much a thought leader in the tech field and especially at Adobe. You’ve got a lot of data reports, articles all flying at you. What do you read on an average workday and where do you focus your eyes? Where do you focus your attention?

Chris Duffey: I am a huge fan, and I'm not just saying this, of podcasts. I think in-depth conversations have been instrumental for me, specifically over the last seven years or so. I try to listen to one a day, easily. I've got multiple feeds over the years, maybe too many feeds in terms of email newsletters, so, don't want to sound negative, but I'm bombarded with newsletters and traditionally just read the headline and then, if it's of interest, click through. I'm still a die-hard New York Times reader; I still read that every day, for 20, 30 years now. I've been reading the marketing publications, Adweek, AdAge and so on. And then I'm a big fan of industry analyst reports; when those come out, I really dive deep into, like, Gartner and so on.

Whitney Johnson: Oh yeah. Okay. All right. So you opened it up. So podcasts, what are 1 or 2? It sounds like you’re who are some people that you just consistently listen to?

Chris Duffey: Lex Fridman. I'm a big fan of Lex Fridman, a former MIT professor, which has opened up a number of other ones. If anyone really wants to dive deep into the technical implications of technology, that's a great one. And then I bop around, you know, maybe two, three weeks on one and then discover another one. So Spotify has been kind of my access to deep, deep information for the last number of years.

Whitney Johnson: So, have you ever mistaken a disruptive technology for one that was just distracting?

Chris Duffey: Myself, and the industry included, I think. I was reflecting the other day on when we were using Alexa skills, when they first came out, and we did a project for, I won't say the brand, but a brand that is synonymous with the dental office. We created an Alexa skill to enable the dentist to ask questions about the product. Looking back, the questions were pretty straightforward, so that was probably an overexcitement about the technology, but it was an indication of these newer ones that are out currently, like the large language models and ChatGPT and so on. Sometimes you have to experiment, and those experimentations may fall flat at the moment, but they help move things forward from a business, product development and just a knowledge-base standpoint. So I'm a big fan and advocate of experimentation, and it's somewhat dependent on framing the value of experimentation. It's not really only about the KPIs of the outcome; it's really about the learning while trying to achieve those outcomes. So I think that's quite important.

Whitney Johnson: Yeah, that's interesting. So if you're learning something, it's not necessarily a distraction, unless you've learned what you needed to learn and you stay with it too long. It's just the practice of learning and experimentation, even if those Alexa-type skills for a dental practice didn't get used. Maybe you could have gone down that rabbit hole too far, but by just saying, okay, we learned something, we experimented, that prepared you to probably be more open to something like ChatGPT.

Chris Duffey: Exactly.

Whitney Johnson: Yeah. Mhm. You have spoken and written extensively about how AI can supplement human creativity. Say more.

Chris Duffey: In many ways, humanity is given, or encounters, a profound technology disruption maybe every 10 or 15 years or so. If you go back, we had the compute revolution, which led to the cloud revolution, which led to the mobile revolution, then the subsequent social revolution. And we're now at this next revolution: artificial intelligence. With that said, we are at this moment in time at somewhat of an inflection point, a convergence of multiple technologies happening in parallel. We've got artificial intelligence, which is a bit more mature. We've got, more recently, the metaverse, AKA immersive, 3D-like experiences. And then we also have a third happening, which is Web3, which is all about co-ownership of data. In terms of artificial intelligence, it's still very much TBD: will it encapsulate all of these technologies? Will it propel these technologies? It's really somewhat of a semantic, academic conversation at this point, but artificial intelligence is quite profound. It's been around since the 50s, founded somewhat off of a false premise: how can we recreate the human mind? And I think what we've seen over the decades is that we've got a pretty good human mind. It's how can we augment, how can we amplify human intelligence? Somewhat of a branding and marketing tangent: I think we misbranded the technology with the term artificial. No one likes anything artificial, but it's become synonymous with the technology. Some call it machine learning, some call it deep learning; those are subsets of AI. But for all intents and purposes, it's artificial intelligence, or AI. So there are two aspects, not to get technical, but I think it's important for your listeners, because I'm sure they see the headlines.

Up until about five years ago, you could segment AI into maybe three layers. Think of superintelligence, which is this notion of something so profoundly intelligent that human capacity can't even comprehend it. To my knowledge, from everyone I've talked to, there's no system out there; there aren't even companies that are trying to accomplish that. The next layer is called general intelligence, and that's trying, again, to recreate the human mind. I think we've seen time and time again, through the boom and bust cycles, that that's not a great probability. Where there have been huge breakthroughs is what's called narrow intelligence, where it's focused on a specific task. Probably the most value in that area has been predictive analysis. AI is wonderful at ingesting large amounts of data, finding patterns through a number of different techniques, and making predictions. So that is quite valuable for companies large, medium or small, even individuals: how can you use data and then make predictive assumptions off of that? More recently, over the last 3 or 4 years, has been this new invention, I would say, of generative AI. It came out through some breakthrough techniques called GANs, generative adversarial networks. But essentially it is evolving, advancing predictive understanding that has now given AI the capability to create. And one caveat with some of these terms I use: we as humans quite often project human characteristics onto non-human things, because that is our lens on the world. So when I say create, I'm applying a human attribute and capability to a technology, for lack of a better term.

So now, with these generative adversarial networks and generative AI, we can use large data sets to create things. And so we've seen over the last year or so just the explosion of things like ChatGPT, which is based off of what's called a large language model, which is based off of a breakthrough called language transformers. Essentially it started out, and many people have probably seen this, where in an email you saw a prediction of what the next word or two could be. That's now evolved: it's not only a prediction of the next word, or even the next sentence, or even the next paragraph. It can now ingest and comprehend, again a projection of a human word, understand the data and make summaries of the data. So no longer, when we go to a search browser and type in a question, will it just give us multiple links. It now ingests and understands all of those different links and sources and summarizes them based off of the prompt. There are profound opportunities around using generative AI. At Adobe, not to go too deep, we're focused on maybe four or five horizons in terms of generative AI. The first one is generative text, which we just talked about. The second one is generative images, where you can type in, through prompts, create a picture of a moderator at her desk with, how many pictures do you have in the background? Eight pictures, yeah. And it can generate a pretty impressive picture. The next one is generative 3D, and then the following is generative video. And so those are all happening in parallel.

Whitney Johnson: All right. So a number of artists have signed online petitions demanding some kind of credit for art that AI makes, if that AI has been trained using those artists' work. Do you have any thoughts on that?

Chris Duffey: In the industry, it's called provenance, right? Giving the rights to the training data and where it originated. I'll use this as a metaphor, and then I'm happy to tease out different aspects of it. A few months ago, at Adobe Summit in March, Adobe launched what we call our generative solution, Firefly. We spent a lot of time on how we can celebrate and acknowledge where those data sets came from, and ultimately that gives companies and individuals peace of mind that it is ethically sourced and ready to be used commercially. Part of that has to be this acknowledgment of where and when the system was trained, and what data it was trained on. We had about 300 million assets in Adobe Stock, and so we trained it on Adobe Stock. And so now we've got a system where we can compensate the contributors but also have traceability of what it was trained on. I think that's a metaphor for what we think is a right approach to creating generative systems that produce really great outputs but also reward and acknowledge the contributors. Not caveats, but some nuances to that: I think we've approached this multiple times in slightly different ways, but there are learnings in history in terms of citations and rights management. We've got rules and regulations in terms of authorship and citation best practices. In music, we encountered that in the early-to-mid 90s, when sampling became a big thing in deejaying; there are rules and regulations and standards and agreements. And now, rightfully so, we're having conversations on what is the right approach to identifying where it came from, monetizing where it came from and so on. It's again a complex topic, but I think we've made a lot of great headway over the last months.

Whitney Johnson: Yeah. Something that I appreciate is that you're very well-spoken, and you're able to talk about these ideas in a very clear, not-for-the-expert way. I was going to say layman, but I want to say layperson, I guess, a layperson way. How do you nurture that in yourself, that ability to speak in a way that anyone can understand what you're saying?

Chris Duffey: Yeah, thank you. One aspect, and I'll maybe overshare, but I'll share: I've used AI systems to do analysis on how I speak and write, in terms of sentiment analysis. I've been flagged where I say um a lot. I feed in multiple interviews, and so that's something I'm very, very cognizant of. I've caught myself a few times throughout this conversation where I'm saying um, or, and I should just breathe and pause. So that's something I try not only to read about, but actually use in my day-to-day life, both personally and professionally.

Whitney Johnson: Wait so how do you use the AI to flag your ums and ahs? What are you doing with that?

Chris Duffey: There are some APIs that I've customized where I can feed in an audio or a video file, and it can do sentiment analysis of my voice. Sometimes I speak too high, I speak too monotone, so I'm always trying to...

Whitney Johnson: Do you sell that?

Chris Duffey: No, it’s, it’s just kind of a little side hobby.

Whitney Johnson: I want that. I think you should sell that. I think that's really interesting. Fascinating. And I'm serious about that. That leads to my next question: you just talked about all these practical ways to use it, but as you think about your own personal growth and development, what gets you most excited about AI, besides the fact that it monitors your voice so you can delete your ums and ahs?

Chris Duffey: I think what's quite exciting is there are two things happening that are coming together right now: the ability of predictive analysis and suggestions, and then the ability to generate off of those predictions. When those two things meet, from a business standpoint, and my background and expertise focus is on marketing, that is quite exciting: the ability to ingest near-real-time data on your content or your promotion, ingest what's working and not working, get predictive analysis on how to optimize it, and then generate the output very quickly. That circular round-trip loop has been getting condensed ever faster over the last number of years. So from a professional standpoint, that gets me quite excited to stay up late and pop up early in the morning, because that, I think, will really help businesses large and small.

Whitney Johnson: So can you give me a specific example of what that looks like?

Chris Duffey: Sure. Let's say someone has a podcast, and they've got a number of different social media channels, newsletters, banner ads, you know, a cross-channel, omnichannel marketing approach. With the ingestion of data on what's working and not working, the system can make some predictions on not only which channels have a higher probability of success, but also some suggestions on what messaging is working and which cohorts, which personas, are more interested. And that's where the human comes into the loop: makes additional insights, ingests those suggestions, and then can help trigger the system to deploy very quickly.

Whitney Johnson: So exciting. It’s so exciting. Yeah. All right. We’re starting to get toward the end. Um, let’s talk about the metaverse. You have a new book out called Decoding the Metaverse. What is your working definition of the metaverse?

Chris Duffey: I'll keep it very simple: immersive experiences. And in many ways, this is my second book, and it might be my last. These books are a lot of work. Yeah.

As you so well know, they take a lot of work, and it takes a number of months to recover, I find. I don't know about you. So this is my second book. The first book was called Superhuman Innovation, which was about generative AI, artificial intelligence, where I built a system to talk about AI, so we had a back and forth. When that book first came out, I still remember quite clearly, and I have the mental scar wounds: I would present, and it was pretty unanimous, a blank face that turned into somewhat of anger very quickly at suggesting AI has the ability to make suggestions and seemingly create. And I feel quite validated that, five, six years later, we're now so excited, but cautious, about the use cases of generative AI. I'm seeing a similar pattern with the metaverse, in terms of helping paint the picture, this vision of a more immersive experience. The reaction is in many ways somewhat similar, very skeptical, and quite often I receive the question: but do we need more technology to have us face-down into our screens? What are the implications of that? Those are, rightfully so, cautious questions and thoughts. But the bigger picture is that it's the natural evolution of the internet as we know it. Every number of years it gets more immersive in terms of the content, in terms of the immersiveness of the experience. And this is really what the metaverse is. It's, again, maybe a term that might not do full justice to the technology, because it wasn't really in the cultural vernacular too much until recently.

Whitney Johnson: In your book, you say, and I'll read from page 39: the metaverse pulls together all the technologies, from the hardware to the software to artificial intelligence to virtual reality to the Internet of Things, into a coherent virtual representation. And so, question for you: you and I are having this conversation over Zoom. There will be snippets of this video that will go onto YouTube. Most of this will be audio that people can listen to. What would this conversation look like if you and I were in the metaverse?

Chris Duffey: Sure. Yeah. In many ways it's going to be dependent on the hardware, or the form factor. We've obviously seen the big announcement with Apple getting into the mixed reality, spatial computing realm, and that allows for this ability. What I like about that approach is it's truly fusing the digital world with the physical world, so it's almost an overlay of the physical world with digital. That's one approach. Quite often we've seen these very 3D, almost game-like environments; that's another approach. And so I would just offer to your listeners that it's not going to be one approach. It's really, I think, a mindset of how experiences can be more immersive, whether it be through AR, VR, mixed reality, 3D web and so on. And so it's all-encompassing, as you kind of pulled out.

Whitney Johnson: So as I’m listening to you, it might be a situation where right now we’re on Zoom and yes, we’re having a conversation, but it could feel more like we’re actually in the same room together.

Chris Duffey: Yeah, exactly. And so I would equate it somewhat to digital. It's such a broad term: what does a digital experience mean? Obviously that's quite encompassing; digital can be many different approaches. And similarly, the metaverse will be immersive and will have many different onramps into having a metaverse experience.

Whitney Johnson: How do you want to leave your personal print on the construction of the metaverse? What would you like your role to be?

Chris Duffey: No, thank you. That's a great question. In many ways, I always felt a little dismayed that I missed kind of the advent, or the invention, of the internet by about five years or so, and so I became almost a user or practitioner of the internet rather than helping to build or contribute to it. So I very much see this opportunity, and I encourage, quite honestly, everyone to be a contributor and builder to the metaverse at this point. We truly have the technology at our hands, the knowledge, the accessibility, to help build a future that we truly all want and desire. And so that's my hope: to help guide the metaverse to be more inclusive, to help it become very creative and creator-focused. And then also, you know, I'm a big advocate of what some call red teaming; others call it just focusing on unintended consequences. In the book, I have a chapter or two on a number of different unintended consequences, and how we can learn from Web 2.0 and apply some of those learnings and principles to the metaverse and Web3.

Whitney Johnson: I love the theme of you taking this impulse from a very young age to be a creator, what you said earlier about creativity and how it comes in many different forms, and your desire to be a co-creator of what the metaverse looks like. I think that's lovely.

So what’s an S-curve that you have jumped to recently? Something that you’ve done new recently?

Chris Duffey: I just went on sabbatical, a five-week sabbatical, and this is my first day back. So, a great way to kick things off. My family and I had a great time, five weeks together, and I think the big realization was that we're all trying to find this balance of respecting history. We were in Europe, so obviously a great reminder of the importance of history, but balancing that with the future. And I think balance is probably my big focus in everything I do: in terms of work-life balance, in terms of growth opportunities for business, in terms of, you know, a new feature: is it ready, should we launch and iterate? And so balance is my big focus coming out of sabbatical.

Whitney Johnson: What a phenomenal counterweight to the work that you're doing: being in Europe and exploring history, and then going into the future, the metaverse. All right. So, Chris, what was useful to you in this conversation? Maybe something that you said, something that you observed or thought?

Chris Duffey: You know, I've been taking mental notes. I think you do a wonderful job of prompting. As we enter this new era of prompting into generative AI, I think there's going to be an art to crafting the prompt. Some early responses and reactions to generative AI are that it just falls flat; it's not as rich or textured as I had hoped. And I think the realization is it might not be solely the system. It might be that the prompts you're entering into the system aren't equating to something the user wants. So prompting is something I've noticed: you're a great prompter, but you're also a wonderful listener, with the ability to summarize. And I think that, again, in this culture of slam-dunking, an aggressive, maybe overly aggressive environment, it's a great reminder of the importance of listening.

Whitney Johnson: Thank you. I see an article coming out of your brain called The Art of the Prompt. I hope you will write it. So, Chris, any final thoughts?

Chris Duffey: No, just that I encourage everyone to, yes, question technology. In many ways, that is part of creating technologies: to, you know, apply the scientific approach of constantly questioning things. But I would also encourage people to use it as well, both professionally and personally, because I think the more people use it, the more they will feel comfortable with it, and it will also, you know, help them question things about it as well.

Whitney Johnson: Mm hmm. It's so exciting. Chris Duffey, thank you very much for joining us and helping us decode the metaverse. And I would recommend that people pick up your book. Again, thank you.

Chris Duffey: Thank you.

Right at the top of our conversation, Chris called himself an anomaly. White polo shirts and lectures during the day, black T-shirt and sculpting by night. Shouldn't we all hope to be such anomalies? To have that little mystical computer in your skull firing on all cylinders? Chris told us the ultimate rabbit hole is trying to replicate that, quote unquote, anomaly using ones and zeroes. We can chase our tail as a species forever doing that. But if we use our unique computer to build other unique computers, now we have the tools in our Batman utility belt. It's putting truth into that old idiom: if you can think it, you can do it. Really, I want to leave you with this. There was a show on AMC in the late 2010s called Halt and Catch Fire. It was all about the personal computing revolution in the 80s. But even in the pilot episode, the message was clear. Here's Joe MacMillan, the Jobs-esque visionary played by Lee Pace:

"Computers aren't the thing. They're the thing that gets us to the thing." I guess it's up to us to figure out what that thing is. For more from a fellow creative visionary, I'd love for you to listen to episode 311, my talk with the first Chief Design Officer at both 3M and PepsiCo, Mauro Porcini. From sticky notes to Mountain Dew, it doesn't get much more split-brain than that. Both Mauro and Chris take that humanist view of the future, centering not on what we build, but on us. Thank you again to Chris Duffey for joining us, and thank you for listening. If you enjoyed today's show, hit subscribe so you don't miss a single episode. Thank you to our producer, Alexander Turk, production assistant Ange Harris, and production coordinator Nicole Pellegrino.

I’m Whitney Johnson

and this has been Disrupt Yourself.
