


Illustrations by Rozalina Burkova

Animation by Alex Kuzoian

What do you do if your invention becomes a weapon? This happened to former Wired editor Chris Anderson when he launched DIY Drones, an open-source community that helps people build their own flying machines. Now, homemade drones are used by everyone from conservationists to contractors — but they’re also used by ISIS to drop bombs on civilians. So what is Chris’ responsibility here? Did he foster innovation for a community of like-minded do-gooders — or democratize a weapon for a terrorist group?



Chris Anderson is the CEO and founder of 3DR, America’s biggest drone maker by units sold. Prior to 3DR, Chris was the editor in chief of Wired magazine and a writer and editor for The Economist and Science. He’s the author of Makers: The New Industrial Revolution. He studied computational physics at George Washington University and UC Berkeley and did research at Los Alamos National Laboratory. Follow him at @chr1sa


Baratunde Thurston is an Emmy-nominated futurist comedian, writer, and cultural critic. Thurston helped relaunch The Daily Show with Trevor Noah, co-founded the Cultivated Wit agency, and wrote the bestseller How to Be Black. He is the host of the new podcast How to Citizen. Follow him at @baratunde


Kevin Delaney is the editor in chief and co-CEO of Quartz, which he co-founded in 2012. Previously, Delaney was managing editor of, a reporter for SmartMoney magazine, and a TV producer in Montreal. He is a member of the Council on Foreign Relations and has a degree in history from Yale. Follow him at @delaney

In a science fiction book there would have been that guy who had a choice to say, “Oh, no! They might destroy the world. I should probably stop.” And I’m like, “I’m not gonna stop.”



CHRIS ANDERSON: Ten seconds after I did it, I realized that I’d created something that other people saw as a weapon.


Drones are highly classified, export-controlled, aerospace-grade weapons – and I just started a community that was putting the letters “DIY” in front of a military technology.


Suddenly we had tens of thousands of people basically putting robots in the air.


It’s totally technically possible to have cameras overhead all the time.


You don’t have any legal right to privacy outside. You have actually no right.


As you know, terrorist groups – ISIS and others – are using drones today to drop munitions. Some of them are based on our technology.


It turns out that a few bad people with powerful tools do have a lot more impact than a lot of good people with powerful tools. Many of us were just deeply wrong about that.


In a science fiction book there would have been that guy who had a choice to say, “Oh, no! They might destroy the world. I should probably stop.” And I’m like, “I’m not gonna stop.”


CATERINA FAKE: That was Chris Anderson, who left his post as Editor of Wired magazine to start DIY Drones, an open-source community that helped anyone build their own flying machines. This was back when drones were still a classified weapon, not an off-the-shelf consumer product.


His drones allowed filmmakers to capture sweeping vistas and scientists to count the nests of endangered birds. But Chris believes they also helped ISIS drop bombs on civilians in Syria. Drones already exist – and they’ve created ubiquitous eyes in the sky. But should they persist?


That’s one question we’ll ask on today’s show. But Chris’s story raises a deeper question: Who is responsible when an invention becomes a weapon?




FAKE: I’m Caterina Fake, and I believe that the boldest new technologies can help us flourish as human beings – or destroy the very thing that makes us human. It’s our choice.


A bit about me: I co-founded Flickr – and helped build companies like Etsy, Kickstarter, and Public Goods from their earliest days. I’m now an investor at Yes VC, and your host.


I want to start today’s show with a story. It takes place on a hot afternoon in the summer of 2018.


KEVIN DELANEY: There’s this really loud sort of “bzzzzzzz” noise I could hear from inside my apartment in Brooklyn. And it was just persistent.


FAKE: That’s Kevin Delaney, the Editor-in-Chief of Quartz.


DELANEY: So I stepped out into our backyard, and I looked up. And about maybe 50 feet above me, there was a drone. And it was just parked over our backyard. My teenage daughter came out and we were just standing there, and this drone – who we didn’t know who it belonged to – was actually just sitting there above us, surveying our backyard, a space that we imagined to actually be a private space.


I started looking around for objects that I could throw at it. I was looking around for a rock to hit this drone with and take it out of the sky. I couldn’t actually find anything. After a while, it sort of sped off and disappeared, and it never actually came back. But the sense of powerlessness in the face of a drone hovering over us was actually pretty profound.


FAKE: That moment – when Kevin looks for a rock to throw at a drone – really struck me. I mean, Kevin’s a long-time tech journalist, right? But I can imagine just how he felt. Even the most techie among us have these moments of powerless frustration when technology peers uncomfortably into our private lives.


We already live in a surveillance state. But unlike what George Orwell wrote about in 1984, it’s not a dictator following our every move. It’s actually tech companies, and other citizens, like the random kid who bought a drone at Walmart.


On today’s show, we’re following two parallel stories: The first is about the world of surveillance brought by drones – the kind of drones that anyone can buy. You can attribute the spread of this invention, in part, to Chris Anderson.


Twelve years ago, he launched an open-source community – and then a company – to evolve drones from a military technology to an everyday gadget. By helping to make drones available to all, Chris helped farmers monitor their crops, scientists track endangered species, and doctors deliver life-saving blood to rural communities.


But drones also contributed to the constant buzzing invasion of privacy here in the U.S. More shockingly, Chris believes these same drones – and ones like them – were used by ISIS to drop bombs on civilians in Syria.


Which brings us to the second story we’ll follow on today’s show. In Chris’s decisions, we can map the arc of the inventor’s dilemma: What do you do when you realize you’ve created a weapon? The story of DIY Drones began when Chris was still the Editor-in-Chief of Wired magazine. He was sitting around the dining room table with his kids.


ANDERSON: So I’m always trying to get my kids interested in science and technology and just failing time and time again – possibly because I’m trying too hard. One weekend a LEGO robotics kit, called LEGO Mindstorms, came in and a radio-controlled airplane came in. And I thought, “This is going to be an awesome weekend. We’re going to do robots on Saturday, and then we’re going to fly a plane on Sunday.”


We did indeed build the LEGO robots. And they did pretty much what you would expect, which is sort of drive very slowly towards a wall and then back away. So the robots were a bit of a bust.


So on Sunday we decided to fly the plane. And we take them to the park – and I immediately fly into a tree. And I thought, you know, “How could that have gone better?”


And I thought, “Well, what if the robot flew the plane?” I go to Google and I’m like “Robot, plane.” And the first result is “drone.” I’m like “Wait, what’s a drone?” I Google again. It’s an aircraft with an autopilot. And I’m like, “Wait, that’s what’s in the LEGO Mindstorms box.”


And that evening around the dining room table, we put together a LEGO autopilot and put it in our seaplane, and then the next weekend we tried it and it kind of did work – at least it flew better than me. And at that point the kids were done. But I was like, “What just happened?”


FAKE: This was 2007. And what had happened is that new tools – like mini sensors – had become plentiful and cheap, making it possible to build things like GPS systems and accelerometers in small devices. This was the year the iPhone launched, the year 3D printing became mainstream, the year the “maker movement” took hold, making small scale creation of electronics possible.


At this point, drones were still mainly associated with the military. And while a few hobbyist kits were available, they hadn’t made the leap to widespread off-the-shelf products.


But Chris was fascinated by the drone he created, and wanted to see if others shared his passion. He created a community called DIY Drones, where anyone could learn to build a drone of their own.


ANDERSON: It was putting the letters “DIY” in front of a military technology and it was a little provocative, a little cheeky, and it just took off.


Everybody was thinking the same way, and this community became the place where they did it. And suddenly we had tens of thousands of people creating electronic designs and software designs, and basically putting robots in the air.


FAKE: The idea of DIY drones started to take hold.


ANDERSON: Once you can put advanced technology in real people’s hands, it inspires them to think of applications they wouldn’t have thought of otherwise.


So we had people doing things like wildlife conservation. It’s like, “Hey, my job is to look at crane nesting. How do we count crane nests?” And it turns out, you’re flying over them, they’re pretty easy to spot.


Then the next person is like, “Oh, I actually do bat movement. Could we use this to detect bats at night?”


These aren’t people who came for the drones. It’s people who had a problem out there that maybe a drone could solve.


FAKE: More and more people found problems that drones could solve. To give you a sense of the creativity that flourishes when that happens, let me take you into the DIY Drones community. These are all real exchanges that took place recently in Chris’s open source community. And you can hear in them the surprising, sometimes hilarious, and sort of gorgeous flourishing of creativity that can happen when ordinary people have access to new tools.


DR. SUREFLY: Hello all, I’m new to this. I’m in the medical field and I need a hobby. I’ve decided to build a drone. A big drone. I’m going to build an octo-copter that can carry me.


RESPONDER 1: Wowowowowowowow!


CATTLEMAN: Hi all, Is there any software out there that can count cattle?


RESPONDER 2: You could upload it to Mechanical Turk and pay someone like 15 cents to count the cows.


DR. SUREFLY: Will a BMW motor work for my octo-copter?


RESPONDER 1: I would start with building something small as this is a very complicated and dangerous project!


RESPONDER 2: The 10218 with 30.5” triple props is capable of lifting 57.28 pounds.


DRONE VID: Attention fellow drone fans! I found this hilarious drone wipeout video that I know you guys will love just as much as I did! Watch, laugh, hey, share the joy!


RESPONDER 2: I can count 200 cattle with a 4% margin of error. I’m trying to count trees right now – but trees are much harder than cows.


RESPONDER 1: I wrote a bird-counting algorithm that I could adjust to count cows.


DR. SUREFLY: Awesome! Thanks for the tips! I’ll post details and photos when I finally start building my Octo-copter.
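To make the "count the cattle" exchange above concrete: a minimal, hypothetical sketch of the core counting step. Real hobbyist pipelines, like the ones the posters describe, run trained detectors over actual aerial photos; here we assume the image has already been thresholded into a binary mask, and we simply count the connected blobs of bright pixels. The function name and the toy mask are illustrative, not from the community thread.

```python
def count_blobs(mask):
    """Count 4-connected regions of 1s in a binary grid.

    Each distinct blob stands in for one detected animal in a
    thresholded aerial frame.
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1  # found a new, unvisited blob
                stack = [(r, c)]
                while stack:  # iterative flood fill over the blob
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)])
    return count

# Toy "aerial frame": three separate clusters of bright pixels.
frame = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_blobs(frame))  # prints 3
```

In practice the hard part is the detection step before this one – as one responder notes, trees are much harder than cows – which is why posters report a margin of error rather than exact counts.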


FAKE: It was this kind of energy in the consumer drone market that drove Chris to launch his company 3DR, with partner Jordi Munoz. 3DR stands for 3D Robotics.


When they launched in 2009, 3DR sold homemade drone kits, continuing the DIY Drones tradition. In 2012, Chris left Wired and turned his attention full-time to selling 3DR’s first recreational drone: the IRIS. Thanks to 3DR – and even more so, the Chinese company DJI – drones were everywhere by 2015, eight years after Chris first rigged some LEGOs to a toy airplane.


Chris remembers a moment when the potential consequences of his consumer drones flashed before his eyes.


ANDERSON: I am on an airplane and open up my book, called Kill Decision, by Daniel Suarez. It’s a science fiction book about swarming drones that are modeled after army ants, and end up destroying the world. And I’m like, you know, this is super cool.


Then I closed my book, I opened up my laptop and go back to what I’m doing, which happened to be writing some code for our drones. I’m writing the swarming algorithm.


I’m like, “Oh, huh.” So that’s the point in the plot of a science fiction book where there would have been that guy who had a choice to continue writing this swarming algorithm – or say, “Oh, no! They might be used to destroy the world. I should probably stop!”


And I’m like, “I’m not gonna stop!”


FAKE: Chris knew his invention could be used to cause harm. But rather than stopping, he forged ahead. Had Chris crossed an ethical line?


ANDERSON: I was given an opportunity to just keep my laptop shut – and I didn’t.


FAKE: A year later, Chris learned that ISIS had bombed Syria using drones likely made by the DIY community he created.


FAKE: How did you find out that ISIS was using DIY drones in their attacks?


ANDERSON: Reading the New York Times.


FAKE: So you read about it alongside the rest of us?


ANDERSON: Yeah. So we knew it was possible, and of course people sort of theoretically said, “Couldn’t terrorists use this?” And the answer is, “Yes.” But it wasn’t until the pictures started circulating from the front – these various destroyed or crashed drones – that I was like, “Oh, I recognize that.”


I actually don’t know what exact code they were using. I assume it was ours, modeled after ours, or using our code, or using our open source designs. We never sold to ISIS. What happens is we sold to regular people, who sold to regular people and went on eBay. And then six people later, it’s in Syria. Once you start selling things on the internet, it’s hard to know where they end up.


FAKE: And how did you feel about that?


ANDERSON: Partly – obviously, I felt bad, right? Partly I felt like it was entirely predictable and we knew this was going to happen all along. Obviously I was also hearing from three-letter agencies.


FAKE: By three-letter agencies, he means the FBI, the CIA, the NSA, and so on.


ANDERSON: Part of me felt, like “What more can I do?” And so I redoubled my efforts to brief the agencies and to go to DC and to talk to everybody I could to let them know what was going on.


In the community, we were the first to see evidence of people at least contemplating bad things. If you’re going to fly 30 kilograms for 50 kilometers, that doesn’t sound good. That kind of sets off a warning sign. So we were very clear in our terms of service on day one – this is 10 years ago – that if we saw behavior that we considered unsafe or illegal, we would report it.


And we did so. Many, many times. But all that means is we just call up one of our buddies in one of the agencies and send them a link and say, “This person seems like they’re up to no good.”


FAKE: So Chris informed government agencies when he and his team saw something suspicious. But to me, that’s too little too late. Could he have caught it sooner? When Chris first realized that his drones could be used as a weapon, what might he have done differently?


FAKE: Was there a moment when you realized that you had created a weapon?


ANDERSON: Well there was certainly a moment – and that moment was about ten seconds after I did it, when I realized that I’d created something that other people saw as a weapon.


When we – my kids and I – were putting a LEGO autopilot on the internet, comment number two, I imagine, was somebody saying, “Hey, you know that autopilots are export controlled and by disseminating this, by exporting it across national borders, there’s all this paperwork you need to file to make sure it didn’t get in the wrong hands.”


FAKE: And was there any moment in which you thought you might want to stop the project?


ANDERSON: No. Not at all. Again, it’s LEGO with my children. The fact that somebody out there considered it a weapon suggested to me that there was a gross miscalibration of what we considered weapons.   


FAKE: This is what’s strange and fascinating about Chris’s story. He didn’t accidentally create a weapon. Rather, he deliberately took a weapon and re-invented it as a consumer tool.


ANDERSON: I dined out on the phrase “weaponizing LEGO” for years. I thought it was sort of classic Silicon Valley libertarian hubris: The government’s so stupid, they think LEGO is a weapon. So I definitely did see this as exposing a chink in our regulatory armor.

So yeah, I thought it was super funny that people would nominally stick us in a regulatory bucket that would require armed guards. But I’d hoped that this would be a wake up call for the regulators to update the regs.


FAKE: Now, Chris has a point here. The government is notoriously behind the times when it comes to classifying technology. For example, certain game consoles – like the PlayStation 2 – were prohibited from export for a while because they were considered a potential weapon.


ANDERSON: If the synergistic – and I use this word loosely – relationship between Silicon Valley and government is anything, it’s the accelerator and the brake. They’re both necessary to drive a car.


The accelerator pushes a little too hard and then the brake pulls it back, etc. And so, the notion would be that government without the feedback loop of innovation and provocation would be static. And so we help government become more dynamic by being a little bit provocative.


FAKE: You could argue that “provocative” is what Silicon Valley does best. As Chris mentioned, there’s a dominant libertarian mindset that says, “Keep the government out of my invention.”


This could be dangerous when combined with the fact that inventors love the act of inventing. Often, they don’t consider it their job to imagine the outcomes. This is where Chris and I see things differently.


ANDERSON: Here’s something we can fight about. We as technologists, it’s not our job to consider all the possible implications. It is those who we have elected to protect us – the police, the military, government, etc. – it’s their job to consider how society feels about what we invent and to inhibit us.


I feel like my only obligation is to let the authorities know what’s possible, and then submit myself to whatever they decide to do. But I shouldn’t be self censoring.


FAKE: But I do think that we – kind of sitting as we do here in Silicon Valley – whenever there is congressional hearings, we actually sit and watch elected officials, and the questions that are asked seem to be so clueless and ill informed. And so how do we fix that?


ANDERSON: That is definitely a flaw in my argument. I agree.


FAKE: Right? It’s like laughable.


ANDERSON: There’s only three options. A: We police ourselves – and we see how that went. B: We’re oppositional and we’re like, “Screw those guys.” I mean, in some sense, big tech companies are more powerful than nation states right now.


FAKE: And people would actually argue that we have created our entire super nation online.


ANDERSON: There is a third option though. And the third option is always the option with politics, which is take a deep breath, engage, go through those agonizing congressional hearings and trips to DC.


FAKE: But in some ways I think that we have a responsibility to basically put together that list of, “Here are the ways that this could go really badly wrong.” And just let them know. And kind of get out in front of it.


ANDERSON: Yeah. That would certainly be an impactful way to do it where you come to them with a list of, “Here’s 200 nightmare scenarios.”


FAKE: “Look at these things. Two hundred nightmare scenarios that we have come up with at our Monday meeting.” Why not?


ANDERSON: Well, one reason why not was that they’re like, “Well that sounds bad, we’re just shutting you down until we can figure it out.” So one thing is they don’t deal with the scary list very well.  


FAKE: Whether or not we bring our “scary lists” to the government, I do believe that every inventor, every scientist, every tech company needs to understand the “scary list” for themselves. You have to ask and answer the question: How might my technology harm people?


And that’s a great segue into our Should This Exist workshop. Here, Chris and I will respond to ideas I’ve gathered from super smart, creative experts. We asked each of them to push our thinking about drones. In what ways could drones create a modern day dystopia? And in what ways could they create a better world?


Let’s begin with where we started the episode: Kevin’s story of the hovering drone in his backyard.


DELANEY: I was looking around for a rock to hit this drone with and take it out of the sky. I couldn’t actually find anything.


FAKE: I wonder what Chris has to say about this: Are backyard hovering drones something we should embrace? Or should we look for the nearest rock?


ANDERSON: That’s a very common reaction – almost universal. I sympathize; most people would probably feel the same way. Here’s a couple of ways to think about it, which are not helpful in any emotional way but are sort of technically correct. One, all he has to do is look at his house on Google Maps and he can see that satellites have been taking pictures of his backyard for a long time – and airplanes as well. You could argue that in a sense the drone did him a favor by making noise. All the other ones are invisible and silent.


FAKE: But in that sense you actually know who it’s coming from. You know that it’s for example, Google, etc.


ANDERSON: Do you? The satellites?


FAKE: Presumably. You’re saying that, “Just go to Google Maps or Google Earth and you know that you’re being photographed by them.”


ANDERSON: You have thousands of satellites overhead. Chinese, Russian, you know private, public, NSA, airplanes – do you have any idea which airplanes are overhead? To say nothing of the police. So no, you have actually no idea who’s using that. So that’s one thing.


From a purely legal perspective, you don’t own the space over your head. You have actually no right. Society, at least in the United States, we operate on what’s called reasonable expectation of privacy. And you have no reasonable expectation of privacy outside. You don’t have any legal right to privacy outside.


So I completely sympathize and feel exactly the same way – and yet I’m sure people felt that way about cell phones and cell phone cameras when they first started. They felt that way about traffic cameras when they first started. They probably felt that way about windows when they were first created. It really jars our sense of privacy and personal versus public space. And this is just one of those liminal moments where we’re going to have to adjust.


FAKE: We can always count on comic Baratunde Thurston to look on the light side of any situation. Baratunde has spent his career at the intersection of humor and technology. He’s the host of the podcasts “Spit” and “#TellBlackStories.” Here’s what drones brought up for him.


BARATUNDE THURSTON: The fun side of this, to me, is we will probably end up with a new section on Craigslist, which is “Lost Drone” and “Found Drone” and so you try to reconnect, you know, “Found a drone in my backyard. Wonder who this belongs to?”


What is that going to look like in the future? They’ll be some drone matchmaking. Maybe you can overlap that with online dating and you end up with the person whose drone you found: “Oh, how did you guys meet?” “Well, his drone fell in my backyard, almost killed my dog, but I knew it was love when he responded to my Craigslist post.” That could be how children get made in the future.


ANDERSON: Well, he’s very prescient – there has been a “Lost Drone” section of Craigslist for about four years. But it has not, as far as I know, led to a lot of happy matchmaking, largely because landing a drone in someone’s backyard is one of the more obnoxious things you can do. That’s what we call mass jack-assery.


FAKE: Baratunde’s next thought for Chris took a more serious turn. Chris’s company creates software called Site Scan, which uses drones to survey large areas of land. Site Scan is often used to capture construction sites. But Baratunde sees more pressing use cases – with big implications.


THURSTON: Once you start assembling imagery of a wide area like that, you’re going to unlock a lot of different types of analysis. You’re going to be able to note, over time, when a car is parked in a certain location. You’re going to capture crime – and that’s going to help police.


And police are going to want access to all aerial footage that covers the scenes of these unsolved murders, and want to rewind this eye in the sky to this time, this place, “let’s see if your camera picked up anything,” and not just rely on retailers and their CCTVs, or ATMs and their cameras, or private home owners and their electronic doorbells with camera enabled, pointing out toward the street. This takes that up to another level and creates a ton of challenges.


Like criminals: So much crime will be caught on tape, so to speak. So much crime could be enabled. Really sophisticated. Talk about surveillance of a target for a burglary. I’m writing a ton of movies in my head right now about what a stakeout even looks like, about what reconnaissance even looks like with this sort of perspective, with this view on the ground from up above.


ANDERSON: Yeah, that’s good stuff. He’s absolutely right technically. We joke that the company should have been called, the company’s called 3DR because the third dimension is going up. But we should have been 4DR, because we are recording over time as well, and we can scroll backwards and forwards in time – which is super interesting when you’re trying to go through what happened when. It’s totally technically possible to have cameras overhead all the time recording all imagery, and then exactly as he says, rewind the tape.


Let’s say a red car was seen leaving the scene of a burglary. The police can just play the tape backwards in time and see where that red car came from and they can trace it right back to its origins. And there’s been a couple experiments in LA and elsewhere to do that.


FAKE: But would you see this as potentially a utopian outcome where we end up with a world without crime?


ANDERSON: I think we’ve all thought about this. Zero is not a good outcome. Zero crime is called a police state. You want all bad things to be kept to a dull roar. You want disease and death and people being mean kept to a dull roar. Anything necessary to take it to zero is so draconian and it creates more problems than it solves.


FAKE: Giving people access to technology can bring out the best and worst sides of human nature – particularly when it’s people watching other people.


ANDERSON: Let’s imagine that all homes have cameras now outside. And let’s imagine that we have two choices: We can either wait for the police to subpoena this in the case of an accident or we could just choose to connect all of our cameras and just make that footage available to the neighborhood or more. And then these cameras, of course, also have the ability to zoom in on things, quote unquote, of interest.


Well so let’s say we have to decide what is of interest: Is a stranger of interest? Do we start profiling? Are we allowed to profile? Do we record the license plate number of every car that has come through the neighborhood? And what do we do with that information?


You know, you can see that vigilante behavior could emerge, swarm-like, from these technologies, and we would end up with our worst instincts being embodied in our use of technology. The things we’d never allow the government to do to us, we’re doing to ourselves.


FAKE: I started to wonder: Open-source technologies are built on a foundation of trust and the assumption that people are good, but it’s not always the case. This use by ISIS of DIY Drones is a stark example. But it’s not exactly new. Kevin points to some examples in recent history we can look to.


DELANEY: Here’s another example, which is people open sourced 3D printing formulas for working guns. And there’s the classic example from years ago, there’s that book, the Anarchist Cookbook, which was this sort of mythical thing that existed in public libraries on shelves you had to ask the librarian to access, that had recipes for bombs. And so, I think that in general in our society, we need to be comfortable with a lot of this stuff existing.


FAKE: What do you think Chris?


ANDERSON: Yeah, and I thought that was very wise. He’s absolutely right that we can’t look at tiny minority abusing a technology and then ruin it for everybody.


FAKE: As Chris and I talked about the implications of an open-source community or open-source technology, I thought of something that happened recently. When I launched the photo site Flickr, we were among the first to encourage the sharing of photos with a Creative Commons license. I believed in this kind of openness so strongly, I joined the board of Creative Commons.


But then I learned in the last few months that Creative Commons images from Flickr were being used by big tech companies to train their facial recognition algorithms. And this wasn’t the kind of use we had in mind when we developed or promoted this approach to openness. It was distressing to me, and I shared some of those thoughts with Chris.


FAKE: Well, I do think that this, in some ways, points to all of the assumptions that underlie open source: we have a general idea that technology is invented by people with the best of intentions – and then it’s used by people who may not share those values.


And I think that this kind of gets to the core of what we believe as a society. That we have these values, that we believe people are operating in good faith, that we put the technology out there, that it’s open-sourced. “Tech for all.” And this tends to rub up against people with differing world views.


ANDERSON: Yeah, you’ve put it well. And I’m very guilty of that delusion. And this predates drones. This goes all the way back to the internet. Personal computing. I mean, I was present at creation, and I definitely operated under the following calculus: I think that most people are good. I think if you give powerful tools to regular people, they’ll do extraordinary things. And if there are more good people than bad people, you’ll get a net good out of this.


And so I assumed a kind of symmetrical adoption impact relationship. And in fact, many of us were just deeply wrong about this – it’s very asymmetrical. It turns out that a few bad people with powerful tools can have a lot more impact than a lot of good people with powerful tools.


FAKE: In many ways, I still believe that the good that comes from open source technology outweighs the bad. If governments can keep bad actors from exploiting open culture in these untoward ways, then it’s a win-win for everybody.


But that doesn’t absolve today’s technologists from their responsibility. It’s the challenge of this generation of inventors to grapple with the double-edged sword of innovation in the era of openness.


I’d love to know what you think – about open source, about drones, and about 3DR. Tweet at us using the hashtag #ShouldThisExist, and visit for additional reporting about 3DR.