letstalkbitcoin #21

Nov 5th, 2013
  1. 0:14
  2. This is Episode 21 of Let's Talk Bitcoin, and like last time, this episode is a
  3. little bit different. Most of the time, I speak to guests and hosts over the
  4. internet, eyes fixed on screens, talking to people from around the world.
  5. 0:27
  6. John Light is a recent transplant to Northern California, and after meeting at
  7. Bitcoin 2013, I asked him up to the LTB homestead, tucked into a canyon outside
  8. the Napa Valley, surrounded by steep redwood forests.
  9. 0:42
  10. Microphone in tow, we sat on the bank of a creek and talked about the future of
  11. identity in a networked world.
  12. 0:47
  13. Today's show is important.
  14. 0:50
  15. Granularity is a concept worth understanding. Imagine the seaside, waves
  16. crashing on the shore. As a whole, it's a singular object: a beach. It has its
  17. place, it doesn't move, it's enormous and persistent. At a granular level,
  18. it's billions of tiny pieces of sand; tidal impacts can move individual grains
  19. enormous distances, relatively speaking.
  20. 1:19
  21. We talked about identity in this context. It's not about the beach. It's about
  22. the individual pieces of sand. Each one is a detail, attribute, event. They're
  23. you. Right now, you pick either the beach, or no beach at all. But that's about
  24. to change.
  25. 1:45
  26. Privacy. You know, there was a time before Facebook, and a time before social
  27. networks in general, and frankly a time before websites where everything you put
  28. on them was owned by the website.
  29. 1:57
  30. I think that we might be moving back towards a time when suddenly this type of
  31. granular control over your own identity is possible, and it's because of
  32. concepts like personal clouds.
  33. 2:09
  34. I'm sitting here today with John Light, one of the good guys so to speak, working
  35. on this identity problem and this personal control problem.
  36. 2:19
  37. John, how did you get into personal clouds?
  38. 2:21
  39. John: When I started my blog, p2pconnects.us, which is a blog about peer-to-peer
  40. technology and how I think it can help solve a lot of problems that are going on
  41. in the world today, the very first post that I wrote was called "Universal
  42. Reputation Rating Systems: The Future of Trust in a Networked Society".
  43. 2:42
  44. I wrote that based on technology that I saw coming out which was essentially
  45. bringing all of people's social profiles together into one place to create what
  46. I'm calling right now a universal reputation rating.
  47. 2:59
  48. Basically it pulls your eBay, your Airbnb, your Facebook, your Twitter; it
  49. puts it all together and it creates a score for how likely you are to be trusted
  50. based on all the connections that you have, all of your past history in the
  51. marketplace.
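
[A minimal sketch of the kind of score aggregation John describes here: each marketplace reports a rating on its own scale, normalized and weighted into a single number. The source names, weights, and scales are made up for illustration; this is not TrustCloud's or connect.me's actual method.]

    # Hypothetical per-site scores: (raw_score, scale_max, weight)
    SOURCES = {
        "ebay":     (4980, 5000, 0.4),   # e.g. positive feedback out of total
        "airbnb":   (4.8,  5.0,  0.3),   # star rating
        "twitter":  (650,  1000, 0.1),   # some engagement-derived score
        "facebook": (0.9,  1.0,  0.2),   # e.g. fraction of verified connections
    }

    def universal_rating(sources):
        """Combine per-site scores into a single 0-100 trust rating."""
        total_weight = sum(w for _, _, w in sources.values())
        weighted = sum((raw / scale) * w for raw, scale, w in sources.values())
        return round(100 * weighted / total_weight, 1)

    print(universal_rating(SOURCES))  # 93.1 for the numbers above
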
  52. 3:16
  53. I saw that as a trend; the first company I came across was called TrustCloud
  54. that did this, and there are more: there's a website called connect.me which
  55. offers a reputation rating system.
  56. 3:29
  57. It doesn't necessarily pull together all of your different social profiles like
  58. TrustCloud does, but it gives people a way to endorse you for different
  59. activities to show that to other people, "yes, I'm vouched for in this way" and
  60. now LinkedIn has even incorporated something like that, where people can
  61. endorse you for different skills, and so that's kind of how I got interested in
  62. reputation, and through researching for those blog posts that I wrote about
  63. that, the second one was called "universal reputation rating systems: problems
  64. and solutions," where I go into some of the pitfalls that could be encountered
  65. with a reputation system like this and then maybe some solutions for how that
  66. can be worked out, and as I began researching for writing these articles, I came
  67. across this concept called personal clouds, where the personal cloud is like
  68. your singular place on the internet to put all of your identity information,
  69. all of your social information and financial information -- everything that you
  70. would possibly need on the internet into a one secure encrypted environment,
  71. and then that identity, that place marker on the web, has its own reputation,
  72. and what the connect.me service is trying to do is serve as one of the
  73. primary reputation providers for the personal cloud ecosystem. No matter where
  74. you are on the internet, you're going to be able to carry around this reputation
  75. that you've spent a lot of time and a lot of effort building up.
  76. 5:08
  77. Rather than having to create a whole new reputation every time you join a new
  78. community, like a new forum or a new marketplace or whatever. So right now we're
  79. starting to see the beginning of this, with social sign on where you can
  80. actually use Facebook and other social networks to log in to places, but that's
  81. still not good enough. Facebook itself is a walled garden that you can't really
  82. export your data from and take to other providers, whereas with personal clouds
  83. you're going to be able to do that. If you don't like your current personal
  84. cloud provider you can just take all of that data that you spent so much time
  85. accumulating and organizing, and just export it and drop it right into another
  86. cloud service provider, or you can host it yourself. When I came across this
  87. concept I realized that it is going to do for the internet what Bitcoin is doing for
  88. money, or what torrents are doing for file sharing. It is going to create a
  89. purely p2p, peer-to-peer environment where people are in full control of
  90. everything that's coming in and out, and that's to me I think really important,
  91. especially you know right now, after the news has come out about the Edward Snowden
  92. leaks, where we find out that these companies which people have been entrusting
  93. with their data aren't just sharing it with advertisers; it's also going to
  94. governments.
  95. 6:36
  96. For people everywhere else, outside the USA, that's a government that's not even
  97. their own government. It's really important that people be given the tools to
  98. take control of their data, and control of their privacy, and personal clouds,
  99. I see, are a way to start doing that.
  100. 6:53
  101. Adam: Talking about social networks, and services that keep data for you, you're
  102. talking about how they expose your data to other people in various places, why
  103. is that? Why is that the situation that we find ourselves in, what happened in
  104. the development of infrastructure up to this point?
  105. 7:11
  106. Personal clouds are something that will be out in the next months or years, but
  107. this is a problem clearly that we've had for a while so why did it take until
  108. now to start addressing it?
  109. 7:22
  110. John: The development of Diaspora, which was intended to be a decentralized and
  111. federated social network, tried to start tackling this problem years ago. It's an
  112. open source project so there's no real money or monetary incentive for people
  113. to fully develop it, so while the developers have done a really good job of
  114. building a really great product it's not perfect and it's not anywhere near as
  115. advanced in terms of security and granularity as personal clouds as a platform
  116. will be. Diaspora could be looked at almost like a proto personal cloud.
  117. 8:02
  118. Along with services like personal.com where you can upload a bunch of your
  119. financial and medical and personal data and it's all encrypted, but you can't
  120. really share it on such a granular level like personal clouds are going to be
  121. able to, so there's a lot of services that have tried to address this problem
  122. from different angles, but no one has yet really been able to bring it together
  123. to a holistic system, a holistic platform like personal clouds have.
  124. 8:30
  125. Now how do we get here? Well in terms of Facebook, they don't have any other
  126. revenue model other than to sell your data to advertisers; that's kind of
  127. Google's play as well, you know their revenue model is taking your data and then
  128. using it to serve you advertisements; it's a part of the revenue model you know.
  129. They don't think that they can charge people to use their services so instead
  130. they give it to you for free and then their customers are actually the ad
  131. companies.
  132. 8:58
  133. You're not the customer you're the product when you use social networks. Part of
  134. the challenge for personal clouds is going to be to find out whether people are
  135. willing to pay for privacy. I'm willing to pay for privacy but is, you know, the
  136. average social network user? That remains to be seen.
  137. 9:15
  138. It's really just a matter of changing the incentives away from being
  139. incentivized to sell data or even just give it wholesale over to malicious
  140. governments, and instead have an incentive to keep it private because the data
  141. owner is the customer not the product.
  142. 9:36
  143. Adam: Does privacy matter in the modern age? I mean, as far as the recent
  144. revelations about state spying are concerned, basically all the stuff that's
  145. been coming out about how data is so insecure, it's really
  146. made me wonder. You know I do a lot of business on Skype for example, I do a lot
  147. of communication on Skype, and that's a totally compromised platform, and so I
  148. just kind of assume at this point that whatever I do I better be comfortable
  149. with somebody out there, anybody out there being able to look at it because
  150. there just isn't much that can be done about it.
  151. 10:05
  152. I mean do you think that this is something that even can be tackled reasonably,
  153. or have we passed the point of no return?
  154. 10:11
  155. John: I think that's a really good question, especially as people, despite
  156. knowing what's going on with these social networks, continue to use them.
  157. 10:20
  158. It's an implicit acceptance of the status quo. I think it's dangerous,
  159. personally. Is privacy important? I think it's very important. I think people
  160. should be able to have an on and off switch for privacy, instead of it just
  161. being you know off all the time and then having to jump over insane hurdles
  162. to get it to that on position.
  163. 10:42
  164. One of the most dangerous things about a lack of privacy on social networks is
  165. it's not just the individual pieces of information that go on to the social
  166. networks, it's the aggregation of this information. If you post a few status updates
  167. on Facebook and maybe some location things on Foursquare, and you text a few
  168. people on your cell phone, individually those things might not seem so harmful
  169. but you put it together and all of a sudden I know who you're hanging out with,
  170. where you are, where you're not, meaning that if you're not at home your home is
  171. open to burglary or wiretapping or any other kind of malicious activity where
  172. people could then, you know exploit these openings in the various communications
  173. platforms that we have, in order to commit serious crime against you. If you're
  174. a young attractive individual and you suddenly develop a stalker who has a
  175. little bit of technological savvy they might be able to find you when you're
  176. alone, and that's a serious concern.
  177. 11:42
  178. It hasn't developed to be something where that's a common occurrence yet, I
  179. haven't personally heard of anyone using such an aggregation of information to
  180. do this stuff yet, but
  181. 11:53
  182. Adam: it's all out there
  183. John: the threat is there, you know, and it's really not even just what this
  184. government is doing with our data, who's the next guy that's going to be
  185. elected, or gal? If Edward Snowden could have access to this information, who
  186. the hell is Edward Snowden? I have no idea, I didn't elect him, and yet he had
  187. access to all of this information, he said he could get dossiers on the
  188. president if he wanted to, and read the president's text messages if he wanted
  189. to.
  190. 12:18
  191. That's really scary because all it would take is some criminal organization to
  192. infiltrate the NSA, literally just have one of their young members just go to
  193. college for info sec, go into the military just through the whole step process
  194. and say you're going to be our inside guy, they get behind the controls of this
  195. huge surveillance apparatus and rub their hands together, and just start
  196. clicking away and instead of leaking information to The Guardian, they leak
  197. information to their bosses.
  198. Adam: But they've solved this. You might not have seen this but they've got a
  199. solution to this leaking problem, you see it's called the buddy system.
  200. So anytime anybody needs to access confidential data in the same way that Mr.
  201. Snowden did the process will be that there will be someone else who will have to
  202. sign off on it, and that will make it 100% secure, it can never be compromised
  203. at all, and if that doesn't work, they're going to the three-man system so it's
  204. yeah, I mean you're totally right.
  205. 13:18
  206. There are something like five hundred thousand people who have Top Secret
  207. security clearance, and that's ridiculous you know, how can something be a
  208. secret when that many people know?
  209. 13:25
  210. John: It's really not just those individual situations, the systems themselves
  211. could become compromised to outside attacks, where you don't even need an
  212. inside man. I mean the government is pretty good at building an intranet,
  213. where you need inside access in order to see some of this information, but
  214. it's only a matter of time before something like Stuxnet or Flame or any of
  215. these really malicious viruses is able to find its way into these systems and
  216. just expose everything. Maybe it's not even just a concerted effort to
  217. take information and give it to a particular organization, but just to dump all
  218. of it.
  219. 14:04
  220. I mean what happens when that happens? I mean it's just, again it's not the
  221. individual pieces of information that matter, it's the aggregation, it's being
  222. able to build a behavioral profile where I don't just know what you're doing
  223. right now, I can predict what you're doing for the next month because I know
  224. exactly what you do every morning, I know when you go to bed, I know who you
  225. hang out with, for some people they have crazy lives and those things are hard
  226. to predict but a lot of people are creatures of habit, and these kinds of
  227. attacks become very easy, you know. It's just, that's what really concerns me,
  228. is that people are going to be exposing themselves in ways that they can't even
  229. imagine because they're only looking at it one instance at a time, they're only
  230. experiencing it one instance at a time, they don't have this bird's eye view of
  231. what the whole picture looks like and frankly I think that if governments can
  232. have this bird's eye view, the people who they're collecting data on should be
  233. able to as well, so that we can get this kind of full picture of "oh my god."
  234. 15:09
  235. I can tell what this is leading to, now that I can see this whole profile on me.
  236. Facebook has already introduced social graphs, so you can kind of start to get a
  237. picture. I know this person, they know this person and this is what our whole
  238. thing looks like, but you add it in with cellphone data, you add it in with your
  239. Gmail accounts, you add it in with everything else that's being collected, it's
  240. a really scary picture.
  241. 15:37
  242. Adam: You've used the term aggregation of data a couple of times, but I think
  243. that the way I would say it is that it's about the centralization of data,
  244. because like you said, it turns it
  245. into a target. I mean it's just like Facebook is a huge target for being
  246. attacked because they house so much personal data, if that's true one has to
  247. imagine that systems that are designed to collect and combine data from all of
  248. these different enormous sites on the internet, communities on the internet,
  249. would of course be an even larger target, just because there's so much data,
  250. in the same way that web wallet services are bigger targets than Bitcoin wallets
  251. on computers.
  252. 16:13
  253. John: That's a great point. It is this centralization really that is the key
  254. issue. You know with personal clouds you're still taking all of that data and
  255. aggregating it into one place, but it's your personal cloud which itself
  256. you're going to be able to self-host, or you're going to be able to host it with
  257. third-party providers...
  258. 16:32
  259. Adam: Okay so we've talked about, we've kind of talked around personal clouds
  260. now, but I think it's relevant to go back and talk about them in a more basic
  261. way. I'm a user, I buy into what you're saying, that privacy is an issue that
  262. I should be concerned about, and so what does this system look like that's
  263. different, how is it different, how am I interacting with this personal cloud
  264. in a way that lets me do the things that you're saying I can use it to replace?
  265. John: That's a really good question, so I consider myself an enthusiastic end
  266. user, I'm not a programmer, so I'm not working on this from a technological
  267. level, but this is what it's going to look like and this is why I'm so excited
  268. about it: so you're going to have a platform, let's just for an example say that
  269. you're hosting it with a third party, which most people will do, just like they
  270. host email with a third party,
  271. 17:21
  272. so this third-party provider, when you first sign up with an account you're
  273. going to get a user agreement which should and will have a framework, it's
  274. called a trust framework, which outlines exactly what that company can and
  275. cannot do with your data. Not will do, can do.
  276. 17:45
  277. Adam: Is that different from how it is currently, when Facebook you know puts their terms
  278. of use out, are they saying "this is what we will or won't do" but not talking
  279. about what they can't, or are you just saying that that's important particularly
  280. in this case?
  281. 17:57
  282. John: I think it's important particularly in this case. All of the data will be
  283. encrypted by default; there's only so much that the company will be able to do
  284. with it anyway. I can use an example of one of the existing trust frameworks.
  285. How I found out about this concept was through a company called The Respect
  286. Network, which is building a network of cloud service providers who all agree to
  287. what they call the Respect Trust Framework, which has five principles including:
  288. how the data is stored, you know how the data is protected at the security
  289. level, interoperability at the protocol level
  290. Adam: Now when you say interoperability you mean the ability to...
  291. 18:38
  292. John: For all of these personal clouds to talk to each other, regardless of who
  293. the service provider is. Redundancy: if their server goes down, your data is
  294. still safe somewhere else, and that is about how the data is actually handled.
  295. Several other points as well, I don't have them memorized. It's very basic, five
  296. principles and then they elaborate from there to describe exactly how they're
  297. going to fulfill each of these principles, and that's going to be the basic
  298. contract that you're going to be getting into with these cloud service
  299. providers.
  300. 19:09
  301. The important part is that your data is encrypted by default. The kinds of uses
  302. that cloud service providers envision this platform being used for require it.
  303. They would be breaking multiple laws if they didn't. Things like HIPAA, the
  304. Health Insurance Portability and Accountability Act, and laws that govern the privacy of data
  305. online around financial institutions.
  306. 19:37
  307. Health, financial, those are probably the two most sensitive and high-risk kinds
  308. of applications, and these cloud service providers are expecting their
  309. users to trust them with that kind of data.
  310. 19:49
  311. Adam: So on the health side, what does that look like, what's a scenario where a
  312. personal cloud is useful to me in a health capacity or a medical capacity?
  313. 19:56
  314. John: Sure, let me just tell the story of how the personal cloud is going to
  315. work: so when you sign up for a personal cloud provider, they give you that
  316. agreement, you kind of look at this and you say "do I agree to this? Yes, I agree
  317. to this" and then you start to fill out your basic information and upload some
  318. basic data about yourself, a name, a bio, your contact information, maybe attach
  319. some credit cards and debit cards and bank accounts, and from there build
  320. relationships with other personal clouds.
  321. 20:32
  322. As you start to build relationships, each new relationship that you have you'll
  323. be able to give them fully granular access to the data you have stored in your
  324. personal cloud.
  325. 20:44
  326. So what your family sees will be different from what your best friend sees, will
  327. be different from what your co-workers see, will be different from what your
  328. doctor sees, and so on and so forth. In the case of creating a relationship with
  329. your doctor, instead of your medical records going into a filing cabinet which is
  330. stored behind his desk, they're just going to be dumped right into your personal
  331. cloud, and then as the doctor needs that data to do his job you have a specific,
  332. what would be called a link contract, which governs when and for how long that
  333. data will be available to the doctor.
  334. 21:24
  335. So maybe his office is only open from nine to five so the link contract says
  336. that he can only access your medical records from nine to five Monday through
  337. Friday and then the rest of the time that connection is completely sealed off
  338. by the encryption.
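
[A minimal sketch of the time-windowed link contract in the doctor example, using a made-up contract structure; real XDI link contracts are expressed differently, and the address "=dr.smith" is hypothetical.]

    from datetime import datetime

    # The doctor may read "medical_records" only during office hours, Mon-Fri.
    doctor_contract = {
        "grantee": "=dr.smith",       # hypothetical cloud address
        "data": {"medical_records"},
        "weekdays": range(0, 5),      # Monday=0 ... Friday=4
        "hours": range(9, 17),        # 9:00 through 16:59
    }

    def access_allowed(contract, item, when=None):
        """True if the link contract permits reading `item` at time `when`."""
        when = when or datetime.now()
        return (item in contract["data"]
                and when.weekday() in contract["weekdays"]
                and when.hour in contract["hours"])

    print(access_allowed(doctor_contract, "medical_records",
                         datetime(2013, 11, 5, 10, 30)))   # True: Tuesday morning
    print(access_allowed(doctor_contract, "medical_records",
                         datetime(2013, 11, 5, 22, 0)))    # False: after hours
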
  339. 21:39
  340. How this constant decryption and re-encryption of data occurs is currently being
  341. built into the personal cloud platform; that's one of the big challenges
  342. of building this kind of system: at the protocol level, building privacy in so
  343. that these features actually work.
  344. 21:58
  345. There's already a protocol called XDI, I believe XDI, which will govern how the
  346. data is exchanged. In the instance of connecting with a friend, basically when
  347. you add them to your personal cloud network, you are going to give them specific
  348. permissions to access specific data.
  349. 22:21
  350. You know you guys listen to the same music, so you let him access your music
  351. files that you've uploaded. You don't care if he knows your bio, so you give
  352. him permission to access your bio. He knows your real name so you give him
  353. access to your real name.
  354. 22:34
  355. Now let's flip this around, and say it's not your best friend that you're adding
  356. to your personal cloud network, it's this new person. You just met them, maybe
  357. instead of seeing your full name they just see your first name, instead of
  358. seeing your real picture maybe they see some stock picture of a blank face, or
  359. something like that and they don't see your full bio, they just see your
  360. professional bio, something you don't mind being public. Things like that.
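
[A minimal sketch of the per-relationship visibility just described, with hypothetical field and relationship names; an actual personal cloud would enforce this through link contracts and encryption rather than a plain dictionary.]

    # Which fields each relationship has been granted.
    PERMISSIONS = {
        "best_friend": {"full_name", "full_bio", "music"},
        "new_contact": {"first_name", "professional_bio"},
        "doctor":      {"full_name", "contact_info", "medical_records"},
    }

    PROFILE = {
        "full_name": "John Light",
        "first_name": "John",
        "professional_bio": "Writes about peer-to-peer technology.",
        "full_bio": "(private by default)",
        "music": ["track1.ogg", "track2.ogg"],
    }

    def view_for(relationship, profile=PROFILE):
        """Return only the fields this relationship is allowed to see."""
        allowed = PERMISSIONS.get(relationship, set())
        return {k: v for k, v in profile.items() if k in allowed}

    print(view_for("new_contact"))   # first name and professional bio only
    print(view_for("best_friend"))   # full name, full bio, music
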
  361. 23:03
  362. As you gain trust with somebody you can open them up to have access to more
  363. information, and then from there, it can be called a personal cloud because
  364. it's not just accessible from one centralized location, it's something that is
  365. going to be usable across devices, so that my smartphone could access my
  366. personal cloud, my tablets, my computer, any kind of technology platform that I
  367. have which has access to the internet will be able to access my personal
  368. cloud. I'll be able to authenticate to the personal cloud so that it lets me in,
  369. and then I can control everything from there.
  370. 23:59
  371. Advertisement: If I showed you a website where you could easily purchase
  372. electronics from the world's largest distributor with Bitcoins at zero percent
  373. markup, would you think it was too good to be true? Good news: it's real, and
  374. it's at BitcoinStore.com. Choose from half a million items, save money over
  375. Amazon and Newegg, and convert your Bitcoins to real-world items. You can even
  376. buy with privacy; all they need is a shipping address. But don't take my word
  377. for it, see for yourself at BitcoinStore.com.
  378. 24:32
  379. Let's Talk Bitcoin is an experiment focused on getting new ideas into the
  380. conversation. If you like what we're doing, visit letstalkbitcoin.com for
  381. episode-specific tip jars. If you'd like to sponsor the show, please contact
  382. [email protected] to start the conversation.
  383. 24:51
  384. I hope you're enjoying this diversion from our usual segmented format. As
  385. always, it's an experiment and your feedback is appreciated. Let's get back to
  386. the conversation.
  387. 25:04
  388. Adam: I've been trying to think of what a good analogy is, and I started with: I
  389. have a file cabinet, right, tons of personal information, tax returns in it and
  390. receipts, all sorts stuff that I don't really need most of the time but that
  391. occasionally I need to dig out because I need to send it somewhere or you know
  392. like mortgage stuff or verification or taxes or things like that, to a certain
  393. extent what you're talking about here is like a smart filing cabinet that lives
  394. in the cloud, where you can make keys for it, you know make keys for this filing
  395. cabinet that you can give to different people, and they're like smart keys,
  396. it's automated.
  397. 25:43
  398. John: It's more like each individual piece of data, each individual file, if
  399. you want to call it that, is locked with a different key, and you give
  400. specific keyrings to specific people to unlock specific things.
  401. 26:03
  402. So you might have three different bios, you might have the one that is for your
  403. LinkedIn or you know your professional network, you might have one for your
  404. family and best friends, and you might have one for the public at large, or just
  405. new people in general. So each of those are locked with different private keys
  406. and so you're going to be able to share those with different people.
  407. 26:26
  408. Again, I'm not a coder so I don't know how this is actually being done, but
  409. these things aren't being encrypted with symmetric cryptography, because then
  410. the people that you give it to could just share the keys with anybody and then
  411. anybody else could come in, and unlock the file.
  412. 26:41
  413. So instead what I think a personal cloud is going to do is make a copy
  414. of the data, encrypt it with the other person's public key, and then send it to
  415. them. So that they can then decrypt it with their private key when they log in
  416. to their personal cloud.
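
[A minimal sketch of the hybrid scheme John is guessing at: encrypt a copy of the data with a fresh symmetric key, then encrypt that key with the recipient's public key so only they can unwrap it. This uses the Python cryptography library purely for illustration; it is not the actual personal cloud or XDI implementation.]

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes
    from cryptography.fernet import Fernet

    # The recipient's keypair; in practice you would only hold their public key.
    recipient_priv = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_pub = recipient_priv.public_key()

    OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    def share(data: bytes, pub):
        """Encrypt a copy of `data` so that only the key's owner can read it."""
        session_key = Fernet.generate_key()             # fresh symmetric key
        ciphertext = Fernet(session_key).encrypt(data)  # encrypt the data copy
        wrapped_key = pub.encrypt(session_key, OAEP)    # wrap the key for them
        return wrapped_key, ciphertext

    def receive(wrapped_key, ciphertext, priv):
        """Recipient unwraps the key, then decrypts the data."""
        session_key = priv.decrypt(wrapped_key, OAEP)
        return Fernet(session_key).decrypt(ciphertext)

    wrapped, ct = share(b"my professional bio", recipient_pub)
    print(receive(wrapped, ct, recipient_priv))   # b'my professional bio'
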
  417. 26:55
  418. Adam: Okay, I see, so there never is any unencrypted data on the net, you're not
  419. releasing anything, it just gets encrypted under a different specific person or
  420. organization's key.
  421. John: Exactly. Unless of course, you set it to public, and then it's like
  422. Twitter, it's just all out in the open. For certain things, like I use Twitter, I
  423. don't mind having a public-facing website, my blog is public, my consulting
  424. website's public, so there are things that I don't care about sharing with the
  425. whole world, but there are also things that I'd prefer to only keep between
  426. myself and selected individuals.
  427. 27:34
  428. Those are the kinds of things that personal clouds are going to be especially
  429. useful for, because right now when you send a direct message to somebody on
  430. Facebook or when you text message somebody, if you're not using encrypted text
  431. messaging, then that's just clear text sitting in multiple other servers,
  432. the NSA's servers, your service provider's servers, and every server in between:
  433. your ISP, your cell phone company, all these different servers.
  434. 28:01
  435. With a personal cloud, instead it's just going to be all a bunch of cipher text
  436. sitting on all of these servers, and you know good luck trying to decrypt all of
  437. it.
  438. 28:09
  439. Adam: Well you could try, but...
  440. John: When everything is encrypted, how do you know what to target? Are you going
  441. to spend 10 years trying to brute-force something, only to find out it says "I
  442. like pudding," or "meet me at Starbucks at 3"? It's going to make these efforts
  443. for collecting data just look absolutely silly.
  444. 28:34
  445. All of a sudden they don't even know who to target anymore. You know I read
  446. recently that right now, encrypting your data by default makes you a target.
  447. They're much more likely to hang onto it and build a nice little profile on you,
  448. and spend some effort trying to decrypt your stuff, especially when most of your
  449. communications aren't encrypted, but then selected conversations are.
  450. 28:57
  451. But when everything is encrypted by default, all of a sudden it becomes kind of
  452. like "boss, what do you want me to do here?"
  453. 29:04
  454. Adam: So we have to move the baseline basically, right now it's abnormal if you
  455. encrypt, few people do it because it's a hassle and when you do do it, then it's
  456. because it's something that you actually need to encrypt, you feel like.
  457. 29:18
  458. John: For the most part, I mean for most people.
  459. Adam: That's the perception, I'd say.
  460. John: That's the perception, for sure.
  461. 29:25
  462. John: I have used TextSecure and RedPhone and GPG with all my friends who were
  463. willing to download it, and thankfully a lot of my friends have been willing to,
  464. since I've learned about this stuff and it is kind of just like encrypt by
  465. default. Why? Because we can. And because the picture of somebody trying to
  466. brute-force something and then seeing "I like pudding" at the end of it is just
  467. so hilarious in our minds.
  468. 29:48
  469. We're already beginning to see steps people are taking towards actually going
  470. through the learning curve of figuring out how to use these encryption tools and
  471. using them. Specifically since these Snowden leaks came out. When it actually
  472. becomes something that people don't have to work to do, when it's just that's
  473. the default mode of behavior, when it's going on behind the scenes and you don't
  474. have to think about it, then that many more people will be doing it. I mean I see
  475. personal clouds being like the next not Facebook, but the next Personal
  476. Computer.
  477. 30:20
  478. It's not just an evolution from social networks, it's an evolution of the
  479. platform from which you do all of your work. Because think about it: Google
  480. itself, okay, Google's kind of a company I love to hate because I'm a Gmail
  481. user, I use Google Docs for various things (nothing sensitive obviously), but
  482. when it's handy I use it, it's kind of the default search engine, I've been using
  483. Startpage and DuckDuckGo more, in Mozilla it's the default search engine and
  484. they get kind of the best results because they are the biggest monstrosity of a
  485. search engine.
  486. 30:55
  487. There's a lot of things I love about Google but there's also a lot of things
  488. that I hate about Google, particularly their revenue raising model where they
  489. take all this stuff that I do with them and then sell it to someone else so that
  490. they can serve me ads, I mean that's just...
  491. Adam: Would you pay for search?
  492. 31:09
  493. John: I would pay for a Google account if it gave me access to all of the
  494. things that they do. I mean I pay for internet, I pay for cell phone service, why
  495. wouldn't I pay for a personal cloud provider which is going to protect my data,
  496. and that's the value that these personal cloud providers are going to be selling
  497. to their customers.
  498. 31:31
  499. You're going to get all the features of all these other services that you use,
  500. minus the part where they take all of your data and give it to the highest bidder
  501. or give it to whoever's pointing guns at them or whatever the case may be.
  502. 31:43
  503. Adam: So, you said that this is not an evolution of technology broadly but an
  504. evolution of the personal computer, and I'm very curious for the thought behind
  505. that because I don't really see the connection there. Isn't a computer more
  506. about what it enables you to do in terms of hardware capabilities? What do you mean
  507. by that?
  508. John: Cloud computing and cloud processes, cloud services in general are
  509. advancing at such a quick pace that you'll be able to be delivered
  510. software-as-a-service, or anything as a service really, from a cloud provider.
  511. You'll be able to manipulate software that's stored on their machines.
  512. 32:23
  513. It doesn't matter what kind of machine you have really. You could have
  514. something with like a Pentium 4 or something.
  515. Adam: So this is the Xbox One concept that they've been very interested in, where
  516. yes the hardware itself is not that impressive, I'm not sure if you're familiar
  517. with this, the next release of the Xbox console that's coming out,
  518. 32:39
  519. John: Yes, please explain
  520. Adam: Is not that powerful overall, but what it does is it hooks up to the cloud
  521. where in the cloud supposedly there are between two and three additional Xbox
  522. Ones worth of processing power available, so that the thought is that even
  523. though your hardware isn't that impressive this cloud is impressive, and so
  524. people can build up games with the idea that it's not your hardware but the
  525. hardware that you have plus some hardware in the cloud, and that over time rather
  526. than upgrading the hardware they'll just upgrade the cloud so you won't need to
  527. buy another box.
  528. 33:13
  529. John: Exactly, so your personal cloud platform is a shell, it's a shell within
  530. which you store minimal amounts of data, and I say minimal meaning that even if
  531. it's your whole life, at a kilobytes level it's really easily digestible
  532. by pretty much any machine that exists right now, my smartphone could handle
  533. the storage of this data, you know what I'm saying?
  534. 33:34
  535. And then it's these other servers of the service providers that you need for
  536. doing your financial transactions or doing your gaming or doing your
  537. marketplace activities or whatever, they're going to be kind of like the
  538. communities that you still kind of carry around your identity to all of these
  539. different places, but they're just getting the relevant info that they need out
  540. of your personal cloud, so they're not actually storing any of the data it's all
  541. in your personal cloud and they just kind of take what they need to do the
  542. minimum function that you're asking for.
  543. 34:07
  544. So in the case of a marketplace, it pulls your reputation information, it might
  545. pull your address if you need to have stuff sent to you, they might pull your
  546. email so that, if there's even email, I mean in reality these things will be
  547. able to send messages to each other, so email itself might not even be
  548. necessary as a service one day.
  549. Adam: So when you're talking about email, and how these can send messages but
  550. it's not email, how is it different from email, is it more like a personal
  551. message, I mean is this just a semantic differentiation?
  552. John: It's still an address, it needs to know how to get this message from
  553. a point A to point B, but the address isn't @gmail.com, it's your home address,
  554. kind of.
  555. 34:51
  556. This is your place on the network, and this is how it finds you, and that's what
  557. XDI, which is short for eXtensible Data Interchange, is going to do.
  558. 35:01
  559. The whole personal cloud platform is built off of a semantic data graph
  560. 35:07
  561. Adam: Okay, what does that mean?
  562. John: Basically what it means is that all of your things are individual points
  563. on the graph, all individually addressable, and so that's how other
  564. personal clouds find your stuff. So no matter who your cloud service
  565. provider is, the actual addresses still work; your stuff can still be found on the network.
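
[A minimal sketch of individually addressable data points in a graph; the address syntax below is invented for illustration and is not actual XDI notation.]

    # A toy "semantic data graph": every piece of data has its own stable
    # address, independent of which provider happens to host the graph.
    GRAPH = {
        "=john":                    {"type": "person"},
        "=john/bio/professional":   "Writes about peer-to-peer technology.",
        "=john/contact/first_name": "John",
        "=john/music/track1":       "track1.ogg",
    }

    def resolve(address, graph=GRAPH):
        """Look up one node by its address; links you hand out stay valid."""
        return graph.get(address)

    # A friend given this link can keep using it even if the data moves to a
    # different hosting provider, because the address itself doesn't change.
    print(resolve("=john/bio/professional"))
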
  566. 35:27
  567. Adam: So is this like everybody has a unique name, and so through that unique
  568. name as long as you know the name you can find the person and connect with them,
  569. make one of these contracts with them?
  570. 35:38
  571. John: Yes, okay yes essentially. Like I say, it's your singular identity that
  572. you're able to take anywhere so that no matter who your cloud service provider
  573. is all of the links that you've given out to other people to find your stuff are
  574. all still valid.
  575. 35:53
  576. That's the most important part, is that, you know, if I delete my Facebook all of a
  577. sudden you can't see anything that I ever had on Facebook ever again, whereas
  578. with personal clouds, as long as I have a personal cloud that's active online,
  579. on the internet somewhere you can access it. The addresses don't change. The
  580. links don't change.
  581. 36:17
  582. Adam: So let's talk about finance for a second, that was one of the other
  583. categories that you brought up. In the last couple years I've had my identity
  584. stolen twice without ever having lost my credit card, so that implies to me that
  585. there's a minor security problem with transactions that happen on the internet.
  586. Would personal clouds have saved me from all of that headache, or are they just
  587. a replacement but they can't really fix some of these problems?
  588. 36:40
  589. John: I would say it depends on what actually occurred. There's a possibility
  590. that you were man-in-the-middled, that's kind of a very sophisticated attack that
  591. is usually very targeted because they're trying your browser...
  592. 36:51
  593. Adam: I'm pretty boring, and I was even more boring when I was doing this,
  594. I've got to think it was something--
  595. John: It had to have been something more like on the other, server side.
  596. Adam: How would it work?
  597. 37:00
  598. John: Okay so it depends on what you're actually doing and what the whole
  599. financial ecosystem looks like, because you know if we're talking about Bitcoin
  600. exchanges there's like no personal identity information to steal, if you're
  601. talking about lines of credit, banking, and things like that...
  602. Adam: So then let me ask a better question, what I'm saying is I want to go to
  603. amazon.com and I want to place an order for something and buy it and I don't
  604. want to store my credit card with them because I'd rather not have my credit
  605. card on file with them, or pick someone less reputable than Amazon.
  606. 37:31
  607. John: Sure, sure, let's even go with the Amazon example, because Amazon is a
  608. company that stores people's credit card and debit card information so that
  609. they can do like one-click shopping, stuff like that. Instead of Amazon storing
  610. your credit card details, your credit card would be in your personal cloud and
  611. you have a link contract with Amazon, which says that when I'm signed in you
  612. have access to this information, and only when I'm signed in.
  613. 37:59
  614. They don't take that information and store it while you're signed in, they just
  615. have access to it so when you click that one-click shopping thing, order here,
  616. just real quick your credit card is processed, your address information is sent
  617. to the merchant, as soon as you log out that link is closed.
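
[A minimal sketch of the session-scoped access described in the Amazon example: the merchant can read the granted items only while you are signed in, and nothing is retained after logout. The class and item names are hypothetical.]

    class PersonalCloud:
        def __init__(self, data):
            self._data = data
            self._signed_in = False

        def sign_in(self):
            self._signed_in = True

        def sign_out(self):
            self._signed_in = False   # the link closes; nothing was copied out

        def read(self, grantee, item, granted_items):
            """Give a merchant a momentary read, never a stored copy."""
            if self._signed_in and item in granted_items:
                return self._data[item]
            raise PermissionError(f"{grantee} may not read {item} right now")

    cloud = PersonalCloud({"credit_card": "4111 ...", "address": "123 Creek Rd"})
    amazon_grant = {"credit_card", "address"}   # items named in the link contract

    cloud.sign_in()
    print(cloud.read("amazon", "credit_card", amazon_grant))  # works while signed in
    cloud.sign_out()
    # cloud.read("amazon", "credit_card", amazon_grant)  # now raises PermissionError
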
  618. 38:19
  619. They no longer have the credit card number or anything; in fact they don't
  620. even need the number because they're not a credit card processing company. They
  621. probably have a third party for that.
  622. 38:28
  623. Adam: And then it's over, that's what they needed it for.
  624. John: And then it's over, that's all you need it for, exactly. Now credit cards
  625. themselves, you know like the whole credit card network is clear text, I don't
  626. know if anyone knows that but when you run your credit card all of the
  627. information that's going back to the credit card company, it's not encrypted,
  628. it's clear text, that's what cryptocurrencies are competing with right now. So
  629. if you want to use that by all means, just know what you're allowing yourself to
  630. get into.
  631. 38:57
  632. You know when you bring something like cryptocurrencies into the picture then
  633. the risk of fraud becomes even less, because the private keys could
  634. theoretically be stored in your personal cloud and only when you are logged
  635. into your personal cloud could those private keys even be accessed, from there
  636. you can you know send money to people and stuff like that but then you're not
  637. even giving up any information whatsoever, it's peer-to-peer, so there's no
  638. third party that you even need to think about trusting with your personal data.
  639. 39:29
  640. So that is exciting to me. The social networking stuff is really cool, being
  641. able to have my health care information downloading straight from my personal
  642. cloud, that's all really cool, but what I think is really powerful and really
  643. exciting is that this is going to enable a true peer-to-peer marketplace to
  644. emerge, where I don't even need amazon.
  645. 39:47
  646. Amazon's just another walled garden. Instead the merchants are going to post
  647. things to their personal clouds, make them public, and then I search for those
  648. things in my personal cloud, and the personal cloud service does a dictionary
  649. discovery lookup through the whole semantic data graph I was talking about
  650. earlier, and finds all of the things that match my search query and serves them
  651. to me and then I can narrow it down even further to say, you know I don't just
  652. want a dresser, I want a black dresser. I want a black dresser with six drawers.
  653. And then the search continues and continues till it gets down to exactly what I
  654. want.
  655. 40:24
  656. I buy it right there from the merchant, me and them, peer-to-peer.
  657. 40:29
  658. Adam: Let's say that we get to this post-Amazon world where Amazon is no longer
  659. required because you have these personal clouds both you personally and let's
  660. say me as a vendor, what does that world look like, how do we find each other,
  661. are we negotiating...
  662. 40:52
  663. Are we literally negotiating, you know, do we have a contract in
  664. advance so that you're able to look at my store, or am I just putting it out
  665. there, how does that work?
  666. 41:02
  667. John: You know, most merchants on the internet, they have a public-facing
  668. website. I mean there are some private organizations where you need to
  669. have a membership to have access to their storefront, but for the most part on
  670. the internet and in the physical world, merchants' doors are open to any
  671. customer who's willing to come in and take a look around. Similarly, on the
  672. personal cloud network merchants will just have all of their product postings be
  673. public, and you know I, as a customer, can do searches as I described earlier to
  674. find exactly what I'm looking for, and then those merchants are competing with
  675. every other merchant who's offering what I'm looking for.
  676. Adam: So geography comes into play here for physical objects, so it'll be
  677. easy or automatic to filter if you're looking for something physical that you
  678. know you're only looking within a certain mile radius, or shipping
  679. cost radius, like how would you define terms like that?
  680. 42:00
  681. John: Yeah I think the search that you are conducting to find these things can
  682. be that granular, just like Amazon, their sidebar shows you know categories and
  683. then price ranges and you know, things like that, you'll be able to categorize
  684. these things by the best price, closest location, and the personal cloud is
  685. going to be able to serve that information to you because it knows where you're
  686. at if you tell it where you're at, it knows where the merchant's at because the
  687. merchant has uploaded their location information and you know their shipping
  688. costs and the product costs are going to be transparent, so you'll be able to
  689. organize your search results based on whatever criteria you're looking for. Maybe you
  690. don't care about price, maybe you care about high quality so you look for the
  691. best-rated item, or something like that.
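
[A minimal sketch of the granular search and sorting being described, over public listings discovered through the shared data graph; the listing fields and data are made up.]

    # Hypothetical public listings pulled from merchants' personal clouds.
    LISTINGS = [
        {"item": "dresser", "color": "black", "drawers": 6, "price": 140,
         "distance_miles": 12, "rating": 4.7},
        {"item": "dresser", "color": "black", "drawers": 4, "price": 90,
         "distance_miles": 40, "rating": 4.1},
        {"item": "dresser", "color": "oak", "drawers": 6, "price": 120,
         "distance_miles": 5, "rating": 4.9},
    ]

    def search(listings, sort_by="price", **criteria):
        """Narrow listings by exact-match criteria, then sort by one field."""
        matches = [l for l in listings
                   if all(l.get(k) == v for k, v in criteria.items())]
        return sorted(matches, key=lambda l: l[sort_by])

    # "Not just a dresser: a black dresser with six drawers," cheapest first.
    for hit in search(LISTINGS, sort_by="price", item="dresser",
                      color="black", drawers=6):
        print(hit)
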
  692. 42:54
  693. Adam: Whenever I'm on my tablet, I'll have something come up and it'll say
  694. such-and-such application would like to know your location. Is that the sort of
  695. thing that could also be stored in the personal cloud? Because I'm really
  696. hesitant to give a lot of publishers that sort of information about me, I
  697. don't really want them to know where I am and that's not something that's
  698. important to me.
  699. 43:12
  700. But there are some things like map applications for example where it would be
  701. useful if in fact I did give it access, so sometimes you just say okay well
  702. screw the privacy issues I guess I might as well do this, can personal clouds
  703. help in that situation too?
  704. 43:26
  705. John: Yes. If you do choose to give a third party your location, your link
  706. contract will be governing exactly what they can do with that information, and if
  707. they break that contract then it's just like breaking a legal contract where you
  708. know there are repercussions. Now it's not a legal contract in the
  709. sense that you're not going to take them to court; these things will all be
  710. handled within the trust framework that the company has agreed to, essentially,
  711. and these things will be handled by social pressures within the trust framework.
  712. 43:56
  713. If they abuse their privileges of getting certain access to data they might get
  714. locked out of the trust framework, and then no longer will people who agree to
  715. that trust framework ever do business with that entity ever again.
  716. Adam: So they have to maintain a good reputation otherwise it can potentially
  717. endanger their relationship with the entire network.
  718. 44:16
  719. John: Exactly. Exactly, because this is a peer-to-peer environment, if I trust
  720. organization A, who also trusts organization B, who I have no relationship with
  721. right now, then by extension I can trust organization B because we're all kind
  722. of agreeing to the same trust framework. If that trust framework is broken by
  723. organization B, then organization A has the power to cut off that relationship
  724. entirely and then everyone else who would have had contact, some sort of
  725. connection with organization B, no longer does. They're now like a quote-unquote
  726. stranger, they're not a friend of a friend, they're a stranger, and so they have
  727. to kind of start from the ground up to build up a good reputation, or find
  728. another trust framework provider who is willing to extend them the benefits of
  729. access to their trust network. And these trust networks themselves will be
  730. federated, so that the trust framework providers work with each other to kind of
  731. prevent bad actors from just being able to jump from one to another to another
  732. to another. Very quickly it becomes a very accountable scenario where everybody
  733. in the network is accountable to basically everyone else.
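
[A minimal sketch of the friend-of-a-friend trust and revocation idea, using a toy graph and simple reachability; real trust frameworks are legal and social agreements, not just a graph lookup.]

    # An edge means "directly party to the same trust framework with".
    TRUST = {
        "me":    {"org_a"},
        "org_a": {"me", "org_b"},
        "org_b": {"org_a"},
    }

    def trusted(start, target, graph):
        """Breadth-first search: is `target` reachable from `start`?"""
        seen, queue = {start}, [start]
        while queue:
            node = queue.pop(0)
            if node == target:
                return True
            for peer in graph.get(node, set()) - seen:
                seen.add(peer)
                queue.append(peer)
        return False

    print(trusted("me", "org_b", TRUST))    # True: a friend of a friend

    # org_a cuts off org_b for breaking the framework...
    TRUST["org_a"].discard("org_b")
    TRUST["org_b"].discard("org_a")
    print(trusted("me", "org_b", TRUST))    # False: org_b is now a stranger
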
  734. 45:45
  735. Announcement: Why do you listen to Lets Talk Bitcoin? We'd really like to know.
  736. Are you a new user trying to learn the basics? Are you from the world of finance
  737. seeking clarity on investment opportunities? Are you an entrepreneur looking for
  738. opportunity in a world of confusion? Write and tell us your story.
  739. 46:21
  740. Announcement: Like the fortune teller says, may you live in strange times. And
  741. we certainly do. Do you have a project or passion that falls into what I loosely
  742. define as "technology or philosophy that can change everything"? We want to
  743. hear from you. [email protected]
  744. 46:43
  745. Adam: In theory it seems like the primary weakness of this is that if someone
  746. ever gets your key, gets your ability to access your personal cloud, then
  747. it's just as bad as if someone gets your Bitcoin keys, gets your Bitcoin private
  748. keys and basically they can do anything that you can do, which in this case just
  749. as you have full control, so does anyone who compromises your personal cloud. Is
  750. that a concern?
  751. 47:07
  752. John: That is a concern for sure, that's a concern in any sort of environment
  753. where you need to authenticate yourself, and authentication is provided
  754. remotely, where you know it's not like YOU, the physical individual, walking up
  755. to a window and saying "hi, I want access to this stuff" but instead it's like
  756. this abstract version of you going over internet lines to then authenticate with
  757. some cryptic string of letters numbers and symbols.
  758. 47:40
  759. At the start I'm sure that the authentication will just be like, two-factor by
  760. default, almost assuredly. From there, authentication technology itself is
  761. getting really interesting with biometrics. I've seen brain wave authentication,
  762. I mean we could end up with something where we're looking through a heads-up
  763. display, not like Google Glass, Google Glass is very primitive, but something
  764. even more advanced where we have a full field of vision where we're just seeing
  765. the graphical user interface in our field of vision, with a little node that's
  766. pressing against our temple, and then we authenticate by thinking of a certain
  767. thing while blinking three times or something like that, you know I mean the way
  768. the technology is going I think that the market will definitely find a way to
  769. make authenticating into your personal cloud very difficult for somebody to
  770. break, but very easy for you yourself to get in to.
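
[A minimal sketch of the two-factor authentication mentioned here: a standard time-based one-time password (RFC 6238) in plain Python. This is just the common second factor in general; it is not specific to any personal cloud provider.]

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32, interval=30, digits=6, at=None):
        """Time-based one-time password (RFC 6238, HMAC-SHA1)."""
        key = base64.b32decode(secret_b32)
        counter = int((at or time.time()) // interval)
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                        # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    # The provider and your phone share this secret once, at setup time.
    SECRET = "JBSWY3DPEHPK3PXP"
    print(totp(SECRET))   # the 6-digit code an authenticator app would show
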
  771. 48:40
  772. Much like right now, let's take this whole thing offline, pre-internet days.
  773. All this same information exists, your financial data, your health data, your
  774. relationships and all that still exist but in analog form. A lot of this stuff
  775. is stored in vaults. Can vaults be broken? Yeah. They can be broken. Someone
  776. can install a hidden camera and watch you put in the combination, or somebody
  777. could just rubber hose cryptography-style you know just beat it out of you, any
  778. number of different kinds of attacks could be launched to compromise your
  779. personal data even when it's stored in an analog fashion.
  780. 49:16
  781. Adam: It's just a difference of accessibility really, because the difference
  782. here almost entirely is just that in the case of analog data someone actually
  783. has to be there at the vault to do it, and from the cloud side of data
  784. they just have to be on the internet, they just have to be connected, and so
  785. that's I guess the thing, is that there's a much larger pool of people who
  786. potentially can go after information, but I think that you're right, that's not
  787. something that's really restricted to personal clouds, that's just, and again
  788. it's about centralization to a certain extent, it's because you've got all this
  789. information there that makes it more valuable than it is spread out all over
  790. everywhere.
  791. John: That is correct, and that's why the barrier to getting at the information
  792. should be that much greater for somebody who doesn't know the private keys,
  793. you know, like personal cloud providers would perhaps require that your
  794. password have one capital letter, one special character and one number or something
  795. like that--
  796. Adam: Yeah, and be like eleven characters long
  797. John: and be like eleven characters long.
  798. 50:17
  799. And you know say, this isn't just Facebook this is like your life, this is the
  800. only password you're ever going to need to remember ever again, but it better be
  801. damn good. Instead of just being twelve characters it should be a line from your
  802. favorite movie that you'll never forget, or something like that...
  803. Adam: But not that because that's human readable and so if it's human readable
  804. then it's brute-forcible, but I understand what you're saying.
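
[A rough back-of-the-envelope comparison of the two approaches being debated: a random 12-character password versus a long human-readable passphrase. The numbers are simplified assumptions, not a security analysis.]

    import math

    def entropy_bits(alphabet_size, length):
        """Entropy of a uniformly random string: length * log2(alphabet size)."""
        return length * math.log2(alphabet_size)

    # A random 12-character password over ~95 printable ASCII characters.
    print(round(entropy_bits(95, 12)))       # about 79 bits

    # A passphrase modeled (very roughly) as 8 words from a 20,000-word
    # vocabulary, assuming the words really were chosen at random.
    print(round(entropy_bits(20_000, 8)))    # about 114 bits

    # Adam's caveat: a real movie quote isn't random, so its effective entropy
    # is far lower than this naive estimate, and lists of famous lines make
    # dictionary attacks easy.
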
  805. 50:40
  806. John: And then when you add in something like what I was talking about earlier
  807. like brain wave authentication then you don't even need to remember passwords,
  808. it's literally just you and nobody could possibly replicate that unless again
  809. they found you and held a gun to your head and said "authenticate and send me
  810. all your everything," you know and people can do that now with analog data so
  811. it's like...
  812. Adam: Make them an offer they can't refuse, so to speak
  813. John: Exactly, and so I don't see... some kinds of new threat models emerge, but when
  814. it comes down to it I don't think it's a show stopper, I don't think it's going
  815. to get in the way of this technology becoming that next evolution from the
  816. personal computer to the personal cloud that I foresee it being.
  817. 51:23
  818. Adam: So we started this conversation talking about trust, talking about how in
  819. a system where you're meeting new people and you want to do business it's really
  820. difficult to do anything when there's no trust in the system, and so we have
  821. systems like Bitcoin that come around, look at this problem and say okay, the
  822. solution is to simply remove all the trust from the system, to not trust anybody
  823. and to make it entirely about what is real and what is not, so with Bitcoin the
  824. analogue here is ownership, is if you have a Bitcoin it's not like I owe you a
  825. Bitcoin and so then you have to rely on me to give that to you, if you have it
  826. then you have it, if you don't then you don't, it's very straightforward either
  827. on or off, no middle ground.
  828. John: And I would actually even amend that to say it's not ownership, it's
  829. control. You don't own your Bitcoins, don't pretend you do, you only control the
  830. private keys, and that control could easily be lost if your system--
  831. Adam: But isn't that control ownership? I mean, what defines ownership?
  832. 52:17
  833. John: I mean, it's semantics, really. It's semantics more than anything. Control
  834. is the ability to manipulate whatever it is that we're talking about
  835. Adam: Okay, sure, I'll buy that
836. John: And I would even go so far as to say exclusive ability at that moment in
837. time. Multiple parties can be wrestling for control of certain things, and with
838. Bitcoin it's like a coin flip: as you said, it's very binary, you either do or
839. you don't. You know, when your private keys are compromised, the Bitcoins are
840. leaving your account immediately and you'll never see them again, unless you can
841. find the wallet they moved to and get those private keys, and, you know...
  842. Adam: A little bit of Spy Vs. Spy.
  843. 52:56
  844. John: I mean it was just a semantic thing I wanted to throw out there, just for
  845. the listener to maybe ponder a little bit.
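For readers who want John's "control, not ownership" point in concrete terms: whoever
holds the private key can produce a valid signature over a transaction, and anyone can
verify it without the key. The sketch below uses the third-party Python ecdsa package
with the secp256k1 curve purely as an illustration; real wallet software builds
transactions very differently.

    # pip install ecdsa
    from ecdsa import SigningKey, SECP256k1, BadSignatureError

    signing_key = SigningKey.generate(curve=SECP256k1)   # the private key: control
    verifying_key = signing_key.get_verifying_key()      # the public key: anyone can hold this

    tx = b"send 1 BTC from my address to yours"          # stand-in for a transaction
    signature = signing_key.sign(tx)                     # only the key-holder can do this
    print(verifying_key.verify(signature, tx))           # True: anyone can check it

    try:
        # Without the key, an attacker can only offer junk bytes, and it is rejected.
        verifying_key.verify(b"\x00" * len(signature), tx)
    except BadSignatureError:
        print("forged signature rejected -- no key, no control")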
846. Adam: Well, I think that's a well-made point, you know, but trust is hard. Trust,
847. you know, another analogue to it beyond control is faith. Because it means that I
848. have faith in you, if we're trusting one another, that you will follow through
849. on what you're saying, when you have nothing that actually mandates that you do
850. it. The solution that you advocate is reputation, and reputation that isn't
851. necessarily tied to your social security number, or tied to exactly who you are
852. in real life, but just tied to past actions. And I think it's a
853. really interesting idea to differentiate between trust that is built on
854. identity versus trust that is built on action, and I wanted to know what you
855. thought about that.
  856. 53:48
857. John: Well, we have to connect those two things, because how do you know that
858. the past actions over here were undertaken by the same person over here
859. that you're trying to trust? So there are identities involved, it's just not
860. necessarily identities as we think of them today.
861. Adam: Well, you said identities; I think that's kind of what I'm getting at here,
862. is that again we get back to this idea that it's about action rather than identity.
863. Action can lead to identity: if you're somebody who does something and that's
864. the thing that you do, you build an identity around those actions. It's not that
865. the identity necessarily has to come first; it's just, you're right, they do get
866. tied together.
867. John: This is true, and that's a good point that you make right there:
868. in reality, identity isn't just a piece of paper that says a name and a
869. social security number and an address, it is who you are and what you do. No
870. matter how many different pseudonyms we have on the internet, we are seeing one
871. ego-consciousness undertaking actions in many different venues, in real life and
872. on the internet.
  873. 54:51
874. Personal clouds, the way that they're going to enable this full granular control
875. of identity, are going to allow people to build a reputation around a
876. singular, what I'll call a cryptographic identity. Because ultimately that's what
877. this whole thing is built off of: cryptographic authentication.
878. So you'll be able to have a reputation rating that's built around this
879. cryptographic identity, and then the face of that identity can vary depending on
880. what context you are entering into.
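One hypothetical way to get "one cryptographic identity, many faces" is to keep a
single master secret and deterministically derive a separate pseudonymous identifier
per context, so the contexts cannot be linked without that secret. The sketch below is
an illustrative construction using HMAC in Python, not the actual mechanism of the
Respect Network or any personal cloud platform.

    import hashlib, hmac, os

    master_secret = os.urandom(32)   # the single cryptographic identity

    def face_for_context(context):
        # Derive a stable, context-specific pseudonym from the master secret.
        # The same context always yields the same identifier, so reputation can
        # accrue to it, while different contexts stay unlinkable to outsiders.
        return hmac.new(master_secret, context.encode(), hashlib.sha256).hexdigest()[:16]

    print(face_for_context("coworkers"))
    print(face_for_context("family"))
    print(face_for_context("best-friend"))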
  881. 55:22
882. So again, what your co-workers see might be different from what your family sees
883. or what your best friend sees, but your reputation score, which is associated with
884. your personal cloud, is the same no matter which of these things you're
885. facing. You might not show your reputation score to everybody, but you're not
886. going to really be able to change it. Depending on the reputation provider, of
887. course: if you controlled it yourself, then you could manipulate the numbers
888. directly and make the reputation ratings say whatever you wanted,
  889. 55:51
  890. Adam: So we get back to that verifiable thing.
  891. John: So we get back to that verifiable thing, where the whole personal cloud
  892. network is built off of this trust framework, basically you--
893. Adam: When you say trust framework, I think that another way to put that would be
894. that it's built on a set of rules, right, rules by which people are judged, by which
895. actions are judged, you know, because if you're generating a score you have to be
896. judging things, you have to be saying this is something that gives you more points,
897. versus this is something that does not give you more points.
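Adam's point that a score implies a rule set can be shown with a toy tally: a
reputation provider publishes weights for event types, and the score is the weighted
sum over someone's verified history. This is a hypothetical scoring scheme for
illustration only, not how TrustCloud, connect.me, or the Respect Network compute
their ratings.

    # Published rules: each judged event type maps to a point value.
    RULES = {
        "completed_trade": 5,        # kept their side of a deal
        "verified_endorsement": 2,   # vouched for by another member
        "broken_contract": -25,      # reneged on the trust framework
    }

    def reputation_score(history):
        # Weighted tally over (event_type, count) pairs; unknown events score zero.
        return sum(RULES.get(event, 0) * count for event, count in history)

    print(reputation_score([("completed_trade", 12),
                            ("verified_endorsement", 4),
                            ("broken_contract", 1)]))   # 60 + 8 - 25 = 43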
  898. 56:19
899. John: With the basic trust framework in place, where you contract with a personal
900. cloud service provider, they describe to you exactly what they're going to do
901. with your data, and if they ever break that contract then everybody else who's
902. party to that contract, being, you know, all of their other members, all of their
903. business partners, all these people are going to see that reneging of the
904. contract and respond to it, whether it's with ostracism, or demanding that you
905. be compensated, any number of different steps that can be taken to pressure this
906. organization to make the situation right.
  907. 57:02
908. If they refuse, then comes the ostracism, where they're just booted out of the
909. network and they have to create this reputation from scratch or try to find
910. somebody else who will trust them into a new network, right. But if it's known
911. that they have a reputation for deleting all of your data, or, you know, not
912. backing it up,
  913. 57:22
  914. Adam: being a bad actor
  915. John: being a bad actor, if they have a reputation for being a bad actor then
  916. they're either going to get trusted into a network and put on a very tight
917. leash, or they're just not going to be accepted at all. And so very quickly bad
  918. actors are kind of weeded out of the system.
  919. 57:41
920. Adam: And it's important here to note that this is possible because there's a
921. lot of competition in the space. That's the idea: all of this is
922. essentially an open, interoperable platform, so it's not like Facebook, where
923. Facebook does something wrong and you're like "oh, well, I'd like to quit
924. Facebook but where would I go?" You know, all of my data is there.
  925. 57:54
926. John: Exactly. So the competition part is very important, because without it,
927. monopolies and cartels can easily form. Because this is an open peer-to-peer
928. platform, why would I do business with a bad actor when I have all of these
929. really good candidates over here that I can deal with?
  930. 58:16
  931. People are going to have to watch their behavior. They really are going to have
  932. to work hard to build trust and then keep it.
  933. 58:26
934. Adam: Yeah, the keep-it part is really an interesting point, because like you
935. said, again going back to Facebook, I don't mean to harp on them so much,
936. Facebook didn't start off with this model, where you sell your customers'
937. data and monetize everything possible because you can't figure out how to make
938. money any other way. But once you've got the network effect there,
939. then there's all this built-in incentive to stay, because your data is there and
940. everybody else is there.
  941. 58:49
942. Because all these cloud providers are interoperable, it won't matter, you
943. know; I can talk to you on G+ while I'm on Facebook in this sort of situation.
944. Then the network effect is no longer applied on a company-by-company basis
945. or a provider-by-provider basis; instead it's almost just the entire cloud, the
946. idea of personal clouds or enterprise clouds, that is the thing that has the
947. potential to go viral, rather than any particular provider in the space.
  948. John: Exactly, and that's what's important to note about personal clouds, is
  949. that kind of like Skype or like Bitcoin, it's a viral technology where you need
  950. other people to be participants in the network for it to be tangibly valuable to
  951. you.
  952. Adam: Like a language
  953. John: Like a language, the concept might be like a million dollar idea in your
  954. head but if no one else is using it then it's not a million dollar idea, just
  955. like Bitcoin,
  956. 59:46
957. early adopters of Bitcoin saw something that was really awesome, that had a lot
958. of potential, but nobody was using it, so: I'm going to send ten thousand
959. Bitcoins to get a couple of pizzas.
  960. 59:58
961. Nowadays we look at that like, you're crazy, you're insane. That's because the
962. network is that much more valuable, and therefore Bitcoin itself is that much more
963. valuable. And so similarly, early adopters of personal clouds, it will be a
964. tight-knit community, we're all just kind of talking to each other, and
965. talking about how great personal clouds are going to be one day, trying to
966. onboard as many of our friends and family as possible. It's not going to be
967. anywhere near as much friction as trying to get people to start using Bitcoin,
968. because as soon as your friends are using it, it's valuable. You know, I don't
969. do business with my friends; using Bitcoin with my friends doesn't really, it's
970. like a novelty, it's like "hey, I can pay you back for lunch you got me the other
971. day", you know, stuff like that. But for the most part I'm doing business
972. with strangers, just people I trust because they have a good
973. reputation on the internet.
  974. 60:47
975. Which goes back to the reputation part. But you know, with personal clouds,
976. they're immediately valuable once you start having that tight-knit group; if just
977. my friends and family are using it, it's valuable to me.
  978. Adam: It's still valuable even at that low-level, local level
979. John: In fact, that is one of the reasons why it's most valuable. That's who
980. I'm having sensitive conversations with that I don't want Facebook's or Google's
981. employees reading in on.
  982. 61:11
983. Adam: Let's play a dangerous game here: tell me the future on this. How far out
984. are we from a normal person being able to go and sign up with a personal cloud
985. provider and get into this system?
  986. 61:20
987. John: I did an interview with Drummond Reed, who was the cofounder of the Respect
988. Network that I referenced earlier. Respect Network is a network of cloud service
989. providers who are all agreeing to the Respect Trust Framework, which governs how
990. data will be used.
  991. 61:40
992. Drummond said that beta testing will begin in the fall of this year. And ahead of
993. that beta testing, I am a member of the developer alpha testing of a personal cloud
994. platform, which was designed, I believe, by Newstar in collaboration with Project
995. Danube. They were signing people up at the Internet Identity Workshop, which I
996. attended earlier this year. You know, I signed up right there on the spot and I
997. played around with it a little bit; it's pretty cool.
  998. Adam: So not too far out
  999. John: But yeah, really not too far out. This isn't like the singularity, where
  1000. we have to wait for twenty years to see if it happens, this is something that's
  1001. going to be available to people on a commercial level by next year at the
  1002. absolute latest.
1003. Adam: So John Light, for people who want to get in touch with you or get
1004. involved in any of your projects, how can they find you?
  1005. 62:35
1006. John: They can reach me on Twitter, @lightcoin, or through my blog, www.p2pconnects.us
  1007. 62:47
  1008. Adam: Thanks for joining us on LetsTalkBitcoin today.
  1009. John: Thanks for having me on, Adam.
  1010. 63:03
1011. You've been listening to Episode 21 of LetsTalkBitcoin. If you liked, loved or
1012. hated the show, we want to know what you think. Please email all feedback to
  1013. 63:12
  1014. Thanks to John Light for being the exclusive content provider for this episode.
  1015. Music for this episode was provided by Jared Rubens, and Lucas AMKC.
  1016. 63:23
  1017. Stay tuned for Episode 22 of LetsTalkBitcoin, releasing Tuesday July 9th.
  1018. Thanks for listening.