Interview With A Hex Dev: Part 1 — Operational Security

Vince C.
Feb 10, 2021 · 17 min read

On Dec. 3rd, 2020, software developer Kyle Bahr sat down with Kryptosparbuch to chat about Hex, in celebration of its 1-year anniversary. This will be a multi-part transcription of the interview so that Hexicans can enjoy it in written format and hopefully gain deeper insight into Hex. This first part contains a brief summary of how Kyle got involved with Hex, and then focuses on operational security. Wording has been edited for clarity and readability.

Kryptosparbuch: We have a special guest with us today. Kyle is one of the developers from Hex. Hi Kyle, how are you doing?

Kyle: Good! Good day, since we’re across time zones. I’m doing well, how are you both?

Kryptosparbuch: We are doing well, as well. Really excited to have you here. It’s about 6PM in Germany but I guess it’s like really early where you are right?

Kyle: About 10AM. For some people, particularly developers, that’s very early but I’ve been up for quite a while.

Kryptosparbuch: Nice, I think we have a lot of people here in the chat who already know you. Of course you are really prominent in Hex, but also since it’s only a year old and has gotten so many more users in the last month, maybe you could introduce yourself a little bit and tell us what you do, and what your part is in the Hex community?

Kyle: Sure. I may go back a little further than that just to talk about getting into crypto and how I came across Hex to begin with. So as you mentioned, I’ve been a prominent member of the community and that’s actually been completely organic.

Hex was in development for at least a couple of years before it launched. It went through a whole bunch of devs before it finally found the right person to put everything together, and I came across the project through a YouTube person and podcaster named Chris Coney. He talked about the project as being an interesting, alternate take on crypto. The way that he framed it was:

“It’s store-of-value for its own sake, it doesn’t make any claims about trying to revolutionize the world or do peer-to-peer, or do anything better, it just eschews all of that high-minded stuff that a lot of other cryptocurrencies espouse and just goes after store-of-value price-performance.”

I thought that honesty was very refreshing. I came into crypto for unrelated reasons in 2016–2017, when all the hype around ICOs, new coins and utility tokens was at its peak. I got to see hundreds of millions of dollars just evaporate into thin air based on these promises, and then to hear this pitch that, “Yeah, we’re not doing any of that. Just store of value, just real digital gold and programmable money” made me think: “Okay, this is interesting. I’ll read about this.”

When I came into the Telegram chat, I didn’t know of Richard (Heart) from anything prior. I saw the picture with the top hat, and that was my first introduction to Richard Heart. I read through what the project was and what it was supposed to do, then read through the contract and found it interesting enough that I thought to myself:

“Sure, I would put a little bit of money into this, and since I’m going to do that, I might as well write code in another language to verify that it does what it says it will do and see if I can figure out an optimal strategy.”

Through that process, I was able to engage with the community and answer questions just as a layperson, not working on the project, and not really as anyone special except that I could read the code and confirm or disconfirm that it would do this or wouldn’t do that, and just educate people. There were a bunch of people in the chat there before me who were playing the same role as power-users in the community. They just had a better understanding or longer history with the project, and we discovered collectively that there were a couple of things that were not working as-intended with the contract as written.

Most people don’t know about this, but there were versions of the contract on GitHub for community review in February or March of 2019. Using those, myself and a few other members of the community picked at it and ran simulations. One guy was using Python code to simulate outcomes, I had a different thing in C#, and we started a discussion about how to make the contract better and fulfill the promises that were made about what it ought to do. I realize I said promises, and that’s not referring to how the price performs, but that it actually enforces the rules that it says it’s going to enforce.

I just stayed engaged, basically, and just kept picking at the problem and kind of working on it through the spring, summer and into the fall. I was involved in all the audit processes and helping the auditors understand what the contract was supposed to do so that they could properly audit it and check the behavior themselves. I helped liaise between the community, Richard, the auditors and a whole bunch of devs that came and went, bringing people up to speed, and writing documents.

There’s a whole bunch of documents on Google Docs that I wrote that are featured on the website and also there’s links all over the place on how to use Hex, what it is, and how it works. Then the project launched and I’ve just continued answering questions and helping out as much as I can, and here we are.

Kryptosparbuch: Nice! When you said in the introduction you help people verify that the code could do certain things and not do other things, do you remember what those things were? Could you explain to the people what some of the concerns were?

Kyle: Sure, I tend to go into too much detail, so I’m going to try not to do that. The rules and the particulars of how Hex works have changed from the original concept over the course of a couple years of development, and finally are in the form that they are now.

There were a whole bunch of things that were different. For some time there was no Adoption Amplifier; that was a new thing as of Spring 2019. The way that BigPayDay worked was quite different: at first it was paid out daily, a chunk a day, rather than having a single BigPayDay at the end. There was no share rate at the beginning, and there was a dependency on a particular behavior that was not realistic.

The payout used to be weekly rather than daily, so there’s a whole bunch of things that are different now than before. People would ask what happens with partial weeks, which isn’t a thing anymore now that we do daily payout calculations. When the BPD was paid out daily, people had questions about how that worked and whether or not you get into some kind of degenerate state (turns out you do), which is why it’s a single BigPayDay instead of being paid throughout the launch phase.

There were questions about the Adoption Amplifier. There was some back and forth about how referrals for the AA worked. I don’t know that anyone actually did this, but it came up:

“How could you give credit to multiple referrers for an AA day?” That was actually possible, where you could use a ref-link, put in 1 ETH, then click another ref-link, put in 1 ETH, and both of those people get their half of your referral, essentially.

There was some questioning about how to do Bitcoin claims, though funnily enough there was almost no chatter about that post-launch. It was also heavily scrutinized during the audit but never came up in the public eye. There were questions about a lot of things for the first 6–8 months. Once the rules all settled, the questions were actually exactly the same questions we still get today:

  • How do I know that I’m not getting screwed by committing longer?
  • How does emergency unstaking work?
  • How do the penalties get put into the pool?

These are questions that I think will be familiar to everybody.

Kryptosparbuch: Those are the questions we’re going to talk about later in the stream as well. You are a software developer, right? So you are able to read the code and say, “Okay, Hex is a safe product”, and all that. We Germans are all about security; in our chat we talk about that a lot so this will be the first section that we’re going to talk about.

What do you think are the best security practices for Hex? What can the user do to really secure their Hex and not get hacked?

Kyle: That’s a great question. I wish I had a better answer. I think the standard answer for a long time has been hardware wallets, but there have been hacks and phishing attacks even through hardware wallets. I don’t know that I can make a best-practice recommendation, but what I can say generally is: if you lose physical control of your keys or of your seed phrase, you’re screwed.

Even with a hardware wallet, if it’s a Trezor with a debug pin, for example, and somebody has your wallet for even a couple of hours, they can hit the debug pin and dump your private key. Never trust anything emailed to you; just default to not trusting, so if you get something from “Ledger Support”, it’s probably not really from them.

It’s a good question because I think it’s a moving target. Security is generally a moving target. There’s very little that we can do to protect people from themselves. People forget things, people lose things. There are mechanisms in the centralized world and in the fiat world that you can establish trust through, such as a government ID. You can establish trust because the bank is a big building that you’ve been to. It’s very unlikely that it’s been swapped out with scammers overnight. The government can trust who you are because you have an ID. You have a passport.

Crypto doesn’t have any of that. Crypto is really an all-or-nothing endeavor. If you have the private key you get to spend the money. The end. If you have a system for yourself that lets you not rely on your memory, and you can be reasonably sure of the physical security of the thing that you’re trying to protect, that’s your best bet. Your trust should only be reserved for people that you know in real life, maybe a spouse or your children.
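That all-or-nothing property is easy to see in code. As a minimal sketch (not from the interview), assuming ethers.js v5 with a placeholder key, RPC URL and address: whoever holds the raw private key can reconstruct the account and spend from it, with no identity check anywhere in the protocol.

```typescript
import { ethers } from "ethers";

// Anyone who obtains this string controls the account. There is no ID,
// no bank branch, no recovery process built into the protocol.
const stolenOrBackedUpKey = "0x...32-byte private key...";       // placeholder

const provider = new ethers.providers.JsonRpcProvider("https://example-rpc.invalid"); // placeholder
const wallet = new ethers.Wallet(stolenOrBackedUpKey, provider);

async function spend(): Promise<void> {
  // The network only checks the signature, never who is holding the key.
  const tx = await wallet.sendTransaction({
    to: "0xAnyAddressTheKeyHolderChooses",                       // placeholder
    value: ethers.utils.parseEther("1.0"),
  });
  console.log("sent:", tx.hash);
}
```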

There’s really not a one-size-fits-all solution, and as technology gets more sophisticated, the weak link will always be people, whether through social-engineering attempts or just someone gaining physical access to whatever device, paper, or even steel-stamped card people use for seed phrases. It’s going to be hard to answer that for all time.

Kryptosparbuch: The first thing you mentioned was a hardware wallet. That is the most secure way to store your crypto. Of course you should never forget your password, your seed phrase. That’s the most important thing. Because you have to have a Metamask wallet in order to interact with the Hex smart contract, what is the difference between a hardware wallet and Metamask?

Kyle: That’s a great question. To back up a little bit, you actually don’t have to use Metamask. Somebody in the community implemented a direct integration to Ledger’s API.

Kryptosparbuch: Is that HardHex (https://apphex.win/ledger/)?

Kyle: Yeah, that actually does not use the Metamask API at all. It just uses your Ledger’s connection to your computer and talks to the device through its secure APIs. What makes a hardware wallet secure is that it is not online and you do not load software onto it. People in the Hex community have actually lost control of their keys because of software. If you have an internet-connected device and you install software on it, it can report your activity.

As an example, Mac laptops and computers report every app that you launch back to Apple, and I think that’s of concern to the privacy community because it’s becoming a required call that your computer makes every time you launch an app.

What some people did before the advent or popularization of hardware wallets was keep an air-gapped computer and manually type in the stuff they needed to sign on that computer. Then they would take the output of that and manually type it into an online computer, so the wallet computer remains offline and is never contaminated. You’re giving it inputs that you know about, and you’re taking its outputs and putting them out on the network.

A hardware wallet is basically a purpose-built device that tries to do that. It doesn’t go online; you just plug it in, send the thing you need to sign over an electronic bridge, and it spits out the signed thing that goes out to the network, without ever loading or sharing the key material.
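To make that flow concrete, here is a minimal sketch of the sign-offline, broadcast-online pattern Kyle describes, not anything from the interview itself. It assumes ethers.js v5, and the private key, recipient address, RPC URL, nonce and gas values are all placeholders:

```typescript
import { ethers } from "ethers";

// --- On the offline machine (or inside the hardware wallet) ---
// The private key never touches a networked device.
const offlineWallet = new ethers.Wallet("0x...private key kept offline...");

async function signOffline(): Promise<string> {
  const tx = {
    to: "0xRecipientAddressGoesHere",                        // placeholder
    value: ethers.utils.parseEther("0.1"),
    nonce: 7,                                                 // looked up beforehand on the online machine
    gasLimit: 21000,
    maxFeePerGas: ethers.utils.parseUnits("30", "gwei"),
    maxPriorityFeePerGas: ethers.utils.parseUnits("2", "gwei"),
    chainId: 1,
    type: 2,
  };
  // Produces a raw signed transaction string that can be carried to an online machine.
  return offlineWallet.signTransaction(tx);
}

// --- On the online machine ---
// Only the already-signed bytes are handled here; no key material is present.
async function broadcast(rawSignedTx: string): Promise<void> {
  const provider = new ethers.providers.JsonRpcProvider("https://example-rpc.invalid"); // placeholder RPC
  const response = await provider.sendTransaction(rawSignedTx);
  console.log("broadcast:", response.hash);
}
```

Only the signed bytes ever cross from the offline side to the online side, which is the same separation a hardware wallet enforces inside a single device.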

Metamask is different from both of those in that it is trusting your browser to sandbox what it does, meaning it’s in a little software walled garden, and it can temporarily load your private key to use for some software operation and then spit the output back over the wall.

That’s a trust relationship, so you’re trusting that the browser software is hardened enough that people can’t interrogate or steal secrets from data that’s in these little walled gardens.

There’s actually an open ticket with Metamask, because there is an unsafe way to use it. You can tell it to load your keys into memory, and once you do that, the browser needs access to Metamask’s memory. So they have this open bug saying you should never use that one Metamask API to load a key in plaintext, because then it becomes accessible to the browser, and malicious code on a web page could just read your key, for example.

So Metamask is kind of a dangerous tool, to be honest, unless you’re just using it for very standard things like going to known sites (like go.hex.com) and signing very specific transactions with the little pop-up prompt, but trying to use the Metamask APIs themselves is very dangerous. Don’t do that, and stick to very paved-path stuff that you know for sure works.
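As a sketch of what that paved path looks like in practice (my illustration, not from the interview), everything goes through MetaMask’s own confirmation pop-up via the standard EIP-1193 provider, and the page never sees any key material. The recipient address and value here are placeholders:

```typescript
// Assumes a browser context where MetaMask injects window.ethereum (EIP-1193).
declare const window: any;

async function sendViaPopup(): Promise<void> {
  // Ask the user to connect; MetaMask shows its own approval dialog.
  const [account] = await window.ethereum.request({ method: "eth_requestAccounts" });

  // Request a transaction; MetaMask pops up, and the user reviews and signs it
  // inside the extension. The page only ever sees the resulting transaction hash.
  const txHash = await window.ethereum.request({
    method: "eth_sendTransaction",
    params: [{
      from: account,
      to: "0xRecipientAddressGoesHere", // placeholder
      value: "0x16345785d8a0000",       // 0.1 ETH in wei, hex-encoded
    }],
  });
  console.log("submitted:", txHash);
}
```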

In fact, as a case in point, somebody asked me today about doing a claim similar to the (Hex) Bitcoin free-claim, where you sign an arbitrary message to prove that you control a key. He was asking if it was safe to do that for this other contract, and I told him, “I don’t have time to audit the contract, but what’s the message they want you to sign?” If it’s an actual text string like “I am claiming my coins”, that’s probably safe, but what he sent me was basically just a string of characters that looks like a transaction.

They’re asking you to sign a transaction to claim some free stuff, but that transaction could be anything. It could be “dump my ETH to some wallet”, so even with the standard flow of “hey, just sign this thing and everything will be alright”, where Metamask pops up and helpfully offers to go ahead and sign it, maybe you just gave all your money away and you don’t know it.
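The distinction Kyle is drawing is between a human-readable message you can actually review and an opaque hex blob that could encode anything. As a purely illustrative sketch (the helper, its heuristic and the 0.9 threshold are hypothetical, not anything MetaMask or Hex does), a cautious signer might refuse opaque data outright:

```typescript
import { ethers } from "ethers";

// Hypothetical heuristic: only sign things a human can actually read.
function looksHumanReadable(message: string): boolean {
  if (message.length === 0) return false;
  // Reject raw hex payloads outright; they could be a disguised transaction.
  if (/^0x[0-9a-fA-F]+$/.test(message)) return false;
  // Require the message to be mostly printable ASCII.
  const printable = [...message].filter((c) => c >= " " && c <= "~").length;
  return printable / message.length > 0.9;
}

async function cautiousSign(wallet: ethers.Wallet, message: string): Promise<string> {
  if (!looksHumanReadable(message)) {
    throw new Error("Refusing to sign opaque data");
  }
  // EIP-191 personal-message signature over plain text, e.g. "I am claiming my coins".
  return wallet.signMessage(message);
}
```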

So again, security is a moving target. Social engineering is almost always the most effective way to part someone from their money: either getting them to do some non-reversible transaction via this sort of sign prompt, getting them to reveal a seed, getting them to load their keys into their browser in a way that you can then scrape through JavaScript, it’s all much easier than attacking the software service.

Kryptosparbuch: I think it’s important to mention that the hardware wallet, Ledger for example, is safe. Metamask is also safe, and it’s the human that is operating it that can be attacked. With the Ledger, if you get a phishing email and you click on it, stuff can happen. In Metamask if you just sign a transaction that isn’t safe, you can get screwed.

One question I get really often: What about people who go into Metamask, stake their Hex (at go.hex.com), and do all that stuff but now want to integrate their stakes into a hardware wallet? Is that possible somehow?

Kyle: As far as I know it’s not possible, but I’m not totally familiar with everything you can do with a Ledger.

Kryptosparbuch: I had the same issue. I did my first stakes only on Metamask, and to increase my security, I took an old Ledger and put the Metamask key on this old Ledger. Then I deleted my Metamask account, so at least I have this key which was generated from Metamask a year ago now, in my hardware wallet. With this you can increase your security because now there is no online wallet with this private key in the background, right?

Kyle: Yeah, and if Ledger allows you to import a private key from Metamask, then it will work. The more common version of that question is:

“If I have stakes associated with an address (not a private key, a public address), can I migrate them to a different public address?”

And the answer is no.

Kryptosparbuch: We have different hardware wallets out there, some have Bluetooth, some don’t. Do you know if there’s a difference?

Kyle: I distrust Bluetooth, for a few reasons. Probably the most famous recent thing that I’ve seen published on Bluetooth is that the encryption protocol is very easy to attack, and a researcher proved it. The reason is that Bluetooth needs to be compatible with a whole host of implementations of the stack.

It allows you to respond to a request for an encryption handshake to establish a key, where you say:

“I want to use a key that is 1000 bits long” — a very difficult to hack encryption key — and the other device can say:

“I can’t handle 1000 bits”, and then it responds:

“Okay how about 900?”

“I can’t do 900, can you do 100?”

“I can’t do 100, can you do 8?”

“Yeah, I could do 8.”

And so, the attack is: you convince one device that it needs to use an extremely weak encryption key, and then you can easily brute-force the key and decrypt the communication. So I wouldn’t trust a hardware wallet that uses Bluetooth. Some other malicious device could convince it to use a very weak key, and then any subsequent interaction, supposedly encrypted across that Bluetooth connection and whose contents you as a user aren’t going to see, could compromise your device. So I wouldn’t mess with that.
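Here is a toy illustration of that downgrade negotiation; the key sizes come from Kyle’s example rather than the real Bluetooth specification (which negotiates key entropy in bytes), and the function is purely hypothetical:

```typescript
// A peer (or an attacker in the middle answering for it) keeps claiming it can
// only handle a smaller key, so the proposer walks its offer down to stay compatible.
function negotiateKeyBits(proposerMax: number, responderClaimedMax: number): number {
  let offer = proposerMax; // e.g. start by proposing 1000 bits
  while (offer > responderClaimedMax) {
    // Each rejection halves the offer until it fits what the responder claims to support.
    offer = Math.max(responderClaimedMax, Math.floor(offer / 2));
  }
  return offer; // both sides settle on this key length
}

console.log(negotiateKeyBits(1000, 1000)); // honest peer: 1000 bits
console.log(negotiateKeyBits(1000, 8));    // malicious peer: 8 bits, trivially brute-forceable
```

Once the agreed key is that short, brute-forcing it and decrypting everything sent over the link is cheap, which is why Kyle prefers wallets with no Bluetooth at all.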

The less connectivity the better.

That should be the rule of thumb: don’t trust anything where someone asks you for information, whether that’s phishing emails or even phone calls, and less connectivity is better.

Kryptosparbuch: We have a lot of people that bought the new hardware wallet from Ledger that has Bluetooth now, and maybe they feel a little insecure now. So I would say: just do not use the Bluetooth function. Don’t connect it via Bluetooth with your phone; just use it the old way, stick it into your computer and use a physical connection, right?

Kyle: Yeah, there are devices that are more security- or privacy-centric, and they tend to have hardware switches for these different connection mechanisms so that you can physically disable the Bluetooth chip. That way there’s no way for anyone to even send a message saying “hey, by the way, turn on Bluetooth”, because if it’s a software switch, meaning it’s just waiting for some electronic signal to turn on or off, there’s also been research done on how to do that: remotely turning on cars, Bluetooth radios, things like that.

So yeah, less connectivity is better. Hardware switches are always better than software switches.

Kryptosparbuch: This brings us a little bit to the mobile section. What do you think about the security on mobile phones? We have Metamask on mobile phones right now as well, right?

Kyle: Yeah, I’m not an expert. However, I used to work for Netflix and we had to do a lot with DRM (Digital Rights Management) keys. It’s not your personal wealth, but you know, movie studios care very much that DRM works, and that you can know things about the devices that play movies.

Android in particular is very inconsistent across versions, across implementations and across devices. I wouldn’t trust using an Android phone, personally. There’s an awful lot of research you would have to do on particular models of phones, implementations of Android, and implementations of the trusted hardware circuits that they put on those phones to do secure encryption.

Apple has a track record of being better about isolating and protecting data with new versions. There were versions that were very insecure, where they basically opened an internal software bridge through which you could ask for keys and encrypted material in plaintext. There were some attacks we had to deal with during my time at Netflix where we just didn’t trust certain versions of iOS.

Modern versions are much, much better, but then you are essentially trading security from hackers or scammers for giving all your data to Apple, so that’s your pick.

Kryptosparbuch: Yeah, I heard about that. So Apple has a single point of vulnerability, and Android has many phones with many different versions, and a hacker would have to implement many different hacks for all those phones. But with Apple it’s really easy: you just develop one hack or exploit, and then you’ve exploited all Apple phones, right?

Kyle: It depends. You’re right that the variety of Android implementations and devices can make that job harder. There’s this phenomenon that shows up a lot; it’s a case for diversity, in either a genetic sense or a defense sense: diversity creates a more complex attack surface and makes it much harder for highly-leverageable attacks to succeed. If you had to make one hack for every phone, say every phone was totally unique, then your hack is really only targeting one person at a time. But the issues that we’ve seen with Android are not necessarily device-specific.

Some of it is implementation-specific, so many devices could share one common implementation that is attackable. In particular, I think it was the L or M series of Android that had a known defect. It sounds much more complicated than it actually is.

Don’t share your information with people. That’s it; that’s the answer, and if you have to think about connectivity on your devices, less is better. It’s really simple. As I was saying, with Android there were a couple of known security modules that had an issue, so even if you don’t know all the ins-and-outs, you don’t have to make a hack that’s specific, such as “Nexus 2, Model 3, Android M”.

It’s not as specific as that. If you can find a common component that’s attackable and common across a bunch of phones, then you just have to make a hack for that component and you can get a pretty good blast radius.

For example, over the last several years there have been vulnerabilities in a common open source package that does secure internet connections. I don’t know if anyone would have heard of any of these, but they were discovered, and then it turned out that a huge number of companies used this open source package, because it’s very common.

There’s this reliance on using common software because you assume it’s been tested and battle-hardened, so you don’t have this huge amount of diversity in what software people use because they tend to coalesce around the things that everyone else is using. In the case of this software package, Netflix and a bunch of other large companies (including Amazon, who I think revealed the problem) basically all had to upgrade right away to this new patch version.

It’s not clear if any hacks had been committed, and so the process was:

“Okay do the upgrade, now let’s wait and see if there’s any evidence that anybody exploited the software package while it was vulnerable.”

So it’s kind of a double-edged sword. With Apple, if you find one hack it’ll work on all iPhones. There was one FaceTime bug where you could remotely enable somebody’s camera, and it just worked for all phones as of some software patch. That’s obviously a very high-leverage attack, because you figure it out once, and because Apple phones are so uniform it works for all of them.

Android and other software like that is more diverse so it’s harder to find a silver-bullet kind of attack, but if you know what you’re going after, which is security software or encryption software, then you really only need to attack that. That probably is quite common across phones, because of this phenomenon of people trying to consolidate on canonical versions of security software.

The assumption is that we harden the one package and then everyone can use it, so the whole operating system and the phone may be very diverse but there’s some core components that rely on coalescence around one particular version.

Part 2 of this transcribed interview, coming soon, will focus more specifically on Hex and its staking system, particularly the share system.

Original video interview: https://www.youtube.com/watch?v=RrFJzjoIwC0
