One of my first email addresses (in 1989) was tektronix!ogicse!reed!minar. I’m feeling old today and I’m guessing half my readers have never seen an email address like that. It was from the long long ago, in the time that was before the Internet, when UUCP was the main Unix mail system.
My unique email address was reed!minar. But there was no ubiquitous routing infrastructure for mail, no global addressing. Unix network email was store-and-forward based on scheduled phone calls and modem transfers via uucico. Each host only talked to a few other hosts. Reed talked to OGICSE regularly, so my address suggested mail be forwarded through there. Other mail hosts might or might not know how to get mail to OGI but they certainly knew how to get to Tektronix, so that sufficed as a global route. UUNET was a hub that knew how to talk to everyone; often addresses began with uunet!.
The essential idea is that a UUCP email address encoded not just the destination but the route to it. It’s a powerful idea. But modern Internet systems don’t do that. Instead we rely on global address lookup systems like DNS and global routing systems like BGP. (If anyone can think of a modern system that includes routes in names, please email me via SMTP.)
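To make the route-in-the-address idea concrete, here’s a minimal Python sketch (using my old address as the example) of how a bang path splits into relay hops plus a final user:

```python
def parse_bang_path(address):
    """Split a UUCP bang path into its relay hops and the final user.

    "tektronix!ogicse!reed!minar" means: hand the message to tektronix,
    which forwards to ogicse, which forwards to reed, where the user
    minar finally receives it. The route IS the address.
    """
    *hops, user = address.split("!")
    return hops, user

hops, user = parse_bang_path("tektronix!ogicse!reed!minar")
# hops is ["tektronix", "ogicse", "reed"]; user is "minar"
```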
UUCP users did build a routing system: pathalias. It relied on UUCP maps published to comp.mail.maps. Those maps were discontinued in December 2000. I haven’t found a modern view of this data; it’d be fascinating to see the history of the growth of UUCPnet. telehack has a usable snapshot of the data, try uumap reed for instance.
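pathalias actually computed weighted shortest paths over the published link maps; here’s a much-simplified sketch in that spirit, ignoring link costs and breadth-first searching a made-up link table (the table is invented, not real comp.mail.maps data):

```python
from collections import deque

# Hypothetical link map: which hosts exchange mail directly with which.
LINKS = {
    "uunet": ["tektronix", "decvax"],
    "tektronix": ["uunet", "ogicse"],
    "ogicse": ["tektronix", "reed"],
    "reed": ["ogicse"],
    "decvax": ["uunet"],
}

def route(src, dst):
    """Breadth-first search for a bang path from src to dst."""
    seen = {src: None}  # maps each host to the host we reached it from
    queue = deque([src])
    while queue:
        host = queue.popleft()
        if host == dst:
            path = []
            while host is not None:  # walk back to src
                path.append(host)
                host = seen[host]
            return "!".join(reversed(path))
        for neighbor in LINKS.get(host, []):
            if neighbor not in seen:
                seen[neighbor] = host
                queue.append(neighbor)
    return None  # no route known

# route("uunet", "reed") yields "uunet!tektronix!ogicse!reed";
# appending "!minar" would give a full bang address.
```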
The arrogant candidate says “I’m smart so I know my code is good”. That’s certainly a bad sign, although sometimes they’re right. Slightly wiser responses are “I run it and look closely” or “I trace the code and make sure it works like I expect”. Better, but too manual. The truly enlightened say “I have an automated test suite” and then you’re off to the real questions about how to test code properly.
I have a deep distrust of code. Software is organic, unpredictable, chaotically complex. It’s difficult enough to understand what the code you’re writing will do right now with expected inputs. But hostile inputs, or a weird environment, or the same code a year from now, or the slightly modified open source contribution in some fork somewhere? Forget it. That’s why automated tests are so valuable. They’re a way to demonstrate the code does what you expect it to.
Writing good tests is hard, almost as hard as writing good code. Modern environments have a lot of testing tools you should learn, from language unit test frameworks to mock objects for servers to fuzz testing to various continuous integration systems for functional tests. GitHub projects have the miracle that is Travis CI, free no-fuss continuous build and test for any open source project. It’s amazing.
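For readers who’ve never written one, here’s what the “automated test suite” answer looks like at its most minimal, using Python’s built-in unittest (the slugify function is invented for the example):

```python
import unittest

def slugify(title):
    """Turn a post title into a URL slug (the function under test)."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(slugify("Some Bits"), "some-bits")

    def test_messy_whitespace(self):
        # a mildly hostile input: leading, trailing, repeated spaces
        self.assertEqual(slugify("  Linkblog   by Nelson  "),
                         "linkblog-by-nelson")

# Run with: python -m unittest <this_file>
```

The point isn’t these two trivial checks; it’s that the checks run the same way every time, on every change, without anyone having to “look closely”.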
So until software correctness proofs become a real tool we can use in real production code, ask yourself how you know your code is going to work. If you’re honest, you probably don’t. But some testing will certainly help give you at least a little confidence.
Astound is a good ISP. I started getting Internet from them a few months ago, upgrading from a $50 6Mbps DSL link to a $70 100Mbps cable link. And it’s like I can see through time. The difference in usability is astonishing. Equally importantly, Astound has been entirely reliable and trouble-free.
The key thing is Astound is not Comcast. Comcast is an evil company with a long history of breaking TCP/IP in various ways that harm customers. Astound just provides pure, sweet, clean bits. Installation requires they bring their own coaxial from the pole to your house. They also offer phone and TV packages. The customer experience is a bit squirrelly; I wouldn’t count on them for email hosting or tech support. But the basic Internet service is terrific.
I’d previously been a very happy Sonic DSL customer. They are also a terrific independent ISP with fantastic service. Unfortunately DSL is limited by the technology; the best they could deliver to my house was 12Mbps, and that would have been significantly more expensive than Astound. Sonic is now working on fiber-to-the-house, including in San Francisco, which should be terrific if they can do it.
We’re very lucky in SF to have a competitive ISP market. We have two DSL providers, two cable providers, and a surprisingly robust fixed wireless provider in MonkeyBrains. Most of the urban US only has two options and large parts of the rural US don’t even have that. The Sonic CEO’s 2011 blog post about broadband duopoly is fantastic background for how we got to have such crummy service in the US.
Ancestry.com is a good web site. It’s a tool for researching and maintaining family history, genealogy. It’s also a remarkably sophisticated database, data repository, and user interface with a lot of lessons for people who design webapps. I’m particularly fascinated that their target market is older people, your grandma who’s not so good with computers but has gotten interested in family history. But in no way is Ancestry dumbed down.
The web UI is great. The primary view is a visual family tree, a refocusable graph view that’s not much like a web page but works great in the browser. You then click through on a name to get to a person’s profile page that’s more like a normal document view. From there you do extra research, add information, etc.
The facts and sources tab on a person’s profile is my favorite part of Ancestry. They don’t just track a fact like “Born on 29 May 1917”, they also track the source of that fact, like “birth certificate” or “census record”. With a link right to a scan of the source document with the relevant information highlighted. Most people’s genealogy is full of bad data. (No, you’re probably not related to that 16th century king.) Ancestry provides a model for establishing the veracity of the data you record. Crowdsourced databases like OpenStreetMap and Wikipedia would benefit from more explicit attribution.
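As a sketch of why that model is appealing, here’s one hypothetical way to represent a fact with its supporting sources (the class and field names are my invention, not Ancestry’s actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Source:
    """Where a fact came from: a record type and a link to the scan."""
    kind: str      # e.g. "birth certificate", "1920 US census"
    scan_url: str  # link to the digitized source document

@dataclass
class Fact:
    """A single assertion about a person, with its supporting sources."""
    statement: str                  # e.g. "Born on 29 May 1917"
    sources: list = field(default_factory=list)

    def is_sourced(self):
        # unsourced facts are how bad genealogy data spreads
        return len(self.sources) > 0

birth = Fact("Born on 29 May 1917",
             [Source("birth certificate", "https://example.com/scan/123")])
```

The design choice that matters is that the source travels with the fact, so an unsourced claim is visibly different from a documented one.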
Ancestry is particularly useful because they have a fantastic collection of American genealogical records. The census records are the ones I use most frequently: meticulously transcribed images of 100+ year old handwritten pages, completely searchable on fields like name, address, and age. They’ve collected all sorts of other data too: immigration records, social registers, railroad payrolls. All this diverse handwritten data, presented in a uniform computer search interface. They even proactively find hints about your family members for you to review and add to your data.
The app has some problems. Most of their data collections are only useful for researching Americans. Grassroots genealogists complain about Ancestry being too commercial and proprietary (see GEDCOM). Some people snark about the site being so grounded in Mormonism, although that criticism seems unfair to me. I’ve enjoyed doing a bit of family research in Ancestry. Mostly I’m impressed with the usability of the web app given how complex the data is.
A friend of mine is buying his first Mac, so here’s the list of essential software I made for him.
IETF has an interesting new working group: TCPINC. “TCP extensions to provide unauthenticated encryption and integrity protection of TCP streams”. Practically what this means is “make it harder for third parties to eavesdrop on your Internet traffic”.
In theory IPsec was going to solve this problem for the Internet, but it is a failed technology. Right now the best we have is HTTPS for some websites. But wrapping every network protocol in its own SSL layer is stupid; why not just encrypt the network? TCPINC is making a lot of compromises. “Unauthenticated” means they are punting on the harder half of the crypto problem and will leave users vulnerable to man-in-the-middle attacks. It’s TCP only, and has to be NAT-compatible at that, so it won’t be a complete, clean solution. But compared to the status quo of a lot of traffic not being encrypted at all, it’s a good choice. Making it a TCP extension should mean it can be deployed incrementally without a lot of pain.
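To illustrate the contrast: today every application protocol has to opt in and manage its own encrypted channel, as in this Python TLS sketch (host and port are placeholders). tcpcrypt-style encryption would instead protect any TCP connection transparently, with no application code at all.

```python
import socket
import ssl

# Each application must explicitly set up TLS itself; repeat this for
# every protocol you want protected (HTTP, SMTP, IMAP, ...).
context = ssl.create_default_context()

def open_encrypted(host, port):
    """Open a TLS-wrapped TCP connection (host/port are placeholders)."""
    raw = socket.create_connection((host, port))
    return context.wrap_socket(raw, server_hostname=host)

# Note: unlike TCPINC's unauthenticated encryption, TLS here also
# authenticates the server via certificates -- the "harder half" of the
# crypto problem the working group is deliberately punting on.
```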
There are a few related draft specs already, such as draft-bittau-tcpinc-tcpcrypt-00.txt. tcpcrypt.org has more info as well. The mailing list archives go back to March 2014. The IAB just came out with a statement in favor of encryption, which is nice support.
I’m a huge fan of OpenStreetMap but the organization is a mess. Last year I fished around thinking I should get deeply involved with OSM, it’d be a good use of my time. But I gave up on the idea because I didn’t like what I learned about the culture. I think OSM could grow to be as important and influential as Wikipedia. But not with the current trajectory.
The problem boils down to a question of scale and influence. OSM has accomplished a huge amount with very little. No full time staff, lots of borrowed server resources, annual budget of less than $200,000. Think what it could do with more! The impression I’ve got talking to the folks who make OSM work day to day is they’re perfectly happy with the current scale. The de facto leadership, the most active mappers, sysadmins, developers, don’t want a change. And there’s no single visionary leader to bring things forward.
There are related problems with OSM. There’s a strong anti-commercial bent which not only results in an awkward license but also an inability to engage with potential partners like Apple or MapBox. The community itself has some toxic elements; I gave up asking questions on the IRC channel after the seventh time someone implied my questions were dumb. And right now there’s a bunch of drama around elections for new leadership that indicates structural problems, years-old grievances getting aired ineffectively on mailing lists.
I don’t have a solution to get OSM to grow into the massive influence it could have. I worry there can’t be one, that culturally the active OSM members want to remain small and unsullied by commercial interests. I could say and do a lot more to try to help, but I don’t think it would get me anywhere.
The malware in question is Pando Media Booster. A few years ago this software was arguably useful, it allowed games like LoL to distribute patches via a peer-to-peer network. But Pando was discontinued in August 2013. Then in February 2014 someone used Pando to install malware on any suckers who still had the software. The software Riot is still distributing. And all of Riot’s customers who clicked “yes” on the update dialog had their browsers hijacked.
Riot has millions of users all over the world. I’m sympathetic to how hard it is to make software changes; they’re famously behind on a whole lot of development projects. But continuing to distribute malware to customers is unacceptable.
Update: a Riot employee said on Reddit that the problem was “the amount of work it takes to hand update new installers for every language” and offered the idea that the previous Pando owners might help them prevent the malware. That was five months ago.