What’s nice about Hover is it’s no bullshit. It’s a simple registrar with simple DNS service. And excellent support with questions answered by real, thinking humans. They’re not the fanciest registrar. They don’t offer all the TLDs in the world, their DNS services are limited, they’re not the cheapest. But they are simple and trustworthy. In a business as scammy as domain names it’s nice to buy service from someone decent.
I just had a terrific experience where I asked them why there’s no whois privacy offered on one of the new novelty TLDs. I’d seen a few domains registered there with hidden whois data but Hover wouldn’t do it for me. We went back and forth a few times and the support rep finally explained that the TLD’s policy didn’t allow for whois privacy, but that other registrars might do it anyway and that if I really wanted whois privacy I should use them instead. I appreciated the frank answer.
The Ubiquiti NanoStation loco 5M is good hardware. It’s specialty gear for setting up long distance wireless network links. All of Ubiquiti’s networking gear is worth knowing about if you’re a prosumer-type networking person. I will probably buy their wifi access points next time I need one.
I’m using two NanoStations as a wireless ethernet bridge. My Internet up in Grass Valley terminates 200’ from my house. I couldn’t run a cable but a hacky wireless thing I set up was sort of working. So I asked on Metafilter how to do a wireless solution right and got a clear consensus on using Ubiquiti equipment. $150 later and it works great! Kind of overkill; the firmware can do a lot more than just bridging and the radios are good for 5+ miles. But it’s reliable and good.
The key thing about Ubiquiti gear is the high quality radios and antennas. It just seems much more reliable than most consumer WiFi gear. Their airOS firmware is good too, it’s a bit complicated to set up but very capable and flexible. And in addition to normal 802.11n or 802.11ac they also have an optional proprietary TDMA protocol called airMax that’s designed for serving several long haul links from a single basestation. They’re mostly marketing to business customers but the equipment is sold retail and well documented for ordinary nerds to figure out.
I still wish I just had a simple wire but I’ve now made my peace with wireless networking. It works well with good gear in a noncongested environment. I wrote up some technical notes on modern wifi so I understood the details better. Starting with 802.11n and MIMO there was a significant improvement in wireless networking protocols, it’s really pretty amazing technology.
The CyberPower CP350SLG is a good small uninterruptible power supply. It’s only rated for 250W and it only has a few minutes of battery life. Not suitable for a big computer. But it’s perfect for backup power for network gear, like a router or a modem or the like. And it’s pretty small, just 7x4x3 inches. I made a mistake and bought APC’s small UPS first and the damn thing is ungrounded, which is ridiculous and dumb. I’ve had better luck with CyberPower UPSes anyway and this small one is exactly what I needed.
I’m a big fan of small UPSes. I don’t need something to carry me through a 30 minute power outage, I just want some backup that will keep my equipment running if the power drops for a couple of seconds. Because PG&E, you know? It’s a shame there’s no DC power standard, I bet you could make a DC-only UPS 1/4th the size with a lithium battery. But instead it’s all lead-acid batteries and producing 110V AC just to be transformed back to DC by all the equipment it’s powering. (That APC UPS does have powered USB ports, a small step towards DC UPS.)
Some day I should look into whole-house UPS units. A quick look suggests it’s about $2500 for 2.7kW, plus installation. This discussion suggests $10k is more realistic if you really mean a whole house.
Jupyter Notebooks (née IPython Notebooks) feel like an important technology to me. It’s a way to interactively build up a computer program, then save the output and share it with other people. You can see a sample notebook I made here or check out this gallery of fancy notebooks. It’s particularly popular with data scientists. If you’re an old Python fogey like me it’s kind of a new thing and it’s exciting and worth learning about. I’m focusing on Python here, but Jupyter is now language-agnostic and supports lots of languages like R, Go, Java, even C++.
The notebook is basically a REPL hosted in a web browser. You type a snippet of code into the web page, run it, and it shows you the output and saves it to the notebook. Because the output is in the browser it can display HTML and images; see cells 6 and 7 in my sample notebook. There’s excellent matplotlib support for quick data visualization and lots of fancier plugins for D3, Leaflet, etc if you need.
Notebooks are made to be shareable. My sample is basically a static snapshot of a program’s output, there’s no Python running when you view it on GitHub. But you can also download that file, run the Python code on your own computer, and modify it however you want. That makes it an incredibly useful pedagogical tool. There are complex notebooks you can download that are effectively whole college courses in computing topics. And you can keep your own notebooks around as documentation of the work you’ve done. It’s a very powerful tool.
Behind the scenes what’s going on is the browser window acts like an attachable debugger. There’s a headless IPython server process running somewhere and the browser connects to it. It’s easy to run IPython yourself on your own machine or there are other options including cloud hosted live notebooks. Most of the display magic works by having objects define their own _repr_html_ methods, that’s what lets Pandas show that nice HTML table in cell 6.
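That display hook is easy to use in your own code too. Here’s a minimal sketch; the `ColorSwatch` class is my own invented example, not part of any library:

```python
# Minimal sketch of IPython's rich display hook: any object can define
# a _repr_html_ method, and the notebook renders the returned HTML
# instead of the plain repr. ColorSwatch is just an illustration.
class ColorSwatch:
    def __init__(self, color):
        self.color = color

    def __repr__(self):
        # What a plain terminal REPL would show.
        return f"ColorSwatch({self.color!r})"

    def _repr_html_(self):
        # What the notebook shows: a colored box.
        return (f'<div style="background:{self.color};'
                'width:60px;height:20px"></div>')

# In a notebook cell, evaluating ColorSwatch("teal") draws the box;
# at a plain REPL you just get the __repr__ string.
```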
Installing and starting IPython is pretty simple: just install it with pip and run ipython notebook. You’ll also want to run %matplotlib inline to get inline plots. Notebooks seem particularly popular with data scientists; if you want a full Python data environment with Pandas, scikit-learn, etc then the Anaconda distribution is an easy way to get going.
Astound is a good ISP. I started getting Internet from them a few months ago, upgrading from a $50 6Mbps DSL link to a $70 100Mbps cable link. And it’s like I can see through time. The difference in usability is astonishing. Equally importantly, Astound has been entirely reliable and trouble-free.
The key thing is Astound is not Comcast. Comcast is an evil company with a long history of breaking TCP/IP in various ways that harm customers. Astound just provides pure, sweet, clean bits. Installation requires they bring their own coaxial from the pole to your house. They also offer phone and TV packages. The customer experience is a bit squirrely, I wouldn’t count on them for email hosting or tech support. But the basic Internet service is terrific.
I’d previously been a very happy Sonic DSL customer. They are also a terrific independent ISP with fantastic service. Unfortunately DSL is limited by the technology, the best they could deliver to my house is 12Mbps and that would have been significantly more expensive than Astound. Sonic is now working on fiber-to-the-house, including San Francisco, which should be terrific if they can do it.
We’re very lucky in SF to have a competitive ISP market. We have two DSL providers, two cable providers, and a surprisingly robust fixed wireless provider in MonkeyBrains. Most of the urban US only has two options and large parts of the rural US don’t even have that. The Sonic CEO’s 2011 blog post about broadband duopoly is fantastic background for how we got to have such crummy service in the US.
Ancestry.com is a good web site. It’s a tool for researching and maintaining family history, genealogy. It’s also a remarkably sophisticated database, data repository, and user interface with a lot of lessons for people who design webapps. I’m particularly fascinated that their target market is older people, your grandma who’s not so good with computers but has gotten interested in family history. But in no way is Ancestry dumbed down.
The web UI is great. The primary view is a visual family tree, a refocusable graph view that’s not much like a web page but works great in the browser. You then click through on a name to get to a person’s profile page that’s more like a normal document view. From there you do extra research, add information, etc.
The facts and sources tab on a person’s profile is my favorite part of Ancestry. They don’t just track a fact like “Born on 29 May 1917”, they also track the source of that fact, like “birth certificate” or “census record”. With a link right to a scan of the source document with the relevant information highlighted. Most people’s genealogy is full of bad data. (No, you’re probably not related to that 16th century king.) Ancestry provides a model for establishing the veracity of the data you record. Crowdsourced databases like OpenStreetMap and Wikipedia would benefit from more explicit attribution.
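That fact-plus-source pattern is worth stealing for any crowdsourced database. A rough sketch of the idea in Python; the class and field names here are mine, not anything from Ancestry:

```python
from dataclasses import dataclass, field

# Illustrative data model: every fact carries the sources that
# support it, and each source can point at a scan of the original
# document. Names and fields are invented for this sketch.
@dataclass
class Source:
    kind: str            # e.g. "birth certificate", "1920 census"
    scan_url: str = ""   # link to the digitized document, if any

@dataclass
class Fact:
    name: str            # e.g. "birth"
    value: str           # e.g. "29 May 1917"
    sources: list = field(default_factory=list)

    def is_sourced(self):
        # A fact with no sources is a claim, not evidence.
        return len(self.sources) > 0

birth = Fact("birth", "29 May 1917",
             [Source("birth certificate", "https://example.com/scan")])
```

The point is that the source is first-class data, not a footnote: the UI can then highlight exactly which facts rest on evidence and which are unverified.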
Ancestry is particularly useful because they have a fantastic collection of American genealogical records. The census records are the ones I use most frequently. Meticulously transcribed images of 100+ year old handwritten pages, completely searchable on fields like name, address, age, etc. They’ve collected all sorts of other data too: immigration records, social registers, railroad payrolls… All this diverse handwritten data, presented in a uniform computer search interface. They even proactively find hints for your family members for you to review and add to your data.
The app has some problems. Most of their data collections are only useful for researching Americans. Grassroots genealogists complain about Ancestry being too commercial and proprietary (see GEDCOM). Some people snark about the site being so grounded in Mormonism, although that criticism seems unfair to me. I’ve enjoyed doing a bit of family research in Ancestry. Mostly I’m impressed with the usability of the web app given how complex the data is.
IETF has an interesting new working group: TCPINC. “TCP extensions to provide unauthenticated encryption and integrity protection of TCP streams”. Practically what this means is “make it harder for third parties to eavesdrop on your Internet traffic”.
In theory IPsec was going to solve this problem for the Internet, but it is a failed technology. Right now the best we have is HTTPS for some websites. But wrapping every network protocol in an SSL layer is stupid, why not just encrypt the network? TCPINC is making a lot of compromises. “Unauthenticated” means they are punting on the harder half of the crypto problem and will leave users vulnerable to man in the middle attacks. It’s TCP only, and has to be NAT-compatible at that, so it won’t be a completely clean solution. But compared to the status quo of a lot of traffic not being encrypted at all, it’s a good choice. Making it a TCP extension should mean it can be deployed incrementally without a lot of pain.
There’s a few related draft specs already, such as draft-bittau-tcpinc-tcpcrypt-00.txt. tcpcrypt.org has more info as well. The mailing list archives go back to March 2014. The IAB just came out with a statement in favor of encryption, which is nice support.
Screenflick is good software. It captures full video with sound from your Mac desktop, full screen or a portion. I’m using it to record games I play. Could have all sorts of applications.
There’s a variety of screen capture options on the Mac from the free recorder included in Quicktime to the market leader ScreenFlow for $99. Screenflick’s only $29 and is very good at capture, including keystrokes, mouse events, and audio via Soundflower. I also appreciate its ability to downsample the raw video when recording. It also has an impressive variety of export options.
The big drawback is that Screenflick has no editor, not even a simple interface for cropping out sections of video. My theory is that’s what iMovie is for. But folks I know who produce a lot of screencasts appreciate that ScreenFlow is an integrated solution.
Unison is good software. It’s a command line program to synchronize filesystems, to keep a directory tree identical on multiple computers. I use it to sync about 40G of files across two Macs, to keep my home directory and source code and various applications in sync. The neat trick is I sync those two Macs through a portable hard drive so I don’t have to wait for hours for files to go over the Internet. Unison can also work online so changes are propagated automatically.
Unison is a lot like rsync. But Unison is designed to be bidirectional. Rsync always syncs one way: copy A to B. Unison will look at the differences between A and B and merge them, including a limited UI for conflict resolution. This protects me from the case where I modify something on both machines without syncing beforehand.
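The bidirectional idea can be sketched in a few lines. This toy version compares each replica against the state recorded at the last sync to decide which way to copy; real Unison is far more careful, with on-disk archives, content hashing, and safe partial transfers:

```python
# Toy three-way sync decision, the core idea behind bidirectional
# tools like Unison (this is an illustration, not Unison's actual
# algorithm). States are dicts mapping path -> content; a missing
# path means the file is absent on that replica.
def sync_actions(last, a, b):
    actions = {}
    for path in set(last) | set(a) | set(b):
        va, vb, v0 = a.get(path), b.get(path), last.get(path)
        if va == vb:
            continue                      # replicas already agree
        elif vb == v0:
            actions[path] = "copy A->B"   # only A changed since last sync
        elif va == v0:
            actions[path] = "copy B->A"   # only B changed since last sync
        else:
            actions[path] = "conflict"    # both changed: ask the user
    return actions
```

One-way rsync would blindly copy A over B and clobber the change made on B; the three-way comparison is what makes the merge safe.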
The main drawback with Unison is it’s slow, it takes many minutes to decide what files to sync. I also hate the interactive UI; it doesn’t work well when you have lots of files that changed in both places. I’m also a bit concerned that it’s no longer under active development but Unison is the rare software that’s a complete product, it’s not clear it needs many changes.
There are other tools solving similar file sync problems, none perfect. Dropbox is phenomenal but doesn’t have offline syncing of large files. Camlistore is promising but not quite ready for civilian use. git can be used to keep stuff in sync but is better suited for text files whose history you want to keep. And CrashPlan is great for online backup but doesn’t really provide a second live copy.
Gfycat (and CloudFlare) has a fantastic error page for when they have a server error.
Such a clear, simple statement of what the error is and what the user can do. One of my pet peeves is software that blames the user when it's not their fault, like the "your Internet is down" message Steam displays when their client can't connect to their server. This kind of message is much more honest and useful.
BTW, Gfycat is an awesome service. They host animated GIFs for sharing. And they transcode the bloated source GIF to much smaller HTML5 video, then serve the smaller file to browsers that can handle it. The hosting is good, the 95% bandwidth savings is great.