Last weekend I gave a talk about TopoJSON at State of the Map US, the OpenStreetMap conference. TopoJSON is an extension of GeoJSON that encodes topology, enabling interesting visualizations and making for smaller files. The video of my talk is online, and you can also see my slides.
The talk is an overview of what TopoJSON is. I also compared the sizes of TopoJSON files to the same data in GeoJSON and found TopoJSON files are about 25–50% the size of the equivalent GeoJSON after gzip. That’s without any simplification, and with the GeoJSON coordinates rounded to a precision comparable to TopoJSON’s quantization. You get space savings even when there are no shared boundaries, although obviously you get more with shared arcs.
One of the most exciting talks at SotM US was Dane Springmeyer’s talk on what MapBox is doing with their PBF vector tiles. They’ve done a lot of work on making high quality vector data available for cartography. They found they only need to prepare tiles to z=14 (about a square mile); at that scale you can just make the tile encode all features to full precision. They are able to render all of the OSM data for MapBox Streets into just 30GB of tile data in about 100 CPU hours. That’s quite manageable; very exciting.
Dane and I took a quick look and I think their PBF tiles are about the same size as TopoJSON tiles, maybe 15% smaller. OSM data doesn’t have many shared boundaries, so the main thing TopoJSON is doing is delta encoding of arcs. MapBox tiles also use delta encoding. Their PBFs also encode properties more efficiently than JSON, but after gzip I think the difference is less significant.
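The delta encoding mentioned above is easy to illustrate. Here’s a minimal sketch of the idea behind TopoJSON-style arc encoding: coordinates get snapped to a fixed integer grid (quantization), then each point is stored as a difference from the previous one, so most values are small integers that gzip well. The function names here are illustrative, not the actual topojson library API.

```python
def quantize(coords, bbox, q=10000):
    """Snap lon/lat pairs to a q x q integer grid over the bounding box."""
    (x0, y0, x1, y1) = bbox
    kx = (q - 1) / (x1 - x0) if x1 > x0 else 1
    ky = (q - 1) / (y1 - y0) if y1 > y0 else 1
    return [(round((x - x0) * kx), round((y - y0) * ky)) for x, y in coords]

def delta_encode(arc):
    """Store the first point absolutely, then successive differences."""
    out = [arc[0]]
    for (px, py), (x, y) in zip(arc, arc[1:]):
        out.append((x - px, y - py))
    return out

arc = [(-122.4, 37.7), (-122.3, 37.8), (-122.2, 37.8)]
q = quantize(arc, (-123.0, 37.0, -122.0, 38.0))
print(delta_encode(q))  # first point absolute, the rest small deltas
```

After quantization the arc becomes a run of small, repetitive deltas, which is exactly the kind of data gzip loves; MapBox’s PBF tiles exploit the same property with varint encoding.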
I like paying for digital movies. So I rented The Hobbit last night to watch on my Xbox. The movie was OK. The twelve times the streaming failed and the movie paused while it buffered was not. Amazon’s movie was about 3.5 gigabytes for 170 minutes, or 2700kbit/s. My download speed is a reliable 6000kbit/s. So what’s the problem?
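The arithmetic is simple enough to check (assuming 10⁹ bytes per gigabyte; the exact file size is my estimate):

```python
# 3.5 GB over 170 minutes versus my 6000 kbit/s line.
movie_bytes = 3.5e9
runtime_s = 170 * 60
stream_kbit = movie_bytes * 8 / runtime_s / 1000  # average bitrate in kbit/s
print(round(stream_kbit))  # ~2745 kbit/s, well under my 6000 kbit/s downlink
```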
The bandwidth graph above shows the problem: something is terribly wrong with the streaming. First, the Xbox client doesn’t seem to buffer much, if at all. Playback would be a lot better if it used all 6000kbit/s and cached to disk. Second, their streaming server seems to have lost the connection ten times in three hours. Naturally they blame my ISP. At least they refunded the rental fee.
I like to pay for media, but maybe next time I’ll consider downloading an unlicensed copy. Pirate Bay offers a 2000kbit/s version that I could have downloaded and then watched uninterrupted for free. It was available two weeks before the official release.
If you’re looking for a mortgage, think twice before doing business with CitiMortgage. I refinanced my mortgage with them last year. Well, actually this year; it took them nine months. All along the way the process was incompetent and contemptuous of the customer.
I had an existing loan with Citi. They offered to refinance to a lower rate, no cost to me. All very simple: loan-to-value ratio wasn’t a problem, no question about us qualifying. I agreed to refinance in June 2012, gave them all documentation in July, confirmed all documents in place in August. And then nothing happened. For months. All I got was computer-generated letters saying my application had been canceled and empty verbal promises that “we’ll be underwriting soon”. And repeated requests for fresh documents, because the old W-2 copies, the old appraisal, etc., all “expired”.
Underwriting finally looked at the file in January, some six months after I completed my documentation. Then another comedy of incompetence and we finally signed in February. Mortgages usually take 30 days, 60 if it’s low priority. They promised 90 days; it took them 260. From what I’ve heard, that’s been pretty typical for CitiMortgage in the last year.
Maybe it’s all incompetence, but Citi ends up profiting. All told I paid an extra $2800 in interest waiting for them to get around to processing my refinance application. So basically I’m a sucker; I should have gone elsewhere, preferably through a mortgage broker.
Citigroup was one of the main culprits in the mortgage crisis of 2008 that nearly wrecked the US economy. And they were the brokers defrauding their clients in the investment bank scandal of 2003. The incompetence I encountered with my refinance is a different problem. But I really should stick with a policy of not doing business with companies that treat customers with such contempt.
I just finished reading American Nations: A History of the Eleven Rival Regional Cultures of North America, a history and cultural criticism book by Colin Woodard. It’s OK, not great. The map below is the thesis of the book.
Woodard argues North America is best understood as eleven distinct “nations” with unique cultural and political identities. The first half of the book gives the origins of these various tribes and argues for their inviolate coherence. This part of the book was insightful and interesting. The second half interprets various recent events in terms of a 400-year-old conflict between Yankees and Deep Southerners. This part of the book was boring and axe-grindy.
Related: I’m now an active GoodReads user and am trying to do a better job cataloging the books I read.
I no longer really use passwords to log into websites. Instead I use an authentication agent that lives in my browser and proves my identity to websites. Sadly, the authentication protocols of the Web require sending my secret token rather than doing some safer public key protocol. And the details of figuring out how to transmit the token to each website are needlessly complex.
To put it another way, passwords are completely broken; even strong passwords like “qeadzcwrsfxv1331” are crackable. With LastPass in my browser I literally do not know what my password is on pretty much every one of the 479 websites I log in to. I already run a complex authentication protocol. The stupid thing is that it’s a very bad protocol, involving stuffing secrets into random form elements on the web page.
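What would a better protocol look like? Here’s a minimal sketch of challenge-response login, where the secret never crosses the wire: the server issues a random nonce and the client answers with a keyed hash over it. This is illustrative only; real proposals like Persona use public-key signatures rather than a shared secret, but the shape of the exchange is the same.

```python
import hashlib
import hmac
import secrets

# Provisioned once, out of band -- never sent during login.
shared_secret = b"qeadzcwrsfxv1331"

def server_challenge():
    """Server issues a fresh random nonce for each login attempt."""
    return secrets.token_hex(16)

def client_response(nonce):
    """Client proves it holds the secret without transmitting it."""
    return hmac.new(shared_secret, nonce.encode(), hashlib.sha256).hexdigest()

def server_verify(nonce, response):
    """Server recomputes the expected answer and compares in constant time."""
    expected = hmac.new(shared_secret, nonce.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

nonce = server_challenge()
print(server_verify(nonce, client_response(nonce)))  # True
```

An eavesdropper who captures one exchange learns nothing reusable, because the next login uses a fresh nonce. Compare that to the status quo, where the password itself goes into a form field on every login.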
Mozilla Persona is a strong proposal for how to end passwords in a better way, at least for desktop computers. And Tim Bray has lots of good notes on authentication and identity. I still think OpenID is sufficient, or maybe the newer OpenID Connect system. Hell, at this point I’ll accept log in with Facebook or Google+ Sign-In. But whatever it is needs to be universal. And it really should be vendor neutral.
I love the idea that the JJ Abrams films are not really Star Trek; they’re really Star Trek fanfic. I don’t remember where I first read that idea, but it’s exactly right. I liked both movies, don’t get me wrong, but they are just ridiculous. Here’s the first movie script:
Kirk is this awesome 13 year old kid and he has a hot car and then he drives it off a cliff but he jumps out just in time. And then he gets in a fight in a bar and then he joins Starfleet and sneaks on board the Enterprise. And then Sulu has to space jump and he pulls out this sword and he’s, like, a killer ninja. And there’s a time traveling Romulan with special magical Red Matter. And Vulcan blows up but actually it’s a parallel Star Trek universe where all the same stories happen only totally different. And Spock and Uhura, they kiss.
Totally rad story, right? The new movie is just as ridiculous, if somewhat clever in what it does. I enjoyed it. Here’s hoping Abrams gives the same tawdry treatment to the Star Wars films; that’s a franchise ripe for self-parody.
I just completed a project I’ve been working on for a few weeks, a vector tile map of American rivers based on the NHDPlus dataset. It’s mostly a demo project with readable source, but it’s also kind of pretty.
There are three and a half products:
Vector maps are exciting. The proprietary map world is moving steadily towards vectors; pretty much all mobile maps are vector now and Google Maps is switching to vectors on the desktop. The open source and data world is getting there too. Thanks to Mike Migurski there’s now an experimental OpenStreetMap vector service that’s very promising. Also my personal thanks to Mike: the genesis of this project was getting an hour of his time.
The Tesla S hype has me interested. So now I’m curious: what does it really cost to run per mile? The Tesla site has a decent calculator; here are some numbers derived from it.
Tesla says they get 283Wh/mile. Electricity in San Francisco costs $0.35/kWh. So that works out to $0.10/mile in a Tesla. Tesla compares itself to 22 MPG cars. Gas in San Francisco is roughly $4/gal, so it’s $0.18/mile in a gas car. By that math, a Tesla is roughly half the cost of a gas car in San Francisco.
San Francisco has outrageously high electricity costs. At the national average of $0.11/kWh a Tesla is more like $0.031/mile, or six times better than a gas car.
On the other hand, batteries wear out. Tesla is offering to replace the battery after the 8 year warranty at a prepaid cost of $10,000 – $12,000. Assuming 12,500 miles a year that adds $0.10/mi to the cost of driving a Tesla, dwarfing the cost of the electricity! The Tesla ends up being $0.13 – $0.20 / mile compared to $0.18/mi for the 22 MPG gas car (and roughly $0.12 – $0.20 / mile for gas cars in general).
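The arithmetic above fits in a few lines (all prices and mileage figures as stated in this post, not from an official calculator):

```python
# Per-mile operating cost of a Tesla S versus a 22 MPG gas car.
wh_per_mile = 283                   # Tesla's stated efficiency
sf_rate, us_rate = 0.35, 0.11       # electricity, $/kWh
gas_price, mpg = 4.00, 22           # gas, $/gal and miles per gallon

tesla_sf = wh_per_mile / 1000 * sf_rate   # ~$0.099/mile in San Francisco
tesla_us = wh_per_mile / 1000 * us_rate   # ~$0.031/mile at the national average
gas_car = gas_price / mpg                 # ~$0.182/mile

# Battery replacement: $10,000-12,000 prepaid, amortized over
# 8 years at 12,500 miles/year = 100,000 miles.
battery_low = 10_000 / (12_500 * 8)       # $0.10/mile
battery_high = 12_000 / (12_500 * 8)      # $0.12/mile

print(round(tesla_sf, 3), round(tesla_us, 3), round(gas_car, 3),
      round(tesla_sf + battery_low, 2))   # SF electricity + battery ~$0.20/mile
```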
Update: Ken points out the battery lasts another 8 years, so battery replacement really adds $0.05/mi. Our SF Tesla then is $0.15/mi. Also Dan asks if some part of drivetrain maintenance should factor in to gas car operating costs.
If you think of the battery as another form of “fuel” that needs replacing every eight years, then the Tesla costs about the same per mile to drive as a gas car no matter what electricity rates you pay. But maybe the battery will last longer; no one really knows. Also, I suspect most Tesla customers think of the battery cost as depreciation and not a consumable.
Another argument for Tesla is that electricity is somehow more environmentally friendly than gas. Not really; a Tesla is metaphorically spewing 44% coal emissions out its tailpipe. It's 20% nuclear though; I think that's a win.
So it's been long enough now I can tell this story about how I met Stewart Brand. Back in 1995 I was a fresh-out-of-college programmer at the Santa Fe Institute, a research place that attracted all sorts of interesting people. And one of the staff asked me if I could give a ride to Esther Dyson from the Albuquerque airport. "She's quite interesting!" I was no dummy and said yes. I mean, my little Honda was big enough for two! And so I got the car washed and met her at the airport. And when we met she asked "could you give my friend Stewart a ride too? He'll be here in about twenty minutes". I had no idea who that'd be until he got into my car and I was just so pleased with myself. The three of us crammed into my little hatchback for the hour-long drive with two of the most interesting, provocative technophilosopher types I'd ever met. Not bad for a 23-year-old kid.
Needless to say I took advantage of every minute of having them trapped in my car with me. They were quite friendly and thoughtful and fun to talk to. At some point Stewart mentioned that he'd been at the MIT Media Lab for a while (he was writing the book on it, actually) and I mentioned I was applying for grad school there. And so he kindly says "Nicholas owes me a favor, I'll write a letter for you" and that's part of how I got to go to the Media Lab for grad school.
I'm embarrassed posting this now because it seems so starfucker, but back in the mid 90s there just weren't that many people talking like Dyson and Brand were. About the intersection of technology and culture, about the Internet, about building things with beauty and depth. That lucky hour had a big influence on me. And they were both so friendly and generous. I've met plenty of arrogant self-proclaimed pundits, maybe even acted like one myself on occasion, and I always try to remember Stewart Brand's friendly humility.
Originally posted to Metafilter
I finally made good on last year’s New Year’s resolution and transferred domain names away from GoDaddy (registered via Google) to Hover. Hover is a humane registrar, the evolution of Tucows, and they have a good service. Getting out of the clutches of GoDaddy is not easy but Hover has put a lot of effort into helping you. Their docs are thorough and the webapp is good. Even so, I was starting to wish I’d used their free valet service. The steps are roughly:
Step 7 has a race condition; Hover has to have received your domain name before you’re allowed to edit the name server authority in the whois data. And various things cache whois and root DNS information for minutes to hours. My site was offline for about 10 minutes while this sorted itself out. The right thing would be to edit the name server authority for your domain first, before initiating the transfer. Hover seemed happy to provide DNS service before the transfer was complete, I just couldn’t update the whois info.
Another glitch was that some of my names weren’t registered directly by me, but instead via Google Sites or AppEngine. That extra step causes a big mess; here’s a detailed description of the solution. In brief, you have to go to Google Admin Control Panel. That has a link for Domain Settings / Advanced DNS settings that gives you login credentials at GoDaddy that Google made and never told you about. There’s a “Sign in to DNS console” link right there that leads you to GoDaddy management, you can unlock the name and get the authorization code there. But that site has been broken for a year and you can’t disable domain privacy with it. Instead log in to this other GoDaddy site; you have to recover the username (a different random number), but the password Google gave you will work. The “cancel private registration” button works there. It’s almost like GoDaddy doesn’t want this transfer to be easy.